
Samsung calls for more powerful AI at forum

Samsung Electronics on Monday hosted its third annual forum on artificial intelligence, inviting top experts from around the world and calling for “more powerful and comprehensive” AI models in the future.

Scholars of deep learning, the core technology behind AI, spoke on the first day of the two-day forum, presenting new achievements that show how far deep learning has evolved.

“AI technologies are already exerting a significant influence across society, as we rely on them heavily in everyday life, such as using facial verification to unlock smartphones and talking to AI speakers,” said Kim Ki-nam, vice chairman of Samsung’s semiconductor division and head of the Samsung Advanced Institute of Technology, in his opening speech.

“There have been some noticeable developments, especially in the field of natural language processing with the debut of BERT, but as yet there are many tasks to be addressed, for example making sure AI systems do not misrecognize objects in a noisy environment.”

BERT stands for bidirectional encoder representations from transformers, a deep learning model unveiled by Google last November. 

Samsung Vice Chairman Kim Ki-nam delivers a speech at the Samsung AI Forum held at Samsung’s head office in Seocho-dong, southern Seoul, on Monday. (Samsung Electronics)

“We need more powerful and comprehensive AI models,” Kim said.

At the forum, Samsung’s key research and development unit SAIT demonstrated the world’s first on-device AI translator that does not need an internet connection.

The institute has been developing on-device AI technologies, which allow such systems to operate without connecting to the cloud.

At the first AI forum held in 2017, the Samsung institute introduced machine translation and end-to-end voice recognition technology.

Dubbed the “SAIT Translator,” the mobile platform combines on-device speech recognition and machine translation and can run on the latest Samsung Galaxy S10 phones.

The demonstration took place while the phone was disconnected from the internet.

As soon as full sentences were spoken to the device, the translator displayed them as written text and simultaneously translated them into natural English sentences.

“I believe the performance of our on-device AI translator is almost the same as that of Google’s,” said Hwang Sung-woo, vice president of SAIT.

There are some rumors that Samsung is going to adopt the AI translator for its upcoming flagship models, but the company declined to confirm.

Yoshua Bengio, a professor at the University of Montreal, proposed four ingredients for bringing AI systems closer to human-level competence, based on the premise that humans still far outperform current AI systems.

Like a child who learns about the world by experiencing it, machine learning agents need meta-learning, which enables a learner to adapt effectively to modified tasks.

Trevor Darrell, a professor at UC Berkeley, presented recent methods in adversarial adaptive learning, referring to technologies that enable AI systems to make their own analyses and judgments in unexpected situations.

Last year, Samsung announced AI as one of its four key businesses and future growth engines, along with 5G, biotechnology and automotive components.

The South Korean tech giant is running a total of seven AI centers around the world with the aim of securing around 1,000 global experts by 2020.

By Song Su-hyun (song@heraldcorp.com)


