Recognizing Seven Different Face Emotions on a Mobile Platform
(Professor Hoi-Jun Yoo)
A KAIST research team succeeded in achieving face emotion recognition on a mobile platform by developing an AI semiconductor IC that processes two neural networks on a single chip.
Professor Hoi-Jun Yoo and his team (primary researcher: Ph.D. student Jinmook Lee) from the School of Electrical Engineering developed a unified deep neural network processing unit (UNPU).
Deep learning is a technology for machine learning based on artificial neural networks, which allows a computer to learn by itself, just like a human.
The developed chip adjusts the weight precision of the neural network inside the semiconductor, from 1 bit to 16 bits, in order to trade off energy efficiency against accuracy. A single chip can process a convolutional neural network (CNN) and a recurrent neural network (RNN) simultaneously: CNNs are used for categorizing and recognizing images, while RNNs handle time-series information for tasks such as action recognition and speech recognition.
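The adjustable weight precision described above can be illustrated in software. The sketch below is a hypothetical NumPy analogue, not the UNPU's actual hardware logic: it rounds floating-point weights onto a signed fixed-point grid of a chosen bit width, showing how fewer bits reduce storage and compute cost at the price of quantization error.

```python
import numpy as np

def quantize_weights(weights: np.ndarray, bits: int) -> np.ndarray:
    """Round weights onto a signed grid with the given bit width (1-16).

    A software analogue of adjustable weight precision: lower bit widths
    mean cheaper arithmetic but larger quantization error.
    """
    assert 1 <= bits <= 16
    max_abs = np.max(np.abs(weights))
    if max_abs == 0:
        return weights.copy()
    # Number of positive quantization levels; 1-bit degenerates to a
    # sign-like {-max, 0, +max} grid in this simple scheme.
    levels = (2 ** (bits - 1) - 1) if bits > 1 else 1
    scale = max_abs / levels
    return np.round(weights / scale) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)

for bits in (1, 4, 8, 16):
    err = np.mean(np.abs(w - quantize_weights(w, bits)))
    print(f"{bits:2d}-bit mean abs quantization error: {err:.6f}")
```

Running the loop shows the error shrinking as the bit width grows, which is exactly the accuracy-versus-cost dial the chip exposes.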
Moreover, the chip can adjust its energy efficiency and accuracy dynamically while recognizing objects. Mobile AI requires high-speed operation at low energy; otherwise, processing massive amounts of information at once drains the battery quickly. According to the team, the chip outperforms world-class mobile AI chips such as Google's TPU, with energy efficiency four times higher than the TPU.
In order to demonstrate its high performance, the team installed UNPU in a smartphone to facilitate automatic face emotion recognition on the smartphone. This system displays a user’s emotions in real time. The research results for this system were presented at the 2018 International Solid-State Circuits Conference (ISSCC) in San Francisco on February 13.
Professor Yoo said, “We have developed a semiconductor that accelerates deep learning with low power requirements in order to realize AI on mobile platforms. We hope that this technology will be applied in various areas, such as object recognition, emotion recognition, action recognition, and automatic translation. Within one year, we will commercialize this technology.”