
A groundbreaking technology that can recognize human emotions in real time has been developed by Professor Jiyun Kim and his research team at UNIST's Department of Materials Science and Engineering. This cutting-edge technology is poised to be applied across a range of fields, including next-generation wearable systems that provide services based on the user's emotions.

Understanding and accurately extracting emotional information has long been a challenge due to the abstract and ambiguous nature of human affect such as emotions, moods, and sensations. To address this, the research team developed a multi-modal human emotion recognition system that combines verbal and non-verbal expression data to effectively use comprehensive emotional information.

The core of this system is the Personalized Skin Integrated Facial Interface (PSiFI) system, which is self-powered, flexible, stretchable and transparent. It features a first-of-its-kind bidirectional triboelectric strain and vibration sensor that enables simultaneous sensing and integration of verbal and nonverbal expression data. The system is fully integrated with a data processing circuit for wireless data transfer, enabling real-time emotion recognition.

Using machine learning algorithms, the advanced technology performs accurate and real-time human emotion recognition tasks, even when people are wearing masks. The system has also been successfully implemented in a virtual reality (VR) environment in a digital concierge application.
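To make that recognition step concrete, the following is a minimal sketch of how verbal (vocal-vibration) and non-verbal (facial-strain) features could be fused and classified. It is not the team's actual pipeline: the feature sizes, the emotion label set, the synthetic data, and the choice of a scikit-learn SVM are all illustrative assumptions.

```python
# Minimal sketch of multimodal emotion classification, assuming two
# pre-extracted feature vectors per sample: one from the strain channel
# (non-verbal facial deformation) and one from the vibration channel
# (verbal / vocal-cord signal). All names and shapes are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

EMOTIONS = ["neutral", "happy", "sad", "angry", "surprised", "fearful", "disgusted"]

# Synthetic stand-in data: 70 samples, 16 strain features + 16 vibration features.
n_samples, n_strain, n_vib = 70, 16, 16
X_strain = rng.normal(size=(n_samples, n_strain))
X_vib = rng.normal(size=(n_samples, n_vib))
y = rng.integers(0, len(EMOTIONS), size=n_samples)

# Early fusion: concatenate both modalities into one feature vector per sample.
X = np.hstack([X_strain, X_vib])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# A small classifier is enough when only limited personalized training data exist.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

On real sensor data, the same early-fusion approach would simply replace the synthetic arrays with feature vectors extracted from the PSiFI strain and vibration channels.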

The technology is based on the phenomenon of "friction charging" (triboelectrification), in which surfaces acquire opposite positive and negative charges when they rub against and separate from each other. Because the sensors generate their own signal, the system requires no external power source or complex measuring equipment to capture the data.

"Based on these technologies, we have developed a skin-integrated face interface (PSiFI) system that can be customized for individual users," commented Professor Kim. The team used a semi-curing technique to create a transparent conductor for the friction-charging electrodes. Additionally, a personalized mask was fabricated using a multi-angle shooting technique, combining flexibility, elasticity, and transparency.

The research team successfully integrated the detection of facial muscle deformation and vocal cord vibration, enabling real-time emotion recognition. The system's capabilities were demonstrated in a virtual reality "digital concierge" application, where customized services were provided based on customer sentiment.

"With this developed system, it is possible to implement real-time emotion recognition with only a few learning steps and without complex measurement equipment," said Jin Pyo Lee, first author of the study. "This opens up possibilities for such devices and for next-generation emotion-based digital platform services in the future."

The research team conducted real-time emotion recognition experiments, collecting multimodal data such as facial muscle contractions and voice. The system demonstrated high emotion recognition accuracy with minimal training. Its wireless and customizable nature ensures wearability and convenience.
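As an illustration of how such a real-time loop might be structured, the sketch below segments two assumed sensor streams into overlapping windows and reduces each window to simple statistics that a trained classifier could consume. The window length, hop size, and feature choices are assumptions for illustration, not the published method.

```python
# Sketch of a real-time inference loop: raw strain and vibration streams
# are segmented into short overlapping windows and summarized with basic
# statistics before being passed to an already-trained classifier.
import numpy as np

WINDOW = 256   # samples per analysis window (assumed)
HOP = 128      # hop size between consecutive windows (assumed)

def window_features(strain: np.ndarray, vib: np.ndarray) -> np.ndarray:
    """Summarize one window of each channel with basic statistics."""
    feats = []
    for sig in (strain, vib):
        feats.extend([sig.mean(), sig.std(), sig.min(), sig.max(),
                      np.abs(np.diff(sig)).mean()])
    return np.asarray(feats)

def stream_windows(strain_stream, vib_stream):
    """Yield one feature vector per overlapping window of the two streams."""
    for start in range(0, len(strain_stream) - WINDOW + 1, HOP):
        s = strain_stream[start:start + WINDOW]
        v = vib_stream[start:start + WINDOW]
        yield window_features(s, v)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    strain = rng.normal(size=4096)   # stand-in for the wireless sensor feed
    vib = rng.normal(size=4096)
    for i, feats in enumerate(stream_windows(strain, vib)):
        # In practice each feature vector would go to the trained model,
        # e.g. clf.predict(feats.reshape(1, -1)), to label the emotion.
        print(f"window {i}: {feats.round(2)}")
```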

Additionally, the team applied the system to VR environments, using it as a “digital concierge” for a variety of settings including smart homes, private movie theaters, and smart offices. The system’s ability to identify individual emotions in different situations enables it to deliver personalized recommendations for music, movies and books.
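A toy sketch of this last step, using an assumed emotion label set, assumed settings, and an invented content catalog, shows how a recognized emotion could be mapped to a personalized suggestion:

```python
# Toy sketch of the "digital concierge" step: once an emotion label is
# available, it is mapped to content suggestions for the current setting.
# Labels, settings, and catalog entries below are illustrative assumptions.
RECOMMENDATIONS = {
    ("happy", "smart_home"): ["upbeat playlist", "comedy film"],
    ("sad", "private_theater"): ["comfort drama", "calming ambient music"],
    ("neutral", "smart_office"): ["focus playlist", "productivity podcast"],
}

def recommend(emotion: str, setting: str) -> list[str]:
    """Return suggestions for a recognized emotion in a given environment."""
    return RECOMMENDATIONS.get((emotion, setting), ["no tailored suggestion"])

print(recommend("happy", "smart_home"))
```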

Professor Kim emphasized, "For effective interaction between humans and machines, human-machine interface (HMI) devices must be able to collect diverse data and handle complex integrated information. This study exemplifies the potential of using emotions, which are complex forms of human information, in next-generation wearable systems."

The research was carried out in collaboration with Professor Pooi See Lee of Nanyang Technological University in Singapore and was supported by the National Research Foundation of Korea (NRF) and the Korea Institute of Materials Science (KIMS) under the Ministry of Science and ICT. The study was published online in Nature Communications on January 15.
