Page 14: Research news on Human-centered AI interfaces

Human-centered AI interfaces encompass computational systems that use machine learning, generative models, and multimodal sensing to mediate, augment, or interpret human communication and behavior. Work in this area spans assistive communication for speech, hearing, and motor impairments, real-time sign language and speech technologies, and social robots that adapt behavior and express empathy. Vision-language models and video analytics support long-video reasoning, activity recognition, and error detection, while interactive agents, privacy-aware speech systems, and affect-sensitive tools enable more accessible, expressive, and context-aware human–AI interaction across physical and virtual environments.

Hi Tech & Innovation

AI-generated voices now indistinguishable from real human voices

Many people still think of AI-generated speech as sounding "fake," unconvincing, and easy to tell apart from human voices. But new research from Queen Mary University of London shows that AI voice technology has now reached ...

Robotics

Creating robots that adapt to your emotion

Robots might be getting smarter, but to truly support people in daily life, they also need to become more empathetic. That means recognizing and responding to human emotions in real time.

Hi Tech & Innovation

Predictive AI could prevent crowd crush disasters

To prevent crowd crush incidents like the Itaewon tragedy, it is crucial to go beyond simply counting people and instead use technology that can detect the real-time inflow and movement patterns of crowds. A KAIST research ...

Page 14 of 26