Page 25: Research news on Human-centered AI interfaces

Human-centered AI interfaces encompass computational systems that use machine learning, generative models, and multimodal sensing to mediate, augment, or interpret human communication and behavior. Work in this area spans assistive communication for speech, hearing, and motor impairments, real-time sign language and speech technologies, and social robots that adapt behavior and express empathy. Vision-language models and video analytics support long-video reasoning, activity recognition, and error detection, while interactive agents, privacy-aware speech systems, and affect-sensitive tools enable more accessible, expressive, and context-aware human–AI interaction across physical and virtual environments.

Robotics

That 'uhh... let me think' face you make? Androids need it too

Ever asked a question and been met with a blank stare? It's awkward enough with a person—but on a humanoid robot, it can be downright unsettling. Now, an international team co-led by Hiroshima University and RIKEN has found ...

Software

AI-driven software is 96% accurate at diagnosing Parkinson's

Existing research indicates that the accuracy of a Parkinson's disease diagnosis hovers between 55% and 78% in the first five years of assessment. That's partly because Parkinson's and its sibling movement disorders share similarities, ...
