Page 20: Research news on Human-centered AI interfaces

Human-centered AI interfaces encompass computational systems that use machine learning, generative models, and multimodal sensing to mediate, augment, or interpret human communication and behavior. Work in this area spans assistive communication for speech, hearing, and motor impairments, real-time sign language and speech technologies, and social robots that adapt behavior and express empathy. Vision-language models and video analytics support long-video reasoning, activity recognition, and error detection, while interactive agents, privacy-aware speech systems, and affect-sensitive tools enable more accessible, expressive, and context-aware human–AI interaction across physical and virtual environments.

Consumer & Gadgets

Jamming with AI: Jazz trio plays live with AI-generated sound

A fascinating recent development enabling musicians to improvise live music with AI-generated sound could be the biggest innovation since the advent of sampling, or perhaps even the invention of recorded sound, according ...

Consumer & Gadgets

Big tech on a quest for ideal AI device

ChatGPT-maker OpenAI has enlisted the legendary designer behind the iPhone to create an irresistible gadget for using generative artificial intelligence (AI).

Consumer & Gadgets

AI-generated podcasts open new doors to make science accessible

The first study to use artificial intelligence (AI) technology to generate podcasts about research published in scientific papers has shown the results were so good that half of the papers' authors thought the podcasters ...
