Page 11: Research news on Human-centered AI interfaces

Human-centered AI interfaces are computational systems that use machine learning, generative models, and multimodal sensing to mediate, augment, or interpret human communication and behavior. Work in this area spans assistive communication for speech, hearing, and motor impairments; real-time sign language and speech technologies; and social robots that adapt their behavior and express empathy. Vision-language models and video analytics support long-video reasoning, activity recognition, and error detection, while interactive agents, privacy-aware speech systems, and affect-sensitive tools enable more accessible, expressive, and context-aware human–AI interaction across physical and virtual environments.

Computer Sciences

AI technology can compress LLM chatbot conversation memory by a factor of 3–4

Seoul National University College of Engineering announced that a research team led by Professor Hyun Oh Song from the Department of Computer Science and Engineering has developed a new AI technology called KVzip that intelligently ...

Business

Zuckerbergs put AI at heart of pledge to cure diseases

The Chan Zuckerberg Initiative, a nonprofit launched by Mark Zuckerberg and his wife with the aim of curing all disease, announced on Thursday that it was restructuring to focus on using artificial intelligence to achieve that goal.

Computer Sciences

Computer model mimics human audiovisual perception

A new computer model developed at the University of Liverpool can combine sight and sound in a way that closely resembles how humans do it. This model is inspired by biology and could be useful for artificial intelligence ...

Page 11 of 26