Page 5: Research news on Embodied robotic manipulation

Embodied robotic manipulation investigates robotic and prosthetic limbs that physically interact with the environment using human-like, adaptive control. Work in this area integrates soft robotic structures, tendon-driven and biohybrid actuators, and exoskeletons with rich multimodal sensing, including vision, tactile, and proprioceptive feedback. Machine learning methods such as imitation learning, reinforcement learning, and meta-learning are used to acquire dexterous skills, enable shared and autonomous control, and support intuitive human–robot interaction through haptic interfaces, brain–computer interfaces, and teleoperation systems.

Computer Sciences

AI decodes pianists' muscle activity via video

AI and human-movement research intersect in a study demonstrating precise estimation of hand muscle activity from standard video recordings. Using a deep-learning framework trained on a large, comprehensive multimodal dataset ...

Robotics

Tactile sensors enable robots to carry unsecured loads

If you've ever moved into a new home, you know the challenge of packing a moving truck—it's like solving a giant, three-dimensional puzzle. Everything needs to fit just right, and nothing can be left loose or unbalanced, ...

Electronics & Semiconductors

Wearable tech lets users control machines and robots while on the move

Engineers at the University of California San Diego have developed a next-generation wearable system that enables people to control machines using everyday gestures—even while running, riding in a car or floating on turbulent ...

Page 5 of 16