Perceptually-enabled Task Guidance (PTG) performers trained their prototypes on various recipes as a proxy for completing an unfamiliar task. Future PTG demonstrations will focus on medical procedures, military equipment sustainment, and aircraft co-piloting scenarios. Credit: Tom Shortridge/DARPA

In this video, DARPA program manager Dr. Bruce Draper describes the technology he thinks could usher in the next "do-it-yourself" revolution.

The Perceptually-enabled Task Guidance (PTG) program aims to develop virtual "task guidance" assistants that can work with different sensor platforms to help users perform complex physical tasks and expand their skillsets. Unlike today's AI assistants, PTG technology would be able to see what the user sees and hear what they hear by integrating with a microphone, a head-mounted camera, and displays such as augmented reality (AR) headsets to deliver accurate instructions.

PTG performers recently demonstrated early successes of their prototypes by using the task of cooking recipes as a proxy for unfamiliar, more complex tasks, such as medical procedures, military equipment sustainment, and co-piloting aircraft.

"Today the AR community is pursuing new, useful ways to present data to the user, but it doesn't go far enough," said Draper. "The gamechanger with PTG would be having perceptually driven AI interfaces that can make sense of the real world, react to whatever the user is doing, and provide advice. I'm really impressed at how quickly performing teams are making progress toward the goals."


Provided by DARPA