July 25, 2017


HoloLens HPU: Second version will incorporate AI coprocessor

Credit: Microsoft

(Tech Xplore)—Microsoft has announced that the next generation of its mixed reality HoloLens headset will incorporate an AI chip.

Specifically, the second version of the HPU, which stands for "Holographic Processing Unit," will incorporate an AI coprocessor. The new HPU is currently under development.

The Microsoft Research blog writes about the HPU:

"HoloLens contains a custom multiprocessor called the Holographic Processing Unit, or HPU. It is responsible for processing the information coming from all of the on-board sensors, including Microsoft's custom time-of-flight depth sensor, head-tracking cameras, the inertial measurement unit (IMU), and the infrared camera. The HPU is part of what makes HoloLens the world's first–and still only–fully self-contained holographic computer."

James Vincent of The Verge wrote: "For the second generation HoloLens, the AI coprocessor will be built into its 'Holographic Processing Unit' or HPU—Microsoft's name for its central vision-processing chip." Bloomberg, for its part, reported that the new processor is a version of the company's existing Holographic Processing Unit and that it was unveiled at an event in Hawaii on Sunday.

Tom's Hardware said "the second generation of the HoloLens' Holographic Processing Unit" will include "a deep learning accelerator."

Why do this? Writing on the Microsoft Research blog, Marc Pollefeys, director of science for HoloLens, said that "in HoloLens, we're in the business of making untethered devices. We put the battery on your head, in addition to the compute, the sensors, and the display. Any compute we want to run locally for low-latency, which you need for things like hand-tracking, has to run off the same battery that powers everything else. So what do you do? You create custom silicon to do it."

Vincent in The Verge said this custom silicon will be used to analyze visual data directly on the device, "saving time by not uploading it to the cloud."
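Purely to illustrate the trade-off Vincent describes, the back-of-the-envelope sketch below compares a hypothetical on-device inference path against a cloud round trip. All numbers are invented for illustration and are not Microsoft or HoloLens figures.

```python
# Illustrative latency comparison: on-device inference vs. a cloud round trip.
# Every constant here is an assumed, made-up value for the sake of the example.
ON_DEVICE_INFERENCE_MS = 10   # assumed time for a local coprocessor to run the model
UPLINK_MS = 40                # assumed time to upload a camera frame
CLOUD_INFERENCE_MS = 10       # assumed server-side inference time
DOWNLINK_MS = 20              # assumed time to receive the result

local_latency = ON_DEVICE_INFERENCE_MS
cloud_latency = UPLINK_MS + CLOUD_INFERENCE_MS + DOWNLINK_MS

print(f"on-device: {local_latency} ms, cloud round trip: {cloud_latency} ms")
```

Under these assumed numbers, the network transfer alone dominates the total time, which is the latency argument for keeping the analysis on the headset's own silicon.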

One of the key points in news coverage of the announcement was performance: how quickly the HoloLens 2 can run AI workloads on the device itself.

"Today's mobile devices, where AI is going to be used more frequently, simply aren't built to handle these sorts of programs, and when they're asked, the result is usually slower performance or a burned-out battery (or both)," Vincent wrote.

"The AI coprocessor is designed to work in the next version of HoloLens, running continuously, off the HoloLens battery," according to Pollefeys.

Tom's Hardware: "The deep learning accelerator is designed to work offline and use the HoloLens' battery, which means it should be quite efficient, while still providing significant benefits to Microsoft's machine learning code."

Bloomberg elaborated on this too:

"Tech companies are keen to bring cool artificial intelligence features to phones and augmented reality goggles—the ability to show mechanics how to fix an engine, say, or tell tourists what they are seeing and hearing in their own language."

But how can such features be delivered without slowing the device or quickly draining the battery? The solution, said Bloomberg, is an AI processor that analyzes what the user sees and hears right there on the device.

Bloomberg noted business use cases that show why the achievement matters:

"Moving this expertise from the cloud down to the device in a person's hand or on their face is a key priority for Microsoft's AI-focused CEO Satya Nadella." Earlier this year, said the article, in a speech, he touted the idea of AI's use to track industrial equipment. A user could be told where to find a jackhammer, and generate warnings over unauthorized use or chemical spill. The article quoted Microsoft CTO Kevin Scott: 'We really do need custom silicon to help power some of the scenarios and applications that we are building.'"
