January 20, 2016 weblog
Ultrasound proximity software may outshine phone sensors
Can ultrasound software usurp the kingdom of proximity sensors in smartphones? It will be interesting to see how Elliptic Labs fares in its new BEAUTY ultrasound proximity software.
The company is talking about ultrasound technology in action: ultrasound signals sent through the air from speakers integrated in smartphones and tablets bounce off your hand and are recorded by microphones also integrated in the devices. In this way, the technology recognizes hand gestures and uses them to move objects on a screen, similar to how bats use echolocation to navigate.
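The emitted signal has to be above the range of human hearing yet still representable by an ordinary audio codec. A minimal sketch, with illustrative parameter values that are assumptions rather than Elliptic Labs' actual design, of generating one short ultrasonic "ping" as raw samples:

```python
import math

SAMPLE_RATE_HZ = 48_000   # a common smartphone codec rate (assumed)
TONE_HZ = 23_000          # above human hearing, below the 24 kHz Nyquist limit
DURATION_S = 0.005        # a hypothetical 5 ms ping

def ultrasonic_ping() -> list[float]:
    """Return one burst of a pure ultrasonic tone as float samples."""
    n = int(SAMPLE_RATE_HZ * DURATION_S)
    return [math.sin(2 * math.pi * TONE_HZ * i / SAMPLE_RATE_HZ)
            for i in range(n)]

samples = ultrasonic_ping()
print(len(samples))  # 240 samples for a 5 ms ping at 48 kHz
```

At 48 kHz sampling, only frequencies below 24 kHz can be reproduced, which is why the sketch picks 23 kHz; higher tones in the band the company describes would need higher-rate audio hardware.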
The company said its BEAUTY solution is to be incorporated into phones this year. The team is working directly with OEMs to integrate BEAUTY ultrasound proximity software into phones, although the news release did not name any of the OEMs.
Replacing the optical proximity sensor with BEAUTY is being promoted as carrying two advantages: it streamlines smartphone designs by freeing up physical space inside the mobile device, and it brings down costs by eliminating the need for a traditional optical proximity sensor.
But first things first. What is a proximity sensor? Look at your smartphone. You will most likely see dots near the earpiece and front camera.
The hardware is responsible for turning the screen off when you put the device to your face to make a call. Daniel Fuller in Android Headlines said Tuesday that it is typically "an oblong or squarish affair under the hood and takes a little bit of room and power." Jason Bouwmeester in Techaeris similarly said, "Proximity sensors are a key component that turn off the phone screen and disable touch functionality when a user is holding their smartphone up to their ear."
Elliptic Labs wants to eliminate the proximity sensor with its software-based solution. With Elliptic Labs' BEAUTY software, according to the promotional messages, phones can be sleeker and less expensive while delivering the same functionality, without any physical proximity sensor hardware.
Tech watchers offered more descriptions of how it works by reusing the existing earpiece and microphone. Brad Linder in Liliputing said, "Here's the idea: a phone's speaker emits an ultrasonic tone that's inaudible to the human ear. But the phone's microphone can detect the audio, and detect distortion in the signal caused by a hand, head, or other object that's close to the phone."
Rob Triggs in Android Authority also explained what the software does: "The software sends out small waves from the phone's speaker and then uses the microphone to listen out for the returning reflections from any objects in front of the phone. Presumably the time taken and amplitude of the returning signal can be used to tell how close an object is to the phone."
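The time-of-flight idea Triggs describes reduces to simple arithmetic: sound travels at roughly 343 m/s in air, and the echo covers the distance to the object twice. A minimal sketch of that calculation (illustrative only, not Elliptic Labs' algorithm):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def distance_from_echo(delay_seconds: float) -> float:
    """Convert the round-trip delay between emitting a ping and hearing
    its echo into an estimated distance to the reflecting object.
    The echo travels out and back, so the object is half as far away."""
    return SPEED_OF_SOUND_M_S * delay_seconds / 2.0

# A hand about 5 cm from the phone returns an echo in roughly 0.3 ms:
print(round(distance_from_echo(0.0003), 3))  # ~0.051 m
```

In practice, as Triggs notes, the software would presumably combine this timing with the amplitude of the returning signal, since real echoes are smeared and noisy rather than a single clean delay.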
As for the optical proximity sensors that Elliptic Labs' BEAUTY would replace, the company made the case that the sensors can be unreliable in certain weather conditions or in response to variations in hair and skin color.
Chris Velazco in Engadget expanded on the technology:
"To hear Danielsen explain it, the phone's speaker can act 'like the mouth of a bat' and emit sound at ultrasound frequencies (in this case, between 23kHz and around 35kHz). That would make the phone's microphone the equivalent of the bat's ears, listening for how our faces or hands or whatever distort that inaudible sound. That job of interpreting that shifting soundscape falls to the software—they call it 'Beauty'—which determines when an appendage is too close and causes the screen to shut off. Voilà, the behavior you expect with one less bit of hardware involved."
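The final step Velazco describes, deciding "when an appendage is too close" and shutting the screen off, amounts to comparing the estimated distance against a threshold. A hypothetical sketch of that decision logic, with a made-up threshold value that is not drawn from the article:

```python
PROXIMITY_THRESHOLD_M = 0.05  # illustrative: blank the screen inside ~5 cm

def should_blank_screen(estimated_distance_m: float) -> bool:
    """Mimic the proximity-sensor behavior: report True when the
    estimated distance to the nearest object falls below the threshold,
    signaling the OS to turn the display off and disable touch."""
    return estimated_distance_m < PROXIMITY_THRESHOLD_M

print(should_blank_screen(0.02))  # True: phone held against the ear
print(should_blank_screen(0.30))  # False: phone held in front of the face
```

A real implementation would also need to debounce this decision over time so that a single noisy echo does not flicker the screen, but the core behavior is this threshold test.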
© 2016 Tech Xplore