Elliptic Labs has developed technology that provides touchless gesture control around the display with a 180-degree field of view. It works by emitting ultrasound signals through the air from speakers integrated into smartphones and tablets; the signals bounce off the user's hand and are then recorded by microphones also built into these devices.
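The underlying principle is time-of-flight: the delay between emitting an ultrasound pulse and recording its echo reveals the distance to the hand. A minimal sketch of that calculation (illustrative names only; the actual signal processing is proprietary, and 343 m/s assumes room-temperature air):

```python
# Hypothetical sketch: converting an ultrasound echo delay into a distance.
# Assumes sound travels at ~343 m/s at room temperature.

SPEED_OF_SOUND_M_S = 343.0

def echo_delay_to_distance(delay_s: float) -> float:
    """Round-trip echo delay (seconds) -> one-way distance to the hand (metres)."""
    return SPEED_OF_SOUND_M_S * delay_s / 2.0

# A hand 20 cm from the device returns an echo after roughly 1.17 ms:
delay = 2 * 0.20 / SPEED_OF_SOUND_M_S
print(round(echo_delay_to_distance(delay), 3))  # → 0.2
```

Tracking how such distance estimates change over time, across several microphones, is what turns raw echoes into recognisable gestures.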
This allows Elliptic Labs’ technology to recognise hand gestures and use them to move objects on a screen, much as bats use echolocation to navigate.

One major benefit of Elliptic Labs’ ultrasound technology is that it offers a 180-degree field of view. The technology uses microphones and transmitters to sense movement in front of a screen and to the sides, enabling an interaction zone that extends over the screen and beyond its edges. Elliptic Labs enables gesturing both from a distance and very close to the screen. Another feature is distributed sensing, which enables motion capture of the hand from multiple angles, avoiding occlusion of objects or parts of an object. The sensors used are MEMS microphones, which can also double up for speech enhancement and recognition.

The ability to separate the first returning echoes from others arriving later means that Elliptic Labs’ touchless gesturing technology can separate foreground from background. This is essential both for separating finger motion from wrist motion and for separating hand motion from movements or reflections of the body, which prevents unwanted and accidental gestures from being recognised.

Elliptic Labs is now making the ultrasonic technology (and an SDK) available to manufacturers interested in integrating it into their smartphones and tablets. Smartphones using the technology are expected to be released by Q2 2015.

Source: InAVate
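The foreground/background separation described above can be sketched as a simple first-echo gate: keep only a short window of samples around the earliest strong echo (the nearby hand) and discard later echoes (the body or the room behind it). This is a toy illustration under stated assumptions, not Elliptic Labs' actual algorithm; the threshold and window values are made up:

```python
# Hypothetical first-echo gating: isolate the earliest strong echo (foreground)
# and zero out later echoes (background). Threshold and window are illustrative.

def first_echo_index(samples, threshold):
    """Return the index of the first sample whose amplitude exceeds threshold,
    or None if no echo is detected."""
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            return i
    return None

def gate_first_echo(samples, threshold, window):
    """Keep only `window` samples starting at the first echo; zero the rest."""
    start = first_echo_index(samples, threshold)
    if start is None:
        return [0.0] * len(samples)
    return [s if start <= i < start + window else 0.0
            for i, s in enumerate(samples)]

# Two echoes: a near one (the hand) and a later one (the body behind it).
sig = [0.0, 0.1, 0.9, 0.7, 0.1, 0.0, 0.6, 0.5, 0.0]
print(gate_first_echo(sig, 0.5, 3))
# → [0.0, 0.0, 0.9, 0.7, 0.1, 0.0, 0.0, 0.0, 0.0]
```

Because the later echo is suppressed entirely, motion behind the hand never reaches the gesture recogniser, which is how accidental gestures from body movement are avoided.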