Ultrasonic tech enables touchless gesture control

Elliptic Labs has developed technology that provides touchless gesture control around the display, up to 180 degrees. It works by sending ultrasound signals through the air from speakers integrated into smartphones and tablets; the signals bounce off the hand and are then recorded by microphones also integrated in these devices.
This allows Elliptic Labs’ technology to recognise hand gestures and use them to move objects on a screen, much as bats use echolocation to navigate.

One major benefit of Elliptic Labs’ ultrasound technology is that it offers a 180-degree field of view. The technology uses microphones and transmitters to sense movement in front of a screen and to the sides, enabling an interaction zone extending over the screen and beyond its edges. Elliptic Labs enables gesturing both from a distance and very close to the screen. Another feature is distributed sensing, which enables motion capture of the hand from multiple angles, avoiding occlusion of objects or parts of an object. The sensors used are MEMS microphones, which can also double up for speech enhancement and recognition.

The ability to separate the first-returning echoes from other echoes arriving later means that Elliptic Labs’ touchless gesturing technology can separate foreground from background. This is essential both for separating finger motion from wrist motion and for separating hand motion from movements or reflections of the body, preventing unwanted and accidental gestures from being recognised.

Elliptic Labs is now making the ultrasonic technology (and an SDK) available to manufacturers interested in integrating it into their smartphones and tablets. Smartphones using the tech are expected to be released by Q2 2015. Source: InAVate
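The foreground/background separation described above can be illustrated with simple time-of-flight gating: echoes that return quickly must have bounced off something close (a hand or finger), while later echoes come from the body or the room and can be discarded. The sketch below is a hypothetical illustration of that principle only, not Elliptic Labs' actual algorithm; the sample rate, range threshold and function names are assumptions.

```python
# Hypothetical time-of-flight echo gating: echoes arriving within a
# short window after the ultrasound pulse are treated as foreground
# (hand/finger); later echoes (body, room reflections) are rejected.

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C
SAMPLE_RATE = 192_000    # Hz; assumed rate high enough for ultrasound

def echo_distance(delay_samples: int) -> float:
    """Convert a round-trip echo delay (in samples) to a one-way
    distance in metres: distance = delay * speed_of_sound / 2."""
    delay_s = delay_samples / SAMPLE_RATE
    return delay_s * SPEED_OF_SOUND / 2

def gate_foreground(echo_delays, max_range_m=0.5):
    """Keep only echoes whose estimated distance falls inside the
    interaction zone; everything farther away is background."""
    return [d for d in echo_delays if echo_distance(d) <= max_range_m]

# Three echoes: two from a nearby hand, one from the user's body.
delays = [112, 280, 1600]
print([round(echo_distance(d), 3) for d in delays])
print(gate_foreground(delays))  # the distant echo is rejected
```

A real implementation would first detect echo onsets in the microphone signal before gating, but the distance arithmetic and the near/far split are the core of the idea.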

Human-like AI is becoming a reality: Year 2029 (Prospects)

Year 2029 (Prospects): A major milestone is reached in the field of artificial intelligence this year, as a computer passes the Turing Test for the first time. This test is conducted by a human judge who engages in a natural language conversation with one human and one machine, each of which tries to appear human. The participants are placed in isolated locations. Information technology has seen exponential growth for decades, leading to vast improvements in memory, processing power, software algorithms, voice recognition and overall machine intelligence. It has now reached the stage where an independent judge is unable to tell which is the real human and which is not. Answers to certain "obscure" questions posed by the judge may appear childlike from the AI, but they are humanlike nonetheless. Source: Future Time Line

Apple paves way for creation of 3D, interactive images from handheld devices

Apple has patented a device display that uses lasers, micro lenses and sensors to create a 3D "holographic" image and to detect how a user interacts with it in real time, according to Apple news feed and forum Apple Insider. The "Interactive holographic display device" would allow a 2D display panel to create a 3D, interactive image, which Apple presumably intends to deploy in devices such as iPhones and iPads. The system would generate multiple views of an on-screen object from various viewing angles, with lenses deflecting laser light. Apple Insider reports that single-finger gestures would turn or move the image, while pinch gestures would change its size. Finger speed would also affect how the image turns or moves. The patent was filed in February 2011. Source: Apple Insider, InAVate

Augmented reality gets serious with high-tech hard hat

Augmented reality developer DAQRI is targeting industrial applications with a hard hat that incorporates 360-degree navigation cameras and a high-resolution depth sensor to deliver augmented reality to workers in the field. It uses DAQRI's tracking technology, Intellitrack, to overlay 4D virtual content on the wearer’s field of vision.
Intellitrack was designed for industrial applications and can maintain tracking when dealing with non-standard shapes and low lighting. Even if the majority of the DAQRI Smart Helmet’s sensors are obscured or blocked, tracking will continue to function. The helmet was designed to integrate with existing hardware and software and to become part of an existing workflow. The interface can be controlled by touch via integration with new form factors such as smartwatches. Source: InAVate

In-your-face projection mapping delivers virtual make-up

Japanese artist, director and producer Nobumichi Asai has unveiled Omote, his latest project, which uses projection mapping techniques to create stunning illusions on a model’s face. In his recently released video the model moves her head while projected graphics constantly transform how she appears, creating masks and cyber-influenced visuals. He was inspired by the Japanese Noh mask.
Laser scanning was used to create a mesh that followed the contours of the model’s face, and it’s rumoured that Asai is now looking to create a system that covers the whole body. Asai has built up a large portfolio of work that includes a number of projection mapping projects, usually featuring huge backdrops including buildings, a dockyard and a large stage show for Subaru. Source: InAVate, Image Courtesy: https://33.media.tumblr.com