
Google Project Soli: Micro gesture recognition

Google’s Advanced Technology and Projects group (ATAP) previously brought us Project Tango, a smartphone-like device that tracks its own 3D motion and builds a 3D model of the environment around it. ATAP is back with an exciting gesture-sensing technology that replaces the physical controls of smartwatches and mobile phones with your hands, using radar to capture your movements. Previous attempts at gesture-based technology focused on bringing motion sensors to gaming consoles (Microsoft Kinect) or computers (Leap Motion and Intel RealSense), but this is different.

“Soli” is the name of the new visionary project from Google’s ATAP. It stretches the limits of gesture recognition and connected objects through a sensitive radar sensor that reacts to hand movements. Soli relies on complex signal processing and machine learning techniques to detect gestures: a continuous radar signal is emitted and reflected by the fingers, hand, or arm, and the differences between the emitted and received signals are measured and mapped to gestures.
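Google hasn’t published the details of Soli’s pipeline, but here is a minimal, hypothetical Python/NumPy sketch of the general idea behind radar gesture sensing: turn frames of the mixed (emitted vs. received) signal into a range-Doppler map with FFTs, then reduce that map to features a trained classifier could map to a gesture label. All parameter values and function names below are illustrative assumptions, not Soli’s actual design.

```python
import numpy as np

# --- Illustrative parameters (assumed, not Soli's actual values) ---
N_SAMPLES = 256       # samples per radar frame ("fast time")
N_FRAMES = 64         # frames per gesture window ("slow time")

def range_doppler_map(beat_frames):
    """Turn a (N_FRAMES, N_SAMPLES) block of beat-signal frames into a
    range-Doppler map: an FFT across fast time gives range, an FFT across
    slow time gives Doppler (the radial velocity of the hand)."""
    window = np.hanning(N_SAMPLES)
    range_fft = np.fft.rfft(beat_frames * window, axis=1)          # range bins
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)
    return np.abs(doppler_fft)                                      # magnitude map

def extract_features(rd_map):
    """Collapse the range-Doppler map into a small feature vector that a
    separately trained classifier could map to labels like 'tap' or 'swipe'."""
    return np.array([
        rd_map.mean(),                   # overall reflected energy
        rd_map.max(),                    # strongest reflector (e.g. a fingertip)
        np.argmax(rd_map.sum(axis=0)),   # dominant range bin (hand distance)
        np.argmax(rd_map.sum(axis=1)),   # dominant Doppler bin (hand speed)
    ])

# Random noise standing in for real radar hardware output, just to run the code.
rng = np.random.default_rng(0)
beat = rng.standard_normal((N_FRAMES, N_SAMPLES))
features = extract_features(range_doppler_map(beat))
print(features)  # in a real system, this would feed a trained gesture classifier
```

The point of the two-stage FFT is that range and motion are separated before any learning happens, so the classifier only has to recognize gesture-shaped patterns of energy in the range-Doppler map rather than raw radio waveforms.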

Soli’s sensors are designed to capture motion at up to 10,000 frames per second in the 60 GHz spectrum, which pushes accuracy well beyond traditional camera-based gesture tools.
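To get a feel for why the 60 GHz band suits micro gestures, here is a rough back-of-the-envelope calculation (the 0.5 m/s fingertip speed is an assumed, illustrative value): at 60 GHz the wavelength is only about 5 mm, so even tiny finger movements produce measurable phase and Doppler changes in the reflected signal.

```python
C = 3e8                      # speed of light, m/s
F_CARRIER = 60e9             # 60 GHz operating band
wavelength = C / F_CARRIER   # ~5 mm

# A fingertip moving at 0.5 m/s (assumed typical micro-gesture speed)
# produces a Doppler shift of 2 * v / wavelength.
v = 0.5
doppler_hz = 2 * v / wavelength
print(f"wavelength ≈ {wavelength * 1000:.1f} mm, Doppler shift ≈ {doppler_hz:.0f} Hz")
```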

Here’s a video featuring Google’s Project Soli in action:

The future is exciting; let us know what you guys think!