ATAP’s ‘Soli’ Radar-Based Gesture Control Could Be The Perfect Wearable Interface

Google’s ATAP Project Soli is an attempt to harness the expressive power of hand and finger motion to make it easier to interact with ever-smaller devices and screens. It posits that instead of using tools, we should use our “hand motion vocabulary” to control devices, even when the device itself isn’t in hand.

In practice, Soli lets you control devices using natural hand motions, detecting incredibly fine movements accurately and precisely, even through materials (you could install the sensor beneath a table, for instance). It does this using radar (more on that later), and because there are no touch targets to size, it lets you manipulate tiny or huge displays with equal accuracy.

Haptic feedback comes for free, since your hand provides it naturally: your own skin offers friction when you touch fingertip to fingertip. Soli, then, is designed to reimagine your hand as its own user interface. It requires a sensor to detect motion, of course, but should need no other hardware.

[Gallery: photos of the Project Soli demo from Google’s ATAP session at I/O 2015]

Soli is designed to work in tiny devices, like smartwatches, and to work through surfaces and at a distance. ATAP found that radar was the appropriate sensor to accomplish this, satisfying all its demands except for size. They therefore set about making it smaller. Through an iterative hardware design process, they shrank it from about the size of a home gaming console to something smaller than a quarter.

ATAP also achieved scale production, and did it all within 10 months of development time. They also figured out how to track minute changes in the received signal to determine the position of individual fingers. The system responds to very slight motion, thanks to processing that transforms the signal into a number of different representations and combines them into a final, subtle, multidimensional picture of exactly what your hand is doing.
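ATAP didn’t detail its pipeline on stage, but a common way to turn raw radar returns into the kind of intermediate representation described above is a range-Doppler map, computed with FFTs across “fast time” (samples within a chirp, giving range) and “slow time” (successive chirps, giving velocity). The sketch below is illustrative only; the array shapes, windowing, and names are my assumptions, not Soli’s actual processing.

```python
import numpy as np

def range_doppler_map(frames: np.ndarray) -> np.ndarray:
    """Compute a range-Doppler map from a buffer of raw radar samples.

    frames: 2D array of shape (num_chirps, samples_per_chirp) --
    "slow time" down the rows, "fast time" across the columns.
    All parameters here are illustrative, not Soli-specific.
    """
    # Window both axes to suppress FFT sidelobes.
    fast_win = np.hanning(frames.shape[1])
    slow_win = np.hanning(frames.shape[0])[:, None]
    windowed = frames * slow_win * fast_win

    # FFT over fast time yields range bins; a second FFT over slow
    # time (per range bin) yields Doppler (velocity) bins.
    range_profile = np.fft.fft(windowed, axis=1)
    doppler = np.fft.fftshift(np.fft.fft(range_profile, axis=0), axes=0)

    # Magnitude in dB: each moving scatterer (e.g. a fingertip)
    # appears as a bright cell at its range/velocity coordinates.
    return 20 * np.log10(np.abs(doppler) + 1e-12)

# Synthetic noise standing in for a real 64-chirp frame buffer.
rd = range_doppler_map(np.random.randn(64, 128))
print(rd.shape)  # (64, 128): Doppler bins x range bins
```

Tracking subtle finger motion would then amount to watching how energy shifts across these cells from frame to frame.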

An API will expose the translated signal information to developers, letting them work with each of the various stages of interpreted data as they see fit.
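ATAP hadn’t published that API at the time of writing, so the following is a purely hypothetical sketch of the layered-data idea: every name in it (SoliSensor, the stage constants, subscribe) is invented for illustration.

```python
# Hypothetical sketch only: no Soli API had been released, so all
# names below are invented to illustrate "pick your processing stage".

class SoliSensor:
    # Stages of interpreted data, from raw signal up to gestures.
    RAW, RANGE_DOPPLER, FEATURES, GESTURES = range(4)

    def __init__(self):
        self._handlers = {}

    def subscribe(self, stage, handler):
        # Developers choose which stage of the pipeline to consume.
        self._handlers.setdefault(stage, []).append(handler)

    def _emit(self, stage, payload):
        # Stand-in for the sensor pushing data to subscribers.
        for handler in self._handlers.get(stage, []):
            handler(payload)

sensor = SoliSensor()

# An app that only cares about recognized gestures:
sensor.subscribe(SoliSensor.GESTURES, lambda g: print("gesture:", g))

# An app doing custom recognition might tap a lower stage instead:
sensor.subscribe(SoliSensor.FEATURES, lambda f: print("features:", f))

sensor._emit(SoliSensor.GESTURES, "virtual_dial_turn")
```

The appeal of exposing every stage is that app developers can use the stock gesture set for free, while researchers can drop down to the raw or intermediate data to build recognizers of their own.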

Soli is being targeted for broad availability later this year, in a form that’s usable in wearable devices like smartwatches. This could seriously improve the chops of Android Wear hardware, so here’s hoping it arrives in time for a big flagship device launch this holiday season.



from TechCrunch http://feedproxy.google.com/~r/Techcrunch/~3/Cf0jDNZHiAg/
via IFTTT
