If Google has its way, the hand control technology of “Minority Report” with an interactive gesture-based user interface might soon become a reality. The FCC recently granted the company a waiver that allows it to operate sensors for the initiative, known as Project Soli, at higher power levels than previously allowed.
The technology employs sensors that use radar beams to capture motion in 3D space, allowing users to perform tasks, such as operating a computer, with hand motions alone, without physically touching any hardware or controls. For example, as Business Insider’s Rob Price explains it, “You might press an individual button with your thumb and index finger, or turn a virtual dial by rubbing your thumb and index finger together.” According to researchers who have recently tested the technology, the sensors can be embedded in wearables, phones, computers, and vehicles, and are sensitive enough to count sheets of paper and read Lego bricks.
Google envisions numerous use cases for Project Soli, including helping those with disabilities. What’s truly fascinating about the project, however, is how it challenges us to rethink our relationship with technology. Ivan Poupyrev, a Project Soli researcher when the technology was first announced, said in a 2015 interview, “The whole world is becoming a gadget that we interact with, with software everywhere, which raises the question: how can we interact with the entire world?”
This is a topic we’ve explored numerous times at the APEX—see this piece on facial recognition technology, this post detailing the data monetization opportunities of the concierge economy, and this article examining how we can improve our environment with better access to information.
How can we expect Project Soli to add to the conversation? The details remain to be seen, but the FCC’s waiver gives Google new room for innovation while also encouraging other touchless technology pioneers in their own research initiatives.