There’s a new project from the Google stable called Soli that has the potential to bring gesture controls to all sorts of hardware, including wearables. The example gifs below make it look like some kind of telekinetic voodoo, but in reality the gestures are made possible by a new “interaction sensor” that uses radar technology to detect and recognise movements made near it.

Google quietly unveiled Soli at its recent I/O conference, and while the initiative is still in its infancy, expect to see an API for developers in the coming months.

Here’s a video showing just some of the potential use cases for Soli; we can imagine a whole lot more.

https://www.youtube.com/watch?v=0QNiZfSsPc0