There’s a new project from the Google stable called Soli that has the potential to bring gesture controls to all sorts of hardware, including wearables. The example GIFs below make it look like some kind of telekinetic voodoo, but in reality the gestures are made possible by a new “interaction sensor” that uses radar technology to detect and recognise movements made near it.