Intelligent machinery

Speaker see. Speaker do

Household electronics are undergoing a sensory makeover

SMART SPEAKERS, like Amazon Echo, Google Home and Apple HomePod, are spreading rapidly, and it is now common to hear people asking such assistants to provide weather forecasts or traffic updates, or to play audiobooks or music from streaming services. But because a smart speaker can act only on what it hears, it has little understanding of objects and people in its vicinity, or what those people might be up to. Having such awareness might improve its performance—and might also let users communicate with these digital servants by deed as well as word. Several groups of researchers are therefore working on ways to extend smart speakers’ sensory ranges.

One such effort is led by Chris Harrison and Gierad Laput of Carnegie Mellon University, in Pittsburgh, Pennsylvania. On May 6th, at a conference in Glasgow, Dr Harrison and Mr Laput unveiled their proposal, which they call SurfaceSight, to give smart speakers vision as well as hearing. Their chosen tool is lidar, a system that works, like radar, by bouncing a beam of electromagnetic waves off its surroundings and measuring how quickly those waves return. That information, run through appropriate software, builds up an image of what the beam is pointing at. If, as many radars do, a lidar then revolves, it can sweep the beam around to create a 360° picture of its surroundings.
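The time-of-flight principle described above can be sketched in a few lines of Python. This is a toy illustration under assumed, simplified conditions (a single 2D scan plane, idealised returns), not the researchers' actual code; the function name `lidar_point` is hypothetical:

```python
import math

C = 299_792_458.0  # speed of light in metres per second

def lidar_point(angle_deg, round_trip_s):
    """Convert one lidar return (beam angle, round-trip time) to an (x, y) point.

    The beam travels out and back, so the distance to the target is
    half the round-trip time multiplied by the speed of light.
    """
    distance = C * round_trip_s / 2.0
    theta = math.radians(angle_deg)
    return (distance * math.cos(theta), distance * math.sin(theta))

# As the unit revolves, a full sweep of such returns builds up a
# 360-degree "slice" of the room: here, four hypothetical returns,
# one per quadrant.
returns = [(0, 2.0e-8), (90, 1.0e-8), (180, 3.0e-8), (270, 1.5e-8)]
point_cloud = [lidar_point(a, t) for a, t in returns]
```

Software on top of such a point cloud then does the real work of recognising which shapes correspond to which objects.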

Dr Harrison and Mr Laput have fitted such a system to an Amazon Echo speaker, permitting it to sense and identify nearby household objects and to recognise hand gestures—and,…