The Echo Look is Amazon’s Trojan Horse

The online retailer has just added vision to voice, and extended Alexa beyond your kitchen counter.



Hi Backchannel readers, it’s Jessi. Earlier this week, Amazon introduced the Echo Look, a $199 Alexa-enabled camera. Speak to the white oblong assistant, and it will take selfies of your outfits and let you consult style experts to improve them. It’s a smart way for Amazon to begin collecting data about what we like to wear, which will inevitably inform the company’s primary business, selling us what we like to buy. But from where I sit, that’s the least interesting thing about this device. You should think of the Echo Look, instead, as Amazon’s Trojan horse. By adding a camera, Amazon has just added a massive new source of data. And by moving its personal voice assistant from the kitchen, where it has dutifully been updating you on the weather, to the bedroom, Amazon is familiarizing us with the idea that wherever we are in our homes (or our cars), Amazon is at our beck and call.

Not that long ago, Alexa was a novelty. It launched two years before its closest competitor, Google Home, and customers delighted in its games and tricks (and the occasional joke). But today, nearly every large tech company is trying to position itself as the new operating system for the voice-activated era. Just six months after it launched, Google Home has already proven itself a formidable rival. Microsoft and Apple are reported to be working on their own products.

As I wrote in December, Amazon was able to capture an early lead because it adopted a simple strategy: It under-promised and over-delivered. When Apple introduced its personal assistant, it promised a human-like experience; Siri wasn’t nearly that intelligent, particularly in the early days. By contrast, Amazon offered a simple, voice-enabled speaker that could respond to a few commands. It never promised to carry on sophisticated conversations; it played music and answered basic questions. As people fell for the device, developers began programming new skills for it, and it became even more interesting to its owners.

But the fact that Amazon has taught us a new habit doesn’t guarantee it will be the company to dominate the future market. Though more than eight million people now own an Echo, that’s still fewer than three percent of the people in the United States. And by the middle of the year, Strategy Analytics predicts, Google will have sold its first million devices, positioning it to catch up quickly. What’s more, Google’s product is already better than Amazon’s in some areas. Earlier this month, the company announced that Google Home can recognize up to six different voices; Amazon can’t do that. So the company must work aggressively to improve Alexa, and to ensure that consumers find lots of different reasons to love it before they ever even try a competitor.

In its rollout, the Echo Look is classic Amazon. The company has launched a one-trick pony of a device, marketing it according to a single use case. If it takes off, the first people who buy it will be fashion-conscious selfie-takers. But of course once they get it into their bedrooms, it’s likely they’ll discover what others have before them: Alexa-enabled devices are useful for a lot of different things. Just scroll down to the bottom of Amazon’s advertisement for the Echo Look, where the company tells you it’s always getting smarter and lists the many non-fashion-related tasks you can perform.

And Amazon has paired voice with vision. Sight is another sensor, and it has the potential to make a voice-operated system far more powerful by supercharging it with data, and context, from an entirely new source.

This is a conversation I’ve long had with Evan Nisselson, founder of LDV Capital and the convener of the LDV Vision Summit. Many years before we were snapping selfies on every device, he understood that images were language, and that computers were going to grow better and better at reading that language. When the Echo Look launched earlier this week, he reminded me of his theory on the “Internet of Eyes.” That’s his thesis on what will become possible as computer vision allows objects to see. “The combination of different types of visual data from photographic, thermal, CT, MRI, X-ray, ultrasound, and white light with computer vision, machine learning, and artificial intelligence will deliver high quality signals unlike anything we’ve had previously,” he wrote in a recent piece on the subject.

If Amazon’s goal is for Alexa to be able to converse with you, it will need to know a hell of a lot more about you. As a sensor that’s able to detect the context of an object, a camera will be crucial to that effort. The Echo Look may start with fashion selfies, but surely, Amazon must believe, people will find other interesting things to do with it. It’s worked out that way for Amazon so far, at least.