
In the News

If You Thought Voice Was Easy, Wait Until Devices Make Predictions

'It's a shift from instant gratification to clairvoyance'

Voice technology was everywhere at CES this year (toilets included), in large part because it’s an easy, natural interface. But it still requires input from consumers—which, at least in marketing terms, means friction remains.

“There are few things more natural than simply asking for something and having that thing brought to you,” said Jason Snyder, CTO at brand agency Momentum Worldwide.

Unless, of course, you don’t have to ask at all. Which is precisely how interfaces are evolving.

Per Snyder, connected devices, pervasive broadband and ample data will allow brands to “look at consumer consumption patterns, behaviors, profiles and locations” and “understand that consumer’s needs and geography” so that they can “start to predict what’s going to happen.”

Consumers increasingly expect products and services on demand. Amazon Prime made two-day delivery standard, giving rise to Prime Now, which reduced delivery times to hours. Meanwhile, Amazon also launched Dash buttons—some of which offer automatic replenishment—along with Subscribe and Save, a service that establishes recurring deliveries for select products. Similarly, apps from players like Starbucks store money and automatically reload when the balance falls below a given threshold without any input from the consumer (beyond establishing said threshold).
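The auto-reload behavior described above amounts to a simple threshold rule. Here is a minimal, hypothetical sketch of that logic — the class and parameter names are invented for illustration, not drawn from any real app:

```python
# Hypothetical sketch of threshold-based auto-reload, as in the stored-value
# app example above. The consumer sets the threshold once; after that, no
# further input is required.

class StoredValueAccount:
    def __init__(self, balance, reload_threshold, reload_amount):
        self.balance = balance
        self.reload_threshold = reload_threshold  # set once by the consumer
        self.reload_amount = reload_amount

    def charge(self, amount):
        self.balance -= amount
        # The "implicit interface": replenishment happens without being asked.
        if self.balance < self.reload_threshold:
            self.balance += self.reload_amount  # stand-in for a card charge

account = StoredValueAccount(balance=10.0, reload_threshold=5.0, reload_amount=25.0)
account.charge(7.5)  # balance falls to 2.5, below the 5.0 threshold
# account.balance is now 27.5 after the automatic reload
```

The same pattern underlies automatic replenishment generally: a one-time consumer decision (the threshold) replaces every subsequent request.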

Jeremy Lockhorn, vice president of experience strategy, mobile and emerging technology at digital agency SapientRazorfish, called this a shift from explicit to implicit interfaces.

“The way to think about it is: Right now, we are working the computers and devices, but … in very short order, it’s going to flip and the computers and devices will work for us,” he said. “If you play that out a bit more, it’s much less about me as a user saying, ‘Alexa, turn the lights on,’ or, ‘Google, help me figure out the answer,’ and it’s more about devices getting so smart about our context … they are able to predict what you want and deliver it before you ask for it, regardless of device. It’s a shift from instant gratification to clairvoyance.”

For example, Lockhorn said Google can already send a push notification if it’s time for a user to leave for the airport because Google sees the flight on that user’s calendar and knows the user’s location and nearby traffic conditions.
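The departure alert Lockhorn describes reduces to combining a calendar event with a live travel-time estimate and a buffer. A hedged sketch, with invented function names and numbers (this is not Google's actual API or logic):

```python
# Illustrative only: pick a "leave now" notification time from a flight's
# scheduled departure, a current traffic-based travel estimate, and a
# check-in buffer. All values are hypothetical.

from datetime import datetime, timedelta

def departure_notification(flight_departure, travel_minutes, buffer_minutes=90):
    """Return the moment to tell the user to leave for the airport."""
    return flight_departure - timedelta(minutes=travel_minutes + buffer_minutes)

flight = datetime(2018, 2, 16, 18, 0)  # 6:00 p.m. flight seen on the calendar
leave = departure_notification(flight, travel_minutes=45)  # live traffic estimate
# leave == datetime(2018, 2, 16, 15, 45): notify the user at 3:45 p.m.
```

The "clairvoyant" next step in the article — having already called the Uber — would simply act on this computed time instead of only notifying.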

“I think it’s not too big of a leap to think that the next generation of that will be [Google saying], ‘It’s time to leave for your flight and I called you an Uber,’” he said.

Or, in the connected home, if a consumer accesses a recipe for a casserole from a smart refrigerator and the recipe will result in a pan that’s difficult to clean, the refrigerator could theoretically communicate with the dishwasher to brace itself to run the heavy cycle. Something similar already happens with Whirlpool’s smart washer with Amazon Dash replenishment that automatically reorders laundry supplies when a customer is running low.

Taking that a step further, the washer could perhaps put out a call to retailers for laundry supplies in order to find detergent at the cheapest price, Lockhorn added.
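The price-comparison step Lockhorn imagines is, at its core, a lowest-offer selection across retailers. A minimal sketch, with invented retailer names and prices:

```python
# Hypothetical: a connected washer queries several retailers for the same
# detergent and selects the cheapest offer. Data is invented for illustration.

def cheapest_offer(offers):
    """offers: dict mapping retailer name -> price for the same item."""
    return min(offers.items(), key=lambda kv: kv[1])

offers = {"RetailerA": 12.99, "RetailerB": 11.49, "RetailerC": 13.25}
retailer, price = cheapest_offer(offers)
# → ("RetailerB", 11.49)
```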

“[It’ll be] brands understanding … what is appropriate to bring you,” Snyder said. “All of those friction points are reduced, so it’s [about reading] your mood.”

An even further extension would be a self-driving car that takes a passenger to the beach instead of work because it knows that passenger needs a day off, he added.

But, of course, this requires extensive data that would have to be compiled to assemble a picture of a consumer across touchpoints, with some AI thrown in to make predictions.

“[Brands] need to be able to stitch together and make intelligent guesses about what the user is going to want next, which is a whole lot harder than it sounds … it’s going to be a mix of third party and first party data,” Lockhorn said. “It’s clear that brands and marketers are investing heavily in infrastructure, but there’s a lot more in the works.”

Lockhorn said he has some clients experimenting with this in private programs and he expects to see it more broadly in about 18 months. A rep declined to provide any client names.

“Convenience is huge. We often get into discussions around [whether AI will] destroy jobs and put people out of work, and what [we’d] do in that new world,” Lockhorn said. “I think the positive side of that is there are more opportunities for human-machine collaboration. And the other positive is [that] it frees us up to do other things potentially that are [better] tasks in terms of professional and personal development.”



Publish date: February 16, 2018

Lisa Lacy

