The unspoken pact that we have with machines is that their algorithms will be able to “read the tea leaves” in our data, tell us a story about ourselves in a new way or help us simplify our lives by being more informed.
For example, the motion sensor in the Fitbit listens to our movements, and its algorithms tell us a detailed story of our exercise and sleep – both quantity and quality – and where we need to improve.
Yet in spite of the ever-increasing sophistication of algorithms and machine learning, the stories machines tell us don’t always make us more aware of our behaviour or better informed about ourselves, or help us make better choices. Too often, companies fail to take the time to listen to what their customers actually need and desire.
The value [of products and services] will increasingly come from being great at reading the tea leaves in the data.
Randy Komisar, Partner at Kleiner Perkins Caufield & Byers
There are three ways algorithms under-deliver on the value they promise:
– Although sensors are becoming ubiquitous, there are always gaps in the data, resulting in stories with thin or no data to back them up.
– Algorithms are designed to optimize around specific metrics and sometimes they optimize to a fault, no longer providing value to people.
– The intelligence behind an algorithm and its recommendations may not be transparent to users, making them difficult to trust.
Keeping algorithms user-centered
As algorithms and the stories they tell us become more prevalent in our lives, it’s important to recognize that those stories may be based on noisy data, be over-optimized, or simply seem illogical, and so be ignored. The stories must be designed to create true value for people, or we miss the great potential of IoT and the data it provides.
Product designers and data scientists should work to improve these algorithms together with users so that they can make meaningful contributions to people’s quality of life.
Gaps in the data with Apple Watch
In recent years, the consumer health tech industry has exploded with “wearables” and apps that track our sleep activity, our exercise, our heart rates and more. Algorithms look at that data and offer recommendations on how to improve our health. However, there are limitations.
For example, the Apple Watch tells me, “You got 5 minutes of exercise, but you should be getting 30 minutes.” But how does it know I didn’t take the watch off and go for an hour-long swim? Or that it wasn’t tracking my movement because I simply forgot to charge it last night?
There are always gaps in the data, and this is only compounded when there is no way for humans to manually correct the data. Because of this, we can’t accept the stories algorithms tell as the full story.
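One way to make a gap-tolerant story concrete: report how much of the day the algorithm actually observed, withhold confident claims when coverage is thin, and let the user correct the record. Here is a minimal sketch in Python, with invented names and an arbitrary coverage threshold – not any vendor’s actual logic:

```python
# Illustrative sketch only: hourly step counts from a wearable, where
# None means the sensor recorded nothing (watch off wrist, battery dead).
# All names and thresholds here are invented for the example.

def daily_summary(hourly_steps):
    """Summarize a day of readings, flagging data gaps instead of
    presenting a thin story as the full story."""
    present = [s for s in hourly_steps if s is not None]
    coverage = len(present) / len(hourly_steps)
    return {
        "steps": sum(present),
        "coverage": coverage,           # fraction of hours with data
        "confident": coverage >= 0.75,  # arbitrary threshold for this sketch
    }

def apply_manual_correction(summary, extra_steps, note):
    """Let the user fill a known gap, e.g. an untracked swim."""
    corrected = dict(summary)
    corrected["steps"] += extra_steps
    corrected.setdefault("corrections", []).append(note)
    return corrected
```

A summary built from 16 hours of data out of 24 would report coverage of about 0.67 and `confident` as `False`, so the interface could ask the user what happened rather than assert a verdict.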
Optimizing to a fault with Facebook
The Facebook feed is the lens through which users see the stories of their world of connections. By clicking, “liking” and commenting on posts in their Facebook feed, users signal to the social network’s algorithm that they care about a particular story or message. That in turn helps influence what the algorithm shows in their feed later.
But the algorithm has its own selfish goal as well – to deliver as many promoted articles and ad views as it can to a user’s feed. So what happens if a user likes everything he sees on Facebook for two days? Mat Honan did exactly that, and the result was very telling. His News Feed took on an entirely new character – devoid of any sign of friends and family. His feed became about brands and messaging, rather than real people with messages about their lives.
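The feedback loop Honan broke can be sketched in a few lines. Assume, purely for illustration – this is not Facebook’s actual ranker – that each post is scored by the user’s past likes of its author plus a flat boost for promoted content. When a user likes everything, every author’s affinity rises equally, the signal stops discriminating, and the promoted boost alone decides the ordering:

```python
# Toy feed ranker; all field names and weights are invented.

def rank_feed(posts, likes_by_author):
    """Order posts by a toy relevance score: affinity + promoted boost."""
    def score(post):
        affinity = likes_by_author.get(post["author"], 0)
        boost = 5 if post["promoted"] else 0  # the algorithm's "selfish" term
        return affinity + boost
    return sorted(posts, key=score, reverse=True)

posts = [
    {"author": "close_friend", "promoted": False},
    {"author": "brand_page", "promoted": True},
]

# Selective liking: the signal discriminates, so the friend ranks first.
selective = {"close_friend": 10, "brand_page": 0}
# "Like everything": uniform affinity, so only the promoted boost matters.
uniform = {"close_friend": 10, "brand_page": 10}

print(rank_feed(posts, selective)[0]["author"])  # close_friend
print(rank_feed(posts, uniform)[0]["author"])    # brand_page
```

The point of the sketch is that the user’s signal only protects the feed while it carries information; flood it, and the system’s own objective takes over.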
In more extreme cases, an algorithm can optimize and cause damage at a staggering scale.
Lack of transparency with UPS
UPS reportedly spent 10 years developing the Orion algorithm to give its drivers the most efficient route to take to complete their daily deliveries. The algorithm saves a dollar or two here and there, but when scaled to UPS’ more than 55,000 daily delivery routes, the savings can be huge.
But according to a WSJ article, “Driver reaction to Orion is mixed.” The experience can be frustrating for some who might not want to give up a degree of autonomy, or who might not follow Orion’s logic. For example, some drivers don’t understand why it makes sense to deliver a package in one neighborhood in the morning and come back to the same area later in the day for another delivery. But, as the article puts it, “Orion often can see a payoff, measured in small amounts of time and money that the average person might not see.”
The article continues: “One driver, who declined to speak for attribution, said he has been on Orion since mid-2014 and dislikes it, because it strikes him as illogical.”
Alex Tabarrok writes in “The Rise of Opaque Intelligence” that “the problem isn’t artificial intelligence but opaque intelligence. Algorithms have now become so sophisticated that we humans can’t really understand why they are telling us what they are telling us.”