Your Bad Mood as a Data Point
It didn’t feel particularly dramatic, and it happens from time to time that I have a bad day. But there it was again: one of those days where you wake up tired and stay tired all day. When even the first cup of coffee doesn’t help, and neither does the second one. That evening I sat on the couch and scrolled through shopping portals and price comparison sites. I was really just looking for relaxation, and I knew that wasn’t a particularly good idea. Scrolling is not relaxation. The internet was once again extraordinarily motivated to get me to buy something. What I was shown matched my current mood, but not my actual needs. So I looked at things I would have ignored on a normal day. I was more receptive.
I could feel that, once again, the algorithm knew exactly what was going on with me that day.
AI and the algorithms behind it detect emotional states and respond to them in real time. I don’t know whether it was dwell time, scrolling behavior, browser history, the time of day or all the cookies that created this picture of my moment of weakness. The industry calls it “precision and context” and celebrates it as progress.
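To make the mechanism concrete, here is a minimal sketch of how such signals could be folded into a single score. Everything in it is invented for illustration: the signal names, the thresholds, the weights. A real system would be a learned model fed by far more data, not a handful of hand-tuned rules, but the shape of the inference is the same: behavior in, estimated state out.

```python
from dataclasses import dataclass

@dataclass
class SessionSignals:
    """Behavioral signals a tracker can observe without asking a single question."""
    hour_of_day: int           # local time, 0-23
    avg_dwell_seconds: float   # average time spent lingering on a page or item
    scroll_velocity: float     # screens per minute; slow, aimless scrolling scores high
    session_length_min: float  # how long the user has been browsing

def receptivity_score(s: SessionSignals) -> float:
    """Combine observed behavior into a rough 0..1 'receptivity' estimate.

    Late-night hours, long unfocused sessions, slow scrolling, and lingering
    all push the score up. The weights here are made up for illustration;
    a production system would learn them from conversion data.
    """
    score = 0.0
    if s.hour_of_day >= 23 or s.hour_of_day <= 4:  # late night, can't sleep
        score += 0.35
    if s.avg_dwell_seconds > 20:                   # lingering on items
        score += 0.25
    if s.scroll_velocity < 1.5:                    # slow, aimless scrolling
        score += 0.2
    if s.session_length_min > 30:                  # long unfocused session
        score += 0.2
    return min(score, 1.0)

# A tired 1 a.m. session on the couch: long, slow, lingering.
print(receptivity_score(SessionSignals(1, 32.0, 0.8, 45.0)))  # -> 1.0
```

None of these signals is intimate on its own. Their combination is what produces the picture of a moment of weakness.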
There are moments in life when you’re vulnerable. A breakup, a loss, a sleepless night, an argument that’s still smoldering. In those moments you make different decisions. You buy things you don’t need. You click on things you’d normally ignore or look for something that offers comfort.
A vulnerable state is also part of being human. And the technology is designed to detect that state and use it.
Not use in the sense of: help. But use in the sense of: monetize. The difference is important, because a friend who notices you’re not doing well and asks if they can help uses emotional perception for you. A system that notices you’re not doing well and shows you an offer uses it against you. The perception is the same. The resulting intent is not.
The tech industry is notoriously shameless here. Out front they talk about “empathic AI” that responds to the customer’s emotional state, leaving the impression that this works in the customer’s favor. But the word empathy is problematic in this context. Because a machine that detects your bad mood and shows you a product that perfectly matches it is, purely rationally speaking, empathic. And that’s exactly how it gets sold.
Empathy means compassion: sensing another person’s state and responding in their interest. What’s happening here is the opposite. It uses the ability to read a person’s state, but the resulting action is exclusively in the system’s own interest. That’s not empathy; that’s exploitation. But exploitation doesn’t read well in a policy.
I think about specific situations. A person going through a severe personal loss who can’t sleep at night. When they scroll, the system registers the time, the behavior, the dwell time, and draws conclusions about their state. From my own experience I know how advanced these algorithms are, and I’m not exaggerating when I say that a system recognizes: this person is vulnerable right now. And it immediately shows an offer. It could be a trip, a subscription, a self-help course, anything that promises comfort and converts immediately.
On a day when they were rested and clear-headed, they wouldn’t have clicked; they would have scrolled past. But the system recognized the right moment. And the right moment is the vulnerable moment.
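The targeting step is just as easy to caricature. Another hedged sketch, again with invented categories and an invented threshold; the point is not the interest matching, which is ordinary recommendation logic, but the gate in front of it:

```python
def pick_offer(receptivity: float, interests: list[str]) -> str | None:
    """Decide what to show, given an estimated emotional state.

    The 'comfort' inventory only unlocks above a receptivity threshold.
    All names and the 0.7 cutoff are hypothetical, for illustration only.
    """
    COMFORT_INVENTORY = {
        "travel": "last-minute getaway",
        "wellness": "self-help course subscription",
    }
    if receptivity < 0.7:
        return None  # a rested, clear-headed user just scrolls past anyway
    for interest in interests:
        if interest in COMFORT_INVENTORY:
            return COMFORT_INVENTORY[interest]
    return "comfort-food delivery voucher"  # generic fallback that still converts

print(pick_offer(0.2, ["travel"]))  # None: the gate stays closed
print(pick_offer(0.9, ["travel"]))  # 'last-minute getaway'
```

Same inventory, same interests, different moment. The offer appears only once the estimated state crosses the threshold.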
The future of marketing is “context-sensitive,” they’ve been saying for years. Context in this case means the system doesn’t just know what you want but how you’re feeling right now. It doesn’t just know your preferences. It knows your state. And it acts accordingly.
Of course nobody controls which states may be commercially exploited and which may not. Just as with data protection, there are boundaries that urgently need to be enforced, but the fines occasionally imposed on tech companies for violations tend to be laughed off in Silicon Valley. Which means every line in some code of ethics that says grief is not a data point, loneliness is not a conversion window, or fear is not a targeting criterion is one of the most brazen lies told to the faces of consumers and legislators.
Data remains data, and every moment can be turned into a moment where something can be sold. But I wonder how the people who build this explain it to their kids at night. Daddy worked today on teaching machines to recognize sad people and sell them something they don’t need and that doesn’t help them either. Some might answer: that’s not what it was meant to do, or: we didn’t think that far. I dare say it is often meant exactly that way.