The Economy of Feeling
The AI industry wants to take personalization “to the next level”. That’s what it claims, anyway. And it builds solutions around one principle: the customer must feel like they’re being treated individually.
In other words, you use artificial intelligence and pretend. The most common example is chatbots. I ask a question and get called by name, because I typed it in earlier or it’s in my profile. It’s the same thing as the old email newsletters. Hello Mr. Christian. Or Hello Mr. Christian Huser. Or Hello Huser. Whatever the case, in the early days of mass emails it felt somehow warm.
In the real world, it still works better. When I sit in a good restaurant and the waiter knows my name, I feel at home right away. Sometimes there’s a hug, and that goes beyond being a valued customer. And it costs the waiter almost nothing. I get his attention; he remembers me. With that many customers, that’s worth something. The impression is real and I feel honored. It means something.
Back to artificial intelligence. To the database fields that have my name stored and that an algorithm is now thinking about. The last ten emails I received from companies all started with my first name. They sounded personal, as if someone had thought of me. And still I react differently to an email that starts with Christian than to one that starts with Dear Customer. At first I reacted like this: How do they know my name? When did I sign up? Why? Today I still react, but I’m more suspicious than ever.
The industry calls it scaling customer experience. The feeling of being perceived as a special customer was already a mass product, but now everything is supposed to get even more personal. My name, my needs, every trace I’ve left behind, all go into the pot. At the end of it all, it’s emails and personalized greetings when I’m logged in.
But I can’t shake the feeling that I’ve become the mass product. It devalues me. Everything is perfect: the offer, the approach. The illusion gets better, but not the heart. I know that behind the screen there’s no person who actually knows me. And when every email sounds personal, the personal stops meaning anything. It just pretends I matter. When every chatbot calls me by my first name, the first name is no longer a sign of appreciation.
The AI remembers everything. In my last chat, it asked me whether the issue with my subscription had been resolved. I was trying to get around the subscription, but it caught me. It couldn’t forget. That’s inhumanly perfect.
The AI is supposed to feel like a person who cares about you. Who does everything right, is always there and never gets impatient. And when you then meet a real person who forgets and gets impatient, that no longer feels like care. It feels like a deficiency. But it also feels human. Maybe we’ll miss that someday. A person with feeling. Who makes mistakes sometimes. And when they don’t, we appreciate it.
That is the economy of feeling: making the simulation so good that reality fades next to it. Or stays human.