Simulated Empathy as a Feature
AI influencers bring emotion and persuasiveness into marketing. Obviously they simulate it: the joy, the anger, the compassion they so often express so effusively. But they do express something.
Expression implies that something internal comes to the surface, usually in the form of a message, and that there is a source behind it: an actually existing feeling that becomes authentically visible on video. With a normal person, that’s usually the case. You’re sad and your face shows it. The same goes for joy: your voice changes, and everyone senses it comes from inside. With influencers, everyone is supposed to sense that it comes from inside, and we assume the same applies to AI influencers.
But with an AI, nothing comes from inside. There is no inside. There is only code and its parameters, tuned so that the output looks like joy from the outside: the corners of the mouth go up, the eyes narrow slightly, the voice gets warmer, everything fits together. But none of it is real. Obviously. Everyone knows this. The developers certainly do, and the users, meaning the followers, know it too, provided they know it’s an AI-generated character.
Why does simulated compassion work in an environment where everyone supposedly knows? These aren’t particularly cleverly disguised deceptions. Some are, but most should be easy to see through, because most people know there’s an AI behind them. The fact is: people respond. They feel addressed and often interact as if they were dealing with real people.
Real compassion requires attention, and it doesn’t always work perfectly: a friend having a bad day may not be able to listen to you. But we generally associate real compassion with real people, along with the flaw that real people are only available to a limited extent.
Simulated compassion, on the other hand, is always available. It never has a bad day and never forgets anything. It always presents a perfect surface of something that actually needs depth or comes from depth. And if the surface is enough, if all you need is someone who smiles and says they understand you, then the simulation will do.
The question is whether we’re content with the surface. Superficiality isn’t exactly a quality we value in the people around us. Or maybe we’re just slowly getting used to it having to be enough, because real compassion has become too demanding for many.
The tech industry treats simulated empathy as a feature. AI influencers can indeed build emotional connections, if the human on the other side is willing to accept them. But what emerges is an emotional connection without real emotion and without real connection. Just the surface of both.
I’ve lived through enough situations where compassion genuinely saved my mood. It wasn’t the words that made the difference, but the fact that someone was there: their presence, the time they took, the emotions they shared. That’s the real emotional connection that ultimately counts and changes a mood. It’s not about what is said, but what it means to the person saying it, and what you feel in that moment.
An AI costs nothing at first. Sure, the servers and the energy, but that’s not what we’re talking about here. It costs the AI nothing to listen and send calculated signals back. It isn’t exhausted afterwards, and you don’t end up in each other’s arms in tears. Its risk of saying the wrong thing is manageable, so it has no fear of saying it, and no guilty conscience that it can’t do more. These are the differences from real compassion: not the quality of the words, but the imaginary price someone pays for compassion. I don’t like that sentence at all, but it’s meant to rationalize the difference between emotional coldness and warmth.
My takeaway from these thoughts is: if empathy is a feature you can activate and deactivate, then it’s not empathy, it’s the opposite. And my concern is: if we cultivate that too much, we’ll quickly forget what the real thing feels like.
I don’t know how far this goes, and I doubt anyone can predict it. I also don’t know if there will ever be a generation that considers real compassion slow and imperfect because they grew up with the polished version.
But I know that anyone who describes simulated empathy as progress is confusing empathy with output. So we need a new term for output empathy.