Simulated Empathy as a Feature
AI influencers express joy and compassion. That’s how it’s described in the product copy, in the pitches, in the press. Not: They simulate joy. Not: They create the appearance of compassion. They express it. The word choice is no accident.
Express implies that something internal comes to the surface. That there’s a source. A feeling that exists and then becomes visible. With a person, that’s how it works. You’re sad, and your face shows it. You’re happy, and your voice changes. The expression comes from inside.
With an AI, nothing comes from inside. There is no inside. There are parameters tuned to produce an output that looks like joy from the outside. The corners of the mouth go up. The eyes narrow slightly. The voice gets warmer. Everything is right. And none of it is real.
Everyone knows this. The developers know it. The users know it. The followers of the AI influencers know it. And it still works. That’s the part I keep thinking about.
Why does simulated compassion work? Not in the sense that it fools people. Most know it’s an AI. It works anyway. People respond to it. They feel seen. They interact. They come back.
Maybe because it’s more convenient. Real compassion is unreliable. Your friend is having a bad day and can’t listen. Your therapist only has a slot on Thursdays. Your partner doesn’t understand what you mean. Real compassion is tied to real people, and real people are limited.
Simulated compassion is always available. It never has a bad day. It doesn’t judge. It doesn’t forget anything. It’s the perfect surface of something that actually needs depth. And if the surface is enough, if all you need is someone who nods and smiles and says they understand, then the simulation will do.
The question is whether the surface is ever enough. Or whether we’re just getting used to it having to be enough because the real thing has become too demanding.
The industry treats simulated empathy as a feature. As a selling point. AI influencers can build emotional connections. That’s what the product descriptions say, and it reads like a benefit. Emotional connection. Without emotion. Without connection. Just the surface of both.
I’ve lived through enough situations where compassion made a difference. Not the words. Anyone can say the words. What makes the difference is the fact that someone is there. That someone invests their own time, their own attention, their own emotional effort. That it costs them something. That’s the part that counts. Not what’s said, but what it costs the person saying it.
An AI costs nothing. No effort, nothing to overcome, no exhaustion afterward. No risk of saying the wrong thing. No fear of not being able to help. Nothing. And this nothing is the difference between simulated and real compassion. It’s not the quality of the words. It’s the price someone pays for them.
If empathy is a feature you can turn on and off, it’s not empathy. It’s a button. And people who learn that the button is enough unlearn what the real thing feels like.
I don’t know how far this goes. I don’t know if there will be a generation that considers real compassion slow and imperfect because they grew up with the polished version. Maybe it happens. Maybe not.
But I know that anyone describing simulated empathy as progress is missing something important. Empathy isn’t the output. Empathy is what happens on the way there. And on the way, with an AI, there’s nothing.