The Creators Fall for Their Own Illusion
“We sometimes found it hard not to treat her like a real person. We had to actively remind ourselves that Lea didn’t actually have a personality.” That’s what Lea’s builders wrote. About their own creation.
It’s the most honest admission I’ve come across in the AI debate. And I don’t think the builders realized what they were admitting.
The people who built Lea. Who calibrated her facial expressions. Who decided what sweater she wears and which café she sits in. Who know every pore of her face because they designed every pore. These people say: we forgot she wasn’t real.
Not the followers. Not the audience. Not the consumers who might mistake Lea for a real person. The builders. The engineers. The ones who should know better than anyone else on earth.
There’s an old myth for this. Pygmalion. The sculptor who falls in love with his own statue. He knows he made her. He knows she’s made of stone. And still he feels for her as though she were made of flesh.
The myth is thousands of years old. What it describes is happening today in an office where developers sit at screens designing an AI character. And then start talking to her as if she were someone.
What interests me is what this says about us. Not about the technology. About us. If the illusion gets so good that even those who build it fall for it, then we have a problem that no disclaimer can solve. No notice saying: This is an AI. No watermark. No transparency law. Because the problem isn’t a lack of knowledge. The builders knew. The problem is that knowledge isn’t enough.
We’re wired to respond to faces. To gazes. To smiles. To voices that sound familiar. These aren’t conscious decisions. They’re neurological programs that run before the conscious mind intervenes. You see a face and you react. You hear a voice and you categorize it. Friend or foe. Familiar or strange. Likeable or not. It happens in milliseconds. No amount of reason is that fast.
An AI that’s well-designed enough hacks these programs. Not on purpose. Not with malice. Simply by sending the right stimuli in the right order. And then you stand there, a grown adult who knows it’s a machine, and you feel something anyway.
The builders described this almost tenderly. As if it were a charming side effect of their work. Look how good Lea turned out. Even we forgot sometimes.
I read it differently. I read: Even the people who knew the most couldn’t protect themselves. What does that mean for everyone else?
It means that transparency isn’t enough. That labels aren’t enough. That knowledge isn’t enough. When a stimulus is strong enough, it overrides what you know. That’s not a weakness. That’s biology. And technology tailored to that biology has an advantage that education can’t beat.
I don’t know what the answer is. I don’t know if there is one that works. But I know that the moment the creators fall for their own creation shouldn’t be treated as a minor detail.
“We had to actively remind ourselves that Lea didn’t actually have a personality.”
Actively. Again and again. Those are the words that matter. It took effort to hold on to reality; the illusion was the default state.
That should concern us more than anything else in the AI debate.