Diversity Washing Through AI
There’s a feature called Virtual Try-On. The customer sees clothing on a virtual model that matches their body type. You can choose different skin colors, body shapes and sizes, all automatically generated by AI. And it’s offered as progress on diversity. While immersing myself in the technology for the book, I grew suspicious: this is the opposite of diversity.
Diversity means that different people are equally involved in something, whatever that something may be. Not as a depiction, but as real participation. In practice it means that a model with dark skin stands in front of the camera and gets paid just like anyone else, and that a size-46 model gets booked too. In general terms, it means that someone who looks different from the standard gets an equal place.
What Virtual Try-On does is something else. It creates the image of diversity without a single diverse person being involved. An algorithm generates a picture that looks like representation but isn’t. What I saw was this: a company says it now has diverse models on its website, but not a single one of them is a person. That sounds trivial at first. But when diversity is entirely synthetic, when nobody was hired, paid or included, it shows that the company primarily displays diversity without changing anything about how it operates.
That’s not diversity. That’s decoration. At first I was alone with this view. For most people, what counts is the result on the screen, which seems to represent the right thing. The provider sees it the same way as the customers, and it works: conversion, the hardest currency in online shopping, goes up too. The customer feels addressed and buys. So everything is fine.
Fine. But what has actually changed on the other side of the screen? The same people still make the same decisions, and those structures are cemented, because the numbers seem to confirm them. Everything is fine.
The costs are lower too, because real models cost money and have rights. Rendering an image costs little more than writing the prompt, and the question of image rights is moot anyway.
So the result is: if you look at diversity as a cost-benefit calculation, the AI solution is naturally far superior. More variety is created with less effort. A significant efficiency gain.
But a model was never just an image. Behind it there was always at least one job, usually several. Diverse models earned an income like everyone else, and with it a fixed place in society. An AI-rendered image delivers none of that.
Elsewhere, companies advertise with diversity. Figures like the share of women or people of color in the workforce are socially relevant and reveal something about a company’s ethics, and speeches are given about how important diversity is. But it only counts on the consumer side, not on the cost side. That Virtual Try-On is merely a technological variant of this bigger picture goes unsaid, perhaps because AI-rendered images don’t flow into company figures. The feature creates visible diversity, but real diversity remains untouched.
What we call progress is the image on the surface, not the reality behind it. It is simply a shortcut. Because somewhere there is a model who would have needed that job, whose face could have been on a website, and who would have been paid for it.