Dispatches from the Empire


Generative AI like Midjourney creates images full of stereotypes

A new Rest of World analysis shows that generative AI systems have tendencies toward bias, stereotypes, and reductionism when it comes to national identities, too. 

Of course! These systems are built on broad data sets, not specific outliers; they render the average and smooth away the exception.

This isn’t just AI, either. It’s in the recommendation algorithms behind Facebook, TikTok, YouTube, and the rest. We humans create these algorithms in our own image. Why do most YouTube “celebrities” look so similar? Why are so many female TikTok “stars” facsimiles of the Kardashians, themselves facsimiles of a standard of beauty now twenty years old?

These algorithms are built on millions of clicks, taps, scrolls, and hours watched. They’re extremely efficient at doing what old-school media has always done: flatten culture. After all, who were John Wayne and Frank Sinatra if not the embodiment — and perpetuation — of stereotypes?

What’s unnerving about social media and AI is that this flattening happens at a speed that was never possible in our analog culture.

Humans are not built for speed. We might be addicted to it, but our brains didn’t evolve to handle it.

The future looks terrifically unsettling.