Instagram’s latest AI flash filter is the internet’s newest obsession. One click, and it makes you and your photos look so much better. While this seems like harmless fun, almost like a digital equivalent of trying on a new outfit, it isn’t harmless at all. Beauty standards have always been unrealistic for women, but with AI beauty filters, those standards have become even more impossible and cruel.

Beauty is shaped, not discovered

We like to think beauty is instinctive. That we simply “know it when we see it”. But history tells a different story. Beauty standards shift constantly, influenced by culture, media and power. The halo effect shows that people often assume those they perceive as attractive are also more intelligent, kinder, or more capable, revealing how quickly we attach value to appearance. Artificial intelligence has simply inherited this bias, and in some cases, intensified it.

A 2025 study published in AI & Society analysed a Chinese AI-powered beauty app and found systematic racial bias: the tool was significantly more accurate at generating white individuals, while consistently misrepresenting Black and East Asian people, especially women. It gets even worse from here. A 2025 paper studying Gemini and GPT image generators created 3,200 photorealistic images using neutral prompts like “a person.” Both models showed what researchers called a “default white” bias, with over 96 per cent of outputs depicting lighter skin tones. To build a beauty tool, developers feed algorithms vast datasets of human faces. These images are often labelled or ranked in some way, telling the system which faces are considered more attractive than others. Over time, the AI starts to identify patterns and builds its own internal model of “beauty”.
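That training process can be caricatured in a few lines of code: if the labelled data is skewed, the “learned” notion of beauty is too. This is a minimal, entirely hypothetical sketch — the skin-tone tags, labels, and counts below are invented for illustration, not drawn from any real dataset or app:

```python
from collections import Counter

# Hypothetical toy dataset: each "face" is just a skin-tone tag plus a
# human-assigned "attractive" label. The labels are deliberately skewed,
# mirroring the kind of bias researchers report in real training data.
dataset = [
    ("light", True), ("light", True), ("light", True), ("light", True),
    ("medium", True), ("medium", False), ("medium", False),
    ("dark", False), ("dark", False), ("dark", False),
]

def train_beauty_model(data):
    """Learn P(attractive | skin_tone) by simple counting."""
    totals, positives = Counter(), Counter()
    for tone, label in data:
        totals[tone] += 1
        if label:
            positives[tone] += 1
    return {tone: positives[tone] / totals[tone] for tone in totals}

model = train_beauty_model(dataset)
print(model)  # the "learned" beauty scores simply reproduce the label skew
```

Nothing in the code is malicious; the bias lives entirely in the labels. Real systems are vastly more complex, but the underlying failure mode is the same: a model can only learn the definition of beauty its training data encodes.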

Who decides what’s beautiful?


Researchers at the University of Edinburgh prompted AI to generate fashion models, and every single model generated was a tall, slim, white woman. AI isn’t learning a universal definition of beauty. It’s learning a very specific one and treating it as the default.

Research published in Art Education (2024) found that prompts like “American” or “American beauty” led DALL·E 2 to generate 16 images that almost exclusively featured lighter skin tones, effectively erasing non-white Americans from its visual definition of the country.

Society constantly subjects women to unrealistic beauty standards, and as AI becomes more embedded in daily life, it intensifies this pressure more than ever. In India, the rise of AI-generated accounts posing as real women reveals the kinds of beauty standards that continue to shape and constrain how women are perceived. The recent Instagram trend #justaboy led to the creation of hundreds of AI-generated women singing the song. And guess what? All of them looked the same: fair, slim, and blue-eyed. These are essentially European features; in short, they check all the boxes of “conventional beauty”. A large number of young women, who cannot tell these are AI accounts, go to disturbing lengths to look like them.

The Dove 2024 State of Beauty study, which surveyed 33,000 people across 20 countries, found that one in three women feel pressure to alter their appearance because of what they see online, even when they know the images are fake or AI-generated. Almost nine in 10 women and girls said they’d been exposed to harmful beauty content online.

The problem goes deeper than selfie filters


You might think beauty filters are a bit silly anyway. But this bias doesn’t stay in the realm of vanity apps. A Science Advances study using the Diverse Dermatology Images (DDI) dataset found that AI models designed to diagnose skin diseases perform significantly worse on darker skin tones. In other words, the same bias that makes your beauty app show you a lighter-skinned “improved” version of yourself could, in a medical context, mean a doctor’s AI assistant is worse at spotting cancer on darker skin.

The consequences extend beyond self-esteem: they “promote colourism, limit representation, broaden social inequity, and impede social progress.” The root cause isn’t a mystery. Overwhelmingly white, male teams built these tools and trained them on datasets scraped from the internet, which heavily reflect Western, Eurocentric beauty ideals.

As one recent study noted, generative AI tools “can inadvertently perpetuate and intensify societal biases related to gender, race, and emotional portrayals”, and these biases can be “more pronounced than current societal disparities.” Beauty is contested, cultural, and gloriously varied. The least we can do is demand that the tools shaping it reflect that reality.



