From Puppy Face Filters to Algorithmic Assessments of Attractiveness: How AI and AR are changing our perceptions of beauty

When I tried Snapchat’s puppy face filter back in 2015, I never thought the same technology would become the norm six years later. As Augmented Reality advanced, more filters were developed, with beauty filters gaining the most popularity. These so-called beauty filters have been slowly distorting our perception of beauty and our expectations of how people should look.

From puppy face filters to impossible beauty standards

Our perception of beauty and the ‘ideal body’ has changed over time and will continue to change. What’s alarming is how quickly the share of people dissatisfied with their bodies is growing. According to a 1997 study, 56% of women and 43% of men were dissatisfied with their bodies. By 2018, those figures had risen to 83% of women and 75% of men! Almost eight out of every ten people you know are dissatisfied with their bodies.

A number of factors have contributed to this, but face filters and apps like Facetune claim a significant share. The first major development in face filters happened back in 2013 when Lightricks, a company that develops video and image editing apps, introduced an app called Facetune. This app allowed users to make themselves look better in images by reshaping the face, taking in the waist, and whitening the teeth. Although image editing was popular even before Facetune, heavy editing to the extent of reshaping the face and body gained popularity only after Facetune.

Later, in 2015, Snapchat introduced a feature called Lenses, what we call ‘face filters’ today: an AR-based technology that lets users change their looks in different ways. Selfies got a lot more creative with puppy ears and barfing rainbows. But a few months later, beauty filters built on Lenses started stealing the limelight. While heavily edited images were easy to spot in the early days, today they are nearly impossible to detect. The gap between our real and virtual appearances keeps getting wider. Instead of just making us look better in images, these filters carve an ‘ideal’ version of ourselves into our minds: a version with thin waists, altered bone structure, flawless skin, and amplified lips.

Of course, magazines and publishers were promoting impossible beauty standards long before Facetune and Snap Lenses. The difference? Today we all hold the same power in our hands as content creators, with digital augmentation as simple as the press of a button.

Once “beautify” is selected on the app Meitu, “Auto” is an option. There is quite literally a ‘default’ of beauty here.

Before we became the publishers of our own content, we were taught that the images in magazines were digitally altered, though we likely underestimated the extent. Today, our social media feeds are brimming with undetectable, heavily edited images. Our fashion and beauty icons share their own images, but do they look like that in person?

Is one’s digital identity usually aligned with their identity IRL?

‘Try before you buy’: finding brand opportunities in face filters

With the amount of attention face filters were attracting, several beauty brands found marketing opportunities in them. It started with a new concept called ‘Try before you buy,’ a feature introduced by Kylie Jenner’s cosmetic brand Kylie Cosmetics that let users try the brand’s product line virtually using face filters. What started as a fun technological feature quickly became a way to sell more cosmetic products.

Now, imagine a cosmetic brand has come up with a new product, something that we are not used to. The first challenge for the brand is to get people to use their product. In other words, create a market for it. People are more likely to buy the product if they can see that it will look good on them.

With these face filters, cosmetic brands can market their products in no time by introducing new filters that mirror the effects of the products on our selfies. Once people start using these filters on Snapchat, Instagram, or Facetune, they start believing that the brand’s product is what will make them look like that ideal, virtual version of themselves.

Beauty score and the unification of beauty perceptions around the world

A recently developed Instagram filter gives us a beauty score based on our looks. Yes, a ‘beauty score’! It attacks our already fragile self-esteem. But what exactly is a beauty score, and how is it determined? We all know that beauty is subjective, so there will never be one single person considered a ‘10’ by the entire population. The perception of beauty changes from place to place and from person to person; it always has. Then how does this filter calculate a beauty score when beauty is perceived so differently by different people?

Here’s where it shocks us. The app uses an algorithm that assigns the beauty score based on our facial appearance. Essentially, the geometric proportions of our face, measured against standardized beauty norms, determine the score. The problem is that our perception of beauty is no longer in our hands; it is controlled by an algorithm written by a programmer, probably one working for a cosmetic brand.
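To make the mechanics concrete, here is a minimal sketch of how such a score could be computed. To be clear, this is not the actual filter’s algorithm: the landmark points, the golden-ratio target, and the scoring formula below are all assumptions chosen purely for illustration.

```python
import math

# Hypothetical facial landmarks in pixel coordinates (x, y).
# In a real filter these would come from a face-landmark detection model.
landmarks = {
    "left_eye":  (310, 220),
    "right_eye": (410, 220),
    "nose_tip":  (360, 300),
    "mouth":     (360, 360),
    "chin":      (360, 440),
}

def dist(a, b):
    """Euclidean distance between two landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Midpoint between the eyes, used as a reference point below.
eye_mid = ((landmarks["left_eye"][0] + landmarks["right_eye"][0]) / 2,
           (landmarks["left_eye"][1] + landmarks["right_eye"][1]) / 2)

# Geometric proportions of the face, each judged against one fixed "ideal"
# target (the golden ratio, standing in for a standardized beauty norm).
GOLDEN_RATIO = 1.618
proportions = {
    "eyes_to_mouth / eye_width":
        dist(eye_mid, landmarks["mouth"]) / dist(landmarks["left_eye"], landmarks["right_eye"]),
    "mouth_to_chin / nose_to_mouth":
        dist(landmarks["mouth"], landmarks["chin"]) / dist(landmarks["nose_tip"], landmarks["mouth"]),
}

def beauty_score(proportions, target=GOLDEN_RATIO):
    """Map facial proportions to a 0-10 score: the closer to the target, the higher."""
    errors = [abs(p - target) / target for p in proportions.values()]
    mean_error = sum(errors) / len(errors)
    return round(max(0.0, 10.0 * (1.0 - mean_error)), 1)

print(beauty_score(proportions))  # 8.4 for these made-up coordinates
```

Notice how arbitrary every choice is: swap in a different target ratio or a different set of proportions and the same face gets a different number. Whoever writes that code decides what counts as beautiful.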

It also unifies the perception of beauty all over the world: imagine a world where everyone shares the same idea of what is beautiful and what isn’t. And despite the biological differences between ethnicities, people chase one single body type: slim waists, glowing faces, poreless skin, sparkling eyes! Even the influencers and celebrities portrayed as the ‘most beautiful’ on these platforms do not meet all these beauty standards in real life.

However, in the last few years, several Instagram profiles and social handles have exposed celebrities who advertise cosmetic products using digitally altered images, and campaigns against beauty filters have been gaining widespread support.

Where are we headed? Based on current trends in beauty, we developed four future scenarios: Real, Fake, F’real, and Surreal. Later in this series, I’ll explain these four scenarios and their impacts.

________________________________________________________________

  • This is a summary of a talk Melissa Eshaghbeigi and I gave for The Association of Professional Futurists’ online Futures Festival 2020. You can watch the full talk and discussion here!

________________________________________________________________

Let me know your thoughts in the comments!

