Who exists in the world, according to Google? … and who doesn’t?

Note: this is a modified transcript of a talk. You can watch the talk here!

Meet Jyotish and Nandini.

[Image: Jyotish and Nandini]

Last December I attended the wedding of one of my best friends, Jyotish, in northeast India. Here are some more photos of what it looked like.

[Images: scenes from the wedding]

India's population is over 1 billion people; about 17% of the world's population lives in India. That means a lot of weddings look similar to his, and that's not even counting all the Indian weddings that take place outside of India.


So how come when Jyotish looks up “wedding” in Google from India, he sees pictures like this?

[Image: Google Images results for "wedding"]

There are a bunch of photos of white, heterosexual, possibly Christian or Western couples, which are not at all representative of what he sees and experiences.

Not all weddings look the way they do in these photos.

It raises the question: who exists in the world according to Google… and who doesn't?

Let’s look at another practical example.

Here's what an autonomous car in Toronto might see. It would have to know the rules of the road and what jaywalkers are. If someone honked, it might pause to figure out what went wrong.

[Image: a street in Toronto]

Now what if we dropped that same vehicle onto the streets of India? Knowing how to drive in Toronto wouldn't cut it.

[Image: a street in India]

Social norms are completely different. Honking is everywhere because it serves a different purpose, the rules are different, and everyone is a jaywalker! The car probably wouldn't move.

So why am I talking so much about India and cultural idiosyncrasies?

Because today, an extremely small amount of the world is represented online. AI algorithms and automated processes won't work equally across the world. So why let them make decisions that impact it?

Google's mission is to organize the world's information. So why do its results seem to represent only a small part of the world? And how many algorithms are making decisions based on data that isn't representative?

“Algorithms are opinions embedded in code.” — Cathy O’Neil

Algorithms are opinions embedded in code. They don’t make things fair: they repeat our past practices and our patterns. They automate the status quo, and the status quo is not perfect. It’s about more than seeing a representative wedding on Google images.

It's about erasure. From online shopping to predicting future criminals, algorithms make a lot of decisions for us every day, learning from sample data sets provided by imperfect humans.
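
To make that concrete, here is a toy sketch (hypothetical, not any real search or ranking system) of how a skewed training sample becomes a skewed results page: a "ranker" that simply learns how often each category appears in its training data will reproduce whatever imbalance it was fed.

```python
import random
from collections import Counter

random.seed(42)

# Assumed, made-up training set: 90% of the labeled wedding photos are
# "western", even though Western weddings are a minority of weddings
# worldwide. The skew comes from how the sample was collected, not from
# the world itself.
training_data = ["western"] * 900 + ["indian"] * 100

def learned_distribution(data):
    """Learn category frequencies from the training data."""
    counts = Counter(data)
    total = sum(counts.values())
    return {category: n / total for category, n in counts.items()}

def results_page(distribution, n_results=10):
    """Sample a page of results in proportion to the learned frequencies."""
    categories = list(distribution)
    weights = [distribution[c] for c in categories]
    return random.choices(categories, weights=weights, k=n_results)

model = learned_distribution(training_data)
print(model)                # {'western': 0.9, 'indian': 0.1}
print(results_page(model))  # mostly 'western': the sample's skew becomes
                            # the skew in what users see
```

Nothing in that code decided to exclude anyone; it just automated the status quo of its training data.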

Looking for the best coffee in Toronto? If you open up Google, that result is biased. Similarly, if I didn't know what a wedding looked like and used Google Images to find out, what I'd see would not be accurate for much of the world.

Not only are algorithms making decisions; they're also expanding rapidly, and they're assumed to be based on fact. While you may think some photos of a wedding don't matter, this is indicative of a much larger problem.

When some people are erased and others are surfaced, that matters. When the word "doctor" is only associated with a man, when certain resumes get surfaced over others, when image recognition algorithms misclassify Black people as gorillas, that matters.

Algorithms can spread bias on a massive scale at a rapid pace, leading to exclusionary experiences and discriminatory practices.


When certain people are erased, mislabelled, misclassified, and denied opportunities while others are surfaced… that matters.

This is a timeline of Facebook's privacy glitches, starting in 2006. We had opportunities to impact the trajectory before it exploded. You'll hear people say that if we had known, things would have gone differently.

[Image: timeline of Facebook's privacy incidents]

Historically, we've seen what happens when tech gets out of our hands and mitigation techniques are reactive. Foresight and a futures mindset can help us prevent a dystopian future: by thinking critically, we can shape and influence the future as it develops. This is our time to change it before it explodes.

If we want to shape tomorrow, we have to act today.

This is a time to shape AI so that it represents, protects, and improves our society. Let's spend the time to imagine it and to talk about it with a critical lens. And once you're done reading (and sharing) this post, let's act.

