The moral bias behind your search results
So whenever I visit a school and talk to students, I always ask them the same thing: Why do you Google? Why is Google the search engine of choice for you?

Strangely enough, I always get the same three answers. One: "Because it works," which is a great answer; that's why I Google, too. Two, somebody will say, "I really don't know of any alternatives." That's not an equally great answer, and my reply to it is usually, "Try Googling the words 'search engine' -- you may find a couple of interesting alternatives." And last but not least, thirdly, inevitably, one student will raise her or his hand and say, "With Google, I'm certain to always get the best, unbiased search result."

Certain to always get the best, unbiased search result.

Now, as a man of the humanities, albeit a digital humanities man, that just makes my skin crawl, even if I, too, realize that that trust, that idea of the unbiased search result, is a cornerstone in our collective love for and appreciation of Google. I will show you why that, philosophically, is almost an impossibility. But let me first elaborate, just a little bit, on a basic principle behind each search query that we sometimes seem to forget.

Whenever you set out to Google something, start by asking yourself this: "Am I looking for an isolated fact?" What is the capital of France? What are the building blocks of a water molecule? Great -- Google away. There's no group of scientists who are this close to proving that it's actually London and H3O. We don't see a big conspiracy around those things. We agree, on a global scale, on what the answers to these isolated facts are.

But if you complicate your question just a little bit and ask something like, "Why is there an Israeli-Palestinian conflict?" you're not exactly looking for a singular fact anymore; you're looking for knowledge, which is something far more complicated and delicate. To get to knowledge, you have to bring 10 or 20 or 100 facts to the table, acknowledge them, and say, "Yes, these are all true." But because of who I am -- young or old, black or white, gay or straight -- I will value them differently. And I will say, "Yes, this is true, but this is more important to me than that." This is where it becomes interesting, because this is where we become human. This is when we start to argue, to form society. And to really get somewhere, we need to filter all our facts -- through friends and neighbors and parents and children and coworkers and newspapers and magazines -- to finally be grounded in real knowledge, which is something a search engine is poorly suited to help us achieve.

So, I promised you an example, just to show you why it's so hard to get to the point of true, clean, objective knowledge -- as food for thought. I will conduct a couple of simple search queries. We'll start with "Michelle Obama," the First Lady of the United States, and we'll click for pictures. It works really well, as you can see. It's a more or less perfect search result: it's just her in the picture, not even the President.

How does this work? Quite simply. Google uses a lot of smartness to achieve this, but at bottom they look at two things more than anything. First, what does the caption under the picture say on each website? Does it say "Michelle Obama" under the picture? That's a pretty good indication it's actually her in the picture. Second, Google looks at the picture file -- the name of the file as it was uploaded to the website. Is it called "MichelleObama.jpeg"? A pretty good indication it's not Clint Eastwood in the picture.
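To make those two signals concrete, here is a minimal sketch in Python of the kind of heuristic described above: score each candidate image by whether the query appears in the page caption and in the file name. This is a toy model for illustration only -- the function names, weights, and data are invented here, and Google's actual ranking uses far more signals than these two.

```python
# Toy model of the two ranking signals described above: the caption
# under the picture and the name of the uploaded picture file.
# Weights and structure are invented for this sketch.

def letters(text: str) -> str:
    """Lowercase and keep only letters, so "MichelleObama.jpeg"
    and "Michelle Obama" can be compared."""
    return "".join(ch for ch in text.lower() if ch.isalpha())

def score_image(query: str, caption: str, filename: str) -> int:
    """Score one candidate image for a query using the two signals."""
    q = letters(query)
    score = 0
    if q in letters(caption):   # signal 1: the caption on the page
        score += 2
    if q in letters(filename):  # signal 2: the uploaded file's name
        score += 1
    return score

candidates = [
    {"caption": "Michelle Obama at the White House",
     "filename": "MichelleObama.jpeg"},
    {"caption": "Clint Eastwood on set",
     "filename": "eastwood_2008.jpg"},
]

# Rank candidates for the query, best match first.
ranked = sorted(candidates,
                key=lambda c: score_image("Michelle Obama",
                                          c["caption"], c["filename"]),
                reverse=True)
print(ranked[0]["filename"])  # -> MichelleObama.jpeg
```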
So, you've got those two signals, and you get a search result like this -- almost.

Now, in 2009, Michelle Obama was the victim of a racist campaign in which people set out to insult her through her search results. A picture was distributed widely over the Internet in which her face had been distorted to look like a monkey's, and that picture was published all over. People published it very, very purposefully, to get it up there in the search results. They made sure to write "Michelle Obama" in the caption, and they made sure to upload the picture as "MichelleObama.jpeg," or the like. You get why: to manipulate the search result. And it worked, too. When you picture-Googled "Michelle Obama" in 2009, that distorted monkey picture showed up among the first results.

Now, the results are self-cleansing, and that's sort of the beauty of it, because Google measures relevance every hour, every day. This time, however, Google didn't settle for that. They thought, "That's racist and it's a bad search result, and we're going to go back and clean that up manually. We are going to write some code and fix it." Which they did. And I don't think anyone in this room thinks that was a bad idea. Me neither.

But then a couple of years go by, and the world's most-Googled Anders, Anders Behring Breivik, did what he did. This was July 22, 2011, a terrible day in Norwegian history. This man, a terrorist, blew up a couple of government buildings within walking distance of where we are right now in Oslo, Norway; then he traveled to the island of Utøya and shot and killed a group of kids. Almost 80 people died that day.

A lot of people would describe this act of terror as two steps -- that he did two things: he blew up the buildings and he shot those kids. It's not true. It was three steps. He blew up those buildings, he shot those kids, and he sat down and waited for the world to Google him. And he prepared all three steps equally well.

If there was somebody who immediately understood this, it was a Swedish web developer, a search engine optimization expert in Stockholm named Nikke Lindqvist. He's also a very political guy, and he was right out there in social media, on his blog and on Facebook. He told everybody, "If there's something this guy wants right now, it's to control the image of himself. Let's see if we can distort that. Let's see if we, in the civilized world, can protest against what he did by insulting him in his search results."

And how? He told all of his readers the following: "Go out there on the Internet, find pictures of dog poop on sidewalks -- find pictures of dog poop on sidewalks -- publish them in your feeds, on your websites, on your blogs. Make sure to write the terrorist's name in the caption, and make sure to name the picture file 'Breivik.jpeg.' Let's teach Google that that's the face of the terrorist."
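Seen through the same toy heuristic sketched earlier, it is clear why this worked: a page that deliberately pairs an unrelated image with a matching caption and file name satisfies both signals, exactly like a genuine photo would. Again, this is an invented illustration of the mechanism the talk describes, not real search code, and the page data below is made up.

```python
# Toy continuation of the earlier sketch: the campaign pages put the
# terrorist's name in exactly the two places the heuristic checks.

def letters(text: str) -> str:
    """Lowercase and keep only letters for comparison."""
    return "".join(ch for ch in text.lower() if ch.isalpha())

def matches_both_signals(query: str, caption: str, filename: str) -> bool:
    """True if both the caption and the file name point at the query."""
    q = letters(query)
    return q in letters(caption) and q in letters(filename)

# A campaign page as Lindqvist described it: a dog-poop picture with
# the terrorist's name in the caption and "Breivik.jpeg" as the file.
campaign_page = {"caption": "Breivik", "filename": "Breivik.jpeg"}

# A hypothetical news photo whose page never spells the name this way.
news_page = {"caption": "Court hearing, Oslo", "filename": "IMG_4711.jpg"}

for page in (campaign_page, news_page):
    print(page["filename"],
          matches_both_signals("Breivik", page["caption"], page["filename"]))
# -> Breivik.jpeg True
# -> IMG_4711.jpg False
# To the toy heuristic, the campaign page looks like the best
# available picture of the person being searched for.
```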
And it worked. Two years after that campaign against Michelle Obama, this manipulation campaign against Anders Behring Breivik worked: if you picture-Googled him from Sweden in the weeks after the July 22 events, you'd see that picture of dog poop high up in the search results, as a little protest.

Strangely enough, Google didn't intervene this time. They did not step in and manually clean those search results up.

So, the million-dollar question: is there anything different between these two events? Is there anything different between what happened to Michelle Obama and what happened to Anders Behring Breivik? Of course not. It's the exact same thing, yet Google intervened in one case and not in the other. Why?

Because Michelle Obama is an honorable person, that's why, and Anders Behring Breivik is a despicable person. See what happens there? An evaluation of a person takes place, and there's only one power-player in the world with the authority to say who's who: "We like you, we dislike you. We believe in you, we don't believe in you. You're right, you're wrong. You're true, you're false. You're Obama, and you're Breivik." That's power if I ever saw it.

So I'm asking you to remember that behind every algorithm is always a person, a person with a set of personal beliefs that no code can ever completely eradicate. And my message goes out not only to Google, but to all believers in the faith of code around the world. You need to identify your own personal bias. You need to understand that you are human and take responsibility accordingly.

And I say this because I believe we've reached a point in time when it's absolutely imperative that we tie those bonds together again, tighter: the humanities and the technology. Tighter than ever. And, if nothing else, to remind us that that wonderfully seductive idea of the unbiased, clean search result is, and is likely to remain, a myth.

Thank you for your time.

(Applause)
Title: The moral bias behind your search results
Speaker: Andreas Ekström
Description: Search engines have become our most trusted sources of information and arbiters of truth. But can we ever get an unbiased search result? Swedish author and journalist Andreas Ekström argues that such a thing is a philosophical impossibility. In this thoughtful talk, he calls on us to strengthen the bonds between technology and the humanities, and he reminds us that behind every algorithm is a set of personal beliefs that no code can ever completely eradicate.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 09:18