The moral bias behind your search results

  • 0:01 - 0:04
    So whenever I visit a school
    and talk to students,
  • 0:04 - 0:06
    I always ask them the same thing:
  • 0:07 - 0:08
    Why do you Google?
  • 0:09 - 0:12
    Why is Google the search engine
    of choice for you?
  • 0:13 - 0:15
    Strangely enough, I always get
    the same three answers.
  • 0:15 - 0:17
    One, "Because it works,"
  • 0:17 - 0:20
    which is a great answer;
    that's why I Google, too.
  • 0:20 - 0:22
    Two, somebody will say,
  • 0:22 - 0:25
    "I really don't know of any alternatives."
  • 0:26 - 0:29
    It's not an equally great answer
    and my reply to that is usually,
  • 0:29 - 0:31
    "Try to Google the word 'search engine,'
  • 0:31 - 0:33
    you may find a couple
    of interesting alternatives."
  • 0:33 - 0:35
    And last but not least, thirdly,
  • 0:35 - 0:39
    inevitably, one student will raise
    her or his hand and say,
  • 0:39 - 0:44
    "With Google, I'm certain to always get
    the best, unbiased search result."
  • 0:45 - 0:52
    Certain to always get the best,
    unbiased search result.
  • 0:53 - 0:55
    Now, as a man of the humanities,
  • 0:56 - 0:58
    albeit a digital humanities man,
  • 0:58 - 0:59
    that just makes my skin curl,
  • 0:59 - 1:04
    even if I, too, realize that that trust,
    that idea of the unbiased search result
  • 1:04 - 1:08
    is a cornerstone in our collective love
    for and appreciation of Google.
  • 1:09 - 1:13
    I will show you why that, philosophically,
    is almost an impossibility.
  • 1:13 - 1:16
    But let me first elaborate,
    just a little bit, on a basic principle
  • 1:16 - 1:19
    behind each search query
    that we sometimes seem to forget.
  • 1:20 - 1:22
    So whenever you set out
    to Google something,
  • 1:22 - 1:26
    start by asking yourself this:
    "Am I looking for an isolated fact?"
  • 1:26 - 1:29
    What is the capital of France?
  • 1:30 - 1:32
    What are the building blocks
    of a water molecule?
  • 1:32 - 1:34
    Great -- Google away.
  • 1:34 - 1:37
    There's not a group of scientists
    who are this close to proving
  • 1:37 - 1:39
    that it's actually London and H3O.
  • 1:39 - 1:42
    There's no big conspiracy
    behind those things.
  • 1:42 - 1:43
    We agree, on a global scale,
  • 1:43 - 1:46
    what the answers are
    to these isolated facts.
  • 1:46 - 1:52
    But if you complicate your question
    just a little bit and ask something like,
  • 1:52 - 1:54
    "Why is there
    an Israeli-Palestine conflict?"
  • 1:55 - 1:58
    You're not exactly looking
    for a singular fact anymore,
  • 1:58 - 1:59
    you're looking for knowledge,
  • 1:59 - 2:02
    which is something way more
    complicated and delicate.
  • 2:03 - 2:04
    And to get to knowledge,
  • 2:04 - 2:07
    you have to bring 10 or 20
    or 100 facts to the table
  • 2:07 - 2:10
    and acknowledge them and say,
    "Yes, these are all true."
  • 2:10 - 2:12
    But because of who I am,
  • 2:12 - 2:14
    young or old, black or white,
    gay or straight,
  • 2:14 - 2:16
    I will value them differently.
  • 2:16 - 2:18
    And I will say, "Yes, this is true,
  • 2:18 - 2:20
    but this is more important
    to me than that."
  • 2:20 - 2:22
    And this is where it becomes interesting,
  • 2:22 - 2:24
    because this is where we become human.
  • 2:24 - 2:27
    This is when we start
    to argue, to form society.
  • 2:27 - 2:30
    And to really get somewhere,
    we need to filter all our facts here,
  • 2:30 - 2:33
    through friends and neighbors
    and parents and children
  • 2:33 - 2:35
    and coworkers and newspapers
    and magazines,
  • 2:35 - 2:38
    to finally be grounded in real knowledge,
  • 2:38 - 2:42
    which is something that a search engine
    is of little help in achieving.
  • 2:43 - 2:50
    So, I promised you an example
    just to show you why it's so hard
  • 2:50 - 2:53
    to get to the point of true, clean,
    objective knowledge --
  • 2:53 - 2:55
    as food for thought.
  • 2:55 - 2:58
    I will conduct a couple of simple
    queries, search queries.
  • 2:58 - 3:03
    We'll start with "Michelle Obama,"
  • 3:03 - 3:04
    the First Lady of the United States.
  • 3:04 - 3:06
    And we'll click for pictures.
  • 3:07 - 3:09
    It works really well, as you can see.
  • 3:09 - 3:12
    It's a perfect search
    result, more or less.
  • 3:12 - 3:15
    It's just her in the picture,
    not even the President.
  • 3:16 - 3:17
    How does this work?
  • 3:18 - 3:19
    Quite simple.
  • 3:19 - 3:22
    Google uses a lot of smartness
    to achieve this, but quite simply,
  • 3:22 - 3:25
    they look at two things
    more than anything.
  • 3:25 - 3:30
    First, what does it say in the caption
    under the picture on each website?
  • 3:30 - 3:32
    Does it say "Michelle Obama"
    under the picture?
  • 3:32 - 3:34
    Pretty good indication
    that it's actually her.
  • 3:34 - 3:37
    Second, Google looks at the picture file,
  • 3:37 - 3:40
    the name of the file as such
    uploaded to the website.
  • 3:40 - 3:42
    Again, is it called "MichelleObama.jpeg"?
  • 3:43 - 3:46
    Pretty good indication it's not
    Clint Eastwood in the picture.
  • 3:46 - 3:50
    So, you've got those two and you get
    a search result like this -- almost.
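
As a rough illustration of those two signals, here is a minimal Python sketch. It is not Google's actual algorithm; the class, the field names, and the simple additive scoring are hypothetical simplifications. It just ranks a candidate image higher when the query text appears in the page caption and in the uploaded file name, which is exactly the pair of signals described above.

```python
from dataclasses import dataclass

@dataclass
class ImageCandidate:
    url: str        # page where the image was found (hypothetical example data)
    caption: str    # text under the picture on the hosting page
    filename: str   # name of the uploaded image file, e.g. "MichelleObama.jpeg"

def relevance_score(query: str, image: ImageCandidate) -> int:
    """Crude relevance: +1 if the query matches the caption, +1 if it matches the filename."""
    compact_query = query.lower().replace(" ", "")
    score = 0
    if query.lower() in image.caption.lower():
        score += 1
    if compact_query in image.filename.lower():
        score += 1
    return score

# Example: both signals agree, so the image would rank highly --
# which is also precisely what the manipulation campaigns exploited.
candidate = ImageCandidate(
    url="https://example.org/post",
    caption="Michelle Obama at the White House",
    filename="MichelleObama.jpeg",
)
print(relevance_score("Michelle Obama", candidate))  # -> 2
```

The same simplicity is what makes the scheme easy to game: anyone who controls the caption and the file name controls both signals.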
  • 3:50 - 3:57
    Now, in 2009, Michelle Obama
    was the victim of a racist campaign,
  • 3:57 - 4:01
    where people set out to insult her
    through her search results.
  • 4:01 - 4:04
    There was a picture distributed
    widely over the Internet
  • 4:04 - 4:07
    where her face was distorted
    to look like a monkey.
  • 4:07 - 4:10
    And that picture was published all over.
  • 4:10 - 4:14
    And people published it
    very, very purposefully,
  • 4:14 - 4:16
    to get it up there in the search results.
  • 4:16 - 4:19
    They made sure to write
    "Michelle Obama" in the caption
  • 4:19 - 4:23
    and they made sure to upload the picture
    as "MichelleObama.jpeg," or the like.
  • 4:23 - 4:25
    You get why -- to manipulate
    the search result.
  • 4:25 - 4:27
    And it worked, too.
  • 4:27 - 4:29
    So when you picture-Googled
    for "Michelle Obama" in 2009,
  • 4:29 - 4:33
    that distorted monkey picture
    showed up among the first results.
  • 4:33 - 4:36
    Now, the results are self-cleansing,
  • 4:36 - 4:38
    and that's sort of the beauty of it,
  • 4:38 - 4:42
    because Google measures relevance
    every hour, every day.
  • 4:42 - 4:44
    However, Google didn't settle
    for that this time,
  • 4:44 - 4:47
    they just thought, "That's racist
    and it's a bad search result
  • 4:48 - 4:51
    and we're going to go back
    and clean that up manually.
  • 4:51 - 4:54
    We are going to write
    some code and fix it,"
  • 4:54 - 4:55
    which they did.
  • 4:55 - 4:59
    And I don't think anyone in this room
    thinks that was a bad idea.
  • 5:00 - 5:01
    Me neither.
  • 5:03 - 5:06
    But then, a couple of years go by,
  • 5:06 - 5:09
    and the world's most-Googled Anders,
  • 5:09 - 5:11
    Anders Behring Breivik,
  • 5:11 - 5:13
    did what he did.
  • 5:13 - 5:15
    This was July 22, 2011,
  • 5:15 - 5:18
    and a terrible day in Norwegian history.
  • 5:18 - 5:21
    This man, a terrorist, blew up
    a couple of government buildings
  • 5:21 - 5:24
    walking distance from where we are
    right now in Oslo, Norway
  • 5:24 - 5:26
    and then he traveled
    to the island of Utøya
  • 5:26 - 5:29
    and shot and killed a group of kids.
  • 5:29 - 5:31
    Almost 80 people died that day.
  • 5:32 - 5:37
    And a lot of people would describe
    this act of terror as two steps,
  • 5:37 - 5:40
    that he did two things: he blew up
    the buildings and he shot those kids.
  • 5:40 - 5:42
    It's not true.
  • 5:42 - 5:44
    It was three steps.
  • 5:44 - 5:47
    He blew up those buildings,
    he shot those kids,
  • 5:47 - 5:50
    and he sat down and waited
    for the world to Google him.
  • 5:51 - 5:54
    And he prepared
    all three steps equally well.
  • 5:55 - 5:57
    And if there was somebody
    who immediately understood this,
  • 5:57 - 5:59
    it was a Swedish web developer,
  • 5:59 - 6:03
    a search engine optimization expert
    in Stockholm, named Nikke Lindqvist.
  • 6:03 - 6:04
    He's also a very political guy
  • 6:04 - 6:07
    and he was right out there
    in social media, on his blog and Facebook.
  • 6:07 - 6:09
    And he told everybody,
  • 6:09 - 6:11
    "If there's something that
    this guy wants right now,
  • 6:11 - 6:14
    it's to control the image of himself.
  • 6:15 - 6:17
    Let's see if we can distort that.
  • 6:17 - 6:21
    Let's see if we, in the civilized world,
    can protest against what he did
  • 6:21 - 6:25
    through insulting him
    in his search results."
  • 6:25 - 6:26
    And how?
  • 6:27 - 6:29
    He told all of his readers the following,
  • 6:29 - 6:31
    "Go out there on the Internet,
  • 6:31 - 6:34
    find pictures of dog poop on sidewalks --
  • 6:35 - 6:37
    find pictures of dog poop on sidewalks --
  • 6:37 - 6:40
    publish them in your feeds,
    on your websites, on your blogs.
  • 6:40 - 6:43
    Make sure to write the terrorist's
    name in the caption,
  • 6:43 - 6:48
    make sure to name
    the picture file "Breivik.jpeg."
  • 6:48 - 6:52
    Let's teach Google that that's
    the face of the terrorist."
  • 6:54 - 6:55
    And it worked.
  • 6:56 - 6:59
    Two years after that campaign
    against Michelle Obama,
  • 6:59 - 7:02
    this manipulation campaign
    against Anders Behring Breivik worked.
  • 7:02 - 7:07
    If you picture-Googled for him from Sweden
    in the weeks after the July 22 events,
  • 7:07 - 7:11
    you'd see that picture of dog poop
    high up in the search results,
  • 7:11 - 7:12
    as a little protest.
  • 7:13 - 7:18
    Strangely enough, Google
    didn't intervene this time.
  • 7:18 - 7:23
    They did not step in and manually
    clean those search results up.
  • 7:24 - 7:26
    So the million-dollar question,
  • 7:26 - 7:29
    is there any difference
    between these two events?
  • 7:29 - 7:32
    Is there anything different between
    what happened to Michelle Obama
  • 7:32 - 7:34
    and what happened
    to Anders Behring Breivik?
  • 7:34 - 7:36
    Of course not.
  • 7:37 - 7:38
    It's the exact same thing,
  • 7:38 - 7:41
    yet Google intervened in one case
    and not in the other.
  • 7:41 - 7:42
    Why?
  • 7:43 - 7:47
    Because Michelle Obama
    is an honorable person, that's why,
  • 7:47 - 7:50
    and Anders Behring Breivik
    is a despicable person.
  • 7:50 - 7:52
    See what happens there?
  • 7:52 - 7:55
    An evaluation of a person takes place
  • 7:55 - 7:59
    and there's only one
    power-player in the world
  • 7:59 - 8:01
    with the authority to say who's who.
  • 8:02 - 8:04
    "We like you, we dislike you.
  • 8:04 - 8:06
    We believe in you,
    we don't believe in you.
  • 8:06 - 8:08
    You're right, you're wrong.
    You're true, you're false.
  • 8:08 - 8:10
    You're Obama, and you're Breivik."
  • 8:11 - 8:13
    That's power if I ever saw it.
  • 8:15 - 8:19
    So I'm asking you to remember
    that behind every algorithm
  • 8:19 - 8:21
    is always a person,
  • 8:21 - 8:23
    a person with a set of personal beliefs
  • 8:23 - 8:26
    that no code can ever
    completely eradicate.
  • 8:26 - 8:28
    And my message goes
    out not only to Google,
  • 8:28 - 8:31
    but to all believers in the faith
    of code around the world.
  • 8:31 - 8:34
    You need to identify
    your own personal bias.
  • 8:34 - 8:36
    You need to understand that you are human
  • 8:36 - 8:39
    and take responsibility accordingly.
  • 8:40 - 8:43
    And I say this because I believe
    we've reached a point in time
  • 8:43 - 8:44
    when it's absolutely imperative
  • 8:44 - 8:48
    that we tie those bonds
    together again, tighter:
  • 8:48 - 8:50
    the humanities and the technology.
  • 8:50 - 8:52
    Tighter than ever.
  • 8:52 - 8:56
    And, if nothing else, to remind us
    that that wonderfully seductive idea
  • 8:56 - 8:58
    of the unbiased, clean search result
  • 8:58 - 9:01
    is, and is likely to remain, a myth.
  • 9:02 - 9:03
    Thank you for your time.
  • 9:03 - 9:06
    (Applause)
Title:
The moral bias behind your search results
Speaker:
Andreas Ekström
Description:

Search engines have become our most trusted sources of information and arbiters of truth. But can we ever get an unbiased search result? Swedish author and journalist Andreas Ekström argues that such a thing is a philosophical impossibility. In this thoughtful talk, he calls on us to strengthen the bonds between technology and the humanities, and he reminds us that behind every algorithm is a set of personal beliefs that no code can ever completely eradicate.

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
09:18
