
The moral bias behind your search results

  • 0:01 - 0:04
    So whenever I visit a school
    and talk to students,
  • 0:04 - 0:06
    I always ask them the same thing:
  • 0:07 - 0:09
    Why do you Google?
  • 0:09 - 0:13
    Why is Google the search engine
    of choice for you?
  • 0:13 - 0:15
    Strangely enough, I always get
    the same three answers.
  • 0:15 - 0:18
    One, "Because it works",
  • 0:18 - 0:20
    which is a great answer,
    that's why I Google, too.
  • 0:20 - 0:23
    Two, somebody will say,
  • 0:23 - 0:25
    "I really don't know of any alternatives."
  • 0:25 - 0:27
    It's not an equally great answer
  • 0:27 - 0:29
    and my reply to that is usually,
  • 0:29 - 0:31
    "Try to Google the word 'search engine',
  • 0:31 - 0:33
    you may find a couple
    of interesting alternatives."
  • 0:33 - 0:36
    And last but not least, thirdly,
  • 0:36 - 0:39
    inevitably, one student will raise
    her or his hand and say,
  • 0:39 - 0:44
    "With Google, I'm certain to always get
    the best, unbiased search result."
  • 0:46 - 0:52
    Certain to always get
    the best, unbiased search result.
  • 0:53 - 0:56
    Now, as a man of the humanities,
  • 0:56 - 0:58
    albeit a digital humanities man,
  • 0:58 - 1:00
    that just makes my skin curl,
  • 1:00 - 1:05
    even if I, too, realize that that trust,
    that idea of the unbiased search result
  • 1:05 - 1:09
    is a cornerstone in our collective
    love for and appreciation of Google.
  • 1:09 - 1:13
    I will show you why that, philosophically,
    is almost an impossibility.
  • 1:13 - 1:16
    But let me first elaborate,
    just a little bit, on a basic principle
  • 1:16 - 1:19
    behind the search query
    that we sometimes seem to forget.
  • 1:20 - 1:22
    So whenever you set out
    to Google something,
  • 1:22 - 1:27
    start by asking yourself this,
    "Am I looking for an isolated fact?
  • 1:27 - 1:30
    What is the capital of France?
  • 1:30 - 1:32
    What are the building blocks
    of a water molecule?"
  • 1:32 - 1:35
    Great -- Google away.
  • 1:35 - 1:37
    There's not a group of scientists
    who are this close
  • 1:37 - 1:40
    to proving that it's actually
    London and H3O.
  • 1:40 - 1:42
    You don't see a big conspiracy
    among those things.
  • 1:42 - 1:45
    We agree, on a global scale,
    what the answers are
  • 1:45 - 1:47
    to these isolated facts.
  • 1:47 - 1:52
    But if you complicate your question
    just a little bit and ask something like,
  • 1:52 - 1:55
    "Why is there
    an Israeli-Palestinian conflict?"
  • 1:55 - 1:58
    You're not exactly looking
    for a singular fact anymore,
  • 1:58 - 2:00
    you're looking for knowledge,
  • 2:00 - 2:03
    which is something way more
    complicated and delicate.
  • 2:03 - 2:04
    And to get to knowledge,
  • 2:04 - 2:08
    you have to bring 10 or 20
    or 100 facts to the table
  • 2:08 - 2:10
    and acknowledge them and say,
    "Yes, these are all true."
  • 2:10 - 2:12
    But because of who I am,
  • 2:12 - 2:14
    young or old, black or white,
    gay or straight,
  • 2:14 - 2:16
    I will value them differently.
  • 2:16 - 2:18
    And I will say, "Yes, this is true,
  • 2:18 - 2:20
    "but this is more
    important to me than that."
  • 2:20 - 2:22
    And this is where it becomes interesting
  • 2:22 - 2:24
    because this is where we become human.
  • 2:24 - 2:27
    This is when we start to argue,
    to form society
  • 2:27 - 2:28
    and to really get somewhere.
  • 2:28 - 2:30
    We need to filter all our facts here
  • 2:30 - 2:33
    through friends and neighbors
    and parents and children
  • 2:33 - 2:35
    and coworkers and newspapers
    and magazines
  • 2:35 - 2:38
    to finally be grounded in real knowledge,
  • 2:38 - 2:42
    which is something that a search engine
    is a poor help to achieve.
  • 2:44 - 2:50
    So I promised you an example
    just to show you why it's so hard
  • 2:50 - 2:53
    to get to the point of true, clean,
    objective knowledge --
  • 2:53 - 2:55
    as food for thought.
  • 2:55 - 2:58
    I will conduct a couple of
    simple queries, search queries.
  • 2:59 - 3:02
    We'll start by Googling "Michelle Obama",
  • 3:02 - 3:05
    the First Lady of the United States.
  • 3:05 - 3:06
    And we'll click for pictures.
  • 3:06 - 3:09
    It works really well, as you can see.
  • 3:09 - 3:13
    It's a perfect search result,
    more or less.
  • 3:13 - 3:15
    It's just her in the picture,
    not even the President.
  • 3:16 - 3:18
    How does this work?
  • 3:18 - 3:19
    Quite simple,
  • 3:19 - 3:22
    Google uses a lot of smartness
    to achieve this, but quite simply
  • 3:22 - 3:25
    they look at two things
    more than anything.
  • 3:25 - 3:27
    First, what does it say in the caption,
  • 3:27 - 3:29
    what does it say under the picture
    on each website?
  • 3:29 - 3:32
    Does it say "Michelle Obama"
    under the picture?
  • 3:32 - 3:34
    Pretty good indication
    it's actually her on there.
  • 3:34 - 3:37
    Second, Google looks at the picture file,
  • 3:37 - 3:40
    the name of the file as such
    uploaded to the website.
  • 3:40 - 3:43
    Again, is it called "Michelle Obama.jpeg"?
  • 3:43 - 3:46
    Pretty good indication it's not
    Clint Eastwood in the picture.
  • 3:46 - 3:50
    So you got those two and you get
    a search result like this, almost.
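    (A minimal, purely illustrative sketch of the idea just described: a toy
    scorer that ranks an image by whether the query appears in its caption and
    in its file name. The function and field names here are hypothetical, and
    this is not Google's actual ranking code, which relies on many more signals.)

    def score_image(query, caption, filename):
        """Toy relevance score: +1 if the query appears in the caption,
        +1 if it appears in the file name (case and separators ignored)."""
        def normalize(s):
            return s.lower().replace("_", " ").replace("-", " ")
        q = normalize(query)
        score = 0
        if q in normalize(caption):
            score += 1
        if q in normalize(filename.rsplit(".", 1)[0]):  # drop the ".jpeg" extension
            score += 1
        return score

    # Hypothetical example data: the same two signals the campaigns later exploited.
    images = [
        {"caption": "Michelle Obama at the White House", "filename": "michelle-obama.jpeg"},
        {"caption": "A sunset over Oslo",                "filename": "IMG_0042.jpeg"},
    ]
    ranked = sorted(images,
                    key=lambda img: score_image("Michelle Obama",
                                                img["caption"], img["filename"]),
                    reverse=True)
    print(ranked[0]["filename"])  # -> michelle-obama.jpeg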
  • 3:50 - 3:57
    Now, in 2009, Michelle Obama
    was the victim of a racist campaign
  • 3:57 - 4:01
    where people set out to insult her
    through her search results.
  • 4:01 - 4:04
    There was a picture distributed widely
    over the Internet
  • 4:04 - 4:07
    where her face was distorted
    to look like a monkey.
  • 4:07 - 4:10
    And that picture was published all over.
  • 4:10 - 4:14
    And people published it
    very, very purposefully
  • 4:14 - 4:16
    to get it up there in the search results.
  • 4:16 - 4:19
    They made sure to write
    "Michelle Obama" in the caption
  • 4:19 - 4:23
    and they made sure to upload the picture
    as "Michelle Obama.jpeg" or the like.
  • 4:23 - 4:25
    You get why -- to manipulate
    the search result.
  • 4:25 - 4:27
    And it worked, too.
  • 4:27 - 4:30
    So when you picture-Googled
    for "Michelle Obama" in 2009,
  • 4:30 - 4:33
    that distorted monkey picture
    showed up among the first results.
  • 4:33 - 4:37
    Now, the results are self-cleansing,
  • 4:37 - 4:38
    and that's sort of the beauty of it
  • 4:38 - 4:42
    because Google measures relevance
    every hour, every day.
  • 4:42 - 4:45
    However, Google didn't settle
    for that this time,
  • 4:45 - 4:48
    they just thought, "That's racist
    and it's a bad search result
  • 4:48 - 4:51
    and we're going to go back
    and clean that up, manually.
  • 4:51 - 4:54
    We are going to write
    some code and fix it",
  • 4:54 - 4:56
    which they did,
  • 4:56 - 4:59
    and I don't think anyone in this room
    thinks that was a bad idea.
  • 4:59 - 5:01
    Me neither.
  • 5:03 - 5:06
    But then, a couple years go by,
  • 5:06 - 5:09
    and the world's most googled Anders,
  • 5:09 - 5:12
    Anders Behring Breivik,
  • 5:12 - 5:13
    did what he did.
  • 5:13 - 5:18
    This is July 22 in 2011 and a terrible day
    in Norwegian history.
  • 5:18 - 5:21
    This man, a terrorist, blew up
    a couple of government buildings,
  • 5:21 - 5:24
    walking distance from where we are
    right now in Oslo, Norway
  • 5:24 - 5:26
    and then he traveled to the
    island of Utøya
  • 5:26 - 5:29
    and shot and killed a group of kids.
  • 5:29 - 5:33
    Almost 80 people died that day.
  • 5:33 - 5:37
    And a lot of people would describe
    this act of terror as two steps,
  • 5:37 - 5:39
    that he did two things:
  • 5:39 - 5:41
    he blew up the buildings
    and he shot those kids.
  • 5:41 - 5:43
    It's not true.
  • 5:43 - 5:45
    It was three steps.
  • 5:45 - 5:46
    He blew up those buildings,
  • 5:46 - 5:47
    he shot those kids,
  • 5:47 - 5:51
    and he sat down and waited
    for the world to Google him.
  • 5:51 - 5:54
    And he prepared
    all three steps equally well.
  • 5:54 - 5:57
    And if there was somebody who
    immediately understood this,
  • 5:57 - 6:01
    it was a Swedish web developer,
    a search engine optimization expert
  • 6:01 - 6:03
    in Stockholm named Nikke Lindqvist.
  • 6:03 - 6:04
    He's also a very political guy
  • 6:04 - 6:07
    and he was right out there
    in social media, on his blog and Facebook.
  • 6:07 - 6:09
    And he told everybody,
  • 6:09 - 6:11
    "If there's something that
    this guy wants right now,
  • 6:11 - 6:14
    it's to control the image of himself.
  • 6:15 - 6:18
    Let's see if we can distort that.
  • 6:18 - 6:22
    Let's see if we, in the civilized world,
    can protest against what he did
  • 6:22 - 6:25
    through insulting him
    in his search results."
  • 6:25 - 6:27
    And how?
  • 6:27 - 6:29
    He told all of his readers the following,
  • 6:29 - 6:31
    "Go out there on the Internet,
  • 6:31 - 6:33
    find pictures of dog poop
    on sidewalks --
  • 6:34 - 6:36
    find pictures of dog poop on sidewalks --
  • 6:36 - 6:40
    publish them in your feeds,
    on your websites, on your blogs.
  • 6:40 - 6:43
    Make sure to write the terrorist's
    name in the caption,
  • 6:43 - 6:48
    make sure to name
    the picture file "Breivik.jpeg",
  • 6:48 - 6:52
    let's teach Google that that's
    the face of the terrorist."
  • 6:54 - 6:56
    And it worked.
  • 6:56 - 6:59
    Two years after that campaign
    against Michelle Obama,
  • 6:59 - 7:02
    this manipulation campaign
    against Anders Behring Breivik worked.
  • 7:02 - 7:07
    If you picture-Googled for him weeks after
    the July 22 events, from Sweden,
  • 7:07 - 7:11
    you'd see that picture of dog poop
    high up in your search result,
  • 7:11 - 7:13
    as a little protest
  • 7:14 - 7:18
    Strangely enough, Google
    didn't intervene this time.
  • 7:19 - 7:24
    They did not step in and manually clean
    those search results up.
  • 7:24 - 7:26
    So the million-dollar question is,
  • 7:26 - 7:29
    is there anything different between
    these two happenings here?
  • 7:29 - 7:32
    Is there anything different between
    what happened to Michelle Obama
  • 7:32 - 7:35
    and what happened
    to Anders Behring Breivik?
  • 7:35 - 7:37
    Of course not.
  • 7:37 - 7:39
    It's the exact same thing,
  • 7:39 - 7:42
    yet Google intervened in one case
    and not in the other.
  • 7:42 - 7:43
    Why?
  • 7:44 - 7:47
    Because Michelle Obama
    is an honorable person, that's why,
  • 7:47 - 7:51
    and Anders Behring Breivik
    is a despicable person.
  • 7:51 - 7:52
    See what happens there?
  • 7:52 - 7:54
    An evaluation of a person takes place
  • 7:54 - 7:56
    and there's only one power in the world,
  • 7:56 - 7:58
    one power-player in the world
  • 7:58 - 8:00
    with the authority to say who's who,
  • 8:00 - 8:03
    "We like you, we dislike you.
  • 8:03 - 8:05
    We believe in you,
    we don't believe in you.
  • 8:05 - 8:07
    You're right, you're wrong.
  • 8:07 - 8:08
    You're true, you're false.
  • 8:08 - 8:11
    You're Obama, and you're Breivik."
  • 8:11 - 8:14
    That's power if I ever saw it.
  • 8:15 - 8:19
    So I'm asking you to remember
    that behind every algorithm
  • 8:19 - 8:21
    there's always a person,
  • 8:21 - 8:23
    a person with a set of personal beliefs
  • 8:23 - 8:26
    that no code could ever
    completely eradicate.
  • 8:26 - 8:28
    And my message goes out
    not only to Google,
  • 8:28 - 8:31
    but to all believers in the faith
    of code around the world.
  • 8:31 - 8:34
    You need to identify
    your own personal bias.
  • 8:34 - 8:36
    You need to understand that
    you are human
  • 8:36 - 8:39
    and take responsibility accordingly.
  • 8:40 - 8:43
    And I say this because I believe
    that we've reached a point in time
  • 8:43 - 8:45
    when it's absolutely imperative
  • 8:45 - 8:48
    that we tie those bonds
    together again, tighter:
  • 8:48 - 8:50
    the humanities and the technology.
  • 8:50 - 8:52
    Tighter than ever.
  • 8:52 - 8:56
    And, if nothing else, to remind us
    that that wonderfully seductive idea
  • 8:56 - 8:59
    of the unbiased, clean search result
  • 8:59 - 9:02
    is, and is likely to remain, a myth.
  • 9:02 - 9:03
    Thank you for your time.
  • 9:03 - 9:06
    (Applause)
Title:
The moral bias behind your search results
Speaker:
Andreas Ekstrøm
Video Language:
English
Duration:
09:18