The moral bias behind your search results

  1. So whenever I visit a school
    and talk to students,
  2. I always ask them the same thing:
  3. Why do you Google?
  4. Why is Google the search engine
    of choice for you?
  5. Strangely enough, I always get
    the same three answers.
  6. One, "Because it works,"
  7. which is a great answer;
    that's why I Google, too.
  8. Two, somebody will say,
  9. "I really don't know of any alternatives."
  10. It's not an equally great answer
    and my reply to that is usually,
  11. "Try to Google the word 'search engine,'
  12. you may find a couple
    of interesting alternatives."
  13. And last but not least, thirdly,
  14. inevitably, one student will raise
    her or his hand and say,
  15. "With Google, I'm certain to always get
    the best, unbiased search result."
  16. Certain to always get the best,
    unbiased search result.
  17. Now, as a man of the humanities,
  18. albeit a digital humanities man,
  19. that just makes my skin crawl,
  20. even if I, too, realize that that trust,
    that idea of the unbiased search result
  21. is a cornerstone in our collective love
    for and appreciation of Google.
  22. I will show you why that, philosophically,
    is almost an impossibility.
  23. But let me first elaborate,
    just a little bit, on a basic principle
  24. behind each search query
    that we sometimes seem to forget.
  25. So whenever you set out
    to Google something,
  26. start by asking yourself this:
    "Am I looking for an isolated fact?"
  27. What is the capital of France?
  28. What are the building blocks
    of a water molecule?
  29. Great -- Google away.
  30. There's not a group of scientists
    who are this close to proving
  31. that it's actually London and H3O.
  32. You don't see a big conspiracy
    among those things.
  33. We agree, on a global scale,
  34. what the answers are
    to these isolated facts.
  35. But if you complicate your question
    just a little bit and ask something like,
  36. "Why is there
    an Israeli-Palestinian conflict?"
  37. you're not exactly looking
    for a singular fact anymore,
  38. you're looking for knowledge,
  39. which is something way more
    complicated and delicate.
  40. And to get to knowledge,
  41. you have to bring 10 or 20
    or 100 facts to the table
  42. and acknowledge them and say,
    "Yes, these are all true."
  43. But because of who I am,
  44. young or old, black or white,
    gay or straight,
  45. I will value them differently.
  46. And I will say, "Yes, this is true,
  47. but this is more important
    to me than that."
  48. And this is where it becomes interesting,
  49. because this is where we become human.
  50. This is when we start
    to argue, to form society.
  51. And to really get somewhere,
    we need to filter all our facts here,
  52. through friends and neighbors
    and parents and children
  53. and coworkers and newspapers
    and magazines,
  54. to finally be grounded in real knowledge,
  55. which is something that a search engine
    offers little help in achieving.
  56. So, I promised you an example
    just to show you why it's so hard
  57. to get to the point of true, clean,
    objective knowledge --
  58. as food for thought.
  59. I will conduct a couple of simple
    queries, search queries.
  60. We'll start with "Michelle Obama,"
  61. the First Lady of the United States.
  62. And we'll click for pictures.
  63. It works really well, as you can see.
  64. It's a perfect search
    result, more or less.
  65. It's just her in the picture,
    not even the President.
  66. How does this work?

  67. Quite simple.
  68. Google uses a lot of smartness
    to achieve this, but quite simply,
  69. they look at two things
    more than anything.
  70. First, what does it say in the caption
    under the picture on each website?
  71. Does it say "Michelle Obama"
    under the picture?
  72. Pretty good indication
    it's actually her on there.
  73. Second, Google looks at the picture file,
  74. the name of the file as it was
    uploaded to the website.
  75. Again, is it called "MichelleObama.jpeg"?
  76. Pretty good indication it's not
    Clint Eastwood in the picture.
  77. So, you've got those two and you get
    a search result like this -- almost.
  78. Now, in 2009, Michelle Obama
    was the victim of a racist campaign,
  79. where people set out to insult her
    through her search results.
  80. There was a picture distributed
    widely over the Internet
  81. where her face was distorted
    to look like a monkey.
  82. And that picture was published all over.
  83. And people published it
    very, very purposefully,
  84. to get it up there in the search results.
  85. They made sure to write
    "Michelle Obama" in the caption
  86. and they made sure to upload the picture
    as "MichelleObama.jpeg," or the like.
  87. You get why -- to manipulate
    the search result.
  88. And it worked, too.
  89. So when you picture-Googled
    for "Michelle Obama" in 2009,
  90. that distorted monkey picture
    showed up among the first results.
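
Against the toy scorer sketched earlier (again, an assumption, not
Google's real system), the attack is exactly that simple: whoever
controls the caption and the file name controls both signals, so a
hostile upload is indistinguishable from a genuine one:

    # Reusing the illustrative relevance_score from the sketch above.
    images = [
        ("Michelle Obama at the White House", "MichelleObama.jpeg"),
        ("Michelle Obama", "MichelleObama.jpeg"),   # the hostile upload
        ("Sunset over Oslo", "IMG_4031.jpeg"),
    ]
    ranked = sorted(images,
                    key=lambda img: relevance_score("michelle obama", *img),
                    reverse=True)
    # Both Obama entries score 6.0; the toy model cannot tell them apart.
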
  91. Now, the results are self-cleansing,
  92. and that's sort of the beauty of it,
  93. because Google measures relevance
    every hour, every day.
  94. However, Google didn't settle
    for that this time,
  95. they just thought, "That's racist
    and it's a bad search result
  96. and we're going to go back
    and clean that up manually.
  97. We are going to write
    some code and fix it,"
  98. which they did.
  99. And I don't think anyone in this room
    thinks that was a bad idea.
  100. Me neither.
  101. But then, a couple of years go by,
  102. and the world's most-Googled Anders,
  103. Anders Behring Breivik,
  104. did what he did.
  105. This is July 22, 2011,
  106. and a terrible day in Norwegian history.
  107. This man, a terrorist, blew up
    a couple of government buildings
    within walking distance of where we are
    right now in Oslo, Norway,
  109. and then he traveled
    to the island of Utøya
  110. and shot and killed a group of kids.
  111. Almost 80 people died that day.
  112. And a lot of people would describe
    this act of terror as two steps,
  113. that he did two things: he blew up
    the buildings and he shot those kids.
    the buildings and he shot those kids.
  114. It's not true.
  115. It was three steps.
  116. He blew up those buildings,
    he shot those kids,
  117. and he sat down and waited
    for the world to Google him.
  118. And he prepared
    all three steps equally well.
  119. And if there was somebody
    who immediately understood this,
  120. it was a Swedish web developer,
  121. a search engine optimization expert
    in Stockholm, named Nikke Lindqvist.
  122. He's also a very political guy
  123. and he was right out there
    in social media, on his blog and Facebook.
  124. And he told everybody,
  125. "If there's something that
    this guy wants right now,
  126. it's to control the image of himself.
  127. Let's see if we can distort that.
  128. Let's see if we, in the civilized world,
    can protest against what he did
  129. through insulting him
    in his search results."
  130. And how?

  131. He told all of his readers the following,
  132. "Go out there on the Internet,
  133. find pictures of dog poop on sidewalks --
  134. find pictures of dog poop on sidewalks --
  135. publish them in your feeds,
    on your websites, on your blogs.
  136. Make sure to write the terrorist's
    name in the caption,
  137. make sure to name
    the picture file "Breivik.jpeg."
  138. Let's teach Google that that's
    the face of the terrorist."
  139. And it worked.
  140. Two years after that campaign
    against Michelle Obama,
  141. this manipulation campaign
    against Anders Behring Breivik worked.
  142. If you picture-Googled for him from Sweden
    in the weeks after the July 22 events,
  143. you'd see that picture of dog poop
    high up in the search results,
  144. as a little protest.
  145. Strangely enough, Google
    didn't intervene this time.

  146. They did not step in and manually
    clean those search results up.
  147. So, the million-dollar question:
  148. is there anything different
    between these two events?
  149. Is there anything different between
    what happened to Michelle Obama
  150. and what happened
    to Anders Behring Breivik?
  151. Of course not.
  152. It's the exact same thing,
  153. yet Google intervened in one case
    and not in the other.
  154. Why?

  155. Because Michelle Obama
    is an honorable person, that's why,
  156. and Anders Behring Breivik
    is a despicable person.
  157. See what happens there?
  158. An evaluation of a person takes place
  159. and there's only one
    power-player in the world
  160. with the authority to say who's who.
  161. "We like you, we dislike you.
  162. We believe in you,
    we don't believe in you.
  163. You're right, you're wrong.
    You're true, you're false.
  164. You're Obama, and you're Breivik."
  165. That's power if I ever saw it.
  166. So I'm asking you to remember
    that behind every algorithm
  167. is always a person,
  168. a person with a set of personal beliefs
  169. that no code can ever
    completely eradicate.
  170. And my message goes
    out not only to Google,
  171. but to all believers in the faith
    of code around the world.
  172. You need to identify
    your own personal bias.
  173. You need to understand that you are human
  174. and take responsibility accordingly.
  175. And I say this because I believe
    we've reached a point in time
  176. when it's absolutely imperative
  177. that we tie those bonds
    together again, tighter:
  178. the humanities and technology.
  179. Tighter than ever.
  180. And, if nothing else, to remind us
    that that wonderfully seductive idea
  181. of the unbiased, clean search result
  182. is, and is likely to remain, a myth.
  183. Thank you for your time.

  184. (Applause)