Algorithms of Oppression: Faculty Focus: Safiya Umoja Noble

  • 0:03 - 0:06
    Hi, my name is Safiya Umoja Noble,
    and I'm an assistant professor
  • 0:06 - 0:09
    in the Annenberg School
    of Communication and Journalism.
  • 0:09 - 0:13
    My research looks at racist
    and sexist algorithmic bias
  • 0:13 - 0:16
    and the way in which people
    are marginalized and oppressed
  • 0:16 - 0:20
    by digital media platforms.
  • 0:20 - 0:23
    I spent 15 years in corporate
    marketing and advertising,
  • 0:23 - 0:27
    working for some of
    the largest Fortune 100 brands
  • 0:27 - 0:28
    in the United States.
  • 0:28 - 0:30
    We were starting to redirect
    significant portions
  • 0:30 - 0:34
    of our advertising media
    buying dollars online
  • 0:34 - 0:37
    and thinking about, in fact,
    how to game Google search
  • 0:37 - 0:40
    and Yahoo! to elevate the brands
    and amplify the messages.
  • 0:40 - 0:44
    And so at the moment that
    I was leaving corporate America
  • 0:44 - 0:47
    and moving into academia,
    the public was increasingly
  • 0:47 - 0:49
    falling in love with Google.
  • 0:49 - 0:53
    And this led me to thinking
    that this was a space and a place
  • 0:53 - 0:55
    that needed to be looked at more closely.
  • 0:55 - 0:58
    It was interesting to see
    this total diversion
  • 0:58 - 1:00
    of public goods,
  • 1:00 - 1:01
    public knowledge,
  • 1:01 - 1:04
    and libraries being shifted into
  • 1:04 - 1:08
    a corporate, privately-held company.
  • 1:09 - 1:12
    When we go to places like Google search,
  • 1:12 - 1:14
    the public generally thinks
    that what they'll find there
  • 1:14 - 1:19
    will be credible and fairly representing
    different kinds of ideas, people,
  • 1:19 - 1:20
    and spheres of knowledge.
  • 1:20 - 1:26
    And so this is what really prompted
    a six-year inquiry into this phenomenon
  • 1:26 - 1:30
    of thinking about misrepresentation
    on the internet, particularly
  • 1:30 - 1:35
    when people are using search engines,
    and that culminated in my new book,
  • 1:35 - 1:39
    Algorithms of Oppression:
    How Search Engines Reinforce Racism.
  • 1:39 - 1:43
    People think of algorithms
    as simply a mathematical formulation.
  • 1:43 - 1:47
    But in fact algorithms
    are really about automated decisions.
  • 1:48 - 1:52
    In 2009, I was kind of joking around,
    in fact, with a colleague,
  • 1:52 - 1:55
    and I was telling him
    that I was really interested
  • 1:55 - 1:57
    in what's happening with Google.
  • 1:57 - 1:59
    And just kind of offhand he said to me,
  • 1:59 - 2:02
    "Oh yeah, you should see what happens
    when you google 'Black girls'."
  • 2:02 - 2:07
    Of course I immediately did the search,
    found that pornography was the primary way
  • 2:07 - 2:11
    that Black girls, Latina girls,
    Asian girls were represented.
  • 2:11 - 2:14
    That started a whole
    deeper line of inquiry
  • 2:14 - 2:17
    about the way in which
    misrepresentation happens
  • 2:17 - 2:19
    for women of color on the internet
  • 2:19 - 2:22
    and what some of the broader
    social consequences of that are.
  • 2:22 - 2:25
    In my work, I look at the way
    that these platforms are designed
  • 2:25 - 2:29
    to amplify certain voices
    and silence other voices.
  • 2:29 - 2:31
    How does that come about?
  • 2:31 - 2:32
    What is that phenomenon about?
  • 2:32 - 2:36
    What's the role of capital
    or advertising dollars
  • 2:36 - 2:39
    in driving certain results
    to the first page?
  • 2:39 - 2:42
    What do the results mean
    in kind of a broader social,
  • 2:42 - 2:45
    historical, economic context?
  • 2:45 - 2:49
    So I contextualize the results
    that I find to show
  • 2:49 - 2:53
    how incredibly problematic this is
    because it further marginalizes
  • 2:53 - 2:55
    people who are already living in the margins,
  • 2:55 - 2:59
    people who are already suffering
    from systemic oppression,
  • 2:59 - 3:03
    and yet again, these results
    show up in these platforms
  • 3:03 - 3:07
    as if they are credible, fair,
    objective, neutral ideas.
  • 3:07 - 3:10
    In the end, I call for alternatives.
  • 3:10 - 3:13
    And I argue strongly that we need
  • 3:13 - 3:16
    to have things like public interest search
  • 3:16 - 3:19
    that are not driven by commercial biases.
  • 3:19 - 3:23
    And I put out some ideas about
    what it means to imagine
  • 3:23 - 3:28
    and create alternatives
    in our public information sphere
  • 3:28 - 3:30
    that are based on
    a different set of ethics.
  • 3:30 - 3:33
    If anything, I think that this book
    is the kind of book that will help us
  • 3:33 - 3:37
    re-frame the idea that,
    "We should just google it"
  • 3:37 - 3:39
    and everything will be fine.
  • 3:39 - 3:41
    ♪ (music) ♪
Title:
Algorithms of Oppression: Faculty Focus: Safiya Umoja Noble
Description:

Back in 2009, when Safiya Noble conducted a Google search using the keywords "Black girls," "Latina girls," and "Asian girls," the first page of results was invariably linked to pornography.

In her book, Algorithms of Oppression: How Search Engines Reinforce Racism, Noble delves into the ways search engines misrepresent a variety of people, concepts, types of information and knowledge. Her aim: to get people thinking and talking about the prominent role technology plays in shaping our lives and our future.

This video explores her journey into researching this topic and what it means to us.

Video Language:
English
Duration:
03:44