0:00:03.082,0:00:06.455
Hi, my name is Safiya Umoja Noble,[br]and I'm an assistant professor
0:00:06.455,0:00:09.481
in the Annenberg School [br]of Communication and Journalism.
0:00:09.481,0:00:12.861
My research looks at racist[br]and sexist algorithmic bias
0:00:12.861,0:00:16.473
and the way in which people[br]are marginalized and oppressed
0:00:16.473,0:00:19.552
by digital media platforms.
0:00:19.552,0:00:23.271
I spent 15 years in corporate[br]marketing and advertising,
0:00:23.271,0:00:26.512
working for some of [br]the largest Fortune 100 brands
0:00:26.512,0:00:27.561
in the United States.
0:00:27.561,0:00:30.499
We were starting to redirect[br]significant portions
0:00:30.499,0:00:33.861
of our advertising media [br]buying dollars online
0:00:33.861,0:00:36.742
and thinking about, in fact, [br]how to game Google search
0:00:36.742,0:00:40.182
and Yahoo! to elevate the brands[br]and amplify the messages.
0:00:40.182,0:00:43.681
And so at the moment that[br]I was leaving corporate America
0:00:43.681,0:00:47.130
and moving into academia,[br]the public was increasingly
0:00:47.130,0:00:48.641
falling in love with Google.
0:00:48.641,0:00:52.610
And this led me to thinking[br]that this was a space and a place
0:00:52.610,0:00:55.048
that needed to be looked at more closely.
0:00:55.048,0:00:57.548
It was interesting to see[br]this total diversion
0:00:57.548,0:00:59.748
of public goods,
0:00:59.748,0:01:01.255
public knowledge,
0:01:01.255,0:01:04.298
and libraries being shifted into
0:01:04.298,0:01:07.928
a corporate, privately-held company.
0:01:09.078,0:01:11.707
When we go to places like Google search,
0:01:11.707,0:01:13.968
the public generally thinks[br]that what they'll find there
0:01:13.968,0:01:18.748
will be credible and fairly representing[br]different kinds of ideas, people,
0:01:18.748,0:01:20.256
and spheres of knowledge.
0:01:20.256,0:01:25.597
And so this is what really prompted[br]a six-year inquiry into this phenomenon
0:01:25.597,0:01:29.538
of thinking about misrepresentation[br]on the internet, particularly
0:01:29.538,0:01:34.537
when people are using search engines,[br]and that culminated in my new book,
0:01:34.537,0:01:38.729
Algorithms of Oppression:[br]How Search Engines Reinforce Racism.
0:01:38.729,0:01:42.977
People think of algorithms [br]as simply a mathematical formulation.
0:01:42.977,0:01:47.249
But in fact algorithms [br]are really about automated decisions.
0:01:48.199,0:01:52.491
In 2009, I was kind of joking around,[br]in fact, with a colleague,
0:01:52.491,0:01:54.900
and I was telling him[br]that I was really interested
0:01:54.900,0:01:56.810
in what's happening with Google.
0:01:56.810,0:01:59.181
And just kind of offhand he said to me,
0:01:59.181,0:02:02.291
"Oh yeah, you should see what happens[br]when you google 'Black girls'."
0:02:02.291,0:02:06.611
Of course I immediately did the search,[br]found that pornography was the primary way
0:02:06.611,0:02:10.911
that Black girls, Latina girls, [br]Asian girls were represented.
0:02:10.911,0:02:13.580
That started a whole [br]deeper line of inquiry
0:02:13.580,0:02:17.042
about the way in which [br]misrepresentation happens
0:02:17.042,0:02:19.002
for women of color on the internet
0:02:19.002,0:02:22.170
and what some of the broader [br]social consequences of that are.
0:02:22.170,0:02:25.369
In my work, I look at the way[br]that these platforms are designed
0:02:25.369,0:02:29.271
to amplify certain voices[br]and silence other voices.
0:02:29.271,0:02:30.591
How does that come about?
0:02:30.591,0:02:32.350
What is that phenomenon about?
0:02:32.350,0:02:36.051
What's the role of capital[br]or advertising dollars
0:02:36.051,0:02:38.940
in driving certain results[br]to the first page?
0:02:38.940,0:02:42.062
What do the results mean[br]in kind of a broader social,
0:02:42.062,0:02:45.271
historical, economic context?
0:02:45.271,0:02:49.300
So I contextualize the results[br]that I find to show
0:02:49.300,0:02:53.341
how incredibly problematic this is[br]because it further marginalizes
0:02:53.341,0:02:55.271
people who are already living in the margins,
0:02:55.271,0:02:58.999
people who are already suffering[br]from systemic oppression,
0:02:58.999,0:03:02.821
and yet again, these results [br]show up in these platforms
0:03:02.821,0:03:07.330
as if they are credible, fair, [br]objective, neutral ideas.
0:03:07.330,0:03:10.401
In the end, I call for alternatives.
0:03:10.401,0:03:13.281
And I argue strongly that we need
0:03:13.281,0:03:16.422
to have things like public interest search
0:03:16.422,0:03:19.130
that are not driven by commercial biases.
0:03:19.130,0:03:23.110
And I put out some ideas about[br]what it means to imagine
0:03:23.110,0:03:27.572
and create alternatives[br]in our public information sphere
0:03:27.572,0:03:29.670
that are based on [br]a different set of ethics.
0:03:29.670,0:03:33.460
If anything, I think that this book[br]is the kind of book that will help us
0:03:33.460,0:03:36.849
re-frame the idea that, [br]"We should just google it"
0:03:36.849,0:03:39.201
and everything will be fine.
0:03:39.201,0:03:41.382
♪ (music) ♪