1
00:00:03,082 --> 00:00:06,455
Hi, my name is Safiya Umoja Noble,
and I'm an assistant professor
2
00:00:06,455 --> 00:00:09,481
in the Annenberg School
for Communication and Journalism.
3
00:00:09,481 --> 00:00:12,861
My research looks at racist
and sexist algorithmic bias
4
00:00:12,861 --> 00:00:16,473
and the way in which people
are marginalized and oppressed
5
00:00:16,473 --> 00:00:19,552
by digital media platforms.
6
00:00:19,552 --> 00:00:23,271
I spent 15 years in corporate
marketing and advertising,
7
00:00:23,271 --> 00:00:26,512
working for some of
the largest Fortune 100 brands
8
00:00:26,512 --> 00:00:27,561
in the United States.
9
00:00:27,561 --> 00:00:30,499
We were starting to redirect
significant portions
10
00:00:30,499 --> 00:00:33,861
of our advertising media
buying dollars online
11
00:00:33,861 --> 00:00:36,742
and thinking about, in fact,
how to game Google search
12
00:00:36,742 --> 00:00:40,182
and Yahoo! to elevate the brands
and amplify the messages.
13
00:00:40,182 --> 00:00:43,681
And so at the moment that
I was leaving corporate America
14
00:00:43,681 --> 00:00:47,130
and moving into academia,
the public was increasingly
15
00:00:47,130 --> 00:00:48,641
falling in love with Google.
16
00:00:48,641 --> 00:00:52,610
And this led me to thinking
that this was a space and a place
17
00:00:52,610 --> 00:00:55,048
that needed to be looked at more closely.
18
00:00:55,048 --> 00:00:57,548
It was interesting to see
this total diversion
19
00:00:57,548 --> 00:00:59,748
of public goods,
20
00:00:59,748 --> 00:01:01,255
public knowledge,
21
00:01:01,255 --> 00:01:04,298
and libraries being shifted into
22
00:01:04,298 --> 00:01:07,928
a corporate, privately-held company.
23
00:01:09,078 --> 00:01:11,707
When we go to places like Google search,
24
00:01:11,707 --> 00:01:13,968
the public generally thinks
that what they'll find there
25
00:01:13,968 --> 00:01:18,748
will be credible and fairly representing
different kinds of ideas, people,
26
00:01:18,748 --> 00:01:20,256
and spheres of knowledge.
27
00:01:20,256 --> 00:01:25,597
And so this is what really prompted
a six-year inquiry into this phenomenon
28
00:01:25,597 --> 00:01:29,538
of thinking about misrepresentation
on the internet, particularly
29
00:01:29,538 --> 00:01:34,537
when people are using search engines,
and that culminated in my new book,
30
00:01:34,537 --> 00:01:38,729
Algorithms of Oppression:
How Search Engines Reinforce Racism.
31
00:01:38,729 --> 00:01:42,977
People think of algorithms
as simply mathematical formulations.
32
00:01:42,977 --> 00:01:47,249
But in fact algorithms
are really about automated decisions.
33
00:01:48,199 --> 00:01:52,491
In 2009, I was kind of joking around,
in fact, with a colleague,
34
00:01:52,491 --> 00:01:54,900
and I was telling him
that I was really interested
35
00:01:54,900 --> 00:01:56,810
in what's happening with Google.
36
00:01:56,810 --> 00:01:59,181
And just kind of offhand he said to me,
37
00:01:59,181 --> 00:02:02,291
"Oh yeah, you should see what happens
when you google 'Black girls'."
38
00:02:02,291 --> 00:02:06,611
Of course I immediately did the search,
found that pornography was the primary way
39
00:02:06,611 --> 00:02:10,911
that Black girls, Latina girls,
Asian girls were represented.
40
00:02:10,911 --> 00:02:13,580
That started a whole
deeper line of inquiry
41
00:02:13,580 --> 00:02:17,042
about the way in which
misrepresentation happens
42
00:02:17,042 --> 00:02:19,002
for women of color on the internet
43
00:02:19,002 --> 00:02:22,170
and what some of the broader
social consequences of that are.
44
00:02:22,170 --> 00:02:25,369
In my work, I look at the way
that these platforms are designed
45
00:02:25,369 --> 00:02:29,271
to amplify certain voices
and silence other voices.
46
00:02:29,271 --> 00:02:30,591
How does that come about?
47
00:02:30,591 --> 00:02:32,350
What is that phenomenon about?
48
00:02:32,350 --> 00:02:36,051
What's the role of capital
or advertising dollars
49
00:02:36,051 --> 00:02:38,940
in driving certain results
to the first page?
50
00:02:38,940 --> 00:02:42,062
What do the results mean
in kind of a broader social,
51
00:02:42,062 --> 00:02:45,271
historical, economic context?
52
00:02:45,271 --> 00:02:49,300
So I contextualize the results
that I find to show
53
00:02:49,300 --> 00:02:53,341
how incredibly problematic this is
because it further marginalizes
54
00:02:53,341 --> 00:02:55,271
people who are already living in the margins,
55
00:02:55,271 --> 00:02:58,999
people who are already suffering
from systemic oppression,
56
00:02:58,999 --> 00:03:02,821
and yet again, these results
show up in these platforms
57
00:03:02,821 --> 00:03:07,330
as if they are credible, fair,
objective, neutral ideas.
58
00:03:07,330 --> 00:03:10,401
In the end, I call for alternatives.
59
00:03:10,401 --> 00:03:13,281
And I argue strongly that we need
60
00:03:13,281 --> 00:03:16,422
to have things like public interest search
61
00:03:16,422 --> 00:03:19,130
that are not driven by commercial biases.
62
00:03:19,130 --> 00:03:23,110
And I put out some ideas about
what it means to imagine
63
00:03:23,110 --> 00:03:27,572
and create alternatives
in our public information sphere
64
00:03:27,572 --> 00:03:29,670
that are based on
a different set of ethics.
65
00:03:29,670 --> 00:03:33,460
If anything, I think that this book
is the kind of book that will help us
66
00:03:33,460 --> 00:03:36,849
reframe the idea that
"We should just google it"
67
00:03:36,849 --> 00:03:39,201
and everything will be fine.
68
00:03:39,201 --> 00:03:41,382
♪ (music) ♪