Algorithms of Oppression: Faculty Focus: Safiya Umoja Noble

0:03 - 0:20
Hi, my name is Safiya Umoja Noble, and I'm an assistant professor in the Annenberg School of Communication and Journalism. My research looks at racist and sexist algorithmic bias and the way in which people are marginalized and oppressed by digital media platforms.

0:20 - 0:49
I spent 15 years in corporate marketing and advertising, working for some of the largest Fortune 100 brands in the United States. We were starting to redirect significant portions of our advertising media buying dollars online and thinking about, in fact, how to game Google search and Yahoo! to elevate the brands and amplify the messages. And so at the moment that I was leaving corporate America and moving into academia, the public was increasingly falling in love with Google.

0:49 - 1:08
And this led me to thinking that this was a space and a place that needed to be looked at more closely. It was interesting to see this total diversion of public goods, public knowledge, and libraries being shifted into a corporate, privately held company.

1:09 - 1:39
When we go to places like Google search, the public generally thinks that what they'll find there will be credible and fairly representing different kinds of ideas, people, and spheres of knowledge. And so this is what really prompted a six-year inquiry into this phenomenon of thinking about misrepresentation on the internet, particularly when people are using search engines, and that culminated in my new book, Algorithms of Oppression: How Search Engines Reinforce Racism.

1:39 - 1:47
People think of algorithms as simply a mathematical formulation. But in fact, algorithms are really about automated decisions.

1:48 - 2:11
In 2009, I was kind of joking around, in fact, with a colleague, and I was telling him that I was really interested in what's happening with Google. And just kind of offhand he said to me, "Oh yeah, you should see what happens when you google 'Black girls'." Of course I immediately did the search, found that pornography was the primary way that Black girls, Latina girls, Asian girls were represented.

2:11 - 2:29
That started a whole deeper line of inquiry about the way in which misrepresentation happens for women of color on the internet and what some of the broader social consequences of that are. In my work, I look at the way that these platforms are designed to amplify certain voices and silence other voices.

2:29 - 2:45
How does that come about? What is that phenomenon about? What's the role of capital or advertising dollars in driving certain results to the first page? What do the results mean in kind of a broader social, historical, economic context?

2:45 - 3:07
So I contextualize the results that I find to show how incredibly problematic this is, because it further marginalizes people who are already living in the margins, people who are already suffering from systemic oppression, and yet again, these results show up in these platforms as if they are credible, fair, objective, neutral ideas.

3:07 - 3:30
In the end, I call for alternatives. And I argue strongly that we need to have things like public interest search that are not driven by commercial biases. And I put out some ideas about what it means to imagine and create alternatives in our public information sphere that are based on a different set of ethics.

3:30 - 3:39
If anything, I think that this book is the kind of book that will help us reframe the idea that "We should just google it" and everything will be fine.

3:39 - 3:41
♪ (music) ♪

- Title:
- Algorithms of Oppression: Faculty Focus: Safiya Umoja Noble
- Description:
Back in 2009, when Safiya Noble conducted a Google search using the keywords "Black girls," "Latina girls," and "Asian girls," the first page of results was invariably linked to pornography.
In her book, Algorithms of Oppression: How Search Engines Reinforce Racism, Noble delves into the ways search engines misrepresent a variety of people, concepts, and types of information and knowledge. Her aim: to get people thinking and talking about the prominent role technology plays in shaping our lives and our future.
This video explores her journey into researching this topic and what it means to us.
- Video Language:
- English
- Duration:
- 03:44