Code4Rights, Code4All | Joy Buolamwini | TEDxBeaconStreet
-
0:14 - 0:17Hello, I'm Joy, a poet of code,
-
0:17 - 0:22on a mission to stop
an unseen force that's rising, -
0:22 - 0:25a force that I called "the coded gaze,"
-
0:25 - 0:28my term for algorithmic bias.
-
0:28 - 0:33Algorithmic bias, like human bias,
results in unfairness. -
0:33 - 0:39However, algorithms, like viruses,
can spread bias on a massive scale -
0:39 - 0:40at a rapid pace.
-
0:41 - 0:45Algorithmic bias can also lead
to exclusionary experiences -
0:45 - 0:47and discriminatory practices.
-
0:47 - 0:49Let me show you what I mean.
-
0:50 - 0:53(Video) Joy Buolamwini: Hi, camera.
I've got a face. -
0:53 - 0:55Can you see my face?
-
0:55 - 0:57No-glasses face?
-
0:58 - 1:00You can see her face.
-
1:01 - 1:03What about my face?
-
1:06 - 1:07(Laughter)
-
1:07 - 1:10I've got a mask. Can you see my mask?
-
1:12 - 1:14Joy Buolamwini: So how did this happen?
-
1:14 - 1:17Why am I sitting in front of a computer
-
1:17 - 1:19in a white mask,
-
1:19 - 1:23trying to be detected by a cheap webcam?
-
1:23 - 1:25Well, when I'm not fighting the coded gaze
-
1:25 - 1:26as a poet of code,
-
1:26 - 1:30I'm a graduate student
at the MIT Media Lab, -
1:30 - 1:35and there I have the opportunity to work
on all sorts of whimsical projects, -
1:35 - 1:37including the Aspire Mirror,
-
1:37 - 1:42a project I did so I could project
digital masks onto my reflection. -
1:42 - 1:44So in the morning, if I wanted
to feel powerful, -
1:44 - 1:46I could put on a lion.
-
1:46 - 1:49If I wanted to be uplifted,
I might have a quote. -
1:49 - 1:52So I used generic
facial recognition software -
1:52 - 1:54to build the system,
-
1:54 - 1:59but found it was really hard to test it
unless I wore a white mask. -
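For readers curious what a setup like this looks like in practice, here is a minimal sketch of webcam face detection with an off-the-shelf detector (OpenCV's stock Haar cascade). The library, cascade file, and parameters are illustrative assumptions, not the specific software used in the Aspire Mirror, but the failure mode is the same: detection only works well for faces that resemble the detector's training data.

```python
# Minimal sketch: detect faces from a webcam with a generic, pre-trained detector.
# Illustrative stand-in only -- not the software actually used in the talk.
import cv2

# Load the frontal-face Haar cascade that ships with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)  # open the default (cheap) webcam
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detection quality depends heavily on the data the detector was trained on.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Aspire Mirror sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```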
2:00 - 2:04Unfortunately, I've run
into this issue before. -
2:04 - 2:08When I was an undergraduate
at Georgia Tech studying computer science, -
2:08 - 2:11I used to work on social robots,
-
2:11 - 2:14and one of my tasks was to get a robot
to play peek-a-boo, -
2:14 - 2:16a simple turn-taking game
-
2:16 - 2:21where partners cover their face
and then uncover it saying, "Peek-a-boo!" -
2:21 - 2:25The problem is, peek-a-boo
doesn't really work if I can't see you, -
2:25 - 2:28and my robot couldn't see me.
-
2:28 - 2:32But I borrowed my roommate's face
to get the project done, -
2:32 - 2:33submitted the assignment,
-
2:33 - 2:37and figured, you know what,
somebody else will solve this problem. -
2:37 - 2:40Not too long after,
-
2:40 - 2:44I was in Hong Kong
for an entrepreneurship competition. -
2:44 - 2:47The organizers decided
to take participants -
2:47 - 2:49on a tour of local start-ups.
-
2:49 - 2:52One of the start-ups had a social robot,
-
2:52 - 2:54and they decided to do a demo.
-
2:54 - 2:57The demo worked on everybody
until it got to me, -
2:57 - 2:59and you can probably guess it.
-
2:59 - 3:02It couldn't detect my face.
-
3:02 - 3:04I asked the developers what was going on,
-
3:04 - 3:10and it turned out we had used the same
generic facial recognition software. -
3:10 - 3:12Halfway around the world,
-
3:12 - 3:16I learned that algorithmic bias
can travel as quickly -
3:16 - 3:19as it takes to download
some files off of the internet. -
3:20 - 3:23So what's going on?
Why isn't my face being detected? -
3:23 - 3:26Well, we have to look
at how we give machines sight. -
3:26 - 3:29Computer vision uses
machine learning techniques -
3:29 - 3:31to do facial recognition.
-
3:31 - 3:35So how this works is, you create
a training set with examples of faces. -
3:35 - 3:38This is a face. This is a face.
This is not a face. -
3:38 - 3:43And over time, you can teach a computer
how to recognize other faces. -
3:43 - 3:47However, if the training sets
aren't really that diverse, -
3:47 - 3:50any face that deviates too much
from the established norm -
3:50 - 3:52will be harder to detect,
-
3:52 - 3:54which is what was happening to me.
-
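The "this is a face, this is not a face" loop described above is ordinary supervised learning. Here is a minimal sketch of that idea using scikit-learn on randomly generated stand-in data; the features, labels, and model choice are illustrative assumptions, not the pipeline behind any particular face detector.

```python
# Minimal sketch of training a classifier from labeled examples.
# The data here is random stand-in material, not real face images.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for flattened image patches: rows are examples, columns are pixels.
X = rng.normal(size=(1000, 64 * 64))
y = rng.integers(0, 2, size=1000)  # 1 = "this is a face", 0 = "this is not a face"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Over time, you can teach a computer how to recognize other faces":
# fit a classifier on the labeled training set ...
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# ... then check how well it generalizes. If the training set under-represents
# some faces, accuracy on those faces tends to be systematically lower.
print("held-out accuracy:", model.score(X_test, y_test))
```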
3:54 - 3:56But don't worry -- there's some good news.
-
3:56 - 3:59Training sets don't just
materialize out of nowhere. -
3:59 - 4:01We actually can create them.
-
4:01 - 4:05So there's an opportunity to create
full-spectrum training sets -
4:05 - 4:09that reflect a richer
portrait of humanity. -
4:09 - 4:11Now you've seen in my examples
-
4:11 - 4:13how social robots
-
4:13 - 4:17were how I found out about exclusion
with algorithmic bias. -
4:17 - 4:22But algorithmic bias can also lead
to discriminatory practices. -
4:23 - 4:25Across the US,
-
4:25 - 4:29police departments are starting to use
facial recognition software -
4:29 - 4:31in their crime-fighting arsenal.
-
4:31 - 4:33Georgetown Law published a report
-
4:33 - 4:40showing that one in two adults
in the US -- that's 117 million people -- -
4:40 - 4:44have their faces
in facial recognition networks. -
4:44 - 4:48Police departments can currently look
at these networks unregulated, -
4:48 - 4:53using algorithms that have not
been audited for accuracy. -
4:53 - 4:57Yet we know facial recognition
is not fail proof, -
4:57 - 5:01and labeling faces consistently
remains a challenge. -
5:01 - 5:03You might have seen this on Facebook.
-
5:03 - 5:06My friends and I laugh all the time
when we see other people -
5:06 - 5:08mislabeled in our photos.
-
5:08 - 5:14But misidentifying a suspected criminal
is no laughing matter, -
5:14 - 5:17nor is breaching civil liberties.
-
5:17 - 5:20Machine learning is being used
for facial recognition, -
5:20 - 5:24but it's also extending beyond the realm
of computer vision. -
5:25 - 5:29In her book, "Weapons
of Math Destruction," -
5:29 - 5:36data scientist Cathy O'Neil
talks about the rising new WMDs -- -
5:36 - 5:40widespread, mysterious
and destructive algorithms -
5:40 - 5:43that are increasingly being used
to make decisions -
5:43 - 5:46that impact more aspects of our lives.
-
5:46 - 5:48So who gets hired or fired?
-
5:48 - 5:50Do you get that loan?
Do you get insurance? -
5:50 - 5:54Are you admitted into the college
you wanted to get into? -
5:54 - 5:57Do you and I pay the same price
for the same product -
5:57 - 6:00purchased on the same platform?
-
6:00 - 6:04Law enforcement is also starting
to use machine learning -
6:04 - 6:06for predictive policing.
-
6:06 - 6:10Some judges use machine-generated
risk scores to determine -
6:10 - 6:14how long an individual
is going to spend in prison. -
6:14 - 6:16So we really have to think
about these decisions. -
6:16 - 6:18Are they fair?
-
6:18 - 6:21And we've seen that algorithmic bias
-
6:21 - 6:24doesn't necessarily always
lead to fair outcomes. -
6:24 - 6:26So what can we do about it?
-
6:26 - 6:30Well, we can start thinking about
how we create more inclusive code -
6:30 - 6:33and employ inclusive coding practices.
-
6:33 - 6:35It really starts with people.
-
6:36 - 6:37So who codes matters.
-
6:38 - 6:42Are we creating full-spectrum teams
with diverse individuals -
6:42 - 6:44who can check each other's blind spots?
-
6:44 - 6:48On the technical side,
how we code matters. -
6:48 - 6:51Are we factoring in fairness
as we're developing systems? -
6:51 - 6:54And finally, why we code matters.
-
6:55 - 7:00We've used tools of computational creation
to unlock immense wealth. -
7:00 - 7:04We now have the opportunity
to unlock even greater equality -
7:04 - 7:07if we make social change a priority
-
7:07 - 7:09and not an afterthought.
-
7:10 - 7:14And so these are the three tenets
that will make up the "incoding" movement. -
7:14 - 7:16Who codes matters,
-
7:16 - 7:18how we code matters
-
7:18 - 7:20and why we code matters.
-
7:20 - 7:23So to go towards incoding,
we can start thinking about -
7:23 - 7:26building platforms that can identify bias
-
7:26 - 7:29by collecting people's experiences
like the ones I shared, -
7:29 - 7:32but also auditing existing software.
-
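One way to audit existing software, as suggested above, is to run the same detector over a benchmark of images labeled by demographic subgroup and compare detection rates. The sketch below assumes a hypothetical benchmark list and uses OpenCV's stock detector as the system under audit; both are placeholders, not a real dataset or the specific systems discussed in the talk.

```python
# Minimal sketch of a disaggregated audit: compare face-detection rates
# across subgroups. Image paths and group labels are hypothetical.
from collections import defaultdict
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# Hypothetical benchmark: (image path, subgroup label) pairs.
benchmark = [
    ("faces/example_a.jpg", "darker-skinned women"),
    ("faces/example_b.jpg", "lighter-skinned men"),
    # ... more labeled images ...
]

hits = defaultdict(int)
totals = defaultdict(int)
for path, group in benchmark:
    image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if image is None:
        continue  # skip unreadable files
    found = detector.detectMultiScale(image, scaleFactor=1.1, minNeighbors=5)
    totals[group] += 1
    hits[group] += int(len(found) > 0)

# Large gaps between groups are evidence of the kind of bias described above.
for group in totals:
    print(f"{group}: detected {hits[group]}/{totals[group]} faces")
```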
7:32 - 7:36We can also start to create
more inclusive training sets. -
7:36 - 7:39Imagine a "Selfies for Inclusion" campaign
-
7:39 - 7:42where you and I can help
developers test and create -
7:42 - 7:45more inclusive training sets.
-
7:45 - 7:48And we can also start thinking
more conscientiously -
7:48 - 7:53about the social impact
of the technology that we're developing. -
7:53 - 7:56To get the incoding movement started,
-
7:56 - 7:59I've launched the Algorithmic
Justice League, -
7:59 - 8:05where anyone who cares about fairness
can help fight the coded gaze. -
8:05 - 8:08On codedgaze.com, you can report bias,
-
8:08 - 8:10request audits, become a tester
-
8:10 - 8:13and join the ongoing conversation,
-
8:13 - 8:15#codedgaze.
-
8:17 - 8:19So I invite you to join me
-
8:19 - 8:23in creating a world where technology
works for all of us, -
8:23 - 8:25not just some of us,
-
8:25 - 8:29a world where we value inclusion
and center social change. -
8:29 - 8:31Thank you.
-
8:31 - 8:36(Applause)
-
8:37 - 8:40But I have one question:
-
8:40 - 8:42Will you join me in the fight?
-
8:42 - 8:43(Laughter)
-
8:43 - 8:47(Applause)
- Title:
- Code4Rights, Code4All | Joy Buolamwini | TEDxBeaconStreet
- Description:
-
Joy Buolamwini is a brilliant coder, but she had an issue: facial recognition algorithms don't always recognize her face unless she wears a white mask. This is an example of the subtle ways in which racial privilege excludes full participation in learning opportunities, so Joy did something about it - she started coding programs that are more inclusive. She started a revolution, and she's ready to get you on board!
Joy Buolamwini is the founder of Code4Rights and a graduate researcher with the Civic Media group at the MIT Media Lab. Joy is a Rhodes Scholar, a Fulbright Fellow, an Astronaut Scholar, a Google Anita Borg Scholar, and a Carter Center technical consultant recognized as a distinguished volunteer. As the Chief Technology Officer for Techturized Inc., a hair care technology company, and Swift Tech Solutions, a global health tech consultancy, Joy gained technical competency by developing software for under-served communities in the US, Ethiopia, Mali, Nigeria, and Niger. During her Fulbright Fellowship in Zambia, she explored empowering engaged citizens with skills to create their own technology through the Zamrize Initiative.
This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx
- Video Language:
- English
- Team:
- closed TED
- Project:
- TEDxTalks
- Duration:
- 09:02