How I'm fighting bias in algorithms

Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.

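To make the setup concrete: off-the-shelf face detection usually means applying a pre-trained detector to every webcam frame, roughly as in the minimal sketch below. The specific detector here (OpenCV's bundled Haar cascade) is an assumption for illustration, not the software used in the talk; the point is that whoever the detector's training images underrepresented will be detected less reliably.

```python
# Minimal sketch of off-the-shelf face detection on a cheap webcam.
# The Haar cascade shipped with OpenCV is an assumed stand-in for the
# "generic facial recognition software" mentioned in the talk.
import cv2

# Pre-trained frontal-face detector bundled with opencv-python.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detection quality depends entirely on what the cascade was trained on.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("face detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
cv2.destroyAllWindows()
```
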
Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.

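In code, that training step is just supervised classification over labeled examples, roughly like the sketch below. The data here is a randomly generated stand-in (a real detector would use labeled image patches), but it shows the key property: the model can only learn the kinds of faces its training rows actually contain.

```python
# Sketch of "this is a face / this is not a face" training.
# The feature matrix is random stand-in data; in a real detector each row
# would be a labeled image patch.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 24 * 24))   # stand-in for flattened 24x24 patches
y = rng.integers(0, 2, size=1000)      # 1 = "face", 0 = "not a face"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# A single overall score like this hides who the model works for:
# faces unlike anything in X_train will do far worse than this average.
print("held-out accuracy:", model.score(X_test, y_test))
```
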
But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

Now you've seen in my examples how social robots were where I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices.

Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge.

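What an accuracy audit could look like, as a minimal sketch: instead of reporting one overall number, break a system's results down by group so disparities become visible. The groups and records below are hypothetical placeholders; a real audit would use a labeled benchmark with demographic annotations.

```python
# Sketch of a disaggregated accuracy audit: per-group accuracy instead of
# a single overall score. All data below is hypothetical.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, was_correct) pairs."""
    tallies = defaultdict(lambda: [0, 0])  # group -> [correct, total]
    for group, was_correct in records:
        tallies[group][0] += int(was_correct)
        tallies[group][1] += 1
    return {group: correct / total for group, (correct, total) in tallies.items()}

# Hypothetical detection results for a face recognition system.
results = [
    ("lighter-skinned men", True), ("lighter-skinned men", True),
    ("darker-skinned women", False), ("darker-skinned women", True),
]
print(accuracy_by_group(results))  # per-group rates make the gap visible
```
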
You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book, "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform?

Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't always lead to fair outcomes.

So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems?

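One way to make that question concrete during development, sketched under assumptions (hypothetical groups, one fairness definition among many, and a threshold borrowed from the "80 percent rule" used in some hiring contexts), is to compare outcome rates across groups before a system ships:

```python
# Sketch of a pre-ship fairness check: compare positive-outcome rates across
# groups (demographic parity). Groups, data, and threshold are hypothetical.
def positive_rates(decisions):
    """decisions: list of (group, got_positive_outcome) pairs."""
    counts = {}
    for group, positive in decisions:
        total, hits = counts.get(group, (0, 0))
        counts[group] = (total + 1, hits + int(positive))
    return {group: hits / total for group, (total, hits) in counts.items()}

decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]
rates = positive_rates(decisions)
print(rates)
if min(rates.values()) < 0.8 * max(rates.values()):
    print("warning: outcome rates differ substantially across groups")
```
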
And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought. And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters and why we code matters.

So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

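A sketch of what "testing" a training set for inclusiveness could mean in practice: measure how its examples are distributed across groups before training on it. The metadata fields and group labels are hypothetical; a real dataset would need its own annotations.

```python
# Sketch of a training-set composition check. The records and group labels
# are hypothetical stand-ins for real dataset annotations.
from collections import Counter

def composition(examples):
    """examples: iterable of dicts with a hypothetical 'group' annotation."""
    counts = Counter(example["group"] for example in examples)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

training_set = [
    {"image": "img_001.png", "group": "lighter-skinned"},
    {"image": "img_002.png", "group": "lighter-skinned"},
    {"image": "img_003.png", "group": "lighter-skinned"},
    {"image": "img_004.png", "group": "darker-skinned"},
]
print(composition(training_set))  # a heavily skewed split signals more collection is needed
```
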
To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.

So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. Thank you.

(Applause)

But I have one question: Will you join me in the fight?

(Laughter)

(Applause)

Title: How I'm fighting bias in algorithms
Speaker: Joy Buolamwini
Description: MIT grad student Joy Buolamwini was working with facial recognition software when she noticed a problem: the software didn't recognize her face -- because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." It's an eye-opening talk about the need for accountability in coding ... as algorithms take over more and more aspects of our lives.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 08:46