Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Camera. I've got a face. Can you see my face? No-glasses face. You can see her face. What about my face? (Laughter) I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found that it was really hard to test it unless I wore a white mask.

Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what? Somebody else will solve this problem.

Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the Internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.

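To make that concrete, here is a minimal, hypothetical sketch in Python of that failure mode. It is not the software from the talk: the data is synthetic, the feature vectors stand in for images, and the toy detector simply accepts anything close enough to the average of the training faces it was given, so a group the training set barely contains ("group B" below, an assumed label) ends up harder to detect.

```python
# A minimal, hypothetical sketch (not the software from the talk): a toy
# "detector" that learns the average face from its training set and accepts
# anything close enough to that average. All data is synthetic; feature
# vectors stand in for images, "group A" for well-represented faces and
# "group B" for faces that deviate from the training set's norm.
import numpy as np

rng = np.random.default_rng(0)
DIM = 32  # size of the made-up feature vectors

def sample_faces(center, n):
    """Draw n synthetic 'face images' scattered around a group's center."""
    return center + rng.normal(scale=1.0, size=(n, DIM))

center_a = rng.normal(size=DIM)   # the kind of face the training set is full of
center_b = center_a + 1.0         # faces that deviate from that established norm

# A non-diverse training set: almost every example comes from group A.
training_set = np.vstack([sample_faces(center_a, 500), sample_faces(center_b, 5)])

# "Training": learn the average face and a distance threshold that accepts
# roughly 99% of the training examples the detector has seen.
mean_face = training_set.mean(axis=0)
train_distances = np.linalg.norm(training_set - mean_face, axis=1)
threshold = np.quantile(train_distances, 0.99)

def detection_rate(images):
    """Fraction of images the detector accepts as faces."""
    distances = np.linalg.norm(images - mean_face, axis=1)
    return float((distances <= threshold).mean())

# Faces like the training data are found; faces that deviate from it are
# missed far more often, even though both groups are equally real faces.
print("group A detection rate:", detection_rate(sample_faces(center_a, 1000)))
print("group B detection rate:", detection_rate(sample_faces(center_b, 1000)))
```

With only a handful of group B examples, the learned "norm" is essentially group A, so group B faces land outside the acceptance threshold far more often; that is the kind of gap the white mask was working around.
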
But don't worry, there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

Now you've seen in my examples how it was through social robots that I found out about exclusion driven by algorithmic bias, but algorithmic bias can also lead to discriminatory practices.

Across the U.S., police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that one in two adults in the U.S. -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge.

You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives.

So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison.

So we really have to think about these decisions. Are they fair? And we've seen that, with algorithmic bias in the mix, these decisions don't always lead to fair outcomes.

So what can we do about it? Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people.

So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought.

And so these are the three tenets that will make up the incoding movement: who codes matters, how we code matters, and why we code matters.

So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software.

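One way to picture such an audit: point an existing, off-the-shelf detector at a benchmark of photos grouped by the people they depict and compare detection rates per group. The sketch below does this with OpenCV's bundled Haar cascade; the faces/<group>/*.jpg layout and the premise that each photo shows exactly one face are assumptions for illustration, not an existing dataset or the Algorithmic Justice League's tooling.

```python
# A hypothetical audit sketch: measure how often an existing, off-the-shelf
# face detector (OpenCV's bundled Haar cascade) finds a face in benchmark
# photos, broken down by group. The faces/<group>/*.jpg layout is an
# assumption for illustration; every benchmark photo is taken to contain
# exactly one face, so a group with a much lower rate signals possible bias.
from pathlib import Path

import cv2  # pip install opencv-python

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detection_rate(image_paths):
    """Fraction of readable images in which the detector reports at least one face."""
    total = hits = 0
    for path in image_paths:
        image = cv2.imread(str(path))
        if image is None:
            continue  # unreadable file; skip it
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        total += 1
        hits += 1 if len(faces) > 0 else 0
    return hits / total if total else 0.0

benchmark = Path("faces")  # hypothetical folder of labeled benchmark photos
for group_dir in sorted(benchmark.iterdir()):
    if group_dir.is_dir():
        rate = detection_rate(sorted(group_dir.glob("*.jpg")))
        print(f"{group_dir.name}: detection rate {rate:.1%}")
```

Any group whose rate lags well behind the others is a concrete, reportable signal of bias.
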
We can also start to create more inclusive training sets. Imagine a "selfies for inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.

So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change.

Thank you.

(Applause)

But I have one question. Will you join me in the fight?

(Laughter)

(Applause)