How I'm fighting bias in algorithms

  • 0:01 - 0:03
    Hello, I'm Joy,
  • 0:03 - 0:04
    a poet of code
  • 0:04 - 0:07
    on a mission to stop
  • 0:07 - 0:09
    an unseen force that's rising,
  • 0:09 - 0:12
a force that I call the coded gaze,
  • 0:12 - 0:15
    my term for algorithmic bias.
  • 0:15 - 0:18
    Algorithmic bias, like human bias,
  • 0:18 - 0:20
    results in unfairness.
  • 0:20 - 0:23
    However, algorithms, like viruses,
  • 0:23 - 0:26
    can spread bias on a massive scale
  • 0:26 - 0:28
    at a rapid pace.
  • 0:28 - 0:33
    Algorithmic bias can also lead
    to exclusionary experiences,
  • 0:33 - 0:35
    and discriminatory practices.
  • 0:35 - 0:38
    Let me show you what I mean.
  • 0:38 - 0:41
(Video) Joy Buolamwini: Camera.
    I've got a face.
  • 0:41 - 0:43
    Can you see my face?
  • 0:43 - 0:46
    No glasses face.
  • 0:46 - 0:48
    You can see her face.
  • 0:48 - 0:51
    What about my face?
  • 0:54 - 0:55
    (Laughter)
  • 0:55 - 0:59
    I've got a mask. Can you see my mask?
  • 0:59 - 1:02
Joy Buolamwini: So how did this happen?
  • 1:02 - 1:05
    Why am I sitting in front of a computer
  • 1:05 - 1:06
    in a white mask
  • 1:06 - 1:10
    trying to be detected by a cheap webcam?
  • 1:10 - 1:12
    Well, when I'm not fighting the coded gaze
  • 1:12 - 1:14
    as a poet of code,
  • 1:14 - 1:17
    I'm a graduate student
    at the MIT Media Lab,
  • 1:17 - 1:19
    and there I have the opportunity
  • 1:19 - 1:22
    to work on all sorts
    of whimsical projects,
  • 1:22 - 1:24
    including the Aspire Mirror,
  • 1:24 - 1:29
    a project I did so I could project
    digital masks onto my reflection.
  • 1:29 - 1:31
    So in the morning, if I wanted
    to feel powerful,
  • 1:31 - 1:33
    I could put on a lion.
  • 1:33 - 1:36
    If I wanted to be uplifted,
    I might have a quote.
  • 1:36 - 1:39
    So I used generic facial
    recognition software
  • 1:39 - 1:41
    to build the system,
  • 1:41 - 1:44
    but found that it was
    really hard to test it
  • 1:44 - 1:47
    unless I wore a white mask.
  • 1:47 - 1:51
    Unfortunately, I've run
    into this issue before.
  • 1:51 - 1:56
    When I was an undergraduate
    at Georgia Tech studying computer science,
  • 1:56 - 1:58
    I used to work on social robots,
  • 1:58 - 2:00
    and one of my tasks was to get a robot
  • 2:00 - 2:01
    to play peek-a-boo,
  • 2:01 - 2:03
    a simple turn-taking game
  • 2:03 - 2:07
    where partners cover their face
    and then uncover it saying "Peek-a-boo."
  • 2:07 - 2:12
    The problem is, peek-a-boo
    doesn't really work if I can't see you,
  • 2:12 - 2:14
    and my robot couldn't see me.
  • 2:14 - 2:18
    But I borrowed my roommate's face
    to get the project done,
  • 2:18 - 2:20
    submitted the assignment, and figured,
  • 2:20 - 2:24
    you know what? Somebody else
    will solve this problem.
  • 2:24 - 2:26
Not too long after,
  • 2:26 - 2:31
    I was in Hong Kong
    for an entrepreneurship competition.
  • 2:31 - 2:34
    The organizers decided
    to take participants
  • 2:34 - 2:36
    on a tour of local start-ups.
  • 2:36 - 2:39
    One of the start-ups had a social robot,
  • 2:39 - 2:41
    and they decided to do a demo.
  • 2:41 - 2:44
    The demo worked on everybody
    until it got to me,
  • 2:44 - 2:46
    and you can probably guess it.
  • 2:46 - 2:49
    It couldn't detect my face.
  • 2:49 - 2:51
    I asked the developers
    what was going on,
  • 2:51 - 2:52
    and it turned out
  • 2:52 - 2:57
    we had used the same generic
    facial recognition software.
  • 2:57 - 2:58
    Halfway around the world,
  • 2:58 - 3:02
    I learned that algorithmic bias
    can travel as quickly
  • 3:02 - 3:05
    as it takes to download
    some files off of the Internet.
  • 3:05 - 3:07
    So what's going on?
    Why isn't my face being detected?
  • 3:07 - 3:13
    Well, we have to look at
    how we give machines sight.
  • 3:13 - 3:16
    Computer vision uses
    machine learning techniques
  • 3:16 - 3:18
    to do facial recognition.
  • 3:18 - 3:21
    So how this works is
    you create a training set
  • 3:21 - 3:22
    with examples of faces.
  • 3:22 - 3:25
This is a face. This is a face.
    This is not a face.
    This is not a face.
  • 3:25 - 3:29
    And over time, you can teach
    a computer how to recognize other faces.
  • 3:29 - 3:33
    However, if the training sets
    aren't really that diverse,
  • 3:33 - 3:36
    any face that deviates too much
    from the established norm
  • 3:36 - 3:38
    will be harder to detect,
  • 3:38 - 3:40
    which is what was happening to me.
  • 3:40 - 3:43
    But don't worry, there's some good news.
  • 3:43 - 3:46
    Training sets don't just
    materialize out of nowhere.
  • 3:46 - 3:47
    We actually can create them.
  • 3:47 - 3:52
    So there's an opportunity to create
    full spectrum training sets
  • 3:52 - 3:56
    that reflect a richer
    portrait of humanity.
  • 3:56 - 3:58
    Now you've seen in my examples
  • 3:58 - 4:00
    how social robots
  • 4:00 - 4:04
were how I found out about exclusion
    with algorithmic bias,
  • 4:04 - 4:10
    but algorithmic bias can also lead
    to discriminatory practices.
  • 4:10 - 4:14
    Across the U.S., police departments
    are starting to use
  • 4:14 - 4:16
    facial recognition software
  • 4:16 - 4:18
    in their crime-fighting arsenal.
  • 4:18 - 4:30
    Georgetown Law published a report
    showing that one in two
    adults in the U.S. --
    that's 117 million people --
    have their faces
    in facial recognition networks.
  • 4:30 - 4:35
    Police departments can currently look
    at these networks unregulated
  • 4:35 - 4:39
    using algorithms that have not
    been audited for accuracy.
  • 4:39 - 4:42
    Yet we know facial recognition
  • 4:42 - 4:43
is not fail-proof,
  • 4:43 - 4:48
    and labeling faces consistently
    remains a challenge.
  • 4:48 - 4:49
    You might have seen this on Facebook.
  • 4:49 - 4:52
    My friends and I laugh all the time
    when we see other people
  • 4:52 - 4:55
    mislabeled in our photos.
  • 4:55 - 4:58
    But misidentifying a suspected criminal
  • 4:58 - 5:01
    is no laughing matter,
  • 5:01 - 5:03
    nor is breaching civil liberties.
  • 5:03 - 5:06
    Machine learning is being used
    for facial recognition,
  • 5:06 - 5:09
    but it's also extending beyond the realm
  • 5:09 - 5:12
    of computer vision.
  • 5:12 - 5:16
    In her book
    "Weapons of Math Destruction,"
  • 5:16 - 5:18
    data scientist Cathy O'Neil
  • 5:18 - 5:23
    talks about the rising new WMDs --
  • 5:23 - 5:27
    widespread, mysterious,
    and destructive algorithms
  • 5:27 - 5:30
    that are increasingly being used
    to make decisions
  • 5:30 - 5:33
    that impact more aspects of our lives.
  • 5:33 - 5:35
    So who gets hired or fired?
  • 5:35 - 5:37
    Do you get that loan?
    Do you get insurance?
  • 5:37 - 5:41
    Are you admitted into the college
    that you wanted to get into?
  • 5:41 - 5:44
    Do you and I pay the same price
    for the same product
  • 5:44 - 5:47
    purchased on the same platform?
  • 5:47 - 5:51
    Law enforcement is also starting
    to use machine learning
  • 5:51 - 5:53
    for predictive policing.
  • 5:53 - 5:56
    Some judges use machine-generated
    risk scores to determine
  • 5:56 - 6:01
    how long an individual
    is going to spend in prison.
  • 6:01 - 6:03
    So we really have to think
    about these decisions.
  • 6:03 - 6:04
    Are they fair?
  • 6:04 - 6:07
    And we've seen that algorithmic bias
  • 6:07 - 6:11
doesn't always lead
    to fair outcomes.
  • 6:11 - 6:13
    So what can we do about it?
  • 6:13 - 6:16
    Well, we can start thinking about
    how we create more inclusive code
  • 6:16 - 6:19
    and employ inclusive coding practices.
  • 6:19 - 6:22
    It really starts with people.
  • 6:22 - 6:24
    So who codes matters.
  • 6:24 - 6:28
    Are we creating full spectrum teams
    with diverse individuals
  • 6:28 - 6:31
    who can check each other's blind spots?
  • 6:31 - 6:33
    On the technical side,
  • 6:33 - 6:34
    how we code matters.
  • 6:34 - 6:38
    Are we factoring in fairness
    as we're developing systems?
  • 6:38 - 6:41
    And finally, why we code matters.
  • 6:41 - 6:44
    We've used tools of computational creation
  • 6:44 - 6:46
    to unlock immense wealth.
  • 6:46 - 6:48
    We now have the opportunity
  • 6:48 - 6:51
    to unlock even greater equality
  • 6:51 - 6:54
    if we make social change a priority
  • 6:54 - 6:57
    and not an afterthought.
  • 6:57 - 7:01
    And so these are the three tenets
    that will make up the incoding movement.
  • 7:01 - 7:03
    Who codes matters,
  • 7:03 - 7:04
    how we code matters,
  • 7:04 - 7:07
    and why we code matters.
  • 7:07 - 7:10
    So to go towards incoding,
    we can start thinking about
  • 7:10 - 7:13
    building platforms that can identify bias
  • 7:13 - 7:16
    by collecting people's experiences
    like the ones I shared,
  • 7:16 - 7:19
    but also auditing existing software.
  • 7:19 - 7:23
    We can also start to create
    more inclusive training sets.
  • 7:23 - 7:26
Imagine a "Selfies for Inclusion" campaign
  • 7:26 - 7:28
    where you and I can help developers
  • 7:28 - 7:31
    test and create more
    inclusive training sets.
  • 7:31 - 7:35
    And we can also start
    thinking more conscientiously
  • 7:35 - 7:40
    about the social impact
    of the technology that we're developing.
  • 7:40 - 7:43
    To get the incoding movement started,
  • 7:43 - 7:45
    I've launched the Algorithmic
    Justice League,
  • 7:45 - 7:48
    where anyone who cares about fairness
  • 7:48 - 7:51
    can help fight the coded gaze.
  • 7:51 - 7:54
    On codedgaze.com, you can report bias,
  • 7:54 - 7:56
    request audits, become a tester,
  • 7:56 - 8:00
    and join the ongoing conversation,
  • 8:00 - 8:03
    #codedgaze.
  • 8:03 - 8:06
    So I invite you to join me
  • 8:06 - 8:10
    in creating a world where technology
    works for all of us,
  • 8:10 - 8:12
    not just some of us,
  • 8:12 - 8:16
    a world where we value inclusion
    and center social change.
  • 8:16 - 8:18
    Thank you.
  • 8:18 - 8:23
    (Applause)
  • 8:24 - 8:27
    But I have one question.
  • 8:27 - 8:29
    Will you join me in the fight?
  • 8:29 - 8:31
    (Laughter)
  • 8:31 - 8:33
    (Applause)
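The training-set mechanism described in the talk (around 3:13 - 3:40) can be sketched as a toy example. This is a hypothetical illustration, not the software from the talk: a "detector" fit to a narrow set of examples (here, single brightness values) rejects any input that deviates too far from the norm it learned, which is how a non-diverse training set produces exclusion.

```python
# Toy sketch (hypothetical, not the actual facial recognition software):
# a detector trained on a narrow distribution rejects out-of-norm inputs.
import statistics

def train_detector(training_samples):
    """Learn the 'typical' value and an acceptance threshold
    from the training set (each sample is one brightness feature)."""
    mean = statistics.mean(training_samples)
    spread = statistics.pstdev(training_samples)
    threshold = 3 * spread  # accept anything within 3 standard deviations
    return mean, threshold

def detects(sample, mean, threshold):
    """Return True if the sample falls inside the learned norm."""
    return abs(sample - mean) <= threshold

# A non-diverse training set: values tightly clustered around 200.
narrow_set = [195, 200, 205, 198, 202]
mean, threshold = train_detector(narrow_set)

print(detects(201, mean, threshold))  # close to the learned norm: True
print(detects(90, mean, threshold))   # deviates too much: False
```

A fuller training set (one that covers the value 90 as well) would widen the learned norm, which is the point of the "full spectrum training sets" the talk calls for.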
Title:
How I'm fighting bias in algorithms
Speaker:
Joy Buolamwini
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
08:46
