
How I'm fighting bias in algorithms

  • 0:01 - 0:04
    Hello, I'm Joy, a poet of code,
  • 0:04 - 0:09
    on a mission to stop
    an unseen force that's rising,
  • 0:09 - 0:12
    a force that I call "the coded gaze,"
  • 0:12 - 0:15
    my term for algorithmic bias.
  • 0:15 - 0:20
    Algorithmic bias, like human bias,
    results in unfairness.
  • 0:20 - 0:26
    However, algorithms, like viruses,
    can spread bias on a massive scale
  • 0:26 - 0:27
    at a rapid pace.
  • 0:28 - 0:32
    Algorithmic bias can also lead
    to exclusionary experiences
  • 0:32 - 0:34
    and discriminatory practices.
  • 0:35 - 0:37
    Let me show you what I mean.
  • 0:38 - 0:40
    (Video) Joy Buolamwini: Camera.
    I've got a face.
  • 0:41 - 0:42
    Can you see my face?
  • 0:42 - 0:46
    No-glasses face.
  • 0:46 - 0:48
    You can see her face.
  • 0:48 - 0:50
    What about my face?
  • 0:53 - 0:54
    (Laughter)
  • 0:54 - 0:58
    I've got a mask. Can you see my mask?
  • 0:59 - 1:01
    Joy Buolamwini: So how did this happen?
  • 1:01 - 1:05
    Why am I sitting in front of a computer
  • 1:05 - 1:06
    in a white mask,
  • 1:06 - 1:10
    trying to be detected by a cheap webcam?
  • 1:10 - 1:12
    Well, when I'm not fighting the coded gaze
  • 1:12 - 1:14
    as a poet of code,
  • 1:14 - 1:17
    I'm a graduate student
    at the MIT Media Lab,
  • 1:17 - 1:22
    and there I have the opportunity to work
    on all sorts of whimsical projects,
  • 1:22 - 1:24
    including the Aspire Mirror,
  • 1:24 - 1:29
    a project I did so I could project
    digital masks onto my reflection.
  • 1:29 - 1:31
    So in the morning, if I wanted
    to feel powerful,
  • 1:31 - 1:33
    I could put on a lion.
  • 1:33 - 1:36
    If I wanted to be uplifted,
    I might have a quote.
  • 1:36 - 1:39
    So I used generic
    facial recognition software
  • 1:39 - 1:41
    to build the system,
  • 1:41 - 1:46
    but found it was really hard to test it
    unless I wore a white mask.
  • 1:47 - 1:51
    Unfortunately, I've run
    into this issue before.
  • 1:51 - 1:55
    When I was an undergraduate
    at Georgia Tech studying computer science,
  • 1:56 - 1:58
    I used to work on social robots,
  • 1:58 - 2:01
    and one of my tasks was to get a robot
    to play peek-a-boo,
  • 2:01 - 2:03
    a simple turn-taking game
  • 2:03 - 2:07
    where partners cover their face
    and then uncover it saying, "Peek-a-boo!"
  • 2:07 - 2:12
    The problem is, peek-a-boo
    doesn't really work if I can't see you,
  • 2:12 - 2:14
    and my robot couldn't see me.
  • 2:14 - 2:18
    But I borrowed my roommate's face
    to get the project done,
  • 2:18 - 2:20
    submitted the assignment,
  • 2:20 - 2:24
    and figured, you know what,
    somebody else will solve this problem.
  • 2:24 - 2:26
    Not too long after,
  • 2:26 - 2:30
    I was in Hong Kong
    for an entrepreneurship competition.
  • 2:31 - 2:34
    The organizers decided
    to take participants
  • 2:34 - 2:36
    on a tour of local start-ups.
  • 2:36 - 2:39
    One of the start-ups had a social robot,
  • 2:39 - 2:41
    and they decided to do a demo.
  • 2:41 - 2:44
    The demo worked on everybody
    until it got to me,
  • 2:44 - 2:46
    and you can probably guess it.
  • 2:46 - 2:49
    It couldn't detect my face.
  • 2:49 - 2:51
    I asked the developers what was going on,
  • 2:51 - 2:57
    and it turned out we had used the same
    generic facial recognition software.
  • 2:57 - 2:58
    Halfway around the world,
  • 2:58 - 3:02
    I learned that algorithmic bias
    can travel as quickly
  • 3:02 - 3:05
    as it takes to download
    some files off of the internet.
  • 3:06 - 3:09
    So what's going on?
    Why isn't my face being detected?
  • 3:09 - 3:13
    Well, we have to look at
    how we give machines sight.
  • 3:13 - 3:16
    Computer vision uses
    machine learning techniques
  • 3:16 - 3:18
    to do facial recognition.
  • 3:18 - 3:22
    So how this works is, you create
    a training set with examples of faces.
  • 3:22 - 3:25
    This is a face. This is a face.
    This is not a face.
  • 3:25 - 3:29
    And over time, you can teach a computer
    how to recognize other faces.
  • 3:29 - 3:33
    However, if the training sets
    aren't really that diverse,
  • 3:33 - 3:37
    any face that deviates too much
    from the established norm
  • 3:37 - 3:38
    will be harder to detect,
  • 3:38 - 3:40
    which is what was happening to me.
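The failure mode described above can be sketched as a deliberately simplified toy model (not any real face-detection system; the brightness feature, threshold, and training distribution are all illustrative assumptions): a detector learns a norm from a skewed training set, and any face that deviates too far from that norm goes undetected.

```python
import random

random.seed(0)

# Toy assumption: each "face" is reduced to a single brightness feature.
# The training set is skewed -- nearly all examples cluster around 0.8.
training_faces = [random.gauss(0.8, 0.05) for _ in range(1000)]

# The "learned norm" is just the mean of the skewed training data.
learned_norm = sum(training_faces) / len(training_faces)
threshold = 0.15  # illustrative: detect only faces near the learned norm

def detects(face_brightness):
    """A face is 'detected' only if it sits close to the training norm."""
    return abs(face_brightness - learned_norm) <= threshold

print(detects(0.82))  # close to the skewed norm: detected (True)
print(detects(0.35))  # far from the skewed norm: missed (False)
```

The point of the sketch is that nothing in the detector is explicitly biased; the exclusion falls out entirely from what the training set did and did not contain.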
  • 3:40 - 3:43
    But don't worry -- there's some good news.
  • 3:43 - 3:46
    Training sets don't just
    materialize out of nowhere.
  • 3:46 - 3:47
    We actually can create them.
  • 3:47 - 3:52
    So there's an opportunity to create
    full-spectrum training sets
  • 3:52 - 3:55
    that reflect a richer
    portrait of humanity.
  • 3:56 - 3:58
    Now you've seen in my examples
  • 3:58 - 4:00
    how social robots
  • 4:00 - 4:04
    were how I found out about exclusion
    with algorithmic bias.
  • 4:04 - 4:09
    But algorithmic bias can also lead
    to discriminatory practices.
  • 4:10 - 4:11
    Across the US,
  • 4:11 - 4:16
    police departments are starting to use
    facial recognition software
  • 4:16 - 4:18
    in their crime-fighting arsenal.
  • 4:18 - 4:20
    Georgetown Law published a report
  • 4:20 - 4:27
    showing that one in two adults
    in the US -- that's 117 million people --
  • 4:27 - 4:31
    have their faces
    in facial recognition networks.
  • 4:31 - 4:35
    Police departments can currently look
    at these networks unregulated,
  • 4:35 - 4:39
    using algorithms that have not
    been audited for accuracy.
  • 4:39 - 4:43
    Yet we know facial recognition
    is not fail-proof,
  • 4:43 - 4:47
    and labeling faces consistently
    remains a challenge.
  • 4:48 - 4:49
    You might have seen this on Facebook.
  • 4:49 - 4:52
    My friends and I laugh all the time
    when we see other people
  • 4:52 - 4:55
    mislabeled in our photos.
  • 4:55 - 5:00
    But misidentifying a suspected criminal
    is no laughing matter,
  • 5:00 - 5:03
    nor is breaching civil liberties.
  • 5:03 - 5:06
    Machine learning is being used
    for facial recognition,
  • 5:06 - 5:11
    but it's also extending beyond the realm
    of computer vision.
  • 5:12 - 5:16
    In her book, "Weapons
    of Math Destruction,"
  • 5:16 - 5:23
    data scientist Cathy O'Neil
    talks about the rising new WMDs --
  • 5:23 - 5:27
    widespread, mysterious
    and destructive algorithms
  • 5:27 - 5:30
    that are increasingly being used
    to make decisions
  • 5:30 - 5:33
    that impact more aspects of our lives.
  • 5:33 - 5:35
    So who gets hired or fired?
  • 5:35 - 5:37
    Do you get that loan?
    Do you get insurance?
  • 5:37 - 5:41
    Are you admitted into the college
    you wanted to get into?
  • 5:41 - 5:44
    Do you and I pay the same price
    for the same product
  • 5:44 - 5:47
    purchased on the same platform?
  • 5:47 - 5:50
    Law enforcement is also starting
    to use machine learning
  • 5:50 - 5:53
    for predictive policing.
  • 5:53 - 5:56
    Some judges use machine-generated
    risk scores to determine
  • 5:56 - 6:01
    how long an individual
    is going to spend in prison.
  • 6:01 - 6:03
    So we really have to think
    about these decisions.
  • 6:03 - 6:04
    Are they fair?
  • 6:04 - 6:07
    And we've seen that algorithmic bias
  • 6:07 - 6:11
    doesn't necessarily always
    lead to fair outcomes.
  • 6:11 - 6:13
    So what can we do about it?
  • 6:13 - 6:16
    Well, we can start thinking about
    how we create more inclusive code
  • 6:16 - 6:19
    and employ inclusive coding practices.
  • 6:19 - 6:22
    It really starts with people.
  • 6:22 - 6:24
    So who codes matters.
  • 6:24 - 6:28
    Are we creating full-spectrum teams
    with diverse individuals
  • 6:28 - 6:31
    who can check each other's blind spots?
  • 6:31 - 6:34
    On the technical side,
    how we code matters.
  • 6:34 - 6:38
    Are we factoring in fairness
    as we're developing systems?
  • 6:38 - 6:41
    And finally, why we code matters.
  • 6:41 - 6:46
    We've used tools of computational creation
    to unlock immense wealth.
  • 6:46 - 6:51
    We now have the opportunity
    to unlock even greater equality
  • 6:51 - 6:54
    if we make social change a priority
  • 6:54 - 6:56
    and not an afterthought.
  • 6:57 - 7:01
    And so these are the three tenets
    that will make up the "incoding" movement.
  • 7:01 - 7:03
    Who codes matters,
  • 7:03 - 7:04
    how we code matters,
  • 7:04 - 7:06
    and why we code matters.
  • 7:06 - 7:09
    So to go towards incoding,
    we can start thinking about
  • 7:10 - 7:13
    building platforms that can identify bias
  • 7:13 - 7:16
    by collecting people's experiences
    like the ones I shared,
  • 7:16 - 7:19
    but also auditing existing software.
  • 7:19 - 7:23
    We can also start to create
    more inclusive training sets.
  • 7:23 - 7:25
    Imagine a "Selfies for Inclusion" campaign
  • 7:26 - 7:29
    where you and I can help
    developers test and create
  • 7:29 - 7:31
    more inclusive training sets.
  • 7:32 - 7:35
    And we can also start thinking
    more conscientiously
  • 7:35 - 7:40
    about the social impact
    of the technology that we're developing.
  • 7:40 - 7:43
    To get the incoding movement started,
  • 7:43 - 7:45
    I've launched the Algorithmic
    Justice League,
  • 7:45 - 7:51
    where anyone who cares about fairness
    can help fight the coded gaze.
  • 7:51 - 7:55
    On codedgaze.com, you can report bias,
  • 7:55 - 7:57
    request audits, become a tester
  • 7:57 - 8:00
    and join the ongoing conversation,
  • 8:00 - 8:02
    #codedgaze.
  • 8:03 - 8:06
    So I invite you to join me
  • 8:06 - 8:10
    in creating a world where technology
    works for all of us,
  • 8:10 - 8:11
    not just some of us,
  • 8:11 - 8:16
    a world where we value inclusion
    and center social change.
  • 8:16 - 8:17
    Thank you.
  • 8:17 - 8:22
    (Applause)
  • 8:23 - 8:26
    But I have one question:
  • 8:26 - 8:28
    Will you join me in the fight?
  • 8:28 - 8:30
    (Laughter)
  • 8:30 - 8:32
    (Applause)
Title:
How I'm fighting bias in algorithms
Speaker:
Joy Buolamwini
Description:

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
08:46