How I'm fighting bias in algorithms

  • 0:01 - 0:04
    Hello, I'm Joy, a poet of code,
  • 0:04 - 0:09
    on a mission to stop
    an unseen force that's rising,
  • 0:09 - 0:12
    a force that I called "the coded gaze,"
  • 0:12 - 0:15
    my term for algorithmic bias.
  • 0:15 - 0:20
    Algorithmic bias, like human bias,
    results in unfairness.
  • 0:20 - 0:26
    However, algorithms, like viruses,
    can spread bias on a massive scale
  • 0:26 - 0:27
    at a rapid pace.
  • 0:28 - 0:32
    Algorithmic bias can also lead
    to exclusionary experiences
  • 0:32 - 0:34
    and discriminatory practices.
  • 0:35 - 0:37
    Let me show you what I mean.
  • 0:37 - 0:39
    (Video) Joy Buolamwini: Hi, camera.
    I've got a face.
  • 0:40 - 0:42
    Can you see my face?
  • 0:42 - 0:44
    No-glasses face?
  • 0:44 - 0:46
    You can see her face.
  • 0:46 - 0:48
    What about my face?
  • 0:52 - 0:56
    I've got a mask. Can you see my mask?
  • 0:56 - 0:59
    Joy Buolamwini: So how did this happen?
  • 0:59 - 1:02
    Why am I sitting in front of a computer
  • 1:02 - 1:03
    in a white mask,
  • 1:03 - 1:07
    trying to be detected by a cheap webcam?
  • 1:07 - 1:09
    Well, when I'm not fighting the coded gaze
  • 1:09 - 1:11
    as a poet of code,
  • 1:11 - 1:14
    I'm a graduate student
    at the MIT Media Lab,
  • 1:14 - 1:19
    and there I have the opportunity to work
    on all sorts of whimsical projects,
  • 1:19 - 1:21
    including the Aspire Mirror,
  • 1:21 - 1:26
    a project I did so I could project
    digital masks onto my reflection.
  • 1:26 - 1:29
    So in the morning, if I wanted
    to feel powerful,
  • 1:29 - 1:30
    I could put on a lion.
  • 1:30 - 1:34
    If I wanted to be uplifted,
    I might have a quote.
  • 1:34 - 1:37
    So I used generic
    facial recognition software
  • 1:37 - 1:38
    to build the system,
  • 1:38 - 1:43
    but found it was really hard to test it
    unless I wore a white mask.
  • 1:44 - 1:49
    Unfortunately, I've run
    into this issue before.
  • 1:49 - 1:53
    When I was an undergraduate
    at Georgia Tech studying computer science,
  • 1:53 - 1:55
    I used to work on social robots,
  • 1:55 - 1:59
    and one of my tasks was to get a robot
    to play peek-a-boo,
  • 1:59 - 2:01
    a simple turn-taking game
  • 2:01 - 2:05
    where partners cover their face
    and then uncover it saying, "Peek-a-boo!"
  • 2:05 - 2:09
    The problem is, peek-a-boo
    doesn't really work if I can't see you,
  • 2:09 - 2:12
    and my robot couldn't see me.
  • 2:12 - 2:16
    But I borrowed my roommate's face
    to get the project done,
  • 2:16 - 2:17
    submitted the assignment,
  • 2:17 - 2:21
    and figured, you know what,
    somebody else will solve this problem.
  • 2:22 - 2:24
    Not too long after,
  • 2:24 - 2:28
    I was in Hong Kong
    for an entrepreneurship competition.
  • 2:28 - 2:31
    The organizers decided
    to take participants
  • 2:31 - 2:33
    on a tour of local start-ups.
  • 2:33 - 2:36
    One of the start-ups had a social robot,
  • 2:36 - 2:38
    and they decided to do a demo.
  • 2:38 - 2:41
    The demo worked on everybody
    until it got to me,
  • 2:41 - 2:43
    and you can probably guess it.
  • 2:43 - 2:46
    It couldn't detect my face.
  • 2:46 - 2:49
    I asked the developers what was going on,
  • 2:49 - 2:54
    and it turned out we had used the same
    generic facial recognition software.
  • 2:54 - 2:56
    Halfway around the world,
  • 2:56 - 3:00
    I learned that algorithmic bias
    can travel as quickly
  • 3:00 - 3:03
    as it takes to download
    some files off of the internet.
  • 3:04 - 3:07
    So what's going on?
    Why isn't my face being detected?
  • 3:07 - 3:10
    Well, we have to look
    at how we give machines sight.
  • 3:10 - 3:14
    Computer vision uses
    machine learning techniques
  • 3:14 - 3:16
    to do facial recognition.
  • 3:16 - 3:19
    So how this works is, you create
    a training set with examples of faces.
  • 3:19 - 3:22
    This is a face. This is a face.
    This is not a face.
  • 3:22 - 3:27
    And over time, you can teach a computer
    how to recognize other faces.
  • 3:27 - 3:31
    However, if the training sets
    aren't really that diverse,
  • 3:31 - 3:34
    any face that deviates too much
    from the established norm
  • 3:34 - 3:36
    will be harder to detect,
  • 3:36 - 3:38
    which is what was happening to me.
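The detection gap described above can be illustrated with a toy sketch. This is not real facial recognition: the "faces" below are made-up two-dimensional feature vectors, and the detector simply learns the centroid of its training faces and accepts anything near that learned norm. The point it demonstrates is the one in the talk: if the training set only contains one group, faces that deviate from the established norm go undetected.

```python
import numpy as np

def train_detector(training_faces):
    """Toy detector: learn the centroid of the training faces and
    accept any input within 1.5x the largest training distance."""
    centroid = training_faces.mean(axis=0)
    dists = np.linalg.norm(training_faces - centroid, axis=1)
    radius = 1.5 * dists.max()  # the "established norm" plus a margin
    return lambda face: np.linalg.norm(face - centroid) <= radius

def detection_rate(detect, faces):
    return sum(detect(f) for f in faces) / len(faces)

# Hypothetical appearance features for two groups of faces.
group_a = np.array([[0.80, 0.82], [0.78, 0.80], [0.82, 0.79]])
group_b = np.array([[0.20, 0.22], [0.22, 0.18]])

# A skewed training set containing only group A.
skewed = train_detector(group_a)
print(detection_rate(skewed, group_a))  # 1.0 -- group A is found
print(detection_rate(skewed, group_b))  # 0.0 -- group B is invisible

# A fuller-spectrum training set shifts the learned norm
# so that both groups are detected.
full = train_detector(np.concatenate([group_a, group_b]))
print(detection_rate(full, group_a))  # 1.0
print(detection_rate(full, group_b))  # 1.0
```

The fix in the sketch mirrors the talk's argument: nothing about the detector's code changed between the two runs; only the composition of the training data did.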
  • 3:38 - 3:40
    But don't worry -- there's some good news.
  • 3:40 - 3:43
    Training sets don't just
    materialize out of nowhere.
  • 3:43 - 3:45
    We actually can create them.
  • 3:45 - 3:49
    So there's an opportunity to create
    full-spectrum training sets
  • 3:49 - 3:53
    that reflect a richer
    portrait of humanity.
  • 3:53 - 3:55
    Now you've seen in my examples
  • 3:55 - 3:57
    how social robots
  • 3:57 - 4:02
were how I found out about exclusion
    through algorithmic bias.
  • 4:02 - 4:06
    But algorithmic bias can also lead
    to discriminatory practices.
  • 4:07 - 4:09
    Across the US,
  • 4:09 - 4:13
    police departments are starting to use
    facial recognition software
  • 4:13 - 4:16
    in their crime-fighting arsenal.
  • 4:16 - 4:18
    Georgetown Law published a report
  • 4:18 - 4:24
    showing that one in two adults
    in the US -- that's 117 million people --
  • 4:24 - 4:28
    have their faces
    in facial recognition networks.
  • 4:28 - 4:33
    Police departments can currently look
    at these networks unregulated,
  • 4:33 - 4:37
    using algorithms that have not
    been audited for accuracy.
  • 4:37 - 4:41
    Yet we know facial recognition
    is not fail-proof,
  • 4:41 - 4:45
    and labeling faces consistently
    remains a challenge.
  • 4:45 - 4:47
    You might have seen this on Facebook.
  • 4:47 - 4:50
    My friends and I laugh all the time
    when we see other people
  • 4:50 - 4:52
    mislabeled in our photos.
  • 4:52 - 4:58
    But misidentifying a suspected criminal
    is no laughing matter,
  • 4:58 - 5:01
    nor is breaching civil liberties.
  • 5:01 - 5:04
    Machine learning is being used
    for facial recognition,
  • 5:04 - 5:08
    but it's also extending beyond the realm
    of computer vision.
  • 5:09 - 5:13
    In her book, "Weapons
    of Math Destruction,"
  • 5:13 - 5:20
    data scientist Cathy O'Neil
    talks about the rising new WMDs --
  • 5:20 - 5:24
    widespread, mysterious
    and destructive algorithms
  • 5:24 - 5:27
    that are increasingly being used
    to make decisions
  • 5:27 - 5:31
    that impact more aspects of our lives.
  • 5:31 - 5:32
    So who gets hired or fired?
  • 5:32 - 5:35
    Do you get that loan?
    Do you get insurance?
  • 5:35 - 5:38
    Are you admitted into the college
    you wanted to get into?
  • 5:38 - 5:42
    Do you and I pay the same price
    for the same product
  • 5:42 - 5:44
    purchased on the same platform?
  • 5:44 - 5:48
    Law enforcement is also starting
    to use machine learning
  • 5:48 - 5:50
    for predictive policing.
  • 5:50 - 5:54
    Some judges use machine-generated
    risk scores to determine
  • 5:54 - 5:58
    how long an individual
    is going to spend in prison.
  • 5:58 - 6:01
    So we really have to think
    about these decisions.
  • 6:01 - 6:02
    Are they fair?
  • 6:02 - 6:05
    And we've seen that algorithmic bias
  • 6:05 - 6:08
    doesn't always
    lead to fair outcomes.
  • 6:08 - 6:10
    So what can we do about it?
  • 6:10 - 6:14
    Well, we can start thinking about
    how we create more inclusive code
  • 6:14 - 6:17
    and employ inclusive coding practices.
  • 6:17 - 6:19
    It really starts with people.
  • 6:20 - 6:22
    So who codes matters.
  • 6:22 - 6:26
    Are we creating full-spectrum teams
    with diverse individuals
  • 6:26 - 6:28
    who can check each other's blind spots?
  • 6:28 - 6:32
    On the technical side,
    how we code matters.
  • 6:32 - 6:35
    Are we factoring in fairness
    as we're developing systems?
  • 6:36 - 6:38
    And finally, why we code matters.
  • 6:39 - 6:44
    We've used tools of computational creation
    to unlock immense wealth.
  • 6:44 - 6:48
    We now have the opportunity
    to unlock even greater equality
  • 6:48 - 6:51
    if we make social change a priority
  • 6:51 - 6:53
    and not an afterthought.
  • 6:54 - 6:59
    And so these are the three tenets
    that will make up the "incoding" movement.
  • 6:59 - 7:00
    Who codes matters,
  • 7:00 - 7:02
    how we code matters
  • 7:02 - 7:04
    and why we code matters.
  • 7:04 - 7:07
    So to go towards incoding,
    we can start thinking about
  • 7:07 - 7:10
    building platforms that can identify bias
  • 7:10 - 7:13
    by collecting people's experiences
    like the ones I shared,
  • 7:13 - 7:16
    but also auditing existing software.
  • 7:16 - 7:20
    We can also start to create
    more inclusive training sets.
  • 7:20 - 7:23
    Imagine a "Selfies for Inclusion" campaign
  • 7:23 - 7:27
    where you and I can help
    developers test and create
  • 7:27 - 7:29
    more inclusive training sets.
  • 7:29 - 7:32
    And we can also start thinking
    more conscientiously
  • 7:32 - 7:38
    about the social impact
    of the technology that we're developing.
  • 7:38 - 7:40
    To get the incoding movement started,
  • 7:40 - 7:43
    I've launched the Algorithmic
    Justice League,
  • 7:43 - 7:49
    where anyone who cares about fairness
    can help fight the coded gaze.
  • 7:49 - 7:52
    On codedgaze.com, you can report bias,
  • 7:52 - 7:55
    request audits, become a tester
  • 7:55 - 7:57
    and join the ongoing conversation,
  • 7:57 - 8:00
    #codedgaze.
  • 8:01 - 8:03
    So I invite you to join me
  • 8:03 - 8:07
    in creating a world where technology
    works for all of us,
  • 8:07 - 8:09
    not just some of us,
  • 8:09 - 8:14
    a world where we value inclusion
    and center social change.
  • 8:14 - 8:15
    Thank you.
  • 8:15 - 8:19
    (Applause)
  • 8:21 - 8:24
    But I have one question:
  • 8:24 - 8:26
    Will you join me in the fight?
  • 8:26 - 8:27
    (Laughter)
  • 8:27 - 8:31
    (Applause)
Title:
How I'm fighting bias in algorithms
Speaker:
Joy Buolamwini
Description:

MIT grad student Joy Buolamwini was working with facial recognition software when she noticed a problem: the software didn't recognize her face -- because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." It's an eye-opening talk about the need for accountability in coding ... as algorithms take over more and more aspects of our lives.

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
08:46
