Code4Rights, Code4All | Joy Buolamwini | TEDxBeaconStreet

  • 0:14 - 0:17
    Hello, I'm Joy, a poet of code,
  • 0:17 - 0:22
    on a mission to stop
    an unseen force that's rising,
  • 0:22 - 0:25
a force that I call "the coded gaze,"
  • 0:25 - 0:28
    my term for algorithmic bias.
  • 0:28 - 0:33
    Algorithmic bias, like human bias,
    results in unfairness.
  • 0:33 - 0:39
    However, algorithms, like viruses,
    can spread bias on a massive scale
  • 0:39 - 0:40
    at a rapid pace.
  • 0:41 - 0:45
    Algorithmic bias can also lead
    to exclusionary experiences
  • 0:45 - 0:47
    and discriminatory practices.
  • 0:47 - 0:49
    Let me show you what I mean.
  • 0:50 - 0:53
(Video) Joy Buolamwini: Hi, camera.
    I've got a face.
  • 0:53 - 0:55
    Can you see my face?
  • 0:55 - 0:57
    No-glasses face?
  • 0:58 - 1:00
    You can see her face.
  • 1:01 - 1:03
    What about my face?
  • 1:06 - 1:07
    (Laughter)
  • 1:07 - 1:10
    I've got a mask. Can you see my mask?
  • 1:12 - 1:14
Joy Buolamwini: So how did this happen?
  • 1:14 - 1:17
    Why am I sitting in front of a computer
  • 1:17 - 1:19
    in a white mask,
  • 1:19 - 1:23
    trying to be detected by a cheap webcam?
  • 1:23 - 1:25
    Well, when I'm not fighting the coded gaze
  • 1:25 - 1:26
    as a poet of code,
  • 1:26 - 1:30
    I'm a graduate student
    at the MIT Media Lab,
  • 1:30 - 1:35
    and there I have the opportunity to work
    on all sorts of whimsical projects,
  • 1:35 - 1:37
    including the Aspire Mirror,
  • 1:37 - 1:42
    a project I did so I could project
    digital masks onto my reflection.
  • 1:42 - 1:44
    So in the morning, if I wanted
    to feel powerful,
  • 1:44 - 1:46
    I could put on a lion.
  • 1:46 - 1:49
    If I wanted to be uplifted,
    I might have a quote.
  • 1:49 - 1:52
    So I used generic
    facial recognition software
  • 1:52 - 1:54
    to build the system,
  • 1:54 - 1:59
    but found it was really hard to test it
    unless I wore a white mask.
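
The talk doesn't say which library the Aspire Mirror was built on, but as a rough illustration of what calling "generic facial recognition software" from a cheap webcam looks like, here is a minimal Python sketch using OpenCV's bundled, pre-trained Haar cascade face detector. If the detector's training data under-represents a face, the call below simply returns no detections.

```python
import cv2

# Load one of OpenCV's bundled, pre-trained frontal-face detectors.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # the default (cheap) webcam
ret, frame = cap.read()
if ret:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Returns a list of bounding boxes; an empty result means "no face seen."
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"Detected {len(faces)} face(s)")
cap.release()
```
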
  • 2:00 - 2:04
    Unfortunately, I've run
    into this issue before.
  • 2:04 - 2:08
    When I was an undergraduate
    at Georgia Tech studying computer science,
  • 2:08 - 2:11
    I used to work on social robots,
  • 2:11 - 2:14
    and one of my tasks was to get a robot
    to play peek-a-boo,
  • 2:14 - 2:16
    a simple turn-taking game
  • 2:16 - 2:21
    where partners cover their face
    and then uncover it saying, "Peek-a-boo!"
  • 2:21 - 2:25
    The problem is, peek-a-boo
    doesn't really work if I can't see you,
  • 2:25 - 2:28
    and my robot couldn't see me.
  • 2:28 - 2:32
    But I borrowed my roommate's face
    to get the project done,
  • 2:32 - 2:33
    submitted the assignment,
  • 2:33 - 2:37
    and figured, you know what,
    somebody else will solve this problem.
  • 2:37 - 2:40
    Not too long after,
  • 2:40 - 2:44
    I was in Hong Kong
    for an entrepreneurship competition.
  • 2:44 - 2:47
    The organizers decided
    to take participants
  • 2:47 - 2:49
    on a tour of local start-ups.
  • 2:49 - 2:52
    One of the start-ups had a social robot,
  • 2:52 - 2:54
    and they decided to do a demo.
  • 2:54 - 2:57
    The demo worked on everybody
    until it got to me,
  • 2:57 - 2:59
    and you can probably guess it.
  • 2:59 - 3:02
    It couldn't detect my face.
  • 3:02 - 3:04
    I asked the developers what was going on,
  • 3:04 - 3:10
    and it turned out we had used the same
    generic facial recognition software.
  • 3:10 - 3:12
    Halfway around the world,
  • 3:12 - 3:16
    I learned that algorithmic bias
    can travel as quickly
  • 3:16 - 3:19
    as it takes to download
    some files off of the internet.
  • 3:20 - 3:23
    So what's going on?
    Why isn't my face being detected?
  • 3:23 - 3:26
    Well, we have to look
    at how we give machines sight.
  • 3:26 - 3:29
    Computer vision uses
    machine learning techniques
  • 3:29 - 3:31
    to do facial recognition.
  • 3:31 - 3:35
    So how this works is, you create
    a training set with examples of faces.
  • 3:35 - 3:38
    This is a face. This is a face.
    This is not a face.
  • 3:38 - 3:43
    And over time, you can teach a computer
    how to recognize other faces.
  • 3:43 - 3:47
    However, if the training sets
    aren't really that diverse,
  • 3:47 - 3:50
    any face that deviates too much
    from the established norm
  • 3:50 - 3:52
    will be harder to detect,
  • 3:52 - 3:54
    which is what was happening to me.
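
To make the training-set explanation concrete, here is a purely illustrative Python sketch (not the speaker's code; synthetic numbers stand in for image features). A classifier is fit on labeled "face" / "not a face" examples drawn from a narrow distribution, and an input that deviates too much from that norm tends to be misclassified.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Pretend these are image feature vectors. The "face" examples come from a
# narrow distribution -- an unrepresentative training set.
faces = rng.normal(loc=0.0, scale=0.5, size=(200, 16))
non_faces = rng.normal(loc=3.0, scale=0.5, size=(200, 16))

X = np.vstack([faces, non_faces])
y = np.array([1] * 200 + [0] * 200)  # 1 = face, 0 = not a face

clf = SVC().fit(X, y)

# A face similar to the training faces is detected...
print(clf.predict(rng.normal(0.0, 0.5, size=(1, 16))))  # likely [1]
# ...but a real face that deviates from the narrow training norm may not be.
print(clf.predict(rng.normal(1.8, 0.5, size=(1, 16))))  # may well be [0]
```
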
  • 3:54 - 3:56
    But don't worry -- there's some good news.
  • 3:56 - 3:59
    Training sets don't just
    materialize out of nowhere.
  • 3:59 - 4:01
    We actually can create them.
  • 4:01 - 4:05
    So there's an opportunity to create
    full-spectrum training sets
  • 4:05 - 4:09
    that reflect a richer
    portrait of humanity.
  • 4:09 - 4:11
    Now you've seen in my examples
  • 4:11 - 4:13
    how social robots
  • 4:13 - 4:17
were how I found out about exclusion
    with algorithmic bias.
  • 4:17 - 4:22
    But algorithmic bias can also lead
    to discriminatory practices.
  • 4:23 - 4:25
    Across the US,
  • 4:25 - 4:29
    police departments are starting to use
    facial recognition software
  • 4:29 - 4:31
    in their crime-fighting arsenal.
  • 4:31 - 4:33
    Georgetown Law published a report
  • 4:33 - 4:40
    showing that one in two adults
    in the US -- that's 117 million people --
  • 4:40 - 4:44
    have their faces
    in facial recognition networks.
  • 4:44 - 4:48
    Police departments can currently look
    at these networks unregulated,
  • 4:48 - 4:53
    using algorithms that have not
    been audited for accuracy.
  • 4:53 - 4:57
    Yet we know facial recognition
is not fail-proof,
  • 4:57 - 5:01
    and labeling faces consistently
    remains a challenge.
  • 5:01 - 5:03
    You might have seen this on Facebook.
  • 5:03 - 5:06
    My friends and I laugh all the time
    when we see other people
  • 5:06 - 5:08
    mislabeled in our photos.
  • 5:08 - 5:14
    But misidentifying a suspected criminal
    is no laughing matter,
  • 5:14 - 5:17
    nor is breaching civil liberties.
  • 5:17 - 5:20
    Machine learning is being used
    for facial recognition,
  • 5:20 - 5:24
    but it's also extending beyond the realm
    of computer vision.
  • 5:25 - 5:29
    In her book, "Weapons
    of Math Destruction,"
  • 5:29 - 5:36
    data scientist Cathy O'Neil
    talks about the rising new WMDs --
  • 5:36 - 5:40
    widespread, mysterious
    and destructive algorithms
  • 5:40 - 5:43
    that are increasingly being used
    to make decisions
  • 5:43 - 5:46
    that impact more aspects of our lives.
  • 5:46 - 5:48
    So who gets hired or fired?
  • 5:48 - 5:50
    Do you get that loan?
    Do you get insurance?
  • 5:50 - 5:54
    Are you admitted into the college
    you wanted to get into?
  • 5:54 - 5:57
    Do you and I pay the same price
    for the same product
  • 5:57 - 6:00
    purchased on the same platform?
  • 6:00 - 6:04
    Law enforcement is also starting
    to use machine learning
  • 6:04 - 6:06
    for predictive policing.
  • 6:06 - 6:10
    Some judges use machine-generated
    risk scores to determine
  • 6:10 - 6:14
    how long an individual
    is going to spend in prison.
  • 6:14 - 6:16
    So we really have to think
    about these decisions.
  • 6:16 - 6:18
    Are they fair?
  • 6:18 - 6:21
    And we've seen that algorithmic bias
  • 6:21 - 6:24
doesn't necessarily
    lead to fair outcomes.
  • 6:24 - 6:26
    So what can we do about it?
  • 6:26 - 6:30
    Well, we can start thinking about
    how we create more inclusive code
  • 6:30 - 6:33
    and employ inclusive coding practices.
  • 6:33 - 6:35
    It really starts with people.
  • 6:36 - 6:37
    So who codes matters.
  • 6:38 - 6:42
    Are we creating full-spectrum teams
    with diverse individuals
  • 6:42 - 6:44
    who can check each other's blind spots?
  • 6:44 - 6:48
    On the technical side,
    how we code matters.
  • 6:48 - 6:51
    Are we factoring in fairness
    as we're developing systems?
  • 6:51 - 6:54
    And finally, why we code matters.
  • 6:55 - 7:00
    We've used tools of computational creation
    to unlock immense wealth.
  • 7:00 - 7:04
    We now have the opportunity
    to unlock even greater equality
  • 7:04 - 7:07
    if we make social change a priority
  • 7:07 - 7:09
    and not an afterthought.
  • 7:10 - 7:14
    And so these are the three tenets
    that will make up the "incoding" movement.
  • 7:14 - 7:16
    Who codes matters,
  • 7:16 - 7:18
    how we code matters
  • 7:18 - 7:20
    and why we code matters.
  • 7:20 - 7:23
    So to go towards incoding,
    we can start thinking about
  • 7:23 - 7:26
    building platforms that can identify bias
  • 7:26 - 7:29
    by collecting people's experiences
    like the ones I shared,
  • 7:29 - 7:32
    but also auditing existing software.
  • 7:32 - 7:36
    We can also start to create
    more inclusive training sets.
  • 7:36 - 7:39
    Imagine a "Selfies for Inclusion" campaign
  • 7:39 - 7:42
    where you and I can help
    developers test and create
  • 7:42 - 7:45
    more inclusive training sets.
  • 7:45 - 7:48
    And we can also start thinking
    more conscientiously
  • 7:48 - 7:53
    about the social impact
    of the technology that we're developing.
  • 7:53 - 7:56
    To get the incoding movement started,
  • 7:56 - 7:59
    I've launched the Algorithmic
    Justice League,
  • 7:59 - 8:05
    where anyone who cares about fairness
    can help fight the coded gaze.
  • 8:05 - 8:08
    On codedgaze.com, you can report bias,
  • 8:08 - 8:10
    request audits, become a tester
  • 8:10 - 8:13
    and join the ongoing conversation,
  • 8:13 - 8:15
    #codedgaze.
  • 8:17 - 8:19
    So I invite you to join me
  • 8:19 - 8:23
    in creating a world where technology
    works for all of us,
  • 8:23 - 8:25
    not just some of us,
  • 8:25 - 8:29
    a world where we value inclusion
    and center social change.
  • 8:29 - 8:31
    Thank you.
  • 8:31 - 8:36
    (Applause)
  • 8:37 - 8:40
    But I have one question:
  • 8:40 - 8:42
    Will you join me in the fight?
  • 8:42 - 8:43
    (Laughter)
  • 8:43 - 8:47
    (Applause)
Title:
Code4Rights, Code4All | Joy Buolamwini | TEDxBeaconStreet
Description:

Joy Buolamwini is a brilliant coder, but she had an issue: facial recognition algorithms don't always recognize her face unless she wears a white mask. This is an example of the subtle ways in which racial privilege excludes full participation in learning opportunities, so Joy did something about it - she started coding programs that are more inclusive. She started a revolution, and she's ready to get you on board!

Joy Buolamwini is the founder of Code4Rights and a graduate researcher with the Civic Media group at the MIT Media Lab. Joy is a Rhodes Scholar, a Fulbright Fellow, an Astronaut Scholar, a Google Anita Borg Scholar, and a Carter Center technical consultant recognized as a distinguished volunteer. As the Chief Technology Officer for Techturized Inc., a hair care technology company, and Swift Tech Solutions, a global health tech consultancy, Joy gained technical competency by developing software for under-served communities in the US, Ethiopia, Mali, Nigeria, and Niger. During her Fulbright Fellowship in Zambia, she explored empowering engaged citizens with skills to create their own technology through the Zamrize Initiative.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx

Video Language:
English
Team:
closed TED
Project:
TEDxTalks
Duration:
09:02
  • The TEDTalk version of the talk is available for translation here: http://amara.org/en/videos/ABZEiHp9akUS/info/how-im-fighting-bias-in-algorithms/
