
Unsurprisingly, Microsoft's AI Bot Tay Was Tricked Into Being Racist - Newsy

  • 0:00 - 0:02
    Surprise surprise,
  • 0:02 - 0:05
    just a day after Microsoft's
    new Artificial Intelligence,
  • 0:05 - 0:07
    Tay launched on several social platforms,
  • 0:07 - 0:09
    it was corrupted by the internet.
  • 0:09 - 0:11
    If you haven't heard of Tay,
  • 0:11 - 0:14
    it's a machine learning
    project created by Microsoft
  • 0:14 - 0:16
    that's supposed to mimic the personality
  • 0:16 - 0:17
    of a 19-year-old girl.
  • 0:17 - 0:20
    It's essentially an
    instant messaging chat bot
  • 0:20 - 0:22
    with a bit more smarts built in.
  • 0:22 - 0:24
    Those smarts give Tay the ability to learn
  • 0:24 - 0:26
    from the conversations
    she has with people,
  • 0:26 - 0:29
    that's where the
    corruption comes into play.
  • 0:29 - 0:30
    As surprising as it may sound,
  • 0:30 - 0:32
    the company didn't have the foresight
  • 0:32 - 0:35
    to keep Tay from learning
    inappropriate responses.
  • 0:35 - 0:38
    Tay ended up sending out racial slurs,
  • 0:38 - 0:42
    denying the Holocaust,
    expressing support for genocide
  • 0:42 - 0:45
    and posting many other
    controversial statements.
  • 0:45 - 0:48
    Microsoft eventually deactivated Tay,
  • 0:48 - 0:49
    the company told TechCrunch,
  • 0:49 - 0:51
    once it discovered a coordinated effort
  • 0:51 - 0:54
    to make the AI project
    say inappropriate things,
  • 0:54 - 0:57
    it took the program offline
    to make adjustments.
  • 0:57 - 1:00
    Seasoned internet users among
    us are none too surprised
  • 1:00 - 1:02
    by the unfortunate turn of events.
  • 1:02 - 1:04
    If you don't program in fail safes,
  • 1:04 - 1:07
    the internet is going to
    do its worst, and it did.
  • 1:07 - 1:10
    In fact, The Guardian cites Godwin's law,
  • 1:10 - 1:13
    which holds that the longer an
    online discussion goes on,
  • 1:13 - 1:15
    the more likely it is that
    someone will compare something
  • 1:15 - 1:17
    to Hitler or the Nazis.
  • 1:17 - 1:18
    As a writer for TechCrunch put it,
  • 1:18 - 1:21
    while technology is neither good nor evil,
  • 1:21 - 1:23
    engineers have a responsibility
  • 1:23 - 1:24
    to make sure it's not designed
  • 1:24 - 1:27
    in a way that will reflect
    back the worst of humanity.
  • 1:27 - 1:29
    You can't skip the part
    about teaching a bot
  • 1:29 - 1:31
    what not to say.
  • 1:31 - 1:33
    For Newsy, I'm Mikah Sargent.
  • 1:33 - 1:35
    [upbeat music]
Video Language:
English
Duration:
01:40