Surprise, surprise: just today, after Microsoft's new artificial intelligence, Tay, launched on several social platforms, it was corrupted by the internet.

If you haven't heard of Tay, it's a machine learning project created by Microsoft that's supposed to mimic the personality of a 19-year-old girl. It's essentially an instant messaging chatbot with a bit more smarts built in. Those smarts give Tay the ability to learn from the conversations she has with people, and that's where the corruption comes into play.

As surprising as it may sound, the company didn't have the foresight to keep Tay from learning inappropriate responses. Tay ended up sending out racial slurs, denying the Holocaust, expressing support for genocide, and posting many other controversial statements.

Microsoft eventually deactivated Tay. The company told TechCrunch that once it discovered a coordinated effort to make the AI project say inappropriate things, it took the program offline to make adjustments.

Seasoned internet users among us are none too surprised by the unfortunate turn of events: if you don't program in fail-safes, the internet is going to do its worst, and it did. In fact, The Guardian cites Godwin's law, which holds that the longer an online discussion goes on, the more likely it is that someone will compare something to Hitler or the Nazis.

As a writer for TechCrunch put it, while technology is neither good nor evil, engineers have a responsibility to make sure it's not designed in a way that will reflect back the worst of humanity. You can't skip the part about teaching a bot what not to say.

For Newsy, I'm Mikah Sargent.

[upbeat music]