WEBVTT

00:00:00.130 --> 00:00:01.590
Surprise, surprise:

00:00:01.590 --> 00:00:04.550
just today, after Microsoft's new artificial intelligence,

00:00:04.550 --> 00:00:07.380
Tay, launched on several social platforms,

00:00:07.380 --> 00:00:09.450
it was corrupted by the internet.

00:00:09.450 --> 00:00:10.630
If you haven't heard of Tay,

00:00:10.630 --> 00:00:13.820
it's a machine learning project created by Microsoft

00:00:13.820 --> 00:00:15.610
that's supposed to mimic the personality

00:00:15.610 --> 00:00:17.430
of a 19-year-old girl.

00:00:17.430 --> 00:00:20.000
It's essentially an instant messaging chatbot

00:00:20.000 --> 00:00:21.910
with a bit more smarts built in.

00:00:21.910 --> 00:00:24.090
Those smarts give Tay the ability to learn

00:00:24.090 --> 00:00:26.350
from the conversations she has with people;

00:00:26.350 --> 00:00:28.640
that's where the corruption comes into play.

00:00:28.640 --> 00:00:30.400
As surprising as it may sound,

00:00:30.400 --> 00:00:32.370
the company didn't have the foresight

00:00:32.370 --> 00:00:35.360
to keep Tay from learning inappropriate responses.

00:00:35.360 --> 00:00:37.980
Tay ended up sending out racial slurs,

00:00:37.980 --> 00:00:41.700
denying the Holocaust, expressing support for genocide,

00:00:41.700 --> 00:00:45.010
and posting many other controversial statements.

00:00:45.010 --> 00:00:47.500
Microsoft eventually deactivated Tay.

00:00:47.500 --> 00:00:48.790
The company told TechCrunch

00:00:48.790 --> 00:00:51.220
that once it discovered a coordinated effort

00:00:51.220 --> 00:00:54.080
to make the AI project say inappropriate things,

00:00:54.080 --> 00:00:56.970
it took the program offline to make adjustments.

00:00:56.970 --> 00:01:00.230
Seasoned internet users among us are none too surprised

00:01:00.230 --> 00:01:02.130
by the unfortunate turn of events.

00:01:02.130 --> 00:01:04.090
If you don't program in fail-safes,

00:01:04.090 --> 00:01:07.480
the internet is going to do its worst, and it did.

00:01:07.480 --> 00:01:09.760
In fact, The Guardian cites Godwin's Law,

00:01:09.760 --> 00:01:12.570
which holds that the longer an online discussion goes on,

00:01:12.570 --> 00:01:15.100
the more likely it is that someone will compare something

00:01:15.100 --> 00:01:16.890
to Hitler or the Nazis.

00:01:16.890 --> 00:01:18.260
As a writer for TechCrunch put it,

00:01:18.260 --> 00:01:20.790
while technology is neither good nor evil,

00:01:20.790 --> 00:01:22.570
engineers have a responsibility

00:01:22.570 --> 00:01:23.950
to make sure it's not designed

00:01:23.950 --> 00:01:26.980
in a way that will reflect back the worst of humanity.

00:01:26.980 --> 00:01:29.030
You can't skip the part about teaching a bot

00:01:29.030 --> 00:01:30.750
what not to say.

00:01:30.750 --> 00:01:32.544
For Newsy, I'm Mikah Sargent.

00:01:32.544 --> 00:01:35.127
[upbeat music]