WEBVTT

00:00:00.460 --> 00:00:01.960
♪ (music) ♪

00:00:03.634 --> 00:00:07.363
[Behind the screens: Who decides what I see online?]

00:00:08.296 --> 00:00:09.526
Hi, my name’s Taylor.

00:00:09.526 --> 00:00:11.778
I study the internet and journalism,

00:00:11.778 --> 00:00:16.077
and how we as citizens receive information about the world.

00:00:17.389 --> 00:00:19.628
In the olden days when people got information,

00:00:19.778 --> 00:00:22.688
it was mostly decided by people.

00:00:22.688 --> 00:00:25.381
There were humans that decided what we needed to know.

00:00:25.560 --> 00:00:28.550
So when we opened the newspaper or turned on the evening news,

00:00:28.550 --> 00:00:32.570
it was a person that decided what we heard and saw.

00:00:33.240 --> 00:00:36.546
The result of that is that we all knew similar things.

00:00:38.059 --> 00:00:40.742
Now the internet has changed everything.

00:00:41.090 --> 00:00:44.017
When you go online, when you open an application,

00:00:44.486 --> 00:00:47.224
what you see is determined not by a person

00:00:47.489 --> 00:00:48.710
but by a machine.

00:00:49.589 --> 00:00:51.820
And this is awesome in a lot of ways:

00:00:51.984 --> 00:00:54.053
It allows you to use Google Maps;

00:00:54.053 --> 00:00:56.523
it allows you to order food online;

00:00:56.523 --> 00:01:00.162
it allows you to connect with friends around the world

00:01:00.162 --> 00:01:01.870
and share information...

00:01:02.290 --> 00:01:05.061
But there are aspects of this machine

00:01:05.061 --> 00:01:07.431
that we need to think really carefully about

00:01:07.681 --> 00:01:11.793
because they determine the information that we all receive

00:01:11.793 --> 00:01:15.050
as citizens in a society and in a democracy.
00:01:15.419 --> 00:01:17.179
So when you open up an app

00:01:17.179 --> 00:01:20.531
and you’re shown a picture in your Snapchat feed,

00:01:20.764 --> 00:01:24.420
all of that information is determined by this machine,

00:01:24.420 --> 00:01:28.951
and that machine is driven by the incentives of the company

00:01:28.951 --> 00:01:32.251
that owns the website or owns the application.

00:01:32.684 --> 00:01:36.430
And the incentive is for you to spend as much time as possible

00:01:36.430 --> 00:01:38.080
inside that application.

00:01:38.680 --> 00:01:42.087
So they do things that make you feel very good about being there.

00:01:42.250 --> 00:01:44.553
They allow people to like your photos.

00:01:44.975 --> 00:01:48.066
They show you content that they think you want to see,

00:01:48.066 --> 00:01:50.850
that will make you either really happy or really angry,

00:01:50.850 --> 00:01:53.880
anything that will get an emotional response from you to keep you there.

00:01:54.260 --> 00:01:57.173
That is because they want to show you as many ads as possible

00:01:57.173 --> 00:01:58.173
when you’re there,

00:01:58.173 --> 00:02:00.160
because that is their business model.

00:02:00.897 --> 00:02:04.414
They’re also taking that same opportunity of you being in their app

00:02:04.414 --> 00:02:06.240
to collect data about you.

00:02:06.240 --> 00:02:08.005
And they use these data

00:02:08.005 --> 00:02:11.553
to create detailed profiles of your life and your behaviour,

00:02:11.755 --> 00:02:13.924
and these profiles can then be used

00:02:13.924 --> 00:02:16.930
to target more ads back to you,

00:02:17.200 --> 00:02:19.700
and that then determines what you see as well.
00:02:21.370 --> 00:02:25.170
But all of this isn’t just about the business model of these companies;

00:02:25.614 --> 00:02:28.277
it actually has an impact on our democracy

00:02:28.735 --> 00:02:34.509
because what we each see on the internet is highly customized to us,

00:02:34.990 --> 00:02:36.183
to what we like,

00:02:36.453 --> 00:02:37.843
to what we believe,

00:02:38.253 --> 00:02:41.203
to what we want to see or want to believe.

00:02:41.523 --> 00:02:43.498
And that means that as a society,

00:02:43.756 --> 00:02:47.500
we no longer all have a shared set of knowledge,

00:02:47.500 --> 00:02:50.197
which is hard for a democracy that requires us

00:02:50.197 --> 00:02:52.621
to work together and know the same things

00:02:52.851 --> 00:02:55.171
to make decisions about our lives together.

00:02:55.391 --> 00:02:57.030
When we all know different things

00:02:57.030 --> 00:03:01.059
and we’re all being siloed into our own little bubbles of information,

00:03:01.567 --> 00:03:04.860
it’s incredibly difficult for us to get along with one another.

00:03:04.860 --> 00:03:07.670
We have no shared experience or shared knowledge.

00:03:08.241 --> 00:03:11.335
I think it’s really important that we think critically

00:03:11.545 --> 00:03:13.897
about the information we receive online,

00:03:14.247 --> 00:03:17.167
and about the companies and structures that determine

00:03:17.337 --> 00:03:19.455
what we see on the internet.

00:03:19.979 --> 00:03:20.979
♪ (music) ♪

00:03:21.504 --> 00:03:24.702
[NewsWise is a project of CIVIX & The Canadian Journalism Foundation]

00:03:25.172 --> 00:03:27.595
Subtitles by Claudia Contreras
Review by Carol Wang