0:00:03.634,0:00:07.363 [Behind the screens:[br]Who decides what I see online?]
0:00:08.376,0:00:09.526 Hi, my name’s Taylor.
0:00:09.526,0:00:11.778 I study the internet and journalism,
0:00:11.778,0:00:16.077 and how we as citizens receive[br]information about the world.
0:00:17.389,0:00:19.628 In the olden days[br]when people got information,
0:00:19.778,0:00:22.688 it was mostly decided by people.
0:00:22.688,0:00:25.381 There were humans that decided[br]what we needed to know.
0:00:25.560,0:00:28.550 So when we opened the newspaper[br]or turned on the evening news,
0:00:28.550,0:00:32.570 it was a person that decided[br]what we heard and saw.
0:00:33.240,0:00:36.546 The result of that is[br]that we all knew similar things.
0:00:38.059,0:00:40.742 Now the internet has changed everything.
0:00:41.090,0:00:44.017 When you go online,[br]when you open an application,
0:00:44.486,0:00:47.224 what you see is determined not by a person
0:00:47.489,0:00:48.710 but by a machine.
0:00:49.589,0:00:51.820 And this is awesome in a lot of ways:
0:00:51.984,0:00:54.053 it allows you to use Google Maps;
0:00:54.053,0:00:56.523 it allows you to order food online;
0:00:56.523,0:01:00.162 it allows you to connect[br]with friends around the world
0:01:00.162,0:01:01.870 and share information...
0:01:02.290,0:01:04.911 But there are aspects of this machine
0:01:05.221,0:01:07.511 that we need to think[br]really carefully about,
0:01:07.741,0:01:11.618 because they determine[br]the information that we all receive
0:01:11.793,0:01:15.050 as citizens in a society[br]and in a democracy.
0:01:15.489,0:01:17.029 So when you open up an app
0:01:17.179,0:01:20.531 and you’re shown a picture[br]in your Snapchat feed,
0:01:20.764,0:01:24.420 all of that information[br]is determined by this machine,
0:01:24.465,0:01:28.951 and that machine is driven[br]by the incentives of the company
0:01:28.951,0:01:32.251 that owns the website[br]or owns the application.
0:01:32.684,0:01:36.540 And the incentive is for you[br]to spend as much time as possible
0:01:36.540,0:01:38.080 inside that application.
0:01:38.680,0:01:42.087 So they do things that make you feel[br]very good about being there.
0:01:42.250,0:01:44.553 They allow people to like your photos.
0:01:44.975,0:01:47.943 They show you content[br]that they think you want to see,
0:01:48.066,0:01:50.850 content that will make you[br]either really happy or really angry,
0:01:50.850,0:01:53.880 that will get an emotional response[br]from you to keep you there.
0:01:54.260,0:01:57.173 That is because they want[br]to show you as many ads as possible
0:01:57.173,0:01:58.173 while you’re there,
0:01:58.173,0:02:00.160 because that is their business model.
0:02:00.897,0:02:04.414 They’re also taking that same[br]opportunity of you being in their app
0:02:04.414,0:02:06.140 to collect data about you.
0:02:06.370,0:02:08.005 And they use that data
0:02:08.005,0:02:11.553 to create detailed profiles[br]of your life and your behaviour,
0:02:11.845,0:02:13.924 and these profiles can then be used
0:02:13.924,0:02:16.930 to target more ads back to you,
0:02:17.200,0:02:19.700 and that then determines[br]what you see as well.
0:02:21.470,0:02:25.170 But all of this isn’t just about[br]the business model of these companies;
0:02:25.614,0:02:28.277 it actually has an impact on our democracy,
0:02:28.735,0:02:34.509 because what we each see on the internet[br]is highly customized to us,
0:02:34.990,0:02:36.183 to what we like,
0:02:36.453,0:02:37.843 to what we believe,
0:02:38.253,0:02:41.203 to what we want to see or want to believe.
0:02:41.523,0:02:43.498 And that means that as a society,
0:02:43.756,0:02:47.500 we no longer all have[br]a shared set of knowledge,
0:02:47.500,0:02:50.197 which is hard for a democracy[br]that requires us
0:02:50.197,0:02:52.621 to work together and know the same things
0:02:52.851,0:02:55.171 to make decisions[br]about our lives together.
0:02:55.391,0:02:57.030 When we all know different things
0:02:57.030,0:03:01.059 and we’re all being siloed into[br]our own little bubbles of information,
0:03:01.567,0:03:04.860 it’s incredibly difficult for us[br]to get along with one another.
0:03:04.860,0:03:07.670 We have no shared experience[br]or shared knowledge.
0:03:08.241,0:03:11.335 I think it’s really important[br]that we think critically
0:03:11.545,0:03:13.897 about the information we receive online,
0:03:14.247,0:03:17.167 and about the companies[br]and structures that determine
0:03:17.337,0:03:19.455 what we see on the internet.
0:03:21.504,0:03:24.702 [NewsWise is a project of CIVIX &[br]The Canadian Journalism Foundation]
0:03:25.172,0:03:27.595 Subtitles by Claudia Contreras[br]Review by Carol Wang