1
00:00:03,634 --> 00:00:07,363
[Behind the screens: Who decides what I see online?]

2
00:00:08,376 --> 00:00:09,526
Hi, my name’s Taylor.

3
00:00:09,526 --> 00:00:11,778
I study the internet and journalism,

4
00:00:11,778 --> 00:00:15,977
and how we as citizens receive information about the world.

5
00:00:17,389 --> 00:00:19,778
In the olden days when people got information,

6
00:00:19,778 --> 00:00:22,688
it was mostly decided by people.

7
00:00:22,688 --> 00:00:25,381
There were humans that decided what we needed to know.

8
00:00:25,560 --> 00:00:28,550
So when we opened the newspaper or turned on the evening news,

9
00:00:28,550 --> 00:00:32,570
it was a person that decided what we heard and saw.

10
00:00:33,240 --> 00:00:36,546
The result of that is that we all knew similar things.

11
00:00:38,059 --> 00:00:40,742
Now the internet has changed everything.

12
00:00:41,090 --> 00:00:44,017
When you go online, when you open an application,

13
00:00:44,486 --> 00:00:47,114
what you see is determined not by a person

14
00:00:47,489 --> 00:00:48,710
but by a machine.

15
00:00:49,589 --> 00:00:51,820
And this is awesome in a lot of ways.

16
00:00:51,984 --> 00:00:54,053
It allows you to use Google Maps,

17
00:00:54,053 --> 00:00:56,523
it allows you to order food online,

18
00:00:56,523 --> 00:01:00,162
it allows you to connect with friends around the world

19
00:01:00,162 --> 00:01:01,870
and share information,

20
00:01:02,290 --> 00:01:07,751
but there are aspects of this machine that we need to think really carefully about

21
00:01:07,751 --> 00:01:11,618
because they determine the information that we all receive

22
00:01:11,793 --> 00:01:15,050
as citizens in a society and in a democracy.
23
00:01:15,489 --> 00:01:17,029
So when you open up an app

24
00:01:17,179 --> 00:01:20,531
and you’re shown a picture in your Snapchat feed,

25
00:01:20,764 --> 00:01:24,420
all of that information is determined by this machine,

26
00:01:24,465 --> 00:01:28,951
and that machine is driven by the incentives of the company

27
00:01:28,951 --> 00:01:32,251
that owns the website or owns the application.

28
00:01:32,684 --> 00:01:36,540
And the incentive is for you to spend as much time as possible

29
00:01:36,540 --> 00:01:38,080
inside that application.

30
00:01:38,680 --> 00:01:42,087
So they do things that make you feel very good about being there.

31
00:01:42,250 --> 00:01:44,553
They allow people to like your photos.

32
00:01:44,975 --> 00:01:47,943
They show you content that they think you want to see

33
00:01:48,066 --> 00:01:50,850
that will either make you really happy or really angry

34
00:01:50,850 --> 00:01:53,880
and will get an emotional response from you to keep you there.

35
00:01:54,410 --> 00:01:58,133
That is because they want to show you as many ads as possible when you’re there

36
00:01:58,133 --> 00:02:00,160
because that is their business model.

37
00:02:00,897 --> 00:02:04,414
They’re also taking that same opportunity of you being in their app

38
00:02:04,414 --> 00:02:06,140
to collect data about you.

39
00:02:06,370 --> 00:02:11,625
And they use these data to create detailed profiles of your life and your behaviour,

40
00:02:11,845 --> 00:02:16,810
and these profiles can then be used to target more ads back to you.

41
00:02:17,200 --> 00:02:19,700
And that then determines what you see as well.
42
00:02:21,470 --> 00:02:25,170
But all of this isn’t just about the business model of these companies,

43
00:02:25,614 --> 00:02:28,277
it actually has an impact on our democracy

44
00:02:28,735 --> 00:02:34,509
because what we each see on the internet is highly customized to us,

45
00:02:34,990 --> 00:02:36,183
to what we like,

46
00:02:36,453 --> 00:02:37,843
to what we believe,

47
00:02:38,253 --> 00:02:41,203
to what we want to see or want to believe.

48
00:02:41,523 --> 00:02:43,498
And that means that as a society,

49
00:02:43,756 --> 00:02:47,500
we no longer all have a shared set of knowledge,

50
00:02:47,500 --> 00:02:51,134
which is hard for a democracy that requires us to work together

51
00:02:51,134 --> 00:02:54,990
and know the same things to make decisions about our lives together.

52
00:02:55,391 --> 00:02:57,030
When we all know different things

53
00:02:57,030 --> 00:03:01,059
and we’re all being siloed into our own little bubbles of information,

54
00:03:01,567 --> 00:03:04,860
it’s incredibly difficult for us to get along with one another.

55
00:03:04,860 --> 00:03:07,670
We have no shared experience or shared knowledge.

56
00:03:08,241 --> 00:03:11,545
I think it’s really important that we think critically

57
00:03:11,545 --> 00:03:13,897
about the information we receive online

58
00:03:14,247 --> 00:03:17,167
and about the companies and structures that determine

59
00:03:17,337 --> 00:03:19,455
what we see on the internet.

60
00:03:21,504 --> 00:03:24,702
[NewsWise is a project of CIVIX & The Canadian Journalism Foundation]

61
00:03:25,172 --> 00:03:27,595
Subtitles by Claudia Contreras
Review by Carol Wang