1
00:00:03,634 --> 00:00:07,363
[Behind the screens: Who decides what I see online?]

2
00:00:08,376 --> 00:00:09,526
Hi, my name’s Taylor.

3
00:00:09,526 --> 00:00:11,778
I study the internet and journalism,

4
00:00:11,778 --> 00:00:16,077
and how we as citizens receive information about the world.

5
00:00:17,389 --> 00:00:19,628
In the olden days when people got information,

6
00:00:19,778 --> 00:00:22,688
it was mostly decided by people.

7
00:00:22,688 --> 00:00:25,381
There were humans that decided what we needed to know.

8
00:00:25,560 --> 00:00:28,550
So when we opened the newspaper or turned on the evening news,

9
00:00:28,550 --> 00:00:32,570
it was a person that decided what we heard and saw.

10
00:00:33,240 --> 00:00:36,546
The result of that is that we all knew similar things.

11
00:00:38,059 --> 00:00:40,742
Now the internet has changed everything.

12
00:00:41,090 --> 00:00:44,017
When you go online, when you open an application,

13
00:00:44,486 --> 00:00:47,224
what you see is determined not by a person

14
00:00:47,489 --> 00:00:48,710
but by a machine.

15
00:00:49,589 --> 00:00:51,820
And this is awesome in a lot of ways:

16
00:00:51,984 --> 00:00:54,053
It allows you to use Google Maps;

17
00:00:54,053 --> 00:00:56,523
It allows you to order food online;

18
00:00:56,523 --> 00:01:00,162
It allows you to connect with friends around the world

19
00:01:00,162 --> 00:01:01,870
and share information...

20
00:01:02,290 --> 00:01:04,911
But there are aspects of this machine

21
00:01:05,221 --> 00:01:07,511
that we need to think really carefully about

22
00:01:07,741 --> 00:01:11,618
because they determine the information that we all receive

23
00:01:11,793 --> 00:01:15,050
as citizens in a society and in a democracy.
24
00:01:15,489 --> 00:01:17,029
So when you open up an app

25
00:01:17,179 --> 00:01:20,531
and you’re shown a picture in your Snapchat feed,

26
00:01:20,764 --> 00:01:24,420
all of that information is determined by this machine,

27
00:01:24,465 --> 00:01:28,951
and that machine is driven by the incentives of the company

28
00:01:28,951 --> 00:01:32,251
that owns the website or owns the application.

29
00:01:32,684 --> 00:01:36,540
And the incentive is for you to spend as much time as possible

30
00:01:36,540 --> 00:01:38,080
inside that application.

31
00:01:38,680 --> 00:01:42,087
So they do things that make you feel very good about being there.

32
00:01:42,250 --> 00:01:44,553
They allow people to like your photos.

33
00:01:44,975 --> 00:01:47,943
They show you content that they think you want to see

34
00:01:48,066 --> 00:01:50,850
that will either make you really happy or really angry,

35
00:01:50,850 --> 00:01:53,880
that will get an emotional response from you to keep you there.

36
00:01:54,260 --> 00:01:57,173
That is because they want to show you as many ads as possible

37
00:01:57,173 --> 00:01:58,173
when you’re there

38
00:01:58,173 --> 00:02:00,160
because that is their business model.

39
00:02:00,897 --> 00:02:04,414
They’re also taking that same opportunity of you being in their app

40
00:02:04,414 --> 00:02:06,140
to collect data about you.

41
00:02:06,370 --> 00:02:08,005
And they use these data

42
00:02:08,005 --> 00:02:11,553
to create detailed profiles of your life and your behaviour,

43
00:02:11,845 --> 00:02:13,924
and these profiles can then be used

44
00:02:13,924 --> 00:02:16,930
to target more ads back to you,

45
00:02:17,200 --> 00:02:19,700
and that then determines what you see as well.
46
00:02:21,470 --> 00:02:25,170
But all of this isn’t just about the business model of these companies,

47
00:02:25,614 --> 00:02:28,277
it actually has an impact on our democracy

48
00:02:28,735 --> 00:02:34,509
because what we each see on the internet is highly customized to us,

49
00:02:34,990 --> 00:02:36,183
to what we like,

50
00:02:36,453 --> 00:02:37,843
to what we believe,

51
00:02:38,253 --> 00:02:41,203
to what we want to see or want to believe.

52
00:02:41,523 --> 00:02:43,498
And that means that as a society,

53
00:02:43,756 --> 00:02:47,500
we no longer all have a shared set of knowledge,

54
00:02:47,500 --> 00:02:50,197
which is hard for a democracy that requires us

55
00:02:50,197 --> 00:02:52,621
to work together and know the same things

56
00:02:52,851 --> 00:02:55,171
to make decisions about our lives together.

57
00:02:55,391 --> 00:02:57,030
When we all know different things

58
00:02:57,030 --> 00:03:01,059
and we’re all being siloed into our own little bubbles of information,

59
00:03:01,567 --> 00:03:04,860
it’s incredibly difficult for us to get along with one another.

60
00:03:04,860 --> 00:03:07,670
We have no shared experience or shared knowledge.

61
00:03:08,241 --> 00:03:11,335
I think it’s really important that we think critically

62
00:03:11,545 --> 00:03:13,897
about the information we receive online,

63
00:03:14,247 --> 00:03:17,167
and about the companies and structures that determine

64
00:03:17,337 --> 00:03:19,455
what we see on the internet.

65
00:03:21,504 --> 00:03:24,702
[NewsWise is a project of CIVIX & The Canadian Journalism Foundation]

66
00:03:25,172 --> 00:03:27,595
Subtitles by Claudia Contreras
Review by Carol Wang