1
00:00:03,634 --> 00:00:08,343
[BEHIND THE SCREENS Who decides what I see online?]

2
00:00:08,376 --> 00:00:09,526
Hi, my name’s Taylor.

3
00:00:09,526 --> 00:00:11,778
I study the internet and journalism

4
00:00:11,778 --> 00:00:15,977
and how we as citizens receive information about the world.

5
00:00:17,389 --> 00:00:19,778
In the olden days when people got information,

6
00:00:19,778 --> 00:00:22,688
it was mostly decided by people.

7
00:00:22,688 --> 00:00:25,711
There were humans that decided what we needed to know.

8
00:00:25,711 --> 00:00:30,090
So, when we opened the newspaper or turned on the evening news, it was a person that

9
00:00:30,090 --> 00:00:32,699
decided what we heard and saw.

10
00:00:33,300 --> 00:00:36,546
The result of that is that we all knew similar things.

11
00:00:38,059 --> 00:00:40,742
Now the internet has changed everything.

12
00:00:41,090 --> 00:00:44,017
When you go online, when you open an application,

13
00:00:44,486 --> 00:00:47,114
what you see is determined not by a person

14
00:00:47,489 --> 00:00:48,710
but by a machine.

15
00:00:49,589 --> 00:00:51,820
And this is awesome in a lot of ways.

16
00:00:51,984 --> 00:00:56,523
It allows you to use Google Maps, it allows you to order food online,

17
00:00:56,523 --> 00:01:01,605
it allows you to connect with friends around the world and share information.

18
00:01:02,000 --> 00:01:07,751
But there are aspects of this machine that we need to think really carefully about

19
00:01:07,751 --> 00:01:11,618
because they determine the information that we all receive

20
00:01:11,793 --> 00:01:15,050
as citizens in a society and in a democracy.
21
00:01:15,489 --> 00:01:17,179
So, when you open up an app

22
00:01:17,179 --> 00:01:20,531
and you’re shown a picture in your Snapchat feed,

23
00:01:20,764 --> 00:01:24,420
all of that information is determined by this machine

24
00:01:24,465 --> 00:01:28,030
and that machine is driven by the incentives

25
00:01:28,030 --> 00:01:31,321
of the company that owns the website or owns the application.

26
00:01:32,684 --> 00:01:34,520
And the incentive is for you to spend

27
00:01:34,520 --> 00:01:38,080
as much time as possible inside that application.

28
00:01:38,680 --> 00:01:42,250
So, they do things that make you feel very good about being there.

29
00:01:42,250 --> 00:01:44,467
They allow people to like your photos,

30
00:01:44,975 --> 00:01:47,943
they show you content that they think you want to see

31
00:01:48,066 --> 00:01:50,850
that will either make you really happy or really angry,

32
00:01:50,850 --> 00:01:53,880
that will get an emotional response from you to keep you there.

33
00:01:54,410 --> 00:01:58,153
That is because they want to show you as many ads as possible when you’re there

34
00:01:58,153 --> 00:01:59,950
because that is their business model.

35
00:02:00,897 --> 00:02:04,414
They’re also taking that same opportunity of you being in their app

36
00:02:04,414 --> 00:02:06,370
to collect data about you.

37
00:02:06,370 --> 00:02:11,845
And they use these data to create detailed profiles of your life and your behaviour,

38
00:02:11,845 --> 00:02:16,810
and these profiles can then be used to target more ads back to you.

39
00:02:17,200 --> 00:02:19,700
And that then determines what you see as well.
40
00:02:21,470 --> 00:02:25,170
But all of this isn’t just about the business model of these companies,

41
00:02:25,614 --> 00:02:28,277
it actually has an impact on our democracy

42
00:02:28,735 --> 00:02:34,289
because what we each see on the internet is highly customized to us,

43
00:02:34,990 --> 00:02:41,010
to what we like, to what we believe, to what we want to see or want to believe.

44
00:02:41,523 --> 00:02:43,498
And that means that as a society,

45
00:02:43,756 --> 00:02:47,500
we no longer all have a shared set of knowledge,

46
00:02:47,500 --> 00:02:51,134
which is hard for a democracy that requires us to work together

47
00:02:51,134 --> 00:02:54,880
and know the same things to make decisions about our lives together.

48
00:02:55,391 --> 00:02:57,030
When we all know different things

49
00:02:57,030 --> 00:03:01,059
and we’re all being siloed into our own little bubbles of information,

50
00:03:01,567 --> 00:03:04,860
it’s incredibly difficult for us to get along with one another.

51
00:03:04,860 --> 00:03:07,670
We have no shared experience or shared knowledge.

52
00:03:08,241 --> 00:03:12,810
So, I think it’s really important that we think critically about the information we

53
00:03:12,810 --> 00:03:17,113
receive online and about the companies and structures that determine

54
00:03:17,385 --> 00:03:19,219
what we see on the internet.