0:00:03.634,0:00:08.343
[BEHIND THE SCREENS
Who decides what I see online?]

0:00:08.376,0:00:09.526
Hi, my name’s Taylor.

0:00:09.526,0:00:11.778
I study the internet and journalism

0:00:11.778,0:00:15.977
and how we as citizens receive
information about the world.

0:00:17.389,0:00:19.778
In the olden days
when people got information,

0:00:19.778,0:00:22.688
it was mostly decided by people.

0:00:22.688,0:00:25.711
There were humans that decided
what we needed to know.

0:00:25.711,0:00:30.090
So, when we opened the newspaper or turned
on the evening news, it was a person that

0:00:30.090,0:00:32.699
decided what we heard and saw.

0:00:33.300,0:00:36.546
The result of that is
that we all knew similar things.

0:00:38.059,0:00:40.742
Now the internet has changed everything.

0:00:41.090,0:00:44.017
When you go online,
when you open an application,

0:00:44.486,0:00:47.114
what you see is determined not by a person

0:00:47.489,0:00:48.710
but by a machine.

0:00:49.589,0:00:51.820
And this is awesome in a lot of ways.

0:00:51.984,0:00:56.523
It allows you to use Google Maps,
it allows you to order food online,

0:00:56.523,0:01:01.605
it allows you to connect with friends
around the world and share information.

0:01:02.000,0:01:07.751
But there are aspects of this machine that
we need to think really carefully about

0:01:07.751,0:01:11.618
because they determine
the information that we all receive

0:01:11.793,0:01:15.050
as citizens in a society
and in a democracy.

0:01:15.489,0:01:17.179
So, when you open up an app

0:01:17.179,0:01:20.531
and you’re shown a picture
in your Snapchat feed,

0:01:20.764,0:01:24.420
all of that information
is determined by this machine

0:01:24.465,0:01:28.030
and that machine
is driven by the incentives

0:01:28.030,0:01:31.321
of the company that owns the website
or owns the application.

0:01:32.684,0:01:34.520
And the incentive is for you to spend

0:01:34.520,0:01:38.080
as much time as possible
inside that application.

0:01:38.680,0:01:42.250
So, they do things that make you feel
very good about being there.

0:01:42.250,0:01:44.467
They allow people to like your photos,

0:01:44.975,0:01:47.943
they show you content
that they think you want to see

0:01:48.066,0:01:50.850
that will either make you
really happy or really angry,

0:01:50.850,0:01:53.880
that will get an emotional response
from you to keep you there.

0:01:54.410,0:01:58.153
That is because they want to show you
as many ads as possible when you’re there

0:01:58.153,0:01:59.950
because that is their business model.

0:02:00.897,0:02:04.414
They’re also taking that same opportunity
of you being in their app

0:02:04.414,0:02:06.370
to collect data about you.

0:02:06.370,0:02:11.845
And they use these data to create detailed
profiles of your life and your behaviour,

0:02:11.845,0:02:16.810
and these profiles can then be used
to target more ads back to you.

0:02:17.200,0:02:19.700
And that then determines
what you see as well.

0:02:21.470,0:02:25.170
But all of this isn’t just about
the business model of these companies,

0:02:25.614,0:02:28.277
it actually has an impact on our democracy

0:02:28.735,0:02:34.289
because what we each see on the internet
is highly customized to us,

0:02:34.990,0:02:41.010
to what we like, to what we believe,
to what we want to see or want to believe.
0:02:41.523,0:02:43.498
And that means that as a society,

0:02:43.756,0:02:47.500
we no longer all have
a shared set of knowledge,

0:02:47.500,0:02:51.134
which is hard for a democracy
that requires us to work together

0:02:51.134,0:02:54.880
and know the same things to make
decisions about our lives together.

0:02:55.391,0:02:57.030
When we all know different things

0:02:57.030,0:03:01.059
and we’re all being siloed into
our own little bubbles of information,

0:03:01.567,0:03:04.860
it’s incredibly difficult for us
to get along with one another.

0:03:04.860,0:03:07.670
We have no shared experience
or shared knowledge.

0:03:08.241,0:03:12.810
So, I think it’s really important that we
think critically about the information we

0:03:12.810,0:03:17.113
receive online and about the companies
and structures that determine

0:03:17.385,0:03:19.219
what we see on the internet.