WEBVTT

00:00:00.792 --> 00:00:03.059
Every day, every week,

00:00:03.083 --> 00:00:05.268
we agree to terms and conditions.

00:00:05.292 --> 00:00:06.768
And when we do this,

00:00:06.792 --> 00:00:09.268
we provide companies with the lawful right

00:00:09.292 --> 00:00:12.976
to do whatever they want with our data

00:00:13.000 --> 00:00:15.375
and with the data of our children.

00:00:16.792 --> 00:00:19.768
Which makes us wonder:

00:00:19.792 --> 00:00:22.684
how much of our children's data are we giving away,

00:00:22.708 --> 00:00:24.708
and what are its implications?

NOTE Paragraph

00:00:26.500 --> 00:00:27.893
I'm an anthropologist,

00:00:27.917 --> 00:00:30.518
and I'm also the mother of two little girls.

00:00:30.542 --> 00:00:35.018
And I started to become interested in this question in 2015

00:00:35.042 --> 00:00:37.768
when I suddenly realized that there were vast --

00:00:37.792 --> 00:00:40.809
almost unimaginable amounts of data traces

00:00:40.833 --> 00:00:44.000
that are being produced and collected about children.

00:00:44.792 --> 00:00:46.768
So I launched a research project,

00:00:46.792 --> 00:00:49.268
which is called Child Data Citizen,

00:00:49.292 --> 00:00:51.417
and I aimed at filling in the blank.

NOTE Paragraph

00:00:52.583 --> 00:00:55.601
Now you may think that I'm here to blame you

00:00:55.625 --> 00:00:58.393
for posting photos of your children on social media,

00:00:58.417 --> 00:01:00.559
but that's not really the point.

00:01:00.583 --> 00:01:04.000
The problem is way bigger than so-called "sharenting."

00:01:04.792 --> 00:01:08.893
This is about systems, not individuals.

00:01:08.917 --> 00:01:11.208
You and your habits are not to blame.

NOTE Paragraph

00:01:12.833 --> 00:01:15.684
For the very first time in history,

00:01:15.708 --> 00:01:18.268
we are tracking the individual data of children

00:01:18.292 --> 00:01:20.059
from long before they're born --

00:01:20.083 --> 00:01:22.768
sometimes from the moment of conception,

00:01:22.792 --> 00:01:25.143
and then throughout their lives.

00:01:25.167 --> 00:01:28.268
You see, when parents decide to conceive,

00:01:28.292 --> 00:01:31.268
they go online to look for "ways to get pregnant,"

00:01:31.292 --> 00:01:34.042
or they download ovulation-tracking apps.

00:01:35.250 --> 00:01:37.851
When they do get pregnant,

00:01:37.875 --> 00:01:41.018
they post ultrasounds of their babies on social media,

00:01:41.042 --> 00:01:43.059
they download pregnancy apps

00:01:43.083 --> 00:01:46.809
or they consult Dr. Google for all sorts of things,

00:01:46.833 --> 00:01:48.351
like, you know --

00:01:48.375 --> 00:01:50.934
for "miscarriage risk when flying"

00:01:50.958 --> 00:01:53.726
or "abdominal cramps in early pregnancy."

00:01:53.750 --> 00:01:55.559
I know because I've done it --

00:01:55.583 --> 00:01:57.208
and many times.

00:01:58.458 --> 00:02:01.268
And then, when the baby is born, they track every nap,

00:02:01.292 --> 00:02:02.559
every feed,

00:02:02.583 --> 00:02:05.167
every life event on different technologies.

00:02:06.083 --> 00:02:07.559
And all of these technologies

00:02:07.583 --> 00:02:13.726
transform the baby's most intimate behavioral and health data into profit

00:02:13.750 --> 00:02:15.542
by sharing it with others.
NOTE Paragraph

00:02:16.583 --> 00:02:18.726
So to give you an idea of how this works,

00:02:18.750 --> 00:02:23.934
in 2019, the British Medical Journal published research that showed

00:02:23.958 --> 00:02:27.601
that out of 24 mobile health apps,

00:02:27.625 --> 00:02:31.083
19 shared information with third parties.

00:02:32.083 --> 00:02:37.917
And these third parties shared information with 216 other organizations.

00:02:38.875 --> 00:02:42.309
Of these 216 fourth parties,

00:02:42.333 --> 00:02:45.476
only three belonged to the health sector.

00:02:45.500 --> 00:02:50.018
The other companies that had access to that data were big tech companies

00:02:50.042 --> 00:02:53.559
like Google, Facebook or Oracle;

00:02:53.583 --> 00:02:56.184
they were digital advertising companies;

00:02:56.208 --> 00:03:00.333
and there was also a consumer credit reporting agency.

00:03:01.125 --> 00:03:02.559
So you get it right:

00:03:02.583 --> 00:03:07.708
ad companies and credit agencies may already have data points on little babies.

00:03:09.125 --> 00:03:11.893
But mobile apps, web searches and social media

00:03:11.917 --> 00:03:15.018
are really just the tip of the iceberg,

00:03:15.042 --> 00:03:17.893
because children are being tracked by multiple technologies

00:03:17.917 --> 00:03:19.643
in their everyday lives.

00:03:19.667 --> 00:03:23.809
They're tracked by home technologies and virtual assistants in their homes.

00:03:23.833 --> 00:03:25.809
They're tracked by educational platforms

00:03:25.833 --> 00:03:28.018
and educational technologies in their schools.

00:03:28.042 --> 00:03:29.643
They're tracked by online records

00:03:29.667 --> 00:03:32.684
and online portals at their doctor's office.

00:03:32.708 --> 00:03:35.059
They're tracked by their internet-connected toys,

00:03:35.083 --> 00:03:36.393
their online games

00:03:36.417 --> 00:03:39.083
and many, many, many, many other technologies.

NOTE Paragraph

00:03:40.250 --> 00:03:41.893
So during my research,

00:03:41.917 --> 00:03:46.059
a lot of parents came up to me and they were like, "So what?

00:03:46.083 --> 00:03:49.000
Why does it matter if my children are being tracked?

00:03:50.042 --> 00:03:51.375
We've got nothing to hide."

00:03:52.958 --> 00:03:54.458
Well, it matters.

00:03:55.083 --> 00:04:01.101
It matters because today individuals are not only being tracked,

00:04:01.125 --> 00:04:05.226
they're also being profiled on the basis of their data traces.

00:04:05.250 --> 00:04:09.059
Artificial intelligence and predictive analytics are being used

00:04:09.083 --> 00:04:12.726
to harness as much data as possible about an individual's life

00:04:12.750 --> 00:04:14.601
from different sources:

00:04:14.625 --> 00:04:19.143
family history, purchasing habits, social media comments.

00:04:19.167 --> 00:04:21.018
And then they bring this data together

00:04:21.042 --> 00:04:23.792
to make data-driven decisions about the individual.

00:04:24.792 --> 00:04:28.226
And these technologies are used everywhere.

00:04:28.250 --> 00:04:30.643
Banks use them to decide loans.

00:04:30.667 --> 00:04:33.042
Insurance companies use them to decide premiums.

00:04:34.208 --> 00:04:36.684
Recruiters and employers use them

00:04:36.708 --> 00:04:39.625
to decide whether one is a good fit for a job or not.

00:04:40.750 --> 00:04:43.851
The police and courts also use them

00:04:43.875 --> 00:04:47.393
to determine whether one is a potential criminal

00:04:47.417 --> 00:04:50.042
or is likely to recommit a crime.
NOTE Paragraph

00:04:52.458 --> 00:04:56.518
We have no knowledge or control

00:04:56.542 --> 00:05:00.184
over the ways in which those who buy, sell and process our data

00:05:00.208 --> 00:05:02.917
are profiling us and our children.

00:05:03.625 --> 00:05:07.667
But these profiles can come to impact our rights in significant ways.

NOTE Paragraph

00:05:08.917 --> 00:05:11.125
To give you an example,

00:05:13.792 --> 00:05:17.851
in 2018, the "New York Times" published the news

00:05:17.875 --> 00:05:19.851
that the data that had been gathered

00:05:19.875 --> 00:05:22.934
through online college-planning services --

00:05:22.958 --> 00:05:27.684
that are actually completed by millions of high school kids across the US

00:05:27.708 --> 00:05:31.351
who are looking for a college program or a scholarship --

00:05:31.375 --> 00:05:34.417
had been sold to educational data brokers.

00:05:35.792 --> 00:05:41.226
Now, researchers at Fordham who studied educational data brokers

00:05:41.250 --> 00:05:46.476
revealed that these companies profiled kids as young as two

00:05:46.500 --> 00:05:49.559
on the basis of different categories:

00:05:49.583 --> 00:05:53.768
ethnicity, religion, affluence,

00:05:53.792 --> 00:05:55.851
social awkwardness

00:05:55.875 --> 00:05:58.809
and many other random categories.

00:05:58.833 --> 00:06:03.851
And then they sell these profiles together with the name of the kid,

00:06:03.875 --> 00:06:06.684
their home address and contact details

00:06:06.708 --> 00:06:08.559
to different companies,

00:06:08.583 --> 00:06:11.042
including trade and career institutions,

00:06:12.083 --> 00:06:13.351
student loan companies

00:06:13.375 --> 00:06:15.125
and student credit card companies.

00:06:16.542 --> 00:06:17.893
To push the boundaries,

00:06:17.917 --> 00:06:21.726
the researchers at Fordham asked an educational data broker

00:06:21.750 --> 00:06:27.559
to provide them with a list of 14-to-15-year-old girls

00:06:27.583 --> 00:06:30.958
who were interested in family planning services.

00:06:32.208 --> 00:06:34.684
The data broker agreed to provide them with the list.

00:06:34.708 --> 00:06:39.583
So imagine how intimate and how intrusive that is for our kids.

00:06:40.833 --> 00:06:44.809
But educational data brokers are really just an example.

00:06:44.833 --> 00:06:49.518
The truth is that our children are being profiled in ways that we cannot control

00:06:49.542 --> 00:06:52.958
but that can significantly impact their chances in life.

NOTE Paragraph

00:06:54.167 --> 00:06:57.643
So we need to ask ourselves:

00:06:57.667 --> 00:07:02.351
can we trust these technologies when it comes to profiling our children?

00:07:02.375 --> 00:07:03.625
Can we?

00:07:05.708 --> 00:07:06.958
My answer is no.

00:07:07.792 --> 00:07:09.059
As an anthropologist,

00:07:09.083 --> 00:07:12.851
I believe that artificial intelligence and predictive analytics can be great

00:07:12.875 --> 00:07:14.893
to predict the course of a disease

00:07:14.917 --> 00:07:16.750
or to fight climate change.

00:07:18.000 --> 00:07:19.643
But we need to abandon the belief

00:07:19.667 --> 00:07:23.351
that these technologies can objectively profile humans

00:07:23.375 --> 00:07:26.559
and that we can rely on them to make data-driven decisions

00:07:26.583 --> 00:07:28.476
about individual lives.

00:07:28.500 --> 00:07:31.059
Because they can't profile humans.

00:07:31.083 --> 00:07:34.434
Data traces are not the mirror of who we are.
00:07:34.458 --> 00:07:36.559
Humans think one thing and say the opposite,

00:07:36.583 --> 00:07:39.018
feel one way and act differently.

00:07:39.042 --> 00:07:41.518
Algorithmic predictions or our digital practices

00:07:41.542 --> 00:07:46.708
cannot account for the unpredictability and complexity of human experience.

NOTE Paragraph

00:07:48.417 --> 00:07:49.976
But on top of that,

00:07:50.000 --> 00:07:52.684
these technologies are always --

00:07:52.708 --> 00:07:53.976
always --

00:07:54.000 --> 00:07:55.917
in one way or another, biased.

00:07:57.125 --> 00:08:02.184
You see, algorithms are by definition sets of rules or steps

00:08:02.208 --> 00:08:05.917
that have been designed to achieve a specific result, OK?

00:08:06.833 --> 00:08:09.559
But these sets of rules or steps cannot be objective,

00:08:09.583 --> 00:08:11.726
because they've been designed by human beings

00:08:11.750 --> 00:08:13.476
within a specific cultural context

00:08:13.500 --> 00:08:16.000
and are shaped by specific cultural values.

00:08:16.667 --> 00:08:18.393
So when machines learn,

00:08:18.417 --> 00:08:20.667
they learn from biased algorithms,

00:08:21.625 --> 00:08:24.833
and they often learn from biased databases as well.

NOTE Paragraph

00:08:25.833 --> 00:08:29.559
At the moment, we're seeing the first examples of algorithmic bias.

00:08:29.583 --> 00:08:33.083
And some of these examples are frankly terrifying.

00:08:34.500 --> 00:08:38.559
This year, the AI Now Institute in New York published a report

00:08:38.583 --> 00:08:40.976
that revealed that the AI technologies

00:08:41.000 --> 00:08:44.476
that are being used for predictive policing

00:08:44.500 --> 00:08:47.625
have been trained on "dirty" data.

00:08:48.333 --> 00:08:51.226
This is basically data that had been gathered

00:08:51.250 --> 00:08:55.434
during historical periods of known racial bias

00:08:55.458 --> 00:08:57.708
and nontransparent police practices.

00:08:58.542 --> 00:09:02.601
Because these technologies are being trained with dirty data,

00:09:02.625 --> 00:09:04.059
they're not objective,

00:09:04.083 --> 00:09:08.601
and their outcomes are only amplifying and perpetuating

00:09:08.625 --> 00:09:10.250
police bias and error.

NOTE Paragraph

00:09:13.167 --> 00:09:16.309
So I think we are faced with a fundamental problem

00:09:16.333 --> 00:09:17.976
in our society.

00:09:18.000 --> 00:09:22.792
We are starting to trust technologies when it comes to profiling human beings.

00:09:23.750 --> 00:09:26.268
We know that in profiling humans,

00:09:26.292 --> 00:09:29.101
these technologies are always going to be biased

00:09:29.125 --> 00:09:31.851
and are never really going to be accurate.

00:09:31.875 --> 00:09:34.809
So what we need now is actually a political solution.

00:09:34.833 --> 00:09:39.542
We need governments to recognize that our data rights are our human rights.

NOTE Paragraph

00:09:40.292 --> 00:09:44.375
(Applause and cheers)

NOTE Paragraph

00:09:47.833 --> 00:09:51.917
Until this happens, we cannot hope for a more just future.

00:09:52.750 --> 00:09:55.476
I worry that my daughters are going to be exposed

00:09:55.500 --> 00:09:59.226
to all sorts of algorithmic discrimination and error.

00:09:59.250 --> 00:10:01.643
You see, the difference between me and my daughters

00:10:01.667 --> 00:10:04.851
is that there's no public record out there of my childhood.
00:10:04.875 --> 00:10:08.893
There's certainly no database of all the stupid things that I've done

00:10:08.917 --> 00:10:11.059
and thought when I was a teenager.

NOTE Paragraph

00:10:11.083 --> 00:10:12.583
(Laughter)

NOTE Paragraph

00:10:13.833 --> 00:10:16.583
But for my daughters this may be different.

00:10:17.292 --> 00:10:20.476
The data that is being collected from them today

00:10:20.500 --> 00:10:24.309
may be used to judge them in the future

00:10:24.333 --> 00:10:27.292
and can come to stand in the way of their hopes and dreams.

NOTE Paragraph

00:10:28.583 --> 00:10:30.101
I think that it's time.

00:10:30.125 --> 00:10:31.559
It's time that we all step up.

00:10:31.583 --> 00:10:34.059
It's time that we start working together

00:10:34.083 --> 00:10:35.518
as individuals,

00:10:35.542 --> 00:10:38.059
as organizations and as institutions,

00:10:38.083 --> 00:10:41.184
and that we demand greater data justice for us

00:10:41.208 --> 00:10:42.601
and for our children

00:10:42.625 --> 00:10:44.143
before it's too late.

NOTE Paragraph

00:10:44.167 --> 00:10:45.434
Thank you.

NOTE Paragraph

00:10:45.458 --> 00:10:46.875
(Applause)