[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:00.79,0:00:03.06,Default,,0000,0000,0000,,Every day, every week,
Dialogue: 0,0:00:03.08,0:00:05.27,Default,,0000,0000,0000,,we agree to terms and conditions.
Dialogue: 0,0:00:05.29,0:00:06.77,Default,,0000,0000,0000,,And when we do this,
Dialogue: 0,0:00:06.79,0:00:09.27,Default,,0000,0000,0000,,we provide companies with the lawful right
Dialogue: 0,0:00:09.29,0:00:12.98,Default,,0000,0000,0000,,to do whatever they want with our data
Dialogue: 0,0:00:13.00,0:00:15.38,Default,,0000,0000,0000,,and with the data of our children.
Dialogue: 0,0:00:16.79,0:00:19.77,Default,,0000,0000,0000,,Which makes us wonder:
Dialogue: 0,0:00:19.79,0:00:22.68,Default,,0000,0000,0000,,how much data are we giving\Naway about children,
Dialogue: 0,0:00:22.71,0:00:24.71,Default,,0000,0000,0000,,and what are its implications?
Dialogue: 0,0:00:26.50,0:00:27.89,Default,,0000,0000,0000,,I'm an anthropologist,
Dialogue: 0,0:00:27.92,0:00:30.52,Default,,0000,0000,0000,,and I'm also the mother\Nof two little girls.
Dialogue: 0,0:00:30.54,0:00:35.02,Default,,0000,0000,0000,,And I started to become interested\Nin this question in 2015
Dialogue: 0,0:00:35.04,0:00:37.77,Default,,0000,0000,0000,,when I suddenly realized\Nthat there were vast --
Dialogue: 0,0:00:37.79,0:00:40.81,Default,,0000,0000,0000,,almost unimaginable amounts of data traces
Dialogue: 0,0:00:40.83,0:00:44.00,Default,,0000,0000,0000,,that are being produced\Nand collected about children.
Dialogue: 0,0:00:44.79,0:00:46.77,Default,,0000,0000,0000,,So I launched a research project,
Dialogue: 0,0:00:46.79,0:00:49.27,Default,,0000,0000,0000,,which is called Child Data Citizen,
Dialogue: 0,0:00:49.29,0:00:51.42,Default,,0000,0000,0000,,and I aimed at filling in the blank.
Dialogue: 0,0:00:52.58,0:00:55.60,Default,,0000,0000,0000,,Now you may think\Nthat I'm here to blame you
Dialogue: 0,0:00:55.62,0:00:58.39,Default,,0000,0000,0000,,for posting photos\Nof your children on social media,
Dialogue: 0,0:00:58.42,0:01:00.56,Default,,0000,0000,0000,,but that's not really the point.
Dialogue: 0,0:01:00.58,0:01:04.00,Default,,0000,0000,0000,,The problem is way bigger\Nthan so-called "sharenting."
Dialogue: 0,0:01:04.79,0:01:08.89,Default,,0000,0000,0000,,This is about systems, not individuals.
Dialogue: 0,0:01:08.92,0:01:11.21,Default,,0000,0000,0000,,You and your habits are not to blame.
Dialogue: 0,0:01:12.83,0:01:15.68,Default,,0000,0000,0000,,For the very first time in history,
Dialogue: 0,0:01:15.71,0:01:18.27,Default,,0000,0000,0000,,we are tracking\Nthe individual data of children
Dialogue: 0,0:01:18.29,0:01:20.06,Default,,0000,0000,0000,,from long before they're born --
Dialogue: 0,0:01:20.08,0:01:22.77,Default,,0000,0000,0000,,sometimes from the moment of conception,
Dialogue: 0,0:01:22.79,0:01:25.14,Default,,0000,0000,0000,,and then throughout their lives.
Dialogue: 0,0:01:25.17,0:01:28.27,Default,,0000,0000,0000,,You see, when parents decide to conceive,
Dialogue: 0,0:01:28.29,0:01:31.27,Default,,0000,0000,0000,,they go online to look\Nfor "ways to get pregnant,"
Dialogue: 0,0:01:31.29,0:01:34.04,Default,,0000,0000,0000,,or they download ovulation-tracking apps.
Dialogue: 0,0:01:35.25,0:01:37.85,Default,,0000,0000,0000,,When they do get pregnant,
Dialogue: 0,0:01:37.88,0:01:41.02,Default,,0000,0000,0000,,they post ultrasounds\Nof their babies on social media,
Dialogue: 0,0:01:41.04,0:01:43.06,Default,,0000,0000,0000,,they download pregnancy apps
Dialogue: 0,0:01:43.08,0:01:46.81,Default,,0000,0000,0000,,or they consult Dr. Google\Nfor all sorts of things,
Dialogue: 0,0:01:46.83,0:01:48.35,Default,,0000,0000,0000,,like, you know --
Dialogue: 0,0:01:48.38,0:01:50.93,Default,,0000,0000,0000,,for "miscarriage risk when flying"
Dialogue: 0,0:01:50.96,0:01:53.73,Default,,0000,0000,0000,,or "abdominal cramps in early pregnancy."
Dialogue: 0,0:01:53.75,0:01:55.56,Default,,0000,0000,0000,,I know because I've done it --
Dialogue: 0,0:01:55.58,0:01:57.21,Default,,0000,0000,0000,,and many times.
Dialogue: 0,0:01:58.46,0:02:01.27,Default,,0000,0000,0000,,And then, when the baby is born,\Nthey track every nap,
Dialogue: 0,0:02:01.29,0:02:02.56,Default,,0000,0000,0000,,every feed,
Dialogue: 0,0:02:02.58,0:02:05.17,Default,,0000,0000,0000,,every life event\Non different technologies.
Dialogue: 0,0:02:06.08,0:02:07.56,Default,,0000,0000,0000,,And all of these technologies
Dialogue: 0,0:02:07.58,0:02:13.73,Default,,0000,0000,0000,,transform the baby's most intimate\Nbehavioral and health data into profit
Dialogue: 0,0:02:13.75,0:02:15.54,Default,,0000,0000,0000,,by sharing it with others.
Dialogue: 0,0:02:16.58,0:02:18.73,Default,,0000,0000,0000,,So to give you an idea of how this works,
Dialogue: 0,0:02:18.75,0:02:23.93,Default,,0000,0000,0000,,in 2019, the British Medical Journal\Npublished research that showed
Dialogue: 0,0:02:23.96,0:02:27.60,Default,,0000,0000,0000,,that out of 24 mobile health apps,
Dialogue: 0,0:02:27.62,0:02:31.08,Default,,0000,0000,0000,,19 shared information with third parties.
Dialogue: 0,0:02:32.08,0:02:37.92,Default,,0000,0000,0000,,And these third parties shared information\Nwith 216 other organizations.
Dialogue: 0,0:02:38.88,0:02:42.31,Default,,0000,0000,0000,,Of these 216 other fourth parties,
Dialogue: 0,0:02:42.33,0:02:45.48,Default,,0000,0000,0000,,only three belonged to the health sector.
Dialogue: 0,0:02:45.50,0:02:50.02,Default,,0000,0000,0000,,The other companies that had access\Nto that data were big tech companies
Dialogue: 0,0:02:50.04,0:02:53.56,Default,,0000,0000,0000,,like Google, Facebook or Oracle,
Dialogue: 0,0:02:53.58,0:02:56.18,Default,,0000,0000,0000,,they were digital advertising companies
Dialogue: 0,0:02:56.21,0:03:00.33,Default,,0000,0000,0000,,and there was also\Na consumer credit reporting agency.
Dialogue: 0,0:03:01.12,0:03:02.56,Default,,0000,0000,0000,,So you get it right:
Dialogue: 0,0:03:02.58,0:03:07.71,Default,,0000,0000,0000,,ad companies and credit agencies may\Nalready have data points on little babies.
Dialogue: 0,0:03:09.12,0:03:11.89,Default,,0000,0000,0000,,But mobile apps,\Nweb searches and social media
Dialogue: 0,0:03:11.92,0:03:15.02,Default,,0000,0000,0000,,are really just the tip of the iceberg,
Dialogue: 0,0:03:15.04,0:03:17.89,Default,,0000,0000,0000,,because children are being tracked\Nby multiple technologies
Dialogue: 0,0:03:17.92,0:03:19.64,Default,,0000,0000,0000,,in their everyday lives.
Dialogue: 0,0:03:19.67,0:03:23.81,Default,,0000,0000,0000,,They're tracked by home technologies\Nand virtual assistants in their homes.
Dialogue: 0,0:03:23.83,0:03:25.81,Default,,0000,0000,0000,,They're tracked by educational platforms
Dialogue: 0,0:03:25.83,0:03:28.02,Default,,0000,0000,0000,,and educational technologies\Nin their schools.
Dialogue: 0,0:03:28.04,0:03:29.64,Default,,0000,0000,0000,,They're tracked by online records
Dialogue: 0,0:03:29.67,0:03:32.68,Default,,0000,0000,0000,,and online portals\Nat their doctor's office.
Dialogue: 0,0:03:32.71,0:03:35.06,Default,,0000,0000,0000,,They're tracked by their\Ninternet-connected toys,
Dialogue: 0,0:03:35.08,0:03:36.39,Default,,0000,0000,0000,,their online games
Dialogue: 0,0:03:36.42,0:03:39.08,Default,,0000,0000,0000,,and many, many, many,\Nmany other technologies.
Dialogue: 0,0:03:40.25,0:03:41.89,Default,,0000,0000,0000,,So during my research,
Dialogue: 0,0:03:41.92,0:03:46.06,Default,,0000,0000,0000,,a lot of parents came up to me\Nand they were like, "So what?
Dialogue: 0,0:03:46.08,0:03:49.00,Default,,0000,0000,0000,,Why does it matter\Nif my children are being tracked?
Dialogue: 0,0:03:50.04,0:03:51.38,Default,,0000,0000,0000,,We've got nothing to hide."
Dialogue: 0,0:03:52.96,0:03:54.46,Default,,0000,0000,0000,,Well, it matters.
Dialogue: 0,0:03:55.08,0:04:01.10,Default,,0000,0000,0000,,It matters because today individuals\Nare not only being tracked,
Dialogue: 0,0:04:01.12,0:04:05.23,Default,,0000,0000,0000,,they're also being profiled\Non the basis of their data traces.
Dialogue: 0,0:04:05.25,0:04:09.06,Default,,0000,0000,0000,,Artificial intelligence and predictive\Nanalytics are being used
Dialogue: 0,0:04:09.08,0:04:12.73,Default,,0000,0000,0000,,to harness as much data as possible\Nof an individual life
Dialogue: 0,0:04:12.75,0:04:14.60,Default,,0000,0000,0000,,from different sources:
Dialogue: 0,0:04:14.62,0:04:19.14,Default,,0000,0000,0000,,family history, purchasing habits,\Nsocial media comments.
Dialogue: 0,0:04:19.17,0:04:21.02,Default,,0000,0000,0000,,And then they bring this data together
Dialogue: 0,0:04:21.04,0:04:23.79,Default,,0000,0000,0000,,to make data-driven decisions\Nabout the individual.
Dialogue: 0,0:04:24.79,0:04:28.23,Default,,0000,0000,0000,,And these technologies\Nare used everywhere.
Dialogue: 0,0:04:28.25,0:04:30.64,Default,,0000,0000,0000,,Banks use them to decide loans.
Dialogue: 0,0:04:30.67,0:04:33.04,Default,,0000,0000,0000,,Insurance uses them to decide premiums.
Dialogue: 0,0:04:34.21,0:04:36.68,Default,,0000,0000,0000,,Recruiters and employers use them
Dialogue: 0,0:04:36.71,0:04:39.62,Default,,0000,0000,0000,,to decide whether one\Nis a good fit for a job or not.
Dialogue: 0,0:04:40.75,0:04:43.85,Default,,0000,0000,0000,,Also the police and courts use them
Dialogue: 0,0:04:43.88,0:04:47.39,Default,,0000,0000,0000,,to determine whether one\Nis a potential criminal
Dialogue: 0,0:04:47.42,0:04:50.04,Default,,0000,0000,0000,,or is likely to recommit a crime.
Dialogue: 0,0:04:52.46,0:04:56.52,Default,,0000,0000,0000,,We have no knowledge or control
Dialogue: 0,0:04:56.54,0:05:00.18,Default,,0000,0000,0000,,over the ways in which those who buy,\Nsell and process our data
Dialogue: 0,0:05:00.21,0:05:02.92,Default,,0000,0000,0000,,are profiling us and our children.
Dialogue: 0,0:05:03.62,0:05:07.67,Default,,0000,0000,0000,,But these profiles can come to impact\Nour rights in significant ways.
Dialogue: 0,0:05:08.92,0:05:11.12,Default,,0000,0000,0000,,To give you an example,
Dialogue: 0,0:05:13.79,0:05:17.85,Default,,0000,0000,0000,,in 2018 the "New York Times"\Npublished the news
Dialogue: 0,0:05:17.88,0:05:19.85,Default,,0000,0000,0000,,that the data that had been gathered
Dialogue: 0,0:05:19.88,0:05:22.93,Default,,0000,0000,0000,,through online\Ncollege-planning services --
Dialogue: 0,0:05:22.96,0:05:27.68,Default,,0000,0000,0000,,that are actually completed by millions\Nof high school kids across the US
Dialogue: 0,0:05:27.71,0:05:31.35,Default,,0000,0000,0000,,who are looking for a college\Nprogram or a scholarship --
Dialogue: 0,0:05:31.38,0:05:34.42,Default,,0000,0000,0000,,had been sold to educational data brokers.
Dialogue: 0,0:05:35.79,0:05:41.23,Default,,0000,0000,0000,,Now, researchers at Fordham\Nwho studied educational data brokers
Dialogue: 0,0:05:41.25,0:05:46.48,Default,,0000,0000,0000,,revealed that these companies\Nprofiled kids as young as two
Dialogue: 0,0:05:46.50,0:05:49.56,Default,,0000,0000,0000,,on the basis of different categories:
Dialogue: 0,0:05:49.58,0:05:53.77,Default,,0000,0000,0000,,ethnicity, religion, affluence,
Dialogue: 0,0:05:53.79,0:05:55.85,Default,,0000,0000,0000,,social awkwardness
Dialogue: 0,0:05:55.88,0:05:58.81,Default,,0000,0000,0000,,and many other random categories.
Dialogue: 0,0:05:58.83,0:06:03.85,Default,,0000,0000,0000,,And then they sell these profiles\Ntogether with the name of the kid,
Dialogue: 0,0:06:03.88,0:06:06.68,Default,,0000,0000,0000,,their home address and the contact details
Dialogue: 0,0:06:06.71,0:06:08.56,Default,,0000,0000,0000,,to different companies,
Dialogue: 0,0:06:08.58,0:06:11.04,Default,,0000,0000,0000,,including trade and career institutions,
Dialogue: 0,0:06:12.08,0:06:13.35,Default,,0000,0000,0000,,student loans
Dialogue: 0,0:06:13.38,0:06:15.12,Default,,0000,0000,0000,,and student credit card companies.
Dialogue: 0,0:06:16.54,0:06:17.89,Default,,0000,0000,0000,,To push the boundaries,
Dialogue: 0,0:06:17.92,0:06:21.73,Default,,0000,0000,0000,,the researchers at Fordham\Nasked an educational data broker
Dialogue: 0,0:06:21.75,0:06:27.56,Default,,0000,0000,0000,,to provide them with a list\Nof 14-to-15-year-old girls
Dialogue: 0,0:06:27.58,0:06:30.96,Default,,0000,0000,0000,,who were interested\Nin family planning services.
Dialogue: 0,0:06:32.21,0:06:34.68,Default,,0000,0000,0000,,The data broker agreed\Nto provide them the list.
Dialogue: 0,0:06:34.71,0:06:39.58,Default,,0000,0000,0000,,So imagine how intimate\Nand how intrusive that is for our kids.
Dialogue: 0,0:06:40.83,0:06:44.81,Default,,0000,0000,0000,,But educational data brokers\Nare really just an example.
Dialogue: 0,0:06:44.83,0:06:49.52,Default,,0000,0000,0000,,The truth is that our children are being\Nprofiled in ways that we cannot control
Dialogue: 0,0:06:49.54,0:06:52.96,Default,,0000,0000,0000,,but that can significantly impact\Ntheir chances in life.
Dialogue: 0,0:06:54.17,0:06:57.64,Default,,0000,0000,0000,,So we need to ask ourselves:
Dialogue: 0,0:06:57.67,0:07:02.35,Default,,0000,0000,0000,,can we trust these technologies\Nwhen it comes to profiling our children?
Dialogue: 0,0:07:02.38,0:07:03.62,Default,,0000,0000,0000,,Can we?
Dialogue: 0,0:07:05.71,0:07:06.96,Default,,0000,0000,0000,,My answer is no.
Dialogue: 0,0:07:07.79,0:07:09.06,Default,,0000,0000,0000,,As an anthropologist,
Dialogue: 0,0:07:09.08,0:07:12.85,Default,,0000,0000,0000,,I believe that artificial intelligence\Nand predictive analytics can be great
Dialogue: 0,0:07:12.88,0:07:14.89,Default,,0000,0000,0000,,to predict the course of a disease
Dialogue: 0,0:07:14.92,0:07:16.75,Default,,0000,0000,0000,,or to fight climate change.
Dialogue: 0,0:07:18.00,0:07:19.64,Default,,0000,0000,0000,,But we need to abandon the belief
Dialogue: 0,0:07:19.67,0:07:23.35,Default,,0000,0000,0000,,that these technologies\Ncan objectively profile humans
Dialogue: 0,0:07:23.38,0:07:26.56,Default,,0000,0000,0000,,and that we can rely on them\Nto make data-driven decisions
Dialogue: 0,0:07:26.58,0:07:28.48,Default,,0000,0000,0000,,about individual lives.
Dialogue: 0,0:07:28.50,0:07:31.06,Default,,0000,0000,0000,,Because they can't profile humans.
Dialogue: 0,0:07:31.08,0:07:34.43,Default,,0000,0000,0000,,Data traces are not\Nthe mirror of who we are.
Dialogue: 0,0:07:34.46,0:07:36.56,Default,,0000,0000,0000,,Humans think one thing\Nand say the opposite,
Dialogue: 0,0:07:36.58,0:07:39.02,Default,,0000,0000,0000,,feel one way and act differently.
Dialogue: 0,0:07:39.04,0:07:41.52,Default,,0000,0000,0000,,Algorithmic predictions\Nor our digital practices
Dialogue: 0,0:07:41.54,0:07:46.71,Default,,0000,0000,0000,,cannot account for the unpredictability\Nand complexity of human experience.
Dialogue: 0,0:07:48.42,0:07:49.98,Default,,0000,0000,0000,,But on top of that,
Dialogue: 0,0:07:50.00,0:07:52.68,Default,,0000,0000,0000,,these technologies are always --
Dialogue: 0,0:07:52.71,0:07:53.98,Default,,0000,0000,0000,,always --
Dialogue: 0,0:07:54.00,0:07:55.92,Default,,0000,0000,0000,,in one way or another, biased.
Dialogue: 0,0:07:57.12,0:08:02.18,Default,,0000,0000,0000,,You see, algorithms are by definition\Nsets of rules or steps
Dialogue: 0,0:08:02.21,0:08:05.92,Default,,0000,0000,0000,,that have been designed to achieve\Na specific result, OK?
Dialogue: 0,0:08:06.83,0:08:09.56,Default,,0000,0000,0000,,But these sets of rules or steps\Ncannot be objective,
Dialogue: 0,0:08:09.58,0:08:11.73,Default,,0000,0000,0000,,because they've been designed\Nby human beings
Dialogue: 0,0:08:11.75,0:08:13.48,Default,,0000,0000,0000,,within a specific cultural context
Dialogue: 0,0:08:13.50,0:08:16.00,Default,,0000,0000,0000,,and are shaped\Nby specific cultural values.
Dialogue: 0,0:08:16.67,0:08:18.39,Default,,0000,0000,0000,,So when machines learn,
Dialogue: 0,0:08:18.42,0:08:20.67,Default,,0000,0000,0000,,they learn from biased algorithms,
Dialogue: 0,0:08:21.62,0:08:24.83,Default,,0000,0000,0000,,and they often learn\Nfrom biased databases as well.
Dialogue: 0,0:08:25.83,0:08:29.56,Default,,0000,0000,0000,,At the moment, we're seeing\Nthe first examples of algorithmic bias.
Dialogue: 0,0:08:29.58,0:08:33.08,Default,,0000,0000,0000,,And some of these examples\Nare frankly terrifying.
Dialogue: 0,0:08:34.50,0:08:38.56,Default,,0000,0000,0000,,This year, the AI Now Institute\Nin New York published a report
Dialogue: 0,0:08:38.58,0:08:40.98,Default,,0000,0000,0000,,that revealed that the AI technologies
Dialogue: 0,0:08:41.00,0:08:44.48,Default,,0000,0000,0000,,that are being used\Nfor predictive policing
Dialogue: 0,0:08:44.50,0:08:47.62,Default,,0000,0000,0000,,have been trained on "dirty" data.
Dialogue: 0,0:08:48.33,0:08:51.23,Default,,0000,0000,0000,,This is basically data\Nthat had been gathered
Dialogue: 0,0:08:51.25,0:08:55.43,Default,,0000,0000,0000,,during historical periods\Nof known racial bias
Dialogue: 0,0:08:55.46,0:08:57.71,Default,,0000,0000,0000,,and nontransparent police practices.
Dialogue: 0,0:08:58.54,0:09:02.60,Default,,0000,0000,0000,,Because these technologies\Nare being trained with dirty data,
Dialogue: 0,0:09:02.62,0:09:04.06,Default,,0000,0000,0000,,they're not objective,
Dialogue: 0,0:09:04.08,0:09:08.60,Default,,0000,0000,0000,,and their outcomes are only\Namplifying and perpetuating
Dialogue: 0,0:09:08.62,0:09:10.25,Default,,0000,0000,0000,,police bias and error.
Dialogue: 0,0:09:13.17,0:09:16.31,Default,,0000,0000,0000,,So I think we are faced\Nwith a fundamental problem
Dialogue: 0,0:09:16.33,0:09:17.98,Default,,0000,0000,0000,,in our society.
Dialogue: 0,0:09:18.00,0:09:22.79,Default,,0000,0000,0000,,We are starting to trust technologies\Nwhen it comes to profiling human beings.
Dialogue: 0,0:09:23.75,0:09:26.27,Default,,0000,0000,0000,,We know that in profiling humans,
Dialogue: 0,0:09:26.29,0:09:29.10,Default,,0000,0000,0000,,these technologies\Nare always going to be biased
Dialogue: 0,0:09:29.12,0:09:31.85,Default,,0000,0000,0000,,and are never really going to be accurate.
Dialogue: 0,0:09:31.88,0:09:34.81,Default,,0000,0000,0000,,So what we need now\Nis actually a political solution.
Dialogue: 0,0:09:34.83,0:09:39.54,Default,,0000,0000,0000,,We need governments to recognize\Nthat our data rights are our human rights.
Dialogue: 0,0:09:40.29,0:09:44.38,Default,,0000,0000,0000,,(Applause and cheers)
Dialogue: 0,0:09:47.83,0:09:51.92,Default,,0000,0000,0000,,Until this happens, we cannot hope\Nfor a more just future.
Dialogue: 0,0:09:52.75,0:09:55.48,Default,,0000,0000,0000,,I worry that my daughters\Nare going to be exposed
Dialogue: 0,0:09:55.50,0:09:59.23,Default,,0000,0000,0000,,to all sorts of algorithmic\Ndiscrimination and error.
Dialogue: 0,0:09:59.25,0:10:01.64,Default,,0000,0000,0000,,You see, the difference\Nbetween me and my daughters
Dialogue: 0,0:10:01.67,0:10:04.85,Default,,0000,0000,0000,,is that there's no public record\Nout there of my childhood.
Dialogue: 0,0:10:04.88,0:10:08.89,Default,,0000,0000,0000,,There's certainly no database\Nof all the stupid things that I've done
Dialogue: 0,0:10:08.92,0:10:11.06,Default,,0000,0000,0000,,and thought when I was a teenager.
Dialogue: 0,0:10:11.08,0:10:12.58,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:10:13.83,0:10:16.58,Default,,0000,0000,0000,,But for my daughters\Nthis may be different.
Dialogue: 0,0:10:17.29,0:10:20.48,Default,,0000,0000,0000,,The data that is being collected\Nfrom them today
Dialogue: 0,0:10:20.50,0:10:24.31,Default,,0000,0000,0000,,may be used to judge them in the future
Dialogue: 0,0:10:24.33,0:10:27.29,Default,,0000,0000,0000,,and can come to prevent\Ntheir hopes and dreams.
Dialogue: 0,0:10:28.58,0:10:30.10,Default,,0000,0000,0000,,I think that it's time.
Dialogue: 0,0:10:30.12,0:10:31.56,Default,,0000,0000,0000,,It's time that we all step up.
Dialogue: 0,0:10:31.58,0:10:34.06,Default,,0000,0000,0000,,It's time that we start working together
Dialogue: 0,0:10:34.08,0:10:35.52,Default,,0000,0000,0000,,as individuals,
Dialogue: 0,0:10:35.54,0:10:38.06,Default,,0000,0000,0000,,as organizations and as institutions,
Dialogue: 0,0:10:38.08,0:10:41.18,Default,,0000,0000,0000,,and that we demand\Ngreater data justice for us
Dialogue: 0,0:10:41.21,0:10:42.60,Default,,0000,0000,0000,,and for our children
Dialogue: 0,0:10:42.62,0:10:44.14,Default,,0000,0000,0000,,before it's too late.
Dialogue: 0,0:10:44.17,0:10:45.43,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,0:10:45.46,0:10:46.88,Default,,0000,0000,0000,,(Applause)