What tech companies know about your kids
Every day, every week, we agree to terms and conditions. And when we do this, we provide companies with the lawful right to do whatever they want with our data -- and with the data of our children.

Which makes us wonder: how much of our children's data are we giving away, and what are its implications?
I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in this question in 2015, when I suddenly realized that there were vast -- almost unimaginable -- amounts of data traces being produced and collected about children. So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blanks.
Now, you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.
For the very first time in history, we are tracking the individual data of children from long before they're born -- sometimes from the moment of conception, and then throughout their lives.
You see, when parents decide to conceive, they go online to look for "ways to get pregnant," or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps or they consult Dr. Google for all sorts of things, like, you know, "miscarriage risk when flying" or "abdominal cramps in early pregnancy." I know because I've done it -- many times.

And then, when the baby is born, they track every nap, every feed, every life event on different technologies. And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.
So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties. And these third parties shared information with 216 other organizations. Of these 216 fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle; they were digital advertising companies; and there was also a consumer credit reporting agency.

So you got that right: ad companies and credit agencies may already have data points on little babies.
But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives. They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many, many, many other technologies.
So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide."
Well, it matters. It matters because today individuals are not only being tracked, they're also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual. And these technologies are used everywhere.
Banks use them to decide loans. Insurance companies use them to decide premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. The police and courts also use them to determine whether one is a potential criminal or is likely to recommit a crime.
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways.
To give you an example, in 2018 The New York Times published the news that the data that had been gathered through online college-planning services -- which are completed by millions of high school kids across the US who are looking for a college program or a scholarship -- had been sold to educational data brokers.

Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles, together with the name of the kid, their home address and the contact details, to different companies, including trade and career institutions, student loan companies and student credit card companies.
To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them the list. So imagine how intimate and how intrusive that is for our kids.

But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control but that can significantly impact their chances in life.
So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we?

My answer is no.
As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives. Because they can't profile humans. Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Algorithmic predictions of our digital practices cannot account for the unpredictability and complexity of human experience.
But on top of that, these technologies are always -- always -- in one way or another, biased. You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well.
At the moment, we're seeing the first examples of algorithmic bias. And some of these examples are frankly terrifying. This year, the AI Now Institute in New York published a report that revealed that the AI technologies that are being used for predictive policing have been trained on "dirty" data. This is basically data that had been gathered during historical periods of known racial bias and nontransparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetuating police bias and error.
So I think we are faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. So what we need now is actually a political solution. We need governments to recognize that our data rights are our human rights.
(Applause and cheers)
Until this happens, we cannot hope for a more just future. I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. You see, the difference between me and my daughters is that there's no public record out there of my childhood. There's certainly no database of all the stupid things that I've done and thought when I was a teenager.

(Laughter)

But for my daughters this may be different. The data that is being collected from them today may be used to judge them in the future and could come to stand in the way of their hopes and dreams.
I think it's time. It's time that we all step up. It's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late.

Thank you.

(Applause)
Title: What tech companies know about your kids
Speaker: Veronica Barassi
Description: The digital platforms you and your family use every day -- from online games to education apps and medical portals -- may be collecting and selling your children's data, says anthropologist Veronica Barassi. Sharing her eye-opening research, Barassi urges parents to look twice at digital terms and conditions instead of blindly accepting them -- and to demand protections that ensure their kids' data doesn't skew their future.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 11:01