Every day -- every week -- we agree to terms and conditions. And when we do this, we provide companies with the lawful right to do whatever they want with our data and with the data of our children ... which makes us vulnerable. How much data are we giving away about our children? And what are its implications?
I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in this question in 2015, when I suddenly realized that there were vast, almost unimaginable amounts of data traces that are being produced and collected about children. So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blanks.
Now you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.
For the very first time in history, we are tracking the individual data of children from long before they're born -- sometimes from the moment of conception, and then throughout their lives. You see, when parents decide to conceive, they go online to look for "ways to get pregnant," or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their baby on social media, they download pregnancy apps, or they consult Dr. Google for all sorts of things, like, you know -- for as least .... .... or abdominal cramps in early pregnancy. I know because I've done it, and many times.
And then, when the baby is born, they track everything: every feed, every life event, on different technologies. And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.
So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties. And these third parties shared information with 216 other organizations. Of these 216 other fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle, they were digital advertising companies, and there were also consumer credit reporting agencies. So let's get it right: ad companies and credit agencies may already have data points on little babies.
But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives. They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many, many other technologies.
So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide." Well, it matters.
It matters because today individuals are not only being tracked, they're also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual. And these technologies are used everywhere.
Banks use them to decide about loans. Insurance companies use them to decide about premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. Also the police and courts use them to determine whether one is a potential criminal, or is likely to recommit a crime.
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways.
To give you an example, in 2018 the "New York Times" published the news that the data that had been gathered through an online college-planning service -- one actually completed by millions of high school students across the US who were looking for a college program or a scholarship -- had been sold to educational data brokers.
Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles, together with the name of the kids, their home address and the contact details, to different companies, including trade and career institutions, student loan companies and student credit card companies.
To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them the list. So imagine how intimate and how intrusive that is for our kids. Well, educational data brokers are really just an example.
The truth is that our children are being profiled in ways that we cannot control, but that can significantly impact their chances in life. So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we? My answer is no.
As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives. Because they can't.
Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Our [...] predictions or digital practices cannot account for the unpredictability and complexity of human experience.
But on top of that, these technologies are always, always, in one way or another, biased. You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context, and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well. At the moment, we're seeing the first examples of algorithmic bias. And some of these examples are frankly terrifying.
This year, the AI Now Institute in New York published a report that revealed that the AI technologies that are being used for predictive policing have been trained on "dirty" data. This is basically data that had been gathered through historical periods of known racial bias and non-transparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetuating police bias and error.
So I think we're faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. So what we need now is actually a political solution. We need governments to recognize that our data rights are our human rights.

(Applause and cheers)

Until this happens, we cannot hope for a more just future.
I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. You see, the difference between me and my daughters is that there's no public record out there of my childhood. There's certainly no database of all the stupid things that I've done and thought when I was a teenager.

(Laughter)
But for my daughters, this may be different. The data that is being collected from them today may be used to judge them in the future, and can come to prevent their hopes and dreams. I think it's time. It's time that we all step up. It's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late. Thank you.

(Applause)