What tech companies know about your kids

  • 0:01 - 0:02
    Every day --
  • 0:02 - 0:03
    every week --
  • 0:03 - 0:05
    we agree to terms and conditions.
  • 0:05 - 0:07
    And when we do this,
  • 0:07 - 0:09
    we provide companies with the lawful right
  • 0:09 - 0:13
    to do whatever they want with our data
  • 0:13 - 0:15
    and with the data of our children.
  • 0:17 - 0:20
    Which makes us wonder:
  • 0:20 - 0:23
    how much of our children's data
    are we giving away,
  • 0:23 - 0:25
    and what are its implications?
  • 0:26 - 0:28
    I'm an anthropologist,
  • 0:28 - 0:30
    and I'm also the mother
    of two little girls.
  • 0:31 - 0:35
    And I started to become interested
    in this question in 2015
  • 0:35 - 0:36
    when I suddenly realized
  • 0:36 - 0:38
    that there were vast --
  • 0:38 - 0:41
    almost unimaginable amounts of data traces
  • 0:41 - 0:44
    that are being produced
    and collected about children.
  • 0:45 - 0:47
    So I launched a research project,
  • 0:47 - 0:49
    which is called Child Data Citizen,
  • 0:49 - 0:52
    and I aimed at filling in the blank.
  • 0:53 - 0:56
    Now you may think
    that I'm here to blame you
  • 0:56 - 0:59
    for posting photos
    of your children on social media,
  • 0:59 - 1:00
    but that's not really the point.
  • 1:01 - 1:04
    The problem is way bigger
    than so-called "sharenting."
  • 1:05 - 1:09
    This is about systems, not individuals.
  • 1:09 - 1:11
    You and your habits are not to blame.
  • 1:13 - 1:16
    For the very first time in history,
  • 1:16 - 1:18
    we are tracking the individual
    data of children
  • 1:18 - 1:20
    from long before they're born --
  • 1:20 - 1:23
    sometimes from the moment of conception,
  • 1:23 - 1:25
    and then throughout their lives.
  • 1:25 - 1:28
    You see, when parents decide to conceive,
  • 1:28 - 1:31
    they go online to look
    for "ways to get pregnant,"
  • 1:31 - 1:34
    or they download ovulation-tracking apps.
  • 1:35 - 1:38
    When they do get pregnant,
  • 1:38 - 1:41
    they post ultrasounds
    of their babies on social media,
  • 1:41 - 1:43
    they download pregnancy apps,
  • 1:43 - 1:47
    or they consult Dr. Google
    for all sorts of things,
  • 1:47 - 1:48
    like, you know --
  • 1:48 - 1:51
    for "miscarriage risk when flying"
  • 1:51 - 1:54
    or "abdominal cramps in early pregnancy."
  • 1:54 - 1:56
    I know because I've done it --
  • 1:56 - 1:57
    and many times.
  • 1:59 - 2:01
    And then when the baby is born
    they track every nap,
  • 2:01 - 2:02
    every feed,
  • 2:02 - 2:05
    every life event
    on different technologies.
  • 2:06 - 2:08
    And all of these technologies
  • 2:08 - 2:14
    transform the baby's most intimate
    behavioral and health data into profit
  • 2:14 - 2:16
    by sharing it with others.
  • 2:17 - 2:19
    So to give you an idea of how this works,
  • 2:19 - 2:24
    in 2019, the British Medical Journal
    published research that showed
  • 2:24 - 2:28
    that out of 24 mobile health apps,
  • 2:28 - 2:31
    19 shared information with third parties.
  • 2:32 - 2:38
    And these third parties shared information
    with 216 other organizations.
  • 2:39 - 2:42
    Of these 216 fourth parties,
  • 2:42 - 2:45
    only three belonged to the health sector.
  • 2:46 - 2:50
    The other companies that had access
    to that data were big tech companies
  • 2:50 - 2:54
    like Google, Facebook or Oracle;
  • 2:54 - 2:56
    they were digital advertising companies;
  • 2:56 - 3:00
    and there were also the consumer
    credit reporting agencies.
  • 3:01 - 3:03
    So you get it right:
  • 3:03 - 3:08
    ad companies and credit agencies may
    already have data points on little babies.
  • 3:09 - 3:12
    But mobile apps, web
    searches and social media
  • 3:12 - 3:15
    are really just the tip of the iceberg,
  • 3:15 - 3:18
    because children are being tracked
    by multiple technologies
  • 3:18 - 3:20
    in their everyday lives.
  • 3:20 - 3:24
    They're tracked by home technologies
    and virtual assistants in their homes.
  • 3:24 - 3:26
    They're tracked by educational platforms
  • 3:26 - 3:28
    and educational technologies
    in their schools.
  • 3:28 - 3:30
    They're tracked by online records
  • 3:30 - 3:32
    and online portals
    at their doctor's office.
  • 3:33 - 3:35
    They're tracked by their
    internet-connected toys,
  • 3:35 - 3:37
    their online games
  • 3:37 - 3:39
    and many, many, many,
    many other technologies.
  • 3:40 - 3:42
    So during my research,
  • 3:42 - 3:46
    a lot of parents came up to me
    and they were like, "So what?
  • 3:46 - 3:49
    Why does it matter if my children
    are being tracked?
  • 3:50 - 3:52
    We've got nothing to hide."
  • 3:53 - 3:54
    Well, it matters.
  • 3:55 - 4:01
    It matters because today individuals
    are not only being tracked,
  • 4:01 - 4:05
    they're also being profiled
    on the basis of their data traces.
  • 4:05 - 4:09
    Artificial intelligence and predictive
    analytics are being used
  • 4:09 - 4:13
    to harness as much data as possible
    of an individual life
  • 4:13 - 4:15
    from different sources:
  • 4:15 - 4:19
    family history, purchasing habits,
    social media comments.
  • 4:19 - 4:21
    And then they bring this data together
  • 4:21 - 4:24
    to make data-driven decisions
    about the individual.
  • 4:25 - 4:28
    And these technologies
    are used everywhere.
  • 4:28 - 4:30
    Banks use them to decide loans.
  • 4:31 - 4:33
    Insurance companies use them to decide premiums.
  • 4:34 - 4:37
    Recruiters and employers use them
  • 4:37 - 4:40
    to decide whether one
    is a good fit for a job or not.
  • 4:41 - 4:44
    Also the police and courts use them
  • 4:44 - 4:47
    to determine whether one
    is a potential criminal
  • 4:47 - 4:50
    or is likely to recommit a crime.
  • 4:53 - 4:57
    We have no knowledge or control
  • 4:57 - 5:00
    over the ways in which those who buy,
    sell and process our data
  • 5:00 - 5:03
    are profiling us and our children.
  • 5:04 - 5:08
    But these profiles can come to impact
    our rights in significant ways.
  • 5:09 - 5:14
    To give you an example,
  • 5:14 - 5:18
    in 2018 the "New York Times"
    published the news
  • 5:18 - 5:20
    that the data that had been gathered
  • 5:20 - 5:23
    through online
    college-planning services --
  • 5:23 - 5:28
    that are actually completed by millions
    of high school kids across the US
  • 5:28 - 5:31
    who are looking for a college
    program or a scholarship --
  • 5:31 - 5:35
    had been sold to educational data brokers.
  • 5:36 - 5:41
    Now, researchers at Fordham
    who studied educational data brokers
  • 5:41 - 5:47
    revealed that these companies
    profiled kids as young as two
  • 5:47 - 5:50
    on the basis of different categories:
  • 5:50 - 5:54
    ethnicity, religion, affluence,
  • 5:54 - 5:56
    social awkwardness
  • 5:56 - 5:58
    and many other random categories.
  • 5:59 - 6:04
    And then they sell these profiles
    together with the names of the kids,
  • 6:04 - 6:05
    their home address
  • 6:05 - 6:07
    and the contact details
  • 6:07 - 6:09
    to different companies,
  • 6:09 - 6:12
    including trade and career institutions,
  • 6:12 - 6:13
    student loan companies
  • 6:13 - 6:15
    and student credit card companies.
  • 6:17 - 6:18
    To push the boundaries,
  • 6:18 - 6:22
    the researchers at Fordham
    asked an educational data broker
  • 6:22 - 6:28
    to provide them with a list
    of 14- to 15-year-old girls
  • 6:28 - 6:31
    who were interested
    in family planning services.
  • 6:32 - 6:35
    The data broker agreed
    to provide them the list.
  • 6:35 - 6:40
    So imagine how intimate
    and how intrusive that is for our kids.
  • 6:41 - 6:45
    But educational data brokers
    are really just an example.
  • 6:45 - 6:50
    The truth is that our children are being
    profiled in ways that we cannot control
  • 6:50 - 6:53
    but that can significantly impact
    their chances in life.
  • 6:54 - 6:58
    So we need to ask ourselves:
  • 6:58 - 7:02
    can we trust these technologies
    when it comes to profiling our children?
  • 7:02 - 7:04
    Can we?
  • 7:06 - 7:07
    My answer is no.
  • 7:08 - 7:09
    As an anthropologist,
  • 7:09 - 7:13
    I believe that artificial intelligence
    and predictive analytics can be great
  • 7:13 - 7:15
    to predict the course of a disease
  • 7:15 - 7:17
    or to fight climate change.
  • 7:18 - 7:20
    But we need to abandon the belief
  • 7:20 - 7:23
    that these technologies
    can objectively profile humans
  • 7:23 - 7:27
    and that we can rely on them
    to make data-driven decisions
  • 7:27 - 7:28
    about individual lives.
  • 7:28 - 7:31
    Because they can't profile humans.
  • 7:31 - 7:34
    Data traces are not
    the mirror of who we are.
  • 7:34 - 7:36
    Humans think one thing
    and say the opposite,
  • 7:36 - 7:39
    feel one way and act differently.
  • 7:39 - 7:42
    Algorithmic predictions
    or our digital practices
  • 7:42 - 7:47
    cannot account for the unpredictability
    and complexity of human experience.
  • 7:48 - 7:50
    But on top of that,
  • 7:50 - 7:53
    these technologies are always --
  • 7:53 - 7:54
    always --
  • 7:54 - 7:56
    in one way or another, biased.
  • 7:57 - 8:02
    You see, algorithms are by definition
    sets of rules or steps
  • 8:02 - 8:06
    that have been designed to achieve
    a specific result, OK?
  • 8:07 - 8:09
    But these sets of rules or steps
    cannot be objective
  • 8:09 - 8:12
    because they've been designed
    by human beings
  • 8:12 - 8:13
    within a specific cultural context,
  • 8:13 - 8:16
    and are shaped by specific
    cultural values.
  • 8:17 - 8:19
    So when machines learn,
  • 8:19 - 8:22
    they learn from biased algorithms
  • 8:22 - 8:25
    and they often learn from biased
    databases as well.
  • 8:26 - 8:29
    At the moment, we're seeing the first
    examples of algorithmic bias.
  • 8:30 - 8:33
    And some of these examples
    are frankly terrifying.
  • 8:35 - 8:39
    This year, the AI Now Institute
    in New York published a report
  • 8:39 - 8:41
    that revealed that the AI technologies
  • 8:41 - 8:45
    that are being used
    for predictive policing
  • 8:45 - 8:48
    have been trained on "dirty" data.
  • 8:48 - 8:52
    This is basically data
    that had been gathered
  • 8:52 - 8:56
    during historical periods
    of known racial bias
  • 8:56 - 8:58
    and nontransparent police practices.
  • 8:59 - 9:03
    Because these technologies
    are being trained with dirty data,
  • 9:03 - 9:04
    they're not objective,
  • 9:04 - 9:09
    and their outcomes are only
    amplifying and perpetuating
  • 9:09 - 9:11
    police bias and error.
  • 9:13 - 9:16
    So I think we're faced
    with a fundamental problem
  • 9:16 - 9:18
    in our society.
  • 9:18 - 9:23
    We are starting to trust technologies
    when it comes to profiling human beings.
  • 9:24 - 9:26
    We know that in profiling humans,
  • 9:26 - 9:29
    these technologies are always
    going to be biased
  • 9:29 - 9:32
    and are never really going to be accurate.
  • 9:32 - 9:34
    So what we need now
    is actually a political solution.
  • 9:35 - 9:40
    We need governments to recognize
    that our data rights are our human rights.
  • 9:40 - 9:44
    (Applause and cheers)
  • 9:48 - 9:52
    Until this happens, we cannot hope
    for a more just future.
  • 9:53 - 9:56
    I worry that my daughters
    are going to be exposed
  • 9:56 - 9:59
    to all sorts of algorithmic
    discrimination and error.
  • 9:59 - 10:02
    You see, the difference between me
    and my daughters
  • 10:02 - 10:04
    is that there's no public record
    out there of my childhood.
  • 10:05 - 10:09
    There's certainly no database
    of all the stupid things that I've done
  • 10:09 - 10:11
    and thought when I was a teenager.
  • 10:11 - 10:13
    (Laughter)
  • 10:14 - 10:17
    But for my daughters
    this may be different.
  • 10:17 - 10:21
    The data that is being collected
    from them today
  • 10:21 - 10:24
    may be used to judge them in the future,
  • 10:24 - 10:28
    and can come to stand in the way
    of their hopes and dreams.
  • 10:29 - 10:30
    I think that it's time.
  • 10:30 - 10:31
    It's time that we all step up.
  • 10:31 - 10:34
    It's time that we start working together
  • 10:34 - 10:35
    as individuals,
  • 10:35 - 10:37
    as organizations
  • 10:37 - 10:38
    and as institutions,
  • 10:38 - 10:41
    and that we demand greater
    data justice for us
  • 10:41 - 10:43
    and for our children
  • 10:43 - 10:44
    before it's too late.
  • 10:44 - 10:45
    Thank you.
  • 10:45 - 10:47
    (Applause)
Title:
What tech companies know about your kids
Speaker:
Veronica Barassi
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
11:01
