What tech companies know about your kids

Every day, every week, we agree to terms and conditions. And when we do this, we provide companies with the lawful right to do whatever they want with our data and with the data of our children. Which makes us wonder: how much data are we giving away about children, and what are its implications?

I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in this question in 2015 when I suddenly realized that there were vast -- almost unimaginable amounts of data traces that are being produced and collected about children. So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blank.

Now you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.

For the very first time in history, we are tracking the individual data of children from long before they're born -- sometimes from the moment of conception, and then throughout their lives. You see, when parents decide to conceive, they go online to look for "ways to get pregnant," or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps or they consult Dr. Google for all sorts of things, like, you know -- for "miscarriage risk when flying" or "abdominal cramps in early pregnancy." I know because I've done it -- and many times. And then, when the baby is born, they track every nap, every feed, every life event on different technologies. And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.

So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties. And these third parties shared information with 216 other organizations. Of these 216 other fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle; they were digital advertising companies; and there was also a consumer credit reporting agency. So, you got that right: ad companies and credit agencies may already have data points on little babies. But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives. They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many, many, many other technologies.

So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide." Well, it matters. It matters because today individuals are not only being tracked, they're also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible about an individual's life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual. And these technologies are used everywhere. Banks use them to decide loans. Insurers use them to decide premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. The police and courts also use them to determine whether one is a potential criminal or is likely to recommit a crime.

We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways.

To give you an example, in 2018 the "New York Times" published the news that the data that had been gathered through online college-planning services -- which are actually completed by millions of high school kids across the US who are looking for a college program or a scholarship -- had been sold to educational data brokers. Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles, together with the name of the kid, their home address and their contact details, to different companies, including trade and career institutions, student loan companies and student credit card companies. To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them with the list. So imagine how intimate and how intrusive that is for our kids. But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control but that can significantly impact their chances in life.

So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we? My answer is no. As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives. Because they can't profile humans. Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Algorithmic predictions or our digital practices cannot account for the unpredictability and complexity of human experience.

But on top of that, these technologies are always -- always -- in one way or another, biased. You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well.

At the moment, we're seeing the first examples of algorithmic bias. And some of these examples are frankly terrifying. This year, the AI Now Institute in New York published a report that revealed that the AI technologies that are being used for predictive policing have been trained on "dirty" data. This is basically data that had been gathered during historical periods of known racial bias and nontransparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetuating police bias and error.

So I think we are faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. So what we need now is actually a political solution. We need governments to recognize that our data rights are our human rights.

(Applause and cheers)

Until this happens, we cannot hope for a more just future.

I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. You see, the difference between me and my daughters is that there's no public record out there of my childhood. There's certainly no database of all the stupid things that I've done and thought when I was a teenager.

(Laughter)

But for my daughters this may be different. The data that is being collected from them today may be used to judge them in the future and can come to prevent them from realizing their hopes and dreams.

I think that it's time. It's time that we all step up. It's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late.

Thank you.

(Applause)