When technology can read minds, how will we protect our privacy?

  • 0:01 - 0:06
    In the months following
    the 2009 presidential election in Iran,
  • 0:06 - 0:09
    protests erupted across the country.
  • 0:10 - 0:13
    The Iranian government
    violently suppressed
  • 0:13 - 0:17
    what came to be known
    as the Iranian Green Movement,
  • 0:17 - 0:19
    even blocking mobile signals
  • 0:19 - 0:21
    to cut off communication
    between the protesters.
  • 0:22 - 0:27
    My parents, who emigrated
    to the United States in the late 1960s,
  • 0:27 - 0:29
    spend substantial time there,
  • 0:29 - 0:32
    where all of my large,
    extended family live.
  • 0:33 - 0:36
    When I would call my family in Tehran
  • 0:36 - 0:39
    during some of the most violent
    crackdowns of the protest,
  • 0:39 - 0:43
    none of them dared discuss
    with me what was happening.
  • 0:43 - 0:47
    They or I knew to quickly steer
    the conversation to other topics.
  • 0:47 - 0:51
    All of us understood
    what the consequences could be
  • 0:51 - 0:53
    of a perceived dissident action.
  • 0:54 - 0:58
    But I still wish I could have known
    what they were thinking
  • 0:58 - 0:59
    or what they were feeling.
  • 1:00 - 1:02
    What if I could have?
  • 1:02 - 1:03
    Or more frighteningly,
  • 1:03 - 1:06
    what if the Iranian government could have?
  • 1:07 - 1:10
    Would they have arrested them
    based on what their brains revealed?
  • 1:11 - 1:14
    That day may be closer than you think.
  • 1:15 - 1:18
    With our growing capabilities
    in neuroscience, artificial intelligence
  • 1:18 - 1:20
    and machine learning,
  • 1:20 - 1:24
    we may soon know a lot more
    of what's happening in the human brain.
  • 1:25 - 1:28
    As a bioethicist, a lawyer, a philosopher
  • 1:28 - 1:30
    and an Iranian-American,
  • 1:30 - 1:34
    I'm deeply concerned
    about what this means for our freedoms
  • 1:34 - 1:36
    and what kinds of protections we need.
  • 1:37 - 1:40
    I believe we need
    a right to cognitive liberty,
  • 1:40 - 1:43
    as a human right
    that needs to be protected.
  • 1:44 - 1:46
    If not, our freedom of thought,
  • 1:46 - 1:49
    access and control over our own brains
  • 1:49 - 1:52
    and our mental privacy will be threatened.
  • 1:54 - 1:55
    Consider this:
  • 1:55 - 1:58
    the average person thinks
    thousands of thoughts each day.
  • 1:59 - 2:00
    As a thought takes form,
  • 2:00 - 2:05
    like a math calculation
    or a number, a word,
  • 2:05 - 2:08
    neurons are interacting in the brain,
  • 2:08 - 2:11
    creating a minuscule electrical discharge.
  • 2:12 - 2:15
    When you have a dominant
    mental state, like relaxation,
  • 2:15 - 2:19
    hundreds of thousands of neurons
    are firing in the brain,
  • 2:19 - 2:24
    creating concurrent electrical discharges
    in characteristic patterns
  • 2:24 - 2:28
    that can be measured
    with electroencephalography, or EEG.
  • 2:29 - 2:31
    In fact, that's what
    you're seeing right now.
  • 2:32 - 2:36
    You're seeing my brain activity
    that was recorded in real time
  • 2:36 - 2:39
    with a simple device
    that was worn on my head.
  • 2:40 - 2:45
    What you're seeing is my brain activity
    when I was relaxed and curious.
  • 2:46 - 2:48
    To share this information with you,
  • 2:48 - 2:51
    I wore one of the early
    consumer-based EEG devices
  • 2:51 - 2:52
    like this one,
  • 2:52 - 2:56
    which recorded the electrical
    activity in my brain in real time.
  • 2:57 - 3:01
    It's not unlike the fitness trackers
    that some of you may be wearing
  • 3:01 - 3:04
    to measure your heart rate
    or the steps that you've taken,
  • 3:04 - 3:06
    or even your sleep activity.
  • 3:07 - 3:11
    It's hardly the most sophisticated
    neuroimaging technique on the market.
  • 3:12 - 3:14
    But it's already the most portable
  • 3:14 - 3:17
    and the most likely to impact
    our everyday lives.
  • 3:18 - 3:19
    This is extraordinary.
  • 3:19 - 3:22
    Through a simple, wearable device,
  • 3:22 - 3:26
    we can literally see
    inside the human brain
  • 3:26 - 3:32
    and learn aspects of our mental landscape
    without ever uttering a word.
  • 3:33 - 3:36
    While we can't reliably decode
    complex thoughts just yet,
  • 3:37 - 3:39
    we can already gauge a person's mood,
  • 3:39 - 3:42
    and with the help
    of artificial intelligence,
  • 3:42 - 3:46
    we can even decode
    some single-digit numbers
  • 3:46 - 3:51
    or shapes or simple words
    that a person is thinking
  • 3:51 - 3:53
    or hearing, or seeing.
  • 3:54 - 3:58
    Despite some inherent limitations in EEG,
  • 3:58 - 4:02
    I think it's safe to say
    that with our advances in technology,
  • 4:02 - 4:06
    more and more of what's happening
    in the human brain
  • 4:06 - 4:09
    can and will be decoded over time.
  • 4:09 - 4:12
    Already, using one of these devices,
  • 4:12 - 4:15
    a person with epilepsy can know
    they're going to have a seizure
  • 4:15 - 4:17
    before it happens.
  • 4:17 - 4:21
    A paraplegic can type on a computer
    with their thoughts alone.
  • 4:22 - 4:27
    A US-based company has developed
    a technology to embed these sensors
  • 4:27 - 4:29
    into the headrests of automobiles
  • 4:29 - 4:31
    so they can track driver concentration,
  • 4:31 - 4:34
    distraction and cognitive load
    while driving.
  • 4:35 - 4:39
    Nissan, insurance companies
    and AAA have all taken note.
  • 4:40 - 4:44
    You could even watch this
    choose-your-own-adventure movie
  • 4:44 - 4:49
    "The Moment," which, with an EEG headset,
  • 4:49 - 4:53
    changes the movie
    based on your brain-based reactions,
  • 4:53 - 4:57
    giving you a different ending
    every time your attention wanes.
  • 4:59 - 5:02
    This may all sound great,
  • 5:02 - 5:04
    and as a bioethicist,
  • 5:04 - 5:08
    I am a huge proponent of empowering people
  • 5:08 - 5:10
    to take charge of their own
    health and well-being
  • 5:10 - 5:13
    by giving them access
    to information about themselves,
  • 5:13 - 5:16
    including this incredible
    new brain-decoding technology.
  • 5:18 - 5:19
    But I worry.
  • 5:20 - 5:24
    I worry that we will voluntarily
    or involuntarily give up
  • 5:25 - 5:29
    our last bastion of freedom,
    our mental privacy.
  • 5:29 - 5:32
    That we will trade our brain activity
  • 5:32 - 5:35
    for rebates or discounts on insurance,
  • 5:36 - 5:39
    or free access
    to social-media accounts ...
  • 5:40 - 5:42
    or even to keep our jobs.
  • 5:43 - 5:45
    In fact, in China,
  • 5:46 - 5:52
    the train drivers on
    the Beijing-Shanghai high-speed rail,
  • 5:52 - 5:55
    the busiest of its kind in the world,
  • 5:55 - 5:57
    are required to wear EEG devices
  • 5:57 - 6:00
    to monitor their brain activity
    while driving.
  • 6:00 - 6:02
    According to some news sources,
  • 6:02 - 6:05
    in government-run factories in China,
  • 6:05 - 6:10
    the workers are required to wear
    EEG sensors to monitor their productivity
  • 6:10 - 6:13
    and their emotional state at work.
  • 6:13 - 6:16
    Workers are even sent home
  • 6:16 - 6:20
    if their brains show less-than-stellar
    concentration on their jobs,
  • 6:20 - 6:22
    or emotional agitation.
  • 6:23 - 6:25
    It's not going to happen tomorrow,
  • 6:25 - 6:28
    but we're headed to a world
    of brain transparency.
  • 6:29 - 6:32
    And I don't think people understand
    that that could change everything.
  • 6:32 - 6:36
    Everything from our definitions
    of data privacy to our laws,
  • 6:36 - 6:38
    to our ideas about freedom.
  • 6:39 - 6:42
    In fact, in my lab at Duke University,
  • 6:42 - 6:45
    we recently conducted a nationwide study
    in the United States
  • 6:45 - 6:47
    to see if people appreciated
  • 6:47 - 6:49
    the sensitivity
    of their brain information.
  • 6:50 - 6:54
    We asked people to rate
    the perceived sensitivity
  • 6:54 - 6:56
    of 33 different kinds of information,
  • 6:56 - 6:58
    from their social security numbers
  • 6:58 - 7:01
    to the content
    of their phone conversations,
  • 7:01 - 7:03
    their relationship history,
  • 7:03 - 7:05
    their emotions, their anxiety,
  • 7:05 - 7:07
    the mental images in their mind
  • 7:07 - 7:09
    and the thoughts in their mind.
  • 7:09 - 7:15
    Shockingly, people rated their social
    security number as far more sensitive
  • 7:15 - 7:17
    than any other kind of information,
  • 7:17 - 7:19
    including their brain data.
  • 7:20 - 7:24
    I think this is because
    people don't yet understand
  • 7:24 - 7:28
    or believe the implications
    of this new brain-decoding technology.
  • 7:29 - 7:32
    After all, if we can know
    the inner workings of the human brain,
  • 7:32 - 7:35
    our social security numbers
    are the least of our worries.
  • 7:35 - 7:36
    (Laughter)
  • 7:36 - 7:37
    Think about it.
  • 7:37 - 7:40
    In a world of total brain transparency,
  • 7:40 - 7:42
    who would dare have
    a politically dissident thought?
  • 7:43 - 7:45
    Or a creative one?
  • 7:46 - 7:49
    I worry that people will self-censor
  • 7:49 - 7:52
    in fear of being ostracized by society,
  • 7:52 - 7:56
    or that people will lose their jobs
    because of their waning attention
  • 7:56 - 7:58
    or emotional instability,
  • 7:58 - 8:02
    or because they're contemplating
    collective action against their employers.
  • 8:02 - 8:06
    That coming out
    will no longer be an option,
  • 8:06 - 8:11
    because people's brains will long ago
    have revealed their sexual orientation,
  • 8:11 - 8:13
    their political ideology
  • 8:13 - 8:15
    or their religious preferences,
  • 8:15 - 8:18
    well before they were ready
    to consciously share that information
  • 8:18 - 8:19
    with other people.
  • 8:20 - 8:24
    I worry about the ability of our laws
    to keep up with technological change.
  • 8:25 - 8:27
    Take the First Amendment
    of the US Constitution,
  • 8:27 - 8:29
    which protects freedom of speech.
  • 8:29 - 8:31
    Does it also protect freedom of thought?
  • 8:32 - 8:36
    And if so, does that mean that we're free
    to alter our thoughts however we want?
  • 8:36 - 8:41
    Or can the government or society tell us
    what we can do with our own brains?
  • 8:42 - 8:45
    Can the NSA spy on our brains
    using these new mobile devices?
  • 8:46 - 8:50
    Can the companies that collect
    the brain data through their applications
  • 8:50 - 8:52
    sell this information to third parties?
  • 8:53 - 8:56
    Right now, no laws prevent them
    from doing so.
  • 8:57 - 8:59
    It could be even more problematic
  • 8:59 - 9:02
    in countries that don't share
    the same freedoms
  • 9:02 - 9:04
    enjoyed by people in the United States.
  • 9:05 - 9:09
    What would've happened during
    the Iranian Green Movement
  • 9:09 - 9:13
    if the government had been
    monitoring my family's brain activity,
  • 9:13 - 9:17
    and had believed them
    to be sympathetic to the protesters?
  • 9:18 - 9:21
    Is it so far-fetched to imagine a society
  • 9:21 - 9:24
    in which people are arrested
    based on their thoughts
  • 9:24 - 9:25
    of committing a crime,
  • 9:25 - 9:30
    like in the science-fiction
    dystopian society in "Minority Report"?
  • 9:30 - 9:35
    Already, in the United States, in Indiana,
  • 9:35 - 9:40
    an 18-year-old was charged
    with attempting to intimidate his school
  • 9:40 - 9:43
    by posting a video of himself
    shooting people in the hallways ...
  • 9:44 - 9:47
    Except the people were zombies
  • 9:47 - 9:52
    and the video was of him playing
    an augmented-reality video game,
  • 9:52 - 9:57
    all interpreted to be a mental projection
    of his subjective intent.
  • 9:58 - 10:03
    This is exactly why our brains
    need special protection.
  • 10:03 - 10:07
    If our brains are just as subject
    to data tracking and aggregation
  • 10:07 - 10:09
    as our financial records and transactions,
  • 10:09 - 10:14
    if our brains can be hacked
    and tracked like our online activities,
  • 10:14 - 10:16
    our mobile phones and applications,
  • 10:16 - 10:20
    then we're on the brink of a dangerous
    threat to our collective humanity.
  • 10:21 - 10:23
    Before you panic,
  • 10:24 - 10:27
    I believe that there are solutions
    to these concerns,
  • 10:27 - 10:30
    but we have to start by focusing
    on the right things.
  • 10:31 - 10:34
    When it comes to privacy
    protections in general,
  • 10:34 - 10:35
    I think we're fighting a losing battle
  • 10:35 - 10:38
    by trying to restrict
    the flow of information.
  • 10:38 - 10:42
    Instead, we should be focusing
    on securing rights and remedies
  • 10:42 - 10:45
    against the misuse of our information.
  • 10:45 - 10:49
    If people had the right to decide
    how their information was shared,
  • 10:49 - 10:52
    and more importantly, had legal redress
  • 10:52 - 10:54
    if their information
    was misused against them,
  • 10:54 - 10:57
    say to discriminate against them
    in an employment setting
  • 10:57 - 11:00
    or in health care or education,
  • 11:00 - 11:02
    this would go a long way to build trust.
  • 11:03 - 11:05
    In fact, in some instances,
  • 11:05 - 11:08
    we want to be sharing more
    of our personal information.
  • 11:09 - 11:12
    Studying aggregated information
    can tell us so much
  • 11:12 - 11:15
    about our health and our well-being,
  • 11:15 - 11:18
    but to be able to safely share
    our information,
  • 11:18 - 11:21
    we need special protections
    for mental privacy.
  • 11:22 - 11:25
    This is why we need
    a right to cognitive liberty.
  • 11:26 - 11:30
    This right would secure for us
    our freedom of thought and rumination,
  • 11:30 - 11:32
    our freedom of self-determination,
  • 11:32 - 11:37
    and it would ensure that we have
    the right to consent to or refuse
  • 11:37 - 11:39
    access and alteration
    of our brains by others.
  • 11:40 - 11:42
    This right could be recognized
  • 11:42 - 11:45
    as part of the Universal Declaration
    of Human Rights,
  • 11:45 - 11:47
    which has established mechanisms
  • 11:47 - 11:50
    for the enforcement
    of these kinds of social rights.
  • 11:52 - 11:54
    During the Iranian Green Movement,
  • 11:54 - 11:59
    the protesters used the internet
    and good old-fashioned word of mouth
  • 11:59 - 12:01
    to coordinate their marches.
  • 12:02 - 12:05
    And some of the most oppressive
    restrictions in Iran
  • 12:05 - 12:07
    were lifted as a result.
  • 12:08 - 12:12
    But what if the Iranian government
    had used brain surveillance
  • 12:12 - 12:15
    to detect and prevent the protest?
  • 12:17 - 12:20
    Would the world have ever heard
    the protesters' cries?
  • 12:22 - 12:27
    The time has come for us to call
    for a cognitive liberty revolution.
  • 12:28 - 12:31
    To make sure that we responsibly
    advance technology
  • 12:31 - 12:34
    that could enable us to embrace the future
  • 12:34 - 12:41
    while fiercely protecting all of us
    from any person, company or government
  • 12:41 - 12:46
    that attempts to unlawfully access
    or alter our innermost lives.
  • 12:47 - 12:48
    Thank you.
  • 12:48 - 12:51
    (Applause)
Title:
When technology can read minds, how will we protect our privacy?
Speaker:
Nita A. Farahany
Description:

Tech that can decode your brain activity and reveal what you're thinking and feeling is on the horizon, says legal scholar and ethicist Nita Farahany. What will it mean for our already violated sense of privacy? In a cautionary talk, Farahany warns of a society where people are arrested for merely thinking about committing a crime (like in "Minority Report") and private interests sell our brain data -- and makes the case for a right to cognitive liberty that protects our freedom of thought and self-determination.

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
13:04
