When technology can read minds, how will we protect our privacy?
-
0:01 - 0:06In the months following
the 2009 presidential election in Iran, -
0:06 - 0:09protests erupted across the country.
-
0:10 - 0:13The Iranian government
violently suppressed -
0:13 - 0:17what came to be known
as the Iranian Green Movement, -
0:17 - 0:19even blocking mobile signals
-
0:19 - 0:21to cut off communication
between the protesters. -
0:22 - 0:27My parents, who emigrated
to the United States in the late 1960s, -
0:27 - 0:29spend substantial time there,
-
0:29 - 0:32where all of my large,
extended family live. -
0:33 - 0:36When I would call my family in Tehran
-
0:36 - 0:39during some of the most violent
crackdowns of the protest, -
0:39 - 0:43none of them dared discuss
with me what was happening. -
0:43 - 0:47They or I knew to quickly steer
the conversation to other topics. -
0:47 - 0:51All of us understood
what the consequences could be -
0:51 - 0:53of a perceived dissident action.
-
0:54 - 0:58But I still wish I could have known
what they were thinking -
0:58 - 0:59or what they were feeling.
-
1:00 - 1:02What if I could have?
-
1:02 - 1:03Or more frighteningly,
-
1:03 - 1:06what if the Iranian government could have?
-
1:07 - 1:10Would they have arrested them
based on what their brains revealed? -
1:11 - 1:14That day may be closer than you think.
-
1:15 - 1:18With our growing capabilities
in neuroscience, artificial intelligence -
1:18 - 1:20and machine learning,
-
1:20 - 1:24we may soon know a lot more
about what's happening in the human brain. -
1:25 - 1:28As a bioethicist, a lawyer, a philosopher
-
1:28 - 1:30and an Iranian-American,
-
1:30 - 1:34I'm deeply concerned
about what this means for our freedoms -
1:34 - 1:36and what kinds of protections we need.
-
1:37 - 1:40I believe we need
a right to cognitive liberty, -
1:40 - 1:43as a human right
that needs to be protected. -
1:44 - 1:46If not, our freedom of thought,
-
1:46 - 1:49access and control over our own brains
-
1:49 - 1:52and our mental privacy will be threatened.
-
1:54 - 1:55Consider this:
-
1:55 - 1:58the average person thinks
thousands of thoughts each day. -
1:59 - 2:00As a thought takes form,
-
2:00 - 2:05like a math calculation
or a number, a word, -
2:05 - 2:08neurons are interacting in the brain,
-
2:08 - 2:11creating a minuscule electrical discharge.
-
2:12 - 2:15When you have a dominant
mental state, like relaxation, -
2:15 - 2:19hundreds of thousands of neurons
are firing in the brain, -
2:19 - 2:24creating concurrent electrical discharges
in characteristic patterns -
2:24 - 2:28that can be measured
with electroencephalography, or EEG. -
2:29 - 2:31In fact, that's what
you're seeing right now. -
2:32 - 2:36You're seeing my brain activity
that was recorded in real time -
2:36 - 2:39with a simple device
that was worn on my head. -
2:40 - 2:45What you're seeing is my brain activity
when I was relaxed and curious. -
2:46 - 2:48To share this information with you,
-
2:48 - 2:51I wore one of the early
consumer-based EEG devices -
2:51 - 2:52like this one,
-
2:52 - 2:56which recorded the electrical
activity in my brain in real time. -
2:57 - 3:01It's not unlike the fitness trackers
that some of you may be wearing -
3:01 - 3:04to measure your heart rate
or the steps that you've taken, -
3:04 - 3:06or even your sleep activity.
-
3:07 - 3:11It's hardly the most sophisticated
neuroimaging technique on the market. -
3:12 - 3:14But it's already the most portable
-
3:14 - 3:17and the most likely to impact
our everyday lives. -
3:18 - 3:19This is extraordinary.
-
3:19 - 3:22Through a simple, wearable device,
-
3:22 - 3:26we can literally see
inside the human brain -
3:26 - 3:32and learn aspects of our mental landscape
without ever uttering a word. -
3:33 - 3:36While we can't reliably decode
complex thoughts just yet, -
3:37 - 3:39we can already gauge a person's mood,
-
3:39 - 3:42and with the help
of artificial intelligence, -
3:42 - 3:46we can even decode
some single-digit numbers -
3:46 - 3:51or shapes or simple words
that a person is thinking -
3:51 - 3:53or hearing, or seeing.
-
3:54 - 3:58Despite some inherent limitations in EEG,
-
3:58 - 4:02I think it's safe to say
that with our advances in technology, -
4:02 - 4:06more and more of what's happening
in the human brain -
4:06 - 4:09can and will be decoded over time.
-
4:09 - 4:12Already, using one of these devices,
-
4:12 - 4:15an epileptic can know
they're going to have a seizure -
4:15 - 4:17before it happens.
-
4:17 - 4:21A paraplegic can type on a computer
with their thoughts alone. -
4:22 - 4:27A US-based company has developed
a technology to embed these sensors -
4:27 - 4:29into the headrest of automobiles
-
4:29 - 4:31so they can track driver concentration,
-
4:31 - 4:34distraction and cognitive load
while driving. -
4:35 - 4:39Nissan, insurance companies
and AAA have all taken note. -
4:40 - 4:44You could even watch this
choose-your-own-adventure movie -
4:44 - 4:49"The Moment," which, with an EEG headset,
-
4:49 - 4:53changes the movie
based on your brain-based reactions, -
4:53 - 4:57giving you a different ending
every time your attention wanes. -
4:59 - 5:02This may all sound great,
-
5:02 - 5:04and as a bioethicist,
-
5:04 - 5:08I am a huge proponent of empowering people
-
5:08 - 5:10to take charge of their own
health and well-being -
5:10 - 5:13by giving them access
to information about themselves, -
5:13 - 5:16including this incredible
new brain-decoding technology. -
5:18 - 5:19But I worry.
-
5:20 - 5:24I worry that we will voluntarily
or involuntarily give up -
5:25 - 5:29our last bastion of freedom,
our mental privacy. -
5:29 - 5:32That we will trade our brain activity
-
5:32 - 5:35for rebates or discounts on insurance,
-
5:36 - 5:39or free access
to social-media accounts ... -
5:40 - 5:42or even to keep our jobs.
-
5:43 - 5:45In fact, in China,
-
5:46 - 5:52the train drivers on
the Beijing-Shanghai high-speed rail, -
5:52 - 5:55the busiest of its kind in the world,
-
5:55 - 5:57are required to wear EEG devices
-
5:57 - 6:00to monitor their brain activity
while driving. -
6:00 - 6:02According to some news sources,
-
6:02 - 6:05in government-run factories in China,
-
6:05 - 6:10the workers are required to wear
EEG sensors to monitor their productivity -
6:10 - 6:13and their emotional state at work.
-
6:13 - 6:16Workers are even sent home
-
6:16 - 6:20if their brains show less-than-stellar
concentration on their jobs, -
6:20 - 6:22or emotional agitation.
-
6:23 - 6:25It's not going to happen tomorrow,
-
6:25 - 6:28but we're headed to a world
of brain transparency. -
6:29 - 6:32And I don't think people understand
that that could change everything. -
6:32 - 6:36Everything from our definitions
of data privacy to our laws, -
6:36 - 6:38to our ideas about freedom.
-
6:39 - 6:42In fact, in my lab at Duke University,
-
6:42 - 6:45we recently conducted a nationwide study
in the United States -
6:45 - 6:47to see if people appreciated
-
6:47 - 6:49the sensitivity
of their brain information. -
6:50 - 6:54We asked people to rate
their perceived sensitivity -
6:54 - 6:56of 33 different kinds of information,
-
6:56 - 6:58from their social security numbers
-
6:58 - 7:01to the content
of their phone conversations, -
7:01 - 7:03their relationship history,
-
7:03 - 7:05their emotions, their anxiety,
-
7:05 - 7:07the mental images in their mind
-
7:07 - 7:09and the thoughts in their mind.
-
7:09 - 7:15Shockingly, people rated their social
security number as far more sensitive -
7:15 - 7:17than any other kind of information,
-
7:17 - 7:19including their brain data.
-
7:20 - 7:24I think this is because
people don't yet understand -
7:24 - 7:28or believe the implications
of this new brain-decoding technology. -
7:29 - 7:32After all, if we can know
the inner workings of the human brain, -
7:32 - 7:35our social security numbers
are the least of our worries. -
7:35 - 7:36(Laughter)
-
7:36 - 7:37Think about it.
-
7:37 - 7:40In a world of total brain transparency,
-
7:40 - 7:42who would dare have
a politically dissident thought? -
7:43 - 7:45Or a creative one?
-
7:46 - 7:49I worry that people will self-censor
-
7:49 - 7:52in fear of being ostracized by society,
-
7:52 - 7:56or that people will lose their jobs
because of their waning attention -
7:56 - 7:58or emotional instability,
-
7:58 - 8:02or because they're contemplating
collective action against their employers. -
8:02 - 8:06That coming out
will no longer be an option, -
8:06 - 8:11because people's brains will long ago
have revealed their sexual orientation, -
8:11 - 8:13their political ideology
-
8:13 - 8:15or their religious preferences,
-
8:15 - 8:18well before they were ready
to consciously share that information -
8:18 - 8:19with other people.
-
8:20 - 8:24I worry about the ability of our laws
to keep up with technological change. -
8:25 - 8:27Take the First Amendment
of the US Constitution, -
8:27 - 8:29which protects freedom of speech.
-
8:29 - 8:31Does it also protect freedom of thought?
-
8:32 - 8:36And if so, does that mean that we're free
to alter our thoughts however we want? -
8:36 - 8:41Or can the government or society tell us
what we can do with our own brains? -
8:42 - 8:45Can the NSA spy on our brains
using these new mobile devices? -
8:46 - 8:50Can the companies that collect
the brain data through their applications -
8:50 - 8:52sell this information to third parties?
-
8:53 - 8:56Right now, no laws prevent them
from doing so. -
8:57 - 8:59It could be even more problematic
-
8:59 - 9:02in countries that don't share
the same freedoms -
9:02 - 9:04enjoyed by people in the United States.
-
9:05 - 9:09What would've happened during
the Iranian Green Movement -
9:09 - 9:13if the government had been
monitoring my family's brain activity, -
9:13 - 9:17and had believed them
to be sympathetic to the protesters? -
9:18 - 9:21Is it so far-fetched to imagine a society
-
9:21 - 9:24in which people are arrested
based on their thoughts -
9:24 - 9:25of committing a crime,
-
9:25 - 9:30like in the science-fiction
dystopian society in "Minority Report"? -
9:30 - 9:35Already, in the United States, in Indiana,
-
9:35 - 9:40an 18-year-old was charged
with attempting to intimidate his school -
9:40 - 9:43by posting a video of himself
shooting people in the hallways ... -
9:44 - 9:47Except the people were zombies
-
9:47 - 9:52and the video was of him playing
an augmented-reality video game, -
9:52 - 9:57all interpreted to be a mental projection
of his subjective intent. -
9:58 - 10:03This is exactly why our brains
need special protection. -
10:03 - 10:07If our brains are just as subject
to data tracking and aggregation -
10:07 - 10:09as our financial records and transactions,
-
10:09 - 10:14if our brains can be hacked
and tracked like our online activities, -
10:14 - 10:16our mobile phones and applications,
-
10:16 - 10:20then we're facing a dangerous threat
to our collective humanity. -
10:21 - 10:23Before you panic,
-
10:24 - 10:27I believe that there are solutions
to these concerns, -
10:27 - 10:30but we have to start by focusing
on the right things. -
10:31 - 10:34When it comes to privacy
protections in general, -
10:34 - 10:35I think we're fighting a losing battle
-
10:35 - 10:38by trying to restrict
the flow of information. -
10:38 - 10:42Instead, we should be focusing
on securing rights and remedies -
10:42 - 10:45against the misuse of our information.
-
10:45 - 10:49If people had the right to decide
how their information was shared, -
10:49 - 10:52and more importantly, have legal redress
-
10:52 - 10:54if their information
was misused against them, -
10:54 - 10:57say to discriminate against them
in an employment setting -
10:57 - 11:00or in health care or education,
-
11:00 - 11:02this would go a long way to build trust.
-
11:03 - 11:05In fact, in some instances,
-
11:05 - 11:08we want to be sharing more
of our personal information. -
11:09 - 11:12Studying aggregated information
can tell us so much -
11:12 - 11:15about our health and our well-being,
-
11:15 - 11:18but to be able to safely share
our information, -
11:18 - 11:21we need special protections
for mental privacy. -
11:22 - 11:25This is why we need
a right to cognitive liberty. -
11:26 - 11:30This right would secure for us
our freedom of thought and rumination, -
11:30 - 11:32our freedom of self-determination,
-
11:32 - 11:37and it would ensure that we have
the right to consent to or refuse -
11:37 - 11:39access and alteration
of our brains by others. -
11:40 - 11:42This right could be recognized
-
11:42 - 11:45as part of the Universal Declaration
of Human Rights, -
11:45 - 11:47which has established mechanisms
-
11:47 - 11:50for the enforcement
of these kinds of social rights. -
11:52 - 11:54During the Iranian Green Movement,
-
11:54 - 11:59the protesters used the internet
and good old-fashioned word of mouth -
11:59 - 12:01to coordinate their marches.
-
12:02 - 12:05And some of the most oppressive
restrictions in Iran -
12:05 - 12:07were lifted as a result.
-
12:08 - 12:12But what if the Iranian government
had used brain surveillance -
12:12 - 12:15to detect and prevent the protest?
-
12:17 - 12:20Would the world have ever heard
the protesters' cries? -
12:22 - 12:27The time has come for us to call
for a cognitive liberty revolution. -
12:28 - 12:31To make sure that we responsibly
advance technology -
12:31 - 12:34that could enable us to embrace the future
-
12:34 - 12:41while fiercely protecting all of us
from any person, company or government -
12:41 - 12:46that attempts to unlawfully access
or alter our innermost lives. -
12:47 - 12:48Thank you.
-
12:48 - 12:51(Applause)
Title: When technology can read minds, how will we protect our privacy?
Speaker: Nita A. Farahany
Description: Tech that can decode your brain activity and reveal what you're thinking and feeling is on the horizon, says legal scholar and ethicist Nita Farahany. What will it mean for our already violated sense of privacy? In a cautionary talk, Farahany warns of a society where people are arrested for merely thinking about committing a crime (like in "Minority Report") and private interests sell our brain data -- and makes the case for a right to cognitive liberty that protects our freedom of thought and self-determination.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 13:04