How a handful of tech companies control billions of minds every day
-
0:01 - 0:02I want you to imagine
-
0:03 - 0:04walking into a room,
-
0:05 - 0:08a control room with a bunch of people,
-
0:08 - 0:10a hundred people, hunched
over a desk with little dials, -
0:11 - 0:13and that that control room
-
0:14 - 0:17will shape the thoughts and feelings
-
0:17 - 0:19of a billion people.
-
0:21 - 0:22This might sound like science fiction,
-
0:23 - 0:26but this actually exists
-
0:26 - 0:27right now, today.
-
0:28 - 0:31I know because I used to be
in one of those control rooms. -
0:32 - 0:34I was a design ethicist at Google,
-
0:34 - 0:38where I studied how to ethically
steer people's thoughts. -
0:39 - 0:41Because what we don't talk about
is how the handful of people -
0:41 - 0:44working at a handful
of technology companies -
0:44 - 0:49through their choices will steer
what a billion people are thinking today. -
0:50 - 0:52Because when you pull out your phone
-
0:52 - 0:55and they design how this works
or what's on the feed, -
0:55 - 0:58it's scheduling little blocks
of time in our minds. -
0:59 - 1:02If you see a notification,
it schedules you to have thoughts -
1:02 - 1:04that maybe you didn't intend to have.
-
1:04 - 1:07If you swipe over that notification,
-
1:07 - 1:09it schedules you into spending
a little bit of time -
1:09 - 1:11getting sucked into something
-
1:11 - 1:14that maybe you didn't intend
to get sucked into. -
1:15 - 1:17When we talk about technology,
-
1:18 - 1:21we tend to talk about it
as this blue sky opportunity. -
1:21 - 1:22It could go any direction.
-
1:23 - 1:25And I want to get serious for a moment
-
1:25 - 1:28and tell you why it's going
in a very specific direction. -
1:29 - 1:31Because it's not evolving randomly.
-
1:32 - 1:34There's a hidden goal
driving the direction -
1:34 - 1:36of all of the technology we make,
-
1:36 - 1:39and that goal is the race
for our attention. -
1:41 - 1:44Because every news site --
-
1:44 - 1:46TED, elections, politicians,
-
1:46 - 1:48games, even meditation apps --
-
1:48 - 1:50have to compete for one thing,
-
1:51 - 1:53which is our attention,
-
1:53 - 1:55and there's only so much of it.
-
1:56 - 1:59And the best way to get people's attention
-
1:59 - 2:01is to know how someone's mind works.
-
2:02 - 2:04And there's a whole bunch
of persuasive techniques -
2:04 - 2:08that I learned in college at a lab
called the Persuasive Technology Lab -
2:08 - 2:09to get people's attention.
-
2:10 - 2:11A simple example is YouTube.
-
2:12 - 2:15YouTube wants to maximize
how much time you spend. -
2:15 - 2:16And so what do they do?
-
2:17 - 2:19They autoplay the next video.
-
2:20 - 2:22And let's say that works really well.
-
2:22 - 2:24They're getting a little bit
more of people's time. -
2:24 - 2:26Well, if you're Netflix,
you look at that and say, -
2:26 - 2:28well, that's shrinking my market share,
-
2:28 - 2:30so I'm going to autoplay the next episode.
-
2:31 - 2:33But then if you're Facebook,
-
2:33 - 2:35you say, that's shrinking
all of my market share, -
2:35 - 2:38so now I have to autoplay
all the videos in the newsfeed -
2:38 - 2:40before waiting for you to click play.
-
2:40 - 2:43So the internet is not evolving at random.
-
2:44 - 2:49The reason it feels
like it's sucking us in the way it is -
2:49 - 2:51is because of this race for attention.
-
2:51 - 2:53We know where this is going.
-
2:53 - 2:54Technology is not neutral,
-
2:55 - 2:59and it becomes this race
to the bottom of the brain stem -
2:59 - 3:01of who can go lower to get it.
-
3:02 - 3:04Let me give you an example of Snapchat.
-
3:04 - 3:08If you didn't know,
Snapchat is the number one way -
3:08 - 3:10that teenagers in
the United States communicate. -
3:10 - 3:14So if you're like me, and you use
text messages to communicate, -
3:14 - 3:16Snapchat is that for teenagers,
-
3:16 - 3:19and there's, like,
a hundred million of them that use it. -
3:19 - 3:21And they invented
a feature called Snapstreaks, -
3:21 - 3:23which shows the number of days in a row
-
3:23 - 3:26that two people have
communicated with each other. -
3:26 - 3:28In other words, what they just did
-
3:28 - 3:31is they gave two people
something they don't want to lose. -
3:32 - 3:35Because if you're a teenager,
and you have 150 days in a row, -
3:35 - 3:37you don't want that to go away.
-
3:37 - 3:42And so think of the little blocks of time
that that schedules in kids' minds. -
3:42 - 3:44This isn't theoretical:
when kids go on vacation, -
3:45 - 3:48it's been shown they give their passwords
to up to five other friends -
3:48 - 3:50to keep their Snapstreaks going,
-
3:50 - 3:52even when they can't do it.
-
3:52 - 3:54And they have, like, 30 of these things,
-
3:54 - 3:57and so they have to get through
taking photos of just pictures of walls -
3:57 - 4:00or ceilings just to get through their day.
-
4:01 - 4:04So it's not even like
they're having real conversations. -
4:04 - 4:06We have a temptation to think about this
-
4:06 - 4:09as, oh, they're just using Snapchat
-
4:09 - 4:11the way we used to
gossip on the telephone. -
4:11 - 4:12It's probably OK.
-
4:12 - 4:15Well, what this misses
is that in the 1970s, -
4:15 - 4:17when you were just
gossiping on the telephone, -
4:17 - 4:20there wasn't a hundred engineers
on the other side of the screen -
4:20 - 4:22who knew exactly
how your psychology worked -
4:23 - 4:25and orchestrated you
into a double bind with each other. -
4:26 - 4:30Now, if this is making you
feel a little bit of outrage, -
4:31 - 4:33notice that that thought
just comes over you. -
4:33 - 4:37Outrage is also a really good way
of getting your attention, -
4:38 - 4:39because we don't choose outrage.
-
4:39 - 4:41It happens to us.
-
4:41 - 4:43And if you're the Facebook newsfeed,
-
4:43 - 4:44whether you'd want to or not,
-
4:44 - 4:47you actually benefit when there's outrage.
-
4:47 - 4:50Because outrage
doesn't just schedule a reaction -
4:50 - 4:53in emotional time and space for you. -
-
4:53 - 4:56We want to share that outrage
with other people. -
4:56 - 4:57So we want to hit share and say,
-
4:57 - 5:00"Can you believe the thing
that they said?" -
5:01 - 5:04And so outrage works really well
at getting attention, -
5:04 - 5:08such that if Facebook had a choice
between showing you the outrage feed -
5:08 - 5:09and a calm newsfeed,
-
5:10 - 5:12they would want
to show you the outrage feed, -
5:12 - 5:14not because someone
consciously chose that, -
5:14 - 5:17but because that worked better
at getting your attention. -
5:19 - 5:25And the newsfeed control room
is not accountable to us. -
5:25 - 5:27It's only accountable
to maximizing attention. -
5:27 - 5:29It's also accountable,
-
5:29 - 5:31because of the business model
of advertising, -
5:31 - 5:34to anybody who can pay the most
to actually walk into the control room -
5:34 - 5:36and say, "That group over there,
-
5:36 - 5:39I want to schedule these thoughts
into their minds." -
5:40 - 5:41So you can target,
-
5:42 - 5:44you can precisely target a lie
-
5:44 - 5:47directly to the people
who are most susceptible. -
5:48 - 5:51And because this is profitable,
it's only going to get worse. -
5:53 - 5:55So I'm here today
-
5:56 - 5:58because the costs are so obvious.
-
6:00 - 6:02I don't know a more urgent
problem than this, -
6:02 - 6:06because this problem
is underneath all other problems. -
6:07 - 6:10It's not just taking away our agency
-
6:10 - 6:13to spend our attention
and live the lives that we want, -
6:14 - 6:17it's changing the way
that we have our conversations, -
6:17 - 6:19it's changing our democracy,
-
6:19 - 6:22and it's changing our ability
to have the conversations -
6:22 - 6:24and relationships we want with each other.
-
6:25 - 6:27And it affects everyone,
-
6:27 - 6:30because a billion people
have one of these in their pocket. -
6:33 - 6:35So how do we fix this?
-
6:37 - 6:40We need to make three radical changes
-
6:40 - 6:42to technology and to our society.
-
6:44 - 6:48The first is we need to acknowledge
that we are persuadable. -
6:49 - 6:50Once you start understanding
-
6:50 - 6:53that your mind can be scheduled
into having little thoughts -
6:53 - 6:56or little blocks of time
that you didn't choose, -
6:56 - 6:58wouldn't we want to use that understanding
-
6:58 - 7:00and protect against the way
that that happens? -
7:01 - 7:04I think we need to see ourselves
fundamentally in a new way. -
7:04 - 7:06It's almost like a new period
of human history, -
7:06 - 7:07like the Enlightenment,
-
7:07 - 7:10but almost a kind of
self-aware Enlightenment, -
7:10 - 7:12that we can be persuaded,
-
7:12 - 7:15and there might be something
we want to protect. -
7:15 - 7:20The second is we need new models
and accountability systems -
7:20 - 7:23so that as the world gets
more and more persuasive over time -- -
7:24 - 7:26because it's only going
to get more persuasive -- -
7:26 - 7:28that the people in those control rooms
-
7:28 - 7:30are accountable and transparent
to what we want. -
7:30 - 7:33The only form of ethical
persuasion that exists -
7:33 - 7:35is when the goals of the persuader
-
7:35 - 7:37are aligned with the goals
of the persuadee. -
7:38 - 7:41And that involves questioning big things,
like the business model of advertising. -
7:43 - 7:44Lastly,
-
7:44 - 7:46we need a design renaissance,
-
7:47 - 7:50because once you have
this view of human nature, -
7:50 - 7:53that you can steer the timelines
of a billion people -- -
7:53 - 7:56just imagine, there's people
who have some desire -
7:56 - 7:59about what they want to do
and what they want to be thinking -
7:59 - 8:02and what they want to be feeling
and how they want to be informed, -
8:02 - 8:04and we're all just tugged
into these other directions. -
8:05 - 8:08And you have a billion people just tugged
into all these different directions. -
8:08 - 8:10Well, imagine an entire design renaissance
-
8:10 - 8:13that tried to orchestrate
the exact and most empowering -
8:13 - 8:17time-well-spent way
for those timelines to happen. -
8:17 - 8:18And that would involve two things:
-
8:18 - 8:20one would be protecting
against the timelines -
8:20 - 8:22that we don't want to be experiencing,
-
8:22 - 8:25the thoughts that we
wouldn't want to be happening, -
8:25 - 8:28so that when that ding happens,
not having the ding that sends us away; -
8:28 - 8:32and the second would be empowering us
to live out the timeline that we want. -
8:32 - 8:34So let me give you a concrete example.
-
8:34 - 8:37Today, let's say your friend
cancels dinner on you, -
8:37 - 8:41and you are feeling a little bit lonely.
-
8:41 - 8:42And so what do you do in that moment?
-
8:42 - 8:44You open up Facebook.
-
8:45 - 8:47And in that moment,
-
8:47 - 8:50the designers in the control room
want to schedule exactly one thing, -
8:50 - 8:53which is to maximize how much time
you spend on the screen. -
8:55 - 8:59Now, instead, imagine if those designers
created a different timeline -
8:59 - 9:02that was the easiest way,
using all of their data, -
9:02 - 9:05to actually help you get out
with the people that you care about? -
9:05 - 9:11Just think, alleviating
all loneliness in society, -
9:11 - 9:14if that was the timeline that Facebook
wanted to make possible for people. -
9:14 - 9:16Or imagine a different conversation.
-
9:16 - 9:19Let's say you wanted to post
something supercontroversial on Facebook, -
9:19 - 9:22which is a really important
thing to be able to do, -
9:22 - 9:23to talk about controversial topics.
-
9:23 - 9:26And right now, when there's
that big comment box, -
9:26 - 9:29it's almost asking you,
what key do you want to type? -
9:29 - 9:32In other words, it's scheduling
a little timeline of things -
9:32 - 9:34you're going to continue
to do on the screen. -
9:34 - 9:37And imagine instead that there was
another button there saying, -
9:37 - 9:39what would be most
time well spent for you? -
9:39 - 9:41And you click "host a dinner."
-
9:41 - 9:43And right there
underneath the item it said, -
9:43 - 9:45"Who wants to RSVP for the dinner?"
-
9:45 - 9:48And so you'd still have a conversation
about something controversial, -
9:48 - 9:52but you'd be having it in the most
empowering place on your timeline, -
9:52 - 9:55which would be at home that night
with a bunch of friends over -
9:55 - 9:56to talk about it.
-
9:57 - 10:00So imagine we're running, like,
a find and replace -
10:01 - 10:04on all of the timelines
that are currently steering us -
10:04 - 10:06towards more and more
screen time persuasively -
10:07 - 10:10and replacing all of those timelines
-
10:10 - 10:11with what do we want in our lives.
-
10:15 - 10:16It doesn't have to be this way.
-
10:18 - 10:21Instead of handicapping our attention,
-
10:21 - 10:23imagine if we used all of this data
and all of this power -
10:23 - 10:25and this new view of human nature
-
10:25 - 10:28to give us a superhuman ability to focus
-
10:28 - 10:32and a superhuman ability to put
our attention to what we cared about -
10:32 - 10:35and a superhuman ability
to have the conversations -
10:35 - 10:37that we need to have for democracy.
-
10:40 - 10:42The most complex challenges in the world
-
10:44 - 10:47require not just us
to use our attention individually. -
10:48 - 10:52They require us to use our attention
and coordinate it together. -
10:52 - 10:55Climate change is going to require
that a lot of people -
10:55 - 10:57are able
to coordinate their attention -
10:57 - 10:59in the most empowering way together.
-
10:59 - 11:02And imagine creating
a superhuman ability to do that. -
11:07 - 11:11Sometimes the world's
most pressing and important problems -
11:12 - 11:16are not these hypothetical future things
that we could create in the future. -
11:17 - 11:18Sometimes the most pressing problems
-
11:18 - 11:21are the ones that are
right underneath our noses, -
11:21 - 11:24the things that are already directing
a billion people's thoughts. -
11:25 - 11:28And maybe instead of getting excited
about the new augmented reality -
11:28 - 11:31and virtual reality
and these cool things that could happen, -
11:31 - 11:35which are going to be susceptible
to the same race for attention, -
11:35 - 11:37if we could fix the race for attention
-
11:37 - 11:40on the thing that's already
in a billion people's pockets. -
11:40 - 11:42Maybe instead of getting excited
-
11:42 - 11:46about the most exciting
new cool fancy education apps, -
11:46 - 11:49we could fix the way
kids' minds are getting manipulated -
11:49 - 11:51into sending empty messages
back and forth. -
11:52 - 11:56(Applause)
-
11:56 - 11:58Maybe instead of worrying
-
11:58 - 12:01about hypothetical future
runaway artificial intelligences -
12:01 - 12:03that are maximizing for one goal,
-
12:05 - 12:07we could solve the runaway
artificial intelligence -
12:07 - 12:09that already exists right now,
-
12:09 - 12:12which are these newsfeeds
maximizing for one thing. -
12:14 - 12:18It's almost like instead of running away
to colonize new planets, -
12:18 - 12:20we could fix the one
that we're already on. -
12:20 - 12:24(Applause)
-
12:28 - 12:30Solving this problem
-
12:30 - 12:34is critical infrastructure
for solving every other problem. -
12:35 - 12:39There's nothing in your life
or in our collective problems -
12:39 - 12:42that does not require our ability
to put our attention where we care about. -
12:44 - 12:45At the end of our lives,
-
12:46 - 12:49all we have is our attention and our time.
-
12:50 - 12:52What will be time well spent for ours?
-
12:52 - 12:53Thank you.
-
12:53 - 12:56(Applause)
-
13:06 - 13:09Chris Anderson: Tristan, thank you.
Hey, stay up here a sec. -
13:09 - 13:10First of all, thank you.
-
13:10 - 13:13I know we asked you to do this talk
on pretty short notice, -
13:13 - 13:15and you've had quite a stressful week
-
13:15 - 13:18getting this thing together, so thank you.
-
13:19 - 13:23Some people listening might say,
what you complain about is addiction, -
13:23 - 13:26and all these people doing this stuff,
for them it's actually interesting. -
13:26 - 13:27All these design decisions
-
13:27 - 13:31have built user content
that is fantastically interesting. -
13:31 - 13:33The world's more interesting
than it ever has been. -
13:33 - 13:34What's wrong with that?
-
13:34 - 13:37Tristan Harris:
I think it's really interesting. -
13:37 - 13:41One way to see this
is if you're just YouTube, for example, -
13:41 - 13:43you want to always show
the more interesting next video. -
13:43 - 13:46You want to get better and better
at suggesting that next video, -
13:46 - 13:49but even if you could propose
the perfect next video -
13:49 - 13:50that everyone would want to watch,
-
13:51 - 13:54it would just be better and better
at keeping you hooked on the screen. -
13:54 - 13:56So what's missing in that equation
-
13:56 - 13:58is figuring out what
our boundaries would be. -
13:58 - 14:01You would want YouTube to know
something about, say, falling asleep. -
14:01 - 14:03The CEO of Netflix recently said,
-
14:03 - 14:05"Our biggest competitors
are Facebook, YouTube and sleep." -
14:05 - 14:10And so what we need to recognize
is that the human architecture is limited -
14:10 - 14:13and that we have certain boundaries
or dimensions of our lives -
14:13 - 14:15that we want to be honored and respected,
-
14:15 - 14:17and technology could help do that.
-
14:17 - 14:19(Applause)
-
14:19 - 14:21CA: I mean, could you make the case
-
14:21 - 14:27that part of the problem here is that
we've got a naïve model of human nature? -
14:27 - 14:30So much of this is justified
in terms of human preference, -
14:30 - 14:32where we've got these algorithms
that do an amazing job -
14:33 - 14:34of optimizing for human preference,
-
14:34 - 14:36but which preference?
-
14:36 - 14:39There's the preferences
of things that we really care about -
14:39 - 14:40when we think about them
-
14:41 - 14:44versus the preferences
of what we just instinctively click on. -
14:44 - 14:48If we could implant that more nuanced
view of human nature in every design, -
14:48 - 14:50would that be a step forward?
-
14:50 - 14:52TH: Absolutely. I mean, I think right now
-
14:52 - 14:55it's as if all of our technology
is basically only asking our lizard brain -
14:55 - 14:58what's the best way
to just impulsively get you to do -
14:58 - 15:00the next tiniest thing with your time,
-
15:00 - 15:02instead of asking you in your life
-
15:02 - 15:04what would be most
time well spent for you? -
15:04 - 15:07What would be the perfect timeline
that might include something later, -
15:07 - 15:10would be time well spent for you
here at TED in your last day here? -
15:10 - 15:13CA: So if Facebook and Google
and everyone said to us first up, -
15:13 - 15:16"Hey, would you like us
to optimize for your reflective brain -
15:16 - 15:18or your lizard brain? You choose."
-
15:18 - 15:20TH: Right. That would be one way. Yes.
-
15:22 - 15:25CA: You said persuadability,
that's an interesting word to me -
15:25 - 15:28because to me there's
two different types of persuadability. -
15:28 - 15:31There's the persuadability
that we're trying right now -
15:31 - 15:33of reason and thinking
and making an argument, -
15:33 - 15:36but I think you're almost
talking about a different kind, -
15:36 - 15:37a more visceral type of persuadability,
-
15:38 - 15:40of being persuaded without
even knowing that you're thinking. -
15:40 - 15:43TH: Exactly. The reason
I care about this problem so much is -
15:43 - 15:46I studied at a lab called
the Persuasive Technology Lab at Stanford -
15:47 - 15:49that taught [students how to recognize]
exactly these techniques. -
15:49 - 15:52There's conferences and workshops
that teach people all these covert ways -
15:52 - 15:55of getting people's attention
and orchestrating people's lives. -
15:55 - 15:58And it's because most people
don't know that that exists -
15:58 - 16:00that this conversation is so important.
-
16:00 - 16:03CA: Tristan, you and I, we both know
so many people from all these companies. -
16:04 - 16:05There are actually many here in the room,
-
16:06 - 16:08and I don't know about you,
but my experience of them -
16:08 - 16:10is that there is
no shortage of good intent. -
16:10 - 16:12People want a better world.
-
16:12 - 16:16They are actually -- they really want it.
-
16:16 - 16:20And I don't think anything you're saying
is that these are evil people. -
16:21 - 16:24It's a system where there's
these unintended consequences -
16:24 - 16:26that have really got out of control --
-
16:26 - 16:28TH: Of this race for attention.
-
16:28 - 16:31It's the classic race to the bottom
when you have to get attention, -
16:31 - 16:32and it's so tense.
-
16:32 - 16:35The only way to get more
is to go lower on the brain stem, -
16:35 - 16:37to go lower into outrage,
to go lower into emotion, -
16:37 - 16:39to go lower into the lizard brain.
-
16:39 - 16:43CA: Well, thank you so much for helping us
all get a little bit wiser about this. -
16:43 - 16:45Tristan Harris, thank you.
TH: Thank you very much. -
16:45 - 16:48(Applause)
Title: How a handful of tech companies control billions of minds every day
Speaker: Tristan Harris
Description: A handful of people working at a handful of tech companies steer the thoughts of billions of people every day, says design thinker Tristan Harris. From Facebook notifications to Snapstreaks to YouTube autoplays, they're all competing for one thing: your attention. Harris shares how these companies prey on our psychology for their own profit and calls for a design renaissance in which our tech instead encourages us to live out the timeline we want.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 17:00