-
I want you to imagine
-
walking into a room,
-
a control room with a bunch of people,
-
a hundred people, hunched
over a desk with little dials,
-
and that that control room
-
will shape the thoughts and feelings
-
of a billion people.
-
This might sound like science fiction,
-
but this actually exists
-
right now today.
-
I know because I used to be
in one of those control rooms.
-
I was a design ethicist at Google,
-
where I studied how do you ethically
steer people's thoughts?
-
Because what we don't talk about
is how the handful of people
-
working at a handful
of technology companies
-
through their choices will steer
what a billion people are thinking today.
-
Because when you pull out your phone
and they design how this works
-
or what's on the feed,
-
it's scheduling little blocks
of time in our minds.
-
If you see a notification,
it schedules you to have thoughts
-
that maybe you didn't intend to have.
-
If you swipe over that notification,
-
it schedules you into spending
a little bit of time
-
getting sucked into something
that maybe you didn't intend
-
to get sucked into.
-
When we talk about technology,
-
we tend to talk about it
as this blue sky opportunity.
-
It could go any direction.
-
And I want to get serious for a moment
and tell you why it's going
-
in a very specific direction.
-
Because it's not evolving randomly.
-
There's a hidden goal
driving the direction
-
of all of the technology we make,
-
and that goal is the race
for our attention.
-
Because every news site --
-
TED, elections, politicians,
-
games, even meditation apps --
-
has to compete for one thing,
-
which is our attention,
-
and there's only so much of it.
-
And the best way to get people's attention
-
is to know how someone's mind works.
-
And there's a whole bunch
of persuasive techniques
-
that I learned in college at a lab
called the Persuasive Technology Lab
-
to get people's attention.
-
A simple example is YouTube.
-
YouTube wants to maximize
how much time you spend,
-
and so what do they do?
-
They autoplay the next video.
-
And let's say that works really well.
-
They're getting a little bit more
of people's time.
-
Well, if you're Netflix, you look at that
-
and say, well that's shrinking
my market share,
-
so I'm going to autoplay the next episode.
-
But then if you're Facebook, you say,
well no, that's shrinking
-
all of my market share, so now
I have to autoplay
-
all the videos in the newsfeed
before waiting for you to click play.
-
So the Internet is not evolving at random.
-
The reason it feels like it's
sucking us in the way it is
-
is because of this race for attention.
-
We know where this is going.
-
Technology is not neutral,
-
and it becomes this race to the bottom
of the brainstem of who can go lower
-
to get it.
-
Let me give you an example of Snapchat.
-
If you didn't know,
-
Snapchat is the number one way
-
that teenagers in
the United States communicate.
-
So if you're like me, and you use
text messages to communicate,
-
Snapchat is that for teenagers,
-
and there's, like, a hundred million
of them that use it.
-
And they invented a feature
called Snapstreaks,
-
which shows the number of days in a row
-
that two people have
communicated with each other.
-
In other words, what they just did
-
is they gave two people
something they don't want to lose,
-
because if you're a teenager,
and you have 150 days in a row,
-
you don't want that to go away,
-
and so think of the little blocks of time
-
that that schedules in kids' minds.
-
This isn't theoretical: when kids
go on vacation, it's been shown
-
they give their passwords
to up to five other friends
-
to keep their Snapstreaks going,
-
even when they can't do it.
-
And they have, like, 30 of these things,
-
and so they have to get through
taking photos of just pictures or walls
-
or ceilings just to get through their day.
-
So it's not even like
they're having real conversations.
-
We have a temptation to think about this
-
as, oh, they're just using Snapchat
-
the way we used
to gossip on the telephone.
-
It's probably okay.
-
Well, what this misses
is that in the 1970s
-
when you were just gossiping
on the telephone,
-
there weren't a hundred engineers
on the other side of the screen
-
who knew exactly
how your psychology worked
-
and orchestrated you into
a double bind with each other.
-
Now, if this is making you
feel a little bit outraged,
-
notice that that thought
just comes over you.
-
Outrage is also a really good way
of getting your attention,
-
because we don't choose outrage.
-
It happens to us.
-
And if you're the Facebook news feed,
-
whether you'd want to or not,
you actually benefit
-
when there's outrage,
-
because outrage doesn't just
schedule a reaction
-
in emotional time and space for you.
-
We want to share that outrage
with other people.
-
So we want to share and say,
-
"Can you believe the thing
that they said?"
-
And so outrage works really well
at getting attention,
-
such that if Facebook had a choice
between showing you the outrage feed
-
and a calm news feed,
-
they would want to show you
the outrage feed,
-
not because someone
consciously chose that,
-
but because that worked better
at getting your attention.
-
And the news feed control room
is not accountable to us.
-
It's only accountable
to maximizing attention.
-
It's also accountable, because
of the business model of advertising,
-
to anybody who can pay the most
to actually walk into the control room
-
and say, "That group over there,
I want to schedule these thoughts
-
into their minds."
-
So you can target,
-
you can precisely target a lie
-
directly to the people
who are most susceptible.
-
And because this is profitable,
it's only going to get worse.
-
So I'm here today
-
because the costs are so obvious.
-
I don't know a more urgent
problem than this,
-
because this problem is underneath
all other problems.
-
It's not just taking away our agency
-
to spend our attention and live
the lives that we want,
-
it's changing the way
that we have our conversations,
-
it's changing our democracy,
-
and it's changing our ability
to have the conversations
-
and relationships we want with each other,
-
and it affects everyone,
-
because a billion people
have one of these in their pocket.
-
So how do we fix this?
-
We need to make three radical changes
-
to technology and to our society.
-
The first is we need to acknowledge
-
that we are persuadable.
-
Once you start understanding
that your mind can be scheduled
-
into having little thoughts
or little blocks of time
-
that you didn't choose,
-
wouldn't we want to use that understanding
-
and protect against the way
that that happens?
-
I think we need to see ourselves
fundamentally in a new way.
-
It's almost like a new period
of human history,
-
like the Enlightenment,
-
but almost a kind of
self-aware Enlightenment,
-
that we can be persuaded,
-
and there might be something
we want to protect.
-
The second is we need new models
and accountability systems
-
so that as the world gets
more and more persuasive over time,
-
because it's only going
to get more persuasive,
-
the people in those control rooms
are accountable and transparent
-
to what we want.
-
The only form of ethical
persuasion that exists
-
is when the goals of the persuader
-
are aligned with the goals
of the persuadee.
-
And that involves questioning big things,
-
like the business model of advertising.
-
Lastly, we need a design renaissance,
-
because once you have
this view of human nature,
-
that you can steer the timelines
of a billion people --
-
Just imagine, there's people
who have some desire
-
about what they want to do
and what they want to be thinking
-
and what they want to be feeling
-
and how they want to be informed,
-
and we're all just tugged
into these other directions,
-
and you have a billion people just tugged
into all these different directions.
-
Well imagine an entire design renaissance
-
that tried to orchestrate the exact
and most empowering
-
time-well-spent way for those
timelines to happen.
-
And that would involve two things:
one would be protecting against
-
the timelines that we don't
want to be experiencing,
-
the thoughts that we wouldn't
want to be happening,
-
so that when that ding happens,
not having the ding
-
that sends us away;
-
and the second would be empowering us
to live out the timeline that we want.
-
So let me give you a concrete example.
-
Today, let's say your friend
cancels dinner on you,
-
and you are feeling a little bit lonely.
-
And so what do you do in that moment?
-
You open up Facebook.
-
And in that moment,
-
the designers in the control room
want to schedule exactly one thing,
-
which is to maximize how much time
you spend on the screen.
-
Now instead, imagine if those designers
created a different timeline
-
that was the easiest way,
using all of their data,
-
to actually help you get out
with the people that you care about?
-
Just imagine, just think,
alleviating all loneliness in society,
-
if that was the timeline that Facebook
wanted to make possible for people.
-
Or imagine a different conversation
-
where, let's say you wanted to post
something super-controversial on Facebook,
-
which is a really important
thing to be able to do,
-
to talk about controversial topics.
-
And right now, when there's
that big comment box,
-
it's almost asking you,
-
what key do you want to type?
-
In other words, it's scheduling
a little timeline of things
-
you're going to continue
to do on the screen.
-
And imagine instead that there was
another button there saying,
-
well what would be most
time well spent for you?
-
And you click "host a dinner."
-
And right there underneath the item,
-
it said, "Who wants
to RSVP for the dinner?"
-
And so you'd still have a conversation
about something controversial,
-
but you'd be having it in the most
empowering place on your timeline,
-
which would be at home that night
with a bunch of friends over
-
to talk about it.
-
So imagine we're running, like,
a find and replace
-
on all of the timelines that are
currently steering us towards more
-
and more screen time persuasively
-
and replacing all of those timelines
-
with what we want in our lives.
-
It doesn't have to be this way.
-
Instead of handicapping our attention,
-
imagine if we used all of this data
and all of this power
-
and this new view of human nature
-
to give us a superhuman ability to focus
-
and a superhuman ability to put
our attention to what we cared about
-
and a superhuman ability to have
the conversations that we need
-
to have for a democracy.
-
The most complex challenges in the world
-
require us not just to use
our attention individually.
-
They require us to use our attention
and coordinate it together.
-
Climate change is going to require
that a lot of people
-
are able to coordinate
their attention
-
in the most empowering way together.
-
And imagine creating a superhuman
ability to do that.
-
Sometimes the world's most pressing
and important problems
-
are not these hypothetical future things
that we could create.
-
Sometimes the most pressing problems
-
are the ones that are
right underneath our noses,
-
the things that are already directing
a billion people's thoughts.
-
And maybe instead of getting excited
about the new augmented reality
-
and virtual reality and these
cool things that could happen,
-
which are going to be susceptible
to the same race for attention,
-
we could fix the race for attention
-
on the thing that's already
in a billion people's pockets.
-
Maybe instead of getting excited
about the most exciting new cool
-
fancy education apps,
-
we could fix the way kids' minds
are getting manipulated
-
into sending empty messages
back and forth.
-
(Applause)
-
Maybe instead of worrying about
hypothetical future runaway
-
artificial intelligences that are
maximizing for one goal,
-
we could solve the runaway
artificial intelligence
-
that already exists right now,
-
which are these news feeds
-
maximizing for one thing.
-
It's almost like instead of running away
to colonize new planets,
-
we could fix the one
that we're already on.
-
(Applause)
-
Solving this problem
-
is critical infrastructure
for solving every other problem.
-
There's nothing in your life
or in our collective problems
-
that does not require our ability
to put our attention on what we care about.
-
At the end of our lives,
-
all we have is our attention
-
and our time.
-
What will be time well spent for ours?
-
Thank you.
-
(Applause)
-
Chris Anderson: Tristan, thank you.
Hey, stay up here a sec.
-
First of all, thank you.
-
I know we asked you to do this talk
on pretty short notice,
-
and you've had quite a stressful week
-
getting this thing together, so thank you.
-
Some people listening might say,
-
well look, what you complain about
-
is addiction, and all these people
doing this stuff,
-
for them it's actually interesting.
-
All these design decisions
-
have built user content
that is fantastically interesting.
-
The world's more interesting
than it ever has been.
-
What's possibly wrong with that?
-
Tristan Harris: Well, I think
it's really interesting.
-
It's like, one way to see this
-
is if you're just YouTube for example,
-
you want to always show
the more interesting next video.
-
You want to get better and better
at suggesting that next video,
-
but even if you could propose
the perfect next video
-
that everyone would want to watch,
-
it would just be better and better
at keeping you hooked on the screen.
-
So what's missing in that equation
-
is figuring out what
our boundaries would be.
-
Right? You would want YouTube to know
something about, say, falling asleep.
-
The CEO of Netflix recently said
that our biggest competitors are Facebook,
-
YouTube, and sleep. Right?
-
And so what we need to recognize
is that the human architecture is limited,
-
and that we have certain boundaries
or dimensions of our lives
-
that we want to be honored and respected,
-
and technology could help do that.
-
(Applause)
-
CA: I mean, could you make the case
-
that part of the problem here is that
we've got a naïve model of human nature?
-
So much of this is justified in terms
of human preference,
-
where we've got these algorithms
that do an amazing job
-
of optimizing for human preference,
-
but which preference?
-
There's the preferences of things
that we really care about
-
when we think about them
-
versus the preferences of what
we just instinctively click on.
-
If we could implant that more
nuanced view of human nature
-
in every design, would that
be a step forward?
-
TH: Absolutely. I mean,
I think right now it's as if
-
all of our technology is basically
only asking our lizard brain
-
what's the best way to just impulsively
get you to the next tiniest thing
-
with your time,
-
instead of asking you in your life
-
what would be most
time well spent for you?
-
What would be the perfect timeline
that might include something later --
-
what would be time well spent for you
here at TED on your last day here?
-
CA: So if Facebook and Google
and everyone said to us first up,
-
"Hey would you like us to optimize
for your reflective brain
-
or your lizard brain? You choose."
-
TH: Right. That would be one way. Yes.
-
CA: When you said persuadability,
that's an interesting word to me
-
because to me there's two
different types of persuadability.
-
There's the persuadability
that we're trying right now
-
of reason and thinking
and making an argument,
-
but I think you're almost
talking about a different kind
-
of more visceral type of persuadability,
-
of being persuaded without
even knowing that you're thinking.
-
TH: Exactly, yeah. The reason
I care about this problem so much is
-
I studied at a lab called
the Persuasive Technology Lab at Stanford
-
that taught people
exactly these techniques.
-
There's conferences and workshops
that teach people all these covert ways
-
of getting people's attention
and orchestrating people's lives.
-
And it's because most people
don't know that that exists
-
that this conversation is so important.
-
CA: Tristan, you and I, we both know
so many people from all these companies.
-
There are actually many here in the room,
-
and I don't know about you,
but my experience of them
-
is that there is no
shortage of good intent.
-
People want a better world.
-
They are actually, they really want it.
-
And I don't think anything you're saying
is that these are evil people,
-
it's a system where there's
these unintended consequences
-
that have really got out of control --
-
TH: -- of this race for attention, yeah.
-
It's the classic race to the bottom
when you have to get attention,
-
and it's so tense.
-
The only way to get more is to go
lower on the brain stem,
-
to go lower into outrage,
to go lower into emotion,
-
to go lower into the lizard brain.
-
CA: Well, thank you so much for helping us
all get a little bit wiser about this.
-
Tristan Harris, thank you.
TH: Thank you. Thank you very much.
-
(Applause)