How a handful of tech companies control billions of minds every day

  • 0:01 - 0:03
    I want you to imagine
  • 0:03 - 0:05
    walking into a room,
  • 0:05 - 0:08
    a control room with a bunch of people,
  • 0:08 - 0:12
    a hundred people, hunched
    over a desk with little dials,
  • 0:12 - 0:14
    and that that control room
  • 0:14 - 0:18
    will shape the thoughts and feelings
  • 0:18 - 0:21
    of a billion people.
  • 0:21 - 0:23
    This might sound like science fiction,
  • 0:23 - 0:26
    but this actually exists
  • 0:26 - 0:28
    right now today.
  • 0:28 - 0:32
    I know because I used to be
    in one of those control rooms.
  • 0:32 - 0:35
    I was a design ethicist at Google,
  • 0:35 - 0:39
    where I studied how do you ethically
    steer people's thoughts?
  • 0:39 - 0:42
    Because what we don't talk about
    is how the handful of people
  • 0:42 - 0:44
    working at a handful
    of technology companies
  • 0:44 - 0:47
    through their choices will steer
    what a billion people are thinking today.
  • 0:47 - 0:55
    Because when you pull out your phone
    and they design how this works
  • 0:55 - 0:56
    or what's on the feed,
  • 0:56 - 0:59
    it's scheduling little blocks
    of time in our minds.
  • 0:59 - 1:02
    If you see a notification,
    it schedules you to have thoughts
  • 1:02 - 1:05
    that maybe you didn't intend to have.
  • 1:05 - 1:07
    If you swipe over that notification,
  • 1:07 - 1:09
    it schedules you into spending
    a little bit of time
  • 1:09 - 1:13
    getting sucked into something
    that maybe you didn't intend
  • 1:13 - 1:15
    to get sucked into.
  • 1:15 - 1:18
    When we talk about technology,
  • 1:18 - 1:21
    we tend to talk about it
    as this blue sky opportunity.
  • 1:21 - 1:24
    It could go any direction.
  • 1:24 - 1:26
    And I want to get serious for a moment
    and tell you why it's going
  • 1:26 - 1:29
    in a very specific direction.
  • 1:29 - 1:32
    Because it's not evolving randomly.
  • 1:32 - 1:34
    There's a hidden goal
    driving the direction
  • 1:34 - 1:36
    of all of the technology we make,
  • 1:36 - 1:41
    and that goal is the race
    for our attention.
  • 1:41 - 1:44
    Because every news site --
  • 1:44 - 1:47
    TED, elections, politicians,
  • 1:47 - 1:49
    games, even meditation apps --
  • 1:49 - 1:51
    have to compete for one thing,
  • 1:51 - 1:53
    which is our attention,
  • 1:53 - 1:57
    and there's only so much of it.
  • 1:57 - 1:59
    And the best way to get people's attention
  • 1:59 - 2:02
    is to know how someone's mind works.
  • 2:02 - 2:05
    And there's a whole bunch
    of persuasive techniques
  • 2:05 - 2:08
    that I learned in college at a lab
    called the Persuasive Technology Lab
  • 2:08 - 2:10
    to get people's attention.
  • 2:10 - 2:12
    A simple example is YouTube.
  • 2:12 - 2:15
    YouTube wants to maximize
    how much time you spend,
  • 2:15 - 2:17
    and so what do they do?
  • 2:17 - 2:20
    They autoplay the next video.
  • 2:20 - 2:21
    And let's say that works really well.
  • 2:21 - 2:23
    They're getting a little bit more
    of people's time.
  • 2:23 - 2:26
    Well, if you're Netflix, you look at that
  • 2:26 - 2:28
    and say, well that's shrinking
    my market share,
  • 2:28 - 2:31
    so I'm going to autoplay the next episode.
  • 2:31 - 2:35
    But then if you're Facebook, you say,
    well no, that's shrinking
  • 2:35 - 2:37
    all of my market share, so now
    I have to autoplay
  • 2:37 - 2:41
    all the videos in the newsfeed
    before waiting for you to click play.
  • 2:41 - 2:45
    So the Internet is not evolving at random.
  • 2:45 - 2:49
    The reason it feels like it's
    sucking us in the way it is
  • 2:49 - 2:52
    is because of this race for attention.
  • 2:52 - 2:54
    We know where this is going.
  • 2:54 - 2:56
    Technology is not neutral,
  • 2:56 - 3:01
    and it becomes this race to the bottom
    of the brainstem of who can go lower
  • 3:01 - 3:02
    to get it.
  • 3:02 - 3:05
    Let me give you an example of Snapchat.
  • 3:05 - 3:06
    If you didn't know,
  • 3:06 - 3:08
    Snapchat is the number one way
  • 3:08 - 3:10
    that teenagers in
    the United States communicate.
  • 3:10 - 3:14
    So if you're like me, and you use
    text messages to communicate,
  • 3:14 - 3:17
    Snapchat is that for teenagers,
  • 3:17 - 3:19
    and there's, like, a hundred million
    of them that use it.
  • 3:19 - 3:21
    And they invented a feature
    called Snapstreaks,
  • 3:21 - 3:23
    which shows the number of days in a row
  • 3:23 - 3:26
    that two people have
    communicated with each other.
  • 3:26 - 3:28
    In other words, what they just did
  • 3:28 - 3:32
    is they gave two people
    something they don't want to lose,
  • 3:32 - 3:36
    because if you're a teenager,
    and you have 150 days in a row,
  • 3:36 - 3:38
    you don't want that to go away,
  • 3:38 - 3:41
    and so think of the little blocks of time
  • 3:41 - 3:42
    that that schedules in kids' minds.
  • 3:42 - 3:45
    This isn't theoretical: when kids
    go on vacation, it's been shown
  • 3:45 - 3:48
    they give their passwords
    to up to five other friends
  • 3:48 - 3:50
    to keep their Snapstreaks going,
  • 3:50 - 3:52
    even when they can't do it.
  • 3:52 - 3:55
    And they have, like, 30 of these things,
  • 3:55 - 3:58
    and so they have to get through
    taking photos of just pictures or walls
  • 3:58 - 4:01
    or ceilings just to get through their day.
  • 4:01 - 4:04
    So it's not even like
    they're having real conversations.
  • 4:04 - 4:06
    We have a temptation to think about this
  • 4:06 - 4:09
    as, oh, they're just using Snapchat
  • 4:09 - 4:11
    the way we used
    to gossip on the telephone.
  • 4:11 - 4:13
    It's probably okay.
  • 4:13 - 4:15
    Well, what this misses
    is that in the 1970s
  • 4:15 - 4:18
    when you were just gossiping
    on the telephone,
  • 4:18 - 4:20
    there weren't a hundred engineers
    on the other side of the screen
  • 4:20 - 4:23
    who knew exactly
    how your psychology worked
  • 4:23 - 4:27
    and orchestrated you into
    a double bind with each other.
  • 4:27 - 4:31
    Now, if this is making you
    feel a little bit outraged,
  • 4:31 - 4:34
    notice that that thought
    just comes over you.
  • 4:34 - 4:38
    Outrage is a really good way also
    of getting your attention,
  • 4:38 - 4:40
    because we don't choose outrage.
  • 4:40 - 4:41
    It happens to us.
  • 4:41 - 4:43
    And if you're the Facebook news feed,
  • 4:43 - 4:45
    whether you'd want to or not
    you actually benefit
  • 4:45 - 4:47
    when there's outrage,
  • 4:47 - 4:50
    because outrage doesn't just
    schedule a reaction
  • 4:50 - 4:54
    in emotional time and space for you.
  • 4:54 - 4:56
    We want to share that outrage
    with other people.
  • 4:56 - 4:58
    So we want to share and say,
  • 4:58 - 5:01
    "Can you believe the thing
    that they said?"
  • 5:01 - 5:04
    And so outrage works really well
    at getting attention,
  • 5:04 - 5:08
    such that if Facebook had a choice
    between showing you the outrage feed
  • 5:08 - 5:10
    and a calm news feed,
  • 5:10 - 5:13
    they would want to show you
    the outrage feed,
  • 5:13 - 5:15
    not because someone
    consciously chose that,
  • 5:15 - 5:19
    but because that worked better
    at getting your attention.
  • 5:19 - 5:25
    And the news feed control room
    is not accountable to us.
  • 5:25 - 5:28
    It's only accountable
    to maximizing attention.
  • 5:28 - 5:31
    It's also accountable, because
    of the business model of advertising,
  • 5:31 - 5:35
    to anybody who can pay the most
    to actually walk into the control room
  • 5:35 - 5:39
    and say, "That group over there,
    I want to schedule these thoughts
  • 5:39 - 5:40
    into their minds."
  • 5:40 - 5:42
    So you can target,
  • 5:42 - 5:44
    you can precisely target a lie
  • 5:44 - 5:48
    directly to the people
    who are most susceptible.
  • 5:48 - 5:52
    And because this is profitable,
    it's only going to get worse.
  • 5:52 - 5:55
    So I'm here today
  • 5:55 - 6:01
    because the costs are so obvious.
  • 6:01 - 6:02
    I don't know a more urgent
    problem than this,
  • 6:02 - 6:07
    because this problem is underneath
    all other problems.
  • 6:07 - 6:10
    It's not just taking away our agency
  • 6:10 - 6:14
    to spend our attention and live
    the lives that we want,
  • 6:14 - 6:18
    it's changing the way
    that we have our conversations,
  • 6:18 - 6:19
    it's changing our democracy,
  • 6:19 - 6:23
    and it's changing our ability
    to have the conversations
  • 6:23 - 6:25
    and relationships we want with each other,
  • 6:25 - 6:27
    and it affects everyone,
  • 6:27 - 6:31
    because a billion people
    have one of these in their pocket.
  • 6:34 - 6:37
    So how do we fix this?
  • 6:37 - 6:40
    We need to make three radical changes
  • 6:40 - 6:44
    to technology and to our society.
  • 6:44 - 6:46
    The first is we need to acknowledge
  • 6:46 - 6:49
    that we are persuadable.
  • 6:49 - 6:52
    Once you start understanding
    that your mind can be scheduled
  • 6:52 - 6:54
    into having little thoughts
    or little blocks of time
  • 6:54 - 6:56
    that you didn't choose,
  • 6:56 - 6:58
    wouldn't we want to use that understanding
  • 6:58 - 7:01
    and protect against the way
    that that happens?
  • 7:01 - 7:04
    I think we need to see ourselves
    fundamentally in a new way.
  • 7:04 - 7:06
    It's almost like a new period
    of human history,
  • 7:06 - 7:07
    like the Enlightenment,
  • 7:07 - 7:10
    but almost a kind of
    self-aware Enlightenment,
  • 7:10 - 7:13
    that we can be persuaded,
  • 7:13 - 7:16
    and there might be something
    we want to protect.
  • 7:16 - 7:20
    The second is we need new models
    and accountability systems
  • 7:20 - 7:24
    so that as the world gets better
    and more and more persuasive over time,
  • 7:24 - 7:26
    because it's only going
    to get more persuasive,
  • 7:26 - 7:29
    the people in those control rooms
    are accountable and transparent
  • 7:29 - 7:31
    to what we want.
  • 7:31 - 7:33
    The only form of ethical
    persuasion that exists
  • 7:33 - 7:36
    is when the goals of the persuader
  • 7:36 - 7:38
    are aligned with the goals
    of the persuadee.
  • 7:38 - 7:40
    And that involves questioning big things,
  • 7:40 - 7:43
    like the business model of advertising.
  • 7:43 - 7:47
    Lastly, we need a design renaissance,
  • 7:47 - 7:50
    because once you have
    this view of human nature,
  • 7:50 - 7:54
    that you can steer the timelines
    of a billion people --
  • 7:54 - 7:57
    Just imagine, there's people
    who have some desire
  • 7:57 - 7:58
    about what they want to do
    and what they want to be thinking
  • 7:58 - 8:00
    and what they want to be feeling
  • 8:00 - 8:01
    and how they want to be informed,
  • 8:01 - 8:03
    and we're all just tugged
    into these other directions,
  • 8:03 - 8:07
    and you have a billion people just tugged
    into all these different directions.
  • 8:07 - 8:11
    Well imagine an entire design renaissance
  • 8:11 - 8:14
    that tried to orchestrate the exact
    and most empowering
  • 8:14 - 8:15
    time-well-spent way for those
    timelines to happen.
  • 8:15 - 8:18
    And that would involve two things:
    one would be protecting against
  • 8:18 - 8:23
    the timelines that we don't
    want to be experiencing,
  • 8:23 - 8:25
    the thoughts that we wouldn't
    want to be happening,
  • 8:25 - 8:27
    so that when that ding happens,
    not having the ding
  • 8:27 - 8:28
    that sends us away;
  • 8:28 - 8:32
    and the second would be empowering us
    to live out the timeline that we want.
  • 8:32 - 8:34
    So let me give you a concrete example.
  • 8:34 - 8:37
    Today, let's say your friend
    cancels dinner on you,
  • 8:37 - 8:41
    and you are feeling a little bit lonely.
  • 8:41 - 8:43
    And so what do you do in that moment?
  • 8:43 - 8:45
    You open up Facebook.
  • 8:45 - 8:47
    And in that moment,
  • 8:47 - 8:51
    the designers in the control room
    want to schedule exactly one thing,
  • 8:51 - 8:55
    which is to maximize how much time
    you spend on the screen.
  • 8:55 - 8:59
    Now instead, imagine if those designers
    created a different timeline
  • 8:59 - 9:02
    that was the easiest way,
    using all of their data,
  • 9:02 - 9:05
    to actually help you get out
    with the people that you care about?
  • 9:05 - 9:08
    Just imagine how, just think,
    alleviating all loneliness in society,
  • 9:08 - 9:14
    if that was the timeline that Facebook
    wanted to make possible for people.
  • 9:14 - 9:17
    Or imagine a different conversation
  • 9:17 - 9:19
    where, let's say you wanted to post
    something super-controversial on Facebook,
  • 9:19 - 9:21
    which is a really important
    thing to be able to do,
  • 9:21 - 9:23
    to talk about controversial topics.
  • 9:23 - 9:26
    And right now, when there's
    that big comment box,
  • 9:26 - 9:27
    it's almost asking you,
  • 9:27 - 9:29
    what key do you want to type?
  • 9:29 - 9:33
    In other words, it's scheduling
    a little timeline of things
  • 9:33 - 9:34
    you're going to continue
    to do on the screen.
  • 9:34 - 9:37
    And imagine instead that there was
    another button there saying,
  • 9:37 - 9:39
    well what would be most
    time well spent for you?
  • 9:39 - 9:41
    And you click "host a dinner."
  • 9:41 - 9:43
    And right there underneath the item,
  • 9:43 - 9:45
    it said, "Who wants
    to RSVP for the dinner?"
  • 9:45 - 9:48
    And so you'd still have a conversation
    about something controversial,
  • 9:48 - 9:52
    but you'd be having it in the most
    empowering place on your timeline,
  • 9:52 - 9:54
    which would be at home that night
    with a bunch of friends over
  • 9:54 - 9:57
    to talk about it.
  • 9:57 - 10:01
    So imagine we're running, like,
    a find and replace
  • 10:01 - 10:04
    on all of the timelines that are
    currently steering us towards more
  • 10:04 - 10:07
    and more screen time persuasively
  • 10:07 - 10:10
    and replacing all of those timelines
  • 10:10 - 10:14
    with what we want in our lives.
  • 10:14 - 10:19
    It doesn't have to be this way.
  • 10:19 - 10:21
    Instead of handicapping our attention,
  • 10:21 - 10:24
    imagine if we used all of this data
    and all of this power
  • 10:24 - 10:25
    and this new view of human nature
  • 10:25 - 10:28
    to give us a superhuman ability to focus
  • 10:28 - 10:32
    and a superhuman ability to put
    our attention to what we cared about
  • 10:32 - 10:35
    and a superhuman ability to have
    the conversations that we need
  • 10:35 - 10:37
    to have for a democracy.
  • 10:40 - 10:44
    The most complex challenges in the world
  • 10:44 - 10:49
    require not just us to use
    our attention individually.
  • 10:49 - 10:53
    They require us to use our attention
    and coordinate it together.
  • 10:53 - 10:55
    Climate change is going to require
    that a lot of people
  • 10:55 - 10:57
    are able to coordinate
    their attention
  • 10:57 - 11:00
    in the most empowering way together.
  • 11:00 - 11:04
    And imagine creating a superhuman
    ability to do that.
  • 11:07 - 11:12
    Sometimes the world's most pressing
    and important problems
  • 11:12 - 11:17
    are not these hypothetical things
    that we could create in the future.
  • 11:17 - 11:19
    Sometimes the most pressing problems
  • 11:19 - 11:21
    are the ones that are
    right underneath our noses,
  • 11:21 - 11:25
    the things that are already directing
    a billion people's thoughts.
  • 11:25 - 11:28
    And maybe instead of getting excited
    about the new augmented reality
  • 11:28 - 11:32
    and virtual reality and these
    cool things that could happen,
  • 11:32 - 11:35
    which are going to be susceptible
    to the same race for attention,
  • 11:35 - 11:37
    we could fix the race for attention
  • 11:37 - 11:40
    on the thing that's already
    in a billion people's pockets.
  • 11:40 - 11:44
    Maybe instead of getting excited
    about the most exciting new cool
  • 11:44 - 11:46
    fancy education apps,
  • 11:46 - 11:49
    we could fix the way kids' minds
    are getting manipulated
  • 11:49 - 11:53
    into sending empty messages
    back and forth.
  • 11:53 - 11:57
    (Applause)
  • 11:57 - 12:00
    Maybe instead of worrying about
    hypothetical future runaway
  • 12:00 - 12:05
    artificial intelligences that are
    maximizing for one goal,
  • 12:05 - 12:07
    we could solve the runaway
    artificial intelligence
  • 12:07 - 12:10
    that already exists right now,
  • 12:10 - 12:11
    which are these news feeds
  • 12:11 - 12:13
    maximizing for one thing.
  • 12:14 - 12:18
    It's almost like instead of running away
    to colonize new planets,
  • 12:18 - 12:20
    we could fix the one
    that we're already on.
  • 12:20 - 12:24
    (Applause)
  • 12:28 - 12:30
    Solving this problem
  • 12:30 - 12:35
    is critical infrastructure
    for solving every other problem.
  • 12:35 - 12:39
    There's nothing in your life
    or in our collective problems
  • 12:39 - 12:42
    that does not require our ability
    to put our attention where we care about.
  • 12:42 - 12:46
    At the end of our lives,
  • 12:46 - 12:49
    all we have is our attention
  • 12:49 - 12:50
    and our time.
  • 12:50 - 12:52
    What will be time well spent for ours?
  • 12:52 - 12:53
    Thank you.
  • 12:53 - 12:57
    (Applause)
  • 13:06 - 13:09
    Chris Anderson: Tristan, thank you.
    Hey, stay up here a sec.
  • 13:09 - 13:11
    First of all, thank you.
  • 13:11 - 13:13
    I know we asked you to do this talk
    on pretty short notice,
  • 13:13 - 13:16
    and you've had quite a stressful week
  • 13:16 - 13:18
    getting this thing together, so thank you.
  • 13:18 - 13:20
    Some people listening might say,
  • 13:20 - 13:22
    well look, what you complain about
  • 13:22 - 13:25
    is addiction, and all these people
    doing this stuff,
  • 13:25 - 13:27
    for them it's actually interesting.
  • 13:27 - 13:28
    All these design decisions
  • 13:28 - 13:31
    have built user content
    that is fantastically interesting.
  • 13:31 - 13:33
    The world's more interesting
    than it ever has been.
  • 13:33 - 13:34
    What's possibly wrong with that?
  • 13:34 - 13:36
    Tristan Harris: Well, I think
    it's really interesting.
  • 13:36 - 13:37
    It's like, one way to see this
  • 13:37 - 13:41
    is if you're just YouTube for example,
  • 13:41 - 13:43
    you want to always show
    the more interesting next video.
  • 13:43 - 13:46
    You want to get better and better
    at suggesting that next video,
  • 13:46 - 13:48
    but even if you could propose
    the perfect next video
  • 13:48 - 13:50
    that everyone would want to watch,
  • 13:50 - 13:53
    it would just be better and better
    at keeping you hooked on the screen.
  • 13:53 - 13:54
    So what's missing in that equation
  • 13:54 - 13:57
    is figuring out what
    our boundaries would be.
  • 13:57 - 14:00
    Right? You would want YouTube to know
    something about, say, falling asleep.
  • 14:00 - 14:03
    The CEO of Netflix recently said,
    "Our biggest competitors are Facebook,
  • 14:03 - 14:05
    YouTube, and sleep." Right?
  • 14:05 - 14:10
    And so what we need to recognize
    is that the human architecture is limited,
  • 14:10 - 14:13
    and that we have certain boundaries
    or dimensions of our lives
  • 14:13 - 14:15
    that we want to be honored and respected,
  • 14:15 - 14:18
    and technology could help do that.
  • 14:18 - 14:20
    (Applause)
  • 14:20 - 14:22
    CA: I mean, could you make the case
  • 14:22 - 14:27
    that part of the problem here is that
    we've got a naïve model of human nature?
  • 14:27 - 14:30
    So much of this is justified in terms
    of human preference,
  • 14:30 - 14:32
    where we've got these algorithms
    that do an amazing job
  • 14:32 - 14:34
    of optimizing for human preference,
  • 14:34 - 14:36
    but which preference?
  • 14:36 - 14:39
    There's the preferences of things
    that we really care about
  • 14:39 - 14:41
    when we think about them
  • 14:41 - 14:44
    versus the preferences of what
    we just instinctively click on.
  • 14:44 - 14:48
    If we could implant that more
    nuanced view of human nature
  • 14:48 - 14:50
    in every design, would that
    be a step forward?
  • 14:50 - 14:52
    TH: Absolutely. I mean,
    I think right now it's as if
  • 14:52 - 14:55
    all of our technology is basically
    only asking our lizard brain
  • 14:55 - 15:00
    what's the best way to just impulsively
    get you to the next tiniest thing
  • 15:00 - 15:00
    with your time,
  • 15:00 - 15:02
    instead of asking you in your life
  • 15:02 - 15:04
    what would be most
    time well spent for you?
  • 15:04 - 15:06
    What would be the perfect timeline
    that might include something later,
  • 15:06 - 15:09
    would be time well spent for you
    here at TED in your last day here?
  • 15:09 - 15:12
    CA: So if Facebook and Google
    and everyone said to us first up,
  • 15:12 - 15:16
    "Hey would you like us to optimize
    for your reflective brain
  • 15:16 - 15:18
    or your lizard brain? You choose."
  • 15:18 - 15:21
    TH: Right. That would be one way. Yes.
  • 15:21 - 15:25
    CA: When you said persuadability,
    that's an interesting word to me
  • 15:25 - 15:28
    because to me there's two
    different types of persuadability.
  • 15:28 - 15:30
    There's the persuadability
    that we're trying right now
  • 15:30 - 15:33
    of reason and thinking
    and making an argument,
  • 15:33 - 15:35
    but I think you're almost
    talking about a different kind,
  • 15:35 - 15:37
    a more visceral type of persuadability,
  • 15:37 - 15:39
    of being persuaded without
    even knowing that you're thinking.
  • 15:39 - 15:43
    TH: Exactly, yeah. The reason
    I care about this problem so much is
  • 15:43 - 15:46
    I studied at a lab called
    the Persuasive Technology Lab at Stanford
  • 15:46 - 15:48
    that taught people
    exactly these techniques.
  • 15:48 - 15:52
    There's conferences and workshops
    that teach people all these covert ways
  • 15:52 - 15:55
    of getting people's attention
    and orchestrating people's lives.
  • 15:55 - 15:57
    And it's because most people
    don't know that that exists
  • 15:57 - 16:00
    that this conversation is so important.
  • 16:00 - 16:04
    CA: Tristan, you and I, we both know
    so many people from all these companies.
  • 16:04 - 16:05
    There are actually many here in the room,
  • 16:05 - 16:09
    and I don't know about you,
    but my experience of them
  • 16:09 - 16:10
    is that there is no
    shortage of good intent.
  • 16:10 - 16:13
    People want a better world.
  • 16:13 - 16:17
    They are actually, they really want it.
  • 16:17 - 16:21
    And I don't think anything you're saying
    is that these are evil people,
  • 16:21 - 16:24
    it's a system where there's
    these unintended consequences
  • 16:24 - 16:26
    that have really got out of control --
  • 16:26 - 16:28
    TH: -- of this race for attention, yeah.
  • 16:28 - 16:30
    It's the classic race to the bottom
    when you have to get attention,
  • 16:30 - 16:32
    and it's so intense.
  • 16:32 - 16:34
    The only way to get more is to go
    lower on the brain stem,
  • 16:34 - 16:37
    to go lower into outrage,
    to go lower into emotion,
  • 16:37 - 16:39
    to go lower into the lizard brain.
  • 16:39 - 16:43
    CA: Well, thank you so much for helping us
    all get a little bit wiser about this.
  • 16:43 - 16:45
    Tristan Harris, thank you.
    TH: Thank you. Thank you very much.
  • 16:45 - 16:48
    (Applause)
Title:
How a handful of tech companies control billions of minds every day
Speaker:
Tristan Harris
Video Language:
English
Duration:
17:00