
How a handful of tech companies control billions of minds every day
Tristan Harris

  • 0:01 - 0:02
    I want you to imagine
  • 0:03 - 0:04
    walking into a room,
  • 0:05 - 0:08
    a control room with a bunch of people,
  • 0:08 - 0:10
    a hundred people, hunched
    over a desk with little dials,
  • 0:11 - 0:13
    and that that control room
  • 0:14 - 0:17
    will shape the thoughts and feelings
  • 0:17 - 0:19
    of a billion people.
  • 0:21 - 0:22
    This might sound like science fiction,
  • 0:23 - 0:26
    but this actually exists
  • 0:26 - 0:27
    right now, today.
  • 0:28 - 0:31
    I know because I used to be
    in one of those control rooms.
  • 0:32 - 0:34
    I was a design ethicist at Google,
  • 0:34 - 0:38
    where I studied how do you ethically
    steer people's thoughts?
  • 0:39 - 0:41
    Because what we don't talk about
    is how the handful of people
  • 0:41 - 0:44
    working at a handful
    of technology companies
  • 0:44 - 0:49
    through their choices will steer
    what a billion people are thinking today.
  • 0:50 - 0:52
    Because when you pull out your phone
  • 0:52 - 0:55
    and they design how this works
    or what's on the feed,
  • 0:55 - 0:58
    it's scheduling little blocks
    of time in our minds.
  • 0:59 - 1:02
    If you see a notification,
    it schedules you to have thoughts
  • 1:02 - 1:04
    that maybe you didn't intend to have.
  • 1:04 - 1:07
    If you swipe over that notification,
  • 1:07 - 1:09
    it schedules you into spending
    a little bit of time
  • 1:09 - 1:11
    getting sucked into something
  • 1:11 - 1:14
    that maybe you didn't intend
    to get sucked into.
  • 1:15 - 1:17
    When we talk about technology,
  • 1:18 - 1:21
    we tend to talk about it
    as this blue sky opportunity.
  • 1:21 - 1:22
    It could go any direction.
  • 1:23 - 1:25
    And I want to get serious for a moment
  • 1:25 - 1:28
    and tell you why it's going
    in a very specific direction.
  • 1:29 - 1:31
    Because it's not evolving randomly.
  • 1:32 - 1:34
    There's a hidden goal
    driving the direction
  • 1:34 - 1:36
    of all of the technology we make,
  • 1:36 - 1:39
    and that goal is the race
    for our attention.
  • 1:41 - 1:44
    Because every news site --
  • 1:44 - 1:46
    TED, elections, politicians,
  • 1:46 - 1:48
    games, even meditation apps --
  • 1:48 - 1:50
    have to compete for one thing,
  • 1:51 - 1:53
    which is our attention,
  • 1:53 - 1:55
    and there's only so much of it.
  • 1:56 - 1:59
    And the best way to get people's attention
  • 1:59 - 2:01
    is to know how someone's mind works.
  • 2:02 - 2:04
    And there's a whole bunch
    of persuasive techniques
  • 2:04 - 2:08
    that I learned in college at a lab
    called the Persuasive Technology Lab
  • 2:08 - 2:09
    to get people's attention.
  • 2:10 - 2:11
    A simple example is YouTube.
  • 2:12 - 2:15
    YouTube wants to maximize
    how much time you spend.
  • 2:15 - 2:16
    And so what do they do?
  • 2:17 - 2:19
    They autoplay the next video.
  • 2:20 - 2:22
    And let's say that works really well.
  • 2:22 - 2:24
    They're getting a little bit
    more of people's time.
  • 2:24 - 2:26
    Well, if you're Netflix,
    you look at that and say,
  • 2:26 - 2:28
    well, that's shrinking my market share,
  • 2:28 - 2:30
    so I'm going to autoplay the next episode.
  • 2:31 - 2:33
    But then if you're Facebook,
  • 2:33 - 2:35
    you say, that's shrinking
    all of my market share,
  • 2:35 - 2:38
    so now I have to autoplay
    all the videos in the newsfeed
  • 2:38 - 2:40
    before waiting for you to click play.
  • 2:40 - 2:43
    So the internet is not evolving at random.
  • 2:44 - 2:49
    The reason it feels
    like it's sucking us in the way it is
  • 2:49 - 2:51
    is because of this race for attention.
  • 2:51 - 2:53
    We know where this is going.
  • 2:53 - 2:54
    Technology is not neutral,
  • 2:55 - 2:59
    and it becomes this race
    to the bottom of the brain stem
  • 2:59 - 3:01
    of who can go lower to get it.
  • 3:02 - 3:04
    Let me give you an example of Snapchat.
  • 3:04 - 3:08
    If you didn't know,
    Snapchat is the number one way
  • 3:08 - 3:10
    that teenagers in
    the United States communicate.
  • 3:10 - 3:14
    So if you're like me, and you use
    text messages to communicate,
  • 3:14 - 3:16
    Snapchat is that for teenagers,
  • 3:16 - 3:19
    and there's, like,
    a hundred million of them that use it.
  • 3:19 - 3:21
    And they invented
    a feature called Snapstreaks,
  • 3:21 - 3:23
    which shows the number of days in a row
  • 3:23 - 3:26
    that two people have
    communicated with each other.
  • 3:26 - 3:28
    In other words, what they just did
  • 3:28 - 3:31
    is they gave two people
    something they don't want to lose.
  • 3:32 - 3:35
    Because if you're a teenager,
    and you have 150 days in a row,
  • 3:35 - 3:37
    you don't want that to go away.
  • 3:37 - 3:42
    And so think of the little blocks of time
    that that schedules in kids' minds.
  • 3:42 - 3:44
    This isn't theoretical:
    when kids go on vacation,
  • 3:45 - 3:48
    it's been shown they give their passwords
    to up to five other friends
  • 3:48 - 3:50
    to keep their Snapstreaks going,
  • 3:50 - 3:52
    even when they can't do it.
  • 3:52 - 3:54
    And they have, like, 30 of these things,
  • 3:54 - 3:57
    and so they have to get through
    taking photos of just pictures or walls
  • 3:57 - 4:00
    or ceilings just to get through their day.
  • 4:01 - 4:04
    So it's not even like
    they're having real conversations.
  • 4:04 - 4:06
    We have a temptation to think about this
  • 4:06 - 4:09
    as, oh, they're just using Snapchat
  • 4:09 - 4:11
    the way we used to
    gossip on the telephone.
  • 4:11 - 4:12
    It's probably OK.
  • 4:12 - 4:15
    Well, what this misses
    is that in the 1970s,
  • 4:15 - 4:17
    when you were just
    gossiping on the telephone,
  • 4:17 - 4:20
    there weren't a hundred engineers
    on the other side of the screen
  • 4:20 - 4:22
    who knew exactly
    how your psychology worked
  • 4:23 - 4:25
    and orchestrated you
    into a double bind with each other.
  • 4:26 - 4:30
    Now, if this is making you
    feel a little bit of outrage,
  • 4:31 - 4:33
    notice that that thought
    just comes over you.
  • 4:33 - 4:37
    Outrage is a really good way also
    of getting your attention,
  • 4:38 - 4:39
    because we don't choose outrage.
  • 4:39 - 4:41
    It happens to us.
  • 4:41 - 4:43
    And if you're the Facebook newsfeed,
  • 4:43 - 4:44
    whether you'd want to or not,
  • 4:44 - 4:47
    you actually benefit when there's outrage.
  • 4:47 - 4:50
    Because outrage
    doesn't just schedule a reaction
  • 4:50 - 4:53
    in emotional time and space for you.
  • 4:53 - 4:56
    We want to share that outrage
    with other people.
  • 4:56 - 4:57
    So we want to hit share and say,
  • 4:57 - 5:00
    "Can you believe the thing
    that they said?"
  • 5:01 - 5:04
    And so outrage works really well
    at getting attention,
  • 5:04 - 5:08
    such that if Facebook had a choice
    between showing you the outrage feed
  • 5:08 - 5:09
    and a calm newsfeed,
  • 5:10 - 5:12
    they would want
    to show you the outrage feed,
  • 5:12 - 5:14
    not because someone
    consciously chose that,
  • 5:14 - 5:17
    but because that worked better
    at getting your attention.
  • 5:19 - 5:25
    And the newsfeed control room
    is not accountable to us.
  • 5:25 - 5:27
    It's only accountable
    to maximizing attention.
  • 5:27 - 5:29
    It's also accountable,
  • 5:29 - 5:31
    because of the business model
    of advertising,
  • 5:31 - 5:34
    to anybody who can pay the most
    to actually walk into the control room
  • 5:34 - 5:36
    and say, "That group over there,
  • 5:36 - 5:39
    I want to schedule these thoughts
    into their minds."
  • 5:40 - 5:41
    So you can target,
  • 5:42 - 5:44
    you can precisely target a lie
  • 5:44 - 5:47
    directly to the people
    who are most susceptible.
  • 5:48 - 5:51
    And because this is profitable,
    it's only going to get worse.
  • 5:53 - 5:55
    So I'm here today
  • 5:56 - 5:58
    because the costs are so obvious.
  • 6:00 - 6:02
    I don't know a more urgent
    problem than this,
  • 6:02 - 6:06
    because this problem
    is underneath all other problems.
  • 6:07 - 6:10
    It's not just taking away our agency
  • 6:10 - 6:13
    to spend our attention
    and live the lives that we want,
  • 6:14 - 6:17
    it's changing the way
    that we have our conversations,
  • 6:17 - 6:19
    it's changing our democracy,
  • 6:19 - 6:22
    and it's changing our ability
    to have the conversations
  • 6:22 - 6:24
    and relationships we want with each other.
  • 6:25 - 6:27
    And it affects everyone,
  • 6:27 - 6:30
    because a billion people
    have one of these in their pocket.
  • 6:33 - 6:35
    So how do we fix this?
  • 6:37 - 6:40
    We need to make three radical changes
  • 6:40 - 6:42
    to technology and to our society.
  • 6:44 - 6:48
    The first is we need to acknowledge
    that we are persuadable.
  • 6:49 - 6:50
    Once you start understanding
  • 6:50 - 6:53
    that your mind can be scheduled
    into having little thoughts
  • 6:53 - 6:56
    or little blocks of time
    that you didn't choose,
  • 6:56 - 6:58
    wouldn't we want to use that understanding
  • 6:58 - 7:00
    and protect against the way
    that that happens?
  • 7:01 - 7:04
    I think we need to see ourselves
    fundamentally in a new way.
  • 7:04 - 7:06
    It's almost like a new period
    of human history,
  • 7:06 - 7:07
    like the Enlightenment,
  • 7:07 - 7:10
    but almost a kind of
    self-aware Enlightenment,
  • 7:10 - 7:12
    that we can be persuaded,
  • 7:12 - 7:15
    and there might be something
    we want to protect.
  • 7:15 - 7:20
    The second is we need new models
    and accountability systems
  • 7:20 - 7:23
    so that as the world gets better
    and more and more persuasive over time --
  • 7:24 - 7:26
    because it's only going
    to get more persuasive --
  • 7:26 - 7:28
    that the people in those control rooms
  • 7:28 - 7:30
    are accountable and transparent
    to what we want.
  • 7:30 - 7:33
    The only form of ethical
    persuasion that exists
  • 7:33 - 7:35
    is when the goals of the persuader
  • 7:35 - 7:37
    are aligned with the goals
    of the persuadee.
  • 7:38 - 7:41
    And that involves questioning big things,
    like the business model of advertising.
  • 7:43 - 7:44
    Lastly,
  • 7:44 - 7:46
    we need a design renaissance,
  • 7:47 - 7:50
    because once you have
    this view of human nature,
  • 7:50 - 7:53
    that you can steer the timelines
    of a billion people --
  • 7:53 - 7:56
    just imagine, there's people
    who have some desire
  • 7:56 - 7:59
    about what they want to do
    and what they want to be thinking
  • 7:59 - 8:02
    and what they want to be feeling
    and how they want to be informed,
  • 8:02 - 8:04
    and we're all just tugged
    into these other directions.
  • 8:05 - 8:08
    And you have a billion people just tugged
    into all these different directions.
  • 8:08 - 8:10
    Well, imagine an entire design renaissance
  • 8:10 - 8:13
    that tried to orchestrate
    the exact and most empowering
  • 8:13 - 8:17
    time-well-spent way
    for those timelines to happen.
  • 8:17 - 8:18
    And that would involve two things:
  • 8:18 - 8:20
    one would be protecting
    against the timelines
  • 8:20 - 8:22
    that we don't want to be experiencing,
  • 8:22 - 8:25
    the thoughts that we
    wouldn't want to be happening,
  • 8:25 - 8:28
    so that when that ding happens,
    not having the ding that sends us away;
  • 8:28 - 8:32
    and the second would be empowering us
    to live out the timeline that we want.
  • 8:32 - 8:34
    So let me give you a concrete example.
  • 8:34 - 8:37
    Today, let's say your friend
    cancels dinner on you,
  • 8:37 - 8:41
    and you are feeling a little bit lonely.
  • 8:41 - 8:42
    And so what do you do in that moment?
  • 8:42 - 8:44
    You open up Facebook.
  • 8:45 - 8:47
    And in that moment,
  • 8:47 - 8:50
    the designers in the control room
    want to schedule exactly one thing,
  • 8:50 - 8:53
    which is to maximize how much time
    you spend on the screen.
  • 8:55 - 8:59
    Now, instead, imagine if those designers
    created a different timeline
  • 8:59 - 9:02
    that was the easiest way,
    using all of their data,
  • 9:02 - 9:05
    to actually help you get out
    with the people that you care about?
  • 9:05 - 9:11
    Just think, alleviating
    all loneliness in society,
  • 9:11 - 9:14
    if that was the timeline that Facebook
    wanted to make possible for people.
  • 9:14 - 9:16
    Or imagine a different conversation.
  • 9:16 - 9:19
    Let's say you wanted to post
    something supercontroversial on Facebook,
  • 9:19 - 9:22
    which is a really important
    thing to be able to do,
  • 9:22 - 9:23
    to talk about controversial topics.
  • 9:23 - 9:26
    And right now, when there's
    that big comment box,
  • 9:26 - 9:29
    it's almost asking you,
    what key do you want to type?
  • 9:29 - 9:32
    In other words, it's scheduling
    a little timeline of things
  • 9:32 - 9:34
    you're going to continue
    to do on the screen.
  • 9:34 - 9:37
    And imagine instead that there was
    another button there saying,
  • 9:37 - 9:39
    what would be most
    time well spent for you?
  • 9:39 - 9:41
    And you click "host a dinner."
  • 9:41 - 9:43
    And right there
    underneath the item it said,
  • 9:43 - 9:45
    "Who wants to RSVP for the dinner?"
  • 9:45 - 9:48
    And so you'd still have a conversation
    about something controversial,
  • 9:48 - 9:52
    but you'd be having it in the most
    empowering place on your timeline,
  • 9:52 - 9:55
    which would be at home that night
    with a bunch of friends over
  • 9:55 - 9:56
    to talk about it.
  • 9:57 - 10:00
    So imagine we're running, like,
    a find and replace
  • 10:01 - 10:04
    on all of the timelines
    that are currently steering us
  • 10:04 - 10:06
    towards more and more
    screen time persuasively
  • 10:07 - 10:10
    and replacing all of those timelines
  • 10:10 - 10:11
    with what we want in our lives.
  • 10:15 - 10:16
    It doesn't have to be this way.
  • 10:18 - 10:21
    Instead of handicapping our attention,
  • 10:21 - 10:23
    imagine if we used all of this data
    and all of this power
  • 10:23 - 10:25
    and this new view of human nature
  • 10:25 - 10:28
    to give us a superhuman ability to focus
  • 10:28 - 10:32
    and a superhuman ability to put
    our attention to what we cared about
  • 10:32 - 10:35
    and a superhuman ability
    to have the conversations
  • 10:35 - 10:37
    that we need to have for democracy.
  • 10:40 - 10:42
    The most complex challenges in the world
  • 10:44 - 10:47
    require not just us
    to use our attention individually.
  • 10:48 - 10:52
    They require us to use our attention
    and coordinate it together.
  • 10:52 - 10:55
    Climate change is going to require
    that a lot of people
  • 10:55 - 10:57
    are able
    to coordinate their attention
  • 10:57 - 10:59
    in the most empowering way together.
  • 10:59 - 11:02
    And imagine creating
    a superhuman ability to do that.
  • 11:07 - 11:11
    Sometimes the world's
    most pressing and important problems
  • 11:12 - 11:16
    are not these hypothetical future things
    that we could create in the future.
  • 11:17 - 11:18
    Sometimes the most pressing problems
  • 11:18 - 11:21
    are the ones that are
    right underneath our noses,
  • 11:21 - 11:24
    the things that are already directing
    a billion people's thoughts.
  • 11:25 - 11:28
    And maybe instead of getting excited
    about the new augmented reality
  • 11:28 - 11:31
    and virtual reality
    and these cool things that could happen,
  • 11:31 - 11:35
    which are going to be susceptible
    to the same race for attention,
  • 11:35 - 11:37
    if we could fix the race for attention
  • 11:37 - 11:40
    on the thing that's already
    in a billion people's pockets.
  • 11:40 - 11:42
    Maybe instead of getting excited
  • 11:42 - 11:46
    about the most exciting
    new cool fancy education apps,
  • 11:46 - 11:49
    we could fix the way
    kids' minds are getting manipulated
  • 11:49 - 11:51
    into sending empty messages
    back and forth.
  • 11:52 - 11:56
    (Applause)
  • 11:56 - 11:58
    Maybe instead of worrying
  • 11:58 - 12:01
    about hypothetical future
    runaway artificial intelligences
  • 12:01 - 12:03
    that are maximizing for one goal,
  • 12:05 - 12:07
    we could solve the runaway
    artificial intelligence
  • 12:07 - 12:09
    that already exists right now,
  • 12:09 - 12:12
    which are these newsfeeds
    maximizing for one thing.
  • 12:14 - 12:18
    It's almost like instead of running away
    to colonize new planets,
  • 12:18 - 12:20
    we could fix the one
    that we're already on.
  • 12:20 - 12:24
    (Applause)
  • 12:28 - 12:30
    Solving this problem
  • 12:30 - 12:34
    is critical infrastructure
    for solving every other problem.
  • 12:35 - 12:39
    There's nothing in your life
    or in our collective problems
  • 12:39 - 12:42
    that does not require our ability
    to put our attention where we care about.
  • 12:44 - 12:45
    At the end of our lives,
  • 12:46 - 12:49
    all we have is our attention and our time.
  • 12:50 - 12:52
    What will be time well spent for ours?
  • 12:52 - 12:53
    Thank you.
  • 12:53 - 12:56
    (Applause)
  • 13:06 - 13:09
    Chris Anderson: Tristan, thank you.
    Hey, stay up here a sec.
  • 13:09 - 13:10
    First of all, thank you.
  • 13:10 - 13:13
    I know we asked you to do this talk
    on pretty short notice,
  • 13:13 - 13:15
    and you've had quite a stressful week
  • 13:15 - 13:18
    getting this thing together, so thank you.
  • 13:19 - 13:23
    Some people listening might say,
    what you complain about is addiction,
  • 13:23 - 13:26
    and all these people doing this stuff,
    for them it's actually interesting.
  • 13:26 - 13:27
    All these design decisions
  • 13:27 - 13:31
    have built user content
    that is fantastically interesting.
  • 13:31 - 13:33
    The world's more interesting
    than it ever has been.
  • 13:33 - 13:34
    What's wrong with that?
  • 13:34 - 13:37
    Tristan Harris:
    I think it's really interesting.
  • 13:37 - 13:41
    One way to see this
    is if you're just YouTube, for example,
  • 13:41 - 13:43
    you want to always show
    the more interesting next video.
  • 13:43 - 13:46
    You want to get better and better
    at suggesting that next video,
  • 13:46 - 13:49
    but even if you could propose
    the perfect next video
  • 13:49 - 13:50
    that everyone would want to watch,
  • 13:51 - 13:54
    it would just be better and better
    at keeping you hooked on the screen.
  • 13:54 - 13:56
    So what's missing in that equation
  • 13:56 - 13:58
    is figuring out what
    our boundaries would be.
  • 13:58 - 14:01
    You would want YouTube to know
    something about, say, falling asleep.
  • 14:01 - 14:03
    The CEO of Netflix recently said,
  • 14:03 - 14:05
    "our biggest competitors
    are Facebook, YouTube and sleep."
  • 14:05 - 14:10
    And so what we need to recognize
    is that the human architecture is limited
  • 14:10 - 14:13
    and that we have certain boundaries
    or dimensions of our lives
  • 14:13 - 14:15
    that we want to be honored and respected,
  • 14:15 - 14:17
    and technology could help do that.
  • 14:17 - 14:19
    (Applause)
  • 14:19 - 14:21
    CA: I mean, could you make the case
  • 14:21 - 14:27
    that part of the problem here is that
    we've got a naïve model of human nature?
  • 14:27 - 14:30
    So much of this is justified
    in terms of human preference,
  • 14:30 - 14:32
    where we've got these algorithms
    that do an amazing job
  • 14:33 - 14:34
    of optimizing for human preference,
  • 14:34 - 14:36
    but which preference?
  • 14:36 - 14:39
    There's the preferences
    of things that we really care about
  • 14:39 - 14:40
    when we think about them
  • 14:41 - 14:44
    versus the preferences
    of what we just instinctively click on.
  • 14:44 - 14:48
    If we could implant that more nuanced
    view of human nature in every design,
  • 14:48 - 14:50
    would that be a step forward?
  • 14:50 - 14:52
    TH: Absolutely. I mean, I think right now
  • 14:52 - 14:55
    it's as if all of our technology
    is basically only asking our lizard brain
  • 14:55 - 14:58
    what's the best way
    to just impulsively get you to do
  • 14:58 - 15:00
    the next tiniest thing with your time,
  • 15:00 - 15:02
    instead of asking you in your life
  • 15:02 - 15:04
    what would be most
    time well spent for you?
  • 15:04 - 15:07
    What would be the perfect timeline
    that might include something later,
  • 15:07 - 15:10
    would be time well spent for you
    here at TED in your last day here?
  • 15:10 - 15:13
    CA: So if Facebook and Google
    and everyone said to us first up,
  • 15:13 - 15:16
    "Hey, would you like us
    to optimize for your reflective brain
  • 15:16 - 15:18
    or your lizard brain? You choose."
  • 15:18 - 15:20
    TH: Right. That would be one way. Yes.
  • 15:22 - 15:25
    CA: You said persuadability,
    that's an interesting word to me
  • 15:25 - 15:28
    because to me there's
    two different types of persuadability.
  • 15:28 - 15:31
    There's the persuadability
    that we're trying right now
  • 15:31 - 15:33
    of reason and thinking
    and making an argument,
  • 15:33 - 15:36
    but I think you're almost
    talking about a different kind,
  • 15:36 - 15:37
    a more visceral type of persuadability,
  • 15:38 - 15:40
    of being persuaded without
    even knowing that you're thinking.
  • 15:40 - 15:43
    TH: Exactly. The reason
    I care about this problem so much is
  • 15:43 - 15:46
    I studied at a lab called
    the Persuasive Technology Lab at Stanford
  • 15:47 - 15:49
    that taught people
    exactly these techniques.
  • 15:49 - 15:52
    There's conferences and workshops
    that teach people all these covert ways
  • 15:52 - 15:55
    of getting people's attention
    and orchestrating people's lives.
  • 15:55 - 15:58
    And it's because most people
    don't know that that exists
  • 15:58 - 16:00
    that this conversation is so important.
  • 16:00 - 16:03
    CA: Tristan, you and I, we both know
    so many people from all these companies.
  • 16:04 - 16:05
    There are actually many here in the room,
  • 16:06 - 16:08
    and I don't know about you,
    but my experience of them
  • 16:08 - 16:10
    is that there is
    no shortage of good intent.
  • 16:10 - 16:12
    People want a better world.
  • 16:12 - 16:16
    They are actually -- they really want it.
  • 16:16 - 16:20
    And I don't think anything you're saying
    is that these are evil people.
  • 16:21 - 16:24
    It's a system where there's
    these unintended consequences
  • 16:24 - 16:26
    that have really got out of control --
  • 16:26 - 16:28
    TH: Of this race for attention.
  • 16:28 - 16:31
    It's the classic race to the bottom
    when you have to get attention,
  • 16:31 - 16:32
    and it's so tense.
  • 16:32 - 16:35
    The only way to get more
    is to go lower on the brain stem,
  • 16:35 - 16:37
    to go lower into outrage,
    to go lower into emotion,
  • 16:37 - 16:39
    to go lower into the lizard brain.
  • 16:39 - 16:43
    CA: Well, thank you so much for helping us
    all get a little bit wiser about this.
  • 16:43 - 16:45
    Tristan Harris, thank you.
    TH: Thank you very much.
  • 16:45 - 16:48
    (Applause)