Stand out of our light | James Williams | TEDxAthens

  • 0:10 - 0:11
    Good afternoon.
  • 0:11 - 0:16
    Most philosophers do not live
    in big ceramic barrels
  • 0:16 - 0:18
    in their local supermarket.
  • 0:18 - 0:20
    But there was one,
  • 0:20 - 0:22
    just down the road from here, actually,
  • 0:22 - 0:24
    not so very long ago.
  • 0:25 - 0:28
    His name was Diogenes of Sinope,
  • 0:28 - 0:34
    and he was probably the closest thing
    philosophy has ever produced to a troll.
  • 0:34 - 0:38
    He was rude, outrageous,
    impulsive, offensive,
  • 0:39 - 0:43
    but he was deeply admired
    by Alexander the Great,
  • 0:43 - 0:47
    who was, arguably, the most powerful
    person in the world at the time.
  • 0:48 - 0:51
    It's said that one day,
    Alexander went to visit Diogenes
  • 0:51 - 0:53
    in his big barrel in the marketplace
  • 0:53 - 0:56
    and went up to him and said,
  • 0:56 - 0:59
    "Diogenes, I will grant
    any one wish that you have;
  • 0:59 - 1:00
    just tell me what you want."
  • 1:01 - 1:04
    And Diogenes was lying
    in the sun at the time,
  • 1:04 - 1:08
    and true to form, he looked up
    at Alexander and replied,
  • 1:08 - 1:10
    "Stand out of my light."
  • 1:12 - 1:16
    And I love this story
    because it has lessons for us
  • 1:16 - 1:19
    about how we should be responding
    to the Alexanders of our time,
  • 1:19 - 1:23
    our digital technologies
    and the people who create them.
  • 1:23 - 1:25
    Because like Alexander,
  • 1:25 - 1:26
    they've come into our lives
  • 1:26 - 1:30
    and offered to fulfill all sorts
    of needs and wishes that we have,
  • 1:30 - 1:33
    and in many ways,
    they've done so extremely well.
  • 1:33 - 1:38
    But we're beginning to realize now
    that in doing so, they, like Alexander,
  • 1:38 - 1:42
    have also been standing
    in our light, in a sense,
  • 1:42 - 1:45
    and in one light in particular
  • 1:45 - 1:49
    that is so precious and so essential
    for human flourishing,
  • 1:49 - 1:51
    that without this light,
  • 1:51 - 1:55
    the other benefits of technology
    might not do us very much good.
  • 1:56 - 1:59
    The light that I mean
    is the light of our attention.
  • 2:00 - 2:04
    There's something profound
    and potentially irreversible
  • 2:04 - 2:07
    happening to human attention
    in the digital age.
  • 2:08 - 2:10
    It's more than just distraction.
  • 2:10 - 2:14
    It's more than just addiction
    or manipulation.
  • 2:15 - 2:18
    In fact, I think that the way
    we respond to this challenge
  • 2:18 - 2:22
    could be the defining moral
    and political issue of our time.
  • 2:23 - 2:28
    I'd like to tell you why I think so
    and what I think we can do about it.
  • 2:30 - 2:33
    In the 1970s, Herbert Simon pointed out
  • 2:33 - 2:36
    that in an environment
    of information abundance,
  • 2:36 - 2:39
    attention becomes the scarce resource.
  • 2:39 - 2:43
    There's a kind of figure-ground
    inversion that takes place,
  • 2:43 - 2:46
    and this inversion has happened
    so quickly and so recently
  • 2:46 - 2:49
    that we're still just beginning
    to come to terms
  • 2:49 - 2:51
    with what it means for human life.
  • 2:52 - 2:54
    But because attention
    is the scarce resource,
  • 2:54 - 2:56
    it is now the object of competition
  • 2:56 - 2:59
    among most of the technologies
    we use every day.
  • 2:59 - 3:03
    The total environment
    of competition for our attention
  • 3:03 - 3:05
    is often called "the attention economy,"
  • 3:05 - 3:09
    and in the attention economy,
    there are no truly free products.
  • 3:09 - 3:11
    You pay with your attentional labor
  • 3:11 - 3:15
    every time you look
    or tap or scroll or click,
  • 3:16 - 3:20
    and this is exactly what
    they're designed to try to get you to do.
  • 3:20 - 3:25
    And they use your attentional labor
    to advance their goals, not yours.
  • 3:26 - 3:30
    Because there is a difference
    between their goals and yours.
  • 3:30 - 3:33
    If you think about the goals
    that you have for yourself
  • 3:33 - 3:36
    today, this year and even beyond,
  • 3:36 - 3:40
    they're probably things like
    "I want to spend more time with family"
  • 3:40 - 3:44
    or "I want to learn how to play the piano"
  • 3:44 - 3:47
    or "I want to take that trip
    I've been thinking about for a while."
  • 3:47 - 3:49
    You know, these are real human goals,
  • 3:49 - 3:51
    the stuff that, when
    we're on our deathbed,
  • 3:51 - 3:54
    if we haven't done it, we'll regret it.
  • 3:54 - 3:57
    But if you look at what the technologies
    of the attention economy
  • 3:57 - 4:00
    are designed to promote in our lives,
  • 4:00 - 4:02
    you don't see these goals.
  • 4:02 - 4:05
    What you see are things like
  • 4:05 - 4:07
    "Maximize the amount of time
    I spend using it"
  • 4:07 - 4:10
    or "the amount of clicks that I make"
  • 4:10 - 4:14
    or "the number of pages
    or ads that I view."
  • 4:14 - 4:18
    Now, I don't know anybody
    who has these goals for themselves.
  • 4:18 - 4:19
    Does anybody wake up in the morning
  • 4:19 - 4:23
    and think, "How much time
    can I possibly spend on Facebook today?"
  • 4:24 - 4:25
    I certainly don't.
  • 4:25 - 4:28
    If there's someone out there like that,
    I'd love to meet them.
  • 4:28 - 4:29
    But what this means
  • 4:29 - 4:32
    is that there's a deep gap
    between our goals and theirs
  • 4:32 - 4:36
    and that the technologies of the attention
    economy are not on our side;
  • 4:37 - 4:39
    their goals are not our goals.
  • 4:39 - 4:42
    These are distractions,
    petty distractions,
  • 4:42 - 4:44
    from the goals of life.
  • 4:44 - 4:46
    And this seems to me
    to be a really big deal,
  • 4:47 - 4:50
    even more so because the creators
    of these technologies
  • 4:50 - 4:51
    know that this is the case.
  • 4:51 - 4:55
    Steve Jobs did not let
    his children use the iPad.
  • 4:55 - 4:57
    The CEO of Netflix, a little while back,
  • 4:57 - 5:00
    said that in addition
    to Snapchat and YouTube,
  • 5:00 - 5:03
    one of their biggest
    competitors was sleep.
  • 5:04 - 5:07
    This seems to me
    a crisis of design ethics,
  • 5:07 - 5:09
    a crisis of self-regulation
  • 5:09 - 5:13
    that design is actually amplifying
    and making even worse.
  • 5:13 - 5:16
    In the last couple of decades,
  • 5:16 - 5:18
    psychology and behavioral
    economics research
  • 5:18 - 5:23
    has cataloged an enormous number
    of vulnerabilities in our brains,
  • 5:23 - 5:27
    little buttons that can be pushed
    to get us to think or do certain things.
  • 5:28 - 5:30
    In parallel with this,
  • 5:30 - 5:35
    the advertising industry
    has effectively colonized the internet
  • 5:35 - 5:39
    and turned it into a large-scale system
    of industrial persuasion -
  • 5:39 - 5:43
    of measurement, of optimization,
    of message delivery.
  • 5:44 - 5:47
    What's more, this power,
    this persuasive power,
  • 5:47 - 5:51
    is more centralized
    than at any time in human history.
  • 5:51 - 5:52
    Never before in history
  • 5:52 - 5:57
    have a few people at a few companies,
    in one state, in one country,
  • 5:57 - 6:02
    been able to shape the attentional habits
    of billions of human beings.
  • 6:02 - 6:06
    Alexander could have never
    even dreamed of that sort of power.
  • 6:06 - 6:11
    So I think it's no hyperbole to say
    that the digital attention economy
  • 6:11 - 6:15
    is the largest
    and most effective system
  • 6:15 - 6:20
    for human attitudinal and behavioral
    manipulation the world has ever seen.
  • 6:20 - 6:23
    And again, this seems to me
    an enormous question.
  • 6:23 - 6:25
    I think what's happened
  • 6:25 - 6:29
    is that, as Aldous Huxley said
    of the defenders of freedom in his time,
  • 6:29 - 6:32
    that they had "failed to take into account
  • 6:32 - 6:35
    man's almost infinite appetite
    for distractions."
  • 6:35 - 6:37
    I think that in the design
    of digital technologies,
  • 6:37 - 6:39
    we've made exactly the same mistake,
  • 6:39 - 6:44
    and I think that it is urgent for us
    to start taking that appetite into account.
  • 6:45 - 6:47
    So how can we start to do that?
  • 6:47 - 6:49
    Well, I think what
    it would require, essentially,
  • 6:49 - 6:54
    is to start asserting and defending
    our freedom of attention.
  • 6:54 - 6:57
    Now, this is a type of freedom
    we have always had
  • 6:57 - 7:00
    but never needed
    to seriously assert or defend
  • 7:00 - 7:02
    because there wasn't
    a whole lot in our world
  • 7:02 - 7:04
    that could seriously threaten it.
  • 7:05 - 7:10
    But I think we can find good precedent
    in the great writers on the subject.
  • 7:11 - 7:13
    For instance, John Stuart Mill,
  • 7:13 - 7:17
    who said that "The appropriate
    region of human liberty
  • 7:17 - 7:20
    comprises the inward domain
    of consciousness."
  • 7:20 - 7:23
    Freedom of mind
    is the first sort of freedom.
  • 7:23 - 7:27
    He adds that the principle of liberty
    includes "liberty of tastes and pursuits;
  • 7:27 - 7:30
    of framing the plan of our life
    to suit our own character."
  • 7:30 - 7:32
    So what this suggests to me
  • 7:32 - 7:35
    is that we need
    to start thinking more broadly
  • 7:35 - 7:37
    about what we mean
    by the concept of attention
  • 7:37 - 7:41
    in order to take into account
    the full spectrum of distractions
  • 7:41 - 7:44
    that are now being unleashed in our world.
  • 7:44 - 7:46
    Because when we hear the term "attention,"
  • 7:46 - 7:49
    what we normally think of
    is the spotlight of attention,
  • 7:49 - 7:53
    kind of the immediate way we shape
    our awareness within the task domain,
  • 7:53 - 7:57
    so the attention that you are all
    giving to me right now in this moment,
  • 7:57 - 8:00
    attention for which, by the way,
    I am very grateful.
  • 8:01 - 8:05
    But when the spotlight
    of our attention gets obscured,
  • 8:05 - 8:07
    it sort of interferes
    with our ability to act.
  • 8:07 - 8:09
    So let's say I'm trying to read a book,
  • 8:09 - 8:14
    but I see on my phone that Donald Trump
    has unleashed another outrageous tweet,
  • 8:14 - 8:18
    and so I stop reading my book
    and don't finish reading it until later.
  • 8:20 - 8:26
    But over time, actions become habits;
    the things we do become the people we are.
  • 8:27 - 8:32
    And we don't have a way of talking
    about attention in this longer-term view
  • 8:32 - 8:35
    with respect to our higher
    goals and our values.
  • 8:36 - 8:39
    So I think that we could maybe think
    of another light of attention
  • 8:39 - 8:41
    beyond this spotlight of attention.
  • 8:41 - 8:44
    We could think of
    the starlight of attention,
  • 8:44 - 8:49
    so the way we navigate our lives
    by the stars of our higher values.
  • 8:49 - 8:54
    We can see technology obscuring
    the starlight of our attention
  • 8:54 - 8:58
    especially clearly
    in infinite scrolling news feeds,
  • 8:58 - 9:00
    like on Facebook or Twitter.
  • 9:00 - 9:02
    When you pull down to refresh,
  • 9:02 - 9:05
    the same psychological
    mechanism is at play
  • 9:05 - 9:07
    as in the design
    of slot machines:
  • 9:07 - 9:09
    intermittent variable rewards.
  • 9:09 - 9:11
    When you randomize
    the reward you give somebody,
  • 9:11 - 9:15
    they're more likely to do
    the behavior you want them to do.
  • 9:16 - 9:21
    And when the attention economy
    stands in the starlight of our attention,
  • 9:21 - 9:24
    it shapes our lives in its image;
  • 9:24 - 9:27
    our values become its values.
  • 9:27 - 9:31
    We become more petty,
    more narcissistic, more impulsive.
  • 9:33 - 9:35
    And I think this is perfectly represented
  • 9:35 - 9:40
    by the CBS CEO's comment
    from February of last year,
  • 9:40 - 9:43
    when he said, "Donald Trump's candidacy
    may not be good for America,
  • 9:43 - 9:45
    but it's damn good for CBS."
  • 9:45 - 9:49
    The attention economy
    doesn't just shape our lives in its image;
  • 9:49 - 9:51
    it shapes our politics in its image.
  • 9:51 - 9:54
    Again, I think this is
    an urgent moral question
  • 9:54 - 9:57
    that is being talked about
    by virtually no one.
  • 9:58 - 10:03
    But I think we could find one more light
    of our attention to talk about.
  • 10:03 - 10:08
    It's when the technology
    doesn't just make it harder
  • 10:08 - 10:11
    to do what we want to do
    or to be who we want to be,
  • 10:11 - 10:14
    but also harder, in a sense, to want
    what we want to want -
  • 10:14 - 10:17
    to define our goals and values
    in the first place.
  • 10:18 - 10:21
    So we can think of this
    as the daylight of our attention,
  • 10:21 - 10:24
    the light by which we're able
    to do everything else.
  • 10:24 - 10:27
    When technology undermines
    the daylight of our attention,
  • 10:27 - 10:28
    it erodes our fundamental capacities
  • 10:28 - 10:33
    like reason, reflection,
    intelligence, metacognition.
  • 10:35 - 10:37
    One way we see this very clearly
  • 10:37 - 10:42
    is in the proliferation of outrage
    in our societies and in our world.
  • 10:42 - 10:47
    Outrage - the impulse
    to judge and punish -
  • 10:47 - 10:50
    was extremely valuable
    at earlier stages of human evolution
  • 10:50 - 10:56
    in small foraging groups,
    where it promoted moral clarity
    and social solidarity.
  • 10:56 - 10:59
    It was a way of signalling to other people
    that you could be trusted.
  • 10:59 - 11:02
    But when we amplify this
    on a societal scale,
  • 11:02 - 11:08
    it results in large-scale social division
    and rampant retaliation.
  • 11:08 - 11:09
    To give you one example -
  • 11:09 - 11:11
    I don't know how many of you
    remember this -
  • 11:12 - 11:15
    there was a dentist from Minnesota,
    a little while back,
  • 11:15 - 11:18
    who went to Zimbabwe
    and killed a lion named Cecil.
  • 11:18 - 11:21
    It was a stupid thing to do;
    he probably shouldn't have done it.
  • 11:21 - 11:23
    It might have been illegal - I don't know.
  • 11:24 - 11:26
    But what happened as a result of that
  • 11:26 - 11:30
    is the entire internet came down
    on this man for a bad decision.
  • 11:31 - 11:35
    It was this whole sort of festival
    of public shaming.
  • 11:35 - 11:39
    People showed up at his place of work,
    putting signs on it saying, "Rot in hell."
  • 11:39 - 11:42
    They showed up to his home
    and spray-painted it.
  • 11:43 - 11:46
    When children do this sort of thing,
    we call it "cyber-bullying."
  • 11:46 - 11:49
    But when adults do it,
    it is mob rule, plain and simple.
  • 11:49 - 11:52
    And mob rule is precisely
    what Socrates held
  • 11:52 - 11:57
    was the main route democracies take
    when they turn into tyrannies.
  • 11:58 - 12:01
    So, we can think beyond
    the spotlight of our attention;
  • 12:01 - 12:04
    we can think not just in terms
    of doing what we want to do
  • 12:04 - 12:06
    but being who we want to be
  • 12:06 - 12:09
    and ultimately wanting
    what we want to want.
  • 12:10 - 12:16
    This is an intolerable situation;
    this should not persist.
  • 12:16 - 12:20
    As Aristotle said, "It is disgraceful
    to be unable to use our good things."
  • 12:20 - 12:22
    We should not have to settle
  • 12:22 - 12:25
    for a relationship with technology
    that is adversarial.
  • 12:25 - 12:28
    We should demand that they be on our side.
  • 12:28 - 12:30
    Isn't that what technology is for?
  • 12:31 - 12:32
    So how would we do that?
  • 12:32 - 12:36
    Well, in the past, we've typically
    put it back on people themselves
  • 12:36 - 12:39
    to deal with distraction,
    to deal with the effects of technology,
  • 12:39 - 12:40
    telling them to work harder.
  • 12:40 - 12:45
    But in the digital age, the persuasion
    is just too powerful and too ubiquitous,
  • 12:45 - 12:46
    and this will not work.
  • 12:47 - 12:50
    But neither can we blame the people
    who make these technologies.
  • 12:50 - 12:54
    These are by and large good people,
    and I count many of them as my friends.
  • 12:55 - 12:58
    They're just players in a game
    called the attention economy.
  • 12:58 - 13:00
    The problem is that game.
  • 13:01 - 13:05
    Ultimately, this is not a problem
    of the ethics of individual actors;
  • 13:05 - 13:09
    it's a problem of the ethics
    of the system, of the infrastructure,
  • 13:09 - 13:13
    what philosopher Luciano Floridi
    at Oxford calls "infraethics."
  • 13:14 - 13:18
    So how can we change the situation?
  • 13:18 - 13:20
    Well, go back to what I said earlier
  • 13:20 - 13:22
    about how what we're doing
    is attentional labor
  • 13:22 - 13:24
    when we're using these technologies
  • 13:24 - 13:27
    and paying for them
    with our time and our attention.
  • 13:28 - 13:31
    In this light, we can frame
    the problem in two ways.
  • 13:31 - 13:35
    One is that we're getting poor value
    for our attentional labor.
  • 13:35 - 13:39
    The other problem is that
    the conditions of that attentional labor
  • 13:39 - 13:40
    are extremely poor.
  • 13:41 - 13:44
    Now, throughout history, when people
    have been faced with this situation,
  • 13:44 - 13:46
    what they've done is to organize,
  • 13:46 - 13:50
    to create mechanisms
    of collective representation
  • 13:50 - 13:53
    so that they collectively negotiate
    with those in power,
  • 13:53 - 13:56
    with those Alexanders of their time.
  • 13:56 - 14:01
    So what I think is needed
    is something that will give us a voice,
  • 14:01 - 14:02
    a direct voice,
  • 14:02 - 14:04
    in the design of our technologies,
  • 14:04 - 14:07
    and no mechanism like this exists today.
  • 14:07 - 14:09
    So what I am calling for
    and what is needed
  • 14:09 - 14:15
    is a labor union or something like it
    for the attention economy.
  • 14:15 - 14:17
    And there's a community of people
  • 14:17 - 14:20
    who are passionate
    about thinking through these issues,
  • 14:20 - 14:22
    going under the name Time Well Spent,
  • 14:22 - 14:25
    so that when you spend time
    with your technologies,
  • 14:25 - 14:29
    that it won't just be time spent;
    it will be time well spent.
  • 14:29 - 14:32
    So we're thinking about how to change
    the attention economy,
  • 14:32 - 14:37
    coming up with better metrics, better
    principles, processes, business models.
  • 14:37 - 14:40
    And if you're passionate about this,
    we would love to engage with you
  • 14:40 - 14:44
    because the thing is, we need your help
    to make this a reality,
  • 14:44 - 14:46
    to change the system.
  • 14:47 - 14:50
    Because at the end of the day,
  • 14:50 - 14:54
    your attention is the most precious
    resource that you have.
  • 14:54 - 14:57
    It's the ultimate
    scarce and finite resource,
  • 14:57 - 15:01
    and the challenges
    that are facing humanity right now,
  • 15:01 - 15:04
    so many big and important challenges,
  • 15:04 - 15:07
    before anything else,
    what they require of us
  • 15:07 - 15:11
    is that we be able to give attention
    to the things that matter
  • 15:11 - 15:14
    at the individual level
    and at the collective level,
  • 15:14 - 15:15
    and this is precisely
  • 15:15 - 15:18
    what the technologies of the attention
    economy are undermining.
  • 15:20 - 15:23
    And ultimately, this relates
    to the very goals of life.
  • 15:23 - 15:27
    Because nobody on their deathbed
    ever looked back and said,
  • 15:27 - 15:30
    "I wish I'd spent more time on Facebook."
  • 15:30 - 15:35
    You know, what we regret
    is missing those real goals, the human goals,
  • 15:35 - 15:37
    those things that make life worth living.
  • 15:38 - 15:44
    And so, from here, I think it will take
    some time to reform the attention economy.
  • 15:44 - 15:47
    Big projects usually take time.
  • 15:48 - 15:52
    But in the meantime,
    I think what we need to do is organize
  • 15:52 - 15:56
    so that we can have a voice,
    a direct voice,
  • 15:57 - 16:00
    to those who create our technologies,
  • 16:00 - 16:03
    and we should continue to reap
    the benefits of our technologies
  • 16:03 - 16:06
    and continue to affirm and support
    the people who create them
  • 16:06 - 16:08
    because they carry that flame
    of innovation and creativity
  • 16:08 - 16:11
    that is so core to the human project.
  • 16:12 - 16:14
    But before anything else,
  • 16:14 - 16:18
    we need to organize
    and ask them for their attention
  • 16:18 - 16:20
    so that we can tell them
    what we want them to do with ours.
  • 16:21 - 16:22
    Before anything else,
  • 16:22 - 16:26
    we need to ask them
    to stand out of our light.
  • 16:26 - 16:28
    Thank you for your attention.
  • 16:28 - 16:29
    (Applause)
Title:
Stand out of our light | James Williams | TEDxAthens
Description:

Our technological reality is flooded with fake news, identities, realities. Design ethicist James Williams argues that we need to redefine the way we use social media if we want to be, ironically, more social.

James Williams is a design ethicist and a co-founder of the Time Well Spent campaign, an effort that aims to evolve the design ethos of digital technologies to help them be more respectful of our time and attention and more supportive of our deeper human values and goals. He is currently completing doctoral research at the University of Oxford, where he is focused on questions of attention and persuasion in technology design.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx.

Video Language:
English
Team:
closed TED
Project:
TEDxTalks
Duration:
16:41
