
What your designs say about you | Sebastian Deterding | TEDxHogeschoolUtrecht

  • 0:01 - 0:02
    Hi there, everyone.
  • 0:04 - 0:07
    I would like to start
    by asking you a simple question.
  • 0:07 - 0:08
    And that is,
  • 0:09 - 0:13
    who of you wants to build a product
    that is as captivating and engaging
  • 0:13 - 0:14
    as, say, Facebook or Twitter?
  • 0:15 - 0:17
    If you think so, please raise your hand.
  • 0:18 - 0:20
    Something that's as captivating
    and engaging as Twitter.
  • 0:20 - 0:22
    Please keep your hands up.
  • 0:23 - 0:25
    Now, those of you
    who have kept your hands up,
  • 0:25 - 0:27
    please keep your hands up if you feel
  • 0:27 - 0:30
    that you're spending more time
    than you should spend
  • 0:30 - 0:33
    on sites like Facebook or Twitter,
  • 0:33 - 0:35
    time that would be better spent
  • 0:35 - 0:39
    with friends or spouses
    or doing things that you actually love.
  • 0:40 - 0:42
    Okay. Those who still have
    their arms up,
  • 0:42 - 0:44
    please discuss after the break.
  • 0:44 - 0:46
    (Laughter)
  • 0:46 - 0:48
    So, why am I asking this question?
  • 0:48 - 0:50
    I am asking it,
  • 0:50 - 0:52
    because today we are talking
    about moral persuasion:
  • 0:53 - 0:56
    What is moral and immoral
    in trying to change people's behaviors
  • 0:57 - 0:59
    by using technology and using design?
  • 0:59 - 1:01
    And I don't know what you expect,
  • 1:01 - 1:03
    but when I was thinking about that issue,
  • 1:03 - 1:07
    I realized early on that what I'm not able
    to give you are answers.
  • 1:07 - 1:10
    I'm not able to tell you
    what is moral or immoral,
  • 1:10 - 1:13
    because we're living
    in a pluralist society.
  • 1:13 - 1:17
    My values can be radically
    different from your values,
  • 1:17 - 1:20
    which means that what I consider
    moral or immoral based on that
  • 1:20 - 1:24
    might not necessarily be
    what you consider moral or immoral.
  • 1:24 - 1:27
    But I also realized
    there is one thing that I could give you,
  • 1:28 - 1:30
    and that is what this guy
    behind me gave the world...
  • 1:30 - 1:31
    Socrates.
  • 1:31 - 1:33
    It is questions.
  • 1:33 - 1:35
    What I can do and what
    I would like to do with you
  • 1:35 - 1:37
    is give you, like that initial question,
  • 1:37 - 1:42
    a set of questions
    to figure out for yourselves,
  • 1:42 - 1:46
    layer by layer, like peeling an onion,
  • 1:46 - 1:51
    getting at the core of what you believe
    is moral or immoral persuasion.
  • 1:51 - 1:54
    And I'd like to do that
    with a couple of examples,
  • 1:54 - 1:58
    as was said, a couple of examples
    of technologies
  • 1:58 - 2:03
    where people have used game elements
    to get people to do things.
  • 2:04 - 2:07
    So, here's the first example
    leading us to our first question.
  • 2:07 - 2:09
    One of my favorite examples
    of gamification:
  • 2:09 - 2:11
    Buster Benson's Health Month.
  • 2:11 - 2:14
    It's a simple application
    where you set yourself health rules
  • 2:15 - 2:16
    for one month.
  • 2:16 - 2:19
    Rules like, "I want to exercise
    six times a week."
  • 2:19 - 2:21
    Or, "I don't want to drink any alcohol."
  • 2:21 - 2:24
    And every morning you get
    an email, asking you,
  • 2:24 - 2:26
    "Did you stick to your rules or not?"
  • 2:26 - 2:28
    And you say yes or no
    to the different questions.
  • 2:28 - 2:31
    And then, on the platform,
    you can see how well you did,
  • 2:31 - 2:34
    you can earn points and badges for that.
  • 2:34 - 2:36
    That information is shared
    with your peers,
  • 2:36 - 2:39
    and if you don't stick to a rule,
    you lose a health point,
  • 2:39 - 2:41
    but your friends can chip in
    and heal you.
  • 2:42 - 2:45
    Beautiful example, and I believe
    most of you will agree with me
  • 2:46 - 2:48
    that it kind of sounds
    like ethical persuasion, right?
  • 2:49 - 2:51
    It sounds like something
    that is good to do.
  • 2:51 - 2:52
    Here's another example.
  • 2:53 - 2:56
    Very similar in the kind of
    thinking behind it,
  • 2:56 - 2:58
    very different example: Lockerz.
  • 2:58 - 3:01
    It's a social platform
    where people set up profiles
  • 3:01 - 3:04
    and on them, the main thing they do
    is they put up product pictures,
  • 3:04 - 3:07
    pictures of products they like,
  • 3:07 - 3:09
    and link their profiles with their friends.
  • 3:09 - 3:12
    And every time I click
    on one of those products on your page,
  • 3:12 - 3:13
    you get points,
  • 3:13 - 3:17
    and every time you click
    on a product page on my page,
  • 3:17 - 3:18
    I get points,
  • 3:18 - 3:21
    and if you actually buy something
    I get a lot of points.
  • 3:21 - 3:23
    And both of us can then
    exchange these points
    exchange these points
  • 3:24 - 3:26
    for a percentage off these products.
  • 3:27 - 3:28
    Now, I don't know how you feel,
  • 3:28 - 3:31
    but personally I think
    that Health Month
  • 3:31 - 3:34
    is something that feels to me very benign
  • 3:34 - 3:37
    and a good piece, a moral piece
    of technology,
  • 3:37 - 3:41
    whereas there's something about Lockerz
    that makes me feel a little queasy.
  • 3:42 - 3:45
    Thinking about what exactly it is
    that makes me feel queasy here,
  • 3:45 - 3:46
    in this case, versus the other,
  • 3:46 - 3:49
    I came to a very simple answer,
    and that is, well,
  • 3:49 - 3:51
    the intention behind it, right?
  • 3:51 - 3:55
    In one case, the intention is,
    "That site wants me to be healthier,
  • 3:55 - 3:57
    and the other site wants me to shop more."
  • 3:58 - 4:01
    So it's at first a very simple,
    very obvious question
  • 4:01 - 4:02
    I would like to give you:
  • 4:02 - 4:05
    What are your intentions
    if you are designing something?
  • 4:05 - 4:08
    And obviously, intentions
    are not the only thing,
  • 4:09 - 4:12
    so here is another example
    for one of these applications.
  • 4:12 - 4:15
    There are a couple of these kinds
    of Eco dashboards right now...
  • 4:15 - 4:17
    Dashboards built into cars...
  • 4:17 - 4:20
    Which try to motivate you
    to drive more fuel-efficiently.
  • 4:20 - 4:22
    This here is Nissan's MyLeaf,
  • 4:22 - 4:26
    where your driving behavior
    is compared with the driving behavior
  • 4:26 - 4:27
    of other people,
  • 4:27 - 4:30
    so you can compete for who drives a route
    the most fuel-efficiently.
  • 4:30 - 4:32
    And these things are
    very effective, it turns out...
  • 4:33 - 4:36
    So effective that they motivate people
    to engage in unsafe driving behaviors,
  • 4:36 - 4:38
    like not stopping at a red light,
  • 4:38 - 4:41
    because that way you have
    to stop and restart the engine,
  • 4:41 - 4:44
    and that would use quite
    some fuel, wouldn't it?
  • 4:45 - 4:49
    So despite this being
    a very well-intended application,
  • 4:49 - 4:52
    obviously there was a side effect of that.
  • 4:52 - 4:54
    Here's another example
    for one of these side effects.
  • 4:54 - 4:59
    Commendable: a site that allows parents
    to give their kids little badges
  • 4:59 - 5:02
    for doing the things
    that parents want their kids to do,
  • 5:02 - 5:03
    like tying their shoes.
  • 5:03 - 5:05
    And at first that sounds very nice,
  • 5:05 - 5:08
    very benign, well-intended.
  • 5:08 - 5:12
    But it turns out, if you look into
    research on people's mindset,
  • 5:12 - 5:13
    caring about outcomes,
  • 5:13 - 5:15
    caring about public recognition,
  • 5:15 - 5:19
    caring about these kinds
    of public tokens of recognition
  • 5:19 - 5:21
    is not necessarily very helpful
  • 5:21 - 5:23
    for your long-term
    psychological well-being.
  • 5:23 - 5:26
    It's better if you care
    about learning something.
  • 5:26 - 5:28
    It's better when you care about yourself
  • 5:28 - 5:31
    than how you appear
    in front of other people.
  • 5:31 - 5:36
    So that kind of motivational tool
    that is used actually, in and of itself,
  • 5:36 - 5:38
    has a long-term side effect,
  • 5:38 - 5:40
    in that every time we use a technology
  • 5:40 - 5:43
    that uses something
    like public recognition or status,
  • 5:43 - 5:46
    we're actually positively endorsing this
  • 5:46 - 5:49
    as a good and normal thing
    to care about...
  • 5:49 - 5:52
    That way, possibly having
    a detrimental effect
  • 5:52 - 5:56
    on the long-term psychological
    well-being of ourselves as a culture.
  • 5:56 - 5:58
    So that's a second, very obvious question:
  • 5:58 - 6:01
    What are the effects
    of what you're doing...
  • 6:01 - 6:05
    The effects you're having
    with the device, like less fuel,
  • 6:05 - 6:08
    as well as the effects
    of the actual tools you're using
  • 6:08 - 6:09
    to get people to do things...
  • 6:09 - 6:11
    Public recognition?
  • 6:11 - 6:14
    Now is that all... intention, effect?
  • 6:14 - 6:17
    Well, there are some technologies
    which obviously combine both.
  • 6:17 - 6:20
    Both good long-term and short-term effects
  • 6:20 - 6:24
    and a positive intention
    like Fred Stutzman's "Freedom,"
  • 6:25 - 6:27
    where the whole point
    of that application is...
  • 6:27 - 6:31
    Well, we're usually so bombarded
    with constant requests by other people,
  • 6:31 - 6:32
    with this device,
  • 6:32 - 6:35
    you can shut off the Internet
    connectivity of your PC of choice
  • 6:35 - 6:37
    for a pre-set amount of time,
  • 6:37 - 6:39
    to actually get some work done.
  • 6:40 - 6:43
    And I think most of us will agree
    that's something well-intended,
  • 6:43 - 6:45
    and also has good consequences.
  • 6:45 - 6:47
    In the words of Michel Foucault,
  • 6:47 - 6:49
    it is a "technology of the self."
  • 6:49 - 6:52
    It is a technology
    that empowers the individual
  • 6:52 - 6:53
    to determine their own life course,
  • 6:53 - 6:55
    to shape themselves.
  • 6:56 - 6:59
    But the problem is,
    as Foucault points out,
  • 6:59 - 7:01
    that every technology of the self
  • 7:01 - 7:05
    has a technology of domination
    as its flip side.
  • 7:05 - 7:09
    As you see in today's modern
    liberal democracies,
  • 7:09 - 7:14
    the society, the state,
    not only allows us to determine our self,
  • 7:14 - 7:15
    to shape our self,
  • 7:15 - 7:17
    it also demands it of us.
  • 7:17 - 7:19
    It demands that we optimize ourselves,
  • 7:19 - 7:21
    that we control ourselves,
  • 7:21 - 7:24
    that we self-manage continuously,
  • 7:24 - 7:27
    because that's the only way
    in which such a liberal society works.
  • 7:27 - 7:28
    In a way,
  • 7:29 - 7:32
    the kind of devices like
    Fred Stutzman's "Freedom,"
  • 7:32 - 7:36
    or Buster Benson's Health Month,
    are technologies of domination,
  • 7:36 - 7:37
    because they want us to be
  • 7:38 - 7:41
    (Robotic voice) fitter, happier,
    more productive,
  • 7:41 - 7:44
    comfortable, not drinking too much,
  • 7:44 - 7:48
    regular exercise at the gym
    three days a week,
  • 7:48 - 7:52
    getting on better with your
    associate employee contemporaries.
  • 7:52 - 7:53
    At ease.
    Eating well.
  • 7:54 - 7:57
    No more microwave dinners
    and saturated fats.
  • 7:57 - 8:00
    A patient, better driver,
    a safer car,
  • 8:00 - 8:02
    [unclear]
  • 8:02 - 8:05
    sleeping well, no bad dreams.
  • 8:06 - 8:11
    SD: These technologies want us
    to stay in the game
  • 8:11 - 8:13
    that society has devised for us.
  • 8:13 - 8:16
    They want us to fit in even better.
  • 8:16 - 8:18
    They want us to optimize
    ourselves to fit in.
  • 8:19 - 8:22
    Now, I don't say that
    is necessarily a bad thing;
  • 8:23 - 8:27
    I just think that this example
    points us to a general realization,
  • 8:27 - 8:31
    and that is: no matter what technology
    or design you look at,
  • 8:31 - 8:34
    even something we consider
    as well-intended
  • 8:36 - 8:39
    and as good in its effects
    as Stutzman's Freedom,
  • 8:39 - 8:42
    comes with certain values embedded in it.
  • 8:42 - 8:44
    And we can question these values.
  • 8:44 - 8:46
    We can question: Is it a good thing
  • 8:46 - 8:49
    that all of us continuously
    self-optimize
  • 8:49 - 8:51
    to fit better into that society?
  • 8:51 - 8:53
    Or to give you another example:
  • 8:53 - 8:56
    the one at the initial presentation,
  • 8:56 - 8:59
    What about a piece
    of persuasive technology
  • 8:59 - 9:02
    that convinces Muslim women
    to wear their headscarves?
  • 9:02 - 9:04
    Is that a good or a bad technology
  • 9:04 - 9:06
    in its intentions or in its effects?
  • 9:07 - 9:11
    Well, that basically depends on
    the kind of values you bring to bear
  • 9:11 - 9:14
    to make these kinds of judgments.
  • 9:14 - 9:15
    So that's a third question:
  • 9:15 - 9:17
    What values do you use to judge?
  • 9:17 - 9:19
    And speaking of values:
  • 9:19 - 9:22
    I've noticed that in the discussion
    about moral persuasion online
  • 9:22 - 9:24
    and when I'm talking with people,
  • 9:24 - 9:27
    more often than not,
    there is a weird bias.
  • 9:28 - 9:30
    And that bias is that we're asking:
  • 9:31 - 9:33
    Is this or that "still" ethical?
  • 9:33 - 9:36
    Is it "still" permissible?
  • 9:37 - 9:38
    We're asking things like:
  • 9:38 - 9:41
    Is this Oxfam donation form,
  • 9:41 - 9:44
    where the regular monthly
    donation is the preset default,
  • 9:44 - 9:46
    and people, maybe without intending it,
  • 9:46 - 9:50
    are encouraged or nudged
    into giving a regular donation
  • 9:50 - 9:52
    instead of a one-time donation,
  • 9:52 - 9:53
    is that "still" permissible?
  • 9:53 - 9:55
    Is it "still" ethical?
  • 9:55 - 9:56
    We're fishing at the low end.
  • 9:57 - 9:59
    But in fact, that question,
    "Is it 'still' ethical?"
  • 9:59 - 10:01
    is just one way of looking at ethics.
  • 10:01 - 10:06
    Because if you look at the beginning
    of ethics in Western culture,
  • 10:06 - 10:10
    you see a very different idea
    of what ethics also could be.
  • 10:10 - 10:14
    For Aristotle, ethics
    was not about the question,
  • 10:14 - 10:16
    "Is that still good, or is it bad?"
  • 10:16 - 10:20
    Ethics was about the question
    of how to live life well.
  • 10:20 - 10:22
    And he put that in the word "arête,"
  • 10:22 - 10:25
    which we, from [Ancient Greek],
    translate as "virtue."
  • 10:25 - 10:27
    But really, it means "excellence."
  • 10:27 - 10:32
    It means living up to your own
    full potential as a human being.
  • 10:33 - 10:35
    And that is an idea that, I think,
  • 10:35 - 10:38
    Richard Buchanan
    put nicely in a recent essay,
  • 10:38 - 10:41
    where he said,
    "Products are vivid arguments
  • 10:41 - 10:43
    about how we should live our lives."
  • 10:43 - 10:46
    Our designs are not ethical or unethical
  • 10:46 - 10:50
    in that they're using ethical
    or unethical means of persuading us.
  • 10:51 - 10:52
    They have a moral component
  • 10:52 - 10:57
    just in the kind of vision
    and the aspiration of the good life
  • 10:57 - 10:58
    that they present to us.
  • 10:58 - 11:02
    And if you look into the designed
    environment around us
  • 11:02 - 11:03
    with that kind of lens,
  • 11:03 - 11:06
    asking, "What is the vision
    of the good life
  • 11:06 - 11:08
    that our products, our design,
    present to us?",
  • 11:08 - 11:11
    then you often get the shivers,
  • 11:11 - 11:13
    because of how little
    we expect of each other,
  • 11:13 - 11:17
    of how little we actually
    seem to expect of our life,
  • 11:17 - 11:19
    and what the good life looks like.
  • 11:20 - 11:23
    So that's a fourth question
    I'd like to leave you with:
  • 11:23 - 11:28
    What vision of the good life
    do your designs convey?
  • 11:28 - 11:30
    And speaking of design,
  • 11:30 - 11:34
    you'll notice that I already
    broadened the discussion,
  • 11:34 - 11:39
    because it's not just persuasive
    technology that we're talking about here,
  • 11:39 - 11:43
    it's any piece of design
    that we put out here in the world.
  • 11:43 - 11:44
    I don't know whether you know
  • 11:44 - 11:48
    the great communication researcher
    Paul Watzlawick who, back in the '60s,
  • 11:48 - 11:50
    made the argument
    that we cannot not communicate.
  • 11:50 - 11:53
    Even if we choose to be silent,
    we chose to be silent,
  • 11:53 - 11:56
    and we're communicating something
    by choosing to be silent.
  • 11:56 - 11:59
    And in the same way
    that we cannot not communicate,
  • 11:59 - 12:00
    we cannot not persuade:
  • 12:00 - 12:02
    whatever we do or refrain from doing,
  • 12:02 - 12:07
    whatever we put out there
    as a piece of design, into the world,
  • 12:07 - 12:09
    has a persuasive component.
  • 12:09 - 12:10
    It tries to affect people.
  • 12:11 - 12:14
    It puts a certain vision of the good life
    out there in front of us,
  • 12:15 - 12:17
    which is what Peter-Paul Verbeek,
  • 12:17 - 12:19
    the Dutch philosopher of technology, says.
  • 12:20 - 12:24
    No matter whether we as designers
    intend it or not,
  • 12:24 - 12:26
    we materialize morality.
  • 12:26 - 12:29
    We make certain things
    harder and easier to do.
  • 12:29 - 12:31
    We organize the existence of people.
  • 12:31 - 12:33
    We put a certain vision
  • 12:33 - 12:36
    of what good or bad or normal or usual is
  • 12:36 - 12:37
    in front of people,
  • 12:37 - 12:40
    by everything we put
    out there in the world.
  • 12:40 - 12:43
    Even something as innocuous
    as a set of school chairs,
  • 12:43 - 12:47
    a set of chairs that you're sitting on
    and I'm standing in front of,
  • 12:48 - 12:50
    is a persuasive technology,
  • 12:50 - 12:55
    because it presents and materializes
    a certain vision of the good life...
  • 12:55 - 12:58
    A good life in which teaching
    and learning and listening
  • 12:58 - 13:01
    is about one person teaching,
    the others listening;
  • 13:01 - 13:05
    in which learning is done
    while sitting;
  • 13:05 - 13:07
    in which you learn for yourself;
  • 13:07 - 13:09
    in which you're not supposed
    to change these rules,
  • 13:09 - 13:12
    because the chairs
    are fixed to the ground.
  • 13:13 - 13:16
    And even something as innocuous
    as a single-design chair,
  • 13:16 - 13:17
    like this one by Arne Jacobsen,
  • 13:17 - 13:19
    is a persuasive technology,
  • 13:19 - 13:22
    because, again, it communicates
    an idea of the good life:
  • 13:23 - 13:28
    a good life... a life that you,
    as a designer, consent to by saying,
  • 13:28 - 13:32
    "In a good life, goods are produced
    as sustainably or unsustainably
  • 13:32 - 13:33
    as this chair.
  • 13:33 - 13:35
    Workers are treated as well or as badly
  • 13:35 - 13:38
    as the workers were treated
    that built that chair."
  • 13:38 - 13:41
    The good life is a life
    where design is important
  • 13:41 - 13:43
    because somebody obviously took
    the time and spent the money
  • 13:43 - 13:45
    for that kind of well-designed chair;
  • 13:45 - 13:47
    where tradition is important,
  • 13:47 - 13:50
    because this is a traditional classic
    and someone cared about this;
  • 13:50 - 13:53
    and where there is something
    as conspicuous consumption,
  • 13:53 - 13:56
    where it is OK and normal to spend
    a humongous amount of money
  • 13:56 - 13:58
    on such a chair,
  • 13:58 - 14:00
    to signal to other people
    what your social status is.
  • 14:02 - 14:05
    So these are the kinds of layers,
    the kinds of questions
  • 14:06 - 14:07
    I wanted to lead you through today;
  • 14:07 - 14:11
    the question of: What are the intentions
    that you bring to bear
  • 14:11 - 14:12
    when you're designing something?
  • 14:12 - 14:15
    What are the effects, intended
    and unintended, that you're having?
  • 14:15 - 14:18
    What are the values
    you're using to judge those?
  • 14:18 - 14:20
    What are the virtues, the aspirations
  • 14:20 - 14:22
    that you're actually expressing in that?
  • 14:23 - 14:24
    And how does that apply,
  • 14:25 - 14:27
    not just to persuasive technology,
  • 14:27 - 14:29
    but to everything you design?
  • 14:29 - 14:31
    Do we stop there?
  • 14:31 - 14:33
    I don't think so.
  • 14:33 - 14:37
    I think that all of these things
    are eventually informed
  • 14:37 - 14:39
    by the core of all of this,
  • 14:39 - 14:42
    and this is nothing but life itself.
  • 14:42 - 14:45
    Why, when the question
    of what the good life is
  • 14:45 - 14:47
    informs everything that we design,
  • 14:47 - 14:50
    should we stop at design
    and not ask ourselves:
  • 14:50 - 14:52
    How does it apply to our own life?
  • 14:52 - 14:55
    "Why should the lamp
    or the house be an art object,
  • 14:55 - 14:56
    but not our life?"
  • 14:56 - 14:58
    as Michel Foucault puts it.
  • 14:58 - 15:02
    Just to give you a practical
    example of Buster Benson,
  • 15:02 - 15:04
    whom I mentioned at the beginning.
  • 15:04 - 15:07
    This is Buster setting
    up a pull-up machine
  • 15:07 - 15:09
    at the office of his new
    start-up, Habit Labs,
  • 15:09 - 15:13
    where they're trying to build
    other applications like "Health Month"
  • 15:13 - 15:14
    for people.
  • 15:14 - 15:16
    And why is he building a thing like this?
  • 15:16 - 15:18
    Well, here is the set of axioms
  • 15:18 - 15:23
    that Habit Labs, Buster's start-up,
    put up for themselves
  • 15:23 - 15:26
    on how they wanted to work
    together as a team
  • 15:26 - 15:28
    when they're building
    these applications...
  • 15:28 - 15:30
    A set of moral principles
    they set themselves
  • 15:30 - 15:31
    for working together...
  • 15:31 - 15:33
    One of them being,
  • 15:33 - 15:36
    "We take care of our own health
    and manage our own burnout."
  • 15:36 - 15:41
    Because ultimately,
    how can you ask yourselves
  • 15:41 - 15:45
    and how can you find an answer
    to what vision of the good life
  • 15:45 - 15:48
    you want to convey and create
    with your designs
  • 15:48 - 15:50
    without asking the question:
  • 15:50 - 15:54
    What vision of the good life
    do you yourself want to live?
  • 15:54 - 15:55
    And with that,
  • 15:56 - 15:57
    I thank you.
  • 15:57 - 16:02
    (Applause)
Description:

What does your chair say about what you value? Designer Sebastian Deterding shows how our visions of morality and what the good life is are reflected in the design of objects around us.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx
