We are talking today about moral persuasion: What is moral and immoral in trying to change people's behaviors by using technology and using design?

And I don't know what you expect, but when I was thinking about that issue, I realized early on that what I'm not able to give you are answers. I'm not able to tell you what is moral or immoral, because we're living in a pluralist society. My values can be radically different from your values, which means that what I consider moral or immoral based on that might not necessarily be what you consider moral or immoral.

But I also realized there is one thing that I could give you, and that is what this guy behind me gave the world -- Socrates. It is questions. What I can do, and what I would like to do with you, is give you, like that initial question, a set of questions to figure out for yourselves, layer by layer, like peeling an onion, getting at the core of what you believe is moral or immoral persuasion. And I'd like to do that with a couple of examples of technologies where people have used game elements to get people to do things.

So the first question is a very simple, very obvious one I would like to give you: What are your intentions if you are designing something?

And obviously, intentions are not the only thing, so here is another example of one of these applications. There are a couple of these kinds of eco dashboards right now -- dashboards built into cars -- which try to motivate you to drive more fuel-efficiently. This here is Nissan's MyLeaf, where your driving behavior is compared with the driving behavior of other people, so you can compete for who drives a route the most fuel-efficiently. And these things are very effective, it turns out -- so effective that they motivate people to engage in unsafe driving behaviors, like not stopping at a red light, because that way you would have to stop and restart the engine, and that would use quite some fuel, wouldn't it? So despite this being a very well-intended application, obviously there was a side effect.

Here's another example of one of these side effects. Commendable: a site that allows parents to give their kids little badges for doing the things that parents want their kids to do, like tying their shoes. And at first that sounds very nice, very benign, well-intended.
But it turns out, if you look into research on people's mindsets, that caring about outcomes, caring about public recognition, caring about these kinds of public tokens of recognition is not necessarily very helpful for your long-term psychological well-being. It's better if you care about learning something. It's better when you care about yourself than about how you appear in front of other people.

So the kind of motivational tool that is used there actually, in and of itself, has a long-term side effect, in that every time we use a technology that uses something like public recognition or status, we're actually positively endorsing this as a good and normal thing to care about -- that way, possibly having a detrimental effect on the long-term psychological well-being of ourselves as a culture.

So that's a second, very obvious question: What are the effects of what you're doing -- the effects you're having with the device, like less fuel, as well as the effects of the actual tools you're using to get people to do things -- public recognition?

Now is that all -- intention, effect? Well, there are some technologies which obviously combine both: good long-term and short-term effects and a positive intention, like Fred Stutzman's "Freedom." The whole point of that application is -- well, we're usually so bombarded with constant requests from other people that, with this device, you can shut off the Internet connectivity of your PC of choice for a preset amount of time, to actually get some work done. And I think most of us will agree that's something well-intended, and it also has good consequences. In the words of Michel Foucault, it is a "technology of the self." It is a technology that empowers individuals to determine their own life course, to shape themselves.

But the problem is, as Foucault points out, that every technology of the self has a technology of domination as its flip side. As you see in today's modern liberal democracies, society, the state, not only allows us to determine ourselves, to shape ourselves -- it also demands it of us. It demands that we optimize ourselves, that we control ourselves, that we self-manage continuously, because that's the only way in which such a liberal society works.
These technologies want us to stay in the game that society has devised for us. They want us to fit in even better. They want us to optimize ourselves to fit in.

Now, I don't say that is necessarily a bad thing; I just think that this example points us to a general realization, and that is: no matter what technology or design you look at, even something we consider as well-intended and as good in its effects as Stutzman's Freedom, it comes with certain values embedded in it. And we can question these values. We can question: Is it a good thing that all of us continuously self-optimize to fit better into that society? Or to give you another example: What about a piece of persuasive technology that convinces Muslim women to wear their headscarves? Is that a good or a bad technology in its intentions or in its effects? Well, that basically depends on the kind of values you bring to bear to make these kinds of judgments. So that's a third question: What values do you use to judge?

And speaking of values: I've noticed that in the discussion about moral persuasion online, and when I'm talking with people, more often than not there is a weird bias. And that bias is that we're asking: Is this or that "still" ethical? Is it "still" permissible? We're asking things like: Is this Oxfam donation form -- where the regular monthly donation is the preset default, and people, maybe without intending it, are encouraged or nudged into giving a regular donation instead of a one-time donation -- is that "still" permissible? Is it "still" ethical? We're fishing at the low end.

But in fact, that question, "Is it 'still' ethical?" is just one way of looking at ethics. Because if you look at the beginning of ethics in Western culture, you see a very different idea of what ethics also could be. For Aristotle, ethics was not about the question, "Is that still good, or is it bad?" Ethics was about the question of how to live life well. And he put that in the word "aretĂȘ," which we translate from [Ancient Greek] as "virtue." But really, it means "excellence." It means living up to your own full potential as a human being.
And that is an idea that, I think, Paul Richard Buchanan put nicely in a recent essay, where he said, "Products are vivid arguments about how we should live our lives." Our designs are not ethical or unethical in that they're using ethical or unethical means of persuading us. They have a moral component just in the kind of vision and the aspiration of the good life that they present to us. And if you look into the designed environment around us with that kind of lens, asking, "What is the vision of the good life that our products, our design, present to us?", then you often get the shivers, because of how little we expect of each other, of how little we actually seem to expect of our life, and of what the good life looks like.

So that's a fourth question I'd like to leave you with: What vision of the good life do your designs convey?

And speaking of design, you'll notice that I've already broadened the discussion, because it's not just persuasive technology that we're talking about here, it's any piece of design that we put out into the world. I don't know whether you know the great communication researcher Paul Watzlawick, who, back in the '60s, made the argument that we cannot not communicate. Even if we choose to be silent, we have chosen to be silent, and we're communicating something by choosing to be silent. And in the same way that we cannot not communicate, we cannot not persuade: whatever we do or refrain from doing, whatever we put out there as a piece of design into the world, has a persuasive component. It tries to affect people. It puts a certain vision of the good life out there in front of us, which is what Peter-Paul Verbeek, the Dutch philosopher of technology, says: no matter whether we as designers intend it or not, we materialize morality. We make certain things harder and easier to do. We organize the existence of people. We put a certain vision of what good or bad or normal or usual is in front of people, by everything we put out there in the world.
Even something as innocuous as a set of school chairs is a persuasive technology, because it presents and materializes a certain vision of the good life -- a good life in which teaching and learning and listening is about one person teaching and the others listening; in which learning is something done while sitting; in which you learn for yourself; in which you're not supposed to change these rules, because the chairs are fixed to the ground.

And even something as innocuous as a single design chair, like this one by Arne Jacobsen, is a persuasive technology, because, again, it communicates an idea of the good life: a good life -- a life that you, as a designer, consent to by saying, "In a good life, goods are produced as sustainably or unsustainably as this chair. Workers are treated as well or as badly as the workers who built that chair were treated." The good life is a life where design is important, because somebody obviously took the time and spent the money for that kind of well-designed chair; where tradition is important, because this is a traditional classic and someone cared about this; and where there is something like conspicuous consumption, where it is OK and normal to spend a humongous amount of money on such a chair, to signal to other people what your social status is.

So these are the kinds of layers, the kinds of questions, I wanted to lead you through today: What are the intentions that you bring to bear when you're designing something? What are the effects, intended and unintended, that you're having? What are the values you're using to judge those? What are the virtues, the aspirations, that you're actually expressing in that? And how does that apply, not just to persuasive technology, but to everything you design?

Do we stop there? I don't think so. I think that all of these things are eventually informed by the core of all of this, and that is nothing but life itself. Why, when the question of what the good life is informs everything that we design, should we stop at design and not ask ourselves: How does it apply to our own life? "Why should the lamp or the house be an art object, but not our life?" as Michel Foucault puts it.
Just to give you a practical example: Buster Benson. This is Buster setting up a pull-up machine at the office of his new start-up, Habit Labs, where they're trying to build other applications like "Health Month" for people. And why is he building a thing like this? Well, here is the set of axioms that Habit Labs, Buster's start-up, put up for themselves on how they wanted to work together as a team when building these applications -- a set of moral principles they set for themselves for working together -- one of them being, "We take care of our own health and manage our own burnout."

Because ultimately, how can you ask yourselves, and how can you find an answer to, what vision of the good life you want to convey and create with your designs, without asking the question: What vision of the good life do you yourself want to live?

And with that, I thank you.

(Applause)