We are talking today about moral persuasion: What is moral and immoral in trying to change people's behaviors by using technology and using design? And I don't know what you expect, but when I was thinking about that issue, I realized early on that what I'm not able to give you are answers.

I'm not able to tell you what is moral or immoral, because we're living in a pluralist society. My values can be radically different from your values, which means that what I consider moral or immoral based on them might not necessarily be what you consider moral or immoral.

But I also realized there is one thing that I could give you, and that is what this guy behind me gave the world -- Socrates. It is questions. What I can do, and what I would like to do with you, is give you, like that initial question, a set of questions to figure out for yourselves, layer by layer, like peeling an onion, getting at the core of what you believe is moral or immoral persuasion. And I'd like to do that with a couple of examples of technologies where people have used game elements to get people to do things.

So the first, very simple, very obvious question I would like to give you is: What are your intentions if you are designing something?

And obviously, intentions are not the only thing, so here is another example of one of these applications. There are a couple of these kinds of eco dashboards right now -- dashboards built into cars -- which try to motivate you to drive more fuel-efficiently. This here is Nissan's MyLeaf, where your driving behavior is compared with the driving behavior of other people, so you can compete for who drives a route the most fuel-efficiently. And these things are very effective, it turns out -- so effective that they motivate people to engage in unsafe driving behaviors, like not stopping at a red light, because if you stop, you have to stop and restart the engine, and that would use quite some fuel, wouldn't it?

So despite this being a very well-intended application, obviously there was a side effect. Here's another example of one of these side effects.
Commendable: a site that allows parents to give their kids little badges for doing the things that parents want their kids to do, like tying their shoes. And at first that sounds very nice, very benign, well-intended. But it turns out, if you look into research on people's mindsets, that caring about outcomes, caring about public recognition, caring about these kinds of public tokens of recognition is not necessarily very helpful for your long-term psychological well-being. It's better if you care about learning something. It's better when you care about yourself than about how you appear in front of other people.

So that kind of motivational tool actually, in and of itself, has a long-term side effect, in that every time we use a technology that uses something like public recognition or status, we're actually positively endorsing this as a good and normal thing to care about -- and that way, possibly having a detrimental effect on the long-term psychological well-being of ourselves as a culture.

So that's a second, very obvious question: What are the effects of what you're doing -- the effects you're having with the device, like less fuel, as well as the effects of the actual tools you're using to get people to do things -- public recognition?

Now is that all -- intention, effect? Well, there are some technologies which obviously combine both: good long-term and short-term effects and a positive intention, like Fred Stutzman's "Freedom," where the whole point of that application is -- well, we're usually so bombarded with constant requests by other people; with this device, you can shut off the Internet connectivity of your PC of choice for a preset amount of time, to actually get some work done. And I think most of us will agree that's something well-intended, and it also has good consequences. In the words of Michel Foucault, it is a "technology of the self." It is a technology that empowers the individual to determine their own life course, to shape themselves.
But the problem is, as Foucault points out, that every technology of the self has a technology of domination as its flip side. As you see in today's modern liberal democracies, the society, the state, not only allows us to determine our self, to shape our self, it also demands it of us. It demands that we optimize ourselves, that we control ourselves, that we self-manage continuously, because that's the only way in which such a liberal society works. These technologies want us to stay in the game that society has devised for us. They want us to fit in even better. They want us to optimize ourselves to fit in.

Now, I don't say that is necessarily a bad thing; I just think that this example points us to a general realization, and that is: no matter what technology or design you look at, even something we consider as well-intended and as good in its effects as Stutzman's Freedom, comes with certain values embedded in it. And we can question these values. We can question: Is it a good thing that all of us continuously self-optimize to fit better into that society? Or to give you another example: What about a piece of persuasive technology that convinces Muslim women to wear their headscarves? Is that a good or a bad technology in its intentions or in its effects? Well, that basically depends on the kind of values you bring to bear to make these kinds of judgments. So that's a third question: What values do you use to judge?

And speaking of values: I've noticed that in the discussion about moral persuasion online, and when I'm talking with people, more often than not there is a weird bias. And that bias is that we're asking: Is this or that "still" ethical? Is it "still" permissible?
We're asking things like: Is this Oxfam donation form -- where the regular monthly donation is the preset default, and people, maybe without intending it, are encouraged or nudged into giving a regular donation instead of a one-time donation -- is that "still" permissible? Is it "still" ethical? We're fishing at the low end.

But in fact, that question, "Is it 'still' ethical?" is just one way of looking at ethics. Because if you look at the beginning of ethics in Western culture, you see a very different idea of what ethics also could be. For Aristotle, ethics was not about the question, "Is that still good, or is it bad?" Ethics was about the question of how to live life well. And he put that in the word "arête," which we, from [Ancient Greek], translate as "virtue." But really, it means "excellence." It means living up to your own full potential as a human being.

And that is an idea that, I think, Paul Richard Buchanan put nicely in a recent essay, where he said, "Products are vivid arguments about how we should live our lives." Our designs are not ethical or unethical in that they're using ethical or unethical means of persuading us. They have a moral component just in the kind of vision and the aspiration of the good life that they present to us.

And if you look into the designed environment around us with that kind of lens, asking, "What is the vision of the good life that our products, our design, present to us?", then you often get the shivers, because of how little we expect of each other, of how little we actually seem to expect of our life, and of what the good life looks like.

So that's a fourth question I'd like to leave you with: What vision of the good life do your designs convey?

And speaking of design, you'll notice that I already broadened the discussion, because it's not just persuasive technology that we're talking about here, it's any piece of design that we put out there in the world.
I don't know whether you know the great communication researcher Paul Watzlawick, who, back in the '60s, made the argument that we cannot not communicate. Even if we choose to be silent, we're communicating something by choosing to be silent. And in the same way that we cannot not communicate, we cannot not persuade: whatever we do or refrain from doing, whatever we put out there as a piece of design, into the world, has a persuasive component. It tries to affect people. It puts a certain vision of the good life out there in front of us, which is what Peter-Paul Verbeek, the Dutch philosopher of technology, says. No matter whether we as designers intend it or not, we materialize morality. We make certain things harder and easier to do. We organize the existence of people. We put a certain vision of what good or bad or normal or usual is in front of people, by everything we put out there in the world.

Even something as innocuous as a set of school chairs is a persuasive technology, because it presents and materializes a certain vision of the good life -- a good life in which teaching and learning and listening is about one person teaching, the others listening; in which learning is done while sitting; in which you learn for yourself; in which you're not supposed to change these rules, because the chairs are fixed to the ground.

And even something as innocuous as a single-design chair, like this one by Arne Jacobsen, is a persuasive technology, because, again, it communicates an idea of the good life: a good life -- a life that you, as a designer, consent to by saying, "In a good life, goods are produced as sustainably or unsustainably as this chair. Workers are treated as well or as badly as the workers who built that chair were treated."
The good life is a life where design is important, because somebody obviously took the time and spent the money for that kind of well-designed chair; where tradition is important, because this is a traditional classic and someone cared about this; and where there is something like conspicuous consumption, where it is OK and normal to spend a humongous amount of money on such a chair, to signal to other people what your social status is.

So these are the kinds of layers, the kinds of questions I wanted to lead you through today: What are the intentions that you bring to bear when you're designing something? What are the effects, intended and unintended, that you're having? What are the values you're using to judge those? What are the virtues, the aspirations that you're actually expressing in that? And how does that apply, not just to persuasive technology, but to everything you design?

Do we stop there? I don't think so. I think that all of these things are eventually informed by the core of all of this, and that is nothing but life itself. Why, when the question of what the good life is informs everything that we design, should we stop at design and not ask ourselves: How does it apply to our own life? "Why should the lamp or the house be an art object, but not our life?" as Michel Foucault puts it.

Just to give you a practical example: Buster Benson. This is Buster setting up a pull-up machine at the office of his new start-up, Habit Labs, where they're trying to build other applications like "Health Month" for people. And why is he building a thing like this? Well, here is the set of axioms that Habit Labs, Buster's start-up, put up for themselves on how they wanted to work together as a team when building these applications -- a set of moral principles they set themselves for working together -- one of them being, "We take care of our own health and manage our own burnout."
Because ultimately, how can you ask yourselves, and how can you find an answer to the question of what vision of the good life you want to convey and create with your designs, without asking the question: What vision of the good life do you yourself want to live?

And with that, I thank you.

(Applause)