1 00:00:01,167 --> 00:00:02,500 Hi there, everyone. 2 00:00:03,722 --> 00:00:07,389 I would like to start by asking you a simple question. 3 00:00:07,452 --> 00:00:08,484 And that is, 4 00:00:08,531 --> 00:00:12,536 which of you wants to build a product that is as captivating and engaging 5 00:00:12,561 --> 00:00:14,494 as, say, Facebook or Twitter? 6 00:00:14,641 --> 00:00:17,307 If you think so, please raise your hand. 7 00:00:17,513 --> 00:00:20,339 Something that's as captivating and engaging as Twitter. 8 00:00:20,355 --> 00:00:22,088 Please keep your hands up. 9 00:00:22,625 --> 00:00:24,878 Now, those of you who have kept your hands up, 10 00:00:24,919 --> 00:00:26,902 please keep your hands up if you feel 11 00:00:26,950 --> 00:00:29,997 that you're spending more time than you should spend 12 00:00:30,030 --> 00:00:32,910 on sites like Facebook or Twitter, 13 00:00:32,934 --> 00:00:34,997 time that would be better spent 14 00:00:35,038 --> 00:00:38,624 with friends or spouses or doing things that you actually love. 15 00:00:39,720 --> 00:00:42,085 Okay. Those who still have their arms up, 16 00:00:42,133 --> 00:00:44,200 please discuss after the break. 17 00:00:44,220 --> 00:00:46,077 (Laughter) 18 00:00:46,125 --> 00:00:48,066 So, why am I asking this question? 19 00:00:48,474 --> 00:00:49,529 I am asking it 20 00:00:49,591 --> 00:00:52,442 because we are talking today about moral persuasion: 21 00:00:52,501 --> 00:00:56,483 What is moral and immoral in trying to change people's behaviors 22 00:00:56,507 --> 00:00:58,960 by using technology and using design? 23 00:00:58,984 --> 00:01:00,816 And I don't know what you expect, 24 00:01:00,840 --> 00:01:02,793 but when I was thinking about that issue, 25 00:01:02,817 --> 00:01:06,856 I realized early on that what I'm not able to give you are answers. 26 00:01:07,443 --> 00:01:10,214 I'm not able to tell you what is moral or immoral, 27 00:01:10,238 --> 00:01:12,785 because we're living in a pluralist society. 28 00:01:12,809 --> 00:01:17,051 My values can be radically different from your values, 29 00:01:17,075 --> 00:01:20,252 which means that what I consider moral or immoral based on that 30 00:01:20,276 --> 00:01:23,888 might not necessarily be what you consider moral or immoral. 31 00:01:24,269 --> 00:01:27,483 But I also realized there is one thing that I could give you, 32 00:01:27,507 --> 00:01:30,040 and that is what this guy behind me gave the world... 33 00:01:30,064 --> 00:01:31,214 Socrates. 34 00:01:31,238 --> 00:01:32,633 It is questions. 35 00:01:32,657 --> 00:01:35,290 What I can do and what I would like to do with you 36 00:01:35,314 --> 00:01:37,259 is give you, like that initial question, 37 00:01:37,283 --> 00:01:41,914 a set of questions to figure out for yourselves, 38 00:01:42,017 --> 00:01:45,658 layer by layer, like peeling an onion, 39 00:01:45,682 --> 00:01:50,702 getting at the core of what you believe is moral or immoral persuasion. 40 00:01:51,099 --> 00:01:54,287 And I'd like to do that with a couple of examples, 41 00:01:54,446 --> 00:01:57,597 as was said, a couple of examples of technologies 42 00:01:57,664 --> 00:02:02,644 where people have used game elements to get people to do things. 43 00:02:03,561 --> 00:02:06,545 So, here's the first example, leading us to our first question: 44 00:02:06,585 --> 00:02:09,118 one of my favorite examples of gamification, 45 00:02:09,143 --> 00:02:10,538 Buster Benson's Health Month. 
46 00:02:10,568 --> 00:02:14,243 It's a simple application where you set yourself health rules 47 00:02:14,942 --> 00:02:15,942 for one month. 48 00:02:16,188 --> 00:02:18,767 Rules like, "I want to exercise six times a week," 49 00:02:18,823 --> 00:02:20,656 or, "I don't want to drink any alcohol." 50 00:02:20,704 --> 00:02:23,767 And every morning you get an email asking you, 51 00:02:23,853 --> 00:02:25,583 "Did you stick to your rules or not?" 52 00:02:25,607 --> 00:02:27,894 And you say yes or no to the different questions. 53 00:02:27,927 --> 00:02:30,553 And then, on the platform, you can see how well you did, 54 00:02:30,958 --> 00:02:33,625 and you can earn points and badges for that. 55 00:02:33,665 --> 00:02:36,058 That information is shared with your peers, 56 00:02:36,108 --> 00:02:38,990 and if you don't stick to a rule, you lose a health point, 57 00:02:39,015 --> 00:02:41,140 but your friends can chip in and heal you. 58 00:02:41,879 --> 00:02:45,442 A beautiful example, and I believe most of you will agree with me 59 00:02:45,760 --> 00:02:48,490 that it kind of sounds like ethical persuasion, right? 60 00:02:48,561 --> 00:02:50,910 It sounds like something that is good to do. 61 00:02:50,948 --> 00:02:52,482 Here's another example. 62 00:02:52,768 --> 00:02:55,513 Very similar in the kind of thinking behind it, 63 00:02:55,569 --> 00:02:57,505 a very different example: Lockerz. 64 00:02:57,593 --> 00:03:01,069 It's a social platform where people set up profiles, 65 00:03:01,180 --> 00:03:04,418 and on them, the main thing they do is put up product pictures, 66 00:03:04,498 --> 00:03:06,565 pictures of products they like, 67 00:03:06,617 --> 00:03:08,831 and link their profiles with their friends', 68 00:03:08,863 --> 00:03:12,339 and every time I click on one of those products on your page, 69 00:03:12,371 --> 00:03:13,474 you get points, 70 00:03:13,487 --> 00:03:16,783 and every time you click on a product on my page, 71 00:03:17,125 --> 00:03:18,125 I get points, 72 00:03:18,157 --> 00:03:20,791 and if you actually buy something, I get a lot of points, 73 00:03:20,838 --> 00:03:23,490 and both of us can then exchange these points 74 00:03:23,831 --> 00:03:26,165 for percentage discounts on these products. 75 00:03:26,538 --> 00:03:28,101 Now, I don't know how you feel, 76 00:03:28,173 --> 00:03:30,839 but personally I think that Health Month 77 00:03:30,895 --> 00:03:33,628 is something that feels to me very benign 78 00:03:33,704 --> 00:03:37,005 and a good piece, a moral piece of technology, 79 00:03:37,046 --> 00:03:40,632 whereas there's something about Lockerz that makes me feel a little queasy. 80 00:03:41,514 --> 00:03:44,918 When I thought about what exactly makes me feel queasy here, 81 00:03:44,943 --> 00:03:46,450 in this case, versus the other, 82 00:03:46,490 --> 00:03:48,640 the answer was very simple, and that is, well, 83 00:03:48,657 --> 00:03:50,505 the intention behind it, right? 84 00:03:51,355 --> 00:03:54,997 In one case, the intention is, "That site wants me to be healthier," 85 00:03:55,022 --> 00:03:57,418 and in the other, "That site wants me to shop more." 86 00:03:57,683 --> 00:04:00,723 So it's at first a very simple, very obvious question 87 00:04:00,747 --> 00:04:01,943 I would like to give you: 88 00:04:01,967 --> 00:04:04,654 What are your intentions if you are designing something? 89 00:04:05,008 --> 00:04:08,381 And obviously, intentions are not the only thing, 90 00:04:08,705 --> 00:04:12,140 so here is another example of one of these applications. 
91 00:04:12,307 --> 00:04:15,394 There are a couple of these kinds of Eco dashboards right now... 92 00:04:15,418 --> 00:04:16,878 Dashboards built into cars... 93 00:04:16,902 --> 00:04:19,733 Which try to motivate you to drive more fuel-efficiently. 94 00:04:19,757 --> 00:04:21,531 This here is Nissan's MyLeaf, 95 00:04:21,555 --> 00:04:25,518 where your driving behavior is compared with the driving behavior 96 00:04:25,543 --> 00:04:26,694 of other people, 97 00:04:26,718 --> 00:04:29,995 so you can compete for who drives a route the most fuel-efficiently. 98 00:04:30,019 --> 00:04:32,496 And these things are very effective, it turns out... 99 00:04:32,520 --> 00:04:36,459 So effective that they motivate people to engage in unsafe driving behaviors, 100 00:04:36,483 --> 00:04:38,444 like not stopping at a red light, 101 00:04:38,469 --> 00:04:41,184 because that way you have to stop and restart the engine, 102 00:04:41,208 --> 00:04:43,931 and that would use quite some fuel, wouldn't it? 103 00:04:44,978 --> 00:04:49,387 So despite this being a very well-intended application, 104 00:04:49,411 --> 00:04:51,786 obviously there was a side effect of that. 105 00:04:51,810 --> 00:04:54,358 Here's another example for one of these side effects. 106 00:04:54,382 --> 00:04:59,166 Commendable: a site that allows parents to give their kids little badges 107 00:04:59,190 --> 00:05:01,848 for doing the things that parents want their kids to do, 108 00:05:01,872 --> 00:05:03,188 like tying their shoes. 109 00:05:03,212 --> 00:05:05,456 And at first that sounds very nice, 110 00:05:05,480 --> 00:05:07,630 very benign, well-intended. 111 00:05:07,654 --> 00:05:11,923 But it turns out, if you look into research on people's mindset, 112 00:05:11,948 --> 00:05:13,435 caring about outcomes, 113 00:05:13,459 --> 00:05:15,232 caring about public recognition, 114 00:05:15,256 --> 00:05:19,117 caring about these kinds of public tokens of recognition 115 00:05:19,141 --> 00:05:21,135 is not necessarily very helpful 116 00:05:21,159 --> 00:05:23,466 for your long-term psychological well-being. 117 00:05:23,490 --> 00:05:26,166 It's better if you care about learning something. 118 00:05:26,190 --> 00:05:28,095 It's better when you care about yourself 119 00:05:28,119 --> 00:05:30,693 than how you appear in front of other people. 120 00:05:31,161 --> 00:05:36,202 So that kind of motivational tool that is used actually, in and of itself, 121 00:05:36,226 --> 00:05:38,134 has a long-term side effect, 122 00:05:38,158 --> 00:05:39,968 in that every time we use a technology 123 00:05:39,992 --> 00:05:43,166 that uses something like public recognition or status, 124 00:05:43,190 --> 00:05:45,566 we're actually positively endorsing this 125 00:05:45,590 --> 00:05:49,000 as a good and normal thing to care about... 126 00:05:49,024 --> 00:05:51,888 That way, possibly having a detrimental effect 127 00:05:51,912 --> 00:05:55,767 on the long-term psychological well-being of ourselves as a culture. 128 00:05:55,791 --> 00:05:58,435 So that's a second, very obvious question: 129 00:05:58,459 --> 00:06:00,770 What are the effects of what you're doing... 130 00:06:00,794 --> 00:06:04,918 The effects you're having with the device, like less fuel, 131 00:06:04,942 --> 00:06:07,647 as well as the effects of the actual tools you're using 132 00:06:07,671 --> 00:06:09,345 to get people to do things... 133 00:06:09,369 --> 00:06:10,940 Public recognition? 134 00:06:11,264 --> 00:06:14,185 Now is that all... intention, effect? 
135 00:06:14,209 --> 00:06:17,343 Well, there are some technologies which obviously combine both: 136 00:06:17,367 --> 00:06:20,154 both good long-term and short-term effects 137 00:06:20,178 --> 00:06:24,409 and a positive intention, like Fred Stutzman's "Freedom," 138 00:06:24,611 --> 00:06:26,818 where the whole point of that application is... 139 00:06:26,842 --> 00:06:30,567 Well, we're usually so bombarded with constant requests by other people; 140 00:06:30,591 --> 00:06:31,747 with this device, 141 00:06:31,771 --> 00:06:35,251 you can shut off the Internet connectivity of your PC of choice 142 00:06:35,275 --> 00:06:36,723 for a pre-set amount of time, 143 00:06:36,747 --> 00:06:38,718 to actually get some work done. 144 00:06:39,742 --> 00:06:42,893 And I think most of us will agree that's something well-intended, 145 00:06:42,917 --> 00:06:45,137 and also has good consequences. 146 00:06:45,161 --> 00:06:46,807 In the words of Michel Foucault, 147 00:06:46,831 --> 00:06:48,771 it is a "technology of the self." 148 00:06:48,795 --> 00:06:51,632 It is a technology that empowers the individual 149 00:06:51,656 --> 00:06:53,471 to determine its own life course, 150 00:06:53,495 --> 00:06:55,014 to shape itself. 151 00:06:55,850 --> 00:06:59,094 But the problem is, as Foucault points out, 152 00:06:59,358 --> 00:07:01,143 that every technology of the self 153 00:07:01,167 --> 00:07:04,612 has a technology of domination as its flip side. 154 00:07:04,636 --> 00:07:09,239 As you see in today's modern liberal democracies, 155 00:07:09,263 --> 00:07:13,946 the society, the state, not only allows us to determine our self, 156 00:07:13,970 --> 00:07:15,121 to shape our self, 157 00:07:15,145 --> 00:07:17,136 it also demands it of us. 158 00:07:17,160 --> 00:07:19,121 It demands that we optimize ourselves, 159 00:07:19,145 --> 00:07:20,985 that we control ourselves, 160 00:07:21,009 --> 00:07:23,720 that we self-manage continuously, 161 00:07:23,744 --> 00:07:27,347 because that's the only way in which such a liberal society works. 162 00:07:27,434 --> 00:07:28,450 In a way, 163 00:07:28,585 --> 00:07:32,069 devices like Fred Stutzman's "Freedom" 164 00:07:32,125 --> 00:07:35,593 or Buster Benson's Health Month are technologies of domination, 165 00:07:35,601 --> 00:07:37,334 because they want us to be 166 00:07:37,513 --> 00:07:40,855 (Robotic voice) fitter, happier, more productive, 167 00:07:40,950 --> 00:07:43,902 comfortable, not drinking too much, 168 00:07:44,069 --> 00:07:47,553 regular exercise at the gym three days a week, 169 00:07:47,688 --> 00:07:51,521 getting on better with your associate employee contemporaries. 170 00:07:51,546 --> 00:07:53,434 At ease. Eating well. 171 00:07:53,530 --> 00:07:56,878 No more microwave dinners and saturated fats. 172 00:07:56,966 --> 00:08:00,148 A patient, better driver, a safer car, 173 00:08:00,212 --> 00:08:02,188 [unclear] 174 00:08:02,228 --> 00:08:04,807 sleeping well, no bad dreams. 175 00:08:06,261 --> 00:08:10,529 SD: These technologies want us to stay in the game 176 00:08:10,553 --> 00:08:13,326 that society has devised for us. 177 00:08:13,350 --> 00:08:15,637 They want us to fit in even better. 178 00:08:15,661 --> 00:08:18,239 They want us to optimize ourselves to fit in. 
179 00:08:19,168 --> 00:08:22,247 Now, I'm not saying that is necessarily a bad thing; 180 00:08:22,833 --> 00:08:27,161 I just think that this example points us to a general realization, 181 00:08:27,185 --> 00:08:30,988 and that is: no matter what technology or design you look at, 182 00:08:31,012 --> 00:08:34,033 even something we consider as well-intended 183 00:08:35,857 --> 00:08:38,994 and as good in its effects as Stutzman's Freedom, 184 00:08:39,246 --> 00:08:41,953 comes with certain values embedded in it. 185 00:08:41,977 --> 00:08:43,914 And we can question these values. 186 00:08:43,938 --> 00:08:45,882 We can question: Is it a good thing 187 00:08:45,906 --> 00:08:49,390 that all of us continuously optimize ourselves 188 00:08:49,414 --> 00:08:51,425 to fit better into that society? 189 00:08:51,449 --> 00:08:52,941 Or to give you another example, 190 00:08:52,976 --> 00:08:56,031 the one from the initial presentation: 191 00:08:56,065 --> 00:08:58,541 What about a piece of persuasive technology 192 00:08:58,565 --> 00:09:01,756 that convinces Muslim women to wear their headscarves? 193 00:09:01,780 --> 00:09:03,832 Is that a good or a bad technology 194 00:09:03,856 --> 00:09:06,419 in its intentions or in its effects? 195 00:09:07,143 --> 00:09:11,397 Well, that basically depends on the kind of values you bring to bear 196 00:09:11,421 --> 00:09:13,658 to make these kinds of judgments. 197 00:09:13,682 --> 00:09:15,210 So that's a third question: 198 00:09:15,234 --> 00:09:16,762 What values do you use to judge? 199 00:09:17,388 --> 00:09:18,729 And speaking of values: 200 00:09:18,753 --> 00:09:22,109 I've noticed that in the discussion about moral persuasion online 201 00:09:22,133 --> 00:09:23,770 and when I'm talking with people, 202 00:09:24,394 --> 00:09:27,061 more often than not, there is a weird bias. 203 00:09:27,603 --> 00:09:30,499 And that bias is that we're asking: 204 00:09:30,523 --> 00:09:33,336 Is this or that "still" ethical? 205 00:09:33,360 --> 00:09:36,010 Is it "still" permissible? 206 00:09:37,234 --> 00:09:38,432 We're asking things like: 207 00:09:38,456 --> 00:09:40,645 Is this Oxfam donation form, 208 00:09:40,669 --> 00:09:43,717 where the regular monthly donation is the preset default, 209 00:09:43,741 --> 00:09:45,820 and people, maybe without intending it, 210 00:09:45,844 --> 00:09:50,455 are encouraged or nudged into giving a regular donation 211 00:09:50,480 --> 00:09:51,969 instead of a one-time donation, 212 00:09:51,993 --> 00:09:53,336 is that "still" permissible? 213 00:09:53,360 --> 00:09:54,723 Is it "still" ethical? 214 00:09:54,747 --> 00:09:56,226 We're fishing at the low end. 215 00:09:56,819 --> 00:09:59,293 But in fact, that question, "Is it 'still' ethical?" 216 00:09:59,317 --> 00:10:01,098 is just one way of looking at ethics. 217 00:10:01,122 --> 00:10:06,014 Because if you look at the beginning of ethics in Western culture, 218 00:10:06,038 --> 00:10:09,570 you see a very different idea of what ethics also could be. 219 00:10:09,910 --> 00:10:13,907 For Aristotle, ethics was not about the question, 220 00:10:13,931 --> 00:10:16,203 "Is that still good, or is it bad?" 221 00:10:16,227 --> 00:10:19,655 Ethics was about the question of how to live life well. 222 00:10:20,156 --> 00:10:22,337 And he put that in the word "arête," 223 00:10:22,361 --> 00:10:25,116 which we, from [Ancient Greek], translate as "virtue." 224 00:10:25,140 --> 00:10:26,780 But really, it means "excellence." 
225 00:10:26,804 --> 00:10:32,235 It means living up to your own full potential as a human being. 226 00:10:33,377 --> 00:10:35,033 And that is an idea that, I think, 227 00:10:35,057 --> 00:10:38,353 Paul Richard Buchanan put nicely in a recent essay, 228 00:10:38,378 --> 00:10:40,521 where he said, "Products are vivid arguments 229 00:10:40,545 --> 00:10:42,668 about how we should live our lives." 230 00:10:43,126 --> 00:10:45,703 Our designs are not ethical or unethical 231 00:10:45,727 --> 00:10:50,317 in that they're using ethical or unethical means of persuading us. 232 00:10:50,701 --> 00:10:52,271 They have a moral component 233 00:10:52,295 --> 00:10:56,522 just in the kind of vision and the aspiration of the good life 234 00:10:56,546 --> 00:10:57,894 that they present to us. 235 00:10:58,381 --> 00:11:01,884 And if you look into the designed environment around us 236 00:11:01,908 --> 00:11:03,080 with that kind of lens, 237 00:11:03,104 --> 00:11:05,557 asking, "What is the vision of the good life 238 00:11:05,581 --> 00:11:08,319 that our products, our design, present to us?", 239 00:11:08,343 --> 00:11:10,619 then you often get the shivers, 240 00:11:10,643 --> 00:11:12,971 because of how little we expect of each other, 241 00:11:12,995 --> 00:11:16,885 of how little we actually seem to expect of our life, 242 00:11:16,909 --> 00:11:18,943 and what the good life looks like. 243 00:11:20,050 --> 00:11:23,073 So that's a fourth question I'd like to leave you with: 244 00:11:23,397 --> 00:11:27,759 What vision of the good life do your designs convey? 245 00:11:28,489 --> 00:11:29,861 And speaking of design, 246 00:11:29,885 --> 00:11:34,075 you'll notice that I already broadened the discussion, 247 00:11:34,099 --> 00:11:38,543 because it's not just persuasive technology that we're talking about here, 248 00:11:38,567 --> 00:11:42,644 it's any piece of design that we put out here in the world. 249 00:11:42,668 --> 00:11:44,068 I don't know whether you know 250 00:11:44,092 --> 00:11:47,647 the great communication researcher Paul Watzlawick who, back in the '60s, 251 00:11:47,671 --> 00:11:50,182 made the argument that we cannot not communicate. 252 00:11:50,206 --> 00:11:52,807 Even if we choose to be silent, we chose to be silent, 253 00:11:52,831 --> 00:11:55,787 and we're communicating something by choosing to be silent. 254 00:11:55,811 --> 00:11:58,546 And in the same way that we cannot not communicate, 255 00:11:58,570 --> 00:12:00,101 we cannot not persuade: 256 00:12:00,125 --> 00:12:02,136 whatever we do or refrain from doing, 257 00:12:02,160 --> 00:12:06,525 whatever we put out there as a piece of design, into the world, 258 00:12:06,549 --> 00:12:08,596 has a persuasive component. 259 00:12:08,620 --> 00:12:10,493 It tries to affect people. 260 00:12:10,517 --> 00:12:14,264 It puts a certain vision of the good life out there in front of us, 261 00:12:14,888 --> 00:12:16,513 which is what Peter-Paul Verbeek, 262 00:12:16,537 --> 00:12:19,253 the Dutch philosopher of technology, says. 263 00:12:20,277 --> 00:12:24,225 No matter whether we as designers intend it or not, 264 00:12:24,249 --> 00:12:26,391 we materialize morality. 265 00:12:26,415 --> 00:12:29,218 We make certain things harder and easier to do. 266 00:12:29,242 --> 00:12:31,453 We organize the existence of people. 
267 00:12:31,477 --> 00:12:32,628 We put a certain vision 268 00:12:32,652 --> 00:12:36,056 of what good or bad or normal or usual is 269 00:12:36,080 --> 00:12:37,231 in front of people, 270 00:12:37,255 --> 00:12:39,655 by everything we put out there in the world. 271 00:12:40,087 --> 00:12:43,173 Even something as innocuous as a set of school chairs, 272 00:12:43,203 --> 00:12:47,409 a set of chairs like the ones you're sitting on and I'm standing in front of, 273 00:12:48,297 --> 00:12:50,344 is a persuasive technology, 274 00:12:50,368 --> 00:12:55,058 because it presents and materializes a certain vision of the good life... 275 00:12:55,082 --> 00:12:57,939 A good life in which teaching and learning and listening 276 00:12:57,963 --> 00:13:01,062 is about one person teaching, the others listening; 277 00:13:01,086 --> 00:13:05,139 in which learning is done while sitting; 278 00:13:05,163 --> 00:13:06,758 in which you learn for yourself; 279 00:13:06,782 --> 00:13:09,202 in which you're not supposed to change these rules, 280 00:13:09,226 --> 00:13:11,673 because the chairs are fixed to the ground. 281 00:13:12,828 --> 00:13:15,739 And even something as innocuous as a single-design chair, 282 00:13:15,763 --> 00:13:17,335 like this one by Arne Jacobsen, 283 00:13:17,359 --> 00:13:19,135 is a persuasive technology, 284 00:13:19,159 --> 00:13:22,202 because, again, it communicates an idea of the good life: 285 00:13:22,675 --> 00:13:28,087 a good life... a life that you, as a designer, consent to by saying, 286 00:13:28,124 --> 00:13:31,631 "In a good life, goods are produced as sustainably or unsustainably 287 00:13:31,655 --> 00:13:33,266 as this chair. 288 00:13:33,290 --> 00:13:35,267 Workers are treated as well or as badly 289 00:13:35,291 --> 00:13:37,863 as the workers who built that chair were treated." 290 00:13:38,202 --> 00:13:40,503 The good life is a life where design is important, 291 00:13:40,527 --> 00:13:43,448 because somebody obviously took the time and spent the money 292 00:13:43,472 --> 00:13:45,256 for that kind of well-designed chair; 293 00:13:45,280 --> 00:13:46,684 where tradition is important, 294 00:13:46,708 --> 00:13:49,874 because this is a traditional classic and someone cared about this; 295 00:13:49,898 --> 00:13:52,588 and where there is such a thing as conspicuous consumption, 296 00:13:52,612 --> 00:13:55,568 where it is OK and normal to spend a humongous amount of money 297 00:13:55,592 --> 00:13:57,642 on such a chair, 298 00:13:57,667 --> 00:14:00,240 to signal to other people what your social status is. 299 00:14:02,164 --> 00:14:05,481 So these are the kinds of layers, the kinds of questions 300 00:14:05,505 --> 00:14:07,475 I wanted to lead you through today; 301 00:14:07,499 --> 00:14:10,533 the question of: What are the intentions that you bring to bear 302 00:14:10,557 --> 00:14:12,117 when you're designing something? 303 00:14:12,141 --> 00:14:15,393 What are the effects, intended and unintended, that you're having? 304 00:14:15,417 --> 00:14:18,218 What are the values you're using to judge those? 305 00:14:18,242 --> 00:14:20,215 What are the virtues, the aspirations 306 00:14:20,239 --> 00:14:22,267 that you're actually expressing in that? 307 00:14:22,640 --> 00:14:24,497 And how does that apply, 308 00:14:24,521 --> 00:14:26,507 not just to persuasive technology, 309 00:14:26,531 --> 00:14:28,579 but to everything you design? 310 00:14:29,452 --> 00:14:30,743 Do we stop there? 311 00:14:31,355 --> 00:14:32,522 I don't think so. 
312 00:14:32,831 --> 00:14:37,269 I think that all of these things are eventually informed 313 00:14:37,293 --> 00:14:38,716 by the core of all of this, 314 00:14:38,740 --> 00:14:41,831 and this is nothing but life itself. 315 00:14:42,134 --> 00:14:44,851 Why, when the question of what the good life is 316 00:14:44,875 --> 00:14:47,214 informs everything that we design, 317 00:14:47,238 --> 00:14:50,032 should we stop at design and not ask ourselves: 318 00:14:50,056 --> 00:14:52,046 How does it apply to our own life? 319 00:14:52,421 --> 00:14:55,133 "Why should the lamp or the house be an art object, 320 00:14:55,157 --> 00:14:56,333 but not our life?" 321 00:14:56,357 --> 00:14:57,840 as Michel Foucault puts it. 322 00:14:58,236 --> 00:15:01,764 Just to give you a practical example of Buster Benson, 323 00:15:01,789 --> 00:15:03,765 whom I mentioned at the beginning: 324 00:15:03,871 --> 00:15:06,728 This is Buster setting up a pull-up machine 325 00:15:06,753 --> 00:15:09,365 at the office of his new start-up, Habit Labs, 326 00:15:09,389 --> 00:15:12,604 where they're trying to build other applications like "Health Month" 327 00:15:12,628 --> 00:15:13,786 for people. 328 00:15:13,810 --> 00:15:15,772 And why is he building a thing like this? 329 00:15:15,796 --> 00:15:17,829 Well, here is the set of axioms 330 00:15:17,853 --> 00:15:22,897 that Habit Labs, Buster's start-up, put up for themselves 331 00:15:22,975 --> 00:15:25,680 on how they wanted to work together as a team 332 00:15:25,704 --> 00:15:27,733 when they're building these applications... 333 00:15:27,757 --> 00:15:29,946 A set of moral principles they set themselves 334 00:15:29,970 --> 00:15:31,331 for working together... 335 00:15:31,355 --> 00:15:32,593 One of them being, 336 00:15:32,617 --> 00:15:35,704 "We take care of our own health and manage our own burnout." 337 00:15:36,061 --> 00:15:40,901 Because ultimately, how can you ask yourselves 338 00:15:40,985 --> 00:15:44,915 and how can you find an answer to what vision of the good life 339 00:15:44,939 --> 00:15:48,108 you want to convey and create with your designs 340 00:15:48,132 --> 00:15:49,862 without asking the question: 341 00:15:49,886 --> 00:15:53,718 What vision of the good life do you yourself want to live? 342 00:15:53,885 --> 00:15:55,059 And with that, 343 00:15:55,827 --> 00:15:57,239 I thank you. 344 00:15:57,494 --> 00:16:01,650 (Applause)