WEBVTT 00:00:01.167 --> 00:00:02.500 Hi there, everyone. 00:00:03.722 --> 00:00:07.389 I would like to start by asking you a simple question. 00:00:07.452 --> 00:00:08.484 And that is, 00:00:08.531 --> 00:00:12.536 who of you wants to build a product that is as captivating and engaging 00:00:12.561 --> 00:00:14.494 as, say, Facebook or Twitter? 00:00:14.641 --> 00:00:17.307 If you think so, please raise your hand. 00:00:17.513 --> 00:00:20.339 Something that's as captivating and engaging as Twitter. 00:00:20.355 --> 00:00:22.088 Please keep your hands up. 00:00:22.625 --> 00:00:24.878 Now, those of you who have kept your hands up, 00:00:24.919 --> 00:00:26.902 please keep your hands up if you feel 00:00:26.950 --> 00:00:29.997 that you're spending more time than you should spend 00:00:30.030 --> 00:00:32.910 on sites like Facebook or Twitter, 00:00:32.934 --> 00:00:34.997 time that would be better spent 00:00:35.038 --> 00:00:38.624 with friends or spouses or doing things that you actually love. 00:00:39.720 --> 00:00:42.085 Okay. Those who still have their arms up, 00:00:42.133 --> 00:00:44.200 please discuss after the break. 00:00:44.220 --> 00:00:46.077 (Laughter) 00:00:46.125 --> 00:00:48.066 So, why am I asking this question? 00:00:48.474 --> 00:00:49.529 I am asking it, 00:00:49.591 --> 00:00:52.442 because we are today talking about moral persuasion: 00:00:52.501 --> 00:00:56.483 What is moral and immoral in trying to change people's behaviors 00:00:56.507 --> 00:00:58.960 by using technology and using design? 00:00:58.984 --> 00:01:00.816 And I don't know what you expect, 00:01:00.840 --> 00:01:02.793 but when I was thinking about that issue, 00:01:02.817 --> 00:01:06.856 I early on realized what I'm not able to give you are answers. 00:01:07.443 --> 00:01:10.214 I'm not able to tell you what is moral or immoral, 00:01:10.238 --> 00:01:12.785 because we're living in a pluralist society. 00:01:12.809 --> 00:01:17.051 My values can be radically different from your values, 00:01:17.075 --> 00:01:20.252 which means that what I consider moral or immoral based on that 00:01:20.276 --> 00:01:23.888 might not necessarily be what you consider moral or immoral. 00:01:24.269 --> 00:01:27.483 But I also realized there is one thing that I could give you, 00:01:27.507 --> 00:01:30.040 and that is what this guy behind me gave the world... 00:01:30.064 --> 00:01:31.214 Socrates. 00:01:31.238 --> 00:01:32.633 It is questions. 00:01:32.657 --> 00:01:35.290 What I can do and what I would like to do with you 00:01:35.314 --> 00:01:37.259 is give you, like that initial question, 00:01:37.283 --> 00:01:41.914 a set of questions to figure out for yourselves, 00:01:42.017 --> 00:01:45.658 layer by layer, like peeling an onion, 00:01:45.682 --> 00:01:50.702 getting at the core of what you believe is moral or immoral persuasion. 00:01:51.099 --> 00:01:54.287 And I'd like to do that with a couple of examples, 00:01:54.446 --> 00:01:57.597 as was said, a couple of examples of technologies 00:01:57.664 --> 00:02:02.644 where people have used game elements to get people to do things. 00:02:03.561 --> 00:02:06.545 So, here's the first example leading us to our first question: 00:02:06.585 --> 00:02:09.118 one of my favorite examples of gamification, 00:02:09.143 --> 00:02:10.538 Buster Benson's Health Month. 00:02:10.568 --> 00:02:14.243 It's a simple application where you set yourself health rules 00:02:14.942 --> 00:02:15.942 for one month.
00:02:16.188 --> 00:02:18.767 Rules like, I want to exercise six times a week. 00:02:18.823 --> 00:02:20.656 Or, I don't want to drink any alcohol. 00:02:20.704 --> 00:02:23.767 And every morning you get an email, asking you, 00:02:23.853 --> 00:02:25.583 Did you stick to your rules or not? 00:02:25.607 --> 00:02:27.894 And you say yes or no to the different questions. 00:02:27.927 --> 00:02:30.553 And then, on the platform, you can see how well you did, 00:02:30.958 --> 00:02:33.625 you can earn points and badges for that. 00:02:33.665 --> 00:02:36.058 That information is shared with your peers, 00:02:36.108 --> 00:02:38.990 and if you don't stick to a rule, you lose a health point, 00:02:39.015 --> 00:02:41.140 but your friends can chip in and heal you. 00:02:41.879 --> 00:02:45.442 Beautiful example, and I believe most of you will agree with me 00:02:45.760 --> 00:02:48.490 that it kind of sounds like ethical persuasion, right? 00:02:48.561 --> 00:02:50.910 It sounds like something that is good to do. 00:02:50.948 --> 00:02:52.482 Here's another example. 00:02:52.768 --> 00:02:55.513 Very similar in the kind of thinking behind it, 00:02:55.569 --> 00:02:57.505 very different example: Lockerz. 00:02:57.593 --> 00:03:01.069 It's a social platform where people set up profiles 00:03:01.180 --> 00:03:04.418 and on them, the main thing they do is they put up product pictures, 00:03:04.498 --> 00:03:06.565 pictures of products they like, 00:03:06.617 --> 00:03:08.831 and link their profiles with their friends, 00:03:08.863 --> 00:03:12.339 and every time I click on one of those products on your page, 00:03:12.371 --> 00:03:13.474 you get points, 00:03:13.487 --> 00:03:16.783 and every time you click on a product on my page, 00:03:17.125 --> 00:03:18.125 I get points, 00:03:18.157 --> 00:03:20.791 and if you actually buy something, I get a lot of points, 00:03:20.838 --> 00:03:23.490 and both of us can then exchange these points 00:03:23.831 --> 00:03:26.165 for percentage discounts on these products. 00:03:26.538 --> 00:03:28.101 Now, I don't know how you feel, 00:03:28.173 --> 00:03:30.839 but personally I think that Health Month 00:03:30.895 --> 00:03:33.628 is something that feels to me very benign 00:03:33.704 --> 00:03:37.005 and a good piece, a moral piece of technology, 00:03:37.046 --> 00:03:40.632 whereas there's something about Lockerz that makes me feel a little queasy. 00:03:41.514 --> 00:03:44.918 Thinking about what it is exactly that makes me feel queasy here, 00:03:44.943 --> 00:03:46.450 in this case, versus the other, 00:03:46.490 --> 00:03:48.640 led to a very simple answer, and that is, well, 00:03:48.657 --> 00:03:50.505 the intention behind it, right? 00:03:51.355 --> 00:03:54.997 In one case, the intention is, "That site wants me to be healthier, 00:03:55.022 --> 00:03:57.418 and the other site wants me to shop more." 00:03:57.683 --> 00:04:00.723 So it's at first a very simple, very obvious question 00:04:00.747 --> 00:04:01.943 I would like to give you: 00:04:01.967 --> 00:04:04.654 What are your intentions if you are designing something? 00:04:05.008 --> 00:04:08.381 And obviously, intentions are not the only thing, 00:04:08.705 --> 00:04:12.140 so here is another example of one of these applications. 00:04:12.307 --> 00:04:15.394 There are a couple of these kinds of eco-dashboards right now... 00:04:15.418 --> 00:04:16.878 dashboards built into cars... 00:04:16.902 --> 00:04:19.733 which try to motivate you to drive more fuel-efficiently.
00:04:19.757 --> 00:04:21.531 This here is Nissan's MyLeaf, 00:04:21.555 --> 00:04:25.518 where your driving behavior is compared with the driving behavior 00:04:25.543 --> 00:04:26.694 of other people, 00:04:26.718 --> 00:04:29.995 so you can compete for who drives a route the most fuel-efficiently. 00:04:30.019 --> 00:04:32.496 And these things are very effective, it turns out... 00:04:32.520 --> 00:04:36.459 so effective that they motivate people to engage in unsafe driving behaviors, 00:04:36.483 --> 00:04:38.444 like not stopping at a red light, 00:04:38.469 --> 00:04:41.184 because that way you have to stop and restart the engine, 00:04:41.208 --> 00:04:43.931 and that would use quite some fuel, wouldn't it? 00:04:44.978 --> 00:04:49.387 So despite this being a very well-intended application, 00:04:49.411 --> 00:04:51.786 obviously there was a side effect of that. 00:04:51.810 --> 00:04:54.358 Here's another example of one of these side effects. 00:04:54.382 --> 00:04:59.166 Commendable: a site that allows parents to give their kids little badges 00:04:59.190 --> 00:05:01.848 for doing the things that parents want their kids to do, 00:05:01.872 --> 00:05:03.188 like tying their shoes. 00:05:03.212 --> 00:05:05.456 And at first that sounds very nice, 00:05:05.480 --> 00:05:07.630 very benign, well-intended. 00:05:07.654 --> 00:05:11.923 But it turns out, if you look into research on people's mindsets, 00:05:11.948 --> 00:05:13.435 caring about outcomes, 00:05:13.459 --> 00:05:15.232 caring about public recognition, 00:05:15.256 --> 00:05:19.117 caring about these kinds of public tokens of recognition 00:05:19.141 --> 00:05:21.135 is not necessarily very helpful 00:05:21.159 --> 00:05:23.466 for your long-term psychological well-being. 00:05:23.490 --> 00:05:26.166 It's better if you care about learning something. 00:05:26.190 --> 00:05:28.095 It's better when you care about yourself 00:05:28.119 --> 00:05:30.693 than how you appear in front of other people. 00:05:31.161 --> 00:05:36.202 So that kind of motivational tool that is used actually, in and of itself, 00:05:36.226 --> 00:05:38.134 has a long-term side effect, 00:05:38.158 --> 00:05:39.968 in that every time we use a technology 00:05:39.992 --> 00:05:43.166 that uses something like public recognition or status, 00:05:43.190 --> 00:05:45.566 we're actually positively endorsing this 00:05:45.590 --> 00:05:49.000 as a good and normal thing to care about... 00:05:49.024 --> 00:05:51.888 that way, possibly having a detrimental effect 00:05:51.912 --> 00:05:55.767 on the long-term psychological well-being of ourselves as a culture. 00:05:55.791 --> 00:05:58.435 So that's a second, very obvious question: 00:05:58.459 --> 00:06:00.770 What are the effects of what you're doing... 00:06:00.794 --> 00:06:04.918 the effects you're having with the device, like less fuel, 00:06:04.942 --> 00:06:07.647 as well as the effects of the actual tools you're using 00:06:07.671 --> 00:06:09.345 to get people to do things... 00:06:09.369 --> 00:06:10.940 public recognition? 00:06:11.264 --> 00:06:14.185 Now, is that all... intention, effect? 00:06:14.209 --> 00:06:17.343 Well, there are some technologies which obviously combine both: 00:06:17.367 --> 00:06:20.154 both good long-term and short-term effects 00:06:20.178 --> 00:06:24.409 and a positive intention, like Fred Stutzman's "Freedom," 00:06:24.611 --> 00:06:26.818 where the whole point of that application is...
00:06:26.842 --> 00:06:30.567 Well, we're usually so bombarded with constant requests by other people, 00:06:30.591 --> 00:06:31.747 with this device, 00:06:31.771 --> 00:06:35.251 you can shut off the Internet connectivity of your PC of choice 00:06:35.275 --> 00:06:36.723 for a pre-set amount of time, 00:06:36.747 --> 00:06:38.718 to actually get some work done. 00:06:39.742 --> 00:06:42.893 And I think most of us will agree that's something well-intended, 00:06:42.917 --> 00:06:45.137 and also has good consequences. 00:06:45.161 --> 00:06:46.807 In the words of Michel Foucault, 00:06:46.831 --> 00:06:48.771 it is a "technology of the self." 00:06:48.795 --> 00:06:51.632 It is a technology that empowers the individual 00:06:51.656 --> 00:06:53.471 to determine their own life course, 00:06:53.495 --> 00:06:55.014 to shape themselves. 00:06:55.850 --> 00:06:59.094 But the problem is, as Foucault points out, 00:06:59.358 --> 00:07:01.143 that every technology of the self 00:07:01.167 --> 00:07:04.612 has a technology of domination as its flip side. 00:07:04.636 --> 00:07:09.239 As you see in today's modern liberal democracies, 00:07:09.263 --> 00:07:13.946 the society, the state, not only allows us to determine our self, 00:07:13.970 --> 00:07:15.121 to shape our self, 00:07:15.145 --> 00:07:17.136 it also demands it of us. 00:07:17.160 --> 00:07:19.121 It demands that we optimize ourselves, 00:07:19.145 --> 00:07:20.985 that we control ourselves, 00:07:21.009 --> 00:07:23.720 that we self-manage continuously, 00:07:23.744 --> 00:07:27.347 because that's the only way in which such a liberal society works. 00:07:27.434 --> 00:07:28.450 In a way, 00:07:28.585 --> 00:07:32.069 the kind of devices like Fred Stutzman's "Freedom" 00:07:32.125 --> 00:07:35.593 or Buster Benson's Health Month are technologies of domination, 00:07:35.601 --> 00:07:37.334 because they want us to be 00:07:37.513 --> 00:07:40.855 (Robotic voice) fitter, happier, more productive, 00:07:40.950 --> 00:07:43.902 comfortable, not drinking too much, 00:07:44.069 --> 00:07:47.553 regular exercise at the gym three days a week, 00:07:47.688 --> 00:07:51.521 getting on better with your associate employee contemporaries. 00:07:51.546 --> 00:07:53.434 At ease. Eating well. 00:07:53.530 --> 00:07:56.878 No more microwave dinners and saturated fats. 00:07:56.966 --> 00:08:00.148 A patient, better driver, a safer car, 00:08:00.212 --> 00:08:02.188 [unclear] 00:08:02.228 --> 00:08:04.807 sleeping well, no bad dreams. 00:08:06.261 --> 00:08:10.529 SD: These technologies want us to stay in the game 00:08:10.553 --> 00:08:13.326 that society has devised for us. 00:08:13.350 --> 00:08:15.637 They want us to fit in even better. 00:08:15.661 --> 00:08:18.239 They want us to optimize ourselves to fit in. 00:08:19.168 --> 00:08:22.247 Now, I don't say that is necessarily a bad thing; 00:08:22.833 --> 00:08:27.161 I just think that this example points us to a general realization, 00:08:27.185 --> 00:08:30.988 and that is: no matter what technology or design you look at, 00:08:31.012 --> 00:08:34.033 even something we consider as well-intended 00:08:35.857 --> 00:08:38.994 and as good in its effects as Stutzman's Freedom, 00:08:39.246 --> 00:08:41.953 comes with certain values embedded in it. 00:08:41.977 --> 00:08:43.914 And we can question these values.
00:08:43.938 --> 00:08:45.882 We can question: Is it a good thing 00:08:45.906 --> 00:08:49.390 that all of us continuously self-optimize 00:08:49.414 --> 00:08:51.425 to fit better into that society? 00:08:51.449 --> 00:08:52.941 Or to give you another example: 00:08:52.976 --> 00:08:56.031 the one from the initial presentation: 00:08:56.065 --> 00:08:58.541 What about a piece of persuasive technology 00:08:58.565 --> 00:09:01.756 that convinces Muslim women to wear their headscarves? 00:09:01.780 --> 00:09:03.832 Is that a good or a bad technology 00:09:03.856 --> 00:09:06.419 in its intentions or in its effects? 00:09:07.143 --> 00:09:11.397 Well, that basically depends on the kind of values you bring to bear 00:09:11.421 --> 00:09:13.658 to make these kinds of judgments. 00:09:13.682 --> 00:09:15.210 So that's a third question: 00:09:15.234 --> 00:09:16.762 What values do you use to judge? 00:09:17.388 --> 00:09:18.729 And speaking of values: 00:09:18.753 --> 00:09:22.109 I've noticed that in the discussion about moral persuasion online 00:09:22.133 --> 00:09:23.770 and when I'm talking with people, 00:09:24.394 --> 00:09:27.061 more often than not, there is a weird bias. 00:09:27.603 --> 00:09:30.499 And that bias is that we're asking: 00:09:30.523 --> 00:09:33.336 Is this or that "still" ethical? 00:09:33.360 --> 00:09:36.010 Is it "still" permissible? 00:09:37.234 --> 00:09:38.432 We're asking things like: 00:09:38.456 --> 00:09:40.645 Is this Oxfam donation form, 00:09:40.669 --> 00:09:43.717 where the regular monthly donation is the preset default, 00:09:43.741 --> 00:09:45.820 and people, maybe without intending it, 00:09:45.844 --> 00:09:50.455 are encouraged or nudged into giving a regular donation 00:09:50.480 --> 00:09:51.969 instead of a one-time donation, 00:09:51.993 --> 00:09:53.336 is that "still" permissible? 00:09:53.360 --> 00:09:54.723 Is it "still" ethical? 00:09:54.747 --> 00:09:56.226 We're fishing at the low end. 00:09:56.819 --> 00:09:59.293 But in fact, that question, "Is it 'still' ethical?" 00:09:59.317 --> 00:10:01.098 is just one way of looking at ethics. 00:10:01.122 --> 00:10:06.014 Because if you look at the beginning of ethics in Western culture, 00:10:06.038 --> 00:10:09.570 you see a very different idea of what ethics also could be. 00:10:09.910 --> 00:10:13.907 For Aristotle, ethics was not about the question, 00:10:13.931 --> 00:10:16.203 "Is that still good, or is it bad?" 00:10:16.227 --> 00:10:19.655 Ethics was about the question of how to live life well. 00:10:20.156 --> 00:10:22.337 And he put that in the word "arete," 00:10:22.361 --> 00:10:25.116 which we, from [Ancient Greek], translate as "virtue." 00:10:25.140 --> 00:10:26.780 But really, it means "excellence." 00:10:26.804 --> 00:10:32.235 It means living up to your own full potential as a human being. 00:10:33.377 --> 00:10:35.033 And that is an idea that, I think, 00:10:35.057 --> 00:10:38.353 Paul Richard Buchanan put nicely in a recent essay, 00:10:38.378 --> 00:10:40.521 where he said, "Products are vivid arguments 00:10:40.545 --> 00:10:42.668 about how we should live our lives." 00:10:43.126 --> 00:10:45.703 Our designs are not ethical or unethical 00:10:45.727 --> 00:10:50.317 in that they're using ethical or unethical means of persuading us. 00:10:50.701 --> 00:10:52.271 They have a moral component 00:10:52.295 --> 00:10:56.522 just in the kind of vision and the aspiration of the good life 00:10:56.546 --> 00:10:57.894 that they present to us.
00:10:58.381 --> 00:11:01.884 And if you look into the designed environment around us 00:11:01.908 --> 00:11:03.080 with that kind of lens, 00:11:03.104 --> 00:11:05.557 asking, "What is the vision of the good life 00:11:05.581 --> 00:11:08.319 that our products, our design, present to us?", 00:11:08.343 --> 00:11:10.619 then you often get the shivers, 00:11:10.643 --> 00:11:12.971 because of how little we expect of each other, 00:11:12.995 --> 00:11:16.885 of how little we actually seem to expect of our life, 00:11:16.909 --> 00:11:18.943 and what the good life looks like. 00:11:20.050 --> 00:11:23.073 So that's a fourth question I'd like to leave you with: 00:11:23.397 --> 00:11:27.759 What vision of the good life do your designs convey? 00:11:28.489 --> 00:11:29.861 And speaking of design, 00:11:29.885 --> 00:11:34.075 you'll notice that I already broadened the discussion, 00:11:34.099 --> 00:11:38.543 because it's not just persuasive technology that we're talking about here, 00:11:38.567 --> 00:11:42.644 it's any piece of design that we put out there in the world. 00:11:42.668 --> 00:11:44.068 I don't know whether you know 00:11:44.092 --> 00:11:47.647 the great communication researcher Paul Watzlawick who, back in the '60s, 00:11:47.671 --> 00:11:50.182 made the argument that we cannot not communicate. 00:11:50.206 --> 00:11:52.807 Even if we choose to be silent, we chose to be silent, 00:11:52.831 --> 00:11:55.787 and we're communicating something by choosing to be silent. 00:11:55.811 --> 00:11:58.546 And in the same way that we cannot not communicate, 00:11:58.570 --> 00:12:00.101 we cannot not persuade: 00:12:00.125 --> 00:12:02.136 whatever we do or refrain from doing, 00:12:02.160 --> 00:12:06.525 whatever we put out there as a piece of design, into the world, 00:12:06.549 --> 00:12:08.596 has a persuasive component. 00:12:08.620 --> 00:12:10.493 It tries to affect people. 00:12:10.517 --> 00:12:14.264 It puts a certain vision of the good life out there in front of us, 00:12:14.888 --> 00:12:16.513 which is what Peter-Paul Verbeek, 00:12:16.537 --> 00:12:19.253 the Dutch philosopher of technology, says. 00:12:20.277 --> 00:12:24.225 No matter whether we as designers intend it or not, 00:12:24.249 --> 00:12:26.391 we materialize morality. 00:12:26.415 --> 00:12:29.218 We make certain things harder and easier to do. 00:12:29.242 --> 00:12:31.453 We organize the existence of people. 00:12:31.477 --> 00:12:32.628 We put a certain vision 00:12:32.652 --> 00:12:36.056 of what good or bad or normal or usual is 00:12:36.080 --> 00:12:37.231 in front of people, 00:12:37.255 --> 00:12:39.655 by everything we put out there in the world. 00:12:40.087 --> 00:12:43.173 Even something as innocuous as a set of school chairs, 00:12:43.203 --> 00:12:47.409 the set of chairs that you're sitting on and I'm standing in front of, 00:12:48.297 --> 00:12:50.344 is a persuasive technology, 00:12:50.368 --> 00:12:55.058 because it presents and materializes a certain vision of the good life... 00:12:55.082 --> 00:12:57.939 a good life in which teaching and learning and listening 00:12:57.963 --> 00:13:01.062 is about one person teaching, the others listening; 00:13:01.086 --> 00:13:05.139 in which learning is done while sitting; 00:13:05.163 --> 00:13:06.758 in which you learn for yourself; 00:13:06.782 --> 00:13:09.202 in which you're not supposed to change these rules, 00:13:09.226 --> 00:13:11.673 because the chairs are fixed to the ground.
00:13:12.828 --> 00:13:15.739 And even something as innocuous as a single design chair, 00:13:15.763 --> 00:13:17.335 like this one by Arne Jacobsen, 00:13:17.359 --> 00:13:19.135 is a persuasive technology, 00:13:19.159 --> 00:13:22.202 because, again, it communicates an idea of the good life: 00:13:22.675 --> 00:13:28.087 a good life... a life that you, as a designer, consent to by saying, 00:13:28.124 --> 00:13:31.631 "In a good life, goods are produced as sustainably or unsustainably 00:13:31.655 --> 00:13:33.266 as this chair. 00:13:33.290 --> 00:13:35.267 Workers are treated as well or as badly 00:13:35.291 --> 00:13:37.863 as the workers who built that chair were treated." 00:13:38.202 --> 00:13:40.503 The good life is a life where design is important, 00:13:40.527 --> 00:13:43.448 because somebody obviously took the time and spent the money 00:13:43.472 --> 00:13:45.256 for that kind of well-designed chair; 00:13:45.280 --> 00:13:46.684 where tradition is important, 00:13:46.708 --> 00:13:49.874 because this is a traditional classic and someone cared about this; 00:13:49.898 --> 00:13:52.588 and where there is something like conspicuous consumption, 00:13:52.612 --> 00:13:55.568 where it is OK and normal to spend a humongous amount of money 00:13:55.592 --> 00:13:57.642 on such a chair, 00:13:57.667 --> 00:14:00.240 to signal to other people what your social status is. 00:14:02.164 --> 00:14:05.481 So these are the kinds of layers, the kinds of questions 00:14:05.505 --> 00:14:07.475 I wanted to lead you through today; 00:14:07.499 --> 00:14:10.533 the question of: What are the intentions that you bring to bear 00:14:10.557 --> 00:14:12.117 when you're designing something? 00:14:12.141 --> 00:14:15.393 What are the effects, intended and unintended, that you're having? 00:14:15.417 --> 00:14:18.218 What are the values you're using to judge those? 00:14:18.242 --> 00:14:20.215 What are the virtues, the aspirations 00:14:20.239 --> 00:14:22.267 that you're actually expressing in that? 00:14:22.640 --> 00:14:24.497 And how does that apply, 00:14:24.521 --> 00:14:26.507 not just to persuasive technology, 00:14:26.531 --> 00:14:28.579 but to everything you design? 00:14:29.452 --> 00:14:30.743 Do we stop there? 00:14:31.355 --> 00:14:32.522 I don't think so. 00:14:32.831 --> 00:14:37.269 I think that all of these things are eventually informed 00:14:37.293 --> 00:14:38.716 by the core of all of this, 00:14:38.740 --> 00:14:41.831 and this is nothing but life itself. 00:14:42.134 --> 00:14:44.851 Why, when the question of what the good life is 00:14:44.875 --> 00:14:47.214 informs everything that we design, 00:14:47.238 --> 00:14:50.032 should we stop at design and not ask ourselves: 00:14:50.056 --> 00:14:52.046 How does it apply to our own life? 00:14:52.421 --> 00:14:55.133 "Why should the lamp or the house be an art object, 00:14:55.157 --> 00:14:56.333 but not our life?" 00:14:56.357 --> 00:14:57.840 as Michel Foucault puts it. 00:14:58.236 --> 00:15:01.764 Just to give you a practical example of Buster Benson, 00:15:01.789 --> 00:15:03.765 whom I mentioned at the beginning. 00:15:03.871 --> 00:15:06.728 This is Buster setting up a pull-up machine 00:15:06.753 --> 00:15:09.365 at the office of his new start-up, Habit Labs, 00:15:09.389 --> 00:15:12.604 where they're trying to build other applications like "Health Month" 00:15:12.628 --> 00:15:13.786 for people. 00:15:13.810 --> 00:15:15.772 And why is he building a thing like this?
00:15:15.796 --> 00:15:17.829 Well, here is the set of axioms 00:15:17.853 --> 00:15:22.897 that Habit Labs, Buster's start-up, put up for themselves 00:15:22.975 --> 00:15:25.680 on how they wanted to work together as a team 00:15:25.704 --> 00:15:27.733 when they're building these applications... 00:15:27.757 --> 00:15:29.946 a set of moral principles they set themselves 00:15:29.970 --> 00:15:31.331 for working together... 00:15:31.355 --> 00:15:32.593 one of them being, 00:15:32.617 --> 00:15:35.704 "We take care of our own health and manage our own burnout." 00:15:36.061 --> 00:15:40.901 Because ultimately, how can you ask yourselves 00:15:40.985 --> 00:15:44.915 and how can you find an answer to what vision of the good life 00:15:44.939 --> 00:15:48.108 you want to convey and create with your designs 00:15:48.132 --> 00:15:49.862 without asking the question: 00:15:49.886 --> 00:15:53.718 What vision of the good life do you yourself want to live? 00:15:53.885 --> 00:15:55.059 And with that, 00:15:55.827 --> 00:15:57.239 I thank you. 00:15:57.494 --> 00:16:01.650 (Applause)