Hi there, everyone.

I would like to start by asking you a simple question, and that is: who of you wants to build a product that is as captivating and engaging as, say, Facebook or Twitter? If you think so, please raise your hand. Something that's as captivating and engaging as Twitter. Please keep your hands up.

Now, those of you who have kept your hands up, please keep your hands up if you feel that you're spending more time than you should on sites like Facebook or Twitter, time that would be better spent with friends or spouses, or doing things that you actually love.

Okay. Those who still have their arms up, please discuss after the break.

(Laughter)

So, why am I asking this question? I am asking it because we are talking today about moral persuasion: what is moral and immoral in trying to change people's behavior by using technology and design? And I don't know what you expect, but when I was thinking about that issue, I realized early on that what I'm not able to give you are answers.

I'm not able to tell you what is moral or immoral, because we're living in a pluralist society. My values can be radically different from your values, which means that what I consider moral or immoral based on them might not necessarily be what you consider moral or immoral.

But I also realized there is one thing that I could give you, and that is what this guy behind me gave the world: Socrates. It is questions. What I can do, and what I would like to do with you, is give you, like that initial question, a set of questions to figure out for yourselves, layer by layer, like peeling an onion, getting at the core of what you believe is moral or immoral persuasion.

And I'd like to do that with a couple of examples, as was said, a couple of examples of technologies where people have used game elements to get people to do things.

So here's the first example, leading us to our first question: one of my favorite examples of gamification, Buster Benson's Health Month. It's a simple application where you set yourself health rules for one month. Rules like, "I want to exercise six times a week," or, "I don't want to drink any alcohol." And every morning you get an email asking you, "Did you stick to your rules or not?"
You say yes or no to the different questions, and then, on the platform, you can see how well you did. You can earn points and badges for that. That information is shared with your peers, and if you don't stick to a rule, you lose a health point, but your friends can chip in and heal you.

A beautiful example, and I believe most of you will agree with me that it kind of sounds like ethical persuasion, right? It sounds like something that is good to do.

Here's another example, very similar in the kind of thinking behind it, but a very different example: Lockers. It's a social platform where people set up profiles, and on them, the main thing they do is put up product pictures, pictures of products they like, and link their profiles with their friends'. Every time I click on one of those products on your page, you get points, and every time you click on a product on my page, I get points, and if you actually buy something, I get a lot of points. Both of us can then exchange these points for percentage discounts on those products.

Now, I don't know how you feel, but personally I think that Health Month is something that feels very benign to me, a good piece, a moral piece of technology, whereas there's something about Lockers that makes me feel a little queasy.

Thinking about what exactly makes me feel queasy in this case versus the other, there was a very simple answer, and that is, well, the intention behind it, right? In one case, the intention is, "That site wants me to be healthier"; in the other, "That site wants me to shop more."

So at first it's a very simple, very obvious question I would like to give you: What are your intentions if you are designing something?

And obviously, intentions are not the only thing, so here is another example of one of these applications. There are a couple of these kinds of eco dashboards right now, dashboards built into cars, which try to motivate you to drive more fuel-efficiently. This here is Nissan's MyLeaf, where your driving behavior is compared with the driving behavior of other people, so you can compete for who drives a route the most fuel-efficiently. And these things are very effective, it turns out...
So effective that they motivate people to engage in unsafe driving behaviors, like not stopping at a red light, because that way you have to stop and restart the engine, and that would use quite some fuel, wouldn't it?

So despite this being a very well-intended application, obviously there was a side effect.

Here's another example of one of these side effects. Commendable: a site that allows parents to give their kids little badges for doing the things that parents want their kids to do, like tying their shoes. And at first that sounds very nice, very benign, well-intended. But it turns out, if you look into research on people's mindsets, that caring about outcomes, caring about public recognition, caring about these kinds of public tokens of recognition is not necessarily very helpful for your long-term psychological well-being. It's better if you care about learning something. It's better when you care about yourself rather than about how you appear in front of other people.

So that kind of motivational tool, in and of itself, has a long-term side effect, in that every time we use a technology that uses something like public recognition or status, we're actually positively endorsing this as a good and normal thing to care about, and that way possibly having a detrimental effect on the long-term psychological well-being of ourselves as a culture.

So that's a second, very obvious question: What are the effects of what you're doing? The effects you're having with the device, like less fuel, as well as the effects of the actual tools you're using to get people to do things: public recognition.

Now, is that all? Intention, effect? Well, there are some technologies which obviously combine both, both good long-term and short-term effects and a positive intention, like Fred Stutzman's "Freedom," where the whole point of that application is, well, we're usually so bombarded with constant requests by other people; with this device, you can shut off the Internet connectivity of your PC of choice for a preset amount of time, to actually get some work done.

And I think most of us will agree that's something well-intended, and it also has good consequences. In the words of Michel Foucault, it is a "technology of the self."
It is a technology that empowers the individual to determine their own life course, to shape themselves. But the problem is, as Foucault points out, that every technology of the self has a technology of domination as its flip side. As you see in today's modern liberal democracies, the society, the state, not only allows us to determine our self, to shape our self, it also demands it of us. It demands that we optimize ourselves, that we control ourselves, that we self-manage continuously, because that's the only way in which such a liberal society works.

In a way, devices like Fred Stutzman's "Freedom" or Buster Benson's Health Month are technologies of domination, because they want us to be

(Robotic voice) fitter, happier, more productive, comfortable, not drinking too much, regular exercise at the gym three days a week, getting on better with your associate employee contemporaries, at ease, eating well, no more microwave dinners and saturated fats, a patient, better driver, a safer car, [unclear], sleeping well, no bad dreams.

SD: These technologies want us to stay in the game that society has devised for us. They want us to fit in even better. They want us to optimize ourselves to fit in.

Now, I don't say that is necessarily a bad thing; I just think that this example points us to a general realization, and that is: no matter what technology or design you look at, even something we consider as well-intended and as good in its effects as Stutzman's Freedom comes with certain values embedded in it. And we can question these values. We can question: Is it a good thing that all of us continuously self-optimize to fit better into that society?

Or, to give you another example, the one from the initial presentation: What about a piece of persuasive technology that convinces Muslim women to wear their headscarves? Is that a good or a bad technology in its intentions or in its effects? Well, that basically depends on the kind of values you bring to bear to make these kinds of judgments.

So that's a third question: What values do you use to judge?
And speaking of values: I've noticed that in the discussion about moral persuasion online, and when I'm talking with people, more often than not there is a weird bias. And that bias is that we're asking: Is this or that "still" ethical? Is it "still" permissible?

We're asking things like: Is this Oxfam donation form, where the regular monthly donation is the preset default, and people, maybe without intending it, are encouraged or nudged into giving a regular donation instead of a one-time donation, is that "still" permissible? Is it "still" ethical? We're fishing at the low end.

But in fact, that question, "Is it 'still' ethical?" is just one way of looking at ethics. Because if you look at the beginning of ethics in Western culture, you see a very different idea of what ethics also could be. For Aristotle, ethics was not about the question, "Is that still good, or is it bad?" Ethics was about the question of how to live life well. And he put that in the word "arete," which we, from [Ancient Greek], translate as "virtue." But really, it means "excellence." It means living up to your own full potential as a human being.

And that is an idea that, I think, Paul Richard Buchanan put nicely in a recent essay, where he said, "Products are vivid arguments about how we should live our lives." Our designs are not ethical or unethical in that they're using ethical or unethical means of persuading us. They have a moral component just in the kind of vision and the aspiration of the good life that they present to us.

And if you look into the designed environment around us with that kind of lens, asking, "What is the vision of the good life that our products, our design, present to us?", then you often get the shivers, because of how little we expect of each other, of how little we actually seem to expect of our life, and what the good life looks like.

So that's a fourth question I'd like to leave you with: What vision of the good life do your designs convey?

And speaking of design, you'll notice that I already broadened the discussion, because it's not just persuasive technology that we're talking about here, it's any piece of design that we put out here in the world.
I don't know whether you know the great communication researcher Paul Watzlawick, who, back in the '60s, made the argument that we cannot not communicate. Even if we choose to be silent, we choose to be silent, and we're communicating something by choosing to be silent. And in the same way that we cannot not communicate, we cannot not persuade: whatever we do or refrain from doing, whatever we put out there as a piece of design, into the world, has a persuasive component. It tries to affect people. It puts a certain vision of the good life out there in front of us, which is what Peter-Paul Verbeek, the Dutch philosopher of technology, says.

No matter whether we as designers intend it or not, we materialize morality. We make certain things harder and easier to do. We organize the existence of people. We put a certain vision of what good or bad or normal or usual is in front of people, by everything we put out there in the world.

Even something as innocuous as a set of school chairs, a set of chairs like the ones you're sitting on and I'm standing in front of, is a persuasive technology, because it presents and materializes a certain vision of the good life: a good life in which teaching and learning and listening is about one person teaching and the others listening; in which learning is done while sitting; in which you learn for yourself; in which you're not supposed to change these rules, because the chairs are fixed to the ground.

And even something as innocuous as a single-design chair, like this one by Arne Jacobsen, is a persuasive technology, because, again, it communicates an idea of the good life: a good life, a life that you, as a designer, consent to by saying, "In a good life, goods are produced as sustainably or unsustainably as this chair. Workers are treated as well or as badly as the workers who built that chair were treated."
The good life is a life where design is important, because somebody obviously took the time and spent the money on that kind of well-designed chair; where tradition is important, because this is a traditional classic and someone cared about this; and where there is such a thing as conspicuous consumption, where it is OK and normal to spend a humongous amount of money on such a chair, to signal to other people what your social status is.

So these are the kinds of layers, the kinds of questions I wanted to lead you through today: the question of what intentions you bring to bear when you're designing something; what effects, intended and unintended, you're having; what values you're using to judge those; what virtues and aspirations you're actually expressing in that; and how that applies, not just to persuasive technology, but to everything you design.

Do we stop there? I don't think so. I think that all of these things are eventually informed by the core of all of this, and that is nothing but life itself. Why, when the question of what the good life is informs everything that we design, should we stop at design and not ask ourselves: How does it apply to our own life? "Why should the lamp or the house be an art object, but not our life?" as Michel Foucault puts it.

Just to give you a practical example: Buster Benson, whom I mentioned at the beginning. This is Buster setting up a pull-up machine at the office of his new start-up, Habit Labs, where they're trying to build other applications like Health Month for people. And why is he building a thing like this? Well, here is the set of axioms that Habit Labs, Buster's start-up, put up for themselves on how they wanted to work together as a team when they're building these applications, a set of moral principles they set for themselves for working together, one of them being, "We take care of our own health and manage our own burnout."

Because ultimately, how can you ask yourselves, and how can you find an answer to, what vision of the good life you want to convey and create with your designs, without asking the question: What vision of the good life do you yourself want to live?
And with that, I thank you.

(Applause)