YEHUDA KATZ: So it's been ten years. Unlike DHH, I can't get up here and tell you I've been doing Rails for ten years. I've been doing Rails for nine or eight years or something like that. But Rails has been around for ten years, and so it's not surprising that a bunch of people are going to get up here and do a little bit of a retrospective.

So this is sort of my feeling. I remember thinking back in 2008, when DHH was giving his look-back talk, around the same year that Merb was becoming a thing, and we were eventually going to merge a little bit later. But in 2008, when DHH gave the Great Surplus talk, that was a retrospective year too, because we had gotten to the point where Rails was big enough that it could actually host a competitor. And I think it's really great that we ended up merging, and we ended up getting another five or six years of a great framework community. But now it's another opportunity to look back and think about what the Rails community is, and how you should think about taking the lessons of Rails to other environments.
And that's sort of what my talk is about today. So I'm going to start by saying: if you think about one thing about Rails, if you want to think about what Rails is above anything else, I think Rails popularized the idea of convention over configuration. And you've been hearing the term convention over configuration for ten years now, so it's probably become a meaningless term. It's like when you hear the same word over and over and over again - eventually you reach semantic saturation. I think we've all reached semantic saturation with the term convention over configuration.

I want to unpack it a little bit. I think one way of thinking about it is this other term called the paradox of choice. This is the idea that - well, I'll let DHH say what it is - people like choices a lot more than they like having to actually choose.
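To make the term concrete again, here is a minimal, framework-free sketch of convention over configuration in Ruby. This is illustrative only - the class names and the naive pluralization rule are invented for this example and are not Rails' actual implementation:

```ruby
# A hedged sketch of convention over configuration, outside any framework.
# The convention: a model's table name is derived from its class name.
# Configuration is only needed to opt out of the convention.
class Model
  def self.table_name
    @table_name ||= "#{name.downcase}s"   # convention: Article -> "articles"
  end

  def self.table_name=(explicit)
    @table_name = explicit                # configuration: the explicit opt-out
  end
end

class Article < Model; end                # zero configuration required

class LegacyArticle < Model
  self.table_name = "tbl_articles"        # only the exception is spelled out
end

Article.table_name        # => "articles"
LegacyArticle.table_name  # => "tbl_articles"
```

The point is where the cost falls: the common case costs nothing to write, and only the exceptional case demands a decision from you.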
Right, so there's this narrow point, but it's still a very important point: people go into environments - programming environments or grocery stores or whatever - and they like the idea of having a whole lot of choices a lot more than they like having to actually choose what to do.

And this is sort of the state of the art. This is what we knew in 2008, when David gave the Great Surplus talk. What I want to do is go a little bit beyond these pithy points and talk about what is actually going on that causes this. What is causing the paradox of choice? And actually there's been a lot of science - even in 2008 there was science, but between 2008 and now, and certainly between 2004 and now, there's been a tremendous amount of science about what causes the paradox of choice, and what makes convention over configuration effective.

If you want to Google it, if you want to find more information about this on Wikipedia, the term is called ego depletion, sometimes the idea of cognitive depletion.
And in order to understand what's happening here, you first need to think about your everyday job, about how you feel during the day.

So you wake up in the morning. You go out of the house, and you're pretty much fully charged. You're ready to attack the world - hopefully it's sunny and you can skip down the street. You're ready to do anything. You have all the cognitive resources in the world.

And then you get to your desk. I find it amusing that that character is a programmer; it's so perfect. So you get to your desk, and you've done a little bit of work, and your cognitive resources start to deplete a little bit. You have a little bit fewer cognitive resources. Eventually something happens during the day that might not be so pleasant, and your cognitive resources deplete further, and then you reach a point some time during the day - this is a Van Gogh painting - where you're really flagging.
You feel like you don't have a lot of capacity left to do anything or to think hard. And eventually you totally run out. You run out of cognitive resources entirely and you're done.

So the idea here, the concept of cognitive depletion or ego depletion, is that you have a certain amount of resources and they run out. And I think most people think about this in terms of their day job. You wake up in the morning; throughout the day your resources deplete; you get to the end of the day and you're out of resources; rinse and repeat. I think that's how most people think about it, and that's how I framed it here.

But the really interesting thing about ego depletion, or cognitive depletion, is that what actually turns out to be the case is that there's this one big pool of resources, this one battery, that's actually used for a lot of different things. And so there are a lot of studies about things like grocery stores. Why is it that when you go to a grocery store, you find yourself so willing to buy something at the impulse aisle?
Right, at the end of the grocery trip.

The reason is that you spent all this time in the grocery store doing not-very-difficult activities, but making a lot of choices. You're making choices the entire time as you walk around the grocery store, and eventually your brain just runs out of resources to do anything else - and it's actually drawing from the same pool of resources that willpower comes from. So even though choice-making and willpower feel like two totally different things, when you get to the end of the grocery trip and you've used all of your resources on choice-making, you're out of resources to not buy the candy bar from the impulse aisle.

And the same thing is true about a lot of things. They've done studies where they'll take two halves of a room, like this one, and say: you guys memorize two numbers, and you guys memorize seven numbers. And when the memorization is totally done, they'll take you into a processing room and say, OK, you have cookies, and you have some fruit.
And you get to decide which one to eat. It turns out that the people who did the not-much-more-difficult job of memorizing seven numbers are far more likely to eat the cookies.

And the same thing is true in the other direction, if you have people eat cookies first and then go do a cognitively difficult task. One of the most famous experiments has to do with an impossible task: how long can people persevere at a task that can literally never be finished? It turns out that people who eat the cookies first spend a lot more time trying to do the impossible task than the people who had to sit in front of a tray of cookies and were told not to eat them.

So there are all these experiments - by now, in 2014, there is a ton of them - and basically what they show in aggregate is that there is this one pool of resources that we use to do our jobs, to do challenging tasks, to endure cognitive dissonance.
So there are studies where, if you just tell people they need to get up and give a speech about something different from what they actually believe, the people who do that actually have less willpower afterwards. They have less ability to do challenging tasks; they have fewer general cognitive resources than people who were asked to give a speech about something they do believe - even if the actual act of giving the speech is equally difficult.

And I think what's kind of interesting about this is that what we really want in programming is to have as much capacity as we can for the challenging tasks. We want to be spending all of our time on challenging tasks, and as little of those very scarce resources as we can on things like the willpower to write your code in exactly the right way, or making a lot of choices.

And here are some terms that you might have heard about this: the paradox of choice, ego depletion, the concept of decision fatigue.
These are all terms you've heard that describe this general concept: you just have this battery, it runs out at some point, and there are all these really counterintuitive things that don't seem very hard but are taking away resources that you need to work on hard problems.

So how do we actually solve this problem? Because obviously, if you want to spend a lot of time on your challenging problems and you just ignore the problem of willpower, or the problem of choices, you're just going to end up making a lot of mindless choices all day. So what we need to do is find some psychological hacks that we can apply that will keep us doing the right thing basically all the time, and keep us from wasting cognitive resources.
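One place that hack shows up in software is API design: a well-chosen default means the common path costs the caller zero decisions, while departing from it takes an explicit, deliberate argument. A hedged sketch, with all names invented for illustration:

```ruby
# Hypothetical API: the safe, conventional behavior is the default,
# so doing the right thing requires no choice at the call site.
def deliver_report(data, format: :pdf, encrypt: true)
  { data: data, format: format, encrypted: encrypt }
end

deliver_report("Q3 numbers")
# => { data: "Q3 numbers", format: :pdf, encrypted: true }

# Opting out is still possible, but it is explicit work:
deliver_report("Q3 numbers", encrypt: false)
# => { data: "Q3 numbers", format: :pdf, encrypted: false }
```

The design choice mirrors the studies below: nobody has to spend a decision to stay on the safe path, and only the unusual case asks for one.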
And I think my favorite study about the kinds of cognitive hacks that work effectively is about what happens to organ donation when organ donation is opt-in - in other words, you go to the DMV and there's a form that says, yes, I agree to donate my organs, and here are the ones I agree to donate - versus when organ donation is opt-out - in other words, you have to explicitly say, I do not want my organs to be donated. And what you can see is that in the countries where it's opt-in, the rate is very, very low, and in the countries where it's opt-out, the rate is surprisingly high. It's basically almost universal.

And to some degree you might expect that this would be the case. But I think the size of this difference is really counterintuitive, because you would expect that if somebody is sitting at a form, and the form says, do you want to donate your organs?
And the excuse they're telling themselves in their head for not checking the checkbox is, you know, my mom would be super angry if she found out, or my religion tells me that I shouldn't do this, or, growing up, I heard people say negative things - whatever the excuse is that you tell yourself to not check the checkbox - you would think that some of those people, more than zero point one percent of them, would pick up the pen and opt out.

But the interesting thing is that, by just changing the default from yes to no, all of a sudden all those excuses, all those things people tell themselves about the reasons they really shouldn't check the checkbox, suddenly go away.

And what's, I think, even more interesting about this is that these choices are actually made on really big DMV forms.
So basically, what you can see is that people have already gone through this really complicated - somewhat trivial, but very choice-heavy - process of filling out the DMV form, and by the time they get to the bottom and are asked about organ donation, they're so cognitively depleted that they have no energy left to even really think about it. They basically just take the default.

So I think, honestly, defaults are our most powerful psychological hack, our most powerful weapon for dealing with the fact that we have this limited supply of cognitive capacity that we want to make good use of when we're programming. And the really cool thing about defaults in general is that they're actually really effective on both sides of the spectrum. Some days you get to work ready to go; you're in an amazing mood. Everything is perfect. Everything is awesome.

And on those days, you have a big store of cognitive resources, and the defaults keep you high up. They keep you in the charged state for a longer time, because you don't have to make choices that would deplete you. And remember, choice-making doesn't deplete per minute.
It's not that every minute of choosing depletes you; it's that every choice depletes your cognitive resources. So having a set of defaults that tells you, here's what you're going to do in general, and you have to think hard to opt out - that's really great when you're in a good mood, when you're charged up. But it's actually also really great when you're in a bad mood, when you're really depleted and you still have to go to work and do your job, because the default keeps you on the straight and narrow. You don't have enough energy left to really think about what you're doing. And everybody has bad days. Everybody works on teams with developers who aren't great. Let me say that a different way: everyone has at some point worked on a team - hopefully not - but you have junior developers. You hire people who are new to whatever it is that you're doing, or you have a bad day, or you're stressed out because your mom gave you a call at lunch and now you're in a bad mood.

Right, so everybody gets to a point where they're cognitively - my mother's going to be so angry now. Let's say, an ex-girlfriend or whatever.
So everyone has days where they're cognitively depleted. And on those days, defaults are also really powerful, because instead of being in a bad mood and just doing whatever you feel like, you're kept on the straight and narrow. You're kept on the right path.

And I think this actually helps to explain why yak-shaving doesn't feel as good as you might think. Yak-shaving isn't the most terrible activity in the world; I think sometimes you need to yak-shave. But think about doing four hours of yak-shaving in an eight-hour day - let me set up my Vagrant box, let me go set up my testing environment. After about four hours of that, you're totally cognitively depleted. It doesn't matter that you only spent four hours out of an eight-hour day; you basically have no cognitive resources left. And I think this means we should be very careful about yak-shaving.
Because yak-shaving may feel good, and it may be important in a lot of cases, but we need to be very honest about the fact that there is a certain amount of cognitive resources that we have, and yak-shaving takes up more of them than you would expect. Even two or three hours of it doesn't leave us a lot of cognitive resources to actually do the task that we were yak-shaving towards. So obviously, occasionally you need to refactor and do these kinds of tasks, but you should be careful about thinking that you'll get a lot done afterwards.

So I think this is the unpacking - a scientific unpacking - of what it is that we're talking about. And I think everyone in this room can nod their heads along with what I'm saying. They can agree with it; it makes sense. But what ends up happening in the rest of the world - and also there's usually a devil on your shoulder - is that people find all kinds of excuses to argue against the thing I just said.
So I just outlined an unpacking of the convention over configuration story, and somehow we, as a human race, actually find a lot of ways to argue against these things. And one of the ways we find to argue against it is to tell ourselves that we're unique and we're special. I'll just let David from 2008 talk about this for a second.

DHH: One point I keep coming back to, over and over again when I talk about Ruby and Rails, is that we confessed commonality. We confessed the fact that we're not as special as we like to believe. We confessed that we're not the only ones trying to climb the same mountain. And I think this is a really important point, because it's somewhat counterintuitive, I think, for a lot of developers to think that they're not that special. I think it's counterintuitive for humans in general to think they're not that special.

But when they do think that they're special, when they do think that they're the only ones climbing that mountain, they get these assumptions that they need very unique and special tools that will only work for them.
And I think that's a really bad way to approach getting greater productivity. Because I think what really makes this special, and makes it work, is all the points where we recognize that we're exactly the same.

Y.K.: And I think that's really the point: the way we gain better productivity is by pushing back against this impulse. We have it in the Rails community to some degree; I think it's especially significant outside of the Rails community, where people didn't already come together around the idea that we're going to build shared tools and shared solutions. But I think we really do have to push back against this idea.

And I think my favorite example of this, taken to an absurdist extreme, is a sort of famous interview. What is your most surprising app on the home screen? "Well, it's Operator. It's a custom-designed, one-of-a-kind bespoke app I had built for my assistant and I to communicate and collaborate." Did this person need a custom, bespoke, one-of-a-kind application to communicate with their assistant? No. Almost certainly not.
But they decided they were so special - they themselves were so one-of-a-kind, such a unique snowflake - that they needed a special tool to communicate with their assistant. And I think this is how we act. This is how we behave. And if you look at how people talk about software, you see things like: "this is a tool set for building the framework most suited to your application development." Your application, your company, your industry is so special that you can't use general-purpose tools; you need to use a tool set to build your own framework.

Or: "in an ecosystem where overarching, decides-everything-for-you frameworks are commonplace, and many libraries require your site to be reorganized to suit their look, feel, and default behavior - we should continue to be a tool that gives you the freedom to design the full experience of your web application."

And who could be against freedom? Right? Freedom is a really effective thing to put on the wall and say, this is the thing that we're arguing for. We're arguing for freedom.
But this is just 399 00:17:28,990 --> 00:17:32,100 another way, it's just another way that you, that 400 00:17:32,100 --> 00:17:37,080 people's brains sneak in arguments against that, that helped 401 00:17:37,080 --> 00:17:38,970 us create the paradox of choice in the first 402 00:17:38,970 --> 00:17:43,350 place, right. People say, you know, I'm special. I 403 00:17:43,350 --> 00:17:45,020 can't use these shared tools. I can't use these 404 00:17:45,020 --> 00:17:46,880 tools that were built for everybody. I need to 405 00:17:46,880 --> 00:17:51,309 use special tools. I need to use small libraries 406 00:17:51,309 --> 00:17:54,000 that help me build my own abstractions. I can't 407 00:17:54,000 --> 00:17:56,470 share with the community. 408 00:17:56,470 --> 00:17:59,400 And then, even if people come to the conclusion 409 00:17:59,400 --> 00:18:02,530 that maybe abstractions, maybe shared solutions are a good 410 00:18:02,530 --> 00:18:05,350 idea, then you get another argument. The devil on 411 00:18:05,350 --> 00:18:07,799 your shoulder or the devil in your community. It 412 00:18:07,799 --> 00:18:10,490 makes another argument, which is the law of leaky 413 00:18:10,490 --> 00:18:12,150 abstractions. And this is not, this is sort of 414 00:18:12,150 --> 00:18:15,549 like the law of Demeter. It's not a suggestion, 415 00:18:15,549 --> 00:18:20,110 or an observation. It's a law. The law of 416 00:18:20,110 --> 00:18:21,320 leaky abstractions.
It's like the 423 00:18:40,710 --> 00:18:43,850 law of gravity. I can derive some conclusions from 424 00:18:43,850 --> 00:18:46,480 this law. What, what, what conclusions do they want 425 00:18:46,480 --> 00:18:49,020 you to derive? Abstractions are bad. You should never 426 00:18:49,020 --> 00:18:51,520 use abstractions. You should do everything yourself. 427 00:18:51,520 --> 00:18:54,730 And, so this law of leaky abstractions was originally 428 00:18:54,730 --> 00:18:59,110 built by, or written by Joel Spolsky, and Jeff 429 00:18:59,110 --> 00:19:01,789 Atwood, who was his partner at Stack Overflow, actually 430 00:19:01,789 --> 00:19:04,720 responded, I think, kind of brilliantly to this. And 431 00:19:04,720 --> 00:19:07,200 he said, you know, I'd argue, that virtually all 432 00:19:07,200 --> 00:19:09,610 good programming abstractions are failed abstractions. I don't think 433 00:19:09,610 --> 00:19:11,059 I've ever used one that didn't leak like a 434 00:19:11,059 --> 00:19:13,789 sieve. But I think that's an awfully architecture astronaut 435 00:19:13,789 --> 00:19:17,000 way of looking at things. Instead, let's ask ourselves 436 00:19:17,000 --> 00:19:19,700 a more pragmatic question: does this abstraction make our 437 00:19:19,700 --> 00:19:21,780 code at least a little easier to write? To 438 00:19:21,780 --> 00:19:24,000 understand? To troubleshoot? Are we better off with this 439 00:19:24,000 --> 00:19:26,240 abstraction than we were without it? 440 00:19:26,240 --> 00:19:28,410 It's our job as modern programmers not to abandon 441 00:19:28,410 --> 00:19:31,230 abstractions due to these deficiencies, but to embrace the 442 00:19:31,230 --> 00:19:33,510 useful elements of them. To adapt the working parts 443 00:19:33,510 --> 00:19:36,049 and construct ever so slightly less leaky and broken 444 00:19:36,049 --> 00:19:37,830 abstractions over time.
445 00:19:37,830 --> 00:19:42,169 And I think people use this idea, these excuses, 446 00:19:42,169 --> 00:19:44,830 things like the law of leaky abstractions, to give 447 00:19:44,830 --> 00:19:48,770 an excuse for themselves to not share solutions. And 448 00:19:48,770 --> 00:19:50,820 I think sort of the hilarious thing, and this 449 00:19:50,820 --> 00:19:54,419 is sort of a compressed super conflated set of 450 00:19:54,419 --> 00:19:56,610 abstractions. Every single one of us is sitting on 451 00:19:56,610 --> 00:20:01,190 top of abstractions that maybe occasionally leak, but really, 452 00:20:01,190 --> 00:20:02,850 how many people ever have to drop down into 453 00:20:02,850 --> 00:20:05,390 the x86 or the ARM level? Or even the 454 00:20:05,390 --> 00:20:09,590 C level? Right. People. We can build higher and 455 00:20:09,590 --> 00:20:12,380 higher sets of abstractions, and we can keep, we 456 00:20:12,380 --> 00:20:15,260 can keep building on top of these abstractions, and 457 00:20:15,260 --> 00:20:17,880 build, and, and allow us to sort of eliminate 458 00:20:17,880 --> 00:20:20,370 more and more code that we had to write 459 00:20:20,370 --> 00:20:22,490 before. That we had to write in 1960, 1970, 460 00:20:22,490 --> 00:20:24,980 1980. Sort of every year is another set of 461 00:20:24,980 --> 00:20:27,460 things that we have discovered as a community that 462 00:20:27,460 --> 00:20:29,150 we don't have to worry about, that were shared. 463 00:20:29,150 --> 00:20:30,520 And I think, sort of, people look at us 464 00:20:30,520 --> 00:20:31,730 and they say, oh my god it's a pile 465 00:20:31,730 --> 00:20:34,059 of hacks. It's hacks on hacks on hacks on 466 00:20:34,059 --> 00:20:35,970 hacks. But actually it's not.
Actually what's going on 467 00:20:35,970 --> 00:20:39,140 here is that every single time you start off 468 00:20:39,140 --> 00:20:41,679 with this sort of experimental playground, people are building, 469 00:20:41,679 --> 00:20:43,210 you know, at the bottom layer, people were building 470 00:20:43,210 --> 00:20:45,630 their own hardware. And eventually people came to the 471 00:20:45,630 --> 00:20:47,549 conclusion that you don't have to build your own 472 00:20:47,549 --> 00:20:50,909 hardware. We can standardize around things like x86. 473 00:20:50,909 --> 00:20:53,650 And then we standardized around it and people stopped 474 00:20:53,650 --> 00:20:55,980 worrying about all the craziness that was underneath. And 475 00:20:55,980 --> 00:20:58,580 then people said, we can build C, and if 476 00:20:58,580 --> 00:21:01,450 we build C, people can stop worrying, most of 477 00:21:01,450 --> 00:21:03,760 the time, about what's below it. So really every 478 00:21:03,760 --> 00:21:05,090 one of these layers is not a pile of 479 00:21:05,090 --> 00:21:07,240 hacks built on a pile of hacks. It's us, 480 00:21:07,240 --> 00:21:10,130 as a group of people, as a community of 481 00:21:10,130 --> 00:21:12,480 programmers, deciding that 90% of the things that we're 482 00:21:12,480 --> 00:21:14,919 doing, we've figured out we don't actually need to, 483 00:21:14,919 --> 00:21:16,090 to worry about. 484 00:21:16,090 --> 00:21:19,210 And, I think, fundamentally, this is about, sort of 485 00:21:19,210 --> 00:21:21,580 the history of programming is that we have shared 486 00:21:21,580 --> 00:21:25,840 solutions. We make progress by building up the stack. 487 00:21:25,840 --> 00:21:27,730 By eliminating code that we didn't have to write. 488 00:21:27,730 --> 00:21:30,780 And Steve Jobs actually talked about this in 1995. 489 00:21:30,780 --> 00:21:32,940 Sort of exactly the same thing. So let me 490 00:21:32,940 --> 00:21:33,900 let him talk.
491 00:21:33,900 --> 00:21:37,059 STEVE JOBS: Because it's all about managing complexity, right. 492 00:21:37,059 --> 00:21:39,730 You're developers. You know that. It's all about managing 493 00:21:39,730 --> 00:21:44,100 complexity. It's, like, scaffolding, right. You erect some scaffolding, 494 00:21:44,100 --> 00:21:46,010 and if you keep going up and up and 495 00:21:46,010 --> 00:21:49,770 up, eventually the scaffolding collapses of its own weight, 496 00:21:49,770 --> 00:21:53,590 right. That's what building software is. It's, how much 497 00:21:53,590 --> 00:21:56,770 scaffolding can you erect before the whole thing collapses 498 00:21:56,770 --> 00:21:57,320 of its own weight. 499 00:21:57,320 --> 00:21:58,750 Doesn't matter how many people you have working on 500 00:21:58,750 --> 00:22:01,299 it. Doesn't matter if you're Microsoft with three, four 501 00:22:01,299 --> 00:22:03,710 hundred people, five hundred people on the team. It 502 00:22:03,710 --> 00:22:06,470 will collapse under its own weight. You've read the 503 00:22:06,470 --> 00:22:09,380 Mythical Man Month, right. Basic premise of this is, 504 00:22:09,380 --> 00:22:11,890 a software development project gets to a certain size 505 00:22:11,890 --> 00:22:13,890 where if you add one more person, the amount 506 00:22:13,890 --> 00:22:16,190 of energy to communicate with that person is actually 507 00:22:16,190 --> 00:22:18,870 greater than their net contribution to the project, so 508 00:22:18,870 --> 00:22:19,690 it slows down. 509 00:22:19,690 --> 00:22:22,309 So you have local maximum and then it comes 510 00:22:22,309 --> 00:22:25,280 down. We all know that about software. It's about 511 00:22:25,280 --> 00:22:29,900 managing complexity. 
These tools allow you to not have 512 00:22:29,900 --> 00:22:32,710 to worry about ninety percent of the stuff you 513 00:22:32,710 --> 00:22:35,700 worry about, so that you can erect your five 514 00:22:35,700 --> 00:22:39,799 stories of scaffolding, but starting at story number twenty-three 515 00:22:39,799 --> 00:22:43,179 instead of starting at story number six. You get 516 00:22:43,179 --> 00:22:45,100 a lot higher. 517 00:22:45,100 --> 00:22:47,510 Y.K.: And I think that's fundamentally what we do 518 00:22:47,510 --> 00:22:49,780 as software people. For all of the complaints that 519 00:22:49,780 --> 00:22:53,169 people make about, you know, oh my god, every 520 00:22:53,169 --> 00:22:57,880 abstraction leaks. All we've ever done, even in, even 521 00:22:57,880 --> 00:23:00,299 as far back as, you know, in the 60s, 522 00:23:00,299 --> 00:23:03,539 but even in 1995, Steve Jobs was already talking 523 00:23:03,539 --> 00:23:05,830 about this idea that we can build higher by 524 00:23:05,830 --> 00:23:08,400 building shared solutions. And I'm gonna let him speak 525 00:23:08,400 --> 00:23:12,350 one more time, because I think, really, it's really 526 00:23:12,350 --> 00:23:14,820 fascinating how much this idea of how you get 527 00:23:14,820 --> 00:23:18,640 better programmer productivity hasn't really changed, fundamentally, since that 528 00:23:18,640 --> 00:23:19,039 time. 529 00:23:19,039 --> 00:23:22,210 STEVE JOBS: But, on top of that, we're gonna 530 00:23:22,210 --> 00:23:29,210 put something called OpenStep. And OpenStep lets 531 00:23:34,169 --> 00:23:38,400 you start developing your apps on the twentieth floor. 532 00:23:38,400 --> 00:23:40,429 And the kinds of apps you can deliver are 533 00:23:40,429 --> 00:23:44,250 phenomenal. But there's another hidden advantage.
534 00:23:44,250 --> 00:23:47,760 Most of the great breakthroughs, the PageMakers, 535 00:23:47,760 --> 00:23:52,200 the Illustrators, et cetera, the Directors, come from smaller 536 00:23:52,200 --> 00:23:54,570 software companies. That's been said a few times today. 537 00:23:54,570 --> 00:23:57,000 They don't come from the large software companies. They 538 00:23:57,000 --> 00:23:59,210 come from the smaller ones. And one of the 539 00:23:59,210 --> 00:24:03,370 greatest things is that using this new technology, two 540 00:24:03,370 --> 00:24:06,539 people or three people in a garage can build 541 00:24:06,539 --> 00:24:09,289 an app and get it from concept to market 542 00:24:09,289 --> 00:24:11,590 in six to nine months, that is every bit 543 00:24:11,590 --> 00:24:15,750 as feature-rich, every bit as reliable, and every bit 544 00:24:15,750 --> 00:24:18,289 as exciting as a giant software company can do 545 00:24:18,289 --> 00:24:20,990 with a hundred fifty person team. 546 00:24:20,990 --> 00:24:22,039 It's phenomenal. 547 00:24:22,039 --> 00:24:26,140 Y.K.: So, I think what's kind of cool about 548 00:24:26,140 --> 00:24:31,640 this is that, this idea that we can take 549 00:24:31,640 --> 00:24:35,049 shared problems that everyone has, shared problems that a 550 00:24:35,049 --> 00:24:37,779 community of people have, solving the same problem, and 551 00:24:37,779 --> 00:24:40,260 we can build up shared solutions. This is not 552 00:24:40,260 --> 00:24:43,820 new. It's not, it shouldn't be controversial. It's kind 553 00:24:43,820 --> 00:24:47,100 of fundamental to what we do as software developers. 554 00:24:47,100 --> 00:24:50,070 And yet, if someone isn't up here telling you 555 00:24:50,070 --> 00:24:52,409 this, it's so easy to forget. There are so 556 00:24:52,409 --> 00:24:55,110 many excuses that people tell themselves.
557 00:24:55,110 --> 00:24:56,770 Sort of what happens in reality is you have 558 00:24:56,770 --> 00:24:58,760 this bulk of shared solutions, you have an area 559 00:24:58,760 --> 00:25:01,490 of experimentation - sort of the wild west - 560 00:25:01,490 --> 00:25:03,750 and you let the area of experimentation fold back 561 00:25:03,750 --> 00:25:05,650 into shared solutions. This is sort of how Rails 562 00:25:05,650 --> 00:25:08,020 works, right. So you build higher and higher and 563 00:25:08,020 --> 00:25:10,890 higher stacks. You get to a point where you, 564 00:25:10,890 --> 00:25:12,590 you know, you could build something like Devise in 565 00:25:12,590 --> 00:25:15,070 the Rails community, because there's so much of what 566 00:25:15,070 --> 00:25:18,929 underpins Devise, you know, everybody uses the same set 567 00:25:18,929 --> 00:25:22,309 of model abstractions, everyone has similar ways of talking 568 00:25:22,309 --> 00:25:25,240 about users, right. So you can build an abstraction 569 00:25:25,240 --> 00:25:27,059 on top of that because everyone has sort of 570 00:25:27,059 --> 00:25:28,870 built up this shared understanding of what it is 571 00:25:28,870 --> 00:25:30,059 that we're doing. 572 00:25:30,059 --> 00:25:33,169 And, it's so easy to let yourself be confused 573 00:25:33,169 --> 00:25:35,630 by the fact that the area of experimentation is 574 00:25:35,630 --> 00:25:38,400 the wild west, and forget that that area of 575 00:25:38,400 --> 00:25:41,820 experimentation is sitting on top of huge, a huge 576 00:25:41,820 --> 00:25:43,799 stack of abstractions. And this is sort of, I 577 00:25:43,799 --> 00:25:45,909 think, to me, the answer to why the Node 578 00:25:45,909 --> 00:25:49,779 community seems, they're sitting on top of maybe, you 579 00:25:49,779 --> 00:25:53,340 know, the most advanced dynamic language JIT in the 580 00:25:53,340 --> 00:25:57,480 world, on top of all kinds of abstractions.
And 581 00:25:57,480 --> 00:25:58,630 they sit on top and they say, oh my 582 00:25:58,630 --> 00:26:00,919 god, we need to build a lot of tiny 583 00:26:00,919 --> 00:26:02,750 modules, because if we don't build tiny modules, this, 584 00:26:02,750 --> 00:26:04,289 the abstraction's gonna kill us. 585 00:26:04,289 --> 00:26:06,830 But, for me, this has always been a paradox. 586 00:26:06,830 --> 00:26:08,650 You're sitting on top of a stack of abstractions 587 00:26:08,650 --> 00:26:10,850 that's far higher than anything that you're claiming to 588 00:26:10,850 --> 00:26:12,350 be afraid of. So, why are you so afraid? 589 00:26:12,350 --> 00:26:13,900 And I think it's because of this area of 590 00:26:13,900 --> 00:26:16,600 experimentation, right. When you're in an area of experimentation, 591 00:26:16,600 --> 00:26:18,640 of course abstractions are gonna leak. You're still figuring 592 00:26:18,640 --> 00:26:20,650 out what it is that you're doing. 593 00:26:20,650 --> 00:26:22,490 But the goal of a good community that's gonna 594 00:26:22,490 --> 00:26:25,159 help people be more productive is to eventually notice 595 00:26:25,159 --> 00:26:28,600 that the area of experimentation is over. And move 596 00:26:28,600 --> 00:26:31,600 into a conventional system where you can say, we 597 00:26:31,600 --> 00:26:33,059 don't need to argue about this anymore. It was 598 00:26:33,059 --> 00:26:34,970 a worthy thing for us to discuss when we 599 00:26:34,970 --> 00:26:37,179 were thinking about the problem, but we can take 600 00:26:37,179 --> 00:26:38,820 that and we can roll it in, into our 601 00:26:38,820 --> 00:26:40,289 set of shared defaults, and we can climb up 602 00:26:40,289 --> 00:26:41,070 the ladder, right. 603 00:26:41,070 --> 00:26:42,799 And this is what, this is what Steve was 604 00:26:42,799 --> 00:26:46,000 saying. 
He was saying, you know, instead of having 605 00:26:46,000 --> 00:26:48,450 everybody start from some, some floor, sort of this 606 00:26:48,450 --> 00:26:52,700 is how iOS programming works today, right. Ironically. Is 607 00:26:52,700 --> 00:26:55,230 everyone starts from the same base. There is not 608 00:26:55,230 --> 00:26:57,390 a lot of shared programming. A lot of shared 609 00:26:57,390 --> 00:27:01,190 solutions. And I think, fundamentally, open source has been 610 00:27:01,190 --> 00:27:04,640 something that has really kick-started this idea of 611 00:27:04,640 --> 00:27:06,830 experimentation merging through the shared solutions. 612 00:27:06,830 --> 00:27:09,350 Because trying to essentially plan it the way big 613 00:27:09,350 --> 00:27:11,590 companies like Apple or Microsoft do it, is just 614 00:27:11,590 --> 00:27:14,120 not gonna get the job done across the board. 615 00:27:14,120 --> 00:27:16,900 It'll, it'll solve some problems, but getting the open 616 00:27:16,900 --> 00:27:19,059 source communities, the power of the open source community, 617 00:27:19,059 --> 00:27:21,340 means that you can have all these little verticals, 618 00:27:21,340 --> 00:27:23,610 all these little areas where people are trying to 619 00:27:23,610 --> 00:27:27,159 build higher abstractions for shared communities. 620 00:27:27,159 --> 00:27:30,799 And interestingly, it's not enough to make these abstractions 621 00:27:30,799 --> 00:27:33,409 cheap. I think when you think about how people 622 00:27:33,409 --> 00:27:35,750 actually go and they build on top of these 623 00:27:35,750 --> 00:27:38,340 stacks, it's not enough, if every layer in the 624 00:27:38,340 --> 00:27:41,169 abstraction, in fact, if x86 and then C and, 625 00:27:41,169 --> 00:27:42,720 you know, Linux, and POSIX.
If every one of 626 00:27:42,720 --> 00:27:44,770 those things cost a little, by the time you 627 00:27:44,770 --> 00:27:46,679 actually got to build software, you would be so 628 00:27:46,679 --> 00:27:48,649 overwhelmed with the weight of the abstraction that you 629 00:27:48,649 --> 00:27:49,720 would never be able to do anything. 630 00:27:49,720 --> 00:27:53,270 So, it's really fundamental that the abstractions that we 631 00:27:53,270 --> 00:27:56,029 build eventually get to the point where they're basically 632 00:27:56,029 --> 00:27:58,000 free. Where they have no cognitive cost. So that 633 00:27:58,000 --> 00:27:59,860 we can keep building higher and higher and higher, 634 00:27:59,860 --> 00:28:03,020 right. And the Rails philosophy is basically, how do 635 00:28:03,020 --> 00:28:04,210 you do that? How do you, how do, how 636 00:28:04,210 --> 00:28:05,299 do you say, you know, we're gonna experiment for 637 00:28:05,299 --> 00:28:07,799 a little bit, but eventually we're gonna work really 638 00:28:07,799 --> 00:28:10,390 hard, we're gonna push really hard at making the 639 00:28:10,390 --> 00:28:12,240 cost of that thing that everyone was just experimenting 640 00:28:12,240 --> 00:28:14,179 with a minute, a little, a minute ago free, 641 00:28:14,179 --> 00:28:15,649 so that we can go build another level up 642 00:28:15,649 --> 00:28:16,960 and another level up. 643 00:28:16,960 --> 00:28:19,830 And, I have a few sort of closing points 644 00:28:19,830 --> 00:28:23,450 to make about the ecosystem. So, first of all, 645 00:28:23,450 --> 00:28:24,730 Rails is not the only way that we have 646 00:28:24,730 --> 00:28:28,159 to share.
I think I was pretty sad when 647 00:28:28,159 --> 00:28:31,529 the queue abstraction didn't end up in Rails, but 648 00:28:31,529 --> 00:28:33,850 I kind of am sad that we didn't get 649 00:28:33,850 --> 00:28:35,870 to see sort of what you can build up 650 00:28:35,870 --> 00:28:38,490 on top of the queue abstraction, there's really no 651 00:28:38,490 --> 00:28:41,029 reason that all the queue guys didn't get together 652 00:28:41,029 --> 00:28:44,360 and say, you know, we're gonna build some abstraction 653 00:28:44,360 --> 00:28:45,799 on top of that. And once you build the 654 00:28:45,799 --> 00:28:47,490 abstraction on top of that, then you can see 655 00:28:47,490 --> 00:28:49,270 how high you can go, right. 656 00:28:49,270 --> 00:28:50,570 And a sort of similar thing happened in the 657 00:28:50,570 --> 00:28:52,360 JavaScript community. In the JavaScript community there were a 658 00:28:52,360 --> 00:28:57,600 lot of different promise implementations, and what happened over 659 00:28:57,600 --> 00:28:59,289 time is that people realized that not having a 660 00:28:59,289 --> 00:29:01,190 standard way to talk about this was actually making 661 00:29:01,190 --> 00:29:02,940 it hard to build up. 662 00:29:02,940 --> 00:29:05,270 So we said, let's actually get together and let's 663 00:29:05,270 --> 00:29:06,740 decide that we're gonna have a standard way of 664 00:29:06,740 --> 00:29:09,760 talking about that. We'll call it Promises/A+. And now 665 00:29:09,760 --> 00:29:12,450 Promises are in the DOM. And now, you know, 666 00:29:12,450 --> 00:29:14,789 we can build up another level and make asynchronous 667 00:29:14,789 --> 00:29:16,880 things look synchronous. And then we can build up 668 00:29:16,880 --> 00:29:19,029 another level, and we can put that idea into 669 00:29:19,029 --> 00:29:20,159 the language. 670 00:29:20,159 --> 00:29:22,390 And, you know, I don't know where we're gonna 671 00:29:22,390 --> 00:29:24,210 go from there.
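The progression described here, from a standard promise interface (Promises/A+) to language-level features that make asynchronous code read synchronously, can be sketched in a few lines of modern JavaScript. This is an illustrative example, not from the talk; `fetchNumber` is a hypothetical stand-in for any asynchronous operation.

```javascript
// Any Promises/A+-compliant thenable composes with .then();
// async/await, added to the language later, builds on that same
// standard interface to make asynchronous code read synchronously.
function fetchNumber() {
  // Hypothetical stand-in for any asynchronous operation.
  return new Promise((resolve) => setTimeout(() => resolve(42), 10));
}

async function main() {
  const n = await fetchNumber(); // reads like synchronous code
  return n + 1;
}

main().then((result) => console.log(result)); // prints 43
```

Because every conforming library exposes the same `then` contract, each layer (DOM promises, async/await) could be built once, on top of the shared standard, instead of once per implementation.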
But we can start building higher 672 00:29:24,210 --> 00:29:26,570 and higher abstractions. But it requires taking the first 673 00:29:26,570 --> 00:29:30,350 step. So I, I guess what I'm saying is, 674 00:29:30,350 --> 00:29:32,279 getting something into Rails core is not the only 675 00:29:32,279 --> 00:29:34,309 way that you can build these abstractions. It, it 676 00:29:34,309 --> 00:29:36,830 requires some discipline to actually get to the point 677 00:29:36,830 --> 00:29:38,950 where we're agreeing on something, but I think if 678 00:29:38,950 --> 00:29:41,220 you find that there is some topic, like, for 679 00:29:41,220 --> 00:29:45,929 example, jobs or, you know, queuing or jobs, and 680 00:29:45,929 --> 00:29:47,450 everybody does them or a lot of people do 681 00:29:47,450 --> 00:29:48,620 them but we don't have a good way of 682 00:29:48,620 --> 00:29:51,039 building on top of them, that's a good opportunity 683 00:29:51,039 --> 00:29:52,620 for someone to go and say, I'm gonna do 684 00:29:52,620 --> 00:29:54,570 the hard work to say, let's create a shared 685 00:29:54,570 --> 00:29:55,960 idea of what it is that we're doing. 686 00:29:55,960 --> 00:29:58,130 And sometimes it's via an inter-op layer and sometimes 687 00:29:58,130 --> 00:30:00,809 it's via standardizing around one solution. 688 00:30:00,809 --> 00:30:04,059 And, another part of this is, when I started 689 00:30:04,059 --> 00:30:06,529 working on Ember in the JavaScript community, I thought 690 00:30:06,529 --> 00:30:08,659 a lot of these ideas were obvious. I thought 691 00:30:08,659 --> 00:30:10,840 it was gonna be, you know, a slam dunk. 692 00:30:10,840 --> 00:30:12,730 Everybody should agree that building a shared set of 693 00:30:12,730 --> 00:30:15,690 solutions is the right thing.
And what I found 694 00:30:15,690 --> 00:30:20,059 was that, what I found instead is how powerful 695 00:30:20,059 --> 00:30:23,570 the unique snowflake bias is, and how powerful the 696 00:30:23,570 --> 00:30:27,220 leaky abstraction fallacy can be in communities that don't 697 00:30:27,220 --> 00:30:28,750 place a high value on shared solutions. 698 00:30:28,750 --> 00:30:31,130 So, if you don't see the power of shared 699 00:30:31,130 --> 00:30:33,809 solutions, if you're not familiar with the idea, with 700 00:30:33,809 --> 00:30:36,039 the wins, it's really easy to pull out those 701 00:30:36,039 --> 00:30:38,429 old canards, the I am a unique snowflake, 702 00:30:38,429 --> 00:30:41,390 I can't use a tool like Ember, because I 703 00:30:41,390 --> 00:30:44,250 have special needs. I need to use a toolkit 704 00:30:44,250 --> 00:30:46,390 that lets me build my own framework, because my 705 00:30:46,390 --> 00:30:47,149 needs are oh-so-special. 706 00:30:47,149 --> 00:30:51,080 Or, you know, you know, I looked at Ember 707 00:30:51,080 --> 00:30:52,919 when it was new, and Ember leaked all over 708 00:30:52,919 --> 00:30:54,779 the place. So the law of leaky abstractions means 709 00:30:54,779 --> 00:30:57,210 you can't build in JavaScript a shared solution. But, 710 00:30:57,210 --> 00:30:58,450 of course these things are not true. And I 711 00:30:58,450 --> 00:31:01,270 think, what I want to say is, I think 712 00:31:01,270 --> 00:31:03,860 Rails, ten years on, has basically proved that these 713 00:31:03,860 --> 00:31:06,679 things are not true. 714 00:31:06,679 --> 00:31:08,840 Before Rails, people spent a lot of time working 715 00:31:08,840 --> 00:31:11,330 on their own bespoke solutions, convinced that their problem 716 00:31:11,330 --> 00:31:14,590 was just too special for shared solutions. And when 717 00:31:14,590 --> 00:31:16,970 Rails came out, they looked at the very idea 718 00:31:16,970 --> 00:31:20,279 of convention over configuration as a joke.
And then, 719 00:31:20,279 --> 00:31:22,970 one day, Rails developers started beating the pants off 720 00:31:22,970 --> 00:31:27,200 those people. And I think, in closing, if you 721 00:31:27,200 --> 00:31:29,840 find yourself in an ecosystem where developers still start 722 00:31:29,840 --> 00:31:32,580 from floor one every time, learn the lessons of 723 00:31:32,580 --> 00:31:34,019 Rails. 724 00:31:34,019 --> 00:31:37,630 Everybody should band together, push back, both in your 725 00:31:37,630 --> 00:31:40,370 own brain and on other people, on the excuses 726 00:31:40,370 --> 00:31:42,539 that drive us apart instead of the things that 727 00:31:42,539 --> 00:31:46,580 bind us together. The legacy of Rails isn't MVC 728 00:31:46,580 --> 00:31:49,769 or even Ruby. It's powerful ten years of evidence 729 00:31:49,769 --> 00:31:52,100 that by sticking to our guns, we can build 730 00:31:52,100 --> 00:31:54,220 far higher than anyone ever imagined. 731 00:31:54,220 --> 00:31:56,860 Thank you very much.