YEHUDA KATZ: So it's been ten years. Unlike DHH, I can't get up here and tell you I've been doing Rails for ten years. I've been doing Rails for nine or eight or something like that. But Rails has been around for ten years, and so it's not surprising that a bunch of people are gonna get up here and do a little bit of a retrospective.

So this is sort of my feeling. Oh my god. I remember thinking back in 2008, when DHH was giving his talk, sort of a look back, around the same year that Merb was becoming a thing, and we were eventually gonna, you know, merge a little bit later. But in 2008, when DHH gave the Great Surplus talk, that was sort of a retrospective year too, because we had gotten to the point where Rails was big enough that it could actually host a competitor.

And I think it's really great that we ended up merging, and we ended up getting another five, six years of a great framework community. But now it's another opportunity to look back and think about what the Rails community is, and how you should think about taking the lessons of Rails to other environments. And that's sort of what my talk is about today.
So I'm gonna start by saying: if you want to think about one thing about Rails, what Rails is above anything else, I think Rails popularized the idea of convention over configuration. And you've been hearing the term convention over configuration for ten years now, so it's probably become a meaningless term. It's like when you hear the same word over and over and over again: eventually you reach semantic saturation. I think we've all reached semantic saturation with the term convention over configuration.

I want to unpack it a little bit. I think one way of thinking about it is this other term called the paradox of choice. This idea that, well, I'll let DHH say what it is: people like choices a lot more than they like having to actually choose. Right, so there's this narrow point, but it's still a very important point, which is that people go into environments, programming environments or groceries or whatever, and they like the idea of having a whole lot of choices a lot more than they like having to actually choose what to do.
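To make "convention over configuration" concrete, here is a minimal Ruby sketch (invented class names, not actual Rails source) contrasting a mapper that demands every decision up front with one that derives defaults from the class name, the way Rails derives a table name from a model name:

```ruby
# Hypothetical sketch, not real Rails internals: two object-relational
# mappers, one configuration-driven and one convention-driven.

# Configuration style: every caller must decide everything, every time.
class ConfiguredMapper
  attr_reader :table_name, :primary_key
  def initialize(klass, table_name:, primary_key:)
    @table_name  = table_name
    @primary_key = primary_key
  end
end

# Convention style: sensible defaults are derived from the class name
# (Post -> "posts"), and you only spend a decision when you opt out.
class ConventionalMapper
  attr_reader :table_name, :primary_key
  def initialize(klass, table_name: nil, primary_key: "id")
    @table_name  = table_name || "#{klass.name.downcase}s"
    @primary_key = primary_key
  end
end

Post = Class.new

# Zero decisions in the common case:
ConventionalMapper.new(Post).table_name                          # => "posts"
# One decision only when your case is genuinely special:
ConventionalMapper.new(Post, table_name: "articles").table_name  # => "articles"
```

The point of the sketch is where the cognitive cost lands: the configured version charges every caller for every field, while the conventional version charges only the callers who deviate.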
And this is sort of the state of the art; this is what we knew in 2008, when David gave the Great Surplus talk. What I want to do is go a little bit beyond these pithy points and talk about what's actually happening, what's actually going on that causes this idea to occur. What's causing the paradox of choice? And actually there's been a lot of science. Even in 2008 there was science, but between 2008 and now, or certainly 2004 and now, there's been a tremendous amount of science about what causes the paradox of choice, about what causes convention over configuration to be effective.

And if you want to Google it, if you want to find more information about this on Wikipedia, the term is called ego depletion, sometimes the idea of cognitive depletion. And in order to understand what's happening here, you first need to think about your everyday job, about how you feel during the day.

So you wake up in the morning. You go out of the house, and you're pretty much fully charged. You're ready to attack the world. Hopefully it's sunny and you can skip down the street. And you're ready to do anything.
You're ready. You have all the cognitive resources in the world.

And then, you know, you get to your desk. I find it amusing that that character is a programmer. It's so perfect. So you get to your desk, and you've done a little bit of work, and your cognitive resources start to deplete a little bit. You have a few fewer cognitive resources. Eventually something happens during the day that might not be so pleasant, and your cognitive resources deplete, and then you reach a point some time during the day (this is a Van Gogh painting) where you're really flagging. You feel like you don't have a lot of capacity left to do anything or to think hard. And eventually you totally run out. You run out of cognitive resources entirely and you're done.

And so the idea here, the concept of cognitive depletion or ego depletion, is that you have a certain amount of resources and they run out. And I think most people think about this in terms of, like, your day job. Right, so you wake up in the morning. Throughout the day your resources deplete. You get to the end of the day, you're out of resources. Rinse and repeat.
I think that's how most people think about it, and that's sort of how I framed it here.

But the really interesting thing about ego depletion or cognitive depletion is that what actually turns out to be the case is that there's this one big pool of resources, this one battery, that's actually used for a lot of different things. And so there are a lot of studies about things like grocery stores, right. Why is it that when you go to a grocery store, you find yourself so willing to buy something at the impulse aisle, right at the end of the grocery trip?

And the reason is that you've spent all this time in the grocery store doing not very difficult activities, but you're making a lot of choices. You're making choices the entire time as you're walking around the grocery store. And eventually your brain just runs out of resources to do anything else, and it's actually drawing from the same pool of resources that willpower comes from.
So even though choice-making and willpower feel like two totally different things, when you get to the end of the grocery trip and you've used all of your resources on choice-making, you're out of resources to not buy the candy bar from the impulse aisle.

And the same thing is true about a lot of things. They've done studies where they'll just take two halves of a room, like this one, and they'll say: you guys memorize two numbers, you guys memorize seven numbers. And then when you're totally done with the memorization, they'll take you into a processing room and they'll say, OK, you have cookies, and you have some fruit, and you get to decide which one to eat. And it turns out that the people who did the not-much-more-difficult job of memorizing seven numbers are far more likely to eat the cookies.

And the same thing is true in the other direction, if you have people eat cookies first and then go do a cognitively difficult task. One of the most famous experiments has to do with an impossible task: how long can people persevere at a task that you can literally never finish?
And it turns out that people who eat the cookies first actually spend a lot more time trying to do the impossible task than the people who had to sit in front of a tray of cookies and were told not to eat them.

And so there are all these experiments, a ton of them by now in 2014, and basically what they show in aggregate is that there is this one pool of resources that we use to do our jobs, to do challenging tasks, to handle cognitive dissonance. So there are studies where, if you just tell people they need to get up and give a speech about something different from what they actually believe, the people who do that actually have less willpower afterwards. They have less ability to do challenging tasks. They have fewer general cognitive resources than people who were asked to give a speech about something they do believe, even if the actual act of giving the speech is equally difficult.

And so I think what's kind of interesting about this is that what we really want in programming is to have as much time as we can for the challenging tasks, right.
We want to be spending all of our time on challenging tasks, and as little of those very scarce resources as we can on things like the willpower to write your code in exactly the right way, or making a lot of choices.

And here are some terms that you might have heard: the paradox of choice, ego depletion, the concept of decision fatigue. These are all ways you've heard this general concept described: you just have this battery, it runs out at some point, and there are all these really counter-intuitive things that don't seem very hard but are taking away resources that you need to work on hard problems.

So how do we actually solve this problem? Because obviously, if you want to be spending a lot of time on your challenging problems and you just ignore the problem of willpower or the problem of choices, you're just gonna end up making a lot of mindless choices all day. So what we need to do is find some psychological hacks that we can apply that will keep us doing the right thing basically all the time, that keep us from wasting cognitive resources.
And I think my favorite study about this, about the kinds of cognitive hacks that work effectively, is what happens to organ donation when organ donation is opt-in, in other words, you go to the DMV and there's a form that says, yes, I agree to donate my organs and here are the ones I agree to donate, versus when organ donation is opt-out, in other words, you have to explicitly say, I do not want my organs to be donated. And what you can see is that in the countries where it's opt-in, the rate is actually very, very low, and in the countries where it's opt-out, the rate is surprisingly high. It's basically almost universal.

And to some degree you might expect that this would be the case. But I think the size of the difference is really counter-intuitive, because you would expect that if somebody is sitting at a form and the form says, do you want to donate your organs?
And the excuse that they're telling themselves in their head for not checking the checkbox is, you know, my mom would be super angry if she found out, or my religion tells me that I shouldn't do this, or, growing up, I heard people say negative things, whatever the excuse is you tell yourself to not check the checkbox, you would think that some of those people, more than zero point one percent of them, would pick up the pen and opt out.

But the interesting thing is that by just changing the default from yes to no, all of a sudden all those excuses, all those things people tell themselves about the reasons they really shouldn't check the checkbox, suddenly go away.

And what's even more interesting about this is that these choices are actually made on really big DMV forms. So basically what you can see is that people have already gone through this really complicated, somewhat trivial but very choice-heavy process of filling out the DMV form, and by the time they get to the bottom and are asked about organ donation, they're so cognitively depleted that they have no energy left to even really think about it. They basically just do the default.
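The opt-in versus opt-out asymmetry shows up directly in API design too. Here is a toy Ruby sketch (the method names are invented for illustration) of the same question framed both ways; the caller who makes no active decision simply inherits whatever the default is:

```ruby
# Toy illustration (invented names): the same donation question,
# framed opt-in vs opt-out. Whoever doesn't decide gets the default.

def register_opt_in(donate: false)   # donation is OFF unless chosen
  donate
end

def register_opt_out(decline: false) # donation is ON unless declined
  !decline
end

# The undecided caller, i.e. almost everyone at the bottom of the form:
register_opt_in   # => false (not a donor)
register_opt_out  # => true  (a donor)
```

The stated preference is identical either way; only the default moved, and the default decides the outcome for the depleted majority.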
So I think, honestly, defaults are our most powerful psychological hack, our most powerful weapon for dealing with the fact that we have this limited source of cognitive capacity that we want to make good use of when we're programming. And the really cool thing about defaults in general is that defaults are actually really effective on both sides of the spectrum. So some days you get to work, you're ready to go, you're in an amazing mood. Everything is perfect. Everything is awesome.

And on those days, you have a big store of cognitive resources, and the defaults keep you high up. They keep you in the charged state for a longer time. You don't have to make choices that would deplete you. And remember, choice-making doesn't deplete per minute; it's not every minute of choices that depletes, it's every choice that depletes your cognitive resources. So having a set of defaults that tells you, here is what you're gonna do in general, and you have to think hard to opt out, that's really great when you're in a good mood, when things are charged. But it's actually also really great when you're in a bad mood.
When you're really depleted and you still have to go to work and do your job, the default keeps you on the straight and narrow, right. You don't have enough energy left to really think about what you're doing. And everybody has bad days. Everybody works on teams with developers who aren't great. Let me say it a different way. Everyone has at some point worked on a team, hopefully not. But you have junior developers. You know, you hire people who are new to whatever it is that you're doing, or you have a bad day, or you're stressed out because your mom gave you a call at lunch and now you're in a bad mood.

Right, so everybody gets to a point where they have cognitively... my mother's gonna be so angry now. Let's say an ex-girlfriend or whatever. So everyone has days where they're cognitively depleted. And on those days, defaults are also really powerful, because instead of being in a bad mood and just doing whatever you feel like, you're basically kept on the straight and narrow. You're kept on the right path.

And I think this actually helps to explain why yak-shaving doesn't feel as good as you might think. So, yak-shaving isn't the most terrible activity in the world.
I think sometimes you need to yak-shave. But think about doing, like, four hours of yak-shaving in an eight-hour day. Pretty much after four hours of, you know, let me set up my Vagrant box, or let me go set up my testing environment, you're totally cognitively depleted. It doesn't matter that you only spent four hours out of an eight-hour day; basically you have no more cognitive resources left. And I think this means we should be very careful about yak-shaving. Because yak-shaving may feel good, and it may be important in a lot of cases, but we need to be very honest about the fact that there's a certain amount of cognitive resources that we have, and yak-shaving takes up more of them than you would expect. And it doesn't leave us much, even after two hours or three hours it doesn't leave us a lot of cognitive resources to actually do the task that we were yak-shaving towards. So, obviously, occasionally you need to refactor and do all these kinds of tasks, but you should be careful about thinking that you'll get a lot done afterwards.

So I think this is sort of the unpacking, a scientific unpacking, of what it is that we're talking about.
And I think everyone in this room can nod their heads along with what I'm saying. They can agree with what I'm saying. Makes sense. But what ends up happening in the rest of the world, and also there's usually a devil on your shoulder, is that people find all kinds of excuses to argue against the thing I just said.

So I just outlined an unpacking of the convention over configuration story, and somehow we, as a human race, actually find a lot of ways to argue against these things. And one of the ways that we find to argue against it is to tell ourselves that we're unique and we're special, and I'll just let David from 2008 talk about this for a second.

DHH: One point I keep coming back to, over and over again when I talk about Ruby and Rails, is that we confess commonality. We confess the fact that we're not as special as we like to believe. We confess that we're not the only ones trying to climb the same mountain. And I think this is a real important point, because it's somewhat counter-intuitive, I think, for a lot of developers to think that they're not that special. I think it's counter-intuitive for humans in general to think they're not that special.
But when they do think that they're special, when they do think that they're the only ones climbing that mountain, they kind of get these assumptions that they need very unique and special tools that will only work for them. And I think that's a really bad way to approach getting greater productivity. Because I think what really makes this special and makes it work is all the points where we recognize that we're exactly the same.

Y.K.: And I think that's really the point: the way we gain better productivity is by pushing back against this impulse. We have it in the Rails community to some degree. I think it's especially significant outside of the Rails community, where people didn't already come together around the idea that we're gonna build shared tools and shared solutions. But I think we really do have to push back against this idea.

And I think my favorite example of this, taken to an absurdist extreme, is this sort of famous interview. What is your most surprising app on the home screen? Well, it's Operator. It's a custom-designed, one-of-a-kind bespoke app I had built for my assistant and I to communicate and collaborate. Did this person need a custom bespoke one-of-a-kind application to communicate with their assistant? No. Almost certainly not.
But they decided they were so special, they themselves were so one-of-a-kind, such a unique snowflake, that they needed a special tool to communicate with their assistant. And I think this is how we act. This is how we behave. And if you look at how people talk about software, you see things like: this is a tool set for building the framework most suited to your application development. Your application, your company, your industry is so special that you can't use general-purpose tools. You need to use a tool set to build your own framework.

Or: in an ecosystem where overarching, decides-everything-for-you frameworks are commonplace, and many libraries require your site to be reorganized to suit their look, feel, and default behavior, we should continue to be a tool that gives you the freedom to design the full experience of your web application.

And who could be against freedom? Right? Freedom is a really effective thing to put on the wall and say, this is the thing that we're arguing for. We're arguing for freedom. But this is just another way that people's brains sneak in the arguments that helped create the paradox of choice in the first place, right.
People say, you know, I'm special. I can't use these shared tools. I can't use these tools that were built for everybody. I need to use special tools. I need to use small libraries that help me build my own abstractions. I can't share with the community.

And then, even if people come to the conclusion that maybe abstractions, maybe shared solutions, are a good idea, you get another argument. The devil on your shoulder, or the devil in your community, makes another argument, which is the law of leaky abstractions. And this is sort of like the law of Demeter. It's not a suggestion, or an observation. It's a law. The law of leaky abstractions.

And I think any time somebody couches an observation about software development as a law, you know something fishy is going on. You know that something's not right. Because software development isn't a science. Basically, people want you to put on your science hat and say, aha! It's a law! It's like the law of gravity. I can derive some conclusions from this law. What conclusions do they want you to derive? Abstractions are bad. You should never use abstractions. You should do everything yourself.
And so this law of leaky abstractions was originally written by Joel Spolsky, and Jeff Atwood, who was his partner at Stack Overflow, actually responded, I think, kind of brilliantly to this. He said: you know, I'd argue that virtually all good programming abstractions are failed abstractions. I don't think I've ever used one that didn't leak like a sieve. But I think that's an awfully architecture astronaut way of looking at things. Instead, let's ask ourselves a more pragmatic question: does this abstraction make our code at least a little easier to write? To understand? To troubleshoot? Are we better off with this abstraction than we were without it?

It's our job as modern programmers not to abandon abstractions due to these deficiencies, but to embrace the useful elements of them. To adapt the working parts and construct ever so slightly less leaky and broken abstractions over time.

And I think people use these ideas, these excuses like the law of leaky abstractions, as an excuse not to share solutions. And I think the hilarious thing, and this is sort of a compressed, super-conflated set of abstractions, is this:
Every single one of us is sitting on top of abstractions that maybe occasionally leak, but really, how many people ever have to drop down to the x86 or the ARM level? Or even the C level? Right. We can build higher and higher sets of abstractions, and we can keep building on top of those abstractions, and allow ourselves to eliminate more and more code that we had to write before. That we had to write in 1960, 1970, 1980. Sort of every year there's another set of things that we have discovered as a community that we don't have to worry about, that were shared.

And I think people look at us and they say, oh my god, it's a pile of hacks. It's hacks on hacks on hacks on hacks. But actually it's not. Actually what's going on here is that every single time, you start off with this sort of experimental playground. At the bottom layer, people were building their own hardware. And eventually people came to the conclusion that you don't have to build your own hardware. We can standardize around things like x86. And then we standardized around it and people stopped worrying about all the craziness that was underneath.
And then people said, we can build C, and if we build C, people can stop worrying, most of the time, about what's below it. So really, every one of these layers is not a pile of hacks built on a pile of hacks. It's us, as a group of people, as a community of programmers, deciding that 90% of the things that we're doing, we've figured out we don't actually need to worry about.

And I think, fundamentally, this is the history of programming: we share solutions. We make progress by building up the stack. By eliminating code that we didn't have to write. And Steve Jobs actually talked about this in 1995. Sort of exactly the same thing. So let me let him talk.

STEVE JOBS: Because it's all about managing complexity, right. You're developers. You know that. It's all about managing complexity. It's like scaffolding, right. You erect some scaffolding, and if you keep going up and up and up, eventually the scaffolding collapses of its own weight, right. That's what building software is. It's, how much scaffolding can you erect before the whole thing collapses of its own weight.

Doesn't matter how many people you have working on it. Doesn't matter if you're Microsoft with three, four hundred people, five hundred people on the team.
It will collapse under its own weight. You've read The Mythical Man-Month, right. The basic premise of this is, a software development project gets to a certain size where, if you add one more person, the amount of energy to communicate with that person is actually greater than their net contribution to the project, so it slows down.

So you have a local maximum and then it comes down. We all know that about software. It's about managing complexity. These tools allow you to not have to worry about ninety percent of the stuff you worry about, so that you can erect your five stories of scaffolding, but starting at story number twenty-three instead of starting at story number six. You get a lot higher.

Y.K.: And I think that's fundamentally what we do as software people. For all of the complaints that people make about, you know, oh my god, every abstraction leaks. All we've ever done, even as far back as, you know, the 60s, but even in 1995, Steve Jobs was already talking about this idea that we can build higher by building shared solutions. And I'm gonna let him speak one more time, because I think it's really fascinating how much this idea of how you get better programmer productivity hasn't really changed, fundamentally, since that time.
STEVE JOBS: But, on top of that, we're gonna put something called OpenStep. And OpenStep lets you start developing your apps on the twentieth floor. And the kinds of apps you can deliver are phenomenal. But there's another hidden advantage.

Most of the great breakthroughs, the PageMakers, the Illustrators, et cetera, the Directors, come from smaller software companies. That's been said a few times today. They don't come from the large software companies. They come from the smaller ones. And one of the greatest things is that, using this new technology, two people or three people in a garage can build an app and get it from concept to market in six to nine months that is every bit as feature-rich, every bit as reliable, and every bit as exciting as what a giant software company can do with a hundred-fifty-person team.

It's phenomenal.

Y.K.: So, I think what's kind of cool about this is this idea that we can take shared problems that everyone has, shared problems that a community of people have, solving the same problem, and we can build up shared solutions. This is not new. It shouldn't be controversial. It's kind of fundamental to what we do as software developers. And yet, if someone isn't up here telling you this, it's so easy to forget.
There are so many excuses that people tell themselves. Sort of what happens in reality is, you have this bulk of shared solutions, you have an area of experimentation, sort of the wild west, and you let the area of experimentation fold back into shared solutions. This is sort of how Rails works, right. So you build higher and higher and higher stacks. You get to a point where, you know, you could build something like Devise in the Rails community, because of so much of what underpins Devise: everybody uses the same set of model abstractions, everyone has similar ways of talking about users, right. So you can build an abstraction on top of that, because everyone has sort of built up this shared understanding of what it is that we're doing.

And it's so easy to let yourself be confused by the fact that the area of experimentation is the wild west, and forget that that area of experimentation is sitting on top of a huge stack of abstractions.
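The Devise point can be made concrete. A convention-based library works by assuming every app shares the same model shape, then mixing behavior into any class that conforms. Here is a minimal plain-Ruby sketch of that mechanism; the module and method names are invented for illustration, and this is not Devise's actual implementation:

```ruby
# A shared library can add behavior to any class that follows
# the community's conventions -- here, "the model has an email attribute".
module Authenticatable
  # A class-level macro in the style of Devise's `devise :database_authenticatable`.
  def authenticatable
    include InstanceMethods
  end

  module InstanceMethods
    # Works on any conforming model, because the convention
    # guarantees that #email exists.
    def normalized_email
      email.strip.downcase
    end
  end
end

class User
  extend Authenticatable
  attr_accessor :email

  authenticatable
end

user = User.new
user.email = "  Alice@Example.COM "
puts user.normalized_email  # => "alice@example.com"
```

Because the convention does the configuring, the consuming class only has to declare one macro; that is the convention-over-configuration trade in miniature.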
And this is sort of, I think, to me, the answer to the way the node community seems. They're sitting on top of maybe, you know, the most advanced dynamic language JIT in the world, on top of all kinds of abstractions. And they sit on top and they say, oh my god, we need to build a lot of tiny modules, because if we don't build tiny modules, the abstraction's gonna kill us.

But for me, this has always been a paradox. You're sitting on top of a stack of abstractions that's far higher than anything that you're claiming to be afraid of. So, why are you so afraid? And I think it's because of this area of experimentation, right. When you're in an area of experimentation, of course abstractions are gonna leak. You're still figuring out what it is that you're doing.

But the goal of a good community that's gonna help people be more productive is to eventually notice that the area of experimentation is over, and move into a conventional system where you can say, we don't need to argue about this anymore. It was a worthy thing for us to discuss when we were thinking about the problem, but we can take that and roll it into our set of shared defaults, and we can climb up the ladder, right.

And this is what Steve was saying.
He was saying, you know, instead of having everybody start from the same floor. Sort of, this is how iOS programming works today, right. Ironically. Everyone starts from the same base. There is not a lot of shared programming, a lot of shared solutions. And I think, fundamentally, open source has been something that has really kick-started this idea of experimentation merging into shared solutions.

Because trying to essentially plan it the way big companies like Apple or Microsoft do it is just not gonna get the job done across the board. It'll solve some problems, but the power of the open source community means that you can have all these little verticals, all these little areas where people are trying to build higher abstractions for shared communities.

And interestingly, it's not enough to make these abstractions cheap. I think when you think about how people actually go and build on top of these stacks, it's not enough. If every layer in the abstraction, in fact, if x86 and then C and, you know, Linux and POSIX, if every one of those things cost a little, by the time you actually got to build software, you would be so overwhelmed with the weight of the abstraction that you would never be able to do anything.
So it's really fundamental that the abstractions that we build eventually get to the point where they're basically free. Where they have no cognitive cost. So that we can keep building higher and higher and higher, right. And the Rails philosophy is basically, how do you do that? How do you say, you know, we're gonna experiment for a little bit, but eventually we're gonna work really hard, we're gonna push really hard, at making the cost of that thing that everyone was just experimenting with a minute ago free, so that we can go build another level up and another level up.

And I have a few sort of closing points to make about the ecosystem. So, first of all, Rails is not the only way that we have to share. I think I was pretty sad when the queue abstraction didn't end up in Rails, and I kind of am sad that we didn't get to see sort of what you can build on top of the queue abstraction. There's really no reason that all the queue guys didn't get together and say, you know, we're gonna build some abstraction on top of that. And once you build the abstraction on top of that, then you can see how high you can go, right.

And a sort of similar thing happened in the JavaScript community.
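The queue idea mentioned above, a single shared interface that many backends could implement, can be sketched in a few lines. This is an illustrative toy, not the interface that was actually proposed for Rails; the class and job names here are invented:

```ruby
# A shared queue interface: application code targets #push, and any
# backend that implements it can be swapped in underneath.
class InlineQueue
  def initialize
    @jobs = []
  end

  def push(job)
    @jobs << job
  end

  # Run each queued job in-process -- a stand-in for what a real
  # backend such as Resque or Sidekiq would do on a worker machine.
  def drain
    @jobs.shift.run until @jobs.empty?
  end
end

# Jobs only have to respond to #run; that is the whole shared convention.
WelcomeEmailJob = Struct.new(:address) do
  def run
    SENT << address # stand-in for actually delivering mail
  end
end

SENT = []
queue = InlineQueue.new
queue.push(WelcomeEmailJob.new("alice@example.com"))
queue.drain
puts SENT.inspect  # => ["alice@example.com"]
```

Once everyone agrees on that tiny contract, higher layers such as retries, scheduling, or mailers can be written once against it instead of once per backend.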
In the JavaScript community there were a lot of different promise implementations, and what happened over time is that people realized that not having a standard way to talk about this was actually making it hard to build up.

So we said, let's actually get together and decide that we're gonna have a standard way of talking about that. We'll call it Promises/A+. And now promises are in the DOM. And now, you know, we can build up another level and make asynchronous things look synchronous. And then we can build up another level, and we can put that idea into the language.

And, you know, I don't know where we're gonna go from there. But we can start building higher and higher abstractions. But it requires taking the first step.
So I guess what I'm saying is, getting something into Rails core is not the only way that you can build these abstractions. It requires some discipline to actually get to the point where we're agreeing on something. But I think if you find that there is some topic, like, for example, queuing or jobs, and everybody does them, or a lot of people do them, but we don't have a good way of building on top of them, that's a good opportunity for someone to go and say, I'm gonna do the hard work to create a shared idea of what it is that we're doing. And sometimes it's via an interop layer, and sometimes it's via standardizing around one solution.

And another part of this is, when I started working on Ember in the JavaScript community, I thought a lot of these ideas were obvious. I thought it was gonna be, you know, a slam dunk. Everybody should agree that building a shared set of solutions is the right thing. And what I found instead is how powerful the unique snowflake bias is, and how powerful the leaky abstraction fallacy can be, in communities that don't place a high value on shared solutions.
So, if you don't see the power of shared solutions, if you're not familiar with the idea, with the wins, it's really easy to pull out those old canards: I am a unique snowflake, I can't use a tool like Ember, because I have special needs. I need to use a toolkit that lets me build my own framework, because my needs are oh-so-special.

Or, you know, I looked at Ember when it was new, and Ember leaked all over the place, so the law of leaky abstractions means you can't build a shared solution in JavaScript. But of course these things are not true. And what I want to say is, I think Rails, ten years on, has basically proved that these things are not true.

Before Rails, people spent a lot of time working on their own bespoke solutions, convinced that their problem was just too special for shared solutions. And when Rails came out, they looked at the very idea of convention over configuration as a joke. And then, one day, Rails developers started beating the pants off those people. And I think, in closing, if you find yourself in an ecosystem where developers still start from floor one every time, learn the lessons of Rails.
Everybody should band together, push back, both in your own brain and on other people, on the excuses that drive us apart instead of the things that bind us together. The legacy of Rails isn't MVC or even Ruby. It's ten powerful years of evidence that by sticking to our guns, we can build far higher than anyone ever imagined.

Thank you very much.