We've seen in the last several videos that you start off with any crazy distribution. It doesn't have to be crazy; it could be a nice normal distribution. But to really make the point that you don't have to have a normal distribution, I like to use crazy ones. So let's say you have some kind of crazy distribution that looks something like that. It could look like anything. So we've seen multiple times, you take samples from this crazy distribution. So let's say you were to take samples of n equal to 10. So we take 10 instances of this random variable, average them out, and then plot our average. We plot our average. We get one instance there. We keep doing that. We do that again. We take 10 samples from this random variable, average them, plot them again. You plot again, and eventually you do this a gazillion times (in theory, an infinite number of times) and you're going to approach the sampling distribution of the sample mean. n equal to 10 is not going to be a perfect normal distribution, but it's going to be close. It'd be perfect only if n were infinity.
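The draw-and-average procedure described here is easy to replicate in code. Here's a minimal sketch in Python; the skewed exponential distribution, the seed, and the 10,000 repetitions are my choices for illustration, not anything from the video:

```python
import random
import statistics

random.seed(42)

def sample_mean(n):
    """Draw n values from a skewed ("crazy") distribution and average them.

    The exponential distribution here is just a stand-in for the arbitrary
    distribution in the video; any distribution with finite variance works.
    """
    return statistics.fmean(random.expovariate(1.0) for _ in range(n))

# Build the sampling distribution of the sample mean for n = 10
# by repeating the draw-and-average step many times.
means = [sample_mean(10) for _ in range(10_000)]

# The mean of the sample means sits near the population mean (1.0 for an
# exponential with rate 1), even though the population is heavily skewed.
print(round(statistics.fmean(means), 2))
```

Plotting a histogram of `means` would show the roughly bell-shaped pile of averages the video describes, even though the underlying distribution is nothing like a normal curve.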
But let's say we eventually, with all of our samples, get a lot of averages: one stacks up there, another stacks up there, and eventually we'll approach something that looks something like that. And we've seen from the last video that if we were to do it again, and this time with n equal to 20, then first, the distribution that we get is going to be more normal. And maybe in future videos we'll delve even deeper into things like kurtosis and skew. But it's going to be more normal. But even more important here, or I guess even more obvious to us, we saw in the experiment that it's going to have a lower standard deviation. So they're all going to have the same mean. Let's say the mean here is, I don't know, let's say the mean here is 5. Then the mean here is also going to be 5. The mean of our sampling distribution of the sample mean is going to be 5. It doesn't matter what our n is. If our n is 20, it's still going to be 5. But our standard deviation is going to be less than in either of these scenarios. And we saw that just by experimenting. It might look like this. It's going to be more normal, but it's going to have a tighter standard deviation. So maybe it'll look like that.
And if we did it with an even larger sample size (let me do that in a different color), n equal to 100, what we're going to get is something that fits the normal distribution even better. We take a hundred instances of this random variable, average them, plot it. A hundred instances of this random variable, average them, plot it. And we just keep doing that. If we keep doing that, what we're going to have is something that's even more normal than either of these. So it's going to be a much closer fit to a true normal distribution. But even more obvious to the human eye, it's going to be even tighter. So it's going to have a very low standard deviation. It's going to look something like that. And I'll show you on the simulation app, probably later in this video. So two things happen. As you increase your sample size for every time you do the average, two things are happening: you're becoming more normal, and your standard deviation is getting smaller. So the question might arise: is there a formula?
So if I know the standard deviation... so this is my standard deviation of just my original probability density function, and this is the mean of my original probability density function. So if I know the standard deviation, and I know n (n is going to change depending on how many samples I'm taking every time I do a sample mean), if I know my standard deviation, or maybe if I know my variance, right? The variance is just the standard deviation squared. If you don't remember that, you might want to review those videos. But if I know the variance of my original distribution, and if I know what my n is (how many samples I'm going to take every time before I average them in order to plot one point in my sampling distribution of the sample mean), is there a way to predict what the mean of these distributions is? I'm sorry, the standard deviation of these distributions. And so you don't get confused between that and that, let me say the variance. If you know the variance, you can figure out the standard deviation; one is just the square root of the other. So this is the variance of our original distribution. Now, to show that this is the variance of our sampling distribution of our sample mean, we'll write it right here.
This is the variance of our sample mean. Remember, our true mean is this; the Greek letter mu is our true mean. This is equal to the mean, while an x with a line over it means sample mean. So here what we're saying is that this is the variance of our sample mean, and this is going to be the true distribution. This isn't an estimate. If we magically knew the distribution, there's some true variance here. And of course this has a mean. This right here, just to get our notation right, is the mean of the sampling distribution of the sample mean. So this is the mean of our means. It just happens to be the same thing. This is the mean of our sample means, and it's going to be the same thing as that, especially if we do the trial over and over again. But anyway, the point of this video is: is there any way to figure out this variance given the variance of the original distribution and your n? And it turns out there is. I'm not going to do a proof here; I really want to give you the intuition of it.
I think you already have the sense that with every trial you take, if you take a hundred, you're much more likely, when you average those out, to get close to the true mean than if you took an n of 2 or an n of 5. You're just very unlikely to be far away if you took 100 trials as opposed to taking 5. So I think you know that in some way it should be inversely proportional to n: the larger your n, the smaller the standard deviation. And actually it turns out it's about as simple as possible. It's one of those magical things about mathematics. And I'll prove it to you one day. I want to give you working knowledge first. In statistics, I'm always struggling with whether I should be formal in giving you rigorous proofs, but I've kind of come to the conclusion that it's more important to get the working knowledge first in statistics, and then later, once you've gotten all of that down, we can get into the real deep math of it and prove it to you. But I think experimental proofs are kind of all you need for right now, using those simulations to show that they're really true. So it turns out that the variance of your sampling distribution of your sample mean is equal to the variance of your original distribution (that guy right there) divided by n. That's all it is.
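That claim, that the variance of the sampling distribution is the original variance divided by n, can be checked by brute force. A sketch, again using an exponential distribution (which has variance 1 at rate 1) as a stand-in for the original distribution; the trial counts and seed are my own choices:

```python
import random
import statistics

random.seed(0)

POP_VAR = 1.0  # variance of an exponential distribution with rate 1

def simulated_var_of_mean(n, trials=20_000):
    """Variance of the sampling distribution of the sample mean,
    estimated by brute force: average n draws, repeat many times."""
    means = [
        statistics.fmean(random.expovariate(1.0) for _ in range(n))
        for _ in range(trials)
    ]
    return statistics.pvariance(means)

# Simulated variance of the sample mean vs. the predicted POP_VAR / n.
for n in (2, 5, 100):
    print(n, round(simulated_var_of_mean(n), 4), round(POP_VAR / n, 4))
```

For each n, the two printed numbers land close together, which is exactly the "experimental proof" the video is appealing to.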
So if this up here has a variance of, let's say, 20 (I'm just making that number up), then let's say your n is 20. Then the variance of your sampling distribution of your sample mean for an n of 20, well, you're just going to take the variance up here (your variance is 20) divided by your n, 20. So here your variance is going to be 20 divided by 20, which is equal to 1. This is the variance of your original probability distribution, and this is your n. What's your standard deviation going to be? It's going to be the square root of that, right? The standard deviation is going to be the square root of 1, and well, that's also going to be 1. So we could also write this. We could take the square root of both sides of this and say that the standard deviation of the sampling distribution of the sample mean is often called the standard deviation of the mean. And it's also called (I'm going to write this down) the standard error of the mean. All of these things that I just mentioned all just mean the standard deviation of the sampling distribution of the sample mean.
That's why this is confusing, because you use the words mean and sample over and over again. And if it confuses you, let me know; I'll do another video, or pause and repeat, or whatever. But if we just take the square root of both sides, the standard error of the mean, or the standard deviation of the sampling distribution of the sample mean, is equal to the standard deviation of your original function (your original probability density function, which could be very non-normal) divided by the square root of n. I just took the square root of both sides of this equation. I personally like to remember this: that the variance is just inversely proportional to n. And then I like to go back to this, because this is very simple in my head. You just take the variance, divide it by n. Oh, and if I want the standard deviation, I just take the square roots of both sides, and I get this formula. So here, when n is 20, the standard deviation of the sampling distribution of the sample mean is going to be 1. Here, when n is 100, our variance of the sampling distribution of the sample mean (our variance of the sample mean, we could say) is going to be equal to 20 (this guy's variance) divided by n.
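Plugging in the numbers from this example (the made-up variance of 20, with n of 20 and then 100) is just two lines of arithmetic:

```python
import math

pop_variance = 20.0  # the made-up variance of the original distribution

for n in (20, 100):
    var_of_mean = pop_variance / n  # variance of the sample mean: sigma^2 / n
    sem = math.sqrt(var_of_mean)    # standard error of the mean: sigma / sqrt(n)
    print(n, var_of_mean, round(sem, 3))
# n = 20  -> variance 1.0, standard error 1.0
# n = 100 -> variance 0.2, standard error 0.447 (a bit under half of 1)
```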
So it equals (n is 100) 20 divided by 100, which is 1/5. Now this guy's standard deviation, or the standard deviation of the sampling distribution of the sample mean, or the standard error of the mean, is going to be the square root of that: 1 over the square root of 5. And so this guy is going to have a little bit under half the standard deviation, while this guy had a standard deviation of 1. So you see, it's definitely thinner. Now I know what you're saying: well, Sal, you just gave a formula; I don't necessarily believe you. Well, let's see if we can prove it to ourselves using the simulation. So just for fun, let me mess with this distribution a little bit. So that's my new distribution. And let me take two values of n whose square roots are easy to take, because we're looking at standard deviations. So let's take an n of 16 and an n of 25. Let's do 10,000 trials. So in this case, for every one of the trials, we're going to take 16 samples from here, average them, plot it here, and then do a frequency plot. Here we're going to do 25 at a time and then average them. I'll do it once animated, just so you remember. So I'm taking 16 samples, and I plot it there.
I take 16 samples as described by this probability density function (or 25 now) and plot it down here. Now if I do that 10,000 times, what do I get? All right, so here, just visually, you can tell that when n was larger the standard deviation was smaller; this one is more squeezed together. But actually, let's write this stuff down. Let's see if I can remember it here. So in this random distribution I made, my standard deviation was 9.3. I'm going to remember these: our standard deviation for the original thing was 9.3. And the standard deviation here was 2.33, and the standard deviation here was 1.87. Let's see if it conforms to our formula. So I'm going to take this off screen for a second, and I'm going to go back and do some mathematics. I have this on my other screen so I can remember those numbers. So in the trial we just did, my wacky distribution had a standard deviation of 9.3. When n was equal to 16 (let me do this in another color), just doing the experiment, doing a bunch of trials and averaging and doing all the things, we got the standard deviation of the sampling distribution of the sample mean, or the standard error of the mean. We experimentally determined it to be 2.33.
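The experiment just described can also be reproduced without the simulation app. A sketch in Python, using an exponential distribution as the wacky distribution (an exponential with mean 9.3 conveniently also has standard deviation 9.3; that substitution, the seed, and the helper name are mine, not the app's):

```python
import random
import statistics

random.seed(1)

SIGMA = 9.3  # an exponential with mean 9.3 also has standard deviation 9.3

def standard_error_by_simulation(n, trials=10_000):
    """Standard deviation of 10,000 sample means, each an average of n draws."""
    means = [
        statistics.fmean(random.expovariate(1 / SIGMA) for _ in range(n))
        for _ in range(trials)
    ]
    return statistics.pstdev(means)

print(round(standard_error_by_simulation(16), 2))  # close to 9.3 / 4 = 2.325
print(round(standard_error_by_simulation(25), 2))  # close to 9.3 / 5 = 1.86
```

Like the app, 10,000 trials lands within a couple of hundredths of the formula's prediction, not exactly on it.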
And then when n was equal to 25, we got the standard error of the mean being equal to 1.87. Let's see if it conforms to our formulas. So we know that the variance of the sampling distribution of the sample mean is equal to the variance of our original distribution divided by n. Take the square roots of both sides, and you get that the standard error of the mean is equal to the standard deviation of your original distribution divided by the square root of n. So let's see if this works out for these two things. So if I were to take 9.3... let me do this case first. So 9.3 divided by the square root of 16, right? n is 16, so we divide by the square root of 16, which is 4. What do I get? So, 9.3 divided by 4. Let me get a little calculator out here. Let's see, let me clear it out. We want to divide 9.3 by 4: 9.3 divided by our square root of n (n was 16), so divided by 4, is equal to 2.32. So this is equal to 2.32, which is pretty darn close to 2.33. This was after 10,000 trials. Maybe right after this I'll see what happens if we did 20,000 or 30,000 trials where we take samples of 16 and average them. Now let's look at this.
Here we would take 9.3. So let me draw a little line here. Let me scroll over; that might be better. So we take our standard deviation of our original distribution. So just that formula that we've derived right here would tell us that our standard error should be equal to the standard deviation of our original distribution, 9.3, divided by the square root of n, divided by the square root of 25, right? 4 was just the square root of 16. So this is equal to 9.3 divided by 5. And let's see if it's 1.87. So let me get my calculator back. So if I take 9.3 divided by 5, what do I get? 1.86, which is very close to 1.87. So we got 1.86 in this case. So as you can see, what we got experimentally was almost exactly (and this was after 10,000 trials) what you would expect. Let's do another 10,000. So you've got another 10,000 trials. Well, we're still in the ballpark. Maybe I can't hope to get the exact number, rounded or whatever.
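The two calculator checks side by side, using the 9.3 from the experiment:

```python
import math

sigma = 9.3  # standard deviation of the original wacky distribution

sem_16 = sigma / math.sqrt(16)  # predicted standard error for n = 16
sem_25 = sigma / math.sqrt(25)  # predicted standard error for n = 25

print(round(sem_16, 3))  # 2.325, close to the measured 2.33
print(round(sem_25, 2))  # 1.86, close to the measured 1.87
```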
But as you can see, hopefully that'll be pretty satisfying to you: the variance of the sampling distribution of the sample mean is just going to be equal to the variance of your original distribution, no matter how wacky that distribution might be, divided by your sample size, by the number of samples you take for every basket that you average, I guess, is the best way to think about it. You know, sometimes this can get confusing, because you are taking samples of averages based on samples. So when someone says sample size, you're like, is sample size the number of times I took averages, or the number of things I'm taking averages of each time? And you know, it doesn't hurt to clarify that. Normally, when they talk about sample size, they're talking about n. And, at least in my head, when I think of the trials: you take a sample size of 16, you average it, that's one trial, and then you plot it. Then you do it again, and you do another trial. And you do it over and over again. But anyway, hopefully this makes everything clear, and you now also understand how to get to the standard error of the mean.