For any transformation that maps from Rn to Rn, we've done it implicitly, but it's been interesting for us to find the vectors that essentially just get scaled up by the transformation. So, the vectors v for which the transformation of the vector is just equal to some scaled-up version of that vector: T(v) = lambda v. And if this doesn't look familiar, I can jog your memory a little bit.

When we were looking for basis vectors for the transformation, let me draw it. This was from R2 to R2. So let me draw R2 right here. And let's say I had the vector v1 equal to the vector (1, 2), and we had the line spanned by that vector. We did this problem several videos ago. And I had the transformation that flipped vectors across this line. So if we call that line l, T was the transformation from R2 to R2 that flipped vectors across l.

So if you remember that transformation, if I had some random vector that looked like that, let's say that's the vector x, then the transformation of x looks something like this. It's just flipped across that line. That was the transformation of x. And if you remember that video, we were looking for a change of basis that would let us figure out the matrix for the transformation in an alternate basis, and from there we could figure out the matrix for the transformation in the standard basis. And the basis we picked was made of vectors that didn't get changed much by the transformation, ones that only got scaled by it.

For example, when I took the transformation of v1, it just equaled v1. Or we could say that the transformation of v1 equaled 1 times v1. So if you follow the little format that I set up here, lambda in this case would be 1, and the vector in this case is v1. The transformation just scaled v1 by 1.

In that same problem, there was another vector we looked at: the vector v2, which is (2, -1). And if you take the transformation of it, since it was orthogonal to the line, it just got flipped over like that.
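Here is a minimal numpy sketch of that claim. The video doesn't write out the matrix for this flip, so the sketch assumes the standard formula R = 2uu^T - I for reflection across the line spanned by a unit vector u, and checks that v1 = (1, 2) really does come back unchanged:

```python
import numpy as np

# v1 spans the line l that the transformation flips across.
v1 = np.array([1.0, 2.0])
u = v1 / np.linalg.norm(v1)          # unit vector along l

# Standard formula (an assumption, not from the video):
# reflection across span{u} is R = 2 u u^T - I.
R = 2.0 * np.outer(u, u) - np.eye(2)

print(R)          # [[-0.6  0.8]
                  #  [ 0.8  0.6]]
print(R @ v1)     # [1. 2.]  -> T(v1) = 1 * v1, so lambda = 1 for v1
```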
And that was a pretty interesting vector for us as well, because the transformation of v2 in this situation is equal to what? Just minus v2. Or you could say that the transformation of v2 is equal to minus 1 times v2.

And these were interesting vectors for us because when we defined a new basis with these guys as the basis vectors, it was very easy to figure out our transformation matrix. And actually, that basis was very easy to compute with. We'll explore that a little bit more in the future, but hopefully you see that these are interesting vectors.

There was also the case where we had a plane spanned by some vectors, and then another vector popping out of the plane like that. And we were transforming things by taking the mirror image across the plane, and we said, well, in that transformation these red vectors don't change at all and this guy gets flipped over. So maybe those would make for good basis vectors. And they did.

So in general, we're always interested in the vectors that just get scaled up by a transformation. It's not going to be all vectors, right? This vector x that I drew here doesn't just get scaled up; its direction actually gets changed. The vectors that get scaled up might switch direction: maybe x points this way and the transformation of x is a scaled-up version of x pointing the other way. But the line that they span will not change. And so that's what we're going to concern ourselves with.

These have a special name, and I want to make that very clear because they're useful. It's not just some mathematical game we're playing, although sometimes we do fall into that trap. They're actually useful. They're useful for defining bases, because in those bases it's easier to find transformation matrices. They're more natural coordinate systems, and oftentimes the transformation matrices in those bases are easier to compute with. And so these have special names.
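To make the "easier to compute with" point concrete, here is a short continuation of the sketch above, using the same assumed reflection matrix R. It checks that v2 gets flipped to minus v2, and that changing to the basis {v1, v2} turns the transformation matrix into a diagonal one:

```python
import numpy as np

# Same assumed reflection matrix as in the earlier sketch.
R = np.array([[-0.6, 0.8],
              [ 0.8, 0.6]])

v1 = np.array([1.0, 2.0])    # scaled by  1 under the flip
v2 = np.array([2.0, -1.0])   # orthogonal to l, scaled by -1

print(R @ v2)                # [-2.  1.]  -> T(v2) = -1 * v2

# Change of basis: put v1, v2 in the columns of C. The matrix for the
# same transformation with respect to the basis {v1, v2} is C^{-1} R C,
# and it comes out diagonal, with the scaling factors on the diagonal.
C = np.column_stack([v1, v2])
D = np.linalg.inv(C) @ R @ C
print(np.round(D, 10))       # [[ 1.  0.]
                             #  [ 0. -1.]]
```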
Any vector that satisfies this right here is called an eigenvector of the transformation T. And the lambda, the multiple that it becomes, is the eigenvalue associated with that eigenvector.

So in the example I just gave, where the transformation is flipping around this line, v1, the vector (1, 2), is an eigenvector of our transformation. So (1, 2) is an eigenvector, and its corresponding eigenvalue is 1.

This guy is also an eigenvector: the vector (2, -1). A very fancy word, but all it means is a vector that just gets scaled up by a transformation. It doesn't get changed in any more meaningful way than by the scaling factor. And its corresponding eigenvalue is minus 1.

Now, I don't remember what this transformation's matrix is; we actually figured it out a while ago. But this transformation can be represented as a matrix-vector product, and it should be, since it's a linear transformation. So any v that satisfies T(v) = lambda v, which is to say Av = lambda v, is also called an eigenvector of A, because A is really just the matrix representation of the transformation. So in this case, this would be an eigenvector of A, and this would be the eigenvalue associated with that eigenvector.

So if you give me a matrix that represents some linear transformation, you can also figure these things out. In the next video we're actually going to work out a way to find them. But what I want you to appreciate in this video is that it's easy to say, oh, these are the vectors that don't get changed much, but I want you to understand what that means. They literally just get scaled up, or maybe they get reversed. Their direction, or the line they span, fundamentally doesn't change. And one of the reasons they're interesting for us is that they make for interesting basis vectors: basis vectors whose transformation matrices are maybe computationally simpler, or ones that make for better coordinate systems.
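How to find eigenvalues and eigenvectors by hand is the subject of the next video, but as a numerical sanity check, a standard library routine applied to the assumed reflection matrix recovers exactly the two eigenvalue/eigenvector pairs discussed here:

```python
import numpy as np

# The assumed matrix of the flip across span{(1, 2)}.
A = np.array([[-0.6, 0.8],
              [ 0.8, 0.6]])

# np.linalg.eig returns the eigenvalues and, as the columns of the
# second array, unit-length eigenvectors (order is not guaranteed).
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)           # e.g. [-1.  1.]
print(eigenvectors)          # columns are multiples of (2, -1) and (1, 2)

# Every column really does satisfy A v = lambda v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```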