
Linear Algebra: Introduction to Eigenvalues and Eigenvectors

  • 0:01 - 0:07
    For any transformation that maps
    from Rn to Rn, we've done
  • 0:07 - 0:10
    it implicitly, but it's been
    interesting for us to find the
  • 0:10 - 0:12
    vectors that essentially just
    get scaled up by the
  • 0:12 - 0:14
    transformation.
  • 0:14 - 0:17
    So the vectors that have the
    form-- the transformation of
  • 0:17 - 0:21
    my vector is just equal
    to some scaled-up
  • 0:21 - 0:22
    version of the vector itself.
  • 0:22 - 0:24
    And if this doesn't look
    familiar, I can jog your
  • 0:24 - 0:26
    memory a little bit.
  • 0:26 - 0:28
    When we were looking for
    basis vectors for the
  • 0:28 - 0:29
    transformation--
    let me draw it.
  • 0:29 - 0:31
    This was from R2 to R2.
  • 0:34 - 0:37
    So let me draw R2 right here.
  • 0:37 - 0:44
    And let's say I had the
    vector v1, which was equal to
  • 0:44 - 0:46
    the vector 1, 2.
  • 0:46 - 0:49
    And we had the line spanned
    by that vector.
  • 0:49 - 0:52
    We did this problem several
    videos ago.
  • 0:52 - 0:55
    And I had the transformation
    that flipped across this line.
  • 0:55 - 1:01
    So if we call that line l, T was
    the transformation from R2
  • 1:01 - 1:05
    to R2 that flipped vectors
    across this line.
  • 1:05 - 1:13
    So it flipped vectors
    across l.
  • 1:13 - 1:16
    So if you remember that
    transformation, if I had some
  • 1:16 - 1:19
    random vector that looked like
    that, let's say that's x,
  • 1:19 - 1:22
    that's vector x, then the
    transformation of x looks
  • 1:22 - 1:22
    something like this.
  • 1:22 - 1:25
    It's just flipped across
    that line.
  • 1:25 - 1:27
    That was the transformation
    of x.
  • 1:27 - 1:29
    And if you remember that video,
    we were looking for a
  • 1:29 - 1:32
    change of basis that would allow
    us to at least figure
  • 1:32 - 1:35
    out the matrix for the
    transformation, at least in an
  • 1:35 - 1:36
    alternate basis.
  • 1:36 - 1:37
    And then we could figure
    out the matrix for the
  • 1:37 - 1:39
    transformation in the
    standard basis.
  • 1:39 - 1:43
    And the basis we picked were
    basis vectors that didn't get
  • 1:43 - 1:45
    changed much by the
    transformation, or ones that
  • 1:45 - 1:47
    only got scaled by the
    transformation.
  • 1:47 - 1:53
    For example, when I took the
    transformation of v1, it just
  • 1:53 - 1:54
    equaled v1.
  • 1:54 - 1:59
    Or we could say that the
    transformation of v1 just
  • 1:59 - 2:03
    equaled 1 times v1.
  • 2:03 - 2:07
    So if you just follow this
    little format that I set up
  • 2:07 - 2:09
    here, lambda, in this
    case, would be 1.
  • 2:09 - 2:11
    And of course, the vector
    in this case is v1.
  • 2:11 - 2:16
    The transformation just
    scaled up v1 by 1.
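
If you want to verify this first pair numerically, here is a minimal numpy sketch. It assumes the standard construction of a reflection matrix across the line spanned by a unit vector u, namely R = 2uu^T - I; that formula is a standard fact, not quoted from the video's earlier derivation.

```python
import numpy as np

# Reflection across the line l spanned by v1 = (1, 2).
# Standard construction (assumed, not from the video):
# for a unit vector u along l, the reflection matrix is
# R = 2*u*u^T - I.
v1 = np.array([1.0, 2.0])
u = v1 / np.linalg.norm(v1)
R = 2 * np.outer(u, u) - np.eye(2)

# v1 lies on l, so the flip leaves it alone: T(v1) = 1 * v1.
print(R @ v1)                       # [1. 2.]
print(np.allclose(R @ v1, 1 * v1))  # True
```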
  • 2:16 - 2:19
    In that same problem, we had
    the other vector that
  • 2:19 - 2:22
    we also looked at.
  • 2:22 - 2:28
    It was the other vector-- let's
    say it's the vector v2,
  • 2:28 - 2:32
    which is equal to
    2, minus 1.
  • 2:32 - 2:34
    And then if you take the
    transformation of it, since it
  • 2:34 - 2:36
    was orthogonal to the
    line, it just got
  • 2:36 - 2:38
    flipped over like that.
  • 2:38 - 2:40
    And that was a pretty
    interesting vector for us as
  • 2:40 - 2:45
    well, because the transformation
    of v2 in this
  • 2:45 - 2:47
    situation is equal to what?
  • 2:47 - 2:49
    Just minus v2.
  • 2:49 - 2:50
    It's equal to minus v2.
  • 2:50 - 2:55
    Or you could say that the
    transformation of v2 is equal
  • 2:55 - 2:58
    to minus 1 times v2.
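
The same sketch confirms this second pair; again the reflection matrix comes from the assumed construction R = 2uu^T - I, built exactly as above.

```python
import numpy as np

# Same reflection matrix as in the sketch above.
u = np.array([1.0, 2.0]) / np.sqrt(5.0)
R = 2 * np.outer(u, u) - np.eye(2)

# v2 = (2, -1) is orthogonal to the line, so the flip
# reverses it: T(v2) = -1 * v2.
v2 = np.array([2.0, -1.0])
print(R @ v2)                        # [-2.  1.]
print(np.allclose(R @ v2, -1 * v2))  # True
```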
  • 2:58 - 3:02
    And these were interesting
    vectors for us because when we
  • 3:02 - 3:06
    defined a new basis with these
    guys as the basis vector, it
  • 3:06 - 3:09
    was very easy to figure out
    our transformation matrix.
  • 3:09 - 3:12
    And actually, that basis was
    very easy to compute with.
  • 3:12 - 3:14
    And we'll explore that a little
    bit more in the future.
  • 3:14 - 3:17
    But hopefully you realize that
    these are interesting vectors.
  • 3:17 - 3:22
    There were also the cases where
    we had a plane spanned by
  • 3:22 - 3:24
    some vectors.
  • 3:24 - 3:26
    And then we had another vector
    that was popping out of the
  • 3:26 - 3:27
    plane like that.
  • 3:27 - 3:29
    And we were transforming things
    by taking the mirror
  • 3:29 - 3:31
    image across this and we're
    like, well in that
  • 3:31 - 3:34
    transformation, these red
    vectors don't change at all
  • 3:34 - 3:36
    and this guy gets
    flipped over.
  • 3:36 - 3:38
    So maybe those would make
    for good bases.
  • 3:38 - 3:40
    Or those would make for
    good basis vectors.
  • 3:40 - 3:41
    And they did.
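
The same pattern shows up numerically in the R3 case. Here is a sketch under assumptions: the plane and normal below are hypothetical stand-ins, since the video doesn't specify them, and the mirror matrix uses the standard construction M = I - 2nn^T for a unit normal n.

```python
import numpy as np

# Hypothetical plane x + y + z = 0 with unit normal n
# (assumed example; the video's plane is not specified).
# Standard mirror construction (assumed): M = I - 2*n*n^T.
n = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
M = np.eye(3) - 2 * np.outer(n, n)

in_plane = np.array([1.0, -1.0, 0.0])  # a "red vector" lying in the plane
normal = np.array([1.0, 1.0, 1.0])     # the vector popping out of the plane

print(np.allclose(M @ in_plane, in_plane))  # True: eigenvalue 1
print(np.allclose(M @ normal, -normal))     # True: eigenvalue -1
```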
  • 3:41 - 3:45
    So in general, we're always
    interested in the vectors
  • 3:45 - 3:47
    that just get scaled up
    by a transformation.
  • 3:47 - 3:49
    It's not going to be
    all vectors, right?
  • 3:49 - 3:51
    This vector that I drew here,
    this vector x, it doesn't just
  • 3:51 - 3:55
    get scaled up, it actually gets
    changed, this direction
  • 3:55 - 3:57
    gets changed.
  • 3:57 - 4:00
    The vectors that get scaled up
    might switch direction-- might go
  • 4:00 - 4:03
    from this direction to that
    direction, or maybe
  • 4:03 - 4:04
    they go the other way.
  • 4:04 - 4:07
    Maybe that's x and then the
    transformation of x might be a
  • 4:07 - 4:08
    scaled up version of x.
  • 4:08 - 4:10
    Maybe it's that.
  • 4:12 - 4:17
    The actual, I guess, line that
    they span will not change.
  • 4:17 - 4:19
    And so that's what we're going
    to concern ourselves with.
  • 4:19 - 4:21
    These have a special name.
  • 4:21 - 4:24
    And I want to make this very
  • 4:24 - 4:25
    clear because they're useful.
  • 4:25 - 4:27
    It's not just some mathematical
    game we're
  • 4:27 - 4:30
    playing, although sometimes
    we do fall into that trap.
  • 4:30 - 4:31
    But they're actually useful.
  • 4:31 - 4:34
    They're useful for defining
    bases because in those bases
  • 4:34 - 4:37
    it's easier to find
    transformation matrices.
  • 4:37 - 4:39
    They're more natural coordinate
    systems. And
  • 4:39 - 4:42
    oftentimes, the transformation
    matrices in those bases are
  • 4:42 - 4:44
    easier to compute with.
  • 4:44 - 4:47
    And so these have
    special names.
  • 4:47 - 4:50
    Any vector that satisfies this
    right here is called an
  • 4:50 - 4:58
    eigenvector for the
    transformation T.
  • 4:58 - 5:02
    And the lambda, the multiple
    that it becomes-- this is the
  • 5:02 - 5:12
    eigenvalue associated with
    that eigenvector.
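
In symbols, this is the equation set up at the start of the video; one caveat usually added to the formal definition, though not mentioned here, is that the zero vector is excluded, since T(0) = lambda * 0 holds for every lambda:

```latex
T(\vec{v}) = \lambda \vec{v}, \qquad \vec{v} \neq \vec{0}
```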
  • 5:17 - 5:20
    So in the example I just gave
    where the transformation is
  • 5:20 - 5:24
    flipping around this line,
    v1, the vector 1, 2 is an
  • 5:24 - 5:27
    eigenvector of our
    transformation.
  • 5:27 - 5:31
    So 1, 2 is an eigenvector.
  • 5:34 - 5:36
    And its corresponding
    eigenvalue is 1.
  • 5:42 - 5:44
    This guy is also an
    eigenvector-- the
  • 5:44 - 5:45
    vector 2, minus 1.
  • 5:45 - 5:48
    He's also an eigenvector.
  • 5:48 - 5:50
    A very fancy word, but all it
    means is a vector that's just
  • 5:50 - 5:52
    scaled up by a transformation.
  • 5:52 - 5:55
    It doesn't get changed in any
    more meaningful way than just
  • 5:55 - 5:56
    the scaling factor.
  • 5:56 - 6:04
    And its corresponding
    eigenvalue is minus 1.
  • 6:04 - 6:06
    If this transformation--
    I don't know what its
  • 6:06 - 6:07
    transformation matrix is.
  • 6:07 - 6:08
    I forgot what it was.
  • 6:08 - 6:11
    We actually figured it
    out a while ago.
  • 6:11 - 6:16
    If this transformation
    can be represented as a matrix-
  • 6:16 - 6:18
    vector product-- and it should
    be; it's a linear
  • 6:18 - 6:23
    transformation-- then any
    v that satisfies the
  • 6:23 - 6:28
    transformation of-- I'll say
    transformation of v is equal
  • 6:28 - 6:33
    to lambda v, which also would
    be-- you know, the
  • 6:33 - 6:33
    transformation of v
  • 6:33 - 6:36
    would just be A times v.
  • 6:36 - 6:39
    These are also called
    eigenvectors of A, because A
  • 6:39 - 6:42
    is just really the matrix
    representation of the
  • 6:42 - 6:43
    transformation.
  • 6:43 - 6:52
    So in this case, this would be
    an eigenvector of A, and this
  • 6:52 - 6:54
    would be the eigenvalue
    associated with the
  • 6:54 - 6:55
    eigenvector.
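
To make that concrete: for the reflection in this example, the matrix works out to A = (1/5)[[-3, 4], [4, 3]] (recomputed here from the construction R = 2uu^T - I, not quoted from the earlier video), and numpy's eigensolver recovers exactly the eigenvalues and directions we read off geometrically.

```python
import numpy as np

# Matrix of the reflection across the line spanned by (1, 2),
# recomputed from R = 2*u*u^T - I (assumed construction).
A = np.array([[-3.0, 4.0],
              [ 4.0, 3.0]]) / 5.0

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # [-1.  1.] (order may vary)

# eig returns unit-length eigenvector columns, so compare
# directions rather than exact components: one column is
# parallel to (2, -1), the other to (1, 2).
print(eigenvectors)
```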
  • 6:59 - 7:01
    So if you give me a matrix that
    represents some linear
  • 7:01 - 7:02
    transformation,
  • 7:02 - 7:04
    you can also figure
    these things out.
  • 7:04 - 7:06
    Now, in the next video, we're
    actually going to work out a
  • 7:06 - 7:07
    way to figure these
    things out.
  • 7:07 - 7:10
    But what I want you to
    appreciate in this video is
  • 7:10 - 7:14
    that it's easy to say,
    oh, the vectors that
  • 7:14 - 7:15
    don't get changed much.
  • 7:15 - 7:17
    But I want you to understand
    what that means.
  • 7:17 - 7:20
    They literally just get scaled
    up, or maybe they get reversed.
  • 7:20 - 7:22
    Their direction or the
    lines they span
  • 7:22 - 7:23
    fundamentally don't change.
  • 7:23 - 7:26
    And the reason why they're
    interesting for us is, well,
  • 7:26 - 7:29
    one of the reasons why they're
    interesting for us is that
  • 7:29 - 7:33
    they make for interesting basis
    vectors-- basis vectors
  • 7:33 - 7:37
    whose transformation matrices
    are maybe computationally
  • 7:37 - 7:42
    simpler, or ones that make for
    better coordinate systems.