
Inverting Matrices (part 3)

  • 0:01 - 0:04
    I will now show you my preferred
    way of finding an
  • 0:04 - 0:06
    inverse of a 3 by 3 matrix.
  • 0:06 - 0:07
    And I actually think it's
    a lot more fun.
  • 0:07 - 0:09
    And you're less likely to
    make careless mistakes.
  • 0:09 - 0:11
    But if I remember correctly from
    Algebra 2, they didn't
  • 0:11 - 0:13
    teach it this way
    in Algebra 2.
  • 0:13 - 0:15
    And that's why I taught the
    other way initially.
  • 0:15 - 0:16
    But let's go through this.
  • 0:16 - 0:20
    And in a future video, I will
    teach you why it works.
  • 0:20 - 0:21
    Because that's always
    important.
  • 0:21 - 0:24
    But in linear algebra, this is
    one of the few subjects where
  • 0:24 - 0:27
    I think it's very important
    to learn how to do the operations
  • 0:27 - 0:29
    first. And then later,
    we'll learn the why.
  • 0:29 - 0:30
    Because the how is
    very mechanical.
  • 0:30 - 0:33
    And it really just involves
    some basic arithmetic
  • 0:33 - 0:34
    for the most part.
  • 0:34 - 0:39
    But the why tends to
    be quite deep.
  • 0:39 - 0:41
    So I'll leave that
    to later videos.
  • 0:41 - 0:44
    And you can often think about
    the depth of things when you
  • 0:44 - 0:47
    have confidence that you at
    least understand the hows.
  • 0:47 - 0:50
    So anyway, let's go back
    to our original matrix.
  • 0:50 - 0:51
    And what was that original
    matrix that I
  • 0:51 - 0:52
    did in the last video?
  • 0:52 - 1:04
    It was 1, 0, 1, 0,
    2, 1, 1, 1, 1.
  • 1:04 - 1:07
    And we wanted to find the
    inverse of this matrix.
  • 1:07 - 1:09
    So this is what we're
    going to do.
  • 1:09 - 1:13
    It's called Gauss-Jordan
    elimination, to find the
  • 1:13 - 1:14
    inverse of the matrix.
  • 1:14 - 1:16
    And the way you do it-- and it
    might seem a little bit like
  • 1:16 - 1:19
    magic, it might seem a little
    bit like voodoo, but I think
  • 1:19 - 1:20
    you'll see in future videos that
    it makes a lot of sense.
  • 1:20 - 1:23
    What we do is we augment
    this matrix.
  • 1:23 - 1:24
    What does augment mean?
  • 1:24 - 1:25
    It means we just add
    something to it.
  • 1:25 - 1:27
    So I draw a dividing line.
  • 1:27 - 1:28
    Some people don't.
  • 1:28 - 1:31
    So if I put a dividing
    line here.
  • 1:31 - 1:34
    And what do I put on the other
    side of the dividing line?
  • 1:34 - 1:38
    I put the identity matrix
    of the same size.
  • 1:38 - 1:41
    This is 3 by 3, so I put a
    3 by 3 identity matrix.
  • 1:41 - 1:52
    So that's 1, 0, 0,
    0, 1, 0, 0, 0, 1.
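This setup is easy to sketch in a few lines (a minimal sketch in Python with NumPy, not anything from the video itself; the entries are the ones written out above):

```python
import numpy as np

# The matrix from the video, read row by row
A = np.array([[1., 0., 1.],
              [0., 2., 1.],
              [1., 1., 1.]])

# Augment with the 3x3 identity on the other side
# of the dividing line: [A | I]
aug = np.hstack([A, np.eye(3)])
print(aug)
```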
  • 1:52 - 1:55
    All right, so what are
    we going to do?
  • 1:55 - 1:59
    What I'm going to do is perform
    a series of elementary
  • 1:59 - 2:00
    row operations.
  • 2:00 - 2:03
    And I'm about to tell you what
    are valid elementary row
  • 2:03 - 2:05
    operations on this matrix.
  • 2:05 - 2:07
    But whatever I do to any of
    these rows here, I have to do
  • 2:07 - 2:09
    to the corresponding
    rows here.
  • 2:09 - 2:13
    And my goal is essentially to
    perform a bunch of operations
  • 2:13 - 2:14
    on the left hand side.
  • 2:14 - 2:16
    And of course, the same
    operations will be applied to
  • 2:16 - 2:19
    the right hand side, so that I
    eventually end up with the
  • 2:19 - 2:21
    identity matrix on the
    left hand side.
  • 2:21 - 2:23
    And then when I have the
    identity matrix on the left
  • 2:23 - 2:26
    hand side, what I have left on
    the right hand side will be
  • 2:26 - 2:29
    the inverse of this
    original matrix.
  • 2:29 - 2:33
    And when this becomes an
    identity matrix, that's
  • 2:33 - 2:35
    actually called reduced
    row echelon form.
  • 2:35 - 2:36
    And I'll talk more about that.
  • 2:36 - 2:39
    There's a lot of names and
    labels in linear algebra.
  • 2:39 - 2:41
    But they're really just fairly
    simple concepts.
  • 2:41 - 2:45
    But anyway, let's get started
    and this should become a
  • 2:45 - 2:45
    little clearer.
  • 2:45 - 2:47
    At least the process
    will become clear.
  • 2:47 - 2:49
    Maybe not why it works.
  • 2:49 - 2:52
    So first of all, I said I'm
    going to perform a bunch of
  • 2:52 - 2:52
    operations here.
  • 2:52 - 2:54
    What are legitimate
    operations?
  • 2:54 - 2:56
    They're called elementary
    row operations.
  • 2:56 - 2:58
    So there's a couple
    things I can do.
  • 2:58 - 3:02
    I can replace any row
    with that row
  • 3:02 - 3:04
    multiplied by some number.
  • 3:04 - 3:05
    So I could do that.
  • 3:05 - 3:08
    I can swap any two rows.
  • 3:08 - 3:11
    And of course if I swap say the
    first and second row, I'd
  • 3:11 - 3:12
    have to do it here as well.
  • 3:12 - 3:17
    And I can add or subtract one
    row from another row.
  • 3:17 - 3:21
    So when I do that-- so for
    example, I could take this row
  • 3:21 - 3:24
    and replace it with this
    row added to this row.
  • 3:24 - 3:26
    And you'll see what I
    mean in a second.
  • 3:26 - 3:28
    And you know, if you combine it,
    you could say,
  • 3:28 - 3:30
    well I'm going to multiply this
    row times negative 1, and
  • 3:30 - 3:33
    add it to this row, and replace
    this row with that.
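The three legal moves just described can be written down as small helper functions (a sketch in NumPy; the function names are my own, not any standard API):

```python
import numpy as np

def scale_row(M, i, c):
    """Replace row i with c times row i."""
    out = M.copy()
    out[i] = c * out[i]
    return out

def swap_rows(M, i, j):
    """Swap rows i and j."""
    out = M.copy()
    out[[i, j]] = out[[j, i]]
    return out

def add_multiple(M, i, j, c):
    """Replace row i with row i plus c times row j (c = -1 subtracts)."""
    out = M.copy()
    out[i] = out[i] + c * out[j]
    return out
```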
  • 3:33 - 3:37
    So if you start to feel like
    this is something like what
  • 3:37 - 3:40
    you learned when you learned
    solving systems of linear
  • 3:40 - 3:43
    equations, that's
    no coincidence.
  • 3:43 - 3:46
    Because matrices are actually
    a very good way to represent
  • 3:46 - 3:48
    that, and I will show
    you that soon.
  • 3:48 - 3:51
    But anyway, let's do some
    elementary row operations to
  • 3:51 - 3:55
    get this left hand side into
    reduced row echelon form.
  • 3:55 - 3:58
    Which is really just a fancy way
    of saying, let's turn it
  • 3:58 - 4:00
    into the identity matrix.
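The whole procedure that's about to be carried out by hand can be sketched as a single routine (a minimal sketch in NumPy; the pivot search and scaling steps are extra generality the hand computation below doesn't happen to need):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing the augmented matrix [A | I]."""
    n = len(A)
    aug = np.hstack([np.array(A, dtype=float), np.eye(n)])
    for col in range(n):
        # Pick the largest pivot at or below this row and swap it up
        pivot = col + int(np.argmax(np.abs(aug[col:, col])))
        if np.isclose(aug[pivot, col], 0.0):
            raise ValueError("matrix is not invertible")
        aug[[col, pivot]] = aug[[pivot, col]]
        # Scale so the pivot entry becomes 1
        aug[col] /= aug[col, col]
        # Clear the rest of the column with row subtractions
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]
    # Left block is now the identity; right block is the inverse
    return aug[:, n:]
```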
  • 4:00 - 4:01
    So let's see what
    we want to do.
  • 4:01 - 4:02
    We want to have 1's
    all across here.
  • 4:02 - 4:04
    We want these to be 0's.
  • 4:04 - 4:08
    Let's see how we can do
    this efficiently.
  • 4:08 - 4:11
    Let me draw the matrix again.
  • 4:11 - 4:16
    So let's get a 0 here.
  • 4:16 - 4:17
    That would be convenient.
  • 4:17 - 4:20
    So I'm going to keep the
    top two rows the same.
  • 4:20 - 4:21
    1, 0, 1.
  • 4:21 - 4:23
    I have my dividing line.
  • 4:23 - 4:24
    1, 0, 0.
  • 4:24 - 4:25
    I didn't do anything there.
  • 4:25 - 4:27
    I'm not doing anything
    to the second row.
  • 4:27 - 4:29
    0, 2, 1.
  • 4:33 - 4:37
    0, 1, 0.
  • 4:37 - 4:40
    And what I'm going to do, I'm
    going to replace this row--
  • 4:40 - 4:42
    And just so you know my
    motivation, my goal
  • 4:42 - 4:43
    is to get a 0 here.
  • 4:43 - 4:47
    So I'm a little bit closer
    to having the
  • 4:47 - 4:48
    identity matrix here.
  • 4:48 - 4:50
    So how do I get a 0 here?
  • 4:50 - 4:56
    What I could do is I can replace
    this row with this row
  • 4:56 - 4:57
    minus this row.
  • 4:57 - 5:00
    So I can replace the third
    row with the third row
  • 5:00 - 5:02
    minus the first row.
  • 5:02 - 5:04
    So what's the third row
    minus the first row?
  • 5:04 - 5:07
    1 minus 1 is 0.
  • 5:07 - 5:11
    1 minus 0 is 1.
  • 5:11 - 5:14
    1 minus 1 is 0.
  • 5:14 - 5:16
    Well I did it on the left hand
    side, so I have to do it on
  • 5:16 - 5:17
    the right hand side.
  • 5:17 - 5:20
    I have to replace this
    with this minus this.
  • 5:20 - 5:24
    So 0 minus 1 is minus 1.
  • 5:24 - 5:27
    0 minus 0 is 0.
  • 5:27 - 5:30
    And 1 minus 0 is 1.
  • 5:30 - 5:31
    Fair enough.
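A quick numeric check of this step (a sketch in NumPy; the 3x6 array is the augmented matrix as it stands at this point):

```python
import numpy as np

# [A | I] before this step
aug = np.array([[1., 0., 1.,  1., 0., 0.],
                [0., 2., 1.,  0., 1., 0.],
                [1., 1., 1.,  0., 0., 1.]])

# Replace row three with (row three - row one); one array operation
# handles both sides of the dividing line at once
aug[2] = aug[2] - aug[0]
# row three is now 0, 1, 0 | -1, 0, 1
```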
  • 5:31 - 5:33
    Now what can I do?
  • 5:33 - 5:38
    Well this row right here, this
    third row, it has 0 and 0-- it
  • 5:38 - 5:41
    looks a lot like what I want
    for my second row in the
  • 5:41 - 5:42
    identity matrix.
  • 5:42 - 5:43
    So why don't I just swap
    these two rows?
  • 5:43 - 5:45
    Why don't I just swap the
    second and third rows?
  • 5:45 - 5:47
    So let's do that.
  • 5:47 - 5:50
    I'm going to swap the second
    and third rows.
  • 5:50 - 5:51
    So the first row
    stays the same.
  • 5:51 - 5:55
    1, 0, 1.
  • 5:55 - 5:58
    And then the other side stays
    the same as well.
  • 5:58 - 6:02
    And I'm swapping the second
    and third rows.
  • 6:02 - 6:05
    So now my second row
    is now 0, 1, 0.
  • 6:05 - 6:07
    And I have to swap it on
    the right hand side.
  • 6:07 - 6:10
    So it's minus 1, 0, 1.
  • 6:10 - 6:13
    I'm just swapping these two.
  • 6:13 - 6:14
    So then my third row now
    becomes what the
  • 6:14 - 6:15
    second row was here.
  • 6:15 - 6:18
    0, 2, 1.
  • 6:18 - 6:22
    And 0, 1, 0.
  • 6:22 - 6:23
    Fair enough.
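The swap is one line in NumPy (a sketch; the starting array is the augmented matrix after the previous subtraction):

```python
import numpy as np

# The augmented matrix after the previous step
aug = np.array([[1., 0., 1.,  1., 0., 0.],
                [0., 2., 1.,  0., 1., 0.],
                [0., 1., 0., -1., 0., 1.]])

# Swap the second and third rows; fancy indexing
# exchanges both rows, on both sides, in one shot
aug[[1, 2]] = aug[[2, 1]]
```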
  • 6:23 - 6:25
    Now what do I want to do?
  • 6:25 - 6:27
    Well it would be nice if
    I had a 0 right here.
  • 6:27 - 6:30
    That would get me that much
    closer to the identity matrix.
  • 6:30 - 6:32
    So how could I get a 0 here?
  • 6:32 - 6:37
    Well what if I subtracted 2
    times row two from row three?
  • 6:37 - 6:40
    Because this would be,
    1 times 2 is 2.
  • 6:40 - 6:45
    And if I subtracted that from
    this, I'll get a 0 here.
  • 6:45 - 6:47
    So let's do that.
  • 6:47 - 6:50
    So the first row has
    been very lucky.
  • 6:50 - 6:51
    It hasn't had to do anything.
  • 6:51 - 6:53
    It's just sitting there.
  • 6:53 - 6:59
    1, 0, 1, 1, 0, 0.
  • 6:59 - 7:02
    And the second row's not
    changing for now.
  • 7:02 - 7:05
    Minus 1, 0, 1.
  • 7:05 - 7:07
    Now what did I say I
    was going to do?
  • 7:07 - 7:13
    I'm going to subtract 2 times
    row two from row three.
  • 7:13 - 7:19
    So this is 0 minus
    2 times 0 is 0.
  • 7:19 - 7:24
    2 minus 2 times 1,
    well that's 0.
  • 7:24 - 7:29
    1 minus 2 times 0 is 1.
  • 7:29 - 7:38
    0 minus 2 times negative 1 is--
    so let's remember 0 minus
  • 7:38 - 7:40
    2 times negative 1.
  • 7:40 - 7:45
    So that's 0 minus negative
    2, so that's positive 2.
  • 7:45 - 7:48
    1 minus 2 times 0.
  • 7:48 - 7:50
    Well that's just still 1.
  • 7:50 - 7:53
    0 minus 2 times 1.
  • 7:53 - 7:54
    So that's minus 2.
  • 7:57 - 7:58
    Have I done that right?
  • 7:58 - 7:59
    I just want to make sure.
  • 7:59 - 8:05
    0 minus 2 times-- right, 2
    times minus 1 is minus 2.
  • 8:05 - 8:07
    And I'm subtracting
    it, so it's plus.
  • 8:07 - 8:08
    OK, so I'm close.
  • 8:08 - 8:11
    This almost looks like the
    identity matrix or reduced row
  • 8:11 - 8:12
    echelon form.
  • 8:12 - 8:13
    Except for this 1 right here.
  • 8:13 - 8:17
    So I'm finally going to have
    to touch the top row.
  • 8:17 - 8:18
    And what can I do?
  • 8:18 - 8:23
    Well how about I replace the top
    row with the top row minus
  • 8:23 - 8:24
    the bottom row?
  • 8:24 - 8:25
    Because if I subtract
    this from that,
  • 8:25 - 8:27
    this'll get a 0 there.
  • 8:27 - 8:28
    So let's do that.
  • 8:28 - 8:30
    So I'm replacing the top
    row with the top row
  • 8:30 - 8:32
    minus the third row.
  • 8:32 - 8:36
    So 1 minus 0 is 1.
  • 8:36 - 8:39
    0 minus 0 is 0.
  • 8:39 - 8:41
    1 minus 1 is 0.
  • 8:41 - 8:44
    That was our whole goal.
  • 8:44 - 8:48
    And then 1 minus 2
    is negative 1.
  • 8:48 - 8:53
    0 minus 1 is negative 1.
  • 8:53 - 8:59
    0 minus negative 2, well
    that's positive 2.
  • 8:59 - 9:02
    And then the other rows
    stay the same.
  • 9:02 - 9:08
    0, 1, 0, minus 1, 0, 1.
  • 9:08 - 9:16
    And then 0, 0, 1, 2,
    1, negative 2.
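And the final step, checked numerically (a sketch in NumPy; the starting array is the augmented matrix after the previous step):

```python
import numpy as np

# The augmented matrix after the previous step
aug = np.array([[1., 0., 1.,  1., 0., 0.],
                [0., 1., 0., -1., 0., 1.],
                [0., 0., 1.,  2., 1., -2.]])

# Replace row one with (row one - row three): this clears
# the last off-diagonal entry on the left-hand side
aug[0] = aug[0] - aug[2]
# left block is now the identity; right block is the inverse
```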
  • 9:16 - 9:17
    And there you have it.
  • 9:17 - 9:19
    We have performed a series
    of operations on
  • 9:19 - 9:20
    the left hand side.
  • 9:20 - 9:21
    And we've performed the
    same operations on
  • 9:21 - 9:23
    the right hand side.
  • 9:23 - 9:26
    This became the identity
    matrix, or
  • 9:26 - 9:27
    reduced row echelon form.
  • 9:27 - 9:31
    And we did this using
    Gauss-Jordan elimination.
  • 9:31 - 9:32
    And what is this?
  • 9:32 - 9:37
    Well this is the inverse of
    this original matrix.
  • 9:37 - 9:39
    This times this will equal
    the identity matrix.
  • 9:39 - 9:47
    So if this is a, then
    this is a inverse.
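That claim is easy to check numerically (a sketch in NumPy; the inverse entries are read off the right-hand side above):

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 2., 1.],
              [1., 1., 1.]])

# The right-hand side we ended up with
A_inv = np.array([[-1., -1.,  2.],
                  [-1.,  0.,  1.],
                  [ 2.,  1., -2.]])

# A times its inverse should give back the identity matrix
print(A @ A_inv)
```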
  • 9:47 - 9:48
    And that's all you have to do.
  • 9:48 - 9:50
    And as you could see, this took
    me half the amount of
  • 9:50 - 9:53
    time, and required a lot less
    hairy mathematics than when I
  • 9:53 - 9:56
    did it using the adjoint and
    the cofactors and the
  • 9:56 - 9:58
    determinant.
  • 9:58 - 10:00
    And if you think about it, I'll
    give you a little hint of
  • 10:00 - 10:01
    why this worked.
  • 10:01 - 10:07
    Every one of these operations
    I did on the left hand side,
  • 10:07 - 10:11
    you could kind of view them as
    multiplying-- you know, to get
  • 10:11 - 10:12
    from here to here,
    I multiplied.
  • 10:12 - 10:14
    You can kind of say that
    there's a matrix.
  • 10:14 - 10:16
    That if I multiplied by that
    matrix, it would have
  • 10:16 - 10:18
    performed this operation.
  • 10:18 - 10:20
    And then I would have had to
    multiply by another matrix to
  • 10:20 - 10:22
    do this operation.
  • 10:22 - 10:24
    So essentially what we did is
    we multiplied by a series of
  • 10:24 - 10:26
    matrices to get here.
  • 10:26 - 10:28
    And if you multiplied all
    of those, what we call
  • 10:28 - 10:31
    elimination matrices, together,
    you essentially
  • 10:31 - 10:34
    multiply this times
    the inverse.
  • 10:34 - 10:36
    So what am I saying?
  • 10:36 - 10:43
    So if we have a, to go from
    here to here, we have to
  • 10:43 - 10:47
    multiply a times the
    elimination matrix.
  • 10:47 - 10:50
    And this might be completely
    confusing for you, so ignore
  • 10:50 - 10:52
    it if it is, but it might
    be insightful.
  • 10:52 - 10:55
    So what did we eliminate
    in this?
  • 10:55 - 10:58
    We eliminated 3, 1.
  • 10:58 - 11:01
    We multiplied by the
    elimination matrix
  • 11:01 - 11:04
    3, 1, to get here.
  • 11:04 - 11:06
    And then, to go from
    here to here, we've
  • 11:06 - 11:07
    multiplied by some matrix.
  • 11:07 - 11:08
    And I'll tell you more.
  • 11:08 - 11:09
    I'll show you how
    we can construct
  • 11:09 - 11:11
    these elimination matrices.
  • 11:11 - 11:13
    We multiply by an elimination
    matrix.
  • 11:13 - 11:16
    Well actually, we had
    a row swap here.
  • 11:16 - 11:17
    I don't know what you
    want to call that.
  • 11:17 - 11:21
    You could call that
    the swap matrix.
  • 11:21 - 11:25
    We swapped row two for three.
  • 11:25 - 11:29
    And then here, we multiplied
    by elimination
  • 11:29 - 11:31
    matrix-- what did we do?
  • 11:31 - 11:34
    We eliminated this, so
    this was row three,
  • 11:34 - 11:36
    column two, 3, 2.
  • 11:36 - 11:39
    And then finally, to get here,
    we had to multiply by
  • 11:39 - 11:40
    elimination matrix.
  • 11:40 - 11:42
    We had to eliminate
    this right here.
  • 11:42 - 11:44
    So we eliminated row
    one, column three.
  • 11:47 - 11:50
    And I want you to know right
    now that it's not important
  • 11:50 - 11:51
    what these matrices are.
  • 11:51 - 11:53
    I'll show you how we can
    construct these matrices.
  • 11:53 - 11:56
    But I just want you to have kind
    of a leap of faith that
  • 11:56 - 11:59
    each of these operations could
    have been done by multiplying
  • 11:59 - 12:01
    by some matrix.
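One way to make this leap of faith concrete (a sketch in NumPy; the names E31, P23, E32, E13 are my own labels for the four steps done above, not standard notation):

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 2., 1.],
              [1., 1., 1.]])

# Each row operation is left-multiplication by an identity
# matrix with one small modification
E31 = np.eye(3); E31[2, 0] = -1.0   # row three <- row three - row one
P23 = np.eye(3)[[0, 2, 1]]          # swap rows two and three
E32 = np.eye(3); E32[2, 1] = -2.0   # row three <- row three - 2 * row two
E13 = np.eye(3); E13[0, 2] = -1.0   # row one <- row one - row three

# Applied in order, they reduce A to the identity...
product = E13 @ E32 @ P23 @ E31
# ...so their product must be A's inverse
print(product)
```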
  • 12:01 - 12:04
    But what we do know is by
    multiplying by all of these
  • 12:04 - 12:07
    matrices, we essentially got
    the identity matrix.
  • 12:07 - 12:08
    Back here.
  • 12:08 - 12:11
    So the combination of all of
    these matrices, when you
  • 12:11 - 12:14
    multiply them by each
    other, this must
  • 12:14 - 12:15
    be the inverse matrix.
  • 12:15 - 12:18
    If I were to multiply each of
    these elimination and row swap
  • 12:18 - 12:22
    matrices, this must be the
    inverse matrix of a.
  • 12:22 - 12:24
    Because if you multiply
    all of them times
  • 12:24 - 12:26
    a, you get the identity matrix.
  • 12:26 - 12:29
    Well what happened?
  • 12:29 - 12:32
    If these matrices are
    collectively the inverse
  • 12:32 - 12:36
    matrix, if I do them, if I
    multiply the identity matrix
  • 12:36 - 12:41
    times them-- the elimination
    matrix, this one times that
  • 12:41 - 12:41
    equals that.
  • 12:41 - 12:43
    This one times that
    equals that.
  • 12:43 - 12:45
    This one times that
    equals that.
  • 12:45 - 12:45
    And so forth.
  • 12:45 - 12:49
    I'm essentially multiplying--
    when you combine all of
  • 12:49 - 12:53
    these-- a inverse times
    the identity matrix.
  • 12:53 - 12:56
    So if you think about it just
    very big picture-- and I don't
  • 12:56 - 12:56
    want to confuse you.
  • 12:56 - 12:58
    It's good enough at this
    point if you just
  • 12:58 - 13:00
    understood what I did.
  • 13:00 - 13:04
    But what I'm doing from all of
    these steps, I'm essentially
  • 13:04 - 13:08
    multiplying both sides of this
    augmented matrix, you could
  • 13:08 - 13:10
    call it, by a inverse.
  • 13:10 - 13:13
    So I multiplied this by a
    inverse, to get to the
  • 13:13 - 13:14
    identity matrix.
  • 13:14 - 13:17
    But of course, if I multiplied
    the inverse matrix times the
  • 13:17 - 13:19
    identity matrix, I'll get
    the inverse matrix.
  • 13:19 - 13:21
    But anyway, I don't want
    to confuse you.
  • 13:21 - 13:22
    Hopefully that'll give you
    a little intuition.
  • 13:22 - 13:25
    I'll do this later with some
    more concrete examples.
  • 13:25 - 13:28
    But hopefully you see that this
    is a lot less hairy than
  • 13:28 - 13:30
    the way we did it with the
    adjoint and the cofactors and
  • 13:30 - 13:33
    the minor matrices and the
    determinants, et cetera.
  • 13:33 - 13:35
    Anyway, I'll see you
    in the next video.
Description:

Using Gauss-Jordan elimination to invert a 3x3 matrix.

Video Language:
English
Duration:
13:36
