
coursera - Design and Analysis of Algorithms I - 2.1 Big-Oh Notation

  • 0:00 - 0:03
    In the following series of videos, we'll
    give a formal treatment of asymptotic
  • 0:03 - 0:08
    notation, in particular big O notation, as
    well as work through a number of examples.
  • 0:08 - 0:12
    Big O notation concerns functions defined
    on the positive integers; we'll call it t
  • 0:12 - 0:17
    of n. We'll pretty much always have the
    same semantics for t of n. We're gonna be
  • 0:17 - 0:20
    concerned about the worst-case running
    time of an algorithm, as a function of the
  • 0:20 - 0:24
    input size, n. So, the question I wanna
    answer for you in the rest of this video,
  • 0:24 - 0:29
    is, what does it mean when we say a
    function, t of n, is big O of f of n. Or
  • 0:29 - 0:34
    f of n is some basic function, like
    for example n log n. So I'll give you a
  • 0:34 - 0:38
    number of answers, a number of ways to
    think about what big O notation really
  • 0:38 - 0:42
    means. For starters let's begin with an
    English definition. What does it mean for
  • 0:42 - 0:46
    a function to be big O of F of N? It means
    eventually, for all sufficiently large
  • 0:46 - 0:52
    values of N, it's bounded above by a
    constant multiple of F of N. Let's think
  • 0:52 - 0:55
    about it in a couple other ways. So next
    I'm gonna translate this English
  • 0:55 - 1:00
    definition into a picture, and then I'll
    translate it into formal mathematics. So
  • 1:00 - 1:05
    pictorially, you can imagine that perhaps
    we have t of n denoted by this blue
  • 1:05 - 1:11
    function here. And perhaps F of N is
    denoted by this green function here, which
  • 1:11 - 1:17
    lies below T of N. But when we double F of
    N, we get a function that eventually
  • 1:17 - 1:21
    crosses T of N and forevermore is larger
    than it. So in this event, we would say
  • 1:21 - 1:27
    that T of N indeed is big O of F of N,
    the reason being that for all sufficiently
  • 1:27 - 1:32
    large N, once we go far enough out to the right
    on this graph, indeed, the constant
  • 1:32 - 1:36
    multiple times F of N, namely twice F of N,
    is an upper bound on T of N.
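
A concrete version of this picture, as a minimal sketch (not from the lecture; the example functions t(n) = n + 20 and f(n) = n are my own choice): f(n) stays below t(n), but 2*f(n) overtakes t(n) at n = 20 and remains above it from then on.

    # Minimal numerical sketch in Python; t and f are hypothetical examples.
    def t(n):
        return n + 20   # plays the role of the blue function t(n)

    def f(n):
        return n        # plays the role of the green function f(n)

    c = 2               # the constant multiple used in the picture

    # Find the first n at which c * f(n) >= t(n); that crossing point serves as n0.
    n0 = next(n for n in range(1, 10**6) if c * f(n) >= t(n))
    print(n0)           # prints 20; for every n >= 20, 2 * f(n) >= t(n)
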
  • 1:36 - 1:39
    So finally, let me give you an actual
    mathematical definition that you could use
  • 1:39 - 1:44
    to do formal proofs. So how do we say, in
    mathematics, that eventually it should be
  • 1:44 - 1:48
    bounded above by a constant multiple of F
    of N? We say that there exist two
  • 1:48 - 1:56
    constants, which I'll call c and n naught, such
    that t of n is no more than C times F of
  • 1:56 - 2:02
    N, for all n that exceed or equal n naught.
    So, the role of these two constants is to
  • 2:02 - 2:05
    quantify what we mean by a constant
    multiple, and what we mean by sufficiently
  • 2:05 - 2:09
    large, in the English definition. C
    obviously quantifies the constant multiple
  • 2:09 - 2:13
    of f of n, and n naught is quantifying
    sufficiently large, that's the threshold
  • 2:13 - 2:19
    beyond which we insist that c times f of
    n is an upper bound on t of n.
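
Written out symbolically, the definition just described is (in LaTeX notation):

    T(n) = O(f(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ge 1 \ \text{such that}\ T(n) \le c \cdot f(n) \ \text{for all } n \ge n_0.
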
  • 2:19 - 2:25
    So, going back to the picture, what are c and
    n naught? Well, c, of course, is just going to
  • 2:25 - 2:30
    be two. And n naught is the crossing point.
    So we go to where two f of n and t of
  • 2:30 - 2:36
    n cross, and then we drop down; this would
    be the relevant value of n naught
  • 2:36 - 2:40
    in this picture. So that's the formal
    definition. The way to prove that
  • 2:40 - 2:44
    something's big O of f of n is you
    exhibit these two constants C and N naught,
  • 2:44 - 2:48
    and it better be the case that for all N
    at least N naught, C times F of N upper-bounds T of N.
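
For instance (an example of my own, not one worked in this video), to prove that 3n^2 + 5n is big O of n^2, one can exhibit c = 4 and n_0 = 5:

    3n^2 + 5n \;\le\; 3n^2 + n^2 \;=\; 4n^2 \;=\; c \cdot n^2 \qquad \text{whenever } 5n \le n^2, \text{ i.e., for all } n \ge n_0 = 5.
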
  • 2:48 - 2:52
    One way to think about it, if you're trying
    to establish that something is big O of
  • 2:52 - 2:56
    some function: it's like you're playing a
    game against an opponent, and you want to
  • 2:56 - 3:00
    prove that this inequality here holds, and
    your opponent must show that it doesn't
  • 3:00 - 3:05
    hold for sufficiently large N. You have to
    go first; your job is to pick a strategy in
  • 3:05 - 3:09
    the form of a constant C and a constant N
    naught, and your opponent is then allowed to
  • 3:09 - 3:14
    pick any number N larger than N naught. So the
    function is big O of f of n if and only if you
  • 3:14 - 3:18
    have a winning strategy in this game, if
    you can up front commit to constants C and
  • 3:18 - 3:23
    N naught so that no matter how big of an N
    your opponent picks, this inequality holds.
  • 3:23 - 3:27
    If you have no winning strategy, then it's
    not big O of F of N: no matter what C and N
  • 3:27 - 3:31
    naught you choose, your opponent can always
    flip this inequality by choosing a suitably large value of N.
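
A rough illustration of this game in code (a sketch of my own, not from the lecture): the "opponent" below searches a finite range for a counterexample, which can refute a bad choice of c and n0 but, of course, cannot substitute for a real proof that the inequality holds for all n.

    # Hypothetical helper: the opponent looks for some n > n0 with t(n) > c * f(n).
    def opponent_finds_counterexample(t, f, c, n0, search_limit=10**5):
        return any(t(n) > c * f(n) for n in range(n0 + 1, search_limit))

    # Example (my own): claiming n^2 is big O of n with c = 1000, n0 = 1 loses the
    # game, since the opponent may pick any n larger than 1000.
    print(opponent_finds_counterexample(lambda n: n * n, lambda n: n, 1000, 1))  # True
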
  • 3:31 - 3:35
    I want to emphasize one last thing, which is
  • 3:35 - 3:41
    that these constants, what do I mean by
    constants? I mean they are independent of
  • 3:41 - 3:45
    N. And so when you apply this definition,
    and you choose your constants C and N
  • 3:45 - 3:49
    naught, it better be that N does not appear
    anywhere. So C should just be something
  • 3:49 - 3:54
    like a thousand or a million, some
    constant independent of N.
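
To see why this independence matters (an illustration of my own, not from the lecture): if c were allowed to depend on n, we could "show" that n^2 is big O of n by taking c = n, since n^2 <= n * n. With c fixed, this fails, because

    n^2 > c \cdot n \quad \text{whenever } n > c,

so no fixed pair of constants c and n_0 works, and n^2 is indeed not big O of n.
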
  • 3:54 - 3:57
    So those are a bunch of ways to think about big O
    notation. In English, you wanna have it
  • 3:57 - 4:01
    bounded above for sufficiently large numbers
    N. I showed you how to translate that
  • 4:01 - 4:04
    into mathematics, and I gave you a pictorial
    representation, and also sort of a game-
  • 4:04 - 4:07
    theoretic way to think about it. Now,
    let's move on to a video that explores a
  • 4:07 -
    number of examples.