
Big-Oh Notation (4 min)

  • 0:00 - 0:04
    In the following series of videos, we'll
    give a formal treatment of asymptotic
  • 0:04 - 0:09
    notation, in particular big-Oh notation, as
    well as work through a number of examples.
  • 0:09 - 0:13
    Big-Oh notation concerns functions defined
    on the positive integers, which we'll call T(n).
  • 0:13 - 0:18
    We'll pretty much always have the
    same semantics for T(n): we're going to be
  • 0:18 - 0:22
    concerned about the worst-case running
    time of an algorithm, as a function of the
  • 0:22 - 0:26
    input size, n. So, the question I want to
    answer for you in the rest of this video
  • 0:26 - 0:30
    is: what does it mean when we say that a
    function T(n) is big-Oh of f(n)?
  • 0:30 - 0:35
    Here, f(n) is some basic function, like,
    for example, n log n. So I'll give you a
  • 0:35 - 0:39
    number of answers, a number of ways of, to
    think about what big-Oh notation really
  • 0:39 - 0:43
    means. For starters let's begin with an
    English definition. What does it mean for
  • 0:43 - 0:48
    a function to be big-Oh of f(n)? It means
    eventually, for all sufficiently large
  • 0:48 - 0:52
    values of n, it's bounded above by a
    constant multiple of f(n). Let's think
  • 0:52 - 0:56
    about it in a couple of other ways. So next
    I'm going to translate this English
  • 0:56 - 1:00
    definition into a picture, and then I'll
    translate it into formal mathematics. So
  • 1:00 - 1:05
    pictorially you can imagine that perhaps
    we have T(n) denoted by this blue
  • 1:05 - 1:11
    function here. And perhaps f(n) is
    denoted by this green function here, which
  • 1:11 - 1:16
    lies below T(n). But when we double f(n), we get a function that eventually
  • 1:16 - 1:22
    crosses T(n) and forevermore is larger
    than it. So in this event, we would say
  • 1:22 - 1:26
    that T(n) indeed is big-Oh of f(n),
    the reason being that, for all sufficiently
  • 1:26 - 1:31
    large n (once we go far enough to the right
    on this graph), indeed, a constant
  • 1:31 - 1:35
    multiple of f(n), namely twice f(n), is an upper bound of T(n).
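The doubling picture can be sketched numerically. The video never names concrete functions, so the T(n) and f(n) below are hypothetical stand-ins, chosen only to reproduce the crossing behavior described:

```python
# Hypothetical stand-ins for the curves in the picture; the video's
# T(n) and f(n) are not specified, so these are illustrative only.
def T(n):
    return n + 10   # the blue function (worst-case running time)

def f(n):
    return n        # the green function, which lies below T(n)

# f(n) itself sits below T(n) everywhere on the tested range...
assert all(f(n) < T(n) for n in range(1, 100))

# ...but 2*f(n) crosses T(n) and stays above it forevermore,
# so T(n) is big-Oh of f(n) with the constant multiple 2.
crossing = next(n for n in range(1, 1000) if 2 * f(n) >= T(n))
print(crossing)  # 10, since 2n >= n + 10 exactly when n >= 10
assert all(2 * f(n) >= T(n) for n in range(crossing, 10_000))
```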
  • 1:35 - 1:40
    So finally, let me give you an actual
    mathematical definition that you could use
  • 1:40 - 1:45
    to do formal proofs. So how do we say, in
    mathematics, that eventually it should be
  • 1:45 - 1:49
    bounded above by a constant multiple of f(n)? We say that there exist two
  • 1:49 - 1:56
    constants, which I'll call c and n0, such
    that T(n) is no more than c times f(n)
  • 1:56 - 2:03
    for all n that exceed or equal n0.
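The formal definition translates almost directly into code. Here is a sketch (the function name and the finite search cap are my own; note that a finite check can refute the inequality on the tested range, but proving "for all n ≥ n0" takes algebra):

```python
def satisfies_big_oh_witness(T, f, c, n0, n_max=100_000):
    """Spot-check T(n) <= c * f(n) for all n0 <= n < n_max.

    This can only refute the definition on the tested range;
    the true "for all n >= n0" claim needs a pencil-and-paper
    argument.
    """
    return all(T(n) <= c * f(n) for n in range(n0, n_max))

# Hypothetical example: T(n) = 10n + 5 is big-Oh of n, witnessed by
# c = 11 and n0 = 5, since 10n + 5 <= 11n exactly when n >= 5.
print(satisfies_big_oh_witness(lambda n: 10 * n + 5, lambda n: n, c=11, n0=5))  # True

# A bad choice of constants fails: 10n + 5 <= 10n never holds.
print(satisfies_big_oh_witness(lambda n: 10 * n + 5, lambda n: n, c=10, n0=1))  # False
```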
    So, the role of these two constants is to
  • 2:03 - 2:07
    quantify what we mean by a constant
    multiple, and what we mean by sufficiently
  • 2:07 - 2:12
    large, in the English definition. c
    obviously quantifies the constant multiple
  • 2:12 - 2:17
    of f(n), and n0 is quantifying
    sufficiently large, that's the threshold
  • 2:17 - 2:21
    beyond which we insist that, c times f(n) is an upper-bound on T(n). So, going
  • 2:21 - 2:26
    back to the picture, what are c and n0? Well, c, of course, is just going to
  • 2:26 - 2:31
    be two. And n0 is the crossing point.
    So we look at where twice f(n) and T(n)
  • 2:31 - 2:36
    cross, and then we drop down to the horizontal axis.
    That would be the relevant value of n0
  • 2:36 - 2:39
    in this picture. So that's the formal
    definition: the way to prove that
  • 2:39 - 2:43
    something is big-Oh of f(n) is to
    exhibit these two constants c and n0,
  • 2:43 - 2:48
    and it better be the case that for all n at least
    n0, c times f(n) upper-bounds T(n).
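Such a proof can be checked mechanically. As a hedged worked example (not from the video), take the hypothetical T(n) = 3n + 8 and claim it is big-Oh of f(n) = n:

```python
# Algebra first: 3n + 8 <= 4n  <=>  8 <= n,
# so the constants c = 4 and n0 = 8 do the job.
c, n0 = 4, 8

def T(n):
    return 3 * n + 8

def f(n):
    return n

# Sanity-check the inequality on a finite range of n >= n0.
assert all(T(n) <= c * f(n) for n in range(n0, 100_000))

# Below the threshold n0 the inequality is allowed to fail,
# and here it does: at n = 1, T(1) = 11 but c * f(1) = 4.
print(T(1) <= c * f(1))  # False
```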
  • 2:48 - 2:52
    One way to think about it, if you're trying
    to establish that something is big-Oh of
  • 2:52 - 2:56
    some function, is that you're playing a
    game against an opponent, and you want to
  • 2:56 - 3:00
    prove that this inequality here holds, and
    your opponent wants to show that it doesn't
  • 3:00 - 3:05
    hold for sufficiently large n. You have to
    go first: your job is to pick a strategy in
  • 3:05 - 3:09
    the form of a constant c and a constant n0, and your opponent is then allowed to
  • 3:09 - 3:14
    pick any number n larger than n0. So the
    function is big-Oh of f(n) if and only if you
  • 3:14 - 3:18
    have a winning strategy in this game. If
    you can up front commit to constants c and
  • 3:18 - 3:22
    n0 so that, no matter how big an n
    your opponent picks, this inequality holds.
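This two-player framing can be sketched in code. The helper below is my own hypothetical construction: you commit to a strategy (c, n0), and the opponent then searches for a violating n (with a cap, since a program cannot search all integers):

```python
def opponent_finds_counterexample(T, f, c, n0, search_limit=100_000):
    """Return a violating n >= n0 if the opponent can win, else None.

    The search is capped, so a None result only means no
    counterexample was found below the cap, not a proof.
    """
    for n in range(n0, search_limit):
        if T(n) > c * f(n):
            return n
    return None

# With the hypothetical T(n) = 3n + 8 and f(n) = n, the strategy
# (c = 4, n0 = 8) wins: the opponent finds nothing.
print(opponent_finds_counterexample(lambda n: 3 * n + 8, lambda n: n, 4, 8))  # None

# But n^2 is not big-Oh of n: against the strategy (c = 100, n0 = 1),
# the opponent wins with n = 101, since 101^2 > 100 * 101.
print(opponent_finds_counterexample(lambda n: n * n, lambda n: n, 100, 1))  # 101
```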
  • 3:22 - 3:27
    If you have no winning strategy, then it's
    not big-Oh of f(n): no matter what c and n0
  • 3:27 - 3:31
    you choose, your opponent can always
    flip this inequality by choosing a
  • 3:31 - 3:36
    suitably large value of n. I
    want to emphasize one last thing, which is
  • 3:36 - 3:41
    these constants. What do I mean by
    constants? I mean they are independent of
  • 3:41 - 3:45
    n. And so when you apply this definition,
    and you choose your constant c and
  • 3:45 - 3:49
    n0, it better be that n does not appear
    anywhere. So c should just be something
  • 3:49 - 3:53
    like a thousand or a million, some
    constant independent of n. So those are a
  • 3:53 - 3:57
    bunch of ways to think about big-Oh
    notation. In English, you want it to be
  • 3:57 - 4:01
    bounded above for sufficiently large values
    of n. I showed you how to translate that
  • 4:01 - 4:04
    into mathematics, gave you a pictorial
    representation, and also a sort of game-
  • 4:04 - 4:08
    theoretic way to think about it. Now,
    let's move on to a video that explores a
  • 4:08 - 4:09
    number of examples.