There is certainty in uncertainty | Brian Schmidt | TEDxCanberra

  • 0:11 - 0:13
    Last December,
  • 0:13 - 0:16
    my fellow Nobel Laureates and I
    were asked by a journalist
  • 0:16 - 0:20
    if there was one thing
    that we could teach the world,
  • 0:20 - 0:21
    what would it be?
  • 0:21 - 0:22
    And to my surprise,
  • 0:22 - 0:28
    two economists, two biologists,
    a chemist, and three physicists
  • 0:28 - 0:30
    gave the same answer.
  • 0:30 - 0:33
    And that answer was about uncertainty.
  • 0:33 - 0:36
    So I'm going to talk to you today
    about uncertainty.
  • 0:37 - 0:43
    To understand anything,
    you must understand its uncertainty.
  • 0:43 - 0:48
    Uncertainty is at the heart
    of the fabric of the Universe.
  • 0:48 - 0:51
    I'm going to illustrate this with a laser.
  • 0:52 - 0:54
    A laser puts out a small,
  • 0:54 - 0:58
    but not infinitesimally small
    point of light.
  • 0:58 - 1:01
    You might think that if I go through
  • 1:01 - 1:05
    and I try to make
    that point of light smaller
  • 1:05 - 1:09
    by, for example,
    bringing the two jaws of a slit together,
  • 1:09 - 1:11
    that I could make that point
    as small as I want.
  • 1:11 - 1:15
    I just want to make
    those slits closer and closer.
  • 1:16 - 1:19
    So let's see what happens
    when I do this for real.
  • 1:20 - 1:23
    My friends at Mount Stromlo got a call
  • 1:23 - 1:27
    and made up a nice little device here.
  • 1:28 - 1:33
    By essentially adjusting
    the laser and the slit,
  • 1:33 - 1:36
    we're going to go through
    and we are going to see what happens
  • 1:36 - 1:38
    when I close the jaws of the slit.
  • 1:39 - 1:40
    The more I close it,
  • 1:42 - 1:46
    instead of getting smaller,
    the laser gets spread out.
  • 1:47 - 1:51
    So it works exactly the opposite
    of what I was expecting.
  • 1:52 - 1:56
    And that's due to something known
    as Heisenberg's Uncertainty Principle.
  • 1:57 - 1:59
    Heisenberg's Uncertainty Principle states
  • 1:59 - 2:03
    that you can't know exactly
    where something is
  • 2:03 - 2:07
    and know its momentum at the same time.
  • 2:07 - 2:10
    Light's momentum is really its direction.
  • 2:11 - 2:16
    So, as I bring those slits
    closer and closer together,
  • 2:16 - 2:19
    I actually constrain where the light is.
  • 2:20 - 2:25
    But the quantum world says
    you can't do that.
  • 2:25 - 2:27
    The light then has an uncertain direction.
  • 2:27 - 2:32
    So instead of being a smaller point,
    the light has a randomness put out to it,
  • 2:32 - 2:35
    which is that pattern that we saw.
  • 2:37 - 2:42
    Many things in life you can think of
    as a series of little decisions.
  • 2:42 - 2:47
    For example, if I start at a point,
    and I can go left or right,
  • 2:47 - 2:51
    well, let's say I go left 50% of the time
    and right 50% of the time.
  • 2:51 - 2:56
    Let's say I have another decision tree
    down below that.
  • 2:56 - 3:01
    I can go left, I can go right,
    or I can go to the middle.
  • 3:01 - 3:04
    Because I've had two chances
    to go to the middle from above,
  • 3:04 - 3:07
    I would do that 50% of the time.
  • 3:07 - 3:11
    I only go one quarter to the left
    and one quarter all the way to the right.
  • 3:11 - 3:14
    And you can build up such a decision tree,
    and Pascal did this.
  • 3:14 - 3:17
    It's called Pascal's triangle.
  • 3:17 - 3:21
    You get a probability
    of where you are going to end up.
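The decision-tree arithmetic above can be sketched in a few lines of Python (an illustration, not part of the talk): row n of Pascal's triangle, divided by 2^n, gives the landing probabilities.

```python
# Sketch of the left/right decision tree described in the talk.
# Each row of Pascal's triangle, divided by 2**n, gives the probability
# of ending up at each final position after n fifty-fifty decisions.

def pascal_row(n):
    """Return row n of Pascal's triangle (row 0 is [1])."""
    row = [1]
    for _ in range(n):
        row = [a + b for a, b in zip([0] + row, row + [0])]
    return row

def landing_probabilities(n):
    """Probability of each landing slot after n 50/50 left/right choices."""
    total = 2 ** n
    return [c / total for c in pascal_row(n)]

# After two decisions: a quarter left, half in the middle, a quarter right,
# exactly as described in the talk.
print(landing_probabilities(2))  # [0.25, 0.5, 0.25]
```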
  • 3:21 - 3:23
    I brought something like this
    with me today.
  • 3:25 - 3:28
    It's this machine right here.
  • 3:29 - 3:33
    This is a machine you can put balls into
    and you can randomly see what happens.
  • 3:33 - 3:36
    So, for example, if I put a ball in here,
  • 3:36 - 3:40
    it'll bounce down
    and it'll end up somewhere.
  • 3:40 - 3:43
    It's essentially an enactment
    of Pascal's triangle.
  • 3:43 - 3:46
    I need two people
    from the audience to help me,
  • 3:46 - 3:49
    and I think I am going to have
    Sly and Jon right there
  • 3:49 - 3:51
    come up and help me if that's okay.
  • 3:51 - 3:52
    You know who you are.
  • 3:52 - 3:54
    (Laughter)
  • 3:54 - 3:55
    What they are going to do
  • 3:55 - 3:58
    is they are going to,
    as fast as they can -
  • 3:58 - 4:02
    faster than they are going right now,
    because I only have 18 minutes -
  • 4:02 - 4:03
    (Laughter)
  • 4:03 - 4:07
    put balls through this machine,
    and we're going to see what happens.
  • 4:08 - 4:10
    This machine counts the balls where they end up.
  • 4:10 - 4:13
    So you guys have to go through
    as fast as you can.
  • 4:13 - 4:15
    Work together,
  • 4:15 - 4:18
    and during the rest of my talk,
    you are going to build up this.
  • 4:18 - 4:21
    And the more you do,
    the better it is, okay?
  • 4:21 - 4:24
    So go for it, and I'll keep on going.
  • 4:24 - 4:25
    (Laughter)
  • 4:25 - 4:26
    Alright.
  • 4:26 - 4:30
    It turns out that if you have
    a series of random events in life,
  • 4:30 - 4:34
    you end up with something
    called a bell-shaped curve,
  • 4:34 - 4:38
    which we also call a Normal Distribution
    or Gaussian Distribution.
  • 4:39 - 4:42
    So, for example, if you have
    just a few random events,
  • 4:42 - 4:45
    you don't get something
    that really looks like that.
  • 4:45 - 4:46
    But if you do more and more,
  • 4:46 - 4:50
    they add up to give you
    this very characteristic pattern
  • 4:50 - 4:54
    which Gauss famously
    wrote down mathematically.
  • 4:54 - 4:55
    It turns out that in most cases
  • 4:55 - 5:01
    a series of random events
    gives you this bell-shaped curve.
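The Galton-board demonstration running on stage can be mimicked in code (a sketch with made-up parameters, not the talk's actual machine): each simulated ball makes a run of 50/50 bounces, and the counts pile up into the bell shape.

```python
# Sketch of the ball machine: each ball's final slot is simply
# the number of rightward bounces it happened to take.
import random

def drop_balls(n_balls, n_rows, seed=0):
    """Drop n_balls through n_rows of 50/50 pegs; return counts per slot."""
    rng = random.Random(seed)
    slots = [0] * (n_rows + 1)
    for _ in range(n_balls):
        position = sum(rng.random() < 0.5 for _ in range(n_rows))
        slots[position] += 1
    return slots

counts = drop_balls(10_000, 10)
# The counts pile up into the characteristic bell shape:
# the middle slots are the most popular, the extremes are rare.
```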
  • 5:02 - 5:04
    It doesn't really matter what it is.
  • 5:04 - 5:06
    For example, if I were going to go out
  • 5:06 - 5:10
    and have a million scales across Australia
  • 5:10 - 5:13
    measure my weight.
  • 5:13 - 5:14
    Well, there's some randomness to that,
  • 5:14 - 5:19
    and you'll get a bell-shaped curve
    of what my weight actually is.
  • 5:20 - 5:22
    If I were instead to go through
  • 5:22 - 5:25
    and ask a million Australian males
    what their weight is,
  • 5:25 - 5:26
    and actually measure it,
  • 5:26 - 5:29
    I would also get a bell-shaped curve,
  • 5:29 - 5:32
    because that is also made up
    of a series of random events
  • 5:32 - 5:34
    which determine people's weight.
  • 5:35 - 5:38
    So the way a bell-shaped curve
    is characterized
  • 5:38 - 5:42
    is by its mean -
    that's the most likely value -
  • 5:42 - 5:46
    and its width, which we call
    a standard deviation.
  • 5:47 - 5:49
    This is a very important concept
  • 5:49 - 5:53
    because from the width,
    and how far you are from the mean,
  • 5:53 - 5:54
    you can characterize
  • 5:54 - 5:57
    the likelihood of things occurring.
  • 5:58 - 6:02
    So it turns out if you are within
    one standard deviation,
  • 6:02 - 6:06
    that happens 68.3% of the time.
  • 6:06 - 6:10
    I'm going to illustrate how this works
    with a worked example in just a second.
  • 6:11 - 6:15
    If you're within two standard deviations,
    that happens 95.4% of the time;
  • 6:15 - 6:16
    you're within two.
  • 6:16 - 6:20
    99.73% within three standard deviations.
  • 6:20 - 6:25
    This is a very powerful way for us
    to describe things in the world.
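Those coverage figures follow directly from the Gaussian itself; a quick check (not from the talk) uses the error function:

```python
# Sketch: the probability of landing within k standard deviations
# of the mean of a Gaussian is erf(k / sqrt(2)).
import math

def within_k_sigma(k):
    """Fraction of a normal distribution within k standard deviations."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"{k} sigma: {within_k_sigma(k) * 100:.2f}%")
# Reproduces the 68.3%, 95.4%, and 99.73% quoted in the talk.
```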
  • 6:25 - 6:29
    So, it turns out this means
    that I can go out
  • 6:29 - 6:32
    and make a measurement of, for example,
  • 6:32 - 6:33
    how much I weigh,
  • 6:33 - 6:36
    and if I use more and more
    scales in Australia,
  • 6:36 - 6:39
    I will get a better and better answer,
  • 6:39 - 6:41
    provided they are good scales.
  • 6:42 - 6:47
    It turns out the more trials I do,
    or the more measurements I make,
  • 6:47 - 6:49
    the better I will make that measurement.
  • 6:49 - 6:52
    And the accuracy increases
  • 6:52 - 6:57
    as the square root of the number
    of times I make the measurement.
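The square-root law can be seen in a small simulation (the weight and scale-noise numbers are illustrative assumptions, not from the talk):

```python
# Sketch: averaging N noisy measurements shrinks the uncertainty in the
# average roughly as 1/sqrt(N), so 4x the measurements halves the error.
import random
import statistics

def averaged_estimate(n, rng, true_value=80.0, noise=2.0):
    """Average of n noisy scale readings of a true weight (made-up numbers)."""
    return statistics.fmean(rng.gauss(true_value, noise) for _ in range(n))

def spread_of_estimate(n, trials=300, seed=1):
    """How much the averaged estimate wobbles across repeated experiments."""
    rng = random.Random(seed)
    return statistics.stdev(averaged_estimate(n, rng) for _ in range(trials))

# Quadrupling the number of measurements roughly halves the spread,
# which is the square-root law the talk states.
```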
  • 6:57 - 6:59
    That's why I am having these guys
    do what they are doing
  • 6:59 - 7:00
    as fast as they can.
  • 7:00 - 7:01
    (Laughter)
  • 7:01 - 7:05
    So let's apply this
    to a real world problem we all see:
  • 7:05 - 7:08
    the approval rating
    of the Prime Minister of Australia.
  • 7:09 - 7:11
    Over the past 15 months,
  • 7:11 - 7:14
    every couple of weeks, we see Newspoll
  • 7:14 - 7:19
    go out and ask the people of Australia:
    "Do you approve of the Prime Minister?"
  • 7:19 - 7:20
    Over the last 15 months,
  • 7:20 - 7:24
    they have done this 28 times,
    and they asked 1100 people.
  • 7:24 - 7:27
    They don't ask
    about 22 million Australians
  • 7:27 - 7:29
    because it's too expensive to do that,
  • 7:29 - 7:31
    so they ask 1100 people.
  • 7:31 - 7:33
    The square root of 1100 is 33,
  • 7:33 - 7:36
    and so it turns out
    their answers are uncertain
  • 7:36 - 7:41
    by plus or minus 33 people
    when they asked these 1100 people.
  • 7:41 - 7:45
    That's a 3% error.
    That's 33 divided by 1100.
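That back-of-the-envelope polling arithmetic fits in a few lines (a sketch of the talk's rule of thumb, not a full sampling-theory treatment):

```python
# Sketch: with N respondents, the counting noise is about sqrt(N) people,
# which as a fraction of the sample is 1/sqrt(N).
import math

def margin_of_error(sample_size):
    """Rough one-sigma sampling error as a fraction, per the rule of thumb."""
    return math.sqrt(sample_size) / sample_size  # == 1 / sqrt(sample_size)

print(round(math.sqrt(1100)))              # about 33 people
print(round(margin_of_error(1100) * 100))  # about a 3% error
```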
  • 7:45 - 7:47
    So let's see what they get.
  • 7:47 - 7:49
    Here is last fifteen months.
  • 7:49 - 7:53
    You can see it seems that some time
    in the middle of the last year
  • 7:53 - 7:56
    the Prime Minister had a very bad week,
  • 7:56 - 8:00
    followed a few weeks later
    by what appears to be a very good week.
  • 8:01 - 8:05
    Of course, you could look at it
    in another way.
  • 8:05 - 8:08
    You could say, "What would it look like
    if the Prime Minister's popularity
  • 8:08 - 8:13
    hadn't changed at all
    in the last fifteen months?"
  • 8:13 - 8:15
    Well, then there's an average,
  • 8:15 - 8:20
    and that mean turns out
    to be 29.6% for this set of polls.
  • 8:20 - 8:23
    So she hasn't been very popular
    over the last 15 months.
  • 8:23 - 8:28
    And we know, on the basis
    of the bell curve,
  • 8:28 - 8:31
    that 68.3% of the time
  • 8:31 - 8:34
    it should lie within plus or minus 3%,
  • 8:34 - 8:38
    because of the number
    of people we're asking.
  • 8:38 - 8:44
    So that means we expect
    between 15 and 23 of the 28 polls
  • 8:44 - 8:47
    to lie within plus or minus 3%.
  • 8:47 - 8:50
    And the actual number of times is 24.
  • 8:51 - 8:53
    What about those really extreme cases
  • 8:53 - 8:57
    when she seems to have
    a really bad or really good week?
  • 8:57 - 9:02
    Well, you actually expect
    zero to two times,
  • 9:02 - 9:03
    so 5% of the time,
  • 9:03 - 9:07
    to be more than 6% discrepant
    from the mean.
  • 9:07 - 9:08
    And what do we see?
  • 9:09 - 9:10
    Two.
  • 9:10 - 9:13
    In other words, over the last 15 months
  • 9:13 - 9:16
    the polls are completely consistent
  • 9:16 - 9:20
    with the Prime Minister's popularity
    not changing a bit.
  • 9:22 - 9:25
    Alright. And let's see what the news is.
  • 9:25 - 9:27
    For example, just last week.
  • 9:27 - 9:31
    Well, a big headline in The Australian:
    the approval rating
  • 9:31 - 9:34
    dropped from 29% to 27%,
  • 9:34 - 9:39
    even though the error on that
    is at least 3% even for that single poll.
  • 9:39 - 9:41
    It's not just The Australian that does this;
  • 9:41 - 9:43
    it's all the news agencies.
  • 9:45 - 9:47
    Now, the other thing is that
  • 9:47 - 9:49
    Newspoll is not the only organization
    that does this.
  • 9:49 - 9:52
    For example, Nielsen
    does this for Fairfax,
  • 9:52 - 9:54
    and here are their polls.
  • 9:54 - 9:56
    Same question,
  • 9:56 - 9:59
    and you'll see that it seems
    that they are also consistent
  • 9:59 - 10:03
    with the Prime Minister's popularity
    not changing over time.
  • 10:03 - 10:05
    But they seem to get a different answer.
  • 10:05 - 10:09
    They get 36.5% approval over that period.
  • 10:10 - 10:13
    We are not talking about 1,000 people here
  • 10:13 - 10:15
    when we compare these two things.
  • 10:15 - 10:17
    We're talking about 30,000,
  • 10:17 - 10:19
    because we get to add up all those people.
  • 10:19 - 10:23
    So, the uncertainty in these measurements
    is well less than 1%,
  • 10:23 - 10:26
    and yet they disagree by 6%.
  • 10:26 - 10:29
    That's because
    not all uncertainty is random.
  • 10:29 - 10:34
    It can be due to simple
    mistakes or errors.
  • 10:34 - 10:39
    It turns out it's really hard to ask
    1,100 people across Australia
  • 10:39 - 10:43
    who are representative
    of the average Australian.
  • 10:43 - 10:47
    So, there is an additional uncertainty
    caused not by randomness
  • 10:47 - 10:52
    but by a systematic scientific
    or polling error, which we see here.
  • 10:52 - 10:54
    You might ask yourself,
  • 10:54 - 10:57
    "Why don't they just ask more people,
    like 10,000 people,
  • 10:57 - 11:00
    less frequently, once a month?"
  • 11:00 - 11:01
    And a cynic might say
  • 11:01 - 11:06
    because there’s no news in telling people
    that the popularity is the same
  • 11:06 - 11:08
    month after month after month.
  • 11:08 - 11:10
    (Laughter)
  • 11:10 - 11:11
    Alright.
  • 11:11 - 11:14
    Not all things, though,
    become more accurate
  • 11:14 - 11:16
    the more you measure them,
  • 11:16 - 11:20
    and we describe such systems
    as exhibiting chaotic behavior.
  • 11:20 - 11:25
    I happen to have something
    that exhibits chaotic behavior here,
  • 11:25 - 11:28
    which is a double pendulum.
  • 11:28 - 11:29
    A double pendulum -
  • 11:29 - 11:33
    this was made up for me
    by the people at Questacon,
  • 11:33 - 11:35
    and I thank them for that.
  • 11:35 - 11:39
    A double pendulum is just two pendulums
    connected to each other.
  • 11:39 - 11:43
    And the beautiful thing is
    this doesn't always exhibit chaos.
  • 11:43 - 11:44
    Let me show you what happens here.
  • 11:44 - 11:46
    If I just start this thing,
  • 11:46 - 11:49
    these things swing
    back and forth in unison
  • 11:49 - 11:52
    because there is no chaos here.
  • 11:52 - 11:55
    If I make measurements,
    better and better measurements,
  • 11:55 - 11:59
    I can predict exactly
    what is going on here.
  • 11:59 - 12:01
    The better I do, the better I will know
  • 12:01 - 12:04
    where the pendulum is going to be
    in the future.
  • 12:04 - 12:08
    But if I take a double pendulum
    and I swing it a lot,
  • 12:08 - 12:11
    then something else happens.
  • 12:11 - 12:13
    They don't do the same thing,
  • 12:13 - 12:15
    and there is nothing I can do,
  • 12:15 - 12:17
    no matter how many measurements I make,
  • 12:17 - 12:22
    that will let me predict what will happen
    with these pendulums,
  • 12:22 - 12:28
    because infinitesimally small differences
    lead to different outcomes.
  • 12:28 - 12:30
    Not all is lost here.
  • 12:30 - 12:32
    It turns out there are things
    we can learn.
  • 12:32 - 12:37
    For example, I can know
    through my measurements,
  • 12:37 - 12:41
    what the likelihood of the things
    swinging all the way around is,
  • 12:41 - 12:43
    how often that's going to happen.
  • 12:43 - 12:45
    So, you can know things
    about chaotic systems,
  • 12:45 - 12:49
    but you cannot predict exactly
    what they're going to do.
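The sensitivity he describes can be illustrated with a different, simpler chaotic system, the logistic map (a stand-in for the pendulum, not the talk's actual demonstration):

```python
# A stand-in sketch (not the talk's pendulum): the logistic map is a
# textbook chaotic system with the same sensitivity. Two runs that start
# almost identically end up bearing no resemblance to each other.

def logistic_trajectory(x0, steps, r=4.0):
    """Iterate x -> r*x*(1-x); r=4 is the fully chaotic regime."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2, 50)
b = logistic_trajectory(0.2 + 1e-10, 50)  # nudged by one part in ten billion

# The tiny nudge grows roughly exponentially, so the late parts of the
# two trajectories are completely different.
divergence = max(abs(x - y) for x, y in zip(a[-10:], b[-10:]))
```

Statistical properties of the map are still predictable, which mirrors the point in the talk: you can know things about a chaotic system without predicting any single trajectory.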
  • 12:50 - 12:53
    Alright, so what is a chaotic system
    that we are used to?
  • 12:53 - 12:59
    Well, it turns out the Earth's climate
    is a good example of a chaotic system.
  • 12:59 - 13:03
    I show you here the temperature record
    from Antarctic ice cores
  • 13:03 - 13:06
    over the last 650,000 years.
  • 13:06 - 13:10
    You can see in grey regions times
    when the Earth is quite warm,
  • 13:10 - 13:13
    and then it seemingly cools down.
  • 13:13 - 13:15
    And why does it do that?
  • 13:15 - 13:21
    It's a chaotic process that is related
    to how the Earth goes around the Sun
  • 13:21 - 13:24
    in a quite complex way.
  • 13:24 - 13:28
    So it's very difficult to predict exactly
    what the Earth is going to do
  • 13:28 - 13:29
    at any given time.
  • 13:31 - 13:34
    Also, it's just hard to measure
    what's going on with the Earth.
  • 13:34 - 13:36
    For the last thousand years,
  • 13:36 - 13:39
    here are temperature reconstructions
    from different groups.
  • 13:39 - 13:41
    You can see over the last thousand years,
  • 13:41 - 13:45
    we get quite different answers
    back in time.
  • 13:45 - 13:48
    We more or less agree
    where we have better information,
  • 13:48 - 13:51
    which is in the last hundred years or so,
  • 13:51 - 13:55
    that the Earth has warmed up
    about 8/10 of a degree.
  • 13:56 - 13:58
    So, modeling and measuring
    the climate is hard.
  • 13:59 - 14:02
    The consensus view of just using the data
  • 14:02 - 14:07
    is that we are 90% sure
    that the warming trend is not an accident,
  • 14:07 - 14:08
    that it is actually caused
  • 14:08 - 14:13
    by anthropogenic
    or man-made carbon dioxide.
  • 14:13 - 14:15
    As a scientist trying
    to make an experiment,
  • 14:15 - 14:20
    90% isn't a very good result.
  • 14:20 - 14:21
    You're not very sure about it.
  • 14:21 - 14:25
    However, if someone's trying to figure out
    the future of my life,
  • 14:25 - 14:27
    90% is a pretty big risk factor.
  • 14:27 - 14:31
    So those are two very different
    perspectives on the same number.
  • 14:31 - 14:33
    But from my point as a scientist,
  • 14:33 - 14:37
    I am 99.99999% sure
  • 14:37 - 14:42
    that physics tells us
    that adding CO2 to the atmosphere
  • 14:42 - 14:46
    causes sunlight to be more effectively
    trapped in our atmosphere,
  • 14:46 - 14:48
    raising the temperature a bit.
  • 14:48 - 14:50
    The hard part is -
  • 14:50 - 14:52
    and what we are much less sure of -
  • 14:52 - 14:54
    is how many clouds there are going to be,
  • 14:54 - 14:56
    how much water vapor will be released,
  • 14:56 - 14:58
    which warms the Earth up even more,
  • 14:58 - 15:00
    how many methane releases will follow,
  • 15:00 - 15:04
    and precisely how the oceans
    will interact with all this
  • 15:04 - 15:07
    to trap the CO2 and hold the warmth.
  • 15:07 - 15:13
    Of course, we have no idea really
    how much CO2 we will emit into the future.
  • 15:14 - 15:16
    So here is our best estimate.
  • 15:17 - 15:19
    The red curve shows
    what we think will happen
  • 15:19 - 15:24
    if we don't do anything
    about our CO2 emission into the future.
  • 15:24 - 15:25
    We're going to burn more and more
  • 15:25 - 15:28
    as we become
    more and more developed as a world.
  • 15:28 - 15:33
    The blue line shows a very aggressive
    carbon reduction strategy
  • 15:33 - 15:36
    proposed by the IPCC.
  • 15:37 - 15:39
    And then we can estimate
    using our best physics
  • 15:39 - 15:41
    of what we think is going to happen.
  • 15:41 - 15:44
    Here is the outcome of the two ideas.
  • 15:44 - 15:49
    The blue curve shows what happens
    if we do that very aggressive drop.
  • 15:49 - 15:52
    It keeps the rise of temperature
    over the next century
  • 15:52 - 15:57
    to less than 2 degrees C
    with about 90% confidence.
  • 15:58 - 16:00
    On the other hand,
    if we let things keep going,
  • 16:00 - 16:04
    the best prediction is, of course,
    that it's going to get warmer and warmer,
  • 16:04 - 16:09
    with a great deal of uncertainty
    of about exactly how warm we'll go.
  • 16:09 - 16:12
    The Australian
    Academy of Science
  • 16:12 - 16:16
    says, "Expect climate surprises,"
  • 16:16 - 16:17
    and we should,
  • 16:17 - 16:20
    because the Earth's climate
    is a chaotic system.
  • 16:20 - 16:23
    We don't exactly know
    what it's going to do,
  • 16:23 - 16:26
    and that is what scares
    the hell out of me.
  • 16:27 - 16:30
    So, life is not black and white.
  • 16:31 - 16:34
    Life is really shades of grey.
  • 16:35 - 16:37
    But it's not all bad.
  • 16:37 - 16:39
    You guys have done an excellent job,
  • 16:39 - 16:41
    so what I want you to do now is to stop,
  • 16:41 - 16:44
    and we are going to read out
    your numbers here,
  • 16:44 - 16:45
    and we're going to compare them
  • 16:45 - 16:49
    to what I predicted
    we were going to get, okay?
  • 16:49 - 16:54
    So I have here hopefully
    a functioning computer.
  • 16:54 - 16:58
    So what I need you to do
    is to just go through from the left
  • 16:58 - 17:00
    and read out the numbers
    that you have achieved.
  • 17:01 - 17:02
    Assistant: 5. Brian Schmidt: 5.
  • 17:03 - 17:04
    A: 10. BS: 10.
  • 17:04 - 17:07
    A: 21. BS: 21.
  • 17:07 - 17:12
    A: 21. BS: 21 again? A: That's right. 24.
  • 17:12 - 17:15
    BS: 24? A: Yes. Then 30. BS: 30.
  • 17:15 - 17:17
    A: 37. BS: 37.
  • 17:17 - 17:19
    A: 47. BS: 47.
  • 17:19 - 17:21
    A: 41. BS: 41.
  • 17:21 - 17:23
    A: 43. BS: 43.
  • 17:23 - 17:25
    A: 29. BS: 29.
  • 17:25 - 17:27
    A: 21. BS: 21.
  • 17:27 - 17:29
    A: 8. BS: 8.
  • 17:29 - 17:31
    A: 10. BS: 10.
  • 17:31 - 17:33
    A: 3. BS: 3.
  • 17:33 - 17:38
    Well, I am proud to say
    you guys were completely random.
  • 17:38 - 17:39
    It was perfect.
  • 17:39 - 17:40
    (Laughter)
  • 17:40 - 17:45
    I show here the prediction
    of what should happen and what happened.
  • 17:45 - 17:46
    Bang on.
  • 17:46 - 17:48
    (Applause)
  • 17:48 - 17:51
    There is certainty in uncertainty,
  • 17:51 - 17:52
    (Laughter)
  • 17:52 - 17:54
    and that is the beauty of it.
  • 17:54 - 18:00
    But to make policy decisions
    based on what we know about science,
  • 18:00 - 18:02
    about what we know about economics,
  • 18:02 - 18:07
    requires our politicians,
    our policy makers, and our citizens
  • 18:07 - 18:09
    to understand uncertainty.
  • 18:09 - 18:12
    I'm going to finish with the words
    of Richard Feynman,
  • 18:12 - 18:15
    with words that really could be my own,
  • 18:15 - 18:18
    which is, "I can live
    with doubt and uncertainty.
  • 18:18 - 18:21
    I think it's much more interesting
    to live not knowing
  • 18:21 - 18:24
    than to have answers
    which might be wrong."
  • 18:24 - 18:25
    Thank you very much.
  • 18:25 - 18:28
    (Applause)
  • 18:29 - 18:31
    Thank you. Excellent.
  • 18:31 - 18:33
    (Applause)
Description:

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx

At TEDxCanberra 2012, Nobel Prize for Physics recipient, Professor Brian Schmidt, provides a live, engaging and practical demonstration of just how uncertainty works in the real world.

With a little help from some students in the audience, Professor Schmidt explains how uncertainty operates in our daily lives, including inaccurate media coverage of political polling and efforts to understand climate change.

Video Language:
English
Team:
closed TED
Project:
TEDxTalks
Duration:
18:41