There is certainty in uncertainty | Brian Schmidt | TEDxCanberra
-
0:11 - 0:13Last December,
-
0:13 - 0:16my fellow Nobel Laureates and I
were asked by a journalist -
0:16 - 0:20if there was one thing
that we could teach the world, -
0:20 - 0:21what would it be?
-
0:21 - 0:22And to my surprise,
-
0:22 - 0:28two economists, two biologists,
a chemist, and three physicists -
0:28 - 0:30gave the same answer.
-
0:30 - 0:33And that answer was about uncertainty.
-
0:33 - 0:36So I'm going to talk to you today
about uncertainty. -
0:37 - 0:43To understand anything,
you must understand its uncertainty. -
0:43 - 0:48Uncertainty is at the heart
of the fabric of the Universe. -
0:48 - 0:51I'm going to illustrate this with a laser.
-
0:52 - 0:54A laser puts out a small,
-
0:54 - 0:58but not infinitesimally small
point of light. -
0:58 - 1:01You might think that if I go through
-
1:01 - 1:05and I try to make
that point of light smaller -
1:05 - 1:09by, for example,
bringing the two jaws of a slit together, -
1:09 - 1:11that I could make that point
as small as I want. -
1:11 - 1:15I just want to make
those slits closer and closer. -
1:16 - 1:19So let's see what happens
when I do this for real. -
1:20 - 1:23I gave my friends at Mount Stromlo a call,
-
1:23 - 1:27and they made up a nice
little device here. -
1:28 - 1:33By essentially adjusting
the laser, the slit - -
1:33 - 1:36we're going to go through
and we are going to see what happens -
1:36 - 1:38when I close the jaws of the slit.
-
1:39 - 1:40The more I close it,
-
1:42 - 1:46instead of getting smaller,
the laser spot gets spread out. -
1:47 - 1:51So it works exactly the opposite
of what I was expecting. -
1:52 - 1:56And that's due to something known
as Heisenberg's Uncertainty Principle. -
1:57 - 1:59Heisenberg's Uncertainty Principle states
-
1:59 - 2:03that you can't know exactly
where something is -
2:03 - 2:07and know its momentum at the same time.
-
2:07 - 2:10Light's momentum is really its direction.
-
2:11 - 2:16So, as I bring those slits
closer and closer together, -
2:16 - 2:19I actually constrain where the light is.
-
2:20 - 2:25But the quantum world says
you can't do that. -
2:25 - 2:27The light then has an uncertain direction.
-
2:27 - 2:32So instead of being a smaller point,
the light has a randomness put out to it, -
2:32 - 2:35which is that pattern that we saw.
-
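(A rough sketch of the relation at work here, added for reference: taking the slit width as the position uncertainty Δx and the spread in transverse momentum as Δp_x, with photon momentum p = h/λ, the uncertainty principle gives approximately)

```latex
\Delta x \,\Delta p_x \;\gtrsim\; \frac{\hbar}{2}
\quad\Longrightarrow\quad
\theta \;\approx\; \frac{\Delta p_x}{p} \;\gtrsim\; \frac{\hbar}{2\,\Delta x\, p} \;=\; \frac{\lambda}{4\pi\,\Delta x}
```

so closing the jaws (shrinking Δx) forces a larger angular spread θ, which is the widening pattern seen on the screen.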
2:37 - 2:42Many things in life you can think of
as a series of little decisions. -
2:42 - 2:47For example, if I start at a point,
and I can go left or right, -
2:47 - 2:51well, let's say I go left 50% of the time
and right 50% of the time. -
2:51 - 2:56Let's say I have another decision tree
down below that. -
2:56 - 3:01I can go left, I can go right,
or I can go to the middle. -
3:01 - 3:04Because I've had two chances
to go to the middle from above, -
3:04 - 3:07I would do that 50% of the time.
-
3:07 - 3:11I only go one quarter to the left
and one quarter all the way to the right. -
3:11 - 3:14And you can build up such a decision tree,
and Pascal did this. -
3:14 - 3:17It's called Pascal's triangle.
-
3:17 - 3:21You get a probability
of where you are going to end up. -
3:21 - 3:23I brought something like this
with me today. -
3:25 - 3:28It's this machine right here.
-
3:29 - 3:33This is a machine you can put balls into
and you can randomly see what happens. -
3:33 - 3:36So, for example, if I put a ball in here,
-
3:36 - 3:40it'll bounce down
and it'll end up somewhere. -
3:40 - 3:43It's essentially an enactment
of Pascal's triangle. -
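(A minimal sketch of the decision tree he describes, added for illustration: each slot at the bottom of such a board can be reached by a binomial-coefficient number of left/right paths, and dividing by 2 to the number of rows gives the landing probabilities of Pascal's triangle.)

```python
from math import comb

def pascal_row(n_rows: int) -> list[int]:
    """Number of distinct left/right paths into each slot after n_rows of pegs."""
    return [comb(n_rows, k) for k in range(n_rows + 1)]

def landing_probabilities(n_rows: int) -> list[float]:
    """Chance of ending in each slot when left and right are equally likely."""
    total = 2 ** n_rows
    return [paths / total for paths in pascal_row(n_rows)]

# Two rows of pegs, as in the talk: 1/4 left, 1/2 middle, 1/4 right.
print(landing_probabilities(2))   # [0.25, 0.5, 0.25]
```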
3:43 - 3:46I need two people
from the audience to help me, -
3:46 - 3:49and I think I am going to have
Sly and Jon right there -
3:49 - 3:51come up and help me if that's okay.
-
3:51 - 3:52You know who you are.
-
3:52 - 3:54(Laughter)
-
3:54 - 3:55What they are going to do
-
3:55 - 3:58is they are going to,
as fast as they can - -
3:58 - 4:02faster than they are going right now,
because I only have 18 minutes - -
4:02 - 4:03(Laughter)
-
4:03 - 4:07put balls through this machine,
and we're going to see what happens. -
4:08 - 4:10This machine counts where the balls end up.
-
4:10 - 4:13So you guys have to go through
as fast as you can. -
4:13 - 4:15Work together,
-
4:15 - 4:18and during the rest of my talk,
you are going to build up this. -
4:18 - 4:21And the more you do,
the better it is, okay? -
4:21 - 4:24So go for it, and I'll keep on going.
-
4:24 - 4:25(Laughter)
-
4:25 - 4:26Alright.
-
4:26 - 4:30It turns out that if you have
a series of random events in life, -
4:30 - 4:34you end up with something
called a Bell Shaped Curve, -
4:34 - 4:38which we also call a Normal Distribution
or Gaussian Distribution. -
4:39 - 4:42So, for example, if you have
just a few random events, -
4:42 - 4:45you don't get something
that really looks like that. -
4:45 - 4:46But if you do more and more,
-
4:46 - 4:50they add up to give you
this very characteristic pattern -
4:50 - 4:54which Gauss famously
wrote down mathematically. -
4:54 - 4:55It turns out that in most cases
-
4:55 - 5:01a series of random events
gives you this bell-shaped curve. -
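(A minimal simulation of the ball machine, assuming an idealized board with 12 rows of pegs and fair left/right bounces — the row count is my choice, not the machine's: a handful of balls looks ragged, while thousands trace out the bell shape.)

```python
import random
from collections import Counter

def drop_balls(n_balls: int, n_rows: int = 12, seed: int = 0) -> Counter:
    """Each ball takes n_rows independent left/right bounces; count where it lands."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n_balls):
        slot = sum(rng.random() < 0.5 for _ in range(n_rows))  # number of rightward bounces
        counts[slot] += 1
    return counts

# Few balls: a ragged histogram.  Many balls: the characteristic bell-shaped curve.
for n in (20, 20000):
    counts = drop_balls(n)
    print(n, [counts.get(slot, 0) for slot in range(13)])
```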
5:02 - 5:04It doesn't really matter what it is.
-
5:04 - 5:06For example, if I were going to go out
-
5:06 - 5:10and have a million scales across Australia
-
5:10 - 5:13measure my weight.
-
5:13 - 5:14Well, there's some randomness to that,
-
5:14 - 5:19and you'll get a bell-shaped curve
of what my weight actually is. -
5:20 - 5:22If I were instead to go through
-
5:22 - 5:25and ask a million Australian males
what their weight is, -
5:25 - 5:26and actually measure it,
-
5:26 - 5:29I would also get a bell-shaped curve,
-
5:29 - 5:32because that is also made up
of a series of random events -
5:32 - 5:34which determine people's weight.
-
5:35 - 5:38So the way a bell-shaped curve
is characterized -
5:38 - 5:42is by its mean -
that's the most likely value - -
5:42 - 5:46and its width, which we call
a standard deviation. -
5:47 - 5:49This is a very important concept
-
5:49 - 5:53because from the width
and how close you are to the mean, -
5:53 - 5:54you can characterize
-
5:54 - 5:57the likelihood of things occurring.
-
5:58 - 6:02So it turns out if you are within
one standard deviation, -
6:02 - 6:06that happens 68.3% of the time.
-
6:06 - 6:10I'm going to illustrate how this works
with a worked example in just a second. -
6:11 - 6:15If you have two standard deviations,
that happens 95.4% of the time; -
6:15 - 6:16you're within two.
-
6:16 - 6:20And 99.73% of the time, you're within three standard deviations.
-
6:20 - 6:25This is a very powerful way for us
to describe things in the world. -
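(Those coverage figures fall straight out of the normal distribution; here is a quick check of the numbers he quotes, using only the standard library.)

```python
from math import erf, sqrt

def within_k_sigma(k: float) -> float:
    """Probability that a normally distributed value lies within k standard deviations of the mean."""
    return erf(k / sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} standard deviation(s): {within_k_sigma(k):.2%}")
# within 1: 68.27%, within 2: 95.45%, within 3: 99.73%
```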
6:25 - 6:29So, it turns out this means
that I can go out -
6:29 - 6:32and make a measurement of, for example,
-
6:32 - 6:33how much I weigh,
-
6:33 - 6:36and if I use more and more
scales in Australia, -
6:36 - 6:39I will get a better and better answer,
-
6:39 - 6:41provided they are good scales.
-
6:42 - 6:47It turns out the more trials I do,
or the more measurements I make, -
6:47 - 6:49the better I will make that measurement.
-
6:49 - 6:52And the accuracy increases
-
6:52 - 6:57as the square root of the number
of times I make the measurement. -
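(That square-root improvement is the standard error of the mean. A small sketch with made-up numbers — a hypothetical true weight and per-scale error, neither of which is from the talk — shows the scaling.)

```python
import random
from statistics import mean

TRUE_WEIGHT = 90.0   # hypothetical true weight, in kg
SCALE_SIGMA = 2.0    # hypothetical random error of a single scale, in kg

def error_of_average(n_scales: int, seed: int = 1) -> float:
    """How far the average of n_scales independent noisy readings lands from the true weight."""
    rng = random.Random(seed)
    readings = [rng.gauss(TRUE_WEIGHT, SCALE_SIGMA) for _ in range(n_scales)]
    return abs(mean(readings) - TRUE_WEIGHT)

# The typical error shrinks like SCALE_SIGMA / sqrt(n_scales): ~2 kg, ~0.2 kg, ~0.02 kg.
for n in (1, 100, 10000):
    print(n, round(error_of_average(n), 3))
```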
6:57 - 6:59That's why I am having these guys
do what they are doing -
6:59 - 7:00as fast as they can.
-
7:00 - 7:01(Laughter)
-
7:01 - 7:05So let's apply this
to a real world problem we all see: -
7:05 - 7:08the approval rating
of the Prime Minister of Australia. -
7:09 - 7:11Over the past 15 months,
-
7:11 - 7:14every couple of weeks, we see Newspoll
-
7:14 - 7:19go out and ask the people of Australia:
"Do you approve of the Prime Minister?" -
7:19 - 7:20Over the last 15 months,
-
7:20 - 7:24they have done this 28 times,
and they asked 1100 people. -
7:24 - 7:27They don't ask
about 22 million Australians -
7:27 - 7:29because it's too expensive to do that,
-
7:29 - 7:31so they ask 1100 people.
-
7:31 - 7:33The square root of 1100 is 33,
-
7:33 - 7:36and so it turns out
their answers are uncertain -
7:36 - 7:41by plus or minus 33 people
when they asked these 1100 people. -
7:41 - 7:45That's a 3% error.
That's 33 divided by 1100. -
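(That back-of-the-envelope margin is simply the square root of the sample size divided by the sample size, i.e. 1/sqrt(N); checking his numbers:)

```python
from math import sqrt

n_respondents = 1100
uncertain_people = sqrt(n_respondents)              # ~33 people
margin_of_error = uncertain_people / n_respondents  # ~0.03, i.e. about 3%

print(round(uncertain_people), f"{margin_of_error:.1%}")   # 33  3.0%
```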
7:45 - 7:47So let's see what they get.
-
7:47 - 7:49Here are the last fifteen months.
-
7:49 - 7:53You can see it seems that some time
in the middle of the last year -
7:53 - 7:56the Prime Minister had a very bad week,
-
7:56 - 8:00followed a few weeks later
by what appears to be a very good week. -
8:01 - 8:05Of course, you could look at it
in another way. -
8:05 - 8:08You could ask, "What if
the Prime Minister's popularity -
8:08 - 8:13hasn't changed at all
in the last fifteen months?" -
8:13 - 8:15Well, then there's an average,
-
8:15 - 8:20and that mean turns out
to be 29.6% for this set of polls. -
8:20 - 8:23So she hasn't been very popular
over the last 15 months. -
8:23 - 8:28And we know that, if it follows a bell curve,
-
8:28 - 8:31then 68.3% of the time
-
8:31 - 8:34it should lie within plus or minus 3%,
-
8:34 - 8:38because of the number
of people we're asking. -
8:38 - 8:44So, out of those 28 polls, we expect
between 15 and 23 of them -
8:44 - 8:47to lie within plus or minus 3% of the mean.
-
8:47 - 8:50And the actual number of times is 24.
-
8:51 - 8:53What about those really extreme cases
-
8:53 - 8:57when she seems to have
a really bad or really good week? -
8:57 - 9:02Well, you actually expect
zero to two of the polls, -
9:02 - 9:03about 5% of the time,
-
9:03 - 9:07to be more than 6% discrepant
from the mean. -
9:07 - 9:08And what do we see?
-
9:09 - 9:10Two.
-
9:10 - 9:13In other words, over the last 15 months
-
9:13 - 9:16the polls are completely consistent
-
9:16 - 9:20with the Prime Minister's popularity
not changing a bit. -
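(The expected counts in that argument follow from the coverage figures applied to the 28 polls; a sketch of the arithmetic, using a binomial spread to turn the expected count into a rough range.)

```python
from math import erf, sqrt

N_POLLS = 28

def expected_count(probability: float) -> tuple[float, float]:
    """Expected number of polls in a band of given probability, plus its binomial spread."""
    mean = N_POLLS * probability
    spread = sqrt(N_POLLS * probability * (1 - probability))
    return mean, spread

within_1_sigma = erf(1 / sqrt(2))        # ~0.683: within +/-3% of the mean here
beyond_2_sigma = 1 - erf(2 / sqrt(2))    # ~0.046: more than +/-6% away from the mean

print(expected_count(within_1_sigma))    # ~(19.1, 2.5): about 19 polls, consistent with "15 to 23"
print(expected_count(beyond_2_sigma))    # ~(1.3, 1.1): about one poll, i.e. "zero to two"
```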
9:22 - 9:25Alright. And let's see what the news is.
-
9:25 - 9:27For example, just last week.
-
9:27 - 9:31Well, approval rating,
big headline in The Australian, -
9:31 - 9:34dropped from 29 to 27%,
-
9:34 - 9:39even though the error on that
is at least 3% even for that single poll. -
9:39 - 9:41It's not just Australia that does this;
-
9:41 - 9:43it's all the news agencies.
-
9:45 - 9:47Now, the other thing is that
-
9:47 - 9:49Newspoll is not the only organization
that does this. -
9:49 - 9:52For example, Nielsen
does this for Fairfax, -
9:52 - 9:54and here are their polls.
-
9:54 - 9:56Same question,
-
9:56 - 9:59and you'll see that it seems
that they are also consistent -
9:59 - 10:03with the Prime Minister's popularity
not changing over time. -
10:03 - 10:05But they seem to get a different answer.
-
10:05 - 10:09They get 36.5% approval over that period.
-
10:10 - 10:13We are not talking about 1,000 people here
-
10:13 - 10:15when we compare these two things.
-
10:15 - 10:17We're talking about 30,000,
-
10:17 - 10:19because we get to add up all those people.
-
10:19 - 10:23So, the uncertainty in these measurements
is well under 1%, -
10:23 - 10:26and yet they disagree by 6%.
-
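(The statistical side of that comparison is quick to check, using the roughly 28 polls of 1,100 people mentioned earlier.)

```python
from math import sqrt

total_respondents = 28 * 1100            # roughly 30,000 answers pooled over the period
statistical_error = 1 / sqrt(total_respondents)

print(f"{statistical_error:.2%}")        # ~0.57%: far smaller than the 6% gap between the two pollsters
```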
10:26 - 10:29That's because
not all uncertainty is random. -
10:29 - 10:34It can come from simply making
mistakes or errors. -
10:34 - 10:39It turns out it is really hard to ask
1,100 people across Australia -
10:39 - 10:43who are representative
of the average Australian. -
10:43 - 10:47So, there is an additional uncertainty
caused simply by error - -
10:47 - 10:52by making a scientific
or a polling mistake - which is what we see here. -
10:52 - 10:54You might ask yourself,
-
10:54 - 10:57"Why don't they just ask more people,
like 10,000 people, -
10:57 - 11:00less frequently, once a month?"
-
11:00 - 11:01And a cynic might say
-
11:01 - 11:06because there’s no news in telling people
that the popularity is the same -
11:06 - 11:08month after month after month.
-
11:08 - 11:10(Laughter)
-
11:10 - 11:11Alright.
-
11:11 - 11:14Not all things, though,
become more accurate -
11:14 - 11:16the more you measure them,
-
11:16 - 11:20and such systems are described
as exhibiting chaotic behavior. -
11:20 - 11:25I happen to have something
that exhibits chaotic behavior here, -
11:25 - 11:28which is a double pendulum.
-
11:28 - 11:29A double pendulum -
-
11:29 - 11:33this was made up for me
by the people at Questacon, -
11:33 - 11:35and I thank them for that.
-
11:35 - 11:39A double pendulum is just two pendulums
connected to each other. -
11:39 - 11:43And the beautiful thing is
this doesn't always exhibit chaos. -
11:43 - 11:44Let me show you what happens here.
-
11:44 - 11:46If I just start this thing,
-
11:46 - 11:49these things swing
back and forth in unison -
11:49 - 11:52because there is no chaos here.
-
11:52 - 11:55If I make measurements,
better and better measurements, -
11:55 - 11:59I can predict exactly
what is going on here. -
11:59 - 12:01The better I do, the better I will know
-
12:01 - 12:04what the pendulum is going to do
in the future. -
12:04 - 12:08But if I take a double pendulum
and I swing it a lot, -
12:08 - 12:11then something else happens.
-
12:11 - 12:13They don't do the same thing,
-
12:13 - 12:15and there is nothing I can do,
-
12:15 - 12:17no matter how many measurements I make,
-
12:17 - 12:22to predict what is going to happen
with these pendulums, -
12:22 - 12:28because infinitesimally small differences
lead to different outcomes. -
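(A faithful double-pendulum integrator takes more code than fits here, so this sketch deliberately substitutes the logistic map — a standard chaotic toy model, not the pendulum itself — to show two nearly identical starting points drifting completely apart.)

```python
def logistic_trajectory(x0: float, steps: int, r: float = 4.0) -> list[float]:
    """Iterate the chaotic logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2, 60)
b = logistic_trajectory(0.2 + 1e-10, 60)   # an almost imperceptibly different start

for step in (0, 20, 40, 60):
    print(step, f"{abs(a[step] - b[step]):.3g}")
# The gap grows from ~1e-10 to order 1: the two runs end up unrelated,
# just as the two pendulums do.
```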
12:28 - 12:30Not all is lost here.
-
12:30 - 12:32It turns out there are things
we can learn. -
12:32 - 12:37For example, I can know
through my measurements, -
12:37 - 12:41what the likelihood of the pendulum
swinging all the way around is, -
12:41 - 12:43how often that's going to happen.
-
12:43 - 12:45So, you can know things
about chaotic systems, -
12:45 - 12:49but you cannot predict exactly
what they're going to do. -
12:50 - 12:53Alright, so what is a chaotic system
that we are used to? -
12:53 - 12:59Well, it turns out the Earth's climate
is a good example of a chaotic system. -
12:59 - 13:03I show you here the temperature record
from Antarctic ice cores -
13:03 - 13:06over the last 650,000 years.
-
13:06 - 13:10You can see, in the grey regions, times
when the Earth was quite warm, -
13:10 - 13:13and then it seemingly cools down.
-
13:13 - 13:15And why does it do that?
-
13:15 - 13:21It's a chaotic process that is related
to how the Earth goes around the Sun -
13:21 - 13:24in a quite complex way.
-
13:24 - 13:28So it's very difficult to predict exactly
what the Earth is going to do -
13:28 - 13:29at any given time.
-
13:31 - 13:34Also, it's just hard to measure
what's going on with the Earth. -
13:34 - 13:36For the last thousand years,
-
13:36 - 13:39here are temperature reconstructions
from different groups. -
13:39 - 13:41You can see over the last thousand years,
-
13:41 - 13:45we get quite different answers
back in time. -
13:45 - 13:48We more or less agree
where we have better information, -
13:48 - 13:51which is in the last hundred years or so,
-
that the Earth has warmed up
by about 8/10 of a degree. -
13:56 - 13:58So, modeling and measuring
the climate is hard. -
13:59 - 14:02The consensus view, just using the data,
-
14:02 - 14:07is that we are 90% sure
that the warming trend is not an accident, -
14:07 - 14:08that it is actually caused
-
14:08 - 14:13by anthropogenic
or man-made carbon dioxide. -
14:13 - 14:15As a scientist trying
to make an experiment, -
14:15 - 14:2090% isn't a very good result.
-
14:20 - 14:21You're not very sure about it.
-
14:21 - 14:25However, if someone's trying to figure out
the future of my life, -
14:25 - 14:2790% is a pretty big risk factor.
-
14:27 - 14:31So, the same number means very different things
in those two contexts. -
14:31 - 14:33But from my point as a scientist,
-
14:33 - 14:37I am 99.99999% sure
-
14:37 - 14:42that physics tells us
that adding CO2 to the atmosphere -
14:42 - 14:46causes sunlight to be more effectively
trapped in our atmosphere, -
14:46 - 14:48raising the temperature a bit.
-
14:48 - 14:50The hard part is -
-
14:50 - 14:52and what we are much less sure of -
-
14:52 - 14:54is how many clouds there are going to be,
-
14:54 - 14:56how much water vapor will be released,
-
14:56 - 14:58which warms the Earth up even more,
-
14:58 - 15:00how much methane will be released,
-
15:00 - 15:04and precisely how the oceans
will interact with all this -
15:04 - 15:07to trap the CO2 and hold the warmth.
-
15:07 - 15:13Of course, we have no idea really
how much CO2 we will emit into the future. -
15:14 - 15:16So here is our best estimate.
-
15:17 - 15:19The red curve shows
what we think will happen -
15:19 - 15:24if we don't do anything
about our CO2 emissions into the future. -
15:24 - 15:25We're going to burn more and more
-
15:25 - 15:28as we become
more and more developed as a world. -
15:28 - 15:33The blue line shows a very aggressive
carbon reduction strategy -
15:33 - 15:36proposed by the IPCC.
-
15:37 - 15:39And then we can estimate
using our best physics -
15:39 - 15:41what we think is going to happen.
-
15:41 - 15:44Here is the outcome of the two ideas.
-
15:44 - 15:49The blue curve shows what happens
if we do that very aggressive drop. -
15:49 - 15:52It keeps the rise of temperature
over the next century -
15:52 - 15:57to less than 2 degrees C
with about 90% confidence. -
15:58 - 16:00On the other hand,
if we let things keep going, -
16:00 - 16:04the best prediction is, of course,
that it's going to get warmer and warmer, -
16:04 - 16:09with a great deal of uncertainty
about exactly how warm it will get. -
16:09 - 16:12The Australian
Academy of Science -
16:12 - 16:16says, "Expect climate surprises,"
-
16:16 - 16:17and we should,
-
16:17 - 16:20because the Earth's climate
is a chaotic system. -
16:20 - 16:23We don't exactly know
what it's going to do, -
16:23 - 16:26and that is what scares
the hell out of me. -
16:27 - 16:30So, life is not black and white.
-
16:31 - 16:34Life is really shades of grey.
-
16:35 - 16:37But it's not all bad.
-
16:37 - 16:39You guys have done an excellent job,
-
16:39 - 16:41so what I want you to do now is to stop,
-
16:41 - 16:44and we are going to read out
your numbers here, -
16:44 - 16:45and we're going to compare them
-
16:45 - 16:49to what I predicted
we were going to get, okay? -
16:49 - 16:54So I have here hopefully
a functioning computer. -
16:54 - 16:58So what I need you to do
is to just go through from the left -
16:58 - 17:00and read out the numbers
that you have achieved. -
17:01 - 17:02Assistant: 5. Brian Schmidt: 5.
-
17:03 - 17:04A: 10. BS: 10.
-
17:04 - 17:07A: 21. BS: 21.
-
17:07 - 17:12A: 21. BS: 21 again? A: That's right. 24.
-
17:12 - 17:15BS: 24? A: Yes. Then 30. BS: 30.
-
17:15 - 17:17A: 37. BS: 37.
-
17:17 - 17:19A: 47. BS: 47.
-
17:19 - 17:21A: 41. BS: 41.
-
17:21 - 17:23A: 43. BS: 43.
-
17:23 - 17:25A: 29. BS: 29.
-
17:25 - 17:27A: 21. BS: 21.
-
17:27 - 17:29A: 8. BS: 8.
-
17:29 - 17:31A: 10. BS: 10.
-
17:31 - 17:33A: 3. BS: 3.
-
17:33 - 17:38Well, I am proud to say
you guys were completely random. -
17:38 - 17:39It was perfect.
-
17:39 - 17:40(Laughter)
-
17:40 - 17:45I show here the prediction
of what should happen and what happened. -
17:45 - 17:46Bang on.
-
17:46 - 17:48(Applause)
-
17:48 - 17:51There is certainty in uncertainty,
-
17:51 - 17:52(Laughter)
-
17:52 - 17:54and that is the beauty of it.
-
17:54 - 18:00But to make policy decisions
based on what we know about science, -
18:00 - 18:02and what we know about economics,
-
18:02 - 18:07requires our politicians,
our policy makers, and our citizens -
18:07 - 18:09to understand uncertainty.
-
18:09 - 18:12I'm going to finish with the words
of Richard Feynman, -
18:12 - 18:15with words that really could be my own,
-
18:15 - 18:18which is, "I can live
with doubt and uncertainty. -
18:18 - 18:21I think it's much more interesting
to live not knowing -
18:21 - 18:24than to have answers
which might be wrong." -
18:24 - 18:25Thank you very much.
-
18:25 - 18:28(Applause)
-
18:29 - 18:31Thank you. Excellent.
-
18:31 - 18:33(Applause)
- Title:
- There is certainty in uncertainty | Brian Schmidt | TEDxCanberra
- Description:
-
This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx
At TEDxCanberra 2012, Nobel Prize for Physics recipient, Professor Brian Schmidt, provides a live, engaging and practical demonstration of just how uncertainty works in the real world.
With a little help from some students in the audience, Professor Schmidt explains how uncertainty operates in our daily lives, including inaccurate media coverage of political polling and efforts to understand climate change.
- Video Language:
- English
- Team:
- closed TED
- Project:
- TEDxTalks
- Duration:
- 18:41