Last December,
my fellow Nobel Laureates and I
were asked by a journalist:
if there was one thing
that we could teach the world,
what would it be?
And to my surprise,
two economists, two biologists,
a chemist, and three physicists
gave the same answer.
And that answer was about uncertainty.
So I'm going to talk to you today
about uncertainty.
To understand anything,
you must understand its uncertainty.
Uncertainty is at the heart
of the fabric of the Universe.
I'm going to illustrate this with a laser.
A laser puts out a small,
but not infinitesimally small
point of light.
You might think that if I go through
and I try to make
that point of light smaller
by, for example,
bringing the two jaws of a slit together,
that I could make that point
as small as I want.
I just need to bring
those jaws closer and closer.
So let's see what happens
when I do this for real.
My friends at Mount Stromlo
made me a nice little experiment here:
essentially a laser
shining through an adjustable slit.
So now
we're going to go through
and we are going to see what happens
when I close the jaws of the slit.
The more I close it,
instead of getting smaller,
the more the point of light spreads out.
So it works exactly the opposite
of what I was expecting.
And that's due to something known
as Heisenberg's Uncertainty Principle.
Heisenberg's Uncertainty Principle states
that you can't know exactly
where something is
and know its momentum at the same time.
Light's momentum is really its direction.
So, as I bring those slits
closer and closer together,
I actually constrain where the light is.
But the quantum world says
you can't do that.
The light then has an uncertain direction.
So instead of becoming a smaller point,
the light picks up a randomness of direction,
which is that pattern that we saw.
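You can even put a rough number on that spread. Here is a minimal sketch in Python; it assumes an ordinary red laser pointer and ideal single-slit diffraction, where the central spot extends out to the first minimum at sin(theta) = wavelength / slit width:

```python
import numpy as np

wavelength = 650e-9  # an ordinary red laser pointer, in metres (assumed)

# Single-slit diffraction: the central bright spot extends out to the
# first minimum at sin(theta) = wavelength / slit_width, so the
# narrower the slit, the more the light spreads out.
for slit_width in [100e-6, 50e-6, 20e-6, 10e-6, 2e-6]:
    theta = np.arcsin(wavelength / slit_width)  # half-angle of the spot
    print(f"slit {slit_width * 1e6:6.1f} um -> spread {np.degrees(theta):5.2f} deg")
```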
Many things in life you can think of
as a series of little decisions.
For example, if I start at a point,
I can go left or right,
and let's say I go left 50% of the time
and right 50% of the time.
Then let's say I have another decision
down below that:
I can end up on the left, on the right,
or in the middle.
Because I've had two chances
to reach the middle from above,
I end up there 50% of the time,
one quarter all the way to the left,
and one quarter all the way to the right.
And you can build up such a decision tree,
and Pascal did this.
It's called Pascal's triangle.
You get a probability
of where you are going to end up.
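If you want to compute those probabilities yourself, a minimal sketch: row n of Pascal's triangle, divided by 2 to the power n, gives the chance of landing in each slot after n left-or-right decisions.

```python
from math import comb

# Row n of Pascal's triangle, divided by 2**n, is the probability of
# landing in each slot after n random left/right decisions.
for n in range(5):
    row = [comb(n, k) / 2**n for k in range(n + 1)]
    print(n, [round(p, 4) for p in row])  # row 2 gives [0.25, 0.5, 0.25]
```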
I brought something like this
with me today.
It's this machine right here.
This is a machine you can put balls into
and watch where they randomly end up.
So, for example, if I put a ball in here,
it'll bounce down
and it'll end up somewhere.
It's essentially an enactment
of Pascal's triangle.
I need two people
from the audience to help me,
and I think I am going to have
Sly and Jon right there
come up and help me if that's okay.
You know who you are.
(Laughter)
What they are going to do
is they are going to,
as fast as they can -
faster than they are going right now,
because I only have 18 minutes -
(Laughter)
put balls through this machine,
and we're going to see what happens.
This machine counts the balls where they end up.
So you guys have to go through
as fast as you can.
Work together,
and during the rest of my talk,
you are going to build up this.
And the more you do,
the better it is, okay?
So go for it, and I'll keep on going.
(Laughter)
Alright.
It turns out that if you have
a series of random events in life,
you end up with something
called a Bell Shaped Curve,
which we also call a Normal Distribution
or Gaussian Distribution.
So, for example, if you have
just a few random events,
you don't get something
that really looks like that.
But if you do more and more,
they add up to give you
this very characteristic pattern
which Gauss famously
wrote down mathematically.
It turns out that in most cases
a series of random events
gives you this bell-shaped curve.
It doesn't really matter what it is.
For example, suppose I were to go out
and have a million scales across Australia
measure my weight.
Well, there's some randomness to that,
and you'll get a bell-shaped curve
of what my weight actually is.
If I were instead to go through
and ask a million Australian males
what their weight is,
and actually measure it,
I would also get a bell-shaped curve,
because that is also made up
of a series of random events
which determine people's weight.
So the way a bell-shaped curve
is characterized
is by its mean -
that's the most likely value -
and its width, which we call
a standard deviation.
This is a very important concept,
because from the width,
and how close you are to the mean,
you can characterize
the likelihood of things occurring.
So it turns out if you are within
one standard deviation,
that happens 68.3% of the time.
I'm going to illustrate how this works
with a worked example in just a second.
If you're within two standard deviations,
that happens 95.4% of the time;
and 99.73% of the time
you're within three standard deviations.
This is a very powerful way for us
to describe things in the world.
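Those percentages are easy to check for yourself. A minimal sketch: draw a million samples from a Gaussian and count how many land within one, two, and three standard deviations.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, size=1_000_000)  # a million Gaussian draws

# Count the fraction landing within 1, 2, and 3 standard deviations;
# the answers come out near 68.3%, 95.4%, and 99.73%.
for k in (1, 2, 3):
    print(f"within {k} standard deviation(s): {np.mean(np.abs(samples) <= k):.4f}")
```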
So, it turns out this means
that I can go out
and make a measurement of, for example,
how much I weigh,
and if I use more and more
scales in Australia,
I will get a better and better answer,
provided they are good scales.
It turns out the more trials I do,
or the more measurements I make,
the better I will make that measurement.
And the accuracy increases
as the square root of the number
of times I make the measurement.
That's why I am having these guys
do what they are doing
as fast as they can.
(Laughter)
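Here is a minimal sketch of that square-root rule; the weight and the per-scale scatter are just assumed numbers for illustration. Averaging N noisy readings shrinks the spread of the answer by a factor of the square root of N.

```python
import numpy as np

rng = np.random.default_rng(1)
true_weight, scatter = 85.0, 2.0  # hypothetical weight (kg) and per-scale noise

# Average N noisy scale readings, repeated 10,000 times over, and watch
# the spread of the averaged answer shrink like 1/sqrt(N).
for n in (1, 10, 100, 1000):
    means = rng.normal(true_weight, scatter, size=(10_000, n)).mean(axis=1)
    print(f"N={n:4d}: spread {means.std():.3f} kg "
          f"(1/sqrt(N) predicts {scatter / np.sqrt(n):.3f})")
```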
So let's apply this
to a real world problem we all see:
the approval rating
of the Prime Minister of Australia.
Over the past 15 months,
every couple of weeks, Newspoll has
gone out and asked the people of Australia:
"Do you approve of the Prime Minister?"
Over the last 15 months,
they have done this 28 times,
and they asked 1100 people.
They don't ask
all 22 million Australians
because it's too expensive to do that,
so they ask 1100 people.
The square root of 1100 is about 33,
and so it turns out
their answers are uncertain
by plus or minus 33 people
when they asked these 1100 people.
That's a 3% error.
That's 33 divided by 1100.
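That plus-or-minus-33-people rule of thumb matches the margin of error pollsters usually quote, which is about one over the square root of N. Here is a minimal sketch; the 30% underlying approval rate is just an assumed number for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1100              # people asked in each poll
true_approval = 0.30  # hypothetical underlying approval rate

# The margin of error usually quoted for a poll is about 1/sqrt(N).
print(f"1/sqrt({n}) = {100 / np.sqrt(n):.1f}%")

# Check it by simulating 100,000 polls of 1,100 people each: about 95%
# of them land within roughly 3% of the true value.
polls = rng.binomial(n, true_approval, size=100_000) / n
miss = np.percentile(np.abs(polls - true_approval), 95)
print(f"95% of simulated polls land within +/-{100 * miss:.1f}%")
```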
So let's see what they get.
Here are the last fifteen months.
You can see it seems that some time
in the middle of the last year
the Prime Minister had a very bad week,
followed a few weeks later
by what appears to be a very good week.
Of course, you could look at it
in another way.
You could ask, "What would it look like
if the Prime Minister's popularity
hadn't changed at all
in the last fifteen months?"
Well, then there's an average,
and that mean turns out
to be 29.6% for this set of polls.
So she hasn't been very popular
over the last 15 months.
And we know that, if it's a basic bell curve,
then 68.3% of the time
it should lie within plus or minus 3%,
because of the number
of people we're asking.
So, out of those 28 polls, we expect
between 15 and 23 of them
to lie within plus or minus 3% of the mean.
And the actual number of times is 24.
What about those really extreme cases
when she seems to have
a really bad or really good week?
Well, you actually expect
zero to two of the polls -
about 5% of the time -
to be more than 6% discrepant
from the mean.
And what do we see?
Two.
In other words, over the last 15 months
the polls are completely consistent
with the Prime Minister's popularity
not changing a bit.
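Where do numbers like "between 15 and 23" come from? Each poll is like a coin flip that lands "within one standard deviation" 68.3% of the time, so the count over 28 polls follows binomial statistics. A minimal sketch:

```python
from math import sqrt

n_polls, p = 28, 0.683  # 28 polls, each within 1 sigma 68.3% of the time

# Binomial statistics: how many of the 28 polls should land within one
# standard deviation of the mean, and how much that count itself varies.
expected = n_polls * p
spread = sqrt(n_polls * p * (1 - p))
print(f"expect {expected:.1f} +/- {spread:.1f} polls within 1 sigma")
# Two spreads either side covers roughly 14 to 24 polls out of 28,
# close to the 15-to-23 range quoted above.
print(f"roughly {expected - 2 * spread:.0f} to {expected + 2 * spread:.0f} of 28")
```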
Alright. And let's see what the news is.
For example, just last week:
a big headline in The Australian said
the approval rating
dropped from 29 to 27%,
even though the error
is at least 3% even for that single poll.
It's not just The Australian that does this;
it's all the news agencies.
Now, the other thing is that
Newspoll is not the only outfit
that does this.
For example, Nielsen
does this for Fairfax,
and here are their polls.
Same question,
and you'll see that it seems
that they are also consistent
with the Prime Minister's popularity
not changing over time.
But they seem to get a different answer.
They get 36.5% approval over that period.
We are not talking about 1,000 people here
when we compare these two things.
We're talking about 30,000,
because we get to add up all those people.
So, the uncertainty in these measurements
is well less than 1%,
and yet they disagree by 6%.
That's because
not all uncertainty is random.
Some of it is due
to plain mistakes or systematic errors.
It turns out it's really hard to ask
1,100 people across Australia
who are representative
of the average Australian.
So, there is an additional uncertainty,
caused not by randomness
but by how the polling is done,
and that's the systematic error we see here.
You might ask yourself,
"Why don't they just ask more people,
like 10,000 people,
less frequently, once a month?"
And a cynic might say
because there’s no news in telling people
that the popularity is the same
month after month after month.
(Laughter)
Alright.
Not all things, though,
become more accurate
the more you measure them,
and we describe such systems
as exhibiting chaotic behavior.
I happen to have something
that exhibits chaotic behavior here,
which is a double pendulum.
A double pendulum -
this was made up for me
by the people at Questacon,
and I thank them for that.
A double pendulum is just two pendulums
connected to each other.
And the beautiful thing is
this doesn't always exhibit chaos.
Let me show you what happens here.
If I just start this thing,
these things swing
back and forth in unison
because there is no chaos here.
If I make measurements,
better and better measurements,
I can predict exactly
what is going on here.
The better I do, the better I will know
what the pendulum is going to do
in the future.
But if I take a double pendulum
and I swing it a lot,
then something else happens.
They don't do the same thing,
and there is nothing I can do -
no matter how many measurements I make,
I cannot predict what is going to happen
with these pendulums,
because infinitesimally small differences
lead to different outcomes.
Not all is lost here.
It turns out there are things
we can learn.
For example, I can know,
through my measurements,
the likelihood of the thing
swinging all the way around -
how often that's going to happen.
So, you can know things
about chaotic systems,
but you cannot predict exactly
what they're going to do.
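You can see that sensitivity in a simulation. Here is a minimal sketch: it assumes an ideal, frictionless double pendulum with equal arm lengths and masses, integrated with the standard equations of motion, and starts two copies just a billionth of a radian apart.

```python
import numpy as np

g, L, m = 9.81, 1.0, 1.0  # gravity, arm length, bob mass (assumed values)

def derivs(state):
    """Standard equations of motion for an ideal double pendulum."""
    th1, w1, th2, w2 = state
    d = th1 - th2
    den = 2 * m + m - m * np.cos(2 * d)
    a1 = (-g * (2 * m + m) * np.sin(th1) - m * g * np.sin(th1 - 2 * th2)
          - 2 * np.sin(d) * m * (w2**2 * L + w1**2 * L * np.cos(d))) / (L * den)
    a2 = (2 * np.sin(d) * (w1**2 * L * (m + m) + g * (m + m) * np.cos(th1)
          + w2**2 * L * m * np.cos(d))) / (L * den)
    return np.array([w1, a1, w2, a2])

def rk4_step(state, dt):
    """One fourth-order Runge-Kutta integration step."""
    k1 = derivs(state)
    k2 = derivs(state + dt / 2 * k1)
    k3 = derivs(state + dt / 2 * k2)
    k4 = derivs(state + dt * k3)
    return state + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Two pendulums swung hard, differing by a billionth of a radian.
a = np.array([2.0, 0.0, 2.0, 0.0])
b = a + np.array([1e-9, 0.0, 0.0, 0.0])
dt = 0.001
for step in range(1, 20001):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    if step % 5000 == 0:
        print(f"t = {step * dt:4.0f} s: angles differ by {abs(a[0] - b[0]):.2e} rad")
```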
Alright, so what is a chaotic system
that we are used to?
Well, it turns out the Earth's climate
is a good example of a chaotic system.
I show you here the temperature record
from Antarctic ice cores
over the last 650,000 years.
You can see, in the grey regions,
times when the Earth was quite warm,
before it seemingly cools down again.
And why does it do that?
It's a chaotic process that is related
to how the Earth goes around the Sun
in a quite complex way.
So it's very difficult to predict exactly
what the Earth is going to do
at any given time.
Also, it's just hard to measure
what's going on with the Earth.
Here are temperature reconstructions
from different groups
for the last thousand years.
You can see that, going back in time,
we get quite different answers.
We more or less agree
where we have better information,
which is in the last hundred years or so:
the Earth has warmed up
by about eight tenths of a degree.
So, modeling and measuring
the climate is hard.
The consensus view, using just the data,
is that we are 90% sure
that the warming trend is not an accident,
that it is actually caused
by anthropogenic
or man-made carbon dioxide.
As a scientist trying
to make an experiment,
90% isn't a very good result.
You're not very sure about it.
However, if someone's trying to figure out
the future of my life,
90% is a pretty big risk factor.
So the same number means very different
things in those two contexts.
But from my point of view as a scientist,
I am 99.99999% sure
that physics tells us
that adding CO2 to the atmosphere
causes sunlight to be more effectively
trapped in our atmosphere,
raising the temperature a bit.
The hard part -
and what we are much less sure of -
is how many clouds there are going to be,
how much water vapor will be released,
which warms the Earth up even more,
how many methane releases will follow,
and precisely how the oceans
will interact with all this
to trap the CO2 and hold the warmth.
Of course, we have no idea really
how much CO2 we will emit into the future.
So here is our best estimate.
The red curve shows
what we think will happen
if we don't do anything
about our CO2 emissions into the future.
We're going to burn more and more
as we become
more and more developed as a world.
The blue line shows a very aggressive
carbon reduction strategy
proposed by the IPCC.
And then we can estimate,
using our best physics,
what we think is going to happen.
Here is the outcome of the two ideas.
The blue curve shows what happens
if we do that very aggressive drop.
It keeps the rise of temperature
over the next century
to less than 2 degrees C
with about 90% confidence.
On the other hand,
if we let things keep going,
the best prediction is, of course,
that it's going to get warmer and warmer,
with a great deal of uncertainty
about exactly how warm it will get.
The Australian Academy of Science
says, "Expect climate surprises,"
and we should,
because the Earth's climate
is a chaotic system.
We don't exactly know
what it's going to do,
and that is what scares
the hell out of me.
So, life is not black and white.
Life is really shades of grey.
But it's not all bad.
You guys have done an excellent job,
so what I want you to do now is to stop,
and we are going to read out
your numbers here,
and we're going to compare them
to what we predicted, okay?
So I have here hopefully
a functioning computer.
So what I need you to do
is to just go through from the left
and read out the numbers
that you have achieved.
Assistant: 5. Brian Schmidt: 5.
A: 10. BS: 10.
A: 21. BS: 21.
A: 21. BS: 21 again? A: That's right. 24.
BS: 24? A: Yes. Then 30. BS: 30.
A: 37. BS: 37.
A: 47. BS: 47.
A: 41. BS: 41.
A: 43. BS: 43.
A: 29. BS: 29.
A: 21. BS: 21.
A: 8. BS: 8.
A: 10. BS: 10.
A: 3. BS: 3.
Well, I am proud to say
you guys were completely random.
It was perfect.
(Laughter)
I show here the prediction
of what should happen and what happened.
Bang on.
(Applause)
There is certainty in uncertainty,
(Laughter)
and that is the beauty of it.
But to make policy decisions
based on what we know about science,
about what we know about economics,
requires our politicians,
our policy makers, and our citizens
to understand uncertainty.
I'm going to finish with the words
of Richard Feynman,
with words that really could be my own,
which is, "I can live
with doubt and uncertainty.
I think it's much more interesting
to live not knowing
than to have answers
which might be wrong."
Thank you very much.
(Applause)
Thank you. Excellent.
(Applause)