I'm a meteorologist by degree;
I have a bachelor's,
master's and PhD in physical meteorology,
so I'm a meteorologist, card-carrying.
And so with that comes
four questions, always.
This is one prediction
I will always get right.
(Laughter)
And those questions are,
"Marshall, what channel are you on?"
(Laughter)
"Dr. Shepherd, what's the weather
going to be tomorrow?"
(Laughter)
And oh, I love this one:
"My daughter is getting married
next September, it's an outdoor wedding.
Is it going to rain?"
(Laughter)
Not kidding, I get those,
and I don't know the answer to that,
the science isn't there.
But the one I get a lot these days is,
"Dr. Shepherd, do you believe
in climate change?"
"Do you believe in global warming?"
Now, I have to gather myself
every time I get that question.
Because it's an ill-posed question --
science isn't a belief system.
My son, he's 10 --
he believes in the tooth fairy.
And he needs to get over that,
because I'm losing dollars, fast.
(Laughter)
But he believes in the tooth fairy.
But consider this.
Bank of America building,
there, in Atlanta.
You never hear anyone say,
"Do you believe, if you go
to the top of that building
and throw a ball off, it's going to fall?"
You never hear that,
because gravity is a thing.
So why don't we hear the question,
"Do you believe in gravity?"
But of course, we hear the question,
"Do you believe in global warming?"
Well, consider these facts.
The American Association
for the Advancement of Science, AAAS,
one of the leading
organizations in science,
queried scientists and the public
on different science topics.
Here are some of them:
genetically modified food,
animal research, human evolution.
And look at what the scientists
say about those,
the people who actually
study those topics, in red,
versus the gray, what the public thinks.
How did we get there?
How did we get to the point
where scientists and the public
are so far apart on these science issues?
Well, I'll come a little bit
closer to home for me:
climate change.
Eighty-seven percent of scientists
believe that humans are contributing
to climate change.
But only 50 percent of the public agrees.
How did we get there?
So it raises the question,
what shapes perceptions about science?
It's an interesting question
and one that I've been
thinking about quite a bit.
I think that one thing that shapes
perceptions in the public, about science,
is belief systems and biases.
Belief systems and biases.
Go with me for a moment.
Because I want to talk
about three elements of that:
confirmation bias, Dunning-Kruger effect
and cognitive dissonance.
Now, these sound like big, fancy,
academic terms, and they are.
But when I describe them,
you're going to be like, "Oh!
I recognize that; I even know
somebody that does that."
Confirmation bias.
Finding evidence that supports
what we already believe.
Now, we're probably all
a little bit guilty of that at times.
Take a look at this.
I'm on Twitter.
And often, when it snows,
I'll get this tweet back to me.
(Laughter)
"Hey, Dr. Shepherd, I have 20 inches
of global warming in my yard,
what are you guys
talking about, climate change?"
I get that tweet a lot, actually.
It's a cute tweet,
it makes me chuckle as well.
But it's oh, so fundamentally
scientifically flawed.
Because it illustrates
that the person tweeting
doesn't understand
the difference
between weather and climate.
I often say, weather is your mood
and climate is your personality.
Think about that.
Weather is your mood,
climate is your personality.
Your mood today doesn't necessarily
tell me anything about your personality,
nor does a cold day tell me anything
about climate change,
or a hot day, for that matter.
Dunning-Kruger.
Two scholars from Cornell
came up with the Dunning-Kruger effect.
If you go look up
the peer-reviewed paper for this,
you will see all kinds
of fancy terminology:
it describes illusory superiority,
thinking we know things.
In other words, people think
they know more than they do.
Or they underestimate
what they don't know.
And then, there's cognitive dissonance.
Cognitive dissonance is interesting.
We just recently had Groundhog Day, right?
Now, there's no better definition
of cognitive dissonance
than intelligent people asking me
if a rodent's forecast is accurate.
(Laughter)
But I get that, all of the time.
(Laughter)
But I also hear
about the Farmer's Almanac.
We grew up on the Farmer's Almanac,
people are familiar with it.
The problem is, it's only
about 37 percent accurate,
according to studies
at Penn State University.
But we're in an era of science
where we actually
can forecast the weather.
And believe it or not, and I know
some of you are like, "Yeah, right,"
we're about 90 percent accurate, or more,
with weather forecasts.
You just tend to remember
the occasional miss, you do.
(Laughter)
So confirmation bias,
Dunning-Kruger and cognitive dissonance.
I think those shape biases and perceptions
that people have about science.
But then, there's literacy
and misinformation
that keep us boxed in, as well.
During the hurricane season of 2017,
media outlets had to actually
assign reporters
to dismiss fake information
about the weather forecast.
That's the era that we're in.
I deal with this all the time
in social media.
Someone will tweet a forecast --
that's a forecast for Hurricane Irma,
but here's the problem:
it didn't come from the Hurricane Center.
But people were tweeting
and sharing this; it went viral.
It didn't come from
the National Hurricane Center at all.
So I spent 12 years of my career at NASA
before coming
to the University of Georgia,
and I chair their Earth Science
Advisory Committee;
I was just up there last week in DC.
And I saw some really interesting things.
Here's a NASA model
and satellite science data
showing the 2017 hurricane season.
You see Hurricane Harvey there?
Look at all the dust coming off of Africa.
Look at the wildfires up in the northwestern US
and in western Canada.
There comes Hurricane Irma.
This is fascinating to me.
But admittedly, I'm a weather geek.
But more importantly, it illustrates
that we have the technology
not only to observe the weather
and climate system,
but also to predict it.
There's scientific understanding,
so there's no need for some
of those perceptions and biases
that we've been talking about.
We have knowledge.
But think about this ...
This is Houston, Texas,
after Hurricane Harvey.
Now, I contribute periodically
to "Forbes" magazine,
and I wrote an article a week before
Hurricane Harvey made landfall, saying,
"There's probably going to be
40 to 50 inches of rainfall."
I wrote that a week before it happened.
But yet, when you talk
to people in Houston,
people are saying, "We had no idea
it was going to be this bad."
I'm just...
(Sigh)
(Laughter)
A week before.
But --
I know, it's amusing, but the reality is,
we all struggle with perceiving something
outside of our experience level.
People in Houston
get rain all of the time,
they flood all of the time.
But they've never experienced that.
Houston gets about 34 inches of rainfall
for the entire year.
They got 50 inches in three days.
That's an anomalous event,
something outside of the normal.
So belief systems and biases,
literacy and misinformation.
How do we step out of the boxes
that are cornering our perceptions?
Well, we don't even have to go to Houston,
we can come very close to home.
(Laughter)
Remember "Snowpocalypse?"
(Laughter)
Snowmageddon?
Snowzilla?
Whatever you want to call it.
All two inches of it.
(Laughter)
Two inches of snow
shut the city of Atlanta down.
(Laughter)
But the reality is,
we were in a winter storm watch,
we went to a winter weather advisory,
and a lot of people perceived that
as being a downgrade,
"Oh, it's not going to be as bad."
When in fact, it was an upgrade:
things were getting worse
as the models were coming in.
So that's an example of how we get
boxed in by our perceptions.
So, the question becomes,
how do we expand our radius?
The area of a circle is "pi r squared".
We increase the radius,
we increase the area.
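(As a minimal worked version of that formula, as a side note: because the area grows with the square of the radius, even a modest increase in radius buys a disproportionately large increase in area.)
\[
A = \pi r^2
\]
\[
A(2r) = \pi (2r)^2 = 4\pi r^2 = 4\,A(r)
\]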
How do we expand our radius
of understanding about science?
Here are my thoughts.
You take inventory of your own biases.
And I'm challenging you all to do that.
Take an inventory of your own biases.
Where do they come from?
Your upbringing, your political
perspective, your faith --
what shapes your own biases?
Then, evaluate your sources --
where do you get
your information on science?
What do you read, what do you listen to,
to get your information on science?
And then, it's important to speak out.
Talk about how you evaluated your biases
and evaluated your sources.
I want you to listen to this
little 40-second clip
from one of the top
TV meteorologists in the US, Greg Fishel,
in the Raleigh-Durham area.
He's revered in that region.
But he was a climate skeptic.
But listen to what he says
about speaking out.
Greg Fishel:
The mistake I was making
and didn't realize until very recently
was that I was only looking
for information
to support what I already thought,
and was not interested
in listening to anything contrary.
And so I woke up one morning,
and there was this question in my mind,
"Greg, are you engaging
in confirmation bias?
Are you only looking for information
to support what you already think?"
And if I was honest with myself,
and I tried to be,
I admitted that was going on.
And so the more I talked to scientists
and read peer-reviewed literature
and tried to conduct myself the way
I'd been taught to conduct myself
at Penn State when I was a student,
it became very difficult for me
to make the argument
that we weren't at least
having some effect.
Maybe there was still a doubt
as to how much,
but to say "nothing" was not
a responsible thing for me to do
as a scientist or a person.
Greg Fishel just talked
about expanding his radius
of understanding of science.
And when we expand our radius,
it's not about making a better future,
but it's about preserving
life as we know it.
So as we think about expanding
our own radius in understanding science,
it's critical for Athens, Georgia,
for Atlanta, Georgia,
for the state of Georgia,
and for the world.
So expand your radius.
Thank you.
(Applause)