-
>> I'm here at the [inaudible] dying of heat
exhaustion to talk to you about these guys.
-
Specifically, how they relate to one
of our biggest problems in science.
-
Let's start from the beginning.
-
There is a replication crisis
among scientists and it's serious.
-
Several scientific fields have
done major replication attempts
-
and they just aren't getting the
same results the second time around.
-
And this isn't a case of an
odd finding here or there, no.
-
We're having to question foundational studies
in the literature of psychology, biomedicine.
-
And while scientists in the harder
sciences might be smugly chuckling right now
-
about their epistemic superiority, many wonder
-
if they're not a big replication
attempt away from the same situation.
-
In short, the replication crisis is a
serious problem and not just for scientists.
-
Remember, what happens in
science becomes public policy.
-
What happens in science becomes
your drug treatment.
-
What happens in science turns into pop
science which turns into your friend insisting
-
at a dinner party that power-posing is the
secret to becoming dominant and assertive.
-
>> There is nothing there.
-
The books have nothing to say.
-
[ Music ]
-
>> We all have a vested interest
in science getting it right
-
or at least correcting itself when it's wrong.
-
But what's so terrifying about the replication
crisis isn't that we got a few things wrong,
-
it's that we didn't notice
they were wrong for so long.
-
The explanations for the
replication crisis are varied.
-
A few simply deny there is a problem;
others propose that our standards
-
for statistical significance are too weak.
-
P-values, they would say, breed too many false
positives and have helped create the crisis.
-
Some say we don't incentivize
replication enough.
-
Some say the gate-keeping institutions are
to blame and the culprit list goes on and on.
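-
The worry about weak significance thresholds can be made concrete with a quick simulation. This is an illustrative sketch, not anything from the video; it just shows that a 0.05 threshold guarantees a steady stream of chance "discoveries" when many non-existent effects get tested.

```python
import random

random.seed(0)

ALPHA = 0.05        # the conventional significance threshold
N_STUDIES = 10_000  # hypothetical studies, all testing effects that don't exist

# Under the null hypothesis, a study crosses the significance threshold
# with probability ALPHA purely by chance.
false_positives = sum(random.random() < ALPHA for _ in range(N_STUDIES))

# If journals mostly publish "significant" results, these chance findings
# are exactly what ends up in the literature.
print(f"{false_positives} of {N_STUDIES} null studies looked significant")
```

Roughly 5% of the null studies come out "significant" here, which is the point: the threshold caps the rate per study, not the number of false findings in a literature that filters for significance.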
-
And I think most of these have some
truth to them, but I want to talk
-
about a much simpler theory
for the replication crisis,
-
a theory that begins with
us looking at giraffes.
-
This video is sponsored by Squarespace.
-
Not this exact giraffe of course but the
process that led us to this exact giraffe.
-
Natural selection is the process
where organisms better adapted
-
to an environment tend to survive and reproduce.
-
And giraffes are an archetype
of natural selection.
-
The ones with longer necks were
better able to eat tree leaves
-
and reproduce and over time you get this.
-
But the scientist Dr. Paul
Smaldino argues
-
that this same process affects
more than just animals.
-
>> So if you can imagine, we're used to
thinking about evolution in terms of acting
-
on [inaudible] and on biological species that
die and reproduce but culture also evolves.
-
And, you know, Darwin laid out --
you don't need genes for evolution.
-
There are just -- you just need three things.
-
Right. You need variation in a population.
-
You need the things that vary to be heritable
to be able to be passed down from one individual
-
to another and you need selection so
that variation matters to the extent
-
to which those traits are passed down.
-
And anything that has those properties
is subject to natural selection.
-
>> Anything including science.
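-
Smaldino's three conditions can be sketched as a toy simulation. This is a hypothetical illustration (the trait, population size, and mutation rate are all made up): any population with variation, heritability, and selection drifts toward whatever the selection pressure favors.

```python
import random

random.seed(1)

POP_SIZE = 200

# Variation: each individual carries a heritable trait value in [0, 1]
# (think neck length, or any trait that affects survival).
population = [random.uniform(0, 1) for _ in range(POP_SIZE)]

for generation in range(50):
    # Selection: survival probability is proportional to the trait value.
    survivors = [t for t in population if random.random() < t] or population
    # Heritability: each offspring copies a random survivor's trait,
    # with a small mutation.
    population = [
        min(1.0, max(0.0, random.choice(survivors) + random.gauss(0, 0.02)))
        for _ in range(POP_SIZE)
    ]

mean_trait = sum(population) / len(population)
print(f"mean trait after 50 generations: {mean_trait:.2f}")
```

The mean trait climbs well above its starting value of 0.5, and nothing in the loop knows or intends anything; the drift falls out of the three conditions alone.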
-
>> So scientists vary in their methods.
-
There is heritability in which, you
know, different methods get passed
-
down so students have advisors who they
learn from, professors who have influence.
-
They spread their methods through
their papers, through their seminars,
-
through their lectures. And there's selection.
-
There are a lot more people in the
sciences who want jobs than can have them,
-
so there is a bottleneck.
-
And selection through the
bottleneck is non-random.
-
So the question then is: what
are the traits that we're selecting for?
-
[ Music ]
-
>> In theory, we're selecting for truth but
it's hard to know what's true so we've tended
-
to use publications in journals as a proxy.
-
So what we actually select for in practice
are as many publications as possible
-
in the best journals possible
which Smaldino admits sounds good.
-
>> You know, we want to hire
scientists who are productive,
-
who do a lot of -- you know
actually produce work.
-
We want to hire scientists whose work is
impactful, that gets cited a lot, gets --
-
you know is important enough to be in
high-impact journals and get press coverage.
-
The issue is that if we use measures like
how many papers, the impact
-
of the journal they're published
in, how many grants, right,
-
these are proxies and proxies can be gamed.
-
Proxies can not only be
gamed, they will be gamed.
-
This is something called
Goodhart's law in economics.
-
I'm going to demonstrate how.
-
Let's say you have a gold mine, and
we'll put it on Science Mountain.
-
Now normally the miners just look
for gold and sell it to merchants
-
but unfortunately there are
limited mines on Science Mountain.
-
So to make sure you have
the most productive miners,
-
you think it's a good idea
to measure productivity.
-
So you tell miners, the ones who sell
the most gold every year keep their jobs
-
and get promoted.
-
Fair enough.
-
But there is a small problem
on Science Mountain.
-
Gold is rare but there is also
fool's gold, which is less rare.
-
Fool's gold looks like gold.
-
It feels like gold but it's not gold.
-
And as the owner of the mine,
you don't want fool's gold.
-
It ruins your reputation long-term.
-
Now we'd hope that merchants would be
really good at spotting fool's gold.
-
They wouldn't buy it, so to
speak, so that we keep everything
-
in check but we find that's not the case.
-
Science Mountain has fool's gold.
-
It's really hard to spot, and the merchants
aren't that good at telling the difference.
-
The only people who are really experts
are the miners who specialize in that mine
-
but now you've given them every incentive
to work fast and ask questions later.
-
It's not hard to see that given that
situation, in a very short amount of time,
-
you'd flood the market with fool's gold
because whatever could be found and polished
-
up and sent off as gold would be.
-
Meanwhile, your scrupulous miners, the
ones who took their time and made sure only
-
to sell real gold would be fired
for not being productive enough.
-
This is Goodhart's law at work.
-
You picked a measure of productivity,
made it a target,
-
and inadvertently made all
your miners less productive.
-
Your miners are busy filling quotas
to try to keep their job instead
-
of making sure what they're
mining is really gold.
-
Now this, of course, is more
than a simple analogy.
-
This is the situation with real science.
-
Journals are not good at
catching false positives.
-
So when you make publication numbers
and citations the measure of a scientist
-
and what gets published are sensational
results, you might think you're selecting
-
for more true, exciting science but you're not.
-
You're just going to end up with a lot
of exciting science that appears true
-
but may, in fact, be built on nothing.
-
And it doesn't help that although
we pay lip service to replication,
-
we don't give scientists much incentive
to replicate old studies which means
-
that once an untruth gets into
our scientific literature,
-
it's really hard for us to dig it back out.
-
>> Because of the publish or perish mentality,
-
we are currently facing an impossible
amount of scientific articles.
-
The well-respected journal "The Lancet" researched
this a while ago, and they found that 85%
-
of published biomedical
research is rubbish, nonsense.
-
If you throw in enough data and you give it a
good scientific stir, you'd have to be a complete idiot
-
not to find a correlation
between something and something else,
-
and then you're published and your H-index goes
up and that, my dear scientist of the future,
-
is a good thing; not for society,
not for the world but for you.
-
[ Music ]
-
>> What is good for a scientist's
career isn't good for science.
-
That is at the heart of Smaldino's critique.
-
But he goes a step further and says
look, this isn't an individual problem.
-
This isn't a matter of some bad
scientists ruining it for the rest.
-
Natural selection doesn't require
conscious strategizing and here is
-
where our giraffe example is once again useful.
-
No individual giraffe conspired
to get a longer neck.
-
In the same way, no scientist even
needs to be aware of the system in order
-
for the replication crisis to happen.
-
This is a problem that Smaldino
thinks a lot of people miss.
-
>> If the incentives are still there, then
over time everyone can still have, you know,
-
perfectly good intentions and
say I'm going to be my best
-
but the people whose best involves
doing, let's say, less rigorous work
-
and getting more papers out, or over-hyping the results
-
or interpreting them in ways that, let's
say, have less integrity but, you know,
-
get the headline, they're going to get rewarded.
-
They're going to get the better jobs.
-
They're going to get more grants.
-
They're going to attract more students.
-
Their students are going to copy their methods
and those methods are going to propagate,
-
leading to worse results,
more false positives,
-
new false discoveries, less reproducibility.
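-
The dynamic Smaldino is describing can be sketched in a toy model, loosely in the spirit of his work on the natural selection of bad science. Everything here is a made-up illustration (the trade-off function, the copying rule, the parameters): no lab is malicious, yet selection on output alone drives rigor down.

```python
import random

random.seed(2)

N_LABS = 100
GENERATIONS = 200

# Each lab has a "rigor" level in [0, 1]. No lab is cheating; they just
# vary in how carefully they work.
labs = [random.uniform(0, 1) for _ in range(N_LABS)]

def papers_per_year(rigor):
    # Assumed trade-off: more rigorous work is slower, so it yields
    # fewer papers per year.
    return 10.0 * (1.2 - rigor)

for _ in range(GENERATIONS):
    labs.sort(key=papers_per_year, reverse=True)
    # Selection: the least "productive" (most rigorous) lab loses its
    # funding, and the most productive lab's methods are copied, with
    # variation, by a new lab that takes its place.
    labs[-1] = min(1.0, max(0.0, labs[0] + random.gauss(0, 0.05)))

mean_rigor = sum(labs) / len(labs)
print(f"mean rigor after {GENERATIONS} generations: {mean_rigor:.2f}")
```

In Smaldino's actual modeling the quantities are richer than raw paper counts, but the qualitative outcome is the same: whatever trait the proxy rewards is the trait that propagates, good intentions notwithstanding.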
-
>> And the problems don't stop there.
-
It's not just about bad research
being able to thrive.
-
Having a constant supply of so-called
groundbreaking studies also devalues what we
-
think of as true in science.
-
Consider that in the last 40 years, words
like innovative, groundbreaking and novel
-
in PubMed journal abstracts
have increased by 2,500%.
-
Now one would hope that scientists
have gotten that much more innovative
-
in 40 years but that seems unlikely.
-
Instead, it seems indicative of an
ever-increasing pressure on scientists
-
to be pumping out more and more papers every
year to be able to compete with their colleagues
-
so they don't lose their
spot on Science Mountain.
-
And in that environment, the
scientists who choose to do deep, slow,
-
rigorous work fall farther and farther behind.
-
You might have heard of Peter Higgs.
-
He won the Nobel Prize for physics in
2013 for predicting the Higgs boson.
-
But what you might not appreciate
is that that piece
-
of great science took Peter
Higgs a very long time.
-
It took him five years of publishing nothing,
just working on this one problem in order
-
to publish a paper that he would
eventually get the Nobel Prize for.
-
In an interview, Peter Higgs said
he couldn't do that work today.
-
He'd never be hired.
-
He says he isn't productive
enough by today's standards.
-
>> And to do deep work also
requires some removal.
-
Right. If you want to do really deep,
interesting, unique work --
-
and this is true of any field, not just
science: art, you know, literature.
-
Anything that involves creation, right, the
people who are going to be doing the things
-
that are the most unique and potentially
groundbreaking are probably not the people
-
who are always moving and hustling
and, you know, in the limelight.
-
It's going to be people who at least sometimes
take time and go away for a while and work
-
in a concentrated way on what they're doing.
-
And if you never have the time and the space
to do that, if the people who need the time
-
and the space to do that are never given the
opportunity to do that, you know what does
-
that mean for the landscape of research?
-
You know the kinds of research
that we see that gets done?
-
It's not obvious how to solve this problem, but
I think it's an important thing to consider.
-
>> Wait. Wait.
-
Wait. Wait.
-
Wait. Wait.
-
We can't end it like that.
-
We haven't even talked about solutions.
-
And we are about to fix that.
-
We will talk about solutions to these
very big, very real problems in science
-
but first we're going to talk
about a much simpler solution
-
to a big problem which is the website.
-
So thank you to our sponsor today, Squarespace.
-
Bad joke but great website builder.
-
Squarespace is your all-in-one spot for
any kind of website you want to build.
-
Now I know I talked in the last video about
how it's super customizable but really easy
-
to use and a surprisingly deep product.
-
But what I want to focus on
today is how fast it loads.
-
Now they use progressive image loading which
means that when someone goes on your site --
-
normally if you build a website yourself,
-
your images probably all try
to load at the same time.
-
It's slow.
-
Progressive image loading loads the top
images first so your website is very snappy.
-
We all know that snappy websites are king.
-
You don't want to get that back button we've all
clicked before because a website's not loading.
-
You don't want that to happen on your website.
-
So Squarespace, it solves all these
things for you so you don't have
-
to spend time worrying about them.
-
So if you want to go check it
out, it's a 14-day free trial.
-
You can build the website, try all the
little features, and when you're sure
-
that it does everything that you want
it to do, you can go to check out
-
and use promo code coffee break for
10% off your first domain or website.
-
Once again, that's promo code coffee break
for 10% off your first website or domain.
-
Thank you to Squarespace
for sponsoring this video.
-
Now let's talk about much
more complicated solutions:
-
how do we fix these problems in science?
-
So I put two full interviews together with
two really smart scientists all about this.
-
One is Dr. Smaldino.
-
You've seen him before and
number two is Dr. [inaudible].
-
Super great.
-
You're going to get a lot
out of that conversation,
-
but if you just want a little taste, here's
a little teaser for those two interviews.
-
Thank you for watching.
-
Thank you for supporting the channel,
and I will see you guys next time.
-
>> So let me be clear.
-
Like science is this great process.
-
It is like the best way to discover truth.
-
>> The notion that replication
is boring is demonstrably false.
-
>> There is, you know, this nice
game theory modeling showing
-
that a lottery would actually be a
really good use of the scientist's time.
-
>> The best intervention that we have
-
so far that's realigning the
incentives is Registered Reports.
-
[ Music ]