
Why Bad Science Spreads by Coffee Break

  • 0:00 - 0:04
    >> I'm here at the [inaudible] dying of heat
    exhaustion to talk to you about these guys.
  • 0:05 - 0:09
    Specifically, how they relate to one
    of our biggest problems in science.
  • 0:09 - 0:10
    Let's start from the beginning.
  • 0:11 - 0:14
    There is a replication crisis
    among scientists and it's serious.
  • 0:15 - 0:18
    Several scientific fields have
    done major replication attempts
  • 0:18 - 0:21
    and they just aren't getting the
    same results the second time around.
  • 0:21 - 0:24
    And this isn't a case of an
    odd finding here or there, no.
  • 0:24 - 0:30
    We're having to question foundational studies
    in the literature of psychology, biomedicine.
  • 0:30 - 0:32
    And while scientists in the harder fields
    might be smugly chuckling right now
  • 0:32 - 0:35
    about their epistemic superiority, many wonder
  • 0:35 - 0:39
    if they're not a big replication
    attempt away from the same situation.
  • 0:39 - 0:44
    In short, the replication crisis is a
    serious problem and not just for scientists.
  • 0:44 - 0:47
    Remember, what happens in
    science becomes public policy.
  • 0:47 - 0:49
    What happens in science becomes
    your drug treatment.
  • 0:50 - 0:54
    What happens in science turns into pop
    science which turns into your friend insisting
  • 0:54 - 0:58
    at a dinner party that power-posing is the
    secret to becoming dominant and assertive.
  • 0:59 - 0:59
    >> There is nothing there.
  • 1:00 - 1:02
    The books have nothing to say.
  • 1:03 - 1:05
    [ Music ]
  • 1:06 - 1:09
    >> We all have a vested interest
    in science getting it right
  • 1:09 - 1:11
    or at least correcting itself when it's wrong.
  • 1:12 - 1:16
    But what's so terrifying about the replication
    crisis isn't that we got a few things wrong,
  • 1:16 - 1:19
    it's that we didn't notice
    they were wrong for so long.
  • 1:19 - 1:22
    The explanations for the
    replication crisis are varied.
  • 1:22 - 1:26
    A few simply deny there is a problem;
    others propose that our standards
  • 1:26 - 1:28
    for statistical significance are too weak.
  • 1:28 - 1:33
    P-values, they would say, breed too many false
    positives and have helped create the crisis.
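
To make that false-positive arithmetic concrete, here is a minimal simulation sketch (my own illustration, not from the video; the study count and group sizes are arbitrary): even when every tested effect is null, a p < 0.05 threshold certifies roughly 5% of studies as significant.

```python
# Minimal sketch: false positives from a p < 0.05 threshold when
# every effect under test is actually null (illustrative numbers).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies, n_per_group = 1000, 30

false_positives = 0
for _ in range(n_studies):
    control = rng.normal(0, 1, n_per_group)    # no real effect
    treatment = rng.normal(0, 1, n_per_group)  # no real effect either
    _, p_value = stats.ttest_ind(control, treatment)
    if p_value < 0.05:
        false_positives += 1

# Expect roughly 50 of 1000 null studies to come out "significant".
print(f"{false_positives} of {n_studies} null studies reached p < 0.05")
```
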
  • 1:33 - 1:36
    Some say we don't incentivize
    replication enough.
  • 1:36 - 1:40
    Some say the gate-keeping institutions are
    to blame and the culprit list goes on and on.
  • 1:40 - 1:43
    And I think most of these have some
    truth to them, but I want to talk
  • 1:43 - 1:46
    about a much simpler theory
    for the replication crisis,
  • 1:46 - 1:51
    a theory that begins with
    us looking at giraffes.
  • 1:51 - 1:53
    This video is sponsored by Squarespace.
  • 1:53 - 1:58
    Not this exact giraffe of course but the
    process that led us to this exact giraffe.
  • 1:59 - 2:03
    Natural selection is the process
    where organisms better adapted
  • 2:03 - 2:05
    to an environment tend to survive and reproduce.
  • 2:06 - 2:08
    And giraffes are an archetype
    of natural selection.
  • 2:08 - 2:10
    The ones with longer necks were
    better able to eat tree leaves
  • 2:10 - 2:13
    and reproduce and over time you get this.
  • 2:14 - 2:16
    But the scientist Dr. Paul
    Smaldino argues
  • 2:16 - 2:21
    that this same process affects
    more than just animals.
  • 2:21 - 2:26
    >> So if you can imagine, we're used to
    thinking about evolution in terms of acting
  • 2:26 - 2:31
    on [inaudible] and on biological species that
    die and reproduce but culture also evolves.
  • 2:31 - 2:36
    And, you know, Darwin laid out --
    you don't need genes for evolution.
  • 2:36 - 2:38
    There are just -- you just need three things.
  • 2:38 - 2:42
    Right. You need variation in a population.
  • 2:42 - 2:47
    You need the things that vary to be heritable
    to be able to be passed down from one individual
  • 2:47 - 2:52
    to another and you need selection so
    that variation matters in the extent
  • 2:52 - 2:53
    to which those traits are passed down.
  • 2:53 - 2:59
    And anything that has those properties
    is subject to natural selection.
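
Those three conditions are enough to drive a toy simulation. Here is a minimal sketch (my own illustration with invented numbers, not from the video): a population with variation in neck length, heritable transmission to offspring, and selection favoring longer necks.

```python
# Toy sketch of Darwin's three conditions (invented numbers):
# variation + heritability + selection moves the population mean.
import random

random.seed(0)
# Variation: a population of giraffe neck lengths, in metres.
necks = [random.gauss(2.0, 0.3) for _ in range(200)]

for generation in range(100):
    # Selection: the longer-necked half reaches more leaves and breeds.
    necks.sort()
    parents = necks[100:]
    # Heritability: two offspring per parent, resembling it with noise.
    necks = [n + random.gauss(0, 0.05) for n in parents for _ in range(2)]

# The mean drifts upward even though no giraffe "tried" to change.
print(f"mean neck length: {sum(necks) / len(necks):.2f} m")
```
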
  • 2:59 - 3:00
    >> Anything including science.
  • 3:00 - 3:02
    >> So scientists vary in their methods.
  • 3:02 - 3:07
    There is heritability in which, you
    know, different methods get passed
  • 3:07 - 3:14
    down so students have advisors who they
    learn from, professors who have influence.
  • 3:14 - 3:17
    They spread their methods through
    their papers, through their seminars,
  • 3:17 - 3:20
    through their lectures. And there's selection.
  • 3:20 - 3:28
    There are a lot more people in the
    sciences who want jobs than can have them,
  • 3:28 - 3:30
    so there is a bottleneck.
  • 3:30 - 3:35
    And selection through the
    bottleneck is non-random, so, you know,
  • 3:35 - 3:39
    the question then is: what
    are the traits that we're selecting for?
  • 3:40 - 3:42
    [ Music ]
  • 3:42 - 3:48
    >> In theory, we're selecting for truth but
    it's hard to know what's true so we've tended
  • 3:48 - 3:50
    to use publications in journals as a proxy.
  • 3:50 - 3:56
    So what we actually select for in practice
    are as many publications as possible
  • 3:56 - 4:00
    in the best journals possible,
    which Smaldino admits sounds good.
  • 4:01 - 4:03
    >> You know, we want to hire
    scientists who are productive,
  • 4:03 - 4:06
    who do a lot of -- you know
    actually produce work.
  • 4:06 - 4:11
    We want to hire scientists whose work is
    impactful, that gets cited a lot, gets --
  • 4:11 - 4:17
    you know is important enough to be in
    high-impact journals and get press coverage.
  • 4:17 - 4:25
    The issue is that if we use the measure
    of how many papers, the impact of those --
  • 4:25 - 4:29
    of the journal they're published
    in, how many grants, right,
  • 4:29 - 4:34
    these are proxies and proxies can be gamed.
  • 4:34 - 4:37
    >> Proxies can not only be
    gamed, they will be gamed.
  • 4:37 - 4:40
    This is something called
    Goodhart's law in economics.
  • 4:40 - 4:42
    I'm going to demonstrate how.
  • 4:43 - 4:46
    Let's say you have a gold mine, and
    we'll put it on Science Mountain.
  • 4:46 - 4:49
    Now normally the miners just look
    for gold and sell it to merchants
  • 4:50 - 4:52
    but unfortunately there are
    limited mines on Science Mountain.
  • 4:52 - 4:54
    So to make sure you have
    the most productive miners,
  • 4:54 - 4:57
    you think it's a good idea
    to measure productivity.
  • 4:57 - 5:01
    So you tell miners, the ones who sell
    the most gold every year keep their jobs
  • 5:01 - 5:01
    and get promoted.
  • 5:02 - 5:02
    Fair enough.
  • 5:03 - 5:05
    But there is a small problem
    on Science Mountain.
  • 5:05 - 5:08
    Gold is rare but there is also
    fool's gold, which is less rare.
  • 5:09 - 5:10
    Fool's gold looks like gold.
  • 5:10 - 5:12
    It feels like gold but it's not gold.
  • 5:12 - 5:15
    And as the owner of the mine,
    you don't want fool's gold.
  • 5:15 - 5:16
    It ruins your reputation long-term.
  • 5:17 - 5:21
    Now we'd hope that merchants would be
    really good at spotting fools gold.
  • 5:21 - 5:24
    They wouldn't buy it, so to
    speak, so that we keep everything
  • 5:24 - 5:27
    in check but we find that's not the case.
  • 5:27 - 5:28
    Science Mountain has fool's gold.
  • 5:28 - 5:32
    It's really hard to spot, and the merchants
    aren't that good at telling the difference.
  • 5:32 - 5:36
    The only people who are really experts
    are the miners who specialize in that mine
  • 5:37 - 5:40
    but now you've given them every incentive
    to work fast and ask questions later.
  • 5:40 - 5:44
    It's not hard to see that given that
    situation, in a very short amount of time,
  • 5:44 - 5:49
    you'd flood the market with fool's gold
    because whatever could be found and polished
  • 5:49 - 5:51
    up and sent off as gold would be.
  • 5:51 - 5:56
    Meanwhile, your scrupulous miners, the
    ones who took their time and made sure only
  • 5:56 - 5:59
    to sell real gold, would be fired
    for not being productive enough.
  • 6:00 - 6:02
    This is Goodhart's law at work.
  • 6:02 - 6:05
    You picked a measure of productivity,
    made it a target,
  • 6:05 - 6:09
    and inadvertently made all
    your miners less productive.
  • 6:09 - 6:13
    Your miners are busy filling quotas
    to try to keep their job instead
  • 6:13 - 6:15
    of making sure what they're
    mining is really gold.
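
A back-of-envelope sketch of those incentives (all numbers invented for illustration): once yearly sales volume is the target, the careless strategy dominates even when merchants catch some of the fool's gold.

```python
# Invented numbers for illustration: rank miners by gold sold per
# year and the careful strategy loses to the careless one.
real_gold_per_year = 2         # nuggets a careful miner verifies and sells
shiny_rocks_per_year = 12      # rocks a careless miner polishes up
merchant_detection_rate = 0.1  # fraction of fool's gold merchants reject

careful_sales = real_gold_per_year
careless_sales = shiny_rocks_per_year * (1 - merchant_detection_rate)

# 2 vs 10.8: measured on sales volume, careful miners get fired.
print(f"careful: {careful_sales}/yr, careless: {careless_sales:.1f}/yr")
```
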
  • 6:16 - 6:18
    Now this, of course, is more
    than a simple analogy.
  • 6:18 - 6:20
    This is the situation with real science.
  • 6:21 - 6:24
    Journals are not good at
    catching false positives.
  • 6:24 - 6:28
    So when you make publication numbers
    and citations the measure of a scientist
  • 6:29 - 6:33
    and what gets published are sensational
    results, you might think you're selecting
  • 6:33 - 6:36
    for more true, exciting science but you're not.
  • 6:36 - 6:39
    You're just going to end up with a lot
    of exciting science that appears true
  • 6:39 - 6:41
    but may, in fact, be built on nothing.
  • 6:42 - 6:45
    And it doesn't help that although
    we pay lip service to replication,
  • 6:45 - 6:48
    we don't give scientists much incentive
    to replicate old studies, which means
  • 6:48 - 6:51
    that once an untruth gets into
    our scientific literature,
  • 6:51 - 6:53
    it's really hard for us to dig it back out.
  • 6:54 - 6:57
    >> Because of the publish or perish mentality,
  • 6:57 - 7:03
    we are currently facing an impossible
    amount of scientific articles.
  • 7:03 - 7:10
    The well-respected journal "The Lancet" researched
    that a while ago, and they found that 85%
  • 7:10 - 7:17
    of the published biomedical
    research is rubbish, nonsense.
  • 7:17 - 7:22
    If you throw in enough data and you give it a
    good scientific stir, you'd be a complete idiot
  • 7:22 - 7:27
    if you don't find any correlation
    between something and something else
  • 7:27 - 7:37
    and then you're published and your H-index goes
    up and that, my dear scientist of the future,
  • 7:38 - 7:44
    is a good thing; not for society,
    not for the world but for you.
  • 7:45 - 7:48
    [ Music ]
  • 7:48 - 7:53
    >> What is good for a scientist's
    career isn't good for science.
  • 7:53 - 7:55
    That is at the heart of Smaldino's critique.
  • 7:55 - 7:59
    But he goes a step further and says
    look, this isn't an individual problem.
  • 7:59 - 8:03
    This isn't a matter of some bad
    scientists ruining it for the rest.
  • 8:03 - 8:07
    Natural selection doesn't require
    conscious strategizing and here is
  • 8:07 - 8:09
    where our giraffe example is once again useful.
  • 8:09 - 8:13
    No individual giraffe conspired
    to get a longer neck.
  • 8:13 - 8:17
    In the same way, no scientist even
    needs to be aware of the system in order
  • 8:17 - 8:20
    for the replication crisis to happen.
  • 8:20 - 8:23
    This is a problem that Smaldino
    thinks a lot of people miss.
  • 8:23 - 8:30
    >> If the incentives are still there, then
    over time everyone can still have, you know,
  • 8:30 - 8:34
    perfectly good intentions and
    say I'm going to be my best
  • 8:34 - 8:40
    but the people whose best involves
    doing, let's say, less rigorous work
  • 8:40 - 8:46
    and getting more papers out,
    or over-hyping the results
  • 8:46 - 8:53
    or interpreting them in ways that, let's
    say, have less integrity but, you know,
  • 8:53 - 8:56
    get the headline, they're going to get rewarded.
  • 8:56 - 8:58
    They're going to get the better jobs.
  • 8:58 - 8:59
    They're going to get more grants.
  • 8:59 - 9:01
    They're going to attract more students.
  • 9:01 - 9:06
    Their students are going to copy their methods
    and those methods are going to propagate
  • 9:06 - 9:10
    and lead to worse results,
    more false positives,
  • 9:10 - 9:12
    new false discoveries, less reproducibility.
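
Here is a toy version of the dynamic Smaldino describes, loosely inspired by his "natural selection of bad science" argument but written as my own simplified sketch (not his published model): labs vary in rigor, lower rigor yields more papers, the most-published half keep their jobs, and students copy their advisors' methods with a little noise.

```python
# Simplified sketch (my own, not Smaldino's actual model): selection
# on publication count erodes rigor without anyone intending it.
import random

random.seed(1)
# Variation: each lab has a "rigor" trait between 0 and 1.
labs = [random.uniform(0.1, 0.9) for _ in range(100)]

for generation in range(50):
    # Selection: cutting corners is faster, so expected paper output
    # falls as rigor rises; the most-published half keep their jobs.
    ranked = sorted(labs, key=lambda r: random.random() * (1.5 - r),
                    reverse=True)
    survivors = ranked[:50]
    # Heritability: each surviving lab trains two students who copy
    # its methods, with a little noise.
    labs = [min(1.0, max(0.0, r + random.gauss(0, 0.02)))
            for r in survivors for _ in range(2)]

print(f"mean rigor after selection: {sum(labs) / len(labs):.2f}")
```

Mean rigor declines generation after generation even though every lab, in Smaldino's phrase, is just trying to do its best.
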
  • 9:12 - 9:15
    >> And the problems don't stop there.
  • 9:15 - 9:18
    It's not just about bad research
    being able to thrive.
  • 9:18 - 9:22
    Having a constant supply of so-called
    groundbreaking studies also devalues what we
  • 9:23 - 9:25
    think of as true in science.
  • 9:25 - 9:29
    Consider that in the last 40 years, the use of
    words like innovative, groundbreaking and novel
  • 9:29 - 9:34
    in PubMed journal abstracts
    has increased by 2,500%.
  • 9:34 - 9:37
    Now one would hope that scientists
    have gotten that much more innovative
  • 9:37 - 9:40
    in 40 years but that seems unlikely.
  • 9:40 - 9:44
    Instead, it seems indicative of an
    ever-increasing pressure on scientists
  • 9:44 - 9:49
    to be pumping out more and more papers every
    year to be able to compete with their colleagues
  • 9:49 - 9:51
    so they don't lose their
    spot on Science Mountain.
  • 9:51 - 9:55
    And in that environment, the
    scientists who choose to do deep, slow,
  • 9:55 - 9:59
    rigorous work fall farther and farther behind.
  • 9:59 - 10:03
    You might have heard of Peter Higgs.
  • 10:03 - 10:07
    He won the Nobel Prize in Physics in
    2013 for predicting the Higgs boson.
  • 10:07 - 10:10
    But what you might not appreciate
    is that that piece
  • 10:10 - 10:13
    of great science took Peter
    Higgs a very long time.
  • 10:13 - 10:18
    It took him five years of publishing nothing,
    just working on this one problem in order
  • 10:18 - 10:21
    to publish a paper that he would
    eventually get the Nobel Prize for.
  • 10:21 - 10:24
    In an interview, Peter Higgs said
    he couldn't do that work today.
  • 10:24 - 10:25
    He'd never be hired.
  • 10:25 - 10:28
    He says he isn't productive
    enough by today's standards.
  • 10:28 - 10:34
    >> And to do deep work also
    requires some removal.
  • 10:35 - 10:39
    Right. If you want to do really deep,
    interesting, unique work,
  • 10:39 - 10:45
    and this is true of any field, not
    just science: art, you know, literature,
  • 10:45 - 10:51
    anything that involves creation, the
    people who are going to be doing the things
  • 10:51 - 10:55
    that are the most unique and potentially
    groundbreaking are probably not the people
  • 10:55 - 10:59
    who are always moving and hustling
    and, you know, in the limelight.
  • 10:59 - 11:07
    It's going to be people who at least sometimes
    take time and go away for a while and work
  • 11:07 - 11:09
    in a concentrated way on what they're doing.
  • 11:09 - 11:14
    And if you never have the time and the space
    to do that, if the people who need the time
  • 11:14 - 11:20
    and the space to do that are never given the
    opportunity to do that, you know what does
  • 11:20 - 11:23
    that mean for the landscape of research?
  • 11:23 - 11:27
    You know, the kinds of research
    that we see get done?
  • 11:28 - 11:34
    It's not obvious how to solve this problem, but
    I think it's an important thing to consider.
  • 11:36 - 11:37
    >> Wait. Wait.
  • 11:37 - 11:37
    Wait. Wait.
  • 11:37 - 11:38
    Wait. Wait.
  • 11:38 - 11:39
    We can't end it like that.
  • 11:39 - 11:41
    We haven't even talked about solutions.
  • 11:41 - 11:42
    And we are about to fix that.
  • 11:42 - 11:46
    We will talk about solutions to these
    very big, very real problems in science
  • 11:46 - 11:49
    but first we're going to talk
    about a much simpler solution
  • 11:49 - 11:51
    to a big problem which is the website.
  • 11:51 - 11:54
    So thank you to our sponsor today, Squarespace.
  • 11:54 - 11:56
    Bad joke but great website builder.
  • 11:57 - 12:01
    Squarespace is your all-in-one spot for
    any kind of website you want to build.
  • 12:02 - 12:06
    Now I know I talked in the last video about
    how it's super customizable but really easy
  • 12:06 - 12:09
    to use, and a surprisingly deep product.
  • 12:09 - 12:12
    But what I want to focus on
    today is how fast it loads.
  • 12:12 - 12:15
    Now they use progressive image loading which
    means that when someone goes on your site --
  • 12:16 - 12:17
    normally if you build a website yourself,
  • 12:17 - 12:19
    your images probably all try
    to load at the same time.
  • 12:20 - 12:20
    It's slow.
  • 12:20 - 12:25
    Progressive image loading loads the top
    images first so your website is very snappy.
  • 12:25 - 12:27
    We all know that snappy websites are king.
  • 12:27 - 12:31
    You don't want to get that back button we've all
    clicked before because a website's not loading.
  • 12:31 - 12:32
    You don't want that to happen on your website.
  • 12:32 - 12:36
    So Squarespace, it solves all these
    things for you so you don't have
  • 12:36 - 12:38
    to spend time worrying about them.
  • 12:38 - 12:41
    So if you want to go check it
    out, it's a 14-day free trial.
  • 12:41 - 12:44
    You can build the website, try all the
    little features, and when you're sure
  • 12:44 - 12:48
    that it does everything that you want
    it to do, you can go to check out
  • 12:48 - 12:52
    and use promo code coffee break for
    10% off your first domain or website.
  • 12:52 - 12:56
    Once again, that's promo code coffee break
    for 10% off your first website or domain.
  • 12:57 - 12:59
    Thank you to Squarespace
    for sponsoring this video.
  • 12:59 - 13:01
    Now let's talk about much
    more complicated solutions:
  • 13:01 - 13:03
    how do we fix these problems in science?
  • 13:03 - 13:07
    So I put two full interviews together with
    two really smart scientists all about this.
  • 13:07 - 13:08
    One is Dr. Smaldino.
  • 13:08 - 13:11
    You've seen him before and
    number two is Dr. [inaudible].
  • 13:11 - 13:12
    Super great.
  • 13:12 - 13:14
    You're going to get a lot
    out of that conversation,
  • 13:14 - 13:20
    but if you just want a little taste, here's
    a little teaser for those two interviews.
  • 13:20 - 13:20
    Thank you for watching.
  • 13:20 - 13:23
    Thank you for supporting the channel,
    and I will see you guys next time.
  • 13:23 - 13:24
    >> So let me be clear.
  • 13:24 - 13:27
    Like science is this great process.
  • 13:27 - 13:30
    It is like the best way to discover truth.
  • 13:30 - 13:34
    >> The notion that replication
    is boring is demonstrably false.
  • 13:34 - 13:37
    >> There is, you know, this nice
    game theory modeling showing
  • 13:37 - 13:42
    that a lottery would actually be a
    really good use of the scientist's time.
  • 13:42 - 13:44
    >> The best intervention that we have
  • 13:44 - 13:48
    so far that's realigning the
    incentives is Registered Reports.
  • 13:49 - 13:53
    [ Music ]