
https:/.../5-6-19+Low+Q_pt3.mp4

  • 0:01 - 0:04
    And so this is a relevant thing to
    think about.
  • 0:04 - 0:08
    So this was a very talented scientist.
    I saw her talk at
  • 0:08 - 0:12
    a Gordon Research Conference and
    immediately asked her to apply to UT.
  • 0:12 - 0:18
    I said "You should apply for a faculty job
    at UT", completely sold on the story.
  • 0:19 - 0:22
    So was UT Southwestern, who actually
    offered her a job
  • 0:22 - 0:24
    as an assistant professor.
  • 0:24 - 0:28
    So was the Cancer Prevention and
    Research Institute of Texas,
  • 0:28 - 0:31
    which offered her a two million dollar
    recruitment grant.
  • 0:32 - 0:36
    All of which was of course rescinded
    when this was discovered.
  • 0:37 - 0:43
    Turns out she had fabricated all the data
    in a second paper in PNAS as well.
  • 0:44 - 0:49
    And so I think it's worth considering
    these things and sort of paying attention
  • 0:49 - 0:51
    to what happens here, right?
  • 0:51 - 0:55
    So Clare Waterman, the senior author
    on these papers is a very famous
  • 0:55 - 0:58
    cell biologist, who's been in the
    business for 25 years,
  • 0:58 - 0:59
    at least as long as I have.
  • 1:01 - 1:05
    And is very well regarded, very
    respected by all of her colleagues,
  • 1:06 - 1:09
    Obviously this puts her in
    quite a bind, right?
  • 1:11 - 1:12
    And it's worth thinking about.
  • 1:12 - 1:14
    We've had faculty in this department
    where this has happened, right?
  • 1:14 - 1:19
    This appears to be a case where
    a very intelligent young scientist
  • 1:19 - 1:23
    somehow felt something about the
    pressure of the job and needed to
  • 1:23 - 1:26
    make up all the data in two entire papers.
  • 1:29 - 1:31
    So one of the questions you ask,
  • 1:31 - 1:34
    that you get asked a lot by
    non-scientists, is
  • 1:34 - 1:37
    "Well, why doesn't peer review find it?"
  • 1:38 - 1:40
    Right? How does this get through
    peer review?
  • 1:40 - 1:42
    Did anyone see anything wrong
    with this paper? If I had just given
  • 1:42 - 1:45
    you this paper, you'd have read
    it and you'd have loved it, right?
  • 1:46 - 1:50
    Because if you're a smart person,
    and you're a dedicated fraud,
  • 1:50 - 1:52
    it's very hard to get caught, right?
  • 1:52 - 1:56
    If you just make up numbers and put
    them on a graph, and it looks good,
  • 1:56 - 1:59
    very hard to see that in peer review.
    Right?
  • 2:00 - 2:03
    If you pick the part of the image that
    shows what you want to show...
  • 2:05 - 2:07
    it's very easy to get away
    with this actually.
  • 2:10 - 2:12
    >> The only thing I was suspicious about
    in the whole paper was that
  • 2:12 - 2:17
    it all came together so perfectly,
    and that's just like not how it should
  • 2:17 - 2:18
    ever happen with cancer cells.
  • 2:18 - 2:20
    >> Yeah, but it does sometimes, right?
  • 2:20 - 2:24
    Yeah, right, it does, and when you sit
    there and you watch the data,
  • 2:24 - 2:26
    and it's from a good lab-- right?
  • 2:27 - 2:30
    It's very hard to say,
    "Ah, it's too perfect."
  • 2:30 - 2:34
    You know, you wouldn't have said
    "Oh, clearly you've made all of this up."
  • 2:34 - 2:36
    Right? Yeah, yeah, yeah. Right?
  • 2:36 - 2:39
    >> I don't - I was only suspicious
    because I had thought already
  • 2:39 - 2:41
    that that's what happened with
    this paper.
  • 2:41 - 2:43
    >> Yeah, yeah, yeah. No.
  • 2:43 - 2:46
    It's hard to do the exercise
    any other way.
  • 2:47 - 2:48
    August had a question and
    then Jifa.
  • 2:48 - 2:52
    >> No, just because we were quite -
  • 2:52 - 2:58
    you know, they made a little bit
    of a far-fetched claim, but
  • 2:58 - 3:02
    at the same time we can see that,
    yeah, the phenotype's right.
  • 3:02 - 3:05
    So I think...
    >> Yeah, Jifa?
  • 3:06 - 3:11
    >> So, were the other scientists
    here in on it, or was this like -
  • 3:11 - 3:12
    did she orchestrate it?
  • 3:12 - 3:16
    >> No, the Office of Research Integrity
    puts it all on the one person.
  • 3:16 - 3:19
    Basically just a dedicated fraud
    all the way through.
  • 3:21 - 3:24
    And again, it wouldn't be that
    hard to do, right?
  • 3:24 - 3:28
    So probably the other authors
    involved are the authors who
  • 3:28 - 3:32
    generated other reagents,
    also possible that many of them
  • 3:32 - 3:37
    just generated reagents and gave
    them to this person, to do work on,
  • 3:37 - 3:40
    and then she did experiments on
    them and made up the results.
  • 3:40 - 3:43
    But actually in the entire
    investigation, only one person
  • 3:43 - 3:45
    was found to be the root of all of this.
  • 3:47 - 3:47
    And so...
  • 3:47 - 3:49
    >> (inaudible).
  • 3:49 - 3:50
    >> Huh?
  • 3:50 - 3:51
    >> I said, it sucks for the other
    authors of the paper.
  • 3:51 - 3:53
    >> Oh, it sucks for everybody,
    absolutely.
  • 3:54 - 4:02
    >> I'm sorry, I just want to know,
    how about the (inaudible)
  • 4:02 - 4:04
    >> I think we don't know. I think
    someone will have to go back
  • 4:04 - 4:06
    and redo all those experiments, right?
  • 4:06 - 4:14
    So these are the findings of what
    people said, but even those findings
  • 4:14 - 4:20
    in the first few figures, clearly the
    numbers are inflated, right?
  • 4:22 - 4:25
    And so I think someone will have
    to go back and start over.
  • 4:25 - 4:30
    >> I didn't read the summary at
    the beginning because (inaudible),
  • 4:30 - 4:36
    so I just read the introduction,
    but I feel like the titles are very attractive,
  • 4:36 - 4:43
    and then I read-- briefly went through the
    introduction and read each figure.
  • 4:44 - 4:50
    I thought they were basically true, then you
    gave us a little time, so I went back
  • 4:50 - 4:53
    to read the summary.
    >> Yup.
  • 4:53 - 4:58
    >> And I felt like some of the summaries
    they were giving for the whole picture
  • 4:58 - 5:04
    of this paper, I felt like how could
    it be so perfect, you know in
  • 5:05 - 5:08
    animal models, and the (inaudible)
  • 5:08 - 5:10
    >> Yeah, but every once in a while
    you get lucky, right?
  • 5:10 - 5:12
    And so that's the thing with fraud, right?
  • 5:12 - 5:17
    If it's really consistent and you
    stick to your guns, it's really hard
  • 5:17 - 5:20
    to catch. Now, the funny thing is
  • 5:21 - 5:25
    that surely she must have thought
    someone would try to repeat -
  • 5:25 - 5:28
    I mean it's a cell paper. She was
    on the job market, she gave
  • 5:28 - 5:32
    talks all over the country.
    She must have known that
  • 5:32 - 5:34
    somebody was going to do
    these experiments.
  • 5:35 - 5:37
    And so I just don't know.
    August?
  • 5:37 - 5:41
    >> I'm pretty sure that Skau's
    reputation is completely shattered,
  • 5:41 - 5:44
    but how is the reputation of
    Clare Waterman?
  • 5:44 - 5:48
    >> I mean that's always a danger, right?
    And time will tell, right?
  • 5:48 - 5:51
    It's the only time she's ever
    been associated with it.
  • 5:51 - 5:56
    The independent investigation
    found that it was all this one person.
  • 5:57 - 5:59
    So, time will tell.
  • 6:00 - 6:02
    But right now I would say
    it's still very good.
  • 6:02 - 6:03
    >> Oh, okay.
    >>Right? Yeah.
  • 6:05 - 6:10
    >> This makes me wonder like, what
    is so terribly wrong with the system
  • 6:10 - 6:12
    that somebody (inaudible)
  • 6:12 - 6:17
    >> Sure. Right? I mean I think
    that's one of the big questions,
  • 6:17 - 6:19
    and what's kind of related
    to that, that I think's a really
  • 6:19 - 6:21
    subtle point - I'll get to
    your questions in a minute,
  • 6:21 - 6:24
    but I do want to make this point,
    because it comes straight to this.
  • 6:24 - 6:24
    Right?
  • 6:26 - 6:33
    It's very easy to start believing
    your own bullshit. Right?
  • 6:34 - 6:41
    And there's a big difference between
    really making up entire papers
  • 6:41 - 6:47
    and not being honest with yourself,
    but they all are part of the
  • 6:47 - 6:50
    same spectrum, and something you
    have to guard against is
  • 6:51 - 6:53
    "Am I really doing this the right way?"
  • 6:53 - 6:55
    "Am I really seeing what
    I think I'm seeing?"
  • 6:55 - 6:56
    "Am I really being completely honest?"
  • 6:56 - 7:01
    Because especially once you get halfway
    into a paper, halfway into your PhD,
  • 7:03 - 7:07
    it gets, you know, there's
    a lot of pressure.
  • 7:08 - 7:11
    And it's something you have to
    fight against, right?
  • 7:11 - 7:14
    Especially when a project's
    going south, right?
  • 7:14 - 7:17
    You've invested a lot in it,
    and everything's starting to
  • 7:17 - 7:21
    fall apart, and so you've
    got to be vigilant.
  • 7:21 - 7:24
    And again, I'm not saying that
    you're likely to decide to just
  • 7:24 - 7:27
    make up all the data in the
    last three figures of your paper.
  • 7:28 - 7:31
    But the system is stressful.
  • 7:31 - 7:33
    It's a very competitive system.
  • 7:34 - 7:36
    It's a very competitive world.
  • 7:37 - 7:40
    Is fraud any worse now than it used to be,
  • 7:40 - 7:42
    or can we just detect it better, right?
  • 7:42 - 7:46
    I don't actually know. My guess is that
    we can detect it a lot better, right?
  • 7:46 - 7:47
    There's a lot better tools for it.
  • 7:47 - 7:50
    But there's also a lot more money
    in science than there ever was before
  • 7:50 - 7:54
    at all levels, even in academia,
    which creates a big incentive
  • 7:55 - 7:58
    for people to be unethical.
  • 7:59 - 8:03
    But I mean these are serious concerns,
    I think these are things that, as
  • 8:03 - 8:07
    graduate students, you guys should
    be thinking about and considering.
  • 8:07 - 8:10
    There were some hands up.
    I can't remember who they all were.
  • 8:10 - 8:11
    Will?
  • 8:12 - 8:15
    >> I was just going to say that it's
    crazy that it seems like the
  • 8:15 - 8:17
    way that she started to get caught
    was that she sent
  • 8:17 - 8:21
    a stably-transfected cell line, and it
    wasn't actually stably-transfected.
  • 8:21 - 8:23
    >> Yup.
    >> It seems so basic.
  • 8:24 - 8:27
    >> Yeah, I mean but once you're there,
    right, they ask for the line,
  • 8:27 - 8:28
    what are you going to do?
  • 8:29 - 8:33
    >> That's also easy to hand-wave away,
    to be like "Oh, it might have..."
  • 8:33 - 8:37
    You know, you build these lines
    and they fall apart, so I mean...
  • 8:37 - 8:42
    and if you're this confident in
    falsifying data that ends up
  • 8:42 - 8:46
    in the "Cell" paper, you're probably
    willing to think that you can
  • 8:46 - 8:48
    argue your way out of
    something like that, right?
  • 8:50 - 8:54
    Do you see what I'm saying,
    like the mental-- yeah, yeah.
  • 8:54 - 8:59
    >> Well but also at this level, you know
    it's slid over into pathology, right?
  • 8:59 - 9:04
    I mean literally every single thing
    in this paper has something
  • 9:04 - 9:05
    fraudulent about it.
  • 9:06 - 9:09
    Right, which is really evidence to me
    that someone has gone
  • 9:09 - 9:14
    way past "Oh crap, I've got to just
    save this one paper" into
  • 9:14 - 9:15
    "I'm invincible".
  • 9:16 - 9:17
    Right?
    >> Yeah.
  • 9:17 - 9:20
    >> Yeah yeah yeah, I mean
    I don't know, but yeah.
  • 9:20 - 9:23
    >> Yeah, so apart from the reputation
    and embarrassment, what is the
  • 9:23 - 9:26
    punishment for such (inaudible)?
  • 9:26 - 9:31
    >> Right, so the punishment for her
    is-- I mean, the punishment
  • 9:31 - 9:34
    is her entire scientific career
    is completely ruined.
  • 9:34 - 9:36
    She'll never work in the
    field again, right?
  • 9:36 - 9:38
    >> What about the taxpayer's money?
    >> Right, right, right.
  • 9:38 - 9:41
    Taxpayer's money, there's nothing--
    I mean, nothing done.
  • 9:42 - 9:45
    They have these voluntary settlements
    here that you can read about.
  • 9:46 - 9:52
    And these things are really built
    around the assumption of
  • 9:52 - 9:54
    small amounts of fraud, you know?
  • 9:54 - 10:00
    You made a mistake, you're a savable
    person as a scientist, then they
  • 10:00 - 10:02
    put a supervisory plan in place
    where someone's really
  • 10:02 - 10:05
    looking a lot more carefully at your work,
    and then for three years,
  • 10:05 - 10:08
    you're not allowed to work for the NIH
    and these kinds of things.
  • 10:08 - 10:11
    But at this level, her reputation
    is just destroyed, so there's--
  • 10:11 - 10:13
    we won't see her anywhere
    in academic science again.
  • 10:13 - 10:18
    In terms of monetary, these kinds of
    things, I don't know of any,
  • 10:18 - 10:20
    and there's much more-- there was
    a big "New York Times" piece
  • 10:20 - 10:24
    about a cancer biologist at
    Penn State who's apparently
  • 10:24 - 10:29
    been under investigation for fraud
    like every other year for ten years,
  • 10:29 - 10:32
    and he still has millions of dollars
    of research money and it just sort of
  • 10:32 - 10:34
    never comes home to roost, so.
  • 10:35 - 10:38
    You guys should educate yourself
    about the fraud situation,
  • 10:39 - 10:40
    and what's there.
  • 10:40 - 10:42
    Did someone else have a hand up?
  • 10:42 - 10:43
    >> Can I say something quick?
    >> Yeah.
  • 10:43 - 10:48
    >> I saw in the news, UT Southwestern
    removed a million dollar grant from her.
  • 10:48 - 10:50
    >> Yeah yeah, that's the grant.
    Yeah-- oh yeah, she--
  • 10:51 - 10:54
    the job was rescinded, the grant
    was rescinded, so she's not
  • 10:54 - 10:56
    getting the grant, she didn't
    get the job, she will absolutely
  • 10:56 - 10:58
    have to leave science.
  • 10:58 - 11:01
    As I understand it, she's like a
    program officer at a foundation now
  • 11:01 - 11:02
    or something. Yeah.
  • 11:02 - 11:06
    >> But not just the monetary loss,
    what about killing so many mice
  • 11:06 - 11:09
    and falsifying data? (inaudible).
    (laughter)
  • 11:11 - 11:14
    >> If you go down the road that
    killing life is a problem,
  • 11:14 - 11:16
    then you're going to have
    a real problem.
  • 11:16 - 11:18
    >> If you're doing science, doing
    something productive,
  • 11:18 - 11:20
    it makes sense but if
    you're falsifying data.
  • 11:20 - 11:21
    >> Yeah.
  • 11:23 - 11:25
    >> I was--
    >> Hang on, you had a hand up.
  • 11:26 - 11:29
    >> How often-- obviously this one
    seems like it was very intentional,
  • 11:29 - 11:31
    and so that's really-- but how
    often do people get
  • 11:31 - 11:33
    caught up in this just because
    they weren't careful enough
  • 11:33 - 11:37
    and they didn't force themselves
    to be very thorough?
  • 11:37 - 11:41
    >> I mean mistakes get made in papers
    all the time, right, and some papers
  • 11:41 - 11:46
    get retracted because we went back
    and resequenced the mouse allele,
  • 11:46 - 11:49
    and it wasn't what we thought
    it was, right?
  • 11:49 - 11:52
    But generally what happens is there's
    a correction. Generally when that happens,
  • 11:52 - 11:55
    it's not like the entire paper's ruined,
    "Oh, surprise, we mutated
  • 11:55 - 11:57
    totally the wrong gene".
  • 11:57 - 12:00
    Usually it's like "Oh, we thought it was
    exon 3 with a frameshift,
  • 12:00 - 12:03
    but in fact it was something else."
    Right?
  • 12:03 - 12:04
    >> Right.
  • 12:04 - 12:09
    >> And so you get a lot of corrections
    in papers, and some retractions,
  • 12:09 - 12:13
    but generally, you know, you've
    got to-- the whole paper
  • 12:13 - 12:16
    really has to be systematically
    wrong to retract the entire paper.
  • 12:16 - 12:19
    >> So it's the dishonesty that
    gets prosecuted.
  • 12:19 - 12:23
    >> Yeah, or I mean, it-- yeah, so the
    dishonesty, immediately you want to
  • 12:23 - 12:26
    retract the paper, because that's
    also what the PI is going to
  • 12:26 - 12:27
    want to do.
    >> Right.
  • 12:27 - 12:31
    >> Right? But if you just get it wrong,
    you know, you get it wrong,
  • 12:31 - 12:32
    we're human, right?
  • 12:32 - 12:36
    If there was no-- and there have been
    investigations where they find
  • 12:36 - 12:39
    "Oh, no, they weren't unethical,
    they were just dumb."
  • 12:39 - 12:40
    (laughter)
  • 12:40 - 12:42
    No, seriously, right? And that can
    happen, right?
  • 12:42 - 12:44
    And that's not a crime.
  • 12:45 - 12:46
    So, yeah.
  • 12:47 - 12:50
    >> Who does these investigations?
    Is there like a secret service
  • 12:50 - 12:52
    for scientists, like--
    >> Yeah.
  • 12:52 - 12:54
    The Office of Research Integrity
    at the NIH, right?
  • 12:54 - 12:55
    >> At the NIH?
  • 12:55 - 12:59
    >> Yup, and then all universities will
    have an office for research integrity,
  • 12:59 - 13:03
    so I'm sure that-- so if it happens at UT,
    there's a UT office that will investigate,
  • 13:03 - 13:06
    but the NIH will also investigate
    if it's NIH-funded research.
  • 13:07 - 13:09
    Yeah, so several different bodies.
  • 13:09 - 13:11
    And there's a...
  • 13:12 - 13:13
    Yeah.
  • 13:14 - 13:17
    >> Has anyone here read
    the book "Bad Blood?"
  • 13:17 - 13:18
    >> "Bad Blood" what is that?
  • 13:18 - 13:21
    >> It's about the Theranos controversy
    and (inaudible).
  • 13:21 - 13:24
    >> Oh, wow. No, is it good?
    >> It's amazing.
  • 13:24 - 13:26
    >> Really? "Bad Blood."
    >> Everyone should read it.
  • 13:26 - 13:27
    >> Okay.
    >> It's really good.
  • 13:27 - 13:29
    >> The documentary on
    HBO's pretty good.
  • 13:30 - 13:31
    >> Same title? Okay.
  • 13:32 - 13:33
    >> I think so.
    >> Okay.
  • 13:33 - 13:36
    >> So lots of these
    kinds of things exist.
  • 13:36 - 13:37
    >> Yeah (laughs).
  • 13:37 - 13:38
    >> Just like the...
  • 13:40 - 13:43
    I don't know how to
    say that vitamin C is
  • 13:43 - 13:48
    for everything, and so now,
    the (inaudible) for everything
  • 13:48 - 13:50
    and the (inaudible) can do everything,
  • 13:50 - 13:52
    and they just make money from that.
  • 13:52 - 13:52
    (laughter)
  • 13:52 - 13:55
    >> As long as they don't kill anyone,
  • 13:55 - 13:58
    they just sell the ideas
    to the (inaudible).
  • 13:58 - 14:01
    >> Yeah, I mean that's the entire
    supplement industry, right?
  • 14:03 - 14:06
    All right. Any other questions?
  • 14:06 - 14:08
    Thoughts, comments?
  • 14:08 - 14:11
    >> When is the next assignment due?
  • 14:12 - 14:15
    >> Next assignment is due
    a week from Wednesday.
  • 14:18 - 14:20
    No, so nine days from now.
  • 14:20 - 14:21
    >> Nine days?
  • 14:21 - 14:22
    >> Nine days.
  • 14:23 - 14:25
    Whatever the date is here,
    actually I have a calendar
  • 14:25 - 14:27
    in front of me.
    >> It is the 6th so the 15th.
  • 14:28 - 14:30
    >> The 15th.
    >> Ooh, I'm good at simple math.
  • 14:31 - 14:34
    >> It is as my calendar says,
    'cause I have an 11-year-old
  • 14:34 - 14:37
    girl at home, and "Riverdale"
    season three starts.
  • 14:37 - 14:38
    (laughter)
  • 14:38 - 14:40
    But also your assignment is due,
    so get it done before River--
  • 14:40 - 14:41
    >> "Riverdale" season three
    already happened.
  • 14:41 - 14:44
    >> What?
    (laughter)
  • 14:44 - 14:46
    Oh, no, no, it comes out
    free on Netflix. Yeah.
  • 14:46 - 14:48
    (laughter)
  • 14:49 - 14:51
    >> Judge me all you want.
  • 14:51 - 14:52
    >> I have a shared--
    >> I know all of you
  • 14:52 - 14:54
    watch "Game of Thrones,"
    and I don't, so.
  • 14:54 - 14:56
    >> No, no I don't either.
  • 14:56 - 14:57
    (crosstalk)
  • 14:57 - 14:58
    >> Jifa?
  • 14:58 - 14:59
    >> What's going on Wednesday?
  • 14:59 - 15:02
    >> Haven't decided yet. It's going to
    be so much fun, though.
  • 15:02 - 15:03
    (laughter)
  • 15:06 - 15:08
    All right, cool. We're done.
  • 15:08 - 15:11
    (crosstalk)
  • 15:18 - 15:22
    >> I mean I sent in my paper
    just last Friday afternoon.
  • 15:22 - 15:24
    >> Okay.
    >> Was that too late, 'cause I--
  • 15:24 - 15:26
    >> It was too late, 'cause we
    told you again and again and again--
  • 15:26 - 15:29
    >> Oh my--
    >> All right, so anyway
  • 15:29 - 15:31
    >> Sorry about that.
    >> Did you get your paper in today?
  • 15:31 - 15:32
    >> Yes. Yes, yes.
    >> Okay. Great.
  • 15:32 - 15:35
    We will survive it.
    >> Thank you, thank you.
  • 15:35 - 15:37
    >> All right.
  • 15:37 - 15:38
    (crosstalk continues)
  • 15:38 - 15:40
    >> Oh sorry, I wasn't (inaudible).
  • 15:41 - 15:46
    And on Wednesday-- I mean, this is
    not about the class,
  • 15:46 - 15:50
    it's slightly about the class,
    because (inaudible)--
  • 15:50 - 15:53
    invitation, having lunch--
    >> With Roy Parker?
  • 15:53 - 15:55
    Yeah, you leave early
    and go to that.
  • 15:55 - 15:58
    >> No no, not for that, but on
    Wednesday afternoon,
  • 15:58 - 16:02
    I'm flying to (inaudible) to walk
    in the commencement.
  • 16:02 - 16:04
    >> Oh okay. So you'll
    miss this talk?
  • 16:04 - 16:08
    >> Yeah, I'll miss this talk likely,
    and I'll be away for two weeks
  • 16:08 - 16:10
    because my parents are coming.
    >> No problem.
  • 16:10 - 16:14
    So send your paper in before that.
    >> So I was going to submit it
  • 16:14 - 16:15
    on Wednesday.
    >> Great, no problem.
  • 16:15 - 16:18
    Okay, cool, good, it works.
    >> Thank you.
  • 16:18 - 16:19
    >> You bet.
  • 16:19 - 16:19
    >> Mina?
    >> Yeah?
  • 16:19 - 16:22
    >> So are you going to be in
    your office right now, or?
  • 16:22 - 16:25
    >> Yeah yeah, I'll be here all day
    doing experiments,
  • 16:25 - 16:27
    so if I'm not there, just leave it
    on my desk.
  • 16:27 - 16:28
    >> Okay.
  • 16:28 - 16:30
    (crosstalk)