0:00:01.230,0:00:03.055 One, because it's going to ruin our day,[br]alright? 0:00:03.055,0:00:09.780 So, uh, if you know anything special[br]about this paper, don't tell anybody. 0:00:09.780,0:00:13.385 What I really want you guys to do[br]is to not use the internet. 0:00:13.385,0:00:17.982 We're going to take a close look at this[br]paper, see what you guys think about it. 0:00:17.982,0:00:20.545 We're going to read it, I'm going to give[br]you about 20 minutes 0:00:20.545,0:00:23.253 to read this paper and discuss it.[br]Maybe half an hour. 0:00:23.253,0:00:24.717 Talk to your friends if you want to. 0:00:24.717,0:00:28.121 The idea is going to be to get through it[br]as fast as you can, and let's talk about 0:00:28.121,0:00:30.979 if we like this paper or not, alright? 0:00:30.979,0:00:33.433 Cool. 0:00:33.433,0:00:38.413 (crosstalk) 0:00:38.413,0:00:41.477 >> Put them up here.[br]>> Mia apparently doesn't staple papers 0:00:41.477,0:00:43.347 that she reads, so they're not stapled. 0:00:43.347,0:00:46.522 So it's just complete mayhem. 0:00:48.484,0:00:52.192 >> At least I got the job done.[br]>> No, not really. 0:00:53.015,0:01:13.725 (crosstalk) 0:01:37.136,0:01:39.976 >> Did anybody not get one? 0:01:41.230,0:01:44.838 (crosstalk) 0:01:49.838,0:01:51.501 >> Alright. I'll be back. 0:01:53.590,0:02:07.868 (crosstalk) 0:29:33.784,0:29:37.822 >> Alright, so, maybe read for about[br]five more minutes, and then maybe 0:29:37.822,0:29:40.800 discuss this for about five or[br]ten minutes. 0:29:45.026,0:29:49.792 So read for five more minutes,[br]then we'll discuss 0:29:49.842,0:29:53.418 for five or ten minutes, and then I'll[br]lead the discussion on the paper. 0:29:53.418,0:29:56.671 The discussion is really going to[br]center around what do you find convincing? 0:29:56.671,0:29:59.327 What do you like and dislike? Same kind of[br]discussions we've had, but sort of 0:29:59.327,0:30:03.775 a short version, just to kind of see if[br]maybe people will notice things 0:30:03.775,0:30:05.733 that other people didn't notice. 0:30:05.733,0:30:09.217 Does anyone know about this paper?[br]Anyone know what I'm talking about here? 0:30:09.217,0:30:12.244 Has anyone read this paper before? 0:30:12.244,0:30:14.212 Alright, good, we're all[br]on the same page. 0:30:14.212,0:30:17.534 Alright, read for a few more minutes, and[br]then when you're ready, start discussing 0:30:17.534,0:30:21.294 with whoever your neighbors - or[br]whoever you want to talk to about it. 0:30:21.294,0:30:23.451 Talk about this paper for[br]ten or fifteen minutes. 0:30:25.224,0:30:28.654 See what you think about it, see if you[br]can come up with a consensus. 0:37:36.675,0:37:59.618 (crosstalk) 0:44:50.319,0:44:58.191 >> Alright, okay, so uh, what do you[br]guys think of this paper? 0:44:58.191,0:45:00.943 >> Cool.[br]>> Yeah, super cool, huh? 0:45:00.943,0:45:04.203 Like really really good.[br]Yeah, I think you guys got it. 0:45:04.203,0:45:05.725 Yeah, and what do you think? 0:45:05.725,0:45:07.800 >> We have a theory[br]that it's all made up. 0:45:07.800,0:45:10.013 >> Ahh.[br](class laughing) 0:45:10.013,0:45:13.000 Anyone else think it's all made up?[br]>> I was convinced. 0:45:13.000,0:45:16.975 >> Based purely on the fact that you[br]told us that there was a retraction. 0:45:16.975,0:45:21.398 >> You can't find it, can you?[br]>> You also gave us the author manuscript, 0:45:21.398,0:45:23.363 so, yeah.[br]>> Yeah, yeah, no, no. 
0:45:23.363,0:45:26.097 So, go do some Googling right now,[br]everybody pull up your laptop or your 0:45:26.097,0:45:29.564 phone and see what you think here,[br]because this is a really interesting 0:45:29.564,0:45:32.694 lesson in how to do this, right? 0:45:32.694,0:45:38.255 Yeah, yeah, yeah, no. But go see[br]what you can find about this paper. 0:45:38.255,0:45:41.079 It's a useful exercise. 0:45:41.079,0:45:46.085 A couple of minutes of your own[br]research by phone or laptop, 0:45:46.085,0:45:48.094 and then we'll have[br]a little more conversation. 0:45:50.439,0:46:06.354 (crosstalk) 0:48:27.216,0:48:30.474 >> All right. That's pretty amazing,[br]huh? 0:48:31.864,0:48:36.535 Okay, so not just the paper in "Cell,"[br]but also the paper in PNAS, right? 0:48:36.535,0:48:41.006 So two different papers had to be[br]completely retracted. 0:48:41.006,0:48:46.000 The retraction notice here in "Cell"[br]is good reading. 0:48:46.000,0:48:53.887 Let's see - where's the retraction[br]notice? Here's the retraction notice. 0:48:53.887,0:49:01.661 So, here we are. 0:49:01.661,0:49:04.027 The study reported the formin family[br]blah, blah, blah. 0:49:04.027,0:49:06.885 Shortly after publication, a lab[br]with whom we had shared reagents 0:49:06.885,0:49:09.572 noticed that cell lines that were[br]supposed to be stably expressing 0:49:09.572,0:49:13.099 GFP-FMN2 were not. We subsequently[br]found that a western blot in the paper 0:49:13.099,0:49:14.127 had been inappropriately manipulated, 0:49:14.127,0:49:16.545 and that multiple cell lines were not[br]as reported. 0:49:17.641,0:49:20.160 When we constructed and validated[br]new cell lines and reagents, 0:49:20.160,0:49:23.963 our attempts to reproduce critical data[br]in the paper were unsuccessful. 0:49:23.963,0:49:30.925 This led to - basically the ORI,[br]which is the Office of Research Integrity 0:49:30.925,0:49:35.167 at the NIH, did an investigation, and[br]actually found that there were 0:49:35.167,0:49:38.595 fabrications in figures[br]two, three, five, six, seven, 0:49:38.595,0:49:41.860 and the majority of the supplement.[br]Alright? 0:49:41.860,0:49:50.748 Now, what's even more stunning -[br]what? Yeah, I'm getting there. 0:49:50.748,0:49:54.283 (crosstalk)[br]>> From the NIH? This is ridiculous. 0:49:54.283,0:49:56.341 >> Yeah, well, so, okay, right. 0:49:56.341,0:49:59.443 So an interesting thing to do[br]in these cases when you see retractions 0:49:59.443,0:50:04.103 is to read the ORI report, uh,[br]and the ORI report goes on for 0:50:04.103,0:50:09.218 pages and pages and pages of[br]all the things that were altered, 0:50:09.218,0:50:12.931 but the real meat of it is here. 0:50:14.033,0:50:17.010 This is for the "Cell" paper. 0:50:17.010,0:50:21.636 Results which did not originate from[br]experimental observations, 0:50:21.636,0:50:25.067 which did not originate from experimental[br]observation, using selected regions 0:50:25.067,0:50:29.855 of the same original image to represent[br]the control and the rescue. 0:50:29.855,0:50:34.696 Lying about whether they were 2.5[br]or 3.5 micron channels, 0:50:34.696,0:50:36.921 not originating from experimental[br]observation. 0:50:36.921,0:50:40.266 And then, what's particularly crazy[br]about this, is that there's a lot of 0:50:40.266,0:50:44.245 falsified data here, but almost[br]every single figure in here has the 0:50:44.245,0:50:46.582 n numbers exaggerated. 0:50:46.582,0:50:49.689 The error bars are mislabeled, etc.,[br]etc., right? 0:50:51.038,0:50:53.772 And so this is a relevant thing to[br]think about.
0:50:53.772,0:50:58.340 So this was a very talented scientist.[br]I saw her talk at 0:50:58.340,0:51:01.845 a Gordon Research Conference and[br]immediately asked her to apply to UT. 0:51:01.845,0:51:08.423 I said "You should apply for a faculty job[br]at UT," completely sold on the story. 0:51:09.393,0:51:12.308 So was UT Southwestern, who actually[br]offered her a job 0:51:12.308,0:51:13.832 as an assistant professor. 0:51:14.172,0:51:18.104 So was the Cancer Prevention and[br]Research Institute of Texas, 0:51:18.104,0:51:21.091 which offered her a two million dollar[br]recruitment grant. 0:51:22.361,0:51:26.174 All of which was of course rescinded[br]when this was discovered. 0:51:26.944,0:51:32.779 Turns out she had fabricated all the data[br]in a second paper in PNAS as well. 0:51:34.409,0:51:39.004 And so I think it's worth considering[br]these things and sort of paying attention 0:51:39.004,0:51:40.956 to what happens here, right? 0:51:40.956,0:51:44.840 So Clare Waterman, the senior author[br]on these papers, is a very famous 0:51:44.840,0:51:48.121 cell biologist, who's been in the[br]business for 25 years, 0:51:48.121,0:51:49.479 at least as long as I have, 0:51:50.731,0:51:54.704 and is very well regarded, very[br]respected by all of her colleagues. 0:51:55.854,0:51:58.970 Obviously this puts her in[br]quite a bind, right? 0:52:00.640,0:52:02.009 And it's worth thinking about. 0:52:02.009,0:52:04.494 We've had faculty in this department[br]where this has happened, right? 0:52:04.494,0:52:08.603 This appears to be a case where[br]a very intelligent young scientist 0:52:08.603,0:52:12.782 somehow felt something about the[br]pressure of the job, and needed to 0:52:12.782,0:52:15.942 make up all the data in two entire papers. 0:52:19.392,0:52:21.365 So one of the questions you ask, 0:52:21.365,0:52:24.216 one that you get asked a lot by[br]non-scientists, is 0:52:24.216,0:52:27.404 "Well, why doesn't peer review find it?" 0:52:27.874,0:52:30.128 Right? How does this get through[br]peer review? 0:52:30.388,0:52:32.582 Did anyone see anything wrong[br]with this paper? If I had just given 0:52:32.582,0:52:35.598 you this paper, you'd have read[br]it and you'd have loved it, right? 0:52:36.047,0:52:40.058 Because if you're a smart person,[br]and you're a dedicated fraud, 0:52:40.588,0:52:42.574 it's very hard to get caught, right? 0:52:42.574,0:52:45.807 If you just make up numbers and put[br]them on a graph, and it looks good, 0:52:46.437,0:52:49.172 very hard to see that in peer review.[br]Right? 0:52:49.729,0:52:53.053 If you pick the part of the image that[br]shows what you want to show... 0:52:54.663,0:52:57.088 it's very easy to get away[br]with this actually. 0:52:59.675,0:53:02.557 >> The only thing I was suspicious about[br]in the whole paper was that 0:53:02.557,0:53:07.412 it all came together so perfectly,[br]and that's just like not how it should 0:53:07.412,0:53:08.550 ever happen with cancer cells. 0:53:08.550,0:53:10.493 >> Yeah, but it does sometimes, right? 0:53:10.493,0:53:13.931 Yeah, right, it does, and when you sit[br]there and you look at the data, 0:53:13.931,0:53:16.409 and it's from a good lab-- right? 0:53:17.079,0:53:20.597 It's very hard to say,[br]"Ah, it's too perfect." 0:53:20.597,0:53:23.803 You know, you wouldn't have said[br]"Oh, clearly you've made all of this up." 0:53:23.803,0:53:26.184 Right? Yeah, yeah, yeah. Right?
0:53:26.184,0:53:29.520 >> I don't - I was only suspicious[br]because I had thought already 0:53:29.520,0:53:31.029 that that's what happened with[br]this paper. 0:53:31.029,0:53:32.653 >> Yeah, yeah, yeah. No. 0:53:33.513,0:53:36.443 It's hard to do the exercise[br]any other way. 0:53:36.860,0:53:38.466 August had a question and[br]then Jifa. 0:53:38.466,0:53:42.496 >> No, just because we were quite - 0:53:42.496,0:53:47.755 you know, they made a little bit[br]far-fetched of a claim, but 0:53:47.755,0:53:52.538 at the same time we can see that,[br]yeah, the phenotype's right. 0:53:52.538,0:53:55.219 So I think...[br]>> Yeah, Jifa? 0:53:55.850,0:54:01.141 >> So, were the other scientists[br]here in on it, or was this like - 0:54:01.141,0:54:02.090 did she orchestrate it? 0:54:02.090,0:54:05.853 >> No, the Office of Research Integrity[br]puts it all on the one person. 0:54:06.563,0:54:09.281 Basically just a dedicated fraud[br]all the way through. 0:54:10.941,0:54:14.526 And again, it wouldn't be that[br]hard to do, right? 0:54:14.526,0:54:18.391 So probably the other authors[br]involved are the authors who 0:54:18.391,0:54:21.794 generated other reagents. It's[br]also possible that many of them 0:54:21.794,0:54:26.734 just generated reagents and gave[br]them to this person to do work on, 0:54:26.734,0:54:29.662 and then she did experiments on[br]them and made up the results. 0:54:29.662,0:54:32.861 But actually in the entire[br]investigation, only one person 0:54:32.861,0:54:34.854 was found to be the root of all of this. 0:54:36.674,0:54:37.524 And so... 0:54:37.524,0:54:39.164 >> (inaudible). 0:54:39.164,0:54:39.914 >> Huh? 0:54:39.914,0:54:41.546 >> I said, it sucks for the other[br]authors of the paper. 0:54:41.546,0:54:43.237 >> Oh, it sucks for everybody,[br]absolutely. 0:54:43.947,0:54:51.823 >> I'm sorry, I just want to know,[br]how about the (inaudible) 0:54:52.249,0:54:54.465 >> I think we don't know. I think[br]someone will have to go back 0:54:54.465,0:54:56.109 and redo all those experiments, right? 0:54:56.109,0:55:03.818 So these are the findings as[br]people reported them, but even those findings 0:55:03.818,0:55:10.183 in the first few figures, clearly the[br]numbers are inflated, right? 0:55:12.055,0:55:15.042 And so I think someone will have[br]to go back and start over. 0:55:15.395,0:55:20.525 >> I didn't read the summary at[br]the beginning because (inaudible), 0:55:20.525,0:55:25.955 so I just read the introduction, but I[br]feel like the titles are very attractive, 0:55:25.955,0:55:33.114 and then I read-- briefly went through the[br]introduction and read each figure. 0:55:34.010,0:55:40.080 I thought they were basically true, then you[br]gave us a little time so I went back 0:55:40.080,0:55:42.630 to read the summary.[br]>> Yup. 0:55:42.635,0:55:48.078 >> And I felt like some of the summaries[br]they were giving for the whole picture 0:55:48.078,0:55:53.710 of this paper, I felt like how could[br]it be so perfect, you know, in 0:55:54.830,0:55:58.066 animal models, and the (inaudible) 0:55:58.066,0:56:00.243 >> Yeah, but every once in a while[br]you get lucky, right? 0:56:00.243,0:56:02.246 And so that's the thing with fraud, right? 0:56:02.246,0:56:07.419 If it's really consistent and you[br]stick to your guns, it's really hard 0:56:07.419,0:56:09.877 to catch. Now, the funny thing is 0:56:11.359,0:56:15.265 that surely she must have thought[br]someone would try to repeat - 0:56:15.265,0:56:18.358 I mean it's a "Cell" paper.
She was[br]on the job market, she gave 0:56:18.358,0:56:21.638 talks all over the country.[br]She must have known that 0:56:21.638,0:56:24.409 somebody was going to do[br]these experiments. 0:56:25.349,0:56:27.324 And so I just don't know.[br]August? 0:56:27.324,0:56:31.409 >> I'm pretty sure that Skau's[br]reputation is completely shattered, 0:56:31.409,0:56:34.415 but how is the reputation of[br]Clare Waterman? 0:56:34.415,0:56:38.176 >> I mean that's always a danger, right?[br]And time will tell, right? 0:56:38.176,0:56:41.189 It's the only time she's ever[br]been associated with it. 0:56:41.189,0:56:45.812 The independent investigation[br]found that it was all this one person. 0:56:46.712,0:56:48.662 So, time will tell. 0:56:49.965,0:56:51.941 But right now I would say[br]it's still very good. 0:56:51.941,0:56:53.416 >> Oh, okay.[br]>> Right? Yeah. 0:56:54.883,0:56:59.705 >> This makes me wonder like, what[br]is so terribly wrong with the system 0:56:59.705,0:57:01.658 that somebody (inaudible) 0:57:01.658,0:57:06.615 >> Sure. Right? I mean I think[br]that's one of the big questions, 0:57:06.615,0:57:09.272 and what's kind of related to[br]that, that I think is a really 0:57:09.272,0:57:11.321 subtle point - I'll get to[br]your questions in a minute, 0:57:11.321,0:57:13.718 but I do want to make this point,[br]because it comes straight to this. 0:57:13.718,0:57:14.564 Right? 0:57:16.284,0:57:22.759 It's very easy to start believing[br]your own bullshit. Right? 0:57:23.859,0:57:31.051 And there's a big difference between[br]really making up entire papers 0:57:31.051,0:57:36.813 and not being honest with yourself,[br]but they are all part of the 0:57:36.823,0:57:40.158 same spectrum, and something you[br]have to guard against is 0:57:40.838,0:57:42.725 "Am I really doing this the right way?" 0:57:42.725,0:57:44.794 "Am I really seeing what[br]I think I'm seeing?" 0:57:44.794,0:57:46.443 "Am I really being completely honest?" 0:57:46.443,0:57:51.523 Because especially once you get halfway[br]into a paper, halfway into your PhD, 0:57:53.173,0:57:56.923 it gets, you know, there's[br]a lot of pressure. 0:57:58.063,0:58:01.440 And it's something you have to[br]fight against, right? 0:58:01.440,0:58:04.125 Especially when a project's[br]going south, right? 0:58:04.485,0:58:07.167 You've invested a lot in it,[br]and everything's starting to 0:58:07.167,0:58:10.816 fall apart, and so you've[br]got to be vigilant. 0:58:10.816,0:58:14.236 And again, I'm not saying that[br]you're likely to decide to just 0:58:14.236,0:58:17.003 make up all the data in the[br]last three figures of your paper. 0:58:18.073,0:58:21.245 But the system is stressful. 0:58:21.581,0:58:23.317 It's a very competitive system. 0:58:24.536,0:58:26.361 It's a very competitive world. 0:58:26.860,0:58:29.765 Is fraud any worse now than it used to be, 0:58:29.765,0:58:32.193 or can we just detect it better, right? 0:58:32.513,0:58:35.692 I don't actually know. My guess is that[br]we can detect it a lot better, right? 0:58:35.692,0:58:37.481 There are a lot better tools for it. 0:58:37.481,0:58:40.130 But there's also a lot more money[br]in science than there ever was before, 0:58:40.130,0:58:44.305 at all levels, even in academia,[br]which puts a big incentive in place 0:58:45.415,0:58:48.563 for people to be unethical. 0:58:49.517,0:58:52.804 But I mean these are serious concerns;[br]I think these are things that, as 0:58:52.804,0:58:56.810 graduate students, you guys should[br]be thinking about and considering.
0:58:56.810,0:58:59.726 There were some hands up.[br]I can't remember who they all were. 0:59:00.389,0:59:01.283 Will? 0:59:01.996,0:59:04.907 >> I was just going to say that it's[br]crazy that it seems like the 0:59:04.907,0:59:07.260 way that she started to get caught[br]was that she sent a 0:59:07.260,0:59:11.272 stably-transfected cell line, and it[br]wasn't actually stably-transfected. 0:59:11.272,0:59:13.202 >> Yup.[br]>> It seems so basic. 0:59:14.507,0:59:17.505 >> Yeah, I mean but once you're there,[br]right, they ask for the line, 0:59:17.505,0:59:18.347 what are you going to do? 0:59:18.880,0:59:22.764 >> That's also easy to hand-wave away,[br]to be like "Oh, it might have..." 0:59:22.764,0:59:27.001 You know, you build these lines[br]and they fall apart, so I mean... 0:59:27.461,0:59:31.744 and if you're this confident in[br]falsifying data that ends up 0:59:31.744,0:59:35.999 in the "Cell" paper, you're probably[br]willing to think that you can 0:59:35.999,0:59:38.595 argue your way out of[br]something like that, right? 0:59:39.670,0:59:43.950 Do you see what I'm saying,[br]like the mental-- yeah, yeah. 0:59:44.322,0:59:48.898 >> Well but also at this level, you know[br]it's slid over into pathology, right? 0:59:48.898,0:59:54.138 I mean literally every single thing[br]in this paper has something 0:59:54.138,0:59:55.196 fraudulent about it. 0:59:56.110,0:59:58.881 Right, which is really evidence to me[br]that someone has gone 0:59:58.881,1:00:03.704 way past "Oh crap, I've got to just[br]save this one paper" into 1:00:03.704,1:00:04.724 "I'm invincible". 1:00:06.358,1:00:07.325 Right?[br]>> Yeah. 1:00:07.325,1:00:09.788 >> Yeah yeah yeah, I mean[br]I don't know, but yeah. 1:00:10.104,1:00:13.298 >> Yeah, so apart from the reputation[br]and embarrassment, what is the 1:00:13.298,1:00:15.794 punishment for such (inaudible)? 1:00:15.794,1:00:21.091 >> Right, so the punishment for her[br]is-- I mean, the punishment 1:00:21.091,1:00:23.975 is her entire scientific career[br]is completely ruined. 1:00:23.975,1:00:25.766 She'll never work in the[br]field again, right? 1:00:25.766,1:00:28.020 >> What about the taxpayer's money?[br]>> Right, right, right. 1:00:28.020,1:00:31.038 Taxpayer's money, there's nothing--[br]I mean, nothing done. 1:00:31.768,1:00:35.130 They have these voluntary settlements[br]here that you can read about. 1:00:36.240,1:00:42.124 And these things are really built[br]around the assumption of 1:00:42.124,1:00:43.944 small amounts of fraud, you know? 1:00:44.178,1:00:49.670 You made a mistake, you're a salvageable[br]person as a scientist, then they 1:00:49.670,1:00:52.099 put a supervisory plan in place[br]where someone's really 1:00:52.099,1:00:55.050 looking a lot more carefully at your work,[br]and then for three years, 1:00:55.050,1:00:57.786 you're not allowed to work for the NIH[br]and these kinds of things. 1:00:58.436,1:01:01.294 But at this level, her reputation[br]is just destroyed, so there's-- 1:01:01.294,1:01:03.505 we won't see her anywhere[br]in academic science again.
1:01:03.505,1:01:07.816 In terms of monetary penalties, these[br]kinds of things, I don't know of any, 1:01:07.816,1:01:10.488 and there's much more-- there was[br]a big "New York Times" piece 1:01:10.488,1:01:14.588 about a cancer biologist at[br]Penn State who's apparently 1:01:14.588,1:01:19.248 been under investigation for fraud[br]like every other year for ten years, 1:01:19.248,1:01:22.545 and he still has millions of dollars[br]of research money and it just sort of 1:01:22.545,1:01:24.514 never comes home to roost, so. 1:01:25.074,1:01:27.770 You guys should educate yourselves[br]about the fraud situation, 1:01:28.770,1:01:29.836 and what's there. 1:01:30.110,1:01:31.812 Did someone else have a hand up? 1:01:32.280,1:01:33.590 >> Can I say something quick?[br]>> Yeah. 1:01:33.590,1:01:37.906 >> I saw in the news, UT Southwestern[br]removed a million dollar grant from her. 1:01:37.906,1:01:40.300 >> Yeah yeah, that's the grant.[br]Yeah-- oh yeah, she-- 1:01:41.127,1:01:44.330 the job was rescinded, the grant[br]was rescinded, so she's not 1:01:44.330,1:01:46.400 getting the grant, she didn't[br]get the job, she will absolutely 1:01:46.400,1:01:47.760 have to leave science. 1:01:48.316,1:01:50.911 As I understand it, she's like a[br]program officer at a foundation now 1:01:50.911,1:01:52.233 or something. Yeah. 1:01:52.233,1:01:55.716 >> But not just the monetary loss,[br]what about killing so many mice 1:01:55.716,1:01:59.570 and falsifying data? (inaudible).[br](laughter) 1:02:00.663,1:02:04.400 >> If you go down the road that[br]killing life is a problem, 1:02:04.400,1:02:05.989 then you're going to have[br]a real problem. 1:02:06.245,1:02:08.226 >> If you're doing science, doing[br]something productive, 1:02:08.226,1:02:10.554 it makes sense, but if[br]you're falsifying data... 1:02:10.554,1:02:11.506 >> Yeah. 1:02:13.236,1:02:15.247 >> I was--[br]>> Hang on, you had a hand up. 1:02:15.766,1:02:18.982 >> How often-- obviously this one[br]seems like it was very intentional, 1:02:18.982,1:02:21.327 and so that's really-- but how[br]often do people get 1:02:21.327,1:02:23.493 caught up in this just because[br]they weren't careful enough 1:02:23.493,1:02:27.295 and they didn't force themselves[br]to be very thorough? 1:02:27.446,1:02:31.546 >> I mean mistakes get made in papers[br]all the time, right, and some papers 1:02:31.546,1:02:36.025 get retracted because we went back[br]and resequenced the mouse allele, 1:02:36.025,1:02:39.020 and it wasn't what we thought[br]it was, right? 1:02:39.424,1:02:42.183 But generally what happens is there's[br]a correction. Generally when that happens, 1:02:42.183,1:02:45.105 it's not like the entire paper's ruined,[br]"Oh, surprise, we mutated 1:02:45.105,1:02:46.859 totally the wrong gene". 1:02:46.859,1:02:50.570 Usually it's like "Oh, we thought it was[br]exon 3 with a frameshift, 1:02:50.570,1:02:53.362 but in fact it was something else."[br]Right? 1:02:53.367,1:02:54.093 >> Right. 1:02:54.093,1:02:59.272 >> And so you get a lot of corrections[br]in papers, and some retractions, 1:02:59.272,1:03:02.960 but generally, you know, you've[br]got to-- the whole paper 1:03:02.960,1:03:05.949 really has to be systematically[br]wrong to retract the entire paper. 1:03:05.949,1:03:09.135 >> So it's the dishonesty that[br]gets prosecuted.
1:03:09.135,1:03:13.287 >> Yeah, or I mean, it-- yeah, so the[br]dishonesty - immediately you want to 1:03:13.287,1:03:16.015 retract the paper, because that's[br]also what the PI is going to 1:03:16.015,1:03:17.143 want to do.[br]>> Right. 1:03:17.143,1:03:21.286 >> Right? But if you just get it wrong,[br]you know, you get it wrong, 1:03:21.286,1:03:22.171 we're human, right? 1:03:22.171,1:03:25.959 If there was no-- and there have been[br]investigations where they find 1:03:25.959,1:03:29.247 "Oh, no, they weren't unethical,[br]they were just dumb." 1:03:29.247,1:03:29.867 (laughter) 1:03:29.867,1:03:31.776 No, seriously, right? And that can[br]happen, right? 1:03:31.776,1:03:34.379 And that's not a crime. 1:03:35.471,1:03:36.588 So, yeah. 1:03:37.454,1:03:40.570 >> Who does these investigations?[br]Is there like a secret service 1:03:40.570,1:03:42.074 for scientists, like--[br]>> Yeah. 1:03:42.074,1:03:44.461 The Office of Research Integrity[br]at the NIH, right? 1:03:44.461,1:03:45.274 >> At the NIH? 1:03:45.274,1:03:49.086 >> Yup, and then all universities will[br]have an office for research integrity, 1:03:49.086,1:03:52.956 so I'm sure that-- so if it happens at UT,[br]there's a UT office that will investigate, 1:03:52.956,1:03:56.364 but the NIH will also investigate[br]if it's NIH-funded research. 1:03:57.439,1:03:59.200 Yeah, so several different bodies. 1:03:59.200,1:04:00.775 And there's a... 1:04:01.845,1:04:02.993 Yeah. 1:04:04.578,1:04:06.925 >> Has anyone here read[br]the book "Bad Blood?" 1:04:07.475,1:04:08.527 >> "Bad Blood"? What is that? 1:04:08.539,1:04:11.457 >> It's about the Theranos controversy[br]and (inaudible). 1:04:11.457,1:04:14.483 >> Oh, wow. No, is it good?[br]>> It's amazing. 1:04:14.483,1:04:16.275 >> Really? "Bad Blood."[br]>> Everyone should read it. 1:04:16.275,1:04:17.516 >> Okay.[br]>> It's really good. 1:04:17.516,1:04:19.478 >> The documentary on[br]HBO's pretty good. 1:04:20.138,1:04:21.534 >> Same title? Okay. 1:04:22.051,1:04:23.359 >> I think so.[br]>> Okay. 1:04:23.519,1:04:25.908 >> So lots of these[br]kinds of things exist. 1:04:26.271,1:04:27.009 >> Yeah (laughs). 1:04:27.009,1:04:28.357 >> Just like the... 1:04:29.731,1:04:33.264 I don't know how to[br]say that vitamin C is 1:04:33.264,1:04:37.675 for everything, and so now,[br]the (inaudible) for everything 1:04:37.675,1:04:39.903 and the (inaudible) can do everything, 1:04:39.903,1:04:41.799 and they just make money from that. 1:04:41.799,1:04:42.541 (laughter) 1:04:42.571,1:04:44.710 >> As long as they don't kill anyone, 1:04:44.710,1:04:47.855 they just sell the ideas[br]to the (inaudible). 1:04:48.027,1:04:50.641 >> Yeah, I mean that's the entire[br]supplement industry, right? 1:04:53.411,1:04:55.947 All right. Any other questions? 1:04:56.324,1:04:57.785 Thoughts, comments? 1:04:58.101,1:05:00.731 >> When is the next assignment due? 1:05:02.411,1:05:05.232 >> Next assignment is due[br]a week from Wednesday. 1:05:07.802,1:05:10.186 No, so nine days from now. 1:05:10.186,1:05:11.186 >> Nine days? 1:05:11.186,1:05:12.109 >> Nine days. 1:05:12.689,1:05:15.009 Whatever the date is here,[br]actually I have a calendar 1:05:15.009,1:05:17.587 in front of me.[br]>> It is the 6th so the 15th. 1:05:18.317,1:05:20.572 >> The 15th.[br]>> Ooh, I'm good at simple math. 1:05:20.934,1:05:24.088 >> It is as my calendar says,[br]'cause I have an 11-year-old 1:05:24.088,1:05:26.633 girl at home, and "Riverdale"[br]season three starts.
1:05:26.633,1:05:27.693 (laughter) 1:05:27.694,1:05:29.843 But also your assignment is due,[br]so get it done before River-- 1:05:29.843,1:05:31.161 >> "Riverdale" season three[br]already happened. 1:05:31.172,1:05:33.953 >> What?[br](laughter) 1:05:33.953,1:05:36.496 Oh, no, no, it comes out[br]free on Netflix. Yeah. 1:05:36.496,1:05:38.059 (laughter) 1:05:39.383,1:05:40.638 >> Judge me all you want. 1:05:40.638,1:05:42.358 >> I have a shared--[br]>> I know all of you 1:05:42.358,1:05:44.091 watch "Game of Thrones,"[br]and I don't, so. 1:05:44.091,1:05:46.145 >> No, no I don't either. 1:05:46.145,1:05:47.065 (crosstalk) 1:05:47.065,1:05:47.789 >> Jifa? 1:05:47.789,1:05:49.220 >> What's going on Wednesday? 1:05:49.506,1:05:51.996 >> Haven't decided yet. It's going to[br]be so much fun, though. 1:05:51.996,1:05:53.510 (laughter) 1:05:55.848,1:05:58.250 All right, cool. We're done. 1:05:58.250,1:06:00.633 (crosstalk) 1:06:07.721,1:06:11.911 >> I mean I sent in my paper[br]just last Friday afternoon. 1:06:11.911,1:06:13.891 >> Okay.[br]>> Was that too late, 'cause I-- 1:06:13.891,1:06:16.260 >> It was too late, 'cause we[br]told you again and again and again-- 1:06:16.260,1:06:19.233 >> Oh my--[br]>> All right, so anyway 1:06:19.233,1:06:20.990 >> Sorry about that.[br]>> Did you get your paper in today? 1:06:20.990,1:06:22.493 >> Yes. Yes, yes.[br]>> Okay. Great. 1:06:22.493,1:06:25.034 We will survive it.[br]>> Thank you, thank you. 1:06:25.034,1:06:26.676 >> All right. 1:06:26.676,1:06:28.372 (crosstalk continues) 1:06:28.372,1:06:29.617 >> Oh sorry, I wasn't (inaudible). 1:06:31.540,1:06:36.411 And on Wednesday-- I mean, this is[br]not about the class, 1:06:36.411,1:06:40.402 it's slightly about the class,[br]because (inaudible)-- 1:06:40.402,1:06:43.107 invitation, having lunch--[br]>> With Roy Parker? 1:06:43.107,1:06:44.888 Yeah, you leave early[br]and go to that. 1:06:44.888,1:06:48.252 >> No no, not for that, but on[br]Wednesday afternoon, 1:06:48.252,1:06:51.650 I'm flying to (inaudible) to walk[br]in the commencement. 1:06:51.650,1:06:54.282 >> Oh okay. So you'll[br]miss this talk? 1:06:54.282,1:06:58.290 >> Yeah, I'll miss this talk likely,[br]and I'll be away for two weeks 1:06:58.290,1:07:00.526 because my parents are coming.[br]>> No problem. 1:07:00.526,1:07:03.683 So send your paper in before that.[br]>> So I was going to submit it 1:07:03.683,1:07:05.179 on Wednesday.[br]>> Great, no problem. 1:07:05.588,1:07:08.046 Okay, cool, good, it works.[br]>> Thank you. 1:07:08.046,1:07:08.683 >> You bet. 1:07:08.683,1:07:09.377 >> Mina?[br]>> Yeah? 1:07:09.377,1:07:12.399 >> So are you going to be in[br]your office right now, or? 1:07:12.399,1:07:15.057 >> Yeah yeah, I'll be here all day[br]doing experiments, 1:07:15.057,1:07:17.133 so if I'm not there, just leave it[br]on my desk. 1:07:17.133,1:07:17.838 >> Okay. 1:07:17.838,1:07:20.554 (crosstalk)