(audience applause)

- Good morning.

For the past three years, I was a researcher at the Edmond J. Safra Center for Ethics at Harvard University, where I examined corrupting influences and hidden biases in the pursuit of knowledge. During this time, I conducted in-depth interviews with professors from medicine, business, law, the natural and life sciences, as well as the humanities and social sciences. My goal was to understand the everyday life of scientists and professors across all the disciplines. In the end, I had close to 10,000 pages of interview transcripts.

Today I would like to share with you some of the ethical dilemmas that professors face, in particular whether they face an increased risk of bias depending on who is funding their research.

Now, why should we be concerned about the ethics of knowledge production? When I first started university, I had this idealistic, and perhaps naive, view of science. I believed that scientists inquired about the world, practiced the scientific method with integrity, and made new discoveries that drive progress forward. But close examination of how scientists conduct research reveals that what we can know depends not only on the scientist, but also on the structures and institutions that give scientists the means to pursue knowledge.

As I interviewed scientists and professors, I began to uncover patterns of scientific distortion, or what some might call "the corruption of knowledge." However, the majority of these distortions were not produced by bad people behaving unethically or illegally, although this does happen, but rather by good people, like the people sitting beside you right now, your friends and your family, who in response to the daily pressures of work may simply begin to rationalize to themselves little ethical lapses here and there.

Now by ethical lapse, I mean scientific integrity lapses that appear to be very small, or inconsequential, at the time.
One of the most common examples of this involves a scientist thinking: maybe I won't ask Question A when pursuing my research, because my funder, who may be relying on the results of this study to obtain regulatory approval for potential commercialization, may not be too happy with the higher risk of a negative result, which might also affect my future funding. So maybe instead I'll censor myself and ask a different question of the data, where the possible outcome will most likely not ruffle too many feathers, and I will then answer that question honestly and with scientific integrity.

Now these types of rationalizations, these little compromises, where we convince ourselves in the moment that what we're doing is okay, help to neutralize any guilt we might experience in our ethical decision making. However, over time the accumulation of these little ethical lapses is leading to a broader system of knowledge production that is becoming increasingly distorted and more difficult to trust.

I want you to think about that word for a moment: trust. And how it plays a role in your daily activities. For instance, plastic water bottles. They're so common that when we pick one up, we're probably not thinking anything other than, "I'm thirsty." We don't ask ourselves: does bisphenol A, or BPA, a common compound used in hard plastic products, lead to cancer, behavioral disorders, or reproductive problems? No, of course not. We take a drink and we go on with our day. We trust that drinking from the water bottle can't be bad, or at least not bad enough to worry about.

On the one hand, you can feel safe, because every study performed by scientists funded by the industry concludes, "No harm from BPA." In other words: it's okay, you can trust BPA. But at the same time, 93% of the non-industry-funded studies show that there might be cause for concern, and that maybe we should be a little less trusting the next time we pick up a hard plastic water bottle.

So who do you trust?
And how is it possible that the industry-funded scientists studying BPA are so certain that there is no harm? Is it simply because they're better scientists? Have bigger data sets? Know the compound better?

Maybe. Perhaps. But we see this pattern, often called the funding effect, across many different areas of research, from cell phone safety to climate change to soft drinks. In each case, scientists funded by the industry, or by industry-supported think tanks, reach conclusions that overall tend to deny or downplay any harm, while non-industry-funded scientists overwhelmingly find evidence of harm.

Among the professors I interviewed in food and nutrition, there was acknowledgement of this funding effect bias. One food scientist said, "There is a tendency for people in my discipline to develop sympathies with the food industry and to say, 'Yeah, this is definitely safe,' rather than to say, 'Okay, here's this research study, and this research study, and this research study.'"

When I interviewed another professor, who was also an editor of a scientific journal in nutrition, he said the following to me: "So we get some manuscripts that are industry sponsored, and one senses that their story is a little slanted towards the benefit of whatever it might be. Their product did this, never mind that it didn't do 10 other things. The most frequent scenario is not that the study is done poorly, but that the questions themselves are kind of selective."

Now if a funding effect bias does exist, then surely the regulatory bodies who look out for our safety must be aware of it, right? For instance, what about our prescription drugs? Pharmaceutical companies must first obtain regulatory approval for their products, right? Yes. However, many of the drug evaluation and research advisory committee members who vote on whether a drug should be granted regulatory approval also have financial conflicts of interest with these same drug companies.
These voting members often serve as consultants and have ownership interests in the same drug companies seeking approval. They also sit on their advisory boards and even receive funding from these firms for their own individual research. In other words, they might be experts, but they are not independent experts.

As you know, in 2008 the world suffered a major financial crisis. The Oscar-winning documentary Inside Job suggested that economics professors were being corrupted and blinded through their consulting relationships and conflicts of interest with the financial sector. It was so serious that even an upset Queen of England

(audience laughs)

visited the LSE, the prestigious London School of Economics, and sternly asked her top economics professors, "If the problem was so widespread, then why didn't anyone notice it?" The Director of the LSE's Management Department, who was standing beside the Queen at the time, said to her, "At every stage, someone was relying on somebody else, and everyone thought they were doing the right thing."

In my interviews with business and economics professors, it was observed, as it was with professors across all the disciplines, that a lack of independence can distort the production of knowledge. One economics professor, who researches private equity finance, told me during an interview, "The only way to get the data is to get the private equity firms to give it to you. If you then say these people don't know what they're doing, or they only make returns by taking excessive risks, then there is the potential you simply will not get data going forward and you will be forced to leave the field of economics. So you have to worry that the research that comes out is more favorable to the private equity industry than otherwise it would be."
Now despite all these cautionary examples of corrupting influences and hidden biases, some of you out there, I'm certain, are still thinking to yourself, "Okay, Gary, I hear what you're saying, but I would never distort my work, and no conflict of interest would change how I pursue my research."

Fair enough. Many of us do believe that we can manage any conflict of interest and still maintain our own personal integrity. However, we should never forget that the power to rationalize our own little ethical lapses is remarkable.

Consider this everyday example. Statistics demonstrate that there are disturbingly high rates of accidents and deaths due to cell-phone-related distracted driving. Yet, despite knowing this, many of us will continue to use cell phones when we drive, even after we leave here today. Studies show that more than half of us believe that when we use cell phones and drive, it makes no real difference to our own individual driving performance. Yet, when we switch from being the driver to the passenger, 90% of us will suddenly state, "I would feel very unsafe if I observed my driver using a cell phone."

So saying you have integrity is easy. Practicing integrity is not easy. And recognizing our own little ethical lapses and rationalizations is even more difficult.

So what does this all mean in the context of knowledge production? First, we should be aware that funders increasingly want more influence over what questions scientists can ask, what findings they can share, and ultimately what kind of knowledge is produced. So ask yourself, "What are the strings attached when we accept funding?" Are the strings visible, where the scientist is told that she cannot publish her work until given approval to do so by the funder? Or does the funder require that the data remain confidential, so that the research conclusions can never be verified within the scientific community? Or are the strings invisible?
Increasingly, scientists and professors are self-censoring their work in order to appeal to funders, and in so doing are sidestepping important questions that may be critical to the public good and society as a whole.

My interviews make clear that the funding effect bias is real and, if left unchecked, will continue to have a real impact on what we can know.

So next time you pick up a book or a research article, check to see who is funding the author's work, and pay close attention to the author's affiliations. In order to be informed in this information age, we need to take extra measures to vet the legitimacy of the content we rely on, to develop a critical eye for independence, and to value scientific integrity above anything else.

Information and knowledge require science, unfettered and unbiased. And it's time we all take measures to demand it.

Thank you.

(audience applause)