WEBVTT 00:00:05.235 --> 00:00:08.235 (audience applause) 00:00:10.446 --> 00:00:11.696 - Good morning. 00:00:13.446 --> 00:00:16.222 For the past three years, I was a researcher 00:00:16.222 --> 00:00:18.310 at the Edmond J. Safra Center for Ethics 00:00:18.310 --> 00:00:22.233 at Harvard University, where I examined corrupting influences 00:00:22.233 --> 00:00:25.500 and hidden biases in the pursuit of knowledge. 00:00:25.500 --> 00:00:27.468 During this time, I conducted in-depth interviews 00:00:27.468 --> 00:00:30.715 with professors from medicine, business, law, 00:00:30.715 --> 00:00:33.527 the natural and life sciences, as well as the humanities 00:00:33.527 --> 00:00:35.863 and social sciences. 00:00:35.863 --> 00:00:38.991 My goal was to try to understand the everyday life 00:00:38.991 --> 00:00:43.134 of scientists and professors across all the disciplines. 00:00:43.134 --> 00:00:45.943 In the end, I had close to 10,000 pages 00:00:45.943 --> 00:00:48.207 of interview transcripts. 00:00:48.207 --> 00:00:50.973 Today I would like to share with you 00:00:50.973 --> 00:00:53.682 some of the ethical dilemmas that professors face, 00:00:53.682 --> 00:00:57.156 in particular, whether they experience an increased risk 00:00:57.156 --> 00:01:01.323 of bias depending on who is funding their research. 00:01:02.402 --> 00:01:05.147 Now, why should we be concerned about the ethics 00:01:05.147 --> 00:01:07.704 of knowledge production? 00:01:07.704 --> 00:01:11.986 When I first started university, I had this idealistic, 00:01:11.986 --> 00:01:14.870 and perhaps naive, view of science. 00:01:14.870 --> 00:01:17.938 I believed that scientists inquired about the world, 00:01:17.938 --> 00:01:20.495 practiced the scientific method with integrity, 00:01:20.495 --> 00:01:24.058 and made new discoveries that drive progress forward. 00:01:24.058 --> 00:01:27.154 But close examination of how scientists conduct research 00:01:27.154 --> 00:01:30.122 reveals that what we can know depends not only 00:01:30.122 --> 00:01:33.247 on the scientist, but also on the structures 00:01:33.247 --> 00:01:35.939 and institutions that give scientists the means 00:01:35.939 --> 00:01:38.115 to pursue knowledge. 00:01:38.115 --> 00:01:40.212 As I interviewed scientists and professors, 00:01:40.212 --> 00:01:44.460 I began to uncover patterns of scientific distortion, 00:01:44.460 --> 00:01:47.770 or what some might call "the corruption of knowledge." 00:01:47.770 --> 00:01:49.933 However, the majority of these distortions 00:01:49.933 --> 00:01:53.352 were not produced by bad people behaving unethically 00:01:53.352 --> 00:01:56.288 or illegally, although this does happen, 00:01:56.288 --> 00:01:58.177 but rather by good people, like the people 00:01:58.177 --> 00:02:01.769 sitting beside you right now: your friends and your family, 00:02:01.769 --> 00:02:03.919 who, in response to the daily pressures 00:02:03.919 --> 00:02:06.395 of work, may simply begin to rationalize 00:02:06.395 --> 00:02:11.233 to themselves little ethical lapses here and there. 00:02:11.233 --> 00:02:15.718 Now by ethical lapse, I mean scientific integrity lapses 00:02:15.718 --> 00:02:18.409 that appear to be very small, or inconsequential, 00:02:18.409 --> 00:02:19.885 at the time.
00:02:19.885 --> 00:02:22.555 One of the most common examples of this 00:02:22.555 --> 00:02:24.989 involves a scientist thinking, 00:02:24.989 --> 00:02:27.967 "Maybe I won't ask Question A when pursuing 00:02:27.967 --> 00:02:30.890 my research because my funder, who may be relying 00:02:30.890 --> 00:02:33.797 on the results of this study to obtain regulatory approval 00:02:33.797 --> 00:02:36.522 for potential commercialization, may not be too happy 00:02:36.522 --> 00:02:39.969 with the higher risk of a negative result, 00:02:39.969 --> 00:02:42.446 which might also affect my future funding. 00:02:42.446 --> 00:02:44.408 So maybe instead I'll self-censor 00:02:44.408 --> 00:02:46.324 and ask a different question of the data, 00:02:46.324 --> 00:02:48.916 where the possible outcome will most likely 00:02:48.916 --> 00:02:51.615 not ruffle too many feathers, and I will then answer 00:02:51.615 --> 00:02:55.964 that question honestly and with scientific integrity." 00:02:55.964 --> 00:02:59.408 Now these types of rationalizations, 00:02:59.408 --> 00:03:02.093 these little compromises, where we convince ourselves 00:03:02.093 --> 00:03:04.901 in the moment that what we're doing is okay, 00:03:04.901 --> 00:03:07.456 help to neutralize any guilt we might experience 00:03:07.456 --> 00:03:10.128 in our ethical decision making. 00:03:10.128 --> 00:03:13.307 However, over time the accumulation 00:03:13.307 --> 00:03:15.681 of these little ethical lapses is leading 00:03:15.681 --> 00:03:18.202 to a broader system of knowledge production 00:03:18.202 --> 00:03:20.901 that is becoming increasingly distorted 00:03:20.901 --> 00:03:23.234 and more difficult to trust. 00:03:24.555 --> 00:03:27.985 I want you to think about that word for a moment: trust. 00:03:27.985 --> 00:03:31.548 And how it plays a role in your daily activities. 00:03:31.548 --> 00:03:35.051 For instance, plastic water bottles. 00:03:35.051 --> 00:03:37.363 They're so common that when we pick one up, 00:03:37.363 --> 00:03:38.732 we're probably not thinking anything 00:03:38.732 --> 00:03:40.899 other than, "I'm thirsty." 00:03:41.944 --> 00:03:46.237 We don't ask ourselves, "Hmm, does bisphenol A, 00:03:46.237 --> 00:03:49.395 or BPA, a common compound used in hard plastic products, 00:03:49.395 --> 00:03:51.158 lead to cancer, behavioral disorders, 00:03:51.158 --> 00:03:52.590 or reproductive problems?" 00:03:52.590 --> 00:03:54.090 No, of course not. 00:03:54.941 --> 00:03:56.670 We take a drink and we go on with our day. 00:03:56.670 --> 00:03:58.726 We trust that drinking from the water bottle 00:03:58.726 --> 00:04:03.350 can't be bad, or at least not bad enough to worry about. 00:04:03.350 --> 00:04:06.087 On the one hand, you can feel safe 00:04:06.087 --> 00:04:08.266 because every study performed by scientists 00:04:08.266 --> 00:04:12.433 funded by the industry concludes, "No harm from BPA." 00:04:13.396 --> 00:04:17.293 In other words: it's okay, you can trust BPA. 00:04:17.293 --> 00:04:21.098 But, at the same time, 93% of the non-industry-funded 00:04:21.098 --> 00:04:25.295 studies show that there might be cause for concern, 00:04:25.295 --> 00:04:28.516 and that maybe we should be a little less trusting 00:04:28.516 --> 00:04:33.430 the next time we pick up a hard plastic water bottle. 00:04:33.430 --> 00:04:35.572 So who do you trust? 00:04:35.572 --> 00:04:39.028 And how is it possible that the industry-funded scientists 00:04:39.028 --> 00:04:43.195 studying BPA are so certain that there is no harm?
00:04:44.691 --> 00:04:46.987 Is it simply because they're better scientists? 00:04:46.987 --> 00:04:48.532 Have bigger data sets? 00:04:48.532 --> 00:04:50.837 Know the compound better? 00:04:50.837 --> 00:04:54.411 Maybe, perhaps, but we see this pattern, 00:04:54.411 --> 00:04:57.031 often called the funding effect, 00:04:57.031 --> 00:04:59.803 across many different areas of research, 00:04:59.803 --> 00:05:03.698 from cell phone safety to climate change to soft drinks. 00:05:03.698 --> 00:05:06.480 In each case, scientists funded by the industry, 00:05:06.480 --> 00:05:09.343 or by industry-supported think tanks, reached conclusions 00:05:09.343 --> 00:05:13.660 that overall tend to deny or downplay any harm, 00:05:13.660 --> 00:05:16.686 while non-industry-funded scientists overwhelmingly 00:05:16.686 --> 00:05:19.107 find evidence of harm. 00:05:19.107 --> 00:05:22.302 Among the professors I interviewed in food and nutrition, 00:05:22.302 --> 00:05:26.145 there was acknowledgement of this funding effect bias. 00:05:26.145 --> 00:05:30.142 As one food scientist said, "There is a tendency for people 00:05:30.142 --> 00:05:32.742 in my discipline to develop sympathies 00:05:32.742 --> 00:05:34.606 with the food industry and to say, 00:05:34.606 --> 00:05:37.918 'Yeah, this is definitely safe,' rather than to say, 00:05:37.918 --> 00:05:39.797 'Okay, here's this research study, 00:05:39.797 --> 00:05:42.915 and this research study, and this research study.'" 00:05:42.915 --> 00:05:45.831 When I interviewed another professor, who was also 00:05:45.831 --> 00:05:49.344 an editor of a scientific journal in nutrition, 00:05:49.344 --> 00:05:53.573 he said the following to me: "So we get some manuscripts 00:05:53.573 --> 00:05:56.246 that are industry-sponsored, and one senses 00:05:56.246 --> 00:05:58.534 that their story is a little slanted towards the benefit 00:05:58.534 --> 00:06:00.122 of whatever it might be. 00:06:00.122 --> 00:06:02.405 Their product did this, never mind that 00:06:02.405 --> 00:06:05.268 it didn't do 10 other things. 00:06:05.268 --> 00:06:07.444 The most frequent scenario is not that the study 00:06:07.444 --> 00:06:11.047 is done poorly, but that the questions themselves 00:06:11.047 --> 00:06:13.047 are kind of selective." 00:06:15.049 --> 00:06:18.382 Now if a funding effect bias does exist, 00:06:19.388 --> 00:06:21.927 then surely the regulatory bodies who look out 00:06:21.927 --> 00:06:25.381 for our safety must be aware of it, right? 00:06:25.381 --> 00:06:28.890 For instance, what about our prescription drugs? 00:06:28.890 --> 00:06:30.565 Pharmaceutical companies must first obtain 00:06:30.565 --> 00:06:34.084 regulatory approval for their products, right? 00:06:34.084 --> 00:06:37.520 Yes; however, many of the drug evaluation 00:06:37.520 --> 00:06:39.512 and research advisory committee members 00:06:39.512 --> 00:06:41.429 who vote on whether a drug should be granted 00:06:41.429 --> 00:06:44.049 regulatory approval also have financial conflicts 00:06:44.049 --> 00:06:47.632 of interest with these same drug companies. 00:06:48.470 --> 00:06:52.811 These voting members often serve as consultants 00:06:52.811 --> 00:06:55.168 and have ownership interests in the same drug companies 00:06:55.168 --> 00:06:56.585 seeking approval. 00:06:57.861 --> 00:07:00.191 They also sit on these companies' advisory boards 00:07:00.191 --> 00:07:02.503 and even receive funding from these firms 00:07:02.503 --> 00:07:05.336 for their own individual research.
00:07:06.544 --> 00:07:09.001 In other words, they might be experts, 00:07:09.001 --> 00:07:12.084 but they are not independent experts. 00:07:13.230 --> 00:07:16.941 As you know, in 2008 the world suffered 00:07:16.941 --> 00:07:19.024 a major financial crisis. 00:07:20.305 --> 00:07:22.554 The Oscar-winning documentary, Inside Job, 00:07:22.554 --> 00:07:25.270 suggested that economics professors were being corrupted 00:07:25.270 --> 00:07:28.660 and blinded through their consulting relationships 00:07:28.660 --> 00:07:32.339 and conflicts of interest with the financial sector. 00:07:32.339 --> 00:07:35.639 It was so serious that even an upset Queen of England 00:07:35.639 --> 00:07:38.987 (audience laughs) 00:07:38.987 --> 00:07:42.355 visited the LSE, the prestigious London School 00:07:42.355 --> 00:07:44.877 of Economics, and sternly asked her top 00:07:44.877 --> 00:07:49.396 economics professors, "If the problem was so widespread, 00:07:49.396 --> 00:07:52.859 then why didn't anyone notice it?" 00:07:52.859 --> 00:07:55.561 The Director of the LSE's Management Department, 00:07:55.561 --> 00:07:57.736 who was standing beside the Queen at the time, 00:07:57.736 --> 00:08:01.198 said to her, "At every stage, someone was relying 00:08:01.198 --> 00:08:03.230 on somebody else, and everyone thought 00:08:03.230 --> 00:08:06.063 they were doing the right thing." 00:08:07.082 --> 00:08:09.901 In my interviews with business and economics professors, 00:08:09.901 --> 00:08:11.864 I observed, as I did with professors 00:08:11.864 --> 00:08:15.239 across all the disciplines, that a lack of independence 00:08:15.239 --> 00:08:18.640 can distort the production of knowledge. 00:08:18.640 --> 00:08:20.800 One economics professor, who researches 00:08:20.800 --> 00:08:25.194 private equity finance, told me during an interview, 00:08:25.194 --> 00:08:27.482 "The only way to get the data is to get 00:08:27.482 --> 00:08:30.426 the private equity firms to give it to you. 00:08:30.426 --> 00:08:32.605 If you then say these people don't know what they're doing, 00:08:32.605 --> 00:08:35.957 or that they only make returns by taking excessive risks, 00:08:35.957 --> 00:08:38.515 then there is the potential you simply will not 00:08:38.515 --> 00:08:41.176 get data going forward and you will be forced 00:08:41.176 --> 00:08:43.448 to leave the field of economics. 00:08:43.448 --> 00:08:45.940 So you have to worry that the research that comes out 00:08:45.940 --> 00:08:48.488 is more favorable to the private equity industry 00:08:48.488 --> 00:08:50.905 than it otherwise would be." 00:08:52.889 --> 00:08:55.797 Now despite all these cautionary examples 00:08:55.797 --> 00:08:58.390 of corrupting influences and hidden biases, 00:08:58.390 --> 00:09:00.479 some of you out there, I'm certain, 00:09:00.479 --> 00:09:03.395 are still thinking to yourself, "Okay, Gary, 00:09:03.395 --> 00:09:05.962 I hear what you're saying, but I would never distort 00:09:05.962 --> 00:09:08.905 my work, and no conflict of interest would change 00:09:08.905 --> 00:09:11.155 how I pursue my research." 00:09:12.287 --> 00:09:15.314 Fair enough. Many of us do believe 00:09:15.314 --> 00:09:17.637 that we can manage any conflict of interest 00:09:17.637 --> 00:09:21.884 and still maintain our own personal integrity. 00:09:21.884 --> 00:09:24.862 However, we should never forget that the power 00:09:24.862 --> 00:09:27.265 to rationalize our own little ethical lapses 00:09:27.265 --> 00:09:28.432 is remarkable.
00:09:29.407 --> 00:09:32.179 Consider this everyday example. 00:09:32.179 --> 00:09:34.311 Statistics demonstrate that there are disturbingly 00:09:34.311 --> 00:09:36.634 high rates of accidents and deaths 00:09:36.634 --> 00:09:39.888 due to cell-phone-related distracted driving. 00:09:39.888 --> 00:09:43.141 Yet, despite knowing this, many of us will continue 00:09:43.141 --> 00:09:44.835 to use cell phones when we drive, 00:09:44.835 --> 00:09:46.941 even after we leave here today. 00:09:46.941 --> 00:09:49.020 Studies show that more than half of us believe 00:09:49.020 --> 00:09:50.872 that when we use cell phones and drive, 00:09:50.872 --> 00:09:52.699 it makes no real difference to our own 00:09:52.699 --> 00:09:55.689 individual driving performance. 00:09:55.689 --> 00:09:59.700 Yet, when we switch from being the driver to the passenger, 00:09:59.700 --> 00:10:02.771 90% of us will now suddenly state, "I would feel 00:10:02.771 --> 00:10:06.938 very unsafe if I observed my driver using a cell phone." 00:10:10.464 --> 00:10:13.842 So saying you have integrity is easy. 00:10:13.842 --> 00:10:16.848 Practicing integrity is not easy. 00:10:16.848 --> 00:10:19.519 And recognizing our own little ethical lapses 00:10:19.519 --> 00:10:22.884 and rationalizations is even more difficult. 00:10:22.884 --> 00:10:25.153 So what does this all mean in the context 00:10:25.153 --> 00:10:27.870 of knowledge production? 00:10:27.870 --> 00:10:31.346 First, we should be aware that funders increasingly want 00:10:31.346 --> 00:10:34.790 more influence over what questions scientists can ask, 00:10:34.790 --> 00:10:36.997 what findings they can share, and ultimately 00:10:36.997 --> 00:10:39.914 what kind of knowledge is produced. 00:10:41.181 --> 00:10:43.801 So ask yourself, "What are the strings attached 00:10:43.801 --> 00:10:46.410 when we accept funding?" 00:10:46.410 --> 00:10:49.426 Are the strings visible, where the scientist is told 00:10:49.426 --> 00:10:51.805 that she cannot publish her work until given approval 00:10:51.805 --> 00:10:54.452 to do so by the funder? 00:10:54.452 --> 00:10:56.095 Or does the funder require that the data 00:10:56.095 --> 00:10:58.649 remain confidential, so that the research conclusions 00:10:58.649 --> 00:11:03.150 can never be verified within the scientific community? 00:11:03.150 --> 00:11:05.567 Or are the strings invisible? 00:11:06.507 --> 00:11:09.756 Increasingly, scientists and professors are self-censoring 00:11:09.756 --> 00:11:12.471 their work in order to appeal to funders, 00:11:12.471 --> 00:11:15.576 and in so doing are sidestepping important questions 00:11:15.576 --> 00:11:17.504 that may be critical to the public good 00:11:17.504 --> 00:11:19.421 and society as a whole. 00:11:21.041 --> 00:11:22.833 My interviews make clear 00:11:22.833 --> 00:11:25.916 that the funding effect bias is real 00:11:28.980 --> 00:11:32.478 and, if left unchecked, will continue to have a real impact 00:11:32.478 --> 00:11:34.764 on what we can know. 00:11:34.764 --> 00:11:38.904 So the next time you pick up a book or a research article, 00:11:38.904 --> 00:11:42.180 check to see who is funding the author's work, 00:11:42.180 --> 00:11:46.419 and pay close attention to the author's affiliations.
00:11:46.419 --> 00:11:49.651 In order to be informed in this information age, 00:11:49.651 --> 00:11:52.271 we need to take extra measures to vet the legitimacy 00:11:52.271 --> 00:11:55.870 of the content that we rely on, to develop a critical eye 00:11:55.870 --> 00:11:59.676 for independence, and to value scientific integrity 00:11:59.676 --> 00:12:01.343 above anything else. 00:12:02.799 --> 00:12:06.643 Information and knowledge require science, 00:12:06.643 --> 00:12:08.532 unfettered and unbiased. 00:12:08.532 --> 00:12:11.727 And it's time we all take measures to demand it. 00:12:11.727 --> 00:12:12.843 Thank you. 00:12:12.843 --> 00:12:15.843 (audience applause)