(audience applause)

- Good morning.

For the past three years, I was a researcher at the Edmond J. Safra Center for Ethics at Harvard University, where I examined corrupting influences and hidden biases in the pursuit of knowledge. During this time, I conducted in-depth interviews with professors from medicine, business, law, the natural and life sciences, as well as the humanities and social sciences. My goal was to understand the everyday life of scientists and professors across all the disciplines. In the end, I ended up with close to 10,000 pages of interview transcripts.

Today I would like to share with you some of the ethical dilemmas that professors face, in particular, whether they experience an increased risk of bias depending on who is funding their research.

Now, why should we be concerned about the ethics of knowledge production? When I first started university, I had this idealistic, and perhaps naive, view of science. I believed that scientists inquired about the world, practiced the scientific method with integrity, and made new discoveries that drive progress forward. But close examination of how scientists conduct research reveals that what we can know depends not only on the scientist, but also on the structures and institutions that give scientists the means to pursue knowledge.

As I interviewed scientists and professors, I began to uncover patterns of scientific distortion, or what some might call "the corruption of knowledge." However, the majority of these distortions were not produced by bad people behaving unethically or illegally, although this does happen, but rather by good people, like the people sitting beside you right now: your friends and your family, who, in response to the daily pressures of work, may simply begin to rationalize to themselves little ethical lapses here and there.

Now, by ethical lapse, I mean scientific integrity lapses that appear to be very small, or inconsequential, at the time.
One of the most common examples of this involves a scientist thinking, "Maybe I won't ask Question A when pursuing my research, because my funder, who may be relying on the results of this study to obtain regulatory approval for potential commercialization, may not be too happy with the higher risk of a negative result, which might also affect my future funding. So maybe instead I'll self-censor and ask a different question of the data, where the possible outcome will most likely not ruffle too many feathers, and I will then answer that question honestly and with scientific integrity."

Now, these types of rationalizations, these little compromises, where we convince ourselves in the moment that what we're doing is okay, help to neutralize any guilt we might experience in our ethical decision making. However, over time, the accumulation of these little ethical lapses is leading to a broader system of knowledge production that is becoming increasingly distorted and more difficult to trust.

I want you to think about that word for a moment: trust. And how it plays a role in your daily activities. For instance, plastic water bottles. They're so common that when we pick one up, we're probably not thinking anything other than, "I'm thirsty." We don't ask ourselves, "Hmm, does bisphenol A, or BPA, a common compound used in hard plastic products, lead to cancer, behavioral disorders, or reproductive problems?" No, of course not. We take a drink and we go on with our day. We trust that drinking from the water bottle can't be bad, or at least not bad enough to worry about.

On the one hand, you can feel safe, because every study performed by scientists funded by the industry concludes, "No harm from BPA." In other words: it's okay, you can trust BPA. But at the same time, 93% of the non-industry-funded studies show that there might be cause for concern, and that maybe we should be a little less trusting the next time we pick up a hard plastic water bottle.

So who do you trust? And how is it possible that the industry-funded scientists studying BPA are so certain that there is no harm? Is it simply because they're better scientists? Have bigger data sets? Know the compound better?
Maybe, perhaps. But we see this pattern, often called "the funding effect," across many different areas of research, from cell phone safety to climate change to soft drinks. In each case, scientists funded by the industry, or by industry-supported think tanks, reach conclusions that overall tend to deny or downplay any harm, while non-industry-funded scientists overwhelmingly find evidence of harm.

Among the professors I interviewed in food and nutrition, there was acknowledgement of this funding-effect bias. One food scientist said, "There is a tendency for people in my discipline to develop sympathies with the food industry and to say, 'Yeah, this is definitely safe,' rather than to say, 'Okay, here's this research study, and this research study, and this research study.'"

When I interviewed another professor, who was also an editor of a scientific journal in nutrition, he said the following to me: "So we get some manuscripts that are industry sponsored, and one senses that their story is a little slanted towards the benefit of whatever it might be. Their product did this, never mind that it didn't do 10 other things. The most frequent scenario is not that the study is done poorly, but that the questions themselves are kind of selective."

Now, if a funding-effect bias does exist, then surely the regulatory bodies who look out for our safety must be aware of it, right? For instance, what about our prescription drugs? Pharmaceutical companies must first obtain regulatory approval for their products, right? Yes. However, many of the drug evaluation and research advisory committee members who vote on whether a drug should be granted regulatory approval also have financial conflicts of interest with these same drug companies. These voting members often serve as consultants and have ownership interests in the same drug companies seeking approval. They also sit on their advisory boards and even receive funding from these firms for their own individual research. In other words, they might be experts, but they are not independent experts.

As you know, in 2008 the world suffered a major financial crisis.
The Oscar-winning documentary Inside Job suggested that economics professors were being corrupted and blinded through their consulting relationships and conflicts of interest with the financial sector. It was so serious that even an upset Queen of England

(audience laughs)

visited the LSE, the prestigious London School of Economics, and sternly asked her top economics professors, "If the problem was so widespread, then why didn't anyone notice it?" The Director of the LSE's Management Department, who was standing beside the Queen at the time, said to her, "At every stage, someone was relying on somebody else, and everyone thought they were doing the right thing."

In my interviews with business and economics professors, it was observed, as it was with professors across all the disciplines, that a lack of independence can distort the production of knowledge. One economics professor, who researches private equity finance, told me during an interview, "The only way to get the data is to get the private equity firms to give it to you. If you then say these people don't know what they're doing, or they only make returns by taking excessive risks, then there is the potential you simply will not get data going forward, and you will be forced to leave the field of economics. So you have to worry that the research that comes out is more favorable to the private equity industry than otherwise it would be."

Now, despite all these cautionary examples of corrupting influences and hidden biases, some of you out there, I'm certain, are still thinking to yourselves, "Okay, Gary, I hear what you're saying, but I would never distort my work, and no conflict of interest would change how I pursue my research."

Fair enough. Many of us do believe that we can manage any conflict of interest and still maintain our own personal integrity. However, we should never forget that the power to rationalize our own little ethical lapses is remarkable.

Consider this everyday example. Statistics demonstrate that there are disturbingly high rates of accidents and deaths due to cell-phone-related distracted driving. Yet, despite knowing this, many of us will continue to use cell phones when we drive, even after we leave here today.
Studies show that more than half of us believe that when we use cell phones and drive, it makes no real difference to our own individual driving performance. Yet, when we switch from being the driver to the passenger, 90% of us will suddenly state, "I would feel very unsafe if I observed my driver using a cell phone."

So saying you have integrity is easy. Practicing integrity is not easy. And recognizing our own little ethical lapses and rationalizations is even more difficult.

So what does this all mean in the context of knowledge production? First, we should be aware that funders increasingly want more influence over what questions scientists can ask, what findings they can share, and ultimately what kind of knowledge is produced.

So ask yourself, "What are the strings attached when we accept funding?" Are the strings visible, where the scientist is told that she cannot publish her work until given approval to do so by the funder? Or does the funder require that the data remain confidential, so that the research conclusions can never be verified within the scientific community? Or are the strings invisible? Increasingly, scientists and professors are self-censoring their work in order to appeal to funders, and in so doing are sidestepping important questions that may be critical to the public good and society as a whole.

My interviews make clear that the funding-effect bias is real. And, if left unchecked, it will continue to have a real impact on what we can know.

So next time you pick up a book or a research article, check to see who is funding the author's work, and pay close attention to the author's affiliations. In order to be informed in this information age, we need to take extra measures to vet the legitimacy of the content that we rely on, to develop a critical eye for independence, and to value scientific integrity above anything else.

Information and knowledge require science, unfettered and unbiased. And it's time we all take measures to demand it.

Thank you.

(audience applause)