(audience applause)
Good morning.
For the past three
years, I was a researcher
at the Edmond J. Safra Center for Ethics
at Harvard University, where
I examined corrupting influences
and hidden biases in the
pursuit of knowledge.
During this time, I
conducted in-depth interviews
with professors from
medicine, business, law, the natural and life sciences, as well as the humanities and social sciences.
My goal was to try to
understand the everyday life
of scientists and professors
across all the disciplines.
In the end, I had close to 10,000 pages
of interview transcripts.
Today I would like to share with you
some of the ethical dilemmas
that professors face.
In particular, whether they
experience an increased risk
of bias depending on who
is funding their research.
Now, why should we be
concerned about the ethics
of knowledge production?
When I first started university,
I had this idealistic,
and perhaps naive, view of science.
I believed that scientists
inquired about the world,
practiced the scientific
method with integrity,
and made new discoveries
that drive progress forward.
But a close examination of how scientists conduct research
reveals that what we can
know depends not only
on the scientist, but
also on the structures
and institutions that
give scientists the means
to pursue knowledge.
As I interviewed
scientists and professors,
I began to uncover patterns
of scientific distortion,
or what some might call "the
corruption of knowledge."
However, the majority of these distortions were not produced by bad people behaving unethically or illegally (although this does happen), but rather by good people, like the people sitting beside you right now: your friends and your family, who, in response to the daily pressures of work, may simply begin to rationalize little ethical lapses to themselves here and there.
Now by ethical lapse, I mean
scientific integrity lapses
that appear to be very
small, or inconsequential,
at the time.
One of the most common examples of this involves a scientist thinking, "Maybe I won't ask Question A when pursuing my research, because my funder, who may be relying on the results of this study to obtain regulatory approval for potential commercialization, may not be too happy with the higher risk of a negative result, which might also affect my future funding. So maybe instead I'll censor myself and ask a different question of the data, where the possible outcome will most likely not ruffle too many feathers, and I will then answer that question honestly and with scientific integrity."
Now these types of rationalizations,
these little compromises,
where we convince ourselves
in the moment that what
we're doing is okay,
help to neutralize any
guilt we might experience
in our ethical decision making.
However, over time the accumulation
of these little ethical lapses is leading
to a broader system of
knowledge production
that is becoming increasingly distorted
and more difficult to trust.
I want you to think about
that word for a moment: trust.
And how it plays a role
in your daily activities.
For instance, plastic water bottles.
They're so common that
when we pick one up,
we're probably not thinking anything
other than, "I'm thirsty."
We don't ask ourselves, "Hmm, does bisphenol A, or BPA, a common compound used in hard plastic products, lead to cancer, behavioral disorders, or reproductive problems?"
No, of course not.
We take a drink and we go on with our day.
We trust that drinking
from the water bottle
can't be bad, or at least
bad enough to worry about.
On the one hand, you can feel safe, because every study performed by scientists funded by the industry concludes, "No harm from BPA." In other words: it's okay, you can trust BPA. But, at the same time, 93% of the non-industry-funded studies show that there might be cause for concern, and that maybe we should be a little less trusting the next time we pick up a hard plastic water bottle.
So who do you trust?
And how is it possible that
the industry funded scientists
studying BPA are so certain
that there is no harm?
Is it simply because
they're better scientists?
Have bigger data sets?
Know the compound better?
Maybe. Perhaps. But we see this pattern, often called the funding effect, across many different areas of research, from cell phone safety to climate change to soft drinks.
In each case, scientists funded by the industry, or by industry-supported think tanks, reach conclusions that overall tend to deny or downplay any harm, while non-industry-funded scientists overwhelmingly find evidence of harm.
Among the professors I interviewed in food and nutrition, there was acknowledgement of this funding effect bias. One food scientist said, "There is a tendency for people in my discipline to develop sympathies with the food industry and to say, 'Yeah, this is definitely safe,' rather than to say, 'Okay, here's this research study, and this research study, and this research study.'"
When I interviewed another
professor, who was also
an editor of a scientific
journal in nutrition,
he said the following to me, "So we get some manuscripts that are industry-sponsored, and one senses that their story is a little slanted towards the benefit of whatever it might be. Their product did this; never mind that it didn't do 10 other things.
The most frequent scenario
is not that the study
is done poorly, but that
the questions themselves
are kind of selective."
Now if a funding effect bias does exist,
then surely the regulatory
bodies who look out
for our safety must be aware of it, right?
For instance, what about
our prescription drugs?
Pharmaceutical companies must first obtain
regulatory approval for
their products, right?
Yes, however, many of the drug evaluation
and research advisory committee members
who vote on whether a
drug should be granted
regulatory approval also
have financial conflicts
of interest with these
same drug companies.
These voting members
often serve as consultants
and have ownership interest
in the same drug companies
seeking approval.
They also sit on these companies' advisory boards and even receive funding from these firms for their own individual research.
In other words, they might be experts,
but they are not independent experts.
As you know, in 2008 the world suffered
a major financial crisis.
The Oscar-winning documentary, Inside Job,
suggested that economics
professors were being corrupted
and blinded through their
consulting relationships
and conflicts of interest
with the financial sector.
It was so serious that even
an upset Queen of England
(audience laughs)
visited the LSE, the prestigious London School of Economics, and sternly asked her top economics professors, "If the problem was so widespread, then why didn't anyone notice it?"
The Director of the LSE's Management Department, who was standing beside the Queen at the time, said to her, "At every stage, someone was relying on somebody else, and everyone thought they were doing the right thing."
In my interviews with business and economics professors, I observed, as I did with professors across all the disciplines, that a lack of independence can distort the production of knowledge.
One economics professor, who researches
private equity finance,
told me during an interview, "The only way to get the data is to get the private equity firms to give it to you. If you then say these people don't know what they're doing, or that they only make returns by taking excessive risks, then there is the potential that you simply will not get data going forward, and you will be forced to leave the field of economics. So you have to worry that the research that comes out is more favorable to the private equity industry than it otherwise would be."
Now despite all these cautionary examples
of corrupting influences
and hidden biases,
some of you out there, I'm certain, are still thinking to yourselves, "Okay, Gary, I hear what you're saying, but I would never distort my work, and no conflict of interest would change how I pursue my research."
Fair enough, many of us do believe
that we can manage any
conflict of interest
and still maintain our
own personal integrity.
However, we should never
forget that the power
to rationalize our own
little ethical lapses
is remarkable.
Consider this everyday example.
Statistics demonstrate that there are disturbingly high rates of accidents and deaths due to cell-phone-related distracted driving.
Yet, despite knowing this,
many of us will continue
to use cell phones when we drive,
even after we leave here today.
Studies show that more than half of us believe that when we use cell phones and drive, it makes no real difference to our own individual driving performance.
Yet, when we switch from being the driver to being the passenger, 90% of us will suddenly state, "I would feel very unsafe if I observed my driver using a cell phone."
So saying you have integrity is easy.
Practicing integrity is not easy.
And recognizing our own
little ethical lapses
and rationalizations
is even more difficult.
So what does this all mean in the context
of knowledge production?
First, we should be aware
that funders increasingly want
more influence over what
questions scientists can ask,
what findings they can
share, and ultimately
what kind of knowledge is produced.
So ask yourself, "What are the strings attached when we accept funding?"
Are the strings visible,
where the scientist is told
that she cannot publish her
work until given approval
to do so by the funder?
Or does the funder require that the data
remain confidential so that
the research conclusions
can never be verified within
the scientific community?
Or are the strings invisible?
Increasingly, scientists and
professors are self-censoring
their work in order to appeal to funders.
And in so doing, they are sidestepping important questions that may be critical to the public good and society as a whole.
My interviews make clear
that the funding effect bias is real.
And, if left unchecked, it will continue to have a real impact on what we can know.
So next time you pick up a
book or a research article,
check to see who is
funding the author's work.
And pay close attention to
the author's affiliations.
In order to be informed
in this information age,
we need to take extra
measures to vet the legitimacy
of the content that we rely
on, to develop a critical eye
for independence, and to
value scientific integrity
above anything else.
Information and knowledge require science,
unfettered and unbiased.
And it's time we all take
measures to demand it.
Thank you.
(audience applause)