So imagine that you had
your smartphone miniaturized
and hooked up directly to your brain.
If you had this sort of brain chip,
you'd be able to upload and download
to the Internet at the speed of thought.
Accessing social media or Wikipedia
would be a lot like --
well, from the inside at least --
like consulting your own memory.
It would be as easy
and as intimate as thinking.
But would it make it easier
for you to know what's true?
Just because a way
of accessing information is faster
doesn't mean it's
more reliable, of course,
and it doesn't mean that we would all
interpret it the same way.
It doesn't mean that you would be
any better at evaluating it,
in fact, you might even be worse
because, you know, more data,
less time for evaluation.
Something like this is already
happening to us right now.
We already carry a world of information
around in our pockets,
but it seems as if the more information
that we share and access online,
the more difficult it can be
for us to tell the difference between
what's real and what's fake.
It's as if we know more
but understand less.
Now, it's a feature of modern life,
I suppose,
that large swaths of the public
live in isolated information bubbles.
We're polarized not just over values
but over the facts,
and one reason for that
is that the data analytics
that drive the Internet
get us not just more information
but more of the information that we want.
Our online life is personalized;
everything from the ads we read
to the news that comes down
our Facebook feed
is tailored to satisfy our preferences.
And so while we get more information,
a lot of that information ends up
reflecting ourselves
as much as it does reality.
It ends up,
I suppose,
inflating our bubbles rather
than bursting them.
And so maybe it's no surprise
that we're in a situation --
a paradoxical situation of thinking
that we know so much more
and yet not agreeing
on what it is we know.
So how are we going to solve
this problem of knowledge polarization?
One obvious tactic is to try
to fix our technology --
to redesign our digital platforms
so as to make them less
susceptible to polarization.
And I'm happy to report that many
smart people at Google and Facebook
are working on just that.
These projects are vital.
I think that fixing technology
is obviously really important,
but I don't think
that fixing technology alone
is going to solve the problem
of knowledge polarization.
I don't think that because I don't think,
at the end of the day,
it is a technological problem.
I think it's a human problem,
having to do with how we think
and what we value.
In order to solve it,
I think we're going to need help.
We're going to need help from psychology
and political science,
but we're also going to need help,
I think, from philosophy.
Because to solve the problem
of knowledge polarization,
we're going to need to reconnect
with one fundamental,
philosophical idea ...
that we live in a common reality.
The idea of a common reality is like,
I suppose,
a lot of philosophical concepts:
easy to state,
but mysteriously difficult
to put into practice.
To really accept it,
I think we need to do three things,
each of which is a challenge right now.
First, we need to believe in truth.
You might have noticed
that our culture is having
something of a troubled relationship
with that concept right now.
It seems as if we disagree so much that,
as one political commentator
put it not long ago,
it's as if there are no facts anymore.
But that thought is actually an expression
of a sort of seductive line
of argument that's in the air.
It goes like this:
we just can't step outside of our
own perspectives,
we can't step outside of our biases.
Every time we try,
we just get more information
from our perspective.
So, this line of thought goes,
we might as well admit
that objective truth is an illusion,
or it doesn't matter,
because either we'll never know what it is
or it doesn't exist in the first place.
That's not a new philosophical thought --
skepticism about truth.
During the end of the last century,
as some of you know,
it was very popular in certain
academic circles.
But it really goes back all the way
to the Greek philosopher Protagoras,
if not farther back.
Protagoras said that objective
truth was an illusion
because "man is the measure
of all things."
Man is the measure of all things.
That can seem like a bracing bit
of realpolitik to people,
or liberating because it allows each of us
to discover or make our own truth.
But actually,
I think it's a bit of self-serving
rationalization disguised as philosophy.
It confuses the difficulty of being
certain with the impossibility of truth.
Look,
of course it's difficult
to be certain about anything;
we might all be living in "The Matrix,"
you might have a brain chip in your head
feeding you all the wrong information.
But in practice, we do agree
on all sorts of facts.
We agree that bullets can kill people.
We agree that you can't flap
your arms and fly.
We agree --
or we should --
that there is an external reality,
and ignoring it can get you hurt.
Nonetheless, skepticism
about truth can be tempting
because it allows us to rationalize
away our own biases.
When we do that,
we're sort of like the guy in the movie
who knew he was living in "The Matrix,"
but decided he liked it there anyway.
After all, getting what you
want feels good.
Being right all the time feels good.
So often it's easier for us
to wrap ourselves in our
cozy information bubbles,
live in bad faith,
and take those bubbles
as the measure of reality.
An example, I think, of how
this bad faith gets into our actions
is our reaction to the
phenomenon of fake news.
The fake news that spread on the Internet
during the American
presidential election of 2016
was designed to feed into our biases,
designed to inflate our bubbles.
But what was really striking about it
was not just that it
fooled so many people.
What was really striking to me
about fake news,
the phenomenon,
is how quickly it itself became
the subject of knowledge polarization.
So much so that the very term --
the very term, "fake news,"
now just means "news story I don't like."
That's an example of the bad faith
towards the truth that I'm talking about.
But the really, I think, dangerous thing
about skepticism with regard to truth
is that it leads to despotism.
"Man is the measure of all things"
inevitably becomes "the man
is the measure of all things."
Just as "every man for himself"
always seems to turn out to be
"only the strong survive."
At the end of Orwell's "1984,"
the thought policeman O'Brien
is torturing the protagonist,
Winston Smith,
into believing two plus two equals five.
What O'Brien says is the point
is that he wants to convince Smith
that whatever the party says is the truth,
and the truth is whatever the party says.
And what O'Brien knows is that
once this thought is accepted,
critical dissent is impossible.
You can't speak truth to power
if the power speaks truth by definition.
OK, so I said that in order to accept
that we really live in a common reality
we have to do three things.
The first thing is to believe in truth.
The second thing can be summed up
by the Latin phrase that Kant took
as the motto for the Enlightenment.
Sapere aude,
or "dare to know,"
or as Kant put it,
"Dare to know for yourself."
I think in the early days of the Internet,
a lot of us thought
that information technology
was always going to make it easier
for us to know for ourselves,
and of course in many ways, it has.
But as the Internet has become
more and more a part of our lives,
our reliance on it,
our use of it has become
often more passive.
Much of what we know today
we Google-know.
We download prepackaged sets of facts
and sort of shuffle them along
the assembly line of social media.
Google-knowing is useful
precisely because it involves
a sort of intellectual outsourcing.
We offload our effort onto a network
of others and algorithms.
And that allows us of course
to not clutter our minds
with all sorts of facts.
We can just download them
when we need them,
and that's awesome.
But there's a difference between
downloading a set of facts
and really understanding how
or why those facts are as they are.
Understanding why
a particular disease spreads,
or how a mathematical proof works,
or why your friend is depressed
involves more than just downloading.
It's going to require,
most likely,
doing some work for yourself.
Having a little creative insight.
Using your imagination,
getting out into the field,
doing the experiment,
working through the proof,
talking to someone.
I'm not saying of course that we
should stop Google-knowing.
I'm just saying we shouldn't
overvalue it either.
We need to find ways of encouraging
forms of knowing that are more active
and don't always involve passing off
our effort onto our bubble.
Because the thing about Google-knowing
is that too often it ends up
being bubble-knowing.
And bubble-knowing means
always being right.
But daring to know,
daring to understand,
means risking the possibility
that you could be wrong.
It means risking the possibility
that what you want and what's true
are different things.
Which brings me to the third thing
that I think we need to do
if we want to accept that we live
in a common reality.
That third thing is
have a little humility.
By humility here, I mean
epistemic humility,
which means, in a sense,
knowing that you don't know it all.
But it also means something
more than that.
It means seeing your worldview
as open to improvement
by the evidence and experience of others.
Seeing your worldview
as open to improvement
by the evidence and experience of others.
That's more than just being
open to change.
It's more than just being open
to self-improvement.
It means seeing your knowledge
as capable of enhancing
or being enriched by
what others contribute.
That's part of what is involved
in recognizing that there's
a common reality
that you too are responsible for.
I don't think it's much of a stretch
to say that our society is not
particularly great
at enhancing or encouraging
that sort of humility.
That's partly because,
well, we tend to confuse
arrogance and confidence.
And it's partly because,
well, you know, arrogance is just easier.
It's just easier to think
of yourself as knowing it all.
It's just easier to think of yourself
as having it all figured out.
But that's another example
of the bad faith towards the truth
that I've been talking about.
So the concept of a common reality,
like a lot of philosophical concepts,
can seem so obvious
that we can look right past it
and forget why it's important.
Democracies can't function
if their citizens don't strive,
at least some of the time,
to inhabit a common space.
A space where they can pass
ideas back and forth when --
and especially when --
they disagree.
But you can't strive to inhabit that space
if you don't already accept
that you live in the same reality.
To accept that we've
got to believe in truth,
we've got to encourage
more active ways of knowing.
And we've got to have the humility
to realize that we're not
the measure of all things.
We may yet one day realize the vision
of having the Internet in our brains,
but if we want that to be liberating
and not terrifying,
if we want it to expand our understanding
and not just our passive knowing,
we need to remember
that our perspectives,
as wondrous, as beautiful as they are,
are just that --
perspectives on one reality.
Thank you.
(Applause)