So imagine that you had
your smartphone miniaturized
and hooked up directly to your brain.
If you had this sort of brain chip,
you'd be able to upload and download
to the Internet at the speed of thought.
Accessing social media or Wikipedia
would be a lot like --
well, from the inside at least --
like consulting your own memory.
It would be as easy
and as intimate as thinking.
But would it make it easier
for you to know what's true?
Just because a way
of accessing information is faster
doesn't mean it's
more reliable, of course,
and it doesn't mean that we would all
interpret it the same way.
It doesn't mean that you would be
any better at evaluating it,
in fact, you might even be worse
because, you know, more data,
less time for evaluation.
Something like this is already
happening to us right now.
We already carry a world of information
around in our pockets,
but it seems as if the more information
that we share and access online,
the more difficult it can be
for us to tell the difference between
what's real and what's fake.
It's as if we know more
but understand less.
Now, it's a feature of modern life,
I suppose,
that large swaths of the public
live in isolated information bubbles.
We're polarized not just over values
but over the facts,
and one reason for that
is that the data analytics
that drive the Internet
get us not just more information
but more of the information that we want.
Our online life is personalized,
everything from the ads we read
to the news that comes down
our Facebook feed
is tailored to satisfy our preferences.
And so while we get more information,
a lot of that information ends up
reflecting ourselves
as much as it does reality.
It ends up,
I suppose,
inflating our bubbles rather
than bursting them.
And so maybe it's no surprise
that we're in a situation --
a paradoxical situation --
thinking that we know so much more
and yet not agreeing
on what it is we know.