
You are a simulation & physics can prove it: George Smoot at TEDxSalford

  • 0:16 - 0:18
    Thank you,
    it's a pleasure to be here.
  • 0:18 - 0:21
    They asked me if I wanted
    a drink before I came on
  • 0:21 - 0:23
    and I asked for a pint
    but they gave me water.
  • 0:23 - 0:25
    (Laughter)
  • 0:25 - 0:27
    So, following the other speakers
    I have a change of pace,
  • 0:27 - 0:29
    a little bit of a fun talk.
  • 0:29 - 0:33
    And what I am going to try and do
    is convince you you're a simulation,
  • 0:33 - 0:35
    and that physics can prove it, okay?
  • 0:35 - 0:36
    (Laughter)
  • 0:36 - 0:40
    So, instead of a usual uplifting talk,
    this is a different kind of talk.
  • 0:40 - 0:43
    Okay, so, there's one thing
    you know for certain,
  • 0:43 - 0:46
    that is that you exist as
    a flesh and blood human being;
  • 0:46 - 0:49
    my goal is to convince
    you otherwise. Okay?
  • 0:50 - 0:52
    So, logic is not going to be enough,
  • 0:52 - 0:55
    you guys are going
    to be simulation deniers,
  • 0:56 - 0:58
    there's just no way round it.
  • 0:58 - 1:02
    So, my actual goal will be to
    create a sliver of doubt in your minds,
  • 1:02 - 1:06
    so that you actually think about this,
    and what it might mean. Okay?
  • 1:07 - 1:09
    So, here's the first check
    about simulations.
  • 1:10 - 1:12
    How many of you have ever
    played a computer game?
  • 1:12 - 1:13
    Just raise your hands.
  • 1:13 - 1:15
    Ah, alright.
  • 1:15 - 1:19
    So, did you play against a simulated player
    or simulated players?
  • 1:19 - 1:23
    Or, in fact, was it you, several people
    plus simulated people?
  • 1:23 - 1:27
    Right. And what role did you take?
    Was it a pawn or a hero?
  • 1:27 - 1:31
    What role do you have in life?
    Is it pawn or hero?
  • 1:31 - 1:33
    Right.
    Are you the king, for example?
  • 1:33 - 1:34
    (Laughter)
  • 1:34 - 1:37
    I don't see him here... but...
  • 1:39 - 1:41
    Now, the other thing you might ask,
    if you were a social scientist,
  • 1:41 - 1:43
    or another kind of scientist,
    like a cosmologist:
  • 1:43 - 1:46
    Would you like
    to run realistic simulations
  • 1:46 - 1:49
    to test and develop your theories?
    Likewise for political candidates.
  • 1:49 - 1:53
    Right? So, I'm just trying to show
    there's motivation for it.
  • 1:53 - 1:57
    And then the question is: Are computation
    and simulation capabilities
  • 1:58 - 1:59
    increasing over time?
  • 1:59 - 2:02
    So, think of the HetNOS,
    think about Moore's law,
  • 2:02 - 2:05
    think about what computer you had
    when you were young
  • 2:05 - 2:07
    and what you have on you now,
    not that you're not all young still.
  • 2:07 - 2:10
    Okay, that's just setting you up
    for having the doubt.
  • 2:11 - 2:14
    Okay, so we'll take
    a little journey into philosophy.
  • 2:14 - 2:19
    Solipsism is the idea that one's own mind
    is the only thing that's sure to exist.
  • 2:19 - 2:23
    It turns out, people
    have been studying this for decades,
  • 2:23 - 2:27
    and realized it's both irrefutable and
    indefensible at the same time,
  • 2:27 - 2:30
    so people do hold this point of view,
  • 2:30 - 2:33
    even though it's not
    a falsifiable hypothesis;
  • 2:33 - 2:35
    there are people who work on this issue.
  • 2:35 - 2:38
    So, there doesn't seem to be
    any imaginable disproof that you can have,
  • 2:38 - 2:41
    so even if you have a solipsist
    and he dies,
  • 2:41 - 2:44
    you can't falsify his belief,
    because he's not there to do it.
  • 2:44 - 2:48
    This is a pragmatic dead end,
    it's kind of like what we have on TV now,
  • 2:48 - 2:52
    which is, you know,
    zombie philosophy.
  • 2:53 - 2:57
    But there is an opposite,
    that is philosophical zombies.
  • 2:57 - 3:00
    There's a slight use
    to philosophical zombies.
  • 3:00 - 3:02
    So what is the idea here?
  • 3:02 - 3:04
    The philosophical zombie
    is a hypothetical being
  • 3:04 - 3:07
    that is indistinguishable from
    what you all think of as a normal human being,
  • 3:07 - 3:10
    that is, everything you think you are,
    you know, what you think you are,
  • 3:10 - 3:15
    except that it lacks conscious experience,
    qualia or sentience.
  • 3:15 - 3:19
    So, if you take a philosophical zombie
    and poke it with a sharp object,
  • 3:19 - 3:23
    it doesn't feel any pain, however,
    it behaves exactly as if it does.
  • 3:23 - 3:26
    It would say "ouch" and
    do all the usual kind of things.
  • 3:26 - 3:31
    So, what the zombie is there for,
    is to support the idea
  • 3:31 - 3:34
    that the world includes two kinds of
    things: the mental and the physical,
  • 3:34 - 3:38
    or the concepts and
    the physical world around you.
  • 3:38 - 3:40
    And so that's the idea.
  • 3:41 - 3:44
    So, we have in cosmology, lots of things.
    We have the anthropic principle,
  • 3:44 - 3:47
    that is, a philosophical concept that
    the universe must be compatible
  • 3:47 - 3:49
    with conscious life that observes it.
  • 3:49 - 3:52
    And there's a strong version
    and a weak version.
  • 3:52 - 3:56
    One of them says the universe is
    compelled to have conscious life emerge,
  • 3:56 - 4:00
    and the other says that the universe
    is fine-tuned for life to be necessary.
  • 4:00 - 4:06
    And this is pretty much in line with
    a lot of even more specific kind of ideas,
  • 4:06 - 4:11
    from conservative Christianity and Islam,
    that there's intelligent design,
  • 4:11 - 4:15
    or that there could be like a simulation.
    I'm working on you... so....
  • 4:17 - 4:19
    And we also have
    the idea of multiverses,
  • 4:19 - 4:22
    that there are many different kinds...
    there's a metauniverse
  • 4:22 - 4:25
    and there's many possible universes
    inside of it.
  • 4:25 - 4:28
    And there are different reasons for that,
    quantum mechanics,
  • 4:28 - 4:31
    but also a way to explain why the
    physical constants happen to be the ones
  • 4:31 - 4:34
    that make this auditorium possible.
  • 4:34 - 4:38
    And so, you know, one way
    is to have that many real universes,
  • 4:38 - 4:41
    the other way is just to make
    a lot of simulations.
  • 4:41 - 4:43
    So, your choice.
    Okay, so now we move on.
  • 4:43 - 4:46
    Here's the crux of the arguments,
  • 4:46 - 4:49
    and these arguments have been around
    for more than 30 years,
  • 4:49 - 4:52
    they were first published 30 years ago,
  • 4:53 - 4:55
    and what people
    went to a lot of trouble to show is
  • 4:56 - 4:58
    that one of these three things
    is extremely likely to be true.
  • 4:58 - 5:01
    So, you get your choice between
    No. 1, No. 2 and No. 3,
  • 5:01 - 5:04
    just like the doors,
    look what's behind each door.
  • 5:04 - 5:07
    The first one is: Human civilization
    is unlikely
  • 5:07 - 5:10
    to reach a level of technological maturity
  • 5:10 - 5:14
    capable of producing simulated realities,
    or it's physically impossible.
  • 5:14 - 5:17
    Okay, so we made some progress
    in 30 years and I'll mention that.
  • 5:17 - 5:21
    The second is: Comparable civilizations
    throughout the universe
  • 5:22 - 5:25
    which do reach that capability
  • 5:25 - 5:28
    will choose not to make simulations
  • 5:28 - 5:32
    on such a large scale that, in fact,
    the probability of being a simulated being
  • 5:32 - 5:35
    is much higher than the probability
    of being a real being.
  • 5:35 - 5:38
    So, those are your choices, right --
  • 5:39 - 5:42
    there's some other choices,
    but they're extraordinarily unlikely,
  • 5:42 - 5:44
    and we can pretty much
    rule them out.
  • 5:44 - 5:48
    And the 3rd choice is: Any entities
    with our general set of experiences
  • 5:48 - 5:50
    are almost certainly
    living in a simulation.
  • 5:50 - 5:55
    That would be us. Right? Okay?
    In case you guys aren't paying attention.
  • 5:55 - 5:56
    (Chuckles)
  • 5:56 - 6:00
    Okay. So, let's talk about
    making simulated realities by humans.
  • 6:01 - 6:03
    So, will humans produce
    realistic simulations?
  • 6:03 - 6:05
    And the answer is yes.
  • 6:05 - 6:08
    I have to keep coming back
    because I just wrote this talk
  • 6:08 - 6:10
    and so I don't remember
    what I have to say.
  • 6:10 - 6:15
    And, so, the answer to that is clearly
    yes, you guys already proved it,
  • 6:15 - 6:19
    because there's a lot of money
    to be made in making computer games,
  • 6:19 - 6:21
    simulated realities.
  • 6:21 - 6:24
    And the better the simulated reality,
    the more people you get involved in it.
  • 6:24 - 6:27
    There's a lot of entertainment,
    we have a lot of animated movies.
  • 6:27 - 6:29
    Now, we're going to have
    animated interactive movies
  • 6:29 - 6:32
    and videos and pornography.
  • 6:32 - 6:35
    So, you know,
    you can't rule out pornography,
  • 6:35 - 6:39
    in the early days of the Internet,
    pornography was the No. 1 commerce,
  • 6:39 - 6:42
    it was roughly half the commerce
    in the Internet in the early days.
  • 6:42 - 6:47
    And even today, 50% of the bits
    that are transmitted on the Internet
  • 6:47 - 6:49
    are transmitted for porn.
  • 6:49 - 6:52
    So, you can wonder: Why is that?
    Well, originally there were stories
  • 6:52 - 6:54
    and then there got to be pictures,
    and then there got to be videos,
  • 6:54 - 6:56
    pretty soon there'll be
    interactive videos.
  • 6:56 - 6:59
    So, it's clear there is
    a tremendous financial motivation,
  • 6:59 - 7:01
    and especially here in Media City,
  • 7:01 - 7:04
    where people make their living
    out of these kind of things.
  • 7:04 - 7:08
    So, how... I'm not sure which
    of the three, but OK.
  • 7:10 - 7:13
    How detailed and how accurate
    will the simulations be?
  • 7:13 - 7:16
    And the answer turns out,
    as we know from experience,
  • 7:16 - 7:18
    computation power is the first issue,
  • 7:18 - 7:20
    you have to have tremendously good
    computation power
  • 7:20 - 7:22
    to make a really good quality simulation,
  • 7:22 - 7:25
    and good programming,
    that is good software,
  • 7:25 - 7:27
    to explain what's going on,
    that's the second.
  • 7:27 - 7:32
    But, clearly we're making progress,
    just look at the games, look at PONG,
  • 7:32 - 7:36
    and look at the kind of video games
    we have now. So, we'll see.
  • 7:37 - 7:39
    What about simulations
    by other civilizations?
  • 7:39 - 7:43
    So, now we know a lot more
    about this than we did 30 years ago.
  • 7:43 - 7:45
    We've made tremendous progress.
  • 7:45 - 7:48
    We've discovered more than
    2,000 other stars
  • 7:48 - 7:50
    that have planetary systems
    around them.
  • 7:50 - 7:55
    And we know there is at least on the order
    of a billion or more habitable planets
  • 7:55 - 7:58
    in our galaxy, and there are about
    100 billion galaxies,
  • 7:58 - 8:02
    so around 10^20 to 10^22,
    depending on the range you assume,
  • 8:02 - 8:07
    possible sites for life and then
    advanced civilizations in the universe.
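A back-of-the-envelope check of the arithmetic sketched here, a minimal sketch using the round numbers from the talk (about a billion habitable planets per galaxy, about 100 billion galaxies):

```python
# Rough count of possible sites for life, using the round numbers from the talk.
habitable_planets_per_galaxy = 1e9  # "on the order of a billion or more"
galaxies = 1e11                     # "about 100 billion galaxies"

sites = habitable_planets_per_galaxy * galaxies
print(f"~{sites:.0e} possible sites")  # ~1e+20, the low end of the 10^20-10^22 range
```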
  • 8:07 - 8:10
    So, what are the chances
    that the earth is the most advanced,
  • 8:10 - 8:13
    the most computationally powerful?
  • 8:13 - 8:18
    Well, you've got to be
    pretty much convinced that you're special
  • 8:18 - 8:21
    to think that the odds are
    that we're the top.
  • 8:25 - 8:31
    So, the question is:
    Will advanced beings run simulations?
  • 8:32 - 8:35
    And, in fact, will simulated beings
    run simulations?
  • 8:35 - 8:39
    If we're simulated, are we running
    simulations in our simulations,
  • 8:39 - 8:43
    simulations all the way down?
    If you know the saying.
  • 8:43 - 8:49
    So, even the people running our simulation
    don't know if they're a simulation or not.
  • 8:50 - 8:54
    It's interesting, because
    it raises ethical questions and a bunch of things
  • 8:54 - 8:57
    because there might be somebody
    watching you.
  • 8:57 - 9:02
    So, are ethical considerations
    likely to stop every single civilization
  • 9:02 - 9:06
    from running simulations and running
    large numbers of simulations?
  • 9:06 - 9:08
    Well, the answer I think is "no".
  • 9:08 - 9:12
    What if doing simulations is likely to save
    what we think are real lives?
  • 9:12 - 9:14
    Right? We're willing
    to do the simulations
  • 9:14 - 9:17
    even though there are beings trapped
    in that simulation, right?
  • 9:17 - 9:18
    Conscious beings.
  • 9:18 - 9:21
    And the other thing you might consider is
    how human beings
  • 9:21 - 9:23
    treat what they think
    are real human beings.
  • 9:23 - 9:26
    How's the ethical treatment on our Earth?
  • 9:26 - 9:28
    And how much more
    is society likely to advance
  • 9:28 - 9:32
    before we're doing
  • 9:32 - 9:36
    very advanced simulations
    of civilizations and beings?
  • 9:37 - 9:41
    So, we'll probably all be in a simulation.
    The lights are not on enough in here,
  • 9:41 - 9:43
    but look to the left
    and look to the right,
  • 9:43 - 9:46
    if there's anybody here you think
    is a real person, this is a random sample,
  • 9:46 - 9:49
    then you're probably not.
    (Laughter)
  • 9:51 - 9:54
    But, you know, if you think you're
    a social scientist
  • 9:54 - 9:56
    or an anthropologist or something,
    and you want to run
  • 9:56 - 9:58
    and see how
    the civilizations rise and fall,
  • 9:58 - 10:02
    you'll run simulations with
    up to billions of people.
  • 10:03 - 10:05
    And you will run many
    of those simulations,
  • 10:05 - 10:07
    so it's not so hard to imagine
    you'll get up to the level
  • 10:07 - 10:12
    of 10^12:1 simulated beings
    to unsimulated beings,
  • 10:12 - 10:15
    that's why it
    becomes fairly likely
  • 10:15 - 10:20
    that any being that has behavior
    or activities and experiences like ours
  • 10:20 - 10:21
    is simulated.
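The step from that head count to a probability can be made explicit; a minimal sketch, assuming the 10^12:1 ratio quoted above and no other way to tell which kind of being you are:

```python
# If simulated beings outnumber unsimulated beings R to 1, and your experiences
# give you no way to tell which you are, the chance you are simulated is R / (R + 1).
def p_simulated(ratio: float) -> float:
    return ratio / (ratio + 1.0)

print(p_simulated(1e12))  # ~0.999999999999 -- "fairly likely" indeed
```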
  • 10:23 - 10:26
    Sorry, I got some sunscreen in my eye.
  • 10:26 - 10:29
    I put on sunscreen this morning in case
    it was an unusual day in England.
  • 10:29 - 10:32
    (Laughter)
  • 10:32 - 10:34
    And I got a little in my eye here.
  • 10:34 - 10:39
    So let's talk about how we're going
    to do the simulations on the Earth.
  • 10:40 - 10:42
    This is part of going back
    to convince you
  • 10:42 - 10:44
    that we're going to have
    realistic simulations
  • 10:44 - 10:47
    and we're going to have
    artificial reality to go with it.
  • 10:47 - 10:51
    So, can we take a real brain
    and make it into a virtual mind?
  • 10:51 - 10:54
    And the answer is:
    So, here is the purple real brain,
  • 10:54 - 10:57
    and the neurons behind it,
    it's this neural net,
  • 10:57 - 11:00
    it's the original neural net,
    as far as we're concerned.
  • 11:00 - 11:04
    And then on the left,
    yeah your left,
  • 11:04 - 11:06
    there is the beginnings
    of a mapping of a brain,
  • 11:07 - 11:10
    so that I can take and map that brain,
    and just place it into a computer.
  • 11:11 - 11:13
    So, how's that going to work?
  • 11:13 - 11:16
    The answer is, it's going
    to work just fine,
  • 11:16 - 11:18
    because we're getting to the point
    where we can do it now.
  • 11:18 - 11:23
    So, here is a high-resolution,
    45-minute brain scan
  • 11:23 - 11:25
    that was done in February.
  • 11:25 - 11:29
    And 45 minutes, that's how long
    you have to hold the person's head still,
  • 11:29 - 11:31
    in order to make a map to this level.
  • 11:31 - 11:33
    And what you can see here are the main --
  • 11:33 - 11:36
    Let's see if the laser pointer works --
    -- No --
  • 11:36 - 11:39
    So, you can see here the main highways
    in your brain.
  • 11:39 - 11:43
    They're mapped out by this,
    and this is basically an MRI.
  • 11:44 - 11:47
    I got a scan of my brain done
    and I was really impressed,
  • 11:47 - 11:49
    to prove that I had a brain,
    but one of my friends got an fMRI
  • 11:49 - 11:53
    to prove that his brain worked.
    (Laughter)
  • 11:54 - 11:57
    The thing that's impressive about this
  • 11:57 - 12:00
    is that the MRI's
    are getting so good now,
  • 12:00 - 12:02
    you can map to
    the individual neuron level.
  • 12:03 - 12:05
    The problem is there's a lot of neurons,
  • 12:05 - 12:07
    so you have to hold
    the head still for a long time,
  • 12:07 - 12:11
    and that's an advance in the ability
    to do the mapping,
  • 12:11 - 12:13
    and also in the software
    for doing that mapping.
  • 12:13 - 12:15
    And, so, that's where we are today.
  • 12:15 - 12:19
    If we can hold the person still
    long enough, if we can find a volunteer
  • 12:19 - 12:21
    that we can put, you know,
  • 12:21 - 12:25
    the little plastic thing on their head,
    to hold their head still for some days,
  • 12:25 - 12:28
    which is a little bit of a problem,
    we could probably go ahead
  • 12:28 - 12:30
    and map their entire brain,
  • 12:30 - 12:33
    and then just transform
    that map into a computer model,
  • 12:33 - 12:37
    and we would have that person's mind
    downloaded into a computer.
  • 12:37 - 12:41
    This is coming and this is coming soon,
  • 12:41 - 12:46
    just like it's now possible, for on the order
    of £1,000, to get your DNA mapped,
  • 12:48 - 12:50
    it's going to cost you something,
  • 12:50 - 12:52
    in about 30 years
    it's going to be possible
  • 12:52 - 12:56
    to download your brain into a computer
    for about £1,000,
  • 12:56 - 12:59
    plus inflation.
    (Laughter)
  • 13:01 - 13:03
    Could go up, could go down.
    Right?
  • 13:03 - 13:05
    But there's tremendous advances
    in technology
  • 13:05 - 13:08
    and these are making it possible
    to do things that we couldn't do before.
  • 13:08 - 13:10
    So, I have a quote from
    a Google expert:
  • 13:10 - 13:14
    we'll be uploading entire minds
    to computers in 2045.
  • 13:14 - 13:18
    He also says we'll do bodies too,
    I'm thinking we won't do bodies,
  • 13:18 - 13:22
    what we'll do is we'll take that mind
    and keep it from going out of its mind,
  • 13:22 - 13:24
    we'll put it in an artificial reality,
  • 13:24 - 13:26
    it's in the computer,
    it's going to get bored,
  • 13:26 - 13:28
    wants entertainment,
    wants social interactions,
  • 13:28 - 13:30
    so we're going to
    create artificial realities.
  • 13:30 - 13:33
    Now, in the old days,
    we'd make a thing like that,
  • 13:33 - 13:35
    if you remember The Matrix.
    Right?
  • 13:35 - 13:37
    Ones and zeros, now, in fact,
    we may use quantum computers,
  • 13:37 - 13:39
    so we'll have entangled states,
  • 13:39 - 13:42
    but, in fact, it'll be
    some kind of a complicated environment
  • 13:42 - 13:45
    where we can interact socially,
    because people want to be social,
  • 13:45 - 13:48
    so there'll have to be thousands
    of people to interact with,
  • 13:48 - 13:50
    and there'll have to be
    all kinds of other things
  • 13:50 - 13:52
    in order to make
    that artificial environment
  • 13:52 - 13:55
    sort of realistic and keep you going.
  • 13:55 - 13:57
    And remember,
    when you download your brain,
  • 13:57 - 13:59
    you're going to think
    about a million times faster,
  • 13:59 - 14:01
    you're going to experience life
    about a million times faster.
  • 14:01 - 14:04
    It's going to be
    a very different kind of a situation.
  • 14:04 - 14:06
    You know? The idea of going back
    into machines
  • 14:06 - 14:08
    and going out into the real world
    where things are still so slow,
  • 14:08 - 14:11
    you're going to get tired
    of doing that.
  • 14:11 - 14:17
    And the size of simulated porn
    isn't so good in the real world.
  • 14:17 - 14:19
    Okay, so, now,
    the other thing I have to do
  • 14:19 - 14:23
    is to attack your certainty.
  • 14:24 - 14:27
    So, I have to point out to you,
    human beings are not good at figuring out
  • 14:27 - 14:29
    if they're real.
  • 14:29 - 14:31
    So, your mind is really not equipped
  • 14:31 - 14:34
    for addressing this
    and many other important questions.
  • 14:34 - 14:36
    So, the first question
    I'm going to give you
  • 14:36 - 14:39
    is to count the number of black dots.
  • 14:39 - 14:41
    (Laughter)
  • 14:41 - 14:44
    It's a still picture and
    there's no video.
  • 14:50 - 14:53
    Here we go.
    You see this picture?
  • 14:53 - 14:56
    How many of you see
    the horse in the picture?
  • 14:56 - 14:59
    How many of you cannot see
    the horse in the picture?
  • 14:59 - 15:02
    Once you see the horse,
    it's hard not to see the horse.
  • 15:02 - 15:07
    And I want to show a picture of an object
    and ask you: Can it be real?
  • 15:07 - 15:10
    And then I'm going to tell you,
    it's a photograph,
  • 15:10 - 15:13
    the watch is real,
    the paper's real, the desk is real,
  • 15:13 - 15:15
    Is that object real?
  • 15:15 - 15:18
    Well, it's a photograph,
    so it's real in some sense,
  • 15:19 - 15:24
    as is this, and for me,
    this object flips back and forth.
  • 15:26 - 15:34
    Here's the real version of that, made out
    of 2x4s, focused at different angles,
  • 15:34 - 15:38
    and you see it's an optical illusion
    where your eye puts it together.
  • 15:38 - 15:42
    And here's another example,
    just for fun,
  • 15:44 - 15:46
    because you know it's just rotated,
  • 15:46 - 15:49
    but the first response is:
    Phew, that's weird.
  • 15:49 - 15:51
    So, here's one you're going
    to get the answer to.
  • 15:51 - 15:53
    Which of these is longer?
  • 15:55 - 15:58
    So, they're the same. How come?
    It doesn't really look that way to you.
  • 15:58 - 16:02
    You knew that I was tricking you,
    so you understand,
  • 16:02 - 16:06
    well, you still look at it, and think,
    well, but I'd better check now,
  • 16:06 - 16:08
    because I know I'm going
    to make mistakes.
  • 16:08 - 16:10
    Okay, so one more.
  • 16:10 - 16:14
    I'll skip the lilac chaser
    on the other side,
  • 16:14 - 16:16
    and just ask you
    about the stuff on the right.
  • 16:16 - 16:18
    Are those lines straight or not?
  • 16:19 - 16:21
    Well, they are straight,
    but to your eyes,
  • 16:21 - 16:23
    it's really hard to convince you
    they're straight.
  • 16:23 - 16:27
    Your brain is set to do other things.
  • 16:27 - 16:30
    Here's another test, a Bayesian reasoning test --
  • 16:31 - 16:32
    it's in the land of Bayes
    that we do it.
  • 16:32 - 16:37
    There's many examples you can give this,
    but a cab was involved in a hit-and-run,
  • 16:37 - 16:39
    and two cab companies are in the town,
  • 16:39 - 16:42
    green cabs and blue cabs.
  • 16:42 - 16:46
    And of the cabs in operation,
    85% are green and 15% are blue.
  • 16:46 - 16:48
    The witness says the cab was blue.
  • 16:48 - 16:51
    When he's tested,
    he gets it right 80% of the time,
  • 16:51 - 16:53
    or she gets it right 80% of the time.
  • 16:53 - 16:56
    What's the probability
    that it really was a blue cab?
  • 16:56 - 16:59
    You have to go through
    the calculation carefully,
  • 16:59 - 17:03
    this is the only equation, usually when
    I put equations up people go -- but --
  • 17:04 - 17:07
    there's almost a 60% chance
    that the cab was green,
  • 17:07 - 17:10
    even though this person
    gets it 80% right.
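For readers who want the calculation spelled out, here is a minimal sketch of the Bayes step behind that number, using the figures given in the talk (85% green cabs, 15% blue, witness correct 80% of the time):

```python
# Bayes' rule for the hit-and-run cab: what is the chance the cab really was blue,
# given that the witness says it was blue?
p_blue, p_green = 0.15, 0.85
p_says_blue_given_blue = 0.80   # witness identifies a blue cab correctly
p_says_blue_given_green = 0.20  # witness mistakes a green cab for blue

p_says_blue = p_says_blue_given_blue * p_blue + p_says_blue_given_green * p_green
p_blue_given_says_blue = p_says_blue_given_blue * p_blue / p_says_blue
print(f"P(blue | says blue)  = {p_blue_given_says_blue:.2f}")      # ~0.41
print(f"P(green | says blue) = {1 - p_blue_given_says_blue:.2f}")  # ~0.59, "almost 60%"
```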
  • 17:10 - 17:13
    And this is relevant, but
    there are other kinds of tests like that.
  • 17:13 - 17:15
    So, you can have tests
    that are even more powerful,
  • 17:15 - 17:17
    like a test for breast cancer
    that is 99% correct;
  • 17:17 - 17:19
    it gets the wrong answer
    1% of the time.
  • 17:19 - 17:22
    But roughly a thousand times
    as many people get tested as have it --
  • 17:22 - 17:26
    you know, only 1 in 1,000 of the people that
    get tested actually has breast cancer --
  • 17:26 - 17:30
    so, when you get the first response
    that you have breast cancer,
  • 17:30 - 17:33
    it's only about a 10% chance that you really do,
    at least not until you get the next test.
  • 17:33 - 17:37
    But the 10%, you know, the 10 people
    are freaked out.
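The same Bayes step explains the breast-cancer figure; a minimal sketch, assuming roughly 1 in 1,000 people tested actually has the disease and a test that is wrong 1% of the time (the talk's figures are approximate):

```python
# Base-rate effect for a 99%-accurate test when only ~1 in 1,000 tested people has the disease.
prevalence = 1 / 1000       # P(disease) among people tested (assumed)
sensitivity = 0.99          # P(positive | disease)
false_positive_rate = 0.01  # P(positive | no disease)

p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"{p_disease_given_positive:.0%}")  # ~9%, i.e. roughly the 10% mentioned in the talk
```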
  • 17:37 - 17:41
    Humans aren't ready for
    dealing with that kind of thing.
  • 17:41 - 17:44
    Okay, so, it's because
    we lack computing power.
  • 17:44 - 17:47
    So, we have compromises
    in our algorithms, right?
  • 17:47 - 17:51
    Humans, therefore, are susceptible
    to optical illusions,
  • 17:51 - 17:53
    systematic errors in judgment --
    I'm running out of time --
  • 17:53 - 17:56
    I should have gone faster --
    I got confident --
  • 17:56 - 18:00
    difficulty with complex decisions --
    and keeping on time --
  • 18:00 - 18:02
    and the ability to function
    in a prehistoric world,
  • 18:02 - 18:04
    which was the important one, you know.
  • 18:04 - 18:07
    Only a few percent of the humans
    got wiped out,
  • 18:07 - 18:09
    before they were able
    to reproduce, okay?
  • 18:09 - 18:12
    So, simulations are going to make
    the same kind of, or similar, approximations.
  • 18:12 - 18:15
    So, we have many contradictions.
  • 18:15 - 18:18
    We could see: if our physics
    is inconsistent,
  • 18:18 - 18:21
    then it's likely we're in a simulation;
    if physics is self-consistent,
  • 18:21 - 18:24
    it's more likely we're real,
    because it just takes more to do that.
  • 18:24 - 18:29
    So, then one of the implications is,
    if we're in a simulated environment
  • 18:29 - 18:32
    what are we going to see?
    Well, we're going to be discretized,
  • 18:32 - 18:35
    that is fuzzy on a small scale,
    we're going to have entangled states,
  • 18:35 - 18:38
    it means quantum mechanics.
    We have the holographic principle,
  • 18:38 - 18:41
    that everything inside every environment
    is encoded on the surface.
  • 18:41 - 18:43
    So, here's an example.
  • 18:43 - 18:46
    The hand and the apple
  • 18:46 - 18:49
    are encoded on a geometrical sheet
    but projected into three dimensions,
  • 18:49 - 18:51
    that's a way to keep track of everything,
  • 18:51 - 18:55
    and the large scale in space and time
    may not match the small scale.
  • 18:55 - 18:57
    So, let me finish up.
  • 18:57 - 19:00
    Human beings are ill-equipped
    for determining reality.
  • 19:00 - 19:04
    Physics, so this is actually
    a selling point for physics,
  • 19:04 - 19:07
    is a fundamental test of our realness.
    Currently we have contradictions.
  • 19:07 - 19:11
    Is that because we're not good
    at resolving things,
  • 19:11 - 19:15
    or is it because we're in a simulation?
    And what would that mean?
  • 19:15 - 19:17
    Thank you.
  • 19:17 - 19:19
    (Applause)