Steve Ramirez: My first year of grad school, I found myself in my bedroom, eating lots of Ben & Jerry's, watching some trashy TV, and maybe, maybe listening to Taylor Swift. I had just gone through a breakup.

(Laughter)

So for the longest time, all I would do is recall the memory of this person over and over again, wishing that I could get rid of that gut-wrenching, visceral "blah" feeling. Now, as it turns out, I'm a neuroscientist, so I knew that the memory of that person, and the awful emotional undertones that color in that memory, are largely mediated by separate brain systems. And so I thought, what if we could go into the brain and edit out that nauseating feeling, while keeping the memory of that person intact? Then I realized, maybe that's a little bit lofty for now. So what if we could start off by going into the brain and just finding a single memory to begin with? Could we jump-start that memory back to life, maybe even play with the contents of that memory? All that said, there is one person in the entire world right now that I really hope is not watching this talk.

(Laughter)

So there is a catch. There is a catch.
These ideas probably remind you of "Total Recall," "Eternal Sunshine of the Spotless Mind," or "Inception." But the movie stars that we work with are the celebrities of the lab.

Xu Liu: Test mice.

(Laughter)

As neuroscientists, we work in the lab with mice, trying to understand how memory works. And today, we hope to convince you that we are now actually able to activate a memory in the brain at the speed of light. To do this, there are only two simple steps to follow. First, you find and label a memory in the brain, and then you activate it with a switch. As simple as that.

(Laughter)

SR: Are you convinced? So, it turns out finding a memory in the brain isn't all that easy.

XL: Indeed. This is way more difficult than, let's say, finding a needle in a haystack, because at least, you know, the needle is still something you can physically put your fingers on. But a memory is not. And also, there are way more cells in your brain than there are straws in a typical haystack. So yeah, this task does seem daunting. But luckily, we got help from the brain itself.
It turns out that all we need to do is basically let the brain form a memory, and then the brain will tell us which cells were involved in that particular memory.

SR: So what was going on in my brain while I was recalling the memory of an ex? If you were to completely ignore human ethics for a second and slice up my brain right now, you would see an amazing number of brain regions that were active while recalling that memory. Now, one brain region that would be robustly active in particular is called the hippocampus, which for decades has been implicated in processing the kinds of memories that we hold near and dear, which also makes it an ideal target to go into and try to find, and maybe reactivate, a memory.

XL: When you zoom in on the hippocampus, of course you will see lots of cells, but we are able to find which cells are involved in a particular memory, because whenever a cell is active, like when it's forming a memory, it also leaves a footprint that later allows us to know that these cells were recently active.
SR: So the same way that building lights at night let you know that somebody's probably working there at any given moment, in a very real sense, there are biological sensors within a cell that are turned on only when that cell was just working. They're sort of biological windows that light up to let us know that that cell was just active.

XL: So we clipped part of this sensor and attached it to a switch to control the cells, and we packed this switch into an engineered virus and injected that into the brains of the mice. So whenever a memory is being formed, any cells active for that memory will also have this switch installed.

SR: So here is what the hippocampus looks like after forming a fear memory, for example. The sea of blue that you see here is densely packed brain cells, but the green brain cells, the green brain cells are the ones that are holding on to a specific fear memory. So you are looking at the crystallization of the fleeting formation of fear. You're actually looking at a cross-section of a memory right now.

XL: Now, for the switch we have been talking about, ideally it has to act really fast. It shouldn't take minutes or hours to work.
It should act at the speed of the brain, in milliseconds.

SR: So what do you think, Xu? Could we use, let's say, pharmacological drugs to activate or inactivate brain cells?

XL: Nah. Drugs are pretty messy. They spread everywhere. And also, it takes them forever to act on cells. So they would not allow us to control a memory in real time. So Steve, how about we zap the brain with electricity?

SR: So electricity is pretty fast, but we probably wouldn't be able to target it to just the specific cells that hold onto a memory, and we'd probably fry the brain.

XL: Oh. That's true. So it looks like, hmm, indeed we need to find a better way to impact the brain at the speed of light.

SR: So it just so happens that light travels at the speed of light. So maybe we could activate or inactivate memories just by using light --

XL: That's pretty fast.

SR: -- and because brain cells don't normally respond to pulses of light, the ones that would respond to pulses of light are those that contain a light-sensitive switch. Now to do that, first we need to trick brain cells into responding to laser beams.

XL: Yep. You heard it right. We are trying to shoot lasers into the brain.
(Laughter)

SR: And the technique that lets us do that is optogenetics. Optogenetics gave us this light switch that we can use to turn brain cells on or off, and the name of that switch is channelrhodopsin, seen here as these green dots attached to this brain cell. You can think of channelrhodopsin as a sort of light-sensitive switch that can be artificially installed in brain cells, so that now we can use that switch to activate or inactivate the brain cell simply by clicking it, and in this case we click it on with pulses of light.

XL: So we attach this light-sensitive switch of channelrhodopsin to the sensor we've been talking about and inject this into the brain. So whenever a memory is being formed, any cell active for that particular memory will also have this light-sensitive switch installed in it, so that we can control these cells with the flip of a laser, just like the one you see here.

SR: So let's put all of this to the test now. What we can do is take our mice, put them in a box that looks exactly like this box here, and then give them a very mild foot shock so that they form a fear memory of this box. They learn that something bad happened here.
Now with our system, only the cells that were active in the hippocampus in the making of this memory will contain channelrhodopsin.

XL: When you are as small as a mouse, it feels as if the whole world is trying to get you. So your best defense is to stay undetected. Whenever a mouse is afraid, it shows a very typical behavior: it stays in one corner of the box, trying not to move any part of its body, and this posture is called freezing. So if a mouse remembers that something bad happened in this box, when we put it back into the same box, it will basically show freezing, because it doesn't want to be detected by any potential threats in the box.

SR: So you can think of freezing like this: you're walking down the street minding your own business, and then out of nowhere you almost run into an ex-girlfriend or ex-boyfriend, and now come those terrifying two seconds where you start thinking, "What do I do? Do I say hi? Do I shake their hand? Do I turn around and run away? Do I sit here and pretend like I don't exist?"
Those kinds of fleeting thoughts that physically incapacitate you, that temporarily give you that deer-in-headlights look.

XL: However, if you put the mouse in a completely different, new box, like the next one, it will not be afraid of this box, because there's no reason for it to be afraid of this new environment. But what if we put the mouse in this new box, and at the same time we activate the fear memory using lasers, just like we did before? Are we going to bring back the fear memory of the first box into this completely new environment?

SR: All right, and here's the million-dollar experiment. Now, to bring back to life the memory of that day: I remember that the Red Sox had just won, it was a green spring day, perfect for going up and down the river and then maybe going to the North End to get some cannolis, #justsaying. Now Xu and I, on the other hand, were in a completely windowless black room, not making any ocular movement that even remotely resembled an eye blink, because our eyes were fixed on a computer screen. We were looking at this mouse here, trying to activate a memory for the first time using our technique.

XL: And this is what we saw.
When we first put the mouse into this box, it's exploring, sniffing around, walking around, minding its own business, because actually, by nature, mice are pretty curious animals. They want to know: what's going on in this new box? It's interesting. But the moment we turned on the laser, like you see now, all of a sudden the mouse entered this freezing mode. It stayed here and tried not to move any part of its body. Clearly it was freezing. So indeed, it looks like we are able to bring back the fear memory of the first box in this completely new environment.

While watching this, Steve and I were as shocked as the mouse itself.

(Laughter)

So after the experiment, the two of us just left the room without saying anything. After a kind of long, awkward period of time, Steve broke the silence.

SR: "Did that just work?"

XL: "Yes," I said. "Indeed it worked!" We were really excited about this. And then we published our findings in the journal Nature. Ever since the publication of our work, we've been receiving numerous comments from all over the Internet.
Maybe we can take a look at some of those.

["OMGGGGG FINALLY... so much more to come, virtual reality, neural manipulation, visual dream emulation... neural coding, 'writing and re-writing of memories', mental illnesses. Ahhh the future is awesome"]

SR: So the first thing that you'll notice is that people have really strong opinions about this kind of work. Now, I happen to completely agree with the optimism of this first quote, because on a scale of zero to Morgan Freeman's voice, it happens to be one of the most evocative accolades that I've heard come our way.

(Laughter)

But as you'll see, it's not the only opinion that's out there.

["This scares the hell out of me... What if they could do that easily in humans in a couple of years?! OH MY GOD WE'RE DOOMED"]

XL: Indeed, if we take a look at the second one, I think we can all agree that it's, meh, probably not as positive. But this also reminds us that, although we are still working with mice, it's probably a good idea to start thinking about and discussing the possible ethical ramifications of memory control.

SR: Now, in the spirit of the third quote, we want to tell you about a recent project that we've been working on in the lab, which we've called Project Inception.

["They should make a movie about this.
Where they plant ideas into peoples minds, so they can control them for their own personal gain. We'll call it: Inception."]

So we reasoned that now that we can reactivate a memory, what if we do so but then begin to tinker with that memory? Could we possibly even turn it into a false memory?

XL: So, all memory is sophisticated and dynamic, but just for simplicity, let's imagine memory as a movie clip. So far, what we've told you is basically that we can control the "play" button of the clip, so that we can play this video clip any time, anywhere. But is there a possibility that we could actually get inside the brain and edit this movie clip, so that we can make it different from the original? Yes, we can. It turns out that all we need to do is basically reactivate a memory using lasers, just like we did before, but at the same time, if we present new information and allow this new information to be incorporated into the old memory, this will change the memory. It's sort of like making a remix tape.

SR: So how do we do this?
Rather than finding a fear memory in the brain, we can start by taking our animals and, let's say, putting them in a blue box like this blue box here. We find the brain cells that represent that blue box, and we trick them into responding to pulses of light, exactly as we described before. Now the next day, we can take our animals and place them in a red box that they've never experienced before. We can shoot light into the brain to reactivate the memory of the blue box. So what would happen here if, while the animal is recalling the memory of the blue box, we gave it a couple of mild foot shocks? Here we're trying to artificially make an association between the memory of the blue box and the foot shocks themselves. We're just trying to connect the two. So to test whether we had done so, we can take our animals once again and place them back in the blue box. Remember, we had only reactivated the memory of the blue box while the animal got a couple of mild foot shocks, and yet now the animal suddenly freezes. It's as though it's recalling being mildly shocked in this environment, even though that never actually happened.
So it formed a false memory, because it's falsely fearing an environment where, technically speaking, nothing bad actually happened to it.

XL: So far we have only been talking about this light-controlled "on" switch. In fact, we also have a light-controlled "off" switch, and it's very easy to imagine that by installing this light-controlled "off" switch, we could also turn off a memory, any time, anywhere. So everything we've been talking about today is based on this philosophically charged principle of neuroscience: that the mind, with its seemingly mysterious properties, is actually made of physical stuff that we can tinker with.

SR: And for me personally, I see a world where we can reactivate any kind of memory that we'd like. I also see a world where we can erase unwanted memories. Now, I even see a world where editing memories is something of a reality, because we're living in a time when it's possible to pluck questions from the tree of science fiction and ground them in experimental reality.
XL: Nowadays, people in our lab and people in other groups all over the world are using similar methods to activate or edit memories, whether old or new, positive or negative, all sorts of memories, so that we can understand how memory works.

SR: For example, one group in our lab was able to find the brain cells that make up a fear memory and convert them into a pleasurable memory, just like that. That's exactly what I mean about editing these kinds of processes. Now, one dude in the lab was even able to reactivate memories of female mice in male mice, which rumor has it is a pleasurable experience.

XL: Indeed, we are living in a very exciting moment, where science doesn't have any arbitrary speed limits but is bound only by our own imagination.

SR: And finally, what do we make of all this? How do we push this technology forward? These are questions that should not remain just inside the lab, and so one goal of today's talk was to bring everybody up to speed with the kind of stuff that's possible in modern neuroscience but, just as importantly, to actively engage everybody in this conversation.
So let's think together as a team about what this all means and where we can and should go from here, because Xu and I think we all have some really big decisions ahead of us.

Thank you.

XL: Thank you.

(Applause)