WEBVTT

00:00:08.371 --> 00:00:10.419
Steve Ramirez: My first year of grad school,

00:00:10.443 --> 00:00:11.911
I found myself in my bedroom

00:00:11.931 --> 00:00:14.211
eating lots of Ben & Jerry's

00:00:14.231 --> 00:00:15.895
watching some trashy TV

00:00:15.915 --> 00:00:19.122
and maybe, maybe listening to Taylor Swift.

00:00:19.142 --> 00:00:20.863
I had just gone through a breakup.

00:00:20.883 --> 00:00:22.310
(Laughter)

00:00:22.330 --> 00:00:24.506
So for the longest time, all I would do

00:00:24.526 --> 00:00:28.322
is recall the memory of this person over and over again,

00:00:28.342 --> 00:00:30.815
wishing that I could get rid of that gut-wrenching,

00:00:30.835 --> 00:00:33.348
visceral "blah" feeling.

00:00:33.368 --> 00:00:35.631
Now, as it turns out, I'm a neuroscientist,

00:00:35.651 --> 00:00:38.033
so I knew that the memory of that person

00:00:38.053 --> 00:00:41.173
and the awful, emotional undertones that color in that memory

00:00:41.193 --> 00:00:43.807
are largely mediated by separate brain systems.

00:00:43.827 --> 00:00:46.304
And so I thought, what if we could go into the brain

00:00:46.328 --> 00:00:48.262
and edit out that nauseating feeling

00:00:48.282 --> 00:00:51.232
while keeping the memory of that person intact?

00:00:51.252 --> 00:00:53.938
Then I realized, maybe that's a little bit lofty for now.

00:00:53.938 --> 00:00:56.436
So what if we could start off by going into the brain

00:00:56.456 --> 00:00:59.069
and just finding a single memory to begin with?

00:00:59.089 --> 00:01:01.583
Could we jump-start that memory back to life,

00:01:01.603 --> 00:01:05.425
maybe even play with the contents of that memory?

00:01:05.425 --> 00:01:08.487
All that said, there is one person in the entire world right now

00:01:08.512 --> 00:01:10.647
that I really hope is not watching this talk.

00:01:10.678 --> 00:01:13.676
(Laughter)

00:01:13.696 --> 00:01:16.965
So, there is a catch. There is a catch.
00:01:16.985 --> 00:01:19.753
These ideas probably remind you of "Total Recall,"

00:01:19.773 --> 00:01:21.727
"Eternal Sunshine of the Spotless Mind,"

00:01:21.747 --> 00:01:23.030
or of "Inception."

00:01:23.050 --> 00:01:24.812
But the movie stars that we work with

00:01:24.836 --> 00:01:26.549
are the celebrities of the lab.

00:01:26.569 --> 00:01:28.449
Xu Liu: Test mice.

00:01:28.469 --> 00:01:29.577
(Laughter)

00:01:29.597 --> 00:01:32.731
As neuroscientists, we work in the lab with mice

00:01:32.751 --> 00:01:36.141
trying to understand how memory works.

00:01:36.161 --> 00:01:38.729
And today, we hope to convince you that now

00:01:38.765 --> 00:01:41.926
we are actually able to activate a memory in the brain

00:01:41.946 --> 00:01:44.096
at the speed of light.

00:01:44.116 --> 00:01:47.202
To do this, there are only two simple steps to follow.

00:01:47.222 --> 00:01:50.712
First, you find and label a memory in the brain,

00:01:50.732 --> 00:01:54.342
and then you activate it with a switch.

00:01:54.362 --> 00:01:55.787
As simple as that.

00:01:55.807 --> 00:01:57.609
(Laughter)

00:01:57.629 --> 00:01:59.454
SR: Are you convinced?

00:01:59.474 --> 00:02:03.175
So, it turns out finding a memory in the brain isn't all that easy.

00:02:03.195 --> 00:02:05.978
XL: Indeed. This is way more difficult than, let's say,

00:02:05.998 --> 00:02:08.382
finding a needle in a haystack,

00:02:08.401 --> 00:02:11.117
because at least, you know, the needle is still something

00:02:11.141 --> 00:02:13.471
you can physically put your fingers on.

00:02:13.491 --> 00:02:15.448
But memory is not.

00:02:15.468 --> 00:02:18.486
And also, there are way more cells in your brain

00:02:18.506 --> 00:02:23.552
than the number of straws in a typical haystack.

00:02:23.572 --> 00:02:26.431
So yeah, this task does seem to be daunting.

00:02:26.451 --> 00:02:30.110
But luckily, we got help from the brain itself.
00:02:30.130 --> 00:02:32.565
It turns out that all we need to do is basically

00:02:32.585 --> 00:02:34.558
to let the brain form a memory,

00:02:34.578 --> 00:02:38.388
and then the brain will tell us which cells are involved

00:02:38.408 --> 00:02:40.154
in that particular memory.

00:02:40.174 --> 00:02:42.511
SR: So what was going on in my brain

00:02:42.531 --> 00:02:44.605
while I was recalling the memory of an ex?

00:02:44.625 --> 00:02:47.627
If you were to just completely ignore human ethics for a second

00:02:47.627 --> 00:02:49.175
and slice up my brain right now,

00:02:49.175 --> 00:02:51.316
you would see that there was an amazing number

00:02:51.316 --> 00:02:54.271
of brain regions that were active while recalling that memory.

00:02:54.271 --> 00:02:56.783
Now one brain region that would be robustly active

00:02:56.803 --> 00:02:58.790
in particular is called the hippocampus,

00:02:58.810 --> 00:03:01.245
which for decades has been implicated in processing

00:03:01.265 --> 00:03:03.661
the kinds of memories that we hold near and dear,

00:03:03.681 --> 00:03:06.235
which also makes it an ideal target to go into

00:03:06.255 --> 00:03:09.020
and to try and find and maybe reactivate a memory.

00:03:09.040 --> 00:03:11.414
XL: When you zoom in on the hippocampus,

00:03:11.434 --> 00:03:13.762
of course you will see lots of cells,

00:03:13.782 --> 00:03:16.793
but we are able to find which cells are involved

00:03:16.813 --> 00:03:18.269
in a particular memory,

00:03:18.289 --> 00:03:20.887
because whenever a cell is active,

00:03:20.907 --> 00:03:22.431
like when it's forming a memory,

00:03:22.455 --> 00:03:26.108
it will also leave a footprint that will later allow us to know

00:03:26.128 --> 00:03:28.810
that these cells were recently active.
00:03:28.830 --> 00:03:31.145
SR: So the same way that building lights at night

00:03:31.165 --> 00:03:34.594
let you know that somebody's probably working there at any given moment,

00:03:34.618 --> 00:03:37.206
in a very real sense, there are biological sensors

00:03:37.226 --> 00:03:39.160
within a cell that are turned on

00:03:39.180 --> 00:03:41.295
only when that cell was just working.

00:03:41.315 --> 00:03:43.582
They're sort of biological windows that light up

00:03:43.602 --> 00:03:45.793
to let us know that that cell was just active.

00:03:45.817 --> 00:03:47.978
XL: So we clipped part of this sensor,

00:03:47.998 --> 00:03:51.125
and attached that to a switch to control the cells,

00:03:51.145 --> 00:03:55.025
and we packed this switch into an engineered virus

00:03:55.045 --> 00:03:57.613
and injected that into the brain of the mice.

00:03:57.633 --> 00:04:00.247
So whenever a memory is being formed,

00:04:00.267 --> 00:04:02.595
any active cells for that memory

00:04:02.615 --> 00:04:05.337
will also have this switch installed.

00:04:05.357 --> 00:04:07.548
SR: So here is what the hippocampus looks like

00:04:07.572 --> 00:04:09.802
after forming a fear memory, for example.

00:04:09.822 --> 00:04:11.942
The sea of blue that you see here

00:04:11.962 --> 00:04:13.894
is made up of densely packed brain cells,

00:04:13.914 --> 00:04:15.439
but the green brain cells,

00:04:15.459 --> 00:04:18.031
the green brain cells are the ones that are holding on

00:04:18.055 --> 00:04:19.378
to a specific fear memory.

00:04:19.398 --> 00:04:21.357
So you are looking at the crystallization

00:04:21.377 --> 00:04:23.744
of the fleeting formation of fear.

00:04:23.764 --> 00:04:25.890
You're actually looking at the cross-section

00:04:25.915 --> 00:04:27.236
of a memory right now.

00:04:27.261 --> 00:04:29.690
XL: Now, for the switch we have been talking about,

00:04:29.714 --> 00:04:32.619
ideally, the switch has to act really fast.
00:04:32.639 --> 00:04:35.198
It shouldn't take minutes or hours to work.

00:04:35.218 --> 00:04:39.462
It should act at the speed of the brain, in milliseconds.

00:04:39.482 --> 00:04:40.892
SR: So what do you think, Xu?

00:04:40.912 --> 00:04:43.494
Could we use, let's say, pharmacological drugs

00:04:43.514 --> 00:04:45.332
to activate or inactivate brain cells?

00:04:45.352 --> 00:04:49.395
XL: Nah. Drugs are pretty messy. They spread everywhere.

00:04:49.415 --> 00:04:52.403
And also it takes them forever to act on cells.

00:04:52.423 --> 00:04:56.052
So they will not allow us to control a memory in real time.

00:04:56.072 --> 00:05:00.346
So Steve, how about we zap the brain with electricity?

00:05:00.366 --> 00:05:02.651
SR: So electricity is pretty fast,

00:05:02.671 --> 00:05:04.807
but we probably wouldn't be able to target it

00:05:04.807 --> 00:05:07.211
to just the specific cells that hold onto a memory,

00:05:07.211 --> 00:05:08.759
and we'd probably fry the brain.

00:05:08.759 --> 00:05:11.817
XL: Oh. That's true. So it looks like, hmm,

00:05:11.837 --> 00:05:14.428
indeed we need to find a better way

00:05:14.448 --> 00:05:17.723
to impact the brain at the speed of light.

00:05:17.743 --> 00:05:22.809
SR: So it just so happens that light travels at the speed of light.

00:05:22.829 --> 00:05:26.292
So maybe we could activate or inactivate memories

00:05:26.312 --> 00:05:27.789
by just using light -

00:05:27.809 --> 00:05:29.144
XL: That's pretty fast.

00:05:29.164 --> 00:05:31.029
SR: - and because normally brain cells

00:05:31.049 --> 00:05:32.636
don't respond to pulses of light,

00:05:32.661 --> 00:05:34.829
those that would respond to pulses of light

00:05:34.829 --> 00:05:37.069
are those that contain a light-sensitive switch.

00:05:37.069 --> 00:05:39.426
Now to do that, first we need to trick brain cells

00:05:39.426 --> 00:05:40.694
into responding to laser beams.

00:05:40.694 --> 00:05:42.052
XL: Yep. You heard it right.
00:05:42.052 --> 00:05:44.185
We are trying to shoot lasers into the brain.

00:05:44.185 --> 00:05:45.394
(Laughter)

00:05:45.414 --> 00:05:48.718
SR: And the technique that lets us do that is optogenetics.

00:05:48.738 --> 00:05:52.000
Optogenetics gave us this light switch that we can use

00:05:52.020 --> 00:05:53.508
to turn brain cells on or off,

00:05:53.528 --> 00:05:56.025
and the name of that switch is channelrhodopsin,

00:05:56.045 --> 00:05:58.782
seen here as these green dots attached to this brain cell.

00:05:58.782 --> 00:06:02.051
You can think of channelrhodopsin as a sort of light-sensitive switch

00:06:02.051 --> 00:06:04.488
that can be artificially installed in brain cells

00:06:04.508 --> 00:06:06.402
so that now we can use that switch

00:06:06.422 --> 00:06:09.415
to activate or inactivate the brain cell simply by clicking it,

00:06:09.435 --> 00:06:11.959
and in this case we click it on with pulses of light.

00:06:11.983 --> 00:06:15.669
XL: So we attach this light-sensitive switch of channelrhodopsin

00:06:15.689 --> 00:06:17.877
to the sensor we've been talking about

00:06:17.897 --> 00:06:20.332
and inject this into the brain.

00:06:20.352 --> 00:06:23.543
So whenever a memory is being formed,

00:06:23.563 --> 00:06:25.770
any active cell for that particular memory

00:06:25.790 --> 00:06:29.254
will also have this light-sensitive switch installed in it

00:06:29.274 --> 00:06:31.655
so that we can control these cells

00:06:31.675 --> 00:06:35.919
by flipping on a laser just like this one you see.

00:06:35.939 --> 00:06:38.783
SR: So let's put all of this to the test now.

00:06:38.803 --> 00:06:40.918
What we can do is we can take our mice

00:06:40.938 --> 00:06:44.136
and then we can put them in a box that looks exactly like this one,

00:06:44.136 --> 00:06:46.426
and then we can give them a very mild foot shock

00:06:46.426 --> 00:06:48.523
so that they form a fear memory of this box.
00:06:48.523 --> 00:06:50.590
They learn that something bad happened here.

00:06:50.590 --> 00:06:52.760
Now with our system, the cells that are active

00:06:52.760 --> 00:06:55.473
in the hippocampus in the making of this memory,

00:06:55.493 --> 00:06:58.356
only those cells will now contain channelrhodopsin.

00:06:58.376 --> 00:07:01.373
XL: When you are as small as a mouse,

00:07:01.393 --> 00:07:04.968
it feels as if the whole world is trying to get you.

00:07:04.988 --> 00:07:06.716
So your best defense

00:07:06.736 --> 00:07:09.198
is trying to stay undetected.

00:07:09.218 --> 00:07:11.231
Whenever a mouse is in fear,

00:07:11.251 --> 00:07:13.140
it will show this very typical behavior

00:07:13.140 --> 00:07:14.859
by staying at one corner of the box,

00:07:14.879 --> 00:07:17.644
trying to not move any part of its body,

00:07:17.664 --> 00:07:20.939
and this posture is called freezing.

00:07:20.959 --> 00:07:25.233
So if a mouse remembers that something bad happened in this box,

00:07:25.253 --> 00:07:27.856
then when we put it back into the same box,

00:07:27.876 --> 00:07:29.660
it will basically show freezing

00:07:29.680 --> 00:07:31.945
because it doesn't want to be detected

00:07:31.965 --> 00:07:34.530
by any potential threats in this box.

00:07:34.530 --> 00:07:36.275
SR: So you can think of freezing as,

00:07:36.275 --> 00:07:38.980
you're walking down the street minding your own business,

00:07:38.980 --> 00:07:40.988
and then out of nowhere you almost run into

00:07:40.988 --> 00:07:42.587
an ex-girlfriend or ex-boyfriend,

00:07:42.587 --> 00:07:44.371
and now those terrifying two seconds

00:07:44.371 --> 00:07:46.847
where you start thinking, "What do I do? Do I say hi?

00:07:46.847 --> 00:07:49.305
Do I shake their hand? Do I turn around and run away?

00:07:49.305 --> 00:07:51.494
Do I sit here and pretend like I don't exist?"
00:07:51.494 --> 00:07:54.598
Those kinds of fleeting thoughts that physically incapacitate you,

00:07:54.598 --> 00:07:57.224
that temporarily give you that deer-in-headlights look.

00:07:57.224 --> 00:08:00.465
XL: However, if you put the mouse in a completely different new box,

00:08:00.465 --> 00:08:01.916
like the next one,

00:08:01.936 --> 00:08:04.063
it will not be afraid of this box

00:08:04.083 --> 00:08:08.792
because there's no reason for it to be afraid of this new environment.

00:08:08.812 --> 00:08:11.972
But what if we put the mouse in this new box

00:08:11.992 --> 00:08:15.583
but at the same time, we activate the fear memory

00:08:15.603 --> 00:08:18.262
using lasers just like we did before?

00:08:18.282 --> 00:08:21.116
Are we going to bring back the fear memory

00:08:21.136 --> 00:08:25.113
for the first box into this completely new environment?

00:08:25.133 --> 00:08:27.848
SR: All right, and here's the million-dollar experiment.

00:08:27.868 --> 00:08:30.761
Now to bring back to life the memory of that day,

00:08:30.781 --> 00:08:32.943
I remember that the Red Sox had just won,

00:08:32.964 --> 00:08:34.852
it was a green spring day,

00:08:34.873 --> 00:08:36.731
perfect for going up and down the river

00:08:36.755 --> 00:08:39.004
and then maybe going to the North End

00:08:39.024 --> 00:08:41.163
to get some cannolis, #justsaying.

00:08:41.183 --> 00:08:44.265
Now Xu and I, on the other hand,

00:08:44.284 --> 00:08:47.095
were in a completely windowless black room

00:08:47.115 --> 00:08:50.755
not making any ocular movement that even remotely resembles an eye blink

00:08:50.775 --> 00:08:53.222
because our eyes were fixed onto a computer screen.

00:08:53.242 --> 00:08:56.176
We were looking at this mouse here trying to activate a memory

00:08:56.196 --> 00:08:58.059
for the first time using our technique.

00:08:58.079 --> 00:09:00.349
XL: And this is what we saw.
00:09:00.369 --> 00:09:02.551
When we first put the mouse into this box,

00:09:02.571 --> 00:09:05.664
it's exploring, sniffing around, walking around,

00:09:05.684 --> 00:09:07.353
minding its own business,

00:09:07.373 --> 00:09:09.054
because actually by nature,

00:09:09.074 --> 00:09:11.033
mice are pretty curious animals.

00:09:11.053 --> 00:09:13.655
They want to know, what's going on in this new box?

00:09:13.675 --> 00:09:15.186
It's interesting.

00:09:15.206 --> 00:09:18.637
But the moment we turned on the laser, like you see now,

00:09:18.657 --> 00:09:21.669
all of a sudden the mouse entered this freezing mode.

00:09:21.689 --> 00:09:26.100
It stayed here and tried not to move any part of its body.

00:09:26.120 --> 00:09:27.728
Clearly it's freezing.

00:09:27.748 --> 00:09:30.311
So indeed, it looks like we are able to bring back

00:09:30.331 --> 00:09:32.375
the fear memory for the first box

00:09:32.395 --> 00:09:35.742
in this completely new environment.

00:09:35.762 --> 00:09:37.854
While watching this, Steve and I

00:09:37.874 --> 00:09:39.987
were as shocked as the mouse itself.

00:09:40.007 --> 00:09:41.249
(Laughter)

00:09:41.269 --> 00:09:44.556
So after the experiment, the two of us just left the room

00:09:44.576 --> 00:09:46.309
without saying anything.

00:09:46.329 --> 00:09:49.705
After a kind of long, awkward period of time,

00:09:49.725 --> 00:09:51.917
Steve broke the silence.

00:09:51.937 --> 00:09:54.258
SR: "Did that just work?"

00:09:54.278 --> 00:09:57.232
XL: "Yes," I said. "Indeed it worked!"

00:09:57.252 --> 00:09:59.349
We were really excited about this.

00:09:59.369 --> 00:10:01.973
And then we published our findings

00:10:01.993 --> 00:10:03.669
in the journal Nature.

00:10:03.689 --> 00:10:06.140
Ever since the publication of our work,

00:10:06.160 --> 00:10:08.555
we've been receiving numerous comments

00:10:08.575 --> 00:10:10.680
from all over the Internet.
00:10:10.700 --> 00:10:13.718
Maybe we can take a look at some of those.

00:10:16.625 --> 00:10:19.571
SR: So the first thing that you'll notice is that people have

00:10:19.592 --> 00:10:21.790
really strong opinions about this kind of work.

00:10:21.810 --> 00:10:24.344
Now I happen to completely agree with the optimism

00:10:24.364 --> 00:10:25.317
of this first quote,

00:10:25.341 --> 00:10:27.968
because on a scale of zero to Morgan Freeman's voice,

00:10:27.988 --> 00:10:30.446
it happens to be one of the most evocative accolades

00:10:30.466 --> 00:10:31.996
that I've heard come our way.

00:10:32.016 --> 00:10:33.837
(Laughter)

00:10:33.857 --> 00:10:36.846
But as you'll see, it's not the only opinion that's out there.

00:10:36.894 --> 00:10:39.639
XL: Indeed, if we take a look at the second one,

00:10:39.659 --> 00:10:41.746
I think we can all agree that it's, meh,

00:10:41.766 --> 00:10:43.752
probably not as positive.

00:10:43.772 --> 00:10:45.937
But this also reminds us that,

00:10:45.957 --> 00:10:48.123
although we are still working with mice,

00:10:48.143 --> 00:10:51.640
it's probably a good idea to start thinking about and discussing

00:10:51.660 --> 00:10:54.631
the possible ethical ramifications

00:10:54.651 --> 00:10:56.579
of memory control.

00:10:56.599 --> 00:10:58.779
SR: Now, in the spirit of the third quote,

00:10:58.799 --> 00:11:00.913
we want to tell you about a recent project

00:11:00.913 --> 00:11:04.396
that we've been working on in lab that we've called Project Inception.

00:11:06.589 --> 00:11:10.158
So we reasoned that now that we can reactivate a memory,

00:11:10.178 --> 00:11:13.110
what if we do so but then begin to tinker with that memory?

00:11:13.130 --> 00:11:16.143
Could we possibly even turn it into a false memory?

00:11:16.163 --> 00:11:20.242
XL: So all memory is sophisticated and dynamic,

00:11:20.262 --> 00:11:23.221
but just for simplicity, let's imagine memory

00:11:23.241 --> 00:11:24.623
as a movie clip.
00:11:24.643 --> 00:11:27.293
So far what we've told you is basically we can control

00:11:27.313 --> 00:11:29.224
this "play" button of the clip

00:11:29.244 --> 00:11:33.809
so that we can play this video clip any time, anywhere.

00:11:33.829 --> 00:11:36.340
But is there a possibility that we can actually get

00:11:36.360 --> 00:11:39.200
inside the brain and edit this movie clip

00:11:39.220 --> 00:11:42.096
so that we can make it different from the original?

00:11:42.116 --> 00:11:44.274
Yes, we can.

00:11:44.294 --> 00:11:46.485
It turns out that all we need to do is basically

00:11:46.509 --> 00:11:50.679
reactivate a memory using lasers just like we did before,

00:11:50.699 --> 00:11:54.118
but at the same time, if we present new information

00:11:54.138 --> 00:11:58.092
and allow this new information to be incorporated into this old memory,

00:11:58.112 --> 00:12:00.530
this will change the memory.

00:12:00.550 --> 00:12:04.193
It's sort of like making a remix tape.

00:12:04.213 --> 00:12:07.051
SR: So how do we do this?

00:12:07.071 --> 00:12:09.278
Rather than finding a fear memory in the brain,

00:12:09.278 --> 00:12:10.984
we can start by taking our animals,

00:12:10.984 --> 00:12:13.987
and let's say we put them in a blue box like this blue box here

00:12:13.987 --> 00:12:16.625
and we find the brain cells that represent that blue box

00:12:16.625 --> 00:12:18.854
and we trick them into responding to pulses of light

00:12:18.854 --> 00:12:20.445
exactly like we had said before.

00:12:20.445 --> 00:12:23.119
Now the next day, we can take our animals and place them

00:12:23.119 --> 00:12:25.508
in a red box that they've never experienced before.

00:12:25.508 --> 00:12:27.727
We can shoot light into the brain to reactivate

00:12:27.727 --> 00:12:29.303
the memory of the blue box.
00:12:29.323 --> 00:12:30.748
So what would happen here if,

00:12:30.748 --> 00:12:33.476
while the animal is recalling the memory of the blue box,

00:12:33.476 --> 00:12:35.584
we gave it a couple of mild foot shocks?

00:12:35.604 --> 00:12:38.277
So here we're trying to artificially make an association

00:12:38.297 --> 00:12:40.192
between the memory of the blue box

00:12:40.212 --> 00:12:41.695
and the foot shocks themselves.

00:12:41.715 --> 00:12:43.474
We're just trying to connect the two.

00:12:43.494 --> 00:12:45.017
So to test if we had done so,

00:12:45.037 --> 00:12:46.695
we can take our animals once again

00:12:46.695 --> 00:12:48.461
and place them back in the blue box.

00:12:48.461 --> 00:12:51.169
Again, we had just reactivated the memory of the blue box

00:12:51.169 --> 00:12:53.530
while the animal got a couple of mild foot shocks,

00:12:53.530 --> 00:12:55.566
and now the animal suddenly freezes.

00:12:55.586 --> 00:12:58.920
It's as though it's recalling being mildly shocked in this environment

00:12:58.944 --> 00:13:01.770
even though that never actually happened.

00:13:01.790 --> 00:13:03.632
So it formed a false memory,

00:13:03.652 --> 00:13:05.681
because it's falsely fearing an environment

00:13:05.701 --> 00:13:07.035
where, technically speaking,

00:13:07.059 --> 00:13:09.246
nothing bad actually happened to it.

00:13:09.266 --> 00:13:11.699
XL: So, so far we have only been talking about

00:13:11.719 --> 00:13:14.065
this light-controlled "on" switch.

00:13:14.085 --> 00:13:17.353
In fact, we also have a light-controlled "off" switch,

00:13:17.373 --> 00:13:19.433
and it's very easy to imagine

00:13:19.453 --> 00:13:22.059
that by installing this light-controlled "off" switch,

00:13:22.080 --> 00:13:26.264
we can also turn off a memory, any time, anywhere.
00:13:27.519 --> 00:13:29.713
So everything we've been talking about today

00:13:29.733 --> 00:13:34.390
is based on this philosophically charged principle of neuroscience

00:13:34.410 --> 00:13:38.508
that the mind, with its seemingly mysterious properties,

00:13:38.528 --> 00:13:42.153
is actually made of physical stuff that we can tinker with.

00:13:42.173 --> 00:13:43.628
SR: And for me personally,

00:13:43.648 --> 00:13:45.410
I see a world where we can reactivate

00:13:45.434 --> 00:13:47.325
any kind of memory that we'd like.

00:13:47.345 --> 00:13:50.623
I also see a world where we can erase unwanted memories.

00:13:50.643 --> 00:13:52.834
Now, I even see a world where editing memories

00:13:52.858 --> 00:13:54.098
is something of a reality,

00:13:54.118 --> 00:13:56.463
because we're living in a time where it's possible

00:13:56.463 --> 00:13:58.872
to pluck questions from the tree of science fiction

00:13:58.872 --> 00:14:00.908
and to ground them in experimental reality.

00:14:00.908 --> 00:14:02.461
XL: Nowadays, people in the lab

00:14:02.461 --> 00:14:04.717
and people in other groups all over the world

00:14:04.737 --> 00:14:08.534
are using similar methods to activate or edit memories,

00:14:08.554 --> 00:14:12.375
whether they're old or new, positive or negative,

00:14:12.395 --> 00:14:15.047
all sorts of memories so that we can understand

00:14:15.067 --> 00:14:16.911
how memory works.

00:14:16.931 --> 00:14:18.695
SR: For example, one group in our lab

00:14:18.715 --> 00:14:21.480
was able to find the brain cells that make up a fear memory

00:14:21.501 --> 00:14:24.388
and convert them into a pleasurable memory, just like that.

00:14:24.388 --> 00:14:27.530
That's exactly what I mean about editing these kinds of processes.

00:14:27.530 --> 00:14:29.752
Now one dude in lab was even able to reactivate

00:14:29.752 --> 00:14:31.553
memories of female mice in male mice,

00:14:31.553 --> 00:14:34.478
which rumor has it is a pleasurable experience.
00:14:34.498 --> 00:14:38.595
XL: Indeed, we are living in a very exciting moment

00:14:38.615 --> 00:14:42.400
where science doesn't have any arbitrary speed limits

00:14:42.420 --> 00:14:45.587
but is only bound by our own imagination.

00:14:45.607 --> 00:14:47.734
SR: And finally, what do we make of all this?

00:14:47.754 --> 00:14:49.705
How do we push this technology forward?

00:14:49.725 --> 00:14:51.916
These are the questions that should not remain

00:14:51.940 --> 00:14:53.217
just inside the lab,

00:14:53.237 --> 00:14:55.790
and so one goal of today's talk was to bring everybody

00:14:55.810 --> 00:14:58.191
up to speed with the kind of stuff that's possible

00:14:58.215 --> 00:14:59.492
in modern neuroscience,

00:14:59.512 --> 00:15:01.002
but now, just as importantly,

00:15:01.022 --> 00:15:04.334
to actively engage everybody in this conversation.

00:15:04.354 --> 00:15:07.164
So let's think together as a team about what this all means

00:15:07.188 --> 00:15:09.137
and where we can and should go from here,

00:15:09.157 --> 00:15:11.235
because Xu and I think we all have

00:15:11.255 --> 00:15:13.771
some really big decisions ahead of us.

00:15:13.791 --> 00:15:14.934
Thank you. XL: Thank you.

00:15:14.958 --> 00:15:17.079
(Applause)