WEBVTT

00:00:00.000 --> 00:00:02.000
Steve Ramirez: My first year of grad school,

00:00:02.000 --> 00:00:04.000
I found myself in my bedroom

00:00:04.000 --> 00:00:06.000
eating lots of Ben & Jerry's

00:00:06.000 --> 00:00:08.000
watching some trashy TV

00:00:08.000 --> 00:00:10.000
and maybe, maybe listening to Taylor Swift.

00:00:10.000 --> 00:00:12.000
I had just gone through a breakup.

00:00:12.000 --> 00:00:14.000
(Laughter)

00:00:14.000 --> 00:00:16.000
So for the longest time, all I would do

00:00:16.000 --> 00:00:20.000
is recall the memory of this person over and over again,

00:00:20.000 --> 00:00:22.000
wishing that I could get rid of that gut-wrenching,

00:00:22.000 --> 00:00:25.000
visceral "blah" feeling.

NOTE Paragraph

00:00:25.000 --> 00:00:27.000
Now, as it turns out, I'm a neuroscientist,

00:00:27.000 --> 00:00:29.000
so I knew that the memory of that person

00:00:29.000 --> 00:00:32.000
and the awful, emotional undertones that color in that memory

00:00:32.000 --> 00:00:35.000
are largely mediated by separate brain systems.

00:00:35.000 --> 00:00:37.000
And so I thought, what if we could go to the brain

00:00:37.000 --> 00:00:39.000
and edit out that nauseating feeling

00:00:39.000 --> 00:00:43.000
while keeping the memory of that person intact?

00:00:43.000 --> 00:00:45.000
Then I realized, maybe that's a little bit lofty for now.

00:00:45.000 --> 00:00:47.000
So what if we could start off by going into the brain

00:00:47.000 --> 00:00:50.000
and just finding a single memory to begin with?

00:00:50.000 --> 00:00:53.000
Could we jump-start that memory back to life,

00:00:53.000 --> 00:00:57.000
maybe even play with the contents of that memory?

NOTE Paragraph

00:00:57.000 --> 00:00:59.000
All that said, there is one person in the entire world right now

00:00:59.000 --> 00:01:01.000
that I really hope is not watching this talk.
NOTE Paragraph

00:01:01.000 --> 00:01:05.000
(Laughter)

NOTE Paragraph

00:01:05.000 --> 00:01:08.000
So there is a catch. There is a catch.

00:01:08.000 --> 00:01:11.000
These ideas probably remind you of "Total Recall,"

00:01:11.000 --> 00:01:13.000
"Eternal Sunshine of the Spotless Mind,"

00:01:13.000 --> 00:01:15.000
or of "Inception."

00:01:15.000 --> 00:01:17.000
But the movie stars that we work with

00:01:17.000 --> 00:01:19.000
are the celebrities in the lab.

NOTE Paragraph

00:01:19.000 --> 00:01:21.000
Xu Liu: Test mice.

NOTE Paragraph

00:01:21.000 --> 00:01:23.000
(Laughter)

NOTE Paragraph

00:01:23.000 --> 00:01:25.000
As neuroscientists, we work in the lab with mice

00:01:25.000 --> 00:01:27.000
trying to understand how memory works.

00:01:27.000 --> 00:01:30.000
And today, we hope to convince you that now

00:01:30.000 --> 00:01:33.000
we are actually able to activate a memory in the brain

00:01:33.000 --> 00:01:36.000
at the speed of light.

00:01:36.000 --> 00:01:38.000
To do this, there are only two simple steps to follow.

00:01:38.000 --> 00:01:42.000
First, you find and label a memory in the brain,

00:01:42.000 --> 00:01:46.000
and then you activate it with a switch.

00:01:46.000 --> 00:01:48.000
As simple as that.

00:01:48.000 --> 00:01:50.000
(Laughter)

NOTE Paragraph

00:01:50.000 --> 00:01:52.000
SR: Are you convinced?

00:01:52.000 --> 00:01:54.000
So, it turns out finding a memory in the brain isn't all that easy.

NOTE Paragraph

00:01:54.000 --> 00:01:57.000
XL: Indeed. This is way more difficult than, let's say,

00:01:57.000 --> 00:01:59.000
finding a needle in a haystack,

00:01:59.000 --> 00:02:02.000
because at least, you know, the needle is still something

00:02:02.000 --> 00:02:05.000
you can physically put your fingers on.

00:02:05.000 --> 00:02:07.000
But memory is not.

00:02:07.000 --> 00:02:10.000
And also, there are way more cells in your brain

00:02:10.000 --> 00:02:15.000
than the number of straws in a typical haystack.
00:02:15.000 --> 00:02:18.000
So yeah, this task does seem to be daunting.

00:02:18.000 --> 00:02:22.000
But luckily, we got help from the brain itself.

00:02:22.000 --> 00:02:24.000
It turned out that all we need to do is basically

00:02:24.000 --> 00:02:26.000
to let the brain form a memory,

00:02:26.000 --> 00:02:30.000
and then the brain will tell us which cells are involved

00:02:30.000 --> 00:02:32.000
in that particular memory.

NOTE Paragraph

00:02:32.000 --> 00:02:34.000
SR: So what was going on in my brain

00:02:34.000 --> 00:02:37.000
while I was recalling the memory of an ex?

00:02:37.000 --> 00:02:39.000
If you were to just completely ignore human ethics for a second

00:02:39.000 --> 00:02:41.000
and slice up my brain right now,

00:02:41.000 --> 00:02:43.000
you would see that there was an amazing number

00:02:43.000 --> 00:02:45.000
of brain regions that were active while recalling that memory.

00:02:45.000 --> 00:02:48.000
Now one brain region that would be robustly active

00:02:48.000 --> 00:02:50.000
in particular is called the hippocampus,

00:02:50.000 --> 00:02:52.000
which for decades has been implicated in processing

00:02:52.000 --> 00:02:55.000
the kinds of memories that we hold near and dear,

00:02:55.000 --> 00:02:57.000
which also makes it an ideal target to go into

00:02:57.000 --> 00:03:00.000
and to try and find and maybe reactivate a memory.

NOTE Paragraph

00:03:00.000 --> 00:03:03.000
XL: When you zoom in on the hippocampus,

00:03:03.000 --> 00:03:05.000
of course you will see lots of cells,

00:03:05.000 --> 00:03:08.000
but we are able to find which cells are involved

00:03:08.000 --> 00:03:10.000
in a particular memory,

00:03:10.000 --> 00:03:12.000
because whenever a cell is active,

00:03:12.000 --> 00:03:14.000
like when it's forming a memory,

00:03:14.000 --> 00:03:17.000
it will also leave a footprint that will later allow us to know

00:03:17.000 --> 00:03:20.000
that these cells were recently active.
NOTE Paragraph

00:03:20.000 --> 00:03:22.000
SR: So the same way that building lights at night

00:03:22.000 --> 00:03:25.000
let you know that somebody's probably working there at any given moment,

00:03:25.000 --> 00:03:28.000
in a very real sense, there are biological sensors

00:03:28.000 --> 00:03:31.000
within a cell that are turned on

00:03:31.000 --> 00:03:33.000
only when that cell was just working.

00:03:33.000 --> 00:03:35.000
They're sort of biological windows that light up

00:03:35.000 --> 00:03:37.000
to let us know that that cell was just active.

NOTE Paragraph

00:03:37.000 --> 00:03:39.000
XL: So we cloned part of this sensor,

00:03:39.000 --> 00:03:42.000
and attached that to a switch to control the cells,

00:03:42.000 --> 00:03:46.000
and we packed this switch into an engineered virus

00:03:46.000 --> 00:03:49.000
and injected that into the brains of the mice.

00:03:49.000 --> 00:03:52.000
So whenever a memory is being formed,

00:03:52.000 --> 00:03:54.000
any active cells for that memory

00:03:54.000 --> 00:03:57.000
will also have this switch installed.

NOTE Paragraph

00:03:57.000 --> 00:03:59.000
SR: So here is what the hippocampus looks like

00:03:59.000 --> 00:04:01.000
after forming a fear memory, for example.

00:04:01.000 --> 00:04:03.000
The sea of blue that you see here

00:04:03.000 --> 00:04:05.000
is densely packed brain cells,

00:04:05.000 --> 00:04:07.000
but the green brain cells,

00:04:07.000 --> 00:04:10.000
the green brain cells are the ones that are holding on

00:04:10.000 --> 00:04:12.000
to a specific fear memory.

00:04:12.000 --> 00:04:13.000
So you are looking at the crystallization

00:04:13.000 --> 00:04:16.000
of the fleeting formation of fear.

00:04:16.000 --> 00:04:19.000
You're actually looking at the cross-section of a memory right now.

NOTE Paragraph

00:04:19.000 --> 00:04:21.000
XL: Now, for the switch we have been talking about,

00:04:21.000 --> 00:04:24.000
ideally, the switch has to act really fast.
00:04:24.000 --> 00:04:27.000
It shouldn't take minutes or hours to work.

00:04:27.000 --> 00:04:31.000
It should act at the speed of the brain, in milliseconds.

NOTE Paragraph

00:04:31.000 --> 00:04:33.000
SR: So what do you think, Xu?

00:04:33.000 --> 00:04:35.000
Could we use, let's say, pharmacological drugs

00:04:35.000 --> 00:04:37.000
to activate or inactivate brain cells?

NOTE Paragraph

00:04:37.000 --> 00:04:41.000
XL: Nah. Drugs are pretty messy. They spread everywhere.

00:04:41.000 --> 00:04:44.000
And also it takes them forever to act on cells.

00:04:44.000 --> 00:04:48.000
So it will not allow us to control a memory in real time.

00:04:48.000 --> 00:04:51.000
So Steve, how about we zap the brain with electricity?

NOTE Paragraph

00:04:51.000 --> 00:04:54.000
SR: So electricity is pretty fast,

00:04:54.000 --> 00:04:56.000
but we probably wouldn't be able to target it

00:04:56.000 --> 00:04:58.000
to just the specific cells that hold on to a memory,

00:04:58.000 --> 00:05:00.000
and we'd probably fry the brain.

NOTE Paragraph

00:05:00.000 --> 00:05:03.000
XL: Oh. That's true. So it looks like, hmm,

00:05:03.000 --> 00:05:06.000
indeed we need to find a better way

00:05:06.000 --> 00:05:09.000
to impact the brain at the speed of light.

NOTE Paragraph

00:05:09.000 --> 00:05:14.000
SR: So it just so happens that light travels at the speed of light.

00:05:14.000 --> 00:05:17.000
So maybe we could activate or inactivate memories

00:05:17.000 --> 00:05:19.000
by just using light --

NOTE Paragraph

00:05:19.000 --> 00:05:20.000
XL: That's pretty fast.

NOTE Paragraph

00:05:20.000 --> 00:05:22.000
SR: -- and because normally brain cells

00:05:22.000 --> 00:05:24.000
don't respond to pulses of light,

00:05:24.000 --> 00:05:26.000
those that would respond to pulses of light

00:05:26.000 --> 00:05:28.000
are those that contain a light-sensitive switch.
00:05:28.000 --> 00:05:30.000
Now to do that, first we need to trick brain cells

00:05:30.000 --> 00:05:32.000
to respond to laser beams.

NOTE Paragraph

00:05:32.000 --> 00:05:34.000
XL: Yep. You heard it right.

00:05:34.000 --> 00:05:36.000
We are trying to shoot lasers into the brain.

00:05:36.000 --> 00:05:37.000
(Laughter)

NOTE Paragraph

00:05:37.000 --> 00:05:40.000
SR: And the technique that lets us do that is optogenetics.

00:05:40.000 --> 00:05:43.000
Optogenetics gave us this light switch that we can use

00:05:43.000 --> 00:05:45.000
to turn brain cells on or off,

00:05:45.000 --> 00:05:47.000
and the name of that switch is channelrhodopsin,

00:05:47.000 --> 00:05:50.000
seen here as these green dots attached to this brain cell.

00:05:50.000 --> 00:05:53.000
You can think of channelrhodopsin as a sort of light-sensitive switch

00:05:53.000 --> 00:05:56.000
that can be artificially installed in brain cells

00:05:56.000 --> 00:05:58.000
so that now we can use that switch

00:05:58.000 --> 00:06:01.000
to activate or inactivate the brain cell simply by clicking it,

00:06:01.000 --> 00:06:03.000
and in this case we click it on with pulses of light.

00:06:03.000 --> 00:06:07.000
XL: So we attach this light-sensitive switch of channelrhodopsin

00:06:07.000 --> 00:06:09.000
to the sensor we've been talking about

00:06:09.000 --> 00:06:12.000
and inject this into the brain.

00:06:12.000 --> 00:06:15.000
So whenever a memory is being formed,

00:06:15.000 --> 00:06:17.000
any active cell for that particular memory

00:06:17.000 --> 00:06:20.000
will also have this light-sensitive switch installed in it

00:06:20.000 --> 00:06:23.000
so that we can control these cells

00:06:23.000 --> 00:06:27.000
with the flip of a laser just like this one you see.

NOTE Paragraph

00:06:27.000 --> 00:06:30.000
SR: So let's put all of this to the test now.
00:06:30.000 --> 00:06:32.000
What we can do is we can take our mice

00:06:32.000 --> 00:06:35.000
and then we can put them in a box that looks exactly like this box here,

00:06:35.000 --> 00:06:37.000
and then we can give them a very mild foot shock

00:06:37.000 --> 00:06:40.000
so that they form a fear memory of this box.

00:06:40.000 --> 00:06:42.000
They learn that something bad happened here.

00:06:42.000 --> 00:06:44.000
Now with our system, the cells that are active

00:06:44.000 --> 00:06:47.000
in the hippocampus in the making of this memory,

00:06:47.000 --> 00:06:50.000
only those cells will now contain channelrhodopsin.

NOTE Paragraph

00:06:50.000 --> 00:06:53.000
XL: When you are as small as a mouse,

00:06:53.000 --> 00:06:57.000
it feels as if the whole world is trying to get you.

00:06:57.000 --> 00:06:59.000
So your best response of defense

00:06:59.000 --> 00:07:01.000
is trying to be undetected.

00:07:01.000 --> 00:07:03.000
Whenever a mouse is in fear,

00:07:03.000 --> 00:07:05.000
it will show this very typical behavior

00:07:05.000 --> 00:07:07.000
by staying at one corner of the box,

00:07:07.000 --> 00:07:09.000
trying to not move any part of its body,

00:07:09.000 --> 00:07:12.000
and this posture is called freezing.

00:07:12.000 --> 00:07:16.000
So if a mouse remembers that something bad happened in this box,

00:07:16.000 --> 00:07:19.000
then when we put it back into the same box,

00:07:19.000 --> 00:07:21.000
it will basically show freezing

00:07:21.000 --> 00:07:23.000
because it doesn't want to be detected

00:07:23.000 --> 00:07:26.000
by any potential threats in this box.
NOTE Paragraph

00:07:26.000 --> 00:07:28.000
SR: So you can think of freezing like this:

00:07:28.000 --> 00:07:30.000
you're walking down the street minding your own business,

00:07:30.000 --> 00:07:32.000
and then out of nowhere you almost run into

00:07:32.000 --> 00:07:34.000
an ex-girlfriend or ex-boyfriend,

00:07:34.000 --> 00:07:36.000
and now those terrifying two seconds

00:07:36.000 --> 00:07:38.000
where you start thinking, "What do I do? Do I say hi?

00:07:38.000 --> 00:07:40.000
Do I shake their hand? Do I turn around and run away?

00:07:40.000 --> 00:07:42.000
Do I sit here and pretend like I don't exist?"

00:07:42.000 --> 00:07:44.000
Those kinds of fleeting thoughts that physically incapacitate you,

00:07:44.000 --> 00:07:46.000
that temporarily give you that deer-in-headlights look.

NOTE Paragraph

00:07:46.000 --> 00:07:50.000
XL: However, if you put the mouse in a completely different

00:07:50.000 --> 00:07:53.000
new box, like the next one,

00:07:53.000 --> 00:07:56.000
it will not be afraid of this box

00:07:56.000 --> 00:07:59.000
because there's no reason for it to be afraid of this new environment.

NOTE Paragraph

00:07:59.000 --> 00:08:04.000
But what if we put the mouse in this new box

00:08:04.000 --> 00:08:07.000
but at the same time, we activate the fear memory

00:08:07.000 --> 00:08:09.000
using lasers just like we did before?

00:08:09.000 --> 00:08:13.000
Are we going to bring back the fear memory

00:08:13.000 --> 00:08:16.000
for the first box into this completely new environment?

NOTE Paragraph

00:08:16.000 --> 00:08:19.000
SR: All right, and here's the million-dollar experiment.
00:08:19.000 --> 00:08:21.000
Now to bring back to life the memory of that day,

00:08:21.000 --> 00:08:24.000
I remember that the Red Sox had just won,

00:08:24.000 --> 00:08:26.000
it was a green spring day,

00:08:26.000 --> 00:08:29.000
perfect for going up and down the river

00:08:29.000 --> 00:08:31.000
and then maybe going to the North End

00:08:31.000 --> 00:08:33.000
to get some cannolis, [inaud].

00:08:33.000 --> 00:08:36.000
Now Xu and I, on the other hand,

00:08:36.000 --> 00:08:39.000
were in a completely windowless black room

00:08:39.000 --> 00:08:42.000
not making any ocular movement that even remotely resembles an eye blink

00:08:42.000 --> 00:08:44.000
because our eyes were fixed onto a computer screen.

00:08:44.000 --> 00:08:47.000
We were looking at this mouse here trying to activate a memory

00:08:47.000 --> 00:08:50.000
for the first time using our technique.

NOTE Paragraph

00:08:50.000 --> 00:08:52.000
XL: And this is what we saw.

00:08:52.000 --> 00:08:54.000
When we first put the mouse into this box,

00:08:54.000 --> 00:08:57.000
it's exploring, sniffing around, walking around,

00:08:57.000 --> 00:08:59.000
minding its own business,

00:08:59.000 --> 00:09:01.000
because actually by nature,

00:09:01.000 --> 00:09:03.000
mice are pretty curious animals.

00:09:03.000 --> 00:09:05.000
They want to know, what's going on in this new box?

00:09:05.000 --> 00:09:07.000
It's interesting.

00:09:07.000 --> 00:09:10.000
But the moment we turned on the laser, like you see now,

00:09:10.000 --> 00:09:13.000
all of a sudden the mouse entered this freezing mode.

00:09:13.000 --> 00:09:18.000
It stayed here and tried not to move any part of its body.

00:09:18.000 --> 00:09:20.000
Clearly it's freezing.

00:09:20.000 --> 00:09:22.000
So indeed, it looks like we are able to bring back

00:09:22.000 --> 00:09:24.000
the fear memory for the first box

00:09:24.000 --> 00:09:27.000
in this completely new environment.
NOTE Paragraph

00:09:27.000 --> 00:09:29.000
While watching this, Steve and I

00:09:29.000 --> 00:09:32.000
were as shocked as the mouse itself.

00:09:32.000 --> 00:09:36.000
So after the experiment, the two of us just left the room

00:09:36.000 --> 00:09:38.000
without saying anything.

NOTE Paragraph

00:09:38.000 --> 00:09:41.000
After a kind of long, awkward period of time,

00:09:41.000 --> 00:09:43.000
Steve broke the silence.

NOTE Paragraph

00:09:43.000 --> 00:09:45.000
SR: "Did that just work?"

NOTE Paragraph

00:09:45.000 --> 00:09:48.000
XL: "Yes," I said. "Indeed it worked!"

00:09:48.000 --> 00:09:50.000
We were really excited about this.

00:09:50.000 --> 00:09:53.000
And then we published our findings

00:09:53.000 --> 00:09:56.000
in the journal Nature.

00:09:56.000 --> 00:09:58.000
Ever since the publication of our work,

00:09:58.000 --> 00:10:00.000
we've been receiving numerous comments

00:10:00.000 --> 00:10:02.000
from all over the internet.

00:10:02.000 --> 00:10:04.000
Maybe we can take a look at some of those.

NOTE Paragraph

00:10:04.000 --> 00:10:06.000
(Laughter)

NOTE Paragraph

00:10:06.000 --> 00:10:08.000
["OMGGGGG FINALLY... so much more to come, virtual reality, neural manipulation, visual dream emulation... neural coding, 'writing and re-writing of memories', mental illnesses. Ahhh the future is awesome"]

NOTE Paragraph

00:10:08.000 --> 00:10:10.000
SR: So the first thing that you'll notice is that people

00:10:10.000 --> 00:10:13.000
have really strong opinions about this kind of work.

00:10:13.000 --> 00:10:15.000
Now I happen to completely agree with the optimism

00:10:15.000 --> 00:10:17.000
of this first quote,

00:10:17.000 --> 00:10:20.000
because on a scale of zero to Morgan Freeman's voice,

00:10:20.000 --> 00:10:22.000
it happens to be one of the most evocative accolades

00:10:22.000 --> 00:10:23.000
that I've heard come our way.
00:10:23.000 --> 00:10:25.000
(Laughter)

00:10:25.000 --> 00:10:28.000
But as you'll see, it's not the only opinion that's out there.

NOTE Paragraph

00:10:28.000 --> 00:10:30.000
["This scares the hell out of me... What if they could do that easily in humans in a couple of years?! OH MY GOD WE'RE DOOMED"]

NOTE Paragraph

00:10:30.000 --> 00:10:32.000
XL: Indeed, if we take a look at the second one,

00:10:32.000 --> 00:10:34.000
I think we can all agree that it's, meh,

00:10:34.000 --> 00:10:36.000
probably not as positive.

00:10:36.000 --> 00:10:38.000
But this also reminds us that,

00:10:38.000 --> 00:10:40.000
although we are still working with mice,

00:10:40.000 --> 00:10:43.000
it's probably a good idea to start thinking and discussing

00:10:43.000 --> 00:10:45.000
the possible ethical ramifications

00:10:45.000 --> 00:10:48.000
of memory control.

NOTE Paragraph

00:10:48.000 --> 00:10:50.000
["They should make a movie about this. Where they plant ideas into peoples minds, so they can control them for their own personal gain. We'll call it: Inception."]

NOTE Paragraph

00:10:50.000 --> 00:10:52.000
SR: Now, in the spirit of the third quote,

00:10:52.000 --> 00:10:54.000
we want to tell you about a recent project that we've been

00:10:54.000 --> 00:10:56.000
working on in lab that we've called "Project Inception."

00:10:56.000 --> 00:10:58.000
So -- (Laughter) --

00:10:58.000 --> 00:11:01.000
we reasoned that now that we can reactivate a memory,

00:11:01.000 --> 00:11:05.000
what if we do so but then begin to tinker with that memory?

00:11:05.000 --> 00:11:07.000
Could we possibly even turn it into a false memory?

NOTE Paragraph

00:11:07.000 --> 00:11:12.000
XL: So all memory is sophisticated and dynamic,

00:11:12.000 --> 00:11:14.000
but just for simplicity, let's imagine memory

00:11:14.000 --> 00:11:17.000
as a movie clip.
00:11:17.000 --> 00:11:19.000
So far what we've told you is basically that we can control

00:11:19.000 --> 00:11:21.000
this "play" button of the clip

00:11:21.000 --> 00:11:24.000
so that we can play this video clip any time, anywhere.

00:11:24.000 --> 00:11:27.000
But is there a possibility that we can actually get

00:11:27.000 --> 00:11:31.000
inside the brain and edit this movie clip

00:11:31.000 --> 00:11:34.000
so that we can make it different from the original?

00:11:34.000 --> 00:11:36.000
Yes we can.

00:11:36.000 --> 00:11:38.000
It turned out that all we need to do is basically

00:11:38.000 --> 00:11:42.000
reactivate a memory using lasers just like we did before,

00:11:42.000 --> 00:11:45.000
but at the same time, if we present new information

00:11:45.000 --> 00:11:50.000
and allow this new information to incorporate into this old memory,

00:11:50.000 --> 00:11:52.000
this will change the memory.

00:11:52.000 --> 00:11:55.000
It's sort of like making a remix tape.

NOTE Paragraph

00:11:55.000 --> 00:11:58.000
SR: So how do we do this?

00:11:58.000 --> 00:12:00.000
Rather than finding a fear memory in the brain,

00:12:00.000 --> 00:12:03.000
we can start by taking our animals,

00:12:03.000 --> 00:12:05.000
and let's say we put them in a blue box like this blue box here

00:12:05.000 --> 00:12:08.000
and we find the brain cells that represent that blue box

00:12:08.000 --> 00:12:10.000
and we trick them to respond to pulses of light

00:12:10.000 --> 00:12:12.000
exactly like we had done before.

00:12:12.000 --> 00:12:14.000
Now the next day, we can take our animals and place them

00:12:14.000 --> 00:12:17.000
in a red box that they've never experienced before.

00:12:17.000 --> 00:12:19.000
We can shoot light into the brain to reactivate

00:12:19.000 --> 00:12:21.000
the memory of the blue box.
00:12:21.000 --> 00:12:23.000
So what would happen here if, while the animal

00:12:23.000 --> 00:12:25.000
is recalling the memory of the blue box,

00:12:25.000 --> 00:12:27.000
we gave it a couple of mild foot shocks?

00:12:27.000 --> 00:12:30.000
So here we're trying to artificially make an association

00:12:30.000 --> 00:12:32.000
between the memory of the blue box

00:12:32.000 --> 00:12:34.000
and the foot shocks themselves.

00:12:34.000 --> 00:12:36.000
We're just trying to connect the two.

00:12:36.000 --> 00:12:37.000
So to test if we had done so,

00:12:37.000 --> 00:12:39.000
we can take our animals once again

00:12:39.000 --> 00:12:41.000
and place them back in the blue box.

00:12:41.000 --> 00:12:43.000
Again, we had just reactivated the memory of the blue box

00:12:43.000 --> 00:12:46.000
while the animal got a couple of mild foot shocks,

00:12:46.000 --> 00:12:48.000
and now the animal suddenly freezes.

00:12:48.000 --> 00:12:50.000
It's as though it's recalling being mildly shocked in this environment

00:12:50.000 --> 00:12:53.000
even though that never actually happened.

00:12:53.000 --> 00:12:55.000
So it formed a false memory,

00:12:55.000 --> 00:12:57.000
because it's falsely fearing an environment

00:12:57.000 --> 00:12:59.000
where, technically speaking,

00:12:59.000 --> 00:13:01.000
nothing bad actually happened to it.

NOTE Paragraph

00:13:01.000 --> 00:13:03.000
XL: So, so far we are only talking about

00:13:03.000 --> 00:13:06.000
this light-controlled "on" switch.

00:13:06.000 --> 00:13:08.000
In fact, we also have a light-controlled "off" switch,

00:13:08.000 --> 00:13:11.000
and it's very easy to imagine that

00:13:11.000 --> 00:13:13.000
by installing this light-controlled "off" switch,

00:13:13.000 --> 00:13:18.000
we can also turn off a memory, any time, anywhere.
NOTE Paragraph

00:13:18.000 --> 00:13:21.000
So everything we've been talking about today

00:13:21.000 --> 00:13:26.000
is based on this philosophically charged principle of neuroscience

00:13:26.000 --> 00:13:29.000
that the mind, with its seemingly mysterious properties,

00:13:29.000 --> 00:13:33.000
is actually made of physical stuff that we can tinker with.

NOTE Paragraph

00:13:33.000 --> 00:13:35.000
SR: And for me personally,

00:13:35.000 --> 00:13:37.000
I see a world where we can reactivate

00:13:37.000 --> 00:13:39.000
any kind of memory that we'd like.

00:13:39.000 --> 00:13:42.000
I also see a world where we can erase unwanted memories.

00:13:42.000 --> 00:13:44.000
Now, I even see a world where editing memories

00:13:44.000 --> 00:13:46.000
is something of a reality,

00:13:46.000 --> 00:13:48.000
because we're living in a time where it's possible

00:13:48.000 --> 00:13:50.000
to pluck questions from the tree of science fiction

00:13:50.000 --> 00:13:52.000
and to ground them in experimental reality.

NOTE Paragraph

00:13:52.000 --> 00:13:54.000
XL: Nowadays, people in the lab

00:13:54.000 --> 00:13:57.000
and people in other groups all over the world

00:13:57.000 --> 00:14:00.000
are using similar methods to activate or edit memories,

00:14:00.000 --> 00:14:04.000
whether they're old or new, positive or negative,

00:14:04.000 --> 00:14:06.000
all sorts of memories so that we can understand

00:14:06.000 --> 00:14:08.000
how memory works.

NOTE Paragraph

00:14:08.000 --> 00:14:11.000
SR: For example, one group in our lab

00:14:11.000 --> 00:14:13.000
was able to find the brain cells that make up a fear memory

00:14:13.000 --> 00:14:16.000
and converted them into a pleasurable memory, just like that.

00:14:16.000 --> 00:14:18.000
That's exactly what I mean about editing these kinds of processes.
00:14:18.000 --> 00:14:21.000
Now one dude in lab was even able to reactivate

00:14:21.000 --> 00:14:23.000
memories of female mice in male mice,

00:14:23.000 --> 00:14:26.000
which rumor has it is a pleasurable experience.

NOTE Paragraph

00:14:26.000 --> 00:14:30.000
XL: Indeed, we are living in a very exciting moment

00:14:30.000 --> 00:14:33.000
where science doesn't have any arbitrary speed limits

00:14:33.000 --> 00:14:37.000
but is only bound by our own imagination.

NOTE Paragraph

00:14:37.000 --> 00:14:39.000
SR: And finally, what do we make of all this?

00:14:39.000 --> 00:14:42.000
How do we push this technology forward?

00:14:42.000 --> 00:14:44.000
These are the questions that should not remain

00:14:44.000 --> 00:14:46.000
just inside the lab,

00:14:46.000 --> 00:14:48.000
and so one goal of today's talk was to bring everybody

00:14:48.000 --> 00:14:50.000
up to speed with the kind of stuff that's possible

00:14:50.000 --> 00:14:52.000
in modern neuroscience,

00:14:52.000 --> 00:14:54.000
but now, just as importantly,

00:14:54.000 --> 00:14:56.000
to actively engage everybody in this conversation.

00:14:56.000 --> 00:14:58.000
So let's think together as a team about what this all means

00:14:58.000 --> 00:15:00.000
and where we can and should go from here,

00:15:00.000 --> 00:15:02.000
because Xu and I think we all have

00:15:02.000 --> 00:15:05.000
some really big decisions ahead of us.

NOTE Paragraph

00:15:05.000 --> 00:15:07.000
Thank you. XL: Thank you.

NOTE Paragraph

00:15:07.000 --> 00:15:08.000
(Applause)