Steve Ramirez: My first year of grad school, I found myself in my bedroom, eating lots of Ben & Jerry's, watching some trashy TV, and maybe, maybe listening to Taylor Swift. I had just gone through a breakup.

(Laughter)

So for the longest time, all I would do is recall the memory of this person over and over again, wishing that I could get rid of that gut-wrenching, visceral "blah" feeling. Now, as it turns out, I'm a neuroscientist, so I knew that the memory of that person and the awful emotional undertones that color in that memory are largely mediated by separate brain systems. And so I thought, what if we could go into the brain and edit out that nauseating feeling while keeping the memory of that person intact? Then I realized, maybe that's a little bit lofty for now. So what if we could start off by going into the brain and just finding a single memory to begin with? Could we jump-start that memory back to life, maybe even play with the contents of that memory?

All that said, there is one person in the entire world right now that I really hope is not watching this talk.

(Laughter)

So there is a catch. There is a catch. These ideas probably remind you of "Total Recall," "Eternal Sunshine of the Spotless Mind," or "Inception." But the movie stars that we work with are the celebrities of the lab.

Xu Liu: Test mice.

(Laughter)

As neuroscientists, we work in the lab with mice, trying to understand how memory works. And today, we hope to convince you that we are now actually able to activate a memory in the brain at the speed of light. To do this, there are only two simple steps to follow. First, you find and label a memory in the brain, and then you activate it with a switch. As simple as that.

(Laughter)

SR: Are you convinced? So, it turns out finding a memory in the brain isn't all that easy.

XL: Indeed. This is way more difficult than, let's say, finding a needle in a haystack, because at least, you know, the needle is still something you can physically put your fingers on. But a memory is not. And there are also far more cells in your brain than there are straws in a typical haystack. So yes, this task does seem daunting. But luckily, we got help from the brain itself.
It turns out that all we need to do is basically let the brain form a memory, and then the brain will tell us which cells were involved in that particular memory.

SR: So what was going on in my brain while I was recalling the memory of an ex? If you were to completely ignore human ethics for a second and slice up my brain right now, you would see that an amazing number of brain regions were active while I was recalling that memory. Now, one brain region that would be robustly active in particular is called the hippocampus, which for decades has been implicated in processing the kinds of memories that we hold near and dear, which also makes it an ideal target to go into and try to find and maybe reactivate a memory.

XL: When you zoom in on the hippocampus, of course you will see lots of cells, but we are able to find which cells are involved in a particular memory, because whenever a cell is active, like when it's forming a memory, it also leaves a footprint that later lets us know these cells were recently active.

SR: So the same way that building lights at night let you know that somebody's probably working there at any given moment, in a very real sense, there are biological sensors within a cell that are turned on only when that cell was just working. They're sort of biological windows that light up to let us know that that cell was just active.

XL: So we clipped part of this sensor and attached it to a switch to control the cells, and we packed this switch into an engineered virus and injected that into the brains of the mice. So whenever a memory is being formed, any cells active for that memory will also have this switch installed.

SR: So here is what the hippocampus looks like after forming a fear memory, for example. The sea of blue that you see here is densely packed brain cells, but the green brain cells, the green brain cells are the ones that are holding on to a specific fear memory. So you are looking at the crystallization of the fleeting formation of fear. You're actually looking at the cross-section of a memory right now.
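To make the tagging idea concrete, here is a minimal Python sketch of activity-dependent labeling: only cells whose activity trips the sensor during an open labeling window end up carrying the switch. The cell counts, the threshold, and the name `tag_active_cells` are illustrative assumptions for this sketch, not the speakers' actual molecular system.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CELLS = 10_000              # hippocampal cells in this toy model
ACTIVITY_THRESHOLD_HZ = 5.0   # firing rate above which the "sensor" trips

def tag_active_cells(firing_rates_hz, labeling_window_open):
    """Return a boolean mask of cells that get the light switch installed.

    A cell is tagged only if (a) it was active enough during memory
    formation to trip the activity sensor, and (b) the labeling window
    was open at the time, so unrelated activity is not tagged.
    """
    if not labeling_window_open:
        return np.zeros_like(firing_rates_hz, dtype=bool)
    return firing_rates_hz > ACTIVITY_THRESHOLD_HZ

# Toy "memory formation": a small ensemble fires strongly, the rest stay quiet.
rates = rng.exponential(scale=1.0, size=N_CELLS)        # background activity
engram = rng.choice(N_CELLS, size=300, replace=False)   # cells encoding the box
rates[engram] += 20.0

tagged = tag_active_cells(rates, labeling_window_open=True)
print(f"{tagged.sum()} of {N_CELLS} cells tagged "
      f"({100 * tagged.sum() / N_CELLS:.1f}% of the population)")
```

In a real experiment the "green cells" in the hippocampus image play the role of this tagged mask: a sparse subset of the population that happened to be active while the memory was formed.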
XL: Now, for the switch we have been talking about, ideally it has to act really fast. It shouldn't take minutes or hours to work. It should act at the speed of the brain, in milliseconds.

SR: So what do you think, Xu? Could we use, let's say, pharmacological drugs to activate or inactivate brain cells?

XL: Nah. Drugs are pretty messy. They spread everywhere. And it also takes them forever to act on cells. So they would not allow us to control a memory in real time. So Steve, how about we zap the brain with electricity?

SR: So electricity is pretty fast, but we probably wouldn't be able to target it to just the specific cells that hold onto a memory, and we'd probably fry the brain.

XL: Oh. That's true. So it looks like, hmm, indeed we need to find a better way to impact the brain at the speed of light.

SR: So it just so happens that light travels at the speed of light. So maybe we could activate or inactivate memories by just using light --

XL: That's pretty fast.

SR: -- and because normally brain cells don't respond to pulses of light, the only ones that would respond to pulses of light are those that contain a light-sensitive switch. Now to do that, first we need to trick brain cells into responding to laser beams.

XL: Yep. You heard it right. We are trying to shoot lasers into the brain.

(Laughter)

SR: And the technique that lets us do that is optogenetics. Optogenetics gave us a light switch that we can use to turn brain cells on or off, and the name of that switch is channelrhodopsin, seen here as these green dots attached to this brain cell. You can think of channelrhodopsin as a sort of light-sensitive switch that can be artificially installed in brain cells, so that we can use it to activate or inactivate the brain cell simply by clicking it, and in this case we click it on with pulses of light.

XL: So we attach this light-sensitive switch of channelrhodopsin to the sensor we've been talking about and inject this into the brain. So whenever a memory is being formed, any cell active for that particular memory will also have this light-sensitive switch installed in it, so that we can control these cells by the flip of a laser, just like the one you see here.
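Channelrhodopsin's role in this scheme can be caricatured in a few lines of code: a cell carrying the switch fires within a few milliseconds of each light pulse, while an ordinary cell ignores the light entirely. This is a toy sketch under assumed numbers (a 2 ms latency, a 20 Hz pulse train), not a biophysical model.

```python
def respond_to_light(has_channelrhodopsin, light_pulse_times_ms):
    """Toy model of optogenetic control.

    A cell that expresses channelrhodopsin fires shortly after each light
    pulse; a cell without the switch produces no light-evoked spikes.
    Returns the light-evoked spike times in milliseconds.
    """
    if not has_channelrhodopsin:
        return []                       # normal cells don't respond to light
    latency_ms = 2.0                    # assumed millisecond-scale response
    return [t + latency_ms for t in light_pulse_times_ms]

pulses = [0.0, 50.0, 100.0, 150.0]      # a 20 Hz train of laser pulses
print("tagged cell spikes at:  ", respond_to_light(True, pulses))
print("untagged cell spikes at:", respond_to_light(False, pulses))
```

The point of the caricature is the selectivity: light reaches every cell under the fiber, but only the tagged ones answer, and they answer on the brain's own timescale.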
SR: So let's put all of this to the test now. What we can do is take our mice, put them in a box that looks exactly like this box here, and then give them a very mild foot shock so that they form a fear memory of this box. They learn that something bad happened here. Now with our system, only the cells that were active in the hippocampus in the making of this memory will contain channelrhodopsin.

XL: When you are as small as a mouse, it feels as if the whole world is trying to get you. So your best defense is to try to go undetected. Whenever a mouse is afraid, it shows a very typical behavior: it stays in one corner of the box, trying not to move any part of its body, and this posture is called freezing. So if a mouse remembers that something bad happened in this box, when we put it back into the same box, it will basically freeze, because it doesn't want to be detected by any potential threats in this box.

SR: So you can think of freezing like this: you're walking down the street, minding your own business, and then out of nowhere you almost run into an ex-girlfriend or ex-boyfriend, and now come those terrifying two seconds where you start thinking, "What do I do? Do I say hi? Do I shake their hand? Do I turn around and run away? Do I sit here and pretend like I don't exist?" Those kinds of fleeting thoughts physically incapacitate you and temporarily give you that deer-in-headlights look.

XL: However, if you put the mouse in a completely different, new box, like this next one, it will not be afraid of this box, because it has no reason to be afraid of this new environment. But what if we put the mouse in this new box and, at the same time, activate the fear memory using lasers, just like we did before? Are we going to bring back the fear memory for the first box into this completely new environment?
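Freezing is the behavioral readout for that question. In practice it is typically scored automatically from video tracking, roughly as the fraction of time the animal's frame-to-frame motion stays below a small threshold for at least a second or so. The sketch below is a generic version of that kind of scoring; the threshold, frame rate, and minimum bout length are illustrative assumptions, not the authors' actual analysis parameters.

```python
import numpy as np

def freezing_fraction(positions_xy, fps=30, motion_threshold_cm=0.05,
                      min_bout_s=1.0):
    """Estimate the fraction of time a mouse spends freezing.

    positions_xy : (n_frames, 2) array of tracked body positions in cm.
    A frame counts as immobile if the animal moved less than
    `motion_threshold_cm` since the previous frame; immobility counts as
    freezing only once it lasts at least `min_bout_s` seconds.
    """
    step = np.linalg.norm(np.diff(positions_xy, axis=0), axis=1)
    immobile = step < motion_threshold_cm

    min_bout_frames = int(min_bout_s * fps)
    freezing = np.zeros_like(immobile)
    run_start = 0
    for i in range(1, len(immobile) + 1):
        # close a run of identical values at i == end or on a value change
        if i == len(immobile) or immobile[i] != immobile[run_start]:
            if immobile[run_start] and (i - run_start) >= min_bout_frames:
                freezing[run_start:i] = True
            run_start = i
    return float(freezing.mean())

# Toy check: a mouse that wanders for 5 s and then holds still for 5 s.
fps = 30
walk = np.cumsum(np.random.default_rng(1).normal(0, 0.3, size=(5 * fps, 2)), axis=0)
still = np.repeat(walk[-1:], 5 * fps, axis=0)
track = np.vstack([walk, still])
print(f"freezing fraction: {freezing_fraction(track, fps=fps):.2f}")  # about 0.5
```

Comparing this score between laser-off and laser-on epochs of the same session is what lets an experimenter say "the mouse froze when we turned the light on."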
SR: All right, and here's the million-dollar experiment. Now, to bring back to life the memory of that day: I remember that the Red Sox had just won, it was a green spring day, perfect for going up and down the river and then maybe going to the North End to get some cannolis, #justsaying. Now Xu and I, on the other hand, were in a completely windowless black room, not making any ocular movement that even remotely resembled an eye blink, because our eyes were fixed onto a computer screen. We were looking at this mouse here, trying to activate a memory for the first time using our technique.

XL: And this is what we saw. When we first put the mouse into this box, it's exploring, sniffing around, walking around, minding its own business, because by nature, mice are pretty curious animals. They want to know, what's going on in this new box? It's interesting. But the moment we turned on the laser, like you see now, all of a sudden the mouse entered this freezing mode. It stayed there and tried not to move any part of its body. Clearly it was freezing. So indeed, it looks like we were able to bring back the fear memory for the first box in this completely new environment.

While watching this, Steve and I were as shocked as the mouse itself.

(Laughter)

So after the experiment, the two of us just left the room without saying anything. After a kind of long, awkward period of time, Steve broke the silence.

SR: "Did that just work?"

XL: "Yes," I said. "Indeed it worked!" We were really excited about this. And then we published our findings in the journal Nature. Ever since the publication of our work, we've been receiving numerous comments from all over the Internet. Maybe we can take a look at some of those.

["OMGGGGG FINALLY... so much more to come, virtual reality, neural manipulation, visual dream emulation... neural coding, 'writing and re-writing of memories', mental illnesses. Ahhh the future is awesome"]

SR: So the first thing that you'll notice is that people have really strong opinions about this kind of work. Now, I happen to completely agree with the optimism of this first quote, because on a scale of zero to Morgan Freeman's voice, it happens to be one of the most evocative accolades that I've heard come our way.

(Laughter)

But as you'll see, it's not the only opinion that's out there.

["This scares the hell out of me... What if they could do that easily in humans in a couple of years?! OH MY GOD WE'RE DOOMED"]

XL: Indeed, if we take a look at the second one, I think we can all agree that it's, meh, probably not as positive. But it also reminds us that, although we are still working with mice, it's probably a good idea to start thinking and discussing the possible ethical ramifications of memory control.

SR: Now, in the spirit of the third quote, we want to tell you about a recent project that we've been working on in the lab, which we've called Project Inception.
0:10:55.000,0:10:58.000 ["They should make a movie about this. Where they plant ideas into peoples minds, so they can control them for their own personal gain. We'll call it: Inception."] 0:10:58.000,0:11:02.000 So we reasoned that now that we can reactivate a memory, 0:11:02.000,0:11:05.000 what if we do so but then begin to tinker with that memory? 0:11:05.000,0:11:08.000 Could we possibly even turn it into a false memory? 0:11:08.000,0:11:12.000 XL: So all memory is sophisticated and dynamic, 0:11:12.000,0:11:15.000 but if just for simplicity, let's imagine memory 0:11:15.000,0:11:16.000 as a movie clip. 0:11:16.000,0:11:19.000 So far what we've told you is basically we can control 0:11:19.000,0:11:21.000 this "play" button of the clip 0:11:21.000,0:11:25.000 so that we can play this video clip any time, anywhere. 0:11:25.000,0:11:28.000 But is there a possibility that we can actually get 0:11:28.000,0:11:31.000 inside the brain and edit this movie clip 0:11:31.000,0:11:34.000 so that we can make it different from the original? 0:11:34.000,0:11:36.000 Yes we can. 0:11:36.000,0:11:38.000 Turned out that all we need to do is basically 0:11:38.000,0:11:42.000 reactivate a memory using lasers just like we did before, 0:11:42.000,0:11:46.000 but at the same time, if we present new information 0:11:46.000,0:11:50.000 and allow this new information to incorporate into this old memory, 0:11:50.000,0:11:52.000 this will change the memory. 0:11:52.000,0:11:56.000 It's sort of like making a remix tape. 0:11:56.000,0:11:59.000 SR: So how do we do this? 0:11:59.000,0:12:01.000 Rather than finding a fear memory in the brain, 0:12:01.000,0:12:02.000 we can start by taking our animals, 0:12:02.000,0:12:05.000 and let's say we put them in a blue box like this blue box here 0:12:05.000,0:12:08.000 and we find the brain cells that represent that blue box 0:12:08.000,0:12:10.000 and we trick them to respond to pulses of light 0:12:10.000,0:12:12.000 exactly like we had said before. 0:12:12.000,0:12:14.000 Now the next day, we can take our animals and place them 0:12:14.000,0:12:17.000 in a red box that they've never experienced before. 0:12:17.000,0:12:19.000 We can shoot light into the brain to reactivate 0:12:19.000,0:12:21.000 the memory of the blue box. 0:12:21.000,0:12:23.000 So what would happen here if, while the animal 0:12:23.000,0:12:24.000 is recalling the memory of the blue box, 0:12:24.000,0:12:27.000 we gave it a couple of mild foot shocks? 0:12:27.000,0:12:30.000 So here we're trying to artificially make an association 0:12:30.000,0:12:32.000 between the memory of the blue box 0:12:32.000,0:12:33.000 and the foot shocks themselves. 0:12:33.000,0:12:35.000 We're just trying to connect the two. 0:12:35.000,0:12:37.000 So to test if we had done so, 0:12:37.000,0:12:38.000 we can take our animals once again 0:12:38.000,0:12:40.000 and place them back in the blue box. 0:12:40.000,0:12:43.000 Again, we had just reactivated the memory of the blue box 0:12:43.000,0:12:45.000 while the animal got a couple of mild foot shocks, 0:12:45.000,0:12:47.000 and now the animal suddenly freezes. 0:12:47.000,0:12:50.000 It's as though it's recalling being mildly shocked in this environment 0:12:50.000,0:12:53.000 even though that never actually happened. 0:12:53.000,0:12:55.000 So it formed a false memory, 0:12:55.000,0:12:57.000 because it's falsely fearing an environment 0:12:57.000,0:12:58.000 where, technically speaking, 0:12:58.000,0:13:01.000 nothing bad actually happened to it. 
XL: So far, we have only been talking about this light-controlled "on" switch. In fact, we also have a light-controlled "off" switch, and it's very easy to imagine that by installing this light-controlled "off" switch, we could also turn off a memory, any time, anywhere. So everything we've been talking about today is based on this philosophically charged principle of neuroscience: that the mind, with its seemingly mysterious properties, is actually made of physical stuff that we can tinker with.

SR: And for me personally, I see a world where we can reactivate any kind of memory that we'd like. I also see a world where we can erase unwanted memories. Now, I even see a world where editing memories is something of a reality, because we're living in a time when it's possible to pluck questions from the tree of science fiction and ground them in experimental reality.

XL: Nowadays, people in the lab and people in other groups all over the world are using similar methods to activate or edit memories, whether old or new, positive or negative, all sorts of memories, so that we can understand how memory works.

SR: For example, one group in our lab was able to find the brain cells that make up a fear memory and convert them into a pleasurable memory, just like that. That's exactly what I mean about editing these kinds of processes. Now, one dude in the lab was even able to reactivate memories of female mice in male mice, which rumor has it is a pleasurable experience.

XL: Indeed, we are living in a very exciting moment, where science doesn't have any arbitrary speed limits but is bound only by our own imagination.

SR: And finally, what do we make of all this? How do we push this technology forward? These are questions that should not remain just inside the lab, and so one goal of today's talk was to bring everybody up to speed on the kind of stuff that's possible in modern neuroscience, but now, just as importantly, to actively engage everybody in this conversation. So let's think together as a team about what this all means and where we can and should go from here, because Xu and I think we all have some really big decisions ahead of us. Thank you.

XL: Thank you.

(Applause)