1 00:00:00,000 --> 00:00:01,000 Steve Ramirez: My first year of grad school, 2 00:00:01,000 --> 00:00:03,000 I found myself in my bedroom 3 00:00:03,000 --> 00:00:06,000 eating lots of Ben & Jerry's 4 00:00:06,000 --> 00:00:07,000 watching some trashy TV 5 00:00:07,000 --> 00:00:11,000 and maybe, maybe listening to Taylor Swift. 6 00:00:11,000 --> 00:00:12,000 I had just gone through a breakup. 7 00:00:12,000 --> 00:00:14,000 (Laughter) 8 00:00:14,000 --> 00:00:16,000 So for the longest time, all I would do 9 00:00:16,000 --> 00:00:20,000 is recall the memory of this person over and over again, 10 00:00:20,000 --> 00:00:22,000 wishing that I could get rid of that gut-wrenching, 11 00:00:22,000 --> 00:00:25,000 visceral "blah" feeling. 12 00:00:25,000 --> 00:00:27,000 Now, as it turns out, I'm a neuroscientist, 13 00:00:27,000 --> 00:00:30,000 so I knew that the memory of that person 14 00:00:30,000 --> 00:00:33,000 and the awful, emotional undertones that color in that memory 15 00:00:33,000 --> 00:00:35,000 are largely mediated by separate brain systems. 16 00:00:35,000 --> 00:00:38,000 And so I thought, what if we could go into the brain 17 00:00:38,000 --> 00:00:40,000 and edit out that nauseating feeling 18 00:00:40,000 --> 00:00:43,000 while keeping the memory of that person intact? 19 00:00:43,000 --> 00:00:45,000 Then I realized, maybe that's a little bit lofty for now. 20 00:00:45,000 --> 00:00:48,000 So what if we could start off by going into the brain 21 00:00:48,000 --> 00:00:51,000 and just finding a single memory to begin with? 22 00:00:51,000 --> 00:00:53,000 Could we jump-start that memory back to life, 23 00:00:53,000 --> 00:00:57,000 maybe even play with the contents of that memory? 24 00:00:57,000 --> 00:00:59,000 All that said, there is one person in the entire world right now 25 00:00:59,000 --> 00:01:01,000 that I really hope is not watching this talk. 
26 00:01:01,000 --> 00:01:05,000 (Laughter) 27 00:01:05,000 --> 00:01:08,000 So there is a catch. There is a catch. 28 00:01:08,000 --> 00:01:11,000 These ideas probably remind you of "Total Recall," 29 00:01:11,000 --> 00:01:13,000 "Eternal Sunshine of the Spotless Mind," 30 00:01:13,000 --> 00:01:15,000 or of "Inception." 31 00:01:15,000 --> 00:01:16,000 But the movie stars that we work with 32 00:01:16,000 --> 00:01:18,000 are the celebrities of the lab. 33 00:01:18,000 --> 00:01:20,000 Xu Liu: Test mice. 34 00:01:20,000 --> 00:01:21,000 (Laughter) 35 00:01:21,000 --> 00:01:24,000 As neuroscientists, we work in the lab with mice 36 00:01:24,000 --> 00:01:28,000 trying to understand how memory works. 37 00:01:28,000 --> 00:01:30,000 And today, we hope to convince you that now 38 00:01:30,000 --> 00:01:33,000 we are actually able to activate a memory in the brain 39 00:01:33,000 --> 00:01:36,000 at the speed of light. 40 00:01:36,000 --> 00:01:39,000 To do this, there are only two simple steps to follow. 41 00:01:39,000 --> 00:01:42,000 First, you find and label a memory in the brain, 42 00:01:42,000 --> 00:01:46,000 and then you activate it with a switch. 43 00:01:46,000 --> 00:01:47,000 As simple as that. 44 00:01:47,000 --> 00:01:49,000 (Laughter) 45 00:01:49,000 --> 00:01:51,000 SR: Are you convinced? 46 00:01:51,000 --> 00:01:55,000 So, it turns out finding a memory in the brain isn't all that easy. 47 00:01:55,000 --> 00:01:57,000 XL: Indeed. This is way more difficult than, let's say, 48 00:01:57,000 --> 00:02:00,000 finding a needle in a haystack, 49 00:02:00,000 --> 00:02:02,000 because at least, you know, the needle is still something 50 00:02:02,000 --> 00:02:05,000 you can physically put your fingers on. 51 00:02:05,000 --> 00:02:07,000 But memory is not. 52 00:02:07,000 --> 00:02:10,000 And also, there are way more cells in your brain 53 00:02:10,000 --> 00:02:15,000 than the number of straws in a typical haystack. 
54 00:02:15,000 --> 00:02:18,000 So yeah, this task does seem to be daunting. 55 00:02:18,000 --> 00:02:22,000 But luckily, we got help from the brain itself. 56 00:02:22,000 --> 00:02:24,000 It turned out that all we need to do is basically 57 00:02:24,000 --> 00:02:26,000 to let the brain form a memory, 58 00:02:26,000 --> 00:02:30,000 and then the brain will tell us which cells are involved 59 00:02:30,000 --> 00:02:32,000 in that particular memory. 60 00:02:32,000 --> 00:02:34,000 SR: So what was going on in my brain 61 00:02:34,000 --> 00:02:36,000 while I was recalling the memory of an ex? 62 00:02:36,000 --> 00:02:39,000 If you were to just completely ignore human ethics for a second 63 00:02:39,000 --> 00:02:40,000 and slice up my brain right now, 64 00:02:40,000 --> 00:02:42,000 you would see that there was an amazing number 65 00:02:42,000 --> 00:02:45,000 of brain regions that were active while recalling that memory. 66 00:02:45,000 --> 00:02:48,000 Now one brain region that would be robustly active 67 00:02:48,000 --> 00:02:50,000 in particular is called the hippocampus, 68 00:02:50,000 --> 00:02:53,000 which for decades has been implicated in processing 69 00:02:53,000 --> 00:02:55,000 the kinds of memories that we hold near and dear, 70 00:02:55,000 --> 00:02:58,000 which also makes it an ideal target to go into 71 00:02:58,000 --> 00:03:01,000 and to try and find and maybe reactivate a memory. 
72 00:03:01,000 --> 00:03:03,000 XL: When you zoom in on the hippocampus, 73 00:03:03,000 --> 00:03:05,000 of course you will see lots of cells, 74 00:03:05,000 --> 00:03:08,000 but we are able to find which cells are involved 75 00:03:08,000 --> 00:03:10,000 in a particular memory, 76 00:03:10,000 --> 00:03:12,000 because whenever a cell is active, 77 00:03:12,000 --> 00:03:14,000 like when it's forming a memory, 78 00:03:14,000 --> 00:03:18,000 it will also leave a footprint that will later allow us to know 79 00:03:18,000 --> 00:03:20,000 these cells were recently active. 80 00:03:20,000 --> 00:03:22,000 SR: So the same way that building lights at night 81 00:03:22,000 --> 00:03:25,000 let you know that somebody's probably working there at any given moment, 82 00:03:25,000 --> 00:03:29,000 in a very real sense, there are biological sensors 83 00:03:29,000 --> 00:03:31,000 within a cell that are turned on 84 00:03:31,000 --> 00:03:33,000 only when that cell was just working. 85 00:03:33,000 --> 00:03:35,000 They're sort of biological windows that light up 86 00:03:35,000 --> 00:03:37,000 to let us know that that cell was just active. 87 00:03:37,000 --> 00:03:39,000 XL: So we clipped part of this sensor 88 00:03:39,000 --> 00:03:43,000 and attached it to a switch to control the cells, 89 00:03:43,000 --> 00:03:47,000 and we packed this switch into an engineered virus 90 00:03:47,000 --> 00:03:49,000 and injected that into the brain of the mice. 91 00:03:49,000 --> 00:03:52,000 So whenever a memory is being formed, 92 00:03:52,000 --> 00:03:54,000 any active cells for that memory 93 00:03:54,000 --> 00:03:57,000 will also have this switch installed. 94 00:03:57,000 --> 00:03:58,000 SR: So here is what the hippocampus looks like 95 00:03:58,000 --> 00:04:01,000 after forming a fear memory, for example. 
96 00:04:01,000 --> 00:04:03,000 The sea of blue that you see here 97 00:04:03,000 --> 00:04:05,000 are densely packed brain cells, 98 00:04:05,000 --> 00:04:07,000 but the green brain cells, 99 00:04:07,000 --> 00:04:09,000 the green brain cells are the ones that are holding on 100 00:04:09,000 --> 00:04:11,000 to a specific fear memory. 101 00:04:11,000 --> 00:04:13,000 So you are looking at the crystallization 102 00:04:13,000 --> 00:04:15,000 of the fleeting formation of fear. 103 00:04:15,000 --> 00:04:19,000 You're actually looking at the cross-section of a memory right now. 104 00:04:19,000 --> 00:04:21,000 XL: Now, for the switch we have been talking about, 105 00:04:21,000 --> 00:04:24,000 ideally, the switch has to act really fast. 106 00:04:24,000 --> 00:04:27,000 It shouldn't take minutes or hours to work. 107 00:04:27,000 --> 00:04:31,000 It should act at the speed of the brain, in milliseconds. 108 00:04:31,000 --> 00:04:32,000 SR: So what do you think, Xu? 109 00:04:32,000 --> 00:04:35,000 Could we use, let's say, pharmacological drugs 110 00:04:35,000 --> 00:04:37,000 to activate or inactivate brain cells? 111 00:04:37,000 --> 00:04:41,000 XL: Nah. Drugs are pretty messy. They spread everywhere. 112 00:04:41,000 --> 00:04:44,000 And also it takes them forever to act on cells. 113 00:04:44,000 --> 00:04:48,000 So it will not allow us to control a memory in real time. 114 00:04:48,000 --> 00:04:52,000 So Steve, how about let's zap the brain with electricity? 115 00:04:52,000 --> 00:04:54,000 SR: So electricity is pretty fast, 116 00:04:54,000 --> 00:04:56,000 but we probably wouldn't be able to target it 117 00:04:56,000 --> 00:04:58,000 to just the specific cells that hold onto a memory, 118 00:04:58,000 --> 00:05:00,000 and we'd probably fry the brain. 119 00:05:00,000 --> 00:05:03,000 XL: Oh. That's true. 
So it looks like, hmm, 120 00:05:03,000 --> 00:05:06,000 indeed we need to find a better way 121 00:05:06,000 --> 00:05:09,000 to impact the brain at the speed of light. 122 00:05:09,000 --> 00:05:14,000 SR: So it just so happens that light travels at the speed of light. 123 00:05:14,000 --> 00:05:18,000 So maybe we could activate or inactivate memories 124 00:05:18,000 --> 00:05:19,000 by just using light -- 125 00:05:19,000 --> 00:05:21,000 XL: That's pretty fast. 126 00:05:21,000 --> 00:05:23,000 SR: -- and because normally brain cells 127 00:05:23,000 --> 00:05:24,000 don't respond to pulses of light, 128 00:05:24,000 --> 00:05:26,000 those that would respond to pulses of light 129 00:05:26,000 --> 00:05:29,000 are those that contain a light-sensitive switch. 130 00:05:29,000 --> 00:05:31,000 Now to do that, first we need to trick brain cells 131 00:05:31,000 --> 00:05:32,000 into responding to laser beams. 132 00:05:32,000 --> 00:05:33,000 XL: Yep. You heard it right. 133 00:05:33,000 --> 00:05:35,000 We are trying to shoot lasers into the brain. 134 00:05:35,000 --> 00:05:37,000 (Laughter) 135 00:05:37,000 --> 00:05:40,000 SR: And the technique that lets us do that is optogenetics. 136 00:05:40,000 --> 00:05:44,000 Optogenetics gave us this light switch that we can use 137 00:05:44,000 --> 00:05:45,000 to turn brain cells on or off, 138 00:05:45,000 --> 00:05:48,000 and the name of that switch is channelrhodopsin, 139 00:05:48,000 --> 00:05:50,000 seen here as these green dots attached to this brain cell. 140 00:05:50,000 --> 00:05:53,000 You can think of channelrhodopsin as a sort of light-sensitive switch 141 00:05:53,000 --> 00:05:56,000 that can be artificially installed in brain cells 142 00:05:56,000 --> 00:05:58,000 so that now we can use that switch 143 00:05:58,000 --> 00:06:01,000 to activate or inactivate the brain cell simply by clicking it, 144 00:06:01,000 --> 00:06:03,000 and in this case we click it on with pulses of light. 
145 00:06:03,000 --> 00:06:07,000 XL: So we attach this light-sensitive switch of channelrhodopsin 146 00:06:07,000 --> 00:06:09,000 to the sensor we've been talking about 147 00:06:09,000 --> 00:06:12,000 and inject this into the brain. 148 00:06:12,000 --> 00:06:15,000 So whenever a memory is being formed, 149 00:06:15,000 --> 00:06:17,000 any active cell for that particular memory 150 00:06:17,000 --> 00:06:21,000 will also have this light-sensitive switch installed in it 151 00:06:21,000 --> 00:06:23,000 so that we can control these cells 152 00:06:23,000 --> 00:06:27,000 by flipping on a laser just like this one you see. 153 00:06:27,000 --> 00:06:30,000 SR: So let's put all of this to the test now. 154 00:06:30,000 --> 00:06:32,000 What we can do is we can take our mice 155 00:06:32,000 --> 00:06:35,000 and then we can put them in a box that looks exactly like this box here, 156 00:06:35,000 --> 00:06:38,000 and then we can give them a very mild foot shock 157 00:06:38,000 --> 00:06:40,000 so that they form a fear memory of this box. 158 00:06:40,000 --> 00:06:42,000 They learn that something bad happened here. 159 00:06:42,000 --> 00:06:44,000 Now with our system, the cells that are active 160 00:06:44,000 --> 00:06:47,000 in the hippocampus in the making of this memory, 161 00:06:47,000 --> 00:06:50,000 only those cells will now contain channelrhodopsin. 162 00:06:50,000 --> 00:06:53,000 XL: When you are as small as a mouse, 163 00:06:53,000 --> 00:06:56,000 it feels as if the whole world is trying to get you. 164 00:06:56,000 --> 00:06:58,000 So your best defense 165 00:06:58,000 --> 00:07:01,000 is trying to stay undetected. 
166 00:07:01,000 --> 00:07:03,000 Whenever a mouse is afraid, 167 00:07:03,000 --> 00:07:04,000 it will show this very typical behavior 168 00:07:04,000 --> 00:07:06,000 by staying at one corner of the box, 169 00:07:06,000 --> 00:07:09,000 trying not to move any part of its body, 170 00:07:09,000 --> 00:07:12,000 and this posture is called freezing. 171 00:07:12,000 --> 00:07:17,000 So if a mouse remembers that something bad happened in this box, 172 00:07:17,000 --> 00:07:19,000 then when we put it back into the same box, 173 00:07:19,000 --> 00:07:21,000 it will basically show freezing 174 00:07:21,000 --> 00:07:23,000 because it doesn't want to be detected 175 00:07:23,000 --> 00:07:26,000 by any potential threats in this box. 176 00:07:26,000 --> 00:07:28,000 SR: So you can think of freezing as, 177 00:07:28,000 --> 00:07:30,000 you're walking down the street minding your own business, 178 00:07:30,000 --> 00:07:31,000 and then out of nowhere you almost run into 179 00:07:31,000 --> 00:07:34,000 an ex-girlfriend or ex-boyfriend, 180 00:07:34,000 --> 00:07:36,000 and now those terrifying two seconds 181 00:07:36,000 --> 00:07:38,000 where you start thinking, "What do I do? Do I say hi? 182 00:07:38,000 --> 00:07:39,000 Do I shake their hand? Do I turn around and run away? 183 00:07:39,000 --> 00:07:41,000 Do I sit here and pretend like I don't exist?" 184 00:07:41,000 --> 00:07:44,000 Those kinds of fleeting thoughts that physically incapacitate you, 185 00:07:44,000 --> 00:07:47,000 that temporarily give you that deer-in-headlights look. 186 00:07:47,000 --> 00:07:50,000 XL: However, if you put the mouse in a completely different 187 00:07:50,000 --> 00:07:53,000 new box, like the next one, 188 00:07:53,000 --> 00:07:56,000 it will not be afraid of this box 189 00:07:56,000 --> 00:08:00,000 because there's no reason for it to be afraid of this new environment. 
190 00:08:00,000 --> 00:08:03,000 But what if we put the mouse in this new box 191 00:08:03,000 --> 00:08:07,000 but at the same time, we activate the fear memory 192 00:08:07,000 --> 00:08:10,000 using lasers just like we did before? 193 00:08:10,000 --> 00:08:13,000 Are we going to bring back the fear memory 194 00:08:13,000 --> 00:08:17,000 for the first box into this completely new environment? 195 00:08:17,000 --> 00:08:19,000 SR: All right, and here's the million-dollar experiment. 196 00:08:19,000 --> 00:08:22,000 Now to bring back to life the memory of that day, 197 00:08:22,000 --> 00:08:24,000 I remember that the Red Sox had just won, 198 00:08:24,000 --> 00:08:26,000 it was a green spring day, 199 00:08:26,000 --> 00:08:28,000 perfect for going up and down the river 200 00:08:28,000 --> 00:08:31,000 and then maybe going to the North End 201 00:08:31,000 --> 00:08:33,000 to get some cannolis, #justsaying. 202 00:08:33,000 --> 00:08:36,000 Now Xu and I, on the other hand, 203 00:08:36,000 --> 00:08:39,000 were in a completely windowless black room 204 00:08:39,000 --> 00:08:42,000 not making any ocular movement that even remotely resembles an eye blink 205 00:08:42,000 --> 00:08:45,000 because our eyes were fixed onto a computer screen. 206 00:08:45,000 --> 00:08:47,000 We were looking at this mouse here trying to activate a memory 207 00:08:47,000 --> 00:08:49,000 for the first time using our technique. 208 00:08:49,000 --> 00:08:52,000 XL: And this is what we saw. 209 00:08:52,000 --> 00:08:54,000 When we first put the mouse into this box, 210 00:08:54,000 --> 00:08:57,000 it's exploring, sniffing around, walking around, 211 00:08:57,000 --> 00:08:59,000 minding its own business, 212 00:08:59,000 --> 00:09:01,000 because actually by nature, 213 00:09:01,000 --> 00:09:03,000 mice are pretty curious animals. 214 00:09:03,000 --> 00:09:05,000 They want to know, what's going on in this new box? 215 00:09:05,000 --> 00:09:07,000 It's interesting. 
216 00:09:07,000 --> 00:09:10,000 But the moment we turned on the laser, like you see now, 217 00:09:10,000 --> 00:09:13,000 all of a sudden the mouse entered this freezing mode. 218 00:09:13,000 --> 00:09:18,000 It stayed here and tried not to move any part of its body. 219 00:09:18,000 --> 00:09:19,000 Clearly it's freezing. 220 00:09:19,000 --> 00:09:22,000 So indeed, it looks like we are able to bring back 221 00:09:22,000 --> 00:09:24,000 the fear memory for the first box 222 00:09:24,000 --> 00:09:27,000 in this completely new environment. 223 00:09:27,000 --> 00:09:29,000 While watching this, Steve and I 224 00:09:29,000 --> 00:09:32,000 are as shocked as the mouse itself. 225 00:09:32,000 --> 00:09:33,000 (Laughter) 226 00:09:33,000 --> 00:09:36,000 So after the experiment, the two of us just left the room 227 00:09:36,000 --> 00:09:38,000 without saying anything. 228 00:09:38,000 --> 00:09:41,000 After a kind of long, awkward period of time, 229 00:09:41,000 --> 00:09:43,000 Steve broke the silence. 230 00:09:43,000 --> 00:09:46,000 SR: "Did that just work?" 231 00:09:46,000 --> 00:09:49,000 XL: "Yes," I said. "Indeed it worked!" 232 00:09:49,000 --> 00:09:51,000 We're really excited about this. 233 00:09:51,000 --> 00:09:53,000 And then we published our findings 234 00:09:53,000 --> 00:09:55,000 in the journal Nature. 235 00:09:55,000 --> 00:09:58,000 Ever since the publication of our work, 236 00:09:58,000 --> 00:10:00,000 we've been receiving numerous comments 237 00:10:00,000 --> 00:10:02,000 from all over the Internet. 238 00:10:02,000 --> 00:10:06,000 Maybe we can take a look at some of those. 239 00:10:06,000 --> 00:10:08,000 ["OMGGGGG FINALLY... so much more to come, virtual reality, neural manipulation, visual dream emulation... neural coding, 'writing and re-writing of memories', mental illnesses. 
Ahhh the future is awesome"] 240 00:10:08,000 --> 00:10:10,000 SR: So the first thing that you'll notice is that people 241 00:10:10,000 --> 00:10:13,000 have really strong opinions about this kind of work. 242 00:10:13,000 --> 00:10:16,000 Now I happen to completely agree with the optimism 243 00:10:16,000 --> 00:10:17,000 of this first quote, 244 00:10:17,000 --> 00:10:19,000 because on a scale of zero to Morgan Freeman's voice, 245 00:10:19,000 --> 00:10:22,000 it happens to be one of the most evocative accolades 246 00:10:22,000 --> 00:10:23,000 that I've heard come our way. 247 00:10:23,000 --> 00:10:25,000 (Laughter) 248 00:10:25,000 --> 00:10:27,000 But as you'll see, it's not the only opinion that's out there. 249 00:10:27,000 --> 00:10:29,000 ["This scares the hell out of me... What if they could do that easily in humans in a couple of years?! OH MY GOD WE'RE DOOMED"] 250 00:10:29,000 --> 00:10:31,000 XL: Indeed, if we take a look at the second one, 251 00:10:31,000 --> 00:10:33,000 I think we can all agree that it's, meh, 252 00:10:33,000 --> 00:10:35,000 probably not as positive. 253 00:10:35,000 --> 00:10:37,000 But this also reminds us that, 254 00:10:37,000 --> 00:10:40,000 although we are still working with mice, 255 00:10:40,000 --> 00:10:43,000 it's probably a good idea to start thinking and discussing 256 00:10:43,000 --> 00:10:46,000 about the possible ethical ramifications 257 00:10:46,000 --> 00:10:48,000 of memory control. 258 00:10:48,000 --> 00:10:50,000 SR: Now, in the spirit of the third quote, 259 00:10:50,000 --> 00:10:52,000 we want to tell you about a recent project that we've been 260 00:10:52,000 --> 00:10:55,000 working on in lab that we've called Project Inception. 261 00:10:55,000 --> 00:10:58,000 ["They should make a movie about this. Where they plant ideas into peoples minds, so they can control them for their own personal gain. 
We'll call it: Inception."] 262 00:10:58,000 --> 00:11:02,000 So we reasoned that now that we can reactivate a memory, 263 00:11:02,000 --> 00:11:05,000 what if we do so but then begin to tinker with that memory? 264 00:11:05,000 --> 00:11:08,000 Could we possibly even turn it into a false memory? 265 00:11:08,000 --> 00:11:12,000 XL: So all memory is sophisticated and dynamic, 266 00:11:12,000 --> 00:11:15,000 but just for simplicity, let's imagine memory 267 00:11:15,000 --> 00:11:16,000 as a movie clip. 268 00:11:16,000 --> 00:11:19,000 So far what we've told you is basically we can control 269 00:11:19,000 --> 00:11:21,000 this "play" button of the clip 270 00:11:21,000 --> 00:11:25,000 so that we can play this video clip any time, anywhere. 271 00:11:25,000 --> 00:11:28,000 But is there a possibility that we can actually get 272 00:11:28,000 --> 00:11:31,000 inside the brain and edit this movie clip 273 00:11:31,000 --> 00:11:34,000 so that we can make it different from the original? 274 00:11:34,000 --> 00:11:36,000 Yes, we can. 275 00:11:36,000 --> 00:11:38,000 It turned out that all we need to do is basically 276 00:11:38,000 --> 00:11:42,000 reactivate a memory using lasers just like we did before, 277 00:11:42,000 --> 00:11:46,000 but at the same time, if we present new information 278 00:11:46,000 --> 00:11:50,000 and allow this new information to be incorporated into this old memory, 279 00:11:50,000 --> 00:11:52,000 this will change the memory. 280 00:11:52,000 --> 00:11:56,000 It's sort of like making a remix tape. 281 00:11:56,000 --> 00:11:59,000 SR: So how do we do this? 
282 00:11:59,000 --> 00:12:01,000 Rather than finding a fear memory in the brain, 283 00:12:01,000 --> 00:12:02,000 we can start by taking our animals, 284 00:12:02,000 --> 00:12:05,000 and let's say we put them in a blue box like this blue box here 285 00:12:05,000 --> 00:12:08,000 and we find the brain cells that represent that blue box 286 00:12:08,000 --> 00:12:10,000 and we trick them into responding to pulses of light 287 00:12:10,000 --> 00:12:12,000 exactly like we said before. 288 00:12:12,000 --> 00:12:14,000 Now the next day, we can take our animals and place them 289 00:12:14,000 --> 00:12:17,000 in a red box that they've never experienced before. 290 00:12:17,000 --> 00:12:19,000 We can shoot light into the brain to reactivate 291 00:12:19,000 --> 00:12:21,000 the memory of the blue box. 292 00:12:21,000 --> 00:12:23,000 So what would happen here if, while the animal 293 00:12:23,000 --> 00:12:24,000 is recalling the memory of the blue box, 294 00:12:24,000 --> 00:12:27,000 we gave it a couple of mild foot shocks? 295 00:12:27,000 --> 00:12:30,000 So here we're trying to artificially make an association 296 00:12:30,000 --> 00:12:32,000 between the memory of the blue box 297 00:12:32,000 --> 00:12:33,000 and the foot shocks themselves. 298 00:12:33,000 --> 00:12:35,000 We're just trying to connect the two. 299 00:12:35,000 --> 00:12:37,000 So to test if we had done so, 300 00:12:37,000 --> 00:12:38,000 we can take our animals once again 301 00:12:38,000 --> 00:12:40,000 and place them back in the blue box. 302 00:12:40,000 --> 00:12:43,000 Again, we had just reactivated the memory of the blue box 303 00:12:43,000 --> 00:12:45,000 while the animal got a couple of mild foot shocks, 304 00:12:45,000 --> 00:12:47,000 and now the animal suddenly freezes. 305 00:12:47,000 --> 00:12:50,000 It's as though it's recalling being mildly shocked in this environment 306 00:12:50,000 --> 00:12:53,000 even though that never actually happened. 
307 00:12:53,000 --> 00:12:55,000 So it formed a false memory, 308 00:12:55,000 --> 00:12:57,000 because it's falsely fearing an environment 309 00:12:57,000 --> 00:12:58,000 where, technically speaking, 310 00:12:58,000 --> 00:13:01,000 nothing bad actually happened to it. 311 00:13:01,000 --> 00:13:03,000 XL: So, so far we are only talking about 312 00:13:03,000 --> 00:13:06,000 this light-controlled "on" switch. 313 00:13:06,000 --> 00:13:09,000 In fact, we also have a light-controlled "off" switch, 314 00:13:09,000 --> 00:13:11,000 and it's very easy to imagine that 315 00:13:11,000 --> 00:13:13,000 by installing this light-controlled "off" switch, 316 00:13:13,000 --> 00:13:19,000 we can also turn off a memory, any time, anywhere. 317 00:13:19,000 --> 00:13:21,000 So everything we've been talking about today 318 00:13:21,000 --> 00:13:26,000 is based on this philosophically charged principle of neuroscience 319 00:13:26,000 --> 00:13:30,000 that the mind, with its seemingly mysterious properties, 320 00:13:30,000 --> 00:13:34,000 is actually made of physical stuff that we can tinker with. 321 00:13:34,000 --> 00:13:35,000 SR: And for me personally, 322 00:13:35,000 --> 00:13:37,000 I see a world where we can reactivate 323 00:13:37,000 --> 00:13:39,000 any kind of memory that we'd like. 324 00:13:39,000 --> 00:13:42,000 I also see a world where we can erase unwanted memories. 325 00:13:42,000 --> 00:13:44,000 Now, I even see a world where editing memories 326 00:13:44,000 --> 00:13:46,000 is something of a reality, 327 00:13:46,000 --> 00:13:47,000 because we're living in a time where it's possible 328 00:13:47,000 --> 00:13:50,000 to pluck questions from the tree of science fiction 329 00:13:50,000 --> 00:13:52,000 and to ground them in experimental reality. 
330 00:13:52,000 --> 00:13:54,000 XL: Nowadays, people in the lab 331 00:13:54,000 --> 00:13:56,000 and people in other groups all over the world 332 00:13:56,000 --> 00:14:00,000 are using similar methods to activate or edit memories, 333 00:14:00,000 --> 00:14:04,000 whether they're old or new, positive or negative, 334 00:14:04,000 --> 00:14:07,000 all sorts of memories so that we can understand 335 00:14:07,000 --> 00:14:08,000 how memory works. 336 00:14:08,000 --> 00:14:10,000 SR: For example, one group in our lab 337 00:14:10,000 --> 00:14:13,000 was able to find the brain cells that make up a fear memory 338 00:14:13,000 --> 00:14:16,000 and convert them into a pleasurable memory, just like that. 339 00:14:16,000 --> 00:14:19,000 That's exactly what I mean about editing these kinds of processes. 340 00:14:19,000 --> 00:14:21,000 Now one dude in the lab was even able to reactivate 341 00:14:21,000 --> 00:14:23,000 memories of female mice in male mice, 342 00:14:23,000 --> 00:14:26,000 which rumor has it is a pleasurable experience. 343 00:14:26,000 --> 00:14:30,000 XL: Indeed, we are living in a very exciting moment 344 00:14:30,000 --> 00:14:34,000 where science doesn't have any arbitrary speed limits 345 00:14:34,000 --> 00:14:37,000 but is only bound by our own imagination. 346 00:14:37,000 --> 00:14:39,000 SR: And finally, what do we make of all this? 347 00:14:39,000 --> 00:14:41,000 How do we push this technology forward? 348 00:14:41,000 --> 00:14:43,000 These are the questions that should not remain 349 00:14:43,000 --> 00:14:45,000 just inside the lab, 350 00:14:45,000 --> 00:14:47,000 and so one goal of today's talk was to bring everybody 351 00:14:47,000 --> 00:14:50,000 up to speed with the kind of stuff that's possible 352 00:14:50,000 --> 00:14:51,000 in modern neuroscience, 353 00:14:51,000 --> 00:14:53,000 but now, just as importantly, 354 00:14:53,000 --> 00:14:56,000 to actively engage everybody in this conversation. 
355 00:14:56,000 --> 00:14:58,000 So let's think together as a team about what this all means 356 00:14:58,000 --> 00:15:01,000 and where we can and should go from here, 357 00:15:01,000 --> 00:15:03,000 because Xu and I think we all have 358 00:15:03,000 --> 00:15:05,000 some really big decisions ahead of us. 359 00:15:05,000 --> 00:15:06,000 Thank you. XL: Thank you. 360 00:15:06,000 --> 00:15:08,000 (Applause)