People have been using media to talk about sex for a long time. Love letters, phone sex, racy Polaroids. There's even a story of a girl who eloped with a man she met over the telegraph in 1886.

Today we have sexting, and I am a sexting expert. Not an expert sexter. Though I do know what this means -- I think you do too. [it's a penis]

(Laughter)

I have been studying sexting since the media attention to it began in 2008. I wrote a book on the moral panic about sexting, and here's what I found: most people are worrying about the wrong thing. They're trying to prevent sexting from happening entirely. But let me ask you this: as long as it's completely consensual, what's the problem with sexting? People are into all sorts of things that you may not be into, like blue cheese or cilantro.

(Laughter)

Sexting is certainly risky, like anything that's fun, but as long as you're not sending an image to someone who doesn't want to receive it, there's no harm. What I do think is a serious problem is when people share private images of others without their permission. Instead of worrying about sexting, what I think we need to do is think a lot more about digital privacy. The key is consent.

Right now, most people are thinking about sexting without really thinking about consent at all. Did you know that we currently criminalize teen sexting? It can be a crime because it counts as child pornography if there's an image of someone under 18, and it doesn't even matter if they took that image of themselves and shared it willingly. So we end up with a bizarre legal situation where two 17-year-olds can legally have sex in most US states, but they can't photograph it.

Some states have also tried passing sexting misdemeanor laws, but these laws repeat the same problem, because they still make consensual sexting illegal. It doesn't make sense to try to ban all sexting in order to address privacy violations. That's kind of like saying, let's solve the problem of date rape by making dating completely illegal.
Most teens don't get arrested for sexting, but can you guess who does? It's often teens who are disliked by their partner's parents, and this can be because of class bias, racism or homophobia. Most prosecutors are, of course, smart enough not to bring child pornography charges against teenagers, but some do. According to researchers at the University of New Hampshire, seven percent of all child pornography possession arrests are teens sexting consensually with other teens. Child pornography is a serious crime, but it's just not the same thing as teen sexting.

Parents and educators are also responding to sexting without really thinking too much about consent. Their message to teens is often: just don't do it. And I totally get it -- there are serious legal risks, and of course, that potential for privacy violations. And when you were a teen, I'm sure you did exactly as you were told, right?

You're probably thinking, my kid would never sext. And that's true -- your little angel may not be sexting, because only 33 percent of 16- and 17-year-olds are sexting. But, sorry, by the time they're older, odds are they will be. Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds. And most of the time, nothing goes wrong.

People ask me all the time, isn't sexting just so dangerous, though? It's like, you wouldn't leave your wallet on a park bench, and you'd expect it to get stolen if you did, right? Here's how I think about it: sexting is like leaving your wallet at your boyfriend's house. If you come back the next day and all the money is just gone, you really need to dump that guy.

(Laughter)

So instead of criminalizing sexting to try to prevent these privacy violations, we need to make consent central to how we think about the circulation of our private information. Every new media technology raises privacy concerns. In fact, in the US, the very first major debates about privacy were in response to technologies that were relatively new at the time.
In the late 1800s, people were worried about cameras, which were suddenly more portable than ever before, and about newspaper gossip columns. They worried that the camera would capture information about them, take it out of context and widely disseminate it. Does this sound familiar? It's exactly what we're worrying about now with social media and drone cameras, and, of course, sexting. And these fears about technology make sense, because technologies can amplify and bring out our worst qualities and behaviors.

But there are solutions, and we've been here before with a dangerous new technology. In 1908, Ford introduced the Model T car. Traffic fatality rates were rising. It was a serious problem -- it looks so safe, right?

Our first response was to try to change drivers' behavior, so we developed speed limits and enforced them through fines. But over the following decades, we started to realize that the technology of the car itself is not neutral. We could design the car to make it safer. So in the 1920s, we got shatter-resistant windshields. In the 1950s, seat belts. And in the 1990s, airbags. All three of these areas -- laws, individuals and industry -- came together over time to help solve the problem that a new technology causes. And we can do the same thing with digital privacy. Of course, it comes back to consent.

Here's the idea: before anyone can distribute your private information, they should have to get your permission. This idea of affirmative consent comes from anti-rape activists, who tell us that we need consent for every sexual act. And we have really high standards for consent in a lot of other areas. Think about having surgery. Your doctor has to make sure that you are meaningfully and knowingly consenting to that medical procedure. This is not the kind of consent you give to an iTunes Terms of Service, where you just scroll to the bottom and you're like, agree, agree, whatever.

(Laughter)

If we think more about consent, we can have better privacy laws. Right now, we just don't have that many protections.
If your ex-husband or your ex-wife is a terrible person, they can take your nude photos and upload them to a porn site. It can be really hard to get those images taken down. And in a lot of states, you're actually better off if you took the images of yourself, because then you can file a copyright claim.

(Laughter)

Right now, if someone violates your privacy, whether that's an individual or a company or the NSA, you can try filing a lawsuit, though you may not be successful, because many courts assume that digital privacy is just impossible. So they're not willing to punish anyone for violating it.

I still hear people asking me all the time, isn't a digital image somehow blurring the line between public and private, because it's digital, right? No! No! Not everything digital is automatically public. That doesn't make any sense. As NYU legal scholar Helen Nissenbaum tells us, we have laws and policies and norms that protect all kinds of information that's private, and it doesn't make a difference if it's digital or not. All of your health records are digitized, but your doctor can't just share them with anyone. All of your financial information is held in digital databases, but your credit card company can't just post your purchase history online.

Better laws could help address privacy violations after they happen, but one of the easiest things we can all do is make personal changes to help protect each other's privacy. We're always told that privacy is our own, sole, individual responsibility. We're told: constantly monitor and update your privacy settings. We're told: never share anything you wouldn't want the entire world to see. This doesn't make sense. Digital media are social environments, and we share things with people we trust all day, every day. As Princeton researcher Janet Vertesi argues, our data and our privacy are not just personal, they're actually interpersonal. And so one thing you can do that's really easy is just start asking for permission before you share anyone else's information.
If you want to post a photo of someone online, ask for permission. If you want to forward an email thread, ask for permission. And if you want to share someone's nude selfie, obviously, ask for permission.

These individual changes can really help us protect each other's privacy, but we need technology companies on board as well. These companies have very little incentive to help protect our privacy, because their business models depend on us sharing everything with as many people as possible. Right now, if I send you an image, you can forward it to anyone you want. But what if I got to decide whether that image was forwardable or not? This would tell you: you don't have my permission to send this image out. We do this kind of thing all the time to protect copyright. If you buy an e-book, you can't just send it out to as many people as you want. So why not try this with mobile phones?

What we can do is demand that tech companies add these protections to our devices and our platforms as the default. After all, you can choose the color of your car, but the airbags are always standard.

If we don't think more about digital privacy and consent, there can be serious consequences. There was a teenager from Ohio -- let's call her Jennifer, for the sake of her privacy. She shared nude photos of herself with her high school boyfriend, thinking she could trust him. Unfortunately, he betrayed her and sent her photos around the entire school. Jennifer was embarrassed and humiliated, but instead of being compassionate, her classmates harassed her. They called her a slut and a whore, and they made her life miserable. Jennifer started missing school and her grades dropped. Ultimately, Jennifer decided to end her own life.

Jennifer did nothing wrong. All she did was share a nude photo with someone she thought she could trust. And yet our laws tell her that she committed a horrible crime equivalent to child pornography. Our gender norms tell her that by producing this nude image of herself, she somehow did the most horrible, shameful thing.
And when we assume that privacy is impossible in digital media, we completely write off and excuse her boyfriend's bad, bad behavior. People are still saying to victims of privacy violations all the time, "What were you thinking? You should have never sent that image."

If you're trying to figure out what to say instead, try this. Imagine you've run into your friend who broke their leg skiing. They took a risk to do something fun, and it didn't end well. But you're probably not going to be the jerk who says, "Well, I guess you shouldn't have gone skiing then."

If we think more about consent, we can see that victims of privacy violations deserve our compassion, not criminalization, shaming, harassment or punishment. We can support victims, and we can prevent some privacy violations by making these legal, individual and technological changes. Because the problem is not sexting; the issue is digital privacy. And one solution is consent. So the next time a victim of a privacy violation comes up to you, instead of blaming them, let's do this: let's shift our ideas about digital privacy, and let's respond with compassion.

Thank you.

(Applause)