People have been using media to talk about sex for a long time. Love letters, phone sex, racy Polaroids. There's even a story of a girl who eloped with a man she met over the telegraph in 1886.

Today we have sexting, and I am a sexting expert. Not an expert sexter. Though I do know what this means, and I think you do too.

(Laughter)

I have been studying sexting since the media attention to it began in 2008. I wrote a book on the moral panic about sexting. And here's what I found: most people are worrying about the wrong thing. They're trying to just prevent sexting from happening entirely.

But let me ask you this: as long as it's completely consensual, what's the problem with sexting? People are into all sorts of things that you may not be into, like blue cheese or cilantro.

(Laughter)

Sexting is certainly risky, like anything that's fun, but as long as you're not sending an image to someone who doesn't want to receive it, there's no harm. What I do think is a serious problem is when people share private images of others without their permission. And instead of worrying about sexting, what I think we need to do is think a lot more about digital privacy.
The key is consent. Right now, most people are thinking about sexting without really thinking about consent at all. Did you know that we currently criminalize teen sexting?

It can be a crime because it counts as child pornography if there's an image of someone under 18, and it doesn't even matter if they took that image of themselves and shared it willingly. So we end up with this bizarre legal situation where two 17-year-olds can legally have sex in most U.S. states, but they can't photograph it.

Some states have also tried passing sexting misdemeanor laws, but these laws repeat the same problem, because they still make consensual sexting illegal. It doesn't make sense to try to ban all sexting to try to address privacy violations. This is kind of like saying, let's solve the problem of date rape by just making dating completely illegal.

Most teens don't get arrested for sexting, but can you guess who does? It's often teens who are disliked by their partner's parents. And this can be because of class bias, racism or homophobia. Most prosecutors are, of course, smart enough not to use child pornography charges against teenagers, but some do.
According to researchers at the University of New Hampshire, seven percent of all child pornography possession arrests are teens sexting consensually with other teens. Child pornography is a serious crime, but it's just not the same thing as teen sexting.

Parents and educators are also responding to sexting without really thinking too much about consent. Their message to teens is often: just don't do it. And I totally get it: there are serious legal risks and, of course, that potential for privacy violations. And when you were a teen, I'm sure you did exactly as you were told, right?

You're probably thinking, my kid would never sext. And that's true: your little angel may not be sexting, because only 33 percent of 16- and 17-year-olds are sexting. But, sorry, by the time they're older, odds are they will be sexting. Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds. And most of the time, nothing goes wrong.

People ask me all the time things like, isn't sexting just so dangerous, though? Like, you wouldn't leave your wallet on a park bench, and you expect it's going to get stolen if you do that, right? Here's how I think about it: sexting is like leaving your wallet at your boyfriend's house.
If you come back the next day and all the money is just gone, you really need to dump that guy.

So instead of criminalizing sexting to try to prevent these privacy violations, we need to make consent central to how we think about the circulation of our private information.

Every new media technology raises privacy concerns. In fact, in the U.S., the very first major debates about privacy were in response to technologies that were relatively new at the time. In the late 1800s, people were worried about cameras, which were suddenly more portable than ever before, and newspaper gossip columns. They were worried that the camera would capture information about them, take it out of context and widely disseminate it. Does this sound familiar? It's exactly what we're worrying about now with social media and drone cameras, and, of course, sexting. And these fears about technology make sense, because technologies can amplify and bring out our worst qualities and behaviors.

But there are solutions. And we've been here before with a dangerous new technology. In 1908, Ford introduced the Model T car. Traffic fatality rates were rising. It was a serious problem. It looks so safe, right?
Our first response was to try to change drivers' behavior, so we developed speed limits and enforced them through fines. But over the following decades, we started to realize the technology of the car itself is not just neutral. We could design the car to make it safer. So in the 1920s, we got shatter-resistant windshields. In the 1950s, seat belts. And in the 1990s, airbags.

All three of these areas, laws, individuals and industry, came together over time to help solve the problem that a new technology causes. And we can do the same thing with digital privacy. Of course, it comes back to consent.

Here's the idea. Before anyone can distribute your private information, they should have to get your permission. This idea of affirmative consent comes from anti-rape activists, who tell us that we need consent for every sexual act. And we have really high standards for consent in a lot of other areas. Think about having surgery. Your doctor has to make sure that you are meaningfully and knowingly consenting to that medical procedure. This is not the kind of consent you give to an iTunes Terms of Service, where you scroll to the bottom and you're like, agree, agree, whatever.
If we think more about consent, we can have better privacy laws. Right now, we just don't have that many protections. If your ex-husband or your ex-wife is a terrible person, they can take your nude photos and upload them to a porn site. It can be really hard to get those images taken down. And in a lot of states, you're actually better off if you took the images of yourself, because then you can file a copyright claim.

Right now, if someone violates your privacy, whether that's an individual or a company or the NSA, you can try filing a lawsuit, though you may not be successful, because many courts assume that digital privacy is just impossible. So they're not willing to punish anyone for violating it.

I still hear people asking me all the time, isn't a digital image somehow blurring the line between public and private, because it's digital, right? No! No! Everything digital is not just automatically public. That doesn't make any sense. As NYU legal scholar Helen Nissenbaum tells us, we have laws and policies and norms that protect all kinds of information that's private, and it doesn't make a difference if it's digital or not.
All of your health records are digitized, but your doctor can't just share them with anyone. All of your financial information is held in digital databases, but your credit card company can't just post your purchase history online.

Better laws could help address privacy violations after they happen, but one of the easiest things we can all do is make personal changes to help protect each other's privacy. We're always told that privacy is our own, sole, individual responsibility. We're told: constantly monitor and update your privacy settings. We're told: never share anything you wouldn't want the entire world to see.

This doesn't make sense. Digital media are social environments, and we share things with people we trust all day, every day. As Princeton researcher Janet Vertesi argues, our data and our privacy, they're not just personal, they're actually interpersonal.

And so one thing you can do that's really easy is just start asking for permission before you share anyone else's information. If you want to post a photo of someone online, ask for permission. If you want to forward an email thread, ask for permission. And if you want to share someone's nude selfie, obviously, ask for permission.
These individual changes can really help us protect each other's privacy, but we need technology companies on board as well. These companies have very little incentive to help protect our privacy, because their business models depend on us sharing everything with as many people as possible.

Right now, if I send you an image, you can forward that to anyone that you want. But what if I got to decide if that image was forwardable or not? This would tell you that you don't have my permission to send this image out. We do this kind of thing all the time to protect copyright. If you buy an e-book, you can't just send it out to as many people as you want. So why not try this with mobile phones?

We can demand that tech companies add these protections to our devices and our platforms as the default. After all, you can choose the color of your car, but the airbags are always standard.

If we don't think more about digital privacy and consent, there can be serious consequences. There was a teenager from Ohio; let's call her Jennifer, for the sake of her privacy. She shared nude photos of herself with her high school boyfriend, thinking she could trust him.
Unfortunately, he betrayed her and sent her photos around the entire school. Jennifer was embarrassed and humiliated, but instead of being compassionate, her classmates harassed her. They called her a slut and a whore. They made her life miserable. Jennifer started missing school and her grades dropped. Ultimately, Jennifer decided to end her own life.

Jennifer did nothing wrong. All she did was share a nude photo with someone she thought she could trust. And yet our laws tell her that she committed a horrible crime equivalent to child pornography. Our gender norms tell her that by producing this nude image of herself, she somehow did the most horrible, shameful thing. And when we assume that privacy is impossible in digital media, we completely write off and excuse her boyfriend's bad, bad behavior.

People are still saying all the time to victims of privacy violations, "What were you thinking? You should have never sent that image."

If you're trying to figure out what to say instead, try this. Imagine you've run into your friend who broke their leg skiing. They took a risk to do something fun, and it didn't end well.
But you're probably not going to be the jerk who says, "Well, I guess you shouldn't have gone skiing, then."

If we think more about consent, we can see that victims of privacy violations deserve our compassion, not criminalization, shaming, harassment or punishment. We can support victims, and we can prevent some privacy violations by making these legal, individual and technological changes. Because the problem is not sexting, the issue is digital privacy. And one solution is consent.

So the next time a victim of a privacy violation comes up to you, instead of blaming them, let's do this: let's shift our ideas about digital privacy, and let's respond with compassion.

Thank you.