1
00:00:18,524 --> 00:00:23,564
People have been using media to talk about sex for a long time:

2
00:00:23,564 --> 00:00:27,345
love letters, phone sex, racy Polaroids.

3
00:00:27,345 --> 00:00:30,595
There's even a story of a girl who eloped

4
00:00:30,595 --> 00:00:35,256
with a man that she met over the telegraph in 1886.

5
00:00:35,256 --> 00:00:36,556
(Laughter)

6
00:00:36,556 --> 00:00:39,196
Today we have sexting.

7
00:00:39,196 --> 00:00:41,617
And I am a sexting expert.

8
00:00:41,617 --> 00:00:43,837
Not an expert sexter,

9
00:00:43,837 --> 00:00:44,846
(Laughter)

10
00:00:44,846 --> 00:00:47,807
though I do know what this means,

11
00:00:47,807 --> 00:00:48,917
and I think you do too!

12
00:00:48,917 --> 00:00:50,137
[it's a penis]

13
00:00:50,137 --> 00:00:52,306
(Laughter)

14
00:00:52,306 --> 00:00:54,377
I have been studying sexting

15
00:00:54,377 --> 00:00:58,577
since the media attention to it began in 2008.

16
00:00:58,577 --> 00:01:01,660
I wrote a book on the moral panic about sexting,

17
00:01:01,660 --> 00:01:03,359
and here's what I found:

18
00:01:03,359 --> 00:01:06,459
most people are worrying about the wrong thing.

19
00:01:06,459 --> 00:01:10,779
They're trying to just prevent sexting from happening entirely,

20
00:01:10,779 --> 00:01:12,209
but let me ask you this:

21
00:01:12,209 --> 00:01:17,179
As long as it's completely consensual, what's the problem with sexting?

22
00:01:17,179 --> 00:01:21,220
People are into all sorts of things that you may not be into,

23
00:01:21,220 --> 00:01:23,756
like blue cheese or cilantro.

24
00:01:23,756 --> 00:01:26,561
(Laughter)

25
00:01:26,561 --> 00:01:30,762
Sexting is certainly risky, like anything that's fun,

26
00:01:30,762 --> 00:01:31,772
but ...

27
00:01:31,772 --> 00:01:32,781
(Laughter)

28
00:01:32,781 --> 00:01:37,352
as long as you are not sending an image to someone who doesn't want to receive it,

29
00:01:37,352 --> 00:01:39,131
there's no harm.
30
00:01:39,131 --> 00:01:41,892
What I do think is a serious problem

31
00:01:41,892 --> 00:01:46,873
is when people share private images of others without their permission,

32
00:01:47,283 --> 00:01:49,702
and instead of worrying about sexting,

33
00:01:49,702 --> 00:01:54,803
what I think we need to do is think a lot more about digital privacy.

34
00:01:54,803 --> 00:01:57,114
The key is consent.

35
00:01:57,544 --> 00:02:00,874
Right now, most people are thinking about sexting

36
00:02:00,874 --> 00:02:04,325
without really thinking about consent at all.

37
00:02:04,325 --> 00:02:08,355
Did you know that we currently criminalize teen sexting?

38
00:02:09,295 --> 00:02:12,715
It can be a crime because it counts as child pornography

39
00:02:12,715 --> 00:02:15,716
if there's an image of someone under 18,

40
00:02:15,716 --> 00:02:19,385
and it doesn't even matter if they took that image of themselves

41
00:02:19,385 --> 00:02:21,716
and shared it willingly.

42
00:02:21,716 --> 00:02:24,716
So we end up with this bizarre legal situation

43
00:02:24,716 --> 00:02:29,226
where two 17-year-olds can legally have sex in most U.S. states,

44
00:02:29,226 --> 00:02:31,427
but they can't photograph it.

45
00:02:32,587 --> 00:02:36,678
Some states have also tried passing sexting misdemeanor laws,

46
00:02:36,678 --> 00:02:39,778
but these laws repeat the same problem,

47
00:02:39,778 --> 00:02:43,858
because they still make consensual sexting illegal.

48
00:02:44,418 --> 00:02:47,718
It doesn't make sense to try to ban all sexting

49
00:02:47,718 --> 00:02:50,379
to try to address privacy violations.

50
00:02:50,379 --> 00:02:51,979
This is kind of like saying,

51
00:02:51,979 --> 00:02:57,820
"Let's solve the problem of date rape by just making dating completely illegal."

52
00:02:59,010 --> 00:03:04,450
Most teens don't get arrested for sexting, but can you guess who does?
53
00:03:04,450 --> 00:03:09,362
It's often teens who are disliked by their partner's parents,

54
00:03:09,362 --> 00:03:14,571
and this can be because of class bias, racism, or homophobia.

55
00:03:14,915 --> 00:03:17,686
Most prosecutors are, of course, smart enough

56
00:03:17,686 --> 00:03:21,865
not to use child pornography charges against teenagers,

57
00:03:21,865 --> 00:03:23,605
but some do.

58
00:03:23,605 --> 00:03:27,165
According to researchers at the University of New Hampshire,

59
00:03:27,165 --> 00:03:32,767
7% of all child pornography possession arrests are teens

60
00:03:32,767 --> 00:03:36,397
sexting consensually with other teens.

61
00:03:37,417 --> 00:03:39,958
Child pornography is a serious crime,

62
00:03:39,958 --> 00:03:44,068
but it's just not the same thing as teen sexting.

63
00:03:44,978 --> 00:03:48,308
Parents and educators are also responding to sexting

64
00:03:48,308 --> 00:03:51,419
without really thinking too much about consent.

65
00:03:51,419 --> 00:03:56,069
Their message to teens is often, "Just don't do it,"

66
00:03:56,069 --> 00:03:57,639
and I totally get it.

67
00:03:57,639 --> 00:03:59,619
There are serious legal risks,

68
00:03:59,619 --> 00:04:03,300
and of course, that potential for privacy violations.

69
00:04:03,300 --> 00:04:08,516
And when you were a teen, I'm sure you did exactly as you were told, right?

70
00:04:09,366 --> 00:04:12,841
You're probably thinking, "My kid would never sext,"

71
00:04:12,841 --> 00:04:16,307
and that's true; your little angel may not be sexting

72
00:04:16,307 --> 00:04:22,298
because only 33% of 16- and 17-year-olds are sexting.

73
00:04:23,058 --> 00:04:25,438
But, sorry, by the time they're older,

74
00:04:25,438 --> 00:04:27,651
odds are, they will be sexting.

75
00:04:27,651 --> 00:04:34,103
Every study I've seen puts the rate above 50% for 18- to 24-year-olds.

76
00:04:34,603 --> 00:04:37,803
And most of the time, nothing goes wrong.
77
00:04:38,273 --> 00:04:40,933
People ask me all the time things like,

78
00:04:40,933 --> 00:04:43,703
"Isn't sexting just so dangerous, though?

79
00:04:43,703 --> 00:04:47,323
You wouldn't leave your wallet on a park bench.

80
00:04:47,323 --> 00:04:51,394
You expect it's going to get stolen if you do that, right?"

81
00:04:51,394 --> 00:04:52,775
Here's how I think about it:

82
00:04:52,775 --> 00:04:56,675
sexting is like leaving your wallet at your boyfriend's house.

83
00:04:56,675 --> 00:05:01,475
If you come back the next day, and all the money is just gone,

84
00:05:01,475 --> 00:05:04,276
you really need to dump that guy.

85
00:05:04,276 --> 00:05:05,745
(Laughter)

86
00:05:07,845 --> 00:05:12,746
So instead of criminalizing sexting to try to prevent these privacy violations,

87
00:05:12,746 --> 00:05:16,057
we need to make consent central

88
00:05:16,057 --> 00:05:20,498
to how we think about the circulation of our private information.

89
00:05:20,868 --> 00:05:25,188
Every new media technology raises privacy concerns;

90
00:05:25,188 --> 00:05:29,758
in fact, in the U.S., the first major debates about privacy

91
00:05:29,758 --> 00:05:34,249
were in response to technologies that were relatively new at the time.

92
00:05:34,249 --> 00:05:38,229
In the late 1800s, people were worried about cameras,

93
00:05:38,229 --> 00:05:41,730
which were just suddenly more portable than ever before,

94
00:05:41,730 --> 00:05:44,130
and newspaper gossip columns.

95
00:05:44,130 --> 00:05:48,030
They were worried that the camera would capture information about them,

96
00:05:48,030 --> 00:05:51,639
take it out of context, and widely disseminate it.

97
00:05:51,639 --> 00:05:53,241
Does this sound familiar?

98
00:05:53,241 --> 00:05:58,200
It's exactly what we're worrying about now with social media, and drone cameras,

99
00:05:58,200 --> 00:06:00,321
and of course, sexting.
100
00:06:00,321 --> 00:06:03,652
And these fears about technology make sense,

101
00:06:03,652 --> 00:06:06,372
because technologies can amplify

102
00:06:06,372 --> 00:06:10,563
and bring out our worst qualities and behaviors.

103
00:06:10,563 --> 00:06:12,893
But there are solutions,

104
00:06:12,893 --> 00:06:17,193
and we've been here before with a dangerous new technology.

105
00:06:17,193 --> 00:06:21,023
In 1908, Ford introduced the Model T car.

106
00:06:21,023 --> 00:06:24,912
Traffic fatality rates were rising; it was a serious problem.

107
00:06:24,912 --> 00:06:27,113
It looks so safe, right?

108
00:06:27,113 --> 00:06:28,474
(Laughter)

109
00:06:29,234 --> 00:06:33,245
Our first response was to try to change drivers' behavior,

110
00:06:33,245 --> 00:06:37,285
so we developed speed limits and enforced them through fines.

111
00:06:37,285 --> 00:06:40,765
But over the following decades, we started to realize

112
00:06:40,765 --> 00:06:44,716
the technology of the car itself is not just neutral.

113
00:06:44,716 --> 00:06:47,966
We could design the car to make it safer.

114
00:06:47,966 --> 00:06:51,466
So in the 1920s, we got shatter-resistant windshields;

115
00:06:51,466 --> 00:06:57,076
in the 1950s, seat belts; and in the 1990s, airbags.

116
00:06:58,256 --> 00:07:03,307
All three of these areas - laws, individuals, and industry -

117
00:07:03,307 --> 00:07:09,348
came together over time to help solve the problems that a new technology causes,

118
00:07:09,348 --> 00:07:12,918
and we can do the same thing with digital privacy.

119
00:07:12,918 --> 00:07:16,218
Of course, it comes back to consent.

120
00:07:16,218 --> 00:07:17,558
Here's the idea:

121
00:07:17,558 --> 00:07:21,239
before anyone can distribute your private information,

122
00:07:21,239 --> 00:07:24,069
they should have to get your permission.
123
00:07:24,069 --> 00:07:29,020
This idea of affirmative consent comes from anti-rape activists

124
00:07:29,020 --> 00:07:32,760
who tell us that we need consent for every sexual act.

125
00:07:32,760 --> 00:07:37,327
And we have really high standards for consent in a lot of other areas.

126
00:07:37,327 --> 00:07:39,127
Think about having surgery.

127
00:07:39,127 --> 00:07:42,189
Your doctor has to make sure that you are meaningfully

128
00:07:42,189 --> 00:07:45,359
and knowingly consenting to that medical procedure.

129
00:07:45,359 --> 00:07:49,098
This is not the type of consent like with an iTunes Terms of Service

130
00:07:49,098 --> 00:07:53,009
where you just scroll to the bottom and you're like, "Agree, agree, whatever."

131
00:07:53,009 --> 00:07:54,460
(Laughter)

132
00:07:55,079 --> 00:08:00,260
If we think more about consent, we can have better privacy laws.

133
00:08:00,260 --> 00:08:03,681
Right now, we just don't have that many protections.

134
00:08:03,681 --> 00:08:07,381
If your ex-husband or your ex-wife is a terrible person,

135
00:08:07,381 --> 00:08:11,531
they can take your nude photos and upload them to a porn site.

136
00:08:11,531 --> 00:08:14,750
It can be really hard to get those images taken down,

137
00:08:14,750 --> 00:08:16,022
and in a lot of states,

138
00:08:16,022 --> 00:08:19,912
you're actually better off if you took the images of yourself

139
00:08:19,912 --> 00:08:23,122
because then you can file a copyright claim.

140
00:08:23,472 --> 00:08:24,760
(Laughter)

141
00:08:25,353 --> 00:08:28,283
Right now, if someone violates your privacy,

142
00:08:28,283 --> 00:08:32,644
whether that's an individual, or a company, or the NSA,

143
00:08:33,154 --> 00:08:38,044
you can try filing a lawsuit, but you may not be successful

144
00:08:38,044 --> 00:08:42,904
because many courts assume that digital privacy is just impossible,

145
00:08:42,904 --> 00:08:46,694
so they're not willing to punish anyone for violating it.
146
00:08:47,154 --> 00:08:50,055
I still hear people asking me all the time,

147
00:08:50,055 --> 00:08:55,366
"Isn't a digital image somehow blurring the line between public and private

148
00:08:55,366 --> 00:08:57,556
because it's digital, right?"

149
00:08:57,556 --> 00:08:58,936
No, no!

150
00:08:58,936 --> 00:09:02,236
Everything digital is not just automatically public.

151
00:09:02,236 --> 00:09:04,086
That doesn't make any sense.

152
00:09:04,086 --> 00:09:07,597
As NYU legal scholar Helen Nissenbaum tells us,

153
00:09:07,597 --> 00:09:10,326
we have laws, and policies, and norms

154
00:09:10,326 --> 00:09:13,467
that protect all kinds of information that's private,

155
00:09:13,467 --> 00:09:16,846
and it doesn't make a difference if it's digital or not.

156
00:09:16,846 --> 00:09:19,628
All of your health records are digitized,

157
00:09:19,628 --> 00:09:22,748
but your doctor can't just share them with anyone.

158
00:09:22,748 --> 00:09:27,098
All of your financial information is held in digital databases,

159
00:09:27,098 --> 00:09:32,039
but your credit card company can't just post your purchase history online.

160
00:09:32,879 --> 00:09:38,010
Better laws could help address privacy violations after they happen,

161
00:09:38,420 --> 00:09:40,999
but one of the easiest things we can all do

162
00:09:40,999 --> 00:09:45,930
is make personal changes to help protect each other's privacy.

163
00:09:46,250 --> 00:09:51,271
We're always told that privacy is our own sole individual responsibility.

164
00:09:51,271 --> 00:09:55,431
We're told, "Constantly monitor and update your privacy settings."

165
00:09:55,431 --> 00:10:00,902
We're told, "Never share anything you wouldn't want the entire world to see."

166
00:10:01,352 --> 00:10:02,522
This doesn't make sense.

167
00:10:02,522 --> 00:10:05,582
Digital media are social environments,

168
00:10:05,582 --> 00:10:10,413
and we share things with people we trust all day, every day.
169
00:10:10,723 --> 00:10:13,613
As Princeton researcher Janet Vertesi argues,

170
00:10:13,613 --> 00:10:17,523
our data and our privacy are not just personal,

171
00:10:17,523 --> 00:10:20,284
they're actually interpersonal.

172
00:10:20,284 --> 00:10:25,704
So one thing you can do that's really easy is just start asking for permission

173
00:10:25,704 --> 00:10:28,654
before you share anyone else's information.

174
00:10:28,654 --> 00:10:33,335
If you want to post a photo of someone online, ask for permission.

175
00:10:33,335 --> 00:10:37,076
If you want to forward an email thread, ask for permission.

176
00:10:37,076 --> 00:10:39,776
If you want to share someone's nude selfie,

177
00:10:39,776 --> 00:10:42,466
obviously, ask for permission.

178
00:10:43,386 --> 00:10:47,826
These individual changes can help us protect each other's privacy,

179
00:10:47,826 --> 00:10:52,246
but we need technology companies on board as well.

180
00:10:52,246 --> 00:10:56,697
These companies have very little incentive to help protect our privacy

181
00:10:56,697 --> 00:10:58,969
because their business models depend on us

182
00:10:58,969 --> 00:11:02,918
sharing everything with as many people as possible.

183
00:11:02,918 --> 00:11:04,988
Right now, if I send you an image,

184
00:11:04,988 --> 00:11:08,008
you can forward that to anyone that you want.

185
00:11:08,008 --> 00:11:12,319
But what if I got to decide if that image was forwardable or not?

186
00:11:12,319 --> 00:11:16,379
This would tell you, "You don't have my permission to send this image out."

187
00:11:16,379 --> 00:11:20,600
We do this kind of thing all the time to protect copyright.

188
00:11:20,600 --> 00:11:25,300
If you buy an e-book, you can't just send it out to as many people as you want,

189
00:11:25,300 --> 00:11:28,371
so why not try this with mobile phones?
190
00:11:29,151 --> 00:11:30,271
What we can do

191
00:11:30,271 --> 00:11:34,001
is demand that tech companies add these protections

192
00:11:34,001 --> 00:11:37,651
to our devices and our platforms as the default.

193
00:11:37,651 --> 00:11:41,061
After all, you can choose the color of your car,

194
00:11:41,061 --> 00:11:44,342
but the airbags are always standard.

195
00:11:45,432 --> 00:11:49,223
If we don't think more about digital privacy and consent,

196
00:11:49,223 --> 00:11:52,053
there can be serious consequences.

197
00:11:52,573 --> 00:11:55,003
There was a teenager from Ohio

198
00:11:55,003 --> 00:11:58,509
- let's call her Jennifer for the sake of her privacy -

199
00:11:58,509 --> 00:12:02,143
who shared nude photos of herself with her high school boyfriend

200
00:12:02,143 --> 00:12:04,040
thinking she could trust him.

201
00:12:05,080 --> 00:12:10,034
Unfortunately, he betrayed her and sent her photos around the entire school.

202
00:12:10,034 --> 00:12:14,064
Jennifer was embarrassed and humiliated,

203
00:12:14,064 --> 00:12:18,205
but instead of being compassionate, her classmates harassed her.

204
00:12:18,205 --> 00:12:22,626
They called her a slut and a whore, and they made her life miserable.

205
00:12:22,626 --> 00:12:26,795
Jennifer started missing school, and her grades dropped.

206
00:12:26,795 --> 00:12:31,137
Ultimately, Jennifer decided to end her own life.

207
00:12:32,067 --> 00:12:34,697
Jennifer did nothing wrong.

208
00:12:34,697 --> 00:12:37,127
All she did was share a nude photo

209
00:12:37,127 --> 00:12:39,818
with someone she thought she could trust.

210
00:12:39,818 --> 00:12:42,558
And yet, our laws tell her

211
00:12:42,558 --> 00:12:47,317
that she committed a horrible crime equivalent to child pornography.
212
00:12:47,317 --> 00:12:48,888
Our gender norms tell her

213
00:12:48,888 --> 00:12:52,068
that by producing this nude image of herself,

214
00:12:52,068 --> 00:12:55,779
she somehow did the most horrible, shameful thing.

215
00:12:55,779 --> 00:12:59,860
And when we assume that privacy is impossible in digital media,

216
00:12:59,860 --> 00:13:05,950
we completely write off and excuse her boyfriend's bad, bad behavior.

217
00:13:06,420 --> 00:13:12,251
People are still saying all the time to victims of privacy violations,

218
00:13:12,251 --> 00:13:13,610
"What were you thinking?

219
00:13:13,610 --> 00:13:16,390
You should've never sent that image."

220
00:13:16,950 --> 00:13:21,552
If you're trying to figure out what to say instead, try this:

221
00:13:21,552 --> 00:13:25,442
imagine you've run into your friend who broke their leg skiing.

222
00:13:25,442 --> 00:13:30,102
They took a risk to do something fun, and it didn't end well.

223
00:13:30,102 --> 00:13:32,693
But you're probably not going to be the jerk who says,

224
00:13:32,693 --> 00:13:35,823
"Well, I guess you shouldn't have gone skiing then!"

225
00:13:37,343 --> 00:13:39,563
If we think more about consent,

226
00:13:39,563 --> 00:13:44,594
we can see that victims of privacy violations deserve our compassion,

227
00:13:44,594 --> 00:13:49,784
not criminalization, shaming, harassment, or punishment.

228
00:13:49,784 --> 00:13:54,335
We can support victims, and we can prevent some privacy violations

229
00:13:54,335 --> 00:13:59,145
by making these legal, individual, and technological changes.

230
00:13:59,145 --> 00:14:04,935
Because the problem is not sexting, the issue is digital privacy,

231
00:14:04,935 --> 00:14:07,876
and one solution is consent.
232
00:14:07,876 --> 00:14:12,637
So the next time a victim of a privacy violation comes up to you,

233
00:14:12,637 --> 00:14:15,346
instead of blaming them, let's do this:

234
00:14:15,346 --> 00:14:18,807
Let's shift our ideas about digital privacy,

235
00:14:18,807 --> 00:14:22,027
and let's respond with compassion.

236
00:14:22,027 --> 00:14:23,247
Thank you.

237
00:14:23,247 --> 00:14:24,977
(Applause)