1
00:00:18,524 --> 00:00:23,564
People have been using media to talk about sex for a long time.

2
00:00:23,564 --> 00:00:27,345
Love letters, phone sex, racy Polaroids.

3
00:00:27,345 --> 00:00:30,595
There's even a story of a girl who eloped

4
00:00:30,595 --> 00:0035,256
with a man that she met over the telegraph in 1886.

5
00:00:35,256 --> 00:00:36,556
(Laughter)

6
00:00:36,556 --> 00:00:39,196
Today we have sexting.

7
00:00:39,196 --> 00:00:41,617
And I am a sexting expert.

8
00:00:41,617 --> 00:00:43,837
Not an expert sexter -

9
00:00:43,837 --> 00:00:44,846
(Laughter)

10
00:00:44,846 --> 00:00:47,807
Though, I do know what this means

11
00:00:47,807 --> 00:00:48,917
and I think you do too!

12
00:00:48,917 --> 00:00:50,137
[it's a penis]

13
00:00:50,137 --> 00:00:52,306
(Laughter)

14
00:00:52,306 --> 00:00:54,377
I have been studying sexting

15
00:00:54,377 --> 00:00:58,577
since the media attention to it began in 2008.

16
00:00:58,577 --> 00:01:01,660
I wrote a book on the moral panic about sexting,

17
00:01:01,660 --> 00:01:03,359
and here's what I found:

18
00:01:03,359 --> 00:01:06,459
Most people are worrying about the wrong thing.

19
00:01:06,459 --> 00:01:10,779
They're trying to just prevent sexting from happening entirely,

20
00:01:10,779 --> 00:01:12,209
but let me ask you this:

21
00:01:12,209 --> 00:01:17,179
as long as it's completely consensual, what's the problem with sexting?

22
00:01:17,179 --> 00:01:21,220
People are into all sorts of things that you may not be into,

23
00:01:21,220 --> 00:01:23,756
like blue cheese or cilantro.

24
00:01:23,756 --> 00:01:26,561
(Laughter)

25
00:01:26,561 --> 00:01:30,762
Sexting is certainly risky, like anything that's fun,

26
00:01:30,762 --> 00:01:31,772
but...

27
00:01:31,772 --> 00:01:32,781
(Laughter)

28
00:01:32,781 --> 00:01:37,352
as long as you are not sending an image to someone who doesn't want to receive it,

29
00:01:37,352 --> 00:01:39,131
there's no harm.
30
00:01:39,131 --> 00:01:41,892
What I do think is a serious problem

31
00:01:41,892 --> 00:01:46,873
is when people share private images of others without their permission,

32
00:01:47,283 --> 00:01:49,702
and instead of worrying about sexting,

33
00:01:49,702 --> 00:01:54,803
what I think we need to do is think a lot more about digital privacy.

34
00:01:54,803 --> 00:01:57,544
The key is consent.

35
00:01:57,544 --> 00:02:00,874
Right now, most people are thinking about sexting

36
00:02:00,874 --> 00:02:04,325
without really thinking about consent at all.

37
00:02:04,325 --> 00:02:08,355
Did you know that we currently criminalize teen sexting?

38
00:02:09,295 --> 00:02:12,715
It can be a crime because it counts as child pornography

39
00:02:12,715 --> 00:02:15,716
if there's an image of someone under 18,

40
00:02:15,716 --> 00:02:19,385
and it doesn't even matter if they took that image of themselves

41
00:02:19,385 --> 00:02:21,716
and shared it willingly.

42
00:02:21,716 --> 00:02:24,716
So we end up with this bizarre legal situation

43
00:02:24,716 --> 00:02:29,226
where two 17-year-olds can legally have sex in most U.S. states,

44
00:02:29,226 --> 00:02:31,567
but they can't photograph it.

45
00:02:32,587 --> 00:02:36,678
Some states have also tried passing sexting misdemeanor laws,

46
00:02:36,678 --> 00:02:39,778
but these laws repeat the same problem

47
00:02:39,778 --> 00:02:43,858
because they still make consensual sexting illegal.

48
00:02:44,418 --> 00:02:47,718
It doesn't make sense to try to ban all sexting

49
00:02:47,718 --> 00:02:50,379
to try to address privacy violations.

50
00:02:50,379 --> 00:02:51,979
This is kind of like saying,

51
00:02:51,979 --> 00:02:57,820
"Let's solve the problem of date rape by just making dating completely illegal."

52
00:02:59,010 --> 00:03:04,450
Most teens don't get arrested for sexting, but can you guess who does?
53
00:03:04,450 --> 00:03:09,362
It's often teens who are disliked by their partner's parents,

54
00:03:09,362 --> 00:03:14,571
and this can be because of class bias, racism, or homophobia.

55
00:03:14,915 --> 00:03:17,686
Most prosecutors are, of course, smart enough

56
00:03:17,686 --> 00:03:21,865
not to use child pornography charges against teenagers,

57
00:03:21,865 --> 00:03:23,605
but some do.

58
00:03:23,605 --> 00:03:27,165
According to researchers at the University of New Hampshire,

59
00:03:27,165 --> 00:03:31,777
seven percent of all child pornography possession arrests

60
00:03:31,777 --> 00:03:36,597
are teens sexting consensually with other teens.

61
00:03:37,417 --> 00:03:39,958
Child pornography is a serious crime,

62
00:03:39,958 --> 00:03:44,068
but it's just not the same thing as teen sexting.

63
00:03:44,978 --> 00:03:48,308
Parents and educators are also responding to sexting

64
00:03:48,308 --> 00:03:51,419
without really thinking too much about consent.

65
00:03:51,419 --> 00:03:56,069
Their message to teens is often 'just don't do it,'

66
00:03:56,069 --> 00:03:57,639
and I totally get it.

67
00:03:57,639 --> 00:03:59,619
There are serious legal risks,

68
00:03:59,619 --> 00:04:03,300
and of course, that potential for privacy violations.

69
00:04:03,300 --> 00:04:08,516
And when you were a teen, I'm sure you did exactly as you were told, right?

70
00:04:09,366 --> 00:04:12,841
You're probably thinking, "My kid would never sext,"

71
00:04:12,841 --> 00:04:16,307
and that's true; your little angel may not be sexting

72
00:04:16,307 --> 00:04:22,428
because only 33 percent of 16- and 17-year-olds are sexting.

73
00:04:23,058 --> 00:04:25,438
But, sorry, by the time they're older,

74
00:04:25,438 --> 00:04:27,651
odds are, they will be sexting.

75
00:04:27,651 --> 00:04:34,103
Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds.

76
00:04:34,603 --> 00:04:37,803
And most of the time, nothing goes wrong.
77
00:04:38,273 --> 00:04:40,933
People ask me all the time things like,

78
00:04:40,933 --> 00:04:43,703
"Isn't sexting just so dangerous, though?

79
00:04:43,703 --> 00:04:47,323
You wouldn't leave your wallet on a park bench.

80
00:04:47,323 --> 00:04:51,394
You expect it's going to get stolen if you do that, right?"

81
00:04:51,394 --> 00:04:52,775
Here's how I think about it:

82
00:04:52,775 --> 00:04:56,675
Sexting is like leaving your wallet at your boyfriend's house.

83
00:04:56,675 --> 00:05:01,475
If you come back the next day and all the money is just gone,

84
00:05:01,475 --> 00:05:04,276
you really need to dump that guy.

85
00:05:04,276 --> 00:05:07,845
(Laughter)

86
00:05:07,845 --> 00:05:12,746
So instead of criminalizing sexting to try to prevent these privacy violations,

87
00:05:12,746 --> 00:05:16,057
we need to make consent central

88
00:05:16,057 --> 00:05:20,498
to how we think about the circulation of our private information.

89
00:05:20,868 --> 00:05:25,188
Every new media technology raises privacy concerns;

90
00:05:25,188 --> 00:05:29,758
in fact, in the U.S., the first major debates about privacy

91
00:05:29,758 --> 00:05:34,249
were in response to technologies that were relatively new at the time.

92
00:05:34,249 --> 00:05:38,229
In the late 1800s, people were worried about cameras,

93
00:05:38,229 --> 00:05:41,730
which were just suddenly more portable than ever before,

94
00:05:41,730 --> 00:05:44,130
and newspaper gossip columns.

95
00:05:44,130 --> 00:05:48,030
They were worried that the camera would capture information about them,

96
00:05:48,030 --> 00:05:51,639
take it out of context, and widely disseminate it.

97
00:05:51,639 --> 00:05:53,241
Does this sound familiar?

98
00:05:53,241 --> 00:05:58,200
It's exactly what we're worrying about now with social media, and drone cameras,

99
00:05:58,200 --> 00:06:00,321
and of course, sexting.
100
00:06:00,321 --> 00:06:03,652
And these fears about technology, they make sense,

101
00:06:03,652 --> 00:06:06,372
because technologies can amplify

102
00:06:06,372 --> 00:06:10,563
and bring out our worst qualities and behaviors.

103
00:06:10,563 --> 00:06:12,893
But there are solutions,

104
00:06:12,893 --> 00:06:17,193
and we've been here before with a dangerous new technology.

105
00:06:17,193 --> 00:06:21,023
In 1908, Ford introduced the Model T car.

106
00:06:21,023 --> 00:06:24,912
Traffic fatality rates were rising; it was a serious problem.

107
00:06:24,912 --> 00:06:27,113
It looks so safe, right?

108
00:06:27,113 --> 00:06:28,824
(Laughter)

109
00:06:29,234 --> 00:06:33,245
Our first response was to try to change drivers' behavior,

110
00:06:33,245 --> 00:06:37,285
so we developed speed limits and enforced them through fines.

111
00:06:37,285 --> 00:06:40,765
But over the following decades we started to realize

112
00:06:40,765 --> 00:06:44,716
the technology of the car itself is not just neutral.

113
00:06:44,716 --> 00:06:47,966
We could design the car to make it safer.

114
00:06:47,966 --> 00:06:51,466
So in the 1920s, we got shatter-resistant windshields;

115
00:06:51,466 --> 00:06:57,266
in the 1950s, seat belts; and in the 1990s, air bags.

116
00:06:58,256 --> 00:07:03,307
All three of these areas - laws, individuals, and industry -

117
00:07:03,307 --> 00:07:09,348
came together over time to help solve the problems that a new technology causes,

118
00:07:09,348 --> 00:07:12,918
and we can do the same thing with digital privacy.

119
00:07:12,918 --> 00:07:16,218
Of course, it comes back to consent.

120
00:07:16,218 --> 00:07:17,558
Here's the idea:

121
00:07:17,558 --> 00:07:21,239
before anyone can distribute your private information,

122
00:07:21,239 --> 00:07:24,069
they should have to get your permission.
123
00:07:24,069 --> 00:07:29,020
This idea of affirmative consent comes from anti-rape activists

124
00:07:29,020 --> 00:07:32,760
who tell us that we need consent for every sexual act.

125
00:07:32,760 --> 00:07:37,327
And we have really high standards for consent in a lot of other areas.

126
00:07:37,327 --> 00:07:39,127
Think about having surgery.

127
00:07:39,127 --> 00:07:42,189
Your doctor has to make sure that you are meaningfully

128
00:07:42,189 --> 00:07:45,359
and knowingly consenting to that medical procedure.

129
00:07:45,359 --> 00:07:49,098
This is not the type of consent you give with an iTunes Terms of Service

130
00:07:49,098 --> 00:07:53,009
where you just scroll to the bottom and you're like, "Agree, agree, whatever."

131
00:07:53,009 --> 00:07:54,460
(Laughter)

132
00:07:55,079 --> 00:08:00,260
If we think more about consent, we can have better privacy laws.

133
00:08:00,260 --> 00:08:03,681
Right now we just don't have that many protections.

134
00:08:03,681 --> 00:08:07,381
If your ex-husband or your ex-wife is a terrible person,

135
00:08:07,381 --> 00:08:11,781
they can take your nude photos and upload them to a porn site.

136
00:08:11,781 --> 00:08:14,750
It can be really hard to get those images taken down,

137
00:08:14,750 --> 00:08:16,202
and in a lot of states,

138
00:08:16,202 --> 00:08:20,022
you're actually better off if you took the images of yourself

139
00:08:20,022 --> 00:08:23,122
because then you can file a copyright claim.

140
00:08:23,122 --> 00:08:24,410
(Laughter)

141
00:08:25,613 --> 00:08:28,433
Right now, if someone violates your privacy,

142
00:08:28,433 --> 00:08:32,544
whether that's an individual or a company or the NSA,

143
00:08:32,544 --> 00:08:38,224
you can try filing a lawsuit, but you may not be successful

144
00:08:38,224 --> 00:08:42,994
because many courts assume that digital privacy is just impossible,

145
00:08:42,994 --> 00:08:46,954
so they're not willing to punish anyone for violating it.
146
00:08:46,954 --> 00:08:50,055
I still hear people asking me all the time,

147
00:08:50,055 --> 00:08:55,366
"Isn't a digital image somehow blurring the line between public and private

148
00:08:55,366 --> 00:08:57,626
because it's digital, right?"

149
00:08:57,626 --> 00:08:59,066
No, no!

150
00:08:59,066 --> 00:09:02,236
Everything digital is not just automatically public.

151
00:09:02,236 --> 00:09:04,256
That doesn't make any sense.

152
00:09:04,256 --> 00:09:07,907
As NYU legal scholar Helen Nissenbaum tells us,

153
00:09:07,907 --> 00:09:10,666
we have laws and policies and norms

154
00:09:10,666 --> 00:09:13,877
that protect all kinds of information that's private,

155
00:09:13,877 --> 00:09:16,996
and it doesn't make a difference if it's digital or not.

156
00:09:16,996 --> 00:09:19,748
All of your health records are digitized,

157
00:09:19,748 --> 00:09:22,778
but your doctor can't just share them with anyone.

158
00:09:22,778 --> 00:09:27,098
All of your financial information is held in digital databases,

159
00:09:27,098 --> 00:09:33,029
but your credit card company can't just post your purchase history online.

160
00:09:33,029 --> 00:09:38,060
Better laws could help address privacy violations after they happen,

161
00:09:38,060 --> 00:09:41,139
but one of the easiest things we can all do

162
00:09:41,139 --> 00:09:46,400
is make personal changes to help protect each other's privacy.

163
00:09:46,400 --> 00:09:51,441
We're always told that privacy is our sole individual responsibility.

164
00:09:51,441 --> 00:09:55,611
We're told, "Constantly monitor and update your privacy settings."

165
00:09:55,611 --> 00:10:01,072
We're told, "Never share anything you wouldn't want the entire world to see."

166
00:10:01,072 --> 00:10:02,792
This doesn't make sense.

167
00:10:02,792 --> 00:10:05,582
Digital media are social environments,

168
00:10:05,582 --> 00:10:10,833
and we share things with people we trust all day, every day.
169
00:10:10,833 --> 00:10:13,743
As Princeton researcher Janet Vertesi argues,

170
00:10:13,743 --> 00:10:15,923
our data and our privacy,

171
00:10:15,923 --> 00:10:20,494
they're not just personal, they're interpersonal.

172
00:10:20,494 --> 00:10:25,954
So one thing you can do that's really easy is just start asking for permission

173
00:10:25,954 --> 00:10:28,794
before you share anyone else's information.

174
00:10:28,794 --> 00:10:33,335
If you want to post a photo of someone online, ask for permission.

175
00:10:33,335 --> 00:10:37,076
If you want to forward an email thread, ask for permission.

176
00:10:37,076 --> 00:10:39,776
If you want to share someone's nude selfie,

177
00:10:39,776 --> 00:10:43,606
obviously, ask for permission!

178
00:10:43,606 --> 00:10:47,746
These individual changes can help us protect each other's privacy,

179
00:10:47,746 --> 00:10:52,386
but we need technology companies on board as well.

180
00:10:52,386 --> 00:10:56,867
These companies have very little incentive to help protect our privacy

181
00:10:56,867 --> 00:10:59,139
because their business models depend on us

182
00:10:59,139 --> 00:11:02,978
sharing everything with as many people as possible.

183
00:11:02,978 --> 00:11:05,108
Right now, if I send you an image,

184
00:11:05,108 --> 00:11:08,218
you can forward that to anyone that you want.

185
00:11:08,218 --> 00:11:12,559
But what if I got to decide if that image was forwardable or not?

186
00:11:12,559 --> 00:11:16,509
This would tell you, "You don't have my permission to send this image out."

187
00:11:16,509 --> 00:11:20,600
We do this kind of thing all the time to protect copyright.

188
00:11:20,600 --> 00:11:25,420
If you buy an ebook, you can't just send it out to as many people as you want,

189
00:11:25,420 --> 00:11:29,311
so why not try this with mobile phones?
190
00:11:29,311 --> 00:11:32,291
What we can do is demand that tech companies

191
00:11:32,291 --> 00:11:37,851
add these protections to our devices and our platforms as the default.

192
00:11:37,851 --> 00:11:41,061
After all, you can choose the color of your car,

193
00:11:41,061 --> 00:11:45,602
but the airbags are always standard.

194
00:11:45,602 --> 00:11:49,333
If we don't think more about digital privacy and consent,

195
00:11:49,333 --> 00:11:52,573
there can be serious consequences.

196
00:11:52,573 --> 00:11:55,003
There was a teenager from Ohio.

197
00:11:55,003 --> 00:11:58,509
Let's call her Jennifer for the sake of her privacy.

198
00:11:58,509 --> 00:12:01,823
She shared nude photos of herself with her high school boyfriend,

199
00:12:01,823 --> 00:12:04,740
thinking she could trust him.

200
00:12:04,740 --> 00:12:10,344
Unfortunately, he betrayed her and sent her photos around the entire school.

201
00:12:10,344 --> 00:12:14,294
Jennifer was embarrassed and humiliated,

202
00:12:14,294 --> 00:12:18,385
but instead of being compassionate, her classmates harassed her.

203
00:12:18,385 --> 00:12:22,806
They called her a slut and a whore and they made her life miserable.

204
00:12:22,806 --> 00:12:26,985
Jennifer started missing school, and her grades dropped.

205
00:12:26,985 --> 00:12:32,067
Ultimately, Jennifer decided to end her own life.

206
00:12:32,067 --> 00:12:34,697
Jennifer did nothing wrong.

207
00:12:34,697 --> 00:12:37,327
All she did was share a nude photo

208
00:12:37,327 --> 00:12:39,818
with someone that she thought she could trust.

209
00:12:39,818 --> 00:12:42,748
And yet, our laws tell her

210
00:12:42,748 --> 00:12:47,317
that she committed a horrible crime equivalent to child pornography.

211
00:12:47,317 --> 00:12:52,068
Our gender norms tell her that by producing this nude image of herself,

212
00:12:52,068 --> 00:12:55,779
she somehow did the most horrible, shameful thing.
213
00:12:55,779 --> 00:13:00,040
And when we assume that privacy is impossible in digital media,

214
00:13:00,040 --> 00:13:06,420
we completely write off and excuse her boyfriend's bad, bad behavior.

215
00:13:06,420 --> 00:13:12,251
People are still saying all the time to victims of privacy violations,

216
00:13:12,251 --> 00:13:13,740
"What were you thinking?

217
00:13:13,740 --> 00:13:17,050
You should've never sent that image."

218
00:13:17,050 --> 00:13:21,632
If you're trying to figure out what to say instead, try this:

219
00:13:21,632 --> 00:13:25,832
imagine you've run into your friend who broke their leg skiing.

220
00:13:25,832 --> 00:13:30,342
They took a risk to do something fun, and it didn't end well.

221
00:13:30,342 --> 00:13:32,953
But you're probably not going to be the jerk who says,

222
00:13:32,953 --> 00:13:37,493
"Well, I guess you shouldn't have gone skiing then!"

223
00:13:37,493 --> 00:13:39,833
If we think more about consent,

224
00:13:39,833 --> 00:13:44,414
we can see that victims of privacy violations deserve our compassion,

225
00:13:44,414 --> 00:13:49,784
not criminalization, shaming, harassment, or punishment.

226
00:13:49,784 --> 00:13:54,335
We can support victims, and we can prevent some privacy violations

227
00:13:54,335 --> 00:13:59,285
by making these legal, individual, and technological changes.

228
00:13:59,285 --> 00:14:04,865
Because the problem is not sexting, the issue is digital privacy,

229
00:14:04,865 --> 00:14:08,126
and one solution is consent.

230
00:14:08,126 --> 00:14:12,637
So the next time a victim of a privacy violation comes up to you,

231
00:14:12,637 --> 00:14:15,596
instead of blaming them, let's do this:

232
00:14:15,596 --> 00:14:18,807
let's shift our ideas about digital privacy,

233
00:14:18,807 --> 00:14:21,907
and let's respond with compassion.

234
00:14:21,907 --> 00:14:23,437
Thank you.

235
00:14:23,437 --> 00:14:25,227
(Applause)