People have been using media to talk about sex for a long time. Love letters, phone sex, racy Polaroids. There's even a story of a girl who eloped with a man she met over the telegraph in 1886.

Today we have sexting, and I am a sexting expert. Not an expert sexter. Though, I do know what this means -- I think you do too.

[it's a penis]

(Laughter)

I have been studying sexting since the media attention to it began in 2008. I wrote a book on the moral panic about sexting. And here's what I found: most people are worrying about the wrong thing. They're trying to just prevent sexting from happening entirely. But let me ask you this: as long as it's completely consensual, what's the problem with sexting? People are into all sorts of things that you may not be into, like blue cheese or cilantro.

(Laughter)

Sexting is certainly risky, like anything that's fun, but as long as you're not sending an image to someone who doesn't want to receive it, there's no harm. What I do think is a serious problem is when people share private images of others without their permission. And instead of worrying about sexting, what I think we need to do is think a lot more about digital privacy. The key is consent.

Right now, most people are thinking about sexting without really thinking about consent at all. Did you know that we currently criminalize teen sexting? It can be a crime because it counts as child pornography if there's an image of someone under 18, and it doesn't even matter if they took that image of themselves and shared it willingly. So we end up with this bizarre legal situation where two 17-year-olds can legally have sex in most US states, but they can't photograph it.

Some states have also tried passing sexting misdemeanor laws, but these laws repeat the same problem, because they still make consensual sexting illegal. It doesn't make sense to try to ban all sexting as a way to address privacy violations. This is kind of like saying, let's solve the problem of date rape by just making dating completely illegal.

Most teens don't get arrested for sexting, but can you guess who does? It's often teens who are disliked by their partner's parents. And this can be because of class bias, racism or homophobia.
Most prosecutors are, of course, smart enough not to use child pornography charges against teenagers, but some do. According to researchers at the University of New Hampshire, seven percent of all child pornography possession arrests are teens sexting consensually with other teens.

Child pornography is a serious crime, but it's just not the same thing as teen sexting.

Parents and educators are also responding to sexting without really thinking too much about consent. Their message to teens is often: just don't do it. And I totally get it -- there are serious legal risks and, of course, the potential for privacy violations. And when you were a teen, I'm sure you did exactly as you were told, right?

You're probably thinking, my kid would never sext. And that's true, your little angel may not be sexting, because only 33 percent of 16- and 17-year-olds are sexting. But, sorry, by the time they're older, odds are they will be sexting. Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds. And most of the time, nothing goes wrong.

People ask me all the time things like, isn't sexting just so dangerous, though? It's like, you wouldn't leave your wallet on a park bench, and you expect it's going to get stolen if you do that, right? Here's how I think about it: sexting is like leaving your wallet at your boyfriend's house. If you come back the next day and all the money is just gone, you really need to dump that guy.

(Laughter)

So instead of criminalizing sexting to try to prevent these privacy violations, we need to make consent central to how we think about the circulation of our private information.

Every new media technology raises privacy concerns. In fact, in the US, the very first major debates about privacy were in response to technologies that were relatively new at the time. In the late 1800s, people were worried about cameras, which were suddenly more portable than ever before, and newspaper gossip columns. They were worried that the camera would capture information about them, take it out of context and widely disseminate it. Does this sound familiar? It's exactly what we're worrying about now with social media and drone cameras, and, of course, sexting.
And these fears about technology make sense, because technologies can amplify and bring out our worst qualities and behaviors. But there are solutions. And we've been here before with a dangerous new technology.

In 1908, Ford introduced the Model T car. Traffic fatality rates were rising. It was a serious problem -- it looks so safe, right?

Our first response was to try to change drivers' behavior, so we developed speed limits and enforced them through fines. But over the following decades, we started to realize that the technology of the car itself is not just neutral. We could design the car to make it safer. So in the 1920s, we got shatter-resistant windshields. In the 1950s, seat belts. And in the 1990s, airbags. All three of these areas -- laws, individuals and industry -- came together over time to help solve the problems that a new technology causes. And we can do the same thing with digital privacy. Of course, it comes back to consent.

Here's the idea: before anyone can distribute your private information, they should have to get your permission. This idea of affirmative consent comes from anti-rape activists who tell us that we need consent for every sexual act. And we have really high standards for consent in a lot of other areas. Think about having surgery. Your doctor has to make sure that you are meaningfully and knowingly consenting to that medical procedure. This is not the type of consent you give to an iTunes Terms of Service, where you just scroll to the bottom and you're like, agree, agree, whatever.

(Laughter)

If we think more about consent, we can have better privacy laws. Right now, we just don't have that many protections. If your ex-husband or your ex-wife is a terrible person, they can take your nude photos and upload them to a porn site. It can be really hard to get those images taken down. And in a lot of states, you're actually better off if you took the images of yourself, because then you can file a copyright claim.

(Laughter)

Right now, if someone violates your privacy, whether that's an individual or a company or the NSA, you can try filing a lawsuit, though you may not be successful, because many courts assume that digital privacy is just impossible. So they're not willing to punish anyone for violating it.
I still hear people asking me all the time, isn't a digital image somehow blurring the line between public and private, because it's digital, right? No! No! Everything digital is not just automatically public. That doesn't make any sense. As NYU legal scholar Helen Nissenbaum tells us, we have laws and policies and norms that protect all kinds of information that's private, and it doesn't make a difference if it's digital or not. All of your health records are digitized, but your doctor can't just share them with anyone. All of your financial information is held in digital databases, but your credit card company can't just post your purchase history online.

Better laws could help address privacy violations after they happen, but one of the easiest things we can all do is make personal changes to help protect each other's privacy. We're always told that privacy is our own, sole, individual responsibility. We're told: constantly monitor and update your privacy settings. We're told: never share anything you wouldn't want the entire world to see. This doesn't make sense. Digital media are social environments, and we share things with people we trust all day, every day. As Princeton researcher Janet Vertesi argues, our data and our privacy are not just personal, they're actually interpersonal.

And so one thing you can do that's really easy is just start asking for permission before you share anyone else's information. If you want to post a photo of someone online, ask for permission. If you want to forward an email thread, ask for permission. And if you want to share someone's nude selfie, obviously, ask for permission.

These individual changes can really help us protect each other's privacy, but we need technology companies on board as well. These companies have very little incentive to help protect our privacy, because their business models depend on us sharing everything with as many people as possible. Right now, if I send you an image, you can forward it to anyone you want. But what if I got to decide if that image was forwardable or not? This would tell you: you don't have my permission to send this image out. We do this kind of thing all the time to protect copyright. If you buy an e-book, you can't just send it out to as many people as you want. So why not try this with mobile phones?
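To make that mechanism concrete, here is a minimal sketch of how a messaging client could enforce a sender-set forwarding flag, in the spirit of the e-book copyright controls just mentioned. Everything in it (the SharedImage class, the forward and deliver functions) is a hypothetical illustration, not any real platform's API:

```python
from dataclasses import dataclass

@dataclass
class SharedImage:
    sender: str
    data: bytes
    forwardable: bool = False  # the sender's choice; off by default

def deliver(image: SharedImage, recipient: str) -> None:
    # Stand-in for the platform's actual delivery mechanism.
    print(f"Delivered image from {image.sender} to {recipient}")

def forward(image: SharedImage, recipient: str) -> None:
    """Re-share an image only if the original sender permitted it."""
    if not image.forwardable:
        raise PermissionError(
            f"{image.sender} has not given permission to forward this image"
        )
    deliver(image, recipient)

# The sender decides at send time whether the image may travel further.
photo = SharedImage(sender="Alice", data=b"...", forwardable=False)
forward(photo, "Carol")  # raises PermissionError: consent was never given
```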
What we can do is demand that tech companies add these protections to our devices and our platforms as the default. After all, you can choose the color of your car, but the airbags are always standard.

If we don't think more about digital privacy and consent, there can be serious consequences.

There was a teenager from Ohio -- let's call her Jennifer, for the sake of her privacy. She shared nude photos of herself with her high school boyfriend, thinking she could trust him. Unfortunately, he betrayed her and sent her photos around the entire school. Jennifer was embarrassed and humiliated, but instead of being compassionate, her classmates harassed her. They called her a slut and a whore and they made her life miserable. Jennifer started missing school and her grades dropped. Ultimately, Jennifer decided to end her own life.

Jennifer did nothing wrong. All she did was share a nude photo with someone she thought she could trust. And yet our laws tell her that she committed a horrible crime equivalent to child pornography. Our gender norms tell her that by producing this nude image of herself, she somehow did the most horrible, shameful thing. And when we assume that privacy is impossible in digital media, we completely write off and excuse her boyfriend's bad, bad behavior.

People are still saying all the time to victims of privacy violations, "What were you thinking? You should have never sent that image." If you're trying to figure out what to say instead, try this. Imagine you've run into your friend who broke their leg skiing. They took a risk to do something fun, and it didn't end well. But you're probably not going to be the jerk who says, "Well, I guess you shouldn't have gone skiing then."

If we think more about consent, we can see that victims of privacy violations deserve our compassion, not criminalization, shaming, harassment or punishment. We can support victims, and we can prevent some privacy violations by making these legal, individual and technological changes. Because the problem is not sexting, the issue is digital privacy. And one solution is consent.
So the next time a victim of a privacy violation comes up to you, instead of blaming them, let's do this: let's shift our ideas about digital privacy, and let's respond with compassion.

Thank you.

(Applause)