WEBVTT

00:00:00.720 --> 00:00:05.000
People have been using media to talk about sex for a long time.

00:00:05.600 --> 00:00:09.040
Love letters, phone sex, racy Polaroids.

00:00:09.480 --> 00:00:15.536
There's even a story of a girl who eloped with a man that she met over the telegraph

00:00:15.560 --> 00:00:17.200
in 1886.

00:00:18.560 --> 00:00:23.656
Today we have sexting, and I am a sexting expert.

00:00:23.680 --> 00:00:25.560
Not an expert sexter.

00:00:26.800 --> 00:00:30.976
Though, I do know what this means -- I think you do too.

NOTE Paragraph

00:00:31.000 --> 00:00:32.375
[it's a penis]

NOTE Paragraph

00:00:32.400 --> 00:00:34.760
(Laughter)

NOTE Paragraph

00:00:36.360 --> 00:00:42.696
I have been studying sexting since the media attention to it began in 2008.

00:00:42.720 --> 00:00:45.696
I wrote a book on the moral panic about sexting.

00:00:45.720 --> 00:00:47.336
And here's what I found:

00:00:47.360 --> 00:00:50.576
most people are worrying about the wrong thing.

00:00:50.600 --> 00:00:54.776
They're trying to just prevent sexting from happening entirely.

00:00:54.800 --> 00:00:56.336
But let me ask you this:

00:00:56.360 --> 00:01:01.216
As long as it's completely consensual, what's the problem with sexting?

00:01:01.240 --> 00:01:05.256
People are into all sorts of things that you may not be into,

00:01:05.280 --> 00:01:07.576
like blue cheese or cilantro.

NOTE Paragraph

00:01:07.600 --> 00:01:09.240
(Laughter)

NOTE Paragraph

00:01:10.600 --> 00:01:14.736
Sexting is certainly risky, like anything that's fun,

00:01:14.760 --> 00:01:21.456
but as long as you're not sending an image to someone who doesn't want to receive it,

00:01:21.480 --> 00:01:23.256
there's no harm.

00:01:23.280 --> 00:01:25.896
What I do think is a serious problem

00:01:25.920 --> 00:01:29.176
is when people share private images of others

00:01:29.200 --> 00:01:30.720
without their permission.

00:01:31.360 --> 00:01:33.696
And instead of worrying about sexting,

00:01:33.720 --> 00:01:38.280
what I think we need to do is think a lot more about digital privacy.

NOTE Paragraph

00:01:38.880 --> 00:01:41.080
The key is consent.

00:01:41.680 --> 00:01:44.976
Right now most people are thinking about sexting

00:01:45.000 --> 00:01:47.960
without really thinking about consent at all.

00:01:48.400 --> 00:01:52.080
Did you know that we currently criminalize teen sexting?

00:01:53.400 --> 00:01:56.856
It can be a crime because it counts as child pornography

00:01:56.880 --> 00:01:59.736
if there's an image of someone under 18,

00:01:59.760 --> 00:02:01.096
and it doesn't even matter

00:02:01.120 --> 00:02:05.320
if they took that image of themselves and shared it willingly.

00:02:05.800 --> 00:02:08.776
So we end up with this bizarre legal situation

00:02:08.800 --> 00:02:13.336
where two 17-year-olds can legally have sex in most US states

00:02:13.360 --> 00:02:15.240
but they can't photograph it.

00:02:16.560 --> 00:02:20.816
Some states have also tried passing sexting misdemeanor laws,

00:02:20.840 --> 00:02:23.856
but these laws repeat the same problem

00:02:23.880 --> 00:02:27.640
because they still make consensual sexting illegal.

00:02:28.520 --> 00:02:29.776
It doesn't make sense

00:02:29.800 --> 00:02:34.536
to try to ban all sexting in order to address privacy violations.

00:02:34.560 --> 00:02:36.056
This is kind of like saying,

00:02:36.080 --> 00:02:41.520
let's solve the problem of date rape by just making dating completely illegal.
00:02:43.120 --> 00:02:48.496
Most teens don't get arrested for sexting, but can you guess who does?

00:02:48.520 --> 00:02:53.496
It's often teens who are disliked by their partner's parents.

00:02:53.520 --> 00:02:58.320
And this can be because of class bias, racism or homophobia.

00:02:58.960 --> 00:03:01.736
Most prosecutors are, of course, smart enough

00:03:01.760 --> 00:03:07.240
not to use child pornography charges against teenagers, but some do.

NOTE Paragraph

00:03:07.720 --> 00:03:11.376
According to researchers at the University of New Hampshire,

00:03:11.400 --> 00:03:17.056
seven percent of all child pornography possession arrests are teens

00:03:17.080 --> 00:03:20.080
sexting consensually with other teens.

00:03:21.480 --> 00:03:24.016
Child pornography is a serious crime,

00:03:24.040 --> 00:03:27.720
but it's just not the same thing as teen sexting.

00:03:29.040 --> 00:03:32.456
Parents and educators are also responding to sexting

00:03:32.480 --> 00:03:35.616
without really thinking too much about consent.

00:03:35.640 --> 00:03:39.760
Their message to teens is often: just don't do it.

NOTE Paragraph

00:03:40.200 --> 00:03:43.696
And I totally get it -- there are serious legal risks

00:03:43.720 --> 00:03:47.376
and of course, that potential for privacy violations.

00:03:47.400 --> 00:03:48.656
And when you were a teen,

00:03:48.680 --> 00:03:52.080
I'm sure you did exactly as you were told, right?

NOTE Paragraph

00:03:53.440 --> 00:03:56.896
You're probably thinking, my kid would never sext.

00:03:56.920 --> 00:04:00.376
And that's true, your little angel may not be sexting,

00:04:00.400 --> 00:04:03.536
because only 33 percent

00:04:03.560 --> 00:04:05.920
of 16- and 17-year-olds are sexting.

00:04:07.200 --> 00:04:11.816
But, sorry, by the time they're older, odds are they will be sexting.

00:04:11.840 --> 00:04:18.040
Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds.

00:04:18.720 --> 00:04:21.776
And most of the time, nothing goes wrong.

00:04:21.800 --> 00:04:27.176
People ask me all the time things like, isn't sexting just so dangerous, though?

00:04:27.200 --> 00:04:30.776
It's like, you wouldn't leave your wallet on a park bench,

00:04:30.800 --> 00:04:34.240
and you expect it's going to get stolen if you do that, right?

00:04:34.880 --> 00:04:36.336
Here's how I think about it:

00:04:36.360 --> 00:04:40.296
sexting is like leaving your wallet at your boyfriend's house.

00:04:40.320 --> 00:04:42.096
If you come back the next day

00:04:42.120 --> 00:04:44.400
and all the money is just gone,

00:04:45.040 --> 00:04:47.160
you really need to dump that guy.

00:04:47.690 --> 00:04:49.860
(Laughter)

00:04:51.360 --> 00:04:53.680
So instead of criminalizing sexting

00:04:53.720 --> 00:04:56.336
to try to prevent these privacy violations,

00:04:56.360 --> 00:04:59.656
we need to make consent central

00:04:59.680 --> 00:05:03.760
to how we think about the circulation of our private information.

NOTE Paragraph

00:05:04.480 --> 00:05:08.736
Every new media technology raises privacy concerns.

00:05:08.760 --> 00:05:13.376
In fact, in the US the very first major debates about privacy

00:05:13.400 --> 00:05:17.896
were in response to technologies that were relatively new at the time.

00:05:17.920 --> 00:05:21.816
In the late 1800s, people were worried about cameras,

00:05:21.840 --> 00:05:25.296
which were just suddenly more portable than ever before,

00:05:25.320 --> 00:05:27.816
and newspaper gossip columns.
00:05:27.840 --> 00:05:31.656
They were worried that the camera would capture information about them,

00:05:31.680 --> 00:05:34.880
take it out of context and widely disseminate it.

00:05:35.240 --> 00:05:36.856
Does this sound familiar?

00:05:36.880 --> 00:05:41.736
It's exactly what we're worrying about now with social media and drone cameras,

00:05:41.760 --> 00:05:43.400
and, of course, sexting.

00:05:43.920 --> 00:05:46.136
And these fears about technology,

00:05:46.160 --> 00:05:47.376
they make sense,

00:05:47.400 --> 00:05:50.816
because technologies can amplify and bring out

00:05:50.840 --> 00:05:53.560
our worst qualities and behaviors.

00:05:54.160 --> 00:05:56.536
But there are solutions.

00:05:56.560 --> 00:06:00.120
And we've been here before with a dangerous new technology.

NOTE Paragraph

00:06:00.720 --> 00:06:04.496
In 1908, Ford introduced the Model T car.

00:06:04.520 --> 00:06:07.096
Traffic fatality rates were rising.

00:06:07.120 --> 00:06:09.920
It was a serious problem -- it looks so safe, right?

00:06:12.080 --> 00:06:16.056
Our first response was to try to change drivers' behavior,

00:06:16.080 --> 00:06:19.800
so we developed speed limits and enforced them through fines.

00:06:20.240 --> 00:06:22.096
But over the following decades,

00:06:22.120 --> 00:06:27.616
we started to realize the technology of the car itself is not just neutral.

00:06:27.640 --> 00:06:30.856
We could design the car to make it safer.

00:06:30.880 --> 00:06:34.336
So in the 1920s, we got shatter-resistant windshields.

00:06:34.360 --> 00:06:36.856
In the 1950s, seat belts.

00:06:36.880 --> 00:06:39.960
And in the 1990s, airbags.

00:06:40.440 --> 00:06:42.816
All three of these areas --

00:06:42.840 --> 00:06:47.616
laws, individuals and industry -- came together over time

00:06:47.640 --> 00:06:51.416
to help solve the problem that a new technology causes.

00:06:51.440 --> 00:06:54.680
And we can do the same thing with digital privacy.

00:06:55.160 --> 00:06:57.920
Of course, it comes back to consent.

00:06:58.360 --> 00:06:59.576
Here's the idea.

00:06:59.600 --> 00:07:03.416
Before anyone can distribute your private information,

00:07:03.440 --> 00:07:05.680
they should have to get your permission.

00:07:06.240 --> 00:07:11.056
This idea of affirmative consent comes from anti-rape activists

00:07:11.080 --> 00:07:14.856
who tell us that we need consent for every sexual act.

00:07:14.880 --> 00:07:19.456
And we have really high standards for consent in a lot of other areas.

NOTE Paragraph

00:07:19.480 --> 00:07:21.336
Think about having surgery.

00:07:21.360 --> 00:07:22.976
Your doctor has to make sure

00:07:23.000 --> 00:07:27.040
that you are meaningfully and knowingly consenting to that medical procedure.

00:07:27.520 --> 00:07:31.216
This is not the type of consent like with an iTunes Terms of Service

00:07:31.240 --> 00:07:34.896
where you just scroll to the bottom and you're like, agree, agree, whatever.

NOTE Paragraph

00:07:34.920 --> 00:07:36.640
(Laughter)

NOTE Paragraph

00:07:37.160 --> 00:07:42.416
If we think more about consent, we can have better privacy laws.

00:07:42.440 --> 00:07:45.856
Right now, we just don't have that many protections.

00:07:45.880 --> 00:07:49.456
If your ex-husband or your ex-wife is a terrible person,

00:07:49.480 --> 00:07:53.696
they can take your nude photos and upload them to a porn site.

00:07:53.720 --> 00:07:56.936
It can be really hard to get those images taken down.
00:07:56.960 --> 00:07:58.176
And in a lot of states,

00:07:58.200 --> 00:08:02.016
you're actually better off if you took the images of yourself,

00:08:02.040 --> 00:08:04.840
because then you can file a copyright claim.

NOTE Paragraph

00:08:05.320 --> 00:08:07.376
(Laughter)

NOTE Paragraph

00:08:07.400 --> 00:08:10.376
Right now, if someone violates your privacy,

00:08:10.400 --> 00:08:14.600
whether that's an individual or a company or the NSA,

00:08:15.280 --> 00:08:18.016
you can try filing a lawsuit,

00:08:18.040 --> 00:08:20.176
though you may not be successful,

00:08:20.200 --> 00:08:24.976
because many courts assume that digital privacy is just impossible.

00:08:25.000 --> 00:08:28.440
So they're not willing to punish anyone for violating it.

00:08:29.200 --> 00:08:32.096
I still hear people asking me all the time,

00:08:32.120 --> 00:08:37.416
isn't a digital image somehow blurring the line between public and private

00:08:37.440 --> 00:08:39.080
because it's digital, right?

00:08:39.600 --> 00:08:40.936
No! No!

00:08:40.960 --> 00:08:44.296
Not everything digital is automatically public.

00:08:44.320 --> 00:08:46.216
That doesn't make any sense.

NOTE Paragraph

00:08:46.240 --> 00:08:49.736
As NYU legal scholar Helen Nissenbaum tells us,

00:08:49.760 --> 00:08:52.376
we have laws and policies and norms

00:08:52.400 --> 00:08:55.536
that protect all kinds of information that's private,

00:08:55.560 --> 00:08:58.976
and it doesn't make a difference if it's digital or not.

00:08:59.000 --> 00:09:01.656
All of your health records are digitized,

00:09:01.680 --> 00:09:04.816
but your doctor can't just share them with anyone.

00:09:04.840 --> 00:09:09.296
All of your financial information is held in digital databases,

00:09:09.320 --> 00:09:13.560
but your credit card company can't just post your purchase history online.

00:09:15.080 --> 00:09:20.536
Better laws could help address privacy violations after they happen,

00:09:20.560 --> 00:09:24.936
but one of the easiest things we can all do is make personal changes

00:09:24.960 --> 00:09:27.640
to help protect each other's privacy.

00:09:28.360 --> 00:09:30.256
We're always told that privacy

00:09:30.280 --> 00:09:33.336
is our own, sole, individual responsibility.

00:09:33.360 --> 00:09:37.616
We're told, constantly monitor and update your privacy settings.

00:09:37.640 --> 00:09:42.440
We're told, never share anything you wouldn't want the entire world to see.

00:09:43.400 --> 00:09:44.616
This doesn't make sense.

00:09:44.640 --> 00:09:47.616
Digital media are social environments,

00:09:47.640 --> 00:09:51.920
and we share things with people we trust all day, every day.

NOTE Paragraph

00:09:52.760 --> 00:09:55.736
As Princeton researcher Janet Vertesi argues,

00:09:55.760 --> 00:09:59.776
our data and our privacy, they're not just personal,

00:09:59.800 --> 00:10:02.376
they're actually interpersonal.

00:10:02.400 --> 00:10:05.656
And so one thing you can do that's really easy

00:10:05.680 --> 00:10:10.776
is just start asking for permission before you share anyone else's information.

00:10:10.800 --> 00:10:15.336
If you want to post a photo of someone online, ask for permission.

00:10:15.360 --> 00:10:17.816
If you want to forward an email thread,

00:10:17.840 --> 00:10:19.216
ask for permission.

00:10:19.240 --> 00:10:22.016
And if you want to share someone's nude selfie,

00:10:22.040 --> 00:10:24.320
obviously, ask for permission.
NOTE Paragraph

00:10:25.560 --> 00:10:30.016
These individual changes can really help us protect each other's privacy,

00:10:30.040 --> 00:10:33.840
but we need technology companies on board as well.

00:10:34.360 --> 00:10:38.856
These companies have very little incentive to help protect our privacy,

00:10:38.880 --> 00:10:42.176
because their business models depend on us sharing everything

00:10:42.200 --> 00:10:44.440
with as many people as possible.

00:10:45.080 --> 00:10:47.016
Right now, if I send you an image,

00:10:47.040 --> 00:10:50.136
you can forward that to anyone that you want.

00:10:50.160 --> 00:10:54.416
But what if I got to decide if that image was forwardable or not?

00:10:54.440 --> 00:10:58.496
This would tell you, you don't have my permission to send this image out.

00:10:58.520 --> 00:11:02.656
We do this kind of thing all the time to protect copyright.

00:11:02.680 --> 00:11:07.456
If you buy an e-book, you can't just send it out to as many people as you want.

00:11:07.480 --> 00:11:10.040
So why not try this with mobile phones?

00:11:10.960 --> 00:11:15.736
What we can do is demand that tech companies add these protections

00:11:15.760 --> 00:11:19.496
to our devices and our platforms as the default.

00:11:19.520 --> 00:11:22.936
After all, you can choose the color of your car,

00:11:22.960 --> 00:11:25.800
but the airbags are always standard.

NOTE Paragraph

00:11:28.080 --> 00:11:31.896
If we don't think more about digital privacy and consent,

00:11:31.920 --> 00:11:34.640
there can be serious consequences.

00:11:35.360 --> 00:11:37.616
There was a teenager from Ohio --

00:11:37.640 --> 00:11:40.480
let's call her Jennifer, for the sake of her privacy.

00:11:41.120 --> 00:11:44.696
She shared nude photos of herself with her high school boyfriend,

00:11:44.720 --> 00:11:46.240
thinking she could trust him.

00:11:47.720 --> 00:11:49.656
Unfortunately, he betrayed her

00:11:49.680 --> 00:11:52.656
and sent her photos around the entire school.

00:11:52.680 --> 00:11:56.200
Jennifer was embarrassed and humiliated,

00:11:56.800 --> 00:12:00.936
but instead of being compassionate, her classmates harassed her.

00:12:00.960 --> 00:12:02.816
They called her a slut and a whore

00:12:02.840 --> 00:12:04.800
and they made her life miserable.

00:12:05.360 --> 00:12:09.040
Jennifer started missing school and her grades dropped.

00:12:09.520 --> 00:12:13.320
Ultimately, Jennifer decided to end her own life.

NOTE Paragraph

00:12:14.720 --> 00:12:17.416
Jennifer did nothing wrong.

00:12:17.440 --> 00:12:19.696
All she did was share a nude photo

00:12:19.720 --> 00:12:22.536
with someone she thought she could trust.

00:12:22.560 --> 00:12:25.176
And yet our laws tell her

00:12:25.200 --> 00:12:29.360
that she committed a horrible crime equivalent to child pornography.

00:12:29.920 --> 00:12:31.416
Our gender norms tell her

00:12:31.440 --> 00:12:34.656
that by producing this nude image of herself,

00:12:34.680 --> 00:12:37.880
she somehow did the most horrible, shameful thing.

00:12:38.400 --> 00:12:42.616
And when we assume that privacy is impossible in digital media,

00:12:42.640 --> 00:12:48.160
we completely write off and excuse her boyfriend's bad, bad behavior.

00:12:49.200 --> 00:12:54.936
People are still saying all the time to victims of privacy violations,

00:12:54.960 --> 00:12:56.216
"What were you thinking?

00:12:56.240 --> 00:12:58.720
You should have never sent that image."
NOTE Paragraph

00:12:59.640 --> 00:13:03.640
If you're trying to figure out what to say instead, try this.

00:13:04.160 --> 00:13:07.680
Imagine you've run into your friend who broke their leg skiing.

00:13:08.240 --> 00:13:12.816
They took a risk to do something fun, and it didn't end well.

00:13:12.840 --> 00:13:15.376
But you're probably not going to be the jerk who says,

00:13:15.400 --> 00:13:17.840
"Well, I guess you shouldn't have gone skiing then."

00:13:20.080 --> 00:13:22.216
If we think more about consent,

00:13:22.240 --> 00:13:25.496
we can see that victims of privacy violations

00:13:25.520 --> 00:13:27.256
deserve our compassion,

00:13:27.280 --> 00:13:31.880
not criminalization, shaming, harassment or punishment.

00:13:32.440 --> 00:13:36.936
We can support victims, and we can prevent some privacy violations

00:13:36.960 --> 00:13:41.280
by making these legal, individual and technological changes.

00:13:41.840 --> 00:13:47.656
Because the problem is not sexting, the issue is digital privacy.

00:13:47.680 --> 00:13:50.040
And one solution is consent.

NOTE Paragraph

00:13:50.680 --> 00:13:55.256
So the next time a victim of a privacy violation comes up to you,

00:13:55.280 --> 00:13:58.016
instead of blaming them, let's do this:

00:13:58.040 --> 00:14:01.456
let's shift our ideas about digital privacy,

00:14:01.480 --> 00:14:04.120
and let's respond with compassion.

NOTE Paragraph

00:14:04.680 --> 00:14:05.896
Thank you.

NOTE Paragraph

00:14:05.920 --> 00:14:12.056
(Applause)