WEBVTT

00:00:18.524 --> 00:00:23.564
People have been using media to talk about sex for a long time:

00:00:23.564 --> 00:00:27.345
love letters, phone sex, racy Polaroids.

00:00:27.345 --> 00:00:30.595
There's even a story of a girl who eloped

00:00:30.595 --> 00:00:35.256
with a man that she met over the telegraph in 1886.

00:00:35.256 --> 00:00:36.556
(Laughter)

00:00:36.556 --> 00:00:39.196
Today we have sexting.

00:00:39.196 --> 00:00:41.617
And I am a sexting expert.

00:00:41.617 --> 00:00:43.837
Not an expert sexter

00:00:43.837 --> 00:00:44.846
(Laughter)

00:00:44.846 --> 00:00:47.807
though I do know what this means,

00:00:47.807 --> 00:00:48.917
and I think you do too!

00:00:48.917 --> 00:00:50.137
[it's a penis]

00:00:50.137 --> 00:00:52.306
(Laughter)

00:00:52.306 --> 00:00:54.377
I have been studying sexting

00:00:54.377 --> 00:00:58.577
since the media attention to it began in 2008.

00:00:58.577 --> 00:01:01.660
I wrote a book on the moral panic about sexting,

00:01:01.660 --> 00:01:03.359
and here's what I found:

00:01:03.359 --> 00:01:06.459
most people are worrying about the wrong thing.

00:01:06.459 --> 00:01:10.779
They're trying to just prevent sexting from happening entirely,

00:01:10.779 --> 00:01:12.209
but let me ask you this:

00:01:12.209 --> 00:01:17.179
As long as it's completely consensual, what's the problem with sexting?

00:01:17.179 --> 00:01:21.220
People are into all sorts of things that you may not be into,

00:01:21.220 --> 00:01:23.756
like blue cheese or cilantro.

00:01:23.756 --> 00:01:26.561
(Laughter)

00:01:26.561 --> 00:01:30.762
Sexting is certainly risky, like anything that's fun,

00:01:30.762 --> 00:01:31.772
but ...

00:01:31.772 --> 00:01:32.781
(Laughter)

00:01:32.781 --> 00:01:37.352
as long as you are not sending an image to someone who doesn't want to receive it,

00:01:37.352 --> 00:01:39.131
there's no harm.

00:01:39.131 --> 00:01:41.892
What I do think is a serious problem

00:01:41.892 --> 00:01:46.873
is when people share private images of others without their permission,

00:01:47.283 --> 00:01:49.702
and instead of worrying about sexting,

00:01:49.702 --> 00:01:54.803
what I think we need to do is think a lot more about digital privacy.

00:01:54.803 --> 00:01:57.114
The key is consent.

00:01:57.544 --> 00:02:00.874
Right now, most people are thinking about sexting

00:02:00.874 --> 00:02:04.325
without really thinking about consent at all.

00:02:04.325 --> 00:02:08.355
Did you know that we currently criminalize teen sexting?

00:02:09.295 --> 00:02:12.715
It can be a crime because it counts as child pornography

00:02:12.715 --> 00:02:15.716
if there's an image of someone under 18,

00:02:15.716 --> 00:02:19.385
and it doesn't even matter if they took that image of themselves

00:02:19.385 --> 00:02:21.716
and shared it willingly.

00:02:21.716 --> 00:02:24.716
So we end up with this bizarre legal situation

00:02:24.716 --> 00:02:29.226
where two 17-year-olds can legally have sex in most U.S. states,

00:02:29.226 --> 00:02:31.427
but they can't photograph it.

00:02:32.587 --> 00:02:36.678
Some states have also tried passing sexting misdemeanor laws,

00:02:36.678 --> 00:02:39.778
but these laws repeat the same problem,

00:02:39.778 --> 00:02:43.858
because they still make consensual sexting illegal.

00:02:44.418 --> 00:02:47.718
It doesn't make sense to try to ban all sexting

00:02:47.718 --> 00:02:50.379
to try to address privacy violations.

00:02:50.379 --> 00:02:51.979
This is kind of like saying,

00:02:51.979 --> 00:02:57.820
"Let's solve the problem of date rape by just making dating completely illegal."

00:02:59.010 --> 00:03:04.450
Most teens don't get arrested for sexting, but can you guess who does?

00:03:04.450 --> 00:03:09.362
It's often teens who are disliked by their partner's parents,

00:03:09.362 --> 00:03:14.571
and this can be because of class bias, racism, or homophobia.

00:03:14.915 --> 00:03:17.686
Most prosecutors are, of course, smart enough

00:03:17.686 --> 00:03:21.865
not to use child pornography charges against teenagers,

00:03:21.865 --> 00:03:23.605
but some do.

00:03:23.605 --> 00:03:27.165
According to researchers at the University of New Hampshire,

00:03:27.165 --> 00:03:32.767
7% of all child pornography possession arrests are teens

00:03:32.767 --> 00:03:36.397
sexting consensually with other teens.

00:03:37.417 --> 00:03:39.958
Child pornography is a serious crime,

00:03:39.958 --> 00:03:44.068
but it's just not the same thing as teen sexting.

00:03:44.978 --> 00:03:48.308
Parents and educators are also responding to sexting

00:03:48.308 --> 00:03:51.419
without really thinking too much about consent.

00:03:51.419 --> 00:03:56.069
Their message to teens is often, "Just don't do it,"

00:03:56.069 --> 00:03:57.639
and I totally get it.

00:03:57.639 --> 00:03:59.619
There are serious legal risks,

00:03:59.619 --> 00:04:03.300
and of course, that potential for privacy violations.

00:04:03.300 --> 00:04:08.516
And when you were a teen, I'm sure you did exactly as you were told, right?

00:04:09.366 --> 00:04:12.841
You're probably thinking, "My kid would never sext,"

00:04:12.841 --> 00:04:16.307
and that's true; your little angel may not be sexting

00:04:16.307 --> 00:04:22.298
because only 33% of 16- and 17-year-olds are sexting.

00:04:23.058 --> 00:04:25.438
But, sorry, by the time they're older,

00:04:25.438 --> 00:04:27.651
odds are, they will be sexting.

00:04:27.651 --> 00:04:34.103
Every study I've seen puts the rate above 50% for 18- to 24-year-olds.

00:04:34.603 --> 00:04:37.803
And most of the time, nothing goes wrong.

00:04:38.273 --> 00:04:40.933
People ask me all the time things like,

00:04:40.933 --> 00:04:43.703
"Isn't sexting just so dangerous, though?

00:04:43.703 --> 00:04:47.323
You wouldn't leave your wallet on a park bench.

00:04:47.323 --> 00:04:51.394
You expect it's going to get stolen if you do that, right?"

00:04:51.394 --> 00:04:52.775
Here's how I think about it:

00:04:52.775 --> 00:04:56.675
sexting is like leaving your wallet at your boyfriend's house.

00:04:56.675 --> 00:05:01.475
If you come back the next day, and all the money is just gone,

00:05:01.475 --> 00:05:04.276
you really need to dump that guy.

00:05:04.276 --> 00:05:05.745
(Laughter)

00:05:07.845 --> 00:05:12.746
So instead of criminalizing sexting to try to prevent these privacy violations,

00:05:12.746 --> 00:05:16.057
we need to make consent central

00:05:16.057 --> 00:05:20.498
to how we think about that circulation of our private information.

00:05:20.868 --> 00:05:25.188
Every new media technology raises privacy concerns;

00:05:25.188 --> 00:05:29.758
in fact, in the U.S., the first major debates about privacy

00:05:29.758 --> 00:05:34.249
were in response to technologies that were relatively new at the time.

00:05:34.249 --> 00:05:38.229
In the late 1800s, people were worried about cameras,

00:05:38.229 --> 00:05:41.730
which were just suddenly more portable than ever before,

00:05:41.730 --> 00:05:44.130
and newspaper gossip columns.

00:05:44.130 --> 00:05:48.030
They were worried that the camera would capture information about them,

00:05:48.030 --> 00:05:51.639
take it out of context, and widely disseminate it.

00:05:51.639 --> 00:05:53.241
Does this sound familiar?

00:05:53.241 --> 00:05:58.200
It's exactly what we're worrying about now with social media, and drone cameras,

00:05:58.200 --> 00:06:00.321
and of course, sexting.

00:06:00.321 --> 00:06:03.652
And these fears about technology make sense,

00:06:03.652 --> 00:06:06.372
because technologies can amplify

00:06:06.372 --> 00:06:10.563
and bring out our worst qualities and behaviors.

00:06:10.563 --> 00:06:12.893
But there are solutions,

00:06:12.893 --> 00:06:17.193
and we've been here before with a dangerous new technology.

00:06:17.193 --> 00:06:21.023
In 1908, Ford introduced the Model T car.

00:06:21.023 --> 00:06:24.912
Traffic fatality rates were rising; it was a serious problem.

00:06:24.912 --> 00:06:27.113
It looks so safe, right?

00:06:27.113 --> 00:06:28.474
(Laughter)

00:06:29.234 --> 00:06:33.245
Our first response was to try to change drivers' behavior,

00:06:33.245 --> 00:06:37.285
so we developed speed limits and enforced them through fines.

00:06:37.285 --> 00:06:40.765
But over the following decades, we started to realize

00:06:40.765 --> 00:06:44.716
the technology of the car itself is not just neutral.

00:06:44.716 --> 00:06:47.966
We could design the car to make it safer.

00:06:47.966 --> 00:06:51.466
So in the 1920s, we got shatter-resistant windshields;

00:06:51.466 --> 00:06:57.076
in the 1950s, seat belts; and in the 1990s, air bags.

00:06:58.256 --> 00:07:03.307
All three of these areas: laws, individuals, and industry

00:07:03.307 --> 00:07:09.348
came together over time to help solve the problems that a new technology causes,

00:07:09.348 --> 00:07:12.918
and we can do the same thing with digital privacy.

00:07:12.918 --> 00:07:16.218
Of course, it comes back to consent.

00:07:16.218 --> 00:07:17.558
Here's the idea:

00:07:17.558 --> 00:07:21.239
before anyone can distribute your private information,

00:07:21.239 --> 00:07:24.069
they should have to get your permission.

00:07:24.069 --> 00:07:29.020
This idea of affirmative consent comes from anti-rape activists

00:07:29.020 --> 00:07:32.760
who tell us that we need consent for every sexual act.

00:07:32.760 --> 00:07:37.327
And we have really high standards for consent in a lot of other areas.

00:07:37.327 --> 00:07:39.127
Think about having surgery.

00:07:39.127 --> 00:07:42.189
Your doctor has to make sure that you are meaningfully

00:07:42.189 --> 00:07:45.359
and knowingly consenting to that medical procedure.

00:07:45.359 --> 00:07:49.098
This is not the type of consent like an iTunes Terms of Service

00:07:49.098 --> 00:07:53.009
where you just scroll to the bottom and you're like, "Agree, agree, whatever."

00:07:53.009 --> 00:07:54.460
(Laughter)

00:07:55.079 --> 00:08:00.260
If we think more about consent, we can have better privacy laws.

00:08:00.260 --> 00:08:03.681
Right now, we just don't have that many protections.

00:08:03.681 --> 00:08:07.381
If your ex-husband or your ex-wife is a terrible person,

00:08:07.381 --> 00:08:11.531
they can take your nude photos and upload them to a porn site.

00:08:11.531 --> 00:08:14.750
It can be really hard to get those images taken down,

00:08:14.750 --> 00:08:16.022
and in a lot of states,

00:08:16.022 --> 00:08:19.912
you're actually better off if you took the images of yourself

00:08:19.912 --> 00:08:23.122
because then you can file a copyright claim.

00:08:23.472 --> 00:08:24.760
(Laughter)

00:08:25.353 --> 00:08:28.283
Right now, if someone violates your privacy,

00:08:28.283 --> 00:08:32.644
whether that's an individual, or a company, or the NSA,

00:08:33.154 --> 00:08:38.044
you can try filing a lawsuit, but you may not be successful

00:08:38.044 --> 00:08:42.904
because many courts assume that digital privacy is just impossible,

00:08:42.904 --> 00:08:46.694
so they're not willing to punish anyone for violating it.

00:08:47.154 --> 00:08:50.055
I still hear people asking me all the time,

00:08:50.055 --> 00:08:55.366
"Isn't a digital image somehow blurring the line between public and private

00:08:55.366 --> 00:08:57.556
because it's digital, right?"

00:08:57.556 --> 00:08:58.936
No, no!

00:08:58.936 --> 00:09:02.236
Everything digital is not just automatically public.

00:09:02.236 --> 00:09:04.086
That doesn't make any sense.

00:09:04.086 --> 00:09:07.597
As NYU legal scholar Helen Nissenbaum tells us,

00:09:07.597 --> 00:09:10.326
we have laws, and policies, and norms

00:09:10.326 --> 00:09:13.467
that protect all kinds of information that's private,

00:09:13.467 --> 00:09:16.846
and it doesn't make a difference if it's digital or not.

00:09:16.846 --> 00:09:19.628
All of your health records are digitized,

00:09:19.628 --> 00:09:22.748
but your doctor can't just share them with anyone.

00:09:22.748 --> 00:09:27.098
All of your financial information is held in digital databases,

00:09:27.098 --> 00:09:32.039
but your credit card company can't just post your purchase history online.

00:09:32.879 --> 00:09:38.010
Better laws could help address privacy violations after they happen,

00:09:38.420 --> 00:09:40.999
but one of the easiest things we can all do

00:09:40.999 --> 00:09:45.930
is make personal changes to help protect each other's privacy.

00:09:46.250 --> 00:09:51.271
We're always told that privacy is our own sole individual responsibility.

00:09:51.271 --> 00:09:55.431
We're told, "Constantly monitor and update your privacy settings."

00:09:55.431 --> 00:10:00.902
We're told, "Never share anything you wouldn't want the entire world to see."

00:10:01.352 --> 00:10:02.522
This doesn't make sense.

00:10:02.522 --> 00:10:05.582
Digital media are social environments,

00:10:05.582 --> 00:10:10.413
and we share things with people we trust all day, every day.

00:10:10.723 --> 00:10:13.613
As Princeton researcher Janet Vertesi argues,

00:10:13.613 --> 00:10:17.523
our data and our privacy are not just personal,

00:10:17.523 --> 00:10:20.284
they're actually interpersonal.

00:10:20.284 --> 00:10:25.704
So one thing you can do that's really easy is just start asking for permission

00:10:25.704 --> 00:10:28.654
before you share anyone else's information.

00:10:28.654 --> 00:10:33.335
If you want to post a photo of someone online, ask for permission.

00:10:33.335 --> 00:10:37.076
If you want to forward an email thread, ask for permission.

00:10:37.076 --> 00:10:39.776
If you want to share someone's nude selfie,

00:10:39.776 --> 00:10:42.466
obviously, ask for permission.

00:10:43.386 --> 00:10:47.826
These individual changes can help us protect each other's privacy,

00:10:47.826 --> 00:10:52.246
but we need technology companies on board as well.

00:10:52.246 --> 00:10:56.697
These companies have very little incentive to help protect our privacy

00:10:56.697 --> 00:10:58.969
because their business models depend on us

00:10:58.969 --> 00:11:02.918
sharing everything with as many people as possible.

00:11:02.918 --> 00:11:04.988
Right now, if I send you an image,

00:11:04.988 --> 00:11:08.008
you can forward that to anyone that you want.

00:11:08.008 --> 00:11:12.319
But what if I got to decide if that image was forwardable or not?

00:11:12.319 --> 00:11:16.379
This would tell you, "You don't have my permission to send this image out."

00:11:16.379 --> 00:11:20.600
We do this kind of thing all the time to protect copyright.

00:11:20.600 --> 00:11:25.300
If you buy an e-book, you can't just send it out to as many people as you want,

00:11:25.300 --> 00:11:28.371
so why not try this with mobile phones?

00:11:29.151 --> 00:11:30.271
What we can do

00:11:30.271 --> 00:11:34.001
is demand that tech companies add these protections

00:11:34.001 --> 00:11:37.651
to our devices and our platforms as the default.

00:11:37.651 --> 00:11:41.061
After all, you can choose the color of your car,

00:11:41.061 --> 00:11:44.342
but the airbags are always standard.

00:11:45.432 --> 00:11:49.223
If we don't think more about digital privacy and consent,

00:11:49.223 --> 00:11:52.053
there can be serious consequences.

00:11:52.573 --> 00:11:55.003
There was a teenager from Ohio

00:11:55.003 --> 00:11:58.509
- let's call her Jennifer for the sake of her privacy -

00:11:58.509 --> 00:12:02.143
she shared nude photos of herself with her high school boyfriend

00:12:02.143 --> 00:12:04.040
thinking she could trust him.

00:12:05.080 --> 00:12:10.034
Unfortunately, he betrayed her and sent her photos around the entire school.

00:12:10.034 --> 00:12:14.064
Jennifer was embarrassed and humiliated,

00:12:14.064 --> 00:12:18.205
but instead of being compassionate, her classmates harassed her.

00:12:18.205 --> 00:12:22.626
They called her a slut and a whore, and they made her life miserable.

00:12:22.626 --> 00:12:26.795
Jennifer started missing school, and her grades dropped.

00:12:26.795 --> 00:12:31.137
Ultimately, Jennifer decided to end her own life.

00:12:32.067 --> 00:12:34.697
Jennifer did nothing wrong.

00:12:34.697 --> 00:12:37.127
All she did was share a nude photo

00:12:37.127 --> 00:12:39.818
with someone that she thought that she could trust.

00:12:39.818 --> 00:12:42.558
And yet, our laws tell her

00:12:42.558 --> 00:12:47.317
that she committed a horrible crime equivalent to child pornography.

00:12:47.317 --> 00:12:48.888
Our gender norms tell her

00:12:48.888 --> 00:12:52.068
that by producing this nude image of herself,

00:12:52.068 --> 00:12:55.779
she somehow did the most horrible, shameful thing.

00:12:55.779 --> 00:12:59.860
And when we assume that privacy is impossible in digital media,

00:12:59.860 --> 00:13:05.950
we completely write off and excuse her boyfriend's bad, bad behavior.

00:13:06.420 --> 00:13:12.251
People are still saying all the time to victims of privacy violations,

00:13:12.251 --> 00:13:13.610
"What were you thinking?

00:13:13.610 --> 00:13:16.390
You should've never sent that image."

00:13:16.950 --> 00:13:21.552
If you're trying to figure out what to say instead, try this:

00:13:21.552 --> 00:13:25.442
imagine you've run into your friend who broke their leg skiing.

00:13:25.442 --> 00:13:30.102
They took a risk to do something fun, and it didn't end well.

00:13:30.102 --> 00:13:32.693
But you're probably not going to be the jerk who says,

00:13:32.693 --> 00:13:35.823
"Well, I guess you shouldn't have gone skiing then!"

00:13:37.343 --> 00:13:39.563
If we think more about consent,

00:13:39.563 --> 00:13:44.594
we can see that victims of privacy violations deserve our compassion,

00:13:44.594 --> 00:13:49.784
not criminalization, shaming, harassment, or punishment.

00:13:49.784 --> 00:13:54.335
We can support victims, and we can prevent some privacy violations

00:13:54.335 --> 00:13:59.145
by making these legal, individual, and technological changes.

00:13:59.145 --> 00:14:04.935
Because the problem is not sexting, the issue is digital privacy,

00:14:04.935 --> 00:14:07.876
and one solution is consent.

00:14:07.876 --> 00:14:12.637
So the next time a victim of a privacy violation comes up to you,

00:14:12.637 --> 00:14:15.346
instead of blaming them, let's do this:

00:14:15.346 --> 00:14:18.807
Let's shift our ideas about digital privacy,

00:14:18.807 --> 00:14:22.027
and let's respond with compassion.

00:14:22.027 --> 00:14:23.247
Thank you.

00:14:23.247 --> 00:14:24.977
(Applause)