People have been using media to talk about sex for a long time. Love letters, phone sex, racy Polaroids. There's even a story of a girl who eloped with a man that she met over the telegraph in 1886.

(Laughter)

Today we have sexting. And I am a sexting expert. Not an expert sexter -

(Laughter)

though I do know what this means, and I think you do too!

[it's a penis]

(Laughter)

I have been studying sexting since the media attention to it began in 2008. I wrote a book on the moral panic about sexting, and here's what I found: most people are worrying about the wrong thing. They're trying to just prevent sexting from happening entirely, but let me ask you this: as long as it's completely consensual, what's the problem with sexting? People are into all sorts of things that you may not be into, like blue cheese or cilantro.

(Laughter)

Sexting is certainly risky, like anything that's fun, but...

(Laughter)

as long as you are not sending an image to someone who doesn't want to receive it, there's no harm. What I do think is a serious problem is when people share private images of others without their permission. And instead of worrying about sexting, what I think we need to do is think a lot more about digital privacy. The key is consent.

Right now, most people are thinking about sexting without really thinking about consent at all. Did you know that we currently criminalize teen sexting? It can be a crime because it counts as child pornography if there's an image of someone under 18, and it doesn't even matter if they took that image of themselves and shared it willingly. So we end up with this bizarre legal situation where two 17-year-olds can legally have sex in most U.S. states, but they can't photograph it.

Some states have also tried passing sexting misdemeanor laws, but these laws repeat the same problem because they still make consensual sexting illegal. It doesn't make sense to try to ban all sexting to try to address privacy violations. This is kind of like saying, "Let's solve the problem of date rape by just making dating completely illegal."

Most teens don't get arrested for sexting, but can you guess who does?
It's often teens who are disliked by their partner's parents, and this can be because of class bias, racism, or homophobia. Most prosecutors are, of course, smart enough not to use child pornography charges against teenagers, but some do. According to researchers at the University of New Hampshire, seven percent of all child pornography possession arrests are of teens sexting consensually with other teens. Child pornography is a serious crime, but it's just not the same thing as teen sexting.

Parents and educators are also responding to sexting without really thinking too much about consent. Their message to teens is often "just don't do it," and I totally get it. There are serious legal risks, and of course, that potential for privacy violations. And when you were a teen, I'm sure you did exactly as you were told, right?

You're probably thinking, "My kid would never sext," and that's true; your little angel may not be sexting, because only 33 percent of 16- and 17-year-olds are sexting. But, sorry, by the time they're older, odds are they will be sexting. Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds. And most of the time, nothing goes wrong.

People ask me all the time things like, "Isn't sexting just so dangerous, though? You wouldn't leave your wallet on a park bench. You expect it's going to get stolen if you do that, right?" Here's how I think about it: sexting is like leaving your wallet at your boyfriend's house. If you come back the next day and all the money is just gone, you really need to dump that guy.

(Laughter)

So instead of criminalizing sexting to try to prevent these privacy violations, we need to make consent central to how we think about the circulation of our private information.

Every new media technology raises privacy concerns. In fact, in the U.S., the first major debates about privacy were in response to technologies that were relatively new at the time. In the late 1800s, people were worried about cameras, which were just suddenly more portable than ever before, and newspaper gossip columns. They were worried that the camera would capture information about them, take it out of context, and widely disseminate it. Does this sound familiar?
It's exactly what we're worrying about now with social media and drone cameras, and of course, sexting. And these fears about technology make sense, because technologies can amplify and bring out our worst qualities and behaviors. But there are solutions, and we've been here before with a dangerous new technology.

In 1908, Ford introduced the Model T car. Traffic fatality rates were rising; it was a serious problem. It looks so safe, right?

(Laughter)

Our first response was to try to change drivers' behavior, so we developed speed limits and enforced them through fines. But over the following decades, we started to realize that the technology of the car itself is not just neutral. We could design the car to make it safer. So in the 1920s, we got shatter-resistant windshields; in the 1950s, seat belts; and in the 1990s, air bags. All three of these areas - laws, individuals, and industry - came together over time to help solve the problems that a new technology causes. And we can do the same thing with digital privacy. Of course, it comes back to consent.

Here's the idea: before anyone can distribute your private information, they should have to get your permission. This idea of affirmative consent comes from anti-rape activists, who tell us that we need consent for every sexual act. And we have really high standards for consent in a lot of other areas. Think about having surgery. Your doctor has to make sure that you are meaningfully and knowingly consenting to that medical procedure. This is not the type of consent you give to an iTunes Terms of Service, where you just scroll to the bottom and you're like, "Agree, agree, whatever."

(Laughter)

If we think more about consent, we can have better privacy laws. Right now, we just don't have that many protections. If your ex-husband or your ex-wife is a terrible person, they can take your nude photos and upload them to a porn site. It can be really hard to get those images taken down, and in a lot of states, you're actually better off if you took the images of yourself, because then you can file a copyright claim.
(Laughter)

Right now, if someone violates your privacy, whether that's an individual or a company or the NSA, you can try filing a lawsuit, but you may not be successful, because many courts assume that digital privacy is just impossible, so they're not willing to punish anyone for violating it.

I still hear people asking me all the time, "Isn't a digital image somehow blurring the line between public and private because it's digital, right?" No, no! Everything digital is not just automatically public. That doesn't make any sense. As NYU legal scholar Helen Nissenbaum tells us, we have laws and policies and norms that protect all kinds of information that's private, and it doesn't make a difference if it's digital or not. All of your health records are digitized, but your doctor can't just share them with anyone. All of your financial information is held in digital databases, but your credit card company can't just post your purchase history online.

Better laws could help address privacy violations after they happen, but one of the easiest things we can all do is make personal changes to help protect each other's privacy. We're always told that privacy is our own sole, individual responsibility. We're told, "Constantly monitor and update your privacy settings." We're told, "Never share anything you wouldn't want the entire world to see." This doesn't make sense. Digital media are social environments, and we share things with people we trust all day, every day. As Princeton researcher Janet Vertesi argues, our data and our privacy, they're not just personal, they're actually interpersonal. So one thing you can do that's really easy is just start asking for permission before you share anyone else's information. If you want to post a photo of someone online, ask for permission. If you want to forward an email thread, ask for permission. If you want to share someone's nude selfie, obviously, ask for permission!

These individual changes can help us protect each other's privacy, but we need technology companies on board as well. These companies have very little incentive to help protect our privacy, because their business models depend on us sharing everything with as many people as possible. Right now, if I send you an image, you can forward that to anyone that you want. But what if I got to decide if that image was forwardable or not? This would tell you, "You don't have my permission to send this image out." We do this kind of thing all the time to protect copyright. If you buy an e-book, you can't just send it out to as many people as you want, so why not try this with mobile phones? We can demand that tech companies add these protections to our devices and our platforms as the default. After all, you can choose the color of your car, but the airbags are always standard.
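To make that idea concrete, here is a minimal sketch in Python of what a consent-by-default forwarding rule could look like on a messaging platform. Everything here is hypothetical: the `Image` type, the `forwardable` flag, and the function names are illustrative assumptions, not any real app's API.

```python
# Hypothetical sketch: forwarding is blocked unless the original
# sender has explicitly opted in, mirroring the talk's proposal
# that consent be the default, like an airbag that's always standard.

from dataclasses import dataclass


@dataclass(frozen=True)
class Image:
    sender: str
    data: bytes
    forwardable: bool = False  # consent is OFF by default


class ForwardingNotPermitted(Exception):
    """Raised when someone tries to re-send an image without consent."""


def forward(image: Image, recipient: str) -> None:
    # Check the sender's consent flag before re-sending.
    if not image.forwardable:
        raise ForwardingNotPermitted(
            f"You don't have {image.sender}'s permission to send this image out."
        )
    deliver(image, recipient)


def deliver(image: Image, recipient: str) -> None:
    print(f"Delivering image from {image.sender} to {recipient}")


# By default, forwarding fails until the sender opts in.
photo = Image(sender="alice", data=b"...")
try:
    forward(photo, "carol")
except ForwardingNotPermitted as err:
    print(err)

# Only an image the sender marked forwardable goes through.
shareable = Image(sender="alice", data=b"...", forwardable=True)
forward(shareable, "carol")
```

The design choice doing the work here is the default value: a platform that ships `forwardable=False` makes consent the baseline, rather than something users must remember to configure.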
If we don't think more about digital privacy and consent, there can be serious consequences.

There was a teenager from Ohio - let's call her Jennifer, for the sake of her privacy. She shared nude photos of herself with her high school boyfriend, thinking she could trust him. Unfortunately, he betrayed her and sent her photos around the entire school. Jennifer was embarrassed and humiliated, but instead of being compassionate, her classmates harassed her. They called her a slut and a whore, and they made her life miserable. Jennifer started missing school, and her grades dropped. Ultimately, Jennifer decided to end her own life.

Jennifer did nothing wrong. All she did was share a nude photo with someone that she thought she could trust. And yet our laws tell her that she committed a horrible crime equivalent to child pornography. Our gender norms tell her that by producing this nude image of herself, she somehow did the most horrible, shameful thing. And when we assume that privacy is impossible in digital media, we completely write off and excuse her boyfriend's bad, bad behavior.

People are still saying all the time to victims of privacy violations, "What were you thinking? You should've never sent that image." If you're trying to figure out what to say instead, try this: imagine you've run into your friend who broke their leg skiing. They took a risk to do something fun, and it didn't end well. But you're probably not going to be the jerk who says, "Well, I guess you shouldn't have gone skiing then!"

If we think more about consent, we can see that victims of privacy violations deserve our compassion, not criminalization, shaming, harassment, or punishment. We can support victims, and we can prevent some privacy violations by making these legal, individual, and technological changes.
Because the problem is not sexting. The issue is digital privacy, and one solution is consent. So the next time a victim of a privacy violation comes up to you, instead of blaming them, let's do this: let's shift our ideas about digital privacy, and let's respond with compassion.

Thank you.

(Applause)