0:00:00.829,0:00:04.827 People have been using media to talk[br]about sex for a long time. 0:00:05.692,0:00:08.463 Love letters, phone sex, racy Polaroids. 0:00:09.549,0:00:15.278 There's even a story of a girl who eloped[br]with a man that she met over the telegraph 0:00:15.627,0:00:16.962 in 1886. 0:00:18.761,0:00:23.299 Today we have sexting,[br]and I am a sexting expert. 0:00:23.861,0:00:25.884 Not an expert sexter. 0:00:26.999,0:00:30.825 Though, I do know what this means --[br]I think you do too. 0:00:31.425,0:00:35.475 (Laughter) 0:00:36.584,0:00:41.994 I have been studying sexting since[br]the media attention to it began in 2008. 0:00:42.802,0:00:45.681 I wrote a book on the moral[br]panic about sexting. 0:00:45.991,0:00:47.375 And here's what I found: 0:00:47.575,0:00:50.249 most people are worrying[br]about the wrong thing. 0:00:50.618,0:00:54.074 They're trying to just prevent[br]sexting from happening entirely. 0:00:54.920,0:00:56.516 But let me ask you this: 0:00:56.586,0:01:00.732 As long as it's completely consensual,[br]what's the problem with sexting? 0:01:01.208,0:01:04.969 People are into all sorts of things[br]that you may not be into, 0:01:05.402,0:01:07.575 like blue cheese or cilantro. 0:01:07.978,0:01:09.982 (Laughter) 0:01:10.778,0:01:14.318 Sexting is certainly risky,[br]like anything that's fun, 0:01:15.015,0:01:20.901 but as long as you're not sending an image[br]to someone who doesn't want to receive it, 0:01:21.639,0:01:23.111 there's no harm. 0:01:23.370,0:01:29.044 What I do think is a serious problem is[br]when people share private images of others 0:01:29.303,0:01:31.247 without their permission. 0:01:31.455,0:01:35.049 And instead of worrying about sexting,[br]what I think we need to do 0:01:35.346,0:01:38.110 is think a lot more about digital privacy. 0:01:38.894,0:01:40.895 The key is consent.
0:01:41.871,0:01:44.109 Right now most people are thinking 0:01:44.109,0:01:47.497 about sexting without really thinking[br]about consent at all. 0:01:48.427,0:01:52.188 Did you know that we currently[br]criminalize teen sexting? 0:01:53.414,0:01:56.887 It can be a crime because it counts[br]as child pornography 0:01:56.950,0:01:59.602 if there's an image of someone under 18, 0:01:59.783,0:02:01.418 and it doesn't even matter 0:02:01.418,0:02:05.106 if they took that image of themselves[br]and shared it willingly. 0:02:05.902,0:02:08.475 So we end up with this[br]bizarre legal situation 0:02:08.963,0:02:12.793 where two 17-year-olds[br]can legally have sex in most U.S. states 0:02:13.447,0:02:15.676 but they can't photograph it. 0:02:16.767,0:02:20.601 Some states have also tried passing[br]sexting misdemeanor laws, 0:02:20.970,0:02:23.773 but these laws repeat the same problem 0:02:24.035,0:02:27.256 because they still make[br]consensual sexting illegal. 0:02:28.706,0:02:29.843 It doesn't make sense 0:02:29.883,0:02:33.719 to try to ban all sexting[br]as a way to address privacy violations. 0:02:34.597,0:02:36.143 This is kind of like saying, 0:02:36.202,0:02:41.525 let's solve the problem of date rape[br]by just making dating completely illegal. 0:02:43.276,0:02:48.061 Most teens don't get arrested for sexting,[br]but can you guess who does? 0:02:48.575,0:02:52.565 It's often teens who are disliked[br]by their partner's parents. 0:02:53.596,0:02:58.036 And this can be because of class bias,[br]racism or homophobia. 0:02:59.115,0:03:01.756 Most prosecutors are,[br]of course, smart enough 0:03:01.756,0:03:07.283 not to use child pornography charges[br]against teenagers, but some do. 0:03:07.753,0:03:10.814 According to researchers[br]at the University of New Hampshire, 0:03:11.311,0:03:16.624 seven percent of all child pornography[br]possession arrests are teens 0:03:17.098,0:03:19.863 sexting consensually with other teens.
0:03:21.512,0:03:24.024 Child pornography is a serious crime, 0:03:24.223,0:03:27.606 but it's just not[br]the same thing as teen sexting. 0:03:29.121,0:03:32.295 Parents and educators are also[br]responding to sexting 0:03:32.558,0:03:35.191 without really thinking[br]too much about consent. 0:03:35.698,0:03:39.707 Their message to teens is often:[br]just don't do it. 0:03:40.242,0:03:43.223 And I totally get it --[br]there are serious legal risks 0:03:43.822,0:03:46.872 and, of course,[br]the potential for privacy violations. 0:03:47.351,0:03:48.712 And when you were a teen, 0:03:48.872,0:03:52.074 I'm sure you did exactly[br]as you were told, right? 0:03:53.571,0:03:56.613 You're probably thinking,[br]my kid would never sext. 0:03:57.100,0:03:59.694 And that's true, your little angel[br]may not be sexting 0:04:00.434,0:04:05.934 because only 33% of 16- and[br]17-year-olds are sexting. 0:04:07.214,0:04:11.224 But, sorry, by the time they're older,[br]odds are they will be sexting. 0:04:12.024,0:04:17.569 Every study I've seen puts the rate[br]above 50% for 18 to 24-year-olds. 0:04:18.856,0:04:21.176 And most of the time,[br]nothing goes wrong. 0:04:22.041,0:04:27.264 People ask me all the time things like,[br]isn't sexting just so dangerous, though, 0:04:27.567,0:04:31.121 like you wouldn't leave your wallet[br]on a park bench 0:04:31.253,0:04:34.108 and you expect it's gonna get stolen[br]if you do that, right? 0:04:34.738,0:04:36.339 Here's how I think about it: 0:04:36.595,0:04:39.795 sexting is like leaving your wallet[br]at your boyfriend's house. 0:04:40.385,0:04:42.192 If you come back the next day 0:04:42.409,0:04:47.150 and all the money is just gone,[br]you really need to dump that guy. 0:04:51.616,0:04:56.058 So instead of criminalizing sexting to try[br]to prevent these privacy violations, 0:04:56.645,0:04:59.223 we need to make consent central 0:04:59.810,0:05:03.447 to how we think about the circulation[br]of our private information.
0:05:04.673,0:05:08.460 Every new media technology[br]raises privacy concerns. 0:05:08.962,0:05:13.446 In fact, in the U.S. the very first[br]major debates about privacy 0:05:13.459,0:05:17.110 were in response to technologies[br]that were relatively new at the time. 0:05:17.969,0:05:20.722 In the late 1800s, people were[br]worried about cameras, 0:05:22.057,0:05:25.169 which were just suddenly more portable[br]than ever before, 0:05:25.619,0:05:27.696 and newspaper gossip columns. 0:05:28.056,0:05:31.586 They were worried that the camera[br]would capture information about them, 0:05:31.905,0:05:34.793 take it out of context[br]and widely disseminate it. 0:05:35.371,0:05:37.139 Does this sound familiar? 0:05:37.359,0:05:41.433 It's exactly what we're worrying about[br]now with social media and drone cameras, 0:05:41.888,0:05:43.974 and, of course, sexting. 0:05:44.197,0:05:48.750 And these fears about technology,[br]they make sense because technologies 0:05:49.034,0:05:53.713 can amplify and bring out[br]our worst qualities and behaviors. 0:05:54.170,0:05:56.300 But there are solutions. 0:05:56.727,0:06:00.431 And we've been here before[br]with a dangerous new technology. 0:06:00.857,0:06:04.542 In 1908, Ford introduced the Model T car. 0:06:04.742,0:06:06.933 Traffic fatality rates were rising. 0:06:07.290,0:06:10.443 It was a serious problem --[br]it looks so safe, right? 0:06:12.224,0:06:15.869 Our first response was to try[br]to change drivers' behavior, 0:06:16.423,0:06:19.434 so we developed speed limits and[br]enforced them through fines. 0:06:20.362,0:06:22.238 But over the following decades, 0:06:22.238,0:06:26.869 we started to realize the technology[br]of the car itself is not just neutral. 0:06:27.795,0:06:30.867 We could design the car to make it safer. 0:06:31.143,0:06:34.143 So in the 1920s, we got[br]shatter-resistant windshields. 0:06:34.593,0:06:39.429 In the 1950s, seatbelts.[br]And in the 1990s, airbags.
0:06:40.540,0:06:42.184 All three of these areas -- 0:06:42.832,0:06:46.381 laws, individuals,[br]and industry -- came together 0:06:46.939,0:06:51.093 over time to help solve the problem[br]that a new technology causes. 0:06:51.537,0:06:54.793 And we can do the same thing[br]with digital privacy. 0:06:55.310,0:06:57.927 Of course, it comes back to consent. 0:06:58.422,0:06:59.938 Here's the idea. 0:07:00.102,0:07:03.004 Before anyone can distribute[br]your private information, 0:07:03.589,0:07:05.683 they should have to get your permission. 0:07:06.391,0:07:11.495 This idea of affirmative consent[br]comes from anti-rape activists who tell us 0:07:11.766,0:07:14.534 that we need consent for every sexual act. 0:07:15.088,0:07:19.069 And we have really high standards[br]for consent in a lot of other areas. 0:07:19.486,0:07:21.275 Think about having surgery. 0:07:21.553,0:07:23.214 Your doctor has to make sure that 0:07:23.214,0:07:26.891 you are meaningfully and knowingly[br]consenting to that medical procedure. 0:07:27.732,0:07:31.243 This is not the type of consent you give[br]to an iTunes Terms of Service, 0:07:31.435,0:07:35.444 where you scroll to the bottom and[br]you're like, agree, agree, whatever. 0:07:37.363,0:07:41.724 If we think more about consent,[br]we can have better privacy laws. 0:07:42.598,0:07:45.741 Right now, we just don't have[br]that many protections. 0:07:46.033,0:07:49.075 If your ex-husband or your ex-wife[br]is a terrible person, 0:07:49.569,0:07:53.489 they can take your nude photos[br]and upload them to a porn site. 0:07:53.907,0:07:56.664 It can be really hard to get[br]those images taken down. 0:07:57.047,0:07:59.496 And in a lot of states,[br]you're actually better off 0:07:59.963,0:08:01.922 if you took the images of yourself 0:08:02.203,0:08:04.687 because then you can[br]file a copyright claim.
0:08:07.601,0:08:10.274 Right now, if someone[br]violates your privacy, 0:08:10.610,0:08:14.633 whether that's an individual or[br]a company or the NSA, 0:08:15.361,0:08:19.479 you can try filing a lawsuit,[br]though you may not be successful 0:08:20.331,0:08:24.481 because many courts assume that[br]digital privacy is just impossible. 0:08:25.248,0:08:28.739 So they're not willing to punish[br]anyone for violating it. 0:08:29.313,0:08:32.182 I still hear people[br]asking me all the time, 0:08:32.460,0:08:37.478 isn't a digital image somehow blurring[br]the line between public and private 0:08:37.667,0:08:39.574 because it's digital, right? 0:08:39.757,0:08:40.953 No! No! 0:08:41.299,0:08:44.362 Everything digital is not[br]just automatically public. 0:08:44.362,0:08:46.275 That doesn't make any sense. 0:08:46.482,0:08:49.313 As NYU legal scholar[br]Helen Nissenbaum tells us, 0:08:49.896,0:08:52.841 we have laws and policies[br]and norms that protect 0:08:53.077,0:08:56.572 all kinds of information that's private,[br]and it doesn't make a difference 0:08:56.885,0:08:58.911 if it's digital or not. 0:08:59.221,0:09:02.435 All of your health records[br]are digitized but your doctor 0:09:02.652,0:09:04.653 can't just share them with anyone. 0:09:04.911,0:09:08.979 All of your financial information[br]is held in digital databases, 0:09:09.379,0:09:13.483 but your credit card company can't[br]just post your purchase history online. 0:09:15.247,0:09:19.866 Better laws could help address[br]privacy violations after they happen, 0:09:20.298,0:09:24.891 but one of the easiest things[br]we can all do is make personal changes 0:09:25.244,0:09:27.863 to help protect each other's privacy. 0:09:28.673,0:09:33.086 We're always told that privacy is our[br]own, sole, individual responsibility. 0:09:33.536,0:09:37.147 We're told, constantly monitor[br]and update your privacy settings. 
0:09:37.703,0:09:42.452 We're told never share anything you[br]wouldn't want the entire world to see. 0:09:43.435,0:09:44.924 This doesn't make sense. 0:09:44.924,0:09:47.517 Digital media are social environments 0:09:47.721,0:09:51.936 and we share things[br]with people we trust all day, every day. 0:09:52.897,0:09:55.247 As Princeton researcher[br]Janet Vertesi argues, 0:09:55.816,0:09:59.449 our data and our privacy,[br]they're not just personal, 0:09:59.889,0:10:02.469 they're actually interpersonal. 0:10:02.645,0:10:05.286 And so one thing you can do,[br]that's really easy, 0:10:05.820,0:10:10.224 is just start asking for permission before[br]you share anyone else's information. 0:10:11.045,0:10:14.555 If you want to post a photo[br]of someone online, ask for permission. 0:10:15.401,0:10:18.649 If you want to forward an email thread,[br]ask for permission. 0:10:19.346,0:10:24.002 And if you want to share someone's nude[br]selfie, obviously, ask for permission. 0:10:25.807,0:10:29.632 These individual changes can really[br]help us protect each other's privacy, 0:10:30.092,0:10:33.729 but we need technology companies[br]on board as well. 0:10:34.552,0:10:38.414 These companies have very little[br]incentive to help protect our privacy 0:10:38.961,0:10:42.194 because their business models depend on[br]us sharing everything 0:10:42.398,0:10:44.804 with as many people as possible. 0:10:45.231,0:10:47.147 Right now, if I send you an image, 0:10:47.147,0:10:49.489 you can forward that[br]to anyone that you want. 0:10:50.264,0:10:54.228 But what if I got to decide[br]if that image was forwardable or not? 0:10:54.595,0:10:58.534 This would tell you, you don't[br]have my permission to send this image out. 0:10:58.699,0:11:02.115 We do this kind of thing all the time[br]to protect copyright. 0:11:02.849,0:11:07.160 If you buy an e-book, you can't just[br]send it out to as many people as you want. 0:11:07.620,0:11:10.441 So why not try this with mobile phones?
0:11:11.229,0:11:15.378 We can demand[br]that tech companies add these protections 0:11:15.814,0:11:19.041 to our devices and our platforms[br]as the default. 0:11:19.452,0:11:22.373 After all, you can choose[br]the color of your car, 0:11:22.898,0:11:25.947 but the airbags are always standard. 0:11:28.192,0:11:32.007 If we don't think more about[br]digital privacy and consent, 0:11:32.331,0:11:34.526 there can be serious consequences. 0:11:35.556,0:11:37.336 There was a teenager from Ohio -- 0:11:37.776,0:11:40.462 let's call her Jennifer,[br]for the sake of her privacy. 0:11:41.192,0:11:43.094 She shared nude photos of herself 0:11:43.103,0:11:46.091 with her high-school boyfriend,[br]thinking she could trust him. 0:11:47.890,0:11:52.381 Unfortunately, he betrayed her and[br]sent her photos around the entire school. 0:11:52.818,0:11:55.934 Jennifer was embarrassed and humiliated, 0:11:56.783,0:12:00.934 but instead of being compassionate,[br]her classmates harassed her. 0:12:01.174,0:12:03.057 They called her a slut and a whore 0:12:03.057,0:12:04.984 and they made her life miserable. 0:12:05.593,0:12:08.969 Jennifer started missing school[br]and her grades dropped. 0:12:09.623,0:12:13.229 Ultimately, Jennifer decided[br]to end her own life. 0:12:14.930,0:12:17.188 Jennifer did nothing wrong. 0:12:17.560,0:12:19.659 All she did was share a nude photo 0:12:19.998,0:12:22.267 with someone she thought[br]that she could trust. 0:12:22.638,0:12:25.005 And yet our laws tell her 0:12:25.459,0:12:29.343 that she committed a horrible crime[br]equivalent to child pornography. 0:12:29.981,0:12:31.739 Our gender norms tell her that 0:12:31.971,0:12:34.444 by producing this nude image of herself, 0:12:34.802,0:12:38.121 she somehow did the most[br]horrible, shameful thing.
0:12:38.700,0:12:42.141 And when we assume that privacy[br]is impossible in digital media, 0:12:42.802,0:12:48.098 we completely write off and excuse[br]her boyfriend's bad, bad behavior. 0:12:49.216,0:12:54.098 People are still saying all the time[br]to victims of privacy violations, 0:12:54.887,0:12:56.488 what were you thinking? 0:12:56.618,0:12:58.872 You should have never sent that image. 0:12:59.764,0:13:03.555 If you're trying to figure out[br]what to say instead, try this. 0:13:04.209,0:13:08.066 Imagine you've run into your friend[br]who broke their leg skiing. 0:13:08.493,0:13:12.253 They took a risk to do something fun[br]and it didn't end well. 0:13:12.918,0:13:15.746 But you're probably not going to be[br]the jerk who says, well, 0:13:16.116,0:13:18.748 I guess you shouldn't[br]have gone skiing then. 0:13:20.302,0:13:23.595 If we think more about consent,[br]we can see that 0:13:23.880,0:13:27.075 victims of privacy violations[br]deserve our compassion, 0:13:27.628,0:13:31.678 not criminalization, shaming,[br]harassment, or punishment. 0:13:32.590,0:13:36.926 We can support victims and[br]we can prevent some privacy violations 0:13:37.119,0:13:41.045 by making these legal, individual[br]and technological changes. 0:13:41.905,0:13:47.206 Because the problem is not sexting,[br]the issue is digital privacy. 0:13:47.865,0:13:50.590 And one solution is consent. 0:13:50.956,0:13:54.349 So the next time a victim[br]of a privacy violation comes up to you, 0:13:55.265,0:13:57.074 instead of blaming them, 0:13:57.254,0:13:58.453 let's do this instead: 0:13:58.453,0:14:01.833 let's shift our ideas[br]about digital privacy and 0:14:01.833,0:14:03.776 let's respond with compassion. 0:14:04.848,0:14:06.580 Thank you.