People have been using media to talk about sex for a long time. Love letters, phone sex, racy Polaroids. There's even a story of a girl who eloped with a man that she met over the telegraph in 1886.

Today we have sexting, and I am a sexting expert. Not an expert sexter--

(Laughter)

Though, I do know what this means, and I think you do too.

(Laughter)

I have been studying sexting since the media attention to it began in 2008. I wrote a book on the moral panic about sexting, and here's what I found: most people are worrying about the wrong thing. They're trying to just prevent sexting from happening entirely, but let me ask you this: as long as it is completely consensual, what's the problem with sexting? People are into all sorts of things that you may not be into, like blue cheese or cilantro.

(Laughter)

Sexting is certainly risky, like anything that's fun, but as long as you're not sending an image to someone who doesn't want to receive it, there's no harm. What I do think is a serious problem is when people share private images of others without their permission. And instead of worrying about sexting, what I think we need to do is think a lot more about digital privacy. The key is consent.

Right now, people are thinking about sexting without really thinking about consent at all. Did you know that we currently criminalize teen sexting? It can be a crime because it counts as child pornography if there's an image of someone under eighteen, and it doesn't even matter if they took that image of themselves and shared it willingly. So we end up with this bizarre legal situation where two 17-year-olds can have sex in most U.S. states, but they can't photograph it.

Some states have also tried passing sexting misdemeanor laws, but these laws repeat the same problem, because they still make consensual sexting illegal. It doesn't make sense to try to ban all sexting to try to address privacy violations. This is kind of like saying, "Let's solve the problem of date rape by just making dating completely illegal."

Most teens don't get arrested for sexting, but can you guess who does? It's often teens who are disliked by their partner's parents, and this can be because of class bias, racism, or homophobia.
Most prosecutors are, of course, smart enough not to use child pornography charges against teenagers, but some do. According to researchers at the University of New Hampshire, seven percent of all child pornography possession arrests are teens sexting consensually with other teens. Child pornography is a serious crime, but it's just not the same thing as teen sexting.

Parents and educators are also responding to sexting without really thinking too much about consent. Their message to teens is often just, "Don't do it," and I totally get it. There are serious legal risks and, of course, that potential for privacy violations. And when you were a teen, I'm sure you did exactly as you were told, right?

You're probably thinking, "My kid would never sext," and that's true; your little angel may not be sexting, because only 33 percent of 16- and 17-year-olds are sexting. But, sorry, by the time they're older, the odds are that they will be sexting. Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds. And most of the time, nothing goes wrong.

People ask me all the time things like, "Isn't sexting just so dangerous, though? You wouldn't leave your wallet on a park bench. You expect it's going to get stolen if you do that, right?" Here's how I think about it: sexting is like leaving your wallet at your boyfriend's house. If you come back the next day and all the money is just gone, you really need to dump that guy.

(Laughter)

So instead of criminalizing sexting to try to prevent these privacy violations, we need to make consent central to how we think about the circulation of our private information. Every new media technology raises privacy concerns; in fact, in the U.S., the first major debates about privacy were in response to technologies that were relatively new at the time. In the late 1800s, people were worried about cameras, which were just suddenly more portable than ever before, and newspaper gossip columns. They were worried that the camera would capture information about them, take it out of context, and widely disseminate it. Does that sound familiar? It's exactly what we're worrying about now with social media, drone cameras, and, of course, sexting.
And these fears about technology, they make sense, because technologies can amplify and bring out our worst qualities and behaviors. But there are solutions, and we've been here before with a dangerous new technology. In 1908, Ford introduced the Model T car. Traffic fatality rates were rising; it was a serious problem. It looks so safe, right?

(Laughter)

Our first response was to try to change drivers' behavior, so we developed speed limits and enforced them through fines. But over the following decades we started to realize that the technology of the car itself is not just neutral. We could design the car to make it safer. So in the 1920s, we got shatter-resistant windshields; in the 1950s, seat belts; and in the 1990s, air bags. All three of these areas, laws, individuals, and industry, came together over time to help solve the problems that a new technology causes, and we can do the same thing with digital privacy.

Of course, it comes back to consent. Here's the idea: before anyone can distribute your private information, they should have to get your permission. This idea of affirmative consent comes from anti-rape activists who tell us that we need consent for every sexual act. And we have really high standards for consent in other areas. Think about having surgery. Your doctor has to make sure that you are meaningfully and knowingly consenting to that medical procedure. This is not the type of consent like with an iTunes terms of service, where you just scroll to the bottom and you're like, "Agree, agree, whatever."

(Laughter)

If we think more about consent, we can have better privacy laws. Right now, we just don't have that many protections. If your ex-husband or your ex-wife is a terrible person, they can take your nude photos and upload them to a porn site. It can be really hard to get those images taken down, and in a lot of states, you're actually better off if you took the images of yourself, because then you can file a copyright claim.

(Laughter)

Right now, if someone violates your privacy, whether that's an individual or a company or the NSA, you can try filing a lawsuit, but you may not be successful, because many courts assume that digital privacy is just impossible, so they're not willing to punish anyone for violating it.
I still hear people asking me all the time, "Isn't a digital image somehow blurring the line between public and private, because it's digital, right?" No, no! Everything digital is not just automatically public. That doesn't make any sense. As NYU legal scholar Helen Nissenbaum tells us, we have laws and policies and norms that protect all kinds of information that's private, and it doesn't make a difference if it's digital or not. All of your health records are digitized, but your doctor can't just share them with anyone. All of your financial information is held in digital databases, but your credit card company can't just post your purchase history online.

Better laws could help address privacy violations after they happen, but one of the easiest things we can all do is make personal changes to help protect each other's privacy. We're always told that privacy is our sole individual responsibility. We're told, "Constantly monitor and update your privacy settings." We're told, "Never share anything you wouldn't want the entire world to see." This doesn't make sense. Digital media are social environments, and we share things with people we trust all day, every day. As Princeton researcher Janet Vertesi argues, our data and our privacy, they're not just personal, they're interpersonal.

So one thing you can do that's really easy is just start asking for permission before you share anyone else's information. If you want to post a photo of someone online, ask for permission. If you want to forward an email thread, ask for permission. If you want to share someone's nude selfie, obviously, ask for permission!

These individual changes can help us protect each other's privacy, but we need technology companies on board as well. These companies have very little incentive to help protect our privacy, because their business models depend on us sharing everything with as many people as possible. Right now, if I send you an image, you can forward that to anyone that you want. But what if I got to decide if that image was forwardable or not? This would tell you, "You don't have my permission to send this image out." We do this kind of thing all the time to protect copyright. If you buy an ebook, you can't just send it out to as many people as you want, so why not try this with mobile phones?
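[Editor's note: to make the forwarding-permission idea concrete, here is a minimal, purely hypothetical sketch in TypeScript. The talk does not describe any implementation; the `Message` shape and the `send` and `forward` functions are invented for illustration, and a real platform would have to enforce the flag at the device or service level, not merely in application code.]

```typescript
// Hypothetical sketch only: the idea is that a sender could mark an image as
// non-forwardable, the way ebook DRM restricts redistribution. Nothing here
// reflects a real messaging API; all names are invented for illustration.

interface Message {
  imageData: Uint8Array; // the photo being shared
  sender: string;
  recipient: string;
  forwardable: boolean;  // the sender's consent to further sharing, set at send time
}

function send(
  imageData: Uint8Array,
  sender: string,
  recipient: string,
  forwardable: boolean
): Message {
  // The sender decides up front whether the recipient may pass the image on.
  return { imageData, sender, recipient, forwardable };
}

function forward(original: Message, newRecipient: string): Message {
  // The platform refuses to forward when the sender withheld permission.
  if (!original.forwardable) {
    throw new Error("You don't have the sender's permission to send this image out.");
  }
  return { ...original, sender: original.recipient, recipient: newRecipient };
}

// Example: a photo sent without forwarding permission cannot be passed along.
const photo = send(new Uint8Array(), "alice", "bob", false);
try {
  forward(photo, "someone-else");
} catch (e) {
  console.log((e as Error).message); // the permission refusal above
}
```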
What we can do is demand that tech companies add these protections to our devices and our platforms as the default. After all, you can choose the color of your car, but the airbags are always standard.

If we don't think more about digital privacy and consent, there can be serious consequences. There was a teenager from Ohio. Let's call her Jennifer, for the sake of her privacy. She shared nude photos of herself with her high school boyfriend, thinking she could trust him. Unfortunately, he betrayed her and sent her photos around the entire school. Jennifer was embarrassed and humiliated, but instead of being compassionate, her classmates harassed her. They called her a slut and a whore and they made her life miserable. Jennifer started missing school and her grades dropped. Ultimately, Jennifer decided to end her own life.

Jennifer did nothing wrong. All she did was share a nude photo with someone she thought she could trust. And yet our laws tell her that she committed a horrible crime equivalent to child pornography. Our gender norms tell her that by producing this nude image of herself, she somehow did the most horrible, shameful thing. And when we assume that privacy is impossible in digital media, we completely write off and excuse her boyfriend's bad, bad behavior.

People are still saying all the time to victims of privacy violations, "What were you thinking? You should've never sent that image." If you're trying to figure out what to say instead, try this: imagine you've run into your friend who broke their leg skiing. They took a risk to do something fun, and it didn't end well. But you're probably not going to be the jerk who says, "Well, I guess you shouldn't have gone skiing then!"

If we think more about consent, we can see that victims of privacy violations deserve our compassion, not criminalization, shaming, harassment, or punishment. We can support victims, and we can prevent some privacy violations by making these legal, individual, and technological changes. Because the problem is not sexting, the issue is digital privacy, and one solution is consent. So the next time a victim of a privacy violation comes up to you, instead of blaming them, let's do this instead: let's shift our ideas about digital privacy, and let's respond with compassion.

Thank you.

(Applause)