WEBVTT 00:00:18.824 --> 00:00:23.854 People have been using media to talk about sex for a long time. 00:00:23.854 --> 00:00:27.715 Love letters, phone sex, racy Polaroids. 00:00:27.715 --> 00:00:31.335 There's even a story of a girl who eloped with a man 00:00:31.335 --> 00:00:36.636 that she met over the telegraph in 1886. 00:00:36.636 --> 00:00:39.296 Today we have sexting 00:00:39.296 --> 00:00:41.937 and I am a sexting expert. 00:00:41.937 --> 00:00:43.837 Not an expert sexter-- 00:00:43.837 --> 00:00:45.146 (Laughter) 00:00:45.146 --> 00:00:48.027 Though I do know what this means 00:00:48.027 --> 00:00:50.137 and I think you do too! 00:00:50.137 --> 00:00:52.656 (Laughter) 00:00:52.656 --> 00:00:54.707 I have been studying sexting 00:00:54.707 --> 00:00:58.627 since the media attention to it began in 2008. 00:00:58.627 --> 00:01:01.790 I wrote a book on the moral panic about sexting, 00:01:01.790 --> 00:01:03.439 and here's what I found: 00:01:03.439 --> 00:01:06.669 Most people are worrying about the wrong thing. 00:01:06.669 --> 00:01:10.779 They're trying to just prevent sexting from happening entirely, 00:01:10.779 --> 00:01:12.429 but let me ask you this: 00:01:12.429 --> 00:01:15.359 as long as it is completely consensual, what's the problem with sexting? 00:01:15.359 --> 00:01:21.400 People are into all sorts of things that you may not be into, 00:01:21.400 --> 00:01:23.756 like blue cheese or cilantro. 00:01:23.756 --> 00:01:26.831 (Laughter) 00:01:26.831 --> 00:01:30.962 Sexting is certainly risky, like anything that's fun, 00:01:30.962 --> 00:01:35.271 but as long as you're not sending any image 00:01:35.271 --> 00:01:37.522 to someone who doesn't want to receive it, 00:01:37.522 --> 00:01:39.401 there's no harm. 
00:01:39.401 --> 00:01:41.972 What I do think is a serious problem 00:01:41.972 --> 00:01:47.283 is when people share private images of others without their permission, 00:01:47.283 --> 00:01:49.882 and instead of worrying about sexting, 00:01:49.882 --> 00:01:54.853 what I think we need to do is think a lot more about digital privacy. 00:01:54.853 --> 00:01:57.754 The key is consent. 00:01:57.754 --> 00:02:01.134 Right now people are thinking about sexting 00:02:01.134 --> 00:02:04.505 without really thinking about consent at all. 00:02:04.505 --> 00:02:09.485 Did you know that we currently criminalize teen sexting? 00:02:09.485 --> 00:02:13.075 It can be a crime because it counts as child pornography 00:02:13.075 --> 00:02:15.716 if there's an image of someone under eighteen, 00:02:15.716 --> 00:02:19.475 and it doesn't even matter if they took that image of themselves 00:02:19.475 --> 00:02:21.836 and shared it willingly. 00:02:21.836 --> 00:02:24.716 So we end up with this bizarre legal situation 00:02:24.716 --> 00:02:29.226 where two 17-year-olds can have sex in most U.S. states, 00:02:29.226 --> 00:02:32.057 but they can't photograph it. 00:02:32.057 --> 00:02:36.978 Some states have also tried passing sexting misdemeanor laws, 00:02:36.978 --> 00:02:40.008 but these laws repeat the same problem 00:02:40.008 --> 00:02:44.628 because they still make consensual sexting illegal. 00:02:44.628 --> 00:02:47.998 It doesn't make sense to try to ban all sexting 00:02:47.998 --> 00:02:50.539 to try to address privacy violations. 00:02:50.539 --> 00:02:52.099 This is kind of like saying, 00:02:52.099 --> 00:02:59.010 "Let's solve the problem of date rape by just making dating completely illegal." 00:02:59.010 --> 00:03:04.720 Most teens don't get arrested for sexting, but can you guess who does? 
00:03:04.720 --> 00:03:09.362 It's often teens who are disliked by their partner's parents, 00:03:09.362 --> 00:03:15.101 and this can be because of class bias, racism, or homophobia. 00:03:15.375 --> 00:03:17.776 Most prosecutors are, of course, smart enough 00:03:17.776 --> 00:03:21.865 not to use child pornography charges against teenagers, 00:03:21.865 --> 00:03:23.735 but some do. 00:03:23.735 --> 00:03:27.335 According to researchers at the University of New Hampshire, 00:03:27.335 --> 00:03:32.007 seven percent of all child pornography possession arrests 00:03:32.007 --> 00:03:37.587 are teens sexting consensually with other teens. 00:03:37.587 --> 00:03:40.168 Child pornography is a serious crime 00:03:40.168 --> 00:03:44.808 but it's just not the same thing as teen sexting. 00:03:44.808 --> 00:03:48.428 Parents and educators are also responding to sexting 00:03:48.428 --> 00:03:51.739 without really thinking too much about consent. 00:03:51.739 --> 00:03:56.259 Their message to teens is often just don't do it, 00:03:56.259 --> 00:03:57.819 and I totally get it. 00:03:57.819 --> 00:03:59.869 There are serious legal risks 00:03:59.869 --> 00:04:03.300 and, of course, the potential for privacy violations. 00:04:03.300 --> 00:04:09.466 And when you were a teen, I'm sure you did exactly as you were told, right? 00:04:09.466 --> 00:04:12.931 You're probably thinking, "My kid would never sext," 00:04:12.931 --> 00:04:16.427 and that's true; your little angel may not be sexting 00:04:16.427 --> 00:04:23.118 because only 33 percent of 16- and 17-year-olds are sexting. 00:04:23.118 --> 00:04:25.648 But, sorry, by the time they're older 00:04:25.648 --> 00:04:27.981 the odds are that they will be sexting. 00:04:27.981 --> 00:04:34.873 Every study I've seen puts the rate above 50 percent for 18 to 24-year-olds. 00:04:34.873 --> 00:04:38.483 And most of the time, nothing goes wrong. 
00:04:38.483 --> 00:04:41.103 People ask me all the time things like, 00:04:41.103 --> 00:04:43.933 "Isn't sexting just so dangerous, though? 00:04:43.933 --> 00:04:47.493 You wouldn't leave your wallet on a park bench. 00:04:47.493 --> 00:04:51.394 You expect it's going to get stolen if you do that, right?" 00:04:51.394 --> 00:04:52.925 Here's how I think about it: 00:04:52.925 --> 00:04:56.775 Sexting is like leaving your wallet at your boyfriend's house. 00:04:56.775 --> 00:05:01.405 If you come back the next day and all the money is just gone, 00:05:01.405 --> 00:05:04.276 you really need to dump that guy. 00:05:04.276 --> 00:05:08.015 (Laughter) 00:05:08.015 --> 00:05:13.036 So instead of criminalizing sexting to try to prevent these privacy violations, 00:05:13.036 --> 00:05:16.347 we need to make consent central 00:05:16.347 --> 00:05:20.858 to how we think about the circulation of our private information. 00:05:20.858 --> 00:05:25.488 Every new media technology raises privacy concerns; 00:05:25.488 --> 00:05:29.748 in fact, in the U.S., the first major debates about privacy 00:05:29.748 --> 00:05:34.519 were in response to technologies that were relatively new at the time. 00:05:34.519 --> 00:05:38.509 In the late 1800s, people were worried about cameras, 00:05:38.509 --> 00:05:41.930 which were just suddenly more portable than ever before, 00:05:41.930 --> 00:05:44.410 and newspaper gossip columns. 00:05:44.410 --> 00:05:48.310 They were worried that the camera would capture information about them, 00:05:48.310 --> 00:05:51.869 take it out of context, and widely disseminate it. 00:05:51.869 --> 00:05:53.591 Does that sound familiar? 00:05:53.591 --> 00:05:58.370 It's exactly what we're worrying about now with social media, drone cameras, 00:05:58.370 --> 00:06:00.581 and of course, sexting. 
00:06:00.581 --> 00:06:03.852 And these fears about technology, they make sense 00:06:03.852 --> 00:06:06.452 because technologies can amplify 00:06:06.452 --> 00:06:10.563 and bring out our worst qualities and behaviors. 00:06:10.563 --> 00:06:13.013 But there are solutions 00:06:13.013 --> 00:06:17.303 and we've been here before with a dangerous new technology. 00:06:17.303 --> 00:06:21.193 In 1908, Ford introduced the Model T car. 00:06:21.193 --> 00:06:25.142 Traffic fatality rates were rising; it was a serious problem. 00:06:25.142 --> 00:06:26.833 It looks so safe, right? 00:06:26.833 --> 00:06:29.654 (Laughter) 00:06:29.654 --> 00:06:33.485 Our first response was to try to change drivers' behavior, 00:06:33.485 --> 00:06:37.665 so we developed speed limits and enforced them through fines. 00:06:37.665 --> 00:06:40.765 But over the following decades we started to realize 00:06:40.765 --> 00:06:44.876 that the technology of the car itself is not just neutral. 00:06:44.876 --> 00:06:48.206 We could design the car to make it safer. 00:06:48.206 --> 00:06:51.706 So in the 1920s, we got shatter-resistant windshields, 00:06:51.706 --> 00:06:58.256 in the 1950s, seat belts, and in the 1990s, air bags. 00:06:58.256 --> 00:07:03.697 All three of these areas, laws, individuals, and industry, 00:07:03.697 --> 00:07:09.488 came together over time to help solve the problems that a new technology causes 00:07:09.488 --> 00:07:13.148 and we can do the same thing with digital privacy. 00:07:13.148 --> 00:07:16.478 Of course, it comes back to consent. 00:07:16.478 --> 00:07:17.688 Here's the idea: 00:07:17.688 --> 00:07:21.379 before anyone can distribute your private information, 00:07:21.379 --> 00:07:24.259 they should have to get your permission. 00:07:24.259 --> 00:07:29.020 This idea of affirmative consent comes from anti-rape activists 00:07:29.020 --> 00:07:32.940 who tell us that we need consent for every sexual act. 
00:07:32.940 --> 00:07:37.000 And we have really high levels of consent in other areas. 99:59:59.999 --> 99:59:59.999 Think about having surgery. 99:59:59.999 --> 99:59:59.999 Your doctor has to make sure that you are meaningfully 99:59:59.999 --> 99:59:59.999 and knowingly consenting to that medical procedure. 99:59:59.999 --> 99:59:59.999 This is not the type of consent like an iTunes terms of service 99:59:59.999 --> 99:59:59.999 where you just scroll to the bottom and you're like, "Agree, agree, whatever." 99:59:59.999 --> 99:59:59.999 (Laughter) 99:59:59.999 --> 99:59:59.999 If we think more about consent, we can have better privacy laws. 99:59:59.999 --> 99:59:59.999 Right now we just don't have that many protections. 99:59:59.999 --> 99:59:59.999 If your ex-husband or your ex-wife is a terrible person, 99:59:59.999 --> 99:59:59.999 they can take your nude photos and upload them to a porn site. 99:59:59.999 --> 99:59:59.999 It can be really hard to get those images taken down 99:59:59.999 --> 99:59:59.999 and in a lot of states, 99:59:59.999 --> 99:59:59.999 you're actually better off if you took the images of yourself 99:59:59.999 --> 99:59:59.999 because then you can file a copyright claim. 99:59:59.999 --> 99:59:59.999 (Laughter) 99:59:59.999 --> 99:59:59.999 Right now if someone violates your privacy, 99:59:59.999 --> 99:59:59.999 whether that's an individual or a company or the NSA, 99:59:59.999 --> 99:59:59.999 you can try filing a lawsuit, but you may not be successful 99:59:59.999 --> 99:59:59.999 because many courts assume that digital privacy is just impossible 99:59:59.999 --> 99:59:59.999 so they're not willing to punish anyone for violating it. 99:59:59.999 --> 99:59:59.999 I still hear people asking me all the time, 99:59:59.999 --> 99:59:59.999 "Isn't a digital image somehow blurring the line between public and private 99:59:59.999 --> 99:59:59.999 because it's digital, right?" 99:59:59.999 --> 99:59:59.999 No, no! 
99:59:59.999 --> 99:59:59.999 Everything digital is not just automatically public. 99:59:59.999 --> 99:59:59.999 That doesn't make any sense. 99:59:59.999 --> 99:59:59.999 As NYU legal scholar Helen Nissenbaum tells us, 99:59:59.999 --> 99:59:59.999 we have laws and policies and norms 99:59:59.999 --> 99:59:59.999 that protect all kinds of information that's private, 99:59:59.999 --> 99:59:59.999 and it doesn't make a difference if it's digital or not. 99:59:59.999 --> 99:59:59.999 All of your health records are digitized 99:59:59.999 --> 99:59:59.999 but your doctor can't just share them with anyone. 99:59:59.999 --> 99:59:59.999 All of your financial information is held in digital databases 99:59:59.999 --> 99:59:59.999 but your credit card company can't just post your purchase history online. 99:59:59.999 --> 99:59:59.999 Better laws could help address privacy violations after they happen, 99:59:59.999 --> 99:59:59.999 but one of the easiest things we can all do 99:59:59.999 --> 99:59:59.999 is make personal changes to help protect each other's privacy. 99:59:59.999 --> 99:59:59.999 We're always told that privacy is our sole individual responsibility. 99:59:59.999 --> 99:59:59.999 We're told, "Constantly monitor and update your privacy settings." 99:59:59.999 --> 99:59:59.999 We're told, "Never share anything you wouldn't want the entire world to see." 99:59:59.999 --> 99:59:59.999 This doesn't make sense. 99:59:59.999 --> 99:59:59.999 Digital media are social environments 99:59:59.999 --> 99:59:59.999 and we share things with people we trust all day, every day. 99:59:59.999 --> 99:59:59.999 As Princeton researcher Janet Vertesi argues, 99:59:59.999 --> 99:59:59.999 our data and our privacy, 99:59:59.999 --> 99:59:59.999 they're not just personal, they're interpersonal. 99:59:59.999 --> 99:59:59.999 So one thing you can do that's really easy is just start asking for permission 99:59:59.999 --> 99:59:59.999 before you share anyone else's information. 
99:59:59.999 --> 99:59:59.999 If you want to post a photo of someone online, ask for permission. 99:59:59.999 --> 99:59:59.999 If you want to forward an email thread, ask for permission. 99:59:59.999 --> 99:59:59.999 If you want to share someone's nude selfie, 99:59:59.999 --> 99:59:59.999 obviously, ask for permission! 99:59:59.999 --> 99:59:59.999 These individual changes can help us protect each other's privacy, 99:59:59.999 --> 99:59:59.999 but we need technology companies on board as well. 99:59:59.999 --> 99:59:59.999 These companies have very little incentive to help protect our privacy 99:59:59.999 --> 99:59:59.999 because their business models depend on us 99:59:59.999 --> 99:59:59.999 sharing everything with as many people as possible. 99:59:59.999 --> 99:59:59.999 Right now, if I send you an image, 99:59:59.999 --> 99:59:59.999 you can forward that to anyone that you want. 99:59:59.999 --> 99:59:59.999 But what if I got to decide if that image was forwardable or not? 99:59:59.999 --> 99:59:59.999 This would tell you, "You don't have my permission to send this image out." 99:59:59.999 --> 99:59:59.999 We do this kind of thing all the time to protect copyright. 99:59:59.999 --> 99:59:59.999 If you buy an ebook, you can't just send it out to as many people as you want, 99:59:59.999 --> 99:59:59.999 so why not try this with mobile phones? 99:59:59.999 --> 99:59:59.999 What we can do is demand that tech companies 99:59:59.999 --> 99:59:59.999 add these protections to our devices and our platforms as the default. 99:59:59.999 --> 99:59:59.999 After all, you can choose the color of your car, 99:59:59.999 --> 99:59:59.999 but the airbags are always standard. 99:59:59.999 --> 99:59:59.999 If we don't think more about digital privacy and consent, 99:59:59.999 --> 99:59:59.999 there can be serious consequences. 99:59:59.999 --> 99:59:59.999 There was a teenager from Ohio. 99:59:59.999 --> 99:59:59.999 Let's call her Jennifer for the sake of her privacy. 
99:59:59.999 --> 99:59:59.999 She shared nude photos of herself with her high school boyfriend 99:59:59.999 --> 99:59:59.999 thinking she could trust him. 99:59:59.999 --> 99:59:59.999 Unfortunately, he betrayed her and sent her photos around the entire school. 99:59:59.999 --> 99:59:59.999 Jennifer was embarrassed and humiliated, 99:59:59.999 --> 99:59:59.999 but instead of being compassionate, her classmates harassed her. 99:59:59.999 --> 99:59:59.999 They called her a slut and a whore and they made her life miserable. 99:59:59.999 --> 99:59:59.999 Jennifer started missing school, and her grades dropped. 99:59:59.999 --> 99:59:59.999 Ultimately, Jennifer decided to end her own life. 99:59:59.999 --> 99:59:59.999 Jennifer did nothing wrong. 99:59:59.999 --> 99:59:59.999 All she did was share a nude photo 99:59:59.999 --> 99:59:59.999 with someone she thought she could trust. 99:59:59.999 --> 99:59:59.999 And yet, our laws tell her 99:59:59.999 --> 99:59:59.999 that she committed a horrible crime equivalent to child pornography. 99:59:59.999 --> 99:59:59.999 Our gender norms tell her that by producing this nude image of herself, 99:59:59.999 --> 99:59:59.999 she somehow did the most horrible, shameful thing. 99:59:59.999 --> 99:59:59.999 And when we assume that privacy is impossible in digital media, 99:59:59.999 --> 99:59:59.999 we completely write off and excuse her boyfriend's bad, bad behavior. 99:59:59.999 --> 99:59:59.999 People are still saying all the time to victims of privacy violations, 99:59:59.999 --> 99:59:59.999 "What were you thinking? 99:59:59.999 --> 99:59:59.999 You should've never sent that image." 99:59:59.999 --> 99:59:59.999 If you're trying to figure out what to say instead, try this: 99:59:59.999 --> 99:59:59.999 imagine you've run into your friend who broke their leg skiing. 99:59:59.999 --> 99:59:59.999 They took a risk to do something fun, and it didn't end well. 
99:59:59.999 --> 99:59:59.999 But you're probably not going to be the jerk who says, 99:59:59.999 --> 99:59:59.999 "Well, I guess you shouldn't have gone skiing then!" 99:59:59.999 --> 99:59:59.999 If we think more about consent, 99:59:59.999 --> 99:59:59.999 we can see that victims of privacy violations deserve our compassion, 99:59:59.999 --> 99:59:59.999 not criminalization, shaming, harassment, or punishment. 99:59:59.999 --> 99:59:59.999 We can support victims, and we can prevent some privacy violations 99:59:59.999 --> 99:59:59.999 by making these legal, individual, and technological changes. 99:59:59.999 --> 99:59:59.999 Because the problem is not sexting, the issue is digital privacy 99:59:59.999 --> 99:59:59.999 and one solution is consent. 99:59:59.999 --> 99:59:59.999 So the next time a victim of a privacy violation comes up to you, 99:59:59.999 --> 99:59:59.999 instead of blaming them, let's do this instead: 99:59:59.999 --> 99:59:59.999 let's shift our ideas about digital privacy 99:59:59.999 --> 99:59:59.999 and let's respond with compassion. 99:59:59.999 --> 99:59:59.999 Thank you. 99:59:59.999 --> 99:59:59.999 (Applause)