WEBVTT 00:00:18.824 --> 00:00:23.854 People have been using media to talk about sex for a long time. 00:00:23.854 --> 00:00:27.715 Love letters, phone sex, racy polaroids. 00:00:27.715 --> 00:00:31.335 There's even a story of a girl who eloped with a man 00:00:31.335 --> 00:00:36.636 that she met over the telegraph in 1886. 00:00:36.636 --> 00:00:39.296 Today we have sexting 00:00:39.296 --> 00:00:41.937 and I am a sexting expert. 00:00:41.937 --> 00:00:43.837 Not an expert sexter-- 00:00:43.837 --> 00:00:45.146 (Laughter) 00:00:45.146 --> 00:00:48.027 Though, I do know what this means 00:00:48.027 --> 00:00:50.137 and I think you do too! 00:00:50.137 --> 00:00:52.656 (Laughter) 00:00:52.656 --> 00:00:54.707 I have been studying sexting 00:00:54.707 --> 00:00:58.627 since the media attention to it began in 2008. 00:00:58.627 --> 00:01:01.790 I wrote a book on the moral panic about sexting, 00:01:01.790 --> 00:01:03.439 and here's what I found: 00:01:03.439 --> 00:01:06.669 Most people are worrying about the wrong thing. 00:01:06.669 --> 00:01:10.779 They're trying to just prevent sexting from happening entirely, 00:01:10.779 --> 00:01:12.429 but let me ask you this: 00:01:12.429 --> 00:01:15.359 as long as it is completely consensual, what's the problem with sexting? 00:01:15.359 --> 00:01:21.400 People are into all sorts of things that you may not be into, 00:01:21.400 --> 00:01:23.756 like blue cheese or cilantro. 00:01:23.756 --> 00:01:26.831 (Laughter) 00:01:26.831 --> 00:01:30.962 Sexting is certainly risky, like anything that's fun, 00:01:30.962 --> 00:01:35.271 but as long as you're not sending any image 00:01:35.271 --> 00:01:37.522 to someone who doesn't want to receive it, 00:01:37.522 --> 00:01:39.401 there's no harm. 
00:01:39.401 --> 00:01:41.972 What I do think is a serious problem 00:01:41.972 --> 00:01:47.283 is when people share private images of others without their permission, 00:01:47.283 --> 00:01:49.882 and instead of worrying about sexting, 00:01:49.882 --> 00:01:54.853 what I think we need to do is think a lot more about digital privacy. 00:01:54.853 --> 00:01:57.754 The key is consent. 00:01:57.754 --> 00:02:01.134 Right now people are thinking about sexting 00:02:01.134 --> 00:02:04.505 without really thinking about consent at all. 00:02:04.505 --> 00:02:09.485 Did you know that we currently criminalize teen sexting? 00:02:09.485 --> 00:02:13.075 It can be a crime because it counts as child pornography 00:02:13.075 --> 00:02:15.716 if there's an image of someone under eighteen, 00:02:15.716 --> 00:02:19.475 and it doesn't even matter if they took that image of themselves 00:02:19.475 --> 00:02:21.836 and shared it willingly. 00:02:21.836 --> 00:02:24.716 So we end up with this bizarre legal situation 00:02:24.716 --> 00:02:29.226 where two 17-year-olds can have sex in most U.S. states, 00:02:29.226 --> 00:02:32.057 but they can't photograph it. 00:02:32.057 --> 00:02:36.978 Some states have also tried passing sexting misdemeanor laws, 00:02:36.978 --> 00:02:40.008 but these laws repeat the same problem 00:02:40.008 --> 00:02:44.628 because they still make consensual sexting illegal. 00:02:44.628 --> 00:02:47.998 It doesn't make sense to try to ban all sexting 00:02:47.998 --> 00:02:50.539 to try to address privacy violations. 00:02:50.539 --> 00:02:52.099 This is kind of like saying, 00:02:52.099 --> 00:02:59.010 "Let's solve the problem of date rape by just making dating completely illegal." 00:02:59.010 --> 00:03:04.720 Most teens don't get arrested for sexting, but can you guess who does? 
00:03:04.720 --> 00:03:09.362 It's often teens who are disliked by their partner's parents, 00:03:09.362 --> 00:03:15.101 and this can be because of class bias, racism, or homophobia. 00:03:15.375 --> 00:03:17.776 Most prosecutors are, of course, smart enough 00:03:17.776 --> 00:03:21.865 not to use child pornography charges against teenagers, 00:03:21.865 --> 00:03:23.735 but some do. 00:03:23.735 --> 00:03:27.335 According to researchers at the University of New Hampshire, 00:03:27.335 --> 00:03:32.007 seven percent of all child pornography possession arrests 00:03:32.007 --> 00:03:37.587 are teens sexting consensually with other teens. 00:03:37.587 --> 00:03:40.168 Child pornography is a serious crime 00:03:40.168 --> 00:03:44.808 but it's just not the same thing as teen sexting. 00:03:44.808 --> 00:03:48.428 Parents and educators are also responding to sexting 00:03:48.428 --> 00:03:51.739 without really thinking too much about consent. 00:03:51.739 --> 00:03:56.259 Their message to teens is often just don't do it, 00:03:56.259 --> 00:03:57.819 and I totally get it. 00:03:57.819 --> 00:03:59.869 There are serious legal risks 00:03:59.869 --> 00:04:03.300 and, of course, that potential for privacy violations. 00:04:03.300 --> 00:04:09.466 And when you were a teen, I'm sure you did exactly as you were told, right? 00:04:09.466 --> 00:04:12.931 You're probably thinking, "My kid would never sext," 00:04:12.931 --> 00:04:16.427 and that's true; your little angel may not be sexting 00:04:16.427 --> 00:04:23.118 because only 33 percent of 16- and 17-year-olds are sexting. 00:04:23.118 --> 00:04:25.648 But, sorry, by the time they're older 00:04:25.648 --> 00:04:27.981 the odds are that they will be sexting. 00:04:27.981 --> 00:04:34.873 Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds. 00:04:34.873 --> 00:04:38.483 And most of the time, nothing goes wrong.
00:04:38.483 --> 00:04:41.103 People ask me all the time things like, 00:04:41.103 --> 00:04:43.933 "Isn't sexting just so dangerous, though? 00:04:43.933 --> 00:04:47.493 You wouldn't leave your wallet on a park bench. 00:04:47.493 --> 00:04:51.394 You expect it's going to get stolen if you do that, right?" 00:04:51.394 --> 00:04:52.925 Here's how I think about it: 00:04:52.925 --> 00:04:56.775 Sexting is like leaving your wallet at your boyfriend's house. 00:04:56.775 --> 00:05:01.405 If you come back the next day and all the money is just gone, 00:05:01.405 --> 00:05:04.276 you really need to dump that guy. 00:05:04.276 --> 00:05:08.015 (Laughter) 00:05:08.015 --> 00:05:13.036 So instead of criminalizing sexting to try to prevent these privacy violations, 00:05:13.036 --> 00:05:16.347 we need to make consent central 00:05:16.347 --> 00:05:20.858 to how we think about the circulation of our private information. 00:05:20.858 --> 00:05:25.488 Every new media technology raises privacy concerns; 00:05:25.488 --> 00:05:29.748 in fact, in the U.S., the first major debates about privacy 00:05:29.748 --> 00:05:34.519 were in response to technologies that were relatively new at the time. 00:05:34.519 --> 00:05:38.509 In the late 1800s, people were worried about cameras, 00:05:38.509 --> 00:05:41.930 which were just suddenly more portable than ever before, 00:05:41.930 --> 00:05:44.410 and newspaper gossip columns. 00:05:44.410 --> 00:05:48.310 They were worried that the camera would capture information about them, 00:05:48.310 --> 00:05:51.869 take it out of context, and widely disseminate it. 00:05:51.869 --> 00:05:53.591 Does that sound familiar? 00:05:53.591 --> 00:05:58.370 It's exactly what we're worrying about now with social media, drone cameras, 00:05:58.370 --> 00:06:00.581 and of course, sexting.
00:06:00.581 --> 00:06:03.852 And these fears about technology, they make sense 00:06:03.852 --> 00:06:06.452 because technologies can amplify 00:06:06.452 --> 00:06:10.563 and bring out our worst qualities and behaviors. 00:06:10.563 --> 00:06:13.013 But there are solutions 00:06:13.013 --> 00:06:17.303 and we've been here before with a dangerous new technology. 00:06:17.303 --> 00:06:21.193 In 1908, Ford introduced the Model T car. 00:06:21.193 --> 00:06:25.142 Traffic fatality rates were rising; it was a serious problem. 00:06:25.142 --> 00:06:26.833 It looks so safe, right? 00:06:26.833 --> 00:06:29.654 (Laughter) 00:06:29.654 --> 00:06:33.485 Our first response was to try to change drivers' behavior, 00:06:33.485 --> 00:06:37.665 so we developed speed limits and enforced them through fines. 00:06:37.665 --> 00:06:40.765 But over the following decades we started to realize 00:06:40.765 --> 00:06:44.876 that the technology of the car itself is not just neutral. 00:06:44.876 --> 00:06:48.206 We could design the car to make it safer. 00:06:48.206 --> 00:06:51.706 So in the 1920s, we got shatter-resistant windshields, 00:06:51.706 --> 00:06:58.256 in the 1950s, seat belts, and in the 1990s, air bags. 00:06:58.256 --> 00:07:03.697 All three of these areas, laws, individuals, and industry, 00:07:03.697 --> 00:07:09.488 came together over time to help solve the problems that a new technology causes 00:07:09.488 --> 00:07:13.148 and we can do the same thing with digital privacy. 00:07:13.148 --> 00:07:16.478 Of course, it comes back to consent. 00:07:16.478 --> 00:07:17.688 Here's the idea: 00:07:17.688 --> 00:07:21.379 before anyone can distribute your private information, 00:07:21.379 --> 00:07:24.259 they should have to get your permission. 00:07:24.259 --> 00:07:29.020 This idea of affirmative consent comes from anti-rape activists 00:07:29.020 --> 00:07:32.940 who tell us that we need consent for every sexual act. 
00:07:32.940 --> 00:07:37.000 And we have really high levels of consent in other areas. 00:07:37.607 --> 00:07:39.507 Think about having surgery. 00:07:39.507 --> 00:07:42.429 Your doctor has to make sure that you are meaningfully 00:07:42.429 --> 00:07:45.639 and knowingly consenting to that medical procedure. 00:07:45.639 --> 00:07:49.208 This is not the type of consent like an iTunes terms of service 00:07:49.208 --> 00:07:53.009 where you just scroll to the bottom and you're like, "Agree, agree, whatever." 00:07:53.009 --> 00:07:54.460 (Laughter) 00:07:55.289 --> 00:08:00.600 If we think more about consent, we can have better privacy laws. 00:08:00.600 --> 00:08:03.881 Right now we just don't have that many protections. 00:08:03.881 --> 00:08:07.381 If your ex-husband or your ex-wife is a terrible person, 00:08:07.381 --> 00:08:11.781 they can take your nude photos and upload them to a porn site. 00:08:11.781 --> 00:08:14.750 It can be really hard to get those images taken down 00:08:14.750 --> 00:08:16.202 and in a lot of states, 00:08:16.202 --> 00:08:20.022 you're actually better off if you took the images of yourself 00:08:20.022 --> 00:08:23.122 because then you can file a copyright claim. 00:08:23.122 --> 00:08:24.410 (Laughter) 00:08:25.613 --> 00:08:28.433 Right now if someone violates your privacy, 00:08:28.433 --> 00:08:32.544 whether that's an individual or a company or the NSA, 00:08:32.544 --> 00:08:38.224 you can try filing a lawsuit, but you may not be successful 00:08:38.224 --> 00:08:42.994 because many courts assume that digital privacy is just impossible 00:08:42.994 --> 00:08:46.954 so they're not willing to punish anyone for violating it. 00:08:46.954 --> 00:08:50.055 I still hear people asking me all the time, 00:08:50.055 --> 00:08:55.366 "Isn't a digital image somehow blurring the line between public and private 00:08:55.366 --> 00:08:57.626 because it's digital, right?" 00:08:57.626 --> 00:08:59.066 No, no!
00:08:59.066 --> 00:09:02.236 Everything digital is not just automatically public. 00:09:02.236 --> 00:09:04.256 That doesn't make any sense. 00:09:04.256 --> 00:09:07.907 As NYU legal scholar Helen Nissenbaum tells us, 00:09:07.907 --> 00:09:10.666 we have laws and policies and norms 00:09:10.666 --> 00:09:13.877 that protect all kinds of information that's private, 00:09:13.877 --> 00:09:16.996 and it doesn't make a difference if it's digital or not. 00:09:16.996 --> 00:09:19.748 All of your health records are digitized 00:09:19.748 --> 00:09:22.778 but your doctor can't just share them with anyone. 00:09:22.778 --> 00:09:27.098 All of your financial information is held in digital databases 00:09:27.098 --> 00:09:33.029 but your credit card company can't just post your purchase history online. 00:09:33.029 --> 00:09:38.060 Better laws could help address privacy violations after they happen, 00:09:38.060 --> 00:09:41.139 but one of the easiest things we can all do 00:09:41.139 --> 00:09:46.400 is make personal changes to help protect each other's privacy. 00:09:46.400 --> 00:09:51.441 We're always told that privacy is our sole individual responsibility. 00:09:51.441 --> 00:09:55.611 We're told, "Constantly monitor and update your privacy settings." 00:09:55.611 --> 00:10:01.072 We're told, "Never share anything you wouldn't want the entire world to see." 00:10:01.072 --> 00:10:02.792 This doesn't make sense. 00:10:02.792 --> 00:10:05.582 Digital media are social environments 00:10:05.582 --> 00:10:10.833 and we share things with people we trust all day, every day. 00:10:10.833 --> 00:10:13.743 As Princeton researcher Janet Vertesi argues, 00:10:13.743 --> 00:10:15.923 our data and our privacy, 00:10:15.923 --> 00:10:20.494 they're not just personal, they're interpersonal. 00:10:20.494 --> 00:10:25.954 So one thing you can do that's really easy is just start asking for permission 00:10:25.954 --> 00:10:28.794 before you share anyone else's information.
00:10:28.794 --> 00:10:33.335 If you want to post a photo of someone online, ask for permission. 00:10:33.335 --> 00:10:37.076 If you want to forward an email thread, ask for permission. 00:10:37.076 --> 00:10:39.776 If you want to share someone's nude selfie, 00:10:39.776 --> 00:10:43.606 obviously, ask for permission! 00:10:43.606 --> 00:10:47.746 These individual changes can help us protect each other's privacy, 00:10:47.746 --> 00:10:52.386 but we need technology companies on board as well. 00:10:52.386 --> 00:10:56.867 These companies have very little incentive to help protect our privacy 00:10:56.867 --> 00:10:59.139 because their business models depend on us 00:10:59.139 --> 00:11:02.978 sharing everything with as many people as possible. 00:11:02.978 --> 00:11:05.108 Right now, if I send you an image, 00:11:05.108 --> 00:11:08.218 you can forward that to anyone that you want. 00:11:08.218 --> 00:11:12.559 But what if I got to decide if that image was forwardable or not? 00:11:12.559 --> 00:11:16.509 This would tell you, "You don't have my permission to send this image out." 00:11:16.509 --> 00:11:20.600 We do this kind of thing all the time to protect copyright. 00:11:20.600 --> 00:11:25.420 If you buy an ebook, you can't just send it out to as many people as you want, 00:11:25.420 --> 00:11:29.311 so why not try this with mobile phones? 00:11:29.311 --> 00:11:32.291 What we can do is demand that tech companies 00:11:32.291 --> 00:11:37.851 add these protections to our devices and our platforms as the default. 00:11:37.851 --> 00:11:41.061 After all, you can choose the color of your car, 00:11:41.061 --> 00:11:45.602 but the airbags are always standard. 00:11:45.602 --> 00:11:49.333 If we don't think more about digital privacy and consent, 00:11:49.333 --> 00:11:52.573 there can be serious consequences. 00:11:52.573 --> 00:11:55.003 There was a teenager from Ohio. 00:11:55.003 --> 00:11:58.509 Let's call her Jennifer for the sake of her privacy.
00:11:58.509 --> 00:12:01.823 She shared nude photos of herself with her high school boyfriend 00:12:01.823 --> 00:12:04.740 thinking she could trust him. 00:12:04.740 --> 00:12:10.344 Unfortunately, he betrayed her and sent her photos around the entire school. 00:12:10.344 --> 00:12:14.294 Jennifer was embarrassed and humiliated, 00:12:14.294 --> 00:12:18.385 but instead of being compassionate, her classmates harassed her. 00:12:18.385 --> 00:12:22.806 They called her a slut and a whore and they made her life miserable. 00:12:22.806 --> 00:12:26.985 Jennifer started missing school, and her grades dropped. 00:12:26.985 --> 00:12:32.067 Ultimately, Jennifer decided to end her own life. 00:12:32.067 --> 00:12:34.697 Jennifer did nothing wrong. 00:12:34.697 --> 00:12:37.327 All she did was share a nude photo 00:12:37.327 --> 00:12:39.818 with someone that she thought she could trust. 00:12:39.818 --> 00:12:42.748 And yet, our laws tell her 00:12:42.748 --> 00:12:47.317 that she committed a horrible crime equivalent to child pornography. 00:12:47.317 --> 00:12:52.068 Our gender norms tell her that by producing this nude image of herself, 00:12:52.068 --> 00:12:55.779 she somehow did the most horrible, shameful thing. 00:12:55.779 --> 00:13:00.040 And when we assume that privacy is impossible in digital media, 00:13:00.040 --> 00:13:06.420 we completely write off and excuse her boyfriend's bad, bad behavior. 00:13:06.420 --> 00:13:12.251 People are still saying all the time to victims of privacy violations, 00:13:12.251 --> 00:13:13.740 "What were you thinking? 00:13:13.740 --> 00:13:17.050 You should've never sent that image." 00:13:17.050 --> 00:13:21.632 If you're trying to figure out what to say instead, try this: 00:13:21.632 --> 00:13:25.832 imagine you've run into your friend who broke their leg skiing. 00:13:25.832 --> 00:13:30.342 They took a risk to do something fun, and it didn't end well.
00:13:30.342 --> 00:13:32.953 But you're probably not going to be the jerk who says, 00:13:32.953 --> 00:13:37.493 "Well, I guess you shouldn't have gone skiing then!" 00:13:37.493 --> 00:13:39.833 If we think more about consent, 00:13:39.833 --> 00:13:44.414 we can see that victims of privacy violations deserve our compassion, 00:13:44.414 --> 00:13:49.784 not criminalization, shaming, harassment, or punishment. 00:13:49.784 --> 00:13:54.335 We can support victims, and we can prevent some privacy violations 00:13:54.335 --> 00:13:59.285 by making these legal, individual, and technological changes. 00:13:59.285 --> 00:14:04.865 Because the problem is not sexting, the issue is digital privacy 00:14:04.865 --> 00:14:08.126 and one solution is consent. 00:14:08.126 --> 00:14:12.637 So the next time a victim of a privacy violation comes up to you, 00:14:12.637 --> 00:14:15.596 instead of blaming them, let's do this: 00:14:15.596 --> 00:14:18.807 let's shift our ideas about digital privacy 00:14:18.807 --> 00:14:21.907 and let's respond with compassion. 00:14:21.907 --> 00:14:23.437 Thank you. 00:14:23.437 --> 00:14:25.227 (Applause)