WEBVTT

00:00:00.829 --> 00:00:04.827
People have been using media to talk about sex for a long time.

00:00:05.692 --> 00:00:08.463
Love letters, phone sex, racy Polaroids.

00:00:09.549 --> 00:00:15.278
There's even a story of a girl who eloped with a man that she met over the telegraph

00:00:15.627 --> 00:00:16.962
in 1886.

00:00:18.761 --> 00:00:23.299
Today we have sexting, and I am a sexting expert.

00:00:23.861 --> 00:00:25.884
Not an expert sexter.

00:00:26.999 --> 00:00:30.825
Though, I do know what this means -- I think you do too.

00:00:31.425 --> 00:00:35.475
(Laughter)

NOTE Paragraph

00:00:36.584 --> 00:00:41.994
I have been studying sexting since the media attention to it began in 2008.

00:00:42.802 --> 00:00:45.681
I wrote a book on the moral panic about sexting.

00:00:45.991 --> 00:00:47.375
And here's what I found:

00:00:47.575 --> 00:00:50.249
most people are worrying about the wrong thing.

00:00:50.618 --> 00:00:54.074
They're trying to just prevent sexting from happening entirely.

00:00:54.920 --> 00:00:56.516
But let me ask you this:

00:00:56.586 --> 00:01:00.732
As long as it's completely consensual, what's the problem with sexting?

00:01:01.208 --> 00:01:04.969
People are into all sorts of things that you may not be into,

00:01:05.402 --> 00:01:07.575
like blue cheese or cilantro.

00:01:07.978 --> 00:01:09.982
(Laughter)

00:01:10.778 --> 00:01:14.318
Sexting is certainly risky, like anything that's fun,

00:01:15.015 --> 00:01:20.901
but as long as you're not sending an image to someone who doesn't want to receive it,

00:01:21.639 --> 00:01:23.111
there's no harm.

00:01:23.370 --> 00:01:29.044
What I do think is a serious problem is when people share private images of others

00:01:29.303 --> 00:01:31.247
without their permission.

00:01:31.455 --> 00:01:35.049
And instead of worrying about sexting, what I think we need to do

00:01:35.346 --> 00:01:38.110
is think a lot more about digital privacy.

00:01:38.894 --> 00:01:40.895
The key is consent.

00:01:41.871 --> 00:01:44.109
Right now most people are thinking

00:01:44.109 --> 00:01:47.497
about sexting without really thinking about consent at all.

NOTE Paragraph

00:01:48.427 --> 00:01:52.188
Did you know that we currently criminalize teen sexting?

00:01:53.414 --> 00:01:56.887
It can be a crime because it counts as child pornography,

00:01:56.950 --> 00:01:59.602
if there's an image of someone under 18

00:01:59.783 --> 00:02:01.418
and it doesn't even matter

00:02:01.418 --> 00:02:05.106
if they took that image of themselves and shared it willingly.

00:02:05.902 --> 00:02:08.475
So we end up with this bizarre legal situation

00:02:08.963 --> 00:02:12.793
where two 17-year-olds can legally have sex in most U.S. states

00:02:13.447 --> 00:02:15.676
but they can't photograph it.

00:02:16.767 --> 00:02:20.601
Some states have also tried passing sexting misdemeanor laws

00:02:20.970 --> 00:02:23.773
but these laws repeat the same problem

00:02:24.035 --> 00:02:27.256
because they still make consensual sexting illegal.

00:02:28.706 --> 00:02:29.843
It doesn't make sense

00:02:29.883 --> 00:02:33.719
to try to ban all sexting to try to address privacy violations.

00:02:34.597 --> 00:02:36.143
This is kind of like saying,

00:02:36.202 --> 00:02:41.525
let's solve the problem of date rape by just making dating completely illegal.

00:02:43.276 --> 00:02:48.061
Most teens don't get arrested for sexting, but can you guess who does?

00:02:48.575 --> 00:02:52.565
It's often teens who are disliked by their partner's parents.
00:02:53.596 --> 00:02:58.036
And this can be because of class bias, racism or homophobia.

00:02:59.115 --> 00:03:01.756
Most prosecutors are, of course, smart enough

00:03:01.756 --> 00:03:07.283
not to use child pornography charges against teenagers, but some do.

NOTE Paragraph

00:03:07.753 --> 00:03:10.814
According to researchers at the University of New Hampshire,

00:03:11.311 --> 00:03:16.624
seven percent of all child pornography possession arrests are teens

00:03:17.098 --> 00:03:19.863
sexting consensually with other teens.

00:03:21.512 --> 00:03:24.024
Child pornography is a serious crime,

00:03:24.223 --> 00:03:27.606
but it's just not the same thing as teen sexting.

00:03:29.121 --> 00:03:32.295
Parents and educators are also responding to sexting

00:03:32.558 --> 00:03:35.191
without really thinking too much about consent.

00:03:35.698 --> 00:03:39.707
Their message to teens is often: just don't do it.

NOTE Paragraph

00:03:40.242 --> 00:03:43.223
And I totally get it -- there are serious legal risks

00:03:43.822 --> 00:03:46.872
and, of course, that potential for privacy violations.

00:03:47.351 --> 00:03:48.712
And when you were a teen,

00:03:48.872 --> 00:03:52.074
I'm sure you did exactly as you were told, right?

00:03:53.571 --> 00:03:56.613
You're probably thinking, my kid would never sext.

00:03:57.100 --> 00:03:59.694
And that's true, your little angel may not be sexting

00:04:00.434 --> 00:04:05.934
because only 33% of 16- and 17-year-olds are sexting.

00:04:07.214 --> 00:04:11.224
But, sorry, by the time they're older, odds are they will be sexting.

00:04:12.024 --> 00:04:17.569
Every study I've seen puts the rate above 50% for 18- to 24-year-olds.

00:04:18.856 --> 00:04:21.176
And most of the time, nothing goes wrong.

00:04:22.041 --> 00:04:27.264
People ask me all the time things like, isn't sexting just so dangerous, though,

00:04:27.567 --> 00:04:31.121
like you wouldn't leave your wallet on a park bench

00:04:31.253 --> 00:04:34.108
and you expect it's gonna get stolen if you do that, right?

00:04:34.738 --> 00:04:36.339
Here's how I think about it:

00:04:36.595 --> 00:04:39.795
sexting is like leaving your wallet at your boyfriend's house.

00:04:40.385 --> 00:04:42.192
If you come back the next day

00:04:42.409 --> 00:04:47.150
and all the money is just gone, you really need to dump that guy.

00:04:51.616 --> 00:04:56.058
So instead of criminalizing sexting to try to prevent these privacy violations,

00:04:56.645 --> 00:04:59.223
we need to make consent central

00:04:59.810 --> 00:05:03.447
to how we think about the circulation of our private information.

NOTE Paragraph

00:05:04.673 --> 00:05:08.460
Every new media technology raises privacy concerns.

00:05:08.962 --> 00:05:13.446
In fact, in the U.S. the very first major debates about privacy

00:05:13.459 --> 00:05:17.110
were in response to technologies that were relatively new at the time.

00:05:17.969 --> 00:05:20.722
In the late 1800s, people were worried about cameras,

00:05:22.057 --> 00:05:25.169
which were just suddenly more portable than ever before,

00:05:25.619 --> 00:05:27.696
and newspaper gossip columns.

00:05:28.056 --> 00:05:31.586
They were worried that the camera would capture information about them,

00:05:31.905 --> 00:05:34.793
take it out of context and widely disseminate it.

00:05:35.371 --> 00:05:37.139
Does this sound familiar?

00:05:37.359 --> 00:05:41.433
It's exactly what we're worrying about now with social media and drone cameras,
00:05:41.888 --> 00:05:43.974
and, of course, sexting.

00:05:44.197 --> 00:05:48.750
And these fears about technology, they make sense because technologies

00:05:49.034 --> 00:05:53.713
can amplify and bring out our worst qualities and behaviors.

00:05:54.170 --> 00:05:56.300
But there are solutions.

00:05:56.727 --> 00:06:00.431
And we've been here before with a dangerous new technology.

NOTE Paragraph

00:06:00.857 --> 00:06:04.542
In 1908, Ford introduced the Model T car.

00:06:04.742 --> 00:06:06.933
Traffic fatality rates were rising.

00:06:07.290 --> 00:06:10.443
It was a serious problem -- it looks so safe, right?

00:06:12.224 --> 00:06:15.869
Our first response was to try to change drivers' behavior,

00:06:16.423 --> 00:06:19.434
so we developed speed limits and enforced them through fines.

00:06:20.362 --> 00:06:22.238
But over the following decades,

00:06:22.238 --> 00:06:26.869
we started to realize the technology of the car itself is not just neutral.

00:06:27.795 --> 00:06:30.867
We could design the car to make it safer.

00:06:31.143 --> 00:06:34.143
So in the 1920s, we got shatter-resistant windshields.

00:06:34.593 --> 00:06:39.429
In the 1950s, seatbelts. And in the 1990s, airbags.

00:06:40.540 --> 00:06:42.184
All three of these areas:

00:06:42.832 --> 00:06:46.381
laws, individuals, and industry came together

00:06:46.939 --> 00:06:51.093
over time to help solve the problem that a new technology causes.

00:06:51.537 --> 00:06:54.793
And we can do the same thing with digital privacy.

00:06:55.310 --> 00:06:57.927
Of course, it comes back to consent.

00:06:58.422 --> 00:06:59.938
Here's the idea.

00:07:00.102 --> 00:07:03.004
Before anyone can distribute your private information,

00:07:03.589 --> 00:07:05.683
they should have to get your permission.

00:07:06.391 --> 00:07:11.495
This idea of affirmative consent comes from anti-rape activists who tell us

00:07:11.766 --> 00:07:14.534
that we need consent for every sexual act.

00:07:15.088 --> 00:07:19.069
And we have really high standards for consent in a lot of other areas.

NOTE Paragraph

00:07:19.486 --> 00:07:21.275
Think about having surgery.

00:07:21.553 --> 00:07:23.214
Your doctor has to make sure that

00:07:23.214 --> 00:07:26.891
you are meaningfully and knowingly consenting to that medical procedure.

00:07:27.732 --> 00:07:31.243
This is not the type of consent like with an iTunes Terms of Service

00:07:31.435 --> 00:07:35.444
where you scroll to the bottom and you're like, agree, agree, whatever.

00:07:37.363 --> 00:07:41.724
If we think more about consent, we can have better privacy laws.

00:07:42.598 --> 00:07:45.741
Right now, we just don't have that many protections.

00:07:46.033 --> 00:07:49.075
If your ex-husband or your ex-wife is a terrible person,

00:07:49.569 --> 00:07:53.489
they can take your nude photos and upload them to a porn site.

00:07:53.907 --> 00:07:56.664
It can be really hard to get those images taken down.

00:07:57.047 --> 00:07:59.496
And in a lot of states, you're actually better off

00:07:59.963 --> 00:08:01.922
if you took the images of yourself

00:08:02.203 --> 00:08:04.687
because then you can file a copyright claim.

00:08:07.601 --> 00:08:10.274
Right now, if someone violates your privacy,

00:08:10.610 --> 00:08:14.633
whether that's an individual or a company or the NSA,

00:08:15.361 --> 00:08:19.479
you can try filing a lawsuit, though you may not be successful

00:08:20.331 --> 00:08:24.481
because many courts assume that digital privacy is just impossible.
00:08:25.248 --> 00:08:28.739
So they're not willing to punish anyone for violating it.

00:08:29.313 --> 00:08:32.182
I still hear people asking me all the time,

00:08:32.460 --> 00:08:37.478
isn't a digital image somehow blurring the line between public and private

00:08:37.667 --> 00:08:39.574
because it's digital, right?

00:08:39.757 --> 00:08:40.953
No! No!

00:08:41.299 --> 00:08:44.362
Everything digital is not just automatically public.

00:08:44.362 --> 00:08:46.275
That doesn't make any sense.

00:08:46.482 --> 00:08:49.313
As NYU legal scholar Helen Nissenbaum tells us,

00:08:49.896 --> 00:08:52.841
we have laws and policies and norms that protect

00:08:53.077 --> 00:08:56.572
all kinds of information that's private, and it doesn't make a difference

00:08:56.885 --> 00:08:58.911
if it's digital or not.

00:08:59.221 --> 00:09:02.435
All of your health records are digitized, but your doctor

00:09:02.652 --> 00:09:04.653
can't just share them with anyone.

00:09:04.911 --> 00:09:08.979
All of your financial information is held in digital databases,

00:09:09.379 --> 00:09:13.483
but your credit card company can't just post your purchase history online.

00:09:15.247 --> 00:09:19.866
Better laws could help address privacy violations after they happen,

00:09:20.298 --> 00:09:24.891
but one of the easiest things we can all do is make personal changes

00:09:25.244 --> 00:09:27.863
to help protect each other's privacy.

00:09:28.673 --> 00:09:33.086
We're always told that privacy is our own, sole, individual responsibility.

00:09:33.536 --> 00:09:37.147
We're told, constantly monitor and update your privacy settings.

00:09:37.703 --> 00:09:42.452
We're told never share anything you wouldn't want the entire world to see.

00:09:43.435 --> 00:09:44.924
This doesn't make sense.

00:09:44.924 --> 00:09:47.517
Digital media are social environments

00:09:47.721 --> 00:09:51.936
and we share things with people we trust all day, every day.

00:09:52.897 --> 00:09:55.247
As Princeton researcher Janet Vertesi argues,

00:09:55.816 --> 00:09:59.449
our data and our privacy, they're not just personal,

00:09:59.889 --> 00:10:02.469
they're actually interpersonal.

00:10:02.645 --> 00:10:05.286
And so one thing you can do that's really easy

00:10:05.820 --> 00:10:10.224
is just start asking for permission before you share anyone else's information.

00:10:11.045 --> 00:10:14.555
If you want to post a photo of someone online, ask for permission.

00:10:15.401 --> 00:10:18.649
If you want to forward an email thread, ask for permission.

00:10:19.346 --> 00:10:24.002
And if you want to share someone's nude selfie, obviously, ask for permission.

00:10:25.807 --> 00:10:29.632
These individual changes can really help us protect each other's privacy,

00:10:30.092 --> 00:10:33.729
but we need technology companies on board as well.

00:10:34.552 --> 00:10:38.414
These companies have very little incentive to help protect our privacy

00:10:38.961 --> 00:10:42.194
because their business models depend on us sharing everything

00:10:42.398 --> 00:10:44.804
with as many people as possible.

00:10:45.231 --> 00:10:47.147
Right now, if I send you an image,

00:10:47.147 --> 00:10:49.489
you can forward that to anyone that you want.

00:10:50.264 --> 00:10:54.228
But what if I got to decide if that image was forwardable or not?

00:10:54.595 --> 00:10:58.534
This would tell you, you don't have my permission to send this image out.

00:10:58.699 --> 00:11:02.115
We do this kind of thing all the time to protect copyright.
00:11:02.849 --> 00:11:07.160
If you buy an e-book, you can't just send it out to as many people as you want.

00:11:07.620 --> 00:11:10.441
So why not try this with mobile phones?

00:11:11.229 --> 00:11:15.378
What we can do is demand that tech companies add these protections

00:11:15.814 --> 00:11:19.041
to our devices and our platforms as the default.

00:11:19.452 --> 00:11:22.373
After all, you can choose the color of your car,

00:11:22.898 --> 00:11:25.947
but the airbags are always standard.

00:11:28.192 --> 00:11:32.007
If we don't think more about digital privacy and consent,

00:11:32.331 --> 00:11:34.526
there can be serious consequences.

00:11:35.556 --> 00:11:37.336
There was a teenager from Ohio --

00:11:37.776 --> 00:11:40.462
let's call her Jennifer, for the sake of her privacy.

00:11:41.192 --> 00:11:43.094
She shared nude photos of herself

00:11:43.103 --> 00:11:46.091
with her high-school boyfriend, thinking she could trust him.

00:11:47.890 --> 00:11:52.381
Unfortunately, he betrayed her and sent her photos around the entire school.

00:11:52.818 --> 00:11:55.934
Jennifer was embarrassed and humiliated,

00:11:56.783 --> 00:12:00.934
but instead of being compassionate, her classmates harassed her.

00:12:01.174 --> 00:12:03.057
They called her a slut and a whore

00:12:03.057 --> 00:12:04.984
and they made her life miserable.

00:12:05.593 --> 00:12:08.969
Jennifer started missing school and her grades dropped.

00:12:09.623 --> 00:12:13.229
Ultimately, Jennifer decided to end her own life.

00:12:14.930 --> 00:12:17.188
Jennifer did nothing wrong.

00:12:17.560 --> 00:12:19.659
All she did was share a nude photo

00:12:19.998 --> 00:12:22.267
with someone she thought that she could trust.

00:12:22.638 --> 00:12:25.005
And yet our laws tell her

00:12:25.459 --> 00:12:29.343
that she committed a horrible crime equivalent to child pornography.

00:12:29.981 --> 00:12:31.739
Our gender norms tell her that

00:12:31.971 --> 00:12:34.444
by producing this nude image of herself,

00:12:34.802 --> 00:12:38.121
she somehow did the most horrible, shameful thing.

00:12:38.700 --> 00:12:42.141
And when we assume that privacy is impossible in digital media,

00:12:42.802 --> 00:12:48.098
we completely write off and excuse her boyfriend's bad, bad behavior.

00:12:49.216 --> 00:12:54.098
People are still saying all the time to victims of privacy violations,

00:12:54.887 --> 00:12:56.488
what were you thinking?

00:12:56.618 --> 00:12:58.872
You should have never sent that image.

00:12:59.764 --> 00:13:03.555
If you're trying to figure out what to say instead, try this.

00:13:04.209 --> 00:13:08.066
Imagine you've run into your friend who broke their leg skiing.

00:13:08.493 --> 00:13:12.253
They took a risk to do something fun and it didn't end well.

00:13:12.918 --> 00:13:15.746
But you're probably not going to be the jerk who says, well,

00:13:16.116 --> 00:13:18.748
I guess you shouldn't have gone skiing then.

00:13:20.302 --> 00:13:23.595
If we think more about consent, we can see that

00:13:23.880 --> 00:13:27.075
victims of privacy violations deserve our compassion,

00:13:27.628 --> 00:13:31.678
not criminalization, shaming, harassment, or punishment.

00:13:32.590 --> 00:13:36.926
We can support victims and we can prevent some privacy violations

00:13:37.119 --> 00:13:41.045
by making these legal, individual and technological changes.

00:13:41.905 --> 00:13:47.206
Because the problem is not sexting, the issue is digital privacy.
00:13:47.865 --> 00:13:50.590
And one solution is consent.

00:13:50.956 --> 00:13:54.349
So the next time a victim of a privacy violation comes up to you,

00:13:55.265 --> 00:13:57.074
instead of blaming them,

00:13:57.254 --> 00:13:58.453
let's do this instead:

00:13:58.453 --> 00:14:01.833
Let's shift our ideas about digital privacy and

00:14:01.833 --> 00:14:03.776
let's respond with compassion.

00:14:04.848 --> 00:14:06.580
Thank you.