People have been using media to talk about sex for a long time. Love letters, phone sex, racy Polaroids. There's even a story of a girl who eloped with a man that she met over the telegraph in 1886.

Today we have sexting, and I am a sexting expert. Not an expert sexter --

(Laughter)

Though, I do know what this means, and I think you do too!

(Laughter)

I have been studying sexting since the media attention to it began in 2008. I wrote a book on the moral panic about sexting, and here's what I found: most people are worrying about the wrong thing. They're trying to just prevent sexting from happening entirely, but let me ask you this: as long as it's completely consensual, what's the problem with sexting? People are into all sorts of things that you may not be into, like blue cheese or cilantro.

(Laughter)

Sexting is certainly risky, like anything that's fun, but as long as you're not sending an image to someone who doesn't want to receive it, there's no harm.
What I do think is a serious problem is when people share private images of others without their permission. And instead of worrying about sexting, what I think we need to do is think a lot more about digital privacy. The key is consent.

Right now, people are thinking about sexting without really thinking about consent at all. Did you know that we currently criminalize teen sexting? It can be a crime because it counts as child pornography if there's an image of someone under eighteen, and it doesn't even matter if they took that image of themselves and shared it willingly. So we end up with this bizarre legal situation where two 17-year-olds can have sex in most U.S. states, but they can't photograph it.

Some states have also tried passing sexting misdemeanor laws, but these laws repeat the same problem, because they still make consensual sexting illegal. It doesn't make sense to try to ban all sexting to try to address privacy violations. This is kind of like saying, "Let's solve the problem of date rape by just making dating completely illegal."

Most teens don't get arrested for sexting, but can you guess who does?
It's often teens who are disliked by their partner's parents, and this can be because of class bias, racism, or homophobia. Most prosecutors are, of course, smart enough not to use child pornography charges against teenagers, but some do. According to researchers at the University of New Hampshire, seven percent of all child pornography possession arrests are teens sexting consensually with other teens. Child pornography is a serious crime, but it's just not the same thing as teen sexting.

Parents and educators are also responding to sexting without really thinking too much about consent. Their message to teens is often just: don't do it. And I totally get it; there are serious legal risks and, of course, that potential for privacy violations. And when you were a teen, I'm sure you did exactly as you were told, right?

You're probably thinking, "My kid would never sext," and that's true; your little angel may not be sexting, because only 33 percent of 16- and 17-year-olds are sexting. But, sorry, by the time they're older, the odds are that they will be sexting. Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds. And most of the time, nothing goes wrong.
People ask me all the time things like, "Isn't sexting just so dangerous, though? You wouldn't leave your wallet on a park bench. You expect it's going to get stolen if you do that, right?" Here's how I think about it: sexting is like leaving your wallet at your boyfriend's house. If you come back the next day and all the money is just gone, you really need to dump that guy.

(Laughter)

So instead of criminalizing sexting to try to prevent these privacy violations, we need to make consent central to how we think about the circulation of our private information.

Every new media technology raises privacy concerns. In fact, in the U.S., the first major debates about privacy were in response to technologies that were relatively new at the time. In the late 1800s, people were worried about cameras, which were just suddenly more portable than ever before, and newspaper gossip columns. They were worried that the camera would capture information about them, take it out of context, and widely disseminate it. Does that sound familiar? It's exactly what we're worrying about now with social media, drone cameras, and of course, sexting.
And these fears about technology, they make sense, because technologies can amplify and bring out our worst qualities and behaviors. But there are solutions, and we've been here before with a dangerous new technology.

In 1908, Ford introduced the Model T car. Traffic fatality rates were rising; it was a serious problem. It looks so safe, right?

(Laughter)

Our first response was to try to change drivers' behavior, so we developed speed limits and enforced them through fines. But over the following decades, we started to realize that the technology of the car itself is not just neutral. We could design the car to make it safer. So in the 1920s, we got shatter-resistant windshields; in the 1950s, seat belts; and in the 1990s, air bags. All three of these areas, laws, individuals, and industry, came together over time to help solve the problems that a new technology causes. And we can do the same thing with digital privacy. Of course, it comes back to consent.

Here's the idea: before anyone can distribute your private information, they should have to get your permission.
This idea of affirmative consent comes from anti-rape activists who tell us that we need consent for every sexual act. And we have really high levels of consent in other areas. Think about having surgery. Your doctor has to make sure that you are meaningfully and knowingly consenting to that medical procedure. This is not the type of consent you get with an iTunes terms of service, where you just scroll to the bottom and you're like, "Agree, agree, whatever."

(Laughter)

If we think more about consent, we can have better privacy laws. Right now, we just don't have that many protections. If your ex-husband or your ex-wife is a terrible person, they can take your nude photos and upload them to a porn site. It can be really hard to get those images taken down, and in a lot of states, you're actually better off if you took the images of yourself, because then you can file a copyright claim.

(Laughter)

Right now, if someone violates your privacy, whether that's an individual or a company or the NSA, you can try filing a lawsuit, but you may not be successful, because many courts assume that digital privacy is just impossible, so they're not willing to punish anyone for violating it.
I still hear people asking me all the time, "Isn't a digital image somehow blurring the line between public and private, because it's digital, right?" No, no! Everything digital is not just automatically public. That doesn't make any sense. As NYU legal scholar Helen Nissenbaum tells us, we have laws and policies and norms that protect all kinds of information that's private, and it doesn't make a difference if it's digital or not. All of your health records are digitized, but your doctor can't just share them with anyone. All of your financial information is held in digital databases, but your credit card company can't just post your purchase history online.

Better laws could help address privacy violations after they happen, but one of the easiest things we can all do is make personal changes to help protect each other's privacy. We're always told that privacy is our sole individual responsibility. We're told, "Constantly monitor and update your privacy settings." We're told, "Never share anything you wouldn't want the entire world to see." This doesn't make sense. Digital media are social environments, and we share things with people we trust all day, every day.
As Princeton researcher Janet Vertesi argues, our data and our privacy, they're not just personal, they're interpersonal. So one thing you can do that's really easy is just start asking for permission before you share anyone else's information. If you want to post a photo of someone online, ask for permission. If you want to forward an email thread, ask for permission. If you want to share someone's nude selfie, obviously, ask for permission!

These individual changes can help us protect each other's privacy, but we need technology companies on board as well. These companies have very little incentive to help protect our privacy, because their business models depend on us sharing everything with as many people as possible. Right now, if I send you an image, you can forward it to anyone you want. But what if I got to decide if that image was forwardable or not? This would tell you, "You don't have my permission to send this image out." We do this kind of thing all the time to protect copyright. If you buy an ebook, you can't just send it out to as many people as you want, so why not try this with mobile phones?
What we can do is demand that tech companies add these protections to our devices and our platforms as the default. After all, you can choose the color of your car, but the airbags are always standard.

If we don't think more about digital privacy and consent, there can be serious consequences. There was a teenager from Ohio; let's call her Jennifer, for the sake of her privacy. She shared nude photos of herself with her high school boyfriend, thinking she could trust him. Unfortunately, he betrayed her and sent her photos around the entire school. Jennifer was embarrassed and humiliated, but instead of being compassionate, her classmates harassed her. They called her a slut and a whore, and they made her life miserable. Jennifer started missing school, and her grades dropped. Ultimately, Jennifer decided to end her own life.

Jennifer did nothing wrong. All she did was share a nude photo with someone that she thought she could trust. And yet, our laws tell her that she committed a horrible crime equivalent to child pornography. Our gender norms tell her that by producing this nude image of herself, she somehow did the most horrible, shameful thing.
And when we assume that privacy is impossible in digital media, we completely write off and excuse her boyfriend's bad, bad behavior. People are still saying all the time to victims of privacy violations, "What were you thinking? You should've never sent that image." If you're trying to figure out what to say instead, try this: imagine you've run into your friend who broke their leg skiing. They took a risk to do something fun, and it didn't end well. But you're probably not going to be the jerk who says, "Well, I guess you shouldn't have gone skiing then!"

If we think more about consent, we can see that victims of privacy violations deserve our compassion, not criminalization, shaming, harassment, or punishment. We can support victims, and we can prevent some privacy violations by making these legal, individual, and technological changes. Because the problem is not sexting; the issue is digital privacy. And one solution is consent. So the next time a victim of a privacy violation comes up to you, instead of blaming them, let's do this instead: let's shift our ideas about digital privacy, and let's respond with compassion.

Thank you.

(Applause)