1
00:00:18,824 --> 00:00:23,854
People have been using media to talk about sex for a long time.

2
00:00:23,854 --> 00:00:27,715
Love letters, phone sex, racy polaroids.

3
00:00:27,715 --> 00:00:31,335
There's even a story of a girl who eloped with a man

4
00:00:31,335 --> 00:00:36,636
that she met over the telegraph in 1886.

5
00:00:36,636 --> 00:00:39,296
Today we have sexting

6
00:00:39,296 --> 00:00:41,937
and I am a sexting expert.

7
00:00:41,937 --> 00:00:43,837
Not an expert sexter--

8
00:00:43,837 --> 00:00:45,146
(Laughter)

9
00:00:45,146 --> 00:00:48,027
Though, I do know what this means

10
00:00:48,027 --> 00:00:50,137
and I think you do too!

11
00:00:50,137 --> 00:00:52,656
(Laughter)

12
00:00:52,656 --> 00:00:54,707
I have been studying sexting

13
00:00:54,707 --> 00:00:58,627
since the media attention to it began in 2008.

14
00:00:58,627 --> 00:01:01,790
I wrote a book on the moral panic about sexting,

15
00:01:01,790 --> 00:01:03,439
and here's what I found:

16
00:01:03,439 --> 00:01:06,669
Most people are worrying about the wrong thing.

17
00:01:06,669 --> 00:01:10,779
They're trying to just prevent sexting from happening entirely,

18
00:01:10,779 --> 00:01:12,429
but let me ask you this:

19
00:01:12,429 --> 00:01:15,359
as long as it is completely consensual, what's the problem with sexting?

20
00:01:15,359 --> 00:01:21,400
People are into all sorts of things that you may not be into,

21
00:01:21,400 --> 00:01:23,756
like blue cheese or cilantro.

22
00:01:23,756 --> 00:01:26,831
(Laughter)

23
00:01:26,831 --> 00:01:30,962
Sexting is certainly risky, like anything that's fun,

24
00:01:30,962 --> 00:01:35,271
but as long as you're not sending any image

25
00:01:35,271 --> 00:01:37,522
to someone who doesn't want to receive it,

26
00:01:37,522 --> 00:01:39,401
there's no harm.

27
00:01:39,401 --> 00:01:41,972
What I do think is a serious problem

28
00:01:41,972 --> 00:01:47,283
is when people share private images of others without their permission,

29
00:01:47,283 --> 00:01:49,882
and instead of worrying about sexting,

30
00:01:49,882 --> 00:01:54,853
what I think we need to do is think a lot more about digital privacy.

31
00:01:54,853 --> 00:01:57,754
The key is consent.

32
00:01:57,754 --> 00:02:01,134
Right now people are thinking about sexting

33
00:02:01,134 --> 00:02:04,505
without really thinking about consent at all.

34
00:02:04,505 --> 00:02:09,485
Did you know that we currently criminalize teen sexting?

35
00:02:09,485 --> 00:02:13,075
It can be a crime because it counts as child pornography

36
00:02:13,075 --> 00:02:15,716
if there's an image of someone under eighteen,

37
00:02:15,716 --> 00:02:19,475
and it doesn't even matter if they took that image of themselves

38
00:02:19,475 --> 00:02:21,836
and shared it willingly.

39
00:02:21,836 --> 00:02:24,716
So we end up with this bizarre legal situation

40
00:02:24,716 --> 00:02:29,226
where two 17-year-olds can have sex in most U.S. states,

41
00:02:29,226 --> 00:02:32,057
but they can't photograph it.

42
00:02:32,057 --> 00:02:36,978
Some states have also tried passing sexting misdemeanor laws,

43
00:02:36,978 --> 00:02:40,008
but these laws repeat the same problem

44
00:02:40,008 --> 00:02:44,628
because they still make consensual sexting illegal.

45
00:02:44,628 --> 00:02:47,998
It doesn't make sense to try to ban all sexting

46
00:02:47,998 --> 00:02:50,539
to try to address privacy violations.

47
00:02:50,539 --> 00:02:52,099
This is kind of like saying,

48
00:02:52,099 --> 00:02:59,010
"Let's solve the problem of date rape by just making dating completely illegal."

49
00:02:59,010 --> 00:03:04,720
Most teens don't get arrested for sexting, but can you guess who does?

50
00:03:04,720 --> 00:03:09,362
It's often teens who are disliked by their partner's parents,

51
00:03:09,362 --> 00:03:15,101
and this can be because of class bias, racism, or homophobia.

52
00:03:15,375 --> 00:03:17,776
Most prosecutors are, of course, smart enough

53
00:03:17,776 --> 00:03:21,865
not to use child pornography charges against teenagers,

54
00:03:21,865 --> 00:03:23,735
but some do.

55
00:03:23,735 --> 00:03:27,335
According to researchers at the University of New Hampshire,

56
00:03:27,335 --> 00:03:32,007
seven percent of all child pornography possession arrests

57
00:03:32,007 --> 00:03:37,587
are teens sexting consensually with other teens.

58
00:03:37,587 --> 00:03:40,168
Child pornography is a serious crime,

59
00:03:40,168 --> 00:03:44,808
but it's just not the same thing as teen sexting.

60
00:03:44,808 --> 00:03:48,428
Parents and educators are also responding to sexting

61
00:03:48,428 --> 00:03:51,739
without really thinking too much about consent.

62
00:03:51,739 --> 00:03:56,259
Their message to teens is often just "don't do it,"

63
00:03:56,259 --> 00:03:57,819
and I totally get it.

64
00:03:57,819 --> 00:03:59,869
There are serious legal risks

65
00:03:59,869 --> 00:04:03,300
and, of course, that potential for privacy violations.

66
00:04:03,300 --> 00:04:09,466
And when you were a teen, I'm sure you did exactly as you were told, right?

67
00:04:09,466 --> 00:04:12,931
You're probably thinking, "My kid would never sext,"

68
00:04:12,931 --> 00:04:16,427
and that's true; your little angel may not be sexting

69
00:04:16,427 --> 00:04:23,118
because only 33 percent of 16- and 17-year-olds are sexting.

70
00:04:23,118 --> 00:04:25,648
But, sorry, by the time they're older

71
00:04:25,648 --> 00:04:27,981
the odds are that they will be sexting.

72
00:04:27,981 --> 00:04:34,873
Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds.

73
00:04:34,873 --> 00:04:38,483
And most of the time, nothing goes wrong.

74
00:04:38,483 --> 00:04:41,103
People ask me all the time things like,

75
00:04:41,103 --> 00:04:43,933
"Isn't sexting just so dangerous, though?

76
00:04:43,933 --> 00:04:47,493
You wouldn't leave your wallet on a park bench.

77
00:04:47,493 --> 00:04:51,394
You expect it's going to get stolen if you do that, right?"

78
00:04:51,394 --> 00:04:52,925
Here's how I think about it:

79
00:04:52,925 --> 00:04:56,775
Sexting is like leaving your wallet at your boyfriend's house.

80
00:04:56,775 --> 00:05:01,405
If you come back the next day and all the money is just gone,

81
00:05:01,405 --> 00:05:04,276
you really need to dump that guy.

82
00:05:04,276 --> 00:05:08,015
(Laughter)

83
00:05:08,015 --> 00:05:13,036
So instead of criminalizing sexting to try to prevent these privacy violations,

84
00:05:13,036 --> 00:05:16,347
we need to make consent central

85
00:05:16,347 --> 00:05:20,858
to how we think about the circulation of our private information.

86
00:05:20,858 --> 00:05:25,488
Every new media technology raises privacy concerns;

87
00:05:25,488 --> 00:05:29,748
in fact, in the U.S., the first major debates about privacy

88
00:05:29,748 --> 00:05:34,519
were in response to technologies that were relatively new at the time.

89
00:05:34,519 --> 00:05:38,509
In the late 1800s, people were worried about cameras,

90
00:05:38,509 --> 00:05:41,930
which were just suddenly more portable than ever before,

91
00:05:41,930 --> 00:05:44,410
and newspaper gossip columns.

92
00:05:44,410 --> 00:05:48,310
They were worried that the camera would capture information about them,

93
00:05:48,310 --> 00:05:51,869
take it out of context, and widely disseminate it.

94
00:05:51,869 --> 00:05:53,591
Does that sound familiar?

95
00:05:53,591 --> 00:05:58,370
It's exactly what we're worrying about now with social media, drone cameras,

96
00:05:58,370 --> 00:06:00,581
and, of course, sexting.

97
00:06:00,581 --> 00:06:03,852
And these fears about technology, they make sense

98
00:06:03,852 --> 00:06:06,452
because technologies can amplify

99
00:06:06,452 --> 00:06:10,563
and bring out our worst qualities and behaviors.

100
00:06:10,563 --> 00:06:13,013
But there are solutions,

101
00:06:13,013 --> 00:06:17,303
and we've been here before with a dangerous new technology.

102
00:06:17,303 --> 00:06:21,193
In 1908, Ford introduced the Model T car.

103
00:06:21,193 --> 00:06:25,142
Traffic fatality rates were rising; it was a serious problem.

104
00:06:25,142 --> 00:06:26,833
It looks so safe, right?

105
00:06:26,833 --> 00:06:29,654
(Laughter)

106
00:06:29,654 --> 00:06:33,485
Our first response was to try to change drivers' behavior,

107
00:06:33,485 --> 00:06:37,665
so we developed speed limits and enforced them through fines.

108
00:06:37,665 --> 00:06:40,765
But over the following decades we started to realize

109
00:06:40,765 --> 00:06:44,876
that the technology of the car itself is not just neutral.

110
00:06:44,876 --> 00:06:48,206
We could design the car to make it safer.

111
00:06:48,206 --> 00:06:51,706
So in the 1920s, we got shatter-resistant windshields,

112
00:06:51,706 --> 00:06:58,256
in the 1950s, seat belts, and in the 1990s, airbags.

113
00:06:58,256 --> 00:07:03,697
All three of these areas, laws, individuals, and industry,

114
00:07:03,697 --> 00:07:09,488
came together over time to help solve the problems that a new technology causes,

115
00:07:09,488 --> 00:07:13,148
and we can do the same thing with digital privacy.

116
00:07:13,148 --> 00:07:16,478
Of course, it comes back to consent.

117
00:07:16,478 --> 00:07:17,688
Here's the idea:

118
00:07:17,688 --> 00:07:21,379
before anyone can distribute your private information,

119
00:07:21,379 --> 00:07:24,259
they should have to get your permission.

120
00:07:24,259 --> 00:07:29,020
This idea of affirmative consent comes from anti-rape activists

121
00:07:29,020 --> 00:07:32,940
who tell us that we need consent for every sexual act.

122
00:07:32,940 --> 00:07:37,000
And we have really high standards for consent in other areas.

123
00:07:37,607 --> 00:07:39,507
Think about having surgery.

124
00:07:39,507 --> 00:07:42,429
Your doctor has to make sure that you are meaningfully

125
00:07:42,429 --> 00:07:45,639
and knowingly consenting to that medical procedure.

126
00:07:45,639 --> 00:07:49,208
This is not the type of consent like with an iTunes Terms of Service,

127
00:07:49,208 --> 00:07:53,009
where you just scroll to the bottom and you're like, "Agree, agree, whatever."

128
00:07:53,009 --> 00:07:54,460
(Laughter)

129
00:07:55,289 --> 00:08:00,600
If we think more about consent, we can have better privacy laws.

130
00:08:00,600 --> 00:08:03,881
Right now we just don't have that many protections.

131
00:08:03,881 --> 00:08:07,381
If your ex-husband or your ex-wife is a terrible person,

132
00:08:07,381 --> 00:08:11,781
they can take your nude photos and upload them to a porn site.

133
00:08:11,781 --> 00:08:14,750
It can be really hard to get those images taken down,

134
00:08:14,750 --> 00:08:16,202
and in a lot of states,

135
00:08:16,202 --> 00:08:20,022
you're actually better off if you took the images of yourself

136
00:08:20,022 --> 00:08:23,122
because then you can file a copyright claim.

137
00:08:23,122 --> 00:08:24,410
(Laughter)

138
00:08:25,613 --> 00:08:28,433
Right now, if someone violates your privacy,

139
00:08:28,433 --> 00:08:32,544
whether that's an individual or a company or the NSA,

140
00:08:32,544 --> 00:08:38,224
you can try filing a lawsuit, but you may not be successful

141
00:08:38,224 --> 00:08:42,994
because many courts assume that digital privacy is just impossible,

142
00:08:42,994 --> 00:08:46,954
so they're not willing to punish anyone for violating it.

143
00:08:46,954 --> 00:08:50,055
I still hear people asking me all the time,

144
00:08:50,055 --> 00:08:55,366
"Isn't a digital image somehow blurring the line between public and private

145
00:08:55,366 --> 00:08:57,626
because it's digital, right?"

146
00:08:57,626 --> 00:08:59,066
No, no!

147
00:08:59,066 --> 00:09:02,236
Everything digital is not just automatically public.

148
00:09:02,236 --> 00:09:04,256
That doesn't make any sense.

149
00:09:04,256 --> 00:09:07,907
As NYU legal scholar Helen Nissenbaum tells us,

150
00:09:07,907 --> 00:09:10,666
we have laws and policies and norms

151
00:09:10,666 --> 00:09:13,877
that protect all kinds of information that's private,

152
00:09:13,877 --> 00:09:16,996
and it doesn't make a difference if it's digital or not.

153
00:09:16,996 --> 00:09:19,748
All of your health records are digitized,

154
00:09:19,748 --> 00:09:22,778
but your doctor can't just share them with anyone.

155
00:09:22,778 --> 00:09:27,098
All of your financial information is held in digital databases,

156
00:09:27,098 --> 00:09:33,029
but your credit card company can't just post your purchase history online.

157
00:09:33,029 --> 00:09:38,060
Better laws could help address privacy violations after they happen,

158
00:09:38,060 --> 00:09:41,139
but one of the easiest things we can all do

159
00:09:41,139 --> 00:09:46,400
is make personal changes to help protect each other's privacy.

160
00:09:46,400 --> 00:09:51,441
We're always told that privacy is our sole individual responsibility.

161
00:09:51,441 --> 00:09:55,611
We're told, "Constantly monitor and update your privacy settings."

162
00:09:55,611 --> 00:10:01,072
We're told, "Never share anything you wouldn't want the entire world to see."

163
00:10:01,072 --> 00:10:02,792
This doesn't make sense.

164
00:10:02,792 --> 00:10:05,582
Digital media are social environments,

165
00:10:05,582 --> 00:10:10,833
and we share things with people we trust all day, every day.

166
00:10:10,833 --> 00:10:13,743
As Princeton researcher Janet Vertesi argues,

167
00:10:13,743 --> 00:10:15,923
our data and our privacy,

168
00:10:15,923 --> 00:10:20,494
they're not just personal, they're interpersonal.

169
00:10:20,494 --> 00:10:25,954
So one thing you can do that's really easy is just start asking for permission

170
00:10:25,954 --> 00:10:28,794
before you share anyone else's information.

171
00:10:28,794 --> 00:10:33,335
If you want to post a photo of someone online, ask for permission.

172
00:10:33,335 --> 00:10:37,076
If you want to forward an email thread, ask for permission.

173
00:10:37,076 --> 00:10:39,776
If you want to share someone's nude selfie,

174
00:10:39,776 --> 00:10:43,606
obviously, ask for permission!

175
00:10:43,606 --> 00:10:47,746
These individual changes can help us protect each other's privacy,

176
00:10:47,746 --> 00:10:52,386
but we need technology companies on board as well.

177
00:10:52,386 --> 00:10:56,867
These companies have very little incentive to help protect our privacy

178
00:10:56,867 --> 00:10:59,139
because their business models depend on us

179
00:10:59,139 --> 00:11:02,978
sharing everything with as many people as possible.

180
00:11:02,978 --> 00:11:05,108
Right now, if I send you an image,

181
00:11:05,108 --> 00:11:08,218
you can forward that to anyone that you want.

182
00:11:08,218 --> 00:11:12,559
But what if I got to decide if that image was forwardable or not?

183
00:11:12,559 --> 00:11:16,509
This would tell you, "You don't have my permission to send this image out."

184
00:11:16,509 --> 00:11:20,600
We do this kind of thing all the time to protect copyright.

185
00:11:20,600 --> 00:11:25,420
If you buy an ebook, you can't just send it out to as many people as you want,

186
00:11:25,420 --> 00:11:29,311
so why not try this with mobile phones?

187
00:11:29,311 --> 00:11:32,291
What we can do is demand that tech companies

188
00:11:32,291 --> 00:11:37,851
add these protections to our devices and our platforms as the default.

189
00:11:37,851 --> 00:11:41,061
After all, you can choose the color of your car,

190
00:11:41,061 --> 00:11:45,602
but the airbags are always standard.

191
00:11:45,602 --> 00:11:49,333
If we don't think more about digital privacy and consent,

192
00:11:49,333 --> 00:11:52,573
there can be serious consequences.

193
00:11:52,573 --> 00:11:55,003
There was a teenager from Ohio.

194
00:11:55,003 --> 00:11:58,509
Let's call her Jennifer for the sake of her privacy.

195
00:11:58,509 --> 00:12:01,823
She shared nude photos of herself with her high school boyfriend,

196
00:12:01,823 --> 00:12:04,740
thinking she could trust him.

197
00:12:04,740 --> 00:12:10,344
Unfortunately, he betrayed her and sent her photos around the entire school.

198
00:12:10,344 --> 00:12:14,294
Jennifer was embarrassed and humiliated,

199
00:12:14,294 --> 00:12:18,385
but instead of being compassionate, her classmates harassed her.

200
00:12:18,385 --> 00:12:22,806
They called her a slut and a whore and they made her life miserable.

201
00:12:22,806 --> 00:12:26,985
Jennifer started missing school, and her grades dropped.

202
00:12:26,985 --> 00:12:32,067
Ultimately, Jennifer decided to end her own life.

203
00:12:32,067 --> 00:12:34,697
Jennifer did nothing wrong.

204
00:12:34,697 --> 00:12:37,327
All she did was share a nude photo

205
00:12:37,327 --> 00:12:39,818
with someone she thought she could trust.

206
00:12:39,818 --> 00:12:42,748
And yet, our laws tell her

207
00:12:42,748 --> 00:12:47,317
that she committed a horrible crime equivalent to child pornography.

208
00:12:47,317 --> 00:12:52,068
Our gender norms tell her that by producing this nude image of herself,

209
00:12:52,068 --> 00:12:55,779
she somehow did the most horrible, shameful thing.

210
00:12:55,779 --> 00:13:00,040
And when we assume that privacy is impossible in digital media,

211
00:13:00,040 --> 00:13:06,420
we completely write off and excuse her boyfriend's bad, bad behavior.

212
00:13:06,420 --> 00:13:12,251
People still say to victims of privacy violations all the time,

213
00:13:12,251 --> 00:13:13,740
"What were you thinking?

214
00:13:13,740 --> 00:13:17,050
You should've never sent that image."

215
00:13:17,050 --> 00:13:21,632
If you're trying to figure out what to say instead, try this:

216
00:13:21,632 --> 00:13:25,832
imagine you've run into your friend who broke their leg skiing.

217
00:13:25,832 --> 00:13:30,342
They took a risk to do something fun, and it didn't end well.

218
00:13:30,342 --> 00:13:32,953
But you're probably not going to be the jerk who says,

219
00:13:32,953 --> 00:13:37,493
"Well, I guess you shouldn't have gone skiing then!"

220
00:13:37,493 --> 00:13:39,833
If we think more about consent,

221
00:13:39,833 --> 00:13:44,414
we can see that victims of privacy violations deserve our compassion,

222
00:13:44,414 --> 00:13:49,784
not criminalization, shaming, harassment, or punishment.

223
00:13:49,784 --> 00:13:54,335
We can support victims, and we can prevent some privacy violations

224
00:13:54,335 --> 00:13:59,285
by making these legal, individual, and technological changes.

225
00:13:59,285 --> 00:14:04,865
Because the problem is not sexting, the issue is digital privacy,

226
00:14:04,865 --> 00:14:08,126
and one solution is consent.

227
00:14:08,126 --> 00:14:12,637
So the next time a victim of a privacy violation comes up to you,

228
00:14:12,637 --> 00:14:15,596
instead of blaming them, let's do this:

229
00:14:15,596 --> 00:14:18,807
let's shift our ideas about digital privacy

230
00:14:18,807 --> 00:14:21,907
and let's respond with compassion.

231
00:14:21,907 --> 00:14:23,437
Thank you.

232
00:14:23,437 --> 00:14:25,227
(Applause)