Online predators spread fake porn of me. Here's how I fought back

  • 0:00 - 0:04
    [This talk contains graphic language
    and descriptions of sexual abuse]
  • 0:05 - 0:10
    Can I get a show of hands
    who here has ever Googled themselves?
  • 0:12 - 0:13
    I have.
  • 0:15 - 0:19
    But what started off
    as momentary curiosity
  • 0:19 - 0:22
    very quickly turned
  • 0:22 - 0:27
    into an almost five-year horrific battle
  • 0:27 - 0:29
    that almost ruined my life.
  • 0:31 - 0:33
    I Google Images reverse-searched myself:
  • 0:34 - 0:38
    a function of Google
    that allows you to upload an image
  • 0:38 - 0:42
    and it shows you
    where it is on the internet.
  • 0:42 - 0:46
    This is me at 17 years old.
  • 0:46 - 0:49
    An innocent selfie I took before a party.
  • 0:49 - 0:52
    Now, before I continue, I must point out
  • 0:52 - 0:57
    that what I'm about to talk about
    is very confronting and graphic.
  • 0:58 - 0:59
    But there's no way out.
  • 0:59 - 1:02
    This is a very confronting issue.
  • 1:03 - 1:06
    In a split second,
  • 1:06 - 1:11
    my screen was flooded with that image
  • 1:11 - 1:14
    and dozens more images of me
  • 1:14 - 1:17
    that had been stolen from my social media,
  • 1:17 - 1:20
    on links connected to porn sites.
  • 1:22 - 1:28
    On these sites, nameless,
    faceless sexual predators
  • 1:28 - 1:32
    had published highly explicit
    sexual commentary about me
  • 1:32 - 1:34
    and what they'd like to do to me.
  • 1:36 - 1:39
    "Cover her face and we'd fuck her body,"
  • 1:39 - 1:40
    one person wrote.
  • 1:41 - 1:45
    They also published
    identifying information about me:
  • 1:45 - 1:49
    where I lived, what I studied, who I was.
  • 1:50 - 1:52
    But things got worse.
  • 1:53 - 1:57
    I soon discovered
    that these sexual predators
  • 1:57 - 2:02
    had doctored or photoshopped my face
  • 2:02 - 2:07
    onto the bodies of naked adult actresses
    engaged in sexual intercourse,
  • 2:08 - 2:10
    on solo shots
  • 2:11 - 2:14
    of me being ejaculated on by two men.
  • 2:15 - 2:18
    Sperm was edited onto my face.
  • 2:20 - 2:23
    I was edited onto the cover of a porn DVD.
  • 2:25 - 2:28
    Perpetrators had edited my images
  • 2:28 - 2:33
    to give the effect that my blouse
    was transparent or see-through,
  • 2:33 - 2:35
    so you could see my nipples.
  • 2:37 - 2:41
    Perpetrators ejaculated on images of me,
  • 2:41 - 2:48
    took photos of their sperm
    and penises on these images
  • 2:48 - 2:50
    and posted them onto porn sites.
  • 2:51 - 2:54
    "Cum on printed pigs,"
    is what they call it.
  • 2:55 - 2:58
    Now, you might be wondering,
  • 2:58 - 3:01
    what sorts of images
    I posted on social media.
  • 3:02 - 3:06
    This is me, at around 19
    at the Claremont Hotel,
  • 3:06 - 3:08
    just a few suburbs away.
  • 3:09 - 3:13
    And they superimposed that face into this.
  • 3:15 - 3:17
    And things got worse.
  • 3:17 - 3:20
    Nothing was off limits
    for these predators.
  • 3:20 - 3:24
    They even posted an image
    with my little sister on these sites too.
  • 3:25 - 3:27
    Now, you might be thinking,
  • 3:27 - 3:30
    "Well, you do dress provocatively,
  • 3:30 - 3:34
    even a little sexually suggestive,
  • 3:34 - 3:35
    attention seeking maybe."
  • 3:37 - 3:40
    But just because
    a woman's body gets attention,
  • 3:40 - 3:42
    doesn't mean she's attention-seeking.
  • 3:43 - 3:47
    And what is provocative anyway,
    what is sexually suggestive?
  • 3:47 - 3:51
    In some parts of the world,
    showing your ankles
  • 3:51 - 3:53
    is promiscuous, is provocative.
  • 3:55 - 3:59
    It's just like, no matter
    what a woman wears,
  • 3:59 - 4:02
    it's always perceived
    as more sexual than it is.
  • 4:03 - 4:06
    For me, I just wanted
    to feel pretty and confident.
  • 4:08 - 4:10
    What's so wrong with that?
  • 4:12 - 4:14
    Now, you might be thinking,
  • 4:14 - 4:18
    "Well, can't you just
    set your social media on private?"
  • 4:20 - 4:23
    Well, these perpetrators were calculated.
  • 4:23 - 4:26
    They befriended my friends on social media
  • 4:26 - 4:28
    under fake profiles,
  • 4:28 - 4:31
    they followed the public galleries
  • 4:31 - 4:35
    of the events and places
    I regularly visited.
  • 4:37 - 4:39
    But why?
  • 4:39 - 4:43
    Why should one have to retreat and hide
  • 4:43 - 4:46
    out of fear that something
    like this could happen?
  • 4:48 - 4:50
    What I post and what I wear
  • 4:50 - 4:54
    isn't an invitation
    to violate and abuse me.
  • 4:54 - 4:58
    The only person that should be
    changing their behavior
  • 4:58 - 5:00
    is the perpetrators.
  • 5:00 - 5:07
    (Applause)
  • 5:12 - 5:14
    Now, you might be thinking, why me?
  • 5:15 - 5:18
    Well, I'm just one
  • 5:18 - 5:21
    of the thousands upon thousands
  • 5:21 - 5:24
    of ordinary women
    who are being preyed upon
  • 5:26 - 5:33
    in these mass-scale, horrific
    online cultures, websites and threads
  • 5:33 - 5:38
    that are dedicated
    to sexually exploiting and doctoring
  • 5:38 - 5:41
    ordinary images of women into porn.
  • 5:42 - 5:44
    As I speak,
  • 5:45 - 5:49
    there are women who are being preyed upon,
    and they don't even know it.
  • 5:51 - 5:55
    In the beginning, I tried seeking help.
  • 5:55 - 5:58
    I went to police,
    I contacted government agencies,
  • 5:58 - 6:02
    I even tried to hire
    a private investigator,
  • 6:02 - 6:04
    but they were too expensive.
  • 6:05 - 6:07
    There was nothing that they could do.
  • 6:07 - 6:09
    I mean, what could you do
  • 6:09 - 6:15
    when the sites are hosted overseas
    and the perpetrators are from overseas?
  • 6:17 - 6:21
    I was told I had to contact
    the sites one by one,
  • 6:21 - 6:25
    notifying the webmasters
    to get everything deleted.
  • 6:27 - 6:29
    And so as you can imagine,
  • 6:29 - 6:33
    in complete and utter fear and pain,
  • 6:35 - 6:36
    I did.
  • 6:37 - 6:39
    I contacted the webmasters,
  • 6:40 - 6:44
    requesting that they delete
    the material shared without consent.
  • 6:45 - 6:48
    And I had some successes,
  • 6:48 - 6:50
    but I also had some major setbacks.
  • 6:52 - 6:54
    The more I fought,
  • 6:55 - 6:58
    the more sites I would discover,
  • 6:58 - 7:02
    and with time, the more my images
    were being seen and shared
  • 7:02 - 7:05
    in the tens of thousands.
  • 7:05 - 7:11
    I had one webmaster respond to me
    saying he'd only delete the site
  • 7:12 - 7:16
    if I sent him nude photos
    of myself within 24 hours.
  • 7:17 - 7:19
    And this went on for years,
  • 7:19 - 7:23
    fighting against these
    dodgy, disgusting sites.
  • 7:23 - 7:26
    But I was fighting a losing battle.
  • 7:27 - 7:31
    And I couldn't continue this any longer
    for my own mental health.
  • 7:32 - 7:34
    But what could I do?
  • 7:36 - 7:40
    Maybe, I thought, if I spoke out,
  • 7:41 - 7:44
    I could reclaim my name,
  • 7:44 - 7:47
    and I could rewrite my narrative
    on my own terms.
  • 7:47 - 7:49
    Maybe if I spoke out,
  • 7:49 - 7:52
    I could raise awareness about this.
  • 7:53 - 7:56
    Maybe I could even try to change the law.
  • 7:58 - 7:59
    And so I did.
  • 8:00 - 8:07
    (Applause)
  • 8:10 - 8:12
    I spoke out publicly late last year
  • 8:12 - 8:17
    and news of my story
    reverberated around the world.
  • 8:18 - 8:20
    But this was the response.
  • 8:20 - 8:23
    "She's a fat, ugly slut, she's a whore."
  • 8:23 - 8:26
    "She's an attention-seeking
    piece of trash."
  • 8:26 - 8:29
    "Feel flattered, baby, it's a compliment."
  • 8:29 - 8:33
    I was victim-blamed and slut-shamed
  • 8:33 - 8:36
    and told I was deserving
    of what happened to me.
  • 8:37 - 8:39
    And quite frankly,
  • 8:39 - 8:41
    that was more difficult for me to endure
  • 8:41 - 8:45
    than my actual experiences
    of image-based abuse.
  • 8:47 - 8:50
    But I couldn’t let
    this criticism defeat me.
  • 8:50 - 8:53
    I knew what the perpetrators
    had done was wrong,
  • 8:53 - 8:56
    and I knew what they were doing
    to others was wrong.
  • 8:56 - 8:58
    And so I petitioned.
  • 8:59 - 9:03
    I sent out impassioned pleas for support.
  • 9:03 - 9:05
    But it didn't work.
  • 9:05 - 9:08
    I think I got like 330 signatures.
  • 9:09 - 9:10
    And that was really disheartening.
  • 9:13 - 9:16
    But I then contacted
    my state and federal MPs.
  • 9:18 - 9:22
    And I was referred to the New South Wales
    Attorney General's department,
  • 9:22 - 9:26
    who were already in the process
    of drafting new laws
  • 9:26 - 9:31
    to criminalize the nonconsensual
    distribution of intimate images.
  • 9:33 - 9:34
    Image-based sexual abuse.
  • 9:36 - 9:38
    Some of you might know it as revenge porn.
  • 9:40 - 9:43
    And soon I became a spokesperson,
  • 9:43 - 9:45
    a public face for the new laws.
  • 9:46 - 9:48
    But I must point out,
  • 9:48 - 9:50
    I do not in any way, shape or form
  • 9:50 - 9:53
    want to take credit
    for this change in the law.
  • 9:53 - 9:55
    This is on the backs
    of cybersafety experts,
  • 9:55 - 9:59
    of researchers, of the Attorney
    General's department,
  • 9:59 - 10:02
    of so many people
    who have fought for years.
  • 10:03 - 10:08
    New South Wales was the first
    state in all the world
  • 10:10 - 10:13
    to specifically include a provision
    on altering images.
  • 10:13 - 10:15
    Something that happened to me,
  • 10:15 - 10:18
    something you certainly
    don't hear about very often.
  • 10:18 - 10:22
    And now ACT has also criminalized this,
  • 10:22 - 10:25
    also with a provision on altered images.
  • 10:25 - 10:28
    And next year, WA
    is introducing legislation
  • 10:28 - 10:33
    and hopefully, they introduce
    a provision on altered images
  • 10:33 - 10:36
    and I urge every state and every country
    in this world to follow suit,
  • 10:36 - 10:38
    because right now,
  • 10:38 - 10:40
    there's no justice for people like me.
  • 10:42 - 10:43
    Despite it all,
  • 10:44 - 10:46
    despite the hate
    and despite the criticism,
  • 10:48 - 10:51
    despite the fact that I'm never
    going to get justice,
  • 10:51 - 10:54
    because my experiences happened
    before this movement of law reform,
  • 10:56 - 10:59
    speaking out was
    the best thing I've ever done,
  • 10:59 - 11:01
    because I know for a fact
    that it has helped people.
  • 11:05 - 11:07
    And I just want to live in a world
  • 11:09 - 11:12
    where, regardless
    of what I wear or what I post,
  • 11:12 - 11:17
    that I'm still worthy of being treated
    with dignity and respect.
  • 11:19 - 11:20
    Respect.
  • 11:21 - 11:24
    Now, that's an idea worth sharing.
  • 11:24 - 11:25
    (Applause)
  • 11:25 - 11:27
    Thank you.
  • 11:27 - 11:33
    (Applause)
Speaker:
Noelle Martin
Description:

A casual reverse-image search unleashed a nightmarish reality on Noelle Martin when she discovered her face edited into pornographic materials across the internet. Join Martin as she recounts years battling shadowy online figures to reclaim her identity, narrative and peace of mind -- and learn how she helped change Australian law. (This talk contains mature content.)
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
11:46