Fact, fiction and politics in a post-truth age | David Patrikarakos | TEDxAthens

  • 0:09 - 0:12
    I first met Vitaly
    in the spring of last year.
  • 0:13 - 0:15
    He spoke little English,
  • 0:15 - 0:17
    but he was very charming.
  • 0:17 - 0:22
    On his left arm he had a tattoo
    of the singer Marilyn Manson.
  • 0:22 - 0:25
    On his right leg,
    he had a tattoo of Winnie-the-Pooh.
  • 0:25 - 0:26
    (Laughter)
  • 0:26 - 0:29
    It was a rather strange experience.
  • 0:29 - 0:30
    I was there to hear his story,
  • 0:31 - 0:34
    and it was a story that
    I would never have believed,
  • 0:34 - 0:40
    had I not myself spent eight months
    in the year of 2014
  • 0:40 - 0:43
    in Ukraine covering the war
    between Moscow and Kiev.
  • 0:45 - 0:48
    Now, at around the same time,
    in August 2014,
  • 0:49 - 0:52
    Vitaly found himself
    in St. Petersburg without a job.
  • 0:52 - 0:53
    Vitaly was a journalist.
  • 0:53 - 0:56
    But the website he worked for
    had lost its funding.
  • 0:57 - 0:59
    So he sent out endless applications.
  • 0:59 - 1:00
    Heard nothing.
  • 1:00 - 1:02
    Eventually something came back.
  • 1:02 - 1:03
    He'd sent out so many,
  • 1:03 - 1:06
    he couldn't even remember
    which job it had been for.
  • 1:07 - 1:09
    He went for an interview,
    he duly got the job.
  • 1:11 - 1:13
    It was at - sorry -
  • 1:14 - 1:16
    OK - Let me just - Yeah -
  • 1:16 - 1:19
    It was at a nondescript building
    in the centre of St. Petersburg,
  • 1:19 - 1:21
    55 Savushkina Street.
  • 1:22 - 1:24
    And when Vitaly started,
  • 1:24 - 1:28
    he realized it wasn't quite
    your everyday media company.
  • 1:28 - 1:32
    He was assigned to a project
    called "Ukraine 2".
  • 1:33 - 1:40
    His job was to essentially write
    articles about Ukraine -
  • 1:40 - 1:41
    which was fair enough,
  • 1:41 - 1:44
    I mean, it's a subject
    of interest for Russians -
  • 1:44 - 1:46
    with some slight differences.
  • 1:47 - 1:49
    His job was to write articles
  • 1:49 - 1:53
    for websites ending in ".ua".
  • 1:53 - 1:57
    That is to say, websites
    that pretended to be Ukrainian.
  • 1:57 - 2:01
    So he would take for example a newspaper
    called the Donetsk News,
  • 2:02 - 2:06
    Donetsk being an Eastern city in Ukraine
    occupied by pro-Russian separatists.
  • 2:06 - 2:08
    And he would write articles about it.
  • 2:08 - 2:09
    It wasn't too bad.
  • 2:09 - 2:12
    He would stick to the basic facts.
  • 2:12 - 2:15
    But he wasn't allowed to use
    the word terrorist or separatist.
  • 2:15 - 2:16
    He had to call them militia.
  • 2:16 - 2:19
    The Ukrainian army, on the other hand,
  • 2:19 - 2:22
    had to always be referred to
    as volunteer battalions
  • 2:23 - 2:24
    or national guards.
  • 2:24 - 2:28
    These are battalions
    that still exist today,
  • 2:28 - 2:30
    that have well-deserved reputations
  • 2:30 - 2:33
    for containing far-right
    or thuggish elements.
  • 2:33 - 2:35
    Now, as time passed,
  • 2:35 - 2:37
    it became clear to Vitaly
  • 2:37 - 2:40
    that he was working in what
    later would become known to the media
  • 2:40 - 2:44
    as a "troll factory" or a "troll farm".
  • 2:44 - 2:47
    And the farm had a very clear structure.
  • 2:47 - 2:51
    On the first floor were Vitaly
    and his colleagues,
  • 2:51 - 2:53
    writing for the fake news websites.
  • 2:53 - 2:55
    The second floor
    was a social media department.
  • 2:55 - 2:58
    Here cartoons and memes were created
  • 2:58 - 3:02
    to spread across
    all social media platforms,
  • 3:02 - 3:05
    promulgating the Kremlin line on Ukraine.
  • 3:05 - 3:07
    On the third floor were the bloggers.
  • 3:07 - 3:09
    Now, there were two types of bloggers.
  • 3:09 - 3:12
    The first were the "Ukrainian" bloggers
  • 3:12 - 3:15
    who would blog about how terrible
    the situation in Kiev was,
  • 3:15 - 3:17
    how parts of the city
    didn't have electricity,
  • 3:17 - 3:20
    how the kindergartens
    were running low on food.
  • 3:20 - 3:24
    The second were the "American" bloggers.
  • 3:24 - 3:28
    They would write about how the whole
    of America supported Putin's actions,
  • 3:28 - 3:30
    that Ukrainians were fascists,
  • 3:30 - 3:32
    all sorts of things like this.
  • 3:33 - 3:37
    Hilariously, or perhaps
    bitterly hilariously, what would happen is
  • 3:37 - 3:40
    the bloggers on the third floor,
  • 3:40 - 3:42
    their output would become sources
  • 3:42 - 3:45
    for the fake news articles
    on the first floor.
  • 3:45 - 3:47
    So, it was a merry-go-round of lies.
  • 3:48 - 3:51
    Now, on the fourth floor
    were the big time, big time trolls.
  • 3:51 - 3:53
    These were the men and women
  • 3:53 - 3:55
    tasked with commenting
    on Facebook, Twitter,
  • 3:56 - 3:57
    and most of all, VKontakte,
  • 3:57 - 3:59
    which is the Russian version of Facebook,
  • 3:59 - 4:03
    a very, very popular
    social media platform
  • 4:03 - 4:05
    for the Kremlin
    and the Russian government.
  • 4:05 - 4:08
    Now, after a month of working
    on the fake websites,
  • 4:08 - 4:12
    Vitaly was sent
    to the social media section.
  • 4:12 - 4:15
    And his work began
    to take a different turn.
  • 4:16 - 4:17
    So, what he would have -
  • 4:17 - 4:19
    When Vitaly was sent to this section,
  • 4:19 - 4:22
    he was first given
    a load of different SIM cards.
  • 4:22 - 4:26
    To register on VKontakte,
    you have to have a phone number.
  • 4:26 - 4:28
    And the troll farm had
    an inexhaustible supply.
  • 4:28 - 4:31
    And his job was just to spam
  • 4:31 - 4:34
    every, but every, page
    you could think of with this -
  • 4:34 - 4:40
    I mean often group pages
    that have nothing to do with politics.
  • 4:40 - 4:43
    One example, he spammed the site
  • 4:43 - 4:46
    devoted to couples meeting up for sex
    in a small Russian town.
  • 4:46 - 4:47
    There was no logic to it.
  • 4:48 - 4:50
    He always had to use female profiles,
  • 4:50 - 4:55
    as female profiles were seen
    as more trustworthy than male ones.
  • 4:55 - 4:57
    The second thing he had to do,
  • 4:57 - 5:02
    and this is central to the dissemination
    of information in the social media age,
  • 5:02 - 5:04
    was to make it visual.
  • 5:05 - 5:06
    This is something I have learned
  • 5:06 - 5:09
    from everyone from the State Department
    to the Israeli Defence Forces,
  • 5:09 - 5:11
    that if you want information,
  • 5:11 - 5:14
    if you want people
    to look at what you have to say,
  • 5:14 - 5:15
    it has to be visual.
  • 5:16 - 5:17
    And, in fact,
  • 5:17 - 5:21
    even when people look at your link,
    they probably won't click on it.
  • 5:21 - 5:26
    So what you do, is you leave your message
    in the meme, in the visuals.
  • 5:26 - 5:28
    So here we have
    a tearful looking picture of Obama.
  • 5:28 - 5:33
    He's saying, "I want to start a war,
    but none of my friends will join me."
  • 5:34 - 5:39
    Here we have
    a two-panel picture of Obama.
  • 5:39 - 5:44
    On the first, he is looking very serious,
    saying: "We don't talk to terrorists."
  • 5:44 - 5:46
    The second panel, he is shown smiling:
  • 5:46 - 5:48
    "We only sponsor them!"
  • 5:49 - 5:50
    The goal was to attack
  • 5:50 - 5:54
    people seen as hostile
    to the Kremlin's war in Ukraine.
  • 5:54 - 5:59
    That was Barack Obama, Angela Merkel
    and leading Ukrainian politicians.
  • 5:59 - 6:01
    Now, after a while, this got to Vitaly.
  • 6:02 - 6:03
    He felt dirty,
  • 6:03 - 6:05
    he didn't like the work he was doing,
  • 6:05 - 6:07
    he was having problems with his nerves.
  • 6:07 - 6:09
    So, in the end, he decided to quit.
  • 6:09 - 6:12
    His boss, Anna, asked him why,
    and he told her the truth.
  • 6:12 - 6:14
    He said, "I don't believe in
    what we're doing."
  • 6:14 - 6:15
    So, he left.
  • 6:15 - 6:18
    But still he couldn't get
    the experience out of his mind.
  • 6:18 - 6:21
    He felt like he had to make amends.
  • 6:21 - 6:25
    So, he decided to write
    an article about his experiences.
  • 6:26 - 6:28
    He wrote it anonymously,
  • 6:28 - 6:31
    and he presented it as if he were female.
  • 6:31 - 6:34
    Because the majority of the
    troll farm's employees were female.
  • 6:34 - 6:36
    It came out, and it did well.
  • 6:36 - 6:37
    It got a lot of traction.
  • 6:37 - 6:41
    Even [Alexei] Navalny,
    the Russian opposition leader, tweeted it.
  • 6:42 - 6:46
    Unfortunately, Vitaly's attempts
    at anonymity had been less successful.
  • 6:46 - 6:47
    An hour after it came out,
  • 6:47 - 6:49
    he got a text from his boss Anna -
  • 6:49 - 6:50
    or ex-boss -
  • 6:50 - 6:52
    telling him he might think he was a hero
  • 6:52 - 6:55
    but he was, in fact,
    a little son-of-a-bitch,
  • 6:55 - 6:58
    who couldn't do anything for himself
    but could only spoil the work of others.
  • 6:59 - 7:00
    Then the phone calls started.
  • 7:01 - 7:03
    "What the hell do you think you're doing?"
  • 7:03 - 7:06
    a gruff male voice
    called him up one night,
  • 7:06 - 7:09
    "Don't you know people can get punched
    in the face for this sort of thing?"
  • 7:09 - 7:13
    For a while, Vitaly was afraid
    to walk the streets alone.
  • 7:16 - 7:17
    Now ...
  • 7:17 - 7:18
    if you are tempted -
  • 7:19 - 7:20
    Sorry, that's just Putin.
  • 7:20 - 7:23
    I never get over that picture,
    it kills me every time.
  • 7:23 - 7:26
    If you are tempted
    to dismiss Vitaly's story
  • 7:26 - 7:30
    as just a bunch of internet nerds
    playing around ... don't,
  • 7:31 - 7:33
    because it is effective.
  • 7:33 - 7:35
    I saw it for myself on the ground,
  • 7:35 - 7:41
    in Eastern Ukraine as I travelled
    through the cities, the occupied cities,
  • 7:41 - 7:43
    that spring and summer of 2014,
  • 7:43 - 7:46
    as they fell almost daily
    to pro-Russia separatists.
  • 7:47 - 7:50
    Now, Marshall McLuhan,
    the Canadian philosopher, said:
  • 7:50 - 7:55
    "All media are extensions of some
    human faculty, psychic or physical."
  • 7:55 - 7:58
    Travelling through
    Eastern Ukraine in 2014,
  • 7:58 - 8:03
    it was as if Putin's
    central nervous system were on display.
  • 8:03 - 8:08
    The content that had come
    from Vitaly's pen, directly,
  • 8:08 - 8:10
    was all around me.
  • 8:10 - 8:14
    The online world had seeped
    into the offline reality.
  • 8:15 - 8:17
    On the ground in Eastern Ukraine,
  • 8:17 - 8:20
    old men parroted geopolitical concepts,
  • 8:20 - 8:22
    like Novorossiya,
  • 8:22 - 8:26
    a Czarist term referring to part
    of Eastern Ukraine as belonging to Russia,
  • 8:26 - 8:28
    that they barely understood.
  • 8:28 - 8:29
    Teenagers, laughingly, showed me
  • 8:29 - 8:32
    racist memes of Barack Obama
    on their smartphones,
  • 8:32 - 8:34
    knocked up in the troll factory.
  • 8:35 - 8:38
    The belief that Kiev was a fascist junta,
  • 8:38 - 8:42
    and that it genuinely wanted to destroy
    the speaking of Russian in the East,
  • 8:42 - 8:44
    was sincerely held.
  • 8:44 - 8:47
    This was not, I realized, mere propaganda,
  • 8:47 - 8:49
    it wasn't narratives;
  • 8:49 - 8:51
    it was the reinvention of reality.
  • 8:53 - 8:57
    Now, we have to ask:
    How did we get here?
  • 8:58 - 9:02
    Putin's information age could only
    be possible in a post-truth age.
  • 9:03 - 9:05
    And what has created this age
    is social media.
  • 9:06 - 9:08
    Now, social media brings
    people together, we know this.
  • 9:08 - 9:10
    It's transnational.
  • 9:10 - 9:13
    You can speak to your friend
    in India or... well, not China -
  • 9:13 - 9:16
    India or the UK or America or wherever.
  • 9:16 - 9:19
    But it does something else
    that is less discussed:
  • 9:19 - 9:22
    it shatters unity, and it divides people.
  • 9:22 - 9:25
    It does this in two overarching ways.
  • 9:25 - 9:26
    The first is obvious:
  • 9:27 - 9:30
    it facilitates direct confrontation.
  • 9:30 - 9:33
    If you are a Hillary supporter
    during the election campaign,
  • 9:33 - 9:36
    you can row with Trump supporters online.
  • 9:36 - 9:37
    I've seen it happen over Brexit,
  • 9:37 - 9:41
    I've seen friends unfriend
    each other on Facebook over Brexit,
  • 9:41 - 9:43
    it facilitates direct confrontation.
  • 9:43 - 9:46
    But it does something else more subtle:
  • 9:46 - 9:51
    it creates what is called homophily,
    love of the like-minded.
  • 9:51 - 9:54
    Now, when you are on Facebook,
    you have your friends -
  • 9:54 - 9:57
    we understand "friends", OK? -
  • 9:57 - 10:02
    and most of your friends are likely
    to broadly share similar worldviews.
  • 10:02 - 10:04
    Now, I am not saying
    that everyone votes for your party,
  • 10:04 - 10:06
    there are left, there are right,
  • 10:06 - 10:08
    but it's unlikely
    that anyone in this audience
  • 10:08 - 10:10
    is going to be friends
    with a bunch of Nazis ...
  • 10:10 - 10:12
    I would hope anyway.
  • 10:12 - 10:14
    So, what happens is,
  • 10:14 - 10:17
    first of all you are cocooned
    with like-minded individuals,
  • 10:17 - 10:20
    who post articles that tend
    to slant to your point of view.
  • 10:20 - 10:24
    Second, even more insidious,
    is the Facebook algorithm.
  • 10:24 - 10:28
    People think social media platforms
    are these impartial mediums,
  • 10:28 - 10:29
    but they are not.
  • 10:29 - 10:33
    They are capitalist enterprises
    and their product is us, their users.
  • 10:34 - 10:35
    The algorithm is designed
  • 10:35 - 10:38
    to keep us on their platforms
    for as long as possible
  • 10:38 - 10:41
    by feeding us content
    that it knows we like.
  • 10:41 - 10:43
    So, if you're a Hillary supporter
  • 10:43 - 10:46
    and you're clicking on pro-Hillary links
    during the campaign,
  • 10:46 - 10:49
    the algorithm is going to feed you
    more and more pro-Hillary content,
  • 10:49 - 10:51
    and vice versa.
  • 10:51 - 10:56
    Thus are our prejudices reaffirmed
    and hatred of the other exacerbated.
  • 10:58 - 10:59
    There he is
  • 10:59 - 11:01
    in all his splendiferous stupidity!
  • 11:02 - 11:04
    So how did we get here, part 2.
  • 11:05 - 11:10
    The rise of social media has coincided
    with a crisis of faith in the West
  • 11:10 - 11:12
    and, critically, its institutions.
  • 11:12 - 11:14
    From the 2003 Iraq war,
  • 11:14 - 11:17
    in which our politicians
    took us to war on a lie,
  • 11:17 - 11:20
    to the 2008 global financial crisis,
  • 11:20 - 11:25
    in which the bankers
    took us into the Great Recession,
  • 11:25 - 11:27
    to the NSA Snowden spying revelations,
  • 11:27 - 11:31
    combined with long-term declines
    in media trust,
  • 11:31 - 11:36
    all of this means that the fundamental
    institutions upon which the West is based
  • 11:36 - 11:38
    have been discredited.
  • 11:38 - 11:40
    As a result,
  • 11:40 - 11:43
    we see the rise of populist demagogues,
    nationalist demagogues,
  • 11:43 - 11:44
    Geert Wilders in Holland,
  • 11:45 - 11:47
    Nigel Farage with Brexit,
  • 11:47 - 11:50
    Marine Le Pen, recently
    in the French elections,
  • 11:50 - 11:52
    and of course the apotheosis of it:
  • 11:52 - 11:53
    Mr. Trump.
  • 11:53 - 11:56
    Sorry, there we go again.
  • 11:59 - 12:02
    So, what is the end result of all this?
  • 12:02 - 12:05
    The end result of all this is
    that we now live in a post-truth age.
  • 12:05 - 12:07
    Now, this has been defined.
  • 12:07 - 12:11
    This was Oxford Dictionaries'
    word of the year 2016,
  • 12:11 - 12:14
    and I am going to read out its definition
  • 12:14 - 12:16
    because I think
    it is worth thinking about.
  • 12:17 - 12:22
    It was defined as an adjective
    relating to or denoting circumstances
  • 12:22 - 12:23
    in which objective facts
  • 12:23 - 12:26
    are less influential
    in shaping public opinion
  • 12:26 - 12:29
    than appeals to emotion
    and personal belief.
  • 12:29 - 12:32
    This was supplemented by the president
    of Oxford Dictionaries,
  • 12:32 - 12:37
    who said: "It is not surprising
    that our choice reflects a year
  • 12:37 - 12:41
    dominated by highly charged
    political and social discourse.
  • 12:41 - 12:45
    Fuelled by the rise
    of social media as a news source
  • 12:45 - 12:50
    and a growing distrust
    of facts offered up by the establishment,
  • 12:50 - 12:56
    post-truth as a concept has been finding
    its linguistic footing for some time."
  • 12:58 - 13:00
    Now, what are the problems?
  • 13:00 - 13:02
    The problems are that
    in both Europe and the US,
  • 13:02 - 13:05
    public broadcasters
    have to adhere to guidelines.
  • 13:05 - 13:11
    And these cover balance, impartiality,
    professional journalistic standards, and so on.
  • 13:11 - 13:15
    No such regulation
    exists for social media.
  • 13:15 - 13:18
    And the cost is plain to see.
  • 13:18 - 13:20
    Facebook, which has been named
  • 13:20 - 13:23
    as the platform most conducive
    to the spread of false stories,
  • 13:23 - 13:27
    has recently announced that it is going to
    start using third-party fact-checkers,
  • 13:27 - 13:29
    people like Full Fact or Snopes.
  • 13:29 - 13:31
    It's going to self-police,
  • 13:31 - 13:34
    it's going to try in some way -
    imperfectly, in my opinion -
  • 13:34 - 13:36
    to combat the phenomenon.
  • 13:37 - 13:39
    But what are the results?
  • 13:39 - 13:42
    The results are,
    and this is very dangerous,
  • 13:42 - 13:46
    that the post-truth world
    has created the post-truth leader.
  • 13:47 - 13:51
    From Vladimir Putin, who reinvents
    reality in Eastern Ukraine,
  • 13:51 - 13:52
    to Donald Trump,
  • 13:52 - 13:54
    whose White House will tell you
  • 13:54 - 13:57
    his inauguration crowds
    were bigger than Obama's
  • 13:57 - 13:59
    when you can see that they weren't.
  • 14:00 - 14:02
    Now, they are different, let's be fair.
  • 14:02 - 14:04
    One is a dictator in all but name;
  • 14:04 - 14:06
    the other leads the world's
    most powerful democracy.
  • 14:06 - 14:09
    But, in each case, the goal is the same:
  • 14:09 - 14:14
    the more doubt you can sow
    in people's minds about all information,
  • 14:14 - 14:18
    the more you will weaken their propensity
    to recognize the truth when they see it.
  • 14:19 - 14:21
    The goal is not to twist the notion,
  • 14:21 - 14:23
    twist the truth,
  • 14:23 - 14:24
    like the politicians of old:
  • 14:25 - 14:27
    "I did not have sexual relations
    with that woman",
  • 14:27 - 14:31
    but to subvert the very notion
    that an objective truth exists at all:
  • 14:31 - 14:34
    "My crowds were bigger!" -
    "That's just an alternative fact."
  • 14:36 - 14:38
    So this is all very cheerful.
  • 14:38 - 14:40
    So I'd like to end
    on a slightly happier note.
  • 14:40 - 14:44
    What can we do to survive
    in this post-truth age?
  • 14:44 - 14:46
    Now, I believe, in the end,
  • 14:46 - 14:50
    change is going to have to come
    at the legislative level:
  • 14:50 - 14:52
    governments are going
    to have to intervene.
  • 14:52 - 14:56
    But I have put together, in the interim,
    a post-truth survival kit
  • 14:56 - 14:59
    for surviving, negotiating,
  • 14:59 - 15:00
    and trying to make your way
  • 15:00 - 15:04
    through this sick information environment,
    this unhealthy information environment,
  • 15:04 - 15:05
    in which we live.
  • 15:05 - 15:07
    One.
  • 15:07 - 15:09
    Go out of your way
    to friend or follow people
  • 15:09 - 15:11
    that don't necessarily
    agree with your worldview.
  • 15:11 - 15:14
    Please, don't go out and friend
    a bunch of Golden Dawn supporters,
  • 15:14 - 15:17
    but generally try and broaden
    your circles a bit.
  • 15:17 - 15:18
    Two.
  • 15:18 - 15:22
    Go directly to the websites
    of trusted news sources,
  • 15:22 - 15:25
    or, better still,
    buy the newspaper itself.
  • 15:25 - 15:27
    Read all of its reporting;
  • 15:27 - 15:31
    don't cherry-pick articles with a slant
    that appeals to your preexisting beliefs.
  • 15:32 - 15:33
    Three.
  • 15:33 - 15:35
    Read articles from publications
  • 15:35 - 15:38
    whose political views
    you don't agree with.
  • 15:38 - 15:39
    You will learn things.
  • 15:39 - 15:41
    Trust me.
  • 15:41 - 15:42
    Four.
  • 15:42 - 15:43
    Read books.
  • 15:43 - 15:44
    Yes, they still exist.
  • 15:44 - 15:48
    In fact, I would recommend
    you read one book in particular.
  • 15:48 - 15:50
    It's called: "War in 140 Characters:
  • 15:50 - 15:55
    How Social Media Is Reshaping Conflict
    in the Twenty-First Century."
  • 15:55 - 15:57
    It's available on Amazon,
    by your humble servant.
  • 15:57 - 16:00
    Well worth $30 of anybody's money.
  • 16:01 - 16:02
    Five.
  • 16:02 - 16:04
    Log off.
  • 16:04 - 16:07
    Get off Facebook, it will suck your life.
  • 16:07 - 16:09
    Leave the house.
  • 16:09 - 16:10
    Go and meet friends.
  • 16:10 - 16:11
    Have fun.
  • 16:11 - 16:12
    You will benefit.
  • 16:12 - 16:14
    Thank you very much!
  • 16:14 - 16:16
    (Applause)
  • 16:17 - 16:18
    Thank you! Thank you!
  • 16:18 - 16:20
    (Applause)
Title:
Fact, fiction and politics in a post-truth age | David Patrikarakos | TEDxAthens
Description:

In this talk, David analyses the trend of fake news worldwide. From Russia to the American Elections and beyond, from Twitter to Iran.

David Patrikarakos is an international author, journalist and producer with an extensive track record in high-end international affairs. As a foreign correspondent, he has reported from across the world for a variety of leading US, UK and international publications.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx

Video Language:
English
Team:
closed TED
Project:
TEDxTalks
Duration:
16:27