Winning the battle against online hate speech in Myanmar | Min Thu Aung | TEDxUM1Yangon

  • 0:07 - 0:09
    OK, (Burmese) Hello.
  • 0:11 - 0:12
    My name is Min Thu Aung,
  • 0:12 - 0:15
    and I'm trying to find a solution
  • 0:15 - 0:19
    to the problem
    of online hate speech in Myanmar.
  • 0:20 - 0:23
    So you might ask,
    "Why is this so important?"
  • 0:23 - 0:26
    After all, hate speech
    is nothing new, right?
  • 0:26 - 0:31
    Countries all over the world
    have experienced hate speech in some way.
  • 0:32 - 0:37
    And as a country we have
    a lot of problems to solve.
  • 0:37 - 0:39
    I'll give you one. Healthcare.
  • 0:39 - 0:41
    We need to improve our healthcare,
  • 0:41 - 0:44
    we need to improve our education,
    our roads, electricity maybe -
  • 0:45 - 0:46
    a lot of problems to solve.
  • 0:46 - 0:50
    Why hate speech? And why now?
    Well, here's why.
  • 0:52 - 0:53
    As we know well,
  • 0:54 - 0:59
    back in August 2017, there
    was the Rohingya crisis of course.
  • 0:59 - 1:01
    The UN fact-finding mission
  • 1:02 - 1:06
    concluded that online
    hate speech on Facebook
  • 1:06 - 1:11
    was one of the main reasons
    for this crisis.
  • 1:13 - 1:18
    This is hate speech
    that divided communities
  • 1:18 - 1:20
    and caused a refugee crisis.
  • 1:21 - 1:23
    This is hate speech that -
  • 1:25 - 1:30
    really damaged our country's
    international reputation,
  • 1:30 - 1:34
    and also reduced
  • 1:34 - 1:40
    the investment friendliness
    of the country from a Western perspective.
  • 1:40 - 1:45
    It also is hate speech
    that one day will affect all of us,
  • 1:46 - 1:49
    and, therefore, I believe
    that it's an important problem to solve.
  • 1:50 - 1:51
    Here's another reason:
  • 1:51 - 1:54
    Just last month,
  • 1:55 - 2:00
    an LGBT librarian
    at a private university in Yangon
  • 2:01 - 2:06
    was forced to confess his status
    in a case of workplace harassment.
  • 2:06 - 2:12
    This harassment continued online
    where he experienced cyberbullying
  • 2:13 - 2:18
    and also hate speech
    because of his sexual orientation
  • 2:19 - 2:22
    and, tragically, he took his own life.
  • 2:22 - 2:24
    It's a big problem.
  • 2:24 - 2:26
    Another reason.
  • 2:26 - 2:29
    Another reason is that the elections
    are coming up in 2020.
  • 2:30 - 2:34
    Political parties now are already
    taking their positions,
  • 2:35 - 2:38
    and just like the US
    election in 2016,
  • 2:38 - 2:41
    we expect that hate speech
    and fake news
  • 2:41 - 2:44
    could be a feature
    of the elections in 2020.
  • 2:44 - 2:47
    So that's why I'm convinced
    that online hate speech,
  • 2:47 - 2:48
    in any democracy,
  • 2:48 - 2:53
    is an important issue to solve,
    and Myanmar is no exception.
  • 2:53 - 2:54
    It applies to Myanmar as well.
  • 2:55 - 2:58
    I guess we first need
    to look at what is hate speech?
  • 2:58 - 3:00
    What is the problem we're trying to solve?
  • 3:01 - 3:02
    What is free speech?
  • 3:02 - 3:06
    Where does free speech end,
    and where does hate speech begin?
  • 3:06 - 3:09
    And for that, we can rely on -
  • 3:10 - 3:14
    international definitions
    of hate speech and free speech.
  • 3:14 - 3:18
    So article 19 of the Universal
    Declaration of Human Rights
  • 3:18 - 3:24
    and also the International Covenant
    on Civil and Political Rights, ICCPR,
  • 3:24 - 3:28
    say that free speech is the right
    to hold any opinion you want
  • 3:28 - 3:32
    and also to seek and receive
    information in any way, right?
  • 3:32 - 3:35
    That's quite clear. You can say
    anything you want. That's free speech.
  • 3:35 - 3:39
    But article 20 of the ICCPR
    goes one step further.
  • 3:40 - 3:43
    It tells you where
    your free speech could be limited.
  • 3:44 - 3:47
    If your free speech includes
    any propaganda for war,
  • 3:48 - 3:51
    any advocacy of national, racial,
    or religious hatred
  • 3:51 - 3:56
    which incites discrimination,
    incites hostility or violence,
  • 3:56 - 3:59
    then your free speech can be restricted.
  • 4:00 - 4:01
    That's pretty clear.
  • 4:01 - 4:04
    But what does that mean
    in normal language? That means ...
  • 4:04 - 4:07
    Well, in private
    you can say anything you want,
  • 4:07 - 4:09
    even if it's very controversial,
  • 4:09 - 4:12
    but if you use that same speech in public
  • 4:12 - 4:17
    to incite any violence, any hostility
    against specific groups of people,
  • 4:18 - 4:22
    then that can be considered hate speech,
    and that can be offline or online.
  • 4:22 - 4:24
    It's the same.
  • 4:25 - 4:28
    Now, how big is the problem, right?
  • 4:29 - 4:32
    I think it's first important to note
  • 4:32 - 4:35
    that social media and the internet
    are not all about hate speech, right?
  • 4:35 - 4:41
    If anything, the UN in 2016 passed
    a resolution that says that the internet
  • 4:41 - 4:47
    is actually required for, or allows,
    the enjoyment of people's human rights.
  • 4:48 - 4:52
    It's very important, and some countries
    have also even made internet access
  • 4:52 - 4:55
    a constitutional right, right?
  • 4:56 - 4:59
    So the internet and social media -
  • 4:59 - 5:01
    they're not all about hate speech.
  • 5:02 - 5:07
    However, in 2014, when most
    of the Myanmar population
  • 5:07 - 5:09
    got access to the internet,
  • 5:09 - 5:13
    they also got access to a tool
    which allowed them
  • 5:13 - 5:17
    to access and express their views
    to millions of people
  • 5:17 - 5:19
    for the first time in their lives.
  • 5:19 - 5:22
    Behind the veil of anonymity,
  • 5:23 - 5:28
    users on Facebook realized
    that the speech they were using online
  • 5:28 - 5:29
    had consequences,
  • 5:29 - 5:32
    negative consequences for people offline.
  • 5:33 - 5:35
    And with 20 million users in Myanmar,
  • 5:35 - 5:38
    Facebook is effectively
    the internet, right?
  • 5:38 - 5:42
    I think we all know that
    in some way, shape, or form,
  • 5:42 - 5:46
    and therefore, for the rest of my talk
    I'm going to focus on Facebook
  • 5:46 - 5:51
    as the main area where hate speech
    exists online in Myanmar.
  • 5:51 - 5:56
    Now, concrete information or data
    on hate speech is hard to find.
  • 5:57 - 6:01
    According to a survey done
    by Myanmar mobile operator Telenor,
  • 6:02 - 6:04
    back in 2016,
  • 6:04 - 6:08
    around 28%
    of the people surveyed
  • 6:09 - 6:13
    knew someone who had been cyberbullied,
    and hate speech could be a part of it.
  • 6:14 - 6:18
    But actually, anecdotally,
    we know it's a big problem.
  • 6:18 - 6:23
    If you look at any Facebook page
    of news outlets in the country,
  • 6:23 - 6:27
    you will find in the comments
    following each article a lot of speech
  • 6:27 - 6:31
    which could be termed hate speech,
    relating to religion, relating to race,
  • 6:32 - 6:33
    relating to other characteristics.
  • 6:34 - 6:36
    We also know, of course,
    as I mentioned earlier,
  • 6:36 - 6:39
    that the UN fact-finding mission also said
  • 6:39 - 6:44
    that hate speech on Facebook
    contributed to the Rohingya crisis.
  • 6:44 - 6:47
    So, in a nutshell,
    it's a big problem, right?
  • 6:47 - 6:49
    But this is not a new problem.
  • 6:49 - 6:54
    Of course, countries in the West
    have used social media and the internet
  • 6:54 - 6:56
    since the mid-nineties, right?
  • 6:56 - 6:58
    So how are they tackling the problem?
  • 6:59 - 7:04
    Well, I will show the way
    that they are tackling this problem,
  • 7:04 - 7:09
    based on this framework
    of the four solutions for hate speech.
  • 7:09 - 7:13
    First, let's look at
    the state solution, which is where -
  • 7:15 - 7:20
    governments regulate
    and control speech online and offline.
  • 7:20 - 7:23
    One example of this
    is actually in Germany.
  • 7:23 - 7:27
    They passed an act which
    has been called the "Facebook Act,"
  • 7:27 - 7:32
    which requires Facebook
    to remove any law-breaking content
  • 7:32 - 7:35
    within 24 hours of being told.
  • 7:35 - 7:36
    If they don't do this,
  • 7:36 - 7:39
    then they could be fined
    up to 50 million euros.
  • 7:40 - 7:44
    And just in July this year,
    Facebook was fined 2 million euros
  • 7:44 - 7:47
    for breaking some part of this law.
  • 7:47 - 7:53
    So this law has actually been described
    as one of the most severe ways
  • 7:53 - 7:56
    that governments
    and regulators around the world
  • 7:56 - 7:58
    have limited hate speech online.
  • 7:59 - 8:02
    And I know that France is also looking
    at a similar law as well.
  • 8:03 - 8:06
    So civil society groups, the NGOs
    and rights groups in Germany
  • 8:07 - 8:09
    have actually criticized this law,
  • 8:09 - 8:14
    saying that it could actually
    reduce free speech,
  • 8:14 - 8:16
    it could actually lead to censorship.
  • 8:16 - 8:18
    Why is that?
  • 8:18 - 8:21
    Well, Facebook can be fined
    if there's any law-breaking content.
  • 8:21 - 8:27
    So Facebook could proactively
    remove content that some people
  • 8:27 - 8:30
    might not even consider to be hate speech,
    so that could lead to censorship.
  • 8:30 - 8:32
    Now, in a country like Germany,
  • 8:32 - 8:37
    which is a very advanced democracy,
  • 8:37 - 8:39
    if there's criticism
    of this kind of law,
  • 8:40 - 8:41
    in a newer democracy
  • 8:41 - 8:47
    where legislative, executive
    and judicial capacity are still limited,
  • 8:47 - 8:49
    and freedom of the press
    is still also limited,
  • 8:50 - 8:52
    this kind of law is extremely risky.
  • 8:52 - 8:56
    But some countries
    are looking at this solution.
  • 8:56 - 8:58
    Let's look at the social media solution.
  • 8:58 - 9:00
    And this is where platforms,
    such as Facebook,
  • 9:01 - 9:03
    remove content proactively,
  • 9:03 - 9:06
    regardless of whether
    there are laws or not.
  • 9:06 - 9:09
    Now, this solution is actually
    quite active, in the sense
  • 9:09 - 9:13
    that Facebook has a few thousand
    content moderators around the world
  • 9:13 - 9:19
    that review content that has been flagged
    by users as hate speech,
  • 9:19 - 9:22
    or as another kind
    of non-compliant content.
  • 9:22 - 9:24
    Also in Myanmar,
    I understand that Facebook
  • 9:24 - 9:28
    has a hundred or so content moderators
    for the Myanmar language,
  • 9:29 - 9:30
    and I also understand that Facebook
  • 9:30 - 9:35
    is in the longer term looking at using
    artificial intelligence, or "AI,"
  • 9:36 - 9:39
    to help detect and review hate speech.
  • 9:40 - 9:43
    This can be really fast and scalable,
    but under three conditions.
  • 9:44 - 9:48
    Number one: that these content moderators
    actually can distinguish
  • 9:48 - 9:51
    between free speech
    and hate speech without bias.
  • 9:52 - 9:54
    In an unbiased, fair way. That's number one.
  • 9:54 - 9:59
    Number two: the way
    that hate speech is expressed
  • 9:59 - 10:01
    from time to time actually changes, right?
  • 10:01 - 10:04
    The way it's expressed today
    and expressed next week may be different.
  • 10:04 - 10:06
    So it's important that content moderators
  • 10:06 - 10:10
    actually keep up to date with the way
    that hate speech is expressed online.
  • 10:10 - 10:12
    That's the second condition.
  • 10:12 - 10:15
    And the third one: Facebook
    is investing in AI,
  • 10:15 - 10:20
    and it's important that this AI
    is actually trained well
  • 10:20 - 10:26
    by the decisions made by users
    and also by content moderators,
  • 10:26 - 10:32
    so that the AI can make its own decisions
    accurately at a later point in time.
  • 10:33 - 10:37
    Of course, this solution
    can be quite expensive for Facebook.
  • 10:37 - 10:40
    Lots of content moderators,
    lots of new technology, lots of AI -
  • 10:40 - 10:44
    but I don't think Facebook, frankly,
    has a choice but to invest in this.
  • 10:44 - 10:46
    They need to invest in this,
  • 10:46 - 10:50
    so it's not seen as a place
    for hate speech and fake news
  • 10:51 - 10:52
    online in the future.
  • 10:52 - 10:57
    So we may have heard
    of the Cambridge Analytica scandal
  • 10:57 - 10:59
    on Facebook.
  • 10:59 - 11:02
    I believe that if this hate speech issue
    is not solved on Facebook,
  • 11:02 - 11:06
    it could become
    the next Cambridge Analytica,
  • 11:06 - 11:09
    which could potentially be detrimental
    for Facebook if they don't invest.
  • 11:11 - 11:15
    Now, looking at a third way,
    the self solution,
  • 11:15 - 11:19
    which is where each of us actually
    individually takes responsibility
  • 11:20 - 11:23
    for what we post, what we share,
    what we like online.
  • 11:24 - 11:26
    And there are three ways this can happen.
  • 11:26 - 11:29
    Number one is through formal education.
  • 11:29 - 11:32
    So online curriculum,
    online training, online safety
  • 11:32 - 11:34
    being part of the national curriculum.
  • 11:34 - 11:35
    That's the first way.
  • 11:35 - 11:37
    The second way is through
    informal training.
  • 11:37 - 11:39
    So people outside of school
  • 11:39 - 11:43
    can be trained
    through mass awareness programs
  • 11:43 - 11:47
    by the government, on TV,
    on radio, in newspapers.
  • 11:47 - 11:48
    That's another way.
  • 11:48 - 11:52
    And the third way is that we could train
    people on Facebook itself.
  • 11:52 - 11:56
    Now, I have to say that Myanmar
  • 11:56 - 11:58
    is actually making
    some progress in this regard,
  • 11:59 - 12:03
    so the Myanmar mobile operator
    Telenor, for example,
  • 12:03 - 12:06
    has actually trained 300,000
    students on hate speech
  • 12:06 - 12:10
    and online safety in the last
    two-and-a-half years,
  • 12:11 - 12:15
    and this has been done collaborating
    with the Ministry of Education,
  • 12:15 - 12:18
    with civil society group MIDO
  • 12:18 - 12:21
    and with training provider KMD as well.
  • 12:22 - 12:24
    All four partnering together.
  • 12:25 - 12:32
    Of course, we also know that online safety
    is in the national curriculum in the UK,
  • 12:32 - 12:35
    in Germany, in Singapore,
    and Sweden for example.
  • 12:36 - 12:40
    So this is a solution that could work
    and is probably required long-term,
  • 12:40 - 12:43
    but because it requires behavioral change,
  • 12:43 - 12:45
    we have to change the way
    we think, the way we act.
  • 12:46 - 12:50
    That takes time, and therefore,
    it cannot stand alone.
  • 12:50 - 12:54
    And last but not least,
    the civil society solution:
  • 12:54 - 12:59
    so this is where NGOs work together
    to actually determine what hate speech is,
  • 12:59 - 13:03
    and in Myanmar, this has been quite
    an important part of reducing hate speech.
  • 13:03 - 13:07
    So there have been a number
    of campaigns done by civil society.
  • 13:07 - 13:12
    The Panzagar campaign by MIDO,
    for example, is one of them.
  • 13:12 - 13:15
    The Myanmar Institute
    for Peace and Security
  • 13:15 - 13:16
    has an online hate speech tracker,
  • 13:17 - 13:21
    and there have been a lot of media literacy
    and fact-checking campaigns
  • 13:21 - 13:23
    organized by civil society as well.
  • 13:23 - 13:25
    Civil society is very active
    and will remain active,
  • 13:25 - 13:27
    I believe, going forward.
  • 13:27 - 13:28
    Oops!
  • 13:29 - 13:31
    So what's the answer for us in Myanmar?
  • 13:31 - 13:34
    How can we actually apply
    the learning from around the world
  • 13:35 - 13:37
    and create our own solution?
  • 13:37 - 13:39
    Well, let's use the same framework again.
  • 13:39 - 13:42
    And as I mentioned, the state solution -
  • 13:42 - 13:44
    so having laws to regulate hate speech,
  • 13:45 - 13:47
    maybe it's a little early
    for that in Myanmar.
  • 13:47 - 13:52
    But I do think though that Facebook
    could play a much more prominent role.
  • 13:52 - 13:55
    They need to hire more content moderators
    familiar with the Myanmar language
  • 13:55 - 13:57
    and even the ethnic languages.
  • 13:58 - 14:01
    The second thing is we need
    better content moderators:
  • 14:01 - 14:03
    content moderators
    that can keep up-to-date
  • 14:04 - 14:06
    with the way that hate speech
    is expressed online.
  • 14:06 - 14:08
    Content moderators
    that can distinguish clearly
  • 14:08 - 14:11
    between free speech and hate speech,
  • 14:11 - 14:16
    and content moderators whose decisions
    are reviewed regularly,
  • 14:16 - 14:20
    and used as a way to train future
    content moderators as well.
  • 14:23 - 14:25
    I think that Facebook cannot act alone.
  • 14:25 - 14:28
    They need to partner
    closely with civil society,
  • 14:28 - 14:32
    and civil society can help Facebook
    spot new trends in hate speech
  • 14:32 - 14:36
    and also help train and support
    the content moderators,
  • 14:36 - 14:38
    so that they work hand in hand.
  • 14:39 - 14:41
    So that's in the short term.
  • 14:41 - 14:44
    In the medium term,
    I mentioned earlier that I believe AI,
  • 14:44 - 14:45
    artificial intelligence,
  • 14:45 - 14:48
    is a very important part
    of reducing hate speech.
  • 14:49 - 14:55
    So with more content moderators,
    collaboration with civil society,
  • 14:55 - 14:58
    I believe that these -
  • 15:00 - 15:02
    decisions made by content moderators
  • 15:02 - 15:05
    should be used
    to train the AI of Facebook,
  • 15:05 - 15:12
    such that posts can be flagged, reviewed,
    and maybe even taken down one day,
  • 15:12 - 15:15
    consistently in line
    with international standards
  • 15:15 - 15:16
    of freedom of expression,
  • 15:17 - 15:20
    so that we're not relying
    on human moderators only.
  • 15:20 - 15:23
    Because human moderators to moderate -
  • 15:23 - 15:28
    well, 100 to even 500 content moderators
    to moderate 20 million people's speech,
  • 15:28 - 15:29
    it's a bit difficult.
  • 15:29 - 15:31
    We need a technical solution as well.
  • 15:32 - 15:34
    In the long term though,
  • 15:34 - 15:37
    I think hate speech
    can really only be reduced
  • 15:37 - 15:42
    if we change the way we think as a people.
  • 15:42 - 15:44
    And that can be done in two ways I think:
  • 15:44 - 15:47
    First, I think there should
    be mandatory online safety training,
  • 15:47 - 15:49
    including on hate speech
    on Facebook itself.
  • 15:50 - 15:55
    New users should be required
    to actually go through an online safety course
  • 15:55 - 15:57
    before they use Facebook.
  • 15:57 - 15:59
    Existing users, like probably you and me,
  • 16:00 - 16:06
    should take annual recertification
    in Facebook's Community Guidelines,
  • 16:07 - 16:09
    which include online safety
    and hate speech.
  • 16:09 - 16:12
    And until we complete
    that recertification,
  • 16:12 - 16:13
    the account should be locked.
  • 16:13 - 16:18
    We should be forced to go through
    this training every year by Facebook, OK?
  • 16:19 - 16:21
    Second thing is I believe it's important
  • 16:21 - 16:25
    to have an online safety course
    in our national curriculum.
  • 16:26 - 16:30
    With that, I'm pleased to announce
    that on the 12th of July, just last week,
  • 16:31 - 16:37
    Telenor Myanmar and international NGO
    Plan International signed an agreement
  • 16:37 - 16:41
    to explore including online safety
    in the national curriculum,
  • 16:41 - 16:43
    together with the Ministry of Education.
  • 16:43 - 16:45
    We're working on this.
  • 16:45 - 16:49
    And if it works well, we will be educating
  • 16:49 - 16:53
    millions of our children every year
  • 16:53 - 16:57
    on how to be responsible
    digital citizens online,
  • 16:58 - 17:02
    avoiding hate speech, and using
    social media in a positive way.
  • 17:05 - 17:06
    So I believe that -
  • 17:08 - 17:11
    platforms, Facebook
    and its content moderators,
  • 17:11 - 17:14
    its AI, civil society, educators,
    all working together
  • 17:15 - 17:18
    can significantly reduce hate speech,
    but what about you?
  • 17:18 - 17:19
    What can you do?
  • 17:19 - 17:21
    How can you be a part of the solution?
  • 17:21 - 17:24
    Or have you been a victim
    of hate speech yourself?
  • 17:24 - 17:29
    Well, here are four tips on what
    you can do to reduce hate speech.
  • 17:30 - 17:31
    The first one's obvious, right?
  • 17:32 - 17:34
    Avoid hate speech yourself.
    That's a given.
  • 17:36 - 17:39
    Secondly, don't share or like
    any post or comments
  • 17:39 - 17:41
    that contain hate speech.
  • 17:42 - 17:45
    Sometimes hate speech
    can appear as fake news,
  • 17:45 - 17:47
    as jokes on Facebook as well.
  • 17:47 - 17:53
    So be careful not to share, not to like,
    and not to comment on these posts either.
  • 17:54 - 17:58
    You can also report any instance
    of hate speech directly to Facebook,
  • 17:58 - 18:01
    so that their content moderators
    can review and take down
  • 18:01 - 18:04
    any hate speech quickly.
  • 18:05 - 18:08
    And last but not least,
    if you have been a victim yourself,
  • 18:08 - 18:11
    you're now able to report it
    to the cybercrime police in Naypyidaw.
  • 18:11 - 18:17
    They now have about 100 officers
    that are looking at this issue.
  • 18:17 - 18:19
    Again, 100 officers for 20 million users.
  • 18:19 - 18:22
    It's still a bit small,
    but it's a good start.
  • 18:22 - 18:25
    But you can also call on them,
    or alternatively,
  • 18:25 - 18:28
    report the issue to your local
    police station as well,
  • 18:28 - 18:30
    for assistance if you
    are a victim of hate speech.
  • 18:31 - 18:33
    Now, I think that,
  • 18:33 - 18:36
    for hate speech to be reduced
    in Myanmar long-term,
  • 18:36 - 18:39
    we all need to be part of this solution,
    and that starts now.
  • 18:39 - 18:40
    Thank you so much everyone.
  • 18:40 - 18:42
    (Applause)
Description:

We are now living in a world of increasing connectivity. This connectivity brings opportunities as well as challenges. One of the overarching challenges is our safety during online activities. Nowadays, we are seeing increasing hate speech, fake news, and other harmful materials on the internet. In this talk, Min Thu Aung analyzes the issue of online safety in the context of Myanmar and explores the ways in which we can tackle the problem. Min Thu Aung leads the Business Sustainability and Special Projects team at Telenor in Myanmar. Since 2016, he has been leading a program that has educated over 300,000 youth in Myanmar in being responsible digital citizens on social media, with a focus on tackling the pervasive issue of hate speech. Prior to this, Min led a microfinance institution in Myanmar focusing on financial inclusion and was a management consultant in the UK before his return to Myanmar in 2013.

As a son of two expatriate Myanmar doctors, he was educated in the culturally diverse environments of Malaysia and the UK, where he personally witnessed how diversity can drive sustainable socioeconomic growth and social harmony. He believes that empowering future generations of Myanmar’s youth to appreciate diversity as responsible digital citizens and to refrain from hate speech on Myanmar’s social media platforms will aid Myanmar in its ongoing reintegration into the global community following decades of isolation.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx
