How Twitter needs to change

  • 0:01 - 0:04
    Chris Anderson:
    What worries you right now?
  • 0:04 - 0:06
    You've been very open
    about lots of issues on Twitter.
  • 0:06 - 0:09
    What would be your top worry
  • 0:09 - 0:11
    about where things are right now?
  • 0:11 - 0:14
    Jack Dorsey: Right now,
    the health of the conversation.
  • 0:14 - 0:18
    So, our purpose is to serve
    the public conversation,
  • 0:18 - 0:23
    and we have seen
    a number of attacks on it.
  • 0:23 - 0:26
    We've seen abuse, we've seen harassment,
  • 0:26 - 0:29
    we've seen manipulation,
  • 0:29 - 0:33
    automation, human coordination,
    misinformation.
  • 0:34 - 0:38
    So these are all dynamics
    that we were not expecting
  • 0:38 - 0:42
    13 years ago when we were
    starting the company.
  • 0:42 - 0:45
    But we do now see them at scale,
  • 0:45 - 0:50
    and what worries me most
    is just our ability to address it
  • 0:50 - 0:53
    in a systemic way that is scalable,
  • 0:53 - 1:00
    that has a rigorous understanding
    of how we're taking action,
  • 1:00 - 1:03
    a transparent understanding
    of how we're taking action
  • 1:03 - 1:06
    and a rigorous appeals process
    for when we're wrong,
  • 1:06 - 1:08
    because we will be wrong.
  • 1:09 - 1:11
    Whitney Pennington Rodgers:
    I'm really glad to hear
  • 1:11 - 1:13
    that that's something that concerns you,
  • 1:13 - 1:16
    because I think there's been
    a lot written about people
  • 1:16 - 1:18
    who feel they've been abused
    and harassed on Twitter,
  • 1:18 - 1:22
    and I think no one more so
    than women and women of color
  • 1:22 - 1:23
    and black women.
  • 1:23 - 1:25
    And there's been data that's come out --
  • 1:25 - 1:28
    Amnesty International put out
    a report a few months ago
  • 1:28 - 1:33
    where they showed that, for a subset
    of active black female Twitter users,
  • 1:33 - 1:36
    on average, one in 10
    of their tweets
  • 1:36 - 1:38
    was some form of harassment.
  • 1:38 - 1:42
    And so when you think about health
    for the community on Twitter,
  • 1:42 - 1:46
    I'm interested to hear,
    "health for everyone,"
  • 1:46 - 1:49
    but specifically: How are you looking
    to make Twitter a safe space
  • 1:49 - 1:54
    for that subset, for women,
    for women of color and black women?
  • 1:54 - 1:55
    JD: Yeah.
  • 1:55 - 1:57
    So it's a pretty terrible situation
  • 1:57 - 1:59
    when you're coming to a service
  • 1:59 - 2:03
    where, ideally, you want to learn
    something about the world,
  • 2:03 - 2:09
    and you spend the majority of your time
    reporting abuse, receiving abuse,
  • 2:09 - 2:11
    receiving harassment.
  • 2:11 - 2:18
    So what we're looking most deeply at
    is just the incentives
  • 2:18 - 2:22
    that the platform naturally provides
    and the service provides.
  • 2:22 - 2:27
    Right now, the dynamic of the system
    makes it super-easy to harass
  • 2:27 - 2:31
    and to abuse others through the service,
  • 2:31 - 2:34
    and unfortunately, the majority
    of our system in the past
  • 2:34 - 2:39
    worked entirely based on people
    reporting harassment and abuse.
  • 2:39 - 2:45
    So about midway last year,
    we decided that we were going to apply
  • 2:45 - 2:49
    a lot more machine learning,
    a lot more deep learning to the problem,
  • 2:49 - 2:53
    and try to be a lot more proactive
    around where abuse is happening,
  • 2:53 - 2:57
    so that we can take the burden
    off the victim completely.
  • 2:57 - 3:00
    And we've made some progress recently.
  • 3:00 - 3:06
    About 38 percent of abusive tweets
    are now proactively identified
  • 3:06 - 3:08
    by machine learning algorithms
  • 3:08 - 3:10
    so that people don't actually
    have to report them.
  • 3:10 - 3:14
    But those that are identified
    are still reviewed by humans,
  • 3:14 - 3:19
    so we do not take down content or accounts
    without a human actually reviewing it.
  • 3:19 - 3:22
    But that was from zero percent
    just a year ago.
  • 3:22 - 3:24
    So that meant, at that zero percent,
  • 3:24 - 3:28
    every single person who received abuse
    had to actually report it,
  • 3:28 - 3:31
    which was a lot of work for them,
    a lot of work for us
  • 3:31 - 3:33
    and just ultimately unfair.
  • 3:35 - 3:38
    The other thing that we're doing
    is making sure that we, as a company,
  • 3:38 - 3:42
    have representation of all the communities
    that we're trying to serve.
  • 3:42 - 3:44
    We can't build a business
    that is successful
  • 3:44 - 3:47
    unless we have a diversity
    of perspectives inside our walls,
  • 3:47 - 3:51
    people who actually feel these issues
    every single day.
  • 3:51 - 3:55
    And that's not just with the team
    that's doing the work,
  • 3:55 - 3:57
    it's also within our leadership as well.
  • 3:57 - 4:03
    So we need to continue to build empathy
    for what people are experiencing
  • 4:03 - 4:06
    and give them better tools to act on it
  • 4:06 - 4:10
    and also give our customers
    a much better and easier approach
  • 4:10 - 4:13
    to handle some of the things
    that they're seeing.
  • 4:13 - 4:16
    So a lot of what we're doing
    is around technology,
  • 4:16 - 4:20
    but we're also looking at
    the incentives on the service:
  • 4:20 - 4:25
    What does Twitter incentivize you to do
    when you first open it up?
  • 4:25 - 4:27
    And in the past,
  • 4:29 - 4:34
    it's incented a lot of outrage,
    it's incented a lot of mob behavior,
  • 4:34 - 4:37
    it's incented a lot of group harassment.
  • 4:37 - 4:40
    And we have to look a lot deeper
    at some of the fundamentals
  • 4:40 - 4:43
    of what the service is doing
    to make the bigger shifts.
  • 4:43 - 4:47
    We can make a bunch of small shifts
    around technology, as I just described,
  • 4:47 - 4:52
    but ultimately, we have to look deeply
    at the dynamics in the network itself,
  • 4:52 - 4:53
    and that's what we're doing.
  • 4:53 - 4:55
    CA: But what's your sense --
  • 4:55 - 4:59
    what is the kind of thing
    that you might be able to change
  • 4:59 - 5:02
    that would actually
    fundamentally shift behavior?
  • 5:03 - 5:05
    JD: Well, one of the things --
  • 5:05 - 5:10
    we started the service
    with this concept of following an account,
  • 5:10 - 5:12
    as an example,
  • 5:12 - 5:16
    and I don't believe that's why
    people actually come to Twitter.
  • 5:16 - 5:21
    I believe Twitter is best
    as an interest-based network.
  • 5:21 - 5:25
    People come with a particular interest.
  • 5:25 - 5:28
    They have to do a ton of work
    to find and follow the related accounts
  • 5:28 - 5:30
    around those interests.
  • 5:30 - 5:34
    What we could do instead
    is allow you to follow an interest,
  • 5:34 - 5:36
    follow a hashtag, follow a trend,
  • 5:36 - 5:38
    follow a community,
  • 5:38 - 5:42
    which gives us the opportunity
    to show all of the accounts,
  • 5:42 - 5:46
    all the topics, all the moments,
    all the hashtags
  • 5:46 - 5:50
    that are associated with that
    particular topic and interest,
  • 5:50 - 5:54
    which really opens up
    the perspective that you see.
  • 5:54 - 5:56
    But that is a huge fundamental shift
  • 5:56 - 6:00
    to bias the entire network
    away from just an account bias
  • 6:00 - 6:03
    towards a topic and interest bias.
  • 6:03 - 6:06
    CA: Because isn't it the case
  • 6:07 - 6:11
    that one reason why you have
    so much content on there
  • 6:11 - 6:15
    is a result of putting millions
    of people around the world
  • 6:15 - 6:18
    in this kind of gladiatorial
    contest with each other
  • 6:18 - 6:20
    for followers, for attention?
  • 6:20 - 6:24
    Like, from the point of view
    of people who just read Twitter,
  • 6:24 - 6:25
    that's not an issue,
  • 6:25 - 6:29
    but for the people who actually create it,
    everyone's out there saying,
  • 6:29 - 6:32
    "You know, I wish I had
    a few more 'likes,' followers, retweets."
  • 6:32 - 6:34
    And so they're constantly experimenting,
  • 6:34 - 6:36
    trying to find the path to do that.
  • 6:36 - 6:40
    And what we've all discovered
    is that the number one path to do that
  • 6:40 - 6:44
    is to be some form of provocative,
  • 6:44 - 6:47
    obnoxious, eloquently obnoxious,
  • 6:47 - 6:50
    like, eloquent insults
    are a dream on Twitter,
  • 6:50 - 6:53
    where you rapidly pile up --
  • 6:53 - 6:57
    and it becomes this self-fueling
    process of driving outrage.
  • 6:57 - 7:00
    How do you defuse that?
  • 7:01 - 7:04
    JD: Yeah, I mean, I think you're spot on,
  • 7:04 - 7:05
    but that goes back to the incentives.
  • 7:06 - 7:08
    Like, one of the choices
    we made in the early days was
  • 7:08 - 7:13
    we had this number that showed
    how many people follow you.
  • 7:13 - 7:16
    We decided that number
    should be big and bold,
  • 7:16 - 7:20
    and anything that's on the page
    that's big and bold has importance,
  • 7:20 - 7:22
    and those are the things
    that you want to drive.
  • 7:22 - 7:24
    Was that the right decision at the time?
  • 7:24 - 7:25
    Probably not.
  • 7:25 - 7:27
    If I had to start the service again,
  • 7:27 - 7:29
    I would not emphasize
    the follower count as much.
  • 7:29 - 7:32
    I would not emphasize
    the "like" count as much.
  • 7:32 - 7:35
    I don't think I would even
    create "like" in the first place,
  • 7:35 - 7:38
    because it doesn't actually push
  • 7:38 - 7:41
    what we believe now
    to be the most important thing,
  • 7:41 - 7:44
    which is healthy contribution
    back to the network
  • 7:44 - 7:47
    and conversation to the network,
  • 7:47 - 7:49
    participation within conversation,
  • 7:49 - 7:52
    learning something from the conversation.
  • 7:52 - 7:54
    Those are not things
    that we thought of 13 years ago,
  • 7:54 - 7:57
    but we believe they are extremely
    important right now.
  • 7:57 - 8:00
    So we have to look at
    how we display the follower count,
  • 8:00 - 8:02
    how we display retweet count,
  • 8:02 - 8:04
    how we display "likes,"
  • 8:04 - 8:06
    and just ask the deep question:
  • 8:06 - 8:09
    Is this really the number
    that we want people to drive up?
  • 8:09 - 8:12
    Is this the thing that,
    when you open Twitter,
  • 8:12 - 8:14
    you see, "That's the thing
    I need to increase?"
  • 8:14 - 8:16
    And I don't believe
    that's the case right now.
  • 8:16 - 8:19
    (Applause)
  • 8:19 - 8:21
    WPR: I think we should look at
    some of the tweets
  • 8:21 - 8:23
    that are coming
    in from the audience as well.
  • 8:24 - 8:26
    CA: Let's see what you guys are asking.
  • 8:26 - 8:30
    I mean, this is -- generally, one
    of the amazing things about Twitter
  • 8:30 - 8:32
    is how you can use it for crowd wisdom,
  • 8:32 - 8:37
    you know, you get more knowledge,
    more questions, more points of view
  • 8:37 - 8:38
    than you can imagine,
  • 8:38 - 8:42
    and sometimes, many of them
    are really healthy.
  • 8:42 - 8:45
    WPR: I think one I saw
    already passed by quickly down here:
  • 8:45 - 8:48
    "What's Twitter's plan to combat
    foreign meddling in the 2020 US election?"
  • 8:48 - 8:51
    I think that's an issue
    we're seeing
  • 8:51 - 8:53
    on the internet in general,
  • 8:53 - 8:56
    that we have a lot of malicious
    automated activity happening.
  • 8:56 - 9:02
    And on Twitter, for example,
    we have some work
  • 9:02 - 9:05
    that's come from our friends
    at Zignal Labs,
  • 9:05 - 9:07
    and maybe we can even see that
    to give us an example
  • 9:07 - 9:09
    of what exactly I'm talking about,
  • 9:09 - 9:12
    where you have these bots, if you will,
  • 9:13 - 9:17
    or coordinated automated
    malicious account activity,
  • 9:17 - 9:20
    that is being used to influence
    things like elections.
  • 9:20 - 9:24
    And in this example we have
    from Zignal, which they've shared with us
  • 9:24 - 9:26
    using the data that
    they have from Twitter,
  • 9:26 - 9:28
    you actually see that in this case,
  • 9:28 - 9:33
    white represents the humans --
    human accounts, each dot is an account.
  • 9:33 - 9:34
    The pinker it is,
  • 9:34 - 9:36
    the more automated the activity is.
  • 9:36 - 9:42
    And you can see how you have
    a few humans interacting with bots.
  • 9:42 - 9:46
    In this case, it's related
    to the election in Israel
  • 9:46 - 9:49
    and spreading misinformation
    about Benny Gantz,
  • 9:49 - 9:52
    and as we know, in the end,
    that was an election
  • 9:52 - 9:56
    that Netanyahu won by a slim margin,
  • 9:56 - 9:59
    and that may have been
    in some cases influenced by this.
  • 9:59 - 10:01
    And when you think about
    that happening on Twitter,
  • 10:01 - 10:04
    what are the things
    that you're doing, specifically,
  • 10:04 - 10:07
    to ensure you don't have misinformation
    like this spreading in this way,
  • 10:07 - 10:12
    influencing people in ways
    that could affect democracy?
  • 10:12 - 10:13
    JD: Just to back up a bit,
  • 10:13 - 10:16
    we asked ourselves a question:
  • 10:16 - 10:20
    Can we actually measure
    the health of a conversation,
  • 10:20 - 10:22
    and what does that mean?
  • 10:22 - 10:25
    And in the same way
    that we have indicators as humans
  • 10:25 - 10:28
    in terms of whether we are healthy or not,
  • 10:28 - 10:33
    such as temperature,
    the flush of your face,
  • 10:33 - 10:38
    we believe that we could find
    the indicators of conversational health.
  • 10:38 - 10:42
    And we worked with a lab
    called Cortico at MIT
  • 10:42 - 10:49
    to propose four starter indicators
  • 10:49 - 10:52
    that we believe we could ultimately
    measure on the system.
  • 10:53 - 10:59
    And the first one is
    what we're calling shared attention.
  • 10:59 - 11:02
    It's a measure of how much
    of the conversation is attentive
  • 11:02 - 11:05
    on the same topic versus disparate.
  • 11:06 - 11:09
    The second one is called shared reality,
  • 11:09 - 11:11
    and this is what percentage
    of the conversation
  • 11:12 - 11:14
    shares the same facts --
  • 11:14 - 11:17
    not whether those facts
    are truthful or not,
  • 11:17 - 11:20
    but are we sharing
    the same facts as we converse?
  • 11:20 - 11:23
    The third is receptivity:
  • 11:23 - 11:27
    How much of the conversation
    is receptive or civil
  • 11:27 - 11:30
    or the inverse, toxic?
  • 11:30 - 11:33
    And then the fourth
    is variety of perspective.
  • 11:33 - 11:37
    So, are we seeing filter bubbles
    or echo chambers,
  • 11:37 - 11:40
    or are we actually getting
    a variety of opinions
  • 11:40 - 11:41
    within the conversation?
  • 11:41 - 11:45
    And implicit in all four of these
    is the understanding that,
  • 11:45 - 11:49
    as they increase, the conversation
    gets healthier and healthier.
  • 11:49 - 11:54
    So our first step is to see
    if we can measure these online,
  • 11:54 - 11:55
    which we believe we can.
  • 11:55 - 11:58
    We have the most momentum
    around receptivity.
  • 11:58 - 12:03
    We have a toxicity score,
    a toxicity model, on our system
  • 12:03 - 12:07
    that can actually measure
    whether you are likely to walk away
  • 12:07 - 12:09
    from a conversation
    that you're having on Twitter
  • 12:09 - 12:11
    because you feel it's toxic,
  • 12:11 - 12:13
    with a pretty high degree of confidence.
  • 12:14 - 12:17
    We're working to measure the rest,
  • 12:17 - 12:19
    and the next step is,
  • 12:19 - 12:22
    as we build up solutions,
  • 12:22 - 12:25
    to watch how these measurements
    trend over time
  • 12:25 - 12:27
    and continue to experiment.
  • 12:27 - 12:31
    And our goal is to make sure
    that these are balanced,
  • 12:31 - 12:35
    because if you increase one,
    you might decrease another.
  • 12:35 - 12:37
    If you increase variety of perspective,
  • 12:37 - 12:40
    you might actually decrease
    shared reality.
  • 12:40 - 12:45
    CA: Just picking up on some
    of the questions flooding in here.
  • 12:45 - 12:46
    JD: Constant questioning.
  • 12:47 - 12:51
    CA: A lot of people are puzzled why,
  • 12:51 - 12:55
    like, how hard is it to get rid
    of Nazis from Twitter?
  • 12:56 - 12:58
    JD: (Laughs)
  • 12:58 - 13:05
    So we have policies
    around violent extremist groups,
  • 13:05 - 13:09
    and the majority of our work
    and our terms of service
  • 13:09 - 13:13
    works on conduct, not content.
  • 13:13 - 13:15
    So we're actually looking for conduct.
  • 13:15 - 13:18
    Conduct being: using the service
  • 13:18 - 13:22
    to repeatedly or episodically
    harass someone,
  • 13:22 - 13:25
    using hateful imagery
  • 13:25 - 13:27
    that might be associated with the KKK
  • 13:27 - 13:30
    or the American Nazi Party.
  • 13:30 - 13:34
    Those are all things
    that we act on immediately.
  • 13:35 - 13:40
    We're in a situation right now
    where that term is used fairly loosely,
  • 13:40 - 13:46
    and we just cannot take
    any one mention of that word
  • 13:46 - 13:48
    accusing someone else
  • 13:48 - 13:52
    as a factual indication that they
    should be removed from the platform.
  • 13:52 - 13:54
    So a lot of our models
    are based around, number one:
  • 13:54 - 13:58
    Is this account associated
    with a violent extremist group?
  • 13:58 - 14:00
    And if so, we can take action.
  • 14:00 - 14:03
    And we have done so on the KKK
    and the American Nazi Party and others.
  • 14:03 - 14:08
    And number two: Are they using
    imagery or conduct
  • 14:08 - 14:10
    that would associate them with such a group as well?
  • 14:10 - 14:13
    CA: How many people do you have
    working on content moderation
  • 14:13 - 14:15
    to look at this?
  • 14:15 - 14:16
    JD: It varies.
  • 14:16 - 14:18
    We want to be flexible on this,
  • 14:18 - 14:20
    because we want to make sure
    that we're, number one,
  • 14:20 - 14:25
    building algorithms instead of just
    hiring massive amounts of people,
  • 14:25 - 14:28
    because we need to make sure
    that this is scalable,
  • 14:28 - 14:31
    and there's no number of people
    that can actually scale this.
  • 14:31 - 14:38
    So this is why we've done so much work
    around proactive detection of abuse
  • 14:38 - 14:39
    that humans can then review.
  • 14:39 - 14:42
    We want to have a situation
  • 14:42 - 14:46
    where algorithms are constantly
    scouring every single tweet
  • 14:46 - 14:48
    and bringing the most
    interesting ones to the top
  • 14:48 - 14:52
    so that humans can bring their judgment
    to whether we should take action or not,
  • 14:52 - 14:54
    based on our terms of service.
  • 14:54 - 14:57
    WPR: So there's no number of people
    that can scale this,
  • 14:57 - 15:00
    but how many people do you currently have
    monitoring these accounts,
  • 15:00 - 15:03
    and how do you figure out what's enough?
  • 15:03 - 15:05
    JD: They're completely flexible.
  • 15:05 - 15:08
    Sometimes we associate folks with spam.
  • 15:08 - 15:12
    Sometimes we associate folks
    with abuse and harassment.
  • 15:12 - 15:15
    We're going to make sure that
    we have flexibility in our people
  • 15:15 - 15:17
    so that we can direct them
    at what is most needed.
  • 15:17 - 15:18
    Sometimes, the elections.
  • 15:19 - 15:23
    We've had a string of elections
    in Mexico, one coming up in India,
  • 15:23 - 15:28
    obviously, the election last year,
    the midterm election,
  • 15:28 - 15:30
    so we just want to be flexible
    with our resources.
  • 15:30 - 15:33
    So when people --
  • 15:33 - 15:39
    just as an example, if you go
    to our current terms of service
  • 15:39 - 15:41
    and you bring the page up,
  • 15:41 - 15:44
    and you're wondering about abuse
    and harassment that you just received
  • 15:44 - 15:48
    and whether it was against
    our terms of service so you can report it,
  • 15:48 - 15:51
    the first thing you see
    when you open that page
  • 15:51 - 15:54
    is around intellectual
    property protection.
  • 15:55 - 16:00
    You scroll down and you get to
    abuse, harassment
  • 16:00 - 16:02
    and everything else
    that you might be experiencing.
  • 16:02 - 16:05
    So I don't know how that happened
    over the company's history,
  • 16:05 - 16:10
    but we put that above
    the thing that people want
  • 16:12 - 16:15
    the most information on
    and want to actually act on.
  • 16:15 - 16:21
    And just our ordering shows the world
    what we believed was important.
  • 16:21 - 16:24
    So we're changing all that.
  • 16:24 - 16:25
    We're ordering it the right way,
  • 16:25 - 16:29
    but we're also simplifying the rules
    so that they're human-readable
  • 16:29 - 16:33
    so that people can actually
    understand themselves
  • 16:33 - 16:36
    when something is against our terms
    and when something is not.
  • 16:36 - 16:38
    And then we're making --
  • 16:38 - 16:44
    again, our big focus is on removing
    the burden of work from the victims.
  • 16:44 - 16:47
    So that means push more
    towards technology,
  • 16:47 - 16:49
    rather than humans doing the work --
  • 16:49 - 16:52
    that means the humans receiving the abuse
  • 16:52 - 16:55
    and also the humans
    having to review that work.
  • 16:55 - 16:56
    So we want to make sure
  • 16:56 - 16:59
    that we're not just encouraging more work
  • 16:59 - 17:02
    around something
    that's super, super negative,
  • 17:02 - 17:05
    and we want to have a good balance
    between the technology
  • 17:05 - 17:08
    and where humans can actually be creative,
  • 17:08 - 17:11
    which is the judgment of the rules,
  • 17:11 - 17:14
    and not just all the mechanical stuff
    of finding and reporting them.
  • 17:14 - 17:15
    So that's how we think about it.
  • 17:15 - 17:18
    CA: I'm curious to dig in more
    about what you said.
  • 17:18 - 17:21
    I mean, I love that you said
    you are looking for ways
  • 17:21 - 17:24
    to re-tweak the fundamental
    design of the system
  • 17:24 - 17:29
    to discourage some of the reactive
    behavior, and perhaps --
  • 17:29 - 17:32
    to use Tristan Harris-type language --
  • 17:32 - 17:36
    engage people's more reflective thinking.
  • 17:36 - 17:38
    How far advanced is that?
  • 17:38 - 17:42
    What would alternatives
    to that "like" button be?
  • 17:44 - 17:47
    JD: Well, first and foremost,
  • 17:47 - 17:53
    my personal goal with the service
    is rooted in the fundamental belief
  • 17:53 - 17:56
    that public conversation is critical.
  • 17:56 - 17:58
    There are existential problems
  • 17:58 - 18:02
    facing the entire world,
    not any one particular nation-state,
  • 18:02 - 18:05
    that a global public conversation benefits.
  • 18:05 - 18:08
    And that is one of the unique
    dynamics of Twitter,
  • 18:08 - 18:09
    that it is completely open,
  • 18:09 - 18:11
    it is completely public,
  • 18:11 - 18:12
    it is completely fluid,
  • 18:12 - 18:16
    and anyone can see any other conversation
    and participate in it.
  • 18:16 - 18:19
    So there are conversations
    like climate change.
  • 18:19 - 18:21
    There are conversations
    like the displacement of work
  • 18:21 - 18:23
    through artificial intelligence.
  • 18:23 - 18:26
    There are conversations
    like economic disparity.
  • 18:26 - 18:29
    No matter what any one nation-state does,
  • 18:29 - 18:32
    they will not be able
    to solve the problem alone.
  • 18:32 - 18:34
    It takes coordination around the world,
  • 18:34 - 18:37
    and that's where I think
    Twitter can play a part.
  • 18:37 - 18:43
    The second thing is that right now,
    when you go to Twitter,
  • 18:43 - 18:47
    you don't necessarily walk away
    feeling like you learned something.
  • 18:47 - 18:48
    Some people do.
  • 18:48 - 18:51
    Some people have
    a very, very rich network,
  • 18:51 - 18:54
    a very rich community
    that they learn from every single day.
  • 18:54 - 18:58
    But it takes a lot of work
    and a lot of time to build up to that.
  • 18:58 - 19:02
    So we want to get people
    to those topics and those interests
  • 19:02 - 19:03
    much, much faster
  • 19:03 - 19:06
    and make sure that
    they're finding something that,
  • 19:07 - 19:09
    no matter how much time
    they spend on Twitter --
  • 19:09 - 19:11
    and I don't want to maximize
    the time on Twitter,
  • 19:11 - 19:14
    I want to maximize
    what they actually take away from it
  • 19:14 - 19:16
    and what they learn from it, and --
  • 19:18 - 19:19
    CA: Well, do you, though?
  • 19:19 - 19:22
    Because that's the core question
    that a lot of people want to know.
  • 19:22 - 19:26
    Surely, Jack, you're constrained,
    to a huge extent,
  • 19:26 - 19:28
    by the fact that you're a public company,
  • 19:28 - 19:30
    you've got investors pressing on you,
  • 19:30 - 19:33
    the number one way you make your money
    is from advertising --
  • 19:33 - 19:36
    that depends on user engagement.
  • 19:36 - 19:41
    Are you willing to sacrifice
    user time, if need be,
  • 19:41 - 19:45
    to go for a more reflective conversation?
  • 19:45 - 19:48
    JD: Yeah; more relevance means
    less time on the service,
  • 19:48 - 19:50
    and that's perfectly fine,
  • 19:50 - 19:53
    because we want to make sure
    that, like, you're coming to Twitter,
  • 19:53 - 19:57
    and you see something immediately
    that you learn from and that you push.
  • 19:57 - 20:01
    We can still serve an ad against that.
  • 20:01 - 20:04
    That doesn't mean you need to spend
    any more time to see more.
  • 20:04 - 20:05
    The second thing we're looking at --
  • 20:05 - 20:08
    CA: But just -- on that goal,
    daily active usage,
  • 20:08 - 20:11
    if you're measuring that,
    that doesn't necessarily mean things
  • 20:11 - 20:13
    that people value every day.
  • 20:13 - 20:14
    It may well mean
  • 20:14 - 20:18
    things that people are drawn to
    like a moth to the flame, every day.
  • 20:18 - 20:21
    We are addicted, because we see
    something that pisses us off,
  • 20:21 - 20:24
    so we go in and add fuel to the fire,
  • 20:24 - 20:26
    and the daily active usage goes up,
  • 20:26 - 20:28
    and there's more ad revenue there,
  • 20:28 - 20:30
    but we all get angrier with each other.
  • 20:30 - 20:33
    How do you define ...
  • 20:33 - 20:37
    "Daily active usage" seems like a really
    dangerous term to be optimizing.
  • 20:37 - 20:42
    (Applause)
  • 20:42 - 20:43
    JD: Taken alone, it is,
  • 20:44 - 20:46
    but you didn't let me
    finish the other metric,
  • 20:46 - 20:50
    which is, we're watching for conversations
  • 20:50 - 20:52
    and conversation chains.
  • 20:52 - 20:57
    So we want to incentivize
    healthy contribution back to the network,
  • 20:57 - 21:01
    and what we believe that is
    is actually participating in conversation
  • 21:01 - 21:02
    that is healthy,
  • 21:02 - 21:07
    as defined by those four indicators
    I articulated earlier.
  • 21:07 - 21:10
    So you can't just optimize
    around one metric.
  • 21:10 - 21:13
    You have to balance and look constantly
  • 21:13 - 21:17
    at what is actually going to create
    a healthy contribution to the network
  • 21:17 - 21:19
    and a healthy experience for people.
  • 21:19 - 21:21
    Ultimately, we want to get to a metric
  • 21:21 - 21:25
    where people can tell us,
    "Hey, I learned something from Twitter,
  • 21:25 - 21:27
    and I'm walking away
    with something valuable."
  • 21:27 - 21:29
    That is our goal ultimately over time,
  • 21:29 - 21:31
    but that's going to take some time.
  • 21:31 - 21:36
    CA: You come over to many,
    I think to me, as this enigma.
  • 21:36 - 21:41
    This is possibly unfair,
    but I woke up the other night
  • 21:41 - 21:45
    with this picture of how I found I was
    thinking about you and the situation,
  • 21:45 - 21:52
    that we're on this great voyage with you
    on this ship called the "Twittanic" --
  • 21:52 - 21:53
    (Laughter)
  • 21:53 - 21:57
    and there are people on board in steerage
  • 21:57 - 22:00
    who are expressing discomfort,
  • 22:00 - 22:02
    and you, unlike many other captains,
  • 22:02 - 22:06
    are saying, "Well, tell me, talk to me,
    listen to me, I want to hear."
  • 22:06 - 22:09
    And they talk to you, and they say,
    "We're worried about the iceberg ahead."
  • 22:09 - 22:11
    And you go, "You know,
    that is a powerful point,
  • 22:11 - 22:14
    and our ship, frankly,
    hasn't been built properly
  • 22:14 - 22:16
    for steering as well as it might."
  • 22:16 - 22:17
    And we say, "Please do something."
  • 22:17 - 22:19
    And you go to the bridge,
  • 22:19 - 22:21
    and we're waiting,
  • 22:21 - 22:26
    and we look, and then you're showing
    this extraordinary calm,
  • 22:26 - 22:30
    but we're all standing outside,
    saying, "Jack, turn the fucking wheel!"
  • 22:30 - 22:31
    You know?
  • 22:31 - 22:32
    (Laughter)
  • 22:32 - 22:34
    (Applause)
  • 22:34 - 22:36
    I mean --
  • 22:36 - 22:37
    (Applause)
  • 22:37 - 22:42
    It's democracy at stake.
  • 22:42 - 22:45
    It's our culture at stake.
    It's our world at stake.
  • 22:45 - 22:50
    And Twitter is amazing and shapes so much.
  • 22:50 - 22:52
    It's not as big as some
    of the other platforms,
  • 22:52 - 22:55
    but the people of influence use it
    to set the agenda,
  • 22:55 - 23:01
    and it's just hard to imagine a more
    important role in the world than to ...
  • 23:02 - 23:05
    I mean, you're doing a brilliant job
    of listening, Jack, and hearing people,
  • 23:05 - 23:10
    but to actually dial up the urgency
    and move on this stuff --
  • 23:10 - 23:12
    will you do that?
  • 23:13 - 23:17
    JD: Yes, and we have been
    moving substantially.
  • 23:17 - 23:20
    I mean, there's been
    a few dynamics in Twitter's history.
  • 23:20 - 23:22
    One, when I came back to the company,
  • 23:23 - 23:30
    we were in a pretty dire state
    in terms of our future,
  • 23:30 - 23:34
    and not just from how people
    were using the platform,
  • 23:34 - 23:36
    but from a corporate narrative as well.
  • 23:36 - 23:40
    So we had to fix
    a bunch of the foundation,
  • 23:40 - 23:42
    turn the company around,
  • 23:42 - 23:45
    go through two crazy layoffs,
  • 23:45 - 23:49
    because we just got too big
    for what we were doing,
  • 23:49 - 23:51
    and we focused all of our energy
  • 23:51 - 23:54
    on this concept of serving
    the public conversation.
  • 23:54 - 23:56
    And that took some work.
  • 23:56 - 23:58
    And as we dived into that,
  • 23:58 - 24:01
    we realized some of the issues
    with the fundamentals.
  • 24:02 - 24:07
    We could do a bunch of superficial things
    to address what you're talking about,
  • 24:07 - 24:09
    but we need the changes to last,
  • 24:09 - 24:11
    and that means going really, really deep
  • 24:11 - 24:15
    and paying attention
    to what we started 13 years ago
  • 24:15 - 24:18
    and really questioning
  • 24:18 - 24:20
    how the system works
    and how the framework works
  • 24:20 - 24:24
    and what is needed for the world today,
  • 24:24 - 24:28
    given how quickly everything is moving
    and how people are using it.
  • 24:28 - 24:35
    So we are working as quickly as we can,
    but quickness will not get the job done.
  • 24:35 - 24:37
    It's focus, it's prioritization,
  • 24:37 - 24:40
    it's understanding
    the fundamentals of the network
  • 24:40 - 24:43
    and building a framework that scales
  • 24:43 - 24:46
    and that is resilient to change,
  • 24:46 - 24:51
    and being open about where we are
    and being transparent about where we are
  • 24:51 - 24:53
    so that we can continue to earn trust.
  • 24:54 - 24:57
    So I'm proud of all the frameworks
    that we've put in place.
  • 24:57 - 25:00
    I'm proud of our direction.
  • 25:01 - 25:04
    We obviously can move faster,
  • 25:04 - 25:08
    but that required just stopping a bunch
    of stupid stuff we were doing in the past.
  • 25:09 - 25:10
    CA: All right.
  • 25:10 - 25:14
    Well, I suspect there are many people here
    who, if given the chance,
  • 25:14 - 25:18
    would love to help you
    on this change-making agenda you're on,
  • 25:18 - 25:20
    and I don't know if Whitney --
  • 25:20 - 25:23
    Jack, thank you for coming here
    and speaking so openly.
  • 25:23 - 25:24
    It took courage.
  • 25:24 - 25:28
    I really appreciate what you said,
    and good luck with your mission.
  • 25:28 - 25:30
    JD: Thank you so much.
    Thanks for having me.
  • 25:30 - 25:33
    (Applause)
  • 25:33 - 25:34
    Thank you.
Title:
How Twitter needs to change
Speaker:
Jack Dorsey
Description:

Can Twitter be saved? In a wide-ranging conversation with TED's Chris Anderson and Whitney Pennington Rodgers, Twitter CEO Jack Dorsey discusses the future of the platform -- acknowledging problems with harassment and moderation and proposing some fundamental changes that he hopes will encourage healthy, respectful conversations. "Are we actually delivering something that people value every single day?" Dorsey asks.
