How Twitter needs to change
Chris Anderson: What worries you right now? You've been very open about lots of issues on Twitter. What would be your top worry about where things are right now?
Jack Dorsey: Right now, the health of the conversation. So, our purpose is to serve the public conversation, and we have seen a number of attacks on it. We've seen abuse, we've seen harassment, we've seen manipulation, automation, human coordination, misinformation. So these are all dynamics that we were not expecting 13 years ago when we were starting the company. But we do now see them at scale, and what worries me most is just our ability to address it in a systemic way that is scalable, that has a rigorous understanding of how we're taking action, a transparent understanding of how we're taking action and a rigorous appeals process for when we're wrong, because we will be wrong.
Whitney Pennington Rodgers: I'm really glad to hear that that's something that concerns you, because I think there's been a lot written about people who feel they've been abused and harassed on Twitter, and I think no one more so than women and women of color and black women. And there's been data that's come out -- Amnesty International put out a report a few months ago where they showed that a subset of active black female Twitter users were receiving, on average, one in 10 of their tweets were some form of harassment. And so when you think about health for the community on Twitter, I'm interested to hear, "health for everyone," but specifically: How are you looking to make Twitter a safe space for that subset, for women, for women of color and black women?
JD: Yeah. So it's a pretty terrible situation when you're coming to a service that, ideally, you want to learn something about the world, and you spend the majority of your time reporting abuse, receiving abuse, receiving harassment. So what we're looking most deeply at is just the incentives that the platform naturally provides and the service provides. Right now, the dynamic of the system makes it super-easy to harass and to abuse others through the service, and unfortunately, the majority of our system in the past worked entirely based on people reporting harassment and abuse. So about midway last year, we decided that we were going to apply a lot more machine learning, a lot more deep learning to the problem, and try to be a lot more proactive around where abuse is happening, so that we can take the burden off the victim completely. And we've made some progress recently. About 38 percent of abusive tweets are now proactively identified by machine learning algorithms so that people don't actually have to report them. But those that are identified are still reviewed by humans, so we do not take down content or accounts without a human actually reviewing it. But that was from zero percent just a year ago. So that meant, at that zero percent, every single person who received abuse had to actually report it, which was a lot of work for them, a lot of work for us and just ultimately unfair.

The other thing that we're doing is making sure that we, as a company, have representation of all the communities that we're trying to serve. We can't build a business that is successful unless we have a diversity of perspective inside of our walls that actually feel these issues every single day. And that's not just with the team that's doing the work, it's also within our leadership as well. So we need to continue to build empathy for what people are experiencing and give them better tools to act on it and also give our customers a much better and easier approach to handle some of the things that they're seeing.

So a lot of what we're doing is around technology, but we're also looking at the incentives on the service: What does Twitter incentivize you to do when you first open it up? And in the past, it's incented a lot of outrage, it's incented a lot of mob behavior, it's incented a lot of group harassment. And we have to look a lot deeper at some of the fundamentals of what the service is doing to make the bigger shifts. We can make a bunch of small shifts around technology, as I just described, but ultimately, we have to look deeply at the dynamics in the network itself, and that's what we're doing.
CA: But what's your sense -- what is the kind of thing that you might be able to change that would actually fundamentally shift behavior?
JD: Well, one of the things -- we started the service with this concept of following an account, as an example, and I don't believe that's why people actually come to Twitter. I believe Twitter is best as an interest-based network. People come with a particular interest. They have to do a ton of work to find and follow the related accounts around those interests. What we could do instead is allow you to follow an interest, follow a hashtag, follow a trend, follow a community, which gives us the opportunity to show all of the accounts, all the topics, all the moments, all the hashtags that are associated with that particular topic and interest, which really opens up the perspective that you see. But that is a huge fundamental shift to bias the entire network away from just an account bias towards a topics and interest bias.
CA: Because isn't it the case that one reason why you have so much content on there is a result of putting millions of people around the world in this kind of gladiatorial contest with each other for followers, for attention? Like, from the point of view of people who just read Twitter, that's not an issue, but for the people who actually create it, everyone's out there saying, "You know, I wish I had a few more 'likes,' followers, retweets." And so they're constantly experimenting, trying to find the path to do that. And what we've all discovered is that the number one path to do that is to be some form of provocative, obnoxious, eloquently obnoxious -- like, eloquent insults are a dream on Twitter, where you rapidly pile up -- and it becomes this self-fueling process of driving outrage. How do you defuse that?
JD: Yeah, I mean, I think you're spot on, but that goes back to the incentives. Like, one of the choices we made in the early days was we had this number that showed how many people follow you. We decided that number should be big and bold, and anything that's on the page that's big and bold has importance, and those are the things that you want to drive. Was that the right decision at the time? Probably not. If I had to start the service again, I would not emphasize the follower count as much. I would not emphasize the "like" count as much. I don't think I would even create "like" in the first place, because it doesn't actually push what we believe now to be the most important thing, which is healthy contribution back to the network and conversation to the network, participation within conversation, learning something from the conversation. Those are not things that we thought of 13 years ago, and we believe are extremely important right now. So we have to look at how we display the follower count, how we display retweet count, how we display "likes," and just ask the deep question: Is this really the number that we want people to drive up? Is this the thing that, when you open Twitter, you see, "That's the thing I need to increase?" And I don't believe that's the case right now.

(Applause)
WPR: I think we should look at some of the tweets that are coming in from the audience as well.
CA: Let's see what you guys are asking. I mean, this is -- generally, one of the amazing things about Twitter is how you can use it for crowd wisdom, you know, that more knowledge, more questions, more points of view than you can imagine, and sometimes, many of them are really healthy.
WPR: I think one I saw that passed already quickly down here, "What's Twitter's plan to combat foreign meddling in the 2020 US election?" I think that's something that's an issue we're seeing on the internet in general, that we have a lot of malicious automated activity happening. And on Twitter, for example, in fact, we have some work that's come from our friends at Zignal Labs, and maybe we can even see that to give us an example of what exactly I'm talking about, where you have these bots, if you will, or coordinated automated malicious account activity, that is being used to influence things like elections. And in this example we have from Zignal, which they've shared with us using the data that they have from Twitter, you actually see that in this case, white represents the humans -- human accounts, each dot is an account. The pinker it is, the more automated the activity is. And you can see how you have a few humans interacting with bots. In this case, it's related to the election in Israel and spreading misinformation about Benny Gantz, and as we know, in the end, that was an election that Netanyahu won by a slim margin, and that may have been in some case influenced by this. And when you think about that happening on Twitter, what are the things that you're doing, specifically, to ensure you don't have misinformation like this spreading in this way, influencing people in ways that could affect democracy?
JD: Just to back up a bit, we asked ourselves a question: Can we actually measure the health of a conversation, and what does that mean? And in the same way that you have indicators and we have indicators as humans in terms of are we healthy or not, such as temperature, the flushness of your face, we believe that we could find the indicators of conversational health. And we worked with a lab called Cortico at MIT to propose four starter indicators that we believe we could ultimately measure on the system.

And the first one is what we're calling shared attention. It's a measure of how much of the conversation is attentive on the same topic versus disparate. The second one is called shared reality, and this is what percentage of the conversation shares the same facts -- not whether those facts are truthful or not, but are we sharing the same facts as we converse? The third is receptivity: How much of the conversation is receptive or civil, or the inverse, toxic? And then the fourth is variety of perspective. So, are we seeing filter bubbles or echo chambers, or are we actually getting a variety of opinions within the conversation? And implicit in all four of these is the understanding that, as they increase, the conversation gets healthier and healthier.

So our first step is to see if we can measure these online, which we believe we can. We have the most momentum around receptivity. We have a toxicity score, a toxicity model, on our system that can actually measure whether you are likely to walk away from a conversation that you're having on Twitter because you feel it's toxic, with some pretty high degree. We're working to measure the rest, and the next step is, as we build up solutions, to watch how these measurements trend over time and continue to experiment. And our goal is to make sure that these are balanced, because if you increase one, you might decrease another. If you increase variety of perspective, you might actually decrease shared reality.
CA: Just picking up on some of the questions flooding in here.

JD: Constant questioning.

CA: A lot of people are puzzled why, like, how hard is it to get rid of Nazis from Twitter?
JD: (Laughs) So we have policies around violent extremist groups, and the majority of our work and our terms of service works on conduct, not content. So we're actually looking for conduct. Conduct being using the service to repeatedly or episodically harass someone, using hateful imagery that might be associated with the KKK or the American Nazi Party. Those are all things that we act on immediately. We're in a situation right now where that term is used fairly loosely, and we just cannot take any one mention of that word accusing someone else as a factual indication that they should be removed from the platform. So a lot of our models are based around, number one: Is this account associated with a violent extremist group? And if so, we can take action. And we have done so on the KKK and the American Nazi Party and others. And number two: Are they using imagery or conduct that would associate them as such as well?
CA: How many people do you have working on content moderation to look at this?
JD: It varies. We want to be flexible on this, because we want to make sure that we're, number one, building algorithms instead of just hiring massive amounts of people, because we need to make sure that this is scalable, and there is no amount of people that can actually scale this. So this is why we've done so much work around proactive detection of abuse that humans can then review. We want to have a situation where algorithms are constantly scouring every single tweet and bringing the most interesting ones to the top so that humans can bring their judgment to whether we should take action or not, based on our terms of service.
WPR: But there's not an amount of people that are scalable, but how many people do you currently have monitoring these accounts, and how do you figure out what's enough?
JD: They're completely flexible. Sometimes we associate folks with spam. Sometimes we associate folks with abuse and harassment. We're going to make sure that we have flexibility in our people so that we can direct them at what is most needed. Sometimes, the elections. We've had a string of elections in Mexico, one coming up in India, obviously, the election last year, the midterm election, so we just want to be flexible with our resources.

So when people -- just as an example, if you go to our current terms of service and you bring the page up, and you're wondering about abuse and harassment that you just received and whether it was against our terms of service to report it, the first thing you see when you open that page is around intellectual property protection. You scroll down and you get to abuse, harassment and everything else that you might be experiencing. So I don't know how that happened over the company's history, but we put that above the thing that people want the most information on and to actually act on. And just our ordering shows the world what we believed was important. So we're changing all that. We're ordering it the right way, but we're also simplifying the rules so that they're human-readable, so that people can actually understand themselves when something is against our terms and when something is not.

And then we're making -- again, our big focus is on removing the burden of work from the victims. So that means push more towards technology, rather than humans doing the work -- that means the humans receiving the abuse and also the humans having to review that work. So we want to make sure that we're not just encouraging more work around something that's super, super negative, and we want to have a good balance between the technology and where humans can actually be creative, which is the judgment of the rules, and not just all the mechanical stuff of finding and reporting them. So that's how we think about it.
CA: I'm curious to dig in more about what you said. I mean, I love that you said you are looking for ways to re-tweak the fundamental design of the system to discourage some of the reactive behavior, and perhaps -- to use Tristan Harris-type language -- engage people's more reflective thinking. How far advanced is that? What would alternatives to that "like" button be?
JD: Well, first and foremost, my personal goal with the service is that I believe fundamentally that public conversation is critical. There are existential problems facing the world that are facing the entire world, not any one particular nation-state, that global public conversation benefits. And that is one of the unique dynamics of Twitter, that it is completely open, it is completely public, it is completely fluid, and anyone can see any other conversation and participate in it. So there are conversations like climate change. There are conversations like the displacement in the work through artificial intelligence. There are conversations like economic disparity. No matter what any one nation-state does, they will not be able to solve the problem alone. It takes coordination around the world, and that's where I think Twitter can play a part.

The second thing is that Twitter, right now, when you go to it, you don't necessarily walk away feeling like you learned something. Some people do. Some people have a very, very rich network, a very rich community that they learn from every single day. But it takes a lot of work and a lot of time to build up to that. So we want to get people to those topics and those interests much, much faster and make sure that they're finding something that, no matter how much time they spend on Twitter -- and I don't want to maximize the time on Twitter, I want to maximize what they actually take away from it and what they learn from it, and --
CA: Well, do you, though? Because that's the core question that a lot of people want to know. Surely, Jack, you're constrained, to a huge extent, by the fact that you're a public company, you've got investors pressing on you, the number one way you make your money is from advertising -- that depends on user engagement. Are you willing to sacrifice user time, if need be, to go for a more reflective conversation?
JD: Yeah; more relevance means less time on the service, and that's perfectly fine, because we want to make sure that, like, you're coming to Twitter, and you see something immediately that you learn from and that you push. We can still serve an ad against that. That doesn't mean you need to spend any more time to see more. The second thing we're looking at --
CA: But just -- on that goal, daily active usage, if you're measuring that, that doesn't necessarily mean things that people value every day. It may well mean things that people are drawn to like a moth to the flame, every day. We are addicted, because we see something that pisses us off, so we go in and add fuel to the fire, and the daily active usage goes up, and there's more ad revenue there, but we all get angrier with each other. How do you define ... "Daily active usage" seems like a really dangerous term to be optimizing.

(Applause)
JD: Taken alone, it is, but you didn't let me finish the other metric, which is, we're watching for conversations and conversation chains. So we want to incentivize healthy contribution back to the network, and what we believe that is is actually participating in conversation that is healthy, as defined by those four indicators I articulated earlier. So you can't just optimize around one metric. You have to balance and look constantly at what is actually going to create a healthy contribution to the network and a healthy experience for people. Ultimately, we want to get to a metric where people can tell us, "Hey, I learned something from Twitter, and I'm walking away with something valuable." That is our goal ultimately over time, but that's going to take some time.
CA: You come over to many, I think to me, as this enigma. This is possibly unfair, but I woke up the other night with this picture of how I found I was thinking about you and the situation, that we're on this great voyage with you on this ship called the "Twittanic" --

(Laughter)

and there are people on board in steerage who are expressing discomfort, and you, unlike many other captains, are saying, "Well, tell me, talk to me, listen to me, I want to hear." And they talk to you, and they say, "We're worried about the iceberg ahead." And you go, "You know, that is a powerful point, and our ship, frankly, hasn't been built properly for steering as well as it might." And we say, "Please do something." And you go to the bridge, and we're waiting, and we look, and then you're showing this extraordinary calm, but we're all standing outside, saying, "Jack, turn the fucking wheel!" You know?

(Laughter)

(Applause)

I mean --

(Applause)

It's democracy at stake. It's our culture at stake. It's our world at stake. And Twitter is amazing and shapes so much. It's not as big as some of the other platforms, but the people of influence use it to set the agenda, and it's just hard to imagine a more important role in the world than to ... I mean, you're doing a brilliant job of listening, Jack, and hearing people, but to actually dial up the urgency and move on this stuff -- will you do that?
JD: Yes, and we have been moving substantially. I mean, there's been a few dynamics in Twitter's history. One, when I came back to the company, we were in a pretty dire state in terms of our future, and not just from how people were using the platform, but from a corporate narrative as well. So we had to fix a bunch of the foundation, turn the company around, go through two crazy layoffs, because we just got too big for what we were doing, and we focused all of our energy on this concept of serving the public conversation. And that took some work.

And as we dived into that, we realized some of the issues with the fundamentals. We could do a bunch of superficial things to address what you're talking about, but we need the changes to last, and that means going really, really deep and paying attention to what we started 13 years ago and really questioning how the system works and how the framework works and what is needed for the world today, given how quickly everything is moving and how people are using it. So we are working as quickly as we can, but quickness will not get the job done. It's focus, it's prioritization, it's understanding the fundamentals of the network and building a framework that scales and that is resilient to change, and being open about where we are and being transparent about where we are so that we can continue to earn trust.

So I'm proud of all the frameworks that we've put in place. I'm proud of our direction. We obviously can move faster, but that required just stopping a bunch of stupid stuff we were doing in the past.
CA: All right. Well, I suspect there are many people here who, if given the chance, would love to help you on this change-making agenda you're on, and I don't know if Whitney -- Jack, thank you for coming here and speaking so openly. It took courage. I really appreciate what you said, and good luck with your mission.
JD: Thank you so much. Thanks for having me.

(Applause)

Thank you.
Title: How Twitter needs to change
Speaker: Jack Dorsey
Description: Can Twitter be saved? In a wide-ranging conversation with TED's Chris Anderson and Whitney Pennington Rodgers, Twitter CEO Jack Dorsey discusses the future of the platform -- acknowledging problems with harassment and moderation and proposing some fundamental changes that he hopes will encourage healthy, respectful conversations. "Are we actually delivering something that people value every single day?" Dorsey asks.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 25:47