[Script Info]
Title:
[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:09.65,0:00:11.19,Default,,0000,0000,0000,,Good afternoon.
Dialogue: 0,0:00:11.19,0:00:16.00,Default,,0000,0000,0000,,Most philosophers do not live\Nin big ceramic barrels
Dialogue: 0,0:00:16.00,0:00:17.90,Default,,0000,0000,0000,,in their local supermarket.
Dialogue: 0,0:00:18.27,0:00:19.86,Default,,0000,0000,0000,,But there was one,
Dialogue: 0,0:00:20.23,0:00:22.37,Default,,0000,0000,0000,,just down the road from here, actually,
Dialogue: 0,0:00:22.37,0:00:24.40,Default,,0000,0000,0000,,not so very long ago.
Dialogue: 0,0:00:24.74,0:00:27.82,Default,,0000,0000,0000,,His name was Diogenes of Sinope,
Dialogue: 0,0:00:27.82,0:00:33.65,Default,,0000,0000,0000,,and he was probably the closest thing\Nphilosophy has ever produced to a troll.
Dialogue: 0,0:00:33.65,0:00:38.31,Default,,0000,0000,0000,,He was rude, outrageous,\Nimpulsive, offensive,
Dialogue: 0,0:00:39.13,0:00:42.99,Default,,0000,0000,0000,,but he was deeply admired\Nby Alexander the Great,
Dialogue: 0,0:00:42.99,0:00:47.36,Default,,0000,0000,0000,,who was, arguably, the most powerful\Nperson in the world at the time.
Dialogue: 0,0:00:48.06,0:00:51.32,Default,,0000,0000,0000,,It's said that one day,\NAlexander went to visit Diogenes
Dialogue: 0,0:00:51.32,0:00:53.37,Default,,0000,0000,0000,,in his big barrel in the marketplace
Dialogue: 0,0:00:53.37,0:00:55.66,Default,,0000,0000,0000,,and went up to him and said,
Dialogue: 0,0:00:55.66,0:00:58.73,Default,,0000,0000,0000,,"Diogenes, I will grant\Nany one wish that you have;
Dialogue: 0,0:00:58.73,0:01:00.48,Default,,0000,0000,0000,,just tell me what you want."
Dialogue: 0,0:01:00.74,0:01:04.19,Default,,0000,0000,0000,,And Diogenes was lying\Nin the sun at the time,
Dialogue: 0,0:01:04.49,0:01:08.48,Default,,0000,0000,0000,,and true to form, he looked up\Nat Alexander and replied,
Dialogue: 0,0:01:08.48,0:01:10.34,Default,,0000,0000,0000,,"Stand out of my light."
Dialogue: 0,0:01:11.76,0:01:15.66,Default,,0000,0000,0000,,And I love this story\Nbecause it has lessons for us
Dialogue: 0,0:01:15.66,0:01:19.17,Default,,0000,0000,0000,,about how we should be responding\Nto the Alexanders of our time,
Dialogue: 0,0:01:19.49,0:01:22.84,Default,,0000,0000,0000,,our digital technologies\Nand the people who create them.
Dialogue: 0,0:01:23.08,0:01:24.59,Default,,0000,0000,0000,,Because like Alexander,
Dialogue: 0,0:01:24.59,0:01:25.90,Default,,0000,0000,0000,,they've come into our lives
Dialogue: 0,0:01:25.90,0:01:29.87,Default,,0000,0000,0000,,and offered to fulfill all sorts\Nof needs and wishes that we have,
Dialogue: 0,0:01:29.87,0:01:32.70,Default,,0000,0000,0000,,and in many ways,\Nthey've done so extremely well.
Dialogue: 0,0:01:33.31,0:01:38.49,Default,,0000,0000,0000,,But we're beginning to realize now\Nthat in doing so, they, like Alexander,
Dialogue: 0,0:01:38.49,0:01:41.90,Default,,0000,0000,0000,,have also been standing\Nin our light, in a sense,
Dialogue: 0,0:01:42.28,0:01:44.74,Default,,0000,0000,0000,,and in one light in particular
Dialogue: 0,0:01:44.74,0:01:49.01,Default,,0000,0000,0000,,that is so precious and so essential\Nfor human flourishing,
Dialogue: 0,0:01:49.01,0:01:50.83,Default,,0000,0000,0000,,that without this light,
Dialogue: 0,0:01:50.83,0:01:54.66,Default,,0000,0000,0000,,the other benefits of technology\Nmight not do us very much good.
Dialogue: 0,0:01:55.88,0:01:58.53,Default,,0000,0000,0000,,The light that I mean\Nis the light of our attention.
Dialogue: 0,0:01:59.73,0:02:03.58,Default,,0000,0000,0000,,There's something profound\Nand potentially irreversible
Dialogue: 0,0:02:03.58,0:02:07.07,Default,,0000,0000,0000,,happening to human attention\Nin the digital age.
Dialogue: 0,0:02:07.51,0:02:10.11,Default,,0000,0000,0000,,It's more than just distraction.
Dialogue: 0,0:02:10.11,0:02:13.75,Default,,0000,0000,0000,,It's more than just addiction\Nor manipulation.
Dialogue: 0,0:02:14.63,0:02:18.31,Default,,0000,0000,0000,,In fact, I think that the way\Nwe respond to this challenge
Dialogue: 0,0:02:18.31,0:02:22.30,Default,,0000,0000,0000,,could be the defining moral\Nand political issue of our time.
Dialogue: 0,0:02:22.99,0:02:27.89,Default,,0000,0000,0000,,I'd like to tell you why I think so\Nand what I think we can do about it.
Dialogue: 0,0:02:29.83,0:02:32.61,Default,,0000,0000,0000,,In the 1970s, Herbert Simon pointed out
Dialogue: 0,0:02:32.61,0:02:36.21,Default,,0000,0000,0000,,that in an environment\Nof information abundance,
Dialogue: 0,0:02:36.21,0:02:38.99,Default,,0000,0000,0000,,attention becomes the scarce resource.
Dialogue: 0,0:02:39.42,0:02:42.56,Default,,0000,0000,0000,,There's a kind of figure-ground\Ninversion that takes place,
Dialogue: 0,0:02:42.56,0:02:46.38,Default,,0000,0000,0000,,and this inversion has happened\Nso quickly and so recently
Dialogue: 0,0:02:46.38,0:02:48.88,Default,,0000,0000,0000,,that we're still just beginning\Nto come to terms
Dialogue: 0,0:02:48.88,0:02:51.10,Default,,0000,0000,0000,,with what it means for human life.
Dialogue: 0,0:02:51.65,0:02:54.29,Default,,0000,0000,0000,,But because attention\Nis the scarce resource,
Dialogue: 0,0:02:54.29,0:02:56.12,Default,,0000,0000,0000,,it is now the object of competition
Dialogue: 0,0:02:56.12,0:02:59.35,Default,,0000,0000,0000,,among most of the technologies\Nwe use every day.
Dialogue: 0,0:02:59.35,0:03:02.81,Default,,0000,0000,0000,,The total environment\Nof competition for our attention
Dialogue: 0,0:03:02.81,0:03:05.28,Default,,0000,0000,0000,,is often called "the attention economy,"
Dialogue: 0,0:03:05.28,0:03:09.07,Default,,0000,0000,0000,,and in the attention economy,\Nthere are no truly free products.
Dialogue: 0,0:03:09.07,0:03:11.32,Default,,0000,0000,0000,,You pay with your attentional labor
Dialogue: 0,0:03:11.32,0:03:15.30,Default,,0000,0000,0000,,every time you look\Nor tap or scroll or click,
Dialogue: 0,0:03:15.98,0:03:19.51,Default,,0000,0000,0000,,and this is exactly what\Nthey're designed to try to get you to do.
Dialogue: 0,0:03:20.48,0:03:25.41,Default,,0000,0000,0000,,And they use that attentional labor\Nto advance their goals, not yours.
Dialogue: 0,0:03:25.81,0:03:29.54,Default,,0000,0000,0000,,Because there is a difference\Nbetween their goals and yours.
Dialogue: 0,0:03:30.28,0:03:32.94,Default,,0000,0000,0000,,If you think about the goals\Nthat you have for yourself
Dialogue: 0,0:03:33.40,0:03:36.37,Default,,0000,0000,0000,,today, this year and even beyond,
Dialogue: 0,0:03:36.37,0:03:40.17,Default,,0000,0000,0000,,they're probably things like\N"I want to spend more time with family"
Dialogue: 0,0:03:40.17,0:03:43.55,Default,,0000,0000,0000,,or "I want to learn how to play the piano"
Dialogue: 0,0:03:43.55,0:03:47.16,Default,,0000,0000,0000,,or "I want to take that trip\NI've been thinking about for a while."
Dialogue: 0,0:03:47.16,0:03:49.14,Default,,0000,0000,0000,,You know, these are real human goals,
Dialogue: 0,0:03:49.14,0:03:51.17,Default,,0000,0000,0000,,the stuff that, when\Nwe're on our deathbed,
Dialogue: 0,0:03:51.17,0:03:53.74,Default,,0000,0000,0000,,if we don't do it, we'll probably regret it.
Dialogue: 0,0:03:54.31,0:03:57.35,Default,,0000,0000,0000,,But if you look at what the technologies\Nof the attention economy
Dialogue: 0,0:03:57.35,0:04:00.18,Default,,0000,0000,0000,,are designed to promote in our lives,
Dialogue: 0,0:04:00.18,0:04:01.83,Default,,0000,0000,0000,,you don't see these goals.
Dialogue: 0,0:04:02.22,0:04:04.73,Default,,0000,0000,0000,,What you see are things like
Dialogue: 0,0:04:04.73,0:04:07.22,Default,,0000,0000,0000,,"Maximize the amount of time\NI spend using it"
Dialogue: 0,0:04:07.22,0:04:10.19,Default,,0000,0000,0000,,or "the number of clicks that I make"
Dialogue: 0,0:04:10.19,0:04:13.73,Default,,0000,0000,0000,,or "the number of pages\Nor ads that I view."
Dialogue: 0,0:04:14.14,0:04:17.57,Default,,0000,0000,0000,,Now, I don't know anybody\Nwho has these goals for themselves.
Dialogue: 0,0:04:17.57,0:04:19.31,Default,,0000,0000,0000,,Does anybody wake up in the morning
Dialogue: 0,0:04:19.31,0:04:23.29,Default,,0000,0000,0000,,and think, "How much time\Ncan I possibly spend on Facebook today?"
Dialogue: 0,0:04:23.56,0:04:24.59,Default,,0000,0000,0000,,I certainly don't.
Dialogue: 0,0:04:24.59,0:04:27.56,Default,,0000,0000,0000,,If there's someone out there like that,\NI'd love to meet them.
Dialogue: 0,0:04:27.84,0:04:29.01,Default,,0000,0000,0000,,But what this means
Dialogue: 0,0:04:29.01,0:04:32.42,Default,,0000,0000,0000,,is that there's a deep gap\Nbetween our goals and theirs
Dialogue: 0,0:04:32.42,0:04:35.98,Default,,0000,0000,0000,,and that the technologies of the attention\Neconomy are not on our side;
Dialogue: 0,0:04:36.53,0:04:38.70,Default,,0000,0000,0000,,their goals are not our goals.
Dialogue: 0,0:04:39.14,0:04:42.01,Default,,0000,0000,0000,,These are distractions,\Npetty distractions,
Dialogue: 0,0:04:42.01,0:04:43.84,Default,,0000,0000,0000,,from the goals of life.
Dialogue: 0,0:04:43.84,0:04:46.46,Default,,0000,0000,0000,,And this seems to me\Nto be a really big deal,
Dialogue: 0,0:04:46.96,0:04:49.56,Default,,0000,0000,0000,,even more so because the creators\Nof these technologies
Dialogue: 0,0:04:49.56,0:04:51.31,Default,,0000,0000,0000,,know that this is the case.
Dialogue: 0,0:04:51.31,0:04:54.95,Default,,0000,0000,0000,,Steve Jobs did not let\Nhis children use the iPad.
Dialogue: 0,0:04:55.39,0:04:57.30,Default,,0000,0000,0000,,The CEO of Netflix, a little while back,
Dialogue: 0,0:04:57.30,0:05:00.09,Default,,0000,0000,0000,,said that in addition\Nto Snapchat and YouTube,
Dialogue: 0,0:05:00.09,0:05:03.41,Default,,0000,0000,0000,,one of their biggest\Ncompetitors was sleep.
Dialogue: 0,0:05:04.10,0:05:06.67,Default,,0000,0000,0000,,This seems to me\Na crisis of design ethics,
Dialogue: 0,0:05:06.67,0:05:08.89,Default,,0000,0000,0000,,a crisis of self-regulation
Dialogue: 0,0:05:08.89,0:05:13.04,Default,,0000,0000,0000,,that design is actually amplifying\Nand making even worse.
Dialogue: 0,0:05:13.40,0:05:15.70,Default,,0000,0000,0000,,In the last couple of decades,
Dialogue: 0,0:05:15.70,0:05:17.75,Default,,0000,0000,0000,,psychology and behavioral\Neconomics research
Dialogue: 0,0:05:17.75,0:05:22.82,Default,,0000,0000,0000,,has cataloged an enormous number\Nof vulnerabilities in our brains,
Dialogue: 0,0:05:22.82,0:05:27.39,Default,,0000,0000,0000,,little buttons that can be pushed\Nto get us to think or do certain things.
Dialogue: 0,0:05:28.09,0:05:29.90,Default,,0000,0000,0000,,In parallel with this,
Dialogue: 0,0:05:30.33,0:05:34.56,Default,,0000,0000,0000,,the advertising industry\Nhas effectively colonized the internet
Dialogue: 0,0:05:34.56,0:05:38.95,Default,,0000,0000,0000,,and turned it into a large-scale system\Nof industrial persuasion -
Dialogue: 0,0:05:38.95,0:05:42.77,Default,,0000,0000,0000,,of measurement, of optimization,\Nof message delivery.
Dialogue: 0,0:05:43.67,0:05:47.42,Default,,0000,0000,0000,,What's more, this power,\Nthis persuasive power,
Dialogue: 0,0:05:47.42,0:05:50.61,Default,,0000,0000,0000,,is more centralized\Nthan at any time in human history.
Dialogue: 0,0:05:51.01,0:05:52.27,Default,,0000,0000,0000,,Never before in history
Dialogue: 0,0:05:52.27,0:05:57.45,Default,,0000,0000,0000,,have a few people at a few companies,\Nin one state, in one country,
Dialogue: 0,0:05:57.45,0:06:01.75,Default,,0000,0000,0000,,been able to shape the attentional habits\Nof billions of human beings.
Dialogue: 0,0:06:01.75,0:06:05.53,Default,,0000,0000,0000,,Alexander could have never\Neven dreamed of that sort of power.
Dialogue: 0,0:06:06.22,0:06:11.46,Default,,0000,0000,0000,,So I think it's no hyperbole to say\Nthat the digital attention economy
Dialogue: 0,0:06:11.46,0:06:14.94,Default,,0000,0000,0000,,is the largest\Nand most effective system
Dialogue: 0,0:06:14.94,0:06:20.06,Default,,0000,0000,0000,,for human attitudinal and behavioral\Nmanipulation the world has ever seen.
Dialogue: 0,0:06:20.06,0:06:22.58,Default,,0000,0000,0000,,And again, this seems to me\Nan enormous question.
Dialogue: 0,0:06:23.02,0:06:24.68,Default,,0000,0000,0000,,I think what's happened
Dialogue: 0,0:06:24.68,0:06:29.40,Default,,0000,0000,0000,,is that, as Aldous Huxley said\Nof the defenders of freedom in his time,
Dialogue: 0,0:06:29.40,0:06:31.54,Default,,0000,0000,0000,,they had "failed to take into account
Dialogue: 0,0:06:31.54,0:06:34.65,Default,,0000,0000,0000,,man's almost infinite appetite\Nfor distractions."
Dialogue: 0,0:06:34.65,0:06:37.08,Default,,0000,0000,0000,,I think that in the design\Nof digital technologies,
Dialogue: 0,0:06:37.08,0:06:39.32,Default,,0000,0000,0000,,we've made exactly the same mistake,
Dialogue: 0,0:06:39.32,0:06:43.92,Default,,0000,0000,0000,,and I think that it is urgent for us\Nto start taking that appetite into account.
Dialogue: 0,0:06:44.88,0:06:47.10,Default,,0000,0000,0000,,So how can we start to do that?
Dialogue: 0,0:06:47.10,0:06:49.42,Default,,0000,0000,0000,,Well, I think what\Nit would require, essentially,
Dialogue: 0,0:06:49.42,0:06:53.69,Default,,0000,0000,0000,,is to start asserting and defending\Nour freedom of attention.
Dialogue: 0,0:06:54.16,0:06:56.67,Default,,0000,0000,0000,,Now, this is a type of freedom\Nwe have always had
Dialogue: 0,0:06:56.67,0:06:59.61,Default,,0000,0000,0000,,but never needed\Nto seriously assert or defend
Dialogue: 0,0:06:59.61,0:07:01.74,Default,,0000,0000,0000,,because there wasn't\Na whole lot in our world
Dialogue: 0,0:07:01.74,0:07:04.33,Default,,0000,0000,0000,,that could seriously threaten it.
Dialogue: 0,0:07:04.91,0:07:09.54,Default,,0000,0000,0000,,But I think we can find good precedent\Nin the great writers on the subject.
Dialogue: 0,0:07:10.82,0:07:12.81,Default,,0000,0000,0000,,For instance, John Stuart Mill,
Dialogue: 0,0:07:12.81,0:07:16.67,Default,,0000,0000,0000,,who said that "The appropriate\Nregion of human liberty
Dialogue: 0,0:07:16.67,0:07:19.61,Default,,0000,0000,0000,,comprises the inward domain\Nof consciousness."
Dialogue: 0,0:07:20.01,0:07:23.06,Default,,0000,0000,0000,,Freedom of mind\Nis the first sort of freedom.
Dialogue: 0,0:07:23.06,0:07:27.12,Default,,0000,0000,0000,,He adds that the principle of liberty\Nrequires "liberty of tastes and pursuits,
Dialogue: 0,0:07:27.12,0:07:30.44,Default,,0000,0000,0000,,of framing the plan of our life\Nto suit our own character."
Dialogue: 0,0:07:30.44,0:07:31.78,Default,,0000,0000,0000,,So what this suggests to me
Dialogue: 0,0:07:31.78,0:07:34.66,Default,,0000,0000,0000,,is that we need\Nto start thinking more broadly
Dialogue: 0,0:07:34.66,0:07:37.44,Default,,0000,0000,0000,,about what we mean\Nby the concept of attention
Dialogue: 0,0:07:37.44,0:07:40.96,Default,,0000,0000,0000,,in order to take into account\Nthe full spectrum of distractions
Dialogue: 0,0:07:40.96,0:07:43.52,Default,,0000,0000,0000,,that are now being unleashed in our world.
Dialogue: 0,0:07:43.52,0:07:45.52,Default,,0000,0000,0000,,Because when we hear the term "attention,"
Dialogue: 0,0:07:45.52,0:07:48.52,Default,,0000,0000,0000,,what we normally think of\Nis the spotlight of attention,
Dialogue: 0,0:07:48.52,0:07:53.45,Default,,0000,0000,0000,,kind of the immediate way we shape\Nour awareness within the task domain,
Dialogue: 0,0:07:53.45,0:07:57.26,Default,,0000,0000,0000,,so the attention that you are all\Ngiving to me right now in this moment,
Dialogue: 0,0:07:57.26,0:07:59.80,Default,,0000,0000,0000,,attention for which, by the way,\NI am very grateful.
Dialogue: 0,0:08:00.88,0:08:04.50,Default,,0000,0000,0000,,But when the spotlight\Nof our attention gets obscured,
Dialogue: 0,0:08:04.50,0:08:07.07,Default,,0000,0000,0000,,it sort of interferes\Nwith our ability to act.
Dialogue: 0,0:08:07.07,0:08:08.93,Default,,0000,0000,0000,,So let's say I'm trying to read a book,
Dialogue: 0,0:08:08.93,0:08:13.97,Default,,0000,0000,0000,,but I see on my phone that Donald Trump\Nhas unleashed another outrageous tweet,
Dialogue: 0,0:08:13.97,0:08:18.40,Default,,0000,0000,0000,,and so I stop reading my book\Nand don't finish reading it until later.
Dialogue: 0,0:08:20.03,0:08:25.81,Default,,0000,0000,0000,,But over time, actions become habits;\Nthe things we do become the people we are.
Dialogue: 0,0:08:26.81,0:08:31.97,Default,,0000,0000,0000,,And we don't have a way of talking\Nabout attention in this longer-term view
Dialogue: 0,0:08:32.21,0:08:35.36,Default,,0000,0000,0000,,with respect to our higher\Ngoals and our values.
Dialogue: 0,0:08:35.84,0:08:39.41,Default,,0000,0000,0000,,So I think that we could maybe think\Nof another light of attention
Dialogue: 0,0:08:39.41,0:08:41.45,Default,,0000,0000,0000,,beyond this spotlight of attention.
Dialogue: 0,0:08:41.45,0:08:44.04,Default,,0000,0000,0000,,We could think of\Nthe starlight of attention,
Dialogue: 0,0:08:44.34,0:08:48.56,Default,,0000,0000,0000,,so the way we navigate our lives\Nby the stars of our higher values.
Dialogue: 0,0:08:49.09,0:08:53.72,Default,,0000,0000,0000,,So when technology obscures\Nthe starlight of our attention -
Dialogue: 0,0:08:54.06,0:08:57.65,Default,,0000,0000,0000,,we can see this especially\Nin infinite scrolling news feeds,
Dialogue: 0,0:08:57.65,0:08:59.74,Default,,0000,0000,0000,,like on Facebook or Twitter,
Dialogue: 0,0:08:59.74,0:09:01.96,Default,,0000,0000,0000,,and when you pull down to refresh,
Dialogue: 0,0:09:01.96,0:09:04.66,Default,,0000,0000,0000,,the same psychological\Nmechanism is at play
Dialogue: 0,0:09:04.66,0:09:07.17,Default,,0000,0000,0000,,that is at play in the design\Nof slot machines,
Dialogue: 0,0:09:07.17,0:09:09.21,Default,,0000,0000,0000,,so there are intermittent variable rewards.
Dialogue: 0,0:09:09.21,0:09:11.45,Default,,0000,0000,0000,,When you randomize\Nthe reward you give somebody,
Dialogue: 0,0:09:11.45,0:09:14.55,Default,,0000,0000,0000,,they're more likely to do\Nthe behavior you want them to do.
Dialogue: 0,0:09:15.79,0:09:21.42,Default,,0000,0000,0000,,And when the attention economy\Nstands in the starlight of our attention,
Dialogue: 0,0:09:21.42,0:09:24.28,Default,,0000,0000,0000,,it shapes our lives in its image;
Dialogue: 0,0:09:24.28,0:09:26.78,Default,,0000,0000,0000,,our values become its values.
Dialogue: 0,0:09:26.78,0:09:31.11,Default,,0000,0000,0000,,We become more petty,\Nmore narcissistic, more impulsive.
Dialogue: 0,0:09:32.54,0:09:34.99,Default,,0000,0000,0000,,And I think this is perfectly represented
Dialogue: 0,0:09:34.99,0:09:39.72,Default,,0000,0000,0000,,by the CBS CEO's comment\Nfrom February of last year,
Dialogue: 0,0:09:39.72,0:09:43.04,Default,,0000,0000,0000,,when he said, "Donald Trump's candidacy\Nmay not be good for America,
Dialogue: 0,0:09:43.04,0:09:45.40,Default,,0000,0000,0000,,but it's damn good for CBS."
Dialogue: 0,0:09:45.40,0:09:48.52,Default,,0000,0000,0000,,The attention economy\Ndoesn't just shape our lives in its image;
Dialogue: 0,0:09:48.52,0:09:51.12,Default,,0000,0000,0000,,it shapes our politics in its image.
Dialogue: 0,0:09:51.12,0:09:53.51,Default,,0000,0000,0000,,Again, I think this is\Nan urgent moral question
Dialogue: 0,0:09:53.51,0:09:57.25,Default,,0000,0000,0000,,that is being talked about\Nby virtually no one.
Dialogue: 0,0:09:58.24,0:10:02.76,Default,,0000,0000,0000,,But I think we could find one more light\Nof our attention to talk about.
Dialogue: 0,0:10:02.76,0:10:07.75,Default,,0000,0000,0000,,It's when the technology\Ndoesn't just make it harder
Dialogue: 0,0:10:07.75,0:10:11.50,Default,,0000,0000,0000,,to do what we want to do\Nor to be who we want to be,
Dialogue: 0,0:10:11.50,0:10:14.12,Default,,0000,0000,0000,,but in a sense, to want\Nwhat we want to want -
Dialogue: 0,0:10:14.12,0:10:17.47,Default,,0000,0000,0000,,to define our goals and values\Nin the first place.
Dialogue: 0,0:10:17.87,0:10:20.55,Default,,0000,0000,0000,,So we can think of this\Nas the daylight of our attention,
Dialogue: 0,0:10:20.55,0:10:23.52,Default,,0000,0000,0000,,the light by which we're able\Nto do everything else.
Dialogue: 0,0:10:23.85,0:10:26.53,Default,,0000,0000,0000,,When technology undermines\Nthe daylight of our attention,
Dialogue: 0,0:10:26.53,0:10:28.28,Default,,0000,0000,0000,,it erodes our fundamental capacities
Dialogue: 0,0:10:28.28,0:10:33.31,Default,,0000,0000,0000,,like reason, reflection,\Nintelligence, metacognition.
Dialogue: 0,0:10:34.51,0:10:36.73,Default,,0000,0000,0000,,One way we see this very clearly
Dialogue: 0,0:10:36.73,0:10:42.07,Default,,0000,0000,0000,,is in the proliferation of outrage\Nin our societies and in our world.
Dialogue: 0,0:10:42.35,0:10:46.77,Default,,0000,0000,0000,,Outrage - the impulse\Nto judge and punish -
Dialogue: 0,0:10:46.77,0:10:50.17,Default,,0000,0000,0000,,was extremely valuable\Nat earlier stages of human evolution
Dialogue: 0,0:10:50.17,0:10:55.70,Default,,0000,0000,0000,,in small foraging groups because it promoted\Nmoral clarity and social solidarity.
Dialogue: 0,0:10:55.70,0:10:59.47,Default,,0000,0000,0000,,It was a way of signalling to other people\Nthat you could be trusted.
Dialogue: 0,0:10:59.47,0:11:02.39,Default,,0000,0000,0000,,But when we amplify this\Non a societal scale,
Dialogue: 0,0:11:02.39,0:11:07.61,Default,,0000,0000,0000,,it results in large-scale social division\Nand rampant retaliation.
Dialogue: 0,0:11:07.61,0:11:09.11,Default,,0000,0000,0000,,To give you one example -
Dialogue: 0,0:11:09.36,0:11:11.50,Default,,0000,0000,0000,,I don't know how many of you\Nremember this -
Dialogue: 0,0:11:11.83,0:11:14.65,Default,,0000,0000,0000,,there was a dentist from Minnesota,\Na little while back,
Dialogue: 0,0:11:14.65,0:11:18.21,Default,,0000,0000,0000,,who went to Zimbabwe\Nand killed a lion named Cecil.
Dialogue: 0,0:11:18.21,0:11:21.26,Default,,0000,0000,0000,,It was a stupid thing to do;\Nhe probably shouldn't have done it.
Dialogue: 0,0:11:21.26,0:11:23.35,Default,,0000,0000,0000,,It might have been illegal - I don't know.
Dialogue: 0,0:11:24.11,0:11:25.90,Default,,0000,0000,0000,,But what happened as a result of that
Dialogue: 0,0:11:25.90,0:11:30.18,Default,,0000,0000,0000,,is that the entire internet came down\Non this man for a bad decision.
Dialogue: 0,0:11:30.56,0:11:34.65,Default,,0000,0000,0000,,It was this whole sort of festival\Nof public shaming.
Dialogue: 0,0:11:34.65,0:11:39.36,Default,,0000,0000,0000,,People showed up at his place of work,\Nputting signs on it saying, "Rot in hell."
Dialogue: 0,0:11:39.36,0:11:42.26,Default,,0000,0000,0000,,They showed up at his home\Nand spray-painted it.
Dialogue: 0,0:11:42.64,0:11:45.69,Default,,0000,0000,0000,,When children do this sort of thing,\Nwe call it "cyber-bullying."
Dialogue: 0,0:11:45.69,0:11:49.24,Default,,0000,0000,0000,,But when adults do it,\Nit is mob rule, plain and simple.
Dialogue: 0,0:11:49.24,0:11:52.21,Default,,0000,0000,0000,,And mob rule is precisely\Nwhat Socrates held
Dialogue: 0,0:11:52.21,0:11:56.77,Default,,0000,0000,0000,,was the main route democracies take\Nwhen they turn into tyrannies.
Dialogue: 0,0:11:58.11,0:12:01.24,Default,,0000,0000,0000,,So, we can think beyond\Nthe spotlight of our attention;
Dialogue: 0,0:12:01.24,0:12:04.48,Default,,0000,0000,0000,,we can think not just in terms\Nof doing what we want to do
Dialogue: 0,0:12:04.48,0:12:05.81,Default,,0000,0000,0000,,but being who we want to be
Dialogue: 0,0:12:05.81,0:12:08.87,Default,,0000,0000,0000,,and ultimately wanting\Nwhat we want to want.
Dialogue: 0,0:12:10.34,0:12:15.58,Default,,0000,0000,0000,,This is an intolerable situation;\Nthis should not persist.
Dialogue: 0,0:12:16.12,0:12:20.17,Default,,0000,0000,0000,,As Aristotle said, "It is disgraceful\Nto be unable to use our good things."
Dialogue: 0,0:12:20.17,0:12:21.79,Default,,0000,0000,0000,,We should not have to settle
Dialogue: 0,0:12:21.79,0:12:25.09,Default,,0000,0000,0000,,for a relationship with technology\Nthat is adversarial.
Dialogue: 0,0:12:25.09,0:12:27.91,Default,,0000,0000,0000,,We should demand that they be on our side.
Dialogue: 0,0:12:27.91,0:12:30.06,Default,,0000,0000,0000,,Isn't that what technology is for?
Dialogue: 0,0:12:30.73,0:12:32.24,Default,,0000,0000,0000,,So how would we do that?
Dialogue: 0,0:12:32.24,0:12:35.65,Default,,0000,0000,0000,,Well, in the past, we've typically\Nput it back on people themselves
Dialogue: 0,0:12:35.65,0:12:38.89,Default,,0000,0000,0000,,to deal with distraction,\Nto deal with the effects of technology,
Dialogue: 0,0:12:38.89,0:12:40.40,Default,,0000,0000,0000,,to say, "Work harder."
Dialogue: 0,0:12:40.40,0:12:44.73,Default,,0000,0000,0000,,But in the digital age, the persuasion\Nis just too powerful and too ubiquitous,
Dialogue: 0,0:12:44.73,0:12:46.38,Default,,0000,0000,0000,,and this will not work.
Dialogue: 0,0:12:46.76,0:12:50.12,Default,,0000,0000,0000,,But neither can we blame the people\Nwho make these technologies.
Dialogue: 0,0:12:50.12,0:12:54.33,Default,,0000,0000,0000,,These are by and large good people,\Nand I count many of them as my friends.
Dialogue: 0,0:12:54.59,0:12:58.16,Default,,0000,0000,0000,,They're just players in a game\Ncalled the attention economy.
Dialogue: 0,0:12:58.16,0:13:00.10,Default,,0000,0000,0000,,The problem is that game.
Dialogue: 0,0:13:01.41,0:13:05.16,Default,,0000,0000,0000,,Ultimately, this is not a problem\Nof the ethics of individual actors;
Dialogue: 0,0:13:05.16,0:13:08.76,Default,,0000,0000,0000,,it's a problem of the ethics\Nof the system, of the infrastructure,
Dialogue: 0,0:13:08.76,0:13:13.23,Default,,0000,0000,0000,,what philosopher Luciano Floridi\Nat Oxford calls the "infraethics."
Dialogue: 0,0:13:14.09,0:13:17.58,Default,,0000,0000,0000,,So how can we change the situation?
Dialogue: 0,0:13:17.88,0:13:19.70,Default,,0000,0000,0000,,Well, go back to what I said earlier
Dialogue: 0,0:13:19.70,0:13:21.98,Default,,0000,0000,0000,,about how what we're doing\Nis attentional labor
Dialogue: 0,0:13:21.98,0:13:23.81,Default,,0000,0000,0000,,when we're using these technologies
Dialogue: 0,0:13:23.81,0:13:26.68,Default,,0000,0000,0000,,and paying for them\Nwith our time and our attention.
Dialogue: 0,0:13:28.30,0:13:31.12,Default,,0000,0000,0000,,In this light, we can frame\Nthe problem in two ways.
Dialogue: 0,0:13:31.12,0:13:34.69,Default,,0000,0000,0000,,One is that we're getting poor value\Nfor our attentional labor.
Dialogue: 0,0:13:34.69,0:13:38.54,Default,,0000,0000,0000,,The other problem is that\Nthe conditions of that attentional labor
Dialogue: 0,0:13:38.54,0:13:40.27,Default,,0000,0000,0000,,are extremely poor.
Dialogue: 0,0:13:40.65,0:13:44.07,Default,,0000,0000,0000,,Now, throughout history, when people\Nhave been faced with this situation,
Dialogue: 0,0:13:44.07,0:13:45.93,Default,,0000,0000,0000,,what they've done is to organize,
Dialogue: 0,0:13:45.93,0:13:50.19,Default,,0000,0000,0000,,to create mechanisms\Nof collective representation
Dialogue: 0,0:13:50.19,0:13:53.36,Default,,0000,0000,0000,,so that they collectively negotiate\Nwith those in power,
Dialogue: 0,0:13:53.36,0:13:55.82,Default,,0000,0000,0000,,with those Alexanders of their time.
Dialogue: 0,0:13:56.33,0:14:00.10,Default,,0000,0000,0000,,So what I think is needed\Nis something that will give us a voice,
Dialogue: 0,0:14:00.10,0:14:02.26,Default,,0000,0000,0000,,a direct voice,
Dialogue: 0,0:14:02.26,0:14:03.97,Default,,0000,0000,0000,,in the design of our technologies,
Dialogue: 0,0:14:03.97,0:14:06.76,Default,,0000,0000,0000,,and no mechanism like this exists today.
Dialogue: 0,0:14:06.76,0:14:09.30,Default,,0000,0000,0000,,So what I am calling for\Nand what is needed
Dialogue: 0,0:14:09.30,0:14:14.74,Default,,0000,0000,0000,,is a labor union or something like it\Nfor the attention economy.
Dialogue: 0,0:14:15.10,0:14:17.04,Default,,0000,0000,0000,,And there's a community of people
Dialogue: 0,0:14:17.04,0:14:20.42,Default,,0000,0000,0000,,who are passionately\Nthinking about these issues,
Dialogue: 0,0:14:20.42,0:14:22.37,Default,,0000,0000,0000,,going under the name Time Well Spent,
Dialogue: 0,0:14:22.37,0:14:25.37,Default,,0000,0000,0000,,so that you can know, when you\Nspend time with your technologies,
Dialogue: 0,0:14:25.37,0:14:28.68,Default,,0000,0000,0000,,that it won't just be time spent;\Nit will be time well spent.
Dialogue: 0,0:14:29.31,0:14:32.16,Default,,0000,0000,0000,,So we're thinking about how to change\Nthe attention economy,
Dialogue: 0,0:14:32.16,0:14:36.70,Default,,0000,0000,0000,,coming up with better metrics, better\Nprinciples, processes, business models.
Dialogue: 0,0:14:36.70,0:14:39.96,Default,,0000,0000,0000,,And if you're passionate about this,\Nwe would love to engage with you
Dialogue: 0,0:14:39.96,0:14:44.08,Default,,0000,0000,0000,,because the thing is, we need your help\Nto make this a reality,
Dialogue: 0,0:14:44.08,0:14:45.90,Default,,0000,0000,0000,,to change the system.
Dialogue: 0,0:14:46.69,0:14:50.28,Default,,0000,0000,0000,,Because at the end of the day,
Dialogue: 0,0:14:50.28,0:14:54.27,Default,,0000,0000,0000,,your attention is the most precious\Nresource that you have.
Dialogue: 0,0:14:54.27,0:14:57.41,Default,,0000,0000,0000,,It's the ultimate\Nscarce and finite resource,
Dialogue: 0,0:14:57.41,0:15:01.16,Default,,0000,0000,0000,,and the challenges\Nthat are facing humanity right now,
Dialogue: 0,0:15:01.16,0:15:03.81,Default,,0000,0000,0000,,so many big and important challenges,
Dialogue: 0,0:15:03.81,0:15:07.13,Default,,0000,0000,0000,,before anything else,\Nwhat they require of us
Dialogue: 0,0:15:07.13,0:15:10.66,Default,,0000,0000,0000,,is that we be able to give attention\Nto the things that matter
Dialogue: 0,0:15:10.66,0:15:13.53,Default,,0000,0000,0000,,at the individual level\Nand at the collective level,
Dialogue: 0,0:15:13.53,0:15:14.62,Default,,0000,0000,0000,,and this is precisely
Dialogue: 0,0:15:14.62,0:15:18.35,Default,,0000,0000,0000,,what the technologies of the attention\Neconomy are undermining.
Dialogue: 0,0:15:19.96,0:15:23.46,Default,,0000,0000,0000,,And ultimately, this relates\Nto the very goals of life.
Dialogue: 0,0:15:23.46,0:15:26.91,Default,,0000,0000,0000,,Because nobody on their deathbed\Never looked back and said,
Dialogue: 0,0:15:26.91,0:15:29.54,Default,,0000,0000,0000,,"I wish I'd spent more time on Facebook."
Dialogue: 0,0:15:29.54,0:15:34.79,Default,,0000,0000,0000,,You know, what we regret missing\Nare those real goals, the human goals,
Dialogue: 0,0:15:34.79,0:15:37.42,Default,,0000,0000,0000,,those things that make life worth living.
Dialogue: 0,0:15:38.43,0:15:44.36,Default,,0000,0000,0000,,And so, from here, I think it will take\Nsome time to reform the attention economy.
Dialogue: 0,0:15:44.36,0:15:47.20,Default,,0000,0000,0000,,Big projects usually take time.
Dialogue: 0,0:15:47.59,0:15:52.01,Default,,0000,0000,0000,,But in the meantime,\NI think what we need to do is organize
Dialogue: 0,0:15:52.01,0:15:56.44,Default,,0000,0000,0000,,so that we can have a voice,\Na direct voice,
Dialogue: 0,0:15:56.86,0:15:59.76,Default,,0000,0000,0000,,to those who create our technologies,
Dialogue: 0,0:15:59.76,0:16:02.72,Default,,0000,0000,0000,,and we should continue to reap\Nthe benefits of our technologies
Dialogue: 0,0:16:02.72,0:16:05.76,Default,,0000,0000,0000,,and continue to affirm and support\Nthe people who create them
Dialogue: 0,0:16:05.76,0:16:08.49,Default,,0000,0000,0000,,because they carry that flame\Nof innovation and creativity
Dialogue: 0,0:16:08.49,0:16:11.24,Default,,0000,0000,0000,,that is so core to the human project.
Dialogue: 0,0:16:11.86,0:16:13.92,Default,,0000,0000,0000,,But before anything else,
Dialogue: 0,0:16:14.44,0:16:17.53,Default,,0000,0000,0000,,we need to organize\Nand ask them for their attention
Dialogue: 0,0:16:17.53,0:16:20.42,Default,,0000,0000,0000,,so that we can tell them\Nwhat we want them to do with ours.
Dialogue: 0,0:16:20.71,0:16:22.49,Default,,0000,0000,0000,,Before anything else,
Dialogue: 0,0:16:22.49,0:16:25.93,Default,,0000,0000,0000,,we need to ask them\Nto stand out of our light.
Dialogue: 0,0:16:26.31,0:16:27.73,Default,,0000,0000,0000,,Thank you for your attention.
Dialogue: 0,0:16:27.73,0:16:29.16,Default,,0000,0000,0000,,(Applause)