We have heard a lot of stories about the impact of the Internet on protest movements, heard a lot about the information revolution and how it is transforming countries like China, countries like Iran, even many of the countries of the former Soviet Union. And the assumption so far has been that the Internet is basically a very good thing when it comes to promoting democracy.

Many of these illusions were put together in the mid-nineties by thinkers I can only call cyber-utopians: people who really believed in the transformative power of the Web to change societies, and to change them for the better. The most famous quote was that if social networking and blogging had been around in the early nineties, the genocide in Rwanda wouldn't have happened. That line is now very often quoted to illustrate the very naive view many people held at the time.

Many of the people who still believe this think that blogs are more or less what faxes and Xerox machines were in the late eighties, when the great dissident movements in Poland and Eastern Europe embraced that technology. Essentially, this argument is about economics and logistics: the Internet and new media have made it really cheap for people to produce content, so activists and NGOs will inevitably use this technology to push for reform and change. If you want to sum up this view, it basically says that if you have enough connectivity and enough devices, democracy is almost inevitable. That explains why we have seen so many pushes to get China online, get Iran online, get Russia online, to make sure that people have enough connectivity and know what blogging is, and somehow, although no one explains how exactly, these people will then use these tools to ask for more democracy, to cooperate, and to push for change.
One of the things pundits have developed within this view is iPod liberalism: the belief that people who have iPods, or any other sort of modern Western technology, will also be very likely to support Western values and Western democracy. The assumption is that if you give all Chinese, or Iranians, or Russians enough iPods, or enough laptops, or enough fax machines, they will all somehow, on their own, aspire to democratic change. Of course, this would make a fascinating title for a column by Thomas Friedman, "Drop iPods, Not Bombs," but that is rarely a good sign. It is a view that essentially disregards a lot of the political, cultural, and sociological forces at play in these societies and embraces a very deterministic picture of the role that technology plays.

The main confusion here is that we tend to mistake the intended uses of technology for its actual uses. We like to think that radio, for example, could help establish democracy in countries like the Soviet Union, and it partly did; but radio was also used very actively during the very Rwandan genocide we wanted to avert.

We also have, as I mentioned earlier, something of a myth that certain leaders and dictators fear the Internet, that they fear technology. If you look closely at how government leaders are trying to reach out to their citizens and Internet users, that is actually not the case. Pretty much across the board, with the probable exceptions of North Korea and Burma, authoritarian leaders are very actively engaging with technology, computers, and so forth. Sometimes they do allow debate around issues which are not political, and they allow debate around somewhat more political issues like climate change. All of that is happening; it is just not happening around issues like human rights. So you can see that there is criticism in Chinese blogs, in fact much more criticism than non-criticism, of both national and local governments.
The question, then, is why does the government tolerate it? First, blogging generates information that the government needs to run the country. Most bureaucrats in governments in Russia, China, Iran, or elsewhere operate in huge information vacuums; they do not really know what is happening in the outer regions. So for them, having people blog and voluntarily provide information about what may be wrong with local issues is actually quite helpful, because it helps them crack down on local corruption and bad behavior and actually fix some of the problems, which may not be political in nature but which may help the regime survive into the next century. It also helps them achieve legitimacy. For them, this fake opening-up in cyberspace is very useful, because it reduces tension and convinces at least some people that yes, they are willing to consider outside views and opinions.

Some of this is happening slightly differently in other countries. They may still be spinning it, but they are also trying to leverage the support of their users online, in cyberspace. In Thailand, for example, there is a very interesting site called "Protect the King." It was started by a member of parliament, and it encourages Internet users to submit links to websites they may find offensive to the king. You can basically go and nominate any website you do not like, and within roughly 24 hours it will be blocked; then you need to go through a very complex procedure to unblock it. Because there are very severe lèse-majesté laws in Thailand, this works very well: within 24 hours of the launch, some 3,000 websites had been blocked, and there are a lot of loyalists who are very happy to contribute their knowledge and tips to have those websites censored.

The same is happening in Saudi Arabia, where Internet users are encouraged to search YouTube for videos that may offend Saudi sensibilities and then nominate them for deletion.
If a particular video accumulates a critical mass of complaints, YouTube will have to consider deleting it, because so many people have complained. So there are organized campaigns to try to influence the decisions of Western companies on this issue.

Consider how the Iranian authorities, now that the protests are slowing down, are looking at the online evidence trail left on Facebook and Twitter. They go and start cracking down on people who were active in cyberspace. One of their initiatives now is putting pictures of protesters on the street online so that they can be identified; they are crowd-sourcing the process of matching faces to names. And, of course, you can guess what is going to happen once they know who the protesters were. There are a lot of dangers and fears here that we do not entirely understand at this point. What you may not realize is that Twitter, despite all its virtues, is a public platform, and if you want to plan a revolution on Twitter, your actions will be visible to everyone. In the past, states used to torture to get this kind of data; now all they have to do is get on Facebook. [crowd laughing] If I am an activist in a country like Belarus or Iran and I am connected to twenty thousand other activists, all the authorities have to do is look up my Facebook friends.

My final point is again about the cyber-utopian assumption that the younger generation, which has not been subject to brainwashing and which is all about digital media, mobile phones, BlackBerrys, and laptops, will somehow be prone to revolution, prone to embrace democratic values. The problem is that we hear quite a lot about cyber-activism, but we hear very little about what I call cyber-hedonism, where young people may not necessarily be that crazy about participating in any political actions, but they are always online because of all the good things the Internet has to offer.
Adult content, meaning pornography, along with instant messaging and email, still occupies proportionally much more space than politics or news. And you have to keep in perspective that most of what young people do online revolves around communicating with each other or downloading entertainment. It is not at all clear how they will advance to the level of actually being politically active, or whether any of this will get them into the streets. This is something we do not see discussed very often. We hear a lot about the distinction between digital natives and digital immigrants; what we do not hear about is the distinction between digital renegades and digital captives, which I think is the much more important one, because we need to know how exactly technology influences their civic engagement and their propensity to actually go and engage in protest.

You have to go back to Maslow and start thinking about how his pyramid of needs applies to cyberspace. It may well be that when you bring the Internet to China, Russia, or Iran, at the very beginning what people want to do online is have fun: explore pornography, or YouTube, or videos of funny cats. Then they move on to talking and sharing, and some will want to go and explore learning. Eventually they may want to campaign. Some of them will go and start downloading reports from Human Rights Watch, but most of them will still be downloading pornography, and that is a very important perspective to keep in mind. If you really want to understand the actual net impact, the net effect, of technology on society, then you have to look much more broadly and take the negative consequences into account as well.