I'm an artist and an engineer. And lately, I've been thinking a lot about how technology mediates the way we perceive reality. And it's being done in a super-invisible, nuanced way. Technology is designed to shape our sense of reality by masking itself as the actual experience of the world. As a result, we are becoming unconscious and unaware that it is happening at all.

Take the glasses I usually wear, for example. These have become part of the way I ordinarily experience my surroundings. I barely notice them, even though they are constantly framing reality for me. The technology I am talking about is designed to do the same thing: change what we see and think, but go unnoticed.

Now, the only time I do notice my glasses is when something happens to draw my attention to them, like when they get dirty or my prescription changes.

So I asked myself, "As an artist, what can I create to draw the same kind of attention to the ways digital media -- like news organizations, social media platforms, advertising and search engines -- are shaping our reality?"

So I created a series of perceptual machines to help us defamiliarize and question the ways we see the world.

For example, nowadays, many of us have this kind of allergic reaction to ideas that are different from ours. We may not even realize that we've developed this kind of mental allergy.

So I created a helmet that produces an artificial allergy to the color red. It simulates this hypersensitivity by making red things look bigger when you are wearing it.

It has two modes: nocebo and placebo. In nocebo mode, it creates the sensorial experience of a hyperallergy: whenever I see red, the red expands.

It's similar to social media's amplification effect. When you see something that bothers you, you tend to stick with like-minded people and exchange messages and memes, and you become even more angry. Sometimes a trivial discussion gets amplified and blown way out of proportion. Maybe that's even why we are living in the politics of anger.

In placebo mode, the helmet is an artificial cure for this allergy.
Whenever you see red, the red shrinks. It's a palliative, like in digital media: when we encounter people with different opinions, we unfollow them and remove them completely from our feeds.

It cures this allergy by avoiding it. But this way of intentionally ignoring opposing ideas makes the human community hyperfragmented and separated.

The device inside the helmet reshapes reality and projects it into our eyes through a set of lenses to create an augmented reality. I picked the color red because it's intense and emotional, it has high visibility, and it's political.

So what if we take a look at the last American presidential election map through the helmet?

(Laughter)

You can see that it doesn't matter if you're a Democrat or a Republican, because the mediation alters our perceptions. The allergy exists on both sides.

In digital media, what we see every day is often mediated, but it's also very nuanced. If we are not aware of this, we will keep being vulnerable to many kinds of mental allergies.

Our perception is not only part of our identities; in digital media, it's also part of the value chain. Our visual field is packed with so much information that our perception has become a commodity with real estate value.

Designs are used to exploit our unconscious biases, and algorithms favor content that reaffirms our opinions, so that every little corner of our field of view is being colonized to sell ads. Like when this little red dot appears in your notifications: it grows and expands, and to your mind, it's huge.

So I started to think of ways to put a little dirt on my glasses, or to change their lenses, and came up with another project.

Now, keep in mind this is conceptual. It's not a real product. It's a web browser plug-in that could help us notice the things that we would usually ignore. Like the helmet, the plug-in reshapes reality, but this time directly within the digital media itself.

It amplifies the hidden, filtered-out voices.
What you should be noticing now will be bigger and more vibrant, like here, this story about gender bias emerging from the sea of cats.

(Laughter)

The plug-in could also dilute the things that are being amplified by an algorithm. Like here, in this comment section, there are lots of people shouting the same opinions. The plug-in makes their comments super small.

(Laughter)

So now the amount of pixel presence they have on the screen is proportional to the actual value they are contributing to the conversation.

(Laughter)

(Applause)

The plug-in also shows the real estate value of our visual field and how much of our perception is being commoditized. Unlike ad blockers, for every ad you see on the web page, it shows the amount of money you should be earning.

(Laughter)

We are living in a battlefield between reality and commercially distributed reality, so the next version of the plug-in could strike away that commercial reality and show you things as they really are.

(Laughter)

(Applause)

Well, you can imagine how many directions this could really go. Believe me, I know the risks are high if this were to become a real product. I created it with good intentions, to train our perception and eliminate biases. But the same approach could be used with bad intentions, like forcing citizens to install a plug-in like that to control the public narrative. It's challenging to make it fair and personal without it just becoming another layer of mediation.

So what does all this mean for us? Even though technology is creating this isolation, we could use it to make the world connected again by breaking the existing model and going beyond it. By exploring how we interface with these technologies, we could step out of our habitual, almost machine-like behavior and finally find common ground with each other.

Technology is never neutral. It provides a context and frames reality. It's part of the problem and part of the solution.
We could use it to uncover our blind spots and retrain our perception, and consequently, choose how we see each other.

Thank you.

(Applause)