1
00:00:00,792 --> 00:00:03,059
Every day, every week,

2
00:00:03,083 --> 00:00:05,268
we agree to terms and conditions.

3
00:00:05,292 --> 00:00:06,768
And when we do,

4
00:00:06,792 --> 00:00:09,268
we grant companies the legal right

5
00:00:09,292 --> 00:00:12,976
to do whatever they want with our data

6
00:00:13,000 --> 00:00:15,375
and with the data of our children.

7
00:00:16,792 --> 00:00:19,768
What's astonishing is

8
00:00:19,792 --> 00:00:22,684
just how much of our children's data we are giving away.

9
00:00:22,708 --> 00:00:24,708
What will the consequences of this be?

10
00:00:26,500 --> 00:00:27,893
I'm an anthropologist,

11
00:00:27,917 --> 00:00:30,518
and I'm also the mother of two little girls.

12
00:00:30,542 --> 00:00:35,018
I started to become interested in this question in 2015,

13
00:00:35,042 --> 00:00:37,768
when I suddenly realized

14
00:00:37,792 --> 00:00:40,809
that vast, almost unimaginable amounts of data about children

15
00:00:40,833 --> 00:00:44,000
were being produced and collected.

16
00:00:44,792 --> 00:00:46,768
So I launched a research project.

17
00:00:46,792 --> 00:00:49,268
I called it Child Data Citizen,

18
00:00:49,292 --> 00:00:51,417
and I set out to fill that gap.

19
00:00:52,583 --> 00:00:55,601
Now you may think that I'm here to blame you

20
00:00:55,625 --> 00:00:58,393
for posting photos of your children on social media,

21
00:00:58,417 --> 00:01:00,559
but that's not the point.

22
00:01:00,583 --> 00:01:04,000
The problem is much bigger than that.

23
00:01:04,792 --> 00:01:08,893
It's about systems, not individuals.

24
00:01:08,917 --> 00:01:11,208
You and your habits are not to blame.

25
00:01:12,833 --> 00:01:15,684
For the very first time in history,

26
00:01:15,708 --> 00:01:18,268
we are tracking children's data

27
00:01:18,292 --> 00:01:20,059
from long before they are born,

28
00:01:20,083 --> 00:01:22,768
sometimes from the moment of conception,

29
00:01:22,792 --> 00:01:25,143
and then throughout their lives.

30
00:01:25,167 --> 00:01:28,268
You see, when parents decide to have a child,

31
00:01:28,292 --> 00:01:31,268
they go online to search for "ways to get pregnant,"

32
00:01:31,292 --> 00:01:34,042
or they download ovulation-tracking apps.

33
00:01:35,250 --> 00:01:37,851
And when they do get pregnant,

34
00:01:37,875 --> 00:01:41,018
they post their babies' ultrasounds on social media,

35
00:01:41,042 --> 00:01:43,059
they download pregnancy apps

36
00:01:43,083 --> 00:01:46,809
or they consult Dr. Google about all sorts of things,

37
00:01:46,833 --> 00:01:48,351
like, you know --

38
00:01:48,375 --> 00:01:50,934
"risk of miscarriage when flying"

39
00:01:50,958 --> 00:01:53,726
or "abdominal cramps in early pregnancy."

40
00:01:53,750 --> 00:01:55,559
I know because I've done it,

41
00:01:55,583 --> 00:01:57,208
and many times.

42
00:01:58,458 --> 00:02:01,268
And then, when the baby is born, they track each nap,

43
00:02:01,292 --> 00:02:02,559
each feed,

44
00:02:02,583 --> 00:02:05,167
each life event, on different technologies.

45
00:02:06,083 --> 00:02:07,559
And all of these technologies

46
00:02:07,583 --> 00:02:13,726
turn the baby's most intimate data into profit

47
00:02:13,750 --> 00:02:15,542
by sharing it with others.
48
00:02:16,583 --> 00:02:18,726
Here's how this works:

49
00:02:18,750 --> 00:02:23,934
in 2019, the British Medical Journal published a study

50
00:02:23,958 --> 00:02:27,601
which showed that out of 24 mobile health apps,

51
00:02:27,625 --> 00:02:31,083
19 shared data with third parties.

52
00:02:32,083 --> 00:02:37,917
And those third parties shared the data with 216 other organizations.

53
00:02:38,875 --> 00:02:42,309
Of those 216 organizations,

54
00:02:42,333 --> 00:02:45,476
only three belonged to the health sector.

55
00:02:45,500 --> 00:02:50,018
The companies that had access to that data included big tech companies

56
00:02:50,042 --> 00:02:53,559
like Google, Facebook and Oracle.

57
00:02:53,583 --> 00:02:56,184
They were digital advertising companies,

58
00:02:56,208 --> 00:03:00,333
and there was also a consumer credit reporting agency.

59
00:03:01,125 --> 00:03:02,559
So you got that right:

60
00:03:02,583 --> 00:03:07,708
they already have data on babies.

61
00:03:09,125 --> 00:03:11,893
But mobile apps, web searches and social media

62
00:03:11,917 --> 00:03:15,018
are really just the tip of the iceberg,

63
00:03:15,042 --> 00:03:17,893
because children are being tracked by multiple technologies

64
00:03:17,917 --> 00:03:19,643
throughout their everyday lives.

65
00:03:19,667 --> 00:03:23,809
They're tracked by home technologies and virtual assistants in their homes.

66
00:03:23,833 --> 00:03:25,809
They're tracked by educational platforms

67
00:03:25,833 --> 00:03:28,018
and educational technologies in their schools.

68
00:03:28,042 --> 00:03:29,643
They're tracked by their online records

69
00:03:29,667 --> 00:03:32,684
and the online portals of their hospitals.

70
00:03:32,708 --> 00:03:35,059
They're tracked by their internet-connected toys,

71
00:03:35,083 --> 00:03:36,393
their online games

72
00:03:36,417 --> 00:03:39,083
and many, many other technologies.

73
00:03:40,250 --> 00:03:41,893
During my research,

74
00:03:41,917 --> 00:03:46,059
a lot of parents came up to me and said, "So what?"

75
00:03:46,083 --> 00:03:49,000
"Why does it matter if my children are being tracked?

76
00:03:50,042 --> 00:03:51,375
We've got nothing to hide."

77
00:03:52,958 --> 00:03:54,458
Well, it matters.

78
00:03:55,083 --> 00:04:01,101
It matters because today individuals are not only being tracked;

79
00:04:01,125 --> 00:04:05,226
they're also being profiled on the basis of their data traces,

80
00:04:05,250 --> 00:04:09,059
using artificial intelligence and predictive analytics.

81
00:04:09,083 --> 00:04:12,726
These gather as much data as possible about a person's life

82
00:04:12,750 --> 00:04:14,601
from different sources:

83
00:04:14,625 --> 00:04:19,143
family history, purchasing habits and social media comments.

84
00:04:19,167 --> 00:04:21,018
Then they bring all this data together.

85
00:04:21,042 --> 00:04:23,792
To make data-driven decisions,

86
00:04:24,792 --> 00:04:28,226
these technologies are used everywhere.

87
00:04:28,250 --> 00:04:30,643
Banks use them when granting credit and loans,

88
00:04:30,667 --> 00:04:33,042
and insurance companies use them to determine premiums.

89
00:04:34,208 --> 00:04:36,684
Employers use them as well,

90
00:04:36,708 --> 00:04:39,625
to decide whether a candidate is a good fit for a job or not.
91
00:04:40,750 --> 00:04:43,851
The police and the courts also use them

92
00:04:43,875 --> 00:04:47,393
to determine whether someone is a potential criminal

93
00:04:47,417 --> 00:04:50,042
or is likely to reoffend.

94
00:04:52,458 --> 00:04:56,518
We have no knowledge of or control over

95
00:04:56,542 --> 00:05:00,184
those who buy, sell and analyze our data.

96
00:05:00,208 --> 00:05:02,917
They are profiling us and our children.

97
00:05:03,625 --> 00:05:07,667
And those profiles can significantly violate our rights.

98
00:05:08,917 --> 00:05:11,125
To give you an example,

99
00:05:13,792 --> 00:05:17,851
in 2018, the New York Times reported

100
00:05:17,875 --> 00:05:19,851
that data gathered

101
00:05:19,875 --> 00:05:22,934
through online college-planning services --

102
00:05:22,958 --> 00:05:27,684
data collected from millions of school kids across the US

103
00:05:27,708 --> 00:05:31,351
who were searching for college programs or scholarships --

104
00:05:31,375 --> 00:05:34,417
had been sold to educational data brokers.

105
00:05:35,792 --> 00:05:41,226
Now, researchers at Fordham who studied educational data brokers

106
00:05:41,250 --> 00:05:46,476
revealed that these companies profile children from as young as two

107
00:05:46,500 --> 00:05:49,559
on the basis of different categories:

108
00:05:49,583 --> 00:05:53,768
ethnicity, religion, affluence,

109
00:05:53,792 --> 00:05:55,851
social status

110
00:05:55,875 --> 00:05:58,809
and many other categories.

111
00:05:58,833 --> 00:06:03,851
And then they sell these profiles, together with the children's names,

112
00:06:03,875 --> 00:06:06,684
home addresses and contact details,

113
00:06:06,708 --> 00:06:08,559
to different companies,

114
00:06:08,583 --> 00:06:11,042
from trade and educational institutions

115
00:06:12,083 --> 00:06:13,351
to student loan

116
00:06:13,375 --> 00:06:15,125
and student credit card companies.

117
00:06:16,542 --> 00:06:17,893
To put this to the test,

118
00:06:17,917 --> 00:06:21,726
the researchers at Fordham asked an educational data broker

119
00:06:21,750 --> 00:06:27,559
to provide them with a list of 14-to-15-year-old girls --

120
00:06:27,583 --> 00:06:30,958
for example, ones interested in family planning.

121
00:06:32,208 --> 00:06:34,684
The broker agreed to provide the list.

122
00:06:34,708 --> 00:06:39,583
Imagine how dangerous that could be for our kids.

123
00:06:40,833 --> 00:06:44,809
But educational data brokers are just one example.

124
00:06:44,833 --> 00:06:49,518
The truth is, we cannot control how our children's data is profiled,

125
00:06:49,542 --> 00:06:52,958
yet it can significantly affect their lives.

126
00:06:54,167 --> 00:06:57,643
So we need to ask ourselves:

127
00:06:57,667 --> 00:07:02,351
can we trust these technologies when it comes to profiling our children?

128
00:07:02,375 --> 00:07:03,625
Can we really?

129
00:07:05,708 --> 00:07:06,958
My answer is "no."

130
00:07:07,792 --> 00:07:09,059
As an anthropologist,

131
00:07:09,083 --> 00:07:12,851
I believe artificial intelligence and predictive analytics can be great

132
00:07:12,875 --> 00:07:14,893
for tracking the course of a disease

133
00:07:14,917 --> 00:07:16,750
or for fighting climate change.
134
00:07:18,000 --> 00:07:19,643
But we need to abandon the belief

135
00:07:19,667 --> 00:07:23,351
that these technologies profile people objectively

136
00:07:23,375 --> 00:07:26,559
and that we can rely on them to make evidence-based decisions

137
00:07:26,583 --> 00:07:28,476
about human lives.

138
00:07:28,500 --> 00:07:31,059
Because they can't show us who a person really is.

139
00:07:31,083 --> 00:07:34,434
Data traces are not a mirror of who we are.

140
00:07:34,458 --> 00:07:36,559
Humans think one thing and say the opposite;

141
00:07:36,583 --> 00:07:39,018
they feel one way and act differently.

142
00:07:39,042 --> 00:07:41,518
Algorithmic predictions and digital practices

143
00:07:41,542 --> 00:07:46,708
cannot predict the complex system that is a human being.

144
00:07:48,417 --> 00:07:49,976
But on top of that,

145
00:07:50,000 --> 00:07:52,684
these technologies are always --

146
00:07:52,708 --> 00:07:53,976
always --

147
00:07:54,000 --> 00:07:55,917
biased in some way.

148
00:07:57,125 --> 00:08:02,184
You see, algorithms are sets of rules or steps

149
00:08:02,208 --> 00:08:05,917
designed to achieve a specific result, right?

150
00:08:06,833 --> 00:08:09,559
But these sets of rules or steps cannot be objective,

151
00:08:09,583 --> 00:08:11,726
because they've been designed by human beings

152
00:08:11,750 --> 00:08:13,476
within a specific cultural context

153
00:08:13,500 --> 00:08:16,000
and are shaped by specific cultural values.

154
00:08:16,667 --> 00:08:18,393
So when machines learn,

155
00:08:18,417 --> 00:08:20,667
they learn from biased algorithms,

156
00:08:21,625 --> 00:08:24,833
and they often learn from biased databases as well.

157
00:08:25,833 --> 00:08:29,559
Right now, we're seeing the first examples of algorithmic bias.

158
00:08:29,583 --> 00:08:33,083
And some of these examples are frankly terrifying.

159
00:08:34,500 --> 00:08:38,559
This year, the AI Now Institute in New York published a report

160
00:08:38,583 --> 00:08:40,976
revealing that the AI technologies

161
00:08:41,000 --> 00:08:44,476
being used for predictive policing

162
00:08:44,500 --> 00:08:47,625
have been trained on "dirty" data.

163
00:08:48,333 --> 00:08:51,226
This is basically data that was gathered

164
00:08:51,250 --> 00:08:55,434
during historical periods of known racial bias

165
00:08:55,458 --> 00:08:57,708
and non-transparent police practices.

166
00:08:58,542 --> 00:09:02,601
Because these technologies are trained on dirty data,

167
00:09:02,625 --> 00:09:04,059
they're not objective,

168
00:09:04,083 --> 00:09:08,601
and their outcomes only amplify and perpetuate

169
00:09:08,625 --> 00:09:10,250
police bias and error.

170
00:09:13,167 --> 00:09:16,309
So I think we are faced with a fundamental

171
00:09:16,333 --> 00:09:17,976
problem in our society.

172
00:09:18,000 --> 00:09:22,792
We are starting to trust technologies when it comes to profiling human beings.

173
00:09:23,750 --> 00:09:26,268
We know that in profiling humans,

174
00:09:26,292 --> 00:09:29,101
these technologies are always going to be biased

175
00:09:29,125 --> 00:09:31,851
and are never really going to be accurate.

176
00:09:31,875 --> 00:09:34,809
So what we need now is actually a political solution.
177
00:09:34,833 --> 00:09:39,542
We need governments to recognize that our data rights are human rights.

178
00:09:40,292 --> 00:09:44,375
(Applause and cheers)

179
00:09:47,833 --> 00:09:51,917
Until that happens, we cannot hope for a more just future.

180
00:09:52,750 --> 00:09:55,476
I worry that my daughters are going to be exposed

181
00:09:55,500 --> 00:09:59,226
to algorithmic discrimination and error.

182
00:09:59,250 --> 00:10:01,643
You see, the difference between me and my daughters is that

183
00:10:01,667 --> 00:10:04,851
there's no public record out there of my childhood.

184
00:10:04,875 --> 00:10:08,893
There's certainly no database of all the stupid things

185
00:10:08,917 --> 00:10:11,059
I did and thought as a teenager.

186
00:10:11,083 --> 00:10:12,583
(Laughter)

187
00:10:13,833 --> 00:10:16,583
But for my daughters, this may be different.

188
00:10:17,292 --> 00:10:20,476
The data being collected from them today

189
00:10:20,500 --> 00:10:24,309
may be used to judge them in the future

190
00:10:24,333 --> 00:10:27,292
and may come to stand in the way of their hopes and dreams.

191
00:10:28,583 --> 00:10:30,101
I think it's time.

192
00:10:30,125 --> 00:10:31,559
It's time that we act.

193
00:10:31,583 --> 00:10:34,059
It's time that we all work together,

194
00:10:34,083 --> 00:10:35,518
as individuals,

195
00:10:35,542 --> 00:10:38,059
as organizations and as institutions,

196
00:10:38,083 --> 00:10:41,184
and that we demand greater data justice, for ourselves

197
00:10:41,208 --> 00:10:42,601
and for our children,

198
00:10:42,625 --> 00:10:44,143
before it's too late.

199
00:10:44,167 --> 00:10:45,434
Thank you.

200
00:10:45,458 --> 00:10:46,875
(Applause)