I'm here to offer a new way of thinking about my field, artificial intelligence. I believe the purpose of AI is to empower humans with machine intelligence. And as machines get smarter, we get smarter. I call this "humanistic AI": artificial intelligence that meets human needs by collaborating with people and helping them grow.

Now, I'm delighted that the idea of an intelligent assistant has become such a big topic today. It's a fitting metaphor for how humans and AI work together. The AI I helped create is called Siri. You know Siri. Siri can understand what you intend and help you get it done. But there's one thing you may not know: we actually designed Siri as a humanistic AI, a conversational interface that can stand in for a person and help with the tasks of mobile computing.

For most of us today, the impact of this technology is that it makes things a little easier to do. But for my friend Daniel, the impact has been life-changing. You see, Daniel is a really sociable guy, but he's blind and quadriplegic, so he can't use the devices the rest of us take for granted. The last time I was at his house, his brother said, "Hang on, Daniel's not ready yet. He's talking to a woman he met online." I said, "That's great. How does he do it?" Daniel uses Siri to manage his social life. He can email, text and call without help from a caregiver.

That's interesting, isn't it? The irony is that artificial intelligence is helping him connect with another intelligent human being. And this is humanistic AI.

Another striking example is diagnosing cancer. When doctors suspect cancer, they take a sample and send it to a pathologist, who looks at it under a microscope. Pathologists look at hundreds of samples and thousands of cells a day. To support this task, some researchers built a classifier AI.
The classifier looks at the samples and says whether cancer is present or not. At first the classifier worked quite well, but unlike a human it couldn't always get it right. But when the researchers combined human intelligence with the machine's ability, accuracy reached 99.5 percent. Working in partnership with the AI cut the errors pathologists make when working alone by 85 percent. That's a lot of cancers that would otherwise have gone undetected.

Interestingly, the humans were good at rejecting false positives, while the machine was good at finding the signs that are hard to spot. The lesson here isn't about which one is better at the classification task, because those things keep changing. The lesson is that by combining the abilities of human and machine, we create a partnership that can perform a superhuman task. That is humanistic AI.

Let's take another example; this one is about speeding up design. Say you're an engineer and you want to design a frame for a drone. You build it in your favorite 3D design software, enter the shape and the materials, and then analyze its performance. That gives you one design. If you give those same tools to an AI, it can generate thousands of designs.

This video from Autodesk is amazing; it really is. It changes how we design things. The human engineer says what the design should achieve, and the machine says, "Here are the possibilities." Now the engineer's job is to choose, as a human, the option that best meets the goals of the design, drawing on human judgment and experience. The resulting product looks like something nature could have created, minus a few million years of evolution and all that unnecessary fur.

Now let's talk about where this humanistic AI could lead us if we choose to follow it. What kind of augmentation would we all like? How about cognitive augmentation? Instead of asking, "How smart can we make our machines?" let's ask, "How smart can our machines make us?"

Take memory, for example. Memory is the foundation of human intelligence.
But human memory is famously flawed. We're great at telling stories, but not so great at the details. And our memories fade over time. I mean, wherever the '60s went, I'm headed there too.

(Laughter)

But what if you had a memory as good as a computer's? What if you could remember the name of every person you ever met, how to pronounce it, the details of their family, even the last time you spoke with them? What if you had that memory your whole life, with an AI that could review all the conversations you've had and help you recall your relationships from long ago? What if an AI could read everything you've ever read and knew every song you've ever heard? From the tiniest cue, it could remind you of anything you've heard before. Imagine the ability you would have to make new connections and form new ideas.

And what about our bodies? What if we could remember the effects of every meal we ate, every pill we took, everything we took before bed? We could do our own science on our own data about what makes us feel good. Imagine how much that would help us manage allergies and chronic disease.

I believe AI can augment personal memory. I can't say exactly when it will happen or what it will depend on, but I'm sure it's inevitable, because the very things that make AI successful today, the availability of comprehensive data and the ability of machines to make use of that data in everyday life, apply here as well. That data is available to all of us right now, because we lead digitally mediated lives, on our phones and online.

In my view, a personal memory is a private memory. We choose what should be remembered, and it should be kept absolutely private. For most of us, the impact of an augmented personal memory would be a gain in mental ability, and perhaps a social gain as well.
But for the millions of people living with Alzheimer's and dementia, the difference an augmented memory could make is the difference between a life of isolation and a life of dignity and connection.

We are right now in the middle of a renaissance in artificial intelligence. In just the past few years, we have seen solutions emerge to AI problems we had struggled with for years: problems like understanding speech, text and images. We get to choose how to use this powerful technology. We can use AI to automate and compete with us, or we can use it to augment us and collaborate with us, to overcome our cognitive limitations and help us do better at what we do. And as we discover new ways to give machines intelligence, we can put that intelligence to work helping every person who needs assistance, regardless of their circumstances.

That is why, every time a machine gets smarter, we get smarter. That is an AI worth spreading.

Thank you.

(Applause)