[Script Info]
Title:
[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:01.62,0:00:03.91,Default,,0000,0000,0000,,It will also be re\Nbroadcasted at a later date
Dialogue: 0,0:00:03.91,0:00:08.88,Default,,0000,0000,0000,,on Comcast eight, RCN 82 SAS as per 196.
Dialogue: 0,0:00:08.88,0:00:10.73,Default,,0000,0000,0000,,For public testimony, written comments
Dialogue: 0,0:00:10.73,0:00:12.68,Default,,0000,0000,0000,,may be sent to the committee email
Dialogue: 0,0:00:12.68,0:00:15.87,Default,,0000,0000,0000,,at ccc.go@boston.gov
Dialogue: 0,0:00:15.87,0:00:18.71,Default,,0000,0000,0000,,and it will be made part\Nof the official record.
Dialogue: 0,0:00:18.71,0:00:19.73,Default,,0000,0000,0000,,This is an ordinance
Dialogue: 0,0:00:19.73,0:00:22.29,Default,,0000,0000,0000,,that would ban the use\Nof facial surveillance
Dialogue: 0,0:00:22.29,0:00:25.65,Default,,0000,0000,0000,,by the city of Boston or any\Nofficial in the city of Boston.
Dialogue: 0,0:00:25.65,0:00:27.13,Default,,0000,0000,0000,,The proposal would also prohibit
Dialogue: 0,0:00:27.13,0:00:29.01,Default,,0000,0000,0000,,entering into agreements, contracts
Dialogue: 0,0:00:29.01,0:00:32.17,Default,,0000,0000,0000,,to obtain face\Nsurveillance with third parties.
Dialogue: 0,0:00:32.17,0:00:34.32,Default,,0000,0000,0000,,I'm gonna turn it now\Nover to the lead sponsors,
Dialogue: 0,0:00:34.32,0:00:37.97,Default,,0000,0000,0000,,councilor Wu and councilor Arroyo.
Dialogue: 0,0:00:37.97,0:00:39.39,Default,,0000,0000,0000,,And then I will turn it to my colleagues
Dialogue: 0,0:00:39.39,0:00:41.75,Default,,0000,0000,0000,,in order of arrival, for opening remarks.
Dialogue: 0,0:00:41.75,0:00:45.38,Default,,0000,0000,0000,,And just to acknowledge my colleagues
Dialogue: 0,0:00:45.38,0:00:48.13,Default,,0000,0000,0000,,and I believe the order that I had so far,
Dialogue: 0,0:00:48.13,0:00:51.89,Default,,0000,0000,0000,,again it will be councilor Arroyo,
Dialogue: 0,0:00:51.89,0:00:54.73,Default,,0000,0000,0000,,councilor Wu, councilor\NBreadon, councilor Bok,
Dialogue: 0,0:00:55.58,0:00:58.26,Default,,0000,0000,0000,,councilor Mejia, councilor Campbell,
Dialogue: 0,0:00:58.26,0:01:03.26,Default,,0000,0000,0000,,and then I had councilor\NO'Malley, councilor Essaibi George
Dialogue: 0,0:01:03.61,0:01:05.86,Default,,0000,0000,0000,,and then I'm sorry, I started talking.
Dialogue: 0,0:01:05.86,0:01:07.97,Default,,0000,0000,0000,,So other councilors have joined us.
Dialogue: 0,0:01:11.93,0:01:13.84,Default,,0000,0000,0000,,MICHAEL: Good morning madam chair.
Dialogue: 0,0:01:13.84,0:01:15.44,Default,,0000,0000,0000,,Councilor Flaherty was before me
Dialogue: 0,0:01:15.44,0:01:17.22,Default,,0000,0000,0000,,councilor Edwards, thanks.
Dialogue: 0,0:01:17.22,0:01:18.92,Default,,0000,0000,0000,,I apologize councilor Flaherty,
Dialogue: 0,0:01:18.92,0:01:21.98,Default,,0000,0000,0000,,then councilor Campbell,\Nthen councilor O'Malley
Dialogue: 0,0:01:21.98,0:01:23.73,Default,,0000,0000,0000,,and councilor Essaibi George.
Dialogue: 0,0:01:26.69,0:01:28.94,Default,,0000,0000,0000,,I will now turn it over\Nto the lead sponsors.
Dialogue: 0,0:01:29.79,0:01:31.50,Default,,0000,0000,0000,,Does that start with me?
Dialogue: 0,0:01:31.50,0:01:32.42,Default,,0000,0000,0000,,CHAIR: Sure.
Dialogue: 0,0:01:32.42,0:01:34.24,Default,,0000,0000,0000,,Thank you, madam chair.
Dialogue: 0,0:01:34.24,0:01:36.26,Default,,0000,0000,0000,,I think we have excellent panelists
Dialogue: 0,0:01:36.26,0:01:38.80,Default,,0000,0000,0000,,to kind of go into the intricacies
Dialogue: 0,0:01:38.80,0:01:40.48,Default,,0000,0000,0000,,of the facial recognition ban.
Dialogue: 0,0:01:40.48,0:01:42.87,Default,,0000,0000,0000,,But to just kind of summarize it,
Dialogue: 0,0:01:42.87,0:01:45.97,Default,,0000,0000,0000,,this facial recognition ban\Ncreates a community process
Dialogue: 0,0:01:46.98,0:01:48.82,Default,,0000,0000,0000,,that makes our systems more inclusive
Dialogue: 0,0:01:48.82,0:01:50.63,Default,,0000,0000,0000,,of community consent while\Nadding democratic oversight,
Dialogue: 0,0:01:50.63,0:01:52.84,Default,,0000,0000,0000,,where there currently is none.
Dialogue: 0,0:01:52.84,0:01:53.89,Default,,0000,0000,0000,,We will be joining our neighbors
Dialogue: 0,0:01:53.89,0:01:55.98,Default,,0000,0000,0000,,in Cambridge and Somerville.
Dialogue: 0,0:01:55.98,0:01:58.61,Default,,0000,0000,0000,,And specifically when it comes\Nto facial recognition tech,
Dialogue: 0,0:01:58.61,0:02:01.10,Default,,0000,0000,0000,,it doesn't work, it's not good.
Dialogue: 0,0:02:01.10,0:02:03.72,Default,,0000,0000,0000,,It's been proven through the data
Dialogue: 0,0:02:03.72,0:02:06.09,Default,,0000,0000,0000,,to be less accurate for\Npeople with darker skin.
Dialogue: 0,0:02:06.09,0:02:09.90,Default,,0000,0000,0000,,A study by one of our\Npanelists Joy Buolamwini,
Dialogue: 0,0:02:09.90,0:02:11.22,Default,,0000,0000,0000,,researcher at MIT,
Dialogue: 0,0:02:11.22,0:02:12.05,Default,,0000,0000,0000,,found that black women
Dialogue: 0,0:02:12.05,0:02:13.96,Default,,0000,0000,0000,,were 35% more likely than white men
Dialogue: 0,0:02:13.96,0:02:16.43,Default,,0000,0000,0000,,to be misclassified by face\Nsurveillance technology.
Dialogue: 0,0:02:16.43,0:02:18.73,Default,,0000,0000,0000,,The reality is that face surveillance
Dialogue: 0,0:02:18.73,0:02:21.68,Default,,0000,0000,0000,,isn't very effective in its current form.
Dialogue: 0,0:02:24.33,0:02:26.16,Default,,0000,0000,0000,,According to records obtained by the ACLU,
Dialogue: 0,0:02:26.16,0:02:28.85,Default,,0000,0000,0000,,one manufacturer promoting\Nthis technology in Massachusetts
Dialogue: 0,0:02:28.85,0:02:31.64,Default,,0000,0000,0000,,admitted that it might only\Nwork about 30% of the time.
Dialogue: 0,0:02:31.64,0:02:34.37,Default,,0000,0000,0000,,Currently, my understanding
Dialogue: 0,0:02:34.37,0:02:37.53,Default,,0000,0000,0000,,is that the Boston Police\NDepartment does not use this.
Dialogue: 0,0:02:37.53,0:02:39.43,Default,,0000,0000,0000,,This isn't a ban on surveillance.
Dialogue: 0,0:02:39.43,0:02:42.88,Default,,0000,0000,0000,,This is simply a ban on facial\Nrecognition surveillance,
Dialogue: 0,0:02:42.88,0:02:45.55,Default,,0000,0000,0000,,which I think the data\Nhas shown doesn't work.
Dialogue: 0,0:02:45.55,0:02:47.88,Default,,0000,0000,0000,,BPD doesn't use it because,
Dialogue: 0,0:02:47.88,0:02:50.92,Default,,0000,0000,0000,,and I will allow commissioner\NGross to speak on it.
Dialogue: 0,0:02:50.92,0:02:52.80,Default,,0000,0000,0000,,But my understanding from what I've heard
Dialogue: 0,0:02:52.80,0:02:53.99,Default,,0000,0000,0000,,and past comments from commissioner Gross
Dialogue: 0,0:02:53.99,0:02:55.84,Default,,0000,0000,0000,,is that they don't wanna use technology
Dialogue: 0,0:02:55.84,0:02:57.09,Default,,0000,0000,0000,,that doesn't work well.
Dialogue: 0,0:02:57.09,0:03:01.82,Default,,0000,0000,0000,,And so this ban isn't necessarily\Neven a permanent ban,
Dialogue: 0,0:03:01.82,0:03:05.60,Default,,0000,0000,0000,,it just creates a process\Nwhere we are banning something
Dialogue: 0,0:03:05.60,0:03:07.03,Default,,0000,0000,0000,,that doesn't have a process
Dialogue: 0,0:03:07.03,0:03:09.78,Default,,0000,0000,0000,,that doesn't have community trust
Dialogue: 0,0:03:09.78,0:03:12.00,Default,,0000,0000,0000,,and doesn't have community consent
Dialogue: 0,0:03:12.00,0:03:13.65,Default,,0000,0000,0000,,and which would require\Nsome democratic oversight
Dialogue: 0,0:03:13.65,0:03:15.25,Default,,0000,0000,0000,,if they ever do want to implement it,
Dialogue: 0,0:03:15.25,0:03:16.08,Default,,0000,0000,0000,,if the technology in the\Nfuture ever does improve.
Dialogue: 0,0:03:17.33,0:03:18.71,Default,,0000,0000,0000,,And so with that, I cede\Nthe rest of my time.
Dialogue: 0,0:03:18.71,0:03:19.97,Default,,0000,0000,0000,,Thank you, madam chair.
Dialogue: 0,0:03:19.97,0:03:24.59,Default,,0000,0000,0000,,And thank you also to my\Nco-sponsor councilor Wu.
Dialogue: 0,0:03:26.22,0:03:28.62,Default,,0000,0000,0000,,Thank you very much madam chair.
Dialogue: 0,0:03:28.62,0:03:33.07,Default,,0000,0000,0000,,Thank you also to councilor Arroyo
Dialogue: 0,0:03:33.07,0:03:35.65,Default,,0000,0000,0000,,and all of the coalition partners
Dialogue: 0,0:03:35.65,0:03:39.12,Default,,0000,0000,0000,,that have been working\Non this for many months
Dialogue: 0,0:03:39.12,0:03:42.49,Default,,0000,0000,0000,,and providing all sorts of\Nincredibly important feedback.
Dialogue: 0,0:03:43.83,0:03:47.68,Default,,0000,0000,0000,,We scheduled this hearing\Nmany weeks ago at this point,
Dialogue: 0,0:03:47.68,0:03:48.82,Default,,0000,0000,0000,,I think over a month ago.
Dialogue: 0,0:03:48.82,0:03:52.80,Default,,0000,0000,0000,,And it just so happened that\Nthe timing of it now has lined
Dialogue: 0,0:03:52.80,0:03:56.56,Default,,0000,0000,0000,,up with a moment of great national trauma.
Dialogue: 0,0:03:56.56,0:03:57.67,Default,,0000,0000,0000,,And in this moment,
Dialogue: 0,0:03:57.67,0:04:02.33,Default,,0000,0000,0000,,the responsibility is on\Neach one of us to step up,
Dialogue: 0,0:04:02.33,0:04:06.18,Default,,0000,0000,0000,,to truly address systemic\Nracism and systemic oppression,
Dialogue: 0,0:04:06.18,0:04:07.01,Default,,0000,0000,0000,,but the responsibility is,
Dialogue: 0,0:04:07.01,0:04:10.68,Default,,0000,0000,0000,,especially on elected\Nofficials, policy makers,
Dialogue: 0,0:04:10.68,0:04:13.58,Default,,0000,0000,0000,,those of us who have the power
Dialogue: 0,0:04:13.58,0:04:16.35,Default,,0000,0000,0000,,and the ability to make change now.
Dialogue: 0,0:04:16.35,0:04:20.29,Default,,0000,0000,0000,,So I am proud that\NBoston has the potential
Dialogue: 0,0:04:20.29,0:04:22.91,Default,,0000,0000,0000,,to join our sister cities\Nacross Massachusetts
Dialogue: 0,0:04:22.91,0:04:26.02,Default,,0000,0000,0000,,in taking this immediate\Nstep to ban a technology
Dialogue: 0,0:04:26.02,0:04:28.92,Default,,0000,0000,0000,,that has been proven to\Nbe racially discriminatory
Dialogue: 0,0:04:28.92,0:04:33.42,Default,,0000,0000,0000,,and that threatens basic rights.
Dialogue: 0,0:04:33.42,0:04:35.61,Default,,0000,0000,0000,,We are thankful that Boston police
Dialogue: 0,0:04:35.61,0:04:37.15,Default,,0000,0000,0000,,agree with that assessment
Dialogue: 0,0:04:37.15,0:04:39.52,Default,,0000,0000,0000,,and do not use facial recognition,
Dialogue: 0,0:04:39.52,0:04:42.44,Default,,0000,0000,0000,,facial surveillance today in Boston,
Dialogue: 0,0:04:42.44,0:04:45.81,Default,,0000,0000,0000,,and are looking to\Ncodify that understanding
Dialogue: 0,0:04:45.81,0:04:49.29,Default,,0000,0000,0000,,so that with the various\Nmechanisms for technology
Dialogue: 0,0:04:49.29,0:04:53.71,Default,,0000,0000,0000,,and upgrades and changing circumstances
Dialogue: 0,0:04:53.71,0:04:56.24,Default,,0000,0000,0000,,of different administrations\Nand personalities,
Dialogue: 0,0:04:56.24,0:04:58.06,Default,,0000,0000,0000,,that we just have it on the books
Dialogue: 0,0:04:58.06,0:05:00.07,Default,,0000,0000,0000,,that protections come first,
Dialogue: 0,0:05:00.07,0:05:02.46,Default,,0000,0000,0000,,because we truly believe\Nthat public health
Dialogue: 0,0:05:02.46,0:05:05.57,Default,,0000,0000,0000,,and public safety should\Nbe grounded in trust.
Dialogue: 0,0:05:05.57,0:05:09.55,Default,,0000,0000,0000,,So the chair gave a wonderful description
Dialogue: 0,0:05:09.55,0:05:11.19,Default,,0000,0000,0000,,of this ordinance.
Dialogue: 0,0:05:11.19,0:05:13.97,Default,,0000,0000,0000,,I am looking forward to the discussion,
Dialogue: 0,0:05:13.97,0:05:16.65,Default,,0000,0000,0000,,but most of all I'm grateful to colleagues
Dialogue: 0,0:05:16.65,0:05:18.10,Default,,0000,0000,0000,,and everyone who has contributed
Dialogue: 0,0:05:18.10,0:05:19.33,Default,,0000,0000,0000,,to this conversation,
Dialogue: 0,0:05:19.33,0:05:22.45,Default,,0000,0000,0000,,as well as the larger conversation about
Dialogue: 0,0:05:22.45,0:05:25.69,Default,,0000,0000,0000,,community centered\Noversight of surveillance,
Dialogue: 0,0:05:25.69,0:05:29.22,Default,,0000,0000,0000,,which will be in a\Nseparate but related docket
Dialogue: 0,0:05:29.22,0:05:30.77,Default,,0000,0000,0000,,that will be moving\Nforward on the council.
Dialogue: 0,0:05:30.77,0:05:31.77,Default,,0000,0000,0000,,Thank you very much.
Dialogue: 0,0:05:35.48,0:05:38.58,Default,,0000,0000,0000,,Councilor Breadon, opening remarks.
Dialogue: 0,0:05:41.17,0:05:43.69,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,0:05:43.69,0:05:45.73,Default,,0000,0000,0000,,Thank you, madam chair,
Dialogue: 0,0:05:45.73,0:05:50.16,Default,,0000,0000,0000,,thank you to the makers of this ordinance.
Dialogue: 0,0:05:50.16,0:05:53.89,Default,,0000,0000,0000,,I really look forward to the\Nconversation this afternoon.
Dialogue: 0,0:05:53.89,0:05:57.09,Default,,0000,0000,0000,,I'm very proud to participate\Nin this discussion.
Dialogue: 0,0:05:57.09,0:06:00.83,Default,,0000,0000,0000,,I think it's the proper\Nand right thing to do
Dialogue: 0,0:06:00.83,0:06:05.10,Default,,0000,0000,0000,,to really consider deeply\Nall the implications
Dialogue: 0,0:06:05.10,0:06:07.23,Default,,0000,0000,0000,,of new technology that hasn't\Nbeen proven to be effective
Dialogue: 0,0:06:07.23,0:06:10.66,Default,,0000,0000,0000,,before introducing it in\Nany city in Massachusetts.
Dialogue: 0,0:06:10.66,0:06:13.61,Default,,0000,0000,0000,,So I really look forward to learning more
Dialogue: 0,0:06:13.61,0:06:16.91,Default,,0000,0000,0000,,and hearing from the panelists\Nthis afternoon, thank you.
Dialogue: 0,0:06:19.78,0:06:21.78,Default,,0000,0000,0000,,CHAIR: Councilor Bok.
Dialogue: 0,0:06:26.08,0:06:27.08,Default,,0000,0000,0000,,Ah, yes.
Dialogue: 0,0:06:27.08,0:06:27.91,Default,,0000,0000,0000,,Thank you, madam chair.
Dialogue: 0,0:06:27.91,0:06:29.80,Default,,0000,0000,0000,,And I hope everyone will forgive me
Dialogue: 0,0:06:29.80,0:06:30.63,Default,,0000,0000,0000,,for being outside for a minute.
Dialogue: 0,0:06:30.63,0:06:32.26,Default,,0000,0000,0000,,I wanted to just say,
Dialogue: 0,0:06:32.26,0:06:34.04,Default,,0000,0000,0000,,I'm excited about this conversation today.
Dialogue: 0,0:06:34.04,0:06:35.87,Default,,0000,0000,0000,,I think it's a piece of action\Nwe should absolutely take.
Dialogue: 0,0:06:35.87,0:06:37.17,Default,,0000,0000,0000,,And I just wanna say that,
Dialogue: 0,0:06:38.06,0:06:39.86,Default,,0000,0000,0000,,I think it's really important\Nthat this technology
Dialogue: 0,0:06:39.86,0:06:41.08,Default,,0000,0000,0000,,has major shortcomings
Dialogue: 0,0:06:41.08,0:06:44.24,Default,,0000,0000,0000,,and that also it has a\Nracial bias built into it.
Dialogue: 0,0:06:44.24,0:06:46.60,Default,,0000,0000,0000,,Those are really good\Nreasons not to use it,
Dialogue: 0,0:06:46.60,0:06:51.41,Default,,0000,0000,0000,,but honestly, even if those did not hold,
Dialogue: 0,0:06:51.41,0:06:52.90,Default,,0000,0000,0000,,I think that in this country,
Dialogue: 0,0:06:52.90,0:06:56.72,Default,,0000,0000,0000,,we run the risk sometimes of\Njust starting to do things
Dialogue: 0,0:06:56.72,0:06:59.18,Default,,0000,0000,0000,,that new technology can do,
Dialogue: 0,0:06:59.18,0:07:02.17,Default,,0000,0000,0000,,without asking ourselves like\Nas a democratic populace,
Dialogue: 0,0:07:02.17,0:07:04.66,Default,,0000,0000,0000,,as a community, is this\Nsomething that we should do?
Dialogue: 0,0:07:04.66,0:07:06.45,Default,,0000,0000,0000,,And I think in the process,
Dialogue: 0,0:07:06.45,0:07:07.91,Default,,0000,0000,0000,,we as Americans have given up
Dialogue: 0,0:07:07.91,0:07:09.97,Default,,0000,0000,0000,,a lot of our rights and privacy,
Dialogue: 0,0:07:09.97,0:07:13.35,Default,,0000,0000,0000,,frankly as often as not\Nto private corporations
Dialogue: 0,0:07:13.35,0:07:14.94,Default,,0000,0000,0000,,as to government.
Dialogue: 0,0:07:14.94,0:07:17.44,Default,,0000,0000,0000,,But I think that it's really important
Dialogue: 0,0:07:17.44,0:07:21.19,Default,,0000,0000,0000,,to just remember that there's\Na true public interest
Dialogue: 0,0:07:21.19,0:07:24.56,Default,,0000,0000,0000,,in a society that is built more on trust
Dialogue: 0,0:07:24.56,0:07:25.44,Default,,0000,0000,0000,,than on surveillance.
Dialogue: 0,0:07:25.44,0:07:28.10,Default,,0000,0000,0000,,And so I think it's really important
Dialogue: 0,0:07:28.10,0:07:29.06,Default,,0000,0000,0000,,for us to draw this line in the sand
Dialogue: 0,0:07:29.06,0:07:31.61,Default,,0000,0000,0000,,so that if the technology improves
Dialogue: 0,0:07:31.61,0:07:32.89,Default,,0000,0000,0000,,and some of those arguments
Dialogue: 0,0:07:32.89,0:07:35.53,Default,,0000,0000,0000,,that are based on a kind\Nof its functionality
Dialogue: 0,0:07:35.53,0:07:37.86,Default,,0000,0000,0000,,start to slip away, that\Nwe're still gonna have
Dialogue: 0,0:07:37.86,0:07:39.27,Default,,0000,0000,0000,,a real conversation together
Dialogue: 0,0:07:39.27,0:07:42.73,Default,,0000,0000,0000,,about how we wanna live in
Dialogue: 0,0:07:42.73,0:07:44.41,Default,,0000,0000,0000,,and what kind of a society we want to be.
Dialogue: 0,0:07:44.41,0:07:48.30,Default,,0000,0000,0000,,So to me, putting a\Nmoratorium ban like this
Dialogue: 0,0:07:48.30,0:07:50.23,Default,,0000,0000,0000,,into place, makes a ton of sense.
Dialogue: 0,0:07:50.23,0:07:52.04,Default,,0000,0000,0000,,And I'm excited for the conversation
Dialogue: 0,0:07:52.04,0:07:53.46,Default,,0000,0000,0000,,and grateful to all the advocates.
Dialogue: 0,0:07:53.46,0:07:54.91,Default,,0000,0000,0000,,Thank you so much madam chair
Dialogue: 0,0:07:56.18,0:07:57.68,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,0:07:57.68,0:07:59.01,Default,,0000,0000,0000,,Councilor Mejia.
Dialogue: 0,0:08:02.78,0:08:04.13,Default,,0000,0000,0000,,Yes, thank you.
Dialogue: 0,0:08:04.13,0:08:06.09,Default,,0000,0000,0000,,Sorry about my audio.
Dialogue: 0,0:08:06.09,0:08:10.21,Default,,0000,0000,0000,,Thank you, chairwoman Edwards,
Dialogue: 0,0:08:10.21,0:08:12.22,Default,,0000,0000,0000,,and thank you to the makers, councilor Wu,
Dialogue: 0,0:08:12.22,0:08:14.83,Default,,0000,0000,0000,,and I applaud you for your\Nleadership on this issue.
Dialogue: 0,0:08:14.83,0:08:17.17,Default,,0000,0000,0000,,For me this issue is\Npersonal and professional
Dialogue: 0,0:08:17.17,0:08:19.48,Default,,0000,0000,0000,,as a Dominican woman who\Nclaims her black roots.
Dialogue: 0,0:08:19.48,0:08:21.68,Default,,0000,0000,0000,,Facial recognition technology
Dialogue: 0,0:08:21.68,0:08:24.61,Default,,0000,0000,0000,,misidentifies people like me by 35%.
Dialogue: 0,0:08:24.61,0:08:26.68,Default,,0000,0000,0000,,We're in a time where our technology
Dialogue: 0,0:08:26.68,0:08:28.84,Default,,0000,0000,0000,,is outpacing our morals.
Dialogue: 0,0:08:28.84,0:08:32.57,Default,,0000,0000,0000,,We've got a 2020 technology\Nwith a 1620 state of mind,
Dialogue: 0,0:08:32.57,0:08:35.36,Default,,0000,0000,0000,,but that's thinking.
Dialogue: 0,0:08:35.36,0:08:38.22,Default,,0000,0000,0000,,As a city, we need to\Npractice extreme caution
Dialogue: 0,0:08:38.22,0:08:40.58,Default,,0000,0000,0000,,over facial recognition technology.
Dialogue: 0,0:08:40.58,0:08:42.37,Default,,0000,0000,0000,,And we should be concerned
Dialogue: 0,0:08:42.37,0:08:45.15,Default,,0000,0000,0000,,not only when this technology goes wrong,
Dialogue: 0,0:08:45.15,0:08:46.95,Default,,0000,0000,0000,,but also when it goes right.
Dialogue: 0,0:08:46.95,0:08:48.09,Default,,0000,0000,0000,,We have a lot of work to do
Dialogue: 0,0:08:48.09,0:08:50.88,Default,,0000,0000,0000,,when it comes to building\Ntrust in our government.
Dialogue: 0,0:08:50.88,0:08:54.06,Default,,0000,0000,0000,,Working against this facial\Nrecognition technology
Dialogue: 0,0:08:54.06,0:08:56.85,Default,,0000,0000,0000,,is a good step and I'm happy to hear
Dialogue: 0,0:08:56.85,0:08:59.41,Default,,0000,0000,0000,,there's pushback against\Nthis kind of surveillance
Dialogue: 0,0:08:59.41,0:09:01.40,Default,,0000,0000,0000,,on all sides of the issue.
Dialogue: 0,0:09:01.40,0:09:03.82,Default,,0000,0000,0000,,Like the makers of the\Nordinance mentioned,
Dialogue: 0,0:09:03.82,0:09:06.95,Default,,0000,0000,0000,,now, right now, there's not a desire
Dialogue: 0,0:09:06.95,0:09:09.28,Default,,0000,0000,0000,,to put facial recognition\Ntechnology in place.
Dialogue: 0,0:09:09.28,0:09:13.68,Default,,0000,0000,0000,,And this ordinance is an\Nopportunity to put that into law.
Dialogue: 0,0:09:15.85,0:09:17.69,Default,,0000,0000,0000,,I look forward to the discussion
Dialogue: 0,0:09:17.69,0:09:19.66,Default,,0000,0000,0000,,and I hope to continue\Nto push on this issue
Dialogue: 0,0:09:19.66,0:09:22.09,Default,,0000,0000,0000,,alongside councilor Wu and Arroyo,
Dialogue: 0,0:09:22.09,0:09:23.22,Default,,0000,0000,0000,,thank you so much.
Dialogue: 0,0:09:25.94,0:09:27.29,Default,,0000,0000,0000,,Thank you very much.
Dialogue: 0,0:09:27.29,0:09:28.75,Default,,0000,0000,0000,,Councilor Flaherty.
Dialogue: 0,0:09:28.75,0:09:30.07,Default,,0000,0000,0000,,MICHAEL: Thank you madam chair
Dialogue: 0,0:09:30.07,0:09:32.03,Default,,0000,0000,0000,,and the sponsors for this hearing.
Dialogue: 0,0:09:32.03,0:09:35.74,Default,,0000,0000,0000,,As I've stated in the past\Nthe technology updates
Dialogue: 0,0:09:35.74,0:09:39.23,Default,,0000,0000,0000,,and advances have provided\Nus with many advantages,
Dialogue: 0,0:09:39.23,0:09:41.51,Default,,0000,0000,0000,,but at the same time,
Dialogue: 0,0:09:41.51,0:09:44.30,Default,,0000,0000,0000,,they've also proven to be deeply flawed,
Dialogue: 0,0:09:44.30,0:09:46.44,Default,,0000,0000,0000,,unreliable and disproportionately impacting
Dialogue: 0,0:09:46.44,0:09:48.94,Default,,0000,0000,0000,,communities of color.
Dialogue: 0,0:09:48.94,0:09:49.77,Default,,0000,0000,0000,,This technology,
Dialogue: 0,0:09:49.77,0:09:51.92,Default,,0000,0000,0000,,just for everyone's full\Ndisclosure if you're listening,
Dialogue: 0,0:09:51.92,0:09:56.32,Default,,0000,0000,0000,,this technology is currently\Nnot being used by the BPD
Dialogue: 0,0:09:56.32,0:09:59.60,Default,,0000,0000,0000,,for that reason and the\Ncommissioner is here
Dialogue: 0,0:09:59.60,0:10:02.28,Default,,0000,0000,0000,,and good afternoon to the commissioner.
Dialogue: 0,0:10:02.28,0:10:04.52,Default,,0000,0000,0000,,I appreciate the work\Nthat you've been doing
Dialogue: 0,0:10:04.52,0:10:05.35,Default,,0000,0000,0000,,through all of this.
Dialogue: 0,0:10:05.35,0:10:07.52,Default,,0000,0000,0000,,Whether it's just the\Nday-to-day public safety,
Dialogue: 0,0:10:07.52,0:10:08.54,Default,,0000,0000,0000,,rigors of your job,
Dialogue: 0,0:10:08.54,0:10:10.88,Default,,0000,0000,0000,,or it's been through\Nthe COVID-19 response,
Dialogue: 0,0:10:10.88,0:10:13.87,Default,,0000,0000,0000,,or it's been, most recently and tragically
Dialogue: 0,0:10:13.87,0:10:18.24,Default,,0000,0000,0000,,in response to George\NFloyd's horrific death
Dialogue: 0,0:10:19.36,0:10:22.07,Default,,0000,0000,0000,,and everything that has come from that.
Dialogue: 0,0:10:22.07,0:10:22.90,Default,,0000,0000,0000,,I wanna make sure
Dialogue: 0,0:10:22.90,0:10:27.80,Default,,0000,0000,0000,,that I'm gonna continue\Nto remain supportive
Dialogue: 0,0:10:29.17,0:10:32.52,Default,,0000,0000,0000,,of setting transparent limits\Nin creating a public process
Dialogue: 0,0:10:32.52,0:10:36.32,Default,,0000,0000,0000,,by which we assess the\Nuse of these technologies.
Dialogue: 0,0:10:36.32,0:10:38.91,Default,,0000,0000,0000,,I know that we've been able\Nto solve a lot of crimes
Dialogue: 0,0:10:38.91,0:10:40.56,Default,,0000,0000,0000,,through video surveillance
Dialogue: 0,0:10:40.56,0:10:42.61,Default,,0000,0000,0000,,and also through video, quite frankly,
Dialogue: 0,0:10:42.61,0:10:47.09,Default,,0000,0000,0000,,and also they've led\Nto justice for families
Dialogue: 0,0:10:47.09,0:10:50.38,Default,,0000,0000,0000,,and as someone that has lost\Na cousin due to murder,
Dialogue: 0,0:10:50.38,0:10:54.02,Default,,0000,0000,0000,,but for us, it was surveillance
Dialogue: 0,0:10:54.02,0:10:58.63,Default,,0000,0000,0000,,and it was DNA that led to\Nmy family getting justice.
Dialogue: 0,0:10:58.63,0:11:02.68,Default,,0000,0000,0000,,So looking forward to\Nlearning more about this
Dialogue: 0,0:11:02.68,0:11:05.06,Default,,0000,0000,0000,,and the commissioner's perspective on it,
Dialogue: 0,0:11:05.06,0:11:08.05,Default,,0000,0000,0000,,but also recognizing that collectively
Dialogue: 0,0:11:08.05,0:11:10.48,Default,,0000,0000,0000,,we need to make sure that\Nwe're putting forth decisions
Dialogue: 0,0:11:10.48,0:11:13.20,Default,,0000,0000,0000,,that make the most sense for Boston.
Dialogue: 0,0:11:13.20,0:11:15.93,Default,,0000,0000,0000,,I'll get real parochial here.
Dialogue: 0,0:11:15.93,0:11:18.93,Default,,0000,0000,0000,,Quite frankly, I'm not really\Nconcerned about Cambridge
Dialogue: 0,0:11:18.93,0:11:20.58,Default,,0000,0000,0000,,and Somerville, I wanna make sure
Dialogue: 0,0:11:20.58,0:11:22.52,Default,,0000,0000,0000,,that whatever we're doing\Nin Boston is impacting
Dialogue: 0,0:11:22.52,0:11:23.60,Default,,0000,0000,0000,,all of our neighborhoods.
Dialogue: 0,0:11:23.60,0:11:24.88,Default,,0000,0000,0000,,The neighborhoods that I represent
Dialogue: 0,0:11:24.88,0:11:26.22,Default,,0000,0000,0000,,as a citywide city councilor.
Dialogue: 0,0:11:26.22,0:11:28.44,Default,,0000,0000,0000,,And making sure that we\Nhave community input,
Dialogue: 0,0:11:28.44,0:11:29.27,Default,,0000,0000,0000,,most importantly.
Dialogue: 0,0:11:29.27,0:11:31.33,Default,,0000,0000,0000,,We work for the people\Nof Boston, as a council
Dialogue: 0,0:11:31.33,0:11:33.85,Default,,0000,0000,0000,,and obviously as the\Ncommissioner and police officers,
Dialogue: 0,0:11:33.85,0:11:36.80,Default,,0000,0000,0000,,we work for the residents and\Nthe taxpayers of our city.
Dialogue: 0,0:11:36.80,0:11:38.97,Default,,0000,0000,0000,,So they need a seat at the table
Dialogue: 0,0:11:38.97,0:11:39.80,Default,,0000,0000,0000,,and we wanna hear from them.
Dialogue: 0,0:11:39.80,0:11:41.41,Default,,0000,0000,0000,,If the technology enhances
Dialogue: 0,0:11:41.41,0:11:43.63,Default,,0000,0000,0000,,and it helps us as a public\Nsafety tool, great.
Dialogue: 0,0:11:43.63,0:11:45.39,Default,,0000,0000,0000,,But as it stands,
Dialogue: 0,0:11:45.39,0:11:49.19,Default,,0000,0000,0000,,it's still unreliable and\Nthere are flaws to it.
Dialogue: 0,0:11:49.19,0:11:51.80,Default,,0000,0000,0000,,So as a result that's\Nwhere I'm gonna be on this.
Dialogue: 0,0:11:51.80,0:11:52.90,Default,,0000,0000,0000,,Thank you madam chair.
Dialogue: 0,0:11:53.92,0:11:55.28,Default,,0000,0000,0000,,Thank you very much.
Dialogue: 0,0:11:55.28,0:11:56.42,Default,,0000,0000,0000,,Councilor Campbell.
Dialogue: 0,0:12:00.24,0:12:02.96,Default,,0000,0000,0000,,Thank you, madam chair and thank you,
Dialogue: 0,0:12:02.96,0:12:04.43,Default,,0000,0000,0000,,I know you're gonna run a tight ship
Dialogue: 0,0:12:04.43,0:12:05.83,Default,,0000,0000,0000,,given all that's happening today,
Dialogue: 0,0:12:05.83,0:12:08.19,Default,,0000,0000,0000,,including the funeral, of\Ncourse for George Floyd.
Dialogue: 0,0:12:08.19,0:12:09.48,Default,,0000,0000,0000,,So thank you very much.
Dialogue: 0,0:12:10.36,0:12:11.19,Default,,0000,0000,0000,,Thank you also to, of course,
Dialogue: 0,0:12:11.19,0:12:13.41,Default,,0000,0000,0000,,to the commissioner for being here,
Dialogue: 0,0:12:13.41,0:12:15.10,Default,,0000,0000,0000,,given all that's going on.
Dialogue: 0,0:12:15.10,0:12:16.41,Default,,0000,0000,0000,,I know you've been on\Nthe record in the past
Dialogue: 0,0:12:16.41,0:12:18.00,Default,,0000,0000,0000,,with respect to this issue
Dialogue: 0,0:12:18.00,0:12:20.55,Default,,0000,0000,0000,,and that not only does the\Ndepartment not have it,
Dialogue: 0,0:12:20.55,0:12:23.10,Default,,0000,0000,0000,,but you understand the flaws of it.
Dialogue: 0,0:12:23.10,0:12:24.92,Default,,0000,0000,0000,,So thank you for being here.
Dialogue: 0,0:12:24.92,0:12:27.63,Default,,0000,0000,0000,,Thank you to my council colleagues,
Dialogue: 0,0:12:27.63,0:12:29.20,Default,,0000,0000,0000,,but also most importantly the makers
Dialogue: 0,0:12:29.20,0:12:30.54,Default,,0000,0000,0000,,for putting this forward.
Dialogue: 0,0:12:30.54,0:12:33.70,Default,,0000,0000,0000,,I have been supportive of\Nthis since the very beginning,
Dialogue: 0,0:12:33.70,0:12:34.83,Default,,0000,0000,0000,,when I hosted,
Dialogue: 0,0:12:34.83,0:12:36.94,Default,,0000,0000,0000,,along with councilor Wu\Nand councilor McCarthy,
Dialogue: 0,0:12:36.94,0:12:38.40,Default,,0000,0000,0000,,a hearing on this very issue
Dialogue: 0,0:12:38.40,0:12:42.25,Default,,0000,0000,0000,,along with some other\Nsurveillance technologies
Dialogue: 0,0:12:42.25,0:12:44.47,Default,,0000,0000,0000,,and things that were possibly\Ncoming down the pipe.
Dialogue: 0,0:12:44.47,0:12:46.44,Default,,0000,0000,0000,,So this was an opportunity for us
Dialogue: 0,0:12:46.44,0:12:49.91,Default,,0000,0000,0000,,to have this conversation on\Nwhat we could do proactively
Dialogue: 0,0:12:49.91,0:12:53.03,Default,,0000,0000,0000,,to make sure that surveillance tools
Dialogue: 0,0:12:53.03,0:12:56.41,Default,,0000,0000,0000,,were not put into place without\Na robust community process.
Dialogue: 0,0:12:56.41,0:12:58.14,Default,,0000,0000,0000,,And of course, council input.
Dialogue: 0,0:12:58.14,0:13:01.10,Default,,0000,0000,0000,,So back then I supported\Nit and I support it now.
Dialogue: 0,0:13:01.10,0:13:03.68,Default,,0000,0000,0000,,Again thank you commissioner,\Nthank you to your team,
Dialogue: 0,0:13:03.68,0:13:06.60,Default,,0000,0000,0000,,thank you to the makers\Nand thank you madam chair.
Dialogue: 0,0:13:06.60,0:13:08.15,Default,,0000,0000,0000,,And Aiden also says, thank you.
Dialogue: 0,0:13:11.05,0:13:13.24,Default,,0000,0000,0000,,Thank you Aiden and thank\Nyou councilor Campbell.
Dialogue: 0,0:13:13.24,0:13:15.07,Default,,0000,0000,0000,,Councilor O'Malley.
Dialogue: 0,0:13:15.07,0:13:16.56,Default,,0000,0000,0000,,Thank you madam chair.
Dialogue: 0,0:13:16.56,0:13:20.20,Default,,0000,0000,0000,,Just very briefly wanna thank the makers,
Dialogue: 0,0:13:20.20,0:13:23.38,Default,,0000,0000,0000,,councilor Wu and Arroyo\Nfor their leadership here.
Dialogue: 0,0:13:23.38,0:13:26.56,Default,,0000,0000,0000,,Obviously I support the\Nfacial recognition ban.
Dialogue: 0,0:13:26.56,0:13:31.24,Default,,0000,0000,0000,,Also grateful, wanna echo my\Nthanks to the commissioner
Dialogue: 0,0:13:31.24,0:13:33.81,Default,,0000,0000,0000,,for his comments and his opposition,
Dialogue: 0,0:13:33.81,0:13:35.69,Default,,0000,0000,0000,,the fact that this isn't used.
Dialogue: 0,0:13:35.69,0:13:37.46,Default,,0000,0000,0000,,So it's important that we\Ncodify that in such a way,
Dialogue: 0,0:13:37.46,0:13:39.46,Default,,0000,0000,0000,,which again is the purpose of this hearing
Dialogue: 0,0:13:39.46,0:13:41.57,Default,,0000,0000,0000,,and this ordinance and\Nstand ready, willing
Dialogue: 0,0:13:41.57,0:13:43.50,Default,,0000,0000,0000,,and able to get to work.
Dialogue: 0,0:13:43.50,0:13:44.49,Default,,0000,0000,0000,,Thank you and look forward
Dialogue: 0,0:13:44.49,0:13:46.16,Default,,0000,0000,0000,,to hearing from a number of residents,
Dialogue: 0,0:13:46.16,0:13:48.83,Default,,0000,0000,0000,,our panelists, as well as the\Npublic testimony after, thanks.
Dialogue: 0,0:13:48.83,0:13:52.05,Default,,0000,0000,0000,,Thank you very much,\Ncouncilor Essaibi George.
Dialogue: 0,0:13:52.05,0:13:52.88,Default,,0000,0000,0000,,Thank you madam chair,
Dialogue: 0,0:13:52.88,0:13:54.73,Default,,0000,0000,0000,,and thank you to the commissioner
Dialogue: 0,0:13:54.73,0:13:55.98,Default,,0000,0000,0000,,for being with us this afternoon.
Dialogue: 0,0:13:55.98,0:13:57.53,Default,,0000,0000,0000,,I look forward to hearing from him
Dialogue: 0,0:13:57.53,0:14:00.58,Default,,0000,0000,0000,,and from the panelists discussing\Nthis ordinance further.
Dialogue: 0,0:14:00.58,0:14:02.22,Default,,0000,0000,0000,,I think that it is necessary
Dialogue: 0,0:14:02.22,0:14:05.09,Default,,0000,0000,0000,,to ban this technology at this point.
Dialogue: 0,0:14:05.09,0:14:09.06,Default,,0000,0000,0000,,I think that our colleagues\Nhave brought to the forefront,
Dialogue: 0,0:14:09.06,0:14:11.53,Default,,0000,0000,0000,,over the last few years\Nas have the advocates,
Dialogue: 0,0:14:11.53,0:14:14.73,Default,,0000,0000,0000,,the concerns regarding\Nthis kind of technology.
Dialogue: 0,0:14:14.73,0:14:19.73,Default,,0000,0000,0000,,I appreciate the real concerns\Naround accuracy and biases
Dialogue: 0,0:14:21.09,0:14:24.38,Default,,0000,0000,0000,,against people of color,\Nwomen, children, the elderly,
Dialogue: 0,0:14:24.38,0:14:29.13,Default,,0000,0000,0000,,and certainly reflective\Nof our city's residents.
Dialogue: 0,0:14:29.13,0:14:32.44,Default,,0000,0000,0000,,So we need to make sure\Nthat we're proactive
Dialogue: 0,0:14:32.44,0:14:34.92,Default,,0000,0000,0000,,in protecting our residents
Dialogue: 0,0:14:34.92,0:14:37.25,Default,,0000,0000,0000,,and having a deeper understanding
Dialogue: 0,0:14:37.25,0:14:39.44,Default,,0000,0000,0000,,and appreciation of this technology
Dialogue: 0,0:14:39.44,0:14:41.81,Default,,0000,0000,0000,,and understanding what\Nthis ordinance would mean
Dialogue: 0,0:14:41.81,0:14:45.44,Default,,0000,0000,0000,,around that ban and\Nfollowing through with it.
Dialogue: 0,0:14:45.44,0:14:46.42,Default,,0000,0000,0000,,Thank you madam chair.
Dialogue: 0,0:14:46.42,0:14:47.47,Default,,0000,0000,0000,,Thank you very much.
Dialogue: 0,0:14:47.47,0:14:50.04,Default,,0000,0000,0000,,I understand councilor\NFlynn is trying to get in
Dialogue: 0,0:14:50.04,0:14:51.80,Default,,0000,0000,0000,,and there might be a limit.
Dialogue: 0,0:14:51.80,0:14:53.26,Default,,0000,0000,0000,,I'm just saying that for our tech folks.
Dialogue: 0,0:14:53.26,0:14:55.15,Default,,0000,0000,0000,,In the meantime, I'm\Ngoing to read a letter
Dialogue: 0,0:14:55.15,0:14:58.28,Default,,0000,0000,0000,,from city council president Kim Janey,
Dialogue: 0,0:14:58.28,0:15:01.29,Default,,0000,0000,0000,,"Dear chairwoman Edwards, due\Nto technical difficulties,
Dialogue: 0,0:15:01.29,0:15:04.98,Default,,0000,0000,0000,,I'm unable to log in for\Ntoday's hearing on docket 0683,
Dialogue: 0,0:15:04.98,0:15:06.01,Default,,0000,0000,0000,,regarding an ordinance
Dialogue: 0,0:15:06.01,0:15:08.87,Default,,0000,0000,0000,,banning facial recognition\Ntechnology in Boston.
Dialogue: 0,0:15:08.87,0:15:11.43,Default,,0000,0000,0000,,I will continue to work on\Nthe technical challenges
Dialogue: 0,0:15:11.43,0:15:13.72,Default,,0000,0000,0000,,on my end and will join\Nthe hearing at a later time
Dialogue: 0,0:15:13.72,0:15:16.09,Default,,0000,0000,0000,,if I am able. If I'm unable to join,
Dialogue: 0,0:15:16.09,0:15:18.76,Default,,0000,0000,0000,,I'll review the recording at a later time.
Dialogue: 0,0:15:18.76,0:15:21.49,Default,,0000,0000,0000,,I want to thank the makers\Ncouncilors Wu and Arroyo
Dialogue: 0,0:15:21.49,0:15:23.91,Default,,0000,0000,0000,,for their advocacy and\Ncontinued leadership
Dialogue: 0,0:15:23.91,0:15:25.36,Default,,0000,0000,0000,,on this critical issue.
Dialogue: 0,0:15:25.36,0:15:26.76,Default,,0000,0000,0000,,I especially want to\Nthank all the advocates
Dialogue: 0,0:15:26.76,0:15:28.32,Default,,0000,0000,0000,,who are participating
Dialogue: 0,0:15:28.32,0:15:31.31,Default,,0000,0000,0000,,for their tireless advocacy on this issue.
Dialogue: 0,0:15:31.31,0:15:32.96,Default,,0000,0000,0000,,I look forward to working with advocates
Dialogue: 0,0:15:32.96,0:15:34.37,Default,,0000,0000,0000,,to incorporate their ideas
Dialogue: 0,0:15:34.37,0:15:36.38,Default,,0000,0000,0000,,on how we can push this\Nissue moving forward.
Dialogue: 0,0:15:36.38,0:15:39.24,Default,,0000,0000,0000,,It is clear we need policies in place
Dialogue: 0,0:15:39.24,0:15:41.46,Default,,0000,0000,0000,,to protect communities that\Nare vulnerable to technologies
Dialogue: 0,0:15:41.46,0:15:42.89,Default,,0000,0000,0000,,that are biased against them.
Dialogue: 0,0:15:42.89,0:15:44.50,Default,,0000,0000,0000,,I look forward to working\Nwith my colleagues
Dialogue: 0,0:15:44.50,0:15:45.53,Default,,0000,0000,0000,,on this critical issue,
Dialogue: 0,0:15:45.53,0:15:47.42,Default,,0000,0000,0000,,and will continue to fight for an agenda
Dialogue: 0,0:15:47.42,0:15:50.30,Default,,0000,0000,0000,,that promotes and protects black lives.
Dialogue: 0,0:15:50.30,0:15:52.13,Default,,0000,0000,0000,,Please read this letter\Ninto the record, thank you."
Dialogue: 0,0:15:52.13,0:15:53.37,Default,,0000,0000,0000,,Sincerely, Kim Janey.
Dialogue: 0,0:15:54.22,0:15:56.30,Default,,0000,0000,0000,,I'm not sure if we've been\Njoined by councilor Flynn.
Dialogue: 0,0:15:56.30,0:15:59.62,Default,,0000,0000,0000,,I do know he was trying to get in.
Dialogue: 0,0:16:03.60,0:16:05.01,Default,,0000,0000,0000,,In the meantime I did want to just do
Dialogue: 0,0:16:05.01,0:16:08.84,Default,,0000,0000,0000,,some quick housekeeping as part\Nof my introductory remarks.
Dialogue: 0,0:16:08.84,0:16:11.43,Default,,0000,0000,0000,,Just to let people know again,
Dialogue: 0,0:16:11.43,0:16:12.92,Default,,0000,0000,0000,,today's hearing is specifically
Dialogue: 0,0:16:12.92,0:16:14.39,Default,,0000,0000,0000,,about this proposed ordinance
Dialogue: 0,0:16:14.39,0:16:16.85,Default,,0000,0000,0000,,that would ban facial\Nrecognition technology.
Dialogue: 0,0:16:16.85,0:16:21.27,Default,,0000,0000,0000,,We had a robust conversation\Nabout defunding the police,
Dialogue: 0,0:16:21.27,0:16:23.48,Default,,0000,0000,0000,,discussing all aspects of the budget,
Dialogue: 0,0:16:23.48,0:16:25.74,Default,,0000,0000,0000,,all of those different things just happened
Dialogue: 0,0:16:25.74,0:16:28.33,Default,,0000,0000,0000,,and it was chaired by councilor Bok.
Dialogue: 0,0:16:28.33,0:16:32.14,Default,,0000,0000,0000,,And I wanna make sure that\Nour conversations are related,
Dialogue: 0,0:16:32.14,0:16:35.61,Default,,0000,0000,0000,,that we do not take this precious\Nmoment on this legislation
Dialogue: 0,0:16:35.61,0:16:40.06,Default,,0000,0000,0000,,and have it covered up by\Nthe other conversation.
Dialogue: 0,0:16:40.06,0:16:43.21,Default,,0000,0000,0000,,I just want you to know that
Dialogue: 0,0:16:43.21,0:16:45.79,Default,,0000,0000,0000,,and again, I will move the conversation
Dialogue: 0,0:16:45.79,0:16:47.61,Default,,0000,0000,0000,,towards this ordinance.
Dialogue: 0,0:16:47.61,0:16:49.16,Default,,0000,0000,0000,,There should be a copy of it.
Dialogue: 0,0:16:49.16,0:16:50.91,Default,,0000,0000,0000,,People should have the ability
Dialogue: 0,0:16:50.91,0:16:53.16,Default,,0000,0000,0000,,to discuss this\Nlanguage in this ordinance.
Dialogue: 0,0:16:53.16,0:16:55.30,Default,,0000,0000,0000,,Questions I have,
Dialogue: 0,0:16:55.30,0:16:58.90,Default,,0000,0000,0000,,I do want to talk about,\Nwhether time is of the essence.
Dialogue: 0,0:16:58.90,0:17:00.52,Default,,0000,0000,0000,,There seems to have been some issue,
Dialogue: 0,0:17:00.52,0:17:03.26,Default,,0000,0000,0000,,I'm concerned about whether\Nwe were entering a contract
Dialogue: 0,0:17:03.26,0:17:04.31,Default,,0000,0000,0000,,that allowed for this.
Dialogue: 0,0:17:04.31,0:17:05.60,Default,,0000,0000,0000,,So when the commissioner speaks,
Dialogue: 0,0:17:05.60,0:17:08.49,Default,,0000,0000,0000,,I'd like for him to speak on that.
Dialogue: 0,0:17:08.49,0:17:12.08,Default,,0000,0000,0000,,I'm not sure if there's a\Ncontract pending or what's going on.
Dialogue: 0,0:17:12.08,0:17:14.54,Default,,0000,0000,0000,,I also wanna make sure\Nthat folks understand
Dialogue: 0,0:17:14.54,0:17:16.70,Default,,0000,0000,0000,,due to the fact that I think
Dialogue: 0,0:17:16.70,0:17:19.80,Default,,0000,0000,0000,,we have possibly a hundred\Npeople signed up to speak.
Dialogue: 0,0:17:19.80,0:17:21.87,Default,,0000,0000,0000,,We have a lot of folks
Dialogue: 0,0:17:21.87,0:17:23.41,Default,,0000,0000,0000,,who are gonna be speaking on the panel.
Dialogue: 0,0:17:23.41,0:17:26.22,Default,,0000,0000,0000,,I'm going to limit my\Ncolleagues to three minutes
Dialogue: 0,0:17:26.22,0:17:27.38,Default,,0000,0000,0000,,in their first round.
Dialogue: 0,0:17:27.38,0:17:30.56,Default,,0000,0000,0000,,And I'm gonna get the\Npublic to no more than two.
Dialogue: 0,0:17:30.56,0:17:32.92,Default,,0000,0000,0000,,And I'm gonna try my best\Nto get through that list.
Dialogue: 0,0:17:32.92,0:17:35.85,Default,,0000,0000,0000,,So I see we've been\Njoined by councilor Flynn,
Dialogue: 0,0:17:35.85,0:17:38.30,Default,,0000,0000,0000,,councilor Flynn, do you\Nhave any opening remarks?
Dialogue: 0,0:17:45.73,0:17:47.07,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,0:17:47.07,0:17:49.70,Default,,0000,0000,0000,,Thank you, councilor Edwards,
Dialogue: 0,0:17:49.70,0:17:54.70,Default,,0000,0000,0000,,and to the sponsors and looking\Nforward to this hearing,
Dialogue: 0,0:17:56.40,0:17:59.86,Default,,0000,0000,0000,,learning, learning more about the proposal
Dialogue: 0,0:18:01.24,0:18:04.16,Default,,0000,0000,0000,,and also interested in hearing
Dialogue: 0,0:18:04.16,0:18:09.16,Default,,0000,0000,0000,,from the commissioner of\Npolice, commissioner Gross
Dialogue: 0,0:18:09.93,0:18:14.16,Default,,0000,0000,0000,,and from the public as well.
Dialogue: 0,0:18:14.16,0:18:19.16,Default,,0000,0000,0000,,Again, this is an issue\Nthat we all can learn from.
Dialogue: 0,0:18:21.49,0:18:25.28,Default,,0000,0000,0000,,We can listen to each\Nother, we can be respectful
Dialogue: 0,0:18:25.28,0:18:28.13,Default,,0000,0000,0000,,and hear each other's point of view.
Dialogue: 0,0:18:28.13,0:18:29.75,Default,,0000,0000,0000,,And that's what this city is all about,
Dialogue: 0,0:18:29.75,0:18:34.15,Default,,0000,0000,0000,,is coming together and\Ntreating each other fairly.
Dialogue: 0,0:18:34.15,0:18:37.64,Default,,0000,0000,0000,,And with respect, especially\Nduring difficult days
Dialogue: 0,0:18:37.64,0:18:40.07,Default,,0000,0000,0000,,and on difficult issues.
Dialogue: 0,0:18:40.07,0:18:43.50,Default,,0000,0000,0000,,I also know that our police commissioner
Dialogue: 0,0:18:43.50,0:18:47.84,Default,,0000,0000,0000,,is someone that also listens to people
Dialogue: 0,0:18:47.84,0:18:51.36,Default,,0000,0000,0000,,and he's in the neighborhoods and he talks
Dialogue: 0,0:18:51.36,0:18:55.00,Default,,0000,0000,0000,,and listens to a lot of the\Nconcerns of the residents.
Dialogue: 0,0:18:55.00,0:18:58.12,Default,,0000,0000,0000,,That's what I like most about\Nthe police commissioner,
Dialogue: 0,0:18:58.12,0:19:01.13,Default,,0000,0000,0000,,is his ability to listen\Nand treat people fairly.
Dialogue: 0,0:19:01.13,0:19:03.83,Default,,0000,0000,0000,,Again, I'm here to listen\Nand learn about the proposal.
Dialogue: 0,0:19:03.83,0:19:06.07,Default,,0000,0000,0000,,Thank you councilor Edwards\Nfor your strong leadership.
Dialogue: 0,0:19:06.07,0:19:07.85,Default,,0000,0000,0000,,Thank you councilor Flynn.
Dialogue: 0,0:19:07.85,0:19:09.67,Default,,0000,0000,0000,,I just wanna let people know,
Dialogue: 0,0:19:09.67,0:19:12.82,Default,,0000,0000,0000,,we actually have a hundred-person\Nlimit on the Zoom.
Dialogue: 0,0:19:12.82,0:19:14.50,Default,,0000,0000,0000,,We have 118 people trying to get in.
Dialogue: 0,0:19:14.50,0:19:15.84,Default,,0000,0000,0000,,So once you've testified,
Dialogue: 0,0:19:15.84,0:19:17.70,Default,,0000,0000,0000,,could you at least sign out,
Dialogue: 0,0:19:18.69,0:19:19.58,Default,,0000,0000,0000,,that allows for other people to get in.
Dialogue: 0,0:19:19.58,0:19:21.81,Default,,0000,0000,0000,,Commissioner Gross you've been waiting.
Dialogue: 0,0:19:21.81,0:19:23.95,Default,,0000,0000,0000,,So I'm gonna turn the\Nmicrophone over to you.
Dialogue: 0,0:19:23.95,0:19:27.35,Default,,0000,0000,0000,,I'm gonna ask you to get\Nright to the ordinance
Dialogue: 0,0:19:27.35,0:19:29.69,Default,,0000,0000,0000,,and get to the trends\Nand thoughts on that.
Dialogue: 0,0:19:29.69,0:19:32.06,Default,,0000,0000,0000,,I'm happy to discuss any procedure.
Dialogue: 0,0:19:32.06,0:19:33.89,Default,,0000,0000,0000,,Then we're gonna try\Nand get to the panelists
Dialogue: 0,0:19:33.89,0:19:36.70,Default,,0000,0000,0000,,and move back to the city\Ncouncil while you're still here.
Dialogue: 0,0:19:36.70,0:19:38.88,Default,,0000,0000,0000,,Okay.\NYes, thank you.
Dialogue: 0,0:19:42.47,0:19:44.31,Default,,0000,0000,0000,,Good afternoon everyone.
Dialogue: 0,0:19:44.31,0:19:49.31,Default,,0000,0000,0000,,And God bless Mr. George\NFloyd, may he rest in peace.
Dialogue: 0,0:19:50.12,0:19:52.82,Default,,0000,0000,0000,,And for the record, I'm proud of Boston,
Dialogue: 0,0:19:52.82,0:19:56.59,Default,,0000,0000,0000,,as they paid great homage\Nin peaceful protest
Dialogue: 0,0:19:56.59,0:20:00.37,Default,,0000,0000,0000,,and anything we can do to\Nmove our department forward,
Dialogue: 0,0:20:00.37,0:20:02.09,Default,,0000,0000,0000,,working with our community.
Dialogue: 0,0:20:02.09,0:20:03.33,Default,,0000,0000,0000,,You have it on record right here,
Dialogue: 0,0:20:03.33,0:20:05.23,Default,,0000,0000,0000,,and now I'm willing to\Nwork with you councilors,
Dialogue: 0,0:20:05.23,0:20:07.72,Default,,0000,0000,0000,,the mayor, in our great city.
Dialogue: 0,0:20:07.72,0:20:08.58,Default,,0000,0000,0000,,So, thank you.
Dialogue: 0,0:20:09.57,0:20:13.36,Default,,0000,0000,0000,,So I will start and thank the chair
Dialogue: 0,0:20:13.36,0:20:15.86,Default,,0000,0000,0000,,and the committee members and the makers,
Dialogue: 0,0:20:15.86,0:20:18.22,Default,,0000,0000,0000,,councilor Wu and Arroyo.
Dialogue: 0,0:20:19.93,0:20:22.15,Default,,0000,0000,0000,,I really thank you for\Ninviting me to participate
Dialogue: 0,0:20:22.15,0:20:24.37,Default,,0000,0000,0000,,in today's hearing regarding the ordinance
Dialogue: 0,0:20:24.37,0:20:28.64,Default,,0000,0000,0000,,banning facial recognition\Ntechnology in Boston.
Dialogue: 0,0:20:28.64,0:20:31.67,Default,,0000,0000,0000,,As you can imagine the\NBoston Police Department
Dialogue: 0,0:20:31.67,0:20:33.99,Default,,0000,0000,0000,,has been extremely busy\Nmeeting the current demands
Dialogue: 0,0:20:33.99,0:20:37.64,Default,,0000,0000,0000,,of public safety from\Nthe COVID-19 pandemic
Dialogue: 0,0:20:37.64,0:20:39.58,Default,,0000,0000,0000,,and the public rallies.
Dialogue: 0,0:20:39.58,0:20:42.11,Default,,0000,0000,0000,,However, the department and I understand
Dialogue: 0,0:20:42.11,0:20:45.80,Default,,0000,0000,0000,,the importance of the given topic,
Dialogue: 0,0:20:45.80,0:20:46.84,Default,,0000,0000,0000,,I would like to request
Dialogue: 0,0:20:46.84,0:20:49.63,Default,,0000,0000,0000,,that the hearing remain on the given topic
Dialogue: 0,0:20:49.63,0:20:52.67,Default,,0000,0000,0000,,and that we work together\Nto find other times
Dialogue: 0,0:20:52.67,0:20:56.17,Default,,0000,0000,0000,,to discuss the important issues\Nregarding police relations,
Dialogue: 0,0:20:56.17,0:20:58.82,Default,,0000,0000,0000,,national issues on police reform
Dialogue: 0,0:20:58.82,0:21:01.33,Default,,0000,0000,0000,,and the ways that we can improve
Dialogue: 0,0:21:01.33,0:21:04.92,Default,,0000,0000,0000,,our relationships here in Boston.
Dialogue: 0,0:21:06.43,0:21:08.31,Default,,0000,0000,0000,,As has been practiced in the past,
Dialogue: 0,0:21:08.31,0:21:10.02,Default,,0000,0000,0000,,we welcome a future opportunity
Dialogue: 0,0:21:10.02,0:21:12.05,Default,,0000,0000,0000,,to discuss the ordinance language
Dialogue: 0,0:21:12.05,0:21:15.69,Default,,0000,0000,0000,,and city council concerns\Nin a working session
Dialogue: 0,0:21:15.69,0:21:18.06,Default,,0000,0000,0000,,regarding technology policy
Dialogue: 0,0:21:18.06,0:21:20.59,Default,,0000,0000,0000,,and potential privacy concerns.
Dialogue: 0,0:21:20.59,0:21:24.95,Default,,0000,0000,0000,,My testimony today is meant\Nto serve as a background
Dialogue: 0,0:21:24.95,0:21:27.57,Default,,0000,0000,0000,,to BPD's current practices
Dialogue: 0,0:21:27.57,0:21:31.14,Default,,0000,0000,0000,,and identify potential\Ntechnological needs.
Dialogue: 0,0:21:32.05,0:21:34.33,Default,,0000,0000,0000,,The department for the record,
Dialogue: 0,0:21:34.33,0:21:36.38,Default,,0000,0000,0000,,does not currently have the technology
Dialogue: 0,0:21:36.38,0:21:37.88,Default,,0000,0000,0000,,for facial recognition.
Dialogue: 0,0:21:39.24,0:21:41.84,Default,,0000,0000,0000,,As technology advances however,
Dialogue: 0,0:21:41.84,0:21:45.33,Default,,0000,0000,0000,,many vendors have and will\Ncontinue to incorporate
Dialogue: 0,0:21:45.33,0:21:49.24,Default,,0000,0000,0000,,automated recognition abilities.
Dialogue: 0,0:21:50.10,0:21:52.36,Default,,0000,0000,0000,,We have prohibited these features
Dialogue: 0,0:21:52.36,0:21:55.02,Default,,0000,0000,0000,,as we have moved along in our advancements
Dialogue: 0,0:21:55.02,0:21:58.29,Default,,0000,0000,0000,,with the intention to have rich\Ndialogue with the community
Dialogue: 0,0:21:58.29,0:22:03.00,Default,,0000,0000,0000,,prior to any acquisition\Nof such a technology.
Dialogue: 0,0:22:04.41,0:22:07.54,Default,,0000,0000,0000,,Video has proven to be one\Nof the most effective tools
Dialogue: 0,0:22:07.54,0:22:10.87,Default,,0000,0000,0000,,for collecting evidence\Nof criminal offenses,
Dialogue: 0,0:22:10.87,0:22:13.09,Default,,0000,0000,0000,,for solving crimes and locating missing
Dialogue: 0,0:22:13.09,0:22:15.64,Default,,0000,0000,0000,,and exploited individuals.
Dialogue: 0,0:22:15.64,0:22:18.72,Default,,0000,0000,0000,,Any prohibitions on\Nthese investigative tools
Dialogue: 0,0:22:18.72,0:22:21.71,Default,,0000,0000,0000,,without a full understanding\Nof potential uses
Dialogue: 0,0:22:21.71,0:22:23.44,Default,,0000,0000,0000,,under strict protocols,
Dialogue: 0,0:22:23.44,0:22:25.61,Default,,0000,0000,0000,,could be harmful and impede our ability
Dialogue: 0,0:22:25.61,0:22:26.89,Default,,0000,0000,0000,,to protect the public.
Dialogue: 0,0:22:28.37,0:22:32.12,Default,,0000,0000,0000,,The proposed ordinance defines\Nfacial surveillance to mean,
Dialogue: 0,0:22:32.12,0:22:35.65,Default,,0000,0000,0000,,an automated\Nor semi-automated process
Dialogue: 0,0:22:35.65,0:22:40.50,Default,,0000,0000,0000,,that assists in identifying\Nor verifying an individual
Dialogue: 0,0:22:40.50,0:22:43.54,Default,,0000,0000,0000,,or in capturing information\Nabout an individual
Dialogue: 0,0:22:43.54,0:22:45.51,Default,,0000,0000,0000,,based on the physical characteristics
Dialogue: 0,0:22:45.51,0:22:48.00,Default,,0000,0000,0000,,of an individual's face.
Dialogue: 0,0:22:51.44,0:22:53.10,Default,,0000,0000,0000,,And to be clear,
Dialogue: 0,0:22:53.10,0:22:54.84,Default,,0000,0000,0000,,the department has no desire
Dialogue: 0,0:22:54.84,0:22:56.64,Default,,0000,0000,0000,,to employ a facial surveillance system
Dialogue: 0,0:22:58.11,0:23:01.33,Default,,0000,0000,0000,,to generally surveil\Nthe citizens of Boston.
Dialogue: 0,0:23:01.33,0:23:03.64,Default,,0000,0000,0000,,The department however notes a distinction
Dialogue: 0,0:23:03.64,0:23:06.26,Default,,0000,0000,0000,,between facial surveillance systems
Dialogue: 0,0:23:06.26,0:23:08.49,Default,,0000,0000,0000,,and facial recognition technology.
Dialogue: 0,0:23:09.35,0:23:10.23,Default,,0000,0000,0000,,The department believes
Dialogue: 0,0:23:10.23,0:23:13.25,Default,,0000,0000,0000,,that the term facial\Nrecognition technology
Dialogue: 0,0:23:13.25,0:23:16.55,Default,,0000,0000,0000,,better describes the investigative nature
Dialogue: 0,0:23:16.55,0:23:19.62,Default,,0000,0000,0000,,of the technology we believe\Nwould be useful to the city,
Dialogue: 0,0:23:20.64,0:23:23.39,Default,,0000,0000,0000,,with the right safeguards\Nand community input.
Dialogue: 0,0:23:24.47,0:23:25.96,Default,,0000,0000,0000,,The department would,
Dialogue: 0,0:23:25.96,0:23:28.96,Default,,0000,0000,0000,,as technology advances\Nand becomes more reliable,
Dialogue: 0,0:23:28.96,0:23:31.85,Default,,0000,0000,0000,,like the opportunity to\Ndiscuss the utilization
Dialogue: 0,0:23:31.85,0:23:34.72,Default,,0000,0000,0000,,of facial recognition technology
Dialogue: 0,0:23:34.72,0:23:39.72,Default,,0000,0000,0000,,to respond to specific crimes\Nand emergency situations.
Dialogue: 0,0:23:41.34,0:23:44.42,Default,,0000,0000,0000,,And I would like to\Nrevisit everyone's thoughts
Dialogue: 0,0:23:44.42,0:23:47.69,Default,,0000,0000,0000,,and remembrances of the Boston Marathon
Dialogue: 0,0:23:47.69,0:23:49.85,Default,,0000,0000,0000,,and the most recent kidnappings.
Dialogue: 0,0:23:51.12,0:23:55.51,Default,,0000,0000,0000,,Facial recognition technology\Nto the best of our knowledge,
Dialogue: 0,0:23:55.51,0:23:57.59,Default,,0000,0000,0000,,will greatly reduce the hours necessary
Dialogue: 0,0:23:57.59,0:24:00.32,Default,,0000,0000,0000,,to review video evidence.
Dialogue: 0,0:24:00.32,0:24:04.44,Default,,0000,0000,0000,,This would also allow\Ninvestigators to move more quickly,
Dialogue: 0,0:24:04.44,0:24:06.76,Default,,0000,0000,0000,,identify victims, missing persons
Dialogue: 0,0:24:06.76,0:24:10.97,Default,,0000,0000,0000,,or suspects of crime through\Ncitywide camera recordings.
Dialogue: 0,0:24:10.97,0:24:13.76,Default,,0000,0000,0000,,This technology, if acquired
Dialogue: 0,0:24:13.76,0:24:16.16,Default,,0000,0000,0000,,would also allow for the compilation
Dialogue: 0,0:24:16.16,0:24:19.85,Default,,0000,0000,0000,,and condensing of video to\Nbe done more efficiently.
Dialogue: 0,0:24:21.95,0:24:23.67,Default,,0000,0000,0000,,Facial recognition technology
Dialogue: 0,0:24:23.67,0:24:25.62,Default,,0000,0000,0000,,used with well-established guidelines
Dialogue: 0,0:24:25.62,0:24:27.44,Default,,0000,0000,0000,,and under strict review,
Dialogue: 0,0:24:27.44,0:24:29.38,Default,,0000,0000,0000,,can also provide a safer environment
Dialogue: 0,0:24:29.38,0:24:31.16,Default,,0000,0000,0000,,for all in the city of Boston.
Dialogue: 0,0:24:32.24,0:24:34.68,Default,,0000,0000,0000,,Creating a system in\Nwhich evidence is reviewed
Dialogue: 0,0:24:34.68,0:24:37.54,Default,,0000,0000,0000,,in a timely manner and efficiently,
Dialogue: 0,0:24:37.54,0:24:40.38,Default,,0000,0000,0000,,allowing police to intervene
Dialogue: 0,0:24:40.38,0:24:45.38,Default,,0000,0000,0000,,and reduce levels of\Ncrime, would be beneficial.
Dialogue: 0,0:24:45.46,0:24:49.85,Default,,0000,0000,0000,,The department rejects any\Nnotion in the ordinance
Dialogue: 0,0:24:49.85,0:24:54.85,Default,,0000,0000,0000,,that we would ever use\Nfacial recognition technology
Dialogue: 0,0:24:55.68,0:24:58.79,Default,,0000,0000,0000,,in our response to the COVID-19 pandemic.
Dialogue: 0,0:24:58.79,0:25:00.62,Default,,0000,0000,0000,,And just for the record,
Dialogue: 0,0:25:00.62,0:25:01.83,Default,,0000,0000,0000,,I'd like to repeat that,
Dialogue: 0,0:25:01.83,0:25:04.82,Default,,0000,0000,0000,,the department rejects any\Nnotion in the ordinance
Dialogue: 0,0:25:04.82,0:25:09.42,Default,,0000,0000,0000,,that we are or will ever use\Nfacial recognition technology
Dialogue: 0,0:25:09.42,0:25:12.06,Default,,0000,0000,0000,,in our response to the COVID-19 pandemic.
Dialogue: 0,0:25:13.98,0:25:14.81,Default,,0000,0000,0000,,Finally,
Dialogue: 0,0:25:18.64,0:25:20.76,Default,,0000,0000,0000,,it is important to note,
Dialogue: 0,0:25:20.76,0:25:24.44,Default,,0000,0000,0000,,the department shares the\Nconcerns of our community
Dialogue: 0,0:25:24.44,0:25:27.36,Default,,0000,0000,0000,,and our community concerns around privacy
Dialogue: 0,0:25:27.36,0:25:29.64,Default,,0000,0000,0000,,and intrusive surveillance.
Dialogue: 0,0:25:30.68,0:25:34.09,Default,,0000,0000,0000,,As is the current practice\Nwith existing technology,
Dialogue: 0,0:25:34.09,0:25:35.59,Default,,0000,0000,0000,,our intentions are to implement
Dialogue: 0,0:25:35.59,0:25:37.95,Default,,0000,0000,0000,,any potential future advances
Dialogue: 0,0:25:37.95,0:25:40.61,Default,,0000,0000,0000,,in facial recognition technology,
Dialogue: 0,0:25:40.61,0:25:43.46,Default,,0000,0000,0000,,through well-defined\Nprotocols and procedures.
Dialogue: 0,0:25:43.46,0:25:47.04,Default,,0000,0000,0000,,Some areas we consider, excuse me,
Dialogue: 0,0:25:47.04,0:25:49.87,Default,,0000,0000,0000,,some areas we consider this\Ntechnology may be beneficial,
Dialogue: 0,0:25:49.87,0:25:52.26,Default,,0000,0000,0000,,are in identifying the route
Dialogue: 0,0:25:52.26,0:25:54.20,Default,,0000,0000,0000,,and locations of missing persons,
Dialogue: 0,0:25:54.20,0:25:57.39,Default,,0000,0000,0000,,including the suffering from\NAlzheimer's and dementia,
Dialogue: 0,0:25:57.39,0:26:00.94,Default,,0000,0000,0000,,those suffering from\NAlzheimer's and dementia.
Dialogue: 0,0:26:00.94,0:26:04.13,Default,,0000,0000,0000,,As well, missing children,\Nkidnapped individuals,
Dialogue: 0,0:26:04.13,0:26:06.62,Default,,0000,0000,0000,,human trafficking victims,
Dialogue: 0,0:26:06.62,0:26:09.84,Default,,0000,0000,0000,,and the suspects of\Nassaults and shootings.
Dialogue: 0,0:26:09.84,0:26:14.50,Default,,0000,0000,0000,,Also the victims of\Nhomicide and domestic abuse.
Dialogue: 0,0:26:17.72,0:26:19.17,Default,,0000,0000,0000,,Important to note,
Dialogue: 0,0:26:19.17,0:26:22.56,Default,,0000,0000,0000,,we would like to work with\Nthe council and our partners
Dialogue: 0,0:26:22.56,0:26:24.86,Default,,0000,0000,0000,,to clearly articulate the language
Dialogue: 0,0:26:24.86,0:26:27.52,Default,,0000,0000,0000,,that could best describe these uses.
Dialogue: 0,0:26:27.52,0:26:30.59,Default,,0000,0000,0000,,In section B2 of the proposed ordinance,
Dialogue: 0,0:26:30.59,0:26:33.00,Default,,0000,0000,0000,,it indicates there is some intention
Dialogue: 0,0:26:33.00,0:26:35.69,Default,,0000,0000,0000,,to account for the use of this technology
Dialogue: 0,0:26:35.69,0:26:37.32,Default,,0000,0000,0000,,to advance investigations
Dialogue: 0,0:26:37.32,0:26:39.48,Default,,0000,0000,0000,,but the department believes\Nsome additional language
Dialogue: 0,0:26:39.48,0:26:43.57,Default,,0000,0000,0000,,referencing permitted uses is necessary.
Dialogue: 0,0:26:44.76,0:26:48.77,Default,,0000,0000,0000,,We as well, would\Nappreciate a working session
Dialogue: 0,0:26:48.77,0:26:51.38,Default,,0000,0000,0000,,within the next 60 days,
Dialogue: 0,0:26:51.38,0:26:55.35,Default,,0000,0000,0000,,to draft language that we\Ncan collectively present
Dialogue: 0,0:26:55.35,0:26:58.99,Default,,0000,0000,0000,,to constituents and\Nrequesting interest groups
Dialogue: 0,0:26:58.99,0:27:01.26,Default,,0000,0000,0000,,for feedback and input.
Dialogue: 0,0:27:02.66,0:27:05.13,Default,,0000,0000,0000,,We also plan to use the public testimony
Dialogue: 0,0:27:05.13,0:27:07.35,Default,,0000,0000,0000,,being provided in today's hearing
Dialogue: 0,0:27:07.35,0:27:09.51,Default,,0000,0000,0000,,to assist in our decision-making.
Dialogue: 0,0:27:10.41,0:27:12.61,Default,,0000,0000,0000,,I'd like to thank you for this opportunity
Dialogue: 0,0:27:12.61,0:27:14.62,Default,,0000,0000,0000,,to provide this testimony
Dialogue: 0,0:27:14.62,0:27:16.74,Default,,0000,0000,0000,,and your continued interest and time
Dialogue: 0,0:27:16.74,0:27:18.90,Default,,0000,0000,0000,,in creating public forums,
Dialogue: 0,0:27:18.90,0:27:21.58,Default,,0000,0000,0000,,around police technology and advancements.
Dialogue: 0,0:27:22.81,0:27:25.56,Default,,0000,0000,0000,,That's my official\Nstatement that I read in.
Dialogue: 0,0:27:25.56,0:27:29.10,Default,,0000,0000,0000,,And I'm telling you right now\Nas an African-American male,
Dialogue: 0,0:27:29.10,0:27:32.11,Default,,0000,0000,0000,,the technology that is in place today
Dialogue: 0,0:27:32.11,0:27:35.50,Default,,0000,0000,0000,,does not meet the standards of\Nthe Boston Police Department,
Dialogue: 0,0:27:35.50,0:27:37.90,Default,,0000,0000,0000,,nor does it meet my standards.
Dialogue: 0,0:27:37.90,0:27:39.47,Default,,0000,0000,0000,,And until technology advances
Dialogue: 0,0:27:39.47,0:27:42.77,Default,,0000,0000,0000,,to a point where it is more reliable,
Dialogue: 0,0:27:42.77,0:27:46.46,Default,,0000,0000,0000,,again, we will need your\Ninput, your guidance,
Dialogue: 0,0:27:46.46,0:27:50.92,Default,,0000,0000,0000,,and we'll work together\Nto pick that technology
Dialogue: 0,0:27:50.92,0:27:55.16,Default,,0000,0000,0000,,which is more conducive to our privacy
Dialogue: 0,0:27:55.16,0:27:59.24,Default,,0000,0000,0000,,and our rights as citizens\Nof Boston, thank you.
Dialogue: 0,0:28:01.69,0:28:03.85,Default,,0000,0000,0000,,Thank you very much commissioner Gross.
Dialogue: 0,0:28:03.85,0:28:06.52,Default,,0000,0000,0000,,Do you, just before I go on,
Dialogue: 0,0:28:06.52,0:28:08.93,Default,,0000,0000,0000,,we're gonna go now to Joy Buolamwini
Dialogue: 0,0:28:12.12,0:28:13.80,Default,,0000,0000,0000,,and I am so sorry for that Joy.
Dialogue: 0,0:28:13.80,0:28:16.89,Default,,0000,0000,0000,,I apologize profusely for the\Nbutchering of your last name.
Dialogue: 0,0:28:16.89,0:28:21.58,Default,,0000,0000,0000,,My'Kel McMillen, Karina\NHem, Kade Crawford,
Dialogue: 0,0:28:21.58,0:28:26.12,Default,,0000,0000,0000,,Erik Berg, and Joshua Barocas.
Dialogue: 0,0:28:27.09,0:28:30.34,Default,,0000,0000,0000,,But I wanted to ask the\Ncommissioner just very quickly,
Dialogue: 0,0:28:30.34,0:28:33.21,Default,,0000,0000,0000,,you said you have suggested language
Dialogue: 0,0:28:33.21,0:28:36.90,Default,,0000,0000,0000,,or that you would want to work\Non suggested language for B2.
Dialogue: 0,0:28:36.90,0:28:39.01,Default,,0000,0000,0000,,No, I would want to work on it.
Dialogue: 0,0:28:39.01,0:28:42.24,Default,,0000,0000,0000,,So keep in mind anything\Nwe do going forward,
Dialogue: 0,0:28:42.24,0:28:44.67,Default,,0000,0000,0000,,as I promised for the last four years,
Dialogue: 0,0:28:44.67,0:28:47.26,Default,,0000,0000,0000,,we want your input and your guidance.
Dialogue: 0,0:28:47.26,0:28:50.43,Default,,0000,0000,0000,,Councilor Mejia you hit it on point.
Dialogue: 0,0:28:50.43,0:28:52.73,Default,,0000,0000,0000,,This technology is not fair to everyone,
Dialogue: 0,0:28:52.73,0:28:55.67,Default,,0000,0000,0000,,especially African-Americans, Latinos,
Dialogue: 0,0:28:55.67,0:28:57.12,Default,,0000,0000,0000,,it's not there yet.
Dialogue: 0,0:28:57.12,0:28:58.96,Default,,0000,0000,0000,,So moving forward,
Dialogue: 0,0:28:58.96,0:29:00.99,Default,,0000,0000,0000,,we have to make sure\Neverybody's comfortable
Dialogue: 0,0:29:00.99,0:29:02.56,Default,,0000,0000,0000,,with this type of technology.
Dialogue: 0,0:29:03.70,0:29:04.53,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,0:29:05.46,0:29:08.96,Default,,0000,0000,0000,,So now I'm gonna turn it\Nover to some of the folks
Dialogue: 0,0:29:08.96,0:29:12.68,Default,,0000,0000,0000,,who have been called on to speak.
Dialogue: 0,0:29:12.68,0:29:15.52,Default,,0000,0000,0000,,Hopefully we're gonna try and get through
Dialogue: 0,0:29:15.52,0:29:17.02,Default,,0000,0000,0000,,as many of them as possible.
Dialogue: 0,0:29:17.02,0:29:19.16,Default,,0000,0000,0000,,Commissioner, these are the\Nfolks who helped to push
Dialogue: 0,0:29:19.16,0:29:20.16,Default,,0000,0000,0000,,and move this ordinance.
Dialogue: 0,0:29:20.16,0:29:21.86,Default,,0000,0000,0000,,We really hope you can\Nstay as long as you can
Dialogue: 0,0:29:21.86,0:29:22.90,Default,,0000,0000,0000,,to hear them, also.
Dialogue: 0,0:29:22.90,0:29:25.28,Default,,0000,0000,0000,,I'm looking forward to\Nworking with the folks
Dialogue: 0,0:29:25.28,0:29:26.84,Default,,0000,0000,0000,,in the community.
Dialogue: 0,0:29:26.84,0:29:28.13,Default,,0000,0000,0000,,Wonderful, thank you.
Dialogue: 0,0:29:28.13,0:29:29.32,Default,,0000,0000,0000,,So Joy.
Dialogue: 0,0:29:31.09,0:29:31.94,Default,,0000,0000,0000,,Do I?
Dialogue: 0,0:29:34.76,0:29:36.37,Default,,0000,0000,0000,,Hello?
Dialogue: 0,0:29:36.37,0:29:38.20,Default,,0000,0000,0000,,Oh, wonderful, Joy.
Dialogue: 0,0:29:38.20,0:29:39.52,Default,,0000,0000,0000,,The floor is yours.
Dialogue: 0,0:29:39.52,0:29:43.74,Default,,0000,0000,0000,,If you could take about\Nthree minutes, no more.
Dialogue: 0,0:29:43.74,0:29:44.95,Default,,0000,0000,0000,,Sounds good. Dialogue: 0,0:29:44.95,0:29:47.07,Default,,0000,0000,0000,,So thank you so much madam chair Edwards, Dialogue: 0,0:29:47.07,0:29:50.26,Default,,0000,0000,0000,,members of the committee\Non government operations Dialogue: 0,0:29:50.26,0:29:53.39,Default,,0000,0000,0000,,and members of the Boston City Council Dialogue: 0,0:29:53.39,0:29:55.87,Default,,0000,0000,0000,,for the opportunity to testify today, Dialogue: 0,0:29:55.87,0:29:58.36,Default,,0000,0000,0000,,I am the founder of the\NAlgorithmic Justice League Dialogue: 0,0:29:58.36,0:30:00.88,Default,,0000,0000,0000,,and an algorithmic bias researcher. Dialogue: 0,0:30:00.88,0:30:02.84,Default,,0000,0000,0000,,I've conducted MIT study Dialogue: 0,0:30:02.84,0:30:05.42,Default,,0000,0000,0000,,showing some of the\Nlargest recorded gender Dialogue: 0,0:30:05.42,0:30:09.35,Default,,0000,0000,0000,,and racial biases in AI\Nsystems sold by companies, Dialogue: 0,0:30:09.35,0:30:12.61,Default,,0000,0000,0000,,including IBM, Microsoft and Amazon. Dialogue: 0,0:30:12.61,0:30:15.13,Default,,0000,0000,0000,,As you've heard the deployment\Nof facial recognition Dialogue: 0,0:30:15.13,0:30:17.12,Default,,0000,0000,0000,,and related technologies Dialogue: 0,0:30:17.12,0:30:19.04,Default,,0000,0000,0000,,has major civil rights implications. Dialogue: 0,0:30:19.04,0:30:21.40,Default,,0000,0000,0000,,These tools also have\Ntechnical limitations Dialogue: 0,0:30:21.40,0:30:24.37,Default,,0000,0000,0000,,that further amplify\Nharms for black people, Dialogue: 0,0:30:24.37,0:30:27.48,Default,,0000,0000,0000,,indigenous people, other\Ncommunities of color, Dialogue: 0,0:30:27.48,0:30:31.29,Default,,0000,0000,0000,,women, the elderly, those\Nwith dementia and Parkinson's, Dialogue: 0,0:30:31.29,0:30:35.61,Default,,0000,0000,0000,,youth trans and gender\Nnon-conforming individuals. Dialogue: 0,0:30:35.61,0:30:37.05,Default,,0000,0000,0000,,In one test I ran, Dialogue: 0,0:30:37.05,0:30:41.31,Default,,0000,0000,0000,,Amazon's AI even failed on\Nthe face of Oprah Winfrey, Dialogue: 0,0:30:41.31,0:30:43.12,Default,,0000,0000,0000,,labeling her male. Dialogue: 0,0:30:43.12,0:30:46.36,Default,,0000,0000,0000,,Personally, I've had to resort\Nto wearing a white mask, Dialogue: 0,0:30:46.36,0:30:50.55,Default,,0000,0000,0000,,to have my dark skin detected\Nby some of this technology, Dialogue: 0,0:30:50.55,0:30:52.73,Default,,0000,0000,0000,,but given mass surveillance applications, Dialogue: 0,0:30:52.73,0:30:55.94,Default,,0000,0000,0000,,not having my face\Ndetected can be a benefit. Dialogue: 0,0:30:55.94,0:30:57.98,Default,,0000,0000,0000,,We do not need to look to China Dialogue: 0,0:30:58.92,0:31:00.27,Default,,0000,0000,0000,,to see this technology\Nbeing used for surveillance Dialogue: 0,0:31:00.27,0:31:03.13,Default,,0000,0000,0000,,of protesters with little\Nto no accountability, Dialogue: 0,0:31:03.13,0:31:06.85,Default,,0000,0000,0000,,and too often in violation of\Nour civil and human rights, Dialogue: 0,0:31:06.85,0:31:09.86,Default,,0000,0000,0000,,including first amendment\Nrights of freedom of expression, Dialogue: 0,0:31:09.86,0:31:12.33,Default,,0000,0000,0000,,association, and assembly. 
Dialogue: 0,0:31:13.27,0:31:14.33,Default,,0000,0000,0000,,When the tech works,
Dialogue: 0,0:31:14.33,0:31:16.77,Default,,0000,0000,0000,,we can't forget about\Nthe cost of surveillance,
Dialogue: 0,0:31:16.77,0:31:18.90,Default,,0000,0000,0000,,but in other contexts it can fail
Dialogue: 0,0:31:18.90,0:31:20.66,Default,,0000,0000,0000,,and the failures can be harmful.
Dialogue: 0,0:31:20.66,0:31:23.70,Default,,0000,0000,0000,,Misidentifications can\Nlead to false arrests
Dialogue: 0,0:31:23.70,0:31:25.13,Default,,0000,0000,0000,,and accusations.
Dialogue: 0,0:31:25.13,0:31:28.16,Default,,0000,0000,0000,,In April 2019, a Brown University senior
Dialogue: 0,0:31:28.16,0:31:31.57,Default,,0000,0000,0000,,was misidentified as a terrorist suspect
Dialogue: 0,0:31:31.57,0:31:34.46,Default,,0000,0000,0000,,in the Sri Lanka Easter bombings.
Dialogue: 0,0:31:34.46,0:31:36.36,Default,,0000,0000,0000,,The police eventually\Ncorrected the mistake,
Dialogue: 0,0:31:36.36,0:31:38.47,Default,,0000,0000,0000,,but she still received death threats.
Dialogue: 0,0:31:38.47,0:31:41.67,Default,,0000,0000,0000,,Mistaken identity is more\Nthan an inconvenience.
Dialogue: 0,0:31:41.67,0:31:44.00,Default,,0000,0000,0000,,And she's not alone, in the UK,
Dialogue: 0,0:31:44.00,0:31:47.15,Default,,0000,0000,0000,,the faces of over 2,400 innocent people
Dialogue: 0,0:31:47.15,0:31:50.74,Default,,0000,0000,0000,,were stored by the police\Ndepartment without their consent.
Dialogue: 0,0:31:50.74,0:31:52.05,Default,,0000,0000,0000,,The department reported
Dialogue: 0,0:31:52.05,0:31:55.98,Default,,0000,0000,0000,,a false positive identification\Nrate of over 90%.
Dialogue: 0,0:31:56.94,0:32:00.30,Default,,0000,0000,0000,,In the U.S. there are no\Nreporting requirements
Dialogue: 0,0:32:00.30,0:32:03.61,Default,,0000,0000,0000,,generally speaking, so\Nwe're operating in the dark.
Dialogue: 0,0:32:03.61,0:32:05.50,Default,,0000,0000,0000,,Further, these tools\Ndo not have to identify
Dialogue: 0,0:32:05.50,0:32:07.50,Default,,0000,0000,0000,,unique faces to be harmful.
Dialogue: 0,0:32:07.50,0:32:08.90,Default,,0000,0000,0000,,An investigation reported
Dialogue: 0,0:32:08.90,0:32:11.20,Default,,0000,0000,0000,,that IBM equipped the NYPD
Dialogue: 0,0:32:11.20,0:32:13.37,Default,,0000,0000,0000,,with tools to search for people in video,
Dialogue: 0,0:32:13.37,0:32:15.70,Default,,0000,0000,0000,,by facial hair and skin tone.
Dialogue: 0,0:32:15.70,0:32:17.21,Default,,0000,0000,0000,,In short, these tools can be used
Dialogue: 0,0:32:17.21,0:32:19.44,Default,,0000,0000,0000,,to automate racial profiling.
Dialogue: 0,0:32:19.44,0:32:21.12,Default,,0000,0000,0000,,The company recently came out
Dialogue: 0,0:32:21.12,0:32:22.77,Default,,0000,0000,0000,,to denounce the use of these tools
Dialogue: 0,0:32:22.77,0:32:25.24,Default,,0000,0000,0000,,for mass surveillance and profiling.
Dialogue: 0,0:32:25.24,0:32:29.58,Default,,0000,0000,0000,,IBM's move to stop selling\Nfacial recognition technology
Dialogue: 0,0:32:29.58,0:32:32.31,Default,,0000,0000,0000,,underscores its dangers.
Dialogue: 0,0:32:32.31,0:32:34.34,Default,,0000,0000,0000,,Due to the consequences of failure,
Dialogue: 0,0:32:34.34,0:32:36.43,Default,,0000,0000,0000,,I've focused my MIT research
Dialogue: 0,0:32:36.43,0:32:39.21,Default,,0000,0000,0000,,on the performance of\Nfacial analysis systems.
Dialogue: 0,0:32:39.21,0:32:42.06,Default,,0000,0000,0000,,I found that for the task\Nof gender classification,
Dialogue: 0,0:32:42.06,0:32:44.61,Default,,0000,0000,0000,,IBM, Microsoft and Amazon
Dialogue: 0,0:32:44.61,0:32:48.79,Default,,0000,0000,0000,,had error rates of no more\Nthan 1% for lighter-skinned men.
Dialogue: 0,0:32:48.79,0:32:49.83,Default,,0000,0000,0000,,In the worst case,
Dialogue: 0,0:32:51.10,0:32:54.08,Default,,0000,0000,0000,,these rates rose to over\N30% for darker-skinned women.
Dialogue: 0,0:32:54.08,0:32:57.23,Default,,0000,0000,0000,,Subsequent government\Nstudies complement this work.
Dialogue: 0,0:32:57.23,0:33:01.01,Default,,0000,0000,0000,,They show continued error\Ndisparities in facial analysis
Dialogue: 0,0:33:01.01,0:33:03.68,Default,,0000,0000,0000,,and other tasks, including\Nface recognition.
Dialogue: 0,0:33:03.68,0:33:07.86,Default,,0000,0000,0000,,The latest government\Nstudy of 189 algorithms,
Dialogue: 0,0:33:07.86,0:33:10.56,Default,,0000,0000,0000,,revealed consequential racial, gender
Dialogue: 0,0:33:10.56,0:33:13.96,Default,,0000,0000,0000,,and age bias in many of\Nthe algorithms tested.
Dialogue: 0,0:33:13.96,0:33:16.05,Default,,0000,0000,0000,,Still, even if error rates improve,
Dialogue: 0,0:33:16.05,0:33:18.90,Default,,0000,0000,0000,,the capacity for abuse, lack of oversight
Dialogue: 0,0:33:18.90,0:33:22.77,Default,,0000,0000,0000,,and deployment limitations\Npose too great a risk.
Dialogue: 0,0:33:22.77,0:33:25.08,Default,,0000,0000,0000,,Given known harms, the city of Boston
Dialogue: 0,0:33:25.08,0:33:28.15,Default,,0000,0000,0000,,should ban government\Nuse of face surveillance.
Dialogue: 0,0:33:28.15,0:33:30.04,Default,,0000,0000,0000,,I look forward to\Nanswering your questions.
Dialogue: 0,0:33:30.04,0:33:32.56,Default,,0000,0000,0000,,Thank you for the opportunity to testify.
Dialogue: 0,0:33:33.60,0:33:34.62,Default,,0000,0000,0000,,Thank you so much
Dialogue: 0,0:33:34.62,0:33:38.50,Default,,0000,0000,0000,,and to correct my horrible\Nmispronunciation of your name.
Dialogue: 0,0:33:39.50,0:33:42.77,Default,,0000,0000,0000,,Would you mind saying your\Nfull name for the record again?
Dialogue: 0,0:33:42.77,0:33:45.60,Default,,0000,0000,0000,,Sure, my name is Joy Buolamwini.
Dialogue: 0,0:33:45.60,0:33:47.14,Default,,0000,0000,0000,,Thank you very much Joy.
Dialogue: 0,0:33:47.14,0:33:49.96,Default,,0000,0000,0000,,I'm gonna now turn over\Nto My'Kel McMillen,
Dialogue: 0,0:33:49.96,0:33:51.70,Default,,0000,0000,0000,,a youth advocate.
Dialogue: 0,0:33:51.70,0:33:55.16,Default,,0000,0000,0000,,I'm also gonna set the\Nclock three minutes, My'Kel.
Dialogue: 0,0:33:59.80,0:34:01.22,Default,,0000,0000,0000,,Good afternoon madam chair,
Dialogue: 0,0:34:01.22,0:34:03.19,Default,,0000,0000,0000,,good afternoon city councilors,
Dialogue: 0,0:34:03.19,0:34:04.69,Default,,0000,0000,0000,,good afternoon everybody else.
Dialogue: 0,0:34:06.01,0:34:09.09,Default,,0000,0000,0000,,My name is My'Kel McMillen,\NI'm a youth advocate
Dialogue: 0,0:34:10.28,0:34:13.18,Default,,0000,0000,0000,,I'm also an organizer [indistinct]\Nwhich is from the east.
Dialogue: 0,0:34:15.69,0:34:17.00,Default,,0000,0000,0000,,I just want to talk a little bit
Dialogue: 0,0:34:17.00,0:34:20.27,Default,,0000,0000,0000,,about what's going on within\Nthe streets of Boston.
Dialogue: 0,0:34:22.77,0:34:23.69,Default,,0000,0000,0000,,For centuries black folks
Dialogue: 0,0:34:23.69,0:34:25.78,Default,,0000,0000,0000,,have been the guinea pig in America,
Dialogue: 0,0:34:25.78,0:34:27.15,Default,,0000,0000,0000,,from experiments with our bodies
Dialogue: 0,0:34:27.15,0:34:29.19,Default,,0000,0000,0000,,to the strain on our communities,
Dialogue: 0,0:34:29.19,0:34:31.99,Default,,0000,0000,0000,,it seems [indistinct]
Dialogue: 0,0:34:31.99,0:34:35.20,Default,,0000,0000,0000,,Do I agree police are as involved?
Dialogue: 0,0:34:35.20,0:34:37.37,Default,,0000,0000,0000,,Yes, they are no longer\Ncatching us glaze,
Dialogue: 0,0:34:37.37,0:34:39.73,Default,,0000,0000,0000,,but catching poor black and brown folks.
Dialogue: 0,0:34:39.73,0:34:43.96,Default,,0000,0000,0000,,In January, so I wanna\Ntalk about three years ago,
Dialogue: 0,0:34:43.96,0:34:46.98,Default,,0000,0000,0000,,roughly January around 2017,
Dialogue: 0,0:34:46.98,0:34:50.98,Default,,0000,0000,0000,,there were residents from my\Nneighborhood that came up to me
Dialogue: 0,0:34:50.98,0:34:54.49,Default,,0000,0000,0000,,asking about a drone\Nflying into the area
Dialogue: 0,0:34:54.49,0:34:56.84,Default,,0000,0000,0000,,and they asked numerous times
Dialogue: 0,0:34:56.84,0:34:59.20,Default,,0000,0000,0000,,and nobody could figure out\Nwho was flying the drone,
Dialogue: 0,0:34:59.20,0:35:01.48,Default,,0000,0000,0000,,who's controlling it.
Dialogue: 0,0:35:01.48,0:35:04.47,Default,,0000,0000,0000,,And so upon receiving this information,
Dialogue: 0,0:35:04.47,0:35:07.40,Default,,0000,0000,0000,,I live across the street\Nfrom a soda company,
Dialogue: 0,0:35:07.40,0:35:09.06,Default,,0000,0000,0000,,and later on that night,
Dialogue: 0,0:35:09.06,0:35:10.94,Default,,0000,0000,0000,,I seen officers playing with a drone
Dialogue: 0,0:35:10.94,0:35:13.12,Default,,0000,0000,0000,,and I ended up taking\Nphotos of their drone.
Dialogue: 0,0:35:13.12,0:35:14.32,Default,,0000,0000,0000,,And it kind of jogged my memory back
Dialogue: 0,0:35:14.32,0:35:16.17,Default,,0000,0000,0000,,that this was a drone
Dialogue: 0,0:35:16.17,0:35:18.46,Default,,0000,0000,0000,,that lots of residents in my neighborhood
Dialogue: 0,0:35:18.46,0:35:20.00,Default,,0000,0000,0000,,had seen and talked about.
Dialogue: 0,0:35:20.00,0:35:21.92,Default,,0000,0000,0000,,That no one knew who was flying the drone.
Dialogue: 0,0:35:21.92,0:35:24.20,Default,,0000,0000,0000,,It came to my knowledge,\Nit was the police.
Dialogue: 0,0:35:24.20,0:35:26.59,Default,,0000,0000,0000,,So with doing some research
Dialogue: 0,0:35:26.59,0:35:28.91,Default,,0000,0000,0000,,and reaching out to a friend over at the ACLU,
Dialogue: 0,0:35:29.94,0:35:32.80,Default,,0000,0000,0000,,we tried to follow up with\Na public records request,
Dialogue: 0,0:35:32.80,0:35:34.68,Default,,0000,0000,0000,,which we were denied our first time,
Dialogue: 0,0:35:34.68,0:35:35.94,Default,,0000,0000,0000,,and then eventually got it,
Dialogue: 0,0:35:35.94,0:35:40.17,Default,,0000,0000,0000,,where the Boston police had\Nspent the amount of $17,500
Dialogue: 0,0:35:41.06,0:35:42.83,Default,,0000,0000,0000,,on three police drones.
Dialogue: 0,0:35:42.83,0:35:45.90,Default,,0000,0000,0000,,And nobody knew about it from\Ncity council to the public.
Dialogue: 0,0:35:45.90,0:35:48.05,Default,,0000,0000,0000,,It was kind of kept secret
Dialogue: 0,0:35:48.05,0:35:50.93,Default,,0000,0000,0000,,and there was no committee oversight.
Dialogue: 0,0:35:50.93,0:35:53.70,Default,,0000,0000,0000,,And these drones were flying illegally
Dialogue: 0,0:35:53.70,0:35:57.93,Default,,0000,0000,0000,,inside of a neighborhood\Nand the breach of privacy,
Dialogue: 0,0:35:57.93,0:36:02.93,Default,,0000,0000,0000,,the not knowing what\Ninformation was being collected,
Dialogue: 0,0:36:05.39,0:36:07.84,Default,,0000,0000,0000,,what was being stored,\Nhow it was being stored
Dialogue: 0,0:36:07.84,0:36:10.22,Default,,0000,0000,0000,,that really scared a lot\Nof folks in my neighborhood
Dialogue: 0,0:36:10.22,0:36:12.49,Default,,0000,0000,0000,,and to the point that a lot\Nof them didn't wanna speak up
Dialogue: 0,0:36:12.49,0:36:15.48,Default,,0000,0000,0000,,because of past experience\Nof dealing with abuse
Dialogue: 0,0:36:15.48,0:36:17.02,Default,,0000,0000,0000,,and being harassed.
Dialogue: 0,0:36:17.02,0:36:18.91,Default,,0000,0000,0000,,Nobody wanted that backlash,
Dialogue: 0,0:36:18.91,0:36:20.76,Default,,0000,0000,0000,,but at the same time,
Dialogue: 0,0:36:20.76,0:36:22.16,Default,,0000,0000,0000,,we as residents
Dialogue: 0,0:36:22.16,0:36:25.76,Default,,0000,0000,0000,,have to empower one\Nanother and have to speak up.
Dialogue: 0,0:36:25.76,0:36:29.93,Default,,0000,0000,0000,,And sometimes it has to\Nbe somebody regardless
Dialogue: 0,0:36:29.93,0:36:33.34,Default,,0000,0000,0000,,if it's myself or another individual,
Dialogue: 0,0:36:33.34,0:36:35.08,Default,,0000,0000,0000,,we have to hold people accountable.
Dialogue: 0,0:36:35.08,0:36:37.97,Default,,0000,0000,0000,,And something that Monica\NCandace, who's another activist, says:
Dialogue: 0,0:36:37.97,0:36:39.97,Default,,0000,0000,0000,,accountability has no color to another,
Dialogue: 0,0:36:39.97,0:36:43.18,Default,,0000,0000,0000,,regardless if you're white,\Nblack, in a blue uniform,
Dialogue: 0,0:36:43.18,0:36:44.74,Default,,0000,0000,0000,,you still have to be held accountable.
Dialogue: 0,0:36:44.74,0:36:46.86,Default,,0000,0000,0000,,And then just to see that\Nno one was held accountable
Dialogue: 0,0:36:46.86,0:36:48.48,Default,,0000,0000,0000,,was a scary thing,
Dialogue: 0,0:36:48.48,0:36:50.66,Default,,0000,0000,0000,,because it kind of just [indistinct]
Dialogue: 0,0:36:53.89,0:36:55.33,Default,,0000,0000,0000,,Nobody ever gets held accountable
Dialogue: 0,0:36:55.33,0:36:58.84,Default,,0000,0000,0000,,because our voices are muffled by money
Dialogue: 0,0:36:58.84,0:37:03.78,Default,,0000,0000,0000,,or just a knee on the neck,\Nrest in peace to George Floyd.
Dialogue: 0,0:37:03.78,0:37:05.62,Default,,0000,0000,0000,,That is it for my time, thank you all.
Dialogue: 0,0:37:05.62,0:37:08.51,Default,,0000,0000,0000,,Thank you, thank you so much.
Dialogue: 0,0:37:08.51,0:37:09.83,Default,,0000,0000,0000,,Perfect testimony.
Dialogue: 0,0:37:09.83,0:37:11.76,Default,,0000,0000,0000,,And I really appreciate you\Nspeaking from the heart,
Dialogue: 0,0:37:11.76,0:37:13.45,Default,,0000,0000,0000,,from your own experience
Dialogue: 0,0:37:13.45,0:37:16.03,Default,,0000,0000,0000,,and also demonstrating how you've actually
Dialogue: 0,0:37:16.03,0:37:18.42,Default,,0000,0000,0000,,had conversations for accountability
Dialogue: 0,0:37:18.42,0:37:19.74,Default,,0000,0000,0000,,through your leadership.
Dialogue: 0,0:37:19.74,0:37:21.81,Default,,0000,0000,0000,,So I wanna thank you for that.
Dialogue: 0,0:37:21.81,0:37:25.16,Default,,0000,0000,0000,,Up next we have Karina Hem Dialogue: 0,0:37:25.16,0:37:27.27,Default,,0000,0000,0000,,from the Student Immigrant Movement. Dialogue: 0,0:37:27.27,0:37:30.23,Default,,0000,0000,0000,,Three minutes, Karina. Dialogue: 0,0:37:30.23,0:37:31.66,Default,,0000,0000,0000,,Hi everyone. Dialogue: 0,0:37:31.66,0:37:34.97,Default,,0000,0000,0000,,Thank you everyone for joining\Nus today at this hearing, Dialogue: 0,0:37:34.97,0:37:36.22,Default,,0000,0000,0000,,my name is Karina Hem, Dialogue: 0,0:37:36.22,0:37:38.06,Default,,0000,0000,0000,,and I'm the field organizer Dialogue: 0,0:37:38.06,0:37:39.23,Default,,0000,0000,0000,,for the Student Immigrant Movement. Dialogue: 0,0:37:39.23,0:37:41.38,Default,,0000,0000,0000,,I would like to also thank our partners Dialogue: 0,0:37:41.38,0:37:43.73,Default,,0000,0000,0000,,who've been working on the\Nfacial recognition ban, Dialogue: 0,0:37:43.73,0:37:46.58,Default,,0000,0000,0000,,city councilors, Michelle\NWu, Ricardo Arroyo, Dialogue: 0,0:37:46.58,0:37:49.23,Default,,0000,0000,0000,,Andrea Campbell, Kim Janey.\NAnnesa Essaibi George, Dialogue: 0,0:37:49.23,0:37:51.17,Default,,0000,0000,0000,,along with ACLU of Massachusetts, Dialogue: 0,0:37:51.17,0:37:53.74,Default,,0000,0000,0000,,Muslim Justice League\Nand Unafraid Educators. Dialogue: 0,0:37:53.74,0:37:55.78,Default,,0000,0000,0000,,So SIM is a statewide\Ngrassroots organization Dialogue: 0,0:37:55.78,0:37:58.30,Default,,0000,0000,0000,,by and for the undocumented youth. Dialogue: 0,0:37:58.30,0:38:01.33,Default,,0000,0000,0000,,We work closely with our\Nmembers or our young people Dialogue: 0,0:38:01.33,0:38:03.91,Default,,0000,0000,0000,,to create trusting and\Nempowering relationships. Dialogue: 0,0:38:03.91,0:38:06.47,Default,,0000,0000,0000,,At SIM we fight for the\Npermanent protection Dialogue: 0,0:38:06.47,0:38:08.42,Default,,0000,0000,0000,,of our undocumented\Nyouth and their families Dialogue: 0,0:38:08.42,0:38:09.83,Default,,0000,0000,0000,,through collective action. Dialogue: 0,0:38:09.83,0:38:11.57,Default,,0000,0000,0000,,And one of the major role of the work Dialogue: 0,0:38:11.57,0:38:13.23,Default,,0000,0000,0000,,in the organizing that we do, Dialogue: 0,0:38:13.23,0:38:15.87,Default,,0000,0000,0000,,is to challenge these systems\Nthat continue to exploit Dialogue: 0,0:38:15.87,0:38:17.81,Default,,0000,0000,0000,,and disenfranchise the undocumented Dialogue: 0,0:38:18.99,0:38:19.82,Default,,0000,0000,0000,,and immigrant communities. Dialogue: 0,0:38:19.82,0:38:21.90,Default,,0000,0000,0000,,So I am here on behalf of SIM Dialogue: 0,0:38:21.90,0:38:23.20,Default,,0000,0000,0000,,to support the ban on facial recognition Dialogue: 0,0:38:23.20,0:38:24.68,Default,,0000,0000,0000,,because it's another system, Dialogue: 0,0:38:24.68,0:38:26.50,Default,,0000,0000,0000,,it's another tool that will intentionally Dialogue: 0,0:38:26.50,0:38:28.89,Default,,0000,0000,0000,,be used against marginalized community. Dialogue: 0,0:38:28.89,0:38:30.90,Default,,0000,0000,0000,,So the undocumented immigrant community Dialogue: 0,0:38:30.90,0:38:33.44,Default,,0000,0000,0000,,is not mutually exclusive\Nto one particular group. Dialogue: 0,0:38:33.44,0:38:35.54,Default,,0000,0000,0000,,We have transgender people, Dialogue: 0,0:38:35.54,0:38:37.55,Default,,0000,0000,0000,,black, Latina, exmuslim,\Npeople with disabilities, Dialogue: 0,0:38:37.55,0:38:39.20,Default,,0000,0000,0000,,and it's so intersectional. 
Dialogue: 0,0:38:39.20,0:38:42.35,Default,,0000,0000,0000,,And so the stakes of getting\Nmisidentified are high
Dialogue: 0,0:38:42.35,0:38:44.28,Default,,0000,0000,0000,,because this creates a dangerous situation
Dialogue: 0,0:38:44.28,0:38:45.26,Default,,0000,0000,0000,,for people who are undocumented.
Dialogue: 0,0:38:45.26,0:38:47.44,Default,,0000,0000,0000,,And these programs are\Nmade to accurately identify
Dialogue: 0,0:38:47.44,0:38:49.50,Default,,0000,0000,0000,,white adult males, leaving\Nblack and brown folks
Dialogue: 0,0:38:49.50,0:38:53.14,Default,,0000,0000,0000,,at risk on top of living\Nin marginalized communities.
Dialogue: 0,0:38:53.14,0:38:55.08,Default,,0000,0000,0000,,And so many of the\Nyoung people I work with
Dialogue: 0,0:38:55.08,0:38:56.70,Default,,0000,0000,0000,,are students who are in college
Dialogue: 0,0:38:56.70,0:38:58.40,Default,,0000,0000,0000,,and many are still in high school.
Dialogue: 0,0:38:58.40,0:39:00.64,Default,,0000,0000,0000,,Face surveillance technology\Nis not meant for children,
Dialogue: 0,0:39:00.64,0:39:02.52,Default,,0000,0000,0000,,and it's not meant for young people,
Dialogue: 0,0:39:02.52,0:39:04.04,Default,,0000,0000,0000,,and law enforcement already uses it
Dialogue: 0,0:39:04.04,0:39:07.46,Default,,0000,0000,0000,,to monitor the youth in places\Nlike Lockport, New York.
Dialogue: 0,0:39:07.46,0:39:08.61,Default,,0000,0000,0000,,Their face surveillance system
Dialogue: 0,0:39:08.61,0:39:10.78,Default,,0000,0000,0000,,has been adopted in their school district,
Dialogue: 0,0:39:10.78,0:39:12.65,Default,,0000,0000,0000,,and they've already spent\Nover a million dollars
Dialogue: 0,0:39:12.65,0:39:13.48,Default,,0000,0000,0000,,on the system.
Dialogue: 0,0:39:13.48,0:39:14.42,Default,,0000,0000,0000,,So you ask yourself,
Dialogue: 0,0:39:14.42,0:39:16.80,Default,,0000,0000,0000,,what exactly is the intent\Nof this surveillance?
Dialogue: 0,0:39:16.80,0:39:19.40,Default,,0000,0000,0000,,Do youth still have the\Nright to their privacy
Dialogue: 0,0:39:19.40,0:39:22.02,Default,,0000,0000,0000,,and the right to live\Nwithout constant infringement
Dialogue: 0,0:39:22.02,0:39:24.06,Default,,0000,0000,0000,,and interference from law enforcement?
Dialogue: 0,0:39:24.06,0:39:25.92,Default,,0000,0000,0000,,If another city allows facial surveillance
Dialogue: 0,0:39:25.92,0:39:27.66,Default,,0000,0000,0000,,in their schools to happen,
Dialogue: 0,0:39:27.66,0:39:29.74,Default,,0000,0000,0000,,the chance of that occurring\Nhere is high as well
Dialogue: 0,0:39:29.74,0:39:32.33,Default,,0000,0000,0000,,if we do not ban facial\Nrecognition in Boston.
Dialogue: 0,0:39:32.33,0:39:33.96,Default,,0000,0000,0000,,And we believe this because the youth,
Dialogue: 0,0:39:33.96,0:39:34.95,Default,,0000,0000,0000,,the undocumented youth,
Dialogue: 0,0:39:34.95,0:39:37.56,Default,,0000,0000,0000,,they are already not\Nprotected in their schools.
Dialogue: 0,0:39:37.56,0:39:38.95,Default,,0000,0000,0000,,School disciplinary reports,
Dialogue: 0,0:39:38.95,0:39:42.28,Default,,0000,0000,0000,,which currently have no guidelines,\Nprotocol or clear criteria
Dialogue: 0,0:39:42.28,0:39:44.19,Default,,0000,0000,0000,,on what exactly school officers can write,
Dialogue: 0,0:39:44.19,0:39:46.73,Default,,0000,0000,0000,,are sent to the superintendent\Nor their designee
Dialogue: 0,0:39:46.73,0:39:49.01,Default,,0000,0000,0000,,to sign off and send to the\NBoston Police Department.
Dialogue: 0,0:39:49.01,0:39:52.40,Default,,0000,0000,0000,,And still we've seen that\Nsome of these reports
Dialogue: 0,0:39:52.40,0:39:53.23,Default,,0000,0000,0000,,actually go straight to\Nthe police department
Dialogue: 0,0:39:53.23,0:39:54.18,Default,,0000,0000,0000,,with no oversight.
Dialogue: 0,0:39:55.06,0:39:56.55,Default,,0000,0000,0000,,And so this endangers the\Nlives of immigrant youth,
Dialogue: 0,0:39:56.55,0:39:59.18,Default,,0000,0000,0000,,because the Boston Police\NDepartment shares information
Dialogue: 0,0:39:59.18,0:40:02.06,Default,,0000,0000,0000,,and databases with agencies like I.C.E.
Dialogue: 0,0:40:02.06,0:40:03.68,Default,,0000,0000,0000,,So it creates a gateway to imprisonment
Dialogue: 0,0:40:03.68,0:40:05.76,Default,,0000,0000,0000,,and deportation for any\Nof these young folks.
Dialogue: 0,0:40:05.76,0:40:07.80,Default,,0000,0000,0000,,And so adding facial\Nsurveillance into schools
Dialogue: 0,0:40:07.80,0:40:10.59,Default,,0000,0000,0000,,and in the city would be used\Nagainst those same students
Dialogue: 0,0:40:10.59,0:40:11.67,Default,,0000,0000,0000,,who are already at risk
Dialogue: 0,0:40:11.67,0:40:14.17,Default,,0000,0000,0000,,of being separated from their community.
Dialogue: 0,0:40:14.17,0:40:15.34,Default,,0000,0000,0000,,We know that black and brown youth
Dialogue: 0,0:40:15.34,0:40:16.99,Default,,0000,0000,0000,,experience depression every day,
Dialogue: 0,0:40:18.46,0:40:19.81,Default,,0000,0000,0000,,and they already are deemed\Nsuspicious by law enforcement
Dialogue: 0,0:40:20.79,0:40:21.88,Default,,0000,0000,0000,,based on their skin color\Nand the clothes they wear,
Dialogue: 0,0:40:21.88,0:40:23.87,Default,,0000,0000,0000,,the people they interact with, and more.
Dialogue: 0,0:40:23.87,0:40:25.50,Default,,0000,0000,0000,,And so adding facial recognition,
Dialogue: 0,0:40:25.50,0:40:28.07,Default,,0000,0000,0000,,that's not even accurate, to\Nsurveil our youth and families
Dialogue: 0,0:40:28.07,0:40:32.54,Default,,0000,0000,0000,,will just be used to justify\Ntheir arrests or deportation.
Dialogue: 0,0:40:32.54,0:40:35.35,Default,,0000,0000,0000,,And through the policing system,
Dialogue: 0,0:40:35.35,0:40:36.46,Default,,0000,0000,0000,,we are already asking police officers
Dialogue: 0,0:40:36.46,0:40:37.87,Default,,0000,0000,0000,,to engage with the youth in ways
Dialogue: 0,0:40:37.87,0:40:40.53,Default,,0000,0000,0000,,that are just harmful to\Nyouth development.
Dialogue: 0,0:40:40.53,0:40:45.18,Default,,0000,0000,0000,,The city of Boston needs to\Nstop criminalizing young folks,
Dialogue: 0,0:40:45.18,0:40:47.56,Default,,0000,0000,0000,,whether they engage in\Ncriminal activity or not,
Dialogue: 0,0:40:47.56,0:40:49.98,Default,,0000,0000,0000,,arresting them will not get\Nto the root of the problem.
Dialogue: 0,0:40:49.98,0:40:51.13,Default,,0000,0000,0000,,There are so many options
Dialogue: 0,0:40:51.13,0:40:53.74,Default,,0000,0000,0000,,that are much safer and\Nmore secure for our youth,
Dialogue: 0,0:40:53.74,0:40:56.07,Default,,0000,0000,0000,,for their families and their futures.
Dialogue: 0,0:40:56.07,0:40:58.02,Default,,0000,0000,0000,,Banning facial recognition\Nwould mean law enforcement
Dialogue: 0,0:40:58.02,0:41:00.66,Default,,0000,0000,0000,,has one less tool to\Ncriminalize our youth.
Dialogue: 0,0:41:00.66,0:41:01.86,Default,,0000,0000,0000,,Thanks so much everyone.
Dialogue: 0,0:41:02.98,0:41:04.69,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,0:41:04.69,0:41:09.06,Default,,0000,0000,0000,,Commissioner, I understand\Ntime is of the essence,
Dialogue: 0,0:41:09.06,0:41:13.88,Default,,0000,0000,0000,,but I'm really hoping you can\Nstay till about 4:15, 4:30,
Dialogue: 0,0:41:15.40,0:41:18.14,Default,,0000,0000,0000,,please, please.
Dialogue: 0,0:41:21.69,0:41:25.22,Default,,0000,0000,0000,,I can stay until four, that's it.
Dialogue: 0,0:41:25.22,0:41:29.59,Default,,0000,0000,0000,,I have to go at four, that long.
Dialogue: 0,0:41:30.44,0:41:33.25,Default,,0000,0000,0000,,Okay, so-
Dialogue: 0,0:41:34.83,0:41:35.90,Default,,0000,0000,0000,,I appreciate everyone's testimony as well.
Dialogue: 0,0:41:35.90,0:41:37.83,Default,,0000,0000,0000,,I have a team, we're taking notes.
Dialogue: 0,0:41:37.83,0:41:41.94,Default,,0000,0000,0000,,So don't think when I go\Nthat no one's taking notes.
Dialogue: 0,0:41:41.94,0:41:43.51,Default,,0000,0000,0000,,We would really wanna meet again
Dialogue: 0,0:41:43.51,0:41:45.86,Default,,0000,0000,0000,,and discuss this technology.
Dialogue: 0,0:41:46.75,0:41:48.27,Default,,0000,0000,0000,,I know that there are some councilors
Dialogue: 0,0:41:48.27,0:41:50.53,Default,,0000,0000,0000,,that have some clarifying questions
Dialogue: 0,0:41:50.53,0:41:53.10,Default,,0000,0000,0000,,and we have three more people to speak.
Dialogue: 0,0:41:53.10,0:41:56.87,Default,,0000,0000,0000,,And so I am begging you, please,
Dialogue: 0,0:41:56.87,0:41:59.55,Default,,0000,0000,0000,,if you can at least get\Nto the lead sponsors,
Dialogue: 0,0:41:59.55,0:42:01.78,Default,,0000,0000,0000,,councilor Wu and councilor Arroyo.
Dialogue: 0,0:42:03.44,0:42:07.67,Default,,0000,0000,0000,,That'll be three, six, nine, 10.
Dialogue: 0,0:42:07.67,0:42:09.40,Default,,0000,0000,0000,,That'd be probably 20 more minutes.
Dialogue: 0,0:42:09.40,0:42:11.24,Default,,0000,0000,0000,,A little after four.
Dialogue: 0,0:42:11.24,0:42:14.44,Default,,0000,0000,0000,,Okay.\NA little close for me.
Dialogue: 0,0:42:14.44,0:42:15.48,Default,,0000,0000,0000,,[Gross laughs]
Dialogue: 0,0:42:15.48,0:42:17.10,Default,,0000,0000,0000,,We're trying, we're trying.
Dialogue: 0,0:42:17.10,0:42:18.70,Default,,0000,0000,0000,,I understand, I understand, go ahead.
Dialogue: 0,0:42:18.70,0:42:22.14,Default,,0000,0000,0000,,Thank you, Kade.
Dialogue: 0,0:42:22.14,0:42:24.17,Default,,0000,0000,0000,,Thank you chair.
Dialogue: 0,0:42:24.17,0:42:25.31,Default,,0000,0000,0000,,My name is Kade Crawford.
Dialogue: 0,0:42:25.31,0:42:27.79,Default,,0000,0000,0000,,I am the director of the\NTechnology for Liberty Program
Dialogue: 0,0:42:27.79,0:42:29.46,Default,,0000,0000,0000,,at the ACLU of Massachusetts.
Dialogue: 0,0:42:29.46,0:42:31.93,Default,,0000,0000,0000,,Chair Edwards, members of the committee
Dialogue: 0,0:42:31.93,0:42:33.22,Default,,0000,0000,0000,,and members of the city council,
Dialogue: 0,0:42:33.22,0:42:34.75,Default,,0000,0000,0000,,thank you all for the opportunity
Dialogue: 0,0:42:34.75,0:42:36.97,Default,,0000,0000,0000,,to speak on this crucial issue today.
Dialogue: 0,0:42:36.97,0:42:40.75,Default,,0000,0000,0000,,We are obviously in the midst\Nof a massive social upheaval
Dialogue: 0,0:42:40.75,0:42:43.13,Default,,0000,0000,0000,,as people across Boston and the nation
Dialogue: 0,0:42:43.13,0:42:46.01,Default,,0000,0000,0000,,demand that we ensure black lives matter
Dialogue: 0,0:42:46.01,0:42:48.65,Default,,0000,0000,0000,,by shifting our budget\Npriorities away from policing
Dialogue: 0,0:42:48.65,0:42:51.43,Default,,0000,0000,0000,,and incarceration and\Ntowards government programs
Dialogue: 0,0:42:51.43,0:42:54.06,Default,,0000,0000,0000,,that lift people up instead\Nof holding them back.
Dialogue: 0,0:42:54.06,0:42:56.85,Default,,0000,0000,0000,,And people are outraged\Nbecause for too long
Dialogue: 0,0:42:56.85,0:42:59.25,Default,,0000,0000,0000,,police departments have\Nbeen armed to the teeth
Dialogue: 0,0:42:59.25,0:43:02.54,Default,,0000,0000,0000,,and equipped with military\Nstyle surveillance technologies,
Dialogue: 0,0:43:02.54,0:43:04.97,Default,,0000,0000,0000,,like the ones that My'Kel just referenced.
Dialogue: 0,0:43:04.97,0:43:08.34,Default,,0000,0000,0000,,Even as our schools lack\Nthe most basic needs,
Dialogue: 0,0:43:08.34,0:43:11.46,Default,,0000,0000,0000,,like functioning water\Nfountains, nurses and counselors.
Dialogue: 0,0:43:11.46,0:43:13.62,Default,,0000,0000,0000,,And even now during this pandemic,
Dialogue: 0,0:43:13.62,0:43:16.69,Default,,0000,0000,0000,,as our healthcare providers\Nlack the PPE they need
Dialogue: 0,0:43:16.69,0:43:18.88,Default,,0000,0000,0000,,to protect themselves and us,
Dialogue: 0,0:43:18.88,0:43:21.34,Default,,0000,0000,0000,,our government cannot continue to act
Dialogue: 0,0:43:23.06,0:43:24.81,Default,,0000,0000,0000,,as if it is at war with its residents,
Dialogue: 0,0:43:24.81,0:43:25.96,Default,,0000,0000,0000,,waging counter-insurgency surveillance
Dialogue: 0,0:43:25.96,0:43:27.98,Default,,0000,0000,0000,,and control operations,
Dialogue: 0,0:43:27.98,0:43:29.72,Default,,0000,0000,0000,,particularly in black\Nand brown neighborhoods
Dialogue: 0,0:43:29.72,0:43:31.47,Default,,0000,0000,0000,,and against black and brown people.
Dialogue: 0,0:43:32.56,0:43:35.71,Default,,0000,0000,0000,,We must act, we must not merely\Nspeak, to change our laws,
Dialogue: 0,0:43:35.71,0:43:37.99,Default,,0000,0000,0000,,to ensure we are building a free,
Dialogue: 0,0:43:37.99,0:43:41.62,Default,,0000,0000,0000,,just and equitable future\Nfor all people in Boston.
Dialogue: 0,0:43:41.62,0:43:42.66,Default,,0000,0000,0000,,So that said,
Dialogue: 0,0:43:42.66,0:43:45.52,Default,,0000,0000,0000,,banning face surveillance\Nin Boston is a no-brainer.
Dialogue: 0,0:43:45.52,0:43:47.64,Default,,0000,0000,0000,,It is one small but vital piece
Dialogue: 0,0:43:47.64,0:43:50.78,Default,,0000,0000,0000,,of this larger necessary transformation.
Dialogue: 0,0:43:50.78,0:43:52.84,Default,,0000,0000,0000,,Face surveillance is\Ndangerous when it works
Dialogue: 0,0:43:52.84,0:43:54.26,Default,,0000,0000,0000,,and when it doesn't.
Dialogue: 0,0:43:54.26,0:43:56.18,Default,,0000,0000,0000,,Banning this technology in Boston now
Dialogue: 0,0:43:56.18,0:43:58.26,Default,,0000,0000,0000,,is not an academic issue.
Dialogue: 0,0:43:58.26,0:44:00.97,Default,,0000,0000,0000,,Indeed the city already uses technology
Dialogue: 0,0:44:00.97,0:44:02.90,Default,,0000,0000,0000,,that with one mere software upgrade
Dialogue: 0,0:44:02.90,0:44:06.63,Default,,0000,0000,0000,,could blanket Boston in the\Nkind of dystopian surveillance
Dialogue: 0,0:44:06.63,0:44:09.05,Default,,0000,0000,0000,,currently practiced by\Nauthoritarian regimes
Dialogue: 0,0:44:09.05,0:44:10.63,Default,,0000,0000,0000,,in China and Russia.
Dialogue: 0,0:44:10.63,0:44:11.88,Default,,0000,0000,0000,,And as Joy said,
Dialogue: 0,0:44:11.88,0:44:14.85,Default,,0000,0000,0000,,even some cities right\Nhere in the United States.
Dialogue: 0,0:44:14.85,0:44:17.16,Default,,0000,0000,0000,,Boston has to strike a different path
Dialogue: 0,0:44:17.16,0:44:19.76,Default,,0000,0000,0000,,by protecting privacy, racial justice,
Dialogue: 0,0:44:19.76,0:44:21.37,Default,,0000,0000,0000,,and first amendment rights.
Dialogue: 0,0:44:21.37,0:44:23.80,Default,,0000,0000,0000,,We must ban this technology now,
Dialogue: 0,0:44:23.80,0:44:26.90,Default,,0000,0000,0000,,before it creeps into our\Ngovernment, in the shadows,
Dialogue: 0,0:44:26.90,0:44:29.41,Default,,0000,0000,0000,,with no democratic control or oversight
Dialogue: 0,0:44:29.41,0:44:31.68,Default,,0000,0000,0000,,as it has in countless other cities,
Dialogue: 0,0:44:31.68,0:44:34.12,Default,,0000,0000,0000,,across the country and across the world.
Dialogue: 0,0:44:34.12,0:44:37.34,Default,,0000,0000,0000,,So today you already\Nheard from pretty much
Dialogue: 0,0:44:37.34,0:44:38.76,Default,,0000,0000,0000,,the world renowned expert
Dialogue: 0,0:44:38.76,0:44:41.26,Default,,0000,0000,0000,,on racial and gender bias\Nin facial recognition.
Dialogue: 0,0:44:41.26,0:44:44.08,Default,,0000,0000,0000,,Thank you so much Joy\Nfor being with us today,
Dialogue: 0,0:44:44.08,0:44:46.29,Default,,0000,0000,0000,,your work blazed a trail\Nand showed the world
Dialogue: 0,0:44:46.29,0:44:49.15,Default,,0000,0000,0000,,that yes, algorithms\Ncan in fact be racist.
Dialogue: 0,0:44:49.15,0:44:50.89,Default,,0000,0000,0000,,You also heard from My'Kel McMillen,
Dialogue: 0,0:44:50.89,0:44:53.93,Default,,0000,0000,0000,,a young Boston resident\Nwho has experienced firsthand
Dialogue: 0,0:44:53.93,0:44:55.26,Default,,0000,0000,0000,,what draconian surveillance
Dialogue: 0,0:44:55.26,0:44:57.53,Default,,0000,0000,0000,,with no accountability looks like.
Dialogue: 0,0:44:57.53,0:44:59.88,Default,,0000,0000,0000,,Karina Hem from the\NStudent Immigrant Movement
Dialogue: 0,0:44:59.88,0:45:02.03,Default,,0000,0000,0000,,spoke to the need to\Nkeep face surveillance
Dialogue: 0,0:45:03.85,0:45:05.27,Default,,0000,0000,0000,,out of our public schools,
Dialogue: 0,0:45:05.27,0:45:06.89,Default,,0000,0000,0000,,where students deserve to be\Nable to learn without fear.
Dialogue: 0,0:45:06.89,0:45:09.70,Default,,0000,0000,0000,,You're gonna hear a\Nsimilar set of sentiments Dialogue: 0,0:45:09.70,0:45:12.10,Default,,0000,0000,0000,,from Erik Berg of the\NBoston Teachers Union Dialogue: 0,0:45:12.10,0:45:15.05,Default,,0000,0000,0000,,and shortly medical doctor\Nand infectious disease expert, Dialogue: 0,0:45:15.05,0:45:16.71,Default,,0000,0000,0000,,Joshua Barocas will testify Dialogue: 0,0:45:16.71,0:45:18.93,Default,,0000,0000,0000,,to how this technology in Boston, Dialogue: 0,0:45:18.93,0:45:21.37,Default,,0000,0000,0000,,would inevitably undermine trust Dialogue: 0,0:45:21.37,0:45:22.93,Default,,0000,0000,0000,,among the most vulnerable people Dialogue: 0,0:45:22.93,0:45:24.78,Default,,0000,0000,0000,,seeking medical care and help. Dialogue: 0,0:45:24.78,0:45:27.32,Default,,0000,0000,0000,,Obviously that's the last\Nthing that we need to do, Dialogue: 0,0:45:27.32,0:45:29.46,Default,,0000,0000,0000,,especially during a pandemic. Dialogue: 0,0:45:29.46,0:45:32.51,Default,,0000,0000,0000,,For too long in Boston,\Ncity agencies have acquired Dialogue: 0,0:45:32.51,0:45:34.91,Default,,0000,0000,0000,,and deployed invasive\Nsurveillance technologies Dialogue: 0,0:45:34.91,0:45:37.53,Default,,0000,0000,0000,,with no public debate, transparency, Dialogue: 0,0:45:37.53,0:45:39.46,Default,,0000,0000,0000,,accountability or oversight. Dialogue: 0,0:45:39.46,0:45:41.25,Default,,0000,0000,0000,,In most cases, these technologies, Dialogue: 0,0:45:41.25,0:45:43.18,Default,,0000,0000,0000,,which as My'Kel said, include drones, Dialogue: 0,0:45:43.18,0:45:45.36,Default,,0000,0000,0000,,but also license plate readers, Dialogue: 0,0:45:45.36,0:45:48.47,Default,,0000,0000,0000,,social media surveillance,\Nvideo analytics software Dialogue: 0,0:45:48.47,0:45:51.10,Default,,0000,0000,0000,,and military style cell\Nphone spying devices, Dialogue: 0,0:45:51.10,0:45:52.53,Default,,0000,0000,0000,,among many others. Dialogue: 0,0:45:52.53,0:45:54.35,Default,,0000,0000,0000,,These were purchased and deployed Dialogue: 0,0:45:54.35,0:45:58.09,Default,,0000,0000,0000,,without the city council's\Nknowledge, let alone approval. Dialogue: 0,0:45:58.09,0:45:59.89,Default,,0000,0000,0000,,We have seen this routine play out Dialogue: 0,0:45:59.89,0:46:03.23,Default,,0000,0000,0000,,over and over and over for years. Dialogue: 0,0:46:03.23,0:46:06.23,Default,,0000,0000,0000,,This is a problem with all\Nsurveillance technologies, Dialogue: 0,0:46:06.23,0:46:08.54,Default,,0000,0000,0000,,but it is especially unacceptable Dialogue: 0,0:46:08.54,0:46:10.14,Default,,0000,0000,0000,,with a technology as dangerous Dialogue: 0,0:46:10.14,0:46:12.25,Default,,0000,0000,0000,,and dystopian as face surveillance. Dialogue: 0,0:46:12.25,0:46:14.54,Default,,0000,0000,0000,,It is not an exaggeration to say that Dialogue: 0,0:46:14.54,0:46:15.59,Default,,0000,0000,0000,,face surveillance would, Dialogue: 0,0:46:15.59,0:46:18.10,Default,,0000,0000,0000,,if we allow it destroy privacy Dialogue: 0,0:46:18.10,0:46:20.49,Default,,0000,0000,0000,,and anonymity in public space. 
Dialogue: 0,0:46:20.49,0:46:22.27,Default,,0000,0000,0000,,Face surveillance technology
Dialogue: 0,0:46:22.27,0:46:25.43,Default,,0000,0000,0000,,paired with the thousands of\Nnetworked surveillance cameras
Dialogue: 0,0:46:25.43,0:46:27.61,Default,,0000,0000,0000,,already installed throughout the city,
Dialogue: 0,0:46:27.61,0:46:30.85,Default,,0000,0000,0000,,would enable any official\Nwith access to the system
Dialogue: 0,0:46:30.85,0:46:33.64,Default,,0000,0000,0000,,to automatically catalog the movements,
Dialogue: 0,0:46:33.64,0:46:38.28,Default,,0000,0000,0000,,habits and associations of\Nall people at all times,
Dialogue: 0,0:46:38.28,0:46:40.14,Default,,0000,0000,0000,,merely with the push of a button.
Dialogue: 0,0:46:40.98,0:46:43.73,Default,,0000,0000,0000,,This is technology that\Nwould make it trivial
Dialogue: 0,0:46:43.73,0:46:46.50,Default,,0000,0000,0000,,for the government to\Nblackmail public officials
Dialogue: 0,0:46:46.50,0:46:48.41,Default,,0000,0000,0000,,for seeking substance use treatment,
Dialogue: 0,0:46:48.41,0:46:50.05,Default,,0000,0000,0000,,or to identify whistleblowers
Dialogue: 0,0:46:50.05,0:46:51.87,Default,,0000,0000,0000,,who are speaking to the Boston Globe
Dialogue: 0,0:46:51.87,0:46:54.52,Default,,0000,0000,0000,,or in its most routine manifestation
Dialogue: 0,0:46:54.52,0:46:57.36,Default,,0000,0000,0000,,to supercharge existing police harassment
Dialogue: 0,0:46:57.36,0:46:59.94,Default,,0000,0000,0000,,and surveillance of\Nblack and brown residents
Dialogue: 0,0:46:59.94,0:47:02.20,Default,,0000,0000,0000,,merely because of the color of their skin
Dialogue: 0,0:47:02.20,0:47:03.86,Default,,0000,0000,0000,,and where they live.
Dialogue: 0,0:47:03.86,0:47:05.77,Default,,0000,0000,0000,,Using face recognition tech,
Dialogue: 0,0:47:05.77,0:47:09.49,Default,,0000,0000,0000,,the police could very easily\Ntake photos or videos of people
Dialogue: 0,0:47:09.49,0:47:12.63,Default,,0000,0000,0000,,at any of these Black Lives\NMatter demonstrations,
Dialogue: 0,0:47:12.63,0:47:15.10,Default,,0000,0000,0000,,run those images through\Na computer program
Dialogue: 0,0:47:15.10,0:47:17.36,Default,,0000,0000,0000,,and automatically populate a list
Dialogue: 0,0:47:17.36,0:47:19.10,Default,,0000,0000,0000,,of each person who attended
Dialogue: 0,0:47:19.10,0:47:21.01,Default,,0000,0000,0000,,to express their first amendment rights
Dialogue: 0,0:47:21.01,0:47:22.87,Default,,0000,0000,0000,,to demand racial justice.
Dialogue: 0,0:47:22.87,0:47:25.52,Default,,0000,0000,0000,,The police could also\Nask a computer program
Dialogue: 0,0:47:25.52,0:47:27.88,Default,,0000,0000,0000,,to automatically populate\Na list of every person
Dialogue: 0,0:47:27.88,0:47:31.01,Default,,0000,0000,0000,,who walked down a specific\Nstreet on any given morning
Dialogue: 0,0:47:31.01,0:47:33.01,Default,,0000,0000,0000,,and share that information with I.C.E.
Dialogue: 0,0:47:33.01,0:47:34.03,Default,,0000,0000,0000,,It could also be used
Dialogue: 0,0:47:34.03,0:47:36.19,Default,,0000,0000,0000,,to automatically alert law enforcement,
Dialogue: 0,0:47:36.19,0:47:38.02,Default,,0000,0000,0000,,whenever a specific person passes
Dialogue: 0,0:47:38.02,0:47:40.95,Default,,0000,0000,0000,,by a specific surveillance\Ncamera anywhere in the city.
Dialogue: 0,0:47:40.95,0:47:43.15,Default,,0000,0000,0000,,And again, there are\Nthousands of these cameras
Dialogue: 0,0:47:43.15,0:47:45.36,Default,,0000,0000,0000,,with more going up each week.
Dialogue: 0,0:47:45.36,0:47:47.91,Default,,0000,0000,0000,,Yes, so we're, I'm so sorry.
Dialogue: 0,0:47:47.91,0:47:50.75,Default,,0000,0000,0000,,You're over the three minutes.
Dialogue: 0,0:47:50.75,0:47:51.83,Default,,0000,0000,0000,,Well, over the three minutes,
Dialogue: 0,0:47:51.83,0:47:53.97,Default,,0000,0000,0000,,but I'm I...
Dialogue: 0,0:47:53.97,0:47:57.09,Default,,0000,0000,0000,,and I have two more people\Nplus trying to get this.
Dialogue: 0,0:47:57.09,0:48:00.83,Default,,0000,0000,0000,,So I'm just letting you\Nknow if you could summarize.
Dialogue: 0,0:48:00.83,0:48:02.61,Default,,0000,0000,0000,,All right, I'm almost done, thank you.
Dialogue: 0,0:48:02.61,0:48:03.96,Default,,0000,0000,0000,,Sorry, chair.
Dialogue: 0,0:48:03.96,0:48:06.35,Default,,0000,0000,0000,,So I'll just skip to the end and say,
Dialogue: 0,0:48:06.35,0:48:08.69,Default,,0000,0000,0000,,we're at a fork in the road right now,
Dialogue: 0,0:48:08.69,0:48:12.52,Default,,0000,0000,0000,,as Joy said, you know, major corporations,
Dialogue: 0,0:48:12.52,0:48:14.67,Default,,0000,0000,0000,,including IBM and Google are now declining
Dialogue: 0,0:48:14.67,0:48:17.23,Default,,0000,0000,0000,,to sell this technology\Nand that's for good reason,
Dialogue: 0,0:48:17.23,0:48:19.88,Default,,0000,0000,0000,,because of the real\Npotential for grave human
Dialogue: 0,0:48:19.88,0:48:21.66,Default,,0000,0000,0000,,and civil rights abuses.
Dialogue: 0,0:48:21.66,0:48:23.89,Default,,0000,0000,0000,,We the people, if we live in a democracy,
Dialogue: 0,0:48:23.89,0:48:25.37,Default,,0000,0000,0000,,need to be in the driver's seat
Dialogue: 0,0:48:25.37,0:48:26.35,Default,,0000,0000,0000,,in terms of determining
Dialogue: 0,0:48:26.35,0:48:28.54,Default,,0000,0000,0000,,whether we will use these technologies.
Dialogue: 0,0:48:28.54,0:48:31.15,Default,,0000,0000,0000,,We can either continue\Nwith business as usual,
Dialogue: 0,0:48:31.15,0:48:32.72,Default,,0000,0000,0000,,allowing governments to adopt
Dialogue: 0,0:48:32.72,0:48:35.63,Default,,0000,0000,0000,,and deploy these technologies\Nunchecked in our communities,
Dialogue: 0,0:48:35.63,0:48:37.46,Default,,0000,0000,0000,,our streets, in our schools,
Dialogue: 0,0:48:37.46,0:48:41.34,Default,,0000,0000,0000,,or we can take bold\Naction now to press pause
Dialogue: 0,0:48:41.34,0:48:43.41,Default,,0000,0000,0000,,on the government's\Nuse of this technology,
Dialogue: 0,0:48:43.41,0:48:44.96,Default,,0000,0000,0000,,to protect our privacy
Dialogue: 0,0:48:44.96,0:48:48.49,Default,,0000,0000,0000,,and to build a safer, fairer,\Nfreer future for all of us.
Dialogue: 0,0:48:50.27,0:48:51.23,Default,,0000,0000,0000,,So please, I encourage the city council
Dialogue: 0,0:48:51.23,0:48:54.01,Default,,0000,0000,0000,,to join us by supporting\Nthis crucial ordinance
Dialogue: 0,0:48:54.01,0:48:55.62,Default,,0000,0000,0000,,with your advocacy and with your vote
Dialogue: 0,0:48:55.62,0:48:58.16,Default,,0000,0000,0000,,and I thank you all for\Nyour public service.
Dialogue: 0,0:48:58.16,0:49:01.42,Default,,0000,0000,0000,,Thank you very much,\NErik Berg, three minutes.
Dialogue: 0,0:49:02.58,0:49:04.38,Default,,0000,0000,0000,,Yeah, thank you.
Dialogue: 0,0:49:04.38,0:49:06.43,Default,,0000,0000,0000,,And I'll get right to the\Npoint in the interest of time.
Dialogue: 0,0:49:06.43,0:49:07.62,Default,,0000,0000,0000,,Thank you to the council
Dialogue: 0,0:49:07.62,0:49:09.83,Default,,0000,0000,0000,,for taking this up and for\Nallowing this time today.
Dialogue: 0,0:49:09.83,0:49:11.34,Default,,0000,0000,0000,,And I'm here to speak today Dialogue: 0,0:49:11.34,0:49:13.11,Default,,0000,0000,0000,,on behalf of the Boston Teachers Union Dialogue: 0,0:49:13.11,0:49:16.29,Default,,0000,0000,0000,,and our over 10,000 members\Nin support of the ordinance, Dialogue: 0,0:49:16.29,0:49:19.21,Default,,0000,0000,0000,,banning facial recognition\Ntechnology in Boston Dialogue: 0,0:49:19.21,0:49:22.31,Default,,0000,0000,0000,,that was presented by\Ncouncilors Wu and Arroyo. Dialogue: 0,0:49:22.31,0:49:24.56,Default,,0000,0000,0000,,We strongly oppose the\Nuse of this technology Dialogue: 0,0:49:24.56,0:49:26.05,Default,,0000,0000,0000,,in our public schools. Dialogue: 0,0:49:26.05,0:49:28.31,Default,,0000,0000,0000,,Boston public schools\Nshould be safe environments Dialogue: 0,0:49:28.31,0:49:29.46,Default,,0000,0000,0000,,for students to learn, Dialogue: 0,0:49:29.46,0:49:31.79,Default,,0000,0000,0000,,explore their identities\Nand intellect and play. Dialogue: 0,0:49:31.79,0:49:35.58,Default,,0000,0000,0000,,Face surveillance technology\Nthreatens that environment. Dialogue: 0,0:49:35.58,0:49:38.53,Default,,0000,0000,0000,,The technology also threatens\Nthe rights of our BTU members, Dialogue: 0,0:49:38.53,0:49:39.70,Default,,0000,0000,0000,,who must be able to go to work Dialogue: 0,0:49:39.70,0:49:41.49,Default,,0000,0000,0000,,without fearing that their every movement, Dialogue: 0,0:49:41.49,0:49:44.71,Default,,0000,0000,0000,,habit and association will\Nbe tracked and cataloged. Dialogue: 0,0:49:44.71,0:49:46.22,Default,,0000,0000,0000,,To our knowledge, Dialogue: 0,0:49:46.22,0:49:48.49,Default,,0000,0000,0000,,this technology is not\Nin use in our schools, Dialogue: 0,0:49:48.49,0:49:50.81,Default,,0000,0000,0000,,but we've already witnessed\Nsome experimenting with it. Dialogue: 0,0:49:50.81,0:49:53.50,Default,,0000,0000,0000,,Although it was unclear\Nif it was authorized. Dialogue: 0,0:49:53.50,0:49:55.82,Default,,0000,0000,0000,,Two summers ago, members of\Nthe Boston Teachers Union, Dialogue: 0,0:49:55.82,0:49:58.15,Default,,0000,0000,0000,,who were working in the summer program, Dialogue: 0,0:49:58.15,0:49:59.01,Default,,0000,0000,0000,,contacted the union to let us know Dialogue: 0,0:49:59.01,0:50:00.73,Default,,0000,0000,0000,,that they were being asked to sign in Dialogue: 0,0:50:00.73,0:50:05.06,Default,,0000,0000,0000,,using an app called Tinder\Nwith face recognition features. Dialogue: 0,0:50:05.06,0:50:05.90,Default,,0000,0000,0000,,And the central office Dialogue: 0,0:50:05.90,0:50:07.42,Default,,0000,0000,0000,,didn't seem to know about this program, Dialogue: 0,0:50:07.42,0:50:08.25,Default,,0000,0000,0000,,and to this day, Dialogue: 0,0:50:08.25,0:50:09.94,Default,,0000,0000,0000,,we don't know how it came about. 
Dialogue: 0,0:50:09.94,0:50:11.37,Default,,0000,0000,0000,,While the district quickly stopped
Dialogue: 0,0:50:11.37,0:50:13.01,Default,,0000,0000,0000,,using the photo portion of this app
Dialogue: 0,0:50:13.01,0:50:15.82,Default,,0000,0000,0000,,and informed the union that\Nall photos had been deleted,
Dialogue: 0,0:50:15.82,0:50:17.45,Default,,0000,0000,0000,,this incident is indicative
Dialogue: 0,0:50:17.45,0:50:20.76,Default,,0000,0000,0000,,of how easy it is for private\Nsecurity or HR companies
Dialogue: 0,0:50:20.76,0:50:23.75,Default,,0000,0000,0000,,to sell a technology to a\Nwell-intentioned principal
Dialogue: 0,0:50:23.75,0:50:26.35,Default,,0000,0000,0000,,or superintendent who\Nmay not have expertise
Dialogue: 0,0:50:26.35,0:50:27.31,Default,,0000,0000,0000,,in the tech field.
Dialogue: 0,0:50:28.61,0:50:31.33,Default,,0000,0000,0000,,Face surveillance in schools\Ntransforms all students
Dialogue: 0,0:50:31.33,0:50:33.35,Default,,0000,0000,0000,,and their family members,\Nas well as employees
Dialogue: 0,0:50:33.35,0:50:35.43,Default,,0000,0000,0000,,into perpetual suspects,
Dialogue: 0,0:50:35.43,0:50:37.28,Default,,0000,0000,0000,,where each and every\None of their movements
Dialogue: 0,0:50:37.28,0:50:39.49,Default,,0000,0000,0000,,can be automatically monitored.
Dialogue: 0,0:50:39.49,0:50:41.51,Default,,0000,0000,0000,,The use of this technology\Nin public schools
Dialogue: 0,0:50:41.51,0:50:43.49,Default,,0000,0000,0000,,will negatively impact students' ability
Dialogue: 0,0:50:43.49,0:50:46.48,Default,,0000,0000,0000,,to explore new ideas,\Nexpress their creativity,
Dialogue: 0,0:50:46.48,0:50:48.45,Default,,0000,0000,0000,,and engage in student dissent,
Dialogue: 0,0:50:48.45,0:50:50.21,Default,,0000,0000,0000,,an especially disturbing prospect
Dialogue: 0,0:50:50.21,0:50:51.93,Default,,0000,0000,0000,,given the current youth-led protests
Dialogue: 0,0:50:51.93,0:50:53.95,Default,,0000,0000,0000,,against police violence.
Dialogue: 0,0:50:53.95,0:50:55.37,Default,,0000,0000,0000,,Even worse, as we've heard,
Dialogue: 0,0:50:55.37,0:50:58.17,Default,,0000,0000,0000,,the technology is frequently\Nbiased and inaccurate,
Dialogue: 0,0:50:58.17,0:50:59.30,Default,,0000,0000,0000,,which raises concerns
Dialogue: 0,0:50:59.30,0:51:01.75,Default,,0000,0000,0000,,about its use to police students of color.
Dialogue: 0,0:51:01.75,0:51:04.21,Default,,0000,0000,0000,,Academic peer-reviewed studies
Dialogue: 0,0:51:04.21,0:51:06.14,Default,,0000,0000,0000,,show face surveillance algorithms
Dialogue: 0,0:51:06.14,0:51:08.35,Default,,0000,0000,0000,,are too often racially biased,
Dialogue: 0,0:51:08.35,0:51:10.04,Default,,0000,0000,0000,,particularly against black women
Dialogue: 0,0:51:10.04,0:51:14.27,Default,,0000,0000,0000,,with inaccuracy rates up to\N35% for that demographic.
Dialogue: 0,0:51:14.27,0:51:16.28,Default,,0000,0000,0000,,We know black and brown students
Dialogue: 0,0:51:16.28,0:51:19.55,Default,,0000,0000,0000,,are more likely to be punished\Nfor perceived misbehavior.
Dialogue: 0,0:51:19.55,0:51:21.36,Default,,0000,0000,0000,,Face surveillance will only perpetuate
Dialogue: 0,0:51:21.36,0:51:23.15,Default,,0000,0000,0000,,and reproduce this situation.
Dialogue: 0,0:51:24.36,0:51:25.99,Default,,0000,0000,0000,,When used to monitor children, Dialogue: 0,0:51:25.99,0:51:28.41,Default,,0000,0000,0000,,this technology fails\Nin an essential sense Dialogue: 0,0:51:28.41,0:51:29.94,Default,,0000,0000,0000,,because it has difficulty Dialogue: 0,0:51:29.94,0:51:33.41,Default,,0000,0000,0000,,accurately identifying young\Npeople as their faces change. Dialogue: 0,0:51:33.41,0:51:36.64,Default,,0000,0000,0000,,Research that tested five top performing Dialogue: 0,0:51:36.64,0:51:39.92,Default,,0000,0000,0000,,commercial off the shelf\Nface recognition systems Dialogue: 0,0:51:39.92,0:51:42.86,Default,,0000,0000,0000,,shows there's a negative bias\Nwhen they're used on children, Dialogue: 0,0:51:42.86,0:51:45.60,Default,,0000,0000,0000,,they perform more poorly on\Nchildren than on adults. Dialogue: 0,0:51:45.60,0:51:48.00,Default,,0000,0000,0000,,That's because these systems are modeled Dialogue: 0,0:51:48.00,0:51:49.36,Default,,0000,0000,0000,,through the use of adult faces Dialogue: 0,0:51:49.36,0:51:50.90,Default,,0000,0000,0000,,and children look different from adults Dialogue: 0,0:51:50.90,0:51:52.59,Default,,0000,0000,0000,,in such a way that they cannot be considered Dialogue: 0,0:51:52.59,0:51:54.82,Default,,0000,0000,0000,,simply scaled down versions. Dialogue: 0,0:51:56.08,0:51:57.84,Default,,0000,0000,0000,,On top of this, face Dialogue: 0,0:51:57.84,0:52:00.48,Default,,0000,0000,0000,,surveillance technology regularly Dialogue: 0,0:52:00.48,0:52:01.71,Default,,0000,0000,0000,,misgenders transgender people, Dialogue: 0,0:52:01.71,0:52:03.06,Default,,0000,0000,0000,,and will have a harmful impact Dialogue: 0,0:52:03.06,0:52:05.77,Default,,0000,0000,0000,,on transgender young\Npeople in our schools. Dialogue: 0,0:52:05.77,0:52:08.36,Default,,0000,0000,0000,,Research shows that\Nautomatic gender recognition Dialogue: 0,0:52:08.36,0:52:12.01,Default,,0000,0000,0000,,consistently views gender\Nin a trans-exclusive way, Dialogue: 0,0:52:12.01,0:52:14.40,Default,,0000,0000,0000,,and consequently carries\Ndisproportionate risk Dialogue: 0,0:52:14.40,0:52:16.06,Default,,0000,0000,0000,,for trans people subject to it. Dialogue: 0,0:52:17.16,0:52:18.72,Default,,0000,0000,0000,,At a time when transgender children Dialogue: 0,0:52:18.72,0:52:21.16,Default,,0000,0000,0000,,are being stripped of their\Nrights at a national level, Dialogue: 0,0:52:21.16,0:52:24.26,Default,,0000,0000,0000,,Boston must protect transgender\Nkids in our schools. Dialogue: 0,0:52:25.17,0:52:27.89,Default,,0000,0000,0000,,Moreover, face surveillance in schools Dialogue: 0,0:52:27.89,0:52:29.62,Default,,0000,0000,0000,,will contribute to the\Nschool to prison pipeline, Dialogue: 0,0:52:29.62,0:52:30.99,Default,,0000,0000,0000,,threatening children's welfare, Dialogue: 0,0:52:30.99,0:52:34.04,Default,,0000,0000,0000,,educational opportunities\Nand life trajectories. Dialogue: 0,0:52:34.04,0:52:36.50,Default,,0000,0000,0000,,Already, children from\Nmarginalized communities Dialogue: 0,0:52:36.50,0:52:38.62,Default,,0000,0000,0000,,are too often funneled\Nout of public schools Dialogue: 0,0:52:38.62,0:52:41.95,Default,,0000,0000,0000,,and into the juvenile and\Ncriminal justice systems - Dialogue: 0,0:52:41.95,0:52:44.71,Default,,0000,0000,0000,,face surveillance will inevitably feed this pipeline. Dialogue: 0,0:52:44.71,0:52:47.99,Default,,0000,0000,0000,,I'll get right to the end. 
Dialogue: 0,0:52:47.99,0:52:49.83,Default,,0000,0000,0000,,Finally, face surveillance technology Dialogue: 0,0:52:49.83,0:52:51.94,Default,,0000,0000,0000,,will harm immigrant families. Dialogue: 0,0:52:51.94,0:52:53.07,Default,,0000,0000,0000,,In this political climate, Dialogue: 0,0:52:53.07,0:52:55.09,Default,,0000,0000,0000,,immigrants are already\Nfearful of engagement Dialogue: 0,0:52:55.09,0:52:57.21,Default,,0000,0000,0000,,with public institutions Dialogue: 0,0:52:57.21,0:52:58.64,Default,,0000,0000,0000,,and face surveillance systems Dialogue: 0,0:52:58.64,0:53:01.28,Default,,0000,0000,0000,,would further chill student\Nand parent participation Dialogue: 0,0:53:01.28,0:53:03.32,Default,,0000,0000,0000,,in immigrant communities in our schools, Dialogue: 0,0:53:03.32,0:53:05.93,Default,,0000,0000,0000,,Boston schools must be\Nwelcoming and safe spaces Dialogue: 0,0:53:05.93,0:53:07.38,Default,,0000,0000,0000,,for all families. Dialogue: 0,0:53:07.38,0:53:09.36,Default,,0000,0000,0000,,The city of Boston must take action now Dialogue: 0,0:53:09.36,0:53:11.06,Default,,0000,0000,0000,,to ensure children and BTU workers Dialogue: 0,0:53:11.06,0:53:13.00,Default,,0000,0000,0000,,are not subject to this unfair, Dialogue: 0,0:53:13.00,0:53:15.26,Default,,0000,0000,0000,,biased and chilling scrutiny. Dialogue: 0,0:53:15.26,0:53:16.58,Default,,0000,0000,0000,,In order to protect young people Dialogue: 0,0:53:16.58,0:53:18.00,Default,,0000,0000,0000,,in our educational community, Dialogue: 0,0:53:18.00,0:53:21.54,Default,,0000,0000,0000,,we must stop face surveillance\Nin schools before it begins. Dialogue: 0,0:53:21.54,0:53:23.50,Default,,0000,0000,0000,,Thank you very much for your attention Dialogue: 0,0:53:23.50,0:53:25.15,Default,,0000,0000,0000,,and your consideration. Dialogue: 0,0:53:25.15,0:53:27.09,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,0:53:27.09,0:53:28.81,Default,,0000,0000,0000,,We're getting there, Dialogue: 0,0:53:28.81,0:53:30.05,Default,,0000,0000,0000,,very, very close commissioner I promise. Dialogue: 0,0:53:30.05,0:53:31.29,Default,,0000,0000,0000,,Just one more person. Dialogue: 0,0:53:31.29,0:53:33.44,Default,,0000,0000,0000,,Then the two people who specifically said Dialogue: 0,0:53:33.44,0:53:36.03,Default,,0000,0000,0000,,they had some questions,\Nwe'll go right to them. Dialogue: 0,0:53:36.03,0:53:39.11,Default,,0000,0000,0000,,Joshua Barocas Dialogue: 0,0:53:40.50,0:53:43.97,Default,,0000,0000,0000,,Yeah, I will read quickly. Dialogue: 0,0:53:43.97,0:53:45.24,Default,,0000,0000,0000,,First of all, thanks for allowing me Dialogue: 0,0:53:45.24,0:53:48.44,Default,,0000,0000,0000,,to testify today regarding\Nthis important issue. Dialogue: 0,0:53:48.44,0:53:50.65,Default,,0000,0000,0000,,I should say that the views Dialogue: 0,0:53:50.65,0:53:52.13,Default,,0000,0000,0000,,that I'm expressing today are my own Dialogue: 0,0:53:52.13,0:53:54.96,Default,,0000,0000,0000,,and don't necessarily\Nrepresent those of my employer, Dialogue: 0,0:53:54.96,0:53:57.79,Default,,0000,0000,0000,,but that said I'm an\Ninfectious disease physician Dialogue: 0,0:53:57.79,0:54:00.62,Default,,0000,0000,0000,,and an addictions\Nresearcher here in Boston. 
Dialogue: 0,0:54:00.62,0:54:03.78,Default,,0000,0000,0000,,As such, I spend a large\Nmajority of my time working Dialogue: 0,0:54:03.78,0:54:06.46,Default,,0000,0000,0000,,on solutions to improve\Nthe health and wellbeing Dialogue: 0,0:54:06.46,0:54:08.07,Default,,0000,0000,0000,,of vulnerable populations, Dialogue: 0,0:54:08.07,0:54:10.46,Default,,0000,0000,0000,,including people who are\Nexperiencing homelessness, Dialogue: 0,0:54:10.46,0:54:12.60,Default,,0000,0000,0000,,people with substance use disorders. Dialogue: 0,0:54:12.60,0:54:16.77,Default,,0000,0000,0000,,One thing that we know\Nis that stigma and bias Dialogue: 0,0:54:17.64,0:54:18.71,Default,,0000,0000,0000,,are pervasive throughout our community Dialogue: 0,0:54:18.71,0:54:22.07,Default,,0000,0000,0000,,and lead to disparities in\Ncare for these populations. Dialogue: 0,0:54:22.07,0:54:24.54,Default,,0000,0000,0000,,These disparities were laid bare Dialogue: 0,0:54:24.54,0:54:28.84,Default,,0000,0000,0000,,and exacerbated by the\Nongoing COVID-19 pandemic. Dialogue: 0,0:54:28.84,0:54:31.05,Default,,0000,0000,0000,,While we're all searching for answers Dialogue: 0,0:54:31.05,0:54:34.01,Default,,0000,0000,0000,,on how to best protect\Nourselves from this virus, Dialogue: 0,0:54:34.01,0:54:37.53,Default,,0000,0000,0000,,I truly fear that the use of\Nfacial recognition technology Dialogue: 0,0:54:37.53,0:54:41.26,Default,,0000,0000,0000,,will only serve to exacerbate\Nexisting disparities, Dialogue: 0,0:54:41.26,0:54:44.93,Default,,0000,0000,0000,,including those of race,\Ngender and socioeconomic status. Dialogue: 0,0:54:44.93,0:54:48.28,Default,,0000,0000,0000,,If we use this very nascent\Nand imperfect technology Dialogue: 0,0:54:48.28,0:54:50.32,Default,,0000,0000,0000,,to track individuals, I'm concerned Dialogue: 0,0:54:50.32,0:54:53.24,Default,,0000,0000,0000,,that it will only serve to\Nfurther marginalize people Dialogue: 0,0:54:53.24,0:54:55.17,Default,,0000,0000,0000,,and worsen health outcomes. Dialogue: 0,0:54:55.17,0:54:57.66,Default,,0000,0000,0000,,We've heard about the low accuracy rates. Dialogue: 0,0:54:57.66,0:54:59.89,Default,,0000,0000,0000,,We know about systemic racism Dialogue: 0,0:54:59.89,0:55:01.36,Default,,0000,0000,0000,,as a public health issue Dialogue: 0,0:55:01.36,0:55:04.65,Default,,0000,0000,0000,,that's fueled by stigma,\Nbias and misinformation. Dialogue: 0,0:55:04.65,0:55:08.53,Default,,0000,0000,0000,,Additionally, Boston\Nprovides expansive services Dialogue: 0,0:55:08.53,0:55:11.73,Default,,0000,0000,0000,,for substance use disorders,\Nincluding methadone treatment, Dialogue: 0,0:55:11.73,0:55:16.73,Default,,0000,0000,0000,,and it's imperative that we\Nban facial recognition software Dialogue: 0,0:55:16.81,0:55:18.16,Default,,0000,0000,0000,,to ensure that people seeking Dialogue: 0,0:55:18.16,0:55:21.02,Default,,0000,0000,0000,,substance use disorder\Ntreatment and mental health care Dialogue: 0,0:55:23.25,0:55:25.65,Default,,0000,0000,0000,,among other stigmatized health\Nservices may do so privately Dialogue: 0,0:55:25.65,0:55:26.80,Default,,0000,0000,0000,,and without fear. 
Dialogue: 0,0:55:27.87,0:55:30.78,Default,,0000,0000,0000,,If we allow this on many\Nof our public cameras, Dialogue: 0,0:55:30.78,0:55:32.35,Default,,0000,0000,0000,,it would enable the government Dialogue: 0,0:55:32.35,0:55:34.52,Default,,0000,0000,0000,,to automatically compile lists of people Dialogue: 0,0:55:34.52,0:55:37.67,Default,,0000,0000,0000,,seeking treatment for\Nsubstance use, mental health Dialogue: 0,0:55:37.67,0:55:41.47,Default,,0000,0000,0000,,and as I said, other\Nstigmatized health conditions. Dialogue: 0,0:55:41.47,0:55:43.12,Default,,0000,0000,0000,,This would have a chilling effect Dialogue: 0,0:55:44.17,0:55:48.27,Default,,0000,0000,0000,,and disproportionately affect Dialogue: 0,0:55:48.27,0:55:50.69,Default,,0000,0000,0000,,our most vulnerable populations. Dialogue: 0,0:55:50.69,0:55:55.69,Default,,0000,0000,0000,,I'll stop there and ask that\Nwe consider banning this. Dialogue: 0,0:55:58.35,0:55:59.50,Default,,0000,0000,0000,,Thank you so very much. Dialogue: 0,0:55:59.50,0:56:00.79,Default,,0000,0000,0000,,I'm gonna turn it over. Dialogue: 0,0:56:02.15,0:56:05.81,Default,,0000,0000,0000,,I misspoke earlier, the lead\Nsponsor there is Michelle Wu, Dialogue: 0,0:56:05.81,0:56:09.83,Default,,0000,0000,0000,,and she mentioned that she has\Nto move along very quickly. Dialogue: 0,0:56:09.83,0:56:11.93,Default,,0000,0000,0000,,So I wanted to, if it's\Nokay, councilor Arroyo, Dialogue: 0,0:56:11.93,0:56:15.12,Default,,0000,0000,0000,,I'm gonna go ahead and go\Nto councilor Wu real quick, Dialogue: 0,0:56:15.12,0:56:16.54,Default,,0000,0000,0000,,for specific questions. Dialogue: 0,0:56:16.54,0:56:18.66,Default,,0000,0000,0000,,You're good!\NYeah, absolutely. Dialogue: 0,0:56:18.66,0:56:19.92,Default,,0000,0000,0000,,Councilor Wu. Dialogue: 0,0:56:19.92,0:56:21.18,Default,,0000,0000,0000,,Thank you madam chair. Dialogue: 0,0:56:21.18,0:56:22.77,Default,,0000,0000,0000,,And thank you so much commissioner Dialogue: 0,0:56:22.77,0:56:24.38,Default,,0000,0000,0000,,for making the time to be here. Dialogue: 0,0:56:24.38,0:56:27.46,Default,,0000,0000,0000,,We all know how many demands\Nthere are on your time. Dialogue: 0,0:56:27.46,0:56:29.24,Default,,0000,0000,0000,,And now we're asking for even\Nmore than you had allotted. Dialogue: 0,0:56:29.24,0:56:32.34,Default,,0000,0000,0000,,So very, very grateful, it means a lot Dialogue: 0,0:56:32.34,0:56:35.81,Default,,0000,0000,0000,,that you took the time and\Nare the one at this hearing. Dialogue: 0,0:56:35.81,0:56:38.82,Default,,0000,0000,0000,,I also wanna note that you\Nwere the one at the hearing Dialogue: 0,0:56:38.82,0:56:40.52,Default,,0000,0000,0000,,in June 2018 as well. Dialogue: 0,0:56:40.52,0:56:43.08,Default,,0000,0000,0000,,So we know that you've been\Npart of this conversation Dialogue: 0,0:56:43.08,0:56:46.61,Default,,0000,0000,0000,,and supportive of this\Ngeneral idea for a long time. Dialogue: 0,0:56:46.61,0:56:48.66,Default,,0000,0000,0000,,So just to clarify on\Nsome of what you said, Dialogue: 0,0:56:48.66,0:56:51.44,Default,,0000,0000,0000,,you mentioned that you, Dialogue: 0,0:56:51.44,0:56:53.68,Default,,0000,0000,0000,,you referenced the technology upgrades, Dialogue: 0,0:56:53.68,0:56:55.46,Default,,0000,0000,0000,,that some current vendors\Nthat BPD works with Dialogue: 0,0:56:55.46,0:56:57.67,Default,,0000,0000,0000,,have for face surveillance. 
Dialogue: 0,0:56:57.67,0:56:58.50,Default,,0000,0000,0000,,So just to clarify, Dialogue: 0,0:56:58.50,0:57:01.86,Default,,0000,0000,0000,,has BPD upgraded to the next\Nversion of that software Dialogue: 0,0:57:01.86,0:57:03.26,Default,,0000,0000,0000,,and we're just choosing Dialogue: 0,0:57:03.26,0:57:06.80,Default,,0000,0000,0000,,not to use the facial\Nrecognition components of it, Dialogue: 0,0:57:06.80,0:57:08.59,Default,,0000,0000,0000,,or has the upgrade not been accepted Dialogue: 0,0:57:08.59,0:57:10.54,Default,,0000,0000,0000,,because it includes facial recognition? Dialogue: 0,0:57:11.47,0:57:13.37,Default,,0000,0000,0000,,To my knowledge, we have not upgraded, Dialogue: 0,0:57:13.37,0:57:15.83,Default,,0000,0000,0000,,but we anticipate that we will have to, Dialogue: 0,0:57:15.83,0:57:19.19,Default,,0000,0000,0000,,but we will not be using any components Dialogue: 0,0:57:19.19,0:57:20.97,Default,,0000,0000,0000,,of facial recognition. Dialogue: 0,0:57:20.97,0:57:25.97,Default,,0000,0000,0000,,And I guess the technology\Njust isn't efficient enough - Dialogue: 0,0:57:26.46,0:57:29.79,Default,,0000,0000,0000,,When do you think the\Nupgrade would likely happen? Dialogue: 0,0:57:30.70,0:57:32.53,Default,,0000,0000,0000,,No, I will get back to you. Dialogue: 0,0:57:32.53,0:57:34.79,Default,,0000,0000,0000,,And trust me, I wanna\Nhave a further meeting Dialogue: 0,0:57:36.08,0:57:38.24,Default,,0000,0000,0000,,so I can have my subject\Nmatter experts here. Dialogue: 0,0:57:38.24,0:57:40.47,Default,,0000,0000,0000,,And I'm gonna tick through, Dialogue: 0,0:57:40.47,0:57:41.88,Default,,0000,0000,0000,,'cause I wanna try to get you out of here Dialogue: 0,0:57:41.88,0:57:42.71,Default,,0000,0000,0000,,as quick as possible. Dialogue: 0,0:57:42.71,0:57:43.99,Default,,0000,0000,0000,,Secondly, so you drew a distinction Dialogue: 0,0:57:43.99,0:57:46.70,Default,,0000,0000,0000,,between face surveillance systems Dialogue: 0,0:57:46.70,0:57:48.86,Default,,0000,0000,0000,,and facial recognition technology. Dialogue: 0,0:57:48.86,0:57:50.13,Default,,0000,0000,0000,,So just to clarify, Dialogue: 0,0:57:50.13,0:57:53.62,Default,,0000,0000,0000,,does BPD, we don't use that. Dialogue: 0,0:57:53.62,0:57:55.31,Default,,0000,0000,0000,,You don't own it right now Dialogue: 0,0:57:55.31,0:57:56.71,Default,,0000,0000,0000,,because you haven't done that upgrade, Dialogue: 0,0:57:56.71,0:58:01.71,Default,,0000,0000,0000,,but does BPD work with other\Nstate or federal entities Dialogue: 0,0:58:02.65,0:58:04.68,Default,,0000,0000,0000,,that have face surveillance Dialogue: 0,0:58:04.68,0:58:06.19,Default,,0000,0000,0000,,or facial recognition technology? Dialogue: 0,0:58:06.19,0:58:08.82,Default,,0000,0000,0000,,Do you ever use the\Nresults of that technology Dialogue: 0,0:58:08.82,0:58:10.27,Default,,0000,0000,0000,,on any sort of databases? Dialogue: 0,0:58:10.27,0:58:12.16,Default,,0000,0000,0000,,To answer you more accurately, Dialogue: 0,0:58:12.16,0:58:15.67,Default,,0000,0000,0000,,I would have to find out. Dialogue: 0,0:58:15.67,0:58:17.92,Default,,0000,0000,0000,,I believe the state police Dialogue: 0,0:58:17.92,0:58:20.80,Default,,0000,0000,0000,,does have some form of facial recognition Dialogue: 0,0:58:20.80,0:58:22.95,Default,,0000,0000,0000,,and maybe the registry to. Dialogue: 0,0:58:22.95,0:58:24.90,Default,,0000,0000,0000,,Okay, the registry of motor vehicles. Dialogue: 0,0:58:24.90,0:58:26.56,Default,,0000,0000,0000,,Yes. 
Dialogue: 0,0:58:26.56,0:58:29.94,Default,,0000,0000,0000,,And so when state police or Dialogue: 0,0:58:29.94,0:58:31.74,Default,,0000,0000,0000,,we'll find out more potentially BPD Dialogue: 0,0:58:31.74,0:58:34.97,Default,,0000,0000,0000,,works with the RMV to\Nrun matches of photos Dialogue: 0,0:58:34.97,0:58:36.28,Default,,0000,0000,0000,,against their database. Dialogue: 0,0:58:36.28,0:58:38.30,Default,,0000,0000,0000,,Do you need any sort of warrants Dialogue: 0,0:58:38.30,0:58:40.41,Default,,0000,0000,0000,,or other approvals to do that before, Dialogue: 0,0:58:40.41,0:58:42.36,Default,,0000,0000,0000,,for, let's say, the state? Dialogue: 0,0:58:45.24,0:58:48.27,Default,,0000,0000,0000,,So I used to be part of\Nthe Bureau of Investigative Dialogue: 0,0:58:48.27,0:58:49.82,Default,,0000,0000,0000,,Services, to my knowledge, Dialogue: 0,0:58:49.82,0:58:53.72,Default,,0000,0000,0000,,it's only utilized to help\Nassist in photo arrays. Dialogue: 0,0:58:53.72,0:58:57.00,Default,,0000,0000,0000,,So nothing past that. Dialogue: 0,0:58:58.15,0:59:00.79,Default,,0000,0000,0000,,Okay, could you just clarify\Nwhat you mean by that? Dialogue: 0,0:59:00.79,0:59:03.18,Default,,0000,0000,0000,,So if you're a victim of a crime Dialogue: 0,0:59:03.18,0:59:05.01,Default,,0000,0000,0000,,and we don't have that\Nperson under arrest, Dialogue: 0,0:59:05.01,0:59:07.47,Default,,0000,0000,0000,,but we do have a potential suspect, Dialogue: 0,0:59:07.47,0:59:08.83,Default,,0000,0000,0000,,you have to pick, Dialogue: 0,0:59:08.83,0:59:10.91,Default,,0000,0000,0000,,you are allowed the opportunity Dialogue: 0,0:59:10.91,0:59:15.38,Default,,0000,0000,0000,,to try to identify the\Nsuspect of the crime. Dialogue: 0,0:59:15.38,0:59:16.89,Default,,0000,0000,0000,,And so to be fair, Dialogue: 0,0:59:16.89,0:59:20.22,Default,,0000,0000,0000,,to ensure that no innocent\Nperson is selected, Dialogue: 0,0:59:23.30,0:59:27.28,Default,,0000,0000,0000,,you put the suspect's picture Dialogue: 0,0:59:27.28,0:59:30.02,Default,,0000,0000,0000,,along with several other suspects' pictures Dialogue: 0,0:59:30.02,0:59:31.25,Default,,0000,0000,0000,,up to seven or more, Dialogue: 0,0:59:31.25,0:59:33.73,Default,,0000,0000,0000,,and then you see if\Nthe victim of the crime Dialogue: 0,0:59:33.73,0:59:37.31,Default,,0000,0000,0000,,can pick the suspect\Nout of that photo array. Dialogue: 0,0:59:38.41,0:59:41.71,Default,,0000,0000,0000,,Okay, and so the RMV\Nuses the face recognition Dialogue: 0,0:59:41.71,0:59:43.49,Default,,0000,0000,0000,,to provide those photo arrays. Dialogue: 0,0:59:43.49,0:59:47.21,Default,,0000,0000,0000,,I wanna give you an accurate answer. Dialogue: 0,0:59:47.21,0:59:52.21,Default,,0000,0000,0000,,So again, in our next meeting,\Nyou'll have your answer. Dialogue: 0,0:59:53.01,0:59:53.84,Default,,0000,0000,0000,,Thank you. Dialogue: 0,0:59:56.75,0:59:59.83,Default,,0000,0000,0000,,Are you aware of any state\Nor federal regulations Dialogue: 0,0:59:59.83,1:00:03.74,Default,,0000,0000,0000,,that provide any sort\Nof kind of restrictions Dialogue: 0,1:00:03.74,1:00:05.34,Default,,0000,0000,0000,,or reining in guidance when it comes Dialogue: 0,1:00:06.19,1:00:07.68,Default,,0000,0000,0000,,to facial recognition technology Dialogue: 0,1:00:07.68,1:00:09.28,Default,,0000,0000,0000,,or face surveillance systems? Dialogue: 0,1:00:10.26,1:00:11.23,Default,,0000,0000,0000,,Nope. 
Dialogue: 0,1:00:11.23,1:00:15.50,Default,,0000,0000,0000,,Okay, and so given the absence\Nof guidelines right now Dialogue: 0,1:00:15.50,1:00:17.02,Default,,0000,0000,0000,,at the federal, state or city level Dialogue: 0,1:00:17.02,1:00:20.67,Default,,0000,0000,0000,,and the need to upgrade and\Nthe use by state police, Dialogue: 0,1:00:20.67,1:00:22.96,Default,,0000,0000,0000,,for example, are you supportive, Dialogue: 0,1:00:22.96,1:00:25.05,Default,,0000,0000,0000,,we're hearing you loud and clear Dialogue: 0,1:00:25.05,1:00:25.89,Default,,0000,0000,0000,,that you'd like to have\Nfurther conversation, Dialogue: 0,1:00:25.89,1:00:28.10,Default,,0000,0000,0000,,we just wanna get your opinion now, Dialogue: 0,1:00:28.10,1:00:30.97,Default,,0000,0000,0000,,are you supportive of a city level ban Dialogue: 0,1:00:30.97,1:00:34.81,Default,,0000,0000,0000,,until there are policies\Nin place potentially, Dialogue: 0,1:00:34.81,1:00:38.83,Default,,0000,0000,0000,,at other levels or\Nspecifically at the city level Dialogue: 0,1:00:38.83,1:00:41.31,Default,,0000,0000,0000,,to rein in surveillance overall? Dialogue: 0,1:00:41.31,1:00:42.32,Default,,0000,0000,0000,,Yes, I am. Dialogue: 0,1:00:42.32,1:00:44.96,Default,,0000,0000,0000,,I've been clear for four years. Dialogue: 0,1:00:44.96,1:00:46.77,Default,,0000,0000,0000,,We need your input, your guidance. Dialogue: 0,1:00:46.77,1:00:49.29,Default,,0000,0000,0000,,And I thank you to everyone\Nthat testified today. Dialogue: 0,1:00:49.29,1:00:51.36,Default,,0000,0000,0000,,I have a team taking notes Dialogue: 0,1:00:51.36,1:00:54.16,Default,,0000,0000,0000,,because I didn't forget\Nthat I'm African-American Dialogue: 0,1:00:54.16,1:00:56.60,Default,,0000,0000,0000,,and I could be misidentified as well. Dialogue: 0,1:00:56.60,1:01:00.43,Default,,0000,0000,0000,,And I believe that we have\None of the most diversified Dialogue: 0,1:01:00.43,1:01:03.02,Default,,0000,0000,0000,,populations in the history of our city. Dialogue: 0,1:01:03.02,1:01:06.19,Default,,0000,0000,0000,,And we do have to be fair to everyone Dialogue: 0,1:01:06.19,1:01:08.47,Default,,0000,0000,0000,,that is a citizen of Boston. Dialogue: 0,1:01:08.47,1:01:10.96,Default,,0000,0000,0000,,We don't want anyone misidentified. Dialogue: 0,1:01:10.96,1:01:15.15,Default,,0000,0000,0000,,And again we will work with everyone here Dialogue: 0,1:01:15.15,1:01:18.09,Default,,0000,0000,0000,,because this is how we will be educated. Dialogue: 0,1:01:18.09,1:01:20.20,Default,,0000,0000,0000,,Yes, we're very grateful for your time. Dialogue: 0,1:01:20.20,1:01:21.81,Default,,0000,0000,0000,,I'm gonna cede to the co-sponsor. Dialogue: 0,1:01:21.81,1:01:22.64,Default,,0000,0000,0000,,Thank you. Dialogue: 0,1:01:23.85,1:01:27.96,Default,,0000,0000,0000,,So, actually, is that fine, chair? Dialogue: 0,1:01:27.96,1:01:31.21,Default,,0000,0000,0000,,I just wanna thank you commissioner Dialogue: 0,1:01:31.21,1:01:34.64,Default,,0000,0000,0000,,for supporting the facial\Nrecognition surveillance ban Dialogue: 0,1:01:34.64,1:01:37.35,Default,,0000,0000,0000,,and for taking the time to\Nbe here for the questions. Dialogue: 0,1:01:37.35,1:01:39.21,Default,,0000,0000,0000,,Councilor Wu asked many of them, Dialogue: 0,1:01:39.21,1:01:41.50,Default,,0000,0000,0000,,I have one specific one which is, Dialogue: 0,1:01:41.50,1:01:45.10,Default,,0000,0000,0000,,has the BPD used facial\Nrecognition in the past, Dialogue: 0,1:01:45.10,1:01:46.22,Default,,0000,0000,0000,,in any capacity? 
Dialogue: 0,1:01:47.53,1:01:49.35,Default,,0000,0000,0000,,No, not to my knowledge. Dialogue: 0,1:01:49.35,1:01:53.23,Default,,0000,0000,0000,,And I will review that, but\Nnot to my knowledge at all. Dialogue: 0,1:01:53.23,1:01:56.54,Default,,0000,0000,0000,,And BRIC, does BRIC have access Dialogue: 0,1:01:57.89,1:01:58.72,Default,,0000,0000,0000,,to facial recognition techniques? Dialogue: 0,1:01:58.72,1:01:59.79,Default,,0000,0000,0000,,Nope, not at all. Dialogue: 0,1:01:59.79,1:02:03.17,Default,,0000,0000,0000,,We don't use it and I\Nanticipated that question Dialogue: 0,1:02:03.17,1:02:06.64,Default,,0000,0000,0000,,and we make sure that\Nwe're not a part of any, Dialogue: 0,1:02:06.64,1:02:09.23,Default,,0000,0000,0000,,BRIC does not have facial recognition. Dialogue: 0,1:02:09.23,1:02:11.42,Default,,0000,0000,0000,,Okay, I think every other question Dialogue: 0,1:02:11.42,1:02:13.83,Default,,0000,0000,0000,,that I had for you today\Nwas asked by councilor Wu. Dialogue: 0,1:02:13.83,1:02:15.54,Default,,0000,0000,0000,,So I just thank you for\Ntaking the time to be here Dialogue: 0,1:02:15.54,1:02:18.10,Default,,0000,0000,0000,,and for supporting this, thank you. Dialogue: 0,1:02:18.10,1:02:19.44,Default,,0000,0000,0000,,Thank you, and before I go, Dialogue: 0,1:02:19.44,1:02:22.50,Default,,0000,0000,0000,,for the record, nothing\Nhas changed in my opinion Dialogue: 0,1:02:22.50,1:02:24.34,Default,,0000,0000,0000,,from four years ago, Dialogue: 0,1:02:24.34,1:02:26.57,Default,,0000,0000,0000,,until this technology\Nis a hundred percent, Dialogue: 0,1:02:26.57,1:02:28.80,Default,,0000,0000,0000,,I'm not interested in it. Dialogue: 0,1:02:29.64,1:02:33.51,Default,,0000,0000,0000,,And even when it is a hundred percent, Dialogue: 0,1:02:33.51,1:02:35.97,Default,,0000,0000,0000,,you've committed to a\Nconversation about it? Dialogue: 0,1:02:35.97,1:02:38.00,Default,,0000,0000,0000,,Absolutely, we discussed that before. Dialogue: 0,1:02:38.00,1:02:39.61,Default,,0000,0000,0000,,That you have. Dialogue: 0,1:02:39.61,1:02:43.30,Default,,0000,0000,0000,,It's not something that\Nyou can just implement. Dialogue: 0,1:02:43.30,1:02:45.63,Default,,0000,0000,0000,,[crosstalk] Dialogue: 0,1:02:47.19,1:02:51.81,Default,,0000,0000,0000,,It's important that we also get\Nthe input from the community Dialogue: 0,1:02:53.14,1:02:54.66,Default,,0000,0000,0000,,that we serve and work in partnership Dialogue: 0,1:02:56.58,1:02:59.15,Default,,0000,0000,0000,,so we can of course have\Na better quality of life. Dialogue: 0,1:02:59.15,1:03:03.09,Default,,0000,0000,0000,,And that's inclusive of everyone's\Nexpectation of privacy. Dialogue: 0,1:03:03.09,1:03:04.69,Default,,0000,0000,0000,,So before you go, Dialogue: 0,1:03:04.69,1:03:09.69,Default,,0000,0000,0000,,I'm gonna do this one call out\Nto the panelists that just spoke Dialogue: 0,1:03:10.04,1:03:11.45,Default,,0000,0000,0000,,or to any city councilors. Dialogue: 0,1:03:11.45,1:03:13.97,Default,,0000,0000,0000,,One question that you may\Nhave for the commissioner Dialogue: 0,1:03:13.97,1:03:16.43,Default,,0000,0000,0000,,that you need to ask,\Ndo any of you have that? Dialogue: 0,1:03:17.88,1:03:22.88,Default,,0000,0000,0000,,Do either Joy, My'Kel,\NKarina, Kade, Erik, Joshua, Dialogue: 0,1:03:23.72,1:03:25.01,Default,,0000,0000,0000,,or the other city councilors. 
Dialogue: 0,1:03:25.01,1:03:26.53,Default,,0000,0000,0000,,I'm trying to be mindful of his time Dialogue: 0,1:03:26.53,1:03:28.73,Default,,0000,0000,0000,,but I also understand Dialogue: 0,1:03:28.73,1:03:31.48,Default,,0000,0000,0000,,you guys also gave time\Nto be here today as well. Dialogue: 0,1:03:31.48,1:03:35.24,Default,,0000,0000,0000,,And if you wanted to ask him\Nrepresenting BPD a question. Dialogue: 0,1:03:35.24,1:03:39.15,Default,,0000,0000,0000,,I see Julia Mejia has raised her hand, Dialogue: 0,1:03:39.15,1:03:40.54,Default,,0000,0000,0000,,to the panelists though, Dialogue: 0,1:03:40.54,1:03:42.38,Default,,0000,0000,0000,,who just spoke, any questions? Dialogue: 0,1:03:43.24,1:03:46.49,Default,,0000,0000,0000,,I'm gonna go now to my\Ncolleagues to ask one question. Dialogue: 0,1:03:46.49,1:03:48.69,Default,,0000,0000,0000,,So that allows for him to go. Dialogue: 0,1:03:48.69,1:03:50.82,Default,,0000,0000,0000,,Okay, councilor Mejia. Dialogue: 0,1:03:52.09,1:03:56.14,Default,,0000,0000,0000,,Yes, I just wanted to quickly\Nthank commissioner Gross Dialogue: 0,1:03:56.14,1:03:59.47,Default,,0000,0000,0000,,for your time and just\Nyour steadfast leadership. Dialogue: 0,1:04:00.45,1:04:03.67,Default,,0000,0000,0000,,But I do have just a quick\Nquestion before you go. Dialogue: 0,1:04:03.67,1:04:05.11,Default,,0000,0000,0000,,I just need some clarity, Dialogue: 0,1:04:05.11,1:04:07.19,Default,,0000,0000,0000,,it's that you mentioned that\Nyou hope to keep this hearing Dialogue: 0,1:04:07.19,1:04:08.40,Default,,0000,0000,0000,,on the topic of the ordinance Dialogue: 0,1:04:08.40,1:04:12.16,Default,,0000,0000,0000,,and find other ways to address\Nissues of police relations. Dialogue: 0,1:04:12.16,1:04:14.56,Default,,0000,0000,0000,,So a lot of people draw the direct link Dialogue: 0,1:04:14.56,1:04:16.77,Default,,0000,0000,0000,,between facial recognition Dialogue: 0,1:04:16.77,1:04:18.16,Default,,0000,0000,0000,,and relationships with the community. Dialogue: 0,1:04:18.16,1:04:19.37,Default,,0000,0000,0000,,Do you see that link? Dialogue: 0,1:04:20.59,1:04:22.10,Default,,0000,0000,0000,,And can you talk about\Nhow the police department Dialogue: 0,1:04:22.10,1:04:23.88,Default,,0000,0000,0000,,sees that link? Dialogue: 0,1:04:23.88,1:04:24.90,Default,,0000,0000,0000,,I'm just really curious about Dialogue: 0,1:04:24.90,1:04:28.69,Default,,0000,0000,0000,,how do we build relationships\Nwith the community Dialogue: 0,1:04:28.69,1:04:31.12,Default,,0000,0000,0000,,while also thinking\Nabout facial recognition? Dialogue: 0,1:04:31.12,1:04:32.54,Default,,0000,0000,0000,,Like, how do you reconcile that? Dialogue: 0,1:04:32.54,1:04:33.59,Default,,0000,0000,0000,,I can answer that, Dialogue: 0,1:04:35.02,1:04:36.66,Default,,0000,0000,0000,,the information I was given before Dialogue: 0,1:04:36.66,1:04:41.12,Default,,0000,0000,0000,,was that we were going to\Ndiscuss many subjects. Dialogue: 0,1:04:41.12,1:04:44.61,Default,,0000,0000,0000,,And so I thank you for\Neliminating this one, Dialogue: 0,1:04:44.61,1:04:47.26,Default,,0000,0000,0000,,but the only way you\Nincrease relationships Dialogue: 0,1:04:47.26,1:04:49.46,Default,,0000,0000,0000,,with the community is to Dialogue: 0,1:04:49.46,1:04:51.54,Default,,0000,0000,0000,,hold hard discussions like this one. 
Dialogue: 0,1:04:51.54,1:04:56.01,Default,,0000,0000,0000,,You have to listen to the\Npeople that you serve, Dialogue: 0,1:04:56.01,1:04:58.59,Default,,0000,0000,0000,,because everyone throws around Dialogue: 0,1:04:58.59,1:04:59.76,Default,,0000,0000,0000,,the moniker community policing. Dialogue: 0,1:04:59.76,1:05:01.68,Default,,0000,0000,0000,,I believe that's become jaded. Dialogue: 0,1:05:01.68,1:05:05.02,Default,,0000,0000,0000,,And so I've created a bureau\Nof community engagement Dialogue: 0,1:05:05.02,1:05:07.48,Default,,0000,0000,0000,,so that we can have these discussions. Dialogue: 0,1:05:07.48,1:05:09.61,Default,,0000,0000,0000,,And again, I don't forget my history. Dialogue: 0,1:05:09.61,1:05:11.60,Default,,0000,0000,0000,,I haven't forgotten my history. Dialogue: 0,1:05:11.60,1:05:15.67,Default,,0000,0000,0000,,I came on in 1983 and I've\Ngone through a lot of racism, Dialogue: 0,1:05:15.67,1:05:20.00,Default,,0000,0000,0000,,was in a tough neighborhood growing up, Dialogue: 0,1:05:20.00,1:05:21.70,Default,,0000,0000,0000,,and I didn't forget any of that. Dialogue: 0,1:05:21.70,1:05:25.46,Default,,0000,0000,0000,,And as you alluded to earlier,\Nsome of the city councilors, Dialogue: 0,1:05:25.46,1:05:27.91,Default,,0000,0000,0000,,I'm always in the street\Ntalking to people, Dialogue: 0,1:05:27.91,1:05:32.08,Default,,0000,0000,0000,,I get a lot of criticism of\Nlaw enforcement in general, Dialogue: 0,1:05:32.08,1:05:35.54,Default,,0000,0000,0000,,but I believe in if you\Nwant change, be the change, Dialogue: 0,1:05:35.54,1:05:37.61,Default,,0000,0000,0000,,you can only change with the people. Dialogue: 0,1:05:37.61,1:05:42.61,Default,,0000,0000,0000,,So, I am looking forward\Nto further conversation Dialogue: 0,1:05:43.41,1:05:47.17,Default,,0000,0000,0000,,and to listening to testimony\Non how we can improve Dialogue: 0,1:05:47.17,1:05:48.78,Default,,0000,0000,0000,,our community relations, Dialogue: 0,1:05:48.78,1:05:53.57,Default,,0000,0000,0000,,especially not only during\Nthis time of civil unrest, Dialogue: 0,1:05:53.57,1:05:56.24,Default,,0000,0000,0000,,but we're in the middle\Nof a pandemic as well. Dialogue: 0,1:05:56.24,1:05:58.44,Default,,0000,0000,0000,,And so police departments Dialogue: 0,1:05:58.44,1:06:01.47,Default,,0000,0000,0000,,should not be separated\Nfrom the community. Dialogue: 0,1:06:01.47,1:06:04.85,Default,,0000,0000,0000,,You have to have empathy,\Nsympathy, care and respect. Dialogue: 0,1:06:04.85,1:06:06.98,Default,,0000,0000,0000,,A lot of people have lost jobs, Dialogue: 0,1:06:06.98,1:06:10.08,Default,,0000,0000,0000,,and we must keep in mind\Nabout socioeconomics, Dialogue: 0,1:06:10.08,1:06:14.41,Default,,0000,0000,0000,,fairness for employment. Dialogue: 0,1:06:14.41,1:06:16.26,Default,,0000,0000,0000,,A lot of those things Dialogue: 0,1:06:16.26,1:06:20.17,Default,,0000,0000,0000,,directly affect the\Nneighborhoods of color. Dialogue: 0,1:06:20.17,1:06:22.52,Default,,0000,0000,0000,,And I'm glad that we are increasing Dialogue: 0,1:06:22.52,1:06:24.32,Default,,0000,0000,0000,,our diversity in the communities. Dialogue: 0,1:06:24.32,1:06:27.55,Default,,0000,0000,0000,,And we currently\Nhave organizations in Boston Dialogue: 0,1:06:27.55,1:06:29.55,Default,,0000,0000,0000,,that directly speak to the people. 
Dialogue: 0,1:06:29.55,1:06:32.33,Default,,0000,0000,0000,,The Latino Law Enforcement Organization, Dialogue: 0,1:06:32.33,1:06:34.28,Default,,0000,0000,0000,,the Cabo Verde Police Associations, Dialogue: 0,1:06:34.28,1:06:39.28,Default,,0000,0000,0000,,the Benevolent Asian Jade Society. Dialogue: 0,1:06:39.47,1:06:42.77,Default,,0000,0000,0000,,Our department is increasing in diversity, Dialogue: 0,1:06:42.77,1:06:44.81,Default,,0000,0000,0000,,and the benefit of that, Dialogue: 0,1:06:44.81,1:06:47.25,Default,,0000,0000,0000,,the benefit of having representation Dialogue: 0,1:06:47.25,1:06:49.62,Default,,0000,0000,0000,,from every neighborhood we serve, Dialogue: 0,1:06:49.62,1:06:52.21,Default,,0000,0000,0000,,is that it will improve relationships. Dialogue: 0,1:06:52.21,1:06:54.55,Default,,0000,0000,0000,,So again, that's why I'm looking forward Dialogue: 0,1:06:54.55,1:06:57.39,Default,,0000,0000,0000,,to a working session and\Nmy team is taking notes Dialogue: 0,1:06:57.39,1:07:02.12,Default,,0000,0000,0000,,because everybody's\Ntestimony is important. Dialogue: 0,1:07:02.12,1:07:05.84,Default,,0000,0000,0000,,Trust me, people like you worked hard Dialogue: 0,1:07:05.84,1:07:06.81,Default,,0000,0000,0000,,for everyone's rights. Dialogue: 0,1:07:06.81,1:07:07.71,Default,,0000,0000,0000,,I wouldn't be here Dialogue: 0,1:07:07.71,1:07:10.34,Default,,0000,0000,0000,,as the first African-American\Npolice commissioner, Dialogue: 0,1:07:10.34,1:07:11.63,Default,,0000,0000,0000,,if we didn't have folks Dialogue: 0,1:07:11.63,1:07:15.86,Default,,0000,0000,0000,,that exercised their\NFirst Amendment Rights. Dialogue: 0,1:07:17.08,1:07:17.98,Default,,0000,0000,0000,,Thank you, commissioner Gross, Dialogue: 0,1:07:17.98,1:07:19.55,Default,,0000,0000,0000,,I have a clarifying question. Dialogue: 0,1:07:19.55,1:07:22.02,Default,,0000,0000,0000,,And then I just wanted to note, Dialogue: 0,1:07:22.02,1:07:24.68,Default,,0000,0000,0000,,this impacts the staying in. Dialogue: 0,1:07:24.68,1:07:28.15,Default,,0000,0000,0000,,So for folks who,\Ncommissioner Gross has alluded Dialogue: 0,1:07:28.15,1:07:29.09,Default,,0000,0000,0000,,to a working session, Dialogue: 0,1:07:29.09,1:07:31.72,Default,,0000,0000,0000,,just to give some clarification\Nfor those watching, Dialogue: 0,1:07:31.72,1:07:33.15,Default,,0000,0000,0000,,that is when we actually get down Dialogue: 0,1:07:33.15,1:07:35.55,Default,,0000,0000,0000,,to the language of the ordinance Dialogue: 0,1:07:35.55,1:07:38.19,Default,,0000,0000,0000,,and work down to the\Ncommas is what I say Dialogue: 0,1:07:38.19,1:07:39.80,Default,,0000,0000,0000,,and how to do that. Dialogue: 0,1:07:39.80,1:07:41.20,Default,,0000,0000,0000,,And so what commissioner Gross is asking Dialogue: 0,1:07:41.20,1:07:44.07,Default,,0000,0000,0000,,is that we have that conversation\Nwithin the next 60 days, Dialogue: 0,1:07:44.07,1:07:45.30,Default,,0000,0000,0000,,which I'm fine with doing, Dialogue: 0,1:07:45.30,1:07:47.41,Default,,0000,0000,0000,,I'll just check with the lead sponsors. Dialogue: 0,1:07:47.41,1:07:49.79,Default,,0000,0000,0000,,I did wanna make sure I understood Dialogue: 0,1:07:49.79,1:07:52.48,Default,,0000,0000,0000,,what in terms of timing is\Ngoing on with the contract, Dialogue: 0,1:07:52.48,1:07:57.20,Default,,0000,0000,0000,,is BPD entering into a contract\Nthat has this technology? 
Dialogue: 0,1:07:57.20,1:07:58.53,Default,,0000,0000,0000,,There was a timing issue Dialogue: 0,1:07:58.53,1:08:01.79,Default,,0000,0000,0000,,that I thought was warranting this hearing Dialogue: 0,1:08:01.79,1:08:02.74,Default,,0000,0000,0000,,happening very fast. Dialogue: 0,1:08:02.74,1:08:04.98,Default,,0000,0000,0000,,Maybe the lead sponsors could\Nalso answer this question. Dialogue: 0,1:08:04.98,1:08:06.36,Default,,0000,0000,0000,,I was confused, Dialogue: 0,1:08:06.36,1:08:09.64,Default,,0000,0000,0000,,if someone could explain what was the... Dialogue: 0,1:08:09.64,1:08:12.10,Default,,0000,0000,0000,,there's a sense of urgency\Nthat I felt that I was meeting Dialogue: 0,1:08:12.10,1:08:13.66,Default,,0000,0000,0000,,and I'm happy to meet it. Dialogue: 0,1:08:13.66,1:08:17.35,Default,,0000,0000,0000,,Just if someone either, I\Nsaw you Kade, you nodded. Dialogue: 0,1:08:17.35,1:08:21.41,Default,,0000,0000,0000,,Let me just figure out what\Nthis contract issue is Kade. Dialogue: 0,1:08:21.41,1:08:24.83,Default,,0000,0000,0000,,Sure councilor, I'd be\Nhappy to address that. Dialogue: 0,1:08:24.83,1:08:27.66,Default,,0000,0000,0000,,We obtained public records\Nfrom the city a while back Dialogue: 0,1:08:27.66,1:08:29.71,Default,,0000,0000,0000,,showing that the Boston Police Department Dialogue: 0,1:08:31.15,1:08:32.77,Default,,0000,0000,0000,,has a contract with a\Ncompany called BriefCam Dialogue: 0,1:08:32.77,1:08:34.51,Default,,0000,0000,0000,,that expired on May 14th. Dialogue: 0,1:08:37.48,1:08:41.39,Default,,0000,0000,0000,,And the contract was for version\N4.3 of BriefCam software, Dialogue: 0,1:08:41.39,1:08:44.63,Default,,0000,0000,0000,,which did not include facial\Nsurveillance algorithms. Dialogue: 0,1:08:44.63,1:08:47.21,Default,,0000,0000,0000,,The current version of\NBriefCam's technology Dialogue: 0,1:08:47.21,1:08:50.30,Default,,0000,0000,0000,,does include facial\Nsurveillance algorithms. Dialogue: 0,1:08:50.30,1:08:53.13,Default,,0000,0000,0000,,And so one of the reasons\Nthat the councilors Dialogue: 0,1:08:53.13,1:08:55.51,Default,,0000,0000,0000,,and the advocacy groups\Nbehind this measure Dialogue: 0,1:08:55.51,1:08:59.37,Default,,0000,0000,0000,,were urging you chair\Nto schedule this quickly Dialogue: 0,1:08:59.37,1:09:01.23,Default,,0000,0000,0000,,is so that we could pass this ban fast Dialogue: 0,1:09:01.23,1:09:02.45,Default,,0000,0000,0000,,to ensure that the city Dialogue: 0,1:09:02.45,1:09:06.22,Default,,0000,0000,0000,,does not enter into a contract\Nto upgrade that technology. Dialogue: 0,1:09:07.48,1:09:09.28,Default,,0000,0000,0000,,Thank you, commissioner Gross, Dialogue: 0,1:09:09.28,1:09:10.84,Default,,0000,0000,0000,,where are we on this contract? Dialogue: 0,1:09:10.84,1:09:11.85,Default,,0000,0000,0000,,Yep and thank you. Dialogue: 0,1:09:11.85,1:09:13.98,Default,,0000,0000,0000,,That's exactly what I was gonna comment. Dialogue: 0,1:09:13.98,1:09:15.58,Default,,0000,0000,0000,,The older version of BriefCam Dialogue: 0,1:09:15.58,1:09:17.59,Default,,0000,0000,0000,,did not have facial recognition Dialogue: 0,1:09:17.59,1:09:20.69,Default,,0000,0000,0000,,and I believe that the newer version does. 
Dialogue: 0,1:09:20.69,1:09:25.62,Default,,0000,0000,0000,,BriefCam works on the old version Dialogue: 0,1:09:25.62,1:09:27.96,Default,,0000,0000,0000,,on condensing objects, cars, Dialogue: 0,1:09:27.96,1:09:31.20,Default,,0000,0000,0000,,but I would definitely\Ncheck on that contract Dialogue: 0,1:09:31.20,1:09:34.30,Default,,0000,0000,0000,,because I don't want any technology Dialogue: 0,1:09:34.30,1:09:36.28,Default,,0000,0000,0000,,that has facial recognition Dialogue: 0,1:09:37.94,1:09:40.17,Default,,0000,0000,0000,,and that's exactly what\Nwe were gonna check on Dialogue: 0,1:09:40.17,1:09:45.17,Default,,0000,0000,0000,,and so my answer right\Nnow is no at this point. Dialogue: 0,1:09:46.65,1:09:50.04,Default,,0000,0000,0000,,I've just read into testimony, Dialogue: 0,1:09:50.04,1:09:55.00,Default,,0000,0000,0000,,if we obtain technology such as that, Dialogue: 0,1:09:55.00,1:09:57.51,Default,,0000,0000,0000,,I should be speaking to all of you Dialogue: 0,1:09:57.51,1:09:59.88,Default,,0000,0000,0000,,and to what that entails, Dialogue: 0,1:10:04.60,1:10:08.37,Default,,0000,0000,0000,,if we have BriefCam and\Nit has facial recognition Dialogue: 0,1:10:08.37,1:10:10.28,Default,,0000,0000,0000,,[indistinct] that allows\Nfor facial recognition, no. Dialogue: 0,1:10:10.28,1:10:12.11,Default,,0000,0000,0000,,And if someone did, if we\Ndid go into a contract, Dialogue: 0,1:10:12.11,1:10:14.91,Default,,0000,0000,0000,,would I have the ability to show you Dialogue: 0,1:10:14.91,1:10:18.97,Default,,0000,0000,0000,,that that portion that does\Nhave facial recognition, Dialogue: 0,1:10:18.97,1:10:20.83,Default,,0000,0000,0000,,that that can be censored? Dialogue: 0,1:10:20.83,1:10:24.43,Default,,0000,0000,0000,,That that is not, excuse\Nme, poor choice of words Dialogue: 0,1:10:24.43,1:10:26.35,Default,,0000,0000,0000,,that that can be excluded. Dialogue: 0,1:10:26.35,1:10:29.34,Default,,0000,0000,0000,,And so I'm not comfortable\Nwith that contract Dialogue: 0,1:10:29.34,1:10:31.00,Default,,0000,0000,0000,,until I know more about it. Dialogue: 0,1:10:31.00,1:10:33.86,Default,,0000,0000,0000,,I don't want any part\Nof facial recognition. Dialogue: 0,1:10:33.86,1:10:36.82,Default,,0000,0000,0000,,But as I read into testimony, Dialogue: 0,1:10:36.82,1:10:39.91,Default,,0000,0000,0000,,all the technology\Nthat's going forward now Dialogue: 0,1:10:39.91,1:10:41.69,Default,,0000,0000,0000,,in many fields is like, Dialogue: 0,1:10:41.69,1:10:43.32,Default,,0000,0000,0000,,hey we have facial recognition Dialogue: 0,1:10:43.32,1:10:45.65,Default,,0000,0000,0000,,and I'm like not comfortable with that. Dialogue: 0,1:10:45.65,1:10:50.43,Default,,0000,0000,0000,,And so BriefCam, yeah the old version, Dialogue: 0,1:10:50.43,1:10:52.53,Default,,0000,0000,0000,,we hardly ever used it. Dialogue: 0,1:10:52.53,1:10:54.85,Default,,0000,0000,0000,,And there is discussion on the new version Dialogue: 0,1:10:54.85,1:10:56.59,Default,,0000,0000,0000,,and I'm not comfortable Dialogue: 0,1:10:56.59,1:10:58.89,Default,,0000,0000,0000,,with the facial recognition\Ncomponent of that. Dialogue: 0,1:11:00.34,1:11:01.21,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,1:11:01.21,1:11:04.05,Default,,0000,0000,0000,,And thank you very much\NKade for the background Dialogue: 0,1:11:04.05,1:11:07.01,Default,,0000,0000,0000,,and understanding both of\Nyou, and understanding. 
Dialogue: 0,1:11:07.01,1:11:09.64,Default,,0000,0000,0000,,I felt a sense to move\Nfast and I'm moving fast, Dialogue: 0,1:11:09.64,1:11:12.32,Default,,0000,0000,0000,,but I was trying to make\Nsure I understood what for. Dialogue: 0,1:11:12.32,1:11:14.14,Default,,0000,0000,0000,,I really do have to go. Dialogue: 0,1:11:14.14,1:11:15.33,Default,,0000,0000,0000,,Yes, thank you very much. Dialogue: 0,1:11:15.33,1:11:18.34,Default,,0000,0000,0000,,Thank you all for your input. Dialogue: 0,1:11:18.34,1:11:21.19,Default,,0000,0000,0000,,Will there someone be taking\Nnotes or is there someone, Dialogue: 0,1:11:23.32,1:11:25.08,Default,,0000,0000,0000,,I guess Neil from the mayor's\Noffice will be taking notes. Dialogue: 0,1:11:25.08,1:11:28.00,Default,,0000,0000,0000,,And I have my own team\Ntaking notes right now. Dialogue: 0,1:11:28.00,1:11:30.07,Default,,0000,0000,0000,,Excellent, thank you very much. Dialogue: 0,1:11:30.07,1:11:32.23,Default,,0000,0000,0000,,Thank you for your testimony. Dialogue: 0,1:11:32.23,1:11:34.48,Default,,0000,0000,0000,,If it weren't for\Nadvocates, such as everyone Dialogue: 0,1:11:34.48,1:11:39.48,Default,,0000,0000,0000,,on this Zoom call, I wouldn't\Nbe here in this capacity. Dialogue: 0,1:11:39.92,1:11:40.78,Default,,0000,0000,0000,,So thank you. Dialogue: 0,1:11:41.94,1:11:43.56,Default,,0000,0000,0000,,Thank you very much commissioner. Dialogue: 0,1:11:43.56,1:11:46.52,Default,,0000,0000,0000,,Thank you everyone take care. Dialogue: 0,1:11:46.52,1:11:48.25,Default,,0000,0000,0000,,Okay, thank you very much. Dialogue: 0,1:11:48.25,1:11:52.32,Default,,0000,0000,0000,,And so we have a still a\Nrobust conversation to have Dialogue: 0,1:11:52.32,1:11:56.56,Default,,0000,0000,0000,,amongst ourselves about the\Nlanguage, about the goals, Dialogue: 0,1:11:56.56,1:11:59.71,Default,,0000,0000,0000,,whether it goes far enough,\Nif anyone opposes this ban, Dialogue: 0,1:11:59.71,1:12:02.12,Default,,0000,0000,0000,,and I want people to understand Dialogue: 0,1:12:02.12,1:12:04.81,Default,,0000,0000,0000,,if there is someone who\Ndisagrees with a lot of us, Dialogue: 0,1:12:04.81,1:12:06.81,Default,,0000,0000,0000,,this will be a respectful conversation Dialogue: 0,1:12:06.81,1:12:09.34,Default,,0000,0000,0000,,and that person will be\Nwelcome to have their opinion Dialogue: 0,1:12:09.34,1:12:11.82,Default,,0000,0000,0000,,and express it as every single one of us Dialogue: 0,1:12:11.82,1:12:12.82,Default,,0000,0000,0000,,has been able to do. Dialogue: 0,1:12:13.66,1:12:14.67,Default,,0000,0000,0000,,So I'm gonna continue now, Dialogue: 0,1:12:14.67,1:12:17.49,Default,,0000,0000,0000,,there's a list of folks\Nwho have signed up to speak Dialogue: 0,1:12:17.49,1:12:19.89,Default,,0000,0000,0000,,and I'm gonna continue down that list Dialogue: 0,1:12:19.89,1:12:23.57,Default,,0000,0000,0000,,and keep them to no\Nmore than three minutes Dialogue: 0,1:12:24.47,1:12:26.01,Default,,0000,0000,0000,,and then we're gonna open up. Dialogue: 0,1:12:26.01,1:12:28.37,Default,,0000,0000,0000,,Oh, I'm so sorry, I apologize. Dialogue: 0,1:12:28.37,1:12:31.29,Default,,0000,0000,0000,,Before I go down a list,\NI know my councilor, Dialogue: 0,1:12:31.29,1:12:33.73,Default,,0000,0000,0000,,my colleagues also may\Nhave questions or concerns Dialogue: 0,1:12:33.73,1:12:36.38,Default,,0000,0000,0000,,or may wanna voice certain\Nthings about the legislation. Dialogue: 0,1:12:36.38,1:12:37.79,Default,,0000,0000,0000,,And I apologize. 
Dialogue: 0,1:12:37.79,1:12:38.79,Default,,0000,0000,0000,,[Lydia giggles] Dialogue: 0,1:12:38.79,1:12:40.97,Default,,0000,0000,0000,,So I'm just so amped to get\Nto the public testimony. Dialogue: 0,1:12:40.97,1:12:43.31,Default,,0000,0000,0000,,So I'm gonna go ahead and\Ngo in order of arrival, Dialogue: 0,1:12:43.31,1:12:45.45,Default,,0000,0000,0000,,the two lead sponsors have\Nasked questions already Dialogue: 0,1:12:45.45,1:12:47.04,Default,,0000,0000,0000,,of the commissioner. Dialogue: 0,1:12:47.04,1:12:49.58,Default,,0000,0000,0000,,Did councilor\NWu or councilor Arroyo Dialogue: 0,1:12:49.58,1:12:52.98,Default,,0000,0000,0000,,have any questions for the\Npanelists that just spoke? Dialogue: 0,1:12:52.98,1:12:55.38,Default,,0000,0000,0000,,If not, I'll move on\Nto my other colleagues. Dialogue: 0,1:12:56.86,1:12:59.07,Default,,0000,0000,0000,,MICHELLE: I'm happy to\Ndefer to colleagues, Dialogue: 0,1:12:59.07,1:12:59.90,Default,,0000,0000,0000,,thank you madam chair. Dialogue: 0,1:12:59.90,1:13:01.90,Default,,0000,0000,0000,,Very well, councilor Arroyo. Dialogue: 0,1:13:01.90,1:13:03.92,Default,,0000,0000,0000,,The only question I would have, Dialogue: 0,1:13:03.92,1:13:06.80,Default,,0000,0000,0000,,which is more to, I\Nthink Kade from the ACLU, Dialogue: 0,1:13:06.80,1:13:09.26,Default,,0000,0000,0000,,is whether or not she has any idea Dialogue: 0,1:13:09.26,1:13:10.41,Default,,0000,0000,0000,,where we are on that contract. Dialogue: 0,1:13:10.41,1:13:13.01,Default,,0000,0000,0000,,From what I heard from commissioner Gross, Dialogue: 0,1:13:13.01,1:13:15.32,Default,,0000,0000,0000,,it sounded like he couldn't confirm Dialogue: 0,1:13:15.32,1:13:16.94,Default,,0000,0000,0000,,the timeline for that contract, Dialogue: 0,1:13:16.94,1:13:18.66,Default,,0000,0000,0000,,whether or not it's been\Nsigned or not signed, Dialogue: 0,1:13:18.66,1:13:20.38,Default,,0000,0000,0000,,what's going on with that contract. Dialogue: 0,1:13:20.38,1:13:22.82,Default,,0000,0000,0000,,Do any panelists here\Nhave any information Dialogue: 0,1:13:22.82,1:13:24.91,Default,,0000,0000,0000,,on whether or not that contract is, Dialogue: 0,1:13:24.91,1:13:26.72,Default,,0000,0000,0000,,what the status of that is? Dialogue: 0,1:13:26.72,1:13:28.76,Default,,0000,0000,0000,,Thanks for the question, councilor Arroyo. Dialogue: 0,1:13:28.76,1:13:29.96,Default,,0000,0000,0000,,Regrettably no. Dialogue: 0,1:13:29.96,1:13:32.40,Default,,0000,0000,0000,,We filed a public records request Dialogue: 0,1:13:32.40,1:13:34.84,Default,,0000,0000,0000,,with the Boston Police\NDepartment on May 14th, Dialogue: 0,1:13:34.84,1:13:36.40,Default,,0000,0000,0000,,so that was about a month ago, Dialogue: 0,1:13:36.40,1:13:38.38,Default,,0000,0000,0000,,asking just for one simple document Dialogue: 0,1:13:38.38,1:13:40.86,Default,,0000,0000,0000,,for the existing contract Dialogue: 0,1:13:40.86,1:13:43.40,Default,,0000,0000,0000,,that the Boston Police\NDepartment has with BriefCam Dialogue: 0,1:13:43.40,1:13:44.99,Default,,0000,0000,0000,,and they haven't sent it to us yet. 
Dialogue: 0,1:13:44.99,1:13:49.05,Default,,0000,0000,0000,,So, and we've prodded them\Nmultiple times about it, Dialogue: 0,1:13:49.05,1:13:51.15,Default,,0000,0000,0000,,including on Friday saying, Dialogue: 0,1:13:51.15,1:13:51.98,Default,,0000,0000,0000,,it would be a real shame Dialogue: 0,1:13:51.98,1:13:54.20,Default,,0000,0000,0000,,if we had to go to this hearing on Tuesday Dialogue: 0,1:13:54.20,1:13:57.13,Default,,0000,0000,0000,,without this information and\Nwe still haven't received it, Dialogue: 0,1:13:57.13,1:13:58.28,Default,,0000,0000,0000,,so we do not. Dialogue: 0,1:13:59.38,1:14:00.56,Default,,0000,0000,0000,,Thank you. Dialogue: 0,1:14:01.76,1:14:04.65,Default,,0000,0000,0000,,That all, councilor Arroyo? Dialogue: 0,1:14:04.65,1:14:06.64,Default,,0000,0000,0000,,Very well, councilor Breadon. Dialogue: 0,1:14:10.95,1:14:12.54,Default,,0000,0000,0000,,Thank you. Dialogue: 0,1:14:12.54,1:14:14.61,Default,,0000,0000,0000,,This has been very informative, Dialogue: 0,1:14:14.61,1:14:18.46,Default,,0000,0000,0000,,very good questions from my colleagues. Dialogue: 0,1:14:18.46,1:14:23.46,Default,,0000,0000,0000,,One question I had was, other\Nagencies, federal agencies, Dialogue: 0,1:14:24.27,1:14:26.15,Default,,0000,0000,0000,,are they using facial recognition? Dialogue: 0,1:14:26.15,1:14:29.64,Default,,0000,0000,0000,,And if they are, are they\Nsharing that information Dialogue: 0,1:14:29.64,1:14:30.99,Default,,0000,0000,0000,,with our police department? Dialogue: 0,1:14:33.15,1:14:35.86,Default,,0000,0000,0000,,Does anyone know the\Nanswer to that question? Dialogue: 0,1:14:35.86,1:14:39.05,Default,,0000,0000,0000,,I can speak to that, Joy\NBuolamwini if you're still on, Dialogue: 0,1:14:39.05,1:14:41.29,Default,,0000,0000,0000,,you may wanna speak to some of those too. Dialogue: 0,1:14:41.29,1:14:46.29,Default,,0000,0000,0000,,So the ACLU, we nationwide have\Nbeen filing FOIA requests, Dialogue: 0,1:14:47.06,1:14:49.36,Default,,0000,0000,0000,,Freedom of Information Act Requests Dialogue: 0,1:14:49.36,1:14:51.38,Default,,0000,0000,0000,,with various federal agencies Dialogue: 0,1:14:51.38,1:14:53.43,Default,,0000,0000,0000,,to learn about how the federal government Dialogue: 0,1:14:55.02,1:14:56.51,Default,,0000,0000,0000,,is using this technology\Nand how they're sharing Dialogue: 0,1:14:57.87,1:14:59.19,Default,,0000,0000,0000,,information derived from the technology Dialogue: 0,1:14:59.19,1:15:00.86,Default,,0000,0000,0000,,with state and local law enforcement. Dialogue: 0,1:15:00.86,1:15:03.18,Default,,0000,0000,0000,,We know for example, that\Nin the city of Boston, Dialogue: 0,1:15:03.18,1:15:06.02,Default,,0000,0000,0000,,the Boston Police Department works closely Dialogue: 0,1:15:06.02,1:15:07.06,Default,,0000,0000,0000,,not only with I.C.E, Dialogue: 0,1:15:07.06,1:15:09.26,Default,,0000,0000,0000,,which has been the subject\Nof much consternation Dialogue: 0,1:15:09.26,1:15:11.47,Default,,0000,0000,0000,,and debate before this body, Dialogue: 0,1:15:11.47,1:15:12.90,Default,,0000,0000,0000,,but also with the FBI, Dialogue: 0,1:15:12.90,1:15:16.00,Default,,0000,0000,0000,,through something called the\NJoint Terrorism Task Force. Dialogue: 0,1:15:16.00,1:15:17.21,Default,,0000,0000,0000,,The Boston Police Department Dialogue: 0,1:15:17.21,1:15:21.27,Default,,0000,0000,0000,,actually has detectives\Nassigned to that JTTF unit, Dialogue: 0,1:15:21.27,1:15:23.14,Default,,0000,0000,0000,,who act as federal agents. 
Dialogue: 0,1:15:23.14,1:15:25.34,Default,,0000,0000,0000,,So one concern that we have is that Dialogue: 0,1:15:25.34,1:15:27.22,Default,,0000,0000,0000,,even if the city of Boston banned Dialogue: 0,1:15:27.22,1:15:30.00,Default,,0000,0000,0000,,face surveillance technology\Nfor city employees, Dialogue: 0,1:15:30.00,1:15:32.86,Default,,0000,0000,0000,,certainly it would be\Nthe case that the FBI Dialogue: 0,1:15:32.86,1:15:34.20,Default,,0000,0000,0000,,and other federal agencies Dialogue: 0,1:15:34.20,1:15:36.50,Default,,0000,0000,0000,,would be able to continue\Nto use this technology Dialogue: 0,1:15:36.50,1:15:40.15,Default,,0000,0000,0000,,and likely to share\Ninformation that comes from it Dialogue: 0,1:15:40.15,1:15:42.22,Default,,0000,0000,0000,,with the Boston Police Department\Nand other city agencies. Dialogue: 0,1:15:42.22,1:15:45.67,Default,,0000,0000,0000,,Unfortunately, there's\Nnothing we can do about that Dialogue: 0,1:15:45.67,1:15:46.84,Default,,0000,0000,0000,,at the city level, Dialogue: 0,1:15:46.84,1:15:48.69,Default,,0000,0000,0000,,which is why the ACLU Dialogue: 0,1:15:48.69,1:15:50.99,Default,,0000,0000,0000,,is also working with partners in congress Dialogue: 0,1:15:50.99,1:15:53.87,Default,,0000,0000,0000,,to try to address this\Nat the federal level as well. Dialogue: 0,1:15:56.21,1:15:57.64,Default,,0000,0000,0000,,Thank you, that answered my question Dialogue: 0,1:15:57.64,1:16:00.90,Default,,0000,0000,0000,,and thank you for your work\Non this very important issue. Dialogue: 0,1:16:00.90,1:16:04.57,Default,,0000,0000,0000,,Councilor, madam chair,\Nthat's all the questions Dialogue: 0,1:16:04.57,1:16:05.40,Default,,0000,0000,0000,,I have for now. Dialogue: 0,1:16:06.27,1:16:08.28,Default,,0000,0000,0000,,Thank you very much, councilor Bok. Dialogue: 0,1:16:08.28,1:16:10.64,Default,,0000,0000,0000,,So, Mejia. Dialogue: 0,1:16:12.36,1:16:14.32,Default,,0000,0000,0000,,I'm so sorry, councilor Bok. Dialogue: 0,1:16:14.32,1:16:16.13,Default,,0000,0000,0000,,Thank you councilor Breadon. Dialogue: 0,1:16:16.13,1:16:19.23,Default,,0000,0000,0000,,Next is councilor Bok, I apologize. Dialogue: 0,1:16:19.23,1:16:21.96,Default,,0000,0000,0000,,Thank you, thanks councilor Edwards. Dialogue: 0,1:16:21.96,1:16:24.80,Default,,0000,0000,0000,,My question was actually\Njust, it's for the panelists. Dialogue: 0,1:16:24.80,1:16:27.25,Default,,0000,0000,0000,,Whoever wants to jump in, Dialogue: 0,1:16:27.25,1:16:29.59,Default,,0000,0000,0000,,the police commissioner\Nwas making a distinction Dialogue: 0,1:16:29.59,1:16:32.77,Default,,0000,0000,0000,,between facial surveillance\Nand facial recognition systems Dialogue: 0,1:16:32.77,1:16:35.76,Default,,0000,0000,0000,,and what we might be banning or not. Dialogue: 0,1:16:35.76,1:16:37.41,Default,,0000,0000,0000,,And I'm just wondering, Dialogue: 0,1:16:37.41,1:16:39.14,Default,,0000,0000,0000,,I don't know the literature\Nin this world well enough Dialogue: 0,1:16:39.14,1:16:41.92,Default,,0000,0000,0000,,to know if that's sort of a\Nstrong existing distinction, Dialogue: 0,1:16:41.92,1:16:44.58,Default,,0000,0000,0000,,whether you think this piece\Nof legislation in front of us Dialogue: 0,1:16:44.58,1:16:45.64,Default,,0000,0000,0000,,bans one or the other, Dialogue: 0,1:16:45.64,1:16:48.60,Default,,0000,0000,0000,,whether you think we should\Nbe making that distinction, Dialogue: 0,1:16:48.60,1:16:51.40,Default,,0000,0000,0000,,I'd just be curious for anybody\Nto weigh in on that front. 
Dialogue: 0,1:16:51.40,1:16:55.02,Default,,0000,0000,0000,,Got it, so when we hear the\Nterm facial recognition, Dialogue: 0,1:16:55.02,1:16:58.37,Default,,0000,0000,0000,,oftentimes what it\Nmeans is not so clear. Dialogue: 0,1:16:58.37,1:17:00.60,Default,,0000,0000,0000,,And so with the way the\Ncurrent bill is written, Dialogue: 0,1:17:00.60,1:17:03.35,Default,,0000,0000,0000,,it actually covers a wide range Dialogue: 0,1:17:03.35,1:17:06.90,Default,,0000,0000,0000,,of different kinds of facial\Nrecognition technologies. Dialogue: 0,1:17:06.90,1:17:09.60,Default,,0000,0000,0000,,And I say technologies plural Dialogue: 0,1:17:09.60,1:17:12.28,Default,,0000,0000,0000,,to emphasize we're talking\Nabout different things. Dialogue: 0,1:17:12.28,1:17:15.28,Default,,0000,0000,0000,,So for example, you have face recognition, Dialogue: 0,1:17:15.28,1:17:18.21,Default,,0000,0000,0000,,which is about identifying\Na unique individual. Dialogue: 0,1:17:18.21,1:17:20.72,Default,,0000,0000,0000,,So when we're talking\Nabout face surveillance, Dialogue: 0,1:17:20.72,1:17:22.21,Default,,0000,0000,0000,,that is the issue. Dialogue: 0,1:17:22.21,1:17:25.14,Default,,0000,0000,0000,,But you can also have facial analysis, Dialogue: 0,1:17:25.14,1:17:27.05,Default,,0000,0000,0000,,that's guessing somebody's gender Dialogue: 0,1:17:27.05,1:17:29.27,Default,,0000,0000,0000,,or somebody's age or other systems Dialogue: 0,1:17:29.27,1:17:32.90,Default,,0000,0000,0000,,that might try to infer your\Nsexuality or your religion. Dialogue: 0,1:17:32.90,1:17:34.86,Default,,0000,0000,0000,,So you can still discriminate Dialogue: 0,1:17:34.86,1:17:38.38,Default,,0000,0000,0000,,even if it's not\Ntechnically face recognition Dialogue: 0,1:17:38.38,1:17:39.85,Default,,0000,0000,0000,,in the technical sense. Dialogue: 0,1:17:39.85,1:17:42.74,Default,,0000,0000,0000,,So, it's important to have\Na really broad definition. Dialogue: 0,1:17:42.74,1:17:44.40,Default,,0000,0000,0000,,You also have face detection. Dialogue: 0,1:17:44.40,1:17:46.73,Default,,0000,0000,0000,,So if you think about weapon systems Dialogue: 0,1:17:46.73,1:17:49.26,Default,,0000,0000,0000,,where you're just detecting\Nthe presence of a face Dialogue: 0,1:17:49.26,1:17:51.88,Default,,0000,0000,0000,,without saying a specific individual Dialogue: 0,1:17:51.88,1:17:53.69,Default,,0000,0000,0000,,that can still be problematic. Dialogue: 0,1:17:53.69,1:17:55.75,Default,,0000,0000,0000,,You also have companies like Faception Dialogue: 0,1:17:55.75,1:17:58.18,Default,,0000,0000,0000,,that say, we can infer criminality Dialogue: 0,1:17:58.18,1:18:00.03,Default,,0000,0000,0000,,just by looking at your face, Dialogue: 0,1:18:00.03,1:18:04.48,Default,,0000,0000,0000,,your potential to be a pedophile,\Na murderer, a terrorist. Dialogue: 0,1:18:04.48,1:18:07.01,Default,,0000,0000,0000,,And these are the kinds of systems Dialogue: 0,1:18:07.01,1:18:09.87,Default,,0000,0000,0000,,that companies are attempting\Nto sell law enforcement. Dialogue: 0,1:18:09.87,1:18:12.57,Default,,0000,0000,0000,,So it's crucial that any legislation Dialogue: 0,1:18:12.57,1:18:13.91,Default,,0000,0000,0000,,that is being written, Dialogue: 0,1:18:13.91,1:18:16.26,Default,,0000,0000,0000,,has a sufficiently broad definition Dialogue: 0,1:18:16.26,1:18:21.26,Default,,0000,0000,0000,,of a wide range of facial\Nrecognition technologies, right? 
Dialogue: 0,1:18:21.45,1:18:24.67,Default,,0000,0000,0000,,With the plural so that\Nyou don't get a loophole, Dialogue: 0,1:18:26.28,1:18:27.74,Default,,0000,0000,0000,,which says, oh, we're\Ndoing face identification Dialogue: 0,1:18:27.74,1:18:31.10,Default,,0000,0000,0000,,or face verification, which\Nfalls under recognition. Dialogue: 0,1:18:31.10,1:18:33.91,Default,,0000,0000,0000,,So everything we're doing\Nover here doesn't matter, Dialogue: 0,1:18:33.91,1:18:34.90,Default,,0000,0000,0000,,but it does. Dialogue: 0,1:18:34.90,1:18:36.65,Default,,0000,0000,0000,,So thank you so much for that question Dialogue: 0,1:18:36.65,1:18:38.92,Default,,0000,0000,0000,,because this is an\Nimportant clarification. Dialogue: 0,1:18:42.09,1:18:44.00,Default,,0000,0000,0000,,Great, thanks so much. Dialogue: 0,1:18:44.00,1:18:45.60,Default,,0000,0000,0000,,Was there anybody else who\Nwanted to comment on that front? Dialogue: 0,1:18:46.44,1:18:48.07,Default,,0000,0000,0000,,No, okay, great. Dialogue: 0,1:18:48.07,1:18:50.08,Default,,0000,0000,0000,,Thank you, that was really helpful Joy. Dialogue: 0,1:18:50.08,1:18:51.88,Default,,0000,0000,0000,,Madam chair, I'm mindful\Nof the public testimony, Dialogue: 0,1:18:51.88,1:18:53.25,Default,,0000,0000,0000,,so that'll be it for me. Dialogue: 0,1:18:54.70,1:18:55.57,Default,,0000,0000,0000,,Thank you very much councilor Bok, Dialogue: 0,1:18:55.57,1:18:56.68,Default,,0000,0000,0000,,councilor Mejia. Dialogue: 0,1:18:58.50,1:19:00.94,Default,,0000,0000,0000,,Thank you again to the panelists Dialogue: 0,1:19:00.94,1:19:04.48,Default,,0000,0000,0000,,for educating us beforehand Dialogue: 0,1:19:04.48,1:19:07.90,Default,,0000,0000,0000,,and always sharing all of\Nthis amazing information Dialogue: 0,1:19:07.90,1:19:11.66,Default,,0000,0000,0000,,and data and research that\Ninforms our thinking every day. Dialogue: 0,1:19:11.66,1:19:14.22,Default,,0000,0000,0000,,I am just curious Dialogue: 0,1:19:14.22,1:19:18.72,Default,,0000,0000,0000,,and at the same time a little bit worried Dialogue: 0,1:19:18.72,1:19:23.28,Default,,0000,0000,0000,,about, just kind of like, Dialogue: 0,1:19:23.28,1:19:27.68,Default,,0000,0000,0000,,how we can prevent this\Nfrom ever passing ever. Dialogue: 0,1:19:27.68,1:19:30.72,Default,,0000,0000,0000,,Because it seems to me that Dialogue: 0,1:19:30.72,1:19:33.46,Default,,0000,0000,0000,,the way it's been positioned\Nis that it's right for now, Dialogue: 0,1:19:33.46,1:19:37.63,Default,,0000,0000,0000,,because it's not accurate, Dialogue: 0,1:19:37.63,1:19:39.52,Default,,0000,0000,0000,,but I'm just curious what happened Dialogue: 0,1:19:39.52,1:19:44.52,Default,,0000,0000,0000,,if you've seen other cities\Nor just even around the world Dialogue: 0,1:19:44.99,1:19:46.40,Default,,0000,0000,0000,,or any other incidences Dialogue: 0,1:19:46.40,1:19:51.08,Default,,0000,0000,0000,,where this has kind of slipped\Nunder the radar and has passed. Dialogue: 0,1:19:53.25,1:19:54.93,Default,,0000,0000,0000,,I don't know if I'm making\Nany sense here but... 
Dialogue: 0,1:19:54.93,1:19:58.44,Default,,0000,0000,0000,,Basically the bottom line is\Nwhat I'm trying to understand Dialogue: 0,1:19:58.44,1:20:00.78,Default,,0000,0000,0000,,is that here we are standing firm Dialogue: 0,1:20:00.78,1:20:03.56,Default,,0000,0000,0000,,and banning facial recognition, Dialogue: 0,1:20:03.56,1:20:06.88,Default,,0000,0000,0000,,from what I understand at this point, Dialogue: 0,1:20:06.88,1:20:10.78,Default,,0000,0000,0000,,our commissioner is in agreement, Dialogue: 0,1:20:10.78,1:20:13.91,Default,,0000,0000,0000,,because of the issue of accuracy, right? Dialogue: 0,1:20:13.91,1:20:16.13,Default,,0000,0000,0000,,How can we get ahead of this situation Dialogue: 0,1:20:16.13,1:20:20.97,Default,,0000,0000,0000,,in a way that we can write this ordinance Dialogue: 0,1:20:20.97,1:20:25.58,Default,,0000,0000,0000,,to ensure that regardless of\Nwhether or not it is accurate, Dialogue: 0,1:20:25.58,1:20:26.96,Default,,0000,0000,0000,,that we can still protect our residents. Dialogue: 0,1:20:26.96,1:20:28.67,Default,,0000,0000,0000,,Do you get what I'm trying to say? Dialogue: 0,1:20:28.67,1:20:31.16,Default,,0000,0000,0000,,I do councilor Mejia\Nand thank you for that. Dialogue: 0,1:20:31.16,1:20:33.54,Default,,0000,0000,0000,,I can address that quickly. Dialogue: 0,1:20:33.54,1:20:37.71,Default,,0000,0000,0000,,The council can't bind\Nfuture city councils, right? Dialogue: 0,1:20:37.71,1:20:40.30,Default,,0000,0000,0000,,So I agree with you. Dialogue: 0,1:20:40.30,1:20:42.77,Default,,0000,0000,0000,,I think that this technology is dangerous, Dialogue: 0,1:20:42.77,1:20:44.40,Default,,0000,0000,0000,,whether it works or it doesn't. Dialogue: 0,1:20:44.40,1:20:46.53,Default,,0000,0000,0000,,And I think that's why the city of Boston Dialogue: 0,1:20:46.53,1:20:50.09,Default,,0000,0000,0000,,ought to take the step to\Nban its use in government. Dialogue: 0,1:20:50.09,1:20:52.09,Default,,0000,0000,0000,,I can promise you that as advocates, Dialogue: 0,1:20:52.09,1:20:53.90,Default,,0000,0000,0000,,we will show up to ensure Dialogue: 0,1:20:53.90,1:20:56.68,Default,,0000,0000,0000,,that these protections\Npersist in the city of Boston, Dialogue: 0,1:20:56.68,1:20:59.09,Default,,0000,0000,0000,,as long as I'm alive. [Kade laughs] Dialogue: 0,1:20:59.09,1:21:02.79,Default,,0000,0000,0000,,And I think that's true of many\Nof my friends and colleagues Dialogue: 0,1:21:02.79,1:21:04.80,Default,,0000,0000,0000,,in the advocacy space. Dialogue: 0,1:21:04.80,1:21:07.43,Default,,0000,0000,0000,,But some cities, some states Dialogue: 0,1:21:07.43,1:21:08.65,Default,,0000,0000,0000,,have considered approaches of Dialogue: 0,1:21:08.65,1:21:10.40,Default,,0000,0000,0000,,for example passing a moratorium Dialogue: 0,1:21:10.40,1:21:12.87,Default,,0000,0000,0000,,that expires after a few years, Dialogue: 0,1:21:12.87,1:21:14.79,Default,,0000,0000,0000,,we did not choose that route here, Dialogue: 0,1:21:14.79,1:21:17.18,Default,,0000,0000,0000,,in fact because of our concerns Dialogue: 0,1:21:17.18,1:21:19.24,Default,,0000,0000,0000,,that I think are identical to yours. 
Dialogue: 0,1:21:19.24,1:21:22.69,Default,,0000,0000,0000,,It's our view that we should\Nnever wanna live in a society Dialogue: 0,1:21:22.69,1:21:24.36,Default,,0000,0000,0000,,where the government can track us Dialogue: 0,1:21:25.24,1:21:26.82,Default,,0000,0000,0000,,through these surveillance cameras Dialogue: 0,1:21:26.82,1:21:29.49,Default,,0000,0000,0000,,by our face wherever we go\Nor automatically get alerts, Dialogue: 0,1:21:29.49,1:21:31.31,Default,,0000,0000,0000,,just because I happened\Nto walk past the camera Dialogue: 0,1:21:31.31,1:21:32.20,Default,,0000,0000,0000,,in a certain neighborhood. Dialogue: 0,1:21:32.20,1:21:33.48,Default,,0000,0000,0000,,That's a kind of surveillance Dialogue: 0,1:21:33.48,1:21:36.18,Default,,0000,0000,0000,,that should never exist in a free society. Dialogue: 0,1:21:36.18,1:21:38.63,Default,,0000,0000,0000,,So we agree that it should\Nbe permanently banned. Dialogue: 0,1:21:39.63,1:21:40.46,Default,,0000,0000,0000,,Thank you for that. Dialogue: 0,1:21:40.46,1:21:43.71,Default,,0000,0000,0000,,And then the other\Nquestion that I have is, Dialogue: 0,1:21:43.71,1:21:45.80,Default,,0000,0000,0000,,I know that, where the city council Dialogue: 0,1:21:45.80,1:21:47.96,Default,,0000,0000,0000,,and all of the jurisdiction is all things Dialogue: 0,1:21:47.96,1:21:49.57,Default,,0000,0000,0000,,that deal with the city, Dialogue: 0,1:21:49.57,1:21:54.49,Default,,0000,0000,0000,,but I'm just wondering\Nwhat if any examples Dialogue: 0,1:21:54.49,1:21:55.54,Default,,0000,0000,0000,,have you heard of other, Dialogue: 0,1:21:55.54,1:21:57.73,Default,,0000,0000,0000,,like maybe within the private sector Dialogue: 0,1:21:57.73,1:22:02.03,Default,,0000,0000,0000,,that are utilizing or considering\Nto utilize this as a form? Dialogue: 0,1:22:02.03,1:22:04.60,Default,,0000,0000,0000,,Is there anything, any\Ntraction or anything Dialogue: 0,1:22:04.60,1:22:05.43,Default,,0000,0000,0000,,that we need to be mindful of? Dialogue: 0,1:22:05.43,1:22:07.91,Default,,0000,0000,0000,,Happening outside of city\Ngovernment, if you will. Dialogue: 0,1:22:10.58,1:22:11.56,Default,,0000,0000,0000,,Joy, do you wanna speak Dialogue: 0,1:22:11.56,1:22:14.30,Default,,0000,0000,0000,,to some of the commercial applications? Dialogue: 0,1:22:14.30,1:22:17.06,Default,,0000,0000,0000,,Sure, I'm so glad you're\Nasking this question Dialogue: 0,1:22:17.06,1:22:19.03,Default,,0000,0000,0000,,because facial recognition technologies Dialogue: 0,1:22:19.03,1:22:22.99,Default,,0000,0000,0000,,broadly speaking are not\Njust in the government realm. Dialogue: 0,1:22:22.99,1:22:27.01,Default,,0000,0000,0000,,So right now, especially\Nwith the COVID pandemic, Dialogue: 0,1:22:27.01,1:22:30.29,Default,,0000,0000,0000,,you're starting to see\Nmore people make proposals Dialogue: 0,1:22:30.29,1:22:33.06,Default,,0000,0000,0000,,of using facial recognition\Ntechnologies in different ways. Dialogue: 0,1:22:33.06,1:22:35.39,Default,,0000,0000,0000,,Some that I've seen start to surface are, Dialogue: 0,1:22:35.39,1:22:39.77,Default,,0000,0000,0000,,can we use facial recognition\Nfor contactless payments? Dialogue: 0,1:22:39.77,1:22:42.90,Default,,0000,0000,0000,,So your face becomes what you pay with. Dialogue: 0,1:22:42.90,1:22:45.89,Default,,0000,0000,0000,,You also have the case Dialogue: 0,1:22:45.89,1:22:48.90,Default,,0000,0000,0000,,of using facial analysis in employment. 
Dialogue: 0,1:22:48.90,1:22:50.40,Default,,0000,0000,0000,,So there's a company HireVue Dialogue: 0,1:22:50.40,1:22:53.63,Default,,0000,0000,0000,,that says we will analyze\Nyour facial movements Dialogue: 0,1:22:53.63,1:22:56.92,Default,,0000,0000,0000,,and use that to inform hiring decisions. Dialogue: 0,1:22:56.92,1:23:00.38,Default,,0000,0000,0000,,And guess what we train on\Nthe current top performers. Dialogue: 0,1:23:00.38,1:23:03.93,Default,,0000,0000,0000,,So the biases that are already\Nthere can be propagated Dialogue: 0,1:23:03.93,1:23:06.38,Default,,0000,0000,0000,,and you actually might\Npurchase that technology Dialogue: 0,1:23:06.38,1:23:09.65,Default,,0000,0000,0000,,thinking you're trying to remove bias. Dialogue: 0,1:23:09.65,1:23:12.75,Default,,0000,0000,0000,,And so there are many ways\Nin which these technologies Dialogue: 0,1:23:12.75,1:23:16.70,Default,,0000,0000,0000,,might be presented with good intent, Dialogue: 0,1:23:16.70,1:23:18.84,Default,,0000,0000,0000,,but when you look at the ways in which Dialogue: 0,1:23:18.84,1:23:21.47,Default,,0000,0000,0000,,it actually manifests, there are harms Dialogue: 0,1:23:21.47,1:23:24.40,Default,,0000,0000,0000,,and oftentimes harms that\Ndisproportionately fall Dialogue: 0,1:23:24.40,1:23:25.96,Default,,0000,0000,0000,,on a marginalized group. Dialogue: 0,1:23:25.96,1:23:27.26,Default,,0000,0000,0000,,Even in the healthcare system. Dialogue: 0,1:23:27.26,1:23:29.53,Default,,0000,0000,0000,,I was reading research earlier Dialogue: 0,1:23:29.53,1:23:31.11,Default,,0000,0000,0000,,that was talking about ableism Dialogue: 0,1:23:31.11,1:23:33.15,Default,,0000,0000,0000,,when it comes to using some kinds of Dialogue: 0,1:23:33.15,1:23:35.08,Default,,0000,0000,0000,,facial analysis systems Dialogue: 0,1:23:35.08,1:23:37.19,Default,,0000,0000,0000,,within the healthcare context, Dialogue: 0,1:23:37.19,1:23:38.80,Default,,0000,0000,0000,,where it's not working as well Dialogue: 0,1:23:38.80,1:23:40.90,Default,,0000,0000,0000,,for older adults with dementia. Dialogue: 0,1:23:40.90,1:23:43.02,Default,,0000,0000,0000,,So the promises of these technologies Dialogue: 0,1:23:43.02,1:23:45.59,Default,,0000,0000,0000,,really have to be backed\Nup with the claims. Dialogue: 0,1:23:45.59,1:23:49.68,Default,,0000,0000,0000,,And I think the council can\Nactually go further by saying, Dialogue: 0,1:23:49.68,1:23:51.42,Default,,0000,0000,0000,,we're not only just talking about Dialogue: 0,1:23:51.42,1:23:54.24,Default,,0000,0000,0000,,government use of facial\Nrecognition technologies, Dialogue: 0,1:23:54.24,1:23:55.77,Default,,0000,0000,0000,,but if these technologies Dialogue: 0,1:23:55.77,1:23:58.36,Default,,0000,0000,0000,,impact someone's life in\Na material way, right? Dialogue: 0,1:23:58.36,1:24:00.82,Default,,0000,0000,0000,,So we're talking about\Neconomic opportunity. Dialogue: 0,1:24:00.82,1:24:02.50,Default,,0000,0000,0000,,We're talking about healthcare, Dialogue: 0,1:24:02.50,1:24:04.40,Default,,0000,0000,0000,,that there needs to be oversight Dialogue: 0,1:24:04.40,1:24:08.17,Default,,0000,0000,0000,,or clear restrictions depending\Non the type of application, Dialogue: 0,1:24:08.17,1:24:10.87,Default,,0000,0000,0000,,but the precautionary\Nprinciple press pause, Dialogue: 0,1:24:10.87,1:24:14.51,Default,,0000,0000,0000,,make sense when this technology\Nis nowhere near there. Dialogue: 0,1:24:14.51,1:24:15.34,Default,,0000,0000,0000,,Thank you. 
Dialogue: 0,1:24:15.34,1:24:17.02,Default,,0000,0000,0000,,For that and I'm not\Nsure if I got the gavel, Dialogue: 0,1:24:17.02,1:24:18.56,Default,,0000,0000,0000,,but I do have one more, Dialogue: 0,1:24:18.56,1:24:20.43,Default,,0000,0000,0000,,You got the, you got the... Dialogue: 0,1:24:20.43,1:24:21.95,Default,,0000,0000,0000,,[crosstalk] Dialogue: 0,1:24:21.95,1:24:25.55,Default,,0000,0000,0000,,But I'll wait. Dialogue: 0,1:24:25.55,1:24:29.56,Default,,0000,0000,0000,,Sorry, thank you so much\Nto you both, thank you. Dialogue: 0,1:24:32.10,1:24:33.15,Default,,0000,0000,0000,,Councilor Flaherty. Dialogue: 0,1:24:39.42,1:24:40.62,Default,,0000,0000,0000,,Okay, councilor Campbell said Dialogue: 0,1:24:40.62,1:24:43.69,Default,,0000,0000,0000,,she was having some spotty\Nissues with her internet. Dialogue: 0,1:24:45.02,1:24:45.86,Default,,0000,0000,0000,,So if she's not here, Dialogue: 0,1:24:45.86,1:24:49.31,Default,,0000,0000,0000,,I'm gonna go ahead and\Ngo to councilor O'Malley Dialogue: 0,1:24:49.31,1:24:51.92,Default,,0000,0000,0000,,who I'm not sure he's still on. Dialogue: 0,1:24:51.92,1:24:53.74,Default,,0000,0000,0000,,I'm here madam chair. Dialogue: 0,1:24:53.74,1:24:55.09,Default,,0000,0000,0000,,There you are, councilor O'Malley. Dialogue: 0,1:24:55.09,1:24:55.92,Default,,0000,0000,0000,,Thank you. Dialogue: 0,1:24:55.92,1:24:57.86,Default,,0000,0000,0000,,Yeah, I will be brief because\NI think it's important Dialogue: 0,1:24:57.86,1:24:59.08,Default,,0000,0000,0000,,to get to the public testimony, Dialogue: 0,1:24:59.08,1:25:00.01,Default,,0000,0000,0000,,but it seems to me that this sounds Dialogue: 0,1:25:00.01,1:25:03.82,Default,,0000,0000,0000,,like a very productive\Ngovernment operations hearing. Dialogue: 0,1:25:03.82,1:25:06.34,Default,,0000,0000,0000,,It sounds to me that obviously Dialogue: 0,1:25:06.34,1:25:08.61,Default,,0000,0000,0000,,there's a lot of support across the board Dialogue: 0,1:25:08.61,1:25:10.64,Default,,0000,0000,0000,,and it looks as though the commissioner's Dialogue: 0,1:25:10.64,1:25:12.47,Default,,0000,0000,0000,,asking for one more working session, Dialogue: 0,1:25:12.47,1:25:13.44,Default,,0000,0000,0000,,which seems to make sense Dialogue: 0,1:25:13.44,1:25:15.36,Default,,0000,0000,0000,,and his comments were very positive. Dialogue: 0,1:25:15.36,1:25:17.22,Default,,0000,0000,0000,,So I think this is a good thing. Dialogue: 0,1:25:17.22,1:25:18.10,Default,,0000,0000,0000,,So thank you to the advocates. Dialogue: 0,1:25:18.10,1:25:19.48,Default,,0000,0000,0000,,Thank you to my colleagues, Dialogue: 0,1:25:19.48,1:25:20.75,Default,,0000,0000,0000,,particularly the lead sponsors Dialogue: 0,1:25:20.75,1:25:23.26,Default,,0000,0000,0000,,and the commissioner for his\Nparticipation this afternoon. Dialogue: 0,1:25:23.26,1:25:24.93,Default,,0000,0000,0000,,And thank you for all the folks Dialogue: 0,1:25:24.93,1:25:26.25,Default,,0000,0000,0000,,whom we will hear from shortly. Dialogue: 0,1:25:26.25,1:25:27.94,Default,,0000,0000,0000,,That's all I've got. Dialogue: 0,1:25:27.94,1:25:30.13,Default,,0000,0000,0000,,Thank you, councilor Essaibi George. Dialogue: 0,1:25:31.16,1:25:32.40,Default,,0000,0000,0000,,Thank you madam chair. Dialogue: 0,1:25:32.40,1:25:33.55,Default,,0000,0000,0000,,And I do apologize for earlier, Dialogue: 0,1:25:33.55,1:25:38.55,Default,,0000,0000,0000,,taking a call and unmuting myself somehow. Dialogue: 0,1:25:39.46,1:25:40.29,Default,,0000,0000,0000,,I apologize for that. 
Dialogue: 0,1:25:40.29,1:25:42.41,Default,,0000,0000,0000,,I hope I didn't say anything inappropriate Dialogue: 0,1:25:42.41,1:25:44.30,Default,,0000,0000,0000,,or share any certain secrets. Dialogue: 0,1:25:44.30,1:25:46.51,Default,,0000,0000,0000,,Thank you to the advocates Dialogue: 0,1:25:46.51,1:25:48.61,Default,,0000,0000,0000,,and to the commissioner\Nfor being here today. Dialogue: 0,1:25:48.61,1:25:50.66,Default,,0000,0000,0000,,And thank you to the advocates Dialogue: 0,1:25:50.66,1:25:53.55,Default,,0000,0000,0000,,who have met with me over\Nthe last few months, Dialogue: 0,1:25:53.55,1:25:55.56,Default,,0000,0000,0000,,over a period of a few months, Dialogue: 0,1:25:55.56,1:25:58.43,Default,,0000,0000,0000,,just really learned a lot\Nthrough our times together, Dialogue: 0,1:25:58.43,1:26:01.54,Default,,0000,0000,0000,,especially from the younger folks. Dialogue: 0,1:26:01.54,1:26:02.97,Default,,0000,0000,0000,,I am interested in some, Dialogue: 0,1:26:04.20,1:26:05.04,Default,,0000,0000,0000,,if there is any information, Dialogue: 0,1:26:05.04,1:26:06.87,Default,,0000,0000,0000,,but it seems the commissioner\Nis no longer with us. Dialogue: 0,1:26:06.87,1:26:10.80,Default,,0000,0000,0000,,So perhaps I'll just forward\Nthis question to him, myself. Dialogue: 0,1:26:10.80,1:26:15.80,Default,,0000,0000,0000,,But my question is around\Nthe FBI's evidence standards Dialogue: 0,1:26:17.28,1:26:19.64,Default,,0000,0000,0000,,around facial recognition technology. Dialogue: 0,1:26:19.64,1:26:21.84,Default,,0000,0000,0000,,I wonder if they've released that. Dialogue: 0,1:26:21.84,1:26:25.58,Default,,0000,0000,0000,,And then through the chair to the makers, Dialogue: 0,1:26:25.58,1:26:30.27,Default,,0000,0000,0000,,we had some questions\Naround section B, part two, Dialogue: 0,1:26:32.38,1:26:33.84,Default,,0000,0000,0000,,and perhaps we can discuss\Nthis in the working session, Dialogue: 0,1:26:35.18,1:26:36.80,Default,,0000,0000,0000,,or I'll send this to you ahead of time, Dialogue: 0,1:26:36.80,1:26:41.80,Default,,0000,0000,0000,,but it does talk about...\NSection B, part two reads, Dialogue: 0,1:26:42.38,1:26:45.79,Default,,0000,0000,0000,,"Nothing in B(1) shall prohibit Boston Dialogue: 0,1:26:46.82,1:26:47.97,Default,,0000,0000,0000,,or any Boston official\Nfrom using evidence related Dialogue: 0,1:26:47.97,1:26:50.46,Default,,0000,0000,0000,,to the investigation of a specific crime Dialogue: 0,1:26:50.46,1:26:51.67,Default,,0000,0000,0000,,that may have been generated Dialogue: 0,1:26:51.67,1:26:53.98,Default,,0000,0000,0000,,from face surveillance systems." Dialogue: 0,1:26:53.98,1:26:56.17,Default,,0000,0000,0000,,So we're just wondering\Nwhat types of evidence Dialogue: 0,1:26:56.17,1:26:57.75,Default,,0000,0000,0000,,are we allowing from here Dialogue: 0,1:26:57.75,1:27:00.56,Default,,0000,0000,0000,,and what situations are\Nwe trying to accommodate. Dialogue: 0,1:27:00.56,1:27:04.45,Default,,0000,0000,0000,,Just the specific question\Nthat we came up with Dialogue: 0,1:27:04.45,1:27:06.19,Default,,0000,0000,0000,,as an office, looking through that. Dialogue: 0,1:27:06.19,1:27:08.68,Default,,0000,0000,0000,,So I'll share that with the\Nmakers of this ordinance Dialogue: 0,1:27:08.68,1:27:10.27,Default,,0000,0000,0000,,to understand that a little bit better. 
Dialogue: 0,1:27:10.27,1:27:12.12,Default,,0000,0000,0000,,But if any of the panels have information Dialogue: 0,1:27:12.12,1:27:15.58,Default,,0000,0000,0000,,on the FBI standards regarding, Dialogue: 0,1:27:15.58,1:27:17.76,Default,,0000,0000,0000,,if they've shared their standards. Dialogue: 0,1:27:17.76,1:27:21.46,Default,,0000,0000,0000,,That is my question for this time. Dialogue: 0,1:27:21.46,1:27:22.60,Default,,0000,0000,0000,,Thank you ma'am chair. Dialogue: 0,1:27:22.60,1:27:23.60,Default,,0000,0000,0000,,I can answer that quickly Dialogue: 0,1:27:23.60,1:27:24.54,Default,,0000,0000,0000,,and then Joy may also have an answer. Dialogue: 0,1:27:24.54,1:27:27.01,Default,,0000,0000,0000,,But one of the issues here Dialogue: 0,1:27:27.01,1:27:29.70,Default,,0000,0000,0000,,is that police departments\Nacross the country Dialogue: 0,1:27:29.70,1:27:32.41,Default,,0000,0000,0000,,have been using facial\Nrecognition to identify people Dialogue: 0,1:27:32.41,1:27:35.70,Default,,0000,0000,0000,,in criminal investigations\Nfor decades now actually, Dialogue: 0,1:27:35.70,1:27:38.16,Default,,0000,0000,0000,,and then not disclosing that information Dialogue: 0,1:27:38.16,1:27:41.35,Default,,0000,0000,0000,,to criminal defendants, which\Nis a due process violation. Dialogue: 0,1:27:41.35,1:27:43.47,Default,,0000,0000,0000,,And it's a serious problem Dialogue: 0,1:27:43.47,1:27:46.84,Default,,0000,0000,0000,,in terms of the integrity of\Nour criminal justice process Dialogue: 0,1:27:46.84,1:27:50.07,Default,,0000,0000,0000,,in the courts and defendant's\Nrights to a fair trial. Dialogue: 0,1:27:50.07,1:27:53.44,Default,,0000,0000,0000,,So the FBI standards\Nthat you're referencing Dialogue: 0,1:27:53.44,1:27:56.24,Default,,0000,0000,0000,,one of the reasons, at least\Nas far as we understand it Dialogue: 0,1:27:56.24,1:27:58.28,Default,,0000,0000,0000,,and I don't work for\Nthe police department, Dialogue: 0,1:27:58.28,1:28:01.35,Default,,0000,0000,0000,,but one of the reasons as\Nfar as we understand it, Dialogue: 0,1:28:01.35,1:28:03.68,Default,,0000,0000,0000,,that police departments have\Nnot been disclosing information Dialogue: 0,1:28:03.68,1:28:06.13,Default,,0000,0000,0000,,about these searches\Nto criminal defendants, Dialogue: 0,1:28:06.13,1:28:10.09,Default,,0000,0000,0000,,is that there is no nationally\Nagreed upon forensic standard Dialogue: 0,1:28:10.09,1:28:12.98,Default,,0000,0000,0000,,for the use of facial\Nrecognition technology Dialogue: 0,1:28:12.98,1:28:14.57,Default,,0000,0000,0000,,in criminal investigations. 
Dialogue: 0,1:28:14.57,1:28:17.54,Default,,0000,0000,0000,,And so my understanding is, Dialogue: 0,1:28:17.54,1:28:19.52,Default,,0000,0000,0000,,police departments and prosecutors fear Dialogue: 0,1:28:19.52,1:28:21.98,Default,,0000,0000,0000,,that if they were to disclose to courts Dialogue: 0,1:28:21.98,1:28:22.95,Default,,0000,0000,0000,,and criminal defendants, Dialogue: 0,1:28:22.95,1:28:25.38,Default,,0000,0000,0000,,that this technology was\Nused to identify people Dialogue: 0,1:28:25.38,1:28:27.04,Default,,0000,0000,0000,,in specific investigations, Dialogue: 0,1:28:27.04,1:28:29.88,Default,,0000,0000,0000,,that those cases would basically\Nget thrown out of court Dialogue: 0,1:28:29.88,1:28:34.88,Default,,0000,0000,0000,,because the people who\Nperformed the analysis Dialogue: 0,1:28:35.73,1:28:37.76,Default,,0000,0000,0000,,would not really be able to testify Dialogue: 0,1:28:37.76,1:28:41.17,Default,,0000,0000,0000,,to their training under any\Nagreed upon national standard Dialogue: 0,1:28:41.17,1:28:43.39,Default,,0000,0000,0000,,for evaluating facial recognition results Dialogue: 0,1:28:43.39,1:28:45.98,Default,,0000,0000,0000,,or using the technology more generally. Dialogue: 0,1:28:45.98,1:28:48.45,Default,,0000,0000,0000,,I can't answer your other\Nquestion about that exemption. Dialogue: 0,1:28:48.45,1:28:51.58,Default,,0000,0000,0000,,It is meant to address\Nbasically wanted posters, Dialogue: 0,1:28:51.58,1:28:55.40,Default,,0000,0000,0000,,because for example, if the\NFBI runs facial recognition Dialogue: 0,1:28:55.40,1:28:57.60,Default,,0000,0000,0000,,on an image of a bank robbery suspect Dialogue: 0,1:28:57.60,1:28:59.19,Default,,0000,0000,0000,,and then produces a wanted poster Dialogue: 0,1:28:59.19,1:29:00.59,Default,,0000,0000,0000,,that has that person's name on it Dialogue: 0,1:29:00.59,1:29:02.78,Default,,0000,0000,0000,,and shares with the\NBoston Police Department, Dialogue: 0,1:29:02.78,1:29:06.80,Default,,0000,0000,0000,,obviously the BPD is not going\Nto be in a position to ask Dialogue: 0,1:29:06.80,1:29:09.06,Default,,0000,0000,0000,,every single police\Ndepartment in the country, Dialogue: 0,1:29:09.06,1:29:11.04,Default,,0000,0000,0000,,where did you get the name\Nattached to this picture? Dialogue: 0,1:29:11.04,1:29:14.64,Default,,0000,0000,0000,,And so that's what that\Nexemption is meant to address. Dialogue: 0,1:29:14.64,1:29:15.83,Default,,0000,0000,0000,,There have been some concerns Dialogue: 0,1:29:15.83,1:29:17.36,Default,,0000,0000,0000,,from some of our allied organizations Dialogue: 0,1:29:17.36,1:29:18.96,Default,,0000,0000,0000,,that we need to tighten that up Dialogue: 0,1:29:18.96,1:29:20.87,Default,,0000,0000,0000,,with a little more extra language Dialogue: 0,1:29:20.87,1:29:23.68,Default,,0000,0000,0000,,to make clear that that\Ndoes not allow the BPD Dialogue: 0,1:29:23.68,1:29:26.86,Default,,0000,0000,0000,,to use the RMV system\Nfor facial recognition. Dialogue: 0,1:29:26.86,1:29:28.98,Default,,0000,0000,0000,,And we will be suggesting some language Dialogue: 0,1:29:28.98,1:29:30.58,Default,,0000,0000,0000,,to the committee to that effect. 
Dialogue: 0,1:29:32.48,1:29:34.28,Default,,0000,0000,0000,,And just to echo a little bit Dialogue: 0,1:29:34.28,1:29:37.21,Default,,0000,0000,0000,,of what Kade is sharing here, Dialogue: 0,1:29:37.21,1:29:40.56,Default,,0000,0000,0000,,I do wanna point out that in April 2019, Dialogue: 0,1:29:40.56,1:29:44.22,Default,,0000,0000,0000,,with the example of the\Nstudent being misidentified Dialogue: 0,1:29:44.22,1:29:49.10,Default,,0000,0000,0000,,as a terrorist suspect,\Nyou had that photo shared. Dialogue: 0,1:29:49.10,1:29:51.27,Default,,0000,0000,0000,,And even though it was incorrect, Dialogue: 0,1:29:51.27,1:29:54.11,Default,,0000,0000,0000,,you have the ramifications of what happens Dialogue: 0,1:29:54.11,1:29:57.00,Default,,0000,0000,0000,,because of the presumption of guilt. Dialogue: 0,1:29:57.00,1:29:59.30,Default,,0000,0000,0000,,And so this is why I especially believe Dialogue: 0,1:29:59.30,1:30:02.26,Default,,0000,0000,0000,,we should also be talking about systems Dialogue: 0,1:30:02.26,1:30:05.02,Default,,0000,0000,0000,,that try to infer criminality Dialogue: 0,1:30:05.02,1:30:08.80,Default,,0000,0000,0000,,using AI systems in any kind of way. Dialogue: 0,1:30:08.80,1:30:11.21,Default,,0000,0000,0000,,Because again, the presumption of guilt Dialogue: 0,1:30:11.21,1:30:13.44,Default,,0000,0000,0000,,then adds to the confirmation bias Dialogue: 0,1:30:13.44,1:30:16.19,Default,,0000,0000,0000,,of getting something from a machine. Dialogue: 0,1:30:17.74,1:30:19.53,Default,,0000,0000,0000,,So, I think in the working session, Dialogue: 0,1:30:19.53,1:30:23.31,Default,,0000,0000,0000,,will there be improved language Dialogue: 0,1:30:23.31,1:30:25.03,Default,,0000,0000,0000,,around tightening that piece up? Dialogue: 0,1:30:26.88,1:30:28.90,Default,,0000,0000,0000,,Yes, we have some suggestions. Dialogue: 0,1:30:28.90,1:30:30.89,Default,,0000,0000,0000,,Yeah, thank you, thank you madam chair. Dialogue: 0,1:30:30.89,1:30:32.63,Default,,0000,0000,0000,,You're welcome, councilor Flynn. Dialogue: 0,1:30:34.14,1:30:36.63,Default,,0000,0000,0000,,Wasn't sure if you had any questions. Dialogue: 0,1:30:38.84,1:30:41.37,Default,,0000,0000,0000,,Thank you, thank you councilor Edwards. Dialogue: 0,1:30:41.37,1:30:43.93,Default,,0000,0000,0000,,I know I had a conversation a while back Dialogue: 0,1:30:43.93,1:30:48.08,Default,,0000,0000,0000,,with Erik Berg from the teacher's\Nunion about the subject, Dialogue: 0,1:30:48.08,1:30:49.06,Default,,0000,0000,0000,,and I learned a lot Dialogue: 0,1:30:49.06,1:30:52.48,Default,,0000,0000,0000,,and I learned a lot from\Nlistening to the advocates Dialogue: 0,1:30:52.48,1:30:54.09,Default,,0000,0000,0000,,this afternoon. Dialogue: 0,1:30:56.61,1:31:01.61,Default,,0000,0000,0000,,My question is, I know Joy\Ncovered it a little bit, Dialogue: 0,1:31:01.97,1:31:05.99,Default,,0000,0000,0000,,but besides for criminal investigations, Dialogue: 0,1:31:05.99,1:31:09.08,Default,,0000,0000,0000,,what are the reasons that the government Dialogue: 0,1:31:09.08,1:31:12.56,Default,,0000,0000,0000,,would want surveillance cameras? Dialogue: 0,1:31:15.86,1:31:17.71,Default,,0000,0000,0000,,Surveillance cameras, councilor Flynn, Dialogue: 0,1:31:17.71,1:31:20.92,Default,,0000,0000,0000,,or facial recognition\Ntechnology specifically? Dialogue: 0,1:31:20.92,1:31:21.75,Default,,0000,0000,0000,,Both. 
Dialogue: 0,1:31:23.96,1:31:27.60,Default,,0000,0000,0000,,Well, my understanding\Nis government agencies, Dialogue: 0,1:31:27.60,1:31:29.56,Default,,0000,0000,0000,,including the Boston\NTransportation Department Dialogue: 0,1:31:29.56,1:31:33.41,Default,,0000,0000,0000,,use surveillance cameras for\Nthings like traffic analysis, Dialogue: 0,1:31:33.41,1:31:36.50,Default,,0000,0000,0000,,accident reconstruction. Dialogue: 0,1:31:36.50,1:31:40.84,Default,,0000,0000,0000,,So there are non-criminal\Nuses for surveillance cameras. Dialogue: 0,1:31:40.84,1:31:45.84,Default,,0000,0000,0000,,I'm not aware of any\Nnon-criminal uses in government, Dialogue: 0,1:31:46.31,1:31:49.77,Default,,0000,0000,0000,,at least for facial surveillance, Dialogue: 0,1:31:49.77,1:31:51.33,Default,,0000,0000,0000,,except in the, Dialogue: 0,1:31:51.33,1:31:53.70,Default,,0000,0000,0000,,I would say so-called intelligence realm. Dialogue: 0,1:31:53.70,1:31:56.79,Default,,0000,0000,0000,,So that would mean not a\Ncriminal investigation, Dialogue: 0,1:31:56.79,1:31:58.54,Default,,0000,0000,0000,,but rather an intelligence investigation, Dialogue: 0,1:31:58.54,1:32:00.38,Default,,0000,0000,0000,,for example of an activist Dialogue: 0,1:32:00.38,1:32:04.20,Default,,0000,0000,0000,,or an activist group or a\Nprotest or something like that. Dialogue: 0,1:32:05.19,1:32:06.56,Default,,0000,0000,0000,,Okay. Dialogue: 0,1:32:06.56,1:32:10.98,Default,,0000,0000,0000,,And besides government, Dialogue: 0,1:32:10.98,1:32:15.25,Default,,0000,0000,0000,,what are the purposes\Nof facial recognition Dialogue: 0,1:32:15.25,1:32:18.26,Default,,0000,0000,0000,,that businesses would want to know? Dialogue: 0,1:32:18.26,1:32:21.19,Default,,0000,0000,0000,,Then they might use it in a store, Dialogue: 0,1:32:21.19,1:32:23.35,Default,,0000,0000,0000,,but what are some of the other reasons Dialogue: 0,1:32:23.35,1:32:27.40,Default,,0000,0000,0000,,private sector might use them? Dialogue: 0,1:32:27.40,1:32:30.43,Default,,0000,0000,0000,,Sure, so speaking to private sector uses Dialogue: 0,1:32:30.43,1:32:32.13,Default,,0000,0000,0000,,of facial recognition, Dialogue: 0,1:32:32.13,1:32:34.34,Default,,0000,0000,0000,,oftentimes you see it\Nbeing used for security Dialogue: 0,1:32:34.34,1:32:38.52,Default,,0000,0000,0000,,or securing access\Nto a particular place. Dialogue: 0,1:32:38.52,1:32:41.41,Default,,0000,0000,0000,,So only particular\Nemployees can come in. Dialogue: 0,1:32:41.41,1:32:43.31,Default,,0000,0000,0000,,Something that's more alarming Dialogue: 0,1:32:43.31,1:32:45.45,Default,,0000,0000,0000,,that we're seeing with commercial use, Dialogue: 0,1:32:45.45,1:32:47.33,Default,,0000,0000,0000,,is the use in housing. Dialogue: 0,1:32:47.33,1:32:50.01,Default,,0000,0000,0000,,And so we have a case even in Brooklyn, Dialogue: 0,1:32:50.01,1:32:53.31,Default,,0000,0000,0000,,where you had tenants\Nsaying to the landlord, Dialogue: 0,1:32:53.31,1:32:57.09,Default,,0000,0000,0000,,we don't want to enter\Nour homes with our face, Dialogue: 0,1:32:57.09,1:32:59.72,Default,,0000,0000,0000,,don't install facial\Nrecognition technology. Dialogue: 0,1:32:59.72,1:33:01.09,Default,,0000,0000,0000,,They actually won that case, Dialogue: 0,1:33:01.09,1:33:04.08,Default,,0000,0000,0000,,but not every group is\Ngoing to be so successful. Dialogue: 0,1:33:04.08,1:33:06.06,Default,,0000,0000,0000,,So you can be in a situation Dialogue: 0,1:33:06.06,1:33:08.80,Default,,0000,0000,0000,,where your face becomes the key. 
Dialogue: 0,1:33:08.80,1:33:11.93,Default,,0000,0000,0000,,But when your face is the key\Nand it gets stolen or hacked, Dialogue: 0,1:33:11.93,1:33:14.11,Default,,0000,0000,0000,,you can't just replace it so easily. Dialogue: 0,1:33:14.11,1:33:16.64,Default,,0000,0000,0000,,You might need some plastic surgery. Dialogue: 0,1:33:16.64,1:33:20.01,Default,,0000,0000,0000,,So we see facial recognition technologies Dialogue: 0,1:33:20.01,1:33:23.17,Default,,0000,0000,0000,,being used for access in certain ways. Dialogue: 0,1:33:23.17,1:33:25.42,Default,,0000,0000,0000,,And again, the dangerous use of trying Dialogue: 0,1:33:25.42,1:33:27.63,Default,,0000,0000,0000,,to predict something about somebody. Dialogue: 0,1:33:27.63,1:33:29.82,Default,,0000,0000,0000,,Are you going to be a good employee? Dialogue: 0,1:33:29.82,1:33:32.51,Default,,0000,0000,0000,,You have companies like Amazon, Dialogue: 0,1:33:32.51,1:33:35.21,Default,,0000,0000,0000,,saying we can detect fear from your face. Dialogue: 0,1:33:35.21,1:33:39.16,Default,,0000,0000,0000,,And so thinking about how\Nthat might be used as well. Dialogue: 0,1:33:39.16,1:33:40.67,Default,,0000,0000,0000,,The other thing we have to consider Dialogue: 0,1:33:40.67,1:33:44.18,Default,,0000,0000,0000,,when we're talking about commercial uses Dialogue: 0,1:33:44.18,1:33:45.57,Default,,0000,0000,0000,,of facial recognition technologies, Dialogue: 0,1:33:45.57,1:33:47.72,Default,,0000,0000,0000,,is oftentimes you'll have companies Dialogue: 0,1:33:47.72,1:33:51.11,Default,,0000,0000,0000,,have general purpose facial recognition. Dialogue: 0,1:33:51.11,1:33:53.72,Default,,0000,0000,0000,,So this then means other people can buy it Dialogue: 0,1:33:53.72,1:33:55.51,Default,,0000,0000,0000,,and use it in all kinds of ways. Dialogue: 0,1:33:55.51,1:33:57.67,Default,,0000,0000,0000,,So from your Snapchat filter Dialogue: 0,1:33:57.67,1:34:00.25,Default,,0000,0000,0000,,to using it for lethal\Nautonomous weapons. Dialogue: 0,1:34:00.25,1:34:01.87,Default,,0000,0000,0000,,You have a major range Dialogue: 0,1:34:01.87,1:34:05.75,Default,,0000,0000,0000,,with what can happen with\Nthese sorts of technologies. Dialogue: 0,1:34:05.75,1:34:08.93,Default,,0000,0000,0000,,Pardon me councilor. Dialogue: 0,1:34:09.86,1:34:10.73,Default,,0000,0000,0000,,I would just jump in to also reiterate Dialogue: 0,1:34:10.73,1:34:13.51,Default,,0000,0000,0000,,that this ordinance only\Napplies to government conduct. Dialogue: 0,1:34:13.51,1:34:15.79,Default,,0000,0000,0000,,It would not restrict in any way, Dialogue: 0,1:34:15.79,1:34:19.89,Default,,0000,0000,0000,,any entity in the private sector\Nfrom using this technology. Dialogue: 0,1:34:19.89,1:34:23.36,Default,,0000,0000,0000,,But that said, some other uses, Dialogue: 0,1:34:23.36,1:34:26.25,Default,,0000,0000,0000,,I don't know if you've ever\Nseen the film "Minority Report," Dialogue: 0,1:34:26.25,1:34:30.61,Default,,0000,0000,0000,,but in that film, Tom\NCruise enters the mall Dialogue: 0,1:34:30.61,1:34:33.71,Default,,0000,0000,0000,,and some technology says to him, Dialogue: 0,1:34:33.71,1:34:37.39,Default,,0000,0000,0000,,"Hello, sir, how did you\Nlike the size small underwear Dialogue: 0,1:34:37.39,1:34:39.15,Default,,0000,0000,0000,,you bought last time?" 
Dialogue: 0,1:34:39.15,1:34:41.79,Default,,0000,0000,0000,,So that's actually now\Nbecoming a reality as well, Dialogue: 0,1:34:41.79,1:34:45.63,Default,,0000,0000,0000,,that in the commercial space, at stores, Dialogue: 0,1:34:45.63,1:34:47.79,Default,,0000,0000,0000,,for marketing purposes, Dialogue: 0,1:34:47.79,1:34:49.41,Default,,0000,0000,0000,,we are likely going to see Dialogue: 0,1:34:49.41,1:34:52.45,Default,,0000,0000,0000,,if laws do not stop this from happening, Dialogue: 0,1:34:52.45,1:34:54.50,Default,,0000,0000,0000,,the application of facial surveillance Dialogue: 0,1:34:54.50,1:34:57.33,Default,,0000,0000,0000,,in a marketing and commercial context Dialogue: 0,1:34:57.33,1:34:59.65,Default,,0000,0000,0000,,to basically try to get\Nus to buy more stuff. Dialogue: 0,1:35:00.98,1:35:03.19,Default,,0000,0000,0000,,And to underscore that point, Dialogue: 0,1:35:03.19,1:35:04.81,Default,,0000,0000,0000,,you have a patent from Facebook, Dialogue: 0,1:35:04.81,1:35:06.78,Default,,0000,0000,0000,,which basically Facebook has\Nso many of our face prints. Dialogue: 0,1:35:06.78,1:35:09.73,Default,,0000,0000,0000,,We've been training\Ntheir systems for years. Dialogue: 0,1:35:09.73,1:35:13.08,Default,,0000,0000,0000,,The patent says, given that\Nwe have this information, Dialogue: 0,1:35:13.08,1:35:17.07,Default,,0000,0000,0000,,we can provide you background details Dialogue: 0,1:35:17.07,1:35:20.30,Default,,0000,0000,0000,,about somebody entering your store, Dialogue: 0,1:35:20.30,1:35:22.91,Default,,0000,0000,0000,,and even give them a\Ntrustworthiness score Dialogue: 0,1:35:22.91,1:35:25.54,Default,,0000,0000,0000,,to restrict access to certain products. Dialogue: 0,1:35:25.54,1:35:29.25,Default,,0000,0000,0000,,This is a patent that has\Nbeen filed by Facebook. Dialogue: 0,1:35:29.25,1:35:31.27,Default,,0000,0000,0000,,So it's certainly within the realm Dialogue: 0,1:35:31.27,1:35:34.20,Default,,0000,0000,0000,,of what companies are exploring to do. Dialogue: 0,1:35:34.20,1:35:35.55,Default,,0000,0000,0000,,And you already have companies Dialogue: 0,1:35:35.55,1:35:38.13,Default,,0000,0000,0000,,that use facial recognition technologies Dialogue: 0,1:35:38.13,1:35:39.56,Default,,0000,0000,0000,,to assess demographics. Dialogue: 0,1:35:39.56,1:35:42.86,Default,,0000,0000,0000,,So you have cameras that\Ncan be put into shelves. Dialogue: 0,1:35:42.86,1:35:45.40,Default,,0000,0000,0000,,You have cameras that can\Nbe put into mannequins. Dialogue: 0,1:35:45.40,1:35:48.04,Default,,0000,0000,0000,,So you already have this going on Dialogue: 0,1:35:48.04,1:35:49.70,Default,,0000,0000,0000,,and in Tacoma, Washington, Dialogue: 0,1:35:49.70,1:35:52.12,Default,,0000,0000,0000,,they even implemented a system Dialogue: 0,1:35:52.12,1:35:55.15,Default,,0000,0000,0000,,where you had to be face checked Dialogue: 0,1:35:55.15,1:35:58.38,Default,,0000,0000,0000,,before you could walk\Ninto a convenience store. Dialogue: 0,1:35:58.38,1:36:00.75,Default,,0000,0000,0000,,So this technology is already out there. Dialogue: 0,1:36:01.94,1:36:02.98,Default,,0000,0000,0000,,Well thank you. Dialogue: 0,1:36:02.98,1:36:04.51,Default,,0000,0000,0000,,I know my time is up. Dialogue: 0,1:36:04.51,1:36:07.36,Default,,0000,0000,0000,,I'm looking forward to\Ncontinuing the conversation Dialogue: 0,1:36:07.36,1:36:08.65,Default,,0000,0000,0000,,'cause I have a couple more questions, Dialogue: 0,1:36:08.65,1:36:10.94,Default,,0000,0000,0000,,but I'll ask them at another time. 
Dialogue: 0,1:36:10.94,1:36:13.21,Default,,0000,0000,0000,,But again, thank you to the advocates Dialogue: 0,1:36:13.21,1:36:14.70,Default,,0000,0000,0000,,and thank you to councilor Edwards. Dialogue: 0,1:36:14.70,1:36:16.73,Default,,0000,0000,0000,,Thank you. Dialogue: 0,1:36:16.73,1:36:19.61,Default,,0000,0000,0000,,So it's between me and public testimony. Dialogue: 0,1:36:19.61,1:36:22.70,Default,,0000,0000,0000,,So I'm gonna just put my\Nthree questions out there Dialogue: 0,1:36:22.70,1:36:25.93,Default,,0000,0000,0000,,and then we're gonna go\Nright down the list of folks Dialogue: 0,1:36:25.93,1:36:27.67,Default,,0000,0000,0000,,who have signed up in RSVP. Dialogue: 0,1:36:28.76,1:36:31.20,Default,,0000,0000,0000,,Again to those folks who are gonna come Dialogue: 0,1:36:32.39,1:36:33.28,Default,,0000,0000,0000,,after you've testified, Dialogue: 0,1:36:33.28,1:36:35.32,Default,,0000,0000,0000,,we were maxed out in our limit of people Dialogue: 0,1:36:35.32,1:36:36.69,Default,,0000,0000,0000,,who could participate. Dialogue: 0,1:36:36.69,1:36:39.95,Default,,0000,0000,0000,,So if you're willing to\Ntestify and then also sign out Dialogue: 0,1:36:39.95,1:36:41.25,Default,,0000,0000,0000,,and watch through YouTube Dialogue: 0,1:36:41.25,1:36:42.90,Default,,0000,0000,0000,,or watch through other mechanisms Dialogue: 0,1:36:42.90,1:36:44.51,Default,,0000,0000,0000,,to allow somebody else to testify, Dialogue: 0,1:36:44.51,1:36:47.83,Default,,0000,0000,0000,,that would be very\Nhelpful to this process. Dialogue: 0,1:36:47.83,1:36:51.15,Default,,0000,0000,0000,,I'm looking specifically\Nat the issue of enforcement Dialogue: 0,1:36:51.15,1:36:53.58,Default,,0000,0000,0000,,in this statute or in this ordinance. Dialogue: 0,1:36:53.58,1:36:56.93,Default,,0000,0000,0000,,And it says no evidence derived therefrom Dialogue: 0,1:36:56.93,1:36:59.77,Default,,0000,0000,0000,,may be received in evidence in any proceeding Dialogue: 0,1:36:59.77,1:37:03.83,Default,,0000,0000,0000,,in or before any department,\Nofficer, agency, or regulatory body. Dialogue: 0,1:37:03.83,1:37:07.50,Default,,0000,0000,0000,,I want to be clear if the\Npolice received this evidence, Dialogue: 0,1:37:09.06,1:37:11.83,Default,,0000,0000,0000,,could a prosecutor use it in court? Dialogue: 0,1:37:11.83,1:37:14.08,Default,,0000,0000,0000,,So -\NThat's my question. Dialogue: 0,1:37:15.59,1:37:17.07,Default,,0000,0000,0000,,So that's my issue. Dialogue: 0,1:37:17.07,1:37:19.72,Default,,0000,0000,0000,,There seems to be no\Nactual evidence prohibition Dialogue: 0,1:37:19.72,1:37:22.15,Default,,0000,0000,0000,,for use in court against somebody. Dialogue: 0,1:37:22.15,1:37:24.98,Default,,0000,0000,0000,,The other issue or concern I have Dialogue: 0,1:37:24.98,1:37:27.56,Default,,0000,0000,0000,,is a provision that violations\Nof this ordinance Dialogue: 0,1:37:27.56,1:37:30.79,Default,,0000,0000,0000,,by a city employee shall\Nresult in consequences Dialogue: 0,1:37:30.79,1:37:35.79,Default,,0000,0000,0000,,that may include retraining,\Nsuspension, termination, Dialogue: 0,1:37:36.07,1:37:38.30,Default,,0000,0000,0000,,but they're all subject to provisions Dialogue: 0,1:37:38.30,1:37:40.34,Default,,0000,0000,0000,,of collective bargaining agreements. 
Dialogue: 0,1:37:40.34,1:37:42.68,Default,,0000,0000,0000,,So all of that could go away Dialogue: 0,1:37:42.68,1:37:44.32,Default,,0000,0000,0000,,if their union hasn't agreed Dialogue: 0,1:37:45.19,1:37:47.99,Default,,0000,0000,0000,,to that being part of\Nthe disciplinary steps Dialogue: 0,1:37:47.99,1:37:48.99,Default,,0000,0000,0000,,for that particular employee. Dialogue: 0,1:37:48.99,1:37:52.12,Default,,0000,0000,0000,,So to me that signals the patrolmen Dialogue: 0,1:37:52.12,1:37:54.95,Default,,0000,0000,0000,,and other unions of folks Dialogue: 0,1:37:54.95,1:37:58.70,Default,,0000,0000,0000,,who are part of this need\Nto have this as part of their, Dialogue: 0,1:37:58.70,1:38:01.19,Default,,0000,0000,0000,,a violation of contract\Nor violation of standard, Dialogue: 0,1:38:01.19,1:38:03.53,Default,,0000,0000,0000,,I think maybe I'm wrong. Dialogue: 0,1:38:03.53,1:38:07.98,Default,,0000,0000,0000,,And then finally, Joy, your testimony Dialogue: 0,1:38:07.98,1:38:10.18,Default,,0000,0000,0000,,about the private sector\Nhas really hit me. Dialogue: 0,1:38:11.14,1:38:12.71,Default,,0000,0000,0000,,Thank you so much. Dialogue: 0,1:38:12.71,1:38:14.37,Default,,0000,0000,0000,,I'm particularly concerned about this. Dialogue: 0,1:38:14.37,1:38:15.20,Default,,0000,0000,0000,,I'm a twin. Dialogue: 0,1:38:16.59,1:38:20.02,Default,,0000,0000,0000,,Facebook tags my sister in all\Nof my pictures automatically. Dialogue: 0,1:38:21.66,1:38:25.71,Default,,0000,0000,0000,,I mean I'm a twin, councilor\NEssaibi George has triplets. Dialogue: 0,1:38:25.71,1:38:28.42,Default,,0000,0000,0000,,So for those, we're called\Nmultiples, you're all singletons, Dialogue: 0,1:38:28.42,1:38:29.86,Default,,0000,0000,0000,,you were born by yourself. Dialogue: 0,1:38:29.86,1:38:32.32,Default,,0000,0000,0000,,We share birth, multiples have, you know, Dialogue: 0,1:38:32.32,1:38:36.10,Default,,0000,0000,0000,,and so naturally I'm concerned\Nabout this technology. Dialogue: 0,1:38:36.10,1:38:38.94,Default,,0000,0000,0000,,Not that my sister is inclined\Nto do any criminal activity, Dialogue: 0,1:38:38.94,1:38:41.13,Default,,0000,0000,0000,,but she may go to a protest\Nor two, I don't know, Dialogue: 0,1:38:41.13,1:38:43.96,Default,,0000,0000,0000,,but either way, the point is, Dialogue: 0,1:38:43.96,1:38:48.61,Default,,0000,0000,0000,,I'm concerned about the private\Nactor and how they interact. Dialogue: 0,1:38:48.61,1:38:51.23,Default,,0000,0000,0000,,And what I'm particularly concerned about Dialogue: 0,1:38:51.23,1:38:53.72,Default,,0000,0000,0000,,is how far this will go down the line, Dialogue: 0,1:38:53.72,1:38:56.73,Default,,0000,0000,0000,,the chain, the supply chain, right? Dialogue: 0,1:38:56.73,1:38:58.92,Default,,0000,0000,0000,,'Cause we have $600 million in contracts Dialogue: 0,1:38:58.92,1:39:02.22,Default,,0000,0000,0000,,for all sorts of things\Nfor the city of Boston. Dialogue: 0,1:39:02.22,1:39:05.17,Default,,0000,0000,0000,,If we have a contract\Nwith a cleaning company Dialogue: 0,1:39:05.17,1:39:07.27,Default,,0000,0000,0000,,that requires the workers to sign in Dialogue: 0,1:39:07.27,1:39:09.01,Default,,0000,0000,0000,,with facial recognition, right? Dialogue: 0,1:39:09.01,1:39:12.68,Default,,0000,0000,0000,,So how far down the line\Ncan we go with our money? Dialogue: 0,1:39:13.92,1:39:17.10,Default,,0000,0000,0000,,Now I understand time is of the essence. 
Dialogue: 0,1:39:17.10,1:39:18.28,Default,,0000,0000,0000,,And that might be a big, Dialogue: 0,1:39:18.28,1:39:19.41,Default,,0000,0000,0000,,big question that we can work\Nout in the working session. Dialogue: 0,1:39:19.41,1:39:22.49,Default,,0000,0000,0000,,I'm particularly concerned\Nabout the courts. Dialogue: 0,1:39:22.49,1:39:26.87,Default,,0000,0000,0000,,So if you wanna just focus\Non that one, thank you. Dialogue: 0,1:39:26.87,1:39:27.71,Default,,0000,0000,0000,,Thank you, councilor. Dialogue: 0,1:39:27.71,1:39:28.96,Default,,0000,0000,0000,,So just very quickly, Dialogue: 0,1:39:31.10,1:39:35.87,Default,,0000,0000,0000,,we are unclear on what the\Ncity council's power is Dialogue: 0,1:39:35.87,1:39:38.57,Default,,0000,0000,0000,,to control the prosecutor's office. Dialogue: 0,1:39:38.57,1:39:42.53,Default,,0000,0000,0000,,I think we'll look more into that, Dialogue: 0,1:39:42.53,1:39:43.99,Default,,0000,0000,0000,,we'll look closely at that. Dialogue: 0,1:39:43.99,1:39:46.70,Default,,0000,0000,0000,,On the second question, I agree. Dialogue: 0,1:39:46.70,1:39:48.96,Default,,0000,0000,0000,,We need to look at the BPA agreement, Dialogue: 0,1:39:48.96,1:39:51.44,Default,,0000,0000,0000,,which I understand expires the summer. Dialogue: 0,1:39:51.44,1:39:55.96,Default,,0000,0000,0000,,And then finally on the\Nprivate sector concerns, Dialogue: 0,1:39:55.96,1:39:57.02,Default,,0000,0000,0000,,I share them. Dialogue: 0,1:39:57.02,1:40:00.99,Default,,0000,0000,0000,,Again, we wanna get this government ban Dialogue: 0,1:40:00.99,1:40:02.53,Default,,0000,0000,0000,,passed as quickly as possible. Dialogue: 0,1:40:02.53,1:40:04.68,Default,,0000,0000,0000,,The ACLU supports approaches like that, Dialogue: 0,1:40:04.68,1:40:07.21,Default,,0000,0000,0000,,that the state of Illinois has taken. Dialogue: 0,1:40:07.21,1:40:09.40,Default,,0000,0000,0000,,They passed the nation's strongest Dialogue: 0,1:40:09.40,1:40:10.95,Default,,0000,0000,0000,,consumer facing biometrics privacy law. Dialogue: 0,1:40:10.95,1:40:13.91,Default,,0000,0000,0000,,It's called the Biometric\NInformation Privacy Act, Dialogue: 0,1:40:13.91,1:40:14.92,Default,,0000,0000,0000,,BIPA. Dialogue: 0,1:40:14.92,1:40:17.65,Default,,0000,0000,0000,,BIPA essentially prevents\Nprivate companies, Dialogue: 0,1:40:17.65,1:40:18.68,Default,,0000,0000,0000,,any private companies, Dialogue: 0,1:40:18.68,1:40:20.61,Default,,0000,0000,0000,,from collecting your biometric data Dialogue: 0,1:40:20.61,1:40:22.38,Default,,0000,0000,0000,,without your opt-in consent. Dialogue: 0,1:40:22.38,1:40:24.24,Default,,0000,0000,0000,,And that's not, I clicked a button Dialogue: 0,1:40:24.24,1:40:26.25,Default,,0000,0000,0000,,when I was scrolling\Nthrough a terms of service, Dialogue: 0,1:40:26.25,1:40:28.64,Default,,0000,0000,0000,,it's you actually have\Nto sign a piece of paper Dialogue: 0,1:40:28.64,1:40:29.47,Default,,0000,0000,0000,,and give it to them. Dialogue: 0,1:40:29.47,1:40:32.17,Default,,0000,0000,0000,,So we need a law like that\Nright here in Massachusetts. Dialogue: 0,1:40:32.17,1:40:34.50,Default,,0000,0000,0000,,Thank you. Dialogue: 0,1:40:34.50,1:40:35.86,Default,,0000,0000,0000,,So at this point, Dialogue: 0,1:40:35.86,1:40:36.76,Default,,0000,0000,0000,,I'm gonna turn it over\Nto public testimony. Dialogue: 0,1:40:36.76,1:40:39.12,Default,,0000,0000,0000,,I have some folks who\Nhave already committed Dialogue: 0,1:40:39.12,1:40:40.91,Default,,0000,0000,0000,,and asked to speak today. 
Dialogue: 0,1:40:40.91,1:40:41.90,Default,,0000,0000,0000,,I'm gonna go through them. Dialogue: 0,1:40:41.90,1:40:44.54,Default,,0000,0000,0000,,I'm going to, it's\Nalready quarter to five. Dialogue: 0,1:40:44.54,1:40:48.61,Default,,0000,0000,0000,,I'm going to try and end this\Nhearing no later than six. Dialogue: 0,1:40:48.61,1:40:51.30,Default,,0000,0000,0000,,So that's an hour and 15 minutes Dialogue: 0,1:40:51.30,1:40:54.32,Default,,0000,0000,0000,,to move through public testimony. Dialogue: 0,1:40:54.32,1:40:55.32,Default,,0000,0000,0000,,And I'm gonna try, Dialogue: 0,1:40:55.32,1:40:57.11,Default,,0000,0000,0000,,I'm gonna have to keep\Nfolks to two minutes Dialogue: 0,1:40:57.11,1:40:58.40,Default,,0000,0000,0000,,as I said before, Dialogue: 0,1:40:58.40,1:41:00.83,Default,,0000,0000,0000,,because there are a lot\Nof people signed up. Dialogue: 0,1:41:00.83,1:41:02.91,Default,,0000,0000,0000,,So, of the folks who've signed up, Dialogue: 0,1:41:02.91,1:41:04.53,Default,,0000,0000,0000,,I will go through that list. Dialogue: 0,1:41:04.53,1:41:07.32,Default,,0000,0000,0000,,Then I'm gonna invite\Npeople to raise their hands Dialogue: 0,1:41:07.32,1:41:10.04,Default,,0000,0000,0000,,who may not have signed up directly, Dialogue: 0,1:41:10.04,1:41:11.92,Default,,0000,0000,0000,,who would also like to testify. Dialogue: 0,1:41:11.92,1:41:16.49,Default,,0000,0000,0000,,All right, so I have on the\Nlist, Bonnie Tenneriello Dialogue: 0,1:41:18.02,1:41:19.62,Default,,0000,0000,0000,,from the National Lawyers Guild. Dialogue: 0,1:41:22.51,1:41:24.75,Default,,0000,0000,0000,,After her I have Callan Bignoli Dialogue: 0,1:41:24.75,1:41:27.71,Default,,0000,0000,0000,,from the Library Freedom\NProject and Maty Cropley. Dialogue: 0,1:41:27.71,1:41:29.46,Default,,0000,0000,0000,,Those are the three folks lined up. Dialogue: 0,1:41:32.21,1:41:33.04,Default,,0000,0000,0000,,Let's see. Dialogue: 0,1:41:34.07,1:41:36.38,Default,,0000,0000,0000,,Kaitlin, is there a Bonnie? Dialogue: 0,1:41:42.57,1:41:44.22,Default,,0000,0000,0000,,Okay well, I see Callan\Nright now ready to go. Dialogue: 0,1:41:44.22,1:41:46.54,Default,,0000,0000,0000,,So I'm gonna go ahead\Nand start with Callan Dialogue: 0,1:41:46.54,1:41:49.22,Default,,0000,0000,0000,,if you wanna start and\Nyour two minutes has begun. Dialogue: 0,1:41:51.68,1:41:54.57,Default,,0000,0000,0000,,Sure, hi, I'm Callan Bignoli, Dialogue: 0,1:41:54.57,1:41:56.04,Default,,0000,0000,0000,,I am a resident of West Roxbury Dialogue: 0,1:41:56.04,1:41:58.18,Default,,0000,0000,0000,,and a librarian in the Boston area. Dialogue: 0,1:41:58.18,1:42:00.47,Default,,0000,0000,0000,,And I am speaking on behalf Dialogue: 0,1:42:00.47,1:42:02.09,Default,,0000,0000,0000,,of the Library Freedom Project Today. Dialogue: 0,1:42:02.09,1:42:03.13,Default,,0000,0000,0000,,So thank you to the chair Dialogue: 0,1:42:03.13,1:42:04.37,Default,,0000,0000,0000,,and thank you to all of city council Dialogue: 0,1:42:04.37,1:42:05.27,Default,,0000,0000,0000,,for the chance to testify today Dialogue: 0,1:42:05.27,1:42:07.82,Default,,0000,0000,0000,,in this important piece of legislation. Dialogue: 0,1:42:07.82,1:42:11.60,Default,,0000,0000,0000,,So, the Library Freedom Project\Nis a library advocacy group Dialogue: 0,1:42:11.60,1:42:14.00,Default,,0000,0000,0000,,that trains library workers to advocate Dialogue: 0,1:42:14.00,1:42:15.68,Default,,0000,0000,0000,,and educate their library community Dialogue: 0,1:42:15.68,1:42:17.71,Default,,0000,0000,0000,,about privacy and surveillance. 
Dialogue: 0,1:42:17.71,1:42:20.76,Default,,0000,0000,0000,,We include dozens of library\Nworkers from the U.S., Dialogue: 0,1:42:20.76,1:42:21.80,Default,,0000,0000,0000,,Canada and Mexico, Dialogue: 0,1:42:21.80,1:42:24.25,Default,,0000,0000,0000,,including eight librarians\Nfrom Massachusetts. Dialogue: 0,1:42:24.25,1:42:26.61,Default,,0000,0000,0000,,These library workers\Neducate their colleagues Dialogue: 0,1:42:26.61,1:42:28.33,Default,,0000,0000,0000,,and library users about surveillance Dialogue: 0,1:42:28.33,1:42:31.81,Default,,0000,0000,0000,,by creating library programs,\Nunion initiatives, workshops, Dialogue: 0,1:42:31.81,1:42:34.77,Default,,0000,0000,0000,,and other resources to inform\Nand agitate for change. Dialogue: 0,1:42:34.77,1:42:37.15,Default,,0000,0000,0000,,And facial recognition technology Dialogue: 0,1:42:37.15,1:42:40.49,Default,,0000,0000,0000,,represents a class of\Ntechnology that is antithetical Dialogue: 0,1:42:40.49,1:42:41.96,Default,,0000,0000,0000,,to the values of privacy Dialogue: 0,1:42:41.96,1:42:44.53,Default,,0000,0000,0000,,and confidentiality of library workers. Dialogue: 0,1:42:45.49,1:42:47.89,Default,,0000,0000,0000,,As library staff, we understand\Nthat part of our work Dialogue: 0,1:42:47.89,1:42:50.56,Default,,0000,0000,0000,,is to represent these\Nvalues in the services Dialogue: 0,1:42:50.56,1:42:53.56,Default,,0000,0000,0000,,and resources we provide\Nto our library community Dialogue: 0,1:42:53.56,1:42:57.17,Default,,0000,0000,0000,,in order to reduce the harm of\Nstate and corporate scrutiny. Dialogue: 0,1:42:57.17,1:42:59.11,Default,,0000,0000,0000,,As we understand this, Dialogue: 0,1:42:59.11,1:43:01.93,Default,,0000,0000,0000,,we can see the social\Nand economic imperatives Dialogue: 0,1:43:01.93,1:43:04.53,Default,,0000,0000,0000,,that privilege some and marginalize others Dialogue: 0,1:43:04.53,1:43:06.78,Default,,0000,0000,0000,,that are encoded into\Nmany computer systems Dialogue: 0,1:43:06.78,1:43:08.19,Default,,0000,0000,0000,,and applications. Dialogue: 0,1:43:08.19,1:43:09.98,Default,,0000,0000,0000,,This is a result of the structure Dialogue: 0,1:43:09.98,1:43:11.36,Default,,0000,0000,0000,,of our technology industry, Dialogue: 0,1:43:11.36,1:43:14.11,Default,,0000,0000,0000,,which prioritizes the\Ninterests of management, Dialogue: 0,1:43:14.11,1:43:16.22,Default,,0000,0000,0000,,venture capitalists and stockholders, Dialogue: 0,1:43:16.22,1:43:18.13,Default,,0000,0000,0000,,who are mostly white men. Dialogue: 0,1:43:18.13,1:43:20.25,Default,,0000,0000,0000,,The racial and gender biases Dialogue: 0,1:43:20.25,1:43:22.72,Default,,0000,0000,0000,,inherent in face surveillance technology Dialogue: 0,1:43:22.72,1:43:24.50,Default,,0000,0000,0000,,are indicative of those values Dialogue: 0,1:43:24.50,1:43:28.21,Default,,0000,0000,0000,,and they describe how moneyed\Ninterests seek to shape Dialogue: 0,1:43:28.21,1:43:31.02,Default,,0000,0000,0000,,or reinforce racist and gendered oppressions Dialogue: 0,1:43:31.02,1:43:32.61,Default,,0000,0000,0000,,by creating computer systems Dialogue: 0,1:43:32.61,1:43:34.87,Default,,0000,0000,0000,,that extend the reach of profiling Dialogue: 0,1:43:34.87,1:43:37.06,Default,,0000,0000,0000,,through the exercise of capital. 
Dialogue: 0,1:43:37.06,1:43:40.47,Default,,0000,0000,0000,,Democratic direct worker\Nand community control Dialogue: 0,1:43:40.47,1:43:41.49,Default,,0000,0000,0000,,over the development, acquisition Dialogue: 0,1:43:41.49,1:43:44.61,Default,,0000,0000,0000,,and practices of surveillance\Nand surveillance technology, Dialogue: 0,1:43:44.61,1:43:46.84,Default,,0000,0000,0000,,is an urgent priority to protect safety Dialogue: 0,1:43:46.84,1:43:49.69,Default,,0000,0000,0000,,and privacy in our\Nneighborhoods and schools. Dialogue: 0,1:43:49.69,1:43:51.22,Default,,0000,0000,0000,,So the Library Freedom Project Dialogue: 0,1:43:51.22,1:43:53.54,Default,,0000,0000,0000,,supports this urgently necessary ordinance Dialogue: 0,1:43:53.54,1:43:56.21,Default,,0000,0000,0000,,to ban facial recognition\Ntechnology in Boston Dialogue: 0,1:43:56.21,1:43:57.98,Default,,0000,0000,0000,,for the safety and privacy Dialogue: 0,1:43:57.98,1:44:01.04,Default,,0000,0000,0000,,of those who are working in\Nthe city and who live here. Dialogue: 0,1:44:01.04,1:44:04.56,Default,,0000,0000,0000,,We urge the council to wrest\Ncontrol over surveillance - Dialogue: 0,1:44:04.56,1:44:06.26,Default,,0000,0000,0000,,Oh, two minutes is up.\NOops! Dialogue: 0,1:44:06.26,1:44:09.23,Default,,0000,0000,0000,,Two minutes is up, so\Nif you wanna summarize. Dialogue: 0,1:44:09.23,1:44:11.17,Default,,0000,0000,0000,,I have one, one more sentence. Dialogue: 0,1:44:11.17,1:44:15.06,Default,,0000,0000,0000,,We cannot allow Boston\Nto adopt authoritarian, Dialogue: 0,1:44:15.06,1:44:18.93,Default,,0000,0000,0000,,unregulated and biased surveillance\Ntechnology, thank you. Dialogue: 0,1:44:18.93,1:44:21.21,Default,,0000,0000,0000,,Thank you very much, Bonnie, Dialogue: 0,1:44:21.21,1:44:23.24,Default,,0000,0000,0000,,I don't know if Bonnie's available, Dialogue: 0,1:44:23.24,1:44:26.90,Default,,0000,0000,0000,,if not I see Maty is available. Dialogue: 0,1:44:26.90,1:44:29.65,Default,,0000,0000,0000,,So I'm gonna go ahead and\Nstart your two minutes, Maty. Dialogue: 0,1:44:30.64,1:44:31.85,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,1:44:31.85,1:44:33.96,Default,,0000,0000,0000,,Thank you councilor Edwards Dialogue: 0,1:44:33.96,1:44:35.49,Default,,0000,0000,0000,,and thank you to the city council Dialogue: 0,1:44:35.49,1:44:38.12,Default,,0000,0000,0000,,and all the panelists for\Ntaking up this important issue. Dialogue: 0,1:44:38.12,1:44:41.82,Default,,0000,0000,0000,,My name is Maty Cropley,\NI use they/them pronouns. Dialogue: 0,1:44:41.82,1:44:44.85,Default,,0000,0000,0000,,I'm a teen librarian and\Na bargaining unit member Dialogue: 0,1:44:44.85,1:44:47.53,Default,,0000,0000,0000,,of the Boston Public Library\NProfessional Staff Association, Dialogue: 0,1:44:47.53,1:44:51.51,Default,,0000,0000,0000,,MSLA local 4928 AFT. Dialogue: 0,1:44:51.51,1:44:53.38,Default,,0000,0000,0000,,Our union of library workers Dialogue: 0,1:44:53.38,1:44:56.07,Default,,0000,0000,0000,,supports a ban on facial\Nrecognition in Boston. Dialogue: 0,1:44:56.07,1:44:58.07,Default,,0000,0000,0000,,In voting to do so, Dialogue: 0,1:44:58.07,1:45:00.17,Default,,0000,0000,0000,,our union recognizes\Nthat facial recognition Dialogue: 0,1:45:00.17,1:45:02.24,Default,,0000,0000,0000,,and other forms of biometric surveillance Dialogue: 0,1:45:02.24,1:45:03.61,Default,,0000,0000,0000,,are a threat to the civil liberties Dialogue: 0,1:45:03.61,1:45:06.81,Default,,0000,0000,0000,,and safety of our colleagues\Nand of library users.
Dialogue: 0,1:45:06.81,1:45:09.17,Default,,0000,0000,0000,,Our library ethics of privacy\Nand intellectual freedom Dialogue: 0,1:45:09.17,1:45:12.40,Default,,0000,0000,0000,,are incompatible with\Nthis invasive technology. Dialogue: 0,1:45:12.40,1:45:14.01,Default,,0000,0000,0000,,We recognize that facial recognition Dialogue: 0,1:45:14.01,1:45:16.67,Default,,0000,0000,0000,,and other biometric\Nsurveillance technologies Dialogue: 0,1:45:16.67,1:45:18.51,Default,,0000,0000,0000,,are proven to be riddled with racist, Dialogue: 0,1:45:18.51,1:45:21.34,Default,,0000,0000,0000,,ableist and gendered algorithmic biases. Dialogue: 0,1:45:21.34,1:45:23.77,Default,,0000,0000,0000,,These systems routinely\Nmisidentify people of color, Dialogue: 0,1:45:23.77,1:45:25.99,Default,,0000,0000,0000,,which can result in needless\Ncontact with law enforcement Dialogue: 0,1:45:25.99,1:45:30.08,Default,,0000,0000,0000,,and other scrutiny, essentially\Nautomating racial profiling. Dialogue: 0,1:45:30.08,1:45:32.89,Default,,0000,0000,0000,,We recognize the harm of\Nsurveillance for youth in our city, Dialogue: 0,1:45:32.89,1:45:35.82,Default,,0000,0000,0000,,especially black, brown\Nand immigrant youth. Dialogue: 0,1:45:35.82,1:45:37.89,Default,,0000,0000,0000,,Unregulated scrutiny by authorities Dialogue: 0,1:45:37.89,1:45:40.13,Default,,0000,0000,0000,,leads to early contact\Nwith law enforcement Dialogue: 0,1:45:40.13,1:45:43.27,Default,,0000,0000,0000,,resulting in disenfranchisement,\Nmarginalized futures Dialogue: 0,1:45:43.27,1:45:45.92,Default,,0000,0000,0000,,and potential death by state violence. Dialogue: 0,1:45:45.92,1:45:48.52,Default,,0000,0000,0000,,We recognize that public\Nareas such as libraries, Dialogue: 0,1:45:48.52,1:45:50.50,Default,,0000,0000,0000,,parks, and sidewalks exist as spaces Dialogue: 0,1:45:50.50,1:45:52.93,Default,,0000,0000,0000,,in which people should be free to move, Dialogue: 0,1:45:52.93,1:45:56.13,Default,,0000,0000,0000,,speak, think inquire, perform, protest, Dialogue: 0,1:45:56.13,1:45:58.46,Default,,0000,0000,0000,,and assemble freely without\Nthe intense scrutiny Dialogue: 0,1:45:58.46,1:46:01.27,Default,,0000,0000,0000,,of unregulated surveillance\Nby law enforcement. Dialogue: 0,1:46:01.27,1:46:03.54,Default,,0000,0000,0000,,Public spaces exist to extend our rights Dialogue: 0,1:46:03.54,1:46:05.11,Default,,0000,0000,0000,,and provide space for the performance Dialogue: 0,1:46:05.11,1:46:08.34,Default,,0000,0000,0000,,of our civil liberties, not policing. Dialogue: 0,1:46:08.34,1:46:10.09,Default,,0000,0000,0000,,A ban on face surveillance technology Dialogue: 0,1:46:10.09,1:46:12.38,Default,,0000,0000,0000,,is critically important for the residents Dialogue: 0,1:46:12.38,1:46:15.18,Default,,0000,0000,0000,,and visitors to the city of Boston. Dialogue: 0,1:46:15.18,1:46:17.58,Default,,0000,0000,0000,,Our union encourages you to ban Dialogue: 0,1:46:17.58,1:46:18.81,Default,,0000,0000,0000,,the use of face surveillance\Nin the city of Boston Dialogue: 0,1:46:18.81,1:46:21.83,Default,,0000,0000,0000,,by supporting and passing\Nthis very crucial ordinance. Dialogue: 0,1:46:21.83,1:46:23.39,Default,,0000,0000,0000,,I'd like to thank you for your time. Dialogue: 0,1:46:23.39,1:46:25.08,Default,,0000,0000,0000,,And we are the Boston Public Library, Dialogue: 0,1:46:25.08,1:46:26.82,Default,,0000,0000,0000,,Professional Staff Association. Dialogue: 0,1:46:26.82,1:46:28.63,Default,,0000,0000,0000,,Thank you very much. 
Dialogue: 0,1:46:28.63,1:46:30.16,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,1:46:30.16,1:46:34.57,Default,,0000,0000,0000,,I understand that Ms. Bonnie is available. Dialogue: 0,1:46:34.57,1:46:39.03,Default,,0000,0000,0000,,She's under the name Linda Rydzewski, Dialogue: 0,1:46:42.28,1:46:43.68,Default,,0000,0000,0000,,and we're gonna pull her up. Dialogue: 0,1:46:47.34,1:46:48.51,Default,,0000,0000,0000,,I'll check when she's available. Dialogue: 0,1:46:49.64,1:46:53.16,Default,,0000,0000,0000,,After her will be the\Nfollowing three testifiers, Dialogue: 0,1:46:53.16,1:46:58.16,Default,,0000,0000,0000,,Nikhill Thorat, Nour Sulalman\Nand professor Woodrow Hartzog. Dialogue: 0,1:46:59.67,1:47:02.19,Default,,0000,0000,0000,,First, Bonnie. Dialogue: 0,1:47:07.98,1:47:10.90,Default,,0000,0000,0000,,She's under Linda Rydzewski, sorry. Dialogue: 0,1:47:13.56,1:47:15.55,Default,,0000,0000,0000,,Linda Rydzewski Dialogue: 0,1:47:19.90,1:47:24.78,Default,,0000,0000,0000,,I am so sorry, I've had\Na technical problem. Dialogue: 0,1:47:24.78,1:47:26.81,Default,,0000,0000,0000,,This is Bonnie Tenneriello, Dialogue: 0,1:47:26.81,1:47:29.36,Default,,0000,0000,0000,,on behalf of the National Lawyers Guild. Dialogue: 0,1:47:29.36,1:47:34.27,Default,,0000,0000,0000,,And I had to sign in\Nwith my office account Dialogue: 0,1:47:34.27,1:47:38.88,Default,,0000,0000,0000,,and I am not representing\NPrisoners' Legal Services Dialogue: 0,1:47:38.88,1:47:42.80,Default,,0000,0000,0000,,and I couldn't change the screen\Nname once I was signed in. Dialogue: 0,1:47:42.80,1:47:44.28,Default,,0000,0000,0000,,Am I audible to everyone? Dialogue: 0,1:47:45.69,1:47:48.00,Default,,0000,0000,0000,,You're on and you have two minutes. Dialogue: 0,1:47:48.00,1:47:49.58,Default,,0000,0000,0000,,Okay. Dialogue: 0,1:47:49.58,1:47:53.21,Default,,0000,0000,0000,,I'm speaking on behalf of\Nthe National Lawyers Guild, Dialogue: 0,1:47:53.21,1:47:55.08,Default,,0000,0000,0000,,Massachusetts Chapter, Dialogue: 0,1:47:55.08,1:48:00.08,Default,,0000,0000,0000,,which for over 80 years has\Nfought for human rights, Dialogue: 0,1:48:04.04,1:48:06.30,Default,,0000,0000,0000,,and we're deeply\Nconcerned over the dangers Dialogue: 0,1:48:06.30,1:48:07.68,Default,,0000,0000,0000,,of facial surveillance. Dialogue: 0,1:48:07.68,1:48:11.09,Default,,0000,0000,0000,,We strongly support this ordinance. Dialogue: 0,1:48:11.09,1:48:15.48,Default,,0000,0000,0000,,Facial recognition\Ndeployed through cameras Dialogue: 0,1:48:15.48,1:48:17.69,Default,,0000,0000,0000,,throughout a city can be used Dialogue: 0,1:48:17.69,1:48:19.60,Default,,0000,0000,0000,,to track the movements of dissidents Dialogue: 0,1:48:19.60,1:48:22.79,Default,,0000,0000,0000,,and anyone attending a protest,\Nas others have observed. Dialogue: 0,1:48:22.79,1:48:25.41,Default,,0000,0000,0000,,Our history in the National Lawyers Guild Dialogue: 0,1:48:25.41,1:48:28.70,Default,,0000,0000,0000,,shows that this danger is real.
Dialogue: 0,1:48:28.70,1:48:30.83,Default,,0000,0000,0000,,We have defended political dissidents Dialogue: 0,1:48:30.83,1:48:33.19,Default,,0000,0000,0000,,targeted by law enforcement, Dialogue: 0,1:48:33.19,1:48:36.10,Default,,0000,0000,0000,,such as members of the\NBlack Panther Party, Dialogue: 0,1:48:36.10,1:48:37.33,Default,,0000,0000,0000,,American Indian Movement, Dialogue: 0,1:48:37.33,1:48:39.30,Default,,0000,0000,0000,,Puerto Rican Independence Movement Dialogue: 0,1:48:39.30,1:48:41.60,Default,,0000,0000,0000,,and more recently here in Boston, Dialogue: 0,1:48:41.60,1:48:44.11,Default,,0000,0000,0000,,Occupy Boston, climate change activists, Dialogue: 0,1:48:44.11,1:48:47.27,Default,,0000,0000,0000,,those opposing a straight pride event Dialogue: 0,1:48:47.27,1:48:48.92,Default,,0000,0000,0000,,and Black Lives Matter. Dialogue: 0,1:48:48.92,1:48:53.92,Default,,0000,0000,0000,,We know that the dangers\Nof political surveillance Dialogue: 0,1:48:55.58,1:48:58.89,Default,,0000,0000,0000,,and political use of\Nthis technology are real. Dialogue: 0,1:48:59.81,1:49:01.02,Default,,0000,0000,0000,,The National Lawyers Guild Dialogue: 0,1:49:01.02,1:49:04.02,Default,,0000,0000,0000,,also helped expose illegal surveillance Dialogue: 0,1:49:04.02,1:49:09.02,Default,,0000,0000,0000,,in the Church Commission\N1975-76 COINTELPRO hearings. Dialogue: 0,1:49:10.62,1:49:12.68,Default,,0000,0000,0000,,We know from this experience Dialogue: 0,1:49:12.68,1:49:15.96,Default,,0000,0000,0000,,that this can be used to disrupt Dialogue: 0,1:49:15.96,1:49:19.93,Default,,0000,0000,0000,,and prosecute political protest. Dialogue: 0,1:49:19.93,1:49:21.76,Default,,0000,0000,0000,,People must be able to demonstrate Dialogue: 0,1:49:21.76,1:49:25.91,Default,,0000,0000,0000,,without fear of being identified\Nand tracked afterwards. Dialogue: 0,1:49:25.91,1:49:28.66,Default,,0000,0000,0000,,I will not repeat what others have said Dialogue: 0,1:49:28.66,1:49:33.05,Default,,0000,0000,0000,,very articulately about just\Nthe basic privacy concerns Dialogue: 0,1:49:33.05,1:49:35.05,Default,,0000,0000,0000,,of being able to walk through a city, Dialogue: 0,1:49:35.05,1:49:37.66,Default,,0000,0000,0000,,without having cameras track your identity Dialogue: 0,1:49:37.66,1:49:39.66,Default,,0000,0000,0000,,and movements throughout the city, Dialogue: 0,1:49:39.66,1:49:44.18,Default,,0000,0000,0000,,nor will I repeat the excellent testimony Dialogue: 0,1:49:44.18,1:49:49.18,Default,,0000,0000,0000,,that was given about the\Nracial bias in this technology. Dialogue: 0,1:49:49.55,1:49:52.32,Default,,0000,0000,0000,,But on behalf of the\NNational Lawyers Guild, Dialogue: 0,1:49:52.32,1:49:56.52,Default,,0000,0000,0000,,I will say that we cannot\Ntolerate a technology Dialogue: 0,1:49:56.52,1:50:00.06,Default,,0000,0000,0000,,that perpetuates racism\Nin law enforcement, Dialogue: 0,1:50:00.06,1:50:01.84,Default,,0000,0000,0000,,as tens of thousands of people Dialogue: 0,1:50:01.84,1:50:05.18,Default,,0000,0000,0000,,take to the streets\Ncalling for racial justice.
Dialogue: 0,1:50:05.18,1:50:08.29,Default,,0000,0000,0000,,We're at a historical moment\Nwhen our society is demanding Dialogue: 0,1:50:08.29,1:50:11.11,Default,,0000,0000,0000,,and this council is\Nconsidering a shift away Dialogue: 0,1:50:11.11,1:50:15.14,Default,,0000,0000,0000,,from control and policing\Nof communities of color Dialogue: 0,1:50:15.14,1:50:17.16,Default,,0000,0000,0000,,and low-income communities Dialogue: 0,1:50:17.16,1:50:18.45,Default,,0000,0000,0000,,and towards community-directed programs Dialogue: 0,1:50:18.45,1:50:22.18,Default,,0000,0000,0000,,that channel resources to\Njobs, healthcare, education, Dialogue: 0,1:50:22.18,1:50:25.20,Default,,0000,0000,0000,,and other programs that\Ncreate true safety. Dialogue: 0,1:50:25.20,1:50:29.07,Default,,0000,0000,0000,,We reject facial surveillance Dialogue: 0,1:50:29.07,1:50:33.21,Default,,0000,0000,0000,,and we urge the council\Nto pass this ordinance. Dialogue: 0,1:50:33.21,1:50:35.03,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,1:50:36.65,1:50:39.84,Default,,0000,0000,0000,,So up next we have Nikhill Thorat. Dialogue: 0,1:50:42.92,1:50:44.17,Default,,0000,0000,0000,,Sorry, if I mispronounced your name. Dialogue: 0,1:50:44.17,1:50:45.31,Default,,0000,0000,0000,,No harm, you got that right. Dialogue: 0,1:50:45.31,1:50:48.26,Default,,0000,0000,0000,,So thank you chair and to the\Namazing speakers before me. Dialogue: 0,1:50:48.26,1:50:52.99,Default,,0000,0000,0000,,My name is Nikhil, I work\Nat Google Brain in Cambridge. Dialogue: 0,1:50:52.99,1:50:54.51,Default,,0000,0000,0000,,So I specifically work in the area Dialogue: 0,1:50:54.51,1:50:57.93,Default,,0000,0000,0000,,of machine learning fairness\Nand interpretability. Dialogue: 0,1:50:57.93,1:50:59.33,Default,,0000,0000,0000,,So this is sort of the stuff that we do. Dialogue: 0,1:50:59.33,1:51:00.62,Default,,0000,0000,0000,,So I'm here as a citizen, Dialogue: 0,1:51:00.62,1:51:02.95,Default,,0000,0000,0000,,and I am very much in\Nfavor of the ordinance Dialogue: 0,1:51:02.95,1:51:05.71,Default,,0000,0000,0000,,banning facial recognition\Nby public officials Dialogue: 0,1:51:05.71,1:51:07.30,Default,,0000,0000,0000,,here in Boston. Dialogue: 0,1:51:07.30,1:51:10.31,Default,,0000,0000,0000,,So modern AI algorithms\Nare statistical machines. Dialogue: 0,1:51:10.31,1:51:12.80,Default,,0000,0000,0000,,They base predictions off\Npatterns in data sets Dialogue: 0,1:51:12.80,1:51:15.23,Default,,0000,0000,0000,,that they're trained on,\Nthere's no magic here. Dialogue: 0,1:51:15.23,1:51:18.71,Default,,0000,0000,0000,,So any bias that stems from\Nhow a data set is collected, Dialogue: 0,1:51:18.71,1:51:20.90,Default,,0000,0000,0000,,like non-representative collection of data Dialogue: 0,1:51:20.90,1:51:21.88,Default,,0000,0000,0000,,for different races Dialogue: 0,1:51:21.88,1:51:23.61,Default,,0000,0000,0000,,because of historical context, Dialogue: 0,1:51:23.61,1:51:27.26,Default,,0000,0000,0000,,gets automatically reflected\Nin the AI model's predictions. Dialogue: 0,1:51:27.26,1:51:31.06,Default,,0000,0000,0000,,So what this means is that it\Nmanifests as unequal errors Dialogue: 0,1:51:31.06,1:51:33.46,Default,,0000,0000,0000,,between subgroups, like race or gender.
Dialogue: 0,1:51:33.46,1:51:35.05,Default,,0000,0000,0000,,And this is not theoretical, Dialogue: 0,1:51:35.05,1:51:36.68,Default,,0000,0000,0000,,it's been studied very deeply Dialogue: 0,1:51:37.98,1:51:39.47,Default,,0000,0000,0000,,by many AI researchers in this field Dialogue: 0,1:51:39.47,1:51:40.33,Default,,0000,0000,0000,,who have been on this call. Dialogue: 0,1:51:40.33,1:51:42.25,Default,,0000,0000,0000,,There's a federal NIST, Dialogue: 0,1:51:42.25,1:51:44.70,Default,,0000,0000,0000,,a National Institute of\NStandards and Technology study Dialogue: 0,1:51:44.70,1:51:47.28,Default,,0000,0000,0000,,that looked at almost 200\Nfacial recognition algorithms Dialogue: 0,1:51:47.28,1:51:49.88,Default,,0000,0000,0000,,and found significant\Ndemographic disparities Dialogue: 0,1:51:49.88,1:51:51.40,Default,,0000,0000,0000,,in most of them. Dialogue: 0,1:51:51.40,1:51:55.11,Default,,0000,0000,0000,,And there's another paper\Nby Joy here, Gender Shades, Dialogue: 0,1:51:55.11,1:51:57.03,Default,,0000,0000,0000,,that looked at three commercial\Nclassification systems Dialogue: 0,1:51:57.03,1:51:58.45,Default,,0000,0000,0000,,and found that there were Dialogue: 0,1:51:59.69,1:52:03.74,Default,,0000,0000,0000,,incorrect results 35% of the\Ntime for dark-skinned females Dialogue: 0,1:52:03.74,1:52:07.04,Default,,0000,0000,0000,,versus only 0.8% incorrect\Nfor light-skinned males. Dialogue: 0,1:52:07.04,1:52:08.98,Default,,0000,0000,0000,,This is horrible, right? Dialogue: 0,1:52:08.98,1:52:12.51,Default,,0000,0000,0000,,So these mistakes also can't\Nbe viewed in isolation. Dialogue: 0,1:52:12.51,1:52:14.82,Default,,0000,0000,0000,,They will perpetuate bias issues Dialogue: 0,1:52:14.82,1:52:16.30,Default,,0000,0000,0000,,by informing the collection Dialogue: 0,1:52:16.30,1:52:18.24,Default,,0000,0000,0000,,of the next generation of datasets, Dialogue: 0,1:52:18.24,1:52:19.75,Default,,0000,0000,0000,,which are then again used Dialogue: 0,1:52:19.75,1:52:22.54,Default,,0000,0000,0000,,to train the next generation of AI models. Dialogue: 0,1:52:22.54,1:52:24.58,Default,,0000,0000,0000,,Moreover, when mistakes are made either Dialogue: 0,1:52:24.58,1:52:26.78,Default,,0000,0000,0000,,by a machine or by a human being, Dialogue: 0,1:52:26.78,1:52:28.43,Default,,0000,0000,0000,,we need to be able to ask the question, Dialogue: 0,1:52:28.43,1:52:30.46,Default,,0000,0000,0000,,why was that mistake made? Dialogue: 0,1:52:30.46,1:52:32.97,Default,,0000,0000,0000,,And we simply haven't\Ndeveloped robust tools Dialogue: 0,1:52:32.97,1:52:35.01,Default,,0000,0000,0000,,to be able to hold these algorithms Dialogue: 0,1:52:35.01,1:52:37.16,Default,,0000,0000,0000,,to an acceptable level of transparency, Dialogue: 0,1:52:37.16,1:52:40.20,Default,,0000,0000,0000,,to answer any of these critical questions. Dialogue: 0,1:52:40.20,1:52:42.12,Default,,0000,0000,0000,,Deployment of these systems Dialogue: 0,1:52:42.12,1:52:43.93,Default,,0000,0000,0000,,will exacerbate the issues of racial bias Dialogue: 0,1:52:43.93,1:52:46.13,Default,,0000,0000,0000,,in policing and accelerate\Nthe various social inequalities Dialogue: 0,1:52:47.10,1:52:49.23,Default,,0000,0000,0000,,that we're protesting today.\NInnocent people will be targeted.
Dialogue: 0,1:52:49.23,1:52:51.53,Default,,0000,0000,0000,,The threshold for deploying these systems Dialogue: 0,1:52:51.53,1:52:52.97,Default,,0000,0000,0000,,should be extremely high, Dialogue: 0,1:52:52.97,1:52:55.78,Default,,0000,0000,0000,,one that's arguably not\Nreachable. Beyond accuracy, Dialogue: 0,1:52:55.78,1:52:58.17,Default,,0000,0000,0000,,beyond a hundred percent accuracy, Dialogue: 0,1:52:58.17,1:53:00.05,Default,,0000,0000,0000,,data would have to be public Dialogue: 0,1:53:00.05,1:53:02.87,Default,,0000,0000,0000,,and there would have to be\Nactionable algorithmic oversight. Dialogue: 0,1:53:02.87,1:53:05.72,Default,,0000,0000,0000,,This is an incredibly high bar\Nand we are nowhere near that, Dialogue: 0,1:53:05.72,1:53:08.40,Default,,0000,0000,0000,,both algorithmically and societally. Dialogue: 0,1:53:08.40,1:53:09.30,Default,,0000,0000,0000,,Along with this ordinance, Dialogue: 0,1:53:09.30,1:53:10.84,Default,,0000,0000,0000,,we must start laying the groundwork Dialogue: 0,1:53:10.84,1:53:13.54,Default,,0000,0000,0000,,for rigorous criteria\Nfor future technology Dialogue: 0,1:53:13.54,1:53:15.91,Default,,0000,0000,0000,,as it is an inevitable conversation. Dialogue: 0,1:53:15.91,1:53:17.50,Default,,0000,0000,0000,,But first we must pass the ordinance Dialogue: 0,1:53:17.50,1:53:19.40,Default,,0000,0000,0000,,banning facial recognition in Boston. Dialogue: 0,1:53:19.40,1:53:20.70,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,1:53:20.70,1:53:22.67,Default,,0000,0000,0000,,Thank you very much, Dialogue: 0,1:53:22.67,1:53:26.02,Default,,0000,0000,0000,,following Nikhill we have Nour Sulalman. Dialogue: 0,1:53:28.69,1:53:31.50,Default,,0000,0000,0000,,Sorry again for mispronunciation. Dialogue: 0,1:53:31.50,1:53:32.33,Default,,0000,0000,0000,,It's all good. Dialogue: 0,1:53:33.29,1:53:35.31,Default,,0000,0000,0000,,Thank you council members Dialogue: 0,1:53:35.31,1:53:38.12,Default,,0000,0000,0000,,and authors of the ordinance\Nfor your service to the city. Dialogue: 0,1:53:38.12,1:53:40.87,Default,,0000,0000,0000,,My name is Noue Sulalman Langedorfer, Dialogue: 0,1:53:40.87,1:53:43.13,Default,,0000,0000,0000,,and I use she/her pronouns. Dialogue: 0,1:53:43.13,1:53:45.77,Default,,0000,0000,0000,,I'm a resident of Jamaica Plain, Dialogue: 0,1:53:45.77,1:53:48.94,Default,,0000,0000,0000,,and I'm here in my capacity\Nas a private citizen. Dialogue: 0,1:53:48.94,1:53:52.24,Default,,0000,0000,0000,,I won't repeat what everyone else said, Dialogue: 0,1:53:52.24,1:53:55.13,Default,,0000,0000,0000,,but I will say that the\Nuse of facial recognition Dialogue: 0,1:53:55.13,1:53:58.62,Default,,0000,0000,0000,,by law enforcement today\Nis especially disturbing Dialogue: 0,1:53:58.62,1:54:01.99,Default,,0000,0000,0000,,due to the epidemic of police\Nviolence in this country Dialogue: 0,1:54:01.99,1:54:04.85,Default,,0000,0000,0000,,and the dysfunctional relationship it has Dialogue: 0,1:54:04.85,1:54:08.39,Default,,0000,0000,0000,,with black and brown people in the city. Dialogue: 0,1:54:08.39,1:54:10.58,Default,,0000,0000,0000,,And with all due respect, Dialogue: 0,1:54:10.58,1:54:13.30,Default,,0000,0000,0000,,I'm unconvinced by the distinction made Dialogue: 0,1:54:13.30,1:54:16.54,Default,,0000,0000,0000,,between facial surveillance\Nand facial recognition.
Dialogue: 0,1:54:16.54,1:54:21.54,Default,,0000,0000,0000,,I'm even more disturbed or I am disturbed Dialogue: 0,1:54:21.98,1:54:23.41,Default,,0000,0000,0000,,that the commissioner does not know Dialogue: 0,1:54:23.41,1:54:27.19,Default,,0000,0000,0000,,if the police department uses\Nfacial recognition software. Dialogue: 0,1:54:27.19,1:54:31.31,Default,,0000,0000,0000,,I ask the council to consider supporting Dialogue: 0,1:54:31.31,1:54:33.89,Default,,0000,0000,0000,,that the ordinance establish a total ban Dialogue: 0,1:54:33.89,1:54:35.72,Default,,0000,0000,0000,,on the use of facial recognition software Dialogue: 0,1:54:35.72,1:54:39.10,Default,,0000,0000,0000,,in all public spaces,\Nincluding by law enforcement, Dialogue: 0,1:54:39.10,1:54:41.38,Default,,0000,0000,0000,,malls and shopping centers, retail spaces, Dialogue: 0,1:54:41.38,1:54:44.77,Default,,0000,0000,0000,,restaurants open to the public,\Nstate government buildings, Dialogue: 0,1:54:44.77,1:54:48.52,Default,,0000,0000,0000,,courts, public schools,\Nschools funded by the state, Dialogue: 0,1:54:48.52,1:54:50.98,Default,,0000,0000,0000,,on public streets and\Nin places of employment, Dialogue: 0,1:54:50.98,1:54:52.59,Default,,0000,0000,0000,,as a condition of employment Dialogue: 0,1:54:52.59,1:54:55.90,Default,,0000,0000,0000,,and as a condition of landlords\Nleasing rental properties. Dialogue: 0,1:54:55.90,1:54:58.35,Default,,0000,0000,0000,,The ban should also apply to the use Dialogue: 0,1:54:58.35,1:55:00.68,Default,,0000,0000,0000,,and purchase of facial recognition software Dialogue: 0,1:55:00.68,1:55:03.79,Default,,0000,0000,0000,,by any state agency operating in Boston, Dialogue: 0,1:55:03.79,1:55:05.83,Default,,0000,0000,0000,,including law enforcement Dialogue: 0,1:55:05.83,1:55:08.98,Default,,0000,0000,0000,,and I further urge the council\Nto consider adopting a ban Dialogue: 0,1:55:08.98,1:55:12.80,Default,,0000,0000,0000,,on the use of private DNA\Ndatabases by law enforcement Dialogue: 0,1:55:12.80,1:55:15.79,Default,,0000,0000,0000,,and all remote surveillance systems, Dialogue: 0,1:55:15.79,1:55:18.46,Default,,0000,0000,0000,,including those that\Nrecognize individual gaits. Dialogue: 0,1:55:18.46,1:55:23.46,Default,,0000,0000,0000,,Information gathered through\Nthe aforementioned technologies Dialogue: 0,1:55:24.01,1:55:27.39,Default,,0000,0000,0000,,should not be deemed\Nadmissible in civil, criminal Dialogue: 0,1:55:27.39,1:55:29.59,Default,,0000,0000,0000,,or administrative proceedings Dialogue: 0,1:55:29.59,1:55:33.52,Default,,0000,0000,0000,,or serve as a basis for\Ngranting a search warrant. Dialogue: 0,1:55:33.52,1:55:34.99,Default,,0000,0000,0000,,Thank you madam chair. Dialogue: 0,1:55:38.20,1:55:39.30,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,1:55:40.15,1:55:43.68,Default,,0000,0000,0000,,I have a professor Woodrow Hartzog. Dialogue: 0,1:55:46.32,1:55:47.57,Default,,0000,0000,0000,,Okay, two minutes. Dialogue: 0,1:55:47.57,1:55:50.34,Default,,0000,0000,0000,,Thank you so much, dear councilors, Dialogue: 0,1:55:50.34,1:55:52.23,Default,,0000,0000,0000,,thank you for allowing me the opportunity Dialogue: 0,1:55:52.23,1:55:54.47,Default,,0000,0000,0000,,to speak in support of the ordinance Dialogue: 0,1:55:54.47,1:55:56.99,Default,,0000,0000,0000,,banning face surveillance in Boston.
Dialogue: 0,1:55:56.99,1:55:59.17,Default,,0000,0000,0000,,I am a professor of law\Nand computer science Dialogue: 0,1:55:59.17,1:56:00.81,Default,,0000,0000,0000,,at Northeastern University, Dialogue: 0,1:56:00.81,1:56:02.91,Default,,0000,0000,0000,,who has been researching and writing Dialogue: 0,1:56:02.91,1:56:04.61,Default,,0000,0000,0000,,about the risks of facial\Nrecognition technologies Dialogue: 0,1:56:04.61,1:56:06.28,Default,,0000,0000,0000,,for over seven years, Dialogue: 0,1:56:06.28,1:56:09.42,Default,,0000,0000,0000,,I make these comments in my\Npersonal academic capacity. Dialogue: 0,1:56:09.42,1:56:11.13,Default,,0000,0000,0000,,I'm not serving as an advocate Dialogue: 0,1:56:11.13,1:56:14.07,Default,,0000,0000,0000,,for any particular organization. Dialogue: 0,1:56:14.07,1:56:16.68,Default,,0000,0000,0000,,Please allow me to be direct, Dialogue: 0,1:56:16.68,1:56:18.29,Default,,0000,0000,0000,,facial recognition technology Dialogue: 0,1:56:18.29,1:56:21.98,Default,,0000,0000,0000,,is the most dangerous\Nsurveillance tool ever invented. Dialogue: 0,1:56:21.98,1:56:24.61,Default,,0000,0000,0000,,It poses substantial\Nthreats to civil liberties, Dialogue: 0,1:56:24.61,1:56:27.08,Default,,0000,0000,0000,,privacy and democratic accountability. Dialogue: 0,1:56:27.08,1:56:30.53,Default,,0000,0000,0000,,Quite simply the world has\Nnever seen anything like it. Dialogue: 0,1:56:30.53,1:56:33.95,Default,,0000,0000,0000,,Traditional legal rules, such\Nas requiring legal process Dialogue: 0,1:56:33.95,1:56:36.54,Default,,0000,0000,0000,,or people's consent before\Nsurveillance could be conducted, Dialogue: 0,1:56:36.54,1:56:39.34,Default,,0000,0000,0000,,will only entrench these systems, Dialogue: 0,1:56:39.34,1:56:41.22,Default,,0000,0000,0000,,lead to a more watched society Dialogue: 0,1:56:41.22,1:56:44.50,Default,,0000,0000,0000,,where we are all treated\Nas suspects all the time. Dialogue: 0,1:56:44.50,1:56:45.85,Default,,0000,0000,0000,,Anything less than a ban Dialogue: 0,1:56:45.85,1:56:48.55,Default,,0000,0000,0000,,will inevitably lead to unacceptable abuse Dialogue: 0,1:56:48.55,1:56:51.22,Default,,0000,0000,0000,,of the massive power\Nbestowed by these systems. Dialogue: 0,1:56:51.22,1:56:53.23,Default,,0000,0000,0000,,That is why I believe that this ordinance Dialogue: 0,1:56:53.23,1:56:55.49,Default,,0000,0000,0000,,is justified and necessary. Dialogue: 0,1:56:55.49,1:56:57.19,Default,,0000,0000,0000,,There are many ways that law enforcement's Dialogue: 0,1:56:57.19,1:56:59.71,Default,,0000,0000,0000,,use of facial recognition\Ntechnology can harm people. Dialogue: 0,1:56:59.71,1:57:01.60,Default,,0000,0000,0000,,Government surveillance Dialogue: 0,1:57:01.60,1:57:04.85,Default,,0000,0000,0000,,has disproportionately targeted\Nmarginalized communities, Dialogue: 0,1:57:04.85,1:57:06.67,Default,,0000,0000,0000,,specifically people of color, Dialogue: 0,1:57:06.67,1:57:09.07,Default,,0000,0000,0000,,facial recognition will\Nonly make this worse Dialogue: 0,1:57:09.07,1:57:10.93,Default,,0000,0000,0000,,because it is inaccurate and bias Dialogue: 0,1:57:10.93,1:57:13.72,Default,,0000,0000,0000,,based along racial and gender lines. 
Dialogue: 0,1:57:13.72,1:57:16.18,Default,,0000,0000,0000,,And while facial\Nrecognition is unacceptable Dialogue: 0,1:57:16.18,1:57:18.44,Default,,0000,0000,0000,,when it's biased and error prone, Dialogue: 0,1:57:18.44,1:57:20.27,Default,,0000,0000,0000,,the more accurate it becomes, Dialogue: 0,1:57:20.27,1:57:22.20,Default,,0000,0000,0000,,the more oppressive it will get. Dialogue: 0,1:57:22.20,1:57:25.71,Default,,0000,0000,0000,,Accurate systems will be more\Nheavily used and invested in, Dialogue: 0,1:57:25.71,1:57:28.91,Default,,0000,0000,0000,,further endangering the people of Boston. Dialogue: 0,1:57:28.91,1:57:31.56,Default,,0000,0000,0000,,Use of facial recognition\Nseems likely to create Dialogue: 0,1:57:31.56,1:57:34.00,Default,,0000,0000,0000,,a pervasive atmosphere of chill. Dialogue: 0,1:57:34.00,1:57:36.84,Default,,0000,0000,0000,,These tools make it easier\Nto engage in surveillance, Dialogue: 0,1:57:36.84,1:57:40.01,Default,,0000,0000,0000,,which means that more\Nsurveillance can and will occur. Dialogue: 0,1:57:40.01,1:57:43.18,Default,,0000,0000,0000,,The mere prospect of a\Nhyper-surveilled society Dialogue: 0,1:57:43.18,1:57:44.87,Default,,0000,0000,0000,,could routinely prevent citizens Dialogue: 0,1:57:44.87,1:57:47.94,Default,,0000,0000,0000,,from engaging in First\NAmendment protected activities, Dialogue: 0,1:57:47.94,1:57:50.02,Default,,0000,0000,0000,,such as protesting and worshiping, Dialogue: 0,1:57:50.02,1:57:53.59,Default,,0000,0000,0000,,for fear of ending up on\Ngovernment watch lists. Dialogue: 0,1:57:53.59,1:57:55.67,Default,,0000,0000,0000,,Facial recognition also poses a threat Dialogue: 0,1:57:55.67,1:57:57.64,Default,,0000,0000,0000,,to our ideals of due process, Dialogue: 0,1:57:57.64,1:57:59.01,Default,,0000,0000,0000,,because it makes it all too easy Dialogue: 0,1:57:59.01,1:58:02.49,Default,,0000,0000,0000,,for governments to excessively\Nenforce minor infractions Dialogue: 0,1:58:02.49,1:58:05.20,Default,,0000,0000,0000,,as pretext for secretly monitoring Dialogue: 0,1:58:05.20,1:58:07.61,Default,,0000,0000,0000,,and retaliating against our citizens Dialogue: 0,1:58:07.61,1:58:10.76,Default,,0000,0000,0000,,who are often targeted for\Nspeaking up, like journalists, Dialogue: 0,1:58:10.76,1:58:13.37,Default,,0000,0000,0000,,whistleblowers and activists. Dialogue: 0,1:58:13.37,1:58:15.86,Default,,0000,0000,0000,,The net result could be\Nanxious and oppressed citizens Dialogue: 0,1:58:15.86,1:58:19.32,Default,,0000,0000,0000,,who are denied fundamental\Nopportunities and rights. Dialogue: 0,1:58:19.32,1:58:20.69,Default,,0000,0000,0000,,For the reasons outlined above, Dialogue: 0,1:58:20.69,1:58:22.19,Default,,0000,0000,0000,,I strongly support the ordinance Dialogue: 0,1:58:22.19,1:58:24.95,Default,,0000,0000,0000,,banning facial recognition\Nsurveillance in Boston. Dialogue: 0,1:58:24.95,1:58:26.63,Default,,0000,0000,0000,,It's the best approach for preventing Dialogue: 0,1:58:26.63,1:58:27.98,Default,,0000,0000,0000,,an Orwellian future Dialogue: 0,1:58:27.98,1:58:29.67,Default,,0000,0000,0000,,and ensuring that the city of Boston Dialogue: 0,1:58:29.67,1:58:33.10,Default,,0000,0000,0000,,remains a place where people can flourish Dialogue: 0,1:58:33.10,1:58:35.87,Default,,0000,0000,0000,,and civil liberties are protected. Dialogue: 0,1:58:35.87,1:58:37.42,Default,,0000,0000,0000,,Thank you so much, professor.
Dialogue: 0,1:58:38.28,1:58:43.28,Default,,0000,0000,0000,,We have coming up, Alex\NMarthews, Emily Reif, Dialogue: 0,1:58:44.64,1:58:47.27,Default,,0000,0000,0000,,Jurell Laronal. Dialogue: 0,1:58:48.55,1:58:52.51,Default,,0000,0000,0000,,Those three folks, I don't\Nknow if they're ready to go. Dialogue: 0,1:58:52.51,1:58:56.37,Default,,0000,0000,0000,,But we'll start, I see Jurell, I see Alex. Dialogue: 0,1:58:57.29,1:58:59.60,Default,,0000,0000,0000,,Go ahead, Alex Marthews. Dialogue: 0,1:59:01.32,1:59:02.15,Default,,0000,0000,0000,,Hi. Dialogue: 0,1:59:03.55,1:59:08.55,Default,,0000,0000,0000,,I don't wish to echo what\Nother excellent advocates Dialogue: 0,1:59:09.81,1:59:10.97,Default,,0000,0000,0000,,have been saying, Dialogue: 0,1:59:10.97,1:59:15.40,Default,,0000,0000,0000,,but I am here representing Digital Fourth, Dialogue: 0,1:59:15.40,1:59:16.74,Default,,0000,0000,0000,,Restore the Fourth Boston, Dialogue: 0,1:59:16.74,1:59:19.34,Default,,0000,0000,0000,,which is a volunteer-based Dialogue: 0,1:59:19.34,1:59:22.78,Default,,0000,0000,0000,,advocacy group on surveillance\Nand policing issues Dialogue: 0,1:59:22.78,1:59:24.10,Default,,0000,0000,0000,,in the Boston area. Dialogue: 0,1:59:25.37,1:59:30.37,Default,,0000,0000,0000,,I wanna highlight some\Ncomments of commissioner Gross, Dialogue: 0,1:59:30.90,1:59:34.28,Default,,0000,0000,0000,,in addition to our testimony. Dialogue: 0,1:59:35.15,1:59:39.04,Default,,0000,0000,0000,,Commissioner Gross seems to\Nrecognize the racial biases Dialogue: 0,1:59:39.04,1:59:43.41,Default,,0000,0000,0000,,and limitations of this\Ntechnology as it stands now, Dialogue: 0,1:59:43.41,1:59:47.42,Default,,0000,0000,0000,,but he says it's improving\Nand he holds out hope Dialogue: 0,1:59:47.42,1:59:51.51,Default,,0000,0000,0000,,that it will be more\Naccurate in the future. Dialogue: 0,1:59:51.51,1:59:53.63,Default,,0000,0000,0000,,And that perhaps at some future point, Dialogue: 0,1:59:53.63,1:59:56.96,Default,,0000,0000,0000,,the Boston City Council should reconsider Dialogue: 0,1:59:56.96,1:59:58.58,Default,,0000,0000,0000,,whether it will be appropriate Dialogue: 0,1:59:58.58,2:00:03.13,Default,,0000,0000,0000,,to implement facial\Nrecognition technology. Dialogue: 0,2:00:03.13,2:00:05.71,Default,,0000,0000,0000,,The problem with this is that Dialogue: 0,2:00:05.71,2:00:10.71,Default,,0000,0000,0000,,a 100% accurate facial\Nrecognition technology system Dialogue: 0,2:00:10.78,2:00:11.93,Default,,0000,0000,0000,,would be terrifying. Dialogue: 0,2:00:13.36,2:00:17.66,Default,,0000,0000,0000,,And it would represent the\Ndeath of anonymity in public. Dialogue: 0,2:00:18.57,2:00:23.48,Default,,0000,0000,0000,,In terms of the effect of\Nfacial recognition technology Dialogue: 0,2:00:23.48,2:00:26.08,Default,,0000,0000,0000,,on legislators themselves, Dialogue: 0,2:00:27.26,2:00:29.52,Default,,0000,0000,0000,,there have been a number of studies now Dialogue: 0,2:00:30.46,2:00:34.43,Default,,0000,0000,0000,,that show that current facial\Nrecognition technologies Dialogue: 0,2:00:34.43,2:00:36.17,Default,,0000,0000,0000,,are perfectly capable Dialogue: 0,2:00:37.98,2:00:40.30,Default,,0000,0000,0000,,of misidentifying legislators as criminals Dialogue: 0,2:00:41.42,2:00:45.82,Default,,0000,0000,0000,,and these systems being 100% accurate Dialogue: 0,2:00:45.82,2:00:49.71,Default,,0000,0000,0000,,would pose even more\Nworrying risks than that.
Dialogue: 0,2:00:49.71,2:00:52.17,Default,,0000,0000,0000,,As city councilors, Dialogue: 0,2:00:52.17,2:00:54.57,Default,,0000,0000,0000,,where are you going when\Nyou're going in public? Dialogue: 0,2:00:54.57,2:00:57.21,Default,,0000,0000,0000,,Are you meeting with\Nsocial justice groups? Dialogue: 0,2:00:57.21,2:00:59.29,Default,,0000,0000,0000,,Are you meeting with immigrant advocates? Dialogue: 0,2:00:59.29,2:01:01.81,Default,,0000,0000,0000,,Are you meeting with, God forbid, Dialogue: 0,2:01:01.81,2:01:04.84,Default,,0000,0000,0000,,police reform and\Naccountability groups? Dialogue: 0,2:01:04.84,2:01:08.21,Default,,0000,0000,0000,,Are you doing something\Nembarrassing perhaps Dialogue: 0,2:01:08.21,2:01:10.96,Default,,0000,0000,0000,,that could be used to influence your votes Dialogue: 0,2:01:10.96,2:01:13.00,Default,,0000,0000,0000,,on matters of public interest Dialogue: 0,2:01:13.00,2:01:16.10,Default,,0000,0000,0000,,or matters relating to the police budget? Dialogue: 0,2:01:16.10,2:01:20.81,Default,,0000,0000,0000,,The risks of these things\Nhappening are very real, Dialogue: 0,2:01:20.81,2:01:23.48,Default,,0000,0000,0000,,if it is to be implemented in the future. Dialogue: 0,2:01:23.48,2:01:26.15,Default,,0000,0000,0000,,And the only appropriate response Dialogue: 0,2:01:26.15,2:01:30.66,Default,,0000,0000,0000,,is for today's city council\Nto ban this technology. Dialogue: 0,2:01:31.66,2:01:34.33,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,2:01:35.80,2:01:38.83,Default,,0000,0000,0000,,I believe, I don't know\Nif Emily is available, Dialogue: 0,2:01:38.83,2:01:43.83,Default,,0000,0000,0000,,but I do see Jurell. Jurell,\Nwould you like to go? Dialogue: 0,2:01:50.86,2:01:51.69,Default,,0000,0000,0000,,Thank you. Dialogue: 0,2:01:51.69,2:01:54.33,Default,,0000,0000,0000,,Thank you, Boston City Council\Nfor having this platform. Dialogue: 0,2:01:54.33,2:01:56.17,Default,,0000,0000,0000,,I wanna thank you. Dialogue: 0,2:01:56.17,2:01:58.68,Default,,0000,0000,0000,,Thank you and a shout out\Nto the expert advocates Dialogue: 0,2:01:58.68,2:02:01.76,Default,,0000,0000,0000,,on the subject for shedding\Nlight on the subject Dialogue: 0,2:02:01.76,2:02:04.85,Default,,0000,0000,0000,,that me and many of us\Nare not fully versed in. Dialogue: 0,2:02:04.85,2:02:07.80,Default,,0000,0000,0000,,What I am well versed in though, Dialogue: 0,2:02:07.80,2:02:09.71,Default,,0000,0000,0000,,is the reality surrounding Dialogue: 0,2:02:09.71,2:02:12.07,Default,,0000,0000,0000,,the Boston Police Department's history Dialogue: 0,2:02:12.07,2:02:14.55,Default,,0000,0000,0000,,and the negative impact that they have had Dialogue: 0,2:02:14.55,2:02:17.37,Default,,0000,0000,0000,,on black and brown\NBostonians for decades. Dialogue: 0,2:02:17.37,2:02:18.86,Default,,0000,0000,0000,,My name is Jurell Laronal, Dialogue: 0,2:02:18.86,2:02:21.25,Default,,0000,0000,0000,,I'm a formerly incarcerated black man Dialogue: 0,2:02:21.25,2:02:24.66,Default,,0000,0000,0000,,from a heavily policed\Nneighborhood of Dorchester. Dialogue: 0,2:02:24.66,2:02:29.48,Default,,0000,0000,0000,,I'm a community organizer for\NFamilies for Justice as Healing Dialogue: 0,2:02:29.48,2:02:32.42,Default,,0000,0000,0000,,and I'm here to support the ordinance Dialogue: 0,2:02:32.42,2:02:35.91,Default,,0000,0000,0000,,banning facial recognition\Ntechnology in Boston Dialogue: 0,2:02:35.91,2:02:39.04,Default,,0000,0000,0000,,presented by councilors Wu and Arroyo.
Dialogue: 0,2:02:39.04,2:02:41.10,Default,,0000,0000,0000,,First, what are we talking about? Dialogue: 0,2:02:41.10,2:02:43.88,Default,,0000,0000,0000,,We are talking about a force whose actions Dialogue: 0,2:02:43.88,2:02:47.69,Default,,0000,0000,0000,,and racial profiling techniques\Nhave been documented, Dialogue: 0,2:02:47.69,2:02:50.55,Default,,0000,0000,0000,,tried in court and notoriously\Nknown for targeting Dialogue: 0,2:02:50.55,2:02:55.30,Default,,0000,0000,0000,,and terrorizing those from\Nmy disinvested communities. Dialogue: 0,2:02:55.30,2:03:00.20,Default,,0000,0000,0000,,We are talking about\Nmilitarization to intimidate, Dialogue: 0,2:03:00.20,2:03:03.94,Default,,0000,0000,0000,,force their will on\Nblack and brown citizens Dialogue: 0,2:03:03.94,2:03:05.48,Default,,0000,0000,0000,,of the Commonwealth. Dialogue: 0,2:03:05.48,2:03:08.26,Default,,0000,0000,0000,,A force whose own field interrogation Dialogue: 0,2:03:08.26,2:03:09.27,Default,,0000,0000,0000,,and observation records Dialogue: 0,2:03:09.27,2:03:12.64,Default,,0000,0000,0000,,show almost 70% of them\Nwere black residents Dialogue: 0,2:03:12.64,2:03:16.16,Default,,0000,0000,0000,,in a city that's only around 25.2% black. Dialogue: 0,2:03:16.16,2:03:19.42,Default,,0000,0000,0000,,So we're talking about\Ntaking the same force Dialogue: 0,2:03:19.42,2:03:21.58,Default,,0000,0000,0000,,and further funding them Dialogue: 0,2:03:21.58,2:03:24.31,Default,,0000,0000,0000,,and equipping them with another weapon Dialogue: 0,2:03:24.31,2:03:27.16,Default,,0000,0000,0000,,that would definitely be used\Nto further oppress, target Dialogue: 0,2:03:27.16,2:03:30.17,Default,,0000,0000,0000,,and incarcerate our sons and fathers Dialogue: 0,2:03:30.17,2:03:32.13,Default,,0000,0000,0000,,and daughters and mothers. Dialogue: 0,2:03:32.13,2:03:34.58,Default,,0000,0000,0000,,My organization is currently\Nsupporting the appeals Dialogue: 0,2:03:34.58,2:03:38.00,Default,,0000,0000,0000,,and release efforts of black\Nmen and their families, Dialogue: 0,2:03:38.00,2:03:40.83,Default,,0000,0000,0000,,who have been incarcerated\Nsince the seventies Dialogue: 0,2:03:40.83,2:03:44.20,Default,,0000,0000,0000,,due to racist profiling\Nand racist misidentification. Dialogue: 0,2:03:45.16,2:03:48.52,Default,,0000,0000,0000,,When I think about facial\Nrecognition software, Dialogue: 0,2:03:48.52,2:03:51.38,Default,,0000,0000,0000,,which at best is predatory and inaccurate, Dialogue: 0,2:03:51.38,2:03:54.68,Default,,0000,0000,0000,,I don't see a solution\Nand I don't see safety. Dialogue: 0,2:03:54.68,2:03:56.90,Default,,0000,0000,0000,,I see yet another form of racial bias Dialogue: 0,2:03:56.90,2:03:58.46,Default,,0000,0000,0000,,and another way for police Dialogue: 0,2:03:59.52,2:04:00.36,Default,,0000,0000,0000,,to cause harm in my community. Dialogue: 0,2:04:01.31,2:04:04.34,Default,,0000,0000,0000,,Today, our city's historical\Nracial profiling issue combines Dialogue: 0,2:04:04.34,2:04:06.78,Default,,0000,0000,0000,,with, somebody said it earlier, Dialogue: 0,2:04:06.78,2:04:08.36,Default,,0000,0000,0000,,the federal government study Dialogue: 0,2:04:08.36,2:04:11.94,Default,,0000,0000,0000,,that found the algorithm, my fault, Dialogue: 0,2:04:11.94,2:04:15.00,Default,,0000,0000,0000,,failed to identify people of color Dialogue: 0,2:04:15.00,2:04:17.73,Default,,0000,0000,0000,,and children and elderly and women.
Dialogue: 0,2:04:18.82,2:04:22.41,Default,,0000,0000,0000,,I see that it needs to be banned Dialogue: 0,2:04:22.41,2:04:25.29,Default,,0000,0000,0000,,because we can't continue\Ncondoning and reassuring Dialogue: 0,2:04:25.29,2:04:29.32,Default,,0000,0000,0000,,and fueling other generations\Nof black and brown Dialogue: 0,2:04:29.32,2:04:32.13,Default,,0000,0000,0000,,men and women from Dorchester\NRoxbury and Mattapan, Dialogue: 0,2:04:32.13,2:04:34.71,Default,,0000,0000,0000,,to more trauma and more incarceration. Dialogue: 0,2:04:34.71,2:04:37.42,Default,,0000,0000,0000,,Investment in our communities Dialogue: 0,2:04:37.42,2:04:39.27,Default,,0000,0000,0000,,through community led processes, Dialogue: 0,2:04:39.27,2:04:41.43,Default,,0000,0000,0000,,is the only thing that we need, Dialogue: 0,2:04:41.43,2:04:43.54,Default,,0000,0000,0000,,not facial recognition tech. Dialogue: 0,2:04:43.54,2:04:45.96,Default,,0000,0000,0000,,Oh these bees are messing\Nwith me, sorry about that. Dialogue: 0,2:04:45.96,2:04:50.42,Default,,0000,0000,0000,,No matter what level that\Ntechnology reaches and thank you. Dialogue: 0,2:04:50.42,2:04:51.33,Default,,0000,0000,0000,,That's all I have to say. Dialogue: 0,2:04:51.33,2:04:52.53,Default,,0000,0000,0000,,I'm gonna keep it quick. Dialogue: 0,2:04:54.72,2:04:55.73,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,2:04:55.73,2:04:56.56,Default,,0000,0000,0000,,I really do appreciate it. Dialogue: 0,2:04:56.56,2:04:58.00,Default,,0000,0000,0000,,And I think, again speak from the heart Dialogue: 0,2:04:58.00,2:04:59.29,Default,,0000,0000,0000,,about their own experience Dialogue: 0,2:04:59.29,2:05:02.07,Default,,0000,0000,0000,,and specifically talk about\Nhow it would impact them, Dialogue: 0,2:05:02.07,2:05:03.94,Default,,0000,0000,0000,,I do appreciate that. Dialogue: 0,2:05:03.94,2:05:05.75,Default,,0000,0000,0000,,Not that I don't appreciate the experts Dialogue: 0,2:05:05.75,2:05:07.21,Default,,0000,0000,0000,,but I'm telling you there's nothing better Dialogue: 0,2:05:07.21,2:05:10.14,Default,,0000,0000,0000,,than hearing about how\Nresidents are directly impacted. Dialogue: 0,2:05:10.14,2:05:10.97,Default,,0000,0000,0000,,Thank you. Dialogue: 0,2:05:11.82,2:05:16.82,Default,,0000,0000,0000,,I have also on the list, Cynthia Pades, Dialogue: 0,2:05:19.23,2:05:20.96,Default,,0000,0000,0000,,I don't have a name, Dialogue: 0,2:05:20.96,2:05:23.68,Default,,0000,0000,0000,,but I have an organization\Ncalled For the People Boston. Dialogue: 0,2:05:25.57,2:05:27.22,Default,,0000,0000,0000,,I don't think we have either one. Dialogue: 0,2:05:28.57,2:05:29.61,Default,,0000,0000,0000,,There is. Dialogue: 0,2:05:29.61,2:05:33.16,Default,,0000,0000,0000,,Oh, I see a Marco, Marco Staminivich. Dialogue: 0,2:05:37.09,2:05:39.34,Default,,0000,0000,0000,,Marco, ready to go? Dialogue: 0,2:05:41.71,2:05:42.54,Default,,0000,0000,0000,,Marco! Dialogue: 0,2:05:43.48,2:05:44.51,Default,,0000,0000,0000,,Sure.\NOkay. Dialogue: 0,2:05:45.95,2:05:48.97,Default,,0000,0000,0000,,How is it going everyone. Dialogue: 0,2:05:48.97,2:05:51.90,Default,,0000,0000,0000,,Thank you to the council and the panelists Dialogue: 0,2:05:51.90,2:05:53.72,Default,,0000,0000,0000,,for allowing me to speak. 
Dialogue: 0,2:05:53.72,2:05:55.75,Default,,0000,0000,0000,,So I'm a machine learning engineer Dialogue: 0,2:05:55.75,2:05:57.65,Default,,0000,0000,0000,,working in an adjacent field Dialogue: 0,2:05:57.65,2:06:00.47,Default,,0000,0000,0000,,and a resident of Jamaica Plain Dialogue: 0,2:06:00.47,2:06:02.87,Default,,0000,0000,0000,,and I'm here today as a private citizen. Dialogue: 0,2:06:02.87,2:06:05.54,Default,,0000,0000,0000,,So this is my first time\Nspeaking at a public hearing, Dialogue: 0,2:06:05.54,2:06:07.49,Default,,0000,0000,0000,,so cut me some slack. Dialogue: 0,2:06:07.49,2:06:09.38,Default,,0000,0000,0000,,Although I'm not an expert Dialogue: 0,2:06:09.38,2:06:12.64,Default,,0000,0000,0000,,on service on the surveillance\Naspect and the applications, Dialogue: 0,2:06:12.64,2:06:17.64,Default,,0000,0000,0000,,I am familiar with the\Ntechnology aspect of this stuff. Dialogue: 0,2:06:18.80,2:06:20.55,Default,,0000,0000,0000,,So I felt the need to testify today Dialogue: 0,2:06:20.55,2:06:22.81,Default,,0000,0000,0000,,because of the gravity of this issue. Dialogue: 0,2:06:22.81,2:06:27.61,Default,,0000,0000,0000,,Although the technology may\Nseem benign upon first glance, Dialogue: 0,2:06:28.87,2:06:31.63,Default,,0000,0000,0000,,this is an extremely\Npowerful and dangerous tool. Dialogue: 0,2:06:31.63,2:06:33.83,Default,,0000,0000,0000,,So to me this amounts to basically Dialogue: 0,2:06:33.83,2:06:35.95,Default,,0000,0000,0000,,a form of digital stop and frisk, Dialogue: 0,2:06:35.95,2:06:37.13,Default,,0000,0000,0000,,and it's even more dangerous Dialogue: 0,2:06:37.13,2:06:39.80,Default,,0000,0000,0000,,since it can be kind\Nof constantly running, Dialogue: 0,2:06:39.80,2:06:43.11,Default,,0000,0000,0000,,constantly working anywhere at all times, Dialogue: 0,2:06:43.11,2:06:44.58,Default,,0000,0000,0000,,kind of tracking folks. Dialogue: 0,2:06:44.58,2:06:46.28,Default,,0000,0000,0000,,This seems like a clear violation Dialogue: 0,2:06:46.28,2:06:48.42,Default,,0000,0000,0000,,of the Fourth Amendment to me. Dialogue: 0,2:06:49.29,2:06:52.42,Default,,0000,0000,0000,,As many commenters and\Npanelists today have noted, Dialogue: 0,2:06:52.42,2:06:55.08,Default,,0000,0000,0000,,in addition, as many\Nfolks have noted today, Dialogue: 0,2:06:55.08,2:06:57.35,Default,,0000,0000,0000,,the technology is certainly not perfect Dialogue: 0,2:06:57.35,2:06:59.97,Default,,0000,0000,0000,,and it's riddled with\Nbiases based on the data Dialogue: 0,2:06:59.97,2:07:02.30,Default,,0000,0000,0000,,that it was trained on\Nand how it was trained Dialogue: 0,2:07:02.30,2:07:03.96,Default,,0000,0000,0000,,and by the folks that trained it. Dialogue: 0,2:07:04.80,2:07:07.52,Default,,0000,0000,0000,,And although this is an\Nimportant thing to note, Dialogue: 0,2:07:07.52,2:07:08.93,Default,,0000,0000,0000,,this would really be a moot point Dialogue: 0,2:07:08.93,2:07:12.12,Default,,0000,0000,0000,,when evaluating these\Ntechnologies for deployment. 
Dialogue: 0,2:07:12.12,2:07:15.19,Default,,0000,0000,0000,,Because while a poorly\Nfunctioning system is bad Dialogue: 0,2:07:15.19,2:07:18.26,Default,,0000,0000,0000,,and introduces issues, Dialogue: 0,2:07:18.26,2:07:21.27,Default,,0000,0000,0000,,a perfectly functioning system\Nwould be much, much worse Dialogue: 0,2:07:21.27,2:07:23.10,Default,,0000,0000,0000,,and would basically signal a world Dialogue: 0,2:07:23.10,2:07:24.94,Default,,0000,0000,0000,,where we could be tracked out of hand Dialogue: 0,2:07:24.94,2:07:28.15,Default,,0000,0000,0000,,at any time that police\Nor government wanted. Dialogue: 0,2:07:28.15,2:07:32.26,Default,,0000,0000,0000,,So that's really what I wanted to say. Dialogue: 0,2:07:32.26,2:07:37.14,Default,,0000,0000,0000,,And I wanted to kind of state my support Dialogue: 0,2:07:37.14,2:07:41.49,Default,,0000,0000,0000,,for a ban on any type of\Nfacial recognition technology Dialogue: 0,2:07:41.49,2:07:44.21,Default,,0000,0000,0000,,for use by the police or the government. Dialogue: 0,2:07:44.21,2:07:47.10,Default,,0000,0000,0000,,And that we should\Ndisincentivize the research Dialogue: 0,2:07:47.10,2:07:49.57,Default,,0000,0000,0000,,and development of, send clear signal Dialogue: 0,2:07:51.26,2:07:52.73,Default,,0000,0000,0000,,to disincentivize the\Nresearch and development Dialogue: 0,2:07:52.73,2:07:55.28,Default,,0000,0000,0000,,of these types of systems in the future. Dialogue: 0,2:07:55.28,2:07:58.61,Default,,0000,0000,0000,,Thank you. Dialogue: 0,2:07:58.61,2:08:00.07,Default,,0000,0000,0000,,Thank you, it was perfect on time, Dialogue: 0,2:08:00.07,2:08:01.50,Default,,0000,0000,0000,,for the first time participating. Dialogue: 0,2:08:01.50,2:08:02.47,Default,,0000,0000,0000,,We really appreciate it. Dialogue: 0,2:08:02.47,2:08:04.81,Default,,0000,0000,0000,,I hope that this is not your last time. Dialogue: 0,2:08:04.81,2:08:07.04,Default,,0000,0000,0000,,We are dealing with a\Nlot of different issues. Dialogue: 0,2:08:07.04,2:08:09.85,Default,,0000,0000,0000,,Your voice is definitely\Nwelcome, thank you so much. Dialogue: 0,2:08:11.51,2:08:14.41,Default,,0000,0000,0000,,Up next I have, Dialogue: 0,2:08:14.41,2:08:19.41,Default,,0000,0000,0000,,I'm gonna name Will Luckman\Nfollowed by Lizette Medina, Dialogue: 0,2:08:19.53,2:08:23.15,Default,,0000,0000,0000,,and then also a Nathan\NSheard, I apologize again, Dialogue: 0,2:08:23.15,2:08:24.93,Default,,0000,0000,0000,,if I mispronounce your name, Dialogue: 0,2:08:24.93,2:08:27.36,Default,,0000,0000,0000,,feel free to just state your name again. Dialogue: 0,2:08:27.36,2:08:29.18,Default,,0000,0000,0000,,For folks who have already testified, Dialogue: 0,2:08:29.18,2:08:31.46,Default,,0000,0000,0000,,again, since we're over capacity, Dialogue: 0,2:08:31.46,2:08:34.74,Default,,0000,0000,0000,,if you sign out and then watch on YouTube Dialogue: 0,2:08:34.74,2:08:37.51,Default,,0000,0000,0000,,that allows for someone else\Nto get in line to also speak. Dialogue: 0,2:08:37.51,2:08:39.68,Default,,0000,0000,0000,,So just letting you know that Dialogue: 0,2:08:39.68,2:08:41.57,Default,,0000,0000,0000,,that's what we're trying to do. Dialogue: 0,2:08:42.46,2:08:44.23,Default,,0000,0000,0000,,So I will now turn it over. Dialogue: 0,2:08:44.23,2:08:45.87,Default,,0000,0000,0000,,I believe Will's ready to go. Dialogue: 0,2:08:45.87,2:08:47.43,Default,,0000,0000,0000,,And you have two minutes Will. Dialogue: 0,2:08:48.35,2:08:50.09,Default,,0000,0000,0000,,Thank you. good afternoon. 
Dialogue: 0,2:08:50.09,2:08:52.26,Default,,0000,0000,0000,,My name is Will Luckman\Nand I serve as an organizer Dialogue: 0,2:08:52.26,2:08:57.26,Default,,0000,0000,0000,,with the Surveillance Technology\NOversight Project or STOP. Dialogue: 0,2:08:57.72,2:08:59.94,Default,,0000,0000,0000,,STOP advocates and litigates Dialogue: 0,2:08:59.94,2:09:02.94,Default,,0000,0000,0000,,to fight discriminatory surveillance. Dialogue: 0,2:09:02.94,2:09:04.79,Default,,0000,0000,0000,,Thank you councilors for\Nholding this hearing today Dialogue: 0,2:09:04.79,2:09:07.62,Default,,0000,0000,0000,,and for proposing this crucial reform. Dialogue: 0,2:09:07.62,2:09:11.04,Default,,0000,0000,0000,,Today, my oral remarks are an\Nexcerpt of written testimony Dialogue: 0,2:09:11.04,2:09:13.11,Default,,0000,0000,0000,,being entered into the record. Dialogue: 0,2:09:13.11,2:09:16.83,Default,,0000,0000,0000,,Facial recognition is biased,\Nbroken, and when it works Dialogue: 0,2:09:16.83,2:09:19.43,Default,,0000,0000,0000,,antithetical to a democratic society. Dialogue: 0,2:09:19.43,2:09:22.18,Default,,0000,0000,0000,,And crucially, without this ban, Dialogue: 0,2:09:22.18,2:09:25.22,Default,,0000,0000,0000,,more people of color will be\Nwrongly stopped by the police Dialogue: 0,2:09:25.22,2:09:27.28,Default,,0000,0000,0000,,at a moment when the\Ndangers of police encounters Dialogue: 0,2:09:27.28,2:09:29.04,Default,,0000,0000,0000,,have never been clearer. Dialogue: 0,2:09:29.04,2:09:30.94,Default,,0000,0000,0000,,The technology that\Ndrives facial recognition Dialogue: 0,2:09:30.94,2:09:33.42,Default,,0000,0000,0000,,is far more subjective than many realize. Dialogue: 0,2:09:34.44,2:09:36.27,Default,,0000,0000,0000,,Artificial intelligence is the aggregation Dialogue: 0,2:09:36.27,2:09:40.12,Default,,0000,0000,0000,,of countless human decisions\Ncodified into algorithms. Dialogue: 0,2:09:40.12,2:09:43.94,Default,,0000,0000,0000,,And as a result, human\Nbias can affect AI systems, Dialogue: 0,2:09:43.94,2:09:46.95,Default,,0000,0000,0000,,including those that\Nsupposedly recognize faces Dialogue: 0,2:09:46.95,2:09:48.43,Default,,0000,0000,0000,,in countless ways. Dialogue: 0,2:09:48.43,2:09:51.05,Default,,0000,0000,0000,,For example, if a security camera learns Dialogue: 0,2:09:51.05,2:09:53.60,Default,,0000,0000,0000,,what suspicious looking means,\Nusing pictures of inmates, Dialogue: 0,2:09:53.60,2:09:54.96,Default,,0000,0000,0000,,the photos will teach the AI Dialogue: 0,2:09:54.96,2:09:58.81,Default,,0000,0000,0000,,to replicate the mass incarceration\Nof African-American men. Dialogue: 0,2:09:58.81,2:10:01.99,Default,,0000,0000,0000,,In this way, AI can\Nlearn to be just like us, Dialogue: 0,2:10:01.99,2:10:03.90,Default,,0000,0000,0000,,exacerbating structural discrimination Dialogue: 0,2:10:03.90,2:10:06.29,Default,,0000,0000,0000,,against marginalized communities. Dialogue: 0,2:10:06.29,2:10:07.27,Default,,0000,0000,0000,,As we've heard, Dialogue: 0,2:10:07.27,2:10:10.24,Default,,0000,0000,0000,,even if facial recognition\Nworked without errors, Dialogue: 0,2:10:10.24,2:10:11.94,Default,,0000,0000,0000,,even if it had no bias, Dialogue: 0,2:10:11.94,2:10:14.07,Default,,0000,0000,0000,,the technology would\Nstill remain antithetical Dialogue: 0,2:10:14.07,2:10:16.42,Default,,0000,0000,0000,,to everything the city\Nof Boston believes in.
Dialogue: 0,2:10:16.42,2:10:18.69,Default,,0000,0000,0000,,Facial recognition\Nmanufacturers are trying Dialogue: 0,2:10:18.69,2:10:21.07,Default,,0000,0000,0000,,to create a system that\Nallows everyone to be tracked Dialogue: 0,2:10:21.07,2:10:23.45,Default,,0000,0000,0000,,at every moment in perpetuity. Dialogue: 0,2:10:23.45,2:10:25.52,Default,,0000,0000,0000,,Go to a protest, the system knows, Dialogue: 0,2:10:25.52,2:10:28.46,Default,,0000,0000,0000,,go to a health facility,\Nit keeps a record. Dialogue: 0,2:10:28.46,2:10:31.62,Default,,0000,0000,0000,,Suddenly Bostonians lose\Nthe freedom of movement Dialogue: 0,2:10:31.62,2:10:33.30,Default,,0000,0000,0000,,that is essential to an open society. Dialogue: 0,2:10:33.30,2:10:35.23,Default,,0000,0000,0000,,If the city fails to act soon, Dialogue: 0,2:10:35.23,2:10:38.20,Default,,0000,0000,0000,,it will only become\Nharder to enact reforms. Dialogue: 0,2:10:38.20,2:10:41.26,Default,,0000,0000,0000,,Companies are pressuring local,\Nstate and federal agencies Dialogue: 0,2:10:41.26,2:10:43.04,Default,,0000,0000,0000,,to adopt facial recognition tools. Dialogue: 0,2:10:43.04,2:10:46.46,Default,,0000,0000,0000,,BriefCam, the software\Npowering Boston's surveillance Dialogue: 0,2:10:46.46,2:10:49.30,Default,,0000,0000,0000,,camera network, has released a\Nnew version of their software Dialogue: 0,2:10:49.30,2:10:50.76,Default,,0000,0000,0000,,that would easily integrate Dialogue: 0,2:10:50.76,2:10:52.90,Default,,0000,0000,0000,,invasive facial recognition tools. Dialogue: 0,2:10:52.90,2:10:54.06,Default,,0000,0000,0000,,Although the commissioner Dialogue: 0,2:10:54.06,2:10:56.70,Default,,0000,0000,0000,,says he will reevaluate\Nthe new version on offer, Dialogue: 0,2:10:56.70,2:10:58.80,Default,,0000,0000,0000,,he also expressed interest in waiting Dialogue: 0,2:10:58.80,2:11:01.39,Default,,0000,0000,0000,,until the accuracy of\Nthe technology improves Dialogue: 0,2:11:01.39,2:11:05.43,Default,,0000,0000,0000,,or making exceptions for facial\Nrecognition use right now. Dialogue: 0,2:11:05.43,2:11:08.55,Default,,0000,0000,0000,,To definitively and\Npermanently avoid the issues Dialogue: 0,2:11:08.55,2:11:11.32,Default,,0000,0000,0000,,with facial recognition\Nthat I've raised here, Dialogue: 0,2:11:11.32,2:11:13.70,Default,,0000,0000,0000,,we need a comprehensive ban now. Dialogue: 0,2:11:13.70,2:11:16.25,Default,,0000,0000,0000,,And I'll conclude on a personal note. Dialogue: 0,2:11:16.25,2:11:17.58,Default,,0000,0000,0000,,I live and work in New York City, Dialogue: 0,2:11:17.58,2:11:20.06,Default,,0000,0000,0000,,but I was born in Boston\Nand raised in Brooklyn. Dialogue: 0,2:11:20.06,2:11:23.24,Default,,0000,0000,0000,,It pains me to see the\Ncurrent wave of protests Dialogue: 0,2:11:23.24,2:11:24.07,Default,,0000,0000,0000,,roiling the area, Dialogue: 0,2:11:24.07,2:11:25.16,Default,,0000,0000,0000,,because it demonstrates that the bias Dialogue: 0,2:11:25.16,2:11:27.03,Default,,0000,0000,0000,,and unequal law enforcement practices Dialogue: 0,2:11:27.03,2:11:29.96,Default,,0000,0000,0000,,I remember from my youth\Nhave yet to be addressed. Dialogue: 0,2:11:29.96,2:11:31.84,Default,,0000,0000,0000,,I know that the people of the Commonwealth Dialogue: 0,2:11:31.84,2:11:33.22,Default,,0000,0000,0000,,want to see a change, Dialogue: 0,2:11:33.22,2:11:36.10,Default,,0000,0000,0000,,and I believe that the\Ncouncil is on their side.
Dialogue: 0,2:11:36.10,2:11:38.71,Default,,0000,0000,0000,,In practice, inaccuracies aside, Dialogue: 0,2:11:38.71,2:11:41.27,Default,,0000,0000,0000,,facial recognition systems\Nlead to increased stops Dialogue: 0,2:11:41.27,2:11:43.98,Default,,0000,0000,0000,,for people of color, increased stops means Dialogue: 0,2:11:43.98,2:11:47.03,Default,,0000,0000,0000,,an increase in opportunity\Nfor police violence and abuse. Dialogue: 0,2:11:47.03,2:11:49.13,Default,,0000,0000,0000,,We must recognize that Black Lives Matter Dialogue: 0,2:11:49.13,2:11:50.88,Default,,0000,0000,0000,,and to do so we must realize Dialogue: 0,2:11:50.88,2:11:54.02,Default,,0000,0000,0000,,that technology doesn't\Noperate in a neutral vacuum. Dialogue: 0,2:11:54.02,2:11:55.78,Default,,0000,0000,0000,,Instead it takes on the character Dialogue: 0,2:11:55.78,2:11:57.96,Default,,0000,0000,0000,,of those building and deploying it. Dialogue: 0,2:11:57.96,2:11:59.75,Default,,0000,0000,0000,,I encouraged the council to respond Dialogue: 0,2:11:59.75,2:12:02.40,Default,,0000,0000,0000,,to their constituents\Ndemands for police reform Dialogue: 0,2:12:02.40,2:12:03.91,Default,,0000,0000,0000,,by immediately banning the use Dialogue: 0,2:12:05.46,2:12:06.29,Default,,0000,0000,0000,,of this harmful technology in Boston. Dialogue: 0,2:12:07.22,2:12:09.46,Default,,0000,0000,0000,,Thank you.\NThank you very much Will. Dialogue: 0,2:12:09.46,2:12:10.43,Default,,0000,0000,0000,,Lizet Medina. Dialogue: 0,2:12:12.67,2:12:15.48,Default,,0000,0000,0000,,Hi, good afternoon. Dialogue: 0,2:12:15.48,2:12:18.85,Default,,0000,0000,0000,,My name is Lizet Medina, I'm\Nhere as a private citizen. Dialogue: 0,2:12:18.85,2:12:22.02,Default,,0000,0000,0000,,I actually am pretty new to all of this. Dialogue: 0,2:12:22.02,2:12:23.62,Default,,0000,0000,0000,,This week I've found out Dialogue: 0,2:12:23.62,2:12:26.42,Default,,0000,0000,0000,,from the Student Immigration Movement, Dialogue: 0,2:12:26.42,2:12:28.00,Default,,0000,0000,0000,,kind of what's going on. Dialogue: 0,2:12:28.00,2:12:32.21,Default,,0000,0000,0000,,And I just wanted to voice\Nmy support for this ban. Dialogue: 0,2:12:32.21,2:12:35.28,Default,,0000,0000,0000,,I think for personally, as an immigrant, Dialogue: 0,2:12:35.28,2:12:38.72,Default,,0000,0000,0000,,and speaking toward like the presence Dialogue: 0,2:12:38.72,2:12:41.34,Default,,0000,0000,0000,,of facial recognition\Ntechnology in the school system, Dialogue: 0,2:12:41.34,2:12:43.78,Default,,0000,0000,0000,,I think especially where we\Nhave so many conversations Dialogue: 0,2:12:43.78,2:12:46.62,Default,,0000,0000,0000,,with like the school to prison pipeline. Dialogue: 0,2:12:46.62,2:12:48.93,Default,,0000,0000,0000,,This tool could be used to facilitate Dialogue: 0,2:12:48.93,2:12:53.43,Default,,0000,0000,0000,,that kind of work in racially profiling Dialogue: 0,2:12:53.43,2:12:55.40,Default,,0000,0000,0000,,and profile of our students. 
Dialogue: 0,2:12:55.40,2:12:57.92,Default,,0000,0000,0000,,And I think now more than ever, Dialogue: 0,2:12:57.92,2:13:02.47,Default,,0000,0000,0000,,we have this point in\Ntime a decision to make Dialogue: 0,2:13:02.47,2:13:05.61,Default,,0000,0000,0000,,in protecting our residents of color, Dialogue: 0,2:13:05.61,2:13:08.94,Default,,0000,0000,0000,,black and brown residents and students Dialogue: 0,2:13:08.94,2:13:13.64,Default,,0000,0000,0000,,as they live in and try to\Nnavigate the various systems Dialogue: 0,2:13:13.64,2:13:15.41,Default,,0000,0000,0000,,that are already putting pressure on them. Dialogue: 0,2:13:15.41,2:13:19.66,Default,,0000,0000,0000,,And so I think, especially\Nin light of recent events, Dialogue: 0,2:13:19.66,2:13:22.53,Default,,0000,0000,0000,,this is really a stand to support Dialogue: 0,2:13:22.53,2:13:24.72,Default,,0000,0000,0000,,residents of color in Boston, Dialogue: 0,2:13:24.72,2:13:29.72,Default,,0000,0000,0000,,over possible pros that\Ncould be looked at in this. Dialogue: 0,2:13:30.50,2:13:32.72,Default,,0000,0000,0000,,And I also think it's troubling Dialogue: 0,2:13:32.72,2:13:35.19,Default,,0000,0000,0000,,that the commissioner doesn't seem Dialogue: 0,2:13:35.19,2:13:36.39,Default,,0000,0000,0000,,to be clear about What's\Nkind of going on with this. Dialogue: 0,2:13:36.39,2:13:40.96,Default,,0000,0000,0000,,In addition that it also isn't clear Dialogue: 0,2:13:40.96,2:13:45.68,Default,,0000,0000,0000,,that there is even a\Nsurveillance of whether or not, Dialogue: 0,2:13:45.68,2:13:48.03,Default,,0000,0000,0000,,this could be used already,\Nthat kind of thing. Dialogue: 0,2:13:49.16,2:13:50.35,Default,,0000,0000,0000,,So I think one, I'm just so thankful Dialogue: 0,2:13:51.26,2:13:52.82,Default,,0000,0000,0000,,that this conversation is being had, Dialogue: 0,2:13:52.82,2:13:54.28,Default,,0000,0000,0000,,that this is open to the public Dialogue: 0,2:13:54.28,2:13:56.60,Default,,0000,0000,0000,,and that I just wanted to voice my concern Dialogue: 0,2:13:56.60,2:13:59.96,Default,,0000,0000,0000,,for this being a tool for racial profiling Dialogue: 0,2:13:59.96,2:14:03.17,Default,,0000,0000,0000,,and my support in the ban. Dialogue: 0,2:14:03.17,2:14:04.44,Default,,0000,0000,0000,,Thank you. Dialogue: 0,2:14:04.44,2:14:05.27,Default,,0000,0000,0000,,Very much. Dialogue: 0,2:14:06.50,2:14:08.80,Default,,0000,0000,0000,,Next is Nathan Sheard. Dialogue: 0,2:14:08.80,2:14:11.52,Default,,0000,0000,0000,,Again I apologize if I\Nmispronounced your name. Dialogue: 0,2:14:11.52,2:14:12.97,Default,,0000,0000,0000,,You have two minutes Nathan, go ahead. Dialogue: 0,2:14:12.97,2:14:15.77,Default,,0000,0000,0000,,No, you said it perfectly, thank you. Dialogue: 0,2:14:15.77,2:14:17.27,Default,,0000,0000,0000,,So my name is Nathan Sheard Dialogue: 0,2:14:17.27,2:14:19.02,Default,,0000,0000,0000,,and thank you for allowing me to speak Dialogue: 0,2:14:20.22,2:14:21.05,Default,,0000,0000,0000,,on behalf of the Electronic\NFrontier Foundation Dialogue: 0,2:14:22.08,2:14:23.08,Default,,0000,0000,0000,,and our 30,000 members. Dialogue: 0,2:14:23.08,2:14:24.55,Default,,0000,0000,0000,,The Electronic Frontier Foundation, Dialogue: 0,2:14:24.55,2:14:25.47,Default,,0000,0000,0000,,strongly supports legislation Dialogue: 0,2:14:25.47,2:14:27.30,Default,,0000,0000,0000,,that bans government\Nagencies and employees Dialogue: 0,2:14:27.30,2:14:29.20,Default,,0000,0000,0000,,from using face surveillance technology. 
Dialogue: 0,2:14:29.20,2:14:31.70,Default,,0000,0000,0000,,We thank the sponsors of this ordinance
Dialogue: 0,2:14:31.70,2:14:33.88,Default,,0000,0000,0000,,for their attention to\Nthis critical issue.
Dialogue: 0,2:14:33.88,2:14:36.17,Default,,0000,0000,0000,,Face surveillance is profoundly\Ndangerous for many reasons.
Dialogue: 0,2:14:36.17,2:14:38.08,Default,,0000,0000,0000,,First, in tracking our faces,
Dialogue: 0,2:14:38.08,2:14:40.69,Default,,0000,0000,0000,,a unique marker that we cannot change,
Dialogue: 0,2:14:40.69,2:14:42.38,Default,,0000,0000,0000,,it invades our privacy.
Dialogue: 0,2:14:42.38,2:14:45.17,Default,,0000,0000,0000,,Second, government use of the\Ntechnology in public places
Dialogue: 0,2:14:45.17,2:14:46.66,Default,,0000,0000,0000,,will chill people from engaging
Dialogue: 0,2:14:46.66,2:14:48.40,Default,,0000,0000,0000,,in First Amendment protected activity.
Dialogue: 0,2:14:48.40,2:14:50.92,Default,,0000,0000,0000,,Research shows and courts\Nhave long recognized
Dialogue: 0,2:14:50.92,2:14:52.38,Default,,0000,0000,0000,,that government surveillance
Dialogue: 0,2:14:52.38,2:14:54.81,Default,,0000,0000,0000,,of First Amendment\Nactivity has a deterrent effect.
Dialogue: 0,2:14:54.81,2:14:56.88,Default,,0000,0000,0000,,Third, surveillance\Ntechnologies have an unfair,
Dialogue: 0,2:14:56.88,2:15:00.00,Default,,0000,0000,0000,,disparate impact against\Npeople of color, immigrants
Dialogue: 0,2:15:00.00,2:15:01.84,Default,,0000,0000,0000,,and other vulnerable populations.
Dialogue: 0,2:15:01.84,2:15:04.25,Default,,0000,0000,0000,,Watch lists are frequently over-inclusive,
Dialogue: 0,2:15:04.25,2:15:06.06,Default,,0000,0000,0000,,error-riddled and used in conjunction
Dialogue: 0,2:15:06.06,2:15:08.11,Default,,0000,0000,0000,,with powerful mathematical algorithms,
Dialogue: 0,2:15:08.11,2:15:10.36,Default,,0000,0000,0000,,which often amplify bias.
Dialogue: 0,2:15:10.36,2:15:12.40,Default,,0000,0000,0000,,Thus face surveillance is so dangerous
Dialogue: 0,2:15:12.40,2:15:15.83,Default,,0000,0000,0000,,that governments must not use it at all.
Dialogue: 0,2:15:15.83,2:15:17.45,Default,,0000,0000,0000,,We support the aims of this ordinance
Dialogue: 0,2:15:17.45,2:15:19.82,Default,,0000,0000,0000,,and respectfully seek three amendments
Dialogue: 0,2:15:19.82,2:15:22.35,Default,,0000,0000,0000,,to ensure that the bill will appropriately
Dialogue: 0,2:15:22.35,2:15:23.28,Default,,0000,0000,0000,,protect the rights of Boston residents.
Dialogue: 0,2:15:23.28,2:15:25.62,Default,,0000,0000,0000,,First, the bill has an\Nexemption for evidence
Dialogue: 0,2:15:25.62,2:15:27.05,Default,,0000,0000,0000,,generated by face surveillance
Dialogue: 0,2:15:27.05,2:15:29.43,Default,,0000,0000,0000,,that relates to the investigation of a crime.
Dialogue: 0,2:15:29.43,2:15:31.84,Default,,0000,0000,0000,,EFF respectfully asks that it be amended
Dialogue: 0,2:15:31.84,2:15:34.91,Default,,0000,0000,0000,,to prohibit the use of face\Nsurveillance-enabled evidence
Dialogue: 0,2:15:34.91,2:15:37.77,Default,,0000,0000,0000,,generated by or at the request
Dialogue: 0,2:15:37.77,2:15:40.80,Default,,0000,0000,0000,,of the Boston Police Department\Nor any Boston official.
Dialogue: 0,2:15:40.80,2:15:42.72,Default,,0000,0000,0000,,Second, the private right of action
Dialogue: 0,2:15:42.72,2:15:45.85,Default,,0000,0000,0000,,does not provide fee shifting\Nfor a prevailing plaintiff.
Dialogue: 0,2:15:45.85,2:15:47.01,Default,,0000,0000,0000,,Without such fee shifting,
Dialogue: 0,2:15:47.01,2:15:50.21,Default,,0000,0000,0000,,the only private enforcers\Nwill be advocacy organizations
Dialogue: 0,2:15:50.21,2:15:51.64,Default,,0000,0000,0000,,and wealthy individuals.
Dialogue: 0,2:15:51.64,2:15:54.96,Default,,0000,0000,0000,,Thus EFF respectfully\Nasks that language
Dialogue: 0,2:15:54.96,2:15:57.80,Default,,0000,0000,0000,,be included to award\Nreasonable attorney fees
Dialogue: 0,2:15:57.80,2:15:59.51,Default,,0000,0000,0000,,to a prevailing plaintiff.
Dialogue: 0,2:15:59.51,2:16:01.61,Default,,0000,0000,0000,,Third, the ban extends to private sector
Dialogue: 0,2:16:01.61,2:16:04.63,Default,,0000,0000,0000,,use of face surveillance conducted\Nwith a government permit.
Dialogue: 0,2:16:04.63,2:16:08.23,Default,,0000,0000,0000,,The better approach, as\NKade offered earlier,
Dialogue: 0,2:16:08.23,2:16:13.23,Default,,0000,0000,0000,,is to allow face surveillance to be used only
Dialogue: 0,2:16:16.21,2:16:18.21,Default,,0000,0000,0000,,with informed opt-in consent.
Dialogue: 0,2:16:18.21,2:16:20.32,Default,,0000,0000,0000,,Thus EFF respectfully requests
Dialogue: 0,2:16:20.32,2:16:21.73,Default,,0000,0000,0000,,that the language be limited
Dialogue: 0,2:16:21.73,2:16:24.00,Default,,0000,0000,0000,,to prohibiting private sector permitting
Dialogue: 0,2:16:24.00,2:16:25.72,Default,,0000,0000,0000,,for the use of face\Nsurveillance
Dialogue: 0,2:16:25.72,2:16:27.56,Default,,0000,0000,0000,,at the government's request.
Dialogue: 0,2:16:27.56,2:16:30.64,Default,,0000,0000,0000,,In closing, EFF once again\Nthanks the sponsors,
Dialogue: 0,2:16:30.64,2:16:32.92,Default,,0000,0000,0000,,looks forward to seeing an ordinance pass
Dialogue: 0,2:16:32.92,2:16:34.26,Default,,0000,0000,0000,,that bans government use of\Nface surveillance in Boston
Dialogue: 0,2:16:34.26,2:16:36.74,Default,,0000,0000,0000,,and will enthusiastically\Nsupport the ordinance banning
Dialogue: 0,2:16:36.74,2:16:38.78,Default,,0000,0000,0000,,face recognition technology in Boston,
Dialogue: 0,2:16:38.78,2:16:40.88,Default,,0000,0000,0000,,if it is amended in these ways, thank you.
Dialogue: 0,2:16:41.83,2:16:43.70,Default,,0000,0000,0000,,Nathan, I just wanna make sure I took notes
Dialogue: 0,2:16:43.70,2:16:44.70,Default,,0000,0000,0000,,on your suggestions.
Dialogue: 0,2:16:46.16,2:16:47.25,Default,,0000,0000,0000,,And I wanna make sure you had three,
Dialogue: 0,2:16:47.25,2:16:49.42,Default,,0000,0000,0000,,which is the ban on use by the BPD,
Dialogue: 0,2:16:49.42,2:16:52.86,Default,,0000,0000,0000,,no exemptions, period.
Dialogue: 0,2:16:52.86,2:16:56.63,Default,,0000,0000,0000,,Two, that currently because [indistinct]
Dialogue: 0,2:16:56.63,2:16:58.66,Default,,0000,0000,0000,,doesn't allow for fee\Nshifting or attorney fees,
Dialogue: 0,2:16:58.66,2:17:01.00,Default,,0000,0000,0000,,it really does put it\Non either individuals
Dialogue: 0,2:17:01.00,2:17:04.28,Default,,0000,0000,0000,,to take it on their own\Nwithout having any kind of,
Dialogue: 0,2:17:04.28,2:17:08.71,Default,,0000,0000,0000,,I guess partner, care or\Nsupport that helps it be funded.
Dialogue: 0,2:17:08.71,2:17:11.99,Default,,0000,0000,0000,,And then three, to extend\Nto private actors,
Dialogue: 0,2:17:11.99,2:17:14.25,Default,,0000,0000,0000,,essentially down the supply chain
Dialogue: 0,2:17:14.25,2:17:15.66,Default,,0000,0000,0000,,where the city of Boston\Ncan tell private actors
Dialogue: 0,2:17:15.66,2:17:17.71,Default,,0000,0000,0000,,that contract with us.
Dialogue: 0,2:17:17.71,2:17:21.01,Default,,0000,0000,0000,,Because one of the reasons\NI am always saying this,
Dialogue: 0,2:17:21.01,2:17:23.56,Default,,0000,0000,0000,,one of the issues is how far we can go.
Dialogue: 0,2:17:23.56,2:17:25.13,Default,,0000,0000,0000,,I mean, we can't regulate Facebook,
Dialogue: 0,2:17:25.13,2:17:27.22,Default,,0000,0000,0000,,but we can regulate where money goes.
Dialogue: 0,2:17:27.22,2:17:28.90,Default,,0000,0000,0000,,So I'm assuming spot that.
Dialogue: 0,2:17:29.93,2:17:32.14,Default,,0000,0000,0000,,Yeah, so just to reiterate,
Dialogue: 0,2:17:32.14,2:17:35.08,Default,,0000,0000,0000,,so the first amendment\Nthat we would suggest
Dialogue: 0,2:17:35.08,2:17:37.39,Default,,0000,0000,0000,,is that right now,
Dialogue: 0,2:17:38.36,2:17:40.21,Default,,0000,0000,0000,,there's an exclusion\Nfor evidence
Dialogue: 0,2:17:40.21,2:17:41.94,Default,,0000,0000,0000,,and Kade spoke earlier
Dialogue: 0,2:17:41.94,2:17:43.60,Default,,0000,0000,0000,,to the fact that that's in reference to,
Dialogue: 0,2:17:43.60,2:17:45.15,Default,,0000,0000,0000,,if they receive something\Nfrom the federal government
Dialogue: 0,2:17:45.15,2:17:46.63,Default,,0000,0000,0000,,or one to post or what have you.
Dialogue: 0,2:17:46.63,2:17:48.97,Default,,0000,0000,0000,,But to make sure that it's amended
Dialogue: 0,2:17:48.97,2:17:50.87,Default,,0000,0000,0000,,to tighten that up a bit and say that,
Dialogue: 0,2:17:50.87,2:17:54.50,Default,,0000,0000,0000,,as long as that\Ninformation isn't requested
Dialogue: 0,2:17:54.50,2:17:56.78,Default,,0000,0000,0000,,or created at the request\Nof the Boston police force,
Dialogue: 0,2:17:56.78,2:17:57.83,Default,,0000,0000,0000,,that's the first one.
Dialogue: 0,2:17:58.100,2:17:59.99,Default,,0000,0000,0000,,The second one is the fee shifting.
Dialogue: 0,2:18:00.91,2:18:01.97,Default,,0000,0000,0000,,So that it's not just the EFF and the ACLU
Dialogue: 0,2:18:01.97,2:18:04.59,Default,,0000,0000,0000,,and wealthy individuals\Nthat have the power
Dialogue: 0,2:18:04.59,2:18:06.70,Default,,0000,0000,0000,,to make sure that it's enforced.
Dialogue: 0,2:18:06.70,2:18:08.81,Default,,0000,0000,0000,,So folks can find\Nattorneys to support them
Dialogue: 0,2:18:08.81,2:18:12.35,Default,,0000,0000,0000,,because the attorneys will\Nbe compensated for that time.
Dialogue: 0,2:18:12.35,2:18:14.05,Default,,0000,0000,0000,,And finally, on the third.
Dialogue: 0,2:18:14.05,2:18:16.45,Default,,0000,0000,0000,,And I think that this speaks\Nto your greater question,
Dialogue: 0,2:18:16.45,2:18:20.28,Default,,0000,0000,0000,,right now it prohibits\Nthe offering of a permit
Dialogue: 0,2:18:20.28,2:18:25.10,Default,,0000,0000,0000,,to a private company, agency,\Nperson, what have you,
Dialogue: 0,2:18:25.10,2:18:27.27,Default,,0000,0000,0000,,whereas we would tighten it up to say
Dialogue: 0,2:18:27.27,2:18:29.15,Default,,0000,0000,0000,,that a permit could be provided
Dialogue: 0,2:18:29.15,2:18:31.45,Default,,0000,0000,0000,,as long as it wasn't being provided
Dialogue: 0,2:18:31.45,2:18:34.43,Default,,0000,0000,0000,,for face surveillance to\Nbe collected at the behest
Dialogue: 0,2:18:34.43,2:18:38.09,Default,,0000,0000,0000,,or at the request of the\NBoston Police Department.
Dialogue: 0,2:18:38.09,2:18:38.99,Default,,0000,0000,0000,,So really making sure
Dialogue: 0,2:18:38.99,2:18:41.53,Default,,0000,0000,0000,,that we're speaking back to the fact
Dialogue: 0,2:18:41.53,2:18:44.34,Default,,0000,0000,0000,,that like private use\Nwould be better protected
Dialogue: 0,2:18:44.34,2:18:46.05,Default,,0000,0000,0000,,by something similar to
Dialogue: 0,2:18:46.05,2:18:48.47,Default,,0000,0000,0000,,the Biometric Information\NPrivacy Act in Illinois
Dialogue: 0,2:18:48.47,2:18:52.26,Default,,0000,0000,0000,,that requires informed,\Nopt-in consent
Dialogue: 0,2:18:52.26,2:18:53.38,Default,,0000,0000,0000,,from the individual,
Dialogue: 0,2:18:53.38,2:18:55.14,Default,,0000,0000,0000,,rather than simply saying
Dialogue: 0,2:18:55.14,2:18:56.64,Default,,0000,0000,0000,,that no private actor can use face surveillance.
Dialogue: 0,2:18:58.40,2:18:59.61,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,2:18:59.61,2:19:01.42,Default,,0000,0000,0000,,That was a great clarification.
Dialogue: 0,2:19:01.42,2:19:03.58,Default,,0000,0000,0000,,I'm gonna go now to our next speakers.
Dialogue: 0,2:19:03.58,2:19:05.49,Default,,0000,0000,0000,,Thank you very much.
Dialogue: 0,2:19:05.49,2:19:10.49,Default,,0000,0000,0000,,I have a Sabrina Barroso,\NElmeda and Christina Vasquez.
Dialogue: 0,2:19:13.29,2:19:17.26,Default,,0000,0000,0000,,So if we have.
Dialogue: 0,2:19:18.47,2:19:20.97,Default,,0000,0000,0000,,Hi folks, thank you so much.
Dialogue: 0,2:19:20.97,2:19:23.58,Default,,0000,0000,0000,,Just to let folks know,\NChristina is not with us.
Dialogue: 0,2:19:23.58,2:19:26.61,Default,,0000,0000,0000,,Some folks had to leave\Nbecause they have to work,
Dialogue: 0,2:19:26.61,2:19:29.12,Default,,0000,0000,0000,,but so hi everyone, hi city councilors.
Dialogue: 0,2:19:29.12,2:19:30.69,Default,,0000,0000,0000,,My name is Sabrina Barroso.
Dialogue: 0,2:19:30.69,2:19:33.87,Default,,0000,0000,0000,,I am the lead organizer for\Nthe Student Immigrant Movement.
Dialogue: 0,2:19:33.87,2:19:36.87,Default,,0000,0000,0000,,The first undocumented\Nimmigrant youth-led organization
Dialogue: 0,2:19:36.87,2:19:38.57,Default,,0000,0000,0000,,in the state of Massachusetts.
Dialogue: 0,2:19:38.57,2:19:40.92,Default,,0000,0000,0000,,We believe that all\Nyoung people are powerful
Dialogue: 0,2:19:40.92,2:19:42.92,Default,,0000,0000,0000,,and that when they are\Nsupported and have space
Dialogue: 0,2:19:42.92,2:19:43.90,Default,,0000,0000,0000,,and investment to grow,
Dialogue: 0,2:19:43.90,2:19:46.12,Default,,0000,0000,0000,,they flourish into the\Nleaders of today and tomorrow.
Dialogue: 0,2:19:46.12,2:19:48.80,Default,,0000,0000,0000,,At SIM, we are invested in protecting
Dialogue: 0,2:19:48.80,2:19:50.93,Default,,0000,0000,0000,,and working on behalf of\Nthe undocumented youth
Dialogue: 0,2:19:50.93,2:19:53.02,Default,,0000,0000,0000,,and families of Massachusetts.
Dialogue: 0,2:19:53.02,2:19:55.14,Default,,0000,0000,0000,,That's why I'm here with you all today
Dialogue: 0,2:19:55.14,2:19:56.15,Default,,0000,0000,0000,,and I would like to make clear
Dialogue: 0,2:19:56.15,2:19:59.48,Default,,0000,0000,0000,,that we must prohibit the\Nuse of facial recognition
Dialogue: 0,2:19:59.48,2:20:01.67,Default,,0000,0000,0000,,technology and software\Nin the city of Boston.
Dialogue: 0,2:20:01.67,2:20:03.74,Default,,0000,0000,0000,,For far too long the\NBoston Police Department
Dialogue: 0,2:20:03.74,2:20:06.88,Default,,0000,0000,0000,,has been allowed to have full flexibility
Dialogue: 0,2:20:06.88,2:20:10.03,Default,,0000,0000,0000,,and free will over how they police\Nand surveil our communities
Dialogue: 0,2:20:10.03,2:20:11.86,Default,,0000,0000,0000,,and the consequences are devastating.
Dialogue: 0,2:20:11.86,2:20:15.29,Default,,0000,0000,0000,,Our people are being criminalized\Nand funneled into prisons,
Dialogue: 0,2:20:15.29,2:20:17.33,Default,,0000,0000,0000,,detention and deportation.
Dialogue: 0,2:20:17.33,2:20:19.63,Default,,0000,0000,0000,,We need full community\Ncontrol over the police.
Dialogue: 0,2:20:19.63,2:20:22.26,Default,,0000,0000,0000,,And this is a key move to\Nbringing power to our people.
Dialogue: 0,2:20:22.26,2:20:24.42,Default,,0000,0000,0000,,Right now, there are no laws
Dialogue: 0,2:20:24.42,2:20:27.00,Default,,0000,0000,0000,,that control what the BPD\Ncan purchase for surveillance
Dialogue: 0,2:20:27.00,2:20:29.30,Default,,0000,0000,0000,,or how they use it and\Nwhen I learned this,
Dialogue: 0,2:20:29.30,2:20:31.73,Default,,0000,0000,0000,,I was scared because I\Nthought about the history
Dialogue: 0,2:20:31.73,2:20:34.28,Default,,0000,0000,0000,,between the BPD and families like mine.
Dialogue: 0,2:20:34.28,2:20:36.22,Default,,0000,0000,0000,,And I thought about the people that I love
Dialogue: 0,2:20:36.22,2:20:39.30,Default,,0000,0000,0000,,and the people I care about.
Dialogue: 0,2:20:39.30,2:20:41.59,Default,,0000,0000,0000,,And I thought about the people\Nwho are constantly harassed
Dialogue: 0,2:20:41.59,2:20:44.66,Default,,0000,0000,0000,,and targeted by law enforcement\Nsimply for existing.
Dialogue: 0,2:20:44.66,2:20:45.64,Default,,0000,0000,0000,,So if you ask me,
Dialogue: 0,2:20:45.64,2:20:47.73,Default,,0000,0000,0000,,facial recognition is not to be trusted
Dialogue: 0,2:20:48.76,2:20:50.32,Default,,0000,0000,0000,,in the hands of law enforcement
Dialogue: 0,2:20:50.32,2:20:51.95,Default,,0000,0000,0000,,that tracks, monitors and hyper-surveils
Dialogue: 0,2:20:51.95,2:20:53.49,Default,,0000,0000,0000,,black and brown communities,
Dialogue: 0,2:20:53.49,2:20:55.64,Default,,0000,0000,0000,,and that puts immigrants, activists
Dialogue: 0,2:20:55.64,2:20:59.90,Default,,0000,0000,0000,,and youth under constant\Nscrutiny and criminalization.
Dialogue: 0,2:20:59.90,2:21:02.83,Default,,0000,0000,0000,,Councilors, would you really\Ntrust a police department
Dialogue: 0,2:21:02.83,2:21:04.72,Default,,0000,0000,0000,,that shares information with I.C.E.?
Dialogue: 0,2:21:04.72,2:21:07.89,Default,,0000,0000,0000,,That shares student\Ninformation with I.C.E. as well?
Dialogue: 0,2:21:07.89,2:21:10.22,Default,,0000,0000,0000,,That has led to the\Ndeportation of students,
Dialogue: 0,2:21:10.22,2:21:13.90,Default,,0000,0000,0000,,that exchanged emails with I.C.E.\Nsaying "happy hunting."
Dialogue: 0,2:21:13.90,2:21:16.06,Default,,0000,0000,0000,,And that has historically targeted
Dialogue: 0,2:21:16.06,2:21:19.13,Default,,0000,0000,0000,,black and brown youth to\Ninvade their personal space,
Dialogue: 0,2:21:19.13,2:21:22.15,Default,,0000,0000,0000,,their bodies and their\Nprivacy with stop and frisk.
Dialogue: 0,2:21:22.15,2:21:24.95,Default,,0000,0000,0000,,They have no shame in separating youth
Dialogue: 0,2:21:24.95,2:21:27.16,Default,,0000,0000,0000,,and their families\Nfrom their communities
Dialogue: 0,2:21:27.16,2:21:30.48,Default,,0000,0000,0000,,and then end up pinning\Ninjustices and crimes on us.
Dialogue: 0,2:21:30.48,2:21:33.98,Default,,0000,0000,0000,,The BPD says that they don't\Nuse facial surveillance now,
Dialogue: 0,2:21:33.98,2:21:35.65,Default,,0000,0000,0000,,but they have access to it.
Dialogue: 0,2:21:35.65,2:21:39.06,Default,,0000,0000,0000,,And they have a $414\Nmillion budget to buy it.
Dialogue: 0,2:21:39.06,2:21:41.39,Default,,0000,0000,0000,,Right now, our communities need access
Dialogue: 0,2:21:41.39,2:21:43.98,Default,,0000,0000,0000,,to funding for healthcare,\Nhousing and so many other things
Dialogue: 0,2:21:43.98,2:21:45.34,Default,,0000,0000,0000,,and the list goes on.
Dialogue: 0,2:21:45.34,2:21:47.55,Default,,0000,0000,0000,,For years, residents of\NBoston have been demanding
Dialogue: 0,2:21:47.55,2:21:49.41,Default,,0000,0000,0000,,to fund things that truly matter to them
Dialogue: 0,2:21:49.41,2:21:51.30,Default,,0000,0000,0000,,and we have to invest in things
Dialogue: 0,2:21:51.30,2:21:53.81,Default,,0000,0000,0000,,that are not face surveillance\Nthat doesn't even work
Dialogue: 0,2:21:53.81,2:21:56.21,Default,,0000,0000,0000,,and is irrelevant to\Nkeeping our community safe.
Dialogue: 0,2:21:56.21,2:21:59.80,Default,,0000,0000,0000,,We must invest in intentional\Nrestorative justice practices,
Dialogue: 0,2:21:59.80,2:22:02.52,Default,,0000,0000,0000,,healthcare, mental health and resources
Dialogue: 0,2:22:02.52,2:22:04.58,Default,,0000,0000,0000,,to prevent violence and distress.
Dialogue: 0,2:22:04.58,2:22:07.01,Default,,0000,0000,0000,,Facial surveillance would\Nend privacy as we know it
Dialogue: 0,2:22:07.01,2:22:09.49,Default,,0000,0000,0000,,and completely throw off the balance
Dialogue: 0,2:22:09.49,2:22:11.16,Default,,0000,0000,0000,,of power between people\Nand their government.
Dialogue: 0,2:22:11.16,2:22:13.40,Default,,0000,0000,0000,,Listen to youth and\Nlisten to our community,
Dialogue: 0,2:22:13.40,2:22:16.32,Default,,0000,0000,0000,,and let's follow the lead of young people
Dialogue: 0,2:22:16.32,2:22:18.70,Default,,0000,0000,0000,,who are addressing the deeper issue
Dialogue: 0,2:22:18.70,2:22:20.51,Default,,0000,0000,0000,,of what it is to criminalize our people
Dialogue: 0,2:22:20.51,2:22:23.18,Default,,0000,0000,0000,,and putting the power in\Nthe hands of the people,
Dialogue: 0,2:22:23.18,2:22:26.18,Default,,0000,0000,0000,,especially when it comes\Nto governing surveillance.
Dialogue: 0,2:22:26.18,2:22:27.76,Default,,0000,0000,0000,,Thank you all so much.
Dialogue: 0,2:22:27.76,2:22:28.92,Default,,0000,0000,0000,,Thank you very much
Dialogue: 0,2:22:28.92,2:22:30.72,Default,,0000,0000,0000,,and it sounds like Christina's\Nnot available, Sabrina.
Dialogue: 0,2:22:30.72,2:22:33.53,Default,,0000,0000,0000,,Yeah, she's not available.
Dialogue: 0,2:22:33.53,2:22:38.53,Default,,0000,0000,0000,,Okay, Ismleda.
Dialogue: 0,2:22:41.62,2:22:46.62,Default,,0000,0000,0000,,I'm Ismleda and a member of\Nthe Student Immigrant Movement.
Dialogue: 0,2:22:48.89,2:22:50.64,Default,,0000,0000,0000,,Today I'm here to testify in support
Dialogue: 0,2:22:50.64,2:22:53.26,Default,,0000,0000,0000,,of banning facial recognition\Ntechnology in Boston.
Dialogue: 0,2:22:53.26,2:22:56.20,Default,,0000,0000,0000,,This will establish a\Nban on government use
Dialogue: 0,2:22:56.20,2:22:58.37,Default,,0000,0000,0000,,of face surveillance\Nin the city of Boston.
Dialogue: 0,2:22:58.37,2:23:01.95,Default,,0000,0000,0000,,Boston must pass this\Nordinance to join Springfield,
Dialogue: 0,2:23:01.95,2:23:04.88,Default,,0000,0000,0000,,Somerville, Cambridge,\NBrookline and Northampton
Dialogue: 0,2:23:04.88,2:23:07.45,Default,,0000,0000,0000,,in protecting racial justice\Nand freedom of speech.
Dialogue: 0,2:23:07.45,2:23:10.100,Default,,0000,0000,0000,,The federal study\Npublished in December 2019,
Dialogue: 0,2:23:10.100,2:23:13.36,Default,,0000,0000,0000,,found that face recognition algorithms
Dialogue: 0,2:23:13.36,2:23:15.41,Default,,0000,0000,0000,,are much more likely to fail
Dialogue: 0,2:23:15.41,2:23:17.74,Default,,0000,0000,0000,,when attempting to identify\Nthe faces of people of color,
Dialogue: 0,2:23:17.74,2:23:20.40,Default,,0000,0000,0000,,children, the elderly and women.
Dialogue: 0,2:23:20.40,2:23:22.92,Default,,0000,0000,0000,,That means the technology\Nonly reliably works
Dialogue: 0,2:23:22.92,2:23:26.76,Default,,0000,0000,0000,,on middle-aged white men,
Dialogue: 0,2:23:26.76,2:23:29.80,Default,,0000,0000,0000,,a very small fraction\Nof Boston's residents.
Dialogue: 0,2:23:31.08,2:23:33.06,Default,,0000,0000,0000,,As a citizen of Boston,
Dialogue: 0,2:23:33.06,2:23:35.26,Default,,0000,0000,0000,,it is clear to me that\Nthis type of technology
Dialogue: 0,2:23:35.26,2:23:37.23,Default,,0000,0000,0000,,is not okay to use in our communities.
Dialogue: 0,2:23:37.23,2:23:39.27,Default,,0000,0000,0000,,The fact that this technology
Dialogue: 0,2:23:39.27,2:23:42.84,Default,,0000,0000,0000,,is used to control specifically\Nin communities of color,
Dialogue: 0,2:23:42.84,2:23:45.60,Default,,0000,0000,0000,,makes my family and me feel unsafe,
Dialogue: 0,2:23:45.60,2:23:46.83,Default,,0000,0000,0000,,as well as the other families
Dialogue: 0,2:23:46.83,2:23:48.63,Default,,0000,0000,0000,,that live within these communities.
Dialogue: 0,2:23:48.63,2:23:51.85,Default,,0000,0000,0000,,Black and brown kids\Nare put into databases
Dialogue: 0,2:23:51.85,2:23:55.29,Default,,0000,0000,0000,,that are used against them\Nwithout redress or reason.
Dialogue: 0,2:23:55.29,2:23:57.57,Default,,0000,0000,0000,,This interrupts our education\Nand limits our future.
Dialogue: 0,2:23:57.57,2:23:59.87,Default,,0000,0000,0000,,Kids shouldn't be criminalized\Nor feel unsafe.
Dialogue: 0,2:23:59.87,2:24:03.12,Default,,0000,0000,0000,,Black and brown kids should\Nbe able to walk on the street
Dialogue: 0,2:24:03.12,2:24:07.25,Default,,0000,0000,0000,,without the fear of knowing\Nwhat's going to happen.
Dialogue: 0,2:24:07.25,2:24:09.87,Default,,0000,0000,0000,,And it shows how this surveillance
Dialogue: 0,2:24:09.87,2:24:12.39,Default,,0000,0000,0000,,is used to target black\Nand brown communities.
Dialogue: 0,2:24:12.39,2:24:14.53,Default,,0000,0000,0000,,The racial profiling in technology
Dialogue: 0,2:24:14.53,2:24:17.43,Default,,0000,0000,0000,,is one of the many forms\Nof systematic racism.
Dialogue: 0,2:24:17.43,2:24:20.73,Default,,0000,0000,0000,,Immigrant youth in this\Ncity do not trust the police
Dialogue: 0,2:24:20.73,2:24:23.49,Default,,0000,0000,0000,,because we know that they\Nare very close with I.C.E.
Dialogue: 0,2:24:23.49,2:24:26.51,Default,,0000,0000,0000,,We know that they send\Nstudent information to I.C.E.,
Dialogue: 0,2:24:26.51,2:24:30.32,Default,,0000,0000,0000,,and this has led to the\Ndeportation of young people
Dialogue: 0,2:24:30.32,2:24:32.56,Default,,0000,0000,0000,,and pains our communities.
Dialogue: 0,2:24:32.56,2:24:34.98,Default,,0000,0000,0000,,[indistinct]
Dialogue: 0,2:24:40.32,2:24:45.32,Default,,0000,0000,0000,,BPD, cameras with surveillance\Nand criminalization of us.
Dialogue: 0,2:24:45.45,2:24:46.70,Default,,0000,0000,0000,,It's not right.
Dialogue: 0,2:24:46.70,2:24:48.29,Default,,0000,0000,0000,,Everyone on the council at some point
Dialogue: 0,2:24:48.29,2:24:50.44,Default,,0000,0000,0000,,has talked about supporting young people
Dialogue: 0,2:24:52.07,2:24:53.08,Default,,0000,0000,0000,,and young people being our future leaders.
Dialogue: 0,2:24:53.08,2:24:54.06,Default,,0000,0000,0000,,We are leaders right now,
Dialogue: 0,2:24:54.06,2:24:57.64,Default,,0000,0000,0000,,we are telling you to do\Nwhat everyone knows is right.
Dialogue: 0,2:24:57.64,2:25:00.25,Default,,0000,0000,0000,,By preventing the use of facial\Nrecognition surveillance,
Dialogue: 0,2:25:00.25,2:25:03.49,Default,,0000,0000,0000,,we can do more to protect\Nimmigrant families in Boston.
Dialogue: 0,2:25:03.49,2:25:05.08,Default,,0000,0000,0000,,Stop selling youth and families out.
Dialogue: 0,2:25:05.08,2:25:07.13,Default,,0000,0000,0000,,Let's give power to people
Dialogue: 0,2:25:07.13,2:25:09.13,Default,,0000,0000,0000,,and take this first step in\Naddressing the problem we have
Dialogue: 0,2:25:09.13,2:25:10.46,Default,,0000,0000,0000,,with law enforcement surveillance
Dialogue: 0,2:25:10.46,2:25:12.76,Default,,0000,0000,0000,,and criminalization of people\Nby surveillance systems.
Dialogue: 0,2:25:12.76,2:25:13.94,Default,,0000,0000,0000,,Thank you very much.
Dialogue: 0,2:25:15.15,2:25:17.35,Default,,0000,0000,0000,,Next up, the next group\Nof folks we have are,
Dialogue: 0,2:25:17.35,2:25:22.35,Default,,0000,0000,0000,,Sherlandy Pardieu, Eli Harmon,\NNatalie Diaz and Jose Gomez.
Dialogue: 0,2:25:27.90,2:25:30.08,Default,,0000,0000,0000,,So is Sherlandy available?
Dialogue: 0,2:25:32.38,2:25:36.24,Default,,0000,0000,0000,,I see that person in\Nthe waiting room, sorry.
Dialogue: 0,2:25:47.37,2:25:50.29,Default,,0000,0000,0000,,Well, we'll go ahead and go to Eli,
Dialogue: 0,2:25:53.00,2:25:55.36,Default,,0000,0000,0000,,who I think I see right now, ready to go
Dialogue: 0,2:25:55.36,2:25:58.99,Default,,0000,0000,0000,,and then we'll come back to Sherlandy,
Dialogue: 0,2:26:00.20,2:26:02.12,Default,,0000,0000,0000,,Eli, you have two minutes.
Dialogue: 0,2:26:02.12,2:26:03.50,Default,,0000,0000,0000,,Oh, thank you so much.
Dialogue: 0,2:26:03.50,2:26:04.55,Default,,0000,0000,0000,,Oh, who's speaking?
Dialogue: 0,2:26:05.56,2:26:06.98,Default,,0000,0000,0000,,Oh, sorry, this is Eli.
Dialogue: 0,2:26:06.98,2:26:08.29,Default,,0000,0000,0000,,Okay, very well.
Dialogue: 0,2:26:08.29,2:26:09.31,Default,,0000,0000,0000,,Thank you so much.
Dialogue: 0,2:26:09.31,2:26:11.64,Default,,0000,0000,0000,,My name is Eli Harmon,\NI live in Mission Hill. Dialogue: 0,2:26:11.64,2:26:15.35,Default,,0000,0000,0000,,I've been doing work with the\NStudent Immigrant Movement. Dialogue: 0,2:26:15.35,2:26:18.16,Default,,0000,0000,0000,,I'm testifying on behalf of them today, Dialogue: 0,2:26:18.16,2:26:20.01,Default,,0000,0000,0000,,in support of the ordinance. Dialogue: 0,2:26:20.01,2:26:21.25,Default,,0000,0000,0000,,The issue of police surveillance Dialogue: 0,2:26:21.25,2:26:23.00,Default,,0000,0000,0000,,is enormously important to people Dialogue: 0,2:26:23.00,2:26:24.76,Default,,0000,0000,0000,,all across the city of Boston Dialogue: 0,2:26:24.76,2:26:27.72,Default,,0000,0000,0000,,though it's of particular\Nconcern to communities of color, Dialogue: 0,2:26:27.72,2:26:29.71,Default,,0000,0000,0000,,which it overwhelmingly targets. Dialogue: 0,2:26:29.71,2:26:31.47,Default,,0000,0000,0000,,A study done by MIT research Dialogue: 0,2:26:31.47,2:26:33.25,Default,,0000,0000,0000,,showed this type of\Nsurveillance technology Dialogue: 0,2:26:33.25,2:26:35.37,Default,,0000,0000,0000,,would be far less accurate\Nfor people of color. Dialogue: 0,2:26:35.37,2:26:39.68,Default,,0000,0000,0000,,And in fact, it was inaccurate\Nfor 35% of black women Dialogue: 0,2:26:39.68,2:26:43.16,Default,,0000,0000,0000,,and was most accurate for white adult men. Dialogue: 0,2:26:43.16,2:26:45.84,Default,,0000,0000,0000,,Law enforcement today, already of course, Dialogue: 0,2:26:45.84,2:26:48.96,Default,,0000,0000,0000,,overwhelmingly targets\Nblack and brown communities. Dialogue: 0,2:26:48.96,2:26:50.01,Default,,0000,0000,0000,,And this type of technology Dialogue: 0,2:26:50.01,2:26:52.58,Default,,0000,0000,0000,,is likely to only make\Nthis more severe case. Dialogue: 0,2:26:52.58,2:26:54.57,Default,,0000,0000,0000,,Additionally this type of technology Dialogue: 0,2:26:54.57,2:26:56.94,Default,,0000,0000,0000,,is a complete violation\Nof people's privacy. Dialogue: 0,2:26:56.94,2:26:58.73,Default,,0000,0000,0000,,There are no regulations with regard Dialogue: 0,2:26:58.73,2:27:00.15,Default,,0000,0000,0000,,to the extent to which law enforcement Dialogue: 0,2:27:00.15,2:27:01.60,Default,,0000,0000,0000,,can use this technology. Dialogue: 0,2:27:01.60,2:27:03.53,Default,,0000,0000,0000,,And it has been used as an abuse of power Dialogue: 0,2:27:03.53,2:27:05.68,Default,,0000,0000,0000,,by government officials in\Norder to track down immigrants, Dialogue: 0,2:27:05.68,2:27:07.90,Default,,0000,0000,0000,,people of color and activists. 
Dialogue: 0,2:27:07.90,2:27:09.54,Default,,0000,0000,0000,,Finally, lots of money goes into funding Dialogue: 0,2:27:09.54,2:27:10.70,Default,,0000,0000,0000,,this surveillance technology, Dialogue: 0,2:27:10.70,2:27:12.43,Default,,0000,0000,0000,,and for the reasons previously stated Dialogue: 0,2:27:12.43,2:27:15.68,Default,,0000,0000,0000,,this technology in no way\Nkeeps our communities safe Dialogue: 0,2:27:15.68,2:27:17.37,Default,,0000,0000,0000,,and the money that currently funds it, Dialogue: 0,2:27:17.37,2:27:19.03,Default,,0000,0000,0000,,could be put into much better places, Dialogue: 0,2:27:19.03,2:27:21.03,Default,,0000,0000,0000,,such as creating\Naffordable housing for all, Dialogue: 0,2:27:21.03,2:27:23.54,Default,,0000,0000,0000,,fully funding the Boston public schools, Dialogue: 0,2:27:23.54,2:27:27.06,Default,,0000,0000,0000,,creating equitable healthcare Dialogue: 0,2:27:27.06,2:27:29.55,Default,,0000,0000,0000,,and other places which\Nwould actually be effective Dialogue: 0,2:27:29.55,2:27:31.00,Default,,0000,0000,0000,,in bringing down crime rates. Dialogue: 0,2:27:32.30,2:27:34.72,Default,,0000,0000,0000,,Many cities that neighbor\NBoston, including Somerville, Dialogue: 0,2:27:34.72,2:27:36.73,Default,,0000,0000,0000,,Brooklyn, Cambridge and Springfield, Dialogue: 0,2:27:36.73,2:27:39.06,Default,,0000,0000,0000,,have chosen to ban facial surveillance Dialogue: 0,2:27:39.06,2:27:41.24,Default,,0000,0000,0000,,because they understand\Nthat it is dangerous Dialogue: 0,2:27:41.24,2:27:43.03,Default,,0000,0000,0000,,and harmful to communities of color. Dialogue: 0,2:27:43.03,2:27:44.41,Default,,0000,0000,0000,,It is a violation of people's privacy Dialogue: 0,2:27:44.41,2:27:48.24,Default,,0000,0000,0000,,and that money currently being\Nused to fund that technology Dialogue: 0,2:27:48.24,2:27:50.06,Default,,0000,0000,0000,,would be better spent in places Dialogue: 0,2:27:50.06,2:27:52.42,Default,,0000,0000,0000,,that actually improve lives in Boston. Dialogue: 0,2:27:52.42,2:27:53.91,Default,,0000,0000,0000,,For these reasons, Dialogue: 0,2:27:53.91,2:27:55.07,Default,,0000,0000,0000,,I asked the council to pass a prohibition Dialogue: 0,2:27:55.07,2:27:57.60,Default,,0000,0000,0000,,of the use of facial\Nrecognition surveillance Dialogue: 0,2:27:57.60,2:27:59.13,Default,,0000,0000,0000,,and give the community control Dialogue: 0,2:27:59.13,2:28:00.91,Default,,0000,0000,0000,,over surveillance technology and usage. Dialogue: 0,2:28:00.91,2:28:02.34,Default,,0000,0000,0000,,Thank you so much. Dialogue: 0,2:28:02.34,2:28:04.24,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,2:28:04.24,2:28:06.47,Default,,0000,0000,0000,,I do see Sherlandy Dialogue: 0,2:28:08.87,2:28:10.47,Default,,0000,0000,0000,,I'm here. Dialogue: 0,2:28:10.47,2:28:12.55,Default,,0000,0000,0000,,Okay, you have two minutes Sherlandy. Dialogue: 0,2:28:19.54,2:28:22.69,Default,,0000,0000,0000,,I am writing on behalf of SIM\Nin support of the ordinance Dialogue: 0,2:28:22.69,2:28:25.49,Default,,0000,0000,0000,,banning facial recognition\Ntechnology in Boston. 
Dialogue: 0,2:28:25.49,2:28:28.28,Default,,0000,0000,0000,,I urge my city of Boston\Nto pass this ordinance
Dialogue: 0,2:28:28.28,2:28:30.25,Default,,0000,0000,0000,,to join other cities in Massachusetts,
Dialogue: 0,2:28:30.25,2:28:33.27,Default,,0000,0000,0000,,like Springfield, Somerville,\NCambridge, Brookline
Dialogue: 0,2:28:33.27,2:28:36.37,Default,,0000,0000,0000,,and Northampton in putting\Ncommunity power
Dialogue: 0,2:28:36.37,2:28:37.84,Default,,0000,0000,0000,,before the police.
Dialogue: 0,2:28:37.84,2:28:40.37,Default,,0000,0000,0000,,The city of Boston,\Nthe residents, families
Dialogue: 0,2:28:40.37,2:28:42.51,Default,,0000,0000,0000,,and youth from Boston,\Ndeserve transparency,
Dialogue: 0,2:28:42.51,2:28:44.23,Default,,0000,0000,0000,,respect and control.
Dialogue: 0,2:28:44.23,2:28:46.42,Default,,0000,0000,0000,,A ban on facial surveillance technology
Dialogue: 0,2:28:46.42,2:28:48.22,Default,,0000,0000,0000,,is important in the city of Boston
Dialogue: 0,2:28:48.22,2:28:50.30,Default,,0000,0000,0000,,because this technology is dangerous
Dialogue: 0,2:28:50.30,2:28:54.12,Default,,0000,0000,0000,,and it leads to serious\Nmistrust in the community.
Dialogue: 0,2:28:54.12,2:28:57.70,Default,,0000,0000,0000,,Face surveillance is\Nproven to be less accurate
Dialogue: 0,2:28:57.70,2:29:01.04,Default,,0000,0000,0000,,for people with darker skin.
Dialogue: 0,2:29:01.04,2:29:04.38,Default,,0000,0000,0000,,Black people already face extreme levels
Dialogue: 0,2:29:04.38,2:29:06.37,Default,,0000,0000,0000,,of police violence and harassment.
Dialogue: 0,2:29:06.37,2:29:09.08,Default,,0000,0000,0000,,The racial bias in face surveillance
Dialogue: 0,2:29:09.08,2:29:11.42,Default,,0000,0000,0000,,will put lives in danger.
Dialogue: 0,2:29:11.42,2:29:13.73,Default,,0000,0000,0000,,But even if accurate,
Dialogue: 0,2:29:13.73,2:29:15.93,Default,,0000,0000,0000,,we should still be doing all that we can
Dialogue: 0,2:29:15.93,2:29:19.73,Default,,0000,0000,0000,,to avoid a future where every\Ntime we go to the doctor,
Dialogue: 0,2:29:19.73,2:29:22.34,Default,,0000,0000,0000,,a place of worship or protest,
Dialogue: 0,2:29:22.34,2:29:23.92,Default,,0000,0000,0000,,the government is scanning our face
Dialogue: 0,2:29:23.92,2:29:26.06,Default,,0000,0000,0000,,and recording our movements.
Dialogue: 0,2:29:26.06,2:29:27.78,Default,,0000,0000,0000,,We already know that law enforcement
Dialogue: 0,2:29:27.78,2:29:30.83,Default,,0000,0000,0000,,is constantly sharing\Nand misusing information.
Dialogue: 0,2:29:30.83,2:29:32.60,Default,,0000,0000,0000,,What would happen to our people
Dialogue: 0,2:29:32.60,2:29:35.25,Default,,0000,0000,0000,,when we are already under\Nconstant surveillance
Dialogue: 0,2:29:35.25,2:29:37.15,Default,,0000,0000,0000,,and criminalization?
Dialogue: 0,2:29:37.15,2:29:38.54,Default,,0000,0000,0000,,Using face surveillance,
Dialogue: 0,2:29:38.54,2:29:40.63,Default,,0000,0000,0000,,the government could begin to identify
Dialogue: 0,2:29:40.63,2:29:44.35,Default,,0000,0000,0000,,and keep a list of every\Nperson at a protest.
Dialogue: 0,2:29:44.35,2:29:45.77,Default,,0000,0000,0000,,People who are driving
Dialogue: 0,2:29:45.77,2:29:48.53,Default,,0000,0000,0000,,or who are walking the\Nstreet every single day
Dialogue: 0,2:29:48.53,2:29:52.52,Default,,0000,0000,0000,,across the city on an\Nongoing, automatic basis,
Dialogue: 0,2:29:52.52,2:29:54.85,Default,,0000,0000,0000,,it would keep the same\Nharmful surveillance going,
Dialogue: 0,2:29:54.85,2:29:57.94,Default,,0000,0000,0000,,like what happened with the\Nuse of [indistinct] database.
Dialogue: 0,2:29:57.94,2:30:01.63,Default,,0000,0000,0000,,It would criminalize anyone\Nwho makes a move in our city.
Dialogue: 0,2:30:06.40,2:30:09.55,Default,,0000,0000,0000,,If we want this city to be safe,
Dialogue: 0,2:30:09.55,2:30:12.34,Default,,0000,0000,0000,,we need to put our\Nmoney where our mouth is
Dialogue: 0,2:30:12.34,2:30:16.21,Default,,0000,0000,0000,,and invest in practices\Nsuch as restorative justice
Dialogue: 0,2:30:16.21,2:30:18.31,Default,,0000,0000,0000,,to help restore our community.
Dialogue: 0,2:30:18.31,2:30:20.48,Default,,0000,0000,0000,,Facial surveillance is not the answer.
Dialogue: 0,2:30:20.48,2:30:23.58,Default,,0000,0000,0000,,If anything, it is distracting
Dialogue: 0,2:30:23.58,2:30:26.54,Default,,0000,0000,0000,,us from addressing our real concerns
Dialogue: 0,2:30:26.54,2:30:28.61,Default,,0000,0000,0000,,about safety in our communities.
Dialogue: 0,2:30:28.61,2:30:31.98,Default,,0000,0000,0000,,I urge you to prohibit the\Nuse of facial surveillance
Dialogue: 0,2:30:31.98,2:30:34.49,Default,,0000,0000,0000,,in the city of Boston,
Dialogue: 0,2:30:34.49,2:30:38.13,Default,,0000,0000,0000,,because we cannot allow law\Nenforcement to target black
Dialogue: 0,2:30:38.13,2:30:42.27,Default,,0000,0000,0000,,and brown immigrants, activists\Nand community members.
Dialogue: 0,2:30:42.27,2:30:43.82,Default,,0000,0000,0000,,Thank you so much for your time.
Dialogue: 0,2:30:43.82,2:30:44.98,Default,,0000,0000,0000,,Thank you very much.
Dialogue: 0,2:30:44.98,2:30:47.10,Default,,0000,0000,0000,,I want to say thank\Nyou to all of the youth
Dialogue: 0,2:30:47.10,2:30:51.43,Default,,0000,0000,0000,,who have testified and\Nwill testify at SIM,
Dialogue: 0,2:30:51.43,2:30:53.74,Default,,0000,0000,0000,,all those folks showing up, you are...
Dialogue: 0,2:30:53.74,2:30:55.15,Default,,0000,0000,0000,,I just wanna say thank you so much
Dialogue: 0,2:30:55.15,2:30:57.86,Default,,0000,0000,0000,,on behalf of the city\Ncouncil, you're our future
Dialogue: 0,2:30:57.86,2:30:59.68,Default,,0000,0000,0000,,and you're doing an amazing job.
Dialogue: 0,2:30:59.68,2:31:03.03,Default,,0000,0000,0000,,I have a Natalie Diaz and then Jose Gomez.
Dialogue: 0,2:31:03.03,2:31:06.67,Default,,0000,0000,0000,,I see that Emily Reif,\Nwho had signed up before,
Dialogue: 0,2:31:06.67,2:31:08.67,Default,,0000,0000,0000,,might also be available.
Dialogue: 0,2:31:08.67,2:31:10.42,Default,,0000,0000,0000,,So those next three names,
Dialogue: 0,2:31:10.42,2:31:12.47,Default,,0000,0000,0000,,I think Natalie may no longer be with us.
Dialogue: 0,2:31:15.11,2:31:17.86,Default,,0000,0000,0000,,Okay, is Jose available?
Dialogue: 0,2:31:17.86,2:31:20.12,Default,,0000,0000,0000,,Yes.\NCHAIR: Two minutes.
Dialogue: 0,2:31:21.30,2:31:22.21,Default,,0000,0000,0000,,Hi everyone.
Dialogue: 0,2:31:22.21,2:31:26.93,Default,,0000,0000,0000,,I'm here on behalf of\NSIM to testify in favor
Dialogue: 0,2:31:26.93,2:31:30.34,Default,,0000,0000,0000,,of the ban on facial\Nrecognition software in Boston.
Dialogue: 0,2:31:30.34,2:31:33.83,Default,,0000,0000,0000,,I am an MIT alum,
Dialogue: 0,2:31:33.83,2:31:35.76,Default,,0000,0000,0000,,working as a software engineer in Boston
Dialogue: 0,2:31:35.76,2:31:38.66,Default,,0000,0000,0000,,and I'm a DACA recipient as well.
Dialogue: 0,2:31:38.66,2:31:41.49,Default,,0000,0000,0000,,And I wanted to tell you
Dialogue: 0,2:31:41.49,2:31:43.44,Default,,0000,0000,0000,,that as a software engineer working in AI
Dialogue: 0,2:31:43.44,2:31:44.75,Default,,0000,0000,0000,,and machine learning-enabled robotics,
Dialogue: 0,2:31:44.75,2:31:46.78,Default,,0000,0000,0000,,the software used\Nfor facial recognition
Dialogue: 0,2:31:46.78,2:31:48.05,Default,,0000,0000,0000,,is far from perfect.
Dialogue: 0,2:31:49.09,2:31:51.72,Default,,0000,0000,0000,,In fact, as Joy found\Nand others have noted,
Dialogue: 0,2:31:51.72,2:31:53.94,Default,,0000,0000,0000,,one in three black women is likely
Dialogue: 0,2:31:53.94,2:31:56.08,Default,,0000,0000,0000,,to be misclassified by this technology.
Dialogue: 0,2:31:56.08,2:31:58.17,Default,,0000,0000,0000,,And our own federal government has found
Dialogue: 0,2:31:58.17,2:32:00.05,Default,,0000,0000,0000,,that face recognition algorithms
Dialogue: 0,2:32:00.05,2:32:01.11,Default,,0000,0000,0000,,were more likely to fail
Dialogue: 0,2:32:01.11,2:32:03.72,Default,,0000,0000,0000,,when attempting to identify\Nfaces of people of color,
Dialogue: 0,2:32:03.72,2:32:06.18,Default,,0000,0000,0000,,children, the elderly and women.
Dialogue: 0,2:32:06.18,2:32:10.09,Default,,0000,0000,0000,,This software, after all,\Nis written by people
Dialogue: 0,2:32:10.09,2:32:12.74,Default,,0000,0000,0000,,and is inherently filled\Nwith mistakes and bias,
Dialogue: 0,2:32:12.74,2:32:16.63,Default,,0000,0000,0000,,despite the detailed\Nquality assurance processes
Dialogue: 0,2:32:16.63,2:32:18.97,Default,,0000,0000,0000,,our production-level software undergoes.
Dialogue: 0,2:32:18.97,2:32:21.83,Default,,0000,0000,0000,,This is an accepted fact\Nin the software industry,
Dialogue: 0,2:32:21.83,2:32:23.54,Default,,0000,0000,0000,,no matter how good your software is
Dialogue: 0,2:32:23.54,2:32:25.35,Default,,0000,0000,0000,,or your testing process is,
Dialogue: 0,2:32:25.35,2:32:27.43,Default,,0000,0000,0000,,bugs in the software will always exist.
Dialogue: 0,2:32:27.43,2:32:30.78,Default,,0000,0000,0000,,Consequently, the use of\Nfacial recognition technology
Dialogue: 0,2:32:30.78,2:32:33.64,Default,,0000,0000,0000,,by government entities,\Nespecially police departments,
Dialogue: 0,2:32:33.64,2:32:35.83,Default,,0000,0000,0000,,must be banned, as the consequences of errors
Dialogue: 0,2:32:35.83,2:32:38.46,Default,,0000,0000,0000,,in the technology are life and death.
Dialogue: 0,2:32:38.46,2:32:41.51,Default,,0000,0000,0000,,A false identification by\Nfacial recognition software
Dialogue: 0,2:32:41.51,2:32:43.85,Default,,0000,0000,0000,,could lead to an unwarranted confrontation
Dialogue: 0,2:32:43.85,2:32:45.81,Default,,0000,0000,0000,,with the police department.
Dialogue: 0,2:32:45.81,2:32:48.32,Default,,0000,0000,0000,,A seemingly small error in\Nfacial recognition software
Dialogue: 0,2:32:48.32,2:32:50.14,Default,,0000,0000,0000,,may not mean much to you,
Dialogue: 0,2:32:50.14,2:32:54.15,Default,,0000,0000,0000,,but as an undocumented\Nimmigrant and a person of color,
Dialogue: 0,2:32:54.15,2:32:56.04,Default,,0000,0000,0000,,any confrontation with the police
Dialogue: 0,2:32:56.04,2:32:58.12,Default,,0000,0000,0000,,could mean deportation or even my life.
Dialogue: 0,2:32:59.07,2:33:02.13,Default,,0000,0000,0000,,The police have a record\Nof systemic racial bias
Dialogue: 0,2:33:02.13,2:33:04.41,Default,,0000,0000,0000,,leading to the deaths\Nof thousands of people.
Dialogue: 0,2:33:04.41,2:33:06.88,Default,,0000,0000,0000,,Government entities, especially police forces,
Dialogue: 0,2:33:06.88,2:33:09.80,Default,,0000,0000,0000,,do not need any more tools\Nto further their racial bias
Dialogue: 0,2:33:09.80,2:33:12.61,Default,,0000,0000,0000,,and consequently there's\Na systemic murdering
Dialogue: 0,2:33:12.61,2:33:13.98,Default,,0000,0000,0000,,of the very people\Nthey're meant to protect.
Dialogue: 0,2:33:13.98,2:33:15.46,Default,,0000,0000,0000,,I encourage you to press pause
Dialogue: 0,2:33:15.46,2:33:17.77,Default,,0000,0000,0000,,on the use of face surveillance\Nby government entities
Dialogue: 0,2:33:17.77,2:33:21.50,Default,,0000,0000,0000,,in the city of Boston, by\Nsupporting this crucial ban.
Dialogue: 0,2:33:24.13,2:33:28.62,Default,,0000,0000,0000,,Thank you very much, after\NJose I think we had...
Dialogue: 0,2:33:33.08,2:33:38.08,Default,,0000,0000,0000,,After Jose we had Emily,\NEmily Reif has rejoined us.
Dialogue: 0,2:33:38.30,2:33:42.17,Default,,0000,0000,0000,,She was part of the original\Nfirst set we called, Emily.
Dialogue: 0,2:33:45.27,2:33:46.68,Default,,0000,0000,0000,,EMILY: Hello, sorry.
Dialogue: 0,2:33:46.68,2:33:48.53,Default,,0000,0000,0000,,Can you hear me?\NYep, two minutes.
Dialogue: 0,2:33:49.62,2:33:53.15,Default,,0000,0000,0000,,Sorry about that, I was\Nhaving internet issues.
Dialogue: 0,2:33:53.15,2:33:55.00,Default,,0000,0000,0000,,So my name is Emily Reif
Dialogue: 0,2:33:55.00,2:33:57.39,Default,,0000,0000,0000,,and I work in machine\Nlearning research at Google.
Dialogue: 0,2:33:57.39,2:33:58.43,Default,,0000,0000,0000,,Although of course my views are my own
Dialogue: 0,2:33:58.43,2:34:00.49,Default,,0000,0000,0000,,and I'm here as a citizen.
Dialogue: 0,2:34:00.49,2:34:02.06,Default,,0000,0000,0000,,Yeah, I strongly support the ban
Dialogue: 0,2:34:02.06,2:34:03.86,Default,,0000,0000,0000,,on facial recognition technology
Dialogue: 0,2:34:03.86,2:34:05.86,Default,,0000,0000,0000,,and many people have said similar things
Dialogue: 0,2:34:05.86,2:34:09.40,Default,,0000,0000,0000,,to what I'm about to say,\Nin much better words.
Dialogue: 0,2:34:09.40,2:34:13.08,Default,,0000,0000,0000,,I just wanted to reiterate on...
Dialogue: 0,2:34:13.08,2:34:14.85,Default,,0000,0000,0000,,Even when working correctly,
Dialogue: 0,2:34:14.85,2:34:17.12,Default,,0000,0000,0000,,there are so many major privacy issues
Dialogue: 0,2:34:17.12,2:34:20.22,Default,,0000,0000,0000,,with these technologies and by definition,
Dialogue: 0,2:34:20.22,2:34:22.02,Default,,0000,0000,0000,,they're designed to track\Nus wherever and whenever
Dialogue: 0,2:34:22.02,2:34:26.00,Default,,0000,0000,0000,,we're in public spaces and\Nto aggregate this information
Dialogue: 0,2:34:26.00,2:34:28.07,Default,,0000,0000,0000,,and having that kind\Nof extreme surveillance
Dialogue: 0,2:34:28.07,2:34:30.93,Default,,0000,0000,0000,,is not only kind of horrifying\Nat a personal level,
Dialogue: 0,2:34:30.93,2:34:32.59,Default,,0000,0000,0000,,but it will also fundamentally change
Dialogue: 0,2:34:32.59,2:34:34.38,Default,,0000,0000,0000,,the way that we go about our daily lives
Dialogue: 0,2:34:34.38,2:34:36.62,Default,,0000,0000,0000,,and operate as a democratic society.
Dialogue: 0,2:34:36.62,2:34:40.00,Default,,0000,0000,0000,,And others have already cited\Nsome of the research on this.
Dialogue: 0,2:34:40.00,2:34:42.96,Default,,0000,0000,0000,,It's a well-documented\Nphenomenon, not a good one.
Dialogue: 0,2:34:42.96,2:34:47.10,Default,,0000,0000,0000,,And so that's all about\Nfacial recognition technology
Dialogue: 0,2:34:47.10,2:34:48.08,Default,,0000,0000,0000,,when it's working perfectly.
Dialogue: 0,2:34:48.08,2:34:50.81,Default,,0000,0000,0000,,But again, as people have noted
Dialogue: 0,2:34:50.81,2:34:53.16,Default,,0000,0000,0000,,that's so far from the truth.
Dialogue: 0,2:34:53.16,2:34:55.53,Default,,0000,0000,0000,,Yeah, there's a ton of\Nresearch on this by the ACLU,
Dialogue: 0,2:34:55.53,2:34:58.57,Default,,0000,0000,0000,,and Joy, who spoke earlier,
Dialogue: 0,2:34:58.57,2:35:00.72,Default,,0000,0000,0000,,and I've seen this in my own research.
Dialogue: 0,2:35:00.72,2:35:03.83,Default,,0000,0000,0000,,Our team's research is\Nrelated to these areas.
Dialogue: 0,2:35:03.83,2:35:08.31,Default,,0000,0000,0000,,These models are just\Ntotally not reliable for data
Dialogue: 0,2:35:08.31,2:35:10.10,Default,,0000,0000,0000,,that is different than\Nwhat they're trained on.
Dialogue: 0,2:35:10.10,2:35:13.06,Default,,0000,0000,0000,,And these inaccuracies\Ndon't make it any less,
Dialogue: 0,2:35:13.06,2:35:15.57,Default,,0000,0000,0000,,I mean, one might say\Nlike, oh, so it makes it,
Dialogue: 0,2:35:15.57,2:35:17.84,Default,,0000,0000,0000,,potentially less dangerous,\Nbut it makes it much,
Dialogue: 0,2:35:17.84,2:35:20.77,Default,,0000,0000,0000,,much more dangerous when the inaccuracies
Dialogue: 0,2:35:20.77,2:35:22.24,Default,,0000,0000,0000,,just reinforce the biases,
Dialogue: 0,2:35:22.24,2:35:24.18,Default,,0000,0000,0000,,the disparities that\Nhave always been there
Dialogue: 0,2:35:24.18,2:35:26.41,Default,,0000,0000,0000,,in the way that different\Ngroups are surveilled
Dialogue: 0,2:35:26.41,2:35:28.79,Default,,0000,0000,0000,,and policed and incarcerated.
Dialogue: 0,2:35:28.79,2:35:32.91,Default,,0000,0000,0000,,And we saw with, with\NBreonna Taylor's death
Dialogue: 0,2:35:32.91,2:35:35.46,Default,,0000,0000,0000,,that thinking that you're attacking
Dialogue: 0,2:35:35.46,2:35:36.80,Default,,0000,0000,0000,,or that you're entering the right house,
Dialogue: 0,2:35:36.80,2:35:38.93,Default,,0000,0000,0000,,and if you are wrong about that,
Dialogue: 0,2:35:38.93,2:35:42.77,Default,,0000,0000,0000,,there are just horrible,\Nunspeakable consequences.
Dialogue: 0,2:35:42.77,2:35:47.77,Default,,0000,0000,0000,,And yeah, we can't possibly risk that
Dialogue: 0,2:35:47.80,2:35:49.90,Default,,0000,0000,0000,,and that's not a road we should remotely
Dialogue: 0,2:35:49.90,2:35:52.12,Default,,0000,0000,0000,,think about going down, thank you.
Dialogue: 0,2:35:53.81,2:35:55.83,Default,,0000,0000,0000,,That's perfectly on time.
Dialogue: 0,2:35:55.83,2:35:58.92,Default,,0000,0000,0000,,I'm just gonna go down the list
Dialogue: 0,2:35:58.92,2:36:03.92,Default,,0000,0000,0000,,and also call on Christian\NDe Leon, Clara Ruiz,
Dialogue: 0,2:36:10.72,2:36:13.19,Default,,0000,0000,0000,,Diana Serrano and Oliver De Leon.
Dialogue: 0,2:36:20.25,2:36:22.56,Default,,0000,0000,0000,,Are those folks available?
Dialogue: 0,2:36:22.56,2:36:23.44,Default,,0000,0000,0000,,I see Christian.
Dialogue: 0,2:36:27.85,2:36:29.50,Default,,0000,0000,0000,,Christian, you may start,\Nyou have two minutes.
Dialogue: 0,2:36:30.75,2:36:33.94,Default,,0000,0000,0000,,[indistinct]
Dialogue: 0,2:36:33.94,2:36:36.33,Default,,0000,0000,0000,,You have it on one\Ncomputer and on your phone
Dialogue: 0,2:36:36.33,2:36:38.95,Default,,0000,0000,0000,,or something, you're\Ngonna have to turn off
Dialogue: 0,2:36:38.95,2:36:39.78,Default,,0000,0000,0000,,one of the devices.
Dialogue: 0,2:36:39.78,2:36:41.23,Default,,0000,0000,0000,,CHRISTIAN: Can you hear me fine now?
Dialogue: 0,2:36:41.23,2:36:42.48,Default,,0000,0000,0000,,Much better, thank you.
Dialogue: 0,2:36:45.45,2:36:47.99,Default,,0000,0000,0000,,CHRISTIAN: My name is Christian De Leon,
Dialogue: 0,2:36:47.99,2:36:49.75,Default,,0000,0000,0000,,I'm here today as a member
Dialogue: 0,2:36:49.75,2:36:50.58,Default,,0000,0000,0000,,of the Student Immigrant Movement
Dialogue: 0,2:36:51.63,2:36:53.32,Default,,0000,0000,0000,,in support of banning facial\Nrecognition in Boston.
Dialogue: 0,2:36:53.32,2:36:56.28,Default,,0000,0000,0000,,Face surveillance is not only\Na huge invasion of privacy,
Dialogue: 0,2:36:56.28,2:36:58.58,Default,,0000,0000,0000,,but in the wrong hands can be used
Dialogue: 0,2:36:58.58,2:37:00.68,Default,,0000,0000,0000,,to even further oppress minorities.
Dialogue: 0,2:37:00.68,2:37:03.35,Default,,0000,0000,0000,,It can be used to track\Nimmigrants, protestors,
Dialogue: 0,2:37:03.35,2:37:07.41,Default,,0000,0000,0000,,and is inaccurate when tracking\Npeople of darker complexion.
Dialogue: 0,2:37:07.41,2:37:09.65,Default,,0000,0000,0000,,As we know, this can be very problematic
Dialogue: 0,2:37:09.65,2:37:12.08,Default,,0000,0000,0000,,since these tools target these communities
Dialogue: 0,2:37:12.08,2:37:14.19,Default,,0000,0000,0000,,through racial profiling already.
Dialogue: 0,2:37:14.19,2:37:17.16,Default,,0000,0000,0000,,Having surveillance tools\Nlike facial recognition
Dialogue: 0,2:37:17.16,2:37:19.73,Default,,0000,0000,0000,,will contribute to an even\Ngreater systemic oppression.
Dialogue: 0,2:37:19.73,2:37:23.40,Default,,0000,0000,0000,,It also has no business\Nbeing in our schools.
Dialogue: 0,2:37:23.40,2:37:26.29,Default,,0000,0000,0000,,To my knowledge, some\Nschools are already investing Dialogue: 0,2:37:26.29,2:37:28.21,Default,,0000,0000,0000,,in face recognition software Dialogue: 0,2:37:28.21,2:37:31.24,Default,,0000,0000,0000,,and are investing millions\Nin this new technology, Dialogue: 0,2:37:31.24,2:37:32.96,Default,,0000,0000,0000,,and for what exactly? Dialogue: 0,2:37:32.96,2:37:34.84,Default,,0000,0000,0000,,To track their students' every move? Dialogue: 0,2:37:34.84,2:37:38.62,Default,,0000,0000,0000,,I believe schools should\Nspend that time and money Dialogue: 0,2:37:38.62,2:37:41.80,Default,,0000,0000,0000,,on bettering their school\Nand education systems, Dialogue: 0,2:37:41.80,2:37:44.68,Default,,0000,0000,0000,,rather than spending it\Non software and tools Dialogue: 0,2:37:44.68,2:37:47.97,Default,,0000,0000,0000,,that will in no way improve\Nstudents' ability to be educated. Dialogue: 0,2:37:47.97,2:37:50.04,Default,,0000,0000,0000,,We are already living in a time Dialogue: 0,2:37:50.04,2:37:52.84,Default,,0000,0000,0000,,where technology can do scary things Dialogue: 0,2:37:52.84,2:37:54.87,Default,,0000,0000,0000,,and the last thing we need Dialogue: 0,2:37:54.87,2:37:57.92,Default,,0000,0000,0000,,are people watching our every\Nmove on a day-to-day basis. Dialogue: 0,2:37:57.92,2:37:59.21,Default,,0000,0000,0000,,During this pandemic, Dialogue: 0,2:37:59.21,2:38:00.95,Default,,0000,0000,0000,,we have grown accustomed to wearing masks Dialogue: 0,2:38:00.95,2:38:02.75,Default,,0000,0000,0000,,and covering our faces in public, Dialogue: 0,2:38:02.75,2:38:05.38,Default,,0000,0000,0000,,well, if we now have facial recognition Dialogue: 0,2:38:05.38,2:38:06.80,Default,,0000,0000,0000,,on every street corner, Dialogue: 0,2:38:06.80,2:38:10.04,Default,,0000,0000,0000,,then this temporary thing\Nthe world has adapted to Dialogue: 0,2:38:10.04,2:38:11.61,Default,,0000,0000,0000,,may just become permanent. Dialogue: 0,2:38:11.61,2:38:13.73,Default,,0000,0000,0000,,Not only to protect our own privacy, Dialogue: 0,2:38:13.73,2:38:16.51,Default,,0000,0000,0000,,but to not be falsely accused of crimes Dialogue: 0,2:38:16.51,2:38:18.80,Default,,0000,0000,0000,,due to faulty recognition software. Dialogue: 0,2:38:18.80,2:38:22.76,Default,,0000,0000,0000,,2020 has had a great display\Nof racism and brutality Dialogue: 0,2:38:22.76,2:38:23.94,Default,,0000,0000,0000,,against people of color, Dialogue: 0,2:38:23.94,2:38:25.59,Default,,0000,0000,0000,,particularly the black community, Dialogue: 0,2:38:25.59,2:38:28.53,Default,,0000,0000,0000,,and I fear that systems such\Nas facial recognition software Dialogue: 0,2:38:28.53,2:38:30.42,Default,,0000,0000,0000,,in the hands of law enforcement, Dialogue: 0,2:38:30.42,2:38:33.00,Default,,0000,0000,0000,,can just be another tool\Nthey use against us. Dialogue: 0,2:38:33.00,2:38:35.68,Default,,0000,0000,0000,,Thank you for listening to my concerns. Dialogue: 0,2:38:35.68,2:38:37.48,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,2:38:37.48,2:38:38.31,Default,,0000,0000,0000,,Clara Ruiz. Dialogue: 0,2:38:43.48,2:38:45.05,Default,,0000,0000,0000,,CLARA: Hello.\NHello. Dialogue: 0,2:38:45.05,2:38:47.20,Default,,0000,0000,0000,,You have two minutes, Clara. Dialogue: 0,2:38:49.33,2:38:50.40,Default,,0000,0000,0000,,CLARA: Thank you so much.
Dialogue: 0,2:38:50.40,2:38:53.77,Default,,0000,0000,0000,,My name is Clara Ruiz and\NI'm writing in support Dialogue: 0,2:38:53.77,2:38:57.00,Default,,0000,0000,0000,,of the ordinance banning\Nfacial recognition technology Dialogue: 0,2:38:57.00,2:39:00.21,Default,,0000,0000,0000,,in Boston presented by\Ncouncilors Wu and Arroyo. Dialogue: 0,2:39:00.21,2:39:02.92,Default,,0000,0000,0000,,This ordinance establishes\Na ban on government use Dialogue: 0,2:39:02.92,2:39:05.24,Default,,0000,0000,0000,,of face surveillance\Nin the city of Boston. Dialogue: 0,2:39:05.24,2:39:07.84,Default,,0000,0000,0000,,Boston must pass this\Nordinance to join Springfield, Dialogue: 0,2:39:07.84,2:39:10.17,Default,,0000,0000,0000,,Somerville, Cambridge,\NBrookline, Northampton Dialogue: 0,2:39:10.17,2:39:13.33,Default,,0000,0000,0000,,in protecting racial justice,\Nprivacy, and freedom of speech. Dialogue: 0,2:39:13.33,2:39:17.23,Default,,0000,0000,0000,,This matters to me because\NI'm a member of the BIPOC, Dialogue: 0,2:39:19.52,2:39:22.39,Default,,0000,0000,0000,,Black Indigenous and\NPeople of Color Community. Dialogue: 0,2:39:22.39,2:39:23.38,Default,,0000,0000,0000,,And this is allowing for the community Dialogue: 0,2:39:23.38,2:39:25.58,Default,,0000,0000,0000,,to increase in racial biases. Dialogue: 0,2:39:25.58,2:39:28.20,Default,,0000,0000,0000,,This technology does not\Nprotect constitutional rights Dialogue: 0,2:39:28.20,2:39:32.11,Default,,0000,0000,0000,,and values, including\Nprivacy in an immediate Dialogue: 0,2:39:32.11,2:39:35.33,Default,,0000,0000,0000,,[indistinct] Dialogue: 0,2:39:35.33,2:39:37.97,Default,,0000,0000,0000,,free speech and\Nassociation, equal protection Dialogue: 0,2:39:37.97,2:39:40.74,Default,,0000,0000,0000,,and government accountability\Nand transparency. Dialogue: 0,2:39:40.74,2:39:43.12,Default,,0000,0000,0000,,And the government must be\Naware of the technical limits Dialogue: 0,2:39:43.12,2:39:45.80,Default,,0000,0000,0000,,of even the most\Npromising new technologies Dialogue: 0,2:39:45.80,2:39:47.87,Default,,0000,0000,0000,,and the risks of relying\Non computer systems Dialogue: 0,2:39:47.87,2:39:49.52,Default,,0000,0000,0000,,that can be prone to error. Dialogue: 0,2:39:49.52,2:39:50.93,Default,,0000,0000,0000,,Facial recognition is creeping Dialogue: 0,2:39:50.93,2:39:53.09,Default,,0000,0000,0000,,into more and more law\Nenforcement agencies, Dialogue: 0,2:39:53.09,2:39:55.03,Default,,0000,0000,0000,,with little notice or oversight. Dialogue: 0,2:39:55.03,2:39:57.86,Default,,0000,0000,0000,,And there are practically\Nno laws regulating Dialogue: 0,2:39:57.86,2:40:00.40,Default,,0000,0000,0000,,this invasive surveillance technology. Dialogue: 0,2:40:00.40,2:40:02.73,Default,,0000,0000,0000,,Meanwhile, ongoing technical limitations Dialogue: 0,2:40:02.73,2:40:05.01,Default,,0000,0000,0000,,to the accuracy of facial recognition Dialogue: 0,2:40:05.01,2:40:07.47,Default,,0000,0000,0000,,create serious problems\Nlike misidentification Dialogue: 0,2:40:07.47,2:40:09.57,Default,,0000,0000,0000,,and public safety risks. Dialogue: 0,2:40:09.57,2:40:11.91,Default,,0000,0000,0000,,I refuse to sit back and\Nput my community at more risk Dialogue: 0,2:40:11.91,2:40:15.50,Default,,0000,0000,0000,,than it already is.
Dialogue: 0,2:40:15.50,2:40:17.12,Default,,0000,0000,0000,,These surveillance technologies Dialogue: 0,2:40:17.12,2:40:21.21,Default,,0000,0000,0000,,are intentionally set up to\Nthe disadvantage of brown Dialogue: 0,2:40:22.43,2:40:24.42,Default,,0000,0000,0000,,people and communities of color. Dialogue: 0,2:40:24.42,2:40:26.78,Default,,0000,0000,0000,,It has a better success rate\Nwhen it comes to white males, Dialogue: 0,2:40:26.78,2:40:29.15,Default,,0000,0000,0000,,meaning this technology is unreliable Dialogue: 0,2:40:29.15,2:40:30.75,Default,,0000,0000,0000,,and it will do more harm, Dialogue: 0,2:40:30.75,2:40:33.02,Default,,0000,0000,0000,,if we don't ban the surveillance tool. Dialogue: 0,2:40:33.02,2:40:36.82,Default,,0000,0000,0000,,This technology will allow\Nfor misidentification Dialogue: 0,2:40:36.82,2:40:40.21,Default,,0000,0000,0000,,to purposely happen against\Nblack and brown people. Dialogue: 0,2:40:40.21,2:40:42.84,Default,,0000,0000,0000,,A ban on face surveillance technology Dialogue: 0,2:40:42.84,2:40:44.94,Default,,0000,0000,0000,,is critically important\Nin the city of Boston Dialogue: 0,2:40:44.94,2:40:47.17,Default,,0000,0000,0000,,because we need to keep\Nthe community safe, Dialogue: 0,2:40:47.17,2:40:48.92,Default,,0000,0000,0000,,because all life matters. Dialogue: 0,2:40:48.92,2:40:51.15,Default,,0000,0000,0000,,A federal government study Dialogue: 0,2:40:51.15,2:40:53.48,Default,,0000,0000,0000,,published in December 2019, Dialogue: 0,2:40:53.48,2:40:55.33,Default,,0000,0000,0000,,found face recognition algorithms Dialogue: 0,2:40:55.33,2:40:56.73,Default,,0000,0000,0000,,were most likely to fail Dialogue: 0,2:40:56.73,2:41:00.36,Default,,0000,0000,0000,,when attempting to identify\Nthe faces of people of color, Dialogue: 0,2:41:00.36,2:41:03.27,Default,,0000,0000,0000,,children, the elderly and women. Dialogue: 0,2:41:03.27,2:41:07.29,Default,,0000,0000,0000,,That means the technology only reliably works Dialogue: 0,2:41:07.29,2:41:08.82,Default,,0000,0000,0000,,on middle-aged white men, Dialogue: 0,2:41:08.82,2:41:11.45,Default,,0000,0000,0000,,really a small fraction\Nof Boston residents. Dialogue: 0,2:41:11.45,2:41:14.59,Default,,0000,0000,0000,,I encourage you to press pause Dialogue: 0,2:41:14.59,2:41:17.55,Default,,0000,0000,0000,,on the use of face surveillance\Nby government entities Dialogue: 0,2:41:20.25,2:41:21.42,Default,,0000,0000,0000,,in the city of Boston, Dialogue: 0,2:41:21.42,2:41:23.92,Default,,0000,0000,0000,,by supporting and\Npassing this crucial ban. Dialogue: 0,2:41:23.92,2:41:26.52,Default,,0000,0000,0000,,We cannot allow Boston\Nto adopt authoritarian, Dialogue: 0,2:41:26.52,2:41:30.10,Default,,0000,0000,0000,,unregulated, biased surveillance technology. Dialogue: 0,2:41:30.10,2:41:32.32,Default,,0000,0000,0000,,Thank you for the opportunity to testify. Dialogue: 0,2:41:32.32,2:41:34.25,Default,,0000,0000,0000,,And by the way, Diana Serrano Dialogue: 0,2:41:34.25,2:41:36.33,Default,,0000,0000,0000,,is also gonna be testifying through this. Dialogue: 0,2:41:38.72,2:41:40.83,Default,,0000,0000,0000,,Diana, you have two minutes. Dialogue: 0,2:41:40.83,2:41:43.31,Default,,0000,0000,0000,,DIANA: Okay, hello, my\Nname is Diana Serrano, Dialogue: 0,2:41:43.31,2:41:44.92,Default,,0000,0000,0000,,good evening.
Dialogue: 0,2:41:44.92,2:41:47.68,Default,,0000,0000,0000,,I'm writing on behalf of SIM\Nin support of the ordinance Dialogue: 0,2:41:47.68,2:41:50.49,Default,,0000,0000,0000,,banning facial recognition\Ntechnology in Boston. Dialogue: 0,2:41:50.49,2:41:53.40,Default,,0000,0000,0000,,I urge my city of Boston\Nto pass this ordinance Dialogue: 0,2:41:53.40,2:41:55.78,Default,,0000,0000,0000,,to join Springfield,\NSomerville, Cambridge, Brookline Dialogue: 0,2:41:55.78,2:41:58.40,Default,,0000,0000,0000,,and Northampton in\Nprotecting racial justice, Dialogue: 0,2:41:58.40,2:42:01.22,Default,,0000,0000,0000,,privacy and freedom of speech. Dialogue: 0,2:42:01.22,2:42:02.88,Default,,0000,0000,0000,,It is our duty as a free and just city Dialogue: 0,2:42:02.88,2:42:06.26,Default,,0000,0000,0000,,to ban technology like facial recognition. Dialogue: 0,2:42:06.26,2:42:08.40,Default,,0000,0000,0000,,As a victim of social media hacking, Dialogue: 0,2:42:08.40,2:42:11.70,Default,,0000,0000,0000,,I've seen how technology could\Nbe used as a tool to defame Dialogue: 0,2:42:11.70,2:42:14.18,Default,,0000,0000,0000,,and criminalize with pictures Dialogue: 0,2:42:14.18,2:42:17.38,Default,,0000,0000,0000,,that feel like proof to the viewers. Dialogue: 0,2:42:17.38,2:42:19.18,Default,,0000,0000,0000,,As a teacher and civilian, Dialogue: 0,2:42:19.18,2:42:22.17,Default,,0000,0000,0000,,I refuse to allow those in my community Dialogue: 0,2:42:22.17,2:42:24.66,Default,,0000,0000,0000,,to be exposed to a highly problematic Dialogue: 0,2:42:24.66,2:42:27.54,Default,,0000,0000,0000,,technology like facial recognition. Dialogue: 0,2:42:27.54,2:42:29.85,Default,,0000,0000,0000,,Not only is privacy important Dialogue: 0,2:42:29.85,2:42:32.47,Default,,0000,0000,0000,,and should be valued\Nby our law enforcement, Dialogue: 0,2:42:32.47,2:42:35.40,Default,,0000,0000,0000,,face surveillance is\Nproven to be less accurate Dialogue: 0,2:42:35.40,2:42:37.47,Default,,0000,0000,0000,,for people with darker skin. Dialogue: 0,2:42:37.47,2:42:39.71,Default,,0000,0000,0000,,With the high levels of\Nracial discrimination Dialogue: 0,2:42:39.71,2:42:41.74,Default,,0000,0000,0000,,forced on our black and brown communities Dialogue: 0,2:42:41.74,2:42:43.27,Default,,0000,0000,0000,,by law enforcement, Dialogue: 0,2:42:43.27,2:42:45.26,Default,,0000,0000,0000,,the racial bias in face surveillance Dialogue: 0,2:42:45.26,2:42:48.96,Default,,0000,0000,0000,,will put lives in danger even\Nfurther than they already are. Dialogue: 0,2:42:48.96,2:42:50.85,Default,,0000,0000,0000,,A ban on face surveillance technology Dialogue: 0,2:42:50.85,2:42:53.14,Default,,0000,0000,0000,,is critically important\Nin the city of Boston Dialogue: 0,2:42:53.14,2:42:55.52,Default,,0000,0000,0000,,because that technology\Nreally is only reliable Dialogue: 0,2:42:55.52,2:42:58.62,Default,,0000,0000,0000,,for a very small fraction\Nof the population. Dialogue: 0,2:42:58.62,2:43:01.66,Default,,0000,0000,0000,,It's highly important\Nto me to live in a city Dialogue: 0,2:43:01.66,2:43:04.38,Default,,0000,0000,0000,,where it is critical for\Nleaders to keep our city Dialogue: 0,2:43:04.38,2:43:05.91,Default,,0000,0000,0000,,and its people safe. Dialogue: 0,2:43:05.91,2:43:08.92,Default,,0000,0000,0000,,Although it may seem as\Nthough facial recognition Dialogue: 0,2:43:08.92,2:43:12.12,Default,,0000,0000,0000,,could maybe keep us safer, it\Nhas proven to do the opposite.
Dialogue: 0,2:43:12.12,2:43:16.10,Default,,0000,0000,0000,,I urge you to press pause on\Nthe use of facial surveillance Dialogue: 0,2:43:16.10,2:43:18.75,Default,,0000,0000,0000,,by government entities\Nin the city of Boston Dialogue: 0,2:43:18.75,2:43:21.21,Default,,0000,0000,0000,,by supporting and passing the crucial ban. Dialogue: 0,2:43:21.21,2:43:23.18,Default,,0000,0000,0000,,We cannot allow Boston to adopt Dialogue: 0,2:43:23.18,2:43:26.83,Default,,0000,0000,0000,,authoritarian, unregulated,\Nbiased surveillance technology. Dialogue: 0,2:43:26.83,2:43:29.68,Default,,0000,0000,0000,,Thank you for your\Nattention, Diana Serrano. Dialogue: 0,2:43:29.68,2:43:31.66,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,2:43:31.66,2:43:36.66,Default,,0000,0000,0000,,Oliver De Leon, you have two minutes. Dialogue: 0,2:43:37.76,2:43:40.30,Default,,0000,0000,0000,,OLIVER: Hi, my name is Oliver Dialogue: 0,2:43:40.30,2:43:42.82,Default,,0000,0000,0000,,and I live in Jamaica Plain. Dialogue: 0,2:43:42.82,2:43:45.67,Default,,0000,0000,0000,,And I was undocumented when I attended Dialogue: 0,2:43:45.67,2:43:48.36,Default,,0000,0000,0000,,the Boston public schools\Nin the early nineties. Dialogue: 0,2:43:48.36,2:43:53.04,Default,,0000,0000,0000,,So yes, it's a while back, so\NI became a citizen in 2016. Dialogue: 0,2:43:54.36,2:43:57.11,Default,,0000,0000,0000,,I look back on my days in high school Dialogue: 0,2:43:57.11,2:43:59.38,Default,,0000,0000,0000,,while contemplating this new technology, Dialogue: 0,2:43:59.38,2:44:04.38,Default,,0000,0000,0000,,facial recognition and I\Ncan sincerely tell you that Dialogue: 0,2:44:04.86,2:44:08.39,Default,,0000,0000,0000,,it terrified me, the\Nthought of having somebody Dialogue: 0,2:44:08.39,2:44:11.13,Default,,0000,0000,0000,,access this technology without\Nmy consent or knowledge. Dialogue: 0,2:44:11.13,2:44:14.04,Default,,0000,0000,0000,,I don't know if this would have allowed me Dialogue: 0,2:44:14.04,2:44:16.95,Default,,0000,0000,0000,,to finish school at that time. Dialogue: 0,2:44:16.95,2:44:19.02,Default,,0000,0000,0000,,And it would have been hard Dialogue: 0,2:44:19.02,2:44:22.44,Default,,0000,0000,0000,,to even finish possibly higher education. Dialogue: 0,2:44:22.44,2:44:24.00,Default,,0000,0000,0000,,Without that education, Dialogue: 0,2:44:24.00,2:44:27.84,Default,,0000,0000,0000,,I wouldn't have been able\Nto secure job opportunities Dialogue: 0,2:44:27.84,2:44:30.54,Default,,0000,0000,0000,,like I have now with a paying job Dialogue: 0,2:44:30.54,2:44:32.80,Default,,0000,0000,0000,,that helps me provide for my family. Dialogue: 0,2:44:32.80,2:44:35.41,Default,,0000,0000,0000,,So we started thinking about that. Dialogue: 0,2:44:35.41,2:44:38.32,Default,,0000,0000,0000,,We start seeing how\Nthis facial recognition Dialogue: 0,2:44:38.32,2:44:41.47,Default,,0000,0000,0000,,can actually have a negative impact Dialogue: 0,2:44:41.47,2:44:44.48,Default,,0000,0000,0000,,on the education of our\Nyouth and our future. Dialogue: 0,2:44:44.48,2:44:47.56,Default,,0000,0000,0000,,People already talk about how the U.S. Dialogue: 0,2:44:47.56,2:44:49.61,Default,,0000,0000,0000,,is already riddled with racial bias, Dialogue: 0,2:44:49.61,2:44:54.03,Default,,0000,0000,0000,,and this software is just\Ngiving people already empowered Dialogue: 0,2:44:54.03,2:44:56.31,Default,,0000,0000,0000,,a tool to discriminate even more.
Dialogue: 0,2:44:56.31,2:45:00.09,Default,,0000,0000,0000,,For example, I.C.E. has already\Nshown up at the protests Dialogue: 0,2:45:00.09,2:45:02.30,Default,,0000,0000,0000,,and detained a few people. Dialogue: 0,2:45:02.30,2:45:04.13,Default,,0000,0000,0000,,Now we have to ask ourselves Dialogue: 0,2:45:04.13,2:45:08.22,Default,,0000,0000,0000,,how the heck do they\Npick out these few people Dialogue: 0,2:45:08.22,2:45:11.31,Default,,0000,0000,0000,,in the thousands of protesters? Dialogue: 0,2:45:11.31,2:45:13.30,Default,,0000,0000,0000,,I mean, one can only speculate, Dialogue: 0,2:45:13.30,2:45:14.70,Default,,0000,0000,0000,,but one of the few answers Dialogue: 0,2:45:14.70,2:45:17.10,Default,,0000,0000,0000,,can be facial recognition software. Dialogue: 0,2:45:17.10,2:45:18.06,Default,,0000,0000,0000,,Do we not have it? Dialogue: 0,2:45:18.06,2:45:19.20,Default,,0000,0000,0000,,Do they have it? Dialogue: 0,2:45:19.20,2:45:21.30,Default,,0000,0000,0000,,How can we show that they're not using it? Dialogue: 0,2:45:21.30,2:45:24.58,Default,,0000,0000,0000,,Well, it's pretty hard to pick\Nout two or three individuals Dialogue: 0,2:45:24.58,2:45:27.20,Default,,0000,0000,0000,,out of thousands and thousands Dialogue: 0,2:45:27.20,2:45:29.02,Default,,0000,0000,0000,,and thousands of protestors. Dialogue: 0,2:45:29.02,2:45:31.82,Default,,0000,0000,0000,,So it's up to us to kind\Nof speculate what that is. Dialogue: 0,2:45:31.82,2:45:35.25,Default,,0000,0000,0000,,The Coronavirus has caused a major epidemic. Dialogue: 0,2:45:35.25,2:45:38.50,Default,,0000,0000,0000,,Police brutality has\Ncaused a major epidemic. Dialogue: 0,2:45:38.50,2:45:40.55,Default,,0000,0000,0000,,I guess we have to ask ourselves, Dialogue: 0,2:45:40.55,2:45:44.01,Default,,0000,0000,0000,,are you allowing the\Nnext epidemic to happen? Dialogue: 0,2:45:44.01,2:45:44.84,Default,,0000,0000,0000,,Maybe. Dialogue: 0,2:45:45.68,2:45:47.81,Default,,0000,0000,0000,,Banning facial recognition\Nin the city of Boston Dialogue: 0,2:45:47.81,2:45:52.06,Default,,0000,0000,0000,,will bring us to be in the\Nmiddle of the next battle. Dialogue: 0,2:45:52.06,2:45:56.26,Default,,0000,0000,0000,,And which side do we wanna be on? Dialogue: 0,2:45:56.26,2:45:58.63,Default,,0000,0000,0000,,Let's not approve this software. Dialogue: 0,2:45:58.63,2:46:01.03,Default,,0000,0000,0000,,I urge you to ban facial\Nrecognition technology Dialogue: 0,2:46:01.03,2:46:03.27,Default,,0000,0000,0000,,as it also affects our youth. Dialogue: 0,2:46:03.27,2:46:07.73,Default,,0000,0000,0000,,We do not need a system or software Dialogue: 0,2:46:07.73,2:46:09.44,Default,,0000,0000,0000,,that can pick out individuals Dialogue: 0,2:46:09.44,2:46:11.34,Default,,0000,0000,0000,,and potentially erroneously Dialogue: 0,2:46:11.34,2:46:14.89,Default,,0000,0000,0000,,accuse them of something\Nthat they haven't done. Dialogue: 0,2:46:14.89,2:46:18.00,Default,,0000,0000,0000,,Will the next death be\Ncaused by your choice today? Dialogue: 0,2:46:18.00,2:46:19.79,Default,,0000,0000,0000,,Let's hope not, thank you. Dialogue: 0,2:46:19.79,2:46:22.10,Default,,0000,0000,0000,,Thank you, thank you very much. Dialogue: 0,2:46:22.10,2:46:23.97,Default,,0000,0000,0000,,It's 10 to six. Dialogue: 0,2:46:23.97,2:46:28.43,Default,,0000,0000,0000,,I'm going to have to stop\Nchairing at six o'clock.
Dialogue: 0,2:46:28.43,2:46:32.61,Default,,0000,0000,0000,,That only means that I will\Nbe turning over the gavel Dialogue: 0,2:46:32.61,2:46:35.38,Default,,0000,0000,0000,,to one of the lead sponsors, Dialogue: 0,2:46:35.38,2:46:36.54,Default,,0000,0000,0000,,Michelle Wu, at six o'clock. Dialogue: 0,2:46:37.52,2:46:39.26,Default,,0000,0000,0000,,But I'm gonna call the\Nnext names of individuals Dialogue: 0,2:46:39.26,2:46:43.12,Default,,0000,0000,0000,,who are in line and have\Nsigned up to speak before I go. Dialogue: 0,2:46:43.12,2:46:44.54,Default,,0000,0000,0000,,And Wu will continue to chair. Dialogue: 0,2:46:44.54,2:46:49.54,Default,,0000,0000,0000,,I've got Angel, Michelle\NRaj Mon, Leon Smith, Dialogue: 0,2:46:51.16,2:46:56.16,Default,,0000,0000,0000,,Amy van der Hiel, excuse me,\NJulie McNulty and Zachary Lawn. Dialogue: 0,2:47:00.06,2:47:01.82,Default,,0000,0000,0000,,So is Angel available? Dialogue: 0,2:47:02.73,2:47:03.56,Default,,0000,0000,0000,,ANGEL: Yes. Dialogue: 0,2:47:05.33,2:47:08.26,Default,,0000,0000,0000,,Hello everyone, thank\Nyou for being here today. Dialogue: 0,2:47:09.54,2:47:10.68,Default,,0000,0000,0000,,Thank you. Dialogue: 0,2:47:14.22,2:47:19.22,Default,,0000,0000,0000,,ANGEL: My name is Angel\Nand I am in first grade Dialogue: 0,2:47:19.41,2:47:24.41,Default,,0000,0000,0000,,and go to a Boston public school. Dialogue: 0,2:47:26.70,2:47:28.53,Default,,0000,0000,0000,,I am seven years old. Dialogue: 0,2:47:28.53,2:47:32.36,Default,,0000,0000,0000,,To me, surveillance means\Npeople watching you Dialogue: 0,2:47:32.36,2:47:34.50,Default,,0000,0000,0000,,without even knowing. Dialogue: 0,2:47:34.50,2:47:36.31,Default,,0000,0000,0000,,I would feel frustrated Dialogue: 0,2:47:36.31,2:47:41.31,Default,,0000,0000,0000,,because I do not want anyone watching me, Dialogue: 0,2:47:42.76,2:47:47.76,Default,,0000,0000,0000,,especially if I'm home,\Nat school or at a park. Dialogue: 0,2:47:48.78,2:47:53.78,Default,,0000,0000,0000,,Facial recognition is\Nwhen they see your face Dialogue: 0,2:47:54.20,2:47:58.81,Default,,0000,0000,0000,,and know all the details, Dialogue: 0,2:47:58.81,2:48:01.87,Default,,0000,0000,0000,,but this does not work for everyone Dialogue: 0,2:48:01.87,2:48:06.18,Default,,0000,0000,0000,,because most police do not like people Dialogue: 0,2:48:06.18,2:48:09.74,Default,,0000,0000,0000,,who are not their skin color. Dialogue: 0,2:48:09.74,2:48:13.25,Default,,0000,0000,0000,,This tool is not going to be equal Dialogue: 0,2:48:13.25,2:48:18.25,Default,,0000,0000,0000,,to people who are black,\NLatino, immigrants and more, Dialogue: 0,2:48:20.49,2:48:23.68,Default,,0000,0000,0000,,it can get the wrong person. Dialogue: 0,2:48:27.35,2:48:32.35,Default,,0000,0000,0000,,I am young, as I get older,\NI will not look the same. Dialogue: 0,2:48:33.32,2:48:38.32,Default,,0000,0000,0000,,I am seven now and when I turn\N13, I will look different. Dialogue: 0,2:48:38.65,2:48:42.25,Default,,0000,0000,0000,,When I turn 18, I will look different. Dialogue: 0,2:48:42.25,2:48:45.67,Default,,0000,0000,0000,,I get older and my face changes. Dialogue: 0,2:48:45.67,2:48:50.05,Default,,0000,0000,0000,,If these tools are used in\Nschool, I will not feel safe. Dialogue: 0,2:48:50.05,2:48:52.97,Default,,0000,0000,0000,,We are just children. Dialogue: 0,2:48:52.97,2:48:57.30,Default,,0000,0000,0000,,Our teachers are already\Ntaking care of us and watching us. Dialogue: 0,2:48:57.30,2:49:00.08,Default,,0000,0000,0000,,We do not need police watching us.
Dialogue: 0,2:49:00.08,2:49:03.96,Default,,0000,0000,0000,,Thank you, SIM, for your love and support. Dialogue: 0,2:49:07.41,2:49:08.25,Default,,0000,0000,0000,,Well, I have to say, Dialogue: 0,2:49:08.25,2:49:12.42,Default,,0000,0000,0000,,I think you're the youngest\Nperson that's testified ever. Dialogue: 0,2:49:12.42,2:49:15.58,Default,,0000,0000,0000,,And in anything I've ever chaired, Dialogue: 0,2:49:15.58,2:49:17.05,Default,,0000,0000,0000,,or had that participation, Dialogue: 0,2:49:17.05,2:49:18.29,Default,,0000,0000,0000,,I know we're all on Zoom, Dialogue: 0,2:49:18.29,2:49:21.12,Default,,0000,0000,0000,,but I will give you a round of applause. Dialogue: 0,2:49:21.12,2:49:22.55,Default,,0000,0000,0000,,Thank you so much. Dialogue: 0,2:49:22.55,2:49:25.80,Default,,0000,0000,0000,,That was absolutely beautiful,\Nvery powerful testimony. Dialogue: 0,2:49:25.80,2:49:28.43,Default,,0000,0000,0000,,We're so proud of you. Dialogue: 0,2:49:28.43,2:49:31.01,Default,,0000,0000,0000,,Your city council and\Ncity are so proud of you. Dialogue: 0,2:49:31.01,2:49:34.27,Default,,0000,0000,0000,,Thank you so much for\Nyour testimony, Angel. Dialogue: 0,2:49:34.27,2:49:38.53,Default,,0000,0000,0000,,I'm going to now call on, Dialogue: 0,2:49:38.53,2:49:41.22,Default,,0000,0000,0000,,I don't think I see Michelle or Leon Dialogue: 0,2:49:41.22,2:49:43.56,Default,,0000,0000,0000,,and I don't see them either. Dialogue: 0,2:49:45.02,2:49:49.02,Default,,0000,0000,0000,,Okay, so I see Amy, who\Nwe called earlier. Dialogue: 0,2:49:49.02,2:49:52.27,Default,,0000,0000,0000,,So Amy, you have two minutes Dialogue: 0,2:49:52.27,2:49:54.51,Default,,0000,0000,0000,,and you get to follow the\Ncutest seven-year-old ever. Dialogue: 0,2:49:54.51,2:49:56.70,Default,,0000,0000,0000,,[Lydia laughs] Dialogue: 0,2:49:56.70,2:49:57.53,Default,,0000,0000,0000,,Go ahead, Dialogue: 0,2:49:58.93,2:49:59.76,Default,,0000,0000,0000,,you're on mute. Dialogue: 0,2:50:02.12,2:50:05.13,Default,,0000,0000,0000,,Thank you, I applaud\Nher testimony entirely. Dialogue: 0,2:50:05.13,2:50:07.47,Default,,0000,0000,0000,,Thank you, madam chairwoman, and the council Dialogue: 0,2:50:07.47,2:50:10.01,Default,,0000,0000,0000,,for hearing our testimony,\NI'll be very brief. Dialogue: 0,2:50:10.01,2:50:13.67,Default,,0000,0000,0000,,I just wanted to say my\Nname is Amy van der Hiel. Dialogue: 0,2:50:13.67,2:50:15.14,Default,,0000,0000,0000,,I live in Roslindale Dialogue: 0,2:50:15.14,2:50:18.20,Default,,0000,0000,0000,,and I support the ban on\Nface surveillance, thank you. Dialogue: 0,2:50:19.10,2:50:21.04,Default,,0000,0000,0000,,Thank you very much, Michelle. Dialogue: 0,2:50:21.04,2:50:25.37,Default,,0000,0000,0000,,I have a Michelle Raj Mon and\NI see that Michelle Barrios. Dialogue: 0,2:50:25.37,2:50:26.20,Default,,0000,0000,0000,,Are you... Dialogue: 0,2:50:26.20,2:50:28.17,Default,,0000,0000,0000,,Okay, that was the mistake. Dialogue: 0,2:50:28.17,2:50:30.85,Default,,0000,0000,0000,,Very well then, Michelle,\Nyou have two minutes. Dialogue: 0,2:50:30.85,2:50:31.68,Default,,0000,0000,0000,,That's okay. Dialogue: 0,2:50:33.23,2:50:38.04,Default,,0000,0000,0000,,And also, following\NAngel is a bit much. Dialogue: 0,2:50:38.04,2:50:39.98,Default,,0000,0000,0000,,So hello. Dialogue: 0,2:50:39.98,2:50:41.61,Default,,0000,0000,0000,,My name is Michelle Barrios Dialogue: 0,2:50:41.61,2:50:44.55,Default,,0000,0000,0000,,and I am a social studies\Nteacher in the Boston area.
Dialogue: 0,2:50:44.55,2:50:46.63,Default,,0000,0000,0000,,And today I am here speaking on behalf Dialogue: 0,2:50:46.63,2:50:48.47,Default,,0000,0000,0000,,of the Student Immigrant Movement, Dialogue: 0,2:50:48.47,2:50:50.58,Default,,0000,0000,0000,,myself as an educator Dialogue: 0,2:50:50.58,2:50:54.05,Default,,0000,0000,0000,,and on behalf of my\Ndiverse student population. Dialogue: 0,2:50:54.05,2:50:56.58,Default,,0000,0000,0000,,I am speaking in support of the ordinance Dialogue: 0,2:50:56.58,2:50:59.44,Default,,0000,0000,0000,,banning facial recognition\Ntechnology in Boston, Dialogue: 0,2:50:59.44,2:51:02.17,Default,,0000,0000,0000,,presented by councilors Wu and Arroyo. Dialogue: 0,2:51:02.17,2:51:04.82,Default,,0000,0000,0000,,This ordinance establishes\Na ban on government use Dialogue: 0,2:51:04.82,2:51:07.72,Default,,0000,0000,0000,,of facial surveillance\Nin the city of Boston. Dialogue: 0,2:51:07.72,2:51:10.49,Default,,0000,0000,0000,,Boston must pass this\Nordinance to join Springfield, Dialogue: 0,2:51:10.49,2:51:14.05,Default,,0000,0000,0000,,Somerville, excuse me,\NSpringfield, Somerville, Dialogue: 0,2:51:14.05,2:51:16.68,Default,,0000,0000,0000,,Cambridge, Brookline and Northampton Dialogue: 0,2:51:16.68,2:51:19.93,Default,,0000,0000,0000,,in protecting racial justice,\Nprivacy and freedom of speech. Dialogue: 0,2:51:19.93,2:51:23.56,Default,,0000,0000,0000,,In my training as a\Nsocial studies teacher, Dialogue: 0,2:51:23.56,2:51:25.51,Default,,0000,0000,0000,,I became increasingly aware of the trauma Dialogue: 0,2:51:25.51,2:51:27.11,Default,,0000,0000,0000,,that students of color develop Dialogue: 0,2:51:27.99,2:51:30.04,Default,,0000,0000,0000,,as they grow in a society Dialogue: 0,2:51:30.04,2:51:32.68,Default,,0000,0000,0000,,that seeks to police them on\Nthe basis of their ethnicity, Dialogue: 0,2:51:32.68,2:51:35.18,Default,,0000,0000,0000,,economic status and place of residence. Dialogue: 0,2:51:35.18,2:51:38.45,Default,,0000,0000,0000,,This trauma is exacerbated\Nwhen a young person Dialogue: 0,2:51:38.45,2:51:40.01,Default,,0000,0000,0000,,comes from an immigrant family, Dialogue: 0,2:51:40.01,2:51:41.38,Default,,0000,0000,0000,,whose status in this country Dialogue: 0,2:51:41.38,2:51:44.15,Default,,0000,0000,0000,,places them at risk of family separation, Dialogue: 0,2:51:44.15,2:51:47.84,Default,,0000,0000,0000,,a loss of education and a\Nstripping of basic human rights. Dialogue: 0,2:51:47.84,2:51:51.13,Default,,0000,0000,0000,,We know that students struggle\Nto succeed academically Dialogue: 0,2:51:51.13,2:51:54.44,Default,,0000,0000,0000,,when they face these\Ndaily existential traumas, Dialogue: 0,2:51:54.44,2:51:57.66,Default,,0000,0000,0000,,which in essence is a\Nviolation of their rights Dialogue: 0,2:51:57.66,2:51:59.73,Default,,0000,0000,0000,,in the U.S. to pursue survival, Dialogue: 0,2:51:59.73,2:52:03.06,Default,,0000,0000,0000,,let alone social mobility\Nand a future of stability.
Dialogue: 0,2:52:03.06,2:52:05.75,Default,,0000,0000,0000,,In addition to my training\Nand work as an educator, Dialogue: 0,2:52:05.75,2:52:08.27,Default,,0000,0000,0000,,I am the wife of a Latino immigrant Dialogue: 0,2:52:08.27,2:52:11.66,Default,,0000,0000,0000,,and the mother of two\Nof my sister's children. Dialogue: 0,2:52:11.66,2:52:13.10,Default,,0000,0000,0000,,While I am a white Latina Dialogue: 0,2:52:13.10,2:52:15.17,Default,,0000,0000,0000,,and I can live under the\Npresumption of innocence Dialogue: 0,2:52:15.17,2:52:16.35,Default,,0000,0000,0000,,due to my white privilege, Dialogue: 0,2:52:16.35,2:52:19.09,Default,,0000,0000,0000,,I've witnessed firsthand what\Nit means to live in a society Dialogue: 0,2:52:19.09,2:52:22.44,Default,,0000,0000,0000,,that has not yet made good\Non its promise of equality, Dialogue: 0,2:52:22.44,2:52:24.38,Default,,0000,0000,0000,,liberty, and justice for all. Dialogue: 0,2:52:24.38,2:52:26.36,Default,,0000,0000,0000,,My husband is followed in stores, Dialogue: 0,2:52:26.36,2:52:28.81,Default,,0000,0000,0000,,teachers hesitate to\Nrelease our children to him Dialogue: 0,2:52:28.81,2:52:30.08,Default,,0000,0000,0000,,when they first meet him Dialogue: 0,2:52:30.08,2:52:32.17,Default,,0000,0000,0000,,and he has been asked for\Nhis immigration status Dialogue: 0,2:52:32.17,2:52:35.50,Default,,0000,0000,0000,,by customers in his place of work. Dialogue: 0,2:52:35.50,2:52:39.34,Default,,0000,0000,0000,,We are in the fortunate\Nposition to know that his life Dialogue: 0,2:52:39.34,2:52:41.70,Default,,0000,0000,0000,,and our family can count\Non his permanent residency Dialogue: 0,2:52:41.70,2:52:43.80,Default,,0000,0000,0000,,to keep us together for now. Dialogue: 0,2:52:43.80,2:52:47.10,Default,,0000,0000,0000,,However, I cannot say the\Nsame for many Latinx families Dialogue: 0,2:52:47.10,2:52:48.75,Default,,0000,0000,0000,,in the greater Boston area. Dialogue: 0,2:52:48.75,2:52:50.83,Default,,0000,0000,0000,,A ban on face surveillance technology Dialogue: 0,2:52:50.83,2:52:53.27,Default,,0000,0000,0000,,is critically important\Nin the city of Boston Dialogue: 0,2:52:53.27,2:52:56.46,Default,,0000,0000,0000,,because this surveillance\Nharms our privacy Dialogue: 0,2:52:56.46,2:52:58.00,Default,,0000,0000,0000,,and our freedom of speech, Dialogue: 0,2:52:58.00,2:53:01.12,Default,,0000,0000,0000,,a fundamental right in our constitution. Dialogue: 0,2:53:01.12,2:53:03.65,Default,,0000,0000,0000,,This type of surveillance\Nthreatens to create a world Dialogue: 0,2:53:03.65,2:53:05.68,Default,,0000,0000,0000,,where people are watched and identified Dialogue: 0,2:53:05.68,2:53:07.82,Default,,0000,0000,0000,,as they exercise their right to protest, Dialogue: 0,2:53:07.82,2:53:10.00,Default,,0000,0000,0000,,congregate at places of worship Dialogue: 0,2:53:10.00,2:53:12.73,Default,,0000,0000,0000,,and visit medical providers, Dialogue: 0,2:53:12.73,2:53:14.65,Default,,0000,0000,0000,,as they go about their daily lives.
Dialogue: 0,2:53:14.65,2:53:15.81,Default,,0000,0000,0000,,For my students of color, Dialogue: 0,2:53:15.81,2:53:18.22,Default,,0000,0000,0000,,specifically the young\Nmen growing up in a world Dialogue: 0,2:53:18.22,2:53:21.76,Default,,0000,0000,0000,,that still cannot fully accept\Nthat their lives matter, Dialogue: 0,2:53:26.27,2:53:28.87,Default,,0000,0000,0000,,face surveillance technology is not only Dialogue: 0,2:53:28.87,2:53:30.47,Default,,0000,0000,0000,,another threat to their lives, Dialogue: 0,2:53:30.47,2:53:32.53,Default,,0000,0000,0000,,but a response that says\Nto these young people, Dialogue: 0,2:53:32.53,2:53:34.59,Default,,0000,0000,0000,,you still don't matter. Dialogue: 0,2:53:34.59,2:53:37.06,Default,,0000,0000,0000,,I teach world history as well\Nas comparative government Dialogue: 0,2:53:37.06,2:53:40.38,Default,,0000,0000,0000,,and politics, and when I\Nteach about authoritarianism, Dialogue: 0,2:53:40.38,2:53:43.41,Default,,0000,0000,0000,,I teach about the impact\Non marginalized groups Dialogue: 0,2:53:43.41,2:53:46.16,Default,,0000,0000,0000,,in countries that are still\Nconsidered developing. Dialogue: 0,2:53:46.16,2:53:50.33,Default,,0000,0000,0000,,I teach that in these free countries Dialogue: 0,2:53:50.33,2:53:52.50,Default,,0000,0000,0000,,or in these recently free countries, Dialogue: 0,2:53:52.50,2:53:56.10,Default,,0000,0000,0000,,surveillance is one of\Nthe most common tools Dialogue: 0,2:53:56.10,2:53:58.18,Default,,0000,0000,0000,,to manage dissent. Dialogue: 0,2:53:58.18,2:54:01.69,Default,,0000,0000,0000,,And that is often followed\Nby state-sanctioned violence Dialogue: 0,2:54:01.69,2:54:05.01,Default,,0000,0000,0000,,as a means to silence any dissidents. Dialogue: 0,2:54:05.01,2:54:07.58,Default,,0000,0000,0000,,I urge each of you to\Nconsider the statement Dialogue: 0,2:54:07.58,2:54:10.50,Default,,0000,0000,0000,,that you\Nwould make by not banning Dialogue: 0,2:54:10.50,2:54:15.50,Default,,0000,0000,0000,,the use of facial surveillance\Ntechnology in Boston. Dialogue: 0,2:54:16.38,2:54:20.21,Default,,0000,0000,0000,,I urge each of you to imagine\Na 14-year-old student of color Dialogue: 0,2:54:20.21,2:54:22.97,Default,,0000,0000,0000,,or a Latino essential\Nworker in front of you, Dialogue: 0,2:54:22.97,2:54:25.21,Default,,0000,0000,0000,,asking if their lives matter. Dialogue: 0,2:54:25.21,2:54:26.97,Default,,0000,0000,0000,,What is your response? Dialogue: 0,2:54:26.97,2:54:29.26,Default,,0000,0000,0000,,I encourage you to consider that moment Dialogue: 0,2:54:29.26,2:54:32.24,Default,,0000,0000,0000,,and to press pause on the\Nuse of face surveillance Dialogue: 0,2:54:32.24,2:54:34.36,Default,,0000,0000,0000,,by government entities\Nin the city of Boston, Dialogue: 0,2:54:34.36,2:54:36.78,Default,,0000,0000,0000,,by supporting and\Npassing this crucial ban. Dialogue: 0,2:54:36.78,2:54:40.24,Default,,0000,0000,0000,,We cannot allow Boston\Nto adopt authoritarian, Dialogue: 0,2:54:40.24,2:54:43.31,Default,,0000,0000,0000,,unregulated, biased surveillance technology. Dialogue: 0,2:54:43.31,2:54:46.30,Default,,0000,0000,0000,,Thank you for your attention\Nand your consideration. Dialogue: 0,2:54:46.30,2:54:47.41,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,2:54:47.41,2:54:51.55,Default,,0000,0000,0000,,At this point, I'm going\Nto turn over the microphone Dialogue: 0,2:54:51.55,2:54:56.06,Default,,0000,0000,0000,,or the gavel if you will, to\Nmy colleague, Michelle Wu.
Dialogue: 0,2:54:56.06,2:54:59.57,Default,,0000,0000,0000,,I'll just say, up next, two\Nspeakers should be getting ready. Dialogue: 0,2:54:59.57,2:55:04.57,Default,,0000,0000,0000,,That's Julie McNulty, Zachary\NLawn, Christina Rivera, Dialogue: 0,2:55:06.70,2:55:08.29,Default,,0000,0000,0000,,and Joseph Fridman. Dialogue: 0,2:55:09.64,2:55:10.47,Default,,0000,0000,0000,,I wanna say thank you so\Nmuch to the lead sponsors Dialogue: 0,2:55:12.45,2:55:13.50,Default,,0000,0000,0000,,of this ordinance. Dialogue: 0,2:55:13.50,2:55:14.51,Default,,0000,0000,0000,,I will be planning to\Nhave a working session Dialogue: 0,2:55:14.51,2:55:16.35,Default,,0000,0000,0000,,as soon as possible. Dialogue: 0,2:55:16.35,2:55:18.90,Default,,0000,0000,0000,,I do understand the\Nurgency and I do believe Dialogue: 0,2:55:18.90,2:55:23.34,Default,,0000,0000,0000,,that there is a great\Nopportunity before us to lead Dialogue: 0,2:55:23.34,2:55:28.34,Default,,0000,0000,0000,,in a way that not only values the lives Dialogue: 0,2:55:29.26,2:55:33.61,Default,,0000,0000,0000,,of our black and brown residents, but in\Ngeneral, our civil liberties. Dialogue: 0,2:55:33.61,2:55:34.85,Default,,0000,0000,0000,,And so I want to thank again Dialogue: 0,2:55:34.85,2:55:36.67,Default,,0000,0000,0000,,the lead sponsors for their leadership. Dialogue: 0,2:55:36.67,2:55:38.80,Default,,0000,0000,0000,,I look forward to getting this done Dialogue: 0,2:55:38.80,2:55:41.35,Default,,0000,0000,0000,,and with that I'm going\Nto have to sign off Dialogue: 0,2:55:41.35,2:55:43.42,Default,,0000,0000,0000,,and I turn it over to Michelle Wu. Dialogue: 0,2:55:44.39,2:55:45.94,Default,,0000,0000,0000,,Thank you. Dialogue: 0,2:55:45.94,2:55:47.20,Default,,0000,0000,0000,,Thank you so much, madam chair. Dialogue: 0,2:55:47.20,2:55:49.54,Default,,0000,0000,0000,,We're very grateful for you. Dialogue: 0,2:55:51.20,2:55:53.08,Default,,0000,0000,0000,,Okay, next up was Julie. Dialogue: 0,2:55:55.16,2:55:59.29,Default,,0000,0000,0000,,And if not Julie, then Zachary. Dialogue: 0,2:56:06.45,2:56:11.21,Default,,0000,0000,0000,,And just to note, could\Ncouncil central staff Dialogue: 0,2:56:11.21,2:56:12.39,Default,,0000,0000,0000,,give me administrative rights, Dialogue: 0,2:56:12.39,2:56:14.96,Default,,0000,0000,0000,,so I can help search whether folks Dialogue: 0,2:56:14.96,2:56:17.06,Default,,0000,0000,0000,,are stuck in the waiting room Dialogue: 0,2:56:17.06,2:56:20.31,Default,,0000,0000,0000,,and transfer people over into the\Npanelists section, thank you. Dialogue: 0,2:56:23.16,2:56:27.26,Default,,0000,0000,0000,,So again, it was Julie,\NZachary, Christina Rivera, Dialogue: 0,2:56:27.26,2:56:31.84,Default,,0000,0000,0000,,Joseph Fridman, Danielle\NSamter, for the next few. Dialogue: 0,2:56:43.63,2:56:44.78,Default,,0000,0000,0000,,Christina, why don't you go ahead. Dialogue: 0,2:56:44.78,2:56:46.73,Default,,0000,0000,0000,,I see you now in the, in the main room. Dialogue: 0,2:56:48.28,2:56:49.28,Default,,0000,0000,0000,,Hi, yes. Dialogue: 0,2:56:49.28,2:56:50.32,Default,,0000,0000,0000,,Thank you for having me. As stated, Dialogue: 0,2:56:50.32,2:56:51.92,Default,,0000,0000,0000,,my name is Christina Rivera Dialogue: 0,2:56:51.92,2:56:54.29,Default,,0000,0000,0000,,and I am an East Boston resident. Dialogue: 0,2:56:54.29,2:56:57.24,Default,,0000,0000,0000,,I'm also the founder of\Nthe Latinx Coalition Dialogue: 0,2:56:57.24,2:56:58.90,Default,,0000,0000,0000,,for Black Lives Matter.
Dialogue: 0,2:56:58.90,2:57:01.79,Default,,0000,0000,0000,,It is a support organization Dialogue: 0,2:57:01.79,2:57:04.56,Default,,0000,0000,0000,,to help end racism within our community, Dialogue: 0,2:57:04.56,2:57:06.90,Default,,0000,0000,0000,,but also help advance the work Dialogue: 0,2:57:06.90,2:57:09.87,Default,,0000,0000,0000,,of the Black Lives Matter Movement. Dialogue: 0,2:57:09.87,2:57:11.48,Default,,0000,0000,0000,,I wanna say that we as an organization, Dialogue: 0,2:57:11.48,2:57:13.08,Default,,0000,0000,0000,,we fully support the ordinance Dialogue: 0,2:57:13.08,2:57:16.68,Default,,0000,0000,0000,,of a ban on facial recognition technology. Dialogue: 0,2:57:16.68,2:57:18.98,Default,,0000,0000,0000,,As stated all throughout\Nthis meeting earlier, Dialogue: 0,2:57:19.93,2:57:21.11,Default,,0000,0000,0000,,we know that automation bias exists Dialogue: 0,2:57:21.11,2:57:23.77,Default,,0000,0000,0000,,and that facial recognition technology Dialogue: 0,2:57:23.77,2:57:26.76,Default,,0000,0000,0000,,clearly shows bias towards\Nalready highly targeted Dialogue: 0,2:57:26.76,2:57:28.61,Default,,0000,0000,0000,,and hyper-policed communities, Dialogue: 0,2:57:28.61,2:57:30.63,Default,,0000,0000,0000,,specifically those of color. Dialogue: 0,2:57:30.63,2:57:33.18,Default,,0000,0000,0000,,And I just wanna pose the question, Dialogue: 0,2:57:33.18,2:57:35.12,Default,,0000,0000,0000,,how can we begin to trust, Dialogue: 0,2:57:35.12,2:57:37.60,Default,,0000,0000,0000,,even with the development\Nof algorithmic accuracy, Dialogue: 0,2:57:37.60,2:57:41.40,Default,,0000,0000,0000,,that those who will use it in the future Dialogue: 0,2:57:41.40,2:57:43.30,Default,,0000,0000,0000,,will not abuse its capabilities? Dialogue: 0,2:57:43.30,2:57:45.32,Default,,0000,0000,0000,,We already have a police force Dialogue: 0,2:57:45.32,2:57:47.11,Default,,0000,0000,0000,,that has shown time and time again, Dialogue: 0,2:57:47.11,2:57:49.96,Default,,0000,0000,0000,,that some of their own\Nprocedures are still abused Dialogue: 0,2:57:49.96,2:57:53.37,Default,,0000,0000,0000,,to this day and used by bad actors. Dialogue: 0,2:57:53.37,2:57:55.37,Default,,0000,0000,0000,,So adding a tool that can be used Dialogue: 0,2:57:55.37,2:57:57.00,Default,,0000,0000,0000,,to further affirm racial bias Dialogue: 0,2:57:57.00,2:57:59.91,Default,,0000,0000,0000,,only provides another route\Nto target our communities. Dialogue: 0,2:57:59.91,2:58:03.87,Default,,0000,0000,0000,,Using this technology also\Nenforces a policing system Dialogue: 0,2:58:03.87,2:58:05.83,Default,,0000,0000,0000,,that is based on control, Dialogue: 0,2:58:05.83,2:58:09.39,Default,,0000,0000,0000,,and that is a direct\Nsymptom of police mistrust Dialogue: 0,2:58:09.39,2:58:11.42,Default,,0000,0000,0000,,that already exists of their own citizens Dialogue: 0,2:58:11.42,2:58:13.70,Default,,0000,0000,0000,,and the ones that they serve. Dialogue: 0,2:58:13.70,2:58:16.34,Default,,0000,0000,0000,,And so, as some people earlier stated, Dialogue: 0,2:58:16.34,2:58:19.46,Default,,0000,0000,0000,,specifically Mr.
Daley's point about Dialogue: 0,2:58:19.46,2:58:24.28,Default,,0000,0000,0000,,what government agencies are\Nallowed to already use this Dialogue: 0,2:58:24.28,2:58:28.52,Default,,0000,0000,0000,,versus places that have enacted bans, Dialogue: 0,2:58:28.52,2:58:31.57,Default,,0000,0000,0000,,what is also going to\Nbe the legislative power Dialogue: 0,2:58:31.57,2:58:34.00,Default,,0000,0000,0000,,that any federal agency\Nwill be able to use it Dialogue: 0,2:58:34.00,2:58:35.60,Default,,0000,0000,0000,,in individual states, Dialogue: 0,2:58:35.60,2:58:39.07,Default,,0000,0000,0000,,specifically when it comes\Nto I.C.E. and immigration. Dialogue: 0,2:58:39.07,2:58:41.06,Default,,0000,0000,0000,,So this is something that\Nwe are looking to make sure Dialogue: 0,2:58:41.06,2:58:43.45,Default,,0000,0000,0000,,that it not only is banned now, Dialogue: 0,2:58:43.45,2:58:45.60,Default,,0000,0000,0000,,but also cannot be allowed in the future. Dialogue: 0,2:58:45.60,2:58:46.70,Default,,0000,0000,0000,,With that said, Dialogue: 0,2:58:46.70,2:58:48.16,Default,,0000,0000,0000,,thank you so much to all the panelists Dialogue: 0,2:58:48.16,2:58:50.26,Default,,0000,0000,0000,,who've shared all this great information. Dialogue: 0,2:58:50.26,2:58:52.91,Default,,0000,0000,0000,,Thank you so much to councilor\NWu and councilor Arroyo Dialogue: 0,2:58:52.91,2:58:56.26,Default,,0000,0000,0000,,for doing this work and I concede my time. Dialogue: 0,2:58:56.26,2:58:57.09,Default,,0000,0000,0000,,Thank you. Dialogue: 0,2:58:58.76,2:59:00.00,Default,,0000,0000,0000,,Thank you so much, Christina. Dialogue: 0,2:59:00.00,2:59:02.40,Default,,0000,0000,0000,,We appreciate you, Joseph. Dialogue: 0,2:59:06.03,2:59:07.43,Default,,0000,0000,0000,,Yes, thank you for your time. Dialogue: 0,2:59:07.43,2:59:08.52,Default,,0000,0000,0000,,I'd like to tell you a\Nlittle bit about the science Dialogue: 0,2:59:08.52,2:59:11.06,Default,,0000,0000,0000,,behind emotion detection technologies. Dialogue: 0,2:59:11.06,2:59:11.96,Default,,0000,0000,0000,,So these are technologies Dialogue: 0,2:59:11.96,2:59:14.48,Default,,0000,0000,0000,,that use information captured\Nfrom face surveillance Dialogue: 0,2:59:14.48,2:59:16.35,Default,,0000,0000,0000,,to supposedly recognize emotions. Dialogue: 0,2:59:16.35,2:59:18.56,Default,,0000,0000,0000,,So I'll be speaking to\Nyou as a private citizen, Dialogue: 0,2:59:18.56,2:59:19.88,Default,,0000,0000,0000,,but I'll tell you that\Nfor the past three years, Dialogue: 0,2:59:19.88,2:59:21.87,Default,,0000,0000,0000,,I've been working at a lab\Nat Northeastern University, Dialogue: 0,2:59:21.87,2:59:23.20,Default,,0000,0000,0000,,which studies emotions, Dialogue: 0,2:59:23.20,2:59:24.25,Default,,0000,0000,0000,,and I've been a research assistant Dialogue: 0,2:59:24.25,2:59:26.19,Default,,0000,0000,0000,,at the Harvard Kennedy School's\NBelfer Center for Science Dialogue: 0,2:59:26.19,2:59:27.41,Default,,0000,0000,0000,,and International Affairs, Dialogue: 0,2:59:27.41,2:59:29.43,Default,,0000,0000,0000,,thinking about the same\Ntype of technology. Dialogue: 0,2:59:29.43,2:59:31.44,Default,,0000,0000,0000,,At the Interdisciplinary\NAffective Science Lab Dialogue: 0,2:59:31.44,2:59:32.27,Default,,0000,0000,0000,,at Northeastern, Dialogue: 0,2:59:32.27,2:59:34.43,Default,,0000,0000,0000,,one of my bosses is Dr.\NLisa Feldman Barrett. Dialogue: 0,2:59:34.43,2:59:36.48,Default,,0000,0000,0000,,Dr.
Barrett has research\Nappointments at MGH Dialogue: 0,2:59:36.48,2:59:37.33,Default,,0000,0000,0000,,and Harvard Medical School. Dialogue: 0,2:59:37.33,2:59:39.61,Default,,0000,0000,0000,,She's the chief scientific\Nofficer at the MGH Center Dialogue: 0,2:59:39.61,2:59:41.10,Default,,0000,0000,0000,,for Law, Brain and Behavior. Dialogue: 0,2:59:41.10,2:59:41.93,Default,,0000,0000,0000,,She's the author of Dialogue: 0,2:59:41.93,2:59:44.04,Default,,0000,0000,0000,,over 230 peer-reviewed scientific papers, Dialogue: 0,2:59:44.04,2:59:45.43,Default,,0000,0000,0000,,and just finished a term as president Dialogue: 0,2:59:45.43,2:59:47.59,Default,,0000,0000,0000,,of the Association for\NPsychological Science, Dialogue: 0,2:59:47.59,2:59:49.52,Default,,0000,0000,0000,,where she represented tens\Nof thousands of scientists Dialogue: 0,2:59:49.52,2:59:50.76,Default,,0000,0000,0000,,around the world. Dialogue: 0,2:59:50.76,2:59:51.59,Default,,0000,0000,0000,,Three years ago, Dialogue: 0,2:59:51.59,2:59:54.06,Default,,0000,0000,0000,,this organization commissioned\Nher and four other experts, Dialogue: 0,2:59:54.06,2:59:56.19,Default,,0000,0000,0000,,psychologists, computer\Nscientists, and so on, Dialogue: 0,2:59:56.19,2:59:57.27,Default,,0000,0000,0000,,to do a major study Dialogue: 0,2:59:57.27,2:59:59.60,Default,,0000,0000,0000,,on whether people across\Nthe world express emotions Dialogue: 0,2:59:59.60,3:00:01.86,Default,,0000,0000,0000,,like anger, sadness,\Nfear, disgust, surprise, Dialogue: 0,3:00:01.86,3:00:04.96,Default,,0000,0000,0000,,and happiness with distinctive\Nmovements of the face. Dialogue: 0,3:00:04.96,3:00:06.03,Default,,0000,0000,0000,,So all of these scientists Dialogue: 0,3:00:06.03,3:00:07.29,Default,,0000,0000,0000,,started with different perspectives, Dialogue: 0,3:00:07.29,3:00:09.36,Default,,0000,0000,0000,,but they reviewed a\Nthousand research articles Dialogue: 0,3:00:09.36,3:00:12.10,Default,,0000,0000,0000,,and they came to a very simple\Nand very important consensus. Dialogue: 0,3:00:12.10,3:00:13.40,Default,,0000,0000,0000,,So the question is, Dialogue: 0,3:00:13.40,3:00:16.11,Default,,0000,0000,0000,,can you read emotions\Nreliably in a human face? Dialogue: 0,3:00:16.11,3:00:17.39,Default,,0000,0000,0000,,And the answer is no. Dialogue: 0,3:00:17.39,3:00:19.47,Default,,0000,0000,0000,,Here's an example from Dr. Barrett. Dialogue: 0,3:00:19.47,3:00:22.35,Default,,0000,0000,0000,,People living in Western\Ncultures scowl when they're angry Dialogue: 0,3:00:22.35,3:00:23.92,Default,,0000,0000,0000,,about 30% of the time. Dialogue: 0,3:00:23.92,3:00:26.19,Default,,0000,0000,0000,,This means that they make\Nother facial expressions, Dialogue: 0,3:00:26.19,3:00:29.87,Default,,0000,0000,0000,,frowns, wide-eyed gasps,\Nsmiles, about 70% of the time. Dialogue: 0,3:00:29.87,3:00:32.32,Default,,0000,0000,0000,,This is called low reliability. Dialogue: 0,3:00:32.32,3:00:35.07,Default,,0000,0000,0000,,People scowl when angry more than chance, Dialogue: 0,3:00:35.07,3:00:36.13,Default,,0000,0000,0000,,but they also scowl\Nwhen they're not angry, Dialogue: 0,3:00:36.13,3:00:38.65,Default,,0000,0000,0000,,if they're concentrating or if\Nthey have something like gas. Dialogue: 0,3:00:38.65,3:00:40.51,Default,,0000,0000,0000,,This is called low specificity. Dialogue: 0,3:00:40.51,3:00:42.73,Default,,0000,0000,0000,,So a scowl is not the expression of anger, Dialogue: 0,3:00:42.73,3:00:44.42,Default,,0000,0000,0000,,but it's one of many expressions of anger.
Dialogue: 0,3:00:44.42,3:00:47.07,Default,,0000,0000,0000,,And sometimes it doesn't\Nexpress anger at all. Dialogue: 0,3:00:47.07,3:00:49.67,Default,,0000,0000,0000,,In this paper, Dr.\NBarrett and her colleagues Dialogue: 0,3:00:49.67,3:00:52.48,Default,,0000,0000,0000,,showed that scowling and\Nanger, smiling and happiness, Dialogue: 0,3:00:52.48,3:00:53.49,Default,,0000,0000,0000,,frowning and sadness, Dialogue: 0,3:00:53.49,3:00:55.31,Default,,0000,0000,0000,,these are all Western stereotypes\Nof emotional expressions. Dialogue: 0,3:00:55.31,3:00:56.77,Default,,0000,0000,0000,,They reflect some common beliefs Dialogue: 0,3:00:56.77,3:00:58.26,Default,,0000,0000,0000,,about emotional expressions, Dialogue: 0,3:00:58.26,3:01:00.39,Default,,0000,0000,0000,,but these beliefs don't correspond Dialogue: 0,3:01:00.39,3:01:02.89,Default,,0000,0000,0000,,to how people actually move\Ntheir faces in real life. Dialogue: 0,3:01:02.89,3:01:03.72,Default,,0000,0000,0000,,And they don't generalize to cultures Dialogue: 0,3:01:04.92,3:01:06.61,Default,,0000,0000,0000,,that are different from ours. Dialogue: 0,3:01:06.61,3:01:07.99,Default,,0000,0000,0000,,So it's not possible for anyone Dialogue: 0,3:01:07.99,3:01:09.81,Default,,0000,0000,0000,,or anything to read an\Nemotion on our face. Dialogue: 0,3:01:09.81,3:01:11.55,Default,,0000,0000,0000,,And we shouldn't confidently\Ninfer happiness from a smile, Dialogue: 0,3:01:11.55,3:01:14.07,Default,,0000,0000,0000,,anger from a scowl or\Nsadness from a frown. Dialogue: 0,3:01:14.07,3:01:16.91,Default,,0000,0000,0000,,There are technologies\Nthat claim to do this, Dialogue: 0,3:01:16.91,3:01:18.99,Default,,0000,0000,0000,,and they're misrepresenting\Nwhat they can do Dialogue: 0,3:01:18.99,3:01:20.20,Default,,0000,0000,0000,,according to the best available evidence. Dialogue: 0,3:01:21.47,3:01:24.37,Default,,0000,0000,0000,,People aren't moving their faces randomly, Dialogue: 0,3:01:24.37,3:01:25.73,Default,,0000,0000,0000,,but there's a lot of variation, Dialogue: 0,3:01:25.73,3:01:27.81,Default,,0000,0000,0000,,and the meaning of a facial movement Dialogue: 0,3:01:27.81,3:01:30.19,Default,,0000,0000,0000,,depends on the person, on the\Ncontext and on culture. Dialogue: 0,3:01:30.19,3:01:31.81,Default,,0000,0000,0000,,So let me give you one example Dialogue: 0,3:01:31.81,3:01:34.67,Default,,0000,0000,0000,,of how making this assumption\Ncan be really harmful. Dialogue: 0,3:01:34.67,3:01:36.24,Default,,0000,0000,0000,,There's a scientist, Dr. Lauren Rhue, Dialogue: 0,3:01:36.24,3:01:38.56,Default,,0000,0000,0000,,when she was at Wake Forest University, Dialogue: 0,3:01:38.56,3:01:40.08,Default,,0000,0000,0000,,published work reviewing two popular Dialogue: 0,3:01:40.08,3:01:41.01,Default,,0000,0000,0000,,emotion detection algorithms. Dialogue: 0,3:01:41.01,3:01:43.01,Default,,0000,0000,0000,,She ran these algorithms Dialogue: 0,3:01:43.01,3:01:45.47,Default,,0000,0000,0000,,on portraits of white\Nand black NBA players. Dialogue: 0,3:01:45.47,3:01:47.31,Default,,0000,0000,0000,,And both of them consistently interpreted Dialogue: 0,3:01:47.31,3:01:49.38,Default,,0000,0000,0000,,the black players as having\Nmore negative emotions Dialogue: 0,3:01:49.38,3:01:50.83,Default,,0000,0000,0000,,than the white players.
Dialogue: 0,3:01:50.83,3:01:52.24,Default,,0000,0000,0000,,Imagine that there's a security guard Dialogue: 0,3:01:52.24,3:01:54.09,Default,,0000,0000,0000,,that gets information\Nfrom a surveillance camera Dialogue: 0,3:01:54.09,3:01:55.27,Default,,0000,0000,0000,,that there is a possible nuisance Dialogue: 0,3:01:55.27,3:01:57.13,Default,,0000,0000,0000,,in the lobby of their building Dialogue: 0,3:01:57.13,3:01:59.06,Default,,0000,0000,0000,,because of the facial movements\Nthat somebody is making. Dialogue: 0,3:01:59.06,3:02:00.66,Default,,0000,0000,0000,,Imagine a defendant in the courtroom Dialogue: 0,3:02:00.66,3:02:02.86,Default,,0000,0000,0000,,that is scowling as they\Nconcentrate on legal proceedings, Dialogue: 0,3:02:02.86,3:02:05.35,Default,,0000,0000,0000,,but an emotion AI based\Non facial surveillance Dialogue: 0,3:02:05.35,3:02:07.50,Default,,0000,0000,0000,,tells the jury that\Nthe defendant is angry. Dialogue: 0,3:02:07.50,3:02:08.91,Default,,0000,0000,0000,,Imagine that the defendant is black Dialogue: 0,3:02:08.91,3:02:10.92,Default,,0000,0000,0000,,and the AI says that they're contemptuous. Dialogue: 0,3:02:10.92,3:02:12.90,Default,,0000,0000,0000,,Imagine, at worst, a police officer Dialogue: 0,3:02:12.90,3:02:14.66,Default,,0000,0000,0000,,receiving an alert from a body camera Dialogue: 0,3:02:14.66,3:02:16.03,Default,,0000,0000,0000,,based on this technology Dialogue: 0,3:02:16.03,3:02:18.44,Default,,0000,0000,0000,,that says that someone in\Ntheir line of sight is a threat Dialogue: 0,3:02:18.44,3:02:21.41,Default,,0000,0000,0000,,based on a so-called\Nreading of their face. Dialogue: 0,3:02:21.41,3:02:24.22,Default,,0000,0000,0000,,Councilors, I think that the question Dialogue: 0,3:02:24.22,3:02:25.19,Default,,0000,0000,0000,,that we should be thinking about Dialogue: 0,3:02:25.19,3:02:27.85,Default,,0000,0000,0000,,is whether we want someone\Nto use this technology on us Dialogue: 0,3:02:27.85,3:02:31.13,Default,,0000,0000,0000,,with the chance that it\Nwould misinterpret our face Dialogue: 0,3:02:31.13,3:02:34.01,Default,,0000,0000,0000,,given the major chance\Nthat it has of being wrong. Dialogue: 0,3:02:34.01,3:02:35.96,Default,,0000,0000,0000,,And I think the clear\Nanswer to that is no. Dialogue: 0,3:02:35.96,3:02:38.22,Default,,0000,0000,0000,,We should ban facial\Nsurveillance technologies Dialogue: 0,3:02:38.22,3:02:41.49,Default,,0000,0000,0000,,and all of these other\Ndangerous applications. Dialogue: 0,3:02:41.49,3:02:42.76,Default,,0000,0000,0000,,Thank you. Dialogue: 0,3:02:42.76,3:02:43.86,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,3:02:43.86,3:02:45.99,Default,,0000,0000,0000,,Next up is Danielle and\NDanielle will be followed Dialogue: 0,3:02:45.99,3:02:48.70,Default,,0000,0000,0000,,by DJ Hatfield and Denise Horn. Dialogue: 0,3:02:50.98,3:02:53.11,Default,,0000,0000,0000,,Hi, good afternoon everybody. Dialogue: 0,3:02:53.11,3:02:55.07,Default,,0000,0000,0000,,My name is Danielle,\NI'm a Boston resident, Dialogue: 0,3:02:55.07,3:02:57.45,Default,,0000,0000,0000,,I live in the North End. Dialogue: 0,3:02:57.45,3:03:00.98,Default,,0000,0000,0000,,I just wanted to fully support Dialogue: 0,3:03:00.98,3:03:03.93,Default,,0000,0000,0000,,and put my voice behind the ban Dialogue: 0,3:03:03.93,3:03:07.92,Default,,0000,0000,0000,,for facial recognition technology.
Dialogue: 0,3:03:07.92,3:03:10.90,Default,,0000,0000,0000,,I feel as though the expert witnesses Dialogue: 0,3:03:10.90,3:03:13.21,Default,,0000,0000,0000,,and activists that have gone before me, Dialogue: 0,3:03:13.21,3:03:17.63,Default,,0000,0000,0000,,have clearly and\Neloquently expressed the views Dialogue: 0,3:03:17.63,3:03:19.23,Default,,0000,0000,0000,,that I also share. Dialogue: 0,3:03:19.23,3:03:22.86,Default,,0000,0000,0000,,I find this technology to\Nbe incredibly dangerous. Dialogue: 0,3:03:22.86,3:03:25.62,Default,,0000,0000,0000,,As a cybersecurity professional, Dialogue: 0,3:03:25.62,3:03:27.39,Default,,0000,0000,0000,,I have a lot of questions about Dialogue: 0,3:03:27.39,3:03:32.35,Default,,0000,0000,0000,,what it means to be\Ncollecting data on people Dialogue: 0,3:03:32.35,3:03:33.28,Default,,0000,0000,0000,,and storing it. Dialogue: 0,3:03:34.21,3:03:38.11,Default,,0000,0000,0000,,I certainly think that this is a measure Dialogue: 0,3:03:38.11,3:03:40.43,Default,,0000,0000,0000,,that should be taken to be a precedent Dialogue: 0,3:03:40.43,3:03:43.46,Default,,0000,0000,0000,,for all future developments\Nof the technology Dialogue: 0,3:03:43.46,3:03:44.75,Default,,0000,0000,0000,,as was just stated. Dialogue: 0,3:03:44.75,3:03:48.27,Default,,0000,0000,0000,,There are upcoming and new ways Dialogue: 0,3:03:48.27,3:03:52.49,Default,,0000,0000,0000,,to apply facial recognition,\Nthrough emotions or otherwise, Dialogue: 0,3:03:52.49,3:03:56.16,Default,,0000,0000,0000,,that we can't even comprehend\Nyet because it hasn't started. Dialogue: 0,3:03:56.16,3:03:59.55,Default,,0000,0000,0000,,So I wanted to support it and\Nthank all the other people Dialogue: 0,3:03:59.55,3:04:01.29,Default,,0000,0000,0000,,for putting their voice behind it Dialogue: 0,3:04:01.29,3:04:04.48,Default,,0000,0000,0000,,and hope to see it take place, thank you. Dialogue: 0,3:04:06.16,3:04:07.57,Default,,0000,0000,0000,,Thank you very much, DJ. Dialogue: 0,3:04:15.26,3:04:17.86,Default,,0000,0000,0000,,Hi, I will start the video. Dialogue: 0,3:04:17.86,3:04:19.69,Default,,0000,0000,0000,,So I'm DJ Hatfield. Dialogue: 0,3:04:19.69,3:04:23.35,Default,,0000,0000,0000,,I'm a professor of\Nanthropology and history Dialogue: 0,3:04:23.35,3:04:25.87,Default,,0000,0000,0000,,at the Berklee College\Nof Music in Boston. Dialogue: 0,3:04:25.87,3:04:30.86,Default,,0000,0000,0000,,However, what I have to\Nsay is as a private citizen Dialogue: 0,3:04:30.86,3:04:33.35,Default,,0000,0000,0000,,and a resident of East Boston, Dialogue: 0,3:04:33.35,3:04:37.51,Default,,0000,0000,0000,,that's a shout out to\NLydia, East Boston pride. Dialogue: 0,3:04:37.51,3:04:42.51,Default,,0000,0000,0000,,Yes, as an anthropologist I'm\Naware that all technologies, Dialogue: 0,3:04:42.75,3:04:44.81,Default,,0000,0000,0000,,including digital technologies, Dialogue: 0,3:04:44.81,3:04:47.10,Default,,0000,0000,0000,,reflect and augment the social structures Dialogue: 0,3:04:47.10,3:04:50.07,Default,,0000,0000,0000,,and systems of value of\Nparticular societies. Dialogue: 0,3:04:50.07,3:04:51.87,Default,,0000,0000,0000,,They are not objective, Dialogue: 0,3:04:51.87,3:04:54.37,Default,,0000,0000,0000,,but reflect the biases of those societies Dialogue: 0,3:04:54.37,3:04:56.79,Default,,0000,0000,0000,,as they develop through history.
Dialogue: 0,3:04:56.79,3:04:59.47,Default,,0000,0000,0000,,Moreover, if we look at societies Dialogue: 0,3:04:59.47,3:05:02.49,Default,,0000,0000,0000,,where facial recognition\Nhas been widely deployed, Dialogue: 0,3:05:02.49,3:05:04.63,Default,,0000,0000,0000,,such as the People's Republic of China, Dialogue: 0,3:05:04.63,3:05:07.78,Default,,0000,0000,0000,,we see very clearly how these technologies Dialogue: 0,3:05:07.78,3:05:10.56,Default,,0000,0000,0000,,have been used systematically Dialogue: 0,3:05:10.56,3:05:13.06,Default,,0000,0000,0000,,to target marginal populations. Dialogue: 0,3:05:14.59,3:05:16.80,Default,,0000,0000,0000,,This is a quality of the way Dialogue: 0,3:05:16.80,3:05:21.13,Default,,0000,0000,0000,,that these technologies\Nintersect with social systems Dialogue: 0,3:05:21.13,3:05:23.14,Default,,0000,0000,0000,,that act against people of color Dialogue: 0,3:05:23.14,3:05:25.36,Default,,0000,0000,0000,,and indigenous people, globally. Dialogue: 0,3:05:26.43,3:05:28.44,Default,,0000,0000,0000,,Therefore, in the city of Boston, Dialogue: 0,3:05:28.44,3:05:33.03,Default,,0000,0000,0000,,where we take pride in our civic freedoms, Dialogue: 0,3:05:33.03,3:05:36.59,Default,,0000,0000,0000,,we should be wary of technologies Dialogue: 0,3:05:36.59,3:05:39.92,Default,,0000,0000,0000,,that would make our Fourth\NAmendment meaningless Dialogue: 0,3:05:39.92,3:05:41.72,Default,,0000,0000,0000,,and would erode the First Amendment. Dialogue: 0,3:05:42.60,3:05:45.15,Default,,0000,0000,0000,,Therefore, I would urge the city council Dialogue: 0,3:05:45.15,3:05:50.15,Default,,0000,0000,0000,,and the city of Boston\Nto pass a complete ban Dialogue: 0,3:05:50.52,3:05:54.97,Default,,0000,0000,0000,,on facial recognition\Nsystems and technology, Dialogue: 0,3:05:54.97,3:05:58.63,Default,,0000,0000,0000,,with the amendments that have\Nbeen proposed by experts, Dialogue: 0,3:05:58.63,3:06:01.69,Default,,0000,0000,0000,,such as those of the\NElectronic Frontier Foundation, Dialogue: 0,3:06:01.69,3:06:06.68,Default,,0000,0000,0000,,and the Algorithmic Justice League, Dialogue: 0,3:06:06.68,3:06:09.08,Default,,0000,0000,0000,,who have said there should be no exception Dialogue: 0,3:06:09.08,3:06:12.90,Default,,0000,0000,0000,,for the Boston Police Department\Nto use facial recognition Dialogue: 0,3:06:12.90,3:06:17.90,Default,,0000,0000,0000,,or surveillance systems in evidence, Dialogue: 0,3:06:17.96,3:06:20.97,Default,,0000,0000,0000,,and that there should be strict rules Dialogue: 0,3:06:20.97,3:06:24.42,Default,,0000,0000,0000,,for enforcement of breaches of this ban. Dialogue: 0,3:06:24.42,3:06:28.57,Default,,0000,0000,0000,,And finally, if possible,\Nextending this ban Dialogue: 0,3:06:28.57,3:06:31.38,Default,,0000,0000,0000,,to all private actors who have business Dialogue: 0,3:06:31.38,3:06:32.88,Default,,0000,0000,0000,,in the city of Boston. Dialogue: 0,3:06:32.88,3:06:34.35,Default,,0000,0000,0000,,Thank you for your consideration Dialogue: 0,3:06:34.35,3:06:35.96,Default,,0000,0000,0000,,and thank you for all your time today. Dialogue: 0,3:06:35.96,3:06:38.63,Default,,0000,0000,0000,,I appreciate being able to join Dialogue: 0,3:06:38.63,3:06:40.97,Default,,0000,0000,0000,,in my very first public hearing. Dialogue: 0,3:06:40.97,3:06:41.80,Default,,0000,0000,0000,,Thank you. Dialogue: 0,3:06:41.80,3:06:44.59,Default,,0000,0000,0000,,We are excited to have you at our hearing Dialogue: 0,3:06:44.59,3:06:47.29,Default,,0000,0000,0000,,and hope you'll come back, Denise next.
Dialogue: 0,3:06:47.29,3:06:48.88,Default,,0000,0000,0000,,And then after Denise, Dialogue: 0,3:06:48.88,3:06:50.64,Default,,0000,0000,0000,,there are a few names signed up Dialogue: 0,3:06:50.64,3:06:52.41,Default,,0000,0000,0000,,that I didn't see in the room. Dialogue: 0,3:06:52.41,3:06:55.20,Default,,0000,0000,0000,,So I'm gonna call out folks Dialogue: 0,3:06:55.20,3:06:57.35,Default,,0000,0000,0000,,that I was able to cross\Nreference between the waiting room Dialogue: 0,3:06:57.35,3:07:00.50,Default,,0000,0000,0000,,and the list, which was Charles Griswold Dialogue: 0,3:07:00.50,3:07:02.24,Default,,0000,0000,0000,,and Nora Paul-Schultz. Dialogue: 0,3:07:02.24,3:07:03.20,Default,,0000,0000,0000,,And then after that, Dialogue: 0,3:07:03.20,3:07:04.75,Default,,0000,0000,0000,,I'm gonna try to admit everyone else Dialogue: 0,3:07:04.75,3:07:06.80,Default,,0000,0000,0000,,who's in the room and go in order, Dialogue: 0,3:07:06.80,3:07:09.90,Default,,0000,0000,0000,,even if you weren't on the original list. Dialogue: 0,3:07:09.90,3:07:11.01,Default,,0000,0000,0000,,So Denise, please. Dialogue: 0,3:07:12.67,3:07:13.61,Default,,0000,0000,0000,,Thank you. Dialogue: 0,3:07:13.61,3:07:14.98,Default,,0000,0000,0000,,Thank you members of the council Dialogue: 0,3:07:14.98,3:07:16.90,Default,,0000,0000,0000,,and the lead sponsors of the ordinance Dialogue: 0,3:07:16.90,3:07:19.19,Default,,0000,0000,0000,,and the panelists, really\Nexcellent commentary. Dialogue: 0,3:07:19.19,3:07:21.36,Default,,0000,0000,0000,,And I just have to say how impressed I am Dialogue: 0,3:07:21.36,3:07:25.02,Default,,0000,0000,0000,,with the Student Immigrant\NMovement for showing up today, Dialogue: 0,3:07:25.02,3:07:26.61,Default,,0000,0000,0000,,this is really impressive. Dialogue: 0,3:07:26.61,3:07:30.15,Default,,0000,0000,0000,,My name is Denise Horn, I'm\Na resident of Jamaica Plain. Dialogue: 0,3:07:30.15,3:07:32.73,Default,,0000,0000,0000,,I'm also a professor of political science Dialogue: 0,3:07:32.73,3:07:35.43,Default,,0000,0000,0000,,and international relations\Nat Simmons University. Dialogue: 0,3:07:35.43,3:07:37.64,Default,,0000,0000,0000,,Although I am not representing\Nmy institution today. Dialogue: 0,3:07:37.64,3:07:40.79,Default,,0000,0000,0000,,And I want to say that I support the ban Dialogue: 0,3:07:40.79,3:07:42.73,Default,,0000,0000,0000,,against the use of facial surveillance. Dialogue: 0,3:07:42.73,3:07:45.30,Default,,0000,0000,0000,,So, like professor Hatfield, Dialogue: 0,3:07:45.30,3:07:49.07,Default,,0000,0000,0000,,I am a scholar of international relations, Dialogue: 0,3:07:49.07,3:07:51.67,Default,,0000,0000,0000,,I have studied authoritarian\Nregimes in Eastern Europe Dialogue: 0,3:07:51.67,3:07:55.02,Default,,0000,0000,0000,,and East and Southeast Asia, Dialogue: 0,3:07:55.02,3:07:57.57,Default,,0000,0000,0000,,where surveillance technologies\Nof every kind have been Dialogue: 0,3:07:57.57,3:08:00.62,Default,,0000,0000,0000,,and are used to target\Nenemies of the state. Dialogue: 0,3:08:00.62,3:08:03.43,Default,,0000,0000,0000,,In the United States we're\Nalready seeing this language Dialogue: 0,3:08:03.43,3:08:05.01,Default,,0000,0000,0000,,in use by the leaders in our government, Dialogue: 0,3:08:05.01,3:08:06.24,Default,,0000,0000,0000,,regarding immigrants and activists, Dialogue: 0,3:08:06.24,3:08:09.53,Default,,0000,0000,0000,,and especially those who\Nare opposed to fascism.
Dialogue: 0,3:08:09.53,3:08:11.77,Default,,0000,0000,0000,,I do not think that we\Nshould underestimate Dialogue: 0,3:08:11.77,3:08:14.37,Default,,0000,0000,0000,,how this type of technology could be used Dialogue: 0,3:08:14.37,3:08:17.73,Default,,0000,0000,0000,,to further racist and\Nauthoritarian agendas here. Dialogue: 0,3:08:17.73,3:08:21.25,Default,,0000,0000,0000,,Facial surveillance\Nis used to oppress Dialogue: 0,3:08:21.25,3:08:24.17,Default,,0000,0000,0000,,and control citizens under\Nthe guise of public safety. Dialogue: 0,3:08:24.17,3:08:25.47,Default,,0000,0000,0000,,This is a slippery slope. Dialogue: 0,3:08:25.47,3:08:28.89,Default,,0000,0000,0000,,Government surveillance\Nis used to silence, Dialogue: 0,3:08:28.89,3:08:32.32,Default,,0000,0000,0000,,to suppress speech and\Nto violate human rights. Dialogue: 0,3:08:32.32,3:08:34.68,Default,,0000,0000,0000,,As an educator, I worry about Dialogue: 0,3:08:34.68,3:08:37.63,Default,,0000,0000,0000,,how this type of technology will be used Dialogue: 0,3:08:37.63,3:08:40.91,Default,,0000,0000,0000,,to adversely affect the lives\Nof my students of color, Dialogue: 0,3:08:40.91,3:08:43.64,Default,,0000,0000,0000,,trans students, undocumented\Nstudents, activists Dialogue: 0,3:08:43.64,3:08:47.07,Default,,0000,0000,0000,,and marginalized groups,\Ntheir lives matter. Dialogue: 0,3:08:47.07,3:08:48.52,Default,,0000,0000,0000,,I strongly support this ban Dialogue: 0,3:08:48.52,3:08:50.48,Default,,0000,0000,0000,,and I thank you for letting me speak. Dialogue: 0,3:08:52.38,3:08:54.90,Default,,0000,0000,0000,,Thank you, Denise, Charles. Dialogue: 0,3:08:55.83,3:09:00.83,Default,,0000,0000,0000,,Yep, so I'm trying to start\Nthe video here, there we are. Dialogue: 0,3:09:02.83,3:09:04.68,Default,,0000,0000,0000,,Can you see me and hear me? Dialogue: 0,3:09:05.65,3:09:08.42,Default,,0000,0000,0000,,Yes we can.\NOkay, thank you. Dialogue: 0,3:09:08.42,3:09:11.54,Default,,0000,0000,0000,,My name is Charles Griswold,\NI'm a resident of Brighton Dialogue: 0,3:09:11.54,3:09:15.72,Default,,0000,0000,0000,,and I'm a philosophy professor\Nat Boston University. Dialogue: 0,3:09:15.72,3:09:18.87,Default,,0000,0000,0000,,I hasten to add that I'm\Noffering my thoughts here Dialogue: 0,3:09:18.87,3:09:21.22,Default,,0000,0000,0000,,and in offering my thoughts, Dialogue: 0,3:09:21.22,3:09:23.32,Default,,0000,0000,0000,,I am not speaking for my university Dialogue: 0,3:09:23.32,3:09:24.83,Default,,0000,0000,0000,,and not acting on its behalf. Dialogue: 0,3:09:24.83,3:09:26.71,Default,,0000,0000,0000,,So I'm speaking on my own behalf only. Dialogue: 0,3:09:26.71,3:09:29.64,Default,,0000,0000,0000,,I'm very grateful to all\Nof you on the council Dialogue: 0,3:09:29.64,3:09:31.77,Default,,0000,0000,0000,,for taking action on this matter Dialogue: 0,3:09:31.77,3:09:34.74,Default,,0000,0000,0000,,and to the police commissioner\Nfor his impressive Dialogue: 0,3:09:34.74,3:09:36.53,Default,,0000,0000,0000,,and in some ways reassuring testimony. Dialogue: 0,3:09:36.53,3:09:39.37,Default,,0000,0000,0000,,It's imperative I think, Dialogue: 0,3:09:39.37,3:09:43.39,Default,,0000,0000,0000,,that you approve this ban on\Nfacial recognition technology, Dialogue: 0,3:09:43.39,3:09:47.66,Default,,0000,0000,0000,,or rather technologies, as we've\Nlearned today, by government Dialogue: 0,3:09:47.66,3:09:50.65,Default,,0000,0000,0000,,before the relevant software is upgraded.
Dialogue: 0,3:09:51.52,3:09:52.69,Default,,0000,0000,0000,,So far as I can tell, Dialogue: 0,3:09:52.69,3:09:56.71,Default,,0000,0000,0000,,three arguments for the ban\Nhave been discussed today. Dialogue: 0,3:09:56.71,3:10:00.05,Default,,0000,0000,0000,,The first two extensively\Nand the third a bit less so Dialogue: 0,3:10:00.05,3:10:02.99,Default,,0000,0000,0000,,and taken together, they\Nseem to me to be decisive Dialogue: 0,3:10:02.99,3:10:06.12,Default,,0000,0000,0000,,in making the case for the ban. Dialogue: 0,3:10:06.12,3:10:07.28,Default,,0000,0000,0000,,The first is of course Dialogue: 0,3:10:07.28,3:10:10.00,Default,,0000,0000,0000,,the matter of racial and gender justice, Dialogue: 0,3:10:10.00,3:10:12.09,Default,,0000,0000,0000,,on account of the fact that the technology Dialogue: 0,3:10:12.09,3:10:14.37,Default,,0000,0000,0000,,is known to be inaccurate. Dialogue: 0,3:10:14.37,3:10:18.40,Default,,0000,0000,0000,,But secondly, there is the\Nfact that the technology Dialogue: 0,3:10:18.40,3:10:19.51,Default,,0000,0000,0000,,is objectionable. Dialogue: 0,3:10:19.51,3:10:21.16,Default,,0000,0000,0000,,Not just because it's not accurate, Dialogue: 0,3:10:21.16,3:10:23.25,Default,,0000,0000,0000,,but also when it is accurate, Dialogue: 0,3:10:23.25,3:10:26.96,Default,,0000,0000,0000,,as Kade Crockford has\Neloquently argued today. Dialogue: 0,3:10:26.96,3:10:28.86,Default,,0000,0000,0000,,And that seems to me\Nto be an essential part Dialogue: 0,3:10:28.86,3:10:32.63,Default,,0000,0000,0000,,of the response to the\Ncommissioner's remarks. Dialogue: 0,3:10:32.63,3:10:36.99,Default,,0000,0000,0000,,In sum, the technology puts\Ntremendous additional power Dialogue: 0,3:10:36.99,3:10:39.41,Default,,0000,0000,0000,,in the hands of government and its agents, Dialogue: 0,3:10:39.41,3:10:42.12,Default,,0000,0000,0000,,all the more so when it actually does work Dialogue: 0,3:10:42.12,3:10:44.40,Default,,0000,0000,0000,,and our privacy among other things Dialogue: 0,3:10:44.40,3:10:46.75,Default,,0000,0000,0000,,is certainly threatened by that. Dialogue: 0,3:10:46.75,3:10:49.09,Default,,0000,0000,0000,,Where could that lead? Look no further, Dialogue: 0,3:10:49.09,3:10:52.12,Default,,0000,0000,0000,,as people have suggested, than the nightmare Dialogue: 0,3:10:52.12,3:10:55.67,Default,,0000,0000,0000,,that is the surveillance\Napparatus in China today. Dialogue: 0,3:10:55.67,3:10:58.89,Default,,0000,0000,0000,,The third point has been mentioned, Dialogue: 0,3:10:58.89,3:11:00.45,Default,,0000,0000,0000,,but it was less emphasized. Dialogue: 0,3:11:00.45,3:11:02.02,Default,,0000,0000,0000,,And I think it's just worth underlining, Dialogue: 0,3:11:02.02,3:11:04.00,Default,,0000,0000,0000,,which is that the awareness Dialogue: 0,3:11:04.00,3:11:07.89,Default,,0000,0000,0000,,of being personally tracked\Noutside of one's home Dialogue: 0,3:11:07.89,3:11:09.41,Default,,0000,0000,0000,,by this technology can, Dialogue: 0,3:11:09.41,3:11:11.66,Default,,0000,0000,0000,,and I think will have a chilling effect Dialogue: 0,3:11:11.66,3:11:14.15,Default,,0000,0000,0000,,on our exercise of our civil rights, Dialogue: 0,3:11:14.15,3:11:16.26,Default,,0000,0000,0000,,including our rights of free speech, Dialogue: 0,3:11:16.26,3:11:18.85,Default,,0000,0000,0000,,freedom of association and of religion.
Dialogue: 0,3:11:18.85,3:11:23.24,Default,,0000,0000,0000,,So I strongly urge the\Ncouncil to pass this ban, Dialogue: 0,3:11:23.24,3:11:25.71,Default,,0000,0000,0000,,but I also ask that you\Nwill not stop there, Dialogue: 0,3:11:25.71,3:11:28.74,Default,,0000,0000,0000,,expand the ban even further, Dialogue: 0,3:11:28.74,3:11:31.71,Default,,0000,0000,0000,,going forward to include other kinds Dialogue: 0,3:11:31.71,3:11:33.30,Default,,0000,0000,0000,,of remote surveillance systems, Dialogue: 0,3:11:33.30,3:11:35.07,Default,,0000,0000,0000,,such as those that are biometric, for example, Dialogue: 0,3:11:35.07,3:11:36.95,Default,,0000,0000,0000,,gait and voice recognition systems, Dialogue: 0,3:11:36.95,3:11:41.21,Default,,0000,0000,0000,,and also to the private\Nsector, as has been suggested. Dialogue: 0,3:11:41.21,3:11:46.14,Default,,0000,0000,0000,,So I hope that you will consider\Nexpeditiously passing this, Dialogue: 0,3:11:46.14,3:11:48.27,Default,,0000,0000,0000,,but also going further. Dialogue: 0,3:11:48.27,3:11:49.80,Default,,0000,0000,0000,,Thank you so much for listening. Dialogue: 0,3:11:49.80,3:11:51.92,Default,,0000,0000,0000,,Thank you very much, Nora. Dialogue: 0,3:11:53.90,3:11:56.06,Default,,0000,0000,0000,,Good evening, my name\Nis Nora Paul-Schultz, Dialogue: 0,3:11:56.06,3:11:58.77,Default,,0000,0000,0000,,and I am a physics teacher\Nin Boston Public Schools. Dialogue: 0,3:11:58.77,3:12:01.94,Default,,0000,0000,0000,,And I've had the pleasure\Nof teaching Eli and Esmalda Dialogue: 0,3:12:01.94,3:12:03.96,Default,,0000,0000,0000,,who spoke earlier and\Nhopefully one day Angel. Dialogue: 0,3:12:03.96,3:12:06.53,Default,,0000,0000,0000,,And I'm a resident of Jamaica Plain. Dialogue: 0,3:12:06.53,3:12:09.30,Default,,0000,0000,0000,,As someone who studied\Nengineering when I was in college, Dialogue: 0,3:12:09.30,3:12:10.48,Default,,0000,0000,0000,,has taught engineering Dialogue: 0,3:12:10.48,3:12:13.08,Default,,0000,0000,0000,,and is one of the robotics\Ncoaches at The O'Bryant, Dialogue: 0,3:12:13.08,3:12:14.76,Default,,0000,0000,0000,,I understand the impulse to feel like Dialogue: 0,3:12:14.76,3:12:17.36,Default,,0000,0000,0000,,technology can solve all of our problems. Dialogue: 0,3:12:17.36,3:12:20.45,Default,,0000,0000,0000,,I know that for\Nmany there is a desire Dialogue: 0,3:12:20.45,3:12:23.58,Default,,0000,0000,0000,,that if we have better,\Nsmarter and more technology Dialogue: 0,3:12:23.58,3:12:26.10,Default,,0000,0000,0000,,then our communities will be safer, Dialogue: 0,3:12:26.10,3:12:28.76,Default,,0000,0000,0000,,but the reality is that\Ntechnology is not a perfect, Dialogue: 0,3:12:28.76,3:12:31.08,Default,,0000,0000,0000,,unbiased tool, Dialogue: 0,3:12:31.08,3:12:33.41,Default,,0000,0000,0000,,as much as we wish it would be. Dialogue: 0,3:12:33.41,3:12:36.79,Default,,0000,0000,0000,,Our technologies reflect the\Ninherent biases of their makers. Dialogue: 0,3:12:36.79,3:12:39.95,Default,,0000,0000,0000,,So technology isn't going to save us. Dialogue: 0,3:12:39.95,3:12:44.65,Default,,0000,0000,0000,,It does not take\Nmuch to see that racism Dialogue: 0,3:12:44.65,3:12:47.22,Default,,0000,0000,0000,,is infused in all parts of our country. Dialogue: 0,3:12:47.22,3:12:51.15,Default,,0000,0000,0000,,And that is true about the\Nfacial surveillance technologies.
Dialogue: 0,3:12:51.15,3:12:52.93,Default,,0000,0000,0000,,Through my work with Unafraid Educators, Dialogue: 0,3:12:52.93,3:12:55.01,Default,,0000,0000,0000,,the Boston Teachers Union's Dialogue: 0,3:12:55.01,3:12:56.59,Default,,0000,0000,0000,,immigrant rights organizing committee, Dialogue: 0,3:12:56.59,3:12:59.86,Default,,0000,0000,0000,,on the information sharing\Nbetween Boston School Police Dialogue: 0,3:12:59.86,3:13:02.08,Default,,0000,0000,0000,,and the Boston Police Department, Dialogue: 0,3:13:02.08,3:13:06.23,Default,,0000,0000,0000,,I know how detrimental observations\Nand surveillance can be. Dialogue: 0,3:13:06.23,3:13:07.94,Default,,0000,0000,0000,,Practices of surveillance Dialogue: 0,3:13:07.94,3:13:09.64,Default,,0000,0000,0000,,like camera recording in the hallway Dialogue: 0,3:13:09.64,3:13:13.43,Default,,0000,0000,0000,,or officers watching young\Npeople are not passive. Dialogue: 0,3:13:13.43,3:13:15.43,Default,,0000,0000,0000,,Surveillance is active. Dialogue: 0,3:13:15.43,3:13:19.06,Default,,0000,0000,0000,,It leads to the creation\Nof law enforcement profiles Dialogue: 0,3:13:19.06,3:13:23.85,Default,,0000,0000,0000,,about young people and\Nthat can impact their lives Dialogue: 0,3:13:23.85,3:13:25.80,Default,,0000,0000,0000,,in material ways for years to come. Dialogue: 0,3:13:25.80,3:13:29.77,Default,,0000,0000,0000,,And one of those impacts,\Nbeing in the system, Dialogue: 0,3:13:29.77,3:13:32.41,Default,,0000,0000,0000,,can make it harder for\Nyoung people to get a job Dialogue: 0,3:13:32.41,3:13:33.40,Default,,0000,0000,0000,,or get housing. Dialogue: 0,3:13:33.40,3:13:36.92,Default,,0000,0000,0000,,It can lead to incarceration,\Nit can lead to deportation. Dialogue: 0,3:13:36.92,3:13:39.64,Default,,0000,0000,0000,,Our city government\Ndoes not need more tools Dialogue: 0,3:13:39.64,3:13:41.65,Default,,0000,0000,0000,,to profile the people of our city Dialogue: 0,3:13:41.65,3:13:44.96,Default,,0000,0000,0000,,and especially doesn't need tools Dialogue: 0,3:13:44.96,3:13:46.48,Default,,0000,0000,0000,,that are inherently racist. Dialogue: 0,3:13:46.48,3:13:49.01,Default,,0000,0000,0000,,This is why the ban is so important. Dialogue: 0,3:13:49.01,3:13:51.85,Default,,0000,0000,0000,,Teaching has taught me\Nthat what keeps us safe Dialogue: 0,3:13:51.85,3:13:54.19,Default,,0000,0000,0000,,is not having the government surveil Dialogue: 0,3:13:54.19,3:13:55.96,Default,,0000,0000,0000,,and track us through technology, Dialogue: 0,3:13:55.96,3:13:58.16,Default,,0000,0000,0000,,what keeps us safe is\Ninvesting in housing, Dialogue: 0,3:13:58.16,3:13:59.81,Default,,0000,0000,0000,,education and healthcare. Dialogue: 0,3:13:59.81,3:14:03.25,Default,,0000,0000,0000,,This face surveillance\Ntechnology harms our community, Dialogue: 0,3:14:03.25,3:14:04.49,Default,,0000,0000,0000,,Ah! Dialogue: 0,3:14:07.07,3:14:08.55,Default,,0000,0000,0000,,Sorry. Dialogue: 0,3:14:08.55,3:14:13.55,Default,,0000,0000,0000,,This surveillance technology\Nharms our community. Dialogue: 0,3:14:13.95,3:14:16.17,Default,,0000,0000,0000,,The fact that the face surveillance technology Dialogue: 0,3:14:16.17,3:14:19.83,Default,,0000,0000,0000,,only identifies black women\Ncorrectly one third of the time Dialogue: 0,3:14:19.83,3:14:22.01,Default,,0000,0000,0000,,lays bare both the ineffectiveness Dialogue: 0,3:14:22.01,3:14:24.28,Default,,0000,0000,0000,,and implicit bias built into the system.
Dialogue: 0,3:14:24.28,3:14:27.04,Default,,0000,0000,0000,,As a teacher I know that\Nthat is not a passing grade. Dialogue: 0,3:14:27.04,3:14:29.01,Default,,0000,0000,0000,,We need money to go to services Dialogue: 0,3:14:29.01,3:14:30.99,Default,,0000,0000,0000,,that will protect our community, Dialogue: 0,3:14:30.99,3:14:33.11,Default,,0000,0000,0000,,and we need to stop and prevent Dialogue: 0,3:14:33.11,3:14:35.03,Default,,0000,0000,0000,,the use of ineffective, racist Dialogue: 0,3:14:35.03,3:14:37.82,Default,,0000,0000,0000,,and money-wasting technologies. Dialogue: 0,3:14:37.82,3:14:38.65,Default,,0000,0000,0000,,Thank you. Dialogue: 0,3:14:40.66,3:14:43.09,Default,,0000,0000,0000,,Thank you, Nora, and we\Nappreciate all of your work. Dialogue: 0,3:14:43.09,3:14:45.99,Default,,0000,0000,0000,,I'm gonna quickly read the list of names Dialogue: 0,3:14:45.99,3:14:48.24,Default,,0000,0000,0000,,that were signed up to testify, Dialogue: 0,3:14:48.24,3:14:50.77,Default,,0000,0000,0000,,but I didn't see matches\Nof the names of folks Dialogue: 0,3:14:50.77,3:14:52.01,Default,,0000,0000,0000,,left in the waiting room. Dialogue: 0,3:14:52.01,3:14:54.36,Default,,0000,0000,0000,,So just in case you're signing\Nin under a different name, Dialogue: 0,3:14:54.36,3:14:55.72,Default,,0000,0000,0000,,but I'm expecting that these folks Dialogue: 0,3:14:55.72,3:14:57.08,Default,,0000,0000,0000,,probably aren't here anymore. Dialogue: 0,3:14:57.08,3:15:00.77,Default,,0000,0000,0000,,Julie McNulty, Zachary\NLawn, McKenna Kavanaugh, Dialogue: 0,3:15:00.77,3:15:05.77,Default,,0000,0000,0000,,Christine Dordy, Sarah Nelson,\NAndrew Tario, Amanda Meehan. Dialogue: 0,3:15:05.83,3:15:10.07,Default,,0000,0000,0000,,Chris Farone, Bridget\NShepherd, Lena Papagiannis. Dialogue: 0,3:15:10.07,3:15:14.67,Default,,0000,0000,0000,,Anyone here who signed on\Nunder a different name? Dialogue: 0,3:15:15.67,3:15:17.13,Default,,0000,0000,0000,,Nope, okay. Dialogue: 0,3:15:17.13,3:15:18.62,Default,,0000,0000,0000,,Then I will just go on the order Dialogue: 0,3:15:18.62,3:15:21.54,Default,,0000,0000,0000,,of folks appearing on my\Nscreen to close this out. Dialogue: 0,3:15:21.54,3:15:23.58,Default,,0000,0000,0000,,Thank you so much for your patience. Dialogue: 0,3:15:23.58,3:15:28.58,Default,,0000,0000,0000,,So that will be Mark G, then Adeline Ansell, Dialogue: 0,3:15:28.89,3:15:30.95,Default,,0000,0000,0000,,then Carolina Pena. Dialogue: 0,3:15:34.19,3:15:36.36,Default,,0000,0000,0000,,Yes, so this is Mark Gurvich, Dialogue: 0,3:15:37.47,3:15:39.30,Default,,0000,0000,0000,,I'm presenting testimony on behalf of Dialogue: 0,3:15:40.33,3:15:41.16,Default,,0000,0000,0000,,Jewish Voice for Peace, Dialogue: 0,3:15:41.16,3:15:42.57,Default,,0000,0000,0000,,a local and national organization Dialogue: 0,3:15:42.57,3:15:45.63,Default,,0000,0000,0000,,that's guided by Jewish\Nvalues, fighting racism, Dialogue: 0,3:15:45.63,3:15:49.30,Default,,0000,0000,0000,,sexism, Islamophobia, LGBTQ oppression Dialogue: 0,3:15:49.30,3:15:51.87,Default,,0000,0000,0000,,and anti-immigrant policies here Dialogue: 0,3:15:51.87,3:15:53.63,Default,,0000,0000,0000,,and challenging racism, colonialism Dialogue: 0,3:15:53.63,3:15:55.35,Default,,0000,0000,0000,,and oppression everywhere, Dialogue: 0,3:15:55.35,3:15:58.29,Default,,0000,0000,0000,,with a focus on the injustices\Nsuffered by Palestinians Dialogue: 0,3:15:58.29,3:16:01.01,Default,,0000,0000,0000,,under Israeli occupation and control.
Dialogue: 0,3:16:01.01,3:16:04.24,Default,,0000,0000,0000,,In recent days in the U.S.\Nwe have again witnessed Dialogue: 0,3:16:04.24,3:16:07.24,Default,,0000,0000,0000,,the impact of militarized policing Dialogue: 0,3:16:07.24,3:16:08.93,Default,,0000,0000,0000,,on communities of color. Dialogue: 0,3:16:08.93,3:16:13.50,Default,,0000,0000,0000,,Councilor Wu has pointed\Nout in a recent publication Dialogue: 0,3:16:14.99,3:16:17.47,Default,,0000,0000,0000,,about this militarization, Dialogue: 0,3:16:17.47,3:16:20.37,Default,,0000,0000,0000,,and much of this\Nmilitarization has been through Dialogue: 0,3:16:20.37,3:16:23.70,Default,,0000,0000,0000,,the transfer of billions of\Ndollars worth of equipment Dialogue: 0,3:16:23.70,3:16:27.00,Default,,0000,0000,0000,,from the military to\Ncivilian or police forces, Dialogue: 0,3:16:27.00,3:16:28.24,Default,,0000,0000,0000,,and the training of U.S. police Dialogue: 0,3:16:28.24,3:16:30.04,Default,,0000,0000,0000,,in the use of military tactics Dialogue: 0,3:16:30.04,3:16:31.71,Default,,0000,0000,0000,,in controlling the communities Dialogue: 0,3:16:31.71,3:16:33.81,Default,,0000,0000,0000,,most heavily impacted by racism, Dialogue: 0,3:16:33.81,3:16:36.81,Default,,0000,0000,0000,,economic depletion, and health crises. Dialogue: 0,3:16:36.81,3:16:41.69,Default,,0000,0000,0000,,Our view is that the use of\Nfacial recognition technology Dialogue: 0,3:16:41.69,3:16:46.25,Default,,0000,0000,0000,,is a key component of the\Nmilitarization of policing. Dialogue: 0,3:16:46.25,3:16:48.12,Default,,0000,0000,0000,,We know this because of our knowledge Dialogue: 0,3:16:48.12,3:16:50.18,Default,,0000,0000,0000,,and experience of the Israeli occupation Dialogue: 0,3:16:50.18,3:16:52.59,Default,,0000,0000,0000,,of Palestinian people and land. Dialogue: 0,3:16:52.59,3:16:57.59,Default,,0000,0000,0000,,Facial recognition is a core\Nfeature of that occupation. Dialogue: 0,3:16:58.83,3:17:00.85,Default,,0000,0000,0000,,Much of the technology being marketed Dialogue: 0,3:17:00.85,3:17:02.50,Default,,0000,0000,0000,,to U.S. law enforcement Dialogue: 0,3:17:02.50,3:17:04.80,Default,,0000,0000,0000,,has been field tested on Palestinians Dialogue: 0,3:17:04.80,3:17:07.23,Default,,0000,0000,0000,,under military occupation. Dialogue: 0,3:17:07.23,3:17:09.18,Default,,0000,0000,0000,,At this historic and critical time, Dialogue: 0,3:17:09.18,3:17:12.11,Default,,0000,0000,0000,,when the effects of militarized\Npolicing in the U.S. Dialogue: 0,3:17:12.11,3:17:15.95,Default,,0000,0000,0000,,have resulted in nationwide\Nand worldwide condemnation, Dialogue: 0,3:17:15.95,3:17:18.93,Default,,0000,0000,0000,,the very last thing we\Nneed is another tool Dialogue: 0,3:17:18.93,3:17:22.77,Default,,0000,0000,0000,,adopted from the toolbox\Nof military occupation. Dialogue: 0,3:17:22.77,3:17:26.73,Default,,0000,0000,0000,,Jewish Voice for Peace supports\Nthe face surveillance ban. Dialogue: 0,3:17:26.73,3:17:28.81,Default,,0000,0000,0000,,Thank you. Dialogue: 0,3:17:28.81,3:17:30.23,Default,,0000,0000,0000,,Thank you very much. Dialogue: 0,3:17:30.23,3:17:33.03,Default,,0000,0000,0000,,Adeline or Adeline, sorry\Nto mispronounce your name. Dialogue: 0,3:17:40.18,3:17:41.01,Default,,0000,0000,0000,,Adeline Ansell. Dialogue: 0,3:17:44.81,3:17:47.83,Default,,0000,0000,0000,,We'll move on to Carolina\Nor Carolina Pena. Dialogue: 0,3:17:56.38,3:17:57.85,Default,,0000,0000,0000,,Let me see.
Dialogue: 0,3:17:57.85,3:17:59.59,Default,,0000,0000,0000,,I will unmute on my own in case Dialogue: 0,3:17:59.59,3:18:01.57,Default,,0000,0000,0000,,it's a muting and unmuting issue. Dialogue: 0,3:18:06.18,3:18:09.03,Default,,0000,0000,0000,,Then we'll move on to Kristen Lyman. Dialogue: 0,3:18:09.03,3:18:11.51,Default,,0000,0000,0000,,Kristen will be followed by Galen Bunting, Dialogue: 0,3:18:11.51,3:18:14.51,Default,,0000,0000,0000,,Maria Brincker and Araya Zack. Dialogue: 0,3:18:14.51,3:18:15.34,Default,,0000,0000,0000,,Kristin. Dialogue: 0,3:18:28.14,3:18:29.43,Default,,0000,0000,0000,,Okay, going to Galen, Dialogue: 0,3:18:38.13,3:18:39.38,Default,,0000,0000,0000,,Maria Brincker. Dialogue: 0,3:18:46.01,3:18:47.04,Default,,0000,0000,0000,,MARIA: Yes. Dialogue: 0,3:18:47.04,3:18:49.82,Default,,0000,0000,0000,,Hi, I had actually not\Nnecessarily prepared. Dialogue: 0,3:18:49.82,3:18:53.21,Default,,0000,0000,0000,,Hello, can you hear me? Dialogue: 0,3:18:53.21,3:18:57.37,Default,,0000,0000,0000,,Adeline, we'll go to you\Nright next, right after Maria. Dialogue: 0,3:18:57.37,3:18:59.64,Default,,0000,0000,0000,,Sorry about the confusion. Dialogue: 0,3:18:59.64,3:19:01.93,Default,,0000,0000,0000,,ADELINE: Okay.\NOkay, no problem. Dialogue: 0,3:19:01.93,3:19:05.28,Default,,0000,0000,0000,,So I just wanted to thank everybody Dialogue: 0,3:19:05.28,3:19:09.88,Default,,0000,0000,0000,,and I wanna say that I am very much, Dialogue: 0,3:19:09.88,3:19:12.69,Default,,0000,0000,0000,,I'm an associate professor at UMass Boston Dialogue: 0,3:19:12.69,3:19:15.77,Default,,0000,0000,0000,,and both as an educator, Dialogue: 0,3:19:15.77,3:19:20.25,Default,,0000,0000,0000,,and as a researcher in\Nissues of surveillance, Dialogue: 0,3:19:20.25,3:19:23.41,Default,,0000,0000,0000,,I am very much in favor of the ordinance. Dialogue: 0,3:19:23.41,3:19:26.23,Default,,0000,0000,0000,,I would like to say that, Dialogue: 0,3:19:26.23,3:19:31.23,Default,,0000,0000,0000,,I would also like the council\Nto consider wider regulations. Dialogue: 0,3:19:31.28,3:19:33.97,Default,,0000,0000,0000,,So as many have said before, Dialogue: 0,3:19:33.97,3:19:38.97,Default,,0000,0000,0000,,there's a lot of issues\Nalso in the private sphere. Dialogue: 0,3:19:39.97,3:19:42.37,Default,,0000,0000,0000,,So this is a very important first step, Dialogue: 0,3:19:42.37,3:19:47.37,Default,,0000,0000,0000,,but a lot of the use of facial recognition Dialogue: 0,3:19:48.08,3:19:49.51,Default,,0000,0000,0000,,in law enforcement, Dialogue: 0,3:19:49.51,3:19:54.51,Default,,0000,0000,0000,,really interacts with the use\Nof surveillance technologies Dialogue: 0,3:19:55.16,3:19:56.77,Default,,0000,0000,0000,,in the private space. Dialogue: 0,3:19:56.77,3:20:01.20,Default,,0000,0000,0000,,So for example, when the\NBrookline Police Department Dialogue: 0,3:20:01.20,3:20:04.40,Default,,0000,0000,0000,,were trying to argue\Nagainst the Brookline ban, Dialogue: 0,3:20:08.27,3:20:13.27,Default,,0000,0000,0000,,they used an example of\Npolice using Snapchat. Dialogue: 0,3:20:13.89,3:20:18.33,Default,,0000,0000,0000,,So a lot of speakers before\Nhave talked about the importance Dialogue: 0,3:20:18.33,3:20:20.33,Default,,0000,0000,0000,,of anonymity in public space, Dialogue: 0,3:20:20.33,3:20:22.54,Default,,0000,0000,0000,,and I'm fully in support of that, Dialogue: 0,3:20:22.54,3:20:25.52,Default,,0000,0000,0000,,but this is not only about public space.
Dialogue: 0,3:20:25.52,3:20:27.86,Default,,0000,0000,0000,,So the use of facial recognition software Dialogue: 0,3:20:27.86,3:20:31.14,Default,,0000,0000,0000,,is of course invading all\Nour private spaces as well. Dialogue: 0,3:20:31.14,3:20:35.21,Default,,0000,0000,0000,,So I would like people\Nto be aware of that. Dialogue: 0,3:20:35.21,3:20:36.98,Default,,0000,0000,0000,,And as a teacher right now, Dialogue: 0,3:20:36.98,3:20:40.03,Default,,0000,0000,0000,,with the pandemic, teaching on Zoom, Dialogue: 0,3:20:40.03,3:20:45.03,Default,,0000,0000,0000,,you can only imagine\Nthe deployment of voice Dialogue: 0,3:20:47.41,3:20:52.37,Default,,0000,0000,0000,,and face recognition in all our spaces. Dialogue: 0,3:20:52.37,3:20:54.73,Default,,0000,0000,0000,,And so I think this is very important. Dialogue: 0,3:20:54.73,3:20:55.56,Default,,0000,0000,0000,,And another thing, Dialogue: 0,3:20:55.56,3:20:57.94,Default,,0000,0000,0000,,my background is also in\Nphilosophy of neuroscience Dialogue: 0,3:20:57.94,3:21:02.05,Default,,0000,0000,0000,,and in the understanding\Nof autonomous action. Dialogue: 0,3:21:02.05,3:21:05.76,Default,,0000,0000,0000,,And autonomous action\Ndepends on being in a space Dialogue: 0,3:21:05.76,3:21:07.58,Default,,0000,0000,0000,,that we can understand. Dialogue: 0,3:21:07.58,3:21:09.83,Default,,0000,0000,0000,,And one of the reasons\Nwhy facial recognition Dialogue: 0,3:21:09.83,3:21:12.69,Default,,0000,0000,0000,,is one of the most dangerous inventions, Dialogue: 0,3:21:12.69,3:21:17.69,Default,,0000,0000,0000,,is that it precisely ruins\Nthe integrity of any space Dialogue: 0,3:21:17.93,3:21:20.23,Default,,0000,0000,0000,,because we don't know who's there Dialogue: 0,3:21:20.23,3:21:24.05,Default,,0000,0000,0000,,and it can do so retroactively. Dialogue: 0,3:21:25.12,3:21:29.23,Default,,0000,0000,0000,,So I thank you very much\Nfor all your work on this, Dialogue: 0,3:21:29.23,3:21:32.40,Default,,0000,0000,0000,,and I'm fully in support of\Nthe ordinance, thank you. Dialogue: 0,3:21:32.40,3:21:33.40,Default,,0000,0000,0000,,Thank you so much. Dialogue: 0,3:21:34.53,3:21:36.42,Default,,0000,0000,0000,,Okay, next we'll go to Adeline. Dialogue: 0,3:21:39.14,3:21:41.41,Default,,0000,0000,0000,,Hi, I'm Adeline Ansell. Dialogue: 0,3:21:42.62,3:21:45.48,Default,,0000,0000,0000,,I was just coming out to\Nsupport the ordinance as a member Dialogue: 0,3:21:45.48,3:21:47.36,Default,,0000,0000,0000,,of the Unafraid Educators group Dialogue: 0,3:21:47.36,3:21:49.78,Default,,0000,0000,0000,,of the Boston Teachers Union, Dialogue: 0,3:21:49.78,3:21:51.79,Default,,0000,0000,0000,,to protect our immigrant students, Dialogue: 0,3:21:51.79,3:21:53.92,Default,,0000,0000,0000,,who have been\Nadvocating for themselves Dialogue: 0,3:21:53.92,3:21:56.75,Default,,0000,0000,0000,,through the Student Immigrant Movement Dialogue: 0,3:21:56.75,3:21:58.95,Default,,0000,0000,0000,,to support this ordinance Dialogue: 0,3:21:58.95,3:22:03.02,Default,,0000,0000,0000,,banning facial recognition technology, Dialogue: 0,3:22:03.02,3:22:06.94,Default,,0000,0000,0000,,because it puts our\Nstudents even more at risk. Dialogue: 0,3:22:06.94,3:22:10.93,Default,,0000,0000,0000,,And I don't have a lot to add. Dialogue: 0,3:22:10.93,3:22:12.40,Default,,0000,0000,0000,,I just wanna support\Nthe work they're doing Dialogue: 0,3:22:12.40,3:22:14.03,Default,,0000,0000,0000,,in their student advocacy.
Dialogue: 0,3:22:14.03,3:22:15.75,Default,,0000,0000,0000,,And I know they all spoke for themselves Dialogue: 0,3:22:15.75,3:22:17.67,Default,,0000,0000,0000,,and their allies did as well, Dialogue: 0,3:22:17.67,3:22:20.46,Default,,0000,0000,0000,,and just wanted to let you\Nknow that Boston teachers Dialogue: 0,3:22:20.46,3:22:23.35,Default,,0000,0000,0000,,also support this ban. Dialogue: 0,3:22:23.35,3:22:26.50,Default,,0000,0000,0000,,So thank you for listening\Nto all of their testimony Dialogue: 0,3:22:26.50,3:22:29.47,Default,,0000,0000,0000,,and know that the teachers\Nare on their side too, Dialogue: 0,3:22:29.47,3:22:32.65,Default,,0000,0000,0000,,and that we support this\Nordinance, thank you. Dialogue: 0,3:22:32.65,3:22:33.99,Default,,0000,0000,0000,,Thank you. Dialogue: 0,3:22:33.99,3:22:35.60,Default,,0000,0000,0000,,Okay, last call. Dialogue: 0,3:22:35.60,3:22:38.41,Default,,0000,0000,0000,,Either Carolina or Galen Dialogue: 0,3:22:38.41,3:22:40.26,Default,,0000,0000,0000,,or anyone else who would like to speak. Dialogue: 0,3:22:40.26,3:22:43.12,Default,,0000,0000,0000,,I think everyone's been admitted\Nfrom the waiting room now. Dialogue: 0,3:22:45.57,3:22:46.89,Default,,0000,0000,0000,,Okay, seeing no takers, Dialogue: 0,3:22:46.89,3:22:49.41,Default,,0000,0000,0000,,I'll pass it back if any councilors Dialogue: 0,3:22:49.41,3:22:52.19,Default,,0000,0000,0000,,wish to make a closing\Nstatement before we wrap up. Dialogue: 0,3:22:55.50,3:22:58.86,Default,,0000,0000,0000,,I see, I'll save councilor Arroyo\Nfor right before we close, Dialogue: 0,3:22:58.86,3:23:00.82,Default,,0000,0000,0000,,councilor Annissa Essaibi-George. Dialogue: 0,3:23:00.82,3:23:02.96,Default,,0000,0000,0000,,Thank you madam chair, at this point Dialogue: 0,3:23:02.96,3:23:04.92,Default,,0000,0000,0000,,and thank you to you and councilor Arroyo Dialogue: 0,3:23:04.92,3:23:06.80,Default,,0000,0000,0000,,for bringing this before the council. Dialogue: 0,3:23:06.80,3:23:11.35,Default,,0000,0000,0000,,I've really appreciated\Neveryone's testimony today Dialogue: 0,3:23:11.35,3:23:13.13,Default,,0000,0000,0000,,in both hearings and sort of the, Dialogue: 0,3:23:13.13,3:23:15.66,Default,,0000,0000,0000,,certainly a very direct correlation Dialogue: 0,3:23:15.66,3:23:17.21,Default,,0000,0000,0000,,between some of the testimony today, Dialogue: 0,3:23:17.21,3:23:19.48,Default,,0000,0000,0000,,but a particular interest of mine Dialogue: 0,3:23:19.48,3:23:21.57,Default,,0000,0000,0000,,was the testimony of the young people Dialogue: 0,3:23:21.57,3:23:23.29,Default,,0000,0000,0000,,and the testimony of the teachers, Dialogue: 0,3:23:23.29,3:23:25.27,Default,,0000,0000,0000,,in particular... Dialogue: 0,3:23:25.27,3:23:27.86,Default,,0000,0000,0000,,So just thank you, thank you both. Dialogue: 0,3:23:27.86,3:23:32.02,Default,,0000,0000,0000,,Thank you councilor Wu for\Nstaying after the chair Dialogue: 0,3:23:32.02,3:23:34.01,Default,,0000,0000,0000,,had to leave to extend an opportunity Dialogue: 0,3:23:34.01,3:23:36.03,Default,,0000,0000,0000,,for this continued testimony.
Dialogue: 0,3:23:36.03,3:23:40.35,Default,,0000,0000,0000,,It was certainly important testimony, Dialogue: 0,3:23:40.35,3:23:43.73,Default,,0000,0000,0000,,but just the fact that everyone stayed on Dialogue: 0,3:23:43.73,3:23:45.97,Default,,0000,0000,0000,,as long as they did to\Noffer that testimony, Dialogue: 0,3:23:45.97,3:23:48.38,Default,,0000,0000,0000,,I think is really important to recognize, Dialogue: 0,3:23:48.38,3:23:50.85,Default,,0000,0000,0000,,and to be grateful for that engagement. Dialogue: 0,3:23:50.85,3:23:53.12,Default,,0000,0000,0000,,And thank you to you and\Nthank you to the maker. Dialogue: 0,3:23:53.12,3:23:54.42,Default,,0000,0000,0000,,Thank you councilor, Dialogue: 0,3:23:54.42,3:23:58.24,Default,,0000,0000,0000,,and final closing words from\Nmy co-sponsor councilor Arroyo. Dialogue: 0,3:23:59.89,3:24:03.25,Default,,0000,0000,0000,,Just a sincere thank you to our panelists. Dialogue: 0,3:24:03.25,3:24:04.68,Default,,0000,0000,0000,,I think many of them are off now, Dialogue: 0,3:24:04.68,3:24:07.21,Default,,0000,0000,0000,,but thank you so much to them Dialogue: 0,3:24:07.21,3:24:08.99,Default,,0000,0000,0000,,and everybody who gave testimony, Dialogue: 0,3:24:08.99,3:24:11.61,Default,,0000,0000,0000,,much of that was deeply moving. Dialogue: 0,3:24:11.61,3:24:13.17,Default,,0000,0000,0000,,Certainly Angel was a highlight. Dialogue: 0,3:24:13.17,3:24:14.63,Default,,0000,0000,0000,,So thank you Angel, Dialogue: 0,3:24:14.63,3:24:17.45,Default,,0000,0000,0000,,and anybody who can get\Nthat message to Angel, Dialogue: 0,3:24:17.45,3:24:18.90,Default,,0000,0000,0000,,I'm not sure if it's bedtime. Dialogue: 0,3:24:20.10,3:24:22.16,Default,,0000,0000,0000,,Thank you so much for raising your voices Dialogue: 0,3:24:22.16,3:24:25.71,Default,,0000,0000,0000,,and really being focused\Nat a real municipal level. Dialogue: 0,3:24:25.71,3:24:28.05,Default,,0000,0000,0000,,This is what we've always asked for. Dialogue: 0,3:24:28.05,3:24:30.90,Default,,0000,0000,0000,,This is what I grew up doing, at about Angel's age, Dialogue: 0,3:24:30.90,3:24:32.18,Default,,0000,0000,0000,,watching these meetings Dialogue: 0,3:24:32.18,3:24:34.82,Default,,0000,0000,0000,,and having folks basically beg Dialogue: 0,3:24:34.82,3:24:38.09,Default,,0000,0000,0000,,for this kind of interaction\Nfrom our communities. Dialogue: 0,3:24:38.09,3:24:39.81,Default,,0000,0000,0000,,And so I'm just deeply grateful Dialogue: 0,3:24:39.81,3:24:43.03,Default,,0000,0000,0000,,that whatever brought\Nyou here specifically, Dialogue: 0,3:24:43.03,3:24:44.36,Default,,0000,0000,0000,,I hope you stay engaged. Dialogue: 0,3:24:44.36,3:24:46.69,Default,,0000,0000,0000,,I hope you bring this\Nenergy to other issues Dialogue: 0,3:24:46.69,3:24:50.01,Default,,0000,0000,0000,,that really affect our\Ncommunities on a daily basis Dialogue: 0,3:24:50.01,3:24:51.53,Default,,0000,0000,0000,,because it really makes change. Dialogue: 0,3:24:51.53,3:24:53.43,Default,,0000,0000,0000,,It really changes things. Dialogue: 0,3:24:53.43,3:24:56.49,Default,,0000,0000,0000,,And so thank you so much to\Neverybody who took part in this, Dialogue: 0,3:24:56.49,3:24:59.08,Default,,0000,0000,0000,,thank you to all the\Nadvocates as a part of this. Dialogue: 0,3:24:59.08,3:25:01.14,Default,,0000,0000,0000,,Thank you councilor Wu for making sure Dialogue: 0,3:25:02.01,3:25:03.77,Default,,0000,0000,0000,,that there was a chair here Dialogue: 0,3:25:03.77,3:25:06.55,Default,,0000,0000,0000,,and to councilor Edwards for first\Nrunning it so well earlier.
Dialogue: 0,3:25:06.55,3:25:07.62,Default,,0000,0000,0000,,But thank you to you for ensuring Dialogue: 0,3:25:07.62,3:25:09.67,Default,,0000,0000,0000,,that we can continue to\Nhear public testimony Dialogue: 0,3:25:09.67,3:25:11.74,Default,,0000,0000,0000,,because it really is valuable. Dialogue: 0,3:25:11.74,3:25:13.92,Default,,0000,0000,0000,,So with that, I thought\Nthat was all very valuable. Dialogue: 0,3:25:13.92,3:25:15.05,Default,,0000,0000,0000,,And also thank you to the commissioner Dialogue: 0,3:25:15.05,3:25:16.40,Default,,0000,0000,0000,,for taking the time to be here Dialogue: 0,3:25:16.40,3:25:19.77,Default,,0000,0000,0000,,and to really endorse this idea. Dialogue: 0,3:25:19.77,3:25:20.83,Default,,0000,0000,0000,,I thought that was fantastic. Dialogue: 0,3:25:20.83,3:25:21.66,Default,,0000,0000,0000,,So thank you. Dialogue: 0,3:25:23.35,3:25:26.62,Default,,0000,0000,0000,,Thank you, this is a great, great hearing. Dialogue: 0,3:25:26.62,3:25:28.26,Default,,0000,0000,0000,,I think we all learned so much Dialogue: 0,3:25:28.26,3:25:31.46,Default,,0000,0000,0000,,and I wanna echo the\Nthanks to the commissioner, Dialogue: 0,3:25:31.46,3:25:33.84,Default,,0000,0000,0000,,all the community members who\Ngave feedback along the way Dialogue: 0,3:25:33.84,3:25:37.37,Default,,0000,0000,0000,,and over this evening, my\Ncolleagues, the commissioner, Dialogue: 0,3:25:37.37,3:25:39.44,Default,,0000,0000,0000,,and of course, central staff, Dialogue: 0,3:25:39.44,3:25:43.19,Default,,0000,0000,0000,,for making sure all of it\Nwent off very, very smoothly. Dialogue: 0,3:25:43.19,3:25:45.39,Default,,0000,0000,0000,,So we'll circle back up\Nwith the committee chair Dialogue: 0,3:25:45.39,3:25:49.27,Default,,0000,0000,0000,,and figure out next steps,\Nbut hoping for swift passage. Dialogue: 0,3:25:49.27,3:25:51.36,Default,,0000,0000,0000,,And I'm very grateful again, Dialogue: 0,3:25:51.36,3:25:54.29,Default,,0000,0000,0000,,that we have an opportunity\Nin Boston to take action. Dialogue: 0,3:25:54.29,3:25:58.35,Default,,0000,0000,0000,,So this will conclude our\Nhearing on docket number 0683, Dialogue: 0,3:25:58.35,3:26:01.22,Default,,0000,0000,0000,,on an ordinance banning\Nfacial recognition technology Dialogue: 0,3:26:01.22,3:26:02.09,Default,,0000,0000,0000,,in Boston. Dialogue: 0,3:26:02.09,3:26:04.12,Default,,0000,0000,0000,,This hearing is adjourned. Dialogue: 0,3:26:04.12,3:26:05.12,Default,,0000,0000,0000,,[gavel strikes] Dialogue: 0,3:26:05.12,3:26:06.87,Default,,0000,0000,0000,,Bye, see you.\NBye.