Committee on Government Operations on June 9, 2020

  • 0:02 - 0:04
    It will also be rebroadcast
    at a later date
  • 0:04 - 0:09
    on Comcast 8, RCN 82, and Fios 196.
  • 0:09 - 0:11
    For public testimony, written comments
  • 0:11 - 0:13
    may be sent to the committee email
  • 0:13 - 0:16
    at ccc.go@boston.gov
  • 0:16 - 0:19
    and it will be made part
    of the official record.
  • 0:19 - 0:20
    This is an ordinance
  • 0:20 - 0:22
    that would ban the use
    of facial surveillance
  • 0:22 - 0:26
    by the city of Boston or any
    official in the city of Boston.
  • 0:26 - 0:27
    The proposal would also prohibit
  • 0:27 - 0:29
    entering into agreements, contracts
  • 0:29 - 0:32
    to obtain facial
    surveillance with third parties.
  • 0:32 - 0:34
    I'm gonna turn it now
    over to the lead sponsors,
  • 0:34 - 0:38
    councilor Wu and councilor Arroyo.
  • 0:38 - 0:39
    And then I will turn it to my colleagues
  • 0:39 - 0:42
    in order of arrival, for opening remarks.
  • 0:42 - 0:45
    And just to acknowledge my colleagues
  • 0:45 - 0:48
    and I believe the order that I had so far,
  • 0:48 - 0:52
    again it will be councilor Arroyo,
  • 0:52 - 0:55
    councilor Wu, councilor
    Breadon, councilor Bok,
  • 0:56 - 0:58
    councilor Mejia, councilor Campbell,
  • 0:58 - 1:03
    and then I had councilor
    O'Malley, councilor Essaibi George
  • 1:04 - 1:06
    and then I'm sorry, I started talking.
  • 1:06 - 1:08
    So other councilors have joined us.
  • 1:12 - 1:14
    MICHAEL: Good morning madam chair.
  • 1:14 - 1:15
    Councilor Flaherty was before me
  • 1:15 - 1:17
    councilor Edwards, thanks.
  • 1:17 - 1:19
    I apologize councilor Flaherty,
  • 1:19 - 1:22
    then councilor Campbell,
    then councilor O'Malley
  • 1:22 - 1:24
    and councilor Essaibi George.
  • 1:27 - 1:29
    I will now turn it over
    to the lead sponsors.
  • 1:30 - 1:32
    Does that start with me?
  • 1:32 - 1:32
    CHAIR: Sure.
  • 1:32 - 1:34
    Thank you, madam chair.
  • 1:34 - 1:36
    I think we have excellent panelists
  • 1:36 - 1:39
    to kind of go into the intricacies
  • 1:39 - 1:40
    of the facial recognition ban.
  • 1:40 - 1:43
    But to just kind of summarize it,
  • 1:43 - 1:46
    this facial recognition ban
    creates a community process
  • 1:47 - 1:49
    that makes our systems more inclusive
  • 1:49 - 1:51
    of community consent while
    adding democratic oversight,
  • 1:51 - 1:53
    where there currently is none.
  • 1:53 - 1:54
    We will be joining our neighbors
  • 1:54 - 1:56
    in Cambridge and Somerville.
  • 1:56 - 1:59
    And specifically when it comes
    to facial recognition tech,
  • 1:59 - 2:01
    it doesn't work, it's not good.
  • 2:01 - 2:04
    It's been proven through the data
  • 2:04 - 2:06
    to be less accurate for
    people with darker skin.
  • 2:06 - 2:10
    A study by one of our
    panelists, Joy Buolamwini,
  • 2:10 - 2:11
    researcher at MIT,
  • 2:11 - 2:12
    found that black women
  • 2:12 - 2:14
    were 35% more likely than white men
  • 2:14 - 2:16
    to be misclassified by face
    surveillance technology.
  • 2:16 - 2:19
    The reality is that face surveillance
  • 2:19 - 2:22
    isn't very effective in its current form.
  • 2:24 - 2:26
    According to records obtained by the ACLU,
  • 2:26 - 2:29
    one manufacturer promoting
    this technology in Massachusetts
  • 2:29 - 2:32
    admitted that it might only
    work about 30% of the time.
  • 2:32 - 2:34
    Currently, my understanding
  • 2:34 - 2:38
    is that the Boston Police
    Department does not use this.
  • 2:38 - 2:39
    This isn't a ban on surveillance.
  • 2:39 - 2:43
    This is simply a ban on facial
    recognition surveillance,
  • 2:43 - 2:46
    which I think the data
    has shown doesn't work.
  • 2:46 - 2:48
    BPD doesn't use it because,
  • 2:48 - 2:51
    and I will allow commissioner
    Gross to speak on it.
  • 2:51 - 2:53
    But my understanding from what I've heard
  • 2:53 - 2:54
    and past comments from commissioner Gross
  • 2:54 - 2:56
    is that they don't wanna use technology
  • 2:56 - 2:57
    that doesn't work well.
  • 2:57 - 3:02
    And so this ban isn't necessarily
    even a permanent ban;
  • 3:02 - 3:06
    it just creates a process
    where we are banning something
  • 3:06 - 3:07
    that doesn't have a process,
  • 3:07 - 3:10
    that doesn't have community trust
  • 3:10 - 3:12
    and doesn't have community consent
  • 3:12 - 3:14
    and which would require
    some democratic oversight
  • 3:14 - 3:15
    if they ever do want to implement it,
  • 3:15 - 3:16
    if the technology in the
    future ever does improve.
  • 3:17 - 3:19
    And so with that, I cede
    the rest of my time.
  • 3:19 - 3:20
    Thank you, madam chair.
  • 3:20 - 3:25
    And thank you also to my
    co-sponsor councilor Wu.
  • 3:26 - 3:29
    Thank you very much madam chair.
  • 3:29 - 3:33
    Thank you also to councilor Arroyo
  • 3:33 - 3:36
    and all of the coalition partners
  • 3:36 - 3:39
    that have been working
    on this for many months
  • 3:39 - 3:42
    and providing all sorts of
    incredibly important feedback.
  • 3:44 - 3:48
    We scheduled this hearing
    many weeks ago at this point,
  • 3:48 - 3:49
    I think over a month ago.
  • 3:49 - 3:53
    And it just so happened that
    the timing of it now has lined
  • 3:53 - 3:57
    up with a moment of great national trauma.
  • 3:57 - 3:58
    And in this moment,
  • 3:58 - 4:02
    the responsibility is on
    each one of us to step up,
  • 4:02 - 4:06
    to truly address systemic
    racism and systemic oppression,
  • 4:06 - 4:07
    but the responsibility is
  • 4:07 - 4:11
    especially on elected
    officials, policy makers,
  • 4:11 - 4:14
    those of us who have the power
  • 4:14 - 4:16
    and the ability to make change now.
  • 4:16 - 4:20
    So I am proud that
    Boston has the potential
  • 4:20 - 4:23
    to join our sister cities
    across Massachusetts
  • 4:23 - 4:26
    in taking this immediate
    step to ban a technology
  • 4:26 - 4:29
    that has been proven to
    be racially discriminatory
  • 4:29 - 4:33
    and that threatens basic rights.
  • 4:33 - 4:36
    We are thankful that Boston police
  • 4:36 - 4:37
    agree with that assessment
  • 4:37 - 4:40
    and do not use facial recognition,
  • 4:40 - 4:42
    facial surveillance today in Boston,
  • 4:42 - 4:46
    and are looking to
    codify that understanding
  • 4:46 - 4:49
    so that with the various
    mechanisms for technology
  • 4:49 - 4:54
    and upgrades and changing circumstances
  • 4:54 - 4:56
    of different administrations
    and personalities,
  • 4:56 - 4:58
    that we just have it on the books
  • 4:58 - 5:00
    that protections come first,
  • 5:00 - 5:02
    because we truly believe
    that public health
  • 5:02 - 5:06
    and public safety should
    be grounded in trust.
  • 5:06 - 5:10
    So the chair gave a wonderful description
  • 5:10 - 5:11
    of this ordinance.
  • 5:11 - 5:14
    I am looking forward to the discussion,
  • 5:14 - 5:17
    but most of all I'm grateful to colleagues
  • 5:17 - 5:18
    and everyone who has contributed
  • 5:18 - 5:19
    to this conversation,
  • 5:19 - 5:22
    as well as the larger conversation about
  • 5:22 - 5:26
    community centered
    oversight of surveillance,
  • 5:26 - 5:29
    which will be in a
    separate but related docket
  • 5:29 - 5:31
    that will be moving
    forward on the council.
  • 5:31 - 5:32
    Thank you very much.
  • 5:35 - 5:39
    Councilor Breadon, opening remarks.
  • 5:41 - 5:44
    Thank you.
  • 5:44 - 5:46
    Thank you, madam chair,
  • 5:46 - 5:50
    thank you to the makers of this ordinance.
  • 5:50 - 5:54
    I really look forward to the
    conversation this afternoon.
  • 5:54 - 5:57
    I'm very proud to participate
    in this discussion.
  • 5:57 - 6:01
    I think it's the proper
    and right thing to do
  • 6:01 - 6:05
    to really consider deeply
    all the implications
  • 6:05 - 6:07
    of new technology that hasn't
    been proven to be effective
  • 6:07 - 6:11
    before introducing it in
    any city in Massachusetts.
  • 6:11 - 6:14
    So I really look forward to learning more
  • 6:14 - 6:17
    and hearing from the panelists
    this afternoon, thank you.
  • 6:20 - 6:22
    CHAIR: Councilor Bok.
  • 6:26 - 6:27
    Ah, yes.
  • 6:27 - 6:28
    Thank you, madam chair.
  • 6:28 - 6:30
    And I hope everyone will forgive me
  • 6:30 - 6:31
    for being outside for a minute.
  • 6:31 - 6:32
    I wanted to just say,
  • 6:32 - 6:34
    I'm excited about this conversation today.
  • 6:34 - 6:36
    I think it's a piece of action
    we should absolutely take.
  • 6:36 - 6:37
    And I just wanna say that,
  • 6:38 - 6:40
    I think it's really important
    that this technology
  • 6:40 - 6:41
    has major shortcomings
  • 6:41 - 6:44
    and that it also has a
    racial bias built into it.
  • 6:44 - 6:47
    Those are really good
    reasons not to use it,
  • 6:47 - 6:51
    but honestly, even if those did not hold,
  • 6:51 - 6:53
    I think that in this country,
  • 6:53 - 6:57
    we run the risk sometimes of
    just starting to do things
  • 6:57 - 6:59
    that new technology can do,
  • 6:59 - 7:02
    without asking ourselves, as
    a democratic populace,
  • 7:02 - 7:05
    as a community, is this
    something that we should do?
  • 7:05 - 7:06
    And I think in the process,
  • 7:06 - 7:08
    we as Americans have given up
  • 7:08 - 7:10
    a lot of our rights and privacy,
  • 7:10 - 7:13
    frankly as often as not
    to private corporations
  • 7:13 - 7:15
    as to government.
  • 7:15 - 7:17
    But I think that it's really important
  • 7:17 - 7:21
    to just remember that there's
    a true public interest
  • 7:21 - 7:25
    in a society that is built more on trust
  • 7:25 - 7:25
    than on surveillance.
  • 7:25 - 7:28
    And so I think it's really important
  • 7:28 - 7:29
    for us to draw this line in the sand
  • 7:29 - 7:32
    so that if the technology improves
  • 7:32 - 7:33
    and some of those arguments
  • 7:33 - 7:36
    that are based on
    its functionality
  • 7:36 - 7:38
    start to slip away, that
    we're still gonna have
  • 7:38 - 7:39
    a real conversation together
  • 7:39 - 7:43
    about how we wanna live
  • 7:43 - 7:44
    and what kind of a society we want to be.
  • 7:44 - 7:48
    So to me, putting a
    moratorium or ban like this
  • 7:48 - 7:50
    into place, makes a ton of sense.
  • 7:50 - 7:52
    And I'm excited for the conversation
  • 7:52 - 7:53
    and grateful to all the advocates.
  • 7:53 - 7:55
    Thank you so much madam chair
  • 7:56 - 7:58
    Thank you.
  • 7:58 - 7:59
    Councilor Mejia.
  • 8:03 - 8:04
    Yes, thank you.
  • 8:04 - 8:06
    Sorry about my audio.
  • 8:06 - 8:10
    Thank you, chairwoman Edwards,
  • 8:10 - 8:12
    and thank you to the makers, councilor Wu,
  • 8:12 - 8:15
    and I applaud you for your
    leadership on this issue.
  • 8:15 - 8:17
    For me this issue is
    personal and professional
  • 8:17 - 8:19
    as a Dominican woman who
    claims her black roots.
  • 8:19 - 8:22
    Facial recognition technology
  • 8:22 - 8:25
    misidentifies people like me by 35%.
  • 8:25 - 8:27
    We're in a time where our technology
  • 8:27 - 8:29
    is outpacing our morals.
  • 8:29 - 8:33
    We've got a 2020 technology
    with a 1620 state of mind,
  • 8:33 - 8:35
    but that's thinking.
  • 8:35 - 8:38
    As a city, we need to
    practice extreme caution
  • 8:38 - 8:41
    over facial recognition technology.
  • 8:41 - 8:42
    And we should be concerned
  • 8:42 - 8:45
    not only when this technology goes wrong,
  • 8:45 - 8:47
    but also when it goes right.
  • 8:47 - 8:48
    We have a lot of work to do
  • 8:48 - 8:51
    when it comes to building
    trust in our government.
  • 8:51 - 8:54
    Working against this facial
    recognition technology
  • 8:54 - 8:57
    is a good step and I'm happy to hear
  • 8:57 - 8:59
    there's pushback against
    this kind of surveillance
  • 8:59 - 9:01
    on all sides of the issue.
  • 9:01 - 9:04
    Like the makers of the
    ordinance mentioned,
  • 9:04 - 9:07
    now, right now, there's not a desire
  • 9:07 - 9:09
    to put facial recognition
    technology in place.
  • 9:09 - 9:14
    And this ordinance is an
    opportunity to put that into law.
  • 9:16 - 9:18
    I look forward to the discussion
  • 9:18 - 9:20
    and I hope to continue
    to push on this issue
  • 9:20 - 9:22
    alongside councilor Wu and Arroyo,
  • 9:22 - 9:23
    thank you so much.
  • 9:26 - 9:27
    Thank you very much.
  • 9:27 - 9:29
    Councilor Flaherty.
  • 9:29 - 9:30
    MICHAEL: Thank you madam chair
  • 9:30 - 9:32
    and the sponsors for this hearing.
  • 9:32 - 9:36
    As I've stated in the past
    the technology updates
  • 9:36 - 9:39
    and advances have provided
    us with many advantages,
  • 9:39 - 9:42
    but at the same time,
  • 9:42 - 9:44
    they've also proven to be deeply flawed,
  • 9:44 - 9:46
    unreliable, and to disproportionately impact
  • 9:46 - 9:49
    communities of color.
  • 9:49 - 9:50
    This technology,
  • 9:50 - 9:52
    just for everyone's full
    disclosure if you're listening,
  • 9:52 - 9:56
    this technology is currently
    not being used by the BPD
  • 9:56 - 10:00
    for that reason, and the
    commissioner is here
  • 10:00 - 10:02
    and good afternoon to the commissioner.
  • 10:02 - 10:05
    I appreciate the work
    that you've been doing
  • 10:05 - 10:05
    through all of this.
  • 10:05 - 10:08
    Whether it's just the
    day-to-day public safety,
  • 10:08 - 10:09
    rigors of your job,
  • 10:09 - 10:11
    or it's been through
    the COVID-19 response,
  • 10:11 - 10:14
    or it's been, most recently and tragically
  • 10:14 - 10:18
    in response to George
    Floyd's horrific death
  • 10:19 - 10:22
    and everything that has come from that.
  • 10:22 - 10:23
    I wanna make sure
  • 10:23 - 10:28
    that I'm gonna continue
    to remain supportive
  • 10:29 - 10:33
    of setting transparent limits
    in creating a public process
  • 10:33 - 10:36
    by which we assess the
    use of these technologies.
  • 10:36 - 10:39
    I know that we've been able
    to solve a lot of crimes
  • 10:39 - 10:41
    through video surveillance
  • 10:41 - 10:43
    and also through video, quite frankly,
  • 10:43 - 10:47
    and also they've led
    to justice for families
  • 10:47 - 10:50
    and as someone that has lost
    a cousin due to murder,
  • 10:50 - 10:54
    it was surveillance
  • 10:54 - 10:59
    and it was DNA that led to
    my family getting justice.
  • 10:59 - 11:03
    So looking forward to
    learning more about this
  • 11:03 - 11:05
    and the commissioner's perspective on it,
  • 11:05 - 11:08
    but also recognizing that collectively
  • 11:08 - 11:10
    we need to make sure that
    we're putting forth decisions
  • 11:10 - 11:13
    that make the most sense for Boston.
  • 11:13 - 11:16
    I'll get real parochial here.
  • 11:16 - 11:19
    Quite frankly, I'm not really
    concerned about Cambridge
  • 11:19 - 11:21
    and Somerville, I wanna make sure
  • 11:21 - 11:23
    that whatever we're doing
    in Boston is impacting
  • 11:23 - 11:24
    all of our neighborhoods.
  • 11:24 - 11:25
    The neighborhoods that I represent
  • 11:25 - 11:26
    as a citywide city councilor.
  • 11:26 - 11:28
    And making sure that we
    have community input,
  • 11:28 - 11:29
    most importantly.
  • 11:29 - 11:31
    We work for the people
    of Boston as a council,
  • 11:31 - 11:34
    and obviously, as the
    commissioner and police officers,
  • 11:34 - 11:37
    we work for the residents and
    the taxpayers of our city.
  • 11:37 - 11:39
    So they need a seat at the table
  • 11:39 - 11:40
    and we want to hear from them.
  • 11:40 - 11:41
    If the technology enhances
  • 11:41 - 11:44
    and it helps us as a public
    safety tool, great.
  • 11:44 - 11:45
    But as things stand,
  • 11:45 - 11:49
    it's still unreliable and
    there are flaws to it.
  • 11:49 - 11:52
    So as a result that's
    where I'm gonna be on this.
  • 11:52 - 11:53
    Thank you madam chair.
  • 11:54 - 11:55
    Thank you very much.
  • 11:55 - 11:56
    Councilor Campbell.
  • 12:00 - 12:03
    Thank you, madam chair and thank you,
  • 12:03 - 12:04
    I know you're gonna run a tight ship
  • 12:04 - 12:06
    given all that's happening today,
  • 12:06 - 12:08
    including the funeral, of
    course for George Floyd.
  • 12:08 - 12:09
    So thank you very much.
  • 12:10 - 12:11
    Thank you also to, of course,
  • 12:11 - 12:13
    to the commissioner for being here,
  • 12:13 - 12:15
    given all that's going on.
  • 12:15 - 12:16
    I know you've been on
    the record in the past
  • 12:16 - 12:18
    with respect to this issue
  • 12:18 - 12:21
    and that not only does the
    department not have it,
  • 12:21 - 12:23
    but you understand the flaws of it.
  • 12:23 - 12:25
    So thank you for being here.
  • 12:25 - 12:28
    Thank you to my council colleagues,
  • 12:28 - 12:29
    but also, most importantly, the makers
  • 12:29 - 12:31
    for putting this forward.
  • 12:31 - 12:34
    I have been supportive of
    this since the very beginning,
  • 12:34 - 12:35
    when I hosted,
  • 12:35 - 12:37
    along with councilor Wu
    and councilor McCarthy,
  • 12:37 - 12:38
    a hearing on this very issue
  • 12:38 - 12:42
    along with some other
    surveillance technologies
  • 12:42 - 12:44
    and things that were possibly
    coming down the pipe.
  • 12:44 - 12:46
    So this was an opportunity for us
  • 12:46 - 12:50
    to have this conversation on
    what we could do proactively
  • 12:50 - 12:53
    to make sure that surveillance tools
  • 12:53 - 12:56
    were not put into place without
    a robust community process.
  • 12:56 - 12:58
    And of course, council input.
  • 12:58 - 13:01
    So back then I supported
    it and I support it now.
  • 13:01 - 13:04
    Again thank you commissioner,
    thank you to your team,
  • 13:04 - 13:07
    thank you to the makers
    and thank you madam chair.
  • 13:07 - 13:08
    And Aiden also says, thank you.
  • 13:11 - 13:13
    Thank you Aiden and thank
    you councilor Campbell.
  • 13:13 - 13:15
    councilor O'Malley.
  • 13:15 - 13:17
    Thank you madam chair.
  • 13:17 - 13:20
    Just very briefly wanna thank the makers,
  • 13:20 - 13:23
    councilor Wu and Arroyo
    for their leadership here.
  • 13:23 - 13:27
    Obviously I support the
    facial recognition ban.
  • 13:27 - 13:31
    Also grateful, wanna echo my
    thanks to the commissioner
  • 13:31 - 13:34
    for his comments and his opposition,
  • 13:34 - 13:36
    and the fact that this isn't used.
  • 13:36 - 13:37
    So it's important that we
    codify that in such a way,
  • 13:37 - 13:39
    which again is the purpose of this hearing
  • 13:39 - 13:42
    and this ordinance and
    stand ready, willing
  • 13:42 - 13:43
    and able to get to work.
  • 13:43 - 13:44
    Thank you and look forward
  • 13:44 - 13:46
    to hearing from a number of residents,
  • 13:46 - 13:49
    our panelists, as well as a
    public testimony after, thanks.
  • 13:49 - 13:52
    Thank you very much,
    councilor Essaibi George.
  • 13:52 - 13:53
    Thank you madam chair,
  • 13:53 - 13:55
    and thank you to the commissioner
  • 13:55 - 13:56
    for being with us this afternoon.
  • 13:56 - 13:58
    I look forward to hearing from him
  • 13:58 - 14:01
    and from the panelists discussing
    this ordinance further.
  • 14:01 - 14:02
    I think that it is necessary
  • 14:02 - 14:05
    to ban this technology at this point.
  • 14:05 - 14:09
    I think that our colleagues
    have brought to the forefront,
  • 14:09 - 14:12
    over the last few years
    as have the advocates,
  • 14:12 - 14:15
    the concerns regarding
    this kind of technology.
  • 14:15 - 14:20
    I appreciate the real concerns
    around accuracy and biases
  • 14:21 - 14:24
    against people of color,
    women, children, the elderly,
  • 14:24 - 14:29
    who are certainly reflective
    of our city's residents.
  • 14:29 - 14:32
    So we need to make sure
    that we're proactive
  • 14:32 - 14:35
    in protecting our residents
  • 14:35 - 14:37
    and having a deeper understanding
  • 14:37 - 14:39
    and appreciation of this technology
  • 14:39 - 14:42
    and understanding what
    this ordinance would mean
  • 14:42 - 14:45
    around that ban and
    following through with it.
  • 14:45 - 14:46
    Thank you madam chair.
  • 14:46 - 14:47
    Thank you very much.
  • 14:47 - 14:50
    I understand councilor
    Flynn is trying to get in
  • 14:50 - 14:52
    and there might be a limit.
  • 14:52 - 14:53
    I'm just saying that for our tech folks.
  • 14:53 - 14:55
    In the meantime, I'm
    going to read a letter
  • 14:55 - 14:58
    from city council president Kim Janey,
  • 14:58 - 15:01
    "Dear chairwoman Edwards, due
    to technical difficulties,
  • 15:01 - 15:05
    I'm unable to log in for
    today's hearing on docket 0683,
  • 15:05 - 15:06
    regarding an ordinance
  • 15:06 - 15:09
    banning facial recognition
    technology in Boston.
  • 15:09 - 15:11
    I will continue to work on
    the technical challenges
  • 15:11 - 15:14
    on my end and will join
    the hearing at a later time
  • 15:14 - 15:16
    if I am able, if I'm unable to join,
  • 15:16 - 15:19
    I'll review the recording at a later time.
  • 15:19 - 15:21
    I want to thank the makers
    councilors Wu and Arroyo
  • 15:21 - 15:24
    for their advocacy and
    continued leadership
  • 15:24 - 15:25
    on this critical issue.
  • 15:25 - 15:27
    I especially want to
    thank all the advocates
  • 15:27 - 15:28
    who are participating
  • 15:28 - 15:31
    for their tireless advocacy on this issue.
  • 15:31 - 15:33
    I look forward to working with advocates
  • 15:33 - 15:34
    to incorporate their ideas
  • 15:34 - 15:36
    on how we can push this
    issue forward.
  • 15:36 - 15:39
    It is clear we need policies in place
  • 15:39 - 15:41
    to protect communities that
    are vulnerable to technologies
  • 15:41 - 15:43
    that are biased against them.
  • 15:43 - 15:44
    I look forward to working
    with my colleagues
  • 15:44 - 15:46
    on this critical issue,
  • 15:46 - 15:47
    and we'll continue to fight for an agenda
  • 15:47 - 15:50
    that promotes and protects black lives.
  • 15:50 - 15:52
    Please read this letter
    into the record, thank you."
  • 15:52 - 15:53
    Sincerely, Kim Janey."
  • 15:54 - 15:56
    I'm not sure if we've been
    joined by councilor Flynn.
  • 15:56 - 16:00
    I do know he was trying to get in.
  • 16:04 - 16:05
    In the meantime I did want to just do
  • 16:05 - 16:09
    some quick housekeeping as part
    of my introductory remarks.
  • 16:09 - 16:11
    Just to let people know again,
  • 16:11 - 16:13
    today's hearing is specifically
  • 16:13 - 16:14
    about this proposed ordinance
  • 16:14 - 16:17
    that would ban facial
    recognition technology.
  • 16:17 - 16:21
    We had a robust conversation
    about defunding the police,
  • 16:21 - 16:23
    discussing all aspects of the budget,
  • 16:23 - 16:26
    all of those different things just happened
  • 16:26 - 16:28
    and it was chaired by councilor Bok.
  • 16:28 - 16:32
    And I wanna make sure that
    our conversations are related,
  • 16:32 - 16:36
    that we do not take this precious
    moment on this legislation
  • 16:36 - 16:40
    and have it covered up by
    the other conversation.
  • 16:40 - 16:43
    I just want you to know that
  • 16:43 - 16:46
    and again, I will move the conversation
  • 16:46 - 16:48
    towards this ordinance.
  • 16:48 - 16:49
    There should be a copy of it.
  • 16:49 - 16:51
    People should have the ability
  • 16:51 - 16:53
    to discuss the
    language in this ordinance.
  • 16:53 - 16:55
    Questions I have,
  • 16:55 - 16:59
    I do want to talk about,
    whether time is of the essence.
  • 16:59 - 17:01
    There seems to have been some issue;
  • 17:01 - 17:03
    I'm concerned about whether
    we were entering a contract
  • 17:03 - 17:04
    that allowed for this.
  • 17:04 - 17:06
    So when the commissioner speaks,
  • 17:06 - 17:08
    I'd like for him to speak on that.
  • 17:08 - 17:12
    I'm not sure if there's a
    contract pending what's going on.
  • 17:12 - 17:15
    I also wanna make sure
    that folks understand
  • 17:15 - 17:17
    due to the fact that I think
  • 17:17 - 17:20
    we have possibly a hundred
    people signed up to speak.
  • 17:20 - 17:22
    We have a lot of folks
  • 17:22 - 17:23
    who are gonna be speaking on the panel.
  • 17:23 - 17:26
    I'm going to limit my
    colleagues to three minutes
  • 17:26 - 17:27
    in their first round.
  • 17:27 - 17:31
    And I'm gonna keep the
    public to no more than two.
  • 17:31 - 17:33
    And I'm gonna try my best
    to get through that list.
  • 17:33 - 17:36
    So I see we've been
    joined by councilor Flynn,
  • 17:36 - 17:38
    councilor Flynn, do you
    have any opening remarks?
  • 17:46 - 17:47
    Thank you.
  • 17:47 - 17:50
    Thank you, councilor Edwards,
  • 17:50 - 17:55
    and to the sponsors and looking
    forward to this hearing,
  • 17:56 - 18:00
    learning, learning more about the proposal
  • 18:01 - 18:04
    and also interested in hearing
  • 18:04 - 18:09
    from the commissioner of
    police, commissioner Gross
  • 18:10 - 18:14
    and with the public as well.
  • 18:14 - 18:19
    Again, this is an issue
    that we all can learn from.
  • 18:21 - 18:25
    We can listen to each
    other, we can be respectful
  • 18:25 - 18:28
    and hear each other's point of view.
  • 18:28 - 18:30
    And that's what this city is all about,
  • 18:30 - 18:34
    is coming together and
    treating each other fairly.
  • 18:34 - 18:38
    And with respect, especially
    during difficult days
  • 18:38 - 18:40
    and on difficult issues.
  • 18:40 - 18:44
    I also know that our police commissioner
  • 18:44 - 18:48
    is someone that also listens to people
  • 18:48 - 18:51
    and he's in the neighborhoods and he talks
  • 18:51 - 18:55
    and listens to a lot of the
    concerns of the residents.
  • 18:55 - 18:58
    That's what I like most about
    the police commissioner,
  • 18:58 - 19:01
    is his ability to listen
    and treat people fairly.
  • 19:01 - 19:04
    Again, I'm here to listen
    and learn about the proposal.
  • 19:04 - 19:06
    Thank you councilor Edwards
    for your strong leadership.
  • 19:06 - 19:08
    Thank you councilor Flynn.
  • 19:08 - 19:10
    I just wanna let people know,
  • 19:10 - 19:13
    we actually have a hundred-person
    limit on the Zoom.
  • 19:13 - 19:14
    We have 118 people trying to get in.
  • 19:14 - 19:16
    So once you've testified,
  • 19:16 - 19:18
    could you at least sign out,
  • 19:19 - 19:20
    that allows for other people to get in.
  • 19:20 - 19:22
    Commissioner Gross you've been waiting.
  • 19:22 - 19:24
    So I'm gonna turn the
    microphone over to you.
  • 19:24 - 19:27
    I'm gonna ask that you get
    right to the ordinance
  • 19:27 - 19:30
    and share your thoughts
    on that.
  • 19:30 - 19:32
    I'm happy to discuss any procedure.
  • 19:32 - 19:34
    Then we're gonna try
    and get to the panelists
  • 19:34 - 19:37
    and move back to the city
    council while you're still here.
  • 19:37 - 19:39
    Okay.
    Yes, thank you.
  • 19:42 - 19:44
    Good afternoon everyone.
  • 19:44 - 19:49
    And God bless Mr. George
    Floyd, may he rest in peace.
  • 19:50 - 19:53
    And for the record, I'm proud of Boston,
  • 19:53 - 19:57
    as they paid great homage
    in peaceful protest
  • 19:57 - 20:00
    and anything we can do to
    move our department forward,
  • 20:00 - 20:02
    working with our community.
  • 20:02 - 20:03
    You have it on record right here,
  • 20:03 - 20:05
    and now I'm willing to
    work with you councilors,
  • 20:05 - 20:08
    the mayor, in our great city.
  • 20:08 - 20:09
    So, thank you.
  • 20:10 - 20:13
    So I will start by thanking the chair
  • 20:13 - 20:16
    and the committee members and the makers,
  • 20:16 - 20:18
    councilor Wu and Arroyo.
  • 20:20 - 20:22
    I really thank you for
    inviting me to participate
  • 20:22 - 20:24
    in today's hearing regarding the ordinance
  • 20:24 - 20:29
    banning facial recognition
    technology in Boston.
  • 20:29 - 20:32
    As you can imagine the
    Boston Police Department
  • 20:32 - 20:34
    has been extremely busy
    meeting the current demands
  • 20:34 - 20:38
    of public safety from
    the COVID-19 pandemic
  • 20:38 - 20:40
    and the public rallies.
  • 20:40 - 20:42
    However, the department and I understand
  • 20:42 - 20:46
    the importance of the given topic.
  • 20:46 - 20:47
    I would like to request
  • 20:47 - 20:50
    that the hearing remain on the given topic
  • 20:50 - 20:53
    and that we work together
    to find other times
  • 20:53 - 20:56
    to discuss the important issues
    regarding police relations,
  • 20:56 - 20:59
    national issues on police reform
  • 20:59 - 21:01
    and the ways that we can improve
  • 21:01 - 21:05
    our relationships here in Boston.
  • 21:06 - 21:08
    As has been practiced in the past,
  • 21:08 - 21:10
    we welcome a future opportunity
  • 21:10 - 21:12
    to discuss the ordinance language
  • 21:12 - 21:16
    and city council concerns
    in a working session
  • 21:16 - 21:18
    regarding technology policy
  • 21:18 - 21:21
    and potential privacy concerns.
  • 21:21 - 21:25
    My testimony today is meant
    to serve as a background
  • 21:25 - 21:28
    to BPD's current practices
  • 21:28 - 21:31
    and identify potential
    technological needs.
  • 21:32 - 21:34
    The department for the record,
  • 21:34 - 21:36
    does not currently have the technology
  • 21:36 - 21:38
    for facial recognition.
  • 21:39 - 21:42
    As technology advances however,
  • 21:42 - 21:45
    many vendors have and will
    continue to incorporate
  • 21:45 - 21:49
    automated recognition abilities.
  • 21:50 - 21:52
    We have prohibited these features
  • 21:52 - 21:55
    as we have moved along in our advancements
  • 21:55 - 21:58
    with the intention to have rich
    dialogue with the community
  • 21:58 - 22:03
    prior to any acquisition
    of such a technology.
  • 22:04 - 22:08
    Video has proven to be one
    of the most effective tools
  • 22:08 - 22:11
    for collecting evidence
    of criminal offenses,
  • 22:11 - 22:13
    for solving crimes and locating missing
  • 22:13 - 22:16
    and exploited individuals.
  • 22:16 - 22:19
    Any prohibitions on
    these investigative tools
  • 22:19 - 22:22
    without a full understanding
    of potential uses
  • 22:22 - 22:23
    under strict protocols,
  • 22:23 - 22:26
    could be harmful and impede our ability
  • 22:26 - 22:27
    to protect the public.
  • 22:28 - 22:32
    The proposed ordinance defines
    facial surveillance to mean,
  • 22:32 - 22:36
    an automated
    or semi-automated process
  • 22:36 - 22:40
    that assists in identifying
    or verifying an individual
  • 22:40 - 22:44
    or in capturing information
    about an individual
  • 22:44 - 22:46
    based on the physical characteristics
  • 22:46 - 22:48
    of an individual's face.
  • 22:51 - 22:53
    And to be clear,
  • 22:53 - 22:55
    the department has no desire
  • 22:55 - 22:57
    to employ a facial surveillance system
  • 22:58 - 23:01
    to generally surveil
    the citizens of Boston.
  • 23:01 - 23:04
    The department however notes a distinction
  • 23:04 - 23:06
    between facial surveillance systems
  • 23:06 - 23:08
    and facial recognition technology.
  • 23:09 - 23:10
    The department believes
  • 23:10 - 23:13
    that the term facial
    recognition technology
  • 23:13 - 23:17
    better describes the investigative nature
  • 23:17 - 23:20
    of the technology we believe
    would be useful to the city,
  • 23:21 - 23:23
    with the right safeguards
    and community input.
  • 23:24 - 23:26
    The department would,
  • 23:26 - 23:29
    as technology advances
    and becomes more reliable,
  • 23:29 - 23:32
    like the opportunity to
    discuss the utilization
  • 23:32 - 23:35
    of facial recognition technology
  • 23:35 - 23:40
    to respond to specific crimes
    and emergency situations.
  • 23:41 - 23:44
    And I would like to
    revisit everyone's thoughts
  • 23:44 - 23:48
    and remembrances of the Boston Marathon
  • 23:48 - 23:50
    and the most recent kidnappings.
  • 23:51 - 23:56
    Facial recognition technology
    to the best of our knowledge,
  • 23:56 - 23:58
    will greatly reduce the hours necessary
  • 23:58 - 24:00
    to review video evidence.
  • 24:00 - 24:04
    This would also allow
    investigators to move more quickly,
  • 24:04 - 24:07
    identify victims, missing persons
  • 24:07 - 24:11
    or suspects of crime through
    citywide camera recordings.
  • 24:11 - 24:14
    This technology, if acquired
  • 24:14 - 24:16
    would also allow for the compilation
  • 24:16 - 24:20
    and condensing of video to
    be done more efficiently.
  • 24:22 - 24:24
    Facial recognition technology
  • 24:24 - 24:26
    used with well-established guidelines
  • 24:26 - 24:27
    and under strict review,
  • 24:27 - 24:29
    can also provide a safer environment
  • 24:29 - 24:31
    for all in the city of Boston.
  • 24:32 - 24:35
    Creating a system in
    which evidence is reviewed
  • 24:35 - 24:38
    in a timely and efficient manner,
  • 24:38 - 24:40
    allowing police to intervene
  • 24:40 - 24:45
    and reduce levels of
    crime, would be beneficial.
  • 24:45 - 24:50
    The department rejects any
    notion in the ordinance
  • 24:50 - 24:55
    that we are or will ever use
    facial recognition technology
  • 24:56 - 24:59
    in our response to the COVID-19 pandemic.
  • 24:59 - 25:01
    And just for the record,
  • 25:01 - 25:02
    I'd like to repeat that,
  • 25:02 - 25:05
    the department rejects any
    notion in the ordinance
  • 25:05 - 25:09
    that we are or will ever use
    facial recognition technology
  • 25:09 - 25:12
    in our response to the COVID-19 pandemic.
  • 25:14 - 25:15
    Finally,
  • 25:19 - 25:21
    it is important to note,
  • 25:21 - 25:24
    the department shares the
    concerns of our community
  • 25:24 - 25:27
    around privacy
  • 25:27 - 25:30
    and intrusive surveillance,
  • 25:31 - 25:34
    as it is the current practice
    with existing technology,
  • 25:34 - 25:36
    our intentions are to implement
  • 25:36 - 25:38
    any potential future advances
  • 25:38 - 25:41
    in facial recognition technology,
  • 25:41 - 25:43
    through well-defined
    protocols and procedures.
  • 25:43 - 25:47
    Some areas we consider, excuse me,
  • 25:47 - 25:50
    some areas we consider this
    technology may be beneficial,
  • 25:50 - 25:52
    are in identifying the route
  • 25:52 - 25:54
    and locations of missing persons,
  • 25:54 - 25:57
    including those suffering
  • 25:57 - 26:01
    from Alzheimer's and dementia.
  • 26:01 - 26:04
    As well, missing children,
    kidnapped individuals,
  • 26:04 - 26:07
    human trafficking victims,
  • 26:07 - 26:10
    and the suspects of
    assaults and shootings.
  • 26:10 - 26:15
    Also the victims of
    homicide and domestic abuse.
  • 26:18 - 26:19
    Important to note,
  • 26:19 - 26:23
    we would like to work with
    the council and our partners
  • 26:23 - 26:25
    to clearly articulate the language
  • 26:25 - 26:28
    that could best describe these uses.
  • 26:28 - 26:31
    In section B2 of the proposed ordinance,
  • 26:31 - 26:33
    it indicates there is some intention
  • 26:33 - 26:36
    to account for the use of this technology
  • 26:36 - 26:37
    to advance investigations
  • 26:37 - 26:39
    but the department believes
    some additional language
  • 26:39 - 26:44
    referencing permitted uses is necessary.
  • 26:45 - 26:49
    We as well, would
    appreciate a working session
  • 26:49 - 26:51
    within the next 60 days,
  • 26:51 - 26:55
    to draft language that we
    can collectively present
  • 26:55 - 26:59
    to constituents and
    interested groups
  • 26:59 - 27:01
    for feedback and input.
  • 27:03 - 27:05
    We also plan to use the public testimony
  • 27:05 - 27:07
    being provided in today's hearing
  • 27:07 - 27:10
    to assist in our decision-making.
  • 27:10 - 27:13
    I'd like to thank you for this opportunity
  • 27:13 - 27:15
    to provide this testimony
  • 27:15 - 27:17
    and your continued interest and time
  • 27:17 - 27:19
    in creating public forums,
  • 27:19 - 27:22
    around police technology and advancements.
  • 27:23 - 27:26
    That's my official
    statement that I read in.
  • 27:26 - 27:29
    And I'm telling you right now
    as an African-American male,
  • 27:29 - 27:32
    the technology that is in place today
  • 27:32 - 27:36
    does not meet the standards of
    the Boston Police Department,
  • 27:36 - 27:38
    nor does it meet my standards.
  • 27:38 - 27:39
    And until technology advances
  • 27:39 - 27:43
    to a point where it is more reliable,
  • 27:43 - 27:46
    again, we will need your
    input, your guidance,
  • 27:46 - 27:51
    and we'll work together
    to pick that technology
  • 27:51 - 27:55
    which is more conducive to our privacy
  • 27:55 - 27:59
    and our rights as citizens
    of Boston, thank you.
  • 28:02 - 28:04
    Thank you very much commissioner Gross.
  • 28:04 - 28:07
    Do you, just before I go on,
  • 28:07 - 28:09
    we're gonna go now to Joy Buolamwini
  • 28:12 - 28:14
    and I am so sorry for that Joy.
  • 28:14 - 28:17
    I apologize profusely for the
    butchering of your last name.
  • 28:17 - 28:22
    My'Kel McMillen, Karina
    Hem, Kade Crockford,
  • 28:22 - 28:26
    Erik Berg, and Joshua Barocas.
  • 28:27 - 28:30
    But I wanted to ask the
    commissioner just very quickly,
  • 28:30 - 28:33
    you said you have suggested language
  • 28:33 - 28:37
    or that you would want to work
    on suggested language for B2.
  • 28:37 - 28:39
    No, I would want to work on it.
  • 28:39 - 28:42
    So keep in mind anything
    we do going forward,
  • 28:42 - 28:45
    as I promised for the last four years,
  • 28:45 - 28:47
    we want your input and your guidance.
  • 28:47 - 28:50
    Councilor Mejia you hit it on point.
  • 28:50 - 28:53
    This technology is not fair to everyone,
  • 28:53 - 28:56
    especially African-Americans, Latinos,
  • 28:56 - 28:57
    it's not there yet.
  • 28:57 - 28:59
    So moving forward,
  • 28:59 - 29:01
    we have to make sure
    everybody's comfortable
  • 29:01 - 29:03
    with this type of technology.
  • 29:04 - 29:05
    Thank you.
  • 29:05 - 29:09
    So now I'm gonna turn it
    over to some of the folks
  • 29:09 - 29:13
    who have been called to speak.
  • 29:13 - 29:16
    Hope we're gonna try and get through
  • 29:16 - 29:17
    as many of them as possible.
  • 29:17 - 29:19
    Commissioner, these are the
    folks who helped to push
  • 29:19 - 29:20
    and move this ordinance.
  • 29:20 - 29:22
    We really hope you can
    stay as long as you can
  • 29:22 - 29:23
    to hear them, also.
  • 29:23 - 29:25
    I'm looking forward to
    working with the folks
  • 29:25 - 29:27
    in the community.
  • 29:27 - 29:28
    Wonderful, thank you.
  • 29:28 - 29:29
    So Joy.
  • 29:31 - 29:32
    Do I?
  • 29:35 - 29:36
    Hello?
  • 29:36 - 29:38
    Oh, wonderful, Joy.
  • 29:38 - 29:40
    The floor is yours.
  • 29:40 - 29:44
    If you could take about
    three minutes, no more.
  • 29:44 - 29:45
    Sounds good.
  • 29:45 - 29:47
    So thank you so much madam chair Edwards,
  • 29:47 - 29:50
    members of the committee
    on government operations
  • 29:50 - 29:53
    and members of the Boston City Council
  • 29:53 - 29:56
    for the opportunity to testify today,
  • 29:56 - 29:58
    I am the founder of the
    Algorithmic Justice League
  • 29:58 - 30:01
    and an algorithmic bias researcher.
  • 30:01 - 30:03
    I've conducted MIT studies
  • 30:03 - 30:05
    showing some of the
    largest recorded gender
  • 30:05 - 30:09
    and racial biases in AI
    systems sold by companies,
  • 30:09 - 30:13
    including IBM, Microsoft and Amazon.
  • 30:13 - 30:15
    As you've heard, the deployment
    of facial recognition
  • 30:15 - 30:17
    and related technologies
  • 30:17 - 30:19
    has major civil rights implications.
  • 30:19 - 30:21
    These tools also have
    technical limitations
  • 30:21 - 30:24
    that further amplify
    harms for black people,
  • 30:24 - 30:27
    indigenous people, other
    communities of color,
  • 30:27 - 30:31
    women, the elderly, those
    with dementia and Parkinson's,
  • 30:31 - 30:36
    youth, trans and gender
    non-conforming individuals.
  • 30:36 - 30:37
    In one test I ran,
  • 30:37 - 30:41
    Amazon's AI even failed on
    the face of Oprah Winfrey,
  • 30:41 - 30:43
    labeling her male.
  • 30:43 - 30:46
    Personally, I've had to resort
    to wearing a white mask,
  • 30:46 - 30:51
    to have my dark skin detected
    by some of this technology,
  • 30:51 - 30:53
    but given mass surveillance applications,
  • 30:53 - 30:56
    not having my face
    detected can be a benefit.
  • 30:56 - 30:58
    We do not need to look to China
  • 30:59 - 31:00
    to see this technology
    being used for surveillance
  • 31:00 - 31:03
    of protesters with little
    to no accountability,
  • 31:03 - 31:07
    and too often in violation of
    our civil and human rights,
  • 31:07 - 31:10
    including first amendment
    rights of freedom of expression,
  • 31:10 - 31:12
    association, and assembly.
  • 31:13 - 31:14
    When the tech works,
  • 31:14 - 31:17
    we can't forget about
    the cost of surveillance,
  • 31:17 - 31:19
    but in other contexts it can fail
  • 31:19 - 31:21
    and the failures can be harmful.
  • 31:21 - 31:24
    Misidentifications can
    lead to false arrest
  • 31:24 - 31:25
    and accusations.
  • 31:25 - 31:28
    In April 2019, a Brown University senior
  • 31:28 - 31:32
    was misidentified as a terrorist suspect
  • 31:32 - 31:34
    in the Sri Lanka Easter bombings.
  • 31:34 - 31:36
    The police eventually
    corrected the mistake,
  • 31:36 - 31:38
    but she still received death threats.
  • 31:38 - 31:42
    Mistaken identity is more
    than an inconvenience.
  • 31:42 - 31:44
    And she's not alone, in the UK,
  • 31:44 - 31:47
    the faces of over 2,400 innocent people
  • 31:47 - 31:51
    were stored by the police
    department without their consent.
  • 31:51 - 31:52
    The department reported
  • 31:52 - 31:56
    a false positive identification
    rate of over 90%.
  • 31:57 - 32:00
    In the U.S. there are no
    reporting requirements
  • 32:00 - 32:04
    generally speaking, so
    we're operating in the dark.
  • 32:04 - 32:06
    Further, these tools
    do not have to identify
  • 32:06 - 32:08
    unique faces to be harmful.
  • 32:08 - 32:09
    An investigation reported
  • 32:09 - 32:11
    that IBM equipped the NYPD
  • 32:11 - 32:13
    with tools to search for people in video,
  • 32:13 - 32:16
    by facial hair and skin tone.
  • 32:16 - 32:17
    In short, these tools can be used
  • 32:17 - 32:19
    to automate racial profiling.
  • 32:19 - 32:21
    The company recently came out
  • 32:21 - 32:23
    to denounce the use of these tools
  • 32:23 - 32:25
    for mass surveillance and profiling.
  • 32:25 - 32:30
    IBM's move to stop selling
    facial recognition technology
  • 32:30 - 32:32
    underscores its dangers.
  • 32:32 - 32:34
    Due to the consequences of failure,
  • 32:34 - 32:36
    I've focused my MIT research
  • 32:36 - 32:39
    on the performance of
    facial analysis systems.
  • 32:39 - 32:42
    I found that for the task
    of gender classification,
  • 32:42 - 32:45
    IBM, Microsoft and Amazon
  • 32:45 - 32:49
    had error rates of no more
    than 1% for lighter-skinned men.
  • 32:49 - 32:50
    In the worst case,
  • 32:51 - 32:54
    these rates rose to over
    30% for darker-skinned women.
  • 32:54 - 32:57
    Subsequent government
    studies complement this work.
  • 32:57 - 33:01
    They show continued error
    disparities in facial analysis
  • 33:01 - 33:04
    and other tasks, including
    face recognition.
  • 33:04 - 33:08
    The latest government
    study of 189 algorithms,
  • 33:08 - 33:11
    revealed consequential racial, gender,
  • 33:11 - 33:14
    and age bias in many of
    the algorithms tested.
  • 33:14 - 33:16
    Still, even if error rates improve,
  • 33:16 - 33:19
    the capacity for abuse, lack of oversight
  • 33:19 - 33:23
    and deployment limitations
    pose too great a risk.
  • 33:23 - 33:25
    Given known harms, the city of Boston
  • 33:25 - 33:28
    should ban government
    use of face surveillance.
  • 33:28 - 33:30
    I look forward to
    answering your questions.
  • 33:30 - 33:33
    Thank you for the opportunity to testify.
  • 33:34 - 33:35
    Thank you so much
  • 33:35 - 33:38
    and to correct my horrible
    mispronunciation of your name.
  • 33:40 - 33:43
    Would you mind saying your
    full name for the record again?
  • 33:43 - 33:46
    Sure, my name is Joy Buolamwini.
  • 33:46 - 33:47
    Thank you very much Joy.
  • 33:47 - 33:50
    I'm gonna now turn over
    to My'Kel McMillen,
  • 33:50 - 33:52
    a youth advocate.
  • 33:52 - 33:55
    I'm also gonna set the
    clock three minutes, My'Kel.
  • 34:00 - 34:01
    Good afternoon madam chair,
  • 34:01 - 34:03
    good afternoon city councilor,
  • 34:03 - 34:05
    good afternoon everybody else.
  • 34:06 - 34:09
    My name is My'Kel McMillen,
    I'm a youth advocate
  • 34:10 - 34:13
    I'm also an organizer [indistinct]
    which is from the east.
  • 34:16 - 34:17
    I just want to talk a little bit
  • 34:17 - 34:20
    about what's going on within
    the streets of Boston.
  • 34:23 - 34:24
    For centuries black folks
  • 34:24 - 34:26
    have been the guinea pigs in America,
  • 34:26 - 34:27
    from experiments with our bodies
  • 34:27 - 34:29
    to the strain on our communities,
  • 34:29 - 34:32
    it seems [indistinct]
  • 34:32 - 34:35
    Do I agree police have evolved?
  • 34:35 - 34:37
    Yes, they are no longer
    catching slaves,
  • 34:37 - 34:40
    but catching poor black and brown folks.
  • 34:40 - 34:44
    So I wanna
    talk about three years ago,
  • 34:44 - 34:47
    roughly around January 2017,
  • 34:47 - 34:51
    there were residents from my
    neighborhood that came up to me
  • 34:51 - 34:54
    saying they were seeing a drone
    flying in the area
  • 34:54 - 34:57
    and they asked numerous times
  • 34:57 - 34:59
    and nobody could figure out
    who was flying the drone,
  • 34:59 - 35:01
    who's controlling it.
  • 35:01 - 35:04
    And so upon receiving this information,
  • 35:04 - 35:07
    I live across the street
    from a soda company,
  • 35:07 - 35:09
    and later on that night,
  • 35:09 - 35:11
    I seen officers playing with a drone
  • 35:11 - 35:13
    and I ended up taking the
    photos of their drone.
  • 35:13 - 35:14
    And it kind of jogged my memory back
  • 35:14 - 35:16
    that this was a drone
  • 35:16 - 35:18
    that lots of residents in my neighborhood
  • 35:18 - 35:20
    had seen and talked about.
  • 35:20 - 35:22
    That no one knew who was flying the drone.
  • 35:22 - 35:24
    It came to my knowledge
    that it was the police.
  • 35:24 - 35:27
    So after doing some research
  • 35:27 - 35:29
    and reaching out to a friend over at ACLU
  • 35:30 - 35:33
    we tried to file
    a public records request,
  • 35:33 - 35:35
    which we were denied our first time,
  • 35:35 - 35:36
    and then eventually got it,
  • 35:36 - 35:40
    and found that the Boston police had
    spent $17,500
  • 35:41 - 35:43
    on three police drones.
  • 35:43 - 35:46
    And nobody knew about it from
    city council to the public.
  • 35:46 - 35:48
    It was kind of kept secret
  • 35:48 - 35:51
    and there was no committee oversight.
  • 35:51 - 35:54
    And these drones were flying illegally
  • 35:54 - 35:58
    inside of a neighborhood
    and the breach of privacy,
  • 35:58 - 36:03
    the not knowing what
    information was being collected,
  • 36:05 - 36:08
    what was being stored,
    how it was being stored
  • 36:08 - 36:10
    that really scared a lot
    of folks in my neighborhood
  • 36:10 - 36:12
    and to the point that a lot
    of them didn't wanna speak up
  • 36:12 - 36:15
    because of past experience
    of dealing with abuse
  • 36:15 - 36:17
    and being harassed.
  • 36:17 - 36:19
    Nobody wanted that backlash,
  • 36:19 - 36:21
    but at the same time,
  • 36:21 - 36:22
    we have to be as residents
  • 36:22 - 36:26
    and have to empower one
    another and have to speak up.
  • 36:26 - 36:30
    And sometimes it has to
    be somebody regardless
  • 36:30 - 36:33
    if it's myself or another individual,
  • 36:33 - 36:35
    we have to hold people accountable.
  • 36:35 - 36:38
    And something that Monica
    Candace, who's another activist, says is that
  • 36:38 - 36:40
    accountability has no color;
  • 36:40 - 36:43
    regardless if you're white,
    black, or in a blue uniform,
  • 36:43 - 36:45
    you still have to be held accountable.
  • 36:45 - 36:47
    And then just to see that
    no one was held accountable
  • 36:47 - 36:48
    was a scary thing,
  • 36:48 - 36:51
    because it kind of just [indistinct]
  • 36:54 - 36:55
    Nobody ever gets held accountable
  • 36:55 - 36:59
    because our voices are muffled by money
  • 36:59 - 37:04
    or just a knee in the neck,
    rest in peace to George Floyd.
  • 37:04 - 37:06
    That is it for my time, thank you all.
  • 37:06 - 37:09
    Thank you, thank you so much.
  • 37:09 - 37:10
    Perfect testimony.
  • 37:10 - 37:12
    And I really appreciate you
    speaking from the heart,
  • 37:12 - 37:13
    from your own experience
  • 37:13 - 37:16
    and also demonstrating how you've actually
  • 37:16 - 37:18
    had conversations about accountability
  • 37:18 - 37:20
    through your leadership.
  • 37:20 - 37:22
    So I wanna thank you for that.
  • 37:22 - 37:25
    Up next we have Karina Hem
  • 37:25 - 37:27
    from the Student Immigrant Movement.
  • 37:27 - 37:30
    Three minutes, Karina.
  • 37:30 - 37:32
    Hi everyone.
  • 37:32 - 37:35
    Thank you everyone for joining
    us today at this hearing,
  • 37:35 - 37:36
    my name is Karina Hem,
  • 37:36 - 37:38
    and I'm the field organizer
  • 37:38 - 37:39
    for the Student Immigrant Movement.
  • 37:39 - 37:41
    I would like to also thank our partners
  • 37:41 - 37:44
    who've been working on the
    facial recognition ban,
  • 37:44 - 37:47
    city councilors, Michelle
    Wu, Ricardo Arroyo,
  • 37:47 - 37:49
    Andrea Campbell, Kim Janey,
    Annissa Essaibi George,
  • 37:49 - 37:51
    along with ACLU of Massachusetts,
  • 37:51 - 37:54
    Muslim Justice League
    and Unafraid Educators.
  • 37:54 - 37:56
    So SIM is a statewide
    grassroots organization
  • 37:56 - 37:58
    by and for the undocumented youth.
  • 37:58 - 38:01
    We work closely with our
    members or our young people
  • 38:01 - 38:04
    to create trusting and
    empowering relationships.
  • 38:04 - 38:06
    At SIM we fight for the
    permanent protection
  • 38:06 - 38:08
    of our undocumented
    youth and their families
  • 38:08 - 38:10
    through collective action.
  • 38:10 - 38:12
    And one of the major roles of the work
  • 38:12 - 38:13
    in the organizing that we do,
  • 38:13 - 38:16
    is to challenge these systems
    that continue to exploit
  • 38:16 - 38:18
    and disenfranchise the undocumented
  • 38:19 - 38:20
    and immigrant communities.
  • 38:20 - 38:22
    So I am here on behalf of SIM
  • 38:22 - 38:23
    to support the ban on facial recognition
  • 38:23 - 38:25
    because it's another system,
  • 38:25 - 38:26
    it's another tool that will intentionally
  • 38:26 - 38:29
    be used against marginalized communities.
  • 38:29 - 38:31
    So the undocumented immigrant community
  • 38:31 - 38:33
    is not exclusive
    to one particular group.
  • 38:33 - 38:36
    We have transgender people,
  • 38:36 - 38:38
    black, Latina, Muslim,
    people with disabilities,
  • 38:38 - 38:39
    and it's so intersectional.
  • 38:39 - 38:42
    And so the stakes of getting
    misidentified are high
  • 38:42 - 38:44
    because this creates a dangerous situation
  • 38:44 - 38:45
    for people who are undocumented.
  • 38:45 - 38:47
    And these programs are
    made to accurately identify
  • 38:47 - 38:50
    white adult males, leaving
    black and brown folks
  • 38:50 - 38:53
    at risk on top of living
    in marginalized community.
  • 38:53 - 38:55
    And so many of the
    young people I work with
  • 38:55 - 38:57
    are students who are in college
  • 38:57 - 38:58
    and many are still in high school.
  • 38:58 - 39:01
    Face surveillance technology
    is not meant for children,
  • 39:01 - 39:03
    and it's not meant for the young people
  • 39:03 - 39:04
    and law enforcement already uses it
  • 39:04 - 39:07
    to monitor the youth in places
    like Lockport, New York.
  • 39:07 - 39:09
    Their face surveillance system
  • 39:09 - 39:11
    has been adopted in their school district,
  • 39:11 - 39:13
    and they've already spent
    over a million dollars
  • 39:13 - 39:13
    in the system.
  • 39:13 - 39:14
    So you ask yourself,
  • 39:14 - 39:17
    what exactly is the intent
    of this surveillance?
  • 39:17 - 39:19
    Do they still have the
    right to their privacy
  • 39:19 - 39:22
    and have the right to live
    without constant infringement
  • 39:22 - 39:24
    and interference from law enforcement?
  • 39:24 - 39:26
    If another city allows facial surveillance
  • 39:26 - 39:28
    in their schools to happen,
  • 39:28 - 39:30
    the chance of that occurring
    here is high as well
  • 39:30 - 39:32
    if we do not ban facial
    recognition in Boston.
  • 39:32 - 39:34
    And we believe this because the youth,
  • 39:34 - 39:35
    the undocumented youth,
  • 39:35 - 39:38
    they are already not
    protected in their schools,
  • 39:38 - 39:39
    school disciplinary reports,
  • 39:39 - 39:42
    which currently have no guidelines,
    protocols, or clear criteria
  • 39:42 - 39:44
    on what exactly school officers can write
  • 39:44 - 39:47
    are sent to the superintendent
    and their designee
  • 39:47 - 39:49
    to sign off on and send to the
    Boston Police Department.
  • 39:49 - 39:52
    And still we've seen that
    some of these reports
  • 39:52 - 39:53
    actually go straight to
    the police department
  • 39:53 - 39:54
    with no oversight.
  • 39:55 - 39:57
    And so this endangers the
    lives of immigrant youth,
  • 39:57 - 39:59
    because the Boston Police
    Department shares information
  • 39:59 - 40:02
    and databases with agencies like I.C.E.
  • 40:02 - 40:04
    So it creates a gateway to imprisonment
  • 40:04 - 40:06
    and deportation for any
    of these young folks.
  • 40:06 - 40:08
    And so adding facial
    surveillance into schools
  • 40:08 - 40:11
    and in the city would be used
    against those same students
  • 40:11 - 40:12
    who are already at risk
  • 40:12 - 40:14
    for being separated from their community.
  • 40:14 - 40:15
    We know that black and brown youth
  • 40:15 - 40:17
    experience oppression every day,
  • 40:18 - 40:20
    and they already are deemed
    suspicious by law enforcement
  • 40:21 - 40:22
    based on their skin color
    and the clothes they wear,
  • 40:22 - 40:24
    the people they interact with, and more.
  • 40:24 - 40:26
    And so adding facial recognition,
  • 40:26 - 40:28
    that's not even accurate to
    surveil our youth and families
  • 40:28 - 40:33
    will just be used to justify
    their arrests or deportation.
  • 40:33 - 40:35
    And through the policing system,
  • 40:35 - 40:36
    we are already asking police officers
  • 40:36 - 40:38
    to engage with the youth in ways
  • 40:38 - 40:41
    that are just harmful to
    youth development.
  • 40:41 - 40:45
    The city of Boston needs to
    stop criminalizing young folks,
  • 40:45 - 40:48
    whether they engage in
    criminal activity or not,
  • 40:48 - 40:50
    arresting them will not get
    to the root of the problem.
  • 40:50 - 40:51
    There are so many options
  • 40:51 - 40:54
    that are much safer and
    more secure for our youth,
  • 40:54 - 40:56
    for the families and their futures.
  • 40:56 - 40:58
    Banning facial recognition
    would mean law enforcement
  • 40:58 - 41:01
    has one less tool to
    criminalize our youth.
  • 41:01 - 41:02
    Thanks so much everyone.
  • 41:03 - 41:05
    Thank you.
  • 41:05 - 41:09
    Commissioner, I understand
    time is of the essence,
  • 41:09 - 41:14
    but I'm really hoping you can
    stay till about 4:15, 4:30,
  • 41:15 - 41:18
    please, please.
  • 41:22 - 41:25
    I can stay until four, that's it.
  • 41:25 - 41:30
    I have to go at four, that long.
  • 41:30 - 41:33
    Okay, so-
  • 41:35 - 41:36
    I appreciate everyone's testimony as well.
  • 41:36 - 41:38
    I have a team that's taking notes.
  • 41:38 - 41:42
    So don't think that when I go,
    no one will be taking notes.
  • 41:42 - 41:44
    We would really wanna meet again
  • 41:44 - 41:46
    and discuss this technology.
  • 41:47 - 41:48
    I know that there are some councilors
  • 41:48 - 41:51
    that have some clarifying questions
  • 41:51 - 41:53
    and we have three more people to speak.
  • 41:53 - 41:57
    And so I am begging you, please,
  • 41:57 - 42:00
    if you can at least get
    to the lead sponsors,
  • 42:00 - 42:02
    councilor Wu and councilor Arroyo.
  • 42:03 - 42:08
    That'll be three, six, nine, 10.
  • 42:08 - 42:09
    That'd be probably 20 more minutes.
  • 42:09 - 42:11
    A little after four.
  • 42:11 - 42:14
    Okay.
    A little close for me.
  • 42:14 - 42:15
    [Gross laughs]
  • 42:15 - 42:17
    We're trying, we're trying.
  • 42:17 - 42:19
    I understand, I understand, go ahead.
  • 42:19 - 42:22
    Thank you, Kade.
  • 42:22 - 42:24
    Thank you chair.
  • 42:24 - 42:25
    My name is Kade Crawford.
  • 42:25 - 42:28
    I am the director of the
    Technology for Liberty Program
  • 42:28 - 42:29
    at the ACLU of Massachusetts.
  • 42:29 - 42:32
    Chair Edwards, members of the committee
  • 42:32 - 42:33
    and members of the city council,
  • 42:33 - 42:35
    thank you all for the opportunity
  • 42:35 - 42:37
    to speak on this crucial issue today.
  • 42:37 - 42:41
    We are obviously in the midst
    of a massive social upheaval
  • 42:41 - 42:43
    as people across Boston and the nation
  • 42:43 - 42:46
    demand that we ensure black lives matter
  • 42:46 - 42:49
    by shifting our budget
    priorities away from policing
  • 42:49 - 42:51
    and incarceration and
    towards government programs
  • 42:51 - 42:54
    that lift people up instead
    of holding them back.
  • 42:54 - 42:57
    And people are outraged
    because for too long
  • 42:57 - 42:59
    police departments have
    been armed to the teeth
  • 42:59 - 43:03
    and equipped with military
    style surveillance technologies,
  • 43:03 - 43:05
    like the ones that My'Kel just referenced.
  • 43:05 - 43:08
    Even as our schools lack
    the most basic needs,
  • 43:08 - 43:11
    like functioning water
    fountains, nurses and counselors.
  • 43:11 - 43:14
    And even now during this pandemic,
  • 43:14 - 43:17
    as our healthcare providers
    lack the PPE they need
  • 43:17 - 43:19
    to protect themselves and us.
  • 43:19 - 43:21
    But our government cannot continue to act
  • 43:23 - 43:25
    as if it is at war with its residents,
  • 43:25 - 43:26
    waging counter-insurgency surveillance
  • 43:26 - 43:28
    and control operations,
  • 43:28 - 43:30
    particularly in black
    and brown neighborhoods
  • 43:30 - 43:31
    and against black and brown people.
  • 43:33 - 43:36
    We must act, we must not merely
    speak to change our laws,
  • 43:36 - 43:38
    to ensure we are building a free,
  • 43:38 - 43:42
    just and equitable future
    for all people in Boston.
  • 43:42 - 43:43
    So that said,
  • 43:43 - 43:46
    banning face surveillance
    in Boston is a no-brainer.
  • 43:46 - 43:48
    It is one small but vital piece
  • 43:48 - 43:51
    of this larger necessary transformation.
  • 43:51 - 43:53
    Face surveillance is
    dangerous when it works
  • 43:53 - 43:54
    and when it doesn't.
  • 43:54 - 43:56
    Banning this technology in Boston now
  • 43:56 - 43:58
    is not an academic issue.
  • 43:58 - 44:01
    Indeed the city already uses technology
  • 44:01 - 44:03
    that with one mere software upgrade
  • 44:03 - 44:07
    could blanket Boston in the
    kind of dystopian surveillance
  • 44:07 - 44:09
    currently practiced by
    authoritarian regimes
  • 44:09 - 44:11
    in China and Russia.
  • 44:11 - 44:12
    And as Joy said,
  • 44:12 - 44:15
    even some cities right
    here in the United States.
  • 44:15 - 44:17
    Boston has to strike a different path
  • 44:17 - 44:20
    by protecting privacy, racial justice,
  • 44:20 - 44:21
    and first amendment rights.
  • 44:21 - 44:24
    We must ban this technology now,
  • 44:24 - 44:27
    before it creeps into our
    government, in the shadows,
  • 44:27 - 44:29
    with no democratic control or oversight
  • 44:29 - 44:32
    as it has in countless other cities,
  • 44:32 - 44:34
    across the country and across the world.
  • 44:34 - 44:37
    So today you already
    heard from pretty much
  • 44:37 - 44:39
    the world renowned expert
  • 44:39 - 44:41
    on racial and gender bias
    in facial recognition.
  • 44:41 - 44:44
    Thank you so much Joy
    for being with us today,
  • 44:44 - 44:46
    your work blazed a trail
    and showed the world
  • 44:46 - 44:49
    that yes, algorithms
    can in fact be racist.
  • 44:49 - 44:51
    You also heard from My'Kel McMillen,
  • 44:51 - 44:54
    a young Boston resident
    who has experienced firsthand
  • 44:54 - 44:55
    what draconian surveillance
  • 44:55 - 44:58
    with no accountability looks like.
  • 44:58 - 45:00
    Karina Ham from the
    Student Immigrant Movement
  • 45:00 - 45:02
    spoke to the need to
    keep face surveillance
  • 45:04 - 45:05
    out of our public schools,
  • 45:05 - 45:07
    where students deserve to be
    able to learn without fear.
  • 45:07 - 45:10
    You're gonna hear a
    similar set of sentiments
  • 45:10 - 45:12
    from Erik Berg of the
    Boston Teachers Union
  • 45:12 - 45:15
    and shortly medical doctor
    and infectious disease expert,
  • 45:15 - 45:17
    Joshua Barocas will testify
  • 45:17 - 45:19
    to how this technology in Boston,
  • 45:19 - 45:21
    would inevitably undermine trust
  • 45:21 - 45:23
    among the most vulnerable people
  • 45:23 - 45:25
    seeking medical care and help.
  • 45:25 - 45:27
    Obviously that's the last
    thing that we need to do,
  • 45:27 - 45:29
    especially during a pandemic.
  • 45:29 - 45:33
    For too long in Boston,
    city agencies have acquired
  • 45:33 - 45:35
    and deployed invasive
    surveillance technologies
  • 45:35 - 45:38
    with no public debate, transparency,
  • 45:38 - 45:39
    accountability or oversight.
  • 45:39 - 45:41
    In most cases, these technologies,
  • 45:41 - 45:43
    which as My'Kel said, include drones,
  • 45:43 - 45:45
    but also license plate readers,
  • 45:45 - 45:48
    social media surveillance,
    video analytics software
  • 45:48 - 45:51
    and military style cell
    phone spying devices,
  • 45:51 - 45:53
    among many others.
  • 45:53 - 45:54
    These were purchased and deployed
  • 45:54 - 45:58
    without the city council's
    knowledge, let alone approval.
  • 45:58 - 46:00
    We have seen this routine play out
  • 46:00 - 46:03
    over and over and over for years.
  • 46:03 - 46:06
    This is a problem with all
    surveillance technologies,
  • 46:06 - 46:09
    but it is especially unacceptable
  • 46:09 - 46:10
    with a technology as dangerous
  • 46:10 - 46:12
    and dystopian as face surveillance.
  • 46:12 - 46:15
    It is not an exaggeration to say that
  • 46:15 - 46:16
    face surveillance would,
  • 46:16 - 46:18
    if we allow it, destroy privacy
  • 46:18 - 46:20
    and anonymity in public space.
  • 46:20 - 46:22
    Face surveillance technology
  • 46:22 - 46:25
    paired with the thousands of
    networked surveillance cameras
  • 46:25 - 46:28
    already installed throughout the city,
  • 46:28 - 46:31
    would enable any official
    with access to the system
  • 46:31 - 46:34
    to automatically catalog the movements,
  • 46:34 - 46:38
    habits and associations of
    all people at all times,
  • 46:38 - 46:40
    merely with the push of a button.
  • 46:41 - 46:44
    This is technology that
    would make it trivial
  • 46:44 - 46:46
    for the government to
    blackmail public officials
  • 46:46 - 46:48
    for seeking substance use treatment,
  • 46:48 - 46:50
    or to identify whistleblowers
  • 46:50 - 46:52
    who are speaking to the Boston Globe
  • 46:52 - 46:55
    or in its most routine manifestation
  • 46:55 - 46:57
    to supercharge existing police harassment
  • 46:57 - 47:00
    and surveillance of
    black and brown residents
  • 47:00 - 47:02
    merely because of the color of their skin
  • 47:02 - 47:04
    and where they live.
  • 47:04 - 47:06
    Using face recognition tech,
  • 47:06 - 47:09
    the police could very easily
    take photos or videos of people
  • 47:09 - 47:13
    at any of these Black Lives
    Matter demonstrations,
  • 47:13 - 47:15
    run those images through
    a computer program
  • 47:15 - 47:17
    and automatically populate a list
  • 47:17 - 47:19
    of each person who attended
  • 47:19 - 47:21
    to express their first amendment rights
  • 47:21 - 47:23
    to demand racial justice.
  • 47:23 - 47:26
    The police could also
    ask a computer program
  • 47:26 - 47:28
    to automatically populate
    a list of every person
  • 47:28 - 47:31
    who walked down a specific street
    on any given morning
  • 47:31 - 47:33
    and share that information with I.C.E.
  • 47:33 - 47:34
    It could also be used
  • 47:34 - 47:36
    to automatically alert law enforcement,
  • 47:36 - 47:38
    whenever a specific person passes
  • 47:38 - 47:41
    by a specific surveillance
    camera anywhere in the city.
  • 47:41 - 47:43
    And again, there are
    thousands of these cameras
  • 47:43 - 47:45
    with more going up each week.
  • 47:45 - 47:48
    Yes, so we're, I'm so sorry.
  • 47:48 - 47:51
    You're over the three minutes.
  • 47:51 - 47:52
    Well, over the three minutes,
  • 47:52 - 47:54
    but I'm I...
  • 47:54 - 47:57
    and I have two more people
    plus trying to get this.
  • 47:57 - 48:01
    So I'm just letting you
    know if you could summarize.
  • 48:01 - 48:03
    All right, I'm almost done, thank you.
  • 48:03 - 48:04
    Sorry, chair.
  • 48:04 - 48:06
    So I'll just skip to the end and say,
  • 48:06 - 48:09
    we're at a fork in the road right now,
  • 48:09 - 48:13
    as Joy said, you know, major corporations,
  • 48:13 - 48:15
    including IBM and Google are now declining
  • 48:15 - 48:17
    to sell this technology
    and that's for good reason,
  • 48:17 - 48:20
    because of the real
    potential for grave human
  • 48:20 - 48:22
    and civil rights abuses.
  • 48:22 - 48:24
    We the people, if we live in a democracy
  • 48:24 - 48:25
    need to be in the driver's seat
  • 48:25 - 48:26
    in terms of determining
  • 48:26 - 48:29
    whether we will use these technologies.
  • 48:29 - 48:31
    We can either continue
    with business as usual,
  • 48:31 - 48:33
    allowing governments to adopt
  • 48:33 - 48:36
    and deploy these technologies
    unchecked in our communities,
  • 48:36 - 48:37
    our streets, in our schools,
  • 48:37 - 48:41
    or we can take bold
    action now to press pause
  • 48:41 - 48:43
    on the government's
    use of this technology,
  • 48:43 - 48:45
    to protect our privacy
  • 48:45 - 48:48
    and to build a safer,
    freer future for all of us.
  • 48:50 - 48:51
    So please, I encourage the city council
  • 48:51 - 48:54
    to join us by supporting
    this crucial ordinance
  • 48:54 - 48:56
    with your advocacy and with your vote
  • 48:56 - 48:58
    and I thank you all for
    your public service.
  • 48:58 - 49:01
    Thank you very much,
    Erik Berg, three minutes.
  • 49:03 - 49:04
    Yeah, thank you.
  • 49:04 - 49:06
    And I'll get right to the
    point in the interest of time.
  • 49:06 - 49:08
    Thank you to the council
  • 49:08 - 49:10
    for taking this up and for
    allowing this time today.
  • 49:10 - 49:11
    And I'm here to speak today
  • 49:11 - 49:13
    on behalf of the Boston Teachers Union
  • 49:13 - 49:16
    and our over 10,000 members
    in support of the ordinance,
  • 49:16 - 49:19
    banning facial recognition
    technology in Boston
  • 49:19 - 49:22
    that was presented by
    councilors Wu and Arroyo.
  • 49:22 - 49:25
    We strongly oppose the
    use of this technology
  • 49:25 - 49:26
    in our public schools.
  • 49:26 - 49:28
    Boston public schools
    should be safe environments
  • 49:28 - 49:29
    for students to learn,
  • 49:29 - 49:32
    explore their identities
    and intellect and play.
  • 49:32 - 49:36
    Face surveillance technology
    threatens that environment.
  • 49:36 - 49:39
    The technology also threatens
    the rights of our BTU members,
  • 49:39 - 49:40
    who must be able to go to work
  • 49:40 - 49:41
    without fearing that their every movement,
  • 49:41 - 49:45
    habit and association will
    be tracked and cataloged.
  • 49:45 - 49:46
    To our knowledge,
  • 49:46 - 49:48
    this technology is not
    in use in our schools,
  • 49:48 - 49:51
    but we've already witnessed
    some experimenting with it.
  • 49:51 - 49:54
    Although it was unclear
    if it was authorized.
  • 49:54 - 49:56
    Two summers ago, members of
    the Boston Teachers Union,
  • 49:56 - 49:58
    who were working in the summer program,
  • 49:58 - 49:59
    contacted the union to let us know
  • 49:59 - 50:01
    that they were being asked to sign in
  • 50:01 - 50:05
    using an app called Tinder
    with face recognition features.
  • 50:05 - 50:06
    And the central office
  • 50:06 - 50:07
    didn't seem to know about this program,
  • 50:07 - 50:08
    and to this day,
  • 50:08 - 50:10
    we don't know how it came about.
  • 50:10 - 50:11
    While the district quickly stopped
  • 50:11 - 50:13
    using the photo portion of this app
  • 50:13 - 50:16
    and informed the union that
    all photos had been deleted,
  • 50:16 - 50:17
    this incident is indicative
  • 50:17 - 50:21
    of how easy it is for private
    security or HR companies
  • 50:21 - 50:24
    to sell a technology to a
    well-intentioned principal
  • 50:24 - 50:26
    or superintendent who
    may not have expertise
  • 50:26 - 50:27
    in the tech field.
  • 50:29 - 50:31
    Face surveillance in schools
    transforms all students
  • 50:31 - 50:33
    and their family members,
    as well as employees
  • 50:33 - 50:35
    into perpetual suspects,
  • 50:35 - 50:37
    where each and every
    one of their movements
  • 50:37 - 50:39
    can be automatically monitored.
  • 50:39 - 50:42
    The use of this technology
    in public schools
  • 50:42 - 50:43
    will negatively impact students' ability
  • 50:43 - 50:46
    to explore new ideas,
    express their creativity,
  • 50:46 - 50:48
    and engage in student dissent
  • 50:48 - 50:50
    an especially disturbing prospect
  • 50:50 - 50:52
    given the current youth led protests
  • 50:52 - 50:54
    against police violence.
  • 50:54 - 50:55
    Even worse, as we've heard,
  • 50:55 - 50:58
    the technology is frequently
    biased and inaccurate,
  • 50:58 - 50:59
    which raises concerns
  • 50:59 - 51:02
    about its use to police students of color.
  • 51:02 - 51:04
    Academic peer-reviewed studies
  • 51:04 - 51:06
    show face surveillance algorithms
  • 51:06 - 51:08
    are too often racially biased,
  • 51:08 - 51:10
    particularly against black women
  • 51:10 - 51:14
    with inaccuracy rates up to
    35% for that demographic.
  • 51:14 - 51:16
    We know black and brown students
  • 51:16 - 51:20
    are more likely to be punished
    for perceived misbehavior.
  • 51:20 - 51:21
    Face surveillance will only perpetuate
  • 51:21 - 51:23
    and reproduce this situation.
  • 51:24 - 51:26
    When used to monitor children,
  • 51:26 - 51:28
    this technology fails
    in an essential sense
  • 51:28 - 51:30
    because it has difficulty
  • 51:30 - 51:33
    accurately identifying young
    people as their faces change.
  • 51:33 - 51:37
    Research that tested five top performing
  • 51:37 - 51:40
    commercial off the shelf
    face recognition systems
  • 51:40 - 51:43
    shows there's a negative bias
    when they're used on children,
  • 51:43 - 51:46
    they perform more poorly on
    children than on adults.
  • 51:46 - 51:48
    That's because these systems are modeled
  • 51:48 - 51:49
    through the use of adult faces
  • 51:49 - 51:51
    and children look different from adults
  • 51:51 - 51:53
    in such a way that they cannot be considered
    simply scaled-down versions of adults.
    they're simply scaled down versions.
  • 51:56 - 51:58
    On top of this face,
  • 51:58 - 52:00
    surveillance technology regularly
  • 52:00 - 52:02
    misgenders transgender people,
  • 52:02 - 52:03
    and will have a harmful impact
  • 52:03 - 52:06
    on transgender young
    people in our schools.
  • 52:06 - 52:08
    Research shows that
    automatic gender recognition
  • 52:08 - 52:12
    consistently views gender
    in a trans-exclusive way.
  • 52:12 - 52:14
    And consequently carries
    disproportionate risk
  • 52:14 - 52:16
    for trans people subject to it.
  • 52:17 - 52:19
    At a time when transgender children
  • 52:19 - 52:21
    are being stripped of their
    rights at a national level,
  • 52:21 - 52:24
    Boston must protect transgender
    kids in our schools.
  • 52:25 - 52:28
    Moreover, face surveillance in schools
  • 52:28 - 52:30
    will contribute to the
    school-to-prison pipeline,
  • 52:30 - 52:31
    threatening children's welfare,
  • 52:31 - 52:34
    educational opportunities
    and life trajectories.
  • 52:34 - 52:36
    Already, children from
    marginalized communities
  • 52:36 - 52:39
    are too often funneled
    out of public schools
  • 52:39 - 52:42
    and into the juvenile and
    criminal justice systems,
  • 52:42 - 52:45
    and face surveillance will inevitably feed this pipeline.
  • 52:45 - 52:48
    I'll get right to the end.
  • 52:48 - 52:50
    Finally, face surveillance technology
  • 52:50 - 52:52
    will harm immigrant families.
  • 52:52 - 52:53
    In this political climate,
  • 52:53 - 52:55
    immigrants are already
    fearful of engagement
  • 52:55 - 52:57
    with public institutions
  • 52:57 - 52:59
    and face surveillance systems
  • 52:59 - 53:01
    would further chill student
    and parent participation
  • 53:01 - 53:03
    in immigrant communities in our schools.
  • 53:03 - 53:06
    Boston schools must be
    welcoming and safe spaces
  • 53:06 - 53:07
    for all families.
  • 53:07 - 53:09
    The city of Boston must take action now
  • 53:09 - 53:11
    to ensure children and BTU workers
  • 53:11 - 53:13
    are not subject to this unfair,
  • 53:13 - 53:15
    biased and chilling scrutiny.
  • 53:15 - 53:17
    In order to protect young people
  • 53:17 - 53:18
    in our educational community,
  • 53:18 - 53:22
    we must stop face surveillance
    in schools before it begins.
  • 53:22 - 53:24
    Thank you very much for your attention
  • 53:24 - 53:25
    and your consideration.
  • 53:25 - 53:27
    Thank you very much.
  • 53:27 - 53:29
    We're getting there,
  • 53:29 - 53:30
    very, very close, commissioner, I promise.
  • 53:30 - 53:31
    Just one more person.
  • 53:31 - 53:33
    Then the two people who specifically said
  • 53:33 - 53:36
    they had some questions,
    we'll go right to them.
  • 53:36 - 53:39
    Joshua Barocas
  • 53:40 - 53:44
    Yeah, I will read quickly.
  • 53:44 - 53:45
    First of all, thanks for allowing me
  • 53:45 - 53:48
    to testify today regarding
    this important issue.
  • 53:48 - 53:51
    I should say that the views
  • 53:51 - 53:52
    that I'm expressing today are my own
  • 53:52 - 53:55
    and don't necessarily
    represent those of my employer,
  • 53:55 - 53:58
    but that said I'm an
    infectious disease physician
  • 53:58 - 54:01
    and an addictions
    researcher here in Boston.
  • 54:01 - 54:04
    As such, I spend a large
    majority of my time working
  • 54:04 - 54:06
    on solutions to improve
    the health and wellbeing
  • 54:06 - 54:08
    of vulnerable populations,
  • 54:08 - 54:10
    including people who are
    experiencing homelessness,
  • 54:10 - 54:13
    and people with substance use disorders.
  • 54:13 - 54:17
    One thing that we know
    is that stigma and bias
  • 54:18 - 54:19
    are pervasive throughout our community
  • 54:19 - 54:22
    and lead to disparities in
    care for these populations.
  • 54:22 - 54:25
    These disparities were laid bare
  • 54:25 - 54:29
    and exacerbated by the
    ongoing COVID-19 pandemic.
  • 54:29 - 54:31
    While we're all searching for answers
  • 54:31 - 54:34
    on how to best protect
    ourselves from this virus,
  • 54:34 - 54:38
    I truly fear that the use of
    facial recognition technology
  • 54:38 - 54:41
    will only serve to exacerbate
    existing disparities,
  • 54:41 - 54:45
    including racial, gender,
    and socioeconomic disparities.
  • 54:45 - 54:48
    If we use this very nascent
    and imperfect technology
  • 54:48 - 54:50
    to track individuals, I'm concerned
  • 54:50 - 54:53
    that it will only serve
    to further marginalize people
  • 54:53 - 54:55
    and worsen health outcomes.
  • 54:55 - 54:58
    We've heard about the low accuracy rates.
  • 54:58 - 55:00
    We know about systemic racism,
  • 55:00 - 55:01
    as a public health issue
  • 55:01 - 55:05
    that's fueled by stigma,
    bias, and misinformation.
  • 55:05 - 55:09
    Additionally, Boston
    provides expansive services
  • 55:09 - 55:12
    for substance use disorders,
    including methadone treatment,
  • 55:12 - 55:17
    and it's imperative that we
    ban facial recognition software
  • 55:17 - 55:18
    to ensure that people seeking
  • 55:18 - 55:21
    substance use disorder
    treatment and mental health care
  • 55:23 - 55:26
    among other stigmatized health
    services may do so privately
  • 55:26 - 55:27
    and without fear.
  • 55:28 - 55:31
    If we allow this on many
    of our public cameras,
  • 55:31 - 55:32
    it would enable the government
  • 55:32 - 55:35
    to automatically compile lists of people
  • 55:35 - 55:38
    seeking treatment for
    substance use, mental health,
  • 55:38 - 55:41
    and, as I said, other
    stigmatized health conditions.
  • 55:41 - 55:43
    This would have a chilling effect
  • 55:44 - 55:48
    and disproportionately affect
  • 55:48 - 55:51
    our most vulnerable populations.
  • 55:51 - 55:56
    I'll stop there and ask that
    we consider banning this.
  • 55:58 - 56:00
    Thank you so very much.
  • 56:00 - 56:01
    I'm gonna turn it over.
  • 56:02 - 56:06
    I misspoke earlier, the lead
    sponsor here is Michelle Wu,
  • 56:06 - 56:10
    and she mentioned that she has
    to move along very quickly.
  • 56:10 - 56:12
    So I wanted to, if it's
    okay, councilor Arroyo,
  • 56:12 - 56:15
    I'm gonna go ahead and go
    to councilor Wu real quick,
  • 56:15 - 56:17
    for specific questions.
  • 56:17 - 56:19
    You're good!
    Yeah, absolutely.
  • 56:19 - 56:20
    Councilor Wu.
  • 56:20 - 56:21
    Thank you madam chair.
  • 56:21 - 56:23
    And thank you so much commissioner
  • 56:23 - 56:24
    for making the time to be here.
  • 56:24 - 56:27
    We all know how many demands
    there are on your time.
  • 56:27 - 56:29
    And now we're asking for even
    more than you had allotted.
  • 56:29 - 56:32
    So very, very grateful, it means a lot
  • 56:32 - 56:36
    that you took the time and
    are the one at this hearing.
  • 56:36 - 56:39
    I also wanna note that you
    were the one at the hearing
  • 56:39 - 56:41
    in June, 2018 as well.
  • 56:41 - 56:43
    So we know that you've been
    part of this conversation
  • 56:43 - 56:47
    and supportive of this
    general idea for a long time.
  • 56:47 - 56:49
    So just to clarify on
    some of what you said,
  • 56:49 - 56:51
    you mentioned that you,
  • 56:51 - 56:54
    you referenced the technology upgrades,
  • 56:54 - 56:55
    that some current vendors
    that BPD works with
  • 56:55 - 56:58
    have for face surveillance.
  • 56:58 - 56:58
    So just to clarify,
  • 56:58 - 57:02
    has BPD upgraded to the next
    version of that software
  • 57:02 - 57:03
    and we're just choosing
  • 57:03 - 57:07
    not to use the facial
    recognition components of it,
  • 57:07 - 57:09
    or has the upgrade not been accepted
  • 57:09 - 57:11
    because it includes facial recognition?
  • 57:11 - 57:13
    To my knowledge, we have not upgraded,
  • 57:13 - 57:16
    but we anticipate that we will have to,
  • 57:16 - 57:19
    but we will not be using any components
  • 57:19 - 57:21
    of facial recognition.
  • 57:21 - 57:26
    And I guess the technology
    just isn't efficient enough -
  • 57:26 - 57:30
    When do you think the
    upgrade would likely happen?
  • 57:31 - 57:33
    No, I will get back to you.
  • 57:33 - 57:35
    And trust me, I wanna
    have a further meeting
  • 57:36 - 57:38
    so I can have my subject
    matter experts here.
  • 57:38 - 57:40
    And I'm gonna tick through,
  • 57:40 - 57:42
    'cause I wanna try to get you out of here
  • 57:42 - 57:43
    as quick as possible.
  • 57:43 - 57:44
    Secondly, so you drew a distinction
  • 57:44 - 57:47
    between face surveillance systems
  • 57:47 - 57:49
    and facial recognition technology.
  • 57:49 - 57:50
    So just to clarify,
  • 57:50 - 57:54
    does BPD, we don't use that.
  • 57:54 - 57:55
    You don't own it right now
  • 57:55 - 57:57
    because you haven't done that upgrade,
  • 57:57 - 58:02
    but does BPD work with other
    state or federal entities
  • 58:03 - 58:05
    that have face surveillance
  • 58:05 - 58:06
    or facial recognition technology?
  • 58:06 - 58:09
    Do you ever use the
    results of that technology
  • 58:09 - 58:10
    on any sort of databases?
  • 58:10 - 58:12
    To answer you more accurately,
  • 58:12 - 58:16
    I would have to find out.
  • 58:16 - 58:18
    I believe the state police
  • 58:18 - 58:21
    does have some form of facial recognition
  • 58:21 - 58:23
    and maybe the registry, too.
  • 58:23 - 58:25
    Okay, the registry of motor vehicles.
  • 58:25 - 58:27
    Yes.
  • 58:27 - 58:30
    And so when state police or
  • 58:30 - 58:32
    we'll find out more, potentially BPD,
  • 58:32 - 58:35
    works with the RMV to
    run matches of photos
  • 58:35 - 58:36
    against their database.
  • 58:36 - 58:38
    Do you need any sort of warrants
  • 58:38 - 58:40
    or other approvals to do that before,
  • 58:40 - 58:42
    for let's say the state?
  • 58:45 - 58:48
    So I used to be part of
    the Bureau of Investigative
  • 58:48 - 58:50
    Services, to my knowledge,
  • 58:50 - 58:54
    it's only utilized to help
    assist in photo arrays.
  • 58:54 - 58:57
    So nothing past that.
  • 58:58 - 59:01
    Okay, could you just clarify
    what you mean by that?
  • 59:01 - 59:03
    So if you're a victim of a crime
  • 59:03 - 59:05
    and we don't have that
    person under arrest,
  • 59:05 - 59:07
    but we do have a potential suspect,
  • 59:07 - 59:09
    You have to pick,
  • 59:09 - 59:11
    you're allowed the opportunity
  • 59:11 - 59:15
    to try to identify the
    suspect of the crime.
  • 59:15 - 59:17
    And so to be fair,
  • 59:17 - 59:20
    to ensure that no innocent
    person is selected,
  • 59:23 - 59:27
    you put the suspect's picture
  • 59:27 - 59:30
    along with several other suspects' pictures
  • 59:30 - 59:31
    up to seven or more,
  • 59:31 - 59:34
    and then you see if
    the victim of the crime
  • 59:34 - 59:37
    can pick the suspect
    out of that photo array.
  • 59:38 - 59:42
    Okay, and so the RMV
    uses the face recognition
  • 59:42 - 59:43
    to provide those photos?
  • 59:43 - 59:47
    I wanna give you an accurate answer.
  • 59:47 - 59:52
    So again, in our next meeting,
    you'll have your answer.
  • 59:53 - 59:54
    Thank you.
  • 59:57 - 60:00
    Are you aware of any state
    or federal regulations
  • 60:00 - 60:04
    that provide any sort
    of restrictions
  • 60:04 - 60:05
    or reining-in guidance when it comes
  • 60:06 - 60:08
    to facial recognition technology
  • 60:08 - 60:09
    or face surveillance systems?
  • 60:10 - 60:11
    Nope.
  • 60:11 - 60:16
    Okay, and so given the absence
    of guidelines right now
  • 60:16 - 60:17
    at the federal, state, or city level
  • 60:17 - 60:21
    and the need to upgrade and
    the use by state police,
  • 60:21 - 60:23
    for example, are you supportive,
  • 60:23 - 60:25
    we're hearing you loud and clear
  • 60:25 - 60:26
    that you'd like to have
    further conversation,
  • 60:26 - 60:28
    we just wanna get your opinion now,
  • 60:28 - 60:31
    are you supportive of a city level ban
  • 60:31 - 60:35
    until there are policies
    in place potentially,
  • 60:35 - 60:39
    at other levels or
    specifically at the city level
  • 60:39 - 60:41
    to rein in surveillance overall?
  • 60:41 - 60:42
    Yes, I am.
  • 60:42 - 60:45
    I've been clear for four years.
  • 60:45 - 60:47
    We need your input, your guidance.
  • 60:47 - 60:49
    And I thank you to everyone
    that testified today.
  • 60:49 - 60:51
    I have a team taking notes
  • 60:51 - 60:54
    because I didn't forget
    that I'm African-American
  • 60:54 - 60:57
    and I could be misidentified as well.
  • 60:57 - 61:00
    And I believe that we have
    one of the most diversified
  • 61:00 - 61:03
    populations in the history of our city.
  • 61:03 - 61:06
    And we do have to be fair to everyone
  • 61:06 - 61:08
    that is a citizen of Boston.
  • 61:08 - 61:11
    We don't want anyone misidentified.
  • 61:11 - 61:15
    And again we will work with everyone here
  • 61:15 - 61:18
    because this is how we will be educated.
  • 61:18 - 61:20
    Yes, we're very grateful for your time.
  • 61:20 - 61:22
    I'm gonna cede to the co-sponsor.
  • 61:22 - 61:23
    Thank you.
  • 61:24 - 61:28
    So, actually, is that fine chair?
  • 61:28 - 61:31
    I just wanna thank you commissioner
  • 61:31 - 61:35
    for supporting the facial
    recognition surveillance ban
  • 61:35 - 61:37
    and for taking the time to
    be here for the questions.
  • 61:37 - 61:39
    Councilor Wu asked many of them,
  • 61:39 - 61:42
    I have one specific one which is,
  • 61:42 - 61:45
    has the BPD used facial
    recognition in the past,
  • 61:45 - 61:46
    in any capacity?
  • 61:48 - 61:49
    No, not to my knowledge.
  • 61:49 - 61:53
    And I will review that, but
    not to my knowledge at all.
  • 61:53 - 61:57
    And the BRIC, does the BRIC have access
  • 61:58 - 61:59
    to facial recognition techniques?
  • 61:59 - 62:00
    Nope, not at all.
  • 62:00 - 62:03
    We don't use it and I
    anticipated that question
  • 62:03 - 62:07
    and we make sure that
    we're not a part of any,
  • 62:07 - 62:09
    the BRIC does not have facial recognition.
  • 62:09 - 62:11
    Okay, I think every other question
  • 62:11 - 62:14
    that I had for you today
    was asked by councilor Wu.
  • 62:14 - 62:16
    So I just thank you for
    taking the time to be here
  • 62:16 - 62:18
    and for supporting this, thank you.
  • 62:18 - 62:19
    Thank you, and before I go,
  • 62:19 - 62:22
    for the record, nothing
    has changed in my opinion
  • 62:22 - 62:24
    from four years ago,
  • 62:24 - 62:27
    until this technology
    is a hundred percent,
  • 62:27 - 62:29
    I'm not interested in it.
  • 62:30 - 62:34
    And even when it is a hundred percent,
  • 62:34 - 62:36
    you've committed to a
    conversation about it?
  • 62:36 - 62:38
    Absolutely, we discussed that before.
  • 62:38 - 62:40
    That you have.
  • 62:40 - 62:43
    It's not something that
    you can just implement.
  • 62:43 - 62:46
    [crosstalk]
  • 62:47 - 62:52
    It's important that we also get
    the input from the community
  • 62:53 - 62:55
    that we serve and work in partnership
  • 62:57 - 62:59
    so we can of course have
    a better quality of life.
  • 62:59 - 63:03
    And that's inclusive of everyone's
    expectation of privacy.
  • 63:03 - 63:05
    So before you go,
  • 63:05 - 63:10
    I'm gonna do this one call out to
    the panelists that just spoke
  • 63:10 - 63:11
    or to any city councilors.
  • 63:11 - 63:14
    One question that you may
    have for the commissioner
  • 63:14 - 63:16
    that you need to ask,
    do any of you have that?
  • 63:18 - 63:23
    Do either Joy, My'Kel,
    Karina, Kade, Erik, Joshua,
  • 63:24 - 63:25
    or to the other city councilors.
  • 63:25 - 63:27
    I'm trying to be mindful of his time
  • 63:27 - 63:29
    but I also understand
  • 63:29 - 63:31
    you guys also gave time
    to be here today as well.
  • 63:31 - 63:35
    And if you wanted to ask him
    representing BPD a question.
  • 63:35 - 63:39
    I see Julia Mejia has raised her hand,
  • 63:39 - 63:41
    to the panelists though,
  • 63:41 - 63:42
    who just spoke any questions?
  • 63:43 - 63:46
    I'm gonna go now to my
    colleagues to ask one question.
  • 63:46 - 63:49
    So that allows for him to go.
  • 63:49 - 63:51
    Okay, councilor Mejia
  • 63:52 - 63:56
    Yes, I just wanted to quickly
    thank commissioner Gross
  • 63:56 - 63:59
    for your time and just
    your steadfast leadership.
  • 64:00 - 64:04
    But I do have just a quick
    question before you go.
  • 64:04 - 64:05
    I just need some clarity,
  • 64:05 - 64:07
    it's that you mentioned that
    you hope to keep this hearing
  • 64:07 - 64:08
    on the topic of the ordinance
  • 64:08 - 64:12
    and find other ways to address
    issues of police relations.
  • 64:12 - 64:15
    So a lot of people draw the direct link
  • 64:15 - 64:17
    between facial recognition
  • 64:17 - 64:18
    and relationships with the community.
  • 64:18 - 64:19
    Do you see that link?
  • 64:21 - 64:22
    And can you talk about
    how the police department
  • 64:22 - 64:24
    sees that link?
  • 64:24 - 64:25
    I'm just really curious about
  • 64:25 - 64:29
    how do we build relationships
    with the community
  • 64:29 - 64:31
    while also thinking
    about facial recognition?
  • 64:31 - 64:33
    Like, how do you reconcile that?
  • 64:33 - 64:34
    I can answer that,
  • 64:35 - 64:37
    the information I was given before was
  • 64:37 - 64:41
    that we were going to
    discuss many subjects.
  • 64:41 - 64:45
    And so I thank you for
    eliminating this one,
  • 64:45 - 64:47
    but the only way you
    increase relationships
  • 64:47 - 64:49
    with the community is to
  • 64:49 - 64:52
    hold hard discussions like this one.
  • 64:52 - 64:56
    You have to listen to the
    people that you serve,
  • 64:56 - 64:59
    because everyone throws around
  • 64:59 - 65:00
    the moniker community policing.
  • 65:00 - 65:02
    I believe that's become jaded.
  • 65:02 - 65:05
    And so I've created a bureau
    of community engagement
  • 65:05 - 65:07
    so that we can have these discussions.
  • 65:07 - 65:10
    And again, I don't forget my history.
  • 65:10 - 65:12
    I haven't forgotten my history.
  • 65:12 - 65:16
    I came on in 1983 and I've
    gone through a lot of racism,
  • 65:16 - 65:20
    was in a tough neighborhood growing up,
  • 65:20 - 65:22
    and I didn't forget any of that.
  • 65:22 - 65:25
    And as you alluded to earlier,
    some of the city councilors,
  • 65:25 - 65:28
    I'm always in the street
    talking to people,
  • 65:28 - 65:32
    I get a lot of criticism of
    law enforcement in general,
  • 65:32 - 65:36
    but I believe in if you
    want change, be the change,
  • 65:36 - 65:38
    you can only change with the people.
  • 65:38 - 65:43
    So, I am looking forward
    to further conversation
  • 65:43 - 65:47
    and to listening to testimony
    on how we can improve
  • 65:47 - 65:49
    our community relations,
  • 65:49 - 65:54
    especially not only through
    this time of civil unrest,
  • 65:54 - 65:56
    but we're in the middle
    of a pandemic as well.
  • 65:56 - 65:58
    And so police departments
  • 65:58 - 66:01
    should not be separated
    from the community.
  • 66:01 - 66:05
    You have to have empathy,
    sympathy, care and respect.
  • 66:05 - 66:07
    A lot of people have lost jobs,
  • 66:07 - 66:10
    and we must keep in mind
    socioeconomics,
  • 66:10 - 66:14
    fairness for employment.
  • 66:14 - 66:16
    A lot of those things
  • 66:16 - 66:20
    directly affect the
    neighborhoods of color.
  • 66:20 - 66:23
    And I'm glad that we are increasing
  • 66:23 - 66:24
    our diversity in the communities.
  • 66:24 - 66:28
    And we currently
    have organizations in Boston
  • 66:28 - 66:30
    that directly speak to the people.
  • 66:30 - 66:32
    The Latino Law Enforcement Organization,
  • 66:32 - 66:34
    the Cabo Verde Police Associations,
  • 66:34 - 66:39
    the Benevolent Asian Jade Society.
  • 66:39 - 66:43
    Our department is increasing in diversity,
  • 66:43 - 66:45
    and the benefit of that,
  • 66:45 - 66:47
    the benefit of having representation
  • 66:47 - 66:50
    from every neighborhood we serve,
  • 66:50 - 66:52
    is that it will improve relationships.
  • 66:52 - 66:55
    So again, that's why I'm looking forward
  • 66:55 - 66:57
    to a working session and
    my team is taking notes
  • 66:57 - 67:02
    because everybody's
    testimony is important.
  • 67:02 - 67:06
    Trust me, people like you that worked hard
  • 67:06 - 67:07
    for everyone's rights.
  • 67:07 - 67:08
    I wouldn't be here
  • 67:08 - 67:10
    as the first African-American
    police commissioner,
  • 67:10 - 67:12
    if we didn't have folks
  • 67:12 - 67:16
    that exercise their
    First Amendment Rights.
  • 67:17 - 67:18
    Thank you, commissioner Gross,
  • 67:18 - 67:20
    I have a clarifying question.
  • 67:20 - 67:22
    And then I just wanted to note,
  • 67:22 - 67:25
    this impacts the staying in.
  • 67:25 - 67:28
    So for folks who,
    commissioner Gross has alluded
  • 67:28 - 67:29
    to a working session,
  • 67:29 - 67:32
    just to give some clarification
    for those watching,
  • 67:32 - 67:33
    that is when we actually get down
  • 67:33 - 67:36
    to the language of the ordinance
  • 67:36 - 67:38
    and work down to the
    commas, as I say,
  • 67:38 - 67:40
    and how to do that.
  • 67:40 - 67:41
    And so what commissioner Gross is asking is
  • 67:41 - 67:44
    that we have that conversation
    within the next 60 days,
  • 67:44 - 67:45
    which I'm fine with doing that,
  • 67:45 - 67:47
    I'll just check with the lead sponsors.
  • 67:47 - 67:50
    I did wanna make sure I understood
  • 67:50 - 67:52
    what in terms of timing is
    going on with the contract,
  • 67:52 - 67:57
    is BPD entering into a contract
    that has this technology?
  • 67:57 - 67:59
    There was a timing issue
  • 67:59 - 68:02
    that I thought was warranting this hearing
  • 68:02 - 68:03
    happening very fast.
  • 68:03 - 68:05
    Maybe the lead sponsors could
    also answer this question.
  • 68:05 - 68:06
    I was confused,
  • 68:06 - 68:10
    if someone could explain what was the...
  • 68:10 - 68:12
    there's a sense of urgency
    that I felt that I was meeting
  • 68:12 - 68:14
    and I'm happy to meet it.
  • 68:14 - 68:17
    Just if someone either, I
    saw you Kade, you nodded.
  • 68:17 - 68:21
    Let me just figure out what
    this contract issue is Kade.
  • 68:21 - 68:25
    Sure councilor, I'd be
    happy to address that.
  • 68:25 - 68:28
    We obtained public records
    from the city a while back
  • 68:28 - 68:30
    showing that the Boston Police Department
  • 68:31 - 68:33
    has a contract with a
    company called BriefCam
  • 68:33 - 68:35
    that expired on May 14th.
  • 68:37 - 68:41
    And the contract was for version
    4.3 of BriefCam software,
  • 68:41 - 68:45
    which did not include facial
    surveillance algorithms.
  • 68:45 - 68:47
    The current version of
    BriefCam's technology
  • 68:47 - 68:50
    does include facial
    surveillance algorithms.
  • 68:50 - 68:53
    And so one of the reasons
    that the councilors
  • 68:53 - 68:56
    and the advocacy groups
    behind this measure
  • 68:56 - 68:59
    were urging you chair
    to schedule this quickly
  • 68:59 - 69:01
    is so that we could pass this ban fast
  • 69:01 - 69:02
    to ensure that the city
  • 69:02 - 69:06
    does not enter into a contract
    to upgrade that technology.
  • 69:07 - 69:09
    Thank you, commissioner Gross,
  • 69:09 - 69:11
    where are we on this contract?
  • 69:11 - 69:12
    Yep and thank you.
  • 69:12 - 69:14
    That's exactly what I was gonna comment.
  • 69:14 - 69:16
    The older version of BriefCam
  • 69:16 - 69:18
    did not have facial recognition
  • 69:18 - 69:21
    and I believe that the newer version does.
  • 69:21 - 69:26
    The old version of BriefCam works
  • 69:26 - 69:28
    on condensing objects, cars,
  • 69:28 - 69:31
    but I would definitely
    check on that contract
  • 69:31 - 69:34
    because I don't want any technology
  • 69:34 - 69:36
    that has facial recognition
  • 69:38 - 69:40
    and that's exactly what
    we were gonna check on
  • 69:40 - 69:45
    and so my answer right
    now is no at this point.
  • 69:47 - 69:50
    I've just read into testimony,
  • 69:50 - 69:55
    if we obtain technology such as that,
  • 69:55 - 69:58
    I should be speaking to all of you
  • 69:58 - 70:00
    as to what that entails,
  • 70:05 - 70:08
    if we have BriefCam and
    it has facial recognition
  • 70:08 - 70:10
    [indistinct] that allows
    for facial recognition, no.
  • 70:10 - 70:12
    And if someone did, if we
    did go into a contract,
  • 70:12 - 70:15
    would I have the ability to show you
  • 70:15 - 70:19
    that that portion that does
    have facial recognition,
  • 70:19 - 70:21
    that that can be censored?
  • 70:21 - 70:24
    That that is not, excuse
    me, poor choice of words
  • 70:24 - 70:26
    that that can be excluded.
  • 70:26 - 70:29
    And so I'm not comfortable
    with that contract
  • 70:29 - 70:31
    until I know more about it.
  • 70:31 - 70:34
    I don't want any part
    of facial recognition.
  • 70:34 - 70:37
    But as I read into testimony,
  • 70:37 - 70:40
    all the technology
    that's going forward now
  • 70:40 - 70:42
    in many fields is like,
  • 70:42 - 70:43
    hey we have facial recognition
  • 70:43 - 70:46
    and I'm like not comfortable with that.
  • 70:46 - 70:50
    And so BriefCam, yeah the old version,
  • 70:50 - 70:53
    we hardly ever used it.
  • 70:53 - 70:55
    And there is discussion on the new version
  • 70:55 - 70:57
    and I'm not comfortable
  • 70:57 - 70:59
    with the facial recognition
    component of that.
  • 71:00 - 71:01
    Thank you very much.
  • 71:01 - 71:04
    And thank you very much
    Kade for the background
  • 71:04 - 71:07
    and understanding, both of you.
  • 71:07 - 71:10
    I felt a sense to move
    fast and I'm moving fast,
  • 71:10 - 71:12
    but I was trying to make
    sure I understood what for.
  • 71:12 - 71:14
    I really do have to go.
  • 71:14 - 71:15
    Yes, thank you very much.
  • 71:15 - 71:18
    Thank you all for your input.
  • 71:18 - 71:21
    Will someone be taking
    notes, or is there someone,
  • 71:23 - 71:25
    I guess Neil from the mayor's
    office will be taking notes.
  • 71:25 - 71:28
    And I have my own team
    taking notes right now.
  • 71:28 - 71:30
    Excellent, thank you very much.
  • 71:30 - 71:32
    Thank you for your testimony.
  • 71:32 - 71:34
    If it weren't for
    advocates, such as everyone
  • 71:34 - 71:39
    on this Zoom call, I wouldn't
    be here in this capacity.
  • 71:40 - 71:41
    So thank you.
  • 71:42 - 71:44
    Thank you very much commissioner.
  • 71:44 - 71:47
    Thank you everyone take care.
  • 71:47 - 71:48
    Okay, thank you very much.
  • 71:48 - 71:52
    And so we still have a
    robust conversation to have
  • 71:52 - 71:57
    amongst ourselves about the
    language, about the goals,
  • 71:57 - 72:00
    whether it goes far enough,
    if anyone opposes this ban,
  • 72:00 - 72:02
    and I want people to understand
  • 72:02 - 72:05
    if there is someone who
    disagrees with a lot of us,
  • 72:05 - 72:07
    this will be a respectful conversation
  • 72:07 - 72:09
    and that person will be
    welcome to have their opinion
  • 72:09 - 72:12
    and express it as every single one of us
  • 72:12 - 72:13
    has been able to do.
  • 72:14 - 72:15
    So I'm gonna continue now,
  • 72:15 - 72:17
    there's a list of folks
    who have signed up to speak
  • 72:17 - 72:20
    and I'm gonna continue down that list
  • 72:20 - 72:24
    and keep them to no
    more than three minutes
  • 72:24 - 72:26
    and then we're gonna open up.
  • 72:26 - 72:28
    Oh, I'm so sorry, I apologize.
  • 72:28 - 72:31
    Before I go down a list,
    I know my councilor,
  • 72:31 - 72:34
    my colleagues also may
    have questions or concerns
  • 72:34 - 72:36
    or may wanna voice certain
    things about the legislation.
  • 72:36 - 72:38
    And I apologize.
  • 72:38 - 72:39
    [Lydia giggles]
  • 72:39 - 72:41
    So I'm just so amped to get
    to the public testimony.
  • 72:41 - 72:43
    So I'm gonna go ahead and
    go in order of arrival,
  • 72:43 - 72:45
    the two lead sponsors have
    asked questions already
  • 72:45 - 72:47
    of the commissioner.
  • 72:47 - 72:50
    Did either councilor
    Wu or councilor Arroyo
  • 72:50 - 72:53
    have any questions for the
    panelists that just spoke?
  • 72:53 - 72:55
    If not, I'll move on
    to my other colleagues.
  • 72:57 - 72:59
    MICHELLE: I'm happy to
    defer to colleagues,
  • 72:59 - 73:00
    thank you madam chair.
  • 73:00 - 73:02
    Very well, councilor Arroyo.
  • 73:02 - 73:04
    The only question I would have,
  • 73:04 - 73:07
    which is more to, I
    think Kade from the ACLU,
  • 73:07 - 73:09
    is whether or not she has any idea
  • 73:09 - 73:10
    where we are on that contract.
  • 73:10 - 73:13
    From what I heard from commissioner Gross,
  • 73:13 - 73:15
    it sounded like he couldn't confirm
  • 73:15 - 73:17
    the timeline for that contract,
  • 73:17 - 73:19
    whether or not it's been
    signed or not signed,
  • 73:19 - 73:20
    what's going on with that contract.
  • 73:20 - 73:23
    Does any panelists here
    have any information
  • 73:23 - 73:25
    on whether or not that contract is,
  • 73:25 - 73:27
    what the status of that is?
  • 73:27 - 73:29
    Thanks for the question, councilor Arroyo.
  • 73:29 - 73:30
    Regrettably no.
  • 73:30 - 73:32
    We filed a public records request
  • 73:32 - 73:35
    with the Boston Police
    Department on May 14th,
  • 73:35 - 73:36
    so that was about a month ago,
  • 73:36 - 73:38
    asking just for one simple document
  • 73:38 - 73:41
    for the existing contract
  • 73:41 - 73:43
    that the Boston Police
    Department has with BriefCam
  • 73:43 - 73:45
    and they haven't sent it to us yet.
  • 73:45 - 73:49
    So, and we've prodded them
    multiple times about it,
  • 73:49 - 73:51
    including on Friday saying,
  • 73:51 - 73:52
    it would be a real shame
  • 73:52 - 73:54
    if we had to go to this hearing on Tuesday
  • 73:54 - 73:57
    without this information and
    we still haven't received it,
  • 73:57 - 73:58
    so we do not.
  • 73:59 - 74:01
    Thank you.
  • 74:02 - 74:05
    That all councilor Arroyo?
  • 74:05 - 74:07
    Very well, councilor Breadon.
  • 74:11 - 74:13
    Thank you.
  • 74:13 - 74:15
    This has been very informative,
  • 74:15 - 74:18
    very good questions from my colleagues.
  • 74:18 - 74:23
    One question I had was, other
    agencies, federal agencies,
  • 74:24 - 74:26
    are they using facial recognition?
  • 74:26 - 74:30
    And if they are, are they
    sharing that information
  • 74:30 - 74:31
    with our police department?
  • 74:33 - 74:36
    Does anyone know the
    answer to that question?
  • 74:36 - 74:39
    I can speak to that, Joy
    Buolamwini if you're still on,
  • 74:39 - 74:41
    you may wanna speak to some of those too.
  • 74:41 - 74:46
    So the ACLU nationwide, we have
    been filing FOIA requests,
  • 74:47 - 74:49
    Freedom of Information Act Requests
  • 74:49 - 74:51
    with various federal agencies
  • 74:51 - 74:53
    to learn about how the federal government
  • 74:55 - 74:57
    is using this technology
    and how they're sharing
  • 74:58 - 74:59
    information derived from the technology,
  • 74:59 - 75:01
    with state and local law enforcement.
  • 75:01 - 75:03
    We know for example, that
    in the city of Boston,
  • 75:03 - 75:06
    the Boston Police Department works closely
  • 75:06 - 75:07
    not only with I.C.E,
  • 75:07 - 75:09
    which has been the subject
    of much consternation
  • 75:09 - 75:11
    and debate before this body,
  • 75:11 - 75:13
    but also with the FBI,
  • 75:13 - 75:16
    through something called the
    Joint Terrorism Task Force.
  • 75:16 - 75:17
    The Boston Police Department
  • 75:17 - 75:21
    actually has detectives
    assigned to that JTTF unit,
  • 75:21 - 75:23
    who act as federal agents.
  • 75:23 - 75:25
    So one concern that we have is that
  • 75:25 - 75:27
    even if the city of Boston banned
  • 75:27 - 75:30
    face surveillance technology
    for city employees,
  • 75:30 - 75:33
    certainly it would be
    the case that the FBI
  • 75:33 - 75:34
    and other federal agencies
  • 75:34 - 75:36
    would be able to continue
    to use this technology
  • 75:36 - 75:40
    and likely to share
    information that comes from it
  • 75:40 - 75:42
    with the Boston Police Department
    and other city agencies.
  • 75:42 - 75:46
    Unfortunately, there's
    nothing we can do about that
  • 75:46 - 75:47
    at the city level,
  • 75:47 - 75:49
    which is why the ACLU
  • 75:49 - 75:51
    is also working with partners in congress
  • 75:51 - 75:54
    to try to address this
    at the federal level as well.
  • 75:56 - 75:58
    Thank you, that answered my question
  • 75:58 - 76:01
    and thank you for your work
    on this very important issue.
  • 76:01 - 76:05
    Councilor, madam chair
    that's all the questions
  • 76:05 - 76:05
    I have for now.
  • 76:06 - 76:08
    Thank you very much, councilor Bok.
  • 76:08 - 76:11
    So, Mejia.
  • 76:12 - 76:14
    I'm so sorry, councilor Bok.
  • 76:14 - 76:16
    Thank you councilor Breadon.
  • 76:16 - 76:19
    Next is councilor Bok, I apologize.
  • 76:19 - 76:22
    Thank you, thanks councilor Edwards.
  • 76:22 - 76:25
    My question was actually
    just, it's for the panelists.
  • 76:25 - 76:27
    Whoever wants to jump in,
  • 76:27 - 76:30
    the police commissioner
    was making a distinction
  • 76:30 - 76:33
    between facial surveillance
    and facial recognition systems
  • 76:33 - 76:36
    and what we might be banning or not.
  • 76:36 - 76:37
    And I'm just wondering,
  • 76:37 - 76:39
    I don't know the literature
    in this world well enough
  • 76:39 - 76:42
    to know if that's sort of a
    strong existing distinction,
  • 76:42 - 76:45
    whether you think this piece
    of legislation in front of us
  • 76:45 - 76:46
    bans one or the other,
  • 76:46 - 76:49
    whether you think we should
    be making that distinction,
  • 76:49 - 76:51
    I'd just be curious for anybody
    to weigh in on that front.
  • 76:51 - 76:55
    Got it, so when we hear the
    term facial recognition,
  • 76:55 - 76:58
    oftentimes what it
    means is not so clear.
  • 76:58 - 77:01
    And so with the way the
    current bill is written,
  • 77:01 - 77:03
    it actually covers a wide range
  • 77:03 - 77:07
    of different kinds of facial
    recognition technologies.
  • 77:07 - 77:10
    And I say technologies plural
  • 77:10 - 77:12
    to emphasize we're talking
    about different things.
  • 77:12 - 77:15
    So for example, you have face recognition,
  • 77:15 - 77:18
    which is about identifying
    a unique individual.
  • 77:18 - 77:21
    So when we're talking
    about face surveillance,
  • 77:21 - 77:22
    that is the issue.
  • 77:22 - 77:25
    But you can also have facial analysis,
  • 77:25 - 77:27
    that's guessing somebody's gender
  • 77:27 - 77:29
    or somebody's age, or other systems
  • 77:29 - 77:33
    that might try to infer your
    sexuality or your religion.
  • 77:33 - 77:35
    So you can still discriminate
  • 77:35 - 77:38
    even if it's not
    technically face recognition
  • 77:38 - 77:40
    in the technical sense.
  • 77:40 - 77:43
    So, it's important to have
    a really broad definition.
  • 77:43 - 77:44
    You also have face detection.
  • 77:44 - 77:47
    So if you think about weapon systems
  • 77:47 - 77:49
    where you're just detecting
    the presence of a face
  • 77:49 - 77:52
    without identifying a specific individual
  • 77:52 - 77:54
    that can still be problematic.
  • 77:54 - 77:56
    You also have companies like Faception
  • 77:56 - 77:58
    that say, we can infer criminality
  • 77:58 - 78:00
    just by looking at your face,
  • 78:00 - 78:04
    your potential to be a pedophile,
    a murderer, a terrorist.
  • 78:04 - 78:07
    And these are the kinds of systems
  • 78:07 - 78:10
    that companies are attempting
    to sell law enforcement.
  • 78:10 - 78:13
    So it's crucial that any legislation
  • 78:13 - 78:14
    that is being written,
  • 78:14 - 78:16
    has a sufficiently broad definition
  • 78:16 - 78:21
    of a wide range of facial
    recognition technologies, right?
  • 78:21 - 78:25
    With the plural so that
    you don't get a loophole,
  • 78:26 - 78:28
    which says, oh, we're
    doing face identification
  • 78:28 - 78:31
    or face verification, which
    falls under recognition.
  • 78:31 - 78:34
    So everything we're doing
    over here doesn't matter,
  • 78:34 - 78:35
    but it does.
  • 78:35 - 78:37
    So thank you so much for that question
  • 78:37 - 78:39
    because this is an
    important clarification.
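    [Editor's note: the distinction Joy draws above among these technologies can be sketched as rough task signatures. This is purely illustrative and not part of the testimony; every name below is hypothetical and no vendor API is implied.]

```python
# Illustrative sketch: "facial recognition" is a family of distinct tasks,
# which is why a narrow legal definition can leave loopholes.
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class FaceBox:
    x: int
    y: int
    w: int
    h: int

def detect_faces(image: bytes) -> List[FaceBox]:
    """Face detection: reports that faces are present, without naming anyone."""
    raise NotImplementedError

def analyze_face(face_crop: bytes) -> Dict[str, str]:
    """Facial analysis: guesses attributes such as age or gender; no identity,
    but still capable of discriminatory use."""
    raise NotImplementedError

def verify_face(face_crop: bytes, claimed_identity: str) -> bool:
    """Face verification (1:1): is this person who they claim to be?"""
    raise NotImplementedError

def identify_face(face_crop: bytes, gallery: Dict[str, bytes]) -> Optional[str]:
    """Face identification (1:N): who, out of an entire gallery, is this person?
    Run continuously over networked cameras, this becomes face surveillance."""
    raise NotImplementedError
```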
  • 78:42 - 78:44
    Great, thanks so much.
  • 78:44 - 78:46
    Was there anybody else who
    wanted to comment on that front?
  • 78:46 - 78:48
    No, okay, great.
  • 78:48 - 78:50
    Thank you, that was really helpful Joy.
  • 78:50 - 78:52
    Madam chair, I'm mindful
    of the public testimony,
  • 78:52 - 78:53
    so that'll be it for me.
  • 78:55 - 78:56
    Thank you very much councilor Bok,
  • 78:56 - 78:57
    councilor Mejia.
  • 78:58 - 79:01
    Thank you again to the panelists
  • 79:01 - 79:04
    for educating us beforehand
  • 79:04 - 79:08
    and always sharing all of
    this amazing information
  • 79:08 - 79:12
    and data and research that
    informs our thinking every day.
  • 79:12 - 79:14
    I am just curious
  • 79:14 - 79:19
    and at the same time a little bit worried
  • 79:19 - 79:23
    about, just kind of like,
  • 79:23 - 79:28
    how we can prevent this
    from ever passing ever.
  • 79:28 - 79:31
    Because it seems to me that
  • 79:31 - 79:33
    the way it's been positioned
    is that it's right for now,
  • 79:33 - 79:38
    because it's not accurate,
  • 79:38 - 79:40
    but I'm just curious what happened
  • 79:40 - 79:45
    if you've seen other cities
    or just even around the world
  • 79:45 - 79:46
    or any other instances
  • 79:46 - 79:51
    where this has kind of slipped
    under the radar and has passed.
  • 79:53 - 79:55
    I don't know if I'm making
    any sense here but...
  • 79:55 - 79:58
    Basically the bottom line is
    what I'm trying to understand
  • 79:58 - 80:01
    is that here we are standing firm
  • 80:01 - 80:04
    and banning facial recognition,
  • 80:04 - 80:07
    from what I understand at this point,
  • 80:07 - 80:11
    our commissioner is in agreement,
  • 80:11 - 80:14
    because of the issue of accuracy, right?
  • 80:14 - 80:16
    How can we get ahead of this situation
  • 80:16 - 80:21
    in a way that we can write this ordinance
  • 80:21 - 80:26
    to ensure that regardless of
    whether or not it is accurate,
  • 80:26 - 80:27
    that we can still protect our residents.
  • 80:27 - 80:29
    Do you get what I'm trying to say?
  • 80:29 - 80:31
    I do councilor Mejia
    and thank you for that.
  • 80:31 - 80:34
    I can address that quickly.
  • 80:34 - 80:38
    The council can't bind
    future city councils, right?
  • 80:38 - 80:40
    So I agree with you.
  • 80:40 - 80:43
    I think that this technology is dangerous,
  • 80:43 - 80:44
    whether it works or it doesn't.
  • 80:44 - 80:47
    And I think that's why the city of Boston
  • 80:47 - 80:50
    ought to take the step to
    ban its use in government.
  • 80:50 - 80:52
    I can promise you that as advocates,
  • 80:52 - 80:54
    we will show up to ensure
  • 80:54 - 80:57
    that these protections
    persist in the city of Boston,
  • 80:57 - 80:59
    as long as I'm alive. [Kade laughs]
  • 80:59 - 81:03
    And I think that's true of many
    of my friends and colleagues
  • 81:03 - 81:05
    in the advocacy space.
  • 81:05 - 81:07
    But some cities, some states
  • 81:07 - 81:09
    have considered approaches of
  • 81:09 - 81:10
    for example passing a moratorium
  • 81:10 - 81:13
    that expires after a few years,
  • 81:13 - 81:15
    we did not choose that route here,
  • 81:15 - 81:17
    in fact because of our concerns
  • 81:17 - 81:19
    that I think are identical to yours.
  • 81:19 - 81:23
    It's our view that we should
    never wanna live in a society
  • 81:23 - 81:24
    where the government can track us
  • 81:25 - 81:27
    through these surveillance cameras
  • 81:27 - 81:29
    by our face wherever we go
    or automatically get alerts,
  • 81:29 - 81:31
    just because I happened
    to walk past the camera
  • 81:31 - 81:32
    in a certain neighborhood.
  • 81:32 - 81:33
    That's a kind of surveillance
  • 81:33 - 81:36
    that should never exist in a free society.
  • 81:36 - 81:39
    So we agree that it should
    be permanently banned.
  • 81:40 - 81:40
    Thank you for that.
  • 81:40 - 81:44
    And then the other
    question that I have is,
  • 81:44 - 81:46
    I know that the city council
  • 81:46 - 81:48
    and all of its jurisdiction is all things
  • 81:48 - 81:50
    that deal with the city,
  • 81:50 - 81:54
    but I'm just wondering
    what if any examples
  • 81:54 - 81:56
    have you heard of other,
  • 81:56 - 81:58
    like maybe within the private sector
  • 81:58 - 82:02
    that are utilizing or considering
    utilizing this in some form?
  • 82:02 - 82:05
    Is there anything, any
    traction or anything
  • 82:05 - 82:05
    that we need to be mindful of,
  • 82:05 - 82:08
    happening outside of city
    government, if you will?
  • 82:11 - 82:12
    Joy, do you wanna speak
  • 82:12 - 82:14
    to some of the commercial applications?
  • 82:14 - 82:17
    Sure, I'm so glad you're
    asking this question
  • 82:17 - 82:19
    because facial recognition technologies
  • 82:19 - 82:23
    broadly speaking are not
    just in the government realm.
  • 82:23 - 82:27
    So right now, especially
    with the COVID pandemic,
  • 82:27 - 82:30
    you're starting to see
    more people make proposals
  • 82:30 - 82:33
    of using facial recognition
    technologies in different ways.
  • 82:33 - 82:35
    Some that I've seen start to surface are,
  • 82:35 - 82:40
    can we use facial recognition
    for contactless payments?
  • 82:40 - 82:43
    So your face becomes what you pay with.
  • 82:43 - 82:46
    You also have the case
  • 82:46 - 82:49
    of using facial analysis in employment.
  • 82:49 - 82:50
    So there's a company HireVue
  • 82:50 - 82:54
    that says we will analyze
    your facial movements
  • 82:54 - 82:57
    and use that to inform hiring decisions.
  • 82:57 - 83:00
    And guess what: we train on
    the current top performers.
  • 83:00 - 83:04
    So the biases that are already
    there can be propagated
  • 83:04 - 83:06
    and you actually might
    purchase that technology
  • 83:06 - 83:10
    thinking you're trying to remove bias.
  • 83:10 - 83:13
    And so there are many ways
    in which these technologies
  • 83:13 - 83:17
    might be presented with good intent,
  • 83:17 - 83:19
    but when you look at the ways in which
  • 83:19 - 83:21
    it actually manifests, there are harms
  • 83:21 - 83:24
    and oftentimes harms that
    disproportionately fall
  • 83:24 - 83:26
    on a marginalized group.
  • 83:26 - 83:27
    Even in the healthcare system.
  • 83:27 - 83:30
    I was reading research earlier
  • 83:30 - 83:31
    that was talking about ableism
  • 83:31 - 83:33
    when it comes to using some kinds of
  • 83:33 - 83:35
    facial analysis systems
  • 83:35 - 83:37
    within the healthcare context,
  • 83:37 - 83:39
    where it's not working as well
  • 83:39 - 83:41
    for older adults with dementia.
  • 83:41 - 83:43
    So the promises of these technologies
  • 83:43 - 83:46
    really have to be backed
    up with the claims.
  • 83:46 - 83:50
    And I think the council can
    actually go further by saying,
  • 83:50 - 83:51
    we're not only just talking about
  • 83:51 - 83:54
    government use of facial
    recognition technologies,
  • 83:54 - 83:56
    but if these technologies
  • 83:56 - 83:58
    impact someone's life in
    a material way, right?
  • 83:58 - 84:01
    So we're talking about
    economic opportunity.
  • 84:01 - 84:02
    We're talking about healthcare,
  • 84:02 - 84:04
    that there needs to be oversight
  • 84:04 - 84:08
    or clear restrictions depending
    on the type of application,
  • 84:08 - 84:11
    but the precautionary
    principle, press pause,
  • 84:11 - 84:15
    makes sense when this technology
    is nowhere near there.
  • 84:15 - 84:15
    Thank you.
  • 84:15 - 84:17
    For that and I'm not
    sure if I got the gavel,
  • 84:17 - 84:19
    but I do have one more,
  • 84:19 - 84:20
    You got the, you got the...
  • 84:20 - 84:22
    [crosstalk]
  • 84:22 - 84:26
    But I'll wait.
  • 84:26 - 84:30
    Sorry, thank you so much
    to you both, thank you.
  • 84:32 - 84:33
    Councilor Flaherty.
  • 84:39 - 84:41
    Okay, councilor Campbell said
  • 84:41 - 84:44
    she was having some spotty
    issues with her internet.
  • 84:45 - 84:46
    So if she's not here,
  • 84:46 - 84:49
    I'm gonna go ahead and
    go to councilor O'Malley
  • 84:49 - 84:52
    who I'm not sure he's still on.
  • 84:52 - 84:54
    I'm here madam chair.
  • 84:54 - 84:55
    There you are, councilor O'Malley.
  • 84:55 - 84:56
    Thank you.
  • 84:56 - 84:58
    Yeah, I will be brief because
    I think it's important
  • 84:58 - 84:59
    to get to the public testimony,
  • 84:59 - 85:00
    but it seems to me that this sounds
  • 85:00 - 85:04
    like a very productive
    government operations hearing.
  • 85:04 - 85:06
    It sounds to me that obviously
  • 85:06 - 85:09
    there's a lot of support across the board
  • 85:09 - 85:11
    and it looks as though the commissioner's
  • 85:11 - 85:12
    asking for one more working session,
  • 85:12 - 85:13
    which seems to make sense
  • 85:13 - 85:15
    and his comments were very positive.
  • 85:15 - 85:17
    So I think this is a good thing.
  • 85:17 - 85:18
    So thank you to the advocates.
  • 85:18 - 85:19
    Thank you to my colleagues,
  • 85:19 - 85:21
    particularly the lead sponsors
  • 85:21 - 85:23
    and the commissioner for his
    participation this afternoon.
  • 85:23 - 85:25
    And thank you for all the folks
  • 85:25 - 85:26
    whom we will hear from shortly.
  • 85:26 - 85:28
    That's all I've got.
  • 85:28 - 85:30
    Thank you, councilor Essaibi George.
  • 85:31 - 85:32
    Thank you madam chair.
  • 85:32 - 85:34
    And I do apologize for earlier,
  • 85:34 - 85:39
    taking a call and unmuting myself somehow.
  • 85:39 - 85:40
    I apologize for that.
  • 85:40 - 85:42
    I hope I didn't say anything inappropriate
  • 85:42 - 85:44
    or share any certain secrets.
  • 85:44 - 85:47
    Thank you to the advocates
  • 85:47 - 85:49
    and to the commissioner
    for being here today.
  • 85:49 - 85:51
    And thank you to the advocates
  • 85:51 - 85:54
    who have met with me over
    the last few months,
  • 85:54 - 85:56
    over a period of a few months,
  • 85:56 - 85:58
    I've just really learned a lot
    through our time together,
  • 85:58 - 86:02
    especially from the younger folks.
  • 86:02 - 86:03
    I am interested in some information,
  • 86:04 - 86:05
    if there is any,
  • 86:05 - 86:07
    but I understand that the commissioner
    is no longer with us.
  • 86:07 - 86:11
    So perhaps I'll just forward
    this question to him, myself.
  • 86:11 - 86:16
    But my question is around
    the FBI's evidence standards
  • 86:17 - 86:20
    around facial recognition technology.
  • 86:20 - 86:22
    I wonder if they've released that.
  • 86:22 - 86:26
    And then through the chair to the makers,
  • 86:26 - 86:30
    we had some questions
    around section B, part two,
  • 86:32 - 86:34
    and perhaps we can discuss
    this in the working session,
  • 86:35 - 86:37
    or I'll send this to you ahead of time,
  • 86:37 - 86:42
    but it does talk about
    section B; part two reads,
  • 86:42 - 86:46
    "Nothing in B or one shall prohibit Boston
  • 86:47 - 86:48
    or any Boston official
    from using evidence related
  • 86:48 - 86:50
    to the investigation of a specific crime
  • 86:50 - 86:52
    that may have been generated
  • 86:52 - 86:54
    from face surveillance systems."
  • 86:54 - 86:56
    So we're just wondering
    what types of evidence
  • 86:56 - 86:58
    are we allowing from here
  • 86:58 - 87:01
    and what situations are
    we trying to accommodate.
  • 87:01 - 87:04
    Just the specific question
    that we came up with
  • 87:04 - 87:06
    as an office, looking through that.
  • 87:06 - 87:09
    So I'll share that with the
    makers of this ordinance
  • 87:09 - 87:10
    to understand that a little bit better.
  • 87:10 - 87:12
    But if any of the panels have information
  • 87:12 - 87:16
    on the FBI standards regarding,
  • 87:16 - 87:18
    if they've shared their standards.
  • 87:18 - 87:21
    That is my question for this time.
  • 87:21 - 87:23
    Thank you, madam chair.
  • 87:23 - 87:24
    I can answer that quickly
  • 87:24 - 87:25
    and then Joy may also have an answer.
  • 87:25 - 87:27
    But one of the issues here
  • 87:27 - 87:30
    is that police departments
    across the country
  • 87:30 - 87:32
    have been using facial
    recognition to identify people
  • 87:32 - 87:36
    in criminal investigations
    for decades now actually,
  • 87:36 - 87:38
    and then not disclosing that information
  • 87:38 - 87:41
    to criminal defendants, which
    is a due process violation.
  • 87:41 - 87:43
    And it's a serious problem
  • 87:43 - 87:47
    in terms of the integrity of
    our criminal justice process
  • 87:47 - 87:50
    in the courts and defendant's
    rights to a fair trial.
  • 87:50 - 87:53
    So the FBI standards
    that you're referencing
  • 87:53 - 87:56
    one of the reasons, at least
    as far as we understand it
  • 87:56 - 87:58
    and I don't work for
    the police department,
  • 87:58 - 88:01
    but one of the reasons as
    far as we understand it,
  • 88:01 - 88:04
    that police departments have
    not been disclosing information
  • 88:04 - 88:06
    about these searches
    to criminal defendants,
  • 88:06 - 88:10
    is that there is no nationally
    agreed upon forensic standard
  • 88:10 - 88:13
    for the use of facial
    recognition technology
  • 88:13 - 88:15
    in criminal investigations.
  • 88:15 - 88:18
    And so my understanding is,
  • 88:18 - 88:20
    police departments and prosecutors fear
  • 88:20 - 88:22
    that if they were to disclose to courts
  • 88:22 - 88:23
    and criminal defendants,
  • 88:23 - 88:25
    that this technology was
    used to identify people
  • 88:25 - 88:27
    in specific investigations,
  • 88:27 - 88:30
    that those cases would basically
    get thrown out of court
  • 88:30 - 88:35
    because the people who
    performed the analysis
  • 88:36 - 88:38
    would not really be able to testify
  • 88:38 - 88:41
    to their training under any
    agreed upon national standard
  • 88:41 - 88:43
    for evaluating facial recognition results
  • 88:43 - 88:46
    or using the technology more generally.
  • 88:46 - 88:48
    I can't answer your other
    question about that exemption.
  • 88:48 - 88:52
    It is meant to address
    basically wanted posters,
  • 88:52 - 88:55
    because for example, if the
    FBI runs facial recognition
  • 88:55 - 88:58
    on an image of a bank robbery suspect
  • 88:58 - 88:59
    and then produces a wanted poster
  • 88:59 - 89:01
    that has that person's name on it
  • 89:01 - 89:03
    and shares with the
    Boston Police Department,
  • 89:03 - 89:07
    obviously the BPD is not going
    to be in a position to ask
  • 89:07 - 89:09
    every single police
    department in the country,
  • 89:09 - 89:11
    where did you get the name
    attached to this picture?
  • 89:11 - 89:15
    And so that's what that
    exemption is meant to address.
  • 89:15 - 89:16
    There have been some concerns
  • 89:16 - 89:17
    from some of our allied organizations
  • 89:17 - 89:19
    that we need to tighten that up
  • 89:19 - 89:21
    with a little more extra language
  • 89:21 - 89:24
    to make clear that that
    does not allow the BPD
  • 89:24 - 89:27
    to use the RMV system
    for facial recognition.
  • 89:27 - 89:29
    And we will be suggesting some language
  • 89:29 - 89:31
    to the committee to that effect.
  • 89:32 - 89:34
    And just to echo a little bit
  • 89:34 - 89:37
    of what Kade is sharing here,
  • 89:37 - 89:41
    I do wanna point out that in April, 2019,
  • 89:41 - 89:44
    with the example of the
    student being misidentified
  • 89:44 - 89:49
    as a terrorist suspect,
    you had that photo shared.
  • 89:49 - 89:51
    And even though it was incorrect,
  • 89:51 - 89:54
    you have the ramifications of what happens
  • 89:54 - 89:57
    because of the presumption of guilt.
  • 89:57 - 89:59
    And so this is why I especially believe
  • 89:59 - 90:02
    we should also be talking about systems
  • 90:02 - 90:05
    that try to infer criminality
  • 90:05 - 90:09
    using AI systems in any kind of way.
  • 90:09 - 90:11
    Because again, the presumption of guilt
  • 90:11 - 90:13
    then adds to the confirmation bias
  • 90:13 - 90:16
    of getting something from a machine.
  • 90:18 - 90:20
    So, I think in the working session,
  • 90:20 - 90:23
    will there be improved language
  • 90:23 - 90:25
    around tightening that piece up?
  • 90:27 - 90:29
    Yes, we have some suggestions.
  • 90:29 - 90:31
    Yeah, thank you, thank you madam chair.
  • 90:31 - 90:33
    You're welcome, councilor Flynn.
  • 90:34 - 90:37
    Wasn't sure if you had any questions.
  • 90:39 - 90:41
    Thank you, thank you councilor Edwards.
  • 90:41 - 90:44
    I know I had a conversation awhile back
  • 90:44 - 90:48
    with Erik Berg from the teacher's
    union about the subject,
  • 90:48 - 90:49
    and I learned a lot
  • 90:49 - 90:52
    and I learned a lot from
    listening to the advocates
  • 90:52 - 90:54
    this afternoon.
  • 90:57 - 91:02
    My question is, I know Joy
    covered it a little bit,
  • 91:02 - 91:06
    but besides for criminal investigations,
  • 91:06 - 91:09
    what are the reasons that the government
  • 91:09 - 91:13
    would want surveillance cameras?
  • 91:16 - 91:18
    Surveillance cameras, councilor Flynn,
  • 91:18 - 91:21
    or facial recognition
    technology specifically?
  • 91:21 - 91:22
    Both.
  • 91:24 - 91:28
    Well, my understanding
    is government agencies,
  • 91:28 - 91:30
    including the Boston
    Transportation Department
  • 91:30 - 91:33
    use surveillance cameras for
    things like traffic analysis,
  • 91:33 - 91:36
    accident reconstruction.
  • 91:36 - 91:41
    So there are non-criminal
    uses for surveillance cameras.
  • 91:41 - 91:46
    I'm not aware of any
    non-criminal uses in government,
  • 91:46 - 91:50
    at least for facial surveillance,
  • 91:50 - 91:51
    except in the,
  • 91:51 - 91:54
    I would say so-called intelligence realm.
  • 91:54 - 91:57
    So that would mean not a
    criminal investigation,
  • 91:57 - 91:59
    but rather an intelligence investigation,
  • 91:59 - 92:00
    for example of an activist
  • 92:00 - 92:04
    or an activist group or a
    protest or something like that.
  • 92:05 - 92:07
    Okay.
  • 92:07 - 92:11
    And besides government,
  • 92:11 - 92:15
    what are the purposes
    of facial recognition
  • 92:15 - 92:18
    that businesses would want?
  • 92:18 - 92:21
    I know they might use it in a store,
  • 92:21 - 92:23
    but what are some of the other reasons
  • 92:23 - 92:27
    the private sector might use them?
  • 92:27 - 92:30
    Sure, so speaking to private sector uses
  • 92:30 - 92:32
    of facial recognition,
  • 92:32 - 92:34
    oftentimes you see it
    being used for security
  • 92:34 - 92:39
    or securing access
    to a particular place.
  • 92:39 - 92:41
    So only particular
    employees can come in.
  • 92:41 - 92:43
    Something that's more alarming
  • 92:43 - 92:45
    that we're seeing with commercial use,
  • 92:45 - 92:47
    is the use in housing.
  • 92:47 - 92:50
    And so we have a case even in Brooklyn,
  • 92:50 - 92:53
    where you had tenants
    saying to the landlord,
  • 92:53 - 92:57
    we don't want to enter
    our homes with our face,
  • 92:57 - 93:00
    don't install facial
    recognition technology.
  • 93:00 - 93:01
    They actually won that case,
  • 93:01 - 93:04
    but not every group is
    going to be so successful.
  • 93:04 - 93:06
    So you can be in a situation
  • 93:06 - 93:09
    where your face becomes the key.
  • 93:09 - 93:12
    But when your face is the key
    and it gets stolen or hacked,
  • 93:12 - 93:14
    you can't just replace it so easily.
  • 93:14 - 93:17
    You might need some plastic surgery.
  • 93:17 - 93:20
    So we see facial recognition technologies
  • 93:20 - 93:23
    being used for access in certain ways.
  • 93:23 - 93:25
    And again, the dangerous use of trying
  • 93:25 - 93:28
    to predict something about somebody.
  • 93:28 - 93:30
    Are you going to be a good employee?
  • 93:30 - 93:33
    You have companies like Amazon,
  • 93:33 - 93:35
    saying we can detect fear from your face.
  • 93:35 - 93:39
    And so thinking about how
    that might be used as well.
  • 93:39 - 93:41
    The other thing we have to consider
  • 93:41 - 93:44
    when we're talking about commercial uses
  • 93:44 - 93:46
    of facial recognition technologies,
  • 93:46 - 93:48
    is oftentimes you'll have companies
  • 93:48 - 93:51
    have general purpose facial recognition.
  • 93:51 - 93:54
    So this then means other people can buy it
  • 93:54 - 93:56
    and use it in all kinds of ways.
  • 93:56 - 93:58
    So from your Snapchat filter
  • 93:58 - 94:00
    to putting it on lethal
    autonomous weapons.
  • 94:00 - 94:02
    You have a major range
  • 94:02 - 94:06
    with what can happen with
    these sorts of technologies.
  • 94:06 - 94:09
    Pardon me councilor.
  • 94:10 - 94:11
    I would just jump into also reiterate
  • 94:11 - 94:14
    that this ordinance only
    applies to government conduct.
  • 94:14 - 94:16
    It would not restrict in any way,
  • 94:16 - 94:20
    any entity in the private sector
    from using this technology.
  • 94:20 - 94:23
    But that said, some other uses,
  • 94:23 - 94:26
    I don't know if you've ever
    seen the film "Minority Report,"
  • 94:26 - 94:31
    but in that film, Tom
    Cruise enters the mall
  • 94:31 - 94:34
    and some technology says to him,
  • 94:34 - 94:37
    "Hello, sir, how did you
    like the size small underwear
  • 94:37 - 94:39
    you bought last time?"
  • 94:39 - 94:42
    So that's actually now
    becoming a reality as well,
  • 94:42 - 94:46
    that in the commercial space, at stores,
  • 94:46 - 94:48
    for marketing purposes,
  • 94:48 - 94:49
    we are likely going to see
  • 94:49 - 94:52
    if laws do not stop this from happening,
  • 94:52 - 94:54
    the application of facial surveillance
  • 94:54 - 94:57
    in a marketing and commercial context
  • 94:57 - 95:00
    to basically try to get
    us to buy more stuff.
  • 95:01 - 95:03
    And to underscore that point,
  • 95:03 - 95:05
    you have a patent from Facebook,
  • 95:05 - 95:07
    which basically Facebook has
    so many of our face prints.
  • 95:07 - 95:10
    We've been training
    their systems for years.
  • 95:10 - 95:13
    The patent says, given that
    we have this information,
  • 95:13 - 95:17
    we can provide you background details
  • 95:17 - 95:20
    about somebody entering your store,
  • 95:20 - 95:23
    and even give them a
    trustworthiness score
  • 95:23 - 95:26
    to restrict access to certain products.
  • 95:26 - 95:29
    This is a patent that has
    been filed by Facebook.
  • 95:29 - 95:31
    So it's certainly within the realm
  • 95:31 - 95:34
    of what companies are exploring to do.
  • 95:34 - 95:36
    And you already have companies
  • 95:36 - 95:38
    that use facial recognition technologies
  • 95:38 - 95:40
    to assess demographics.
  • 95:40 - 95:43
    So you have cameras that
    can be put into shelves.
  • 95:43 - 95:45
    You have cameras that can
    be put into mannequins.
  • 95:45 - 95:48
    So you already have this going on
  • 95:48 - 95:50
    and in Tacoma, Washington,
  • 95:50 - 95:52
    they even implemented a system
  • 95:52 - 95:55
    where you had to be face checked
  • 95:55 - 95:58
    before you could walk
    into a convenience store.
  • 95:58 - 96:01
    So this technology is already out there.
  • 96:02 - 96:03
    Well thank you.
  • 96:03 - 96:05
    I know my time is up.
  • 96:05 - 96:07
    I'm looking forward to
    continuing the conversation
  • 96:07 - 96:09
    'cause I have a couple more questions,
  • 96:09 - 96:11
    but I'll ask them at another time.
  • 96:11 - 96:13
    But again, thank you to the advocates
  • 96:13 - 96:15
    and thank you to councilor Edwards.
  • 96:15 - 96:17
    Thank you.
  • 96:17 - 96:20
    So it's between me and public testimony.
  • 96:20 - 96:23
    So I'm gonna just put my
    three questions out there
  • 96:23 - 96:26
    and then we're gonna go
    right down the list of folks
  • 96:26 - 96:28
    who have signed up and RSVP'd.
  • 96:29 - 96:31
    Again to those folks who are gonna come
  • 96:32 - 96:33
    after you've testified,
  • 96:33 - 96:35
    we were maxed out in our limit of people
  • 96:35 - 96:37
    who could participate.
  • 96:37 - 96:40
    So if you're willing to
    testify and then also sign out
  • 96:40 - 96:41
    and watch through YouTube
  • 96:41 - 96:43
    or watch through other mechanisms
  • 96:43 - 96:45
    to allow somebody else to testify,
  • 96:45 - 96:48
    that would be very
    helpful to this process.
  • 96:48 - 96:51
    I'm looking specifically
    at the issue of enforcement
  • 96:51 - 96:54
    in this statute or in this ordinance.
  • 96:54 - 96:57
    And it says no evidence derived therefrom
  • 96:57 - 97:00
    may be received in evidence in any proceeding in
  • 97:00 - 97:04
    or before any department, officer,
    agency, or regulatory body.
  • 97:04 - 97:08
    I want to be clear if the
    police received this evidence,
  • 97:09 - 97:12
    could a prosecutor use it in court.
  • 97:12 - 97:14
    So -
    That's my question.
  • 97:16 - 97:17
    So that's my issue.
  • 97:17 - 97:20
    There seems to be no
    actual evidence prohibition
  • 97:20 - 97:22
    for use in court against somebody.
  • 97:22 - 97:25
    The other issue or concern I have
  • 97:25 - 97:28
    is a provision: violations
    of this ordinance
  • 97:28 - 97:31
    by a city employee, shall
    result in consequences
  • 97:31 - 97:36
    that may include retraining,
    suspension, termination,
  • 97:36 - 97:38
    but they're all subject to provisions
  • 97:38 - 97:40
    of collective bargaining agreements.
  • 97:40 - 97:43
    So all of that could go away
  • 97:43 - 97:44
    if their union hasn't agreed
  • 97:45 - 97:48
    to that being part of
    the disciplinary steps
  • 97:48 - 97:49
    for that particular employee.
  • 97:49 - 97:52
    So to me that signals that the patrolmen's
  • 97:52 - 97:55
    and other unions of folks
  • 97:55 - 97:59
    who are covered need
    to have this as part of their
  • 97:59 - 98:01
    contract, as a violation of contract
    or violation of standards,
  • 98:01 - 98:04
    I think maybe I'm wrong.
  • 98:04 - 98:08
    And then finally, Joy, your testimony
  • 98:08 - 98:10
    about the private sector
    has really hit me.
  • 98:11 - 98:13
    Thank you so much.
  • 98:13 - 98:14
    I'm particularly concerned about this.
  • 98:14 - 98:15
    I'm a twin.
  • 98:17 - 98:20
    Facebook tags my sister in all
    of my pictures automatically.
  • 98:22 - 98:26
    I mean I'm a twin, councilor
    Essaibi George has triplets.
  • 98:26 - 98:28
    So for those, we're called
    multiples, you're all singletons,
  • 98:28 - 98:30
    you were born by yourself.
  • 98:30 - 98:32
    We share birth, multiples have you know,
  • 98:32 - 98:36
    and so naturally I'm concerned
    about this technology.
  • 98:36 - 98:39
    Not that my sister is inclined
    to do any criminal activity,
  • 98:39 - 98:41
    but she may go to a protest
    or two, I don't know,
  • 98:41 - 98:44
    but either way, the point is,
  • 98:44 - 98:49
    I'm concerned about the private
    actor and how they interact.
  • 98:49 - 98:51
    And I'm particularly concerned
  • 98:51 - 98:54
    is how far this will go down the line,
  • 98:54 - 98:57
    the chain, the supply chain, right?
  • 98:57 - 98:59
    'Cause we have $600 million in contracts
  • 98:59 - 99:02
    for all sorts of things
    for the city of Boston.
  • 99:02 - 99:05
    If we have a contract
    with a cleaning company
  • 99:05 - 99:07
    that requires the workers to sign in
  • 99:07 - 99:09
    with facial recognition, right?
  • 99:09 - 99:13
    So how far down the line
    can we go with our money?
  • 99:14 - 99:17
    Now I understand time is of the essence.
  • 99:17 - 99:18
    And that might be a big,
  • 99:18 - 99:19
    big question that we can work
    out in the working session.
  • 99:19 - 99:22
    I'm particularly concerned
    about the courts.
  • 99:22 - 99:27
    So if you wanna just focus
    on that one, thank you.
  • 99:27 - 99:28
    Thank you, councilor.
  • 99:28 - 99:29
    So just very quickly,
  • 99:31 - 99:36
    we are unclear on what the
    city council's power is
  • 99:36 - 99:39
    to control the prosecutor's office.
  • 99:39 - 99:43
    I think we'll look more into that,
  • 99:43 - 99:44
    we'll look closely at that.
  • 99:44 - 99:47
    On the second question, I agree.
  • 99:47 - 99:49
    We need to look at the BPA agreement,
  • 99:49 - 99:51
    which I understand expires this summer.
  • 99:51 - 99:56
    And then finally on the
    private sector concerns,
  • 99:56 - 99:57
    I share them.
  • 99:57 - 100:01
    Again, we wanna get this government ban
  • 100:01 - 100:03
    passed as quickly as possible.
  • 100:03 - 100:05
    The ACLU supports approaches like the one
  • 100:05 - 100:07
    that the state of Illinois has taken.
  • 100:07 - 100:09
    They passed the nation's strongest
  • 100:09 - 100:11
    consumer facing biometrics privacy law.
  • 100:11 - 100:14
    It's called the Biometric
    Information Privacy Act,
  • 100:14 - 100:15
    BIPA.
  • 100:15 - 100:18
    BIPA essentially prevents
    private companies,
  • 100:18 - 100:19
    any private companies,
  • 100:19 - 100:21
    from collecting your biometric data
  • 100:21 - 100:22
    without your opt-in consent.
  • 100:22 - 100:24
    And that's not, I clicked a button
  • 100:24 - 100:26
    when I was scrolling
    through a terms of service,
  • 100:26 - 100:29
    it's you actually have
    to sign a piece of paper
  • 100:29 - 100:29
    and give it to them.
  • 100:29 - 100:32
    So we need a law like that
    right here in Massachusetts.
  • 100:32 - 100:35
    Thank you.
  • 100:35 - 100:36
    So at this point,
  • 100:36 - 100:37
    I'm gonna turn it over
    to public testimony.
  • 100:37 - 100:39
    I have some folks who
    have already committed
  • 100:39 - 100:41
    and asked to speak today.
  • 100:41 - 100:42
    I'm gonna go through them.
  • 100:42 - 100:45
    I'm going to, it's
    already quarter to five.
  • 100:45 - 100:49
    I'm going to try and end this
    hearing no later than six.
  • 100:49 - 100:51
    So that's an hour and 15 minutes
  • 100:51 - 100:54
    to move through public testimony.
  • 100:54 - 100:55
    And I'm gonna try,
  • 100:55 - 100:57
    I'm gonna have to keep
    folks to two minutes
  • 100:57 - 100:58
    as I said before,
  • 100:58 - 101:01
    because there are a lot
    of people signed up.
  • 101:01 - 101:03
    So, of the folks who've signed up,
  • 101:03 - 101:05
    I will go through that list.
  • 101:05 - 101:07
    Then I'm gonna invite
    people to raise their hands
  • 101:07 - 101:10
    who may not have signed up directly,
  • 101:10 - 101:12
    who would also like to testify.
  • 101:12 - 101:16
    All right, so I have on the
    list, Bonnie Tenneriello
  • 101:18 - 101:20
    from the National Lawyers Guild.
  • 101:23 - 101:25
    After her I have Callan Bignoli
  • 101:25 - 101:28
    from the Library Freedom
    Project and Maty Cropley.
  • 101:28 - 101:29
    Those are the three folks lined up.
  • 101:32 - 101:33
    Let's see.
  • 101:34 - 101:36
    Kaitlin, is there a Bonnie?
  • 101:43 - 101:44
    Okay well, I see Callan
    right now ready to go.
  • 101:44 - 101:47
    So I'm gonna go ahead
    and start with Callan
  • 101:47 - 101:49
    if you wanna start and
    your two minutes has begun.
  • 101:52 - 101:55
    Sure, hi, I'm Callan Bignoli,
  • 101:55 - 101:56
    I am a resident of West Roxbury
  • 101:56 - 101:58
    and a librarian in the Boston area.
  • 101:58 - 102:00
    And I am speaking on behalf
  • 102:00 - 102:02
    of the Library Freedom Project today.
  • 102:02 - 102:03
    So thank you to the chair
  • 102:03 - 102:04
    and thank you to all of city council
  • 102:04 - 102:05
    for the chance to testify today
  • 102:05 - 102:08
    on this important piece of legislation.
  • 102:08 - 102:12
    So, the Library Freedom Project
    is a library advocacy group
  • 102:12 - 102:14
    that trains library workers to advocate
  • 102:14 - 102:16
    and educate their library community
  • 102:16 - 102:18
    about privacy and surveillance.
  • 102:18 - 102:21
    We include dozens of library
    workers from the U.S.,
  • 102:21 - 102:22
    Canada and Mexico,
  • 102:22 - 102:24
    including eight librarians
    from Massachusetts.
  • 102:24 - 102:27
    These library workers
    educate their colleagues
  • 102:27 - 102:28
    and library users about surveillance
  • 102:28 - 102:32
    by creating library programs,
    union initiatives, workshops,
  • 102:32 - 102:35
    and other resources to inform
    and agitate for change.
  • 102:35 - 102:37
    And facial recognition technology
  • 102:37 - 102:40
    represents a class of
    technology that is antithetical
  • 102:40 - 102:42
    to the values of privacy
  • 102:42 - 102:45
    and confidentiality of library workers.
  • 102:45 - 102:48
    As library staff, we understand
    that part of our work
  • 102:48 - 102:51
    is to represent these
    values in the services
  • 102:51 - 102:54
    and resources we provide
    to our library community
  • 102:54 - 102:57
    in order to reduce the harm of
    state and corporate scrutiny.
  • 102:57 - 102:59
    As we understand this,
  • 102:59 - 103:02
    we can see the social
    and economic imperatives
  • 103:02 - 103:05
    that privilege some and marginalize others
  • 103:05 - 103:07
    that are encoded into
    many computer systems
  • 103:07 - 103:08
    and applications.
  • 103:08 - 103:10
    This is a result of the structure
  • 103:10 - 103:11
    of our technology industry,
  • 103:11 - 103:14
    which prioritizes the
    interests of management,
  • 103:14 - 103:16
    venture capitalists and stockholders,
  • 103:16 - 103:18
    who are mostly white men.
  • 103:18 - 103:20
    The racial and gender biases
  • 103:20 - 103:23
    inherent in face surveillance technology
  • 103:23 - 103:24
    are indicative of those values
  • 103:24 - 103:28
    and they describe how moneyed
    interests seek to shape
  • 103:28 - 103:31
    or reinforce racist and gendered oppressions
  • 103:31 - 103:33
    by creating computer systems
  • 103:33 - 103:35
    that extend the reach of profiling
  • 103:35 - 103:37
    through the exercise of capital.
  • 103:37 - 103:40
    Democratic direct worker
    and community control
  • 103:40 - 103:41
    over the development, acquisition
  • 103:41 - 103:45
    and practices of surveillance
    and surveillance technology,
  • 103:45 - 103:47
    is an urgent priority to protect safety
  • 103:47 - 103:50
    and privacy and our
    neighborhoods and schools.
  • 103:50 - 103:51
    So the Library Freedom Project
  • 103:51 - 103:54
    supports this urgently necessary ordinance
  • 103:54 - 103:56
    to ban facial recognition
    technology in Boston
  • 103:56 - 103:58
    for the safety and privacy
  • 103:58 - 104:01
    of those who were working in
    the city and who live here.
  • 104:01 - 104:05
    We urge the council to wrest
    control over surveillance -
  • 104:05 - 104:06
    Oh, two minutes is up.
    Oops!
  • 104:06 - 104:09
    Two minutes is up, so
    if you wanna summarize.
  • 104:09 - 104:11
    I have one, one more sentence.
  • 104:11 - 104:15
    We can not allow Boston
    to adopt authoritarian,
  • 104:15 - 104:19
    unregulated and biased surveillance
    technology, thank you.
  • 104:19 - 104:21
    Thank you very much. Bonnie,
  • 104:21 - 104:23
    I don't know if Bonnie's available,
  • 104:23 - 104:27
    if not I see Maty is available.
  • 104:27 - 104:30
    So I'm gonna go ahead and
    start your two minutes, Maty.
  • 104:31 - 104:32
    Thank you very much.
  • 104:32 - 104:34
    Thank you councilor Edwards
  • 104:34 - 104:35
    and thank you to the city council
  • 104:35 - 104:38
    and all the panelists for
    taking up this important issue.
  • 104:38 - 104:42
    My name is Maty Cropley,
    I use they/them pronouns.
  • 104:42 - 104:45
    I'm a teen librarian and
    a bargaining unit member
  • 104:45 - 104:48
    of the Boston Public Library
    Professional Staff Association,
  • 104:48 - 104:52
    MSLA local 4928 AFT.
  • 104:52 - 104:53
    Our union of library workers
  • 104:53 - 104:56
    supports a ban on facial
    recognition in Boston.
  • 104:56 - 104:58
    In voting to do so,
  • 104:58 - 105:00
    our union recognizes
    that facial recognition
  • 105:00 - 105:02
    and other forms of biometric surveillance
  • 105:02 - 105:04
    are a threat to the civil liberties
  • 105:04 - 105:07
    and safety of our colleagues
    and of library users.
  • 105:07 - 105:09
    Our library ethics of privacy
    and intellectual freedom
  • 105:09 - 105:12
    are incompatible with
    this invasive technology.
  • 105:12 - 105:14
    We recognize that facial recognition
  • 105:14 - 105:17
    and other biometric
    surveillance technologies
  • 105:17 - 105:19
    are proven to be riddled with racist,
  • 105:19 - 105:21
    ableist and gendered algorithmic biases.
  • 105:21 - 105:24
    These systems routinely
    misidentify people of color,
  • 105:24 - 105:26
    which can result in needless
    contact with law enforcement
  • 105:26 - 105:30
    and other scrutiny, essentially
    automating racial profiling.
  • 105:30 - 105:33
    We recognize the harm of
    surveillance for youth in our city,
  • 105:33 - 105:36
    especially black, brown
    and immigrant youth.
  • 105:36 - 105:38
    Unregulated scrutiny by authorities
  • 105:38 - 105:40
    leads to early contact
    with law enforcement
  • 105:40 - 105:43
    resulting in disenfranchisement,
    marginalized futures
  • 105:43 - 105:46
    and potential death by state violence.
  • 105:46 - 105:49
    We recognize that public
    areas such as libraries,
  • 105:49 - 105:50
    parks, and sidewalks exist as spaces
  • 105:50 - 105:53
    in which people should be free to move,
  • 105:53 - 105:56
    speak, think, inquire, perform, protest,
  • 105:56 - 105:58
    and assemble freely without
    the intense scrutiny
  • 105:58 - 106:01
    of unregulated surveillance
    by law enforcement.
  • 106:01 - 106:04
    Public spaces exist to extend our rights
  • 106:04 - 106:05
    and provide space for the performance
  • 106:05 - 106:08
    of our civil liberties, not policing.
  • 106:08 - 106:10
    A ban on face surveillance technology
  • 106:10 - 106:12
    is critically important for the residents
  • 106:12 - 106:15
    and visitors to the city of Boston.
  • 106:15 - 106:18
    Our union encourages you to ban
  • 106:18 - 106:19
    the use of face surveillance
    in the city of Boston
  • 106:19 - 106:22
    by supporting and passing
    this very crucial ordinance.
  • 106:22 - 106:23
    I'd like to thank you for your time.
  • 106:23 - 106:25
    And we are the Boston Public Library,
  • 106:25 - 106:27
    Professional Staff Association.
  • 106:27 - 106:29
    Thank you very much.
  • 106:29 - 106:30
    Thank you very much.
  • 106:30 - 106:35
    I understand that Ms. Bonnie is available.
  • 106:35 - 106:39
    She's under the name Linda Rydzewski,
  • 106:42 - 106:44
    and we're gonna pull her up.
  • 106:47 - 106:49
    I'll check when she's available.
  • 106:50 - 106:53
    After her will be the
    following three testifiers,
  • 106:53 - 106:58
    Nikhill Thorat, Nour Sulalman
    and professor Woodrow Hartzog.
  • 107:00 - 107:02
    First, Bonnie.
  • 107:08 - 107:11
    She's under Linda Rydzewski, sorry.
  • 107:14 - 107:16
    Linda Rydzewski
  • 107:20 - 107:25
    I am so sorry, I've had
    technical problems.
  • 107:25 - 107:27
    This is Bonnie Tenneriello,
  • 107:27 - 107:29
    on behalf of the National Lawyers Guild.
  • 107:29 - 107:34
    And I had to sign in
    with my office account
  • 107:34 - 107:39
    and I am not representing
    Prisoner's Legal Services
  • 107:39 - 107:43
    and I couldn't change the screen
    name once I was signed in.
  • 107:43 - 107:44
    Am I audible to everyone?
  • 107:46 - 107:48
    You're on and you have two minutes.
  • 107:48 - 107:50
    Okay.
  • 107:50 - 107:53
    I'm speaking on behalf of
    the National Lawyers Guild,
  • 107:53 - 107:55
    Massachusetts Chapter,
  • 107:55 - 108:00
    which for over 80 years has
    fought for human rights,
  • 108:04 - 108:06
    and we're deeply
    concerned over the dangers
  • 108:06 - 108:08
    of facial surveillance.
  • 108:08 - 108:11
    We strongly support this ordinance.
  • 108:11 - 108:15
    Facial recognition
    deployed through cameras
  • 108:15 - 108:18
    throughout a city can be used
  • 108:18 - 108:20
    to track the movements of dissidents
  • 108:20 - 108:23
    and anyone attending a protest
    as others have observed.
  • 108:23 - 108:25
    Our history and the National Lawyers Guild
  • 108:25 - 108:29
    shows that this danger is real.
  • 108:29 - 108:31
    We have defended political dissidents
  • 108:31 - 108:33
    targeted by law enforcement,
  • 108:33 - 108:36
    such as members of the
    Black Panther Party,
  • 108:36 - 108:37
    American Indian Movement,
  • 108:37 - 108:39
    Puerto Rican Independence Movement
  • 108:39 - 108:42
    and more recently, here in Boston,
  • 108:42 - 108:44
    Occupy Boston, climate change activists,
  • 108:44 - 108:47
    those opposing a straight pride event
  • 108:47 - 108:49
    and Black Lives Matter.
  • 108:49 - 108:54
    We know that the dangers
    of political surveillance
  • 108:56 - 108:59
    and political use of
    this technology are real.
  • 109:00 - 109:01
    The National Lawyers Guild
  • 109:01 - 109:04
    also helped expose illegal surveillance
  • 109:04 - 109:09
    in the Church Committee
    1975-76 COINTELPRO hearings.
  • 109:11 - 109:13
    We know from this experience
  • 109:13 - 109:16
    that this can be used to disrupt
  • 109:16 - 109:20
    and prosecute political protest.
  • 109:20 - 109:22
    People must be able to demonstrate
  • 109:22 - 109:26
    without fear of being identified
    and tracked afterwards.
  • 109:26 - 109:29
    I will not repeat what others have said
  • 109:29 - 109:33
    very articulately about just
    the basic privacy concerns
  • 109:33 - 109:35
    of being able to walk through a city,
  • 109:35 - 109:38
    without having cameras track your identity
  • 109:38 - 109:40
    and movements throughout the city,
  • 109:40 - 109:44
    nor will I repeat the excellent testimony
  • 109:44 - 109:49
    that was given about the
    racial bias in this technology.
  • 109:50 - 109:52
    But on behalf of the
    National Lawyers Guild,
  • 109:52 - 109:57
    I will say that we cannot
    tolerate a technology
  • 109:57 - 110:00
    that perpetuates racism
    in law enforcement,
  • 110:00 - 110:02
    as tens of thousands of people
  • 110:02 - 110:05
    take to the streets
    calling for racial justice.
  • 110:05 - 110:08
    We're at a historical moment
    when our society is demanding
  • 110:08 - 110:11
    and this council is
    considering a shift away
  • 110:11 - 110:15
    from control and policing
    of communities of color
  • 110:15 - 110:17
    and low-income communities
  • 110:17 - 110:18
    and towards community directed programs
  • 110:18 - 110:22
    that channel resources to
    jobs, healthcare, education,
  • 110:22 - 110:25
    and other programs that
    create true safety.
  • 110:25 - 110:29
    We reject facial surveillance
  • 110:29 - 110:33
    and we urge the council
    to pass this ordinance.
  • 110:33 - 110:35
    Thank you very much.
  • 110:37 - 110:40
    So up next we have Nikhil Thorat.
  • 110:43 - 110:44
    Sorry, if I mispronounced your name.
  • 110:44 - 110:45
    No harm, you got that right.
  • 110:45 - 110:48
    So thank you chair and to the
    amazing speakers before me.
  • 110:48 - 110:53
    My name is Nikhil, I work
    at Google Brain in Cambridge.
  • 110:53 - 110:55
    So I specifically work in the area
  • 110:55 - 110:58
    of machine learning fairness
    and interpretability.
  • 110:58 - 110:59
    So this is sort of the stuff that we do.
  • 110:59 - 111:01
    So I'm here as a citizen,
  • 111:01 - 111:03
    and I am very much in
    favor of the ordinance
  • 111:03 - 111:06
    banning facial recognition
    by public officials
  • 111:06 - 111:07
    here in Boston.
  • 111:07 - 111:10
    So modern AI algorithms
    are statistical machines.
  • 111:10 - 111:13
    They base predictions off
    patterns in data sets
  • 111:13 - 111:15
    that they're trained on,
    there's no magic here.
  • 111:15 - 111:19
    So any bias that stems from
    how a data set is collected,
  • 111:19 - 111:21
    like non-representative collection of data
  • 111:21 - 111:22
    for different races
  • 111:22 - 111:24
    because of historical context,
  • 111:24 - 111:27
    gets automatically reflected
    in the AI model's predictions.
  • 111:27 - 111:31
    So what this means is that it
    manifests as unequal errors
  • 111:31 - 111:33
    between subgroups, like race or gender.
  • 111:33 - 111:35
    And this is not theoretical,
  • 111:35 - 111:37
    it's been studied very deeply
  • 111:38 - 111:39
    by many AI researchers in this field
  • 111:39 - 111:40
    who have been on this call.
  • 111:40 - 111:42
    There's a federal NIST study,
  • 111:42 - 111:45
    a National Institute of
    Standards and Technology Study
  • 111:45 - 111:47
    that looks at almost 200
    facial recognition algorithms
  • 111:47 - 111:50
    and found significant
    demographic disparities
  • 111:50 - 111:51
    in most of them.
  • 111:51 - 111:55
    And there's another paper
    by Joy here, Gender Shades,
  • 111:55 - 111:57
    that looked at three commercial
    classification systems
  • 111:57 - 111:58
    and found that there are
  • 112:00 - 112:04
    incorrect results 35% of the time
    for dark-skinned females
  • 112:04 - 112:07
    versus only 0.8% incorrect
    for light-skinned males.
  • 112:07 - 112:09
    This is horrible, right?
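    As a minimal sketch of the kind of per-subgroup error-rate comparison such audits report, here is illustrative Python; the subgroup labels and records below are invented for illustration and are not the study's data.
    from collections import defaultdict

    # Hypothetical evaluation records: (subgroup, true_identity, predicted_identity).
    records = [
        ("darker_female", "A", "B"), ("darker_female", "B", "B"), ("darker_female", "C", "D"),
        ("lighter_male", "A", "A"), ("lighter_male", "B", "B"), ("lighter_male", "C", "C"),
    ]

    totals = defaultdict(int)
    errors = defaultdict(int)
    for subgroup, truth, prediction in records:
        totals[subgroup] += 1
        if truth != prediction:
            errors[subgroup] += 1

    # Unequal per-subgroup error rates are the disparity being described.
    for subgroup in totals:
        print(subgroup, f"error rate: {errors[subgroup] / totals[subgroup]:.1%}")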
  • 112:09 - 112:13
    So these mistakes also can't
    be viewed in isolation.
  • 112:13 - 112:15
    They will perpetuate bias issues
  • 112:15 - 112:16
    by informing the collection
  • 112:16 - 112:18
    of the next generation of datasets,
  • 112:18 - 112:20
    which are then again used
  • 112:20 - 112:23
    to train the next generation of AI models.
  • 112:23 - 112:25
    Moreover, when mistakes are made either
  • 112:25 - 112:27
    by a machine or by a human being,
  • 112:27 - 112:28
    we need to be able to ask the question,
  • 112:28 - 112:30
    why was that mistake made?
  • 112:30 - 112:33
    And we simply haven't
    developed robust tools
  • 112:33 - 112:35
    to be able to hold these algorithms
  • 112:35 - 112:37
    to an acceptable level of transparency,
  • 112:37 - 112:40
    to answer any of these critical questions.
  • 112:40 - 112:42
    Deployment of these systems
  • 112:42 - 112:44
    will exacerbate the issues of racial bias
  • 112:44 - 112:46
    in policing and accelerate
    the various social inequalities
  • 112:47 - 112:49
    that we're protesting;
    innocent people will be targeted.
  • 112:49 - 112:52
    The threshold for deploying these systems
  • 112:52 - 112:53
    should be extremely high,
  • 112:53 - 112:56
    one that's arguably not
    reachable: beyond accuracy,
  • 112:56 - 112:58
    beyond the idea of a hundred percent accuracy,
  • 112:58 - 113:00
    data would have to be public
  • 113:00 - 113:03
    and there would have to be
    actionable algorithmic oversight.
  • 113:03 - 113:06
    This is an incredibly high bar
    and we are nowhere near that,
  • 113:06 - 113:08
    both algorithmically and societally.
  • 113:08 - 113:09
    Along with this ordinance,
  • 113:09 - 113:11
    we must start laying the groundwork
  • 113:11 - 113:14
    for rigorous criteria
    for future technology
  • 113:14 - 113:16
    as it is an inevitable conversation.
  • 113:16 - 113:18
    But first we must pass the ordinance
  • 113:18 - 113:19
    banning facial recognition in Boston.
  • 113:19 - 113:21
    Thank you very much.
  • 113:21 - 113:23
    Thank you very much,
  • 113:23 - 113:26
    following Nikhil we have Nour Sulalman.
  • 113:29 - 113:32
    Sorry again for mispronunciation.
  • 113:32 - 113:32
    It's all good.
  • 113:33 - 113:35
    Thank you council members
  • 113:35 - 113:38
    and authors of the ordinance
    for your service to the city.
  • 113:38 - 113:41
    My name is Nour Sulalman Langedorfer,
  • 113:41 - 113:43
    and I use she/her pronouns.
  • 113:43 - 113:46
    I'm a resident of Jamaica Plain,
  • 113:46 - 113:49
    and I'm here in my capacity
    as a private citizen.
  • 113:49 - 113:52
    I won't repeat what everyone else said,
  • 113:52 - 113:55
    but I will say that the
    use of facial recognition
  • 113:55 - 113:59
    by law enforcement today,
    is especially disturbing
  • 113:59 - 114:02
    due to the epidemic of police
    violence in this country
  • 114:02 - 114:05
    and the dysfunctional relationship it has
  • 114:05 - 114:08
    with black and brown people in the city.
  • 114:08 - 114:11
    And with all due respect,
  • 114:11 - 114:13
    I'm unconvinced by the distinction made
  • 114:13 - 114:17
    between facial surveillance
    and facial recognition.
  • 114:17 - 114:22
    I'm even more disturbed, or I am disturbed,
  • 114:22 - 114:23
    that the commissioner does not know
  • 114:23 - 114:27
    if the police department uses
    facial recognition software.
  • 114:27 - 114:31
    I ask the council to consider supporting
  • 114:31 - 114:34
    that the ordinance establish a total ban
  • 114:34 - 114:36
    on the use of facial recognition software
  • 114:36 - 114:39
    in all public spaces,
    including by law enforcement,
  • 114:39 - 114:41
    malls and shopping centers, retail spaces,
  • 114:41 - 114:45
    restaurants open to the public,
    state government buildings,
  • 114:45 - 114:49
    courts, public schools,
    schools funded by the state,
  • 114:49 - 114:51
    on public streets and
    in places of employment,
  • 114:51 - 114:53
    as a condition of employment
  • 114:53 - 114:56
    and as a condition of landlords
    leasing rental properties.
  • 114:56 - 114:58
    The ban should also apply to the use
  • 114:58 - 115:01
    and purchase of facial recognition software
  • 115:01 - 115:04
    by any state agency operating in Boston,
  • 115:04 - 115:06
    including law enforcement
  • 115:06 - 115:09
    and I further urge the council
    to consider adopting a ban
  • 115:09 - 115:13
    on the use of private DNA
    databases by law enforcement
  • 115:13 - 115:16
    and all remote surveillance systems,
  • 115:16 - 115:18
    including those that
    recognize individual gaits.
  • 115:18 - 115:23
    Information gathered through
    the aforementioned technologies
  • 115:24 - 115:27
    should not be deemed
    admissible in civil, criminal
  • 115:27 - 115:30
    or administrative proceedings
  • 115:30 - 115:34
    or serve as a basis for
    granting a search warrant.
  • 115:34 - 115:35
    Thank you madam chair.
  • 115:38 - 115:39
    Thank you very much.
  • 115:40 - 115:44
    I have a professor Woodrow Hartzog.
  • 115:46 - 115:48
    Okay, two minutes.
  • 115:48 - 115:50
    Thank you so much, dear councilors,
  • 115:50 - 115:52
    thank you for allowing me the opportunity
  • 115:52 - 115:54
    to speak in support of the ordinance
  • 115:54 - 115:57
    banning face surveillance in Boston.
  • 115:57 - 115:59
    I am a professor of law
    and computer science
  • 115:59 - 116:01
    at Northeastern University,
  • 116:01 - 116:03
    who has been researching and writing
  • 116:03 - 116:05
    about the risks of facial
    recognition technologies
  • 116:05 - 116:06
    for over seven years,
  • 116:06 - 116:09
    I make these comments in my
    personal academic capacity.
  • 116:09 - 116:11
    I'm not serving as an advocate
  • 116:11 - 116:14
    for any particular organization.
  • 116:14 - 116:17
    Please allow me to be direct,
  • 116:17 - 116:18
    facial recognition technology
  • 116:18 - 116:22
    is the most dangerous
    surveillance tool ever invented.
  • 116:22 - 116:25
    It poses substantial
    threats to civil liberties,
  • 116:25 - 116:27
    privacy and democratic accountability.
  • 116:27 - 116:31
    Quite simply the world has
    never seen anything like it.
  • 116:31 - 116:34
    Traditional legal rules, such
    as requiring legal process
  • 116:34 - 116:37
    or people's consent before
    surveillance could be conducted,
  • 116:37 - 116:39
    will only entrench these systems,
  • 116:39 - 116:41
    and lead to a more watched society
  • 116:41 - 116:44
    where we are all treated
    as suspects all the time.
  • 116:44 - 116:46
    Anything less than a ban
  • 116:46 - 116:49
    will inevitably lead to unacceptable abuse
  • 116:49 - 116:51
    of the massive power
    bestowed by these systems.
  • 116:51 - 116:53
    That is why I believe that this ordinance
  • 116:53 - 116:55
    is justified and necessary.
  • 116:55 - 116:57
    There are many ways that law enforcement's
  • 116:57 - 117:00
    use of facial recognition
    technology can harm people.
  • 117:00 - 117:02
    Government surveillance
  • 117:02 - 117:05
    has disproportionately targeted
    marginalized communities,
  • 117:05 - 117:07
    specifically people of color,
  • 117:07 - 117:09
    facial recognition will
    only make this worse
  • 117:09 - 117:11
    because it is inaccurate and biased
  • 117:11 - 117:14
    along racial and gender lines.
  • 117:14 - 117:16
    And while facial
    recognition is unacceptable
  • 117:16 - 117:18
    when it's biased and error prone,
  • 117:18 - 117:20
    the more accurate it becomes,
  • 117:20 - 117:22
    the more oppressive it will get.
  • 117:22 - 117:26
    Accurate systems will be more
    heavily used and invested in
  • 117:26 - 117:29
    further endangering the people of Boston.
  • 117:29 - 117:32
    Use of facial recognition
    seems likely to create
  • 117:32 - 117:34
    a pervasive atmosphere of chill.
  • 117:34 - 117:37
    These tools make it easier
    to engage in surveillance,
  • 117:37 - 117:40
    which means that more
    surveillance can and will occur.
  • 117:40 - 117:43
    The mere prospect of a
    hyper-surveilled society
  • 117:43 - 117:45
    could routinely prevent citizens
  • 117:45 - 117:48
    from engaging in First
    Amendment protected activities,
  • 117:48 - 117:50
    such as protesting and worshiping,
  • 117:50 - 117:54
    for fear of ending up on
    government watch lists.
  • 117:54 - 117:56
    Facial recognition also poses a threat
  • 117:56 - 117:58
    to our ideals of due process,
  • 117:58 - 117:59
    because it makes it all too easy
  • 117:59 - 118:02
    for governments to excessively
    enforce minor infractions
  • 118:02 - 118:05
    as pretext for secretly monitoring
  • 118:05 - 118:08
    and retaliating against our citizens
  • 118:08 - 118:11
    who are often targeted for
    speaking up like journalists,
  • 118:11 - 118:13
    whistleblowers and activists.
  • 118:13 - 118:16
    The net result could be
    anxious and oppressed citizens
  • 118:16 - 118:19
    who are denied fundamental
    opportunities and rights.
  • 118:19 - 118:21
    For the reasons outlined above,
  • 118:21 - 118:22
    I strongly support the ordinance
  • 118:22 - 118:25
    banning facial recognition
    surveillance in Boston.
  • 118:25 - 118:27
    It's the best approach for preventing
  • 118:27 - 118:28
    an Orwellian future
  • 118:28 - 118:30
    and ensuring that the city of Boston
  • 118:30 - 118:33
    remains a place where people can flourish
  • 118:33 - 118:36
    and civil liberties are protected.
  • 118:36 - 118:37
    Thank you so much, professor.
  • 118:38 - 118:43
    We have coming up, Alex
    Marthews, Emily Reif,
  • 118:45 - 118:47
    Jurell Laronal.
  • 118:49 - 118:53
    Those three folks, I don't
    know if they're ready to go.
  • 118:53 - 118:56
    But we'll start, I see Jurell, I see Alex.
  • 118:57 - 119:00
    Go ahead, Alex Marthews.
  • 119:01 - 119:02
    Hi.
  • 119:04 - 119:09
    I don't wish to echo what
    other excellent advocates
  • 119:10 - 119:11
    have been saying,
  • 119:11 - 119:15
    but I am here representing Digital Fourth,
  • 119:15 - 119:17
    Restore the Fourth Boston,
  • 119:17 - 119:19
    which is a volunteer-based
  • 119:19 - 119:23
    advocacy group on surveillance
    and policing issues
  • 119:23 - 119:24
    in the Boston area.
  • 119:25 - 119:30
    I wanna highlight some
    comments of commissioner Gross,
  • 119:31 - 119:34
    in addition to our testimony.
  • 119:35 - 119:39
    Commissioner Gross seems to
    recognize the racial biases
  • 119:39 - 119:43
    and limitations of this
    technology as it stands now,
  • 119:43 - 119:47
    but he says it's improving
    and he holds out hope
  • 119:47 - 119:52
    that it will be more
    accurate in the future.
  • 119:52 - 119:54
    And that perhaps at some future point,
  • 119:54 - 119:57
    the Boston City Council should reconsider
  • 119:57 - 119:59
    whether it will be appropriate
  • 119:59 - 120:03
    to implement facial
    recognition technology.
  • 120:03 - 120:06
    The problem with this is that
  • 120:06 - 120:11
    a 100% accurate facial
    recognition technology system
  • 120:11 - 120:12
    would be terrifying.
  • 120:13 - 120:18
    And it would represent the
    death of anonymity in public.
  • 120:19 - 120:23
    In terms of the effect of
    facial recognition technology
  • 120:23 - 120:26
    on legislators themselves,
  • 120:27 - 120:30
    there have been a number of studies now
  • 120:30 - 120:34
    that show that current facial
    recognition technologies
  • 120:34 - 120:36
    are perfectly capable
  • 120:38 - 120:40
    of misidentifying legislators as criminals
  • 120:41 - 120:46
    and these systems becoming 100% accurate
  • 120:46 - 120:50
    would pose even more
    worrying risks about that.
  • 120:50 - 120:52
    As city councilors,
  • 120:52 - 120:55
    where are you going when
    you're going in public,
  • 120:55 - 120:57
    are you meeting with
    social justice groups?
  • 120:57 - 120:59
    Are you meeting with immigrant advocates?
  • 120:59 - 121:02
    Are you meeting with, God forbid,
  • 121:02 - 121:05
    police reform and police
    accountability groups?
  • 121:05 - 121:08
    Are you doing something
    embarrassing perhaps
  • 121:08 - 121:11
    that could be used to influence your votes
  • 121:11 - 121:13
    on matters of public interest
  • 121:13 - 121:16
    or matters relating to the police budget?
  • 121:16 - 121:21
    The risk of these things
    happening is very real,
  • 121:21 - 121:23
    if this technology is implemented in the future.
  • 121:23 - 121:26
    And the only appropriate response
  • 121:26 - 121:31
    is for today's city council
    to ban this technology.
  • 121:32 - 121:34
    Thank you very much.
  • 121:36 - 121:39
    I believe, I don't know
    if Emily is available,
  • 121:39 - 121:44
    but I do see Jurell, Jurell
    would you like to go?
  • 121:51 - 121:52
    Thank you.
  • 121:52 - 121:54
    Thank you, Boston City Council
    for having this platform.
  • 121:54 - 121:56
    I wanna thank you.
  • 121:56 - 121:59
    Thank you and a shout out
    to the expert advocates
  • 121:59 - 122:02
    on the subject for shedding
    light on the subject
  • 122:02 - 122:05
    that me and many of us
    are not fully versed in.
  • 122:05 - 122:08
    What I am well versed in, though,
  • 122:08 - 122:10
    is the reality surrounding
  • 122:10 - 122:12
    the Boston Police Department's history
  • 122:12 - 122:15
    and the negative impact that they have had
  • 122:15 - 122:17
    on black and brown
    Bostonians for decades.
  • 122:17 - 122:19
    My name is Jurell Laronal,
  • 122:19 - 122:21
    I'm a formerly incarcerated black man
  • 122:21 - 122:25
    from a heavily policed
    neighborhood of Dorchester.
  • 122:25 - 122:29
    I'm a community organizer for
    Families for Justice as Healing
  • 122:29 - 122:32
    and I'm here to support the ordinance
  • 122:32 - 122:36
    banning facial recognition
    technology in Boston
  • 122:36 - 122:39
    presented by councilors Wu and Arroyo.
  • 122:39 - 122:41
    First, what are we talking about?
  • 122:41 - 122:44
    We are talking about a force whose actions
  • 122:44 - 122:48
    and racial profiling techniques
    have been documented,
  • 122:48 - 122:51
    tried in court and notoriously
    known for targeting
  • 122:51 - 122:55
    and terrorizing those from
    my disinvested communities.
  • 122:55 - 123:00
    We are talking about
    militarization to intimidate,
  • 123:00 - 123:04
    force their will on
    black and brown citizens
  • 123:04 - 123:05
    of the Commonwealth.
  • 123:05 - 123:08
    A force whose own field interrogation
  • 123:08 - 123:09
    and observation records,
  • 123:09 - 123:13
    show that almost 70% of those stopped
    were black residents
  • 123:13 - 123:16
    in a city that's only around 25.2% black.
  • 123:16 - 123:19
    So we're talking about
    taking the same force
  • 123:19 - 123:22
    and further funding them
  • 123:22 - 123:24
    and equipping them with another weapon
  • 123:24 - 123:27
    that would definitely be used
    to further oppress, target
  • 123:27 - 123:30
    and incarcerate our sons and fathers
  • 123:30 - 123:32
    and daughters and mothers.
  • 123:32 - 123:35
    My organization is currently
    supporting the appeals
  • 123:35 - 123:38
    and release efforts of black
    men and their families,
  • 123:38 - 123:41
    who have been incarcerated
    since the seventies
  • 123:41 - 123:44
    due to racist profiling
    and racist misidentification.
  • 123:45 - 123:49
    When I think about facial
    recognition software,
  • 123:49 - 123:51
    which at best is predatory and inaccurate,
  • 123:51 - 123:55
    I don't see a solution
    and I don't see safety.
  • 123:55 - 123:57
    I see yet another form of racial bias
  • 123:57 - 123:58
    and another way of police
  • 124:00 - 124:00
    to cause harm in my community.
  • 124:01 - 124:04
    Today our city's historical
    racial profiling issue
  • 124:04 - 124:07
    with, somebody said it earlier,
  • 124:07 - 124:08
    the federal government study
  • 124:08 - 124:12
    that found the algorithm, my fault,
  • 124:12 - 124:15
    failed to identify people of color
  • 124:15 - 124:18
    and children and elderly and women.
  • 124:19 - 124:22
    I see that it needs to be banned
  • 124:22 - 124:25
    because we can't continue
    condoning and reassuring
  • 124:25 - 124:29
    and fueling other generations
    of black and brown
  • 124:29 - 124:32
    men and women from Dorchester
    Roxbury and Mattapan,
  • 124:32 - 124:35
    to more trauma and more incarceration.
  • 124:35 - 124:37
    Investment in our communities
  • 124:37 - 124:39
    through community led processes,
  • 124:39 - 124:41
    is the only thing that we need,
  • 124:41 - 124:44
    not facial recognition tech.
  • 124:44 - 124:46
    Oh these bees are messing
    with me, sorry about that.
  • 124:46 - 124:50
    No matter what level that
    technology reaches and thank you.
  • 124:50 - 124:51
    That's all I have to say.
  • 124:51 - 124:53
    I'm gonna keep it quick.
  • 124:55 - 124:56
    Thank you very much.
  • 124:56 - 124:57
    I really do appreciate it.
  • 124:57 - 124:58
    And I think, again, folks who speak from the heart
  • 124:58 - 124:59
    about their own experience
  • 124:59 - 125:02
    and specifically talk about
    how it would impact them,
  • 125:02 - 125:04
    I do appreciate that.
  • 125:04 - 125:06
    Not that I don't appreciate the experts
  • 125:06 - 125:07
    but I'm telling you there's nothing better
  • 125:07 - 125:10
    than hearing about how
    residents are directly impacted.
  • 125:10 - 125:11
    Thank you.
  • 125:12 - 125:17
    I have also on the list, Cynthia Pades,
  • 125:19 - 125:21
    I don't have a name,
  • 125:21 - 125:24
    but I have an organization
    called For the People Boston.
  • 125:26 - 125:27
    I don't think we have either one.
  • 125:29 - 125:30
    There is.
  • 125:30 - 125:33
    Oh, I see a Marco, Marco Staminivich.
  • 125:37 - 125:39
    Marco, ready to go?
  • 125:42 - 125:43
    Marco!
  • 125:43 - 125:45
    Sure.
    Okay.
  • 125:46 - 125:49
    How is it going, everyone?
  • 125:49 - 125:52
    Thank you to the council and the panelists
  • 125:52 - 125:54
    for allowing me to speak.
  • 125:54 - 125:56
    So I'm a machine learning engineer
  • 125:56 - 125:58
    working in an adjacent field
  • 125:58 - 126:00
    and a resident of Jamaica Plain
  • 126:00 - 126:03
    and I'm here today as a private citizen.
  • 126:03 - 126:06
    So this is my first time
    speaking at a public hearing,
  • 126:06 - 126:07
    so cut me some slack.
  • 126:07 - 126:09
    Although I'm not an expert
  • 126:09 - 126:13
    on the surveillance
    aspect and the applications,
  • 126:13 - 126:18
    I am familiar with the
    technology aspect of this stuff.
  • 126:19 - 126:21
    So I felt the need to testify today
  • 126:21 - 126:23
    because of the gravity of this issue.
  • 126:23 - 126:28
    Although the technology may
    seem benign upon first glance,
  • 126:29 - 126:32
    this is an extremely
    powerful and dangerous tool.
  • 126:32 - 126:34
    So to me this amounts to basically
  • 126:34 - 126:36
    a form of digital stop and frisk,
  • 126:36 - 126:37
    and it's even more dangerous
  • 126:37 - 126:40
    since it can be kind
    of constantly running,
  • 126:40 - 126:43
    constantly working anywhere at all times,
  • 126:43 - 126:45
    kind of tracking folks.
  • 126:45 - 126:46
    This seems like a clear violation
  • 126:46 - 126:48
    of the Fourth Amendment to me.
  • 126:49 - 126:52
    As many commenters and
    panelists today have noted,
  • 126:52 - 126:55
    in addition, as many
    folks have noted today,
  • 126:55 - 126:57
    the technology is certainly not perfect
  • 126:57 - 127:00
    and it's riddled with
    biases based on the data
  • 127:00 - 127:02
    that it was trained on
    and how it was trained
  • 127:02 - 127:04
    and by the folks that trained it.
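The mechanism described here, a model inheriting bias from the data it was trained on and how it was trained, can be sketched with a small, purely hypothetical example. Everything below is synthetic: the data, the groups, and the numbers stand in for no real system.

```python
# Illustrative sketch only: a classifier fit mostly on one group's examples
# tends to perform worse on an under-represented group. All data is synthetic.
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Toy stand-in for face features; each group has its own decision boundary.
    x = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (x[:, 0] + x[:, 1] > 2 * shift).astype(int)
    return x, y

# Group A dominates the training data; group B is barely represented.
xa, ya = make_group(2000, shift=0.0)
xb, yb = make_group(50, shift=1.5)
x_train = np.vstack([xa, xb])
y_train = np.concatenate([ya, yb])

# A crude linear model fit by least squares (a stand-in for a real classifier).
X = np.hstack([x_train, np.ones((len(x_train), 1))])
w, *_ = np.linalg.lstsq(X, y_train * 2.0 - 1.0, rcond=None)

def accuracy(x, y):
    preds = (np.hstack([x, np.ones((len(x), 1))]) @ w > 0).astype(int)
    return (preds == y).mean()

xa_test, ya_test = make_group(1000, shift=0.0)
xb_test, yb_test = make_group(1000, shift=1.5)
print("accuracy on the well-represented group:", round(accuracy(xa_test, ya_test), 3))
print("accuracy on the under-represented group:", round(accuracy(xb_test, yb_test), 3))
```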
  • 127:05 - 127:08
    And although this is an
    important thing to note,
  • 127:08 - 127:09
    this would really be a moot point
  • 127:09 - 127:12
    when evaluating these
    technologies for deployment.
  • 127:12 - 127:15
    Because while a poorly
    functioning system is bad
  • 127:15 - 127:18
    and introduces issues,
  • 127:18 - 127:21
    a perfectly functioning system
    would be much, much worse
  • 127:21 - 127:23
    and would basically signal a world
  • 127:23 - 127:25
    where we could be tracked out of hand
  • 127:25 - 127:28
    at any time that police
    or government wanted.
  • 127:28 - 127:32
    So that's really what I wanted to say.
  • 127:32 - 127:37
    And I wanted to kind of state my support
  • 127:37 - 127:41
    for a ban on any type of
    facial recognition technology
  • 127:41 - 127:44
    for use by the police or the government.
  • 127:44 - 127:47
    And that we should
    disincentivize the research
  • 127:47 - 127:50
    and development of... send a clear signal
  • 127:51 - 127:53
    to disincentivize the
    research and development
  • 127:53 - 127:55
    of these types of systems in the future.
  • 127:55 - 127:59
    Thank you.
  • 127:59 - 128:00
    Thank you, that was perfectly on time,
  • 128:00 - 128:02
    for the first time participating.
  • 128:02 - 128:02
    We really appreciate it.
  • 128:02 - 128:05
    I hope that this is not your last time.
  • 128:05 - 128:07
    We are dealing with a
    lot of different issues.
  • 128:07 - 128:10
    Your voice is definitely
    welcome, thank you so much.
  • 128:12 - 128:14
    Up next I have,
  • 128:14 - 128:19
    I'm gonna name Will Luckman
    followed by Lizette Medina,
  • 128:20 - 128:23
    and then also a Nathan
    Sheard, I apologize again,
  • 128:23 - 128:25
    if I mispronounce your name,
  • 128:25 - 128:27
    feel free to just state your name again.
  • 128:27 - 128:29
    For folks who have already testified,
  • 128:29 - 128:31
    again, since we're over capacity,
  • 128:31 - 128:35
    if you sign out and then watch on YouTube
  • 128:35 - 128:38
    that allows for someone else
    to get in line to also speak.
  • 128:38 - 128:40
    So just letting you know that
  • 128:40 - 128:42
    that's what we're trying to do.
  • 128:42 - 128:44
    So I will now turn it over.
  • 128:44 - 128:46
    I believe Will's ready to go.
  • 128:46 - 128:47
    And you have two minutes Will.
  • 128:48 - 128:50
    Thank you, good afternoon.
  • 128:50 - 128:52
    My name is Will Luckman
    and I serve as an organizer
  • 128:52 - 128:57
    with the Surveillance Technology
    Oversight Project or STOP
  • 128:58 - 129:00
    and STOP advocates and litigates
  • 129:00 - 129:03
    to fight discriminatory surveillance.
  • 129:03 - 129:05
    Thank you councilors for
    holding this hearing today
  • 129:05 - 129:08
    and for proposing this crucial reform.
  • 129:08 - 129:11
    Today, my oral remarks are an
    excerpt of written testimony
  • 129:11 - 129:13
    being entered into the record.
  • 129:13 - 129:17
    Facial recognition is biased,
    broken, and when it works
  • 129:17 - 129:19
    antithetical to a democratic society.
  • 129:19 - 129:22
    And crucially, without this ban,
  • 129:22 - 129:25
    more people of color will be
    wrongly stopped by the police
  • 129:25 - 129:27
    at a moment when the
    dangers of police encounters
  • 129:27 - 129:29
    have never been clearer.
  • 129:29 - 129:31
    The technology that
    drives facial recognition
  • 129:31 - 129:33
    is far more subjective than many realize.
  • 129:34 - 129:36
    Artificial intelligence is the aggregation
  • 129:36 - 129:40
    of countless human decisions
    codified into algorithms.
  • 129:40 - 129:44
    And as a result, human
    bias can affect AI systems,
  • 129:44 - 129:47
    including those that
    supposedly recognize faces
  • 129:47 - 129:48
    in countless ways.
  • 129:48 - 129:51
    For example, if a security camera learns
  • 129:51 - 129:54
    who looks suspicious,
    using pictures of inmates,
  • 129:54 - 129:55
    the photos will teach the AI
  • 129:55 - 129:59
    to replicate the mass incarceration
    of African-American men.
  • 129:59 - 130:02
    In this way, AI can
    learn to be just like us
  • 130:02 - 130:04
    exacerbating structural discrimination
  • 130:04 - 130:06
    against marginalized communities.
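A minimal sketch of the dynamic described above, labels inherited from past enforcement teaching a model to reproduce that pattern, follows. The numbers and the "groups" are entirely hypothetical and chosen only to illustrate the feedback loop, not to model any real population or police dataset.

```python
# Hypothetical sketch: a model trained on labels that encode who was policed
# in the past simply learns to repeat that pattern, not actual behavior.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Two groups whose underlying behavior is identical in this toy example.
group = rng.integers(0, 2, size=n)

# Historical "suspect" labels driven by how heavily each group was policed.
historical_label = (rng.random(n) < np.where(group == 1, 0.30, 0.05)).astype(int)

# A trivial "model": the per-group rate it learns from the historical labels.
learned_rate = {g: historical_label[group == g].mean() for g in (0, 1)}
print("learned 'suspicion' rate by group:", learned_rate)
# The learned rates mirror past enforcement (roughly 0.05 vs 0.30), even though
# the two groups behave identically by construction.
```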
  • 130:06 - 130:07
    As we've heard,
  • 130:07 - 130:10
    even if facial recognition
    worked without errors,
  • 130:10 - 130:12
    even if it had no bias,
  • 130:12 - 130:14
    the technology would
    still remain antithetical
  • 130:14 - 130:16
    to everything the city
    of Boston believes in.
  • 130:16 - 130:19
    Facial recognition
    manufacturers are trying
  • 130:19 - 130:21
    to create a system that
    allows everyone to be tracked
  • 130:21 - 130:23
    at every moment in perpetuity.
  • 130:23 - 130:26
    Go to a protest, the system knows,
  • 130:26 - 130:28
    go to a health facility,
    it keeps a record.
  • 130:28 - 130:32
    Suddenly Bostonians lose
    the freedom of movement
  • 130:32 - 130:33
    that is essential to an open society.
  • 130:33 - 130:35
    If the city fails to act soon,
  • 130:35 - 130:38
    it will only become
    harder to enact reforms.
  • 130:38 - 130:41
    Companies are pressuring local
    state and federal agencies
  • 130:41 - 130:43
    to adopt facial recognition tools.
  • 130:43 - 130:46
    BriefCam, the software
    powering Boston's surveillance
  • 130:46 - 130:49
    camera network, has released a
    new version of their software
  • 130:49 - 130:51
    that would easily integrate
  • 130:51 - 130:53
    invasive facial recognition tools.
  • 130:53 - 130:54
    Although the commissioner
  • 130:54 - 130:57
    says he will reevaluate
    the new version on offer,
  • 130:57 - 130:59
    he also expressed interest in waiting
  • 130:59 - 131:01
    until the accuracy of
    the technology improves
  • 131:01 - 131:05
    or making exceptions for facial
    recognition use right now.
  • 131:05 - 131:09
    To definitively and
    permanently avoid the issues
  • 131:09 - 131:11
    with facial recognition
    that I've raised here,
  • 131:11 - 131:14
    we need a comprehensive ban now.
  • 131:14 - 131:16
    And I'll conclude on a personal note.
  • 131:16 - 131:18
    I live and work in New York City,
  • 131:18 - 131:20
    but I was born in Boston
    and raised in Brooklyn.
  • 131:20 - 131:23
    it pains me to see the
    current wave of protests
  • 131:23 - 131:24
    roiling the area,
  • 131:24 - 131:25
    because it demonstrates that the biased
  • 131:25 - 131:27
    and unequal law enforcement practices
  • 131:27 - 131:30
    I remember from my youth
    have yet to be addressed.
  • 131:30 - 131:32
    I know that the people of the Commonwealth
  • 131:32 - 131:33
    want to see a change,
  • 131:33 - 131:36
    and I believe that the
    council is on their side.
  • 131:36 - 131:39
    In practice, inaccuracies aside,
  • 131:39 - 131:41
    facial recognition systems
    lead to increased stops
  • 131:41 - 131:44
    for people of color; increased stops mean
  • 131:44 - 131:47
    an increase in opportunity
    for police violence and abuse.
  • 131:47 - 131:49
    We must recognize that Black Lives Matter
  • 131:49 - 131:51
    and to do so we must realize
  • 131:51 - 131:54
    that technology doesn't
    operate in a neutral vacuum.
  • 131:54 - 131:56
    Instead it takes on the character
  • 131:56 - 131:58
    of those building and deploying it.
  • 131:58 - 132:00
    I encourage the council to respond
  • 132:00 - 132:02
    to their constituents'
    demands for police reform
  • 132:02 - 132:04
    by immediately banning the use
  • 132:05 - 132:06
    of this harmful technology in Boston.
  • 132:07 - 132:09
    Thank you.
    Thank you very much Will.
  • 132:09 - 132:10
    Lizet Medina.
  • 132:13 - 132:15
    Hi, good afternoon.
  • 132:15 - 132:19
    My name is Lizet Medina, I'm
    here as a private citizen.
  • 132:19 - 132:22
    I actually am pretty new to all of this.
  • 132:22 - 132:24
    This week I've found out
  • 132:24 - 132:26
    from the Student Immigration Movement,
  • 132:26 - 132:28
    kind of what's going on.
  • 132:28 - 132:32
    And I just wanted to voice
    my support for this ban.
  • 132:32 - 132:35
    I think, personally, as an immigrant,
  • 132:35 - 132:39
    and speaking toward the presence
  • 132:39 - 132:41
    of facial recognition
    technology in the school system,
  • 132:41 - 132:44
    I think especially where we
    have so many conversations
  • 132:44 - 132:47
    about the school-to-prison pipeline.
  • 132:47 - 132:49
    This tool could be used to facilitate
  • 132:49 - 132:53
    that kind of work in racially profiling
  • 132:53 - 132:55
    our students.
  • 132:55 - 132:58
    And I think now more than ever,
  • 132:58 - 133:02
    we have at this point in
    time a decision to make
  • 133:02 - 133:06
    in protecting our residents of color,
  • 133:06 - 133:09
    black and brown residents and students
  • 133:09 - 133:14
    as they live in and try to
    navigate the various systems
  • 133:14 - 133:15
    that are already putting pressure on them.
  • 133:15 - 133:20
    And so I think, especially
    in light of recent events,
  • 133:20 - 133:23
    this is really a stand to support
  • 133:23 - 133:25
    residents of color in Boston,
  • 133:25 - 133:30
    over possible pros that
    could be looked at in this.
  • 133:30 - 133:33
    And I also think it's troubling
  • 133:33 - 133:35
    that the commissioner doesn't seem
  • 133:35 - 133:36
    to be clear about what's
    kind of going on with this.
  • 133:36 - 133:41
    In addition, it also isn't clear
  • 133:41 - 133:46
    that there is even a
    oversight of whether or not
  • 133:46 - 133:48
    this could be used already,
    that kind of thing.
  • 133:49 - 133:50
    So I think one, I'm just so thankful
  • 133:51 - 133:53
    that this conversation is being had,
  • 133:53 - 133:54
    that this is open to the public
  • 133:54 - 133:57
    and that I just wanted to voice my concern
  • 133:57 - 134:00
    for this being a tool for racial profiling
  • 134:00 - 134:03
    and my support in the ban.
  • 134:03 - 134:04
    Thank you.
  • 134:04 - 134:05
    Very much.
  • 134:06 - 134:09
    Next is Nathan Sheard.
  • 134:09 - 134:12
    Again I apologize if I
    mispronounced your name.
  • 134:12 - 134:13
    You have two minutes Nathan, go ahead.
  • 134:13 - 134:16
    No, you said it perfectly, thank you.
  • 134:16 - 134:17
    So my name is Nathan Sheard
  • 134:17 - 134:19
    and thank you for allowing me to speak
  • 134:20 - 134:21
    on behalf of the Electronic
    Frontier Foundation
  • 134:22 - 134:23
    and our 30,000 members.
  • 134:23 - 134:25
    The Electronic Frontier Foundation
  • 134:25 - 134:25
    strongly supports legislation
  • 134:25 - 134:27
    that bans government
    agencies and employees
  • 134:27 - 134:29
    from using face surveillance technology.
  • 134:29 - 134:32
    We thank the sponsors of this ordinance
  • 134:32 - 134:34
    for their attention to
    this critical issue.
  • 134:34 - 134:36
    Face surveillance is profoundly
    dangerous for many reasons.
  • 134:36 - 134:38
    First, in tracking our faces,
  • 134:38 - 134:41
    a unique marker that we cannot change,
  • 134:41 - 134:42
    it invades our privacy.
  • 134:42 - 134:45
    Second, government use of the
    technology in public places
  • 134:45 - 134:47
    will chill people from engaging
  • 134:47 - 134:48
    First Amendment protected activity.
  • 134:48 - 134:51
    Research shows and courts
    have long recognized
  • 134:51 - 134:52
    that government surveillance
  • 134:52 - 134:55
    of First Amendment
    activity has a deterrent effect.
  • 134:55 - 134:57
    Third, surveillance
    technologies have an unfair,
  • 134:57 - 135:00
    disparate impact against
    people of color, immigrants
  • 135:00 - 135:02
    and other vulnerable populations.
  • 135:02 - 135:04
    Watch lists are frequently over-inclusive,
  • 135:04 - 135:06
    error riddled and used in conjunction
  • 135:06 - 135:08
    with powerful mathematical algorithms,
  • 135:08 - 135:10
    which often amplify bias.
  • 135:10 - 135:12
    Thus face surveillance is so dangerous
  • 135:12 - 135:16
    that governments must not use it at all.
  • 135:16 - 135:17
    We support the aims of this ordinance
  • 135:17 - 135:20
    and respectfully seek three amendments
  • 135:20 - 135:22
    to show that the bill will appropriately
  • 135:22 - 135:23
    protect the rights of Boston residents.
  • 135:23 - 135:26
    First, the bill has an
    exemption for evidence
  • 135:26 - 135:27
    generated by face surveillance
  • 135:27 - 135:29
    that relates to the investigation of a crime.
  • 135:29 - 135:32
    EFF respectfully asks that it be amended
  • 135:32 - 135:35
    to prohibit the use of face
    surveillance enabled evidence
  • 135:35 - 135:38
    generated by or at the request
  • 135:38 - 135:41
    of the Boston Police Department
    or any Boston official.
  • 135:41 - 135:43
    Second, the private right of action
  • 135:43 - 135:46
    does not provide fee shifting
    for a prevailing plaintiff.
  • 135:46 - 135:47
    Without such fee shifting,
  • 135:47 - 135:50
    the only private enforcers
    will be advocacy organizations
  • 135:50 - 135:52
    and wealthy individuals.
  • 135:52 - 135:55
    EFF respectfully
    asks that the language
  • 135:55 - 135:58
    be included to award
    reasonable attorney fees
  • 135:58 - 136:00
    to a prevailing plaintiff.
  • 136:00 - 136:02
    Third, the ban extends to private sector
  • 136:02 - 136:05
    use of face surveillance conducted
    with a government permit.
  • 136:05 - 136:08
    The better approach as
    Kade has offered earlier,
  • 136:08 - 136:13
    is to allow face surveillance only
  • 136:16 - 136:18
    with informed opt-in consent.
  • 136:18 - 136:20
    Thus EFF respectfully requests
  • 136:20 - 136:22
    that the language be limited
  • 136:22 - 136:24
    to prohibiting private sector permitting
  • 136:24 - 136:26
    for the use of face
    surveillance
  • 136:26 - 136:28
    at the government's request.
  • 136:28 - 136:31
    In closing, EFF once again
    thanks the sponsors.
  • 136:31 - 136:33
    We look forward to seeing an ordinance pass
  • 136:33 - 136:34
    that bans government use of
    face surveillance in Boston
  • 136:34 - 136:37
    and will enthusiastically
    support the ordinance banning
  • 136:37 - 136:39
    face recognition technology in Boston,
  • 136:39 - 136:41
    if it is amended in these ways, thank you.
  • 136:42 - 136:44
    Nathan, I just wanna make sure I took notes
  • 136:44 - 136:45
    on your suggestions.
  • 136:46 - 136:47
    And I wanna make sure you had three,
  • 136:47 - 136:49
    which is the ban on use by the BPD,
  • 136:49 - 136:53
    no exemptions, period.
  • 136:53 - 136:57
    Two, that currently because [indistinct]
  • 136:57 - 136:59
    doesn't allow for fee
    shifting or attorney fees,
  • 136:59 - 137:01
    it really does put it
    on either individuals
  • 137:01 - 137:04
    to take it on their own
    without having any kind of,
  • 137:04 - 137:09
    I guess partner care or
    support that helps it be funded.
  • 137:09 - 137:12
    And then three to extend
    to private actors,
  • 137:12 - 137:14
    essentially down the supply chain
  • 137:14 - 137:16
    where the city of Boston
    can tell private actors
  • 137:16 - 137:18
    that contract with us.
  • 137:18 - 137:21
    Because one of the reasons
    I am always saying this,
  • 137:21 - 137:24
    one of the issues is how far we can go.
  • 137:24 - 137:25
    I mean, we can't regulate Facebook,
  • 137:25 - 137:27
    but we can regulate where money goes.
  • 137:27 - 137:29
    So I'm assuming I got that right.
  • 137:30 - 137:32
    Yeah, so just to reiterate,
  • 137:32 - 137:35
    so the First Amendment
    that we would suggest
  • 137:35 - 137:37
    is that right now,
  • 137:38 - 137:40
    there's an exclusion
    forever for evidence
  • 137:40 - 137:42
    and Kade spoke earlier
  • 137:42 - 137:44
    to the fact that that's in reference to,
  • 137:44 - 137:45
    if they receive something
    from the federal government
  • 137:45 - 137:47
    or one to post or what have you.
  • 137:47 - 137:49
    But to make sure that it's amended
  • 137:49 - 137:51
    to tighten that up a bit and say that,
  • 137:51 - 137:54
    as long as that
    information isn't requested
  • 137:54 - 137:57
    or created at the request
    of the Boston police force,
  • 137:57 - 137:58
    that's the first one.
  • 137:59 - 138:00
    The second one is the fee shifting.
  • 138:01 - 138:02
    So that it's not just the EFF and the ACLU
  • 138:02 - 138:05
    and wealthy individuals
    that have the power
  • 138:05 - 138:07
    to make sure that it's enforced.
  • 138:07 - 138:09
    So folks can find
    attorneys to support them
  • 138:09 - 138:12
    because the attorneys will
    be compensated for that time.
  • 138:12 - 138:14
    And finally, on the third.
  • 138:14 - 138:16
    And I think that this speaks
    to your greater question,
  • 138:16 - 138:20
    right now it prohibits
    the offering of a permit
  • 138:20 - 138:25
    to a private company, agency,
    person, what have you,
  • 138:25 - 138:27
    whereas we would tighten it up to say that
  • 138:27 - 138:29
    that a permit could be provided
  • 138:29 - 138:31
    as long as it wasn't being provided
  • 138:31 - 138:34
    for face surveillance to
    be collected at the behest
  • 138:34 - 138:38
    or at the request of the
    Boston Police Department.
  • 138:38 - 138:39
    So really making sure
  • 138:39 - 138:42
    that we're speaking back to the fact
  • 138:42 - 138:44
    that like private use
    would be better protected
  • 138:44 - 138:46
    by something similar to
  • 138:46 - 138:48
    the Biometric Information
    Privacy Act in Illinois
  • 138:48 - 138:52
    that requires informed,
    opt-in consent
  • 138:52 - 138:53
    from the individual,
  • 138:53 - 138:55
    rather than simply saying
  • 138:55 - 138:57
    that no private actor can do face surveillance.
  • 138:58 - 139:00
    Thank you.
  • 139:00 - 139:01
    That was a great clarification.
  • 139:01 - 139:04
    I'm gonna go now to our next speakers.
  • 139:04 - 139:05
    Thank you very much.
  • 139:05 - 139:10
    I have a Sabrina Barosso,
    Elmeda and Christina Vasquez.
  • 139:13 - 139:17
    So if we have.
  • 139:18 - 139:21
    Hi folks, thank you so much.
  • 139:21 - 139:24
    Just to let folks know,
    Christina is not with us.
  • 139:24 - 139:27
    Some folks had to leave
    because they have to work,
  • 139:27 - 139:29
    but so hi everyone, hi city councilors.
  • 139:29 - 139:31
    My name is Sabrina Barroso.
  • 139:31 - 139:34
    I am the lead organizer for
    the Student Immigrant Movement.
  • 139:34 - 139:37
    The first undocumented
    immigrant youth led organization
  • 139:37 - 139:39
    in the state of Massachusetts.
  • 139:39 - 139:41
    We believe that all
    young people are powerful
  • 139:41 - 139:43
    and that when they are
    supported and have space
  • 139:43 - 139:44
    and investment to grow,
  • 139:44 - 139:46
    they flourish into the
    leaders of today and tomorrow.
  • 139:46 - 139:49
    At SIM, we are invested in protecting
  • 139:49 - 139:51
    and working on behalf of
    the undocumented youth
  • 139:51 - 139:53
    and families of Massachusetts.
  • 139:53 - 139:55
    That's why I'm here with you all today
  • 139:55 - 139:56
    and I would like to make clear
  • 139:56 - 139:59
    that we must prohibit the
    use of facial recognition
  • 139:59 - 140:02
    technology and software
    in the city of Boston.
  • 140:02 - 140:04
    For far too long the
    Boston Police Department
  • 140:04 - 140:07
    has been allowed to have full flexibility
  • 140:07 - 140:10
    and free will over how they police
    and surveil our communities
  • 140:10 - 140:12
    and the consequences are devastating.
  • 140:12 - 140:15
    Our people are being criminalized
    and funneled into prisons,
  • 140:15 - 140:17
    detention and deportation.
  • 140:17 - 140:20
    We need full community
    control over the police.
  • 140:20 - 140:22
    And this is a key move to
    bringing power to our people.
  • 140:22 - 140:24
    Right now, there are no laws
  • 140:24 - 140:27
    that control what the BPD
    can purchase for surveillance
  • 140:27 - 140:29
    or how they use it and
    when I learned this,
  • 140:29 - 140:32
    I was scared because I
    thought about the history
  • 140:32 - 140:34
    between the BPD and families like mine.
  • 140:34 - 140:36
    And I thought about the people that I love
  • 140:36 - 140:39
    and that the people I care about.
  • 140:39 - 140:42
    And I thought about the people
    who are constantly harassed
  • 140:42 - 140:45
    and targeted by law enforcement
    simply for existing.
  • 140:45 - 140:46
    So if you ask me,
  • 140:46 - 140:48
    facial recognition is not to be trusted
  • 140:49 - 140:50
    in the hands of law enforcement,
  • 140:50 - 140:52
    that tracks, monitors, hyper surveils,
  • 140:52 - 140:53
    black and brown communities,
  • 140:53 - 140:56
    and that puts immigrants and activists,
  • 140:56 - 141:00
    youth under constant
    scrutiny and criminalization.
  • 141:00 - 141:03
    Councilors, would you really
    trust a police department
  • 141:03 - 141:05
    that shares information with I.C.E?
  • 141:05 - 141:08
    That shares student
    information with I.C.E as well.
  • 141:08 - 141:10
    That has led to the
    deportation of students
  • 141:10 - 141:14
    that exchanges emails saying
    "happy hunting" with I.C.E.
  • 141:14 - 141:16
    And that has historically targeted
  • 141:16 - 141:19
    black and brown youth to
    invade their personal space,
  • 141:19 - 141:22
    their bodies and their
    privacy with stop and frisk.
  • 141:22 - 141:25
    They are not ashamed to separate youth
  • 141:25 - 141:27
    and their families
    from their communities
  • 141:27 - 141:30
    and then end up pinning,
    injustices and crimes on us.
  • 141:30 - 141:34
    The BPD says that they don't
    use facial surveillance now,
  • 141:34 - 141:36
    but they have access to it.
  • 141:36 - 141:39
    And they have a $414
    million budget to buy it.
  • 141:39 - 141:41
    Right now, our communities need access
  • 141:41 - 141:44
    to funding for healthcare,
    housing and so many other things
  • 141:44 - 141:45
    and the list goes on.
  • 141:45 - 141:48
    For years, residents of
    Boston have been demanding
  • 141:48 - 141:49
    to fund things that truly matter to them
  • 141:49 - 141:51
    and we have to invest in things
  • 141:51 - 141:54
    that are not face surveillance
    that doesn't even work
  • 141:54 - 141:56
    and is irrelevant to
    keeping our community safe.
  • 141:56 - 142:00
    We must invest in intentional
    restorative justice practices,
  • 142:00 - 142:03
    healthcare, mental health and resources
  • 142:03 - 142:05
    to prevent violence and distress.
  • 142:05 - 142:07
    Facial surveillance would
    end privacy as we know it
  • 142:07 - 142:09
    and completely throw off the balance
  • 142:09 - 142:11
    and power between people
    and their government.
  • 142:11 - 142:13
    Listen to youth and
    listening to our community,
  • 142:13 - 142:16
    and let's follow the lead of young people
  • 142:16 - 142:19
    who are addressing the deeper issue
  • 142:19 - 142:21
    of what is to criminalize our people
  • 142:21 - 142:23
    and putting the power in
    the hands of the people,
  • 142:23 - 142:26
    especially when it comes
    to governing surveillance.
  • 142:26 - 142:28
    Thank you all so much.
  • 142:28 - 142:29
    Thank you very much
  • 142:29 - 142:31
    and it sounds like Christina's
    not available, Sabrina.
  • 142:31 - 142:34
    Yeah, she's not available.
  • 142:34 - 142:39
    Okay, Ismleda.
  • 142:42 - 142:47
    I'm Ismleda and a member of
    the Student Immigrant Movement.
  • 142:49 - 142:51
    Today I'm here to testify in support
  • 142:51 - 142:53
    of banning facial recognition
    technology in Boston.
  • 142:53 - 142:56
    This will establish a
    ban on government use
  • 142:56 - 142:58
    of face surveillance
    in the city of Boston.
  • 142:58 - 143:02
    Boston must pass this
    ordinance to join Springfield,
  • 143:02 - 143:05
    Somerville, Cambridge,
    Brookline and Northampton
  • 143:05 - 143:07
    in protecting racial justice
    and freedom of speech.
  • 143:07 - 143:11
    The federal study
    published in December 2019,
  • 143:11 - 143:13
    found that face recognition algorithms
  • 143:13 - 143:15
    are much more likely to fail
  • 143:15 - 143:18
    when attempting to identify
    the faces of people of color,
  • 143:18 - 143:20
    children, the elderly and women.
  • 143:20 - 143:23
    That means the technology
    only reliably works
  • 143:23 - 143:27
    on middle aged white men.
  • 143:27 - 143:30
    a very small fraction
    of Boston's residents.
  • 143:31 - 143:33
    As a citizen of Boston,
  • 143:33 - 143:35
    it is clear to me that
    this type of technology
  • 143:35 - 143:37
    is not okay to use in our communities.
  • 143:37 - 143:39
    The fact that this technology
  • 143:39 - 143:43
    is used to control specifically
    in communities of color,
  • 143:43 - 143:46
    makes my family and me feel unsafe,
  • 143:46 - 143:47
    as well as the other families
  • 143:47 - 143:49
    that live within these communities.
  • 143:49 - 143:52
    Black and brown kids
    are put into databases
  • 143:52 - 143:55
    that are used against them
    without redress or reason.
  • 143:55 - 143:58
    This interrupts our education
    and limits our future.
  • 143:58 - 144:00
    Kids shouldn't be criminalized
    or feeling unsafe.
  • 144:00 - 144:03
    Black and brown kids should
    be able to walk on the street
  • 144:03 - 144:07
    without the fear of knowing
    what's going to happen.
  • 144:07 - 144:10
    And it shows how this surveillance
  • 144:10 - 144:12
    is used to target black
    and brown communities.
  • 144:12 - 144:15
    The racial profiling in technology
  • 144:15 - 144:17
    is one of the many forms
    of systematic racism.
  • 144:17 - 144:21
    Immigrant youth in this
    city do not trust the police
  • 144:21 - 144:23
    because we know that they
    are very close with I.C.E.
  • 144:23 - 144:27
    We know that they sent
    student information to I.C.E,
  • 144:27 - 144:30
    and this has led to the
    deportation of young people
  • 144:30 - 144:33
    and pains our communities.
  • 144:33 - 144:35
    [indistinct]
  • 144:40 - 144:45
    BPD, cameras with surveillance
    and criminalization of us.
  • 144:45 - 144:47
    It's not right.
  • 144:47 - 144:48
    Everyone on the council at some point
  • 144:48 - 144:50
    has talked about supporting young people
  • 144:52 - 144:53
    and young people being our future leaders.
  • 144:53 - 144:54
    We are leaders right now,
  • 144:54 - 144:58
    we are telling you to do
    what everyone knows is right.
  • 144:58 - 145:00
    By preventing the use of facial
    recognition surveillance,
  • 145:00 - 145:03
    we can do more to protect
    immigrant families in Boston.
  • 145:03 - 145:05
    Stop selling youth and families out.
  • 145:05 - 145:07
    Let's give power to people
  • 145:07 - 145:09
    and take this first step in
    addressing the problem we have
  • 145:09 - 145:10
    with law enforcement surveillance
  • 145:10 - 145:13
    and criminalization of people
    by surveillance systems.
  • 145:13 - 145:14
    Thank you very much
  • 145:15 - 145:17
    Next up, the next group
    of folks we have are,
  • 145:17 - 145:22
    Sherlandy Pardieu, Eli Harmon,
    Natalie Diaz and Jose Gomez.
  • 145:28 - 145:30
    So is Sherlandy available?
  • 145:32 - 145:36
    I see that person in
    the waiting room, sorry.
  • 145:47 - 145:50
    Well, we'll go ahead and go to Eli.
  • 145:53 - 145:55
    I think I see right now, ready to go
  • 145:55 - 145:59
    and then we'll come back to Sherlandy,
  • 146:00 - 146:02
    Eli you have two minutes.
  • 146:02 - 146:04
    Oh, thank you so much.
  • 146:04 - 146:05
    Oh, who's speaking?
  • 146:06 - 146:07
    oh, sorry, this is Eli.
  • 146:07 - 146:08
    Okay, very well.
  • 146:08 - 146:09
    Thank you so much.
  • 146:09 - 146:12
    My name is Eli Harmon,
    I live in Mission Hill.
  • 146:12 - 146:15
    I've been doing work with the
    Student Immigrant Movement.
  • 146:15 - 146:18
    I'm testifying on behalf of them today,
  • 146:18 - 146:20
    in support of the ordinance.
  • 146:20 - 146:21
    The issue of police surveillance
  • 146:21 - 146:23
    is enormously important to people
  • 146:23 - 146:25
    all across the city of Boston
  • 146:25 - 146:28
    though it's of particular
    concern to communities of color,
  • 146:28 - 146:30
    which it overwhelmingly targets.
  • 146:30 - 146:31
    A study done by an MIT researcher
  • 146:31 - 146:33
    showed this type of
    surveillance technology
  • 146:33 - 146:35
    would be far less accurate
    for people of color.
  • 146:35 - 146:40
    And in fact, it was inaccurate
    for 35% of black women
  • 146:40 - 146:43
    and was most accurate for white adult men.
  • 146:43 - 146:46
    Law enforcement today, already of course,
  • 146:46 - 146:49
    overwhelmingly targets
    black and brown communities.
  • 146:49 - 146:50
    And this type of technology
  • 146:50 - 146:53
    is likely to only make
    this more severe.
  • 146:53 - 146:55
    Additionally this type of technology
  • 146:55 - 146:57
    is a complete violation
    of people's privacy.
  • 146:57 - 146:59
    There are no regulations with regard
  • 146:59 - 147:00
    to the extent to which law enforcement
  • 147:00 - 147:02
    can use this technology.
  • 147:02 - 147:04
    And it has been used as an abuse of power
  • 147:04 - 147:06
    by government officials in
    order to track down immigrants,
  • 147:06 - 147:08
    people of color and activists.
  • 147:08 - 147:10
    Finally, lots of money goes into funding
  • 147:10 - 147:11
    this surveillance technology,
  • 147:11 - 147:12
    and for the reasons previously stated
  • 147:12 - 147:16
    this technology in no way
    keeps our communities safe
  • 147:16 - 147:17
    and the money that currently funds it,
  • 147:17 - 147:19
    could be put into much better places,
  • 147:19 - 147:21
    such as creating
    affordable housing for all,
  • 147:21 - 147:24
    fully funding the Boston public schools,
  • 147:24 - 147:27
    creating equitable healthcare
  • 147:27 - 147:30
    and other places which
    would actually be effective
  • 147:30 - 147:31
    in bringing down crime rates.
  • 147:32 - 147:35
    Many cities that neighbor
    Boston, including Somerville,
  • 147:35 - 147:37
    Brookline, Cambridge and Springfield,
  • 147:37 - 147:39
    have chosen to ban facial surveillance
  • 147:39 - 147:41
    because they understand
    that it is dangerous
  • 147:41 - 147:43
    and harmful to communities of color.
  • 147:43 - 147:44
    It is a violation of people's privacy
  • 147:44 - 147:48
    and that money currently being
    used to fund that technology
  • 147:48 - 147:50
    would be better spent in places
  • 147:50 - 147:52
    that actually improve lives in Boston.
  • 147:52 - 147:54
    For these reasons,
  • 147:54 - 147:55
    I ask the council to pass a prohibition
  • 147:55 - 147:58
    of the use of facial
    recognition surveillance
  • 147:58 - 147:59
    and give the community control
  • 147:59 - 148:01
    over surveillance technology and usage.
  • 148:01 - 148:02
    Thank you so much.
  • 148:02 - 148:04
    Thank you very much.
  • 148:04 - 148:06
    I do see Sherlandy
  • 148:09 - 148:10
    I'm here.
  • 148:10 - 148:13
    Okay, you have two minutes Sherlandy.
  • 148:20 - 148:23
    I am writing on behalf of SIM
    in support of the ordinance
  • 148:23 - 148:25
    banning facial recognition
    technology in Boston.
  • 148:25 - 148:28
    I urge my city of Boston
    to pass this ordinance
  • 148:28 - 148:30
    to join other cities in Massachusetts,
  • 148:30 - 148:33
    like Springfield, Somerville,
    Cambridge, Brookline
  • 148:33 - 148:36
    and Northampton in putting
    the community power
  • 148:36 - 148:38
    before the police.
  • 148:38 - 148:40
    The city of Boston,
    the residents, families
  • 148:40 - 148:43
    and youth from Boston,
    deserve transparency,
  • 148:43 - 148:44
    respect and control.
  • 148:44 - 148:46
    A ban on facial surveillance technology
  • 148:46 - 148:48
    is important in the city of Boston
  • 148:48 - 148:50
    because this technology is dangerous
  • 148:50 - 148:54
    and it leads to serious
    mistrust in the community.
  • 148:54 - 148:58
    Face surveillance is
    proven to be less accurate
  • 148:58 - 149:01
    for people of darker skin.
  • 149:01 - 149:04
    Black people already face extreme levels
  • 149:04 - 149:06
    of police violence and harassment.
  • 149:06 - 149:09
    The racial bias in face surveillance
  • 149:09 - 149:11
    will put lives in danger.
  • 149:11 - 149:14
    But even if accurate,
  • 149:14 - 149:16
    we should still be doing all that we can
  • 149:16 - 149:20
    to avoid a future where every
    time we go to the doctor,
  • 149:20 - 149:22
    a place of worship or protest,
  • 149:22 - 149:24
    the government is scanning our face
  • 149:24 - 149:26
    and recording our movements.
  • 149:26 - 149:28
    We already know that law enforcement
  • 149:28 - 149:31
    is constantly sharing
    and misusing information.
  • 149:31 - 149:33
    What would happen to our people
  • 149:33 - 149:35
    when we are already under
    constant surveillance
  • 149:35 - 149:37
    and criminalization?
  • 149:37 - 149:39
    Using face surveillance,
  • 149:39 - 149:41
    the government could begin to identify
  • 149:41 - 149:44
    and keep a list of every
    person at a protest.
  • 149:44 - 149:46
    People who are driving
  • 149:46 - 149:49
    or who are walking the
    street every single day
  • 149:49 - 149:53
    across the city on an
    ongoing automatic basis,
  • 149:53 - 149:55
    it would keep the same
    harmful surveillance going,
  • 149:55 - 149:58
    like what happened with the
    use of [indistinct] database.
  • 149:58 - 150:02
    It would criminalize anyone
    who makes a move in our city.
  • 150:06 - 150:10
    If we want this city to be safe,
  • 150:10 - 150:12
    we need to put our
    money where our mouth is
  • 150:12 - 150:16
    and invest in practices
    such as restorative justice
  • 150:16 - 150:18
    to help restore our community.
  • 150:18 - 150:20
    Facial surveillance is not the answer.
  • 150:20 - 150:24
    If anything, it is distracting
  • 150:24 - 150:27
    us from addressing our real concerns
  • 150:27 - 150:29
    about safety in our communities.
  • 150:29 - 150:32
    I urge you to prohibit the
    use of facial surveillance
  • 150:32 - 150:34
    in the city of Boston,
  • 150:34 - 150:38
    because we cannot allow law
    enforcement to target black
  • 150:38 - 150:42
    and brown immigrants, activists
    and community members.
  • 150:42 - 150:44
    Thank you so much for your time.
  • 150:44 - 150:45
    Thank you very much.
  • 150:45 - 150:47
    I want to say thank
    you to all of the youth
  • 150:47 - 150:51
    who have testified and
    will testify at SIM,
  • 150:51 - 150:54
    all those folks showing up, you are...
  • 150:54 - 150:55
    I just wanna say thank you so much
  • 150:55 - 150:58
    on behalf of the city
    council, you're our future
  • 150:58 - 151:00
    and you're doing an amazing job.
  • 151:00 - 151:03
    I have a Natalie Diaz and then Jose Gomez.
  • 151:03 - 151:07
    I see the Emily Reif
    who had signed up before
  • 151:07 - 151:09
    might also be available.
  • 151:09 - 151:10
    So those next three names,
  • 151:10 - 151:12
    I think Natalie may no longer be with us.
  • 151:15 - 151:18
    Okay, is Jose available?
  • 151:18 - 151:20
    Yes.
    CHAIR: Two minutes.
  • 151:21 - 151:22
    Hi everyone.
  • 151:22 - 151:27
    I'm here on behalf of
    SIM to testify in favor
  • 151:27 - 151:30
    of the ban on facial
    recognition software in Boston.
  • 151:30 - 151:34
    I am an MIT alumnus,
  • 151:34 - 151:36
    working as a software engineer in Boston
  • 151:36 - 151:39
    and I'm a DACA recipient as well.
  • 151:39 - 151:41
    And I wanted to tell you,
  • 151:41 - 151:43
    as a software engineer working in AI
  • 151:43 - 151:45
    and machine learning enabled robots,
  • 151:45 - 151:47
    that the software used
    for facial recognition
  • 151:47 - 151:48
    is far from perfect.
  • 151:49 - 151:52
    In fact, as Joy found
    and others have noted,
  • 151:52 - 151:54
    one in three black women is likely
  • 151:54 - 151:56
    to be misclassified by technology.
  • 151:56 - 151:58
    And our own federal government has found
  • 151:58 - 152:00
    that the face recognition algorithms
  • 152:00 - 152:01
    were more likely to fail
  • 152:01 - 152:04
    when attempting to identify
    faces of people of color,
  • 152:04 - 152:06
    children, elderly and women.
  • 152:06 - 152:10
    This software after all
    is written by people.
  • 152:10 - 152:13
    And it is inherently filled
    with mistakes and bias,
  • 152:13 - 152:17
    despite the detailed
    quality assurance processes
  • 152:17 - 152:19
    that production level software undergoes.
  • 152:19 - 152:22
    This is an accepted fact
    in the software industry,
  • 152:22 - 152:24
    no matter how good your software is
  • 152:24 - 152:25
    or the testing process is,
  • 152:25 - 152:27
    bugs in the software will always exist.
  • 152:27 - 152:31
    Consequently, the use of
    facial recognition technology
  • 152:31 - 152:34
    by government entities
    especially the police departments
  • 152:34 - 152:36
    must be banned, as the consequences of errors
  • 152:36 - 152:38
    in the technology are life and death.
  • 152:38 - 152:42
    A false identification by
    facial recognition software
  • 152:42 - 152:44
    could lead to an unwarranted confrontation
  • 152:44 - 152:46
    with the police department.
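The point that even small error rates are life-and-death at scale can be made concrete with rough, back-of-the-envelope arithmetic. Every number below is hypothetical; the only claim is that a small per-comparison error rate multiplied by city-scale volume yields many wrong matches.

```python
# Back-of-the-envelope base-rate arithmetic with purely hypothetical numbers.
false_match_rate = 0.001          # assume 0.1% false positives per face scanned
faces_scanned_per_day = 500_000   # assume citywide camera volume
true_watchlist_hits_per_day = 5   # assume very few genuine matches exist

false_matches = false_match_rate * faces_scanned_per_day
total_flags = false_matches + true_watchlist_hits_per_day
print(f"expected false matches per day: {false_matches:.0f}")
print(f"share of flags pointing at the wrong person: {false_matches / total_flags:.1%}")
# Under these assumptions, roughly 500 people a day are wrongly flagged,
# and about 99% of all flags are mistakes.
```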
  • 152:46 - 152:48
    The seemingly small error in
    facial recognition software
  • 152:48 - 152:50
    may not mean much to you,
  • 152:50 - 152:54
    but as an undocumented
    immigrant and a person of color,
  • 152:54 - 152:56
    any confrontation with the police
  • 152:56 - 152:58
    could mean deportation or even my life.
  • 152:59 - 153:02
    The police have a record
    of systemic racial bias
  • 153:02 - 153:04
    leading into the deaths
    of thousands of people.
  • 153:04 - 153:07
    Government entities, especially police forces,
  • 153:07 - 153:10
    do not need any more tools
    to further their racial bias
  • 153:10 - 153:13
    and consequently the
    systematic murdering
  • 153:13 - 153:14
    of the very people
    they're meant to protect.
  • 153:14 - 153:15
    I encourage you to press pause
  • 153:15 - 153:18
    on these face surveillance
    by government entities
  • 153:18 - 153:22
    in the city of Boston, by
    supporting this crucial ban.
  • 153:24 - 153:29
    Thank you very much, after
    Jose I think we had...
  • 153:33 - 153:38
    After Jose we had Emily,
    Emily Reif has rejoined us.
  • 153:38 - 153:42
    She was part of the original
    first set we called, Emily.
  • 153:45 - 153:47
    EMILY: Hello, sorry.
  • 153:47 - 153:49
    Can you hear me?
    Yep, two minutes.
  • 153:50 - 153:53
    Sorry about that, I was
    having internet issues.
  • 153:53 - 153:55
    So my name is Emily Reif
  • 153:55 - 153:57
    and I work in machine
    learning research at Google.
  • 153:57 - 153:58
    Although of course my views are my own
  • 153:58 - 154:00
    and I'm here as a citizen.
  • 154:00 - 154:02
    Yeah, I strongly support the ban
  • 154:02 - 154:04
    on facial recognition technology
  • 154:04 - 154:06
    and many people have said similar things
  • 154:06 - 154:09
    to what I'm about to say
    and much better words.
  • 154:09 - 154:13
    I just wanted to reiterate on...
  • 154:13 - 154:15
    Even when working correctly,
  • 154:15 - 154:17
    there are so many major privacy issues
  • 154:17 - 154:20
    with these technologies and by definition,
  • 154:20 - 154:22
    they're designed to track
    us wherever and whenever
  • 154:22 - 154:26
    we're in public spaces and
    to aggregate this information
  • 154:26 - 154:28
    and having those kinds
    of extreme surveillance
  • 154:28 - 154:31
    is not only kind of horrifying
    at a personal level,
  • 154:31 - 154:33
    but it will also fundamentally change
  • 154:33 - 154:34
    the way that we go about our daily lives
  • 154:34 - 154:37
    and operate as a democratic society.
  • 154:37 - 154:40
    And others have already cited
    some of the research on this.
  • 154:40 - 154:43
    It's a well-documented
    phenomenon, not a good one.
  • 154:43 - 154:47
    And so that's all about
    facial recognition technology
  • 154:47 - 154:48
    when it's working perfectly.
  • 154:48 - 154:51
    But again, as people have noted
  • 154:51 - 154:53
    that's so far from the truth.
  • 154:53 - 154:56
    Yeah, there's a ton of
    research on this by the ACLU,
  • 154:56 - 154:59
    and Joy who spoke earlier
  • 154:59 - 155:01
    and I've seen this in my own research.
  • 155:01 - 155:04
    Our team's research is
    related to these areas.
  • 155:04 - 155:08
    That these models are just
    totally not reliable for data
  • 155:08 - 155:10
    that is different than
    what they're trained on.
  • 155:10 - 155:13
    And these inaccuracies
    don't make it any less dangerous.
  • 155:13 - 155:16
    I mean, one might say
    like, oh, so it makes it,
  • 155:16 - 155:18
    potentially less dangerous,
    but it makes it much,
  • 155:18 - 155:21
    much more dangerous when the inaccuracies
  • 155:21 - 155:22
    just reinforce the biases
  • 155:22 - 155:24
    and disparities that
    have always been there
  • 155:24 - 155:26
    in the way that different
    groups are surveilled
  • 155:26 - 155:29
    and policed and incarcerated.
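The reliability problem raised here, models failing on data unlike what they were trained on, can be sketched with a tiny synthetic experiment. Nothing below reflects any real research system; it only shows how accuracy can collapse under distribution shift.

```python
# Synthetic sketch of distribution shift: a model that looks accurate on data
# resembling its training set can fail badly on data captured differently.
import numpy as np

rng = np.random.default_rng(2)

def sample(n, brightness):
    # Toy stand-in for images captured under different conditions.
    x = rng.normal(loc=brightness, scale=1.0, size=(n, 3))
    y = (x.sum(axis=1) > 3 * brightness).astype(int)
    return x, y

x_train, y_train = sample(5000, brightness=0.0)
X = np.hstack([x_train, np.ones((len(x_train), 1))])
w, *_ = np.linalg.lstsq(X, y_train * 2.0 - 1.0, rcond=None)

def acc(x, y):
    preds = (np.hstack([x, np.ones((len(x), 1))]) @ w > 0).astype(int)
    return (preds == y).mean()

x_iid, y_iid = sample(2000, brightness=0.0)      # looks like the training data
x_shift, y_shift = sample(2000, brightness=2.0)  # different capture conditions
print("accuracy on data like the training set:", round(acc(x_iid, y_iid), 3))
print("accuracy under distribution shift:", round(acc(x_shift, y_shift), 3))
```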
  • 155:29 - 155:33
    And we saw with
    Breonna Taylor's death
  • 155:33 - 155:35
    that thinking that you're attacking
  • 155:35 - 155:37
    or that you're entering the right house,
  • 155:37 - 155:39
    and if you are wrong about that,
  • 155:39 - 155:43
    there are just horrible,
    unspeakable consequences.
  • 155:43 - 155:48
    And yeah, we can't possibly risk that
  • 155:48 - 155:50
    and that's not a road we should remotely
  • 155:50 - 155:52
    think about going down, thank you.
  • 155:54 - 155:56
    That's perfectly on time.
  • 155:56 - 155:59
    I'm just gonna go down the list
  • 155:59 - 156:04
    and also call on Christian
    De Leon, Clara Ruis,
  • 156:11 - 156:13
    Diana Serrano and Oliver De Leon.
  • 156:20 - 156:23
    Are those folks available?
  • 156:23 - 156:23
    I see Christian.
  • 156:28 - 156:30
    Christian you may start,
    you have two minutes.
  • 156:31 - 156:34
    [indistinct]
  • 156:34 - 156:36
    You have to turn off one
    computer and on your phone
  • 156:36 - 156:39
    or something, you're
    gonna have to turn off
  • 156:39 - 156:40
    one of the devices.
  • 156:40 - 156:41
    CHRISTIAN: Can you hear me fine now?
  • 156:41 - 156:42
    Much better, thank you.
  • 156:45 - 156:48
    CHRISTIAN: My name is Christian De Leon,
  • 156:48 - 156:50
    I'm here today as a member
  • 156:50 - 156:51
    of the Student Immigrant Movement
  • 156:52 - 156:53
    in support of banning facial
    recognition in Boston.
  • 156:53 - 156:56
    Face surveillance is not only
    a huge invasion of privacy,
  • 156:56 - 156:59
    but in the wrong hands can be used
  • 156:59 - 157:01
    to even further oppress minorities.
  • 157:01 - 157:03
    It can be used to track
    immigrants, protestors,
  • 157:03 - 157:07
    and it is inaccurate when tracking
    people of darker complexion.
  • 157:07 - 157:10
    As we know, this can be very problematic
  • 157:10 - 157:12
    since these tools target these communities
  • 157:12 - 157:14
    through racial profiling already.
  • 157:14 - 157:17
    Having surveillance tools
    like facial recognition
  • 157:17 - 157:20
    will contribute to an even
    greater systemic oppression.
  • 157:20 - 157:23
    It also has no business
    being in our schools.
  • 157:23 - 157:26
    To my knowledge, some
    schools are already investing
  • 157:26 - 157:28
    in face recognition software
  • 157:28 - 157:31
    and are investing millions
    in this new technology,
  • 157:31 - 157:33
    and for what exactly?
  • 157:33 - 157:35
    To track their students' every move?
  • 157:35 - 157:39
    I believe schools should
    spend that time and money
  • 157:39 - 157:42
    on bettering their school
    and education systems,
  • 157:42 - 157:45
    rather than spending it
    on software and tools
  • 157:45 - 157:48
    that will in no way improve
    students' ability to be educated.
  • 157:48 - 157:50
    We are already living in a time
  • 157:50 - 157:53
    where technology can do scary things
  • 157:53 - 157:55
    and the last thing we need
  • 157:55 - 157:58
    are people watching our every
    move on a day-to-day basis.
  • 157:58 - 157:59
    During this pandemic,
  • 157:59 - 158:01
    we have grown accustomed to wearing masks
  • 158:01 - 158:03
    and covering our faces in public,
  • 158:03 - 158:05
    well, if we now have facial recognition
  • 158:05 - 158:07
    on every street corner,
  • 158:07 - 158:10
    then this temporary thing
    the world has adapted to
  • 158:10 - 158:12
    may just become permanent.
  • 158:12 - 158:14
    Not only to protect our own privacy,
  • 158:14 - 158:17
    but to not be falsely accused of crimes
  • 158:17 - 158:19
    due to faulty recognition software.
  • 158:19 - 158:23
    2020 has had a great display
    of racism and brutality
  • 158:23 - 158:24
    against people of color,
  • 158:24 - 158:26
    particularly the black community,
  • 158:26 - 158:29
    and I fear that systems such
    as facial recognition software
  • 158:29 - 158:30
    in the hands of law enforcement,
  • 158:30 - 158:33
    can just be another tool
    they use against us.
  • 158:33 - 158:36
    Thank you for listening to my concerns.
  • 158:36 - 158:37
    Thank you very much.
  • 158:37 - 158:38
    Clara Ruiz.
  • 158:43 - 158:45
    CLARA: Hello.
    Hello.
  • 158:45 - 158:47
    You have two minutes Clara.
  • 158:49 - 158:50
    CLARA: Thank you so much.
  • 158:50 - 158:54
    My name is Clara Ruiz and
    I'm writing in support
  • 158:54 - 158:57
    of the ordinance banning
    facial recognition technology
  • 158:57 - 159:00
    in Boston presented by
    councilor Wu and Arroyo.
  • 159:00 - 159:03
    This ordinance establishes
    a ban on government use
  • 159:03 - 159:05
    of face surveillance
    in the city of Boston.
  • 159:05 - 159:08
    Boston must pass this
    ordinance to join Springfield,
  • 159:08 - 159:10
    Somerville, Cambridge,
    Brookline, Northampton
  • 159:10 - 159:13
    in protecting racial justice,
    privacy, and freedom of speech.
  • 159:13 - 159:17
    This matters to me because
    I'm a member of the BIPOC,
  • 159:20 - 159:22
    Black, Indigenous and
    People of Color community.
  • 159:22 - 159:23
    And this is allowing racial biases
  • 159:23 - 159:26
    against the community to increase.
  • 159:26 - 159:28
    This technology does not
    protect constitutional rights
  • 159:28 - 159:32
    and values, including
    privacy in an immediate
  • 159:32 - 159:35
    [indistinct]
  • 159:35 - 159:38
    free speech and
    association, equal protection
  • 159:38 - 159:41
    and government accountability
    and transparency.
  • 159:41 - 159:43
    And the government must be
    aware of the technical limits
  • 159:43 - 159:46
    of even the most
    promising new technologies
  • 159:46 - 159:48
    and the risks of relying
    on computer systems
  • 159:48 - 159:50
    that can be prone to error.
  • 159:50 - 159:51
    Facial recognition is creeping
  • 159:51 - 159:53
    into more and more law
    enforcement agencies,
  • 159:53 - 159:55
    with little notice or oversight.
  • 159:55 - 159:58
    And there are practically
    no laws regulating
  • 159:58 - 160:00
    this invasive surveillance technology.
  • 160:00 - 160:03
    Meanwhile, ongoing technical limitations
  • 160:03 - 160:05
    to the accuracy of facial recognition
  • 160:05 - 160:07
    create serious problems
    like misidentification
  • 160:07 - 160:10
    and public safety risk.
  • 160:10 - 160:12
    I refuse to sit back and
    put my community at more risk
  • 160:12 - 160:16
    than it already is.
  • 160:16 - 160:17
    These surveillance technologies
  • 160:17 - 160:21
    are intentionally set up to
    the disadvantage of brown
  • 160:22 - 160:24
    people and communities of color.
  • 160:24 - 160:27
    It has a better success rate
    when it comes to white males,
  • 160:27 - 160:29
    meaning this technology is unreliable
  • 160:29 - 160:31
    and it will do more harm,
  • 160:31 - 160:33
    if we don't ban the surveillance tool.
  • 160:33 - 160:37
    This technology will allow
    for misidentification
  • 160:37 - 160:40
    to purposely happen against
    black and brown people.
  • 160:40 - 160:43
    A ban on face surveillance technology
  • 160:43 - 160:45
    is critically important
    in the city of Boston
  • 160:45 - 160:47
    because we need to keep
    the community safe,
  • 160:47 - 160:49
    because all life matters.
  • 160:49 - 160:51
    A federal government study
  • 160:51 - 160:53
    published in December 2019,
  • 160:53 - 160:55
    found that face recognition algorithms
  • 160:55 - 160:57
    were most likely to fail
  • 160:57 - 161:00
    when attempting to identify
    the faces of people of color,
  • 161:00 - 161:03
    children, the elderly and women.
  • 161:03 - 161:07
    That means the test only reliably works
  • 161:07 - 161:09
    on middle-aged white men,
  • 161:09 - 161:11
    really a small fraction
    of Boston residents.
  • 161:11 - 161:15
    I encourage you to press pause
  • 161:15 - 161:18
    on the use of face surveillance
    by government entities
  • 161:20 - 161:21
    in the city of Boston,
  • 161:21 - 161:24
    by supporting and
    passing this crucial ban.
  • 161:24 - 161:27
    We cannot allow Boston
    to adopt authoritarian,
  • 161:27 - 161:30
    unregulated, biased surveillance technology.
  • 161:30 - 161:32
    Thank you for the opportunity to testify.
  • 161:32 - 161:34
    And by the way, Diana Serrano,
  • 161:34 - 161:36
    is also gonna be testifying through this.
  • 161:39 - 161:41
    Diana you have two minutes.
  • 161:41 - 161:43
    DIANA: Okay, hello, my
    name is Diana Serrano,
  • 161:43 - 161:45
    good evening.
  • 161:45 - 161:48
    I'm writing on behalf of SIM
    in support of the ordinance
  • 161:48 - 161:50
    banning facial recognition
    technology in Boston.
  • 161:50 - 161:53
    I urge my city of Boston
    to pass this ordinance
  • 161:53 - 161:56
    to join Springfield,
    Somerville, Cambridge, Brookline
  • 161:56 - 161:58
    and Northampton in
    protecting racial justice,
  • 161:58 - 162:01
    privacy and freedom of speech.
  • 162:01 - 162:03
    It is our duty as a free and just city,
  • 162:03 - 162:06
    to ban technology like facial recognition.
  • 162:06 - 162:08
    As a victim of social media hacking,
  • 162:08 - 162:12
    I've seen how technology could
    be used as a tool to defame
  • 162:12 - 162:14
    and criminalize with pictures
  • 162:14 - 162:17
    that feel like proof to the viewers.
  • 162:17 - 162:19
    As a teacher and civilian,
  • 162:19 - 162:22
    I refuse to allow those in my community
  • 162:22 - 162:25
    to be exposed to a highly problematic
  • 162:25 - 162:28
    technology like facial recognition.
  • 162:28 - 162:30
    Not only is privacy important
  • 162:30 - 162:32
    and should be valued
    by our law enforcement,
  • 162:32 - 162:35
    face surveillance is
    proven to be less accurate
  • 162:35 - 162:37
    for people with darker skin.
  • 162:37 - 162:40
    With the high levels of
    racial discrimination
  • 162:40 - 162:42
    forced on our black and brown communities
  • 162:42 - 162:43
    by law enforcement,
  • 162:43 - 162:45
    the racial bias in face surveillance
  • 162:45 - 162:49
    will put lives in danger even
    further than they already are.
  • 162:49 - 162:51
    A ban on face surveillance technology
  • 162:51 - 162:53
    is critically important
    in the city of Boston
  • 162:53 - 162:56
    because that technology
    really is only reliable
  • 162:56 - 162:59
    for a very small fraction
    of the population.
  • 162:59 - 163:02
    It's highly important
    to me to live in a city
  • 163:02 - 163:04
    where it is critical for
    leaders to keep our city
  • 163:04 - 163:06
    and its people safe.
  • 163:06 - 163:09
    Although it may seem as
    though facial recognition
  • 163:09 - 163:12
    could maybe keep us safer, it
    has proven to do the opposite.
  • 163:12 - 163:16
    I urge you to press pause on
    the use of facial surveillance
  • 163:16 - 163:19
    by government entities
    in the city of Boston
  • 163:19 - 163:21
    by supporting and passing the crucial ban.
  • 163:21 - 163:23
    We cannot allow Boston to adopt
  • 163:23 - 163:27
    authoritarian, unregulated,
    biased surveillance technology.
  • 163:27 - 163:30
    Thank you for your
    attention, Diana Serrano.
  • 163:30 - 163:32
    Thank you very much.
  • 163:32 - 163:37
    Oliver De Leon, you have two minutes.
  • 163:38 - 163:40
    OLIVER: Hi, my name is Oliver
  • 163:40 - 163:43
    and I live in Jamaica Plain.
  • 163:43 - 163:46
    And I was undocumented when I attended
  • 163:46 - 163:48
    the Boston public schools
    in the early nineties.
  • 163:48 - 163:53
    So yes, it's a while back, so
    I became a citizen in 2016.
  • 163:54 - 163:57
    I look back on my days in high school
  • 163:57 - 163:59
    while contemplating this new technology,
  • 163:59 - 164:04
    facial recognition and I
    can sincerely tell you that
  • 164:05 - 164:08
    it terrified me the
    thought of having somebody
  • 164:08 - 164:11
    access this technology without
    my consent or knowledge.
  • 164:11 - 164:14
    I don't know if this would have allowed me
  • 164:14 - 164:17
    to finish school at that time.
  • 164:17 - 164:19
    And it would have been hard
  • 164:19 - 164:22
    to even finish possibly higher education.
  • 164:22 - 164:24
    Without that education,
  • 164:24 - 164:28
    I wouldn't have been able
    to secure job opportunities
  • 164:28 - 164:31
    like I have now with a paying job
  • 164:31 - 164:33
    that helps me provide for my family.
  • 164:33 - 164:35
    So we started thinking about that.
  • 164:35 - 164:38
    We start seeing how
    this facial recognition
  • 164:38 - 164:41
    can actually have a negative impact
  • 164:41 - 164:44
    in the education of our
    youth and our future.
  • 164:44 - 164:48
    People already talk about how the U.S.
  • 164:48 - 164:50
    is already riddled with racial bias,
  • 164:50 - 164:54
    and this software is just
    giving people already empowered
  • 164:54 - 164:56
    a tool to discriminate even more.
  • 164:56 - 165:00
    For example, I.C.E has already
    shown up in the protests
  • 165:00 - 165:02
    and detained a few people.
  • 165:02 - 165:04
    Now we have to ask ourselves
  • 165:04 - 165:08
    how the heck do they
    pick out these few people
  • 165:08 - 165:11
    in the thousands of protesters?
  • 165:11 - 165:13
    I mean, one can only speculate,
  • 165:13 - 165:15
    but one of the few answers
  • 165:15 - 165:17
    can be facial recognition software.
  • 165:17 - 165:18
    Do we not have it?
  • 165:18 - 165:19
    Do they have it?
  • 165:19 - 165:21
    How can we show that they're not using it?
  • 165:21 - 165:25
    Well, it's pretty hard to pick
    out two or three individuals
  • 165:25 - 165:27
    out of a thousands and thousands
  • 165:27 - 165:29
    and thousands of protestors.
  • 165:29 - 165:32
    So it's up to us to kind
    of speculate what that is.
  • 165:32 - 165:35
    The Coronavirus has caused a major epidemic.
  • 165:35 - 165:39
    The police brutality has
    caused a major epidemic.
  • 165:39 - 165:41
    I guess we have to ask ourselves,
  • 165:41 - 165:44
    are you allowing the
    next epidemic to happen?
  • 165:44 - 165:45
    Maybe.
  • 165:46 - 165:48
    Banning facial recognition
    in the city of Boston
  • 165:48 - 165:52
    will put us in the
    middle of the next battle.
  • 165:52 - 165:56
    And which side do we wanna be on?
  • 165:56 - 165:59
    Let's not approve this software.
  • 165:59 - 166:01
    I urge you to ban facial
    recognition technology
  • 166:01 - 166:03
    as it also affects our youth.
  • 166:03 - 166:08
    We do not need a system or software
  • 166:08 - 166:09
    that can pick up individuals
  • 166:09 - 166:11
    and potentially erroneously
  • 166:11 - 166:15
    accuse them of something
    that they haven't done.
  • 166:15 - 166:18
    Will the next death be
    caused by your choice today?
  • 166:18 - 166:20
    Let's hope not, thank you.
  • 166:20 - 166:22
    Thank you, thank you very much.
  • 166:22 - 166:24
    It's 10 to six.
  • 166:24 - 166:28
    I'm going to have to stop
    sharing at six o'clock.
  • 166:28 - 166:33
    That only means that I will
    be turning over the gavel
  • 166:33 - 166:35
    to one of the lead sponsors,
  • 166:35 - 166:37
    Michelle Wu at six o'clock.
  • 166:38 - 166:39
    But I'm gonna call the
    next names of individuals
  • 166:39 - 166:43
    who are in line and have
    signed up to speak before I go.
  • 166:43 - 166:45
    And we'll continue chair.
  • 166:45 - 166:50
    I've got an Angel, Michelle
    Raj Mon, Leon Smith,
  • 166:51 - 166:56
    Amy van der Hiel, excuse me,
    Julie McNulty and Zachary Lawn.
  • 167:00 - 167:02
    So is the Angel available?
  • 167:03 - 167:04
    ANGEL: Yes.
  • 167:05 - 167:08
    Hello everyone, thank
    you for being here today.
  • 167:10 - 167:11
    Thank you.
  • 167:14 - 167:19
    ANGEL: My name is Angel
    and I am in first grade
  • 167:19 - 167:24
    and go to a Boston public school.
  • 167:27 - 167:29
    I am seven years old.
  • 167:29 - 167:32
    To me surveillance means
    people watching you
  • 167:32 - 167:34
    without even knowing.
  • 167:34 - 167:36
    I would feel frustrated
  • 167:36 - 167:41
    because I do not want anyone watching me,
  • 167:43 - 167:48
    especially if I'm home,
    at school or at a park.
  • 167:49 - 167:54
    Facial recognition is
    when they see your face
  • 167:54 - 167:59
    and know all the details,
  • 167:59 - 168:02
    but this does not work for everyone
  • 168:02 - 168:06
    because most police do not like people
  • 168:06 - 168:10
    who are not their skin color.
  • 168:10 - 168:13
    This tool is not going to be equal
  • 168:13 - 168:18
    to people who are black,
    Latino, immigrants and more,
  • 168:20 - 168:24
    it can get the wrong person.
  • 168:27 - 168:32
    I am young, as I get older,
    I will not look the same.
  • 168:33 - 168:38
    I am seven now and when I turned
    13, I will look different.
  • 168:39 - 168:42
    When I turn 18, I will look different.
  • 168:42 - 168:46
    I get older and my face changes.
  • 168:46 - 168:50
    If these tools are used in
    school, I will not feel safe.
  • 168:50 - 168:53
    We are just children.
  • 168:53 - 168:57
    Our teachers are already
    taking care and watching us.
  • 168:57 - 169:00
    We do not need police watching us.
  • 169:00 - 169:04
    Thank you SIM, for your love and support.
  • 169:07 - 169:08
    Well, I have to say,
  • 169:08 - 169:12
    I think you're the youngest
    person that's testified ever.
  • 169:12 - 169:16
    And in anything I've ever chaired,
  • 169:16 - 169:17
    or had participation in,
  • 169:17 - 169:18
    I know we're all on Zoom,
  • 169:18 - 169:21
    but I will give you a round of applause.
  • 169:21 - 169:23
    Thank you so much.
  • 169:23 - 169:26
    That was absolutely beautiful,
    very powerful testimony.
  • 169:26 - 169:28
    We're so proud of you.
  • 169:28 - 169:31
    Your city council and
    city are so proud of you.
  • 169:31 - 169:34
    Thank you so much for
    your testimony Angel.
  • 169:34 - 169:39
    I'm going to now call on,
  • 169:39 - 169:41
    I don't think I see Michelle or Leon
  • 169:41 - 169:44
    and don't see them here either.
  • 169:45 - 169:49
    Okay, so I see Amy, who
    we called earlier.
  • 169:49 - 169:52
    So Amy, you have two minutes
  • 169:52 - 169:55
    and you get to follow the
    cutest seven-year-old ever.
  • 169:55 - 169:57
    [Lydia laughs]
  • 169:57 - 169:58
    Go ahead,
  • 169:59 - 170:00
    you're on mute.
  • 170:02 - 170:05
    Thank you, I applaud
    her testimony entirely.
  • 170:05 - 170:07
    Thank you madam chairwoman and the council
  • 170:07 - 170:10
    for hearing our testimony,
    I'll be very brief.
  • 170:10 - 170:14
    I just wanted to say my
    name is Amy van der Hiel.
  • 170:14 - 170:15
    I live in Roslindale
  • 170:15 - 170:18
    and I support the ban on
    face surveillance, thank you.
  • 170:19 - 170:21
    Thank you very much, Michelle.
  • 170:21 - 170:25
    I have a Michelle Raj Mon and
    I see that Michelle Barrios.
  • 170:25 - 170:26
    Are you...
  • 170:26 - 170:28
    Okay, that was the mistake.
  • 170:28 - 170:31
    Very well then, Michelle
    you have two minutes.
  • 170:31 - 170:32
    That's okay.
  • 170:33 - 170:38
    And I also am now following
    Angel, which is a bit much.
  • 170:38 - 170:40
    So hello.
  • 170:40 - 170:42
    My name is Michelle Barrios
  • 170:42 - 170:45
    and I am a social studies
    teacher in the Boston area.
  • 170:45 - 170:47
    And today I am here speaking on behalf
  • 170:47 - 170:48
    of the Student Immigrant Movement,
  • 170:48 - 170:51
    myself as an educator
  • 170:51 - 170:54
    and on the behalf of my
    diverse student population.
  • 170:54 - 170:57
    I am speaking in support of the ordinance
  • 170:57 - 170:59
    banning facial recognition
    technology in Boston,
  • 170:59 - 171:02
    presented by councilors Wu and Arroyo.
  • 171:02 - 171:05
    This ordinance establishes
    a ban on government use
  • 171:05 - 171:08
    of facial surveillance
    in the city of Boston.
  • 171:08 - 171:10
    Boston must pass this
    ordinance to join Springfield,
  • 171:10 - 171:14
    Somerville, excuse me,
    Springfield, Somerville,
  • 171:14 - 171:17
    Cambridge, Brookline and Northampton
  • 171:17 - 171:20
    in protecting racial justice,
    privacy and freedom of speech.
  • 171:20 - 171:24
    In my training as a
    social studies teacher,
  • 171:24 - 171:26
    I became increasingly aware of the trauma
  • 171:26 - 171:27
    that students of color develop
  • 171:28 - 171:30
    as they grow in a society
  • 171:30 - 171:33
    that seeks to police them on
    the basis of their ethnicity,
  • 171:33 - 171:35
    economic status and place of residence.
  • 171:35 - 171:38
    This trauma is exacerbated
    when a young person
  • 171:38 - 171:40
    comes from an immigrant family,
  • 171:40 - 171:41
    whose status in this country
  • 171:41 - 171:44
    places them at risk of family separation,
  • 171:44 - 171:48
    a loss of education and a
    stripping of basic human rights.
  • 171:48 - 171:51
    We know that students struggle
    to succeed academically
  • 171:51 - 171:54
    when they face these
    daily existential traumas,
  • 171:54 - 171:58
    which in essence is a
    violation of their rights
  • 171:58 - 172:00
    in the U.S. to pursue survival,
  • 172:00 - 172:03
    let alone social mobility
    and a future of stability.
  • 172:03 - 172:06
    In addition to my training
    and work as an educator,
  • 172:06 - 172:08
    I am the wife of a Latino immigrant
  • 172:08 - 172:12
    and the mother of two
    of my sister's children.
  • 172:12 - 172:13
    While I am a white Latina
  • 172:13 - 172:15
    and I can live under the
    presumption of innocence
  • 172:15 - 172:16
    due to my white privilege,
  • 172:16 - 172:19
    I've witnessed firsthand what
    it means to live in a society
  • 172:19 - 172:22
    that has not yet made good
    on its promise of equality,
  • 172:22 - 172:24
    liberty, and justice for all.
  • 172:24 - 172:26
    My husband is followed in stores,
  • 172:26 - 172:29
    teachers hesitate to
    release our children to him
  • 172:29 - 172:30
    when they first meet him
  • 172:30 - 172:32
    and he has been asked for
    his immigration status
  • 172:32 - 172:36
    by customers in his place of work.
  • 172:36 - 172:39
    We are in the fortunate
    position to know that his life
  • 172:39 - 172:42
    and our family can count
    on his permanent residency
  • 172:42 - 172:44
    to keep us together for now.
  • 172:44 - 172:47
    However, I cannot say the
    same for many Latin X families
  • 172:47 - 172:49
    in the greater Boston area,
  • 172:49 - 172:51
    a ban on face surveillance technology
  • 172:51 - 172:53
    is critically important
    in the city of Boston
  • 172:53 - 172:56
    because this surveillance
    harms our privacy
  • 172:56 - 172:58
    and our freedom of speech,
  • 172:58 - 173:01
    a fundamental right in our constitution.
  • 173:01 - 173:04
    This type of surveillance
    threatens to create a world
  • 173:04 - 173:06
    where people are watched and identified
  • 173:06 - 173:08
    as they exercise their right to protest,
  • 173:08 - 173:10
    congregate at places of worship
  • 173:10 - 173:13
    and visit medical providers,
  • 173:13 - 173:15
    as they go about their daily lives.
  • 173:15 - 173:16
    For my students of color,
  • 173:16 - 173:18
    specifically the young
    men growing up in a world
  • 173:18 - 173:22
    that still cannot fully accept
    that their lives matter,
  • 173:26 - 173:29
    face surveillance technology is not only
  • 173:29 - 173:30
    another threat to their lives,
  • 173:30 - 173:33
    but a response saying
    to these young people,
  • 173:33 - 173:35
    you still don't matter.
  • 173:35 - 173:37
    I teach world history as well
    as comparative government
  • 173:37 - 173:40
    and politics, and when I
    teach about authoritarianism,
  • 173:40 - 173:43
    I teach about the impact
    on marginalized groups
  • 173:43 - 173:46
    in countries that are still
    considered developing.
  • 173:46 - 173:50
    I teach that in these free countries
  • 173:50 - 173:53
    or in these recently free countries,
  • 173:53 - 173:56
    surveillance is one of
    the most common tools
  • 173:56 - 173:58
    to manage dissent.
  • 173:58 - 174:02
    And that is often followed
    by state sanctioned violence
  • 174:02 - 174:05
    as a means to silence any dissidents.
  • 174:05 - 174:08
    I urge each of you to
    consider the statement
  • 174:08 - 174:11
    that you
    would make by not banning
  • 174:11 - 174:16
    the use of facial surveillance
    technology in Boston.
  • 174:16 - 174:20
    I urge each of you to imagine
    a 14 year old student of color
  • 174:20 - 174:23
    or a Latino essential
    worker in front of you,
  • 174:23 - 174:25
    asking if their lives matter.
  • 174:25 - 174:27
    What is your response?
  • 174:27 - 174:29
    I encourage you to consider that moment
  • 174:29 - 174:32
    and to press pause on the
    use of face surveillance
  • 174:32 - 174:34
    by government entities
    in the city of Boston,
  • 174:34 - 174:37
    by supporting and
    passing this crucial ban.
  • 174:37 - 174:40
    We cannot allow Boston
    to adopt authoritarian,
  • 174:40 - 174:43
    unregulated, biased surveillance technology.
  • 174:43 - 174:46
    Thank you for your attention
    and your consideration.
  • 174:46 - 174:47
    Thank you very much.
  • 174:47 - 174:52
    At this point, I'm going
    to turn over the microphone
  • 174:52 - 174:56
    or the gavel if you will, to
    my colleague, Michelle Wu.
  • 174:56 - 175:00
    I'll just say up next two
    speakers should be getting ready.
  • 175:00 - 175:05
    That is Julie McNulty, Zachary
    Lawn, Christina Rivera,
  • 175:07 - 175:08
    and Joseph Fridman.
  • 175:10 - 175:10
    I wanna say thank you so
    much to the lead sponsors
  • 175:12 - 175:13
    of this ordinance.
  • 175:13 - 175:15
    I will be planning to
    have a working session
  • 175:15 - 175:16
    as soon as possible.
  • 175:16 - 175:19
    I do understand the
    urgency and I do believe
  • 175:19 - 175:23
    that there is a great
    opportunity before us to lead
  • 175:23 - 175:28
    in a way that not only values the lives
  • 175:29 - 175:34
    of our black and brown, but in
    general, our civil liberties.
  • 175:34 - 175:35
    And so I want to thank again,
  • 175:35 - 175:37
    lead sponsors for their leadership.
  • 175:37 - 175:39
    I look forward to getting this done
  • 175:39 - 175:41
    and with that I'm going
    to have to sign off
  • 175:41 - 175:43
    and I turn over to Michelle Wu.
  • 175:44 - 175:46
    Thank you.
  • 175:46 - 175:47
    Thank you so much, madam chair.
  • 175:47 - 175:50
    We're very grateful for you.
  • 175:51 - 175:53
    Okay, next up was Julie.
  • 175:55 - 175:59
    And if not, Julie, then Zachary.
  • 176:06 - 176:11
    And just to note, could
    council central staff
  • 176:11 - 176:12
    give me administrative rights,
  • 176:12 - 176:15
    so I can help search whether folks
  • 176:15 - 176:17
    are stuck in the waiting room
  • 176:17 - 176:20
    and transfer people over into the
    panelists section, thank you.
  • 176:23 - 176:27
    So again, it was Julie,
    Zachary, Christina Rivera,
  • 176:27 - 176:32
    Joseph Fridman, Danielle
    Samter, for the next few.
  • 176:44 - 176:45
    Christina, why don't you go ahead.
  • 176:45 - 176:47
    I see you now in the main room.
  • 176:48 - 176:49
    Hi, yes.
  • 176:49 - 176:50
    Thank you for having me. As stated,
  • 176:50 - 176:52
    my name is Christina Rivera
  • 176:52 - 176:54
    and I am an east Boston resident.
  • 176:54 - 176:57
    I'm also the founder of
    the Latin X Coalition
  • 176:57 - 176:59
    for Black Lives Matter.
  • 176:59 - 177:02
    It is a support organization
  • 177:02 - 177:05
    to help end racism within our community,
  • 177:05 - 177:07
    but also help advance the work
  • 177:07 - 177:10
    of the Black Lives Matter Movement.
  • 177:10 - 177:11
    I wanna say that we as an organization,
  • 177:11 - 177:13
    we fully support the ordinance
  • 177:13 - 177:17
    of a ban on facial recognition technology.
  • 177:17 - 177:19
    As stated all throughout
    this meeting earlier,
  • 177:20 - 177:21
    we know that the automation bias exists
  • 177:21 - 177:24
    and that facial recognition technology
  • 177:24 - 177:27
    clearly shows bias towards
    already highly targeted
  • 177:27 - 177:29
    and hyper policed communities,
  • 177:29 - 177:31
    specifically those of color.
  • 177:31 - 177:33
    And I just wanna pose the question,
  • 177:33 - 177:35
    how can we begin to trust
  • 177:35 - 177:38
    even with the development
    of algorithm accuracy,
  • 177:38 - 177:41
    that those who will use it in the future
  • 177:41 - 177:43
    will not abuse its capabilities?
  • 177:43 - 177:45
    We already have a police force
  • 177:45 - 177:47
    that has shown time and time again,
  • 177:47 - 177:50
    that some of their own
    procedures are still abused
  • 177:50 - 177:53
    to this date and used by bad actors.
  • 177:53 - 177:55
    So adding a tool that can be used
  • 177:55 - 177:57
    to further affirm racial bias
  • 177:57 - 178:00
    only provides another route
    to target our communities.
  • 178:00 - 178:04
    Using this technology also
    enforces a policing system
  • 178:04 - 178:06
    that is based on control,
  • 178:06 - 178:09
    and that is a direct
    symptom of police mistrust
  • 178:09 - 178:11
    that already exists of their own citizens
  • 178:11 - 178:14
    and the ones that they serve.
  • 178:14 - 178:16
    And so, as some people earlier stated,
  • 178:16 - 178:19
    specifically Mr. Daley's point about
  • 178:19 - 178:24
    what government agencies are
    allowed to already use this
  • 178:24 - 178:29
    versus places that have enacted bans,
  • 178:29 - 178:32
    what is also going to
    be the legislative power
  • 178:32 - 178:34
    by which any federal agency
    will be able to use it
  • 178:34 - 178:36
    in individual states,
  • 178:36 - 178:39
    specifically when it comes
    to I.C.E and immigration.
  • 178:39 - 178:41
    So this is something that
    we are looking to make sure
  • 178:41 - 178:43
    that it's not only banned now,
  • 178:43 - 178:46
    but also cannot be allowed in the future.
  • 178:46 - 178:47
    With that said,
  • 178:47 - 178:48
    thank you so much to all the panelists
  • 178:48 - 178:50
    who've shared all this great information.
  • 178:50 - 178:53
    Thank you so much to councilor
    Wu and councilor Arroyo
  • 178:53 - 178:56
    for doing this work and I concede my time.
  • 178:56 - 178:57
    Thank you.
  • 178:59 - 179:00
    Thank you so much, Christina.
  • 179:00 - 179:02
    We appreciate you, Joseph.
  • 179:06 - 179:07
    Yes, thank you for your time.
  • 179:07 - 179:09
    I'd like to tell you a
    little bit about the science
  • 179:09 - 179:11
    behind emotion detection technologies.
  • 179:11 - 179:12
    So these are technologies
  • 179:12 - 179:14
    that use information captured
    from face surveillance
  • 179:14 - 179:16
    to supposedly recognize emotions.
  • 179:16 - 179:19
    So I'll be speaking to
    you as a private citizen,
  • 179:19 - 179:20
    but I'll tell you that
    for the past three years,
  • 179:20 - 179:22
    I've been working at a lab
    at Northeastern University,
  • 179:22 - 179:23
    which studies emotions,
  • 179:23 - 179:24
    and I've been a research assistant
  • 179:24 - 179:26
    at the Harvard Kennedy School's
    Belfer Center for Science
  • 179:26 - 179:27
    and International Affairs,
  • 179:27 - 179:29
    thinking about the same
    type of technology.
  • 179:29 - 179:31
    At the Interdisciplinary
    Affective Science Lab
  • 179:31 - 179:32
    at Northeastern,
  • 179:32 - 179:34
    one of my bosses is Dr.
    Lisa Feldman Barrett.
  • 179:34 - 179:36
    Dr. Barrett has research
    appointments at MGH
  • 179:36 - 179:37
    and Harvard Medical School.
  • 179:37 - 179:40
    She's the chief scientific
    officer at MGH Center
  • 179:40 - 179:41
    for Law, Brain and Behavior.
  • 179:41 - 179:42
    She's the author of
  • 179:42 - 179:44
    over 230 peer reviewed scientific papers,
  • 179:44 - 179:45
    and just finished a term as president
  • 179:45 - 179:48
    of the Association for
    Psychological Science,
  • 179:48 - 179:50
    where she represented tens
    of thousands of scientists
  • 179:50 - 179:51
    around the world.
  • 179:51 - 179:52
    Three years ago,
  • 179:52 - 179:54
    this organization commissioned
    her and four other experts,
  • 179:54 - 179:56
    psychologists, computer
    scientists, and so on,
  • 179:56 - 179:57
    to do a major study
  • 179:57 - 180:00
    on whether people across
    the world express emotions
  • 180:00 - 180:02
    like anger, sadness,
    fear, disgust, surprise,
  • 180:02 - 180:05
    and happiness with distinctive
    movements of the face.
  • 180:05 - 180:06
    So all of these scientists
  • 180:06 - 180:07
    started with different perspectives,
  • 180:07 - 180:09
    but they reviewed a
    thousand research articles
  • 180:09 - 180:12
    and they came to a very simple
    and very important consensus.
  • 180:12 - 180:13
    So the question is,
  • 180:13 - 180:16
    can you read emotions
    reliably in a human face?
  • 180:16 - 180:17
    And the answer is no.
  • 180:17 - 180:19
    Here's an example from Dr. Barrett.
  • 180:19 - 180:22
    People living in Western
    cultures scowl when they're angry
  • 180:22 - 180:24
    about 30% of the time,
  • 180:24 - 180:26
    this means that they make
    other facial expressions,
  • 180:26 - 180:30
    frowns, wide-eyed gasps,
    smiles, about 70% of the time.
  • 180:30 - 180:32
    This is called low reliability.
  • 180:32 - 180:35
    People scowl when angry more than chance,
  • 180:35 - 180:36
    but they also scowl
    when they're not angry.
  • 180:36 - 180:39
    If they're concentrating or if
    they have something like gas,
  • 180:39 - 180:41
    this is called low specificity.
  • 180:41 - 180:43
    So a scowl is not the expression of anger,
  • 180:43 - 180:44
    but it's one of many expressions of anger.
  • 180:44 - 180:47
    And sometimes it doesn't
    express anger at all.
  • 180:47 - 180:50
    In this paper, Dr.
    Barrett and her colleagues
  • 180:50 - 180:52
    showed that scowling and
    anger, smiling and happiness,
  • 180:52 - 180:53
    frowning and sadness,
  • 180:53 - 180:55
    these are all Western stereotypes
    of emotional expressions.
  • 180:55 - 180:57
    They reflect some common beliefs
  • 180:57 - 180:58
    about emotional expressions,
  • 180:58 - 181:00
    but these beliefs don't correspond
  • 181:00 - 181:03
    to how people actually move
    their faces in real life.
  • 181:03 - 181:04
    And they don't generalize to cultures
  • 181:05 - 181:07
    that are different from ours.
  • 181:07 - 181:08
    So it's not possible for anyone
  • 181:08 - 181:10
    or anything to read an
    emotion on our face.
  • 181:10 - 181:12
    And we shouldn't confidently
    infer happiness from a smile,
  • 181:12 - 181:14
    anger from a scowl or
    sadness from a frown.
  • 181:14 - 181:17
    There are technologies
    that claim to do this,
  • 181:17 - 181:19
    and they're misrepresenting
    what they can do
  • 181:19 - 181:20
    according to the best available evidence.
  • 181:21 - 181:24
    People aren't moving their faces randomly,
  • 181:24 - 181:26
    but there's a lot of variation
  • 181:26 - 181:28
    and the meaning of a facial movement
  • 181:28 - 181:30
    depends on the person, on the
    context and on culture.
  • 181:30 - 181:32
    So let me give you one example
  • 181:32 - 181:35
    of how making this assumption
    can be really harmful.
  • 181:35 - 181:36
    There's a scientist, Dr. Lauren Rhue,
  • 181:36 - 181:39
    when she was at Wake Forest University,
  • 181:39 - 181:40
    published work reviewing two popular
  • 181:40 - 181:41
    emotion detection algorithms.
  • 181:41 - 181:43
    She ran these algorithms
  • 181:43 - 181:45
    on portraits of white
    and black NBA players.
  • 181:45 - 181:47
    And both of them consistently interpreted
  • 181:47 - 181:49
    the black players as having
    more negative emotions
  • 181:49 - 181:51
    than the white players.
  • 181:51 - 181:52
    Imagine that there's a security guard
  • 181:52 - 181:54
    that gets information
    from a surveillance camera
  • 181:54 - 181:55
    that there is a possible nuisance
  • 181:55 - 181:57
    in the lobby of their building
  • 181:57 - 181:59
    because of the facial movements
    that somebody is making.
  • 181:59 - 182:01
    Imagine a defendant in the courtroom
  • 182:01 - 182:03
    that's scowling as they
    concentrate on legal proceedings
  • 182:03 - 182:05
    but an emotion AI, based
    on facial surveillance
  • 182:05 - 182:08
    tells the jury that
    the defendant is angry.
  • 182:08 - 182:09
    Imagine that the defendant is black
  • 182:09 - 182:11
    and the AI says that they're contemptuous.
  • 182:11 - 182:13
    Imagine at worst, a police officer
  • 182:13 - 182:15
    receiving an alert from a body camera
  • 182:15 - 182:16
    based on this technology
  • 182:16 - 182:18
    that says that someone in
    their line of sight is a threat
  • 182:18 - 182:21
    based on a so-called
    reading of their face.
  • 182:21 - 182:24
    Councilors, I think that the question
  • 182:24 - 182:25
    that we should be thinking about
  • 182:25 - 182:28
    is whether we want someone
    to use this technology on us
  • 182:28 - 182:31
    with the chance that it
    would misinterpret our face
  • 182:31 - 182:34
    given the major chance
    that it has of being wrong.
  • 182:34 - 182:36
    And I think the clear
    answer to that is no.
  • 182:36 - 182:38
    We should ban facial
    surveillance technologies
  • 182:38 - 182:41
    and all of these other
    dangerous applications.
  • 182:41 - 182:43
    Thank you.
  • 182:43 - 182:44
    Thank you very much.
  • 182:44 - 182:46
    Next up is Danielle and
    Danielle will be followed
  • 182:46 - 182:49
    by DJ Hatfield and Denise Horn.
  • 182:51 - 182:53
    Hi, good afternoon everybody.
  • 182:53 - 182:55
    My name is Danielle,
    I'm a Boston resident,
  • 182:55 - 182:57
    I live in the north end.
  • 182:57 - 183:01
    I just wanted to fully support
  • 183:01 - 183:04
    and put my voice behind the ban
  • 183:04 - 183:08
    for facial recognition
    technology.
  • 183:08 - 183:11
    I feel as though the expert witnesses
  • 183:11 - 183:13
    and activists that have gone before me,
  • 183:13 - 183:18
    have clearly expressed and
    eloquently expressed the views
  • 183:18 - 183:19
    that I also share.
  • 183:19 - 183:23
    I find this technology to
    be incredibly dangerous.
  • 183:23 - 183:26
    As a cybersecurity professional,
  • 183:26 - 183:27
    I have a lot of questions about
  • 183:27 - 183:32
    what it means to be
    collecting data on people
  • 183:32 - 183:33
    and storing it.
  • 183:34 - 183:38
    I certainly think that this is a measure
  • 183:38 - 183:40
    that should be taken to be a precedent
  • 183:40 - 183:43
    for all future developments
    of the technology
  • 183:43 - 183:45
    as was just stated.
  • 183:45 - 183:48
    There are upcoming and new ways
  • 183:48 - 183:52
    to extend facial recognition
    through emotions or otherwise
  • 183:52 - 183:56
    that we can't even comprehend
    yet because it hasn't started.
  • 183:56 - 184:00
    So I wanted to support it and
    thank all the other people
  • 184:00 - 184:01
    for putting their voice behind it
  • 184:01 - 184:04
    and hope to see it take place, thank you.
  • 184:06 - 184:08
    Thank you very much, DJ.
  • 184:15 - 184:18
    Hi, I will start the video.
  • 184:18 - 184:20
    So I'm DJ Hatfield.
  • 184:20 - 184:23
    I'm a professor of
    anthropology and history
  • 184:23 - 184:26
    at the Berklee College
    of Music in Boston.
  • 184:26 - 184:31
    However, what I have to
    say is as a private citizen
  • 184:31 - 184:33
    and a resident of east Boston,
  • 184:33 - 184:38
    that's a shout out to
    Lydia, east Boston pride.
  • 184:38 - 184:43
    Yes, as an anthropologist I'm
    aware that all technologies,
  • 184:43 - 184:45
    including digital technologies
  • 184:45 - 184:47
    reflect and augment the social structures
  • 184:47 - 184:50
    and systems of value of
    particular societies.
  • 184:50 - 184:52
    They are not objective,
  • 184:52 - 184:54
    but reflect the biases of those societies
  • 184:54 - 184:57
    as they develop through history.
  • 184:57 - 184:59
    Moreover, if we look at societies
  • 184:59 - 185:02
    where facial recognition
    has been widely deployed,
  • 185:02 - 185:05
    such as the People's Republic of China,
  • 185:05 - 185:08
    we see very clearly how these technologies
  • 185:08 - 185:11
    have been used systematically
  • 185:11 - 185:13
    to target marginal populations.
  • 185:15 - 185:17
    This is a quality of the way
  • 185:17 - 185:21
    that these technologies
    intersect with social systems
  • 185:21 - 185:23
    that act against people of color
  • 185:23 - 185:25
    and indigenous people, globally.
  • 185:26 - 185:28
    Therefore, in the city of Boston,
  • 185:28 - 185:33
    where we take pride in our civic freedoms,
  • 185:33 - 185:37
    we should be wary of technologies
  • 185:37 - 185:40
    that would make our Fourth
    Amendment meaningless
  • 185:40 - 185:42
    and would erode the First Amendment.
  • 185:43 - 185:45
    Therefore, I would urge the city council
  • 185:45 - 185:50
    and the city of Boston
    to pass a complete ban
  • 185:51 - 185:55
    on facial recognition
    systems and technology,
  • 185:55 - 185:59
    with the amendments that have
    been proposed by experts,
  • 185:59 - 186:02
    such as those of the
    Electronic Frontier Foundation,
  • 186:02 - 186:07
    and the Algorithmic Justice League,
  • 186:07 - 186:09
    who have said there should be no exception
  • 186:09 - 186:13
    for the Boston Police Department
    to use facial recognition
  • 186:13 - 186:18
    or surveillance systems in evidence,
  • 186:18 - 186:21
    and that there should be strict rules
  • 186:21 - 186:24
    for enforcement of breaches of this ban.
  • 186:24 - 186:29
    And finally, if possible,
    extending this ban
  • 186:29 - 186:31
    to all private actors who have business
  • 186:31 - 186:33
    in the city of Boston.
  • 186:33 - 186:34
    Thank you for your consideration
  • 186:34 - 186:36
    and thank you for all your time today.
  • 186:36 - 186:39
    I appreciate being able to join
  • 186:39 - 186:41
    in my very first public hearing.
  • 186:41 - 186:42
    Thank you.
  • 186:42 - 186:45
    We are excited to have you at our hearing
  • 186:45 - 186:47
    and hope you'll come back, Denise next.
  • 186:47 - 186:49
    And then after Denise,
  • 186:49 - 186:51
    there are a few names signed up
  • 186:51 - 186:52
    that I didn't see in the room.
  • 186:52 - 186:55
    So I'm gonna call out folks
  • 186:55 - 186:57
    that I was able to cross
    reference between the waiting room
  • 186:57 - 187:00
    and the list, which was Charles Griswold
  • 187:00 - 187:02
    and Nora Paul-Schultz.
  • 187:02 - 187:03
    And then after that,
  • 187:03 - 187:05
    I'm gonna try to admit everyone else
  • 187:05 - 187:07
    who's in the room and go in order,
  • 187:07 - 187:10
    even if you weren't on the original list.
  • 187:10 - 187:11
    So Denise, please.
  • 187:13 - 187:14
    Thank you.
  • 187:14 - 187:15
    Thank you members of the council
  • 187:15 - 187:17
    and the lead sponsors of the ordinance
  • 187:17 - 187:19
    and the panelists, really
    excellent commentary.
  • 187:19 - 187:21
    And I just have to say how impressed I am
  • 187:21 - 187:25
    with the Student Immigrant
    Movement for showing up today,
  • 187:25 - 187:27
    this is really impressive.
  • 187:27 - 187:30
    My name is Denise Horn, I'm
    a resident of Jamaica Plain.
  • 187:30 - 187:33
    I'm also a professor of political science
  • 187:33 - 187:35
    and international relations
    at Simmons University.
  • 187:35 - 187:38
    Although I am not representing
    my institution today.
  • 187:38 - 187:41
    And I want to say that I support the ban
  • 187:41 - 187:43
    against the use of facial surveillance.
  • 187:43 - 187:45
    So, like professor Hatfield,
  • 187:45 - 187:49
    I am a scholar of international relations,
  • 187:49 - 187:52
    I have studied authoritarian
    regimes in Eastern Europe
  • 187:52 - 187:55
    and east and southeast Asia,
  • 187:55 - 187:58
    where surveillance technologies
    of every kind have been
  • 187:58 - 188:01
    and are used to target
    enemies of the state.
  • 188:01 - 188:03
    In the United States we're
    already seeing this language
  • 188:03 - 188:05
    in use by the leaders in our government,
  • 188:05 - 188:06
    regarding immigrants and activists,
  • 188:06 - 188:10
    and especially those who
    are opposed to fascism.
  • 188:10 - 188:12
    I do not think that we
    should underestimate
  • 188:12 - 188:14
    how this type of technology could be used
  • 188:14 - 188:18
    to further racist and
    authoritarian agendas here.
  • 188:18 - 188:21
    Facial surveillance
    is used to oppress
  • 188:21 - 188:24
    and control citizens under
    the guise of public safety.
  • 188:24 - 188:25
    This is a slippery slope.
  • 188:25 - 188:29
    Government surveillance
    is used to silence,
  • 188:29 - 188:32
    to suppress speech and
    to violate human rights.
  • 188:32 - 188:35
    As an educator, I worry about
  • 188:35 - 188:38
    how this type of technology will be used
  • 188:38 - 188:41
    to adversely affect the lives
    of my students of color,
  • 188:41 - 188:44
    trans students, undocumented
    students, activists
  • 188:44 - 188:47
    and marginalized groups,
    their lives matter.
  • 188:47 - 188:49
    I strongly support this ban
  • 188:49 - 188:50
    and I thank you for letting me speak.
  • 188:52 - 188:55
    Thank you, Denise, Charles.
  • 188:56 - 189:01
    Yep, so I'm trying to start
    the video here, there we are.
  • 189:03 - 189:05
    Can you see me and hear me?
  • 189:06 - 189:08
    Yes we can.
    Okay, thank you.
  • 189:08 - 189:12
    My name is Charles Griswold,
    I'm a resident of Brighton
  • 189:12 - 189:16
    and I'm a philosophy professor
    at Boston University.
  • 189:16 - 189:19
    I hasten to add that I'm
    offering my thoughts here
  • 189:19 - 189:21
    and in offering my thoughts,
  • 189:21 - 189:23
    I am not speaking for my university
  • 189:23 - 189:25
    and not acting on its behalf.
  • 189:25 - 189:27
    So I'm speaking on my own behalf only.
  • 189:27 - 189:30
    I'm very grateful to all
    of you on the council
  • 189:30 - 189:32
    for taking action on this matter
  • 189:32 - 189:35
    and to the police commissioner
    for his impressive
  • 189:35 - 189:37
    and in some ways reassuring testimony.
  • 189:37 - 189:39
    It's imperative I think,
  • 189:39 - 189:43
    that you approve this ban on
    facial recognition technology,
  • 189:43 - 189:48
    or rather technologies as we've
    learned today by government
  • 189:48 - 189:51
    before the relevant software is upgraded.
  • 189:52 - 189:53
    So far as I can tell,
  • 189:53 - 189:57
    three arguments for the ban
    have been discussed today.
  • 189:57 - 190:00
    The first two extensively
    and the third a bit less so
  • 190:00 - 190:03
    and taken together, they
    seem to me to be decisive
  • 190:03 - 190:06
    in making the case for the ban.
  • 190:06 - 190:07
    The first is of course
  • 190:07 - 190:10
    the matter of racial and gender justice,
  • 190:10 - 190:12
    on account of the fact that the technology
  • 190:12 - 190:14
    is known to be inaccurate.
  • 190:14 - 190:18
    But secondly, there is the
    fact that the technology
  • 190:18 - 190:20
    is objectionable.
  • 190:20 - 190:21
    Not just because it's not accurate,
  • 190:21 - 190:23
    but also when it is accurate,
  • 190:23 - 190:27
    as Kade Crockford has
    eloquently argued today.
  • 190:27 - 190:29
    And that seems to me
    to be an essential part
  • 190:29 - 190:33
    of the response to the
    commissioner's remarks.
  • 190:33 - 190:37
    In sum, the technology puts
    tremendous additional power
  • 190:37 - 190:39
    in the hands of government and its agents,
  • 190:39 - 190:42
    all the more so when it actually does work
  • 190:42 - 190:44
    and our privacy among other things
  • 190:44 - 190:47
    is certainly threatened by that.
  • 190:47 - 190:49
    Where could that lead? Look no further,
  • 190:49 - 190:52
    as people have suggested to the nightmare,
  • 190:52 - 190:56
    that is the surveillance
    apparatus in China today.
  • 190:56 - 190:59
    The third point has been mentioned,
  • 190:59 - 191:00
    but it was less emphasized.
  • 191:00 - 191:02
    And I think it's just worth underlining,
  • 191:02 - 191:04
    which is that the awareness
  • 191:04 - 191:08
    of being personally tracked
    outside of one's home
  • 191:08 - 191:09
    by this technology can,
  • 191:09 - 191:12
    and I think will have a chilling effect
  • 191:12 - 191:14
    on our exercise of our civil rights,
  • 191:14 - 191:16
    including our rights of free speech,
  • 191:16 - 191:19
    freedom of association and our religion.
  • 191:19 - 191:23
    So I strongly urge the
    council to pass this ban,
  • 191:23 - 191:26
    but I also ask that you
    will not stop there,
  • 191:26 - 191:29
    expand the ban even further,
  • 191:29 - 191:32
    going forward to include other kinds
  • 191:32 - 191:33
    of remote surveillance systems,
  • 191:33 - 191:35
    such as those that are biometric, for example,
  • 191:35 - 191:37
    gait and voice recognition systems,
  • 191:37 - 191:41
    and also to the private
    sector, as has been suggested.
  • 191:41 - 191:46
    So I hope that you will consider
    expeditiously passing this,
  • 191:46 - 191:48
    but also going further.
  • 191:48 - 191:50
    Thank you so much for listening.
  • 191:50 - 191:52
    Thank you very much, Nora.
  • 191:54 - 191:56
    Good evening, my name
    is Nora Paul-Schultz,
  • 191:56 - 191:59
    and I am a physics teacher
    in Boston public schools.
  • 191:59 - 192:02
    And I've had the pleasure
    of teaching Eli and Esmalda
  • 192:02 - 192:04
    who spoke earlier and
    hopefully one day Angel.
  • 192:04 - 192:07
    And I'm a resident of Jamaica Plain.
  • 192:07 - 192:09
    As someone who studied
    engineering when I was in college,
  • 192:09 - 192:10
    has taught engineering
  • 192:10 - 192:13
    and is one of the robotics
    coaches at The O'Bryant,
  • 192:13 - 192:15
    I understand the impulse to feel like
  • 192:15 - 192:17
    technology can solve all of our problems.
  • 192:17 - 192:20
    I know that for
    many there is a desire
  • 192:20 - 192:24
    that if we have better,
    smarter and more technology
  • 192:24 - 192:26
    then our communities will be safer,
  • 192:26 - 192:29
    but the reality is that
    technology is not
  • 192:29 - 192:31
    a perfect, unbiased tool,
  • 192:31 - 192:33
    as much as we wish it would be.
  • 192:33 - 192:37
    Our technologies reflect the
    inherent biases of their makers.
  • 192:37 - 192:40
    So technology isn't going to save us.
  • 192:40 - 192:45
    It does not take
    much to see that racism
  • 192:45 - 192:47
    is infused in all parts of our country.
  • 192:47 - 192:51
    And that is true about the
    facial surveillance technologies.
  • 192:51 - 192:53
    Through my work with Unafraid Educators,
  • 192:53 - 192:55
    the Boston Teachers Union's
  • 192:55 - 192:57
    immigrant rights organizing committee,
  • 192:57 - 193:00
    on the information sharing
    between Boston School Police
  • 193:00 - 193:02
    and the Boston Police Department,
  • 193:02 - 193:06
    I know how detrimental observations
    and surveillance can be.
  • 193:06 - 193:08
    Practices of surveillance
  • 193:08 - 193:10
    like camera recording in the hallway
  • 193:10 - 193:13
    or officers watching young
    people are not passive.
  • 193:13 - 193:15
    Surveillance is active.
  • 193:15 - 193:19
    It leads to the creation
    of law enforcement profiles
  • 193:19 - 193:24
    about young people and
    that can impact their lives
  • 193:24 - 193:26
    in material ways for years to come.
  • 193:26 - 193:30
    And one of those impacts,
    being in the system,
  • 193:30 - 193:32
    can make it harder for
    young people to get a job
  • 193:32 - 193:33
    or get housing.
  • 193:33 - 193:37
    It can lead to incarceration,
    it can lead to deportation.
  • 193:37 - 193:40
    Our city government
    does not need more tools
  • 193:40 - 193:42
    to profile the people of our city
  • 193:42 - 193:45
    and especially doesn't need tools
  • 193:45 - 193:46
    that are inherently racist.
  • 193:46 - 193:49
    This is why the ban is so important.
  • 193:49 - 193:52
    Teaching has taught me
    that what keeps us safe
  • 193:52 - 193:54
    is not having the government surveil
  • 193:54 - 193:56
    and track us through technology,
  • 193:56 - 193:58
    what keeps us safe is
    investing in housing,
  • 193:58 - 194:00
    education and healthcare.
  • 194:00 - 194:03
    This face surveillance
    technology harms our community,
  • 194:03 - 194:04
    Ah!
  • 194:07 - 194:09
    Sorry.
  • 194:09 - 194:14
    This surveillance technology
    harms our community.
  • 194:14 - 194:16
    The fact that the face surveillance technology
  • 194:16 - 194:20
    only identifies black women
    correctly one third of the time
  • 194:20 - 194:22
    lays bare both the ineffectiveness
  • 194:22 - 194:24
    and implicit bias built into the system.
  • 194:24 - 194:27
    As a teacher I know that
    that is not a passing grade.
  • 194:27 - 194:29
    We need money to go to services
  • 194:29 - 194:31
    that will protect our community,
  • 194:31 - 194:33
    and we need to stop and prevent
  • 194:33 - 194:35
    the use of ineffective, racist
  • 194:35 - 194:38
    and money-wasting technologies.
  • 194:38 - 194:39
    Thank you.
  • 194:41 - 194:43
    Thank you, Nora and we
    appreciate all of your work.
  • 194:43 - 194:46
    I'm gonna quickly read the list of names
  • 194:46 - 194:48
    that were signed up to testify,
  • 194:48 - 194:51
    but I didn't see matches
    of the names of folks
  • 194:51 - 194:52
    left in the waiting room.
  • 194:52 - 194:54
    So just in case you're signed
    in under a different name,
  • 194:54 - 194:56
    but I'm expecting that these folks
  • 194:56 - 194:57
    probably aren't here anymore.
  • 194:57 - 195:01
    Julie McNulty, Zachary
    Lawn, McKenna Kavanaugh,
  • 195:01 - 195:06
    Christine Dordy, Sarah Nelson,
    Andrew Tario, Amanda Meehan.
  • 195:06 - 195:10
    Chris Farone, Bridget
    Shepherd, Lena Papagiannis.
  • 195:10 - 195:15
    Anyone here signed on
    under a different name?
  • 195:16 - 195:17
    Nope, okay.
  • 195:17 - 195:19
    Then I will just go on the order
  • 195:19 - 195:22
    of folks appearing on my
    screen to close this out.
  • 195:22 - 195:24
    Thank you so much for your patience.
  • 195:24 - 195:29
    So that will be Mark G then Adeline Ansell
  • 195:29 - 195:31
    then Carolina Pena.
  • 195:34 - 195:36
    Yes, so this is Mark Gurvich,
  • 195:37 - 195:39
    I'm presenting testimony on behalf of
  • 195:40 - 195:41
    Jewish Voice for Peace,
  • 195:41 - 195:43
    a local and national organization
  • 195:43 - 195:46
    that's guided by Jewish
    values, fighting racism,
  • 195:46 - 195:49
    sexism, Islamophobia, LGBTQ oppression
  • 195:49 - 195:52
    and anti-immigrant policies here
  • 195:52 - 195:54
    and challenging racism, colonialism
  • 195:54 - 195:55
    and oppression everywhere,
  • 195:55 - 195:58
    with a focus on the injustices
    suffered by Palestinians
  • 195:58 - 196:01
    under Israeli occupation and control.
  • 196:01 - 196:04
    In recent days in the U.S.
    we have again witnessed
  • 196:04 - 196:07
    the impact of militarized policing
  • 196:07 - 196:09
    on communities of color.
  • 196:09 - 196:14
    Councilor Wu has pointed
    out in a recent publication
  • 196:15 - 196:17
    about this militarization
  • 196:17 - 196:20
    and much of this
    militarization has been through
  • 196:20 - 196:24
    the transfer of billions of
    dollars worth of equipment
  • 196:24 - 196:27
    from the military to
    civilian police forces,
  • 196:27 - 196:28
    and the training of U.S. police
  • 196:28 - 196:30
    in the use of military tactics
  • 196:30 - 196:32
    in controlling the communities,
  • 196:32 - 196:34
    most heavily impacted by racism,
  • 196:34 - 196:37
    economic depletion, and health crises.
  • 196:37 - 196:42
    Our view is that the use of
    facial recognition technology
  • 196:42 - 196:46
    is a key component of the
    militarization of policing.
  • 196:46 - 196:48
    We know this because of our knowledge
  • 196:48 - 196:50
    and experience of the Israeli occupation
  • 196:50 - 196:53
    of Palestinian people and land.
  • 196:53 - 196:58
    Facial recognition is a core
    feature of that occupation.
  • 196:59 - 197:01
    Much of the technology being marketed
  • 197:01 - 197:02
    to the U.S. law enforcement,
  • 197:02 - 197:05
    has been field tested on Palestinians
  • 197:05 - 197:07
    under military occupation.
  • 197:07 - 197:09
    At this historic and critical time,
  • 197:09 - 197:12
    when the effects of militarized
    policing in the U.S.
  • 197:12 - 197:16
    have resulted in nationwide
    and worldwide condemnation,
  • 197:16 - 197:19
    the very last thing we
    need is another tool
  • 197:19 - 197:23
    adopted from the toolbox
    of military occupation.
  • 197:23 - 197:27
    Jewish Voice for Peace supports
    the face surveillance ban.
  • 197:27 - 197:29
    Thank you.
  • 197:29 - 197:30
    Thank you very much.
  • 197:30 - 197:33
    Adeline or Adeline, sorry
    to mispronounce your name.
  • 197:40 - 197:41
    Adeline Ansell.
  • 197:45 - 197:48
    We'll move on to Carolina
    or Carolina Pena.
  • 197:56 - 197:58
    Let me see.
  • 197:58 - 198:00
    I will unmute on my own in case
  • 198:00 - 198:02
    it's a muting and unmuting issue.
  • 198:06 - 198:09
    Then we'll move on to Kristen Lyman.
  • 198:09 - 198:12
    Kristen will be followed by Galen Bunting,
  • 198:12 - 198:15
    Maria Brincker and Araya Zack.
  • 198:15 - 198:15
    Kristen.
  • 198:28 - 198:29
    Okay, going to Galen,
  • 198:38 - 198:39
    Maria Brincker.
  • 198:46 - 198:47
    MARIA: Yes.
  • 198:47 - 198:50
    Hi, I had actually not
    necessarily prepared.
  • 198:50 - 198:53
    Hello, can you hear me?
  • 198:53 - 198:57
    CHAIR: Adeline, we'll go to
    you right next, after Maria.
  • 198:57 - 199:00
    Sorry about the confusion.
  • 199:00 - 199:02
    ADELINE: Okay.
    Okay, no problem.
  • 199:02 - 199:05
    MARIA: So I just wanted to thank everybody
  • 199:05 - 199:10
    and I wanna say that I am very much,
  • 199:10 - 199:13
    I'm an associate professor at UMass Boston
  • 199:13 - 199:16
    and both as an educator,
  • 199:16 - 199:20
    and as a researcher in
    issues of surveillance,
  • 199:20 - 199:23
    I am very much in favor of the ordinance.
  • 199:23 - 199:26
    I would like to say that,
  • 199:26 - 199:31
    I would also like the council
    to consider wider regulations.
  • 199:31 - 199:34
    So as many have said before,
  • 199:34 - 199:39
    there's a lot of issues
    also in the private sphere.
  • 199:40 - 199:42
    So this is a very important first step,
  • 199:42 - 199:47
    but a lot of the use of facial recognition
  • 199:48 - 199:50
    in law enforcement,
  • 199:50 - 199:55
    really interacts with the use
    of surveillance technologies
  • 199:55 - 199:57
    in the private space.
  • 199:57 - 200:01
    So for example, when the
    Brookline Police Department
  • 200:01 - 200:04
    were trying to argue
    against the Brookline ban,
  • 200:08 - 200:13
    they used an example of
    police using Snapchat.
  • 200:14 - 200:18
    So a lot of speakers before
    have talked about the importance
  • 200:18 - 200:20
    of anonymity in public space,
  • 200:20 - 200:23
    and I'm fully in support of that,
  • 200:23 - 200:26
    but this is not only about public space.
  • 200:26 - 200:28
    So the use of facial recognition software
  • 200:28 - 200:31
    is of course invading all
    our private spaces as well.
  • 200:31 - 200:35
    So I would like people
    to be aware of that.
  • 200:35 - 200:37
    And as a teacher right now,
  • 200:37 - 200:40
    with the pandemic, teaching on Zoom,
  • 200:40 - 200:45
    you can only imagine
    the deployment of voice
  • 200:47 - 200:52
    and face recognition in all our spaces.
  • 200:52 - 200:55
    And so I think this is very important.
  • 200:55 - 200:56
    And another thing,
  • 200:56 - 200:58
    my background is also in
    philosophy of neuroscience
  • 200:58 - 201:02
    and in the understanding
    of autonomous action.
  • 201:02 - 201:06
    And autonomous action
    depends on being in a space
  • 201:06 - 201:08
    that we can understand.
  • 201:08 - 201:10
    And one of the reasons
    why facial recognition
  • 201:10 - 201:13
    is one of the most dangerous inventions
  • 201:13 - 201:18
    is that it precisely ruins
    the integrity of any space
  • 201:18 - 201:20
    because we don't know who's there
  • 201:20 - 201:24
    and it can do so retroactively.
  • 201:25 - 201:29
    So I thank you very much
    for all your work on this,
  • 201:29 - 201:32
    and I'm fully in support of
    the ordinance, thank you.
  • 201:32 - 201:33
    Thank you so much.
  • 201:35 - 201:36
    Okay, next we'll go to Adeline
  • 201:39 - 201:41
    Hi, I'm Adeline Ansell.
  • 201:43 - 201:45
    I was just coming out to
    support the ordinance as a member
  • 201:45 - 201:47
    of the Unafraid Educators group
  • 201:47 - 201:50
    of the Boston Teachers Union,
  • 201:50 - 201:52
    to protect our immigrant students
  • 201:52 - 201:54
    who have been
    advocating for themselves
  • 201:54 - 201:57
    through the Student Immigrant Movement
  • 201:57 - 201:59
    to support this ordinance
  • 201:59 - 202:03
    banning facial recognition technology,
  • 202:03 - 202:07
    because it puts our
    students even more at risk.
  • 202:07 - 202:11
    And I don't have a lot to add.
  • 202:11 - 202:12
    I just wanna support
    the work they're doing
  • 202:12 - 202:14
    in their student advocacy.
  • 202:14 - 202:16
    And I know they all spoke for themselves
  • 202:16 - 202:18
    and their allies did as well,
  • 202:18 - 202:20
    and just wanted to let you
    know that Boston teachers
  • 202:20 - 202:23
    also support this ban.
  • 202:23 - 202:26
    So thank you for listening
    to all of their testimony
  • 202:26 - 202:29
    and know that the teachers
    are on their side too,
  • 202:29 - 202:33
    and that we support this
    ordinance, thank you.
  • 202:33 - 202:34
    Thank you.
  • 202:34 - 202:36
    Okay, last call.
  • 202:36 - 202:38
    Either Carolina or Galen
  • 202:38 - 202:40
    or anyone else who would like to speak.
  • 202:40 - 202:43
    I think everyone's been admitted
    from the waiting room now.
  • 202:46 - 202:47
    Okay, seeing no takers,
  • 202:47 - 202:49
    I'll pass it back if any councilors
  • 202:49 - 202:52
    wish to make a closing
    statement before we wrap up.
  • 202:56 - 202:59
    I see. I'll save councilor Arroyo
    for right before we close;
  • 202:59 - 203:01
    councilor Annissa Essaibi-George.
  • 203:01 - 203:03
    Thank you madam chair at this point
  • 203:03 - 203:05
    and thank you to you and councilor Arroyo
  • 203:05 - 203:07
    for bringing this before the council.
  • 203:07 - 203:11
    I've really appreciated
    everyone's testimony today
  • 203:11 - 203:13
    in both hearings and sort of the,
  • 203:13 - 203:16
    certainly a very direct correlation
  • 203:16 - 203:17
    between some of the testimony today,
  • 203:17 - 203:19
    but a particular interest of mine
  • 203:19 - 203:22
    was the testimony of the young people
  • 203:22 - 203:23
    and the testimony of the teachers,
  • 203:23 - 203:25
    in particular...
  • 203:25 - 203:28
    So just thank you, thank you both.
  • 203:28 - 203:32
    Thank you councilor Wu for
    staying after the chair
  • 203:32 - 203:34
    had to leave to extend an opportunity
  • 203:34 - 203:36
    for this continued testimony.
  • 203:36 - 203:40
    It was certainly important testimony,
  • 203:40 - 203:44
    but just the fact that everyone stayed on
  • 203:44 - 203:46
    as long as they did to
    offer that testimony,
  • 203:46 - 203:48
    I think is really important to recognize,
  • 203:48 - 203:51
    and to be grateful for that engagement.
  • 203:51 - 203:53
    And thank you to you and
    thank you to the maker.
  • 203:53 - 203:54
    Thank you councilor
  • 203:54 - 203:58
    and final closing words from
    my co-sponsor councilor Arroyo.
  • 204:00 - 204:03
    Just a sincere thank you to our panelists.
  • 204:03 - 204:05
    I think many of them are off now,
  • 204:05 - 204:07
    but thank you so much to them
  • 204:07 - 204:09
    and everybody who gave testimony,
  • 204:09 - 204:12
    much of that was deeply moving.
  • 204:12 - 204:13
    Certainly Angel was a highlight.
  • 204:13 - 204:15
    So thank you Angel,
  • 204:15 - 204:17
    and anybody who can get
    that message to Angel,
  • 204:17 - 204:19
    I'm not sure if it's bedtime.
  • 204:20 - 204:22
    Thank you so much for raising your voices
  • 204:23 - 204:26
    and really being focused
    at a real municipal level.
  • 204:26 - 204:28
    This is what we've always asked for.
  • 204:28 - 204:31
    This is what I grew up doing, at about Angel's age,
  • 204:31 - 204:32
    watching these meetings
  • 204:32 - 204:35
    and having folks basically beg
  • 204:35 - 204:38
    for this kind of interaction
    from our communities.
  • 204:38 - 204:40
    And so I'm just deeply grateful
  • 204:40 - 204:43
    that whatever brought
    you here specifically,
  • 204:43 - 204:44
    I hope you stay engaged.
  • 204:44 - 204:47
    I hope you bring this
    energy to other issues
  • 204:47 - 204:50
    that really affect our
    communities on a daily basis
  • 204:50 - 204:52
    because it really makes change.
  • 204:52 - 204:53
    It really changes things.
  • 204:53 - 204:56
    And so thank you so much to
    everybody who took part in this,
  • 204:56 - 204:59
    thank you to all the
    advocates as a part of this.
  • 204:59 - 205:01
    Thank you councilor Wu for making sure
  • 205:02 - 205:04
    that there was a chair here
  • 205:04 - 205:07
    and to councilor Edwards for first
    running it so well earlier.
  • 205:07 - 205:08
    But thank you to you for ensuring
  • 205:08 - 205:10
    that we can continue to
    hear public testimony
  • 205:10 - 205:12
    because it really is valuable.
  • 205:12 - 205:14
    So with that, I thought
    that was all very valuable.
  • 205:14 - 205:15
    And also thank you to the commissioner
  • 205:15 - 205:16
    for taking the time to be here
  • 205:16 - 205:20
    and to really endorse this idea.
  • 205:20 - 205:21
    I thought that was fantastic.
  • 205:21 - 205:22
    So thank you.
  • 205:23 - 205:27
    Thank you, this is a great, great hearing.
  • 205:27 - 205:28
    I think we all learned so much
  • 205:28 - 205:31
    and I wanna echo the
    thanks to the commissioner,
  • 205:31 - 205:34
    all the community members who
    gave feedback along the way
  • 205:34 - 205:37
    and over this evening, my
    colleagues, the commissioner,
  • 205:37 - 205:39
    and of course, central staff,
  • 205:39 - 205:43
    for making sure all of it
    went off very, very smoothly.
  • 205:43 - 205:45
    So we'll circle back up
    with the committee chair
  • 205:45 - 205:49
    and figure out next steps,
    but hoping for swift passage.
  • 205:49 - 205:51
    And I'm very grateful again,
  • 205:51 - 205:54
    that we have an opportunity
    in Boston to take action.
  • 205:54 - 205:58
    So this will conclude our
    hearing on docket number 0683,
  • 205:58 - 206:01
    on an ordinance banning
    facial recognition technology
  • 206:01 - 206:02
    in Boston.
  • 206:02 - 206:04
    This hearing is adjourned.
  • 206:04 - 206:05
    [gavel strikes]
  • 206:05 - 206:07
    Bye, see you.
    Bye.