Addressing issues related to facial recognition technology (FRT) can seem incredibly daunting. The technology itself is difficult to wrap your head around because of its many variations: face detection, face scanning, complete face mapping, and face matching, to name a few. Kade Crockford has brought the Technology for Liberty Program at the ACLU of Massachusetts to the forefront of this discussion. Most recently, the program has been working to organize citizens around H. 1538, which seeks to impose a moratorium on the use of FRT and other biometric surveillance by law enforcement.

We are honored to have Kade Crockford join us at Northeastern University School of Law on May 10, 2019, for About Face: The Changing Landscape of Facial Recognition. Kade Crockford will join Dr. Chris Gilliard, Professor at Macomb Community College, and Brenda Leong, Senior Counsel & Director of Strategy at the Future of Privacy Forum, on the panel “Understanding the Social Impacts & Challenges of FRT.” This article will introduce Director Crockford’s work, provide insight into her perspective on social harms and technology, and explore the issues with oversight of new technology.

Who watches the watchers?

In 2016, the Boston Globe lifted the veil on the Boston Police Department’s (BPD) policy for body cameras.[1] The aspect of the policy most relevant to our conference is BPD’s pledge not to use FRT as part of the body camera deployment. The ACLU kept digging into FRT deployments, and in 2018 it filed a Freedom of Information Act (FOIA) request to learn about the federal government’s use of FRT, particularly by Customs and Border Protection (CBP) and the Transportation Security Administration (TSA).[2] These examples highlight one of the biggest issues with FRT: we rarely know when or how it will be deployed. Theoretically, any camera could be used to deploy FRT, so it is important that citizens have mechanisms in place to monitor the acquisition and deployment of FRT.

A couple of municipalities in Massachusetts have formed community groups to provide oversight of surveillance. The benefits of these groups come in the form of increased transparency and accountability for police departments. The Lawrence Police Department made several disclosures related to its “plan to install about 70 surveillance cameras” around town, including that FRT would not be deployed.[3] The arguments for deploying the cameras mirror those of FRT proponents, who want to convince us that the harms are mythical while the benefits to crime fighters come in the form of deterrence and investigative help.[4] The Cambridge City Council also implemented a requirement to “seek City Council permission before buying, acquiring, or otherwise using new surveillance technologies” and that the City Council “must…approve the policy to govern [its] use…”[5] These community control mechanisms are in the same vein as the Community Control Over Police Surveillance (CCOPS) model that Clare Garvie told us could be effective at controlling the unfettered use of FRT.

The Power of Surveillance

Monitoring the use of FRT is incredibly important in light of research that has revealed astounding failures of accuracy and racial bias, as well as the fact that Amazon did not submit Rekognition to the National Institute of Standards and Technology (NIST) for external evaluation. Amazon’s response indicates that the test was improperly conducted and that this led to the bad results. One of Amazon’s criticisms of the MIT study related to confidence thresholds, which tell the user how likely the system’s identification is to be correct. But Amazon has put out inconsistent statements about the appropriate threshold for policing, and we know at least one of its users had no idea how to configure this setting.[6]

Director Crockford acknowledged the difficulty in connecting problems with FRT to the biggest challenges people face in the US and abroad, like healthcare, housing, and a widening socioeconomic divide.[7] Still, Director Crockford emphasized that this shouldn’t be a binary choice for activists and organizers. According to Director Crockford, digital rights scholars and activists would be wise to join other groups and connect around the common resistance to the growth in state and corporate power. Examples like this letter to Amazon and the strong resistance to FRT deployment at a rent-stabilized complex in NYC provide guideposts for this type of connected effort.

Director Crockford also shared her concerns surrounding the Microsoft Principles and the idea that corporate responsibility is anything beyond mere puffery. While Microsoft publicly proclaims concern about FRT and its potential for abuse, particularly in the “lawful surveillance” principle, the company worked to kill legislation in Washington State that would have provided a moratorium similar to H. 1538 in Massachusetts. Microsoft has also been a key player in discussions of a data privacy bill in Washington that could have placed more stringent requirements on FRT use and the methods companies must use to obtain consent, but the bill failed. Dr. Gilliard also raised concerns about consent and control mechanisms, echoing the thoughts of our conference co-chair, Woodrow Hartzog.

An Urgent Need for Action

If we cannot trust companies to police themselves, then surely we can trust courts to rein in abusive practices, right? The problem with trusting courts, in Director Crockford’s view, comes in three forms: 1. We don’t know what the makeup of SCOTUS will be in the future, so we can’t depend on its decisions to expand on some of the concepts in Carpenter v. United States; 2. While we wait for cases to make their way through the legal system, millions of people will have their civil rights violated; and 3. Police departments across the world are developing their own systems of surveillance, and without legal action they will not need the assistance of the third-party doctrine or have to look anywhere outside their walls to decide what is appropriate.

So where do we turn? To really curb the potential for abuse and ensure that FRT is deployed responsibly, Director Crockford envisions an independent body that conducts the actual FRT analysis. Director Crockford referenced the Wiretap Act as a starting point for the standard that law enforcement must meet before making a request to the independent body to run a facial recognition scan. Removing these tools from the hands of law enforcement is politically explosive due to the grandiose tales of FRT’s successful deployments, but if we consider the incentives in place, it may be the only solution that makes sense while still allowing for FRT development. Police will always have an incentive to find the easiest way to surveil the most people in the quickest fashion. Their job is to catch criminals (to say nothing of the composition of low-level drug offenders incarcerated), and they will use any means necessary. Still, Director Crockford reminds us that, as a society, we must decide whether the means are necessary, and the discussion needs to happen now.

This closes our series introducing our six panelists and keynote speaker, Cyrus Farivar. We hope you have found these introductions to their work interesting, and we look forward to seeing you on May 10, 2019, for About Face: The Changing Landscape of Facial Recognition. Admission is free, and we hope you join us to continue the discussion of FRT deployments.

[1] Kade Crockford. Boston Police body camera policy forbids facial recognition. Privacy SOS. July 13, 2016. Available at

[2] Kade Crockford. ACLU and Boston-area technologists ask Trump admin for info about federal facial recognition surveillance programs. Privacy SOS. May 16, 2018. Available at

[3] Kade Crockford. Lawrence is set to expand its surveillance network. Here are the details. Privacy SOS. December 13, 2018. Available at



[6] Joy Buolamwini. Response: Racial and Gender bias in Amazon Rekognition – Commercial AI System for analyzing faces. Medium. January 15, 2019. Available at

[7] Interview with Kade Crockford. May 2, 2019. All references to Director Crockford’s opinions are from this interview.