On Thanksgiving weekend, Brian Hofer and his brother, Jonathan, rented a car to drive north to visit their family in northern California.  Although they were driving the speed limit and obeying traffic signs, they were pulled over.  A police officer approached their car and asked them to step out of the vehicle and place their hands on the roof while he and two other officers pointed their guns at the brothers.  The rental car had been reported as stolen, and the officers' license plate reader had flagged it.  While pointing a gun at Brian's head, the police asked him to type his password to unlock his phone so that he could prove the car was a rental.  After fifteen minutes of being held at gunpoint, the police realized that the car had been mistakenly reported as stolen and released Brian and Jonathan.  Ironically, Brian Hofer, Chair of Oakland's Privacy Advisory Commission, works on policies for Oakland that aim to prevent exactly these kinds of misidentifications.

It is because of stories like these that the Center for Law, Innovation, and Creativity hosted a conference titled "About Face: The Changing Landscape of Facial Recognition," where keynote speaker Cyrus Farivar argued that it is time not only for law enforcement agencies to be more transparent about their use of technology, but also for citizens within their communities to ask more questions and get more involved in the process.

The United States regulates under the theory of technological exceptionalism: the idea that government and consumers should allow technology into the marketplace in the name of innovation and worry about its impacts after the fact.  This often means that technology is deployed long before regulators have any opportunity to push the pause button and map out the risks and benefits these technologies may bring.

Farivar often uses license plate readers as a parallel to law enforcement's use of facial recognition technology, because both are used to search through databases of information to help identify an individual and their location.  They also share many of the same flaws, such as misidentification.  Because of technological exceptionalism, the license plate readers police have used have historically led to highly offensive intrusions upon individual privacy.  Brian Hofer was held at gunpoint because his rental car was erroneously listed as stolen and a license plate reader picked up his car.  In situations like Hofer's, and other misidentifications by license plate readers or facial recognition devices, Farivar believes something as basic as having the police double check the output to determine whether the read makes sense, by simply checking the letters or photos the devices report back, "can go a long way to hopefully reducing and maybe even eliminating instances where people are erroneously pulled over, or worse held at gunpoint."

But while many people understand that the police carry guns or use radios, and could easily identify these tools if asked, Farivar says, "Most people don't know or wouldn't know what a license plate reader was if it was staring them in the face."  People cannot operate in a free and open society where they don't know or don't understand how, when, and why they are being watched.  He said, "I think that transparency can go a long way to making a case as to why it's important, because I think a lot of us, we have concerns or maybe are outright afraid of things that we don't know or don't understand."

San Francisco recently passed a bill banning the use of facial recognition software by the police and other agencies.  One of the main features of the San Francisco bill is public input and review of technology before law enforcement purchases it.  "I think it's important to have the law enforcement be more proactive than maybe historically they have been in terms of being very clear about what they want, why they have it, and what it's used for," Farivar said.  Cities across the country are looking to follow suit, including Somerville and Oakland.  Additionally, the Massachusetts ACLU has proposed a moratorium on facial recognition technology in conjunction with several bills pending in Massachusetts.

While Farivar predicted that the San Francisco bill would pass, he is uncertain how similar bills will fare in other cities.  Currently, neither San Francisco nor Oakland uses facial recognition technology, so other cities may be waiting to see how successful these bans will be.  For cities like Oakland, Farivar said, where crime is particularly serious, there is a need to properly balance interests, since "it's a city that unfortunately has over policing, particularly in communities of color."  He continued on this point, "I think many people want the police to solve particularly violent, serious crimes, but we also don't want them to police and be imposing when they don't have to be, and I think that's a really hard balance."


Looking toward the future, and what ordinary citizens can do, Farivar advocates for strong community engagement in the process, from selecting the technologies to reviewing annually how they have been used.  "I think that we need to collectively, as a society, go to our local law enforcement agencies and say, 'We understand you have this tool, we understand it's for a specific purpose, but we all want to be clear as to who has access, under what circumstances, how long the data is kept for.'  And to think about these questions ahead of time, because unfortunately, right now in America, it seems to me that those questions are not the norm."  For those who are interested in learning more, he also recommends signing up for emails from Oakland's Privacy Advisory Commission to keep on top of current news relating to surveillance technology.

For those in Massachusetts, consider joining the ACLU and supporting Bill H. 1538, which would establish a moratorium on unregulated government use of face recognition.  The Center for Law, Innovation, and Creativity, in conjunction with the student group the Law and Information Society, will be hosting future events regarding this bill as well.  Please email privacyandtech@gmail.com if you would like to be contacted regarding those events.

Interview and article by Christie Dougherty