Apple uses it to help users unlock the latest iPhones. Facebook uses it to identify friends and family members in the photos you post. And several airlines are using it instead of boarding passes to screen passengers.
It’s everywhere. But the proliferation of facial recognition technology is raising concerns among civil rights advocates and others who fear the technology will be used to conduct mass surveillance of innocent civilians. Critics also point out that the technology is less reliable at identifying women and people of color than at identifying white men, which can lead to false-positive identifications of racial and ethnic minorities in criminal investigations.
Now Portland City Councilors Pious Ali and Brian Batson want the city to formally ban the use of facial recognition technology and similar identification tools by city employees, including the city’s police department. If successful, Portland would be the first community in the state – and one of a small but growing number of cities around the country – to enact a ban on the technology.
Portland councilors are scheduled to take up the issue Monday. The discussion comes as Maine’s largest city has embraced so-called smart-city technology, which includes traffic monitoring and has stoked fears about government’s increasing ability to watch over citizens. And Portland’s police officers are now equipped with cameras mounted on their chests to capture video of each person an officer interacts with, another source of concern about privacy.
Portland’s smart city efforts so far include LED streetlights that can double as WiFi routers and sophisticated traffic management systems that monitor conditions and adjust signals in real time. Portland officials have expressed interest in other “public safety functions,” but so far, officials say that has not included facial recognition technology.
Ali would like to keep it that way.
“It’s a proactive way to say, ‘I don’t want this,’ ” Ali said. “It’s the right thing to do.”
Portland Police Chief Frank Clark did not respond to interview requests for this story. Clark last week delayed plans for Portland’s school resource officers to activate body cameras after school officials raised concerns about student privacy.
City Manager Jon Jennings said he was looking forward to the discussion but wants to keep the city’s options open when it comes to technology that could be beneficial in the future.
“City staff is eager to follow the discussion on facial recognition software at the council meeting on Monday,” Jennings said in an email. “While the city currently does not use facial recognition, there may be instances in the future where partners at the Jetport or on the waterfront may want to implement. The city currently does not plan to utilize this technology, but there may be needs in the future. Ultimately this will be up to the City Council to decide and we will follow their direction.”
A 2016 study by the Georgetown Law Center on Privacy & Technology said that 117 million Americans – about half of the country’s adults – already have their image in a law enforcement face recognition database, mostly from driver’s license or passport photos. At least 26 states allow law enforcement to run facial recognition searches. The report says police in Maine can search 24.9 million mugshots. And of the 52 police agencies nationwide believed to be using the technology, only four had a public policy about how the technology is used, researchers say.
Facial recognition has been around for decades, but technological advances in recent years, especially in 3-dimensional imaging, have made it far more widespread. While 2-dimensional images require ideal lighting and a subject facing the camera, newer technology can identify someone turned 90 degrees away from the camera.
It works by essentially mapping a person’s dominant facial features, such as eye shape and spacing, as well as jaw and nose lines. The software measures the distances between dozens of reference points, creating a unique profile similar to a fingerprint. The profile is then compared to the faces in existing databases, such as those that contain driver’s license, state ID or passport photos, or police mugshots. More advanced programs can also analyze skin texture to increase accuracy and, in some cases, account for different facial expressions, facial hair or eyeglasses.
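The matching step described above can be illustrated with a simplified sketch. The names, profile numbers and match threshold below are invented for illustration only; real systems compare high-dimensional feature “embeddings” produced by specialized software, not three-number lists.

```python
import math

def profile_distance(profile_a, profile_b):
    """Euclidean distance between two facial-feature profiles
    (lists of measurements between reference points)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(profile_a, profile_b)))

def best_match(probe, database, threshold=0.6):
    """Compare a probe profile against every enrolled profile and
    return the closest identity, or None if nothing is close enough."""
    name, dist = min(
        ((n, profile_distance(probe, p)) for n, p in database.items()),
        key=lambda item: item[1],
    )
    return name if dist <= threshold else None

# Toy database of enrolled profiles (illustrative numbers only).
enrolled = {
    "person_a": [0.10, 0.90, 0.40],
    "person_b": [0.80, 0.20, 0.50],
}

print(best_match([0.12, 0.88, 0.41], enrolled))  # close to person_a's profile
print(best_match([5.0, 5.0, 5.0], enrolled))     # too far from everyone: None
```

The threshold is the policy-relevant knob: set it loosely and the system produces more false matches of the kind critics warn about; set it strictly and it misses genuine matches, as happened with poor-quality investigative photos.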
China is currently experimenting with gait recognition technology, which can allow authorities to identify a person whose face is obscured by analyzing how he or she moves when walking. The proposed ordinance in Portland would not prohibit the use of gait recognition technology, but a resolution attached to it says facial recognition and “other biometric surveillance technology” pose a threat to civil liberties.
Facial recognition technology is used all over the world, with applications ranging from airport and border security to a soccer team in Denmark that uses it to identify unruly fans.
Perhaps the most extensive deployment is in China, where authorities are using it to monitor the Uighurs, a largely Muslim minority ethnic population. The New York Times reported in April that the technology is being used to track the group’s movements and alert police if a large group appears in a certain location.
In the U.S., the technology has been used to screen passengers at airports, including Miami International Airport. The Boston Herald reported in April that JetBlue was using facial recognition rather than boarding passes.
Paul Bradbury, director of the Portland International Jetport, said that JetBlue and Delta are using the technology at other airports, but not here.
“The (Portland) Jetport does not use this technology, but some of our airline partners have commenced facial recognition projects at other airports,” Bradbury said.
The U.S. Department of Homeland Security said in a report to Congress that it is working toward full deployment of the technology within the next four years “to account for over 90 percent of departing commercial air travelers from the United States.” That system is already being used to identify noncitizens who have overstayed their visas. It compares face scans at the airport to a database assembled from existing federal images, including from passports, visa applications or previous border encounters with U.S. Customs and Border Protection.
By the end of fiscal 2018, DHS said the technology was operational at 15 locations nationwide. Since its inception, over 2 million passengers on more than 15,000 flights have been scanned by the technology, which DHS says has a 98 percent match rate. And 7,000 people who have overstayed their visas have been biometrically confirmed. The U.S. Border Patrol also tested facial recognition technology at a Texas border crossing to identify people in vehicles moving under 20 mph.
In November, the American Civil Liberties Union filed a lawsuit against the federal government seeking information about how facial recognition was being used in public surveillance. The suit was filed in U.S. District Court in Massachusetts against the Department of Justice, Federal Bureau of Investigation and the Drug Enforcement Administration. It came after the agencies did not provide those documents in response to a public records request.
In the absence of federal laws governing facial recognition, some states and local municipalities are taking action to regulate the trend.
In May, San Francisco became the first city in the U.S. to ban use of the technology. Since then, Oakland, California, and Somerville, Massachusetts, have followed suit. And Cambridge, Massachusetts, is considering enacting a ban on use by its city government.
The state of California this fall passed a three-year ban on using facial recognition technology in police body cameras in the state. And Massachusetts, which the ACLU says has used facial recognition since at least 2006, is considering a moratorium on facial recognition and “other remote biometric surveillance systems.”
The Detroit Police Department, led by former Portland Police Chief James Craig, spent $1 million on facial recognition software and had been using the technology for a year and a half before drafting a policy on its use. The American Civil Liberties Union of Michigan has filed a public records request to learn more about how it is being used.
This fall, the board that oversees the Detroit police voted to restrict its use, prohibiting public surveillance and immigration checks that use the technology. The Detroit Free Press reported that facial recognition can only be used in response to violent crimes, including robbery, sexual assault, aggravated assault and homicide. One board member, Willie Burton, described the technology as “techno-racism.”
As momentum builds to restrict government uses of the technology, business groups and police associations are beginning to raise their voices to protect the ability to deploy it in the name of public safety.
In October, the U.S. Chamber of Commerce, along with eight other technology, airport and security groups, sent a letter to Congress, urging it not to ban facial recognition or impose a moratorium on its use. The group noted that the technology can enhance security and is becoming increasingly accurate.
“We are concerned that a moratorium on the use of facial recognition technology would be premature and have unintended consequences not only for innovation, safety, and security but for the continued improvement of the technology’s accuracy and effectiveness,” the group said. “Instead, we urge Congress to collaborate with all stakeholders to address concerns raised by facial recognition technology and provide a consistent set of rules across the United States.”
A similar open letter to Congress was published in September by tech companies.
The letters came after a congressional hearing in May, during which concerns about the technology’s infringement on civil liberties were raised.
Among those to testify before the House Committee on Oversight and Reform was Joy Buolamwini, a graduate researcher at the Massachusetts Institute of Technology who founded a group named the Algorithmic Justice League in Cambridge after being misidentified by the technology.
“I established AJL to create a world with more ethical and inclusive technology after experiencing facial analysis software failing to detect my dark-skinned face until I put on a white mask,” she said in her written testimony. Buolamwini said her research showed that women of color in some cases were misidentified one-third of the time.
“Real-world failures and problematic deployments including mass state surveillance, false arrests, and the denial of working opportunities remind us of what is at stake in the absence of oversight and regulation,” she said. “Congress must act now to protect the public interest.”
Kade Crockford, director of the ACLU of Massachusetts’ technology for liberty program, worked with Councilor Ali on Portland’s proposed ban.
“Face surveillance is dangerous when it works, and when it doesn’t,” Crockford said. “The government has no business using technology that can facilitate the mass tracking of our every movement in public space – every visit to an Alcoholics Anonymous meeting, a doctor’s office, city hall, or even just a friend’s house. But that’s exactly what this technology enables, in its most dangerous form.”
The Cumberland County Sheriff’s Office used facial recognition technology for a couple of years beginning in 2012 to compare investigative images to existing mugshots. But Sheriff Kevin Joyce said the technology worked only when investigators had a clear picture and the subject was facing the camera. They were unable to identify any robbery suspects with it because of poor picture quality or camera angles.
Joyce said he hasn’t stayed up-to-date on the technology. But he cautioned against an outright ban, saying that it could be another tool, along with other investigatory work, to help police solve crimes.
“There are going to be crimes in the future that come up that we’d maybe solve a crime by having that facial recognition,” Joyce said. “If it’s used for criminal justice and solving crimes and used ethically, I don’t see there’s an issue with it. But it needs to be used ethically and correctly.”
In July, Maine Secretary of State Matthew Dunlap announced that the state would not grant open-ended access to the state’s Real ID driver’s license database for investigations or facial recognition scans. However, the state will conduct specific searches on behalf of federal agencies if such a search is deemed appropriate.
Kristen Muszynski, spokeswoman for the secretary of state, said Maine law only allows the state to use biometric technology to issue licenses or state IDs – it does not allow the state to conduct facial recognition searches for outside groups, such as local, state or federal law enforcement. Staff is currently working on legislation that would allow it to conduct those searches under certain circumstances, but she didn’t know whether it would be ready for the upcoming session.
Muszynski said the rules would likely mirror an existing policy for conducting searches of the state’s old license and ID database, under which police agencies submit a written request with a subject’s name and/or address in order to get a photo to confirm an identity. The investigations unit of the Bureau of Motor Vehicles has processed 592 image requests this year, she said.
According to state records, Maine has issued 17,815 driver’s licenses and 727 state identification cards since July 1, when it began issuing Real ID credentials, whose photos can be used for facial recognition searches.
Sen. Shenna Bellows, D-Manchester, said she would like to see the state enact a moratorium on facial recognition so policies can be drafted, similar to the way the state handled police use of drones several years ago.
“No government agency should be deploying this technology without clear guidelines around protecting people’s constitutional freedoms and preventing abuse,” said Bellows, a former ACLU of Maine director. “One of my biggest concerns is that this technology can be used for massive, warrantless surveillance of public activities on a scale that we haven’t seen before.”
In Portland, some city councilors are looking to get ahead of the curve.
The ban proposed by Ali and Batson states that it is intended to “protect the privacy and civil liberties of residents.” In addition to frequently misidentifying women, young people and people of color, the councilors say the technology “may chill the exercise of free speech in public places” and “corrupt the core purpose of officer-worn body cameras by transforming those devices from transparency and accountability tools into roving surveillance systems.”
The council could vote on the ban Monday night. But some councilors have already expressed an interest in either referring the ban to a committee for additional work or sending it to a council workshop for further discussion before voting.
City Councilor Kimberly Cook said she supports the proposal, which would require city officials to seek the council’s permission to use the technology rather than allowing staff to implement it on their own. Cook also said the council should be playing a larger policy-setting role in the city’s entire smart-city initiative.
“To the extent that city staff are pursuing these initiatives, they should bring them to the council for consideration and guidance about whether this is a priority for the city,” Cook said. “So far, I have not been seeing that happen with smart-city technology,” including automated vehicles.
City Councilor Jill Duson said she plans to ask councilors to refer the ban to a committee because she would rather see an ordinance that’s tailored to Portland than simply adopting a standard ordinance used in other communities. Duson said she’d like the city to have a thorough discussion about facial recognition technology before setting a policy, much like it did with police body cameras.
“I would have real concerns about a precipitous ban,” she said. “But I also have concerns about precipitous implementation. For me, it has to be done in the context of our city.”