Rules needed to use facial recognition technology

Yomiuri Shimbun file photo
This photo, taken July 20 at Narita Airport, shows the facial recognition system that started full-scale operation.

Facial recognition technology is becoming more sophisticated thanks to artificial intelligence. Although there are high expectations the technology will improve safety and convenience, there is also the risk that it will be used to create a surveillance society. In recent years, there have been moves in Europe and the United States to regulate the use of facial recognition systems by law enforcement agencies. How will Japan approach the issue?

Each application of facial recognition technology presents different challenges.

For example, a passenger authentication system that came into full-scale operation at Narita Airport in July is less likely to raise concerns because its use is based on the consent of the passengers themselves.

Passengers who register at check-in can pass through security checks and boarding gates equipped with facial recognition technology. Only passengers who have given their explicit consent can use the system, and as part of security measures, the data is deleted within a certain period after boarding. Conventional procedures remain available for passengers who do not want to use the facial recognition system.

“This is a good example of achieving both convenience and privacy protection,” said Aimi Ozaki, a lecturer at Kyorin University who is knowledgeable about facial recognition systems. Ozaki also acknowledged that there are many problems with using the technology without people’s consent, particularly when it involves large numbers of the general public.

Surveillance cameras used in facial recognition systems can capture images of people and vehicles from a distance, so it is usually difficult to notice that data is being collected. It is virtually impossible to obtain the consent of each person being photographed in public spaces.

In an age when photos of people are widely circulated on the internet through social media, the spread of facial recognition cameras in every corner of society could lead to the loss of anonymity in public spaces and make it possible to track people’s behavior.

“If people feel like they are being watched, they will restrict their behavior. Society will wither as a result,” Ozaki said.

The United States and Europe are beginning to take a hard look at the use of such technology in police investigations.

In August last year, the Court of Appeal in Britain ruled that the use of facial recognition technology by the South Wales Police was illegal. Since 2017, the force had scanned the faces of about 500,000 people in and around the venues of soccer matches, matching them against watch lists of suspects and others. The court found problems including that it was unclear where the images could be captured and who could be placed on the watch lists.

In April, the European Commission announced a proposal to regulate the use of AI. The commission proposed a ban on real-time facial recognition in public spaces by law enforcement agencies, in principle, with some exceptions including searches for missing children.

Resistance has also been increasing in the United States, which has been actively using the technology in criminal investigations.

One of the turning points was the death of George Floyd in May last year, which triggered protests against the actions of the police. In some states, police were criticized for connecting to private security cameras to monitor protesters, and for using body cameras with facial recognition functions.

IBM, Amazon, and Microsoft have stopped providing facial recognition technology to the police, and states and cities around the U.S. are moving to restrict its use.

In Japan, the National Police Agency claims that it does not perform real-time facial recognition in public spaces, but details of how it uses the technology have not been made public.

Prof. Makoto Ibusuki of Seijo University, an expert in criminal procedure law, said: “The police need to increase the transparency of the use of facial recognition technology to prevent a repeat of the ‘GPS investigations,’” referring to investigations conducted by police using GPS devices without warrants, which resulted in a Supreme Court ruling in 2017 stating that the practice was illegal.

During the trial, it was revealed that the people involved had been ordered to keep the illegal use of GPS a secret for many years. The practice came to light by chance, after someone happened to find a GPS device during a vehicle inspection.

The use of GPS devices in police investigations was a new way of tracking suspects, and the devices were installed without warrants. The technology took the place of police officers tailing suspects, but offered far greater accuracy and efficiency.

The same holds for facial recognition: in terms of accuracy and the scope of surveillance, it is on another level. If the use of facial recognition technology spreads to public spaces, a warrant may be required to conduct investigations in some cases.

Transparency urged

Courtesy of East Japan Railway Co.
A sign at a train station informs passengers that a facial recognition camera is in operation.

The police have access to privately operated cameras installed in public places, such as those installed by railway operators.

As of March 2019, there were 94,000 such cameras nationwide, and 17 railway operators have dedicated lines to send images to the police in real time during emergencies. The system serves an important function in protecting public safety, but some operators have not disclosed the fact that such cooperation occurs.

East Japan Railway Co. started using cameras with facial recognition functions at train stations in July. According to the company, facial images of suspects, people on parole for serious crimes and people who act suspiciously are stored in its facial recognition system, which can identify such individuals among station users.

However, the company refuses to disclose the locations and numbers of these cameras, the criteria for storing facial images, and where and how it obtains the images.

Google said that over a 12-month period it had received more than 10,000 so-called geofence warrants from the U.S. government, compelling the company to disclose information that can be used to identify people who were near the site of a crime when it took place. A geofence warrant is not a targeted request for information, but rather a dragnet of data on all people who match certain criteria.

The practice greatly contributes to the efficiency of investigations, but at the same time it violates the privacy of countless individuals who are not criminals. It is essential that companies inform people when their information has been provided to the police, and that the data of those cleared of suspicion be discarded.

According to legal expert Ibusuki, “there is an urgent need to create rules that can be applied to data-driven investigations,” as the use of data collected through such technology as facial recognition systems will likely increase.