
JR East’s facial recognition system watch list in spotlight

The Yomiuri Shimbun
A sign notifies commuters that facial recognition cameras are in operation at a JR East station.

East Japan Railway Co. (JR East) has been using facial recognition technology at stations and other locations since July that enables it to monitor individuals on a surveillance watch list that includes people such as former prisoners, it has been learned.

According to JR East and other sources, the AI-equipped facial recognition system can be used at locations such as JR East stations to monitor people who have served sentences for serious crimes, as well as wanted suspects and people behaving suspiciously. The company said it conducts baggage checks on people flagged by the system if necessary.

However, JR East’s use of the technology is likely to spark controversy, as it involves monitoring the movements of people who have finished serving their sentences and restricting their activities.

Images of people on JR East’s watch list are stored in a database, which its facial recognition system uses to automatically cross-check people captured by security cameras operated by the company. A total of 8,350 networked cameras are installed at locations including 110 major stations and electric power substations, but JR East has not disclosed how many are currently being used under the system.

JR East receives information from the Public Prosecutors Office based on a notification system in which victims, witnesses and operators of relevant sites are informed when offenders have completed sentences or are released on parole.

When JR East receives information about the release of offenders who committed serious crimes that involved JR East or its passengers, it saves their names, details of their offenses and mug shots that were published by media outlets at the time of their arrests. As of early September, there were no such cases registered on the database, according to sources.

The watch list does not cover crimes such as groping or theft.

When the cameras detect subjects, such as suspicious people or wanted suspects, security guards visually confirm their identity and, if necessary, alert the police or inspect their baggage.

JR East announced the introduction of facial recognition cameras on July 6 as part of terrorism countermeasures for the Tokyo Olympics and Paralympics, and the cameras went into operation on July 19.

Information about the company’s use of facial recognition cameras is clearly indicated on its website and in stations. However, the company does not state that its watch list includes former prisoners and parolees.

Under the Act on the Protection of Personal Information, criminal records and other sensitive information are classified as requiring special care, and obtaining such information without the consent of the individual is prohibited.

However, exceptions are made for cases based on laws and regulations such as the victim notification system.

“This is a necessary measure that puts the safety of passengers first,” a JR East official said. “We can’t disclose the details for security reasons. We are thoroughly managing the information.”

Europe, U.S. devising guidelines

Facial recognition cameras can collect biometric data remotely without subjects being aware. The data can then be used to conduct sophisticated surveillance by linking it to other types of information, such as travel and purchase histories.

If used responsibly, facial recognition cameras can improve public safety, but there are concerns that the significant infringement on people’s privacy could have a chilling effect on society.

The Yomiuri Shimbun
An artist’s illustration: security surveillance

In light of this, efforts to develop rules specifically about facial recognition cameras are underway in Western countries.

Under the General Data Protection Regulation (GDPR), which is equivalent to Japan’s Act on the Protection of Personal Information, the European Union defines biometric data, including information on facial features, as “special categories of personal data” and prohibits the handling of such data without the consent of the individual. The EU has also created guidelines specific to facial recognition cameras.

The EU’s draft AI regulation, released in April this year, also calls for strictly restricting the use of facial recognition cameras in public spaces.

In July, a Spanish retail chain operator was fined about ¥300 million for violating the GDPR by using facial recognition cameras to monitor people who had committed robbery and other crimes in its stores and were later released from prison.

British police operated facial recognition cameras in high-crime areas, with an external auditing organization monitoring their use. However, a court ruled in August last year that the use of the cameras in one case was unlawful because there was no clear guidance on whom to target or where to install such cameras.

Some U.S. states are also considering regulating facial recognition cameras, and several have already passed such laws.

Japan’s Act on the Protection of Personal Information treats facial feature data as information equivalent to photographs and does not provide higher levels of protection. Under the law, consent from individuals is not necessary to obtain such biometric data; public notifications are sufficient.

However, the Personal Information Protection Commission has interpreted the law to mean that such notifications and announcements are unnecessary in the case of security cameras.

As few business operators disclose information about their security cameras, it is difficult to gauge how widely facial recognition technology is used, which has prompted concern.

Amid such concerns, the commission changed its interpretation this month. From April next year, users of facial recognition technology will be required to notify the public of the purpose of its use.

However, there will be no need to disclose details or obtain consent.