Japan Institute Researchers Devise New Way to Fight Deepfakes
15:59 JST, August 30, 2023
A method to restore deepfakes to their original content has been developed by a team at the National Institute of Informatics (NII).
The misuse of these convincingly realistic fake videos created using artificial intelligence software has become an issue affecting society. For example, a person’s face can be replaced by that of a celebrity or politician and the voice and facial expressions can seem to show something the celebrity or politician never said or did.
Ahead of the U.S. presidential election next year, controversy has arisen after the campaigns of former U.S. President Donald Trump and current Florida Gov. Ron DeSantis posted fake videos on social media in an attempt to manipulate voters’ impressions. Both are vying to win the Republican Party’s nomination for the White House.
In recent years, it has become increasingly easy to generate deepfakes using apps and other computer programs. As technology for creating deepfakes becomes more sophisticated, it becomes more difficult to detect what is being faked.
Given the situation, the Japanese government has supported, through subsidies and other assistance, efforts by universities and research institutes to develop technology to detect fake videos.
The team at NII, which is under the jurisdiction of the Education, Culture, Sports, Science and Technology Ministry, will look to use its method on deepfakes of public institutions and celebrities.
The method involves embedding identifying data about a person’s facial features in a video in advance. Even if the face is later merged with that of a different person, the embedded information can be used as clues to restore the original face.
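The article does not describe the team’s algorithm in detail. As a purely conceptual illustration of the general idea of planting recoverable identifier data inside pixel values before a video is shared, the sketch below uses simple least-significant-bit (LSB) steganography; the function names, the identifier format and the approach itself are assumptions for illustration and are not the NII team’s actual method.

```python
# Minimal conceptual sketch: hide a recoverable identifier in an image frame
# via least-significant-bit (LSB) embedding. NOT the NII method; illustration only.
import numpy as np

def embed_identifier(frame: np.ndarray, identifier: str) -> np.ndarray:
    """Hide a UTF-8 identifier string in the least significant bits of a frame."""
    bits = np.unpackbits(np.frombuffer(identifier.encode("utf-8"), dtype=np.uint8))
    flat = frame.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("Frame too small to hold the identifier")
    # Overwrite the LSB of the first len(bits) pixel values with identifier bits.
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(frame.shape)

def extract_identifier(frame: np.ndarray, length: int) -> str:
    """Read back `length` bytes of hidden identifier data from a frame."""
    bits = frame.flatten()[: length * 8] & 1
    return np.packbits(bits).tobytes().decode("utf-8")

# Usage: tag a synthetic 64x64 grayscale "frame" and recover the tag.
frame = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
tagged = embed_identifier(frame, "face-id:0042")
print(extract_identifier(tagged, len("face-id:0042")))  # -> face-id:0042
```

Real watermarking systems of this kind would need to survive compression, cropping and the face-swapping process itself, which a naive LSB scheme does not; the snippet only shows the embed-then-recover principle.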
The team said it tested its method on about 200 deepfakes and successfully restored most of them with high accuracy.
“Like a vaccine administered in advance to fight off an illness, our method can be used as a ‘cyber vaccine’ to prevent the misuse of fake videos,” said team leader Isao Echizen, the director of NII’s Information and Society Research Division.
There are, of course, still issues to be dealt with. For example, the method cannot restore a video if the person is not facing forward or if the image is dark.
As the team continues to improve the accuracy of its method, it is considering offering the technology to enterprises and other outside entities upon request.
Toshihiko Yamasaki, a University of Tokyo professor and expert on deepfake research, called the institute’s method “an achievement that can serve to deter the misuse of fake videos.”
“If more complex information is embedded in a video, security is likely to be further enhanced,” he said.
"Society" POPULAR ARTICLE
-
Typhoon Shanshan Forms, Slowly Moves Toward Japan; Govt Says Typhoon No. 10 Likely to Approach Japan Next Week
-
Typhoon Ampil Approaching Japan
-
Shizuoka Pref. City Offers Foreigners Free Japanese Language Classes; Aims to Raise Non-Natives to Daily Conversation Level
-
Typhoon No. 10 Forecast to Develop; Move into Pacific Ocean South of Japan on Aug. 26
-
Strong Typhoon Shanshan Predicted to Approach Western, Eastern Japan Earliest on Wednesday
JN ACCESS RANKING
- Nankai Trough Megaquake Tsunami could Hit in 2 Minutes; Japan Authorities Urge Caution after Recent Earthquake
- Typhoon Shanshan Forms, Slowly Moves Toward Japan; Govt Says Typhoon No. 10 Likely to Approach Japan Next Week
- Typhoon Ampil Approaching Japan
- Shizuoka Pref. City Offers Foreigners Free Japanese Language Classes; Aims to Raise Non-Natives to Daily Conversation Level
- Typhoon No. 10 Forecast to Develop; Move into Pacific Ocean South of Japan on Aug. 26