Unbalanced information diet: AI-generated deception / Anger, Anxiety Created by Narratives Manipulate Public Opinion

The Yomiuri Shimbun
This photo taken in Chiyoda Ward, Tokyo, shows Christopher Wylie talking to The Yomiuri Shimbun online on Nov. 10.

This is the fourth installment in a series examining moves to spread biased information through the use of generative artificial intelligence, acts that threaten our democracy.

***

Christopher Wylie, former director of research at the now-defunct British data analysis firm Cambridge Analytica (CA), testified at a U.S. congressional hearing on May 16, 2018, about CA's role in the 2016 U.S. presidential election.

“Cambridge Analytica sought to identify mental vulnerabilities in voters and worked to exploit them by targeting information designed to activate some of the worst characteristics in people, such as neuroticism, paranoia and racial biases,” Wylie said in the hearing.

CA served as a consultant to former U.S. President Donald Trump's campaign in the 2016 U.S. presidential election and allegedly manipulated voting behavior. Wylie worked at CA until the autumn of 2014, researching social media and the application of psychological knowledge to manipulate public opinion. He later blew the whistle, exposing the firm's methods before the U.S. Congress and elsewhere.

CA used narratives to manipulate public opinion. The company used personal information on about 87 million people that had been illegally obtained from Facebook, spreading negative narratives such as "Immigrants are taking our jobs" to help Trump, who advocated anti-immigrant policies, win the election.

Narratives that stir up anger and anxiety influence people's minds, and if spread on social media, they can easily change people's behavior as well as politics and society, Wylie said in an online interview with The Yomiuri Shimbun in November.

Narrative wars, in which people spread rumors and impressions favorable to their own side, have intensified on social media over issues such as Russia's aggression against Ukraine and the conflict between Israel and Hamas.

Around 2014, it took psychologists a long time to create narratives, but today generative artificial intelligence can produce many different narratives in a matter of seconds, Wylie said.

Public opinion can now be manipulated more quickly and at a lower cost, and the threat to democracy is growing, he said, adding that Japan is no exception.

Opinion manipulation

Wylie, 34, shared inside stories of public opinion manipulation in a recent interview with The Yomiuri Shimbun. He also underscored the looming threat posed by generative AI.

In 2013, Wylie began working at a British company that was then the parent company of CA, utilizing his data analysis expertise. The firm was involved in military psychological warfare activities in the United States, Britain and other countries, and Wylie's role focused on developing strategies to counter Islamic extremists. In collaboration with psychologists, he conducted extensive research and found that individuals prone to anxiety or anger are more likely to be influenced by, and to share, radical content on social media than those without such emotional predispositions.

Wylie said he found that influencing public opinion simply required identifying the target audience and inciting their anxiety and anger; those people would then naturally disseminate the information he aimed to spread.

CA was later established with the addition of Stephen Bannon, who subsequently became an adviser to Donald Trump's presidential campaign. The company shifted its focus to studying the American public, and part of Wylie's responsibilities involved gathering data on the opinions and grievances of Americans. He recalled that Bannon often said they could change the politics and culture of the United States.

Wylie, psychologists and other employees traveled across the southeastern United States by car to conduct extensive interviews. In the interviews, poor white men voiced their grievances about losing their jobs to immigrants from Mexico and other neighboring countries and expressed concerns about declining public safety. Such individuals were also often critical of then-President Barack Obama's administration and expressed frustration and anger toward the increasing prominence of women and people of color in American society.

Psychologists developed negative narratives designed to provoke anxiety and anger among these people and divide public opinion. A notable example of the approach was "Build the Wall," which was intended to create a psychological barrier between immigrants and white men in the nation, thereby inciting confrontation between these groups.

CA, with the help of researchers, collected the personal data and posts of Facebook users and their friends, and then analyzed them using AI to categorize the users based on personality traits. The analysis revealed that disseminating negative narratives to those who felt dissatisfied and alienated from society could effectively stir up anger and anxiety. These insights were compiled into a report and submitted to Bannon.

The narrative of “Build the Wall” was often used by the Trump camp during the 2016 presidential campaign.

“We were looking at how can we develop tools to … protect democracy in the first place. … In my opinion, what they [CA] were doing was incredibly unethical and [would] harm or destroy democracy,” Wylie recalled. He blew the whistle on CA out of regret over his involvement.

Wylie believes that AI-generated narratives will be used in elections around the world, and that Japan will be no exception, as generative AI transcends language barriers.

“If you use it [a narrative] to manipulate people, it’s really an effective way to do that, especially when you combine it with generative AI and SNS,” said Wylie.

Wylie then continued: “Japan where it sits, like, literally geographically … to your north is Russia. And then on the other side is China. And so you’re an island that’s squashed between two geopolitical adversaries. … Japan … has an aging population. And when you look at research on people who are more vulnerable to believing disinformation, it’s not young people.”

Unbalanced thinking

“CA may have even understood the mechanisms of the brain and targeted people who are susceptible to narratives,” Prof. Hajime Mushiake, an expert in neuroscience at Tohoku University, said.

According to Mushiake, the human brain maintains a balance between logical and emotional thinking. When people feel lonely or anxious, however, that balance is disrupted: logical thinking is suppressed, making them more easily influenced by negative narratives such as the ones CA promoted.

“Failing to recognize other viewpoints can lead to extreme actions, such as the attack on the U.S. Congress,” Mushiake said.

“It’s important to always acknowledge diverse opinions. Understanding the pitfalls of social media and cultivating the ability to make sound judgments is also essential.”

Undemocratic narratives

Since 2022, Prof. Tetsuro Kobayashi, an expert in public opinion research at Waseda University, has been studying how Japanese people are affected by undemocratic narratives that might be promoted by countries such as Russia and China.

Kobayashi conducted a survey of 2,400 Japanese men and women aged 18 to 79, in which he presented a scenario of a 17-year-old girl suffering from the oppression of ethnic minorities in the Xinjiang Uyghur Autonomous Region, framed both in a pro-China narrative and in a pro-democracy narrative. More respondents expressed understanding of the pro-China narrative, and Kobayashi is now analyzing the factors behind this.

“The development of AI is making it easier to create narratives that are more acceptable to Japanese people. Narratives are a powerful means of manipulating opinions, and we must be aware of this possibility,” Kobayashi said.

The increasing use of generative AI now poses an imminent threat in the form of opinion manipulation in Japan’s digital space.