Countries using ‘hostile social manipulation’ to target enemies – report

Michelle Abad


The RAND report on hostile social manipulation says countries are using common disinformation practices to gain a competitive advantage

MANILA, Philippines – In global strategic competition, countries are compelled to use as many information outlets as they can, in the most effective ways possible.

A recent RAND report indicated that numerous online tools – including targeted social media campaigns, fake news, sophisticated forgeries, cyberbullying and harassment, as well as the spread of rumors and conspiracy theories – are instruments countries may use to inflict damage on targets like the United States.

These practices, referred to in the report as “hostile social manipulation,” are plays in information warfare by the countries deemed the technique’s leading authors: Russia and China.

The report defines hostile social manipulation as “the purposeful, systematic generation and dissemination of information to produce harmful social, political, and economic outcomes in a target area by affecting beliefs, attitudes, and behavior.”

Nature, goals

Hostile social manipulation is a modern form of a long tradition of propaganda. The report focused on the use of information to “shape perceptions and attitudes in other societies and achieve harmful effects.”

The report observed countries using both traditional and non-traditional media to disseminate their pro-government content and connect with their target audiences.

In the case of Russia, the researchers noted that President Vladimir Putin and members of his inner circle are “trained to view information through a specific lens.” Having built his career in the KGB, the Soviet Union’s now-defunct security agency, Putin has long held an interest in the government maintaining control over information. Various tools were used to sow discord, aggravate political divides, weaken faith in public institutions, and manipulate US political and social outcomes.

To cite an example, Russia used tools such as automated social media bots, political advertising on Facebook, and state-owned media channels to direct propaganda, as well as the targeted release of stolen documents, to influence electoral outcomes. (READ: Propaganda-spewing Russian trolls act differently online from regular people)

The report cited several studies indicating that Russian and Venezuelan social media accounts flooded Spain with pro-independence messages during the Catalan separatist crisis in 2017.

Meanwhile, China pursues both defensive and offensive goals, seeking to delegitimize critics of the Chinese Communist Party (CCP) as an “extremely tiny handful” often serving “hostile foreign forces.”

China’s efforts to shape the information landscape are reportedly becoming more active, with some direct or indirect support of websites or social media platforms that promote “China’s official narratives.” There are also reports of China spreading deceptive information and even fabrications designed to create social divisions in the United States.

Professional networking site LinkedIn isn’t safe from danger either, as Chinese state intelligence has also reportedly used the platform to gather information and establish relationships with key individuals.

Apart from election-related agendas, countries in Europe and Russia have also engaged in cyberharassment, “trolling,” stealing and leaking personal information, and other techniques to intimidate or discredit specific individuals or activist groups.

Such efforts may have other goals as well, such as provoking desired behaviors like terrorism or protest, or simply directing attention and sowing confusion.

Possible, imminent dangers

The RAND report said there was a need to understand whether these trends of disinformation are capable of generating dynamics that could have dangerous long-term effects on the stability of target countries. (READ: Information wars endanger civilization, say ‘Doomsday’ experts)

These acts fall under what can be referred to as “measures short of war.”

Direct cyberattacks on another state’s military, by contrast, can be considered a form of electronic warfare or cyberwar.

Hostile social manipulation also covers the efforts of extremist groups to promote radicalization among target populations to boost recruitment, though this does not necessarily apply to the cases of Russia or China.

The report found no clear-cut way to measure the impact or effectiveness of such attacks on social media. However, Chinese journals have devoted an increasing amount of attention to foreign public opinion and how to shape it, as shown in the report through visualized data.

It also reported that China resolved to multiply its efforts and devote more resources to controlling information. 

Russia’s operations, meanwhile, are said to be advanced, although they have not greatly affected the strategic positioning of states or boosted support for the policies Russia desires.

The report recommends that targets like the United States come up with an updated framework for organizing their thinking about the complex issue of infospheres manipulated by foreign powers.

Local online vulnerability

Organized disinformation very much exists in local infospheres as well, according to VERA Files co-founder Luz Rimban. (READ: Is the Philippines in step with Russian online propaganda warfare?)

“Some people or groups are messing with our minds,” she said. These acts consequently affect behaviors and actions, such as what people say, think, buy, or vote for. With social media, these messages are being disseminated faster than ever.

“With technology being developed the way it is now, targeting of people can be done on an individual level, which they call ‘behavioral microtargeting.’ The people behind the manipulation know you enough to be able to design a message and approach specific to you and people like you,” she said.

Rimban said the “poor quality” of media literacy in the Philippines gives rise to the danger of social manipulation. 

“We have people who share or repost without reading and thinking, who do not understand what they’re reading, can’t tell between satire and news, and basically lack critical thinking skills. Many do not understand what being on social media means,” she said.

Dismantling systems of disinformation is a job that goes beyond the responsibility of democracies and journalists alone.

“Everyone should be involved in addressing this problem – public relations practitioners, advertisers, educators, psychologists, and tech people,” Rimban explained.  – Rappler.com



Michelle Abad

Michelle Abad is a multimedia reporter at Rappler. She covers the rights of women and children, migrant Filipinos, and labor.