Propaganda and information manipulation in groups

As Russia’s attack on Ukraine continues, UB School of Management expert Kate Bezrukova examines how misinformation takes root in groups

A destroyed residential building in Kyiv, Ukraine. Photo: Алесь Усцінаў

By Kate Bezrukova and Chester Spell


As the Russian invasion of Ukraine has dominated world attention, I’ve been trying to understand what ordinary Russians think of it in as “unfiltered” a manner as possible—by asking personal contacts directly.

Some of these contacts are friends from my time in college who now live in Moscow. Like old friends everywhere, we’ve been supporting each other through the worst of Covid and talking about kids, books, and our jobs for years. So, just after the invasion began, I asked them directly, “What the heck has been going on with Russia since I left 22 years ago?” Would they, from a Russian perspective, be able to justify what is euphemistically called a “war of choice”?

My initial questions—to my surprise, perhaps borderline shock—were asked right back at me. Having grown up with firsthand experience of how Russian media operate and how limited independent news agencies are, I knew that the story this group was getting would be nothing like what we hear in the Western world. Yet I was still surprised by how they interpreted the war and its causes.

Our interactions over the next weeks included endless texts about denazification (even though many countries have a far-right presence), liberation, and Russia fighting two wars at the same time, with the U.S. cast as the biggest villain. There was also the conviction that Ukrainian modeling agencies were staging horrifying pictures of injured civilians for the Western press, that Ukrainians were barbarians shelling their own people, and, finally, the parroting of Vladimir Putin’s bizarre reading of history (and it is bizarre) that Ukraine is not a nation.

At first, I wondered how it was even possible for anyone, much less people I’ve cherished as friends for decades, to believe this nonsense in the 21st century. But then it occurred to me that in our interactions prior to the war, regular topics, like kids’ books or Russian authors, would spark a huge range of opinions—yet they had a surprisingly unified response regarding the war.

How misinformation takes root in groups

This got me thinking about several concepts and phenomena, extensively researched in the study of groups and teams, that could help explain what I was hearing.

For example, the concept of conformity can explain how and why group pressure leads to these responses and outcomes, as people bring their behavior into alignment with a group’s expectations and beliefs. (See Asch’s 1956 line experiment, in which 76% of participants made erroneous, conforming judgments about lines that were obviously of different lengths.)

Why do people conform? There are two main reasons:

  • The need to be right. The more people who hold a particular opinion, the more right that opinion appears to be. This is why many academic journals require three reviews before an editor accepts a paper, and why many competitions use multiple judges. Closer to our point here, it is why seeing the same interpretation of an event from different contacts in your social network can amplify a certain opinion. So, if I see one of my social media “friends” post something about a modeling agency staging a photo of a pregnant woman being carried out of a bombed maternity hospital, and then I see that same post from my other friends, I may start to believe it. Hearing it from enough people, I might even come to believe that a tree is red and not green.
  • The need to be liked. The tendency to agree with a group in order to feel like part of that group. This is a fundamental human motive that drives a lot of behavior in groups; it gives people the sense of being part of “something bigger than myself.” Governments and other players producing disinformation can tap into this need while also constructing a sense of victimhood, mobilizing group thinking in ways that distort reality.

Additionally, there is of course attribution error, or the mistakes we make in attributing motives to other people’s behavior. Another mode of distortion is confirmation bias, which describes how people look for information that supports their interpretation of an event and reject or dismiss anything that conflicts with it. In a recent, tragic example, when Russian people are shown photos of bombed civilian apartment buildings, they may dismiss that information because it does not fit the state-sanctioned narrative of “bombing only military objects.”

Finally, consider the role of moral transgression and the concept of the group mind. There is a long history of observed differences between individual and group behavior that can be traced back to Plato’s Republic. In more recent times, the discussion has centered on crowd behavior and the group mind. Gustave Le Bon argued that individuals who join crowds tend to regress to primitive mental states, become vulnerable to losing their moral standards and inhibitions, and become prone to competitive, barbaric acts, including violence—all presumably because of the emergence of a group mind.

Sadly, the invasion of Ukraine is not the first case of Russia’s deviant behavior, even in recent years. We’ve witnessed cyberattacks and various doping scandals at the Olympics, for example. But such nefarious activities are often justified with the claim that Russia is not a well-off country and therefore has some sort of right to use unethical means to level the playing field. We have seen parallels to this line of thinking in justifications of property crime as a way of redistributing wealth, especially in cases of extreme disparity.

How to overcome information distortion

All of these biases and sources of distortion likely played some role in shaping the opinions Russians have expressed, and they help explain why those attitudes are so strongly held and resistant to change. But what is one to do? How can an individual, apart from the group, combat these effects in the face of overwhelming evidence?

Here are three ways to overcome the biases surrounding information distortion:

  • Carefully consider decision alternatives and use all available information. While consulting other sources might not be practical in Russia right now (after all, that’s why independent news outlets have been silenced there), the rest of us can consume information from sources we would not naturally be inclined to read. If you lean to the left, read the Wall Street Journal or National Review regularly. If you are conservative, take a look at The Nation or the New York Times.
  • Analyze information by considering which facts support a given opinion, which contradict it, and which are neutral. I practice this approach with my students: they have to come up with a system for analyzing facts that helps them identify the key suspect stealing technology from a company.
  • Have moral standards. Here are some questions I put to my negotiation class: How would you feel if someone used this unethical tactic on you? Would you feel comfortable advising someone else to use this tactic? What would the result be in society if everyone bargained in this manner?

Returning to the question: how does one justify an invasion? Some of the justifications my friends gave me reminded me of Star Wars mythology, recalling Anakin Skywalker turning to the Dark Side and justifying violence to “maintain peace.”

The rationale I have been told for Russia’s invasion of Ukraine sounds all too familiar. The pity is that we are not talking about a Hollywood movie but real lives and massive human suffering due to a fallacious tale. That tale, because of the strong biases and tendencies at play, was relatively easy to spin—but will be hard to undo.

This article was first published by Psychology Today.

Kate Bezrukova, PhD, is an associate professor of organization and human resources in the University at Buffalo School of Management. She is an expert on team chemistry, managing a diverse workforce, negotiations and gender, and conflict management. Her research examines group faultlines, diversity, and conflict evolution and management.
