Saturday, July 29, 2023

BIRGing (Basking in Reflected Glory): How Our Social Identities Affect Our Perceptions

In a classic study from the 1970s (Cialdini et al. 1976), researchers found that people are more likely to wear team paraphernalia after “their” team wins than after it loses. Known as BIRGing (Basking in Reflected Glory), this phenomenon is a benign and somewhat humorous example of how our social identities shape how we think of ourselves. Decades of research (see references below) have found that our sense of who we are and our feelings of self-worth are due, in part, to our group and organizational affiliations. It isn’t only our own accomplishments that drive our sense of self; it’s also the accomplishments of the groups with which we affiliate. Our membership in these groups affects our perceived status in the world and our self-esteem.

Notably, researchers have found that we typically evaluate members of our own group more favorably than we do out-group members. Early studies focused on how this phenomenon feeds prejudice, but in recent years studies have examined how it drives political polarization (Mason, 2018; Bail, 2021) and lone-wolf terrorism (Sageman, 2017). Nationalism, or at least its more pernicious manifestations, can also reflect it.

Which group membership is salient at a particular time depends on the situation. However, researchers have found it’s easy to “trigger” (activate) a social identity. Consider this quote from Lev Golinkin, a Ukrainian writer living in the U.S.:
President Vladimir Putin of Russia, who denies that Ukraine is a sovereign nation, is waging far more than a physical war: He, like his predecessors in the Kremlin, is working to erase the very concept of Ukraine from existence. With each new report of a Russian bombing, I find myself becoming more Ukrainian, seizing the identity that first the Soviet Union—and now Russia—has long fought to suppress. (“The Ukraine of My Childhood Is Being Erased”)
In my previous post on social media and political polarization, I noted that Chris Bail has found that social media offers us platforms we can use to “enhance” our self-esteem in ways we cannot achieve on our own. People who troll others on social media don’t do it to change the minds of others but to impress those within their online “communities,” however small or large those communities may be.

Documenting this phenomenon doesn’t solve problems such as prejudice, terrorism, or political polarization. However, the more we know about it, the less likely we are to succumb to it mindlessly.

References

Bail, Christopher A. 2021. Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing. Princeton, NJ: Princeton University Press.

Cialdini, Robert B., Richard J. Borden, Avril Thorne, Marcus Randall Walker, Stephen Freeman, and Lloyd Reynolds Sloan. 1976. “Basking in Reflected Glory: Three (Football) Field Studies.” Journal of Personality and Social Psychology 34:366-75.

Mason, Lilliana. 2018. Uncivil Agreement: How Politics Became Our Identity. Chicago and London: The University of Chicago Press.

Miller, Kevin P., Marilynn B. Brewer, and Nathan L. Arbuckle. 2009. “Social Identity Complexity: Its Correlates and Antecedents.” Group Processes & Intergroup Relations 12(1):79-94. doi: 10.1177/1368430208098778

Sageman, Marc. 2017. Misunderstanding Terrorism. Philadelphia, PA: University of Pennsylvania Press.

Sherif, Muzafer, O. J. Harvey, B. Jack White, William R. Hood, and Carolyn W. Sherif. 1988. The Robbers Cave Experiment: Intergroup Conflict and Cooperation. Middletown, CT: Wesleyan University Press.

Tajfel, Henri. 1970. “Experiments in Intergroup Discrimination.” Scientific American 223(5):96-103.

Tajfel, Henri, and John C. Turner. 1982. “The Social Identity Theory of Intergroup Behavior.” Pp. 7-24 in Psychology of Intergroup Relations, edited by S. Worchel and W. G. Austin. Chicago, IL: Nelson-Hall.

Saturday, July 22, 2023

Social Media and Political Polarization

In a recent paper about the QAnon conversation on Twitter (“A Network Analysis of Twitter’s Crackdown on the QAnon Conversation”), Dan Cunningham and I concluded that social media platforms such as Facebook, Twitter, and TikTok could adjust their algorithms to expose users to multiple points of view and potentially help them form ties with those who don’t share their opinions. As an illustration, we relate the story of Ashley Vanderbilt, who, after the November 2020 election, spent much of her time consuming QAnon-related social media. By “inauguration day, she was convinced that if… Joe Biden took office, the United States would literally turn into a communist country” (O’Sullivan 2021). This occurred because she kept clicking on videos suggested by TikTok:
It’s there, she says, that she was first introduced to QAnon… She mostly followed entertainment accounts on the platform, but as the election neared she began interacting with pro-Trump and anti-Biden TikTok videos. Soon, she says, TikTok’s “For You” page, an algorithmically determined feed in the app that suggests videos a user might like, was showing her video after video of conspiracy theories. (O’Sullivan 2021)
We assumed that many people who are active on social media have become “trapped” in echo chambers that reinforce and radicalize their views, and thus that exposing them to alternative viewpoints would help them step outside those echo chambers and moderate their views.
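To make that idea a bit more concrete, here is a minimal, purely illustrative sketch of the kind of algorithmic adjustment we had in mind: a feed “re-ranker” that blends a small share of posts from outside a user’s usual viewpoint into an otherwise engagement-ranked feed. The function, the data fields, and the mix_rate knob are all invented for illustration; they come neither from our paper nor from any actual platform.

```python
import random

def diversify_feed(ranked_posts, user_leaning, mix_rate=0.2, seed=None):
    """Toy illustration of a feed re-ranker that interleaves cross-cutting posts.

    ranked_posts : list of dicts like {"id": ..., "leaning": "left"/"right"/"center"},
                   already ordered by the platform's engagement-based ranking
    user_leaning : the user's inferred political leaning (e.g., "left")
    mix_rate     : rough share of cross-cutting posts to blend in (an invented knob)

    NOTE: An invented illustration, not any platform's actual algorithm.
    """
    rng = random.Random(seed)
    in_group = [p for p in ranked_posts if p["leaning"] == user_leaning]
    cross_cutting = [p for p in ranked_posts if p["leaning"] != user_leaning]

    n_cross = int(len(ranked_posts) * mix_rate)
    chosen = rng.sample(cross_cutting, min(n_cross, len(cross_cutting)))

    # Keep the familiar content but interleave the cross-cutting posts at
    # regular intervals rather than burying them at the bottom of the feed.
    feed = in_group[:]
    step = max(1, len(feed) // (len(chosen) + 1))
    for i, post in enumerate(chosen):
        feed.insert(min(len(feed), (i + 1) * step), post)
    return feed


# Example: a left-leaning user still sees mostly left-leaning posts,
# but with a few right-leaning posts interleaved throughout the feed.
posts = ([{"id": f"L{i}", "leaning": "left"} for i in range(8)]
         + [{"id": f"R{i}", "leaning": "right"} for i in range(8)])
print([p["id"] for p in diversify_feed(posts, "left", seed=42)])
```

The only point of the sketch is the mechanism: instead of ranking purely on predicted engagement, the feed reserves a few slots for content from outside the user’s echo chamber.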

Our argument has an intuitive logic, but research by Chris Bail of Duke University, who heads the “Polarization Lab,” challenges this line of thinking. Specifically, he and his collaborators have found that exposing people to alternative viewpoints has the opposite effect. When people step outside their echo chambers, they experience it as an “attack upon their identity” (“Breaking the Social Media Prism,” p. 31). He tells the story of Patty, a moderate-to-progressive Democrat:
Patty did not focus on the moderate messages retweeted by center-right Twitter accounts. Rather, she was captivated by the uncivil or ad hominem attacks on Democrats by several of the more extreme conservatives [that were] retweeted. The worst of these attacks were previously obscured by her echo chamber, but now Patty was experiencing the full scale of partisan warfare for the first time… Patty came to realize that there was a war going on, and she had to choose a side (pp. 31-32).
Bail relates similar stories of political conservatives whose views became more radicalized when exposed to retweets from moderate and liberal Democrats. He notes, “For both types of people, stepping outside the echo chamber was not creating a better competition of ideas, but a vicious competition of identities” (p. 39).

He contends that social media platforms offer us an outlet we can use to “enhance” our sense of who we are. Decades of research have found that membership in social groups (e.g., political parties, faith communities) can enhance our self-esteem in ways we cannot achieve on our own. This is “often driven by the process of drawing boundaries between ourselves and others we deem to be less capable, honest, or moral. The sense of superiority that we derive from categorizing people into groups of ‘us’ and ‘them’ fulfills our intrinsic need for status” (p. 49).

Bail and his colleagues found that those who engage in extreme online behavior (i.e., “trolls”) don’t do it to change the minds of others, but to impress people within their online “communities,” regardless of how small or large they are. The people most likely to troll others are those who feel “marginalized, lonely, or disempowered in their off-line lives” (p. 66):
Social media offer such social outcasts another path. Even if the fame extremists generate has little significance beyond small groups of other outcasts, the research my colleagues and I conducted suggests that social media give extremists a sense of purpose, community, and—most importantly—self-worth (pp. 66-67).
If you’re wondering, they document examples of this for people on both sides of the political spectrum—in other words, this isn’t just a conservative or liberal thing. Bail and his colleagues also found that
  • Online extremism tends to drive moderates offline or to convince them that the rewards of posting their opinions online are less than the costs. People who post right-of-center or left-of-center opinions are often attacked and/or harassed by extremists. And not all of these attacks come from people on the other side of the political spectrum. Moderates are often attacked by extremists from their own side for not being sufficiently conservative or liberal (p. 79).
  • Thus, while social media provides extremists “with a sense of status they lack in their everyday lives,” for moderates, “the opposite is often true.” Discussing politics online simply isn’t worth it (p. 77).
  • Most people don’t discuss politics online. For example, “Across all tweets from U.S. adults, just 13% focused on national politics.” And those who do “are mostly extremists” (p. 82).
  • The relative absence of moderate views in online discussions has led to what is known as “false polarization,” which is “the tendency for people to overestimate the amount of ideological differences between themselves and people from other political parties” (p. 75).
  • The “partisan perception gap—that is, the extent to which people exaggerated the ideological extremism from the other party—was significantly greater among those who used social media to get their news” (p. 76).
  • Research has found that issue polarization is not as great as social polarization. That is, people are not as far apart on particular issues as many of us assume we are (e.g., see “Uncivil Agreement” and “The American left and right loathe each other and agree on a lot”).
  • Although stepping outside our echo chambers (for those of us who are in them) can push us toward more extreme positions, how we respond to alternative views is a function of the “distance” between them and our preexisting ideas. If they are within “our latitude of acceptance (a range of attitudes about a given issue that an individual finds acceptable or reasonable even if they don’t agree with them a priori), then people will be more motivated to engage with the viewpoint and perhaps even move closer to it” (p. 108).
Based on their findings, Bail and his colleagues argue that moderate voices are crucial for keeping online discussions more civil (less polarizing). After all, most Americans embrace moderate views, although, as we saw above, false polarization keeps them from seeing this. Thus, they have concluded that we need social media platforms where moderates feel welcome and extremists are not rewarded for their abusive behavior.

To this end, they conducted an experiment to see if they could create a social media platform that encouraged less polarizing online discussions. They created a mobile app called DiscussIt, where two people could discuss issues anonymously. For the experiment, they recruited 1,200 Republicans and Democrats, who were assigned a particular topic to discuss and then (unbeknownst to them) matched with someone from the opposing party. Bail et al. were encouraged by the results:
The results of the experiment make me cautiously optimistic about the power of anonymity. People who used DiscussIt exhibited significantly lower levels of polarization after using it for just a short time. Many people expressed fewer negative attitudes toward the other party or subscribed less strongly to stereotypes about them. Many others expressed more moderate views about the political issues they discussed or social policies designed to address them… Most surprising to me, however, is that an overwhelming majority of people told us they enjoyed using our social media platform, even though they had no incentive to do so… Several users even asked how much the app would cost when it is released to the public (p. 125).
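For readers who like to see the mechanics spelled out, here is a toy sketch of how that kind of anonymous, cross-party pairing might be set up. It is my own illustration (the function, field names, and example topics are invented); it is not Bail and his colleagues’ actual code or procedure.

```python
import random

def pair_across_parties(participants, topics, seed=None):
    """Toy sketch of an anonymous, cross-party pairing step.

    participants : list of dicts like {"id": "user123", "party": "R" or "D"}
    topics       : list of discussion topics to assign to each pair

    NOTE: An invented illustration, not Bail et al.'s actual procedure.
    """
    rng = random.Random(seed)
    republicans = [p for p in participants if p["party"] == "R"]
    democrats = [p for p in participants if p["party"] == "D"]
    rng.shuffle(republicans)
    rng.shuffle(democrats)

    pairs = []
    for rep, dem in zip(republicans, democrats):
        pairs.append({
            # Each participant sees only a pseudonym and the assigned topic,
            # never the partner's party affiliation.
            "members": (rep["id"], dem["id"]),
            "topic": rng.choice(topics),
        })
    return pairs
```

The point of the design is simply that neither person knows the other’s party, so the conversation can unfold without immediately triggering the partisan identities discussed above.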
For now, social media platforms like DiscussIt aren’t available to the public, but we can still take constructive steps toward making social media less polarizing:
  • Don’t attack, harass, or troll people who disagree with your views. We can’t control what others do, but we can control what we do. If you can’t say it civilly, don’t say it. You won’t change anyone’s mind by behaving like a jerk.
  • Don’t “Like,” retweet, or even comment on extremist online comments. Most are posted by people seeking attention and status. Deny them that satisfaction.
  • Patronize social media platforms where polarizing discussions are few and far between. This could encourage other platforms to follow suit (clearly, a long-term strategy).
  • If you do engage in a political discussion with someone online, choose someone who appears to be within your “latitude of acceptance” (and vice versa). If you do, you might find yourself in the midst of a constructive (and civil) conversation.
Finally, pay Chris Bail’s “Polarization Lab” a visit. Better yet, read his book. It includes a number of suggestions for making social media more constructive and civil. It isn’t cheap since it is published by an academic press (Princeton). However, used copies can be found, and your local library should be able to track down a copy.

Articles and Books Cited:

Christopher A. Bail. 2021. Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing. Princeton, NJ: Princeton University Press.

Daniel Cunningham and Sean F. Everton. 2022. “A Network Analysis of Twitter’s Crackdown on the QAnon Conversation.” Journal of Social Structure 23:4-27.

The Economist. 2023. “The American left and right loathe each other and agree on a lot.”

Lilliana Mason. 2018. Uncivil Agreement: How Politics Became Our Identity. Chicago: The University of Chicago Press.