Have you ever tried to engage in a balanced discussion or debate over, say, macroeconomics or foreign policy — subjects that are complex and contingent upon many factors — only to find your attempts frustrated by an extreme, unyielding ideological viewpoint?
Well, the next time you’re confronted with an extreme (and typically over-simplified) political viewpoint, you might try asking that person to explain their position — that is, to detail how they think a certain policy or law actually works. You might just find that their extreme view shifts to one that is more moderate and balanced.
This shift from an extreme view to a less extreme one is due to the ‘illusion of understanding’, according to new research published in the journal Psychological Science.
How to Explain the Rise and Ubiquity of Polarized Politics?
The widely perceived increase in political polarization in the US in recent years prompted a team of psychological scientists (Fernbach et al.) at the Leeds School of Business, University of Colorado, Boulder, to explore some of the factors that may be contributing to this rise in extreme, polarizing politics.
“We wanted to know how it’s possible that people can maintain such strong positions on issues that are so complex — such as macroeconomics, health care, foreign relations — and yet seem to be so ill-informed about those issues,” said Philip Fernbach, lead researcher on the study.
Previous research on voter viewpoints suggests that this ‘illusion of understanding’ stems from a misapprehension (the ‘illusion’) of one’s own knowledge of a subject — that is, people tend to believe that their understanding is greater than it actually is.*
Apparently, when a person holding an extreme view is tasked with explaining the “nuts and bolts” of how a given policy or law works, they are forced to confront the fact that they do not know as much about the policy as they had previously thought — prompting them to moderate or soften their former extreme view.
Welcome news, to be sure, but not that surprising, really.
* This misapprehension about one’s own knowledge seems very similar to the ‘Dunning-Kruger Effect’ in which a person’s incompetence prevents them from recognizing their incompetence.
The Experiments – Explaining Your Viewpoints
In the first of two experiments conducted by the research team, participants were given an online survey to assess how well they understood six different policies (e.g., raising the qualifying age for Social Security, implementing a national ‘flat tax’, or instituting merit-based pay for teachers). Following this self-rating, each participant was randomly assigned two of the policies to actually explain, and was then asked to rate again how well they thought they understood the issues.
Data from these self-assessments corroborated the researchers’ prediction: in general, people reported lower understanding of all six policies after being tasked with explaining them. Further, results showed that as participants’ understanding of a policy decreased, their uncertainty about their position increased, and their position became less extreme.
These results held true for “views” spanning the mainstream political spectrum — Republican, Independent, and Democrat.
What’s more, the researchers found a strongly correlated behavioral effect from this shifting of viewpoints. A second experiment showed that people indeed “put their money where their mouth is” — or rather, withheld their money when faced with their policy ignorance — in terms of monetary donations to a given political cause. Given the opportunity to donate ‘bonus’ money to a politically-related organization, participants whose extreme views had shifted or “softened” after having to explain the policies were less likely to donate money to that organization.
The Importance of Being Uncertain
The researchers feel that their findings could help keep lines of communication open among those locked in ideologically-driven debate or negotiations.
“This research is important because political polarization is hard to combat. There are many psychological processes that act to create greater extremism and polarization, but this is a rare case where asking people to attempt to explain makes them back off their extreme positions,” commented Fernbach.
This study (Fernbach et al.) would also seem to be supported by, or related to, earlier research on the role that uninformed individuals (those showing ‘weak preferences’, in both human and non-human animal groups) play in “promoting democratic consensus” by preventing extreme minority viewpoints from dictating group choice (see: Couzin et al., ‘Uninformed Individuals Promote Democratic Consensus in Animal Groups’, Science, 16 December 2011).
Well, maybe as long as the “uninformed individuals” have to explain themselves before they go vote.
The study paper is entitled ‘Political Extremism Is Supported by an Illusion of Understanding’, and its co-authors include Todd Rogers of the Harvard Kennedy School; Craig R. Fox of the University of California, Los Angeles; and Steven A. Sloman of Brown University. Source: Association for Psychological Science
While I find this study affirming of my own principle of moderation (in most things) and believe it is generally valid, a few caveats are worth noting. The results imply that any given extreme political view (toward a policy, as defined by the researchers) is generally held by someone who cannot accurately explain the policy in question. The study also seems to imply that the extreme view is the ‘wrong’ view because it is unsupported by personal understanding — though a view can be unsupported and yet, by chance, correct — and, further, that the moderated or softened view is therefore more “correct”, or closer to the truth.
Psychological studies such as these are always problematic.
Some source material for this post (including quotes) came from the eScience News article ‘Extreme political attitudes may stem from an illusion of understanding’, published Monday, April 29, 2013, in the Psychology & Sociology section.