The Economics of Polarization
People tend to interpret ambiguous information as confirming whatever they believed to begin with.

By Roland Fryer
April 20, 2025 11:27 am ET
Nothing throws America's divisions into stark relief quite like having Donald Trump in the White House. But Mr. Trump is an effect of polarization as much as a cause. We've been growing apart politically for decades.
Data from Gallup show wide and growing divergence on numerous issues between 2003 and 2023. Today, 85% of Democrats but only 30% of Republicans think the government should ensure that everyone has healthcare, a gap that grew by 24 points over those two decades. The notion that the federal government has too much power now draws support from 73% of Republicans and 31% of Democrats, a 51-point shift; the partisan split actually ran slightly the other way during the George W. Bush administration. The split on whether abortion should always be legal has widened by 30 points, and the split on whether human activity is the main cause of global warming by 33 points.
Lawmakers have become more polarized over the past half-century too, and there's polarization over why this happened. Some data suggest that Republican politicians pulled to the right, but conservatives note that the government has moved sharply to the left. In a 2021 Quinnipiac University poll, 52% of Americans said the Democrats had moved too far left, while 35% said the Republicans had moved too far right.
Even seemingly nonpartisan measures such as the consumer sentiment index reveal partisan polarization: Republicans had more-positive views than Democrats about their economic situation during the first Trump term, and this abruptly flipped in early 2021.

How can two people who observe the same information come away with starkly different conclusions? And why do views on factual questions, such as the cause of global warming or the strength of the economy, break down so neatly on ideological lines?
A key insight came to me in a roundabout way. My wife is a great driver, but she blows the horn too much for my taste. Any slight, perceived or real, and you get a loud honk if she is behind the wheel. One morning as we were commuting, a car pulled past her on the highway and veered slightly our way, so that its tires drifted into our lane. She honked. I tried to reason with her: his driving was within the usual margin of error. Her response: "I have kept myself from many, many accidents by being a proactive honker."
We observed the same incident, drew opposite conclusions, and each became more convinced we'd been right all along. Is this consistent with rational thought? And could it explain why Americans have become so polarized?
As soon as she dropped me off on campus, I ran to my office to tell a fellow economist about this anomaly I had observed. Was my wife irrational? Was I? Or did we need to think about inference and decision-making a bit differently? I confided first in Matthew Jackson, who specializes in social networks. He seemed as perplexed as I was and, because he knows my wife, offered up several interpretations that would make her seem more rational. Finally he relented, and the episode became one of the guiding examples that led us to think differently about how humans process information under uncertainty.
In the simplest version of the model we developed, imagine that the truth is either A or B. Climate change either is or isn't caused by human activity. The death penalty either deters crime or it doesn't. No one really knows the truth, but we start with a prior belief about how plausible A and B seem. Each person observes a series of signals, information that suggests the truth might be A or B. Some signals are ambiguous and come as AB rather than A or B.
If you were fully rational and able to set aside prior beliefs, you'd store the information in a sequence (A, B, AB, AB, A, AB, A, B, B) and add it up at the end: three points for A, three for B and three ambiguous signals.
But if you tend to align unclear evidence with your previous expectations, you would come away thinking your original instincts were right. If you construe all the "AB" signals as A (or B), you now think the evidence falls on your side by a 2-to-1 margin. Further observations of the world entrench that view rather than correcting it, because future ambiguous signals will have the same skew.
Our main mathematical result demonstrates that if a large enough share of experiences is open to interpretation (maybe the guy who drifts into your lane until you honk is an example of the horn's saving lives, or maybe he's an average driver who never posed a threat), then two agents who hold differing prior beliefs and see exactly the same sequence of evidence can often end up polarized, with one absolutely sure of A and the other of B.
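The distortion is easy to see in a toy simulation. Here is a minimal sketch (the function name and encoding are my own, not from the paper): two agents observe the identical, perfectly balanced sequence of signals, yet each walks away with a 2-to-1 margin for its own side.

```python
def tally(prior, signals):
    """Tally evidence for A vs. B, resolving each ambiguous "AB"
    signal in favor of whichever side the agent already leans toward.

    prior: "A" or "B", the agent's initial leaning.
    Returns (count_for_A, count_for_B)."""
    a = b = 0
    for s in signals:
        if s == "AB":   # ambiguous signal: read it as confirming the prior
            s = prior
        if s == "A":
            a += 1
        else:
            b += 1
    return a, b

# The sequence from the article: three A's, three B's, three ambiguous signals.
signals = ["A", "B", "AB", "AB", "A", "AB", "A", "B", "B"]

print(tally("A", signals))  # an agent leaning A counts 6 for A, 3 for B
print(tally("B", signals))  # an agent leaning B counts 3 for A, 6 for B
```

Feed the two agents longer streams with the same one-third share of ambiguous signals and the gap only widens, which is the entrenchment the model predicts.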
We explored the model's implications in an online experiment with more than 600 subjects, modeled on a pioneering 1979 paper by Charles G. Lord and colleagues. First, participants answered questions about their beliefs on climate change and the death penalty. Then they read a series of summaries of research on each topic. After each summary, we asked participants whether they thought it provided evidence for or against the proposition, on a 16-point scale. After all of the summaries were presented, we repeated the initial belief questions.
There was a highly significant correlation between a subject's prior belief and his interpretation of the evidence. More than half of our sample exited the experiment with more extreme beliefs than at the start, even though the evidence presented to them was neutral.
The discouraging implication is that in a world where information is plentiful, people will become more divided, not less. That's true even if they all see the same information, which they don't when they can choose between Fox News and MSNBC. And it's true even if our widening divisions prove deeply unhealthy for our country.
Mr. Fryer, a Journal contributor, is a professor of economics at Harvard, a founder of Equal Opportunity Ventures and a senior fellow at the Manhattan Institute.
Appeared in the Wall Street Journal on April 21, 2025.