Miss the program? Check it out on WFSU 88.9FM tonight at 7PM.
Having urged FDR to build the first nuclear bomb as the threat from Nazi Germany mounted, Albert Einstein was later haunted by the risks he knew nuclear power had left for future generations. To that end, he gave this advice: “To the village square we must carry the facts of atomic energy. From there must come America’s voice.”
We liked Einstein’s advice so much we named our organization after it.
A couple of years ago, we even took Einstein’s charge ridiculously literally by bringing the facts of atomic energy back to “The Village Square” as part of our Dinner at the Square in our series on energy. It turns out that in this next generation of America’s nuclear debate, some of what we “know” about nuclear power isn’t true anymore and some never was.
Joe Keohane writes a powerful piece on how our entrenched political opinions resist facts that contradict them. Here’s a snip of an article so good that it’s going straight into the Village Square library, but we’d strongly recommend you head to Boston.com and read the whole piece.
Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”
Read the rest of the article HERE.
In a study titled “There Must Be a Reason: Osama, Saddam and Inferred Justification” published in the journal Sociological Inquiry, sociologists from four major research institutions looked into the high level of persistent belief in America that Saddam Hussein and Iraq were responsible for the attacks of 9/11, despite the overwhelming evidence to the contrary. The study results:
“Our data shows substantial support for a cognitive theory known as ‘motivated reasoning,’ which suggests that rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe…The study demonstrates voters’ ability to develop elaborate rationalizations based on faulty information. The argument here is that people get deeply attached to their beliefs.”
The researchers describe the observed pattern of motivated reasoning as a “serious challenge to democratic theory and practice that results when citizens with incorrect information cannot form appropriate preferences or evaluate the preferences of others.”
So, to recap… We’ve broken ourselves into feuding teams that are no longer making governing decisions based on reality, but on what we want to pretend is true?
(Postscript: If you’re liberal and feeling smug right about now, then think again. You do it too.)