Wednesday, September 9, 2015

The Science of Why We Don't Believe Science : Summary and Response

         In his article "The Science of Why We Don't Believe Science," Chris Mooney argues that emotion is inseparable from reasoning, and that emotion is usually the stronger of the two.
He begins this argument with an experiment conducted by Leon Festinger and other scientists involving a cult that believed it was communicating with aliens. The experiment illustrated the idea of "motivated reasoning": reasoning that cannot be separated from emotion, in which positive or negative feelings arise before conscious thought. Mooney explains this as a human survival skill. In other words, we push away or even deny threatening information and pull information we perceive as friendly closer. Mooney goes on to state that our emotional reactions are stronger when the topic is something we care about, explaining the effects of confirmation bias, in which we give more weight to information consistent with our beliefs, and disconfirmation bias, in which we try to disprove information inconsistent with them. We do these things, Mooney argues, because identity affirmation and self-protection matter more to us than accuracy.

In the next section, entitled "The backfire effect: Why direct persuasion fails," Mooney argues that people can see all the details of the science, yet their prior beliefs will prevail, citing experiments on attitudes toward the death penalty, among others. Mooney cites Dan Kahan, a Yale Law School professor, who classifies individuals by their cultural values as either individualist or communitarian, and by their outlooks as either hierarchical or egalitarian. Kahan points out that conservative Republicans tend to be hierarchical individualists while liberal Democrats tend to be egalitarian communitarians, and Mooney uses these political divisions to give further examples. Based on these classifications, Mooney states that a group of people can all be given the same information yet come away with very different rates of belief.
Mooney points out that direct attempts to persuade someone can trigger the backfire effect, causing them to hold their mistaken views more strongly than ever. In the next section, "Climategate: What really happened?," Mooney notes that people will not abandon their belief system over a piece of information, especially because they feel their lives would become harder if they believed something that conflicts with the views of their group or their onlookers, so they try to keep their social standing intact. Turning to the effects of the media, Mooney states that people tend to gravitate toward media outlets that share their beliefs, and that social media is worsening this skew in the information people receive. Next, Mooney discusses the effects of education, explaining that, based on experiments, increased education tends to make people more likely to deny information that contradicts their beliefs. In the section "Why the vaccine-autism link persists," Mooney discusses more politically charged issues, namely vaccination. He goes on to say that conservatives seem to be more rigid while liberals are more tolerant of ambiguity. Mooney concludes his article by stating that to get someone to accept new evidence, it must be presented in a way that provokes no emotional or defensive reaction: to give facts a fighting chance, the solution is to lead with values.


Affect and reason are tools that shape and solidify our belief system by almost instinctively labeling things as positive or negative for us. Our beliefs are also shaped by confirmation bias, through which we hold information that supports our beliefs in higher regard, and disconfirmation bias, through which we try to disprove information that is inconsistent with them, making our beliefs even more entrenched. If this network of beliefs is not easily swayed by compelling facts and evidence, or if we read research only to validate existing views, then knowledge is effectively built on belief: relatively closed off to facts that contradict what we believe, and open only to learning more about what we already hold true. In my opinion, the only way to change people's minds on topics such as global warming, abortion, or health care would be for them to actually experience both views playing out, so they can see what happens, because hypotheticals, however factually grounded, rarely convince people. If people cannot be persuaded by arguments and evidence, writers face a very hard job: they must present arguments in a way that supports the reader's beliefs while also pushing for conclusions based on evidence. This leaves the citizens of a democracy at the mercy of whichever political party has the most supporters, regardless of our individual thoughts. According to Mooney, confirmation bias is worsened by technology and social media, which supply an endless stream of information that supports our beliefs rather than exposing us to the many beliefs of the world around us. Mooney says that conservatives are more apt to deny science because they are more authoritarian, more dismissive of individuals' findings, and influenced by past political disputes involving science.
I think this is because conservatives generally tend to be more religious and tightly bound to the beliefs they have grown up with, beliefs that tend not to include science and that deny certain aspects of it.
