We all know that politics can be a touchy subject. People hold different beliefs, and most hold them strongly. That's all well and good, except that those beliefs can actually prevent you from seeing facts for what they are.

A study by Yale law professor Dan Kahan asked participants about their political views and also posed a series of questions designed to gauge their numeracy, or mathematical reasoning ability. I'm not going to get into all the details of the study, but if you'd like to see the article you can find it here.

The basics of the study are that participants were given a fairly difficult problem that involved interpreting (fake) scientific results. At their core, the problems were essentially identical; however, one version was framed as a study on the effectiveness of a skin cream, while the other was about the effectiveness of laws "banning private citizens from carrying concealed handguns in public."

Now, as I just said, the problem was fairly difficult. In fact, 59% of participants got the skin cream question wrong, and I'm pretty sure there was nothing political about the skin cream. But throw in a gun control question, and all of a sudden the numbers get skewed.
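If you're wondering why so many people got it wrong: the problem is a classic two-by-two table where the raw counts point one way and the proportions point the other. The sketch below uses made-up numbers purely to illustrate that trap; they are not the actual figures from Kahan's study.

```python
# Hypothetical counts, in the same 2x2 format as the skin-cream problem.
# These numbers are invented for illustration only.
improved_with_cream, worsened_with_cream = 200, 80
improved_no_cream, worsened_no_cream = 90, 30

# The intuitive (wrong) read: more people who used the cream improved.
print(improved_with_cream > improved_no_cream)  # True, but misleading

# The correct read: compare the *rate* of improvement in each group.
rate_with_cream = improved_with_cream / (improved_with_cream + worsened_with_cream)
rate_no_cream = improved_no_cream / (improved_no_cream + worsened_no_cream)

print(f"Improved with cream:    {rate_with_cream:.0%}")  # ~71%
print(f"Improved without cream: {rate_no_cream:.0%}")    # 75%
# In this made-up example the cream group actually did worse,
# even though it has more "improved" patients in raw numbers.
```

That mismatch between raw counts and rates is exactly the kind of thing that trips people up even before politics enters the picture.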

It's believed (in the so-called deficit model) that if people had more information and a better understanding of topics like climate change, evolution, etc., they would be better able to come to consensus with scientists and experts. But this study showed the opposite: the more information someone had, the more they would skew that information to reach their preconceived conclusion.

Not shockingly, in the study the liberals found that gun control decreased crime, while the conservatives found that gun control increased crime.
