What shapes our perceptions (and misperceptions) about science? In an eye-opening talk, meteorologist J. Marshall Shepherd explains how confirmation bias, the Dunning-Kruger effect and cognitive dissonance impact what we think we know — and shares ideas for how we can replace them with something much more powerful: knowledge.
A while back, Steven Novella [Here] posted some really good thoughts on the difference between effective, intellectually honest skepticism and cheap, lazy, cynical denialism, and on the importance of cultivating the former and avoiding the latter.
In the past, I’ve tried to describe a belief spectrum running from absolute credulity to definitive denial, but I now think that’s a mistaken concept.
As Stephanie Zvan has often pointed out, barring rare cases of neurological dysfunction, nobody is totally credulous or completely cynical about everything. To paraphrase her analogy, we each stand somewhere in between, on a rock-strewn landscape of belief with surer, safer footing near the center than at the edges.
But in the comment thread of Steve’s post, one commenter [Starting Here] tries very hard to prove the very thesis of cynicism the post addresses, in a classic and blatant display of the Dunning-Kruger effect: conspiracy-mongering, dishonestly ignoring or dismissing all counterarguments, and attempting to assert intellectual superiority by evading questions and repeating the same talking points, with errors in reasoning glaring to nearly everyone else in the thread, and especially obvious to Dr. Novella.
Despite suggestions and better arguments from the others, at no point does the offending commenter get a clue about his own incompetence in reasoning. He repeatedly cites books and documentaries 20 to 30 years out of date as proof positive of his claims of evil government conspiracies, in a manner that seems a bit too uncritically cynical, arrogant, and condescending for someone claiming to be the better skeptic.
Exactly what was described in Steve’s main post. To a tee.
The commenter is content to claim the moral and intellectual high ground, and not once does he note the irony of his factual errors, illogical statements, and attempts to shift the burden of proof onto the other commenters, convinced that his own arguments are absolutely steel-girded and his views flawlessly correct.
I’m going to say something I rarely feel a need to: incompetence leads to more of the same. Some people are too clueless to notice, or too resentful to acknowledge, their own lack of ability, and so they project it onto others to protect their fragile egos and rice-paper-thin skins.
I, for one, am skeptical of his claims, as I hear the same sort of absurd arguments from people whose only criticisms of science are built on casting aspersions of motive and vested interest, which shows quite nicely that they really don’t understand science.
As the Dunning-Kruger effect shows, there’s an enormous difference between self-reporting how well-informed one is about something and really being as well-informed as one claims: ego and self-esteem aside, undue confidence in one’s understanding tends to vary inversely with how much one actually understands.
People who really do know more tend to be more introspective, self-critical thinkers, more aware of their own intellectual shortcomings and biases than incurious types who never think deeply enough to question the limits of their understanding or their own very real weaknesses.