Self-Deception: Can we lie to ourselves?

What does it mean to lie? In a strict sense, lying means relating a falsehood that one knows, on some level, to be just that: false. But we are not always honest with ourselves, and our brains have many ways of deceiving themselves. We are frequently convinced, often quite strongly, of things that are simply not true, tenaciously holding some falsehoods as self-evident truths.

It seems paradoxical to believe what we know to be false, so how is it possible to fool ourselves in that manner while remaining aware, at least intellectually, that we are doing so?

First, a bit about what it means to ‘know’ something. Knowledge, at its most basic level, involves both awareness of an idea, or probable fact, and its acceptance.

Knowledge involves believing that something is the case, or that it is not; that belief is needed to fully grasp the intricacies and nuances of what is known. But we are not all the way there yet. We have a couple more steps to go…

To know something, we must not only be informed of it and believe it to be the case; it must also actually conform to the facts: it must be true. And not just this, but there must be some grounds for believing it, so that we can be confident we have knowledge and not just a lucky guess on our part.

Strictly speaking, you can’t really know something that’s false, and you can’t truthfully say you know something without good grounds…


There must be some information available, usually gained through our senses, by which we obtain those grounds and the justification for the knowledge we possess. Some channel of information must necessarily and sufficiently complete the picture before we can confidently say that we know something.

Whether these grounds come from our own sensory experience, often enhanced by our instruments and other artifacts, from secondhand or further-removed testimony given by others (which usually needs some kind of grounding itself, like the real and relevant expertise of the source giving the testimony…), or possibly from other channels of information as well, we must have evidence, and it must be strong enough to justify the claim we accept.


We surely do deceive ourselves. We convince ourselves of probable falsehoods, and we often hold conflicting beliefs by walling them off from each other, even using doublethink and rationalization to entertain them without the discomfort we often experience when both, or all, come to conscious awareness at the same time.

It’s also possible to lack confidence in one’s knowledge: the niggling doubt familiar to those of us who hold all knowledge to be subject to correction, ready to retain, reject, or amend what we know whenever further and better grounds come along.

With doublethink, rationalization, logical fallacies, and belief in belief (believing for the sake of belief itself, as a virtue), we may hold as at least partially true what we intellectually know (and thus to an extent accept) to be false, moving from compartmentalizing our accepted and conflicting claims into the territory of the pious fraud, and further into those of the pathological liar and the victim of false memory syndrome

…as well as when we willfully sacrifice the value of reason and evidence in favor of what feels good to us, rather than the uncomfortable realities we are often forced to deal with in daily life.

4 thoughts on “Self-Deception: Can we lie to ourselves?”

  1. > we may hold at least as partially true what we intellectually know (and thus to an extent accept) to be false
    > To know something, we must not only be informed of a thing and believe it to be the case, but it must actually conform to existing facts

    You might be putting too much faith in people, assuming they have a reasonable grasp of the facts first and only then build up a false interpretation.

    Having studied UFO believers, I find quite the opposite. Some have seen (but most have only read about) an unidentified object that, to them, acted in a way that (they think) can’t be explained by natural means or attributed to human activity. That is, they start from a state of unknowing, but don’t recognise their ignorance. Even so, from there they use the process of elimination to reason that: non-natural + non-human = demons, aliens, time travellers, dimension jumpers (whatever fits their preexisting worldview).

    This hypothesis is never honestly tested. All facts are made to conform to the belief. UFO sightings are aliens until otherwise explained, a safe gambit because you cannot explain 100% of all cases (even so, this ploy is still using the unexplained as positive evidence!).

    Even a lack of facts confirms the belief, usually giving rise to misanthropy (the public/academia is close-minded to the truth), paranoia (witnesses won’t come forward with hard evidence for fear of losing their social standing, careers, or lives), conspiracy theory (the government is hiding alien confirmation or is colluding with the aliens), or unwarranted speculations on alien technology (mind control, invisibility, time travel, dimension jumping).

    Stanton Friedman, self-described “flying saucer scientist,” is fond of saying, “Absence of evidence isn’t evidence of absence.”

    Very sad.

    > Strictly speaking, you can’t really know something that’s false, and you can’t truthfully say you know something without good grounds…

    Yet people do!


    • Hmmm. You’re right about that last, in the sense of “knowing what isn’t so” as per the similarly phrased title of Thomas Gilovich’s book on self-deception. That seems to me more the illusion of knowing, though, than the technical definition of knowledge as true belief justified by evidence of some sort.

      Ben Radford’s pointed out a hierarchy of errors in arriving at and making paranormal claims, starting at perception, then the interpretation of what’s perceived, then the constructed memory of what’s perceived and how it’s interpreted, and then the verbal account of what’s remembered in making the claim.

      I’ve noticed that one’s ability to both interpret and relate what was perceived can be distorted by language difficulties, like a limited vocabulary and/or syntax (which can limit the scope and depth of one’s own conceptual skills) or a genuine language barrier between speaker and listener.

      I think that all of these working together can easily produce that illusion of knowledge, which seems to me more of a problem than mere ignorance.


  2. > “knowing what isn’t so”

    I think people just can’t stand uncertainty, so they are compelled to explain something even though they have insufficient evidence. This plagues believers but also many skeptics who simply must explain away kooky claims in terms of the known.

    Myself, I’m content to say, “No one knows.” Though I must admit it is fun to dismantle a nutty claim. I prefer to check the claimant’s facts — which usually is enough to demolish the claim — than impose my own explanation.

    > Ben Radford’s pointed out a hierarchy of errors

    That sounds about right. I believe the recovered memory syndromes start that way. Something as ambiguous as a sense of unease is seen by zealots as a symptom of amnesia for a previously unknown traumatic event.


  3. Pingback: The Weekly Gnuz & Lynx Roundup « The Call of Troythulu

