PseudoCertain
- Brent Wiseman
- Feb 10, 2017
- 6 min read
Isn’t it ironic how everyone is certain they’re right? It’s almost like an unspoken agreement where everyone pretends to be correct, constantly trying to correct those they think incorrect, even though everyone knows that nobody knows what is correct. There will always be the conceited, but you would expect at minimum maybe 30% who recognize that their views and beliefs might be off. I say that, but I don’t even include myself in that noble group. Not really. I have studied Descartes and find his work fascinating, and yet, if I’m being honest with myself, I still have convictions about the majority of my beliefs. There is very little inside my head for which I leave open the possibility that I might actually be wrong. Isn’t that actually fairly strange, considering humans’ capacity for error? Isn’t that the height of arrogance?

When I try to answer a tough philosophical question like this, I like to look at data and numbers and evidence from similar circumstances to glean whether my thoughts are justified or whether they are motivated by bias. I have plenty of information to draw upon. Everybody has opinions, everybody has beliefs. I’ve been asking myself whether there is actually more false belief within humanity than true, seeing as so much of what people believe directly contradicts something another group believes. In many cases, there are many groups, all with conflicting views, of which at most one can be correct. If we take account only of factual matters, excluding opinion and religion, that would make most people wrong about many of their beliefs, right? I don’t believe anybody is free of factually incorrect thoughts, however small, but beyond that, I think the majority of information that many people carry is simply false.

This is a concept that is constantly being proven. On a day-to-day basis, you see or hear or read of people being proven wrong about something they claimed to believe without doubt only moments before. As I’ve often said, we can see that being certain of something doesn’t mean anything. People are mostly just wrong, yet we all think we’re the exception. Every time, we think we’re the only sane ones. If you laid out every belief you hold in front of you, how many would be as rotten as Descartes’ false apples, corrupting the other fruit they lie beside? I just haven’t been able to get that thought out of my head. “Isn’t it ironic?”, even though I don’t think ‘ironic’ is the proper word for it. How much evidence must I see before I admit the same might, or even must, be true of myself?

We’re all just a bunch of pompous, conceited children, always thinking we know everything, or at least thinking everything we think is definitely true. Some of us concede there is much we don’t know, but comfort ourselves in the security blanket of false confidence in what we do. Descartes taught me to question everything, but a lesson I’ve finally learned is that this is a never-ending process. Don’t question something, settle on a belief, and then never revisit the thought process again. Your initial endorsement of an idea, and your subsequent, inevitable overconfidence in its validity, must constantly be reevaluated, because you are human, and humans are drawn to error like rot to fruit.
I presented a more concise version of this question to a philosophical Facebook group just to see the answers I’d get, interested in other perspectives. It mostly provided me with the insight that, contrary to my belief, I hadn’t explained it well. Once people understood what I was getting at, I got some decent answers (all of this is [sic], btw):

Kevin Decker - “In fact there is now so much conflicting information out there that is virtually impossible to qualitatively assess yet is delivered as authoritative that you'd hardly be human if you didn't fall for some of it. Best we can do is to adopt the attitude that its OK to be wrong, unless you're a heart surgeon or an Air Traffic Controller!!”

Chris Evans - “Confidence is essential in America what with our low scoring in academia it's all we have.”

Victor Tavarez - “the only type of people that for the most part understand their limitations and dont overestimate their capacities are reallly smart people. like scientist and intellectuals. Just cuz they know enough to know how much they dont know”

Vincent van Heukelum - “People mostly argue/defend what they think is true... that's how we work. The big thing is what we do once we are proven wrong... when people believe something without evidence because they want it to be true, they will get attached to that idea and are less likely to give it up once demonstrated to be wrong. Which is why openly saying "I hope so" instead of "it is" is so important. It limits the chances of ideas nesting in your mind.”

Josh O’Brien - “When you put so much of yourself into a thought into an idea or an ideology it becomes apart of you. So that then requires you justify the amount of energy and effort you put into it, the investment fbias, you put value into something therefore it has to be valuable. You also have the issue where since it becomes apart of there identity so that when you point out its fallaciousness they see it as an attack, not on a falsehood but upon there persons, upon them hence the usual dismissal(to justify there belief in the face of contrary beliefs and evidence) or cherry picking information(to validate there claims in a pseudo intellectual manner) or out right hostility(when they cannot counter the statement but need to be right). This is why people should never hold onto an idea to tightly otherwise when the time comes you will not be able to let it go.”

Julian Sunny - “There were polls made asking people if they are above or below the IQ average. More than 90℅ answered that they belong to the upper bracket. So almost 50℅ of population suffers from overconfidence (or low self esteem), delusion, all kinds of cognitive biases, pathological lying etc and all off these "errors" in assessing reality in one's self advantage actually worked in our evolution, made us go that one step further with courage and a sense of self worth.”

Henrik Eriksson - “Well, to be fair, over the years my confidence in certain things have risen to levels I'd call 100%. Things like evolution for example (and I'm not talking details here. The details may change, but the conclusion is that evolution happened). Can I be wrong? Yes, I do acknowledge that, but that doesn't mean my confidence level is less. It just means I'm intellectually honest. That might be what you're actually talking about. That they refuse to admit that no matter how much they believe to be right, they may just be wrong.”

Mark Ovderbeist - “//Why do so many people make an assertion that they claim "100% certainty" in?// Because it is often easier than to launch into a lengthy explanation of any doubts that may persist. For example, I often say that god doesn't exist. I cannot prove that, but it's a lot easier than saying "I see no evidence that any gods exist and, whilst absence of evidence is not evidence of absence I find the bible to be full of errors and contradictions etc etc etc"”
Out of these, I really like the ‘investment bias’ thought, though I don’t think that’s the proper name, as I can’t find any resources on it. I do feel I’ve heard that definition before. Then there’s certainty as essentially a defense mechanism, protecting our fragile egos and self-esteem. There’s the pretend certainty where we’re simply restraining ourselves from being pedantic, because a “yes” is much easier than the full explanation, and there’s the Dunning-Kruger effect. All of these are good answers, but I still don’t feel completely satisfied.

I realized the answer I was searching for wasn’t exactly to ‘Why do people do this?’, but rather to why people don’t use the rest of the world as a resource for how they view themselves. In doing so, they separate themselves from the rest of humanity and consider themselves always the exceptions. In my experience, most people call everyone who does not agree with their views ignorant and delusional. If everyone is saying it about everyone else, and yet almost everyone gets proven wrong at some point, some of us many times a week, that’s a useful piece of data. Viewing that situation and still not considering that we, ourselves, might be just as fallible never seems to dent most people’s confidence in most of their views and beliefs. That is fascinating to me. It seems to me that the proper response would be to hold ALL your beliefs, save for those with actual proof, with some degree of skepticism. NEVER claim 100% certainty in anything without proof. I think the bias I am thinking of is essentially overconfidence bias, though maybe a little different: thinking that because you believe something, it must be true. Hence, once you are convinced of something, you never again question it and believe it wholly and without doubt, never recognizing that there is a chance you are incorrect. This is especially a problem for those prone to being convinced of something by other biases, even without evidence.
