
Experiment: The Effect of Bias

  • Writer: Brent Wiseman
  • Jan 15, 2017
  • 7 min read

I’ve been increasingly fascinated by humanity’s biases and the fallacies we employ. Understanding them, I think, can answer a lot of questions about why we hold the perceptions we do. I used to view bias as something that colored only a few problem areas: either petty arguments between people or very large-scale political partisan warfare. I didn’t see much of a middle ground. Most people are intuitively aware of confirmation bias, the bandwagon effect, selective perception, and so on, if not by name then by behavior, but there are so many more. And in most things, I think, hearing the actual definition of something helps people recognize it.

What I’m now wondering is whether I’m right that most people would not consciously allow themselves to be biased if they were made aware of it. I want to design a study with two questionnaires, each asking many questions that are worded differently from, but similar to, a corresponding question on the other test. The answers would be mostly opinion-based and multiple choice, and the expectation is that the same person would answer similar questions with similar answers. Example: ‘Test 1: Why are the majority of people who receive welfare on it? A. Lack of motivation B. Lack of opportunity C. Lack of proper upbringing D. Lack of livable wages’. On Test 2, I would ask essentially the same question, only worded differently, and the same with the answers. With a bit more time I could probably find subtler wordings. I’d hope for about 30 questions on each test. As a control, many people would take both tests one after another. Then others would take the first test, read about different biases like the ones below, and take the second. The goal is to see whether being made aware of biases and hearing their definitions, even if people don’t believe they personally possess them, has an effect on their results, and if so, how great. A rough sketch of how I’d score the two groups appears at the end of this introduction.

I hypothesize there would be slight variance: a small portion of the people who gave the most vicious and unempathetic answers on Test 1 would adopt a slightly more compassionate and understanding view on the second, but with no, or almost no, complete switches of views. With the welfare question, I would expect a few who answered A to switch to C on Test 2. I suppose I would call that a minor ‘win’, but that’s both outside the purpose of this experiment and probably not great for its validity, since having a preferred outcome could affect how I set up the experiment, the questions I ask, and my attitude toward the subjects, possibly changing how they answer if they glean my feelings. Response bias, in other words.

Below are just a few of the biases I would include. I'm sure there are many more that deserve a spot on this list; I'll collect them like Pokémon.
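Something like the following Python sketch is how I imagine scoring the results. Everything in it is hypothetical: the data format, the group labels, and the toy respondents are stand-ins for a study that hasn't actually been run.

```python
# A minimal scoring sketch for the proposed experiment. Assumes each
# respondent's answers are stored as two lists of choice letters, one
# per test, with questions paired by position. All names and data here
# are hypothetical illustrations, not from a real study.

from typing import Dict, List


def shift_rate(test1: List[str], test2: List[str]) -> float:
    """Fraction of paired questions where the respondent changed answers."""
    changed = sum(a != b for a, b in zip(test1, test2))
    return changed / len(test1)


def mean_shift(group: List[Dict[str, List[str]]]) -> float:
    """Average answer-shift rate across a group of respondents."""
    return sum(shift_rate(r["test1"], r["test2"]) for r in group) / len(group)


# Hypothetical respondents: controls take both tests back to back;
# the treatment group reads the bias definitions between the two tests.
control = [{"test1": ["A", "B", "C"], "test2": ["A", "B", "C"]}]
treatment = [{"test1": ["A", "B", "C"], "test2": ["C", "B", "C"]}]

print(f"control shift rate:   {mean_shift(control):.2f}")
print(f"treatment shift rate: {mean_shift(treatment):.2f}")
```

The hypothesis would then amount to the treatment group's shift rate being slightly, but not dramatically, higher than the control group's.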

------------------

The fundamental attribution error (basically the same as dispositional attribution) is our tendency to explain someone's behavior based on internal factors, such as personality or disposition, and to underestimate the influence that external factors, such as situational influences, have on another person's behavior.

Response bias is a type of cognitive bias which can affect the results of a statistical survey if respondents answer questions in the way they think the questioner wants them to answer rather than according to their true beliefs.

Confirmation bias, also called confirmatory bias or myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities. Similar is selective exposure: the tendency of individuals to favor information that reinforces their pre-existing views while avoiding contradictory information. The "backfire effect" is a term coined by Brendan Nyhan and Jason Reifler to describe how some individuals, when confronted with evidence that conflicts with their beliefs, come to hold their original position even more strongly.

Outcome bias - Psychologists have found that people tend to judge the quality of a decision based on how the decision worked out. That is, people sometimes forget that the quality of a decision must be judged on what was known at the time the decision was made, not on how it worked out, because the outcome is not known at the time of the decision.

Reactive devaluation is a cognitive bias that occurs when a proposal is devalued if it appears to originate from an antagonist.

Zero-sum thinking, also known as zero-sum bias, is when an individual thinks that a situation is like a zero-sum game, where one person's gain must be another's loss. For example, men who oppose feminism because they assume any gain for women must come at men's expense.

Anchoring bias - Anchoring, or focalism, is a cognitive bias that describes the common human tendency to rely too heavily on the first piece of information offered (the "anchor") when making decisions. During decision making, anchoring occurs when individuals use an initial piece of information to make subsequent judgments. Once an anchor is set, other judgments are made by adjusting away from that anchor, and there is a bias toward interpreting other information around the anchor. For example, the initial price offered for a used car sets the standard for the rest of the negotiation, so that prices lower than the initial price seem more reasonable even if they are still higher than what the car is really worth.

Availability bias, AKA the availability heuristic, is a cognitive bias that causes us to overestimate the probabilities of events associated with memorable or dramatic occurrences; it is a mental shortcut that relies on the immediate examples that come to a person's mind when evaluating a specific topic, concept, method, or decision. Example: people think terrorism is the biggest threat to the US because that's what's on the news, which inflates the danger. But if you look at statistics, falling TV sets, cows, and falling coconuts are all more dangerous. You are 130x more likely to be killed by a police officer than by a terrorist.

The bandwagon effect is a psychological phenomenon in which people do something primarily because other people are doing it, regardless of their own beliefs, which they may ignore or override. The bandwagon effect has wide implications, but is commonly seen in politics and consumer behavior.

In cognitive science, choice-supportive bias is the tendency to retroactively ascribe positive attributes to an option one has selected. Example: not admitting you're wrong because you are biased toward the decision you've already made, or not pointing out that you made a bad decision.

The ostrich bias is the decision, often subliminal, to ignore negative information: "to avoid finding out negative information, we should stop looking for it." It promotes ignorance.

Overconfidence effect - The overconfidence effect is a well-established bias in which a person's subjective confidence in his or her judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high.

Gambler's fallacy - The mistaken belief that gambling (or even chance itself) is a fair process that can correct itself in the event of streaks; it is related to the just-world hypothesis.
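To see why the gambler's fallacy is a fallacy, here's a quick simulation of my own, not from any source: flip a fair coin a million times and ask how often heads comes up immediately after a streak of five heads. Independence says the answer should stay near 0.5; the coin doesn't "correct itself."

```python
# A quick check on the gambler's fallacy: does a fair coin become less
# likely to land heads after a streak of five heads? Independence of
# flips says no, and the simulation bears that out.

import random

random.seed(42)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# For every flip preceded by five heads in a row, record its outcome.
after_streak = [flips[i] for i in range(5, len(flips)) if all(flips[i - 5:i])]

# Prints a value close to 0.5, not the "correction" the fallacy predicts.
print(f"P(heads | five heads in a row) ~ {sum(after_streak) / len(after_streak):.3f}")
```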

Survivorship bias, or survival bias, is the logical error of concentrating on the people or things that "survived" some process and inadvertently overlooking those that did not because of their lack of visibility. For instance, claiming that the Roman buildings that survived had superior architecture to those that didn't, even though no examples of the fallen buildings remain for comparison.
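Here's a toy simulation of that effect. The model is entirely made up for illustration: I simply assume a building's chance of surviving grows with its quality, and then show that the surviving sample looks better built than the full population ever was.

```python
# An illustrative (and wholly assumed) model of survivorship bias:
# building quality is uniform on [0, 1], and a building survives with
# probability equal to its quality. The survivors' average quality then
# overstates the average quality of everything that was actually built.

import random

random.seed(0)
quality = [random.random() for _ in range(10_000)]          # all buildings
survivors = [q for q in quality if random.random() < q]     # sturdier ones persist

print(f"mean quality, all buildings: {sum(quality) / len(quality):.2f}")      # ~0.50
print(f"mean quality, survivors:     {sum(survivors) / len(survivors):.2f}")  # ~0.67
```

Judging "Roman architecture" only by the survivors is exactly reading the 0.67, when what was actually built averaged 0.50.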

Selective perception is the process by which individuals perceive what they want to in media messages while ignoring opposing viewpoints. It is a broad term for the tendency all people exhibit to "see things" based on their particular frame of reference.

Blind-spot bias is the cognitive bias of recognizing the impact of biases on the judgment of others while failing to see the impact of biases on one's own judgment. *Very common, but difficult to catch/stop yourself doing.*

Appeal to ignorance - This fallacy occurs when you argue that your conclusion must be true because there is no evidence against it. It wrongly shifts the burden of proof away from the one making the claim.

Appeal to Authority - Using an authority as evidence in your argument when the authority is not really an authority on the facts relevant to the argument.

Cognitive Dissonance - In psychology, cognitive dissonance is the mental stress (discomfort) experienced by a person who simultaneously holds two or more contradictory beliefs, ideas, or values; who performs an action that contradicts those beliefs, ideas, or values; or who is confronted with new information that contradicts them.

Dunning-Kruger Effect - a cognitive bias in which low-ability individuals suffer from illusory superiority, mistakenly assessing their ability as much higher than it really is. Example: *indignantly* "I could do ______ so much better than them if I had their job!"

Straw Man Fallacy - A straw man is a common form of argument and is an informal fallacy based on giving the impression of refuting an opponent's argument, while refuting an argument that was not advanced by that opponent. One who engages in this fallacy is said to be "attacking a straw man". The typical straw man argument creates the illusion of having completely refuted or defeated an opponent's proposition through the covert replacement of it with a different proposition (i.e. "stand up a straw man") and the subsequent refutation of that false argument ("knock down a straw man") instead of the opponent's proposition. Example - A: "We should relax the laws on beer." B: "No, any society with unrestricted access to intoxicants loses its work ethic and goes only for immediate gratification." The original proposal was to relax laws on beer. Person B has misconstrued/misrepresented this proposal by responding to it as if it had been something like "(we should have...) unrestricted access to intoxicants". It is a logical fallacy because Person A never advocated allowing said unrestricted access to intoxicants.
