
The barriers to critical thinking are numerous and varied. They impede thinking that is clear, reasonable, rational, and accurate.
In this article, we identify the main impediments to critical thinking, define them, and illustrate how they operate in the real world.
Fallacies of reasoning obstruct critical thinking because they invalidate the arguments in which they are committed. The critical thinker must be able to detect the common reasoning fallacies.
Conventional wisdom is a barrier to critical thinking because it encourages us to accept the truth of certain propositions and discourages us from questioning such truths.
Every day, unavoidably, we encounter numerous impediments to our being or becoming critical thinkers. Some of these are inherent human limitations. Others are barriers in our surroundings. Wherever we run into them, these impediments hinder our ability to think clearly, reasonably, rationally, and accurately.
Experts in critical thinking divide the barriers into several basic categories.

Human limitations
Our inherent limitations (e.g., of memory, sensory perception, outlook) hinder us from seeing and understanding things clearly, objectively, and reliably.
Language use
The words we (and others) choose and how we (and others) use them can be misleading, confusing, or deceptive.
Faulty perception
Distorted perceptions of events and issues affect our judgment and understanding of them.
Faulty logic
Flawed logic invalidates the reasoning that relies upon it.
Sociological impediments
Societal influences can lead us to preconceptions and discourage us from thinking and acting independently.
Psychological pitfalls
Our own minds can construct impediments that prevent us from thinking rationally.
Examples of Common Barriers
Let’s take a closer look at the common barriers to critical thinking. The tables below give more detailed explanations and examples of each barrier. As you review these examples, see if you can think of additional ones from your own experience.
Human Limitations
Language Use
| Barrier | Definition | Example |
|---|---|---|
| Assuring expressions | Words or phrases that preempt any challenges to or questions about what follows in the statement. | Expressions such as “Common sense tells us that…” and “No one would dispute that…” |
| Jargon | Technical or overblown language used to make the simple seem complex. | Referring to a manhole cover as a “subterranean infrastructure entry hatch.” |
| Euphemism | An expression intended to be less offensive, disturbing, or troubling to the listener than the word or phrase it replaces. | Referring to the inadvertent killing of innocent people and destruction of property in wartime as “collateral damage.” |
| False implication | Language that is accurate but misleading because it suggests something false. | A preliminary budget proposes a 50% spending increase; the final budget embodies a 25% increase. Opponents of the final budget say spending for the coming year was “slashed by 50%.” It is true that the size of the proposed increase was cut, but the opponents’ characterization misleadingly implies that spending was reduced from the previous year. |
| Vagueness | Language that is not precise enough for the context. | Parent: “When will you be home?” Teen: “Soon.” |
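The budget example of false implication turns on simple arithmetic worth spelling out. The sketch below uses hypothetical round numbers (my own, not from the article) to show how a literally true claim about the proposed increase implies something false about overall spending:

```python
# Hypothetical round numbers for the budget example above.
previous_budget = 100.0
proposed = previous_budget * 1.50   # preliminary budget: a 50% increase
final = previous_budget * 1.25      # final budget: a 25% increase

# The proposed *increase* really was cut in half...
increase_cut = (proposed - final) / (proposed - previous_budget)
print(increase_cut)              # 0.5, so "slashed by 50%" is literally accurate

# ...yet spending still rose relative to the previous year,
# which contradicts what "slashed" implies.
print(final > previous_budget)   # True
```

Both statements are computed from the same figures; only the framing makes the second one easy to miss.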
Faulty Perception
| Barrier | Definition | Example |
|---|---|---|
| Clustering illusion | The mistaken perception that random events which happen in clusters are not random (i.e., not by chance). | Believing that separate outbreaks of a noninfectious disease in the same geographic area must be causally related. |
| Superstition | An irrational perception of a connection between unrelated events. | Believing your favorite team lost because you forgot to bring your lucky hat to the game. |
| Pareidolia | Misperceiving a vague stimulus (seen, heard, or smelled, for example) as something clear and distinct. | Perceiving a water stain on the ceiling as a clear image of Princess Diana. |
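A quick simulation makes the clustering illusion concrete. This illustrative sketch (arbitrary numbers of my choosing) drops cases uniformly at random across regions; chance alone routinely produces a "hot spot" well above the even share, which is exactly the pattern the illusion mistakes for a cause:

```python
import random

random.seed(7)  # fixed seed so the run is reproducible

cases, regions = 100, 10
counts = [0] * regions
for _ in range(cases):
    counts[random.randrange(regions)] += 1  # each case lands in a random region

print(counts)       # a perfectly even spread would be 10 per region
print(max(counts))  # chance alone typically yields a "hot spot" above 10
```

Rerunning with different seeds moves the hot spot around, underscoring that no region is special.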
Faulty Logic
| Barrier | Definition | Example |
|---|---|---|
| Gambler’s fallacy | The mistaken belief that an event with fixed probability becomes more or less probable depending on recent outcomes. | Avoiding lottery numbers that have been drawn recently in the belief that doing so boosts your probability of winning. |
| Argument from ignorance | The fallacy that something must be true because it has not been proven false. | “There must have been a gunman on the grassy knoll because no one has ever proved there wasn’t.” |
| Post hoc fallacy | Mistakenly concluding that one thing caused another simply because the second thing followed the first. | Concluding that a new curriculum caused standardized test scores to improve simply because scores rose after the school adopted it. |
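The fixed-probability point behind the gambler’s fallacy is easy to check empirically. The following sketch (my own illustration) simulates fair coin flips and examines the flip that follows a streak of five heads; if past outcomes mattered, tails would be “due” and heads would come up less than half the time:

```python
import random

random.seed(42)  # fixed seed for a reproducible run

trials = 200_000
streaks = heads_after_streak = 0
for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(flips[:5]):                   # first five flips were all heads
        streaks += 1
        heads_after_streak += flips[5]   # outcome of the sixth flip

# The sixth flip stays near 50/50: the coin has no memory.
print(round(heads_after_streak / streaks, 2))
```

The simulated proportion hovers around 0.5, as probability theory predicts for independent events.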
Sociological Impediments
| Barrier | Definition | Example |
|---|---|---|
| Conformity | Our tendency to bend to social pressure and adopt the majority view or position. | Buying a bigger house and a more expensive car because the social consensus is that bigger and pricier validate our status as a success. |
| Stereotyping | Attributing defining qualities or characteristics to a group (e.g., ethnic, cultural, political, professional); caricaturing people based on their group identity. | “Members of the national news media are all liberal weenies.” “Rush Limbaugh’s listeners are all red-neck fascists.” |
| Irrelevant appeal to authority | Asserting that a position merits acceptance because it is supported by a well-known or esteemed authority figure. | Claiming that tougher new criminal sentencing guidelines are justified because the district attorney endorses them. |
| Communal reinforcement | The process by which an idea or proposition, regardless of its validity, becomes a firm belief through repeated assertion by members of a community. | “Slavery isn’t a necessary evil. It is morally right.” (This belief was widely proclaimed and held in the South during the American Civil War.) |
Psychological Pitfalls
| Barrier | Definition | Example |
|---|---|---|
| Self-deception | Misleading ourselves to accept as true what is, in fact, not true, or to deny what is evidently the case. | Believing that our compulsive betting on pro football is just a healthy expression of our competitiveness and not an indicator of a gambling problem. |
| Wishful thinking | Interpreting facts, events, perceptions, etc., to arrive at what we want the truth to be rather than what the evidence says it is. | A week before the election, we dismiss polling data showing our candidate trailing by a wide margin in every precinct as “anomalous.” |
| Sunk-cost fallacy | Sticking with a hopeless investment for fear of losing everything we have already sunk into it. | Giving additional “seed” money to our cousin for his sputtering start-up when all signs are that the business is heading for the cliff. |
Reasoning Fallacies
A fallacy is an error in reasoning. We differentiate it from a factual error, which simply involves getting the facts wrong. A fallacy is present in an argument when the premises (or reasons) given for the conclusion don’t properly support the conclusion. In fallacious reasoning, the premises might be irrelevant to the conclusion or not logically connected to it. Or they might be insufficient to warrant the conclusion. The presence of a fallacy, to one degree or another, invalidates the argument.
When we make a claim based on flawed reasoning, we commit a fallacy. In Asking the Right Questions, M. Neil Browne and Stuart Keeley refer to fallacies as reasoning “tricks.” They cite three common ones:

- Providing reasoning that entails incorrect assumptions. Example: “The Sun looks bigger than the Moon. It must be closer to Earth than the Moon is.”
- Making a premise seem relevant to the conclusion when it is not. Example: “He never played baseball, so he won’t understand the game’s nuances.”
- Supporting the conclusion with reasons that depend on the conclusion being true. Example: “FedEx is the best overnight delivery service because it is superior to all of its competitors.”
The term “tricks” implies that these fallacies are committed on purpose, with intent to deceive or mislead. In reality, however, many, if not most, fallacies are perpetrated without deliberate intent. Often they stem from careless or sloppy thinking.
Intentional or not, fallacious reasoning in all cases impedes critical thinking because it builds arguments on a faulty foundation. It asks us to accept conclusions based on flawed premises. To be able to identify fallacies in others’ arguments, and to avoid them in one’s own arguments, is an essential skill for the critical thinker.
Common Fallacies
Fallacies of reasoning take many forms. Indeed, they are too numerous and varied to inventory here. Among them, however, are ones that rank as more common. These are fallacies that show up most prevalently in arguments of the sort we encounter. In some cases, the names for them (begging the question, red herring, slippery slope) are part of our general lexicon, and we recognize them even if we aren’t sure exactly what they mean.
Below is a catalog of some of the most common reasoning fallacies.
Ad hominem
Dismissing an argument by attacking the person who offers it rather than by refuting its reasoning.
Appeal to authority
Justifying support for a position by citing an esteemed or well-known figure who supports it. An appeal to authority does not address the merit of the position.
Appeal to popularity
Citing majority sentiment or popular opinion as the reason for supporting a claim. It assumes that any position favored by the larger crowd must be true or worthy.
Begging the question
Asserting a conclusion that is assumed in the reasoning. The reason given to support the conclusion restates the conclusion.
Either-or
Assuming only two alternatives when, in reality, there are more than two. It implies that one of two outcomes is inevitable – either x or y.
Faulty analogy
Drawing an invalid comparison between things for the purpose of either supporting or refuting some position. A faulty analogy suggests that because two things are alike in some respect, they must be alike in other respects.
Hasty generalization
Inferring a general proposition about something based on too small a sample or an unrepresentative sample.
Red herring
Introducing an irrelevant point or topic to divert attention from the issue at hand. It is a tactic for confusing the point under debate.
Search for the perfect solution
Asserting that a solution is not worth adopting because it does not fix the problem completely.
Slippery slope
Suggesting that a step or action, once taken, will lead inevitably to similar steps or actions with presumably undesirable consequences. The fallacy is invoked to justify not taking the initial step or action, lest it lead us down the “slippery slope.”
Straw man
Distorting or exaggerating an opponent’s argument so that it might be more easily attacked.
Two wrongs make a right
Defending or justifying our wrong position or conduct by pointing to a similar wrong done by someone else.
Knowing the names of these common fallacies benefits us because they are the widely accepted short-hand for forms of faulty reasoning. What’s most important to us as critical thinkers, however, is not to memorize a list of common reasoning fallacies. It is, rather, to grasp what makes reasoning fallacious and to be able to spot it in arguments.
Detecting Fallacies
To avoid being victimized by fallacious reasoning, we first must be able to detect it. How do we tell when a fallacy is at play in an argument? We don’t, unless we have a keen understanding of faulty reasoning – what makes it fallacious (why is this not sound reasoning?) and how it operates on us (by what means is it trying to get us to accept the conclusion?). Such understanding comes through the hard work of thinking about fallacies and learning the mechanisms by which they function.
Spotting fallacies begins with evaluating arguments critically. Are their conclusions supported by their premises? If not, why not? Our starting point should be locating and assessing the assumptions in an argument. What does the argument assume (i.e., ask us to accept as a given)? Faulty assumptions are very often at the root of fallacy. The faultier an assumption is, the less valid the reasoning.
Exercise: Detecting Fallacies
In Asking the Right Questions, M. Neil Browne and Stuart Keeley suggest a series of thinking steps to follow to spot fallacies in an argument.
1. Identify the conclusion and the reasons given to support it.
2. Consider reasons you think are relevant to the conclusion; then contrast your reasons with the reasons presented in the argument. How do they match up?
3. Identify any necessary assumptions in the argument by asking yourself two questions:
   - What must I believe for the assumption to be acceptable?
   - What must I believe for the assumption to logically support the conclusion?
4. Ask yourself, “Do these assumptions make sense?” An obviously false assumption is a red flag for fallacious reasoning.
5. Look for language that might distract you from relevant reasoning by strongly appealing to your emotions.
Challenging Conventional Wisdom
Conventional wisdom describes thinking or explanations that we all generally embrace as true.
Challenges threaten those who hold most dearly to conventional wisdom. If we hope to see our critical thinking translate into positive action, we need to recognize that a bold assault on conventional wisdom is likely to drive its defenders to dig in their heels. When this happens, our ability to move our company or organization beyond conventional wisdom is impaired. Our effectiveness, then, may well hinge on the following factors:
- Showing the tangible benefits to our organization of moving beyond conventional wisdom
- Being willing to compromise, in recognition that conventional wisdom is not overturned in one stroke
- Taking what we can get (in the way of change) rather than insisting on “all or nothing”
Not Invented Here
Another obstacle to critical thinking is the phenomenon known as Not Invented Here. This term describes a corporate or institutional culture that avoids using knowledge or adopting solutions because they were not developed by the corporation or institution itself.
Not Invented Here may be intentional or not. In some cases, it results from simple ignorance. For example, a company might simply fail to do the research to determine whether a solution already exists for a problem it is working on. But in many instances, an organization deliberately rejects a known solution, for various misguided reasons:
- Because it is unwilling to invest the time to understand the existing solution
- Because it believes it can (and therefore should) develop an even better solution
- Because it won’t get as much credit for borrowing a known solution as for developing a new one
- Because it would require investment in new infrastructure
The consequence of Not Invented Here is time and effort spent needlessly in pursuit of a solution when a suitable solution already exists.