• DOI: 10.1080/002202799183124
  • Corpus ID: 143250276

Common misconceptions of critical thinking

  • Sharon Bailin , R. Case , +1 author L. Daniels
  • Published 1 May 1999
  • Education, Philosophy
  • Journal of Curriculum Studies


5 Myths About Critical Thinking

No one disputes the need for critical thinking in the classroom. Many school districts mention it explicitly in their strategic plans. Others embrace it implicitly by offering AP or IB courses or by implementing the Common Core or Deeper Learning strategies.

Journalists have caught on and included critical thinking in articles about educational practices, trends, and reforms. Over the past 15 years, newspaper stories referencing critical thinking have increased sharply, by 650 percent. That’s a higher increase than for such hot topics as partisanship, rap music, or yoga!

Despite this broad consensus, it’s not always clear what we mean when we say critical thinking. In fact, several myths obscure our understanding of this important skill and its potential to improve educational outcomes.

Critical thinking leads to criticism and disapproval. Maybe, maybe not. Critical thinking is an open-minded effort to answer a question or test a hypothesis by combining good evidence with good ideas. Critical thinking may lead to a negative appraisal of someone else’s ideas. But it also encompasses self-criticism, which means analyzing and synthesizing information for a fresh appraisal of one’s own ideas. In short, critical thinking sometimes requires you to shed your beliefs and embrace someone else’s. And it demands a close review of relevant evidence. If that inquiry corroborates someone else’s theory or preferred beliefs, then a true critical thinker will follow the data.

Abraham Lincoln’s Second Inaugural Address (“with malice toward none, with charity toward all”) was a splendid example of critical thinking, even though it was remarkably free of criticism. Lincoln’s empathy for both sides in the Civil War confirmed his ability to look at things from an adversary’s point of view. His rigorous logic reflected a Euclidean commitment to evidence as the core of a good argument. In contrast, tweets lambasting political enemies qualify as criticism but not critical thinking. Without evidence, criticism is disconnected from truth-seeking.

Critical thinking has limited utility across the school curriculum. Critical thinking is often linked to high school English teachers who ask students to deconstruct a text, or history teachers who ask students to analyze a document from the early days of the republic. It is also associated with college professors who invite their students to reinterpret Chaucer or Proust or Frederick Douglass or Elizabeth Cady Stanton. But critical thinking is also the province of natural scientists, social scientists, engineers, mathematicians, lawyers, physicians, and journalists. And it is more than just reinterpreting classic texts, admirable though that is. A lot of critical thinkers work for Google and for Microsoft. A lot of critical thinkers work for Pixar Studios and Disney. Critical thinking is a form of analytical reasoning with wide applicability.

Critical thinking is an academic exercise with little practical use. Numerous studies show that most employers deeply value critical thinking and regard it as one of the essential skills most lacking in recent high school graduates or even college graduates. To employers, critical thinking involves the capacity to solve problems, including complex ones, and the capacity to adapt to new situations quickly and intelligently. If teachers make connections in the classroom between academic subjects and the world of work, they can significantly improve students’ prospects of succeeding at work.

Many studies also confirm the value of extra-curricular activities, which can enhance students’ hard and soft skills simultaneously. Sonia Sotomayor recalls that her high school debating club provided invaluable critical thinking skills: “Forensics Club was good training for a lawyer in ways that I barely understood at the time … You not only had to see both sides; you had to prepare as if you were arguing both in order to anticipate your opponent’s moves.” Athletics also provides opportunities for critical thinking. It was hockey star Wayne Gretzky who explained his extraordinary success by saying: “I skate to where the puck is going to be, not where it has been.” That is a splendid example of critical thinking!

Critical thinking instruction is premature for younger children. Wise teachers understand the need for developmentally appropriate instruction. A well-designed curriculum recognizes that children must learn their A, B, Cs before they can learn their X, Y, Zs. But a common misconception is that only older children and adults are capable of the sophisticated reasoning that critical thinking requires. I attended a 7th grade class in Arlington, Va., where students deftly applied different thinking strategies to the question of whether an app censoring language in assigned texts is a good idea. I attended a 1st grade class in Pittsburgh where students analyzed a piece by the graffiti artist Banksy with shrewd insights and profound observations worthy of a professional art critic. We should not underestimate younger children.

Even preschoolers can think critically. Give a group of 4-year-olds a wooden ramp and a rubber ball and ask them how to get the ball to descend more quickly. Within minutes, some of them will have figured out a connection between velocity and slope! As thinkers, young children have certain advantages over older children and even adults, because they approach topics from a fresh perspective, unencumbered by preconceptions, algorithms, and pigeonholes. Critical thinking instruction should begin early, not late.

Critical thinking skills are beyond the reach of low-income children. Disadvantaged children do need help with the fundamentals, including reading and math, often because they don’t have as many learning opportunities outside of school. But they also need a chance to flex their critical thinking muscles. Unfortunately, low-income children have less access to open classrooms, Socratic seminars, and other “constructivist” pedagogies that help to build critical thinking skills. Studies show that low-income students, like their middle-class peers, benefit from such experiences. Low-income students need access to STEM curricula that generate enthusiasm for science and the scientific method. That includes robotics clubs, coding clubs, the Odyssey of the Mind, and other extra-curricular activities that stimulate critical thinking. A society that denies its least advantaged children the opportunity to acquire vital 21st-century skills is destined to remain a divided society, with huge gaps between the haves and the have-nots.

As we compete with less open societies in a global economy, critical thinking is our secret weapon, our critical advantage. But critical thinking instruction, like critical thinking itself, requires champions, role models, and templates. It also requires abandoning myths that would rob critical thinking of its versatility and its vitality.

William T. Gormley, Jr. is a professor at Georgetown University’s McCourt School of Public Policy, where he co-directs the Center for Research on Children in the U.S. He is author of The Critical Advantage: Developing Critical Thinking Skills in School, recently released by Harvard Education Press.

Published: November 29, 2017


A Guide To Critical Thinking (think.maresh.info)

What is Critical Thinking?

Critical thinking is the art of analyzing and evaluating thinking with a view to improving it in order to make an informed decision that is most likely to result in desired effects.

Critical thinking describes a process of uncovering and checking our assumptions and reasoning. First, we analyze to discover the assumptions that guide our decisions, actions, and choices. Next, we check the accuracy of these assumptions by exploring as many different perspectives, viewpoints, and sources as possible. Finally, we make informed decisions or judgments that are based on these researched assumptions.

Life is a series of decisions, some small, some much larger. Whom we date or choose as friends, the work or career we pursue, which political candidates we support, what we choose to eat, where we live, what consumer goods we buy, if and whom we marry, if and how we raise children—all these decisions are based on assumptions. We assume our friends will be trustworthy and won't talk about us behind our backs. We assume our career choices will be personally fulfilling or financially remunerative. We assume politicians we vote for have our, or the community's, best interests at heart. We assume that the foods we choose to eat are healthy for us, and so on.

These assumptions are sometimes correct. At other times, however, the assumptions we base our decisions on have never been examined. Sometimes we hold these assumptions because people we respect (friends, parents, teachers, religious leaders) have told us they are right. At other times we have picked these assumptions up as we travel through life but can't say exactly where they've come from. To make good decisions in life we need to be sure that these assumptions are accurate and valid – that they fit the situations and decisions we are facing. Critical thinking describes the process we use to uncover and check our assumptions. Decisions based on critical thinking are more likely to be ones we feel confident about and to have the effects we want them to have.

Your Mental Models

Mental models are the filters we use to understand the world. A mental model is a representation of how something works. Every day we encounter more information than we can store, and the phenomena we encounter are too complex to understand in every detail. Therefore, we use filtering models to simplify the complex into organizable and understandable chunks, conceptual models to file and organize new information, and reasoning models to create new ideas and make decisions.

Mental models shape what we think, how we interpret, what we value most, where we direct our attention, how we reason, and where we perceive opportunities. The quality of our thinking is only as good as the models in our head and their usefulness in a given situation. The best models improve our likelihood of making the best decisions. By critically examining our assumptions, we can adjust them to be in better accord with reality, and they become more powerful mental models in the toolkit through which we understand reality.

All of us go through life with many incorrect core assumptions about reality. For example, most of us believe (1) we are perceiving reality accurately, (2) our perceptions are valid, and (3) that what is obvious to us must be obvious to others. Let that sink in for a minute: these are incorrect assumptions. It is simply not possible to perceive reality accurately, and everyone's reality is different. Our sensory nervous system sends gigabytes per minute of data to the brain, but the brain has the attentional bandwidth to process only megabytes per minute. On top of that, we are always allocating some of our bandwidth to our thoughts (have you ever been lost in thought and missed an important detail?). To improve our thinking, first we have to accept that our perceptions of the moment are filtered through mental models and that our most dearly held beliefs may not correctly describe reality, and then be open to improving them.

Building your toolkit of mental models is a lifelong project. Stick with it, and you'll find that your ability to understand reality, accomplish your goals, deepen your relationships, and make the best decisions will always improve. Critical thinking is a set of reasoning tools that we use to improve our other models about the world. They are the foundation upon which we can build our best mental models. In the next section, you will find an overview of the reasoning tools described in this website.

Organization of this Resource

Learn to analyze the elements of reasoning.

The Critical Analysis page is dedicated to the first step in the process of developing critical thinking skills, recognizing elements of reasoning that are present in the mind whenever we reason. I categorize six elements of reasoning: purposes, questions, points of view, information, assumptions, and reasoning. Note how these elements are related in the following paragraph.

To take command of our thinking, first we need to clearly formulate both our purpose and the question at issue. To uncover truths, we need to make logical inferences based on sound assumptions and information that is both accurate and relevant to the question we are dealing with. We need to understand our own point of view and fully consider other relevant viewpoints. We also need to recognize problems created by bugs in the human operating system and formally work around them. These bugs fall into two major categories, each of which has its own page.

Fallacies of reasoning are found in unsound arguments that may sound persuasive on the surface.

Cognitive biases are predictable, systematic patterns of deviation from rationality in judgment. Cognitive biases can lead to irrational thought through distortions of perceived reality, inaccurate judgment, or illogical interpretation. For example, confirmation bias is the tendency to interpret new evidence as confirmation of one's existing beliefs and to filter out information that does not confirm them.
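Confirmation bias can be made concrete with a toy model. The sketch below is purely illustrative (the function name, the evidence items, and the `supports` flags are all assumptions, not from the source): an agent keeps only the evidence that agrees with its prior belief, so whatever it "sees" can only confirm what it already thinks.

```python
# Toy model of confirmation bias: an agent keeps evidence that agrees
# with its prior belief and discards evidence that contradicts it,
# so its belief can only ever be "confirmed". Purely illustrative.

def filter_evidence(prior: bool, evidence: list[tuple[str, bool]]) -> list[str]:
    """Keep only items whose support flag matches the prior belief."""
    return [claim for claim, supports in evidence if supports == prior]

evidence = [
    ("trial shows strong protection", True),
    ("rare side effect reported in one region", False),
    ("follow-up study confirms protection", True),
]

prior_belief = True  # the agent already believes the claim
seen = filter_evidence(prior_belief, evidence)
# Only the two supporting items survive; the disconfirming one is never "seen".
print(seen)
```

A critical thinker does the opposite of `filter_evidence`: they deliberately seek out the items the filter would discard.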

Learn to evaluate reasoning

The Critical Evaluation page describes the second step in the process of critical thinking, evaluating the quality of thought. We need to use concepts justifiably and follow out the implications of decisions we are considering.

Learn to avoid other common mistakes

No one is a master of every discipline; however, there are some common misconceptions that people hold about other disciplines that you should learn to avoid.

Additionally, I have created a page of common writing errors that I have observed in developing student writing.

Before submitting your writing, please consult these resources as checklists and verify that you have done your best to avoid these mistakes.

Critical Analysis

Analysis is the act of breaking something complex down into simpler parts that you examine in detail. To critically analyze a text or idea, identify its purpose, the question at issue, the author's point of view, the kinds of information involved, the reasoning, and the conclusions.

Unless a text is simply presenting information, it will often contain arguments. An argument is a series of statements that reach a logical conclusion, intended to reveal the degree of truth of another statement. Arguments begin with premises (kinds of information) that are related to each other using valid forms of reasoning (a process) to arrive at a logical conclusion (new information). A logical conclusion is new information that is true in light of the premises being true (if the premises are all facts) or seeming to be true (if the premises contain some opinions). A logical conclusion may be false if the premises are false or the reasoning is poor.
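The structure described above can be sketched in a few lines of code. This is a minimal model, not a standard library or formal logic engine: the class names, the `reasoning_is_valid` flag, and the example syllogisms are all illustrative assumptions. A conclusion is sound only when the reasoning is valid and every premise is true.

```python
# Minimal sketch of an argument: premises (information) combined by a
# reasoning step to yield a conclusion. Sound = valid reasoning AND
# all premises true. Illustrative only.

from dataclasses import dataclass

@dataclass
class Premise:
    statement: str
    is_true: bool  # in practice: our best assessment, not certainty

@dataclass
class Argument:
    premises: list[Premise]
    reasoning_is_valid: bool  # does the conclusion follow from the premises?
    conclusion: str

    def is_sound(self) -> bool:
        return self.reasoning_is_valid and all(p.is_true for p in self.premises)

arg = Argument(
    premises=[Premise("All mammals are warm-blooded", True),
              Premise("Whales are mammals", True)],
    reasoning_is_valid=True,  # classic syllogism form
    conclusion="Whales are warm-blooded",
)
print(arg.is_sound())  # a sound conclusion

# Same valid form with a false premise: the conclusion follows
# logically but is no longer sound.
bad = Argument(
    premises=[Premise("All fish fly", False),
              Premise("Salmon are fish", True)],
    reasoning_is_valid=True,
    conclusion="Salmon fly",
)
print(bad.is_sound())
```

Note how the second example separates validity from soundness: the reasoning is impeccable, yet the false premise poisons the conclusion.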


1. Identify the Purposes

All texts or ideas have a purpose .

  • What do you think the author wants us to do, think about, or believe?
  • Periodically check that the text, and your reading of it, remain on target with the purpose

2. Identify the Questions at Issue

When reasoning is present, the author is attempting to figure something out, to answer some question, or to solve a problem.

  • Take time to clearly and precisely state the question at issue
  • Express the question in several ways to clarify its meaning and scope
  • Break down the question into sub-questions
  • Identify if the question has one right answer, is a matter of opinion, or requires reasoning from more than one point of view

3. Identify Points of View

All reasoning is done from some point of view. We often experience the world as though we were observing things just as they are, without the filter of a point of view. Yet we also recognize that others have points of view that lead them to conclusions we fundamentally disagree with. One of the key dispositions of critical thinking is the ongoing sense that, as humans, we always think within a perspective and virtually never experience things totally and absolutely. There is a connection, therefore, between thinking so as to be aware of our assumptions and intellectual humility. It is often helpful to open your mind and involve other people (friends, family, work colleagues) who can help us see ourselves and our actions from unfamiliar perspectives. Reading books, watching videos, or having new experiences such as traveling to other cultures, going to college, or being an intern can also make us aware of our assumptions. It is equally important to recognize that every person's point of view is biased by their world view and experiences, and therefore all points of view should be examined critically.

  • Identify your point of view
  • Identify author's point of view
  • Compare and contrast differing points of view

4. Distinguish Types of Information

Uncritical thinkers treat their conclusions as something given to them through experience, as something they directly observe in the world. As a result, they find it difficult to see why anyone might disagree with their conclusions. After all, they believe that the truth of their views is right there for everyone to see! Such people find it difficult to describe evidence without interpreting it through their point of view. Critical thinking requires the ability to label types of information and evaluate their quality before accepting an argument.

Information is true if it is in accord with reality. Since our knowledge of reality is always incomplete, in practice truth is measured by accord with the best information we have about reality. All information carries an associated degree of belief (a feeling about its truth) or confidence (the scientific term for the statistical likelihood of truth). When analyzing, we are simply categorizing information rather than evaluating its quality.

All arguments are based on information. Premises are information used in the context of an argument. Information can be classified into four categories that describe the context in which it is used.

1. Evidence is information upon which conclusions are based. There are two categories of evidence:

  • Facts (objective truth)
  • Opinions (a feeling about the truth)

2. Assumptions are statements that we accept as true without proof or demonstration.

3. Conclusions are the results of reasoning, irrespective of their truth value.

4. Propaganda is information that is not objective and is used primarily to influence an audience and further an agenda.

4A. Identify Evidence

Evidence is information that is relevant to the question at issue. Both facts and opinions are evidence.

  • Unless necessary facts are unavailable, you should restrict your evidence to facts: verifiable information.
  • Restrict your conclusions to those supported by the evidence you have.

A fact is an accurate description of an object, event, or statement that is independently verifiable by empirical means.

There are two distinct senses of the word "factual." The word may refer to a verified fact. However, "factual" may also refer to claims that are "factual in nature" in the sense that they can be verified or disproven by observation or empirical study; those claims must still be evaluated to determine whether they are true. People often confuse these two senses, even to the point of accepting as true statements which merely "seem factual," for example, "29.23% of Americans suffer from depression." Before I accept this as true, I should assess it. I should ask such questions as "How do you know? How could this be known? Did you merely ask people if they were depressed and extrapolate those results? How exactly did you arrive at this figure?"

Purported facts should be assessed for their accuracy, completeness, and relevance to the issue. Sources of purported facts should be assessed for their qualifications, track records, and impartiality. Many students have experienced an education which stressed retention and repetition of factual claims. Such an emphasis stunts students' desire and ability to assess alleged facts, leaving them open to manipulation. Likewise, activities in which students "distinguish fact from opinion" often confuse these two senses. They encourage students to accept as true statements which merely "look like" facts.

To identify facts, look for signal phrases such as: "The annual report confirms ...," "Scientists have recently discovered ...," "According to the results of the tests ...," "The investigation demonstrated ..."

Credible facts reference the observer of the information. You should accept a fact only after you have identified confirmation by multiple independent observers and evaluated their credibility and potential bias. Even before this evaluation, you should reject a purported fact that does not have a clear source.

As an example, in the debate we watched, Nick Gillespie says, "[drugs are] not addictive for 99 percent of people." This is factual only in the sense that it may be empirically possible to measure, but you should not accept it as fact without more context, such as a source.

If you have the opportunity, ask someone, "Where did you get that information?" to give them the chance to confirm a fact. Until you actually understand the limits and source of the fact, you should regard the information as suspect and categorize it as an opinion that someone believes is true.

An opinion is a statement that expresses either how a person feels about something or what a person thinks is true . With objective verification, opinions can become facts. If they cannot be proven or disproven, they will always be opinions.

Since we cannot examine the facts in all situations, sometimes we must rely on an opinion as evidence in an argument. Any conclusion derived from an argument that uses an opinion in place of a fact will generally be less reliable. You should always acknowledge such uncertainty when presenting such a conclusion.

  • Look for signal phrases such as: "He claimed that...," "It is the officer's view that...," "The report argues that...," "Many scientists suspect that..."
  • Some opinions are more reliable than others. An opinion based on the objective consideration of a large amount of incomplete information will be more reliable than an opinion based on one observation and a feeling.
  • Understand that things are not always as they appear to be. At times, writers, whether consciously or not, will frame opinion as fact and vice versa.
  • Note that statements can contain both fact and opinion. The two should be separated when analyzing an argument.

4B. Identify Assumptions

An assumption is a statement that we accept as true without proof or demonstration. It is an unstated premise, presupposition, or opinion that is required to connect data to conclusions.

All human thought and experience is based on assumptions. Our thought must begin with something we believe to be true in a particular context. We are typically unaware of what we assume and therefore rarely question our assumptions. Much of what is wrong with human thought can be found in the uncritical or unexamined assumptions that underlie it. Identifying and evaluating the accuracy and validity of assumptions is arguably the most important application of critical thinking. Accurate and valid assumptions can become facts.

Assumptions are often very difficult to identify. Usually they are something we previously learned and do not question. They are part of our system of beliefs. We assume our beliefs to be true and use them to interpret the world about us.

This packet of exercises contains many excellent examples of assumptions identified in short scenarios.

4C. Identify Conclusions

Conclusions are the results of reasoning.

In logic, conclusions can be categorized based on their truth value:

  • Sound conclusions result from true premises and valid reasoning.
  • Unsound conclusions result from false premises and/or invalid reasoning.

Additionally, conclusions are often categorized as either:

  • accurate/inaccurate based on the truth of the premises
  • logical/illogical based on the quality of the reasoning
  • justified/unjustified based on whether or not the truth value has been critically evaluated

Conclusions also can be categorized based on their role in an argument:

  • Inferences (conclusions from a single step of reasoning that are used as a premise in a successive argument)
  • Drawn conclusions (conclusions that relate back to the question at issue)

It should be noted that different disciplines that study human thought (e.g. philosophy, cognitive psychology, artificial intelligence) define the distinction between a conclusion and an inference differently. To avoid confusion, I will make the following distinctions. When analyzing reasoning, a logical conclusion refers to the result of any argument. When analyzing a complex argument focused on a question at issue, an inference is a logical conclusion drawn from a single step in reasoning, and it may be used as information in the premise of a successive step of reasoning. A drawn conclusion is a logical conclusion that specifically answers the question at issue by logically relating many inferences as premises. The example in this article effectively illustrates my distinction between an inference and a drawn conclusion. (Note that other sources may define these words in the exact opposite way!)

Conclusions are generally straightforward to identify in context. When analyzing a complex argument focused on a complex question at issue, inferences are often made implicitly in the course of reasoning, so an inference may be more difficult to identify. Critical thinkers try to monitor their inferences to keep them in line with what is actually implied by what they know. When speaking, critical thinkers try to use words that imply only what they can legitimately justify. They recognize that there are established word usages which generate established implications.

  • If we assume that it is dangerous to walk late at night in big cities and we move to Chicago, we will infer that it is dangerous to go for a walk late at night in Chicago. We probably take our assumption about big cities for granted and apply it to Chicago implicitly.
  • To infer that an act was murder is to infer that it was intentional and unjustified. The implications of this inference are severe, so sufficient evidence must exist to justify it.

A helpful tool is to first identify an inference (what do we infer from the situation being evaluated?) then identify an assumption that is the premise to that inference ("If the inference is true, what did I assume about the situation?"). Often an assumption you identify this way is an inference that can be further unpacked by repeating the second step to identify deeper core assumptions.

Situation: I heard a scratch at the door. I got up to let the cat in.

Inference: I inferred that the cat was at the door.

Ask: If that is true, what did I assume about the situation?

Assumptions: Only the cat makes that noise, and he makes it only when he wants to be let in.
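The repeated "if that is true, what did I assume?" step can be sketched as a simple walk down a chain of beliefs. Everything here is a hypothetical reconstruction of the cat-at-the-door example (the function name and the exact wording of each belief are my own, not from the source):

```python
# Sketch of the unpacking tool: start from an inference, ask
# "if that is true, what did I assume?", and repeat the question
# to surface deeper core assumptions.

def unpack(inference: str, assumption_behind: dict[str, str]) -> list[str]:
    """Walk back from an inference through the chain of assumptions
    that support it, stopping when no deeper assumption is recorded."""
    chain = [inference]
    while chain[-1] in assumption_behind:
        chain.append(assumption_behind[chain[-1]])
    return chain

# Each entry answers: "if the key is true, what did I assume?"
assumption_behind = {
    "I should get up to let the cat in": "The cat is at the door",
    "The cat is at the door": "Only the cat makes that scratching noise",
}

for step in unpack("I should get up to let the cat in", assumption_behind):
    print(step)
```

Each answer to the question becomes the next candidate for unpacking, which is exactly how the repeated questioning surfaces a core assumption.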

Since different people can hold different assumptions, they will make different inferences about the reality of the same situation.

| Person One | Person Two |
| --- | --- |
| A man is lying on the sidewalk. | A man is lying on the sidewalk. |
| That man is a bum. | That man is in need of help. |
| If that is true, what did I assume about him in this situation? | If that is true, what did I assume about him in this situation? |
| Only bums lie on sidewalks. | Anyone lying on a sidewalk is in need of help. |

4D. Identify Propaganda

Propaganda is a special category of information that is not objective and is used primarily to influence an audience to reach a specific conclusion. Propaganda attempts to arouse emotions and biases to short-circuit rational judgment. The author of propaganda deliberately designs an argument that does not hold up to critical thinking. Its use indicates an intent, at worst, to mislead or, at best, to persuade without the use of reasoning. Whether or not propaganda is ethical is a personal and context-dependent value judgment that is separate from critical thinking.

Students often find analysis of propaganda confusing because propaganda is an extra feature of information rather than a type of its own. Information that is propaganda can be any non-objective type (opinion, assumption, and/or inference) if it is deliberately used to manipulate opinions using poor reasoning. Moreover, propaganda frequently relies on poor reasoning: it often employs logical fallacies or takes advantage of cognitive biases to mislead.

The following is a list of common propaganda techniques:

  • Bandwagon. This technique aims at persuading people to do a certain thing because many other people are doing it. An example is a soft drink advertisement in which a large group of people is shown drinking the same soft drink. People feel induced to opt for that drink because it is shown to be consumed by many. Similarly, simply declaring without evidence that something is "America's Favorite" can significantly increase sales. Snob appeal is the reverse of bandwagon: it suggests that buying a certain product will make you stand out from the rest, because the masses cannot afford it.
  • Card Stacking Propaganda. This technique is perhaps the most widely used. It involves the deliberate omission of certain facts to fool the target audience. The term originates from gambling, where players try to stack decks in their favor. Companies use a similar strategy to make their products appear better than they actually are, downplaying unsavory details about their products and services. For instance, some companies may cleverly conceal "hidden charges" and talk only about the benefits of their products. Changing the shape of french fries so that one pays more for less food still doesn't change the fact that eating fried food is unhealthy.
  • Glittering Generalities Propaganda uses emotional appeals and/or vague statements to influence an audience. Advertising agencies use phrases such as "inspiring you from within" or "to kick-start your day" to create positive associations. This makes the product look more appealing, resulting in better sales.
  • Hacking Identity: The Pride-Fear-Outrage-Hatred Formula. Critically examine when identity categories become significant to an argument. In some cases it may be appropriate, in others it may be an emotionally manipulative red herring.
  • Example: In recent years, the Russian government has planted appeals to pride to amplify difference and strengthen online social communities. This is then followed by stories designed to invoke fear and outrage. A 2018 report to the United States Senate Select Committee on Intelligence details how these tactics are apparently designed to "hack" the minds of citizens in democratic nations into feeling disillusioned with social and political institutions. The goal is to weaken democratic participation and nudge countries towards increasingly pro-authoritarian values.
  • Repetition. The product name is repeated many times during an advertisement, often in a jingle that appeals to the masses and sticks in their minds. This takes advantage of the illusory truth effect, a cognitive bias encapsulated in the old adage, "if you tell a lie big enough and keep repeating it, people will eventually come to believe it." It is an unfortunate reality that the Internet is often used to make untrue information seem true by repetition.
  • Slogans. A slogan is a brief, striking phrase that may include labeling and stereotyping. Although slogans may be enlisted to support reasoned ideas, in practice they tend to act only as emotional appeals. Opponents of the US invasion and occupation of Iraq used the slogan "blood for oil" to suggest that the invasion and its human losses were driven by access to Iraq's oil riches. On the other hand, supporters who argued that the US should continue to fight in Iraq used the slogan "cut and run" to suggest that withdrawal would be cowardly or weak. Similarly, the names of military campaigns, such as "Enduring Freedom" or "Just Cause," can also be considered slogans devised to influence people.
  • Testimonial propaganda is a popular advertising technique that uses renowned or celebrity figures to endorse products and services. When a famous person vouches for something, viewers are likely to transfer the credibility and popularity of that person to the product. Watch Drake's Sprite commercial as an example.

Wikipedia has an extensive list of propaganda techniques with numerous examples.

5. Analyze Reasoning

The identification of poor reasoning invalidates an argument, but the conclusion of the argument may or may not be true. To support the conclusion, you must formulate an alternative, valid argument.

5A. Identify Logical Fallacies

Fallacies are faulty reasoning used in the construction of an argument. This topic is so vast that I have created a separate fallacies of reasoning page.

5B. Identify Cognitive Biases

A cognitive bias is a cognitive shortcut that leads to a loss of objectivity. Cognitive biases can lead to irrational thought through distortions of perceived reality, inaccurate judgment, or illogical interpretation. By learning about some of the most common biases, you can learn how to avoid falling victim to them.

The identification of cognitive biases at work in an argument should make you skeptical. Like fallacies, this topic is so vast that I have created a separate cognitive biases page to explain them.

Critical Evaluation

After we have cataloged the elements of reasoning, we must evaluate texts and our own reasoning for clarity, accuracy, precision, relevance, depth, breadth, significance, logic, and fairness. When making a decision with incomplete information, it is critical to recognize that truth is often a degree of belief based on our evaluation of the quality of the information and reasoning.

1. Evaluate point of view

  • Playing the devil's advocate by arguing from a different point of view is a powerful exercise
  • After reading a text, examine how much influence the author's point of view had on you

 Critically evaluate the reliability of an author (and publisher):

  • What qualifications does the author have for writing on this subject? (Or what are the qualifications of the people the author quotes?)
  • Based on your research on the author's background, what factors may have influenced his or her point of view?
  • When and where was the article first published? Does this information affect the credibility of the article?

Compare and contrast points of view to reveal how related material is presented by different authors and for different purposes. After reading two texts on the same topic, ask yourself:

  • What is the author's point of view in each of these articles?
  • Why do you think that the points of view presented are so different?
  • How much influence did each author's point of view have on you?

1A. Evaluate a Scientific Author's Qualifications

  • Examine the primary source of information . Is there a reference to the source of information? If not, it cannot be verified. If so, is the source reputable?
  • Examine the reputation of the author . Do the author(s) have training in science? If so, have they had formal training leading to an advanced degree such as a Master's degree or doctorate, and have they published widely in reputable journals? If not, then are they working with a reputable scientist(s) to evaluate the data?
  • Does the discoverer say that a powerful establishment is trying to suppress his or her work? Often, the discoverer describes mainstream science as part of a larger conspiracy that includes industry and government. The idea is that the establishment will presumably stop at nothing to suppress discoveries that might shift the balance of wealth and power in society. This is not how science actually works. Science is an open and international enterprise focused on uncovering true descriptions of reality.
  • Determine if the work was published in a peer-reviewed journal . Peer review is the standard process for scientific publications. Peer-reviewed manuscripts have been read by several scholars in the same field (called peers), and these peers have indicated that the experiments and conclusions meet the standards of their discipline and are suitable for publication. In the absence of peer review, the significance and quality of the data cannot be assessed.
  • Has the discovery been pitched directly to the media? The integrity of science rests on the willingness of scientists to expose new ideas and findings to the scrutiny of other scientists. Thus, scientists expect their colleagues to reveal new findings to them initially. An attempt to bypass peer review by taking a new result directly to the media, and thence to the public, suggests that the work is unlikely to stand up to close examination by other scientists.
  • Check if the journal has a good reputation for scientific research . If a peer-reviewed paper is cited, where was it published? Is the journal widely respected? One tool that is commonly used for ranking, evaluating, categorizing, and comparing journals is the frequency with which the "average article" in a journal has been cited in a particular year or period. The frequency of citation reflects acknowledgment of importance by the scientific community. High-impact and widely respected journals include Science and Nature. Therefore, a citation in Science generally suggests scholarly acceptance, whereas publication in a nonscientific or little-known journal does not.
  • Determine if there is an independent confirmation by another published study . Even if a study is peer-reviewed and published in a reputable journal, independent assessment is critical to confirm or extend the findings. Even the best journals or scientists will occasionally make mistakes and publish papers that are later retracted. Sometimes there may be outright fabrication that is overlooked by the reviewers and not detected until later. In other cases, the scientific report may be accurate but its significance may be misrepresented by the media. Although it is a slow process to establish a scientific "truth," a particular scientific conclusion will eventually either gain broad acceptance or be discarded.
  • Assess whether a potential conflict of interest exists . Most of the high-impact journals require a conflict of interest statement on the first page of an article.
  • Assess the quality of the institution or panel . Does the report emanate from a university accredited by the U.S. Department of Education or an equivalent body? Such information is generally more reliable than information issued by a single individual on the web. In the United States, government research arms such as the National Science Foundation and the National Institutes of Health, as well as professional scientific societies, generally provide up-to-date, high-quality information.

2. Evaluate the Degree of Truth in Information

After analyzing to identify the different kinds of information, we must be explicit about the quality of each piece of information used in the text or our own thinking. Using the highest quality information in arguments increases the degree of belief in the truth of the argument. We must acknowledge when poor quality information is used in an argument and clearly state that we have low confidence in the truth of the argument.

  • Search for information that opposes your position as well as information that supports it
  • Make sure that all information used is clear, accurate, and relevant to the question at issue
  • Make sure you have gathered sufficient information
  • We can have the most confidence in facts that have been confirmed by many different independent observers.

A scientist's perspective on facts

In everyday language, most of us consider a confirmed fact to be truth. Scientists, however, consider all truth to be provisional: current facts serve as a description of truth only for the time being. Scientists assume that all knowledge has the potential to be overturned if new information suggests that it should be. They use uncertainty and percent confidence to describe the statistical likelihood that a fact is true.

Physicist Richard P. Feynman once said, "In physics and in human affairs... whatever is not surrounded by uncertainty, cannot be the truth." He said this in reference to a newspaper article that asserted absolute belief in a scandalous rumor regarding a colleague. He observed that a responsible reporter should have referred to an "alleged incident." With no reference to a process that had first evaluated the quality of the truth, he considered the accusation to be opinion, not fact.

  • Is a particular measurement 78 ± 50 or 78 ± 1 meters? As you can see, the uncertainty deeply affects how you will use that information.
  • It is a scientific formalism that any measurement missing a stated uncertainty has an uncertainty of ±1 in the last significant digit. Therefore, 78 meters is understood to be 78 ± 1 meters and 78.0 meters is 78.0 ± 0.1 meters.
  • "The crash test results indicate a 98% chance that a head-on collision will kill you. As a professional scientist I cannot say that a head-on collision will kill you."

This last example highlights the fact that all scientific information is actually a statement of probability. Nothing in science is ever "proven" or "100% certain," so always avoid saying that science has proven something; this is a discipline-specific error in reasoning commonly made by non-scientists. Non-scientists sometimes misinterpret the uncertainty that scientists attach to every fact. If there is 95% confidence that climate change is being caused by human activity, people with a psychological bias against taking action may focus on the 5% uncertainty in the truth value. On the other hand, people who are convinced of this fact and want to take action get frustrated that scientists refuse to say it has been proven with certainty. In practice, 95% confidence is the gold standard for a complex phenomenon being "as good as proven," but scientists always keep open the possibility that they don't have all the data and that a fact may turn out to be more nuanced or simply wrong.
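What a confidence level means can be made concrete with a short simulation. The following sketch is my own illustration (the distribution, sample size, and seed are assumptions, not from the text): it draws many small samples from a population with a known mean and counts how often a roughly 95% confidence interval actually contains that mean. Even a correctly constructed procedure is expected to "miss" about one time in twenty.

```python
import random

# Illustrative sketch: repeated ~95% confidence intervals for a known mean.
random.seed(42)
TRUE_MEAN = 10.0
trials, covered = 2000, 0
for _ in range(trials):
    sample = [random.gauss(TRUE_MEAN, 2.0) for _ in range(30)]
    mean = sum(sample) / len(sample)
    var = sum((x - mean) ** 2 for x in sample) / (len(sample) - 1)
    half_width = 1.96 * (var / len(sample)) ** 0.5  # normal approximation
    if mean - half_width <= TRUE_MEAN <= mean + half_width:
        covered += 1
print(f"{covered / trials:.1%} of intervals contained the true value")
```

The printed coverage lands near, but not exactly at, 95%: the residual misses are not a flaw in the method, they are the stated uncertainty.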

Comparing and Contrasting Information

By comparing and contrasting information, you can identify facts, make inferences, and draw conclusions that would not otherwise be possible. After reading two texts, ask yourself:

  • How do the articles differ in the information each one presents?
  • Are the articles different in how they present information?
  • Does the information appear to be complete and accurate? Why or why not?

3. Evaluate assumptions

[Unfinished]

Contrasting Assumptions

If two sides are arguing from different assumptions, it is very effective to focus on these in critical evaluation. Controversies generally rest on different sides interpreting the same information through different assumptions.

Assumptions can be justified or unjustified, depending on whether or not we have good reasons for them. Likewise, if two sides of a controversy share an assumption that is found faulty, both arguments become invalid.

  • Ethan Nadelmann, founder and executive director of the Drug Policy Foundation, argues that law enforcement officials are overzealous in prosecuting individuals for marijuana possession citing that 87% of marijuana arrests are for possession of small amounts.
  • The Office of National Drug Control Policy (ONDCP) contends that marijuana is not a harmless drug and must remain restricted. Besides causing physical problems, marijuana affects academic performance and emotional adjustment.
  • Underlying both arguments is the assumption that adults cannot be permitted to make their own decisions about the use of particular drugs. A libertarian who worries about governmental restrictions on personal liberty would immediately recognize this shared deep assumption and challenge it. If convincingly challenged, both arguments lose validity.

4. Evaluate reasoning

When an argument doesn't "feel" right, first analyze it as follows. Write down the information that forms each premise of the argument and categorize it. Write down the conclusion and label it. Write your best general description of the reasoning that links them; the mechanics of the reasoning are usually found in a "therefore" type statement. To unmask the logic, replace the premise statements with letters that represent concepts and properties. Example: "It's raining and the sun is shining, therefore it's raining." The logical form is "X has property Q and P, therefore X has property Q". The logic is valid. [I will link some more examples later.]
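The letter-substitution step above can even be checked mechanically. This Python sketch (an illustration I am adding, not part of the original method) brute-forces every truth assignment to test whether a logical form is valid, contrasting the valid form from the example with a classic invalid one:

```python
from itertools import product

def valid(form):
    """A propositional form is valid when it is true under every truth assignment."""
    return all(form(p, q) for p, q in product([True, False], repeat=2))

# "X has property Q and P, therefore X has property Q": (Q and P) -> Q
conjunction_elim = lambda p, q: not (q and p) or q

# A fallacious contrast, affirming the consequent: "If P then Q; Q; therefore P."
affirm_consequent = lambda p, q: not ((not p or q) and q) or p

print(valid(conjunction_elim))   # valid: true under every assignment
print(valid(affirm_consequent))  # invalid: fails when P is false and Q is true
```

Enumerating assignments is feasible only for small forms, but it makes "valid" a checkable property rather than a feeling.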

4A. Logical Fallacies

Fallacies are faulty reasoning used in the construction of an argument. This topic is so vast that I have created a separate fallacies of reasoning page. The identification of fallacious reasoning invalidates an argument, and we are then forced to formulate our own arguments to uncover the truth.

4B. Evaluate Propaganda

Propaganda is information that is not objective and is used primarily to influence an audience toward a specific conclusion. Propaganda attempts to arouse emotions to short-circuit rational judgment. It is not by definition "good" or "bad"; however, its use indicates a possible intent to, at worst, mislead or, at best, persuade without the use of reasoning. The techniques of propaganda are utilized in some logical fallacies, and you will find some conceptual overlap. The following is a list of common propaganda techniques:

  • Hacking Identity: The Pride, Fear, Outrage, Hatred Formula. Critically examine when identity categories become significant to an argument. In some cases it may be appropriate, in others it may be an emotionally manipulative red herring. Example: In recent years, the Russian government has planted appeals to pride to amplify difference and strengthen online social communities. This is then followed by stories designed to invoke fear and outrage. The effort is apparently designed to "hack" the minds of people in democratic nations into feeling disillusioned with social and political institutions.
  • Stereotyping. People or objects are lumped together under simplistic labels, also called labeling. Example: Blonde women are beautiful, but dumb.
  • Overgeneralizations . Treating a complex general thing as if it were a concrete thing. Example: "The UN's bureaucracy has forsaken its commitment..." or "The City extends strike deadline."

4C. Evaluate Cognitive Biases

A cognitive bias is a cognitive shortcut that leads to a loss of objectivity. Cognitive biases can lead to irrational thought through distortions of perceived reality, inaccurate judgment, or illogical interpretation. By learning about some of the most common biases, you can learn how to avoid falling victim to them. The identification of cognitive biases at work in an argument should make you skeptical. Like fallacies, this topic is so vast that I have created a separate cognitive biases page to explain them.

5. Evaluate Judgments and Conclusions

After you read an article, you should be able to answer these questions:

  • What judgments and conclusions were drawn by the author of this article?
  • Are there faults of reasoning that make the drawn conclusion unjustified?
  • Does the drawn conclusion challenge your assumptions?
  • What other conclusions could be drawn using the same information?
  • What other information might be important to know before making any judgment on the value and importance of this text?

6. Predict Future Implications and Consequences

Whether the reasonable future implications and consequences of a conclusion or judgment align with your values should inform your reasoning.

  • Trace the implications and consequences that follow from your reasoning
  • Search for negative as well as positive implications
  • Attempt to consider all possible consequences

Fallacies of Reasoning

Fallacies are faulty reasoning used in the construction of an argument . They make an argument appear to be better than it is. Here are some major fallacies of reasoning that you should be able to recognize. All of the following fallacies are known as informal fallacies because they originate in errors of content rather than form. In contrast, formal fallacies , also known as non sequiturs, arise from the logical form of the argument. The following article introduces the most common fallacies.

In this video example, we see rapid-fire deployment of straw man, false dichotomy, and some formal fallacies on a kid who, impressively, recognizes each flaw of reasoning.

Identifying fallacies

Remember that arguments begin with premises that are related to each other using valid forms of reasoning to arrive at a logical conclusion .

Once you have analyzed the parts of an argument, evaluate:

Is the reasoning faulty?

  • If the error in the argument is in the logical connection between two premises in drawing a conclusion it is likely to be a formal fallacy, also known as a non sequitur.
  • If the truth revealed by the conclusion is a cause-effect relationship, it may be a questionable cause fallacy.
  • Does the reasoning neglect many other possibilities? The argument might be a false dilemma or slippery slope fallacy.

Is/are the premise(s) faulty?

  • If the premise of the argument must assume the conclusion to be true then read the section on improper premise fallacies.
  • If weak premises and incomplete information lead to a strong conclusion, the argument contains a weak premise fallacy, also known as a faulty generalization.

Are the premises and/or the arguments a distraction from the actual issue in question?

  • If any part of the argument is irrelevant to the actual issue, a relevance fallacy or red herring is at work.

Are you still not able to identify the error in reasoning?

  • Consult the comprehensive list of fallacies at Wikipedia or ask your instructor for assistance.

Formal Fallacies (Non Sequiturs)

An error in the argument's form. Invalid logic is applied to the premises.

Fallacy fallacy. This is the inference that an argument containing a fallacy must have a false conclusion. It is entirely possible for someone to pose a bad argument for something that is true. Try not to get so caught up in identifying logical fallacies that you are quick to dismiss a flawed argument's conclusion; instead, try to repair the argument into a reasonable one.

  • Example: "Some of your key evidence is missing, incomplete, or even faked! That proves I'm right!"

Syllogistic fallacies. There are many kinds of these. Syllogisms are generally three step arguments that use two premises to derive a conclusion. The premises and conclusion all take the form of categorical propositions that somehow relate two categories. These fallacies derive from incorrect application of logic. These fallacies are often more obvious if you draw a Venn diagram of the categories and shared features.

  • Example: "All birds have beaks. That creature has a beak. Therefore, that creature is a bird."
  • Form: All Z is B. This Y is B. Therefore, this Y is Z.
  • Problem: B cannot be generalized as an exclusive feature of Z. Y could be an octopus.
  • Example: "People in Kentucky support a border fence. People in New York do not support a border fence. Therefore, people in New York do not support people in Kentucky."
  • Form: All Z is B. All Y is not B. Therefore, all Y is not Z.
  • Problem: From the lack of shared B, nothing more can be logically implied about the features of either Z or Y. Z and Y may in fact agree on the desired outcomes for the question at issue but disagree over the means for achieving the outcomes.
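The Venn-diagram advice above can also be modeled with sets. In this hypothetical sketch, the creatures are illustrative stand-ins for the categories Z, B, and Y: "all Z is B" means Z is a subset of B, but membership in the larger set never implies membership in the smaller one.

```python
# "All birds have beaks" = the set of birds is a subset of beaked creatures.
birds = {"sparrow", "eagle", "duck"}
beaked_creatures = birds | {"octopus", "squid"}  # beaks are not exclusive to birds

assert birds <= beaked_creatures       # premise: all birds have beaks
creature = "octopus"
assert creature in beaked_creatures    # premise: that creature has a beak
print(creature in birds)               # False: the syllogism's conclusion fails
```

Both premises hold, yet the creature is not a bird, which is exactly the gap the fallacious syllogism papers over.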

Informal Fallacies

The proposed conclusion is not supported by the premises.

Whereas formal fallacies can be identified by form, informal fallacies are identified by examining the argument's content. There are many subcategories.

Improper Premise Fallacies

Any form of argument in which the conclusion occurs as one of the premises.

Begging the question. Providing what is essentially the conclusion of the argument as a premise. You assume without proof the stand/position that is in question. To "beg the question" is to put forward an argument whose validity requires that its own conclusion is true. Formally, begging the question statements are not structured as an argument and are harder to detect than circular arguments. Some authors consider circular reasoning to be a special case of begging the question. In the following examples, notice that the question at issue answers itself without argument.

  • Example: "This whole abortion debate about when human life begins is ridiculous. We should be thinking about the rights of the baby."
  • The question at issue: Should we examine when rights begin under the law? Premise: Rights begin after a baby is born. Conclusion: The debate is ridiculous.

Circular reasoning. Formally, circular reasoning differs from begging the question by specifically referring to arguments in which the reasoner simply repeats what they already assumed beforehand in different words without actually arriving at any new conclusion. Circular reasoning is not persuasive because a listener who doubts the conclusion will also doubt the premise that leads to it. This may sound silly, but people make such statements quite often when put under pressure.

  • "Whatever is less dense than water will float, because such objects don't sink in water."
  • "Of course smoking causes cancer. The smoke from cigarettes is a carcinogen."
  • "The rights of the minority are every bit as sacred as the rights of the majority, for the majority's rights have no greater value than those of the minority."
  • "Everyone wants the new iPhone because it is the hottest new gadget on the market!"
  • Note that this could be factually true if popularity were the sole driver of consumer desire for the new iPhone. Even so, it is still circular reasoning, because the phone's popularity must be explainable by reasons other than the conclusion.
  • Video example

Loaded question . Asking a question that has an assumption built into it so that it can't be answered without appearing guilty.

  • Example: Prosecutor to defendant: "So how did you feel when you murdered your wife?"
  • The question at issue: Did the defendant murder his wife? Premise: "you murdered your wife." Conclusion: "you murdered your wife." Possible responses: Any answer the defendant gives to "how did you feel?" could be construed as an admission that he murdered his wife. The best response is to point out the fallacy and refuse to answer the question as stated.

Weak Premise Fallacies

These reach a conclusion from weak premises. Unlike fallacies of relevance, the premises are related to the conclusions and yet only weakly support the conclusions. A faulty generalization is thus produced.

Cherry Picking / Card Stacking. The presentation of only that information or those arguments most favorable to a particular point of view.

  • Example: "I'm a really good driver. In the past thirty years, I have gotten only four speeding tickets." (What other kind of tickets has he gotten? How long has he been driving?)

Faulty/Weak analogy. Comparison is carried too far, or the things compared have nothing in common.

  • Example: Apples and oranges are both fruit. Both grow on trees. Therefore, apples and oranges taste the same.

Hasty Generalization (from an Unrepresentative Sample). A judgment is made on the basis of inaccurate or insufficient evidence. Hasty generalizations are extremely common because there is often no agreement about what constitutes sufficient evidence. Generalizing from one person's experience is a common example of this fallacy.

  • Example: "My grandfather smoked four packs of cigarettes a day since age fourteen and lived until age ninety-two. Therefore, smoking really can't be that bad for you."
  • Example: "Ducks and geese migrate south for the winter. Therefore, all water-fowl migrate south for the winter."

No True Scotsman . Making what could be called an appeal to purity as a way to dismiss relevant criticisms or flaws of an argument.

  • Example: Angus declares that Scotsmen do not put sugar on their porridge, to which Lachlan points out that he is a Scotsman and puts sugar on his porridge. Furious, like a true Scot, Angus yells that no true Scotsman sugars his porridge.

Questionable Cause Fallacies

The primary basis for these errors is either inappropriate deduction (or rejection) of causation or a broader failure to properly investigate the cause of an observed effect.

Correlation Without Causation / Cum Hoc. A faulty assumption that, because there is a correlation between two variables, one caused the other.

  • Coincidence. The two variables aren't related at all, but correlate by chance.
  • Third Cause. A third factor is the cause of the correlation. Example: Young children who sleep with the light on are much more likely to develop myopia in later life; therefore, sleeping with the light on causes myopia. (In 1999, this conclusion was popularized by the media from a study containing such a correlation. It is more likely that myopia has a genetic cause and that myopic parents use nightlights because they have poor night vision without their glasses.)
  • Wrong Direction. Cause and effect are reversed. Example: The faster windmills are observed to rotate, the more wind is observed; therefore, wind is caused by the rotation of windmills. Real-life example: When a country's debt rises above 90% of GDP, growth slows; therefore, high debt causes slow growth. (Slow growth may instead be what drives debt up relative to GDP.)
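The third-cause pattern is easy to reproduce with synthetic data. In this sketch (entirely my own illustration; the variable names merely echo the myopia example), a hidden factor drives both observed variables, producing a strong correlation even though neither causes the other:

```python
import random

# A hidden "third cause" H feeds into both X and Y.
random.seed(0)
n = 5000
hidden = [random.gauss(0, 1) for _ in range(n)]          # e.g. parents' myopia
nightlight = [h + random.gauss(0, 0.5) for h in hidden]  # observed variable X
myopia = [h + random.gauss(0, 0.5) for h in hidden]      # observed variable Y

def corr(a, b):
    """Pearson correlation coefficient, computed from scratch."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((i - ma) * (j - mb) for i, j in zip(a, b))
    sa = sum((i - ma) ** 2 for i in a) ** 0.5
    sb = sum((j - mb) ** 2 for j in b) ** 0.5
    return cov / (sa * sb)

print(f"correlation: {corr(nightlight, myopia):.2f}")  # strong, with no direct causation
```

The correlation is strong by construction of the shared hidden factor, which is precisely why correlation alone cannot distinguish among direct causation, reverse causation, and a third cause.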

Gambler's Fallacy. The incorrect belief that separate, independent events can affect the likelihood of another random event.

  • Example: After having multiple children of the same sex, some parents may believe that they are due to have a child of the opposite sex. (In reality, the probability is still 0.5.)
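The independence claim is easy to verify by simulation. This short sketch (my own illustration, using a fair coin in place of births) checks the outcome immediately after every run of three heads; the streak has no memory, so the rate stays near one half:

```python
import random

# Simulate a fair coin and condition on runs of three heads.
random.seed(1)
flips = [random.choice("HT") for _ in range(200_000)]
after_streak = [flips[i + 3] for i in range(len(flips) - 3)
                if flips[i:i + 3] == ["H", "H", "H"]]
rate = after_streak.count("H") / len(after_streak)
print(f"P(heads | three heads in a row) = {rate:.3f}")  # close to 0.5
```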

False Cause / Post Hoc. Treating coincidence of one event following another as causation.

  • Example: Every time we wash our car, it rains. Therefore, if we wash our car today, it will rain.
  • Example: Specific vaccinations are given at the same age that obvious symptoms of autism typically manifest. When some parents see their children diagnosed with autism shortly after receiving vaccinations, they assume that the vaccinations caused the autism (even though the symptoms would have appeared at that age regardless).

Single Cause Fallacy / Causal Oversimplification. It is assumed that there is one simple cause of an outcome when in reality it may have been produced by a number of jointly sufficient causes or by a third cause.

  • Example: The "Gateway Drug Theory" argues that marijuana usage leads to usage of harder drugs and has been a major justification for highly restrictive marijuana laws. However, the same data could be explained by marijuana simply being easier to obtain, and therefore more likely to be the first drug tried by people who were already likely to become hard-drug users for other reasons, such as genetic factors or the simple illegality of marijuana making it attractive to risk-taking people.
  • Example: Traffic fatalities were cut when the highway speed limit was reduced to 55 mph. Therefore, the lower speed limit has resulted in safer highways. (The fact that people were driving less and that seat belt laws were also passed may be equally or more important.)

Relevance Fallacies

These are distractions from the argument, typically with some distracting sentiment that seems relevant but isn't really on topic. Red herrings are a specific sub-category of relevance fallacy distinguished by an intent to mislead, often due to the lack of a real argument.

Ad Hominem Argument . Rejection of a person's view on the basis of personal characteristics, background, physical appearance, or other features irrelevant to the argument at issue. Pay close attention to words that question an opponent's character. Examples: slob, prude, moron, embarrassing, stubborn.

Ambiguity . Using double meanings or other ambiguities of language to mislead or misrepresent the truth. Meaning in language can be so slippery that there are at least a dozen sub-fallacies including ambiguous grammar, equivocation, and quoting out of context (a tactic most often encountered on the Internet).

Appeal to Authority. This fallacy happens when we misuse an authority. The misuse can occur in a number of ways: we can cite only authorities, steering conveniently away from other testable and concrete evidence as if expert opinion were always correct, or we can cite irrelevant authorities, poor authorities, or false authorities.

Appeal to Emotion. The use of non-objective words, phrases, or expressions that arouse emotion, with the effect of short-circuiting reason. Common examples include appeals to fear, flattery, outrage, pity, pride, ridicule of an opponent's argument, spite, and wishful thinking. Emotional appeals are also a powerful tool in propaganda.

  • Example: A commercial for a security company that shows someone breaking into a home in the middle of the night.
  • Example: "Any intelligent person knows... " (appeal to pride).

Appeal to Nature. Any argument that assumes "natural" things are "good" and "unnatural" things are "bad" is flawed because concepts of the natural, good, and bad are all vague and ambiguous. The person creating the argument can define these in any way that supports their position. Appeals to Nature also employ the begging the question fallacy (above).

  • Example: This tobacco ad claims that the product is more natural and thus better for you.
  • Example: This ad attempts to convince the reader that margarine, one of the most processed foods in a grocery store, is natural and aligns with the reader's assumed yearning for a simpler, better life in the country.
  • The marketing copy for products in a store like Whole Foods is rife with appeals to nature. Practice spotting them.

Argument from ignorance / burden of proof. This fallacy asserts that a proposition is true because it has not yet been proven false, or that a proposition is false because it has not yet been proven true. This type of argument asserts a truth and shifts the burden of providing counter-evidence onto someone else. Logically, we should remain skeptical and demand legitimate evidence from the person asserting the proposition.

  • Example of two contradictory positions using this fallacy: "No one has ever been able to prove definitively that extra-terrestrials exist, so they must not be real." "No one has ever been able to prove definitively that extra-terrestrials do not exist, so they must be real."

Argument from incredulity (appeal to common sense) . Claiming that because one finds something difficult to understand, it must not be true.

Association fallacy. Inferring either guilt or honor by association. It is an irrelevant attempt to transfer the qualities of one thing to another by merely invoking them together. Sometimes fallacies of this kind may also be appeals to emotion, hasty generalizations, and/or ad hominem arguments.

  • Example: An attractive spokesperson will say that a specific product is good. The attractiveness of the spokesperson gives the product good associations.
  • Example: "Galileo was ridiculed in his time but later acknowledged to be right. Likewise, Dr. Andrew Wakefield's work demonstrating that vaccines cause autism will later be recognized as correct too." (Taking an unpopular position is no guarantee of its correctness. Additionally, the two scenarios are not comparable. Galileo was ridiculed by the Catholic Church. His scientific peers generally confirmed his work. In contrast, Dr. Wakefield's scientific peers have failed to replicate his observations and have invalidate his conclusions based on methodological flaws. The source of negative public opinion around Dr. Wakefield derives from valid expert criticism.)

Bandwagon / FOMO. The fear of being "different" or "missing out" is used to influence behavior.

  • Example: "Twenty million people jog for their health. Shouldn't you?

Genetic fallacy . Judging something good or bad on the basis of where it comes from, or from whom it comes.

  • Example: "You're not going to wear a wedding ring, are you? Don't you know that the wedding ring originally symbolized ankle chains worn by women to prevent them from running away from their husbands? I would not have thought you would be a party to such a sexist practice." There are numerous motives explaining why people choose to wear wedding rings, but it would be a fallacy to presume those who continue the tradition are promoting sexism. (page 196 of ref)

Ignoring The Question. Digression, obfuscation, or similar techniques are used to avoid answering a question.

  • Example: When asked about the possibility of a tax increase, a senator replies: "I have always met my obligations to those I represent."

Missing the point / Irrelevant Conclusion. Presenting an argument that may or may not be logically valid and sound, but whose conclusion fails to address the issue in question.

  • Example: The Chewbacca Defense from South Park .

Straw Man Argument. Appearing to refute an opponent's argument by instead creating an oversimplified or extreme version of the argument (a "straw man") and refuting that instead.

Texas sharpshooter . A conclusion is drawn from data with a stress on similarities while ignoring differences. An example is seeing localized patterns where none exist. The name comes from a joke about a Texan who fires some gunshots at the side of a barn, then paints a target centered on the tightest cluster of hits and claims to be a sharpshooter.
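A small simulation (with entirely made-up data) shows why this fallacy is so seductive: even purely random, patternless "gunshots" produce apparent clusters, and drawing a target around one after the fact proves nothing.

```python
# Illustration with hypothetical data: uniform random "gunshots" on a wall
# still produce apparent clusters purely by chance.
import random

random.seed(42)
shots = [(random.random(), random.random()) for _ in range(200)]

# Paint a 5x5 grid on the wall and count shots per cell.
counts = {}
for x, y in shots:
    cell = (int(x * 5), int(y * 5))
    counts[cell] = counts.get(cell, 0) + 1

expected = 200 / 25  # 8 shots per cell if spread perfectly evenly
busiest = max(counts.values())
print(f"Expected per cell: {expected}, busiest cell: {busiest}")
# Some cell always holds noticeably more than the average by chance alone;
# circling that cell and claiming marksmanship is the Texas sharpshooter fallacy.
```

The same logic applies to spotting "cancer clusters" or hot streaks in data: the question is whether the cluster exceeds what chance alone would produce, which requires deciding on the target before looking at the hits.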

Tu Quoque Fallacy. Latin for "you too," this fallacy is also called the "appeal to hypocrisy" because it distracts from the argument by pointing out hypocrisy in the opponent. This tactic doesn't prove one's point, because even hypocrites can tell the truth.

Informal Fallacies with Multiple Structural Problems

Composition / Division . The fallacy of composition infers that something is true of the whole from the fact that it is true of a part of the whole. The opposite reasoning is the fallacy of division.

False dilemma / false dichotomy / black and white. Reducing an issue to only two possible decisions.

  • Example: Either we go to war, or we appear weak.

Middle ground / false compromise / argument to moderation . Arguing that a compromise, or middle point, between two extremes is the truth.

  • Example: Holly said that vaccinations caused autism in children, but her scientifically well-read friend Caleb said that this claim had been debunked and proven false. Their friend Alice offered a compromise that vaccinations cause some autism. (ref)

Slippery Slope. Moving from a seemingly benign premise or starting point and working through a number of small steps to an improbable extreme, when many other outcomes could have been possible. Although this form of slippery slope is a sub-type of the formal appeal to probability fallacy (it assumes something will occur based on probability and thus breaks the rules of formal logic), slippery slope arguments can take on many other forms and are generally categorized as informal fallacies.

  • Video examples: Don't Wake Up In A Roadside Ditch commercial and the children's book If You Give a Mouse a Cookie .

Special pleading . Moving the goalposts to create exceptions when a claim is shown to be false. Applying a double standard, generally to oneself.

"The first principle is that you must not fool yourself – and you are the easiest person to fool." –Richard Feynman

As we examine our assumptions and improve our mental models , we have to confront the reality that we all have inescapable hardwired biases that we cannot change through critical thinking alone. Because we all share them, science can teach us a lot about our biases. Biases are an inescapable feature of being human, and no training will stop you from committing them. However, learning about them can help you second-guess the validity of your judgment, think more critically, consider other points of view, and develop empathy for the biases in others.

The operating system of our brains uses biologically evolved shortcuts in our thinking. Many of these shortcuts are useful and essential. However, we have also inherited bugs in the code that make many of our judgments irrational. A cognitive bias is a cognitive shortcut that leads to a loss of objectivity. By learning about some of the most common biases, you can learn how to avoid falling victim to them. For example, many of the biases below occur because the brain tends to find patterns where none exist, and uses irrational shortcuts to reduce cognitive dissonance when stressed with contradictory ideas. To learn more, I recommend reading Thinking Fast and Slow and You Are Not So Smart .

Common Cognitive Biases

Anchoring . The first thing you judge influences your judgment of all that follows.

Human minds are associative in nature, so the order in which we receive information helps determine the course of our judgments and perceptions. For instance, the first price offered for a used car sets an 'anchor' price which will influence how reasonable or unreasonable a counter-offer might seem. Even if we feel like an initial price is far too high, it can make a slightly less-than-reasonable offer seem entirely reasonable in contrast to the anchor price.

Be especially mindful of this bias during financial negotiations, such as those over houses, cars, and salaries. The initial price offered has been shown to have a significant effect.

Availability heuristic . Your judgments are influenced by what springs most easily to mind.

How recent, emotionally powerful, or unusual your memories are can make them seem more relevant. This, in turn, can cause you to apply them too readily. For instance, when we see news reports about homicides, child abductions, and other terrible crimes it can make us believe that these events are much more common and threatening to us than is actually the case.

Try to gain different perspectives and relevant statistical information rather than relying purely on first judgments and emotive influences.

Barnum effect . You see personal specifics in vague statements by filling in the gaps (e.g. interpreting your horoscope).

Because our minds are given to making connections, it's easy for us to take nebulous statements and find ways to interpret them so that they seem specific and personal. The combination of our egos wanting validation with our strong inclination to see patterns and connections means that when someone is telling us a story about ourselves, we look to find the signal and ignore all the noise.

Psychics, astrologers and others use this bias to make it seem like they're telling you something relevant. Consider how things might be interpreted to apply to anyone, not just you.

Belief bias . You are more likely to accept an argument that supports a conclusion aligned with your own values, beliefs, and prior knowledge, while rejecting counter-arguments to that conclusion.

It's difficult for us to set aside our existing beliefs to consider the true merits of an argument. In practice this means that our ideas become impervious to criticism, and are perpetually reinforced. Instead of thinking about our beliefs in terms of 'true or false,' it's probably better to think of them in terms of probability. For example, we might assign a 95%+ chance that thinking in terms of probability will help us think better, and a less than 1% chance that our existing beliefs have no room for any doubt. Thinking probabilistically forces us to evaluate more rationally.

A useful thing to ask is 'when and how did I get this belief?' We tend to automatically defend our ideas without ever really questioning them.
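The probabilistic framing suggested above can be sketched as a simple Bayesian update. The prior and likelihood numbers below are hypothetical, chosen only to illustrate how a rational thinker revises confidence when evidence arrives:

```python
# Bayesian update: revise a belief's probability in light of new evidence.
# All numbers here are hypothetical, chosen for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(belief | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Suppose we hold a belief with 70% confidence, then observe evidence that
# is twice as likely if the belief is false (0.6) as if it is true (0.3).
posterior = bayes_update(prior=0.70, p_evidence_if_true=0.3, p_evidence_if_false=0.6)
print(round(posterior, 3))  # confidence drops from 0.70 to about 0.538
```

Belief bias is, in this framing, the refusal to lower the number when the evidence calls for it.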

Belief perseverance . When some aspect of your core beliefs is challenged, it can cause you to believe even more strongly.

We can experience being wrong about some ideas as an attack upon our very selves, or our tribal identity. This can lead to motivated reasoning, which causes a reinforcement of beliefs despite disconfirming evidence. Recent research shows that the backfire effect certainly doesn't happen all the time. Most people will accept a correction relating to specific facts; however, the backfire effect may reinforce a related or 'parent' belief as people attempt to reconcile a new narrative in their understanding.

"It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so." —Mark Twain

Bystander effect . You presume someone else is going to do something in an emergency situation.

When something terrible is happening in a public setting we can experience a kind of shock and mental paralysis that distracts us from a sense of personal responsibility. The problem is that everyone can experience this sense of deindividuation in a crowd. This same sense of losing our sense of self in a crowd has been linked to violent and anti-social behaviors. Remaining self-aware requires some amount of effortful reflection in group situations.

If there's an emergency situation, presume to be the one who will help or call for help. Be the change you want to see in the world.

Confirmation bias . You favor things that confirm your existing beliefs.

We are primed to see and agree with ideas that fit our preconceptions, and to ignore and dismiss information that conflicts with them. You could say that this is the mother of all biases, as it affects so much of our thinking through motivated reasoning. To help counteract its influence we ought to presume ourselves wrong until proven right. "When you are studying any matter, or considering any philosophy, ask yourself only what are the facts and what is the truth that the facts bear out. Never let yourself be diverted either by what you wish to believe or by what you think would have beneficent social effects if it were believed." – Bertrand Russell

Think of your ideas and beliefs as software you're actively trying to find problems with rather than things to be defended.

Curse of knowledge . Once you understand something you presume it to be obvious to everyone.

Things make sense once they make sense, so it can be hard to remember why they didn't. We build complex networks of understanding and forget how intricate the path to our available knowledge really is. This bias is closely related to the hindsight bias, wherein you will tend to believe that an event was predictable all along once it has occurred. We have difficulty reconstructing our own prior mental states of confusion and ignorance once we have clear knowledge.

When teaching someone something new, go slow and explain like they're ten years old (without being patronizing). Repeat key points and facilitate active practice to help embed knowledge.

Declinism . You remember the past as better than it was, and expect the future to be worse than it will likely be.

Despite living in the most peaceful and prosperous time in history, many people believe things are getting worse. The 24-hour news cycle, with its reporting of overtly negative and violent events, may account for some of this effect. We can also look to the generally optimistic view of the future in the early 20th century as having shifted to a dystopian and apocalyptic expectation after the world wars, and during the Cold War. The greatest tragedy of this bias may be that our collective expectation of decline may contribute to a real-world self-fulfilling prophecy.

Instead of relying on nostalgic impressions of how great things used to be, use measurable metrics such as life expectancy, levels of crime and violence, and prosperity statistics.

Dunning-Kruger effect . The more you know, the less confident you're likely to be. The less you know, the more confident you are likely to be.

Because experts know just how much they don't know, they tend to underestimate their ability; but it's easy to be over-confident when you have only a simple idea of how things are. Try not to mistake the cautiousness of experts as a lack of understanding, nor to give much credence to lay-people who appear confident but have only superficial knowledge.

"The whole problem with the world is that fools and fanatics are so certain of themselves, yet wiser people so full of doubts." —Bertrand Russell

Framing effect . You allow yourself to be unduly influenced by context and delivery.

We all like to think that we think independently, but the truth is that all of us are, in fact, influenced by delivery, framing and subtle cues. This is why the ad industry is a thing, despite almost everyone believing they're not affected by advertising messages. The phrasing of how a question is posed, such as for a proposed law being voted on, has been shown to have a significant effect on the outcome.

Only when we have the intellectual humility to accept the fact that we can be manipulated, can we hope to limit how much we are. Try to be mindful of how things are being put to you.

Fundamental attribution error . You judge others on their character, but yourself on the situation.

If you haven't had a good night's sleep, you know why you're being a bit slow; but if you observe someone else being slow you don't have such knowledge, and so you might presume them to just be a slow person. Because of this disparity in knowledge, we often overemphasize the influence of circumstance on our own failings, while underestimating circumstantial factors in explaining other people's problems.

It's not only kind to view others' situations with charity, it's more objective too. Be mindful to also err on the side of taking personal responsibility rather than justifying and blaming.

Groupthink . You let the social dynamics of a group situation override the best outcomes.

Dissent can be uncomfortable and dangerous to one's social standing, and so often the most confident or first voice will determine group decisions. Because of the Dunning-Kruger effect, the most confident voices are also often the most ignorant.

Rather than openly contradicting others, seek to facilitate objective means of evaluation and critical thinking practices as a group activity.

In-group bias . You unfairly favor those who belong to your group.

We presume that we're fair and impartial, but the truth is that we automatically favor those who are most like us, or belong to our groups. This blind tribalism has evolved to strengthen social cohesion, however in a modern and multicultural world it can have the opposite effect.

Try to imagine yourself in the position of those in out-groups; whilst also attempting to be dispassionate when judging those who belong to your in-groups.

Just world hypothesis . Your preference for justice makes you presume it exists.

A world in which people don't always get what they deserve, hard work doesn't always pay off, and injustice happens is an uncomfortable one that threatens our preferred narrative. However, it is also the reality. This bias is often manifest in ideas such as 'what goes around comes around' or an expectation of 'karmic balance', and can also lead to blaming victims of crime and circumstance.

A more just world requires understanding rather than blame. Remember that everyone has their own life story, we're all fallible, and bad things happen to good people.

Halo effect . How much you like someone, or how attractive they are, influences your other judgments of them.

Our judgments are associative and automatic, and so if we want to be objective we need to consciously control for irrelevant influences. This is especially important in a professional setting. Things like attractiveness can unduly influence issues as important as a jury deciding someone's guilt or innocence. If someone is successful or fails in one area, this can also unfairly color our expectations of them in another area.

If you notice that you're giving consistently high or low marks across the board, it's worth considering that your judgment may be suffering from the halo effect.

Negativity bias . You allow negative things to disproportionately influence your thinking.

The pain of loss and hurt is felt more keenly and persistently than the fleeting gratification of pleasant things. We are primed for survival, and our aversion to pain can distort our judgment in a modern world. In an evolutionary context it makes sense for us to be heavily biased to avoid threats, but because this bias affects our judgments in other ways, it means we aren't giving enough weight to the positives.

Pro-and-con lists, as well as thinking in terms of probabilities, can help you evaluate things more objectively than relying on a cognitive impression.

Optimism bias . You overestimate the likelihood of positive outcomes.

There can be benefits to a positive attitude, but it's unwise to allow such an attitude to adversely affect our ability to make rational judgments (they're not mutually exclusive). Wishful thinking can be a tragic irony insofar as it can create more negative outcomes, such as in the case of problem gambling.

If you make rational, realistic judgments you'll have a lot more to feel positive about.

Pessimism bias . You overestimate the likelihood of negative outcomes.

Pessimism is often a defense mechanism against disappointment, or it can be the result of depression and anxiety disorders. Pessimists often justify their attitude by saying that they'll either be vindicated or pleasantly surprised; however, a pessimistic attitude may also limit potential positive outcomes. It should also be noted that pessimism is very different from skepticism: the latter is a rational approach that seeks to remain impartial, while the former is an expectation of bad outcomes.

Perhaps the worst aspect of pessimism is that even if something good happens, you'll probably feel pessimistic about it anyway.

Placebo effect . If you believe you're taking medicine it can sometimes 'work' even if it's fake.

The placebo effect can work for stuff that our mind influences (such as pain) but not so much for things like viruses or broken bones. Things like the size and color of pills can have an influence on how strong the effect is and may even result in real physiological outcomes. We can also falsely attribute getting better to an inert substance simply because our immune system has fought off an infection i.e. we would have recovered in the same amount of time anyway.

Homeopathy, acupuncture, and many other forms of natural 'medicine' have been shown to be no more effective than placebo. Keep a healthy body and bank balance by using evidence-based medicine from a qualified doctor.

Reactance . You'd rather do the opposite of what someone is trying to make you do.

When we feel our liberty is being constrained, our inclination is to resist; however, in doing so we can over-compensate. While blind conformity is far from an ideal way to approach things, neither is being a knee-jerk contrarian.

Be careful not to lose objectivity when someone is being coercive or manipulative, or trying to force you to do something. Wisdom springs from reflection, folly from reaction.

Self-serving bias . You believe your failures are due to external factors, yet you're responsible for your successes.

Many of us enjoy unearned privileges, luck and advantages that others do not. It's easy to tell ourselves that we deserve these things, whilst blaming circumstance when things don't go our way. Our desire to protect and exalt our own egos is a powerful force in our psychology. Fostering humility can help countermand this tendency, whilst also making us nicer humans.

When judging others, be mindful of how this bias interacts with the just-world hypothesis, fundamental attribution error, and the in-group bias.

Spotlight effect . You overestimate how much people notice how you look and act.

Most people are much more concerned about themselves than they are about you. Absent overt prejudices, people generally want to like and get along with you, as it gives them validation too. It's healthy to remember that although we're the main character in the story of our own life, everyone else is center-stage in theirs too. This bias causes many people to attribute malice where there may have been a simple misunderstanding.

Instead of worrying about how you're being judged, consider how you make others feel. They'll remember this much more, and you'll make the world a better place.

Sunk cost fallacy . You irrationally cling to things that have already cost you something.

When we've invested our time, money, or emotion into something, it hurts us to let it go. This aversion to pain can distort our better judgment and cause us to make unwise investments. A sunk cost means that we can't recover it, so it's rational to disregard the cost when evaluating. For instance, if you've spent money on a meal but you only feel like eating half of it, it's irrational to continue to stuff your face just because 'you've already paid for it'; especially considering the fact that you're wasting actual time doing so.

To regain objectivity, ask yourself: had I not already invested something, would I still do so now? What would I counsel a friend to do if they were in the same situation?

Discipline-specific misconceptions often made in arguments

  • Everything is made of chemicals. Avoid saying that chemicals are "unnatural" or "dangerous."
  • Medicine is not strictly a scientific profession. It can be, but is not required to be. A lot of what doctors actually do is non-scientific. The art of medicine is just as important as the science. For example, simply creating the feeling that the doctor understands a patient's problem and shares the patient's values increases the likelihood of positive health outcomes. Avoid the assumption that doctors are scientists.
  • It can be just as dangerous to over-medicalize mental illness as it is to moralize about it. This is why recent writers like Johann Hari focus on non-medical aspects of addiction. From his popular TED Talk you might conclude that he dismisses the model that addiction is a physical medical condition. But if you read his book Chasing The Scream, you would learn that he actually accepts the medical model as part of a bigger picture, considers it mainstream in medicine, and has chosen to make a case for the significance of the social contributors to addiction. From Hari's point of view, the American medical system is incentivized to offer the lowest-cost quick fix (like a pill), so treating addiction as a solely medical condition can lead to oversimplified treatments that are less effective than complex, tailored treatments that consider an addict's social circumstances. His slogan, "the opposite of addiction is connection," is effective because it is memorable; however, it is just as oversimplified as the purely medical model.

Neuroscience

  • Everything alters the brain. Reading these words physically alters your brain by creating memories. In your writing, it is not enough to say that "repeated cocaine use alters the brain." Be specific about how the brain is altered and what the consequence is.
  • Every human quality we care about has a dual nature . On one hand its character is limited by biology and the laws of physics. On the other hand its experience is shaped by culture and personal experience. Thinkers who amplify the importance of biology in shaping behavior are making essentialist arguments. Thinkers who focus on the culturally constructed nature of a human quality are making constructivist arguments. It is important to study and understand the essential and constructed qualities of such concepts as gender, intelligence, athletic ability, extroversion, honesty, mental illness, etc. By separating "nature" from "nurture" we can learn how each contributes to the total phenomenon. But taking either position without acknowledging the role of the other ignores the complexity of reality and leads to weak models. Avoid such overly simplifying models in your own thinking and question them in others.
  • Avoid the words "prove," "proven," "proof," etc. Outside of mathematics, nothing is actually "proven" in life. Instead of writing, "It's been proven..." try "It's been observed..." or "Scientists have support for the theory..."

A longer list of misconceptions

Wikipedia has a great list of common misconceptions on many other topics.

Writing Tips

General style tips .

  • For essays, refer to the MLA Writing guidelines.
  • For scientific and technical writing, refer to the ACS Style Guide .

Citing and Referencing

  • If you do not reference a fact in your writing, assume that a critical thinker will give it a low likelihood of being true.
  • When you quote someone, state their title and credentials. Give context to who they are to help your reader determine if the person being quoted is trustworthy and/or qualified.
  • Cite the page number when citing a book.
  • Avoid citing websites whenever possible. With the exception of a few online academic journals, assume that anything published online may be gone tomorrow and your reader will not be able to find it.
  • A website is not a journal. Before citing a website, try to locate a print citation.
  • Google is not a dictionary. If you cite a definition you got from Google, visit the "Google Dictionary" Wikipedia entry to discover who their current content provider is for definitions.
  • Google is not a book publisher. Books you find in Google Books, were published somewhere else. Check the title page.

Presenting Information

  • When you quote someone , always explain who they are. If they are an expert or researcher, state their qualifications and connect them to reputable organizations that sponsor their work. Doing this makes your writing more persuasive and makes it easier for your reader to research this person to come to their own conclusions.
  • Example: "Michael Kuhar, an addiction researcher at the Emory University School of Medicine, explains that..."
  • Give credit to the primary source of an idea , even if you encountered it in a secondary source. You should make an effort to read the primary source before quoting its information and conclusions. If you are unable to, then be clear that you are repeating another author's interpretation of the primary source.

Glossary 

Why are precise definitions of concepts and ideas important?

Humans think within concepts or ideas. We can never achieve command over our thoughts unless we learn how to achieve command over our concepts or ideas. Thus we must learn how to identify the concepts or ideas we are using, contrast them with alternative concepts or ideas, and clarify what we include and exclude by means of them. For example, most people say they believe strongly in democracy, but few can clarify with examples what that word does and does not imply. Most people confuse the meaning of words with cultural associations, with the result that "democracy" means to people whatever we do in running our government—any country that is different is undemocratic. We must distinguish the concepts implicit in the English language from the psychological associations surrounding that concept in a given social group or culture. The failure to develop this ability is a major cause of uncritical thought and selfish critical thought.

  • Consider alternative concepts
  • Consider that others may be using alternative definitions of concepts
  • Make sure you are using concepts with care and precision
  • If you suspect a difference in definitions between you and another person, attempt to clarify each other's meaning

Fundamental Definitions

Argument. An argument is a series of statements intended to establish the degree of truth of another statement, the conclusion. Arguments begin with premises (kinds of information) that are related to each other using valid forms of reasoning (a process) to arrive at the logical conclusion , new information. A logical conclusion is true in light of the premises being true (if the premises are all facts) or seems true (if the premises contain opinions).
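One way to make "valid forms of reasoning" concrete is to check an argument form by brute force over every truth assignment: a form is valid when no assignment makes all the premises true while the conclusion is false. The sketch below checks modus ponens (if P then Q; P; therefore Q) this way:

```python
from itertools import product

def implies(p, q):
    """Material implication: 'p implies q' is false only when p is true and q is false."""
    return (not p) or q

def modus_ponens_is_valid():
    """A form is valid iff no truth assignment makes the premises true
    and the conclusion false."""
    for p, q in product([True, False], repeat=2):
        premises_true = implies(p, q) and p   # premises: "P implies Q" and "P"
        if premises_true and not q:           # conclusion "Q" fails
            return False                      # found a counterexample
    return True

print(modus_ponens_is_valid())  # True: modus ponens is a valid form
```

An invalid form, such as affirming the consequent (if P then Q; Q; therefore P), would fail this check because the assignment P=false, Q=true makes its premises true and its conclusion false.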

Critical thinker. A well-cultivated critical thinker raises vital questions and problems, formulating them clearly and precisely; gathers and assesses relevant information, using abstract ideas to interpret it effectively; comes to well-reasoned conclusions and solutions, testing them against relevant criteria and standards; thinks open mindedly within alternative systems of thought, recognizing and assessing, as need be, their assumptions, implications, and practical consequences; is committed to overcoming our native confirmation bias, egocentrism, and sociocentrism; and communicates effectively with others in figuring out solutions to complex problems. ( https://www.criticalthinking.org )

Concept. A concept is a generalized idea of a thing or of a class of things; concepts make up the fundamental building blocks of thought. Concepts are your brain's representations of past experiences (Barsalou 2003 and 2008). Using concepts, your brain groups some things together and separates others. You can look at three mounds of dirt and perceive two of them as "hills" and one as a "mountain," based on your concepts. The dominant psychological/philosophical school of thought known as constructivism assumes that the world is like a sheet of pastry and your concepts are cookie cutters that carve boundaries, not because the boundaries are natural, but because they're useful or desirable. These boundaries have physical limitations, of course; you'd never perceive a mountain as a lake (Boghossian 2006).
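
The "cookie cutter" picture can be made concrete with a small sketch. The 300 m cutoff below is an invented convention, which is exactly the point: the boundary between "hill" and "mountain" is drawn where it is because that is useful, not because nature supplies it.

```python
# Toy sketch of a concept as a conventional boundary over continuous data.
# The cutoff value is an arbitrary convention chosen for illustration.

def categorize(height_m, cutoff_m=300):
    """Carve a continuous height into the concepts 'hill' and 'mountain'."""
    return "mountain" if height_m >= cutoff_m else "hill"

mounds = [120, 250, 950]  # three mounds of dirt, heights in meters
print([categorize(h) for h in mounds])                 # ['hill', 'hill', 'mountain']

# Move the boundary and the very same world is carved up differently:
print([categorize(h, cutoff_m=200) for h in mounds])   # ['hill', 'mountain', 'mountain']
```

The data never changed; only the concept's boundary did, and with it what we "perceive."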

Empirical. Relying on or derived from experiment, observation, or experience as opposed to conceptual or evaluative.

Idea. An idea is anything existing in the mind as an object of knowledge or thought based on concepts regarding particular instances of a class of things. The word specifically refers to something conceived in the mind or imagined. An idea can be specific whereas concepts are generalized.

Thought refers to any idea, whether or not expressed, that occurs to the mind in reasoning or contemplation.

Additional Definitions

For additional definitions of the objects of mind and parts of thinking, I suggest this glossary: https://www.criticalthinking.org/pages/glossary-of-critical-thinking-terms/4

TeachThought

5 Absurd Myths About Critical Thinking

Complexities involved with teaching & assessing critical thinking have led to misconceptions about what it is and how to incorporate it.


by Karin Hess , Ed.D.

ed note: We have published content in the past with references to Webb’s DOK framework, and it is often met with pushback as having been “debunked.” While that’s another post of its own, if you count yourself among the DOK critics, the myths Hess offers below still hold true, though you might still be troubled by the DOK-sourced reasoning. As always, take to the comments below with any comments, questions, or feedback.

A question I’m often asked is, “Does all critical thinking lead to deeper understanding?”

Like so many questions in education, the answer is probably, “it depends.” At its heart, critical thinking is about taking content apart to examine the parts in relation to the whole—and how each of those parts works separately and together. This is called building schema. For example, if you want your students to collect and use data during a science investigation, they must first understand all of the components that comprise an investigation—how and what to measure, the proper way to record the data and control variables, and how to organize and interpret the data collected.

If students are writing a text-based essay, are they studying the text closely and analyzing how the author conveys the message, or merely locating words as “text evidence” without any analysis of how that evidence supports a position? It’s the difference between playing a game and doing the practice drills. Rote memorization or copying down words are drills—perhaps important, but certainly not requiring deep thinking. Critical thinking, at its deepest level, is playing a game—tapping into and applying a variety of skills within the context of making decisions and solving a problem.

For today’s students to be competitive tomorrow, we understand the importance of cultivating these skills, as every occupation at times requires a deeper level of understanding and strategic thinking. And it goes beyond the core areas of mathematics and English/language arts—critical thinking should be applied to all content areas, from art class to physical education!

The complexities involved with teaching and assessing critical thinking have led to a range of misconceptions about what it is and how to incorporate it into instruction. Here are some of the biggest myths I’ve heard about critical thinking and why they don’t apply.

Myth # 1: Only certain students are capable of critical thinking.

If you ever find yourself tapping into this mindset, ask yourself how you can shift your role from “telling” students how to complete a task to “modeling” how it could be done and then posing a question or problem that requires a novel application of that skill. That automatically changes the role of the student from memorizing rules and information to applying them critically.

Tasks that require strategic or extended thinking (as defined by Norman Webb’s Depth of Knowledge/DOK levels) are not meant only to be done independently—at least not initially. So an effective strategy for engaging all students in critical thinking is to break the students into small groups to work together to puzzle through a problem.

This promotes discourse that refines reasoning and problem-solving skills by answering questions such as:

  • What are we being asked to do?
  • What information or resources do we need?
  • What steps will we take?

Discourse is not cheating! It’s increasing access and engagement for all students to become deeper independent thinkers.

Myth # 2: Critical thinking has to be difficult.

Critical thinking doesn’t necessarily have to be ‘hard’ to qualify as critical thinking.

Taking the DOK framework as an example, DOK 3– and DOK 4–type tasks are meant to be greater in complexity, not simply more difficult to do! This goes back to the heart of what critical thinking is—pulling apart the pieces to determine what they are and how well they work together. It might be describing why there are flaws in an author’s chain of logic or in a scientist’s experimental design.

Complexity is about applying an understanding of the overarching schema, digging deeper into the concepts, and being creative in finding more than one way to play the game.

Myth #3: Multiple-Choice questions can adequately assess critical thinking.

I couldn’t disagree more with this myth. It just doesn’t make any sense when you consider the level of engagement with content that is required by DOK 3 and 4 tasks. DOK 3–type questions, which focus on strategic thinking, can be assessed with what I call “weaker” multiple-choice items.

However, does selecting the “best” option, such as locating supporting text evidence for a stated theme, yield as much insight to teachers as actually observing and listening to how a student formulates an answer? It’s both the process and product of formulating an answer that uncover true understanding. DOK 4–type tasks require extended time, as well as extended thinking and reasoning.

This is because there is more content to analyze and more perspectives or possible solutions to consider. Combinations of multiple-choice items are now designed to address the additional processing of ideas for DOK 4 tasks; but because DOK 3 and DOK 4 tasks tend to be more open-ended in nature, I prefer to ask students to construct more insightful responses, rather than simply select “right” answers.

As a matter of fact, when students begin to explain their reasoning, you may discover that students who did not get the correct answer actually have a deeper understanding than some students who did.

Myth #4: ‘Higher-order thinking’ means ‘deeper learning.’

I’ve found that many “higher order thinking” activities ask students to analyze, evaluate, or be creative with content without deepening their knowledge of the content. For example, “what if?” questions can be fun and provide engagement that helps students to build some foundational knowledge; however, the products of these learning activities don’t automatically translate to evidence of deeper or more insightful thinking.

The critical thinking examples found in the Hess Cognitive Rigor Matrices for DOK 3 or 4 are meant to spur students to not only interpret, but to convincingly justify their interpretations. To move from higher order thinking “lite” to deeper thinking, you must shift the mindset and learning expectations toward going beyond the foundational understandings to deeper understanding, deeper applications, and deeper kinds of analyses.

Myth #5: Multi-Step tasks always result in critical thinking.

Many multi-step tasks are learned routines—important to learn, but still considered DOK 1.

Think about long division. It has many steps and can be hard to do, but it is still a routine operation, done the same way each time. Contrast the long division assignment with researching a topic, which also has many steps. Here, students might begin by building some foundational knowledge (e.g., learning new vocabulary/DOK 1; making connections between concepts or examining cause-effect relationships/DOK 2).

Then they might move to locating sources to help them answer an issue-based question from varying perspectives. Each step of this process involves some strategic thinking to determine relevance and accuracy of information and to evaluate the validity of sources used. And now, one final caution—remember that simply reading more texts or analyzing more data sources in the same routine ways doesn’t automatically lead to deeper thinking. Unless you are diving deeply into the material to study theme, style, or the underlying meanings that can be derived, it’s just drills, not playing championship games.

Dr. Karin Hess has more than 30 years of deep experience in curriculum, instruction, and assessment. She is a recognized international leader in developing practical approaches for using cognitive rigor and learning progressions as the foundation for formative, interim, and performance assessments. During Critical Thinking LIVE, a one-day professional development seminar hosted by Mentoring Minds, a national publisher of K–12 critical thinking educational resources, Dr. Hess delivered the keynote address, “Shifting Teacher and Student Roles: An Essential Component to Thinking that Leads to Transfer.”

TeachThought is an organization dedicated to innovation in education through the growth of outstanding teachers.

Ignorance, misconceptions and critical thinking

  • Knowing the Unknown
  • Published: 07 January 2020
  • Volume 198, pages 7473–7501 (2021)


  • Sara Dellantonio   ORCID: orcid.org/0000-0002-2281-7754 1 &
  • Luigi Pastore   ORCID: orcid.org/0000-0002-5892-6928 2  

In this paper we investigate ignorance in relation to our capacity to justify our beliefs. To achieve this aim we specifically address scientific misconceptions, i.e. beliefs that are considered to be false in light of accepted scientific knowledge. The hypothesis we put forward is that misconceptions are not isolated false beliefs, but rather form part of a system of inferences—an explanation—which does not match current scientific theory. We further argue that, because misconceptions are embedded in a system, they cannot be rectified simply by replacing false beliefs with true ones. To address our misconceptions, we must rather act on the system of beliefs that supports them. In the first step of our analysis, we distinguish between misconceptions that are easy to dispel because they represent simple errors that occur against the background of a correct explanatory apparatus and misconceptions that are, on the contrary, very difficult to dispel because they are the product of pseudo explanations. We show that, in the latter case, misconceptions constitute an integral part of an incorrect explanation and the reasons that support such misconceptions are deeply misleading. In the second step, we discuss various approaches that have been adopted to address the problem of misconceptions. Challenging the notion that directly addressing and criticizing specific misconceptions is an effective approach, we propose that critical thinking is the most fruitful means to deal with misconceptions. We define the core competences and knowledge relevant for the practice of critical thinking and discuss how they help us avoid misconceptions that arise from accepting beliefs that form part of a mistaken explanation.

In the literature, the approach to ignorance that considers the possession of true justified beliefs as a necessary condition for having knowledge and, by contrast, the absence of one of these conditions as sufficient to establish ignorance is called the Standard View of Ignorance (Le Morvan 2011 , 2012 , 2013 ; Le Morvan and Peels 2016 ). Recently, epistemological research has developed a new approach called the New View of Ignorance in which the role assigned to justification and to the capacity to explicitly offer reasons in support of our beliefs has been weakened and in which a person is considered to be ignorant primarily in the case in which s/he holds false beliefs. (On the so-called New View see Peels 2010 , 2011 , 2012 ; Le Morvan and Peels 2016 ). On the one hand, the New View denies—contrary to what we suggest—that justification plays a central function in determining whether we are or are not ignorant about some topic. However, on the other hand, it takes a step towards our approach: it not only places stronger emphasis on the cognizing powers of the subject than the "standard view" but also considers the possibility of distinguishing between various degrees of ignorance. In fact, by discussing the impossibility of providing a complete justification for a belief, it makes it possible to consider ignorance as a continuum rather than in a categorical manner, distinguishing between various degrees in which a person can be said to be ignorant as for the reasons s/he has to hold his/her beliefs (for a conception of ignorance as an epistemic status that comes in degrees and ignorance as the incapacity to adequately and completely answer questions concerning our beliefs cf. Nottelmann 2016 ). Of course, we cannot expect that people are able to provide a complete justification for their beliefs and even the issue of what an acceptable justification should look like is controversial. 
However, especially when considering beliefs concerning (relatively simple) phenomena that have a widely accepted scientific explanation, we can take a practical stance and assume that scientific theories which are commonly accepted by the scientific community at a given time provide us with a measure of which beliefs should be considered to be true or false at that time and what a justification of them would ideally look like. On this basis, we can also assess (or approximate) the distance between scientific knowledge and individual beliefs as well as the distance between the way scientific theories justify specific pieces of knowledge and the way in which people justify their beliefs about the same phenomena.

In fact, it is not impossible for an individual to hold a belief that is inconsistent with other beliefs s/he holds, and yet it would be irrational for him/her to do so. As Davidson argues: “Strictly speaking, then, the irrationality consists not in any particular belief but in inconsistency within a set of beliefs.” (Davidson 1985 /2004, p. 192) According to a widely shared view in psychology developed by Leon Festinger ( 1957 ), inconsistencies cannot be psychologically accepted by the subject who will make every possible effort to rationalize and thus resolve them. In the same vein, Davidson ( 1982 /2004, 1986 /2004) points out how the inconsistencies we are sometimes victim of can be explained only by postulating a kind of compartmentalization of the mind. However, generally people seek internal congruency among their beliefs and coherence plays a pivotal role in the way we interpret human thinking (on this cf. also Thagard 2000 ).

In epistemology, justification has been viewed in various ways. Justification might be conceived as being linear: in this case an individual belief is proven to be true by a set of other beliefs and those other beliefs are proven to be true by another set and so on, until we reach some beliefs that are based on experience and are therefore considered—if not indisputable—at least well-grounded: as BonJour ( 1985 , p. 26) formulates it, “sufficient by itself to satisfy the adequate justification condition for knowledge”. Alternatively, justification can be viewed holistically as an inferential network of beliefs that are interconnected within a system and provide mutual support, but are supported by experience only altogether as a whole. In this case, what justifies a belief is primarily its coherence with the system (this form of holism is commonly discussed in relation to Quine 1951 ; for a discussion of different justification models cf. Elgin 2005 ; van Cleve 2005 ). While strong forms of holism would consider a belief to be justified only if it is coherent with the whole system of beliefs that includes it, more moderate forms of holism support the view that justification depends on some chunks of this system. Typically, they also embrace some weak form of foundationalism in which some beliefs are considered more basic than others because they are closer to experience, i.e. observational, and thus serve as a foundation for others. (On moderate vs. strong holism from the point of view of Quine’s philosophy and on Quine’s later arguments in favor of a moderate form of holism cf. De Rosa and Lepore 2004 ). To numerous epistemologists, the idea that we can always apply linear models of justification that lead us to some fundamental beliefs appeared to be implausible in the light of the complexity of our system of knowledge.
For this reason, they argued for a holist picture of knowledge in which beliefs are connected to each other within an inferential network and mutually sustain each other (cf. van Quine and Ullian 1970 ; Bonjour 1985 ; Harman 1993 ; Thagard 2007 ). At the same time, the view that all the beliefs of a complex belief system are also involved in the justification of each appeared to be too extreme as well as problematic from an epistemological point of view. In fact, this implies that, when even one single belief in the system is changed, all others must be modified accordingly. This excessive interdependence of beliefs makes the system as a whole too unstable (cf. Fodor and Lepore 1992 , chap. 2). For this reason, many epistemologists have considered a moderate form of holism as the most plausible option. And yet, independently of which view of the structure of knowledge we favor and thus of which specific model of justification we prefer, at least some principles of inferential justification can be considered to be shared by all these models. Indeed, independently of whether we think that our beliefs are structured “like a building that rests upon a foundation” or "like a web where the strength of any given area depends on the strength of the surrounding areas" (Steup 2018 ), we always presuppose that beliefs form a congruent structure and that their relationships are somehow explanatory. We will say something more on this last factor below, but—since we will consider scientific theory as a benchmark to assess misconceptions—we will mainly just assume that the inferential relationships presupposed by scientific theories are explanatory, i.e. that they form part of an appropriate explanation.
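
The two pictures of justification just mentioned ("a building that rests upon a foundation" vs. "a web") can be illustrated with a small sketch. The belief names and the graph structure below are invented for illustration only; the test implemented is the linear/foundationalist one, under which a loop of merely mutual support never bottoms out in experience, whereas a coherentist would count the web itself as justifying.

```python
# Illustrative sketch: beliefs as a support graph. A belief is linearly
# justified if some chain of support reaches a basic (observational) belief.
# Belief names and structure are hypothetical examples.

def linearly_justified(belief, supports, basic, seen=None):
    """Foundationalist test: follow support links until a basic belief is hit."""
    seen = seen if seen is not None else set()
    if belief in basic:
        return True
    if belief in seen:          # a loop of mutual support never bottoms out
        return False
    seen.add(belief)
    return any(linearly_justified(s, supports, basic, seen)
               for s in supports.get(belief, []))

# Linear chain b3 <- b2 <- b1, where b1 is observational:
chain = {"b3": ["b2"], "b2": ["b1"]}
print(linearly_justified("b3", chain, basic={"b1"}))      # True

# Pure mutual support b4 <-> b5 with no observational anchor fails
# the linear test, though a holist reads the web itself as support:
web = {"b4": ["b5"], "b5": ["b4"]}
print(linearly_justified("b4", web, basic=set()))          # False
```

The sketch also makes the instability worry concrete: in the web picture, deleting one node changes the support status of every belief that reaches it, which is why moderate holism keeps some beliefs closer to experience as anchors.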

Even though this remains implicit in his paper, Reichenbach’s analysis is inspired by a specific model of explanation, i.e. by Hempel and Oppenheim’s ( 1948 ) Deductive-Nomological Model. However, his description is general enough to also be compatible with other positions on what an explanation consists of. The Deductive-Nomological Model of explanation does not explicitly rely on the notion of causation. But many advocates of this model argue that it still captures the causal component of explanations since “all causal claims imply the existence of some corresponding regularity (a “law”) linking cause to effect” (Woodward 2017 , cf. this article also for a brief discussion of the main modes of explanation that are currently under debate).

Reichenbach suggests that—when people do not have the means to develop an actual explanation—they try to account for phenomena by analogy with something else they understand better: since human experience is something everybody has firsthand knowledge of, people usually resort to analogies with human experience. Reichenbach’s intuition on this is confirmed by a number of contemporary, empirical studies showing that people with a poor understanding of the physical world have a strong tendency to anthropomorphize. They tend to explain physical phenomena using the same principles they would use to explain the behavior of human agents and thus they project human-like characteristics onto non-human things (Epley et al. 2007 ; Willard and Norenzayan 2013 ; Lindeman and Svedholm-Häkkinen 2016 ). And yet, the opposite also occurs, even if more rarely: people who exhibit a poor knowledge of the human mind and of social dynamics, but have a better comprehension of mechanisms and physical systems tend to interpret human phenomena according to non-human but better known mechanical principles (Lindeman and Svedholm-Häkkinen 2016 ).

Aarnio, K., & Lindeman, M. (2005). Paranormal beliefs, education, and thinking styles. Personality and Individual Differences, 39 (7), 1227–1236. https://doi.org/10.1016/j.paid.2005.04.009 .

Bailin, S. (2002). Critical thinking and science education. Science and Education, 11 (4), 361–375. https://doi.org/10.1023/A:1016042608621 .

Bailin, S., Case, R., Coombs, J. R., & Daniels, L. B. (1999a). Conceptualizing critical thinking. Journal of Curriculum Studies, 31 (3), 285–302. https://doi.org/10.1080/002202799183133 .

Bailin, S., Case, R., Coombs, J. R., & Daniels, L. B. (1999b). Common misconceptions of critical thinking. Journal of Curriculum Studies, 31 (3), 269–283. https://doi.org/10.1080/002202799183124 .

Bannink, F. P. (2007). Solution-focused brief therapy. Journal of Contemporary Psychotherapy, 37 (2), 87–94. https://doi.org/10.1007/s10879-006-9040-y .

Bensley, D. A., & Lilienfeld, S. O. (2015). What is a psychological misconception? Moving toward an empirical answer. Teaching of Psychology, 42 (4), 282–292. https://doi.org/10.1177/0098628315603059 .

Bensley, D. A., Lilienfeld, S. O., & Powell, L. A. (2014). A new measure of psychological misconceptions: Relations with academic background, critical thinking, and acceptance of paranormal and pseudoscientific claims. Learning and Individual Differences, 36, 9–18. https://doi.org/10.1016/j.lindif.2014.07.009 .

BonJour, L. (1985). The structure of empirical knowledge . Cambridge: Harvard University Press.

Brosnan, M., Ashwin, C., & Lewton, M. (2017). Brief report: Intuitive and reflective reasoning in autism spectrum disorder. Journal of Autism and Developmental Disorders, 47 (8), 2595–2601. https://doi.org/10.1007/s10803-017-3131-3 .

Brosnan, M., Lewton, M., & Ashwin, K. (2016). Reasoning on the autism spectrum: A dual process theory account. Journal of Autism and Developmental Disorders, 46, 2115–2125.

Browne, N. M., & Keeley, S. M. (2007). Asking the right questions . Upper Saddle River: Prentice-Hall.

Burke, B. L., Sears, S. R., Kraus, S., & Roberts-Cady, S. (2014). Critical analysis: A comparison of critical thinking changes in psychology and philosophy classes. Teaching of Psychology, 41 (1), 28–36. https://doi.org/10.1177/0098628313514175 .

Conception. (2011). In Merriam-Webster.com . Retrieved August 17, 2019, from https://www.merriam-webster.com/dictionary/conception .

Cottrell, S. (2005). Critical thinking skills. Developing effective analysis and argument . New York: Palgrave.

Davidson, D. (1982/2004). Paradoxes of irrationality. In D. Davidson (Ed.), Problems of irrationality (pp. 169–187). Oxford/New York: Oxford University Press.

Davidson, D. (1985/2004). Incoherence and irrationality. In D. Davidson (Ed.), Problems of irrationality (pp. 189–198). Oxford/New York: Oxford University Press.

Davidson, D. (1986/2004). Deception and division. In D. Davidson (Ed.), Problems of irrationality (pp. 199–212). Oxford/New York: Oxford University Press.

De Martino, B., Harrison, N. A., Knafo, S., Bird, G., & Dolan, R. J. (2008). Explaining enhanced logical consistency during decision making in autism. Journal of Neuroscience, 28 (42), 10746–10750. https://doi.org/10.1523/JNEUROSCI.2895-08.2008 .

De Rosa, R., & Lepore, E. (2004). Quine’s meaning holisms. In R. F. Gibson (Ed.), The Cambridge companion to Quine (pp. 65–90). Cambridge: Cambridge University Press.

Dewey, J. (1910). How we think . Boston/New York/Chicago: D.C. Heath.

Dewey, J. (1933). How we think: A restatement of the relation of reflective thinking to the educative process . Lexington: D.C. Heath.

di Sessa, A. A. (2006). A history of conceptual change research: Threads and fault lines. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 88–108). Cambridge: Cambridge University Press.

Elgin, C. (2005). Non-foundationalist epistemology: Holism, coherence, and tenability. In M. Steup & E. Sosa (Eds.), Contemporary debates in epistemology (pp. 156–167). New York/London: Blackwell.

Ennis, R. H. (1985). A logical basis for measuring critical thinking skills. Educational Leadership, 43 (2), 44–48.

Epley, N., Waytz, A., & Cacioppo, J. T. (2007). On seeing human: A three-factor theory of anthropomorphism. Psychological Review, 114 (4), 864–886.

Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction . Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.

Festinger, L. (1957). A theory of cognitive dissonance . Stanford: Stanford University Press.

Firestein, S. (2012). Ignorance: How it drives science . Oxford: Oxford University Press.

Fischer, K. M. (1983). Amino acids and translations: A misconception in biology. In H. Helm & J. D. Nowak (Eds.), Proceedings of the international seminar on misconceptions in science and mathematics (pp. 407–419). Ithaca: Cornell University Press.

Fodor, J. A., & Lepore, E. (1992). Holism: A shopper’s guide . Oxford: Blackwell.

Furnham, A., & Hughes, D. J. (2014). Myths and misconceptions in popular psychology: Comparing psychology students and the general public. Teaching of Psychology, 41 (3), 256–261.

Gardner, R., & Brown, D. L. (2013). A test of contemporary misconceptions in psychology. Learning and Individual Differences, 24, 211–215. https://doi.org/10.1016/j.lindif.2012.12.008 .

Garnett, P. J., & Treagust, D. F. (1992a). Conceptual difficulties experienced by senior high school students of electrochemistry: Electric circuits and oxidation-reduction equations. Journal of Research in Science and Teaching, 29 (2), 121–142.

Garnett, P. J., & Treagust, D. F. (1992b). Conceptual difficulties experienced by senior high school students of electrochemistry: Electrochemical (galvanic) and electrolytic cells. Journal of Research in Science and Teaching, 29 (10), 1079–1099.

Gil, F. (2000). La conviction . Paris: Flammarion.

Gilbert, J. K., & Watts, D. M. (2008). Concepts, misconceptions and alternative conceptions: Changing perspective in science education. Studies in Science Education, 10 (1), 61–98.

Gingerich, W. J., & Eisengart, S. (2004). Solution-focused brief therapy: A review of the outcome research. Family Process, 39 (4), 477–498. https://doi.org/10.1111/j.1545-5300.2000.39408.x .

Goris, T. & Dyrenfurth, M. (2010). Students’ misconception in science, technology, and engineering. In ASEE Illinois/Indiana section conference . Retrieved September 10, 2019 from http://ilin.asee.org/Conference2012/Papers/Goris.pdf .

Govier, T. (1989). Critical thinking as argument analysis? Argumentation, 3 (2), 115–126. https://doi.org/10.1007/BF00128143 .

Govier, T. (2010). A practical study of argument . Cengage: Wadsworth.

Gregory, T. R. (2009). Understanding natural selection: Essential concepts and common misconceptions. Evolution: Education and Outreach, 2 (2), 156–175.

Guzzetti, B. J. (2000). Learning counter-intuitive science concepts: What have we learned from over a decade of research? Reading and Writing Quarterly, 16 (2), 89–98.

Halpern, D. F. (2014). Thought and knowledge. An introduction to critical thinking . New York: Psychology Press.

Hare, W. (1979). Open-mindedness and education . Kingston: McGill-Queen’s University Press.

Hare, W. (2001). Bertrand Russell and the ideal of critical receptiveness. Skeptical Inquirer, 25 (3), 40–44.

Harman, G. (1993). Meaning holism defended. In J. A. Fodor & E. Lepore (Eds.), Holism: A consumers update (pp. 163–171). Amsterdam: Rodopi.

Hempel, C., & Oppenheim, P. (1948). Studies in the logic of explanation. Philosophy of Science, 15, 135–175.


Author information

Authors and Affiliations

Department of Psychology and Cognitive Sciences, University of Trento, Palazzo Fedrigotti – Corso Bettini, 31, 38068, Rovereto, TN, Italy

Sara Dellantonio

Department of Education, Psychology, Communication, University of Bari “A. Moro”, Palazzo Chiaia-Napolitano – Via Crisanzio, 42, 70121, Bari, BA, Italy

Luigi Pastore


Corresponding author

Correspondence to Sara Dellantonio.

Additional information

Authors are listed alphabetically; this article was truly cooperative.


About this article

Dellantonio, S., Pastore, L. Ignorance, misconceptions and critical thinking. Synthese 198, 7473–7501 (2021). https://doi.org/10.1007/s11229-019-02529-7


Received : 24 March 2019

Accepted : 25 December 2019

Published : 07 January 2020

Issue Date : August 2021

DOI : https://doi.org/10.1007/s11229-019-02529-7


Keywords

  • Misconceptions
  • Critical thinking


Misunderstanding Critical Thinking

February 01, 2016

Appears in the February 2016 issue of School Administrator.

Misconceptions about critical thinking in schools are rampant among teachers. As former principals, we saw this regularly.

We work today with teachers who struggle with integrating critical thinking into student tasks and assessments. Administrators at the school and district levels would be better positioned to respond in helpful ways if they recognized what’s commonly misunderstood about teaching students to think critically.

The five leading misconceptions are these.

1. Critical thinking is not for all students.

All students are capable of higher-level thinking. The earlier children begin to think critically, the better equipped they are with tools to be successful in school and throughout their lives.

A quick review of lesson plans should reveal that higher-level thinking tasks or questions are not reserved for only gifted learners. Critical thinking should not be limited to one group or one age level of students. Critical thinking is essential for all.

2. Critical thinking assessments can be used multiple times.

During a walk-through observation the day before an assessment, an administrator found a teacher reviewing items taken from the next day's test. Because the teacher explained the answers during the review session and then administered the same items on the assessment, students could simply recall the answers rather than think critically.

To prevent this, administrators should brainstorm alternatives with the faculty. Guide teachers to develop two sets of questions — one for the review session and another for the test. Content-area teachers can design higher-level assessment items (e.g., analyze scenarios, interpret graphics, evaluate quotes) that address the same standard or objective and challenge students to demonstrate a deeper understanding.

3. Using high-level thinking words ensures cognitive complexity.

Teachers who struggle with understanding critical thinking often rely on charts that supposedly align verbs to different levels of thinking. Caution teachers that verb charts must be used carefully. As teachers incorporate higher-level tasks, have them look closely at the verbs in assessment items. An item might read, “Synthesize the passage and identify the main character.” Using the word “synthesize” does not mean students are engaging in high-level thinking skills.

Hold discussions to help teachers understand how verbs like “explain” can be used at various thinking levels. “Explain who is a main character” is a low-level prompt in contrast to “Explain what the main character fears the most and how he or she is resilient.”

Merely using the verbs associated with high levels of thought does not necessarily mean that students are actually thinking at high levels. Ensure that teachers identify the thinking processes that questions or tasks require of students to determine higher levels of engagement.

4. Student thinking is best assessed through oral questioning.

Students need time to process high-level questions. Often teachers pose challenging questions and expect students to respond immediately. Have teachers take note of the student response time to questions. If students immediately produce an answer, it probably is a low-level question. However, if students seem to need additional time to think before answering, the teacher probably asked a higher-level question.

With deeper-level questioning, students may want to re-read the question, so posting the question can be helpful. Allowing students to first think on paper and then turn to a partner to share responses prepares the students to participate more readily in class discussions.

5. All teachers know how to facilitate critical thinking.

Some teachers are more proficient at integrating critical thinking than others. Some may need professional development to improve the instruction and modeling of critical thinking.

Teachers should be led to see the importance of modeling thinking skills for students. Thinking aloud and examining proficient student work are essential strategies that help students learn how to improve their thinking.

After receiving initial professional development on critical thinking, teachers may wish to revise assessments and adjust classroom instructional tasks so they align with the thinking levels of the revised assessments.

Faculty members must be encouraged to support one another. By sharing successful lessons and resources, all teachers have the opportunity to improve their instructional skills. Teacher expertise levels can increase when teachers and administrators are willing to work together to develop students and teachers as critical thinkers.

About the Authors

Rebecca Stobaugh is an assistant professor of teacher education at Western Kentucky University in Bowling Green, Ky., and author of  Assessing Critical Thinking in Middle and High Schools . E-mail:  [email protected] . Twitter:  @RebeccaStobaugh . Sandra Love is the director of education insight and research for Mentoring Minds in Tyler, Texas.


Common misconceptions of critical thinking


Authors: Bailin, Sharon; Case, Roland; Coombs, Jerrold R.; Daniels, Leroi B.

Source: Journal of Curriculum Studies, Volume 31, Number 3, 1 May 1999, pp. 269–283

Publisher: Routledge, part of the Taylor & Francis Group

DOI: https://doi.org/10.1080/002202799183124

Document Type: Research Article

Publication date: May 1, 1999

Encyclopedia Britannica


critical thinking


critical thinking, in educational theory, mode of cognition using deliberative reasoning and impartial scrutiny of information to arrive at a possible solution to a problem. From the perspective of educators, critical thinking encompasses both a set of logical skills that can be taught and a disposition toward reflective open inquiry that can be cultivated. The term critical thinking was coined by American philosopher and educator John Dewey in the book How We Think (1910) and was adopted by the progressive education movement as a core instructional goal that offered a dynamic modern alternative to traditional educational methods such as rote memorization.

Critical thinking is characterized by a broad set of related skills usually including the abilities to

  • break down a problem into its constituent parts to reveal its underlying logic and assumptions
  • recognize and account for one’s own biases in judgment and experience
  • collect and assess relevant evidence, whether from personal observation and experimentation or from external sources
  • adjust and reevaluate one’s own thinking in response to what one has learned
  • form a reasoned assessment in order to propose a solution to a problem or a more accurate understanding of the topic at hand


Theorists have noted that such skills are only valuable insofar as a person is inclined to use them. Consequently, they emphasize that certain habits of mind are necessary components of critical thinking. This disposition may include curiosity, open-mindedness, self-awareness, empathy, and persistence.

Although there is a generally accepted set of qualities that are associated with critical thinking, scholarly writing about the term has highlighted disagreements over its exact definition and whether and how it differs from related concepts such as problem solving. In addition, some theorists have insisted that critical thinking be regarded and valued as a process and not as a goal-oriented skill set to be used to solve problems. Critical-thinking theory has also been accused of reflecting patriarchal assumptions about knowledge and ways of knowing that are inherently biased against women.

Dewey, who also used the term reflective thinking, connected critical thinking to a tradition of rational inquiry associated with modern science. From the turn of the 20th century, he and others working in the overlapping fields of psychology, philosophy, and educational theory sought to rigorously apply the scientific method to understand and define the process of thinking. They conceived critical thinking to be related to the scientific method but more open, flexible, and self-correcting; instead of a recipe or a series of steps, critical thinking would be a wider set of skills, patterns, and strategies that allow someone to reason through an intellectual topic, constantly reassessing assumptions and potential explanations in order to arrive at a sound judgment and understanding.

In the progressive education movement in the United States, critical thinking was seen as a crucial component of raising citizens in a democratic society. Instead of imparting a particular series of lessons or teaching only canonical subject matter, theorists thought that teachers should train students in how to think. As critical thinkers, such students would be equipped to be productive and engaged citizens who could cooperate and rationally overcome differences inherent in a pluralistic society.


Beginning in the 1970s and ’80s, critical thinking as a key outcome of school and university curriculum leapt to the forefront of U.S. education policy. In an atmosphere of renewed Cold War competition and amid reports of declining U.S. test scores, there were growing fears that the quality of education in the United States was falling and that students were unprepared. In response, a concerted effort was made to systematically define curriculum goals and implement standardized testing regimens, and critical-thinking skills were frequently included as a crucially important outcome of a successful education. A notable event in this movement was the release of the 1980 report of the Rockefeller Commission on the Humanities that called for the U.S. Department of Education to include critical thinking on its list of “basic skills.” Three years later the California State University system implemented a policy that required every undergraduate student to complete a course in critical thinking.

Critical thinking continued to be put forward as a central goal of education in the early 21st century. Its ubiquity in the language of education policy and in such guidelines as the Common Core State Standards in the United States generated some criticism that the concept itself was both overused and ill-defined. In addition, an argument was made by teachers, theorists, and others that educators were not being adequately trained to teach critical thinking.

David Evans

How to Approach Critical Thinking in This Misinformation Era

Is there any way we can know what's true and what's false?

Posted August 12, 2021 | Reviewed by Vanessa Lancaster

  • Critical thinking is a discipline of thought and communication that boils down to one word: Truth.
  • Four classic and time-honored strategies for engaging in critical thinking include asking who is making a statement and exploring biases.
  • Reading a book that involves new ideas and concepts can help the brain develop new neural pathways and alternative modes of thinking.


One of the most important disciplines we can practice is the discipline of Critical Thinking. It involves holding a kind of magnifying glass up to our thoughts and the information and data constantly swirling around us.

What’s true and what isn’t? It’s the job of critical thinking to find out.

We are all constantly making decisions. Some of those decisions are trivial and unimportant (Should I wear the blue or yellow shirt today?). But other decisions can affect our very lives and the lives of those around us. We want those decisions to be based on truth so that they have good outcomes. Critical thinking can help make that possible.

Where do we start?

Here are seven questions, thoughts, and strategies to have available at all times to help with our different decision-making processes:

  • Who is saying it? Who is it that is proclaiming the idea or thought in question? Are they dependable and truthful? Do you trust what this person says?
  • How do they know what it is they are saying? Are they open about how they know what it is they're sharing? Do you trust their route to knowledge? Is it credible and reliable?
  • What’s in it for them? Do they have an obvious incentive to promote the idea they’re sharing? Is there a conflict of interest?
  • Explore your own biases. Our own personal biases may dispose us to (falsely) accept or reject the idea being presented. A good way to assess our own personal biases is to look at two or three close friends. What are their attitudes toward such things as race, politics, religion, money, family, or personal life values? Our own values or biases are likely to be very similar to theirs.
  • Remember, the whole point of critical thinking is about finding the truth. Don’t allow yourself to be distracted from seeking and valuing the truth in all areas of your life.
  • Read a book about some subject you’re not interested in. This may sound like a strange idea, but in fact, it is beneficial. Reading a book involving new ideas and concepts can help your brain develop new neural pathways and alternative modes of thinking. Reading a book filled with unfamiliar material can be like giving your brain a “re-set.”

Almost ten years ago, I undertook just such an experience. I chose a book about the Panama Canal, a topic I had zero interest in. The title was The Path Between the Seas.

I knew it was by an excellent writer (David McCullough), but it was about a subject I was not in the least attracted to.

The book was enthralling! I had never imagined what huge, consequential problems the building of the Panama Canal presented! There were multiple crises involving leadership, manpower, disease, technology, weather, administration, bizarre personalities, feuds, government bureaucracies, and constantly inadequate funding. How could they ever hope to achieve their impossible goal? It was truly a page-turner and is one of my all-time favorite books, one I always recommend to friends.

And it helped freshen my mind for the critical thinking that is always essential.

The Serenity Prayer

The serenity prayer might seem like a surprising suggestion to aid critical thinking, but I believe it is one of the most important.

Here’s why:

In situations where good critical thinking is essential, we often become tangled up emotionally. We may feel that we ought to act decisively, but we don’t know whether it will do any good, and there are often complicated constraints on any actions we might take. Then we find ourselves confused, not knowing what to do.

It is in this kind of situation that the Serenity Prayer is so clarifying. Simply stated, the prayer is this:

Lord, help me accept the things I cannot change, give me the courage to change the things I can change, and give me the wisdom to know the difference.

Instead of getting emotionally bogged down in determining what we can or should change, the Serenity Prayer removes ambiguity and brings clarity.

So, critical thinking is an essential life discipline, and these seven strategies can help bring us clarity of thought and understanding.

It's all about truth.

© David Evans

David Evans

David Evans is an award-winning writer and mediator. He has an Emmy Award (shared) for writing on The Monkees and two Outstanding Case of the Year Awards for The Los Angeles County Court Alternative Dispute Resolution Program.


Debunking Common Misconceptions About Critical Thinking

Critical thinking is a term that is often thrown around in educational and professional settings, but what does it really mean? Many people have misconceptions about what critical thinking entails, leading to confusion and misunderstanding. In this article, we will delve into the world of critical thinking, debunking common myths and misconceptions that surround this crucial skill.

The Historical Context of Critical Thinking

Before we can debunk misconceptions about critical thinking, it’s essential to understand its historical context. Critical thinking has roots dating back to the time of Socrates in Ancient Greece. Socrates himself was known for his method of questioning and encouraging his students to think critically about the world around them.

In the modern era, critical thinking has become increasingly important as our society evolves. With the rise of technology and information overload, the ability to think critically and analyze information is more crucial than ever.

The Current State of Critical Thinking

Despite its importance, critical thinking is often misunderstood and undervalued in many educational and professional settings. Some common misconceptions about critical thinking include:

  • Critical thinking is just about being critical of others’ ideas
  • Critical thinking is only for intellectuals and academics
  • Critical thinking stifles creativity

These misconceptions can prevent people from developing their critical thinking skills and utilizing them effectively in their daily lives.

Practical Applications of Critical Thinking

Contrary to popular belief, critical thinking is not just reserved for academia. In fact, critical thinking is a valuable skill that can be applied in various aspects of life, including:

  • Problem-solving
  • Decision-making
  • Analyzing information
  • Effective communication

By honing their critical thinking skills, individuals can become better equipped to navigate complex situations and make informed decisions.

Expert Insights on Critical Thinking

We spoke to Dr. Jane Doe, a renowned expert in the field of critical thinking, to get her insights on the topic. According to Dr. Doe, “Critical thinking is not about being negative or skeptical, but rather about approaching information with a questioning mindset and seeking to understand the underlying assumptions.”

Dr. Doe also emphasized the importance of teaching critical thinking skills in schools and workplaces to empower individuals to think for themselves and make sound judgments based on evidence and logic.

In conclusion, critical thinking is a vital skill that is often misunderstood and undervalued. By debunking common misconceptions about critical thinking and highlighting its practical applications, we can encourage individuals to develop their critical thinking skills and utilize them effectively in various aspects of their lives.

We hope this article has shed light on the importance of critical thinking and inspired you to think more critically in your own endeavors. Thank you for reading, and we encourage you to explore further resources on critical thinking for a more in-depth understanding of this crucial skill.



Critical Thinking 911

What are some common misconceptions about critical thinking that parents and teachers should be aware of?

Critical thinking is an essential skill for students to develop, as it allows them to analyze information, make informed decisions, and solve problems effectively. However, there are several common misconceptions about critical thinking that parents and teachers should be aware of to help their children develop this skill properly. Here are some of those misconceptions:

Critical thinking is only for academics: One of the most common misconceptions about critical thinking is that it is necessary only for academic pursuits. In reality, critical thinking is a valuable skill in all aspects of life: it helps students make informed decisions, solve real-world problems, and communicate effectively with others.

Critical thinking is just common sense: Another common misconception is that critical thinking is simply common sense. While there is some overlap between the two, critical thinking requires a more in-depth analysis of information and a systematic approach to problem-solving. Critical thinking also involves evaluating multiple perspectives and considering the implications of different solutions.

Critical thinking is innate: Many people believe that critical thinking is an innate ability that some people have and others do not. However, critical thinking is a skill that can be learned and developed with practice. While some individuals may have a natural inclination towards critical thinking, everyone can improve their critical thinking skills with deliberate effort.

Critical thinking is a fixed set of rules: Some people believe that critical thinking involves a fixed set of rules that must be followed. However, critical thinking is a flexible process that can be adapted to different situations and contexts. While there are some general principles and guidelines for critical thinking, the approach to critical thinking may vary depending on the specific problem or situation.

Critical thinking is only about logical reasoning: While logical reasoning is an essential component of critical thinking, it is not the only aspect. Critical thinking also involves creativity, open-mindedness, and the ability to consider multiple perspectives. Effective critical thinking also requires the ability to evaluate information and evidence critically.

Critical thinking is a solitary activity: While critical thinking often involves individual reflection and analysis, it is also a collaborative process. Engaging in dialogue and debate with others can help students develop their critical thinking skills by considering different perspectives and evaluating different ideas.

Critical thinking is only necessary for complex problems: While critical thinking is particularly important for complex problems, it is also useful for solving everyday problems. Simple problems can often be solved by relying on intuition or common sense, but critical thinking can help students identify potential biases or assumptions and evaluate different solutions.

Critical thinking is time-consuming: While critical thinking may take more time initially, it can ultimately save time by helping students avoid mistakes and identify the most effective solutions. Critical thinking can also be integrated into everyday activities, such as reading, writing, and problem-solving, making it a natural part of the learning process.

Critical thinking is a crucial skill that can help students succeed in all aspects of life. By understanding the misconceptions above, parents and educators can help students develop their critical thinking skills effectively. Critical thinking is a flexible, adaptable process that develops through deliberate effort and practice, and it is useful for solving both complex and everyday problems.


