EduFusion.org

Where Creative Ideas in Distance Ed Come Together

Fear, Anger, and Denial—How do critical thinkers deal with barriers during a crisis?

Critical Thinking in Education

Have you ever made a bad choice in life? Most of us have at some point, and if we go back and analyze what went wrong, we often find that there was some barrier to our critical thinking keeping us from making the best choice among the alternatives we had. Understanding and spotting these barriers is important because we can often develop strategies for dealing with these roadblocks to good decision making.

Today I will focus on just a few, with examples that highlight how we are currently dealing with the COVID-19 outbreak. The barriers that have surfaced are fear, anger, denial, egocentrism, and sociocentrism. Each of these barriers can have a significant impact on our decision making, and with so many important decisions that must now be made, we want to be certain we are making the best possible choices.

Fear is perhaps one of those barriers that is partly ingrained through our evolution. In essence, fear encourages us to act, either rationally or irrationally. Take, for example, how people are dealing with the outbreak of the coronavirus. There are many examples of people reacting out of fear rather than using critical thinking to work through the problem and find a solution. Whether we are talking about hoarding something like toilet tissue or something more dangerous like attacking someone who is not social distancing, how we react can be traced back to this barrier. Roalfe (1929) posits that “the real force or menace behind all fear and anxiety is desire” (p. 35). Our desire for safety and security can essentially overwhelm our ability to use our critical thinking to make good choices.

Anger is another barrier to thinking. When people are angry, they will often ignore important information that might help resolve the root cause of the anger. Berkowitz and Harmon-Jones (2004) explain that “anger is often intermixed with the fright so that the fearful persons are all too apt to be angry as well, particularly when they think they cannot escape from the danger” (p. 152). This is particularly pertinent as people react to the COVID-19 crisis. Many people simply cannot escape an outbreak, and they will react with both anger and fear. One barrier not mentioned there, though, is denial.

Denial, or repression, is perhaps more dangerous than our initial impulse of anger because we can rationalize ourselves into believing something that is not factual rather than dealing with the issue. Whether it manifests in a refusal to take protective measures or leads to risky behavior, the outcome can be detrimental not just to the person, but to others as well. Even when confronted with the facts, denial can keep a person from making the decisions needed for personal protection. We saw this play out again and again as people carried on as if nothing was really happening, even as hospitals were flooded with the sick and the numbers grew daily.

As critical thinkers, we can use the same methods we use for daily problem solving to help us reach the best possible solution, even when dealing with a life-altering outbreak like the COVID-19 pandemic we are currently going through. The first thing that must happen is the realization that there is a problem. As mentioned above, this can be very difficult given how we are wired psychologically and how many different voices are trying to influence our understanding. Without the realization of the problem, though, there can be no solutions. But how do we decide when the approach to possible solutions is so fractured? El-Hani and Machado (2020) state that

when the problem is not consensually stated to a significant extent, the search for solutions becomes open-ended, with different stakeholders championing alternative solutions and competing with one another to frame ‘the problem’ in a way that directly connects their preferred solution and their preferred problem definition, and, besides, fulfills their own interests and pleases their own values. (p. 6)

What this suggests is that we must advocate for ourselves and seek out the best possible information. Whether we are reviewing information from the CDC or our state health department, the voices of the professionals can help us gain the insight we need. Understanding is key because it can help us to explore the alternatives we have available as we move forward towards a solution.

Exploring the alternatives is also important because in this stage we can weigh the pros and cons of how the possible decisions will affect us. This stage can be as complex as any and as reflected in the quote above, how we view the possibilities is impacted by how the issue is framed.

Then we will make the decision and implement the plan. In this stage, there will likely be some uncertainty and how we deal with the uncertainty can impact how willing we are to accept the risk in the decision. A young, healthy person is not likely to see the same risk as an older person with a pre-existing condition. As noted though in the available research, even the young can be impacted by the virus and young people can be potential spreaders of the disease when asymptomatic.

The final step in decision making is a review of how well the plan is working. As I write today, the country has started opening back up and the number of cases has started to rise. As critical thinkers, we track our progress and work to make adjustments when needed. The question remains, though: will people use social distancing and face coverings as a way to slow the disease as the economy restarts, or will some slip into a state of denial?

Berkowitz, L., & Harmon-Jones, E. (2004). More thoughts about anger determinants. Emotion, 4(2), 151-155. https://doi.org/10.1037/1528-3542.4.2.151

El-Hani, C., & Machado, V. (2020). COVID-19: The need of an integrated and critical view. Ethnobiology and Conservation, 9. Retrieved from https://search.proquest.com/docview/2404320755?accountid=35812

Roalfe, W. R. (1929). The psychology of fear. The Journal of Abnormal and Social Psychology, 24(1), 32-40. https://doi.org/10.1037/h0071654


A Conscious Rethink

Being In Denial: Signs You Are, Examples, How To Stop


Denial is a defense mechanism with a poor reputation. The common perception is that denial is harmful: that it prevents a person from accepting their reality and moving forward.

However, denial is not necessarily an unhealthy thing. Sometimes it’s necessary.

Denial serves an important role in preserving a person’s mental health when confronted with some kind of awful thing. It’s the brain protecting itself from the immediate anxiety and trauma that thing would cause.

Denial may occur immediately after an experience, or it may emerge later on, when a person refuses to acknowledge something that happened in their past.

The problem with denial is that it prevents you from living your life, addressing your problems, healing from them, and moving forward.

The first step on any healing path is acknowledgment and acceptance of the problem. Being stuck in denial prevents the person from taking that first step.

Speaking to an accredited and experienced therapist can help you process and deal with the thing(s) you are in denial about.

Symptoms Of Denial

There are many negative feelings that people try to avoid. No one wants to be exposed to traumatic experiences, threatening situations, stress, or terrible consequences.

Unfortunately, we don’t always get that choice. Life will sometimes just throw things at you that you have to find a way to deal with and move past.

Still, it can be helpful to know when you or a loved one may be experiencing denial, so you can at least be aware of it.

These are some common signs that someone is in denial:

1. You avoid engaging with or thinking about the problem. (“I’ll just zone out and watch Netflix or mindlessly scroll through social media instead of thinking.”)

2. You blame other people or circumstances for the problem. (“I wouldn’t be drinking so much if my spouse wasn’t always stressing me out.”)

3. You continue harmful actions even though they have negative consequences. (“I’m not going to go to a dentist for this toothache, even though it’ll only get worse.”)

4. You justify your negative behavior or circumstances. (“I can’t have fun without drinking.”)

5. You say you will just address the problem in the future. (“That toothache isn’t a big deal. I’ll deal with it in a couple of weeks.”)

6. You just won’t talk about the problem with anyone. (“I don’t want to talk about it. Ever.”)

7. You ignore and minimize the concerns of others. (“You don’t know me. You don’t know what I can and can’t handle.”)

8. You may be using threats or intimidation to keep other people from talking to you about it. (“F*ck you. I don’t have a problem.”)

9. You may be engaging in self-harm to replace emotional pain with physical pain. (Cutting, punching yourself, burning yourself, etc.)

10. You may be engaging in unhealthy, abusive behaviors. (Substance abuse, working far too much to avoid thinking about it, promiscuous or unhealthy sexual behavior.)

11. You may withdraw from other people, so no one asks too many questions. (Not answering calls, not returning messages, calling out of work, avoiding family members.)

12. You justify your behavior by comparing it to others. (“I don’t have a drinking problem! Mark drinks way more than I do!”)

Denial may also manifest with symptoms similar to those of depression. The person in denial may feel helpless or hopeless about addressing the situation. They may also believe that nothing they do about the situation will make a difference.

Examples Of Denial

Denial is a common defense mechanism for avoiding consequences or dealing with stressful situations.

In fact, it’s quite likely that you have used denial sometime in your life to avoid an unpleasant truth. It’s okay if you have. Everyone does it sooner or later.

Some examples of denial include:

1. Denying a mental health issue.

Many people with mental health issues struggle with accepting that fact. To accept that one has a mental health issue or mental illness is to accept that one is different from the social perception of normal. No one really wants to be an outsider. That’s more of a state of being imposed on us by our experiences and circumstances.

Of course, there can be many reasons to deny mental health problems, ranging from refusal to accept reality to wanting to avoid stigma. It’s no easy thing to deal with. The problem with denying mental health issues is that they don’t go away on their own. They just get worse until you can no longer ignore them.

2. Denying or minimizing substance abuse.

The common way for people to deny substance abuse issues is by either comparing themselves to another or justifying their behavior.

For example, they may say they don’t have a problem because someone else is much worse off. They may also use the excuse that they don’t really have a problem because they are functional and able to work.

3. Denying health issues.

A person diagnosed with a chronic illness or serious medical condition may minimize their diagnosis. They convince themselves that it’s not as bad as the doctors say or look for nonmedical treatment methods. The problem with this is that the person often delays medical treatment, making the problem much worse.

4. Denying a bereavement.

An unexpected death may cause a loved one to avoid the reality of the situation. They refuse to accept it and may act as though the person is still alive so they can avoid the pain and stress of losing their loved one.

Now, denial is a step in the stages of grief, and it is totally normal to want to deny a loss. Eventually, though, you will have to accept it.

5. Denying their own poor behavior.

Many people deny that their actions may have hurt someone they care about. This can look like a flat-out refusal to talk about the situation, or a shifting of the blame onto the person who was harmed. “You made me do it!”

How Do I Address My Denial?

Since denial has many causes, the solution will largely depend on the cause.

Often, denial happens immediately after a difficult or traumatic event because the brain is just creating space to cope. However, you may find that you break through your denial on your own after a while when your brain subconsciously determines that it can handle the stress.

On the other hand, sometimes denial is willful. You may know that there is a problem and refuse to address it. You may also not understand how serious the problem actually is. You won’t break through that denial until you can see that there is a problem and choose to accept it.

You see this often with substance abuse. Unfortunately, many people don’t give sobriety the effort it deserves until they lose everything they care about, like a career, spouse, or family. That ugly reality can punch straight through denial.

The other solution is to look into therapy relevant to the problem you’re facing. People often throw around generic advice like “look for a counselor.”

But the reality is that a specific counselor can be of more help. If you’re trying to get through grief, look into a grief counselor. If you had a traumatic experience, talk to a trauma counselor. If you have mental health issues, talk to a mental health counselor. Substance abuse problems? There are definitely substance abuse counselors out there.

Many people just go out and find whatever counselor they can, don’t make progress, and then write off therapy as ineffective. If possible, try to find a counselor specializing in the problem you’re trying to work through.


A support group can also be a valuable way to navigate your denial. Just being around other people who understand can make you feel far less alone and more empowered to confront the issue. Social connection is a powerful thing on the road to healing.

Still, if you want to try to go it on your own, you may try an approach like this:

1. Consider what you might be afraid of or denying.

You’re not in denial about any old thing, but do you know precisely what you are in denial about?

For instance, maybe you have an illness of some sort. Are you in denial about being ill? No? Then are you in denial about the severity of that illness? No? Okay. But are you in denial about the likelihood that the illness is going to limit the kind of life you lead, or even how long that life is likely to be?

Be as precise as possible about the thing or things you are denying.

2. Consider the problems that may arise from living in denial.

Though you may be able to deny a particular problem or event, it’s impossible to forever deny the real-life consequences of that thing.

Did you lose your mom recently? You can try as hard as you like to deny that, but doing so may put real strain on your relationships with other family members who want to talk about her death or deal with things like funeral arrangements or inheritance.

Or, as alluded to earlier, denying the existence or severity of a health problem can mean it gets worse or even becomes untreatable if left too long.

3. Allow yourself time to think about why you deny the circumstance.

What are your reasons for denying the thing? What are you hoping to gain by doing so?

Perhaps your life is already stressful enough and you just don’t have the emotional bandwidth to be able to process something heavy right now. So, you’re in denial about it as a means to delay having to deal with it.

Or maybe it is purely to avoid the pain and hurt that will come from facing the thing in all its reality. Even if you rationally know something has happened, you can avoid its full effects by distracting yourself in one of a million different ways.

4. Consider what irrational beliefs may be preventing your acceptance.

People deny things for all sorts of reasons, and those reasons often relate to a person’s beliefs.

Examples of these beliefs include:

“I don’t need to accept it because it will sort itself out on its own.” – While this can sometimes be true, very often the thing will need your input to find a resolution. If it is not resolved, it will likely linger in the background and may even get worse until you decide to act.

“The pain would be too much for me to bear.” – People often have far more strength and resolve than they give themselves credit for. This belief may, however, be rational if the person has mental health issues or is already facing incredible adversity and devastation. In which case, external help should be sought.

“Someone else will fix this issue for me.” – Is there any basis for this belief, or are you just hoping beyond hope that someone else will recognize the issue and take time out of their life to address it on your behalf?

“Bad things always happen to me, so what’s the use in trying to make things better?” – This is a victim mentality that might be based on self-esteem and self-worth issues that skew your view of events in your life. Firstly, the reality of your life might not be nearly as bad as you believe. What’s more, while denial allows you to avoid acting, if you just put some time and effort in, you might be able to enjoy better outcomes.

5. Journaling out the situation and your feelings about it can help.

When something exists purely in your mind, it is far easier to turn a blind eye to it. That’s where journaling can come in handy.

When you sit down and actually write out the issue and your thoughts about the issue, it forces you to take those first vital steps to acknowledging both that the issue exists and that you need to do something about it.

And we would highly encourage you to write in a physical journal with a pen rather than a digital journal. The act of writing is far different and in many ways more effective than typing on a screen or keyboard.

6. Talk about it with someone you trust.

The moment the words pass your lips that confirm the existence and severity of an issue you are facing is the moment you begin to stop denying it.

If you can bring yourself to talk about something that you have thus far been avoiding, you begin to challenge and change the beliefs you have about it. You might not do a complete 180 the first time you bring something up, but you will start the process that ends with you accepting that there is a problem that needs to be dealt with.

Just make sure, then, that you don’t ruminate on the problem over and over while continuing to avoid dealing with it. That’s just as unhealthy as denying the existence of the problem in the first place.

Talk about it with the aim of putting real-world actions into place to help resolve the problem, or to help you process the emotional elements of the issue so that you can begin to let go of it.

7. If that doesn’t work, seek additional help through counseling or therapy.

If you are really struggling to confront the thing you’re in denial about, do speak to a professional about it, as discussed above.

How Do I Help Someone With Their Denial?

Sometimes it’s frustrating when your loved one is experiencing their own denial, particularly in issues with mental illness or substance abuse.

Sometimes the problem can be so severe that it negatively affects your life. In that case, it’s generally best for you to seek the help of a therapist to construct boundaries and find a way through the situation. You will want professional support because those situations can turn violent.

You may feel like you want to force the person through their denial to confront reality, but this is often a bad choice. People will often come through their denial in their own time. The brain typically knows what to do to get through things; we just interrupt the process a lot.

After all, who really has the time to sit around and be sad for a while? To cry? To be angry? You’ve got stuff to do! Gotta get to work! That laundry isn’t going to do itself.

As a result, we interrupt our process of acceptance and healing.

That’s also why trying to force someone else through their own process doesn’t work. We impose what may be good for us on someone else, but we don’t always ask if that is good for the other person.

Do offer to listen to the person. Sometimes all a person needs to move through their denial is to be heard. Some people struggle so much and have had no one to just listen to them. You don’t have to try to fix the problem.

And fair warning, it may be an uncomfortable conversation. It’s okay to be uncomfortable. Just keep going through it until the conversation is over.

You may also want to suggest that the person look into professional support once you’re done. However, do not try to force them to seek help, whether it’s counseling or medical care. They will likely get defensive and angry, and you’ll lose any progress you’ve made.

You can also offer to go with them to their first appointments so they can get more comfortable with the idea.

People often use denial to avoid their fears instead of confronting them.

Denial isn’t always an unhealthy thing if it’s temporary. But if it lasts longer than six months, it would be a good idea to seek out professional support.



About The Author


Jack Nollan is a person who has lived with Bipolar Disorder and Bipolar-depression for almost 30 years now. Jack is a mental health writer of 10 years who pairs lived experience with evidence-based information to provide perspective from the side of the mental health consumer. With hands-on experience as the facilitator of a mental health support group, Jack has a firm grasp of the wide range of struggles people face when their mind is not in the healthiest of places. Jack is an activist who is passionate about helping disadvantaged people find a better path.

ABLE blog: thoughts, learnings and experiences


Break through these 5 common critical thinking barriers


Can you think of the last time you made a decision? It was probably about one second ago, even though you may not have realized it.

Our days are filled with choices, from pressing the snooze button on the morning alarm to selecting what to eat for dinner. On average, adults make around 35,000 decisions a day. If you average 16 hours of waking time, that's almost 36 decisions per minute.
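That per-minute figure is easy to sanity-check with a couple of lines of arithmetic (the 35,000-decisions-a-day estimate itself comes from the article, not from anything the code verifies):

```python
decisions_per_day = 35_000
waking_minutes = 16 * 60  # 16 waking hours

decisions_per_minute = decisions_per_day / waking_minutes
print(round(decisions_per_minute, 1))  # → 36.5, i.e. roughly 36 per minute
```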

Most decisions are entirely unconscious: whether or not to scratch an itch, say, or a knee-jerk reaction to the expression on your significant other's face. Others, though, require a more careful and critical examination.

Critical thinking is one of the most valuable skills we can possess in our personal and professional lives. It allows us to analyze information, make sound decisions, and solve problems. However, many people find it difficult to think critically.

This article will discuss what critical thinking is, why it's important, and how you can overcome common critical thinking barriers.

What is critical thinking?

The origin of critical thinking can be traced back thousands of years to the teaching practice of the Greek philosopher Socrates. After discovering that many people couldn't justify the truth of their statements, he encouraged them to question their thoughts deeply before accepting them.

Socrates used open-ended questions to stimulate critical thinking and uncover assumptions, a process that bears his name today — Socratic Questioning. It’s grounded in the belief that thoughtful questioning allows the student to examine ideas logically and determine their validity.

Socrates' method of questioning set the stage for thoughtful reflection. Today, the Foundation for Critical Thinking defines critical thinking as "the art of analyzing and evaluating thinking to improve it." Unlike automatic or subconscious thought, thinking critically requires you to actively use intellectual tools to reach conclusions rather than relying on instinct. This strengthens decision-making skills.

Critical thinking consists of two components:

  • A set of skills used to process information and beliefs
  • The act of consciously applying those skills as a guide for behavior

Each of these components is equally important during the critical thinking process.

What is the critical thinking process?


Critical thinkers evaluate evidence and analyze information before making a judgment. The process requires higher-order thinking skills such as sorting, analyzing, comparing data, and assessing logic and reason.

The critical thinking process consists of five primary elements:

  • Identify the claims. Organize arguments into basic statements and conclusions.
  • Clarify the arguments. Look for inconsistencies and ambiguities in statements.
  • Establish the facts. Verify whether the claims are reasonable, identify missing or omitted information, apply logic, and check for possible contradictions.
  • Evaluate the logic. Analyze whether the assumptions align with the conclusions.
  • Make the decision. Weigh the argument: evidence, logic, and supporting data increase its weight, while contradictions, poor reasoning, or a lack of evidence decrease it.
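As a loose illustration of that final weighing step, the balance of evidence can be pictured as a simple tally. This is only a sketch; the claim, the point labels, and the weights below are invented for the example and are not part of any formal method:

```python
# Toy sketch of the evidence-weighing step described above:
# supporting points add to an argument's weight, weaknesses subtract.
def argument_weight(supporting, undermining):
    return sum(supporting.values()) - sum(undermining.values())

# Hypothetical claim: "remote learning improves outcomes"
supporting = {"peer-reviewed study": 3, "consistent survey data": 2}
undermining = {"small sample size": 2, "possible selection bias": 1}

print(argument_weight(supporting, undermining))  # → 2 (weakly supported)
```

A positive total suggests the evidence leans in the claim's favor; the real judgment, of course, lies in how honestly each point is scored.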

Finding accuracy in ideas and challenging assumptions are essential parts of this process. Observing these two steps closely enables critical thinkers to form their own conclusions.

Why is it important to think critically?

Success in both business and life depends on the ability to think critically.

Human nature doesn't permit us to be completely objective. Instead, we each have viewpoints, closed-mindedness, and social conditioning that limit our capacity for objective thinking. Everyone experiences distorted thinking and cognitive biases, which lead to irrational thought processes. Critical thinking ability is necessary to overcome the limitations of irrational thinking.

Thinking critically is beneficial because it:

  • Promotes problem solving and innovation
  • Boosts creativity and curiosity
  • Encourages deeper self-reflection, self-assertion, and independence
  • Improves career opportunities
  • Builds objectivity and open-mindedness

Critical thinking isn't about reaching the "right" answer — it's about challenging the information you're given to make your own conclusions. When you can question details and think for yourself, you're less likely to be swayed by false claims, misleading arguments, and emotional manipulation.

5 common critical thinking barriers and how to break through them


The ability to think critically is essential to our personal and professional development. To become excellent critical thinkers, we must embrace a growth mindset — the idea that we can cultivate intelligence through learning and practice. This includes stepping out of our comfort zone to push our thinking patterns and checking in to correct ourselves as needed.

Very few of us can think critically without hitting a couple of roadblocks. These critical thinking barriers can come in many forms, including unwarranted assumptions, personal biases, egocentric thinking, and emotions that inhibit us from thinking clearly. By becoming aware of these common challenges and making a conscious effort to counter them, we can improve our critical thinking skills and learn to make better decisions.

Here are five of the most commonly encountered critical thinking barriers, how to spot them, and what you can do to overcome them.

1. Confirmation bias

What it is: Confirmation bias refers to the tendency to see new information as an affirmation of our existing beliefs and opinions. People with this bias disregard opposing points of view in favor of evidence that supports their position.

Why it occurs: Confirmation bias results from our emotional inclination to see the world from our own perspective. Quick reflexes keep us safe, so we interpret information from our own perspective because doing so enables us to react instinctively. Another explanation is that our minds struggle to process two opposing arguments in parallel, so we process only the one we already believe because it's easier.

How to overcome it: Confirmation bias may be the hardest bias to defeat. It's difficult not to hold preconceived notions, but you can train your mind to think differently. Make an effort to be open-minded and look at situations from an alternative perspective. When we're aware of our own confirmation biases and diligently watch out for them, we can avoid favoring particular facts when evaluating arguments.

2. Self-serving bias

What it is: The self-serving bias concerns how we attribute results. An individual with this bias externalizes blame for any undesirable results yet takes credit for successes.

Why it occurs: Researchers have found that people with a self-serving bias make attributions based on their need to maintain a high level of self-esteem. Our minds fear losing confidence if we take responsibility for failures or negative outcomes.

How to overcome it: You can counteract self-serving bias by maintaining a growth mindset. To have a growth mindset, you must be able to admit your errors, examine personal biases, and learn to take criticism. To overcome a self-serving bias, practice self-compassion. Accepting your imperfections and being kind to yourself when you fall short of your goals can help you maintain confidence.

3. Normalcy bias

What it is: The normalcy bias arises from our instinctual need for safety. Under its influence, we tend to overlook new information and common sense so that nothing changes and we can continue to live our lives as usual.

Why it occurs: The normalcy bias is a protection mechanism, a form of denial. Usually active when facing a traumatic event, this bias shuts down the mind to protect us from things that are too painful or confusing to comprehend.

How to overcome it: Although it is the brain's attempt to protect us, the normalcy bias can be harmful — and even dangerous — if it keeps us from facing reality. The best way to overcome it is to face facts and truth head-on, no matter how difficult it may be.

4. Availability heuristic

What it is: The availability heuristic occurs when we rely on the first piece of information that comes to mind without weighing other possibilities, even when it may not be the best option. We assume that information that is more readily accessible is more likely to be true.

Why it occurs: This heuristic stems from the brain’s use of shortcuts to be efficient. It can be used in a wide variety of real-life situations to facilitate fast and accurate estimation.

How to overcome it: Some real-world scenarios (like probability estimations) can benefit from the availability bias, so it's neither possible nor advisable to eliminate it entirely. In the event of uncertainty, however, we must be aware of all relevant data when making judgments, not just that which comes readily to mind.

5. Sunk cost fallacy

What it is: The sunk cost fallacy arises from the instinctual need for commitment. We fall victim to this illusion when we continue doing something even if it's irrational, simply because we’ve already invested resources that we can’t get back.

Why it occurs: The sunk cost fallacy occurs when we’re affected by feelings of loss, guilt, or regret. These innate feelings are hard to overcome — research has found that even rats and mice struggle with sunk costs when pursuing a reward. Because of this tendency, when we feel like we've already put considerable effort into organizing our information and pursuing a result, we tell ourselves that we can’t waste it by changing course.

How to overcome it: Instead of dwelling on past commitments, pay attention to the present and future. Thinking with logical reasoning, in terms of concrete actions instead of feelings, is vital.

Be ABLE to think critically despite barriers

Thinking critically is an essential skill for self-learners. Making sound decisions starts with recognizing our critical thinking barriers. Practicing self-compassion and self-awareness is an excellent way to identify biases in your thinking. From there, you can begin working toward overcoming those obstacles. When you have no critical thinking barriers in your way, you can develop and strengthen the skills that will help you succeed.

I hope you have enjoyed reading this article. Feel free to share, recommend and connect 🙏

Boris

Critical Thinking vs. Denialism

  • by Chris Shelton
  • July 13, 2017 (updated October 21, 2018)

The subject of critical thinking is important. It’s important enough that it drives all of science and discovery and has given our modern culture great power and ability. Everything we have that makes our life easier and better: our technology, our system of government, our entire way of life, is founded on good critical thinking that was and is done by very smart people. It’s important that we have a clear understanding of what critical thinking is, but it’s also important that we show what it’s not. Just because someone says they are a critical thinker doesn’t mean they are. This is a label that actually means something. So let’s talk about this.

What do all of these things have in common:

  • Global conspiracy theories
  • Creationism
  • Anti-vaccination movement
  • Climate change denial
  • Holocaust denial
  • Flat Earth theory

All of these groups, and unfortunately many others, believe that they are enlightened and intelligent critical thinkers. Unfortunately, the truth is the exact opposite. To call anyone who goes in for this kind of nonsense a critical thinker is a misnomer.

Critical thinking is using rational thought and logic to analyze information and make reasonable decisions or conclusions based on facts and evidence. Being critical is also sometimes described as discerning, analytical, diagnostic, exacting, particular, open minded, informed by evidence and disciplined.

Skepticism is popularly thought to be the idea that you doubt the truth of something and is also sometimes described as cynicism, distrust, mistrust, aporetic, suspicion, incredulity, pessimism, defeatism, dubiety, apprehension, nullifidian, disbelief, hesitation, pyrrhonic, inconvincible, ephectic, reluctance, dubiousness, faithlessness, questioning, having qualms, wary, misgiving, mistrustful, wavering, vacillation or lack of confidence. But let’s take an even closer look at this because there are too many people out there who are under the mistaken idea that they are skeptics and critical thinkers but who should be called what they actually are: denialists.

Denialism is a refusal to accept well-established theory, law or evidence. It is not critical thinking, but people who do this tell themselves and anyone else who will listen that they are just being critical thinkers by questioning the accepted party line. They often describe solid scientific consensus and evidence-based conclusions as a conspiracy they are fighting against, having discovered in their internet research that there is some shadowy cabal working very hard to fool the general public for some nefarious reason.

A more precise definition was provided by Mark Hoofnagle on his Denialism blog back in 2007: “Denialism is the employment of rhetorical tactics to give the appearance of argument or legitimate debate, when in actuality there is none. These false arguments are used when one has few or no facts to support one’s viewpoint against a scientific consensus or against overwhelming evidence to the contrary. They are effective in distracting from actual useful debate using emotionally appealing but ultimately empty and illogical assertions.” His entire blog is actually dedicated to this topic and it’s quite good.

Denialists will not usually refute the entirety of a scientific claim, but will take digs at parts of it, usually the parts they themselves don’t really understand but can counter in some fashion that will emotionally or even financially appeal to others. For example, someone could counter the monumental evidence of global climate change by saying that coal and fossil fuels are vital because they provide jobs for working Americans which we cannot economically do without. In other words, what do you want to do Mr. Climate Change Guy, deny poor coal workers the ability to feed their families? Now people who are pushing for less carbon emissions in our atmosphere are the bad guys because they’re trying to kill people, a claim as blatantly false as saying that you should wash your clothes with mud.

Another cause of this is the fact that we don’t have answers for many of the problems and questions that plague our daily life. Science is not a catchall, one-stop-shop for every problem, but is a process of discovery which has been and will continue to move forward at a frustratingly slow pace. We don’t know what causes autism or how to cure it, why ice is slippery or even fully understand why sleep is necessary. There are lots of very sound and good explanations for some parts of these things, but final and accepted conclusions still elude us. This can be upsetting to people who need answers and they then turn to totally fake pseudoscience and nonsense in desperation. Just because we don’t always know what causes something doesn’t mean we can’t rule some things out.

In a paper called Manufactroversy: The Art of Creating Controversy Where None Existed, University of Washington Professor Leah Ceccarelli wrote:

“First, they skillfully invoke values that are shared by the scientific community and the American public alike, like free speech, skeptical inquiry, and the revolutionary force of new ideas against a repressive orthodoxy. It is difficult to argue against someone who invokes these values without seeming unscientific or un-American.

“Second, they exploit a tension between the technical and public spheres in … American life. Highly specialized scientific experts can’t spare the time to engage in careful public communication, and are then surprised when the public distrusts, fears, or opposes them.

“Third, today’s sophists exploit a public misconception about what science is. They portray science as a structure of complete consensus built from the steady accumulation of unassailable data. Any dissent by any scientist is then seen as evidence that there’s no consensus, and thus truth must not have been discovered yet.”

A recent video I posted to take a stab at what are perhaps the lowest hanging fruit in the critical thinking world, the Flat Earthers, demonstrates this. People who believe the earth is flat can be kindly described as a group of people who are either scientifically illiterate or horribly undereducated. Now the truth is that some of them are highly educated but refute their own learning because of religious zealotry, mental health issues or both. The comments that appeared within hours of posting my video were an onslaught of personal insults, logical fallacies and very good demonstrations of people who have no clue how physics, gravity and electromagnetism work but who were happy to expound on their ignorance for paragraphs at a time. It would be hilarious if it wasn’t so sad.

Astrophysicist Neil deGrasse Tyson has described the problem this way: “A skeptic will question claims, then embrace the evidence. A denier will question claims, then reject the evidence.” He has discussed at length the issues with science education and how schooling is more about reciting facts and dates and pieces of information rather than teaching the more fundamental basis of science: “Science is a way of understanding what is and is not true in the world.” When this understanding is replaced with religious dogma, conspiracy theories, unfounded opinions masquerading as facts or just totally invented ideas, we have a problem. And unfortunately, we are glutted with all of this for many different reasons.

What these folks have proven to me is that it is not the facts that matter, it’s the way we approach the use of facts and our attitude about what is and isn’t true. When we let our personal biases and emotions get the better of our rationality, we can make really bad decisions. What is the simplest and most immediate answer? To always question ourselves. To always be open to considering new or different ideas. When engaging in a debate or argument, the purpose is often to win the conversation by proving how we are right and the other person is wrong, but winning arguments is not the purpose of critical thinking. That’s a different subject called rhetoric and, believe me, winning arguments can be done without having any truth or facts on your side at all.

Physicist Mark Boslough wrote:

“Real skeptics do not cling to absurd conspiracy theories for which there is no evidence, nor do they engage in obfuscation, misrepresentation, data fabrication, smear campaigns, or intimidation tactics. These are the methods of deniers.”

So my message is don’t be that guy or gal. Keep an open mind, maintain a truly skeptical attitude, and don’t be afraid to accept evidence and facts. If you find yourself having to do a lot of mental gymnastics to make something make sense, you probably don’t understand it well enough to even be talking about it yet. Instead, just learn more about it. There’s nothing at all wrong with acknowledging that you simply don’t know something. In fact, that’s the hallmark of good critical thinking.

Thank you for watching.

3 thoughts on “Critical Thinking vs. Denialism”

Chris, I think one important difference between a denier and a critical thinker is the ability of a critical thinker to admit they’re wrong when sufficient evidence accumulates to undermine their own theory.

A denialist believes that everyone else is wrong and only they (and probably a small fraternity of like-minded people) are right. The satisfaction in being a denialist is the feeling of power from having some secret knowledge that the poor, benighted souls in the other 99.9% of humanity don’t have. I suspect that denialism functions a lot like membership in a cult — you have to engage in a lot of thought stopping to consistently ignore the evidence that undermines your belief. And you have to isolate yourself from people who disagree with you lest they keep picking at your beliefs.

I’ve found that a lot of conspiracy theory beliefs and various denialist “scientific” theories often depend on one assumption (either explicit, or sometimes implicit) that, when disproven, blows the theory up completely.

One such belief is the idea that oil companies control the price of oil and decide, via some sort of shadowy cabal, just how much to produce to control the global economy. The reality is that the oil business is an incredibly well-understood business and prices are driven over the long term by supply and demand. There are indeed short-term dislocations that individual players can effect, such as the 1973 Arab oil embargo and the run-up to $140 a barrel in 2007 as hedge funds tried to turn commodities into an “asset class” like stocks or bonds (they’re not). But eventually those distortions get blown out of the market and it reverts to supply and demand.

And the fact that oil companies can’t manipulate the market over the long term is established by one fact that the average citizen doesn’t understand: it is very hard to turn oil wells off. You can slow production very slightly, but you can’t turn a well off and restart it 3 years or even 3 months later. The geology won’t permit it. So there is always more oil. In fact, when oil demand slows suddenly, as it did in the beginning of the credit crisis, there’s literally no place to store it while it awaits a buyer. This one simple fact completely demolishes any possible conspiracy theories that there is a shadowy secret club of oil industry execs that have consistently set prices over the decades.

This is not to make out oil company CEOs as saints, or as victims of conspiracy theorists. They’re not. I’ve met a number of oil company CEOs and would not be in a hurry to invite any of them home. But the idea that they have super powers to control the world economy is laughable and is an easily demolished juvenile fantasy.

Great comment John. Thank you.

41+ Critical Thinking Examples (Definition + Practices)

Critical thinking is an essential skill in our information-overloaded world, where figuring out what is fact and fiction has become increasingly challenging.

But why is critical thinking essential? Put simply, critical thinking empowers us to make better decisions, challenge and validate our beliefs and assumptions, and understand and interact with the world more effectively and meaningfully.

Critical thinking is like using your brain's "superpowers" to make smart choices. Whether it's picking the right insurance, deciding what to do in a job, or discussing topics in school, thinking deeply helps a lot. In the next parts, we'll share real-life examples of when this superpower comes in handy and give you some fun exercises to practice it.

Critical Thinking Process Outline

Critical thinking means thinking clearly and fairly without letting personal feelings get in the way. It's like being a detective, trying to solve a mystery by using clues and thinking hard about them.

It isn't always easy to think critically, as it can take a pretty smart person to see some of the questions that aren't being answered in a certain situation. But, we can train our brains to think more like puzzle solvers, which can help develop our critical thinking skills.

Here's what it looks like step by step:

Spotting the Problem: It's like discovering a puzzle to solve. You see that there's something you need to figure out or decide.

Collecting Clues: Now, you need to gather information. Maybe you read about it, watch a video, talk to people, or do some research. It's like getting all the pieces to solve your puzzle.

Breaking It Down: This is where you look at all your clues and try to see how they fit together. You're asking questions like: Why did this happen? What could happen next?

Checking Your Clues: You want to make sure your information is good. This means seeing if what you found out is true and if you can trust where it came from.

Making a Guess: After looking at all your clues, you think about what they mean and come up with an answer. This answer is like your best guess based on what you know.

Explaining Your Thoughts: Now, you tell others how you solved the puzzle. You explain how you thought about it and how you answered. 

Checking Your Work: This is like looking back and seeing if you missed anything. Did you make any mistakes? Did you let any personal feelings get in the way? This step helps make sure your thinking is clear and fair.

And remember, you might sometimes need to go back and redo some steps if you discover something new. If you realize you missed an important clue, you might have to go back and collect more information.

Critical Thinking Methods

Just like doing push-ups or running helps our bodies get stronger, there are special exercises that help our brains think better. These brain workouts push us to think harder, look at things closely, and ask many questions.

It's not always about finding the "right" answer. Instead, it's about the journey of thinking and asking "why" or "how." Doing these exercises often helps us become better thinkers and makes us curious to know more about the world.

Now, let's look at some brain workouts to help us think better:

1. "What If" Scenarios

Imagine crazy things happening, like, "What if there was no internet for a month? What would we do?" These games help us think of new and different ideas.

2. Debate Both Sides

Pick a hot topic. Argue one side of it and then try arguing the opposite. This makes us see different viewpoints and think deeply about a topic.

3. Analyze Visual Data

Check out charts or pictures with lots of numbers and info but no explanations. What story are they telling? This helps us get better at understanding information just by looking at it.

4. Mind Mapping

Write an idea in the center and then draw lines to related ideas. It's like making a map of your thoughts. This helps us see how everything is connected.

There's lots of mind-mapping software, but it's also nice to do this by hand.

5. Weekly Diary

Every week, write about what happened, the choices you made, and what you learned. Writing helps us think about our actions and how we can do better.

6. Evaluating Information Sources

Collect stories or articles about one topic from newspapers or blogs. Which ones are trustworthy? Which ones might be a little biased? This teaches us to be smart about where we get our info.

There are many resources to help you determine if information sources are factual or not.

7. Socratic Questioning

This way of thinking is called the Socratic Method, named after an ancient thinker from Greece. It's about asking lots of questions to understand a topic. You can do this by yourself or chat with a friend.

Start with a Big Question:

"What does 'success' mean?"

Dive Deeper with More Questions:

"Why do you think of success that way?" "Do TV shows, friends, or family make you think that?" "Does everyone think about success the same way?"

Consider Counterexamples:

"Can someone be a winner even if they aren't rich or famous?" "Can someone feel like they didn't succeed, even if everyone else thinks they did?"

Look for Real-life Examples:

"Who is someone you think is successful? Why?" "Was there a time you felt like a winner? What happened?"

Think About Other People's Views:

"How might a person from another country think about success?" "Does the idea of success change as we grow up or as our life changes?"

Think About What It Means:

"How does your idea of success shape what you want in life?" "Are there problems with only wanting to be rich or famous?"

Look Back and Think:

"After talking about this, did your idea of success change? How?" "Did you learn something new about what success means?"

8. Six Thinking Hats 

Edward de Bono came up with a cool way to solve problems by thinking in six different ways, like wearing different colored hats. You can do this independently, but it might be more effective in a group so everyone can have a different hat color. Each color has its way of thinking:

White Hat (Facts): Just the facts! Ask, "What do we know? What do we need to find out?"

Red Hat (Feelings): Talk about feelings. Ask, "How do I feel about this?"

Black Hat (Careful Thinking): Be cautious. Ask, "What could go wrong?"

Yellow Hat (Positive Thinking): Look on the bright side. Ask, "What's good about this?"

Green Hat (Creative Thinking): Think of new ideas. Ask, "What's another way to look at this?"

Blue Hat (Planning): Organize the talk. Ask, "What should we do next?"

When using this method with a group:

  • Explain all the hats.
  • Decide which hat to wear first.
  • Make sure everyone switches hats at the same time.
  • Finish with the Blue Hat to plan the next steps.

9. SWOT Analysis

SWOT Analysis is like a game plan for businesses to know where they stand and where they should go. "SWOT" stands for Strengths, Weaknesses, Opportunities, and Threats.

There are a lot of SWOT templates out there for how to do this visually, but you can also think it through. It doesn't just apply to businesses; it can also be a good way to evaluate whether a project you're working on is succeeding.

Strengths: What's working well? Ask, "What are we good at?"

Weaknesses: Where can we do better? Ask, "Where can we improve?"

Opportunities: What good things might come our way? Ask, "What chances can we grab?"

Threats: What challenges might we face? Ask, "What might make things tough for us?"

Steps to do a SWOT Analysis:

  • Goal: Decide what you want to find out.
  • Research: Learn about your business and the world around it.
  • Brainstorm: Get a group and think together. Talk about strengths, weaknesses, opportunities, and threats.
  • Pick the Most Important Points: Some things might be more urgent or important than others.
  • Make a Plan: Decide what to do based on your SWOT list.
  • Check Again Later: Things change, so look at your SWOT again after a while to update it.

Now that you have a few tools for thinking critically, let’s get into some specific examples.

Everyday Examples

Life is a series of decisions. From the moment we wake up, we're faced with choices – some trivial, like choosing a breakfast cereal, and some more significant, like buying a home or confronting an ethical dilemma at work. While it might seem that these decisions are disparate, they all benefit from the application of critical thinking.

10. Deciding to buy something

Imagine you want a new phone. Don't just buy it because the ad looks cool. Think about what you need in a phone. Look up different phones and see what people say about them. Choose the one that's the best deal for what you want.

11. Deciding what is true

There's a lot of news everywhere. Don't believe everything right away. Think about why someone might be telling you this. Check if what you're reading or watching is true. Make up your mind after you've looked into it.

12. Deciding when you’re wrong

Sometimes, friends can have disagreements. Don't just get mad right away. Try to see where they're coming from. Talk about what's going on. Find a way to fix the problem that's fair for everyone.

13. Deciding what to eat

There's always a new diet or exercise that's popular. Don't just follow it because it's trendy. Find out if it's good for you. Ask someone who knows, like a doctor. Make choices that make you feel good and stay healthy.

14. Deciding what to do today

Everyone is busy with school, chores, and hobbies. Make a list of things you need to do. Decide which ones are most important. Plan your day so you can get things done and still have fun.

15. Making Tough Choices

Sometimes, it's hard to know what's right. Think about how each choice will affect you and others. Talk to people you trust about it. Choose what feels right in your heart and is fair to others.

16. Planning for the Future

Big decisions, like where to go to school, can be tricky. Think about what you want in the future. Look at the good and bad of each choice. Talk to people who know about it. Pick what feels best for your dreams and goals.

Job Examples

17. Solving Problems

Workers brainstorm ways to fix a machine quickly without making things worse when a machine breaks at a factory.

18. Decision Making

A store manager decides which products to order more of based on what's selling best.

19. Setting Goals

A team leader helps their team decide what tasks are most important to finish this month and which can wait.

20. Evaluating Ideas

At a team meeting, everyone shares ideas for a new project. The group discusses each idea's pros and cons before picking one.

21. Handling Conflict

Two workers disagree on how to do a job. Instead of arguing, they talk calmly, listen to each other, and find a solution they both like.

22. Improving Processes

A cashier thinks of a faster way to ring up items so customers don't have to wait as long.

23. Asking Questions

Before starting a big task, an employee asks for clear instructions and checks if they have the necessary tools.

24. Checking Facts

Before presenting a report, someone double-checks all their information to make sure there are no mistakes.

25. Planning for the Future

A business owner thinks about what might happen in the next few years, like new competitors or changes in what customers want, and makes plans based on those thoughts.

26. Understanding Perspectives

A team is designing a new toy. They think about what kids and parents would both like instead of just what they think is fun.

School Examples

27. Researching a Topic

For a history project, a student looks up different sources to understand an event from multiple viewpoints.

28. Debating an Issue

In a class discussion, students pick sides on a topic, like school uniforms, and share reasons to support their views.

29. Evaluating Sources

While writing an essay, a student checks if the information from a website is trustworthy or might be biased.

30. Problem Solving in Math

When stuck on a tricky math problem, a student tries different methods to find the answer instead of giving up.

31. Analyzing Literature

In English class, students discuss why a character in a book made certain choices and what those decisions reveal about them.

32. Testing a Hypothesis

For a science experiment, students guess what will happen and then conduct tests to see if they're right or wrong.

33. Giving Peer Feedback

After reading a classmate's essay, a student offers suggestions for improving it.

34. Questioning Assumptions

In a geography lesson, students consider why certain countries are called "developed" and what that label means.

35. Designing a Study

For a psychology project, students plan an experiment to understand how people's memories work and think of ways to ensure accurate results.

36. Interpreting Data

In a science class, students look at charts and graphs from a study, then discuss what the information tells them and if there are any patterns.

Critical Thinking Puzzles

Not all scenarios will have a single correct answer that can be figured out by thinking critically. Sometimes we have to think critically about ethical choices or moral behaviors. 

Here are some mind games and scenarios you can solve using critical thinking. You can see the solution(s) at the end of the post.

37. The Farmer, Fox, Chicken, and Grain Problem

A farmer is at a riverbank with a fox, a chicken, and a grain bag. He needs to get all three items across the river. However, his boat can only carry himself and one of the three items at a time. 

Here's the challenge:

  • If the fox is left alone with the chicken, the fox will eat the chicken.
  • If the chicken is left alone with the grain, the chicken will eat the grain.

How can the farmer get all three items across the river without any item being eaten? 
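If you want to check an answer, river-crossing puzzles like this one can be solved mechanically as a breadth-first search over the possible bank states. The Python sketch below is illustrative rather than part of the original post (the state encoding and move labels are my own), and running it prints a solution, so treat it as a spoiler:

```python
from collections import deque

# States are (farmer_bank, items_on_left_bank); everything starts on the
# left bank ("L") and must end up on the right bank ("R").
ITEMS = {"fox", "chicken", "grain"}
UNSAFE = [{"fox", "chicken"}, {"chicken", "grain"}]  # pairs that can't be left alone

def safe(state):
    farmer, left = state
    unattended = left if farmer == "R" else ITEMS - left
    return not any(pair <= unattended for pair in UNSAFE)

def solve():
    start = ("L", frozenset(ITEMS))
    goal = ("R", frozenset())
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        farmer, left = state
        here = left if farmer == "L" else ITEMS - left
        # The farmer crosses alone or with one item from his bank.
        for cargo in [None, *sorted(here)]:
            new_left = set(left)
            if cargo is not None:
                (new_left.remove if farmer == "L" else new_left.add)(cargo)
            nxt = ("R" if farmer == "L" else "L", frozenset(new_left))
            if nxt not in seen and safe(nxt):
                seen.add(nxt)
                queue.append((nxt, path + [cargo or "cross alone"]))

print(solve())
```

Because breadth-first search explores states in order of distance from the start, the first path it returns is a shortest one: seven crossings.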

38. The Rope, Jar, and Pebbles Problem

You are in a room with two long ropes hanging from the ceiling. Each rope is just out of arm's reach from the other, so you can't hold onto one rope and reach the other simultaneously. 

Your task is to tie the two rope ends together, but you can't move the position where they hang from the ceiling.

You are given a jar full of pebbles. How do you complete the task?

39. The Two Guards Problem

Imagine there are two doors. One door leads to certain doom, and the other leads to freedom. You don't know which is which.

In front of each door stands a guard. One guard always tells the truth. The other guard always lies. You don't know which guard is which.

You can ask only one question to one of the guards. What question should you ask to find the door that leads to freedom?
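Logic puzzles with this few cases can be checked exhaustively. The snippet below is an illustrative sketch, not part of the original post, and it spoils the puzzle: it encodes the classic question "Which door would the other guard say leads to freedom?" and verifies that picking the opposite of the door named always works:

```python
# Doors are numbered 0 and 1; `freedom` is the door that leads to freedom.
def guard_reply(guard_honest, other_honest, freedom):
    """The guard's answer to: 'Which door would the OTHER guard
    say leads to freedom?'"""
    doom = 1 - freedom
    other_would_say = freedom if other_honest else doom
    # An honest guard reports that answer faithfully; a liar reports the opposite.
    return other_would_say if guard_honest else 1 - other_would_say

# One guard is honest and one lies; we don't know which one we are asking.
for freedom in (0, 1):
    for honest in (True, False):
        named = guard_reply(honest, not honest, freedom)
        # Picking the door the guard did NOT name always finds freedom.
        assert 1 - named == freedom

print("Asking either guard and taking the opposite door always works.")
```

The double negation is why the trick works: the honest guard truthfully reports the liar's wrong answer, and the liar falsely reports the honest guard's right answer, so both name the doom door.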

40. The Hourglass Problem

You have two hourglasses. One measures 7 minutes when turned over, and the other measures 4 minutes. Using just these hourglasses, how can you time exactly 9 minutes?
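One way to reason about hourglass puzzles is to track the sand in each glass event by event. The short simulation below is an illustrative sketch (and a spoiler, since the flip schedule it encodes is a valid answer); it advances time between events and confirms the last glass empties exactly 9 minutes after the start:

```python
def nine_minutes():
    # Minutes of sand left in the TOP bulb of each hourglass.
    four, seven = 4, 7
    t = 0

    def advance(dt):
        nonlocal t, four, seven
        t += dt
        four = max(four - dt, 0)
        seven = max(seven - dt, 0)

    advance(4)          # t=4: the 4-minute glass empties
    four = 4            # flip the 4-minute glass
    advance(3)          # t=7: the 7-minute glass empties (4-glass has 1 min left)
    seven = 7           # flip the 7-minute glass
    advance(1)          # t=8: the 4-glass empties; the 7-glass has run for 1 min
    seven = 7 - seven   # flip the 7-glass: the 1 min already run is now on top
    advance(seven)      # t=9: the 7-glass empties
    return t

print(nine_minutes())  # -> 9
```

The key move is the last flip: at the 8-minute mark the 7-minute glass has drained exactly 1 minute of sand, so turning it over buys exactly 1 more minute.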

41. The Lifeboat Dilemma

Imagine you're on a ship that's sinking. You get on a lifeboat, but it's already too full and might flip over. 

Nearby in the water, five people are struggling: a scientist close to finding a cure for a sickness, an old couple who've been together for a long time, a mom with three kids waiting at home, and a tired teenager who helped save others but is now in danger. 

You can only save one person without making the boat flip. Who would you choose?

42. The Tech Dilemma

You work at a tech company, helping build a computer program for small businesses. You're almost ready to release it, but you find out there's a small chance it has a flaw that could expose users' private info.

If you decide to fix it, you must wait two more months before releasing it. But your bosses want you to release it now. What would you do?

43. The History Mystery

Dr. Amelia is a history expert. She's studying where a group of people traveled long ago. She reads old letters and documents to learn about it. But she finds some letters that tell a different story than what most people believe. 

If she says this new story is true, it could change what people learn in school and what they think about history. What should she do?

The Role of Bias in Critical Thinking

Have you ever decided you don’t like someone before you even know them? Or maybe someone shared an idea with you that you immediately loved without even knowing all the details. 

This experience is called bias, which occurs when you like or dislike something or someone without a good reason or knowing why. It can also take shape in certain reactions to situations, like a habit or instinct. 

Bias comes from our own experiences, what friends or family tell us, or even things we are born believing. Sometimes, bias can help us stay safe, but other times it stops us from seeing the truth.

Not all bias is bad. Bias can be a mechanism for assessing our potential safety in a new situation. If we are biased to think that anything long, thin, and coiled up is a snake, we might treat a rope as something to be afraid of before we realize it is just a rope.

While bias might serve us in some situations (like jumping out of the way of an actual snake before we have time to process that we need to be jumping out of the way), it often harms our ability to think critically.

How Bias Gets in the Way of Good Thinking

Selective Perception: We only notice things that match our ideas and ignore the rest. 

It's like only picking red candies from a mixed bowl because you think they taste the best, even though every candy in the bowl tastes the same. It can also look like noticing all the signs that our partner is cheating on us but choosing to ignore them because we are happy the way we are (or at least, we think we are).

Agreeing with Yourself: This is called “confirmation bias”: we only listen to ideas that match our own, and we seek, interpret, and remember information in ways that confirm what we already think we know or believe.

An example is when someone wants to know if it is safe to vaccinate their children but already believes that vaccines are not safe, so they only look for information supporting the idea that vaccines are bad.

Thinking We Know It All: Similar to confirmation bias, this is called “overconfidence bias.” Sometimes we think our ideas are the best and don't listen to others. This can stop us from learning.

Have you ever met someone you would consider a “know-it-all”? They probably have a lot of overconfidence bias, because while they may know many things accurately, they can't know everything. If they act like they do, they are showing overconfidence bias.

There's a related bias called the Dunning-Kruger effect: someone is bad at what they do, but they believe and act like they are the best.

Following the Crowd: This is formally called “groupthink.” If everyone else agrees, it's hard to speak up with a different idea. But going along with the group can lead to mistakes.

An example we've all likely seen is the cool clique in primary school. There is usually one person who is the head of the group, the “coolest kid in school,” and everyone listens to them and does what they want, even when they don't think it's a good idea.

How to Overcome Biases

Here are a few ways to learn to think better, free from our biases (or at least aware of them!).

Know Your Biases: Realize that everyone has biases. If we know about them, we can think better.

Listen to Different People: Talking to different kinds of people can give us new ideas.

Ask Why: Always ask yourself why you believe something. Is it true, or is it just a bias?

Understand Others: Try to think about how others feel. It helps you see things in new ways.

Keep Learning: Always be curious and open to new information.

In today's world, everything changes fast, and there's so much information everywhere. This makes critical thinking super important. It helps us distinguish between what's real and what's made up. It also helps us make good choices. But thinking this way can be tough sometimes because of biases. These are like sneaky thoughts that can trick us. The good news is we can learn to see them and think better.

There are cool tools and ways we've talked about, like the "Socratic Questioning" method and the "Six Thinking Hats." These tools help us get better at thinking. These thinking skills can also help us in school, work, and everyday life.

We’ve also looked at specific scenarios where critical thinking would be helpful, such as deciding what diet to follow and checking facts.

Thinking isn't just a skill—it's a special talent we improve over time. Working on it lets us see things more clearly and understand the world better. So, keep practicing and asking questions! It'll make you a smarter thinker and help you see the world differently.

Critical Thinking Puzzles (Solutions)

The Farmer, Fox, Chicken, and Grain Problem

  • The farmer first takes the chicken across the river and leaves it on the other side.
  • He returns to the original side and takes the fox across the river.
  • After leaving the fox on the other side, he returns the chicken to the starting side.
  • He leaves the chicken on the starting side and takes the grain bag across the river.
  • He leaves the grain with the fox on the other side and returns to get the chicken.
  • The farmer takes the chicken across, and now all three items (the fox, the chicken, and the grain) are safely on the other side of the river.
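If you'd like to check the puzzle programmatically, the steps above can be verified with a short breadth-first search over bank states. This is an illustrative sketch (the state encoding and function names are my own, not from the post); it also confirms that seven crossings is the minimum.

```python
from collections import deque

# A state is (farmer, fox, chicken, grain), each True when on the far bank.
ITEMS = ("fox", "chicken", "grain")

def unsafe(state):
    farmer, fox, chicken, grain = state
    # Fox with chicken, or chicken with grain, without the farmer present.
    return (fox == chicken != farmer) or (chicken == grain != farmer)

def solve():
    start, goal = (False,) * 4, (True,) * 4
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        farmer = state[0]
        for cargo in (None, 0, 1, 2):  # cross alone, or with one item
            if cargo is not None and state[cargo + 1] != farmer:
                continue  # that item is on the other bank
            nxt = list(state)
            nxt[0] = not farmer
            if cargo is not None:
                nxt[cargo + 1] = not farmer
            nxt = tuple(nxt)
            if nxt not in seen and not unsafe(nxt):
                seen.add(nxt)
                move = "alone" if cargo is None else ITEMS[cargo]
                queue.append((nxt, path + [move]))
    return None

print(solve())  # the shortest sequence of crossings (7 moves)
```

Because breadth-first search explores states level by level, the first solution it reaches is guaranteed to be the shortest one.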

The Rope, Jar, and Pebbles Problem

  • Take one rope and tie the jar of pebbles to its end.
  • Swing the rope with the jar in a pendulum motion.
  • While the rope is swinging, grab the other rope and wait.
  • As the swinging rope comes back within reach due to its pendulum motion, grab it.
  • With both ropes within reach, untie the jar and tie the rope ends together.

The Two Guards Problem

Ask either guard: "Which door would the other guard say leads to freedom?" Both guards will point to the door that leads to doom, so choose the opposite door.
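Since there are only four cases (which guard is the liar, times which door leads to freedom), the classic strategy of asking "Which door would the other guard say leads to freedom?" can be checked exhaustively. A small sketch (function and variable names are illustrative):

```python
from itertools import product

# Doors are 0 and 1; `freedom_door` is the door that leads to freedom.
def named_door(asked_is_liar, freedom_door):
    """Answer to: 'Which door would the other guard say leads to freedom?'"""
    other_is_liar = not asked_is_liar
    # What the other guard would actually say if asked directly.
    others_claim = (1 - freedom_door) if other_is_liar else freedom_door
    # The asked guard reports that claim, lying about it if he is the liar.
    return (1 - others_claim) if asked_is_liar else others_claim

for asked_is_liar, freedom_door in product([False, True], [0, 1]):
    # In all four cases the named door is the doom door.
    assert named_door(asked_is_liar, freedom_door) != freedom_door

print("Both guards always name the doom door; choose the opposite.")
```

The check works because the question always routes the answer through exactly one lie: either the other guard's lie, or the asked guard's lie about the other's truthful answer.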

The Hourglass Problem

  • Start both hourglasses. 
  • When the 4-minute hourglass runs out, turn it over.
  • When the 7-minute hourglass runs out, the 4-minute hourglass will have been running for 3 minutes. Turn the 7-minute hourglass over. 
  • When the 4-minute hourglass runs out for the second time (a total of 8 minutes have passed), the 7-minute hourglass has been running for 1 minute. Turn the 7-minute hourglass over again; when the 1 minute of sand that had run through flows back down, a total of 9 minutes has passed.
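The flip schedule can be replayed as a simple timeline to check the arithmetic. This sketch hard-codes the steps described (the function and names are my own):

```python
# Replay the hourglass flip schedule and report the total time measured.
def time_nine_minutes():
    events = []
    # t = 0: start both hourglasses (7-minute and 4-minute).
    # t = 4: the 4-minute glass empties; flip it.
    events.append((4, "flip the 4-minute glass"))
    # t = 7: the 7-minute glass empties; flip it.
    # (The 4-minute glass has been running 3 minutes, so 1 minute is left.)
    events.append((7, "flip the 7-minute glass"))
    # t = 8: the 4-minute glass empties again; the 7-minute glass has run
    # 1 minute since its flip, so flipping it leaves 1 minute of sand on top.
    events.append((8, "flip the 7-minute glass again"))
    # t = 9: the 7-minute glass empties -- exactly 9 minutes measured.
    events.append((9, "stop"))
    return events[-1][0], events

total, events = time_nine_minutes()
print(total)  # 9
```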

The Boat and Weights Problem

Take the cat over first and leave it on the other side. Then, return and take the fish across next. When you get there, take the cat back with you. Leave the cat on the starting side and take the cat food across. Lastly, return to get the cat and bring it to the other side.

The Lifeboat Dilemma

There isn’t one correct answer to this problem. Here are some elements to consider:

  • Moral Principles: What values guide your decision? Is it the potential greater good for humanity (the scientist)? The value of long-standing love and commitment (the elderly couple)? The future of young children who depend on their mother? Or the selfless bravery of the teenager?
  • Future Implications: Consider the future consequences of each choice. Saving the scientist might benefit millions in the future, but what moral message does it send about the value of individual lives?
  • Emotional vs. Logical Thinking: While it's essential to engage empathy, it's also crucial not to let emotions cloud judgment entirely. For instance, while the teenager's bravery is commendable, does it make him more deserving of a spot on the boat than the others?
  • Acknowledging Uncertainty: The scientist claims to be close to a significant breakthrough, but there's no certainty. How does this uncertainty factor into your decision?
  • Personal Bias: Recognize and challenge any personal biases, such as biases towards age, profession, or familial status.

The Tech Dilemma

Again, there isn’t one correct answer to this problem. Here are some elements to consider:

  • Evaluate the Risk: How severe is the potential vulnerability? Can it be easily exploited, or would it require significant expertise? Even if the circumstances are rare, what would be the consequences if the vulnerability were exploited?
  • Stakeholder Considerations: Different stakeholders will have different priorities. Upper management might prioritize financial projections, the marketing team might be concerned about the product's reputation, and customers might prioritize the security of their data. How do you balance these competing interests?
  • Short-Term vs. Long-Term Implications: While launching on time could meet immediate financial goals, consider the potential long-term damage to the company's reputation if the vulnerability is exploited. Would the short-term gains be worth the potential long-term costs?
  • Ethical Implications : Beyond the financial and reputational aspects, there's an ethical dimension to consider. Is it right to release a product with a known vulnerability, even if the chances of it being exploited are low?
  • Seek External Input: Consulting with cybersecurity experts outside your company might be beneficial. They could provide a more objective risk assessment and potential mitigation strategies.
  • Communication: How will you communicate the decision, whatever it may be, both internally to your team and upper management and externally to your customers and potential users?

The History Mystery

Dr. Amelia should take the following steps:

  • Verify the Letters: Before making any claims, she should check whether the letters are authentic and not forged. She can do this by examining when and where they were written and whether they are consistent with other sources from that time.
  • Get a Second Opinion: It's always good to have someone else look at what you've found. Dr. Amelia could show the letters to other history experts and see their thoughts.
  • Research More: Maybe there are more documents or letters out there that support this new story. Dr. Amelia should keep looking to see if she can find more evidence.
  • Share the Findings: If Dr. Amelia believes the letters are true after all her checks, she should tell others. This can be through books, talks, or articles.
  • Stay Open to Feedback: Some people might agree with Dr. Amelia, and others might not. She should listen to everyone and be ready to learn more or change her mind if new information arises.

Ultimately, Dr. Amelia's job is to find out the truth about history and share it. It's okay if this new truth differs from what people used to believe. History is about learning from the past, no matter the story.



Springer Nature - PMC COVID-19 Collection

When Science Denial Meets Epistemic Understanding

Ayça Fackler

Department of Mathematics and Science Education, The University of Georgia, Athens, GA USA

Science denial has a long history of causing harm in contemporary society when ignored. Recent discussions of science denial suggest that correcting people’s false beliefs rarely has an impact on eliminating the adherence to false beliefs and assumptions, which is called the backfire effect. This paper brings the backfire effect within the context of science denial to the attention of science education researchers and practitioners and discusses the potential role(s) of epistemic understanding of knowledge production in science in dealing with the rejection of scientific evidence and claims in science classrooms. The use of epistemic understanding of knowledge production in science with a focus on avoiding the backfire effect may increase the potential for science education research to produce fruitful strategies which advance students’ attitudes toward science and deepen students’ understanding of how science works through divergent perspectives. There are some areas that need to be focused on and investigated for their potential to combat science denial and the backfire effect while foregrounding the role(s) epistemic understanding of knowledge production for science instruction. These areas include expanding ways of knowing and marking the boundary between the scientific way of knowing and other ways of knowing at the same time, comparing claims and arguments that derive from different frameworks, teaching about the power and limitations of science, and bringing different and similar ways science is done to students’ attention.

Introduction

There has been increased attention paid to science denial in both educational and social contexts (Hansson 2017b ; Liu 2012 ; Rosenau 2012 ). Science denial is defined as “the systematic rejection of empirical evidence to avoid [personally and subjectively] undesirable facts or conclusions” (Liu 2012 , p. 129). Some typical examples of science denial are denial of climate change, relativity theory, evolution, the origin of life, AIDS, vaccination, and tobacco disease. Science denial is a social phenomenon, and it is one form of pseudo-science (Bardon 2020 ). Another form is called pseudo-theory promotion. While science denial is coloured by a growing antipathy towards particular scientific theories and the refusal of some parts of science (e.g., climate change denial, evolution denial, continental drift denial, the origin of life, or relativity theory denial), pseudo-theory promotion is based on attempts to construct personal theories or claims (e.g., transcendental meditation, astrology, herbal medicine, or iridology) (Hansson 2017b ). Hansson ( 2017b , pp. 43–44) outlined ten sociological characteristics shared by science denialists and pseudo-theory promoters, as listed in Table 1.

Ten sociological characteristics of science denialists and pseudo-theory promoters (adapted from Hansson 2017b )

Science denial is slightly different than pseudo-theory promotion (Hansson 2017b ). The most important difference between science denial and pseudo-theory promotion is that while the fabrication of false controversies is a standard practice in science denial, most cases of pseudo-theory promotion do not engage in producing fake controversies (Hansson 2017a ). In contrast, pseudo-theory promotion tends to avoid controversies with science and describes its claims as compatible with and conformable to science (Hansson 2017a , b ). In this paper, distinguishing and comparing science denial and pseudo-theory promotion is key for two main reasons. First, this paper focuses only on science denial due to the ongoing discussions around bringing science denial to classrooms (e.g., Boyle 2017 ) and the massive spread and acceptance of conspiracy theories about scientific phenomena (e.g., climate change, the origin of life, COVID-19) in both the public and schools. Second, the discussion in this paper takes the characteristics of science denial into account to determine some areas for both educators and researchers to focus on as to how to respond to science denial in educational settings.

The purpose of this paper is to bring the backfire effect within the context of science denial to the attention of science education researchers and practitioners and discuss the potential role(s) of epistemic understanding of knowledge production in science in dealing with the rejection of scientific evidence and claims in science classrooms. I wish to take the reader beyond what I present and discuss here and to detect some areas open for further exploration rather than providing a road map or a list of tips and strategies to combat science denial and the backfire effect.

Correcting Misbeliefs?

Many people resist evaluating and accepting reliable scientific evidence. One of the reasons for denying scientific evidence is that scientific ideas may threaten people’s beliefs, ideologies, and background assumptions, which are often wrong and misleading. For instance, “what predicts the denial of human-made climate change is not scientific illiteracy but political ideology” (Pinker 2018 , p. 357). Adherence to personal beliefs and background assumptions, what Sandoval ( 2005 ) called personal epistemology, interferes with the acceptance of scientific facts and conclusions (Sinatra et al. 2014 ). One may ask whether we can change or correct people’s false beliefs. In general, people are supposed to adjust their assumptions when they evaluate scientific evidence that challenges their beliefs. So, is this always the case? The answer is no. In their review of the literature on correcting misinformation, Lewandowsky et al. ( 2012 ) showed that correcting people’s false beliefs rarely eliminates adherence to those beliefs and assumptions. They also argued that even when people understand the retraction, the correction remains ineffective (Lewandowsky et al. 2012 ).

One of the reasons why people fail to refute personal beliefs and assumptions is explained by the backfire effect (Ecker et al. 2017 ; Swire et al. 2017 ). The backfire effect is a cognitive bias that causes people’s background assumptions to get stronger when they encounter contradictory evidence (Nyhan and Reifler 2010 , 2015 ). In other words, the backfire effect means that showing people scientific claims and evidence which prove that they are wrong is often ineffective because it causes them to support their original assumptions more strongly than they previously did (Nyhan and Reifler 2010 ; Trevors et al. 2016 ). It is an important phenomenon because it derails critical thinking skills. The backfire effect is the very heart of how people negotiate between scientific ideas and their background assumptions (Sinatra et al. 2014 ).

In 2010, Nyhan and Reifler designed a study to test the backfire effect. The researchers created an article that included a very common misconception about certain issues in politics. Participants were first asked to read a fake article and then another article that corrected the fake article. Participants with a certain ideological belief strongly disagreed with the correct article while they articulated stronger beliefs about the fake article. In that study, corrections failed to reduce misconceptions among the targeted ideological group. The same researchers designed the same experiment about other controversial topics such as tax cuts and stem cell research. They concluded that corrections that contradicted participants’ beliefs caused background assumptions to get stronger (Nyhan and Reifler 2010 ).

The same researchers also conducted a study that examined people’s beliefs about vaccination against the flu. They showed that when people who believe that the vaccine is unsafe are provided with correct information challenging their beliefs, misconceptions about vaccination among the group increased (Nyhan and Reifler 2015 ). Another study examined parents’ intent to vaccinate their children (Nyhan et al. 2014 ). The researchers found that corrective information (pro-vaccination messages) decreased intent to vaccinate among parents who had the most negative attitudes toward vaccines. Nyhan et al. ( 2014 ) concluded that “respondents brought to mind other concerns about vaccines to defend their anti-vaccine attitudes, a response that is broadly consistent with the literature on motivated reasoning about politics and vaccines” (p. 840).

Supporting the findings of Nyhan and Reifler ( 2010 , 2015 ) and Nyhan and colleagues ( 2014 ), other researchers have concluded that even though people understand the rationale for retraction, corrections are still ineffective (Lewandowsky et al. 2012 ). Correcting widespread misinformation has little effect on the ways people act and think (Sides and Citrin 2007 ), and the arguments that reinforce people’s background beliefs are favoured while the ones that contradict their views are disparaged (Taber and Lodge 2006 ). Additionally, a review of research by Tippett ( 2010 ) on refutation texts in science education showed that reading a refutation text that explicitly challenges and refutes students’ naïve conceptions seemed to be useful for improving students’ conceptual understanding but the review also pointed out that a refutation text alone is not enough to change or improve students’ misconceptions (Tippett 2010 ).

On the other hand, some researchers (e.g., Crozier and Strange 2019 ; Haglin 2017 ; Wood and Porter 2017 ) have argued that the backfire effect is not as strong as had been claimed in the literature (e.g., Lewandowsky et al. 2012 ; Nyhan and Reifler 2015 ). Crozier and Strange ( 2019 ) found no evidence for a backfire effect in their study in which they evaluated the effects of corrections on reliance on misinformation. They found that corrections can decrease individuals’ reliance on misinformation (Crozier and Strange 2019 ). The researchers also argued that the format of corrections (the frequency of exposures to the corrections, the activation of the misinformation and its correction simultaneously, etc.) has a key role in its effectiveness (Crozier and Strange 2019 ). Replicating the Nyhan and Reifler ( 2015 ) corrective information experiment with a different population, Haglin ( 2017 ) also found no support for a backfire effect from corrections of misinformation and highlighted the importance of investigating the specific conditions and individuals affected when a suspected backfire effect occurs. According to the literature discussed, we still need more evidence to figure out whether corrections are a successful strategy for combatting misinformation or misbeliefs. It is important to make it clear that whether the backfire effect exists or not is not the focus of this paper. With the actual purpose of this piece in mind, I now turn to different forms of the backfire effect.

The Backfire Effect and Reasoning

Two forms of the backfire effect cause the denial of scientific knowledge: the familiarity backfire effect (Swire et al. 2017 ) and the overkill backfire effect (Ecker et al. 2019 ). The familiarity backfire effect occurs when people remember misinformation rather than its inaccuracy as a result of getting exposed to misinformation frequently (Swire et al. 2017 ). This effect can influence the way people respond to pseudo-scientific arguments (Hansson 2017b ). The overkill backfire effect occurs when people reject multiple complex scientific explanations for certain phenomena that are difficult to understand and process (Ecker et al. 2019 ). This shows that people tend to engage in simple and easy explanations. When people are presented with a complicated scientific explanation, the overkill backfire effect may cause them to reject that explanation and to stick to their simple misconceptions (Chater 1999 ; Lombrozo 2007 ).

The backfire effect explains why people confirm their own biases even though they have heard about scientific facts and observed scientific phenomena and why they reject scientific information and create counterarguments against empirical evidence. Additionally, the backfire effect can help us understand and explain why the way science is traditionally taught is not successful at eliminating science denial. In a traditional classroom setting, students who deny scientific facts and conclusions are usually provided with complex explanations that aim to convince students and correct their false beliefs and assumptions. Science instruction should encourage students, citizens of the future, to differentiate selective use of evidence, what Hansson ( 2017b ) called “cherry-picking” or what Sinatra et al. ( 2014 ) called “motivated reasoning”, from accuracy-oriented scientific reasoning. It does not mean that there is no motivated reasoning in science. For instance, Mizrahi ( 2015 ) discussed some examples of confirmation bias from the history of science. Rather, it means that science instruction should emphasize the differences between deliberate thoughts and intuitive thoughts as students learn about methods of reasoning (Short et al. 2019 ).

The understanding of scientific reasoning is one of the three dimensions of scientific literacy (Fasce and Picó 2019 ). The understanding of scientific reasoning means a public understanding of the way(s) scientific knowledge is developed in terms of sociological, philosophical, and historical aspects of science (Fasce and Picó 2019 ). Students should understand scientific reasoning and separate scientific reasoning from motivated reasoning. Scientific reasoning has a logical nature based on some principles. There are some ways to decide how much confidence we should place in scientific explanations: deduction, induction, and abduction (inference to the best explanation) (Okasha 2002 ). These three forms of logical inference are important for understanding how we, human beings, think and how we make meaning out of the world around us. While reasoning, we look at the premises and draw conclusions from them through deduction, induction, and abduction.

The first form of logical inference is deductive reasoning. With deduction, our conclusions must be true as long as the premises are true (Okasha 2002 ). Deductive inferences move from the general to the specific (Jaipal 2009 ). An example of deductive reasoning, or inference, in Okasha ( 2002 , p. 18) is the following:

All Frenchmen like red wine. Pierre is a Frenchman. Therefore, Pierre likes red wine.

If the premises are true in the first two statements, then the conclusion must be true. The most important feature of deductive inferences is that their premises are general and their conclusions are more specific.

The second form of inference is inductive reasoning. In induction, the premises do not entail the conclusion (Okasha 2002 ). Here is an example of inductive reasoning from Okasha ( 2002 , p. 19):

The first five eggs in the box were rotten. All the eggs have the same best-before date stamped on them. Therefore, the sixth egg will be rotten too.

It is possible that even if the premises of this inference are true, the conclusion can be false. The reason is that we move from specific observations about objects or events we have examined (i.e., the first five eggs) to generalizations about objects or events that we have not examined (i.e., the rest of the eggs in the box).

With deduction, we can be certain if we begin with true premises, we will come to a true conclusion. With induction, we cannot be so confident because inductive inferences can possibly take us from true premises to a false conclusion (Okasha 2002 ). Even though inductive reasoning is weaker than deductive reasoning, much scientific research and reasoning in everyday life is carried out inductively. Consider the following examples in Okasha ( 2002 ). An example of inductive reasoning in everyday life is as follows.

… when you turn on your computer on the morning, you are confident it will not explode in your face. Why? Because you turn on your computer every morning, and it has never exploded in your face up to now. The premises of this inference do not entail the conclusion. (Okasha 2002 , p. 20)

So how do scientists use inductive reasoning? Consider this example.

… geneticists tell us that Down’s syndrome (DS) sufferers have an additional chromosome. How do they know this? The answer, of course, is that they examined a large number of DS sufferers and found that each had an additional chromosome. They then reasoned inductively to the conclusion that all DS sufferers, including ones they had not examined, have an additional chromosome. (Okasha 2002 , pp. 20–22)

Some philosophers such as David Hume and Karl Popper denied the existence and importance of inductive reasoning in science by arguing that inductive inferences are not justifiable because we cannot be sure that phenomena we have not experienced will resemble those we have experienced in the past (Okasha 2002 ). However, inductive reasoning remains a perfectly sensible way of forming beliefs about the world around us, because it can make our conclusions highly probable even if never certain.

The third form of logical inference is called abduction (inference to the best explanation). Abductive inference makes a similar jump to the logic of the inductive syllogism but the abductive inference is fallible. Consider the following example that Okasha ( 2002 , p. 29) offers:

The cheese in the larder has disappeared, apart from a few crumbs. Scratching noises were heard coming from the larder last night. Therefore, the cheese was eaten by a mouse.

In this case, the premises do not entail the conclusion. However, with the available data, the inference is reasonable, and if we obtain more data, we can make the reasoning stronger. Scientists (doctors and detectives as well) use abduction: drawing the conclusion that best explains a state of events from a set of possible scenarios, rather than relying solely on the evidence provided in the premises. Within this context, scientists’ theories provide strong evidence for their claims. In addition to these inferences, many scientific laws and theories are expressed in terms of probability (probabilistic reasoning), such as Mendelian genetics, which holds that there is a 50% chance that any gene in your mother (and father) will be in you. “Probability provides a continuous scale from poor theories with low probability to good theories with high probability” (Lakatos 1998 , p. 22). The importance of probabilistic reasoning in understanding and accepting polarizing scientific ideas (e.g., evolution) is also highlighted in the literature (e.g., Fiedler et al. 2019 ; Lenormand et al. 2009 ).
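The 50% figure quoted above for Mendelian inheritance can be illustrated with a quick Monte Carlo sketch (a toy model of my own, not from the paper): a child receives one of the mother's two gene copies at random, so any particular copy is inherited about half the time.

```python
import random

# Toy model: each trial, the child inherits one of the mother's two
# gene copies chosen uniformly at random.
random.seed(0)
trials = 100_000
hits = sum(random.choice(("copy_A", "copy_B")) == "copy_A" for _ in range(trials))
print(hits / trials)  # close to 0.5
```

The simulated frequency converges on the theoretical probability of one half, which is the sense in which probabilistic reasoning lets us place graded confidence in such claims.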

Learning about the three forms of logical inferences discussed above is important to distinguish between motivated reasoning and scientific reasoning and to address science denial. As Hand et al. ( 1999 ) suggested, logical reasoning is important because “science distinguishes itself from other ways of knowing and from other bodies of knowledge through the use of empirical standards, logical arguments, and scepticism to generate the best temporal explanations possible about the natural world” (p. 1023). The way we make inferences through deduction, induction, and abduction shows that even though scientific knowledge is temporary and uncertain, it is highly probable and it is subject to change as we collect more evidence (Hand et al., 1999 ; Okasha 2002 ). In contrast, motivated reasoning relies on selectively interpreting evidence and leads to preferred inferences.

Making logical inferences while evaluating claims and evidence is one of the critical thinking abilities (Paul 1995). As one might infer from the nature of science literature, students have limited ability to evaluate scientific claims and evidence. One reason is that K-12 science instruction does not engage students in the aspects of scientific inquiry and practice concerned with evaluating the strengths and limitations of evidence and developing scientific arguments (Banilower 2019). Banilower (2019) reports an interesting finding from the 2018 NSSME+ study:

Fewer than a quarter of secondary science classes have students, at least once a week, pose questions about scientific arguments, evaluate the credibility of scientific information, identify strengths and limitations of a scientific model, evaluate the strengths and weaknesses of competing scientific explanations, determine what details about an investigation might persuade a targeted audience about a scientific claim, or construct a persuasive case. (Banilower 2019, p. 204)

The absence of logical inferences may add strength to the backfire effect by leading to the retrieval of thoughts that support one’s background beliefs and assumptions. It means that “when we think we are reasoning, we may instead be rationalizing” (Mooney 2011, para. 11). Rationalization involves deciding what evidence to accept based on the preferred conclusion—motivated reasoning (Bardon 2020). In contrast, scientific reasoning requires using critical thinking skills to determine which explanation(s) represents the best answer to our question based on evidence (Lawson 1999).

As discussed earlier, when we encourage students to engage in evaluating evidence that has the potential to threaten their background assumptions and beliefs, science denial might become more entrenched. One reason is that people tend to look for evidence that confirms their beliefs and background assumptions (Druckman and McGrath 2019). Given this, one may ask whether we should avoid discussing scientific evidence that conflicts with students’ worldviews while teaching controversial topics in science, so as not to enable science denial. How can science educators address science denial in the classroom? How can science educators make scientific claims and evidence sticky so that students remember what they read or observe and try to evaluate their background assumptions? The answers to these questions are complicated. The following paragraphs discuss the intersections between the ways science should be taught and the suggestions for addressing science denial and the backfire effect.

Science Denial, the Backfire Effect and Science Teaching

The pedagogical suggestions for avoiding the backfire effect and dealing with science denial appear inconclusive and contradictory. Given the strong relationship between background assumptions and science denial or acceptance (Mazur 2004), Nyhan and Reifler (2010) and Cook and Lewandowsky (2011) suggested that when educators present counter-evidence, they should acknowledge students’ background assumptions (e.g., political ideologies, religious beliefs). On the other hand, there are also suggestions for discussing controversial issues that avoid engaging students’ background assumptions. Consider the following excerpt on the need for care when teaching about climate change:

… in a polarized political landscape, talking about politicians and the decisions they make is counterproductive. Students may put their guard up, thinking that I’m partisan, and tune me out when I’m lecturing about other things, such as climate modeling. So, I made a conscious decision to change my approach to teaching the subject. As part of my modified strategy, I joined a local bipartisan group that aims to bring people together by emphasizing the potential consequences, rather than causes, of climate change. (Kannan 2019, p. 1042)

This example suggests that leaving politics out of the classroom while discussing polarizing issues in science is an important way to prevent science denial and avoid threatening students’ worldviews. So, should we acknowledge students’ background assumptions or not? It is not clear how educators should reconcile this conflicting advice in their classrooms.

Another example of contradictory advice to educators can be seen in Cook and Lewandowsky (2011). The authors suggested that teachers who aim to debunk misbeliefs about scientific phenomena should begin by emphasizing the scientific facts, not the misbeliefs, with the goal of increasing students’ familiarity with scientific facts (Cook and Lewandowsky 2011). Even though this advice seems to work specifically against the familiarity backfire effect discussed earlier, it still invites the more general backfire effect described by Nyhan and Reifler (2010, 2015) and Nyhan and colleagues (2014).

Moreover, when we compare the literature on how to teach science and what to teach about science with the suggested ways of avoiding the backfire effect and science denial, we see conflicting ideas. Duschl and Osborne (2002), for instance, argued that science instruction should focus on “how we know what we know and why we believe the beliefs of science to be superior or more fruitful than competing viewpoints” (Duschl and Osborne 2002, p. 43). Even though this statement refers to the importance of the epistemic aspect of understanding scientific practices, it seems to neglect what might happen when students are told that the scientific way of knowing is superior to other ways of knowing, which could itself trigger a backfire effect.

Emphasizing the role(s) of an epistemic understanding of knowledge production in science might be a fruitful way to avoid the backfire effect while learning and teaching polarizing scientific issues. Using Duschl’s (2008) framing of the epistemic and conceptual aspects of science learning, I define the epistemic understanding of knowledge production in science as the consideration of multiple perspectives and contexts (social, cultural, historical, linguistic, etc.) while evaluating or challenging evidence and claims. The integration of this epistemic understanding of how to develop and evaluate scientific knowledge into scientific practices is one of the more important goals for science learning defined by Duschl (2008). This goal can be accomplished by facilitating a dialogical discourse through which learners have a chance to evaluate claims and evidence to make inferences about the natural world (Duschl 2020). Even though the literature on the importance of epistemic understanding in science classrooms is well-established, its potential role in preventing or fostering science denial and the backfire effect has not been adequately discussed in the field of science education. Several areas deserve investigation for their potential to combat science denial and the backfire effect while foregrounding the role(s) of the epistemic understanding of knowledge production in science instruction. These areas include expanding ways of knowing while marking the boundary between the scientific way of knowing and other ways of knowing, comparing claims and arguments that derive from different frameworks, teaching about the power and limitations of science, and bringing different and similar ways science is done to students’ attention.

First, educators can encourage expanding ways of knowing while marking the boundary between the scientific way of knowing and other ways of knowing. Expanding ways of knowing involves acknowledging knowledge that is not only empirical but also value-based and cultural. The scientific way of knowing produces knowledge (I will call this type of knowledge scientific knowledge) through specific practices (observation, experimentation, logical inference, etc.). Scientific knowledge tries to explain the natural world by focusing on individual parts. On the other hand, traditional knowledge, indigenous knowledge, or local knowledge (I use these terms interchangeably here) refers to other ways of knowing embedded in the cultural traditions, beliefs, and attitudes of specific communities. The production of this type of knowledge also includes observations, predictions, and problem-solving (Snively and Corsiglia 2001). However, the way traditional knowledge is produced is not always systematic. Additionally, the traditional ways of knowing try to understand the natural world more holistically by observing the interactions between all of the parts of a phenomenon. Consider this example: Cobern and Loving (2001) shared the following conversation between a researcher working at a scientific station on a South Pacific island and an indigenous islander:

The islander commented that Westerners only think they know why the ocean rises and falls on a regular basis. They think it has to do with the moon. They are wrong. The ocean rises and falls as the great sea turtles leave and return to their homes in the sand. The ocean falls as the water rushes into the empty nest. The ocean rises as the water is forced out by the returning turtles. (Cobern and Loving 2001, p. 51)

As another example of other ways of knowing, Foucault (1970) mentioned a Chinese encyclopaedia in which animals are divided into groups: “(a) belonging to the Emperor, (b) embalmed, (c) tame, (d) sucking pigs, (e) sirens, (f) fabulous, (g) stray dogs, (h) included in the present classification, (i) frenzied, (j) innumerable, (k) drawn with a very fine camelhair brush, (l) etcetera, (m) having just broken the water pitcher, and (n) that from a long way off look like flies” (p. 16). For another example, the Tao (or Yami), an indigenous people living on Orchid Island (Lanyu) near south-east Taiwan, have a different taxonomy in which fish are grouped into two main classes: edible and inedible fish (Wang 2012). The inedible fish include fish without scales, such as eels. The edible fish are further divided into different groups: old people fish (only to be consumed by elders), men fish (prohibited to women), and women fish (for all to consume). This kind of classification is based on the different purposes fish serve in the community. The indigenous classification method is motivated by the protection of natural diversity and the ecosystem, while scientific classification aims to inform the user as to what the relatives of the taxon are hypothesized to be (M.-Y. Lin, personal communication, September 14, 2020). For instance, the reason the Tao people do not eat eels (and classify them as inedible fish) is that eels dredge the headwater of the taro fields and hunt pests (Wang 2012). These three examples of other ways of knowing show that knowledge is produced within specific contexts, with specific purposes, and with specific methods.

The literature in the sciences and science education has emphasized and valued expanding ways of knowing and marking the boundary between the scientific way of knowing and other ways of knowing, without focusing on science denial and the backfire effect. As an example of acknowledging other ways of knowing, Behrens (1989) examined the correspondence between the soil categories of the Shipibo, an indigenous group in the Peruvian Amazon, and Western pedology (a branch of soil science) to understand soil-plant associations and agricultural productivity. There are also many studies about how educators can acknowledge different ways of knowing in their science teaching practices (see Barba 1995; Loving 1991; Ogawa 1995). Ogawa (1995), for instance, argued that bringing a multiscience perspective into science classrooms helps students understand more than one view simultaneously and discuss how and why some natural phenomena can be interpreted similarly or differently in different contexts. For another example, Loving (1991) proposed a model called the Scientific Theory Profile to help science teachers develop an understanding of the nature of science and evaluate scientific explanations and theories within cultural contexts. Even though these studies provide insights into what expanding ways of knowing might look like in practice and how it might facilitate the epistemic understanding of knowledge production in science, they do not discuss its potential to foster, rather than prevent, science denial and the backfire effect.

The proponents of diverse perspectives in explaining natural phenomena argue that the scientific way of knowing and other ways of knowing should be viewed as co-existing or parallel (e.g., Cobern and Loving 2001; Snively and Corsiglia 2001) rather than as competing viewpoints. One reason is that different ways of knowing might be useful in different social or cultural contexts and lead to different consequences and decision-making processes (Feinstein and Waddington 2020). It is also important to note that these different ways of knowing are not equal: knowledge-building encompasses multiple origins, practices, logical conclusions, rationales, and methods. The intent of this paper is not to discuss whether or not other ways of knowing should be classified as scientific knowledge or science; the answers to this question in the science education literature are not in agreement with one another (for detailed discussions see Cobern and Loving 2001; Snively and Corsiglia 2001; Southerland 2000; Stanley and Brickhouse 1994).

Potential Impact on Students’ Learning

What we educators can do by expanding ways of knowing is to consider epistemological pluralism and the ability to wisely differentiate scientific knowledge from other ways of knowing in light of logical inferences, use of evidence, systematic observation, etc. (Cobern and Loving 2001). By doing so, educators provide a way of distinguishing reliable knowledge claims from unreliable ones (Laudan 1996). Different ways of knowing can contribute to our explanations about the world (Snively and Corsiglia 2001) and work in concert, because different ways of knowing may be important in different situations. Expanding ways of knowing provides students with a chance to see how the practice of science may utilize the insights of another domain of knowledge (Cobern and Loving 2001). Science instruction should “value knowledge on its many forms and from its many sources” (Cobern and Loving 2001, p. 63) so that students feel free to bring different perspectives and ways of knowing to their classroom and discuss them.

Second, students should be able to compare claims and arguments that derive from different frameworks or domains of knowledge. To do so, it is important to know how to engage in scientific practices such as making inferences, generating and evaluating explanations, and making observations. Teaching students about “methods for posing questions about science, scientific models for serious thinking about science, understandings about aspects of scientific inquiry, and a sceptical orientation regarding ways that science is characterized in curriculum materials and instruction” might be a good way to guide them to develop and evaluate arguments and counterarguments (Kelly 2014, p. 1368).

Constructing a counterargument that successfully weakens the force of others’ arguments is a challenging task for students (Kuhn 2010). In her study, Kuhn highlighted two important implications for learning and teaching about scientific argumentation: (a) students should be encouraged to develop alternative arguments based on evidence rather than merely critiquing opponents’ arguments and threatening their beliefs and assumptions; and (b) there are two main ways of making use of evidence in argumentation: the support strategy (using evidence to support one’s own claim) and the challenge strategy (using evidence to challenge the other’s claim). Educators tend to avoid using the term argument in the classroom for fear that it may carry negative connotations in students’ minds. However, developing arguments and counterarguments is a key component of critical thinking, and it creates an opportunity for students to make use of their skills of analysis, synthesis, and evaluation (Osborne and Patterson 2011). An example that fits this argument is the curriculum introduced in 2016 in Finland, which requires students to think critically about, interpret, and evaluate all the information they encounter across all subjects. Henley (2020) reports on how the national curriculum aims to accomplish this goal in Finland as follows:

In maths lessons, … pupils learn how easy it is to lie with statistics. In art, they see how an image’s meaning can be manipulated. In history, they analyse notable propaganda campaigns, while Finnish language teachers work with them on the many ways in which words can be used to confuse, mislead, and deceive. (Henley 2020, para. 4)

This is one way of providing students with the necessary skills and methods to evaluate claims and evidence without provoking conflict or threat. As reported by Henley from his personal communication with Mikko Salo, a member of the European Union’s independent high-level expert group on fake news, “It’s about trying to vaccinate against problems, rather than telling people what’s right and wrong. That can easily lead to polarisation” (Henley 2020, para. 23).
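The "lie with statistics" point from the maths lessons above can be shown with a few lines of arithmetic. The salary figures below are invented for illustration (they are not from the Finnish curriculum): the mean and the median of the very same data support two quite different claims about "average pay."

```python
# Hypothetical salaries at a small firm: one very high salary
# skews the mean, so "average pay" can tell two different stories.
salaries = [24_000, 25_000, 26_000, 27_000, 28_000, 200_000]

mean = sum(salaries) / len(salaries)

sorted_s = sorted(salaries)
mid = len(sorted_s) // 2
# Even number of values: median is the mean of the middle two.
median = (sorted_s[mid - 1] + sorted_s[mid]) / 2

print(f"mean:   {mean:,.0f}")    # "average pay is 55,000!"
print(f"median: {median:,.0f}")  # half the staff earn 26,500 or less
```

Both numbers are correct summaries of the data; a critical reader's job is to ask which summary was chosen, and why.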

Third, students should learn about both the power and limitations of science to engage with the epistemic aspect of knowledge production in science. Even though the programme of study for 14–16-year-old students in England acknowledges that students should be taught about the “power and limitations” of science (Department of Education 2014, p. 5), it is argued in the literature that school science does not explicitly and efficiently teach that argumentation is associated with uncertainty, that is, being unsure and lacking knowledge or evidence (Chen et al. 2019). Researchers have shown that an individual’s political attitudes, beliefs, and worldviews are strongly related to their level of tolerance of uncertainty (Jost et al. 2003; Pennycook et al. 2012). For instance, conservatives are less likely to tolerate uncertainty (Deppe et al. 2015). (A caveat should be noted: denial is not a problem for conservatives alone. Kahan et al. (2011) found that liberals are less likely to accept a hypothetical expert consensus on nuclear waste disposal and handgun regulations.) Uncertainty is one of the factors that trigger the science denial that educators encounter while teaching and learning about hot-button issues. Chen et al. (2019) proposed a way of productively managing uncertainty in the classroom: raising uncertainty (expressing confusion and seeking other ideas to problematize a phenomenon), maintaining uncertainty (facilitating a discussion through which students can deepen their scientific reasoning with evidence), and reducing uncertainty (synthesizing alternative ideas, looking for inconsistencies among them, and connecting them to each other). This approach helps teachers facilitate students’ epistemic understanding of knowledge production to manage uncertainty and prevents students from falling back on motivated reasoning.

Lastly, science educators can bring the different and similar ways science is done to their students’ attention to emphasize epistemic understanding. For instance, historical sciences (e.g., palaeontology, historical geology, archaeology) and experimental sciences (e.g., physics, chemistry, astronomy) use distinct ways of producing scientific knowledge and reasoning. Historical sciences focus on explaining observable phenomena in terms of unobservable causes by using retrodiction, abduction, reasoning from analogy, and multiple working hypotheses (Gray 2014). In contrast, experimental sciences make predictions and test them in controlled laboratory settings by focusing on hypotheses, experiments, controls, and variables. In addition to the differences between historical and experimental sciences, it is also important to highlight that even though historical hypotheses and methods are usually associated with fields such as palaeontology and archaeology, we can also see them in geology, planetary science, astronomy, and astrophysics, as with continental drift, the meteorite-impact extinction of the dinosaurs, and the big bang origin of the universe (Cleland 2001). The epistemological and methodological differences and similarities between historical and experimental sciences are important since background assumptions and beliefs about historical science claims can have important consequences (e.g., creationist critiques of evolution) (Gray 2014). The fact that historical sciences cannot replicate unobservable causes in laboratory settings does not mean that the way historical scientists do science is inferior to the way experimental sciences produce knowledge and make inferences (Cleland 2001), nor that historical sciences are more subject to denial.

For another example of different ways of doing science, scientists working on the same problem and with the same data can arrive at different conclusions. In a recent study (Silberzahn et al. 2018), 29 research teams (a total of 61 researchers) from 13 countries, with research backgrounds including psychology, statistics, research methods, economics, sociology, linguistics, and management, were given the same data set and asked to answer the same question: whether soccer referees are more likely to give red cards to dark-skin-toned players than to light-skin-toned players. Twenty of the teams found a statistically significant relationship between a player’s skin color and the likelihood of receiving a red card; nine teams found no significant relationship at all. The researchers came to different conclusions because they used different statistical models and took different variables from the data set into account. Their analyses thus involved somewhat subjective decisions about the best statistical model to use and which variables to include. Silberzahn et al. (2018) concluded that “many subjective decisions are part of the research process and can affect the outcomes” (p. 354). As an important consequence, this variability in analytic approaches and conclusions is likely to affect decision-making processes. With this illustrative example in mind, it is important for teachers to consider the different analytical tools and methodologies used in science, and how these differences lead to diverse viewpoints, while they engage students in using and interpreting scientific evidence and making inferences in classrooms.
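How analytic choices can flip a conclusion is easy to demonstrate with a deliberately simple, invented data set (the counts below are hypothetical and are not Silberzahn et al.'s data). Pooling across leagues suggests a large disparity in red-card rates, while stratifying by league shows none, a Simpson's-paradox-style reversal of the kind that different model specifications and variable choices can produce.

```python
# Hypothetical counts of (red_cards, games) by league and skin tone.
data = {
    "league_A": {"dark": (5, 100),  "light": (20, 400)},
    "league_B": {"dark": (80, 400), "light": (20, 100)},
}

def rate(cards, games):
    return cards / games

# Analysis 1: pool across leagues (the league variable is omitted).
dark_cards = sum(v["dark"][0] for v in data.values())
dark_games = sum(v["dark"][1] for v in data.values())
light_cards = sum(v["light"][0] for v in data.values())
light_games = sum(v["light"][1] for v in data.values())
pooled_ratio = rate(dark_cards, dark_games) / rate(light_cards, light_games)
print(f"pooled rate ratio: {pooled_ratio:.2f}")  # well above 1: apparent bias

# Analysis 2: compare within each league (the league variable is included).
for league, groups in data.items():
    r = rate(*groups["dark"]) / rate(*groups["light"])
    print(f"{league} rate ratio: {r:.2f}")  # exactly 1: no difference
```

Neither analysis is "cheating"; they simply answer subtly different questions, which is why teams making different but defensible modeling decisions can reach opposite conclusions from identical data.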

These four areas are promising and open to further investigation to evaluate their potential to combat science denial and the backfire effect while facilitating the epistemic understanding of how we know, and what we know, about the natural world around us. They are important to focus on because they can address the sociological characteristics of science denial(ists), such as considering scientific theories as threats, finding scientific ideas difficult to understand, and disseminating false beliefs, assumptions, and ideologies to the public (see Table 1), and because they provide some insights into how to deal with science denial and the backfire effect. For instance, expanding ways of knowing can take the familiarity backfire effect into account while providing students with diverse perspectives on the same phenomenon. Encountering different ways of knowing, students have a chance to access and discuss a vast array of ideas instead of being exposed to the same (mis)beliefs frequently. Moreover, if students would like to challenge some ideas, they need to learn how to develop counterarguments based on evidence rather than solely targeting other ideas just because those ideas contradict their background assumptions. Additionally, teaching students about how knowledge is produced (different ways of logical reasoning, different methodologies, etc.) before teaching them the scientific ideas themselves may prevent the overkill backfire effect. To do so, educators can explain why there are multiple explanations of the same phenomenon and why the ways science is done can be complicated processes that may lead to uncertainty or inconclusive evidence.
Most importantly, zooming in on these four areas can provide learners, as scientifically literate citizens, with opportunities to reflect on their background assumptions, beliefs, ideologies, and cultural resources while negotiating and distinguishing between different ways of knowing and evaluating the credibility of claims and evidence.

Conclusions and Discussion

With a focus on science denial, this paper brings the backfire effect to the attention of science educators and science education researchers and discusses the potential role(s) of epistemic understanding of knowledge production in science in dealing with the rejection of scientific evidence and claims in science classrooms. In order to investigate the potential role(s) of epistemic understanding of knowledge production in confronting the denial of scientific ideas and mitigating the influence of the backfire effect, the current paper suggests taking a close look at expanding ways of knowing while marking the boundary between the scientific way of knowing and other ways of knowing, comparing claims and arguments that derive from different domains of knowledge, recognizing the power and limitations of science, and learning about the different ways science is done.

Given these four areas for seeking effective ways of dealing with science denial in science classrooms, it may seem that the suggested areas for further exploration are grounded in the nature of science rather than in specific ways of combating the backfire effect. There are two main reasons for that. First, the literature on debunking misinformation and avoiding the backfire effect has offered contradictory advice (e.g., emphasizing scientific facts rather than (mis)beliefs vs. acknowledging students’ beliefs). This literature also falls short of providing educators with practical ways of implementing these strategies. For example, how can educators acknowledge students’ beliefs and values while presenting a counterargument or scientific fact? How can educators balance a discussion of different ways of knowing without opening the door to science denial? What forms of knowing or knowledge production should be admitted to science classrooms? Should educators care about the correctness of different ways of knowing at all? Or should they focus on how different ways of knowing are useful in different contexts?

Second, even though cognition-oriented research findings in the field of science education (e.g., conceptual change pedagogies such as cognitive conflict pedagogies) have provided insights on the processes of how students reconstruct their knowledge and understanding (Chinn and Malhotra 2002; diSessa 1993; Vosniadou 2002), we still do not know what steps students follow to achieve a meaningful conflict while they reconstruct their prior knowledge, beliefs, and values (Limón 2001). As an example, despite the fact that cognitive conflict—confronting learners with contradictory information—has a long history as a suggested strategy for supporting learning and teaching in science education, it has had less success in classroom implementations than expected and has led to conflicting results as well (e.g., Limón and Carretero 1997). One reason is that many educators do not know how to facilitate a meaningful cognitive conflict in classrooms (Limón 2001). Several models and theories on conceptual change focus only on the cognitive processes of individuals and underestimate the importance of epistemological beliefs, values, attitudes, and reasoning strategies (Limón 2001). Moreover, it seems that these models and theories neglect the consequences of inducing conflict by providing anomalous and contradictory information, situations which ignite the backfire effect. The given perspectives from these two areas, the literature on debunking misinformation and how students reconstruct their knowledge through a meaningful conflict, might be complementary, but neither is sufficient alone to provide fruitful strategies to avoid the backfire effect and science denial and promote meaningful conflict while learning and teaching about controversial issues in science.

With regard to the potentially fruitful areas discussed earlier, the epistemic understanding of knowledge production in science is not a panacea or a one-size-fits-all solution. However, it seems well suited to leading students to consider different perspectives and sources of knowledge and knowing on polarizing scientific issues rather than dismissing ideas that contradict their knowledge, beliefs, and values. Limitations exist in terms of the role of researchers and educators in addressing science denial and the backfire effect while facilitating epistemic understanding of knowledge production, and there are important questions that we need to ask and seek answers for. Do educators consider the importance of presenting relevant information to explain scientific phenomena in classrooms? Teachers who depend heavily on textbooks to teach science, for instance, might encounter issues related to the epistemic aspect of knowledge production in science. As Kuhn (1970) pointed out, textbooks are “persuasive” (p. 1), and what is described as science in textbooks does not fit the way science is done. One may also ask whether we teach students both scientific knowledge and the way that knowledge is produced; teaching scientific knowledge before explaining how it is produced puts the cart before the horse. There is a need, then, for educators and researchers to be conscious of the backfire effect and the nature of scientific knowledge and to formulate a comprehensive approach to science denial. Moreover, educators and researchers should pay attention to students’ background assumptions in their specific contexts: the strategies for dealing with students’ assumptions and beliefs about electrons should differ from those for their beliefs about hot-button issues such as vaccination and global warming (Hodgin and Kahne 2018).
It is important to consider different pedagogical approaches depending on whether students’ misbeliefs are caused by the absence of knowledge, pseudo-theory promotion, or antipathy towards scientific facts. Regarding the challenges of post-truth and science denial, it would be wise to develop well-focused and empirically grounded strategies to combat the different types of unwarranted beliefs in order to produce satisfactory instructional outcomes (Fasce and Picó 2019).

Only a handful of studies in political science have analyzed the effects of attempts to correct misbeliefs and background assumptions, and they have led to contradictory findings. These studies also lack evidence on effective strategies for pedagogical implementation. Little is known about how science educators and researchers approach the backfire effect and science denial around polarizing issues within the field of science education. Using the epistemic understanding of knowledge production in science with a focus on avoiding the backfire effect may increase the potential for science education research to produce fruitful strategies and democratic environments that promote divergent perspectives to deepen students’ understanding of how science works. There is a need for science education research to consider the consequences of the backfire effect and develop a program of research or supplemental curriculum to help students use critical and reflective thinking skills within a multidisciplinary context (e.g., natural sciences, political science, media and communication studies).

Declarations

The author declares no conflict of interest.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

  • Banilower, E. R. (2019). Understanding the big picture for science teacher education: The 2018 NSSME+. Journal of Science Teacher Education, 30(3), 201–208.
  • Barba, R. H. (1995). Science in the multicultural classroom: A guide to teaching and learning. Needham Heights, MA: Allyn and Bacon.
  • Bardon, A. (2020). The truth about denial: Bias and self-deception in science, politics, and religion. New York, NY: Oxford University Press.
  • Behrens, C. A. (1989). The scientific basis for Shipibo soil classification and land use: Changes in soil-plant associations with cash cropping. American Anthropologist, 91, 83–100.
  • Boyle, R. (2017). States are trying to bring science denial to the classroom.
  • Chater, N. (1999). The search for simplicity: A fundamental cognitive principle? The Quarterly Journal of Experimental Psychology Section A, 52(2), 273–302.
  • Chen, Y. C., Benus, M. J., & Hernandez, J. (2019). Managing uncertainty in scientific argumentation. Science Education, 103, 1235–1276. doi:10.1002/sce.21527
  • Chinn, C. A., & Malhotra, B. A. (2002). Children's responses to anomalous scientific data: How is conceptual change impeded? Journal of Educational Psychology, 94(2), 327–343. doi:10.1037/0022-0663.94.2.327
  • Cleland, C. E. (2001). Historical science, experimental science, and the scientific method. Geology, 29(11), 987–990. doi:10.1130/0091-7613(2001)029<0987:HSESAT>2.0.CO;2
  • Cobern, W. W., & Loving, C. C. (2001). Defining "science" in a multicultural world: Implications for science education. Science Education, 85(1), 50–67. doi:10.1002/1098-237X(200101)85:1<50::AID-SCE5>3.0.CO;2-G
  • Cook, J., & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland.
  • Crozier, W. E., & Strange, D. (2019). Correcting the misinformation effect. Applied Cognitive Psychology, 33(4), 585–595.
  • Department of Education. (2014). Science programmes of study: Key stage 4. National Curriculum in England. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/381380/Science_KS4_PoS_7_November_2014.pdf
  • Deppe, K. D., Gonzalez, F. J., Neiman, J. L., Jacobs, C., Pahlke, J., Smith, K. B., & Hibbing, J. R. (2015). Reflective liberals and intuitive conservatives: A look at the cognitive reflection test and ideology. Judgment & Decision Making, 10(4), 314–331.
  • diSessa, A. A. (1993). Toward an epistemology of physics. Cognition and Instruction, 10(2-3), 105–225. doi:10.1080/07370008.1985.9649008
  • Druckman, J. N., & McGrath, M. C. (2019). The evidence for motivated reasoning in climate change preference formation. Nature Climate Change, 9(2), 111–119. doi:10.1038/s41558-018-0360-1
  • Duschl, R. (2008). Science education in three-part harmony: Balancing conceptual, epistemic, and social learning goals. Review of Research in Education, 32, 268–291. doi:10.3102/0091732X07309371
  • Duschl, R. A. (2020). Practical reasoning and decision making in science: Struggles for truth. Educational Psychologist, 55(3), 187–192.
  • Duschl, R. A., & Osborne, J. (2002). Supporting and promoting argumentation discourse in science education. Studies in Science Education, 38(1), 39–72. doi:10.1080/03057260208560187
  • Ecker, U. K., Hogan, J. L., & Lewandowsky, S. (2017). Reminders and repetition of misinformation: Helping or hindering its retraction? Journal of Applied Research in Memory and Cognition, 6(2), 185–192. doi:10.1037/h0101809
  • Ecker, U. K., Lewandowsky, S., Jayawardana, K., & Mladenovic, A. (2019). Refutations of equivocal claims: No evidence for an ironic effect of counterargument number. Journal of Applied Research in Memory and Cognition, 8(1), 98–107.
  • Fasce, A., & Picó, A. (2019). Science as a vaccine. Science & Education, 28(1-2), 109–125. doi:10.1007/s11191-018-00022-0
  • Feinstein, N. W., & Waddington, D. I. (2020). Individual truth judgments or purposeful, collective sensemaking? Rethinking science education's response to the post-truth era. Educational Psychologist, 55(3), 155–166. doi:10.1080/00461520.2020.1780130
  • Fiedler, D., Sbeglia, G. C., Nehm, R. H., & Harms, U. (2019). How strongly does statistical reasoning influence knowledge and acceptance of evolution? Journal of Research in Science Teaching, 56(9), 1183–1206. doi:10.1002/tea.21547
  • Foucault, M. (1970). The order of things: An archaeology of the human sciences (A. M. Sheridan Smith, Trans.). New York, NY: Vintage Books.
  • Gray, R. (2014). The distinction between experimental and historical sciences as a framework for improving classroom inquiry. Science Education, 98(2), 327–341. doi:10.1002/sce.21098
  • Haglin, K. (2017). The limitations of the backfire effect. Research & Politics, 4(3), 1–5. doi:10.1177/2053168017716547
  • Hand, B., Lawrence, C., & Yore, L. D. (1999). A writing in science framework designed to enhance science literacy. International Journal of Science Education, 21(10), 1021–1035.
  • Hansson, S. O. (2017a). Science and pseudo-science. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Summer 2017 ed.). https://plato.stanford.edu/entries/pseudo-science/#ScD
  • Hansson, S. O. (2017b). Science denial as a form of pseudoscience. Studies in History and Philosophy of Science, 63, 39–47. doi:10.1016/j.shpsa.2017.05.002
  • Henley, J. (2020). How Finland starts its fight against fake news in primary schools.
  • Hodgin, E., & Kahne, J. (2018). Misinformation in the information age: What teachers can do to support students. Social Education, 82(4), 208–212.
  • Jaipal, K. (2009). Meaning making through multiple modalities in a biology classroom: A multimodal semiotics discourse analysis. Science Education, 94(1), 48–72.
  • Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political conservatism as motivated social cognition. Psychological Bulletin, 129(3), 339–375. doi:10.1037/0033-2909.129.3.339
  • Kahan, D. M., Jenkins-Smith, H., & Braman, D. (2011). Cultural cognition of scientific consensus. Journal of Risk Research, 14(2), 147–174. doi:10.1080/13669877.2010.511246
  • Kannan, R. (2019). Sidestepping politics to teach climate. Science, 366(6468), 1042. doi:10.1126/science.366.6468.1042
  • Kelly, G. J. (2014). Inquiry teaching and learning: Philosophical considerations. In International handbook of research in history, philosophy and science teaching (pp. 1363–1380). Dordrecht, The Netherlands: Springer.
  • Kuhn, D. (2010). Teaching and learning science as argument. Science Education, 94(5), 810–824. doi:10.1002/sce.20395
  • Kuhn, T. S. (1970). The structure of scientific revolutions. Chicago: University of Chicago Press.
  • Lakatos, I. (1998). Science and pseudoscience. In M. Curd & J. A. Cover (Eds.), Philosophy of science: The central issues (pp. 20–26). New York, NY: W. W. Norton & Company.
  • Laudan, L. (1996). Beyond positivism and relativism. Boulder, CO: Westview Press.
  • Lawson, A. E. (1999). A scientific approach to teaching about evolution and special creation. American Biology Teacher, 61(4), 266–274. doi:10.2307/4450669
  • Lenormand, T., Roze, D., & Rousset, F. (2009). Stochasticity in evolution. Trends in Ecology & Evolution, 24, 157–165. doi:10.1016/j.tree.2008.09.014
  • Lewandowsky, S., Ecker, U. K. H., Seifert, C., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13, 106–131. doi:10.1177/1529100612451018
  • Limón, M. (2001). On the cognitive conflict as an instructional strategy for conceptual change: A critical appraisal. Learning and Instruction, 11(4), 357–380.
  • Limón, M., & Carretero, M. (1997). Conceptual change and anomalous data: A case study in the domain of natural sciences. European Journal of Psychology of Education, 12(2), 213–230.
  • Liu, D. W. C. (2012). Science denial and the science classroom. CBE-Life Sciences Education, 11, 129–134. doi:10.1187/cbe.12-03-0029
  • Lombrozo, T. (2007). Simplicity and probability in causal explanation. Cognitive Psychology, 55(3), 232–257. doi:10.1016/j.cogpsych.2006.09.006
  • Loving, C. C. (1991). The scientific theory profile: A philosophy of science models for science teachers. Journal of Research in Science Teaching, 28, 823–838.
  • Mazur, A. (2004). Believers and disbelievers in evolution. Politics and the Life Sciences, 23(2), 55–61. doi:10.2990/1471-5457(2004)23[55:BADIE]2.0.CO;2
  • Mizrahi, M. (2015). Historical inductions: New cherries, same old cherry-picking. International Studies in the Philosophy of Science, 29(2), 129–148.
  • Mooney, C. (2011). The science of why we don't believe science. Mother Jones. https://www.motherjones.com/politics/2011/04/denial-science-chris-mooney/
  • Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330.
  • Nyhan, B., & Reifler, J. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine, 33(3), 459–464. doi:10.1016/j.vaccine.2014.11.017
  • Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4), 835–842.
  • Ogawa, M. (1995). Science education in a multiscience perspective. Science Education, 79, 583–593. doi:10.1002/sce.3730790507
  • Okasha, S. (2002). Philosophy of science: A very short introduction. New York, NY: Oxford University Press.
  • Osborne, J. F., & Patterson, A. (2011). Scientific argument and explanation: A necessary distinction? Science Education, 95(4), 627–638.
  • Paul, R. W. (1995). Critical thinking: How to prepare students for a rapidly changing world. Santa Rosa, CA: Foundation for Critical Thinking.
  • Pennycook, G., Cheyne, J. A., Seli, P., Koehler, D. J., & Fugelsang, J. A. (2012). Analytic cognitive style predicts religious and paranormal belief. Cognition, 123(3), 335–346. doi:10.1016/j.cognition.2012.03.003
  • Pinker, S. (2018). Enlightenment now: The case for reason, science, humanism, and progress. Viking.
  • Rosenau, J. (2012). Science denial: A guide for scientists. Trends in Microbiology, 20(12), 567–569.
  • Sandoval, W. A. (2005). Understanding students' practical epistemologies and their influence on learning through inquiry. Science Education, 89(4), 634–656. doi:10.1002/sce.20065
  • Short, S. D., Lastrapes, K. A., Natale, N. E., & McBrady, E. E. (2019). Rational engagement buffers the effect of conservatism on one's reported relevance of the theory of evolution. Journal of Research in Science Teaching, 56, 1384–1405. doi:10.1002/tea.21559
  • Sides, J., & Citrin, J. (2007). How large the huddled masses? The causes and consequences of public misperceptions about immigrant populations. Paper presented at the annual meeting of the Midwest Political Science Association, Chicago, IL.
  • Silberzahn, R., Uhlmann, E. L., Martin, D. P., Anselmi, P., Aust, F., Awtrey, E. C., et al. (2018). Many analysts, one dataset: Making transparent how variations in analytical choices affect results. Advances in Methods and Practices in Psychological Science, 1(3), 337–356.
  • Sinatra, G. M., Kienhues, D., & Hofer, B. K. (2014). Addressing challenges to public understanding of science: Epistemic cognition, motivated reasoning, and conceptual change. Educational Psychologist, 49(2), 123–138.
  • Snively, G., & Corsiglia, J. (2001). Discovering indigenous science: Implications for science education. Science Education, 85(1), 6–34.
  • Southerland, S. A. (2000). Epistemic universalism and the shortcomings of curricular multicultural science education. Science & Education, 9(3), 289–307. doi:10.1023/A:1008676109903
  • Stanley, W. B., & Brickhouse, N. W. (1994). Multiculturalism, universalism, and science education. Science Education, 78(4), 387–398. doi:10.1002/sce.3730780405
  • Swire, B., Ecker, U. K., & Lewandowsky, S. (2017). The role of familiarity in correcting inaccurate information. Journal of Experimental Psychology: Learning, Memory, and Cognition, 43, 1948–1961.
  • Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50, 755–769. doi:10.1111/j.1540-5907.2006.00214.x
  • Tippett, C. D. (2010). Refutation text in science education: A review of two decades of research. International Journal of Science and Mathematics Education, 8, 951–970.
  • Trevors, G. J., Muis, K. R., Pekrun, R., Sinatra, G. M., & Winne, P. H. (2016). Identity and epistemic emotions during knowledge revision: A potential account for the backfire effect. Discourse Processes, 53(5-6), 339–370.
  • Vosniadou, S. (2002). On the nature of naïve physics. In M. Limón & L. Mason (Eds.), Reconsidering the processes of conceptual change (pp. 61–76). Dordrecht: Kluwer Academic Publishers.
  • Wang, K. C. (2012). Animals in Tao's eco-cultural meanings (蘭嶼動物生態文化). Taiwan: National Chiao Tung University Press.
  • Wood, T., & Porter, E. (2017). The elusive backfire effect: Mass attitudes' steadfast factual adherence. Political Behavior, forthcoming. doi:10.2139/ssrn.2819073



20 Defense Mechanisms We Use to Protect Ourselves

Which of these is your go-to?

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."

Steven Gans, MD is board-certified in psychiatry and is an active supervisor, teacher, and mentor at Massachusetts General Hospital.

Some of the best-known defense mechanisms have become a common part of everyday language. For example, you might describe someone as being "in denial" of a problem they face. When someone falls back into old ways of doing things, you might say they are "regressing" to an earlier point of development. Defense mechanisms are unconscious psychological responses that protect people from feelings of anxiety, threats to self-esteem, and things that they don't want to think about or deal with.

Defense Mechanisms vs. Defence Mechanisms

In the U.S., the term "defense mechanisms" is spelled with an 's' in defense. However, in other areas of the world, it is spelled with a 'c.' If you live in the U.K., for instance, the spelling is "defence mechanisms." So, you may see it spelled either way.

Defense mechanisms were first described by  Sigmund Freud  in his psychoanalytic theory. According to Freud, these mechanisms protect the conscious mind from contradictions between the animalistic id and the idealistic superego, ultimately contributing to "mental homeostasis."

10 Key Defense Mechanisms

Freud's daughter, Anna Freud, expanded on her father's theory by describing 10 different defense mechanisms used by the ego. When reading through them, consider whether you use any in your own life.

Displacement

Have you ever had a really bad day at work, then gone home and taken out your frustration on family and friends? If so, you have experienced the ego defense mechanism of displacement.

Displacement involves taking out our frustrations, feelings, and impulses on people or objects that are less threatening.

Displaced aggression is a common example of this defense mechanism. Rather than express your anger in ways that could lead to negative consequences (like arguing with your boss), you instead express your anger toward a person or object that poses no threat (such as your spouse, children, or pets).

Denial

Denial, probably one of the best-known defense mechanisms, is an outright refusal to admit or recognize that something has occurred or is currently occurring. It protects the ego from things the person cannot cope with, and the term is often used to describe situations in which people seem unable to face reality or admit an obvious truth (e.g., "They're in denial").

For example, people living with drug or alcohol addiction often deny that they have a problem, while victims of traumatic events may deny that the event ever occurred.

While it may temporarily shield you from anxiety or pain, denial also requires a substantial investment of energy. Because of this, other defenses are used to help keep these unacceptable feelings from conscious awareness.

In many cases, there might be overwhelming evidence that something is true, yet the person will continue to deny its existence or truth because it is too uncomfortable to face.

Denial can involve a flat-out rejection of the existence of a fact or reality. In other cases, it might involve admitting that something is true, but minimizing its importance. Sometimes people will accept reality and the seriousness of the fact, but they will deny their own responsibility and instead blame other people or other outside forces.

Repression

Repression acts to keep information out of conscious awareness. However, these memories don't just disappear; they continue to influence our behavior. For example, a person who has repressed memories of abuse suffered as a child may later have difficulty forming relationships.

Suppression

Sometimes you might repress information consciously by forcing the unwanted information out of your awareness. This is known as suppression. In most cases, however, this removal of anxiety-provoking memories from awareness is believed to occur unconsciously.

Sublimation

Sublimation  is a defense mechanism that allows us to act out unacceptable impulses by converting these behaviors into a more acceptable form. For example, a person experiencing extreme anger might take up kickboxing as a means of venting frustration.

Freud believed that sublimation was a sign of maturity and allows people to function normally in socially acceptable ways.

Projection

Projection is a defense mechanism that involves taking your own unacceptable qualities or feelings and ascribing them to other people. For example, if you have a strong dislike for someone, you might instead believe that they do not like you.

Projection works by allowing the expression of the desire or impulse, but in a way that the ego cannot recognize, therefore reducing anxiety.

Intellectualization

Intellectualization works to reduce anxiety by thinking about events in a cold, clinical way. This defense mechanism allows us to avoid thinking about the stressful, emotional aspect of the situation and instead focus only on the intellectual component.

For example, a person who has just been diagnosed with a terminal illness might focus on learning everything about the disease in order to avoid distress and remain distant from the reality of the situation and their feelings about it.

Rationalization

Rationalization is a defense mechanism that involves explaining an unacceptable behavior or feeling in a rational or logical manner, avoiding the true reasons for the behavior.

For example, a person who is turned down for a date might rationalize the situation by saying they were not attracted to the other person anyway. A student might rationalize a poor exam score by blaming the instructor rather than admitting their own lack of preparation.

Rationalization not only prevents anxiety, but it may also protect self-esteem and self-concept.

When trying to explain success or failure, people using this defense mechanism tend to attribute achievement to their own qualities and skills while failures are blamed on other people or outside forces.

Regression

When confronted by stressful events, people sometimes abandon coping strategies and revert to patterns of behavior used earlier in development. Anna Freud called this defense mechanism regression, suggesting that people act out behaviors from the stage of psychosexual development in which they are fixated.

For example, an individual fixated at an earlier developmental stage might cry or sulk upon hearing unpleasant news.

According to Freud, behaviors associated with regression can vary greatly depending on the stage at which a person is fixated. For example, an individual fixated at the oral stage might begin eating or smoking excessively, or might become verbally aggressive. A fixation at the anal stage might result in excessive tidiness or messiness.

Reaction Formation

Reaction formation reduces anxiety by taking up the opposite feeling, impulse, or behavior. An example of reaction formation would be treating someone you strongly dislike in an excessively friendly manner in order to hide your true feelings.

Why do people behave this way? According to Freud, they are using reaction formation as a defense mechanism to hide their true feelings by behaving in the exact opposite manner.

7 Main Defense Mechanisms

This list is sometimes shortened to provide only seven main defense mechanisms, which are denial, displacement, projection, rationalization, reaction formation, repression, and sublimation.

10 Other Common Defense Mechanisms

Since Freud first described the original defense mechanisms, other researchers have continued to describe other methods of reducing anxiety. Some of these defense mechanisms include:

  • Acting out: Coping with stress by engaging in actions rather than acknowledging and bearing certain feelings. For example, instead of telling someone that you are angry with them, you might yell at them or throw something against the wall.
  • Aim inhibition: Accepting a modified form of one's original goal. An example of this would be becoming a high school basketball coach rather than a professional athlete.
  • Altruism: Satisfying internal needs through helping others. For example, someone recovering from substance use might volunteer to help others in recovery as a way to deal with drug cravings.
  • Avoidance: Refusing to deal with or encounter unpleasant objects or situations. For example, rather than discuss a problem with someone, you might simply start avoiding them altogether so you don't have to deal with the issue.
  • Compensation: Overachieving in one area to compensate for failures in another. For example, someone who feels insecure academically might compensate by excelling in athletics.
  • Dissociation: Becoming separated or removed from your experience. When dealing with something stressful, for example, you might mentally and emotionally disengage yourself from the situation.
  • Fantasy: Avoiding reality by retreating to a safe place within your mind. When something in your life is causing anxiety, you might retreat to your inner world where the cause of the stress cannot harm you.
  • Humor: Pointing out the funny or ironic aspects of a situation. An example of this might be cracking a joke in a stressful or traumatic situation.
  • Passive-aggression: Indirectly expressing anger. Instead of telling someone that you are upset, for example, you might give them the silent treatment.
  • Undoing: Trying to make up for what you feel are inappropriate thoughts, feelings, or behaviors. For example, if you hurt someone's feelings, you might offer to do something nice for them to assuage your anxiety or guilt.

While defense mechanisms are often thought of as negative reactions, we all need them to temporarily ease stress and protect self-esteem during critical times, allowing us to focus on what is necessary at the moment.

Some of these defenses can be more helpful than others. For example, utilizing humor to overcome a stressful, anxiety-provoking situation can actually be an adaptive defense mechanism.

There are many different types of defense mechanisms that can be used to protect the ego from anxiety. Some of these can be healthier and more helpful than others.

How Do Defense Mechanisms Work?

In Sigmund Freud's model of personality, the ego is the aspect of personality that deals with reality. While doing this, the ego also has to cope with the conflicting demands of the id and the superego.

  • The id : The part of the personality that seeks to fulfill all wants, needs, and impulses. The id is the most basic, primal part of our personalities and does not consider things such as social appropriateness, morality, or even the reality of fulfilling our wants and needs.
  • The superego : The part of the personality that tries to get the ego to act in an idealistic and moral manner. The superego is made up of all the internalized morals and values we acquire from our parents, other family members, religious influences, and society.

To deal with anxiety, Freud believed that defense mechanisms helped shield the ego from the conflicts created by the id, superego, and reality. So what happens when the ego cannot deal with the demands of our desires, the constraints of reality, and our own moral standards?

According to Freud, anxiety is an unpleasant inner state that people seek to avoid. Anxiety acts as a signal to the ego that things are not going the way they should. As a result, the ego employs some sort of defense mechanism to help reduce these feelings of anxiety.

Types of Anxiety

Not all types of anxiety are created equal. Nor do these anxieties stem from the same sources. Freud identified three types of anxiety:

  • Moral anxiety : A fear of violating our own moral principles
  • Neurotic anxiety : The unconscious worry that we will lose control of the id's urges, resulting in punishment for inappropriate behavior
  • Reality anxiety : Fear of real-world events. The cause of this anxiety is usually easily identified. For example, a person might fear a dog bite when they are near a menacing dog. The most common way of reducing this anxiety is to avoid the threatening object.

Although we may knowingly use coping mechanisms to manage anxiety, in many cases these defenses work unconsciously to distort reality.

Coping With Unhealthy Defense Mechanisms

While all defense mechanisms can be unhealthy, they can also be adaptive and allow us to function normally. For example, altruism, humor, sublimation, and suppression are four mature defense mechanisms that signal higher adaptiveness.

At the same time, problems can arise when defense mechanisms are overused in an attempt to avoid dealing with problems. To keep this from happening to you, here are a few ways to cope with unhealthy defenses.

  • Develop greater self-awareness. Self-awareness helps you identify when you may be using one or more defense mechanisms too often. Once you take this step, you know where you need to make changes.
  • Learn effective coping skills. If you rely on an unhealthy defense mechanism, learning new coping skills can help you better deal with uncomfortable emotions. Coping skills include meditation, establishing healthy boundaries, and asking for support.
  • Seek mental health therapy. Psychoanalytic therapy can help you uncover your unconscious defense mechanisms and find better, healthier ways of coping with anxiety and distress. Licensed psychologist David Susman, PhD, notes that "cognitive-behavioral therapy could also be helpful in addressing maladaptive use of defense mechanisms, since defense mechanisms can often contribute to irrational thoughts and beliefs as well as problematic or impulsive behaviors. For example, someone who uses denial in the face of a diagnosis of a serious medical condition may delay or avoid going to the doctor by mentally discounting the seriousness of the health issue."

Keep in Mind

Remember, defense mechanisms can be both good and bad. They can serve a helpful role by protecting your ego from stress and providing a healthy outlet. In other instances, these defense mechanisms might hold you back from facing reality and can act as a form of self-deception.

If you notice that the overuse of certain defense mechanisms is having a negative impact on your life, consider consulting with a mental health professional. Psychotherapy may help whether you pursue a traditional face-to-face treatment or an online therapy option.


By Kendra Cherry, MSEd. Kendra Cherry is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Other interesting articles
  • Frequently asked questions about critical thinking

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing, critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context

You're researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words "sponsored content" appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context

You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?
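
The questions above amount to a reusable checklist. As a minimal sketch (the structure and names below are my own, not Scribbr's or part of the CRAAP test itself), the questions can be grouped under the five CRAAP criteria and a source checked against them:

```python
# The source-evaluation questions grouped under the CRAAP criteria.
# This grouping and the field names are illustrative, not an established schema.
CRAAP_CHECKLIST = {
    "currency":  ["When did they say this?", "Is the source current?"],
    "relevance": ["What do they say?", "Is their argument clear?"],
    "authority": ["Who is the author?", "Are they an expert in their field?"],
    "accuracy":  ["How do they make their argument?", "Is it backed up by evidence?"],
    "purpose":   ["Why did the author publish it?", "What is their motivation?"],
}

def unmet_criteria(answers):
    """Given {criterion: bool} for whether a source passes each check,
    return the criteria it fails (any criterion not answered True)."""
    return [c for c in CRAAP_CHECKLIST if not answers.get(c, False)]

# A sponsored review might be current, relevant, and accurate, yet fail on purpose:
print(unmet_criteria({"currency": True, "relevance": True,
                      "authority": True, "accuracy": True, "purpose": False}))
# → ['purpose']
```

A source that fails any criterion is not necessarily useless, but each unmet criterion is a prompt to investigate further before citing it.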

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?

Other interesting articles

If you want to know more about ChatGPT, AI tools, citation, and plagiarism, make sure to check out some of our other articles with explanations and examples.

  • ChatGPT vs human editor
  • ChatGPT citations
  • Is ChatGPT trustworthy?
  • Using ChatGPT for your studies
  • What is ChatGPT?
  • Chicago style
  • Paraphrasing

Plagiarism

  • Types of plagiarism
  • Self-plagiarism
  • Avoiding plagiarism
  • Academic integrity
  • Consequences of plagiarism
  • Common knowledge


Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy, it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We tend to recollect information best when it amplifies what we already believe and, relatedly, to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article


Ryan, E. (2023, May 31). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved March 13, 2024, from https://www.scribbr.com/working-with-sources/critical-thinking/


ICD10monitor

How Critical Thinking Skills Help Avoid Denials

  • By Laurie M. Johnson, MS, RHIA, FAHIMA, AHIMA Approved ICD-10-CM/PCS Trainer
  • May 22, 2023


Denials have been on my mind for the last few weeks – specifically, how do we prevent them?

According to the Public Broadcasting Service (PBS), 48.3 million claims were denied in 2021, which was 16.6 percent of all claims. If you use the standard of $25 per claim as the benchmark, then the total cost for reworking the denials was over $1.2 billion.
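
As a quick sanity check on those figures (a back-of-the-envelope sketch based only on the numbers quoted above, not part of the PBS report):

```python
# Back-of-the-envelope check of the denial figures cited above.
denied_claims = 48_300_000    # claims denied in 2021, per PBS
denial_rate = 0.166           # 16.6 percent of all claims
rework_cost_per_claim = 25    # benchmark cost to rework one denied claim

total_claims = denied_claims / denial_rate
total_rework_cost = denied_claims * rework_cost_per_claim

print(f"Total claims filed: ~{total_claims / 1e6:.0f} million")
print(f"Total rework cost: ~${total_rework_cost / 1e9:.2f} billion")
# → Total claims filed: ~291 million
# → Total rework cost: ~$1.21 billion
```

So the $25 benchmark does put the rework cost just over $1.2 billion, consistent with the figure quoted above.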

As coders, how can we affect claim denials? My thought is…we need to use critical thinking skills.

What is critical thinking? According to mystudentvoices.com, critical thinking includes five steps: formulate your question, gather your information, apply the information, consider the implications, and explore other points of view.

In applying this process to coding, my initial question is this: what is wrong with this patient, and how many resources were used to treat them? The ICD-10-CM/PCS and/or CPT codes must paint the picture of severity of illness and resource consumption for each patient.

To gather the information, I will read and understand the clinical information in the chart. When I read through charts, I jot down the diagnoses and procedures that were documented.

Now, to apply the information, the codes are assigned. Are there any information gaps in assigning the codes? Is there conflicting information regarding the diagnoses or procedures?

The next step in the process is to consider the implications. Does the conflicting information impact the MS-DRG or APR-DRG assignment? Does missing information impact the code assignment? Would additional information provide a more specific code?

The last step is exploring the options. What is the likelihood that this case will be denied? A chart carrying only one complication or comorbidity (CC) or major CC (MCC) has an increased likelihood of denial. Conflicting or missing information also increases the denial rate.
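
The denial-risk cues listed here (a single CC/MCC, conflicting documentation, missing documentation) could be sketched as a simple pre-bill flag. This is purely illustrative; the function and field names are hypothetical and not part of ICD-10 or any coding system:

```python
def denial_risk_flags(chart):
    """Return a list of denial-risk flags for a coded chart (illustrative only).

    `chart` is a dict with hypothetical keys:
      cc_count, mcc_count        - number of CCs / MCCs assigned
      conflicting_documentation  - True if the notes conflict (e.g., alert vs. confused)
      missing_documentation      - True if a diagnosis is not carried through discharge
    """
    flags = []
    # A single CC or MCC driving the DRG increases the likelihood of denial.
    if chart.get("cc_count", 0) + chart.get("mcc_count", 0) == 1:
        flags.append("only one CC/MCC on the chart")
    if chart.get("conflicting_documentation"):
        flags.append("conflicting documentation - consider a physician query")
    if chart.get("missing_documentation"):
        flags.append("diagnosis not carried through discharge")
    return flags

# The hip-fracture example below: toxic metabolic encephalopathy is the only
# MCC, the confusion documentation conflicts, and the discharge summary omits
# the diagnosis - so all three flags fire.
example = {
    "mcc_count": 1,
    "conflicting_documentation": True,
    "missing_documentation": True,
}
print(denial_risk_flags(example))
```

Any chart that raises a flag is a candidate for a physician query before the claim goes out, which is cheaper than reworking a denial afterward.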

Let’s walk through an example of following these critical thinking points:

A patient presents to the emergency department after a fall. He has some confusion as well as a hip fracture, which is repaired later in the stay. The neurological assessment finds him alert on most days, but on a few days he is documented as confused. The laboratory values show that the patient is dehydrated. The provider documents in several progress notes that the 79-year-old patient has dementia and toxic metabolic encephalopathy. The discharge summary does not include the diagnosis of toxic metabolic encephalopathy.

In applying our critical thinking model, I know that this patient had a hip fracture and surgery. He also had some confusion. As I read through the chart, I noticed that the documentation is conflicting regarding the patient’s confusion. As I assign the codes, I have an issue with toxic metabolic encephalopathy, because it is not carried through discharge. This case has a high likelihood of being denied, because it is the only MCC on the chart and it increases reimbursement. It requires a physician query to close the loose ends.

As coders, we use critical thinking skills daily, and on every chart. By using this skill, we avoid reimbursement loss, delay, and the extra work required to respond to a denial. In some cases, the extra time spent resolving conflicting information will avoid the denial.

Make sure to use your critical thinking skills when coding charts!

Programming note : Listen to Laurie Johnson’s live coding reports every Tuesday on Talk Ten Tuesdays , 10 a.m., with Chuck Buck and Dr. Erica Remer.



Carl Alasko

How Does Denial Actually Work?

How to respond when a family member claims that "we're all in denial."

Posted April 23, 2012 | Reviewed by Abigail Fagan

Dear Dr. Alasko: One of my family members likes to constantly accuse others of "being in denial." When I question him on this, his explanations don't make much sense. I believe that either something is true or it's not. So does denial really exist? And if it does, how does it work?

Dear Reader: Yes, denial (of reality) exists. But why? How can human beings gifted with the ability to analyze complex information ignore facts directly in front of their eyes, and refuse to see them even when ignoring the information might be disastrous?

Let’s start by looking at your root premise. It’s an over-simplification to believe that something is either true or false. Why? Because humans experience a range of powerful and complex emotions, such as desire, greed, pride, revenge , need for status, shame , humiliation , etc. These emotions exert a strong influence over a person's ability to interpret facts.

Now, our overall progress as a society is predicated on our learning how to control these emotions and make decisions based on facts. However, fact-based decision-making hasn’t made as much progress in our society as it deserves because many decisions are overwhelmed by those emotions. Add in other psychological dynamics such as ideology (which substitutes belief for facts), inertia (change requires significant energy), momentum (the desire to will obstacles out of our way), impulsiveness (wanting it now!) and stubbornness (no one will change my mind), and we can easily relegate facts to a far corner behind several pieces of heavy mental furniture.

Here’s a common example of denial: how we spend money. Desire, greed and need for status can easily override rational considerations, providing the stimuli that power our spending habits.

For instance, an important friend invites you to a birthday party at an expensive restaurant. You know that going will cost you at least $50. You know you’ll have to charge all of it. You know that your credit limits are stretched. You know you can’t really afford the extra $50 added to your debt. What do you do?

Most probably, you go and spend the $50. And you justify it with denial plus some delusion. Yes, the facts about how it will impact your financial state are true, but saying no to the party would mean confronting a strong desire, pride in your ability to spend, your already impulsive habits, and your social status as a publicly seen person. In short, it would mean admitting to a whole set of factual limitations concerning your life. Reality feels constricting, so denial rules.

The same reasoning process applies to a thousand different kinds of decisions, whether deciding on ordering French fries or another drink, or buying a new SUV, or going on a date with someone who’s married, or ignoring the fact that your date or spouse consistently drinks too much, or that you and your spouse haven’t had a meaningful conversation in over a year. Denying those facts allows you to keep moving rather than stopping and facing the painful restrictions and demands of reality.

There is an immutable fact about denial: it does not work—long term. Reality always wins. And when it does, the next step in the process is blame, which shifts responsibility onto someone or something else. "I only did it because of you! If you hadn't done that, I wouldn't have done this." So where there's denial, blame is always available to ease the pain when reality bites.

So yes, the state of being “in denial” does exist. Whether your family member is correct in assigning that state to the rest of you is a different question—but it is in fact a condition many of us live in on a regular basis.


Carl Alasko, Ph.D. is the author of Beyond Blame (Tarcher Penguin), and like his first book Emotional Bullshit , it has been published in five languages.


Defense Mechanisms in Psychology Explained (+ Examples)

Defense Mechanisms

These are known as defense mechanisms. They're a set of behaviors that your client has learned to rely on in times of stress.

Your goal is to identify these mechanisms and to understand how the client is using them.

Although Sigmund Freud never produced a comprehensive list of defense mechanisms, they are well documented in psychology.

In this post, you’ll learn more about the different types of defense mechanisms.


This Article Contains:

  • Defense mechanisms in psychology: Freud's theory
  • Displacement
  • Introjection
  • Compensation
  • Suppression
  • Dissociation
  • PositivePsychology.com's relevant resources
  • A take-home message

Freud argued that the mind was made up of three components: the id, ego, and superego (Rennison, 2015).

  • The id houses basic needs, impulses, and desires. Simply, the id acts as a hedonistic pleasure center whose primary goal is to satisfy basic needs and drives.
  • The ego is responsible for how we react to, function in, and make sense of the external world. The ego controls the demands of impulses of the id and is home to our consciousness.
  • The superego houses all the rules that we have learned throughout our life and uses these to control the ego. The superego is also home to the expectations of the ego: the way we should behave and think.

Ideally, the id, ego, and superego interact in concert and harmoniously. However, each component can cause anxiety within an individual.

Sigmund Freud argued that when placed in a psychologically dangerous or threatening situation, the patient was likely to resort to defense mechanisms for protection. In a psychoanalytic context, a dangerous threat is something that challenges the patient’s self-concept or self-esteem (Baumeister, Dale, & Sommer, 1998).

Initially, Freud argued that these threats were basic drives (specifically sexual and aggressive drives) that were at odds with the ego (Baumeister et al., 1998); for example, feeling sexually attracted to one’s child.

Freud later refined his theory by shifting the focus toward self-esteem preservation. Specifically, he posited that when the patient’s self-esteem and self-image were challenged or threatened, they would rely on certain cognitive or mental strategies to protect their self-esteem (Baumeister et al., 1998).

To preserve their self-esteem, the client develops defense mechanisms (Baumeister et al., 1998). Defense mechanisms may be employed unconsciously, with the client unaware that they are using them or why.

The presence of a defense mechanism, however, implies that the client’s self-esteem and self-concept feel threatened and need protecting.

Defense mechanisms can include:

  • Sublimation
  • Identification

In the next section, we will explore some defense mechanisms clients might use.

Projection

When clients possess traits that threaten their self-concept, they may:

  • Fail to recognize that they possess these traits
  • See these same threatening traits in other people

This is known as projection (Baumeister et al., 1998). By not acknowledging threatening traits in themselves, and seeing them in other people instead, the client can protect their self-concept.

The therapist suggests to the client, Amelia, that she fails to acknowledge her partner’s feelings in an argument. Amelia believes she is a very empathetic person, and she thinks she is very responsive to her partner’s feelings.

In response, Amelia argues that it is not her, but her partner who fails to acknowledge Amelia’s feelings. Amelia’s self-concept is threatened by having to recognize these behaviors in herself, and therefore she projects these traits onto her partner instead.


When a client displays displacement, they are changing or displacing the original target of a particular impulse to another similar target (Baumeister et al., 1998).

The displacement occurs because the response to the initial target is considered unacceptable or impossible, so a more suitable target is found. The displaced impulse might be very intense toward the original target, but more subdued toward the alternative target. Freud argued that displacement was commonly used in dreaming (Rennison, 2015).

Aidan experiences intense rage and hatred toward his mother; however, he cannot act on these impulses. Instead, he displaces his feelings about his mother onto other people whom he associates with her. He might show hostile behavior toward other women who embody the same characteristics and behaviors as his mother.

Repression

By repressing a memory, feeling, or thought, these things are no longer accessible in the client’s consciousness (Cramer, 1991, 2006). These things do not cease to exist and may be represented in dreams and thoughts by other things, people, or objects.

Although often contrasted as the unconscious variant of suppression, Erdelyi (2006) argues that Freud used repression and suppression interchangeably and considered repression to fall on an unconscious–conscious continuum .

Jacob cannot remember certain painful memories as a child. To protect himself, he unconsciously represses these memories from his consciousness. Instead, he displays anxious behaviors toward other items that he associates with these original painful memories.

Denial refers to the client’s refusal to acknowledge certain facts about a particular situation (Baumeister et al., 1998) or denial of the existence of specific feelings, thoughts, or even perceptions (Cramer, 1991, 2006).

By not acknowledging the facts, the client is protected from a particular state of the world and its consequences – or even from themselves – and how these impact the client.

Ahmed has received various negative job evaluations about his inability to communicate empathetically with clients. Since Ahmed believes he communicates very effectively, he dismisses these negative evaluations using several arguments.

He argues that his manager is wrong, his manager is jealous, that he was stressed that one day with the client, that the client was unclear, and that the other client was hostile.

All of these denials help protect Ahmed from having to incorporate the negative feedback into his self-concept and accept that he is less empathetic than he originally thought.

Introjection

With identification, a highly valued external object is regarded as separate from the client; however, with introjection, the boundary between the client and the external object is blurred.

The client identifies key behaviors, thoughts, and characteristics of important people in their life and forms an internal representation of these individuals. Henry, Schacht, and Strupp (1990) argue that these internal representations mirror the behaviors, feelings, and thoughts of these people and play a key role in developing the client’s self-concept.

Agatha experiences introjection related to her highly critical mother as the internal voice that continuously criticizes and berates her. As a result, Agatha has developed low self-esteem and often runs herself down.

While in therapy, Agatha’s therapist pushes back against Agatha’s opinion, and Agatha experiences this as criticism that confirms her opinion of herself.

Undoing refers to a behavior in which individuals ruminate on previous events, replaying and reimagining them as a way to change what happened and, as a result, to protect against certain feelings or behaviors (Baumeister et al., 1998).

Since the particular event has already happened, there is nothing that can be done to change that particular outcome; instead, the replaying of the events allows the individual to protect themselves from certain feelings.

Jayme recently argued with a customer, lost his temper, and consequently lost that customer’s contract. He is very angry about the outcome. He relives this argument, ruminating on how he should have responded, and imagines delivering a precise retort and embarrassing the client.

The reimagining doesn’t change the scenario, but it makes him feel like he was better equipped to deal with the argument.

Compensation

Compensation refers to counterbalancing a perceived flaw or shortcoming by making extra effort or excelling in another area. These compensations can be very extreme; the flaws or shortcomings might be real or imaginary, psychological or physical. When the compensatory response is excessive compared to the shortcoming, it is typically described as overcompensation.

Jeffrey is bullied at school by the other boys because of his slim build. In response, Jeffrey exercises regularly. He undertakes an intense exercise program, drinks protein shakes, and is very diligent in his strength training.

He obtains the desired result. He puts on a great deal of muscle mass, and his body changes. In this instance, Jeffrey is compensating for what he considers to be a physical flaw through strength training.

Splitting

Splitting refers to the mechanism whereby individuals are considered either all good or all bad, but never a mix of both. Splitting can be applied to oneself or to other people.

It is hypothesized that as a defense mechanism, splitting happens in childhood and is typically associated with poor development of the self (Gould, Prentice, & Ainslie, 1996).

Although young children typically hold polarized beliefs about themselves and other people, they integrate negative and positive beliefs and representations as they get older. However, if the child is continually exposed to negative situations, then this integration is interrupted and splitting becomes the default mechanism through which they view and understand the world.

The assignment of a positive or negative evaluation to oneself or others is not stable; it changes in response to how the client’s needs are satisfied.

Therefore, in situations when the client’s need is being met, the external party is ‘good.’ When the client’s needs are frustrated, then the external party is ‘bad,’ and only negative attributes are assigned to them. As a result, clients who have developed a splitting mechanism tend to have unstable interpersonal relationships.

When Cary receives the help and favors that she asks for, she describes the people who satisfy these needs in very positive terms. They’re extremely helpful, loving, and patient, and in response, she shows them love and affection.

One day, she asks her friend to help her financially, but her friend is unable to assist. In response, Cary becomes extremely upset, and she turns against this friend, describing her as “unreliable,” “good for nothing,” and “selfish.”

Her therapist tries to point out that Cary’s friend has helped in the past, but Cary refuses to acknowledge this and continues to harbor resentment toward her friend. A few weeks later, when Cary asks for help again, this same friend offers to lend a hand. Cary flips her opinion and now embraces this friend wholeheartedly.

Because of Cary’s unstable attitude toward her friend and inability to consider that her friend can have good and bad qualities, her friendships are very tenuous and often characterized by unrealistic expectations and conflict.

Suppression

Suppression is the deliberate, conscious effort to keep distressing thoughts, feelings, perceptions, and memories out of awareness, in contrast to repression, which operates unconsciously; this distinction was first introduced by Anna Freud (Erdelyi, 2006). By suppressing such material, the client is protected from experiencing emotional and psychological distress.

During the therapy session, Amy refuses to recall her feelings toward her late husband. She actively works against these memories through a variety of techniques (e.g., ignoring them, changing the topic, or just refusing). When pushed, she tells her therapist that quite simply, she ‘cannot go there.’

Conversion

Conversion is characterized by the transformation of psychological pain or distress into physiological impairment, typically sensory or motor symptoms such as blindness, paralysis, or seizures (Sundbom, Binzer, & Kullgren, 1999).

The physiological symptoms and experiences are idiopathic (i.e., of unknown cause) and cannot be explained by another disease process. The DSM-5 recognizes conversion as a disorder, although there is debate about its classification and taxonomy (Brown, Cardeña, Nijenhuis, Sar, & van der Hart, 2007).

Awongiwe has experienced extreme trauma and distress while relocating. A few days later, Awongiwe wakes up to find that she is blind.

Neurological and ophthalmological examinations show that her eyes are healthy and her optic nerve is intact, yet Awongiwe continues to present with blindness. In this case, her blindness has developed in response to her extreme stress.

Dissociation

Dissociation occurs when the client mentally detaches from a particularly stressful experience rather than processing it fully. By not ‘experiencing’ that period and subsequently integrating it into their consciousness, the client is protected from harmful experiences.

Katherine is recalling an especially traumatic experience to her therapist. While recalling the experience, Katherine feels overwhelmingly exhausted and cannot control her yawning.

These feelings of exhaustion quickly intensify, and she struggles immensely not to fall asleep. Her exhaustion is a sign of dissociation, and her mind is trying to protect her from re-experiencing the traumatic experience.

Isolation

Isolation is defined as the act of creating a mental or cognitive barrier around threatening thoughts and feelings, separating them from other cognitive processes (Baumeister et al., 1998).

By isolating these threats, the client prevents mental associations from forming between the threatening thoughts and other thoughts. Isolation is often evident when the client does not complete a thought, trailing off into silence and changing the topic instead.

During her session, Emily is describing an argument with her husband and is about to describe a thought that she remembers thinking during the argument.

The thought that she was about to recall is unlike the thoughts and feelings that Emily believes she typically feels toward her husband, and it does not fit in her self-concept of a loving wife.

As she is about to recall the thought, she pauses, leaving the sentence unfinished, and describes a different aspect of the argument instead.

Regression

Regression involves reverting to behaviors and thought processes typical of an earlier stage of development in response to stress. Regression is considered maladaptive, since more emotionally mature behaviors and thought processes are more likely to aid in problem solving and coping.

In response to the news that his parents are getting divorced, Gary has displayed behavior that is more typical of younger children.

When frustrated, he screams and bites, kicks and hits his parents, and has started wetting the bed.

At PositivePsychology.com, you’ll find several very useful tools to help your client better cope with stressful situations. Here is a list of three recommended tools.

To help your client better understand the type of coping mechanisms that they rely on, we recommend the Explore Coping Modes tool. This tool teaches clients how to identify the behaviors and cognitive processes that they currently use when they feel stressed.

With the Schema Therapy Flash Card, you and your client can ‘summarize’ their behavior. These flashcards offer bite-sized morsels of wisdom that can help your client respond more healthily to maladaptive behaviors and thought processes. Furthermore, the cards are easy to carry, so your client can rely on them in distressing situations.

If your client relies on avoidant behaviors, we recommend the Conquering Avoidant Tendencies worksheet. In this defense mechanism worksheet, you will work with your client to identify the source of their anxiety (what they are trying to avoid) and learn how they can approach this source in a manageable way by focusing on smaller steps.

This task can be used in multiple situations, and once your client is familiar with it, they can apply it at home on their own.

If you’re looking for more science-based ways to help others overcome adversity, this collection contains 17 validated resilience tools for practitioners. Use them to help others recover from personal challenges and turn setbacks into opportunities for growth.

Human behavior is complex, and often our behavior is not as simple as it appears. We say one thing but actually mean another, or we think one thing while being motivated by something else.

One of the many challenges of being a therapist is exploring and understanding the nuanced complexities of a client’s behavior. In some instances, you may even find yourself participating in your client’s defense mechanisms.

One of your tasks is to remain aware of this complexity: specifically, how your client’s defense mechanisms, and your behavior in response, actively or passively influence their behavior.

We hope you enjoyed reading this article. Don’t forget to download our three Resilience Exercises for free.

  • Baumeister, R. F., Dale, K., & Sommer, K. L. (1998). Freudian defense mechanisms and empirical findings in modern social psychology: Reaction formation, projection, displacement, undoing, isolation, sublimation, and denial. Journal of Personality, 66(6), 1081–1124.
  • Brown, R. J., Cardeña, E., Nijenhuis, E., Sar, V., & van der Hart, O. (2007). Should conversion disorder be reclassified as a dissociative disorder in DSM–V? Psychosomatics, 48(5), 369–378.
  • Costa, R. M. (2020). Regression (defense mechanism). In V. Zeigler-Hill & T. K. Shackelford (Eds.), Encyclopedia of personality and individual differences (pp. 4346–4348). Springer.
  • Cramer, P. (1991). The development of defense mechanisms: Theory, research, and assessment. Springer Science & Business Media.
  • Cramer, P. (2006). Protecting the self: Defense mechanisms in action. Guilford Press.
  • Erdelyi, M. H. (2006). The unified theory of repression. Behavioral and Brain Sciences, 29(5), 499–511.
  • Gould, J. R., Prentice, N. M., & Ainslie, R. C. (1996). The Splitting Index: Construction of a scale measuring the defense mechanism of splitting. Journal of Personality Assessment, 66(2), 414–430.
  • Henry, W. P., Schacht, T. E., & Strupp, H. H. (1990). Patient and therapist introject, interpersonal process, and differential psychotherapy outcome. Journal of Consulting and Clinical Psychology, 58(6), 768.
  • Hentschel, U., Smith, G., Draguns, J. G., & Ehlers, W. (Eds.). (2004). Defense mechanisms: Theoretical, research and clinical perspectives. Elsevier.
  • Rennison, N. (2015). Freud and psychoanalysis: Everything you need to know about id, ego, super-ego and more. Oldcastle Books.
  • Sundbom, E., Binzer, M., & Kullgren, G. (1999). Psychological defense strategies according to the Defense Mechanism Test among patients with severe conversion disorder. Psychotherapy Research, 9(2), 184–198.

Ignorance, misconceptions and critical thinking

  • Knowing the Unknown
  • Published: 07 January 2020
  • Volume 198 , pages 7473–7501, ( 2021 )

  • Sara Dellantonio   ORCID: orcid.org/0000-0002-2281-7754 1 &
  • Luigi Pastore   ORCID: orcid.org/0000-0002-5892-6928 2  

In this paper we investigate ignorance in relation to our capacity to justify our beliefs. To achieve this aim we specifically address scientific misconceptions, i.e. beliefs that are considered to be false in light of accepted scientific knowledge. The hypothesis we put forward is that misconceptions are not isolated false beliefs, but rather form part of a system of inferences—an explanation—which does not match current scientific theory. We further argue that, because misconceptions are embedded in a system, they cannot be rectified simply by replacing false beliefs with true ones. To address our misconceptions, we must rather act on the system of beliefs that supports them. In the first step of our analysis, we distinguish between misconceptions that are easy to dispel because they represent simple errors that occur against the background of a correct explanatory apparatus and misconceptions that are, on the contrary, very difficult to dispel because they are the product of pseudo explanations. We show that, in the latter case, misconceptions constitute an integral part of an incorrect explanation and the reasons that support such misconceptions are deeply misleading. In the second step, we discuss various approaches that have been adopted to address the problem of misconceptions. Challenging the notion that directly addressing and criticizing specific misconceptions is an effective approach, we propose that critical thinking is the most fruitful means to deal with misconceptions. We define the core competences and knowledge relevant for the practice of critical thinking and discuss how they help us avoid misconceptions that arise from accepting beliefs that form part of a mistaken explanation.

In the literature, the approach to ignorance that considers the possession of true justified beliefs as a necessary condition for having knowledge and, by contrast, the absence of one of these conditions as sufficient to establish ignorance is called the Standard View of Ignorance (Le Morvan 2011 , 2012 , 2013 ; Le Morvan and Peels 2016 ). Recently, epistemological research has developed a new approach called the New View of Ignorance in which the role assigned to justification and to the capacity to explicitly offer reasons in support of our beliefs has been weakened and in which a person is considered to be ignorant primarily in the case in which s/he holds false beliefs. (On the so-called New View see Peels 2010 , 2011 , 2012 ; Le Morvan and Peels 2016 ). On the one hand, the New View denies—contrary to what we suggest—that justification plays a central function in determining whether we are or are not ignorant about some topic. However, on the other hand, it takes a step towards our approach: it not only places stronger emphasis on the cognizing powers of the subject than the "standard view" but also considers the possibility of distinguishing between various degrees of ignorance. In fact, by discussing the impossibility of providing a complete justification for a belief, it makes it possible to consider ignorance as a continuum rather than in a categorical manner, distinguishing between various degrees in which a person can be said to be ignorant as for the reasons s/he has to hold his/her beliefs (for a conception of ignorance as an epistemic status that comes in degrees and ignorance as the incapacity to adequately and completely answer questions concerning our beliefs cf. Nottelmann 2016 ). Of course, we cannot expect that people are able to provide a complete justification for their beliefs and even the issue of what an acceptable justification should look like is controversial. 
However, especially when considering beliefs concerning (relatively simple) phenomena that have a widely accepted scientific explanation, we can take a practical stance and assume that scientific theories which are commonly accepted by the scientific community at a given time provide us with a measure of which beliefs should be considered true or false at that time and of what a justification of them would ideally look like. On this basis, we can also assess (or approximate) the distance between scientific knowledge and individual beliefs as well as the distance between the way scientific theories justify specific pieces of knowledge and the way in which people justify their beliefs about the same phenomena.

In fact, it is not impossible for an individual to hold a belief that is inconsistent with other beliefs s/he holds, and yet it would be irrational for him/her to do so. As Davidson argues: “Strictly speaking, then, the irrationality consists not in any particular belief but in inconsistency within a set of beliefs” (Davidson 1985/2004, p. 192). According to a widely shared view in psychology developed by Leon Festinger (1957), inconsistencies cannot be psychologically accepted by the subject, who will make every possible effort to rationalize and thus resolve them. In the same vein, Davidson (1982/2004, 1986/2004) points out that the inconsistencies of which we are sometimes victims can be explained only by postulating a kind of compartmentalization of the mind. However, people generally seek internal congruency among their beliefs, and coherence plays a pivotal role in the way we interpret human thinking (on this cf. also Thagard 2000).

In epistemology, justification has been viewed in various ways. Justification might be conceived as linear: an individual belief is proven true by a set of other beliefs, those other beliefs are proven true by another set, and so on, until we reach some beliefs that are based on experience and are therefore considered—if not indisputable—at least well grounded: as BonJour (1985, p. 26) formulates it, “sufficient by itself to satisfy the adequate justification condition for knowledge”. Alternatively, justification can be viewed holistically as an inferential network of beliefs that are interconnected within a system and provide mutual support, but are supported by experience only as a whole. In this case, what justifies a belief is primarily its coherence with the system (this form of holism is commonly discussed in relation to Quine 1951; for a discussion of different justification models cf. Elgin 2005; van Cleve 2005). While strong forms of holism would consider a belief to be justified only if it is coherent with the whole system of beliefs that includes it, more moderate forms of holism hold that justification depends on some chunks of this system. Typically, they also embrace some weak form of foundationalism in which some beliefs are considered more basic than others because they are closer to experience, i.e. observational, and thus serve as a foundation for others. (On moderate vs. strong holism from the point of view of Quine’s philosophy and on Quine’s later arguments in favor of a moderate form of holism cf. De Rosa and Lepore 2004.) To numerous epistemologists, the idea that we can always apply linear models of justification that lead us to some fundamental beliefs appeared implausible in light of the complexity of our system of knowledge.
For this reason, they argued for a holist picture of knowledge in which beliefs are connected to each other within an inferential network and mutually sustain each other (cf. Quine and Ullian 1970; BonJour 1985; Harman 1993; Thagard 2007). At the same time, the view that all the beliefs of a complex belief system are also involved in the justification of each appeared to be too extreme as well as problematic from an epistemological point of view. In fact, this implies that, when even one single belief in the system is changed, all others must be modified accordingly. This excessive interdependence of beliefs makes the system as a whole too unstable (cf. Fodor and Lepore 1992, chap. 2). For this reason, many epistemologists have considered a moderate form of holism the most plausible option. And yet, independently of which view of the structure of knowledge we favor and thus which specific model of justification we prefer, at least some principles of inferential justification can be considered to be shared by all these models. Indeed, independently of whether we think that our beliefs are structured “like a building that rests upon a foundation” or “like a web where the strength of any given area depends on the strength of the surrounding areas” (Steup 2018), we always presuppose that beliefs form a congruent structure and that their relationships are somehow explanatory. We will say something more on this last factor below, but—since we will consider scientific theory as a benchmark to assess misconceptions—we will mainly just assume that the inferential relationships presupposed by scientific theories are explanatory, i.e. that they form part of an appropriate explanation.

Even though this remains implicit in his paper, Reichenbach’s analysis is inspired by a specific model of explanation, namely Hempel and Oppenheim’s (1948) Deductive-Nomological Model. However, his description is general enough to also be compatible with other positions on what an explanation consists of. The Deductive-Nomological Model of explanation does not explicitly rely on the notion of causation, but many advocates of this model argue that it still captures the causal component of explanations, since “all causal claims imply the existence of some corresponding regularity (a “law”) linking cause to effect” (Woodward 2017; cf. this article also for a brief discussion of the main models of explanation that are currently under debate).

Reichenbach suggests that—when people do not have the means to develop an actual explanation—they try to account for phenomena by analogy with something else they understand better: since human experience is something everybody has firsthand knowledge of, people usually resort to analogies with human experience. Reichenbach’s intuition on this is confirmed by a number of contemporary empirical studies showing that people with a poor understanding of the physical world have a strong tendency to anthropomorphize. They tend to explain physical phenomena using the same principles they would use to explain the behavior of human agents, and thus they project human-like characteristics onto non-human things (Epley et al. 2007; Willard and Norenzayan 2013; Lindeman and Svedholm-Häkkinen 2016). And yet the opposite also occurs, even if more rarely: people who exhibit a poor knowledge of the human mind and of social dynamics, but have a better comprehension of mechanisms and physical systems, tend to interpret human phenomena according to non-human but better known mechanical principles (Lindeman and Svedholm-Häkkinen 2016).

Aarnio, K., & Lindeman, M. (2005). Paranormal beliefs, education, and thinking styles. Personality and Individual Differences, 39 (7), 1227–1236. https://doi.org/10.1016/j.paid.2005.04.009 .

Bailin, S. (2002). Critical thinking and science education. Science and Education, 11 (4), 361–375. https://doi.org/10.1023/A:1016042608621 .

Bailin, S., Case, R., Coombs, J. R., & Daniels, L. B. (1999a). Conceptualizing critical thinking. Journal of Curriculum Studies, 31 (3), 285–302. https://doi.org/10.1080/002202799183133 .

Bailin, S., Case, R., Coombs, J. R., & Daniels, L. B. (1999b). Common misconceptions of critical thinking. Journal of Curriculum Studies, 31 (3), 269–283. https://doi.org/10.1080/002202799183124 .

Bannink, F. P. (2007). Solution-focused brief therapy. Journal of Contemporary Psychotherapy, 37 (2), 87–94. https://doi.org/10.1007/s10879-006-9040-y .

Bensley, D. A., & Lilienfeld, S. O. (2015). What is a psychological misconception? Moving toward an empirical answer. Teaching of Psychology, 42 (4), 282–292. https://doi.org/10.1177/0098628315603059 .

Bensley, D. A., Lilienfeld, S. O., & Powell, L. A. (2014). A new measure of psychological misconceptions: Relations with academic background, critical thinking, and acceptance of paranormal and pseudoscientific claims. Learning and Individual Differences, 36, 9–18. https://doi.org/10.1016/j.lindif.2014.07.009 .

BonJour, L. (1985). The structure of empirical knowledge . Cambridge: Harvard University Press.

Brosnan, M., Ashwin, C., & Lewton, M. (2017). Brief report: Intuitive and reflective reasoning in autism spectrum disorder. Journal of Autism and Developmental Disorders, 47 (8), 2595–2601. https://doi.org/10.1007/s10803-017-3131-3 .

Brosnan, M., Lewton, M., & Ashwin, K. (2016). Reasoning on the autism spectrum: A dual process theory account. Journal of Autism and Developmental Disorders, 46, 2115–2125.

Browne, N. M., & Keeley, S. M. (2007). Asking the right questions . Upper Saddle River: Prentice-Hall.

Burke, B. L., Sears, S. R., Kraus, S., & Roberts-Cady, S. (2014). Critical analysis: A comparison of critical thinking changes in psychology and philosophy classes. Teaching of Psychology, 41 (1), 28–36. https://doi.org/10.1177/0098628313514175 .

Conception. (2011). In Merriam-Webster.com . Retrieved August 17, 2019, from https://www.merriam-webster.com/dictionary/conception .

Cottrell, S. (2005). Critical thinking skills. Developing effective analysis and argument . New York: Palgrave.

Davidson, D. (1982/2004). Paradoxes of irrationality. In D. Davidson (Ed.), Problems of irrationality (pp. 169–187). Oxford/New York: Oxford University Press.

Davidson, D. (1985/2004). Incoherence and irrationality. In D. Davidson (Ed.), Problems of irrationality (pp. 189–198). Oxford/New York: Oxford University Press.

Davidson, D. (1986/2004). Deception and division. In D. Davidson (Ed.), Problems of irrationality (pp. 199–212). Oxford/New York: Oxford University Press.

De Martino, B., Harrison, N. A., Knafo, S., Bird, G., & Dolan, R. J. (2008). Explaining enhanced logical consistency during decision making in autism. Journal of Neuroscience, 28 (42), 10746–10750. https://doi.org/10.1523/JNEUROSCI.2895-08.2008 .

De Rosa, R., & Lepore, E. (2004). Quine’s meaning holisms. In R. F. Gibson (Ed.), The Cambridge companion to Quine (pp. 65–90). Cambridge: Cambridge University Press.

Dewey, J. (1910). How we think . Boston/New York/Chicago: D.C. Heath.

Dewey, J. (1933). How we think: A restatement of the relation of reflective thinking to the educative process . Lexington: D.C. Heath.

di Sessa, A. A. (2006). A history of conceptual change research: Threads and fault lines. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 88–108). Cambridge: Cambridge University Press.

Elgin, C. (2005). Non-foundationalist epistemology: Holism, coherence, and tenability. In M. Steup & E. Sosa (Eds.), Contemporary debates in epistemology (pp. 156–167). New York/London: Blackwell.

Ennis, R. H. (1985). A logical basis for measuring critical thinking skills. Educational Leadership, 43 (2), 44–48.

Epley, N., Waytz, A., & Cacioppo, J. T. (2007). On seeing human: A three-factor theory of anthropomorphism. Psychological Review, 114 (4), 864–886.

Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction . Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.

Festinger, L. (1957). A theory of cognitive dissonance . Stanford: Stanford University Press.

Firestein, S. (2012). Ignorance: How it drives science . Oxford: Oxford University Press.

Fischer, K. M. (1983). Amino acids and translations: A misconception in biology. In H. Helm & J. D. Nowak (Eds.), Proceedings of the international seminar on misconceptions in science and mathematics (pp. 407–419). Ithaca: Cornell University Press.

Fodor, J. A., & Lepore, E. (1992). Holism: A shopper’s guide . Oxford: Blackwell.

Furnham, A., & Hughes, D. J. (2014). Myths and misconceptions in popular psychology: Comparing psychology students and the general public. Teaching of Psychology, 41 (3), 256–261.

Gardner, R., & Brown, D. L. (2013). A test of contemporary misconceptions in psychology. Learning and Individual Differences, 24, 211–215. https://doi.org/10.1016/j.lindif.2012.12.008 .

Garnett, P. J., & Treagust, D. F. (1992a). Conceptual difficulties experienced by senior high school students of electrochemistry: Electric circuits and oxidation-reduction equations. Journal of Research in Science and Teaching, 29 (2), 121–142.

Garnett, P. J., & Treagust, D. F. (1992b). Conceptual difficulties experienced by senior high school students of electrochemistry: Electrochemical (galvanic) and electrolytic cells. Journal of Research in Science and Teaching, 29 (10), 1079–1099.

Gil, F. (2000). La conviction . Paris: Flammarion.

Gilbert, J. K., & Watts, D. M. (2008). Concepts, misconceptions and alternative conceptions: Changing perspective in science education. Studies in Science Education, 10 (1), 61–98.

Gingerich, W. J., & Eisengart, S. (2004). Solution-focused brief therapy: A review of the outcome research. Family Process, 39 (4), 477–498. https://doi.org/10.1111/j.1545-5300.2000.39408.x .

Goris, T. & Dyrenfurth, M. (2010). Students’ misconception in science, technology, and engineering. In ASEE Illinois/Indiana section conference . Retrieved September 10, 2019 from http://ilin.asee.org/Conference2012/Papers/Goris.pdf .

Govier, T. (1989). Critical thinking as argument analysis? Argumentation, 3 (2), 115–126. https://doi.org/10.1007/BF00128143 .

Govier, T. (2010). A practical study of argument . Cengage: Wadsworth.

Gregory, T. R. (2009). Understanding natural selection: Essential concepts and common misconceptions. Evolution: Education and Outreach, 2 (2), 156–175.

Guzzetti, B. J. (2000). Learning counter-intuitive science concepts: What have we learned from over a decade of research? Reading and Writing Quarterly, 16 (2), 89–98.

Halpern, D. F. (2014). Thought and knowledge. An introduction to critical thinking . New York: Psychology Press.

Hare, W. (1979). Open-mindedness and education . Kingston: McGill-Queen’s University Press.

Hare, W. (2001). Bertrand Russell and the ideal of critical receptiveness. Skeptical Inquirer, 25 (3), 40–44.

Harman, G. (1993). Meaning holism defended. In J. A. Fodor & E. Lepore (Eds.), Holism: A consumers update (pp. 163–171). Amsterdam: Rodopi.

Hempel, C., & Oppenheim, P. (1948). Studies in the logic of explanation. Philosophy of Science, 15, 135–175.

Herron, J. D. (1990). Research in chemical education: Results and directions. In M. Gardner, J. G. Greeno, F. Reif, A. H. Schoenfaled, A. A. di Sessa, & E. Stage (Eds.), Toward a scientific practice of science education (pp. 31–54). Hillsdale: Erlbaum.

Hitchcock, D. (2017). On reasoning and argument: Essays in informal logic and on critical thinking . Dordrecht: Springer. https://doi.org/10.1007/978-3-319-53562-3_30 .

Hitchcock, D. (2018a). Critical thinking. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy . Retrieved September 10, 2019, from https://plato.stanford.edu/archives/fall2018/entries/critical-thinking/ .

Hitchcock, D. (2018b). Assessment. Supplement to critical thinking. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy . Retrieved September 10, 2019, from https://plato.stanford.edu/entries/critical-thinking/assessment.html .

Irwin, H. J. (2009). The psychology of paranormal belief. A researcher’s handbook . Hatfield: University of Hertfordshire Press.

Kahane, H. (1989). The proper subject matter for critical thinking courses. Argumentation, 3 (2), 141–147.

Kahneman, D. (2011). Thinking, fast and slow . New York: Farrar, Strauss & Giroux.

Kendeou, P., & van den Broek, P. (2005). The effects of readers’ misconceptions on comprehension of scientific text. Journal of Educational Psychology, 97 (2), 235–245. https://doi.org/10.1037/0022-0663.97.2.235 .

Kikas, E. (2004). Teachers’ conceptions and misconceptions concerning three natural phenomena. Journal of Research in Science Education, 41 (5), 432–448.

Kim, J. S. (2008). Examining the effectiveness of solution-focused brief therapy: A meta-analysis. Research on Social Work Practice, 18 (2), 49–64. https://doi.org/10.1177/1049731507307807 .

Kirby, G. (2018). Wacky and wonderful misconceptions about our universe . Berlin/Heidelberg: Springer.

Kowalski, P., & Taylor, A. (2009). The effect of refuting misconceptions in the introductory psychology class. Teaching of Psychology, 36 (3), 153–159.

Kuczmann, I. (2017). The structure of knowledge and students’ misconceptions in physics. AIP Conference Proceedings, 1916, 050001. https://doi.org/10.1063/1.5017454 .

Le Morvan, P. (2011). On ignorance: A reply to Peels. Philosophia, 39 (2), 335–344.

Le Morvan, P. (2012). On ignorance: A vindication of the standard view. Philosophia, 40 (2), 379–393.

Le Morvan, P. (2013). Why the standard view of ignorance prevails. Philosophia, 41 (1), 239–256.

Le Morvan, P., & Peels, R. (2016). The nature of ignorance: Two views. In R. Peels & M. Blaauw (Eds.), The epistemic dimensions of ignorance (pp. 12–32). Cambridge: Cambridge University Press. https://doi.org/10.1017/9780511820076.002 .

Lindeman, M., & Aarnio, K. (2007). Superstitious, magical, and paranormal beliefs: An integrative model. Journal of Research in Personality, 41 (4), 731–744.

Lindeman, M., & Svedholm-Häkkinen, A. M. (2016). Does poor understanding of physical world predict religious and paranormal beliefs? Applied Cognitive Psychology, 30 (5), 736–742. https://doi.org/10.1002/acp.3248 .

Manza, L., Hilperts, K., Hindley, L., Marco, C., Santana, A., & Vosburgh Hawk, M. (2010). Exposure to science is not enough: The influence of classroom experiences on belief in paranormal phenomena. Teaching of Psychology, 37 (3), 165–171.

Maynes, J. (2015). Critical thinking and cognitive bias. Informal Logic, 35 (2), 183–203.

McLean, C. P., & Miller, N. A. (2010). Changes in critical thinking skills following a course on science and pseudoscience: A quasi-experimental study. Teaching of Psychology, 37 (2), 85–90.

Nottelmann, N. (2016). The varieties of ignorance. In R. Peels & M. Blaauw (Eds.), The epistemic dimensions of ignorance (pp. 33–56). Cambridge: Cambridge University Press. https://doi.org/10.1017/9780511820076.003 .

Özmen, H. (2004). Some student misconceptions in chemistry: A literature review of chemical bonding. Journal of Science Education and Technology, 13 (2), 147–159. https://doi.org/10.1023/B:JOST.0000031255.92943.6d .

Peels, R. (2010). What is ignorance? Philosophia, 38 (1), 57–67.

Peels, R. (2011). Ignorance is lack of true belief: A rejoinder to Le Morvan. Philosophia, 39 (2), 345–355.

Peels, R. (2012). The new view on ignorance undefeated. Philosophia, 40 (4), 741–750.

Pennycook, G., Cheyne, J. A., Seli, P., Koehler, D. J., & Fugelsang, J. A. (2012). Analytic cognitive style predicts religious and paranormal belief. Cognition, 123 (3), 335–346. https://doi.org/10.1016/j.cognition.2012.03.003 .

Posner, G., Strike, K., Hewson, P., & Gertzog, W. (1982). Accommodation of a scientific conception: Toward a theory of conceptual change. Science Education, 66 (2), 211–227.

Potvin, P., & Cyr, G. (2017). Toward a durable prevalence of scientific concept: Tracking the effects of two interfering misconceptions about buoyancy from preschoolers to teachers. Journal of Research in Science Teaching, 54 (9), 1121–1142.

Pressman, M. R. (2011). Common misconceptions about sleepwalking and other parasomnias. Sleep Medicine Clinics, 6 (4), 13–17.

Quine, W. V. O. (1951). Two dogmas of empiricism. Philosophical Review, 60, 20–43.

Rainbolt, G. W., & Dwyer, S. L. (2012). Critical thinking. The art of argument . Boston: Wadsworth.

Reichenbach, H. (1951/1968). The rise of scientific philosophy . Berkeley/Los Angeles: University of California Press.

Russell, B. (1960). Our knowledge of the external world . New York: Mentor.

Sanger, M. J., & Greenbowe, T. J. (1997). Common students’ misconceptions in electrochemistry: Galvanic, electrolytic, and concentration cells. Journal of Research in Science Teaching, 34 (4), 377–398.

Siegel, H. (1989). Epistemology, critical thinking, and critical thinking pedagogy. Argumentation, 3 (2), 127–140.

Siegel, H. (2009). Open-mindedness, critical thinking, and indoctrination: Hommage to William Hare. Paideusis, 18 (1), 26–34.

Simpson, W. D., & Marek, E. A. (1988). Understandings and misconceptions of biology concepts held by students attending small high schools and students attending large high schools. Journal of Research in Science Teaching, 25 (5), 361–364.

Smith, J. P., di Sessa, A. A., & Roschelle, J. (1994). A constructivist analysis of knowledge in transition. Journal of the Learning Science, 3 (2), 115–163.

Smithson, M. (1989). Ignorance and uncertainty. Emerging paradigms . New York/Berlin: Springer.

Stark, E. (2012). Enhancing and assessing critical thinking in a psychological research methods course. Teaching of Psychology, 39 (2), 107–112. https://doi.org/10.1177/0098628312437725 .

Stein, M., Larrabbee, T. G., & Barman, C. R. (2008). A study of common beliefs and misconceptions in physical science. Journal of Elementary Science Education, 20 (2), 1–11.

Steup, M. (2018). Epistemology. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy . Retrieved September 10, 2019, from https://plato.stanford.edu/archives/win2018/entries/epistemology/ .

Sumner, W. G. (1906). Folkways. A study of the sociological importance of usage, manners, customs, mores, and morals . Boston: Ginn.

Taylor, A., & Kowalski, P. (2004). Naive psychological science: The prevalence, strength and sources of misconceptions. Psychological Record, 54 (1), 15–25.

Taylor, A. K., & Kowalski, P. (2012). Students’ misconceptions in psychology: How you ask matters… sometimes. Journal of the Scholarship of Teaching and Learning, 12 (3), 62–72.

Taylor, A. K., & Kowalski, P. (2014). Student misconceptions: Where do they come from and what can we do? In V. A. Benassi, C. E. Overson, & C. M. Hakala (Eds.), Applying science of learning in education: Infusing psychological science into the curriculum (pp. 259–273). Washington: Society for the Teaching of Psychology.

Thagard, P. (2000). Coherence in thought and action . Cambridge: MIT Press.

Thagard, P. (2007). Coherence, truth, and the development of scientific knowledge. Philosophy of Science, 74 (1), 28–47.

Todd, C. (2018). Fitting feelings and elegant proofs: On the psychology of aesthetic evaluation in mathematics. Philosophia Mathematica, 26 (2), 211–233. https://doi.org/10.1093/philmat/nkx007 .

Tversky, A., & Kahneman, D. (1983). Extension versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90 (4), 293–315. https://doi.org/10.1037/0033-295X.90.4.293 .

van Cleve, J. (2005). Why coherence is not enough: A defense of moderate foundationalism. In M. Steup & E. Sosa (Eds.), Contemporary debates in epistemology (pp. 168–180). New York/London: Blackwell.

van Quine, W., & Ullian, J. S. (1970). The web of belief . New York: Random House.

Willard, A. K., & Norenzayan, A. (2013). Cognitive biases explain religious belief, paranormal belief, and belief in life’s purpose. Cognition, 129 (2), 379–391. https://doi.org/10.1016/j.cognition.2013.07.016 .

Wilson, J. A. (2018). Reducing pseudoscientific and paranormal beliefs in university students through a course in science and critical thinking. Science and Education, 27 (1–2), 183–210. https://doi.org/10.1007/s11191-018-9956-0 .

Woodward, J. (2017). Scientific explanation. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy . Retrieved September 10, 2019, from https://plato.stanford.edu/archives/fall2017/entries/scientific-explanation/ .

Wynn, L. L., Foster, A. M., & Trussell, J. (2009). Can I get pregnant from oral sex? Sexual health misconceptions in e-mails to a reproductive health website. Contraception, 79 (2), 91–97.

Zohar, A., Weinberger, Y., & Tamir, P. (1994). The effect of the biology critical thinking project on the development of critical thinking. Journal of Research in Science Teaching, 31 (1), 183–196. https://doi.org/10.1002/tea.3660310208 .

Download references


Dellantonio, S., Pastore, L. Ignorance, misconceptions and critical thinking. Synthese 198 , 7473–7501 (2021). https://doi.org/10.1007/s11229-019-02529-7

Received: 24 March 2019 | Accepted: 25 December 2019 | Published: 07 January 2020 | Issue Date: August 2021

Therapists & Psychologists in Fort Lauderdale, Florida


Avoidance: The Band-Aid Solution to Long-Term Problems

By Christina Smith, LMHC

We are all guilty of avoiding things that we don’t want to do from time to time. We may let laundry pile up, avoid confronting a friend over something they did that upset us, or wait until the last minute to start a project for work or school. 

It’s easy to avoid things, especially when there are so many things we would rather be doing with our time. But for those experiencing more serious mental health issues, such as anxiety and depression, avoidance can worsen their symptoms.

What is avoidance behavior?

Avoidance is a maladaptive coping skill that offers the mind an escape from uncomfortable thoughts, feelings, and/or experiences. It may seem as though avoiding discomfort would be helpful; in reality, it means the actual issue is never addressed. In fact, avoidance may create a cycle of behavior that exacerbates feelings of anxiety and depression, making it much harder to problem solve, cope, and heal.

For example, someone who feels depressed might find it hard to get out of bed in the morning and may avoid daily responsibilities that seem stressful. They might stay in bed until noon, miss breakfast, avoid paying bills, skip the gym, etc. When they do finally get up, they have lower energy and less time to take care of responsibilities. The lack of energy and time will most likely result in more negative thoughts and feelings. Then, they may engage in more avoidant behavior and ultimately perpetuate the cycle of depression. 

Here are a few other examples of avoidance:

Someone might avoid triggers such as people, places, and things that may incite uncomfortable feelings. Those dealing with social anxiety, for example, might avoid crowds of people or hanging out with a group of friends. Avoiding these situations may spare them from uncomfortable feelings, but will also prevent them from learning effective coping skills to deal with difficult social situations in the future.

Another example might be someone experiencing relationship issues. Given that most of us do not enjoy conflict, it’s easy to find ways to avoid confronting an issue, particularly when it involves someone in our life that we care about. Individuals facing marital issues may divert attention from the issue by changing the subject when it comes up, become passive-aggressive toward their partner, or withdraw from them completely. This pattern of avoidance is sometimes referred to as ‘conflict avoidance’. When an underlying issue is never addressed, it can be buried under newer issues and become even more difficult to resolve.

“Avoidance is the best short-term strategy to escape conflict, and the best long-term strategy to ensure suffering.” – Brendon Burchard

It is also common for some people to avoid negative feelings by regularly engaging in “numbing” behaviors. This may come in the form of drinking more often or more heavily, over-eating, over-exercising, and so on; anything that might replace an uncomfortable feeling. It is important to note that these behaviors are only temporary fixes. They may keep the feelings at bay momentarily, but as soon as the numbing behavior stops, the feelings rush back, and solutions continue to evade our grasp.

The one thing we may want to avoid is avoidance itself.

How to stop avoiding:

  • Recognize and understand that you’re doing it. Become mindful of your behavior patterns and how you might be avoiding negative feelings or situations in your life. It can be useful to keep a journal or a log of your thoughts, feelings, and behaviors to more easily identify such patterns.
  • Practice effective stress relief. Learn relaxation skills such as meditation, deep breathing, yoga, journaling, or art to combat stress. It’s important to find techniques that work for you. Exercising regularly and keeping a balanced diet can also help reduce stress.
  • Remind yourself that it’s OK to feel uncomfortable and to have negative thoughts and feelings. They will pass. When we allow ourselves to feel our feelings, we can finally start the process of healing.
  • Get support. Family and friends are sometimes the best sources of support, but it can also be helpful to find other sources, such as a therapist or support group, that can provide different perspectives.

Many times, ending the cycle of avoidance is a longer process than we may imagine. It might not be as simple as facing our fears and moving on. 

Working with a therapist and taking small steps to learn about avoidance and how it is affecting you can be a positive step in overcoming issues like depression and anxiety. 

Copyright © 2019-2023 The Psychology Group Fort Lauderdale, LLC, all rights reserved.

Bad Critical Thinking Examples: 14 Tips for Better Decisions

Critical thinking is the ability to analyze and evaluate information logically and without bias. It is a skill that can help us make better decisions, solve problems, and avoid fallacies. However, not everyone has strong critical thinking skills, and some people have bad critical thinking habits that actively hinder their reasoning. Life is full of opportunities and challenges, and you can choose to meet them as a careful reasoner rather than a poor one.


Sanju Pradeepa


Thinking critically is essential for success in life, yet it’s not always easy. That’s why so many of us fall back on less effective thinking patterns. But what exactly are these poor critical thinking examples?

In this article, we’ll outline some of the most common examples of poor critical thinking and explain why they don’t hold up to rigorous scrutiny. We’ll also share an actionable approach to help you start thinking more critically and be more successful in your daily life.

So, whether you’re struggling with the idea of critical thinking or have already embraced the concept, read on to find out what bad critical thinking looks like, and learn some useful tips and insights that can help you become a better thinker.


What Is Critical Thinking

Critical thinking is the process of actively and skillfully gathering, evaluating, and discussing information to reach an informed decision or conclusion. It involves analyzing evidence, addressing different perspectives, making connections between ideas, and creating arguments and conclusions.

However, there are some common pitfalls to critical thinking that can prevent you from arriving at the best conclusions.

For example: using overly simplistic solutions for complex problems, basing decisions solely on emotions rather than facts and logical reasoning, making assumptions without considering the evidence, or not questioning your own biases.

Ultimately, poor critical thinking skills can lead to rushed decisions that do not fully consider all of the available information or evidence.

As a result, it is important to practice critical thinking regularly to develop strong analytical skills that are necessary for any decision-making process.



14 Bad Critical Thinking Examples

These examples highlight just how easy it is to get caught up in our own biases rather than relying on facts and data to reach conclusions. So, if you want to practice better critical thinking skills, try to identify these traps before they ensnare your reasoning skills.

1. Substituting Emotion for Reason

Have you ever found yourself substituting emotion for reason? It’s a common mistake. But what does that mean exactly?

It is when you make important decisions based purely on your feelings, rather than on facts and logic. Your emotions may tell you something’s wrong, but if you don’t use facts to back up your feelings, chances are you’re making an ill-informed decision.

One example of substituting emotion for reason is forming an opinion about someone else’s choices without considering their point of view. In this situation, you may have a strongly held belief about how certain scenarios “should” turn out, but if you don’t consider the other person’s reason for making their decision, you could be missing key information.

This might look like deciding something before considering all of the evidence or relying on assumptions instead of facts.

Tip – The best way to avoid substituting emotion for reason is to take a step back and ask yourself, “Am I looking at the whole picture?” If not, use critical thinking skills like fact-checking or asking questions to gain a more complete understanding of the situation.

With an open mind and an objective approach, it’s possible to make decisions based on facts instead of feelings.

2. Jumping to Conclusions Without Evaluating Evidence

It’s easy to come to a conclusion based on assumptions rather than facts and reasoning.

Let’s say someone told you that Bob always arrives late for work. Without looking into the evidence, you might assume he’s lazy and not a team player. But after looking at the evidence, for example, if Bob was in a car accident or his commute was particularly long that week, you’d realize there were other factors at play here.

So what can we learn from this? Well, it comes down to seeing things from multiple perspectives and understanding that there may be more than one explanation for something.

Tip – Before concluding, it’s important to evaluate the surrounding evidence and consider all possible factors influencing the situation. This way, you can get closer to the truth rather than making assumptions based on an incomplete picture.

3. Ignoring Information and Facts

You might not know this, but one of the most common examples is ignoring information and facts. It’s like you have blinders on and you’re determined to stick to your own opinion no matter what, even though there is evidence that contradicts it.

Ignoring important information and facts is bad news because facts are the foundation of critical thinking. If you don’t consider relevant data when reaching a conclusion, then you can’t have an accurate opinion; it will just be based on assumptions and your own biases. 

Tip – So, here are some things to look out for when it comes to avoiding this mistake:

  • Don’t dismiss facts just because they are presented in an unfamiliar way, or because the information seems overwhelming or confusing at first glance.
  • Consider all angles of a situation before forming an opinion or an argument.
  • Don’t take things at face value; check sources, view multiple perspectives, and try to find reliable support for your beliefs.
  • Don’t be afraid to ask questions or to seek out people with different perspectives to better understand the issue at hand.

“The unexamined life is not worth living.” – Socrates

4. Not Considering Other Perspectives

Another thing to consider when it comes to poor critical thinking is failing to consider other perspectives. After all, if you’re not taking the time to look at things from different angles, then you are limiting yourself.

For example, say you’re trying to come up with solutions for a problem at work. You might think that the first solution you come up with is the best one. But if you take the time to really brainstorm and consider other perspectives and options, then you might find a better solution.

Assumptions: When dissecting an issue, it’s easy to assume we already know what other people think and why. Not considering other perspectives often results in making false assumptions about someone else’s ideas or beliefs, and these assumptions can lead to miscommunication or worse.

Ignorance: Not listening and attempting to understand other perspectives can also lead to ignorance. Rather than trying hard to understand how someone else thinks or believes, we can stick with our narrow view of the world and be closed off from new information and experiences that could benefit us in some way. 

By not taking the step of trying to engage with different points of view, we risk missing out on important insights and valuable context that can help us make more informed decisions or form a deeper understanding of an issue or situation.

Tip – So next time you’re approaching an issue or problem critically, try zooming out and looking at things from multiple angles. You’ll soon see why it pays off.

5. Overgeneralizing

This is when you take a single experience or observation and use it to draw conclusions about an entire group or situation. People might do this when they oversimplify their views instead of looking at all the evidence and weighing it up objectively first.

Let’s look at this with an example: you see a local give a tourist some very helpful directions. You might conclude that everyone in that town is friendly, but this isn’t necessarily true; one helpful person doesn’t give you enough evidence to make such a broad assumption.

Tip – The next time you think of making a snap judgment call about something, consider taking the time to look at all sides of the story first before coming to any conclusions.
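Underneath, overgeneralizing is a sampling error: tiny samples produce wildly unreliable pictures of the whole group. As a toy simulation of my own (not from the article; the 30%-friendly population is an invented assumption), here is how often a small sample happens to look uniformly “friendly” and tempts a sweeping conclusion:

```python
import random

random.seed(0)  # reproducible toy run

# Invented toy population: only 30 of 100 people are "friendly".
population = [True] * 30 + [False] * 70

def misleading_rate(sample_size, trials=10_000):
    """Estimate how often a random sample is ALL friendly people,
    i.e. how often it would suggest 'everyone here is friendly'."""
    hits = sum(
        all(random.sample(population, sample_size))
        for _ in range(trials)
    )
    return hits / trials

# A single encounter looks uniformly friendly about 30% of the time;
# with five encounters, that trap almost never springs.
print(misleading_rate(1))
print(misleading_rate(5))
```

The point of the sketch is only the gap between the two numbers: one observation “confirms” the generalization almost a third of the time, while even a handful of observations makes the mistake rare.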

6. Neglecting Creative Thinking

Poor critical thinking examples can also include neglecting to come up with creative ideas or solutions. It’s great to be able to analyze a situation and make decisions based on facts, but sometimes it’s just as important to challenge the status quo and come up with new, innovative solutions.

Creative thinking is about being able to blend facts and existing ideas together in a way that produces something new, is more effective, or increases efficiency. Neglecting this form of critical thinking can lead to missed opportunities as well as poor decision-making.

Creative thinking also helps you assess a situation from different angles and perspectives. It’s not enough to just identify the problem; you have to find a solution that works best for everyone involved.

Tip – Here are some ways you can start incorporating creative thinking into your problem-solving:

  • Brainstorming: Give yourself time to think of different approaches and techniques.
  • Incorporating input from others: Get feedback from others who are knowledgeable about the issue.
  • Identifying trends: Look for patterns that could inform your approach.
  • Re-framing the problem: Ask yourself if there is another way you could interpret the issue.
  • Experimenting: Try different methods until you find one that works best.


7. Listening to Biased Sources

When it comes to critical thinking, one of the biggest mistakes you can make is listening to biased sources: sources that already have an agenda concerning your opinion. They want you to believe a certain way, and they’re going to do whatever they can to make that happen.

This kind of source will often present “facts” in a way that skews your perspective, and even if they’re telling the truth, they’ll leave out important context. This is why it’s always important to find multiple sources and look at the information objectively.

Here are some red flags that might tip you off when someone is being biased:

  • They use inflammatory language or negative stereotypes.
  • They cherry-pick data and leave out essential context.
  • Their arguments rely more on personal attacks than on facts.
  • They bring up irrelevant topics just to distract from the facts.

Tip – If you start seeing any of these signs, it’s time to take a step back, recognize what’s happening, and look for more reliable sources instead. Remember: critical thinking requires an open mind and plenty of research.

8. Confirmation Bias: Seeking Data That Confirms Your Own Beliefs

Another critical thinking error is confirmation bias. This typically happens when you’re trying to prove a point you already believe in and only seek out or interpret data that confirms what you already think.

For example, if you happen to be a firm believer in the benefits of the keto diet but are presented with evidence that shows the diet is bad for your health, chances are your confirmation bias will kick in. You’ll ignore or discount the evidence and instead look for data that confirms your own beliefs.

Tip – Confirmation bias can even involve a deliberate attempt to ignore contradictory information. The result is that you create a false impression that all the available information supports your beliefs. So be aware of this at every step when making decisions, and act accordingly.
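The keto example can be made concrete with a toy sketch of my own (the numbers are invented for illustration): if you filter the evidence to keep only what agrees with a prior belief, the filtered “data” will confirm it no matter what the full record says.

```python
# Toy record of mixed evidence about a belief:
# +1 = an observation that supports it, -1 = one that contradicts it.
evidence = [+1, -1, -1, +1, -1, -1, -1, +1]

def verdict(items):
    """Average the evidence; a positive score reads as 'belief supported'."""
    return sum(items) / len(items)

honest = verdict(evidence)                        # weigh everything
biased = verdict([e for e in evidence if e > 0])  # keep only confirmations

print(honest)  # -0.25: the full record actually leans against the belief
print(biased)  # 1.0: the filtered record appears to 'prove' it
```

The filtering step is the whole fallacy: by the time contradictory observations are discarded, the conclusion is fixed in advance.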

9. False Dilemma

This is when someone presents two options as if there were no other alternatives, but usually there are more options than just those two.

For example, say a member of your team tells you that you have to either choose her idea or put the project on hold. This is a false dilemma; in reality, the project could move forward using some combination of ideas from all members of the team.

Tip – False dilemmas are sometimes used to manipulate people into making decisions they wouldn’t otherwise make. If someone ever presents you with only two options and claims that it’s an either/or situation, be sure to stop and think critically about whether or not there are truly any other possibilities.

10. Straw Man Argument

A straw man argument is a logical fallacy that involves misrepresenting or distorting an opponent’s position to make it easier to refute. It’s a common tactic used to make an argument look stronger than it is.

It’s important to remember that using straw man arguments does not help support an argument; rather, it detracts from its validity and damages its credibility. 

Tip – The best way to win an argument is to focus on the facts and present well-thought-out evidence in support of your positions rather than resorting to logical fallacies like the straw man argument.

11. Slippery Slope Fallacy

The slippery slope fallacy is a particularly dangerous one to make in critical thinking. It’s an attempt to predict a seemingly inevitable outcome from a supposed “first step.” When you use this fallacy, you might find yourself saying something like, “If A happens, then it’s only a matter of time before Z happens; obviously, A must be prevented.”

This type of reasoning is usually flawed because it ignores reality. For example, if someone argues that legalizing marijuana will inevitably lead to addiction and economic ruin, they are ignoring the fact that there are several factors at play, not just the legalization.

Tip – It’s important to avoid this fallacy when critically examining an argument because it can often lead people astray or cloud their judgment. Make sure to check your premises carefully and ask yourself if what you’re saying is based on reality or just speculation.

12. Expecting Perfection or the Impossible

A common mistake people make when trying to think critically is expecting perfection or the impossible. This occurs when a person outlines a goal that is either unattainable or not completely realistic.

For example, someone might set out to solve a complicated problem in one day, even though it requires time and effort to build up the skills or resources needed to get the job done. Instead of setting themselves up for failure from the beginning, they should break down the problem into smaller, more attainable tasks.

It’s important to recognize that critical thinking isn’t about being perfect; it’s about understanding your limitations and working within them to come up with creative solutions.

Letting go of expectations that are unrealistic or unattainable will help you become a better problem solver and critical thinker.

Life is a precious gift that we should cherish and appreciate

13. Misinterpreting Data and Statistics

When it comes to critical thinking, data and statistics supposedly don't lie. Or do they? Unfortunately, people often misinterpret data and statistics, which can be a major critical thinking mistake.

Take the example of a study that claims eating pizza is healthier than eating chicken. Sure, that could be true based on this particular study. But without looking further into the details, such as the sample size or how the study was designed, you can't form an accurate opinion.

Drawing Conclusions Too Quickly: It’s important to analyze background information and other data points in order to draw more meaningful conclusions. Without looking at the complete picture, you could come away with a conclusion based solely on surface-level information that just isn’t accurate.

Drawing the Wrong Conclusions: Critical thinking is key here. While one might conclude that pizza is healthier than chicken from the first example, it's possible that there were elements of bias in the study, so a well-rounded review of the evidence is essential before accepting the conclusion.

Tip – Do not jump to conclusions or accept claims without evidence. Look for patterns, trends and outliers in the data. Ask questions and seek explanations. Evaluate the arguments and evidence from different perspectives.
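To see why details like sample size matter, here's a small, hypothetical Python simulation. It estimates a known 50% event rate from tiny samples and from large ones; the tiny "studies" routinely land far from the truth purely by chance, while the large ones stay close:

```python
import random

random.seed(42)  # fixed seed so the sketch is repeatable

def estimated_rate(true_rate, sample_size):
    """Estimate an event's rate from a random sample of the given size."""
    hits = sum(1 for _ in range(sample_size) if random.random() < true_rate)
    return hits / sample_size

true_rate = 0.5  # the real underlying rate

# Five tiny studies of 10 participants each
small_studies = [estimated_rate(true_rate, 10) for _ in range(5)]

# Five large studies of 10,000 participants each
large_studies = [estimated_rate(true_rate, 10_000) for _ in range(5)]

print("n=10 estimates:    ", small_studies)
print("n=10,000 estimates:", large_studies)
```

A headline built on one of the small samples could claim almost anything; the large samples cluster tightly around the real rate. That's exactly the background information a critical thinker checks before trusting a claim.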

14. Circular Reasoning

If you’re not familiar, this is a logical fallacy where the argument doesn’t have any actual basis or supporting evidence.

Instead, it just keeps going around in circles, with the conclusion supporting the same premise that was already established in the original statement. It’s an assumption masquerading as an argument.

So how can you recognize this fallacy when you see it? Here's a common example:

  • "People should obey the law because it is the law." This statement presumes that people should accept and obey laws simply because they exist. There is no further explanation or evidence provided as to why they exist or why they should be obeyed.

Circular reasoning provides a false sense of security. It might sound convincing at first, but when you look at it, you'll quickly see that there's no real evidence or supporting facts behind it. Critical thinkers recognize this practice for what it is: an invalid argument that's desperately trying to pass itself off as convincing logic.

Tip – To avoid circular reasoning, one should provide independent evidence or reasons to support the conclusion, and avoid restating the conclusion in different words.

Examples of Poor Reasoning

Critical thinking doesn’t always get the best press, and that’s probably because it gets abused. Poor critical thinking is littered with fallacies, confirmation biases, and leaps of logic that make it a frustrating affair.

We often see bad critical thinking in everyday life. Here are some examples:

  • Jumping to conclusions: reaching a decision too quickly without considering all of the evidence.
  • Overgeneralizing: drawing broad conclusions from a single event or data point.
  • Selective thinking: focusing on selected pieces of evidence that support your position and ignoring other information.
  • Emotional reasoning: making decisions based on how you feel rather than facts and logic.
  • Ad hominem attacks: attacking someone personally to invalidate their arguments instead of focusing on the argument itself.
  • False dilemma: assuming there are only two possible sides to an issue or two possible outcomes when in reality there are more options or scenarios.

In conclusion, examples of bad critical thinking can be found in many aspects of our lives, such as politics, media, education, and personal decisions. They can lead to faulty reasoning, biased arguments, fallacious claims, and poor judgment.

To avoid bad critical thinking, we should always question our assumptions, seek evidence, consider alternative perspectives, and evaluate the consequences of our actions. By doing so, we can improve our thinking skills and make better choices for ourselves and others.


