Accelerate Learning


Critical Thinking in Science: Fostering Scientific Reasoning Skills in Students

Thinking like a scientist is a central goal of all science curricula.

As students learn facts and methods, what matters most is that all of that learning happens through the lens of scientific reasoning.

That way, when it comes time for them to take on a little science themselves, whether in the lab or by thinking through a solution theoretically, they understand how to do it in the right context.

One component of this type of thinking is being critical. Because it's grounded in facts and evidence, critical thinking in science isn't exactly the same as critical thinking in other subjects.

Students have to doubt the information they’re given until they can prove it’s right.

They have to separate what's true from what's hearsay. It's complex, but with the right tools and plenty of practice, students can get it right.

What is critical thinking?

This particular style of thinking stands out because it requires reflection and analysis. Grounded in what's logical and rational, critical thinking is all about digging deep and going beyond the surface of a question to establish the quality of the question itself.

It ensures students put their brains to work when confronted with a question rather than taking every piece of information they’re given at face value.

It’s engaged, higher-level thinking that will serve them well in school and throughout their lives.

Why is critical thinking important?

Critical thinking is important when it comes to making good decisions.

It gives us the tools to think through a choice rather than quickly picking an option — and probably guessing wrong. Think of it as the all-important ‘why.’

Why is that true? Why is that right? Why is this the only option?

Finding answers to questions like these requires critical thinking: you have to analyze both the question itself and the possible solutions to establish their validity.

Will that choice work for me? Does this feel right based on the evidence?

How does critical thinking in science impact students?

Critical thinking is essential in science.

It’s what naturally takes students in the direction of scientific reasoning since evidence is a key component of this style of thought.

It’s not just about whether evidence is available to support a particular answer but how valid that evidence is.

It’s about whether the information the student has fits together to create a strong argument and how to use verifiable facts to get a proper response.

Critical thinking in science helps students:

  • Actively evaluate information
  • Identify bias
  • Evaluate the logic within arguments
  • Analyze evidence

4 Ways to promote critical thinking

Figuring out how to develop critical thinking skills in science means looking at multiple strategies and deciding what will work best at your school and in your class.

Depending on your student population and their needs and abilities, not every option will be a home run.

These particular examples are all based on the idea that for students to really learn how to think critically, they have to practice doing it. 

Each focuses on engaging students with science in a way that will motivate them to work independently as they hone their scientific reasoning skills.

Project-Based Learning

Project-based learning centers on critical thinking.

Teachers can shape a project around the thinking style to give students practice with evaluating evidence or other critical thinking skills.

Critical thinking also happens during collaboration, evidence-based thought, and reflection.

For example, setting students up for a research project is not only a great way to get them to think critically, but it also helps motivate them to learn.

Allow them to pick the topic (one that isn't easy to look up online), develop their own research questions, and establish a process to collect data to find an answer; this lets students personally connect to science while using critical thinking at each stage of the assignment.

They’ll have to evaluate the quality of the research they find and make evidence-based decisions.

Self-Reflection

Adding a question or two to any lab practicum or activity requiring students to pause and reflect on what they did or learned also helps them practice critical thinking.

At this point in an assignment, they’ll pause and assess independently. 

You can ask students to reflect on the conclusions they came up with for a completed activity, which really makes them think about whether there's any bias in their answer.

Addressing Assumptions

One way critical thinking aligns so perfectly with scientific reasoning is that it encourages students to challenge all assumptions. 

Evidence is king in the science classroom, but even when students work with hard facts, there comes the risk of a little assumptive thinking.

Working with students to identify assumptions in existing research, or asking them to address an issue while suspending their own judgment and simply looking at established facts, polishes that critical eye.

They're getting practice tossing out opinions, unproven hypotheses, and speculation in exchange for real data and real results, just as a scientist has to do.

Lab Activities With Trial-And-Error

Another component of critical thinking (as well as thinking like a scientist) is figuring out what to do when you get something wrong.

Backtracking can mean you have to rethink a process, redesign an experiment, or reevaluate data because the outcomes don't make sense, and that's okay.

The ability to get something wrong and recover is not only a valuable life skill, but it’s where most scientific breakthroughs start. Reminding students of this is always a valuable lesson.

Labs that include comparative activities are one way to increase critical thinking skills, especially when introducing new evidence that might cause students to change their conclusions once the lab has begun.

For example, you might provide students with two distinct data sets and ask them to compare them.

With only two choices, there's a finite number of conclusions to draw, but what happens when you bring in a third data set? Will it void certain conclusions? Will it let students draw new ones, even more deeply rooted in evidence?
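A quick sketch of this kind of lab, with invented plant-growth numbers, shows how a third data set can unseat a conclusion that two data sets seemed to support:

```python
# Illustrative sketch (hypothetical data): how new evidence can force
# students to revisit a conclusion drawn from only two data sets.
from statistics import mean

# Plant growth (cm) under two light conditions
sunlight = [12.1, 13.4, 11.8, 12.9]
shade = [8.2, 7.9, 8.8, 8.5]

# With two data sets, a tempting conclusion: more light means more growth.
assert mean(sunlight) > mean(shade)

# A third data set arrives: growth under 24-hour artificial light.
constant_light = [9.0, 8.7, 9.4, 9.1]

# The new evidence complicates the claim: constant light does not beat
# natural sunlight, so "more light means more growth" no longer holds as stated.
assert mean(sunlight) > mean(constant_light) > mean(shade)
print(mean(sunlight), mean(shade), mean(constant_light))
```

Students now have to refine the claim (perhaps toward day/night cycles) rather than simply defend it, which is exactly the revision step the lab is designed to provoke.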

Thinking like a scientist

When students get the opportunity to think critically, they’re learning to trust the data over their ‘gut,’ to approach problems systematically and make informed decisions using ‘good’ evidence.

When practiced enough, this ability will engage students in science in a whole new way, providing them with opportunities to dig deeper and learn more.

It can help enrich science and motivate students to approach the subject just like a professional would.


Topics: science, STEM, project-based learning


© 2024 Accelerate Learning  

Distance Learning

Using Technology to Develop Students’ Critical Thinking Skills

by Jessica Mansbach

What Is Critical Thinking?

Critical thinking is a higher-order cognitive skill that is indispensable to students, readying them to respond to a variety of complex problems that are sure to arise in their personal and professional lives. The cognitive skills at the foundation of critical thinking are analysis, interpretation, evaluation, explanation, inference, and self-regulation.

When students think critically, they actively engage in these processes:

  • Communication
  • Problem-solving

To create environments that engage students in these processes, instructors need to ask questions, encourage the expression of diverse opinions, and involve students in a variety of hands-on activities that force them to be involved in their learning.

Types of Critical Thinking Skills

Instructors should select activities based on the level of thinking they want students to do and on the learning objectives for the course or assignment. The chart below lists questions to ask in order to gauge whether students can demonstrate different levels of critical thinking.

*Adapted from Brown University’s Harriet W. Sheridan Center for Teaching and Learning

Using Online Tools to Teach Critical Thinking Skills

Online instructors can use technology tools to create activities that help students develop both lower-level and higher-level critical thinking skills.

  • Example: Use a Google Doc (or the Collaborations feature in Canvas) and tell students to keep a journal in which they reflect on what they are learning, describe the progress they are making in the class, and cite the course materials that have been most relevant to their progress. Students can share the Google Doc with the instructor, who can comment on their work.
  • Example: Use the peer review assignment feature in Canvas and manually or automatically form peer review groups. These groups can be anonymous or display students’ names. Tell students to give feedback to two of their peers on the first draft of a research paper. Use the rubric feature in Canvas to create a rubric for students to use. Show students the rubric along with the assignment instructions so that students know what they will be evaluated on and how to evaluate their peers.
  • Example: Use the discussions feature in Canvas and tell students to have a debate about a video they watched. Pose the debate questions in the discussion forum, and instruct students to take a side in the debate and cite course readings to support their arguments.
  • Example: Use GoReact, a tool for creating and commenting on online presentations, and tell students to design a presentation that summarizes and raises questions about a reading. Tell students to comment on the strengths and weaknesses of the author’s argument. Students can post the links to their GoReact presentations in a discussion forum or an assignment using the insert-link feature in Canvas.
  • Example: Use GoReact, a narrated PowerPoint, or a Google Doc and instruct students to tell a story that informs readers and listeners about how the course content they are learning is useful in their professional lives. In the story, tell students to offer specific examples of readings and class activities that they are finding most relevant to their professional work. Links to the GoReact presentation and Google Doc can be submitted via a discussion forum or an assignment in Canvas. The PowerPoint file can be submitted via a discussion or an assignment.
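For instructors who manage many sections, the peer-review setup above can also be scripted. The sketch below targets the assignments endpoint of the Canvas LMS REST API; the base URL, access token, and course id are placeholders, and parameter names should be checked against your institution's Canvas API documentation before use:

```python
# Sketch: scripting a Canvas peer-review assignment over the REST API.
# The base URL, token, and course id are placeholders, not real values.
import json
import urllib.parse
import urllib.request

def peer_review_payload(name, points=10, reviews_per_student=2):
    """Form fields for POST /api/v1/courses/:course_id/assignments."""
    return {
        "assignment[name]": name,
        "assignment[points_possible]": points,
        "assignment[submission_types][]": "online_upload",
        "assignment[peer_reviews]": "true",
        "assignment[automatic_peer_reviews]": "true",
        "assignment[peer_review_count]": reviews_per_student,
    }

def create_assignment(base_url, token, course_id, payload):
    """POST the payload; returns the created assignment as a dict."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/courses/{course_id}/assignments",
        data=urllib.parse.urlencode(payload).encode(),
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Build (but do not send) a payload for a two-reviewer draft exchange.
payload = peer_review_payload("Research paper: first draft", reviews_per_student=2)
print(payload["assignment[peer_review_count]"])
```

Pairing this with the rubric feature mentioned above keeps the evaluation criteria visible to students regardless of how the assignment was created.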

Pulling it All Together

Critical thinking is an invaluable skill that students need to be successful in their professional and personal lives. Instructors can be thoughtful and purposeful about creating learning objectives that promote lower and higher-level critical thinking skills, and about using technology to implement activities that support these learning objectives. Below are some additional resources about critical thinking.

Additional Resources

Carmichael, E., & Farrell, H. (2012). Evaluation of the effectiveness of online resources in developing student critical thinking: Review of literature and case study of a critical thinking online site. Journal of University Teaching and Learning Practice, 9(1), 4.

Lai, E. R. (2011). Critical thinking: A literature review. Pearson’s Research Reports, 6, 40-41.

Landers, H. (n.d.). Using peer teaching in the classroom. Retrieved from https://tilt.colostate.edu/TipsAndGuides/Tip/180

Lynch, C. L., & Wolcott, S. K. (2001). Helping your students develop critical thinking skills (IDEA Paper #37). Manhattan, KS: The IDEA Center.

Mandernach, B. J. (2006). Thinking critically about critical thinking: Integrating online tools to promote critical thinking. Insight: A Collection of Faculty Scholarship, 1, 41-50.

Yang, Y. T. C., & Wu, W. C. I. (2012). Digital storytelling for enhancing student academic achievement, critical thinking, and learning motivation: A year-long experimental study. Computers & Education, 59(2), 339-352.

Insight Assessment: Measuring Thinking Worldwide

http://www.insightassessment.com/

Michigan State University’s Office of Faculty & Organizational Development, Critical Thinking: http://fod.msu.edu/oir/critical-thinking

The Critical Thinking Community

http://www.criticalthinking.org/pages/defining-critical-thinking/766



Developing Students’ Critical Thinking Skills and Argumentation Abilities Through Augmented Reality–Based Argumentation Activities in Science Classes

Tuba Demircioglu

1 Faculty of Education, Department of Mathematics and Science Education/Elementary Science Education, Cukurova University, 01330 Saricam-Adana, Turkey

Memet Karakus

2 Department of Educational Sciences, Cukurova University, Adana, Turkey

Due to the COVID-19 pandemic and the urgent shift of classes to distance learning, directing students’ interest toward course content became challenging. A solution to this challenge emerges through creative pedagogies that integrate instructional methods with new technologies such as augmented reality (AR). Although the use of AR in science education is increasing, the integration of AR into science classes is still immature. The inability to identify misinformation during the COVID-19 pandemic has revealed the importance of developing students’ critical thinking skills and argumentation abilities. The purpose of this study was to examine the change in critical thinking skills and argumentation abilities through augmented reality–based argumentation activities in teaching astronomy content. The participants were 79 seventh-grade students from a private school. In this case study, examination of the students’ verbal arguments showed that all groups engaged in argumentation and produced quality arguments. The students’ critical thinking skills developed until the middle of the intervention, and the frequency of using critical thinking skills varied after that point. The findings highlight the role of AR-based argumentation activities in students’ critical thinking skills and argumentation in science education.

Introduction

With rapidly developing technology, the number of children using mobile handheld devices has increased drastically (Rideout et al., 2010; Squire, 2006). Technologies and digital enhancements that use the internet have become a part of the daily life of school-age children (Kennedy et al., 2008), and education evolves in line with the changing technology. Rapidly changing innovation technologies have changed the characteristics of learners in the fields of knowledge, skills, and expertise that are valuable for society, and circumstances for teachers and students have changed over time (Yuen et al., 2011). Almost every school subject incorporates technological devices into the pedagogy to different extents, but science teachers are the most eager to use technological devices in science classes because of the nature of the content they are expected to teach.

The COVID-19 pandemic has had an important impact on educational systems worldwide. Due to the fast spread of the disease, educators had to adapt their classes urgently to technology and distance learning (Dietrich et al., 2020), and schools have had to put more effort into adapting new technologies to teaching. Generation Z was born into the age of information technology, but distance courses were not created for them and they did not choose them, so they are not motivated during classes (Dietrich et al., 2020). Directing students’ interest toward course content is challenging while their interests are being changed by this technological development. The solution to this challenge emerges through creative pedagogies that integrate instructional methods with striking new technologies. Augmented reality has demonstrated high potential as part of many teaching methods.

Literature Review

Augmented Reality, Education, and Science Education

AR applications have important potential for many areas where rapid transfer of information is important. This is especially true for education, and science education is among the disciplines where rapid information transfer matters most. Taylor (1987, p. 1) stated that “the transfer of scientific and technological information to children and to the general public is as important as the search for information.” With the rapid change in science and technology and the outdating of knowledge, learning requires rapid changes in the transfer of information (Ploman, 1987). Technology provides new and innovative methods for science education and can be an effective medium for promoting students’ learning (Virata & Castro, 2019). AR technology could be a promising tool for science teaching, to which it is especially well suited (Arici et al., 2019).

Research shows that AR has great potential and benefits for learning and teaching (Yuen et al., 2011). AR applications used in teaching and learning present many objects, practices, and experiments that students cannot obtain first-hand because of impossibilities in the real world, and the approach can be applied to many science contents that are unreachable, unobservable, or impossible to travel to (Cai et al., 2013; Huang et al., 2019; Pellas et al., 2019). For example, physically unreachable phenomena such as solar systems, moon phases, and magnetic fields become accessible to learners through AR (Fleck & Simon, 2013; Kerawalla et al., 2006; Shelton & Hedley, 2002; Sin & Zaman, 2010; Yen et al., 2013). Through AR, learners can obtain instant access to location-specific information provided by a wide range of sources (Yuen et al., 2011). Location-based information, when used in particular contextual learning activities, is essential for assisting students’ outdoor learning, and this interaction develops comprehension, understanding, imagination, and retention, which are the learning and cognitive skills of learners (Chiang et al., 2014). For example, an AR-based mobile learning system was used in the study conducted by Chiang et al. (2014) on aquatic animals and plants. The location module could identify the students’ GPS location, direct them to the target ecological regions, and provide appropriate learning tasks or additional resources. When students explored various characteristics of learning objects, the camera and image-editing modules could capture an image from the real environment and let students comment on the observed objects.
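The core of such a location module is a simple proximity check. The sketch below is our own illustration (not code from Chiang et al., 2014); the target coordinates and task text are invented:

```python
# Illustrative sketch of a location-triggered learning task, as in
# location-based AR learning systems. Coordinates and tasks are made up.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6_371_000  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def task_for_location(student, targets, radius_m=50):
    """Return (region name, task) for the first target within radius_m, else None."""
    for name, (lat, lon), task in targets:
        if haversine_m(student[0], student[1], lat, lon) <= radius_m:
            return name, task
    return None

targets = [("pond", (41.0082, 28.9784), "Observe and photograph aquatic plants")]
print(task_for_location((41.0083, 28.9785), targets))  # within ~15 m: pond task
print(task_for_location((42.0, 29.0), targets))        # far away: None
```

A real system would layer AR overlays and media capture on top of this check, but the decision of "which task, where" reduces to distance comparisons like these.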

Research reveals that the use of AR technology as part of teaching a subject has the features of being constructivist, problem solving-based, student-centered, authentic, participative, creative, personalized, meaningful, challenging, collaborative, interactive, entertaining, cognitively rich, contextual, and motivational (Dunleavy et al., 2009). Despite these advantages, and although the use of AR in science education is increasing, the integration of AR into science classes is still immature: teachers still do not consider themselves ready to use AR in their classes (Oleksiuk & Oleksiuk, 2020; Romano et al., 2020) and choose not to use AR technology (Alalwan et al., 2020; Garzón et al., 2019), because most of them lack the abilities and motivation to design AR learning practices (Garzón et al., 2019; Romano et al., 2020). The current study is expected to contribute to the use of AR in science lessons and to how science teachers can include AR technology in their lessons.

Argumentation, Critical Thinking, and Augmented Reality

New trends in information technologies have contributed to the development of new skills with which people must handle a range of information and evaluate it. An important part of these skills is the ability to argue from evidence (Jiménez-Aleixandre & Erduran, 2007), whereby young people draw appropriate conclusions from the information and evidence given to them, criticize the claims of others in light of the evidence, and distinguish opinion from evidence-based claims (OECD, 2003, p. 132).

Learning with technologies can produce information and misinformation simultaneously (Chai et al., 2015). Misinformation spread very quickly among the public during the COVID-19 pandemic, so the lack of ability to interpret and evaluate its validity and credibility surfaced again (Saribas & Çetinkaya, 2021). This process revealed the importance of developing students’ critical thinking skills and argumentation abilities (Erduran, 2020) so that they can make decisions and adequate judgments when they encounter contradictory information (Chai et al., 2015).

Thinking about different subjects, evaluating the validity of scientific claims, and interpreting and evaluating evidence are important elements of science courses and play important roles in the construction of scientific knowledge (Driver et al., 2000). The use of scientific knowledge in everyday life ensures that critical thinking skills come to the forefront. Ennis (2011, p. 1) defined critical thinking as “reasonable and reflective thinking focused on deciding what to believe.” Jiménez-Aleixandre and Puig (2012) found this definition very broad and proposed a comprehensive definition of critical thinking that combines the components of social emancipation and evidence evaluation. It encompasses the competence to form autonomous ideas as well as the ability to participate in and reflect on the world around us. Figure 1 summarizes this comprehensive definition.

[Fig. 1 (11191_2022_369_Fig1_HTML.jpg), not reproduced here; caption: Argumentation levels by groups]

Critical thinking skills that include the ability to evaluate arguments and counterarguments in a variety of contexts are very important, and effective argumentation is the focal point of criticism and informed decision-making (Nussbaum, 2008). Argumentation is defined as the process of making claims about a scientific subject, supporting them with data, using warrants, and criticizing, refuting, and evaluating ideas (Toulmin, 1990). Argumentation as an instructional method is an important research area in science education and has received enduring interest from science educators for more than a decade (Erduran et al., 2015). Researchers have concluded that learners mostly make only claims in the argumentation process and have difficulty producing well-justified, high-quality arguments (Demircioglu & Ucar, 2014, 2015; Cavagnetto et al., 2010; Erdogan et al., 2017; Erduran et al., 2004; Novak & Treagust, 2017). To improve the quality of arguments, students should be given supportive elements for producing more consistent arguments during argumentation. One of these supportive elements is the visual representation of phenomena.
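The "claims only" finding can be made concrete with a minimal encoding of Toulmin's components. The sketch below is our own toy simplification, not Toulmin's scheme or any published scoring instrument, and the example arguments are invented:

```python
# Toy encoding of Toulmin-style argument components (claim, data,
# warrants, rebuttals) and a naive completeness count. This is an
# illustration only, not a validated argumentation rubric.
from dataclasses import dataclass, field

@dataclass
class Argument:
    claim: str
    data: list = field(default_factory=list)       # evidence backing the claim
    warrants: list = field(default_factory=list)   # links from data to claim
    rebuttals: list = field(default_factory=list)  # counters to rival claims

def completeness(arg):
    """Count how many of the four components are present (0-4)."""
    return sum([bool(arg.claim), bool(arg.data), bool(arg.warrants), bool(arg.rebuttals)])

# A bare claim, like the ones learners most often produce:
bare = Argument(claim="The Moon has phases")

# A justified argument with data, a warrant, and a rebuttal:
rich = Argument(
    claim="The Moon has phases",
    data=["the lit fraction of the Moon changes over a month"],
    warrants=["the Sun always lights half the Moon; we see varying portions of that half"],
    rebuttals=["Earth's shadow causes eclipses, not the monthly phases"],
)

print(completeness(bare), completeness(rich))  # 1 4
```

Analytic frameworks such as Erduran et al. (2004) grade arguments along similar lines, with higher levels requiring warrants and rebuttals rather than bare claims.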

Visual representations can make it easier to see the structure of learners’ arguments (Akpınar et al., 2014) and improve students’ awareness. For example, the number of words and comments used by students, and of meaningful links in conversations, increases with visually enriched arguments (Erkens & Janssen, 2006). Sandoval and Millwood (2005) stated that students should be able to evaluate different kinds of evidence, such as digital data and graphic photography, to defend their claims. Appropriate data can directly support a claim and allow an argument to be accepted or rejected by students (Lin & Mintzes, 2010). Enriched visual representations provide students with detailed and meaningful information about the subject (Clark et al., 2007). Students collect evidence for argumentation by observing enriched representations (Clark et al., 2007), and these representations help them construct higher-quality arguments (Buckingham Shum et al., 1997; Jermann & Dillenbourg, 2003). Visualization techniques enable students to observe how objects behave and interact and provide an easy-to-understand presentation of scientific facts that are difficult to grasp through textual or oral explanations (Cadmus, 1990). In short, technological opportunities to create visually enriched representations increase students’ access to rich data to support their arguments.

Among the many technological opportunities to promote argumentation, AR seems to be the most promising application for instructing school subjects. AR applications combine computer-generated data (virtual reality) and the real world, projecting computer graphics onto real-time video images (Dias, 2009). In addition, augmented reality gives users the ability to see a real-world environment enriched with 3D images and to interact in real time, combining virtual objects with the real environment in 3D and showing their spatial relations (Kerawalla et al., 2006). AR applications are thus important tools for students’ arguments, supplying detailed and meaningful information through enriched representations. Research studies using AR technology revealed that all students in the study engaged in argumentation and produced arguments (Jan, 2009; Squire & Jan, 2007).

Many studies focusing on the use of AR in science education have been published in recent decades. These studies have focused on game-based AR (Atwood-Blaine & Huffman, 2017; Bressler & Bodzin, 2013; Dunleavy et al., 2009; López-Faican & Jaen, 2020; Squire, 2006), academic achievement (Hsiao et al., 2016; Faridi et al., 2020; Hwang et al., 2016; Lu et al., 2020; Sahin & Yilmaz, 2020; Yildirim & Seckin-Kapucu, 2020), understanding science content and its conceptual understanding (Cai et al., 2021; Chang et al., 2013; Chen & Liu, 2020; Ibáñez et al., 2014), attitude (Sahin & Yilmaz, 2020; Hwang et al., 2016), self-efficacy (Cai et al., 2021), motivation (Bressler & Bodzin, 2013; Chen & Liu, 2020; Kirikkaya & Başgül, 2019; Lu et al., 2020; Zhang et al., 2014), and critical thinking skills (Faridi et al., 2020; Syawaludin et al., 2019). The general trend in these studies has centered on “learning/academic achievement,” “understanding science content and its conceptual understanding,” “motivation,” and “attitude,” and methodologically, quantitative designs have been used most often in science education articles. Therefore, qualitative and quantitative data are needed from studies that investigate the use of augmented reality technology in education and focus on cognitive issues, interaction, and collaborative activities (Arici et al., 2019; Cheng & Tsai, 2013).

Instructional strategies using AR technology ensure interactions between students, as well as between students and teachers (Hanid et al., 2020). Both the technological features of AR and the accompanying learning strategies should be considered by teachers, curriculum designers, and AR technology developers to gain the full advantage of AR for student learning (Garzón & Acevedo, 2019; Garzón et al., 2020). Researchers have investigated learning outcomes with AR-integrated learning strategies such as collaborative learning (Baran et al., 2020; Chen & Liu, 2020; Ke & Carafano, 2016), socioscientific reasoning (Chang et al., 2020), student-centered hands-on learning activities (Chen & Liu, 2020), inquiry-based learning (Radu & Schneider, 2019), concept-map learning systems (Chen et al., 2019), problem-based learning (Fidan & Tuncel, 2019), and argumentation (Jan, 2009; Squire & Jan, 2007) in science learning.

The only two existing studies using both AR and argumentation (Jan, 2009; Squire & Jan, 2007) focus on environmental education and use location-based augmented reality games on mobile devices to engage students in scientific argumentation. No studies combining AR and argumentation in astronomy education were found in the literature. In the current study, AR was therefore integrated with argumentation in teaching astronomy content.

Studies have revealed that many topics in astronomy are very difficult to learn and that students hold incorrect and naive conceptions (Yu & Sahami, 2007). Many topics involve three-dimensional (3D) spatial relationships between astronomical objects (Aktamış & Arıcı, 2013; Yu & Sahami, 2007). However, most of the traditional teaching materials used in astronomy education are two-dimensional (Aktamış & Arıcı, 2013). Teaching astronomy through photographs and 2D animations is not sufficient for understanding its difficult and complex concepts (Chen et al., 2007). Static visualization tools such as texts, photographs, and 3D models do not change over time and lack continuous movement, while dynamic visualization tools such as videos or animations show continuous movement and change over time (Schnotz & Lowe, 2008). However, animation presents images on a computer screen (Rieber & Kini, 1991), not in the real world, and users have no chance to manipulate the images (Setozaki et al., 2017). As a solution to this shortcoming, using 3D technology in science classes, especially AR technology for abstract concepts, has become a necessity (Sahin & Yilmaz, 2020). By facilitating interaction with real and virtual environments and supporting object manipulation, AR can enhance educational benefits (Billinghurst, 2002). Students are not passive participants while using AR technology. For example, in Shelton’s (2002) study, animated 3D Sun and Earth models were moved on a handheld platform that adjusted its orientation according to the student’s point of view. Shelton found that students’ ability to manage “how” and “when” they are allowed to manipulate virtual 3D objects has a direct impact on learning complex spatial phenomena. Experimental results also show that, compared with traditional video teaching, an AR multimedia video teaching method significantly improves students’ learning (Chen et al., 2022).

This study is important because it integrates argumentation with AR, a striking new technology, in astronomy education; clarifies the relationship between the two; and examines critical thinking skills and argumentation abilities, variables that are essential in the era we live in.

Research Questions

The purpose of this study was to identify the change in critical thinking skills and argumentation abilities through augmented reality–based argumentation activities in teaching astronomy content. The following research questions guided this study:

  • RQ1: How do the critical thinking skills of students who participated in both augmented reality and argumentation activities on astronomy change during the study?
  • RQ2: How do the argumentation abilities of students who participated in both augmented reality and argumentation activities on astronomy change during the study?

In this case study, we investigated the change in critical thinking skills and argumentation abilities of middle school students. Before the main intervention, a pilot study was conducted to observe the effectiveness of the prepared lesson plans in practice and to identify problems in the implementation process. The pilot study was recorded with a camera. The researcher watched the camera recordings and identified the difficulties in the implementation process. In the main intervention, precautions were taken to overcome these difficulties. Table 1 presents the problems encountered during the pilot study and the precautions taken to eliminate them.

The solutions to the problems in the pilot study

During the main intervention, qualitative data were collected through observations and audio recordings to determine the change in the critical thinking skills and argumentation abilities of students who participated in both augmented reality and argumentation activities on astronomy.

Context and Participants

The participant pool consisted of 79 seventh-grade middle school students aged 12 to 13 from a private school in Southern Turkey. The school was selected because tablet computers were available for each student and the school was willing to participate in the study. Twenty-six of these students, 17 females and 9 males, participated in the study. The students’ parents signed consent forms (indicating participation or refusal). The researcher informed them about the purpose of the study, the instructional process, and the ethical principles that guided the study. The teachers and school principals were informed that the preliminary and detailed conclusions of the study would be shared with them. The first researcher conducted the lessons in all groups because, when the study was conducted, the use of augmented reality technology in education was very new, and the science teachers had inadequate knowledge of and experience with augmented reality applications. Before the study, the researcher attended classes with the teacher and made observations to help students become accustomed to the researcher’s presence in the classroom. This prolonged engagement increased the reliability of the implementation of instruction and data collection (Guba & Lincoln, 1989).

Instructional Activities

The 3-week, 19-hour intervention was conducted based on the prepared lesson plans. The students participated in a learning process that included both augmented reality and argumentation activities about astronomy.

Augmented Reality Activities

Free applications such as Star Chart, Sky View Free, Aurasma, Junaio, Augment, and i-Solar System were used on students’ tablet computers in the augmented reality instruction. Tablet computers were provided by the school administration from its stock. Videos, simulations, and 3D visuals generated by the applications were used as “overlays,” while pictures, photographs, colored areas in the worksheets, and students’ textbooks were used as “trigger images.” Students had the opportunity to interact with and manipulate these videos, simulations, and 3D visuals while using the applications. With applications such as Sky View Free and Star Chart, students were given the resources to make sky observations.

A detailed description of the activities used in augmented reality is given in Appendix Table 8.

The activities performed with augmented reality technology

Argumentation Activities

Before the instruction, the students were divided into six groups by the teacher, paying attention to heterogeneity in terms of gender and academic achievement. After small group discussions, the students participated in whole-class discussions. Competing theories cartoons, tables of statements, constructing an argument, and argument-driven inquiry (ADI) frameworks were used to support argumentation in the learning process. Argument-driven inquiry consists of eight steps including the following: identification of the task, the generation and analysis of data, the production of a tentative argument, an argumentation session, an investigation report, a double-blind peer review, revision of the report, and explicit and reflective discussion (Sampson & Gleim, 2009 ; Sampson et al., 2011 ).

A detailed description of the activities used in argumentation is given in Appendix Table 9.

Activities performed with argumentation

Data Collection

The data were collected through unstructured and participant observations (Maykut & Morehouse, 1994 ; Patton, 2002 ). The instructional intervention was recorded with a video camera, and the students’ argumentation processes were also recorded with a voice recorder.

Since all students spoke at the same time during group discussions, the observation records were insufficient for understanding the students’ talk. To capture what each student in a group said during the argumentation process, a voice recorder was placed in the middle of each group’s table, and recordings were taken throughout the lessons. A total of 2653.99 min of voice recordings were collected across the six groups.

Data Analysis

The data were analyzed with both inductive and deductive approaches. Before coding, the data were arranged: the critical thinking data were organized by day, and the argumentation data were organized by day and by group. After codes were generated during the inductive analysis of the development of critical thinking skills, a deductive approach was adopted (Patton, 2002). The critical thinking skills dimensions discussed by Ennis (1991, 2011) were used to determine the relationships between codes. Ennis (2011) prepared an outline distinguishing critical thinking dispositions and skills by synthesizing many years of studies. These skills, which comprise the abilities of ideal critical thinkers, were used to generate codes from students’ talk; the skills and abilities are given in Appendix Table 10. Then the “clarification skills, decision-making/supporting skills, inference skills, advanced clarification skills, and other/strategy and techniques skills” discussed by Ennis (1991, 2011) were used to determine the categories. The change in the argumentation abilities of the students was analyzed descriptively, based on the Toulmin argument model (Toulmin, 1990), using the data obtained from the students’ voice recordings. The argument structures of each group during verbal argumentation were determined by dividing them into components according to the Toulmin model. The first three items (data, claim, and warrant) form the basis of an argument, and the other three items (rebuttal, backing, and qualifier) are subsidiary elements of the argument (Toulmin, 1990).
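As a concrete illustration of this coding step, the split of coded utterances into basic and subsidiary Toulmin elements can be sketched as follows; the excerpt and function names are hypothetical, not the authors’ actual instrument:

```python
from collections import Counter

# Toulmin (1990) argument components, as used in the analysis above
BASIC = {"claim", "data", "warrant"}                # the core of an argument
SUBSIDIARY = {"rebuttal", "backing", "qualifier"}   # subsidiary elements

def summarize_components(coded_utterances):
    """Tally Toulmin components in a list of (speaker, component) codes."""
    counts = Counter(component for _, component in coded_utterances)
    return {
        "counts": dict(counts),
        "basic_present": sorted(BASIC & counts.keys()),
        "subsidiary_present": sorted(SUBSIDIARY & counts.keys()),
    }

# Hypothetical coded excerpt from one group discussion
excerpt = [
    ("S7", "claim"), ("S1", "data"), ("S20", "rebuttal"), ("S19", "warrant"),
]
summary = summarize_components(excerpt)
```

A summary of this kind shows at a glance whether a group’s episode contains only the basic triad or also the subsidiary elements that mark a richer argument.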

The critical thinking skills and abilities (Ennis, 2011 , pp. 2–4)

Some quotations regarding the analysis of the arguments according to these items are given in Appendix Table 11.

Quotations regarding the analysis of the arguments according to the items

Whole-group arguments were assigned to stages based on the argumentation-level model developed by Erduran et al. (2004) to examine the changes in each lesson and to make comparisons between the small groups of students. Building on the argument model developed by Toulmin, Erduran et al. (2004) created a five-level framework for assessing the quality of argumentation, on the assumption that arguments including rebuttals are of high quality. The framework is given in Table 2.
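A minimal sketch of how such a five-level assignment could be operationalized is shown below; the thresholds paraphrase the Erduran et al. (2004) level descriptions, and the function is illustrative rather than the authors’ coding procedure:

```python
def argumentation_level(has_grounds, rebuttals,
                        clearly_identifiable=False, extended=False):
    """
    Assign an Erduran et al. (2004)-style argumentation level to an episode.

    has_grounds: the argument contains data/warrants/backings, not just claims
    rebuttals: number of rebuttals observed in the episode
    clearly_identifiable: at least one rebuttal is clearly identifiable
    extended: the episode is an extended argument with multiple rebuttals
    """
    if extended and rebuttals > 1:
        return 5  # extended argument with more than one rebuttal
    if rebuttals >= 1 and clearly_identifiable:
        return 4  # claims with a clearly identifiable rebuttal
    if rebuttals >= 1:
        return 3  # series of claims with an occasional weak rebuttal
    if has_grounds:
        return 2  # claims with data, warrants, or backings, but no rebuttals
    return 1  # a simple claim versus a counterclaim
```

Because the framework treats rebuttals as the key quality marker, an episode with grounds but no rebuttal can never rise above level 2 in this sketch.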

The framework for the assessment of the quality of argumentation (Erduran et al., 2004, p. 928)

Validity and Reliability

To confirm the accuracy and validity of the analysis, method triangulation, triangulation of data sources, and analyst triangulation were used (Patton, 2002 ).

For analyst triangulation, the qualitative findings were also analyzed independently by a researcher studying in the field of critical thinking and argumentation, and then these evaluations made by the researchers were compared.

Video and audio recordings of intervention and documents from the activities were used for the triangulation of data sources. In addition, the data were described in detail without interpretation. Additionally, within the reliability and validity efforts, direct quotations were given in the findings. In this sense, for students, codes such as S1, S2, and S3 were used, and the source of data, group number, and relevant date of the conversation were included at the end of the quotations.

In addition, experts studying in the field of critical thinking and argumentation were asked to verify all data and findings. After the process of reflection and discussion with experts, the codes, subcategories, and categories were revised.

For reliability, some of the data, randomly selected from the written transcripts of the students’ audio recordings, were also coded by a second coder. The interrater agreement between the two coders, determined by Cohen’s kappa (Cohen, 1960), was κ = 0.86, which is considered high reliability.
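Cohen’s kappa corrects observed agreement for the agreement expected by chance, κ = (p_o − p_e)/(1 − p_e). A minimal sketch of the computation, with hypothetical coder labels rather than the study’s actual transcripts, is:

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's (1960) kappa: chance-corrected agreement between two coders."""
    assert len(coder1) == len(coder2) and coder1
    n = len(coder1)
    # Observed proportion of segments on which the coders agree
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Expected agreement if both coders labelled independently at their base rates
    c1, c2 = Counter(coder1), Counter(coder2)
    expected = sum(c1[label] * c2[label] for label in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by two coders to five transcript segments
coder_a = ["claim", "data", "warrant", "claim", "rebuttal"]
coder_b = ["claim", "data", "warrant", "data", "rebuttal"]
kappa = cohens_kappa(coder_a, coder_b)
```

For these toy labels, p_o = 0.8 and p_e = 0.24, giving κ ≈ 0.74; values above roughly 0.8, such as the study’s κ = 0.86, are conventionally read as high agreement.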

Development of Critical Thinking Ability

The development of critical thinking skills is presented separately for the days on which the trend changed drastically: the day the first skills were used by the students, the days all six dimensions of critical thinking skills appeared in students’ dialogs, and the days the number of categories of critical thinking skills decreased.

The codes, subcategories, and categories of critical thinking skills that occurred on the first day (dated 11.05) are given in Table 3.

The codes, subcategories, and categories of critical thinking skills that occurred on the first day

Clarification skills, inference skills, other/strategy and technique skills, advanced clarification skills, and decision-making/supporting skills occurred on the first day. The students mostly used decision-making/supporting skills (f = 55). Under this category, students most often explained observation data (f = 37). S7, S1, and S20 presented data from their observations with the Star Chart and Sky View applications as follows:

S7: Venus is such a yellowish reddish colour.

S1: What was the colour? Red and big. The moon’s color is white.

S20: Not white here.

S20: It’s not white here. (Audio Recordings (AuR), Group 2 / 11.05).

Additionally, S19 mentioned the observation data with the words “I searched Saturn. It is bright. It does not vibrate. It is yellow and it’s large.” (AuR, Group 2 / 11.05).

Decision-making/supporting skills were followed by inference (f = 17), clarification (f = 13), advanced clarification (f = 5), and other/strategy and technique skills (f = 1).

In Table 4, the categories, subcategories, and codes for critical thinking skills that occurred on the fifth day (dated 18.05) are presented.

The categories, subcategories, and codes for critical thinking skills that occurred on the fifth day

It was observed for the first time on the fifth day that all six dimensions of critical thinking skills were included in students’ dialogs. These were, in order of frequency of use, inference (f = 152), decision-making/supporting (f = 116), clarification (f = 43), advanced clarification (f = 8), other/strategy and technique (f = 3), and suppositional thinking and integration (f = 2) skills.

On this date, judging the credibility of a source, under decision-making/supporting skills (f = 1), was used for the first time.

Unlike on other days, for the first time a student tried to prove his thoughts with an analogy, an advanced clarification skill. An exemplary dialog illustrating this finding follows:

S19: Even the Moon remains constant, we will see different faces of the moon because the Earth revolves around its axis.

S6: I also say that it turns at the same speed. So, for example, when this house turns like this while we return in the same way, we always see the same face. (AuR, 18.05, Group 2).

Here, S6 tried to explain to his friend that they always see the same face of the moon by comparing how they see the same face of the house.

In Table 5, the categories, subcategories, and codes for critical thinking skills that occurred on the sixth day (dated 21.05) are included.

The categories, subcategories, and codes for critical thinking skills that occurred on the sixth day

On this day, there was a decrease in the number of critical thinking skill categories, to three. The students mostly used inference skills (f = 38), followed by decision-making/supporting (f = 34) and clarification (f = 9) skills. Within inference skills, students often made claims (f = 33) and rarely made inferences from the available data (f = 4).

Among the decision-making/supporting skills, students mostly used giving reasons (f = 28). S24, who took the role of Uranus during the activity, gave her reason for making Saturn an enemy: “No, Saturn would be my enemy too. Its ring is more distinctive, it can be seen from the Earth, its ring is more beautiful than me.” (AuR, 21.05, Group 3).

The categories, subcategories, and codes for critical thinking skills that occurred on the ninth day (dated 28.05) are presented in Table 6.

The categories, subcategories, and codes for critical thinking skills that occurred on the ninth day

In the course of the day dated 28.05, all six categories of critical thinking skills were observed: clarification, inference, other/strategy and technique, advanced clarification, decision-making/supporting, and suppositional thinking and integration skills. Furthermore, the subcategories under these categories were very diverse.

There are 10 subcategories under clarification skills (f = 57), the most commonly used category: asking a friend for his or her opinion (f = 15), asking questions to clarify the situation (f = 12), explaining one’s statement (f = 10), summarizing the solutions of other groups (f = 7), asking for a detailed explanation (f = 4), summarizing the idea (f = 3), explaining the solution proposal (f = 2), asking for a reason (f = 2), focusing on the question (f = 1), and asking what the tools used in the experiment do (f = 1). Explaining the solution proposal, asking what the tools used in the experiment do, and focusing on the question were used by the students for the first time.

When the qualitative findings regarding the critical thinking skills of the students were examined as a whole, it was determined that there was an improvement in the students’ critical thinking skills dimensions in the lessons held in the first 5 days (between 11.05 and 18.05). There was a decrease in the number of critical thinking skills dimensions in the middle of the intervention (21.05). However, after this date, there was an increase again in the number of critical thinking skills dimensions; and on the last day of the intervention, all the critical thinking skills dimensions were used by the students. In addition, it was determined that the skills found under these dimensions showed great variety at this date. Only in the middle (18.05) and on the last day (28.05) of the intervention did students use the skills in the six dimensions of critical thinking.

It was determined that students mostly used decision-making/supporting, inference, and clarification skills. Across individual days, inference skills were the most frequently used of these (on 12.05, 15.05, 18.05, and 21.05).

The Argumentation Abilities of the Students

Argument structures in students’ verbal argumentation activities.

Rather than presenting the argument structures of all groups, only one group’s structures are given as an example that includes both the basic and subsidiary items of the Toulmin argument model. In Table 7, the argument structures in the verbal argumentation activities of the fourth group are presented because this group used the “rebuttal” item.

The argument structures in the verbal argumentation activities of the fourth group of students

When the argument structures in the verbal argumentation process of the six groups were examined, it was found that all groups engaged in the argumentation and produced arguments. In the activities, students mostly made claims. This was followed by data and warrant items. In the “the phases of the moon” activity, it was determined that only the second and fourth groups used rebuttal and the other groups did not.

The number of rebuttals used by the groups was lower in “the planets-table of statements” activity than in the other activities, and the rebuttals used were weak. The use of rebuttals differed in the “who is right?” and “urgent solution to space pollution” activities: the number of rebuttals students used in these activities was higher than in the other activities, and the quality of the rebuttals was also higher.

When the structure of the warrants is examined, there were more unscientific warrants in the “urgent solution to space pollution” and “who is right?” activities, while correct scientific and partially correct scientific warrants were more frequently used in “the phases of the moon” and “the planets-table of statements” activities.

When the models of the argument structures are examined in general, there was a decrease in the variety and number of items used in “the phases of the moon” and “the planets-table of statements” activities compared with the “urgent solution to space pollution” and “who is right?” activities.

When the results were analyzed in terms of groups, it was determined that the argument structures of the second and fourth groups showed more variety than those of the other groups.

The Change of Argumentation Levels

The argumentation levels achieved by six groups created in the “who is right,” “ the planets-table of statements,” “phases of the moon,” and “urgent solution to space pollution” activities are shown in Fig.  2 .


A characterization of the components of critical thinking (Jiménez-Aleixandre & Puig, 2012 , p. 6)

In the first verbal argumentation activity, “who is right?,” the arguments achieved by five of the six groups were at level 5; the arguments of the remaining group (group 6) were at level 4.

In the second verbal argumentation activity, “table of statements,” a decrease was observed in the argumentation levels of all groups except groups 1 and 3. In “the phases of the moon” activity, there was a decrease in the argumentation levels of all groups except groups 2 and 4. In the last argumentation activity, “urgent solution to space pollution,” the arguments of all groups were at level 5.

Conclusions and Discussion

The critical thinking skills of the students developed until the middle of the intervention, and the frequency of using critical thinking skills varied after that point. When the activities in the lessons were examined, activities including argumentation methods were performed on the days when critical thinking skills were used frequently. This suggests that the frequency with which students used critical thinking skills varied according to the use of the argumentation method.

Argumentation is defined as the process of making claims about a scientific subject, supporting them with data, providing reasons as proof, and criticizing, rebutting, and evaluating ideas (Toulmin, 1990). These processes also appear in the subdimensions of critical thinking skills. The ability to give reasons, under decision-making/supporting skills, corresponds to providing reasons as proof through warrants in the Toulmin argument model. The different types of claims under inference skills relate to making claims in the argumentation process, and rejecting a judgment relates to rebutting an idea. In this context, the argumentation method is thought to contribute to the development of critical thinking skills within the AR-supported instruction.

Another qualitative finding reached in the study is that the skills most used in the subdimensions differ according to the days. This can be explained by the different types of activities performed in each lesson. For example, on the day when the ability to explain observation data was used the most, students observed the sky, constellations, and galaxies with the Star Chart or Sky View applications or observed the planets with the i-Solar System application, and they presented the data they obtained during these observations.

Regarding the verbal argumentation structure of the groups, the findings imply that all groups engaged in argumentation and produced arguments. This finding presented evidence with qualitative data to further verify Squire & Jan’s ( 2007 ) research conducted with primary, middle, and high school students to investigate the potential of a location-based AR game in environmental science concluding that all groups engaged in argumentation. Similarly, Jan ( 2009 ) investigated the experience of three middle school students and their argumentative discourse on environmental education using a location-based AR game, and it was found that all students participated in argumentation and produced arguments.

Another finding of the current study was that students mostly made claims in the activities. This can be interpreted as students being strong in expressing their opinions. Similar findings appear in the literature (Author, 20xxa; Cavagnetto et al., 2010; Erduran et al., 2004; Novak & Treagust, 2017). In those studies, however, it was concluded that students failed to use warrants and data, could not support their claims with data, and did not use rebuttals. In the current study, in which both augmented reality applications and argumentation methods were used, students mostly made contradictory claims and used data and warrants in their arguments. This can be interpreted as students being strong in defending their opinions. Additionally, although many studies reported that students’ argumentation levels were generally at level 1 or level 2 (Erdogan et al., 2017; Erduran et al., 2004; Venville & Dawson, 2010; Zohar & Nemet, 2002), most of the students’ arguments in the current study were at level 4 or level 5. Arguments are considered high quality when rebuttals are present, and discussions involving rebuttals are characterized as having a high level of argumentation (Aufschnaiter et al., 2008; Erduran et al., 2004). Students used rebuttals in their arguments, and their arguments were at high levels, indicating that they could produce quality arguments. These findings may differ from those of other studies because of the augmented reality technology used in the current study. Enriched representations make it easier to see the structure of arguments (Akpınar et al., 2014), help students improve their awareness and increase the number of words they use and comments they make (Erkens & Janssen, 2006), and provide important information about the subject (Clark et al., 2007).
By observing enriched representations, students collect evidence for argumentation (Clark & Sampson, 2008) and explore different points of view to support their claims (Oestermeier & Hesse, 2000). AR technology, which includes enriched representations, may have increased the accessibility of rich data to support students’ arguments; using these data helped students support their arguments and enabled them to discover different perspectives. For example, S4, in the role of Uranus, explained that the statement in the table was incorrect because she had observed in the “i-Solar System” application that Uranus, Jupiter, and Neptune have rings around them. She used the data obtained in the AR application to support her claim.

When the models of the argument structures are examined in general, the variety of items, the number of items, and the rebuttals used in the scientific activities were fewer than in the activities involving socioscientific issues, and the rebuttals used were weak. The literature likewise indicates that producing arguments on scientific issues is more difficult than producing arguments on socioscientific issues (Osborne et al., 2004).

When the structure of the warrants in the students’ arguments was examined, there were more nonscientific warrants in the socioscientific activities, while scientific and partially scientific warrants were more frequent in the activities with scientific subjects. This shows that students were unable to combine what they had learned in science with socioscientific issues. Albe (2008) and Kolsto (2001) stated that scientific knowledge figures very little in students’ arguments on socioscientific issues, and the results of other studies in the related literature support this view (Demircioglu & Ucar, 2014; Sadler & Donnelly, 2006; Wu & Tsai, 2007).

When the argument structures in the activities are analyzed by group, the argument structures of two groups varied more than those of the others, and the argumentation levels of these groups were at levels 4 and 5. This might be because some students had different prior knowledge of the subjects. Various studies have indicated that content knowledge plays an important role in the quality of students’ arguments (Acar, 2008; Aufschnaiter et al., 2008; Clark & Sampson, 2008; Cross et al., 2008; Sampson & Clark, 2011), and many have emphasized that the most important factor affecting the selection and processing of knowledge is prior knowledge (Stark et al., 2009). To better understand how prior knowledge affects argumentation quality in astronomy education, future research could investigate the relationship between middle school students’ content knowledge and argumentation quality.

Limitations and Future Research

There are some limitations to this study. First, it was implemented in a private school, so the results hold only for these students; future research should be conducted with students in public schools. Second, the researcher conducted the lessons because the science teacher was not able to design AR learning practices. Having teachers and students create their own AR experiences is an important way to make the learning outcomes of AR available to a wider audience (Romano et al., 2020); further research could be conducted in which the class’s science teacher is the instructor. Another limitation is that the instruction with AR-based argumentation was time-consuming, and the time allocated for the “Solar System and Beyond” unit in the curriculum was not sufficient for the implementation, because students needed time to learn to use the AR applications and to reflect on the activities despite receiving training on AR before the instructional process. This situation may cause cognitive overload (Alalwan et al., 2020). The adoption and implementation of educational technologies are more difficult and time-consuming than other methods (Parker & Heywood, 1998), and a longer period is needed to prepare student-centered and technology-supported activities.

Tables 8, 9, 10 and 11

This study is part of Tuba Demircioğlu’s dissertation, supported by Cukurova University Scientific Research Projects (grant number: SDK20153929).

The manuscript is part of the first author’s PhD dissertation. The study was reviewed and approved by the PhD committee of the Cukurova University Faculty of Education, as well as by the committee of the Ministry of National Education. The students’ parents provided written informed consent.

Declarations

The authors declare that they have no conflict of interest.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Tuba Demircioglu, Email: tubademircioglu@gmail.com

Memet Karakus, Email: memkks@gmail.com

Sedat Ucar, Email: sedatucar@gmail.com

  • Acar, O. (2008). Argumentation skills and conceptual knowledge of undergraduate students in a physics by inquiry class. (Doctoral dissertation, The Ohio State University). https://etd.ohiolink.edu/apexprod/rws_etd/send_file/send?accession=osu1228972473&disposition=inline
  • Akpınar Y, Ardaç D, Er-Amuce N. Development and validation of an argumentation based multimedia science learning environment: Preliminary findings. Procedia-Social and Behavioral Sciences. 2014;116:3848–3853. doi: 10.1016/j.sbspro.2014.01.853.
  • Aktamış H, Arıcı VA. The effects of using virtual reality software in teaching astronomy subjects on academic achievement and retention. Mersin University Journal of the Faculty of Education. 2013;9(2):58–70.
  • Alalwan N, Cheng L, Al-Samarraie H, Yousef R, Alzahrani AI, Sarsam SM. Challenges and prospects of virtual reality and augmented reality utilization among primary school teachers: A developing country perspective. Studies in Educational Evaluation. 2020;66:100876. doi: 10.1016/j.stueduc.2020.100876.
  • Albe V. When scientific knowledge, daily life experience, epistemological and social considerations intersect: Students’ argumentation in group discussions on a socio-scientific issue. Research in Science Education. 2008;38(1):67–90. doi: 10.1007/s11165-007-9040-2.
  • Arici F, Yildirim P, Caliklar Ş, Yilmaz RM. Research trends in the use of augmented reality in science education: Content and bibliometric mapping analysis. Computers & Education. 2019;142:103647. doi: 10.1016/j.compedu.2019.103647.
  • Atwood-Blaine D, Huffman D. Mobile gaming and student interactions in a science center: The future of gaming in science education. International Journal of Science and Mathematics Education. 2017;15(1):45–65. doi: 10.1007/s10763-017-9801-y.
  • Baran B, Yecan E, Kaptan B, Paşayiğit O. Using augmented reality to teach fifth grade students about electrical circuits. Education and Information Technologies. 2020;25(2):1371–1385. doi: 10.1007/s10639-019-10001-9.
  • Billinghurst M. Augmented reality in education. New Horizons for Learning. 2002;12(5):1–5.
  • Bressler DM, Bodzin AM. A mixed methods assessment of students’ flow experiences during a mobile augmented reality science game. Journal of Computer Assisted Learning. 2013;29(6):505–517. doi: 10.1111/jcal.12008.
  • Buckingham Shum SJ, MacLean A, Bellotti VM, Hammond NV. Graphical argumentation and design cognition. Human-Computer Interaction. 1997;12(3):267–300. doi: 10.1207/s15327051hci1203_2.
  • Cadmus RR Jr. A video technique to facilitate the visualization of physical phenomena. American Journal of Physics. 1990;58(4):397–399. doi: 10.1119/1.16483.
  • Cai S, Chiang FK, Wang X. Using the augmented reality 3D technique for a convex imaging experiment in a physics course. International Journal of Engineering Education. 2013;29(4):856–865.
  • Chai CS, Deng F, Tsai PS, Koh JHL, Tsai CC. Assessing multidimensional students’ perceptions of twenty-first-century learning practices. Asia Pacific Education Review. 2015;16(3):389–398. doi: 10.1007/s12564-015-9379-4.
  • Cai S, Liu C, Wang T, Liu E, Liang JC. Effects of learning physics using augmented reality on students’ self-efficacy and conceptions of learning. British Journal of Educational Technology. 2021;52(1):235–251. doi: 10.1111/bjet.13020.
  • Cavagnetto A, Hand BM, Norton-Meier L. The nature of elementary student science discourse in the context of the science writing heuristic approach. International Journal of Science Education. 2010;32(4):427–449. doi: 10.1080/09500690802627277.
  • Chang HY, Wu HK, Hsu YS. Integrating a mobile augmented reality activity to contextualize student learning of a socioscientific issue. British Journal of Educational Technology. 2013;44(3):95–99. doi: 10.1111/j.1467-8535.2012.01379.x.
  • Chang HY, Liang JC, Tsai CC. Students’ context-specific epistemic justifications, prior knowledge, engagement, and socioscientific reasoning in a mobile augmented reality learning environment. Journal of Science Education and Technology. 2020;29(3):399–408. doi: 10.1007/s10956-020-09825-9.
  • Chen SY, Liu SY. Using augmented reality to experiment with elements in a chemistry course. Computers in Human Behavior. 2020;111:106418. doi: 10.1016/j.chb.2020.106418.
  • Chen CH, Yang JC, Shen S, Jeng MC. A desktop virtual reality earth motion system in astronomy education. Educational Technology & Society. 2007;10(3):289–304.
  • Chen CH, Huang CY, Chou YY. Effects of augmented reality-based multidimensional concept maps on students’ learning achievement, motivation and acceptance. Universal Access in the Information Society. 2019;18(2):257–268. doi: 10.1007/s10209-017-0595-z.
  • Chen CC, Chen HR, Wang TY. Creative situated augmented reality learning for astronomy curricula. Educational Technology & Society. 2022;25(2):148–162.
  • Cheng KH, Tsai CC. Affordances of augmented reality in science learning: Suggestions for future research. Journal of Science Education and Technology. 2013;22(4):449–462. doi: 10.1007/s10956-012-9405-9.
  • Chiang THC, Yang SJH, Hwang GJ. An augmented reality-based mobile learning system to improve students’ learning achievements and motivations in natural science inquiry activities. Journal of Educational Technology & Society. 2014;17(4):352–365.
  • Clark DB, Sampson V. Assessing dialogic argumentation in online environments to relate structure, grounds, and conceptual quality. Journal of Research in Science Teaching. 2008;45(3):293–321. doi: 10.1002/tea.20216.
  • Clark DB, Stegmann K, Weinberger A, Menekse M, Erkens G. Technology-enhanced learning environments to support students’ argumentation. In: Erduran S, Jimenez-Aleixandre MP, editors. Argumentation in science education: Perspectives from classroom-based research. Springer; 2007. pp. 217–243.
  • Cohen J. A coefficient of agreement for nominal scales. Educational and Psychological Measurement. 1960;20(1):37–46. doi: 10.1177/001316446002000104.
  • Cross D, Taasoobshirazi G, Hendricks S, Hickey DT. Argumentation: A strategy for improving achievement and revealing scientific identities. International Journal of Science Education. 2008;30(6):837–861. doi: 10.1080/09500690701411567.
  • Demircioglu T, Ucar S. Investigation of written arguments about Akkuyu nuclear power plant. Elementary Education Online. 2014;13(4):1373–1386.
  • Demircioglu T, Ucar S. Investigating the effect of argument-driven inquiry in laboratory instruction. Educational Sciences: Theory and Practice. 2015;15(1):267–283.
  • Dias A. Technology enhanced learning and augmented reality: An application on multimedia interactive books. International Business & Economics Review. 2009;1(1):69–79.
  • Dietrich N, Kentheswaran K, Ahmadi A, Teychené J, Bessière Y, Alfenore S, Laborie S, Hébrard G. Attempts, successes, and failures of distance learning in the time of COVID-19. Journal of Chemical Education. 2020;97(9):2448–2457. doi: 10.1021/acs.jchemed.0c00717.
  • Driver R, Newton P, Osborne J. Establishing the norms of scientific argumentation in classrooms. Science Education. 2000;84:287–312. doi: 10.1002/(SICI)1098-237X(200005)84:3<287::AID-SCE1>3.0.CO;2-A.
  • Dunleavy M, Dede C, Mitchell R. Affordances and limitations of immersive participatory augmented reality simulations for teaching and learning. Journal of Science Education and Technology. 2009;18(1):7–22. doi: 10.1007/s10956-008-9119-1.
  • Ennis RH. Goals for a critical thinking curriculum. In: Costa A, editor. Developing minds: A resource book for teaching thinking. The Association of Supervision and Curriculum Development; 1991. pp. 68–71.
  • Ennis, R. H. (2011). The nature of critical thinking: An outline of critical thinking dispositions and abilities. Illinois College of Education. https://education.illinois.edu/docs/default-source/faculty-documents/robert-ennis/thenatureofcriticalthinking_51711_000.pdf
  • Erdogan I, Ciftci A, Topcu MS. Examination of the questions used in science lessons and argumentation levels of students. Journal of Baltic Science Education. 2017;16(6):980–993. doi: 10.33225/jbse/17.16.980.
  • Erduran S. Science education in the era of a pandemic: How can history, philosophy and sociology of science contribute to education for understanding and solving the Covid-19 crisis? Science & Education. 2020;29:233–235. doi: 10.1007/s11191-020-00122-w.
  • Erduran S, Simon S, Osborne J. TAPping into argumentation: Developments in the application of Toulmin’s argument pattern for studying science discourse. Science Education. 2004;88(6):915–933. doi: 10.1002/sce.20012.
  • Erduran S, Özdem Y, Park JY. Research trends on argumentation in science education: A journal content analysis from 1998–2014. International Journal of STEM Education. 2015;2(1):1–12. doi: 10.1186/s40594-015-0020-1.
  • Erkens, G., & Janssen, J. (2006). Automatic coding of communication in collaboration protocols. In S. Barab, K. Hay, & D. Hickey (Eds.), Proceedings of the 7th International Conference on Learning Sciences (ICLS) (pp. 1063–1064). International Society of the Learning Sciences. 10.5555/1150034
  • Faridi H, Tuli N, Mantri A, Singh G, Gargrish S. A framework utilizing augmented reality to improve critical thinking ability and learning gain of the students in Physics. Computer Applications in Engineering Education. 2020;29:258–273. doi: 10.1002/cae.22342.
  • Fidan M, Tuncel M. Integrating augmented reality into problem based learning: The effects on learning achievement and attitude in physics education. Computers & Education. 2019;142:103635. doi: 10.1016/j.compedu.2019.103635.
  • Fleck, S., & Simon, G. (2013, November). An augmented reality environment for astronomy learning in elementary grades: An exploratory study. In Proceedings of the 25th Conference on l’Interaction Homme-Machine (pp. 14–22). ACM.
  • Garzón J, Acevedo J. Meta-analysis of the impact of augmented reality on students’ learning gains. Educational Research Review. 2019;27:244–260. doi: 10.1016/j.edurev.2019.04.001.
  • Garzón, J., Pavón, J., & Baldiris, S. (2019). Systematic review and meta-analysis of augmented reality in educational settings. Virtual Reality, 1–13.
  • Garzón, J., Baldiris, S., Gutiérrez, J., & Pavón, J. (2020). How do pedagogical approaches affect the impact of augmented reality on education? A meta-analysis and research synthesis. Educational Research Review, 100334.
  • Guba, E. G., & Lincoln, Y. S. (1989). Fourth generation evaluation. Sage Publications.
  • Hanid MFA, Said MNHM, Yahaya N. Learning strategies using augmented reality technology in education: Meta-analysis. Universal Journal of Educational Research. 2020;8(5A):51–56. doi: 10.13189/ujer.2020.081908.
  • Hsiao HS, Chang CS, Lin CY, Wang YZ. Weather observers: A manipulative augmented reality system for weather simulations at home, in the classroom, and at a museum. Interactive Learning Environments. 2016;24(1):205–223. doi: 10.1080/10494820.2013.834829.
  • Huang KT, Ball C, Francis J, Ratan R, Boumis J, Fordham J. Augmented versus virtual reality in education: An exploratory study examining science knowledge retention when using augmented reality/virtual reality mobile applications. Cyberpsychology, Behavior, and Social Networking. 2019;22(2):105–110. doi: 10.1089/cyber.2018.0150.
  • Hwang GJ, Wu PH, Chen CC, Tu NT. Effects of an augmented reality-based educational game on students’ learning achievements and attitudes in real-world observations. Interactive Learning Environments. 2016;24(8):1895–1906. doi: 10.1080/10494820.2015.1057747.
  • Ibáñez MB, Di Serio Á, Villarán D, Kloos CD. Experimenting with electromagnetism using augmented reality: Impact on flow student experience and educational effectiveness. Computers & Education. 2014;71:1–13. doi: 10.1016/j.compedu.2013.09.004.
  • Jan, M. (2009). Designing an augmented reality game-based curriculum for argumentation. (Unpublished doctoral dissertation). University of Wisconsin-Madison.
  • Jermann P, Dillenbourg P. Elaborating new arguments through a CSCL script. In: Andriessen J, Baker M, Suthers D, editors. Arguing to learn: Confronting cognitions in computer-supported collaborative learning environments. Springer; 2003. pp. 205–226.
  • Jiménez-Aleixandre MP, Erduran S. Argumentation in science education: An overview. In: Erduran S, Jimenez-Aleixandre MP, editors. Argumentation in science education: Perspectives from classroom-based research. Springer; 2007. pp. 3–27.
  • Jiménez-Aleixandre, M. P., & Puig, B. (2012). Argumentation, evidence evaluation and critical thinking. In Second international handbook of science education (pp. 1001–1015). Springer, Dordrecht.
  • Ke F, Carafano P. Collaborative science learning in an immersive flight simulation. Computers & Education. 2016;103:114–123. doi: 10.1016/j.compedu.2016.10.003.
  • Kennedy G, Dalgarno B, Bennett S, Judd T, Gray K, Chang R. Immigrants and natives: Investigating differences between staff and students’ use of technology. In: Hello! Where are you in the landscape of educational technology? Proceedings Ascilite Melbourne. 2008:484–492.
  • Kerawalla L, Luckin R, Seljeflot S, Woolard A. “Making it real”: Exploring the potential of augmented reality for teaching primary school science. Virtual Reality. 2006;10(3–4):163–174. doi: 10.1007/s10055-006-0036-4.
  • Kirikkaya EB, Başgül MŞ. The effect of the use of augmented reality applications on the academic success and motivation of 7th grade students. Journal of Baltic Science Education. 2019;18(3):362. doi: 10.33225/jbse/19.18.362.
  • Kolsto SD. ‘To trust or not to trust,…’: Pupils’ ways of judging information encountered in a socio-scientific issue. International Journal of Science Education. 2001;23(9):877–901. doi: 10.1080/09500690010016102.
  • Lin SS, Mintzes JJ. Learning argumentation skills through instruction in socioscientific issues: The effect of ability level. International Journal of Science and Mathematics Education. 2010;8(6):993–1017. doi: 10.1007/s10763-010-9215-6.
  • López-Faican L, Jaen J. Emofindar: Evaluation of a mobile multiplayer augmented reality game for primary school children. Computers & Education. 2020;149:103814. doi: 10.1016/j.compedu.2020.103814.
  • Lu SJ, Liu YC, Chen PJ, Hsieh MR. Evaluation of AR embedded physical puzzle game on students’ learning achievement and motivation on elementary natural science. Interactive Learning Environments. 2020;28(4):451–463. doi: 10.1080/10494820.2018.1541908.
  • Maykut, P., & Morehouse, R. (1994). Beginning qualitative research: A philosophic and practical guide. The Falmer Press.
  • Novak AM, Treagust DF. Adjusting claims as new evidence emerges: Do students incorporate new evidence into their scientific explanations? Journal of Research in Science Teaching. 2017;55(4):526–549. doi: 10.1002/tea.21429.
  • Nussbaum EM. Collaborative discourse, argumentation, and learning: Preface and literature review. Contemporary Educational Psychology. 2008;33(3):345–359. doi: 10.1016/j.cedpsych.2008.06.001.
  • OECD. Literacy skills for the world of tomorrow: Further results from PISA 2003. OECD Publishing; 2003.
  • Oestermeier U, Hesse FW. Verbal and visual causal arguments. Cognition. 2000;75(1):65–104. doi: 10.1016/S0010-0277(00)00060-3.
  • Oleksiuk, V. P., & Oleksiuk, O. R. (2020). Exploring the potential of augmented reality for teaching school computer science. In Burov, O. Yu., & Kiv, A. E. (Eds.), Proceedings of the 3rd International Workshop on Augmented Reality in Education (AREdu 2020) (pp. 91–107).
  • Osborne J, Erduran S, Simon S. Enhancing the quality of argument in school science. Journal of Research in Science Teaching. 2004;41(10):994–1020. doi: 10.1002/tea.20035.
  • Parker J, Heywood D. The earth and beyond: Developing primary teachers’ understanding of basic astronomical events. International Journal of Science Education. 1998;20:503–520. doi: 10.1080/0950069980200501.
  • Patton MQ. Qualitative research and evaluation methods. 3rd ed. Sage; 2002.
  • Pellas N, Fotaris P, Kazanidis I, Wells D. Augmenting the learning experience in primary and secondary school education: A systematic review of recent trends in augmented reality game-based learning. Virtual Reality. 2019;23(4):329–346. doi: 10.1007/s10055-018-0347-2.
  • Ploman, E. W. (1987). Global learning: A challenge. In C. A. Taylor (Ed.), Science education and information transfer (pp. 75–80). Oxford: Pergamon (for ICSU Press).
  • Radu, I., & Schneider, B. (2019). What can we learn from augmented reality (AR)? Benefits and drawbacks of AR for inquiry-based learning of physics. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–12).
  • Rideout VJ, Foehr UG, Roberts DF. Generation M²: Media in the lives of 8- to 18-year-olds. Kaiser Family Foundation; 2010.
  • Rieber LP, Kini AS. Theoretical foundations of instructional applications of computer-generated animated visuals. Journal of Computer-Based Instruction. 1991;18:83–88.
  • Romano, M., Díaz, P., & Aedo, I. (2020). Empowering teachers to create augmented reality experiences: The effects on the educational experience. Interactive Learning Environments, 1–18.
  • Sadler TD, Donnelly LA. Socioscientific argumentation: The effects of content knowledge and morality. International Journal of Science Education. 2006;28(12):1463–1488. doi: 10.1080/09500690600708717.
  • Sahin D, Yilmaz RM. The effect of augmented reality technology on middle school students’ achievements and attitudes towards science education. Computers & Education. 2020;144:103710. doi: 10.1016/j.compedu.2019.103710.
  • Sampson V, Clark DB. A comparison of the collaborative scientific argumentation practices of two high and two low performing groups. Research in Science Education. 2011;41(1):63–97. doi: 10.1007/s11165-009-9146-9.
  • Sampson V, Gleim L. Argument-driven inquiry to promote the understanding of important concepts & practices in biology. The American Biology Teacher. 2009;71(8):465–472. doi: 10.2307/20565359.
  • Sampson V, Grooms J, Walker JP. Argument-driven inquiry as a way to help students learn how to participate in scientific argumentation and craft written arguments: An exploratory study. Science Education. 2011;95(2):217–257. doi: 10.1002/sce.20421.
  • Sandoval WA, Millwood KA. The quality of students’ use of evidence in written scientific explanations. Cognition and Instruction. 2005;23(1):23–55. doi: 10.1207/s1532690xci2301_2.
  • Saribas D, Çetinkaya E. Pre-service teachers’ analysis of claims about COVID-19 in an online course. Science & Education. 2021;30(2):235–266. doi: 10.1007/s11191-020-00181-z.
  • Schnotz W, Lowe RK. A unified view of learning from animated and static graphics. In: Lowe RK, Schnotz W, editors. Learning with animation: Research implications for design. Cambridge University Press; 2008. pp. 304–357.
  • Setozaki N, Suzuki K, Iwasaki T, Morita Y. Development and evaluation of the usefulness of collaborative learning on the tangible AR learning equipment for astronomy education. Educational Technology Research. 2017;40(1):71–83.
  • Shelton, B. E., & Hedley, N. R. (2002). Using augmented reality for teaching earth-sun relationships to undergraduate geography students. In Augmented Reality Toolkit, The First IEEE International Workshop (8).
  • Sin AK, Zaman HB. Live solar system (LSS): Evaluation of an augmented reality book-based educational tool. In: Proceedings of 2010 International Symposium on Information Technology. 2010;1:1–6.
  • Squire K. From content to context: Videogames as designed experience. Educational Researcher. 2006;35(8):19–29. doi: 10.3102/0013189X035008019.
  • Squire KD, Jan M. Mad City Mystery: Developing scientific argumentation skills with a place-based augmented reality game on handheld computers. Journal of Science Education and Technology. 2007;16(1):5–29. doi: 10.1007/s10956-006-9037-z.
  • Stark R, Puhl T, Krause UM. Improving scientific argumentation skills by a problem-based learning environment: Effects of an elaboration tool and relevance of student characteristics. Evaluation & Research in Education. 2009;22(1):51–68. doi: 10.1080/09500790903082362.
  • Syawaludin A, Gunarhadi, Rintayati P. Development of augmented reality-based interactive multimedia to improve critical thinking skills in science learning. International Journal of Instruction. 2019;12(4):331–344. doi: 10.29333/iji.2019.12421a.
  • Taylor, C. A. (1987). Science education and information transfer (pp. 1–15). Oxford: Pergamon (for ICSU Press).
  • Toulmin, S. E. (1990). The uses of argument (10th ed.). Cambridge University Press.
  • Venville GJ, Dawson VM. The impact of a classroom intervention on grade 10 students’ argumentation skills, informal reasoning, and conceptual understanding of science. Journal of Research in Science Teaching. 2010;47(8):952–977.
  • Virata, R. O., & Castro, J. D. L. (2019). Augmented reality in science classroom: Perceived effects in education, visualization and information processing. In Proceedings of the 10th International Conference on E-Education, E-Business, E-Management and E-Learning (pp. 85–92).
  • Von Aufschnaiter C, Erduran S, Osborne J, Simon S. Arguing to learn and learning to argue: Case studies of how students’ argumentation relates to their scientific knowledge. Journal of Research in Science Teaching. 2008;45(1):101–131. doi: 10.1002/tea.20213.
  • Wu YT, Tsai CC. High school students’ informal reasoning on a socio-scientific issue: Qualitative and quantitative analyses. International Journal of Science Education. 2007;29(9):1163–1187. doi: 10.1080/09500690601083375.
  • Yen JC, Tsai CH, Wu M. Augmented reality in the higher education: Students’ science concept learning and academic achievement in astronomy. Procedia-Social and Behavioral Sciences. 2013;103:165–173. doi: 10.1016/j.sbspro.2013.10.322.
  • Yildirim I, Seckin-Kapucu M. The effect of augmented reality applications in science education on academic achievement and retention of 6th grade students. Journal of Education in Science Environment and Health. 2020;7(1):56–71.
  • Yu, K. C., & Sahami, K. (2007). Visuospatial astronomy education in immersive digital planetariums. Communicating Astronomy with the Public, 242–245.
  • Yuen S, Yaoyuneyong G, Johnson E. Augmented reality: An overview and five directions for AR in education. Journal of Educational Technology Development and Exchange. 2011;4(1):119–140. doi: 10.18785/jetde.0401.10.
  • Zhang J, Sung Y-T, Hou H-T, Chang K-E. The development and evaluation of an augmented reality-based armillary sphere for astronomical observation instruction. Computers & Education. 2014;73:178–188. doi: 10.1016/j.compedu.2014.01.003.
  • Zohar A, Nemet F. Fostering students’ knowledge and argumentation skills through dilemmas in human genetics. Journal of Research in Science Teaching. 2002;39(1):35–62. doi: 10.1002/tea.10008.


Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

  • Jason E. Dowd
  • Robert J. Thompson
  • Leslie A. Schiff
  • Julie A. Reynolds

*Address correspondence to: Jason E. Dowd (E-mail: [email protected]).

Department of Biology, Duke University, Durham, NC 27708


Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Developing critical-thinking and scientific reasoning skills is a core learning objective of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students’ development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference, while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students’ writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference) can actually improve students’ scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether or not they intend to pursue a career in science or engineering. Consistent with the view of learning as construction of understanding and meaning (National Research Council, 2000), the pedagogical practice of writing has been found to be effective not only in fostering the development of students’ conceptual and procedural knowledge (Gerdeman et al., 2007) and communication skills (Clase et al., 2010), but also scientific reasoning (Reynolds et al., 2012) and critical-thinking skills (Quitadamo and Kurtz, 2007).

Critical thinking and scientific reasoning are similar but distinct constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct (Holyoak and Morrison, 2005), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific (Beyer et al., 2013). Some researchers, such as Lawson (2010), present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. There are, however, few coherent frameworks and little empirical evidence regarding either the general or domain-specific interrelationships of scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe (American Association for the Advancement of Science, 2011). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” (Woodin et al., 2010, pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible ( Reynolds et al. , 2012 ) and learning how to “think like” and “write like” disciplinary experts ( Meizlish et al. , 2013 ). As a result, student writing affords the opportunities to both foster and examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

Advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” ( Facione, 1990, p. 3 ). Although various other definitions of critical thinking have been proposed, researchers have generally coalesced around this consensus expert view ( Blattner and Frazier, 2002 ; Condon and Kelly-Riley, 2004 ; Bissell and Lemons, 2006 ; Quitadamo and Kurtz, 2007 ) and the corresponding measures of critical-thinking skills ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education ( Quitadamo and Kurtz, 2007 ). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” ( Quitadamo and Kurtz, 2007 , p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction ( Hand and Keys, 1999 ), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components—inquiry, writing, collaboration, and reflection—has been linked to critical thinking ( Stephenson and Sadler-McKnight, 2016 ). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach for enhancing critical thinking. Across studies, authors advocate adopting critical thinking as the course framework ( Pukkila, 2004 ) and developing explicit examples of how critical thinking relates to the scientific method ( Miri et al. , 2007 ).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved in generating and testing hypotheses (i.e., the inductive processes used to generate hypotheses and the deductive processes used to test them), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, the conceptual and procedural knowledge and cognitive process dimensions can be both general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, among students with comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students’ scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students’ written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students’ critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students’ scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students’ individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for thesis writers at Duke University. Not all students are required to complete a thesis, but a thesis is required to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two did not consent to participate in the study, and five did not validly complete the CCTST (i.e., attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid and invalid CCTST scores. Therefore, for most of this study, we focus on the 65 students who consented to participate and for whom we have complete and valid data. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 granted access to such data; for additional analyses involving academic and background data, we therefore focus on those 52 students.
We note that the 13 students who participated but declined to share additional data performed slightly lower on the CCTST than the 52 others (perhaps suggesting that they differ by other measures, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students’ performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and the claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students’ overall strength in using reasoning to form reflective judgments about what to believe or what to do ( August, 2016 ). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate superior performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
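The reported score bands map directly onto a simple lookup. As an illustration only (the function name is ours; the cutoffs are those reported above):

```python
def cctst_band(score: int) -> str:
    """Map a 0-100 CCTST score to the performance band described above."""
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"
```

For example, an overall score of 84 falls in the "strong" band, while 62 or below indicates the skill is "not manifested."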

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of the BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but does not exhibit mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make assignment to a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On the five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
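The inter-rater agreement statistic reported here (the fraction of independent rating pairs within 1 point of each other) can be computed with a short helper. A minimal sketch using hypothetical ratings, not the study data:

```python
def within_one_agreement(rater_a, rater_b):
    """Fraction of paired ratings that differ by at most 1 point
    on the five-point BioTAP scale."""
    pairs = list(zip(rater_a, rater_b))
    close = sum(1 for a, b in pairs if abs(a - b) <= 1)
    return close / len(pairs)

# Hypothetical independent ratings from two trained raters on four dimensions
print(within_one_agreement([5, 4, 3, 2], [4, 4, 5, 2]))  # 0.75
```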

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
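The two cumulative BioTAP measures reduce to a partial sum over questions 1–5 and a loading-weighted combination of all nine dimension scores. A simplified sketch; the scores and loadings below are placeholders for illustration, not the published values from Dowd et al. (2016), and real factor scoring also involves standardization:

```python
def partial_sum_q1_5(scores):
    """Partial sum of BioTAP questions 1-5 (higher-order reasoning).
    `scores` holds the nine dimension scores in question order."""
    return sum(scores[:5])

def factor_score(scores, loadings):
    """Simplified factor score: a loading-weighted combination of the
    nine dimension scores."""
    return sum(s * w for s, w in zip(scores, loadings))

scores = [5, 4, 4, 3, 5, 4, 5, 4, 3]  # hypothetical thesis ratings, questions 1-9
loadings = [0.1] * 9                  # placeholder loadings, not published values
print(partial_sum_q1_5(scores))       # 21
```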

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among the CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson’s correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to position on a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student’s t tests to compare the mean scores of the two groups on each of the seven dimensions and the overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in a multiple linear-regression analysis to explore how much of the observed relationship between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
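The first of these steps, Pearson's correlation between paired measures, can be written out in a few lines. A self-contained sketch with toy data (in practice one would use a statistics library that also returns p values):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy example: perfectly linearly related scores give r of approximately 1.0
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))
```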

Although BioTAP scores represent discrete, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous ( Rhemtulla et al. , 2012 )—and therefore for using continuous-variable statistical methods (e.g., Pearson’s correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of the BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor scores) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single measures. When we considered individual CCTST dimensions in relation to the comprehensive BioTAP measures, we accounted for seven comparisons; similarly, when we considered individual BioTAP dimensions in relation to the overall CCTST score, we accounted for five comparisons. When all seven CCTST dimensions and five BioTAP dimensions were examined individually, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate when there are no specific preexisting hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
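The Holm-Bonferroni step-down procedure compares the smallest p value against alpha/m, the next smallest against alpha/(m - 1), and so on, stopping at the first failure. A minimal sketch (an illustrative implementation, not the authors' code):

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Return a list of booleans indicating which hypotheses are
    rejected under the Holm-Bonferroni step-down procedure."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # once one test fails, all larger p values also fail
    return reject
```

With 35 comparisons, the first (most stringent) threshold is 0.05/35, roughly 0.00143, matching the cutoff used in the Results.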

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2 ); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson’s correlations between students’ cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al. , 2016 , and the partial sum of scores on questions 1–5) and students’ overall scores on the CCTST are presented in Table 3 . We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking ( r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to overall CCTST ( r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST ( Table 3 ), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference ( r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from factor loadings published by Dowd et al. (2016) .

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing scores on each dimension of the CCTST of students who exhibit mastery with those of students who do not exhibit mastery on each dimension of BioTAP are presented in Table 4 . Focusing first on the overall CCTST scores, we found that the difference between those who exhibit mastery and those who do not in discussing implications of results (BioTAP question 5) is statistically significant ( t = 2.73, p = 0.008, d = 0.71). When we expanded the t tests to include all 35 comparisons—and, as above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not ( t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores ( t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which reveal the strength of the differences for statistically significant relationships, range from 0.71 to 0.88.

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.
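The effect sizes reported above are Cohen's d, the difference in group means divided by the pooled standard deviation. An illustrative implementation with toy groups, not the study's data:

```python
from math import sqrt

def cohens_d(group_a, group_b):
    """Cohen's d for two independent groups, using the pooled
    (sample) standard deviation in the denominator."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled = sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

# Toy CCTST scores for a mastery group and a nonmastery group
print(cohens_d([70, 75, 80, 85], [60, 65, 70, 75]))
```

By the conventional interpretation, the values of 0.71 to 0.88 reported here are medium-to-large effects.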

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.

** p < 0.01.

*** p < 0.001.
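Model 1, the simple relationship between inference and the BioTAP composite, amounts to a one-predictor least-squares fit; a bare-bones sketch with toy data (the full models in Table 5 add covariates):

```python
def simple_ols(x, y):
    """Least-squares intercept and slope for y = a + b * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Toy data: BioTAP partial sums (y) regressed on CCTST inference scores (x)
print(simple_ols([70, 75, 80, 85], [17, 18, 20, 21]))
```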

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of the undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using the CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines ( Kuhn et al. , 2008 ). Moreover, science reasoning in writing, captured in students’ undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals’ critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students’ overall critical-thinking skills ( Table 3 ). The particularly significant roles of inference skills ( Table 3 ) and the discussion of implications of results (BioTAP question 5; Table 4 ) provide a basis for more specific ideas about how these constructs relate to one another and what educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy, because these prior studies differ from the current study; the former considered critical-thinking skills as the primary learning outcome of writing-focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing. Even when we control for other potentially relevant background characteristics, the relationship is strong (Table 5).

In taking the complementary view and focusing on BioTAP, when we compared students who exhibit mastery with those who do not, we found that the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al., 2016). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses type 1 error at the expense of exacerbating type 2 error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.
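The reference list cites Holm (1979), which suggests the multiple-comparisons correction alluded to above is the Holm sequentially rejective (step-down Bonferroni) procedure. As a minimal, hedged sketch of how that procedure trades type 2 error for family-wise type 1 control, the following is a generic implementation; the p-values in the example are invented for illustration and are not the study's data.

```python
def holm_reject(p_values, alpha=0.05):
    """Holm (1979) step-down procedure.

    Returns a list of booleans, one per input p-value,
    True where the corresponding null hypothesis is rejected
    while controlling the family-wise error rate at alpha.
    """
    m = len(p_values)
    # Sort p-values ascending, remembering their original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, idx in enumerate(order):
        # The k-th smallest p-value is compared against alpha / (m - k).
        if p_values[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break  # once one comparison fails, all larger p-values fail too
    return reject

# Illustrative (made-up) p-values from four hypothetical comparisons:
print(holm_reject([0.001, 0.04, 0.03, 0.005]))  # → [True, False, False, True]
```

Unlike a plain Bonferroni correction, which holds every test to alpha/m, Holm's thresholds relax step by step (alpha/m, alpha/(m-1), ...), so it rejects at least as many hypotheses while still controlling family-wise error; even so, with many comparisons and a small sample, weak true relationships can fall below these thresholds, which is the type 2 cost the text describes.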

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and the results of other studies (with different populations) suggest that the observed relationships may occur more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So while the implications of this work most directly apply to honors thesis writers, we provisionally suggest that these relationships extend to all students, a possibility that warrants further study.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP (Reynolds et al., 2009), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses (Reynolds and Thompson, 2011; Dowd et al., 2016), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills (Quitadamo and Kurtz, 2007; Quitadamo et al., 2008), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method (Miri et al., 2007) in terms of the epistemological appreciation of the functions and coordination of evidence (Kuhn and Dean, 2004; Zeineddin and Abd-El-Khalick, 2010; Ding et al., 2016) and disciplinary paradigms of truth and justification (Moshman, 2015).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step in the direction of achieving success with more integrated learning outcomes. Hopefully, it will spur greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

ACKNOWLEDGMENTS

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

REFERENCES

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August, D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose, CA: Insight Assessment/California Academic Press.
  • Beyer, C. H., Taylor, E., & Gillmore, G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell, A. N., & Lemons, P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, 56(1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner, N. H., & Frazier, C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, 8(1), 47–64.
  • Clase, K. L., Gundlach, E., & Pelaez, N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, 38(5), 290–295.
  • Condon, W., & Kelly-Riley, D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, 9(1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding, L., Wei, X., & Liu, X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, 46(5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd, J. E., Connolly, M. P., Thompson, R. J., Jr., & Reynolds, J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, 46(1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd, J. E., Roy, C. P., Thompson, R. J., Jr., & Reynolds, J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, 92(1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd, J. E., Thompson, R. J., Jr., & Reynolds, J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 27, 36–51.
  • Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman, R. D., Russell, A. A., & Worden, K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, 36(5), 46–52.
  • Greenhoot, A. F., Semb, G., Colombo, J., & Schreiber, T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, 18(2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga, D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, 20(1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen, J. S., Bosack, T., Clay, S., McCarthy, M., Dunn, D. S., Hill, G. W., … Whitlock, K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, 30(3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand, B., & Keys, C. W. (1999). Inquiry investigation. Science Teacher, 66(4), 27–29.
  • Holm, S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, 6(2), 65–70.
  • Holyoak, K. J., & Morrison, R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly, G. J., & Takao, A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, 86(3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn, D., & Dean, D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, 5(2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn, D., Iordanou, K., Pease, M., & Wirkala, C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, 23(4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson, A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, 94(2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish, D., LaVaque-Manty, D., Silver, N., & Kaplan, M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson, R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri, B., David, B.-C., & Uri, Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, 37(4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman, D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, DC: National Academies Press.
  • Pukkila, P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, 166(1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo, I. J., Faiola, C. L., Johnson, J. E., & Kurtz, M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, 7(3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo, I. J., & Kurtz, M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, 6(2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds, J. A., Smith, R., Moskovitz, C., & Sayle, A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, 59(10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds, J. A., Thaiss, C., Katkin, W., & Thompson, R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, 11(1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds, J. A., & Thompson, R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, 10(2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla, M., Brosseau-Liard, P. E., & Savalei, V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, 17(3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson, N. S., & Sadler-McKnight, N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, 17(1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq, V. N., Stefani, L. A. J., Butcher, A. C., & Heylings, D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, 23(3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman, B. E. C., Strickland, D. C., Johnson, R. L., & Payne, J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, 36(5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping, K. J., Smith, E. F., Swanson, I., & Elliot, A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, 25(2), 149–169. https://doi.org/10.1080/713611428
  • Willison, J., & O’Regan, K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, 26(4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin, T., Carter, V. C., & Fletcher, L. (2010). Vision and Change in Biology Undergraduate Education: A call for action—Initial responses. CBE—Life Sciences Education, 9(2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin, A., & Abd-El-Khalick, F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, 47(9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman, C. (2000). The development of scientific reasoning skills. Developmental Review, 20(1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman, C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, 27(2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001

Submitted: 17 March 2017 Revised: 19 October 2017 Accepted: 20 October 2017

© 2018 J. E. Dowd et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

Center for Innovation and Education

The Role of Critical Thinking in STEAM

As educators, we must empower our students to think critically and strategically to fuel their interest in STEAM. By nurturing these skills in our students, we can ensure they possess the problem-solving traits necessary for success across all disciplines — from science and math to engineering, arts, and technology. Understanding the importance of critical thinking also gives us an appreciation for how it can help further scientific discoveries by exploring beyond Earth’s atmosphere into outer space.

Let’s take a closer look at how developing these important skills can aid those engaging with STEAM education initiatives or looking toward future perspectives on intergalactic discovery.

What is critical thinking, and why is it important in STEAM (Science, Technology, Engineering, Arts and Mathematics)?

Critical thinking is a process of analyzing, evaluating, and interpreting information in a logical and systematic way. This skill allows individuals to make informed decisions based on evidence and identify flaws or biases in their thinking. 

In the STEAM fields, the ability to think critically is crucial. It not only helps individuals to analyze data and findings but also inspires creativity and innovation. Teaching critical thinking skills from an early age encourages problem-solving and independent thinking. By fostering these skills in the classroom, students will gain a deep understanding of the concepts in STEAM fields and be better equipped to tackle real-world problems.  Read More: What is STEAM Education and Why is it Important?

Critical Thinking Fosters Innovation in Space and Space-Adjacent Industries

As the modern world continues to evolve at an unprecedented pace, so must our approach to innovation. Nowhere is this more evident than in the space and space-adjacent industries, where new technologies and bold ideas are constantly emerging. Whether designing new spacecraft or developing cutting-edge satellite technology, the ability to think critically is the key to staying ahead of the competition in the ever-expanding frontier of space exploration.

Educators play a crucial role in encouraging students to think critically about complex concepts; challenging students to approach problems from multiple perspectives can foster innovation in space science and technology. By teaching critical thinking skills, educators can equip their students with the tools they need to be successful members of the next generation of the STEAM workforce.

Five Strategies Educators Can Use to Encourage Students’ Critical Thinking Skills in the Classroom

Teaching critical thinking skills is an essential aspect of providing a well-rounded education.

To encourage critical thinking among young students, educators can utilize various strategies, including:

  • Encouraging questions
  • Promoting independent learning
  • Utilizing real-life scenarios
  • Incorporating collaborative learning activities
  • Providing opportunities for creative problem-solving

Each of these strategies involves engaging students in meaningful conversations, allowing them to explore ideas independently, and exposing them to practical situations that require critical thinking. By employing these strategies, educators can help young learners develop essential skills that will prepare them for academic and personal success in the future.

Examples of a Few Ways Professionals Use Their Skills in Space Exploration

Space exploration is an exciting and complex field that requires professionals to think critically daily. 

Critical thinking is essential to success in this industry, from designing and launching spacecraft to analyzing data collected from faraway planets and moons. Whether deciding on the most efficient route for a spacecraft, troubleshooting a technical glitch, or interpreting data from a remote rover, space exploration requires professionals to use sharp analytical abilities to make evidence-based decisions.

This is why teaching critical thinking skills is so important in this field. By exploring real-world examples of how professionals use their critical thinking skills in space exploration, students can better understand the important role these skills play in unraveling the mysteries of the universe.

What are the benefits of helping students develop critical thinking skills?

Teaching critical thinking skills sets students up for lifelong success. By developing these skills, students learn to analyze information objectively, identify assumptions, and strengthen their problem-solving abilities, all of which improve their overall academic performance. They also gain the ability to question, evaluate, and make informed decisions, which helps them excel in all facets of life beyond the classroom, both personal and professional. Additionally, critical thinking skills help students navigate complex situations and make better decisions, leading to improved communication and the ability to consider multiple perspectives.

By dedicating time and effort to teaching critical thinking skills, educators can help students become proactive learners who are more confident when making decisions and contributing to discussions on various topics. Read More: Five Benefits of Teaching STEAM

Get Resources to Help Students from Space Foundation’s Center for Innovation and Education

Careers in the space and space-adjacent industries are more available than ever before. At Space Foundation’s Center for Innovation and Education, we are dedicated to providing the next generation of space professionals with the tools they need to be successful in the industry. If you’re an educator looking for resources to expand your students’ interest in STEAM and teach critical thinking skills, we encourage you to take a look at our available programs. The Center for Innovation and Education offers STEAM lesson plans, e-learning tools, professional development opportunities, field trips, scholarships, education awards, and more.

How to Evaluate Critical Thinking in the Age of AI

  • Many campuses have taken steps to curtail the use of generative AI tools, often over fears of plagiarism—but these fears overshadow AI’s potential as a pedagogical tool.
  • Because GenAI’s responses are immediate and nonjudgmental, learners can develop their critical thinking processes as they freely reflect on thoughts, responses, and concepts.
  • GenAI has not supplanted the role of instructors in the classroom. Rather, it has become a tool that we can use to teach, inspire, and guide our learners.

Learners have embraced generative artificial intelligence (GenAI), but academic administrators and faculty appear to be more apprehensive about using this emerging technology. Since GenAI began taking hold, administrators and faculty have set policies to restrict its use. They have used AI detectors to police plagiarism (despite the inconsistent capabilities of these tools), while their offices of integrity have doled out punishments.

But as educators have learned over the past year, these interventions won’t curtail the use of generative AI by learners. Moreover, we believe there are many reasons that educators should stop resisting this technology and start enjoying the benefits GenAI has to offer.

Before we offered anything close to a salve, we wanted to know: What are some of the sources of apprehension among our colleagues? The three of us have had productive conversations on this question with professors from various institutions. Through these conversations, we learned that most faculty were concerned about the same thing: plagiarism.

As we listened, we realized that plagiarism is merely an administrative term used by academic cultures. When we set rules prohibiting plagiarism, we create a policy safety net that allows us to teach and evaluate our students’ critical thinking. We want to know of our students: Is this your own thinking? Are these your own written words?

These questions lie at the heart of our anxiety. How can we evaluate a learner’s critical thinking if the content is AI-generated? While this is a fair question, we should be asking a different one: How can we use generative AI to develop our students’ critical thinking skills?

The Limitation of Traditional Teaching

Our answer here may surprise you. Prior to having chatbots in our own classroom, we provided learners with short scenarios focused on ethical dilemmas that entry-level professionals might encounter. Each learner would take 20 minutes to think through the dilemma, generate an overview, identify stakeholders, and decide what course of action to take. We would then spend the rest of the class time in discussion.

Our students enjoyed these thought challenges. As instructors, we recognized their effectiveness in getting future business leaders to think, write, and discuss potential moments of ethical fading. We never graded these interactions, and learners never asked for points for their participation. Socrates would have been proud. With these class discussions, we had transcended transactional coursework.

But these assignments had a significant limitation: It was difficult to measure whether all learners had pushed themselves to think critically and reflect deeply about the dilemma. As in any group discussion, some were more vocal than others. Even if called on, some learners would simply parrot previous responses. Moreover, these assignments were not designed to provide students with additional instructional feedback after the in-class discussions were over.

How could we address this limitation? How could we ascertain every learner’s depth of critical thinking through this exercise? Enter ChatGPT.

Conversing With the Bots

In an October 2023 article in AACSB Insights, Anthony Hié and Claire Thouary write that "the better students are at communicating with AI, the more likely it is that they will have seamless and rewarding learning experiences as they use AI to deepen their understanding of complex concepts, find solutions to problems, or explore new areas of knowledge."

Yes, ChatGPT creates content; it can write essays, blogs, and even novels based on a simple prompt. But at the J. Whitney Bunting College of Business and Technology (CoBT) at Georgia College & State University in Milledgeville, we use it differently. Rather than worrying about how it might replace our teaching, we wanted to figure out how it could improve student learning.

After all, chatbots are, at their core, dialogical. With this in mind, we guide our learners to engineer effective prompts. We encourage them to learn how to engage in conversations with the bots. In our classes, learners discover that GenAI can serve as a tutor, an intellectual sparring partner, and a personal instructor.

Learning Through Repetition

Let’s look, for instance, at how we now ask students to think through ethical dilemmas in an in-class assignment in our undergraduate business communications course. Before the class session starts, we send students a specific prompt. We instruct them to copy and paste the entire prompt into their own ChatGPT accounts.

It’s important to note that the prompt’s rules and steps tell the bot how to behave. When we write in the prompt, “Now, please follow these steps,” we are instructing the bot to follow those exact steps. The learner is identified as the “user” in this context.

Once the learner submits the prompt, ChatGPT will create an ethical dilemma for the learner, along with the three discussion questions and the required list of components the learner must address. Until the learner has answered the questions and provided the information itemized on the list, the system will continue to request that the learner satisfy all components. (These components are listed a through d, as noted below.)

Once the learner gives the required responses, ChatGPT then will become the expert debater and present a response that questions the learner’s stance by offering the opposite perspective. The student will then respond to that “debate,” and then ChatGPT will evaluate the learner’s final response.

Here is the prompt we use for this assignment:

Act as an expert professor in business ethics. Create an ethical dilemma that involves an entry-level finance employee.

Rule: The dilemma should be complex. Right versus wrong should not be explicit. Please do NOT provide analysis.

Now, please follow these steps:

1. Create three discussion questions.

2. After the user’s response, create three more questions, UNLESS the response does NOT include all the following components:

a) An overview of the ethical situation

b) A list of options

c) A list of stakeholders

d) A recommended action

3. If the responses are missing any of the components, please ask the user to provide the missing component.

4. If all the components are provided, then act as expert debater and present an opposite perspective.

5. Wait for a final statement from the user.

6. Once the user provides the final statement, evaluate the quality of the responses based on the detail of the user’s responses, user’s use of evidence, and ethical validity.

The prompt creates an individual dilemma, and learners must work through it step by step. This prompt focuses on finance, but we can modify it to focus on any industry. The benefit of these in-class conversations with ChatGPT is that learners often go beyond initial levels of thinking about the ethicality of the dilemma.

In fact, learners reach secondary and tertiary levels of thinking. They ask themselves more nuanced questions: Why does a particular response matter? What are the implications of good or bad decisions? What learned concepts can be applied to making ethical decisions and acting ethically?
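The prompt's rules amount to a small state machine: check the learner's response for the four required components, request anything that is missing, present the opposing perspective once the response is complete, and evaluate after the final statement. The Python sketch below models that control flow purely as an illustration; the component check is a naive keyword scan, and every name in it is hypothetical, since in the actual assignment ChatGPT performs this logic from the natural-language prompt itself.

```python
# Illustrative model of the prompt's rules (steps 2-6). A real chatbot judges
# meaning, not keywords; this sketch only mirrors the decision structure.

REQUIRED = {
    "overview": "an overview of the ethical situation",        # component a
    "options": "a list of options",                            # component b
    "stakeholders": "a list of stakeholders",                  # component c
    "recommendation": "a recommended action",                  # component d
}

def missing_components(response: str) -> list[str]:
    """Return the required components (a-d) absent from a learner response."""
    text = response.lower()
    return [label for key, label in REQUIRED.items() if key not in text]

def next_bot_action(response: str, final_statement_given: bool) -> str:
    """Decide the bot's next step according to the prompt's rules."""
    if final_statement_given:
        return "evaluate"  # step 6: judge detail, evidence, ethical validity
    gaps = missing_components(response)
    if gaps:
        # step 3: ask the user to supply whatever is missing
        return "request: " + ", ".join(gaps)
    return "debate"  # step 4: act as expert debater, present the opposite view
```

Because Python dictionaries preserve insertion order, any missing components are reported in the prompt's a-through-d order.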

The point of critically writing through these dilemmas is not to bring about ethical epiphanies, since such epiphanies are hard to sustain. Instead, by regularly assigning these writing exercises, we want students to create muscle memory, as Brooke Deterline describes in her 2012 TED talk on creating ethical cultures in business. Through such repetition, learners are more likely to acquire ethical reflexes that guard against the potential risks of ethical fading.

Learning Without Judgment

Another important benefit to using generative AI tools for critical thinking in the classroom is that each tool acts as a nonjudgmental collaborator. This means learners can converse with the tool, asking any question they want without the collaborator judging that question as “stupid” or “unworthy.”

GenAI’s nonjudgmental, in-depth responses ultimately help learners develop their own critical thinking processes, because the platform allows them to play with and reflect upon a variety of thoughts, responses, and concepts. They feel free to ask questions, challenge their own perspectives, and allow the bot to help shape and organize their thinking. We cannot overstate the value to learners of playing with questions, thoughts, and concepts.

In a September 2023 article published by Harvard Business Publishing–Education, Ethan Mollick and Lilach Mollick note that the instantaneous feedback from ChatGPT adds significant educational benefits. Learners often have attention and distraction issues, but AI tools can instantly generate feedback, which means learners don't have to wait to see if their responses can be better developed.

Revolutionized, Not Replaced

As we have found, GenAI has not supplanted the role of instructors. On the contrary, after our students’ initial independent conversations with the bots, our facilitated class discussions are much more focused and informed. We can select one dilemma from the group and discuss it in detail, and those discussions are lively and provocative. Now, everyone has self-developed perspectives. We still find room to teach, inspire, and guide our learners.

To further ensure accountability, we require students to submit their conversations to our learning management system, a process that requires just the click of a button. We then can review and evaluate each learner’s response.

At the end of the day, generative AI isn’t going away. As Microsoft and Google integrate generative AI into their word processing software, instructors need to adapt. This is why the CoBT continues to expand its work with this technology. For instance, we have established an AI Lab, and we offer GenAI workshops for the campus and broader communities in Georgia. We continue to bring in industry leaders to engage our campus community on the topic, and we collaborate on AI projects with students and faculty outside the CoBT.

We must continue to innovate to make the best use of all AI has to offer. Let’s use this revolutionary tool to help ourselves and our learners become better thinkers—and better people.



Prometheus is published by Pluto Journals, an Open Access publisher. This means that everyone has free and unlimited access to the full text of all articles from our international collection of social science journals. Furthermore, Pluto Journals authors don't pay article processing charges (APCs).

Twisted thinking: Technology, values and critical thinking

Abstract

Technology should be aligned with our values. We make the case that attempts to align emerging technologies with our values should reflect critically on these values. Critical thinking seems like a natural starting point for the critical assessment of our values. However, extant conceptualizations of critical thinking carve out no space for the critical scrutiny of values. We will argue that we need critical thinking that focuses on values instead of taking them as unexamined starting points. In order to play a crucial role in helping to align values and technology, critical thinking needs to be modified and refocused on values. Here, we outline what value-centred critical thinking could look like.

Author and article information

Accepting Editor: Steven Umbrello

All content is freely available without charge to users or their institutions. Users are allowed to read, download, copy, distribute, print, search, or link to the full texts of the articles in this journal without asking prior permission of the publisher or the author. Articles published in the journal are distributed under a Creative Commons Attribution 4.0 license ( http://creativecommons.org/licenses/by/4.0/ ).

Cryptopolitan

How Can Creativity and Critical Thinking Thrive in a Tech-Driven World?

In a rapidly evolving technological landscape, the symbiotic relationship between innovation and introspection stands at the forefront of professional discourse. The combination of creativity and critical thinking in the face of burgeoning technological advancements has become a focal point of contemporary discussion. Amid the proliferation of AI-powered tools and the democratization of creative resources, concerns arise about preserving originality and independent thought.

The influence of technology on creative processes

In the contemporary creative landscape, technology serves as both a facilitator and a disruptor. The advent of AI and the proliferation of tech tools have streamlined once laborious processes, rendering tasks such as research, design, and marketing analytics more accessible than ever. However, while these advancements promise efficiency, they also pose challenges to traditional modes of ideation and problem-solving. The ease of generating pre-designed templates and automated solutions may inadvertently stifle individual creativity, leading to homogenized outputs devoid of originality.

As professionals grapple with the implications of technological integration, the need to preserve the human element within creative endeavors becomes increasingly evident. While AI excels in tasks requiring data processing and pattern recognition, it often falls short in replicating the nuanced decision-making and emotional depth inherent in human creativity. Also, reliance on AI-generated outputs may obscure opportunities for serendipitous discovery and unconventional thinking, limiting the potential for truly groundbreaking innovation.

Balancing automation with human ingenuity

Striking a delicate balance between automation and human ingenuity is essential for maintaining the integrity of creative processes. While AI tools offer undeniable advantages in terms of speed and precision, they must complement rather than replace human cognition. By delineating tasks that necessitate human judgment and creativity from those amenable to automation, individuals can preserve the essence of their creative vision. Also, fostering a culture of experimentation and hands-on exploration outside the confines of digital interfaces cultivates adaptability and resilience, traits essential for navigating the complexities of modern creative practice.

As the boundaries between human and machine continue to blur, the role of technology in augmenting human creativity becomes increasingly nuanced. Rather than viewing AI as a threat to creative autonomy, professionals must embrace it as a collaborator, leveraging its capabilities to enhance rather than usurp their creative vision. By integrating technology as a tool for amplifying human ingenuity, individuals can harness its potential to unlock new realms of creative expression while remaining steadfast in their commitment to originality and independent thought.

Nurturing creativity and critical thinking

In the quest to cultivate creativity and critical thinking in a tech-driven world, one must navigate the intricate interplay between innovation and introspection with intentionality and foresight. As we venture into uncharted territory where human intelligence and artificial ingenuity converge, the future of creativity lies not in the hands of technology alone but in the symbiotic relationship between human imagination and technological innovation. 

By embracing a human-centered approach to technology adoption and prioritizing hands-on exploration and experimentation, individuals can transcend the limitations of automation to unlock the boundless potential of human creativity. In an era defined by unprecedented technological advancement, how can individuals harness the power of technology to amplify rather than diminish their capacity for original thought and expression?

Christopher Dwyer Ph.D.

The Relationship Between Critical Thinking and Critical Theory

Comparing approaches.

Posted February 15, 2024 | Reviewed by Gary Drevitch

  • Critical theory is a way of identifying, critiquing, and challenging social dynamics and power structures.
  • Modern critical theory seems to skip a lot of steps associated with logic and mechanisms of good thinking.
  • Human beings think in hierarchically structured fashion, and they develop social groups in a similar manner.

I recently asked a fellow academic, in conversation, how they try to integrate critical thinking into their classroom, and they replied that they don't have "much time for that kind of thing." I quickly realised that they didn't know what I was talking about and had likely confused it with something else. This shouldn't have been entirely surprising to me, given that research by Lloyd and Bahr (2010) indicates that, unfortunately, many educators are not au fait with what critical thinking actually is. Following further conversation, I came to understand what this academic was referring to: critical theory. This was neither the first time I'd encountered such confusion of terms, nor the first time I'd heard criticism of the field.

What Critical Theory Is

I recognise that the phrasing "critical lens" one often hears in educational contexts might be a bit ambiguous and could be perceived in various ways. Critical thinking is many things, but one thing it is not is critical theory. Critical theory is an arts and humanities approach to identifying, critiquing, and challenging social dynamics and power structures within society (e.g., see Tyson, 2023; Marcuse, 1968). Simply, it's a critique of society; hence the name—though some in the field would dispute this and maintain that it is closely associated with our beloved critical thinking. I would argue that such people would fit in well with the aforementioned cohort of people who don't really understand what critical thinking is.

The critical theory approach developed out of post-World War II German social climates as a means of exploring how Germany and, indeed, Europe got to where they were at that point in time. This is reasonable; indeed, psychology was interested in these implications as well (e.g., consider the work of Milgram and Asch). Critical theory grew from there into other socially aware applications. Despite methodological concerns, there is some good work done through critical theory. However, there is also considerable poor research done in this area. I would argue that the core reason for this is that the approach is often founded in bias. That is, unfortunately, a lot of modern critical theory starts with the proposition that some dynamic is "bad." Now, I'm not saying that many of the dynamics often under investigation aren't bad, but starting research on the basis of a biased perspective doesn't sound like a particularly promising rationale. Where's the critical thinking? Where's the evaluation? If you truly care about the topic, apply critical thinking from an unbiased perspective. Modern critical theory seems to skip a lot of steps associated with logic and the mechanisms of good thinking.

The purpose of this brief discussion on critical theory is two-fold. First, there has been "considerable" growth in the field in recent years (e.g., critical theory student numbers, a growing presence in popular society, and growing inclusion in educational curricula), which is concerning given the rationale above. Second, consistent with my observation in the introduction, its name is unfortunately similar to "critical thinking," and thus the two are often confused with one another. Please, don’t make this mistake.

Power Structures

Similar to the aforementioned negative social dynamics, I’m not saying that power structures don’t exist either. Look at families: Parents hold "power" over their children. Look at jobs: Employees are under the power of their managers, who are under the power of other managers, and so on. Indeed, depending on what country you live in, your government has varying levels of power over those it governs (e.g., with respect to law- and policy-making). Some will argue that the people should govern themselves by voting in law- and policy-makers as representatives, which seems reasonable to me, but not all governments work this way; that’s politics for you (largely belief-led), so what can you do? "Think critically about it" would be a reasonable response in the context of this page, and that is notably distinct from engaging in critical theory.

The point is that such "structures" are naturally occurring. Human beings think in a hierarchically structured fashion (e.g., through schema construction, classification, and categorisation), and they develop social groups in a similar manner. That’s not to say that we should accept such structures in all situations, but no amount of academia is likely to change human nature; believe me, we’ve been trying to get people to think critically for a long time. Another important consideration in recognising this commonality is our expectation of these structures. Unfortunately, because we expect to see them everywhere, we wind up creating many of them, through our interpretations, when they might not even exist.

So, if you approach your research from the premise that some group experiences, for example, a less-than-desirable event or condition, it is very easy, without the application of critical thinking, to attribute those negative outcomes to some other group, in a sort of causal relationship. The problem is that, rather than being a conclusion (a leads to b), this is often the starting point of the research, which then biases the methodology and its outcomes. For example, in an effort not to single out any particular group, let’s say I’m studying some topic from a Zuggist perspective (I made up the word/group "Zug"). Given that I side with Zuggists, and might even be a Zug myself, the chances are better than not that I will report something biased in favour of Zugs. To me, that’s not good research.

Again, I’m not saying that all research from a critical theory approach is like this, but, unfortunately, a noticeable amount of it is. Sure, every field has its barriers and "crises" from time to time: Psychology has been battling a replicability crisis in recent years. However, at least psychology (for the most part) recognises the importance of replicability and other research mechanisms associated with good methodology. I have concerns about that with respect to critical theory.

All in all, critical theory doesn’t mean much to me, but, for now, as my fellow academic said in the introduction, "I don’t have much time for that kind of thing." So, why bother talking about it here? This page is focused on critical thinking and good decision-making. These are the outcomes in which I and readers of this blog are interested, alongside learning more about how we can enhance them. It’s difficult enough conceptualising and describing critical thinking without having something similarly named adding further confusion. I’m not blaming anyone for the manner in which they coined the term "critical theory"; however, I think it important that people from all walks of life know the differences between the two, because those differences are many and important.

Lloyd, M., & Bahr, N. (2010). Thinking critically about critical thinking in higher education. International Journal for the Scholarship of Teaching and Learning, 4(2), 1–16.

Marcuse, H. (1968). Philosophy and critical theory. In Negations: Essays in critical theory (J. J. Shapiro, Trans., pp. 134–158). Beacon Press.

Tyson, L. (2023). Critical Theory Today: A User-Friendly Guide . Taylor & Francis.

Christopher Dwyer Ph.D.

Christopher Dwyer, Ph.D., is a lecturer at the Technological University of the Shannon in Athlone, Ireland.

Creative Collisions: Crossing the Art-Science Divide

Image: “Creating Art, Thinking Science.” Credit: HErickson/MIT

A partnership between ACT and MIT.nano, the class “Creating Art, Thinking Science” asks what it really takes to cultivate dialogue between disciplines.

Editorial direction by Leah Talatinian; written by Matilda Bathurst, Arts at MIT.

MIT has a rich history of productive collaboration between the arts and the sciences, anchored by the conviction that these two conventionally opposed ways of thinking can form a deeply generative symbiosis that serves to advance and humanize new technologies.

This ethos was made tangible when the Bauhaus artist and educator György Kepes established the MIT Center for Advanced Visual Studies (CAVS) within the Department of Architecture in 1967. CAVS has since evolved into the Art, Culture, and Technology (ACT) program, which fosters close links to multiple other programs, centers, and labs at MIT. The class “Creating Art, Thinking Science” (4.373/4.374), open to undergraduates and master’s students of all disciplines as well as certain students from Harvard Graduate School of Design (GSD), is one of the program’s most innovative offerings, proposing a model for how the relationship between art and science might play out at a time of exponential technological growth.

Now in its third year, the class is supported by an Interdisciplinary Class Development Grant from the MIT Center for Art, Science & Technology (CAST) and draws upon the unparalleled resources of MIT.nano: an artist’s high-tech toolbox for investigating the hidden structures and beauty of our material universe.


High ambitions and critical thinking

The class was initiated by Tobias Putrih, lecturer in ACT, and is taught with the assistance of Ardalan SadeghiKivi MArch ’23 and Aubrie James SMACT ’24. Central to the success of the class has been the collaboration with co-instructor Vladimir Bulović, the founding director of MIT.nano and Fariborz Maseeh Chair in Emerging Technology, who has positioned the facility as an open-access resource for the campus at large, including MIT’s community of artists. “Creating Art, Thinking Science” opens up the 100,000 square feet of cleanroom and lab space within the Lisa T. Su Building, inviting participating students to take advantage of cutting-edge equipment for nanoscale visualization and fabrication; in the hands of artists, devices for discovering nanostructures and manipulating atoms become tools for rendering the invisible visible and deconstructing the dynamics of perception itself.

The expansive goals of the class are tempered by an in-built criticality. “ACT has a unique position as an art program nested within a huge scientific institute—and the challenges of that partnership should not be underestimated,” reflects Putrih. “Science and art are wholly different knowledge systems with distinct historical perspectives. So, how do we communicate? How do we locate that middle ground, that third space?”

An evolving answer, tested and developed throughout the partnership between ACT and MIT.nano, involves a combination of attentive mentorship and sharing of artistic ideas, combined with access to advanced technological resources and hands-on practical training.

“MIT.nano currently accommodates more than 1,200 individuals to do their work, across 250 different research groups,” said Bulović. “The fact that we count artists among those is a matter of pride for us. We’ve found that the work of our scientists and technologists is enhanced by having access to the language of art as a form of expression—equally, the way that artists express themselves can be stretched beyond what could previously be imagined, simply by having access to the tools and instruments at MIT.nano.”


An empirical analysis of the relationship between nature of science and critical thinking through science definitions and thinking skills

  • Original Paper
  • Open access
  • Published: 08 December 2022
  • Volume 2, article number 270 (2022)


  • María Antonia Manassero-Mas, ORCID: orcid.org/0000-0002-7804-7779
  • Ángel Vázquez-Alonso, ORCID: orcid.org/0000-0001-5830-7062


Critical thinking (CRT) skills transversally pervade education and nature of science (NOS) knowledge is a key component of science literacy. Some science education researchers advocate that CRT skills and NOS knowledge have a mutual impact and relationship. However, few research studies have undertaken the empirical confirmation of this relationship and most fail to match the two terms of the relationship adequately. This paper aims to test the relationship by applying correlation, regression and ANOVA procedures to the students’ answers to two tests that measure thinking skills and science definitions. The results partly confirm the hypothesised relationship, which displays some complex features: on the one hand, the relationship is positive and significant for the NOS variables that express adequate ideas about science. However, it is non-significant when the NOS variables depict misinformed ideas about science. Furthermore, the comparison of the two student cohorts reveals that two years of science instruction do not seem to contribute to advancing students’ NOS conceptions. Finally, some interpretations and consequences of these results for scientific literacy, teaching NOS (paying attention both to informed and misinformed ideas), for connecting NOS with general epistemic knowledge, and assessing CRT skills are discussed.
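The analytic procedures the abstract names (correlation and ANOVA applied to two test scores) can be illustrated with a minimal stdlib-only sketch. Everything below is an assumption for illustration: the scores, cohort split, and the names `pearson_r`/`anova_f` are invented and are not the paper's data or instruments.

```python
# Minimal sketch of the kind of analysis described in the abstract:
# correlating NOS (nature of science) scores with CRT (critical thinking)
# scores, and an ANOVA-style cohort comparison. All data are invented.
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def anova_f(*groups):
    """One-way ANOVA F statistic for two or more groups."""
    grand = mean(x for g in groups for x in g)
    n = sum(len(g) for g in groups)
    k = len(groups)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical per-student scores: informed-NOS ideas vs. a CRT skills test.
nos = [3.1, 2.8, 3.6, 2.2, 3.9, 3.0, 2.5, 3.4]
crt = [14, 12, 17, 9, 18, 13, 11, 16]
print(f"r(NOS, CRT) = {pearson_r(nos, crt):.2f}")

# Hypothetical two-cohort comparison of NOS scores (e.g., before vs. after
# two years of science instruction), mirroring the ANOVA procedure.
print(f"F = {anova_f(nos[:4], nos[4:]):.2f}")
```

With real data one would typically reach for `scipy.stats.pearsonr` and `scipy.stats.f_oneway`, which also return p-values; the stdlib version above just keeps the arithmetic visible.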


Introduction

Among other objectives, school science education perennially aims to improve scientific literacy for all, which involves being useful and functional for making adequate and sound personal and social daily life decisions. An essential component of scientific literacy is the knowledge “about” science, that is, knowledge about how science works and validates its knowledge and intervenes in the world (along with technology). This study focuses on the knowledge about science, which is often referred to in the literature as nature of science (NOS), scientific practice, ideas about science, etc., in turn, related to a continuous innovative teaching tradition (Vesterinen et al., 2014 ; Khishfe, 2012 ; Lederman, 2007 ; Matthews, 2012 ; McComas, 1996 ; Olson, 2018 ; among others).

On the other hand, some international reports and experts state that critical thinking (CRT) skills are key and transversal competencies for all educational levels, subjects and jobs in the 21st century. For instance, the European Union ( 2014 ) proposes seven key competencies that require developing a set of transversal skills, namely CRT, creativity, initiative, problem-solving, risk assessment, decision-making, communication and constructive management of emotions. In the same vein, the National Research Council ( 2012 ) proposes the transferable knowledge and skills for life and work, which explicitly details the following skills: argumentation, problem-solving, decision-making, analysis, interpretation, creativity, and others. In short, these and many other proposals converge in pointing out that teaching students to think and educating in CRT skills is an innovative and significant challenge for 21st century education and, of course, for science education. The CRT construct has been widely developed within psychological research. Yet, the field is complex, and terminologically bewildering (i.e., higher-order skills, cognitive skills, thinking skills, CRT, and other terms are used interchangeably), and some controversies are still unresolved. For instance, scholars do not agree on a common definition of CRT, and the most appropriate set of skills and dispositions to depict CRT is also disputed. As the differences among scholars still persist, the term CRT will be adopted hereafter to generally describe the variety of higher-order thinking skills that are usually associated in the CRT literature.

Further, some science education research currently suggests connections between NOS and CRT, arguing that CRT skills and NOS knowledge are related. Some claim that thinking skills are key to learning NOS (Erduran & Kaya, 2018 ; Ford & Yore, 2014 ; García-Mila & Andersen, 2008 ; Simonneaux, 2014 ), and specifically, that argumentation skills may enhance NOS understanding (Khishfe et al., 2017 ). In contrast, as argumentation skills are a key competence for the construction and validation of scientific knowledge, other studies claim that NOS knowledge (i.e., understanding the differences between data and claims) is also key to learning CRT skills such as argumentation (Allchin & Zemplén, 2020 ; Greene et al., 2016 ; Settlage & Southerland, 2020 ). Both directions of this intuitive relationship between CRT skills and NOS are fruitful ways to enhance scientific literacy and general learning. Hence, this study aims to empirically explore the NOS-CRT relationship, as the prior literature is somewhat mystifying and its contributions are limited, as will be shown below.

Theoretical contextualization

This study copes with two different, vast and rich realms of research, namely NOS and CRT, and their theoretical frameworks: the interdisciplinary context of philosophy, sociology, and history of science and science education for NOS; and psychology and general education for CRT skills. Both frameworks are summarized below to meet the journal space limitations.

Under the NOS label, science education has developed a fertile and vast realm of “knowledge about scientific knowledge and knowing”, which is obviously a particular case of human thinking, and probably the most developed to date. NOS represents the meta-cognitive, multifaceted and dynamic knowledge about what science is and how science works as a social way of knowing and explaining the natural world (knowledge construction and validation). This knowledge has been interdisciplinarily elaborated from history, philosophy, sociology of science and technology, and other disciplines. Scholars raised many and varied NOS issues (Matthews, 2012 ), which are relevant to scientific research and widely surpass the reduced consensus view (Lederman, 2007 ). Despite NOS complexity, it has been systematized across two broad dimensions: epistemological and social (Erduran & Dagher, 2014 ; Manassero-Mass & Vazquez-Alonso, 2019 ). The epistemological dimension refers to the principles and values underlying knowledge construction and validation, which are often described as the scientific method, empirical basis, observation, data and inference, tentativeness, theory and law, creativity, subjectivity, demarcation, and many others. The social dimension refers to the social construction of scientific knowledge and its social impact. It often deals with the scientific community and institutions, social influences, and general science-technology-society interactions (peer evaluation, communication, gender, innovation, development, funding, technology, psychology, etc.).

From its beginnings, NOS research has agreed that students (and teachers) hold inadequate and misinformed beliefs on NOS issues across different educational levels and contexts. Further, researchers agree that effective NOS teaching requires explicit and reflective methods to overcome the many learning barriers (Bennássar et al., 2010; García et al., 2011; Cofré et al., 2019; Deng et al., 2011). These barriers relate to the basic processes of gathering (observation) and elaborating (analysis) data and to decision-making in science; specifically, the inability to differentiate facts and explanations and to adequately coordinate evidence, justifications, arguments and conclusions; the lack of elementary meta-cognitive and self-regulation skills (i.e., the quick jump to conclusions as self-evident); and the introduction of personal opinions, inferences, and reinterpretations and the dismissal of counter-arguments or evidence that may contradict personal ideas (García-Mila & Andersen, 2008; McDonald & McRobbie, 2012).

As these barriers point directly to the general abilities involved in thinking (observation, analysis, answering questions, solving problems, decision-making and the like), researchers attribute those difficulties to the lack of the cognitive skills involved in the adequate management of the barriers, whose higher-order cognitive nature corresponds to many CRT skills (Kolstø, 2001 ; Zeidler et al., 2002 ). Thus, the solutions to overcome the barriers imply mastering the CRT skills, and, consequently, achieving successful NOS learning (Ford & Yore, 2014 ; McDonald & McRobbie, 2012 ; Simonneaux, 2014 ). Erduran and Kaya ( 2018 ) argue that the perennial aim of developing students’ and teachers’ NOS epistemic insights still remains a challenge for science education, despite decades of NOS research, due to the many aspects involved. They conclude that NOS knowledge critically demands higher-order cognitive skills. The paragraphs below elaborate on these higher-order cognitive skills or CRT skills.

Critical thinking

As previously stated, the CRT field shows many differences in scholarly knowledge on the conceptualization and composition of CRT. Ennis’ (1996) simple definition of CRT as reasonable reflective thinking focused on deciding what to believe or do is likely the most celebrated among many others. A Delphi panel of experts defined CRT as an intentional and self-regulated judgment, which results in interpretation, analysis, evaluation and inference, as well as the explanation of the evidentiary, conceptual, methodological, criterial or contextual considerations on which that judgment is based (American Psychological Association, 1990).

However, the varied set of skills associated with CRT is controversial (Fisher, 2009 ). For instance, Ennis ( 2019 ) developed an extensive conception of CRT through a broad set of dispositions and abilities. Similarly, Madison ( 2004 ) proposed an extensive and comprehensive list of skills (Table 1 ).

The development of CRT tests has contributed to clarifying the relevance of the many CRT skills, as the test’s functionality requires concentrating on a few skills. For instance, Halpern’s ( 2010 ) questionnaire assesses, through everyday situations, problem-solving, verbal reasoning, probability and uncertainty, hypothesis-testing, argument analysis and decision-making. Watson and Glaser’s ( 2002 ) instrument assesses deduction, recognition of assumptions, interpretation, inference, and evaluation of arguments. The California Critical Thinking Skills Test assesses analysis, evaluation, inference, deduction and induction (Facione et al., 1998 ). It is also worth mentioning that most CRT tests target adults, although the Cornell Critical Thinking Tests (Ennis & Millman, 2005 ) were developed for a variety of young people and address several CRT skills (X test, induction, deduction, credibility, and identification of assumptions; Class Test, classical logical reasoning from premises to conclusion, etc.). The large number of CRT skills led scholars to perform efforts of synthesis and refinement that are summarized through some exemplary proposals (Table 1 ).

The CRT psychological framework presented above places the complex set of skills within the high-level cognitive constructs whose practice involves a self-directed, self-disciplined, self-supervised, and self-corrective way of thinking that presupposes conscious mastery of skills and conformity with rigorous quality standards. In addition to skills, CRT also involves effective communication and attitudinal commitment to intellectual standards to overcome the natural tendencies to fallacy and bias (self-centeredness and socio-centrism).

Science education and thinking skills

CRT skills mirror the scientific reasoning skills of scientific practice, and vice versa, based on their similar contents. This intuitive resemblance may launch expectations of their mutual relationship. Science education research has increased attention to CRT skills as promotors of meaningful learning, especially when involving NOS and understanding of socio-scientific issues (Vieira et al., 2011 ; Torres & Solbes, 2016 ; Vázquez-Alonso & Manassero-Mas, 2018 ; Yacoubian & Khishfe, 2018 , among others). Furthermore, Yacoubian ( 2015 ) elaborated several reasons to consider CRT a fundamental pillar for NOS learning.

Some authors stress the convergence between science and CRT based on the word critical , as thinking and science are both critical. Critical approaches have always been considered consubstantial to science (and likely a key factor of its success), as their range spreads from specific critical social issues (i.e., scientific controversies, social acceptance of scientific knowledge, social coping with a virus pandemic) to the socially organized scepticism of science (i.e., peer evaluation, scientific communication). The latter is considered a universal value of scientific practice to guarantee the validity of knowledge (Merton, 1968 ; Osborne, 2014 ). In the context of CRT research, the term critical involves normative ways to ensure the quality of good thinking, such as open-minded abilities and a disposition for relentless scrutiny of ideas, criteria for evaluating the goodness of thinking, adherence to the norms, standards of excellence, and avoidance of errors and fallacies (traits of poor thinking). These obviously also apply to scientific knowledge through peer evaluation practice, which represents a superlative form of good normative thinking (Bailin, 2002 ; Paul & Elder, 2008 ).

Another important feature of the convergence of CRT and science is the broad set of common skills sharing the same semantic content in both fields, even though their names may differ. Induction, deduction, abduction, and, in general, all kinds of argumentation skills, as well as problem-solving and decision-making, exemplify key tools of scientific practice to validate and defend ideas and develop controversies, discussions, and debates. Concurrently, they, too, are CRT skills (Sprod, 2014; Vieira et al., 2011; Yacoubian & Khishfe, 2018). In addition, Santos’ (2017) review suggests the following tentative list of skills: observation, exploration, research, problem-solving, decision-making, information-gathering, critical questions, reliable knowledge-building, evaluation, rigorous checks, acceptance and rejection of hypotheses, clarification of meanings, and true conclusions. Beyond skill names and focusing on their semantic content, Manassero-Mas and Vázquez-Alonso (2020a) developed a deeper analysis of the skills usually attributed to scientific thinking and critical thinking, concluding that their constituent skills are deeply intertwined and much more coincident than different. This suggests that scientific and critical thinking may be considered equivalent concepts across the many shared skills they put into practice. However, equivalence does not mean identity, as important differences may still exist. For instance, the evaluation and judgment of ideas involved in organized scientific skepticism (i.e., peer evaluation) are much more demanding and deeper in scientific practice than in daily-life thinking realms.

In sum, research on the CRT and NOS constructs is plural, as they draw from two different fields and traditions, general education and cognitive psychology, and science education, respectively. However, CRT and NOS share many skills, processes, and thinking strategies, as they both pursue the same general goal, namely, to establish the true value of knowledge claims. These shared features provide further reasons to investigate the possible relationships between NOS and CRT skills.

Research involving nature of science and thinking skills

The research involving both constructs is heterogeneous, as the operationalisations and methods are quite varied, given the pluralized nature of NOS and thinking. For example, Yang and Tsai ( 2012 ) reviewed 37 empirical studies on the relationship between personal epistemologies and science learning, concluding that research was heterogeneous along different NOS orientations: applications of Kuhn’s ( 2012 ) evolutionary epistemic categories, use of general epistemic knowledge categories, studies on epistemological beliefs about science (empiricism, tentativeness, etc.), and applications of other epistemic frameworks. The studies dealing with the epistemological beliefs about science were a minority. Another example of heterogeneity comes from Koray and Köksal’s ( 2009 ) study about the effect of laboratory instruction versus traditional teaching on creativity and logical thinking in prospective primary school teachers, where the laboratory group showed a significant effect in comparison to the traditional group. However, the NOS contents involved in laboratory instruction are still unclear. Dowd et al. ( 2018 ) examined the relationship between written scientific reasoning and eight specific CRT skills, finding that only three aspects of reasoning were significantly related to one skill (inference) and negatively to argument.

A series of studies suggest implicit relationships between NOS and thinking skills. Yang and Tsai ( 2010 ) interviewed sixth-graders to examine two uncertain science-related issues, finding that children who developed more complex (multiplistic) NOS knowledge displayed better reflective thinking and coordination of theory and evidence. Dogan et al. ( 2020 ) compared the impact of two epistemic-based methodologies (problem-based and history of science) on the creativity skills of prospective primary school teachers, finding that the problem-solving approach was more effective in increasing students’ creative thinking. Khishfe ( 2012 ) and Khishfe et al. ( 2017 ) found no differences in decision-making and argumentation in socio-scientific issues regarding NOS knowledge, but more participants in the treatment groups referred their post-decision-making factors to NOS than the other groups. Other studies found relationships between NOS understanding and variables that do not match CRT skills precisely. For instance, Bogdan ( 2020 ) found that inference and tentativeness relate to attitudes toward the role of science in social progress, but creativity does not, and the same applies to the acceptance of the evolution theory (Cofré et al., 2017 ; Sinatra et al., 2003 ).

Another set of studies comes from science education research on argumentation, which is based on the rationale that argumentation is a key scientific skill for validating knowledge in scientific practice. Thus, reasoning skills should be related to NOS understanding. Students who viewed science as dynamic and changeable were likely to develop more complex arguments (Stathopoulou & Vosnidou, 2007 ). In a floatation experience, Zeineddin and Abd-El-Khalick ( 2010 ) found that the stronger the epistemic commitments, the greater the quality of the scientific reasoning produced by the individuals. Accordingly, the term epistemic cognition of scientific argumentation has been coined, although specific research on argumentation and epistemic cognition is still relatively scarce (He et al., 2020 ).

Weinstock’s ( 2006 ) review suggested that people’s argumentation skills develop in proportion to their epistemic development, which Noroozi ( 2016 ) also confirmed. Further, Mason and Scirica ( 2006 ) studied the contribution of general epistemological comprehension to argumentation skills in two readings, finding that participants at the highest level of epistemic comprehension (evaluative) generated better quality arguments than participants at the previous multiplistic stage (Kuhn, 2012 ). In addition, the review of Rapanta et al. ( 2013 ) on argumentative competence proposed a three-dimensional hierarchical framework, where the highest level is epistemological (the ability to evaluate the relevance, sufficiency, and acceptability of arguments). Again, Henderson et al. ( 2018 ) discussed the key challenges of argumentation research and pointed to students’ shifting epistemologies about what might count as a claim or evidence or what might make an argument persuasive or convincing, as well as developing valid and reliable assessments of argumentation. On the contrary, Yang et al. ( 2019 ) found no significant associations between general epistemic knowledge and the performance of scientific reasoning in a controversial case with undergraduates.

From science education, González‐Howard and McNeill ( 2020 ) analysed middle-school classroom interactions in critique argumentation when an epistemic agency is incorporated, indicating that the development of students’ epistemic agency shows multiple and conflating approaches to address the tensions inherent to critiquing practices and to fostering equitable learning environments. This idea is further developed in the special section on epistemic tools of Science Education (2020), which highlights the continual need to accommodate and adapt the epistemic tools and agencies of scientific practices within classrooms while taking into account teaching, engineering, sustainability, equity and justice (González‐Howard & McNeill, 2020 ; Settlage & Southerland, 2020 ).

Finally, some of the above-mentioned research used a noteworthy concept of epistemic knowledge (EK) as “knowledge about knowledge and knowing” (Hofer & Pintrich, 1997 ), which has been developed in mainstream general education research and involves some meta-cognitions about human knowledge that research has largely connected to general learning and CRT skills (Greene et al., 2016 ). Obviously, EK and NOS knowledge share many common aspects (epistemic), suggesting a considerable overlap between them. However, it is noteworthy that NOS research is oriented toward CRT skills impacting NOS learning, while EK research orientates toward EK impacting CRT skills and general learning.

Regarding the Likert formats of research tools, test makers are concerned with controlling response biases that cause a lack of true reflection on statement content and may damage the fidelity of data and correlations. Respondents' tendency to agree with statements (acquiescence bias) is widespread. Further, neutrality bias and polarity bias reflect respondents' propensity to choose fixed points of the scale, either the midpoints (neutrality) or the extreme scores (polarity), whether extreme high scores (positive bias) or extreme low scores (negative bias). To mitigate biases, experts recommend avoiding the exclusive use of positively worded statements within instruments and combining positive and reversed items. This recommendation is implemented here through three categories of NOS phrases that operationalize positive, intermediate and reversed statements (Vázquez et al., 2006 ; Kreitchmann et al., 2019 ; Suárez-Alvarez et al., 2018 ; Vergara & Balluerka, 2000 ). However, mixing phrase styles also harms an instrument's reliability and validity, so reliability tends to be underestimated (Suárez-Alvarez et al., 2018 ).

All in all, the theoretical framework is twofold: CRT and NOS research. The above-mentioned research shares the hypothesis that the relationship between NOS and CRT skills matters. However, it displays a broad heterogeneity of research methods, variables, instruments and mixed results on the NOS-CRT relationship that does not allow a common methodological standpoint. Further, mainstream research focuses on college students and argumentation skills. In this regard, this study aims to empirically examine the NOS-CRT relationship by applying standardized assessment tools for both constructs, which promotes comparability among researchers and provides quick diagnostic tools for teachers. Secondly, this study addresses younger students, which requires creating NOS and CRT tools adapted to young participants, for which some test validity and reliability data are provided. The research questions within this framework are: Do NOS knowledge and CRT skills correlate? What are the traits and limiting conditions of this relationship, if any?

Materials and methods

The data gathering took place in Spain in 2018. At that time, the enacted school curriculum lacked the international standards and specific curriculum proposals on CRT and NOS issues, so NOS issues could only be implicitly related to some curricular contents about scientific research. Despite this lack of curricular emphasis, the principals of the participating Spanish schools expressed interest in diagnosing students' thinking skills and NOS knowledge and agreed with the authors on the specific CRT and NOS skills to be tested. As the Spanish school curriculum does not emphasize CRT and NOS issues, all students can be expected to be equally (un)trained, and this context conditioned the design of tentative tests with simple contents and easily administered formats, as these are cheap and easy to administer and interpret.

Participants

The participating schools (17) included public (4) and state-funded private (13) schools spread across mixed socio-cultural contexts and large, medium, and small Spanish townships. The participating students were tested in their natural school classes (29) of the two target grades. The valid convenience samples are two cohorts of students, representing the 6th grade of Primary Education (PE6) (n = 434; 54.8% girls and 45.2% boys; mean age 11.3 years) and the 8th grade of Secondary Compulsory Education (SCE8) (n = 347; 48.5% girls and 51.5% boys; mean age 13.3 years). In Spain, 6th grade is the last year of the primary stage (11–12-year-old students), and 8th grade is the second year of the lower compulsory secondary stage (13–14-year-old students).

Instruments

Two assessment tools (a CRT skill test and a NOS scenario) were tailored by the researchers to operationalise CRT and NOS and empirically check their relationship. As the Spanish school curriculum lacks CRT standards, the specific thinking skills representing the CRT construct were agreed upon between principals and researchers. The design of the NOS knowledge tool took into account that NOS is not explicitly taught in Spanish schools. Both tools were designed to match the schools' interests and the students' developmental level; the latter particularly led to choosing a simple NOS issue (the definition of science) to better match the primary students' capabilities.

Thinking challenge tests

Two CRT thinking skill tests were developed, one for each participant cohort (PE6 and SCE8). The design aligns with the tradition of most standardised CRT tests, which concentrate assessment on a few selected thinking skills (i.e., Ennis & Millman, 2005 ; Halpern, 2010 ). The test for the 6th-graders (PE6) assesses five skills: prediction, comparison and contrast, classification, problem-solving and logical reasoning. The test for the 8th-graders (SCE8) assesses causal explanation, decision-making, parts-all relationships, sequencing and logical reasoning.

As most CRT tests are designed for adults, many tests and item pools were reviewed to select items suitable for younger students. The selection criteria were the fit of an item's cognitive demand to the students' age, the skill addressed and the motivational challenge for students. Moreover, items had to be readable, understandable, adequate, and interesting for the participant students. Two tests of 45 and 38 items were then agreed on and piloted; their results are described elsewhere (Manassero-Mas & Vázquez-Alonso, 2020b ). The authors examined the items' reliability, correlations and factor loadings to eliminate malfunctioning items, and then applied the same criteria to add new items, forming the two new 35-item Thinking Challenge Tests (TCT) used to assess the CRT skills of this study.

The items of the first two skills were drawn from the Cornell (Nicoma) test, which evaluates four CRT skills through the information provided by a fictional story about some explorers of the Nicoma planet and asks questions about the story. Some items from prediction and comparison skills were drawn for the 6th-grade TCT (PE6), and some items from causal explanation and decision-making skills were drawn for the 8th-grade TCT (SCE8). The two TCT include three additional items on logical reasoning that were selected from the 78-item Class-Reasoning Cornell Test (Ennis & Millman, 2005 ). One item was also drawn from the 25-situation Halpern CRT test (Halpern, 2010 ) for the problem-solving skill of the PE6 test. The authors adapted the remaining figurative items (Table 2 ) to enhance students’ challenge, understanding, and motivation and make the TCT free of school knowledge (Appendix).

Overall, the TCT items pose authentic culture-free challenges, as their contents and cognitive demands are not related to or anchored in any prior school curricular knowledge, especially language and mathematics. Therefore, the TCT are intended to assess culture-free thinking skills.

The item formats involve multiple-choice and Likert scales with appropriate ranges and rubrics that facilitate quick, objective scoring and a progressively closer adjustment between each item's cognitive demand and its corresponding skill, thereby supporting further revision to improve validity and reliability. This format also allows setting standardised baselines for hypothesis-testing through comparisons across studies, educational programs, and teaching methodologies.

Nature of science assessment

A scenario on science definitions is used to assess the participants' NOS understanding because this simple issue may better fit the lack of explicit NOS teaching and the developmental stage of the young students, especially the youngest 6th-graders. The scenario provides nine phrases that convey an epistemic, plural and varied range of science definitions, and respondents rate their agreement or disagreement with the phrases on a 9-point Likert scale (1 = strongly disagree, 9 = strongly agree) to allow better nuancing of their NOS beliefs and avoid psychometric objections to the scale intervals. The scenario is drawn from the "Views on Science-Technology-Society" (VOSTS) pool that Aikenhead and Ryan ( 1992 ) developed empirically by synthesizing many students' interviews and open answers into scenarios written in simple, understandable, non-technical language. They consider that VOSTS items have intrinsic validity due to their empirical development, as the scenario phrases come from students, not from researchers or a particular philosophy, thus avoiding the immaculate-perception bias and ensuring students' understanding. Lederman et al. ( 1998 ) also consider VOSTS a valid and reliable tool for investigating NOS conceptions. Manassero et al. ( 2003 ) adapted the scenarios into the Spanish language and contexts and developed a multiple-rating assessment rubric based on the phrase scaling achieved through expert judges' consensus. The rubric assigns indices whose empirical reliability has been presented elsewhere (Vázquez et al., 2006 ; Bennássar et al., 2010 ).

The students completed the two tests on digital devices, led by their teachers, within their natural classroom groups during 2018–19. To enhance students' effort and motivation, the administrations were embedded in curricular learning activities, where students were encouraged to ask about problems and difficulties. During the administrations, students asked their teachers no questions that might reflect difficulty in understanding the tests. The database was processed with SPSS 25 and the Factor program (Baglin, 2014 ) for exploratory and confirmatory factor analysis through polychoric correlations and the Robust Unweighted Least Squares (RULS) method, which relaxes conditions on the score distributions of the variables. Effect-size statistics use a cut-off point ( d  = 0.30) to discriminate relevant differences.

There was no time limit for completing the tests, and the administrations took between 25 and 50 min. Correct answers score one point, incorrect answers zero points, and no corrections for guessing were applied. Each skill score was computed by adding the scores of the items belonging to that skill; the skill scores are mutually independent. The addition of the five skill scores makes up a test score (thinking total) that estimates students' global CRT competence and is dependent on the skill scores (Table 2 ).
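The scoring rule above can be sketched in a few lines of code. This is a minimal illustration only: the item identifiers, answer key and item-to-skill mapping used below are hypothetical placeholders, not the actual TCT key.

```python
# Sketch of the TCT scoring rule described above (illustrative only:
# the item-to-skill mapping and answer key are hypothetical placeholders).

def score_tct(answers, key, skill_items):
    """answers, key: dicts item_id -> response; skill_items: skill -> item_ids."""
    # Correct answers score one point, incorrect zero; no correction for guessing.
    item_scores = {i: int(answers.get(i) == key[i]) for i in key}
    # Each skill score adds the scores of its own items (skills are independent).
    skills = {s: sum(item_scores[i] for i in items)
              for s, items in skill_items.items()}
    # The thinking total adds the skill scores, so it depends on them.
    skills["thinking_total"] = sum(skills.values())
    return skills
```

For example, a student answering three of four items correctly obtains the two skill scores plus their sum as the thinking total.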

The different types of validity maintain a reciprocal influence and represent the various parts of a whole, so they are not mutually independent. The Thinking Challenge tests’ validity relies on the quality of the CRT pools and tests examined by the authors, their agreement to choose the items that best matched the criteria, and the reviewed pilot results (Manassero-Mas & Vázquez-Alonso, 2020b ). The Factor program computes several reliability statistics (Cronbach alpha, EAP, Omega, etc.).

Nature of science scenario

The nine phrases describe different science definitions, and students rated each one on a 1–9 agreement scale. According to the experts' current views on NOS, a panel of qualified judges reached a 2/3 consensus to categorize each phrase within a three-level scheme (Adequate, Plausible, Naive), which has been widely used in NOS assessment (Khishfe, 2012 ; Liang et al., 2008 ; Rubba et al., 1996 ). The scheme means the phrases express informed (Adequate), partially informed (Plausible), or uninformed (Naive) NOS knowledge (see Appendix). According to this scheme, an evaluation rubric transforms the students' direct ratings (1–9) into an index in [−1, +1], which is proportionally higher when the person agrees with an Adequate phrase, partially agrees with a Plausible phrase, or disagrees with a Naive phrase. All the rubric indices balance positive and negative scores, which are symmetrical for Adequate and Naïve phrases, but the Plausible indices are somewhat loaded toward agreement, as higher agreement would be expected. The index unifies the NOS measurements to make them homogeneous (positive indices mean informed conceptions), invariant (measurement independent of scenario/phrase/category), and standardised (all measures within the same interval [−1, +1]). The index proportionally values the adjustment of students' NOS knowledge to the current views of science: the higher (or lower) the index, the better (or worse) informed their NOS knowledge is (Vázquez et al., 2006 ).

Three category variables (Adequate, Plausible, and Naïve) are computed by averaging their phrase indices, which are mutually independent. The average of the three category variables computes a global NOS index representing the student’s overall NOS knowledge (Global). The use of three categories aligns with test makers’ recommendations to avoid using only positively worded phrases in order to elude the acquiescence bias, which harms reliability and validity (Suárez-Alvarez et al., 2018 ).
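The index computation can be illustrated as follows. The actual rubric values come from the expert judges' consensus scaling, so the linear and triangular mappings below are only assumed stand-ins that reproduce the stated properties: indices in [−1, +1], highest for agreement with Adequate phrases, partial agreement with Plausible phrases, and disagreement with Naive phrases.

```python
# Illustrative stand-in for the evaluation rubric (the real indices come from
# expert judges' consensus scaling, not from these formulas).

def phrase_index(rating, category):
    """Map a 1-9 rating to a [-1, +1] index according to phrase category."""
    if category == "Adequate":          # agreement is informed
        return (rating - 5) / 4
    if category == "Naive":             # disagreement is informed
        return (5 - rating) / 4
    if category == "Plausible":         # partial agreement is informed
        return 1 - abs(rating - 5) / 2  # peaks at the midpoint, -1 at extremes
    raise ValueError(category)

def nos_scores(ratings):
    """ratings: list of (rating, category) pairs. Returns category means + Global."""
    cats = {}
    for r, c in ratings:
        cats.setdefault(c, []).append(phrase_index(r, c))
    means = {c: sum(v) / len(v) for c, v in cats.items()}
    # The Global index averages the three category variables.
    means["Global"] = sum(means.values()) / len(means)
    return means
```

Under this sketch, a student rating an Adequate phrase 9, a Naive phrase 1 and a Plausible phrase 5 would obtain the maximum index (+1) in each category and hence a Global index of +1.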

The links between thinking skills and NOS are empirically explored through correlational methods and one-way ANOVA procedures of the variables of the Thinking Challenge test and science definitions.

The results include the descriptive statistics of the target variables, twelve thinking variables (five skills plus thinking total for each group) and four variables of the science definitions (adequate, plausible, naive, and global), the analysis of the correlations, a linear regression analysis among these variables, and a comparison of thinking skills between NOS categorical groups through a one-way ANOVA.

Descriptive statistics

Most mean scores of the thinking variables fell near the midpoint of the scale range. Four skills (classification, problem-solving, causal explanation and sequence) scored above the midpoints of their ranges, whereas two variables (logical reasoning and decision-making) scored slightly below their midpoints. Overall, these results indicate the medium difficulty of the tests for the students, neither easy nor difficult, which means the CRT tests are acceptable for assessing young students' thinking skills (Table 3 ).

The EAP reliability indices of the classification, problem-solving, sequence, parts (mainly figurative items) and thinking-total scales were excellent, those of the remaining scales good, but that of logical reasoning poor. Low reliability indicates a need for item revision and limits applicability (i.e., it is inappropriate for individual diagnosis) but is insufficient to reject the test for research purposes (U.S. Department of Labor, 1999 ). As test reliability critically depends on the number of items, increasing the length of the logical reasoning scale beyond its three current items should improve its reliability.
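The expected gain from lengthening the scale can be projected with the standard Spearman-Brown prophecy formula; the reliability value in the example is illustrative, not the study's actual estimate.

```python
def spearman_brown(r_old, k):
    """Projected reliability when a scale is lengthened by factor k
    (k = new length / old length), assuming parallel items."""
    return k * r_old / (1 + (k - 1) * r_old)

# Illustrative only: a 3-item scale with reliability 0.40 doubled to 6 items
# (0.40 is an assumed value, not the study's logical reasoning estimate).
projected = spearman_brown(0.40, 6 / 3)  # about 0.57
```

Doubling a short scale thus yields a substantial, but diminishing, reliability gain, which supports the suggestion to extend the three-item logical reasoning scale.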

The descriptive results for the direct scores of the NOS variables (Table 4 ) showed a pattern biased toward agreement (phrase averages between 4.9 and 7.4), suggesting some acquiescence bias despite the varied phrases presented. The average indices obtained positive scores for the Adequate category, slightly negative ones for the Naïve category, and close-to-zero scores for the Plausible phrases (the effect size of the differences from a zero score was low). The overall weighted average index for the whole sample (Global variable) was close to zero and slightly positive, meaning that the students' overall epistemic conception of science definitions was not significantly informed. The overall average index of Adequate phrases obtained the highest positive score in both samples, which means that most students agreed with the Adequate phrases (expressing informed beliefs about science). In contrast, the Naïve overall average index obtained the lowest (negative) mean score, indicating that the students tended to agree, rather than disagree, with phrases expressing uninformed views of science. The Plausible variable (phrases expressing partially informed beliefs, neither adequate nor naive) obtained a close-to-zero average score, meaning that the students' beliefs on these phrases were far from informed. Overall, the students presented slightly informed views on Adequate phrases, close-to-zero average indices (not informed views) on Plausible phrases and slightly uninformed views on Naive statements.

Polychoric correlations among NOS direct scores computed through Factor attained good values for all NOS items, indicating a unidimensional structure (except Phrase I). The exploratory factor analysis (EFA) applied to the phrase scores displayed a dominant eigenvalue, whose general factor had acceptable loadings for all phrases (only Phrase I had a low loading). The unidimensional model obtained fair statistics in the confirmatory factor analysis. These results suggest one general factor underlying students' scores and justify a global score representing the variance of all the NOS phrases. The expected a posteriori (EAP) reliability scores for the entire NOS scale were good (Table 4 ).

The comparison of NOS scores between the primary and secondary grades highlights that the four NOS variable scores on science definitions were statistically equivalent for both cohorts of students, despite the two-year separation. Thus, the educational impact of the two-year period on NOS seems almost null, given the close-to-zero differences in science definitions. This result could be expected, as NOS is not explicitly planned in Spanish science curricula and is not usually taught in the classroom.

Both cohorts answered the same anchoring CRT item (see Appendix), whose correct-answer rate (27% primary; 33% secondary) suggests a slight improvement in CRT skills that sharply contrasts with the former NOS comparison. Summing up, although neither CRT nor NOS has been taught to Spanish students, developmental learning may increase CRT skills but does not improve NOS knowledge. This reinforces the claim for explicit and reflective teaching of NOS, as implicit developmental maturation alone seems ineffective.

Correlations between nature of science and thinking skills

The empirical analysis of the hypothesised relationships between thinking skills and NOS epistemic variables (Adequate, Plausible, Naive) was performed through correlational methods (Pearson’s bivariate correlation coefficients and linear regression analysis) and one-way analysis of variance.
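The bivariate correlation step can be sketched with a plain implementation of Pearson's r; the paired scores below are invented for illustration and do not reproduce the study's data.

```python
from math import sqrt

def pearson_r(x, y):
    """Bivariate Pearson correlation coefficient between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Hypothetical paired scores: a thinking skill and the NOS Adequate index.
problem_solving = [3, 5, 4, 7, 6, 2, 8, 5, 6, 4]
adequate_index = [0.1, 0.3, 0.2, 0.5, 0.4, -0.1, 0.6, 0.2, 0.5, 0.1]
r = pearson_r(problem_solving, adequate_index)  # strong positive correlation
```

A positive r of this kind corresponds to the pattern reported below for the Adequate variable: students scoring higher on a thinking skill tend to hold higher Adequate indices.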

The Pearson correlation coefficients revealed a pattern in the relationships between NOS and thinking skills (Table 5 ): all thinking skills positively correlated with the Adequate variable, and most correlations were significant, except for prediction and logical reasoning in EP6, which were non-significant. The correlations with the Naive and Plausible variables were overall non-significant, with some exceptions: first, the Plausible/problem-solving correlation in EP6 was significant (and negative); second, the correlation between Naïve and logical reasoning was significant (positive in EP6), as were the Naïve correlations with decision-making, logical reasoning and the thinking total score (negative in SCE8).

Thus, the noteworthy pattern for the NOS-CRT relationship showed that the Adequate variable positively correlated with all the thinking variables and was mostly statistically significant (83%); the highest positive correlations corresponded to problem-solving (EP6), sequence and parts-all (ES8), and the thinking total skills for both groups ( p  < 0.01). This pattern means that students with higher (lower) thinking skill scores expressed higher (lower) agreement with Adequate phrases.

The correlation pattern between thinking skills and the Plausible and Naive variables was mainly non-significant (75%). Only two correlations were significant in the EP6 group; the Plausible-problem-solving correlation was negative (higher scorers on problem-solving did not recognize the intermediate value of Plausible science definitions), whereas the Naïve-logical reasoning correlation was positive (higher scorers on logical reasoning tended to disagree with Naive science definitions). Three Naïve correlations were significant and negative in the secondary group (SCE8): parts-all, logical reasoning skills and thinking total.

Overall, the positive and significant correlation pattern of the Adequate variable was stronger than the mainly non-significant and somewhat negative Naive and Plausible correlation pattern.

Linear regression analysis between nature of science and thinking skills

Regression analysis (RA) estimates the power of a set of predictor variables to explain a dependent variable and their common variance. Two linear regression analyses were carried out to test the mutual contribution of the CRT and NOS variables. The first RA (Table 6 ) uses the NOS variables (Adequate, Plausible, Naive and Global) as the dependent variables and the five independent thinking skills as predictors. The second RA (Table 7 ) reversed the roles of the variables, establishing the thinking skills as the dependent variables and the three independent NOS variables (Adequate, Plausible and Naive) as predictors. Collinearity tests (tolerance, variance inflation factor and condition index statistics) were negative for all RAs.
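This kind of analysis can be sketched as ordinary least squares on z-scored variables, so that the coefficients are standardized betas and R² is the common variance. The simulated effect structure below (one dominant predictor standing in for Adequate) is invented for illustration, not the study's data.

```python
import numpy as np

def standardized_betas(X, y):
    """OLS on z-scored variables: returns standardized betas and R^2."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    betas, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    r2 = 1 - np.sum((yz - Xz @ betas) ** 2) / np.sum(yz ** 2)
    return betas, r2

# Hypothetical predictors (columns): Adequate, Plausible, Naive indices;
# dependent variable: a thinking skill driven mainly by the first predictor.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = 0.6 * X[:, 0] + rng.normal(scale=0.8, size=100)
betas, r2 = standardized_betas(X, y)
```

In this simulation, the first predictor carries the largest standardized beta while the others stay near zero, mirroring the reported dominance of the Adequate variable over Plausible and Naive.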

The first RA (Table 6 ) showed that the NOS Adequate variable achieved the highest proportion of common variance with thinking skill predictors at both educational levels (4.2% in PE6 and 9.2% in SCE8), whereas the other two NOS variables achieved much lower levels of explained variance. In PE6, the most significant predictor skill of NOS was problem-solving, whereas the other predictor skills did not reach statistical significance in any case. In SCE8, the most significant predictors were three skills (sequencing, reasoning, and parts-all), whereas the remaining skills did not reach statistical significance (the predictors of the Plausible variable were negative).

The second RA (Table 7 ) showed that the Adequate variable achieved the greatest predictive power, as most thinking skills displayed statistically significant standardised beta coefficients at the two educational levels, while Plausible and Naïve variables had a much lower predictive power, and Plausible standardised coefficients were non-significant for any skill predictor. The common variance displayed a similar amount to the first analysis; the thinking total variable displayed the largest variance at both educational levels (4.8% PE6; 9.6% SCE8), and the problem-solving skills at PE6 (5.3%) and parts-all at SCE8 (7.1%).

In summary, the Adequate variable and the classification and problem-solving skills (PE6) and sequencing and parts-all skills (SCE8) were the variables that presented the largest standardised coefficients and statistical significance regarding the research question raised in this study about the positive relationship between NOS and thinking skills.

Analysis of variance between nature of science and thinking skills

Further exploration of the NOS-skills relationship was conducted through one-way between-groups analysis of variance. According to performance on the Adequate, Plausible and Naive variables, the participants were allocated to four percentile groups (low group: 0–25%; medium–low: 25–50%; medium–high: 50–75%; high: 75–100%), which made up the independent variable of the ANOVA for testing the differences in thinking skills (dependent variable) among these four groups.
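The grouping and comparison steps can be sketched as follows; the group data in the example are invented, and a full analysis would also include the Scheffé post-hoc tests reported below.

```python
from statistics import mean

def quartile_groups(scores):
    """Allocate participants to four percentile groups (low ... high) by rank."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i])
    n = len(scores)
    cuts = [n // 4, n // 2, 3 * n // 4]
    groups = [[], [], [], []]
    for rank, i in enumerate(ranked):
        groups[sum(rank >= c for c in cuts)].append(i)  # 0 = low, 3 = high
    return groups

def one_way_f(samples):
    """One-way ANOVA F statistic for k independent groups of scores."""
    k, n = len(samples), sum(len(s) for s in samples)
    grand = mean([x for s in samples for x in s])
    ss_between = sum(len(s) * (mean(s) - grand) ** 2 for s in samples)
    ss_within = sum(sum((x - mean(s)) ** 2 for x in s) for s in samples)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def cohens_d(high, low):
    """Effect size d between high and low groups, using the pooled SD."""
    nh, nl = len(high), len(low)
    vh = sum((x - mean(high)) ** 2 for x in high) / (nh - 1)
    vl = sum((x - mean(low)) ** 2 for x in low) / (nl - 1)
    pooled = ((vh * (nh - 1) + vl * (nl - 1)) / (nh + nl - 2)) ** 0.5
    return (mean(high) - mean(low)) / pooled
```

Here `quartile_groups` splits a NOS variable into the four percentile groups used as the ANOVA factor, `one_way_f` tests the thinking-skill differences among them, and `cohens_d` quantifies the high-low contrast reported in the text.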

The Adequate groups yielded a statistically significant main effect on the thinking total in primary [ F (3, 429) = 7.745, p  < 0.001] and secondary education [ F (3, 343) = 2.607, p  = 0.052]. The effect size of the differences in the thinking total scores between the high and low groups was large for the primary ( d  = 0.69) and secondary ( d  = 0.86) cohorts. Furthermore, comparison, classification, and problem-solving skills replicated this pattern of large high-low group differences, which supports the positive NOS-CRT relationship. However, prediction ( p  = 0.069) and logical reasoning ( p  = 0.504) did not display differences among the Adequate groups.

Post-hoc comparisons (Scheffé test) showed that the low group achieved significantly lower scores than the other three Adequate groups. The Adequate low group scores on thinking total, comparison, classification, and problem-solving skills were significantly lower than the scores of the other three groups, whereas the differences among the Adequate groups on prediction and logical reasoning scores were non-significant.

The main effect of the Plausible groups on the thinking total variable did not reach statistical significance for the primary [ F (3, 430) = 1.805, p  = 0.145] and secondary groups [ F (3, 343) = 2.607, p  = 0.052]. The effect size was small ( d  = − 0.31 primary; d  = − 0.32 secondary) and negative (the thinking total mean score of the low group was higher than that of the high group). Post-hoc comparisons (Scheffé test) confirmed the trend, as they did not yield significant differences among the Plausible groups, although the mean score of the Plausible high group was lower than the other three groups. Exceptionally, problem-solving skill (primary) displayed a statistically significant difference between the Plausible high group (the lowest mean score) and the remaining three groups.

The main effect of Naive groups on the thinking total variable did not reach statistical significance [ F (3, 430) = 1.075, p  = 0.367 primary; F (3, 343) = 1.642, p  = 0.179 secondary], and the effect size of the differences was small ( d  = 0.32 primary; d  = − 0.31 secondary). The opposite direction of the differences in primary (positive) and secondary education (negative) is noteworthy, as it means that the highest mean score corresponded to the Naive high group in primary (positive) and to the Naive low group in secondary (negative). Post-hoc comparisons (Scheffé test) showed no significant differences among the Naive groups. However, the ranking of mean scores across the Naive groups revealed differences between the primary and secondary cohorts. Overall, the primary Naive groups followed the pattern of the Adequate variable (the low group displayed the lowest score), whereas the secondary Naive groups followed the pattern of the Plausible variable (the high group tended to display the lowest score).

The empirical findings of this study quantify, through correlations, some significant and positive relationships between thinking skills and NOS beliefs about science definitions, as the main answer to the research question. However, the analysis shows a complex pattern, which depends on the kind of NOS variable under consideration: the NOS Adequate variable, representing phrases that express informed views of science, is positively and significantly related to most thinking skills, whereas the uninformed Naive and intermediate Plausible variables show lower predictive power for thinking skills. Summing up, the positive, significant CRT-NOS relationship is not displayed by all NOS variables; it is limited to those that express an Adequate view of science, while the other NOS variables do not significantly correlate with CRT skills.

The implications of this study for research are twofold. On the one hand, the variables of this study specifically operationalise the two constructs under investigation, namely CRT skills and NOS knowledge, which has been a challenge given the mixed operationalisations across the reviewed research. On the other hand, via Pearson correlations and regression analysis, this study quantifies the amount of common variance between specific CRT skills and specific NOS knowledge, which is significant in many cases. Both contributions improve on the features of previous studies, as most of them investigated the relationship from varied methodological frameworks: some reported group comparisons, fewer analysed correlations, and most of the latter used a diversity of variables that often did not match either CRT skills or NOS variables. For instance, Vieira et al. ( 2011 ) correlated thinking skills with science literacy (not NOS) and reported Pearson correlations lower than those obtained herein, even though they used a smaller sample, which favours higher correlations.

The findings reveal the complexity of the NOS-CRT relationship, which limits the positive and relevant relationship to the NOS Adequate variables about science definitions, but not to the Plausible or Naive conceptualizations, which mainly display non-significant and somewhat negative correlations. The positive relationship between thinking and Adequate science definitions is a remarkable finding, which empirically supports the hypothesis that better thinking skills involve better NOS knowledge and confirms the concomitant intuitions and claims of some studies about the importance of thinking skills for learning NOS epistemic topics (Erduran & Kaya, 2018 ; Ford & Yore, 2014 ; Simonneaux, 2014 ; Torres & Solbes, 2016 ; Yacoubian, 2015 ). The findings also help establish the limits of the significant relationship, which applies when NOS is conveyed by informed statements (Adequate phrases) but not by non-adequate NOS statements, which remain a minority in the NOS literature, where most statements convey informed views of NOS (Cofré et al., 2019 ).

The implications of the collateral finding on the lack of differences in science definitions between primary and secondary cohorts deserve further comments. Obviously, the finding confirms that two educational years have a scarce impact on improving Spanish students’ understanding of science definitions; that is, NOS teaching seems ineffective and stagnated, probably due to poor curriculum development and the lack of teacher training and educational resources. Besides, the students’ higher performance on adequate phrases than on plausible and naïve phrases also suggests that Spanish students may achieve some mild knowledge about the informed traits of science because they are implicitly displayed in teaching, textbooks and media. However, plausible and naïve knowledge is not usually available from those sources, as it requires explicit and reflective teaching, which Spanish students usually lack. Both findings suggest the need for further attention to misinformed NOS knowledge to invigorate explicit and reflective NOS teaching (Cofré et al., 2019 ; McDonald & McRobbie, 2012 ).

The unexpected non-significant/negative relationships between thinking and Plausible and Naive variables may need some elaboration due to the complexity of students’ NOS conceptions. For instance, Bennássar et al. ( 2010 ) described the students’ inconsistent agreements when rating opposite statements. Bogdan ( 2020 ) found that epistemic conceptions of science creativity did not relate to attitudes to science, and Khishfe ( 2012 ) reported complex relationships between epistemic aspects of science and decision-making about genetically modified organisms or the acceptance of the evolution theory (Cofré et al., 2017 ; Sinatra, et al., 2003 ). Thus, a tentative interpretation of those paradoxical relationships is elaborated.

Higher-thinking-skill students might develop better-quality reflections that elicit more confident and higher scores on NOS phrases than lower-thinking-skill students. The latter tend toward less confident, lower-quality reflection, which may elicit intermediate, less polarized scores. On average, this differential pattern explains the complex pattern of relationships between CRT and NOS variables. For the Adequate phrases (where the rubric assigns the best indices to the highest scores), higher-thinking students will achieve higher NOS indices than lower-thinking students, explaining the observed positive CRT-NOS correlations in the Adequate variables and the ANOVA results. On the other hand, when Naive and, especially, Plausible phrases are involved (which obtain their highest indices at low and intermediate scores, respectively), the differential response pattern would lead the lower-thinking students to achieve higher NOS indices than the higher-thinking students, thus shifting to the observed non-significant or negative correlations for Naive and Plausible phrases. In short, the unconfident/confident and lower-/higher-quality reflection on NOS knowledge of the lower-/higher-thinking students would explain the shift from the positive and significant relationship of CRT with Adequate phrases to the non-significant correlations with Plausible and Naive phrases. This interpretation agrees with the striking finding of O'Brien et al. ( 2021 ) of a similarly unexpected higher adherence to pseudoscientific claims among students with higher trust in science, which the authors attributed to the uncritical acceptance of any scientific content. Similarly, mastery of CRT skills is a desirable learning outcome, but it may make high-mastery students vulnerable to positive polarization in science definitions. However, further research is needed to confirm the non-significant correlations and the interpretation of the differential response pattern.

As the previous reference suggests, these findings on the complex CRT-NOS relationship connect with some unresolved controversies about NOS teaching, namely the marginal attention paid to misinformed ideas or myths about science in favour of the informed ideas. This imbalance reveals implicit, non-reflective NOS teaching, since obviously misinformed ideas trigger more reflection than informed ideas do (Acevedo et al., 2007; McComas, 1996). The effect of this under-exposure is students’ under-training on misinformed NOS ideas, which may act as obstacles to authentic NOS epistemic learning and could explain the differences presented herein. The remedy to this situation, and to the unconfident-response bias, may lie in devoting more time and explicit attention to uninformed or incomplete NOS claims through reflective teaching.

This study is shaped and limited by the contextual conditions of its correlational methodology. First, the research question required measurements of both thinking skills and NOS knowledge; second, the young participants (12–14-year-olds) required measurement tools appropriate to their age; third, the thinking-skill tests had to match the thinking skills targeted by the participating school; fourth, the choice of NOS tool was constrained by the students’ age and the lack of appropriate NOS assessment instruments. Suggestions to overcome these limitations therefore focus on expanding the empirical support for the NOS-CRT relationship. On the one hand, new NOS issues, such as additional epistemological and social aspects of science, should be explored to broaden the representativeness of NOS knowledge; similar considerations apply to adding new skills to widen the scope of the CRT tool. Furthermore, the number of items in the logical-reasoning scale should be increased to improve its reliability. Finally, the perennial debate between open-ended and closed formats also deserves attention in future research, as quantitative methods could be complemented with qualitative ones (such as student interviews).
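The expected reliability gain from lengthening a scale can be estimated with the Spearman-Brown prophecy formula; the reliability coefficient used below is a hypothetical placeholder, not the actual value reported for the logical-reasoning scale:

```python
def spearman_brown(reliability: float, length_factor: float) -> float:
    """Predicted reliability after multiplying the number of parallel items
    by `length_factor` (Spearman-Brown prophecy formula)."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# Hypothetical short scale with reliability 0.55 (illustrative value only).
print(round(spearman_brown(0.55, 2), 2))  # doubling the items -> 0.71
print(round(spearman_brown(0.55, 3), 2))  # tripling the items -> 0.79
```

The formula assumes the added items are parallel to the existing ones; in practice, new items of comparable quality and content coverage would be needed for the predicted gain to hold.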

Finally, the main educational implication of this study is that students may need some competence in CRT skills to learn NOS knowledge, or general epistemic knowledge; conversely, mastery of CRT skills may foster NOS learning. Although this study focuses on epistemic NOS knowledge drawn from science education, educational research has developed in parallel the epistemic knowledge (EK) construct for general education (Hofer & Pintrich, 1997), which opens further research prospects for NOS comprehension and CRT skills. On the one hand, studying the NOS-EK relationship may shed light on convergent epistemic teaching and learning in both science and general education. On the other hand, the importance of CRT skills for NOS, and vice versa, may help coordinate the teaching of NOS-EK issues (Erduran & Kaya, 2018; Ford & Yore, 2014; McDonald & McRobbie, 2012; Simonneaux, 2014). This joint NOS-EK perspective may also provide new answers on two fronts: the mutual connections between CRT skills and NOS-EK issues, and EK assessment tools that may also advance the evaluation of CRT skills and NOS.

Data availability

The Spanish State Research Agency and the University of the Balearic Islands hold the property of all data and materials of this study, which may be made available upon reasonable request to them.

Code availability

Not applicable.

Acevedo JA, Vázquez A, Manassero MA, Acevedo P (2007) Consensus on the nature of science: epistemological aspects. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias 4:202–225. http://www.apac-eureka.org/revista/Larevista.htm

Aikenhead GS, Ryan AG (1992) The development of a new instrument: “Views on Science-Technology-Society” (VOSTS). Sci Educ 76:477–491


Allchin D, Zemplén GÁ (2020) Finding the place of argumentation in science education: Epistemics and whole science. Sci Educ 104(5):907–933. https://doi.org/10.1002/sce.21589

American Psychological Association (1990) Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Executive Summary “The Delphi Report”. www.insightassessment.com/dex.html

Baglin J (2014) Improving your exploratory factor analysis for ordinal data: a demonstration using factor. Pract Assess Res Eval 19(5):2


Bailin S (2002) Critical thinking and science education. Sci Educ 11:361–375

Bennássar A, Vázquez A, Manassero MA, García-Carmona A (coords) (2010) Ciencia, tecnología y sociedad en Iberoamérica [Science, technology and society in Ibero-America]. Organización de Estados Iberoamericanos. http://www.oei.es/salactsi/DOCUMENTO5vf.pdf

Bogdan R (2020) Understanding of epistemic aspects of NOS and appreciation of its social dimension. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 17, Article 2303. https://doi.org/10.25267/Rev_Eureka_ensen_divulg_cienc.2020.v17.i2.2303

Cofré H, Cuevas E, Becerra B (2017) The relationship between biology teachers’ understanding of the NOS and the understanding and acceptance of the theory of evolution. Int J Sci Educ 39:2243–2260. https://doi.org/10.1080/09500693.2017.1373410

Cofré H, Nuñez P, Santibáñez D, Pavez JM, Valencia M, Vergara C (2019) A critical review of students’ and teachers’ understandings of NOS. Sci Educ 28:205–248. https://doi.org/10.1007/s11191-019-00051-3

Deng F, Chen D-T, Tsai C-C, Chai C-S (2011) Students’ views of the NOS: a critical review of research. Sci Educ 95:961–999

Dogan N, Manassero MA, Vázquez A (2020) Creative thinking in prospective science teachers: effects of problem and history of science based learning. Tecné, Epistemé y Didaxis 48. https://doi.org/10.17227/ted.num48-10926

Dowd JE, Thompson RJ Jr, Schiff LA, Reynolds JA (2018) Understanding the complex relationship between critical thinking and science reasoning among undergraduate thesis writers. CBE Life Sci Educ. https://doi.org/10.1187/cbe.17-03-0052

Ennis RH (1996) Critical thinking. Prentice Hall, Hoboken

Ennis RH, Millman J (2005) Cornell Critical Thinking Test Level X. The Critical Thinking Company.

Ennis RH (2019) Long definition of critical thinking. http://criticalthinking.net/definition/long-definition-of-critical-thinking/

Erduran S, Dagher ZR (eds) (2014) Reconceptualizing the nature of science for science education: scientific knowledge, practices and other family categories. Springer, Dordrecht

Erduran S, Kaya E (2018) Drawing nature of science in pre-service science teacher education: epistemic insight through visual representations. Res Sci Educ 48(6):1133–1149. https://doi.org/10.1007/s11165-018-9773-0

European Union (2014) Key competence development in school education in Europe. KeyCoNet’s review of the literature: a summary. http://keyconet.eun.org

Facione PA, Facione RN, Blohm SW, Howard K, Giancarlo CAF (1998) California Critical Thinking Skills Test: Manual (Revised). California Academic Press, California

Fisher A (2009) Critical thinking: an introduction. Cambridge University Press, Cambridge

Fisher A (2021) What critical thinking is. In: Blair JA (ed) Studies in critical thinking, 2nd edn. University of Windsor, Canada, pp 7–26

Ford CL, Yore LD (2014) Toward convergence of critical thinking, metacognition, and reflection: Illustrations from natural and social sciences, teacher education, and classroom practice. In: Zohar A, Dori YJ (eds) Metacognition in science education. Springer, Berlin, pp 251–271

García-Mila M, Andersen C (2008) Cognitive foundations of learning argumentation. In: Erduran S, Jiménez-Aleixandre MP (eds) Argumentation in science education: perspectives from classroom-based research. Springer, Berlin, pp 29–45

García-Carmona A, Vázquez A, Manassero MA (2011) Current status and perspectives on teaching the nature of science: a review of teachers’ beliefs obstacles. Enseñanza de las Ciencias 28:403–412

González-Howard M, McNeill KL (2020) Acting with epistemic agency: characterizing student critique during argumentation discussions. Sci Educ 104:953–982

Greene JA, Sandoval WA, Bråten I (2016) Handbook of epistemic cognition. Routledge, London


Halpern DF (2010) Halpern Critical Thinking Assessment. Schuhfried, Modling

He X, Deng Y, Saisai Y, Wang H (2020) The influence of context on the large-scale assessment of high school students’ epistemic cognition of scientific argumentation. Sci Educ 29:7–41. https://doi.org/10.1007/s11191-019-00088-4

Henderson JB, McNeill KL, Gonzalez-Howard M, Close K, Evans M (2018) Key challenges and future directions for educational research on scientific argumentation. J Res Sci Teach 55(1):5–18. https://doi.org/10.1002/tea.21412

Hofer BK, Pintrich PR (1997) The development of epistemological theories: beliefs about knowledge and knowing and their relation to learning. Rev Educ Res 67:88–140. https://doi.org/10.3102/00346543067001088

Khishfe R (2012) Nature of science and decision-making. Int J Sci Educ 34:67–100. https://doi.org/10.1080/09500693.2011.559490

Khishfe R, Alshaya FS, BouJaoude S, Mansour N, Alrudiyan KI (2017) Students’ understandings of nature of science and their arguments in the context of four socio-scientific issues. Int J Sci Educ 39:299–334

Kolstø SD (2001) Scientific literacy for citizenship: Tools for dealing with the science dimension of controversial socio-scientific issues. Sci Educ 85:291–310

Koray Ö, Köksal MS (2009) The effect of creative and critical thinking based laboratory applications on creative logical thinking abilities of prospective teachers. Asia-Pacific Forum Sci Learn Teach 10, Article 2. https://www.eduhk.hk/apfslt/download/v10_issue1_files/koksal.pdf

Kreitchmann RS, Abad FJ, Ponsoda V, Nieto MD, Morillo D (2019) Controlling for response biases in self-report scales: Forced-choice vs psychometric modeling of Likert items. Front Psychol. https://doi.org/10.3389/fpsyg.2019.02309

Kuhn D (2012) Enseñar a pensar [Education for thinking]. Amorrortu, Argentina

Lederman NG (2007) Nature of science: past, present, and future. In: Abell SK, Lederman NG (eds) Handbook of research on science education. Lawrence Erlbaum Associates, USA, pp 831–879

Lederman NG, Wade PD, Bell RL (1998) Assessing understanding of the NOS: A historical perspective. In: McComas WF (ed) The NOS in science education: Rationales and strategies. Kluwer, Netherland, pp 331–350

Liang LL, Chen S, Chen X, Kaya ON, Adams AD, Macklin M, Ebenezer J (2008) Assessing preservice elementary teachers’ views on the nature of scientific knowledge: a dual-response instrument. Asia- Pacific Forum Sci Learn Teach 9(1). http://www.ied.edu.hk/apfslt/v9_issue1/liang/index.htm

Madison J (2004) James Madison Critical Thinking Course. The Critical Thinking Co. https://www.criticalthinking.com/james-madison-critical-thinking-course.html

Manassero MA, Vázquez A, Acevedo JA (2003) Cuestionario de opiniones sobre ciencia, tecnologia y sociedad (COCTS) [Questionnaire of opinions on science, technology and society]. Educational Testing Service. https://store.ets.org/store/ets/en_US/pd/ThemeID.12805600/productID.39407800

Manassero-Mas MA, Vázquez-Alonso A (2019) Conceptualization and taxonomy to structure knowledge about science. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias 16(3):3104. http://www.10.25267/Rev_Eureka_ensen_divulg_cienc.2019.v16.i3.3104

Manassero-Mas M, Vázquez-Alonso Á (2020a) Scientific thinking and critical thinking: transversal competences for learning. Indag Didact 12(4):401–420. https://doi.org/10.34624/id.v12i4.21808

Manassero-Mas MA, Vásquez-Alonso Á (2020b) Assessment of critical thinking skills: validation of free-culture tools. Tecné, Epistemé y Didaxis, 47:15–32. https://doi.org/10.17227/ted.num47-9801

Mason L, Scirica F (2006) Prediction of students’ argumentation skills about controversial topics by epistemological understanding. Learn Instr 16:492–509. https://doi.org/10.1016/j.learninstruc.2006.09.007

Matthews MR (2012) Changing the focus: From nature of science (NOS) to features of science (FOS). In: Khine MS (ed) Advances in nature of science research Concepts and methodologies. Springer, Berlin, pp 3–26


McComas WF (1996) Ten myths of science: reexamining what we think we know about the NOS. Sch Sci Math 96:10–16

McDonald CV, McRobbie CJ (2012) Utilising argumentation to teach NOS. In: Fraser BJ, Tobin KG, McRobbie CJ (eds) Second international handbook of science education. Springer, Berlin, pp 969–986

Merton RK (1968) Social theory and social structure. Simon and Schuster, New York

National Research Council (2012) Education for life and work: Developing transferable knowledge and skills in the 21st century. The National Academies Press, USA

Noroozi O (2016) Considering students’ epistemic beliefs to facilitate their argumentative discourse and attitudinal change with a digital dialogue game. Innov Educ Teach Int 55(3):357–365. https://doi.org/10.1080/14703297.2016.1208112

O’Brien TC, Palmer R, Albarracin D (2021) Misplaced trust: When trust in science fosters belief in pseudoscience and the benefits of critical evaluation. J Exp Soc Psychol 96:104184. https://doi.org/10.1016/J.JESP.2021.104184

Olson JK (2018) The inclusion of the NOS in nine recent international science education standards documents. Sci Educ 27:637–660. https://doi.org/10.1007/s11191-018-9993-8

Osborne J (2014) Teaching critical thinking? new directions in science education. Sch Sci Rev 95:53–62

Paul R, Elder L (2008) The miniature guide to critical thinking: concepts and tools (5th ed.). Foundation for Critical Thinking Press

Rapanta C, Garcia-Mila M, Gilabert S (2013) What is meant by argumentative competence? an integrative review of methods of analysis and assessment in education. Rev Educ Res 83:483–520

Rubba PA, Schoneweg CS, Harkness WL (1996) A new scoring procedure for the views on science-technology-society instrument. Int J Sci Educ 18(4):387–400. https://doi.org/10.1080/0950069960180401

Santos LF (2017) The role of critical thinking in science education. J Educ Pract 8:159–173

Settlage J, Southerland SA (2020) Epistemic tools for science classrooms: the continual need to accommodate and adapt. Sci Educ 103(4):1112–1119. https://doi.org/10.1002/sce.21510

Simonneaux L (2014) From promoting the techno-sciences to activism – A variety of objectives involved in the teaching of SSIS. In: Bencze L, Alsop S (eds) Activist science and technology education. Springer, Berlin, pp 99–112

Sinatra GM, Southerland SA, McConaughy F, Demastes JW (2003) Intentions and beliefs in students’ understanding and acceptance of biological evolution. J Res Sci Teach 40:510–528. https://doi.org/10.1002/tea.10087

Sprod T (2014) Philosophical Inquiry and Critical Thinking in Science Education. In: Matthews MR (ed) International Handbook of Research in History, Philosophy and Science Teaching. Springer, Berlin, pp 1531–1564

Stathopoulou C, Vosnidou S (2007) Conceptual change in physics and physics-related epistemological beliefs: A relationship under scrutiny. In: Vosnidou S, Baltas A, Vamvakoussi X (eds) Re-framing the problem of conceptual change in learning and instruction. Elsevier, Amsterdam, pp 145–163

Suárez-Alvarez J, Pedrosa I, Lozano LM, García-Cueto E, Cuesta M, Muñiz J (2018) Using reversed items in likert scales: a questionable practice. Psicothema 30:149–158. https://doi.org/10.7334/psicothema2018.33

Torres N, Solbes J (2016) Contributions of a teaching intervention using socio-scientific issues to develop critical thinking. Enseñanza De Las Ciencias 34:43–65. https://doi.org/10.5565/rev/ensciencias.1638

U.S. Department of Labor Employment and Training Administration (1999) Understanding test quality: concepts of reliability and validity. https://hr-guide.com/Testing_and_Assessment/Reliability_and_Validity.htm

Vázquez-Alonso Á, Manassero-Mas MA (2018) Beyond science understanding: science education to develop thinking. Revista Electrónica de Enseñanza de las Ciencias 17:309–336. http://www.saum.uvigo.es/reec

Vázquez A, Manassero MA, Acevedo JA (2006) An analysis of complex multiple-choice science-technology-society items: Methodological development and preliminary results. Sci Educ 90: 681–706

Vergara AI, Balluerka N (2000) Methodology in cross-cultural research: current perspectives. Psicothema 12:557–562

Vesterinen VM, Manassero-Mas MA, Vázquez-Alonso Á (2014) History, philosophy, and sociology of science and science-technology-society traditions in science education: continuities and discontinuities. In: Matthews MR (ed) International Handbook of Research in History, Philosophy and Science Teaching. Springer, Berlin, pp 1895–1925

Vieira RM, Tenreiro-Vieira C, Martins IP (2011) Critical thinking: Conceptual clarification and its importance in science education. Sci Educ Int 22:43–54

Watson G, Glaser EM (2002) Watson-Glaser Critical Thinking Appraisal-II Form E. Pearson, London

Weinstock MP (2006) Psychological research and the epistemological approach to argumentation. Informal Logic 26:103–120

Yacoubian HA (2015) A framework for guiding future citizens to think critically about NOS and socioscientific issues. Can J Sci Math Technol Educ 15:248–260

Yacoubian HA, Khishfe R (2018) Argumentation, critical thinking, NOS and socioscientific issues: a dialogue between two researchers. Int J Sci Educ 40:796–807

Yang FY, Tsai CC (2010) Reasoning on the science-related uncertain issues and epistemological perspectives among children. Instr Sci 38:325–354

Yang FY, Tsai CC (2012) Personal epistemology and science learning: A review of studies. In: Fraser BJ, Tobin KG, McRobbie CJ (eds) Second international handbook of science education. Springer, Berlin, pp 259–280

Yang F-Y, Bhagat KK, Cheng C-H (2019) Associations of epistemic beliefs in science and scientific reasoning in university students from Taiwan and India. Int J Sci Educ 41:1347–1365. https://doi.org/10.1080/09500693.2019.1606960

Zeidler DL, Walker KA, Ackett WA, Simmons ML (2002) Tangled up in views: beliefs in the NOS and responses to socioscientific dilemmas. Sci Educ 86:343–367

Zeineddin A, Abd-El-Khalick F (2010) Scientific reasoning and epistemological commitments: coordination of theory and evidence among college science students. J Res Sci Teach 47:1064–1093. https://doi.org/10.1002/tea.20368

Acknowledgments

Grant EDU2015-64642-R of the Spanish State Research Agency and the European Regional Development Fund, European Union.

Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. This study is part of a research project funded by Grant No EDU2015-64642-R of the Spanish State Research Agency and the European Regional Development Fund, European Union.

Author information

Authors and Affiliations

Department of Psychology, University of the Balearic Islands, Palma, Spain

María Antonia Manassero-Mas

Centre for Postgraduate Studies, University of the Balearic Islands, Edificio Guillem Cifre de Colonya, Carretera de Valldemossa, Km. 7.5, 07122, Palma, Spain

Ángel Vázquez-Alonso

Contributions

Both authors declare their contribution to this study, their agreement with its content, their explicit consent to submit, and that they obtained consent from the responsible authorities at the organization where the work was carried out before it was submitted. All authors contributed to the study conception and design, material preparation, data collection and analysis, and the first draft of the manuscript, and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Ángel Vázquez-Alonso .

Ethics declarations

Conflict of interest

The authors have no conflicts of interest or competing interests to declare regarding this article.

Ethical approval

This study was performed in accordance with the Declaration of Helsinki, and the Ethics Committee of the University of the Balearic Islands approved the whole research project. Participants’ informed consent was deemed not necessary because only the participants’ teachers delivered the tasks involved in the study, as ordinary classroom learning tasks, without any intervention by the researchers. This manuscript is original, has not been published elsewhere and has not been submitted simultaneously to any other journal for consideration.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Manassero-Mas, M.A., Vázquez-Alonso, Á. An empirical analysis of the relationship between nature of science and critical thinking through science definitions and thinking skills. SN Soc Sci 2 , 270 (2022). https://doi.org/10.1007/s43545-022-00546-x

Received : 11 December 2021

Accepted : 10 October 2022

Published : 08 December 2022

DOI : https://doi.org/10.1007/s43545-022-00546-x


  • Nature of science
  • Critical thinking skills
  • Scientific literacy
  • Assessment of thinking skills
  • Epistemic assessment

ScienceDaily

'Scientists' warning' on climate and technology

Academics explore roles of clean energy and AI in combating global warming.

Throughout human history, technologies have been used to make people's lives richer and more comfortable, but they have also contributed to a global crisis threatening Earth's climate, ecosystems and even our own survival. Researchers at the University of California, Irvine, the University of Kansas and Oregon State University have suggested that industrial civilization's best way forward may entail embracing further technological advancements, but doing so with greater awareness of their potential drawbacks.

In a paper titled "Scientists' Warning on Technology," published recently in the Journal of Cleaner Production , the researchers, including Bill Tomlinson, UCI professor of informatics, stress that innovations, particularly in the fields of clean energy and artificial intelligence, will come with risks but may be the most effective way to ensure a sustainable future.

"Since prehistoric times, technologies have been created to solve problems and benefit people; think of the improvements that have been made in agriculture, manufacturing and transportation," Tomlinson said. "But these developments have had a dual nature. While addressing the human need for food, farming has led to environmental degradation, and our factories and vehicles have caused a massive buildup of atmospheric carbon dioxide, which is causing climate change."

Co-author Andrew W. Torrance, the Paul E. Wilson Distinguished Professor of Law at the University of Kansas, said: "Technology is often offered as a panacea for environmental crises. It is not. Nevertheless, it will play a crucial role in any solution. That is why the role of technology must be taken seriously, rigorously measured, modeled and understood -- and then interpreted in light of population and affluence."

He added, "I am extremely optimistic about the beneficial role technology could play in helping humanity find its sustainable niche in the biosphere, but [I'm also] stone-cold sober that other, less hopeful outcomes remain possible."

The scientists' warning concept dates to the early 1990s, when the Union of Concerned Scientists published a letter exhorting people to change their habits regarding stewardship of Earth and its resources "if vast human misery is to be avoided and our global home on this planet is not to be irretrievably mutilated." A second warning, in 2017, was signed by more than 15,000 scholars in different scientific fields. Since then, dozens of additional admonitions have been published, with over 50 currently in preparation.

"The scientists' warnings weave a compelling narrative of humanity at a crossroads, urging us to acknowledge the fragility of our biosphere and embrace a collective responsibility for safeguarding our future through proper, science-based actions," said co-author William Ripple, Oregon State University Distinguished Professor of ecology, who led the project to write the article.

The Journal of Cleaner Production warning outlines two main methods for reducing, mitigating or eliminating fossil fuel use. The first is infrastructural substitution, replacing coal- and natural gas-fired power plants with renewable resources such as wind and solar, and abandoning internal combustion engines in favor of electric motors. This shift would also involve widespread adoption of electric appliances in homes and swapping out gas furnaces and water heaters for heat pumps.

A second method to steer humanity away from fossil fuel burning centers on a concept known as "undesign," the intentional negation of technology and consideration of alternatives that do not rely on labor-saving human inventions.

"People are often resistant to change, though, especially in contexts where they have come to depend strongly on particular goods and services," Tomlinson said. "Embracing undesign will require people to be guided to new cultural narratives that are not so reliant on heavily impactful systems."

In addition to clean energy technologies, the warning's authors look to artificial intelligence as a way to point human civilization toward a more sustainable tomorrow. They mention how AI is being used currently to connect wildlife habitats, monitor methane emissions and optimize supply chains. Tomlinson and his colleagues said AI presents far less energy-intensive alternatives to laborious tasks like writing and illustration and is becoming adept at writing computer code, which could come in handy in managing the "complexities of 8 billion-plus people cohabiting on Earth," according to the paper.

But Tomlinson noted that AI is not without risks, such as the possibility of runaway energy consumption, perpetuating biases in human societies and AI systems becoming independent and powerful enough that they pose a real danger to humanity.

"It's important that humans deploy new technologies to replace those that are environmentally harmful," he said. "But we need to remain vigilant for potential future harm and attempt to mitigate that as much as possible.

"In our scientists' warning, we identify an array of potential future risks from both electrification and AI. We believe that these outcomes are substantially less problematic than these technologies' potential benefits from addressing the pressing environmental crises that humanity is currently facing."

This project received funding from the National Science Foundation.


Story Source:

Materials provided by University of California, Irvine. Note: Content may be edited for style and length.

Journal Reference :

  • Bill Tomlinson, Andrew W. Torrance, William J. Ripple. Scientists’ warning on technology. Journal of Cleaner Production, 2024; 434: 140074. DOI: 10.1016/j.jclepro.2023.140074


2024 Critical and Emerging Technologies List - U.S.

In February 2024, the National Science and Technology Council (NSTC), within the Executive Office of the President of the United States, published a document titled Critical and Emerging Technologies List Update. The document overview notes:

The 2022 National Security Strategy identifies three national security interests: protect the security of the American people, expand economic prosperity and opportunity, and realize and defend the democratic values at the heart of the American way of life. The NSTC established this Fast Track Action Subcommittee in 2020 to identify critical and emerging technologies to inform national security-related activities. This list identifies CETs with the potential to further these interests and builds on the October 2020 National Strategy for Critical and Emerging Technologies, which contains an initial list of priority CETs. This updated document expands upon that original CET list and the February 2022 update by identifying subfields for each CET with a focus, where possible, on core technologies that continue to emerge and modernize, while remaining critical to a free, open, secure, and prosperous world.

The interagency process included U.S. government subject matter experts from 18 departments, agencies, and the Executive Office of the President, who identified subfields that may be critical to U.S. national security.

Critical and Emerging Technologies List 

The following critical and emerging technology areas are of particular importance to the national security of the United States: 

• Advanced Computing
  • Advanced supercomputing, including for AI applications
  • Edge computing and devices
  • Advanced cloud services
  • High-performance data storage and data centers
  • Advanced computing architectures
  • Advanced modeling and simulation
  • Data processing and analysis techniques
  • Spatial computing

• Advanced Engineering Materials
  • Materials by design and material genomics
  • Materials with novel properties to include substantial improvements to existing properties
  • Novel and emerging techniques for material property characterization and lifecycle assessment

• Advanced Gas Turbine Engine Technologies
  • Aerospace, maritime, and industrial development and production technologies
  • Full-authority digital engine control, hot-section manufacturing, and associated technologies

• Advanced and Networked Sensing and Signature Management
  • Payloads, sensors, and instruments
  • Sensor processing and data fusion
  • Adaptive optics
  • Remote sensing of the Earth
  • Geophysical sensing
  • Signature management
  • Detection and characterization of pathogens and of chemical, biological, radiological and nuclear weapons and materials
  • Transportation-sector sensing
  • Security-sector sensing
  • Health-sector sensing
  • Energy-sector sensing
  • Manufacturing-sector sensing
  • Building-sector sensing
  • Environmental-sector sensing

• Advanced Manufacturing
  • Advanced additive manufacturing
  • Advanced manufacturing technologies and techniques including those supporting clean, sustainable, and smart manufacturing, nanomanufacturing, lightweight metal manufacturing, and product and material recovery

• Artificial Intelligence
  • Reinforcement learning
  • Sensory perception and recognition
  • AI assurance and assessment techniques
  • Foundation models
  • Generative AI systems, multimodal and large language models
  • Synthetic data approaches for training, tuning, and testing
  • Planning, reasoning, and decision making
  • Technologies for improving AI safety, trust, security, and responsible use

• Biotechnologies
  • Novel synthetic biology including nucleic acid, genome, epigenome, and protein synthesis and engineering, including design tools
  • Multi-omics and other biometrology, bioinformatics, computational biology, predictive modeling, and analytical tools for functional phenotypes
  • Engineering of sub-cellular, multicellular, and multi-scale systems
  • Cell-free systems and technologies
  • Engineering of viral and viral delivery systems
  • Biotic/abiotic interfaces
  • Biomanufacturing and bioprocessing technologies

• Clean Energy Generation and Storage
  • Renewable generation
  • Renewable and sustainable chemistries, fuels, and feedstocks
  • Nuclear energy systems
  • Fusion energy
  • Energy storage
  • Electric and hybrid engines
  • Grid integration technologies
  • Energy-efficiency technologies
  • Carbon management technologies

• Data Privacy, Data Security, and Cybersecurity Technologies
  • Distributed ledger technologies
  • Digital assets
  • Digital payment technologies
  • Digital identity technologies, biometrics, and associated infrastructure
  • Communications and network security
  • Privacy-enhancing technologies
  • Technologies for data fusion and improving data interoperability, privacy, and security
  • Distributed confidential computing
  • Computing supply chain security
  • Security and privacy technologies in augmented reality/virtual reality

• Directed Energy
  • High-power microwaves
  • Particle beams

• Highly Automated, Autonomous, and Uncrewed Systems (UxS), and Robotics
  • Supporting digital infrastructure, including High Definition (HD) maps
  • Autonomous command and control

• Human-Machine Interfaces
  • Augmented reality
  • Virtual reality
  • Human-machine teaming
  • Neurotechnologies

• Hypersonics
  • Aerodynamics and control
  • Materials, structures, and manufacturing
  • Detection, tracking, characterization, and defense

• Integrated Communication and Networking Technologies
  • Radio-frequency (RF) and mixed-signal circuits, antennas, filters, and components
  • Spectrum management and sensing technologies
  • Future generation wireless networks
  • Optical links and fiber technologies
  • Terrestrial/undersea cables
  • Satellite-based and stratospheric communications
  • Delay-tolerant networking
  • Mesh networks/infrastructure independent communication technologies
  • Software-defined networking and radios
  • Modern data exchange techniques
  • Adaptive network controls
  • Resilient and adaptive waveforms

• Positioning, Navigation, and Timing (PNT) Technologies
  • Diversified PNT-enabling technologies for users and systems in airborne, space-based, terrestrial, subterranean, and underwater settings
  • Interference, jamming, and spoofing detection technologies, algorithms, analytics, and networked monitoring systems
  • Disruption/denial-resisting and hardening technologies

• Quantum Information and Enabling Technologies
  • Quantum computing
  • Materials, isotopes, and fabrication techniques for quantum devices
  • Quantum sensing
  • Quantum communications and networking
  • Supporting systems

• Semiconductors and Microelectronics
  • Design and electronic design automation tools
  • Manufacturing process technologies and manufacturing equipment
  • Beyond complementary metal-oxide-semiconductor (CMOS) technology
  • Heterogeneous integration and advanced packaging
  • Specialized/tailored hardware components for artificial intelligence, natural and hostile radiation environments, RF and optical components, high-power devices, and other critical applications
  • Novel materials for advanced microelectronics
  • Microelectromechanical systems (MEMS) and Nanoelectromechanical systems (NEMS)
  • Novel architectures for non-Von Neumann computing

• Space Technologies and Systems
  • In-space servicing, assembly, and manufacturing as well as enabling technologies
  • Technology enablers for cost-effective on-demand, and reusable space launch systems
  • Technologies that enable access to and use of cislunar space and/or novel orbits
  • Sensors and data analysis tools for space-based observations
  • Space propulsion
  • Advanced space vehicle power generation
  • Novel space vehicle thermal management
  • Crewed spaceflight enablers
  • Resilient and path-diverse space communication systems, networks, and ground stations
  • Space launch, range, and safety technologies

