
Active Learning


Active learning includes any type of instructional activity that engages students in learning, beyond listening, reading, and memorizing.  As examples, students might talk to a classmate about a challenging question, respond to an in-class prompt in writing, make a prediction about an experiment, or apply knowledge from a reading to a case study.  Active learning commonly includes collaboration between students in pairs or larger groups, but independent activities that involve reflection or writing—like quick-writes, or real-time polling in lectures—are also valuable.

Instructors can employ active learning in classes of any size, although certain activities may be better suited for smaller classes than large lecture halls.  Nonetheless, even large classes—including classes that meet in lecture halls with fixed seats—can incorporate a variety of activities that encourage students to talk with each other, work in small groups on an activity, or respond to a question through in-class writing or polling.  Furthermore, even small classes can increase student engagement beyond what might occur in a full group discussion by varying the instructional approaches and including small group discussions and activities.

Why should I use active learning?

Active learning is valuable for a variety of reasons:

  • It provides instructors with feedback about what students are learning.
  • It helps students gauge their own understanding. By grappling with ideas, students connect new concepts to prior knowledge in meaningful ways and construct their own understanding.
  • Collaborating with classmates promotes community and connection between students, which can enhance a sense of belonging as well as motivation.
  • It creates a low bar to participation for quiet or passive students by encouraging every student to think and do.

Many of the larger-scale studies on active learning have been conducted in STEM disciplines, although it is reasonable to expect that the benefits of active learning extend to any field.  A 2014 meta-analysis of 225 research studies in STEM classes found that students in classes with active learning performed 6% better on exams than students in classes with traditional lecturing, and that students in classes with traditional lecturing were 1.5 times more likely to fail than students in classes with active learning (Freeman et al., 2014).  Additionally, active learning has been shown to decrease the achievement gap for underrepresented minorities and first-generation college students (Theobald et al., 2020).

What are some examples?


Active learning strategies come in many varieties, most of which can be grafted onto existing courses without costly revisions. One of the simplest and most elegant exercises, called think-pair-share, can easily be written into almost any lecture. In this exercise, students are given a minute to think about a question on their own—and perhaps respond to it in writing.  Students next exchange ideas with a partner.  Finally, some students share with the entire class. A think-pair-share engages every student, and it encourages more participation than simply asking for volunteers to respond to a question.

Other active learning exercises include:

  • Case studies: In a case study, students apply their knowledge to real-life scenarios, requiring them to synthesize a variety of information and make recommendations.
  • Collaborative note-taking: The instructor pauses during class and asks students to take a few minutes to summarize in writing what they have just learned and/or consolidate their notes.  Students then exchange notes with a partner to compare; this can highlight key ideas that a student might have missed or misunderstood.
  • Concept map: This activity helps students understand the relationships between concepts. Typically, students are provided with a list of terms.  They arrange the terms on paper and draw arrows between related concepts, labeling each arrow to explain the relationship.
  • Group work: Whether solving problems or discussing a prompt, working in small groups can be an effective way to engage students.  In some cases, all groups work on or discuss the same question; in other cases, the instructor might assign different topics to different groups.  The group’s task should be purposeful and structured so that there is an obvious advantage to working as a team rather than individually.  It is useful for groups to share their ideas with the rest of the class—whether by writing answers on the board, raising key points that were discussed, or sharing a poster they created.
  • Jigsaw: Small groups of students each discuss a different but related topic. Students are then shuffled so that each new group is composed of one student from each of the original groups. In these new groups, each student is responsible for sharing key aspects of their original discussion. The second group must synthesize and use all of the ideas from the first set of discussions in order to complete a new or more advanced task.  A nice feature of a jigsaw is that every student in the original group must fully understand the key ideas so that they can teach their classmates in the second group.

  • Minute paper: A minute paper can be used as a reflection at the end of class.  The instructor might ask students to write down the most important concept that they learned that day, as well as something they found confusing.  Targeted questions can also provide feedback to the instructor about students’ experience in the class.
  • Statement correction, or intentional mistakes: The instructor provides statements, readings, proofs, or other material that contains errors.  The students are charged with finding and correcting the errors.  Concepts that students commonly misunderstand are well suited to this activity.
  • Strip sequence, or sequence reconstruction: The goal of this activity is for students to order a set of items, such as steps in a biological process or a series of historical events.  As one strategy, the instructor provides students with the items written on strips of paper for the students to sort.  Removable labels with printed items also work well for this activity.
  • Polling: During class, the instructor asks a multiple-choice question.  Students can respond in a variety of ways.  Possibilities include applications such as PollEverywhere or Learning Catalytics.  In some courses, each student uses a handheld clicker, or personal response device, to record their answers through software such as TurningPoint or iClicker.  Alternatively, students can respond to a multiple-choice question by raising the appropriate number of fingers or by holding up a colored card, where colors correspond to the different answers. A particularly effective strategy is to ask each student to respond to the poll independently first, then discuss the question with a neighbor, and then re-vote.

ABL Connect  provides more in-depth information about and examples of many of these activities.

In addition to these classroom-based strategies, instructors might take students out of the classroom; for example, students can visit museums or libraries, engage in field research, or work with the local community. 

For more information...

Tipsheet: Active Learning

PhysPort resources on stimulating productive engagement

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How Learning Works: Seven Research-Based Principles for Smart Teaching. Chicago, IL: John Wiley & Sons.

Bain, K. (2004). What the Best College Teachers Do. Cambridge, MA: Harvard University Press.

Brookfield, S. D., & Preskill, S. (2012). Discussion as a Way of Teaching: Tools and Techniques for Democratic Classrooms. Chicago, IL: John Wiley & Sons.

Brookfield, S. D., & Preskill, S. (2016). The Discussion Book: 50 Great Ways to Get People Talking. San Francisco, CA: Jossey-Bass.

Brown, P. C., Roediger, H. L., & McDaniel, M. A. (2014). Make It Stick: The Science of Successful Learning (1st ed.). Cambridge, MA: Harvard University Press.

Handelsman, J., Miller, S., & Pfund, C. (2007). Scientific Teaching. New York, NY: W. H. Freeman and Company.

Lang, J. (2010). On Course: A Week-by-Week Guide to Your First Semester of College Teaching (1st ed.). Cambridge, MA: Harvard University Press.

Lang, J. (2016). Small Teaching: Everyday Lessons from the Science of Learning. San Francisco, CA: Jossey-Bass.

Millis, B. J. (1990). Helping Faculty Build Learning Communities Through Cooperative Groups. Retrieved August 31, 2017, from http://digitalcommons.unl.edu/podimproveacad/202/

Open access | Published: 15 March 2021

Instructor strategies to aid implementation of active learning: a systematic literature review

Kevin A. Nguyen, Maura Borrego, Cynthia J. Finelli, Matt DeMonbrun, Caroline Crockett, Sneha Tharayil, Prateek Shekhar, Cynthia Waters & Robyn Rosenberg

International Journal of STEM Education, volume 8, Article number: 9 (2021)

Abstract

Despite the evidence supporting the effectiveness of active learning in undergraduate STEM courses, the adoption of active learning has been slow. One barrier to adoption is instructors’ concerns about students’ affective and behavioral responses to active learning, especially student resistance. Numerous education researchers have documented their use of active learning in STEM classrooms. However, no research yet systematically analyzes these studies for strategies to aid implementation of active learning and address students’ affective and behavioral responses. In this paper, we conduct a systematic literature review and identify 29 journal articles and conference papers that researched active learning, affective and behavioral student responses, and recommended at least one strategy for implementing active learning. We ask: (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide?

In our review, we noted that most active learning activities involved in-class problem solving within a traditional lecture-based course (N = 21). We found mostly positive affective and behavioral outcomes for students’ self-reports of learning, participation in the activities, and course satisfaction (N = 23). From our analysis of the 29 studies, we identified eight strategies to aid implementation of active learning, grouped into three categories. Explanation strategies included providing students with clarifications and reasons for using active learning. Facilitation strategies entailed working with students and ensuring that the activity functioned as intended. Planning strategies involved working outside of class to improve the active learning experience.

To increase the adoption of active learning and address students’ responses to active learning, this study provides strategies to support instructors. The eight strategies are listed with evidence from numerous studies within our review on affective and behavioral responses to active learning. Future work should examine instructor strategies and their connection with other affective outcomes, such as identity, interests, and emotions.

Introduction

Prior reviews have established the effectiveness of active learning in undergraduate science, technology, engineering, and math (STEM) courses (e.g., Freeman et al., 2014; Lund & Stains, 2015; Theobald et al., 2020). In this review, we define active learning as classroom-based activities designed to engage students in their learning through answering questions, solving problems, discussing content, or teaching others, individually or in groups (Prince & Felder, 2007; Smith, Sheppard, Johnson, & Johnson, 2005). This definition is inclusive of research-based instructional strategies (RBIS, e.g., Dancy, Henderson, & Turpen, 2016) and evidence-based instructional practices (EBIPs, e.g., Stains & Vickrey, 2017). Past studies show that students perceive active learning as benefitting their learning (Machemer & Crawford, 2007; Patrick, Howell, & Wischusen, 2016) and increasing their self-efficacy (Stump, Husman, & Corby, 2014). Furthermore, the use of active learning in STEM fields has been linked to improvements in student retention and learning, particularly among students from some underrepresented groups (Chi & Wylie, 2014; Freeman et al., 2014; Prince, 2004).

Despite the overwhelming evidence in support of active learning (e.g., Freeman et al., 2014), prior research has found that traditional teaching methods such as lecturing are still the dominant mode of instruction in undergraduate STEM courses, and low adoption rates of active learning remain a problem (Hora & Ferrare, 2013; Stains et al., 2018). There are several reasons for these low adoption rates. Some instructors feel unconvinced that the effort required to implement active learning is worthwhile, and as many as 75% of instructors who have attempted specific types of active learning abandon the practice altogether (Froyd, Borrego, Cutler, Henderson, & Prince, 2013).

When asked directly about the barriers to adopting active learning, instructors cite a common set of concerns, including a lack of preparation or class time (Finelli, Daly, & Richardson, 2014; Froyd et al., 2013; Henderson & Dancy, 2007). Among these concerns, student resistance to active learning is a potential explanation for the low rates of instructor persistence with active learning, and this negative response has gained increased attention from the academic community (e.g., Owens et al., 2020). Of course, students can exhibit both positive and negative responses to active learning (Carlson & Winquist, 2011; Henderson, Khan, & Dancy, 2018; Oakley, Hanna, Kuzmyn, & Felder, 2007), but because student resistance can present a barrier to instructors, we focus here on negative student responses. Student resistance to active learning may manifest, for example, as a lack of student participation and engagement with in-class activities, declining attendance, or poor course evaluations and enrollments (Tolman, Kremling, & Tagg, 2016; Winkler & Rybnikova, 2019).

We define student resistance to active learning (SRAL) as a negative affective or behavioral student response to active learning (DeMonbrun et al., 2017; Weimer, 2002; Winkler & Rybnikova, 2019). The affective domain, as it relates to active learning, encompasses not only student satisfaction and perceptions of learning but also motivation-related constructs such as value, self-efficacy, and belonging. The behavioral domain relates to participating, putting forth a good effort, and attending class. The affective and behavioral domains differ from much of the prior research on active learning, which centers on measuring cognitive gains in student learning; systematic reviews are readily available on that topic (e.g., Freeman et al., 2014; Theobald et al., 2020). Schmidt, Rosenberg, and Beymer (2018) explain the relationship between the affective, cognitive, and behavioral domains, asserting that all three types of engagement are necessary for science learning, and conclude that “students are unlikely to exert a high degree of behavioral engagement during science learning tasks if they do not also engage deeply with the content affectively and cognitively” (p. 35). Thus, SRAL and negative affective and behavioral student responses are a critical but underexplored component of STEM learning.

Recent research on student affective and behavioral responses to active learning has uncovered mechanisms of student resistance. Deslauriers, McCarty, Miller, Callaghan, and Kestin’s (2019) interviews of physics students revealed that the additional effort required by the novel format of an interactive lecture was the primary source of student resistance. Owens et al. (2020) identified a similar source of student resistance to their carefully designed biology active learning intervention: students were concerned about the additional effort required and the unfamiliar student-centered format. Deslauriers et al. (2019) and Owens et al. (2020) go a step further in citing the self-efficacy (Bandura, 1982), mindset (Dweck & Leggett, 1988), and student engagement (Kuh, 2005) literature to explain student resistance. Similarly, Shekhar et al.’s (2020) review framed negative student responses to active learning in terms of expectancy-value theory (Wigfield & Eccles, 2000): students reacted negatively when they did not find active learning useful or worth the time and effort, or when they did not feel competent enough to complete the activities. Shekhar et al. (2020) also applied expectancy violation theory from physics education research (Gaffney, Gaffney, & Beichner, 2010) to explain how students’ initial expectations of a traditional course produced discomfort during active learning activities. To address both theories of student resistance, Shekhar et al. (2020) suggested that instructors provide scaffolding (Vygotsky, 1978) and support for self-directed learning activities. So, while framing the research as SRAL is relatively new, ideas about working with students to actively engage them in their learning are not. Prior literature on active learning in undergraduate STEM settings includes clues and evidence about strategies instructors can employ to reduce SRAL, even if the authors do not necessarily frame them as such.

Recent interest in student affective and behavioral responses to active learning, including SRAL, is a relatively new development. But, given the discipline-based education research (DBER) knowledge base around RBIS and EBIP adoption, we need not reinvent the wheel. In this paper, we conduct a systematic review. Systematic reviews are designed to methodically gather and synthesize results from multiple studies to provide a clear overview of a topic, presenting what is known and what is not known (Borrego, Foster, & Froyd, 2014). Such clarity informs decisions when designing or funding future research, interventions, and programs. Relevant studies for this paper are scattered across STEM disciplines and across DBER and general education venues, including journals and conference proceedings. Quantitative, qualitative, and mixed-methods approaches have been used to understand student affective and behavioral responses to active learning. Thus, a systematic review is appropriate for this topic given the long history of research on the development of RBIS, EBIPs, and active learning in STEM education; the distribution of primary studies across fields and formats; and the different methods used to evaluate students’ affective and behavioral responses.

Specifically, we conducted a systematic review to address two interrelated research questions: (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide? These two questions are linked by our goal of sharing instructor strategies that can either reduce SRAL or encourage positive student affective and behavioral responses. Therefore, the instructor strategies in this review come only from studies that present empirical data on affective and behavioral student responses to active learning. The strategies we identify will not surprise highly experienced teaching and learning practitioners or researchers. However, this review does provide an important link between these strategies and student resistance, which remains one of the most feared barriers to instructor adoption of RBIS, EBIPs, and other forms of active learning.

Conceptual framework: instructor strategies to reduce resistance

Recent research has identified specific instructor strategies that correlate with reduced SRAL and positive student response in undergraduate STEM education (Finelli et al., 2018; Nguyen et al., 2017; Tharayil et al., 2018). For example, Deslauriers et al. (2019) suggested that physics students perceive the additional effort required by active learning as evidence of less effective learning. To address this, the authors included a 20-minute lecture about active learning in a subsequent course offering. By the end of that course, 65% of students reported increased enthusiasm for active learning, and 75% said the lecture intervention positively impacted their attitudes toward active learning. Explaining how active learning activities contribute to student learning is just one of many strategies instructors can employ to reduce SRAL (Tharayil et al., 2018).

DeMonbrun et al. (2017) provided a conceptual framework for differentiating instructor strategies, which includes not only an explanation type of instructor strategy (e.g., Deslauriers et al., 2019; Tharayil et al., 2018) but also a facilitation type. Explanation strategies involve describing to students the purpose of the activity (such as how it relates to their learning) and the expectations for it. Typically, instructors use explanation strategies before the in-class activity has begun. Facilitation strategies include promoting engagement and keeping the activity running smoothly once it has begun; specific examples include walking around the classroom and directly encouraging students. We use the existing categories of explanation and facilitation as a conceptual framework to guide our analysis and systematic review.

As a conceptual framework, explanation and facilitation strategies describe ways to aid the implementation of RBIS, EBIPs, and other types of active learning. In fact, the work on these types of instructor strategies is related to research perspectives on higher education faculty development, implementation, and institutional change (e.g., Borrego, Cutler, Prince, Henderson, & Froyd, 2013; Henderson, Beach, & Finkelstein, 2011; Kezar, Gehrke, & Elrod, 2015). As such, the specific types of strategies reviewed here are geared to assist instructors in moving toward more student-centered teaching methods by addressing their concerns about student resistance.

SRAL is a particular negative form of affective or behavioral student response (DeMonbrun et al., 2017; Weimer, 2002; Winkler & Rybnikova, 2019). Affective and behavioral student responses are conceptualized at the reactionary level of outcomes (Kirkpatrick, 1976), which consists of how students feel (affective) and how they conduct themselves within the course (behavioral). Although affective and behavioral student responses to active learning are less frequently reported than cognitive outcomes, prior research suggests a few conceptual constructs within these outcomes.

Affective outcomes consist of students’ feelings, preferences, and satisfaction with the course. Affective outcomes also include students’ self-reports of whether they thought they learned more (or less) during active learning instruction. Some relevant affective outcomes include students’ perceived value or utility of active learning (Shekhar et al., 2020; Wigfield & Eccles, 2000), their positivity toward or enjoyment of the activities (DeMonbrun et al., 2017; Finelli et al., 2018), and their self-efficacy or confidence in doing the in-class activity (Bandura, 1982).

In contrast, students’ behavioral responses to active learning consist of their actions and practices during active learning. These include students’ attendance in the class; their participation, engagement, and effort with the activity; and their distraction or off-task behavior (e.g., checking their phones, leaving to use the restroom) during the activity (DeMonbrun et al., 2017; Finelli et al., 2018; Winkler & Rybnikova, 2019).

We conceptualize negative or low scores in either affective or behavioral student outcomes as an indicator of SRAL (DeMonbrun et al., 2017; Nguyen et al., 2017). For example, a low score in reported course satisfaction would be an example of SRAL. This paper aims to synthesize instructor strategies to aid implementation of active learning from studies that either address SRAL and its negative or low scores or relate instructor strategies to positive or high scores. Therefore, we also conceptualize positive student affective and behavioral outcomes as the absence of SRAL. For easy categorization in this review, we summarize each study’s affective and behavioral outcomes on active learning as positive, mostly positive, mixed/neutral, mostly negative, or negative.

Methods

We conducted a systematic literature review (Borrego et al., 2014; Gough, Oliver, & Thomas, 2017; Petticrew & Roberts, 2006) to identify primary research studies that describe active learning interventions in undergraduate STEM courses, recommend one or more strategies to aid implementation of active learning, and report student response outcomes to active learning.

A systematic review was warranted due to the popularity of active learning and the publication of numerous papers on the topic. Multiple STEM disciplines and research audiences have published journal articles and conference papers on the topic of active learning in the undergraduate STEM classroom. However, it was not immediately clear which studies addressed active learning, affective and behavioral student responses, and strategies to aid implementation of active learning. We used the systematic review process to efficiently gather results of multiple types of studies and create a clear overview of our topic.

Definitions

For clarity, we define several terms in this review. Researchers refer to us, the authors of this manuscript. Authors and instructors wrote the primary studies we reviewed, and we refer to these primary studies as “studies” consistently throughout. We use the term activity or activities to refer to the specific in-class active learning tasks assigned to students. Strategies refer to the instructor strategies used to aid implementation of active learning and address student resistance to active learning (SRAL). Student response includes affective and behavioral responses and outcomes related to active learning. SRAL is an acronym for student resistance to active learning, defined here as a negative affective or behavioral student response. Categories or category refer to a grouping of strategies to aid implementation of active learning, such as explanation or facilitation. Excerpts are quotes from studies, and these excerpts are used as codes and examples of specific strategies.

Study timeline, data collection, and sample selection

From 2015 to 2016, we worked with a research librarian to locate relevant studies and conduct a keyword search within six databases: two multidisciplinary databases (Web of Science and Academic Search Complete), two major engineering and technology indexes (Compendex and Inspec), and two popular education databases (Education Source and the Education Resources Information Center). We created inclusion criteria that listed both search strings and study requirements:

1. Studies must include an in-class active learning intervention. This does not include laboratory classes. The corresponding search string was:

   “active learning” or “peer-to-peer” or “small group work” or “problem based learning” or “problem-based learning” or “problem-oriented learning” or “project-based learning” or “project based learning” or “peer instruction” or “inquiry learning” or “cooperative learning” or “collaborative learning” or “student response system” or “personal response system” or “just-in-time teaching” or “just in time teaching” or clickers

2. Studies must include empirical evidence addressing student response to the active learning intervention. The corresponding search string was:

   “affective outcome” or “affective response” or “class evaluation” or “course evaluation” or “student attitudes” or “student behaviors” or “student evaluation” or “student feedback” or “student perception” or “student resistance” or “student response”

3. Studies must describe a STEM course, as defined by the topic of the course rather than by the department of the course or the major of the students enrolled (e.g., a business class for mathematics majors would not be included, but a mathematics class for business majors would).

4. Studies must be conducted in undergraduate courses and must not include K-12, vocational, or graduate education.

5. Studies must be in English and published between 1990 and 2015 as journal articles or conference papers.
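Taken together, the two search strings act as an AND of two OR-groups: a study must match at least one active-learning term and at least one student-response term. A minimal sketch of how such a boolean query could be assembled is below; this is an illustration assuming standard database boolean syntax, not the authors' actual tooling, and the term lists are abridged (the full lists appear in the criteria above).

```python
# Illustrative only: combine groups of search terms into one boolean query
# of the form most bibliographic databases accept (OR within a concept,
# AND across concepts). Term lists are abridged from the criteria above.

ACTIVE_LEARNING_TERMS = [
    '"active learning"', '"peer instruction"', '"problem-based learning"',
    '"cooperative learning"', '"collaborative learning"', 'clickers',
]

STUDENT_RESPONSE_TERMS = [
    '"student attitudes"', '"student resistance"', '"student response"',
    '"course evaluation"', '"student perception"',
]

def build_query(*term_groups):
    """OR the terms within each group, then AND the groups together."""
    return " AND ".join(
        "(" + " OR ".join(terms) + ")" for terms in term_groups
    )

query = build_query(ACTIVE_LEARNING_TERMS, STUDENT_RESPONSE_TERMS)
```

Running `build_query` on the full term lists reproduces the conjunction the review describes: a hit in any database must satisfy both concept groups at once.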

In addition to searching the six databases, we emailed solicitations to U.S. National Science Foundation Improving Undergraduate STEM Education (NSF IUSE) grantees. Between the database searches and the email solicitation, we identified 2364 studies after removing duplicates. Most studies came from the database search; we received just 92 studies from the email solicitation (Fig. 1).

Fig. 1. PRISMA screening overview, styled after Liberati et al. (2009) and Passow and Passow (2017)

Next, we followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines for screening studies with our inclusion criteria (Borrego et al., 2014; Petticrew & Roberts, 2006). From 2016 to 2018, a team of seven researchers conducted two rounds of review in RefWorks: the first round with only titles and abstracts, and the second round with the entire full text. In both rounds, two researchers independently decided whether each study should be retained based on the inclusion criteria listed above. At the abstract review stage, if the independent coders disagreed, we passed the study on to the full-text screening round. We screened a total of 2364 abstracts, and only 746 studies passed this first round of title and abstract review (see the PRISMA flow chart in Fig. 1). If the independent coders still disagreed at the full-text screening round, the seven researchers met and discussed the study, clarified the inclusion criteria as needed to resolve potential future disagreements, and, when necessary, took a majority vote (4 of the 7 researchers) on the inclusion of the study. Because of the high number of coders, it was unusual to reach full consensus among all 7, so a majority vote was used to finalize the inclusion of certain studies. We resolved these disagreements on a rolling basis and, depending on the round (abstract or full text), disagreed about 10–15% of the time on the inclusion of a study. In both rounds of screening, studies were often excluded because they did not gather novel empirical data or evidence (inclusion criterion #2) or were not set in an undergraduate STEM course (criteria #3 and #4). Only 412 studies met all of our final inclusion criteria.
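The two-round decision rules described above can be sketched as follows. The function names and the representation of votes as booleans are assumptions made for illustration; this is not the authors' actual procedure or code.

```python
# Illustrative sketch of the two-round PRISMA screening rules described
# above. Names and data shapes are assumptions, not the authors' code.

def abstract_round(coder_a_includes, coder_b_includes):
    """Round 1 (title/abstract): a study advances to full-text review if
    either coder votes to include it; disagreements are passed forward
    rather than resolved at this stage."""
    return coder_a_includes or coder_b_includes

def full_text_round(coder_a_includes, coder_b_includes, panel_votes=None):
    """Round 2 (full text): agreement between the two coders decides; on
    disagreement, the seven researchers discuss and, when necessary,
    take a majority vote (at least 4 of 7) on inclusion."""
    if coder_a_includes == coder_b_includes:
        return coder_a_includes
    if panel_votes is None or len(panel_votes) != 7:
        raise ValueError("a disagreement requires all seven panel votes")
    return sum(panel_votes) >= 4
```

Note the asymmetry: round 1 is deliberately permissive (any single include advances a study), while round 2 requires agreement or a panel majority, which matches the reported 10–15% disagreement rates being resolved on a rolling basis.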

Coding procedure

From 2017 to 2018, a team of five researchers coded these 412 studies for detailed information. To efficiently gather information about all 412 studies and to answer the first part of our research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we developed an online coding form using Google Forms and Google Sheets. The five researchers piloted and refined the coding form over three rounds of pair coding, using 19 studies to test and revise early versions. The final coding form (Borrego et al., 2018) used a mix of multiple-choice and free-response items covering study characteristics (bibliographic information, type of publication, location of study), course characteristics (discipline, course level, number of students sampled, and type of active learning), methodology (main type of evidence collected, sample size, and analysis methods), study findings (types of student responses and outcomes), and strategies reported (whether the study explicitly mentioned using strategies to aid implementation of active learning).

In the end, only 29 studies explicitly described strategies to aid implementation of active learning (Fig. 1), and these 29 studies form the dataset for this study. The main difference between these 29 studies and the other 383 was that they explicitly described how the authors implemented active learning in their courses to address SRAL or promote positive student outcomes. Although some readers who are experienced active learning instructors or education researchers may view pedagogies and strategies as integrated, we found that most papers described active learning methods in terms of student tasks, while advice on strategies, if included, tended to appear separately. We chose not to overinterpret passing mentions of how active learning was implemented as strategies recommended by the authors.

Analysis procedure for coding strategies

To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we closely reviewed the 29 studies to analyze the strategies in more detail. We used Boyatzis's (1998) thematic analysis technique to compile all mentions of instructor strategies to aid implementation of active learning and to categorize these excerpts into specific strategies. This technique uses both deductive and inductive coding processes (Creswell & Creswell, 2017; Jesiek, Mazzurco, Buswell, & Thompson, 2018).

In 2018, three researchers independently reread the 29 studies, marking excerpts related to strategies. We found a total of 126 excerpts; the number within each study ranged from 1 to 14 (M = 4, SD = 3). We then pasted each excerpt into its own row in a Google Sheet. We examined the entire spreadsheet as a team and grouped similar excerpts together using a deductive coding process. We applied the explanation and facilitation conceptual framework (DeMonbrun et al., 2017) and placed each excerpt into one of the two categories. We also assigned each excerpt a specific strategy from the framework (e.g., describing the purpose of the activity, or encouraging students).
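The descriptive statistics reported above (M = 4, SD = 3) are straightforward to reproduce from per-study excerpt counts. A sketch using Python's standard library, with hypothetical counts (the text does not list the actual per-study tallies):

```python
import statistics

# Hypothetical per-study excerpt counts; the actual dataset had 29
# studies and 126 excerpts, with per-study counts ranging from 1 to 14.
excerpts_per_study = [1, 2, 2, 3, 3, 4, 4, 5, 6, 14]

mean = statistics.mean(excerpts_per_study)
sd = statistics.stdev(excerpts_per_study)  # sample standard deviation
print(f"M = {mean:.1f}, SD = {sd:.1f}, "
      f"range = {min(excerpts_per_study)}-{max(excerpts_per_study)}")
```

Whether the reported SD is the sample or population version is not stated in the text; `statistics.stdev` computes the sample standard deviation.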

However, multiple excerpts did not easily match either category; we set these aside for the inductive coding process. We then reviewed all uncategorized excerpts and proposed a new third category, called planning. We based this new category on the observation that the existing explanation and facilitation conceptual framework did not capture strategies that occur outside of the classroom. We discuss the specific strategies within the planning category in the Results. With the new category in hand, we created a preliminary codebook consisting of the explanation, facilitation, and planning categories and their respective specific strategies.

We then passed the spreadsheet and preliminary codebook to another researcher who had not previously seen the excerpts. This second researcher assigned categories and strategies to all the excerpts without seeing the suggestions of the initial three researchers. The second researcher also created new strategies and codes when a specific strategy was not present in the preliminary codebook; all of these new strategies and codes fell within the planning category. The second researcher agreed with the assigned categories and implementation strategies for 71% of the excerpts. A researcher from the initial strategies coding met with the second researcher to discuss all disagreements. Most of the disagreements (29% of excerpts) involved the specific strategies within the new planning category: because the second researcher created new planning strategies, those codes were disagreements by default. The two researchers resolved the disagreements by finalizing a codebook that combined the full list of planning strategies with the previous explanation and facilitation strategies. Finally, they conducted a last round of coding with the final codebook, this time working together in the same coding sessions. Any disagreements were immediately resolved through discussion and updates to the final strategy codes. In the end, all 126 excerpts were coded and retained.
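The 71% figure is simple percent agreement between the two coders' assignments. A minimal sketch (the labels below are illustrative, not the study's actual codes):

```python
def percent_agreement(codes_a, codes_b):
    """Fraction of items to which two coders assigned the same code."""
    assert len(codes_a) == len(codes_b)
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# Illustrative category labels for four excerpts (the study had 126).
coder1 = ["explanation", "facilitation", "planning", "planning"]
coder2 = ["explanation", "facilitation", "planning", "facilitation"]
print(f"{percent_agreement(coder1, coder2):.0%} agreement")  # prints "75% agreement"
```

Note that plain percent agreement does not correct for chance agreement, unlike measures such as Cohen's kappa; the text reports only the raw percentage.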

Characteristics of the primary studies

To answer our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we report the results from our coding and systematic review process. We discuss characteristics of studies within our dataset below and in Table 1 .

Type of publication and research audience

Of the 29 studies, 11 studies were published in conference proceedings, while the remaining 18 studies were journal articles. Examples of journals included the European Journal of Engineering Education , Journal of College Science Teaching , and PRIMUS (Problems, Resources, and Issues in Mathematics Undergraduate Studies).

In terms of research audiences and perspectives, both US and international views were represented. Eighteen studies were from North America, two were from Australia, three were from Asia, and six were from Europe. For more details about the type of research publications, full bibliographic information for all 29 studies is included in the Appendix.

Types of courses sampled

Studies sampled different types of undergraduate STEM courses. In terms of course year, most studies sampled first-year courses (13 studies), though all four course years were represented (4 second-year, 3 third-year, 2 fourth-year, 7 not reported). With regard to course discipline or major, all major STEM education disciplines were represented. Fourteen studies were conducted in engineering courses, spanning most major engineering subdisciplines: electrical and computer engineering (4 studies), mechanical engineering (3 studies), general engineering (3 studies), chemical engineering (2 studies), and civil engineering (1 study). Thirteen studies were conducted in science courses (3 physics/astronomy, 7 biology, 3 chemistry), and 2 studies were conducted in mathematics or statistics courses.

For teaching methods, most studies sampled traditional courses that were primarily lecture-based but included some in-class activities. The most common activity was giving class time for students to do problem solving (PS) (21 studies); students worked either in groups (16 studies) or individually (5 studies), and sometimes both in the same course. Project- or problem-based learning (PBL) was the second most frequently reported activity (8 studies), with implementations ranging from end-of-term final projects to an entire project- or problem-based course. The third most common activities were using clickers (4 studies) and holding class discussions (4 studies).

Research design, methods, and outcomes

The 29 studies used quantitative (10 studies), qualitative (6 studies), or mixed methods (13 studies) research designs. Most studies used self-made instructor surveys (IS) as their main source of evidence (20 studies). In contrast, only 2 studies used survey instruments with evidence of validity (IEV). Other forms of data collection included institutions' end-of-course evaluations (EOC) (10 studies), observations (5 studies), and interviews (4 studies).

Studies reported a variety of different measures for researching students’ affective and behavioral responses to active learning. The most common measure was students’ self-reports of learning (an affective outcome); twenty-one studies measured whether students thought they learned more or less due to the active learning intervention. Other common measures included whether students participated in the activities (16 studies, participation), whether they enjoyed the activities (15 studies, enjoyment), and if students were satisfied with the overall course experience (13 studies, course satisfaction). Most studies included more than one measure. Some studies also measured course attendance (4 studies) and students’ self-efficacy with the activities and relevant STEM disciplines (4 studies).

We found that 23 of the 29 studies reported positive or mostly positive outcomes for students' affective and behavioral responses to active learning. Five studies reported mixed or neutral outcomes, and only one study reported a negative student response to active learning. We discuss the implications of this scarcity of negative study outcomes and reports of SRAL in our dataset in the “Discussion” section.

To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we provide descriptions, categories, and excerpts of specific strategies found within our systematic literature review.

Explanation strategies

Explanation strategies provide students with clarifications and reasons for using active learning (DeMonbrun et al., 2017 ). Within the explanation category, we identified two specific strategies: establish expectations and explain the purpose .

Establish expectations

Establishing expectations means setting the tone and routine for active learning at both the course and in-class activity level. Instructors can discuss expectations at the beginning of the semester, at the start of a class session, or right before the activity.

For establishing expectations at the beginning of the semester, studies provided specific ways to ensure students become familiar with active learning as early as possible. These included “introduc[ing] collaborative learning at the beginning of the academic term” (Herkert, 1997, p. 450) and making sure that “project instructions and the data were posted fairly early in the semester, and the students were made aware that the project was an important part of their assessment” (Krishnan & Nalim, 2009, p. 5).

McClanahan and McClanahan ( 2002 ) described the importance of explaining how the course will use active learning and purposely using the syllabus to do this:

Set the stage. Create the expectation that students will actively participate in this class. One way to accomplish that is to include a statement in your syllabus about your teaching strategies. For example: I will be using a variety of teaching strategies in this class. Some of these activities may require that you interact with me or other students in class. I hope you will find these methods interesting and engaging and that they enable you to be more successful in this course . In the syllabus, describe the specific learning activities you plan to conduct. These descriptions let the students know what to expect from you as well as what you expect from them (emphasis added, p. 93).

Early on, students see that the course is interactive, and they also see the activities required to be successful in the course.

These studies and excerpts demonstrate the importance of explaining to students how in-class activities relate to course expectations. Instructors using active learning should start the semester with clear expectations for how students should engage with activities.

Explain the purpose

Explaining the purpose includes offering students reasons why certain activities are being used and convincing them of the importance of participating.

One way that studies explained the purpose of the activities was by leveraging and showing assessment data on active learning. For example, Lenz ( 2015 ) dedicated class time to show current students comments from previous students:

I spend the first few weeks reminding them of the research and of the payoff that they will garner and being a very enthusiastic supporter of the [active learning teaching] method. I show them comments I have received from previous classes and I spend a lot of time selling the method (p. 294).

Providing current students comments from previous semesters may help students see the value of active learning. Lake ( 2001 ) also used data from prior course offerings to show students “the positive academic performance results seen in the previous use of active learning” on the first day of class (p. 899).

However, sharing the effectiveness of the activities does not have to be constrained to the beginning of the course. Autin et al. ( 2013 ) used mid-semester test data and comparisons to sell the continued use of active learning to their students. They said to students:

Based on your reflections, I can see that many of you are not comfortable with the format of this class. Many of you said that you would learn better from a traditional lecture. However, this class, as a whole, performed better on the test than my other [lecture] section did. Something seems to be working here (p. 946).

Showing students comparisons between active learning and traditional lecture classes is a powerful way to demonstrate how active learning benefits them.

Explaining the purpose of the activities by sharing course data with students appears to be a useful strategy, as it tells students why active learning is being used and convinces students that active learning is making a difference.

Facilitation strategies

Facilitation strategies ensure continued engagement in class activities once they have begun, and many of the specific strategies within this category involve working directly with students. We identified two strategies within the facilitation category: approach students and encourage students .

Approach students

Approaching students means engaging with students during the activity. This includes physical proximity and monitoring students, walking around the classroom, and providing students with additional feedback, clarifications, or questions about the activity.

Several studies described how instructors circulated around the classroom to check on the progress of students during an activity. Lenz ( 2015 ) stated this plainly in her study, “While the students work on these problems I walk around the room, listening to their discussions” (p. 284). Armbruster et al. ( 2009 ) described this strategy and noted positive student engagement, “During each group-work exercise the instructor would move throughout the classroom to monitor group progress, and it was rare to find a group that was not seriously engaged in the exercise” (p. 209). Haseeb ( 2011 ) combined moving around the room and approaching students with questions, and they stated, “The instructor moves around from one discussion group to another and listens to their discussions, ask[ing] provoking questions” (p. 276). Certain group-based activities worked better with this strategy, as McClanahan and McClanahan ( 2002 ) explained:

Breaking the class into smaller working groups frees the professor to walk around and interact with students more personally. He or she can respond to student questions, ask additional questions, or chat informally with students about the class (p. 94).

Approaching students not only helps facilitate the activity but also gives the instructor a chance to work with students more closely and receive feedback. By walking around the classroom, instructors ensure that both they and their students continue to engage and participate in the activity.

Encourage students

Encouraging students includes creating a supportive classroom environment, motivating students to do the activity, building respect and rapport with students, demonstrating care, and having a positive demeanor toward students’ success.

Ramsier et al. ( 2003 ) provided a detailed explanation of the importance of building a supportive classroom environment:

Most of this success lies in the process of negotiation and the building of mutual respect within the class, and requires motivation, energy and enthusiasm on behalf of the instructor… Negotiation is the key to making all of this work, and building a sense of community and shared ownership. Learning students’ names is a challenge but a necessary part of our approach. Listening to student needs and wants with regard to test and homework due dates…projects and activities, etc. goes a long way to build the type of relationships within the class that we need in order to maintain and encourage performance (pp. 16–18).

Here, the authors described a few specific strategies for supporting a positive demeanor, such as learning students’ names and listening to student needs and wants, which helped maintain student performance in an active learning classroom.

Other ways to build a supportive classroom environment were for instructors to appear more approachable. For example, Bullard and Felder ( 2007 ) worked to “give the students a sense of their instructors as somewhat normal and approachable human beings and to help them start to develop a sense of community” (p. 5). As instructors and students become more comfortable working with each other, instructors can work toward easing “frustration and strong emotion among students and step by step develop the students’ acceptance [of active learning]” (Harun, Yusof, Jamaludin, & Hassan, 2012 , p. 234). In all, encouraging students and creating a supportive environment appear to be useful strategies to aid implementation of active learning.

Planning strategies

The planning category encompasses strategies that occur outside of class time, distinguishing it from the explanation and facilitation categories. Four strategies fall into this category: design appropriate activities , create group policies , align the course , and review student feedback .

Design appropriate activities

Many studies took into consideration the design of activities appropriate for their courses. This meant making sure the activity was suitable given the time, difficulty, and constraints of the course. Activities were designed to strike a balance between too difficult and too simple, to be engaging, and to provide opportunities for students to participate.

Li et al. ( 2009 ) explained the importance of outside-of-class planning and considering appropriate projects: “The selection of the projects takes place in pre-course planning. The subjects for projects should be significant and manageable” (p. 491). Haseeb ( 2011 ) further emphasized a balance in design by discussing problems (within problem-based learning) between two parameters, “the problem is deliberately designed to be open-ended and vague in terms of technical details” (p. 275). Armbruster et al. ( 2009 ) expanded on the idea of balanced activities by connecting it to group-work and positive outcomes, and they stated, “The group exercises that elicited the most animated student participation were those that were sufficiently challenging that very few students could solve the problem individually, but at least 50% or more of the groups could solve the problem by working as a team” (p. 209).

Instructors should consider the design of activities outside of class time. Activities should be appropriately challenging but achievable for students, so that students remain engaged and participate with the activity during class time.

Create group policies

Creating group policies means considering rules when using group activities. This strategy is unique in that it directly addresses a specific subset of activities, group work. These policies included setting team sizes and assigning specific roles to group members.

Studies outlined a few specific approaches for assigning groups. For example, Ramsier et al. (2003) recommended frequently changing and randomizing groups: “When students enter the room on these days they sit in randomized groups of 3 to 4 students. Randomization helps to build a learning community atmosphere and eliminates cliques” (p. 4). Another strategy, combined with frequently changing groups, was not allowing students to select their own groups. Lehtovuori et al. (2013) used this to avoid problems of free-riding and group dysfunction:

For example, group division is an issue to be aware of...An easy and safe solution is to draw lots to assign the groups and to change them often. This way nobody needs to suffer from a dysfunctional group for too long. Popular practice that students self-organize into groups is not the best solution from the point of view of learning and teaching. Sometimes friendly relationships can complicate fair division of responsibility and work load in the group (p. 9).

Here, Lehtovuori et al. ( 2013 ) considered different types of group policies and concluded that frequently changing groups worked best for students. Kovac ( 1999 ) also described changing groups but assigned specific roles to individuals:

Students were divided into groups of four and assigned specific roles: manager, spokesperson, recorder, and strategy analyst. The roles were rotated from week to week. To alleviate complaints from students that they were "stuck in a bad group for the entire semester," the groups were changed after each of the two in-class exams (p. 121).

The use of four specific group roles is a potential group policy, and Kovac ( 1999 ) continued the trend of changing group members often.

Overall, these studies describe the importance of thinking about ways to implement group-based activities before enacting them during class, and they suggest that groups should be reconstituted frequently. Instructors using group activities should consider whether to use specific group member policies before implementing the activity in the classroom.

Align the course

Aligning the course emphasizes the importance of purposely connecting multiple parts of the course together. This strategy involves planning to ensure students are graded on their participation with the activities as well as considering the timing of the activities with respect to other aspects of the course.

Li et al. ( 2009 ) described aligning classroom tasks by discussing the importance of timing, and they wrote, “The coordination between the class lectures and the project phases is very important. If the project is assigned near the directly related lectures, students can instantiate class concepts almost immediately in the project and can apply the project experience in class” (p. 491).

Krishnan and Nalim ( 2009 ) aligned class activities with grades to motivate students and encourage participation: “The project was a component of the course counting for typically 10-15% of the total points for the course grade. Since the students were told about the project and that it carried a significant portion of their grade, they took the project seriously” (p. 4). McClanahan and McClanahan ( 2002 ) expanded on the idea of using grades to emphasize the importance of active learning to students:

Develop a grading policy that supports active learning. Active learning experiences that are important enough to do are important enough to be included as part of a student's grade…The class syllabus should describe your grading policy for active learning experiences and how those grades factor into the student's final grade. Clarify with the students that these points are not extra credit. These activities, just like exams, will be counted when grades are determined (p. 93).

Here, they suggest a clear grading policy that includes how activities will be assessed as part of students’ final grades.

de Justo and Delgado ( 2014 ) connected grading and assessment to learning and further suggested that reliance on exams may negatively impact student engagement:

Particular attention should be given to alignment between the course learning outcomes and assessment tasks. The tendency among faculty members to rely primarily on written examinations for assessment purposes should be overcome, because it may negatively affect students’ engagement in the course activities (p. 8).

Instructors should consider their overall assessment strategies, as overreliance on written exams could mean that students engage less with the activities.

When planning to use active learning, instructors should consider how activities are aligned with course content and students’ grades. Instructors should decide before active learning implementation whether class participation and engagement will be reflected in student grades and in the course syllabus.

Review student feedback

Reviewing student feedback includes both soliciting feedback about the activity and using that feedback to improve the course. This strategy can be an iterative process that occurs over several course offerings.

Many studies utilized student feedback to continuously revise and improve the course. For example, Metzger ( 2015 ) commented that “gathering and reviewing feedback from students can inform revisions of course design, implementation, and assessment strategies” (p. 8). Rockland et al. ( 2013 ) further described changing and improving the course in response to student feedback, “As a result of these discussions, the author made three changes to the course. This is the process of continuous improvement within a course” (p. 6).

Herkert ( 1997 ) also demonstrated the use of student feedback for improving the course over time: “Indeed, the [collaborative] learning techniques described herein have only gradually evolved over the past decade through a process of trial and error, supported by discussion with colleagues in various academic fields and helpful feedback from my students” (p. 459).

In addition to incorporating student feedback, McClanahan and McClanahan ( 2002 ) commented on how student feedback builds a stronger partnership with students, “Using student feedback to make improvements in the learning experience reinforces the notion that your class is a partnership and that you value your students’ ideas as a means to strengthen that partnership and create more successful learning” (p. 94). Making students aware that the instructor is soliciting and using feedback can help encourage and build rapport with students.

Instructors should review student feedback for continual and iterative course improvement. Much of the student feedback review occurs outside of class time, and it appears useful for instructors to solicit student feedback to guide changes to the course and build student rapport.

Summary of strategies

We list the strategies appearing in each study in short-hand form in Table 1. No study included all eight strategies. The studies that included the most were Bullard and Felder (2007) with 7 strategies, Armbruster et al. (2009) with 5, and Lenz (2015) with 5. These three studies were outliers, however, as most studies included only one or two strategies.

Table 2 presents a summary list of the specific strategies, their categories, and descriptions. We also note the number of unique studies ( N ) and excerpts ( n ) that included each specific strategy. In total, there were eight specific strategies within three categories. Most strategies fell under the planning category ( N = 26), with align the course being the most reported strategy ( N = 14). Approaching students ( N = 13) and reviewing student feedback ( N = 11) were the second and third most common strategies, respectively. Overall, we present eight strategies to aid implementation of active learning.
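The Table 2 tallies amount to counting coded excerpts in two ways: total excerpts per strategy (n) and unique studies per strategy (N). A sketch assuming excerpts are stored as (study, strategy) pairs (the study identifiers and counts below are illustrative only):

```python
from collections import Counter

# Illustrative coded excerpts as (study_id, strategy) pairs; the real
# dataset had 126 excerpts across 29 studies.
coded = [
    ("s1", "align the course"), ("s1", "align the course"),
    ("s1", "approach students"), ("s2", "align the course"),
    ("s3", "review student feedback"),
]

# n: total excerpts per strategy; N: unique studies per strategy.
n_excerpts = Counter(strategy for _, strategy in coded)
n_studies = {strategy: len({sid for sid, s in coded if s == strategy})
             for strategy in n_excerpts}

print(n_excerpts["align the course"], n_studies["align the course"])  # prints "3 2"
```

Distinguishing n from N matters because one study can contribute several excerpts for the same strategy, as the per-study range of 1 to 14 excerpts suggests.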

Characteristics of the active learning studies

To address our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we discuss the different ways studies reported research on active learning.

Limitations and gaps within the final sample

First, we must discuss the gaps within our final sample of 29 studies. We excluded numerous active learning studies ( N = 383) that did not discuss or reflect on the efficacy of their strategies to aid implementation of active learning. We also began this systematic literature review in 2015 and did not finish coding and analyzing the 2364 abstracts and 746 full texts until 2018; we acknowledge that multiple studies on active learning have been published since 2015. Acknowledging these limitations, we discuss our results and analysis in the context of the 29 studies in our dataset, which were published from 1990 to 2015.

Our final sample included only 2 studies sampling mathematics or statistics courses. There was also a lack of studies outside of first-year courses. Much of the active learning literature introduces interventions in first-year (cornerstone) or fourth-year (capstone) courses, but within our dataset we found a tendency to oversample first-year courses. Nevertheless, all four course years were represented, as were all major STEM disciplines, the most common being engineering (14 studies) and biology (7 studies).

Thirteen studies implemented course-based active learning interventions, such as project-based learning (8 studies), inquiry-based learning (3 studies), or a flipped classroom (2 studies). Only one study, Lenz ( 2015 ), used a previously published active learning intervention, which was Process-Oriented Guided Inquiry Learning (POGIL). Other examples of published active learning programs include the Student-Centered Active Learning Environment for Upside-down Pedagogies (SCALE-UP, Gaffney et al., 2010 ) and Chemistry, Life, the Universe, and Everything (CLUE, Cooper & Klymkowsky, 2013 ), but these were not included in our sample of 29 studies.

In contrast, most of the active learning interventions involved adding in-class problem solving (either with individual students or groups of students) to a traditional lecture course (21 studies). For some instructors attempting to adopt active learning, using this smaller active learning intervention (in-class problem solving) may be a good starting point.

Despite the variety of quantitative, qualitative, and mixed method research designs, most studies used either self-made instructor surveys (20 studies) or their institution’s course evaluations (10 studies). The variation between so many different versions of instructor surveys and course evaluations made it difficult to compare data or attempt a quantitative meta-analysis. Further, only 2 studies used instruments with evidence of validity. However, that trend may change as there are more examples of instruments with evidence of validity, such as the Student Response to Instructional Practices (StRIP, DeMonbrun et al., 2017 ), the Biology Interest Questionnaire (BIQ, Knekta, Rowland, Corwin, & Eddy, 2020 ), and the Pedagogical Expectancy Violation Assessment (PEVA, Gaffney et al., 2010 ).

We were also concerned about the use of institutional course evaluations (10 studies) as evidence of students' satisfaction and affective responses to active learning. Course evaluations capture more than students' responses to active learning: the scores are biased by instructors' gender (Mitchell & Martin, 2018) and race (Daniel, 2019), and they are strongly correlated with students' expected grade in the class (Nguyen et al., 2017). Despite these limitations, we kept course evaluations in our keyword search and inclusion criteria because they relate to instructors' concerns about student resistance to active learning, and these scores continue to be used in important reappointment, tenure, and promotion decisions (DeMonbrun et al., 2017).

In addition to students’ satisfaction, there were other measures related to students’ affective and behavioral responses to active learning. The most common was students’ self-report of whether they thought they learned more or less (21 studies). Other important affective outcomes included enjoyment (13 studies) and self-efficacy (4 studies). The most common behavioral measure was students’ participation (16 studies). Missing from this sample, however, were other affective outcomes, such as students’ identities, beliefs, emotions, values, and buy-in.

Positive outcomes for using active learning

Twenty-three of the 29 studies reported positive or mostly positive outcomes for their active learning intervention. At the start of this paper, we acknowledged that much of the existing research suggests widespread benefits of using active learning in undergraduate STEM courses. However, those documented benefits center on students’ cognitive learning outcomes (e.g., Theobald et al., 2020) rather than on students’ affective and behavioral responses to active learning. Here, we show positive affective and behavioral outcomes in terms of students’ self-reports of learning, enjoyment, self-efficacy, attendance, participation, and course satisfaction.

Given the lack of mixed/neutral or negative affective outcomes, it is important to acknowledge potential publication bias within our dataset. Authors may be hesitant to report negative outcomes of active learning interventions, and negative or non-significant results may not be easily published in undergraduate STEM education venues. These factors could help explain the absence of mixed/neutral or negative study outcomes in our dataset.

Strategies to aid implementation of active learning

We aimed to answer the question: what instructor strategies to aid implementation of active learning do the authors of these studies provide? We addressed this question by providing instructors and readers with a summary of actionable strategies they can take back to their own classrooms. Here, we discuss the range of strategies found in our systematic literature review.

Supporting instructors with actionable strategies

We identified eight specific strategies across three major categories: explanation, facilitation, and planning. Each strategy appeared in at least seven studies (Table 2), and each was written to be actionable and practical.

Strategies in the explanation category emphasize the importance of establishing expectations and explaining the purpose of active learning to students. Strategies in the facilitation category focus on approaching and encouraging students once activities are underway. Strategies in the planning category highlight the importance of working outside of class time to thoughtfully design appropriate activities, create policies for group work, align various components of the course, and review student feedback to iteratively improve the course.

However, as we note in the “Introduction” section, these strategies are not entirely new, and they will not surprise experienced researchers and educators. Still, no prior systematic review has compiled these instructor strategies in relation to students’ affective and behavioral responses to active learning. For example, the “explain the purpose” strategy is similar to productively framing an activity for students (e.g., Hutchison & Hammer, 2010). “Design appropriate activities” and “align various components of the course” relate to Vygotsky’s (1978) theories of scaffolding for students (Shekhar et al., 2020). “Review student feedback” and “approach students” relate to ideas on formative assessment (e.g., Pellegrino, DiBello, & Brophy, 2014) and to revising course materials in response to students’ ongoing needs.

We also acknowledge that ours is not an exhaustive list of specific strategies to aid implementation of active learning. More work is needed to measure and observe these strategies in action and to test their use against specific outcomes. Some of this measurement work has already begun (e.g., DeMonbrun et al., 2017; Finelli et al., 2018; Tharayil et al., 2018), but further testing and analysis would benefit the active learning community. We hope that our framework of explanation, facilitation, and planning strategies provides a guide for instructors adopting active learning. Since these strategies are compiled from the undergraduate STEM education literature and from research on affective and behavioral responses to active learning, instructors have compelling reason to use them.

One way to use these strategies is to consider the various phases of instruction and their sequence. That is, planning strategies are most applicable during the work that occurs prior to classroom instruction, explanation strategies are most useful when introducing students to active learning activities, and facilitation strategies are best enacted while students are already working on the assigned activities. Of course, these strategies may also be used in conjunction with each other and are not strictly limited to these phases. For example, one plausible approach is to use the planning strategies of design and alignment as areas of emphasis during explanation. Overall, we hope that this framework of strategies supports instructors’ adoption and sustained use of active learning.

Creation of the planning category

At the start of this paper, we presented a conceptual framework for strategies consisting of only the explanation and facilitation categories (DeMonbrun et al., 2017). One of the major contributions of this paper is the addition of a third category, planning, to that existing framework. Planning strategies were common throughout the systematic literature review, and many studies emphasized how much time and effort is needed to add active learning to a course. Although students may not see this preparation, and we did not anticipate this type of strategy initially, explicitly adding the planning category acknowledges the work instructors do outside of the classroom.

The planning strategies also highlight the need for instructors to not only think about implementing active learning before they enter the class, but to revise their implementation after the class is over. Instructors should refine their use of active learning through feedback, reflection, and practice over multiple course offerings. We hope this persistence can lead to long-term adoption of active learning.

Although our review ended in 2015, most STEM instruction remains didactic (Laursen, 2019; Stains et al., 2018), and adoption of active learning has not been sustained long term. In a push to increase the adoption of active learning within undergraduate STEM courses, we hope this study provides support and actionable strategies for instructors who are considering active learning but are concerned about student resistance.

We identified eight specific strategies to aid implementation of active learning based on three categories. The three categories of strategies were explanation, facilitation, and planning. In this review, we created the third category, planning, and we suggested that this category should be considered first when implementing active learning in the course. Instructors should then focus on explaining and facilitating their activity in the classroom. The eight specific strategies provided here can be incorporated into faculty professional development programs and readily adopted by instructors wanting to implement active learning in their STEM courses.

There remains important future work in active learning research, and we noted these gaps within our review. It would be useful to specifically review and measure instructor strategies in action and to compare their use against other affective outcomes, such as identity, interest, and emotions.

No previous study has compiled and synthesized strategies reported across multiple active learning studies, and we hope that this paper fills that important gap. The strategies identified in this review can help instructors persist beyond awkward initial implementations, avoid some problems altogether, and, most importantly, address student resistance to active learning. Further, the planning strategies emphasize that the use of active learning can be improved over time, which may help instructors hold more realistic expectations for the first or second time they implement a new activity. There are many benefits to introducing active learning in the classroom, and we hope these benefits are shared among more STEM instructors and students.

Availability of data and materials

The journal articles and conference proceedings that make up this review can be found through reverse citation lookup. See the Appendix for the references of all primary studies in this systematic review. We used the following databases to find studies: Web of Science, Academic Search Complete, Compendex, Inspec, Education Source, and the Education Resources Information Center. More details and keyword search strings are provided in the “Methods” section.

Abbreviations

Science, technology, engineering, and mathematics

Student resistance to active learning

Instrument with evidence of validity

Instructor surveys

Preferred Reporting Items for Systematic Reviews and Meta-Analyses

Problem solving

Problem or project-based learning

End of course evaluations

Armbruster, P., Patel, M., Johnson, E., & Weiss, M. (2009). Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE Life Sciences Education , 8 (3), 203–213. https://doi.org/10.1187/cbe.09-03-0025 .

Autin, M., Bateiha, S., & Marchionda, H. (2013). Power through struggle in introductory statistics. PRIMUS , 23 (10), 935–948. https://doi.org/10.1080/10511970.2013.820810 .

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist , 37 (2), 122 https://psycnet.apa.org/doi/10.1037/0003-066X.37.2.122 .

Berkling, K., & Zundel, A. (2015). Change Management: Overcoming the Challenges of Introducing Self-Driven Learning. International Journal of Engineering Pedagogy (iJEP), 5 (4), 38–46. https://www.learntechlib.org/p/207352/ .

Bilston, L. (1999). Lessons from a problem-based learning class in first year engineering statics . Paper presented at the 2nd Asia-Pacific Forum on Engineering and Technology Education, Clayton, Victoria.

Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education , 102 (3), 394–425. https://doi.org/10.1002/jee.20020 .

Borrego, M., Foster, M. J., & Froyd, J. E. (2014). Systematic literature reviews in engineering education and other developing interdisciplinary fields. Journal of Engineering Education , 103 (1), 45–76. https://doi.org/10.1002/jee.20038 .

Borrego, M., Nguyen, K., Crockett, C., DeMonbrun, M., Shekhar, P., Tharayil, S., … Waters, C. (2018). Systematic literature review of students’ affective responses to active learning: Overview of results . San Jose: Paper presented at the 2018 IEEE Frontiers in Education Conference (FIE). https://doi.org/10.1109/FIE.2018.8659306 .

Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development . Sage Publications Inc.

Breckler, J., & Yu, J. R. (2011). Student responses to a hands-on kinesthetic lecture activity for learning about the oxygen carrying capacity of blood. Advances in Physiology Education, 35 (1), 39–47. https://doi.org/10.1152/advan.00090.2010 .

Bullard, L., & Felder, R. (2007). A student-centered approach to the stoichiometry course . Honolulu: Paper presented at the 2007 ASEE Annual Conference and Exposition https://peer.asee.org/1543 .

Carlson, K. A., & Winquist, J. R. (2011). Evaluating an active learning approach to teaching introductory statistics: A classroom workbook approach. Journal of Statistics Education , 19 (1). https://doi.org/10.1080/10691898.2011.11889596 .

Chi, M. T., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist , 49 (4), 219–243. https://doi.org/10.1080/00461520.2014.965823 .

Christensen, T. (2005). Changing the learning environment in large general education astronomy classes. Journal of College Science Teaching, 35 (3), 34.

Cooper, M., & Klymkowsky, M. (2013). Chemistry, life, the universe, and everything: A new approach to general chemistry, and a model for curriculum reform. Journal of Chemical Education , 90 (9), 1116–1122. https://doi.org/10.1021/ed300456y .

Creswell, J. W., & Creswell, J. D. (2017). Research design: Qualitative, quantitative, and mixed methods approaches . Sage Publishing Inc.

Dancy, M., Henderson, C., & Turpen, C. (2016). How faculty learn about and implement research-based instructional strategies: The case of peer instruction. Physical Review Physics Education Research , 12 (1). https://doi.org/10.1103/PhysRevPhysEducRes.12.010110 .

Daniel, B. J. (2019). Teaching while black: Racial dynamics, evaluations, and the role of white females in the Canadian academy in carrying the racism torch. Race Ethnicity and Education , 22 (1), 21–37. https://doi.org/10.1080/13613324.2018.1468745 .

de Justo, E., & Delgado, A. (2014). Change to competence-based education in structural engineering. Journal of Professional Issues in Engineering Education and Practice , 141 (3). https://doi.org/10.1061/(ASCE)EI.1943-5541.0000215 .

DeMonbrun, R. M., Finelli, C., Prince, M., Borrego, M., Shekhar, P., Henderson, C., & Waters, C. (2017). Creating an instrument to measure student response to instructional practices. Journal of Engineering Education , 106 (2), 273–298. https://doi.org/10.1002/jee.20162 .

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences , 116 (39), 19251–19257. https://doi.org/10.1073/pnas.1821936116 .

Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review , 95 (2), 256–273. https://doi.org/10.1037/0033-295X.95.2.256 .

Finelli, C., Nguyen, K., Henderson, C., Borrego, M., Shekhar, P., Prince, M., … Waters, C. (2018). Reducing student resistance to active learning: Strategies for instructors. Journal of College Science Teaching , 47 (5), 80–91 https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-mayjune-2018/research-and-1 .

Finelli, C. J., Daly, S. R., & Richardson, K. M. (2014). Bridging the research-to-practice gap: Designing an institutional change plan using local evidence. Journal of Engineering Education , 103 (2), 331–361. https://doi.org/10.1002/jee.20042 .

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences , 111 (23), 8410–8415. https://doi.org/10.1073/pnas.1319030111 .

Froyd, J. E., Borrego, M., Cutler, S., Henderson, C., & Prince, M. J. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education , 56 (4), 393–399. https://doi.org/10.1109/TE.2013.2244602 .

Gaffney, J. D., Gaffney, A. L. H., & Beichner, R. J. (2010). Do they see it coming? Using expectancy violation to gauge the success of pedagogical reforms. Physical Review Special Topics - Physics Education Research , 6 (1), 010102. https://doi.org/10.1103/PhysRevSTPER.6.010102 .

Gough, D., Oliver, S., & Thomas, J. (2017). An introduction to systematic reviews . Sage Publishing Inc.

Harun, N. F., Yusof, K. M., Jamaludin, M. Z., & Hassan, S. A. H. S. (2012). Motivation in problem-based learning implementation. Procedia-Social and Behavioral Sciences , 56 , 233–242. https://doi.org/10.1016/j.sbspro.2012.09.650 .

Haseeb, A. (2011). Implementation of micro-level problem based learning in a course on electronic materials. Journal of Materials Education , 33 (5-6), 273–282 http://eprints.um.edu.my/id/eprint/5501 .

Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching , 48 (8), 952–984. https://doi.org/10.1002/tea.20439 .

Henderson, C., & Dancy, M. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research , 3 (2). https://doi.org/10.1103/PhysRevSTPER.3.020102 .

Henderson, C., Khan, R., & Dancy, M. (2018). Will my student evaluations decrease if I adopt an active learning instructional strategy? American Journal of Physics , 86 (12), 934–942. https://doi.org/10.1119/1.5065907 .

Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics , 3 (4), 447–462. https://doi.org/10.1007/s11948-997-0047-x .

Hodgson, Y., Benson, R., & Brack, C. (2013). Using action research to improve student engagement in a peer-assisted learning programme. Educational Action Research, 21 (3), 359-375. https://doi.org/10.1080/09650792.2013.813399 .

Hora, M. T., & Ferrare, J. J. (2013). Instructional systems of practice: A multi-dimensional analysis of math and science undergraduate course planning and classroom teaching. Journal of the Learning Sciences , 22 (2), 212–257. https://doi.org/10.1080/10508406.2012.729767 .

Hutchison, P., & Hammer, D. (2010). Attending to student epistemological framing in a science classroom. Science Education , 94 (3), 506–524. https://doi.org/10.1002/sce.20373 .

Jaeger, B., & Bilen, S. (2006). The one-minute engineer: Getting design class out of the starting blocks . Paper presented at the 2006 ASEE Annual Conference and Exposition, Chicago, IL. https://peer.asee.org/524 .

Jesiek, B. K., Mazzurco, A., Buswell, N. T., & Thompson, J. D. (2018). Boundary spanning and engineering: A qualitative systematic review. Journal of Engineering Education , 107 (3), 318–413. https://doi.org/10.1002/jee.20219 .

Kezar, A., Gehrke, S., & Elrod, S. (2015). Implicit theories of change as a barrier to change on college campuses: An examination of STEM reform. The Review of Higher Education , 38 (4), 479–506. https://doi.org/10.1353/rhe.2015.0026 .

Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and development handbook: A guide to human resource development . McGraw Hill.

Knekta, E., Rowland, A. A., Corwin, L. A., & Eddy, S. (2020). Measuring university students’ interest in biology: Evaluation of an instrument targeting Hidi and Renninger’s individual interest. International Journal of STEM Education , 7 , 1–16. https://doi.org/10.1186/s40594-020-00217-4 .

Kovac, J. (1999). Student active learning methods in general chemistry. Journal of Chemical Education , 76 (1), 120. https://doi.org/10.1021/ed076p120 .

Krishnan, S., & Nalim, M. R. (2009). Project based learning in introductory thermodynamics . Austin: Paper presented at the 2009 ASEE Annual Conference and Exposition https://peer.asee.org/5615 .

Kuh, G. D. (2005). Student engagement in the first year of college. In M. L. Upcraft, J. N. Gardner, & B. O. Barefoot (Eds.), Challenging and supporting the first-year student: A handbook for improving the first year of college , (pp. 86–107). Jossey-Bass.

Laatsch, L., Britton, L., Keating, S., Kirchner, P., Lehman, D., Madsen-Myers, K., Milson, L., Otto, C., & Spence, L. (2005). Cooperative learning effects on teamwork attitudes in clinical laboratory science students. American Society for Clinical Laboratory Science, 18(3). https://doi.org/10.29074/ascls.18.3.150 .

Lake, D. A. (2001). Student performance and perceptions of a lecture-based course compared with the same course utilizing group discussion. Physical Therapy , 81 (3), 896–902. https://doi.org/10.1093/ptj/81.3.896 .

Laursen, S. (2019). Levers for change: An assessment of progress on changing STEM instruction . American Association for the Advancement of Science. https://www.aaas.org/resources/levers-change-assessment-progress-changing-stem-instruction .

Lehtovuori, A., Honkala, M., Kettunen, H., & Leppävirta, J. (2013). Interactive engagement methods in teaching electrical engineering basic courses. Paper presented at the IEEE Global Engineering Education Conference (EDUCON), Berlin, Germany. https://doi.org/10.1109/EduCon.2013.6530089 .

Lenz, L. (2015). Active learning in a math for liberal arts classroom. PRIMUS , 25 (3), 279–296. https://doi.org/10.1080/10511970.2014.971474 .

Li, J., Zhao, Y., & Shi, L. (2009). Interactive teaching methods in information security course . Paper presented at the International Conference on Scalable Computing and Communications; The Eighth International Conference on Embedded Computing. https://doi.org/10.1109/EmbeddedCom-ScalCom.2009.94 .

Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gotzsche, P. C., Ioannidis, J. P., … Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: Explanation and elaboration. Journal of Clinical Epidemology , 62 (10), e1–e34. https://doi.org/10.1016/j.jclinepi.2009.06.006 .

Lund, T. J., & Stains, M. (2015). The importance of context: An exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty. International Journal of STEM Education , 2 (1). https://doi.org/10.1186/s40594-015-0026-8 .

Machemer, P. L., & Crawford, P. (2007). Student perceptions of active learning in a large cross-disciplinary classroom. Active Learning in Higher Education , 8 (1), 9–30. https://doi.org/10.1177/1469787407074008 .

Maib, J., Hall, R., Collier, H., & Thomas, M. (2006). A multi-method evaluation of the implementation of a student response system . Paper presented at the 12th Americas’ Conference on Information Systems (AMCIS), Acapulco, Mexico. https://aisel.aisnet.org/amcis2006/27 .

McClanahan, E. B., & McClanahan, L. L. (2002). Active learning in a non-majors biology class: Lessons learned. College Teaching , 50 (3), 92–96. https://doi.org/10.1080/87567550209595884 .

McLoone, S., & Brennan, C. (2015). On the use and evaluation of a smart device student response system in an undergraduate mathematics classroom. AISHE-J: The All Ireland Journal of Teaching and Learning in Higher Education, 7(3). http://ojs.aishe.org/index.php/aishe-j/article/view/243 .

Metzger, K. J. (2015). Collaborative teaching practices in undergraduate active learning classrooms: A report of faculty team teaching models and student reflections from two biology courses. Bioscene: Journal of College Biology Teaching , 41 (1), 3–9 http://www.acube.org/wp-content/uploads/2017/11/2015_1.pdf .

Mitchell, K. M., & Martin, J. (2018). Gender bias in student evaluations. PS: Political Science & Politics , 51 (3), 648–652. https://doi.org/10.1017/S104909651800001X .

Nguyen, K., Husman, J., Borrego, M., Shekhar, P., Prince, M., DeMonbrun, R. M., … Waters, C. (2017). Students’ expectations, types of instruction, and instructor strategies predicting student response to active learning. International Journal of Engineering Education , 33 (1(A)), 2–18 http://www.ijee.ie/latestissues/Vol33-1A/02_ijee3363ns.pdf .

Oakley, B. A., Hanna, D. M., Kuzmyn, Z., & Felder, R. M. (2007). Best practices involving teamwork in the classroom: Results from a survey of 6435 engineering student respondents. IEEE Transactions on Education , 50 (3), 266–272. https://doi.org/10.1109/TE.2007.901982 .

Oliveira, P. C., & Oliveira, C. G. (2014). Integrator element as a promoter of active learning in engineering teaching. European Journal of Engineering Education, 39 (2), 201–211. https://doi.org/10.1080/03043797.2013.854318 .

Owens, D. C., Sadler, T. D., Barlow, A. T., & Smith-Walters, C. (2020). Student motivation from and resistance to active learning rooted in essential science practices. Research in Science Education , 50 (1), 253–277. https://doi.org/10.1007/s11165-017-9688-1 .

Parker Siburt, C. J., Bissell, A. N., & Macphail, R. A. (2011). Developing Metacognitive and Problem-Solving Skills through Problem Manipulation. Journal of Chemical Education, 88 (11), 1489–1495. https://doi.org/10.1021/ed100891s .

Passow, H. J., & Passow, C. H. (2017). What competencies should undergraduate engineering programs emphasize? A systematic review. Journal of Engineering Education , 106 (3), 475–526. https://doi.org/10.1002/jee.20171 .

Patrick, L. E., Howell, L. A., & Wischusen, W. (2016). Perceptions of active learning between faculty and undergraduates: Differing views among departments. Journal of STEM Education: Innovations and Research , 17 (3), 55 https://www.jstem.org/jstem/index.php/JSTEM/article/view/2121/1776 .

Pellegrino, J., DiBello, L., & Brophy, S. (2014). The science and design of assessment in engineering education. In A. Johri, & B. Olds (Eds.), Cambridge handbook of engineering education research , (pp. 571–598). Cambridge University Press. https://doi.org/10.1017/CBO9781139013451.036 .

Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide . Blackwell Publishing. https://doi.org/10.1002/9780470754887 .

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education , 93 , 223–232. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x .

Prince, M., & Felder, R. (2007). The many faces of inductive teaching and learning. Journal of College Science Teaching , 36 (5), 14–20.

Ramsier, R. D., Broadway, F. S., Cheung, H. M., Evans, E. A., & Qammar, H. K. (2003). University physics: A hybrid approach . Nashville: Paper presented at the 2003 ASEE Annual Conference and Exposition https://peer.asee.org/11934 .

Regev, G., Gause, D. C., & Wegmann, A. (2008). Requirements engineering education in the 21st century, an experiential learning approach . 2008 16th IEEE International Requirements Engineering Conference, Catalunya. https://doi.org/10.1109/RE.2008.28 .

Rockland, R., Hirsch, L., Burr-Alexander, L., Carpinelli, J. D., & Kimmel, H. S. (2013). Learning outside the classroom—Flipping an undergraduate circuits analysis course . Atlanta: Paper presented at the 2013 ASEE Annual Conference and Exposition https://peer.asee.org/19868 .

Schmidt, J. A., Rosenberg, J. M., & Beymer, P. N. (2018). A person-in-context approach to student engagement in science: Examining learning activities and choice. Journal of Research in Science Teaching , 55 (1), 19–43. https://doi.org/10.1002/tea.21409 .

Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., & Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: A systematic review of underlying reasons. Journal of College Science Teaching , 49 (6) https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-julyaugust-2020/negative-student .

Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005). Pedagogies of engagement: Classroom-based practices. Journal of Engineering Education , 94 (1), 87–101. https://doi.org/10.1002/j.2168-9830.2005.tb00831.x .

Stains, M., Harshman, J., Barker, M., Chasteen, S., Cole, R., DeChenne-Peters, S., … Young, A. M. (2018). Anatomy of STEM teaching in North American universities. Science , 359 (6383), 1468–1470. https://doi.org/10.1126/science.aap8892 .

Stains, M., & Vickrey, T. (2017). Fidelity of implementation: An overlooked yet critical construct to establish effectiveness of evidence-based instructional practices. CBE Life Sciences Education , 16 (1). https://doi.org/10.1187/cbe.16-03-0113 .

Stump, G. S., Husman, J., & Corby, M. (2014). Engineering students' intelligence beliefs and learning. Journal of Engineering Education , 103 (3), 369–387. https://doi.org/10.1002/jee.20051 .

Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., & Waters, C. (2018). Strategies to mitigate student resistance to active learning. International Journal of STEM Education , 5 (1), 7. https://doi.org/10.1186/s40594-018-0102-y .

Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Arroyo, E. N., Behling, S., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences , 117 (12), 6476–6483. https://doi.org/10.1073/pnas.1916903117 .

Tolman, A., Kremling, J., & Tagg, J. (2016). Why students resist learning: A practical model for understanding and helping students . Stylus Publishing, LLC.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes . Harvard University Press.

Weimer, M. (2002). Learner-centered teaching: Five key changes to practice . Wiley.

Wigfield, A., & Eccles, J. S. (2000). Expectancy–value theory of achievement motivation. Contemporary Educational Psychology , 25 (1), 68–81. https://doi.org/10.1006/ceps.1999.1015 .

Winkler, I., & Rybnikova, I. (2019). Student resistance in the classroom—Functional-instrumentalist, critical-emancipatory and critical-functional conceptualisations. Higher Education Quarterly , 73 (4), 521–538. https://doi.org/10.1111/hequ.12219 .

Acknowledgements

We thank our collaborators, Charles Henderson and Michael Prince, for their early contributions to this project, including screening hundreds of abstracts and full papers. Thank you to Adam Papendieck and Katherine Doerr for their feedback on early versions of this manuscript. Finally, thank you to the anonymous reviewers at the International Journal of STEM Education for your constructive feedback.

This work was supported by the National Science Foundation through grant #1744407. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Author information

Authors and affiliations

Hutchins School of Liberal Studies, Sonoma State University, Rohnert Park, CA, USA

Kevin A. Nguyen

Departments of Curriculum & Instruction and Mechanical Engineering, University of Texas, Austin, TX, USA

Maura Borrego & Sneha Tharayil

Departments of Electrical Engineering & Computer Science and Education, University of Michigan, 4413 EECS Building, 1301 Beal Avenue, Ann Arbor, MI, 48109, USA

Cynthia J. Finelli & Caroline Crockett

Enrollment Management Research Group, Southern Methodist University, Dallas, TX, USA

Matt DeMonbrun

School of Applied Engineering and Technology, New Jersey Institute of Technology, Newark, NJ, USA

Prateek Shekhar

Advanced Manufacturing and Materials, Naval Surface Warfare Center Carderock Division, Potomac, MD, USA

Cynthia Waters

Cabot Science Library, Harvard University, Cambridge, MA, USA

Robyn Rosenberg

Contributions

All authors contributed to the design and execution of this paper. KN, MB, and CW created the original vision for the paper. RR solicited, downloaded, and catalogued all studies for review. All authors contributed in reviewing and screening hundreds of studies. KN then led the initial analysis and creation of strategy codes. CF reviewed and finalized the analysis. All authors drafted, reviewed, and finalized sections of the paper. KN, MB, MD, and CC led the final review of the paper. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Cynthia J. Finelli .

Ethics declarations

Competing interests.

The authors declare that they have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Cite this article.

Nguyen, K.A., Borrego, M., Finelli, C.J. et al. Instructor strategies to aid implementation of active learning: a systematic literature review. IJ STEM Ed 8, 9 (2021). https://doi.org/10.1186/s40594-021-00270-7

Download citation

Received : 19 June 2020

Accepted : 18 January 2021

Published : 15 March 2021

DOI : https://doi.org/10.1186/s40594-021-00270-7


  • Active learning
  • Systematic review
  • Instructor strategies
  • Student response



Active learning tools improve the learning outcomes, scientific attitude, and critical thinking in higher education: Experiences in an online course during the COVID‐19 pandemic

Izadora Volpato Rossi

1 Postgraduate Program in Cellular and Molecular Biology, Federal University of Paraná, Curitiba, Brazil

Jordana Dinorá de Lima

Bruna Sabatke, Maria Alice Ferreira Nunes & Graciela Evans Ramirez

2 Technological Professional Education Sector, Federal University of Paraná, Curitiba, Brazil

Marcel Ivan Ramirez

3 EVAHPI ‐ Extracellular Vesicles and Host‐Parasite Interactions Research Group, Laboratório de Biologia Molecular e Sistemática de Tripanossomatideos, Carlos Chagas Institute‐Fiocruz, Curitiba, Brazil


Active teaching methodologies have been proposed as a hope for changing education at different levels, moving from passive, lecture‐centered teaching to student‐centered learning. With the health measures of social distancing, the COVID‐19 pandemic forced a strong shift to remote education. Facing the challenge of delivering quality education through a computer screen, we validated and applied an online course model using active teaching tools for higher education. We incorporated published active‐learning strategies into an online construct, with problem‐based inquiry and the design of inquiry research projects serving as our core active learning tools. The gains related to students' science learning experiences and their attitudes toward science were assessed by applying questionnaires before, during, and after the course. The course had 83 participating students, most of them (60.8%) postgraduate students. Our results show that the engagement provided by active learning methods can improve performance in both hard and soft skills. Students' participation seems to be most relevant when activities require the integration of information, prediction, and reasoning, such as open‐ended questions and the design of research projects. Therefore, our data show that, during the pandemic, active learning tools benefited students and improved their critical thinking, their motivation, and their positive positioning in science.

1. INTRODUCTION

Academically leading countries have debated how students should be trained, from basic primary education at schools to higher education at universities. 1 , 2 , 3 , 4 A major concern is how education can contribute to the formation of citizens and professionals capable of leading technological, economic, social, cultural, and political changes. 5 , 6 , 7 Specifically, in the area of science, researchers should be trained with skills that go beyond the technical reproduction of experiments: they should employ critical thinking and be capable of applying scientific concepts to propose solutions and generate knowledge. 8 , 9 , 10 Changes to curricular programs in the STEM area (science, technology, engineering, and mathematics) and new proposals for educational strategies have been stimulated in different countries. 11 , 12 Lecture‐based, teacher‐centered pedagogy is undergoing a shift toward more active learning, in which students build their own understanding of a subject through learning activities. 13 , 14 The benefits of active learning seem substantial, both in cognitive learning and in students' development of soft skills such as leadership, problem‐solving, and autonomy. 15 , 16 , 17 In Brazil, few efforts have been made to discuss structural changes in education from basic schooling to university. The absence of adequate working conditions encourages teachers to adopt an old‐fashioned type of education in which passive teaching methods predominate. Although there is no state initiative that encourages the incorporation of active learning methods, some higher education institutions have introduced methods of problem‐solving, critical thinking, and/or problem‐based learning with inspiring success. 18 , 19 , 20 , 21

Active learning comprises approaches that focus more on developing students' skills than on transmitting information, and it requires students to perform activities that demand higher‐order thinking. 13 For this, students use critical thinking, which involves analysis, reflection, evaluation, interpretation, and inference to synthesize information obtained through reading, observation, communication, or experience in order to answer a question. 22 Several methodologies fit the concept of active teaching, such as inquiry‐based learning, project‐based learning, and problem‐based learning. 17 , 23 , 24 For example, project‐based learning is a model that organizes learning around projects built on challenging questions or problems, which involve proposing solutions, formulating hypotheses, and carrying out investigative activities. 17

The COVID‐19 pandemic produced a health emergency and economic and social instability that challenged the entire educational system. The intense contact and exchange of information that took place during face‐to‐face classes in normal life were restricted to virtual spaces. Given all these sudden changes, online courses have been a viable option to prepare students at different levels (Figure 1). Although some groups have already reported their teaching experiences and perceptions in times of lockdown and social distancing, 25 , 26 , 27 , 28 , 29 , 30 , 31 very few of them reported the impact of active learning in online courses, and studies with postgraduate students are rarer still. During the pandemic, we saw the opportunity to validate a course model with the aim of actively encouraging students in higher education to acquire important biological concepts. We planned to create a rich, multifaceted course that integrated active learning methodologies. We incorporated active‐learning strategies that allowed the course to move from passive, lecture‐centered to active, student‐centered learning. With this approach, we were interested in understanding the benefit of our course to students' formation and in answering two important questions:

  • Does the course increase the cognitive and intellectual skills of the students?
  • What was the impact of critical thinking methodologies on students' attitudes toward science and soft skills?

Our interest was concentrated in analyzing whether students showed more enthusiasm for the concepts of research and science over the course. Crucial elements of science, such as forming and testing hypotheses, defining strategies, and communicating results, were evaluated to determine whether critical thinking methods could improve reasoning and rational logic. In order to assess students' gains in these two aspects, we applied questionnaires to students before, during, and after the course. Here, we comment on the results of this experience, which incorporated active methodologies and student–teacher interaction tools for remote higher education.

Figure 1. Passive (teacher‐centered) and active (student‐centered) learning in classroom or remote teaching models

2. MATERIAL AND METHODS

2.1. Undergraduate pilot course to validate online active learning tools

In order to validate an online course model and test some active learning tools, we offered a course aimed primarily at undergraduates. The subject of this course was cell culture, which has wide interest and application in the biological area. Knowledge of cell culture is required for many research activities and also represents a promising alternative for the replacement of animal experimentation.

In order to follow contagious preventive actions during the COVID‐19 pandemic, the course was administered remotely in a teleconference format through the Microsoft Teams platform. This platform allows the instructors to interact through video, audio, and live chat, which gives the feeling of a personal meeting from a safe distance. Before each class, there was a moment of relaxation with “icebreaker” conversations to get to know the audience. This moment helped to create a more intimate environment and also to share tensions and concerns about the pandemic.

The course had a total of 15 h: 10 h of synchronous activities and 5 h of asynchronous activities. Synchronous activities included lectures, simultaneous online quiz activities, and discussion of scientific papers. Asynchronous activities consisted of two questionnaires containing guided questions for the critical reading of a scientific paper (one of the papers involving chronic diseases, the other infectious diseases). After the questionnaires were returned, the papers were discussed during classes. To measure perceptions of the overall effectiveness of the course and the proposed methodologies, we asked students to complete a questionnaire at the end of the course.

2.2. Experimental undergraduate and postgraduate course

2.2.1. Course design

The experimentation course was offered as a satellite event during a symposium hosted by a Postgraduate Program at a Brazilian state university. The focus of the course was redefined from our previous basic course to contemplate strategies for the study of infectious diseases using cell culture. In order to know the profile of the enrolled students, we applied two questionnaires containing open and closed questions: one with demographic questions and previous research experience and the other about their previous experiences with active learning methodologies.

The course had a short duration (12 h total), divided between synchronous (7 h) and asynchronous (5 h) activities. The synchronous activities of the course were structured as follows: (i) 2 h of key concepts to introduce the subject and situate the content and emphasis of the course; (ii) 2 h of strategies for studying the pathogen–host cell interaction using cell culture; (iii) 1 h of presentation of an inquiry research project (IRP) on a subject chosen by the participant; and (iv) 1 h of questions about concepts and strategies to solve problems (Table 1). The "offline" time was used to prepare the scientific IRP and to complete questionnaires with questions related to the classes. A description of the activities developed can be found under "Active learning instruments/tools" below.

Course schedule

Abbreviations: A.A.T., asynchronous activities time; S.A.T., Synchronous Activities Time.

2.2.2. Active learning instruments/tools

In order to place the student at the center of the course, we incorporated some active‐learning strategies into an online course construct. Some moments of the class dynamics and the approaches used during the course are gathered in Video S1. We proposed several activities that required students' engagement:

The quiz was a knowledge‐consolidation tool applied at the end of lectures. In this activity, participants answered questions related to the presented content directly through the VoxVote website ( https://www.voxvote.com/ ). Table 2 contains some examples of the applied questions; the questions were corrected at the end of the time allotted by the VoxVote tool (Video S1, min 02:36–02:51).

Examples of questions administered during live quiz using VoxVote

We asked participants to develop an IRP to stimulate the construction of knowledge and critical thinking. The IRP had to contain the scientific relevance of the project, the main objectives, and the methodologies to achieve those objectives. Along with the description of the project, participants could send a graphic design summarizing their proposal, following a graphical abstract model indicated as a reference (Figure S1). The IRP was submitted using Google Forms. The proposals were evaluated by all instructors, who selected the best 10 for presentation based on criteria of coherence and conceptualization of the biological question, ampleness of the applied methodologies, and connection between the proposed strategies.

Inquiry questionnaires

Two online questionnaires were sent to all participants via email and were available for at least 48 h. Both questionnaires contained eight multiple‐choice and four open‐ended questions about biological concepts related to the course subject. The first questionnaire (Q1) was available before the beginning of the course, while the second (Q2) was available 2 days after the experimental course started. Q1 and Q2 had the same level of difficulty, with multiple‐choice (basic) and problem‐based (open‐ended) questions (see Table 3). Q2 was answered while the students were simultaneously participating in several activities of the hosted event.

Examples of questions applied in the inquiry questionnaires

2.2.3. Inquiry questionnaire assessment

Questionnaire responses were corrected by five evaluators. Multiple‐choice question scores were calculated as the sum of correct answers. Open‐ended questions required a more detailed evaluation process in which four criteria were scored for each answer: comprehension, specificity, ampleness, and connection. Evaluators considered whether the student had understood the question (comprehension), the approaches the student proposed to solve the problem (ampleness), the specificity of those approaches (specificity), and the rationale and feasibility of the strategy (connection). Comprehension was scored only as 0 (lack of comprehension) or 1 (adequately answered). The other criteria had three score levels: insufficient (0), good (1), and excellent (2). The maximum score was seven points per answer. Answers scored 0 for comprehension were not evaluated on the remaining criteria. Furthermore, the order of questions and answers was randomized to avoid possible bias during the assessment process. Scores were generated from the average of the five evaluators. The total score was calculated as the sum of the multiple‐choice (0%–50%) and open‐ended (0%–50%) components.
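As a concrete illustration, the rubric above can be sketched in code. This is a hypothetical reconstruction for clarity only: the function names and the gating behavior for comprehension are our reading of the description, not the authors' actual scoring script.

```python
def score_open_answer(comprehension, specificity, ampleness, connection):
    """Score one open-ended answer for one evaluator (max 7 points).

    comprehension is 0 or 1; the other criteria are 0 (insufficient),
    1 (good), or 2 (excellent). Per the rubric, an answer scored 0 for
    comprehension is not evaluated on the remaining criteria.
    """
    if comprehension == 0:
        return 0
    return comprehension + specificity + ampleness + connection


def average_score(evaluator_scores):
    """Average one answer's scores across the five evaluators."""
    return sum(evaluator_scores) / len(evaluator_scores)
```

For example, an answer rated excellent on all three graded criteria by one evaluator would receive the maximum 7 points, while an answer lacking comprehension scores 0 regardless of its other ratings.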

Intra‐questionnaire comparisons, that is, between questions, were assessed by ANOVA, while differences between questionnaires were analyzed by an unpaired t test. All analyses were performed in GraphPad Prism version 6.01.
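The same two analyses can be sketched with standard statistics tooling. The paper used GraphPad Prism, so the SciPy calls and the synthetic scores below are only an illustrative assumption of the workflow, not the authors' data:

```python
from scipy import stats

# Hypothetical per-question open-ended scores within one questionnaire:
# a one-way ANOVA compares the questions against each other.
q1_oe1 = [3.2, 4.0, 2.8, 3.6]
q1_oe2 = [4.1, 3.9, 4.4, 3.5]
q1_oe3 = [2.9, 3.1, 3.8, 3.3]
f_stat, p_anova = stats.f_oneway(q1_oe1, q1_oe2, q1_oe3)

# Hypothetical total scores (%) for Q1 vs Q2: different students answered
# each questionnaire, so an unpaired (independent-samples) t test applies.
q1_totals = [55.0, 60.2, 48.7, 62.1, 58.3]
q2_totals = [68.4, 72.0, 65.5, 70.2, 74.1]
t_stat, p_ttest = stats.ttest_ind(q1_totals, q2_totals)
```

The unpaired test is the appropriate choice here because the Q1 and Q2 respondent groups only partially overlap; a paired test would require matched before/after scores from the same students.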

2.2.4. Analysis of students' perception of the course

At the end of the course, students were asked to record their impressions of and suggestions about the course in a feedback form containing multiple‐choice and open‐ended questions. In some questions, students chose the sentence with which they most identified; in others, they rated statements on a five‐point scale from "not at all" to "very" (Likert scale 32 ). Open‐ended questions were added to stimulate the students to express their opinions about the course. The open‐ended data were coded into categories based on the most cited answers for each question. Qualitative thematic content analysis was applied to quantify the answers, providing support for a quantitative evaluation.

3.1. The online course can be a platform for active learning methodologies: A pilot experience

In order to validate an online course model and test some active learning tools, we offered a course aimed primarily at undergraduates during the pandemic. The broad theme of cell culture was well received by students, attracting participants from different fields of the health sciences (including biology, biomedicine, pharmacy, biotechnology, and medicine; data not shown) and with different backgrounds (56.75% undergraduate students, 18.91% master's students, 10.13% doctoral students, 2.02% other graduate students, 3.37% bachelor's degree holders, 4.72% master's degree holders, and 4.05% doctorate holders; n = 148; Figure S2A).

A great advantage of remote education is its ability to bring together participants from different educational institutions and backgrounds. Participants came from 22 different Brazilian institutions and 1 foreign institution, spanning public and private education (Figure S2B).

Although only 29% of the students had worked with cell culture, the positive perception of the course was very high (Figure 2a). Moreover, 87% of the participants evaluated the course as excellent (Figure 2b). The active learning tools used during the course (real‐time online quiz, paper reading guide, etc.) were positively rated by participants (data not shown), and the participants pointed to the didactics, the teaching methodology, and the interaction between teacher and student as the main strengths (Figure 2c).

Figure 2. The methodology tools used during the validation course were positively evaluated by the participants. (a) Course evaluation by participants. (b) Contribution of the course to learning cell culture. (c) The open‐ended question on "course strengths" was content analyzed, and the responses were classified into categories that included similar statements.

Encouraged by the positive experience of the first course, we decided to go deeper with a course aimed at postgraduate students, to understand whether active learning tools could improve their cognitive and thinking skills. With the active learning approach validated and approved, we decided to maintain some activities (such as the real‐time online quiz and the questionnaires) and to adjust others, targeting the topic to the participants.

3.2. Experimental course: Active learning tools improve the performance of students in higher education

3.2.1. Participating students' profile: A representative sample of Brazilian higher education

The second experience with the online course model had a heterogeneous audience, including participants at different levels and from different locations. There were 83 enrolled students, most of them master's students (38.0%) (Figure 3a). Undergraduate students constituted 24.1% and PhD students 19.0%. PhD holders also participated, making for a very heterogeneous public. The participants belonged to 22 Brazilian institutions from different states (Figure 3b). Although 94% had previous research experience, only 59.4% had experience with cell culture (Figure S3A), either carrying out in vitro experiments themselves (full experience) or only accompanying other people (partial experience) (Figure 3c). The focus of the course was infectious diseases, which was the object of work of 59.5% of the participants, across biological models of bacteria (20.3%), fungi (15.2%), parasites (16.5%), and viruses (7.6%) (Figure S3B).

Figure 3. The audience attending the course was heterogeneous. (a) Participants' educational background, divided between complete and incomplete; "others" includes "specialist" as complete and "incomplete second degree" and "residency" as incomplete. (b) Distribution of participants' institutions across Brazilian states (highlighted in gray). (c) Students' previous experience with cell culture techniques.

To obtain an overview of students' previous experience with teaching methodologies, we asked students (n = 60) which method was most used during their academic experience. The majority of respondents (76.7%) affirmed that their predominant teaching approach had been passive, mainly represented by traditional lectures (Figure S4A,B). Among graduate students, 50% answered that their classes had similar proportions of active and passive teaching (Figure S4C). When asked in an open question what could be improved in their education (undergraduate or graduate), 72.1% of students said that other teaching approaches could be employed (data not shown). Most comments pointed to the need for interactive classes, including solving clinical cases and practical application of knowledge. The answers also showed that most of the students (76.7%) consider active teaching methodologies excellent for their learning and believe that participating interactively in their subjects improves their learning (Figure S4D).

3.2.2. Active methodologies promote improved short‐term learning outcomes

Interested in observing the development of students during the course, we used research‐based learning approaches through the application of questionnaires in a pretest (Q1) and posttest (Q2). Fifty‐four students participated in the online questionnaire activities (Figure S 5 —graduation: n  = 14; masters: n  = 25; doctoral: n  = 14; postdoctoral: n  = 3; other: n  = 1). Most of them participated in the first questionnaire (Q1: n  = 49), while a minority participated in the second one (Q2: n  = 26); finally, 20 students participated in both questionnaires.

The average scores of students were higher in Q2 than in Q1 (Figure 4a, Table S1). This progress was distributed similarly across multiple‐choice (13.51%) and open‐ended questions (15.67%). No student scored zero in Q2, which may indicate that students were more committed to the second test (Figure 4a). The proportion of students with high performance (total score > 80%) was at least three times higher in Q2 than in Q1 (6.1% in Q1 and 19.2% in Q2; Figure 4b). The students showed improvement in all four criteria evaluated in the open‐ended questions from Q1 to Q2 (Figure S6). When multiple‐choice and open‐ended questions were analyzed separately, Q2's superior performance was due more to the scores on the multiple‐choice questions (Figure 4c) than on the open‐ended questions (Figure 4d).

Figure 4. Students showed a rapid evolution in their performance during the course. (a) General average score in each questionnaire (0%–100%); multiple‐choice and open‐ended questions each represent 50% of the score. (b) Proportion of students within score ranges in Q1 (n = 49) and Q2 (n = 26). (c) Students' average scores on the multiple‐choice questions in each questionnaire. (d) Students' average scores on the open‐ended questions in each questionnaire.

We hypothesized that the smaller improvement on the open‐ended questions might be due to lower engagement with the more difficult, exploratory questions. To examine this point, we calculated the dropout rate for each question as the rate of NA answers, that is, blank answers and "I don't know" answers. In fact, the dropout rate for open‐ended questions (Q1: 22% and Q2: 15%; Table 4) was higher than for multiple‐choice questions, where it was negligible (Q1: 2% and Q2: 0%). Furthermore, there was a 31.8% reduction in the dropout rate for open‐ended questions from Q1 to Q2 (Table 4). This may indicate that the students felt more confident and motivated to commit intellectual effort when completing Q2, resulting in a better outcome.
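The dropout-rate computation is simple enough to state directly in code. This is a minimal sketch under our reading of the definition above; the function name and example answer strings are assumptions, not taken from the study's data:

```python
def dropout_rate(answers):
    """Fraction of NA answers: blanks or 'I don't know'-type responses."""
    na = sum(
        1
        for a in answers
        if a.strip() == "" or a.strip().lower() == "i don't know"
    )
    return na / len(answers)


# Hypothetical responses to one open-ended question: two of the four
# count as NA, giving a dropout rate of 0.5.
example = ["", "I don't know", "use qPCR on infected cells", "culture the cells"]
rate = dropout_rate(example)
```

The reduction reported in the text then follows as a relative change between the two questionnaires, e.g. (0.22 − 0.15) / 0.22 ≈ 31.8%.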

Dropout rate among open‐ended questions in Q1 and Q2

Note : Dropout was considered for blank answers and “I don't know” type of answers. “OE” stands for open‐ended questions from 1 to 4 in each questionnaire.

3.2.3. Formulating hypotheses and proposing strategies: A scientist‐like experience through project‐based learning

The inclusion of project‐based learning strategies is effective in STEM courses for involving students in authentic "real world" tasks. 17 , 33 During the course, students were encouraged to prepare a mini scientific project to answer a biological question of their interest, applying cell culture strategies (see Materials and Methods). The elaboration of the scientific IRP was the most demanding activity for the students, and participation was only 26.5% (22 IRPs), most of them from master's students (Figure S7). This type of activity was a challenge for the students, who felt free to "think outside the box" and find ways to answer their biological questions. Many students elaborated different and curious hypotheses, from which the instructors selected the 10 best IRPs based on criteria of coherence, conceptualization, applied methodologies, and connection between the proposed strategies. Some students stood out for the quality of their IRP proposals. Interestingly, among the 10 best IRPs selected, a quarter were written by undergraduate students (data not shown). In addition, we reserved a period of the course for the presentation of the selected IRPs to the whole class in a "symposium‐like moment," using their graphical abstracts as support. This type of activity builds other soft skills in students, such as communication and accepting challenges, which are essential for future scientists. Part of the presentations of the selected students and their graphical abstracts/posters, along with other course activities, are compiled in Video S1 (min 03:01–03:51).

3.2.4. Engagement in active‐learning activities correlates to better student performance

Active methodologies place the student at the center of learning, and for this reason their effectiveness relies heavily on students' engagement in activities. Motivated by the various studies that show a positive correlation between student engagement and performance, 12 , 34 , 35 , 36 , 37 , 38 we assessed whether the most engaged students during our course had higher scores.

First, we evaluated the scores of the group of students who participated in both inquiry questionnaires ("BOTH") separately from those who answered only one of the questionnaires ("ONLY Q1" or "ONLY Q2"). This analysis showed that students who engaged in both activities performed better on the open‐ended questions, but not on the multiple‐choice questions (Figure 5a,b).

Figure 5. Highly engaged students have better performances on open‐ended questions. (a) Students' average scores on the multiple‐choice questions within each engagement subgroup. (b) Students' scores on the open‐ended questions within each engagement subgroup. (c) Venn diagram representing the number of participants in each activity (Q1, Q2, IRP, and TOP IRP). (d) Total score (%) in Q1 and Q2, analyzed in groups classified by level of engagement in the course activities; the questionnaire to which the average scores refer is indicated by the horizontal bars (Q1 or Q2). (e) Students' average scores on the multiple‐choice questions within each engagement group. (f) Students' average scores on the open‐ended questions within each engagement group.

We then asked whether engagement in the activities proposed during the course (questionnaires and IRP) was related to better student performance. For this, we considered the following groups: students who were selected as TOP IRP and also participated in both Q1 and Q2 (TOP IRP + Q1 + Q2, n = 5); students who participated in Q1 and Q2 and sent an IRP but were not TOP IRP (IRP + Q1 + Q2, n = 6); students who only participated in the questionnaires (BOTH Q1 and Q2, n = 9); and those who participated in only one of the questionnaires (Only Q1, n = 23, or Only Q2, n = 3) (Figure 5c). Interestingly, half of the students who were selected as TOP IRP also engaged in both Q1 and Q2 (n = 5). The students who participated in all activities had higher scores than the other engagement groups, mainly in the open‐ended questions analyzed separately (Figure 5d). Among the students who submitted an IRP, the best scores came from those in the top‐10 IRPs (TOP IRP) (Figure 5c,d). The participants who answered only one of the questionnaires (Only Q1 or Only Q2) had the worst scores on the open questions, which shows that involvement in more than one activity improves student performance (Figure 5d). Altogether, the data show a positive trend in the relationship between engagement and performance (Figure S8).

3.2.5. Active learning tools improve students' critical thinking and motivation in science

The course was evaluated positively by 74% of the participants (n = 50), who considered the course excellent (Figure 6a). The open‐ended questions on "Course strengths" and "Course weaknesses" were content analyzed, and the responses were classified into categories that included similar statements (Table 5). Among the strengths, 70% of the students considered the didactics a strong point, which includes the quality of the presentations, the instructors' confident command of the content, the lesson plan, and the dynamics of the class. Fifty‐six percent of the participants assessed the student–teacher interaction as a positive aspect of the classes, with students revealing that they felt included (even remotely). Another strength highlighted was the teaching methodology and the subjects covered, which struck a balance between variety and depth. As negative points of the course, infrastructure and technical problems (such as the timetable, platform, class time, and sound) and the course's complexity relative to its short duration were mentioned.

Figure 6. Students demonstrate a positive feeling about the active learning tools. (a) Percentage of responses to the multiple‐choice question "How do you think the course contributed to your learning?," with possible answers "excellent," "moderate," and "insufficient." (b) Percentage of responses to the multiple‐choice question "In the questionnaires, what type of question do you prefer?," with possible answers "multiple‐choice," "open‐ended," and "I have no preference." (c) Percentage of students' responses to the question "How do you evaluate the problem‐based questions present in the questionnaires?," with possible answers "They were excellent," "They were very difficult," and "They were very simple." (d) Percentage of responses to the question "How did you feel during the conduct of the inquiry research project?," with possible responses "motivated," "comfortable," and "apprehensive." Percentages were calculated based on the number of students who answered the questionnaire (n = 50).

TABLE 5 The main positive points cited by the students were didactics, teaching methodology, and instructor–student interaction

Note: content categories in the table represent any category cited by more than 6% (n ≥ 3) of students who responded to feedback. The open‐ended questions on "course strengths" and "course weaknesses" were content analyzed, and the responses were classified into categories of similar statements.

Students' feelings about the course's active learning tools were assessed through the feedback form. Students were asked to rate, from 1 to 5, how positive the online inquiry questionnaires were for their learning, with 1 meaning "negative" and 5 "very positive." The average score was 4.34, indicating that the students endorsed the questionnaires. Regarding the type of question contained in the questionnaires, 48% of students preferred multiple‐choice questions (Figure 6b). This suggests that nearly half of the students prefer questions that only require recalling information rather than elaborating their own reasoning. Despite this high preference for multiple‐choice questions (48%), 62% considered discursive problem‐solving questions a great way to make them think critically and formulate strategies for the real situations a researcher faces (Figure 6c).

One of the proposed activities was the writing of an IRP about a biological question of the students' interest. The participation rate in the IRPs was relatively low (26.2%), and 68.2% of the participants were postgraduate students. Interestingly, 54% of the participants felt motivated while elaborating the scientific project (Figure 6d). In an open‐ended question, participants affirmed that elaborating an IRP improved their positioning in science, making them more critical and more motivated. They also mentioned that the IRP stimulated the acquisition of knowledge, expanded their scientific vision, simulated a real situation faced by researchers, and fostered collaboration in scientific communication (data not shown).

In general, most students (96%) demonstrated a positive perception of active learning methodologies (data not shown). The main points students raised about active methodologies were that they are more effective for lasting learning, stimulate critical thinking, and improve the dynamics of the class and the student–teacher interaction.

When asked in an open‐ended question about the skills they improved with the course, the answers clustered around three points: incorporation of knowledge, motivation toward science, and gains in skills related to scientific processes. Forty‐four percent of the participants cited an improvement in their logical, critical, and rational thinking. A gain in knowledge of the subject was cited by 22% of them, and an expanded scientific vision by 20% (Figure 7a). It is also interesting to note that 94% of the participants indicated that the course gave them a real insight into the problems scientists face in their research. When asked, on a scale of 1 to 5 (1: not at all; 5: very), how motivated they were to solve scientific problems using critical thinking after the course, the average response was 4.14, with 86% giving a rating of 4 or 5 (Figure 7b).


FIGURE 7 Active methodologies increased the incorporation of knowledge and motivation toward science, and students reported gains in soft skills. (a) Answers to the open‐ended question "What are the main gains you obtained with the course?" were categorized among common themes (showing categories comprising 6% [n = 3] or more of the answers). (b) Student responses to the question "How motivated are you to solve scientific problems using critical thinking after the course?" on a scale of 1 to 5 (1: not at all; 5: very). Percentages were calculated over the number of students who answered the questionnaire (n = 50)

4. DISCUSSION

The constant concern with excellence in the scientific training of academics encountered a new challenge during the COVID‐19 pandemic: how can students be engaged in effective learning in remote education? This question was the driving force of our study, which reports a semi‐experimental online course for higher education. Our course incorporated active methodology tools that integrated students into the construction of knowledge and stimulated their critical thinking skills. To this end, we proposed problem‐based learning strategies in questionnaires, the elaboration of a scientific project, and online quizzes to complement the lectures. In recent months, driven by the pandemic, there has been a huge increase in the number of studies dedicated to developing and validating active learning strategies for remote or hybrid education. 28, 39, 40, 41, 42

Our study mainly evaluated two types of achievement in students: (i) cognitive and intellectual skills (learning outcomes) and (ii) critical thinking, attitudes toward science, and soft skills. To this end, different activities and questionnaires were applied before, during, and after the course. Our data show that student engagement in the different active learning tools proposed is directly linked to performance in the course. On the discursive questions, the average score of the groups that participated in all the proposed activities and stood out in the writing of the IRP was considerably higher than that of the groups with less involvement in the course. Indeed, other studies have already shown that active learning approaches in the classroom improve academic performance. In a long‐term (3‐year) study, the implementation of problem‐based learning (PBL) and learning by teaching (LbT) raised engineering students' average final‐exam scores from 5 to 6–7. 43 Interactive engagement also improves scores in physics courses compared with traditional pedagogical strategies. 44

Our data show that student involvement is a key factor in learning, a point widely accepted and observed at different levels. 45, 46 Emotional, behavioral, and cognitive dimensions can be considered when analyzing engagement. 47 First, emotional engagement happens when students are emotionally affected and motivated by the learning environment. 48 In our courses, introductory icebreakers and friendly communication were factors that helped students feel comfortable interacting with instructors and with each other. Second, behavioral engagement corresponds to the attitudes students demonstrate in class, such as listening and paying attention or showing persistence and concentration in activities. 49 In this scenario, at least three forms of interaction were provided (chat, audio only, and video), and chat activity showed that students were constantly connected to the instructors during the presentations. Finally, cognitive engagement happens when students apply their ability to select, connect, and plan in constructing and self‐regulating the learning process. 47, 50 In our view, these movements were detected in the construction of the IRP and in the responses to the open‐ended questions in both questionnaires, in which students proposed strategies for real problems inside and outside their fields of study. All three dimensions of engagement are linked and may jointly contribute to improving students' academic performance, so none should be considered in isolation.

Beyond the intellectual benefits traditionally used as indicators of teaching quality, we hypothesized that student‐centered teaching methodologies would lead to positive attitudes toward science and improved thinking skills. In a self‐assessment, students reported an improvement in their critical thinking, which involves judging information with criteria and healthy skepticism. This relationship between active learning and improved critical thinking has been reported by other groups around the world. 22, 51, 52 Active learning strategies (such as collaborative work in small groups and case studies) improved students' critical thinking skills as measured by the Watson‐Glaser Critical Thinking Appraisal, which assesses decision‐making ability and predicts judgment, problem‐solving, and creativity. 53 Umbach and Wawrzynski 54 analyzed two sets of American national data and showed a positive relationship between university environments where teachers used active and collaborative learning techniques and students' gains in personal‐social development. Improving students' ability to recognize problems and apply effective strategies and solutions to fundamental challenges in the field is the basis of good scientific training. Our results show that active methodology tools can shape students' attitudes, and this will be reflected in future scientists able to position themselves in the face of problems.

The improvement of the indicators, together with the students' approval of the course, confirmed that the approaches were well chosen and encouraged us to describe our experience in order to facilitate the implementation of active methodologies in other courses. We opted for active learning tools that could be easily applied in the virtual environment, improving the dynamics of the classes. Online questionnaires seem to be a great option for validating students' learning, making students reflect on the class and apply their knowledge in their answers. Because our courses aim at scientific formation associated with the resolution of real problems, the questionnaires included both concept questions and interpretive/exploratory open‐ended questions. This allowed us to highlight a clear problem in Brazilian education: many students performed well on concept questions but poorly on problem‐based questions, suggesting they are trained as "information recorders/archivers" rather than as "critical thinkers." Combining open and closed questions is ideal for giving students greater freedom in their responses and stimulating reasoning, but open questions also need clear correction criteria. To guarantee the impartiality of the corrections, all five instructors graded every question, and the final score was the average across evaluators.
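The averaging step described above is simple to make concrete. The sketch below is illustrative only: the 0–10 scale and the scores are invented, not taken from the course.

```python
# Sketch of the grading scheme described above: each open-ended answer is
# graded independently by all five instructors, and the student's final
# score for that question is the mean across evaluators.
# The 0-10 scale and the example scores are hypothetical.

def average_score(ratings):
    """Return the mean of one question's scores across all evaluators."""
    if not ratings:
        raise ValueError("at least one rating is required")
    return sum(ratings) / len(ratings)

# Five instructors' scores for one student's answer (invented data).
ratings = [8.0, 7.5, 9.0, 8.5, 8.0]
print(average_score(ratings))  # mean across the five evaluators
```

Averaging across all evaluators, rather than assigning each answer to a single grader, dilutes any one rater's bias at the cost of five-fold grading effort.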

During the course design, we were interested in getting immediate feedback on student learning regarding the main concepts discussed. For this, ~10 minutes at the end of each day's class were reserved for an online quiz. This activity was very useful for reaffirming the "take‐home messages," that is, what students should not leave class without learning, and for probing their perception of the acquired knowledge. There are several online tools for this type of quiz; the most interesting are those that display results in real time, with the percentage of "votes" for each answer. This allows questions to be corrected promptly, and students can use that time to clear up any doubts.

During the undergraduate course, we opted for a questionnaire that served as a "critical reading guide" for scientific papers. Participation in the questionnaires was very positive, but we replaced this activity with the elaboration of a mini scientific project in the graduate course, since reading scientific papers is a basic activity for graduate students. The preparation of the IRP was the most demanding activity for the students, because they had to use the knowledge from the course to answer a scientific question of their choice. This type of activity gives students the freedom to "think outside the box" and search for ways to answer biological questions that interest them. Through this activity, we observed some students who stood out for their commitment to developing a project as a principal investigator. In addition, we reserved time within the course for the students whose IRPs were selected to present them to everybody. This type of activity builds additional soft skills, such as communication and accepting challenges, which are essential for future scientists.

Although we achieved good results with this online course model for higher education, our study has some limitations. The course was delivered over a short period (3 consecutive days), which hampered a robust evaluation of the impact of the active tools on student progress. In addition, the experimental course was transmitted simultaneously with other activities of the host congress, which may have affected students' outcomes due to competing demands. Moreover, because it was an optional course (a satellite event), there was no way to require student participation or to condition approval on performance. This may explain, among other possible reasons, the low responsiveness in certain activities, suggesting that some students engage only in activities required for approval. Previous experience with the topic was not treated as a differential advantage: students from different fields in the health and biological sciences were analyzed together, as were, for example, undergraduate students and postdoctoral fellows. Finally, the heterogeneity of the class can be seen both positively and negatively. It was interesting because it brought very different backgrounds into the same class; however, it also made it difficult to gauge the level of knowledge among students, since the same content could be very basic or essential for some and very advanced or specific for others.

Interestingly, although our study was carried out during a pandemic with a limited number of students, our data reflect the profile of Brazilian education. The students admit that most of their academic training used passive approaches, but they are interested in and open to more interactive activities. This exposes a gap in the unequal Brazilian educational model: changes in the educational environment are urgently needed to prepare citizens who are socially and personally able to participate in society in a democratic way. 55, 56, 57 The current model of higher education in Europe, after the establishment of the parameters determined by the European Higher Education Area, prioritizes the development of an autonomous learning capacity among students' abilities. 58, 59 However, the models found in traditional schools, including in Brazil, prepare all students in the same way, downplaying the idea that knowledge acquisition engages cognitive, personal, and also social skills. 60

The introduction of active learning methodologies has been widely encouraged worldwide, but it requires a great effort from both teachers and students: educators need to revise their lesson plans and add new tools, and students need to be willing to engage in the construction of knowledge. 14 Unquestionably, the process requires dynamic instructors with flexible minds, willing to use class time to help students acquire knowledge through active methodologies. The course was carried out after 3 months of full lockdown, and we have no elements to evaluate whether the impact of our proposal would differ with a longer course. Certainly, amid all the uncertainties of this moment in the world, our experience reaffirms that remote learning, when it incorporates elements of critical thinking and active methodologies, can bring real benefits in self‐confidence and empowerment, motivating students on their long and arduous road to becoming scientists.

Above all, our experience showed that making the student the center of the class brings not only cognitive benefits (such as intellectual growth) but also benefits in the psychosocial and personal spheres, giving students independence and improving their effective communication and their ability to accept challenges for self‐development. Our data show that active learning tools that require constant engagement benefit students and improve their critical thinking. This study also suggests that if courses on various scientific topics were reformulated to include active methodologies, more students would likely build a stronger intellectual foundation and a positive stance toward participation in science, forming more capable critical thinkers.

CONFLICT OF INTEREST

No competing interest has been declared. All authors have seen and approved the manuscript. The manuscript has not been accepted or published elsewhere.

Supporting information

Appendix S1. Supporting Information.

ACKNOWLEDGMENTS

We are grateful for the invitation from the Graduate Program in Biosciences and Pathophysiology (State University of Maringá) through Prof. Gessilda de Alcantara Nogueira de Melo to participate in the VII International Meeting of Biosciences and physiopathology. This study received support from FIOCRUZ, UFPR, CNPq, CAPES, and Programa Básico de Parasitologia AUXPE 2041/2011 (CAPES) Brazil. Marcel Ivan Ramirez is currently a fellow from CNPq‐Brazil.

Rossi IV, de Lima JD, Sabatke B, Nunes MAF, Ramirez GE, Ramirez MI. Active learning tools improve the learning outcomes, scientific attitude, and critical thinking in higher education: Experiences in an online course during the COVID‐19 pandemic. Biochem Mol Biol Educ. 2021;49:888–903. 10.1002/bmb.21574

Contributor Information

Izadora Volpato Rossi, Email: moc.liamg@otaplovarodazi .

Marcel Ivan Ramirez, Email: [email protected] .


Lessons in learning.


Peter Reuell

Harvard Staff Writer

Study shows students in ‘active learning’ classrooms learn more than they think

For decades, there has been evidence that classroom techniques designed to get students to participate in the learning process produce better educational outcomes at virtually all levels.

And a new Harvard study suggests it may be important to let students know it.

The study, published Sept. 4 in the Proceedings of the National Academy of Sciences, shows that, though students felt as if they learned more through traditional lectures, they actually learned more when taking part in classrooms that employed so-called active-learning strategies.

Lead author Louis Deslauriers, the director of science teaching and learning and senior physics preceptor, knew that students would learn more from active learning. He published a key study in Science in 2011 that showed just that. But many students and faculty remained hesitant to switch to it.

“Often, students seemed genuinely to prefer smooth-as-silk traditional lectures,” Deslauriers said. “We wanted to take them at their word. Perhaps they actually felt like they learned more from lectures than they did from active learning.”

In addition to Deslauriers, the study is authored by director of sciences education and physics lecturer Logan McCarty, senior preceptor in applied physics Kelly Miller, preceptor in physics Greg Kestin, and Kristina Callaghan, now a physics lecturer at the University of California, Merced.

The question of whether students’ perceptions of their learning matches with how well they’re actually learning is particularly important, Deslauriers said, because while students eventually see the value of active learning, initially it can feel frustrating.

“Deep learning is hard work. The effort involved in active learning can be misinterpreted as a sign of poor learning,” he said. “On the other hand, a superstar lecturer can explain things in such a way as to make students feel like they are learning more than they actually are.”

To understand that dichotomy, Deslauriers and his co-authors designed an experiment that would expose students in an introductory physics class to both traditional lectures and active learning.

For the first 11 weeks of the 15-week class, students were taught using standard methods by an experienced instructor. In the 12th week, half the class was randomly assigned to a classroom that used active learning, while the other half attended highly polished lectures. In a subsequent class, the two groups were reversed. Notably, both groups used identical class content and only active engagement with the material was toggled on and off.

Following each class, students were surveyed on how much they agreed or disagreed with statements such as “I feel like I learned a lot from this lecture” and “I wish all my physics courses were taught this way.” Students were also tested on how much they learned in the class with 12 multiple-choice questions.

When the results were tallied, the authors found that students felt as if they learned more from the lectures, but in fact scored higher on tests following the active learning sessions. “Actual learning and feeling of learning were strongly anticorrelated,” Deslauriers said, “as shown through the robust statistical analysis by co-author Kelly Miller, who is an expert in educational statistics and active learning.”
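The "strongly anticorrelated" finding is a statement about the Pearson correlation between paired measurements: each student's test score versus their self-reported feeling of learning. A minimal sketch of that computation follows; the data points are invented for illustration and are not the study's data.

```python
# Illustration of checking for (anti)correlation between actual learning
# (test scores) and feeling of learning (survey ratings). The paired data
# below are hypothetical; the study's real analysis was more elaborate.
import statistics

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

test_scores   = [9, 8, 7, 5, 4]  # actual learning (invented)
felt_learning = [2, 3, 3, 4, 5]  # self-reported feeling (invented)
r = pearson(test_scores, felt_learning)
print(round(r, 2))  # a negative r means higher scores pair with lower felt learning
```

A correlation near −1, as in this toy data, is what "strongly anticorrelated" means: the students who scored highest tended to report the weakest feeling of learning.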

Those results, the study authors are quick to point out, shouldn’t be interpreted as suggesting students dislike active learning. In fact, many studies have shown students quickly warm to the idea, once they begin to see the results. “In all the courses at Harvard that we’ve transformed to active learning,” Deslauriers said, “the overall course evaluations went up.”


Co-author Kestin, who in addition to being a physicist is a video producer with PBS’ NOVA, said, “It can be tempting to engage the class simply by folding lectures into a compelling ‘story,’ especially when that’s what students seem to like. I show my students the data from this study on the first day of class to help them appreciate the importance of their own involvement in active learning.”

McCarty, who oversees curricular efforts across the sciences, hopes this study will encourage more of his colleagues to embrace active learning.

“We want to make sure that other instructors are thinking hard about the way they’re teaching,” he said. “In our classes, we start each topic by asking students to gather in small groups to solve some problems. While they work, we walk around the room to observe them and answer questions. Then we come together and give a short lecture targeted specifically at the misconceptions and struggles we saw during the problem-solving activity. So far we’ve transformed over a dozen classes to use this kind of active-learning approach. It’s extremely efficient — we can cover just as much material as we would using lectures.”

A pioneer in work on active learning, Balkanski Professor of Physics and Applied Physics Eric Mazur hailed the study as debunking long-held beliefs about how students learn.

“This work unambiguously debunks the illusion of learning from lectures,” he said. “It also explains why instructors and students cling to the belief that listening to lectures constitutes learning. I recommend every lecturer reads this article.”

Dean of Science Christopher Stubbs, Samuel C. Moncher Professor of Physics and of Astronomy, was an early convert. "When I first switched to teaching using active learning, some students resisted that change. This research confirms that faculty should persist and encourage active learning. Active engagement in every classroom, led by our incredible science faculty, should be the hallmark of residential undergraduate education at Harvard."

Ultimately, Deslauriers said, the study shows that it’s important to ensure that neither instructors nor students are fooled into thinking that lectures are the best learning option. “Students might give fabulous evaluations to an amazing lecturer based on this feeling of learning, even though their actual learning isn’t optimal,” he said. “This could help to explain why study after study shows that student evaluations seem to be completely uncorrelated with actual learning.”

This research was supported with funding from the Harvard FAS Division of Science.


Center for Teaching

Active learning.


What is it? What's the theoretical basis? Is there evidence that it works? Why is it important? What are techniques to use? How should you get started?

In their seminal work Active Learning: Creating Excitement in the Classroom , compiled in 1991 for the Association for the Study of Higher Education and the ERIC Clearinghouse on Higher Education, Bonwell and Eison defined strategies that promote active learning as “instructional activities involving students in doing things and thinking about what they are doing” (Bonwell and Eison, 1991). Approaches that promote active learning focus more on developing students’ skills than on transmitting information and require that students do something—read, discuss, write—that requires higher-order thinking. They also tend to place some emphasis on students’ explorations of their own attitudes and values.

This definition is broad, and Bonwell and Eison explicitly recognize that a range of activities can fall within it. They suggest a spectrum of activities to promote active learning, ranging from very simple (e.g., pausing lecture to allow students to clarify and organize their ideas by discussing with neighbors) to more complex (e.g., using case studies as a focal point for decision-making). In their book Scientific Teaching , Handelsman, Miller and Pfund also note that the line between active learning and formative assessment is blurry and hard to define; after all, teaching that promotes students’ active learning asks students to do or produce something, which then can serve to help assess understanding (2007).


The National Survey of Student Engagement (NSSE) and the Australasian Survey of Student Engagement (AUSSE) provide a very simple definition: active learning involves "students' efforts to actively construct their knowledge." This definition is supplemented by the items the AUSSE uses to measure active learning: working with other students on projects during class; making a presentation; asking questions or contributing to discussions; participating in a community-based project as part of a course; working with other students outside of class on assignments; discussing ideas from a course with others outside of class; and tutoring peers (reported in Carr et al., 2015).

Freeman and colleagues collected written definitions of active learning from >300 people attending seminars on active learning, arriving at a consensus definition that emphasizes students’ use of higher order thinking to complete activities or participate in discussion in class (Freeman et al., 2014). Their definition also notes the frequent link between active learning and working in groups.

Thus active learning is commonly defined as activities that students do to construct knowledge and understanding. The activities vary but require students to do higher-order thinking. Although not always explicitly noted, metacognition—students' thinking about their own learning—is an important element, providing the link between activity and learning.

Constructivist learning theory emphasizes that individuals learn through building their own knowledge, connecting new ideas and experiences to existing knowledge and experiences to form new or enhanced understanding (Bransford et al., 1999). The theory, developed by Piaget and others, posits that learners can either assimilate new information into an existing framework, or can modify that framework to accommodate new information that contradicts prior understanding. Approaches that promote active learning often explicitly ask students to make connections between new information and their current mental models, extending their understanding. In other cases, teachers may design learning activities that allow students to confront misconceptions, helping students reconstruct their mental models based on more accurate understanding. In either case, approaches that promote active learning promote the kind of cognitive work identified as necessary for learning by constructivist learning theory.

Active learning approaches also often embrace the use of cooperative learning groups, a constructivist-based practice that places particular emphasis on the contribution that social interaction can make. Lev Vygotsky’s work elucidated the relationship between cognitive processes and social activities and led to the sociocultural theory of development, which suggests that learning takes place when students solve problems beyond their current developmental level with the support of their instructor or their peers (Vygotsky 1978). Thus active learning approaches that rely on group work rest on this sociocultural branch of constructivist learning theory, leveraging peer-peer interaction to promote students’ development of extended and accurate mental models.

The evidence that active learning approaches help students learn more effectively than transmissionist approaches in which instructors rely on “teaching by telling” is robust and stretches back more than thirty years (see, for example, Bonwell and Eison, 1991). Here, we will focus on two reports that review and analyze multiple active learning studies.


These results support other, earlier reviews (e.g., Hake, 1998; Prince, 2004; Springer et al., 1999). In one such review, Ruiz-Primo and colleagues examined published studies of the effects of active learning approaches in undergraduate biology, chemistry, engineering, and physics courses (Ruiz-Primo et al., 2011). They identified 166 studies that reported an effect size when comparing the effects of an innovation (i.e., active learning approaches) to traditional instruction that did not include the innovation. Overall, they found that inclusion of the active learning approaches improved student outcomes (mean effect size = 0.47), although there are important caveats to consider. First, the authors coded the active learning activities as conceptually oriented tasks, collaborative learning activities, technology-enabled activities, inquiry-based projects, or some combination of those four categories, and important differences existed within the categories (for example, technology-assisted inquiry-based projects on average did not produce positive effects). Second, more than 80% of the studies included were quasi-experimental rather than experimental, and the positive benefits (average effect size = 0.26) were lower for the experimental studies in which students were randomly assigned to a treatment group. Finally, many of the studies did not control for pre-existing knowledge and abilities in the treatment groups. Nonetheless, the review does provide qualified support for the inclusion of active learning approaches in instruction.
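Effect sizes like the 0.47 and 0.26 averages quoted above are standardized mean differences (Cohen's d): the gap between treatment and control means divided by a pooled standard deviation. A minimal sketch, using invented exam scores rather than data from any cited study, might look like:

```python
# Cohen's d: standardized mean difference between a treatment group
# (active learning) and a control group (traditional instruction).
# The exam scores below are invented, not taken from any cited study.
import statistics

def cohens_d(treatment, control):
    """(mean_t - mean_c) divided by the pooled standard deviation."""
    nt, nc = len(treatment), len(control)
    mt, mc = statistics.fmean(treatment), statistics.fmean(control)
    vt, vc = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((nt - 1) * vt + (nc - 1) * vc) / (nt + nc - 2)) ** 0.5
    return (mt - mc) / pooled_sd

active      = [78, 85, 82, 90, 74, 88]  # hypothetical exam scores
traditional = [70, 75, 80, 68, 77, 72]
print(round(cohens_d(active, traditional), 2))  # positive d favors active learning
```

Expressing the gain in standard-deviation units is what lets a review pool studies that used different exams and grading scales, as Ruiz-Primo and colleagues did.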

While the two reviews reported here focus on STEM disciplines and no similar reviews exist for the humanities and social sciences, the bulk of the evidence suggests that active learning approaches are effective across disciplines (Ambrose et al., 2010; Bonwell and Eison, 1991; Chickering and Gamson, 1987).

In addition to the evidence that active learning approaches promote learning for all students, there is some evidence that active learning approaches are an effective tool in making classrooms more inclusive. Haak and colleagues examined the effects of active learning for students in the University of Washington’s Educational Opportunity Program (EOP) who were enrolled in an introductory biology course (Haak et al., 2011). Students in the EOP are educationally or economically disadvantaged, are typically the first in their families to attend college, and include most underrepresented minority students at the University of Washington. Previous work had demonstrated that the researchers could predict student grades in the introductory biology course based on their college GPA and SAT verbal score; students in the EOP had a mean failure rate of ~22% compared to a mean failure rate of ~10% for students not in the EOP. When multiple highly structured approaches to promote active learning were incorporated into the introductory biology course, all students in the course benefited, but students in the EOP demonstrated a disproportionate benefit, reducing the achievement gap to almost half of the starting level. Given the pressing need to make U.S. college classrooms more inviting and productive spaces for students from all backgrounds, these results provide another compelling reason to incorporate active learning approaches into course design.

Lorenzo, Crouch, and Mazur also investigated the impact of active learning approaches on the difference in male and female performance in introductory physics classes (2006). They found that inclusion of active engagement techniques benefited all students, but had the greatest impact on female students’ performance. In fact, when they included a “high dose” of active learning approaches, the gender gap was eliminated. This result supports earlier work suggesting that women particularly benefit from active learning approaches (Laws et al., 1999; Schneider, 2001).

Brief, easy supplements for lectures

The Pause Procedure — Pause for two minutes every 12 to 18 minutes, encouraging students to discuss and rework notes in pairs. This approach encourages students to consider their understanding of the lecture material, including its organization. It also provides an opportunity for questioning and clarification and has been shown to significantly increase learning when compared to lectures without the pauses. (Bonwell and Eison, 1991; Rowe, 1980, 1986; Ruhl, Hughes, & Schloss, 1987)

Retrieval practice — Pause for two or three minutes every 15 minutes, having students write everything they can remember from the preceding class segment. Encourage questions. This approach prompts students to retrieve information from memory, which improves long-term memory, the ability to learn subsequent material, and the ability to transfer information to new domains. (Brame and Biel, 2015; see also the CFT’s guide to test-enhanced learning)

Demonstrations — Ask students to predict the result of a demonstration, briefly discussing with a neighbor. After the demonstration, ask them to discuss the observed result and how it may have differed from their prediction; follow up with instructor explanation. This approach asks students to test their understanding of a system by predicting an outcome. If their prediction is incorrect, it helps them see the misconception and thus prompts them to restructure their mental model.

Think-pair-share — Ask students a question that requires higher-order thinking (e.g., application, analysis, or evaluation levels within Bloom’s taxonomy). Ask students to think or write about an answer for one minute, then turn to a peer to discuss their responses for two minutes. Ask groups to share responses and follow up with instructor explanation. By asking students to explain their answer to a neighbor and to critically consider their neighbor’s responses, this approach helps students articulate newly formed mental connections.

Peer instruction with ConcepTests — This modification of the think-pair-share involves personal response devices (e.g., clickers). Pose a conceptually based multiple-choice question. Ask students to think about their answer and vote on a response before turning to a neighbor to discuss. Encourage students to change their answers after discussion, if appropriate, and share class results by revealing a graph of student responses. Use the graph as a stimulus for class discussion. This approach is particularly well adapted for large classes and can be facilitated with a variety of tools (e.g., Poll Everywhere, TopHat, TurningPoint). More information is available in the CIRTL MOOC An Introduction to Evidence-Based College STEM Teaching. (Fagen et al., 2002; Crouch and Mazur, 2001)
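The vote, discuss, and revote cycle works with any polling tool. Purely as an illustration (the helper names and response data below are hypothetical, not part of any clicker product's API), here is a short Python sketch of tallying two rounds of responses and rendering the kind of class histogram an instructor might reveal:

```python
from collections import Counter

def tally_votes(responses):
    # Count how many students chose each answer option.
    return Counter(responses)

def ascii_histogram(counts, width=20):
    # Render vote counts as a simple text bar chart for class display.
    total = sum(counts.values())
    lines = []
    for choice in sorted(counts):
        n = counts[choice]
        bar = "#" * round(width * n / total)
        lines.append(f"{choice}: {bar} {n}")
    return "\n".join(lines)

# Hypothetical responses to one ConcepTest, before and after peer discussion.
before = ["A", "B", "B", "C", "B", "A", "D", "B"]
after = ["B", "B", "B", "C", "B", "B", "B", "A"]
print("Before discussion:", ascii_histogram(tally_votes(before)), sep="\n")
print("After discussion:", ascii_histogram(tally_votes(after)), sep="\n")
```

Comparing the two histograms side by side makes convergence (or persistent confusion) visible at a glance, which is exactly the stimulus for the follow-up class discussion.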


Minute papers — Ask students a question that requires them to reflect on their learning or to engage in critical thinking. Have them write for one minute. Ask students to share responses to stimulate discussion or collect all responses to inform future class sessions. Like the think-pair-share approach, this approach encourages students to articulate and examine newly formed connections. (Angelo and Cross, 1993; Handelsman et al., 2007)

Activities to replace some lecture

Strip sequence — Give students the steps in a process on strips of paper that are jumbled; ask them to work together to reconstruct the proper sequence. This approach can strengthen students’ logical thinking processes and test their mental model of a process. (Handelsman et al., 2007) An example from Aarhus University is provided below.

[Image: example of a strip sequence activity from Aarhus University]

Concept map — Concept maps are visual representations of the relationships between concepts. Concepts are placed in nodes (often, circles), and the relationships between them are indicated by labeled arrows connecting the concepts. To have students create a concept map, identify the key concepts to be mapped in small groups or as a whole class. Ask students to determine the general relationship between the concepts and to arrange them two at a time, drawing arrows between related concepts and labeling with a short phrase to describe the relationship. By asking students to build an external representation of their mental model of a process, this approach helps students examine and strengthen the organization within the model. Further, it can emphasize the possibility of multiple “right” answers. More information and a tool to do online concept mapping can be found at the Institute for Human & Machine Cognition. (Novak and Canas, 2008)

Mini-maps. Mini-maps are like concept maps, but students are given a relatively short list of terms (usually 10 or fewer) to incorporate into their map. To use this approach, provide students a list of major concepts or specific terms and ask them to work in groups of two or three to arrange the terms in a logical structure, showing relationships with arrows and words. Ask groups to volunteer to share their mini-maps and clarify any confusing points. Mini-maps have many of the same strengths as concept maps but can be completed more quickly and thus can serve as part of a larger class session with other learning activities. (Handelsman et al., 2007)

Categorizing grids. Present students with a grid made up of several important categories and a list of scrambled terms, images, equations, or other items. Ask students to quickly sort the terms into the correct categories in the grid. Ask volunteers to share their grids and answer questions that arise. This approach allows students to express and thus interrogate the distinctions they see within a field of related items. It can be particularly effective at helping instructors identify misconceptions. (Angelo and Cross, 1993)


Decision-making activities. Ask students to imagine that they are policy-makers who must make and justify tough decisions. Provide a short description of a thorny problem, ask them to work in groups to arrive at a decision, and then have groups share out their decisions and explain their reasoning. This highly engaging technique helps students critically consider a challenging problem and encourages them to be creative in considering solutions. The “real-world” nature of the problems can provide incentive for students to dig deeply into the problems. (Handelsman et al., 2007)

Content, form, and function outlines. Students in small groups are asked to carefully analyze a particular artifact—such as a poem, a story, an essay, a billboard, an image, or a graph—and identify the “what” (the content), the “how” (the form), and the “why” (the function). This technique can help students consider the various ways that meaning is communicated in different genres. (Angelo and Cross, 1993)

Case-based learning. Much like decision-making activities, case-based learning presents students with situations from the larger world that require them to apply their knowledge to reach a conclusion about an open-ended situation. Provide students with a case, asking them to decide what they know that is relevant to the case, what other information they may need, and what impact their decisions may have, considering the broader implications of their decisions. Give small groups of three to five students time to consider responses, circulating to ask questions and provide help as needed. Provide opportunities for groups to share responses; the greatest value from case-based learning comes from the complexity and variety of answers that may be generated. More information and collections of cases are available at the National Center for Case Study Teaching in Science and World History Sources.

Discussion techniques

Many faculty members dispense with lecture altogether, turning to discussion to prompt the kinds of thinking needed to build understanding. Elizabeth Barkley provides a large collection of discussion techniques focused on different learning goals, ranging from lower level to higher level thinking (Barkley, 2010). The CFT’s Joe Bandy has summarized some of the most useful of these techniques.

Other approaches

There are other active learning pedagogies, many of which are highly structured and have dedicated websites and strong communities. These include team-based learning (TBL), process-oriented guided inquiry learning (POGIL), peer-led team learning, and problem-based learning (PBL). Further, the flipped classroom model is based on the idea that class time will be spent with students engaged in active learning.

Start small, start early, and start with activities that pose low risk for both instructors and students. The Pause Procedure, retrieval practice, minute papers, and the think-pair-share technique provide easy entry points to incorporating active learning approaches, requiring the instructor to change very little while providing students an opportunity to organize and clarify their thinking. As you begin to incorporate these practices, it’s a good idea to explain to your students why you’re doing so; talking to your students about their learning not only helps build a supportive classroom environment, but can also help them develop their metacognitive skills (and thus their ability to become independent learners).

As you consider other active learning techniques, use the “backwards design” approach: begin by identifying your learning goals, think about how you would determine whether students had reached them (that is, how you might structure assessment), and then choose an active learning approach that helps your students achieve those goals. Students typically have positive responses to active learning activities that are meaningful, appropriately challenging, and clearly tied to learning goals and assessments (see, for example, Lumpkin et al., 2015). Finally, consult colleagues within your department and the Center for Teaching for help and feedback as you design and implement active learning approaches.

There are many great sites that provide examples of active learning activities. Here is a sampling:

  • National Center for Case Study Teaching in Science
  • World History Sources
  • Online Teaching Activity Index
  • MERLOT II (online resources)
  • University of Michigan Center for Research on Learning and Teaching Active Learning page

Ambrose, S.A., Bridges, M.W., DiPietro, M., Lovett, M.C., Norman, M.K., and Mayer, R.E. (2010). How learning works: seven research-based principles for smart teaching. San Francisco: Jossey-Bass.

Angelo, T.A. and Cross, K.P. (1993). Classroom assessment techniques: a handbook for college teachers. San Francisco: Jossey-Bass.

Barkley, E. (2010). Student engagement techniques: a handbook for college faculty. San Francisco: Jossey-Bass.

Bonwell, C.C. and Eison, J.A. (1991). Active learning: creating excitement in the classroom. ASHE-ERIC Higher Education Report No. 1. Washington, D.C.: The George Washington University, School of Education and Human Development.

Brame, C.J. and Biel, R. (2015). Test-enhanced learning: the potential for testing to promote greater learning in undergraduate science courses. CBE Life Sciences Education 14, 1-12.

Bransford, J.D., Brown, A.L., and Cocking, R.R. (Eds.) (1999). How people learn: brain, mind, experience, and school. Washington, D.C.: National Academy Press.

Carr, R., Palmer, S., and Hagel, P. (2015). Active learning: the importance of developing a comprehensive measure. Active Learning in Higher Education 16, 173-186.

Chickering, A.W. and Gamson, Z.F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, March 1987, 3-7.

Crouch, C.H. and Mazur, E. (2001). Peer instruction: ten years of experience and results. American Journal of Physics 69, 970-977.

Fagen, A.P., Crouch, C.H., and Mazur, E. (2002). Peer instruction: results from a range of classrooms. Physics Teacher 40, 206-209.

Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H., and Wenderoth, M.P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences USA 111, 8410-8415.

Haak, D.C., HilleRisLambers, J., Pitre, E., and Freeman, S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science 332, 1213-1216.

Hake, R. (1998). Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics 66, 64-74.

Handelsman, J., Miller, S., and Pfund, C. (2007). Scientific teaching. New York: W.H. Freeman.

Hyman, R.T. (1980). Improving discussion leadership. New York: Columbia University Teachers College Press.

Laws, P., Rosborough, P., and Poodry, F. (1999). Women’s responses to an activity-based introductory physics program. American Journal of Physics 67, S32-S37.

Lorenzo, M., Crouch, C.H., and Mazur, E. (2006). Reducing the gender gap in the physics classroom. American Journal of Physics 74, 118-122.

Lumpkin, A., Achen, R., and Dodd, R. (2015). Student perceptions of active learning. College Student Journal 49, 121-133.

Novak, J.D. and Canas, A.J. (2008). The theory underlying concept maps and how to construct and use them. Technical Report IHMC CmapTools 2006-01 Rev 2008-01 (retrieved from http://cmap.ihmc.us/docs/theory-of-concept-maps).

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education 93, 223-231.

Rowe, M.B. (1980). Pausing principles and their effects on reasoning in science. In Teaching the Sciences, edited by F.B. Brawer. New Directions for Community Colleges No. 31. San Francisco: Jossey-Bass.

Ruhl, K., Hughes, C.A., and Schloss, P.J. (1987). Using the Pause Procedure to enhance lecture recall. Teacher Education and Special Education 10, 14-18.

Ruiz-Primo, M.A., Briggs, D., Iverson, H., Talbot, R., and Shepard, L.A. (2011). Impact of undergraduate science course innovations on learning. Science 331, 1269-1270.

Schneider, M. (2001). Encouragement of women physics majors at Grinnell College: a case study. The Physics Teacher 39, 280-282.

Springer, L., Stanne, M.E., and Donovan, S.S. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology. Review of Educational Research 69, 21-51.

Vygotsky, L.S. (1978). Mind in society. Cambridge, MA: Harvard University Press.


Review Article | Published: 02 August 2022
The science of effective learning with spacing and retrieval practice

Shana K. Carpenter, Steven C. Pan & Andrew C. Butler

Nature Reviews Psychology, volume 1, pages 496-511 (2022)


Research on the psychology of learning has highlighted straightforward ways of enhancing learning. However, effective learning strategies are underused by learners. In this Review, we discuss key research findings on two specific learning strategies: spacing and retrieval practice. We focus on how these strategies enhance learning in various domains across the lifespan, with an emphasis on research in applied educational settings. We also discuss key findings from research on metacognition — learners’ awareness and regulation of their own learning. The underuse of effective learning strategies by learners could stem from false beliefs about learning, lack of awareness of effective learning strategies or the counter-intuitive nature of these strategies. Findings in learner metacognition highlight the need to improve learners’ subjective mental models of how to learn effectively. Overall, the research discussed in this Review has important implications for the increasingly common situations in which learners must effectively monitor and regulate their own learning.


Carpenter, S. K. Cue strength as a moderator of the testing effect: the benefits of elaborative retrieval. J. Exp. Psychol. Learn. Mem. Cogn. 35 , 1563–1569 (2009).

Carpenter, S. K. Semantic information activated during retrieval contributes to later retention: support for the mediator effectiveness hypothesis of the testing effect. J. Exp. Psychol. Learn. Mem. Cogn. 37 , 1547–1552 (2011).

Rickard, T. C. & Pan, S. C. A dual memory theory of the testing effect. Psychon. Bull. Rev. 25 , 847–869 (2018).

Bjork, R. A. Retrieval as a Memory Modifier: An Interpretation of Negative Recency and Related Phenomena (CiteSeer X , 1975).

Arnold, K. M. & McDermott, K. B. Test-potentiated learning: distinguishing between direct and indirect effects of tests. J. Exp. Psychol. Learn. Mem. Cogn. 39 , 940–945 (2013).

Roediger, H. L. & Karpicke, J. D. The power of testing memory: basic research and implications for educational practice. Perspect. Psychol. Sci. 1 , 181–210 (2006). This review details the history of psychology research on the retrieval practice effect and is contributing heavily to the resurgence of researcher interest in the topic .

Carpenter, S. K. Testing enhances the transfer of learning. Curr. Dir. Psychol. Sci. 21 , 279–283 (2012).

Pan, S. C. & Agarwal, P. K. Retrieval Practice and Transfer of Learning: Fostering Students’ Application of Knowledge (Univ. of California, 2018).

Tran, R., Rohrer, D. & Pashler, H. Retrieval practice: the lack of transfer to deductive inferences. Psychon. Bull. Rev. 22 , 135–140 (2015).

Wissman, K. T., Zamary, A. & Rawson, K. A. When does practice testing promote transfer on deductive reasoning tasks? J. Appl. Res. Mem. Cogn. 7 , 398–411 (2018).

van Gog, T. & Sweller, J. Not new, but nearly forgotten: the testing effect decreases or even disappears as the complexity of learning materials increases. Educ. Psychol. Rev. 27 , 247–264 (2015).

Carpenter, S. K., Endres, T. & Hui, L. Students’ use of retrieval in self-regulated learning: implications for monitoring and regulating effortful learning experiences. Educ. Psychol. Rev. 32 , 1029–1054 (2020).

Yeo, D. J. & Fazio, L. K. The optimal learning strategy depends on learning goals and processes: retrieval practice versus worked examples. J. Educ. Psychol. 111 , 73–90 (2019).

Peterson, D. J. & Wissman, K. T. The testing effect and analogical problem-solving. Memory 26 , 1460–1466 (2018).

Hostetter, A. B., Penix, E. A., Norman, M. Z., Batsell, W. R. & Carr, T. H. The role of retrieval practice in memory and analogical problem-solving. Q. J. Exp. Psychol. 72 , 858–871 (2019).

Karpicke, J. D., Blunt, J. R., Smith, M. A. & Karpicke, S. S. Retrieval-based learning: the need for guided retrieval in elementary school children. J. Appl. Res. Mem. Cogn. 3 , 198–206 (2014).

Smith, M. A. & Karpicke, J. D. Retrieval practice with short-answer, multiple-choice, and hybrid tests. Memory 22 , 784–802 (2014).

Latimier, A., Peyre, H. & Ramus, F. A meta-analytic review of the benefit of spacing out retrieval practice episodes on retention. Educ. Psychol. Rev. 33 , 959–987 (2021).

Higham, P. A., Zengel, B., Bartlett, L. K. & Hadwin, J. A. The benefits of successive relearning on multiple learning outcomes. J. Educ. Psychol. https://doi.org/10.1037/edu0000693 (2021).

Hopkins, R. F., Lyle, K. B., Hieb, J. L. & Ralston, P. A. S. Spaced retrieval practice increases college students’ short- and long-term retention of mathematics knowledge. Educ. Psychol. Rev. 28 , 853–873 (2016).

Bahrick, H. P. Maintenance of knowledge: questions about memory we forgot to ask. J. Exp. Psychol. Gen. 108 , 296–308 (1979).

Rawson, K. A. & Dunlosky, J. Successive relearning: an underexplored but potent technique for obtaining and maintaining knowledge. Curr. Dir. Psychol. Sci. https://doi.org/10.1177/09637214221100484 (2022). This brief review discusses the method of successive relearning — an effective learning technique that combines spacing and retrieval — and its benefits .

Rawson, K. A. & Dunlosky, J. When is practice testing most effective for improving the durability and efficiency of student learning? Educ. Psychol. Rev. 24 , 419–435 (2012).

Janes, J. L., Dunlosky, J., Rawson, K. A. & Jasnow, A. Successive relearning improves performance on a high-stakes exam in a difficult biopsychology course. Appl. Cognit. Psychol. 34 , 1118–1132 (2020).

Rawson, K. A., Dunlosky, J. & Janes, J. L. All good things must come to an end: a potential boundary condition on the potency of successive relearning. Educ. Psychol. Rev. 32 , 851–871 (2020).

Rawson, K. A. & Dunlosky, J. Optimizing schedules of retrieval practice for durable and efficient learning: how much is enough? J. Exp. Psychol. Gen. 140 , 283–302 (2011).

Flavell, J. H. Metacognition and cognitive monitoring: a new area of cognitive–developmental inquiry. Am. Psychol. 34 , 906–911 (1979). This classic paper introduces ideas that are now foundational to research on metacognition .

Kuhn, D. Metacognition matters in many ways. Educ. Psychol. 57 , 73–86 (2021).

Norman, E. et al. Metacognition in psychology. Rev. Gen. Psychol. 23 , 403–424 (2019).

Was, C. A. & Al-Harthy, I. S. Persistence of overconfidence in young children: factors that lead to more accurate predictions of memory performance. Eur. J. Dev. Psychol. 15 , 156–171 (2018).

Forsberg, A., Blume, C. L. & Cowan, N. The development of metacognitive accuracy in working memory across childhood. Dev. Psychol. 57 , 1297–1317 (2021).

Kuhn, D. Metacognitive development. Curr. Dir. Psychol. Sci . 9 , 178-181 (2000).

Bell, P. & Volckmann, D. Knowledge surveys in general chemistry: confidence, overconfidence, and performance. J. Chem. Educ. 88 , 1469–1476 (2011).

Saenz, G. D., Geraci, L. & Tirso, R. Improving metacognition: a comparison of interventions. Appl. Cognit. Psychol. 33 , 918–929 (2019).

Morphew, J. W. Changes in metacognitive monitoring accuracy in an introductory physics course. Metacogn. Learn. 16 , 89–111 (2021).

Geller, J. et al. Study strategies and beliefs about learning as a function of academic achievement and achievement goals. Memory 26 , 683–690 (2018).

Kornell, N. & Bjork, R. A. The promise and perils of self-regulated study. Psychon. Bull. Rev. 14 , 219–224 (2007).

Yan, V. X., Thai, K.-P. & Bjork, R. A. Habits and beliefs that guide self-regulated learning: do they vary with mindset? J. Appl. Res. Mem. Cogn. 3 , 140–152 (2014).

Rivers, M. L. Metacognition about practice testing: a review of learners’ beliefs, monitoring, and control of test-enhanced learning. Educ. Psychol. Rev. 33 , 823–862 (2021).

Carpenter, S. K. et al. Students’ use of optional online reviews and its relationship to summative assessment outcomes in introductory biology. LSE 16 , ar23 (2017).

Corral, D., Carpenter, S. K., Perkins, K. & Gentile, D. A. Assessing students’ use of optional online lecture reviews. Appl. Cognit. Psychol. 34 , 318–329 (2020).

Blasiman, R. N., Dunlosky, J. & Rawson, K. A. The what, how much, and when of study strategies: comparing intended versus actual study behaviour. Memory 25 , 784–792 (2017).

Karpicke, J. D., Butler, A. C. & Roediger, H. L. III Metacognitive strategies in student learning: do students practise retrieval when they study on their own? Memory 17 , 471–479 (2009).

Hamman, D., Berthelot, J., Saia, J. & Crowley, E. Teachers’ coaching of learning and its relation to students’ strategic learning. J. Educ. Psychol. 92 , 342–348 (2000).

Kistner, S. et al. Promotion of self-regulated learning in classrooms: investigating frequency, quality, and consequences for student performance. Metacogn. Learn. 5 , 157–171 (2010).

Morehead, K., Rhodes, M. G. & DeLozier, S. Instructor and student knowledge of study strategies. Memory 24 , 257–271 (2016).

Pomerance, L., Greenberg, J. & Walsh, K. Learning about Learning: What Every New Teacher Needs to Know (National Council on Teacher Quality, 2016).

Dinsmore, D. L., Alexander, P. A. & Loughlin, S. M. Focusing the conceptual lens on metacognition, self-regulation, and self-regulated learning. Educ. Psychol. Rev. 20 , 391–409 (2008). This conceptual review paper explores the relationship between metacognition, self-regulation and self-regulated learning .

Winne, P. H. in Handbook of Self-regulation of Learning and Performance 2nd edn 36–48 (Routledge/Taylor & Francis, 2018).

Pintrich, P. R. A conceptual framework for assessing motivation and self-regulated learning in college students. Educ. Psychol. Rev. 16 , 385–407 (2004).

Zimmerman, B. J. Self-efficacy: an essential motive to learn. Contemp. Educ. Psychol. 25 , 82–91 (2000).

McDaniel, M. A. & Butler, A. C. in Successful Remembering and Successful Forgetting: A Festschrift in Honor of Robert A. Bjork 175–198 (Psychology Press, 2011).

Bjork, R. A., Dunlosky, J. & Kornell, N. Self-regulated learning: beliefs, techniques, and illusions. Annu. Rev. Psychol. 64 , 417–444 (2013). This review provides an overview of the cognitive psychology perspective on the metacognition of strategy planning and use .

Nelson, T. O. & Narens, L. in Psychology of Learning and Motivation Vol. 26 (ed. Bower, G. H.) 125–173 (Academic, 1990).

Fiechter, J. L., Benjamin, A. S. & Unsworth, N. in The Oxford Handbook of Metamemory (eds Dunlosky, J. & Tauber, S. K.) 307–324 (Oxford Univ. Press, 2016).

Efklides, A. Interactions of metacognition with motivation and affect in self-regulated learning: the MASRL model. Educ. Psychol. 46 , 6–25 (2011).

Zimmerman, B. J. in Handbook of Self-regulation (eds Boekaerts, M. & Pintrich, P. R.) 13–39 (Academic, 2000). This paper lays out a prominent theory of self-regulated learning and exemplifies the educational psychology perspective on the metacognition of strategy planning and use .

Wolters, C. A. Regulation of motivation: evaluating an underemphasized aspect of self-regulated learning. Educ. Psychol. 38 , 189–205 (2003).

Wolters, C. A. & Benzon, M. Assessing and predicting college students’ use of strategies for the self-regulation of motivation. J. Exp. Educ. 18 , 199–221 (2013).

Abel, M. & Bäuml, K.-H. T. Would you like to learn more? Retrieval practice plus feedback can increase motivation to keep on studying. Cognition 201 , 104316 (2020).

Kang, S. H. K. & Pashler, H. Is the benefit of retrieval practice modulated by motivation? J. Appl. Res. Mem. Cogn. 3 , 183–188 (2014).

Vermunt, J. D. & Verloop, N. Congruence and friction between learning and teaching. Learn. Instr. 9 , 257–280 (1999).

Coertjens, L., Donche, V., De Maeyer, S., Van Daal, T. & Van Petegem, P. The growth trend in learning strategies during the transition from secondary to higher education in Flanders. High. Educ.: Int. J. High. Education Educ. Plan. 3 , 499–518 (2017).

Severiens, S., Ten Dam, G. & Van Hout Wolters, B. Stability of processing and regulation strategies: two longitudinal studies on student learning. High. Educ. 42 , 437–453 (2001).

Watkins, D. & Hattie, J. A longitudinal study of the approaches to learning of Austalian tertiary students. Hum. Learn. J. Practical Res. Appl. 4 , 127–141 (1985).

Russell, J. M., Baik, C., Ryan, A. T. & Molloy, E. Fostering self-regulated learning in higher education: making self-regulation visible. Act. Learn. Higher Educ . 23 , 97–113 (2020).

Schraw, G. Promoting general metacognitive awareness. Instr. Sci. 26 , 113–125 (1998).

Lundeberg, M. A. & Fox, P. W. Do laboratory findings on test expectancy generalize to classroom outcomes? Rev. Educ. Res. 61 , 94–106 (1991).

Rivers, M. L. & Dunlosky, J. Are test-expectancy effects better explained by changes in encoding strategies or differential test experience? J. Exp. Psychol. Learn. Mem. Cognn. 47 , 195–207 (2021).

Chi, M. in Handbook of Research on Conceptual Change (ed. Vosniadou, S.) 61–82 (Lawrence Erlbaum, 2009).

Susser, J. A. & McCabe, J. From the lab to the dorm room: metacognitive awareness and use of spaced study. Instr. Sci. 41 , 345–363 (2013).

Yan, V. X., Bjork, E. L. & Bjork, R. A. On the difficulty of mending metacognitive illusions: a priori theories, fluency effects, and misattributions of the interleaving benefit. J. Exp. Psychol. Gen. 145 , 918–933 (2016).

Ariel, R. & Karpicke, J. D. Improving self-regulated learning with a retrieval practice intervention. J. Exp. Psychol.Appl. 24 , 43–56 (2018).

Biwer, F., oude Egbrink, M. G. A., Aalten, P. & de Bruin, A. B. H. Fostering effective learning strategies in higher education — a mixed-methods study. J. Appl. Res. Mem. Cogn. 9 , 186–203 (2020).

McDaniel, M. A. & Einstein, G. O. Training learning strategies to promote self-regulation and transfer: the knowledge, belief, commitment, and planning framework. Perspect. Psychol. Sci. 15 , 1363–1381 (2020). This paper provides a framework for training students on how to use learning strategies .

Cleary, A. M. et al. Wearable technology for automatizing science-based study strategies: reinforcing learning through intermittent smartwatch prompting. J. Appl. Res. Mem. Cogn. 10 , 444–457 (2021).

Fazio, L. K. Repetition increases perceived truth even for known falsehoods. Collabra: Psychology 6 , 38 (2020).

Kozyreva, A., Lewandowsky, S. & Hertwig, R. Citizens versus the Internet: confronting digital challenges with cognitive tools. Psychol. Sci. Public. Interest. 21 , 103–156 (2020).

Pennycook, G. & Rand, D. G. The psychology of fake news. Trends Cognit. Sci. 25 , 388–402 (2021).

Ecker, U. K. H. et al. The psychological drivers of misinformation belief and its resistance to correction. Nat. Rev. Psychol. 1 , 13–29 (2022).

Toppino, T. C., Kasserman, J. E. & Mracek, W. A. The effect of spacing repetitions on the recognition memory of young children and adults. J. Exp. Child. Psychol. 51 , 123–138 (1991).

Childers, J. B. & Tomasello, M. Two-year-olds learn novel nouns, verbs, and conventional actions from massed or distributed exposures. Dev. Psychol. 38 , 967–978 (2002).

Lotfolahi, A. R. & Salehi, H. Spacing effects in vocabulary learning: young EFL learners in focus. Cogent Education 4 , 1287391 (2017).

Ambridge, B., Theakston, A. L., Lieven, E. V. M. & Tomasello, M. The distributed learning effect for children’s acquisition of an abstract syntactic construction. Cognit. Dev. 21 , 174–193 (2006).

Schutte, G. M. et al. A comparative analysis of massed vs. distributed practice on basic math fact fluency growth rates. J. Sch. Psychol. 53 , 149–159 (2015).

Küpper-Tetzel, C. E., Erdfelder, E. & Dickhäuser, O. The lag effect in secondary school classrooms: enhancing students’ memory for vocabulary. Instr. Sci. 42 , 373–388 (2014).

Bloom, K. C. & Shuell, T. J. Effects of massed and distributed practice on the learning and retention of second-language vocabulary. J. Educ. Res. 74 , 245–248 (1981).

Grote, M. G. Distributed versus massed practice in high school physics. Sch. Sci. Math. 95 , 97 (1995).

Minnick, B. Can spaced review help students learn brief forms? J. Educ. Bus. 44 , 146–148 (1969).

Dobson, J. L., Perez, J. & Linderholm, T. Distributed retrieval practice promotes superior recall of anatomy information. Anat. Sci. Educ. 10 , 339–347 (2017).

Kornell, N. & Bjork, R. A. Learning concepts and categories: is spacing the “enemy of induction”? Psychol. Sci. 19 , 585–592 (2008).

Rawson, K. A. & Kintsch, W. Rereading effects depend on time of test. J. Educ. Psychol. 97 , 70–80 (2005).

Butler, A. C., Marsh, E. J., Slavinsky, J. P. & Baraniuk, R. G. Integrating cognitive science and technology improves learning in a STEM classroom. Educ. Psychol. Rev. 26 , 331–340 (2014).

Carpenter, S. K. & DeLosh, E. L. Application of the testing and spacing effects to name learning. Appl. Cognit. Psychol. 19 , 619–636 (2005).

Pan, S. C., Tajran, J., Lovelett, J., Osuna, J. & Rickard, T. C. Does interleaved practice enhance foreign language learning? The effects of training schedule on Spanish verb conjugation skills. J. Educ. Psychol. 111 , 1172–1188 (2019).

Miles, S. W. Spaced vs. massed distribution instruction for L2 grammar learning. System 42 , 412–428 (2014).

Rohrer, D. & Taylor, K. The effects of overlearning and distributed practise on the retention of mathematics knowledge. Appl. Cognit. Psychol. 20 , 1209–1224 (2006).

Wahlheim, C. N., Dunlosky, J. & Jacoby, L. L. Spacing enhances the learning of natural concepts: an investigation of mechanisms, metacognition, and aging. Mem. Cogn. 39 , 750–763 (2011).

Simmons, A. L. Distributed practice and procedural memory consolidation in musicians’ skill learning. J. Res. Music. Educ. 59 , 357–368 (2012).

Ebersbach, M. & Barzagar Nazari, K. Implementing distributed practice in statistics courses: benefits for retention and transfer. J. Appl. Res. Mem. Cogn. 9 , 532–541 (2020).

Kornell, N. Optimising learning using flashcards: spacing is more effective than cramming. Appl. Cognit. Psychol. 23 , 1297–1317 (2009).

Bouzid, N. & Crawshaw, C. M. Massed versus distributed wordprocessor training. Appl. Ergon. 18 , 220–222 (1987).

Lin, Y., Cheng, A., Grant, V. J., Currie, G. R. & Hecker, K. G. Improving CPR quality with distributed practice and real-time feedback in pediatric healthcare providers—a randomized controlled trial. Resuscitation 130 , 6–12 (2018).

Terenyi, J., Anksorus, H. & Persky, A. M. Impact of spacing of practice on learning brand name and generic drugs. Am. J. Pharm. Educ. 82 , 6179 (2018).

Kerfoot, B. P., DeWolf, W. C., Masser, B. A., Church, P. A. & Federman, D. D. Spaced education improves the retention of clinical knowledge by medical students: a randomised controlled trial. Med. Educ. 41 , 23–31 (2007).

Kornell, N., Castel, A. D., Eich, T. S. & Bjork, R. A. Spacing as the friend of both memory and induction in young and older adults. Psychol. Aging 25 , 498–503 (2010).

Leite, C. M. F., Ugrinowitsch, H., Carvalho, M. F. S. P. & Benda, R. N. Distribution of practice effects on older and younger adults’ motor-skill learning ability. Hum. Mov. 14 , 20–26 (2013).

Balota, D. A., Duchek, J. M. & Paullin, R. Age-related differences in the impact of spacing, lag, and retention interval. Psychol. Aging 4 , 3–9 (1989).

Kliegl, O., Abel, M. & Bäuml, K.-H. T. A (preliminary) recipe for obtaining a testing effect in preschool children: two critical ingredients. Front. Psychol. 9 , 1446 (2018).

Fritz, C. O., Morris, P. E., Nolan, D. & Singleton, J. Expanding retrieval practice: an effective aid to preschool children’s learning. Q. J. Exp. Psychol. 60 , 991–1004 (2007).

Rohrer, D., Taylor, K. & Sholar, B. Tests enhance the transfer of learning. J. Exp. Psychol. Learn. Mem. Cogn. 36 , 233–239 (2010).

Lipowski, S. L., Pyc, M. A., Dunlosky, J. & Rawson, K. A. Establishing and explaining the testing effect in free recall for young children. Dev. Psychol. 50 , 994–1000 (2014).

Wartenweiler, D. Testing effect for visual-symbolic material: enhancing the learning of Filipino children of low socio-economic status in the public school system. Int. J. Res. Rev . 20 , 74–93 (2011).

Karpicke, J. D., Blunt, J. R. & Smith, M. A. Retrieval-based learning: positive effects of retrieval practice in elementary school children. Front. Psychol. 7 , 350 (2016).

Metcalfe, J., Kornell, N. & Son, L. K. A cognitive-science based programme to enhance study efficacy in a high and low risk setting. Eur. J. Cognit. Psychol. 19 , 743–768 (2007).

Rowley, T. & McCrudden, M. T. Retrieval practice and retention of course content in a middle school science classroom. Appl. Cognit. Psychol. 34 , 1510–1515 (2020).

McDaniel, M. A., Agarwal, P. K., Huelser, B. J., McDermott, K. B. & Roediger, H. L. Test-enhanced learning in a middle school science classroom: the effects of quiz frequency and placement. J. Educ. Psychol. 103 , 399–414 (2011).

Nungester, R. J. & Duchastel, P. C. Testing versus review: effects on retention. J. Educ. Psychol. 74 , 18–22 (1982).

Dirkx, K. J. H., Kester, L. & Kirschner, P. A. The testing effect for learning principles and procedures from texts. J. Educ. Res. 107 , 357–364 (2014).

Marsh, E. J., Agarwal, P. K. & Roediger, H. L. Memorial consequences of answering SAT II questions. J. Exp. Psychol. Appl. 15 , 1–11 (2009).

Chang, C., Yeh, T. & Barufaldi, J. P. The positive and negative effects of science concept tests on student conceptual understanding. Int. J. Sci. Educ. 32 , 265–282 (2010).

Grimaldi, P. J. & Karpicke, J. D. Guided retrieval practice of educational materials using automated scoring. J. Educ. Psychol. 106 , 58–68 (2014).

Pan, S. C., Gopal, A. & Rickard, T. C. Testing with feedback yields potent, but piecewise, learning of history and biology facts. J. Educ. Psychol. 108 , 563–575 (2016).

Darabi, A., Nelson, D. W. & Palanki, S. Acquisition of troubleshooting skills in a computer simulation: worked example vs. conventional problem solving instructional strategies. Comput. Hum. Behav. 23 , 1809–1819 (2007).

Kang, S. H. K., Gollan, T. H. & Pashler, H. Don’t just repeat after me: retrieval practice is better than imitation for foreign vocabulary learning. Psychon. Bull. Rev. 20 , 1259–1265 (2013).

Carpenter, S. K. & Pashler, H. Testing beyond words: using tests to enhance visuospatial map learning. Psychon. Bull. Rev. 14 , 474–478 (2007).

Carpenter, S. K. & Kelly, J. W. Tests enhance retention and transfer of spatial learning. Psychon. Bull. Rev. 19 , 443–448 (2012).

Kang, S. H. K., McDaniel, M. A. & Pashler, H. Effects of testing on learning of functions. Psychon. Bull. Rev. 18 , 998–1005 (2011).

Jacoby, L. L., Wahlheim, C. N. & Coane, J. H. Test-enhanced learning of natural concepts: effects on recognition memory, classification, and metacognition. J. Exp. Psychol. Learn. Mem. Cogn. 36 , 1441–1451 (2010).

McDaniel, M. A., Anderson, J. L., Derbish, M. H. & Morrisette, N. Testing the testing effect in the classroom. Eur. J. Cognit. Psychol. 19 , 494–513 (2007).

Foss, D. J. & Pirozzolo, J. W. Four semesters investigating frequency of testing, the testing effect, and transfer of training. J. Educ. Psychol. 109 , 1067–1083 (2017).

Wong, S. S. H., Ng, G. J. P., Tempel, T. & Lim, S. W. H. Retrieval practice enhances analogical problem solving. J. Exp. Educ. 87 , 128–138 (2019).

Pan, S. C., Rubin, B. R. & Rickard, T. C. Does testing with feedback improve adult spelling skills relative to copying and reading? J. Exp. Psychol. Appl. 21 , 356–369 (2015).

Coppens, L., Verkoeijen, P. & Rikers, R. Learning Adinkra symbols: the effect of testing. J. Cognit. Psychol. 23 , 351–357 (2011).

Zaromb, F. M. & Roediger, H. L. The testing effect in free recall is associated with enhanced organizational processes. Mem. Cogn. 38 , 995–1008 (2010).

Carpenter, S. K., Pashler, H. & Vul, E. What types of learning are enhanced by a cued recall test? Psychon. Bull. Rev. 13 , 826–830 (2006).

Pan, S. C., Wong, C. M., Potter, Z. E., Mejia, J. & Rickard, T. C. Does test-enhanced learning transfer for triple associates? Mem. Cogn. 44 , 24–36 (2016).

Butler, A. C. & Roediger, H. L. Testing improves long-term retention in a simulated classroom setting. Eur. J. Cognit. Psychol. 19 , 514–527 (2007).

Dobson, J. L. & Linderholm, T. Self-testing promotes superior retention of anatomy and physiology information. Adv. Health Sci. Educ. 20 , 149–161 (2015).

Kromann, C. B., Jensen, M. L. & Ringsted, C. The effect of testing on skills learning. Med. Educ. 43 , 21–27 (2009).

Baghdady, M., Carnahan, H., Lam, E. W. N. & Woods, N. N. Test-enhanced learning and its effect on comprehension and diagnostic accuracy. Med. Educ. 48 , 181–188 (2014).

Freda, N. M. & Lipp, M. J. Test-enhanced learning in competence-based predoctoral orthodontics: a four-year study. J. Dental Educ. 80 , 348–354 (2016).

Tse, C.-S., Balota, D. A. & Roediger, H. L. The benefits and costs of repeated testing on the learning of face–name pairs in healthy older adults. Psychol. Aging 25 , 833–845 (2010).

Meyer, A. N. D. & Logan, J. M. Taking the testing effect beyond the college freshman: benefits for lifelong learning. Psychol. Aging 28 , 142–147 (2013).

Guran, C.-N. A., Lehmann-Grube, J. & Bunzeck, N. Retrieval practice improves recollection-based memory over a seven-day period in younger and older adults. Front. Psychol. 10 , 2997 (2020).

McCabe, J. Metacognitive awareness of learning strategies in undergraduates. Mem. Cogn. 39 , 462–476 (2011).

Carpenter, S. K., Witherby, A. E. & Tauber, S. K. On students’ (mis)judgments of learning and teaching effectiveness. J. Appl. Res. Mem. Cogn. 9 , 137–151 (2020). This review discusses the factors underlying faulty metacognition, and how they can mislead students’ judgements of their own learning as well as the quality of effective teaching .

Chi, M. T. H., Bassok, M., Lewis, M. W., Reimann, P. & Glaser, R. Self-explanations: how students study and use examples in learning to solve problems. Cognit. Sci. 13 , 145–182 (1989).

Gurung, R. A. R. How do students really study (and does it matter)? Teach. Psychol. 32 , 238–241 (2005).

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K. & Kestin, G. Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl Acad. Sci. USA 116 , 19251–19257 (2019).

Hartwig, M. K., Rohrer, D. & Dedrick, R. F. Scheduling math practice: students’ underappreciation of spacing and interleaving. J. Exp. Psychol. Appl. 28 , 100–113 (2022).

Carpenter, S. K., King-Shepard, Q., & Nokes-Malach, T. J. in In Their Own Words: What Scholars Want You to Know About Why and How to Apply the Science of Learning in Your Academic Setting (eds Overson, C., Hakala, C., Kordonowy, L. & Benassi, V.) (American Psychological Association, in the press).

Kirk-Johnson, A., Galla, B. M. & Fraundorf, S. H. Perceiving effort as poor learning: the misinterpreted-effort hypothesis of how experienced effort and perceived learning relate to study strategy choice. Cognit. Psychol. 115 , 101237 (2019).

Fisher, O. & Oyserman, D. Assessing interpretations of experienced ease and difficulty as motivational constructs. Motiv. Sci. 3 , 133–163 (2017).

Schiefele, U. Interest, learning, and motivation. Educ. Psychol. 26 , 299–323 (1991).

Simons, J., Dewitte, S. & Lens, W. The role of different types of instrumentality in motivation, study strategies, and performance: know why you learn, so you’ll know what you learn! Br. J. Educ. Psychol. 74 , 343–360 (2004).

Pan, S. C., Sana, F., Samani, J., Cooke, J. & Kim, J. A. Learning from errors: students’ and instructors’ practices, attitudes, and beliefs. Memory 28 , 1105–1122 (2020).

Download references

Acknowledgements

This material is based upon work supported by the James S. McDonnell Foundation 21st Century Science Initiative in Understanding Human Cognition, Collaborative Grant 220020483. The authors thank C. Phua for assistance with verifying references.

Author information

Authors and Affiliations

Department of Psychology, Iowa State University, Ames, IA, USA

  • Shana K. Carpenter

Department of Psychology, National University of Singapore, Singapore City, Singapore

  • Steven C. Pan

Department of Education, Washington University in St. Louis, St. Louis, MO, USA

  • Andrew C. Butler

Department of Psychological and Brain Sciences, Washington University in St. Louis, St. Louis, MO, USA

  • Andrew C. Butler


Contributions

All authors contributed to the design of the article. S.K.C. drafted the sections on measuring learning, spacing, successive relearning and future directions; S.C.P. drafted the section on retrieval practice, developed the figures and drafted the tables; A.C.B. drafted the section on metacognition. All authors edited and approved the final draft of the complete manuscript.

Corresponding author

Correspondence to Shana K. Carpenter.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Reviews Psychology thanks Veronica Yan, who co-reviewed with Brendan Schuetze; Mirjam Ebersbach; and Nate Kornell for their contribution to the peer review of this work.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Cite this article

Carpenter, S. K., Pan, S. C. & Butler, A. C. The science of effective learning with spacing and retrieval practice. Nat. Rev. Psychol. 1, 496–511 (2022). https://doi.org/10.1038/s44159-022-00089-1


Accepted: 23 June 2022

Published: 02 August 2022

Issue date: September 2022

DOI: https://doi.org/10.1038/s44159-022-00089-1


Quick links

  • Explore articles by subject
  • Guide to authors
  • Editorial policies

Sign up for the Nature Briefing newsletter — what matters in science, free to your inbox daily.


ScienceDaily

Learning is more effective when active

Engaging students through interactive activities, discussions, feedback and AI-enhanced technologies resulted in improved academic performance compared to traditional lectures, lessons or readings, faculty from Carnegie Mellon University's Human-Computer Interaction Institute concluded after collecting research into active learning.

The research also found that effective active learning methods use not only hands-on and minds-on approaches, but also hearts-on, providing increased emotional and social support.

Interest in active learning grew as the COVID-19 pandemic challenged educators to find new ways to engage students. Schools and teachers incorporated new technologies to adapt, while students faced the negative psychological effects of isolation, restlessness and inattention brought on by quarantine and remote learning. The pandemic made it clear that traditional approaches to education may not be the best way to learn, but questions persisted about what active learning is and how best to use it to teach, engage and excite students.

Nesra Yannier, a faculty member in the HCII, and Ken Koedinger, a professor of human-computer interaction and psychology, collaborated with researchers at several universities, including Stanford, Harvard and the University of Washington, to summarize the important findings around active learning. Their work, "Active learning: 'Hands-on' meets 'minds-on,'" was published in Science. The studies collected by Yannier and Koedinger span children to college-age adults, demonstrate how and when different approaches to active learning can be effective and engaging, and suggest ways to incorporate lessons learned from schooling during the height of the COVID-19 pandemic.

"We wanted to see what we learned from teaching and learning during COVID and what could be brought back into the classroom," Yannier said. "COVID forced educators to engage students in novel ways, and teachers were experimenting with new technology."

The collected studies showed that active learning can put students in the driver's seat of their lessons. Active learning techniques encourage students to produce thoughts and get feedback through interactive settings rather than passively receiving information as is common in pervasive approaches to education like lectures and readings.

One study included in the collection showed the benefits of physical activity for creativity and idea generation. Another found that while college students think they learn more in traditional lectures than through active learning approaches, they do not. Active learning produces better outcomes.

Yannier and Koedinger included their own research, completed with Scott Hudson, a professor in the HCII, which found that incorporating an AI-based virtual helper to question students, encourage them to think critically and engage them in discussions increased learning in hands-on activities while also supporting teachers. The researchers performed controlled experiments with NoRILLA, a mixed-reality learning platform in which children perform and interpret real-world experiments on an earthquake table, ramps or other physical apparatuses and receive personalized interactive feedback, to see how much children learned with the artificial intelligence turned on and off. When it was turned off, the students learned far less.

"We've done a lot of research around this," Yannier said. "If we don't have the AI guidance on, the children are not able to understand the underlying concepts, and the learning doesn't translate into the real world."

Both Yannier and Koedinger said that the studies they summarized made it clear that there are many approaches to active learning and how to research it. They hope their paper will move educators to incorporate more active learning in their lessons and think about how they can participate in research into it.

"It's quite clear in this collection that even among like-minded folks there are seven or more applications of active learning that work and sometimes they work in contradictory ways," Koedinger said. "There is so much richness in this field that we can continually make improvements to make it more effective and enjoyable for a long, long time."


Story Source:

Materials provided by Carnegie Mellon University. Original written by Aaron Aupperlee. Note: Content may be edited for style and length.

Journal Reference :

  • Nesra Yannier, Scott E. Hudson, Kenneth R. Koedinger, Kathy Hirsh-Pasek, Roberta Michnick Golinkoff, Yuko Munakata, Sabine Doebel, Daniel L. Schwartz, Louis Deslauriers, Logan McCarty, Kristina Callaghan, Elli J. Theobald, Scott Freeman, Katelyn M. Cooper, Sara E. Brownell. Active learning: "Hands-on" meets "minds-on". Science, 2021; 374(6563): 26. DOI: 10.1126/science.abj9957


Teaching & Learning


Active learning

Active learning is an approach which gets your students more involved and engaged in their learning. Learn how to incorporate it into your teaching.


1 August 2019

Active learning puts students at the heart of the learning experience. It enables them to become much more engaged with their own learning.

By becoming active participants in the classroom, students build knowledge through their own experiences.

Research shows that active learning can help students achieve a far deeper understanding of a topic than by simply listening to lectures or reading textbooks.

For teachers, active learning provides more opportunities to interact with students. For example, it can give you more ways to get continual feedback to evaluate your teaching.

What active learning means

Active learning is an approach, rather than a fixed set of activities.

It can include any activity that encourages students to take an active, engaged part in the learning process within the classroom, such as:

  • group discussions
  • student presentations
  • experiments
  • problem-solving

Active learning is about teachers providing the environment and opportunities for students to build knowledge and understanding of a subject for themselves.

This is in contrast to more traditional methods of teaching, such as a lecturer seeking to ‘transmit’ knowledge to students as they sit and listen.

  • 20 terrible reasons for lecturing, by Oxford Brookes University, describes the limitations of the traditional lecture.

Putting it into practice

Make lectures more interactive.

There are many ways to introduce an active learning style into your teaching sessions – both with large groups and small groups.

You don’t necessarily have to make entire sessions ‘active’. Consider breaking up a lecture with other activities to make it more interactive.

Classroom response systems, also known as ‘clickers’ (hand-held devices allowing students to select answers to multiple-choice questions that you ask in class) are an easy and effective way to get students actively involved.

Clickers are installed in UCL’s larger lecture theatres. There are also several free apps, such as Socrative, that allow students to record their answers using their own smartphone or tablet.

In large classes, try dividing the students into small groups so that they can get involved in active discussions.

  • Digital education: resources and useful links has links to various examples of technology that could assist you.

Try introducing a few active learning elements into your teaching to begin with. If students respond positively, add more.

Use ‘flipped’ learning

Get students to prepare for a classroom session in advance, for example by watching a video and answering questions about it. The session could then be based around small-group activities that you facilitate.

You could use a ‘flipped’ session midway through a module as a revision session or for working on an assignment that you’ve previously set.

Flipped learning works best when all the students have done the preparation work beforehand.

Get students to present their work

Ask your students to show some of their assignments and coursework to fellow students or people outside UCL.  

Knowing there’s an audience for their work besides the tutor can encourage students to become more actively engaged with a topic.

Don’t just think about what you want to teach. More importantly, think about how the students will learn.

Encourage students to work together

Get your students to work with their peers to solve a challenging problem or complete a task. This can help students to develop their communication and negotiation skills, as well as helping them gain a deeper understanding of a subject.

See the small-group teaching toolkit  for more ideas about how to get students to work together.

When running small-group discussions, choose which students are in which groups rather than allowing them to choose their groups. This will lead to a more diverse learning environment and will encourage all students to take an active part.

Get students to learn by doing research

The UCL Connected Curriculum aims to ensure that all UCL students learn through participating in research and active, critical enquiry.

Think about how you can get students to engage in the subject by doing their own research.

See the Connected Curriculum case studies on the Teaching and Learning Portal for examples of how UCL colleagues have given students the opportunity to carry out research as part of their modules.

Where to find help and support

For more ideas about how to include active learning in your sessions:

  • speak to your teaching lead
  • attend a UCL Arena event
  • see this Digital Education Team blog post: Meet the Active Learning Classroom (2012)

Get your students more actively engaged in their learning by getting involved with a departmental  UCL ChangeMakers project . ChangeMakers projects are intended to innovate or enhance the learning experience at UCL.

UCL resources

Connected Curriculum  - our framework for research-based education at UCL. Explore the six dimensions and get tips and advice on how to further enhance your programmes of study.

Distance Learning - An increasing number of UCL courses are fully online distance learning, or mostly-online 'blended learning' (where students might come for compressed teaching). With recent technological developments, reaching a truly global audience has become ever more possible, both for credit-bearing programmes and for the growing field of lifelong learning (CPD, short courses, etc.).

Object-based learning - Using objects in teaching not only helps students to understand their subject but also develops academic and transferable skills such as teamwork and communication, analytical research skills, and practical observation and drawing skills. It can also trigger innovative dissertation topics. This link takes you to the Object-based learning section of the UCL Culture website.

External resources

Active Learning in Higher Education - an international, refereed publication for all those who teach and support learning in higher education and those who undertake or use research into effective learning, teaching and assessment in universities and colleges.

Does active learning work? A review of the research, 2004, Michael Prince, Bucknell University.

Active Learning, The Higher Education Academy (HEA).

Twenty terrible reasons for lecturing, Oxford Brookes University.

This guide has been produced by the UCL Arena Centre for Research-based Education. You are welcome to use this guide if you are from another educational institution, but you must credit the UCL Arena Centre.

Further information

More teaching toolkits  - back to the toolkits menu

[email protected]: contact the UCL Arena Centre

UCL Education Strategy 2016–21  

Download a printable copy of this guide  

Case studies : browse related stories from UCL staff and students.

Sign up to the monthly UCL education e-newsletter  to get the latest teaching news, events & resources.  


Original research article: Active learning to improve student learning experiences in an online postgraduate course.


  • 1 School of Exercise and Nutrition Sciences, Deakin University, Geelong, VIC, Australia
  • 2 Health Educational Development Unit, Deakin University, Geelong, VIC, Australia
  • 3 Centre for Quality and Patient Safety Research in Institute of Health Transformation, School of Nursing and Midwifery, Deakin University, Geelong, VIC, Australia
  • 4 Deakin Learning Futures, Office of the Deputy Vice Chancellor (Education), Deakin University, Geelong, VIC, Australia

Postgraduate programs attract older students, who often work part-time or full-time and have child-care responsibilities. In the Information Age, online learning environments can help these students meet their learning objectives more efficiently and provide a unique opportunity to address individual learning preferences. The aim of this study was to assess the learning experiences of postgraduate students in an online learning environment delivering content in a guided, self-directed way with a focus on active learning opportunities. Two hundred and eighty-seven students participated in the study. A pragmatic descriptive design with purposive sampling was used to examine the impact of a newly developed active online learning environment on student commitment, performance and satisfaction compared with a passive, pre-recorded lecture. Contrary to our hypothesis that all metrics would improve with the subject redevelopment, student performance and commitment did not improve in the active online learning environment; however, student satisfaction increased significantly. These findings might be partly attributed to the increased cognitive load associated with online learning. This study demonstrates how, for postgraduate students choosing online learning, active learning experiences can give students a greater sense of satisfaction while acknowledging the heterogeneity of the cohort and its different learning preferences. However, with remote learning expanding rapidly and urgently worldwide, it also shows that online learning needs to be carefully scaffolded to ensure deep learning, and that the impact of the transition to online learning on performance and commitment should be considered, especially for inexperienced students.

Introduction

In 2010, 19% of students studying at Australian universities were enrolled in fully online (12%) or multi-modal (partially online, 7%) programs ( Australian Bureau of Statistics, 2012 ). By 2017, it was estimated that these numbers had grown to over 175,000 Australian students (14%) enrolled in a fully online course ( University Rankings Australia, 2017 ). With online learning expanding rapidly and urgently worldwide in response to COVID-19, this trend will only accelerate in the coming years. Australian online postgraduate programs typically attract older students, who often work full-time or part-time and have child-care responsibilities within the home ( Stoessel et al., 2015 ), replicating a local and international trend ( Jancey and Sharyn, 2013 ). The emergence of web-based learning technologies has provided a unique opportunity for flexible access to learning. In particular, a growing number of students living in rural and remote areas enroll in postgraduate university degrees ( Waschull, 2001 ; Greenland and Moore, 2014 ) thanks to ever-increasing internet access ( Casey, 2008 ).

Web-based programs require more learning independence than traditional, highly structured on-campus programs. Self-directed learning is a key factor determining student success, and appropriate learning design is critical to engage and motivate students. In the globally expanding context of web-based learning delivery ( Casey, 2008 ), compounded by the current pandemic, it has become increasingly important for tertiary educators to offer online delivery methods and learning experiences that address the socially and cognitively active online context of learning ( Akcaoglu and Lee, 2016 ) and acknowledge students' different learning preferences. This study, undertaken before the pandemic, evaluated the redevelopment of an online postgraduate subject offered by Deakin University, Nutritional Biochemistry and Physiology . As with many online courses, the cohort of students enrolled in this subject is characterized by a wide range of ages, academic backgrounds and professional commitments. Creating an online environment that meets diverse students' learning needs is complex and challenging ( Hill et al., 2009 ; Chita-Tegmark et al., 2011 ; Gillett-Swan, 2017 ). It was suggested as early as 2001, when web-based learning was in its infancy, that such an environment may require a unique approach to content delivery ( Fetherston, 2001 ). Innovative online learning environments can help students to meet their learning objectives more effectively by offering content in multiple formats (written, auditory, interactive) along with providing learning opportunities that are less passive ( Laurillard, 1998 ; Chita-Tegmark et al., 2011 ; Moorefield-Lang et al., 2016 ). While the effectiveness of online learning for student satisfaction, commitment and performance is an increasingly important research area in undergraduate education, there is a distinct lack of systematic evidence assessing postgraduate online learning environments.

The content of the Nutritional Biochemistry and Physiology subject is typically perceived by students as dry and challenging as they re-enter higher education, because it addresses foundation knowledge in biochemistry and physiology needed for applied nutrition-related topics later in the program. Students with little or no science background can also find studying biochemistry and physiology for the first time somewhat confronting. This subject was historically delivered as a series of audio-visual recordings of a 45 min narrated slide show, which received poor evaluations from students, partly attributed to the poor design of the online content delivery. In 2018, we implemented a series of changes to the learning design that encouraged engagement, social presence, communication, active learning, interactivity and formative feedback through a series of synchronous and asynchronous activities ( Laurillard, 2013 ; Sun and Chen, 2015 ; Stone, 2016 ). These changes were aimed at delivering the subject content in a planned sequence, with multiple opportunities to engage with learning materials in varying formats, but without modifying the amount or type of content that was delivered. Active learning, whether online or in a classroom, requires students to cognitively engage with learning materials and activities specifically designed by the teacher ( Bonwell et al., 1991 ). During active learning, students are required to think, analyze, synthesize, discuss with peers and make decisions, resulting in increased student engagement ( Freeman et al., 2014 ). Our hypothesis was that student commitment, performance and evaluations for the subject would be higher than previously, and that the subject would also rate higher against experiences in concurrently enrolled subjects using the recorded lecture model.

The overarching aim of this study was to assess how a new learning design would influence post-graduate student commitment, performance and satisfaction in the online learning environment. A secondary aim was to assess how each learning activity was perceived by the students, and how it positively or negatively contributed to their learning experience. We hypothesized that delivering content in multiple formats would generally improve the students’ experience of the subject, increase their commitment and performance, and that students’ age and educational background may be important cofactors in this response.

Materials and Methods

A pragmatic, descriptive design using a purposive sample was used to address the research aims. A pragmatic design is appropriate in this context as it allows a matter-of-fact approach to the research question ( Cohen et al., 2013 ).

This study was approved by the Deakin University Human Ethics Advisory Group (HEAG-H 2018-057). Of the 288 students enrolled in the Deakin University online postgraduate subject Nutritional Biochemistry and Physiology in 2017 and 2018, 287 (134 from 2017, and 153 from 2018) provided informed consent to collect their anonymous data from the university web-based delivery system (99.5% response rate). Out of 153 post-graduate students enrolled in 2018, 75 also agreed to complete the survey (49% response rate).

Subject Redevelopment

For the 2018 academic offering of the subject, 21 core topics were redeveloped and delivered over 11 weeks. Instead of students passively listening to an audio-visual recording of a 45 min narrated slide show, as in the 2017 curriculum, the new mode of delivery was based on an HTML webpage hosted in the Brightspace by D2L learning management system (LMS). Based on best-practice recommendations for online learning design ( Puzziferro and Shelton, 2008 ; Laurillard, 2013 ; Sun and Chen, 2015 ; Stone, 2016 ), the learning experiences for each week were scaffolded with deliberate inclusion of educational material and specific communication strategies to actively engage students in the learning process. For each week, the webpage specifically included:

• Short video clips (<3 min) aimed at welcoming the students in a personal way, establishing a safe learning space, introducing each core topic and guiding the students through self-directed learning activities by explaining their purpose and putting them into the context of the subject expectations. These clips helped to maximize social presence ( Hostetter and Busch, 2012 ; Richardson and Swan, 2019 ) and frame expectations ( Tharayil et al., 2018 ), both of which are essential to ensure student satisfaction and engagement ( Trowler, 2010 ).

• One to three short narrations of key concepts (<10 min) along with supporting written content focusing on core ideas to ensure constructive alignment ( Biggs, 2015 ).

• One to two videos from external providers and/or links to online interactive activities to ensure student active engagement ( Puzziferro and Shelton, 2008 ; Laurillard, 2013 ; Sun and Chen, 2015 ; Stone, 2016 ).

• One to two links to contemporary readings (e.g., pieces published in The Conversation ) 1 ( Kinash, 2019 ).

• Non-graded, self-assessment multiple-choice questionnaires (MCQs) that were available each time new concepts had been introduced, as well as at the end of each topic, to encourage learner feedback on knowledge acquired through the previous activities ( Hattie and Timperley, 2007 ; Shute, 2008 ).

• Finally, students had the choice to attend five synchronous interactive 90 min seminars with the teacher online (via the BlackBoard Collaborate virtual classroom platform) or on-campus (for those living locally). For those students who could not engage synchronously, recordings of these interactive sessions were also offered. These seminars were interactive and an online polling service was used to provide students with the opportunity to participate and interact via their mobile phones, with results displayed to stimulate class discussion. The reasons for including polling were threefold: to allow students an opportunity to anonymously benchmark their progress against other students by answering practice MCQs ( Gillett-Swan, 2017 ); to receive further feedback on their acquisition of key concepts and enable discussion of their responses with the teacher and their peers to deepen learning through peer teaching ( Stigmar, 2016 ); and to access the teacher’s functioning knowledge.

Quantitative Data

Student Commitment

The Brightspace by D2L LMS has the capacity to provide data on student activity within the site. Student activity data were extracted for 2017 and 2018. All data were de-identified to protect privacy. For each student, the following variables were determined: (1) time (in minutes) spent in the resources section of the subject website, (2) total number of mouse clicks on topics within the resources section, (3) discussion posts read, expressed as a percentage of total number of discussion posts, and (4) discussion posts authored, also expressed as a percentage. These values were used as a proxy for student commitment prior to and after the redevelopment of the subject. An independent samples t -test was used to determine any significant differences between student commitment in 2017 and 2018.
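The cohort comparison described above can be sketched in Python with SciPy. The study used SPSS, so this is only an illustrative re-implementation of the same test, and every number below is invented rather than taken from the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-student commitment metric: minutes spent in the
# resources section, one sample per cohort (sizes match the study's N).
minutes_2017 = rng.normal(loc=300, scale=80, size=134)
minutes_2018 = rng.normal(loc=310, scale=85, size=153)

# Independent-samples t-test comparing the two cohorts.
t_stat, p_value = stats.ttest_ind(minutes_2017, minutes_2018)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

In the study's design the same test would be run separately for each of the four commitment variables (time on site, clicks, posts read, posts authored).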

Student Performance

Student performance for each of the assessment tasks (which remained the same from 2017 to 2018) was de-identified, weighted and averaged (final subject grade), and compared from 2017 to 2018 using an independent samples t -test. The relationship between student performance and each of the measures of commitment described above was analyzed using a Pearson’s correlation.
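A minimal sketch of that pipeline (weighted final grade, then correlation with a commitment measure): the task weights and scores below are invented for illustration, since the actual assessment tasks and weights are not given in this excerpt.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_students = 134

# Hypothetical scores (out of 100) on three assessment tasks,
# with made-up weights that sum to 1.
task_scores = rng.uniform(40, 100, size=(n_students, 3))
weights = np.array([0.2, 0.3, 0.5])

# Weighted average gives each student's final subject grade.
final_grades = task_scores @ weights

# Hypothetical commitment measure: percentage of discussion posts read.
posts_read_pct = rng.uniform(0, 100, size=n_students)

# Pearson's correlation between commitment and performance.
r, p_value = stats.pearsonr(posts_read_pct, final_grades)
print(f"r({n_students - 2}) = {r:.3f}, p = {p_value:.3f}")
```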

Student Satisfaction

Data were drawn from eVALUate, the Deakin University student evaluation system. Designed to measure student engagement and satisfaction with learning and the subject, eVALUate uses a 5-point scale anchored from strongly agree to strongly disagree, and provides scores against 10 statements plus an overall subject score (see Table 1 ). Student responses were used to determine and compare the level of student satisfaction between 2017 and 2018. Because these data are collected anonymously and provided in aggregated form, no further analysis could be run.


Table 1. Student satisfaction percentage scores (%) against 11 eVALUate items for the students in 2017 ( N = 134) and 2018 ( N = 153).

Learning Experiences

Due to the study design, only the 2018 cohort was eligible to participate in an online survey that aimed to compare student satisfaction with the current online learning methods against the previous format of an audio-visual recording of a 45 min narrated slide show. Because of the structure of the course, each student currently was, or had been in the past 12 months, exposed to the 45 min narrated slide show format in another subject. Demographic data collected included gender, age, first language, educational background, current degree, estimated study time, and employment time. Students were asked to grade their current online learning experience compared to their previous or concurrent experiences of an audio-visual recording of a 45 min narrated slide show. Finally, students were asked to assign a satisfaction score (from 1 to 10, 10 being the highest) to each online learning activity in the revised program, comprising scaffolded, sequenced and guided learning activities including short narrations of key concepts, readings, videos, interactive pages, and self-assessment questions as described earlier.

Qualitative Data

For the qualitative part of this study, an extended-response questionnaire was used to explore student perceptions of elements of the new subject learning design. Students were asked to reflect on their personal learning experience in this subject, including their perceptions of the delivery mode, the requested study-time commitment, the learning activities, and the format of the seminars. Questions were adapted and derived from previously published research ( Currey et al., 2015 ). Examples of the questions asked are provided in Table 2 . The survey was administered via the Qualtrics online platform during the final 3 weeks of the teaching trimester. The full survey can be found in the Supplementary Material .


Table 2. Example questions used in the extended response questionnaire.

Data Analysis

Demographic data were analyzed using descriptive statistics. Quantitative data were analyzed using descriptive statistics and parametric tests including paired t-tests, one-way ANOVA and Pearson's correlations. Data are presented as mean ± SD. SPSS Statistics 24 was used to conduct all statistical tests, and the value for significance was set at p < 0.05.
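The parametric tests named above all have direct SciPy equivalents. As one hypothetical example (not an analysis from the paper), a one-way ANOVA comparing invented satisfaction scores across three made-up age groups:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical satisfaction scores (1-10 scale) for three age cohorts.
under_25 = rng.normal(loc=7.0, scale=1.5, size=20)
age_25_34 = rng.normal(loc=7.5, scale=1.5, size=35)
over_34 = rng.normal(loc=7.2, scale=1.5, size=20)

# One-way ANOVA across the three groups.
f_stat, p_value = stats.f_oneway(under_25, age_25_34, over_34)

# Report each group as mean ± SD, matching the paper's convention.
for name, group in [("<25", under_25), ("25-34", age_25_34), (">34", over_34)]:
    print(f"{name}: {group.mean():.2f} ± {group.std(ddof=1):.2f}")
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```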

Qualitative data were analyzed using thematic analysis techniques in order to gain a greater understanding of student perceptions of the benefits of the new mode of delivery of this subject. Thematic analysis, a method for identifying, analyzing and reporting themes or patterns within data, was conducted according to the principles stated by Braun and Clarke ( Braun and Clarke, 2006 ). Briefly, these principles rely on a six-step analysis process: 1. familiarizing yourself with the data; 2. generating initial codes; 3. searching for themes; 4. reviewing themes; 5. defining and naming themes; 6. producing the report. Constant comparative techniques were applied to analyze for overlap, redundancy, the emergence of any new themes, and relationships between themes.

Student Performance and Commitment in 2017 and 2018

Overall student performance (final subject grade) did not differ between 2017 (uploaded classroom recordings; subject grade = 68.14 ± 13.01) and 2018 (self-directed, guided learning; subject grade = 66.47 ± 12.79) ( p > 0.05). Student commitment was assessed as a combination of time spent in the resources section, total number of clicks in the resources section, percentage of discussion posts read and percentage of discussion posts authored. None of these parameters differed significantly between 2017 and 2018 (all p > 0.05), suggesting that students' commitment was not influenced by the new format of the subject ( Table 3 ). In both 2017 and 2018, the best predictor of overall student performance was involvement in the discussion board. The number of discussion posts read was weakly but significantly correlated with overall performance, r (132) = 0.268, p < 0.01 in 2017 and r (151) = 0.197, p < 0.05 in 2018. Similarly, the number of discussion posts authored was weakly but significantly correlated with overall performance, r (132) = 0.261, p < 0.01 in 2017 and r (151) = 0.175, p < 0.05 in 2018.


Table 3. Student commitment data for 2017 ( N = 134) and 2018 ( N = 153).

Student Satisfaction in 2017 and 2018

Student responses to the eVALUate items indicate that satisfaction was higher in 2018 than in 2017. The largest improvements in student satisfaction from 2017 to 2018 were observed for item 11 "overall subject satisfaction" (53 to 83%), item 6 "workload" (55 to 86%) and item 7 "teaching quality" (42 to 83%). Item 2 "learning experiences" (61 to 86%) and item 3 "learning resources" (76 to 86%) were identified as the most likely to reflect changes in the content delivery mode. Data for each of the eVALUate items are shown in Table 1 .

Demographics of the 2018 Student Cohort

Of the 153 post-graduate students enrolled in 2018, 75 agreed to complete the online survey (response rate: 49%). Of these, 84% were female. The most represented age cohort was 25–34 years, and the students' first language was predominantly English. About one third of the students held an undergraduate degree in biomedical, health, or exercise sciences, and about a quarter held an undergraduate degree in nutrition or food sciences. Before enrolling in their current postgraduate degree, 70% of students had last studied at university between 2011 and 2017. In contrast, 12% had not studied at university since 2000 or earlier, or at all. Twenty-three percent of the students did no paid work or were retired, whilst 27% worked full-time (35 h per week or more). These data are shown in Supplementary Table 1.

Subject Enrollment Status and Self-Reported Study Commitment

Students were mostly enrolled in Master's degrees (60%) (see Supplementary Table 2). Most students reported committing up to 9 (37%) or between 10 and 19 (31%) hours of work per week to this specific subject. Students predominantly attended the online (47%) or on-campus (27%) seminars.

Subject Delivery Mode

For all demographic groups, the new delivery mode was preferred over an audio-visual recording of a 45-min narrated slide show (p = 0.001). Gender, first language, educational background, current degree, and professional commitment did not influence how students perceived any component of the subject. The self-reported time spent on the subject website each week was negatively correlated with the overall score given to the subject (p = 0.048), suggesting that students who allocate less study time to this subject may find more benefit in the new format. This theme of flexible access to learning material and time efficiency was consistent across the qualitative responses. For example, students reported "I really appreciate I can watch it when I am ready and have the time to watch" and "I have found this format much more effective. It allows me to focus in smaller concentrated bursts on the material, as time allows me." This was specifically emphasized by students working full time: "Because I work full time I find the flexible delivery much more user friendly as it allows me to study from home around my current commitments." While the concept of flexible learning is intrinsic to online programs, offering content in small packages of different formats seemed even more beneficial to those with less study time to allocate to this subject.

Learning Activities

In terms of meeting the subject learning outcomes, readings were perceived as the least effective activity (mean satisfaction score = 7.2/10), which still indicates a moderately high satisfaction level. The short narrations of key concepts scored 7.4/10, closely followed by the interactive links to external videos and websites at 7.5/10. Self-assessment questionnaires were rated most highly, with a mean of 7.9/10.

Despite ranking lowest among the activities, the readings were still perceived as effective for learning by most students (mean satisfaction score = 7.2/10), as reflected in 43 positive responses compared to only 10 negative responses. Students found that the readings helped consolidate their learning and sometimes provided a more detailed explanation of a topic. Students appreciated the ability to read at their own pace and to reinforce important messages from the key concepts. For example, students reported that the readings were "particularly useful, they help me to consolidate my understanding and to go at my own pace" and that they "effectively reinforce the material as well as giving access to revision or more elaborate explanations." Students who found the readings ineffective for their learning typically identified themselves as visual learners, preferring to "watch the videos or interact with others."

Short Narrations Delivering Concepts

The short narrations delivering key concepts (mean satisfaction score = 7.4/10) were scored more highly by students of increasing age (p = 0.029). This effect was also reflected in a negative correlation between the score given to this activity and the last time the student had studied at university (p = 0.049). Students found the short narrations effective in maintaining their concentration and focus compared to standard 45-min lectures, as reflected by 49 positive responses indicating that the short narrations were effective for learning compared to only 13 negative responses. Students regularly commented that these narrations were "concise, relevant, well-structured" and "extremely effective because they are quick and to the point, making it easy to retain the information and easy to find the time to watch them."

Interactive Activities and External Video Links

Interactive activities and external video links were the second-most preferred activity for learning (mean satisfaction score = 7.5/10). Forty-eight responses indicated that the interactive activities or videos were effective for learning, compared to only seven suggesting they were ineffective. Many students perceived that the interactive activities and videos supplemented their learning well by giving them a different perspective on the subject content. One student commented that the interactive activities and videos "put what we are learning into perspective. I am a very visual learner and this helped me grasp most concepts a lot more and have a lot more of an understanding." Students acknowledged that learning content from a different viewpoint (than that presented by the lecturer) can help them better incorporate key ideas into their professional practice. In contrast, some students felt frustrated that they were directed to external videos and activities to achieve learning outcomes rather than being taught by academic staff. However, students consistently reflected that the interactive activities and videos made studying "easy and fun," supporting the idea that students value enjoyment in learning and may better retain information through active, rather than passive, learning.

Self-Assessment Questions

The self-assessment questions, which aimed to provide formative feedback, were judged the most useful activity overall (mean satisfaction score = 7.9/10). The more recently students had attended university for previous study, the less highly they rated the opportunity for formative feedback in this subject (p = 0.042). Fifty-four student responses about the self-assessment questions were coded as effective for learning, compared to only seven coded as ineffective. Students consistently explained that these activities provided feedback on whether they had understood a topic concept, with one student reporting "I find these to be extremely helpful. They're a great way to self-assess and to ensure you not only know the content but understand how it applies to different concepts." Students were enthusiastic about receiving feedback that could indicate their progress and highlight where they had misunderstood concepts or required further revision. Several students also suggested that the self-assessment questions "helped [them] gain confidence" before undertaking graded assessments.

Three quarters of students (74%) responding to the survey reported attending synchronous seminars (face to face or remotely), while the remaining quarter elected to listen to the recordings asynchronously (21%) or did not engage in the seminars at all (5%). Students reported that the seminars were helpful for their learning, with 48 positive responses and 18 negative responses coded. Students felt that the ability to engage with teaching staff and other students was a positive learning experience. For example, students reported that the seminars "helped to feel included with the cohort and hear others discuss aspects of the content." Students valued the personal interaction and real-time feedback they could get from academic staff. Negative responses commonly described the seminars as "quite long" or stated that "more detail was needed" in the information delivered. It may therefore be important to outline the aims of each learning activity to students to ensure that their expectations are met. Live polling was a key mode of learning implemented in the seminars, allowing students to anonymously respond to online questions using their mobile phones, with results displayed to stimulate class discussion. Thirty-two responses were coded as perceiving the live polling as effective and engaging, whereas 18 responses suggested that it was not an effective or engaging mode of learning. Students who disliked the live polling were commonly those who did not attend the seminar but instead listened to a recording of it.
For example, students reported, "I find that listening to the seminars after they've happened difficult as they're often interactive and much of the time is spent waiting for other students to respond/engage" or "when I watch the recordings, there have been times where I have not understood the answer yet I can't do anything about it." In contrast, students who attended the seminars were more engaged by the live polling and reported that "the live questionnaires are more engaging than a standard seminar as it is fun and we are more focused" and "I liked these as it provided a chance to test knowledge without pressure and then talk through the answers. This really helped my learning." In sum, those who engaged during the real-time, synchronous opportunities reported gaining more from the seminars than those who passively watched the recordings afterward.

In this study, we evaluated the comprehensive redevelopment of the online learning environment of a post-graduate subject within Deakin University's post-graduate Human Nutrition course. Students who enroll in this course complete an online, non-vocational pathway at the Graduate Certificate, Graduate Diploma, or Master's level. The main finding of this study was that an interactive and guided, self-directed delivery of the learning content was perceived as more effective in meeting the learning outcomes than uploaded, pre-recorded lectures. While this overall positive response was supported by student satisfaction metrics, students did not perform better in terms of final grade, nor did the results suggest that they committed more time to their studies.

As part of broader university modeling, course students were surveyed in 2018 about their study intentions. Students reported pursuing post-graduate studies out of personal interest (27%), but also because they aspired to a career change (24%) or believed the course would advance their career (12%) (Deakin University Marketing Division, 2018). Importantly, an overwhelming majority of students (86%) chose to enroll in this course because of the online delivery mode (Deakin University Marketing Division, 2018). Yet, in contrast to our hypothesis, students did not perform better, nor did they commit more time to their studies in the new online learning environment. This finding should be discussed in light of the recent results reported by Deslauriers et al. (2019). When comparing the perception of learning in response to the same content being taught using passive or active instruction, these authors found that students' perception of learning was poorer in the active learning setting. Deslauriers et al. (2019) attributed this finding to the increased cognitive load that is inherent to active learning and suggested that it may impair student motivation, commitment, and willingness to further engage in their learning. Our student cohort, especially the older students, had primarily experienced passive learning during their earlier studies. Due to our study design, it was not possible to directly compare the perception of passive and active delivery of the same content. It can, however, be postulated that students who are new to active learning might interpret this increased cognitive effort as a sign that their learning is not effective, which may discourage them from wanting to learn more and from investing more time in their studies. Taken together, our results suggest that the active format conveyed the content in a way that was as effective as, but not more effective than, in the past.
On the other hand, we found that the new online learning environment was significantly more enjoyable and motivating, and was perceived as more time efficient. This highlights the importance of preparing students for active learning and of providing a clear and smooth flow of explanations that allows students to easily navigate the learning design and content (Deslauriers et al., 2019). Under these conditions, active and interactive learning experiences can provide postgraduate students choosing online learning with a greater sense of satisfaction.

In terms of individual learning activities, we observed a clear gradation from active/interactive activities (self-assessment questions, interactive activities, and external video links), which were preferred by the students, to passive activities (short narrations delivering key concepts and readings), which were found less effective. This confirms that in a context where students are mostly isolated and do not get many opportunities to apply their knowledge, activities involving some level of active engagement should be promoted (Prince, 2004), despite the increased cognitive load associated with active learning (Deslauriers et al., 2019). Using multiple learning activities presented in alternative formats improves the online learning experience (Chita-Tegmark et al., 2011; Moorefield-Lang et al., 2016; Fidaldo and Thormann, 2017) and gives students the opportunity to focus on the content resources most suitable for their preferred style of learning. Structuring content delivery modes to suit all learning preferences may enhance student engagement and retention of knowledge (Hawk and Shah, 2007). As nicely summarized by one of our participants: "I like to listen plus read. The more senses [I can use] the better." Figure 1 summarizes the perceived advantages and disadvantages of the different learning activities in the context of our study.


Figure 1. Summary of findings. Advantages and disadvantages of learning activities in an online higher education subject.

Social isolation, absence of peer interaction, lack of social cues, and lack of benchmarks are issues common to online programs (Croft et al., 2010; Gillett-Swan, 2017). Some of the improvements implemented in this subject may have started to address these issues. The live polling tool used in the seminars gave students an opportunity to benchmark themselves against other students and to participate in the discussion regardless of their individual outcome. As such, this teaching strategy may not only have reinforced students' self-confidence, but also made the seminars more engaging and triggered interaction with other students. Even in an anonymous format, students found that active participation in live polling gave them a "sense of belonging and contributing," which they may otherwise not have experienced in a passive learning environment. Creating a sense of community where peers provide constructive feedback in a supported social context is recognized as one of the most significant challenges associated with implementing successful online courses (Desai et al., 2008). Our findings, as well as those of other studies, confirm that students value social exchanges more than any other aspect of their online courses (Boling et al., 2012), and that peer learning should be prioritized when possible (Broadbent and Poon, 2015). This corroborates the idea that collaborating and learning from peers is integral to students' learning and performance. The anonymous nature of the poll questions followed by an open discussion may also have helped to overcome some of the personal barriers that online students regularly report experiencing in collaborative learning tasks, including anxiety, insecurity, and feeling out of their comfort zone (Roberts and McInnerney, 2007; Hill et al., 2009; Gillett-Swan, 2017).
Indeed, students appreciated this ability to engage with the class in an anonymous format, whereby any incorrect answers they provided were not met with embarrassment. Not surprisingly, the students who did not find any benefit in the live polling were those who did not attend the online or on-campus seminars, but rather listened to the recordings later on. This constitutes a known limitation of these types of activities (Stoessel et al., 2015; Gillett-Swan, 2017): despite several synchronous seminar times being offered during the week, personal factors including caring for children or professional commitments can consistently impact online postgraduate students' ability to access and participate in live sessions (Stoessel et al., 2015).

Age, which also correlated positively with the time since students had last attended university, was the main variable explaining students' appraisal of the different activities; regarding the second part of our hypothesis, individual academic background did not account for any of the variability. Students who had only experienced face-to-face lectures typically found more value in the 10-min narrations of key concepts and often asked for more of them, or for a reinforced presence of the lecturer, an outcome that can be achieved in several ways (Lehman and Conceição, 2010). Previous research, however, suggests that students' attention span in lectures lasts approximately 15 min (Prince, 2004), after which focus and retention of information decline. Through various references to being time poor, some students highlighted that the short narrations enabled them to find the time to address the learning outcomes. Still, others found the short narrations less useful due to a lack of depth. These observations highlight the importance of reiterating to students that the key learning outcomes cannot be fully attained with a 10-min narration, and that engaging with the other activities allows them to supplement their learning and pursue greater depth where required. Consistent with online learning design principles, students were expected to engage in multiple ways with various materials (Laurillard, 2013; Stone, 2016). The older age cohort also relied more strongly on the self-assessment questions, especially as a way to gain confidence before undertaking graded assessments. This suggests that a lack of recent experience at university may generate a lack of confidence about current expectations and standards, but it also highlights the value placed on formative feedback. Inexperienced students are also poor at assessing their own learning, especially in the non-traditional context of online learning (Deslauriers et al., 2019).
This concern can be partly addressed by clear explanations of what is expected of students in the subject and what to expect from the teacher, along with early and frequent opportunities for students to test their knowledge before undertaking graded assessment tasks. Both forms of formative feedback, individualized and anonymous, are valuable for maximizing its benefit on learning. Increasing students' self-esteem is particularly important for students enrolled in online subjects, who have limited interaction with other students for support and feedback (Gillett-Swan, 2017), and our results suggest that the form of feedback we used was appropriate and constructively aligned (Hattie and Timperley, 2007; Biggs, 2015).

Limitations of this study include that, in contrast to the work by Deslauriers et al. (2019) for example, it was not possible to qualitatively compare the perception of passive and active delivery of the same content. Instead, we used another subject concurrently undertaken by the students and presented as a 45-min narrated slide show as a surrogate for an experience of passive delivery. A strength, in contrast, was that we were able to directly compare metrics of student commitment, satisfaction, and performance because neither the learning content nor the assessment tasks changed from the old to the new version of the subject. Because student satisfaction data are collected anonymously and provided in aggregated form, we could not analyze the association between student satisfaction and student commitment and performance across the 2017 and 2018 versions of the subject. The same limitation applies to our qualitative data, where no linkage was possible between satisfaction indices and appraisal of the different activities or of the subject in general. Paired analyses would allow researchers to begin teasing out the mechanisms responsible for a general increase in satisfaction that was not paralleled by a general increase in commitment or performance and, when possible, are recommended for future research.

Implications for future research stem from our main observation that post-graduate student satisfaction, enjoyment, and sense of inclusion can be improved by redeveloping online subjects into an active form that demands cognitive engagement, but that this may not be enough to improve their performance and commitment. Further research opportunities abound for developing a stronger evidence base for best practice in online learning design and delivery that combines cognitive, emotional, and sensory engagement. In the current context of a global pandemic, online learning is becoming the new normal, and many universities may only rarely offer in-person teaching in coming years. Educators have been placed under unprecedented pressure to quickly design and develop new online learning experiences. In contrast to the conditions of emergency remote learning implemented during COVID-19, our study, conducted before the pandemic, was carefully planned, evidence-based, and appropriately resourced. Online education design based on best practice was, however, not enough to improve students' results or increase the time invested in their studies, a result that might be attributed to the increased cognitive load associated with online learning. Further research is required to understand the optimal forms and processes by which technical and learning design teams can support content-expert teachers in designing online learning experiences. While the advantages of online learning are multiple, valuable online learning experiences require more than the provision of available material and efficient access. Creating a sense of social connectedness to foster engagement and ensure deep learning can be achieved through active and collaborative learning.
However, even when aligning with best practice, course designers should carefully consider the cognitive load associated with active learning, and be clear and explicit about their learning design, as this load may negatively impact student results or commitment levels. This is especially true in the post-graduate context, where students are traditionally older and more likely to have limited experience of online learning.

In conclusion, redeveloping an online subject from an uploaded classroom-based recorded lecture to an interactive and guided, self-directed mode of delivery presents multiple advantages in terms of student satisfaction, motivation, and enjoyment, without impacting student results or commitment levels. The absence of improvement in results or commitment might be attributed to the increased cognitive load associated with active learning online. A more active and interactive mode of delivery provides students with a greater sense of inclusion while accommodating the diversity of the cohort and its different learning preferences.

Data Availability Statement

All datasets generated for this study are included in the article/Supplementary Material; further inquiries can be directed to the corresponding author.

Ethics Statement

The studies involving human participants were reviewed and approved by the Deakin University Human Ethics Advisory Group (HEAG-H 2018-057). Written informed consent for participation was not required for this study in accordance with the national legislation and the institutional requirements.

Author Contributions

SL, IS, and JC designed the study. SL and OK collected the data. OK, AH, and SL processed and analyzed the data. SL, OK, and JC wrote the manuscript. All authors edited the manuscript and proof-read the final version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2020.598560/full#supplementary-material

Supplementary Table 1 | Demographic Data of the 2018 student cohort ( N = 75).

Supplementary Table 2 | Subject Enrollment and Study Commitment of the 2018 student cohort ( N = 75).


Akcaoglu, M., and Lee, E. (2016). Increasing social presence in online learning through small group discussions. Int. Rev. Res. Open Distrib Learn. 17, 1–17. doi: 10.19173/irrodl.v17i3.2293


Australian Bureau of Statistics (2012). Higher Education. Available online at: http://www.abs.gov.au/ausstats/[email protected]/Lookup/by%20Subject/1301.0~2012~Main%20Features~Higher%20education~107 (accessed May 24, 2012).


Biggs, J. T. C. (2015). “Constructive alignment: an outcomes-based approach to teaching anatomy,” in Teaching Anatomy , eds W. Pawlina and L. K. Chan (Cham: Springer). doi: 10.1007/978-3-319-08930-0_4

Boling, E. C., Hough, M., Krinsky, H., Saleem, H., and Stevens, M. (2012). Cutting the distance in distance education: perspectives on what promotes positive, online learning experiences. Int. Higher Educ. 15, 118–126. doi: 10.1016/j.iheduc.2011.11.006

Bonwell, C. C., Eison, J. A., and Aehe Staff. (1991). Active Learning: Creating Excitement in the Classroom. New York, NY: Wiley.

Braun, V., and Clarke, V. (2006). Using thematic analysis in psychology. Qual. Res. Psychol. 3, 77–101. doi: 10.1191/1478088706qp063oa


Broadbent, J., and Poon, W. (2015). Self-regulated learning strategies & academic achievement in online higher education learning environments: a systematic review. Int. Higher Educ. 27, 1–13. doi: 10.1016/j.iheduc.2015.04.007

Casey, D. M. (2008). The historical development of distance education through technology. TechTrends 52, 45. doi: 10.1007/s11528-008-0135-z

Chita-Tegmark, M., Gravel, J. W., Serpa, B., Domings, Y., and Rose, D. H. (2011). Using the universal design for learning framework to support culturally diverse learners. J. Educ. 192, 17–22. doi: 10.1177/002205741219200104

Cohen, L., Manion, L., and Morrison, K. (2013). Research Methods in Education. Milton Park: Taylor & Francis. doi: 10.4324/9780203720967

Croft, N., Dalton, A., and Grant, M. (2010). Overcoming isolation in distance learning: building a learning community through time and space. J. Educ. Built Environ. 5, 27–64. doi: 10.11120/jebe.2010.05010027

Currey, J., Eustace, P., Oldland, E., Glanville, D., and Story, I. (2015). Developing professional attributes in critical care nurses using Team-Based Learning. Nurse Educ. Pract. 15, 232–238. doi: 10.1016/j.nepr.2015.01.011

Desai, M. S., Hart, J., and Richards, T. C. (2008). E-learning: paradigm shift in education. Education 129, 327–334.

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., and Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. U.S.A. 116, 19251–19257. doi: 10.1073/pnas.1821936116

Fetherston, A. (2001). Pedagogical Challenges for the World Wide Web , Vol. 9, Greenville, NC: ECU Publications.

Fidaldo, P., and Thormann, J. (2017). Reaching students in online courses using alternative formats. Int. Rev. Res. Open Distrib. Learn. 18, 139–161. doi: 10.19173/irrodl.v18i2.2601

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. U.S.A. 111, 8410–8415. doi: 10.1073/pnas.1319030111

Gillett-Swan, J. (2017). The challenges of online learning: supporting and engaging the isolated learner. J. Learn. Des. 10, 20–30. doi: 10.5204/jld.v9i3.293

Greenland, S., and Moore, C. (2014). Patterns of online student enrolment and attrition in Australian open access online education: a preliminary case study. Open Praxis 6, 45–54. doi: 10.5944/openpraxis.6.1.95

Hattie, J., and Timperley, H. (2007). The power of feedback. Rev. Educ. Res. 77, 81–112. doi: 10.3102/003465430298487

Hawk, T. F., and Shah, A. J. (2007). Using learning style instruments to enhance student learning. Decision Sci. J. Innov. Educ. 5, 1–19. doi: 10.1111/j.1540-4609.2007.00125.x

Hill, J. R., Song, L., and West, R. E. (2009). Social learning theory and web-based learning environments: a review of research and discussion of implications. Am. J. Distance Educ. 23, 88–103. doi: 10.1080/08923640902857713

Hostetter, C., and Busch, M. (2012). Measuring up online: the relationship between social presence and student learning satisfaction. J. Scholarsh. Teach. Learn. 6, 1–12.

Jancey, J., and Burns, S. (2013). Institutional factors and the postgraduate student experience. Qual. Assur. Educ. 21, 311–322. doi: 10.1108/qae-nov-2011-0069

Kinash, S. (2019). How to Design & Deliver Quality Online Education. Available online at: https://educationtechnologysolutions.com/2019/06/how-to-design-deliver-quality-online-education/ (accessed July 7, 2020).

Laurillard, D. (1998). Multimedia and the learner’s experience of narrative. Comput. Educ. 31, 229–242. doi: 10.1016/S0360-1315(98)00041-4

Laurillard, D. (2013). Teaching as a Design Science: Building Pedagogical Patterns for Learning and Technology. Milton Park: Taylor & Francis. doi: 10.4324/9780203125083

Lehman, R. M., and Conceição, S. C. (2010). Creating a Sense of Presence in Online Teaching: How to "Be There" for Distance Learners, Vol. 18. Hoboken, NJ: John Wiley & Sons.

Deakin University Marketing Division (2018). Master of Human Nutrition Student Journey. Melbourne, VIC: Deakin University.

Moorefield-Lang, H., Copeland, C. A., and Haynes, A. (2016). Accessing abilities: creating innovative accessible online learning environments and putting quality into practice. Educ. Inf , 32, 27–33. doi: 10.3233/Efi-150966

Prince, M. (2004). Does active learning work? A review of the research. J. Eng. Educ. 93, 223–231. doi: 10.1002/j.2168-9830.2004.tb00809.x

Puzziferro, M. (2008). Online technologies self-efficacy and self-regulated learning as predictors of final grade and satisfaction in college-level online courses. Am. J. Distance Educ. 22, 72–89. doi: 10.1080/08923640802039024

Puzziferro, M., and Shelton, K. (2008). A model for developing high-quality online courses: integrating a systems approach with learning theory. J. Asynchronous Learn. Netw. 12, 119–136. doi: 10.24059/olj.v12i3.58

Richardson, J. C., and Swan, K. (2019). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. J. Asynchronous Learn. Netw. 7, 68–88. doi: 10.24059/olj.v7i1.1864

Roberts, T. S., and McInnerney, J. M. (2007). Seven problems of online group learning (and their solutions). Educ. Technol. Soc. 10, 257–268.

Shute, V. J. (2008). Focus on formative feedback. Rev. Educ. Res. 78, 153–189. doi: 10.3102/0034654307313795

Stigmar, M. (2016). Peer-to-peer teaching in higher education: a critical literature review. Mentor. Tutoring Partnersh. Learn. 24, 124–136. doi: 10.1080/13611267.2016.1178963

Stoessel, K., Ihme, T. A., Barbarino, M.-L., Fisseler, B., and Stürmer, S. (2015). Sociodemographic diversity and distance education: who drops out from academic programs and why? Res. Higher Educ. 56, 228–246. doi: 10.1007/s11162-014-9343-x

Stone, C. (2016). Opportunity Through Online Learning (National guidelines). Bentley, WA: NCSEHE.

Sun, A., and Chen, X. (2015). Online education and its effective practice: a research review. J. Inf. Technol. Educ. Res. 15, 157–190. doi: 10.28945/3502

Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., et al. (2018). Strategies to mitigate student resistance to active learning. Int. J. STEM Educ. 5:7. doi: 10.1186/s40594-018-0102-y

Trowler, V. (2010). Student Engagement Literature Review. Heslington: The Higher Education Academy.

University Rankings Australia. (2017). Number of Students Studying Online. Available online at: http://www.universityrankings.com.au/distance-education-numbers.html (accessed July 7, 2020).

Waschull, S. B. (2001). The online delivery of psychology courses: attrition, performance, and evaluation. Teach. Psychol. 28, 143–147. doi: 10.1207/S15328023TOP2802_15

Keywords: online learning, online education, post-graduate learning, active learning, subject redevelopment

Citation: Lamon S, Knowles O, Hendy A, Story I and Currey J (2020) Active Learning to Improve Student Learning Experiences in an Online Postgraduate Course. Front. Educ. 5:598560. doi: 10.3389/feduc.2020.598560

Received: 25 August 2020; Accepted: 13 October 2020; Published: 03 November 2020.

Copyright © 2020 Lamon, Knowles, Hendy, Story and Currey. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Séverine Lamon, [email protected]

Active Learning

What is active learning?

Active learning generally refers to any instructional method that engages students in the learning process beyond listening and passive note taking. Active learning approaches promote skill development and higher order thinking through activities that might include reading, writing, and/or discussion. Metacognition -- thinking about one’s thinking -- can also be an important element, helping students connect course activities to their learning (Brame, 2016).

Active learning is rooted in constructivist learning theory , or the idea that students (humans!) learn by connecting new information and experiences to their prior knowledge and experiences, allowing them to build, or construct, new knowledge and understandings (Bransford et al., 1999). Often, although not exclusively, active learning approaches also include collaborative and cooperative learning in small groups. These approaches stem from social constructivism , which emphasizes the importance of peer-to-peer interactions in learning (Vygotsky 1978).

Beyond the theoretical underpinnings, many studies across disciplines have explored the benefits of active learning approaches in college classrooms (e.g., Freeman et al., 2014; Prince, 2004). Active learning strategies give students valuable opportunities to develop disciplinary skills and expertise by serving as sources of knowledge, formulating questions, and articulating ideas, while also fostering interactions with peers (Turpen & Finkelstein, 2009). Perhaps most notably, compared to traditional lecture alone, active learning approaches have been shown to increase student performance and decrease failure rates, particularly for students from underrepresented and excluded communities (Eddy & Hogan, 2014; Haak et al., 2011; Theobald et al., 2020).

What are some strategies that I might try? 

There are many different active learning strategies that instructors might incorporate into their teaching. These range from brief interactions during lecture and activities that take 10–20 minutes to strategies that span multiple class periods. The table below outlines a variety of sample strategies with tips for both in-person and remote implementation in courses. The strategies are roughly organized by how time-intensive they are to implement. Instructors might also explore these active learning designs as they consider opportunities for using each strategy.

  • Purposeful pause
  • Quick write or "minute" paper
  • Think-pair-share (TPS)
  • Polling/peer instruction
  • Concept map
  • Case study/group problem solving
  • Think-aloud problem solving
  • Gallery walk

What can active learning look like in practice?

In this section, we’ve included several resources with videos that describe different types of active learning strategies and how to implement them. Many also demonstrate active learning strategies in action.

REALISE videos, SEER Center, University of Georgia

Scientific Teaching Series , iBiology

Community-building active learning strategies (remote context), OneHE 

How might I get started?

  • Check out this active learning “cheat sheet” with 10 tips to help you get started, from choosing the “right” exercise to planning the logistics.
  • If you are new to active learning, you might start with identifying strategies to incorporate into your lecture (see these resources on lecturing and interactive lecturing ).
  • Have more questions, or want to brainstorm ideas? Reach out to the Center for Teaching and Learning ( [email protected] ) for a consultation!


What additional resources are available?

Active learning guides:

  • Active Learning Teaching Guide , Vanderbilt CFT
  • Introduction to Active Learning , Michigan CRLT
  • Active Learning , Yale Poorvu Center

Advice and strategies related to remote active learning:

  • Hybrid active learning strategies , Eberly Center, CMU
  • Flipping the remote classroom , Berkeley CTL

For a deeper dive:

Check out these research summaries describing common active learning techniques.

Polling with a student response system:

This Clicker Resource Guide (see PDF) has helpful advice for using polling questions in class with a student response system (e.g., iClicker Cloud or Poll Everywhere), including tips for logistics and "choreography" during implementation. It also touches on writing effective multiple-choice conceptual questions.

Additional group-based learning approaches:

  • Process-oriented guided inquiry learning (POGIL)
  • Problem-based learning (PBL) (see also: the Problem Library ) and working in teams .

References:

Angelo, T.A. and Cross, K.P. (1993). Classroom assessment techniques: A handbook for college teachers. San Francisco: Jossey-Bass.
Aronson, E., Blaney, N., Stephin, C., Sikes, J., & Snapp, M. (1978). The jigsaw classroom. Beverly Hills, CA: Sage Publishing Company.
Aronson, E. (2000). The jigsaw classroom. Retrieved from https://www.jigsaw.org/.
Barkley, E.F., Cross, K.P., and Major, C.H. (2014). Collaborative learning techniques: A handbook for college faculty. Jossey-Bass. (Available online and downloadable through the UC Berkeley Library; includes adaptations for synchronous and asynchronous instruction.)
Brame, C. (2016). Active learning. Vanderbilt University Center for Teaching. Retrieved March 10, 2021 from https://cft.vanderbilt.edu/active-learning/.
Bransford, J.D., Brown, A.L., and Cocking, R.R. (Eds.) (1999). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press.
Christensen, C.R. (1987). Teaching and the case method. Boston: Harvard Business School.
Crouch, C.H. and Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics 69, 970–977.
Eddy, S.L., & Hogan, K.A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE—Life Sciences Education, 13(3), 453–468.
Fagen, A.P., Crouch, C.H., and Mazur, E. (2002). Peer instruction: Results from a range of classrooms. Physics Teacher 40, 206–209.
Francek, M. (2006). Promoting discussion in the science classroom using gallery walks. Journal of College Science Teaching, 36(1).
Francek, M. "What is Gallery Walk?" Starting Point—Teaching Entry Level Geoscience. Retrieved March 24, 2021.
Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H., and Wenderoth, M.P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences USA 111, 8410–8415.
Haak, D.C., HilleRisLambers, J., Pitre, E., and Freeman, S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science 332, 1213–1216.
Handelsman, J., Miller, S., and Pfund, C. (2007). Scientific teaching. New York: W.H. Freeman.
Herreid, C.F. (1994). Case studies in science: A novel method of science education. Journal of College Science Teaching, 23(4), 221–229.
Lyman, F. (1981). The responsive classroom discussion: The inclusion of all students. In A. Anderson (Ed.), Mainstreaming Digest, College Park: University of Maryland Press, pp. 109–113.
Millis, B.J., & Cottell Jr., P.G. (1997). Cooperative learning for higher education faculty. Series on Higher Education. Phoenix, AZ: Oryx Press.
Nesbit, J.C. & Adesope, O.O. (2006). Learning with concept and knowledge maps: A meta-analysis. Review of Educational Research, 76(3), 413–448.
Novak, J.D. and Canas, A.J. (2008). The theory underlying concept maps and how to construct and use them. Technical Report IHMC CmapTools 2006-01 Rev 2008-01. Retrieved from http://cmap.ihmc.us/docs/theory-of-concept-maps.
Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education 93, 223–231.
Rivard, L.O.P. (1994). A review of writing to learn in science: Implications for practice and research. Journal of Research in Science Teaching, 31(9), 969–983.
Rowe, M.B. (1980). Pausing principles and their effects on reasoning in science. In F.B. Brawer (Ed.), Teaching the Sciences. New Directions for Community Colleges No. 31. San Francisco: Jossey-Bass.
Ruhl, K., Hughes, C.A., and Schloss, P.J. (1987). Using the pause procedure to enhance lecture recall. Teacher Education and Special Education 10, 14–18.
Smith, M.K., Wood, W.B., Adams, W.K., Wieman, C., Knight, J.K., Guild, N., and Su, T.T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323, 122–124.
Tanner, K.D. (2012). Promoting student metacognition. CBE—Life Sciences Education, 11(2), 113–120.
Theobald, E.J., Hill, M.J., Tran, E., Agrawal, S., Arroyo, E.N., Behling, S., ... & Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences, 117(12), 6476–6483.
Turpen, C., & Finkelstein, N.D. (2009). Not all interactive engagement is the same: Variations in physics professors' implementation of peer instruction. Physical Review Special Topics—Physics Education Research, 5(2), 020101.
Vygotsky, L.S. (1978). Mind in society. Cambridge, MA: Harvard University Press.

Constructivist Learning Theory and Creating Effective Learning Environments

  • First Online: 30 October 2021


  • Joseph Zajda

Part of the book series: Globalisation, Comparative Education and Policy Research ((GCEP,volume 25))


This chapter analyses constructivism and the use of constructivist learning theory in schools, in order to create effective learning environments for all students. It discusses various conceptual approaches to constructivist pedagogy. The key idea of constructivism is that meaningful knowledge and critical thinking are actively constructed, in a cognitive, cultural, emotional, and social sense, and that individual learning is an active process, involving engagement and participation in the classroom. This idea is most relevant to the process of creating effective learning environments in schools globally. It is argued that the effectiveness of constructivist learning and teaching is dependent on students’ characteristics, cognitive, social and emotional development, individual differences, cultural diversity, motivational atmosphere and teachers’ classroom strategies, school’s location, and the quality of teachers. The chapter offers some insights as to why and how constructivist learning theory and constructivist pedagogy could be useful in supporting other popular and effective approaches to improve learning, performance, standards and teaching. Suggestions are made on how to apply constructivist learning theory and how to develop constructivist pedagogy, with a range of effective strategies for enhancing meaningful learning and critical thinking in the classroom, and improving academic standards.

The unexamined life is not worth living (Socrates, 399 BCE).


Abdullah Alwaqassi, S. (2017). The use of multisensory in schools . Indiana University. https://scholarworks.iu.edu/dspace/bitstream/handle/2022/21663/Master%20Thesis%20in%

Acton, G., & Schroeder, D. (2001). Sensory discrimination as related to general intelligence. Intelligence, 29 , 263–271.

Adak, S. (2017). Effectiveness of constructivist approach on academic achievement in science at secondary level. Educational Research Review, 12 (22), 1074–1079.

Adler, E. (1997). Seizing the middle ground: Constructivism in world politics. European Journal of International Relations, 3 , 319–363.

Akpan, J., & Beard, B. (2016). Using constructivist teaching strategies to enhance academic outcomes of students with special needs. Journal of Educational Research , 4 (2), 392–398. Retrieved from https://files.eric.ed.gov/fulltext/EJ1089692.pdf

Al Sayyed Obaid, M. (2013). The impact of using multi-sensory approach for teaching students with learning disabilities. Journal of International Education Research , 9 (1), 75–82. Retrieved from https://eric.ed.gov/?id=EJ1010855

Alt, D. (2017). Constructivist learning and openness to diversity and challenge in higher education environments. Learning Environments Research, 20, 99–119. Retrieved from https://doi.org/10.1007/s10984-016-9223-8

Arends, R. (1998). Learning to teach . Boston: McGraw Hill.

Ayaz, M. F., & Şekerci, H. (2015). The effects of the constructivist learning approach on student’s academic achievement: A meta-analysis study . Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.1072.4600&rep=rep1&type=pdf

Bartlett, F. (1932). Remembering: A study in experimental and social psychology. Cambridge: CUP.

Bandura, A. (1977). Social learning theory . New York: General Learning Press.

Beck, C., & Kosnik, C. (2006). Innovations in teacher education: A social constructivist approach . New York, NY: SUNY Press.

Black, A., & Ammon, P. (1992). A developmental-constructivist approach to teacher education. Journal of Teacher Education, 43 (5), 323–335.

Bowles, S., & Gintis, H. (1976). Schooling in capitalist America . London: Routledge & Kegan Paul.

Brooks, J., & Brooks, M. (1993). In search of understanding: The case for constructivist classrooms . Alexandria, VA: Association of Supervision and Curriculum Development.

Bruner, J. (1963). The process of education . New York: Vintage Books.

Bynum, W. F., & Porter, R. (Eds.). (2005). Oxford dictionary of scientific quotations . Oxford: Oxford University Press.

Carnoy, M. (1999). Globalization and education reforms: What planners need to know . Paris: UNESCO, International Institute for Educational Planning.

Crogman, H., & Trebeau Crogman, M. (2016). Generated questions learning model (GQLM): Beyond learning styles . Retrieved from https://www.cogentoa.com/article/10.1080/2331186X.2016.1202460

Dangel, J. R. (2011). An analysis of research on constructivist teacher education. Retrieved from https://ineducation.ca/ineducation/article/view/85/361

Dewey, J. (1938). Experience and education . New York: Collier Books.

Doll, W. (1993). A post-modem perspective on curriculum . New York: Teachers College Press.

Doolittle, P. E., & Hicks, D. E. (2003). Constructivism as a theoretical foundation for the use of technology in social studies. Theory and Research in Social Education, 31 (1), 71–103.

Dunn, R., & Smith, J. B. (1990). Chapter four: Learning styles and library media programs. In J. B. Smith (Ed.), School library media annual (pp. 32–49). Englewood, CO: Libraries Unlimited.

Dunn, R., et al. (2009). Impact of learning-style instructional strategies on students’ achievement and attitudes: Perceptions of educators in diverse institutions. The Clearing House, 82 (3), 135–140. Retrieved from http://www.jstor.org/stable/30181095

Fontana, D. (1995). Psychology for teachers . London: Palgrave Macmillan.

Fosnot, C. T. (Ed.). (1989). Constructivism: Theory, perspectives, and practice . New York: Teacher's College Press.

Fosnot, C. T., & Perry, R. S. (2005). Constructivism: A psychological theory of learning. In C. T. Fosnot (Ed.), Constructivism: Theory, perspectives, and practice . New York: Teacher’s College Press.

Gardner, H. (1983). Frames of mind: The theory of multiple intelligences . New York: Basic Books.

Gardner, H. (1999). Intelligence reframed: Multiple intelligences for the 21st century . New York: Basic Books.

Gredler, M. E. (1997). Learning and instruction: Theory into practice (3rd ed.). Upper Saddle River, NJ: Prentice-Hall.

Gupta, N., & Tyagi, H. K. (2017). Constructivist based pedagogy for academic improvement at elementary level . Retrieved from https://www.researchgate.net/publication/321018062_constructivist_based_pedagogy_for_academic_improvement_at_elementary_level

Guzzini, S. (2000). A reconstruction of constructivism in international relations. European Journal of International Relations, 6 , 147–182.

Hirtle, J. (1996). Social constructivism. English Journal, 85(1), 91. Retrieved from https://search.proquest.com/docview/237276544?accountid=8194

Howe, K., & Berv, J. (2000). Constructing constructivism, epistemological and pedagogical. In D. C. Phillips (Ed.), Constructivism in education (pp. 19–40). Illinois: The National Society for the Study of Education.

Hunter, W. (2015). Teaching for engagement: part 1: Constructivist principles, case-based teaching, and active learning . Retrieved from https://www.researchgate.net/publication/301950392_Teaching_for_Engagement_Part_1_Co

Jonassen, D. H. (1994). Thinking technology. Educational Technology, 34 (4), 34–37.

Jonassen, D. H. (2000). Revisiting activity theory as a framework for designing student-centered learning environments. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments (pp. 89–121). Mahwah, NJ: Lawrence Erlbaum.

Kelly, G. A. (1955/1991). The psychology of personal constructs . Norton (Reprinted by Routledge, London, 1991).

Kharb, P. et al. (2013). The learning styles and the preferred teaching—Learning strategies of first year medical students . Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3708205/

Kim, B. (2001). Social constructivism. In M. Orey (Ed.), Emerging perspectives on learning, teaching, and technology . http://www.coe.uga.edu/epltt/SocialConstructivism.htm

Kim, J. S. (2005). The effects of a constructivist teaching approach on student academic achievement, self-concept, and learning strategies. Asia Pacific Education Review, 6 (1), 7–19.

Kolb, D. A., & Fry, R. (1975). Toward an applied theory of experiential learning. In C. Cooper (Ed.), Theories of group process . London: John Wiley.

Kukla, A. (2000). Social constructivism and the philosophy of science . London: Routledge.

Mahn, H., & John-Steiner, V. (2012). Vygotsky and sociocultural approaches to teaching and learning . https://doi.org/10.1002/9781118133880.hop207006

Martin, J., & Sugarman, J. (1999). The psychology of human possibility and constraint . Albany: SUNY.

Matthews, M. (2000). Constructivism in science and mathematics education. In C. Phillips (Ed.), Constructivism in education, ninety-ninth yearbook of the national society for the study of education, Part 1 (pp. 159–192). Chicago: University of Chicago Press.

Maypole, J., & Davies, T. (2001). Students’ perceptions of constructivist learning in a Community College American History 11 Survey Course . Retrieved from https://doi.org/10.1177/009155210102900205

McInerney, D. M., & McInerney, V. (2018). Educational psychology: Constructing learning (5th ed.). Sydney: Pearson.

McLeod, S. (2019). Constructivism as a theory for teaching and learning . Retrieved from https://www.simplypsychology.org/constructivism.html

OECD. (2007). Equity and quality in education . Paris: OECD.

OECD. (2009a). Key factors in developing effective learning environments: Classroom disciplinary climate and teachers’ self-efficacy. In Creating effective teaching and learning environments . Paris: OECD.

OECD. (2009b). Education at a glance . Paris: OECD.

OECD. (2009c). The schooling for tomorrow . Paris: OECD, Centre for Educational Research and Innovation.

OECD. (2013). Synergies for better learning: An international perspective on evaluation & assessment. Retrieved from www.oecd.org/edu/school/Evaluation_and_Assessment_Synthesis_Report.pdf

OECD. (2019a). PISA 2018 results (volume III): What school life means for students’ lives . Paris: OECD.

Oldfather, P., West, J., White, J., & Wilmarth, J. (1999). Learning through children’s eyes: Social constructivism and the desire to learn . Washington, DC: American Psychological Association.

O’Loughin, M. (1992). Rethinking science education: Beyond Piagetian constructivism toward a sociocultural model of teaching and learning. Journal of Research in Science Teaching, 2 (8), 791–820.

Onuf, N. (2003). Parsing personal identity: Self, other, agent. In F. Debrix (Ed.), Language, agency and politics in a constructed world (pp. 26–49). Armonk, NY: M.E. Sharpe.

Onuf, N. G. (2013). World of our making . Abingdon, UK: Routledge.

Packer, M., & Goicoechea, J. (2000). Sociocultural and constructivist theories of learning: Ontology, not just epistemology. Educational Psychologist, 35 (4), 227–241.

Phillips, D. (2000). An opinionated account of the constructivist landscape. In D. C. Phillips (Ed.), Constructivism in education, Ninety-ninth yearbook of the national society for the study of education, Part 1 (pp. 1–16). Chicago: University of Chicago Press.

Piaget, J. (1936). Origins of intelligence in the child . London: Routledge & Kegan Paul.

Piaget, J. (1967). Biologie et connaissance (Biology and knowledge). Gallimard.

Piaget, J. (1972). The principles of genetic epistemology (W. Mays, Trans.). Basic Books.

Piaget, J. (1977). The development of thought: Equilibration of cognitive structures . (A. Rosin, Trans.). The Viking Press.

Postman, N., & Weingartner, C. S. (1971). Teaching as a subversive activity . Harmondsworth: Penguin Books.

Puacharearn, P. (2004). The effectiveness of constructivist teaching on improving learning environments in thai secondary school science classrooms . Doctor of Science Education thesis. Curtin University of Technology. Retrieved from https://espace.curtin.edu.au/bitstream/handle/20.500.11937/2329/14877_

Richardson, V. (2003). Constructivist pedagogy. Teachers College Record, 105 (9), 1623–1640.

Sadler-Smith, E. (2001). The relationship between learning style and cognitive style. Personality and Individual Differences, 30 (4), 609–616.

Searle, J. R. (1995). The construction of social reality . New York, NY: Penguin Books.

Shah, R. K. (2019). Effective constructivist teaching learning in the classroom . Retrieved from https://files.eric.ed.gov/fulltext/ED598340.pdf

Sharma, H. L., & Sharma, L. (2012). Effect of constructivist approach on academic achievement of seventh grade learners in Mathematics. International Journal of Scientific Research, 2 (10), 1–2.

Shively, J. (2015). Constructivism in music education. Arts Education Policy Review: Constructivism, Policy, and Arts Education, 116 (3), 128–136.

Shor, I. (1992). Empowering education: Critical teaching for social change . Chicago: University of Chicago Press.

Slavin, R. (1984). Effective classrooms, effective schools: A research base for reform in Latin American education . Retrieved from http://www.educoas.org/Portal/bdigital/contenido/interamer/BkIACD/Interamer/

Slavin, R. E. (2020). Education psychology: theory and practice (12th ed.). Pearson.

Steffe, L., & Gale, J. (Eds.). (1995). Constructivism in education . Hillsdale, NJ.: Erlbaum.

Stoffers, M. (2011). Using a multi-sensory teaching approach to impact learning and community in a second-grade classroom . Retrieved from https://rdw.rowan.edu/cgi/viewcontent.cgi?article=1109&context=etd

Thomas, A., Menon, A., Boruff, J., et al. (2014). Applications of social constructivist learning theories in knowledge translation for healthcare professionals: A scoping review. Implementation Science, 9 , 54. https://doi.org/10.1186/1748-5908-9-54 .

Thompson, P. (2000). Radical constructivism: Reflections and directions. In L. P. Steffe & P. W. Thompson (Eds.), Radical constructivism in action: Building on the pioneering work of Ernst von Glasersfeld (pp. 412–448). London: Falmer Press.

von Glasersfeld, E. (1995). Radical constructivism: A way of knowing and learning. London: The Falmer Press.

Vygotsky, L. S. (1934a). Myshlenie i rech (Thought and language). State Socio-Economic Publishing House (translated in 1986 by Alex Kozulin, MIT).

Vygotsky, L. S. (1934b). Thought and language. Cambridge, MA: The MIT Press.

Vygotsky, L. (1968). The psychology of art . Moscow: Art.

Vygotsky, L. (1973). Thought and language (A. Kozulin, Trans. and Ed.). The MIT Press. (Originally published in Russian in 1934.)

Vygotsky, L. S. (1978). In M. Cole, V. John-Steiner, S. Scribner, & E. Souberman (Eds.), Mind in society: The development of higher psychological processes . Cambridge, MA: Harvard University Press.

Watson, J. (2003). Social constructivism in the classroom . Retrieved from https://onlinelibrary.wiley.com/doi/abs/10.1111/1467-9604.00206

Wertsch, J. V. (1991). Voices of the mind: A sociocultural approach to mediated action . Cambridge, MA: Harvard University Press.

Zajda, J. (Ed.). (2008a). Learning and teaching (2nd ed.). Melbourne: James Nicholas Publishers.

Zajda, J. (2008b). Aptitude. In G. McCulloch & D. Crook (Eds.), The international encyclopedia of education . London: Routledge.

Zajda, J. (2008c). Globalisation, education and social stratification. In J. Zajda, B. Biraimah, & W. Gaudelli (Eds.), Education and social inequality in the global culture (pp. 1–15). Dordrecht: Springer.

Zajda, J. (2018a). Motivation in the classroom: Creating effective learning environments. Educational Practice & Theory, 40 (2), 85–103.

Zajda, J. (2018b). Effective constructivist pedagogy for quality learning in schools. Educational Practice & Theory, 40 (1), 67–80.

Zajda, J. (Ed.). (2020a). Globalisation, ideology and education reforms: Emerging paradigms . Dordrecht: Springer.

Zajda, J. (Ed.). (2021). 3rd international handbook of globalisation, education and policy research . Dordrecht: Springer.

Zajda, J., & Majhanovich, S. (Eds.). (2021). Globalisation, cultural identity and nation-building: The changing paradigms . Dordrecht: Springer.

Zaphir, L. (2019). Can schools implement constructivism as an education philosophy? Retrieved from https://www.studyinternational.com/news/constructivism-education/

Author information

Faculty of Education & Arts, School of Education, Australian Catholic University, East Melbourne, VIC, Australia

Joseph Zajda


Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter

Zajda, J. (2021). Constructivist Learning Theory and Creating Effective Learning Environments. In: Globalisation and Education Reforms. Globalisation, Comparative Education and Policy Research, vol 25. Springer, Cham. https://doi.org/10.1007/978-3-030-71575-5_3


Published : 30 October 2021

Publisher Name : Springer, Cham

Print ISBN : 978-3-030-71574-8

Online ISBN : 978-3-030-71575-5



Scholars Crossing


Doctoral Dissertations and Projects

Teachers’ Lived Experiences Nurturing the Development of Self-Regulated Learning to Address Academic Outcomes for High School Students with Low Reading Achievement: A Phenomenological Study

Kimberly Mazie Wilson, Liberty University

School: School of Education

Degree: Doctor of Philosophy in Education (PhD)

Chair: Gail Collins

Keywords: Reading achievement, high school students with reading deficits, phenomenological study, self-regulated learning, Active View of Reading, reading disability

Disciplines: Special Education and Teaching

Recommended Citation

Wilson, Kimberly Mazie, "Teachers’ Lived Experiences Nurturing the Development of Self-Regulated Learning to Address Academic Outcomes for High School Students with Low Reading Achievement: A Phenomenological Study" (2024). Doctoral Dissertations and Projects . 5527. https://digitalcommons.liberty.edu/doctoral/5527

The purpose of this transcendental phenomenological study was to examine teachers’ lived experiences nurturing the development of self-regulated learning to address academic outcomes for high school students with low reading achievement. The two conceptual frameworks that guided this study were Zimmerman’s self-regulated learning, derived from Bandura’s social cognitive theory, and Duke and Cartwright’s active view of reading. Both frameworks identify self-regulatory skills that contribute to improved learning and reading outcomes. The research method was qualitative, and the design was transcendental phenomenology. The qualitative method gave voice to the participants’ lived experiences, capturing their in-depth accounts through semi-structured interviews, teachers’ letters of advice, and focus groups. The study included 10 participants, certified in general or special education, who worked with high school students with a reading disability. Data analysis followed the processes outlined by Moustakas and yielded five themes: (a) challenges, (b) building relationships, (c) differentiated instruction, (d) fostering motivation and engagement, and (e) strategy instruction. The results indicated that the participants helped high school students with low reading achievement develop self-regulated learning by building relationships, making content accessible, offering choices for demonstrating knowledge, and teaching strategies, all of which aided students in using self-regulatory skills for improved academic performance.




Communications of the American Mathematical Society

Launched by the American Mathematical Society in 2021, Communications of the American Mathematical Society (CAMS) is a Diamond Open Access online journal dedicated to publishing the very best research and review articles across all areas of mathematics. The journal presents a holistic view of mathematics and its applications to a wide range of disciplines.

ISSN 2692-3688

The 2020 MCQ for Communications of the American Mathematical Society is 1.00. What is MCQ? The Mathematical Citation Quotient (MCQ) measures journal impact by looking at citations over a five-year period. Subscribers to MathSciNet may click through for more detailed information.


Attention-Deficit / Hyperactivity Disorder (ADHD)

CDC's Attention-Deficit / Hyperactivity Disorder (ADHD) site includes information on symptoms, diagnosis, treatment, data, research, and free resources.


COMMENTS

  1. New Research Shows Learning Is More Effective When Active

    The research also found that effective active learning methods use not only hands-on and minds-on approaches, but also hearts-on, providing increased emotional and social support. Interest in active learning grew as the COVID-19 pandemic challenged educators to find new ways to engage students.

  2. The Curious Construct of Active Learning

    The concept of active learning has been around for several decades. For example, How People Learn (Bransford et al., 2000) contained a two-page section on active learning suggesting that it involves students taking control of their learning through some level of metacognitive sense making, self-assessment, and reflection. How People Learn II (National Academies of Sciences, Engineering, and ...

  3. Active Learning

    A 2014 meta-analysis of 225 research studies in STEM classes found that students in classes with active learning performed 6% better on exams than students in classes with traditional lecturing, and that students in classes with traditional lecturing were 1.5 times more likely to fail than in classes with active learning (Freeman et al, 2014).

  4. (PDF) The Concept of Active Learning and the Measurement of Learning

    in student learning and, especially lately, in generic working life skill or competence development [2, 3]. Generic working life skills have been viewed as important learning goals, because ...

  5. Instructor strategies to aid implementation of active learning: a

    Much of the active learning research literature introduces interventions in first-year (cornerstone) or fourth-year (capstone) courses, but we found within our dataset a tendency to oversample first-year courses. However, all four course-years were represented, as well as all major STEM disciplines, with the most common STEM disciplines being ...

  6. PDF Active learning classroom design and student engagement: An ...

    recipient of the instructor's expertise, whereas active learning pedagogies perceive the learner as an engaged participant in the learning process (Sabagh & Saroyan, 2014). Research examining the utilization of active learning pedagogies compared to lecture‐based instruction has shown positive and substantial outcomes in university

  7. Active learning: "Hands-on" meets "minds-on"

    A common characteristic of active learning is increased interactions among students (e.g., group work) and between a student and the instructor (e.g., instructor asking questions). These interactions can change classroom dynamics and create stressful situations that would not exist in traditional lecture courses.

  8. Active learning tools improve the learning outcomes, scientific

    Active learning comprises approaches that focus more on developing students' skills than transmitting information and require students to perform activities that require higher-order thinking. For this, ... we used research-based learning approaches through the application of questionnaires in a pretest (Q1) and posttest (Q2).

  9. Frontiers

    This literature review examined how the use of active learning methodologies in higher education can impact students' well-being. ... articles, with just one from 2021, three from 2020, one from 2019, and one from 2018. In general, they are simple studies documenting active approaches used in higher education. The main ...

  10. Investigating the contributions of active, playful learning to student

    Emerging research further suggests that active, playful learning, particularly when implemented through guided play, supports the 6 Cs beyond basic understanding of academic content (Hirsh-Pasek et al., 2022). Hirsh-Pasek and colleagues describe how they collaborated with kindergarten teachers across New Hampshire in an instructional coaching ...

  11. Lessons in learning

    A pioneer in work on active learning, Balkanski Professor of Physics and Applied Physics Eric Mazur hailed the study as debunking long-held beliefs about how students learn. ... This research confirms that faculty should persist and encourage active learning. Active engagement in every classroom, led by our incredible science faculty, should be ...

  12. Active Learning

    Approaches that promote active learning focus more on developing students' skills than on transmitting information and require that students do something—read, discuss, write—that requires higher-order thinking. They also tend to place some emphasis on students' explorations of their own attitudes and values.

  13. Evidence-Based Practices for the Active Learning Classroom

    The definition of active learning has developed over several decades. Early researchers defined active learning as "learning that is done with the expectation of using the material" (Benware and Deci 1984, p. 758). This definition was centered in motivational theory, and the authors argued that active learning facilitates higher levels of intrinsic motivation than passive learning.

  14. Active Learning in Higher Education: Sage Journals

    Active Learning in Higher Education is an international, refereed publication for all those who teach and support learning in higher education (HE) and those who undertake or use research into effective learning, teaching and assessment in universities and colleges. The journal is devoted to all aspects of development, innovations and good practice in higher education teaching and learning...

  15. The science of effective learning with spacing and retrieval practice

    Alexander Renkl, Educational Psychology Review (2023). Research on the psychology of learning has highlighted straightforward ways of enhancing learning. However, effective learning strategies are ...

  16. Learning is more effective when active

    The research also found that effective active learning methods use not only hands-on and minds-on approaches, but also hearts-on, providing increased emotional and social support.

  17. The Impact of Active Learning on Students' Academic Performance

    flipped and 89% for traditional), the students in the active learning classroom had a higher academic achievement rate, with 64% of the students achieving an A-grade as compared to 36% in the ...

  18. Overview of Active Learning Research and Rationale for ...

    Active Learning offers an alternative to the traditional lecture and note-taking model. Active Learning is generally defined as any instructional method that engages learners in the learning process. This has become an umbrella term used to describe both an overall pedagogy and specific strategies for teaching and learning in the classroom or lecture hall.

  19. Active learning

    Research shows that active learning can help students achieve a far deeper understanding of a topic than by simply listening to lectures or reading textbooks. For teachers, active learning provides more opportunities to interact with students. For example, it can give you more ways to get continual feedback to evaluate your teaching.

  20. (PDF) Active learning: An introduction

    Here is the basic active learning structure. Active learning is anything course-related that all students in a class session are called upon to do other than simply watching, listening and ...

  21. Frontiers

    The aim of this study was to assess the learning experiences of postgraduate students in an online learning environment delivering content in a guided, self-directed way focusing on active learning opportunities. Two-hundred and eighty-seven students participated in the study.

  22. Active Learning

    Active learning generally refers to any instructional method that engages students in the learning process beyond listening and passive note taking. Active learning approaches promote skill development and higher order thinking through activities that might include reading, writing, and/or discussion. Metacognition -- thinking about one's ...

  23. Active learning: "Hands-on" meets "minds-on"

    Science (science.org), 1 October 2021, Vol. 374, Issue 6563, p. 29. "Students may learn more than they think," by Louis Deslauriers, Logan McCarty, and Kristina Callaghan. Despite strong evidence that active learning based on the principles of deliberate practice produces better educational outcomes, traditional lecturing remains the dominant mode of instruction in ...

  24. Constructivist Learning Theory and Creating Effective Learning

    With reference to constructivism, McLeod (), like many other constructivist researchers, suggested that knowledge is indeed socially constructed, and that learning is a necessarily active process (see Dewey, 1938; Bruner, 1963; Vygotsky, 1978). The purpose of constructivism is, then, for the individual to construct her or his own meanings out of the elements of individual experience (see McLeod ...

  25. Leveraging Physical Activities to Support Learning for Young People via

    Through content analysis, we identified four approaches of how PA and learning were combined (i.e., PA embodied learning content, served as a functional input method for learning tasks, guided learners through different learning sites, and generated data for learning activities) and supporting technologies like full-body interaction learning ...

  26. "Teachers' Lived Experiences Nurturing the Development of Self-Regulate

    The purpose of this transcendental phenomenological study was to examine teachers' lived experiences nurturing the development of self-regulated learning to address academic outcomes for high school students with low reading achievement. The two conceptual frameworks that guided this study were Zimmerman's self-regulated learning, derived from Bandura's social cognitive theory, and Duke ...

  27. Batch Active Learning at Scale

    Batch active learning, which adaptively issues batched queries to a labeling oracle, is a common approach for addressing this problem. The practical benefits of batch sampling come with the downside of less adaptivity and the risk of sampling redundant examples within a batch -- a risk that grows with the batch size.

  28. AMS :: Comm. Amer. Math. Soc. -- Volume 4

    CURRENT ISSUE: Communications of the American Mathematical Society. Launched by the American Mathematical Society in 2021, Communications of the American Mathematical Society (CAMS) is a Diamond Open Access online journal dedicated to publishing the very best research and review articles across all areas of mathematics. The journal presents a holistic view of mathematics and its applications ...

  29. Attention-Deficit / Hyperactivity Disorder (ADHD)

    CDC's Attention-Deficit / Hyperactivity Disorder (ADHD) site includes information on symptoms, diagnosis, treatment, data, research, and free resources. Site Index For Everyone

  30. (PDF) Active Learning with Fully Bayesian Neural Networks for

    Active learning optimizes the exploration of large parameter spaces by strategically selecting which experiments or simulations to conduct, thus reducing resource consumption and potentially ...
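The batch selection idea described in the "Batch Active Learning at Scale" snippet (item 27) can be illustrated with a minimal sketch. This is not that paper's algorithm; it is a generic top-k uncertainty-sampling step in plain Python, and the pool, the scoring function, and all names are invented for illustration. Picking a whole batch in one query is cheap, but, as the snippet notes, it risks selecting redundant near-duplicate examples.

```python
import random

def batch_select(pool, score, batch_size):
    """Pick the batch_size most uncertain unlabeled examples in one query.

    pool is a list of (example_id, value) pairs; score maps a value to an
    uncertainty in [0, 1]. Ranking once and taking the top k is the simplest
    batch strategy, with no diversity term to guard against redundancy.
    """
    ranked = sorted(pool, key=lambda ex: score(ex[1]), reverse=True)
    return [ex[0] for ex in ranked[:batch_size]]

# Toy setup (an assumption for this sketch): each example carries a predicted
# probability, and uncertainty peaks at the 0.5 decision boundary.
random.seed(0)
pool = [(i, random.random()) for i in range(100)]

def uncertainty(p):
    return 1.0 - abs(p - 0.5) * 2  # 1.0 at the boundary, 0.0 at the extremes

batch = batch_select(pool, uncertainty, batch_size=5)
```

A diversity-aware variant would penalize candidates that are too similar to examples already in the batch, which is the direction the snippet's "redundancy" concern points toward.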