
Review Article | Open access | Published: 10 August 2016

Learning strategies: a synthesis and conceptual model

John A. C. Hattie & Gregory M. Donoghue

npj Science of Learning, volume 1, Article number: 16013 (2016)


The purpose of this article is to explore a model of learning that proposes that various learning strategies are powerful at certain stages in the learning cycle. The model describes three inputs and outcomes (skill, will and thrill), success criteria, three phases of learning (surface, deep and transfer) and an acquiring and consolidation phase within each of the surface and deep phases. A synthesis of 228 meta-analyses led to the identification of the most effective strategies. The results indicate that there is a subset of strategies that are effective, but this effectiveness depends on the phase of the model in which they are implemented. Further, it is best not to run separate sessions on learning strategies but to embed the various strategies within the content of the subject, to be clearer about developing both surface and deep learning and promoting their associated optimal strategies, and to teach the skills of transfer of learning. The article concludes with a discussion of questions raised by the model that need further research.


There has been a long debate about the purpose of schooling. These debates include claims that schooling is about passing on core notions of humanity and civilisation (or at least one’s own society’s view of these matters). They include claims that schooling should prepare students to live pragmatically and immediately in their current environment, should prepare students for the work force, and should equip students to live independently, to participate in the life of their community, to learn to ‘give back’ and to pursue personal growth. 1

In the past 30 years, however, the emphasis in many western systems of education has been more on enhancing academic achievement—in domains such as reading, mathematics, and science—as the primary purpose of schooling. 2 Such an emphasis has led to curricula being increasingly based on achievement in a few privileged domains, and ‘great’ students are deemed those who attain high levels of proficiency in these narrow domains.

This has led to many countries aiming to be in the top echelon of worldwide achievement measures in a narrow range of subjects; for example, achievement measures such as PISA (tests of 15-year olds in mathematics, reading and science, across 65 countries in 2012) or PIRLS (Year-5 tests of reading, across 57 countries in 2011). Indeed, within most school systems there is a plethora of achievement tests; many countries have introduced accountability pressures based on high levels of testing of achievement; and communities typically value high achievement or levels of knowledge. 3 The mantra underpinning these claims has been cast in terms of what students know and are able to do; the curriculum is compartmentalised into various disciplines of achievement; and students, teachers, parents and policy makers talk in terms of success in these achievement domains.

Despite the recent emphasis on achievement, the day-to-day focus of schools has always been on learning—how to know, how to know more efficiently and how to know more effectively. The underlying philosophy is more about what students are now ready to learn, how their learning can be enabled, and increasing the ‘how to learn’ proficiencies of students. In this scenario, the purpose of schooling is to equip students with learning strategies, or the skills of learning how to learn. Of course, learning and achievement are not dichotomous; they are related. 4 Through growth in learning in specific domains comes achievement and from achievement there can be much learning. The question in this article relates to identifying the most effective strategies for learning.

In our search, we identified >400 learning strategies: that is, those processes which learners use to enhance their own learning. Many were relabelled versions of others, some were minor modifications of others, but there remained many contenders purported to be powerful learning strategies. Such strategies help the learner structure his or her thinking so as to plan, set goals and monitor progress, make adjustments, and evaluate the process of learning and the outcomes. These strategies can be categorised in many ways according to various taxonomies and classifications (e.g., references 5 , 6 , 7 ). Boekaerts, 8 for example, argued for three types of learning strategies: (1) cognitive strategies such as elaboration, to deepen the understanding of the domain studied; (2) metacognitive strategies such as planning, to regulate the learning process; and (3) motivational strategies such as self-efficacy, to motivate oneself to engage in learning. Given the advent of newer ways to access information (e.g., the internet) and the mountain of information now at students’ fingertips, it is appropriate that Dignath, Buettner and Langfeldt 9 added a fourth category—management strategies such as finding, navigating, and evaluating resources.

But merely investigating these 400-plus strategies as if they were independent is not defensible. Thus, we begin with the development of a model of learning to provide a basis for interpreting the evidence from our meta-synthesis. The argument is that learning strategies can most effectively enhance performance when they are matched to the requirements of tasks (cf. 10 ).

A model of learning

The model comprises the following components: three inputs and three outcomes; student knowledge of the success criteria for the task; three phases of the learning process (surface, deep and transfer), with surface and deep learning each comprising an acquisition phase and a consolidation phase; and an environment for the learning ( Figure 1 ). We are proposing that various learning strategies are differentially effective depending on the degree to which the students are aware of the criteria of success, on the phases of learning process in which the strategies are used, and on whether the student is acquiring or consolidating their understanding. The following provides an overview of the components of the model (see reference 11 for a more detailed explanation of the model).

Figure 1. A model of learning.

Input and outcomes

The model starts with three major sources of inputs: the skill, the will and the thrill. The ‘skill’ is the student’s prior or subsequent achievement, the ‘will’ relates to the student’s various dispositions towards learning, and the ‘thrill’ refers to the motivations held by the student. In our model, these inputs are also the major outcomes of learning. That is, developing outcomes in achievement (skill) is as valuable as enhancing the dispositions towards learning (will) and as valuable as inviting students to reinvest more into their mastery of learning (thrill or motivations).

The first component describes the prior achievement the student brings to the task. As Ausubel 12 claimed: ‘If I had to reduce all of educational psychology to just one principle, I would say this: The most important single factor influencing learning is what the learner already knows. Ascertain this and teach him accordingly.’ Other influences related to the skills students bring to learning include their working memory, beliefs, encouragement and expectations from the student’s cultural background and home.

Dispositions are habits of mind, or tendencies to respond to situations in certain ways. Claxton 13 claimed that the mind frame of a ‘powerful learner’ is based on four major dispositions: resilience or emotional strength, resourcefulness or cognitive capabilities, reflection or strategic awareness, and relating or social sophistication. These dispositions involve the proficiency to edit, select, adapt and respond to the environment in a recurrent, characteristic manner. 14 But dispositions alone are not enough. Perkins et al. 15 outlined a model with three psychological components which must be present in order to spark dispositional behaviour: sensitivity—the perception of the appropriateness of a particular behaviour; inclination—the felt impetus toward a behaviour; and ability—the basic capacity and confidence to follow through with the behaviour.

There can be a thrill in learning but for many students, learning in some domains can be dull, uninviting and boring. There is a huge literature on various motivational aspects of learning, and a smaller literature on how the more effective motivational aspects can be taught. A typical demarcation is between mastery and performance orientations. Mastery goals are seen as being associated with intellectual development, the acquisition of knowledge and new skills, investment of greater effort, and higher-order cognitive strategies and learning outcomes. 16 Performance goals, on the other hand, have a focus on outperforming others or completing tasks to please others. A further distinction has been made between approach and avoidance performance goals. 17 – 19 The correlations of mastery and performance goals with achievement, however, are not as high as many have claimed. A recent meta-analysis found 48 studies relating goals to achievement (based on 12,466 students), and the overall correlation was 0.12 for mastery and 0.05 for performance goals on outcomes. 20 Similarly, Hulleman et al. 21 reviewed 249 studies ( N =91,087) and found an overall correlation between mastery goal and outcomes of 0.05 and performance goals and outcomes of 0.14. These are small effects and show the relatively low importance of these motivational attributes in relation to academic achievement.

An alternative model of motivation is based on Biggs’ 22 learning processes model, which combines motivation (why the student wants to study the task) and their related strategies (how the student approaches the task). He outlined three common approaches to learning: deep, surface and achieving. When students are taking a deep strategy, they aim to develop understanding and make sense of what they are learning, and create meaning and make ideas their own. This means they focus on the meaning of what they are learning, aim to develop their own understanding, relate ideas together and make connections with previous experiences, ask themselves questions about what they are learning, discuss their ideas with others and compare different perspectives. When students are taking a surface strategy, they aim to reproduce information and learn the facts and ideas—with little recourse to seeing relations or connections between ideas. When students are using an achieving strategy, they use a ‘minimax’ notion—minimum amount of effort for maximum return in terms of passing tests, complying with instructions, and operating strategically to meet a desired grade. It is the achieving strategy that seems most related to school outcomes.

Success criteria

The model includes a prelearning phase relating to whether the students are aware of the criteria of success in the learning task. This phase is less about whether the student desires to attain the target of the learning (which is more about motivation), but whether he or she understands what it means to be successful at the task at hand. When a student is aware of what it means to be successful before undertaking the task, this awareness leads to more goal-directed behaviours. Students who can articulate or are taught these success criteria are more likely to be strategic in their choice of learning strategies, more likely to enjoy the thrill of success in learning, and more likely to reinvest in attaining even more success criteria.

Success criteria can be taught. 23 , 24 Teachers can help students understand the criteria used for judging the students’ work, and thus teachers need to be clear about the criteria used to determine whether the learning intentions have been successfully achieved. Too often students may know the learning intention, but do not know how the teacher is going to judge their performance, or how the teacher knows when or whether students have been successful. 25 The success criteria need to be as clear and specific as possible (at surface, deep, or transfer level) as this enables the teacher (and learner) to monitor progress throughout the lesson to make sure students understand and, as far as possible, attain the intended notions of success. Learning strategies that help students get an overview of what success looks like include planning and prediction, having intentions to implement goals, setting standards for judging success, advance organisers, high levels of commitment to achieve success, and knowing about worked examples of what success looks like. 23

Environment

Underlying all components in the model is the environment in which the student is studying. Many books and internet sites on study skills claim that it is important to attend to various features of the environment such as a quiet room, no music or television, high levels of social support, giving students control over their learning, allowing students to study at preferred times of the day and ensuring sufficient sleep and exercise.

The three phases of learning: surface, deep and transfer

The model highlights the importance of both surface and deep learning and does not privilege one over the other, but rather insists that both are critical. Although the model does seem to imply an order, it must be noted that these are fuzzy distinctions (surface and deep learning can be accomplished simultaneously), but it is useful to separate them to identify the most effective learning strategies. More often than not, a student must have sufficient surface knowledge before moving to deep learning and then to the transfer of these understandings. As Entwistle 26 noted, ‘The verb ‘to learn’ takes the accusative’; that is, it only makes sense to analyse learning in relation to the subject or content area and the particular piece of work towards which the learning is directed, and also the context within which the learning takes place. The key debate, therefore, is whether the learning is directed to content that is meaningful to the student, as this will directly affect student dispositions, in particular a student’s motivation to learn and willingness to reinvest in their learning.

A most powerful model to illustrate this distinction between surface and deep is the structure of observed learning outcomes, or SOLO, 27 , 28 as discussed above. The model has four levels: unistructural, multistructural, relational and extended abstract. A unistructural intervention is based on teaching or learning one idea, such as coaching one algorithm, training in underlining, using a mnemonic or anxiety reduction. The essential feature is that this idea alone is the focus, independent of the context or its adaptation to or modification by content. A multistructural intervention involves a range of independent strategies or procedures, but without integration or orchestration as to the individual differences or demands of content or context (such as teaching time management, note taking and setting goals with no attention to any strategic or higher-order understandings of these many techniques). Relational interventions involve bringing together these various multistructural ideas, and seeing patterns; it can involve the strategies of self-monitoring and self-regulation. Extended abstract interventions aim at far transfer (transfer between contexts that, initially, appear remote to one another) such that they produce structural changes in an individual’s cognitive functioning to the point where autonomous or independent learning can occur. The first two levels (one then many ideas) refer to developing surface knowing and the latter two levels (relate and extend) refer to developing deeper knowing. The parallel in learning strategies is that surface learning refers to studying without much reflecting on either purpose or strategy, learning many ideas without necessarily relating them and memorising facts and procedures routinely.
Deep learning refers to seeking meaning, relating and extending ideas, looking for patterns and underlying principles, checking evidence and relating it to conclusions, examining arguments cautiously and critically, and becoming actively interested in course content (see reference 29 ).

Our model also makes a distinction between first acquiring knowledge and then consolidating it. During the acquisition phase, information from a teacher or instructional materials is attended to by the student and this is taken into short-term memory. During the consolidation phase, a learner then needs to actively process and rehearse the material as this increases the likelihood of moving that knowledge to longer-term memory. At both phases there can be a retrieval process, which involves transferring the knowing and understanding from long-term memory back into short-term working memory. 30 , 31

Acquiring surface learning

In their meta-analysis of various interventions, Hattie et al. 32 found that many learning strategies were highly effective in enhancing reproductive performances (surface learning) for virtually all students. Surface learning includes subject matter vocabulary, the content of the lesson and knowing much more. Strategies include record keeping, summarisation, underlining and highlighting, note taking, mnemonics, outlining and transforming, organising notes, training working memory, and imagery.

Consolidating surface learning

Once a student has begun to develop surface knowing, it is then important to encode it in a manner such that it can be retrieved at later appropriate moments. This encoding involves two groups of learning strategies: the first develops storage strength (the degree to which a memory is durably established or ‘well learned’) and the second develops retrieval strength (the degree to which a memory is accessible at a given point in time). 33 ‘Encoding’ strategies aim to develop both, but with a particular emphasis on retrieval strength. 34 Both groups of strategies invoke an investment in learning, which involves ‘the tendency to seek out, engage in, enjoy and continuously pursue opportunities for effortful cognitive activity’. 35 Although some may not ‘enjoy’ this phase, it does involve a willingness to practice, to be curious and to explore again, and a willingness to tolerate ambiguity and uncertainty during this investment phase. In turn, this requires sufficient metacognition and a calibrated sense of progress towards the desired learning outcomes. Strategies include practice testing, spaced versus massed practice, teaching test taking, interleaved practice, rehearsal, maximising effort, help seeking, time on task, reviewing records, learning how to receive feedback and deliberate practice (i.e., practice with the help of an expert, or receiving feedback during practice).

Acquiring deep learning

Students who have high levels of awareness, control or strategic choice of multiple strategies are often referred to as ‘self-regulated’ or having high levels of metacognition. In Visible Learning , Hattie 36 described these self-regulated students as ‘becoming like teachers’, as they had a repertoire of strategies to apply when their current strategy was not working, and they had clear conceptions of what success on the task looked like. 37 More technically, Pintrich et al. 38 described self-regulation as ‘an active, constructive process whereby learners set goals for their learning and then attempt to monitor, regulate and control their cognition, motivation and behaviour, guided and constrained by their goals and the contextual features in the environment’. These students know the what, where, who, when and why of learning, and the how, when and why to use which learning strategies. 39 They know what to do when they do not know what to do. Self-regulation strategies include elaboration and organisation, strategy monitoring, concept mapping, metacognitive strategies, self-regulation and elaborative interrogation.

Consolidating deep learning

Once a student has acquired surface and deep learning to the extent that it becomes part of their repertoire of skills and strategies, we may claim that they have ‘automatised’ such learning—and in many senses this automatisation becomes an ‘idea’, and so the cycle continues from surface idea to deeper knowing that then becomes a surface idea, and so on. 40 There is a series of learning strategies that develop the learner’s proficiency to consolidate deeper thinking and to be more strategic about learning. These include self-verbalisation, self-questioning, self-monitoring, self-explanation, self-verbalising the steps in a problem, seeking help from peers and peer tutoring, collaborative learning, evaluation and reflection, problem solving and critical thinking techniques.

There are skills involved in transferring knowledge and understanding from one situation to a new situation. Indeed, some have considered that successful transfer could be thought of as synonymous with learning. 41 , 42 There are many distinctions relating to transfer: near and far transfer, 43 low and high transfer, 44 transfer to new situations and problem solving transfer, 5 and positive and negative transfer. 45 Transfer is a dynamic, not static, process that requires learners to actively choose and evaluate strategies, consider resources and surface information, and, when available, to receive or seek feedback to enhance these adaptive skills. Reciprocal teaching is one program specifically aiming to teach these skills; for example, Bereiter and Scardamalia 46 have developed programs in the teaching of transfer in writing, where students are taught to identify goals, improve and elaborate existing ideas, strive for idea cohesion, present their ideas to groups and think aloud about how they might proceed. Similarly, Schoenfeld 47 outlined a problem-solving approach to mathematics that involves the transfer of skills and knowledge from one situation to another. Marton 48 argued that transfer occurs when the learner learns strategies that apply in a certain situation such that they are enabled to do the same thing in another situation when they realise that the second situation resembles (or is perceived to resemble) the first situation. He claimed that not only sameness, similarity, or identity might connect situations to each other, but also small differences might connect them as well. Learning how to detect such differences is critical for the transfer of learning. As Heraclitus claimed, no two experiences are identical; you do not step into the same river twice.

Overall messages from the model

There are four main messages to be taken from the model. First, if the success criterion is the retention of accurate detail (surface learning), then lower-level learning strategies will be more effective than higher-level strategies. However, if the intention is to help students understand content (deeper learning) with a view to applying it in a new context (transfer), then higher-level strategies are also needed. An explicit assumption is that higher-level thinking requires a sufficient corpus of lower-level surface knowledge to be effective—one cannot move straight to higher-level thinking (e.g., problem solving and creative thought) without a sufficient level of content knowledge. Second, the model proposes that when students are made aware of the nature of success for the task, they are more likely to be more involved in investing in the strategies to attain this target. Third, transfer is a major outcome of learning and is more likely to occur if students are taught how to detect similarities and differences between one situation and a new situation before they try to transfer their learning to the new situation. Hence, no one strategy may necessarily be best for all purposes. Fourth, the model also suggests that students can be advantaged when strategy training is taught with an understanding of the conditions under which the strategy best works—when and under what circumstance it is most appropriate.

The current study

The current study synthesises the many studies that have related various learning strategies to outcomes. This study only pertains to achievement outcomes (skill, in the model of learning); further work is needed to identify the strategies that optimise the dispositions (will) and the motivation (thrill) outcomes. The studies synthesised here are from four sources. First, there are the meta-analyses among the 1,200 meta-analyses in Visible Learning that relate to strategies for learning. 36 , 49 , 50 Second, there is the meta-analysis conducted by Lavery 51 on 223 effect-sizes derived from 31 studies relating to self-regulated learning interventions. The third source is two major meta-analyses by a Dutch team of various learning strategies, especially self-regulation. And the fourth is a meta-analysis conducted by Donoghue et al. 52 based on a previous analysis by Dunlosky et al. 53

The data in Visible Learning are based on 800 meta-analyses relating influences from the home, school, teacher, curriculum and teaching methods to academic achievement. Since its publication in 2009, the number of meta-analyses now exceeds 1,200, and those influences specific to learning strategies are retained in the present study. Lavery 51 identified 14 different learning strategies and the overall effect was 0.46—with greater effects for organising and transforming (i.e., deliberate rearrangement of instructional materials to improve learning, d =0.85) and self-consequences (i.e., student expectation of rewards or punishment for success or failure, d =0.70). The lowest effects were for imagery (i.e., creating or recalling vivid mental images to assist learning, d =0.44) and environmental restructuring (i.e., efforts to select or arrange the physical setting to make learning easier, d =0.22). She concluded that the higher effects involved ‘teaching techniques’ and related to more ‘deep learning strategies’, such as organising and transforming, self-consequences, self-instruction, self-evaluation, help-seeking, keeping records, rehearsing/memorising, reviewing and goal-setting. The lower ranked strategies were more ‘surface learning strategies’, such as time management and environmental restructuring.
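Throughout this section, d is an effect size: a standardised mean difference between an intervention group and a comparison group. As a reminder of what a figure such as d =0.85 represents, here is a minimal sketch of the usual pooled-standard-deviation calculation; all numbers are invented for illustration and are not taken from Lavery's data.

```python
import math

def cohens_d(mean_treat, mean_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
    """Standardised mean difference (Cohen's d) using the pooled SD."""
    pooled_var = (((n_treat - 1) * sd_treat ** 2 + (n_ctrl - 1) * sd_ctrl ** 2)
                  / (n_treat + n_ctrl - 2))
    return (mean_treat - mean_ctrl) / math.sqrt(pooled_var)

# Hypothetical groups: the intervention group scores 8.5 points higher,
# against a pooled SD of 10 points, giving d = 0.85.
d = cohens_d(60.0, 51.5, 10.0, 10.0, 50, 50)
```

An effect of d =0.85 therefore says the average treated student outscores the average control student by 0.85 pooled standard deviations.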

Of the two meta-analyses conducted by the Dutch team, the first study, by Dignath et al. 9 analysed 357 effects from 74 studies ( N =8,619). They found an overall effect of 0.73 from teaching methods of self-regulation. The effects were large for achievement (elementary school, 0.68; high school, 0.71), mathematics (0.96, 1.21), reading and writing (0.44, 0.55), strategy use (0.72, 0.79) and motivation (0.75, 0.92). In the second study, Donker et al. 54 reviewed 180 effects from 58 studies relating to self-regulation training, reporting an overall effect of 0.73 in science, 0.66 in mathematics and 0.36 in reading comprehension. The most effective strategies were cognitive strategies (rehearsal 1.39, organisation 0.81 and elaboration 0.75), metacognitive strategies (planning 0.80, monitoring 0.71 and evaluation 0.75) and management strategies (effort 0.77, peer tutoring 0.83, environment 0.59 and metacognitive knowledge 0.97). Performance was almost always improved by a combination of strategies, as was metacognitive knowledge. This led to their conclusion that students should not only be taught which strategies to use and how to apply them (declarative knowledge or factual knowledge) but also when (procedural or how to use the strategies) and why to use them (conditional knowledge or knowing when to use a strategy).

Donoghue et al. 52 conducted a meta-analysis based on the articles referenced in Dunlosky et al. 53 They reviewed 10 learning strategies and a feature of their review is a careful analysis of possible moderators to the conclusions about the effectiveness of these learning strategies, such as learning conditions (e.g., study alone or in groups), student characteristics (e.g., age, ability), materials (e.g., simple concepts to problem-based analyses) and criterion tasks (different outcome measures).

In the current study, we independently assigned all strategies to the various parts of the model—this was a straightforward process, and the few minor disagreements were resolved by mutual agreement. All results are presented in Appendix 1.

Results: the meta-synthesis of learning strategies

There are 302 effects derived from the 228 meta-analyses from the above four sources that have related some form of learning strategy to an achievement outcome. Most are experimental–control studies or pre–post studies, whereas some are correlations ( N =37). There are 18,956 studies (although some may overlap across meta-analyses). Only 125 meta-analyses reported the sample size ( N =11,006,839), but if the average (excluding the outlier 7 million from one meta-analysis) is used for the missing sample sizes, the best estimate of sample size is between 13 and 20 million students.
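The sample-size estimate above rests on a simple imputation: meta-analyses that did not report N are assigned the mean N of those that did, after excluding the 7-million outlier. A sketch of that procedure, with hypothetical placeholder numbers rather than the authors' data:

```python
# Hypothetical reported sample sizes for five meta-analyses; one is an
# extreme outlier of the kind excluded in the text.
reported = [50_000, 120_000, 30_000, 7_000_000, 80_000]
n_missing = 3  # meta-analyses that reported no sample size (hypothetical)

# Mean of the reported Ns, excluding values above an (assumed) outlier cutoff.
OUTLIER_THRESHOLD = 1_000_000
trimmed = [n for n in reported if n < OUTLIER_THRESHOLD]
mean_n = sum(trimmed) / len(trimmed)

# Best estimate: the reported total plus the imputed mean for each missing N.
estimated_total = sum(reported) + n_missing * mean_n
```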

The average effect is 0.53 but there is considerable variance ( Figure 2 ), and the overall number of meta-analyses, studies, number of people (where provided), effects and average effect-sizes for the various phases of the model are provided in Table 1 . The effects are lowest for management of the environment and ‘thrill’ (motivation), and highest for developing success criteria across the learning phases. The variance is sufficiently large, however, that it is important to look at specific strategies within each phase of the model.

Figure 2. The average and the distribution of all effect sizes.

Synthesis of the input phases of the model

The inputs: skill

There are nine meta-analyses that have investigated the relation between prior achievement and subsequent achievement, and not surprisingly these relations are high ( Table 2 ). The average effect-size is 0.77 (s.e.=0.10), which translates to a correlation of 0.36—substantial for any single variable. The effects of prior achievement are lowest in the early years, and highest from high school to university. One of the purposes of school, however, is to identify those students who are underperforming relative to their abilities and thus to not merely accept prior achievement as destiny. The other important skill is working memory—which relates to the amount of information that can be retained in short-term working memory when engaged in processing, learning, comprehension, problem solving or goal-directed thinking. 55 Working memory is strongly related to a person’s ability to reason with novel information (i.e., general fluid intelligence). 56
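The translation from an effect size of d =0.77 to a correlation of 0.36 follows a standard conversion, r = d / sqrt(d^2 + 4), which assumes two groups of equal size; a minimal check:

```python
import math

def d_to_r(d):
    """Convert Cohen's d to a correlation (equal group sizes assumed)."""
    return d / math.sqrt(d ** 2 + 4)

r = d_to_r(0.77)  # ~0.36, matching the value quoted in the text
```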

The inputs: will

There are 28 meta-analyses related to the dispositions of learning from 1,304 studies, and the average effect-size is 0.48 (s.e.=0.09; Table 3 ). The effect of self-efficacy is highest ( d =0.90), followed by increasing the perceived value of the task ( d =0.46), reducing anxiety ( d =0.45) and enhancing the attitude to the content ( d =0.35). Teachers could profitably increase students’ levels of confidence and efficacy to tackle difficult problems; not only does this increase the probability of subsequent learning but it can also help reduce students’ levels of anxiety. It is worth noting that in the 1980s the anxiety and stress literature shifted from a preoccupation with understanding levels of stress to providing coping strategies—and these strategies were powerful mediators of whether people coped or not. 57 Similarly in learning, it is less the levels of anxiety and stress than the development of coping strategies to deal with anxiety and stress that matters. These strategies include being taught to effectively regulate negative emotions; 58 increasing self-efficacy, which involves developing the student’s conviction in their own competence to attain desired outcomes; 59 focusing on the positive skills already developed; increasing social support and help seeking; reducing self-blame; and learning to cope with error and making mistakes. 60 Increasing coping strategies to deal with anxiety, and promoting confidence to tackle difficult and challenging learning tasks, frees up essential cognitive resources required for the academic work.

There has been much discussion about students having growth—or incremental—mindsets (human attributes are malleable not fixed) rather than fixed mindsets (attributes are fixed and invariant). 61 However, the evidence in Table 3 ( d =0.19) shows how difficult it is to change to growth mindsets, which should not be surprising as many students work in a world of schools dominated by fixed notions—high achievement, ability groups, and peer comparison.

The inputs: thrill

The thrill relates to the motivation for learning: what is the purpose or approach to learning that the student adopts? Adopting a surface or performance approach to motivation (learning merely to pass tests or for short-term gains), or holding mastery goals, is not conducive to maximising learning, whereas a deep or achieving approach or motivation is helpful ( Table 4 ). A possible reason why mastery goals are not successful is that too often the outcomes of tasks and assessments are at the surface level, and having mastery goals with no strategic sense of when to maximise them can be counter-productive. 62 Having goals, per se , is worthwhile—and this relates back to the general principle of having notions of what success looks like before investing in the learning. The first step is to teach students to have goals relating to their upcoming work, preferably the appropriate mix of achieving and deep goals, to ensure the goals are appropriately challenging, and then to encourage students to have specific intentions to achieve these goals. Teaching students that success can then be attributed to their effort and investment can help cement this power of goal setting, alongside deliberate teaching.

The environment

Despite the inordinate attention, particularly by parents, to structuring the environment as a precondition for effective study, such effects are generally relatively small ( Table 5 ). Background music, a sense of control over learning, the time of day of study, the degree of social support and the use of exercise all seem to make little difference. Given that most students receive sufficient sleep and exercise, it is perhaps not surprising that these effects are low; of course, extreme sleep or food deprivation may have marked effects.

Knowing the success criteria

A prediction from the model of learning is that when students learn how to gain an overall picture of what is to be learnt, have an understanding of the success criteria for the lessons to come and are somewhat clear at the outset about what it means to master the lessons, then their subsequent learning is maximised. The overall effect across the 31 meta-analyses is 0.54, with the greatest effects relating to providing students with success criteria, planning and prediction, having intentions to implement goals, setting standards for self-judgements and the difficulty of goals ( Table 6 ). All these learning strategies allow students to see the ‘whole’ or the gestalt of what is targeted to be learnt before starting the series of lessons. This provides a ‘coat hanger’ on which surface-level knowledge can be organised. When a teacher provides students with a concept map, for example, the effect on student learning is very low; but in contrast, when teachers work together with students to develop a concept map, the effect is much higher. It is the working with students to develop the main ideas, and to show the relations between these ideas to allow students to see higher-order notions, that influences learning. Thus, when students begin learning the ideas, they can begin to know how these ideas relate to each other, how the ideas are meant to form higher-order notions, and how they can begin to have some control or self-regulation over the relations between the ideas.

Synthesis of the learning phases of the model

Acquiring surface learning

There are many strategies, such as organising, summarising, underlining, note taking and mnemonics, that can help students master the surface knowledge ( Table 7 ). These strategies can be deliberately taught, and indeed may be the only set of strategies that can be taught irrespective of the content. However, for some of these strategies the impact is likely to be higher if they are taught within each content domain, as some of the skills (such as highlighting, note taking and summarising) may require specific ideas germane to the content being studied.

While it appears that training working memory can have reasonable effects ( d =0.53), there is less evidence that such training transfers into substantial gains in academic attainment. 63 There are many emerging and popular computer games that aim to increase working memory. For example, CogMed is a set of computerised adaptive routines intended to be used 30–40 min a day for 25 days. A recent meta-analysis (by the commercial owners 64 ) found average effect-sizes (across 43 studies) exceeding 0.70, but a separate meta-analysis of 21 studies on the longer-term effects of CogMed found no evidence of transfer to subjects such as mathematics or reading. 65 Although there were large effects in the short term, these gains were not maintained at follow-up (about 9 months later), and there was no evidence to support the claim that working memory training produces generalised gains in the other skills that have been investigated (verbal ability, word decoding or arithmetic), even when assessment takes place immediately after training. For the most robust studies, the effect of transfer is zero. It may be better to reduce working memory demands in the classroom. 66

Consolidating surface learning

The investment of effort and deliberate practice are critical at this consolidation phase, as are the abilities to listen to, seek and interpret the feedback that is provided ( Table 8 ). At this consolidation phase, the task is to review and practice (or overlearn) the material. Such investment is more valuable if it is spaced over time rather than massed. Rehearsal and memorisation are valuable—but note that memorisation is not so worthwhile at the acquisition phase. The difficult task is to make this investment in learning worthwhile, to make adjustments to the rehearsal as it progresses in light of high levels of feedback, and not to engage in mere drill and practice. These strategies relating to consolidating learning are heavily dependent on the student’s proficiency to invest time on task wisely, 67 to practice and learn from this practice, and to overlearn such that the learning is more readily available in working memory for the deeper understanding.

Acquiring deeper learning

Nearly all the strategies at this phase are powerful in enhancing learning ( Table 9 ). The ability to elaborate and organise, monitor the uses of the learning strategies, and have a variety of metacognitive strategies are the critical determinants of success at this phase of learning. A major purpose is for the student to deliberately activate prior knowledge and then make relations and extensions beyond what they have learned at the surface phase.

Consolidating deep learning

At this phase, the power of working with others is most apparent ( Table 10 ). This involves skills in seeking help from others, listening to others in discussion and developing strategies to ‘speak’ the language of learning. It is through such listening and speaking about their learning that students and teachers realise what they do deeply know, what they do not know and where they are struggling to find relations and extensions. An important strategy is for students to become teachers of others and to learn from peers, as this involves high levels of regulation, monitoring, anticipation and listening for their impact on the learner.

There has been much research confirming that teaching help-seeking strategies is successful, but how this strategy then works in classrooms is more complex. Teachers have to welcome students seeking help, and there needs to be knowledgeable others (e.g., peers) from whom to seek the help—too often students left in unsupported environments can seek and gain incorrect help and not know that the help is incorrect. 68 Ryan and Shin 69 also distinguished between adaptive help seeking (seeking help from others, such as an explanation, a hint, or an example, that would further learning and promote independent problem solving in the future) and expedient help seeking (seeking help that expedites task completion, such as help that provides the answer and is not focused on learning). They showed that adaptive help seeking from peers declines and expedient help seeking increases during early adolescence. Further, increases in expedient help seeking were associated with declines in achievement, whereas changes in adaptive help seeking were unrelated to achievement. The key is for teachers to teach adaptive help seeking, to ensure the help is dependable and correct, and to see this as more of a student skill than a teacher skill. Help seeking needs to be welcomed before it can have an effect.

Transfer

The transfer model promoted by Marton 48 seems to be supported in that a key to teaching for transfer involves understanding the patterns, similarities and differences before applying the strategies to a new task ( Table 11 ). Marton argued that transfer occurs when students learn strategies that apply in a certain situation such that they are enabled to do the same thing in another situation, to the degree that they realise how the second situation does (or does not) resemble the first. It is learning to detect differences and similarities that is the key that leads to transfer of learning.

Discussion and Conclusions

There is much debate about the optimal strategies of learning, and indeed we identified >400 terms used to describe these strategies. Our initial aim was to rank the various strategies in terms of their effectiveness, but this aim was soon abandoned. There was too much variability in the effectiveness of most strategies depending on when they were used during the learning process, and thus we developed the model of learning presented in this article. Like all models, it is a conjecture: it aims to say much and it is falsifiable. The efficacy of any model can be seen as an expression of its capacity to generate a scalable solution to a problem or need in ways that resolve more issues than prevailing theories or approaches. 70 The model posits that learning must be embedded in some content (something worth knowing), and thus the current claims about developing 21st-century skills sui generis are most misleading. These skills are often promoted as content-free and able to be developed in separate courses (e.g., critical thinking, resilience). Our model, however, suggests that such skills are likely to be best developed relative to some content. There is no need to develop learning strategy courses, or to teach the various strategies outside the context of the content. Instead, the strategies should be an integral part of the teaching and learning process, and can be taught within this process.

The model includes three major inputs and outcomes. These relate to what the students bring to the learning encounter (skill), their dispositions about learning (will) and their motivations towards the task (thrill). The first set of strategies relate to teaching students the standards for what is to be learned (the success criteria). We propose that effective learning strategies will be different depending on the phase of the learning—the strategies will be different when a student is first acquiring the matters to be learnt compared with when the student is embedding or consolidating this learning. That is, the strategies are differentially effective depending on whether the learning intention is surface learning (the content), deep learning (the relations between content) or the transfer of the skills to new situations or tasks. In many ways this demarcation is arbitrary (but not capricious) and more experimental research is needed to explore these conjectures. Further, the model is presented as linear whereas there is often much overlap in the various phases. For example, to learn subject matter (surface) deeply (i.e., to encode in memory) is helped by exploring and understanding its meaning; success criteria can have a mix of surface and deep and even demonstrate the transfer to other (real world) situations; and often deep learning necessitates returning to acquire specific surface level vocabulary and understanding. In some cases, there can be multiple overlapping processes. A reviewer provided a clear example: in learning that the internal angles of a quadrilateral add up to 360°, this might involve surface learning, which then requires rehearsal to consolidate, some self-questioning to apply, some detection of similarities to then work out what the internal angles of a hexagon might be, and spotting similarities to the triangle rule. There may be no easy way to know the right moment, or no easy demarcation of the various phases. 
The proposal in this paper is but a ‘model’ to help clarify the various phases of learning, and in many real world situations there can be considerable overlap.

We have derived six sets of propositions from our conceptual model of learning and the results of our meta-synthesis of research on learning strategies. The first set relates to the differential role played by what students bring to and take from the learning encounter—the inputs and outcomes. Second, there are some strategies that are more effective than others—but their relative effectiveness depends on the phase in the model of learning in which they take place. Third is the distinction between surface learning, deep learning and the transfer of learning. The fourth set relates to the skills of transfer, the fifth to how the model of learning can be used to resolve some unexpected findings about the effectiveness of some strategies, and the sixth set discusses the question ‘what is learning?’.

The intertwining role of skill, will, and thrill

Our first set of claims relates to the differential role of what students bring to and take from the learning encounter. Rather than arguing that many factors contribute to achievement (an important but sometimes the only privileged outcome of learning), we are promoting the notion that the skill, will and thrill can intertwine during learning and that these three inputs are also important outcomes of learning—the aim is to enhance the will (e.g., the willingness to reinvest in more and deeper learning), the thrill (e.g., the emotions associated with successful learning, the curiosity and the willingness to explore what one does not know) and the skills (e.g., the content and the deeper understanding). The relation between the thrill, will and skill can vary depending on the student and the requirements of the task. Certainly, negative emotions, such as those induced by fear, anxiety, and stress, can directly and negatively affect learning and memory. Such negative emotions block learning: ‘If the student is faced with sources of stress in an educational context which go beyond the positive challenge threshold—for instance, aggressive teachers, bullying students or incomprehensible learning materials whether books or computers—it triggers fear and cognitive function is negatively affected’. 71 Our argument is that learning can lead to enhanced skills, dispositions, motivations and excitements that can be reinvested in learning, and can lead to students setting higher standards for their success criteria. When skill, will, and thrill overlap, this should be considered a bonus; developing each is a worthwhile outcome of schooling in its own right.

It is all in the timing

Our second set of claims is that, while it is possible to nominate the top 10 learning strategies, the more critical conclusion is that the optimal strategies depend on where in the learning cycle the student is located. This strategic skill in using the strategies at the right moment is akin to the message in the Kenny Rogers song—you need to ‘know when to hold ‘em, know when to fold ‘em’. For example, when starting a teaching sequence, it is most important to be concerned that students have confidence they can understand the lessons, see value in the lessons and are not overly anxious about the skills to be mastered. Providing them early on with an overview of what successful learning in the lessons will look like (knowing the success criteria) will help them reduce their anxiety, increase their motivation, and build both surface and deeper understandings.

To acquire surface learning, it is worthwhile knowing how to summarise, outline and relate the learning to prior achievement; and then to consolidate this learning by engaging in deliberate practice, rehearsing over time and learning how to seek and receive feedback to modify this effort. To acquire deep understanding requires the strategies of planning and evaluation and learning to monitor the use of one’s learning strategies; and to consolidate deep understanding calls on the strategies of self-talk, self-evaluation, self-questioning and seeking help from peers. Such consolidation requires the learner to think aloud, learn the ‘language of thinking’, 72 know how to seek help, self-question and work through the consequences of the next steps in learning. To transfer learning to new situations involves knowing how to detect similarities and differences between the old and the new problems or situations.

We recommend that these strategies are developed by embedding them into the cycle of teaching rather than by running separate sessions, such as ‘how to learn’ or study skills courses. There is a disappointing history of educational programs aimed at teaching students how to learn. 30 , 73 , 74 Wiliam 75 made this case for why teaching these learning strategies (e.g., critical thinking) out of context is unlikely to develop a generic skill applicable to many subjects. He noted that in a ‘mathematics proof, critical thinking might involve ensuring that each step follows from the previous one (e.g., by checking that there has not been a division by zero). In reading a historical account, critical thinking might involve considering the author of the account, the potential biases and limitations that the author may be bringing to the account, and what other knowledge the reader has about the events being described. The important point here is that although there is some commonality between the processes in mathematics and history, they are not the same. Developing a capacity for critical thinking in history does not make one better at critical thinking in mathematics. For all of the apparent similarities, critical thinking in history and critical thinking in mathematics are different, and they are developed in different ways’. Many others have noted that metacognition is not knowledge-free but needs to be taught in the context of the individual subject areas. 76 , 77 Perkins 78 also noted that there is a certain art to infusing the teaching of thinking into content learning. Sometimes, ‘teachers think it is enough simply to establish a generally thoughtful atmosphere in a classroom, with regular expectations for thinking critically and creatively...teaching for know-how about learning to learn is a much more time-consuming enterprise than teaching for just learning the ideas... Building active know-how requires much more attention’.

Another aspect to consider is the difference, identified in the model, between being first exposed to learning and the consolidation of this learning. This distinction is far from novel. Shuell, 79 for example, distinguished between initial, intermediate, and final phases of learning. In the initial phase, the students can encounter a ‘large array of facts and pieces of information that are more-or-less isolated conceptually... there appears to be little more than a wasteland with few landmarks to guide the traveller on his or her journey towards understanding and mastery’. Students can use existing schema to make sense of this new information, or can be guided to have more appropriate schema (and thus experience early stages of concept learning and relation between ideas) otherwise the information may remain as isolated facts, or be linked erroneously to previous understandings. At the intermediate phase, the learner begins to see similarities and relationships among these seemingly conceptually isolated pieces of information. ‘The fog continues to lift but still has not burnt off completely’. During the final phase, the knowledge structure becomes well integrated and functions more autonomously, and the emphasis is more on performance or exhibiting the outcome of learning.

Horses for courses: matching strategies with phases

The third set of claims relates to the distinction between surface, deep, and transfer of learning. Although not a hard and fast set of demarcations, surface learning refers more to the content and underlying skills; deep learning to the relationships between, and extensions of, ideas; and transfer to the proficiency to apply learning to new problems and situations. During the surface learning phase, an aim is to assist students to overlearn certain ideas and thus reduce the demands on their working memory when working with these new facts in the deeper understanding phase. Note, for example, that Marton et al. 80 made an important distinction between memorising without understanding first, which they called rote memorisation (which has limited effect), and memorising after having understood, which they called meaningful memorisation (which can be powerful). The evidence in the current study supports this distinction.

It is when students have much information, or many seemingly unrelated ideas, that the learning strategies for the deep phase are optimally invoked. This is when they should be asked to integrate ideas with previous schema or modify their previous schema to integrate new ideas and ways of thinking. The key to this process is first gaining ideas—a fact often missed by those advocating deeper thinking strategies when they try to teach these skills prior to developing sufficient knowledge within the content domain. The students need first to have ideas before they can relate them. The model does not propose discarding the teaching or learning skills that have been developed to learn surface knowing, but advocates the benefits of a more appropriate balance of surface and deeper strategies and skills that then lead to transfer. The correct balance of surface to deep learning depends on the demands of the task. More emphasis on surface strategies is probably needed as students learn new ideas, moving to an emphasis on deeper strategies as they become more proficient.

Pause and reflect: detecting similarities and differences

The fourth set of claims relate to the skills of transfer, and how important it is to teach students to pause and detect the similarities and differences between previous tasks and the new one, before attempting to answer a new problem. Such transfer can be positive, such as when a learner accurately remembers a learning outcome reached in a certain situation and appropriately applies it in a new and similar situation, or negative, such as when a learner applies a strategy used successfully in one situation in a new situation where this strategy is not appropriate. Too many (particularly struggling) students over-rehearse a few learning strategies (e.g., copying and highlighting) and apply them in situations regardless of the demands of new tasks. Certainly, the fundamental skill for positive transfer is stopping before addressing the problem and asking about the differences and similarities of the new to any older task situation. This skill can be taught.

This ability to notice similarities and differences over content is quite different for novices and experts 81 , 82 and we do not simply learn from experience but we also learn to experience. 83 Preparation for future learning involves opportunities to try our hunches in different contexts, receive feedback, engage in productive failure and learn to revise our knowing based on feedback. The aim is to solve problems more efficiently, and also to ‘let go’ of previously acquired knowledge in light of more sophisticated understandings—and this can have emotional consequences: ‘Failure to change strategies in new situations has been described as the tyranny of success’. 84 It is not always productive for students to try the same thing that worked last time. Hence there may need to be an emphasis on knowledge-building rather than knowledge-telling, 85 and systematic inquiry based on theory-building and disconfirmation rather than simply following processes for how to find some result.

Why some strategies do not work

The fifth set of claims relate to how the model can be used to resolve some of the unexpected findings about the impact of various teaching methods. In Visible Learning , 36 it was noted that many programs that seem to lead to developing deeper processing have very low effect sizes (e.g., inquiry based methods, d =0.31; problem-based learning, d =0.15). For example, there have been 11 meta-analyses relating to problem-based learning based on 509 studies, leading to an average small effect ( d =0.15). It hardly seems necessary to run another problem-based program (particularly in first-year medicine, where four of the meta-analyses were completed) to know that the effects of problem-based learning on outcomes are small. The reason for this low effect seems to be related to using problem-based methods before attaining sufficient surface knowledge. When problem-based learning is used in later medical years, the effects seem to increase. Albanese and Mitchell 86 claimed that increased years of exposure to medical education increases the effect of problem-based learning. They argued that lack of experience (and lack of essential surface knowledge) leads the student to make more errors in their knowledge base, add irrelevant material to their explanations and engage in backward reasoning (from the unknown to the givens), whereas experts engaged in forward reasoning (also see references 87 , 88 ). Walker et al. 89 also noted that novice problem-based learning students tended to engage in far more backward-driven reasoning, which results in more errors during problem solving and may persist even after the educational intervention is complete. It is likely that problem-based learning works more successfully when students engage in forward reasoning and this depends on having sufficient content knowledge to make connections.

Deep understanding in problem-based learning requires a differentiated knowledge structure, 90 and this may need to be explicitly taught—as there is no assumption that students will see similarities and differences in contexts by themselves. There is a limit to what we can reasonably expect students to discover, and it may require teaching students to make predictions based on features that were told to them and that they may not notice on their own. Deliberate teaching of these surface features can offer a higher level of explanation that would be difficult or time consuming to discover. A higher-level explanation is important because it provides a generative framework that can extend one’s understanding beyond the specific cases that have been analysed and experienced. On the other hand, the problems should not be overly structured, as then students do not gain experience of searching out conceptual tools or homing in on particular cases of application. 78

Another example of the different requirements of surface and deep learning is the effect of asking students to explore errors and misconceptions during their learning. Using meta-analysis, Keith and Frese 91 found that the average effect of using these strategies when the outcome was surface learning was −0.15 and when the outcome was deep learning and far transfer to new problems, it was 0.80.

So: what is learning?

The sixth set of claims relate to the notion of ‘what is learning?’. The argument in this article is that learning is the outcome of the processes of moving from surface to deep to transfer. Only then will students be able to go beyond the information given to ‘figure things out’, which is one of the few untarnishable joys of life. 92 One of the greatest triumphs of learning is what Perkins 78 calls ‘knowing one’s way around’ a particular topic or ‘playing the whole game’ of history, mathematics, science or whatever. This is a function of knowing much and then using this knowledge in the exploration of relations and to make extensions to other ideas, and being able to know what to do when one does not know what to do (the act of transfer).

Concluding comments

Like all models, the one proposed in this article invites as many conjectures and directions for further research as it provides a basis for interpreting the evidence from the meta-synthesis. It helps make sense of much of the current literature, but it is speculative in that it also makes some untested predictions. There is much solace in Popper's 93 claim that ‘Bold ideas, unjustified anticipations, and speculative thought, are our only means for interpreting nature: our only organon, our only instrument, for grasping her. And we must hazard them to win our prize. Those among us who are unwilling to expose their ideas to the hazard of refutation do not take part in the scientific game.’ Further research is needed, for example, to better understand the optimal order through the various phases; there may be circumstances where it is beneficial to learn the deeper notions before developing the surface knowledge. It is highly likely that as one develops many ideas, and even relates and extends them, these become ‘ideas’ in their own right and the cycle continues. 94 We know much, but we need to know much more, and in particular we need to know how these many learning strategies might be better presented in another, competing model. Such testing of a bold model, and making predictions from models, is, according to Popper, how science progresses.

Further research is needed that asks whether the distinction between the acquisition and the consolidation of learning is a distinctive difference, a melding from one to the other, or whether both can occur simultaneously. If there is a difference, then more research on ascertaining the best time to move from acquisition to consolidation would be informative. Similarly, there is no hard rule in the model of a sequence from surface to deep to transfer. In some ways, teaching the strategies of knowing what success looks like upfront implies an exposure to both surface and deep learning. Also, the many arguments for the popular notion of flipped classrooms (for which there is surprisingly little evidence) could be tested by research on introducing the success criteria upfront to students. A typical flipped lesson starts with students accessing online video lectures or resources prior to in-class sessions so that they are prepared to participate in more interactive and higher-order activities such as problem solving, discussions and debates. 95 The most needed research concerns transfer—the variation theory of Marton, 48 and the claims by Perkins 78 and others, need more focused attention, and the usual (and often unsubstantiated) claims that doing x will assist learning y should return as a focus of the learning sciences.

We are proposing that it is worthwhile to develop the skill, will and thrill of learning, and that there are many powerful strategies for learning. Students can be taught these strategies (declarative knowledge), how to use them (procedural knowledge), under what conditions it may be more or less useful to apply them (conditional knowledge) and how to evaluate them. It may be necessary to teach when best to use these strategies according to the nature of the outcomes (surface and deep) and the timing of learning (first acquiring and then consolidating learning), and to teach the skill of transferring learning to new situations. We need to think in terms of ‘surface to deep’ and not one alone; we need to think in terms of developing dispositions, motivations and achievement, and not one alone. This invites considering multiple outcomes from our schools. Singapore, 96 for example, is now committed to developing an educational system that will produce young people who have the moral courage to stand up for what is right; pursue a healthy lifestyle and have an appreciation of aesthetics; are proud to be Singaporeans; are resilient in the face of difficulty, innovative and enterprising; are purposeful in the pursuit of excellence; are able to collaborate across cultures; and can think critically and communicate persuasively. Academic achievement is but one desirable learning outcome of many.

Another important message is that developing only a few learning strategies may not be optimal. The failure to change strategies in new situations has been described as the tyranny of success, 84 and the current meta-synthesis suggests that choosing different strategies as one progresses through the learning cycle (from first exposure to embedding, from surface to deep to transfer) demands cognitive flexibility. It may not be the best option for students to use the same strategies that worked last time: when the context changes, the old strategies may no longer work.

Widdowson, D. A., Dixon, R. S., Peterson, E. R., Rubie-Davies, C. M. & Irving, S. E. Why go to school? Student, parent and teacher beliefs about the purposes of schooling. Asia Pac. J. Educ. 35 , 1–14 (2014).


Biesta, G. J. Good Education in an Age of Measurement: Ethics, Politics, Democracy (Routledge, 2015).

Tröhler, D., Meyer, H. D., Labaree, D. F. & Hutt, E. L. Accountability: antecedents, power, and processes. Teach. Coll. Rec. 116 , 1–12 (2014).

Soderstrom, N. C. & Bjork, R. A. Learning versus performance: an integrative review. Perspect. Psychol. Sci. 10 , 176–199 (2015).


Mayer, R. E. Applying the science of learning: Evidence-based principles for the design of multimedia instruction. Am. Psychol. 63 , 760–769 (2008).

Pressley, M. Comprehension Instruction: Research-Based Best Practices 11–27 (Guilford Press, 2002).

Weinstein, C. E. & Mayer, R. E. in Handbook of Research on Teaching: a Project of the American Educational Research Association (ed. Wittrock, M. C.) 3rd edn., 315–327 (Macmillan, 1986).

Boekaerts, M. Self-regulated learning: a new concept embraced by researchers, policy makers, educators, teachers, and students. Learn. Instruct. 7 , 161–186 (1997).

Dignath, C., Buettner, G. & Langfeldt, H. P. How can primary school students learn self-regulated learning strategies most effectively? A meta-analysis on self-regulation training programmes. Educ. Res. Rev. 3 , 101–129 (2008).

Pressley, M., Goodchild, F., Fleet, J., Zajchowski, R. & Evans, E. D. The challenges of classroom strategy instruction. The Elementary School Journal 89 , 301–342 (1989).

Hattie, J. A. C. The 34th Vernon-Wall Lecture (British Psychological Society, 2014).

Ausubel, D. P. Educational Psychology: a Cognitive View (Holt, Rinehart, & Winston, 1968).

Claxton, G. What's the Point of School? Rediscovering the Heart of Education (Oneworld Publications, 2013).

Carr, M. & Claxton, G. Tracking the development of learning dispositions. Assessment in Education: Principles, Policy & Practice 9 , 9–37 (2002).

Perkins, D. N., Jay, E. & Tishman, S. Beyond abilities: A dispositional theory of thinking. Merrill-Palmer Quarterly 39 , 1–21 (1993).

Anderman, E. M. & Patrick, H. in Handbook of Research on Student Engagement , 173–191 (Springer, USA, 2012).

Elliot, A. J. & Harackiewicz, J. M. Approach and avoidance achievement goals and intrinsic motivation: A mediational analysis. J. Personal. Soc. Psychol. 70 , 461–475 (1996).

Middleton, M. J. & Midgley, C. Avoiding the demonstration of lack of ability: An underexplored aspect of goal theory. J. Educ. Psychol. 89 , 710 (1997).

Skaalvik, E. M. in Advances in Motivation and Achievement (eds Maehr, M. L. & Pintrich, P. R.) 51–97 (JAI Press, 1997).

Carpenter, S. L. A Comparison of the Relationships of Students' Self-Efficacy, Goal Orientation, and Achievement Across Grade Levels: a Meta-Analysis (Unpublished doctoral dissertation Faculty of Education, Simon Fraser Univ., 2007).

Hulleman, C. S., Schrager, S. M., Bodmann, S. M. & Harackiewicz, J. M. A meta-analytic review of achievement goal measures: Different labels for the same constructs or different constructs with similar labels? Psychol. Bull. 136 , 422 (2010).

Biggs, J. B. What do inventories of students’ learning processes really measure? A theoretical review and clarification. British J. Educ. Psychol. 63 , 3–19 (1993).

Clarke, S. Formative assessment in the secondary classroom (Hodder Murray, 2005).

Heritage, M. Formative assessment: What do teachers need to know and do? Phi Delta Kappan 89 , 140–145 (2007).

Boekaerts, M. Self-regulated learning: Where we are today. Int. J. Educ. Res. 31 , 445–457 (1999).

Entwistle, N. J. The verb ‘to learn’ takes the accusative. Br. J. Educ. Psychol. 46 , 1–3 (1976).

Biggs, J. B. & Collis, K. F. Evaluating the Quality of Learning: the SOLO Taxonomy (Structure of the Observed Learning Outcome) (Academic Press, 1982).

Hattie, J. A. C. & Brown, G. T. L. Cognitive processes in asTTle: the SOLO taxonomy. asTTle technical report (No. 43) (University of Auckland & Ministry of Education, 2004).

Entwistle, N. J. Approaches to studying and levels of understanding: the influences of teaching and assessment. High. Educ. 15 , 156–218 (2000).

Mayer, R. E. The Cambridge Handbook of Multimedia Learning , 2nd edn, 43–71 (Cambridge Univ. Press, 2014).

Robertson, E. M., Pascual-Leone, A. & Miall, R. C. Current concepts in procedural consolidation. Nat. Rev. Neurosci. 5 , 576–582 (2004).


Hattie, J. A. C., Biggs, J. & Purdie, N. Effects of learning skills interventions on student learning: A meta-analysis. Rev. Educ. Res. 66 , 99–136 (1996).

Bjork, R. A. & Bjork, E. L. in Learning Processes to Cognitive Processes: essays in Honor of William K. Estes (eds Healy, A. et al. ) Vol. 2, 35–67 (Erlbaum, 1992).

Bjork, E. L., de Winstanley, P. A. & Storm, B. C. Learning how to learn: can experiencing the outcome of different encoding strategies enhance subsequent encoding? Psychon. Bull. Rev. 14 , 207–211 (2007).

von Stumm, S., Chamorro-Premuzic, T. & Ackerman, P. L. in Wiley-Blackwell Handbook of Individual Differences (eds Chamorr-Premuzic, T. et al. ) 217–241 (Wiley-Blackwell, 2011).

Hattie, J. A. C. Visible Learning: a Synthesis of Over 800 Meta-Analyses Relating to Achievement (Routledge, 2009).

Purdie, N. & Hattie, J. A. C. Assessing students’ conceptions of learning. Austr. J. Dev. Educ. Psychol. 2 , 17–32 (2002).

Pintrich, P. R. Multiple goals, multiple pathways: the role of goal orientation in learning and achievement. J. Educ. Psychol. 92 , 544–555 (2000).

Schraw, G. & Dennison, R. S. Assessing metacognitive awareness. Contemp. Educ. Psychol. 19 , 460–475 (1994).

Pegg, J. & Tall, D. in Theories of Mathematics Education 173–192 (Springer, 2010).

Perkins, D. N. & Salomon, G. Knowledge to go: a motivational and dispositional view of transfer. Educ. Psychol. 47 , 248–258 (2012).

Smedslund, J. The problem of “what is learned?”. Psychol. Rev. 60 , 157–158 (1953).

Barnett, S. M. & Ceci, S. J. When and where do we apply what we learn? A taxonomy for far transfer. Psychol. Bull. 128 , 612–637 (2002).

Salomon, G. & Perkins, D. N. Rocky roads to transfer: rethinking mechanism of a neglected phenomenon. Educ. Psychol. 24 , 113–142 (1989).

Osgood, C. E. The similarity paradox in human learning: a resolution. Psychol. Rev. 56 , 132–143 (1949).

Bereiter, C. & Scardamalia, M. in Knowledge Creation in Education (eds Tan, S. C. et al. ) 35–52 (Springer, 2014).

Schoenfeld, A. H. in Handbook of Research on Mathematics Learning and Teaching (ed. Grouws, D. A.) 334–370 (Macmillan, 1992).

Marton, F. Sameness and difference in transfer. J. Learn. Sci. 15 , 499–535 (2006).

Hattie, J. A. C. Visible Learning for Teachers (Routledge, 2012).


Hattie, J. A. C. The applicability of Visible Learning to higher education. Scholarship Teach. Learn. Psychol. 1 , 79–91 (2015).

Lavery, L. Self-regulated Learning For Academic Success: an Evaluation Of Instructional Techniques (Unpublished doctoral thesis Univ. Auckland, 2008).

Donoghue, G. & Hattie, J. A. C. A Meta-Analysis of Learning Strategies based on Dunlosky et al. (Unpublished paper Science of Learning Research Centre, 2015).

Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J. & Willingham, D. T. Improving students’ learning with effective learning techniques: promising directions from cognitive and educational psychology. Psychol. Sci. Public Interest 14 , 4–58 (2013).

Donker, A. S., de Boer, H., Kostons, D., van Ewijk, C. D. & van der Werf, M. P. C. Effectiveness of learning strategy instruction on academic performance: A meta-analysis. Educ. Res. Rev. 11 , 1–26 (2013).

Cowan, N. in Neuroscience in Education: the Good, the Bad and the Ugly (eds Della Sala, S. & Anderson, M.) 111–127 (Oxford Univ. Press, 2012).

Ackerman, P. L., Beier, M. E. & Boyle, M. O. Working memory and intelligence: The same or different constructs? Psychol. Bull. 131 , 30–60 (2005).

Frydenberg, E. & Lewis, R. Coping Scale for Adults (ACER, 2015).

Galla, B. M. & Wood, J. J. Emotional self-efficacy moderates anxiety-related impairments in math performance in elementary school-age youth. Pers. Indiv. Differ. 52 , 118–122 (2012).

Bandura, A. Self-efficacy: the Exercise of Control (Macmillan, 1997).

Frydenberg, E. Think positively!: a Course For Developing Coping Skills In Adolescents (A&C Black, 2010).

Dweck, C. Mindset: How You Can Fulfil Your Potential (Hachette, 2012).

Watkins, D. & Hattie, J. A. A longitudinal study of the approaches to learning of Australian Tertiary students. Hum. Learn. 4 , 127–141 (1985).

Turley-Ames, K. J. & Whitfield, M. M. Strategy training and working memory task performance. J. Mem. Lang. 49 , 446–468 (2003).

Owen, A. M. et al. Putting brain training to the test. Nature 465 , 775–778 (2010).

Melby-Lervåg, M. & Hulme, C. Is working memory training effective? A meta-analytic review. Dev. Psychol. 49 , 270–291 (2013).

Alloway, T. P. How does working memory work in the classroom? Educ. Res. Rev. 1 , 134–139 (2006).

Claessens, B. J., Van Eerde, W., Rutte, C. G. & Roe, R. A. Things to do today: a daily diary study on task completion at work. Appl. Psychol. 59 , 273–295 (2010).

Nuthall, G. A. The Hidden Lives of Learners (New Zealand Council for Educational Research, 2007).

Ryan, A. M. & Shin, H. Help-seeking tendencies during early adolescence: an examination of motivational correlates and consequences for achievement. Learn. Instruct. 21 , 247–256 (2011).

Kuhn, T. S. The Structure of Scientific Revolutions (The University of Chicago Press, 1962).

CERI. 40th Anniversary International Conference on Learning in the 21st Century, May 2008 (2008).

Zohar, A. in Metacognition in Science Education 197–223 (Springer, 2012).

Lockhead J. & Clement J. (eds) Research on Teaching Thinking Skills (Franklin Institute Press, 1979).

Mayer, R. E. Multimedia learning: Are we asking the right questions? Educ. Psychol. 32 , 1–19 (1997).

Wiliam, D. Presentation to the Salzburg Global Seminar. Available at www.dylanwiliam.org/Dylan_Wiliams.../Salzburg%20Seminar%20talk (2014).

Donovan M. S. & Bransford J. D. (eds). How Students Learn: History in the Classroom (National Academies Press, 2005).

Vye, N., Schwartz, D. L., Bransford, J. D., Barron, B. J. & Zech, L. in Metacognition in Educational Theory And Practice (eds Dunlosky, H. & Graesser, A.) 305–347 (Lawrence Erlbaum, 1998).

Perkins, D. Future Wise: Educating our Children for a Changing World (John Wiley & Sons, 2014).

Shuell, T. J. Teaching and learning as problem solving. Theor. Pract. 29 , 102–108 (1990).

Marton, F., Wen, Q. & Wong, K. C. ‘Read a hundred times and the meaning will appear’. Changes in Chinese university students’ views of the temporal structure of learning. High. Educ. 49 , 291–318 (2005).

Chase, W. G. & Simon, H. A. Perception in chess. Cogn. Psychol. 4 , 55–81 (1973).

Berliner, D. C. Learning about and learning from expert teachers. Int. J. Educ. Res. 35 , 463–482 (2001).

Bransford, J. D., Brown, A. L. & Cocking, R. R. How People Learn: Brain, Mind, Experience, and School (National Academy Press, 1999).

Robinson, A. G., Stern, S. & Stern, S. Corporate creativity: How Innovation and Improvement Actually Happen (Berrett-Koehler Publishers, 1997).

Bereiter, C. & Scardamalia, M. An Inquiry Into the Nature and Implications of Expertise , (Open Court, 1993).

Albanese, M. A. & Mitchell, S. Problem-based learning: A review of literature on its outcomes and implementation issues. Acad. Med. 68 , 52–81 (1993).

Gijbels, D., Dochy, F., Van den Bossche, P. & Segers, M. Effects of problem-based learning: A meta-analysis from the angle of assessment. Rev. Educ. Res. 75 , 27–61 (2005).

Gilhooly, K. J. Cognitive psychology and medical diagnosis. Appl. Cogn. Psychol. 4 , 261–272 (1990).

Walker, A. & Leary, H. M. A problem based learning meta-analysis: differences across problem types, implementation types, disciplines, and assessment levels. Interdiscipl. J. Probl. Based Learn. 3 , 12–43 (2009).

Schwartz, D. L. & Bransford, J. D. A time for telling. Cogn. Instruct. 16 , 475–522 (1998).

Keith, N. & Frese, M. Effectiveness of error management training: A meta-analysis. J. Appl. Psychol. 93 , 59 (2008).

Bruner, J. S. The Culture of Education (Harvard Univ. Press, 1996).

Popper, K. R. The Logic of Scientific Discovery , 3rd edn (Hutchinson, 1968).

Pegg, J. in Encyclopedia of Mathematics Education , 570–572 (Springer, 2014).

DeWitt, P. M. Flipping Leadership Doesn’t Mean Reinventing the Wheel (Corwin Press, 2014).

Tan, C. Y. & Dimmock, C. How a ‘top performing’ Asian school system formulates and implements policy: The case of Singapore. Educ. Manage. Admin. Leadership 42 , 743–763 (2014).


Acknowledgements

The Science of Learning Research Centre is a Special Research Initiative of the Australian Research Council. Project Number SR120300015. We thank the following for critiquing earlier drafts of this article: Dan Willingham, Jason Lodge, Debra Masters, Rob Hester, Jared Horvath and Luke Rowe.

Author information

Authors and Affiliations

Science of Learning Research Centre, Graduate School of Education, University of Melbourne, Carlton, VIC, Australia

John A C Hattie & Gregory M Donoghue


Contributions

The authors contributed equally to the project and writing of this paper.

Corresponding author

Correspondence to John A C Hattie .

Ethics declarations

Competing interests

The authors declare no conflict of interest.

Supplementary information

Supplementary Appendix (DOC 899 kb)

Rights and permissions

This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

Reprints and permissions

About this article

Cite this article

Hattie, J., Donoghue, G. Learning strategies: a synthesis and conceptual model. npj Science Learn 1 , 16013 (2016). https://doi.org/10.1038/npjscilearn.2016.13

Download citation

Received : 30 December 2015

Revised : 12 April 2016

Accepted : 23 May 2016

Published : 10 August 2016

DOI : https://doi.org/10.1038/npjscilearn.2016.13




Learning Strategies That Work

Dr. Mark A. McDaniel shares effective, evidence-based strategies about learning to replace less effective but widely accepted practices.


How do we learn and absorb new information? Which learning strategies actually work and which are mere myths?

Such questions are at the center of the work of Mark McDaniel , professor of psychology and the director of the Center for Integrative Research on Cognition, Learning, and Education at Washington University in St. Louis. McDaniel coauthored the book Make it Stick: The Science of Successful Learning .

In this Q&A adapted from a Career & Academic Resource Center podcast episode , McDaniel discusses his research on human learning and memory, including the most effective strategies for learning throughout a lifetime.

Harvard Extension: In your book, you talk about strategies to help students be better learners in and outside of the classroom. You write, “We harbor deep convictions that we learn better through single-minded focus and dogged repetition. And these beliefs are validated time and again by the visible improvement that comes during practice, practice, practice.”

McDaniel: This judgment that repetition is effective is hard to shake. There are cues present that your brain picks up when you’re rereading, when you’re repeating something, that shape your metacognition (that is, your judgment about your own cognition) and give you the misimpression that you really have learned this stuff well.


And two of the primary cues are familiarity. So as you keep rereading, the material becomes more familiar to you. And we mistakenly judge familiarity as meaning robust learning.

And the second cue is fluency. It’s very clear from much work in reading and cognitive processes during reading that when you reread something at every level, the processes are more fluent. Word identification is more fluent. Parsing the structure of the sentence is more fluent. Extracting the ideas is more fluent. Everything is more fluent. And we misinterpret these fluency cues that the brain is getting. And these are accurate cues. It is more fluent. But we misinterpret that as meaning, I’ve really got this. I’ve really learned this. I’m not going to forget this. And that’s really misleading.

So let me give you another example. It’s not just rereading. It’s situations in, say, the STEM fields or any place where you’ve got to learn how to solve certain kinds of problems. One of the standard ways that instructors present homework is to present the same kind of problem in block fashion. You may have encountered this in your own math courses, your own physics courses.

So for example, in a physics course, you might get a particular type of work problem. And the parameters on it, the numbers, might change, but in your homework, you’re trying to solve two or three or four of these work problems in a row. Well, it gets more and more fluent, because you know exactly what formula you have to use. You know exactly what the problem is about. And as you get more fluent, and as we say in the book, it looks like you’re getting better. You are getting better at these problems.

But the issue is: can you remember how to identify which kinds of problems go with which kinds of solutions a week later, when you’re asked to take a test with all different kinds of problems? And the answer is no, you cannot when you’ve done this block practice. So even though instructors feel like their students are doing great with block practice, and students feel like they’re doing great (they are doing great on that kind of block practice), they’re not at all good at retaining which distinguishing features of problems signal certain kinds of approaches.

What you want to do is interleave practice in these problems. You want to randomly have a problem of one type and then solve a problem of another type and then a problem of another type. And in doing that, it feels difficult and it doesn’t feel fluent. And the signals to your brain are, I’m not getting this. I’m not doing very well. But in fact, that effort to try to figure out what kinds of approaches do I need for each problem as I encounter a different kind of problem, that’s producing learning. That’s producing robust skills that stick with you.

So this is a seductive thing that we have to, instructors and students alike, have to understand and have to move beyond those initial judgments, I haven’t learned very much, and trust that the more difficult practice schedule really is the better learning.

And I’ve written more on this since Make It Stick . And one of my strong theoretical tenets now is that in order for students to really embrace these techniques, they have to believe that they work for them. Each student has to believe it works for them. So I prepare demonstrations to show students these techniques work for them.

The net result of adopting these strategies is that students aren’t spending more time. Instead they’re spending more effective time. They’re working better. They’re working smarter.

When students take an exam after doing lots of retrieval practice, they see how well they’ve done. The classroom becomes very exciting. There’s lots of buy-in from the students. There’s lots of energy. There’s lots of stimulation to want to do more of this retrieval practice, more of this difficulty. Because trying to retrieve information is a lot more difficult than rereading it. But it produces robust learning for a number of reasons.

I think students have to trust that these techniques, and I think they also have to observe that these techniques work for them. It’s creating better learning. And then as a learner, you are more motivated to replace these ineffective techniques with more effective techniques.

Harvard Extension: You talk about tips for learners, how to make it stick. And there are several methods or tips that you share: elaboration, generation, reflection, calibration, among others. Which of these techniques is best?

McDaniel: It depends on the learning challenges that are faced. So retrieval practice, which is practicing trying to recall information from memory is really super effective if the requirements of your course require you to reproduce factual information.

For other things, it may be that you want to try something like generating understanding, creating mental models. So if your exams require you to draw inferences and work with new kinds of problems that are illustrative of the principles, but they’re new problems you haven’t seen before, a good technique is to try to connect the information into what I would call mental models. This is your representation of how the parts and the aspects fit together, relate together.

It’s not that one technique is better than the other. It’s that different techniques produce certain kinds of outcomes. And depending on the outcome you want, you might select one technique or the other.

I really firmly believe that to the extent that you can make learning fun and to the extent that one technique really seems more fun to you, that may be your go to technique. I teach a learning strategy course and I make it very clear to students. You don’t need to use all of these techniques. Find a couple that really work for you and then put those in your toolbox and replace rereading with these techniques.

Harvard Extension: You reference lifelong learning and lifelong learners. You talk about the brain being plastic, mutability of the brain in some ways, and give examples of how some lifelong learners approach their learning.

McDaniel: In some sense, more mature learners, older learners, have an advantage because they have more knowledge. And part of learning involves relating new information that’s coming into your prior knowledge, relating it to your knowledge structures, relating it to your schemas for how you think about certain kinds of content.

And so older adults have the advantage of having this richer knowledge base with which they can try to integrate new material. So older learners shouldn’t feel that they’re at a definitive disadvantage, because they’re not. Older learners really want to try to leverage their prior knowledge and use that as a basis to structure and frame and understand new information coming in.

Our challenges as older learners is that we do have these habits of learning that are not very effective. We turn to these habits. And if these aren’t such effective habits, we maybe attribute our failures to learn to age or a lack of native ability or so on and so forth. And in fact, that’s not it at all. In fact, if you adopt more effective strategies at any age, you’re going to find that your learning is more robust, it’s more successful, it falls into place.

You can learn these strategies at any age. Successful lifelong learning is getting these effective strategies in place, trusting them, and having them become a habit for how you’re going to approach your learning challenges.


Effective Learning Practices

Learning at college requires processing and retaining a high volume of information across various disciplines and subjects at the same time, which can be a daunting task, especially if the information is brand new. In response, college students try out varied approaches to their learning – often drawing from their high school experiences and modeling what they see their peers doing. While it’s great to try different styles and approaches to learning and studying for your courses, it's smart to incorporate into your daily habits some learning practices that are backed up by current research. 

Below are some effective learning practices suggested by research in the cognitive and learning sciences:

Take ownership of your educational experience.

As an engaged learner, it is important to take an active, self-directed role in your academic experience. Taking agency might feel new to you. In high school, you might have felt like you had little control over your learning experience, so transitioning to an environment where you are implicitly expected to be in the driver’s seat can be disorienting. 

A shift in your mindset regarding your agency, however, can make a big difference in your ability to learn effectively and get the results you want out of your courses.  

Here are four concrete actions you can take to assert ownership over your education :

  • Attend office hours . Come prepared with questions for your instructor about lectures, readings, or other aspects of the course. 
  • Schedule meetings with administrators and faculty  to discuss your academic trajectory and educational goals. You might meet with your academic adviser, course heads, or the Director of Undergraduate Studies (DUS) in your concentration.
  • Identify areas for growth and development  based on your academic goals. Then, explore opportunities to shape and further refine your skills in those areas.
  • Advocate  for support, tools, equipment, or considerations that address your learning needs.

Seek out opportunities for active learning.

Many courses include opportunities for active and engaged learning within their structure. Take advantage of those opportunities in order to enhance your understanding of the material. If such opportunities are not built into the course structure, you can develop your own active learning strategies, including joining study groups and using other active studying techniques. Anytime you grapple actively with your course material, rather than taking it in passively, you’re engaging in active learning. By doing so, you are increasing your retention of key course concepts.

One particularly effective way to help yourself stay focused and engaged in the learning process is to cultivate learning communities, such as accountability groups and study groups. Working in the company of other engaged learners can help remind you why you love learning or why you chose a particular course, concentration, research project, or field of study. Those reminders can re-energize and refocus your efforts. 

Practice study strategies that promote deep learning.

In an attempt to keep up with the demands of college, many students learn concepts just in time for assessment benchmarks (tests, exams, and quizzes). The problem with this methodology is that, for many disciplines (and especially in STEM), the concepts build on one another. Students survive the course only to be met at the final with concepts from the first quiz that they have forgotten long ago. This is why deep learning is important. Deep learning occurs when students use study strategies that ensure course ideas and concepts are embedded into long-term, rather than just short-term, memory. Building your study plans and review sessions in a way that helps create a conceptual framing of the material will serve you now and in the long run. 

Here are some study strategies that promote deep learning: 

Concept Mapping : A concept map is a visualization of knowledge organized by the relationships between topics. At its core, it consists of concepts connected by lines (or arrows) labeled with the relationship between them.
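A concept map like the one described here can be sketched as a tiny labeled graph. This is an illustrative sketch, not part of the original guide; the concepts and relationship labels are invented examples.

```python
class ConceptMap:
    """Concepts connected by labeled links, stored as (source, label, target) triples."""

    def __init__(self):
        self.edges = []

    def connect(self, source, label, target):
        self.edges.append((source, label, target))

    def neighbors(self, concept):
        """Everything linked to `concept`, with the relationship on each link."""
        out = [(label, tgt) for src, label, tgt in self.edges if src == concept]
        out += [(label, src) for src, label, tgt in self.edges if tgt == concept]
        return out


cmap = ConceptMap()
cmap.connect("mitochondria", "produce", "ATP")
cmap.connect("ATP", "powers", "muscle contraction")
print(cmap.neighbors("ATP"))
# [('powers', 'muscle contraction'), ('produce', 'mitochondria')]
```

Sketching even this much by hand (or in code) forces you to name the relationships, which is where most of the learning benefit of concept mapping comes from.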

Collaboration : You don’t have to go it alone. In fact, research on learning suggests that it’s best not to. Using study groups, ARC accountability hours, office hours, question centers, and other opportunities to engage with your peers helps you not only test your understanding but also learn different approaches to tackling the material.

Self-test : Quiz yourself about the material you need to know with your notes put away. Refamiliarize yourself with the answers to questions you get wrong, wait a few hours, and then try asking yourself again. Use practice tests provided by your courses or use free apps to create quizzes for yourself.
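The quiz-restudy-retry loop described above can be sketched in a few lines. This is an illustrative sketch only; the cards and the `recall` dictionary are invented stand-ins for your own memory.

```python
def quiz_round(cards, recall):
    """Return the cards you got wrong; restudy these, wait a few hours, then retry.

    cards: list of (question, answer) pairs
    recall: dict mapping questions to what you currently remember
    """
    return [(q, a) for q, a in cards if recall.get(q) != a]


cards = [("capital of France", "Paris"), ("7 x 8", "56")]
recall = {"capital of France": "Paris"}   # the multiplication fact was forgotten
missed = quiz_round(cards, recall)        # [('7 x 8', '56')] -> restudy this one
recall["7 x 8"] = "56"                    # after restudying and waiting a while...
assert quiz_round(cards, recall) == []    # ...the second pass comes back clean
```

The key property mirrored here is that only the missed items come back in the next pass, which is exactly how flashcard apps implement self-testing.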

Create a connection : As you try to understand how all the concepts and ideas from your course fit together, try to associate new information with something you already know. Making connections can help you create a more holistic picture of the material you’re learning. 

Teach someone (even yourself!) : Try teaching someone the concept you’re trying to remember. You can even try to talk to yourself about it! Vocalizing helps activate different sensory processes, which can enhance memory and help you embed concepts more deeply.

Interleave : We often think we’ll do best if we study one subject for long periods of time, but research contradicts this. Try to work with smaller units of time (a half-hour to an hour) and switch up your subjects. Return to concepts you studied earlier at intervals to ensure you learned them sufficiently.
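The interleaving advice above, i.e., short blocks rotating through subjects rather than one long massed session, amounts to a round-robin schedule. This is an illustrative sketch; the subject names are invented examples.

```python
from itertools import cycle, islice


def interleaved_schedule(subjects, blocks):
    """Fill `blocks` short study slots by rotating through the subjects round-robin."""
    return list(islice(cycle(subjects), blocks))


print(interleaved_schedule(["chemistry", "statistics", "history"], 6))
# ['chemistry', 'statistics', 'history', 'chemistry', 'statistics', 'history']
```

Each slot might be a half-hour block; returning to a subject after a gap is what makes interleaving more effective than studying it in one uninterrupted stretch.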

Be intentional about getting started and avoiding procrastination.

When students struggle to complete tasks and projects, procrastination is rarely a matter of laziness; more often it stems from the anxiety and negative emotions that accompany starting the task. Understanding what conditions promote or derail your intention to begin a task can help you avoid procrastinating.

Consider the following tips for getting started: 

Eat the Frog : The frog is that one thing on your to-do list that you have absolutely no motivation to do and are most likely to procrastinate on. Eating the frog means doing that task first and getting it over with. If you don’t, odds are that you’ll procrastinate all day. With that one task done, you will experience a sense of accomplishment at the beginning of your day and gain momentum that will help you move through the rest of your tasks.

Pomodoro Technique : Sometimes, we can procrastinate because we’re overwhelmed by the sheer amount of time we expect it will take to complete a task. But, while it might feel hard to sit down for several hours to work on something, most of us feel we can easily work for a half hour on almost any task. Enter the Pomodoro Technique! When faced with any large task or series of tasks, break the work down into short, timed intervals (25 minutes or so) that are spaced out by short breaks (5 minutes). Working in short intervals trains your brain to focus for manageable periods of time and helps you stay on top of deadlines. With time, the Pomodoro Technique can even help improve your attention span and concentration. Pomodoro is a cyclical system. You work in short sprints, which makes sure you’re consistently productive. You also get to take regular breaks that bolster your motivation and get you ready for your next pomodoro.
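The work-break cycle described above can be sketched as a small timer loop. This is an illustrative sketch, not a real timer app; the `wait` parameter is an assumption added here so the loop can be exercised without actually sleeping.

```python
import time


def pomodoro(work_min=25, break_min=5, cycles=4, wait=time.sleep):
    """Run `cycles` work sprints separated by short breaks; return the schedule run."""
    log = []
    for i in range(1, cycles + 1):
        log.append(("work", work_min))
        wait(work_min * 60)           # focus sprint
        if i < cycles:
            log.append(("break", break_min))
            wait(break_min * 60)      # short recovery break
    return log


# Dry run (no real waiting) just to see the schedule:
print(pomodoro(cycles=2, wait=lambda seconds: None))
# [('work', 25), ('break', 5), ('work', 25)]
```

Any kitchen timer or phone alarm implements the same loop; the point is the fixed short sprint, not the tooling.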

Distraction Pads : Sometimes we stop a task that took us a lot of time to get started on because we get distracted by something else. To avoid this, have a notepad beside you while working, and every time you get distracted with a thought, write it down, then push it aside for later. Distracting thoughts can be anything from remembering that you still have another assignment to complete to daydreaming about your next meal. Later on in the day, when you have some free time, you can review your distraction pad to see if any of those thoughts are important and need to be addressed.

Online Apps : It can be hard to rely on our own force of will to get ourselves to start a task, so consider using an external support. There are many self-control apps available for free online (search for "self-control apps"). Check out a few and decide on one that seems most likely to help you eliminate the distractions that can get in the way of starting and completing your work. 

Engage in metacognition.

An effective skill for learning is metacognition. Metacognition is the process of “thinking about thinking” or reflecting on personal habits, knowledge, and approaches to learning. Engaging in metacognition enables students to become aware of what they need to do to initiate and persist in tasks, to evaluate their own learning strategies, and to invest adequate mental effort to succeed. When students work at being aware of their own thinking and learning, they are more likely to recognize patterns and to intentionally transfer knowledge and skills to solve increasingly complex problems. They also develop a greater sense of self-efficacy.

Mentally checking in with yourself while you study is a great metacognitive technique for assessing your level of understanding. Asking lots of “why,” “how,” and “what” questions about the material you’re reviewing helps you to be reflective about your learning and to strategize about how to tackle tricky material. If you know something, you should be able to explain to yourself how you know it. If you don’t know something, you should start by identifying exactly what you don’t know and determining how you can find the answer.

Metacognition is important in helping us overcome illusions of competence (our brain’s natural inclination to think that we know more than we actually know). All too often students don’t discover what they really know until they take a test. Metacognition helps you be a better judge of how well you understand your course material, which then enables you to refine your approach to studying and better prepare for tests.

  • Open access
  • Published: 15 March 2021

Instructor strategies to aid implementation of active learning: a systematic literature review

  • Kevin A. Nguyen 1 ,
  • Maura Borrego 2 ,
  • Cynthia J. Finelli   ORCID: orcid.org/0000-0001-9148-1492 3 ,
  • Matt DeMonbrun 4 ,
  • Caroline Crockett 3 ,
  • Sneha Tharayil 2 ,
  • Prateek Shekhar 5 ,
  • Cynthia Waters 6 &
  • Robyn Rosenberg 7  

International Journal of STEM Education volume  8 , Article number:  9 ( 2021 ) Cite this article

23k Accesses

36 Citations

5 Altmetric

Metrics details

Despite the evidence supporting the effectiveness of active learning in undergraduate STEM courses, the adoption of active learning has been slow. One barrier to adoption is instructors’ concerns about students’ affective and behavioral responses to active learning, especially student resistance. Numerous education researchers have documented their use of active learning in STEM classrooms. However, no research yet systematically analyzes these studies for strategies to aid implementation of active learning and address students’ affective and behavioral responses. In this paper, we conduct a systematic literature review and identify 29 journal articles and conference papers that researched active learning, affective and behavioral student responses, and recommended at least one strategy for implementing active learning. We ask: (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide?

In our review, we noted that most active learning activities involved in-class problem solving within a traditional lecture-based course ( N = 21). We found mostly positive affective and behavioral outcomes for students’ self-reports of learning, participation in the activities, and course satisfaction ( N = 23). From our analysis of the 29 studies, we identified eight strategies to aid implementation of active learning based on three categories. Explanation strategies included providing students with clarifications and reasons for using active learning. Facilitation strategies entailed working with students and ensuring that the activity functions as intended. Planning strategies involved working outside of the class to improve the active learning experience.

To increase the adoption of active learning and address students’ responses to active learning, this study provides strategies to support instructors. The eight strategies are listed with evidence from numerous studies within our review on affective and behavioral responses to active learning. Future work should examine instructor strategies and their connection with other affective outcomes, such as identity, interests, and emotions.

Introduction

Prior reviews have established the effectiveness of active learning in undergraduate science, technology, engineering, and math (STEM) courses (e.g., Freeman et al., 2014 ; Lund & Stains, 2015 ; Theobald et al., 2020 ). In this review, we define active learning as classroom-based activities designed to engage students in their learning through answering questions, solving problems, discussing content, or teaching others, individually or in groups (Prince & Felder, 2007 ; Smith, Sheppard, Johnson, & Johnson, 2005 ), and this definition is inclusive of research-based instructional strategies (RBIS, e.g., Dancy, Henderson, & Turpen, 2016 ) and evidence-based instructional practices (EBIPs, e.g., Stains & Vickrey, 2017 ). Past studies show that students perceive active learning as benefitting their learning (Machemer & Crawford, 2007 ; Patrick, Howell, & Wischusen, 2016 ) and increasing their self-efficacy (Stump, Husman, & Corby, 2014 ). Furthermore, the use of active learning in STEM fields has been linked to improvements in student retention and learning, particularly among students from some underrepresented groups (Chi & Wylie, 2014 ; Freeman et al., 2014 ; Prince, 2004 ).

Despite the overwhelming evidence in support of active learning (e.g., Freeman et al., 2014 ), prior research has found that traditional teaching methods such as lecturing are still the dominant mode of instruction in undergraduate STEM courses, and low adoption rates of active learning in undergraduate STEM courses remain a problem (Hora & Ferrare, 2013 ; Stains et al., 2018 ). There are several reasons for these low adoption rates. Some instructors feel unconvinced that the effort required to implement active learning is worthwhile, and as many as 75% of instructors who have attempted specific types of active learning abandon the practice altogether (Froyd, Borrego, Cutler, Henderson, & Prince, 2013 ).

When asked directly about the barriers to adopting active learning, instructors cite a common set of concerns including the lack of preparation or class time (Finelli, Daly, & Richardson, 2014 ; Froyd et al., 2013 ; Henderson & Dancy, 2007 ). Among these concerns, student resistance to active learning is a potential explanation for the low rates of instructor persistence with active learning, and this negative response to active learning has gained increased attention from the academic community (e.g., Owens et al., 2020 ). Of course, students can exhibit both positive and negative responses to active learning (Carlson & Winquist, 2011 ; Henderson, Khan, & Dancy, 2018 ; Oakley, Hanna, Kuzmyn, & Felder, 2007 ), but due to the barrier student resistance can present to instructors, we focus here on negative student responses. Student resistance to active learning may manifest, for example, as lack of student participation and engagement with in-class activities, declining attendance, or poor course evaluations and enrollments (Tolman, Kremling, & Tagg, 2016 ; Winkler & Rybnikova, 2019 ).

We define student resistance to active learning (SRAL) as a negative affective or behavioral student response to active learning (DeMonbrun et al., 2017 ; Weimer, 2002 ; Winkler & Rybnikova, 2019 ). The affective domain, as it relates to active learning, encompasses not only student satisfaction and perceptions of learning but also motivation-related constructs such as value, self-efficacy, and belonging. The behavioral domain relates to participation, putting forth a good effort, and attending class. The affective and behavioral domains differ from much of the prior research on active learning that centers on measuring cognitive gains in student learning, and systematic reviews are readily available on this topic (e.g., Freeman et al., 2014 ; Theobald et al., 2020 ). Schmidt, Rosenberg, and Beymer ( 2018 ) explain the relationship between affective, cognitive, and behavioral domains, asserting all three types of engagement are necessary for science learning, and conclude that “students are unlikely to exert a high degree of behavioral engagement during science learning tasks if they do not also engage deeply with the content affectively and cognitively” (p. 35). Thus, SRAL, as a negative affective and behavioral student response, is a critical but underexplored component of STEM learning.

Recent research on student affective and behavioral responses to active learning has uncovered mechanisms of student resistance. Deslauriers, McCarty, Miller, Callaghan, and Kestin’s ( 2019 ) interviews of physics students revealed that the additional effort required by the novel format of an interactive lecture was the primary source of student resistance. Owens et al. ( 2020 ) identified a similar source of student resistance, which was to their carefully designed biology active learning intervention. Students were concerned about the additional effort required and the unfamiliar student-centered format. Deslauriers et al. ( 2019 ) and Owens et al. ( 2020 ) go a step further in citing self-efficacy (Bandura, 1982 ), mindset (Dweck & Leggett, 1988 ), and student engagement (Kuh, 2005 ) literature to explain student resistance. Similarly, Shekhar et al.’s ( 2020 ) review framed negative student responses to active learning in terms of expectancy-value theory (Wigfield & Eccles, 2000 ); students reacted negatively when they did not find active learning useful or worth the time and effort, or when they did not feel competent enough to complete the activities. Shekhar et al. ( 2020 ) also applied expectancy violation theory from physics education research (Gaffney, Gaffney, & Beichner, 2010 ) to explain how students’ initial expectations of a traditional course produced discomfort during active learning activities. To address both theories of student resistance, Shekhar et al. ( 2020 ) suggested that instructors provide scaffolding (Vygotsky, 1978 ) and support for self-directed learning activities. So, while framing the research as SRAL is relatively new, ideas about working with students to actively engage them in their learning are not. Prior literature on active learning in STEM undergraduate settings includes clues and evidence about strategies instructors can employ to reduce SRAL, even if they are not necessarily framed by the authors as such.

Interest in student affective and behavioral responses to active learning, including SRAL, is a relatively new development. But, given the discipline-based educational research (DBER) knowledge base around RBIS and EBIP adoption, we need not reinvent the wheel. In this paper, we conduct a systematic review. Systematic reviews are designed to methodically gather and synthesize results from multiple studies to provide a clear overview of a topic, presenting what is known and what is not known (Borrego, Foster, & Froyd, 2014 ). Such clarity informs decisions when designing or funding future research, interventions, and programs. Relevant studies for this paper are scattered across STEM disciplines and in DBER and general education venues, which include journals and conference proceedings. Quantitative, qualitative, and mixed methods approaches have been used to understand student affective and behavioral responses to active learning. Thus, a systematic review is appropriate for this topic given the long history of research on the development of RBIS, EBIPs, and active learning in STEM education; the distribution of primary studies across fields and formats; and the different methods taken to evaluate students’ affective and behavioral responses.

Specifically, we conducted a systematic review to address two interrelated research questions. (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies ? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide ? These two questions are linked by our goal of sharing instructor strategies that can either reduce SRAL or encourage positive student affective and behavioral responses. Therefore, the instructor strategies in this review are only from studies that present empirical data of affective and behavioral student response to active learning. The strategies we identify in this review will not be surprising to highly experienced teaching and learning practitioners or researchers. However, this review does provide an important link between these strategies and student resistance, which remains one of the most feared barriers to instructor adoption of RBIS, EBIPs, and other forms of active learning.

Conceptual framework: instructor strategies to reduce resistance

Recent research has identified specific instructor strategies that correlate with reduced SRAL and positive student response in undergraduate STEM education (Finelli et al., 2018 ; Nguyen et al., 2017 ; Tharayil et al., 2018 ). For example, Deslauriers et al. ( 2019 ) suggested that physics students perceive the additional effort required by active learning to be evidence of less effective learning. To address this, the authors included a 20-min lecture about active learning in a subsequent course offering. By the end of that course, 65% of students reported increased enthusiasm for active learning, and 75% said the lecture intervention positively impacted their attitudes toward active learning. Explaining how active learning activities contribute to student learning is just one of many strategies instructors can employ to reduce SRAL (Tharayil et al., 2018 ).

DeMonbrun et al. ( 2017 ) provided a conceptual framework for differentiating instructor strategies which includes not only an explanation type of instructor strategies (e.g., Deslauriers et al., 2019 ; Tharayil et al., 2018 ) but also a facilitation type of instructor strategies. Explanation strategies involve describing the purpose (such as how the activity relates to students’ learning) and expectations of the activity to students. Typically, instructors use explanation strategies before the in-class activity has begun. Facilitation strategies include promoting engagement and keeping the activity running smoothly once the activity has already begun, and some specific strategies include walking around the classroom or directly encouraging students. We use the existing categories of explanation and facilitation as a conceptual framework to guide our analysis and systematic review.

As a conceptual framework, explanation and facilitation strategies describe ways to aid the implementation of RBIS, EBIP, and other types of active learning. In fact, the work on these types of instructor strategies is related to higher education faculty development, implementation, and institutional change research perspectives (e.g., Borrego, Cutler, Prince, Henderson, & Froyd, 2013 ; Henderson, Beach, & Finkelstein, 2011 ; Kezar, Gehrke, & Elrod, 2015 ). As such, the specific types of strategies reviewed here are geared to assist instructors in moving toward more student-centered teaching methods by addressing their concerns of student resistance.

SRAL is a particular negative form of affective or behavioral student response (DeMonbrun et al., 2017 ; Weimer, 2002 ; Winkler & Rybnikova, 2019 ). Affective and behavioral student responses are conceptualized at the reactionary level (Kirkpatrick, 1976 ) of outcomes, which consists of how students feel (affective) and how they conduct themselves within the course (behavioral). Although affective and behavioral student responses to active learning are less frequently reported than cognitive outcomes, prior research suggests a few conceptual constructs within these outcomes.

Affective outcomes consist of any students’ feelings, preferences, and satisfaction with the course. Affective outcomes also include students’ self-reports of whether they thought they learned more (or less) during active learning instruction. Some relevant affective outcomes include students’ perceived value or utility of active learning (Shekhar et al., 2020 ; Wigfield & Eccles, 2000 ), their positivity toward or enjoyment of the activities (DeMonbrun et al., 2017 ; Finelli et al., 2018 ), and their self-efficacy or confidence with doing the in-class activity (Bandura, 1982 ).

In contrast, students’ behavioral responses to active learning consist of their actions and practices during active learning. This includes students’ attendance in the class, their participation , engagement, and effort with the activity, and students’ distraction or off-task behavior (e.g., checking their phones, leaving to use the restroom) during the activity (DeMonbrun et al., 2017 ; Finelli et al., 2018 ; Winkler & Rybnikova, 2019 ).

We conceptualize negative or low scores in either affective or behavioral student outcomes as an indicator of SRAL (DeMonbrun et al., 2017 ; Nguyen et al., 2017 ). For example, a low score in reported course satisfaction would be an example of SRAL. This paper aims to synthesize instructor strategies to aid implementation of active learning from studies that either address SRAL and its negative or low scores or relate instructor strategies to positive or high scores. Therefore, we also conceptualize positive student affective and behavioral outcomes as the absence of SRAL. For ease of categorization in this review, we summarize studies’ affective and behavioral outcomes on active learning as positive , mostly positive , mixed/neutral , mostly negative , or negative .

We conducted a systematic literature review (Borrego et al., 2014 ; Gough, Oliver, & Thomas, 2017 ; Petticrew & Roberts, 2006 ) to identify primary research studies that describe active learning interventions in undergraduate STEM courses, recommend one or more strategies to aid implementation of active learning, and report student response outcomes to active learning.

A systematic review was warranted due to the popularity of active learning and the publication of numerous papers on the topic. Multiple STEM disciplines and research audiences have published journal articles and conference papers on the topic of active learning in the undergraduate STEM classroom. However, it was not immediately clear which studies addressed active learning, affective and behavioral student responses, and strategies to aid implementation of active learning. We used the systematic review process to efficiently gather results of multiple types of studies and create a clear overview of our topic.

Definitions

For clarity, we define several terms in this review. Researchers refer to us, the authors of this manuscript. Authors and instructors wrote the primary studies we reviewed, and we refer to these primary studies as “studies” consistently throughout. We use the term activity or activities to refer to the specific in-class active learning tasks assigned to students. Strategies refer to the instructor strategies used to aid implementation of active learning and address student resistance to active learning (SRAL). Student response includes affective and behavioral responses and outcomes related to active learning. SRAL is an acronym for student resistance to active learning, defined here as a negative affective or behavioral student response. Categories or category refer to a grouping of strategies to aid implementation of active learning, such as explanation or facilitation. Excerpts are quotes from studies, and these excerpts are used as codes and examples of specific strategies.

Study timeline, data collection, and sample selection

From 2015 to 2016, we worked with a research librarian to locate relevant studies and conduct a keyword search within six databases: two multidisciplinary databases (Web of Science and Academic Search Complete), two major engineering and technology indexes (Compendex and Inspec), and two popular education databases (Education Source and the Education Resources Information Center). We created inclusion criteria that listed both search strings and study requirements:

Studies must include an in-class active learning intervention. This does not include laboratory classes. The corresponding search string was:

“active learning” or “peer-to-peer” or “small group work” or “problem based learning” or “problem-based learning” or “problem-oriented learning” or “project-based learning” or “project based learning” or “peer instruction” or “inquiry learning” or “cooperative learning” or “collaborative learning” or “student response system” or “personal response system” or “just-in-time teaching” or “just in time teaching” or clickers

Studies must include empirical evidence addressing student response to the active learning intervention. The corresponding search string was:

“affective outcome” or “affective response” or “class evaluation” or “course evaluation” or “student attitudes” or “student behaviors” or “student evaluation” or “student feedback” or “student perception” or “student resistance” or “student response”

Studies must describe a STEM course, as defined by the topic of the course, rather than by the department of the course or the major of the students enrolled (e.g., a business class for mathematics majors would not be included, but a mathematics class for business majors would).

Studies must be conducted in undergraduate courses and must not include K-12, vocational, or graduate education.

Studies must be in English and published between 1990 and 2015 as journal articles or conference papers.
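The two search strings above each OR together synonyms for one inclusion criterion. A hedged sketch of how such groups might be combined into a single database query, assuming the groups are joined with AND (the exact query syntax varies by database, and the term lists here are abbreviated):

```python
def or_group(terms):
    """Join synonym terms for one criterion into a quoted OR group."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"


def build_query(activity_terms, response_terms):
    """OR the synonyms within each criterion, then AND the two criteria together."""
    return f"{or_group(activity_terms)} AND {or_group(response_terms)}"


print(build_query(["active learning", "peer instruction"],
                  ["student resistance", "student response"]))
# ("active learning" OR "peer instruction") AND ("student resistance" OR "student response")
```

Requiring a match in both groups is what restricts results to studies that address an active learning intervention and student response together.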

In addition to searching the six databases, we emailed solicitations to U.S. National Science Foundation Improving Undergraduate STEM Education (NSF IUSE) grantees. Between the database searches and email solicitation, we identified 2364 studies after removing duplicates. Most studies were from the database search, as we received just 92 studies from email solicitation (Fig. 1 ).

Figure 1. PRISMA screening overview styled after Liberati et al. ( 2009 ) and Passow and Passow ( 2017 )

Next, we followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines for screening studies with our inclusion criteria (Borrego et al., 2014 ; Petticrew & Roberts, 2006 ). From 2016 to 2018, a team of seven researchers conducted two rounds of review in Refworks: the first round with only titles and abstracts and the second round with the entire full-text. In both rounds, two researchers independently decided whether each study should be retained based on our inclusion criteria listed above. At the abstract review stage, if there was a disagreement between independent coders, we decided to pass the study on to the full text screening round. We screened a total of 2364 abstracts, and only 746 studies passed the first round of title and abstract verification (see PRISMA flow chart on Fig. 1 ). If there was still a disagreement between independent coders at the full text screening round, then the seven researchers met and discussed the study, clarified the inclusion criteria as needed to resolve potential future disagreements, and when necessary, took a majority vote (4 out of the 7 researchers) on the inclusion of the study. Due to the high number of coders, it was unusual to reach full consensus with all 7 coders, so a majority vote was used to finalize the inclusion of certain studies. We resolved these disagreements on a rolling basis, and depending on the round (abstract or full text), we disagreed about 10–15% of the time on the inclusion of a study. In both the first and second round of screening, studies were often excluded because they did not gather novel empirical data or evidence (inclusion criteria #2) or were not in an undergraduate STEM course (inclusion criteria #3 and #4). Only 412 studies met all our final inclusion criteria.

Coding procedure

From 2017 to 2018, a team of five researchers then coded these 412 studies for detailed information. To quickly gather information about all 412 studies and to answer the first part of our research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we developed an online coding form using Google Forms and Google Sheets. The five researchers piloted and refined the coding form over three rounds of pair coding, and 19 studies were used to test and revise early versions of the coding form. The final coding form (Borrego et al., 2018 ) used a mix of multiple choice and free response items regarding study characteristics (bibliographic information, type of publication, location of study), course characteristics (discipline, course level, number of students sampled, and type of active learning), methodology (main type of evidence collected, sample size, and analysis methods), study findings (types of student responses and outcomes), and strategies reported (whether the study explicitly mentioned using strategies to aid implementation of active learning).

In the end, only 29 studies explicitly described strategies to aid implementation of active learning (Fig. 1 ), and we used these 29 studies as the dataset for this study. The main difference between these 29 studies and the other 383 studies was that these 29 studies explicitly described the ways authors implemented active learning in their courses to address SRAL or positive student outcomes. Although some readers who are experienced active learning instructors or educational researchers may view pedagogies and strategies as integrated, we found that most papers described active learning methods in terms of student tasks, while advice on strategies, if included, tended to appear separately. We chose not to overinterpret passing mentions of how active learning was implemented as strategies recommended by the authors.

Analysis procedure for coding strategies

To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we closely reviewed the 29 studies to analyze the strategies in more detail. We used Boyatzis’s ( 1998 ) thematic analysis technique to compile all mentions of instructor strategies to aid implementation of active learning and categorize these excerpts into certain strategies. This technique uses both deductive and inductive coding processes (Creswell & Creswell, 2017 ; Jesiek, Mazzurco, Buswell, & Thompson, 2018 ).

In 2018, three researchers reread the 29 studies, independently marking excerpts related to strategies. We found a total of 126 excerpts. The number of excerpts per study ranged from 1 to 14 ( M = 4, SD = 3). We then pasted each excerpt into its own row in a Google Sheet. We examined the entire spreadsheet as a team and grouped similar excerpts together using a deductive coding process. We used the explanation and facilitation conceptual framework (DeMonbrun et al., 2017 ) and placed each excerpt into either category. We also assigned each excerpt a specific strategy from the framework (e.g., describing the purpose of the activity, or encouraging students).
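
The per-study descriptive statistics reported above (range, mean, and standard deviation of excerpt counts) can be computed directly from the coding spreadsheet. A minimal sketch using Python's standard library; the counts below are hypothetical placeholders, since the paper does not list the 29 individual per-study values:

```python
from statistics import mean, stdev

# Hypothetical per-study excerpt counts (illustrative only; the real
# 29 values are not reported individually in the paper)
excerpt_counts = [1, 14, 4, 3, 2, 5, 7, 1, 3, 4]

total = sum(excerpt_counts)
low, high = min(excerpt_counts), max(excerpt_counts)

print(f"total excerpts: {total}")
print(f"range: {low} to {high}")
print(f"M = {mean(excerpt_counts):.1f}, SD = {stdev(excerpt_counts):.1f}")
```

With the full spreadsheet in place of the placeholder list, the same three lines of arithmetic reproduce the reported total of 126 excerpts, the 1–14 range, and the rounded M and SD.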

However, there were multiple excerpts that did not easily match either category; we set these aside for the inductive coding process. We then reviewed all excerpts without a category and suggested the creation of a new third category, called planning . We based this new category on the idea that the existing explanation and facilitation conceptual framework did not capture strategies that occurred outside of the classroom. We discuss the specific strategies within the planning category in the Results. With a new category in hand, we created a preliminary codebook consisting of explanation, facilitation, and planning categories, and their respective specific strategies.

We then passed the spreadsheet and preliminary codebook to another researcher who had not previously seen the excerpts. This second researcher assigned categories and strategies to all the excerpts without seeing the suggestions of the initial three researchers, and created new strategies and codes when a specific strategy was not present in the preliminary codebook; all of these new strategies and codes fell within the planning category. The second researcher agreed with the initially assigned categories and implementation strategies for 71% of the total excerpts. A researcher from the initial strategies coding met with the second researcher and discussed all disagreements. These disagreements (29% of excerpts) arose from the specific strategies within the new third category, planning: since the second researcher created new planning strategies, these codes counted as disagreements by default. The two researchers resolved the disagreements by finalizing a codebook combining the full list of planning strategies with the previous explanation and facilitation strategies. Finally, they coded the excerpts with the final codebook, this time working together in shared coding sessions; any disagreements were immediately resolved through discussion and updating of final strategy codes. In the end, all 126 excerpts were coded and kept.
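
The 71% figure above is simple percent agreement between two independent coders: the share of excerpts assigned the same code by both. A minimal sketch with hypothetical codes (not the study's actual data):

```python
# Percent agreement between two independent coders.
# The code assignments below are hypothetical illustrations.
coder_a = ["planning", "facilitation", "explanation", "planning",
           "facilitation", "planning", "explanation"]
coder_b = ["planning", "facilitation", "explanation", "planning",
           "planning", "facilitation", "explanation"]

# Count excerpts where both coders assigned the same code
agreements = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = agreements / len(coder_a) * 100
print(f"{percent_agreement:.0f}% agreement")  # 5 of 7 match -> 71%
```

Percent agreement is the simplest inter-rater reliability measure; chance-corrected statistics such as Cohen's kappa are a common alternative, but the paper reports raw agreement.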

Characteristics of the primary studies

To answer our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we report the results from our coding and systematic review process. We discuss characteristics of studies within our dataset below and in Table 1 .

Type of publication and research audience

Of the 29 studies, 11 studies were published in conference proceedings, while the remaining 18 studies were journal articles. Examples of journals included the European Journal of Engineering Education , Journal of College Science Teaching , and PRIMUS (Problems, Resources, and Issues in Mathematics Undergraduate Studies).

In terms of research audiences and perspectives, both US and international views were represented. Eighteen studies were from North America, two were from Australia, three were from Asia, and six were from Europe. For more details about the type of research publications, full bibliographic information for all 29 studies is included in the Appendix.

Types of courses sampled

Studies sampled different types of undergraduate STEM courses. In terms of course year, most studies sampled first-year courses (13 studies). All four course years were represented (4 second-year, 3 third-year, 2 fourth-year, 7 not reported). Regarding course discipline or major, all major STEM education disciplines were represented. Fourteen studies were conducted in engineering courses, and most major engineering subdisciplines were represented, such as electrical and computer engineering (4 studies), mechanical engineering (3 studies), general engineering courses (3 studies), chemical engineering (2 studies), and civil engineering (1 study). Thirteen studies were conducted in science courses (3 physics/astronomy, 7 biology, 3 chemistry), and 2 studies were conducted in mathematics or statistics courses.

For teaching methods, most studies sampled traditional courses that were primarily lecture-based but included some in-class activities. The most common activity was giving class time for students to do problem solving (PS) (21 studies). Students were instructed to do problem solving either in groups (16 studies) or individually (5 studies), and sometimes both in the same course. Project or problem-based learning (PBL) was the second most frequently reported activity with 8 studies, and the implementation of this teaching method ranged from end-of-term final projects to an entire project- or problem-based course. The third most common activity was using clickers (4 studies) or having class discussions (4 studies).

Research design, methods, and outcomes

The 29 studies used quantitative (10 studies), qualitative (6 studies), or mixed methods (13 studies) research designs. Most studies contained self-made instructor surveys (IS) as their main source of evidence (20 studies). In contrast, only 2 studies used survey instruments with evidence of validity (IEV). Other forms of data collection included using institutions’ end of course evaluations (EOC) (10 studies), observations (5 studies), and interviews (4 studies).

Studies reported a variety of different measures for researching students’ affective and behavioral responses to active learning. The most common measure was students’ self-reports of learning (an affective outcome); twenty-one studies measured whether students thought they learned more or less due to the active learning intervention. Other common measures included whether students participated in the activities (16 studies, participation), whether they enjoyed the activities (15 studies, enjoyment), and if students were satisfied with the overall course experience (13 studies, course satisfaction). Most studies included more than one measure. Some studies also measured course attendance (4 studies) and students’ self-efficacy with the activities and relevant STEM disciplines (4 studies).

We found that 23 of the 29 studies reported positive or mostly positive outcomes for their students’ affective and behavioral responses to active learning. Only 5 studies reported mixed/neutral study outcomes, and only one study reported negative student response to active learning. We discuss the implications of this lack of negative study outcomes and reports of SRAL in our dataset in the “Discussion” section.

To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we provide descriptions, categories, and excerpts of specific strategies found within our systematic literature review.

Explanation strategies

Explanation strategies provide students with clarifications and reasons for using active learning (DeMonbrun et al., 2017 ). Within the explanation category, we identified two specific strategies: establish expectations and explain the purpose .

Establish expectations

Establishing expectations means setting the tone and routine for active learning at both the course and in-class activity level. Instructors can discuss expectations at the beginning of the semester, at the start of a class session, or right before the activity.

For establishing expectations at the beginning of the semester, studies provide specific ways to ensure students became familiar with active learning as early as possible. This included “introduc[ing] collaborative learning at the beginning of the academic term” (Herkert , 1997 , p. 450) and making sure that “project instructions and the data were posted fairly early in the semester, and the students were made aware that the project was an important part of their assessment” (Krishnan & Nalim, 2009 , p. 5).

McClanahan and McClanahan ( 2002 ) described the importance of explaining how the course will use active learning and purposely using the syllabus to do this:

Set the stage. Create the expectation that students will actively participate in this class. One way to accomplish that is to include a statement in your syllabus about your teaching strategies. For example: I will be using a variety of teaching strategies in this class. Some of these activities may require that you interact with me or other students in class. I hope you will find these methods interesting and engaging and that they enable you to be more successful in this course . In the syllabus, describe the specific learning activities you plan to conduct. These descriptions let the students know what to expect from you as well as what you expect from them (emphasis added, p. 93).

Early on, students see that the course is interactive, and they also see the activities required to be successful in the course.

These studies and excerpts demonstrate the importance of explaining to students how in-class activities relate to course expectations. Instructors using active learning should start the semester with clear expectations for how students should engage with activities.

Explain the purpose

Explaining the purpose includes offering students reasons why certain activities are being used and convincing them of the importance of participating.

One way that studies explained the purpose of the activities was by leveraging and showing assessment data on active learning. For example, Lenz ( 2015 ) dedicated class time to show current students comments from previous students:

I spend the first few weeks reminding them of the research and of the payoff that they will garner and being a very enthusiastic supporter of the [active learning teaching] method. I show them comments I have received from previous classes and I spend a lot of time selling the method (p. 294).

Providing current students comments from previous semesters may help students see the value of active learning. Lake ( 2001 ) also used data from prior course offerings to show students “the positive academic performance results seen in the previous use of active learning” on the first day of class (p. 899).

However, sharing the effectiveness of the activities does not have to be constrained to the beginning of the course. Autin et al. ( 2013 ) used mid-semester test data and comparisons to sell the continued use of active learning to their students. They said to students:

Based on your reflections, I can see that many of you are not comfortable with the format of this class. Many of you said that you would learn better from a traditional lecture. However, this class, as a whole, performed better on the test than my other [lecture] section did. Something seems to be working here (p. 946).

Showing students’ comparisons between active learning and traditional lecture classes is a powerful way to explain how active learning is a benefit to students.

Explaining the purpose of the activities by sharing course data with students appears to be a useful strategy, as it tells students why active learning is being used and convinces students that active learning is making a difference.

Facilitation strategies

Facilitation strategies ensure the continued engagement in the class activities once they have begun, and many of the specific strategies within this category involve working directly with students. We identified two strategies within the facilitation category: approach students and encourage students .

Approach students

Approaching students means engaging with students during the activity. This includes physical proximity and monitoring students, walking around the classroom, and providing students with additional feedback, clarifications, or questions about the activity.

Several studies described how instructors circulated around the classroom to check on the progress of students during an activity. Lenz ( 2015 ) stated this plainly in her study, “While the students work on these problems I walk around the room, listening to their discussions” (p. 284). Armbruster et al. ( 2009 ) described this strategy and noted positive student engagement, “During each group-work exercise the instructor would move throughout the classroom to monitor group progress, and it was rare to find a group that was not seriously engaged in the exercise” (p. 209). Haseeb ( 2011 ) combined moving around the room and approaching students with questions, and they stated, “The instructor moves around from one discussion group to another and listens to their discussions, ask[ing] provoking questions” (p. 276). Certain group-based activities worked better with this strategy, as McClanahan and McClanahan ( 2002 ) explained:

Breaking the class into smaller working groups frees the professor to walk around and interact with students more personally. He or she can respond to student questions, ask additional questions, or chat informally with students about the class (p. 94).

Approaching students not only helps facilitate the activity, but it provides a chance for the instructor to work with students more closely and receive feedback. Instructors walking around the classroom ensure that both the students and instructor continue to engage and participate with the activity.

Encourage students

Encouraging students includes creating a supportive classroom environment, motivating students to do the activity, building respect and rapport with students, demonstrating care, and having a positive demeanor toward students’ success.

Ramsier et al. ( 2003 ) provided a detailed explanation of the importance of building a supportive classroom environment:

Most of this success lies in the process of negotiation and the building of mutual respect within the class, and requires motivation, energy and enthusiasm on behalf of the instructor… Negotiation is the key to making all of this work, and building a sense of community and shared ownership. Learning students’ names is a challenge but a necessary part of our approach. Listening to student needs and wants with regard to test and homework due dates…projects and activities, etc. goes a long way to build the type of relationships within the class that we need in order to maintain and encourage performance (pp. 16–18).

Here, the authors described a few specific strategies for supporting a positive demeanor, such as learning students’ names and listening to student needs and wants, which helped maintain student performance in an active learning classroom.

Other ways to build a supportive classroom environment were for instructors to appear more approachable. For example, Bullard and Felder ( 2007 ) worked to “give the students a sense of their instructors as somewhat normal and approachable human beings and to help them start to develop a sense of community” (p. 5). As instructors and students become more comfortable working with each other, instructors can work toward easing “frustration and strong emotion among students and step by step develop the students’ acceptance [of active learning]” (Harun, Yusof, Jamaludin, & Hassan, 2012 , p. 234). In all, encouraging students and creating a supportive environment appear to be useful strategies to aid implementation of active learning.

Planning strategies

The planning category encompasses strategies that occur outside of class time, distinguishing it from the explanation and facilitation categories. Four strategies fall into this category: design appropriate activities , create group policies , align the course , and review student feedback .

Design appropriate activities

Many studies took into consideration the design of appropriate or suitable activities for their courses. This meant making sure the activity was suitable in terms of time, difficulty, and constraints of the course. Activities were designed to strike a balance between being too difficult and too simple, to be engaging, and to provide opportunities for students to participate.

Li et al. ( 2009 ) explained the importance of outside-of-class planning and considering appropriate projects: “The selection of the projects takes place in pre-course planning. The subjects for projects should be significant and manageable” (p. 491). Haseeb ( 2011 ) further emphasized a balance in design by discussing problems (within problem-based learning) between two parameters, “the problem is deliberately designed to be open-ended and vague in terms of technical details” (p. 275). Armbruster et al. ( 2009 ) expanded on the idea of balanced activities by connecting it to group-work and positive outcomes, and they stated, “The group exercises that elicited the most animated student participation were those that were sufficiently challenging that very few students could solve the problem individually, but at least 50% or more of the groups could solve the problem by working as a team” (p. 209).

Instructors should consider the design of activities outside of class time. Activities should be appropriately challenging but achievable for students, so that students remain engaged and participate with the activity during class time.

Create group policies

Creating group policies means considering rules when using group activities. This strategy is unique in that it directly addresses a specific subset of activities, group work. These policies included setting team sizes and assigning specific roles to group members.

Studies outlined a few specific approaches for assigning groups. For example, Ramsier et al. ( 2003 ) recommended frequently changing and randomizing groups: “When students enter the room on these days they sit in randomized groups of 3 to 4 students. Randomization helps to build a learning community atmosphere and eliminates cliques” (p. 4). Another strategy, in combination with frequent changing of groups, was to not allow students to select their own groups. Lehtovuori et al. ( 2013 ) used this to avoid problems of free-riding and group dysfunction:

For example, group division is an issue to be aware of...An easy and safe solution is to draw lots to assign the groups and to change them often. This way nobody needs to suffer from a dysfunctional group for too long. Popular practice that students self-organize into groups is not the best solution from the point of view of learning and teaching. Sometimes friendly relationships can complicate fair division of responsibility and work load in the group (p. 9).

Here, Lehtovuori et al. ( 2013 ) considered different types of group policies and concluded that frequently changing groups worked best for students. Kovac ( 1999 ) also described changing groups but assigned specific roles to individuals:

Students were divided into groups of four and assigned specific roles: manager, spokesperson, recorder, and strategy analyst. The roles were rotated from week to week. To alleviate complaints from students that they were "stuck in a bad group for the entire semester," the groups were changed after each of the two in-class exams (p. 121).

The use of four specific group roles is a potential group policy, and Kovac ( 1999 ) continued the trend of changing group members often.

Overall, these studies describe the importance of thinking about ways to implement group-based activities before enacting them during class, and they suggest that groups should be reconstituted frequently. Instructors using group activities should consider whether to use specific group member policies before implementing the activity in the classroom.

Align the course

Aligning the course emphasizes the importance of purposely connecting multiple parts of the course together. This strategy involves planning to ensure students are graded on their participation with the activities as well as considering the timing of the activities with respect to other aspects of the course.

Li et al. ( 2009 ) described aligning classroom tasks by discussing the importance of timing, and they wrote, “The coordination between the class lectures and the project phases is very important. If the project is assigned near the directly related lectures, students can instantiate class concepts almost immediately in the project and can apply the project experience in class” (p. 491).

Krishnan and Nalim ( 2009 ) aligned class activities with grades to motivate students and encourage participation: “The project was a component of the course counting for typically 10-15% of the total points for the course grade. Since the students were told about the project and that it carried a significant portion of their grade, they took the project seriously” (p. 4). McClanahan and McClanahan ( 2002 ) expanded on the idea of using grades to emphasize the importance of active learning to students:

Develop a grading policy that supports active learning. Active learning experiences that are important enough to do are important enough to be included as part of a student's grade…The class syllabus should describe your grading policy for active learning experiences and how those grades factor into the student's final grade. Clarify with the students that these points are not extra credit. These activities, just like exams, will be counted when grades are determined (p. 93).

Here, they suggest a clear grading policy that includes how activities will be assessed as part of students’ final grades.

de Justo and Delgado ( 2014 ) connected grading and assessment to learning and further suggested that reliance on exams may negatively impact student engagement:

Particular attention should be given to alignment between the course learning outcomes and assessment tasks. The tendency among faculty members to rely primarily on written examinations for assessment purposes should be overcome, because it may negatively affect students’ engagement in the course activities (p. 8).

Instructors should consider their overall assessment strategies, as overreliance on written exams could mean that students engage less with the activities.

When planning to use active learning, instructors should consider how activities are aligned with course content and students’ grades. Instructors should decide before active learning implementation whether class participation and engagement will be reflected in student grades and in the course syllabus.

Review student feedback

Reviewing student feedback includes both soliciting feedback about the activity and using that feedback to improve the course. This strategy can be an iterative process that occurs over several course offerings.

Many studies utilized student feedback to continuously revise and improve the course. For example, Metzger ( 2015 ) commented that “gathering and reviewing feedback from students can inform revisions of course design, implementation, and assessment strategies” (p. 8). Rockland et al. ( 2013 ) further described changing and improving the course in response to student feedback, “As a result of these discussions, the author made three changes to the course. This is the process of continuous improvement within a course” (p. 6).

Herkert ( 1997 ) also demonstrated the use of student feedback for improving the course over time: “Indeed, the [collaborative] learning techniques described herein have only gradually evolved over the past decade through a process of trial and error, supported by discussion with colleagues in various academic fields and helpful feedback from my students” (p. 459).

In addition to incorporating student feedback, McClanahan and McClanahan ( 2002 ) commented on how student feedback builds a stronger partnership with students, “Using student feedback to make improvements in the learning experience reinforces the notion that your class is a partnership and that you value your students’ ideas as a means to strengthen that partnership and create more successful learning” (p. 94). Making students aware that the instructor is soliciting and using feedback can help encourage and build rapport with students.

Instructors should review student feedback for continual and iterative course improvement. Much of the student feedback review occurs outside of class time, and it appears useful for instructors to solicit student feedback to guide changes to the course and build student rapport.

Summary of strategies

We list the appearance of strategies within studies in Table 1 in short-hand form. No study included all eight strategies. The studies that included the most strategies were Bullard and Felder ( 2007 ) with 7 strategies, Armbruster et al. ( 2009 ) with 5, and Lenz ( 2015 ) with 5. However, these three studies were exemplars, as most studies included only one or two strategies.

Table 2 presents a summary list of specific strategies, their categories, and descriptions. We also note the number of unique studies ( N ) and excerpts ( n ) that included the specific strategies. In total, there were eight specific strategies within three categories. Most strategies fell under the planning category ( N = 26), with align the course being the most reported strategy ( N = 14). Approaching students ( N = 13) and reviewing student feedback ( N = 11) were the second and third most common strategies, respectively. Overall, we present eight strategies to aid implementation of active learning.
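
Tallies like those in Table 2, where N counts unique studies and n counts excerpts, follow mechanically from the coded spreadsheet of (study, strategy) pairs. A minimal sketch using hypothetical pairs (the study names and counts below are illustrative, not the paper's data):

```python
from collections import Counter

# Hypothetical (study, strategy) rows from the coding spreadsheet;
# the real data were the 126 coded excerpts across 29 studies.
coded = [
    ("Bullard2007", "align the course"),
    ("Bullard2007", "approach students"),
    ("Lenz2015", "align the course"),
    ("Armbruster2009", "approach students"),
    ("Lenz2015", "approach students"),
    ("Lenz2015", "align the course"),  # repeat excerpt within one study
]

# n: total excerpts per strategy (one count per coded excerpt)
n_excerpts = Counter(strategy for _, strategy in coded)

# N: unique studies per strategy (deduplicate study/strategy pairs first)
N_studies = Counter(strategy for _, strategy in set(coded))

print(N_studies["align the course"], n_excerpts["align the course"])  # prints "2 3"
```

Deduplicating the pairs before the second tally is what distinguishes N from n: a study contributing three "align the course" excerpts adds three to n but only one to N.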

Characteristics of the active learning studies

To address our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we discuss the different ways studies reported research on active learning.

Limitations and gaps within the final sample

First, we must discuss the gaps within our final sample of 29 studies. We excluded numerous active learning studies ( N = 383) that did not discuss or reflect upon the efficacy of their strategies to aid implementation of active learning. We also began this systematic literature review in 2015 and did not finish our coding and analysis of 2364 abstracts and 746 full-texts until 2018. We acknowledge that there have been multiple studies published on active learning since 2015. Acknowledging these limitations, we discuss our results and analysis in the context of the 29 studies in our dataset, which were published from 1990 to 2015.

Our final sample included only 2 studies that sampled mathematics and statistics courses. There was also a lack of studies outside of first-year courses. Much of the active learning research literature introduces interventions in first-year (cornerstone) or fourth-year (capstone) courses, but we found within our dataset a tendency to oversample first-year courses. However, all four course years were represented, as well as all major STEM disciplines, with the most common STEM disciplines being engineering (14 studies) and biology (7 studies).

Thirteen studies implemented course-based active learning interventions, such as project-based learning (8 studies), inquiry-based learning (3 studies), or a flipped classroom (2 studies). Only one study, Lenz ( 2015 ), used a previously published active learning intervention, which was Process-Oriented Guided Inquiry Learning (POGIL). Other examples of published active learning programs include the Student-Centered Active Learning Environment for Upside-down Pedagogies (SCALE-UP, Gaffney et al., 2010 ) and Chemistry, Life, the Universe, and Everything (CLUE, Cooper & Klymkowsky, 2013 ), but these were not included in our sample of 29 studies.

In contrast, most of the active learning interventions involved adding in-class problem solving (either with individual students or groups of students) to a traditional lecture course (21 studies). For some instructors attempting to adopt active learning, using this smaller active learning intervention (in-class problem solving) may be a good starting point.

Despite the variety of quantitative, qualitative, and mixed method research designs, most studies used either self-made instructor surveys (20 studies) or their institution’s course evaluations (10 studies). The variation between so many different versions of instructor surveys and course evaluations made it difficult to compare data or attempt a quantitative meta-analysis. Further, only 2 studies used instruments with evidence of validity. However, that trend may change as there are more examples of instruments with evidence of validity, such as the Student Response to Instructional Practices (StRIP, DeMonbrun et al., 2017 ), the Biology Interest Questionnaire (BIQ, Knekta, Rowland, Corwin, & Eddy, 2020 ), and the Pedagogical Expectancy Violation Assessment (PEVA, Gaffney et al., 2010 ).

We were also concerned about the use of institutional course evaluations (10 studies) as evidence of students’ satisfaction and affective responses to active learning. Course evaluations capture more than just students’ responses to active learning: scores are biased by the instructor’s gender (Mitchell & Martin, 2018 ) and race (Daniel, 2019 ), and they are strongly correlated with students’ expected grade in the class (Nguyen et al., 2017 ). Despite these limitations, we kept course evaluations in our keyword search and inclusion criteria because they relate to instructors’ concerns about student resistance to active learning, and these scores continue to be used for important instructor reappointment, tenure, and promotion decisions (DeMonbrun et al., 2017 ).

In addition to students’ satisfaction, there were other measures related to students’ affective and behavioral responses to active learning. The most common measure was students’ self-reports of whether they thought they learned more or less (21 studies). Other important affective outcomes included enjoyment (15 studies) and self-efficacy (4 studies). The most common behavioral measure was students’ participation (16 studies). However, missing from this sample were other affective outcomes, such as students’ identities, beliefs, emotions, values, and buy-in.

Positive outcomes for using active learning

Twenty-three of the 29 studies reported positive or mostly positive outcomes for their active learning intervention. At the start of this paper, we acknowledged that much of the existing research suggested the widespread positive benefits of using active learning in undergraduate STEM courses. However, most of these reported benefits centered on students’ cognitive learning outcomes (e.g., Theobald et al., 2020 ) and not students’ affective and behavioral responses to active learning. Here, we show positive affective and behavioral outcomes in terms of students’ self-reports of learning, enjoyment, self-efficacy, attendance, participation, and course satisfaction.

Due to the lack of mixed/neutral or negative affective outcomes, it is important to acknowledge potential publication bias within our dataset. Authors may be hesitant to report negative outcomes to active learning interventions. It could also be the case that negative or non-significant outcomes are not easily published in undergraduate STEM education venues. These factors could help explain the lack of mixed/neutral or negative study outcomes in our dataset.

Strategies to aid implementation of active learning

We aimed to answer the question: what instructor strategies to aid implementation of active learning do the authors of these studies provide? We addressed this question by providing instructors and readers a summary of actionable strategies they can take back to their own classrooms. Here, we discuss the range of strategies found within our systematic literature review.

Supporting instructors with actionable strategies

We identified eight specific strategies across three major categories: explanation, facilitation, and planning. Each strategy appeared in at least seven studies (Table 2 ), and each strategy was written to be actionable and practical.

Strategies in the explanation category emphasized the importance of establishing expectations and explaining the purpose of active learning to students. The facilitation category focused on approaching and encouraging students once activities were underway. Strategies in the planning category highlighted the importance of working outside of class time to thoughtfully design appropriate activities, create policies for group work, align various components of the course, and review student feedback to iteratively improve the course.

However, as we note in the “Introduction” section, these strategies are not entirely new, and they will not surprise experienced researchers and educators. Still, there has yet to be a systematic review that compiles these instructor strategies in relation to students’ affective and behavioral responses to active learning. For example, the “explain the purpose” strategy is similar to the productive framing (e.g., Hutchison & Hammer, 2010 ) of the activity for students. “Design appropriate activities” and “align various components of the course” relate to Vygotsky’s ( 1978 ) theories of scaffolding for students (Shekhar et al., 2020 ). “Review student feedback” and “approach students” relate to ideas on formative assessment (e.g., Pellegrino, DiBello, & Brophy, 2014 ) and to revising course materials in response to students’ ongoing needs.

We also acknowledge that we do not have an exhaustive list of specific strategies to aid implementation of active learning. More work needs to be done measuring and observing these strategies in action and testing their use against certain outcomes. Some of this work of measuring instructor strategies has already begun (e.g., DeMonbrun et al., 2017 ; Finelli et al., 2018 ; Tharayil et al., 2018 ), but further testing and analysis would benefit the active learning community. We hope that our framework of explanation, facilitation, and planning strategies provides a guide for instructors adopting active learning. Since these strategies are compiled from the undergraduate STEM education literature and from research on affective and behavioral responses to active learning, instructors have a compelling reason to use them to aid implementation of active learning.

One way to use these strategies is to map them onto the sequence of instruction. That is, planning strategies are most applicable during the work that occurs prior to classroom instruction, explanation strategies are most useful when introducing students to active learning activities, and facilitation strategies are best enacted while students are already working and engaged in the assigned activities. Of course, these strategies may also be used in conjunction with each other and are not strictly limited to these phases. For example, one plausible approach could be using the planning strategies of design and alignment as areas of emphasis during explanation . Overall, we hope that this framework of strategies supports instructors’ adoption and sustained use of active learning.

Creation of the planning category

At the start of this paper, we presented a conceptual framework for strategies consisting of only explanation and facilitation categories (DeMonbrun et al., 2017 ). One of the major contributions of this paper is the addition of a third category, which we call the planning category, to the existing conceptual framework. The planning strategies were common throughout the systematic literature review, and many studies emphasized the need to consider how much time and effort is needed when adding active learning to the course. Although students may not see this preparation, and we did not see this type of strategy initially, explicitly adding the planning category acknowledges the work instructors do outside of the classroom.

The planning strategies also highlight the need for instructors to not only think about implementing active learning before they enter the class, but to revise their implementation after the class is over. Instructors should refine their use of active learning through feedback, reflection, and practice over multiple course offerings. We hope this persistence can lead to long-term adoption of active learning.

Despite our review ending in 2015, most of STEM instruction remains didactic (Laursen, 2019 ; Stains et al., 2018 ), and there has not been a long-term sustained adoption of active learning. In a push to increase the adoption of active learning within undergraduate STEM courses, we hope this study provided support and actionable strategies for instructors who are considering active learning but are concerned about student resistance to active learning.

We identified eight specific strategies to aid implementation of active learning, grouped into three categories: explanation, facilitation, and planning. In this review, we created the third category, planning, and we suggest that it should be considered first when implementing active learning in a course. Instructors should then focus on explaining and facilitating the activity in the classroom. The eight specific strategies provided here can be incorporated into faculty professional development programs and readily adopted by instructors wanting to implement active learning in their STEM courses.

There remains important future work in active learning research, and we noted these gaps within our review. It would be useful to specifically review and measure instructor strategies in action and compare their use against other affective outcomes, such as identity, interest, and emotions.

There has yet to be a study that compiles and synthesizes strategies reported from multiple active learning studies, and we hope that this paper filled this important gap. The strategies identified in this review can help instructors persist beyond awkward initial implementations, avoid some problems altogether, and most importantly address student resistance to active learning. Further, the planning strategies emphasize that the use of active learning can be improved over time, which may help instructors have more realistic expectations for the first or second time they implement a new activity. There are many benefits to introducing active learning in the classroom, and we hope that these benefits are shared among more STEM instructors and students.

Availability of data and materials

Journal articles and conference proceedings that make up this review can be found through reverse citation lookup. See the Appendix for the references of all primary studies within this systematic review. We used the following databases to find studies for the review: Web of Science, Academic Search Complete, Compendex, Inspec, Education Source, and Education Resource Information Center. More details and keyword search strings are provided in the “Methods” section.
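To illustrate the kind of screening workflow a multi-database systematic review implies, records pulled from several databases can be merged, deduplicated, and filtered against keyword strings before full-text review. The sketch below is a hypothetical illustration only; the field names, keywords, and helper functions are not the authors’ actual pipeline.

```python
# Illustrative sketch of a systematic-review screening step (hypothetical,
# not the authors' code): merge database exports, drop duplicate records,
# and apply a crude keyword screen to titles and abstracts.

def dedupe_by_doi(records):
    """Keep the first record seen for each DOI (fall back to title)."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or rec["title"].lower()
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

def matches_keywords(record, required_terms):
    """Every required term must appear in the title or abstract."""
    text = (record["title"] + " " + record.get("abstract", "")).lower()
    return all(term in text for term in required_terms)

# Fabricated example records standing in for database search exports.
records = [
    {"doi": "10.1/a", "title": "Active learning in STEM",
     "abstract": "student resistance to active learning"},
    {"doi": "10.1/a", "title": "Active learning in STEM"},  # duplicate hit
    {"doi": "10.1/b", "title": "Didactic lectures", "abstract": "surveys"},
]

screened = [r for r in dedupe_by_doi(records)
            if matches_keywords(r, ["active learning", "student"])]
print([r["doi"] for r in screened])  # → ['10.1/a']
```

In a real review this keyword screen would only narrow the pool; inclusion decisions would still be made by human reviewers, as described in the “Methods” section.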

Abbreviations

Science, technology, engineering, and mathematics

Student resistance to active learning

Instrument with evidence of validity

Instructor surveys

Preferred Reporting Items for Systematic Reviews and Meta-Analyses

Problem solving

Problem or project-based learning

End of course evaluations

Armbruster, P., Patel, M., Johnson, E., & Weiss, M. (2009). Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE Life Sciences Education , 8 (3), 203–213. https://doi.org/10.1187/cbe.09-03-0025 .


Autin, M., Bateiha, S., & Marchionda, H. (2013). Power through struggle in introductory statistics. PRIMUS , 23 (10), 935–948. https://doi.org/10.1080/10511970.2013.820810 .

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist , 37 (2), 122 https://psycnet.apa.org/doi/10.1037/0003-066X.37.2.122 .

Berkling, K., & Zundel, A. (2015). Change Management: Overcoming the Challenges of Introducing Self-Driven Learning. International Journal of Engineering Pedagogy (iJEP), 5 (4), 38–46. https://www.learntechlib.org/p/207352/ .

Bilston, L. (1999). Lessons from a problem-based learning class in first year engineering statics . Paper presented at the 2nd Asia-Pacific Forum on Engineering and Technology Education, Clayton, Victoria.

Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education , 102 (3), 394–425. https://doi.org/10.1002/jee.20020 .

Borrego, M., Foster, M. J., & Froyd, J. E. (2014). Systematic literature reviews in engineering education and other developing interdisciplinary fields. Journal of Engineering Education , 103 (1), 45–76. https://doi.org/10.1002/jee.20038 .

Borrego, M., Nguyen, K., Crockett, C., DeMonbrun, M., Shekhar, P., Tharayil, S., … Waters, C. (2018). Systematic literature review of students’ affective responses to active learning: Overview of results . San Jose: Paper presented at the 2018 IEEE Frontiers in Education Conference (FIE). https://doi.org/10.1109/FIE.2018.8659306 .


Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development . Sage Publications Inc.

Breckler, J., & Yu, J. R. (2011). Student responses to a hands-on kinesthetic lecture activity for learning about the oxygen carrying capacity of blood. Advances in Physiology Education, 35 (1), 39–47. https://doi.org/10.1152/advan.00090.2010 .

Bullard, L., & Felder, R. (2007). A student-centered approach to the stoichiometry course . Honolulu: Paper presented at the 2007 ASEE Annual Conference and Exposition https://peer.asee.org/1543 .

Carlson, K. A., & Winquist, J. R. (2011). Evaluating an active learning approach to teaching introductory statistics: A classroom workbook approach. Journal of Statistics Education , 19 (1). https://doi.org/10.1080/10691898.2011.11889596 .

Chi, M. T., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist , 49 (4), 219–243. https://doi.org/10.1080/00461520.2014.965823 .

Christensen, T. (2005). Changing the learning environment in large general education astronomy classes. Journal of College Science Teaching, 35 (3), 34.

Cooper, M., & Klymkowsky, M. (2013). Chemistry, life, the universe, and everything: A new approach to general chemistry, and a model for curriculum reform. Journal of Chemical Education , 90 (9), 1116–1122. https://doi.org/10.1021/ed300456y .

Creswell, J. W., & Creswell, J. D. (2017). Research design: Qualitative, quantitative, and mixed methods approaches . Sage Publishing Inc.

Dancy, M., Henderson, C., & Turpen, C. (2016). How faculty learn about and implement research-based instructional strategies: The case of peer instruction. Physical Review Physics Education Research , 12 (1). https://doi.org/10.1103/PhysRevPhysEducRes.12.010110 .

Daniel, B. J. (2019). Teaching while black: Racial dynamics, evaluations, and the role of white females in the Canadian academy in carrying the racism torch. Race Ethnicity and Education , 22 (1), 21–37. https://doi.org/10.1080/13613324.2018.1468745 .

de Justo, E., & Delgado, A. (2014). Change to competence-based education in structural engineering. Journal of Professional Issues in Engineering Education and Practice , 141 (3). https://doi.org/10.1061/(ASCE)EI.1943-5541.0000215 .

DeMonbrun, R. M., Finelli, C., Prince, M., Borrego, M., Shekhar, P., Henderson, C., & Waters, C. (2017). Creating an instrument to measure student response to instructional practices. Journal of Engineering Education , 106 (2), 273–298. https://doi.org/10.1002/jee.20162 .

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences , 116 (39), 19251–19257. https://doi.org/10.1073/pnas.1821936116 .

Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review , 95 (2), 256–273. https://doi.org/10.1037/0033-295X.95.2.256 .

Finelli, C., Nguyen, K., Henderson, C., Borrego, M., Shekhar, P., Prince, M., … Waters, C. (2018). Reducing student resistance to active learning: Strategies for instructors. Journal of College Science Teaching , 47 (5), 80–91 https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-mayjune-2018/research-and-1 .


Finelli, C. J., Daly, S. R., & Richardson, K. M. (2014). Bridging the research-to-practice gap: Designing an institutional change plan using local evidence. Journal of Engineering Education , 103 (2), 331–361. https://doi.org/10.1002/jee.20042 .

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences , 111 (23), 8410–8415. https://doi.org/10.1073/pnas.1319030111 .

Froyd, J. E., Borrego, M., Cutler, S., Henderson, C., & Prince, M. J. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education , 56 (4), 393–399. https://doi.org/10.1109/TE.2013.2244602 .

Gaffney, J. D., Gaffney, A. L. H., & Beichner, R. J. (2010). Do they see it coming? Using expectancy violation to gauge the success of pedagogical reforms. Physical Review Special Topics - Physics Education Research , 6 (1), 010102. https://doi.org/10.1103/PhysRevSTPER.6.010102 .

Gough, D., Oliver, S., & Thomas, J. (2017). An introduction to systematic reviews . Sage Publishing Inc.

Harun, N. F., Yusof, K. M., Jamaludin, M. Z., & Hassan, S. A. H. S. (2012). Motivation in problem-based learning implementation. Procedia-Social and Behavioral Sciences , 56 , 233–242. https://doi.org/10.1016/j.sbspro.2012.09.650 .

Haseeb, A. (2011). Implementation of micro-level problem based learning in a course on electronic materials. Journal of Materials Education , 33 (5-6), 273–282 http://eprints.um.edu.my/id/eprint/5501 .

Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching , 48 (8), 952–984. https://doi.org/10.1002/tea.20439 .

Henderson, C., & Dancy, M. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research , 3 (2). https://doi.org/10.1103/PhysRevSTPER.3.020102 .

Henderson, C., Khan, R., & Dancy, M. (2018). Will my student evaluations decrease if I adopt an active learning instructional strategy? American Journal of Physics , 86 (12), 934–942. https://doi.org/10.1119/1.5065907 .

Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics , 3 (4), 447–462. https://doi.org/10.1007/s11948-997-0047-x .

Hodgson, Y., Benson, R., & Brack, C. (2013). Using action research to improve student engagement in a peer-assisted learning programme. Educational Action Research, 21 (3), 359-375. https://doi.org/10.1080/09650792.2013.813399 .

Hora, M. T., & Ferrare, J. J. (2013). Instructional systems of practice: A multi-dimensional analysis of math and science undergraduate course planning and classroom teaching. Journal of the Learning Sciences , 22 (2), 212–257. https://doi.org/10.1080/10508406.2012.729767 .

Hutchison, P., & Hammer, D. (2010). Attending to student epistemological framing in a science classroom. Science Education , 94 (3), 506–524. https://doi.org/10.1002/sce.20373 .

Jaeger, B., & Bilen, S. (2006). The one-minute engineer: Getting design class out of the starting blocks . Paper presented at the 2006 ASEE Annual Conference and Exposition, Chicago, IL. https://peer.asee.org/524 .

Jesiek, B. K., Mazzurco, A., Buswell, N. T., & Thompson, J. D. (2018). Boundary spanning and engineering: A qualitative systematic review. Journal of Engineering Education , 107 (3), 318–413. https://doi.org/10.1002/jee.20219 .

Kezar, A., Gehrke, S., & Elrod, S. (2015). Implicit theories of change as a barrier to change on college campuses: An examination of STEM reform. The Review of Higher Education , 38 (4), 479–506. https://doi.org/10.1353/rhe.2015.0026 .

Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and development handbook: A guide to human resource development . McGraw Hill.

Knekta, E., Rowland, A. A., Corwin, L. A., & Eddy, S. (2020). Measuring university students’ interest in biology: Evaluation of an instrument targeting Hidi and Renninger’s individual interest. International Journal of STEM Education , 7 , 1–16. https://doi.org/10.1186/s40594-020-00217-4 .

Kovac, J. (1999). Student active learning methods in general chemistry. Journal of Chemical Education , 76 (1), 120. https://doi.org/10.1021/ed076p120 .

Krishnan, S., & Nalim, M. R. (2009). Project based learning in introductory thermodynamics . Austin: Paper presented at the 2009 ASEE Annual Conference and Exposition https://peer.asee.org/5615 .

Kuh, G. D. (2005). Student engagement in the first year of college. In M. L. Upcraft, J. N. Gardner, J. N, & B. O. Barefoot (Eds.), Challenging and supporting the first-year student: A handbook for improving the first year of college , (pp. 86–107). Jossey-Bass.

Laatsch, L., Britton, L., Keating, S., Kirchner, P., Lehman, D., Madsen-Myers, K., Milson, L., Otto, C., & Spence, L. (2005). Cooperative learning effects on teamwork attitudes in clinical laboratory science students. American Society for Clinical Laboratory Science, 18(3). https://doi.org/10.29074/ascls.18.3.150 .

Lake, D. A. (2001). Student performance and perceptions of a lecture-based course compared with the same course utilizing group discussion. Physical Therapy , 81 (3), 896–902. https://doi.org/10.1093/ptj/81.3.896 .

Laursen, S. (2019). Levers for change: An assessment of progress on changing STEM instruction American Association for the Advancement of Science. https://www.aaas.org/resources/levers-change-assessment-progress-changing-stem-instruction .

Lehtovuori, A., Honkala, M., Kettunen, H., & Leppävirta, J. (2013). Interactive engagement methods in teaching electrical engineering basic courses. In Paper presented at the IEEE global engineering education conference (EDUCON) . Germany: Berlin. https://doi.org/10.1109/EduCon.2013.6530089 .


Lenz, L. (2015). Active learning in a math for liberal arts classroom. PRIMUS , 25 (3), 279–296. https://doi.org/10.1080/10511970.2014.971474 .

Li, J., Zhao, Y., & Shi, L. (2009). Interactive teaching methods in information security course . Paper presented at the International Conference on Scalable Computing and Communications; The Eighth International Conference on Embedded Computing. https://doi.org/10.1109/EmbeddedCom-ScalCom.2009.94 .

Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gotzsche, P. C., Ioannidis, J. P., … Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: Explanation and elaboration. Journal of Clinical Epidemology , 62 (10), e1–e34. https://doi.org/10.1016/j.jclinepi.2009.06.006 .

Lund, T. J., & Stains, M. (2015). The importance of context: An exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty. International Journal of STEM Education , 2 (1). https://doi.org/10.1186/s40594-015-0026-8 .

Machemer, P. L., & Crawford, P. (2007). Student perceptions of active learning in a large cross-disciplinary classroom. Active Learning in Higher Education , 8 (1), 9–30. https://doi.org/10.1177/1469787407074008 .

Maib, J., Hall, R., Collier, H., & Thomas, M. (2006). A multi-method evaluation of the implementation of a student response system . Paper presented at the 12th Americas’ Conference on Information Systems (AMCIS), Acapulco, Mexico. https://aisel.aisnet.org/amcis2006/27 .

McClanahan, E. B., & McClanahan, L. L. (2002). Active learning in a non-majors biology class: Lessons learned. College Teaching , 50 (3), 92–96. https://doi.org/10.1080/87567550209595884 .

McLoone, S., & Brennan, C. (2015). On the use and evaluation of a smart device student response system in an undergraduate mathematics classroom. AISHE-J: The All Ireland Journal of Teaching and Learning in Higher Education, 7(3). http://ojs.aishe.org/index.php/aishe-j/article/view/243 .

Metzger, K. J. (2015). Collaborative teaching practices in undergraduate active learning classrooms: A report of faculty team teaching models and student reflections from two biology courses. Bioscene: Journal of College Biology Teaching , 41 (1), 3–9 http://www.acube.org/wp-content/uploads/2017/11/2015_1.pdf .

Mitchell, K. M., & Martin, J. (2018). Gender bias in student evaluations. PS: Political Science & Politics , 51 (3), 648–652. https://doi.org/10.1017/S104909651800001X .

Nguyen, K., Husman, J., Borrego, M., Shekhar, P., Prince, M., DeMonbrun, R. M., … Waters, C. (2017). Students’ expectations, types of instruction, and instructor strategies predicting student response to active learning. International Journal of Engineering Education , 33 (1(A)), 2–18 http://www.ijee.ie/latestissues/Vol33-1A/02_ijee3363ns.pdf .

Oakley, B. A., Hanna, D. M., Kuzmyn, Z., & Felder, R. M. (2007). Best practices involving teamwork in the classroom: Results from a survey of 6435 engineering student respondents. IEEE Transactions on Education , 50 (3), 266–272. https://doi.org/10.1109/TE.2007.901982 .

Oliveira, P. C., & Oliveira, C. G. (2014). Integrator element as a promoter of active learning in engineering teaching. European Journal of Engineering Education, 39 (2), 201–211. https://doi.org/10.1080/03043797.2013.854318 .

Owens, D. C., Sadler, T. D., Barlow, A. T., & Smith-Walters, C. (2020). Student motivation from and resistance to active learning rooted in essential science practices. Research in Science Education , 50 (1), 253–277. https://doi.org/10.1007/s11165-017-9688-1 .

Parker Siburt, C. J., Bissell, A. N., & Macphail, R. A. (2011). Developing Metacognitive and Problem-Solving Skills through Problem Manipulation. Journal of Chemical Education, 88 (11), 1489–1495. https://doi.org/10.1021/ed100891s .

Passow, H. J., & Passow, C. H. (2017). What competencies should undergraduate engineering programs emphasize? A systematic review. Journal of Engineering Education , 106 (3), 475–526. https://doi.org/10.1002/jee.20171 .

Patrick, L. E., Howell, L. A., & Wischusen, W. (2016). Perceptions of active learning between faculty and undergraduates: Differing views among departments. Journal of STEM Education: Innovations and Research , 17 (3), 55 https://www.jstem.org/jstem/index.php/JSTEM/article/view/2121/1776 .

Pellegrino, J., DiBello, L., & Brophy, S. (2014). The science and design of assessment in engineering education. In A. Johri, & B. Olds (Eds.), Cambridge handbook of engineering education research , (pp. 571–598). Cambridge University Press. https://doi.org/10.1017/CBO9781139013451.036 .

Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide . Blackwell Publishing. https://doi.org/10.1002/9780470754887 .

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education , 93 , 223–232. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x .

Prince, M., & Felder, R. (2007). The many faces of inductive teaching and learning. Journal of College Science Teaching , 36 (5), 14–20.

Ramsier, R. D., Broadway, F. S., Cheung, H. M., Evans, E. A., & Qammar, H. K. (2003). University physics: A hybrid approach . Nashville: Paper presented at the 2003 ASEE Annual Conference and Exposition https://peer.asee.org/11934 .

Regev, G., Gause, D. C., & Wegmann, A. (2008). Requirements engineering education in the 21st century, an experiential learning approach . 2008 16th IEEE International Requirements Engineering Conference, Catalunya. https://doi.org/10.1109/RE.2008.28 .

Rockland, R., Hirsch, L., Burr-Alexander, L., Carpinelli, J. D., & Kimmel, H. S. (2013). Learning outside the classroom—Flipping an undergraduate circuits analysis course . Atlanta: Paper presented at the 2013 ASEE Annual Conference and Exposition https://peer.asee.org/19868 .

Schmidt, J. A., Rosenberg, J. M., & Beymer, P. N. (2018). A person-in-context approach to student engagement in science: Examining learning activities and choice. Journal of Research in Science Teaching , 55 (1), 19–43. https://doi.org/10.1002/tea.21409 .

Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., & Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: A systematic review of underlying reasons. Journal of College Science Teaching , 49 (6) https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-julyaugust-2020/negative-student .

Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005). Pedagogies of engagement: Classroom-based practices. Journal of Engineering Education , 94 (1), 87–101. https://doi.org/10.1002/j.2168-9830.2005.tb00831.x .

Stains, M., Harshman, J., Barker, M., Chasteen, S., Cole, R., DeChenne-Peters, S., … Young, A. M. (2018). Anatomy of STEM teaching in north American universities. Science , 359 (6383), 1468–1470. https://doi.org/10.1126/science.aap8892 .

Stains, M., & Vickrey, T. (2017). Fidelity of implementation: An overlooked yet critical construct to establish effectiveness of evidence-based instructional practices. CBE Life Sciences Education , 16 (1). https://doi.org/10.1187/cbe.16-03-0113 .

Stump, G. S., Husman, J., & Corby, M. (2014). Engineering students' intelligence beliefs and learning. Journal of Engineering Education , 103 (3), 369–387. https://doi.org/10.1002/jee.20051 .

Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., & Waters, C. (2018). Strategies to mitigate student resistance to active learning. International Journal of STEM Education , 5 (1), 7. https://doi.org/10.1186/s40594-018-0102-y .

Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Arroyo, E. N., Behling, S., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences , 117 (12), 6476–6483. https://doi.org/10.1073/pnas.1916903117 .

Tolman, A., Kremling, J., & Tagg, J. (2016). Why students resist learning: A practical model for understanding and helping students . Stylus Publishing, LLC.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes . Harvard University Press.

Weimer, M. (2002). Learner-centered teaching: Five key changes to practice . Wiley.

Wigfield, A., & Eccles, J. S. (2000). Expectancy–value theory of achievement motivation. Contemporary Educational Psychology , 25 (1), 68–81. https://doi.org/10.1006/ceps.1999.1015 .

Winkler, I., & Rybnikova, I. (2019). Student resistance in the classroom—Functional-instrumentalist, critical-emancipatory and critical-functional conceptualisations. Higher Education Quarterly , 73 (4), 521–538. https://doi.org/10.1111/hequ.12219 .


Acknowledgements

We thank our collaborators, Charles Henderson and Michael Prince, for their early contributions to this project, including screening hundreds of abstracts and full papers. Thank you to Adam Papendieck and Katherine Doerr for their feedback on early versions of this manuscript. Finally, thank you to the anonymous reviewers at the International Journal of STEM Education for your constructive feedback.

This work was supported by the National Science Foundation through grant #1744407. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Author information

Authors and affiliations

Hutchins School of Liberal Studies, Sonoma State University, Rohnert Park, CA, USA

Kevin A. Nguyen

Departments of Curriculum & Instruction and Mechanical Engineering, University of Texas, Austin, TX, USA

Maura Borrego & Sneha Tharayil

Departments of Electrical Engineering & Computer Science and Education, University of Michigan, 4413 EECS Building, 1301 Beal Avenue, Ann Arbor, MI, 48109, USA

Cynthia J. Finelli & Caroline Crockett

Enrollment Management Research Group, Southern Methodist University, Dallas, TX, USA

Matt DeMonbrun

School of Applied Engineering and Technology, New Jersey Institute of Technology, Newark, NJ, USA

Prateek Shekhar

Advanced Manufacturing and Materials, Naval Surface Warfare Center Carderock Division, Potomac, MD, USA

Cynthia Waters

Cabot Science Library, Harvard University, Cambridge, MA, USA

Robyn Rosenberg


Contributions

All authors contributed to the design and execution of this paper. KN, MB, and CW created the original vision for the paper. RR solicited, downloaded, and catalogued all studies for review. All authors contributed in reviewing and screening hundreds of studies. KN then led the initial analysis and creation of strategy codes. CF reviewed and finalized the analysis. All authors drafted, reviewed, and finalized sections of the paper. KN, MB, MD, and CC led the final review of the paper. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Cynthia J. Finelli .

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Nguyen, K.A., Borrego, M., Finelli, C.J. et al. Instructor strategies to aid implementation of active learning: a systematic literature review. IJ STEM Ed 8 , 9 (2021). https://doi.org/10.1186/s40594-021-00270-7

Download citation

Received : 19 June 2020

Accepted : 18 January 2021

Published : 15 March 2021

DOI : https://doi.org/10.1186/s40594-021-00270-7


  • Active learning
  • Systematic review
  • Instructor strategies
  • Student response


Learning strategies and their correlation with academic success in biology and physiology examinations during the preclinical years of medical school

Annemarie Hogh

Core Facility for Cell Sorting and Cell Analysis, University Medical Center Rostock, Rostock, Germany

Brigitte Müller-Hilke

Associated data.

Our original data files are available from the figshare repository ( doi.org/10.6084/m9.figshare.13536605.v1 ).

Efficient learning is essential for successful completion of the medical degree and students use a variety of strategies to cope with university requirements. However, strategies that lead to academic success have hardly been explored. We therefore evaluated the individual learning approaches used by a cohort of medical students in their first and second preclinical years and analyzed possible correlations with examination scores.

107 students participated in our longitudinal survey on cognitive, meta-cognitive, and resource-oriented learning strategies using the LIST questionnaire (Lernstrategien im Studium). The students were surveyed twice, in their first and second years of medical school, respectively, and academic performance was assessed as the scores obtained in two examinations written shortly after the LIST surveys.

We here identified four different patterns of learning strategy combinations, describing the relaxed, diligent, hard-working, and sociable learners. About half of the students stayed true to their initially registered pattern of learning strategy combinations, while 53 students underwent a change between the first and second surveys. Changes were predominantly made between the relaxed and the sociable learners and between the diligent and the hard-working learners. Examination results suggested that the diligent and hard-working learners were academically more successful than the relaxed and sociable ones.

Early habits of sociable learning were quickly abandoned, however, not in favor of more successful patterns. It is therefore essential to develop interventions on learning skills that have a lasting impact on the pattern of the students' learning strategy combinations.

Students in higher education are expected to autonomously learn, rehearse, and deepen the teaching content conveyed in lectures and seminars. Likewise, they should be able to independently prepare for oral and written examinations, which is often described as self-regulated learning [ 1 , 2 ]. The concept of self-regulation provides a basic principle for learning processes. In particular, the cognitive and meta-cognitive actions describe how learners control their thoughts, feelings, and actions in order to achieve the best learning results [ 3 ]. The use of efficient learning strategies is hence an essential prerequisite for academic success [ 4 , 5 ]. According to van Lohuizen, the term ‘learning strategy’ is used for clusters of related learning activities that students have at their disposal in reaction to a specific learning goal [ 6 ]. And even though there are various classifications of learning strategies, three general scales have emerged that describe cognitive, meta-cognitive, and resource-oriented strategies [ 7 , 8 ]. Cognitive strategies serve to process the information collected, e.g., in lectures and seminars; they comprise organization, critical thinking, elaboration, and rehearsal of learning material. Meta-cognitive strategies help students control and regulate their cognition and are subdivided into the subscales of planning (setting goals), regulating, and monitoring the learning process [ 9 ]. Resource-oriented strategies differentiate intrinsic and extrinsic resources: the intrinsic subscales are distractibility, effort regulation, and time management, while the extrinsic subscales include managing the study environment, peer-learning, and use of additional literature.

Studying medicine is considered to be particularly demanding and learning-intensive. Students need to be proficient in defining their own learning goals, acquiring new knowledge and skills independently, and assessing the outcome of the learning process. Importantly, efficient learning is not only key for passing the upcoming examinations and successful completion of the medical degree, but provides the basis for lifelong professional advancement and keeping up with current scientific knowledge [ 1 , 10 ]. Successful and sustained learning is therefore mandatory for the long-term provision of high-quality patient care [ 11 – 13 ].

Various studies have previously addressed the learning strategies medical students use to cope with university requirements [ 14 , 15 ]. In the project described in this manuscript, we explored the learning strategies used by medical students at our university during their preclinical years. We investigated whether these learning strategies were constant or changed over time and whether they correlated with academic success. To that end, we performed an exploratory study and recruited medical students during their first preclinical year. We assessed their individual learning strategies just before their first examination period, three months into medical school, and repeated this assessment one year later. By analyzing the results of examinations written shortly after these assessments, we were able to indirectly correlate the various learning strategies with academic success.

Participant recruitment

During a compulsory course in their first preclinical year, a total of 224 students at the Rostock University Medical Center were informed about the study and invited to participate. Participation required written consent to the monitoring of learning approaches and examination scores as well as participation in a self-assessment that contained demographic data. As an incentive to participate, we assured the students access to their results and evaluations at the end of the study. The study received approval from the ethics committee of the medical faculty of the University of Rostock, is registered under A 2018–0005, and was performed in accordance with the Declaration of Helsinki. Written informed consent was obtained from all participants.

Learning strategies

The learning strategies were assessed using the LIST questionnaire (Lernstrategien im Studium), a German version based on the Anglo-American Motivated Strategies for Learning Questionnaire (MSLQ) [ 16 , 17 ]. The LIST contains 96 items that are scored on a five-point Likert scale with labeled endpoints, ranging from 1 (‘not at all true of me’) to 5 (‘very true of me’), and classifies various learning activities into the following four domains: cognitive strategy use, meta-cognitive strategy use, and intrinsic and extrinsic resource-oriented strategy use. The LIST was compiled as an online questionnaire using EvaSys and sent to the students by email in their first (T1) and second (T2) preclinical years.

Learning outcome

The medical school at Rostock University adheres to a traditional curriculum, teaching each subject individually and segregating the first two preclinical years from the four clinical years. During the preclinical years, the students attend lectures and seminars, possibly take smaller examinations in the course of the term, and by passing these gain access to the final examinations at the end of the term. Smaller subjects like biology stretch over the first term, require attending a lecture as well as practical courses, and conclude with a written examination. More comprehensive subjects like physiology cover the third and fourth terms and require attending lectures as well as seminars during the third term. The written examination at the end of the third term is a prerequisite for participating in a practical course in the fourth term, which is flanked by oral examinations. Once all credits associated with the preclinical years are obtained, students can register for the first state examination, which in turn needs to be passed in order to move on to the clinical years. However, students can opt out of any preclinical examination, at the cost of having to extend the preclinical phase.

To monitor the learning success, we here used the results of the biology examination at T1 and the physiology examination at T2, respectively. The biology examination consisted of 40 multiple choice questions and the physiology examination of 75 multiple choice plus 5 open-ended questions. The mean scores achieved by a cluster of students using the same learning strategies were then compared.

Simultaneously with the LIST, the participants received a second questionnaire asking them to rate their own learning outcomes, to state how much time they spent studying, and to predict their examination results. These self-evaluations were later compared to the actual scores achieved. Moreover, the students were asked about their age, sex, and final school grades.

Data analysis

Fisher’s exact tests were performed to compare the ratio of female to male students between the study cohort and the rest of the academic year. For each participant, the individual LIST-scores describing one of the thirteen subscales were totaled and entered into hierarchical cluster analyses (Ward method). Exploratory data analyses of the individual clusters resulted in the mean LIST-scores for each of the thirteen subscales. We conducted Kolmogorov-Smirnov tests to analyze the Gaussian distribution of the data. Comparisons between the mean LIST-scores and the corresponding demographic data were performed via one-way analysis of variance (ANOVA) (Scheffé procedure) and Kruskal-Wallis tests, respectively. Comparisons of examination results between the various learning strategy combinations were computed via Mann-Whitney (biology) and unpaired t-tests (physiology examination), respectively. P-values < 0.05 were considered statistically significant. Reliability was calculated via Cronbach's alpha. All analyses were carried out using IBM SPSS Statistics, version 25.
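The study ran these analyses in SPSS. As an illustration only, the same pipeline (Ward clustering of the totaled subscale scores, ANOVA across the resulting clusters, and a Mann-Whitney comparison of two pooled groups) can be sketched in Python with SciPy; the scores below are synthetic stand-ins, not the real LIST data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import f_oneway, mannwhitneyu

rng = np.random.default_rng(0)
# Synthetic stand-in for the totaled scores of 107 students on the
# thirteen LIST subscales (the real data are in the figshare repository).
scores = rng.normal(loc=3.0, scale=0.5, size=(107, 13))

# Hierarchical cluster analysis (Ward method), cut into four clusters.
Z = linkage(scores, method="ward")
clusters = fcluster(Z, t=4, criterion="maxclust")

# One-way ANOVA comparing a single subscale across the four clusters.
groups = [scores[clusters == k, 0] for k in (1, 2, 3, 4)]
f_stat, p_anova = f_oneway(*groups)

# Mann-Whitney U test on examination scores of two pooled learner groups
# (hypothetical group sizes chosen to mirror a 107-student cohort).
exam_a = rng.normal(34, 3, size=60)  # e.g., relaxed/sociable learners
exam_b = rng.normal(35, 3, size=47)  # e.g., diligent/hard-working learners
u_stat, p_mw = mannwhitneyu(exam_a, exam_b)
```

The post-hoc Scheffé procedure and the Kolmogorov-Smirnov normality checks mentioned above are omitted here for brevity.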

Study cohort

Fig 1 summarizes the numbers of participants at the various stages of the study as well as a timeline. Out of a total of 224 first-year medical students, 157 initially declared their consent; 123 of these filled out the LIST at the first time point (T1) and sat the biology examination. 100 of these students completed the self-assessment at T1. Of the 123 LIST participants at T1, 109 completed the LIST at the second time point (T2), and 107 of these sat the physiology examination, so that the corresponding results were available. 103 of these study participants completed the self-assessment at T2. The reason for exclusion from further analyses was a missing or incompletely filled-in LIST questionnaire. The ratio of male to female students was 29/78 in the study cohort and 46/71 for the rest of the academic year. The difference between the male/female proportions in the two cohorts was not statistically significant (p-value = 0.065).

[Fig 1: pone.0245851.g001.jpg]

Students in their first year at medical school segregated into four distinct patterns of learning strategy combinations

The LIST comprises 96 items that classify various learning activities into four scales. The cognitive scale differentiates 4 subscales, organization, critical thinking, elaboration, and rehearsal of learning material. The meta-cognitive scale differentiates 3 subscales, planning, regulating, and monitoring the learning process. The resource-oriented scales differentiate 3 intrinsic and extrinsic strategies each, distractibility, effort regulation, and time management on the one hand and managing of the study environment, peer-learning and use of additional literature on the other. The reliability for each of the scales was calculated as Cronbach’s alpha and was 0.85 for cognitive learning strategies, 0.68 for meta-cognitive learning strategies, 0.76 for intrinsic, and 0.79 for extrinsic resource-oriented strategies.
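Cronbach's alpha can be computed directly from a respondents-by-items score matrix. The following minimal Python sketch (illustrative only, not the SPSS routine used in the study) applies the standard formula α = k/(k−1) · (1 − Σ var(itemᵢ)/var(total)):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the sum scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)
```

For perfectly parallel items the variance of the sum equals k² times the item variance, so alpha evaluates to exactly 1; values around 0.7 to 0.85, as reported for the LIST scales here, indicate acceptable to good internal consistency.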

The individual LIST-scores describing one of the thirteen subscales were totaled and entered into hierarchical cluster analyses. These analyses were performed for T1 and T2 and at both time points yielded four identical clusters. These clusters identified distinct patterns of learning strategy combinations among medical students and defined 4 types of learners.

The relaxed learners stood out with their significantly elevated LIST-scores for distractibility, followed by the sociable learners ( Fig 2A ). On all the other subscales, except for critical thinking, the relaxed learners achieved the lowest scores of all types of learners. Moreover, their self-assessed learning outcomes concerning the upcoming examination revealed a rather relaxed attitude, which prompted their name. At T1, 17 students belonged to the relaxed type of learners. The diligent learners achieved the highest scores on all cognitive, meta-cognitive, and intrinsic resource-oriented subscales and scored only marginally lower on managing the study environment. They invested the highest effort in critical thinking ( Fig 2B ), organizing, and elaborating the subjects at hand and spent the most time planning, regulating, and monitoring their learning activities. Because they not only invested so much effort but also used their resources diligently, we named them the diligent learners. At T1, 25 students belonged to the diligent type of learners. The hard-working learners also scored high on learning strategies that require cognitive and meta-cognitive skills. However, they focused on repetitive activities like rehearsal ( Fig 2C ) and placed high emphasis on an efficient study environment, yet scored the lowest on critical thinking. We therefore named them the hard-working learners; at T1, 33 students were of the hard-working type. And finally, there were 48 sociable learners at T1. They preferred to learn in the company of others and scored the highest on the extrinsic resource-oriented strategy of peer-learning ( Fig 2D ). We called these the sociable learners.

[Fig 2: pone.0245851.g002.jpg]

Box-plots show LIST scores of the most characteristic learning activities defining the four types of learners at T1. Statistical significance resulting from ANOVA (Scheffé-test with α = 0.05) is indicated by asterisks. *p<0.05; **p<0.01; ***p<0.001.

Table 1 summarizes the mean individual scores achieved for the four different patterns of learning strategy combinations and the remaining subscales not presented in Fig 2 . The p-values resulting from ANOVA confirm four distinct patterns of learning strategy combinations. The statistics regarding self-assessment and demographics of the four different patterns of learning strategy combinations revealed that the diligent and the hard-working learners were the youngest and spent the most time studying ( Table 2 ). The difference in school-leaving grades showed a tendency but did not reach statistical significance.

# according to ANOVA (Scheffé-test with α = 0.05); standard deviation (SD).

# according to ANOVA (Scheffé-test with α = 0.05). Note that higher grades in Germany represent lower academic performance.

Learning patterns were flexible but defined preferences

When the hierarchical cluster analysis of LIST-scores was repeated at T2, the same four patterns were reproduced. However, only half of the students stayed true to their previous learning habits, while 53/107 study participants changed their learning strategies in the course of the preclinical training ( Fig 3 ). On close inspection, though, the changes did not occur randomly. Instead, certain stable preferences emerged. For example, while most of the relaxed learners (10/12) remained relaxed, the two who did change turned into sociable learners. Likewise, the sociable ones predominantly either remained sociable (14/44) or ended up relaxed (n = 15); only 6 of the originally sociable learners turned diligent. In summary, the relaxed learners recorded the strongest growth, while the sociable ones were reduced by half.

[Fig 3: pone.0245851.g003.jpg]

Bar graphs show changes of learning patterns in the course of the first 18 months at medical school. The graphic designs define the learning patterns at T1 and indicate changes towards different patterns at T2. Only those students were included for whom T2 data were fully available.

On the other hand, the majority of the diligent learners remained diligent (13/24) and of those who changed, the majority (n = 7) became hard-working. None turned into a relaxed learner. Among the hard-working learners at T1 (n = 29), 16 remained hard-working and 5 became diligent, while 3 turned into sociable learners and 5 became relaxed. The preferences emerging were thus either hard-working and/or diligent on the one hand or social and/or relaxed on the other.

Academic success segregated with the hard-working and diligent learners

In order to analyze whether certain learning patterns were academically more successful and led to better examination results, we compared the scores achieved in the biology and physiology examinations across the various patterns. There were no significant differences when the four learning patterns were compared to each other. However, when grouping according to learning preferences, the relaxed and the sociable learners versus the diligent and hard-working ones, the latter obtained significantly better scores, and this was true for the biology as well as the physiology examinations ( Fig 4 ). The combined medians for the scores obtained in the biology examination at T1 were 34 (interquartile range IQR = 6) and 35 (IQR = 5) for the sociable/relaxed and the diligent/hard-working learners, respectively. For the physiology examination they were 48 (IQR = 14) and 53.5 (IQR = 10).
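The medians and interquartile ranges reported above follow the usual definitions (IQR = Q3 − Q1). A small illustrative Python helper, not taken from the study's own analysis code, makes the computation explicit:

```python
import numpy as np

def median_iqr(scores):
    """Return the median and interquartile range (Q3 - Q1) of a score list."""
    q1, med, q3 = np.percentile(scores, [25, 50, 75])
    return med, q3 - q1
```

For example, a hypothetical list of five examination scores [30, 34, 35, 36, 40] yields a median of 35 and an IQR of 2 under NumPy's default linear interpolation.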

[Fig 4: pone.0245851.g004.jpg]

Dot-plots show the scores obtained for the relaxed/sociable and the hard-working/diligent learners in the biology examination at T1 and the physiology examination at T2. Statistical significance was calculated via Mann-Whitney (biology) and unpaired t-test (physiology examination) and is indicated by p-values.

We here described four distinct patterns of learning strategy combinations among medical students; these patterns were found in repeated assessments, yet for the individual student they were not necessarily stable. Students were first assessed within their first months of medical school, and at this point in time we consider it likely that the individual learning strategy combined an individual's preference with the strategy that had proved itself successful in high school. A student's learning approach may be regarded as a combination of several components, among them views about learning as well as metacognitive and processing activities, whereby the actualized learning strategy is assumed to depend on the specific learning situation [ 18 , 19 ]. Indeed, at the second assessment conducted one year later, 49.5% of the students had changed their learning habits. We cannot conclude whether this change occurred on purpose, because the students were afraid of failure, or whether it was unintentional. However, the observation that changes in learning patterns did not occur at random but predominantly between the sociable and relaxed learners on the one hand and between the hard-working and diligent ones on the other suggested certain preferences. We therefore investigated whether these preferences correlated with academic success. And indeed, the preference for a sociable or relaxed learning pattern turned out to be significantly less successful than the preference for a diligent or hard-working one. Our results therefore support the notion that the right choice of learning activities influences academic success [ 20 ].

It is not surprising that most of these early changes in learning patterns were made by the sociable learners, given the more time-consuming schedule that leaves little time for group meetings in the second preclinical year. Yet even though comprehensible, any decrease in sociability will hamper the fostering of important skills like teamwork and will reduce motivation, which in turn stimulates discussion and critical thinking about the learning material [ 21 , 22 ]. Unfortunately, the sociable learning strategy was frequently replaced with the relaxed one, indicating an individual preference for unfavorable learning strategies.

Indeed, the relaxed learners not only featured the most inefficient pattern of learning strategy combinations, they also spent the least time on self-study. A glance at their final school grades suggested that these students had already shown learning deficiencies in their earlier years. And even when abandoning their old learning preference, the relaxed ones remained true to an inefficient pattern and predominantly turned into sociable learners. Distressingly, the number of relaxed learners doubled between the first and the second survey, suggesting an increase in students who were bound to run into academic problems in the years to come.

The most significant differences between the four patterns of learning strategy combinations were caused by differences in resource-oriented learning strategies like distractibility, time management, and effort. Here the diligent learners achieved particularly low LIST-scores for the former and particularly high scores for the latter two. They thus appeared well prepared for the requirements of medical school. Moreover, the diligent learners also scored highest on many other learning activities, which allowed for a flexible realignment to various learning problems. The fact that they also achieved the highest scores on cognitive and metacognitive learning strategies (except for rehearsal) indicated a preference for learning to understand as opposed to superficial memorizing. Indeed, previous publications showed that medical students who pursue a deep or strategic learning approach achieve the highest academic success, whereas a surface approach correlated with poorer outcomes [ 4 , 23 , 24 ].

The hard-working learners relied on repetitive strategies; however, they lacked the capacity to apply a variety of learning activities. In the long run, this is bound to prove detrimental with respect to the numerous theoretical and practical requirements of the clinical years and the professional life thereafter. We favor the idea that the hard-working learners changed their patterns of learning strategy combination most flexibly because they realized at some point that their strategies did not suffice to keep up with the increasing learning demands and that they were in dire need of an alternative strategy.

Both the relaxed and the sociable learners are likely to benefit from learning skill interventions [ 25 ]. Medical educators therefore need to impart sufficient metacognitive knowledge for students to adopt the hard-working if not the diligent strategy, and to show students how to gain control over their learning processes in order to turn into medical doctors who are proficient in life-long learning [ 1 ].

There are limitations to our study, and these concern the relatively small sample size of 107 students and the even smaller subgroups for the various patterns of learning strategy combinations. Moreover, our monocentric design and the short observational period within the framework of a traditional curriculum limit the explanatory power. It would be interesting to find out how the learning habits differ in a problem-based teaching environment and how they develop over a longer time, in particular during the clinical education. As we here compared learning patterns to the biology and physiology examinations only, we cannot extrapolate which learning strategy would be the best fit for other preclinical, let alone clinical, subjects. Moreover, any questionnaire assessing learning strategies harbors the risk of socially desirable statements. However, as opposed to the available literature, which assessed academic success via cumulative grade point averages and at a temporal distance, we here evaluated examinations written in temporal proximity to the surveys on learning strategies [ 4 , 5 , 14 ]. We were therefore not only able to follow individual developments but also to directly correlate changes in the learning strategies with academic achievements.

In summary, we here defined four distinct patterns of learning strategy combinations among early medical students; these patterns were distinguished by differences in resource-oriented learning strategies. Early habits of sociable learning were quickly abandoned, however, not in favor of more successful patterns. It is therefore essential to develop interventions on learning skills that have a lasting impact on the pattern of learning strategy combinations.

Acknowledgments

The authors are grateful to all study participants for their readiness to give us an insight into their learning behavior during the preclinical years. Moreover, we would like to express our thanks to the members of the student research group for their critical comments during data analysis.

Funding Statement

The author(s) received no specific funding for this work.

Data Availability

  • PLoS One. 2021; 16(1): e0245851.

Decision Letter 0

29 Oct 2020

PONE-D-20-27667

Learning strategy as predictor of academic success in medical school

Dear Dr. Müller-Hilke,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

The reviewers had concerns about the title, the grading methods, the review of the literature and the context as well as the limitations of the work.

Please submit your revised manuscript by Dec 13 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org . When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see:  http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Mohammed Saqr, Ph.D

Academic Editor

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2.Thank you for including your ethics statement:

"The study was approved by the local ethics committee and is registered under A 2018-0005."

Please amend your current ethics statement to include the full name of the ethics committee/institutional review board(s) that approved your specific study.

Once you have amended this/these statement(s) in the Methods section of the manuscript, please add the same text to the “Ethics Statement” field of the submission form (via “Edit Submission”).

For additional information about PLOS ONE ethical requirements for human subjects research, please refer to http://journals.plos.org/plosone/s/submission-guidelines#loc-human-subjects-research .

3. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please delete it from any other section.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: Yes

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The manuscript is well written and presents a very interesting topic. The authors present the subject matter in a clear and concise way that captures the reader's interest.

I am missing a review of related work on medical education or similar studies in other fields. In addition, I think the study is lacking a theoretical foundation. For instance, the authors use the LIST questionnaire, which is based on Pintrich's MSLQ instrument for self-regulation, but they do not mention the concept of self-regulation at all in their manuscript as the basis of the learning process (cognitive and metacognitive actions). I think including such a theoretical foundation for students' actions would greatly help position the article within the existing literature.

Some statements in the result analysis are a bit "bold". For instance, in line 146, "For the comparison of both cohorts resulted in a p-value of 0.065, confirming comparable ratios". A p-value of 0.065 fails to reject the null hypothesis, but that does not mean that the opposite is confirmed. It should say something like "The difference between male/female proportions in both cohorts was not statistically significant (p-value = .065)". Also, in Table 1, for instance, for such low p-values, it is usually specified as p < .001 instead of including so many decimal figures.
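The reviewer's point can be made concrete with a small sketch (ours, not part of the review; the counts below are hypothetical, not the study's data): a two-proportion z-test with p > 0.05 fails to reject equality of the male/female ratios, but that is not evidence that the ratios are equal.

```python
from math import sqrt, erfc

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the equality of two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided tail of the standard normal
    return z, p_value

# Hypothetical cohort compositions (NOT the study's data):
z, p = two_proportion_z_test(40, 100, 52, 100)
print(round(p, 3))  # → 0.089: not significant, yet not "confirming comparable ratios"
```

Failing to reject the null here only means the data are compatible with equal proportions; an equivalence test (e.g. TOST) would be needed to positively claim comparability.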

Lastly, I think it would be useful to include some sort of timeline of the study, including the interventions used at each point in time.

Reviewer #2: This study looks at different study strategies adopted by preclinical medical students at a German university and investigates whether these choices impact the academic performance for two basic science topics, biology and physiology. The authors also analyze whether students change their strategy from the first to the second year of medical school. The authors mention that they see learning success (examination scores) differences between groups of students with different learning strategies, but none of these differences is statistically significant. That may be due to the small sample size. The second question about students changing their learning strategy would also be of interest, but again it is hampered by the small sample size.

These drawbacks of the study make it very preliminary and we can’t draw any useful conclusions at this time. I would advise the authors to collect more data and if no significant learning success differences between learning strategies become apparent, to concentrate on the second question. It would be of interest to evaluate why students change their learning strategy and whether that makes a positive, negative or no difference for their academic success of these students.

Nice short title that describes the topic and the goal of the study. I would advise rewording the title to avoid the word "predictor". What the authors look at is a correlation (and they do not find any).

The introduction is well organized, concise and develops the problems addressed in this project.

The English of the manuscript, specifically the abstract, could be improved. Some of the word choices, although not wrong, are suboptimal. See a few examples below.

For example: “We here explored the learning strategies applied during the preclinical years, whether they are constant traits or subject to change and how they impact on the academic success.”

Better: “In the project described in this manuscript we explored the learning strategies used by medical students at our university during their preclinical years. We investigated whether these learning strategies were constant or changed over time and whether they correlated with academic success.”

The abstract misrepresents the findings of the study. As the academic differences between the different study strategies are not statistically significant, we can't draw any conclusion as to which strategy or strategies are best. The conclusion section is not based on the data but consists of simple truisms that are independent of the results presented in the described work.

The Methods section is missing a lot of important information.

I would appreciate more background about the structure of the German preclinical curriculum as it is used at the University of Rostock. Clearly describe the situation, such as the size of the entire class (224) and its composition. Were the study participants a representative sample of the class? Which students were eliminated from the study, and for what reasons? How and when were the surveys offered? There is only a very superficial description of how the learning outcome was measured. What is the role of the self-assessment? It appears that only the results of small biology/physiology examinations were used, leaving open the possibility that other learning strategies might be a better fit for other topics (anatomy, biochemistry, pharmacology etc.). It would improve the analysis if additional or more general examination scores were used for the correlation with the different learning strategies used by different students.

The statistical analysis appears to be appropriate.

Another significant problem of the study is that academic success was measured using different examination topics (as this is a longitudinal study, there is no way to change this aspect): biology for year 1 and physiology for year 2. At the very least, this needs to be critically discussed.

Also, it is never clearly stated that both topics show no statistically significant differences for the different learning strategies. That may be a direct result of the small sample size. You can’t discuss the differences in the result section if they are not statistically significant.

The most interesting result is that many students keep their initial study strategy, although some change. Again, the small sample size does not allow a more detailed analysis of why these changes occur and how they impacted individual students' performance.

Good that the authors included a paragraph about the limitations of their study. I would add a title “Limitation of the study” and organize this paragraph as a subsection of the discussion.

Limitations are that this analysis looks at preclinical learning, not at the students' clinical abilities. There is nothing wrong with that; however, this limitation needs to be stated. As the examinations only cover biology/physiology, do NOT use the general term "academic success", but make clear that it is only academic success in these subjects. Without further information and analysis, academic success can't be assumed for other preclinical and clinical subjects.

Smaller issues:

Define acronyms at their first appearance and independently in the abstract. E.g., LIST is never defined.

“Medical school” should be “medical school”.

“Examinations”, not “exams”.

“Likert”, not “LIKERT”. Likert is a name (of the social psychologist Rensis Likert), not an acronym.

Not “MC-questions”, but the correct acronym is MCQ for Multiple Choice Question.

“…and sat the biology exam.” That is not a complete sentence.

6. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy .

Reviewer #1: No

Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool,  https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org . Please note that Supporting Information files do not need this step.

Submitted filename: Pone-D-20-27667 Review.pdf

Author response to Decision Letter 0

Dear Reviewers,

We highly appreciate your comments - as they did point out (potential) misunderstandings and weaknesses. In fact, they prompted us to look at our data again with fresh eyes – and to revise our manuscript accordingly. We believe it improved significantly.

Below, please find our point-by-point reply.

Reviewer #1: The manuscript is well written and presents a very interesting topic. The authors present the subject matter in a clear and concise way that captures the reader's interest.

completely agreed – we introduced another sentence plus reference, please see LL59-61

Agreed – and done! Please see LL 105/106

Also agreed – and done. Please see Figure 1.

Reviewer #2: This study looks at different study strategies adopted by preclinical medical students at a German university and investigates whether these choices impact the academic performance for two basic science topics, biology and physiology. The authors also analyze whether students change their strategy from the first to the second year of medical school. The authors mention that they see learning success (examination scores) differences between groups of students with different learning strategies, but none of these differences is statistically significant. That may be due to the small sample size. The second question about students changing their learning strategy would also be of interest, but again it is hampered by the small sample size.

We considered your comments for quite a while – and that made us look at our data with fresh eyes. The preferences of our students – if they change their learning pattern, they do so mainly between the social and relaxed or between the diligent and hard-working patterns – made us compare the examination results accordingly. And that did indeed reveal academically more successful (diligent and hard-working) and less successful (social and relaxed) patterns. We revised our manuscript accordingly.

Agreed and done!

Agreed and done! Please, see LL83-86

Revised (according to our comment above).

I would appreciate more background about the structure of the German preclinical curriculum as it is used at the University of Rostock. Clearly describe the situation, like the size of the entire class (224) and its composition?

We revised the “study cohort” paragraph in the results section (LL96-106) as well as the “learning outcome” paragraph in the methods section (LL 309-321) in order to provide the missing information.

Were the study participants a representative sample of the class?

In terms of age and sex: yes. Other than that, we assume so, but we do not know.

Which students were eliminated from the study and for what reasons.

We excluded those students whose data were incomplete. For example, they may have declared consent but then did not even fill in the first LIST (n=34), they may have had complete data sets for T1 (the first LIST and biology examination scores) but did not respond at T2 (n=14), or they filled in the second LIST but then did not write the physiology examination (n=2). We rephrased this in the Results section, please see LL 102/103.

How and when were the surveys offered? There is only a very superficial description of how the learning outcome was measured. What is the role of the self-assessment? It appears that only results of a small biology/physiology examinations were used, leaving open that other learning strategies might be a better fit for other topics (anatomy, biochemistry, pharmacology etc.). It would improve the analysis if additional or more general examination scores would be used for the correlation with the different learning strategies used by different students.

There was only one examination each in temporal proximity to the LIST surveys. For one, that was a chemistry examination close to the biology one. However, in our experience, the results of the chemistry examination depend very much on whether and for how long students had chemistry in school. We therefore refrained from including the chemistry examination. For the other, there is the biochemistry examination shortly after the physiology examination. However, the physiology examination is considered by students to be the more challenging one, leading to a high percentage of failures in biochemistry because students concentrate their learning activities on physiology. At the time, we therefore decided to evaluate the physiology results. Looking back, it would have been wise to include all results. As it is, we cannot simply include the chemistry and biochemistry results, because the written consent explicitly asked for permission to evaluate biology and physiology.

At the end of the “learning outcome” paragraph in the methods section we elaborate on the self-evaluations. (LL 327-330)

We address this in our “limitations of the study” section, please see LL 265-278

As pointed out above, we revised our manuscript and believe that this particular comment is now obsolete.

We have extended this paragraph (LL276-279) in order to address your comments, but abstained from an extra title. We do not have any subtitles at all in our discussion and considered it inappropriate to put this extra weight on our limitations.

All agreed and done!


Submitted filename: response to reviewers.docx

Decision Letter 1

29 Dec 2020

PONE-D-20-27667R1

As you can see in the reviewers' comments, there are a few minor comments that need to be addressed. You may also need to consult the PLOS ONE style guidelines before submitting this version.

Please submit your revised manuscript by Feb 12 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org . When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: (No Response)

2. Is the manuscript technically sound, and do the data support the conclusions?

3. Has the statistical analysis been performed appropriately and rigorously?

4. Have the authors made all data underlying the findings in their manuscript fully available?

5. Is the manuscript presented in an intelligible fashion and written in standard English?

6. Review Comments to the Author

Reviewer #1: The authors have addressed all my previous concerns. The only remaining issue is the order of the sections. The Methods section should go before results, and not at the end of the manuscript.

Reviewer #2: I am very pleased with the corrections, clarifications and additions the authors have included in their revised manuscript. Their conclusions and the presentation are much clearer now. I think they understand that the small sample size and the limited number of topics reduce the impact of their study.

I have one remaining request, the real p-values need to be presented in Tables 1 and 2 and also for Figure 2. Just giving p-value ranges by number of stars is not sufficient.

7. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files.

Author response to Decision Letter 1

A Happy New Year to you – and thank you for your friendly comments. Please, find below our responses:

Reviewer #1: The authors have addressed all my previous concerns. The only remaining issue is the order of the sections. The Methods section should go before results, and not at the end of the manuscript.

Reviewer #2: I am very pleased with the corrections, clarifications and additions the authors have included in their revised manuscript. Their conclusions and the presentation are much clearer now. I think they understand that the small sample size and the limited number of topics reduce the impact of their study.

Exact p-values have been reintroduced. The replacement by asterisks was introduced into the former version in order to accommodate a comment by Reviewer #1. We therefore feel a little bit caught between a rock and a hard place…

Thanks to both of you for your time and effort!

Submitted filename: rebuttal letter.docx

Decision Letter 2

11 Jan 2021

PONE-D-20-27667R2

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/ , click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org .

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they'll be preparing press materials, please inform our press team as soon as possible, no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org .

Additional Editor Comments (optional):

Acceptance letter

13 Jan 2021

Dear Dr. Müller-Hilke:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org .

If we can help with anything else, please email us at plosone@plos.org .

Thank you for submitting your work to PLOS ONE and supporting open access.

PLOS ONE Editorial Office Staff

on behalf of

Dr. Mohammed Saqr

The impact of learning strategies on the academic achievement of university students in Saudi Arabia

Learning and Teaching in Higher Education: Gulf Perspectives

ISSN : 2077-5504

Article publication date: 11 February 2022

Issue publication date: 22 February 2022

Purpose

This study aimed to investigate the learning strategies adopted by Saudi university students and explore the differences in the use of learning strategies due to gender and academic achievement.

Design/methodology/approach

The study utilized a cross-sectional descriptive analytic approach and adopted the brief “ACRA-C” learning strategies scale. The study sample consisted of 365 students enrolled at a Saudi university selected using the random clustering technique.

Findings

The study revealed that microstrategies and study habits are the strategies most preferred by Saudi university students. Statistically significant differences in the use of learning strategies were found between male and female students, in favor of the female students. The study also found that learning strategies are a significant predictor of students' academic achievement.

Research limitations/implications

The study was limited to one college in one Saudi university. Future studies should use larger samples from different colleges and universities in Saudi Arabia and incorporate a variety of measures of academic achievement, such as students' grades in specific courses rather than the overall grade average.

Originality/value

While there are a number of studies that investigated the use of learning strategies by students, there is a lack of such research in the higher education context of Saudi Arabia. Hence, the current study contributes to closing this gap in the literature by looking at the use of learning strategies by university students in Saudi Arabia and the relationship between strategy use, gender and academic achievement.

  • Learning strategies
  • Saudi higher education
  • Academic achievement

Almoslamani, Y. (2022), "The impact of learning strategies on the academic achievement of university students in Saudi Arabia", Learning and Teaching in Higher Education: Gulf Perspectives , Vol. 18 No. 1, pp. 4-18. https://doi.org/10.1108/LTHE-08-2020-0025

Emerald Publishing Limited

Copyright © 2021, Yousef Almoslamani

Published in Learning and Teaching in Higher Education: Gulf Perspectives . Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode .

Introduction

Traditional rote-learning memorization has been the dominant learning strategy among students in educational institutions in the Kingdom of Saudi Arabia (KSA). This emphasis on rote memorization is responsible to a great degree for Saudi students being passive recipients of information in the classroom ( Al-Seghayer, 2021 ; Pordanjani & Guntur, 2019 ; Kim & Alghamdi, 2019 ).

Recently, in KSA, there has been substantial interest in raising students' awareness of learning strategies in an effort to increase the quality of learning in educational institutions and satisfy preestablished global performance standards, such as the KSA national accreditation requirements established by the National Commission of Academic Accreditation and Assessment (NCAAA). The accreditation certificate is a significant indicator of educational quality, and it assesses four aspects of the educational system: curriculum, instructors, teaching strategies and students. In terms of the student indicators, performance is the first measurement of learning quality ( Vermunt & Vermunt, 2017 ), while learning is measured through attainment or accumulative achievements, such as exam results. Ali, Medhekar and Rattanawiboonsom (2017) argued that student achievement in a higher education institution can be improved through several critical factors, namely the quality of the staff, the inclusion of information technology and appropriate learning strategies. Thus, a number of local studies have investigated the role and impact of instructors in promoting student achievement and learning. For example, Bashir, Lockheed, Ninan and Tan (2018) asserted that pedagogical practice and instructor knowledge play a critical role in increasing student learning. Similarly, Buchori, Setyosari, Dasna, Degeng and Sa'dijah (2017) established that instructors' strategies and techniques determine students' roles, activities and achievement in the learning process and likewise foster students' responsibility for their learning. Other studies investigated learning strategies which can help students acquire information and take an active role in the learning process (e.g. McMullen, 2009 ; Shehzad, Razzaq, Dahri, & Shah, 2019 ).

Research on learning strategies has shown that students may adopt more than one learning strategy, since different academic tasks and their nature require different processing strategies, ranging from simple to more complex. Some studies established that learning strategies can be a good predictor of academic achievement (e.g. Pennequin, Sorel, Nanty, & Fontaine, 2010 ; Muelas & Navarro, 2015 ; Pinto, Bigozzi, Vettori, & Vezzani, 2018 ; Tan, 2019 ), while others, such as Vettori, Vezzani, Bigozzi and Pinto (2020) , found that the relationship between learning strategies and academic achievement was negative. Furthermore, a few studies did not find any association between learning strategies and student performance (see Tariq et al. , 2016 ). In their study, Chiu, Chow and Mcbride-Chang (2007) found that different contextual factors, such as the economic and cultural background of the students, may substantially affect the association between learning strategies and academic achievement.

Despite the extensive research conducted investigating the relationship between the use of learning strategies and student academic performance, there is a lack of evidence on the use of learning strategies by Saudi students. Therefore, this study explores the learning strategies adopted by Saudi university students in the education process in light of the country's efforts to raise the quality of teaching and learning in its educational institutions.

Literature review

Learning strategies are defined as a set of approaches that learners use to acquire information and knowledge, such as taking notes, organizing information, summarizing and coding ( Muelas & Navarro, 2015 ). There is a difference between learning style and learning strategies. Learning style is used to describe the information processing routines associated with students' personalities, whereas learning strategies refer to students' learning approaches in specific learning activities and learning situations ( Curry, 1990 ; Li, Medwell, Wray, Wang, & Xiaojing, 2016 ).

Effective learning strategies refer to techniques and approaches learners use to achieve the acquisition, storage, retention, recall and adoption of knowledge. Cognitive learning theories consider learners as primary participants in the education process in which their role goes beyond passively acquiring information to being active participants. Consequently, students not only receive information and knowledge but also perform mental activities to process and adopt information effectively ( Shi, 2017 ). Accordingly, learners have a wide range of sources and are free to select their learning strategies, direct their learning process and control their tendencies and emotions to serve their learning objectives ( Díaz, Zapata, Diaz, Arroyo, & Fuentes, 2019 ).

  • Determine the information that is most significant by extracting keywords, ideas and models.
  • Make notes that are more frequently used within classroom time, which help students to recall the information mentioned by the lecturer.
  • Retrieve relevant information associated with the constructivist learning approach, which relies on making associations among prior information and newly acquired information.
  • Organize the content and material using the specific plan and obvious objectives previously formulated by learners.
  • Elaborate on the content of the material and course sources, extract conclusions and extrapolate the information.
  • Summarize the information into general ideas and concepts and determine the more important relationships and conceptual definitions.
  • Monitor their memorization and comprehension periodically to ensure their understanding and their knowledge.

Similarly, the study of Montero and Arizmendiarrieta (2017) explicated 10 learning strategies consisting of elaboration, time and effort, perseverance, organization, classmates' support, metacognition, self-questioning, the study environment, repetition and instructors' help. Furthermore, Juste and López (2010) identified seven learning strategies that include the planning and reinforcement of self-esteem, classification, problem-solving, repetition, cooperation, deduction and inference, and prediction and assessment. Apart from identifying specific strategies, Muelas and Navarro (2015) classified strategies into four main categories (i.e. information acquisition strategies, information coding strategies, information retrieval strategies and processing support strategies), while Vega-Hernández, Patino-Alonso, Cabello, Galindo-Villardón and Fernández-Berrocal (2017) identified three main categories of learning strategies: cognitive and learning control strategies, learning support strategies and study habits.

Further studies have attempted the classification of learning strategies into micro and macrostrategies ( Jiménez, García, López-Cepero, & Saavedr, 2017 ). Planning and self-regulation are the main pillars of macrostrategies while summarizing and highlighting information are related to tasks and situations that are present in microstrategies. According to Nikou and Economides (2019) , homework is one of the main examples of a microlearning strategy, and this explains why microstrategies are often used among students. Microlearning delivers learning through small and short units within short, focused activities. In microlearning, students summarize and highlight content to obtain smaller units, such as definitions, formulas and brief paragraphs. Conversely, the concept of macrostrategies is seen as a set of approaches encompassing monitoring, revising, checking and self-assessment. Macrostrategies are more general and developmental, and they cannot be directly defined.

Another classification associated with the use of learning strategies was proposed by Rosário et al. (2015) who stated that students have to be self-regulated to control their learning and effectively implement learning strategies. Therefore, students must acquire three types of knowledge: declarative, procedural and conditional knowledge. Declarative knowledge includes information about various learning strategies. Procedural knowledge includes knowing the appropriate way to apply the different learning strategies. Finally, conditional knowledge identifies the proper context to implement a specific learning strategy.

In addition to identifying and classifying the different learning strategies that students employ, a number of studies were carried out to examine the different preferences among students when adopting learning strategies. Vega-Hernández et al. (2017) explored the differences in learning strategy utilization among students according to gender and age and found that male students preferred learning support strategies and study habits, while female students used cognitive and learning control strategies more frequently. Díaz et al. (2019) also revealed that studying in a group, learning through graphic expression and focusing on information synthesis are most commonly used by university students. In a recent study, Tan (2019) found that students rarely used surface or strategic learning strategies, while they frequently used deep learning strategies, but at a moderate level, thus exhibiting less interest in reading and solving word and numeric problems in math.

The subject area has also been found to have an effect on the use of learning strategies. For example, Muelas and Navarro (2015) investigated student strategy use in three main subject areas: language, math and social sciences. In the language subject, the information coding and information recovery strategies were found to be the most significantly related to higher achievement. The coding strategy was the only strategy that had a significant correlation with higher achievement in math and social science subjects. Muelas and Navarro (2015) argued that teaching learning strategies can be a remedial solution for low student achievement, and they illustrated how to exploit brain competencies through learning strategies to improve academic achievement.

Apart from academic achievement, studies have also looked at other psychological aspects in the context of effective use of learning strategies. For example, Tan (2019) concluded that the use of learning strategies has a moderating effect on the relationship between self-concept and problem-solving skills in students studying mathematics. Similarly, Montero and Arizmendiarrieta (2017) found that remedial interventions in enhancing the use of learning strategies improved student motivation and learning beliefs. Vega-Hernández et al. (2017) also found the use of learning strategies had a positive relationship with perceived emotional intelligence (repair, attention and clarity).

While there are a number of studies that investigated different aspects of the use of learning strategies by university students, there is a lack of such research in the higher education context of Saudi Arabia. Hence, the current study contributes to closing this gap in the literature by looking at the use of learning strategies by Saudi university students and the relationship between strategy use and academic achievement. The research question that guided the present study was: “What is the impact of learning strategies on the academic achievement of Saudi university students?” The study further explored whether gender makes any difference in the selection and use of learning strategies.

Methodology

The study adopted a cross-sectional descriptive analytic approach and applied a quantitative method, using a scale as the data collection tool. The study examined the learning strategies adopted by students regardless of whether they had good basic knowledge of learning strategies (i.e., whether they used the strategies intentionally or not).

Participants

The study population comprised all students enrolled in the College of Education at a university in Saudi Arabia. First, four degree programs were identified using cluster sampling: Diploma, Bachelor, Master and Doctorate. Participants from each degree program were then selected using stratified random sampling so that the sample reflected the variety of the population. Students enrolled in the College of Education were selected to avoid differences in the use of learning strategies due to subject area. The target population consisted of 2,870 female students and 999 male students according to the admission and registration department of the university. The study sample consisted of 365 students, a size sufficient for the results to be generalized to all students enrolled in the College of Education at the target university (see Krejcie & Morgan, 1970 ). Table 1 shows that the gender distribution of the sample was balanced (49% female and 51% male). The majority of the participants were enrolled in a bachelor's degree program (81.9%). Participants' grade point average (GPA) varied: 44.9% had very good grades, 34.5% had good grades, 18.9% had excellent grades and 1.6% had passing grades. Participants were mainly in their final year (54.8%) and third year (25%).
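As a rough check on the sampling claim, the formula behind Krejcie and Morgan's (1970) sample-size table can be sketched in a few lines. The function name is ours, and the constants (chi-square for 1 df at 95% confidence, assumed proportion 0.5, 5% margin of error) are the standard ones from that paper, not values stated in this study:

```python
import math

def krejcie_morgan(population, chi_sq=3.841, p=0.5, margin=0.05):
    """Required sample size per Krejcie & Morgan (1970).

    chi_sq: chi-square value for 1 df at 95% confidence.
    p: assumed population proportion (0.5 maximizes the required n).
    margin: accepted margin of error.
    """
    numerator = chi_sq * population * p * (1 - p)
    denominator = margin ** 2 * (population - 1) + chi_sq * p * (1 - p)
    return math.ceil(numerator / denominator)

# Target population reported in the study: 2,870 female + 999 male students.
print(krejcie_morgan(2870 + 999))  # 350 -- the study's sample of 365 exceeds this
```

For very large populations the formula converges to the familiar threshold of 384, which is why the study's 365 respondents are adequate only because the population is a few thousand rather than unbounded.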

Data collection instrument

The study adopted the higher education version of the brief “ACRA-C” learning strategies scale by Jiménez et al. (2017) (see Appendix 1 ). The scale assesses the strategies used by students during the learning process at university. The original ACRA-C scale was adapted to the study context, and the scale used in the study comprised 22 items (17 items for learning strategies and 5 items for learning habits). Participants were asked to evaluate each item using a four-point Likert scale (from 1 = Never use to 4 = Always use). The scale is anchored mainly on the following strategies: cognitive and learning control strategies, learning support strategies and study habits. The 22 items were further organized into four main categories: microstrategies (Items 1–5), keys of memory and metacognition (Items 6–10), emotional-social support (Items 11–17) and study habits (Items 18–22). Microstrategies are strategies that control learning (e.g. “I make summaries after underlining”). Keys of memory and metacognition refer to the ability to self-regulate the learning process (e.g. “It helps me if I recall events or anecdotes to remember”). Emotional-social support refers to personal motivational aspects and learning support from one's surroundings (e.g. “I study hard to feel proud of myself”). Study habits refer to what students do habitually (e.g. “I try to express what I have learned in my own words, instead of repeating literally what the teacher or the book says”). A sociodemographic section was added to the scale to record information about the participants such as their degree, gender, college enrollment, GPA and years of study.

The instrument was translated into Arabic prior to distribution. To ensure the respondents understood the questions, the instrument was presented to a panel of academics in the field to confirm that the translated scale was linguistically and culturally valid. The scale was also piloted with five students drawn from the study population but not included in the study sample to ensure that they comprehended the items fully. Furthermore, the reliability of the scale was measured using the split-half method (Guttman coefficient = 0.657) and Cronbach's alpha; alpha values for the individual dimensions ranged from 0.658 to 0.777, representing an acceptable level of internal consistency (see Table 2 ), and the alpha for the total scale was 0.726, indicating good consistency.
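To illustrate the reliability statistic reported above, here is a minimal sketch of Cronbach's alpha using only the standard library. The function name and the synthetic responses are ours for illustration; they are not the study's data:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a scale, given one list of scores per item."""
    k = len(items)
    sum_item_var = sum(statistics.pvariance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    return k / (k - 1) * (1 - sum_item_var / statistics.pvariance(totals))

# Three items answered identically by four respondents: perfect internal consistency.
identical = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
print(round(cronbach_alpha(identical), 3))  # 1.0
```

Values in the 0.65–0.78 range reported for the dimensions indicate moderate but acceptable consistency; alpha rises toward 1 as items covary more strongly.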

To test the validity of the instrument, exploratory factor analysis (EFA) was conducted. According to the Kaiser–Meyer–Olkin (KMO) test, the sample was adequate for EFA (KMO = 0.707; Bartlett's sphericity p  = 0.000). The eigenvalues of the extracted factors ranged from 1 to 3.39, and the communalities of all items were higher than 0.4. The results showed that four factors could be retained after eliminating items with no factor loading of at least 0.4, as shown in Table 3 . These results are similar to those obtained by Jiménez et al. (2017) ; therefore, the factors were given the same names: microstrategies, keys of memory and metacognitive strategies, social-emotional supports and study habits.

Data analysis

Differences in learning strategy use among participants due to gender and GPA were investigated using mean-comparison tests such as the t -test. A combination of bivariate correlation and regression tests was then used to investigate the impact of learning strategies on the students' performance.
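The mean comparisons described here rest on the pooled two-sample t statistic, which can be sketched as follows. The sample values are invented for illustration and are not the study's data:

```python
import math
import statistics

def two_sample_t(a, b):
    """Student's t statistic for two independent samples, pooled variance."""
    n1, n2 = len(a), len(b)
    m1, m2 = statistics.mean(a), statistics.mean(b)
    # Pooled variance weights each sample's variance by its degrees of freedom.
    sp2 = ((n1 - 1) * statistics.variance(a)
           + (n2 - 1) * statistics.variance(b)) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Illustrative scale means for two small groups (hypothetical numbers).
females = [3.4, 3.1, 3.3, 3.5, 3.2]
males = [3.0, 2.9, 3.1, 2.8, 3.0]
print(round(two_sample_t(females, males), 2))  # a positive t favoring the first group
```

A positive t with p below the chosen alpha, as in the study's Table 5, indicates that the first group's mean is significantly higher.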

The central tendency and dispersion of participants' responses were measured for each dimension, as shown in Table 4 . Participants reported frequent use of all learning strategies in their learning, with a preference for microstrategies and study habits over the rest of the learning strategies. The kurtosis values for all dimensions excluding “study habits” were positive, indicating peaked distributions, while “study habits” showed a flatter distribution.

Furthermore, to investigate the differences in the participants' responses due to gender, the t -test was used, and the results are shown in Table 5 . The female participants reported a significantly higher level of use overall ( M  = 3.24; t (363) = 5.689, p  = 0.000) and also for each category of strategies: microstrategies ( M  = 3.28, SD = 0.504; t (363) = 3.79, p  = 0.000), keys of memory and metacognition ( M  = 3.26; t (363) = 4.65, p  = 0.000), emotional and social support ( M  = 3.21; t (363) = 3.75, p  = 0.000), study habits ( M  = 3.24; t (363) = 3.75, p  = 0.000), when compared to the male participants.

Furthermore, the study investigated the differences in the use of learning strategies using academic achievement and gender as the predictors. The results are shown in Table 6 . There was no difference in the learning strategies among students who achieved “passing” grades. However, among students with “good,” “very good” or “excellent” grades, significant differences in the use of learning strategies were found in favor of the female students.

According to Table 6 , female students who achieved “very good” grades showed higher overall use of learning strategies than males with the exception of “emotional-social support.” However, females who achieved “excellent” grades surpassed the males even in “emotional-social support” along with “study habits” and the overall use of learning strategies, while there was no difference between the genders in “microstrategies” and “keys of memory and metacognition” in this GPA group.

Table 7 shows the results of the linear regression test examining the impact of learning strategies on student achievement. According to the results, there is a positive relationship between the use of learning strategies and student achievement, with learning strategies explaining 8% of the variance in student achievement. In addition, the learning strategies were statistically significant in predicting student achievement ( F (1, 363) = 34.816, p  < 0.05).

Moreover, a multiple regression test was conducted to investigate the source of the impact of the various learning strategies on students' achievement. Before conducting a multiple linear regression, multicollinearity was checked: all variance inflation factors (VIFs) were less than 3, indicating no multicollinearity between the learning strategy dimensions, while linearity between the learning strategy dimensions and students' achievement was confirmed. The normality of the residuals was then examined using the Q-Q plot shown in Figure 1 ; the data points lie close to the diagonal line, indicating that the residuals are normally distributed.
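The multicollinearity check described here is conventionally done by reading VIFs off the diagonal of the inverted predictor correlation matrix. A sketch with synthetic data follows; the variable names echo the study's dimensions but the numbers are simulated, not the study's measurements:

```python
import numpy as np

def vifs(X):
    """Variance inflation factors: diagonal of the inverted correlation matrix."""
    corr = np.corrcoef(X, rowvar=False)
    return np.diag(np.linalg.inv(corr))

rng = np.random.default_rng(0)
micro = rng.normal(size=500)                      # independent predictor
memory = rng.normal(size=500)                     # independent predictor
support = 0.9 * micro + 0.1 * rng.normal(size=500)  # nearly collinear with micro
X = np.column_stack([micro, memory, support])
print(vifs(X).round(2))  # micro and support inflate far above the VIF < 3 cutoff
```

With all VIFs under 3, as the study reports, no predictor is close to being a linear combination of the others, so the regression coefficients can be interpreted separately.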

As can be seen in Table 8 , the overall model (microstrategies, keys of memory and metacognition, emotional-social support and study habits) was a significant predictor of student achievement ( F (4, 360) = 10.167, p  < 0.01), explaining 10% of the variance in academic achievement with a positive, mild correlation ( R  = 0.31). The significant contributors to the model were microstrategies ( β  =  0.138, p  = 0.013 < 0.05) and keys of memory and metacognition ( β  =  0.196, p  = 0.001 < 0.01). These two categories were the main sources of the effects on student achievement.
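As an internal consistency check, R² can be recovered from a reported F statistic and its degrees of freedom; applying this to the values reported above reproduces the stated effect sizes. This is a sketch of the standard algebraic identity, not part of the original analysis:

```python
# F = (R2 / k) / ((1 - R2) / df_resid)  =>  R2 = k*F / (k*F + df_resid)
def r_squared_from_f(f_stat, k, df_resid):
    """Recover R-squared from an overall regression F test with k predictors."""
    return k * f_stat / (k * f_stat + df_resid)

r2_multi = r_squared_from_f(10.167, k=4, df_resid=360)
r2_simple = r_squared_from_f(34.816, k=1, df_resid=363)
print(round(r2_multi, 3), round(r2_multi ** 0.5, 2))  # ~0.10 and ~0.32 vs reported 10%, R = 0.31
print(round(r2_simple, 3))                            # ~0.09, close to the reported 8%
```

The recovered values agree with the reported ones to within rounding, which supports the internal coherence of Tables 7 and 8.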

The present study utilized a scale to examine Saudi students' use of learning strategies and the extent to which strategy use is related to academic achievement and gender. The results showed a strong preference for microstrategies among students. This can be explained by the fact that in Saudi universities, students are encouraged to use microstrategies such as summarizing and highlighting information rather than macrostrategies such as self-regulated learning and planning for learning (see Alhaisoni, 2012 ; Al-Otaibi, 2004 ). In the majority of lectures delivered in Saudi universities, students are only passive recipients of information, summarizing and highlighting what the instructor presented during the lecture, using a specific textbook for reference ( Al-Seghayer, 2021 ; Pordanjani & Guntur, 2019 ; Kim & Alghamdi, 2019 ). This contradicts the results for university students in Lima reported by Díaz et al . (2019) , where students preferred metacognitive strategies and information processing strategies. Study habits, which ranked second in this study, reflect the level of self-regulation that Saudi students need to control their learning; this is aligned with higher education norms in Saudi Arabia, which mostly use a student-centered curriculum. Students therefore have to assume responsibility for their learning and accordingly seek summaries and short, focused activities to help them acquire information. Nevertheless, the descriptive data also pointed to a lack of emotional-social support for students. This could be attributed to poor educational content that does not meet students' interests or their educational needs ( Alenezi, 2020 ; Khan, 2019 ).

The results of the study further revealed differences in the frequency of use of the various learning strategies and in overall academic achievement, with female Saudi students showing higher use of learning strategies. Previous studies in other parts of the world have also shown that female students have a higher level of competence and willingness to perform better in their academic programs ( DiPrete & Buchmann, 2013 ; Tariq et al. , 2016 ; Quadlin, 2018 ). This result is also in agreement with the results obtained by Vega-Hernández et al. (2017) . Furthermore, female students with “good,” “very good” or “excellent” grades showed significant differences in their use of learning strategies compared to male students. However, this was not the case when comparing male and female students with low grade achievement, which is unsurprising: low-achieving students make little use of learning strategies regardless of gender. Among the highest GPA students, there was no gender difference in any learning strategy category except emotional-social support, where female students outperformed the male students. These students are highly motivated and competitive, with female students especially determined to prove themselves in a patriarchal, male-dominated society, making the emotional-social support strategies all the more important. Taken together, these results show that learning strategies have a significant effect on students' academic achievement, with clear implications for faculty in Saudi universities, who should use numerous and varied teaching strategies to induce students' use of appropriate learning strategies, especially among the weaker students. Ali et al . (2017) reported that both the quality of the staff and appropriate teaching and learning methods are factors that affect student learning at university. The findings of the current study contribute valuable insight into how faculty in Saudi universities may help develop students' use of appropriate learning strategies.

The differences found in the use of learning strategies between male and female students at varying GPA levels encouraged further investigation of the association between learning strategy use and students' academic performance. In this study, learning strategies explained 8% of the variance in student achievement. Microstrategies and keys of memory and metacognition were the main sources of the effects on student achievement, meaning that only these two categories statistically significantly predicted achievement. In addition, the overall model used in this study (microstrategies, keys of memory and metacognition, emotional-social support and study habits) was a significant predictor of student achievement, explaining 10% of the variance in academic achievement. This is in agreement with other empirical studies that support the positive relationship between the use of learning strategies and academic achievement ( Pennequin et al. , 2010 ; Pinto et al. , 2018 ). Furthermore, the evidence presented in this study contradicts studies that refuted any association between learning strategies and student achievement or performance (such as Tariq et al. , 2016 ).

In summary, the results revealed a positive relationship between learning strategies and student achievement, with the frequency of use of learning strategies significantly predicting students' academic achievement. Furthermore, female Saudi students were found to use learning strategies more frequently than male students, especially at higher GPA levels.

The study assessed the impact of Saudi university students' use of learning strategies on their academic achievement. The study adopted the higher education version of the brief “ACRA-C” learning strategies developed by Jiménez et al. (2017) and divided learning strategies into four main categories: microstrategies, keys of memory and metacognition, emotional-social support and study habits. A total of 365 female and male university students at a College of Education participated in the study. Results showed statistically significant differences in the use of learning strategies due to gender in favor of the female students, which implies that male students have to improve their use of learning strategies and study habits. The study also found that the use of learning strategies significantly predicted student achievement, particularly the microstrategies and keys of memory and metacognition. This implies that students have to pay more attention to the use of these learning strategies if they are to enhance their academic performance.

Based on the study results, it is recommended that training programs on learning strategies be introduced to enrich Saudi students' knowledge and utilization of learning strategies. Such training programs should take account of students' gender and academic level. Furthermore, students need to grasp the significance of learning strategies as facilitating tools for increasing their academic achievement.

While the study made a valuable contribution, it was limited to one college in one Saudi university. Future studies should use larger samples from different colleges and universities in Saudi Arabia and incorporate a variety of measures of academic achievement, such as students' grades in specific courses rather than the overall grade average.

Despite its limitations, the current study contributed to the field of learning strategy use and filled a gap in the literature by shedding light on the Saudi Arabian context. By examining the relationship between strategy use, academic achievement and gender, it makes an important contribution to Saudi higher education and provides a map to help improve the quality of higher education and student achievement in university.

Figure 1. Normal Q-Q plot of the standardized residual of the regression (DV: student achievement)

Table 1. Demographic characteristics of the participants ( N  = 365)

Table 2. Reliability of the scale

Table 3. Exploratory factor analysis of the instrument (four factors)

Table 5. The results of the mean comparison t -test according to gender ( N  = 365)

Al-Otaibi , G. N. ( 2004 ). Language Learning Strategy use Among Saudi EFL Students and its Relationship to Language Proficiency Level, Gender and Motivation . Dissertation . Indiana University of Pennsylvania .

Al-Seghayer , K. ( 2021 ). Characteristics of Saudi EFL learners' learning styles . English Language Teaching , 14 ( 7 ), 82 – 94 .

Alenezi , A. M. ( 2020 ). The relationship of students' emotional intelligence and the level of their readiness for online education: A contextual study on the example of university training in Saudi Arabia . The Education and Science Journal , 22 ( 4 ), 89 – 109 . doi: 10.17853/1994-5639-2020-4-89-109 .

Alhaisoni , E. ( 2012 ). Language learning strategy use of Saudi EFL students in an intensive English learning . Asian Social Science , 8 ( 13 ), 115 – 127 . doi: 10.5539/ass.v8n13p115 .

Ali , M. M. , Medhekar , A. , & Rattanawiboonsom , V. ( 2017 ). Quality enhancement in teaching-learning strategies of Bangladesh: A qualitative assessment . International Journal of Advanced Trends in Technology: Management and Applied Science (IJATTMAS) , 3 ( 1 ), 121 – 147 .

Bashir , S. , Lockheed , M. , Ninan , E. , & Tan , J.-P. ( 2018 ). Facing forward: Schooling for learning in Africa . The World Bank . Available at: https://www.worldbank.org/en/region/afr/publication/facing-forward-schooling-for-learning-in-africa .

Buchori , A. , Setyosari , P. , Dasna , W. , Degeng , N. S. , & Sa'dijah , C. ( 2017 ). Effectiveness of direct instruction learning strategy assisted by mobile augmented reality and achievement motivation on students cognitive learning results . Asian Social Science , 13 ( 9 ), 137 – 145 .

Chiu , M. M. , Chow , B. W.-Y. , & Mcbride-Chang , C. ( 2007 ). Universals and specifics in learning strategies: Explaining adolescent mathematics, science, and reading achievement across 34 countries . Learning and Individual Differences , 17 ( 4 ), 344 – 365 . doi: 10.1016/j.lindif.2007.03.007 .

Curry , L. ( 1990 ). A critique of the research on learning styles . Educational Leadership , 48 ( 2 ), 50 – 56 .

Díaz , M. A. , Zapata , N. A. , Diaz , H. H. , Arroyo , J. A. , & Fuentes , A. R. ( 2019 ). Use of learning strategies in the university. A case study . Propósitos y Representaciones Monographic: Advances on Qualitative Research in Education , 7 ( 1 ), 10 – 32 .

DiPrete , T. A. , & Buchmann , C. ( 2013 ). The Rise of Women: The Growing Gender Gap in Education and What it Means for American Schools . NY : Russell Sage .

Jiménez , L. , García , A.-J. , López-Cepero , J. , & Saavedra , F.-J. ( 2017 ). The brief-ACRA scale on learning strategies for university students . Revista de Psicodidáctica , 23 ( 1 ), 63 – 69 . doi: 10.1016/j.psicod.2017.03.001 .

Juste , M. P. , & López , B. R. ( 2010 ). Learning strategies in higher education . International Journal of Learning , 17 ( 1 ), 259 – 274 . doi: 10.18848/1447-9494/CGP/v17i01/46813 .

Khan , S. ( 2019 ). A comparative analysis of emotional intelligence and intelligence quotient among Saudi business students' toward academic performance . International Journal of Engineering , 11 , 1 – 10 . doi: 10.1177/1847979019880665 .

Kim , S. Y. , & Alghamdi , A. K. ( 2019 ). Female secondary students' and their teachers' perceptions of science learning environments within the context of science education reform in Saudi Arabia . International Journal of Science and Mathematics Education , 17 , 1475 – 1496 . doi: 10.1007/s10763-018-09946-z .

Krejcie , R. V. , & Morgan , D. W. ( 1970 ). Determining sample size for research activities . Educational and Psychological Measurement , 30 ( 3 ), 607 – 610 . doi: 10.1177/001316447003000308 .

Li , Y. , Medwell , J. , Wray , D. , Wang , L. , & Xiaojing , L. ( 2016 ). Learning styles: A review of validity and usefulness . Journal of Education and Training Studies , 4 ( 10 ), 90 – 94 .

McMullen , M. G. ( 2009 ). Using language learning strategies to improve the writing skills of Saudi EFL students: Will it really work? System , 37 ( 3 ), 418 – 433 . doi: 10.1016/j.system.2009.05.001 .

Montero , C. R. , & Arizmendiarrieta , B. S. Y. ( 2017 ). The effectiveness of a learning strategies program for university students . Psicothema , 29 ( 4 ), 527 – 532 . doi: 10.7334/psicothema2016.171 .

Muelas , A. , & Navarro , E. ( 2015 ). Learning strategies and academic achievement . Procedia – Social and Behavioral Sciences , 165 , 217 – 221 . doi: 10.1016/j.sbspro.2014.12.625 , Proceeding in CPSYC 2014 .

Nikou , S. A. , & Economides , A. A. ( 2019 ). Mobile-based micro-learning and assessment: Impact on learning performance and motivation of high school students . Journal of Computer Assisted Learning , 34 ( 3 ), 269 – 278 . doi: 10.1111/jcal.12240 .

Pennequin , V. , Sorel , O. , Nanty , I. , & Fontaine , R. ( 2010 ). Metacognition and low achievement in mathematics: The effect of training in the use of metacognitive skills to solve mathematical word problems . Thinking and Reasoning , 16 ( 3 ), 198 – 220 . doi: 10.1080/13546783.2010.509052 .

Pinto , G. , Bigozzi , L. , Vettori , G. , & Vezzani , C. ( 2018 ). The relationship between conceptions of learning and academic outcomes in middle school students according to gender differences . Learning, Culture and Social Interactions , 16 , 45 – 54 . doi: 10.1016/j.lcsi.2017.11.001 .

Pordanjani , S. R. , & Guntur , L. M. ( 2019 ). Humanities investigating the implementation of critical literacy approach in the Middle-East education contexts: Three main constraints . ELS Journal on Interdisciplinary Studies on Humanities , 2 ( 3 ), 410 – 418 .

Quadlin , N. ( 2018 ). The mark of a woman's record: Gender and academic performance in hiring . American Sociological Review , 83 ( 2 ), 331 – 360 . doi: 10.1177/0003122418762291 .

Rosário , P. , Núñez , J. C. , Trigo , L. , Guimarães , C. , Fernández , E. , Cerezo , R. , Fuentes , S. , Orellana , M. , Santibáñez , A. , Fulano , C. , Ferreira , Â. , & Figueiredo , M. ( 2015 ). Transcultural analysis of the effectiveness of a program to promote self-regulated learning in Mozambique, Chile, Portugal, and Spain . Journal of Higher Education Research and Development , 34 ( 1 ), 173 – 187 . doi: 10.1080/07294360.2014.935932 .

Shehzad , M. W. , Razzaq , S. , Dahri , A. S. , & Shah , S. K. ( 2019 ). The association between reading self-efficacy beliefs and meta cognitive reading strategies among Saudi PYP students . The Dialogue , 14 ( 2 ), 32 – 43 .

Shi , H. ( 2017 ). Learning strategies and classification in education . Institute for Learning Styles Journal , 1 , 24 – 36 .

Tan , R. E. ( 2019 ). Academic self-concept, learning strategies and problem-solving achievement of university students . European Journal of Education Studies , 6 ( 2 ), 287 – 303 . doi: 10.5281/zenodo.3235652 .

Tariq , S. , Khan , M. , Afzal , S. , Shahzad , S. , Hamza , M. , Khan , H. , & Shaikh , S. ( 2016 ). Association between academic learning strategies and annual examination results among medical students of King Edward medical university . Annals of King Edward Medical University , 22 ( 2 ), 124 – 134 . doi: 10.21649/akemu.v22i2.1290 .

Tomar , S. , & Jindal , A. ( 2014 ). A study of effective learning strategies in relation to intelligence level across the science and arts academic streams of secondary level . IOSR Journal of Research and Method in Education (IOSR-JRME) , 4 ( 6 ), 41 – 50 .

Vega-Hernández , M. C. , Patino-Alonso , M. C. , Cabello , R. , Galindo-Villardón , M. P. , & Fernández-Berrocal , P. ( 2017 ). Perceived emotional intelligence and learning strategies in Spanish university students: A new perspective from a canonical non-symmetrical correspondence analysis . Frontiers in Psychology , 8 . doi: 10.3389/fpsyg.2017.01888 .

Vermunt , J. D. , & Donche , V. ( 2017 ). A learning patterns perspective on student learning in higher education: State of the art and moving forward . Educational Psychology Review , 29 ( 2 ), 269 – 299 .

Vettori , G. , Vezzani , C. , Bigozzi , L. , & Pinto , G. ( 2020 ). Upper secondary school students' conceptions of learning, learning strategies . The Journal of Educational Research , 113 ( 6 ), 475 – 485 . doi: 10.1080/00220671.2020.1861583 .

Further reading

Almusharraf , N. M. ( 2019 ). Learner autonomy and vocabulary development for Saudi university female EFL learners: Students' perspectives . International Journal of Linguistics , 11 ( 1 ), 166 – 195 .

Babbage , R. , Byers , R. , & Redding , H. ( 2008 ). Approaches to Teaching and Learning: Including Pupils with Learning Difficulties . Oxon : David Fulton Publishers .

About the author

Dr. Yousef Almoslamani is an Assistant Professor at the Instructional Technology Department, Faculty of Education, Ha'il University. He holds a PhD in Educational Technology from the University of Northern Colorado, USA.


17 Learning Strategies to Implement In Your Classroom

Learning strategies are a critical element in ensuring students grasp course concepts and are especially important in blended and online learning environments

Danielle Leboff

Learning strategies are methods used by instructors to guide students toward effective learning through a variety of engaging techniques, activities and practices. These methods are derived from years of meticulous research into how people learn best.

In any lesson plan, instructors can incorporate multiple learning strategies. By catering to different learning styles and varying your approach, you can better engage students while helping them master new concepts.

Why are learning strategies important?

Learning strategies are an essential component of creating an effective learning experience. They can help learners build proficiency in various subject areas, acquire new skill sets and gain confidence in their own knowledge and learning abilities.

The following describes some common strategies for achieving various learning outcomes, along with practical examples you can incorporate directly into your learning environment.

Think-pair-share

This active learning exercise is designed to activate any prior knowledge a student may have on a subject by having them share their thoughts and beliefs with their fellow learners.

A think-pair-share exercise is structured to help students first organize their thoughts, then share these with a partner followed by the broader class.

  • Think : Students take a moment to contemplate the new concept or idea on their own. They can also write down their thoughts to help develop their note-taking skills.
  • Pair : Students break off into pairs to share their thoughts and beliefs on the topic with another learner.
  • Share : Students then share their takeaways from this conversation with one or more successively larger groups, up to and including the whole class.

Putting think-pair-share into action

To execute think-pair-share in your class, define the exercise for the group and display the prompts you’d like to pose for discussion. Once students have completed the exercise, you can then facilitate a larger class discussion.

Make a point of listening to student responses before offering your own ideas. You can also pose probing questions while encouraging other students to offer their own responses and reactions to each other’s ideas.

Tests and quizzes

There are several ways instructors can use tests and quizzes as effective strategies for learning.

Individual plus group quizzes : Have learners complete independent quizzes for grading. Following this, place learners into small groups and give them the same quiz as a form of cooperative learning. This time, allow the groups to discuss their answers and come up with an answer for each question. Then, grade the group as a whole on their collective performance.

Not every student likes group assignments, since this may raise concerns about their individual grades. To avoid penalizing more diligent learners, take an average of each student’s two scores if the group score is higher than their individual score. If the student’s individual score is higher than the group score, let that individual score stand as the average. This process encourages students to be accountable for their own learning while helping develop their test-taking and collaboration skills.
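The grading rule described above can be stated precisely in a few lines. This is a sketch; the function name is ours:

```python
def final_quiz_score(individual, group):
    """Individual-plus-group quiz grading: the group score can only help.

    If the group outperforms the student, average the two scores;
    otherwise the student's own score stands.
    """
    if group > individual:
        return (individual + group) / 2
    return individual

print(final_quiz_score(80, 90))  # 85.0 -- group discussion lifted the grade
print(final_quiz_score(90, 80))  # 90 -- a diligent student is not penalized
```

Because the final score is never below the individual score, students keep an incentive to prepare on their own while still benefiting from the group discussion.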

Tests and quizzes with distractors : Distractors are common preconceptions or misconceptions about a topic. Have students answer various questions and, then, discuss their answers with a fellow student. After this discussion, have each student answer the same question again and see if their answers are any different. To close off the activity, initiate a group discussion about why the correct answer is actually the correct one. This acts as a form of metacognition by encouraging students to think about their own learning.

Retrieval practice

The process of bringing information to mind, or retrieval practice, is an effective strategy for boosting learning. In these exercises, students put away all learning materials and answer questions or discuss a topic purely based on their own recall of the information. Students can then refer back to their learning materials to evaluate how accurately they conveyed the information. Retrieval practice exercises also work well using the think-pair-share format.

Elaboration

In elaboration, students demonstrate the depth of their knowledge of a given topic by describing and explaining as much as they know about it, including as many relevant details as they can call to mind. This strategy extends the concept of rote memorization by encouraging students to draw connections within the content and between the content and other knowledge they already possess.

Interleaving

Interleaving is the process by which students mix multiple subjects or topics while they study. This allows students the opportunity to practice different modes of thinking and problem-solving as opposed to ‘blocked practice,’ which involves studying one topic thoroughly before moving on to the next.

Interleaving has been shown to improve test scores in a number of studies. As a best practice, it is important to use interleaving for related topics. For example, interleaving works well when switching between different algebra problems but is not nearly as effective when switching between radically different subject matter areas, such as literature and math equations.

Muddiest point

This form of assessment helps educators understand which elements of their course pose difficulties that may impede student progress and performance.

In this exercise, instructors ask students to note the “muddiest points” of the lesson: the parts they found most confusing or difficult to grasp. Have students rate their degree of understanding and pinpoint where the difficulty lies.

While the exercise shouldn’t take more than a few minutes, it has additional benefits beyond helping the instructor understand where the obstacles are for students. It also helps students more effectively analyze their own learning and zero in on the exact issue that may be holding them back.

Peer instruction

Sometimes likened to ‘reciprocal teaching,’ this structured teaching practice asks students to reflect on new concepts they may be confused about and then share their responses to those prompts with a small group. Each group then derives a consensus response to share collectively with the rest of the class.

Peer instruction offers a number of benefits, including:

  • Increasing a student’s problem-solving skills and conceptual understanding abilities
  • Deepening student understanding of a topic and encouraging greater knowledge retention
  • Bolstering student engagement and raising student course satisfaction

Not only does this exercise call upon students to explain their thinking, it asks them to defend it against alternative arguments and viewpoints. This reveals as much about how students think and process information as it does about the information itself.

Differentiated instruction

Not all students learn the same way. Differentiated instruction recognizes and accommodates this by tailoring the learning process to individual needs. This is accomplished by altering the content, process, product or the learning environment itself.

With differentiated instruction, instructors consider the different learning styles of their students before devising their teaching strategies. That way, they can incorporate multiple modalities to allow all students to succeed equally in learning the material.

Some other ways to implement differentiated learning include:

  • Grouping students together for assignments by shared topics, interests, learning abilities or styles
  • Using formative assessment tools to assess individual student learning styles and progress and then adjusting lesson plans accordingly
  • Using classroom management tools to create safe and supportive learning environments for all students

Gamification

Sometimes turning a lesson into a game can better engage students in learning and comprehending the material. Gamification essentially incorporates reward-based activities and teaching tools into the lesson plan. Examples of gamification include:

  • Earning points for finishing tasks
  • Competing against peers toward a goal
  • Playing games that teach particular academic skills

Project-based learning

Through project-based learning, students work together on a project over an extended period, generally between one week and an entire semester. The project ideally involves solving a real-world problem or addressing a complex question, and it culminates in a presentation or product delivered to a live audience.

Problem-based learning

Problem-based learning uses real-world situations as a vehicle for applying course concepts in practice. This makes learning more relevant by connecting concepts to the world outside the classroom and can add variety to the learning process itself.

Formative assessments

Formative assessments are designed to monitor learning and provide feedback on each student’s progress on an ongoing basis. The steady stream of feedback allows instructors to refine and improve their teaching strategies to keep the class on track. At the same time, students can practice their test-taking skills and improve information recall while homing in on their areas of strength and weakness.

Formative assessments are typically considered “low stakes.” The primary goal is not a letter grade but generating feedback for the instructor and the student. Examples of formative assessments include:

  • Self-assessments
  • Entry and exit slips
  • Low-stakes polls and quizzes
  • Exercises incorporating art or other visual representations of learning content
  • Misconception and error checks
  • Interview assessments

Summative assessments

Instructors use summative assessments to evaluate how thoroughly students learned an area of study. Summative assessments usually come upon the completion of an instructional unit and compare student knowledge and achievement against a previously determined set of benchmarks.

Considered “high stakes,” summative assessments are commonly used to determine a student’s subsequent course work and educational progress. Examples of summative assessments include:

  • Final projects
  • Term papers
  • Midterm, final or standardized exams
  • Performance or recital

Educators may sometimes use summative assessments in a formative manner to guide student activities and efforts throughout their coursework.

Quick write

In this exercise, pose a prompt to the group to respond to in writing. Only allow five minutes for this exercise, so students can quickly reflect on their initial thoughts on a subject.

Uses and benefits of a quick write include helping to:

  • Determine whether students completed their assigned homework
  • Prime students to think about topics to be introduced or developed in the upcoming lesson
  • Give students the chance to access previous knowledge they may have on a subject

Instructors can opt to grade the quick write or simply collect it as a means of confirming attendance.

Polling

Pose a question to be answered or explained, and then take an anonymous poll to see how many students favor particular answers or explanations to the question.

Afterward, initiate a group discussion of the question and the poll’s results to see why students voted the way they did. Following the discussion, take the same poll again to gauge whether any students changed their answers and, if so, to what extent and why.

Hearing why students chose a particular explanation or answer helps the instructor understand how students think about that topic. It also helps them determine if additional explanation or clarification may be required before moving on in the lesson plan.

Turn and talk

In this exercise, instructors pose a question to the group, then instruct students to choose a partner to discuss their thoughts on the question with. This can create a comfortable atmosphere for sharing ideas before bringing ideas before the whole group.

Make sure the discussion questions are clear and that each participant understands them well enough to contribute to the conversation as both a speaker and a listener.

Jigsaw

This exercise is performed in small groups in which students read a preselected passage of course material. Students in each group divide up the material so that each member reads a portion of it silently and then shares what they’ve learned with the rest of the group.

Some questions participants can use as points of focus include:

  • What’s the big idea here?
  • What do you believe it means, and why does it matter?
  • How can someone apply this idea to help understand a larger topic?
  • What part(s) of the reading do you agree and/or disagree with?
  • What questions does the reading raise for you?

Instructors can implement jigsaws in a number of ways. In an ‘expert and cooperative group’ format, assign different groups different pieces of the material to read individually and discuss. Each group then becomes the expert group on that portion of the material. Following this, groups are redivided so that each new cooperative group contains one or two representatives from each of the previous expert groups. Each cooperative group then reviews the material with the expert representative. The jigsaw method is a great way to get students up to speed quickly on material while honing their critical thinking and communication skills.

Learning strategies help you better engage students in active learning by using a variety of activities such as reading, writing, discussion or problem-solving. Easy to execute, these activities promote analysis, synthesis, and the evaluation of class content. Equally important, they provide students with opportunities for feedback on how well they understand course material, ensuring they are making meaningful progress toward achieving course objectives.


Schools are using research to try to improve children’s learning – but it’s not working

Senior Research Fellow in the Centre for Teachers and Teaching Research, UCL

Disclosure statement

Sally Riordan is currently working on two projects that receive funding from the Education Endowment Foundation.

University College London provides funding as a founding partner of The Conversation UK.

Evidence is obviously a good thing. We take it for granted that evidence from research can help solve the post-lockdown crises in education – from how to keep teachers in the profession to how to improve behaviour in schools, get children back into school and protect the mental health of a generation.

But my research and that of others shows that incorporating strategies that have evidence backing them into teaching doesn’t always yield the results we want.

The Department for Education encourages school leadership teams to cite evidence from research studies when deciding how to spend school funding. Teachers are more frequently required to conduct their own research as part of their professional training than they were a decade ago. Independent consultancies have sprung up to support schools to bring evidence-based methods into their teaching.

This push for evidence to back up teaching methods has become particularly strong in the past ten years. The movement has been driven by the Education Endowment Foundation (EEF), a charity set up in 2011 with funding from the Conservative-Liberal Democrat coalition government to provide schools with information about which teaching methods and other approaches to education actually work.

The EEF funds randomised controlled trials – large-scale studies in which students are randomly assigned to an educational initiative or not, and comparisons are then made to see which students perform better. For instance, several of these studies have been carried out in which some children received one-on-one reading sessions with a trained classroom assistant, and their reading progress was compared with that of children who did not. The cost of one of these trials was around £500,000 over the course of a year.

Trials such as this in education were lobbied for by Ben Goldacre, a doctor and data scientist who wrote a report in 2013 on behalf of the Department for Education. Goldacre suggested that education should follow the lead of medicine in the use of evidence.

Using evidence

In 2023, however, researchers at the University of Warwick pointed out something that should have been obvious for some time but has been very much overlooked – that following the evidence is not resulting in the progress we might expect.

Reading is the most heavily supported area of the EEF’s research, accounting for more than 40% of projects. Most schools have implemented reading programmes with significant amounts of evidence behind them. But, despite this, reading abilities have not changed much in the UK for decades.

This flatlining of test scores is a global phenomenon. If reading programmes worked as the evidence says they do, reading abilities should be better.

And the evidence is coming back with unexpected results. A series of randomised controlled trials, including one looking at how to improve literacy through evidence, have suggested that schools that use methods based on research are not performing better than schools that do not.

In fact, research by a team at Sheffield Hallam University has demonstrated that, on average, these kinds of education initiatives have very little to no impact.

My work has shown that when the findings of different research studies are brought together and synthesised, teachers may end up implementing these findings in contradictory ways. Research messages are frequently too vague to be effective because the skills and expertise of teaching are difficult to transfer.

It is also becoming apparent that the gains in education are usually very small, perhaps because learning is the sum total of trillions of interactions. It is possible that the research trials we really need in education would be so vast that they are currently too impractical to do.

It seems that evidence is much harder to tame and to apply sensibly in education than elsewhere. In my view, it was inevitable and necessary that educators had to follow medicine in our search for answers. But we now need to think harder about the peculiarities of how evidence works in education.

Right now, we don’t have enough evidence to be confident that evidence should always be our first port of call.


ORIGINAL RESEARCH article

This article is part of the research topic: Global Lesson Study Policy, Practice, and Research for Advancing Teacher and Student Learning in STEM

Evolving Engineering Education: Online vs. In-Person Capstone Projects Compared (EEE-OIPC) (provisionally accepted)

  • 1 Engineering and Physics Department, Texas A&M University Texarkana, United States

The final, formatted version of the article will be published soon.

This study compares online and face-to-face (F2F) instructional methods in Capstone Senior Design (CSD) projects across the disciplines of Electrical Engineering (EE) and Mechanical Engineering (ME). Through a comprehensive assessment involving project evaluations, advisor feedback, and self- and peer reviews, it aims to gauge the efficacy of each approach in enhancing student success and learning outcomes. A key observation is the parity between online and F2F modalities on several metrics, yet F2F instruction distinctly advances teamwork and collaboration. Conversely, online environments show robust advisor evaluations, signifying effective mentoring despite hurdles in consistent team collaboration and project execution. The findings highlight the imperative to blend online and traditional pedagogies, suggesting improved online strategies and a holistic curriculum to boost CSD students' learning experiences. These insights stress the need for flexible teaching techniques to enrich learning across varied educational settings, and they offer guidance for educators and institutions adapting their strategies to the changing landscape of online and F2F instruction in STEM.

Keywords: Capstone Senior Project, Online Learning, F2F, Teamwork, Engineering Education, project-based learning, group projects, Communication

Received: 19 Mar 2024; Accepted: 10 Apr 2024.

Copyright: © 2024 Znidi, Uddin and Morsy. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Dr. Faycal Znidi, Texas A&M University Texarkana, Engineering and Physics Department, Texarkana, 75503, Texas, United States

  • Open access
  • Published: 07 April 2024

Efficacy of a virtual nursing simulation-based education to provide psychological support for patients affected by infectious disease disasters: a randomized controlled trial

  • Eunjung Ko 1 &
  • Yun-Jung Choi 1  

BMC Nursing volume 23, Article number: 230 (2024)


Virtual simulation-based education for healthcare professionals has emerged as a strategy for dealing with infectious disease disasters, particularly when training at clinical sites is restricted due to the risk of infection and a lack of personal protective equipment. This research evaluated a virtual simulation-based education program intended to increase nurses’ perceived competence in providing psychological support to patients affected by infectious disease disasters.

The efficacy of the program was evaluated via a randomized controlled trial. We recruited 104 nurses for participation in the study and allocated them randomly and evenly to an experimental group and a control group. The experimental group was given a web address through which they could access the program, whereas the control group was provided with a web address that directed them to text-based education materials. Data were then collected through an online survey of competence in addressing disaster mental health, after which the data were analyzed using the Statistical Package for the Social Sciences (version 23.0).

The analysis showed that the experimental group’s disaster mental health competence (F = 5.149, p = .026), problem solving process (t = 3.024, p = .003), self-leadership (t = 2.063, p = .042), learning self-efficacy (t = 3.450, p = .001), and transfer motivation (t = 2.095, p = .039) differed from those of the control group to a statistically significant degree.

Conclusions

A virtual nursing simulation-based education program for psychological support can overcome limitations of time and space. The program would also be an effective learning resource during infectious disease outbreaks.

Clinical trial registration

This Korean clinical trial was retrospectively registered (21/11/2023) in the Clinical Research Information Service (https://cris.nih.go.kr) with trial registration number KCT0008965.

Peer Review reports

The last two decades have confronted the world with a variety of infectious diseases, such as severe acute respiratory syndrome, which first occurred in Asia in 2003 before spreading worldwide, including Korea, in only a few months. Since then, infectious disease outbreaks began to be recognized as severe disasters. Other examples include the 2009 H1N1 influenza outbreak, which caused more than 10,000 deaths worldwide and 140 deaths in Korea; the proliferation of the Ebola virus, which resulted in a fatality rate of more than 90% in Africa in 2014; and the outbreak of Middle East respiratory syndrome in 2015, Zika virus disease in 2016, and coronavirus disease (COVID-19) in 2019 [ 1 ]. The COVID-19 pandemic, in particular, has caused infections among approximately 64 million people and the deaths of 1.5 million individuals as of December 2020 [ 2 ].

Direct victims of infectious disease disasters, infected patients, and quarantined individuals suffer from a fear of stigma or social blame and guilt, but even people who are unexposed to sources of infection experience psychological distress from anxiety and fear of disease or possible death [ 3 ]. They also blame infected people and harbor hatred toward them [ 3 ]. This assertion is supported by an examination of web search behaviors and infodemic attitudes toward COVID-19, which identified superficial and racist attitudes [ 4 ]. Additionally, in research using a health stigma and discrimination framework related to communicable diseases, the authors found that people exhibit negative stereotypes, biases, and discriminatory conduct toward infected groups owing to fears of contagion, concerns about potential harm, and perceptions that individuals violate central values [ 5 ]. Stigmatized individuals experience adverse effects on their health because of both the stress induced by stigma and the decreased use of available services [ 5 ].

Severe and prolonged anxiety, fear, blame, and aggression can lead to mental health problems, including depression, anxiety, panic attacks, somatic symptoms, post-traumatic stress disorder, psychosis, and even suicide and life-threatening behaviors [ 6 ]. Therefore, recovery from the psychological trauma caused by a disaster should be regarded as equally necessary as physical recovery, with emphasis placed on psychological support activities that prevent the deterioration of mental health [ 7 ].

Disasters pose a significant threat to mental health support systems, wherein the lack of healthcare professionals or psychologists trained to address these conditions exacerbates the psychological distress and psychopathological risk experienced by society [ 8 ]. When training at clinical sites is restricted due to infection risks and a lack of personal protective equipment (PPE), an emerging solution is virtual simulation [ 9 ].

A virtual simulation is a simulation modality developed on the basis of video or graphic recordings featuring virtual patients and delivered via either a static or mobile device. It replicates real-world clinical situations and affords learners an interactive experience [ 10 ]. Virtual simulation-based education provides an immersive clinical environment, as virtual patients respond to a learner’s assessments and interventions [ 11 , 12 ]. It enables two-way communication, and allows medical professionals to practice making clinical decisions [ 10 ]. Virtual patients are equipped with voice, intonation, and expressions that reinforce the educational narrative within the virtual environment, thereby enhancing the effectiveness of the learning experience [ 13 ]. One of the primary advantages of virtual simulation-based education is its provision of a safe and non-threatening environment in which learners can practice. It also offers flexible and reproducible learning experiences, thus catering to the diverse needs of learners [ 14 ].

Self-assessment is the most commonly used competence evaluation tool, as it is cost-effective and helps nurses improve their practice by identifying their strengths and weaknesses for development [ 15 ]. Self-assessed competence is also related to the quality of patient care because nurses promote continuous learning by determining educational needs through such evaluations [ 16 ]. The competence perceived by a nurse is inherently subjective given its self-reported nature and poses a challenge in establishing a direct correlation with the actual care of patients [ 17 , 18 ]. However, studies have indicated that increased levels of self-perceived competence are associated with a significant increase in core competencies related to patient care and frequent use of clinical skills [ 19 , 20 ]. Perceived competence likewise influences the job satisfaction and organizational citizenship behavior of nurses and is significantly related to absenteeism, one of the deterrents to the delivery of quality care [ 21 , 22 ].

Competence refers to the possession of qualifications and abilities to satisfy professional standards, as well as the capability to perform tasks and duties in a suitable and effective manner [ 23 ]. Competencies for disaster mental health are crucial for enhancing disaster response capabilities. These competencies encompass a range of skills, knowledge, and attitudes necessary for mental health professionals to effectively support individuals and communities affected by disasters [ 24 ]. Such competencies and how they are affected by simulation-based training have been explored in some studies, which reported a significant increase in competence after exposure to the aforementioned education [ 25 , 26 ].

Simulation education, built on mock training designs based on real situations, provides opportunities to exercise problem-solving through various strategies. The problem-solving process is considered a key competency through which learners are expected to enhance their relevant knowledge and clinical performance abilities [ 27 ]. In particular, problem-solving processes for identifying and assessing problems and finding solutions are psychological strategies that help people cope and recover after a disaster [ 28 ]. A scoping review on the effect of simulation-based education on the problem-solving process indicated that out of 32 studies reviewed, 21 demonstrated statistically significant improvement in people’s ability to resolve problems [ 29 ].

Simulation training can also address self-leadership, which is an essential self-learning quality that aids individuals in staying motivated and focused on their learning goals. It is also required as a basic qualification of professional nurses, who must be able to take initiative and make responsible decisions [ 30 , 31 ]. Previous studies have reported statistically significant improvements in self-leadership following simulation training [ 32 , 33 ].

Another aspect that benefits from simulation-driven education is learning self-efficacy, which plays a crucial role in predicting learners’ levels of engagement and academic success in online education. It reflects learners’ confidence in their ability to manage their own learning process. It is a significant predictor of both learners’ participation levels and their academic achievements in online education settings [ 34 , 35 ]. Several studies have demonstrated virtual simulation- or online education-induced significant improvements in learning self-efficacy [ 36 , 37 ]. Finally, virtual simulation-based education can also improve the motivation to transfer new knowledge and skills learned through education to clinical practice [ 38 ]. This motivation is considered an essential measure of effective learning for nurses working in the clinical field [ 38 ]. A previous study reported that psychiatric nursing simulation training combined with post-course debriefing significantly increases participants’ level of motivation to transfer [ 38 ].

On the basis of the discussion above, this study evaluated a virtual nursing simulation-based education program on disaster psychology designed to provide psychological support to patients affected by infectious disease disasters.

Study design

This study conducted a randomized controlled trial (RCT) to test the virtual nursing simulation-based education program of interest. The RCT protocol used was based on CONSORT guidelines.

Participants

We recruited nurses working at general hospitals in South Korea. With permission from the nurse managers of these hospitals, a participation notice was posted on the institutions’ internet bulletin boards for nurses for a week. The two-sided test criterion, with a significance level (α) of 0.05, a power (1-β) of 0.80, and a medium effect size of 0.6, dictates a minimum total sample of 90 participants (45 per group). The effect size was based on a virtual simulation intervention study conducted by Kim and Choi [ 36 ]. Taking the dropout rate into consideration, we recruited 104 nurses, who were assigned to an experimental group and a control group using the random sampling functionality of the Statistical Package for the Social Sciences (SPSS version 23.0). Out of the initial sample, 11 participants were excluded because they were on vacation, could not be contacted, or provided incomplete responses during data collection (Fig.  1 ).

Figure 1. Flowchart of the randomized controlled trial
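The sample-size criterion described in the Participants section can be reproduced with the standard normal-approximation formula for a two-sided, two-sample comparison. This is our own sketch, not the authors’ calculation (they may have used dedicated software); the exact t-distribution-based figure is slightly larger, about 45 per group (roughly 90 in total):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-group sample size via the normal approximation:
    n = 2 * ((z_{1-alpha/2} + z_{power}) / d) ** 2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for power = 0.80
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

print(n_per_group(0.6))  # 44 per group by the normal approximation
```

With d = 0.6, α = 0.05 and power 0.80, this gives 44 per group; the t-based correction brings it to about 45 per group, consistent with the stated minimum of 90 participants before allowing for dropout.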

The virtual nursing simulation-based education program

This study evaluated the virtual nursing simulation-based education program developed by Ko [ 39 ]. The program is implemented using an e-learning development platform, Articulate Storyline, which is compatible with all major web browsers (Internet Explorer, Microsoft Edge, Firefox, Google Chrome, etc.). It is a mobile-friendly application that can run on devices with Android and iOS operating systems. When an individual uses their smartphone or personal computer to access the server via the web address corresponding to the education program, the program content loads and runs. Ko’s [ 39 ] program involves five stages of learning completed in 100 min: (1) preparatory learning (30 min), (2) pre-test (5 min), (3) pre-briefing (5 min), (4) simulation game (30 min), and (5) structured self-debriefing (30 min) (Fig.  2 ).

Preparatory learning comes with lecture materials on guidelines for providing psychological support to victims of infectious disease disasters, administering psychological first aid, donning and doffing PPE, and exercising mindfulness through videos and pictures. In the pretest stage, a learner answers five questions and can immediately check the correct responses, which come with detailed explanations. In the prebriefing stage, an overview of a nursing simulation scenario, patient information, learning objectives, and instructions on using the virtual simulation are provided. During the simulation game, a video of the simulation is presented. It starts with a 39-year-old female, a standardized patient who is age- and gender-matched to the scenario, confirmed to have contracted COVID-19 and transferred to a negative pressure isolation room. The patient presents with extreme anxiety and feeling of tightness in her chest. During the game, learners are expected to complete 12 quizzes. In the debriefing stage, a summary of the simulation quiz results and self-debriefing questions are provided, and the comments made by learners are saved in the Naver cloud platform.

figure 2

The evaluated virtual nursing simulation-based education program (examples are our own work)

Measurements

Disaster mental health competence.

Disaster mental health competence was measured using the perceived competence scale for disaster mental health workforce (PCS-DMHW), which was developed by Yoon and Choi [ 40 ]. This tool consists of 24 questions related to knowledge (6 questions), attitudes (9 questions), and skills (9 questions). Each item is rated using a five-point Likert scale (0 = strongly disagree, 4 = strongly agree), and the responses are summed. The higher the score, the greater the perception of competence in a relevant area [ 40 ]. The Cronbach’s α values of the PCS-DMHW were 0.95 and 0.94 at the time of tool development and the present study, respectively.
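As an illustration of how such a summed Likert instrument is scored, the sketch below totals the 24 ratings overall and per subscale. The function name and the assumption that items are ordered knowledge-first are ours; this is not the published tool’s code.

```python
# Hypothetical scoring sketch for a 24-item instrument like the PCS-DMHW:
# items rated 0 (strongly disagree) to 4 (strongly agree) and summed.
SUBSCALES = {"knowledge": 6, "attitudes": 9, "skills": 9}  # assumed item order

def score_pcs_dmhw(responses):
    """Return per-subscale and total sums for a list of 24 ratings (0-4)."""
    if len(responses) != 24 or any(not 0 <= r <= 4 for r in responses):
        raise ValueError("expected 24 ratings in the range 0-4")
    scores, start = {}, 0
    for name, n_items in SUBSCALES.items():
        scores[name] = sum(responses[start:start + n_items])
        start += n_items
    scores["total"] = sum(responses)  # higher scores = greater perceived competence
    return scores
```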

Problem solving process

Problem solving process was determined using a tool modified and supplemented by Park and Woo [ 41 ] based on the problem solving process and behavior survey developed by Lee [ 42 ]. This tool is composed of 25 questions on five factors, namely, problem discovery, problem definition, problem solution design, problem solution execution, and problem solving review [ 41 ]. The reliability of the tool was 0.89 at the time of development [ 41 ], whereas the Cronbach’s α found in the current research was 0.94.
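The Cronbach’s α values reported for each instrument can be reproduced from item-level data as α = k/(k − 1) · (1 − Σ item variances / variance of total scores). The following is a minimal standard-library sketch of that formula (our own illustration, not the authors’ analysis code):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    item_scores: one list of k item ratings per respondent.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    computed here with population variances.
    """
    k = len(item_scores[0])
    item_vars = [pvariance([resp[i] for resp in item_scores]) for i in range(k)]
    total_var = pvariance([sum(resp) for resp in item_scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Perfectly correlated items yield alpha = 1.0
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))
```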

Self-leadership

Self-leadership was measured using a tool developed by Manz [ 43 ] and modified by Kim [ 44 ]. The tool consists of 18 questions distributed over six factors (three questions each): self-expectation, rehearsal, goal setting, self-reward, self-criticism, and constructive thinking. The reliability of the tool at the time of development and the present research was (Cronbach’s α) 0.87 and 0.82, respectively.

Learning self-efficacy

To ascertain learning self-efficacy, we used the tool developed by Ayres [ 45 ] and translated by Park and Kweon [ 38 ]. This tool consists of 10 questions, and it had a reliability (Cronbach’s α) of 0.94 and 0.93 at the time of development and the current study, respectively.

Motivation to transfer

We used Ayres’s [ 45 ] motivation to transfer scale, which was translated by Park and Kweon [ 38 ]. Its reliability (Cronbach’s α) at the time of development and the present research was 0.80 and 0.93, respectively.

Data collection

The experimental and control groups were administered a pretest through an online survey. The web address through which the evaluated virtual simulation-based education program could be accessed was provided to the experimental group, whereas text-based education materials on psychological support for victims of infectious disease disasters were given to the control group. The groups were simultaneously sent the program’s instruction manual, and their inquiries were answered through chat. After the interventions, each participant was administered a posttest through another online survey.

Data analysis

The collected data were analyzed using SPSS version 23.0. Homogeneity of general characteristics between the experimental and control groups was tested using a t-test, a chi-square test, and Fisher’s exact test. The normality of the dependent variables was assessed using the Kolmogorov-Smirnov test. Changes in the dependent variables between the pretest and posttest were analyzed using a paired t-test. Differences in the dependent variables before and after the groups’ use of the interventions were examined via a t-test and ANCOVA.
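The analyses were run in SPSS; purely as an illustration of the paired t-test used for the pretest-posttest comparisons, here is a minimal standard-library sketch (our own, not the study’s code). ANCOVA, by contrast, compares posttest scores between groups while adjusting for the pretest score as a covariate.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic for pretest/posttest scores (df = n - 1).

    t = mean(differences) / (sd(differences) / sqrt(n)),
    using the sample standard deviation of the paired differences.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Example with made-up scores: t = 7.0
print(paired_t([1, 2, 3, 4], [3, 3, 5, 6]))
```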

Ethical considerations

We completed education in bioethics law prior to the research and obtained approval of the research proposal and questionnaire from the Institutional Review Board of the affiliated university (IRB approval number 1041078-202003-HRSB-070-01CC). A signed consent form was also obtained from each participant after the purpose and methods of the research, the confidentiality of personal information, and the voluntary nature of participation or their right to withdraw from the study were explained to them. All collected data were kept in a lockable cabinet, and electronic data were encrypted and stored. These data are to be discarded after three years.

A total of 93 participants (45 in the experimental group and 48 in the control group) remained after the exclusion of unsuitable respondents. The results of the between-group comparisons indicated no significant differences (5% significance level) in general characteristics, such as gender, age, work unit, and clinical experience (Table  1 ).

The score of the experimental group on disaster mental health competence increased from 48.13 in the pretest to 70.51 in the posttest (+ 22.38), whereas that of the control group increased from 53.33 in the pretest to 68.38 in the posttest (+ 15.04). These findings reflect a statistically significant difference in competence between the groups (F = 5.149, p  =.026). The scores of the experimental and control groups on problem solving process increased from 73.07 in the pretest to 88.24 in the posttest (+ 15.18) and from 75.75 in the pretest to 83.77 in the posttest (+ 8.02), respectively. As with the competence findings, these point to a significant difference between the groups in terms of the ability to resolve problems (t = 3.024, p  =.003) (Table  2 ).

The score of the experimental group on self-leadership increased from 54.87 in the pretest to 59.58 in the posttest (+ 4.71), and that of the control group increased from 57.48 in the pretest to 60.10 in the posttest (+ 2.63). These results denote a statistically significant difference in this ability between the groups (t = 2.063, p  =.042). The scores of the experimental and control participants on learning self-efficacy rose from 55.40 in the pretest to 58.84 in the posttest (+ 3.44) and from 56.81 in the pretest to 57.13 in the posttest (+ 0.31), respectively. Again, a statistically significant difference was found between the groups (t = 3.450, p  =.001). Their scores on motivation to transfer rose from 49.31 in the pretest to 54.29 in the posttest (+ 4.98) in the experimental group and from 50.50 in the pretest to 51.85 in the posttest (+ 1.35) in the control group, pointing to a significant difference between the groups (t = 2.095, p  =.039).

As previously stated, this research evaluated a virtual nursing simulation-based education program designed to provide psychological support to patients affected by infectious disease disasters. The results showed statistically significant increases in the experimental group’s pretest-to-posttest scores on disaster mental health competence, problem solving process, self-leadership, learning self-efficacy, and motivation to transfer.

The experimental group achieved significantly greater improvements in disaster mental health competence than did the control group. This finding is similar to the statistically significant increase in average disaster mental health competence shown by providers of disaster mental health services and non-expert groups after PFA training involving lecture and practice [ 46 ]. It is also consistent with the significant increase in the scores of school counselors on disaster mental health competence after a lecture and simulation on PFA [ 25 ]. In their study on disaster relief workers, Kang and Choi [ 26 ] measured the participants’ performance competence in PFA after the delivery of a lecture and simulation-based education using a standardized patient. The authors found a significant increase in PFA performance competence, consistent with the present research. Since no other virtual simulation-based education programs for disaster psychological support are currently available, we compared the effectiveness of various PFA training methods with the program assessed in the present work.

In the current research, the posttest scores of the experimental group on problem solving process significantly increased, similar to the results of Kim et al.’s study on virtual simulation- and blended simulation-based education on asthmatic child nursing [ 47 ]. Both the control and experimental groups (virtual simulation only and blended simulation featuring high-fidelity and virtual simulations, respectively) showed an increase in their problem solving process scores. These results and those derived in the present work are similar because reading and pretest phases were incorporated into the design of the previous study. Given that researchers have used commercial virtual simulations featuring avatars rather than standardized patient videos available through English-based platforms, user experiences may differ, thus requiring a qualitative analysis to identify differences. However, Kim et al. [ 47 ] did not implement a debriefing after the virtual simulation program, precluding a direct comparison. Another study reported that multimodality simulation education combining methods such as virtual simulation, the use of mannequins, and part-task training increased hospital nurses’ scores on problem solving process [ 48 ].

In the present work, the experimental group’s self-leadership scores increased after they used the program, and these scores were higher than those reported in comparable studies [ 49 , 50 ]. This difference can be explained by the fact that our respondents voluntarily participated in our research given their interest in self-learning programs for disaster psychological support; even in the comparison studies, participants with stronger interest in leadership education typically exhibited heightened degrees of self-leadership [ 51 ]. The increase in self-leadership scores in the current research is consistent with a previous study involving a two-hour simulation education about PPE donning and doffing, medication administration, and medical specimen treatment in a scenario of patients suspected of having infectious diseases [ 32 ]. Another study showed that simulation education on high-risk pregnancy enhances nursing students’ problem-solving processes and self-leadership [ 52 ].

Learning self-efficacy is a key variable that enables the prediction of learners’ degrees of participation in online education and of their academic achievements, as it points to the ability to manage their learning processes [ 34 , 53 ]. The results of the current research in this regard are consistent with those of a study on the online practice of basic nursing skills, which increased participants’ learning self-efficacy [ 54 ]. The researchers included an online quiz about basic nursing skills and feedback sections for learners’ self-evaluations of their performance as avenues through which to encourage autonomy in learning. A similar approach was used in the present study, which involved a pretest for self-evaluation, direct feedback on the virtual simulation, and a self-debriefing session, enabling the participants to reflect on their simulation experiences while reviewing other participants’ answers during self-debriefing. These functions of the evaluated program likely factored importantly in the significant increase in the participants’ learning self-efficacy scores.

Many studies on practice education have examined participants’ motivations to transfer knowledge and skills alongside their learning self-efficacies. In the current research, the motivation to transfer scores of the experimental group increased, and the difference between the two groups was statistically meaningful. This result is consistent with the findings of Park and Kweon on simulation education in psychiatric nursing, during which post-course debriefing increased the participants’ average scores on motivation to transfer and learning self-efficacy [ 38 ]. Conversely, Kang and Kim found that a six-week simulation program for alcoholic patient care did not generate a significant increase in the participants’ motivation to transfer and learning self-efficacy scores [ 55 ]. This finding was attributed to the unfamiliarity of the local community scenario used in the research to the participants, who were in their senior year of nursing school [ 55 ]. This limitation was overcome in the current research by administering a qualitative survey of nurses’ actual demand for education on psychological support for infectious disease patients; that is, the survey ensured that the scenarios presented were ones the participants needed.

As with other studies, the present research was encumbered by several limitations. First, the self-assessment measures used in this study may be unreliable because they are based on individuals’ subjective perceptions and interpretations of their abilities. There is also the possibility of respondent fatigue given that the participants were required to answer numerous questions. Future studies should incorporate both subjective and objective measures into data collection and adopt as concise an evaluation method as possible to prevent respondent fatigue. Second, this study did not establish a direct link between the obtained results and actual changes in practice or improvements in patient outcomes. We propose a follow-up study to investigate the impact of the education program examined in this study on either the mental health of patients or the quality of patient care. Third, simulation-based education tends to involve more guidance than text-based programs because it has diverse components, including quiz games, and participants are predisposed to allocate more time to it. These factors may have influenced the results. Future research should deliver both interventions under the same time and guidance conditions.

This study demonstrated that a well-designed virtual nursing simulation-based education program can be an effective modality with which to satisfy the educational needs of nurses in the context of infectious disease outbreaks. Such programs can be easily used by nurses anywhere and anytime before they are deployed to provide psychological support to patients with infectious diseases. They are also expected to contribute to enhancing competence in addressing disaster mental health and improving the quality of care of patients afflicted with infectious diseases.

Data availability

The datasets used and/or analyzed in this study are available from the corresponding author upon reasonable request.

Abbreviations

Coronavirus disease 2019

Randomized controlled trial

Personal protective equipment

Statistical Package for the Social Sciences

Analysis of covariance

Psychological first aid

World Health Organization. WHO coronavirus disease (COVID-19) dashboard. 2020.

Oh EG, Yu SH, Kang JH, Oh YJ, Cha KI, Jeon YH, et al. Perspectives on nursing profession for a post-COVID-19 new normal. Korean J Adult Nurs. 2020;32(3):221–2.

Brooks SK, Webster RK, Smith LE, Woodland L, Wessely S, Greenberg N, et al. The psychological impact of quarantine and how to reduce it: rapid review of the evidence. Lancet. 2020;395:912–20.

Çalışkan C, Özsezer G, Pay M, Demir G, Çelebi I, Koçak H. Web search behaviors and infodemic attitudes regarding COVID-19 in Turkey: a framework study for improving response and informing policy on the COVID-19 infodemic. Front Public Health. 2022;10:948478.

Stangl AL, Earnshaw VA, Logie CH, Van Brakel WC, Simbayi L, Barré I, et al. The health stigma and discrimination Framework: a global, crosscutting framework to inform research, intervention development, and policy on health-related stigmas. BMC Med. 2019;17:1–13.

Tucci V, Moukaddam N, Meadows J, Shah S, Galwankar SC, Kapur GB. The forgotten plague: psychiatric manifestations of Ebola, Zika, and emerging infectious diseases. J Global Infect Dis. 2017;9(4):151.

Korea Mental Health Technology R&D Project. (2017). Korean Disaster and Mental Health Support Guidelines.

Zhang J, Wu W, Zhao X, Zhang W. Recommended psychological crisis intervention response to the 2019 novel coronavirus pneumonia outbreak in China: a model of West China Hospital. Precision Clin Med. 2020;3(1):3–8.

Liu W. The effects of virtual simulation on undergraduate nursing students’ mental health literacy: a prospective cohort study. Issues Ment Health Nurs. 2020:1–10.

Cant R, Cooper S, Sussex R, Bogossian F. What’s in a name? Clarifying the nomenclature of virtual simulation. Clin Simul Nurs. 2019;27:26–30.

Foronda C, Bauman EB. Strategies to incorporate virtual simulation in nurse education. Clin Simul Nurs. 2014;10(8):412–812.

Berman NB, Durning SJ, Fischer MR, Huwendiek S, Triola MM. The role for virtual patients in the future of medical education. Acad Med. 2016;91(9):1217–22.

Verkuyl M, Hughes M, Tsui J, Betts L, St-Amant O, Lapum JL. Virtual gaming simulation in nursing education: a focus group study. J Nurs Educ. 2017;56(5):274–80.

Cobbett S, Snelgrove-Clarke E. Virtual versus face-to-face clinical simulation in relation to student knowledge, anxiety, and self-confidence in maternal-newborn nursing: a randomized controlled trial. Nurse Educ Today. 2016;45:179–84.

Bahreini M, Shahamat S, Hayatdavoudi P, Mirzaei M. Comparison of the clinical competence of nurses working in two university hospitals in Iran. Nurs Health Sci. 2011;13(3):282–8.

Goliroshan S, Babanataj R, Aziznejadroshan P. Investigating the self-assessment of clinical competency of nurses working in Babol University of Medical Sciences hospitals. Middle East J Family Med. 2018;7(10):279.

Bam V, Diji AKA, Asante E, Lomotey AY, Adade P, Akyeampong BA. Self-assessed competencies of nurses at an emergency department in Ghana. Afr J Emerg Med. 2020;10(1):8–12.

O’Leary J. Comparison of self-assessed competence and experience among critical care nurses. J Nurs Adm Manag. 2012;20(5):607–14. https://doi.org/10.1111/j.1365-2834.2012.01394.x .

Kim KM, Choi JS. Self-perceived competency of infection control nurses based on Benner’s framework: a nationwide survey in Korea. Appl Nurs Res. 2015;28(2):175–9.

Hassankhani H, Hasanzadeh F, Powers KA, Dadash Zadeh A, Rajaie R. Clinical skills performed by Iranian emergency nurses: perceived competency levels and attitudes toward expanding professional roles. J Emerg Nurs. 2018;44(2):156–63. https://doi.org/10.1016/j.jen.2017.06.007 .

Biagioli V, Prandi C, Nyatanga B, Fida R. The role of professional competency in influencing job satisfaction and organizational citizenship behavior among palliative care nurses. J Hospice Palliat Nurs. 2018;20(4):377–84.

Alreshidi NM, Alaseeri R, Garcia M. Factors influencing absenteeism among nursing staff in the primary health care centers in hail: a preliminary study for enhancing staff commitment. Health Sci J. 2019;13(3):658–1.

Rodolfa E, Bent R, Eisman E, Nelson P, Rehm L, Ritchie P. A cube model for competency development: implications for psychology educators and regulators. Prof Psychology: Res Pract. 2005;36(4):347.

King RV, Burkle FM, Walsh LE, North CS. Competencies for disaster mental health. Curr Psychiatry Rep. 2015;17(3).

Lee JS, You S, Choi YK, Youn Hy SH. A preliminary evaluation of the training effects of a didactic and simulation-based psychological first aid program in students and school counselors in South Korea. PLoS ONE. 2017;12(7):e0181271.

Kang JY, Choi YJ. Effects of a psychological first aid simulated training for pregnant flood victims on disaster relief worker’s knowledge, competence, and self-efficacy. Appl Nurs Res. 2020:1513.

Park S, Kim S. The effects of team-based simulation education on problem solving process. Korean J Health Communication. 2019;14(2):165–72.

Australian Psychological Society. Useful skills for disaster recovery: Problem solving. 2016.

Lim A, Song Y. A scoping review of instruments for measuring problem-solving ability in simulation nursing education in Korea: a focus on process behavior survey. J Korean Acad Fundamentals Nurs. 2022;29(3):269–83.

Lee M, Lee M, Kim S. A study on nursing students’ self-leadership and their perception of learning. J Korean Acad Soc Nurs Educ. 2015;21(3):417–25. https://doi.org/10.5977/jkasne.2015.21.3.417 .

Goldsby E, Goldsby M, Neck CB, Neck CP. Under pressure: time management, self-leadership, and the nurse manager. Administrative Sci. 2020;10(3):38.

Kim JK, Song MS. Effects of respiratory infectious disease simulation-based education on nursing students of clinical competency, self-leadership and critical thinking. J Korea Academia-Industrial Cooperation Soc. 2019;20(8):93–101.

Cho GY, Seo MK. Influencing factors of learning flow, self-leadership and debriefing satisfaction on problem solving ability of nursing students in simulation learning. J Fisheries Mar Sci Educ. 2020;32(2):409–19.

Artino AR. Motivational beliefs and perceptions of instructional quality: Predicting satisfaction with online training. J Comput Assist Learn. 2008;24(3):260–70.

Yu BM, Jeon JC, Park HJ. The effects on learning motivation and self-efficacy according to the type of reflection. J Educational Inform Media. 2013;19(4):837–59.

Kim KA, Choi DW. The effect of virtual simulation in nursing education: an application of care for acute heart disease patients. J Korean Soc Simul Nurs. 2018;6(2):1–13.

Sapiano AB, Sammut R, Trapani J. The effectiveness of virtual simulation in improving student nurses’ knowledge and performance during patient deterioration: a pre and post test design. Nurse Educ Today. 2018;62:128–33.

Park SY, Kweon YR. The effect of using standardized patients in psychiatric nursing practical training for nursing college students. J Korean Acad Psychiatric Mental Health Nurs. 2012;21(1):79–88.

Ko E. Development of a virtual simulation-based nurse education program to provide psychological support for patients affected by infectious disease disasters [dissertation]. [Seoul, Korea]: Chung-Ang University.

Yoon HY, Choi YK. The development and validation of the perceived competence scale for disaster mental health workforce. Psychiatry Invest. 2019;16(11):816.

Park JW, Woo OH. The effects of PBL (problem-based learning) on problem solving process by learner’s metacognitive level. J Educational Technol. 1999;15(3):55–81.

Lee JS. The effects of process behaviors on problem-solving performance in various tests. Dissertation Abstracts International Section A: Humanities and Social Sciences. 1978;39(4-A):2149.

Manz CC. The art of self-leadership: strategies for personal effectiveness in your life and work. Prentice-Hall; 1983.

Kim HS. The relationship between self-leadership and job satisfaction of middle school teacher [master’s thesis]. [Seoul]: Soongsil University; 2003. 83 p.

Ayres HW. Factors related to motivation to learn and motivation to transfer learning in a nursing population [dissertation]. [Raleigh]: North Carolina State University.

Park H, Choi S, Choi Y, Park S, You S, Baik M, et al. Effect of Korean version of psychological first aid training program on training disaster mental health service provider. J Korean Neuropsychiatric Association. 2020;59(2):123–35.

Kim M, Kim S, Lee WS. Effects of a virtual reality simulation and a blended simulation of care for pediatric patient with asthma. Child Health Nurs Res. 2019;25(4):496–506.

Noh J, Oh EG, Kim SS, Jang YS, Chung HS, Lee O. Development and evaluation of a multimodality simulation disaster education and training program for hospital nurses. Int J Nurs Pract. 2020:e12810.

Cho JL. The effect of self-leadership and communication barriers on nursing performance in hospital nurses. J Convergence Cult Technol. 2019;5(2):239–46.

Kwon SN, Park HJ. Effects of nurses’ positive psychological capital, self-leadership, and relational bonds on organizational commitment. J Korean Acad Nurs Adm. 2020;26(3):241–50.

Yoo JY, Lee YH, Ha YK. A convergence study on the effects of self-leadership and job satisfaction on nursing performance in general hospital nurses. J Convergence Inform Technol. 2019;9(10):28–38.

Park SY, Shim CS. Effects of a simulator-based delivery education on the major satisfaction, nursing professionalism and clinical competence of the nursing students. J Korea Entertainment Ind Association (JKEIA). 2018;12(5):199.

Jeon JC. The effects on learning motivation and self-efficacy according to the type of reflection. J Educational Inform Media. 2013;19(4):837–59.

Kim S, Cho Y. Proliferation of online learning and the implications for teaching and learning in future university education. J Lifelong Learn Soc. 2018;14(4):51–78.

Kang GS, Kim Y. Development and application of integrated-simulation practice program using standardized patients: caring for alcoholism with diabetes mellitus in the community. J Korea Academia-Industrial Cooperation Soc. 2016;17(8):662–72.

Acknowledgements

The authors would like to thank Eun-Joo Choi and Dong-Hee Cho for their contributions to the development of the simulation program.

This work was supported by the National Research Foundation of Korea (NRF) through a grant funded by the Korean government (Ministry of Science and ICT) (NRF-2020R1A2B5B0100208).

Author information

Authors and affiliations.

Chung-Ang University, Red Cross College of Nursing, Seoul, South Korea

Eunjung Ko & Yun-Jung Choi

Contributions

K.E. and Y.C. designed the study, collected the data, performed the analysis, and wrote the results, discussion, and conclusion.

Corresponding author

Correspondence to Yun-Jung Choi .

Ethics declarations

Ethics approval and consent to participate.

This study was conducted in accordance with the Declaration of Helsinki (Association World Medical, 2013) and was part of a larger study. It was approved by the Institutional Review Board of Chung-Ang University (IRB approval number 1041078-202003-HRSB-070-01CC) and retrospectively registered (21/11/2023) in the Clinical Research Information Service ( https://cris.nih.go.kr ) with trial registration number KCT0008965. All the participants provided written informed consent and were informed of the right to withdraw from participation at any time during the research until publication. Data confidentiality was ensured, and the results were provided to the participants at their request.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article.

Ko, E., Choi, YJ. Efficacy of a virtual nursing simulation-based education to provide psychological support for patients affected by infectious disease disasters: a randomized controlled trial. BMC Nurs 23 , 230 (2024). https://doi.org/10.1186/s12912-024-01901-4

Received : 01 November 2023

Accepted : 29 March 2024

Published : 07 April 2024

DOI : https://doi.org/10.1186/s12912-024-01901-4

  • Virtual simulation
  • Psychological support
  • Infectious diseases disaster
  • Nursing education

BMC Nursing

ISSN: 1472-6955

35 Content Marketing Statistics You Should Know

Stay informed with the latest content marketing statistics. Discover how optimized content can elevate your digital marketing efforts.

Content continues to sit atop the list of priorities in most marketing strategies, and there is plenty of evidence to support the reasoning.

Simply put, content marketing is crucial to any digital marketing strategy, whether running a small local business or a large multinational corporation.

After all, content in its many and evolving forms is indisputably the very lifeblood upon which the web and social media are based.

Modern SEO has effectively become optimized content marketing for all intents and purposes.

This is because Google demands and rewards businesses that create content demonstrating experience, expertise, authoritativeness, and trustworthiness (E-E-A-T) for their customers – content that answers all of the questions consumers may have about their services, products, or business in general.

Content marketing involves creating and sharing helpful, relevant, entertaining, and consistent content in various text, image, video, and audio-based formats to the plethora of traditional and online channels available to modern marketers.

The primary focus should be on attracting and retaining a clearly defined audience, with the ultimate goal of driving profitable customer action.

Different types of content can and should be created for each stage of a customer’s journey .

Some content, like blogs or how-to videos, are informative or educational. Meanwhile, other content, like promotional campaign landing pages , gets to the point of enticing prospective customers to buy.

But with so much content being produced and shared every day, it’s important to stay updated on the latest trends and best practices in content marketing to keep pace and understand what strategies may be most effective.

Never has this been more true than in 2024, when we’re in the midst of a content revolution led by generative AI , which some feel represents both an opportunity and a threat to marketers.

To help you keep up, here are 35 content marketing statistics I think you should know:

Content Marketing Usage

How many businesses are leveraging content marketing, and how are they planning to find success?

  • According to the Content Marketing Institute (CMI), 73% of B2B marketers, and 70% of B2C marketers use content marketing as part of their overall marketing strategy.
  • 97% of marketers surveyed by Semrush achieved success with their content marketing in 2023.
  • A B2B Content Marketing Study conducted by CMI found that 40% of B2B marketers have a documented content marketing strategy; 33% have a strategy, but it’s not documented, and 27% have no strategy.
  • Half of the surveyed marketers by CMI said they outsource at least one content marketing activity.

Content Marketing Strategy

What strategies are content marketers using or finding to be most effective?

  • 83% of marketers believe it’s more effective to create higher quality content less often. (Source: Hubspot)
  • In a 2022 Statista Research Study of marketers worldwide, 62% of respondents emphasized the importance of being “always on” for their customers, while 23% viewed content-led communications as the most effective method for personalized targeting efforts.
  • With the increased focus on AI-generated search engine results, 31% of B2B marketers say they are sharpening their focus on user intent/answering questions, 27% are creating more thought leadership content, and 22% are creating more conversational content. (Source: CMI)

Types Of Content

Content marketing was once synonymous with posting blogs, but the web has evolved, and content now spans audio, video, interactive, and meta formats.

Here are a few stats on how the various types of content are trending and performing.

  • Short-form video content, like TikTok videos and Instagram Reels, is the No. 1 content marketing format, offering the highest return on investment (ROI).
  • 43% of marketers reported that original graphics (like infographics and illustrations) were the most effective type of visual content. (Source: Venngage)
  • 72% of B2C marketers expected their organization to invest in video marketing in 2022. (Source: Content Marketing Institute – CMI)
  • The State of Content Marketing: 2023 Global Report by Semrush reveals that articles containing at least one video tend to attract 70% more organic traffic than those without.
  • Interactive content generates 52.6% more engagement compared to static content. On average, buyers spend 8.5 minutes viewing static content items and 13 minutes on interactive content items. (Source: Mediafly)

Content Creation

Creating helpful, unique, engaging content can be one of a marketer’s greatest challenges. However, innovative marketers are looking at generative AI as a tool to help ideate, create, edit, and analyze content quicker and more cost-effectively.

Here are some stats around content creation and just how quickly AI is changing the game.

  • Generative AI reached over 100 million users just two months after ChatGPT’s launch. (Source: Search Engine Journal)
  • A recent Ahrefs poll found that almost 80% of respondents had already adopted AI tools in their content marketing strategies.
  • Marketers who are using AI said it helps most with brainstorming new topics (51%), researching headlines and keywords (45%), and writing drafts (45%). (Source: CMI)
  • Further, marketers polled by Hubspot said they save 2.5 hours per day using AI for content.

Content Distribution

It is not enough simply to create and publish content.

For a content strategy to be successful, it must include distributing content via the channels frequented by a business’s target audience.

  • Facebook is still the dominant social channel for content distribution, but video-centric channels like YouTube, TikTok, and Instagram are growing the fastest. (Source: Hubspot)
  • B2B marketers reported to CMI that LinkedIn was the most common and top-performing organic social media distribution channel at 84% by a healthy margin. All other channels came in under 30%.
  • 80% of B2B marketers who use paid distribution use paid social media advertising. (Source: CMI)

Content Consumption

Once content reaches an audience, it’s important to understand how an audience consumes the content or takes action as a result.

  • A 2023 Content Preferences Study by Demand Gen reveals that 62% of B2B buyers prefer practical content like case studies to inform their purchasing decisions, citing “a need for valid sources.”
  • The same study also found that buyers tend to rely heavily on content when researching potential business solutions, with 46% reporting that they increased the amount of content they consumed during this time.
  • In a recent post, blogger Ryan Robinson reports the average reader spends 37 seconds reading a blog.
  • DemandGen’s survey participants also said they rely most on demos (62%) and user reviews (55%) to gain valuable insights into how a solution will meet their needs.

Content Marketing Performance

One of the primary reasons content marketing has taken off is its ability to be measured, optimized, and tied to a return on investment.

  • B2C marketers reported to CMI that the top three goals content marketing helps them to achieve are creating brand awareness, building trust, and educating their target audience.
  • 87% of B2B marketers surveyed use content marketing successfully to generate leads.
  • 56% of marketers who leverage blogging say it’s an effective tactic, and 10% say it generates the greatest return on investment (ROI).
  • 94% of marketers said personalization boosts sales.
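The "tied to a return on investment" point above comes down to simple arithmetic: ROI compares revenue attributed to content against what the content cost to produce and distribute. As a quick illustration (the function name and dollar figures below are hypothetical, and real attribution is rarely this clean):

```python
def content_roi(revenue_attributed: float, total_cost: float) -> float:
    """Return content marketing ROI as a percentage:
    (attributed revenue - total cost) / total cost * 100."""
    if total_cost <= 0:
        raise ValueError("total_cost must be positive")
    return (revenue_attributed - total_cost) / total_cost * 100

# Example: $12,000 in attributed revenue on a $4,000 content spend
print(content_roi(12_000, 4_000))  # → 200.0 (a 200% return)
```

The hard part in practice is not the formula but the attribution: deciding how much of a sale to credit to a blog post, video, or landing page a buyer touched along the way.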

Content Marketing Budgets

Budget changes and the willingness to invest in specific marketing strategies are good indicators of how popular and effective these strategies are at a macro level.

The following stats certainly seem to indicate marketers have bought into the value of content.

  • 61% of B2C marketers said their 2022 content marketing budget would exceed their 2021 budget.
  • 22% of B2B marketers said they spent 50% or more of their total marketing budget on content marketing. Furthermore, 43% saw their content marketing budgets grow from 2020 to 2021, and 66% expected them to grow again in 2022.

Content Challenges

All forms of marketing come with challenges related to time, resources, expertise, and competition.

Recognizing and addressing these challenges head-on with well-thought-out strategies is the best way to overcome them and realize success.

  • Top 3 content challenges included “attracting quality leads with content” (45%), “creating more content faster” (38%), and “generating content ideas” (35%). (Source: Semrush’s The State of Content Marketing: 2023 Global Report)
  • 44% of marketers polled for CMI’s 2022 B2B report highlighted the challenge of creating the right content for multi-level roles as their top concern. This replaced internal communication as the top challenge from the previous year.
  • Changes to SEO/search algorithms (64%), changes to social media algorithms (53%), and data management/analytics (48%) are also among the top concerns for B2C marketers.
  • 47% of people are seeking downtime from internet-enabled devices due to digital fatigue.
  • While generative AI has noted benefits, it also presents challenges for some marketers who fear it may replace them. In Hubspot’s study, 23% said they felt we should avoid using generative AI.
  • Another challenge with AI is how quickly it has come onto the scene without giving organizations time to provide training or to create policies and procedures for its appropriate and legal use. According to CMI, when asked if their organizations have guidelines for using generative AI tools, 31% of marketers said yes, 61% said no, and 8% were unsure.

Time To Get Started

As you can clearly see and perhaps have already realized, content marketing can be a highly effective and cost-efficient way to generate leads, build brand awareness, and drive sales. Content, in its many formats, powers virtually all online interactions.

Generative AI is effectively helping to solve some of the time and resource challenges by acting as a turbo-powered marketing assistant, while also raising a few procedural concerns.

However, the demand for content remains strong.

Those willing to put in the work of building a documented content strategy and executing it – by producing, optimizing, distributing, and monitoring high-value, relevant, customer-centric content, with the help of AI or not – can reap significant business rewards.

More resources:

  • 6 Ways To Humanize Your Content In The AI Era
  • Interactive Content: 10 Types To Engage Your Audience
  • B2B Lead Generation: Create Content That Converts

Featured Image: Deemak Daksina/Shutterstock 

