
Original research article: a comparative analysis of student performance in an online vs. face-to-face environmental science course from 2009 to 2016.


  • Department of Biology, Fort Valley State University, Fort Valley, GA, United States

A growing number of students are now opting for online classes, finding the traditional classroom modality restrictive, inflexible, and impractical. In this age of technological advancement, schools can now provide effective classroom teaching via the Web. This shift in pedagogical medium is forcing academic institutions to rethink how they want to deliver their course content. The overarching purpose of this research was to determine which teaching method proved more effective over the 8-year period. The scores of 548 students (401 traditional students and 147 online students) in an environmental science class were used to determine which instructional modality generated better student performance. In addition to the overarching objective, we also examined score variability between genders and class ranks to determine whether teaching modality had a greater impact on specific groups. No significant differences in student performance between online and face-to-face (F2F) learners were found overall, with respect to gender, or with respect to class rank. These data demonstrate that environmental science concepts can be conveyed to non-STEM majors equally well on traditional and online platforms, irrespective of gender or class rank. A potential exists for increasing the number of non-STEM majors engaged in citizen science by using the flexibility of online learning to teach environmental science core concepts.

Introduction

The advent of online education has made it possible for students with busy lives and limited flexibility to obtain a quality education. As opposed to traditional classroom teaching, Web-based instruction has made it possible to offer classes worldwide through a single Internet connection. Although it boasts several advantages over traditional education, online instruction still has its drawbacks, including limited communal synergies. Still, online education seems to be the path many students are taking to secure a degree.

This study compared the effectiveness of online vs. traditional instruction in an environmental studies class. Using a single indicator, we attempted to see whether student performance was affected by instructional medium. This study sought to compare online and F2F teaching on three levels: pure modality, gender, and class rank. Through these comparisons, we investigated whether one teaching modality was significantly more effective than the other. Although the study had limitations, this examination was conducted to provide additional measures for determining whether students perform better in one environment than in another (Mozes-Carmel and Gold, 2009).

The methods, procedures, and operationalization tools used in this assessment can be expanded upon in future quantitative, qualitative, and mixed method designs to further analyze this topic. Moreover, the results of this study serve as a backbone for future meta-analytical studies.

Origins of Online Education

Computer-assisted instruction is changing the pedagogical landscape as an increasing number of students are seeking online education. Colleges and universities are now touting the efficiencies of Web-based education and are rapidly implementing online classes to meet student needs worldwide. One study reported “increases in the number of online courses given by universities have been quite dramatic over the last couple of years” ( Lundberg et al., 2008 ). Think tanks are also disseminating statistics on Web-based instruction. “In 2010, the Sloan Consortium found a 17% increase in online students from the years before, beating the 12% increase from the previous year” ( Keramidas, 2012 ).

Contrary to popular belief, online education is not a new phenomenon. The first correspondence and distance learning educational programs were initiated in the mid-1800s by the University of London. Because this model of educational learning depended on the postal service, it did not appear in America until the late nineteenth century. What is considered the first official correspondence educational program, the "Society to Encourage Studies at Home," was established in Boston, Massachusetts in 1873. Since then, non-traditional study has grown into what is today a more viable online instructional modality. Technological advancement has undoubtedly improved the speed and accessibility of distance learning courses; students worldwide can now attend classes from the comfort of their own homes.

Qualities of Online and Traditional Face to Face (F2F) Classroom Education

Online and traditional education share many qualities. Students are still required to attend class, learn the material, submit assignments, and complete group projects, while teachers still have to design curricula, maximize instructional quality, answer class questions, motivate students to learn, and grade assignments. Despite these basic similarities, there are many differences between the two modalities. Traditional classroom instruction tends to be teacher-centered and requires passive learning by the student, while online instruction is often student-centered and requires active learning.

In teacher-centered, or passive learning, the instructor usually controls classroom dynamics. The teacher lectures and comments, while students listen, take notes, and ask questions. In student-centered, or active learning, the students usually determine classroom dynamics as they independently analyze the information, construct questions, and ask the instructor for clarification. In this scenario, the teacher, not the student, is listening, formulating, and responding ( Salcedo, 2010 ).

In education, change comes with questions. Despite all current reports championing online education, researchers are still questioning its efficacy. Research is still being conducted on the effectiveness of computer-assisted teaching. Cost-benefit analysis, student experience, and student performance are now being carefully considered when determining whether online education is a viable substitute for classroom teaching. This decision process will most probably carry into the future as technology improves and as students demand better learning experiences.

Thus far, "literature on the efficacy of online courses is expansive and divided" (Driscoll et al., 2012). Some studies favor traditional classroom instruction, stating "online learners will quit more easily" and "online learning can lack feedback for both students and instructors" (Atchley et al., 2013). Because of these shortcomings, student retention, satisfaction, and performance can be compromised. Like traditional teaching, distance learning also has its apologists, who aver that online education produces students who perform as well as or better than their traditional classroom counterparts (Westhuis et al., 2006).

The advantages and disadvantages of both instructional modalities need to be fully fleshed out and examined to determine which medium generates better student performance. Both modalities have proven relatively effective, but, as mentioned earlier, the question is whether one is truly better than the other.

Student Need for Online Education

With technological advancement, learners now want quality programs they can access from anywhere and at any time. Because of these demands, online education has become a viable, alluring option to business professionals, stay-at home-parents, and other similar populations. In addition to flexibility and access, multiple other face value benefits, including program choice and time efficiency, have increased the attractiveness of distance learning ( Wladis et al., 2015 ).

First, prospective students want to be able to receive a quality education without having to sacrifice work time or family time, or incur travel expenses. Instead of having to be at a specific location at a specific time, online students have the freedom to communicate with instructors, interact with classmates, study materials, and complete assignments from any Internet-accessible point (Richardson and Swan, 2003). This type of flexibility grants students much-needed mobility and, in turn, helps make the educational process more enticing. According to Lundberg et al. (2008) "the student may prefer to take an online course or a complete online-based degree program as online courses offer more flexible study hours; for example, a student who has a job could attend the virtual class watching instructional film and streaming videos of lectures after working hours."

Moreover, more study time can lead to better class performance—more chapters read, better quality papers, and more group project time. Studies on the relationship between study time and performance are limited; however, it is often assumed the online student will use any surplus time to improve grades ( Bigelow, 2009 ). It is crucial to mention the link between flexibility and student performance as grades are the lone performance indicator of this research.

Second, online education also offers more program choices. With traditional classroom study, students are forced to take courses only at universities within feasible driving distance or move. Web-based instruction, on the other hand, grants students electronic access to multiple universities and course offerings ( Salcedo, 2010 ). Therefore, students who were once limited to a few colleges within their immediate area can now access several colleges worldwide from a single convenient location.

Third, with online teaching, students who usually don't participate in class may now voice their opinions and concerns. As they are not in a classroom setting, quieter students may feel more comfortable partaking in class dialogue without being recognized or judged. This, in turn, may increase average class scores ( Driscoll et al., 2012 ).

Benefits of Face-to-Face (F2F) Education via Traditional Classroom Instruction

The other modality, classroom teaching, is a well-established instructional medium in which teaching style and structure have been refined over several centuries. Face-to-face instruction has numerous benefits not found in its online counterpart ( Xu and Jaggars, 2016 ).

First and, perhaps most importantly, classroom instruction is extremely dynamic. Traditional classroom teaching provides real-time face-to-face instruction and sparks innovative questions. It also allows for immediate teacher response and more flexible content delivery. Online instruction dampens the learning process because students must limit their questions to blurbs, then grant the teacher and fellow classmates time to respond (Salcedo, 2010). Over time, however, online teaching will probably improve, enhancing classroom dynamics and bringing students face-to-face with their peers/instructors. For now, though, face-to-face instruction provides dynamic learning attributes not found in Web-based teaching (Kemp and Grieve, 2014).

Second, traditional classroom learning is a well-established modality. Some students are opposed to change and view online instruction negatively. These students may be technophobes, more comfortable sitting in a classroom taking notes than sitting at a computer absorbing data. Other students may value face-to-face interaction, pre- and post-class discussions, communal learning, and organic student-teacher bonding (Rovai and Jordan, 2004). They may see the Internet as an impediment to learning. If not comfortable with the instructional medium, some students may shun classroom activities; their grades might slip and their educational interest might vanish. Students, however, may eventually adapt to online education. With more universities employing computer-based training, students may be forced to take only Web-based courses. Even so, this does not eliminate the fact that some students prefer classroom intimacy.

Third, face-to-face instruction doesn't rely upon networked systems. In online learning, the student is dependent upon access to an unimpeded Internet connection. If technical problems occur, online students may not be able to communicate, submit assignments, or access study material. This problem, in turn, may frustrate the student, hinder performance, and discourage learning.

Fourth, campus education provides students with both accredited staff and research libraries. Students can rely upon administrators to aid in course selection and provide professorial recommendations. Library technicians can help learners edit their papers, locate valuable study material, and improve study habits. Research libraries may provide materials not accessible by computer. In all, the traditional classroom experience gives students important auxiliary tools to maximize classroom performance.

Fifth, traditional classroom degrees trump online educational degrees in terms of hiring preferences. Many academic and professional organizations do not consider online degrees on par with campus-based degrees ( Columbaro and Monaghan, 2009 ). Often, prospective hiring bodies think Web-based education is a watered-down, simpler means of attaining a degree, often citing poor curriculums, unsupervised exams, and lenient homework assignments as detriments to the learning process.

Finally, research shows online students are more likely to quit class if they do not like the instructor, the format, or the feedback. Because they work independently, relying almost wholly upon self-motivation and self-direction, online learners may be more inclined to withdraw from class if they do not get immediate results.

The classroom setting provides more motivation, encouragement, and direction. Even if a student wanted to quit during the first few weeks of class, he/she may be deterred by the instructor and fellow students. F2F instructors may be able to adjust the structure and teaching style of the class to improve student retention (Kemp and Grieve, 2014). With online teaching, instructors are limited to electronic correspondence and may not pick up on verbal and non-verbal cues.

Both F2F and online teaching have their pros and cons. More studies comparing the two modalities to achieve specific learning outcomes in participating learner populations are required before well-informed decisions can be made. This study examined the two modalities over eight (8) years on three different levels. Based on the aforementioned information, the following research questions resulted.

RQ1: Are there significant differences in academic performance between online and F2F students enrolled in an environmental science course?

RQ2: Are there gender differences between online and F2F student performance in an environmental science course?

RQ3: Are there significant differences between the performance of online and F2F students in an environmental science course with respect to class rank?

The results of this study are intended to edify teachers, administrators, and policymakers on which medium may work best.

Methodology

Participants

The study sample consisted of 548 FVSU students who completed the Environmental Science class between 2009 and 2016. The final course grades of the participants served as the primary comparative factor in assessing performance differences between online and F2F instruction. Of the 548 total participants, 147 were online students while 401 were traditional students. This disparity was considered a limitation of the study. Of the 548 total students, 246 were male, while 302 were female. The study also used students from all four class ranks. There were 187 freshmen, 184 sophomores, 76 juniors, and 101 seniors. This was a convenience, non-probability sample so the composition of the study set was left to the discretion of the instructor. No special preferences or weights were given to students based upon gender or rank. Each student was considered a single, discrete entity or statistic.

All sections of the course were taught by a full-time biology professor at FVSU. The professor had over 10 years of teaching experience in both online and F2F modalities and was considered an outstanding tenured instructor with strong communication and management skills.

The F2F class met twice weekly in an on-campus classroom, with each session lasting 1 h and 15 min. The online class covered the same material as the F2F class but was conducted wholly online using the Desire2Learn (D2L) e-learning system. Online students were expected to spend as much time studying as their F2F counterparts; however, no tracking measure was implemented to gauge e-learning study time. The professor combined textbook learning, lecture and class discussion, collaborative projects, and assessment tasks to engage students in the learning process.

This study did not differentiate between part-time and full-time students. Therefore, many part-time students may have been included in this study. This study also did not differentiate between students registered primarily at FVSU or at another institution. Therefore, many students included in this study may have used FVSU as an auxiliary institution to complete their environmental science class requirement.

Test Instruments

In this study, student performance was operationalized by final course grades. The final course grade was derived from test, homework, class participation, and research project scores. The four aforementioned assessments were valid and relevant; they were useful in gauging student ability and generating objective performance measurements. The final grades were converted from numerical scores to traditional GPA letters.
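
The conversion from numerical scores to letter grades can be sketched in a few lines. The exact cutoffs used at FVSU are not given in the paper, so a standard 10-point scale is assumed here purely for illustration:

```python
def to_letter(score):
    """Map a numerical course score to a letter grade.

    The paper does not list the cutoffs used; a standard
    10-point scale is assumed here for illustration.
    """
    if score >= 90:
        return "A"
    elif score >= 80:
        return "B"
    elif score >= 70:
        return "C"
    elif score >= 60:
        return "D"
    return "F"

print([to_letter(s) for s in (95, 84, 72, 65, 50)])
# → ['A', 'B', 'C', 'D', 'F']
```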

Data Collection Procedures

The 548 student grades in the sample were obtained from FVSU's Office of Institutional Research, Planning and Effectiveness (OIRPE). The OIRPE released the grades to the instructor with the expectation that the instructor would maintain confidentiality and not disclose the information to third parties. After the data were obtained, the instructor analyzed and processed them through SPSS software to calculate specific values. These values were subsequently used to draw conclusions and test the hypotheses.

Summary of the Results: The chi-square analysis showed no significant difference in student performance between online and face-to-face (F2F) learners [χ²(4, N = 548) = 6.531, p > 0.05]. The independent-samples t-test showed no significant difference in student performance between online and F2F learners with respect to gender [t(145) = 1.42, p = 0.122]. The two-way ANOVA showed no significant difference in student performance between online and F2F learners with respect to class rank (Girard et al., 2016).


Research Question 1

The first research question investigated if there was a difference in student performance between F2F and online learners.

To investigate the first research question, we used a traditional chi-square method to analyze the data. The chi-square analysis is particularly useful for this type of comparison because it tests whether the association between teaching modality and performance observed in our sample can be generalized to the larger population. The chi-square method yields a numerical result that can be used to determine whether there is a statistically significant difference between the two groups.
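
As a sketch of this procedure, a chi-square test of independence can be run on a modality-by-grade contingency table. The counts below are illustrative only (the row totals of 401 and 147 and the A counts of 28 and 16 match the study; the remaining cells are hypothetical, not the study's actual Table 2):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2 x 5 contingency table: rows = modality (F2F, online),
# columns = letter grade (A, B, C, D, F). Only the row totals and the
# A counts are taken from the paper; other cells are illustrative.
table = np.array([
    [28, 90, 130, 88, 65],   # F2F    (n = 401)
    [16, 30,  45, 33, 23],   # online (n = 147)
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.3f}")
# df = (2 - 1) * (5 - 1) = 4; a p-value above 0.05 would indicate the
# grade distribution does not differ significantly between modalities.
```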

Table 1 shows the mean and SD by modality and by gender, a general breakdown of the numbers to visually elucidate any differences between scores and deviations. The mean final score for the two modalities is similar, with F2F learners scoring 69.35 and online learners 68.64, and both groups had fairly similar SDs. A stronger difference can be seen between the GPAs earned by men and women: men had a 3.23 mean GPA while women had a 2.9 mean GPA, with nearly identical SDs. Even though the 0.33 numerical difference may look fairly small, it must be noted that a 3.23 is approximately a B+ while a 2.9 is approximately a B. Given a categorical range of only A to F, a plus differential can be considered notable.


Table 1. Means and standard deviations for the 8-semester Environmental Science data set.

The mean grade for men in the environmental science classes (M = 3.23, N = 246, SD = 1.19) was higher than the mean grade for women in the classes (M = 2.9, N = 302, SD = 1.20) (see Table 1).

First, a chi-square analysis was performed using SPSS to determine if there was a statistically significant difference in grade distribution between online and F2F students. Students enrolled in the F2F class earned the higher share of A's (63.60%) compared with online students (36.40%). Table 2 displays grade distribution by course delivery modality. The difference in student performance was not statistically significant, χ²(4, N = 548) = 6.531, p > 0.05. Table 3 shows the gender difference in student performance between online and F2F students.


Table 2. Contingency table for students' academic performance (N = 548).


Table 3. Gender × performance crosstabulation.

Table 2 shows the performance measures of online and F2F students by grade category. As can be seen, F2F students generated the highest raw numbers in each grade category. However, this disparity was mostly due to the higher number of F2F students in the study: there were 401 F2F students as opposed to just 147 online students. When grades are viewed with respect to modality, the percentage differences between respective learners are smaller (Tanyel and Griffin, 2014). For example, F2F learners earned 28 A's (63.60% of total A's earned) while online learners earned 16 A's (36.40%). However, when the A grade is viewed with respect to total learners in each modality, 28 of the 401 F2F students (6.9%) earned A's compared with 16 of 147 (10.9%) online learners; in this case, online learners scored relatively higher. The latter measure (grade total as a percent of modality total) is a better reflection of respective performance levels.

Given a critical value of 7.7 and 4 degrees of freedom, we obtained a chi-squared measure of 6.531. The corresponding p-value of 0.163 was greater than our significance level of 0.05. We therefore failed to reject the null hypothesis: there is no statistically significant difference between the two groups in terms of performance scores.
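
The decision step can be reproduced from the reported statistic alone. The sketch below recovers the p-value of roughly 0.163 from χ² = 6.531 with df = 4 and compares it against the significance level:

```python
from scipy.stats import chi2

stat, df, alpha = 6.531, 4, 0.05
p_value = chi2.sf(stat, df)     # survival function: P(X >= stat)
print(f"p = {p_value:.3f}")     # ≈ 0.163, matching the reported value

if p_value > alpha:
    # p exceeds alpha, so the observed association between modality
    # and grade distribution is consistent with chance.
    print("Fail to reject the null: no significant difference.")
```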

Research Question 2

The second research question was posed to evaluate whether the difference between online and F2F performance varied with gender: does online and F2F student performance vary with respect to gender? Table 3 shows the gender difference in student performance between online and face-to-face students. We used a chi-square test, with alpha equal to 0.05 as the criterion for significance, to determine if there were differences in online and F2F student performance with respect to gender. The chi-square result shows that there is no statistically significant difference between men and women in terms of performance.

Research Question 3

The third research question sought to determine whether the difference between online and F2F performance varied with class rank: does online and F2F student performance vary with respect to class rank?

Table 4 shows the mean scores and standard deviations of freshman, sophomore, junior, and senior students for both online and F2F student performance. To test the third hypothesis, we used a two-way ANOVA. The ANOVA is a useful appraisal tool for this particular hypothesis because it tests the differences among multiple means; instead of testing specific pairwise differences, it generates a much broader picture of average differences. The ANOVA for this hypothesis shows no significant difference between online and F2F learners with respect to class rank. We therefore failed to reject the null hypothesis.
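
As an illustration of the procedure, a two-way ANOVA (modality × class rank) can be computed by hand for a balanced design. The synthetic scores below are not the study's data, and the study's real, unbalanced design would typically be handled by a statistics package such as SPSS; this sketch assumes equal cell sizes so the sums of squares decompose cleanly:

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(0)

# Balanced illustrative design: 2 modalities x 4 class ranks,
# 20 synthetic scores per cell (illustrative data only).
a_levels, b_levels, n = 2, 4, 20
scores = rng.normal(70, 10, size=(a_levels, b_levels, n))

grand = scores.mean()
ss_total = ((scores - grand) ** 2).sum()
ss_a = b_levels * n * ((scores.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_b = a_levels * n * ((scores.mean(axis=(0, 2)) - grand) ** 2).sum()
cell_means = scores.mean(axis=2)
ss_cells = n * ((cell_means - grand) ** 2).sum()
ss_ab = ss_cells - ss_a - ss_b          # interaction sum of squares
ss_err = ss_total - ss_cells            # within-cell (error) sum of squares

df_a, df_b = a_levels - 1, b_levels - 1
df_ab = df_a * df_b
df_err = a_levels * b_levels * (n - 1)

ms_err = ss_err / df_err
for name, ss, df in [("modality", ss_a, df_a),
                     ("rank", ss_b, df_b),
                     ("interaction", ss_ab, df_ab)]:
    F = (ss / df) / ms_err
    p = f_dist.sf(F, df, df_err)
    print(f"{name:12s} F({df},{df_err}) = {F:6.3f}, p = {p:.3f}")
```

A p-value above 0.05 on the modality row would correspond to the study's finding of no significant modality effect.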


Table 4. Descriptive analysis of student performance by class rank and gender.

The results of the ANOVA show there is no significant difference in performance between online and F2F students with respect to class rank. The results of the ANOVA are presented in Table 5.


Table 5. Analysis of variance (ANOVA) for online and F2F student performance by class rank.


Discussion and Social Implications

The results of the study show there is no significant difference in performance between online and traditional classroom students with respect to modality, gender, or class rank in a science concepts course for non-STEM majors. Although there were sample size issues and study limitations, this assessment shows both online learners and classroom learners perform at the same level. This finding indicates teaching modality may not matter as much as other factors. Given the relatively sparse data comparing pedagogical modalities for specific student population characteristics, this study could be considered innovative. In the current literature, we have not found a study of this nature comparing online and F2F non-STEM majors with respect to three separate factors (medium, gender, and class rank) and the ability to learn science concepts and achieve learning outcomes. Previous studies have compared traditional classroom learning vs. online learning on other factors (including specific courses, costs, and qualitative analyses), but rarely regarding outcomes relevant to the population characteristics of learners in a specific science concepts course over many years (Liu, 2005).

In a study evaluating the transformation of a graduate level course for teachers, academic quality of the online course and learning outcomes were evaluated. The study evaluated the ability of course instructors to design the course for online delivery and develop various interactive multimedia models at a cost-savings to the respective university. The online learning platform proved effective in translating information where tested students successfully achieved learning outcomes comparable to students taking the F2F course ( Herman and Banister, 2007 ).

Another study evaluated the similarities and differences between F2F and online learning in a non-STEM course, "Foundations of American Education," along with overall course satisfaction among students enrolled in either modality. F2F and online course satisfaction was qualitatively and quantitatively analyzed. When quantitative feedback was analyzed, online course satisfaction was lower than F2F satisfaction; when qualitative data were used, course satisfaction was similar between modalities (Werhner, 2010). The course satisfaction data and feedback were used to suggest a number of posits for effective online learning in the specific course. The researcher concluded that there was no difference in the learning success of students enrolled in the online vs. F2F course, stating that "in terms of learning, students who apply themselves diligently should be successful in either format" (Dell et al., 2010). The authors' conclusion presumes that "issues surrounding class size are under control and that the instructor has a course load that makes the intensity of the online course workload feasible," with the authors concluding that the workload for online courses exceeds that for F2F courses (Stern, 2004).

In “A Meta-Analysis of Three Types of Interaction Treatments in Distance Education,” Bernard et al. (2009) conducted a meta-analysis evaluating three types of instructional and/or media conditions designed into distance education (DE) courses known as interaction treatments (ITs)—student–student (SS), student–teacher (ST), or student–content (SC) interactions—to other DE instructional/interaction treatments. The researchers found that a strong association existed between the integration of these ITs into distance education courses and achievement compared with blended or F2F modalities of learning. The authors speculated that this was due to increased cognitive engagement based in these three interaction treatments ( Larson and Sung, 2009 ).

Other studies evaluating students' preferences (but not efficacy) for online vs. F2F learning found that students preferred online learning when it was offered, depending on course topic and online course technology platform (Ary and Brune, 2011). F2F learning was preferred when courses were offered in late morning or early afternoon 2–3 days/week. A significant preference for online learning was found across all undergraduate course topics (American history and government, humanities, natural sciences, social and behavioral sciences, diversity, and international dimension) except English composition and oral communication. Students also expressed a preference for analytical and quantitative thought courses, though not with statistically significant results (Mann and Henneberry, 2014). In this research study, we examined three hypotheses comparing online and F2F learning. In each case, we failed to reject the null hypothesis; at no level of examination did we find a significant difference between online and F2F learners. This finding is important because it suggests traditional-style teaching, with its heavy emphasis on interpersonal classroom dynamics, may one day be replaced by online instruction. According to Daymont and Blau (2008), online learners, regardless of gender or class rank, learn as much from electronic interaction as they do from personal interaction. Kemp and Grieve (2014) also found that online and F2F learning for psychology students led to similar academic performance. Given the cost efficiencies and flexibility of online education, Web-based instructional systems may rapidly rise.

A number of studies support the economic benefits of online vs. F2F learning, despite differences in social constructs and the educational support provided by governments. In a study by Li and Chen (2012), higher education institutions benefited most from two of four outputs, research outputs and distance education, with teaching via distance education at both the undergraduate and graduate levels more profitable than F2F teaching at higher education institutions in China. Zhang and Worthington (2017) reported an increasing cost benefit of distance education over F2F instruction at 37 Australian public universities over the 9 years from 2003 to 2012. Maloney et al. (2015) and Kemp and Grieve (2014) also found significant savings in higher education when using online learning platforms vs. F2F learning. In the West, the cost efficiency of online learning has been demonstrated by several research studies (Craig, 2015). Studies by Agasisti and Johnes (2015) and Bartley and Golek (2004) both found the cost benefits of online learning significantly greater than those of F2F learning at U.S. institutions.

Knowing there is no significant difference in student performance between the two mediums, institutions of higher education may make the gradual shift away from traditional instruction; they may implement Web-based teaching to capture a larger worldwide audience. If administered correctly, this shift to Web-based teaching could lead to a larger buyer population, more cost efficiencies, and more university revenue.

The social implications of this study should be touted; however, several concerns regarding generalizability need to be taken into account. First, this study focused solely on students from an environmental studies class for non-STEM majors. The ability to effectively prepare students for scientific professions without hands-on experimentation has been contested. Because the course functions to communicate scientific concepts but does not require a laboratory-based component, these results may not translate into similar performance by students in an online STEM course for STEM majors, or in an online course with an online laboratory-based co-requisite, when compared to students taking traditional STEM courses for STEM majors. A few studies suggest the landscape may be changing, with the ability to effectively train students in STEM core concepts via online learning. Biel and Brame (2016) reported successfully translating the academic success of F2F undergraduate biology courses to online biology courses; however, of the large-scale courses analyzed, two found that F2F sections outperformed online sections and three found no significant difference. A study by Beale et al. (2014) comparing F2F learning with hybrid learning in an embryology course found no difference in overall student performance. Additionally, the bottom quartile of students showed no differential effect of the delivery method on examination scores. Further, a study by Lorenzo-Alvarez et al. (2019) found that radiology education in an online learning platform resulted in academic outcomes similar to F2F learning. Larger-scale research is needed to determine the effectiveness of STEM online learning and outcomes assessments, including workforce development results.

In our research study, the participants may have been more knowledgeable about environmental science than about other subjects; it should therefore be noted that this study focused solely on students taking this one particular class. Given the results, this course presents a unique potential for increasing the number of non-STEM majors engaged in citizen science by using the flexibility of online learning to teach environmental science core concepts.

Second, the operationalization measure of "grade" or "score" to determine performance level may be lacking in scope and depth. The grades received in a class may not necessarily reflect actual ability, especially if the weights were adjusted to heavily favor group tasks and writing projects. Other performance indicators may be better suited to properly assess student performance. A single exam containing both multiple-choice and essay questions may be a better operationalization indicator of student performance, as it provides both a quantitative and a qualitative measure of subject matter comprehension.
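To illustrate how grade weighting can mask ability, the sketch below computes one student's final grade under two hypothetical weighting schemes; all component names, scores, and weights are invented for illustration.

```python
# Invented component scores for one student (0-100 scale)
scores = {"exams": 62, "group_tasks": 95, "writing_projects": 92}

def final_grade(scores, weights):
    """Weighted average of component scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[k] * w for k, w in weights.items())

# Two hypothetical weighting schemes
exam_heavy = {"exams": 0.70, "group_tasks": 0.15, "writing_projects": 0.15}
group_heavy = {"exams": 0.20, "group_tasks": 0.40, "writing_projects": 0.40}

# The same raw performance yields very different course grades
grade_exam_heavy = final_grade(scores, exam_heavy)    # ~71.45
grade_group_heavy = final_grade(scores, group_heavy)  # ~87.2
```

The roughly 16-point gap between the two final grades comes entirely from the weighting scheme, not from any difference in demonstrated ability.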

Third, the nature of the student sample must be further dissected. The online students in this study may have had more time than their counterparts to learn the material and generate better grades (Summers et al., 2005); the inverse holds true as well. Because this was a convenience, non-probability sample, the chances of obtaining a fair cross section of the student population were limited. In future studies, greater emphasis must be placed on selecting study participants who truly reflect the population's proportions, types, and skill levels.

This study was relevant because it addressed an important educational topic: it compared two student groups on multiple levels using a single operationalized performance measure. More studies of this nature, however, need to be conducted before truly positing that online and F2F teaching generate the same results. Future studies need to eliminate spurious causal relationships and increase generalizability, maximizing the chances of generating definitive, untainted results. This scientific inquiry into and comparison of online and traditional teaching will undoubtedly garner more attention in the coming years.

Our study compared F2F and online learning modalities in teaching an environmental science course, additionally evaluating the factors of gender and class rank. These data demonstrate the ability to similarly translate environmental science concepts for non-STEM majors in both traditional and online platforms, irrespective of gender or class rank. The social implications of this finding are important for advancing access to and learning of scientific concepts by the general population, as many institutions of higher education allow an online course to be taken without enrolling in a degree program. Thus, the potential exists for increasing the number of non-STEM majors engaged in citizen science by using the flexibility of online learning to teach environmental science core concepts.

Limitations of the Study

The limitations of the study centered on the nature of the sample group, student skills and abilities, and student familiarity with online instruction. First, because this was a convenience, non-probability sample, the independent variables were not adjusted for real-world accuracy. Second, student intelligence and skill level were not taken into consideration when separating out comparison groups; the F2F learners in this study may have been more capable than the online students, or vice versa. This limitation also applies to gender and class rank differences (Friday et al., 2006). Finally, there may have been familiarity issues between the two sets of learners. Experienced traditional classroom students now taking Web-based courses may be daunted by the technical aspects of the modality; they may not have had the necessary preparation or experience to e-learn efficiently, leading to lowered scores (Helms, 2014). In addition to comparing online and F2F instructional efficacy, future research should also analyze blended teaching methods in courses for non-STEM majors that impart basic STEM concepts, to see whether the blended style is more effective than either pure style.

Data Availability Statement

The datasets generated for this study are available on request to the corresponding author.

Ethics Statement

The studies involving human participants were reviewed and approved by Fort Valley State University Human Subjects Institutional Review Board. Written informed consent for participation was not required for this study in accordance with the national legislation and the institutional requirements.

Author Contributions

JP provided substantial contributions to the conception of the work, acquisition and analysis of data for the work, and is the corresponding author on this paper who agrees to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. FJ provided substantial contributions to the design of the work, interpretation of the data for the work, and revised it critically for intellectual content.

Funding

This research was supported in part by funding from the National Science Foundation, Awards #1649717, 1842510, Ñ900572, and 1939739 to FJ.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

The authors would like to thank the reviewers for their detailed comments and feedback that assisted in the revising of our original manuscript.

References

Agasisti, T., and Johnes, G. (2015). Efficiency, costs, rankings and heterogeneity: the case of US higher education. Stud. High. Educ. 40, 60–82. doi: 10.1080/03075079.2013.818644


Ary, E. J., and Brune, C. W. (2011). A comparison of student learning outcomes in traditional and online personal finance courses. MERLOT J. Online Learn. Teach. 7, 465–474.


Atchley, W., Wingenbach, G., and Akers, C. (2013). Comparison of course completion and student performance through online and traditional courses. Int. Rev. Res. Open Dist. Learn. 14, 104–116. doi: 10.19173/irrodl.v14i4.1461

Bartley, S. J., and Golek, J. H. (2004). Evaluating the cost effectiveness of online and face-to-face instruction. Educ. Technol. Soc. 7, 167–175.

Beale, E. G., Tarwater, P. M., and Lee, V. H. (2014). A retrospective look at replacing face-to-face embryology instruction with online lectures in a human anatomy course. Am. Assoc. Anat. 7, 234–241. doi: 10.1002/ase.1396


Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkesh, M. A., et al. (2009). A meta-analysis of three types of interaction treatments in distance education. Rev. Educ. Res. 79, 1243–1289. doi: 10.3102/0034654309333844

Biel, R., and Brame, C. J. (2016). Traditional versus online biology courses: connecting course design and student learning in an online setting. J. Microbiol. Biol. Educ. 17, 417–422. doi: 10.1128/jmbe.v17i3.1157

Bigelow, C. A. (2009). Comparing student performance in an online versus a face to face introductory turfgrass science course-a case study. NACTA J. 53, 1–7.

Columbaro, N. L., and Monaghan, C. H. (2009). Employer perceptions of online degrees: a literature review. Online J. Dist. Learn. Administr. 12.

Craig, R. (2015). A Brief History (and Future) of Online Degrees. Forbes/Education . Available online at: https://www.forbes.com/sites/ryancraig/2015/06/23/a-brief-history-and-future-of-online-degrees/#e41a4448d9a8

Daymont, T., and Blau, G. (2008). Student performance in online and traditional sections of an undergraduate management course. J. Behav. Appl. Manag. 9, 275–294.

Dell, C. A., Low, C., and Wilker, J. F. (2010). Comparing student achievement in online and face-to-face class formats. J. Online Learn. Teach. Long Beach 6, 30–42.

Driscoll, A., Jicha, K., Hunt, A. N., Tichavsky, L., and Thompson, G. (2012). Can online courses deliver in-class results? A comparison of student performance and satisfaction in an online versus a face-to-face introductory sociology course. Am. Sociol. Assoc . 40, 312–313. doi: 10.1177/0092055X12446624

Friday, E., Shawnta, S., Green, A. L., and Hill, A. Y. (2006). A multi-semester comparison of student performance between multiple traditional and online sections of two management courses. J. Behav. Appl. Manag. 8, 66–81.

Girard, J. P., Yerby, J., and Floyd, K. (2016). Knowledge retention in capstone experiences: an analysis of online and face-to-face courses. Knowl. Manag. ELearn. 8, 528–539. doi: 10.34105/j.kmel.2016.08.033

Helms, J. L. (2014). Comparing student performance in online and face-to-face delivery modalities. J. Asynchr. Learn. Netw. 18, 1–14. doi: 10.24059/olj.v18i1.348

Herman, T., and Banister, S. (2007). Face-to-face versus online coursework: a comparison of costs and learning outcomes. Contemp. Issues Technol. Teach. Educ. 7, 318–326.

Kemp, N., and Grieve, R. (2014). Face-to-Face or face-to-screen? Undergraduates' opinions and test performance in classroom vs. online learning. Front. Psychol. 5:1278. doi: 10.3389/fpsyg.2014.01278

Keramidas, C. G. (2012). Are undergraduate students ready for online learning? A comparison of online and face-to-face sections of a course. Rural Special Educ. Q . 31, 25–39. doi: 10.1177/875687051203100405

Larson, D.K., and Sung, C. (2009). Comparing student performance: online versus blended versus face-to-face. J. Asynchr. Learn. Netw. 13, 31–42. doi: 10.24059/olj.v13i1.1675

Li, F., and Chen, X. (2012). Economies of scope in distance education: the case of Chinese Research Universities. Int. Rev. Res. Open Distrib. Learn. 13, 117–131.

Liu, Y. (2005). Effects of online instruction vs. traditional instruction on student's learning. Int. J. Instruct. Technol. Dist. Learn. 2, 57–64.

Lorenzo-Alvarez, R., Rudolphi-Solero, T., Ruiz-Gomez, M. J., and Sendra-Portero, F. (2019). Medical student education for abdominal radiographs in a 3D virtual classroom versus traditional classroom: a randomized controlled trial. Am. J. Roentgenol. 213, 644–650. doi: 10.2214/AJR.19.21131

Lundberg, J., Castillo-Merino, D., and Dahmani, M. (2008). Do online students perform better than face-to-face students? Reflections and a short review of some Empirical Findings. Rev. Univ. Soc. Conocim . 5, 35–44. doi: 10.7238/rusc.v5i1.326

Maloney, S., Nicklen, P., Rivers, G., Foo, J., Ooi, Y. Y., Reeves, S., et al. (2015). Cost-effectiveness analysis of blended versus face-to-face delivery of evidence-based medicine to medical students. J. Med. Internet Res. 17:e182. doi: 10.2196/jmir.4346

Mann, J. T., and Henneberry, S. R. (2014). Online versus face-to-face: students' preferences for college course attributes. J. Agric. Appl. Econ . 46, 1–19. doi: 10.1017/S1074070800000602

Mozes-Carmel, A., and Gold, S. S. (2009). A comparison of online vs proctored final exams in online classes. Imanagers J. Educ. Technol. 6, 76–81. doi: 10.26634/jet.6.1.212

Richardson, J. C., and Swan, K. (2003). Examining social presence in online courses in relation to student's perceived learning and satisfaction. J. Asynchr. Learn. 7, 68–88.

Rovai, A. P., and Jordan, H. M. (2004). Blended learning and sense of community: a comparative analysis with traditional and fully online graduate courses. Int. Rev. Res. Open Dist. Learn. 5. doi: 10.19173/irrodl.v5i2.192

Salcedo, C. S. (2010). Comparative analysis of learning outcomes in face-to-face foreign language classes vs. language lab and online. J. Coll. Teach. Learn. 7, 43–54. doi: 10.19030/tlc.v7i2.88

Stern, B. S. (2004). A comparison of online and face-to-face instruction in an undergraduate foundations of american education course. Contemp. Issues Technol. Teach. Educ. J. 4, 196–213.

Summers, J. J., Waigandt, A., and Whittaker, T. A. (2005). A comparison of student achievement and satisfaction in an online versus a traditional face-to-face statistics class. Innov. High. Educ. 29, 233–250. doi: 10.1007/s10755-005-1938-x

Tanyel, F., and Griffin, J. (2014). A Ten-Year Comparison of Outcomes and Persistence Rates in Online versus Face-to-Face Courses . Retrieved from: https://www.westga.edu/~bquest/2014/onlinecourses2014.pdf

Werhner, M. J. (2010). A comparison of the performance of online versus traditional on-campus earth science students on identical exams. J. Geosci. Educ. 58, 310–312. doi: 10.5408/1.3559697

Westhuis, D., Ouellette, P. M., and Pfahler, C. L. (2006). A comparative analysis of on-line and classroom-based instructional formats for teaching social work research. Adv. Soc. Work 7, 74–88. doi: 10.18060/184

Wladis, C., Conway, K. M., and Hachey, A. C. (2015). The online STEM classroom-who succeeds? An exploration of the impact of ethnicity, gender, and non-traditional student characteristics in the community college context. Commun. Coll. Rev. 43, 142–164. doi: 10.1177/0091552115571729

Xu, D., and Jaggars, S. S. (2016). Performance gaps between online and face-to-face courses: differences across types of students and academic subject areas. J. Higher Educ. 85, 633–659. doi: 10.1353/jhe.2014.0028

Zhang, L.-C., and Worthington, A. C. (2017). Scale and scope economies of distance education in Australian universities. Stud. High. Educ. 42, 1785–1799. doi: 10.1080/03075079.2015.1126817

Keywords: face-to-face (F2F), traditional classroom teaching, web-based instructions, information and communication technology (ICT), online learning, desire to learn (D2L), passive learning, active learning

Citation: Paul J and Jefferson F (2019) A Comparative Analysis of Student Performance in an Online vs. Face-to-Face Environmental Science Course From 2009 to 2016. Front. Comput. Sci. 1:7. doi: 10.3389/fcomp.2019.00007

Received: 15 May 2019; Accepted: 15 October 2019; Published: 12 November 2019.


Copyright © 2019 Paul and Jefferson. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Jasmine Paul, paulj@fvsu.edu


Published: 25 January 2021

Online education in the post-COVID era

Barbara B. Lockee

Nature Electronics volume 4, pages 5–6 (2021)


The coronavirus pandemic has forced students and educators across all levels of education to rapidly adapt to online learning. The impact of this — and the developments required to make it work — could permanently change how education is delivered.

The COVID-19 pandemic has forced the world to engage in the ubiquitous use of virtual learning. And while online and distance learning has been used before to maintain continuity in education, such as in the aftermath of earthquakes 1 , the scale of the current crisis is unprecedented. Speculation has now also begun about what the lasting effects of this will be and what education may look like in the post-COVID era. For some, an immediate retreat to the traditions of the physical classroom is required. But for others, the forced shift to online education is a moment of change and a time to reimagine how education could be delivered 2 .


Looking back

Online education has traditionally been viewed as an alternative pathway, one that is particularly well suited to adult learners seeking higher education opportunities. However, the emergence of the COVID-19 pandemic has required educators and students across all levels of education to adapt quickly to virtual courses. (The term ‘emergency remote teaching’ was coined in the early stages of the pandemic to describe the temporary nature of this transition 3 .) In some cases, instruction shifted online, then returned to the physical classroom, and then shifted back online due to further surges in the rate of infection. In other cases, instruction was offered using a combination of remote delivery and face-to-face: that is, students can attend online or in person (referred to as the HyFlex model 4 ). In either case, instructors just had to figure out how to make it work, considering the affordances and constraints of the specific learning environment to create learning experiences that were feasible and effective.

The use of varied delivery modes does, in fact, have a long history in education. Mechanical (and then later electronic) teaching machines have provided individualized learning programmes since the 1950s and the work of B. F. Skinner 5 , who proposed using technology to walk individual learners through carefully designed sequences of instruction with immediate feedback indicating the accuracy of their response. Skinner’s notions formed the first formalized representations of programmed learning, or ‘designed’ learning experiences. Then, in the 1960s, Fred Keller developed a personalized system of instruction 6 , in which students first read assigned course materials on their own, followed by one-on-one assessment sessions with a tutor, gaining permission to move ahead only after demonstrating mastery of the instructional material. Occasional class meetings were held to discuss concepts, answer questions and provide opportunities for social interaction. A personalized system of instruction was designed on the premise that initial engagement with content could be done independently, then discussed and applied in the social context of a classroom.

These predecessors to contemporary online education leveraged key principles of instructional design — the systematic process of applying psychological principles of human learning to the creation of effective instructional solutions — to consider which methods (and their corresponding learning environments) would effectively engage students to attain the targeted learning outcomes. In other words, they considered what choices about the planning and implementation of the learning experience can lead to student success. Such early educational innovations laid the groundwork for contemporary virtual learning, which itself incorporates a variety of instructional approaches and combinations of delivery modes.

Online learning and the pandemic

Fast forward to 2020, and various further educational innovations have occurred to make the universal adoption of remote learning a possibility. One key challenge is access. Here, extensive problems remain, including the lack of Internet connectivity in some locations, especially rural ones, and the competing needs among family members for the use of home technology. However, creative solutions have emerged to provide students and families with the facilities and resources needed to engage in and successfully complete coursework 7 . For example, school buses have been used to provide mobile hotspots, and class packets have been sent by mail and instructional presentations aired on local public broadcasting stations. The year 2020 has also seen increased availability and adoption of electronic resources and activities that can now be integrated into online learning experiences. Synchronous online conferencing systems, such as Zoom and Google Meet, have allowed experts from anywhere in the world to join online classrooms 8 and have allowed presentations to be recorded for individual learners to watch at a time most convenient for them. Furthermore, the importance of hands-on, experiential learning has led to innovations such as virtual field trips and virtual labs 9 . A capacity to serve learners of all ages has thus now been effectively established, and the next generation of online education can move from an enterprise that largely serves adult learners and higher education to one that increasingly serves younger learners, in primary and secondary education and from ages 5 to 18.

The COVID-19 pandemic is also likely to have a lasting effect on lesson design. The constraints of the pandemic provided an opportunity for educators to consider new strategies to teach targeted concepts. Though the rethinking of instructional approaches was forced and hurried, the experience has served as a rare chance to reconsider strategies that best facilitate learning within the affordances and constraints of the online context. In particular, greater variance in teaching and learning activities will continue to question the importance of ‘seat time’ as the standard on which educational credits are based 10 — lengthy Zoom sessions are seldom instructionally necessary and are not aligned with the psychological principles of how humans learn. Interaction is important for learning, but forced interactions among students for the sake of interaction are neither motivating nor beneficial.

While the blurring of the lines between traditional and distance education has been noted for several decades 11 , the pandemic has quickly advanced the erasure of these boundaries. Less single mode, more multi-mode (and thus more educator choices) is becoming the norm due to enhanced infrastructure and developed skill sets that allow people to move across different delivery systems 12 . The well-established best practices of hybrid or blended teaching and learning 13 have served as a guide for new combinations of instructional delivery that have developed in response to the shift to virtual learning. The use of multiple delivery modes is likely to remain, and will be a feature employed with learners of all ages 14 , 15 . Future iterations of online education will no longer be bound to the traditions of single teaching modes, as educators can support pedagogical approaches from a menu of instructional delivery options, a mix that has been supported by previous generations of online educators 16 .

Also significant are the changes to how learning outcomes are determined in online settings. Many educators have altered the ways in which student achievement is measured, eliminating assignments and changing assessment strategies altogether 17 . Such alterations include determining learning through strategies that leverage the online delivery mode, such as interactive discussions, student-led teaching and the use of games to increase motivation and attention. Specific changes that are likely to continue include flexible or extended deadlines for assignment completion 18 , more student choice regarding measures of learning, and more authentic experiences that involve the meaningful application of newly learned skills and knowledge 19 , for example, team-based projects that involve multiple creative and social media tools in support of collaborative problem solving.

In response to the COVID-19 pandemic, technological and administrative systems for implementing online learning, and the infrastructure that supports its access and delivery, had to adapt quickly. While access remains a significant issue for many, extensive resources have been allocated and processes developed to connect learners with course activities and materials, to facilitate communication between instructors and students, and to manage the administration of online learning. Paths for greater access and opportunities to online education have now been forged, and there is a clear route for the next generation of adopters of online education.

Before the pandemic, the primary purpose of distance and online education was providing access to instruction for those otherwise unable to participate in a traditional, place-based academic programme. As its purpose has shifted to supporting continuity of instruction, its audience, as well as the wider learning ecosystem, has changed. It will be interesting to see which aspects of emergency remote teaching remain in the next generation of education, when the threat of COVID-19 is no longer a factor. But online education will undoubtedly find new audiences. And the flexibility and learning possibilities that have emerged from necessity are likely to shift the expectations of students and educators, diminishing further the line between classroom-based instruction and virtual learning.

Mackey, J., Gilmore, F., Dabner, N., Breeze, D. & Buckley, P. J. Online Learn. Teach. 8 , 35–48 (2012).


Sands, T. & Shushok, F. The COVID-19 higher education shove. Educause Review https://go.nature.com/3o2vHbX (16 October 2020).

Hodges, C., Moore, S., Lockee, B., Trust, T. & Bond, M. A. The difference between emergency remote teaching and online learning. Educause Review https://go.nature.com/38084Lh (27 March 2020).

Beatty, B. J. (ed.) Hybrid-Flexible Course Design Ch. 1.4 https://go.nature.com/3o6Sjb2 (EdTech Books, 2019).

Skinner, B. F. Science 128 , 969–977 (1958).


Keller, F. S. J. Appl. Behav. Anal. 1 , 79–89 (1968).

Darling-Hammond, L. et al. Restarting and Reinventing School: Learning in the Time of COVID and Beyond (Learning Policy Institute, 2020).

Fulton, C. Information Learn. Sci . 121 , 579–585 (2020).

Pennisi, E. Science 369 , 239–240 (2020).

Silva, E. & White, T. Change The Magazine Higher Learn. 47 , 68–72 (2015).

McIsaac, M. S. & Gunawardena, C. N. in Handbook of Research for Educational Communications and Technology (ed. Jonassen, D. H.) Ch. 13 (Simon & Schuster Macmillan, 1996).

Irvine, V. The landscape of merging modalities. Educause Review https://go.nature.com/2MjiBc9 (26 October 2020).

Stein, J. & Graham, C. Essentials for Blended Learning Ch. 1 (Routledge, 2020).

Maloy, R. W., Trust, T. & Edwards, S. A. Variety is the spice of remote learning. Medium https://go.nature.com/34Y1NxI (24 August 2020).

Lockee, B. J. Appl. Instructional Des . https://go.nature.com/3b0ddoC (2020).

Dunlap, J. & Lowenthal, P. Open Praxis 10 , 79–89 (2018).

Johnson, N., Veletsianos, G. & Seaman, J. Online Learn. 24 , 6–21 (2020).

Vaughan, N. D., Cleveland-Innes, M. & Garrison, D. R. Assessment in Teaching in Blended Learning Environments: Creating and Sustaining Communities of Inquiry (Athabasca Univ. Press, 2013).

Conrad, D. & Openo, J. Assessment Strategies for Online Learning: Engagement and Authenticity (Athabasca Univ. Press, 2018).


Author information

Authors and affiliations

School of Education, Virginia Tech, Blacksburg, VA, USA

Barbara B. Lockee


Corresponding author

Correspondence to Barbara B. Lockee .

Ethics declarations

Competing interests

The author declares no competing interests.

Rights and permissions

Reprints and permissions

About this article


Lockee, B. B. Online education in the post-COVID era. Nat. Electron. 4, 5–6 (2021). https://doi.org/10.1038/s41928-020-00534-0




  • Research article
  • Open access
  • Published: 02 December 2020

Integrating students’ perspectives about online learning: a hierarchy of factors

  • Montgomery Van Wart 1 ,
  • Anna Ni 1 ,
  • Pamela Medina 1 ,
  • Jesus Canelon 1 ,
  • Melika Kordrostami 1 ,
  • Jing Zhang 1 &

International Journal of Educational Technology in Higher Education volume 17, Article number: 53 (2020)


This article reports on a large-scale ( n  = 987), exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students’ perspective, and then determines their hierarchical significance. Seven factors--Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Online Interactive Modality, and Social Presence--were identified as significant and reliable. Regression analysis indicates the minimal factors for enrollment in future classes—when students consider convenience and scheduling—were Basic Online Modality, Cognitive Presence, and Online Social Comfort. Students who accepted or embraced online courses on their own merits wanted a minimum of Basic Online Modality, Teaching Presence, Cognitive Presence, Online Social Comfort, and Social Presence. Students, who preferred face-to-face classes and demanded a comparable experience, valued Online Interactive Modality and Instructional Support more highly. Recommendations for online course design, policy, and future research are provided.
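The regression step described in the abstract can be illustrated in miniature as an ordinary least-squares fit of an outcome on a single factor score; the variable names and data below are invented placeholders, not the study's results.

```python
import statistics

def simple_ols(x, y):
    """Least-squares intercept and slope for y = intercept + slope * x."""
    mean_x, mean_y = statistics.fmean(x), statistics.fmean(y)
    slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
             / sum((xi - mean_x) ** 2 for xi in x))
    return mean_y - slope * mean_x, slope

# Invented data: one factor score vs. stated likelihood of enrolling again (1-5)
basic_online_modality = [1.0, 2.0, 3.0, 4.0, 5.0]
enroll_intent = [1.8, 2.6, 3.1, 4.2, 4.8]

intercept, slope = simple_ols(basic_online_modality, enroll_intent)
# A positive slope would indicate that higher scores on the factor
# predict a higher stated likelihood of enrolling in future online classes
```

The study's actual analysis regresses enrollment intentions on several factors simultaneously; this single-predictor version only shows the shape of the computation.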

Introduction

While there are different perspectives of the learning process such as learning achievement and faculty perspectives, students’ perspectives are especially critical since they are ultimately the raison d’être of the educational endeavor (Chickering & Gamson, 1987 ). More pragmatically, students’ perspectives provide invaluable, first-hand insights into their experiences and expectations (Dawson et al., 2019 ). The student perspective is especially important when new teaching approaches are used and when new technologies are being introduced (Arthur, 2009 ; Crews & Butterfield, 2014 ; Van Wart, Ni, Ready, Shayo, & Court, 2020 ). With the renewed interest in “active” education in general (Arruabarrena, Sánchez, Blanco, et al., 2019 ; Kay, MacDonald, & DiGiuseppe, 2019 ; Nouri, 2016 ; Vlachopoulos & Makri, 2017 ) and the flipped classroom approach in particular (Flores, del-Arco, & Silva, 2016 ; Gong, Yang, & Cai, 2020 ; Lundin, et al., 2018 ; Maycock, 2019 ; McGivney-Burelle, 2013 ; O’Flaherty & Phillips, 2015 ; Tucker, 2012 ), along with extraordinary shifts in technology, the student perspective on online education is profoundly important. Students’ perceptions of quality integrate their own sense of learning achievement, satisfaction with the support they receive, the technical proficiency of the process, intellectual and emotional stimulation, comfort with the process, and a sense of learning community. The factors that students perceive as constituting quality online teaching, however, have not been as clear as they might be, for at least two reasons.

First, it is important to note that the overall online learning experience for students is also composed of non-teaching factors, which we briefly mention. Three such factors are (1) convenience, (2) learner characteristics and readiness, and (3) antecedent conditions that may foster teaching quality but are not directly responsible for it. (1) Convenience is an enormous non-quality factor for students (Artino, 2010 ) which has driven up online demand around the world (Fidalgo, Thormann, Kulyk, et al., 2020 ; Inside Higher Education and Gallup, 2019 ; Legon & Garrett, 2019 ; Ortagus, 2017 ). This is important since satisfaction with online classes is frequently somewhat lower than with face-to-face classes (Macon, 2011 ). However, the literature generally supports the relative equivalence of face-to-face and online modes regarding learning achievement criteria (Bernard et al., 2004 ; Nguyen, 2015 ; Ni, 2013 ; Sitzmann, Kraiger, Stewart, & Wisher, 2006 ; see Xu & Jaggars, 2014 for an alternate perspective). These contrasts are exemplified in a recent study of business students, in which online students using a flipped classroom approach outperformed their face-to-face peers, but ironically rated instructor performance lower (Harjoto, 2017 ). (2) Learner characteristics, such as self-regulation in an active learning model, comfort with technology, and age, affect both receptiveness to and readiness for online instruction (Alqurashi, 2016 ; Cohen & Baruth, 2017 ; Kintu, Zhu, & Kagambe, 2017 ; Kuo, Walker, Schroder, & Belland, 2013 ; Ventura & Moscoloni, 2015 ). (3) Finally, numerous antecedent factors may lead to improved instruction but are not themselves directly perceived by students, such as instructor training (Brinkley-Etzkorn, 2018 ) and the sources of faculty motivation (e.g., incentives, recognition, social influence, and voluntariness) (Wingo, Ivankova, & Moss, 2017 ).
Important as these factors are, mixing them with the perceptions of quality tends to obfuscate the quality factors directly perceived by students.

Second, while student perceptions of quality are used in innumerable studies, our overall understanding still needs to integrate them more holistically. Many studies use student perceptions of quality and overall effectiveness of individual tools and strategies in online contexts such as mobile devices (Drew & Mann, 2018 ), small groups (Choi, Land, & Turgeon, 2005 ), journals (Nair, Tay, & Koh, 2013 ), simulations (Vlachopoulos & Makri, 2017 ), and video (Lange & Costley, 2020 ). Such studies, however, cannot provide the overall context and comparative importance. Some studies have examined the overall learning experience of students with exploratory lists, but have mixed non-quality factors with quality-of-teaching factors, making it difficult to discern the instructor’s versus contextual roles in quality (e.g., Asoodar, Vaezi, & Izanloo, 2016 ; Bollinger & Martindale, 2004 ; Farrell & Brunton, 2020 ; Hong, 2002 ; Song, Singleton, Hill, & Koh, 2004 ; Sun, Tsai, Finger, Chen, & Yeh, 2008 ). The application of technology adoption studies also falls into this category by essentially aggregating all teaching quality into the single category of performance (Al-Gahtani, 2016 ; Artino, 2010 ). Some studies have used high-level teaching-oriented models, primarily the Community of Inquiry model (le Roux & Nagel, 2018 ), but empirical support has been mixed (Arbaugh et al., 2008 ), and its elegance (i.e., relying on only three factors) has not provided much insight to practitioners (Anderson, 2016 ; Cleveland-Innes & Campbell, 2012 ).

Research questions

Although the number of empirical studies related to student perceptions of quality factors has increased, integration of the studies and concepts explored remains fragmented and confusing. It is important to have an empirical view of what students value in a single comprehensive study and, also, to know whether there is a hierarchy of factors, ranging from students who are least to most critical of the online learning experience. This research study has two research questions.

The first research question is: What are the significant factors in creating a high-quality online learning experience from students’ perspectives? That is important to know because it should have a significant effect on the instructor’s design of online classes. The goal of this research question is to identify a more articulated and empirically supported set of factors capturing the full range of student expectations.

The second research question is: Is there a priority or hierarchy of factors related to students’ perceptions of online teaching quality that relate to their decisions to enroll in online classes? For example, is it possible to distinguish which factors are critical for enrollment decisions when students are primarily motivated by convenience and scheduling flexibility (minimum threshold)? Do these factors differ from students with a genuine acceptance of the general quality of online courses (a moderate threshold)? What are the factors that are important for the students who are the most critical of online course delivery (highest threshold)?

This article next reviews the literature on online education quality, focusing on the student perspective and reviews eight factors derived from it. The research methods section discusses the study structure and methods. Demographic data related to the sample are next, followed by the results, discussion, and conclusion.

Literature review

Online education is much discussed (Prinsloo, 2016 ; Van Wart et al., 2019 ; Zawacki-Richter & Naidu, 2016 ), but its perception is substantially influenced by where you stand and what you value (Otter et al., 2013 ; Tanner, Noser, & Totaro, 2009 ). Accrediting bodies care about meeting technical standards, proof of effectiveness, and consistency (Grandzol & Grandzol, 2006 ). Institutions care about reputation, rigor, student satisfaction, and institutional efficiency (Jung, 2011 ). Faculty care about subject coverage, student participation, faculty satisfaction, and faculty workload (Horvitz, Beach, Anderson, & Xia, 2015 ; Mansbach & Austin, 2018 ). For their part, students care about learning achievement (Marks, Sibley, & Arbaugh, 2005 ; O’Neill & Sai, 2014 ; Shen, Cho, Tsai, & Marra, 2013 ), but also view online education as a function of their enjoyment of classes, instructor capability and responsiveness, and comfort in the learning environment (e.g., Asoodar et al., 2016 ; Sebastianelli, Swift, & Tamimi, 2015 ). It is this last perspective, of students, upon which we focus.

It is important to note students do not sign up for online classes solely based on perceived quality. Perceptions of quality derive from notions of the capacity of online learning when ideal (relative to both learning achievement and satisfaction/enjoyment) and from perceptions about the likelihood of classes living up to those expectations. Students also sign up because of convenience and flexibility, and because of personal notions of suitability about learning. Convenience and flexibility are enormous drivers of online registration (Lee, Stringer, & Du, 2017 ; Mann & Henneberry, 2012 ). Even when students say they prefer face-to-face classes to online ones, many enroll in online classes and re-enroll in the future if the experience meets minimum expectations. This study examines the threshold expectations of students when they are considering taking online classes.

When discussing students’ perceptions of quality, there is little clarity about the actual range of concepts because no integrated empirical studies exist comparing major factors found throughout the literature. Rather, there are practitioner-generated lists of micro-competencies such as the Quality Matters consortium for higher education (Quality Matters, 2018 ), or broad frameworks encompassing many aspects of quality beyond teaching (Open and Distant Learning Quality Council, 2012 ). While checklists are useful for practitioners and accreditation processes, they do not provide robust, theoretical bases for scholarly development. Overarching frameworks are heuristically useful, but serve neither pragmatic purposes nor theory building. The most prominent theoretical framework used in the online literature is the Community of Inquiry (CoI) model (Arbaugh et al., 2008 ; Garrison, Anderson, & Archer, 2003 ), which divides instruction into teaching, cognitive, and social presence. As with many deductive theories, however, the supportive evidence is mixed (Rourke & Kanuka, 2009 ), especially regarding the importance of social presence (Annand, 2011 ; Armellini & De Stefani, 2016 ). Conceptually, the problem is not so much with the narrow articulation of cognitive or social presence; cognitive presence is how the instructor provides opportunities for students to interact with material in robust, thought-provoking ways, and social presence refers to building a community of learning that incorporates student-to-student interactions. However, teaching presence includes everything else the instructor does—structuring the course, providing lectures, explaining assignments, creating rehearsal opportunities, supplying tests, grading, answering questions, and so on. These challenges become even more prominent in the online context.
While the lecture as a single medium is paramount in face-to-face classes, it fades as the primary vehicle in online classes with increased use of detailed syllabi, electronic announcements, recorded and synchronous lectures, 24/7 communications related to student questions, etc. Amassing the pedagogical and technological elements related to teaching under a single concept provides little insight.

In addition to the CoI model, numerous concepts are suggested in single-factor empirical studies when focusing on quality from a student’s perspective, with overlapping conceptualizations and nonstandardized naming conventions. Seven distinct factors are derived here from the literature on student perceptions of online quality: Instructional Support, Teaching Presence, Basic Online Modality, Social Presence, Online Social Comfort, Cognitive Presence, and Interactive Online Modality.

Instructional support

Instructional Support refers to students’ perceptions of techniques by the instructor used for input, rehearsal, feedback, and evaluation. Specifically, this entails providing detailed instructions, designed use of multimedia, and the balance between repetitive class features for ease of use, and techniques to prevent boredom. Instructional Support is often included as an element of Teaching Presence, but is also labeled “structure” (Lee & Rha, 2009 ; So & Brush, 2008 ) and instructor facilitation (Eom, Wen, & Ashill, 2006 ). A prime example of the difference between face-to-face and online education is the extensive use of the “flipped classroom” (Maycock, 2019 ; Wang, Huang, & Schunn, 2019 ) in which students move to rehearsal activities faster and more frequently than traditional classrooms, with less instructor lecture (Jung, 2011 ; Martin, Wang, & Sadaf, 2018 ). It has been consistently supported as an element of student perceptions of quality (Espasa & Meneses, 2010 ).

Teaching presence

Teaching Presence refers to students’ perceptions about the quality of communication in lectures, directions, and individual feedback including encouragement (Jaggars & Xu, 2016 ; Marks et al., 2005 ). Specifically, instructor communication is clear, focused, and encouraging, and instructor feedback is customized and timely. If Instructional Support is what an instructor does before the course begins and in carrying out those plans, then Teaching Presence is what the instructor does while the class is conducted and in response to specific circumstances. For example, a course could be well designed but poorly delivered because the instructor is distracted; or a course could be poorly designed but an instructor might make up for the deficit by spending time and energy in elaborate communications and ad hoc teaching techniques. It is especially important in student satisfaction (Sebastianelli et al., 2015 ; Young, 2006 ) and also referred to as instructor presence (Asoodar et al., 2016 ), learner-instructor interaction (Marks et al., 2005 ), and staff support (Jung, 2011 ). As with Instructional Support, it has been consistently supported as an element of student perceptions of quality.

Basic online modality

Basic Online Modality refers to the competent use of basic online class tools—online grading, navigation methods, online grade book, and the announcements function. It is frequently clumped with instructional quality (Artino, 2010 ), service quality (Mohammadi, 2015 ), instructor expertise in e-teaching (Paechter, Maier, & Macher, 2010 ), and similar terms. As a narrowly defined concept, it is sometimes called technology (Asoodar et al., 2016 ; Bollinger & Martindale, 2004 ; Sun et al., 2008 ). The only empirical study that did not find Basic Online Modality significant, as technology, was Sun et al. ( 2008 ). Because Basic Online Modality is addressed with basic instructor training, some studies assert the importance of training (e.g., Asoodar et al., 2016 ).

Social presence

Social Presence refers to students’ perceptions of the quality of student-to-student interaction. Social Presence focuses on the quality of shared learning and collaboration among students, such as in threaded discussion responses (Garrison et al., 2003 ; Kehrwald, 2008 ). Much emphasized but challenged in the CoI literature (Rourke & Kanuka, 2009 ), it has mixed support in the online literature. While some studies found Social Presence or related concepts to be significant (e.g., Asoodar et al., 2016 ; Bollinger & Martindale, 2004 ; Eom et al., 2006 ; Richardson, Maeda, Lv, & Caskurlu, 2017 ), others found Social Presence insignificant (Joo, Lim, & Kim, 2011 ; So & Brush, 2008 ; Sun et al., 2008 ).

Online social comfort

Online Social Comfort refers to the instructor’s ability to provide an environment in which anxiety is low, and students feel comfortable interacting even when expressing opposing viewpoints. While numerous studies have examined anxiety (e.g., Liaw & Huang, 2013 ; Otter et al., 2013 ; Sun et al., 2008 ), only one found anxiety insignificant (Asoodar et al., 2016 ); many others have not examined the concept.

Cognitive presence

Cognitive Presence refers to the engagement of students such that they perceive they are stimulated by the material and instructor to reflect deeply and critically, and seek to understand different perspectives (Garrison et al., 2003 ). The instructor provides instructional materials and facilitates an environment that piques interest, is reflective, and enhances inclusiveness of perspectives (Durabi, Arrastia, Nelson, Cornille, & Liang, 2011 ). Cognitive Presence includes enhancing the applicability of material for students’ potential or current careers. Cognitive Presence is supported as significant in many online studies (e.g., Artino, 2010 ; Asoodar et al., 2016 ; Joo et al., 2011 ; Marks et al., 2005 ; Sebastianelli et al., 2015 ; Sun et al., 2008 ). Further, while many instructors perceive that cognitive presence is diminished in online settings, neuroscientific studies indicate this need not be the case (Takamine, 2017 ). While numerous studies did not examine Cognitive Presence, this review found none that discounted its significance for students.

Interactive online modality

Interactive Online Modality refers to the “high-end” usage of online functionality. That is, the instructor uses interactive online class tools—video lectures, videoconferencing, and small group discussions—well. It is often included in concepts such as instructional quality (Artino, 2010 ; Asoodar et al., 2016 ; Mohammadi, 2015 ; Otter et al., 2013 ; Paechter et al., 2010 ) or engagement (Clayton, Blumberg, & Anthony, 2018 ). While individual methods have been investigated (e.g. Durabi et al., 2011 ), high-end engagement methods have not.

Other independent variables affecting perceptions of quality include age, undergraduate versus graduate status, gender, ethnicity/race, discipline, educational motivation of students, and previous online experience. While age effects have been found to be small or insignificant, more notable effects have been reported at the level of study, with graduate students reporting higher “success” (Macon, 2011 ) and community college students having greater difficulty with online classes (Legon & Garrett, 2019 ; Xu & Jaggars, 2014 ). Ethnicity and race effects have also been small or insignificant. Some situational variations and student preferences can be captured by paying attention to disciplinary differences (Arbaugh, 2005 ; Macon, 2011 ). Motivation levels of students have been reported to be significant in completion and achievement, with better students doing as well across face-to-face and online modes, and weaker students having greater completion and achievement challenges (Clayton et al., 2018 ; Lu & Lemonde, 2013 ).

Research methods

To examine the various quality factors, we apply a critical success factor methodology, initially introduced to schools of business research in the 1970s. In 1981, Rockhart and Bullen codified an approach embodying principles of critical success factors (CSFs) as a way to identify the information needs of executives, detailing steps for the collection and analysis of data to create a set of organizational CSFs (Rockhart & Bullen, 1981 ). CSFs describe the underlying or guiding principles which must be incorporated to ensure success.

Utilizing this methodology, CSFs in the context of this paper define key areas of instruction and design essential for an online class to be successful from a student’s perspective. Instructors implicitly know and consider these areas when setting up an online class and designing and directing activities and tasks important to achieving learning goals. CSFs make explicit those things good instructors may intuitively know and (should) do to enhance student learning. When made explicit, CSFs not only confirm the knowledge of successful instructors, but tap their intuition to guide and direct the accomplishment of quality instruction for entire programs. In addition, CSFs are linked with goals and objectives, helping generate a small number of truly important matters an instructor should focus attention on to achieve different thresholds of online success.

After a comprehensive literature review, an instrument was created to measure students’ perceptions about the importance of techniques and indicators leading to quality online classes. Items were designed to capture the major factors in the literature. The instrument was piloted during academic year 2017–18 with a 397-student sample, facilitating an exploratory factor analysis leading to important preliminary findings (reference withheld for review). Based on the pilot, survey items were added and refined to include seven groups of quality teaching factors and two groups of items related to students’ overall acceptance of online classes as well as a variable on their future online class enrollment. Demographic information was gathered to determine their effects on students’ levels of acceptance of online classes based on age, year in program, major, distance from university, number of online classes taken, high school experience with online classes, and communication preferences.

This paper draws evidence from a sample of students enrolled in educational programs at Jack H. Brown College of Business and Public Administration (JHBC), California State University San Bernardino (CSUSB). The JHBC offers a wide range of online courses for undergraduate and graduate programs. To ensure comparable learning outcomes, online classes and face-to-face classes of a certain subject are similar in size—undergraduate classes are generally capped at 60 and graduate classes at 30, and often taught by the same instructors. Students sometimes have the option to choose between both face-to-face and online modes of learning.

A Qualtrics survey link was sent out by 11 instructors to students who were unlikely to be cross-enrolled in classes during the 2018–19 academic year. 1 Approximately 2500 students were contacted, with some instructors providing class time to complete the anonymous survey. All students, whether they had taken an online class or not, were encouraged to respond. Nine hundred eighty-seven students responded, representing a 40% response rate. Although drawn from a single business school, it is a broad sample representing students from several disciplines—management, accounting and finance, marketing, information decision sciences, and public administration, as well as both graduate and undergraduate programs of study.

The sample age of students is young, with 78% being under 30. The sample has almost no lower division students (i.e., freshman and sophomore), 73% upper division students (i.e., junior and senior) and 24% graduate students (master’s level). Only 17% reported having taken a hybrid or online class in high school. There was a wide range of exposure to university level online courses, with 47% reporting having taken 1 to 4 classes, and 21% reporting no online class experience. As a Hispanic-serving institution, 54% self-identified as Latino, 18% White, and 13% Asian and Pacific Islander. The five largest majors were accounting & finance (25%), management (21%), master of public administration (16%), marketing (12%), and information decision sciences (10%). Seventy-four percent work full- or part-time. See Table  1 for demographic data.

Measures and procedure

To increase the reliability of evaluation scores, composite evaluation variables were formed after an exploratory factor analysis of individual evaluation items. A principal component method with Quartimin (oblique) rotation was applied to explore the factor construct of student perceptions of online teaching CSFs. Items whose coefficients for student perceptions of importance were greater than .30 were included, a commonly accepted threshold in factor analysis. A simple least-squares regression analysis was applied to test the significance levels of factors on students’ impression of online classes.
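
The extraction-and-threshold step described above can be sketched in a few lines. This is a simplified illustration rather than the study's actual procedure: it computes unrotated principal-component loadings with NumPy and zeroes out loadings below the |.30| cutoff, whereas the authors additionally applied a Quartimin (oblique) rotation, presumably in standard statistical software. The function name and the synthetic data are ours.

```python
import numpy as np

def pca_loadings(data, n_factors, threshold=0.30):
    """Unrotated principal-component loadings for an item matrix
    (rows = respondents, columns = survey items); loadings with
    absolute value below `threshold` are zeroed out."""
    corr = np.corrcoef(data, rowvar=False)          # item correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)         # eigh returns ascending order
    order = np.argsort(eigvals)[::-1][:n_factors]   # keep the top components
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    loadings = eigvecs * np.sqrt(eigvals)           # item-component correlations
    retained = np.where(np.abs(loadings) >= threshold, loadings, 0.0)
    explained = eigvals.sum() / data.shape[1]       # proportion of variance explained
    return retained, explained
```

For instance, four survey items driven by one latent factor all load heavily on a single component that explains most of the variance; in the study, the seven rotated factors together explained 67%.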

Exploratory factor constructs

Using a threshold loading of 0.3 for items, 37 items loaded on seven factors. All factors were logically consistent. The first factor, with eight items, was labeled Teaching Presence. Items included providing clear instructions, staying on task, clear deadlines, and customized feedback on strengths and weaknesses. Teaching Presence items all related to instructor involvement during the course as a director, monitor, and learning facilitator. The second factor, with seven items, aligned with Cognitive Presence. Items included stimulating curiosity, opportunities for reflection, helping students construct explanations posed in online courses, and the applicability of material. The third factor, with six items, aligned with Social Presence defined as providing student-to-student learning opportunities. Items included getting to know course participants for sense of belonging, forming impressions of other students, and interacting with others. The fourth factor, with six new items as well as two (“interaction with other students” and “a sense of community in the class”) shared with the third factor, was Instructional Support which related to the instructor’s roles in providing students a cohesive learning experience. They included providing sufficient rehearsal, structured feedback, techniques for communication, navigation guide, detailed syllabus, and coordinating student interaction and creating a sense of online community. This factor also included enthusiasm which students generally interpreted as a robustly designed course, rather than animation in a traditional lecture. The fifth factor was labeled Basic Online Modality and focused on the basic technological requirements for a functional online course. Three items included allowing students to make online submissions, use of online gradebooks, and online grading. 
A fourth item is the use of online quizzes, viewed by students as mechanical practice opportunities rather than small tests, and a fifth is navigation, a key component of Online Modality. The sixth factor, loaded on four items, was labeled Online Social Comfort. Items here included comfort discussing ideas online, comfort disagreeing, developing a sense of collaboration via discussion, and considering online communication as an excellent medium for social interaction. The final factor was called Interactive Online Modality because it included items for “richer” communications or interactions, no matter whether one- or two-way. Items included videoconferencing, instructor-generated videos, and small group discussions. Taken together, these seven factors explained 67% of the variance, which is considered in the acceptable range in social science research for a robust model (Hair, Black, Babin, & Anderson, 2014 ). See Table  2 for the full list.

To test for factor reliability, Cronbach’s alpha was calculated for each variable. All produced values greater than 0.7, the standard threshold used for reliability, except for system trust, which was therefore dropped. To gauge students’ sense of factor importance, all items were mean-averaged. Factor means (lower means indicating higher importance to students) ranged from 1.5 to 2.6 on a 5-point scale. Basic Online Modality was most important, followed by Instructional Support and Teaching Presence. Students deemed Cognitive Presence, Online Social Comfort, and Online Interactive Modality less important. The least important for this sample was Social Presence. Table  3 arrays the critical success factor means, standard deviations, and Cronbach alphas.
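
The reliability criterion used above, Cronbach's alpha, is computed from the item variances and the variance of the summed scale. A minimal NumPy sketch of the standard formula (the function name is ours; this is not the authors' code):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

Perfectly consistent items yield an alpha of 1; the study dropped the one construct (system trust) that fell below the conventional 0.7 cutoff.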

To determine whether particular subgroups of respondents viewed factors differently, a series of ANOVAs were conducted using factor means as dependent variables. Six demographic variables were used as independent variables: graduate vs. undergraduate, age, work status, ethnicity, discipline, and past online experience. To determine strength of association of the independent variables to each of the seven CSFs, eta squared was calculated for each ANOVA. Eta squared indicates the proportion of variance in the dependent variable explained by the independent variable. Eta squared values greater than .01, .06, and .14 are conventionally interpreted as small, medium, and large effect sizes, respectively (Green & Salkind, 2003 ). Table  4 summarizes the eta squared values for the ANOVA tests with Eta squared values less than .01 omitted.
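
Eta squared for a one-way ANOVA is simply the between-group sum of squares divided by the total sum of squares. The sketch below shows that calculation directly (illustrative only; the function name is ours and the study presumably used standard statistical software):

```python
import numpy as np

def eta_squared(groups):
    """Eta squared for a one-way ANOVA: SS_between / SS_total, i.e. the
    proportion of variance in the dependent variable explained by
    group membership."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_total = ((all_vals - grand_mean) ** 2).sum()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    return ss_between / ss_total
```

By the conventions cited above, values of roughly .01, .06, and .14 would then be read as small, medium, and large effects.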

While there were no significant differences in factor means among students in different disciplines in the College, all five other independent variables had some small effect on some or all CSFs. Graduate students tended to rate Online Interactive Modality, Instructional Support, Teaching Presence, and Cognitive Presence higher than undergraduates. Older students valued Online Interactive Modality more. Full-time working students rated all factors, except Online Social Comfort, slightly higher than part-timers and non-working students. Latino and White students rated Basic Online Modality and Instructional Support higher; Asian and Pacific Islander students rated Social Presence higher. Students who had taken more online classes rated all factors higher.

In addition to factor scores, two variables were constructed to identify the resultant impressions, labeled online experience. Both were logically consistent with a Cronbach’s α greater than 0.75. The first variable, with six items, labeled “online acceptance,” included items such as “I enjoy online learning,” “My overall impression of hybrid/online learning is very good,” and “the instructors of online/hybrid classes are generally responsive.” The second variable was labeled “face-to-face preference” and combines four items, including enjoying, learning, and communicating more in face-to-face classes, as well as perceiving greater fairness and equity. In addition to these two constructed variables, a one-item variable was also used subsequently in the regression analysis: “online enrollment.” That question asked: if hybrid/online classes are well taught and available, how much of your course selection going forward would online education make up?

Regression results

As noted above, two constructed variables and one item were used as dependent variables for purposes of regression analysis. They were online acceptance, F2F preference, and the selection of online classes. In addition to seven quality-of-teaching factors identified by factor analysis, control variables included level of education (graduate versus undergraduate), age, ethnicity, work status, distance to university, and number of online/hybrid classes taken in the past. See Table  5 .

When eta squared values were calculated for the control-factor ANOVAs, only one was close to a medium effect: graduate versus undergraduate status had a .05 effect (considered medium) related to Online Interactive Modality, meaning graduate students were more sensitive to interactive modality than undergraduates. Multiple regression analyses of critical success factors and online impressions were conducted to compare under what conditions factors were significant. The only consistently significant control factor was the number of online classes taken. The more classes students had taken online, the more inclined they were to take future classes. Level of program, age, ethnicity, and working status did not significantly affect students’ choice or overall acceptance of online classes.
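
The multiple regressions reported here rest on ordinary least squares. The sketch below shows the core computation with NumPy's `lstsq`; the function and variable names are hypothetical, and the study's actual predictors and software are not reproduced.

```python
import numpy as np

def ols_coefficients(X, y):
    """Ordinary least-squares fit of y on predictor matrix X, with an
    intercept column prepended; returns [intercept, slope_1, ..., slope_p].
    In this study's setting, y might be an online-acceptance score and X
    the seven factor means plus control variables."""
    X = np.asarray(X, dtype=float)
    design = np.column_stack([np.ones(len(X)), X])   # add intercept column
    beta, *_ = np.linalg.lstsq(design, np.asarray(y, dtype=float), rcond=None)
    return beta
```

As a sanity check, regressing the exactly linear outcome y = 2 + 3x recovers an intercept of 2 and a slope of 3.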

The least restrictive condition was online enrollment (Table  6 ). That is, students might not feel online courses were ideal, but because of convenience and scheduling might enroll in them if minimum threshold expectations were met. When considering online enrollment three factors were significant and positive (at the 0.1 level): Basic Online Modality, Cognitive Presence, and Online Social Comfort. These least-demanding students expected classes to have basic technological functionality, provide good opportunities for knowledge acquisition, and provide comfortable interaction in small groups. Students who demand good Instructional Support (e.g., rehearsal opportunities, standardized feedback, clear syllabus) are less likely to enroll.

Online acceptance was more restrictive (see Table 7). This variable captured the idea that students not only enrolled in online classes out of necessity, but also with an appreciation of the positive attributes of online instruction, which balanced the negative aspects. When this standard was applied, students expected not only Basic Online Modality, Cognitive Presence, and Online Social Comfort, but also expected their instructors to be highly engaged virtually as the course progressed (Teaching Presence) and to create strong student-to-student dynamics (Social Presence). Students who rated Instructional Support higher were less accepting of online classes.

Another restrictive condition was catering to the needs of students who preferred face-to-face classes (see Table 8). That is, these students preferred face-to-face classes even when online classes were well taught. Unlike students more accepting of, or more likely to enroll in, online classes, this group rated Instructional Support as critical to enrolling, rather than as a negative factor when absent. Again different from the other two groups, these students demanded appropriate interactive mechanisms (Online Interactive Modality) to enable richer communication (e.g., videoconferencing). Student-to-student collaboration (Social Presence) was also significant. This group also rated Cognitive Presence and Online Social Comfort as significant, but only in their absence. That is, these students were most attached to direct interaction with the instructor and other students rather than to specific teaching methods. Interestingly, Basic Online Modality and Teaching Presence were not significant. Our interpretation is that this student group, the most critical of online classes for their loss of physical interaction, is beyond being concerned with basic technical interaction and demands higher levels of interactivity and instructional sophistication.

Discussion and study limitations

Some past studies have used robust empirical methods to identify a single factor or a small number of factors related to quality from a student's perspective, but have not sought to be relatively comprehensive. Others have used a longer series of itemized factors, but with less robust methods, and have not tied those factors back to the literature. This study has used the literature to develop a relatively comprehensive list of items focused on quality teaching in a single rigorous protocol. While a beta test had identified five coherent factors, substantial changes were made to the current survey that sharpened the focus on quality factors rather than antecedent factors and better articulated the array of factors often lumped under the mantle of "teaching presence." In addition, the study examined these factors based on threshold expectations: from minimal, such as when flexibility is the driving consideration, to modest, such as when students want a "good" online class, to high, when students demand an interactive virtual experience equivalent to face-to-face.

Exploratory factor analysis identified seven factors that were reliable, coherent, and significant under different conditions. When considering students' overall sense of importance, they are, in order: Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Social Online Comfort, Interactive Online Modality, and Social Presence. Students are most concerned with the basics of a course first, that is, technological functionality and instructor competence. Next they want engagement and virtual comfort. Social Presence, while valued, is the least critical from this overall perspective.
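Factor reliability of the kind reported here is conventionally assessed with Cronbach's alpha, which compares the summed variance of individual items against the variance of the total scale score. The sketch below is illustrative only, with hypothetical item responses rather than the study's survey data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    items: one list of scores per survey item, aligned by respondent."""
    k = len(items)
    sum_item_var = sum(pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent scale score
    return k / (k - 1) * (1 - sum_item_var / pvariance(totals))

# Three hypothetical items intended to measure one factor, five respondents
item_scores = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
]
alpha = cronbach_alpha(item_scores)
```

Values of roughly 0.7 and above are commonly treated as acceptable reliability for a factor's items; perfectly parallel items yield an alpha of 1.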

The factor analysis is quite consistent with the range of factors identified in the literature, pointing to the fact that students can differentiate among different aspects of what have been lumped together as larger concepts, such as teaching presence. Essentially, the instructor's role in quality can be divided into her/his command of basic online functionality, good design, and good presence during the class. The instructor's command of basic functionality is paramount. Because so much of an online class must be built in advance, the quality of the class design is rated more highly than the instructor's role in facilitating the class. Taken as a whole, the instructor's role in traditional teaching elements is primary, as we would expect it to be. Cognitive presence, especially the pertinence of the instructional material and its applicability to student interests, has consistently been found significant when studied, and was highly rated here as a single factor. Finally, the degree to which students feel comfortable with the online environment and enjoy the learner-learner aspect has received less support in empirical studies; it was found significant here, but was rated lowest among the quality factors.

Regression analysis paints a more nuanced picture, depending on student focus. It also helps explain some of the heterogeneity of previous studies, depending on what the dependent variables were. If convenience and scheduling are critical and students are less demanding, the minimum requirements are Basic Online Modality, Cognitive Presence, and Online Social Comfort. That is, students expect an instructor who knows how to use an online platform, delivers useful information, and provides a comfortable learning environment. However, they do not expect to get poor design. Nor do they expect much in terms of quality teaching presence, learner-to-learner interaction, or interactive teaching.

When students are signing up for critical classes, or when they have both F2F and online options, they have a higher standard. That is, they not only expect the factors relevant to enrolling in noncritical classes, but they also expect good Teaching and Social Presence. Students who simply need a class may be willing to teach themselves a bit more, but students who want a good class expect a highly present instructor in terms of responsiveness and immediacy. "Good" classes must not only create a comfortable atmosphere but, in social science classes at least, must provide strong learner-to-learner interactions as well. At the time of the research, most students believed that a class could be good without high interactivity via pre-recorded video and videoconferencing. That may, or may not, change over time as various video media become easier to use, more reliable, and more commonplace.

The most demanding students are those who prefer F2F classes because of learning style preferences, poor past experiences, or both. Such students seem to assume that a worthwhile online class has basic functionality and that the instructor provides a strong presence. They are also critical of the absence of Cognitive Presence and Online Social Comfort. They want strong Instructional Support and Social Presence. But in addition, and uniquely, they expect Online Interactive Modality, which provides the greatest possible verisimilitude to the traditional classroom. More than the other two groups, these students crave human interaction in the learning process, both with the instructor and with other students.

These findings shed light on the possible ramifications of the COVID-19 aftermath. Many universities around the world jumped from relatively low levels of online instruction at the beginning of spring 2020 to nearly 100% by mandate by the end of the spring term. The question becomes: what will happen after the mandate is removed? Will demand return to pre-crisis levels, increase modestly, or skyrocket? Time will be the best judge, but the findings here suggest that the ability and interest of instructors and institutions to "rise to the occasion" with quality teaching will have as much effect on demand as students becoming more acclimated to online learning. If, in the rush to get classes online, many students experience shoddy basic functional competence, poor instructional design, sporadic teaching presence, and poorly implemented cognitive and social aspects, they may be quite willing to return to the traditional classroom. If faculty and the institutions supporting them are able to increase the quality of classes despite time pressures, then most students may be interested in more hybrid and fully online classes. If instructors are able to introduce high-quality interactive teaching, nearly the entire student population will be interested in more online classes. Of course students will have a variety of experiences, but this analysis suggests that those instructors, departments, and institutions that put greater effort into the temporary adjustment (and resist it less) will be substantially more likely to see increases in demand beyond the modest national trajectory of the last decade or so.

There are several study limitations. First, the study does not include a sample of non-respondents, who may have a somewhat different profile. Second, the study draws from a single college and university, and the profile derived here may vary significantly by type of student. Third, some survey statements may have led respondents to rate quality based upon experience rather than to assess the general importance of online course elements. For example, "I felt comfortable participating in the course discussions" could be revised to "comfort in participating in course discussions." The authors judged differences among subgroups (e.g., among majors) to be small and statistically insignificant. However, it is possible that differences between biology and marketing students would be significant, leading the factors to be ordered differently. Emphasis and ordering might also vary at a community college versus a research-oriented university (Gonzalez, 2009).

Availability of data and materials

We will make the data available.

Al-Gahtani, S. S. (2016). Empirical investigation of e-learning acceptance and assimilation: A structural equation model. Applied Computing and Informatics, 12, 27–50.


Alqurashi, E. (2016). Self-efficacy in online learning environments: A literature review. Contemporary Issues in Education Research, 9(1), 45–52.

Anderson, T. (2016). A fourth presence for the Community of Inquiry model? Retrieved from https://virtualcanuck.ca/2016/01/04/a-fourth-presence-for-the-community-of-inquiry-model/ .

Annand, D. (2011). Social presence within the community of inquiry framework. The International Review of Research in Open and Distributed Learning , 12 (5), 40.

Arbaugh, J. B. (2005). How much does “subject matter” matter? A study of disciplinary effects in on-line MBA courses. Academy of Management Learning & Education , 4 (1), 57–73.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education , 11 , 133–136.

Armellini, A., & De Stefani, M. (2016). Social presence in the 21st century: An adjustment to the Community of Inquiry framework. British Journal of Educational Technology , 47 (6), 1202–1216.

Arruabarrena, R., Sánchez, A., Blanco, J. M., et al. (2019). Integration of good practices of active methodologies with the reuse of student-generated content. International Journal of Educational Technology in Higher Education , 16 , #10.

Arthur, L. (2009). From performativity to professionalism: Lecturers’ responses to student feedback. Teaching in Higher Education , 14 (4), 441–454.

Artino, A. R. (2010). Online or face-to-face learning? Exploring the personal factors that predict students’ choice of instructional format. Internet and Higher Education , 13 , 272–276.

Asoodar, M., Vaezi, S., & Izanloo, B. (2016). Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Computers in Human Behavior , 63 , 704–716.

Bernard, R. M., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research , 74 (3), 379–439.

Bollinger, D., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3(1), 61–67.

Brinkley-Etzkorn, K. E. (2018). Learning to teach online: Measuring the influence of faculty development training on teaching effectiveness through a TPACK lens. The Internet and Higher Education , 38 , 28–35.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin , 3 , 7.

Choi, I., Land, S. M., & Turgeon, A. J. (2005). Scaffolding peer-questioning strategies to facilitate metacognition during online small group discussion. Instructional Science , 33 , 483–511.

Clayton, K. E., Blumberg, F. C., & Anthony, J. A. (2018). Linkages between course status, perceived course value, and students’ preferences for traditional versus non-traditional learning environments. Computers & Education , 125 , 175–181.

Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. The International Review of Research in Open and Distributed Learning , 13 (4), 269–292.

Cohen, A., & Baruth, O. (2017). Personality, learning, and satisfaction in fully online academic courses. Computers in Human Behavior , 72 , 1–12.

Crews, T., & Butterfield, J. (2014). Data for flipped classroom design: Using student feedback to identify the best components from online and face-to-face classes. Higher Education Studies , 4 (3), 38–47.

Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D., & Molloy, E. (2019). What makes for effective feedback: Staff and student perspectives. Assessment & Evaluation in Higher Education , 44 (1), 25–36.

Drew, C., & Mann, A. (2018). Unfitting, uncomfortable, unacademic: A sociological reading of an interactive mobile phone app in university lectures. International Journal of Educational Technology in Higher Education , 15 , #43.

Durabi, A., Arrastia, M., Nelson, D., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning , 27 (3), 216–227.

Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education , 4 (2), 215–235.

Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education , 59 (3), 277–292.

Farrell, O., & Brunton, J. (2020). A balancing act: A window into online student engagement experiences. International Journal of Educational Technology in Higher Education, 17, #25.

Fidalgo, P., Thormann, J., Kulyk, O., et al. (2020). Students’ perceptions on distance education: A multinational study. International Journal of Educational Technology in Higher Education, 17, #18.

Flores, Ò., del-Arco, I., & Silva, P. (2016). The flipped classroom model at the university: Analysis based on professors’ and students’ assessment in the educational field. International Journal of Educational Technology in Higher Education , 13 , #21.

Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education , 1 , 113–127.

Gong, D., Yang, H. H., & Cai, J. (2020). Exploring the key influencing factors on college students’ computational thinking skills through flipped-classroom instruction. International Journal of Educational Technology in Higher Education , 17 , #19.

Gonzalez, C. (2009). Conceptions of, and approaches to, teaching online: A study of lecturers teaching postgraduate distance courses. Higher Education , 57 (3), 299–314.

Grandzol, J. R., & Grandzol, C. J. (2006). Best practices for online business Education. International Review of Research in Open and Distance Learning , 7 (1), 1–18.

Green, S. B., & Salkind, N. J. (2003). Using SPSS: Analyzing and understanding data , (3rd ed., ). Upper Saddle River: Prentice Hall.

Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2014). Multivariate data analysis: Pearson new international edition . Essex: Pearson Education Limited.

Harjoto, M. A. (2017). Blended versus face-to-face: Evidence from a graduate corporate finance class. Journal of Education for Business , 92 (3), 129–137.

Hong, K.-S. (2002). Relationships between students’ instructional variables with satisfaction and learning from a web-based course. The Internet and Higher Education , 5 , 267–281.

Horvitz, B. S., Beach, A. L., Anderson, M. L., & Xia, J. (2015). Examination of faculty self-efficacy related to online teaching. Innovation Higher Education , 40 , 305–316.

Inside Higher Education and Gallup. (2019). The 2019 survey of faculty attitudes on technology. Author .

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers and Education , 95 , 270–284.

Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictor in a structural model. Computers & Education , 57 (2), 1654–1664.

Jung, I. (2011). The dimensions of e-learning quality: From the learner’s perspective. Educational Technology Research and Development , 59 (4), 445–464.

Kay, R., MacDonald, T., & DiGiuseppe, M. (2019). A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. Journal of Computing in Higher Education , 31 , 449–471.

Kehrwald, B. (2008). Understanding social presence in text-based online learning environments. Distance Education , 29 (1), 89–106.

Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education , 14 , #7.

Kuo, Y.-C., Walker, A. E., Schroder, K. E., & Belland, B. R. (2013). Interaction, internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Higher Education, 20, 35–50.

Lange, C., & Costley, J. (2020). Improving online video lectures: Learning challenges created by media. International Journal of Educational Technology in Higher Education , 17 , #16.

le Roux, I., & Nagel, L. (2018). Seeking the best blend for deep learning in a flipped classroom – Viewing student perceptions through the Community of Inquiry lens. International Journal of Educational Technology in Higher Education, 15, #16.

Lee, H.-J., & Rha, I. (2009). Influence of structure and interaction on student achievement and satisfaction in web-based distance learning. Educational Technology & Society , 12 (4), 372–382.

Lee, Y., Stringer, D., & Du, J. (2017). What determines students’ preference of online to F2F class? Business Education Innovation Journal , 9 (2), 97–102.

Legon, R., & Garrett, R. (2019). CHLOE 3: Behind the numbers . Published online by Quality Matters and Eduventures. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/CHLOE-3-Report-2019-Behind-the-Numbers.pdf

Liaw, S.-S., & Huang, H.-M. (2013). Perceived satisfaction, perceived usefulness and interactive learning environments as predictors of self-regulation in e-learning environments. Computers & Education , 60 (1), 14–24.

Lu, F., & Lemonde, M. (2013). A comparison of online versus face-to-face students teaching delivery in statistics instruction for undergraduate health science students. Advances in Health Science Education , 18 , 963–973.

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: a systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1).

Macon, D. K. (2011). Student satisfaction with online courses versus traditional courses: A meta-analysis. Dissertation, Northcentral University, CA.

Mann, J., & Henneberry, S. (2012). What characteristics of college students influence their decisions to select online courses? Online Journal of Distance Learning Administration , 15 (5), 1–14.

Mansbach, J., & Austin, A. E. (2018). Nuanced perspectives about online teaching: Mid-career senior faculty voices reflecting on academic work in the digital age. Innovative Higher Education , 43 (4), 257–272.

Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education , 29 (4), 531–563.

Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet and Higher Education , 37 , 52–65.

Maycock, K. W. (2019). Chalk and talk versus flipped learning: A case study. Journal of Computer Assisted Learning , 35 , 121–126.

McGivney-Burelle, J. (2013). Flipping calculus. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies, 23(5), 477–486.

Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Computers in Human Behavior , 45 , 359–374.

Nair, S. S., Tay, L. Y., & Koh, J. H. L. (2013). Students’ motivation and teachers’ teaching practices towards the use of blogs for writing of online journals. Educational Media International , 50 (2), 108–119.

Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching , 11 (2), 309–319.

Ni, A. Y. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education , 19 (2), 199–215.

Nouri, J. (2016). The flipped classroom: For active, effective and increased learning – Especially for low achievers. International Journal of Educational Technology in Higher Education , 13 , #33.

O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education , 68 (1), 1–14.

O'Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95.

Open & Distant Learning Quality Council (2012). ODLQC standards . England: Author https://www.odlqc.org.uk/odlqc-standards .

Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. Internet and Higher Education , 32 , 47–57.

Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., … Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education , 19 , 27–35.

Paechter, M., Maier, B., & Macher, D. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 , 292–329.

Prinsloo, P. (2016). (re)considering distance education: Exploring its relevance, sustainability and value contribution. Distance Education , 37 (2), 139–145.

Quality Matters (2018). Specific review standards from the QM higher Education rubric , (6th ed., ). MD: MarylandOnline.

Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior , 71 , 402–417.

Rockhart, J. F., & Bullen, C. V. (1981). A primer on critical success factors . Cambridge: Center for Information Systems Research, Massachusetts Institute of Technology.

Rourke, L., & Kanuka, H. (2009). Learning in communities of inquiry: A review of the literature. The Journal of Distance Education / Revue de l'éducation à distance, 23(1), 19–48. Athabasca University Press. Retrieved August 2, 2020 from https://www.learntechlib.org/p/105542/ .

Sebastianelli, R., Swift, C., & Tamimi, N. (2015). Factors affecting perceived learning, satisfaction, and quality in the online MBA: A structural equation modeling approach. Journal of Education for Business , 90 (6), 296–305.

Shen, D., Cho, M.-H., Tsai, C.-L., & Marra, R. (2013). Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. Internet and Higher Education , 19 , 10–17.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology , 59 (3), 623–664.

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education , 51 (1), 318–336.

Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education , 7 (1), 59–70.

Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education , 50 (4), 1183–1202.

Takamine, K. (2017). Michelle D. Miller: Minds online: Teaching effectively with technology. Higher Education, 73, 789–791.

Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate students’ perceptions of online learning: A comparative study. Journal of Information Systems Education , 20 (1), 29.

Tucker, B. (2012). The flipped classroom. Education Next , 12 (1), 82–83.

Van Wart, M., Ni, A., Ready, D., Shayo, C., & Court, J. (2020). Factors leading to online learner satisfaction. Business Educational Innovation Journal , 12 (1), 15–24.

Van Wart, M., Ni, A., Rose, L., McWeeney, T., & Worrell, R. A. (2019). Literature review and model of online teaching effectiveness integrating concerns for learning achievement, student satisfaction, faculty satisfaction, and institutional results. Pan-Pacific Journal of Business Research, 10(1), 1–22.

Ventura, A. C., & Moscoloni, N. (2015). Learning styles and disciplinary differences: A cross-sectional study of undergraduate students. International Journal of Learning and Teaching , 1 (2), 88–93.

Vlachopoulos, D., & Makri, A. (2017). The effect of games and simulations on higher education: A systematic literature review. International Journal of Educational Technology in Higher Education , 14 , #22.

Wang, Y., Huang, X., & Schunn, C. D. (2019). Redesigning flipped classrooms: A learning model and its effects on student perceptions. Higher Education , 78 , 711–728.

Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learning , 21 (1), 15–35.

Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education , 85 (5), 633–659.

Young, S. (2006). Student views of effective online teaching in higher education. American Journal of Distance Education , 20 (2), 65–77.

Zawacki-Richter, O., & Naidu, S. (2016). Mapping research trends from 35 years of publications in distance Education. Distance Education , 37 (3), 245–269.


Acknowledgements

No external funding/ NA.

Author information

Authors and affiliations.

Development for the JHB College of Business and Public Administration, 5500 University Parkway, San Bernardino, California, 92407, USA

Montgomery Van Wart, Anna Ni, Pamela Medina, Jesus Canelon, Melika Kordrostami, Jing Zhang & Yu Liu


Contributions

Equal. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Montgomery Van Wart .

Ethics declarations

Competing interests.

We have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Cite this article.

Van Wart, M., Ni, A., Medina, P. et al. Integrating students’ perspectives about online learning: a hierarchy of factors. Int J Educ Technol High Educ 17 , 53 (2020). https://doi.org/10.1186/s41239-020-00229-8


Received : 29 April 2020

Accepted : 30 July 2020

Published : 02 December 2020

DOI : https://doi.org/10.1186/s41239-020-00229-8


Keywords

  • Online education
  • Online teaching
  • Student perceptions
  • Online quality
  • Student presence

A systematic review of research on online teaching and learning from 2009 to 2018


Systematic reviews of online learning research were conducted in the nineties and early 2000s. However, no review has examined the broader research themes in online learning over the last decade. This systematic review addresses that gap by examining 619 research articles on online learning published in twelve journals in the last decade. These studies were examined for publication trends and patterns, research themes, research methods, and research settings, and were compared with the research themes of the previous decades. While there was a slight decrease in the number of studies on online learning in 2015 and 2016, the number continued to increase in 2017 and 2018. The majority of the studies were quantitative in nature and were conducted in higher education. Online learning research was categorized into twelve themes, and a framework spanning learner, course and instructor, and organizational levels was developed. Online learner characteristics and online engagement were examined in a high number of studies, consistent with three of the prior systematic reviews. However, there is still a need for more research on organization-level topics, such as leadership, policy, and management; on access, culture, equity, inclusion, and ethics; and on online instructor characteristics.

  • Twelve online learning research themes were identified in 2009–2018.
  • A framework with learner, course and instructor, and organizational levels was used.
  • Online learner characteristics and engagement were the most frequently examined themes.
  • The majority of the studies used quantitative research methods and were conducted in higher education.
  • There is a need for more research on organization-level topics.

1. Introduction

Online learning has been on the increase in the last two decades. In the United States, though higher education enrollment has declined, online learning enrollment in public institutions has continued to increase ( Allen & Seaman, 2017 ), and so has the research on online learning. There have been review studies conducted on specific areas of online learning such as innovations in online learning strategies ( Davis et al., 2018 ), empirical MOOC literature ( Liyanagunawardena et al., 2013 ; Veletsianos & Shepherdson, 2016 ; Zhu et al., 2018 ), quality in online education ( Esfijani, 2018 ), accessibility in online higher education ( Lee, 2017 ), synchronous online learning ( Martin et al., 2017 ), K-12 preparation for online teaching ( Moore-Adams et al., 2016 ), polychronicity in online learning ( Capdeferro et al., 2014 ), meaningful learning research in elearning and online learning environments ( Tsai, Shen, & Chiang, 2013 ), problem-based learning in elearning and online learning environments ( Tsai & Chiang, 2013 ), asynchronous online discussions ( Thomas, 2013 ), self-regulated learning in online learning environments ( Tsai, Shen, & Fan, 2013 ), game-based learning in online learning environments ( Tsai & Fan, 2013 ), and online course dropout ( Lee & Choi, 2011 ). While there have been review studies conducted on specific online learning topics, very few studies have examined the broader research themes of online learning.

2. Systematic Reviews of Distance Education and Online Learning Research

Distance education has evolved from offline to online settings with access to the internet, and COVID-19 has made online learning the common delivery method across the world. Tallent-Runnels et al. (2006) reviewed research from the late 1990s to the early 2000s, Berge and Mrozowski (2001) reviewed research from 1990 to 1999, and Zawacki-Richter et al. (2009) reviewed research from 2000 to 2008 on distance education and online learning. Table 1 shows the research themes from previous systematic reviews of online learning research. Some themes recur across the various reviews, and new themes also emerge. Although reviews were conducted in the nineties and early 2000s, no review has examined the broader research themes in online learning in the last decade. Hence the need for this systematic review, which reports the research themes in online learning from 2009 to 2018. In the following sections, we review these systematic review studies in detail.

Comparison of online learning research themes from previous studies.

2.1. Distance education research themes, 1990 to 1999 ( Berge & Mrozowski, 2001 )

Berge and Mrozowski (2001) reviewed 890 research articles and dissertation abstracts on distance education from 1990 to 1999. The four distance education journals chosen by the authors to represent distance education were the American Journal of Distance Education, Distance Education, Open Learning, and the Journal of Distance Education. This review overlapped in dates with the Tallent-Runnels et al. (2006) study. Berge and Mrozowski (2001) categorized the articles according to Sherry's (1996) ten themes of research issues in distance education: redefining the roles of instructors and students, technologies used, issues of design, strategies to stimulate learning, learner characteristics and support, issues of operations, policies, and administration, access and equity, and costs and benefits.

In the Berge and Mrozowski (2001) study, more than 100 studies focused on each of three themes: (1) design issues, (2) learner characteristics, and (3) strategies to increase interactivity and active learning. By design issues, the authors meant instructional systems design, covering topics such as content requirements, technical constraints, interactivity, and feedback. The next theme, strategies to increase interactivity and active learning, was closely related to design issues and focused on students' modes of learning. Learner characteristics focused on accommodating various learning styles through customized instructional theory. Fewer than 50 studies focused on each of the three least examined themes: (1) cost-benefit tradeoffs, (2) equity and accessibility, and (3) learner support. Cost-benefit tradeoffs focused on the implementation costs of distance education based on school characteristics. Equity and accessibility focused on the equity of access to distance education systems. Learner support included topics such as teacher-to-teacher support as well as teacher-to-student support.

2.2. Online learning research themes, 1993 to 2004 ( Tallent-Runnels et al., 2006 )

Tallent-Runnels et al. (2006) reviewed research on online instruction from 1993 to 2004. They reviewed 76 articles focused on online learning by searching five databases: ERIC, PsycINFO, ContentFirst, Education Abstracts, and WilsonSelect. Tallent-Runnels et al. (2006) categorized research into four themes: (1) course environment, (2) learners' outcomes, (3) learners' characteristics, and (4) institutional and administrative factors. The first theme, which the authors describe as course environment ( n  = 41, 53.9%), is an overarching theme that includes classroom culture, structural assistance, success factors, online interaction, and evaluation.

For their second theme, Tallent-Runnels et al. (2006) found that studies focused on questions involving the process of teaching and learning and on methods to explore cognitive and affective learner outcomes ( n  = 29, 38.2%). The authors stated that the research designs were flawed and lacked rigor. However, the literature comparing traditional and online classrooms found both delivery systems to be adequate. Another research theme focused on learners' characteristics ( n  = 12, 15.8%) and the synergy of learners, design of the online course, and system of delivery. Research findings revealed that online learners were mainly non-traditional, Caucasian, had different learning styles, and were highly motivated to learn. The final theme reported was institutional and administrative factors ( n  = 13, 17.1%) in online learning. The findings revealed a lack of scholarly research in this area, and most institutions did not have formal policies in place for course development or for faculty and student support in training and evaluation. Their research confirmed that when universities offered online courses, student enrollment numbers improved.

2.3. Distance education research themes 2000 to 2008 ( Zawacki-Richter et al., 2009 )

Zawacki-Richter et al. (2009) reviewed 695 articles on distance education from 2000 to 2008 using the Delphi method for consensus in identifying areas and classified the literature from five prominent journals. The five journals selected due to their wide scope in research in distance education included Open Learning, Distance Education, American Journal of Distance Education, the Journal of Distance Education, and the International Review of Research in Open and Distributed Learning. The reviewers examined the main focus of research and identified gaps in distance education research in this review.

Zawacki-Richter et al. (2009) classified the studies into macro, meso and micro levels focusing on 15 areas of research. The five areas of the macro-level addressed: (1) access, equity and ethics to deliver distance education for developing nations and the role of various technologies to narrow the digital divide, (2) teaching and learning drivers, markets, and professional development in the global context, (3) distance delivery systems and institutional partnerships and programs and impact of hybrid modes of delivery, (4) theoretical frameworks and models for instruction, knowledge building, and learner interactions in distance education practice, and (5) the types of preferred research methodologies. The meso-level focused on seven areas that involve: (1) management and organization for sustaining distance education programs, (2) examining financial aspects of developing and implementing online programs, (3) the challenges and benefits of new technologies for teaching and learning, (4) incentives to innovate, (5) professional development and support for faculty, (6) learner support services, and (7) issues involving quality standards and the impact on student enrollment and retention. The micro-level focused on three areas: (1) instructional design and pedagogical approaches, (2) culturally appropriate materials, interaction, communication, and collaboration among a community of learners, and (3) focus on characteristics of adult learners, socio-economic backgrounds, learning preferences, and dispositions.

The top three research themes in this review by Zawacki-Richter et al. (2009) were interaction and communities of learning ( n  = 122, 17.6%), instructional design ( n  = 121, 17.4%), and learner characteristics ( n  = 113, 16.3%). The fewest studies (less than 3% each) examined the following research themes: management and organization ( n  = 18), research methods in DE and knowledge transfer ( n  = 13), globalization of education and cross-cultural aspects ( n  = 13), innovation and change ( n  = 13), and costs and benefits ( n  = 12).

2.4. Online learning research themes

These three systematic reviews provide a broad understanding of distance education and online learning research themes from 1990 to 2008. However, the number of research studies on online learning has increased in this decade, and there is a need to identify the research themes examined recently. Based on the previous systematic reviews ( Berge & Mrozowski, 2001 ; Hung, 2012 ; Tallent-Runnels et al., 2006 ; Zawacki-Richter et al., 2009 ), online learning research in this study is grouped into twelve research themes: Learner Characteristics; Instructor Characteristics; Course or Program Design and Development; Course Facilitation; Engagement; Course Assessment; Course Technologies; Evaluation and Quality Assurance; Access, Culture, Equity, Inclusion, and Ethics; Leadership, Policy, and Management; Instructor and Learner Support; and Learner Outcomes. Table 2 below describes each research theme, and from these themes a framework is derived in Fig. 1 .

Research themes in online learning.

Fig. 1

Online learning research themes framework.

The collection of research themes is presented as a framework in Fig. 1 . The themes are organized by domain or level to underscore the nested relationship that exists. As evidenced by the assortment of themes, research can focus on any domain of delivery or associated context. The “Learner” domain captures characteristics and outcomes related to learners and their interaction within the courses. The “Course and Instructor” domain captures elements about the broader design of the course and facilitation by the instructor, and the “Organizational” domain acknowledges the contextual influences on the course. It is important to note as well that due to the nesting, research themes can cross domains. For example, the broader cultural context may be studied as it pertains to course design and development, and institutional support can include both learner support and instructor support. Likewise, engagement research can involve instructors as well as learners.

In this introduction section, we have reviewed three systematic reviews on online learning research ( Berge & Mrozowski, 2001 ; Tallent-Runnels et al., 2006 ; Zawacki-Richter et al., 2009 ). Based on these reviews and other research, we have derived twelve themes to develop an online learning research framework which is nested in three levels: learner, course and instructor, and organization.

2.5. Purpose of this research

In two out of the three previous reviews, design, learner characteristics, and interaction were examined in the highest number of studies. On the other hand, cost-benefit tradeoffs, equity and accessibility, institutional and administrative factors, and globalization and cross-cultural aspects were examined in the fewest studies. One explanation may be a function of nesting: studies falling at the Organizational and Course levels may encompass several courses or many more participants within courses. However, while some research themes re-occur, there are also variations in themes across time, suggesting that the importance of research themes rises and falls over time. Thus, a critical examination of the trends in themes is helpful for understanding where research is needed most. Also, since there is no recent study examining online learning research themes in the last decade, this study strives to address that gap by focusing on recent research themes found in the literature, and also reviewing research methods and settings. Notably, one goal is to compare findings from this decade to the previous review studies. Overall, the purpose of this study is to examine publication trends in online learning research taking place during the last ten years and compare them with the themes identified in previous review studies. Due to the continued growth of online learning research into new contexts and among new researchers, we also examine the research methods and settings found in the studies of this review.

The following research questions are addressed in this study.

  • 1. What percentage of the population of articles published in the journals reviewed from 2009 to 2018 were related to online learning and empirical?
  • 2. What is the frequency of online learning research themes in the empirical online learning articles of journals reviewed from 2009 to 2018?
  • 3. What is the frequency of research methods and settings that researchers employed in the empirical online learning articles of the journals reviewed from 2009 to 2018?

3. Method

The five-step systematic review process described in the U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse Procedures and Standards Handbook, Version 4.0 ( 2017 ) was used: (a) developing the review protocol, (b) identifying relevant literature, (c) screening studies, (d) reviewing articles, and (e) reporting findings.

3.1. Data sources and search strategies

The Education Research Complete database was searched for articles published between 2009 and 2018, using both the Title and Keyword functions with the following search terms.

"online learning" OR "online teaching" OR "online program" OR "online course" OR "online education"
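As an illustration of how such an OR query over Title and Keyword fields behaves, a minimal sketch follows (the `matches_query` helper and the sample records are hypothetical, not part of the authors' actual pipeline):

```python
# Search terms used in the Education Research Complete query.
SEARCH_TERMS = [
    "online learning", "online teaching", "online program",
    "online course", "online education",
]

def matches_query(title: str, keywords: list) -> bool:
    """Return True if any search term appears in the title or keyword
    list (case-insensitive), mirroring an OR query over both fields."""
    haystacks = [title.lower()] + [k.lower() for k in keywords]
    return any(term in hay for term in SEARCH_TERMS for hay in haystacks)

# Illustrative records, not actual database entries.
print(matches_query("A study of online learning engagement", []))  # True
print(matches_query("Face-to-face pedagogy", ["classroom"]))       # False
```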

3.2. Inclusion/exclusion criteria

The initial search of online learning research among journals in the database resulted in more than 3000 possible articles. Therefore, we limited our search to select journals that focus on publishing peer-reviewed online learning and educational research. Our aim was to capture the journals that published the most articles on online learning. However, we also wanted to incorporate the concept of rigor, so we used expert perception to identify 12 peer-reviewed journals that publish high-quality online learning research. Dissertations and conference proceedings were excluded. To be included in this systematic review, each study had to meet all of the screening criteria described in Table 3 .

Inclusion/Exclusion criteria.

3.3. Process flow selection of articles

Fig. 2 shows the process flow involved in the selection of articles. The search in the database Education Research Complete yielded an initial sample of 3332 articles. Targeting the 12 journals removed 2579 articles. After reviewing the abstracts, we removed 134 articles based on the inclusion/exclusion criteria. The final sample, consisting of 619 articles, was entered into the computer software MAXQDA ( VERBI Software, 2019 ) for coding.
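The funnel arithmetic reported in Fig. 2 can be checked directly:

```python
# Screening funnel: each step removes articles from the sample.
initial = 3332                   # database search results
after_journals = initial - 2579  # limited to the 12 target journals
final = after_journals - 134     # abstract screening against criteria

print(after_journals)  # 753
print(final)           # 619
```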

Fig. 2

Flowchart of online learning research selection.

3.4. Developing review protocol

A review protocol was designed as a codebook in MAXQDA ( VERBI Software, 2019 ) by the three researchers. The codebook was developed based on findings from the previous review studies and from the initial screening of the articles in this review. The codebook included the 12 research themes listed earlier in Table 2 (Learner Characteristics; Instructor Characteristics; Course or Program Design and Development; Course Facilitation; Engagement; Course Assessment; Course Technologies; Evaluation and Quality Assurance; Access, Culture, Equity, Inclusion, and Ethics; Leadership, Policy, and Management; Instructor and Learner Support; and Learner Outcomes), four research settings (higher education, continuing education, K-12, corporate/military), and three research designs (quantitative, qualitative, and mixed methods). Fig. 3 below is a screenshot of MAXQDA used for the coding process.

Fig. 3

Codebook from MAXQDA.

3.5. Data coding

Research articles were coded by two researchers in MAXQDA. The two researchers independently coded 10% of the articles and then discussed and updated the coding framework. The second author, a doctoral student, coded the remaining studies. The researchers met bi-weekly to address coding questions that emerged. After the first phase of coding, we found that more than 100 studies fell into each of the categories of Learner Characteristics and Engagement, so we pursued a second phase of coding to reexamine these two themes. Learner Characteristics was classified into the sub-themes of Academic, Affective, Motivational, Self-regulation, Cognitive, and Demographic characteristics. Engagement was classified into the sub-themes of Collaboration, Communication, Community, Involvement, Interaction, Participation, and Presence.
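The paper reports double coding of a 10% sample but no agreement statistic; a minimal percent-agreement check on such a jointly coded sample could look like the sketch below (the `percent_agreement` helper and both coder lists are hypothetical):

```python
def percent_agreement(coder_a: list, coder_b: list) -> float:
    """Share of articles to which two coders assigned the same theme."""
    assert len(coder_a) == len(coder_b)
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical theme codes for a small double-coded sample.
a = ["Engagement", "Learner Characteristics", "Course Technologies", "Engagement"]
b = ["Engagement", "Learner Characteristics", "Course Facilitation", "Engagement"]
print(percent_agreement(a, b))  # 0.75
```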

3.6. Data analysis

Frequency tables were generated for each of the variables so that outliers could be examined and narrative data could be collapsed into categories. Once cleaned and collapsed into a reasonable number of categories, descriptive statistics were used to describe each of the coded elements. We first present the frequencies of publications related to online learning in the 12 journals. The total number of articles for each journal (collectively, the population) was hand-counted from journal websites, excluding editorials and book reviews. The publication trend of online learning research was also depicted from 2009 to 2018. Then, the descriptive information of the 12 themes, including the subthemes of Learner Characteristics and Engagement were provided. Finally, research themes by research settings and methodology were elaborated.
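The frequency tabulation described above can be sketched with `collections.Counter`, assuming per-article theme codes as input (the sample codes are illustrative, not the actual dataset):

```python
from collections import Counter

# Illustrative per-article theme codes, not the actual review data.
codes = ["Engagement", "Engagement", "Learner Characteristics",
         "Course Technologies", "Engagement"]

freq = Counter(codes)
total = sum(freq.values())
# Map each theme to (count, percentage of all coded articles).
table = {theme: (n, round(100 * n / total, 2)) for theme, n in freq.items()}
print(table["Engagement"])  # (3, 60.0)
```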

4. Results

4.1. Publication trends on online learning

Publication patterns of the 619 articles reviewed from the 12 journals are presented in Table 4 . International Review of Research in Open and Distributed Learning had the highest number of publications in this review. Overall, about 8% of the articles appearing in these twelve journals consisted of online learning publications; however, several journals had concentrations of online learning articles totaling more than 20%.

Empirical online learning research articles by journal, 2009–2018.

Note . Journal's Total Article count excludes reviews and editorials.

The publication trend of online learning research is depicted in Fig. 4 . When disaggregated by year, the total frequency of publications shows an increasing trend. Online learning articles increased throughout the decade and hit a relative maximum in 2014. The greatest number of online learning articles ( n  = 86) occurred most recently, in 2018.

Fig. 4

Online learning publication trends by year.

4.2. Online learning research themes that appeared in the selected articles

The publications were categorized into the twelve research themes identified in Fig. 1 . The frequency counts and percentages of the research themes are provided in Table 5 below. A majority of the research falls into the Learner domain; the fewest articles appear in the Organization domain.

Research themes in the online learning publications from 2009 to 2018.

The specific themes of Engagement ( n  = 179, 28.92%) and Learner Characteristics ( n  = 134, 21.65%) were most often examined in publications. These two themes were further coded to identify sub-themes, which are described in the next two sections. Publications focusing on Instructor Characteristics ( n  = 21, 3.39%) were least common in the dataset.
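These reported percentages are consistent with the 619-article sample; for example:

```python
TOTAL = 619  # empirical online learning articles in the review

def pct(n: int) -> float:
    """Percentage of the full sample, rounded to two decimals."""
    return round(100 * n / TOTAL, 2)

print(pct(179))  # Engagement -> 28.92
print(pct(134))  # Learner Characteristics -> 21.65
print(pct(21))   # Instructor Characteristics -> 3.39
```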

4.2.1. Research on engagement

The largest number of studies was on engagement in online learning, which the online learning literature refers to and examines through different terms; hence, we explore this category in more detail. We categorized the articles into seven sub-themes reflecting these different lenses: presence, interaction, community, participation, collaboration, involvement, and communication. We use "involvement" for studies in which researchers broadly used the term engagement to describe their work without further description. Table 6 below provides the description, frequency, and percentage of the various studies related to engagement.

Research sub-themes on engagement.

In the sections below, we provide several examples of the different engagement sub-themes that were studied within the larger engagement theme.

Presence. This sub-theme was the most researched in engagement. With the development of the community of inquiry framework, most of the studies in this sub-theme examined social presence ( Akcaoglu & Lee, 2016 ; Phirangee & Malec, 2017 ; Wei et al., 2012 ), teaching presence ( Orcutt & Dringus, 2017 ; Preisman, 2014 ; Wisneski et al., 2015 ), and cognitive presence ( Archibald, 2010 ; Olesova et al., 2016 ).

Interaction. This was the second most studied theme under engagement. Researchers examined increasing interpersonal interactions ( Cung et al., 2018 ), learner-learner interactions ( Phirangee, 2016 ; Shackelford & Maxwell, 2012 ; Tawfik et al., 2018 ), peer-peer interaction ( Comer et al., 2014 ), learner-instructor interaction ( Kuo et al., 2014 ), learner-content interaction ( Zimmerman, 2012 ), interaction through peer mentoring ( Ruane & Koku, 2014 ), interaction and community building ( Thormann & Fidalgo, 2014 ), and interaction in discussions ( Ruane & Lee, 2016 ; Tibi, 2018 ).

Community. Researchers examined building community in online courses ( Berry, 2017 ), supporting a sense of community ( Jiang, 2017 ), building an online learning community of practice ( Cho, 2016 ), building an academic community ( Glazer & Wanstreet, 2011 ; Nye, 2015 ; Overbaugh & Nickel, 2011 ), and examining connectedness and rapport in an online community ( Bolliger & Inan, 2012 ; Murphy & Rodríguez-Manzanares, 2012 ; Slagter van Tryon & Bishop, 2012 ).

Participation. Researchers examined engagement through participation in a number of studies. Some of the topics include, participation patterns in online discussion ( Marbouti & Wise, 2016 ; Wise et al., 2012 ), participation in MOOCs ( Ahn et al., 2013 ; Saadatmand & Kumpulainen, 2014 ), features that influence students’ online participation ( Rye & Støkken, 2012 ) and active participation.

Collaboration. Researchers examined engagement through collaborative learning. Specific studies focused on cross-cultural collaboration ( Kumi-Yeboah, 2018 ; Yang et al., 2014 ), how virtual teams collaborate ( Verstegen et al., 2018 ), types of collaboration teams ( Wicks et al., 2015 ), tools for collaboration ( Boling et al., 2014 ), and support for collaboration ( Kopp et al., 2012 ).

Involvement. Researchers examined engaging learners through involvement in various learning activities ( Cundell & Sheepy, 2018 ), student engagement through various measures ( Dixson, 2015 ), how instructors included engagement to involve students in learning ( O'Shea et al., 2015 ), different strategies to engage the learner ( Amador & Mederer, 2013 ), and designed emotionally engaging online environments ( Koseoglu & Doering, 2011 ).

Communication. Researchers examined communication in online learning in studies using social network analysis ( Ergün & Usluel, 2016 ), using informal communication tools such as Facebook for class discussion ( Kent, 2013 ), and using various modes of communication ( Cunningham et al., 2010 ; Rowe, 2016 ). Studies have also focused on both asynchronous and synchronous aspects of communication ( Swaggerty & Broemmel, 2017 ; Yamagata-Lynch, 2014 ).

4.2.2. Research on learner characteristics

The second largest theme was learner characteristics, which we explore further to identify its several aspects. We categorized learner characteristics into self-regulation, motivational, academic, affective, cognitive, and demographic characteristics. Table 7 provides the number and percentage of studies examining the various learner characteristics.

Research sub-themes on learner characteristics.

Online learning has elements that differ from the traditional face-to-face classroom, and so the characteristics of online learners also differ. Yukselturk and Top (2013) categorized the online learner profile into ten aspects: gender, age, work status, self-efficacy, online readiness, self-regulation, participation in discussion lists, participation in chat sessions, satisfaction, and achievement. Their categorization shows that online learner characteristics differ in these aspects from those of learners in other settings. Some of the aspects discussed by Yukselturk and Top (2013) , such as participation and achievement, are covered under different research themes in this study. The sections below provide examples of the learner characteristics sub-themes that were studied.

Self-regulation. Several researchers have examined self-regulation in online learning. They found that successful online learners are academically motivated ( Artino & Stephens, 2009 ), have academic self-efficacy ( Cho & Shen, 2013 ), have grit and intention to succeed ( Wang & Baker, 2018 ), have time management and elaboration strategies ( Broadbent, 2017 ), set goals and revisit course content ( Kizilcec et al., 2017 ), and persist ( Glazer & Murphy, 2015 ). Researchers found a positive relationship between learner's self-regulation and interaction ( Delen et al., 2014 ) and self-regulation and communication and collaboration ( Barnard et al., 2009 ).

Motivation. Researchers focused on the motivation of online learners, including different motivation levels of online learners ( Li & Tsai, 2017 ), what motivated online learners ( Chaiprasurt & Esichaikul, 2013 ), differences in motivation of online learners ( Hartnett et al., 2011 ), and motivation compared to face-to-face learners ( Paechter & Maier, 2010 ). Hartnett et al. (2011) found that online learner motivation was complex, multifaceted, and sensitive to situational conditions.

Academic. Several researchers have focused on academic aspects of online learner characteristics. Readiness for online learning has been examined as an academic factor by several researchers ( Buzdar et al., 2016 ; Dray et al., 2011 ; Wladis & Samuels, 2016 ; Yu, 2018 ), specifically focusing on creating and validating measures of online learner readiness, including students' emotional intelligence as a measure of readiness for online learning. Researchers have also examined other academic factors such as academic standing ( Bradford & Wyatt, 2010 ), course-level factors ( Wladis et al., 2014 ), and academic skills in online courses ( Shea & Bidjerano, 2014 ).

Affective. Anderson and Bourke (2013) describe affective characteristics through which learners express feelings or emotions. Several research studies focused on the affective characteristics of online learners. Learner satisfaction for online learning has been examined by several researchers ( Cole et al., 2014 ; Dziuban et al., 2015 ; Kuo et al., 2013 ; Lee, 2014a ) along with examining student emotions towards online assessment ( Kim et al., 2014 ).

Cognitive. Researchers have also examined cognitive aspects of learner characteristics including meta-cognitive skills, cognitive variables, higher-order thinking, cognitive density, and critical thinking ( Chen & Wu, 2012 ; Lee, 2014b ). Lee (2014b) examined the relationship between cognitive presence density and higher-order thinking skills. Chen and Wu (2012) examined the relationship between cognitive and motivational variables in an online system for secondary physical education.

Demographic. Researchers have examined various demographic factors in online learning. Several researchers have examined gender differences in online learning ( Bayeck et al., 2018 ; Lowes et al., 2016 ; Yukselturk & Bulut, 2009 ), ethnicity, age ( Ke & Kwak, 2013 ), and minority status ( Yeboah & Smith, 2016 ) of online learners.

4.2.3. Less frequently studied research themes

While engagement and learner characteristics were studied the most, other themes were less often studied in the literature and are presented here, according to size, with general descriptions of the types of research examined for each.

Evaluation and Quality Assurance. There were 38 studies (6.14%) published in the theme of evaluation and quality assurance. Some of the studies in this theme focused on course quality standards, using Quality Matters to evaluate quality, using the CIPP model for evaluation, online learning system evaluation, and course and program evaluations.

Course Technologies. There were 35 studies (5.65%) published in the course technologies theme. Some of the studies examined specific technologies such as Edmodo, YouTube, Web 2.0 tools, wikis, Twitter, WebCT, Screencasts, and Web conferencing systems in the online learning context.

Course Facilitation. There were 34 studies (5.49%) published in the course facilitation theme. Some of the studies in this theme examined facilitation strategies and methods, experiences of online facilitators, and online teaching methods.

Institutional Support. There were 33 studies (5.33%) published in the institutional support theme which included support for both the instructor and learner. Some of the studies on instructor support focused on training new online instructors, mentoring programs for faculty, professional development resources for faculty, online adjunct faculty training, and institutional support for online instructors. Studies on learner support focused on learning resources for online students, cognitive and social support for online learners, and help systems for online learner support.

Learner Outcome. There were 32 studies (5.17%) published in the learner outcome theme. Some of the studies that were examined in this theme focused on online learner enrollment, completion, learner dropout, retention, and learner success.

Course Assessment. There were 30 studies (4.85%) published in the course assessment theme. Some of the studies in the course assessment theme examined online exams, peer assessment and peer feedback, proctoring in online exams, and alternative assessments such as eportfolio.

Access, Culture, Equity, Inclusion, and Ethics. There were 29 studies (4.68%) published in the access, culture, equity, inclusion, and ethics theme. Some of the studies in this theme examined online learning across cultures, multi-cultural effectiveness, multi-access, and cultural diversity in online learning.

Leadership, Policy, and Management. There were 27 studies (4.36%) published in the leadership, policy, and management theme. Some of the studies on leadership, policy, and management focused on online learning leaders, stakeholders, strategies for online learning leadership, resource requirements, university policies for online course policies, governance, course ownership, and faculty incentives for online teaching.

Course Design and Development. There were 27 studies (4.36%) published in the course design and development theme. Some of the studies examined in this theme focused on design elements, design issues, design process, design competencies, design considerations, and instructional design in online courses.

Instructor Characteristics. There were 21 studies (3.39%) published in the instructor characteristics theme. Some of the studies in this theme were on motivation and experiences of online instructors, ability to perform online teaching duties, roles of online instructors, and adjunct versus full-time online instructors.
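The percentages above are simple shares of the 619 articles included in the review. A minimal sketch reproducing them (the counts come from the text above; the function and variable names are ours):

```python
# Share of the 619 reviewed articles per research theme,
# rounded to two decimals as reported in the text above.
THEME_COUNTS = {
    "Evaluation and Quality Assurance": 38,
    "Course Technologies": 35,
    "Course Facilitation": 34,
    "Institutional Support": 33,
    "Learner Outcome": 32,
    "Course Assessment": 30,
    "Access, Culture, Equity, Inclusion, and Ethics": 29,
    "Leadership, Policy, and Management": 27,
    "Course Design and Development": 27,
    "Instructor Characteristics": 21,
}
TOTAL_ARTICLES = 619  # articles included in the systematic review

def theme_share(count: int, total: int = TOTAL_ARTICLES) -> float:
    """Percentage of reviewed articles falling in a given theme."""
    return round(100 * count / total, 2)

for theme, n in THEME_COUNTS.items():
    print(f"{theme}: {n} studies ({theme_share(n)}%)")
```

Running this recovers the figures reported in the text, e.g. 38/619 gives 6.14% and 21/619 gives 3.39%.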

4.3. Research settings and methodology used in the studies

The research methods used in the studies were classified into quantitative, qualitative, and mixed methods ( Harwell, 2012 , pp. 147–163). The research settings were categorized into higher education, continuing education, K-12, and corporate/military. As shown in Table A in the appendix, the vast majority of the publications used higher education as the research setting ( n  = 509, 67.6%). Table B in the appendix shows that quantitative methods were the most common ( n  = 324, 43.03%), followed by qualitative methods ( n  = 200, 26.56%); mixed methods accounted for the smallest portion ( n  = 95, 12.62%).

Table A shows that the patterns of the four research settings were approximately consistent across the 12 themes except for the themes of Learner Outcome and Institutional Support. Continuing education had a higher relative frequency in Learner Outcome (0.28) and K-12 had a higher relative frequency in Institutional Support (0.33) compared to the frequencies they had across all themes (0.09 and 0.08, respectively). Table B in the appendix shows that the distribution of the three methods was not consistent across the 12 themes. While quantitative and qualitative studies were roughly evenly distributed in Engagement, there was a large discrepancy in Learner Characteristics: 100 quantitative studies but only 18 qualitative studies were published in that theme.
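The relative frequencies reported for Table A are within-theme proportions of each research setting. A hypothetical sketch, assuming invented per-setting counts (these are NOT the actual Table A values; they are chosen only so that they sum to the 32 Learner Outcome studies and reproduce the 0.28 continuing-education figure quoted above):

```python
def setting_relative_frequencies(setting_counts: dict) -> dict:
    """Within-theme relative frequency of each research setting."""
    total = sum(setting_counts.values())
    return {setting: round(n / total, 2) for setting, n in setting_counts.items()}

# Hypothetical counts for the Learner Outcome theme (illustrative only).
learner_outcome = {
    "Higher education": 20,
    "Continuing education": 9,
    "K-12": 2,
    "Corporate/military": 1,
}
print(setting_relative_frequencies(learner_outcome))
```

With these assumed counts, continuing education's relative frequency is 9/32 = 0.28, well above its 0.09 share across all themes, which is the kind of discrepancy the paragraph above describes.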

In summary, around 8% of the articles published in the 12 journals focused on online learning. Online learning publications increased on the whole over the past decade, albeit with fluctuations, with the greatest number occurring in 2018. Among the 12 research themes related to online learning, Engagement and Learner Characteristics were studied the most and Instructor Characteristics was studied the least. Most studies were conducted in the higher education setting, and approximately half used quantitative methods. Looking at the 12 themes by setting and method, we found that the patterns of the themes by setting or by method were not consistent across the 12 themes.

The quality of our findings was supported by systematic, thorough searches and by coding consistency. The selection of the 12 journals provides evidence of the representativeness and quality of the primary studies. In the coding process, any difficulties and questions were resolved through consultations with the research team at bi-weekly meetings, which supported the intra-rater and inter-rater reliability of the coding. Together, these steps promote the transparency and replicability of the process and the quality of our results.

5. Discussion

This review enabled us to identify the online learning research themes examined from 2009 to 2018. In the sections below, we review the most studied research themes, engagement and learner characteristics, along with implications, limitations, and directions for future research.

5.1. Most studied research themes

Three out of the four systematic reviews informing the design of the present study found that online learner characteristics and online engagement were examined in a high number of studies. In this review, about half of the studies reviewed (50.57%) focused on online learner characteristics or online engagement, showing the continued importance of these two themes. By contrast, Tallent-Runnels et al. (2006) identified learner characteristics as the least studied theme, noting that researchers were only beginning to investigate learner characteristics in the early days of online learning.

One difference found in this review is that course design and development was examined in the fewest studies compared to two prior systematic reviews ( Berge & Mrozowski, 2001 ; Zawacki-Richter et al., 2009 ). Zawacki-Richter et al. did not use a keyword search but instead reviewed all the articles in five distance education journals. Berge and Mrozowski (2001) included a research theme called design issues covering all aspects of instructional systems design in distance education journals. In our study, in addition to course design and development, we also had themes focused on learner outcomes, course facilitation, course assessment, and course evaluation. These are all instructional-design-focused topics, and because multiple themes covered instructional design, the course design and development category may have captured fewer studies. There is still a need for more studies focusing on online course design and development.

5.2. Least frequently studied research themes

Three out of the four systematic reviews discussed in the opening of this study found management and organization factors to be the least studied. In this review, at the organizational level, Leadership, Policy, and Management was examined in 4.36% of the studies and Access, Culture, Equity, Inclusion, and Ethics in 4.68%. The theme of equity and accessibility was also the least studied theme in the Berge and Mrozowski (2001) study. In addition, instructor characteristics was the least examined of the twelve themes in this review, appearing in only 3.39% of the studies. While some studies examined instructor motivation and experiences, instructor ability to teach online, online instructor roles, and adjunct versus full-time online instructors, there is still a need to examine topics focused on instructors and online teaching. This theme was not included in the prior reviews, whose focus was on the learner and the course rather than the instructor. While it is encouraging to see research evolving on instructor-focused topics, more research on the online instructor is still needed.

5.3. Comparing research themes from current study to previous studies

The research themes from this review were compared with research themes from previous systematic reviews, which targeted prior decades. Table 8 shows the comparison.

Table 8. Comparison of the most and least studied online learning research themes from the current and previous reviews.

L = Learner, C = Course, O = Organization.

5.4. Need for more studies on organizational level themes of online learning

This review found a greater concentration of studies on Learner domain topics and reduced attention to broader, more encompassing research themes in the Course and Organization domains. Organizational-level topics such as Access, Culture, Equity, Inclusion, and Ethics and Leadership, Policy, and Management need to be researched within the context of online learning. Examining access, culture, equity, inclusion, and ethics is very important for supporting diverse online learners, particularly with the rapid expansion of online learning across all educational levels. This area was also the least studied in the Berge and Mrozowski (2001) systematic review.

Topics on leadership, policy, and management were the least studied both in this review and in the Tallent-Runnels et al. (2006) and Zawacki-Richter et al. (2009) studies. Tallent-Runnels et al. categorized institutional and administrative aspects into institutional policies, institutional support, and enrollment effects. While we included support as a separate category, in this study leadership, policy, and management were combined. There is still a need for research on the leadership of those who manage online learning, on policies for online education, and on managing online programs. In the Zawacki-Richter et al. (2009) study, only a few studies examined management- and organization-focused topics; they also found management and organization to be strongly correlated with costs and benefits. In our study, costs and benefits were included as an aspect of management and organization rather than as a theme by themselves. Such studies would provide research-based evidence for online education administrators.

6. Limitations

As with any systematic review, there are limitations to the scope of the review. The search was limited to twelve journals in the field that typically publish research on online learning. These manuscripts were identified by searching the Education Research Complete database, which focuses on education students, professionals, and policymakers. Other discipline-specific journals, as well as dissertations and proceedings, were not included due to the volume of articles. Also, the search was performed using five search terms, "online learning" OR "online teaching" OR "online program" OR "online course" OR "online education", in title and keyword. If authors did not include these terms, their work may have been excluded from this review even if it focused on online learning. While these terms are commonly used in North America, they may not be commonly used in other parts of the world. Additional studies may exist outside this scope.

The search strategy also affected how we present results and introduced limitations regarding generalization. We identified that only 8% of the articles published in these journals were related to online learning; however, given the use of search terms to identify articles within select journals, it was not feasible to identify the total number of research-based articles in the population. Furthermore, our review focused on the topics and general methods of research and did not systematically consider the quality of the published research. Lastly, some journals may prefer to publish studies on a particular topic or that use a particular method (e.g., quantitative methods), which introduces possible selection and publication biases that may skew the interpretation of results through over- or under-representation. Future studies are recommended to include more journals to minimize selection bias and obtain a more representative sample.

Certain limitations can be attributed to the coding process. Overall, the coding process worked well for most articles, as each tended to have an individual or dominant focus described in the abstract, though several mentioned other categories that were likely considered to a lesser degree. In some cases, however, a dominant theme was not apparent, and in an effort to create mutually exclusive groups for clearer interpretation, the coders were occasionally forced to choose between two categories. To facilitate this coding, the full texts were used to identify a study's focus through consensus-seeking discussion among all authors. Likewise, some studies focused on topics that we associated with a particular domain, but the design of the study may have promoted an aggregated examination or integrated factors from multiple domains (e.g., engagement). Because of our reliance on author descriptions, construct validity is a likely concern that requires additional exploration. Our final grouping of codes may not have aligned with the original authors' descriptions in the abstracts. Additionally, coding of broader constructs that disproportionately occur in the Learner domain, such as learner outcomes, learner characteristics, and engagement, likely introduced bias toward these codes when considering studies that involved multiple domains. Additional refinement to explore the intersection of domains within studies is needed.

7. Implications and future research

One of the strengths of this review is the set of research categories we identified. We hope these categories will support future researchers and identify areas and levels of need for future research. Overall, there is some agreement on online learning research themes between previous reviews and this one, but there are also some contradictory findings. We hope the most- and least-researched themes give authors direction on the importance of research and the areas of need to focus on.

The leading theme found in this review is online engagement research. However, presentation of this research was inconsistent and often lacked specificity. This is not unique to online environments, but the nuances of defining engagement in an online environment are unique and therefore need further investigation and clarification. This review points to seven distinct classifications of online engagement. Further research on engagement should indicate which type of engagement is sought. This level of specificity is necessary to establish instruments for measuring engagement and ultimately to test frameworks for classifying engagement and promoting it in online environments. It may also be important to examine the relationships among these seven sub-themes of engagement.

Additionally, this review highlights growing attention to learner characteristics, which constitutes a shift in focus away from instructional characteristics and course design. Although this is consistent with the focus on engagement, the roles of the instructor and of course design with respect to these outcomes remain important. Results of learner characteristics and engagement research, paired with course design, will have important ramifications for teaching and learning professionals who support instruction. The review also points to a concentration of research in higher education. With an immediate and growing emphasis on online learning in K-12 and corporate settings, there is a critical need for further investigation in these settings.

Lastly, because the present review did not focus on the overall effect of interventions, opportunities exist for dedicated meta-analyses. Particular attention to research on engagement and learner characteristics as well as how these vary by study design and outcomes would be logical additions to the research literature.

8. Conclusion

This systematic review builds upon three previous reviews, which tackled the topic of online learning between 1990 and 2010, by extending the timeframe to consider the most recent set of published research. Covering the most recent decade, our review of 619 articles from 12 leading online learning journals points to a more concentrated focus on the learner domain, including engagement and learner characteristics, with more limited attention to topics at the course or organizational level. The review highlights an opportunity for the field to clarify terminology concerning online learning research, particularly in the areas of learner outcomes, where there is a tendency to classify research more generally (e.g., engagement). Using this sample of published literature, we provide a possible taxonomy for categorizing this research using subcategories. The field could benefit from a broader conversation about how these categories can shape a comprehensive framework for online learning research. Such efforts will enable the field to effectively prioritize research aims over time and synthesize effects.

Credit author statement

Florence Martin: Conceptualization, Writing - original draft, Writing - review & editing, Supervision, Project administration. Ting Sun: Methodology, Formal analysis, Writing - original draft, Writing - review & editing. Carl Westine: Methodology, Formal analysis, Writing - original draft, Writing - review & editing, Supervision.

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

1 Includes articles that are cited in this manuscript and also included in the systematic review. The entire list of 619 articles used in the systematic review can be obtained by emailing the authors.

Appendix B Supplementary data to this article can be found online at https://doi.org/10.1016/j.compedu.2020.104009 .

Appendix A. 

Research Themes by the Settings in the Online Learning Publications

Research Themes by the Methodology in the Online Learning Publications


References 1

  • Ahn J., Butler B.S., Alam A., Webster S.A. Learner participation and engagement in open online courses: Insights from the Peer 2 Peer University. MERLOT Journal of Online Learning and Teaching. 2013;9(2):160–171. *
  • Akcaoglu M., Lee E. Increasing social presence in online learning through small group discussions. International Review of Research in Open and Distance Learning. 2016;17(3). *
  • Allen I.E., Seaman J. Digital compass learning: Distance education enrollment report 2017. Babson Survey Research Group; 2017.
  • Amador J.A., Mederer H. Migrating successful student engagement strategies online: Opportunities and challenges using jigsaw groups and problem-based learning. Journal of Online Learning and Teaching. 2013;9(1):89. *
  • Anderson L.W., Bourke S.F. Assessing affective characteristics in the schools. Routledge; 2013.
  • Archibald D. Fostering the development of cognitive presence: Initial findings using the community of inquiry survey instrument. The Internet and Higher Education. 2010;13(1–2):73–74. *
  • Artino A.R., Jr., Stephens J.M. Academic motivation and self-regulation: A comparative analysis of undergraduate and graduate students learning online. The Internet and Higher Education. 2009;12(3–4):146–151.
  • Barnard L., Lan W.Y., To Y.M., Paton V.O., Lai S.L. Measuring self-regulation in online and blended learning environments. The Internet and Higher Education. 2009;12(1):1–6. *
  • Bayeck R.Y., Hristova A., Jablokow K.W., Bonafini F. Exploring the relevance of single-gender group formation: What we learn from a massive open online course (MOOC). British Journal of Educational Technology. 2018;49(1):88–100. *
  • Berge Z., Mrozowski S. Review of research in distance education, 1990 to 1999. American Journal of Distance Education. 2001;15(3):5–19. doi:10.1080/08923640109527090.
  • Berry S. Building community in online doctoral classrooms: Instructor practices that support community. Online Learning. 2017;21(2). *
  • Boling E.C., Holan E., Horbatt B., Hough M., Jean-Louis J., Khurana C., Spiezio C. Using online tools for communication and collaboration: Understanding educators' experiences in an online course. The Internet and Higher Education. 2014;23:48–55. *
  • Bolliger D.U., Inan F.A. Development and validation of the online student connectedness survey (OSCS). International Review of Research in Open and Distance Learning. 2012;13(3):41–65. *
  • Bradford G., Wyatt S. Online learning and student satisfaction: Academic standing, ethnicity and their influence on facilitated learning, engagement, and information fluency. The Internet and Higher Education. 2010;13(3):108–114. *
  • Broadbent J. Comparing online and blended learner's self-regulated learning strategies and academic performance. The Internet and Higher Education. 2017;33:24–32.
  • Buzdar M., Ali A., Tariq R. Emotional intelligence as a determinant of readiness for online learning. International Review of Research in Open and Distance Learning. 2016;17(1). *
  • Capdeferro N., Romero M., Barberà E. Polychronicity: Review of the literature and a new configuration for the study of this hidden dimension of online learning. Distance Education. 2014;35(3):294–310.
  • Chaiprasurt C., Esichaikul V. Enhancing motivation in online courses with mobile communication tool support: A comparative study. International Review of Research in Open and Distance Learning. 2013;14(3):377–401.
  • Chen C.H., Wu I.C. The interplay between cognitive and motivational variables in a supportive online learning system for secondary physical education. Computers & Education. 2012;58(1):542–550. *
  • Cho H. Under co-construction: An online community of practice for bilingual pre-service teachers. Computers & Education. 2016;92:76–89. *
  • Cho M.H., Shen D. Self-regulation in online learning. Distance Education. 2013;34(3):290–301.
  • Cole M.T., Shelley D.J., Swartz L.B. Online instruction, e-learning, and student satisfaction: A three-year study. International Review of Research in Open and Distance Learning. 2014;15(6). *
  • Comer D.K., Clark C.R., Canelas D.A. Writing to learn and learning to write across the disciplines: Peer-to-peer writing in introductory-level MOOCs. International Review of Research in Open and Distance Learning. 2014;15(5):26–82. *
  • Cundell A., Sheepy E. Student perceptions of the most effective and engaging online learning activities in a blended graduate seminar. Online Learning. 2018;22(3):87–102. *
  • Cung B., Xu D., Eichhorn S. Increasing interpersonal interactions in an online course: Does increased instructor email activity and voluntary meeting time in a physical classroom facilitate student learning? Online Learning. 2018;22(3):193–215.
  • Cunningham U.M., Fägersten K.B., Holmsten E. "Can you hear me, Hanoi?" Compensatory mechanisms employed in synchronous net-based English language learning. International Review of Research in Open and Distance Learning. 2010;11(1):161–177.
  • Davis D., Chen G., Hauff C., Houben G.J. Activating learning at scale: A review of innovations in online learning strategies. Computers & Education. 2018;125:327–344.
  • Delen E., Liew J., Willson V. Effects of interactivity and instructional scaffolding on learning: Self-regulation in online video-based environments. Computers & Education. 2014;78:312–320.
  • Dixson M.D. Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learning. 2015;19(4). *
  • Dray B.J., Lowenthal P.R., Miszkiewicz M.J., Ruiz-Primo M.A., Marczynski K. Developing an instrument to assess student readiness for online learning: A validation study. Distance Education. 2011;32(1):29–47. *
  • Dziuban C., Moskal P., Thompson J., Kramer L., DeCantis G., Hermsdorfer A. Student satisfaction with online learning: Is it a psychological contract? Online Learning. 2015;19(2). *
  • Ergün E., Usluel Y.K. An analysis of density and degree-centrality according to the social networking structure formed in an online learning environment. Journal of Educational Technology & Society. 2016;19(4):34–46. *
  • Esfijani A. Measuring quality in online education: A meta-synthesis. American Journal of Distance Education. 2018;32(1):57–73.
  • Glazer H.R., Murphy J.A. Optimizing success: A model for persistence in online education. American Journal of Distance Education. 2015;29(2):135–144.
  • Glazer H.R., Wanstreet C.E. Connection to the academic community: Perceptions of students in online education. Quarterly Review of Distance Education. 2011;12(1):55. *
  • Hartnett M., George A.S., Dron J. Examining motivation in online distance learning environments: Complex, multifaceted and situation-dependent. International Review of Research in Open and Distance Learning. 2011;12(6):20–38.
  • Harwell M.R. Research design in qualitative/quantitative/mixed methods. Section III: Opportunities and challenges in designing and conducting inquiry. 2012.
  • Hung J.L. Trends of e-learning research from 2000 to 2008: Use of text mining and bibliometrics. British Journal of Educational Technology. 2012;43(1):5–16.
  • Jiang W. Interdependence of roles, role rotation, and sense of community in an online course. Distance Education. 2017;38(1):84–105.
  • Ke F., Kwak D. Online learning across ethnicity and age: A study on learning interaction participation, perception, and learning satisfaction. Computers & Education. 2013;61:43–51.
  • Kent M. Changing the conversation: Facebook as a venue for online class discussion in higher education. MERLOT Journal of Online Learning and Teaching. 2013;9(4):546–565. *
  • Kim C., Park S.W., Cozart J. Affective and motivational factors of learning in online mathematics courses. British Journal of Educational Technology. 2014;45(1):171–185.
  • Kizilcec R.F., Pérez-Sanagustín M., Maldonado J.J. Self-regulated learning strategies predict learner behavior and goal attainment in Massive Open Online Courses. Computers & Education. 2017;104:18–33.
  • Kopp B., Matteucci M.C., Tomasetto C. E-tutorial support for collaborative online learning: An explorative study on experienced and inexperienced e-tutors. Computers & Education. 2012;58(1):12–20.
  • Koseoglu S., Doering A. Understanding complex ecologies: An investigation of student experiences in adventure learning programs. Distance Education. 2011;32(3):339–355. *
  • Kumi-Yeboah A. Designing a cross-cultural collaborative online learning framework for online instructors. Online Learning. 2018;22(4):181–201. *
  • Kuo Y.C., Walker A.E., Belland B.R., Schroder K.E. A predictive study of student satisfaction in online education programs. International Review of Research in Open and Distance Learning. 2013;14(1):16–39. *
  • Kuo Y.C., Walker A.E., Schroder K.E., Belland B.R. Interaction, Internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. The Internet and Higher Education. 2014;20:35–50. *
  • Lee J. An exploratory study of effective online learning: Assessing satisfaction levels of graduate students of mathematics education associated with human and design factors of an online course. International Review of Research in Open and Distance Learning. 2014;15(1).
  • Lee S.M. The relationships between higher order thinking skills, cognitive density, and social presence in online learning. The Internet and Higher Education. 2014;21:41–52. *
  • Lee K. Rethinking the accessibility of online higher education: A historical review. The Internet and Higher Education. 2017;33:15–23.
  • Lee Y., Choi J. A review of online course dropout research: Implications for practice and future research. Educational Technology Research & Development. 2011;59(5):593–618.
  • Li L.Y., Tsai C.C. Accessing online learning material: Quantitative behavior patterns and their effects on motivation and learning performance. Computers & Education. 2017;114:286–297.
  • Liyanagunawardena T., Adams A., Williams S. MOOCs: A systematic study of the published literature 2008–2012. International Review of Research in Open and Distance Learning. 2013;14(3):202–227.
  • Lowes S., Lin P., Kinghorn B.R. Gender differences in online high school courses. Online Learning. 2016;20(4):100–117.
  • Marbouti F., Wise A.F. Starburst: A new graphical interface to support purposeful attention to others' posts in online discussions. Educational Technology Research & Development. 2016;64(1):87–113. *
  • Martin F., Ahlgrim-Delzell L., Budhrani K. Systematic review of two decades (1995 to 2014) of research on synchronous online learning. American Journal of Distance Education. 2017;31(1):3–19.
  • Moore-Adams B.L., Jones W.M., Cohen J. Learning to teach online: A systematic review of the literature on K-12 teacher preparation for teaching online. Distance Education. 2016;37(3):333–348.
  • Murphy E., Rodríguez-Manzanares M.A. Rapport in distance education. International Review of Research in Open and Distance Learning. 2012;13(1):167–190. *
  • Nye A. Building an online academic learning community among undergraduate students. Distance Education. 2015;36(1):115–128. *
  • Olesova L., Slavin M., Lim J. Exploring the effect of scripted roles on cognitive presence in asynchronous online discussions. Online Learning. 2016;20(4):34–53. *
  • Orcutt J.M., Dringus L.P. Beyond being there: Practices that establish presence, engage students and influence intellectual curiosity in a structured online learning environment. Online Learning. 2017;21(3):15–35. *
  • Overbaugh R.C., Nickel C.E. A comparison of student satisfaction and value of academic community between blended and online sections of a university-level educational foundations course. The Internet and Higher Education. 2011;14(3):164–174. *
  • O'Shea S., Stone C., Delahunty J. "I 'feel' like I am at university even though I am online": Exploring how students narrate their engagement with higher education institutions in an online learning environment. Distance Education. 2015;36(1):41–58. *
  • Paechter M., Maier B. Online or face-to-face? Students' experiences and preferences in e-learning. The Internet and Higher Education. 2010;13(4):292–297.
  • Phirangee K. Students' perceptions of learner-learner interactions that weaken a sense of community in an online learning environment. Online Learning. 2016;20(4):13–33. *
  • Phirangee K., Malec A. Othering in online learning: An examination of social presence, identity, and sense of community. Distance Education. 2017;38(2):160–172. *
  • Preisman K.A. Teaching presence in online education: From the instructor's point of view. Online Learning. 2014;18(3). *
  • Rowe M. Developing graduate attributes in an open online course. British Journal of Educational Technology. 2016;47(5):873–882. *
  • Ruane R., Koku E.F. Social network analysis of undergraduate education student interaction in online peer mentoring settings. Journal of Online Learning and Teaching. 2014;10(4):577–589. *
  • Ruane R., Lee V.J. Analysis of discussion board interaction in an online peer mentoring site. Online Learning. 2016;20(4):79–99. *
  • Rye S.A., Støkken A.M. The implications of the local context in global virtual education. International Review of Research in Open and Distance Learning. 2012;13(1):191–206. *
  • Saadatmand M., Kumpulainen K. Participants' perceptions of learning and networking in connectivist MOOCs. Journal of Online Learning and Teaching. 2014;10(1):16. *
  • Shackelford J.L., Maxwell M. Sense of community in graduate online education: Contribution of learner to learner interaction. International Review of Research in Open and Distance Learning. 2012;13(4):228–249. *
  • Shea P., Bidjerano T. Does online learning impede degree completion? A national study of community college students. Computers & Education. 2014;75:103–111. *
  • Sherry L. Issues in distance learning. International Journal of Educational Telecommunications. 1996;1(4):337–365.
  • Slagter van Tryon P.J., Bishop M.J. Evaluating social connectedness online: The design and development of the social perceptions in learning contexts instrument. Distance Education. 2012;33(3):347–364. *
  • Swaggerty E.A., Broemmel A.D. Authenticity, relevance, and connectedness: Graduate students' learning preferences and experiences in an online reading education course. The Internet and Higher Education. 2017;32:80–86. *
  • Tallent-Runnels M.K., Thomas J.A., Lan W.Y., Cooper S., Ahern T.C., Shaw S.M., Liu X. Teaching courses online: A review of the research. Review of Educational Research. 2006;76(1):93–135. doi:10.3102/00346543076001093.
  • Tawfik A.A., Giabbanelli P.J., Hogan M., Msilu F., Gill A., York C.S. Effects of success v failure cases on learner-learner interaction. Computers & Education. 2018;118:120–132.
  • Thomas J. Exploring the use of asynchronous online discussion in health care education: A literature review. Computers & Education. 2013;69:199–215.
  • Thormann J., Fidalgo P. Guidelines for online course moderation and community building from a student's perspective. Journal of Online Learning and Teaching. 2014;10(3):374–388. *
  • Tibi M.H. Computer science students' attitudes towards the use of structured and unstructured discussion forums in fully online courses. Online Learning. 2018; 22 (1):93–106. * [ Google Scholar ]
  • Tsai C.W., Chiang Y.C. Research trends in problem‐based learning (pbl) research in e‐learning and online education environments: A review of publications in SSCI‐indexed journals from 2004 to 2012. British Journal of Educational Technology. 2013; 44 (6):E185–E190. [ Google Scholar ]
  • Tsai C.W., Fan Y.T. Research trends in game‐based learning research in online learning environments: A review of studies published in SSCI‐indexed journals from 2003 to 2012. British Journal of Educational Technology. 2013; 44 (5):E115–E119. [ Google Scholar ]
  • Tsai C.W., Shen P.D., Chiang Y.C. Research trends in meaningful learning research on e‐learning and online education environments: A review of studies published in SSCI‐indexed journals from 2003 to 2012. British Journal of Educational Technology. 2013; 44 (6):E179–E184. [ Google Scholar ]
  • Tsai C.W., Shen P.D., Fan Y.T. Research trends in self‐regulated learning research in online learning environments: A review of studies published in selected journals from 2003 to 2012. British Journal of Educational Technology. 2013; 44 (5):E107–E110. [ Google Scholar ]
  • U.S. Department of Education, Institute of Education Sciences . InstituteofEducationSciences; Washington,DC: 2017. What Works Clearinghouse procedures and standards handbook, version3.0. https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_procedures_v3_0_standards_handbook.pdf Retrievedfrom. [ Google Scholar ]
  • Veletsianos G., Shepherdson P. A systematic analysis and synthesis of the empirical MOOC literature published in 2013–2015. International Review of Research in Open and Distance Learning. 2016; 17 (2) [ Google Scholar ]
  • VERBI Software . 2019. MAXQDA 2020 online manual. Retrieved from maxqda. Com/help-max20/welcome [ Google Scholar ]
  • Verstegen D., Dailey-Hebert A., Fonteijn H., Clarebout G., Spruijt A. How do virtual teams collaborate in online learning tasks in a MOOC? International Review of Research in Open and Distance Learning. 2018; 19 (4) * [ Google Scholar ]
  • Wang Y., Baker R. Grit and intention: Why do learners complete MOOCs? International Review of Research in Open and Distance Learning. 2018; 19 (3) * [ Google Scholar ]
  • Wei C.W., Chen N.S., Kinshuk A model for social presence in online classrooms. Educational Technology Research & Development. 2012; 60 (3):529–545. * [ Google Scholar ]
  • Wicks D., Craft B.B., Lee D., Lumpe A., Henrikson R., Baliram N., Wicks K. An evaluation of low versus high collaboration in online learning. Online Learning. 2015; 19 (4):n4. * [ Google Scholar ]
  • Wise A.F., Perera N., Hsiao Y.T., Speer J., Marbouti F. Microanalytic case studies of individual participation patterns in an asynchronous online discussion in an undergraduate blended course. The Internet and Higher Education. 2012; 15 (2):108–117. * [ Google Scholar ]
  • Wisneski J.E., Ozogul G., Bichelmeyer B.A. Does teaching presence transfer between MBA teaching environments? A comparative investigation of instructional design practices associated with teaching presence. The Internet and Higher Education. 2015; 25 :18–27. * [ Google Scholar ]
  • Wladis C., Hachey A.C., Conway K. An investigation of course-level factors as predictors of online STEM course outcomes. Computers & Education. 2014; 77 :145–150. * [ Google Scholar ]
  • Wladis C., Samuels J. Do online readiness surveys do what they claim? Validity, reliability, and subsequent student enrollment decisions. Computers & Education. 2016; 98 :39–56. [ Google Scholar ]
  • Yamagata-Lynch L.C. Blending online asynchronous and synchronous learning. International Review of Research in Open and Distance Learning. 2014; 15 (2) * [ Google Scholar ]
  • Yang J., Kinshuk, Yu H., Chen S.J., Huang R. Strategies for smooth and effective cross-cultural online collaborative learning. Journal of Educational Technology & Society. 2014; 17 (3):208–221. * [ Google Scholar ]
  • Yeboah A.K., Smith P. Relationships between minority students online learning experiences and academic performance. Online Learning. 2016; 20 (4):n4. * [ Google Scholar ]
  • Yu T. Examining construct validity of the student online learning readiness (SOLR) instrument using confirmatory factor analysis. Online Learning. 2018; 22 (4):277–288. * [ Google Scholar ]
  • Yukselturk E., Bulut S. Gender differences in self-regulated online learning environment. Educational Technology & Society. 2009; 12 (3):12–22. [ Google Scholar ]
  • Yukselturk E., Top E. Exploring the link among entry characteristics, participation behaviors and course outcomes of online learners: An examination of learner profile using cluster analysis. British Journal of Educational Technology. 2013; 44 (5):716–728. [ Google Scholar ]
  • Zawacki-Richter O., Backer E., Vogt S. Review of distance education research (2000 to 2008): Analysis of research areas, methods, and authorship patterns. International Review of Research in Open and Distance Learning. 2009; 10 (6):30. doi: 10.19173/irrodl.v10i6.741. [ CrossRef ] [ Google Scholar ]
  • Zhu M., Sari A., Lee M.M. A systematic review of research methods and topics of the empirical MOOC literature (2014–2016) The Internet and Higher Education. 2018; 37 :31–39. [ Google Scholar ]
  • Zimmerman T.D. Exploring learner to content interaction as a success factor in online courses. International Review of Research in Open and Distance Learning. 2012; 13 (4):152–165. [ Google Scholar ]

A Comparison of Student Learning Outcomes: Online Education vs. Traditional Classroom Instruction

Despite the prevalence of online learning today, it is often viewed as a less favorable option than the traditional, in-person educational experience. Criticisms of online learning come from various sectors, including employer groups, college faculty, and the general public, and generally center on a perceived lack of quality and rigor. Additionally, some students report feelings of social isolation in online learning (Protopsaltis & Baum, 2019).

In my experience as both an online student and an online educator, online learning has been just the opposite. I have been teaching in a fully online master's degree program for the last three years and have found it to be a rich and rewarding experience for students and faculty alike. As an instructor, I have felt more connected to and engaged with my online students than with in-person students. I have also found that students actively engage with course content and demonstrate evidence of higher-order thinking through their work. Students report high levels of satisfaction with their experiences in online learning, as well as with the program overall, as indicated in the Student Evaluations of Teaching (SET) completed at the end of every course. I believe that intelligent course design, along with my engagement in professional development related to teaching and learning online, has greatly influenced my experience.

In an article by Wiley Education Services, the authors identified the top six challenges facing U.S. institutions of higher education:

  • Declining student enrollment
  • Financial difficulties
  • Fewer high school graduates
  • Decreased state funding
  • Lower world rankings
  • Declining international student enrollments

Of the strategies that institutions are exploring to remedy these issues, online learning is reported to be a key focus for many universities (“Top Challenges Facing US Higher Education”, n.d.).

(Babson Survey Research Group, 2016)

Some of the questions I would like to explore in further research include:

  • What factors influence engagement and connection in distance education?
  • Are the learning outcomes in online education any different than the outcomes achieved in a traditional classroom setting?
  • How do course design and instructor training influence these factors?
  • In what ways might educational technology tools enhance the overall experience for students and instructors alike?

In this literature review, I have chosen to focus on a comparison of student learning outcomes in online education versus the traditional classroom setting. My hope is that this research will unlock the answers to some of the additional questions posed above and provide additional direction for future research.

Online Learning Defined

According to Mayadas, Miller, and Sener (2015), an online course is one in which all course activity takes place online, with no required in-person sessions or on-campus activity. It is important to note, however, that the Babson Survey Research Group, a prominent organization known for its surveys and research in online learning, defines online learning as a course in which 80–100% of activity occurs online. While this distinction was made in an effort to provide consistency in surveys year over year, most institutions continue to define online learning as learning that occurs 100% online.

Blended or hybrid learning refers to courses that mix face-to-face meetings, sessions, or activities with online work. The ratio of online to classroom activity is often signaled by the label a course is given. For example, a blended classroom course would likely include more time spent in the classroom, with the remaining work completed online. Conversely, a blended online course would involve a greater percentage of work done online, with some required in-person sessions or meetings (Mayadas, Miller, & Sener, 2015).

A classroom course (also referred to as a traditional course) refers to course activity that is anchored to a regular meeting time.

Enrollment Trends in Online Education

There has been an upward trend in the number of postsecondary students enrolled in online courses in the U.S. since 2002. A report by the Babson Survey Research Group showed that in 2016, more than six million students were enrolled in at least one online course, accounting for 31.6% of all college students (Seaman, Allen, & Seaman, 2018). Of the students taking online courses, 47% are enrolled in fully online programs; the remaining 53% take some, but not all, of their courses online (Protopsaltis & Baum, 2019).

(Seaman et al., 2016, p. 11)

Perceptions of Online Education

In a 2016 report by the Babson Survey Research Group, surveys of faculty conducted between 2002 and 2015 showed approval ratings of the value and legitimacy of online education ranging from 28 to 34 percent. While the numbers rose and fell over that thirteen-year span, faculty approval stood at 29 percent in 2015, just one percentage point higher than in 2002 – indicating that perceptions have remained relatively unchanged over the years (Allen, Seaman, Poulin, & Straut, 2016).

(Allen, Seaman, Poulin, & Straut, 2016, p. 26)

In a separate survey of chief academic officers, perceptions of online learning appeared to align with those of faculty. In this survey, leaders were asked to rate the perceived quality of learning outcomes in online courses compared to traditional in-person settings. While the percentage of leaders rating online learning as "inferior" or "somewhat inferior" to traditional face-to-face courses dropped from 43 percent to 23 percent between 2003 and 2012, the number rose again to 29 percent in 2015 (Allen, Seaman, Poulin, & Straut, 2016).


Faculty and academic leaders in higher education are not alone in perceiving online education as inferior to traditional classroom instruction. A 2013 Gallup poll assessing public perceptions showed that respondents rated online education as "worse" in five of the seven categories surveyed.

(Saad, Busteed, & Ogisi, 2013)

In general, Americans believed that online education provides lower-quality, less individualized instruction and less rigorous testing and grading than the traditional classroom setting. Respondents also thought that employers would view a degree from an online program less positively than one obtained through traditional classroom instruction (Saad, Busteed, & Ogisi, 2013).

Student Perceptions of Online Learning

So what do students have to say about online learning? In  Online College Students 2015: Comprehensive Data on Demands and Preferences,  1,500 college students who were either enrolled or planning to enroll in a fully online undergraduate, graduate, or certificate program were surveyed. Seventy-eight percent of students believed the academic quality of their online learning experience to be better than or equal to their experiences with traditional classroom learning. Furthermore, 30 percent of online students polled said that they would likely not attend classes face-to-face if their program were not available online (Clinefelter & Aslanian, 2015). The following video describes some of the common reasons why students choose to attend college online.

How Online Learning Affects the Lives of Students (Pearson North America, 2018)

In a 2015 study comparing student perceptions of online learning with face-to-face learning, researchers found that the majority of students surveyed preferred traditional face-to-face classes. A content analysis of the findings, however, brought attention to two key ideas: 1) student opinions of online learning may be based on an "old typology of distance education" (Tichavsky et al., 2015, p. 6) rather than on actual experience, and 2) a student's inclination to choose one format over the other is connected to issues of teaching presence and self-regulated learning (Tichavsky et al., 2015).

Student Learning Outcomes

Given the upward trend in online course enrollment at postsecondary schools and the persistently low perceived value of online learning among stakeholder groups, it should be no surprise that there is a large body of literature comparing student learning outcomes in online classes with those in the traditional classroom environment.

While a majority of the studies reviewed found no significant difference in learning outcomes between online and traditional courses (Cavanaugh & Jacquemin, 2015; Kemp & Grieve, 2014; Lyke & Frank 2012; Nichols, Shaffer, & Shockey, 2003; Stack, 2015; Summers, Waigandt, & Whittaker, 2005), there were a few outliers. In a 2019 report, Protopsaltis and Baum found that while learning outcomes are often similar between the two mediums, students "with weak academic preparation and those from low-income and underrepresented backgrounds consistently underperform in fully-online environments" (Protopsaltis & Baum, 2019, n.p.). An important consideration, however, is that these findings are primarily based on students enrolled in online courses at the community college level – a demographic with historically higher attrition than students attending four-year institutions (Ashby, Sadera, & McNary, 2011). Furthermore, students enrolled in online courses have been shown to have 10–20 percent higher attrition than their peers in traditional classroom instruction (Angelino, Williams, & Natvig, 2007). Attrition may therefore be a key contributor to the lower achievement seen in this subgroup of students enrolled in online education.

In contrast, a small number of studies showed that online students tend to outperform those enrolled in traditional classroom instruction. One study, in particular, found a significant difference in test scores for students enrolled in an online undergraduate business course. The confounding variable in this case was age: researchers found that nontraditional-age students significantly outperformed their traditional-age counterparts. The authors concluded that older students may elect to take online classes for practical reasons related to outside work schedules, which may in turn contribute to the learning that occurs overall (Slover & Mandernach, 2018).

In a meta-analysis and review of online learning studies spanning 1996 to 2008, authors from the U.S. Department of Education found that students who took all or part of their classes online showed better learning outcomes than students who took the same courses face-to-face. It is important to note, however, that there were many differences between the online and face-to-face versions of these courses, including the amount of time students spent engaged with course content. The authors concluded that the differences in learning outcomes may be attributable to learning design rather than to the mode of delivery itself (Means, Toyama, Murphy, Bakia, & Jones, 2010).

Limitations and Opportunities

Examining the research comparing student learning outcomes in online education with the traditional classroom setting brought many limitations to light, creating areas of opportunity for additional research. In many of the studies referenced, it is difficult to determine the pedagogical practices used in course design and delivery. Research shows the importance of student-student and student-teacher interaction in online learning and the positive impact of these variables on student learning (Bernard, Borokhovski, Schmid, Tamim, & Abrami, 2014). Some researchers note that while many studies comparing online and traditional classroom learning exist, methodological and design issues make it challenging to interpret the results conclusively (Mollenkopf, Vu, Crow, & Black, 2017). For example, online courses may be structured in a variety of ways – self-paced or instructor-led – and may be classified as synchronous or asynchronous (Moore, Dickson-Deane, & Galyan, 2011).

Another gap in the literature is the failure to use a common language across studies to define the learning environment. This issue is explored extensively in a 2011 study by Moore, Dickson-Deane, and Galyan. Here, the authors examine the differences between e-learning, online learning, and distance learning in the literature, and how the terminology is often used interchangeably despite the variances in characteristics that define each. The authors also discuss the variability in the terms “course” versus “program”. This variability in the literature presents a challenge when attempting to compare one study of online learning to another (Moore, Dickson-Deane, & Galyan, 2011).

Finally, much of the literature in higher education focuses on undergraduate-level classes within the United States. Little research is available on outcomes in graduate-level classes as well as general information on student learning outcomes and perceptions of online learning outside of the U.S.

As we look to the future, there are additional questions to explore in the area of online learning. Overall, this research raised questions about learning design when comparing the two modalities in higher education. Further research is needed to investigate the instructional strategies used to enhance student learning, especially for students with weaker academic preparation or from underrepresented backgrounds. Given the integral role that online learning is expected to play in the future of higher education in the United States, it may be even more critical to move beyond comparisons of online versus face-to-face instruction and instead focus on sound pedagogical quality, with consideration for the mode of delivery, as a means of promoting positive learning outcomes.

Allen, I.E., Seaman, J., Poulin, R., & Straut, T. (2016). Online Report Card: Tracking Online Education in the United States [PDF file]. Babson Survey Research Group.   http://onlinelearningsurvey.com/reports/onlinereportcard.pdf

Angelino, L. M., Williams, F. K., & Natvig, D. (2007). Strategies to engage online students and reduce attrition rates.  The Journal of Educators Online , 4(2).

Ashby, J., Sadera, W.A., & McNary, S.W. (2011). Comparing student success between developmental math courses offered online, blended, and face-to-face.  Journal of Interactive Online Learning , 10(3), 128-140.

Bernard, R.M., Borokhovski, E., Schmid, R.F., Tamim, R.M., & Abrami, P.C. (2014). A meta-analysis of blended learning and technology use in higher education: From the general to the applied.  Journal of Computing in Higher Education , 26(1), 87-122.

Cavanaugh, J.K. & Jacquemin, S.J. (2015). A large sample comparison of grade based student learning outcomes in online vs. face-to-face courses.  Journal of Asynchronous Learning Network,  19(2).

Clinefelter, D. L., & Aslanian, C. B. (2015). Online college students 2015: Comprehensive data on demands and preferences.   https://www.learninghouse.com/wp-content/uploads/2017/09/OnlineCollegeStudents2015.pdf

Golubovskaya, E.A., Tikhonova, E.V., & Mekeko, N.M. (2019). Measuring learning outcome and students’ satisfaction in ELT (e-learning against conventional learning). Paper presented at the ACM International Conference Proceeding Series, 34-38. doi: 10.1145/3337682.3337704

Kemp, N. & Grieve, R. (2014). Face-to-face or face-to-screen? Undergraduates’ opinions and test performance in classroom vs. online learning.  Frontiers in Psychology , 5. doi: 10.3389/fpsyg.2014.01278

Lyke, J., & Frank, M. (2012). Comparison of student learning outcomes in online and traditional classroom environments in a psychology course. (Cover story).  Journal of Instructional Psychology , 39(3/4), 245-250.

Mayadas, F., Miller, G., & Sener, J. (2015).  Definitions of E-Learning Courses and Programs Version 2.0.  Online Learning Consortium.  https://onlinelearningconsortium.org/updated-e-learning-definitions-2/

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. US Department of Education.  https://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf

Mollenkopf, D., Vu, P., Crow, S., & Black, C. (2017). Does online learning deliver? A comparison of student teacher outcomes from candidates in face to face and online program pathways.  Online Journal of Distance Learning Administration,  20(1).

Moore, J.L., Dickson-Deane, C., & Galyan, K. (2011). E-Learning, online learning, and distance learning environments: Are they the same?  The Internet and Higher Education . 14(2), 129-135.

Nichols, J., Shaffer, B., & Shockey, K. (2003). Changing the face of instruction: Is online or in-class more effective?   College & Research Libraries , 64(5), 378–388.  https://doi-org.proxy2.library.illinois.edu/10.5860/crl.64.5.378

Parsons-Pollard, N., Lacks, T.R., & Grant, P.H. (2008). A comparative assessment of student learning outcomes in large online and traditional campus based introduction to criminal justice courses.  Criminal Justice Studies , 2, 225-239.

Pearson North America. (2018, June 25).  How Online Learning Affects the Lives of Students . YouTube.  https://www.youtube.com/watch?v=mPDMagf_oAE

Protopsaltis, S., & Baum, S. (2019). Does online education live up to its promise? A look at the evidence and implications for federal policy [PDF file].   http://mason.gmu.edu/~sprotops/OnlineEd.pdf

Saad, L., Busteed, B., & Ogisi, M. (October 15, 2013). In U.S., Online Education Rated Best for Value and Options.  https://news.gallup.com/poll/165425/online-education-rated-best-value-options.aspx

Stack, S. (2015). Learning Outcomes in an Online vs Traditional Course.  International Journal for the Scholarship of Teaching and Learning , 9(1).

Seaman, J.E., Allen, I.E., & Seaman, J. (2018). Grade Increase: Tracking Distance Education in the United States [PDF file]. Babson Survey Research Group.  http://onlinelearningsurvey.com/reports/gradeincrease.pdf

Slover, E. & Mandernach, J. (2018). Beyond Online versus Face-to-Face Comparisons: The Interaction of Student Age and Mode of Instruction on Academic Achievement.  Journal of Educators Online,  15(1) .  https://files.eric.ed.gov/fulltext/EJ1168945.pdf

Summers, J., Waigandt, A., & Whittaker, T. (2005). A Comparison of Student Achievement and Satisfaction in an Online Versus a Traditional Face-to-Face Statistics Class.  Innovative Higher Education , 29(3), 233–250.  https://doi-org.proxy2.library.illinois.edu/10.1007/s10755-005-1938-x

Tichavsky, L.P., Hunt, A., Driscoll, A., & Jicha, K. (2015). “It’s just nice having a real teacher”: Student perceptions of online versus face-to-face instruction.  International Journal for the Scholarship of Teaching and Learning.  9(2).

Wiley Education Services. (n.d.).  Top challenges facing U.S. higher education.  https://edservices.wiley.com/top-higher-education-challenges/

July 17, 2020


Look Who’s Talking About Online vs. Traditional Education

Any student considering taking courses online who has never done so before may understandably have some trepidation. Is an online course really going to give you the experience and knowledge that you need to pursue the degree — and eventually career — that you want?

Luckily, we are no longer sitting at the starting gate with online learning. With more than 20 years of research available, it is much easier today to assess the impact of online learning on the learning experience, as well as the comparative learning outcomes for students that take an online path versus a more traditional one.

The following experts in learning methodology and online education have taken the time to research, write, and publish their own findings. Each is employed by a university; most are full-time professors, though adjunct faculty are also represented.

Meet the Experts


The college learning experience extends far beyond what students may learn in a course. In fact, from one perspective, college is as much about learning how to learn and how to think critically as it is about the actual substance of the courses themselves. Because critical thinking is so important, and because that process is often supported by interaction between students and faculty, one of the criticisms levied against online education is that it makes these interactions more difficult.

In his piece for the International Journal of E-learning and Distance Education , Dr. Mark Bullen sets out to analyze one particular online course to determine:

  • whether the students were actively participating, building on each other’s contributions, and thinking critically about the discussion topics; and
  • what factors affected student participation and critical thinking

The conclusion of this study, which had a very small sample size, was that students' discussions in the online course for the most part did not build upon one another, with most students adding comments independent of those of their classmates.

It is important to note that Dr. Bullen undertook this study in the nascent stages of online education, using technology that was rudimentary in comparison to what is in use today. Advances such as live chat features, video conferencing, and "threaded" forums and discussion areas may all help facilitate interaction between students and faculty. Further, the instructor of the course noted that "he might have been able to stimulate some discussion if he had taken a more active role, challenging students to elaborate their positions and to compare them with those of other students."

Dr. Bullen has a Ph.D. in Adult Education, a Master’s degree in Educational Psychology and a B.Ed. from the University of British Columbia. He was also the Chief Editor of the Journal of Distance Education from 2006 to 2012.

Participation and Critical Thinking in Online University Distance Education


In the journal Quest , Drs. Gregg Bennett and Frederick P. Green undertake a thorough review of the available research on the phenomenon of online education as it compares to traditional classroom-based courses. The article identifies three key factors that collectively determine whether students in online and traditional learning environments will achieve the same outcomes: the instructor, the students, and the tools used for the course. Although the technology (such as the Learning Management System, or LMS) does play a role, it is only a well-designed course with a dedicated instructor and students who are motivated to learn that will, together, determine the success of an online program. Drs. Bennett and Green also find that collaboration, convenience, and easy access to additional resources are benefits that give online courses an advantage. The takeaway from the analysis as a whole is that with the right instruction — and the right students — it is possible to conduct courses online just as successfully as in a more traditional setting.

Dr. Bennett currently serves as a Professor of Health & Kinesiology at Texas A&M University where he was awarded the 2010 ING Professor of Excellence, while Dr. Green continues his scholarship focus on the relationship between leisure, leisure lifestyle and the community inclusion of marginalized groups at the University of Southern Mississippi.

Student Learning in the Online Environment: No Significant Difference?


Dr. Jennifer Jill Harman is an Associate Professor in the Psychology department at Colorado State University. In that role, she has written about the process of developing and teaching online courses from a personal perspective. Having taught in classrooms for more than 15 years, Dr. Harman was at first skeptical about the possibility of translating her rigorous psychology coursework to an online platform. However, with the right tools, she was able to create online courses that provided comparable learning outcomes and that, by their online nature, were accessible to more students. Dr. Harman was even able to incorporate counseling skills into her courses through the use of video conferencing tools like Skype and Google Hangouts.

Dr. Harman holds a PhD in Psychology from the University of Connecticut and has published work in the Journal of Family Psychology and Children & Youth Services Review, among others.

Online Versus Traditional Education: Is One Better Than the Other?


In the International Journal for the Scholarship of Teaching & Learning, Dr. Steven Stack has written about how learning outcomes differ in online and traditional educational settings. The study that Dr. Stack uses is particularly interesting because, unlike most other online learning studies, students were not able to choose whether they took an online or a classroom course, due to an error in the course selection process. As a result, a set of students took the same class with the same instructor, some online and some in the classroom. The data from this study of 64 students, with a nearly balanced gender ratio, showed that both sections of the course performed largely the same, with the online students outperforming the traditional students only slightly. Further, when students were asked to evaluate the course, the two groups gave nearly identical ratings for how much they learned and how they would rate the instructor.

Because of the unique “blind selection” data for this study, it is fascinating to note that the course delivery method, with the same instructor and the same materials, made little to no difference in how students perceived the course or how they performed on exams.
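Conclusions like "no significant difference" in studies of this kind typically rest on comparing the two sections' mean exam scores with a two-sample test. As an illustration only (the scores below are invented, not data from Dr. Stack's study), such a comparison can be sketched with Welch's t statistic:

```python
import statistics as st

def welch_t(a, b):
    """Welch's two-sample t statistic (robust to unequal variances)."""
    na, nb = len(a), len(b)
    se = (st.variance(a) / na + st.variance(b) / nb) ** 0.5
    return (st.mean(a) - st.mean(b)) / se

# Hypothetical exam scores for an online and a classroom section;
# purely illustrative numbers, not taken from the study.
online = [78, 85, 90, 72, 88, 81, 76, 84]
classroom = [75, 82, 88, 70, 86, 80, 74, 83]

t = welch_t(online, classroom)
# A |t| well below ~2 means the gap between the section means is
# indistinguishable from sampling noise at these sample sizes.
```

With a t statistic this small, the null hypothesis of equal means would not be rejected, which is the statistical shape of the "no significant difference" pattern these studies report.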

Dr. Stack continues to work as a Professor in the College of Liberal Arts & Sciences at Wayne State University, where his research interests include social risk and protective factors for suicide, cultural axes of nations and their link to public opinion on criminality and deviance, and the impact of the death penalty on homicide.

Learning Outcomes in an Online vs Traditional Course


In the International Journal of Instructional Technology and Distance Learning, Dr. Yuliang Liu directly addresses the idea of how well students learn in an online environment as opposed to a traditional classroom setting.

Unlike the study that Dr. Stack published, Dr. Liu's analysis used self-selected students at a Midwestern university. Subjects in one online and one traditional course, taught with the same learning objectives, were given pretests and posttests to assess their learning, as well as quizzes throughout the course. The study found that online students did measurably better on quizzes and in the course overall and had fewer complaints about the course. In fact, Dr. Liu concludes that “online instruction can be a viable alternative for higher education.”

Dr. Liu holds a PhD in Educational Psychology from Texas A&M University-Commerce. Prior to joining the faculty at Southern Illinois University Edwardsville, he taught both graduate and undergraduate courses, in classrooms and online, at Southeastern Oklahoma State University.

Effects of Online Instruction vs. Traditional Instruction on Students’ Learning


In the Journal of Online Learning and Teaching (JOLT), Dr. Maureen Hannay and Tracy Newvine conducted a study to assess student perceptions of their online learning experiences as compared to classroom courses. The study surveyed 217 students, most of them adults taking courses part-time, and found that this population largely preferred online learning and felt they were able to achieve more in an online environment. Students noted the convenience of online learning and the ability to balance school with other commitments, something of great importance to part-time students. Fifty-nine percent of respondents reported achieving higher grades in their online courses, while 57% indicated that they learned more in the online setting.

While this questionnaire may not hold all the answers to online vs. traditional education, it is certainly important to consider the views of students who have experienced both formats.

Dr. Hannay holds a Ph.D. in Industrial Relations and Human Resource Management from the University of Toronto and is a Professor of Management at Troy University, while Tracy Newvine is a Senior Lecturer in the Department of Criminal Justice at Troy University Global.

Perceptions of Distance Learning: A Comparison of Online and Traditional Learning


In another article from JOLT, Dr. Cindy Ann Dell, Christy Low, and Dr. Jeanine F. Wilker analyze student results from online and traditional sections of the same courses. Rather than relying on tests and student reporting, this analysis looks directly at the work handed in for the different courses and compares the quality. The study looked at both graduate and undergraduate courses, using different assignments for each analysis.

Ultimately, this study found that the quality of work turned in was not significantly different between the online and traditional courses. Rather, the more important indicator of student success was the method of instruction the teacher chose. The study concludes: “There are a few pedagogical variables that can have an influence including (1) the use of problem-based learning strategies, (2) the opportunity for students to engage in mediated communication with the instructor, (3) course and content information provided to students prior to class starting, (4) and the use of video provided to students by the instructor, to name a few.”

It can be easy to get bogged down in the technological specifics of online learning, but this analysis shows that any instructor can excel in the online space with the right resources and attention to methodology.

Dr. Cindy Ann Dell holds an EdD in Adult and Higher Education from Montana State University in Bozeman, while Dr. Jeanine F. Wilker holds a PhD in Education with a specialization in Professional Studies from Capella University. Christy Low currently works as an Instructional Designer at Old Dominion University.

Comparing Student Achievement in Online and Face-to-Face Class Formats

Guide to Online Education

  • Expert Advice for Online Students
  • Frequently Asked Questions About Online Education
  • Instructional Design in Online Programs
  • Learning Management Systems
  • Online Student Trends and Success Factors
  • Online Teaching Methods
  • Student Guide to Understanding and Avoiding Plagiarism
  • Student Services for Online Learners

IMAGES

  1. Online Education vs Traditional Education Essay Example

    online education vs traditional education research paper

  2. 🏆 Similarities between online and traditional education. The Six Key

    online education vs traditional education research paper

  3. Online Education vs Traditional Education: Things to know

    online education vs traditional education research paper

  4. The difference between online classes and traditional classes. Online

    online education vs traditional education research paper

  5. PPT

    online education vs traditional education research paper

  6. Traditional Learning vs. Online Learning

    online education vs traditional education research paper

VIDEO

  1. How to write debate writing. || Debate on "Offline Classes Are Better Than Online classes" ||

  2. What happen in traditional and online universities

  3. (PG) 1st semester education research paper-4 2023 (MA)

  4. Online Education vs Offline Education

  5. A Comprehensive Comparison: NEP Education vs Traditional Education

  6. Online Learning vs Traditional Education

COMMENTS

  1. Traditional Learning Compared to Online Learning During the COVID-19

    Online learning is defined as an educational strategy in which the learner is geographically distant from the teacher, and the entire educational process is conducted across the Internet and communication networks (Ali Ta'amneh, 2021). Despite the advantages of online learning, there are still numerous challenges for students, administration ...

  2. Online classes versus traditional classes? Comparison during COVID-19

    Study participants were provided with a questionnaire to do comparison between the online versus traditional method of education. Due to COVID-19 pandemic restrictions, traditional classroom teaching was shifted to online teaching. In traditional classroom setting, lecture duration ranged typically from 45 min to 1 h, with few minutes dedicated ...

  3. PDF Learning Outcomes in an online vs traditional course

    For example, a study of learning outcomes (exam scores) in online vs. traditional classes in microeconomics determined that students in the online class scored higher on the final exam than the traditional class (68.1% vs. 61.6%). However, the classes,

  4. A Comparative Analysis of Student Performance in an Online vs. Face-to

    Although it boasts several advantages over traditional education, online instruction still has its drawbacks, including limited communal synergies. Still, online education seems to be the path many students are taking to secure a degree. This study compared the effectiveness of online vs. traditional instruction in an environmental studies class.

  5. Online and face‐to‐face learning: Evidence from students' performance

    1.1. Related literature. Online learning is a form of distance education which mainly involves internet‐based education where courses are offered synchronously (i.e. live sessions online) and/or asynchronously (i.e. students access course materials online in their own time, which is associated with the more traditional distance education).

  6. Online vs. traditional learning in teacher education: a comparison of

    Online education programs are well established in higher education, including graduate level and non-traditional teacher education programs. However, there is a lack of substantial research into online programs for undergraduate students in comparable preparation programs.

  7. PDF Comparing Effectiveness Of Online and Traditional Teaching Using ...

    Abstract. This study was conducted to examine the effectiveness of online education. Two sections of Information Management Systems (IST 483) -Real Time Captioning Technology (I) were compared. Comparison of the two sections was based on the students' final letter grades. The results of the two-tailed T-test show that there were no ...

  8. Online vs in-person learning in higher education: effects on student

    This study is a comparative analysis of online distance learning and traditional in-person education at King Saud University in Saudi Arabia, with a focus on understanding how different ...

  9. PDF Effectiveness of traditional and online learning: comparative analysis

    traditional face-to-face learning have largely been overlooked" [11]. The aim of this paper is to examine Russian students' percep tion of traditional and online education processes. The objectives to achieve this aim are the following: to provide comparison of online learning to traditional one from student's perspective, to determine

  10. Online education in the post-COVID era

    Metrics. The coronavirus pandemic has forced students and educators across all levels of education to rapidly adapt to online learning. The impact of this — and the developments required to make ...

  11. (PDF) Effectiveness of traditional and online learning: comparative

    Due to changes caused by the COVID-19 pandemic, online education has been popularized and more fully developed. The pandemic has increased not only the importance of emergency online studying, but ...

  12. Integrating students' perspectives about online learning: a hierarchy

    This article reports on a large-scale (n = 987), exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students' perspective, and then determines their hierarchical significance. Seven factors--Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social ...

  13. A systematic review of research on online teaching and learning from

    1. Introduction. Online learning has been on the increase in the last two decades. In the United States, though higher education enrollment has declined, online learning enrollment in public institutions has continued to increase (Allen & Seaman, 2017), and so has the research on online learning.There have been review studies conducted on specific areas on online learning such as innovations ...

  14. A Comparison of Student Learning Outcomes: Online Education vs

    A Comparison of Student Learning Outcomes: Online Education vs. Traditional Classroom Instruction. Despite the prevalence of online learning today, it is often viewed as a less favorable option when compared to the traditional, in-person educational experience. Criticisms of online learning come from various sectors, like employer groups ...

  15. (PDF) Traditional vs. Online Education

    The study says that 78% of more than 1,000 students surveyed still believe it is easier to learn in a classroom. When it comes to the effectiveness of online education, studies show that the ...

  16. Classroom and Online Learning: Teaching Research Methods

    enrollment growth; specifically, in higher education, online enrollments have grown 21%, whereas growth for traditional classroom instruction registers only 2% since 2002 (Allen & Seaman, 2007). Concurrent with the expansion of online education, higher education programs today are wresding with how to respond to ever-increasing accountability ...

  17. (PDF) Online vs. traditional learning: A comparative analysis of

    Results: A total of 98 students of Bahria University Dental College participated. A comparison between grade scores of online and traditional learning groups reported statistically significant ...

  18. Traditional Learning versus E-Learning by Libron Kelmendi

    Abstract. The axis of this research paper is to compare and contrast the methods of traditional learning in classroom and E-Learning. The topic of this research paper appeared while considering the constant growing trend of technology and as a consequence of the current trends, the need for change to the methods of learning and teaching appears.

  19. Look Who's Talking: Traditional vs. Online Education

    In the journal Quest, Drs. Gregg Bennett and Frederick P. Green undertake a thorough review of the available research on the phenomenon of online education as it compares to traditional classroom-based courses. The article identifies three key factors that collectively determine whether students in online and traditional learning environments ...

  20. Effectiveness of Online Vs. Traditional Classroom Teaching for

    Traditional learning was more preferred among students in this study, so students need to be more careful in every way depending on online courses unless system readiness, skillful testing, a rigorous feedback system and a mass strategy can lead to this path in future. Objective: To find effectiveness between online and traditional classroom teaching methods among final year class of MBBS.

  21. Students' perception and preference for online education in India

    These facts clearly show us that online learning is a perfect substitute for the traditional classroom learning if they are designed suitably. Educational institutions in India have also made a transition to online teaching environment soon after Union Government's decision to impose nation-wide lock-down for 21 days from 25th March, 2020 ...

  22. (PDF) Education: Traditional Vs. Modern Perspective

    American Research Journal of English and Literature Original Article. ISSN 2378-9026 Volume 1, Issue 2, April-2015. www.arjonline.org 1. Education: Traditional Vs. Modern Perspective. Sofe Ahmed 1 ...

  23. Online Programs vs. Traditional Education

    The effectiveness of online learning compared to a traditional classroom setting is subjective and varies based on individual learning styles, objectives, and the quality of the educational material and instruction. Some find online learning more conducive to their needs, while others benefit more from in-person interaction and guidance.