• Review article
  • Open access
  • Published: 22 January 2020

Mapping research in student engagement and educational technology in higher education: a systematic evidence map

  • Melissa Bond   ORCID: orcid.org/0000-0002-8267-031X 1 ,
  • Katja Buntins 2 ,
  • Svenja Bedenlier 1 ,
  • Olaf Zawacki-Richter 1 &
  • Michael Kerres 2  

International Journal of Educational Technology in Higher Education, volume 17, Article number: 2 (2020)


Abstract

Digital technology has become a central aspect of higher education, inherently affecting all aspects of the student experience. It has also been linked to an increase in behavioural, affective and cognitive student engagement, the facilitation of which is a central concern of educators. In order to delineate the complex nexus of technology and student engagement, this article systematically maps research from 243 studies published between 2007 and 2016. Research within the corpus was predominantly undertaken within the United States and the United Kingdom, with only limited research undertaken in the Global South, and largely focused on the fields of Arts & Humanities, Education, and Natural Sciences, Mathematics & Statistics. Studies most often used quantitative methods, followed by mixed methods, with qualitative research methods employed least often. Few studies provided a definition of student engagement, and fewer than half were guided by a theoretical framework. The courses investigated used blended learning and text-based tools (e.g. discussion forums) most often, with undergraduate students as the primary target group. Behavioural engagement was by far the most often identified dimension stemming from the use of educational technology, followed by affective and cognitive engagement. This mapping article provides the grounds for further exploration into discipline-specific use of technology to foster student engagement.

Introduction

Over the past decade, the conceptualisation and measurement of ‘student engagement’ has received increasing attention from researchers, practitioners, and policy makers alike. Seminal works such as Astin’s (1999) theory of involvement, Fredricks, Blumenfeld, and Paris’s (2004) conceptualisation of the three dimensions of student engagement (behavioural, emotional, cognitive), and sociocultural theories of engagement such as Kahu (2013) and Kahu and Nelson (2018), have done much to shape and refine our understanding of this complex phenomenon. However, criticism about the strength and depth of student engagement theorising remains (e.g. Boekaerts, 2016; Kahn, 2014; Zepke, 2018), the quality of which has had a direct impact on the rigour of subsequent research (Lawson & Lawson, 2013; Trowler, 2010), prompting calls for further synthesis (Azevedo, 2015; Eccles, 2016).

In parallel to this increased attention on student engagement, digital technology has become a central aspect of higher education, inherently affecting all aspects of the student experience (Barak, 2018; Henderson, Selwyn, & Aston, 2017; Selwyn, 2016). International recognition of the importance of ICT skills and digital literacy has been growing, alongside mounting recognition of its importance for active citizenship (Choi, Glassman, & Cristol, 2017; OECD, 2015a; Redecker, 2017), and the development of interdisciplinary and collaborative skills (Barak & Levenberg, 2016; Oliver & de St Jorre, 2018). Using technology has the potential to make teaching and learning processes more intensive (Kerres, 2013), improve student self-regulation and self-efficacy (Alioon & Delialioğlu, 2017; Bouta, Retalis, & Paraskeva, 2012), increase participation and involvement in courses as well as the wider university community (Junco, 2012; Salaber, 2014), and predict increased student engagement (Chen, Lambert, & Guidry, 2010; Rashid & Asghar, 2016). There is, however, no guarantee of active student engagement as a result of using technology (Kirkwood, 2009), with Tamim, Bernard, Borokhovski, Abrami, and Schmid’s (2011) second-order meta-analysis finding only a small to moderate impact on student achievement across 40 years. Rather, careful planning, sound pedagogy and appropriate tools are vital (Englund, Olofsson, & Price, 2017; Koehler & Mishra, 2005; Popenici, 2013), as “technology can amplify great teaching, but great technology cannot replace poor teaching” (OECD, 2015b, p. 4).

Due to its complexity, educational technology research has struggled to find a common definition and terminology with which to talk about student engagement, which has resulted in inconsistency across the field. For example, whilst 77% of articles reviewed by Henrie, Halverson, and Graham (2015) operationalised engagement from a behavioural perspective, most of the articles did not have a clearly defined statement of engagement, which is no longer considered acceptable in student engagement research (Appleton, Christenson, & Furlong, 2008; Christenson, Reschly, & Wylie, 2012). Relatedly, educational technology research has also lacked theoretical guidance (Al-Sakkaf, Omar, & Ahmad, 2019; Hew, Lan, Tang, Jia, & Lo, 2019; Lundin, Bergviken Rensfeldt, Hillman, Lantz-Andersson, & Peterson, 2018). A review of 44 randomly selected articles published in 2014 in the journals Educational Technology Research & Development and Computers & Education, for example, revealed that more than half had no guiding conceptual or theoretical framework (Antonenko, 2015), and only 13 out of 62 studies in a systematic review of flipped learning in engineering education reported theoretical grounding (Karabulut-Ilgu, Jaramillo Cherrez, & Jahren, 2018). Therefore, calls have been made for a greater understanding of the role that educational technology plays in affecting student engagement, in order to strengthen teaching practice and lead to improved outcomes for students (Castañeda & Selwyn, 2018; Krause & Coates, 2008; Nelson Laird & Kuh, 2005).

A reflection upon prior research in the field is a necessary first step towards meaningful discussion of how to foster student engagement in the digital age. In support of this aim, this article provides a synthesis of student engagement theory and research, and systematically maps empirical higher education research on student engagement and educational technology published between 2007 and 2016. Synthesising the vast body of literature on student engagement (for previous literature and systematic reviews, see Additional file 1), this article develops “a tentative theory” in the hope of “plot[ting] the conceptual landscape…[and chart] possible routes to explore it” (Antonenko, 2015, pp. 57–67) for researchers, practitioners, learning designers, administrators and policy makers. It then discusses student engagement against the background of educational technology research, exploring prior literature and systematic reviews. The systematic review search method is then outlined, followed by the presentation and discussion of findings.

Literature review

What is student engagement?

Student engagement has been linked to improved achievement, persistence and retention (Finn, 2006; Kuh, Cruce, Shoup, Kinzie, & Gonyea, 2008), with disengagement having a profound effect on student learning outcomes and cognitive development (Ma, Han, Yang, & Cheng, 2015), and being a predictor of student dropout in both secondary school and higher education (Finn & Zimmer, 2012). Student engagement is a multifaceted and complex construct (Appleton et al., 2008; Ben-Eliyahu, Moore, Dorph, & Schunn, 2018), which some have called a ‘meta-construct’ (e.g. Fredricks et al., 2004; Kahu, 2013), and likened to blind men describing an elephant (Baron & Corbin, 2012; Eccles, 2016). There is ongoing disagreement about whether there are three components (e.g. Eccles, 2016)—affective/emotional, cognitive and behavioural—or whether there are four, with the recently suggested additions of agentic engagement (Reeve, 2012; Reeve & Tseng, 2011) and social engagement (Fredricks, Filsecker, & Lawson, 2016). There has also been confusion as to whether the terms ‘engagement’ and ‘motivation’ can and should be used interchangeably (Reschly & Christenson, 2012), especially when used by policy makers and institutions (Eccles & Wang, 2012). However, the prevalent understanding across the literature is that motivation is an antecedent to engagement; it is the intent and unobservable force that energises behaviour (Lim, 2004; Reeve, 2012; Reschly & Christenson, 2012), whereas student engagement is energy and effort in action; an observable manifestation (Appleton et al., 2008; Eccles & Wang, 2012; Kuh, 2009; Skinner & Pitzer, 2012), evidenced through a range of indicators.

Whilst it is widely accepted that no one definition exists that will satisfy all stakeholders (Solomonides, 2013), and no one project can possibly examine every sub-construct of student engagement (Kahu, 2013), it is important for each research project to begin with a clear definition of its own understanding (Boekaerts, 2016). Therefore, in this project, student engagement is defined as follows:

Student engagement is the energy and effort that students employ within their learning community, observable via any number of behavioural, cognitive or affective indicators across a continuum. It is shaped by a range of structural and internal influences, including the complex interplay of relationships, learning activities and the learning environment. The more students are engaged and empowered within their learning community, the more likely they are to channel that energy back into their learning, leading to a range of short- and long-term outcomes that can likewise further fuel engagement.

Dimensions and indicators of student engagement

There are three widely accepted dimensions of student engagement: affective, cognitive and behavioural. Within each dimension there are several indicators of engagement (see Additional file 2), as well as of disengagement (see Additional file 2), which is now seen as a separate and distinct construct from engagement. It should be noted, however, that whilst these indicators have been drawn from a range of literature, this is not a finite list, and it is recognised that students might experience these indicators on a continuum at varying times (Coates, 2007; Payne, 2017), depending on their valence (positive or negative) and activation (high or low) (Pekrun & Linnenbrink-Garcia, 2012). There has also been disagreement over which dimension the indicators align with. For example, Järvelä, Järvenoja, Malmberg, Isohätälä, and Sobocinski (2016) argue that ‘interaction’ extends beyond behavioural engagement, covering the cognitive and/or emotional dimensions, as it involves collaboration between students, and Lawson and Lawson (2013) believe that ‘effort’ and ‘persistence’ are cognitive rather than behavioural constructs, as they “represent cognitive dispositions toward activity rather than an activity unto itself” (p. 465), which is represented in the table through the indicator ‘stay on task/focus’ (see Additional file 2). Further consideration of these disagreements represents an area for future research, however, as they are beyond the scope of this paper.

Student engagement within educational technology research

The potential of educational technology to improve student engagement has long been recognised (Norris & Coutas, 2014); however, it is not merely a case of technology plus students equals engagement. Without careful planning and sound pedagogy, technology can promote disengagement and impede rather than help learning (Howard, Ma, & Yang, 2016; Popenici, 2013). Whilst still a young area, most of the research undertaken to gain insight into this has focused on undergraduate students (e.g. Henrie et al., 2015; Webb, Clough, O’Reilly, Wilmott, & Witham, 2017), with Chen et al. (2010) finding a positive relationship between the use of technology and student engagement, particularly earlier in university study. Research has also been predominantly focused on STEM and medicine (e.g. Li, van der Spek, Feijs, Wang, & Hu, 2017; Nikou & Economides, 2018), with at least five literature or systematic reviews published in the last 5 years focused on medicine, and nursing in particular (see Additional file 3). This indicates that further synthesis is needed of research in other disciplines, such as Arts & Humanities and Education, as well as further investigation into whether research continues to focus on undergraduate students.

The five most researched technologies in Henrie et al.’s (2015) review were online discussion boards, general websites, learning management systems (LMS), general campus software and videos, as opposed to Schindler, Burkholder, Morad, and Marsh’s (2017) literature review, which concentrated on social networking sites (Facebook and Twitter), digital games, wikis, web-conferencing software and blogs. Schindler et al. found that most of these technologies had a positive impact on multiple indicators of student engagement across the three dimensions of engagement, with digital games, web-conferencing software and Facebook the most effective. However, it must be noted that they only considered seven indicators of student engagement, a scope that future reviews could extend by considering further indicators. Other reviews that have found at least a small positive impact on student engagement include those focused on audience response systems (Hunsu, Adesope, & Bayly, 2016; Kay & LeSage, 2009), mobile learning (Kaliisa & Picard, 2017), and social media (Cheston, Flickinger, & Chisolm, 2013). Specific indicators of engagement that increased as a result of technology include interest and enjoyment (Li et al., 2017), improved confidence (Smith & Lambert, 2014) and attitudes (Nikou & Economides, 2018), as well as enhanced relationships with peers and teachers (e.g. Alrasheedi, Capretz, & Raza, 2015; Atmacasoy & Aksu, 2018).

Literature and systematic reviews focused on student engagement and technology do not always include information on where studies have been conducted. Out of 27 identified reviews (see Additional file 3), only 14 report the countries included, and two of these were explicitly focused on a specific region or country, namely Africa and Turkey. Most of the research has been conducted in the USA, followed by the UK, Taiwan, Australia and China. Table 1 depicts the three countries from which most studies originated in the respective reviews, and highlights a clear lack of research conducted within mainland Europe, South America and Africa. Whilst this could be due to the choice of databases in which the literature was searched for, it nevertheless highlights a substantial gap in the literature, and to that end, it will be interesting to see whether this review is able to substantiate or contradict these trends.

Research into student engagement and educational technology has predominantly used a quantitative methodology (see Additional file 3), with 11 literature and systematic reviews reporting that surveys, particularly self-report Likert-scale instruments, are the most used source of measurement (e.g. Henrie et al., 2015). Reviews that have included research using a range of methodologies have found only a limited number of studies employing qualitative methods (e.g. Connolly, Boyle, MacArthur, Hainey, & Boyle, 2012; Kay & LeSage, 2009; Lundin et al., 2018). This has led to calls for further qualitative research exploring student engagement and technology, as well as for more rigorous research designs (e.g. Li et al., 2017; Nikou & Economides, 2018), including sampling strategies and data collection, and in experimental studies in particular (Cheston et al., 2013; Connolly et al., 2012). However, not all reviews included information on the methodologies used. Crook (2019), in his recent editorial in the British Journal of Educational Technology, stated that research methodology is a “neglected topic” (p. 487) within educational technology research, and stressed its importance for conducting studies that delve deeper into phenomena (e.g. longitudinal studies).

Therefore, this article presents an initial “evidence map” (Miake-Lye, Hempel, Shanman, & Shekelle, 2016, p. 19) of systematically identified literature on student engagement and educational technology within higher education, undertaken through a systematic review, in order to address the issues raised by prior research and to identify research gaps. These issues include the disparity between fields of study and study levels researched, the geographical distribution of studies, the methodologies used, and the theoretical fuzziness surrounding student engagement. This article, however, is intended to provide an initial overview of the systematic review method employed, as well as of the overall corpus. Further synthesis of possible correlations between student engagement and disengagement indicators and the co-occurrence of technology tools will be undertaken within field-of-study-specific articles (e.g. Bedenlier, 2020b; Bedenlier, 2020a), allowing more meaningful guidance on applying the findings in practice.

The following research questions guide this enquiry:

1. How do the studies in the sample ground student engagement and align with theory?

2. Which indicators of cognitive, behavioural and affective engagement were identified in studies where educational technology was used? Which indicators of student disengagement?

3. What are the learning scenarios, modes of delivery and educational technology tools employed in the studies?

Overview of the study

With the intent to systematically map empirical research on student engagement and educational technology in higher education, we conducted a systematic review. A systematic review is an explicitly and systematically conducted literature review that answers a specific question through applying a replicable search strategy, with studies then included or excluded based on explicit criteria (Gough, Oliver, & Thomas, 2012). Studies included for review are then coded and synthesised into findings that shed light on gaps, contradictions or inconsistencies in the literature, as well as providing guidance on applying findings in practice. This contribution maps the research corpus of 243 studies that were identified through a systematic search and ensuing random parameter-based sampling.

Search strategy and selection procedure

The initial inclusion criteria for the systematic review were peer-reviewed articles in the English language, empirically reporting on students and student engagement in higher education, and making use of educational technology. The search was limited to records between 1995 and 2016, chosen due to the implementation of the first Virtual Learning Environments and Learning Management Systems within higher education (see Bond, 2018). Articles were limited to those published in peer-reviewed journals, due to the rigorous process under which they are published and their trustworthiness in academia (Nicholas et al., 2015), although concerns within the scientific community about the peer-review process are acknowledged (e.g. Smith, 2006).

Discussion arose on how to approach the “hard-to-detect” (O’Mara-Eves et al., 2014, p. 51) concept of student engagement with regard to sensitivity versus precision (Brunton, Stansfield, & Thomas, 2012), particularly in light of engagement being Henrie et al.’s (2015) most important search term. The decision was made that the concept ‘student engagement’ would be identified from titles and abstracts at a later stage, during the screening process. This approach was intended to capture articles that are indeed concerned with student engagement but use different terms to describe the concept. Given the nature of student engagement as a meta-construct (e.g. Appleton et al., 2008; Christenson et al., 2012; Kahu, 2013), limiting the search to articles including the term engagement might have missed important research on other elements of student engagement. Hence, we opted for recall over precision. According to Gough et al. (2012, p. 13), “electronic searching is imprecise and captures many studies that employ the same terms without sharing the same focus”, whereas a narrower search would lead to disregarding studies that analyse the construct but use different terms to describe it.

With this in mind, the search strategy to identify relevant studies was developed iteratively with support from the University Research Librarian. Following the standard approach outlined in O’Mara-Eves et al. (2014), we used reviewer knowledge—strongly supported in this case not only by reviewer knowledge but also by certified expertise—and previous literature (e.g. Henrie et al., 2015; Kahu, 2013) to elicit concepts of potential importance under the topics student engagement, higher education and educational technology. The final search string (see Fig. 1) encompasses clusters of different educational technologies that were searched for separately, in order to avoid an overly long search string. It was decided not to include any brand names (e.g. Facebook, Twitter, Moodle), as it was again reasoned that scientific publications would use the broader term (e.g. social media). The final search string was slightly adapted, e.g. the format required for truncations or wildcards, according to the settings of each database being used. Footnote 1

Figure 1. Final search terms used in the systematic review

Four databases (ERIC, Web of Science, Scopus and PsycINFO) were searched in July 2017, and three researchers and a student assistant screened the titles and abstracts of the retrieved references between August and November 2017, using EPPI Reviewer 4.0. An initial 77,508 references were retrieved, and after the elimination of duplicate records, 53,768 references remained (see Fig. 2). A first cursory screening of records revealed that older research was more concerned with technologies that are now considered outdated (e.g. overhead projectors, floppy disks). Therefore, we opted to adjust the period to include research published between 2007 and 2016, a phase of research and practice labelled ‘online learning in the digital age’ (Bond, 2018). Whilst we initially opted for recall over precision, the decision was then made to search for specific facets of the student engagement construct (e.g. deep learning, interest and persistence) within EPPI-Reviewer, in order to further refine the corpus. These adaptations left 18,068 records.

Figure 2. Systematic review PRISMA flow chart (slightly modified after Brunton et al., 2012, p. 86; Moher, Liberati, Tetzlaff, & Altman, 2009, p. 8)

Four researchers screened the first 150 titles and abstracts, in order to iteratively establish a joint understanding of the inclusion criteria. The remaining references were distributed equally amongst the screening team, which resulted in the inclusion of 4152 potentially relevant articles. Given the large number of articles for full-text screening, and facing constrained time as a condition of project-based, funded work, it was decided that a sample of articles would be drawn from this corpus for further analysis. With the intention of drawing a sample that estimates the population parameters within a predetermined error range, we used methods of sample size estimation from the social sciences (Kupper & Hafner, 1989). To do so, the R package MBESS (Kelley, Lai, Lai, & Suggests, 2018) was used. Accepting a 5% error range, an assumed proportion of 0.5 and an alpha of 5%, 349 articles were sampled, with the sample then stratified by publishing year, as student engagement has become much more prevalent (Zepke, 2018) and educational technology has become more differentiated within the last decade (Bond, 2018). Two researchers screened the first 100 articles on full text, reaching an agreement of 88% on inclusion/exclusion. The researchers then discussed the discrepancies and came to an agreement on the remaining 12%. It was decided that further comparison screening was needed to increase the level of reliability. After screening the sample on full text, 232 articles remained for data extraction, which contained 243 studies.
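As an illustration of the sampling step, the classic normal-approximation formula for estimating a proportion, with a finite-population correction for the 4152 candidate articles, can be sketched as follows. This is a sketch for intuition only: the review itself used Kupper and Hafner's (1989) approach via the R package MBESS, which yields the reported 349 rather than the slightly larger figure this textbook formula produces.

```python
import math
from statistics import NormalDist

def sample_size(margin=0.05, p=0.5, alpha=0.05, population=None):
    """Normal-approximation sample size for estimating a proportion,
    optionally with a finite-population correction (textbook formula,
    not the Kupper & Hafner method used in the review)."""
    z = NormalDist().inv_cdf(1 - alpha / 2)        # 1.96 for alpha = 0.05
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population size
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)      # finite-population correction
    return math.ceil(n0)

# 5% error range, assumed proportion 0.5, alpha of 5%, 4152 candidate articles
print(sample_size(population=4152))  # close to, but not exactly, the reported 349
```

Choosing p = 0.5 maximises p(1 − p) and therefore gives the most conservative (largest) sample size when the true proportion is unknown.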

Data extraction process

In order to extract the article data, an extensive coding system was developed, including codes to extract information on the set-up and execution of the study (e.g. methodology, study sample) as well as information on the learning scenario, the mode of delivery and the educational technology used. Learning scenarios included broader pedagogies, such as social collaborative learning and self-determined learning, but also specific pedagogies such as flipped learning, given the increasing number of studies and interest in these approaches (e.g. Lundin et al., 2018). Specific examples of student engagement and/or disengagement were coded under cognitive, affective or behavioural (dis)engagement. The facets of student (dis)engagement were identified based on the literature review undertaken (see Additional file 2), and applied in this detailed manner to capture not only the overarching dimensions of the concept but also their diverse sub-meanings. New indicators also emerged during the coding process which had not initially been identified from the literature review, including ‘confidence’ and ‘assuming responsibility’. The 243 studies were coded with this extensive code set, and any disagreements that occurred between the coders were reconciled. Footnote 2

As over 50 individual educational technology applications and tools were identified in the 243 studies, in line with results found in other large-scale systematic reviews (e.g. Lai & Bower, 2019), concerns were raised over how the research team could meaningfully analyse and report the results. The decision was therefore made to employ Bower’s (2016) typology of learning technologies (see Additional file 4), in order to channel the tools into groups that share the same characteristics or “structure of information” (Bower, 2016, p. 773). Whilst it is acknowledged that some of the technology could be classified into more than one type within the typology—e.g. wikis can be used for individual composition, for collaborative tasks, or for knowledge organisation and sharing—“the type of learning that results from the use of the tool is dependent on the task and the way people engage with it rather than the technology itself”; therefore, “the typology is presented as descriptions of what each type of tool enables and example use cases rather than prescriptions of any particular pedagogical value system” (Bower, 2016, p. 774). For further elaboration on each category, please see Bower (2015).

Study characteristics

Geographical characteristics

The systematic mapping reveals that the 243 studies were set in 33 different countries, whilst seven studies investigated settings in an international context, and three studies did not indicate their country setting. In 2% of the studies, the country was allocated based on the authors’ country of origin, where the authors came from the same country. The top five countries account for 158 studies (see Fig. 3), with 35.4% (n = 86) of studies conducted in the United States (US), 10.7% (n = 26) in the United Kingdom (UK), 7.8% (n = 19) in Australia, 7.4% (n = 18) in Taiwan, and 3.7% (n = 9) in China. Across the corpus, studies from countries where English is the official or one of the official languages account for 59.7% of the entire sample, followed by East Asian countries, which in total account for 18.8% of the sample. With the exception of the UK, European countries are largely absent from the sample: only 7.3% of the articles originate from this region, with countries such as France, Belgium, Italy and Portugal having no studies, and countries such as Germany and the Netherlands having one each. With eight articles, Spain is the most prolific European country outside of the UK. The geographical distribution of study settings also clearly shows an almost complete absence of studies undertaken within African contexts, with five studies from South Africa and one from Tunisia. Studies from South-East Asia, the Middle East, and South America are likewise low in number in this review. Whilst the global picture evokes an imbalance, this might be partially due to our search and sampling strategy, which focused on English-language journals indexed in four primarily Western-focused databases.

Figure 3. Percentage deviation from the average relative frequencies of the different data collection formats per country (≥ 3 articles). Note. NS = not stated; AUS = Australia; CAN = Canada; CHN = China; HKG = Hong Kong; inter = international; IRI = Iran; JAP = Japan; MYS = Malaysia; SGP = Singapore; ZAF = South Africa; KOR = South Korea; ESP = Spain; SWE = Sweden; TWN = Taiwan; TUR = Turkey; GBR = United Kingdom; USA = United States of America

Methodological characteristics

Within this literature corpus, 103 studies (42%) employed quantitative methods, 84 (35%) mixed methods, and 56 (23%) qualitative methods. Relating these numbers back to the contributing countries, different preferences for and frequencies of the methods used become apparent (see Fig. 3). As a general tendency, mixed methods and qualitative research occur more often in Western countries, whereas quantitative research is the preferred method in East Asian countries. For example, studies originating from Australia employ mixed methods research 28% more often than average, whereas Singapore is far below average in mixed methods research, at 34.5% less than the other countries in the sample. In Taiwan, mixed methods studies are conducted 23.5% below average and qualitative research 6.4% less often than average; quantitative research, however, occurs 29.8% more often than average.

Amongst the qualitative studies, qualitative content analysis (n = 30) was the most frequently used analysis approach, followed by thematic analysis (n = 21) and grounded theory (n = 12). In many cases, however, the exact analysis approach was not reported (n = 37), could not be allocated to a specific classification (n = 22), or no method of analysis was identifiable (n = 11). Within studies using quantitative methods, mean comparison was used in 100 studies, frequency data was collected and analysed in 83 studies, and regression models were used in 40 studies. Furthermore, looking at the correlations between the different analysis approaches, only one significant correlation can be identified, between mean comparison and frequency data (−.246). Beyond that, correlations are small; for example, in only 14% of the studies are both mean comparisons and regression models employed.
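Correlations between binary use/non-use indicators of this kind are conventionally phi coefficients computed over a 2 × 2 co-occurrence table. A minimal sketch with invented counts (not the review's data) might look like:

```python
import math

def phi_coefficient(n11, n10, n01, n00):
    """Phi coefficient for a 2x2 table of two binary indicators,
    e.g. whether a study used mean comparison and/or frequency data.
    n11 = both present, n10/n01 = only one present, n00 = neither."""
    row1, row0 = n11 + n10, n01 + n00   # marginals for indicator A
    col1, col0 = n11 + n01, n10 + n00   # marginals for indicator B
    denom = math.sqrt(row1 * row0 * col1 * col0)
    return (n11 * n00 - n10 * n01) / denom if denom else 0.0

# Hypothetical counts: 10 studies used both approaches, 20 only the first,
# 30 only the second, 40 neither (illustrative numbers only)
print(round(phi_coefficient(10, 20, 30, 40), 3))
```

A negative phi, as in the mean comparison/frequency data pair reported above, indicates that studies using one approach were somewhat less likely to also use the other.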

Study population characteristics

Research in the corpus focused on universities as the prime institution type (n = 191, 79%), followed by 24 (10%) non-specified institution types, and colleges (n = 21, 8.2%) (see Fig. 4). Five studies (2%) included institutions classified as ‘other’, and two studies (0.8%) included both college and university students. The most frequently studied student population was undergraduate students (60%, n = 146), as opposed to 33 studies (14%) focused on postgraduate students (see Fig. 6). A combination of undergraduate and postgraduate students was the subject of interest in 23 studies (9%), with 41 studies (17%) not specifying the level of study of research participants.

figure 4

Relative frequencies of study field by country, for countries with ≥3 articles. Note. Country abbreviations are as per Figure 4. A&H = Arts & Humanities; BA&L = Business, Administration and Law; EDU = Education; EM&C = Engineering, Manufacturing & Construction; H&W = Health & Welfare; ICT = Information & Communication Technologies; ID = interdisciplinary; NS,M&S = Natural Science, Mathematics & Statistics; NS = Not specified; SoS = Social Sciences, Journalism & Information

Based on the UNESCO (2015) ISCED classification, eight broad study fields are covered in the sample, with Arts & Humanities (42 studies), Education (42 studies), and Natural Sciences, Mathematics & Statistics (37) being the top three study fields, followed by Health & Welfare (30 studies), Social Sciences, Journalism & Information (22), Business, Administration & Law (19 studies), Information & Communication Technologies (13), Engineering, Manufacturing & Construction (11), and another 26 studies of interdisciplinary character. One study did not specify a field of study.

An expected value was calculated for how the studies in each discipline should be distributed across countries. The actual deviation from this value showed that several Asian countries are home to more articles in the field of Arts & Humanities than expected: Japan with 3.3 articles more, China with 5.4 and Taiwan with 5.9. Internationally located research also shows 2.3 more interdisciplinary studies than expected, whereas studies in the Social Sciences occur more often than expected in the UK (5.7 more articles) and Australia (3.3 more articles) but less often than expected across all other countries. Interestingly, the USA has 9.9 fewer studies in Arts & Humanities than expected but 5.6 more articles than expected in Natural Science.
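The expected values used here (and in the tool-usage analyses later in the article) follow the standard contingency-table formula E = (row total × column total) / N, with deviations reported as observed minus expected. A minimal sketch with hypothetical counts (the table and labels are illustrative, not the review's data):

```python
def expected_counts(table):
    """Expected cell counts under independence: E = row_total * col_total / N."""
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    return [[r * c / n for c in col_totals] for r in row_totals]

# Hypothetical counts: rows = disciplines, columns = countries
observed = [[30, 10],   # e.g. Arts & Humanities
            [20, 20]]   # e.g. Natural Sciences
expected = expected_counts(observed)                     # [[25.0, 15.0], [25.0, 15.0]]
deviation = [[o - e for o, e in zip(orow, erow)]
             for orow, erow in zip(observed, expected)]  # [[5.0, -5.0], [-5.0, 5.0]]
```

A positive deviation then reads as "more studies than expected under independence", matching the phrasing used throughout the results.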

Question One: How do the studies in the sample ground student engagement and align with theory?

Defining student engagement

It is striking that almost all of the studies ( n  = 225, 93%) in this corpus lack a definition of student engagement, with only 18 (7%) articles attempting to define the concept. However, this is not too surprising, as the search strategy was set up with the assumption that researchers investigating student engagement (dimensions and indicators) would not necessarily label them as student engagement. When developing their definitions, authors in these 18 studies referenced 22 different sources, with the work of Kuh and colleagues e.g., (Hu & Kuh, 2002 ; Kuh, 2001 ; Kuh et al., 2006 ), as well as Astin ( 1984 ), being the only authors referred to more than once. The most popular definition of student engagement within these studies was that of active participation and involvement in learning and university life e.g., (Bolden & Nahachewsky, 2015 ; Fukuzawa & Boyd, 2016 ), which was also found by Joksimović et al. ( 2018 ) in their review of MOOC research. Interaction, especially between peers and with faculty, was the next most prevalent definition e.g., (Andrew, Ewens, & Maslin-Prothero, 2015 ; Bigatel & Williams, 2015 ). Time and effort was given as a definition in four studies (Gleason, 2012 ; Hatzipanagos & Code, 2016 ; Price, Richardson, & Jelfs, 2007 ; Sun & Rueda, 2012 ), with expending physical and psychological energy (Ivala & Gachago, 2012 ) another definition. This variance in definitions and sources reflects the ongoing complexity of the construct (Zepke, 2018 ), and serves to reinforce the need for a clearer understanding across the field (Schindler et al., 2017 ).

Theoretical underpinnings

Reflecting findings from other systematic and literature reviews on the topic (Abdool, Nirula, Bonato, Rajji, & Silver, 2017 ; Hunsu et al., 2016 ; Kaliisa & Picard, 2017 ; Lundin et al., 2018 ), 59% ( n  = 143) of studies did not employ a theoretical model in their research. Of the 41% ( n  = 100) that did, 18 studies drew on social constructivism, followed by the Community of Inquiry model ( n  = 8), Sociocultural Learning Theory ( n  = 5), and Community of Practice models ( n  = 4). These findings also reflect the state of the field in general (Al-Sakkaf et al., 2019 ; Bond, 2019b ; Hennessy, Girvan, Mavrikis, Price, & Winters, 2018 ).

Another interesting finding of this research is that whilst 144 studies (59%) provided research questions, 99 studies (41%) did not. Although it is recognised that not all studies have research questions (Bryman, 2007 ), or only develop them throughout the research process, as in grounded theory (Glaser & Strauss, 1967 ), a surprising number of quantitative studies (36%, n  = 37) did not have research questions. This may reflect the lack of theoretical guidance, as 30 of these 37 studies also did not draw on a theoretical or conceptual framework.

Question 2: Which indicators of cognitive, behavioural and affective engagement were identified in studies where educational technology was used? Which indicators of student disengagement were identified?

Student engagement indicators

Within the corpus, the behavioural engagement dimension was documented in some form in 209 studies (86%), whereas the dimension of affective engagement was reported in 163 studies (67%) and the cognitive dimension in only 136 studies (56%). However, the ten most often identified student engagement indicators across the studies overall (see Table  2 ) were evenly distributed over all three dimensions (see Table  3 ). The indicators participation/interaction/involvement , achievement and positive interactions with peers and teachers each appear in at least 100 studies, almost double the number of the next most frequent student engagement indicator.

Across the 243 studies in the corpus, 117 (48%) showed all three dimensions of affective, cognitive and behavioural student engagement e.g., (Szabo & Schwartz, 2011 ), including six studies that used established student engagement questionnaires, such as the NSSE (e.g., Delialioglu, 2012 ), or self-developed questionnaires addressing these three dimensions. Another 54 studies (22%) displayed at least two student engagement dimensions e.g., (Hatzipanagos & Code, 2016 ), including six questionnaire studies. Only one student engagement dimension was exhibited in 71 studies (29%) e.g., (Vural, 2013 ).

Student disengagement indicators

Indicators of student disengagement (see Table  4 ) were identified considerably less often across the corpus. This could be explained by the purpose of the studies being primarily to address or measure positive engagement, but it could also be due to a form of self-selection or publication bias, as studies with negative results are reported and/or published less frequently. The three disengagement indicators most often identified were frustration ( n  = 33, 14%) e.g., (Ikpeze, 2007 ), opposition/rejection ( n  = 20, 8%) e.g., (Smidt, Bunk, McGrory, Li, & Gatenby, 2014 ), and disappointment e.g., (Granberg, 2010 ), as well as other affective disengagement ( n  = 18, 7% each).

Technology tool typology and engagement/disengagement indicators

Across the 243 studies, a plethora of over 50 individual educational technology tools were employed. The top five most frequently researched tools were LMS ( n  = 89), discussion forums ( n  = 80), videos ( n  = 44), recorded lectures ( n  = 25), and chat ( n  = 24). Following a slightly modified version of Bower’s ( 2016 ) educational tools typology, 17 broad categories of tools were identified (see Additional file 4 for classification, and 3.2 for further information). The frequency with which tools from the respective groups were employed in studies varied considerably (see Additional file 4 ), with the top five categories being text-based tools ( n  = 138), followed by knowledge organisation & sharing tools ( n  = 104), multimodal production tools ( n  = 89), assessment tools ( n  = 65) and website creation tools ( n  = 29).

Figure  5 shows what percentage of each engagement dimension (e.g., affective engagement or cognitive disengagement) was fostered through each specific technology type. Given the results in 4.2.1 on student engagement, it was somewhat unsurprising that text-based tools , knowledge organisation & sharing tools, and multimodal production tools had the highest proportions of affective, behavioural and cognitive engagement. For example, affective engagement was identified in 163 studies, 63% of which used text-based tools (e.g., Bulu & Yildirim, 2008 ), and cognitive engagement was identified in 136 studies, 47% of which used knowledge organisation & sharing tools e.g., (Shonfeld & Ronen, 2015 ). However, further analysis of studies employing discussion forums (a text-based tool ) revealed that, whilst the top affective and behavioural engagement indicators were found in almost two-thirds of studies (see Additional file  5 ), there was a substantial gap between these and the next most prevalent engagement indicator, with the same pattern (and indicators) emerging for wikis. This represents an area for future research.

figure 5

Engagement and disengagement by tool typology. Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S = knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools; AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML = mobile learning; VW = virtual worlds; LS = learning software; OL = online learning

Interestingly, studies using website creation tools reported more disengagement than engagement indicators across all three domains (see Fig.  5 ), with studies using assessment tools and social networking tools also reporting increased instances of disengagement across two domains (affective and cognitive, and behavioural and cognitive respectively). Twenty-three of the studies (79%) using website creation tools used blogs, with students showing, for example, disinterest in topics chosen e.g., (Sullivan & Longnecker, 2014 ), anxiety over their lack of blogging knowledge and skills e.g., (Mansouri & Piki, 2016 ), and continued avoidance of using blogs in some cases, despite introductory training e.g., (Keiller & Inglis-Jassiem, 2015 ). In studies where assessment tools were used, students found timed assessment stressful, particularly when trying to complete complex mathematical solutions e.g., (Gupta, 2009 ), as well as quizzes given at the end of lectures, with some students preferring time to absorb the content first e.g., (DePaolo & Wilkinson, 2014 ). Disengagement in studies where social networking tools were used indicated that some students found it difficult to express themselves in short posts e.g., (Cook & Bissonnette, 2016 ), that conversations lacked authenticity e.g., (Arnold & Paulus, 2010 ), and that some did not want to mix personal and academic spaces e.g., (Ivala & Gachago, 2012 ).

Question 3: What are the learning scenarios, modes of delivery and educational technology tools employed in the studies?

Learning scenarios

Social-collaborative learning (SCL) was the scenario most often employed, in 58.4% of the sample ( n  = 142), followed by self-directed learning (SDL), investigated in 43.2% of studies ( n  = 105), and game-based learning (GBL), used in 5.8% of studies ( n  = 14) (see Fig. 6 ). Studies coded as SCL included those exploring social learning (Bandura, 1971 ) and social constructivist approaches (Vygotsky, 1978 ). Personal learning environments (PLE) were found in 2.9% of studies, 1.3% of studies used other scenarios ( n  = 3), and another 13.2% did not specify their learning scenario ( n  = 32). It is noteworthy that in 45% of possible cases employing SDL scenarios, SCL was also used. Other learning scenarios were also mostly used in combination with SCL and SDL. Given the rising number of higher education studies exploring flipped learning (Lundin et al., 2018 ), studies exploring this approach were also specifically coded (3%, n  = 7).

figure 6

Co-occurrence of learning scenarios across the sample ( n  = 243). Note. SDL = self-directed learning; SCL = social collaborative learning; GBL = game-based learning; PLE = personal learning environments; other = other learning scenario

Modes of delivery

In 84% of studies ( n  = 204), a single mode of delivery was used, with blended learning the most researched (109 studies), followed by distance education (72 studies) and face-to-face instruction (55 studies). Of the remaining 39 studies, 12 did not indicate their mode of delivery, whilst the other 27 combined or compared modes of delivery, e.g. comparing face-to-face courses to blended learning, such as the study on using iPads in undergraduate nursing education by Davies ( 2014 ).

Educational technology tools investigated

Most studies in this corpus (55%) used technology asynchronously, with 12% of studies researching synchronous tools and 18% using both asynchronous and synchronous technology. This heavy reliance on asynchronous technology is not surprising. However, when looking at tool usage within studies in face-to-face contexts, the proportion of synchronous tools (31%) is almost as high as that of asynchronous tools (41%), whilst it is surprisingly low within studies in distance education (7%).

Tool categories were used in combination, with text-based tools most often used in combination with other technology types (see Fig.  7 ): for example, in 60% of all possible cases using multimodal production tools , in 69% of all possible synchronous collaboration tool cases, in 72% of all possible knowledge organisation & sharing tool cases, and in a striking 89% of all possible learning software cases and 100% of all possible MOOC cases. By contrast, text-based tools were never used in combination with games or data analysis tools . Gaming tools, however, were used in combination with assessment tools in 67% of possible cases. Assessment tools constitute somewhat of a special case where studies using website creation tools are concerned, with only 7% of possible cases having employed assessment tools .

figure 7

Co-occurrence of tools across the sample ( n  = 243). Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S = knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools; AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML = mobile learning; VW = virtual worlds; LS = learning software; OL = online learning

In order to gain further understanding of how educational technology was used, we examined how often a combination of two variables would be expected to occur in the sample and how often it actually occurred, with deviations described as either ‘more than’ or ‘less than’ the expected value. This provides further insight into potential gaps in the literature, which can inform future research. For example, an analysis of educational technology tool usage amongst study populations (see Fig.  8 ) reveals that 5.0 more studies than expected looked at knowledge organisation & sharing tools for graduate students, but 5.0 fewer studies than expected investigated assessment tools for this group. By contrast, 5 more studies than expected researched assessment tools for unspecified study levels, and 4.3 fewer studies than expected employed knowledge organisation & sharing tools for undergraduate students.

figure 8

Relative frequency of educational technology tools used according to study level. Note. Abbreviations are explained in Fig. 7

Educational technology tools were also used differently from the expected pattern within various fields of study (see Fig.  9 ), most obviously for the top five tools, but also for virtual worlds , found in 5.8 more Health & Welfare studies than expected, and learning software , used in 6.4 more Arts & Humanities studies than expected. In all other disciplines, learning software was used less often than expected. Text-based tools were used more often than expected in fields of study that are already text-intensive, including Arts & Humanities, Education, Business, Administration & Law, as well as Social Sciences, but less often than expected in fields such as Engineering, Health & Welfare, and Natural Sciences, Mathematics & Statistics. Multimodal production tools were used more often than expected only in Health & Welfare, ICT and Natural Sciences, and less often than expected across all other disciplines. Assessment tools deviated most clearly, with 11.9 more studies in Natural Sciences, Mathematics & Statistics than expected, but 5.2 fewer studies in both Education and Arts & Humanities.

figure 9

Relative frequency of educational technology tools used according to field of study. Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S = knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools; AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML = mobile learning; VW = virtual worlds; LS = learning software; OL = online learning

In regard to mode of delivery and educational technology tools used, it is interesting that, of the top five tools, all except assessment tools were used in face-to-face instruction less often than expected (see Fig.  10 ): from 1.6 fewer studies for website creation tools to 14.5 fewer studies for knowledge organisation & sharing tools . Assessment tools , however, were used in 3.3 more studies than expected in face-to-face instruction, but less often than expected (although moderately) in blended learning and distance education formats. Text-based tools , multimodal production tools and knowledge organisation & sharing tools were employed more often than expected in blended and distance learning, most obviously with 13.1 more studies on text-based tools and 8.2 more studies on knowledge organisation & sharing tools in distance education. Contrary to what one would perhaps expect, social networking tools were used in 4.2 fewer studies than expected for this mode of delivery.

figure 10

Relative frequency of educational technology tools used according to mode of delivery. Note. Tool abbreviations as per Fig. 7. BL = Blended learning; DE = Distance education; F2F = Face-to-face; NS = Not stated

Discussion

The findings of this study confirm those of previous research, with the most prolific countries being the US, UK, Australia, Taiwan and China. This is rather representative of the field, with an analysis of instructional design and technology research from 2007 to 2017 listing the most productive countries as the US, Taiwan, UK, Australia and Turkey (Bodily, Leary, & West, 2019 ). Likewise, an analysis of 40 years of research in Computers & Education (CAE) found that the US, UK and Taiwan accounted for 49.9% of all publications (Bond, 2018 ). By contrast, a lack of African research was apparent in this review, which is also evident in educational technology research in top tier peer-reviewed journals, with only 4% of articles published in the British Journal of Educational Technology ( BJET ) in the past decade (Bond, 2019b ) and 2% of articles in the Australasian Journal of Educational Technology (AJET) (Bond, 2018 ) hailing from Africa. Similar results were also found in previous literature and systematic reviews (see Table 1 ), which again raises questions of literature search and inclusion strategies; these will be further discussed in the limitations section.

Whilst other reviews of educational technology and student engagement have found studies to be largely STEM focused (Boyle et al., 2016 ; Li et al., 2017 ; Lundin et al., 2018 ; Nikou & Economides, 2018 ), this corpus features a more balanced scope of research, with the fields of Arts & Humanities (42 studies, 17.3%) and Education (42 studies, 17.3%) constituting roughly one third of all studies in the corpus, and Natural Sciences, Mathematics & Statistics nevertheless ranking third with 38 studies (15.6%). Beyond these three fields, further research is needed within underrepresented fields of study, in order to gain more comprehensive insights into the usage of educational technology tools (Kay & LeSage, 2009 ; Nikou & Economides, 2018 ).

Results of the systematic map further confirm the focus that prior educational technology research has placed on undergraduate students as the target group and participants in technology-enhanced learning settings e.g. (Cheston et al., 2013 ; Henrie et al., 2015 ). With an overwhelming 146 studies researching undergraduate students—compared to 33 studies on graduate students and 23 studies investigating both study levels—this also indicates that further investigation into the graduate student experience is needed. Furthermore, the fact that 41 studies do not report on the study level of their participants is an interesting albeit problematic finding, as implications cannot easily be drawn for application to one’s own specific teaching context if the target group under investigation is not clearly delineated. More precise reporting of participants’ details, as well as specification of the study context (country, institution and study level, to name a few), is needed to transfer and apply study results to practice—and thus to be able to take into account why some interventions succeed and others do not.

In line with other studies e.g. (Henrie et al., 2015 ), this review has also demonstrated that student engagement remains an under-theorised concept that is often considered only in fragments in research. Whilst studies in this review have often focused on isolated aspects of student engagement, their results are nevertheless interesting and valuable. However, it is important to relate these individual facets to the larger framework of student engagement, by considering how these aspects are connected and linked to each other. This is especially helpful for integrating research findings into practice, given that student engagement and disengagement are rarely one-dimensional; it is not enough to focus on only one aspect of engagement without also looking at the aspects adjacent to it (Pekrun & Linnenbrink-Garcia, 2012 ). It is also vital, therefore, that researchers develop and refine an understanding of student engagement, and make this explicit in their research (Appleton et al., 2008 ; Christenson et al., 2012 ).

Reflective of current conversations in the field of educational technology (Bond, 2019b ; Castañeda & Selwyn, 2018 ; Hew et al., 2019 ), as well as other reviews (Abdool et al., 2017 ; Hunsu et al., 2016 ; Kaliisa & Picard, 2017 ; Lundin et al., 2018 ), a substantial number of studies in this corpus did not have any theoretical underpinnings. Kaliisa and Picard ( 2017 ) argue that, without theory, research can result in disorganised accounts and issues with interpreting data, with research effectively “sit[ting] in a void if it’s not theoretically connected” (Kara, 2017 , p. 56). Therefore, framing research in educational technology with a stronger theoretical basis can assist with locating the “field’s disciplinary alignment” (Crook, 2019 , p. 486) and further drive conversations forward.

The application of methods in this corpus was interesting in two ways. First, it is noticeable that quantitative studies are prevalent across the 243 articles in the sample; the number of studies employing qualitative research methods was comparatively low (56 studies, as opposed to 84 mixed methods studies and 103 quantitative studies). This is also reflected in the educational technology field at large: a review of articles published in BJET and Educational Technology Research & Development (ETR&D) from 2002 to 2014 revealed that 40% of articles used quantitative methods, 26% qualitative and 13% mixed (Baydas, Kucuk, Yilmaz, Aydemir, & Goktas, 2015 ), and likewise a review of educational technology research from Turkey 1990–2011 revealed that 53% of articles used quantitative methods, 22% qualitative and 10% mixed methods (Kucuk, Aydemir, Yildirim, Arpacik, & Goktas, 2013 ). Quantitative studies primarily show whether or not an intervention has worked when applied to, for example, a group of students in a certain setting, as in the study on the effect of mobile apps on student performance in engineering education by Jou, Lin, and Tsai ( 2016 ); however, not all student engagement indicators can actually be measured in this way. The lower numbers of affective and cognitive engagement found in the studies in the corpus reflect a wider call to the field to increase research on these two domains (Henrie et al., 2015 ; Joksimović et al., 2018 ; O’Flaherty & Phillips, 2015 ; Schindler et al., 2017 ). Whilst these two are arguably more difficult to measure than behavioural engagement, the use of more rigorous and accurate surveys could be one possibility, as they can “capture unobservable aspects” (Henrie et al., 2015 , p. 45) such as student feelings and information about the cognitive strategies they employ (Finn & Zimmer, 2012 ). However, surveys are often lengthy and onerous, or subject to the limitations of self-selection.

Whereas low numbers of qualitative studies researching student engagement and educational technology were previously identified in other student engagement and technology reviews (Connolly et al., 2012 ; Kay & LeSage, 2009 ; Lundin et al., 2018 ), it is studies like that by Lopera Medina ( 2014 ) in this sample that reveal how people perceive the educational experience and how the process actually unfolds. Therefore, more qualitative and ethnographic measures should also be employed, such as student observations with thick descriptions, which can help shed light on the complexity of teaching and learning environments (Fredricks et al., 2004 ; Heflin, Shewmaker, & Nguyen, 2017 ). Conducting observations can be costly, however, both in time and money, so this is suggested in combination with computerised learning analytics data, which can provide measurable, objective and timely insight into how certain manifestations of engagement change over time (Henrie et al., 2015 ; Ma et al., 2015 ).

Whereas other results of this review have confirmed previous findings in the field, the technology tools used in the studies in this corpus, considered in relation to student engagement, deviate from earlier findings. Whilst Henrie et al. ( 2015 ) found that the most frequently researched tools were discussion forums, general websites, LMS, general campus software and videos, the studies here focused predominantly on LMS, discussion forums, videos, recorded lectures and chat. Furthermore, whilst Schindler et al. ( 2017 ) found that digital games, web-conferencing software and Facebook were the most effective tools at enhancing student engagement, this review found that it was rather text-based tools , knowledge organisation & sharing tools , and multimodal production tools .

Limitations

During the execution of this systematic review, we tried to adhere to the method as rigorously as possible. However, several challenges were encountered - some of which are addressed and discussed in another publication (Bedenlier, 2020b ) - resulting in limitations to this study. Four large, general educational research databases, international in scope, were searched. However, by applying the criterion of articles published in English, research published on this topic in other languages was not included in this review. The same applies to research documented in, for example, grey literature, book chapters or monographs, or articles from journals not indexed in the four databases searched. Another limitation is that only research published within the period 2007–2016 was investigated. Whilst we are cognisant of this being a restriction, we also think that the technological advances and the implications to be drawn from this time-frame relate more meaningfully to the current situation than would have been the case for technologies used in the 1990s (see Bond, 2019b ). The sampling strategy also most likely accounts for the low number of studies from certain countries, e.g. in South America and Africa.

Studies included in this review represent various academic fields, and they also vary in the rigour with which they were conducted. Harden and Gough ( 2012 ) stress that the appraisal of quality and relevance of studies “ensure[s] that only the most appropriate, trustworthy and relevant studies are used to develop the conclusions of the review” (p. 154); accordingly, we included peer review as a formal inclusion criterion from the beginning. In doing so, we reason that the studies met a baseline of quality applicable to published research in a specific field - otherwise they would not have been accepted for publication by the respective community. Finally, whilst the studies were diligently read and coded, and disagreements discussed and reconciled, the human flaw of having overlooked or misinterpreted information provided in individual articles cannot fully be excluded.

Finally, the results presented here provide an initial window into the overall body of research identified during the search, and further research is being undertaken to provide deeper insight into discipline specific use of technology and resulting student engagement using subsets of this sample (Bedenlier, 2020a ; Bond, M., Bedenlier, S., Buntins, K., Kerres, M., & Zawacki-Richter, O.: Facilitating student engagement through educational technology: A systematic review in the field of education, forthcoming).

Recommendations for future work and implications for practice

Whilst the evidence map presented in this article has confirmed previous research on the nexus of educational technology and student engagement, it has also elucidated a number of areas that further research is invited to address. Although these findings are similar to those of previous reviews, in order to understand student engagement more fully and comprehensively as a multi-faceted construct, it is not enough to focus only on indicators of engagement that can easily be measured; the more complex endeavour of uncovering and investigating those indicators that reside below the surface is also required. This includes the careful alignment of theory and methodological design, in order both to adequately analyse the phenomenon under investigation and to contribute to a soundly executed body of research within the field of educational technology. Further research is invited in particular into how educational technology affects cognitive and affective engagement, whilst considering how this fits within the broader sociocultural framework of engagement (Bond, 2019a ). Further research is also invited into how educational technology affects student engagement within fields of study beyond Arts & Humanities, Education and Natural Sciences, Mathematics & Statistics, as well as within graduate level courses. The use of more qualitative research methods is particularly encouraged.

The findings of this review suggest that research gaps exist with particular combinations of tools, study levels and modes of delivery. With respect to study level, the use of assessment tools with graduate students, as well as knowledge organisation & sharing tools with undergraduate students, are topics researched far less than expected. The use of text-based tools in Engineering, Health & Welfare and Natural Sciences, Mathematics & Statistics, as well as the use of multimodal production tools outside of these disciplines, are also areas for future research, as is the use of assessment tools in the fields of Education and Arts & Humanities in particular.

With 109 studies in this systematic review using a blended learning design, the argument that online distance education and traditional face-to-face education are becoming increasingly integrated with one another is confirmed. Whilst this indicates that many educators have made the move from face-to-face teaching to technology-enhanced learning, it also makes a case for further professional development, so that educators can apply these tools effectively within their own teaching contexts; this review indicates that further research is needed in particular into the use of social networking tools in online/distance education. The question also needs to be asked not only why the number of published studies is low within certain countries and regions, but also what the nature of that gap is. This entails questioning the conditions under which research is being conducted, potentially criticising the publication policies of major, Western-based journals, but also ultimately reflecting on one's search strategy and research assumptions as a Western educator-researcher.

Based on the findings of this review, educators within higher education institutions are encouraged to use text-based tools , knowledge organisation & sharing tools , and multimodal production tools in particular and, whilst any technology can lead to disengagement if not employed effectively, to be mindful that website creation tools (blogs and ePortfolios), social networking tools and assessment tools were found in this review to be more disengaging than engaging. Educators are therefore encouraged to ensure that students receive sufficient and ongoing training for any new technology used, including those that might appear straightforward (e.g. blogs), and to recognise that students may require extra writing support. Discussion and blog topics should be interesting and authentic to students, and should allow student agency, including through the use of social media. Social networking tools that augment students' professional learning networks are particularly useful. Educators should also be aware, however, that some students do not want to mix their academic and personal lives, and so the decision to use certain social platforms could be made together with students.

Availability of data and materials

All data will be made publicly available, as part of the funding requirements, via https://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn .

The detailed search strategy, including the modified search strings according to the individual databases, can be retrieved from https://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn .

The full code set can be retrieved from the review protocol at https://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn .

Abdool, P. S., Nirula, L., Bonato, S., Rajji, T. K., & Silver, I. L. (2017). Simulation in undergraduate psychiatry: Exploring the depth of learner engagement. Academic Psychiatry : the Journal of the American Association of Directors of Psychiatric Residency Training and the Association for Academic Psychiatry , 41 (2), 251–261. https://doi.org/10.1007/s40596-016-0633-9 .

Alioon, Y., & Delialioğlu, Ö. (2017). The effect of authentic m-learning activities on student engagement and motivation. British Journal of Educational Technology , 32 , 121. https://doi.org/10.1111/bjet.12559 .

Alrasheedi, M., Capretz, L. F., & Raza, A. (2015). A systematic review of the critical factors for success of mobile learning in higher education (university students’ perspective). Journal of Educational Computing Research , 52 (2), 257–276. https://doi.org/10.1177/0735633115571928 .

Al-Sakkaf, A., Omar, M., & Ahmad, M. (2019). A systematic literature review of student engagement in software visualization: A theoretical perspective. Computer Science Education , 29 (2–3), 283–309. https://doi.org/10.1080/08993408.2018.1564611 .

Andrew, L., Ewens, B., & Maslin-Prothero, S. (2015). Enhancing the online learning experience using virtual interactive classrooms. Australian Journal of Advanced Nursing , 32 (4), 22–31.

Antonenko, P. D. (2015). The instrumental value of conceptual frameworks in educational technology research. Educational Technology Research and Development , 63 (1), 53–71. https://doi.org/10.1007/s11423-014-9363-4 .

Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools , 45 (5), 369–386. https://doi.org/10.1002/pits.20303 .

Arnold, N., & Paulus, T. (2010). Using a social networking site for experiential learning: Appropriating, lurking, modeling and community building. Internet and Higher Education , 13 (4), 188–196. https://doi.org/10.1016/j.iheduc.2010.04.002 .

Astin, A. W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Development , 25 (4), 297–308.

Astin, A. W. (1999). Student involvement: A developmental theory for higher education. Journal of College Student Development , 40 (5), 518–529. https://www.researchgate.net/publication/220017441 (Original work published July 1984).

Atmacasoy, A., & Aksu, M. (2018). Blended learning at pre-service teacher education in Turkey: A systematic review. Education and Information Technologies , 23 (6), 2399–2422. https://doi.org/10.1007/s10639-018-9723-5 .

Azevedo, R. (2015). Defining and measuring engagement and learning in science: Conceptual, theoretical, methodological, and analytical issues. Educational Psychologist , 50 (1), 84–94. https://doi.org/10.1080/00461520.2015.1004069 .

Bandura, A. (1971). Social learning theory . New York: General Learning Press.

Barak, M. (2018). Are digital natives open to change? Examining flexible thinking and resistance to change. Computers & Education , 121 , 115–123. https://doi.org/10.1016/j.compedu.2018.01.016 .

Barak, M., & Levenberg, A. (2016). Flexible thinking in learning: An individual differences measure for learning in technology-enhanced environments. Computers & Education , 99 , 39–52. https://doi.org/10.1016/j.compedu.2016.04.003 .

Baron, P., & Corbin, L. (2012). Student engagement: Rhetoric and reality. Higher Education Research and Development , 31 (6), 759–772. https://doi.org/10.1080/07294360.2012.655711 .

Baydas, O., Kucuk, S., Yilmaz, R. M., Aydemir, M., & Goktas, Y. (2015). Educational technology research trends from 2002 to 2014. Scientometrics , 105 (1), 709–725. https://doi.org/10.1007/s11192-015-1693-4 .

Bedenlier, S., Bond, M., Buntins, K., Zawacki-Richter, O., & Kerres, M. (2020a). Facilitating student engagement through educational technology in higher education: A systematic review in the field of arts & humanities. Australasian Journal of Educational Technology , 36 (4), 27–47. https://doi.org/10.14742/ajet.5477 .

Bedenlier, S., Bond, M., Buntins, K., Zawacki-Richter, O., & Kerres, M. (2020b). Learning by Doing? Reflections on Conducting a Systematic Review in the Field of Educational Technology. In O. Zawacki-Richter, M. Kerres, S. Bedenlier, M. Bond, & K. Buntins (Eds.), Systematic Reviews in Educational Research (Vol. 45 , pp. 111–127). Wiesbaden: Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-27602-7_7 .

Ben-Eliyahu, A., Moore, D., Dorph, R., & Schunn, C. D. (2018). Investigating the multidimensionality of engagement: Affective, behavioral, and cognitive engagement across science activities and contexts. Contemporary Educational Psychology , 53 , 87–105. https://doi.org/10.1016/j.cedpsych.2018.01.002 .

Betihavas, V., Bridgman, H., Kornhaber, R., & Cross, M. (2016). The evidence for ‘flipping out’: A systematic review of the flipped classroom in nursing education. Nurse Education Today , 38 , 15–21. https://doi.org/10.1016/j.nedt.2015.12.010 .

Bigatel, P., & Williams, V. (2015). Measuring student engagement in an online program. Online Journal of Distance Learning Administration , 18 (2), 9.

Bodily, R., Leary, H., & West, R. E. (2019). Research trends in instructional design and technology journals. British Journal of Educational Technology , 50 (1), 64–79. https://doi.org/10.1111/bjet.12712 .

Boekaerts, M. (2016). Engagement as an inherent aspect of the learning process. Learning and Instruction , 43 , 76–83. https://doi.org/10.1016/j.learninstruc.2016.02.001 .

Bolden, B., & Nahachewsky, J. (2015). Podcast creation as transformative music engagement. Music Education Research , 17 (1), 17–33. https://doi.org/10.1080/14613808.2014.969219 .

Bond, M. (2018). Helping doctoral students crack the publication code: An evaluation and content analysis of the Australasian Journal of Educational Technology. Australasian Journal of Educational Technology , 34 (5), 168–183. https://doi.org/10.14742/ajet.4363 .

Bond, M., & Bedenlier, S. (2019a). Facilitating Student Engagement Through Educational Technology: Towards a Conceptual Framework. Journal of Interactive Media in Education , 2019 (1), 1-14. https://doi.org/10.5334/jime.528 .

Bond, M., Zawacki-Richter, O., & Nichols, M. (2019b). Revisiting five decades of educational technology research: A content and authorship analysis of the British Journal of Educational Technology. British Journal of Educational Technology , 50 (1), 12–63. https://doi.org/10.1111/bjet.12730 .

Bouta, H., Retalis, S., & Paraskeva, F. (2012). Utilising a collaborative macro-script to enhance student engagement: A mixed method study in a 3D virtual environment. Computers & Education , 58 (1), 501–517. https://doi.org/10.1016/j.compedu.2011.08.031 .

Bower, M. (2015). A typology of web 2.0 learning technologies . EDUCAUSE Digital Library Retrieved 20 June 2019, from http://www.educause.edu/library/resources/typology-web-20-learning-technologies .

Bower, M. (2016). Deriving a typology of web 2.0 learning technologies. British Journal of Educational Technology , 47 (4), 763–777. https://doi.org/10.1111/bjet.12344 .

Boyle, E. A., Connolly, T. M., Hainey, T., & Boyle, J. M. (2012). Engagement in digital entertainment games: A systematic review. Computers in Human Behavior , 28 (3), 771–780. https://doi.org/10.1016/j.chb.2011.11.020 .

Boyle, E. A., Hainey, T., Connolly, T. M., Gray, G., Earp, J., Ott, M., … Pereira, J. (2016). An update to the systematic literature review of empirical evidence of the impacts and outcomes of computer games and serious games. Computers & Education , 94 , 178–192. https://doi.org/10.1016/j.compedu.2015.11.003 .

Broadbent, J., & Poon, W. L. (2015). Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. The Internet and Higher Education , 27 , 1–13. https://doi.org/10.1016/j.iheduc.2015.04.007 .

Brunton, G., Stansfield, C., & Thomas, J. (2012). Finding relevant studies. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews , (pp. 107–134). Los Angeles: Sage.

Bryman, A. (2007). The research question in social research: What is its role? International Journal of Social Research Methodology , 10 (1), 5–20. https://doi.org/10.1080/13645570600655282 .

Bulu, S. T., & Yildirim, Z. (2008). Communication behaviors and trust in collaborative online teams. Educational Technology & Society , 11 (1), 132–147.

Bundick, M., Quaglia, R., Corso, M., & Haywood, D. (2014). Promoting student engagement in the classroom. Teachers College Record , 116 (4) Retrieved from http://www.tcrecord.org/content.asp?contentid=17402 .

Castañeda, L., & Selwyn, N. (2018). More than tools? Making sense of the ongoing digitizations of higher education. International Journal of Educational Technology in Higher Education , 15 (1), 211. https://doi.org/10.1186/s41239-018-0109-y .

Chen, P.-S. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of web-based learning technology on college student engagement. Computers & Education , 54 (4), 1222–1232. https://doi.org/10.1016/j.compedu.2009.11.008 .

Cheston, C. C., Flickinger, T. E., & Chisolm, M. S. (2013). Social media use in medical education: A systematic review. Academic Medicine : Journal of the Association of American Medical Colleges , 88 (6), 893–901. https://doi.org/10.1097/ACM.0b013e31828ffc23 .

Choi, M., Glassman, M., & Cristol, D. (2017). What it means to be a citizen in the internet age: Development of a reliable and valid digital citizenship scale. Computers & Education , 107 , 100–112. https://doi.org/10.1016/j.compedu.2017.01.002 .

Christenson, S. L., Reschly, A. L., & Wylie, C. (Eds.) (2012). Handbook of research on student engagement . Boston: Springer US.

Coates, H. (2007). A model of online and general campus-based student engagement. Assessment & Evaluation in Higher Education , 32 (2), 121–141. https://doi.org/10.1080/02602930600801878 .

Connolly, T. M., Boyle, E. A., MacArthur, E., Hainey, T., & Boyle, J. M. (2012). A systematic literature review of empirical evidence on computer games and serious games. Computers & Education , 59 (2), 661–686. https://doi.org/10.1016/j.compedu.2012.03.004 .

Cook, M. P., & Bissonnette, J. D. (2016). Developing preservice teachers’ positionalities in 140 characters or less: Examining microblogging as dialogic space. Contemporary Issues in Technology and Teacher Education (CITE Journal) , 16 (2), 82–109.

Crompton, H., Burke, D., Gregory, K. H., & Gräbe, C. (2016). The use of mobile learning in science: A systematic review. Journal of Science Education and Technology , 25 (2), 149–160. https://doi.org/10.1007/s10956-015-9597-x .

Crook, C. (2019). The “British” voice of educational technology research: 50th birthday reflection. British Journal of Educational Technology , 50 (2), 485–489. https://doi.org/10.1111/bjet.12757 .

Davies, M. (2014). Using the apple iPad to facilitate student-led group work and seminar presentation. Nurse Education in Practice , 14 (4), 363–367. https://doi.org/10.1016/j.nepr.2014.01.006 .

Delialioglu, O. (2012). Student engagement in blended learning environments with lecture-based and problem-based instructional approaches. Educational Technology & Society , 15 (3), 310–322.

DePaolo, C. A., & Wilkinson, K. (2014). Recurrent online quizzes: Ubiquitous tools for promoting student presence, participation and performance. Interdisciplinary Journal of E-Learning and Learning Objects , 10 , 75–91 Retrieved from http://www.ijello.org/Volume10/IJELLOv10p075-091DePaolo0900.pdf .

Doherty, K., & Doherty, G. (2018). Engagement in HCI. ACM Computing Surveys , 51 (5), 1–39. https://doi.org/10.1145/3234149 .

Eccles, J. (2016). Engagement: Where to next? Learning and Instruction , 43 , 71–75. https://doi.org/10.1016/j.learninstruc.2016.02.003 .

Eccles, J., & Wang, M.-T. (2012). Part I commentary: So what is student engagement anyway? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 133–145). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_6 .

Englund, C., Olofsson, A. D., & Price, L. (2017). Teaching with technology in higher education: Understanding conceptual change and development in practice. Higher Education Research and Development , 36 (1), 73–87. https://doi.org/10.1080/07294360.2016.1171300 .

Fabian, K., Topping, K. J., & Barron, I. G. (2016). Mobile technology and mathematics: Effects on students’ attitudes, engagement, and achievement. Journal of Computers in Education , 3 (1), 77–104. https://doi.org/10.1007/s40692-015-0048-8 .

Filsecker, M., & Kerres, M. (2014). Engagement as a volitional construct. Simulation & Gaming , 45 (4–5), 450–470. https://doi.org/10.1177/1046878114553569 .

Finn, J. (2006). The adult lives of at-risk students: The roles of attainment and engagement in high school (NCES 2006-328) . Washington, DC: U.S. Department of Education, National Center for Education Statistics Retrieved from website: https://nces.ed.gov/pubs2006/2006328.pdf .

Finn, J., & Zimmer, K. (2012). Student engagement: What is it? Why does it matter? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 97–131). Boston: Springer US. https://doi.org/10.1007/978-1-4614-2018-7_5 .

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research , 74 (1), 59–109. https://doi.org/10.3102/00346543074001059 .

Fredricks, J. A., Filsecker, M., & Lawson, M. A. (2016). Student engagement, context, and adjustment: Addressing definitional, measurement, and methodological issues. Learning and Instruction , 43 , 1–4. https://doi.org/10.1016/j.learninstruc.2016.02.002 .

Fredricks, J. A., Wang, M.-T., Schall Linn, J., Hofkens, T. L., Sung, H., Parr, A., & Allerton, J. (2016). Using qualitative methods to develop a survey measure of math and science engagement. Learning and Instruction , 43 , 5–15. https://doi.org/10.1016/j.learninstruc.2016.01.009 .

Fukuzawa, S., & Boyd, C. (2016). Student engagement in a large classroom: Using technology to generate a hybridized problem-based learning experience in a large first year undergraduate class. Canadian Journal for the Scholarship of Teaching and Learning , 7 (1). https://doi.org/10.5206/cjsotl-rcacea.2016.1.7 .

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research . Chicago: Aldine.

Gleason, J. (2012). Using technology-assisted instruction and assessment to reduce the effect of class size on student outcomes in undergraduate mathematics courses. College Teaching , 60 (3), 87–94.

Gough, D., Oliver, S., & Thomas, J. (2012). An introduction to systematic reviews . Los Angeles: Sage.

Granberg, C. (2010). Social software for reflective dialogue: Questions about reflection and dialogue in student Teachers’ blogs. Technology, Pedagogy and Education , 19 (3), 345–360. https://doi.org/10.1080/1475939X.2010.513766 .

Greenwood, L., & Kelly, C. (2019). A systematic literature review to explore how staff in schools describe how a sense of belonging is created for their pupils. Emotional and Behavioural Difficulties , 24 (1), 3–19. https://doi.org/10.1080/13632752.2018.1511113 .

Gupta, M. L. (2009). Using emerging technologies to promote student engagement and learning in agricultural mathematics. International Journal of Learning , 16 (10), 497–508. https://doi.org/10.18848/1447-9494/CGP/v16i10/46658 .

Harden, A., & Gough, D. (2012). Quality and relevance appraisal. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews , (pp. 153–178). London: Sage.

Hatzipanagos, S., & Code, J. (2016). Open badges in online learning environments: Peer feedback and formative assessment as an engagement intervention for promoting agency. Journal of Educational Multimedia and Hypermedia , 25 (2), 127–142.

Heflin, H., Shewmaker, J., & Nguyen, J. (2017). Impact of mobile technology on student attitudes, engagement, and learning. Computers & Education , 107 , 91–99. https://doi.org/10.1016/j.compedu.2017.01.006 .

Henderson, M., Selwyn, N., & Aston, R. (2017). What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Studies in Higher Education , 42 (8), 1567–1579. https://doi.org/10.1080/03075079.2015.1007946 .

Hennessy, S., Girvan, C., Mavrikis, M., Price, S., & Winters, N. (2018). Editorial. British Journal of Educational Technology , 49 (1), 3–5. https://doi.org/10.1111/bjet.12598 .

Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education , 90 , 36–53. https://doi.org/10.1016/j.compedu.2015.09.005 .

Hew, K. F., & Cheung, W. S. (2013). Use of web 2.0 technologies in K-12 and higher education: The search for evidence-based practice. Educational Research Review , 9 , 47–64. https://doi.org/10.1016/j.edurev.2012.08.001 .

Hew, K. F., Lan, M., Tang, Y., Jia, C., & Lo, C. K. (2019). Where is the “theory” within the field of educational technology research? British Journal of Educational Technology , 50 (3), 956–971. https://doi.org/10.1111/bjet.12770 .

Howard, S. K., Ma, J., & Yang, J. (2016). Student rules: Exploring patterns of students’ computer-efficacy and engagement with digital technologies in learning. Computers & Education , 101 , 29–42. https://doi.org/10.1016/j.compedu.2016.05.008 .

Hu, S., & Kuh, G. D. (2002). Being (dis)engaged in educationally purposeful activities: The influences of student and institutional characteristics. Research in Higher Education , 43 (5), 555–575. https://doi.org/10.1023/A:1020114231387 .

Hunsu, N. J., Adesope, O., & Bayly, D. J. (2016). A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Computers & Education , 94 , 102–119. https://doi.org/10.1016/j.compedu.2015.11.013 .

Ikpeze, C. (2007). Small group collaboration in peer-led electronic discourse: An analysis of group dynamics and interactions involving Preservice and Inservice teachers. Journal of Technology and Teacher Education , 15 (3), 383–407.

Ivala, E., & Gachago, D. (2012). Social media for enhancing student engagement: The use of Facebook and blogs at a university of technology. South African Journal of Higher Education , 26 (1), 152–167.

Järvelä, S., Järvenoja, H., Malmberg, J., Isohätälä, J., & Sobocinski, M. (2016). How do types of interaction and phases of self-regulated learning set a stage for collaborative engagement? Learning and Instruction , 43 , 39–51. https://doi.org/10.1016/j.learninstruc.2016.01.005 .

Joksimović, S., Poquet, O., Kovanović, V., Dowell, N., Mills, C., Gašević, D., … Brooks, C. (2018). How do we model learning at scale? A systematic review of research on MOOCs. Review of Educational Research , 88 (1), 43–86. https://doi.org/10.3102/0034654317740335 .

Jou, M., Lin, Y.-T., & Tsai, H.-C. (2016). Mobile APP for motivation to learning: An engineering case. Interactive Learning Environments , 24 (8), 2048–2057. https://doi.org/10.1080/10494820.2015.1075136 .

Junco, R. (2012). The relationship between frequency of Facebook use, participation in Facebook activities, and student engagement. Computers & Education , 58 (1), 162–171. https://doi.org/10.1016/j.compedu.2011.08.004 .

Kahn, P. (2014). Theorising student engagement in higher education. British Educational Research Journal , 40 (6), 1005–1018. https://doi.org/10.1002/berj.3121 .

Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education , 38 (5), 758–773. https://doi.org/10.1080/03075079.2011.598505 .

Kahu, E. R., & Nelson, K. (2018). Student engagement in the educational interface: Understanding the mechanisms of student success. Higher Education Research and Development , 37 (1), 58–71. https://doi.org/10.1080/07294360.2017.1344197 .

Kaliisa, R., & Picard, M. (2017). A systematic review on mobile learning in higher education: The African perspective. The Turkish Online Journal of Educational Technology , 16 (1) Retrieved from https://files.eric.ed.gov/fulltext/EJ1124918.pdf .

Kara, H. (2017). Research and evaluation for busy students and practitioners: A time-saving guide , (2nd ed., ). Bristol: Policy Press.

Karabulut-Ilgu, A., Jaramillo Cherrez, N., & Jahren, C. T. (2018). A systematic review of research on the flipped learning method in engineering education: Flipped learning in engineering education. British Journal of Educational Technology , 49 (3), 398–411. https://doi.org/10.1111/bjet.12548 .

Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education , 53 (3), 819–827. https://doi.org/10.1016/j.compedu.2009.05.001 .

Keiller, L., & Inglis-Jassiem, G. (2015). A lesson in listening: Is the student voice heard in the rush to incorporate technology into health professions education? African Journal of Health Professions Education , 7 (1), 47–50. https://doi.org/10.7196/ajhpe.371 .

Kelley, K., & Lai, K. (2018). Package ‘MBESS’. Retrieved from https://cran.r-project.org/web/packages/MBESS/MBESS.pdf

Kerres, M. (2013). Mediendidaktik. Konzeption und Entwicklung mediengestützter Lernangebote . München: Oldenbourg.

Kirkwood, A. (2009). E-learning: You don’t always get what you hope for. Technology, Pedagogy and Education , 18 (2), 107–121. https://doi.org/10.1080/14759390902992576 .

Koehler, M., & Mishra, P. (2005). What happens when teachers design educational technology? The development of technological pedagogical content knowledge. Journal of Educational Computing Research , 32 (2), 131–152.

Krause, K.-L., & Coates, H. (2008). Students’ engagement in first-year university. Assessment & Evaluation in Higher Education , 33 (5), 493–505. https://doi.org/10.1080/02602930701698892 .

Kucuk, S., Aydemir, M., Yildirim, G., Arpacik, O., & Goktas, Y. (2013). Educational technology research trends in Turkey from 1990 to 2011. Computers & Education , 68 , 42–50. https://doi.org/10.1016/j.compedu.2013.04.016 .

Kuh, G. D. (2001). The National Survey of student engagement: Conceptual framework and overview of psychometric properties . Bloomington: Indiana University Center for Postsecondary Research Retrieved from http://nsse.indiana.edu/2004_annual_report/pdf/2004_conceptual_framework.pdf .

Kuh, G. D. (2009). What student affairs professionals need to know about student engagement. Journal of College Student Development , 50 (6), 683–706. https://doi.org/10.1353/csd.0.0099 .

Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., & Gonyea, R. M. (2008). Unmasking the effects of student engagement on first-year college grades and persistence. The Journal of Higher Education , 79 (5), 540–563 Retrieved from http://www.jstor.org.ezproxy.umuc.edu/stable/25144692 .

Kuh, G. D., Kinzie, J., Buckley, J. A., Bridges, B. K., & Hayek, J. C. (2006). What matters to student success: A review of the literature . Washington, DC: National Postsecondary Education Cooperative.

Kupper, L. L., & Hafner, K. B. (1989). How appropriate are popular sample size formulas? The American Statistician , 43 (2), 101–105.

Lai, J. W. M., & Bower, M. (2019). How is the use of technology in education evaluated? A systematic review. Computers & Education , 133 , 27–42. https://doi.org/10.1016/j.compedu.2019.01.010 .

Lawson, M. A., & Lawson, H. A. (2013). New conceptual frameworks for student engagement research, policy, and practice. Review of Educational Research , 83 (3), 432–479. https://doi.org/10.3102/0034654313480891 .

Leach, L., & Zepke, N. (2011). Engaging students in learning: A review of a conceptual organiser. Higher Education Research and Development , 30 (2), 193–204. https://doi.org/10.1080/07294360.2010.509761 .

Li, J., van der Spek, E. D., Feijs, L., Wang, F., & Hu, J. (2017). Augmented reality games for learning: A literature review. In N. Streitz, & P. Markopoulos (Eds.), Lecture Notes in Computer Science. Distributed, Ambient and Pervasive Interactions , (vol. 10291, pp. 612–626). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-319-58697-7_46 .

Lim, C. (2004). Engaging learners in online learning environments. TechTrends , 48 (4), 16–23 Retrieved from https://link.springer.com/content/pdf/10.1007%2FBF02763440.pdf .

Lopera Medina, S. (2014). Motivation conditions in a foreign language reading comprehension course offering both a web-based modality and a face-to-face modality (Las condiciones de motivación en un curso de comprensión de lectura en lengua extranjera (LE) ofrecido tanto en la modalidad presencial como en la modalidad a distancia en la web). PROFILE: Issues in Teachers’ Professional Development , 16 (1), 89–104 Retrieved from https://search.proquest.com/docview/1697487398?accountid=12968 .

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: A systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1), 1. https://doi.org/10.1186/s41239-018-0101-6 .

Ma, J., Han, X., Yang, J., & Cheng, J. (2015). Examining the necessary condition for engagement in an online learning environment based on learning analytics approach: The role of the instructor. The Internet and Higher Education , 24 , 26–34. https://doi.org/10.1016/j.iheduc.2014.09.005 .

Mahatmya, D., Lohman, B. J., Matjasko, J. L., & Farb, A. F. (2012). Engagement across developmental periods. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 45–63). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_3 .

Mansouri, A. S., & Piki, A. (2016). An exploration into the impact of blogs on students’ learning: Case studies in postgraduate business education. Innovations in Education and Teaching International , 53 (3), 260–273. https://doi.org/10.1080/14703297.2014.997777 .

Martin, A. J. (2012). Motivation and engagement: Conceptual, operational, and empirical clarity. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 303–311). Boston: Springer US. https://doi.org/10.1007/978-1-4614-2018-7_14 .

McCutcheon, K., Lohan, M., Traynor, M., & Martin, D. (2015). A systematic review evaluating the impact of online or blended learning vs. face-to-face learning of clinical skills in undergraduate nurse education. Journal of Advanced Nursing , 71 (2), 255–270. https://doi.org/10.1111/jan.12509 .

Miake-Lye, I. M., Hempel, S., Shanman, R., & Shekelle, P. G. (2016). What is an evidence map? A systematic review of published evidence maps and their definitions, methods, and products. Systematic Reviews , 5 , 28. https://doi.org/10.1186/s13643-016-0204-x .

Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ (Clinical Research Ed.) , 339 , b2535. https://doi.org/10.1136/bmj.b2535 .

Nelson Laird, T. F., & Kuh, G. D. (2005). Student experiences with information technology and their relationship to other aspects of student engagement. Research in Higher Education , 46 (2), 211–233. https://doi.org/10.1007/s11162-004-1600-y .

Nguyen, L., Barton, S. M., & Nguyen, L. T. (2015). iPads in higher education-hype and hope. British Journal of Educational Technology , 46 (1), 190–203. https://doi.org/10.1111/bjet.12137 .

Nicholas, D., Watkinson, A., Jamali, H. R., Herman, E., Tenopir, C., Volentine, R., … Levine, K. (2015). Peer review: Still king in the digital age. Learned Publishing , 28 (1), 15–21. https://doi.org/10.1087/20150104 .

Nikou, S. A., & Economides, A. A. (2018). Mobile-based assessment: A literature review of publications in major referred journals from 2009 to 2018. Computers & Education , 125 , 101–119. https://doi.org/10.1016/j.compedu.2018.06.006 .

Norris, L., & Coutas, P. (2014). Cinderella’s coach or just another pumpkin? Information communication technologies and the continuing marginalisation of languages in Australian schools. Australian Review of Applied Linguistics , 37 (1), 43–61 Retrieved from http://www.jbe-platform.com/content/journals/10.1075/aral.37.1.03nor .

OECD (2015a). Schooling redesigned. Educational Research and Innovation . OECD Publishing Retrieved from http://www.oecd-ilibrary.org/education/schooling-redesigned_9789264245914-en .

OECD (2015b). Students, computers and learning . PISA: OECD Publishing Retrieved from http://www.oecd-ilibrary.org/education/students-computers-and-learning_9789264239555-en .

O’Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95. https://doi.org/10.1016/j.iheduc.2015.02.002 .

O’Gorman, E., Salmon, N., & Murphy, C.-A. (2016). Schools as sanctuaries: A systematic review of contextual factors which contribute to student retention in alternative education. International Journal of Inclusive Education , 20 (5), 536–551. https://doi.org/10.1080/13603116.2015.1095251 .

Oliver, B., & Jorre de St Jorre, T. (2018). Graduate attributes for 2020 and beyond: Recommendations for Australian higher education providers. Higher Education Research and Development , 1–16. https://doi.org/10.1080/07294360.2018.1446415 .

O’Mara-Eves, A., Brunton, G., McDaid, D., Kavanagh, J., Oliver, S., & Thomas, J. (2014). Techniques for identifying cross-disciplinary and ‘hard-to-detect’ evidence for systematic review. Research Synthesis Methods , 5 (1), 50–59. https://doi.org/10.1002/jrsm.1094 .

Payne, L. (2017). Student engagement: Three models for its investigation. Journal of Further and Higher Education , 3 (2), 1–17. https://doi.org/10.1080/0309877X.2017.1391186 .

Pekrun, R., & Linnenbrink-Garcia, L. (2012). Academic emotions and student engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 259–282). Boston: Springer US. Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_12 .

Popenici, S. (2013). Towards a new vision for university governance, pedagogies and student engagement. In E. Dunne, & D. Owen (Eds.), The student engagement handbook: Practice in higher education , (1st ed., pp. 23–42). Bingley: Emerald.

Price, L., Richardson, J. T., & Jelfs, A. (2007). Face-to-face versus online tutoring support in distance education. Studies in Higher Education , 32 (1), 1–20.

Quin, D. (2017). Longitudinal and contextual associations between teacher–student relationships and student engagement. Review of Educational Research , 87 (2), 345–387. https://doi.org/10.3102/0034654316669434 .

Rashid, T., & Asghar, H. M. (2016). Technology use, self-directed learning, student engagement and academic performance: Examining the interrelations. Computers in Human Behavior , 63 , 604–612. https://doi.org/10.1016/j.chb.2016.05.084 .

Redecker, C. (2017). European framework for the digital competence of educators . Luxembourg: Office of the European Union.

Redmond, P., Heffernan, A., Abawi, L., Brown, A., & Henderson, R. (2018). An online engagement framework for higher education. Online Learning , 22 (1). https://doi.org/10.24059/olj.v22i1.1175 .

Reeve, J. (2012). A self-determination theory perspective on student engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 149–172). Boston: Springer US. Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_7 .

Reeve, J., & Tseng, C.-M. (2011). Agency as a fourth aspect of students’ engagement during learning activities. Contemporary Educational Psychology , 36 (4), 257–267. https://doi.org/10.1016/j.cedpsych.2011.05.002 .

Reschly, A. L., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 3–19). Boston: Springer US. Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_1 .

Salaber, J. (2014). Facilitating student engagement and collaboration in a large postgraduate course using wiki-based activities. The International Journal of Management Education , 12 (2), 115–126. https://doi.org/10.1016/j.ijme.2014.03.006 .

Schindler, L. A., Burkholder, G. J., Morad, O. A., & Marsh, C. (2017). Computer-based technology and student engagement: A critical review of the literature. International Journal of Educational Technology in Higher Education , 14 (1), 253. https://doi.org/10.1186/s41239-017-0063-0 .

Selwyn, N. (2016). Digital downsides: Exploring university students’ negative engagements with digital technology. Teaching in Higher Education , 21 (8), 1006–1021. https://doi.org/10.1080/13562517.2016.1213229 .

Shonfeld, M., & Ronen, I. (2015). Online learning for students from diverse backgrounds: Learning disability students, excellent students and average students. IAFOR Journal of Education , 3 (2), 13–29.

Skinner, E., & Pitzer, J. R. (2012). Developmental dynamics of student engagement, coping, and everyday resilience. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 21–44). Boston: Springer US.

Smidt, E., Bunk, J., McGrory, B., Li, R., & Gatenby, T. (2014). Student attitudes about distance education: Focusing on context and effective practices. IAFOR Journal of Education , 2 (1), 40–64.

Smith, R. (2006). Peer review: A flawed process at the heart of science and journals. Journal of the Royal Society of Medicine , 99 , 178–182.

Smith, T., & Lambert, R. (2014). A systematic review investigating the use of twitter and Facebook in university-based healthcare education. Health Education , 114 (5), 347–366. https://doi.org/10.1108/HE-07-2013-0030 .

Solomonides, I. (2013). A relational and multidimensional model of student engagement. In E. Dunne, & D. Owen (Eds.), The student engagement handbook: Practice in higher education , (1st ed., pp. 43–58). Bingley: Emerald.

Sosa Neira, E. A., Salinas, J., & de Benito, B. (2017). Emerging technologies (ETs) in education: A systematic review of the literature published between 2006 and 2016. International Journal of Emerging Technologies in Learning (IJET) , 12 (05), 128. https://doi.org/10.3991/ijet.v12i05.6939 .

Sullivan, M., & Longnecker, N. (2014). Class blogs as a teaching tool to promote writing and student interaction. Australasian Journal of Educational Technology , 30 (4), 390–401. https://doi.org/10.14742/ajet.322 .

Sun, J. C.-Y., & Rueda, R. (2012). Situational interest, computer self-efficacy and self-regulation: Their impact on student engagement in distance education. British Journal of Educational Technology , 43 (2), 191–204. https://doi.org/10.1111/j.1467-8535.2010.01157.x .

Szabo, Z., & Schwartz, J. (2011). Learning methods for teacher education: The use of online discussions to improve critical thinking. Technology, Pedagogy and Education , 20 (1), 79–94. https://doi.org/10.1080/1475939x.2010.534866 .

Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011). What forty years of research says about the impact of technology on learning: A second-order meta-analysis and validation study. Review of Educational Research , 81 (1), 4–28. https://doi.org/10.3102/0034654310393361 .

Trowler, V. (2010). Student engagement literature review . York: The Higher Education Academy. Retrieved from https://www.heacademy.ac.uk/system/files/studentengagementliteraturereview_1.pdf .

Van Rooij, E., Brouwer, J., Fokkens-Bruinsma, M., Jansen, E., Donche, V., & Noyens, D. (2017). A systematic review of factors related to first-year students’ success in Dutch and Flemish higher education. Pedagogische Studien , 94 (5), 360–405. Retrieved from https://repository.uantwerpen.be/docman/irua/cebc4c/149722.pdf .

Vural, O. F. (2013). The impact of a question-embedded video-based learning tool on E-learning. Educational Sciences: Theory and Practice , 13 (2), 1315–1323.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes . Cambridge: Harvard University Press.

Webb, L., Clough, J., O’Reilly, D., Wilmott, D., & Witham, G. (2017). The utility and impact of information communication technology (ICT) for pre-registration nurse education: A narrative synthesis systematic review. Nurse Education Today , 48 , 160–171. https://doi.org/10.1016/j.nedt.2016.10.007 .

Wekullo, C. S. (2019). International undergraduate student engagement: Implications for higher education administrators. Journal of International Students , 9 (1), 320–337. https://doi.org/10.32674/jis.v9i1.257 .

Wimpenny, K., & Savin-Baden, M. (2013). Alienation, agency and authenticity: A synthesis of the literature on student engagement. Teaching in Higher Education , 18 (3), 311–326. https://doi.org/10.1080/13562517.2012.725223 .

Winstone, N. E., Nash, R. A., Parker, M., & Rowntree, J. (2017). Supporting learners’ agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educational Psychologist , 52 (1), 17–37. https://doi.org/10.1080/00461520.2016.1207538 .

Zepke, N. (2014). Student engagement research in higher education: Questioning an academic orthodoxy. Teaching in Higher Education , 19 (6), 697–708. https://doi.org/10.1080/13562517.2014.901956 .

Zepke, N. (2018). Student engagement in neo-liberal times: What is missing? Higher Education Research and Development , 37 (2), 433–446. https://doi.org/10.1080/07294360.2017.1370440 .

Zepke, N., & Leach, L. (2010). Improving student engagement: Ten proposals for action. Active Learning in Higher Education , 11 (3), 167–177. https://doi.org/10.1177/1469787410379680 .

Zhang, A., & Aasheim, C. (2011). Academic success factors: An IT student perspective. Journal of Information Technology Education: Research , 10 , 309–331. https://doi.org/10.28945/1518 .

Acknowledgements

The authors thank the two student assistants who helped during the article retrieval and screening stage.

This research resulted from the ActiveLearn project, funded by the Bundesministerium für Bildung und Forschung (BMBF-German Ministry of Education and Research) [grant number 16DHL1007].

Author information

Authors and affiliations

Faculty of Education and Social Sciences (COER), Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany

Melissa Bond, Svenja Bedenlier & Olaf Zawacki-Richter

Learning Lab, Universität Duisburg-Essen, Essen, Germany

Katja Buntins & Michael Kerres

Contributions

All authors contributed to the design and conceptualisation of the systematic review. MB, KB and SB conducted the systematic review search and data extraction. MB undertook the literature review on student engagement and educational technology, and co-wrote the method, results, discussion and conclusion sections. KB designed and executed the sampling strategy, produced all of the graphs and tables, and assisted with the formulation of the article. SB co-wrote the method, results, discussion and conclusion sections, and proofread the introduction and literature review sections. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Melissa Bond .

Ethics declarations

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Literature reviews (LR) and systematic reviews (SR) on student engagement

Additional file 2.

Indicators of engagement and disengagement

Additional file 3.

Literature reviews (LR) and systematic reviews (SR) on student engagement and technology in higher education (HE)

Additional file 4.

Educational technology tool typology based on Bower ( 2016 ) and Educational technology tools used

Additional file 5.

Text-based tool examples by engagement domain

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Bond, M., Buntins, K., Bedenlier, S. et al. Mapping research in student engagement and educational technology in higher education: a systematic evidence map. Int J Educ Technol High Educ 17 , 2 (2020). https://doi.org/10.1186/s41239-019-0176-8

Received : 01 May 2019

Accepted : 17 December 2019

Published : 22 January 2020

DOI : https://doi.org/10.1186/s41239-019-0176-8


Keywords

  • Educational technology
  • Higher education
  • Systematic review
  • Evidence map
  • Student engagement

Journal of Interactive Media in Education

  • Collection: Doctoral Research: Learning in an Open World

Facilitating Student Engagement Through Educational Technology: Towards a Conceptual Framework

  • Melissa Bond
  • Svenja Bedenlier

The concept of student engagement has become somewhat of an enigma for educators and researchers, with ongoing discussions about its nature and complexity, and criticism about the depth and breadth of theorising and operationalisation within empirical research. This equally applies to research conducted in the field of educational technology and its application in schools and higher education. Recognising the inherent role that technology now plays in education, and the potential it has to engage students, this paper draws on a range of student engagement literature and conceptualises a provisional bioecological framework of student engagement that explicitly includes technology as one influential factor. This paper first proposes a definition of student engagement and provides an exploration of positive student engagement indicators. It then presents a bioecological framework, and the microsystemic facets of technology, teacher and curriculum are further explored in their relation to fostering student engagement. Based on this framework, implications for further theory-based research into student engagement and its relation to educational technology are discussed and recommendations for educators are given.

Keywords

  • student engagement
  • educational technology
  • theoretical framework
  • bioecological model
  • higher education

Introduction

The concept of student engagement has become somewhat of an enigma for educators and researchers, with ongoing discussions about its nature and complexity, and criticism about the depth and breadth of theorising and operationalisation within empirical research (e.g. Kahn, 2014 ; Zepke, 2018a ). The role that digital technology plays in affecting student engagement is a particular area of interest, as it has become a central feature within the student educational experience ( Henderson, Selwyn and Aston, 2017 ; Selwyn, 2016 ). Recognition is growing of the importance of digital literacy and information and communications technology (ICT) skills (Organisation for Economic Co-operation and Development [OECD], 2015 ; Redecker, 2017 ), as is evidence of technology’s potential to increase self-efficacy, self-regulation and involvement within the wider educational community ( Alioon and Delialioğlu, 2019 ; Junco, 2012 ). The field of educational technology has, however, lacked theoretical guidance ( Antonenko, 2015 ; Karabulut-Ilgu, Jaramillo Cherrez and Jahren, 2018 ), with the operationalisation and understanding of student engagement being a particular issue ( Henrie, Halverson and Graham, 2015 ). Calls have been made, therefore, for a strengthening of theoretical understanding and the use of theory within empirical research in the field (e.g. Hennessy et al. 2019 ; Hew et al. 2019 ), as well as for further understanding of how educational technology can affect student engagement in particular (e.g. Castañeda and Selwyn, 2018 ; Nelson Laird and Kuh, 2005 ). Although recent efforts have investigated the interplay of engagement and educational technology, these have been limited to informal learning contexts (e.g. MOOCs, see Joksimović et al. 2018 ) and online learning in higher education (e.g. Redmond et al. 2018 ).

This paper forms part of the first author’s PhD by publication, which is an exploration into the complexity of the ever-evolving concept of student engagement, in an effort to gain further understanding of how technology interacts with and affects aspects of the learning environment in both school and higher education contexts. It also forms the theoretical basis of a larger research project on student engagement and technology in higher education. 1 The present paper presents a bioecological student engagement framework developed by the first author, in order to guide and ground further research on this complex topic. The model includes influences on student engagement at the macro, exo, meso and micro levels, with a particular focus on the microsystem – the student’s immediate learning environment – as this is where practitioners are able to exert the most influence. Recommendations are then provided on how the framework can be used by practitioners, and how it can help improve practice.

What is student engagement?

Student engagement has long been recognised as an enigmatic and multifaceted meta-construct ( Appleton, Christenson and Furlong, 2008 ; Fredricks, Blumenfeld and Paris, 2004 ), with seminal works such as Astin’s ( 1999 ) theory of involvement and Kahu’s ( 2013 ; Kahu and Nelson, 2018 ) sociocultural conceptualisation of engagement, influencing ongoing conversations about the nature of and research into engagement (e.g. Boekaerts, 2016 ; Eccles, 2016 ). Often confused with motivation, which is seen as an antecedent and the force that energises behaviour ( Lim, 2004 ; Reschly and Christenson, 2012 ), engagement is defined as:

The energy and effort that students employ within their learning community, observable via any number of behavioural, cognitive or affective indicators across a continuum. It is shaped by a range of structural and internal influences, including the complex interplay of relationships, learning activities and the learning environment. The more students are engaged and empowered within their learning community, the more likely they are to channel that energy back into their learning, leading to a range of short and long term outcomes, that can likewise further fuel engagement. ( Bond et al. Manuscript in preparation: 2–3 )

This definition arose in part out of literature stressing the importance of agentic engagement ( Reeve, 2012 ; Reeve and Tseng, 2011 ); the more students have a say within their learning environment, the more engagement and achievement are likely to increase ( Peters et al. 2019 ; Reeve, 2013 ; Zepke, 2018b ), the more likely they are then to feedback positively into the learning environment ( Matos et al. 2018 ). The concept of social engagement ( Finn and Zimmer, 2012 ; Linnenbrink-Garcia, Rogat and Koskey, 2011 ), where students’ affect is influenced by social elements within the learning environment, is also represented within the acknowledgement of social, alongside internal, influences.

Dimensions and indicators of student engagement

Cognitive, affective and behavioural engagement are the three widely accepted dimensions of student engagement ( Fredricks et al. 2004 ; Fredricks, Filsecker and Lawson, 2016 ). Cognitive engagement relates to deep learning strategies, self-regulation and understanding; affective engagement relates to positive reactions to the learning environment, peers and teachers, as well as their sense of belonging and interest; and behavioural engagement relates to participation, persistence and positive conduct. However, each dimension of engagement comprises a range of indicators (see Table 1 ), experienced on a continuum at varying times ( Coates, 2007 ; Payne, 2017 ), depending on their activation (low or high) and valence (positive or negative) ( Pekrun and Linnenbrink-Garcia, 2012 ). The term ‘indicators’ is used here, following the use by Fredricks et al. ( 2004 ), and is understood in the sense of indicating or being a manifestation of student engagement and is expressed—and eventually observable and measurable—through cognitive, affective or behavioural action or reaction. The authors do, however, acknowledge that sometimes these are referred to as ‘facets’ of engagement (e.g. Coates, 2009 ). It is also important to note that, although not discussed at length in the present paper, disengagement needs to be included as well, when talking about engagement; not necessarily as a distinct concept, but rather as residing on the other side of a continuum of (dis)engagement, expressed either as an active action of disengaging from a learning context or even as a character trait (e.g. Chipchase et al. 2017 ).

Table 1: Indicators of student engagement (Adapted from Bond et al. Manuscript in preparation ).

| Cognitive engagement | Affective engagement | Behavioural engagement |
| --- | --- | --- |
| Purposeful | Enthusiasm | Effort |
| Integrating ideas | Sense of belonging | Attention/focus |
| Critical thinking | Satisfaction | Developing agency |
| Setting learning goals | Curiosity | Attendance |
| Self-regulation | Sees relevance | Attempting |
| Operational reasoning | Interest | Homework completion |
| Trying to understand | Sense of wellbeing | Positive conduct |
| Reflection | Vitality/zest | Action/initiation |
| Focus/concentration | Feeling appreciated | Confidence |
| Deep learning | Manages expectations | Participation/involvement |
| Learning from peers | Enjoyment | Asking teacher or peers for help |
| Justifying decisions | Pride | Assuming responsibility |
| Understanding | Excitement | Identifying opportunities/challenges |
| Doing extra to learn more | Desire to do well | Developing multidisciplinary skills |
| Follow through/care/thoroughness | Positive interactions with peers and teachers | Supporting and encouraging peers |
| Positive self-perceptions and self-efficacy | Sense of connectedness to school/university/within classroom | Interaction (peers, teacher, content, technology) |
| Preference for challenging tasks | | |
| Teaching self and peers | Positive attitude about learning/values learning | Study habits/accessing course material |
| Use of sophisticated learning strategies | | Time on task/staying on task/persistence |
| | Positive perceptions of teacher support | |

Sociocultural positioning of student engagement

Engagement does not occur in a vacuum; rather, it is impacted and influenced by many contextual factors, and it is vital that these wider influences be considered when exploring student engagement ( Appleton et al. 2008 ; Kahu, 2013 ; Quin, 2017 ). Within her conceptual framework of student engagement in higher education, Kahu ( 2013, p. 766 ) differentiated between sociocultural influences, such as the political and social environment; structural influences, such as the university context and student background; and psychosocial influences, such as the teaching environment, teacher-student relationships and student motivation. By considering the wider sociopolitical context that influences student engagement, a more holistic and clearer understanding of the concept can be gained, which allows educators more insight into how to further build engagement and ultimately improve outcomes for students ( Appleton et al. 2008 ). Kahu’s framework has been criticised, however, for a lack of clear focus on what students were engaging with ( Ashwin and McVitty, 2015 ), which resulted in a revised framework emphasising the ‘educational interface’ ( Kahu and Nelson, 2018 ). However, given the emphasis that has been placed on the possibility of technology playing a formative role in student engagement ( Coates, 2007 ; Nelson Laird and Kuh, 2005 ; Schindler et al. 2017 ), further theorising of how technology fits within a framework of engagement is warranted.

Bronfenbrenner and colleagues (e.g. Bronfenbrenner, 1979 , 1986 ; Bronfenbrenner and Ceci, 1994 ) developed a bioecological model of external influences affecting families and child development, used to guide a range of research on child learning and parent engagement (e.g. Ansong et al. 2017 ; Heatly and Votruba-Drzal, 2018 ). This model has been particularly useful in educational practice, as it provides a conceptual framework for understanding how multiple settings and actors influence students at the same time (e.g. Sontag, 1996 ). Nested within a system of intertwined milieus, the individual student sits at the centre of the microsystem, which encompasses their immediate setting, e.g. classroom, or home. The mesosystem level represents the interactions between microsystems, as well as between the micro and exosystems. The exosystem includes the wider social structures that impact on the learner, such as educational institutions, the media, government, the world of work and social services, and the macrosystem encompasses the wider economic, social, legal, political and educational systems in which the other systems are located. This model was used, in conjunction with Schwab’s ( 1973 ) framework of curriculum redevelopment, to develop a bioecological model of influences on student engagement, as the theoretical framework for a case study on flipped learning in secondary classrooms ( Bond, 2019 ). The interconnected dimensions of curriculum, students, teachers and milieus (school, classrooms, family/parents, community) within Schwab’s ( 1973 ) framework, as well as the inclusion of technology by Willis et al. ( 2018 ) in their study of parent engagement with their child’s learning, allowed the first author to visualise more easily the interconnected, fluid relationship between the external influences on student engagement. This model is a vehicle through which to explore and visualise further how technology affects student engagement.

Bioecological student engagement framework

There are a range of structural and psychosocial influences that affect the learning environment, learning processes, student engagement and subsequent outcomes at all levels of the bioecological model (see Figure 1 ). Drawing on educational technology literature from two systematic reviews ( Bond, Manuscript in preparation ; Bond et al. Manuscript in preparation ), as well as wider literature, technological influences on student engagement are examined at each of the macro, exo, meso and microsystem levels.

Figure 1: Bioecological model of influences on student engagement, based on Bond ( 2019 ) and adapted from Bronfenbrenner and colleagues ( Bronfenbrenner, 1979 , 1986 ; Bronfenbrenner and Ceci, 1994 ).

Macrosystem

The rapid onset of digitalisation is having, and will continue to have, a profound effect on governmental policy and educational institutions ( EDUCAUSE, 2018 ). Each country is reacting to digital transformation in different ways, with some, e.g., Germany (see Bond et al. 2018 ), investing heavily in research and development, including specific funding calls for research projects. The German government sponsored higher education think tank, Hochschulforum Digitalisierung, has recognised that “the use of digital media contributes to the improvement of higher education teaching”; however, “there is no shortage of digital teaching and learning innovations at universities but their structural and strategic advancement is deficient” ( Hochschulforum Digitalisierung, 2016: n.p. ). Therefore, funding is being provided by the Bundesministerium für Bildung und Forschung (BMBF – German Ministry of Education and Research) on the topics of ‘Adaptive learning and assessment environments’, ‘Interactivity and multimediality of digital learning environments’, ‘Researching theory and practice in digital learning environments’, and digitalisation in higher education ( Bundesministerium für Bildung und Forschung, Referat Digitaler Wandel in der Bildung, 2018 ), alongside peer-to-peer coaching for institution leaders and educators, to implement digital learning strategies and develop technological pedagogical skills. These projects will inform teaching and learning, and influence technology integration (infrastructure) and application (within the classroom) ( Hochschulforum Digitalisierung, 2016 ).

In Australia, digitalisation has meant the introduction of a National Broadband Network (NBN), in an attempt to “bridge the digital divide” ( NBN Co., 2018: 2 ), as well as boost the national gross domestic product. However, the process has been marred by cost blowouts ( Tucker, 2015 ) and delays ( Alizadeh, 2017 ), with Australia still lagging well behind other nations in Internet speed, ranked 50th in the world ( Akamai, 2017 ). This has had implications for families, especially those in rural areas where the NBN has yet to roll out and/or who cannot afford to buy credit on pre-paid Internet dongles or mobile phones. For example, within a case study on the flipped learning approach in rural South Australia ( Bond, 2019 ), a lack of access to the NBN has contributed to reduced parent engagement with students’ learning and within the school community, as well as having had a direct impact on students’ ability to engage with their learning.

Exosystem

Institutions that develop a culture of student success, with high expectations of both students and staff, and that invest in support services and infrastructure, such as reliable Internet connections and technology (e.g. desktop computers, wifi repeaters), are far more likely to promote positive student engagement ( Almarghani and Mijatovic, 2017 ; Peters et al. 2019 ; Umbach and Wawrzynski, 2005 ; Zepke, 2018a ). Institutional leadership and attitudes have a direct bearing on student learning, as well as on teacher attitudes towards using educational technology ( Cheng and Weng, 2017 ). This includes institutional policies on teacher professional development and the expectation of technology use within teaching and learning ( Gerick, Eickelmann and Bos, 2017 ), and policies about staffing of classes ( Hill and Tyson, 2009 ), which may impede the development of effective relationships between educators, students and their families, as well as policies on student technology use, such as Bring Your Own Device (BYOD) programs ( Adhikari, Mathrani and Scogings, 2016 ). It is particularly important to remain cognisant of potential digital divide issues ( Adams Becker et al. 2018 ), including student ownership and use of devices that are incompatible with institutional devices, as this can impact participation and engagement ( Bond, 2019 ).

Mesosystem

The mesosystem level reflects the relationships between elements of the exosystem and the microsystem. However, it also represents a student’s background and social milieu ( Eng, Szmodis and Mulsow, 2014 ), and the interplay of their (family) socioeconomic status and geographical location. This can impact on family income and their ability to afford devices ( Adhikari et al. 2016 ; Hohlfeld, Ritzhaupt and Barron, 2010 ; Warschauer and Xu, 2018 ), as well as their access to the Internet ( Beckmann, 2010 ; Bond, 2019 ), and thereby affect their attitudes towards technology ( Hollingworth et al. 2011 ). Therefore, it is vital that low-cost hardware and software are made available to students and families, to reduce this digital divide ( Adams Becker et al. 2018 ; Daniels and Holtman, 2014 ), but also that institutions conduct needs analyses, so as to deepen understanding of real and potential barriers for students and families ( Education Endowment Foundation, 2018 ; Goodall and Vorhaus, 2011 ). Further ideas for increasing technology access include opening up computer labs to students and families ( Lewin and Luckin, 2010 ) or establishing loan equipment schemes ( Hohlfeld et al. 2010 ).

Microsystem

The microsystem technology-enhanced learning environment is reflective of other models that have focused on the relationship between learner-teacher-content ( Bundick et al. 2014 ; Martin and Bolliger, 2018 ; Moore, 1989 ), including interaction with peers, teachers, authentic and worthwhile tasks ( Kearsley and Shneiderman, 1998 ; Lim, 2004 ), and technology ( Koehler and Mishra, 2005 ). These ‘external’ relationships, or the ‘inter-individual factors’ ( Bundick et al. 2014 ), play a vital role in ongoing student wellbeing, sense of connectedness, engagement and success ( Aldridge and McChesney, 2018 ; Wimpenny and Savin-Baden, 2013 ). It is also important to consider that a student’s life load, including employment, health, finances and family problems, can impact the amount that a student can become actively involved within school or university life ( Baron and Corbin, 2012 ), and to recognise that there are ‘internal’ psychosocial influences (see Figure 2 ), or ‘intra-individual factors’, that influence student engagement. These include a student’s self-concept, skills, motivation, self-efficacy, self-regulation, subject/discipline interest and wellbeing ( Bandura, 1995 ; Reschly and Christenson, 2012 ; Zepke, 2014 ), as well as their prior technology experience and acceptance ( Moos and Azevedo, 2009 ), as negative feelings about technology are related to disengagement ( Bartle, Longnecker and Pegrum, 2011 ; Howard, Ma and Yang, 2016 ).

Figure 2: Internal psychosocial influences on student engagement.

Learning environment and technology

A variety of factors influence student engagement when using technology (see Figure 3 ). Students’ access to technology is an issue, which may also affect their level of confidence and prior experience ( Zweekhorst and Maas, 2015 ). Assuming that technology and the Internet can be accessed, the provision of technical (and sometimes emotional) support is necessary, to ensure that students are not lost along the way due, for example, to anxiety about receiving lower grades as a result of technology issues ( Mejia, 2016 ). Potential problems can be mitigated through introductory sessions to the technology being used ( Shepherd and Hannafin, 2011 ) or having a continuous technical support team present ( Levin, Whitsett and Wood, 2013 ). Providing thorough and clear explanations of how technology is to be used ( Lim, 2004 ; Peck, 2012 ; Salaber, 2014 ), including an emphasis on using ICT for self-directed learning ( Sumuer, 2018 ), and why it is being employed in a specific course setting ( Cakir, 2013 ; Northey et al. 2015 ; Skinner, 2009 ), is also helpful, if not necessary, to ensure student engagement. Consideration should be given to allowing students a choice in which technologies are used ( Martin and Bolliger, 2018 ), as familiar technology can mitigate issues of low technology confidence ( Northey et al. 2018 ). Including out-of-class technology activities in assessment has also been shown to improve engagement and student buy-in ( Northey et al. 2018 ; Zhu, 2006 ).

Figure 3: Learning environment and technology influences on student engagement.

Engagement is more likely to develop when student-teacher relationships are strong ( Martin and Bolliger, 2018 ; Quin, 2017 ; Zepke and Leach, 2010 ; Zhang and Aasheim, 2011 ), and when students perceive the teacher to be knowledgeable, supportive, invested and effective ( Beer, Clark and Jones, 2010 ; Zhu, 2006 ) (see Figure 4 ). Teachers are more likely to employ technology, and to do so successfully, when they are confident that they have the skills to use it ( Jääskelä, Häkkinen and Rasku-Puttonen, 2017 ; Marcelo and Yot-Domínguez, 2019 ). Ongoing professional development is therefore crucial to ensure that teachers have the requisite technology knowledge and skills, and can actually foster student engagement ( Bigatel and Williams, 2015 ). Providing regular, personalised, clear and constructive feedback can also enhance engagement ( Ma et al. 2015 ; Martin and Bolliger, 2018 ; Whipp and Lorentz, 2009 ) and influence student agency ( Coates, 2007 ), alongside the use of humour within online discussions ( Imlawi, Gregg and Karimi, 2015 ). Giving feedback in the form of questions encourages students to reflect more deeply ( Alcaraz-Salarirche et al. 2011 ). Providing ongoing encouragement to students to contact teachers proactively when needed has also been found to be particularly effective ( Leese, 2009 ), as has providing ongoing attention and follow-up with students ( Zhang et al. 2014 ).

Figure 4: Teacher influences on student engagement.

The learner-content relationship is crucial ( Xiao, 2017 ). Therefore, content that is relevant and challenging ( Bundick et al. 2014 ; Cakir, 2013 ; Coates, 2007 ), and taught using active and collaborative learning techniques ( Almarghani and Mijatovic, 2017 ; Umbach and Wawrzynski, 2005 ; Wimpenny and Savin-Baden, 2013 ), has been shown to be highly effective at promoting student engagement (see Figure 5 ). Designing meaningful learning activities, which relate directly to students and/or content, is essential. For example, Abate, Gomes and Linton ( 2011 ) stress the importance of choosing appropriate and meaningful questions when using audience response systems, to avoid student disengagement. It is important to avoid redundant doubling up of activities, such as using both online journals and online discussions ( Ruckert et al. 2014 ), and activities should be related to real life (e.g. Alshaikhi and Madini, 2016 ), as this makes them more useful to students. Likewise, ensuring that technology-enhanced activities are of high quality was found to be one aspect of engaging students successfully; a lack of quality resulted in students asking for “greater content rigor, depth, and relevancy” ( Eick and King Jr., 2012, p. 29 ) in, for example, YouTube videos used in class.

Figure 5: Curriculum/activity influences on student engagement.

Creating learning communities in which students can interact collaboratively with others to build effective peer-peer relationships—with or without technology—is extremely valuable for engagement ( Nelson Laird and Kuh, 2005 ; Northey et al. 2015 ; Zepke and Leach, 2010 ) (see Figure 6 ). Students who collaborate actively in the group space, for example as part of the flipped learning approach, have been found to experience deeper learning, increased confidence and greater achievement ( D’addato and Miller, 2016 ; de Araujo, Otten and Birisci, 2017 ; Grypp and Luebeck, 2015 ; Lee, 2018 ). Yildiz ( 2009 ), in her investigation of social presence in the online classroom, found that knowing what class members look like and having well-meaning social interactions were conducive to increased confidence and a sense of knowing each other. However, students in the study by Sullivan and Longnecker ( 2014, p. 397 ) referred to the course requirement of posting comments to fellow students’ blogs as “the worst aspect of the blog”. Thus, peer interaction, and the value and meaning attached to it, is strongly related to how learning activities and digital tools are designed and used within a course.

Figure 6: Peer influences on student engagement.

Family relationships, level of parent education, and parental involvement and engagement with student learning can play a large role in student engagement ( Diogo, Silva and Viana, 2018 ; Doctoroff and Arnold, 2017 ; Howell, 2013 ) (see Figure 7 ), as well as in students’ motivation towards schooling ( Heatly and Votruba-Drzal, 2018 ), achievement ( Castro et al. 2015 ; Hill and Tyson, 2009 ), self-efficacy ( Vekiri, 2010 ) and psychological wellbeing ( Wong et al. 2018 ). Families can also affect the level of student involvement with, use of, and attitude towards technology ( Krause, 2014 ; Stevenson, 2008 ), with students often learning their computing skills from their parents ( Ihme and Senkbeil, 2017 ).

Figure 7: Family influences on student engagement.

Enhanced student engagement through using technology can lead to a number of short and long term academic and social outcomes (see Figure 8 ), termed proximal and distal consequences by Kahu ( 2013 ). Short term outcomes include increased discipline specific knowledge and higher order thinking skills ( Nelson Laird and Kuh, 2005 ; Salaber, 2014 ), increased motivation ( Akbari et al. 2016 ), enhanced sense of belonging and wellbeing ( Lear, Ansorge and Steckelberg, 2010 ), and improved relationships through peer-to-peer learning and collaboration ( Zweekhorst and Maas, 2015 ). Long term outcomes include lifelong learning ( Karabulut-Ilgu et al. 2018 ), enhanced personal development ( Alioon and Delialioğlu, 2019 ), and increased involvement in the wider educational community ( Chen, Lambert and Guidry, 2010 ; Junco, 2012 ).

Figure 8: Short and long term outcomes of student engagement.

Student engagement within a technology-enhanced learning (TEL) microsystem

Bringing these ideas together, the following framework shows the interplay between the TEL microsystem, student engagement and ensuing outcomes (see Figure 9 ). It reflects the definition of student engagement initially provided, whereby engagement is influenced by a range of internal and external factors. The more students are engaged and empowered within their learning community, the more likely it is that engagement will lead to a range of outcomes, and the more likely it is that this energy, effort and engagement will then feed back into the activities and learning environment.

Figure 9: Student engagement framework.

In this article, the authors have synthesised a range of student engagement and educational technology literature, and sought to present an in-depth analysis of a bioecological student engagement framework, conceptualising how educational technology can influence engagement in the K-12 and higher education classroom. Although the body of literature exploring the interplay between student engagement and technology continues to grow, there is an obvious gap in its theoretical understanding and grounding (e.g. Henrie et al. 2015 ). While encompassing the macro, exo and meso levels, this framework zooms in on the microsystem of the classroom and its constituents—these are ultimately the factors that can be influenced by educators and further elaborated on by educational research. Owing to space constraints, the macro, exo and meso levels could not be examined in depth here, and further work is needed to do so. Although the framework presented in this contribution is only one way of viewing this complex phenomenon, it offers a clear conceptual structure that researchers, instructional designers, policy advisors and practitioners may find useful, and could help guide future student engagement research.

Grounding future research

By understanding the range of influences on student engagement, researchers can choose to focus on how certain factors affect engagement, and use the model presented here to frame their investigations and subsequent discussion of results. Research may likewise focus on one or all three engagement dimensions (e.g. cognitive engagement), and/or on individual or multiple indicators of engagement (e.g. critical thinking and learning from peers). In the first author’s flipped learning case study ( Bond, 2019 ), for example, the bioecological model was used to frame the results and identify recommendations for schools on successful flipped learning implementation; a new model was then presented, which clearly reflected the influences pertaining to that particular case study. The merit of applying a strong theoretical grounding and framework when analysing student engagement and educational technology lies in substantiating research; such grounding, however, is still lacking ( Castañeda and Selwyn, 2018 ). For example, an extensive review of educational technology literature revealed that only 174 of 503 studies (35%) actually used a theoretical framework ( Hew et al. 2019 ), and much research specifically investigating student engagement has lacked appropriate definition and operationalisation ( Henrie et al. 2015 ). As Antonenko ( 2015, p. 53 ) concisely states, “conceptual frameworks should be viewed as an instrument for organizing inquiry and creating a compelling theory-based and data-driven argument for the importance of the problem, rigor of the method, and implications for further development of theory and enhancement of practice”.

Implications for practice

The model presented in this paper is of interest to practitioners, as it draws and focuses their attention on the different layers of their students’ environments. Although most educators already hold this perspective, this model positions technology as an integral part of that environment, identifying it as an influential factor that can equally be influenced by educators in their practice. Whereas educators can influence the meso and macrosystem components only marginally, they do have the power and responsibility to ensure that the microsystem is set up in a way that is conducive to student engagement—especially in regard to using educational technology. This involves reflection on their own ability and confidence in using technology, as well as seeing themselves as facilitators and initiators of technology use within (and outside of) the classroom, as stressed in the analysis of the microsystem components of the framework presented here. Practitioners are encouraged to use the figures provided in this paper (e.g. Figure 4 ) to conduct periodic (self-)assessments, reflecting on the extent to which these factors are having a positive influence.

Providing ongoing support to enable students’ actual use of technology, as well as ensuring instructor presence throughout a course, has been seen as a crucial element for engaging students. As has been argued, the integration of educational technology facilitates engagement when students find it meaningful and related to real life, and can use it without anxiety. In this context, providing opportunities for students to engage agentically in their learning, through activity and technology choice as well as through collaborative activities, can also enhance engagement. Through thoughtful engagement with and application of technology, and by providing students with opportunities for active participation, student engagement can be nurtured.

See http://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn for further information.  

Funding Information

This research resulted from the ActiveLearn project, funded by the Bundesministerium für Bildung und Forschung (BMBF—German Ministry of Education and Research) [grant number 16DHL1007].

Competing Interests

The authors have no competing interests to declare.

Author Contributions

The first author conducted the literature review and developed the framework. The second author contributed to the conceptualisation of the microsystem. Both authors wrote the conclusion and developed the overall flow of the paper.

Abate, LE, Gomes, A and Linton, A. 2011. Engaging Students in Active Learning: Use of a Blog and Audience Response System. Medical Reference Services Quarterly , 30(1): 12–18. DOI: https://doi.org/10.1080/02763869.2011.540206  

Adams Becker, S, Brown, M, Dahlstrom, E, Davis, A, DePaul, K, Diaz, V and Pomerantz, J. 2018. NMC Horizon Report: 2018 Higher Education Edition . Louisville, CO: EDUCAUSE.  

Adhikari, J, Mathrani, A and Scogings, C. 2016. Bring Your Own Devices classroom. Interactive Technology and Smart Education , 13(4): 323–343. DOI: https://doi.org/10.1108/ITSE-04-2016-0007  

Akamai. 2017. Akamai’s State of the Internet Q1 2017 Report . Available at https://www.akamai.com/uk/en/multimedia/documents/state-of-the-internet/q1-2017-state-of-the-internet-connectivity-report.pdf [Accessed 15 July 2019].  

Akbari, E, Naderi, A, Simons, R-J and Pilot, A. 2016. Student engagement and foreign language learning through online social networks. Asian-Pacific Journal of Second and Foreign Language Education , 1(1): 1–22. DOI: https://doi.org/10.1186/s40862-016-0006-7  

Alcaraz-Salarirche, N, Gallardo-Gil, M, Herrera-Pastor, D and Serván-Núñez, MJ. 2011. An action research process on university tutorial sessions with small groups: presentational tutorial sessions and online communication. Educational Action Research , 19(4): 549–565. DOI: https://doi.org/10.1080/09650792.2011.625713  

Aldridge, JM and McChesney, K. 2018. The relationships between school climate and adolescent mental health and wellbeing: A systematic literature review. International Journal of Educational Research , 88: 121–145. DOI: https://doi.org/10.1016/j.ijer.2018.01.012  

Alioon, Y and Delialioğlu, Ö. 2019. The effect of authentic m-learning activities on student engagement and motivation. British Journal of Educational Technology , 50(2): 655–668. DOI: https://doi.org/10.1111/bjet.12559  

Alizadeh, T. 2017. The NBN: how a national infrastructure dream fell short . Available at http://theconversation.com/the-nbn-how-a-national-infrastructure-dream-fell-short-77780 [Accessed 15 July 2019].  

Almarghani, EM and Mijatovic, I. 2017. Factors affecting student engagement in HEIs – it is all about good teaching. Teaching in Higher Education , 22(8): 940–956. DOI: https://doi.org/10.1080/13562517.2017.1319808  

Alshaikhi, D and Madini, AA. 2016. Attitude toward Enhancing Extensive Listening through Podcasts Supplementary Pack. English Language Teaching , 9(7): 32–47. DOI: https://doi.org/10.5539/elt.v9n7p32  

Ansong, D, Okumu, M, Bowen, GL, Walker, AM and Eisensmith, SR. 2017. The role of parent, classmate, and teacher support in student engagement: Evidence from Ghana. International Journal of Educational Development , 54: 51–58. DOI: https://doi.org/10.1016/j.ijedudev.2017.03.010  

Antonenko, PD. 2015. The instrumental value of conceptual frameworks in educational technology research. Educational Technology Research and Development , 63(1): 53–71. DOI: https://doi.org/10.1007/s11423-014-9363-4  

Appleton, JJ, Christenson, SL and Furlong, MJ. 2008. Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools , 45(5): 369–386. DOI: https://doi.org/10.1002/pits.20303  

Ashwin, P and McVitty, D. 2015. The meanings of student engagement: Implications for policies and practices. In: Curaj, A, Matei, L, Pricopie, R, Salmi, J and Scott, P (eds.), The European Higher Education Area , 343–359. Cham: Springer International Publishing. DOI: https://doi.org/10.1007/978-3-319-20877-0_23  

Astin, A. 1999. Student involvement: A developmental theory for higher education. Journal of College Student Development , 40(5): 518–529.  

Bandura, A. 1995. Exercise of personal and collective efficacy in changing societies. In: Bandura, A (ed.), Self-efficacy in Changing Societies , 1–45. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511527692.003  

Baron, P and Corbin, L. 2012. Student engagement: Rhetoric and reality. Higher Education Research & Development , 31(6): 759–772. DOI: https://doi.org/10.1080/07294360.2012.655711  

Bartle, E, Longnecker, N and Pegrum, M. 2011. Collaboration, contextualisation and communication using new media: Introducing podcasting into an undergraduate chemistry class. International Journal of Innovation in Science and Mathematics Education , 19(1): 16–28.  

Beckmann, EA. 2010. Learners on the move: Mobile modalities in development studies. Distance Education , 31(2): 159–173. DOI: https://doi.org/10.1080/01587919.2010.498081  

Beer, C, Clark, K and Jones, D. 2010. Indicators of engagement. In: Steel, CH, Keppell, MJ, Gerbic, P and Housego, S (eds.), Curriculum, technology & transformation for an unknown. Proceedings ascilite Sydney 2010 , 75–86.  

Bigatel, P and Williams, V. 2015. Measuring Student Engagement in an Online Program. Online Journal of Distance Learning Administration , 18(2).  

Boekaerts, M. 2016. Engagement as an inherent aspect of the learning process. Learning and Instruction , 43: 76–83. DOI: https://doi.org/10.1016/j.learninstruc.2016.02.001  

Bond, M. 2019. Flipped learning and parent engagement in secondary schools: A South Australian case study. British Journal of Educational Technology , 50(3): 1294–1319. DOI: https://doi.org/10.1111/bjet.12765  

Bond, M. (Manuscript in preparation). Facilitating student engagement through the flipped learning approach in K-12: A systematic review.  

Bond, M, Buntins, K, Bedenlier, S, Zawacki-Richter, O, and Kerres, M. (Manuscript in preparation). Mapping research in student engagement and educational technology in higher education.  

Bond, M, Marín, VI, Dolch, C, Bedenlier, S and Zawacki-Richter, O. 2018. Digital transformation in German higher education: student and teacher perceptions and usage of digital media. International Journal of Educational Technology in Higher Education , 15(1): 1–20. DOI: https://doi.org/10.1186/s41239-018-0130-1  

Bronfenbrenner, U. 1979. The ecology of human development: Experiments by nature and design . Cambridge, Mass: Harvard University Press.  

Bronfenbrenner, U. 1986. Ecology of the family as a context for human development: Research perspectives. Developmental Psychology , 22(6): 723–742. DOI: https://doi.org/10.1037/0012-1649.22.6.723  

Bronfenbrenner, U and Ceci, SJ. 1994. Nature-Nurture Reconceptualized in Developmental Perspective: A Bioecological Model. Psychological Review , 101(4): 568–586. DOI: https://doi.org/10.1037/0033-295X.101.4.568  

Bundesministerium für Bildung und Forschung, Referat Digitaler Wandel in der Bildung. 2018. Bildung digital. Digitale Hochschulbildung . Available at https://www.bmbf.de/de/digitale-hochschullehre-2417.html [Accessed 20 April 2018].  

Bundick, M, Quaglia, R, Corso, M and Haywood, DE. 2014. Promoting student engagement in the classroom. Teachers College Record , 116(4).  

Cakir, H. 2013. Use of blogs in pre-service teacher education to improve student engagement. Computers & Education , 68: 244–252. DOI: https://doi.org/10.1016/j.compedu.2013.05.013  

Castañeda, L and Selwyn, N. 2018. More than tools? Making sense of the ongoing digitizations of higher education. International Journal of Educational Technology in Higher Education , 15(1): 211. DOI: https://doi.org/10.1186/s41239-018-0109-y  

Castro, M, Expósito-Casas, E, López-Martín, E, Lizasoain, L, Navarro-Asencio, E and Gaviria, JL. 2015. Parental involvement on student academic achievement: A meta-analysis. Educational Research Review , 14: 33–46. DOI: https://doi.org/10.1016/j.edurev.2015.01.002  

Chen, P-SD, Lambert, AD and Guidry, KR. 2010. Engaging online learners: The impact of web-based learning technology on college student engagement. Computers & Education , 54(4): 1222–1232. DOI: https://doi.org/10.1016/j.compedu.2009.11.008  

Cheng, Y-H and Weng, C-W. 2017. Factors influence the digital media teaching of primary school teachers in a flipped class: A Taiwan case study. South African Journal of Education , 37(1): 1–12. DOI: https://doi.org/10.15700/saje.v37n1a1293  

Chipchase, L, Davidson, M, Blackstock, F, Bye, R, Colthier, P, Krupp, N, Dickson, W, Turner, D and Williams, M. 2017. Conceptualising and Measuring Student Disengagement in Higher Education: A Synthesis of the Literature. International Journal of Higher Education , 6(2): 31. DOI: https://doi.org/10.5430/ijhe.v6n2p31  

Coates, H. 2007. A model of online and general campus-based student engagement. Assessment & Evaluation in Higher Education , 32(2): 121–141. DOI: https://doi.org/10.1080/02602930600801878  

Coates, H. 2009. Engaging students for success: Australasian Student Engagement Report . Camberwell, Vic.  

D’addato, T and Miller, LR. 2016. An Inquiry into Flipped Learning in Fourth Grade Math Instruction. Canadian Journal of Action Research , 17(2): 33–55.  

Daniels, AD and Holtman, LB. 2014. The Use of Artefact Production to Achieve Learning Objectives in a Second-Year Zoology Course at an Institute of Higher Learning. Mediterranean Journal of Social Sciences , 5(6): 263–272. DOI: https://doi.org/10.5901/mjss.2014.v5n6p263  

de Araujo, Z, Otten, S and Birisci, S. 2017. Mathematics teachers’ motivations for, conceptions of, and experiences with flipped instruction. Teaching and Teacher Education , 62: 60–70. DOI: https://doi.org/10.1016/j.tate.2016.11.006  

Diogo, AM, Silva, P and Viana, J. 2018. Children’s use of ICT, family mediation, and social inequalities. Issues in Educational Research , 28(1): 61–76.  

Doctoroff, GL and Arnold, DH. 2017. Doing homework together: The relation between parenting strategies, child engagement, and achievement. Journal of Applied Developmental Psychology , 48: 103–113. DOI: https://doi.org/10.1016/j.appdev.2017.01.001  

Eccles, J. 2016. Engagement: Where to next? Learning and Instruction , 43: 71–75. DOI: https://doi.org/10.1016/j.learninstruc.2016.02.003  

Education Endowment Foundation. 2018. Working with parents to support children’s learning . Available at https://educationendowmentfoundation.org.uk/tools/guidance-reports/working-with-parents-to-support-childrens-learning/ [Accessed 18 July 2019].  

Educause. 2018. Report from the 2018 EDUCAUSE Task Force on Digital Transformation . Available at https://library.educause.edu/resources/2018/11/report-from-the-2018-educause-task-force-on-digital-transformation [Accessed 18 July 2019].  

Eick, C and King, DT, Jr. 2012. Nonscience Majors’ Perceptions on the Use of YouTube Video to Support Learning in an Integrated Science Lecture. Journal of College Science Teaching , 42(1): 26–30.  

Eng, S, Szmodis, W and Mulsow, M. 2014. Cambodian Parental Involvement. The Elementary School Journal , 114(4): 573–594. DOI: https://doi.org/10.1086/675639  

Finn, J and Zimmer, K. 2012. Student engagement: What is it? Why does it matter? In: Christenson, SL, Reschly, AL and Wylie, C (eds.), Handbook of Research on Student Engagement , 97–131. Boston, MA: Springer US.  

Fredricks, JA, Blumenfeld, PC and Paris, AH. 2004. School engagement: Potential of the concept, state of the evidence. Review of Educational Research , 74(1): 59–109. DOI: https://doi.org/10.3102/00346543074001059  

Fredricks, JA, Filsecker, M and Lawson, MA. 2016. Student engagement, context and adjustment: Addressing definitional, measurement, and methodological issues. Learning and Instruction , 43: 1–4. DOI: https://doi.org/10.1016/j.learninstruc.2016.02.002  

Gerick, J, Eickelmann, B and Bos, W. 2017. School level predictors for the use of ICT in schools and students’ CIL in international comparison. Large-scale Assessments in Education , 5(5): 1–13. DOI: https://doi.org/10.1186/s40536-017-0037-7  

Goodall, J and Vorhaus, J. 2011. Review of best practice in parental engagement . Available at https://www.gov.uk/government/publications/review-of-best-practice-in-parental-engagement [Accessed 18 July 2019].  

Grypp, L and Luebeck, J. 2015. Rotating Solids and Flipping Instruction. Mathematics Teacher , 109(3): 186–193. DOI: https://doi.org/10.5951/mathteacher.109.3.0186  

Heatly, MC and Votruba-Drzal, E. 2018. Developmental precursors of engagement and motivation in fifth grade: Linkages with parent- and teacher-child relationships. Journal of Applied Developmental Psychology , 60: 144–156. DOI: https://doi.org/10.1016/j.appdev.2018.09.003  

Henderson, M, Selwyn, N and Aston, R. 2017. What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Studies in Higher Education , 42(8): 1567–1579. DOI: https://doi.org/10.1080/03075079.2015.1007946  

Hennessy, S, Mavrikis, M, Girvan, C, Price, S and Winters, N. 2019. BJET Editorial for the 50th Anniversary Volume in 2019: Looking back, reaching forward. British Journal of Educational Technology , 50(1): 5–11. DOI: https://doi.org/10.1111/bjet.12730  

Henrie, CR, Halverson, LR and Graham, CR. 2015. Measuring student engagement in technology-mediated learning: A review. Computers & Education , 90: 36–53. DOI: https://doi.org/10.1016/j.compedu.2015.09.005  

Hew, KF, Lan, M, Tang, Y, Jia, C and Lo, CK. 2019. Where is the “theory” within the field of educational technology research? British Journal of Educational Technology , 50(3): 956–971. DOI: https://doi.org/10.1111/bjet.12770  

Hill, NE and Tyson, DF. 2009. Parental involvement in middle school: A meta-analytic assessment of the strategies that promote achievement. Developmental Psychology , 45(3): 740–763. DOI: https://doi.org/10.1037/a0015362  

Hochschulforum Digitalisierung. 2016. Discussion Paper. 20 Theses on Digital Teaching and Learning in Higher Education. Working Paper No. 18 . Berlin: Hochschulforum Digitalisierung. Available at https://hochschulforumdigitalisierung.de/sites/default/files/dateien/HFD_AP_Nr%2018_Discussion_Paper.pdf [Accessed 19 July 2019].  

Hohlfeld, TN, Ritzhaupt, AD and Barron, AE. 2010. Connecting schools, community, and family with ICT: Four-year trends related to school level and SES of public schools in Florida. Computers & Education , 55(1): 391–405. DOI: https://doi.org/10.1016/j.compedu.2010.02.004  

Hollingworth, S, Mansaray, A, Allen, K and Rose, A. 2011. Parents’ perspectives on technology and children’s learning in the home: Social class and the role of the habitus. Journal of Computer Assisted Learning 27(4): 347–360. DOI: https://doi.org/10.1111/j.1365-2729.2011.00431.x  

Howard, SK, Ma, J and Yang, J. 2016. Student rules: Exploring patterns of students’ computer-efficacy and engagement with digital technologies in learning. Computers & Education , 101: 29–42. DOI: https://doi.org/10.1016/j.compedu.2016.05.008  

Howell, D. 2013. Effects of an Inverted Instructional Delivery Model on Achievement of Ninth-Grade Physical Science Honors Students, Gardner-Webb University.  

Ihme, JM and Senkbeil, M. 2017. Why Adolescents Cannot Realistically Assess Their Own Computer-Related Skills. Zeitschrift Fur Entwicklungspsychologie Und Padagogische Psychologie , 49(1): 24–37. DOI: https://doi.org/10.1026/0049-8637/a000164  

Imlawi, J, Gregg, D and Karimi, J. 2015. Student engagement in course-based social networks: The impact of instructor credibility and use of communication. Computers & Education , 88: 84–96. DOI: https://doi.org/10.1016/j.compedu.2015.04.015  

Jääskelä, P, Häkkinen, P and Rasku-Puttonen, H. 2017. Teacher beliefs regarding learning, pedagogy, and the use of technology in higher education. Journal of Research on Technology in Education , 49(3–4): 198–211. DOI: https://doi.org/10.1080/15391523.2017.1343691  

Joksimović, S, Poquet, O, Kovanović, V, Dowell, N, Mills, C, Gašević, D, Dawson, S, Graesser, AC and Brooks, C. 2018. How Do We Model Learning at Scale? A Systematic Review of Research on MOOCs. Review of Educational Research , 88(1): 43–86. DOI: https://doi.org/10.3102/0034654317740335  

Junco, R. 2012. The relationship between frequency of Facebook use, participation in Facebook activities, and student engagement. Computers & Education , 58(1): 162–171. DOI: https://doi.org/10.1016/j.compedu.2011.08.004  

Kahn, P. 2014. Theorising student engagement in higher education. British Educational Research Journal , 40(6): 1005–1018. DOI: https://doi.org/10.1002/berj.3121  

Kahu, ER. 2013. Framing student engagement in higher education. Studies in Higher Education , 38(5): 758–773. DOI: https://doi.org/10.1080/03075079.2011.598505  

Kahu, ER and Nelson, K. 2018. Student engagement in the educational interface: Understanding the mechanisms of student success. Higher Education Research & Development , 37(1): 58–71. DOI: https://doi.org/10.1080/07294360.2017.1344197  

Karabulut-Ilgu, A, Jaramillo Cherrez, N and Jahren, CT. 2018. A systematic review of research on the flipped learning method in engineering education. British Journal of Educational Technology , 49(3): 398–411. DOI: https://doi.org/10.1111/bjet.12548  

Kearsley, G and Shneiderman, B. 1998. Engagement theory: A framework for technology-based teaching and learning. Educational Technology , 38(5): 20–23.  

Koehler, M and Mishra, P. 2005. What happens when teachers design educational technology? The development of Technological Pedagogical Content Knowledge. Journal of Educational Computing Research , 32(2): 131–152. DOI: https://doi.org/10.2190/0EW7-01WB-BKHL-QDYV  

Krause, L. 2014. Examining Stakeholder Perceptions of Accessibility and Utilization of Computer and Internet Technology in the Selinsgrove Area School District, Drexel University. Available at https://eric.ed.gov/?id=ED569546 [Accessed 7 August 2019].  

Lear, J, Ansorge, C and Steckelberg, A. 2010. Interactivity/Community Process Model for the online education environment. Journal of Online Learning & Teaching , 6(1): 71–77.  

Lee, M-K. 2018. Flipped classroom as an alternative future class model? implications of South Korea’s social experiment. Educational Technology Research and Development , 66(3): 837–857. DOI: https://doi.org/10.1007/s11423-018-9587-9  

Leese, M. 2009. Out of class—out of mind? The use of a virtual learning environment to encourage student engagement in out of class activities. British Journal of Educational Technology , 40(1): 70–77. DOI: https://doi.org/10.1111/j.1467-8535.2008.00822.x  

Levin, S, Whitsett, D and Wood, G. 2013. Teaching MSW Social Work Practice in a Blended Online Learning Environment. Journal of Teaching in Social Work , 33(4–5): 408–420. DOI: https://doi.org/10.1080/08841233.2013.829168  

Lewin, C and Luckin, R. 2010. Technology to support parental engagement in elementary education: Lessons learned from the UK. Computers & Education , 54(3): 749–758. DOI: https://doi.org/10.1016/j.compedu.2009.08.010  

Lim, C. 2004. Engaging learners in online learning environments. TechTrends , 48(4): 16–23. DOI: https://doi.org/10.1007/BF02763440  

Linnenbrink-Garcia, L, Rogat, TK and Koskey, KLK. 2011. Affect and engagement during small group instruction. Contemporary Educational Psychology , 36(1): 13–24. DOI: https://doi.org/10.1016/j.cedpsych.2010.09.001  

Ma, J, Han, X, Yang, J and Cheng, J. 2015. Examining the necessary condition for engagement in an online learning environment based on learning analytics approach: The role of the instructor. The Internet and Higher Education , 24: 26–34. DOI: https://doi.org/10.1016/j.iheduc.2014.09.005  

Marcelo, C and Yot-Domínguez, C. 2019. From chalk to keyboard in higher education classrooms: changes and coherence when integrating technological knowledge into pedagogical content knowledge. Journal of Further and Higher Education , 43(7): 975–988. DOI: https://doi.org/10.1080/0309877X.2018.1429584  

Martin, F and Bolliger, DU. 2018. Engagement matters: Student perceptions on the importance of engagement strategies in the online learning environment. Online Learning , 22(1): 205–222. DOI: https://doi.org/10.24059/olj.v22i1.1092  

Matos, L, Reeve, J, Herrera, D and Claux, M. 2018. Students’ agentic engagement predicts longitudinal increases in perceived autonomy-supportive teaching: The squeaky wheel gets the grease. The Journal of Experimental Education , 86(4): 579–596. DOI: https://doi.org/10.1080/00220973.2018.1448746  

Mejia, G. 2016. Promoting language learning: The use of mLearning in the Spanish classes. Revista de Lenguas para Fines Especificos , 22(1): 80–99.  

Moore, MG. 1989. Editorial: Three types of interaction. American Journal of Distance Education , 3(2): 1–7. DOI: https://doi.org/10.1080/08923648909526659  

Moos, DC and Azevedo, R. 2009. Learning With Computer-Based Learning Environments: A Literature Review of Computer Self-Efficacy. Review of Educational Research , 79(2): 576–600. DOI: https://doi.org/10.3102/0034654308326083  

NBN Co. 2018. The Corporate Plan 2019–22 . Available at https://www.nbnco.com.au/content/dam/nbnco2/2018/documents/media-centre/corporate-plan-report-2019-2022.pdf [Accessed 19 July 2019].  

Nelson Laird, TF and Kuh, GD. 2005. Student experiences with information technology and their relationship to other aspects of student engagement. Research in Higher Education , 46(2): 211–233. DOI: https://doi.org/10.1007/s11162-004-1600-y  

Northey, G, Bucic, T, Chylinski, M and Govind, R. 2015. Increasing student engagement using asynchronous learning. Journal of Marketing Education , 37(3): 171–180. DOI: https://doi.org/10.1177/0273475315589814  

Northey, G, Govind, R, Bucic, T, Chylinski, M, Dolan, R and van Esch, P. 2018. The effect of “here and now” learning on student engagement and academic achievement. British Journal of Educational Technology , 49(2): 321–333. DOI: https://doi.org/10.1111/bjet.12589  

OECD. 2015. Schooling Redesigned . OECD Publishing. Available at https://www.oecd.org/education/schooling-redesigned-9789264245914-en.htm [Accessed 19 July 2019].  

Payne, L. 2017. Student engagement: Three models for its investigation. Journal of Further and Higher Education , 3(2): 1–17. DOI: https://doi.org/10.1080/0309877X.2017.1391186  

Peck, JJ. 2012. Keeping it Social: Engaging Students Online and in Class. Asian Social Science , 8(14): 81–90. DOI: https://doi.org/10.5539/ass.v8n14p81  

Pekrun, R and Linnenbrink-Garcia, L. 2012. Academic Emotions and Student Engagement. In: Christenson, SL, Reschly, AL and Wylie, C (eds.), Handbook of Research on Student Engagement , 259–282. Boston, MA: Springer US. DOI: https://doi.org/10.1007/978-1-4614-2018-7_12  

Peters, H, Zdravkovic, M, João Costa, M, Celenza, A, Ghias, K, Klamen, D, Mossop, L, Rieder, M, Devi Nadarajah, V, Wangsaturaka, D, Wohlin, M and Weggemans, M. 2019. Twelve tips for enhancing student engagement. Medical Teacher , 41(6): 632–637. DOI: https://doi.org/10.1080/0142159X.2018.1459530  

Quin, D. 2017. Longitudinal and contextual associations between teacher–student relationships and student engagement. Review of Educational Research , 87(2): 345–387. DOI: https://doi.org/10.3102/0034654316669434  

Redecker, C. 2017. European Framework for the Digital Competence of Educators: DigCompEdu . DOI: https://doi.org/10.2760/159770  

Redmond, P, Heffernan, A, Abawi, L, Brown, A and Henderson, R. 2018. An online engagement framework for higher education. Online Learning , 22(1): 183–204. DOI: https://doi.org/10.24059/olj.v22i1.1175  

Reeve, J. 2012. A self-determination theory perspective on student engagement. In: Christenson, SL, Reschly, AL and Wylie, C (eds.), Handbook of Research on Student Engagement . 149–172. Boston, MA: Springer US. DOI: https://doi.org/10.1007/978-1-4614-2018-7_7  

Reeve, J. 2013. How students create motivationally supportive learning environments for themselves: The concept of agentic engagement. Journal of Educational Psychology , 105(3): 579–595. DOI: https://doi.org/10.1037/a0032690  

Reeve, J and Tseng, C-M. 2011. Agency as a fourth aspect of students’ engagement during learning activities. Contemporary Educational Psychology , 36(4): 257–267. DOI: https://doi.org/10.1016/j.cedpsych.2011.05.002  

Reschly, AL and Christenson, SL. 2012. Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In: Christenson, SL, Reschly, AL and Wylie, C (eds.), Handbook of Research on Student Engagement , 3–19. Boston, MA: Springer US. DOI: https://doi.org/10.1007/978-1-4614-2018-7_1  

Ruckert, E, McDonald, PL, Birkmeier, M, Walker, B, Cotton, L, Lyons, LB, Straker, HO and Plack, MM. 2014. Using Technology to Promote Active and Social Learning Experiences in Health Professions Education. Online Learning , 18(4): 1–21. DOI: https://doi.org/10.24059/olj.v18i4.515  

Salaber, J. 2014. Facilitating student engagement and collaboration in a large postgraduate course using wiki-based activities. The International Journal of Management Education , 12(2): 115–126. DOI: https://doi.org/10.1016/j.ijme.2014.03.006  

Schindler, LA, Burkholder, GJ, Morad, OA and Marsh, C. 2017. Computer-based technology and student engagement: a critical review of the literature. International Journal of Educational Technology in Higher Education , 14(1): 25. DOI: https://doi.org/10.1186/s41239-017-0063-0  

Schwab, JT. 1973. The Practical 3: Translation into Curriculum. The School Review , 81(4): 501–522.  

Selwyn, N. 2016. Digital downsides: Exploring university students’ negative engagements with digital technology. Teaching in Higher Education , 21(8): 1006–1021. DOI: https://doi.org/10.1080/13562517.2016.1213229  

Shepherd, C and Hannafin, M. 2011. Supporting Preservice Teacher Inquiry with Electronic Portfolios. Journal of Technology and Teacher Education , 19(2): 189–207.  

Skinner, E. 2009. Using community development theory to improve student engagement in online discussion: A case study. ALT-J: Research in Learning Technology , 17(2): 89–100. DOI: https://doi.org/10.1080/09687760902951599  

Sontag, JC. 1996. Toward a Comprehensive Theoretical Framework for Disability Research. The Journal of Special Education , 30(3): 319–344. DOI: https://doi.org/10.1177/002246699603000306  

Stevenson, O. 2008. Ubiquitous presence, partial use: The everyday interaction of children and their families with ICT. Technology, Pedagogy and Education , 17(2): 115–130. DOI: https://doi.org/10.1080/14759390802098615  

Sullivan, M and Longnecker, N. 2014. Class blogs as a teaching tool to promote writing and student interaction. Australasian Journal of Educational Technology , 30(4): 390–401. DOI: https://doi.org/10.14742/ajet.322  

Sumuer, E. 2018. Factors related to college students’ self-directed learning with technology. Australasian Journal of Educational Technology , 34(4): 29–43. DOI: https://doi.org/10.14742/ajet.3142  

Tucker, R. 2015. What will the NBN really cost? The Conversation, 1 December. Available at https://theconversation.com/what-will-the-nbn-really-cost-51562 [Accessed 19 July 2019].  

Umbach, PD and Wawrzynski, MR. 2005. Faculty do matter: The role of college faculty in student learning and engagement. Research in Higher Education , 46(2): 153–184. DOI: https://doi.org/10.1007/s11162-004-1598-1  

Vekiri, I. 2010. Socioeconomic differences in elementary students’ ICT beliefs and out-of-school experiences. Computers & Education , 54(4): 941–950. DOI: https://doi.org/10.1016/j.compedu.2009.09.029  

Warschauer, M and Xu, Y. 2018. Technology and Equity in Education. In: Voogt, J, Knezek, G, Christensen, R and Lai, K-W (eds.), Second Handbook of Information Technology in Primary and Secondary Education , 1063–1079. Cham: Springer International Publishing. DOI: https://doi.org/10.1007/978-3-319-71054-9_76  

Whipp, JL and Lorentz, RA. 2009. Cognitive and social help giving in online teaching: An exploratory study. Educational Technology Research and Development , 57(2): 169–192. DOI: https://doi.org/10.1007/s11423-008-9104-7  

Willis, L-D, Povey, J, Hodges, J and Carroll, A. 2018. PES – Parent engagement in schools . Brisbane, QLD, Australia: The University of Queensland, Institute for Social Science Research. Available at https://issr.uq.edu.au/parent-engagement-schools [Accessed 8 January 2019].  

Wimpenny, K and Savin-Baden, M. 2013. Alienation, agency and authenticity: A synthesis of the literature on student engagement. Teaching in Higher Education , 18(3): 311–326. DOI: https://doi.org/10.1080/13562517.2012.725223  

Wong, RSM, Ho, FKW, Wong, WHS, Tung, KTS, Chow, CB, Rao, N, Chan, KL and Ip, P. 2018. Parental Involvement in Primary School Education: Its Relationship with Children’s Academic Performance and Psychosocial Competence through Engaging Children with School. Journal of Child and Family Studies , 27(5): 1544–1555. DOI: https://doi.org/10.1007/s10826-017-1011-2  

Xiao, J. 2017. Learner-content interaction in distance education: The weakest link in interaction research. Distance Education , 38(1): 123–135. DOI: https://doi.org/10.1080/01587919.2017.1298982  

Yildiz, S. 2009. Social Presence in the Web-Based Classroom: Implications for Intercultural Communication. Journal of Studies in International Education , 13(1): 46–65. DOI: https://doi.org/10.1177/1028315308317654  

Zepke, N. 2014. Student engagement research in higher education: Questioning an academic orthodoxy. Teaching in Higher Education , 19(6): 697–708. DOI: https://doi.org/10.1080/13562517.2014.901956  

Zepke, N. 2018a. Student engagement in neo-liberal times: What is missing? Higher Education Research & Development , 37(2): 433–446. DOI: https://doi.org/10.1080/07294360.2017.1370440  

Zepke, N. 2018b. Learning with peers, active citizenship and student engagement in Enabling Education. Student Success , 9(1): 61–73. DOI: https://doi.org/10.5204/ssj.v9i1.433  

Zepke, N and Leach, L. 2010. Improving student engagement: Ten proposals for action. Active Learning in Higher Education , 11(3): 167–177. DOI: https://doi.org/10.1177/1469787410379680  

Zhang, A and Aasheim, C. 2011. Academic success factors: An IT student perspective. Journal of Information Technology Education: Research , 10: 309–331. DOI: https://doi.org/10.28945/1518  

Zhang, H, Song, W, Shen, S and Huang, R. 2014. The effects of blog-mediated peer feedback on learners’ motivation, collaboration, and course satisfaction in a second language writing course. Australasian Journal of Educational Technology , 30(6): 670–685. DOI: https://doi.org/10.14742/ajet.860  

Zhu, E. 2006. Interaction and cognitive engagement: An analysis of four asynchronous online discussions. Instructional Science , 34(6): 451–480. DOI: https://doi.org/10.1007/s11251-006-0004-0  

Zweekhorst, MBM and Maas, J. 2015. ICT in higher education: Students perceive increased engagement. Journal of Applied Research in Higher Education , 7(1): 2–18. DOI: https://doi.org/10.1108/JARHE-02-2014-0022  


Staying Engaged: Knowledge and Research Needs in Student Engagement

Ming-Te Wang

School of Education, Learning Research and Development Center, Department of Psychology, University of Pittsburgh

Jessica Degol

School of Education, University of Pittsburgh

In this article, we review knowledge about student engagement and look ahead to the future of study in this area. We begin by describing how researchers in the field define and study student engagement. In particular, we describe the levels, contexts, and dimensions that constitute the measurement of engagement, summarize the contexts that shape engagement and the outcomes that result from it, and articulate person-centered approaches for analyzing engagement. We conclude by addressing limitations to the research and providing recommendations for study. Specifically, we point to the importance of incorporating more work on how learning-related emotions, personality characteristics, prior learning experiences, shared values across contexts, and engagement in nonacademic activities influence individual differences in student engagement. We also stress the need to improve our understanding of the nuances involved in developing engagement over time by incorporating more extensive longitudinal analyses, intervention trials, research on affective neuroscience, and interactions among levels and dimensions of engagement.

Over the past 25 years, student engagement has become prominent in psychology and education because of its potential for addressing problems of student boredom, low achievement, and high dropout rates. When students are engaged with learning, they can focus attention and energy on mastering the task, persist when difficulties arise, build supportive relationships with adults and peers, and connect to their school ( Wang & Eccles, 2012a , 2012b ). Therefore, student engagement is critical for successful learning ( Appleton, Christenson, & Furlong, 2008 ). In this article, we review research on student engagement in school and articulate the key features of student engagement. In addition, we provide recommendations for research on student engagement to address limits to our understanding, apply what we have learned to practice, and focus on aspects that warrant further investigation.

KEY FEATURES OF STUDENT ENGAGEMENT

Engagement Is Distinct From Motivation

Engagement is a broadly defined construct encompassing a variety of goal-directed behaviors, thoughts, or affective states ( Fredricks, Blumenfeld, & Paris, 2004 ). Although definitions of engagement vary across studies ( Reschly & Christenson, 2012 ), engagement is distinguished from motivation. A common conceptualization, though not universally established, is that engagement is the effort directed toward completing a task, or the action or energy component of motivation ( Appleton et al., 2008 ). For example, motivation has been defined as the psychological processes that underlie the energy, purpose, and durability of activities, while engagement is defined as the outward manifestation of motivation ( Skinner, Kindermann, Connell, & Wellborn, 2009 ). Engagement can take the form of observable behavior (e.g., participation in the learning activity, on-task behavior), or manifest as internal affective (e.g., interest, positive feelings about the task) and cognitive (e.g., metacognition, self-regulated learning) states ( Christenson et al., 2008 ). Therefore, when motivation to pursue a goal or succeed at an academic task is put into action deliberately, the energized result is engagement.

Engagement Is Multilevel

Engagement is a multilevel construct, embedded within several different levels of increasing hierarchy ( Eccles & Wang, 2012 ). Researchers have focused on at least three levels in relation to student engagement ( Skinner & Pitzer, 2012 ). The first level represents student involvement within the school community (e.g., involvement in school activities). The second level narrows the focus to the classroom or subject domain (e.g., how students interact with math teachers and curriculum). The third level examines student engagement in specific learning activities within the classroom, emphasizing the moment-to-moment or situation-to-situation variations in activity and experience.

Engagement Is Multidimensional

Although most researchers agree that student engagement is multidimensional, consensus is lacking over the dimensions that should be distinguished ( Fredricks et al., 2004 ). Most models contain both a behavioral (e.g., active participation within the school) and an emotional (e.g., affective responses to school experiences) component ( Finn, 1989 ). Other researchers have identified cognitive engagement as a third factor that incorporates mental efforts that strengthen learning and performance, such as self-regulated planning and preference for challenge ( Connell & Wellborn, 1991 ; Wang, Willett, & Eccles, 2011 ). Although not as widely recognized, a fourth dimension, agentic engagement , reflects a student’s direct and intentional attempts to enrich the learning process by actively influencing teacher instruction, whereas behavioral, emotional, and cognitive engagement typically represent student reactions to classroom experiences ( Reeve & Tseng, 2011 ). Given the variety of definitions of engagement throughout the field, researchers must specify their dimensions and ensure that their measures align properly with these descriptions of engagement.

Engagement Is Malleable

Student engagement is shaped by context, so it holds potential as a locus for interventions ( Wang & Holcombe, 2010 ). When students have positive learning experiences, supportive relationships with adults and peers, and reaffirmations of their developmental needs in learning contexts, they are more likely to remain actively engaged in school ( Wang & Eccles, 2013 ). Structural features of schools (e.g., class size, school location) have also been credited with creating an educational atmosphere that influences student engagement and achievement. However, structural characteristics may not alter student engagement directly; instead, they may alter classroom processes, which in turn affect engagement ( Benner, Graham, & Mistry, 2008 ).

Several aspects of classroom processes are central to student engagement. For example, engagement is greater in classrooms where tasks are hands-on, challenging, and authentic ( Marks, 2000 ). Teachers who provide clear expectations and instructions, strong guidance during lessons, and constructive feedback have students who are more behaviorally and cognitively engaged ( Jang, Reeve, & Deci, 2010 ). Researchers have also linked high parental expectations to persistence and interest in school ( Spera, 2005 ), and linked high parental involvement to academic success and mental health both directly and indirectly through behavioral and emotional engagement ( Wang & Sheikh-Khalil, 2014 ). Conceptualizing student engagement as a malleable construct enables researchers to identify features of the environment that can be altered to increase student engagement and learning.

Engagement Predicts Student Outcomes

Student engagement is a strong predictor of educational outcomes. Students with higher behavioral and cognitive engagement have higher grades and aspire to higher education ( Wang & Eccles, 2012a ). Emotional engagement is also correlated positively with academic performance ( Stewart, 2008 ). Student engagement also operates as a mediator between supportive school contexts and academic achievement and school completion ( Wang & Holcombe, 2010 ). Therefore, increasing student engagement is a critical aspect of many intervention efforts aimed at reducing school dropout rates ( Archambault, Janosz, Morizot, & Pagani, 2009 ; Christenson & Reschly, 2010 ; Wang & Fredricks, 2014 ). Moreover, engagement is linked to other facets of child development. Youth with more positive trajectories of behavioral and emotional engagement are less depressed and less likely to be involved in delinquency and substance abuse ( Li & Lerner, 2011 ). School disengagement has been linked to negative indicators of youth development, including higher rates of substance use, problem behaviors, and delinquency ( Henry, Knight, & Thornberry, 2012 ). Some of these associations may actually be reciprocal, so that high engagement may lead to greater academic success, and greater academic success may then lead to even greater academic engagement ( Hughes, Luo, Kwok, & Loyd, 2008 ).

Engagement Comes in Qualitatively Different Patterns

Using person-centered approaches to study engagement advances our understanding of student variation in multivariate engagement profiles and the differential impact of these profiles on child development. One study ( Wang & Peck, 2013 ) used latent profile analysis to classify students into five groups of varying patterns of behavioral, emotional, and cognitive engagement, which were associated differentially with educational and psychological functioning. For example, a group of emotionally disengaged youth was identified (high behavioral and cognitive engagement, but low emotional engagement) with grade point averages and dropout rates comparable to those of the highly engaged group of youth (high on all three dimensions). However, despite their academic success, the emotionally disengaged students had a greater risk of poor mental health, reporting higher rates of symptoms of depression than any other group. Furthermore, growth mixture modeling analysis with a combined measure of behavioral, cognitive, and emotional engagement showed that unlike most individuals who experienced high to moderately stable trajectories of engagement throughout adolescence, many students experienced linear or nonlinear growth or declines ( Janosz, Archambault, Morizot, & Pagani, 2008 ). Students with unstable patterns of engagement were more likely to drop out. These developmental patterns and profiles cannot be detected by variable-centered approaches that focus on population means and overlook heterogeneity across groups. As person-centered research becomes more common, targeted intervention programs should be more effective at serving unique subgroups of students with specific developmental needs.
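The person-centered logic described above can be sketched computationally. The following is an illustration only, not a reproduction of the analyses in Wang and Peck ( 2013 ) or Janosz et al. ( 2008 ): it approximates latent profile analysis with a Gaussian mixture model on synthetic data, and the three planted profiles, dimension means, and sample sizes are assumptions chosen purely for demonstration.

```python
# Toy illustration of a person-centered analysis: a Gaussian mixture model
# (a common computational stand-in for latent profile analysis) applied to
# synthetic behavioral/emotional/cognitive engagement scores on a 1-5 scale.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Three hypothetical student profiles (columns: behavioral, emotional, cognitive).
highly_engaged = rng.normal([4.5, 4.5, 4.5], 0.3, size=(100, 3))
emotionally_disengaged = rng.normal([4.5, 1.5, 4.5], 0.3, size=(100, 3))  # high B/C, low E
moderately_engaged = rng.normal([3.0, 3.0, 3.0], 0.3, size=(100, 3))
X = np.vstack([highly_engaged, emotionally_disengaged, moderately_engaged])

# Select the number of profiles by BIC, as is typical in latent profile analysis.
bic = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
       for k in range(1, 6)}
best_k = min(bic, key=bic.get)

# Assign each student to a profile; the component means recover the planted subgroups.
model = GaussianMixture(n_components=best_k, random_state=0).fit(X)
labels = model.predict(X)
```

By contrast, a variable-centered summary of the same data would report a single mean vector near (4.0, 3.0, 4.0) and miss the emotionally disengaged subgroup entirely, which is the heterogeneity argument made above.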

Disengagement Is More Than the Lack of Engagement

One inconsistency in the research is whether engagement and disengagement should be measured on a single continuum or treated as separate constructs. Most studies consider disengagement as simply the opposite of engagement, with lower levels of engagement indicating more disengagement. However, some researchers have begun to view disengagement as a separate and distinct psychological process that makes unique contributions to academic outcomes, not simply as the absence of engagement ( Jimerson, Campos, & Greif, 2003 ). For example, behavioral and emotional indicators of engagement (e.g., effort, interest, persistence) and disaffection (e.g., withdrawal, boredom, frustration) can be treated as separate constructs, indicating that although similar, engagement and disaffection do not overlap completely ( Skinner, Furrer, Marchand, & Kindermann, 2008 ). Researchers should incorporate separate measures of engagement and disengagement into their work to determine the unique contributions of each construct to academic, behavioral, and psychological outcomes.

LOOKING AHEAD

Although we know much from research on student engagement, a number of areas require clarification and expansion.

Affective Arousal and Engagement

Emotions in educational contexts can enhance or impede learning by shaping the motivational and cognitive strategies that individuals use when faced with a new challenge. Negative emotions such as anxiety may interfere with performing a task by reducing the working memory resources, energy, and attention directed at completing the task, whereas positive emotions such as enjoyment, hope, and pride may increase performance by focusing attention on the task and promoting adaptive coping strategies ( Pekrun, Goetz, Titz, & Perry, 2002 ; Reschly, Huebner, Appleton, & Antaramian, 2008 ). However, much of the work on emotions and engagement focuses on general dispositions toward the learning environment, such as measuring interest in or valuing of school ( Stewart, 2008 ). Far less is known about how students’ actual emotions or affective states during specific learning activities influence their academic engagement and achievement ( Linnenbrink-Garcia & Pekrun, 2011 ). Researchers rarely measure how emotions relate to subsequent engagement, relying predominantly on retrospective student self-reports to measure affective states. Useful supplements to students’ reports would be psychophysiological indicators of emotional distress (e.g., facial expression, heart rate) and experience sampling methods to assess situational emotional states during classroom activities.

With the advancement of brain imaging technology, neuroimaging studies show that affective states during learning are important in determining how efficiently the brain processes new information ( Schwabe & Wolf, 2012 ). Although neuroimaging cannot be used to measure classroom engagement in real time, neuroscience techniques are valuable tools that may advance our understanding of how emotional experiences shape neural processing of information and affect engagement during a task. For example, do prolonged states of boredom in the classroom actually alter the shape and functionality of the brain over time, and can we intervene in these processes to reverse the negative effects of boredom or apathy? We also need a more thorough understanding of how genetic predispositions and environmental conditions interact to alter brain chemistry. Studies should identify precursors to or triggers for negative affective experiences, and identify environmental supports that can eliminate these negative emotions, foster adaptive coping strategies, and increase learning engagement and performance.

Interactions Among Levels

Engagement is represented at many hierarchical levels in the educational environment (e.g., school, classroom, momentary level). However, researchers rarely frame their conceptualizations and assessments of engagement in terms of a hierarchical system or process, so we lack understanding about how student engagement at these various levels interacts to influence performance. Learning is a continuous developmental process, not an instantaneous event, and engagement is the energy that directs mental, behavioral, and psychological faculties to the learning process. By focusing on only one level of engagement, we understand little about the process through which engagement is formed and leads ultimately to academic achievement.

Are there reciprocal interrelations between more immediate states of engagement and broader representations, such that moment-to-moment engagement within the classroom informs feelings and behaviors toward the school as a whole, which then trickle down to influence momentary classroom engagement through a continuous feedback loop? Are these levels additive or multiplicative, such that higher engagement across the board is associated with better academic outcomes than high engagement at only one or two levels? Or does engagement at one level compensate for lower engagement at another level, demonstrating that high engagement across all levels is not necessary for optimal functioning? Broadening the focus of research to incorporate engagement at many micro and macro levels of the educational context would advance our understanding of how different levels develop and interact to shape student engagement, and the differential pathways that lead to academic success.

Development of Many Dimensions

Despite the consensus over the multidimensionality of student engagement, the role that each dimension plays in shaping academic outcomes remains unclear ( Skinner et al., 2008 ). Three avenues warrant exploration: (a) independent relations, (b) emotional engagement as a driver of behavioral and cognitive engagement, and (c) reciprocal relations.

Independent relations suggest that each dimension of engagement makes unique contributions to student functioning. In other words, high behavioral engagement cannot compensate for the effects of low emotional engagement, given that both shape student outcomes independently.

The second avenue posits that emotional engagement could be a prerequisite for behavioral and cognitive engagement. According to this viewpoint, students who enjoy learning should participate in classroom activities more often and take more ownership over their learning. Emotional engagement sets the stage for developing cognitive and behavioral processes of student engagement.

The third possibility suggests bidirectional relations among the organizational constructs of engagement, with each dimension influencing the others cyclically. For example, enjoyment of learning or high emotional engagement may lead to greater use of self-regulated learning strategies or cognitive engagement and greater behavioral engagement within the classroom. This increased behavioral participation and use of cognitive strategies to improve performance may elicit positive feedback from classmates and teachers, further increasing enjoyment of learning, and so on. With reciprocal relations, each process reinforces and feeds into the others. For researchers to understand the developmental progression of engagement over time, they should tease apart the unique versus compounded effects of each dimension of engagement.
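The reciprocal hypothesis can be made concrete with a toy dynamical sketch. Everything here is an assumption for illustration: the linear update rule, the decay and feedback coefficients, and the starting values are not empirical estimates. The sketch simply shows how cross-dimension feedback lets an initial advantage in one dimension (here, emotional engagement) pull the other dimensions up over time.

```python
# Toy model of reciprocal relations among engagement dimensions: each
# dimension decays on its own but receives positive feedback from the
# other two. Coefficients are illustrative assumptions, not estimates.

def step(emotional, behavioral, cognitive, decay=0.9, feedback=0.05):
    """Advance one time step of the linear feedback model."""
    return (
        decay * emotional + feedback * (behavioral + cognitive),
        decay * behavioral + feedback * (emotional + cognitive),
        decay * cognitive + feedback * (emotional + behavioral),
    )

# Start with high emotional but low behavioral and cognitive engagement.
state = (1.0, 0.2, 0.2)
for _ in range(20):
    state = step(*state)

# Feedback spreads the initial emotional advantage: behavioral and cognitive
# engagement rise while all three dimensions converge toward a common level.
```

Under an independent-relations view, by contrast, the feedback terms would be zero and each dimension would evolve on its own; teasing these alternatives apart empirically is exactly the task described above.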

Longitudinal Research Across Developmental Periods

Some research on how student engagement unfolds and changes over time has shown average declines in various indicators of engagement throughout adolescence and in the transition to secondary school ( Wang & Eccles, 2012a , 2012b ), but other studies have shown heterogeneity in engagement patterns across subgroups of individuals ( Archambault et al., 2009 ; Janosz et al., 2008 ; Li & Lerner, 2011 ). However, we know little about developmental trajectories of engagement spanning early childhood to late adolescence. Many studies track engagement only in early adolescence across a span of 3 or 4 years. Because the ability to become a self-regulated learner, set goals, and monitor progress advances as children mature and become active agents in their own learning, student engagement may take different forms in elementary school than it does in subsequent years ( Fredricks et al., 2004 ). Researchers should investigate how younger versus older students think of engagement, how engagement changes across developmental periods, and whether sociocultural and psychological factors differentially shape engagement at the elementary and secondary levels.

Students’ Prior Learning Experiences

Researchers should also explore the role of students’ previous learning experiences in shaping engagement. When students are confronted with new academic challenges, the emotions and cognitions attached to previous experiences should influence how they adjust to or cope with these challenges. In particular, engagement and academic achievement decline during school transitions (e.g., elementary to middle school, middle school to high school), which can be stressful experiences for many students ( Eccles et al., 1993 ; Pekrun, 2006 ). Students with prior experiences of failure in school may be especially vulnerable to the alienating effects of school transitions. How do we disrupt students’ negative feelings about schoolwork and reengage them in their education? How do we maintain positive and engaging experiences for students through every grade level and every transition? Using students’ prior learning experiences to break the cycle of disengagement and strengthen the cycle of continuous interest and engagement could inform interventions, particularly during crucial transitory periods when students are most vulnerable to feelings of isolation, boredom, or alienation.

Intervention

Despite the malleability of student engagement and the connection between developmental contexts and engagement, very few theory- and evidence-based preventative programs have been developed, implemented, and tested on a large scale. A few interventions have increased student engagement. For example, Check & Connect, an evidence-based intervention program, has reduced rates of dropout and truancy, particularly for students at high risk of school failure ( Reschly & Christenson, 2012 ). Randomized control trials of schoolwide positive behavioral support programs have also improved student engagement and achievement, reducing discipline referrals and suspensions ( Horner et al., 2009 ; Ward & Gersten, 2013 ). However, many programs are small, intensive interventions that have not been implemented on a larger scale, raising concerns about implementation fidelity and reduced effectiveness. Many interventions also rely on one dose of services and track developmental changes over a short period, making it difficult to infer long-term benefits.

We need to develop comprehensive programs that adapt to the unique needs of individuals receiving services. Preventative programs often rely on one-size-fits-all models, so subgroups of students may not be served properly. Although universal interventions are beneficial for students in general, targeted programs might be more effective for students at greater risk of academic or psychological problems. Therefore, interventions should be implemented at many levels, incorporating a universal program for students in general and more selected services for at-risk students.

Engagement Across Contexts

We should also explore the relative alignment of educational messages, values, and goals across contexts and how this compatibility influences student engagement. Teachers, parents, and peers are not always in tune with one another on educational values, and these conflicting messages may impair students’ ability to engage fully with school. For example, parents might endorse educational excellence as a priority, whereas peers may endorse academic apathy. In these situations, students may have to set aside their personal values and pursue or coordinate the values of others, or try to integrate their personal values with those of the other group. Students’ ability to coordinate the messages, goals, and values from different agents in their social circles will also determine how they see themselves as learners.

We lack studies on how students reconcile inconsistencies in these messages across groups and how this reconciliation affects their engagement. If peer groups promote antiachievement goals that directly conflict with the educational ideals transmitted by parents, will students conform to peer norms or seek out friends whose achievement values are more aligned with those endorsed by their families? Is misalignment of educational goals across social contexts a risk factor for school dropout, particularly among students from disadvantaged backgrounds? Researchers need to address this area to help students cope with the inconsistent messages about education in their social circles and to consolidate a stronger academic identity.

Student Character and Engagement

Although researchers have examined how contextual, sociocultural, and motivational factors influence student engagement, the influence of student character or personality factors is less well understood. Research on the Big Five personality traits has found conscientiousness, an indicator of perseverance, to be the most consistent predictor of academic achievement ( Poropat, 2009 ).

Persistence has been examined through grit , a characteristic that entails working passionately and laboriously to achieve a long-term goal, and persisting despite challenges, setbacks, or failures ( Duckworth, Peterson, Matthews, & Kelly, 2007 ). Individuals with grit are more likely to exert effort to prepare and practice to achieve their goals, leading them to be more successful than individuals who use less effortful strategies ( Duckworth, Kirby, Tsukayama, Berstein, & Ericsson, 2011 ).

Nevertheless, we know little about how personality traits might interact with environmental contexts to shape student engagement. Additionally, researchers have yet to examine how profiles of personality traits might interact with each other to influence student engagement. More nuanced research in these areas will aid in the development of learning strategies and educational contexts that may yield the most successful outcomes for various personality types.

Beyond Academic Engagement

Research on student engagement has focused on academic engagement or academic-related activities. Although academic experiences are critical determinants of educational success, school is also a place where students socialize with their friends and engage in nonacademic activities. Focusing exclusively on academic engagement neglects the school’s role as a developmental context in which students engage in a wide range of academic, social, and extracurricular activities that shape their identities as academically capable, socially integrated individuals who are committed to learning. For example, students who struggle with academic learning but are athletic may experience more engagement on the football field than in the classroom. Through participating in these types of nonacademic social activities, students build skills and learn life lessons such as collaborating as a team and becoming a leader. Thus, students’ schooling experiences should involve many forms of engagement, including academic, social, and extracurricular engagement. More research is needed to integrate these forms of engagement in school and examine how they interact to influence students’ academic and socioemotional well-being collectively.

Since its conception more than two decades ago, research on student engagement has permeated the fields of psychology and education. Over this period, we have learned much about engagement. We know that engagement can be measured as a multidimensional construct, including both observable and unobservable phenomena. We have come to appreciate the importance of engagement in preventing dropout and promoting academic success. We also understand that engagement is responsive to variations in classroom and family characteristics.

But in spite of the accrued knowledge on engagement, we have barely scratched the surface in understanding how engagement and disengagement can affect academic development, and how engagement unfolds over time by tracking interactions across contexts, dimensions, and levels. We also cannot dismiss the personal traits and affective states that students bring to the classroom, which may influence engagement regardless of the supportive nature of the environment. We lack knowledge about the extent to which large-scale interventions can produce long-term improvements in engagement across diverse groups. As we move forward with engagement research, we must apply what we have learned and focus on aspects that warrant further exploration. The insight this research provides will allow educators to create supportive learning environments in which diverse groups of students not only stay engaged but also experience the academic learning and success that is a byproduct of continuous engagement.

Acknowledgments

This project was supported by Grant DRL1315943 from the National Science Foundation and Grant DA034151-02 from the National Institute on Drug Abuse at the National Institute of Health to Ming-Te Wang.

Contributor Information

Ming-Te Wang, School of Education, Learning Research and Development Center, Department of Psychology, University of Pittsburgh.

Jessica Degol, School of Education, University of Pittsburgh.

  • Appleton JJ, Christenson SL, Furlong MJ. Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools. 2008;45:369–386.
  • Archambault I, Janosz M, Morizot J, Pagani L. Adolescent behavioral, affective, and cognitive engagement in school: Relationship to dropout. Journal of School Health. 2009;79:408–415.
  • Benner AD, Graham S, Mistry RS. Discerning direct and mediated effects of ecological structures and processes on adolescents’ educational outcomes. Developmental Psychology. 2008;44:840–854.
  • Christenson SL, Reschly AL. Check & Connect: Enhancing school completion through student engagement. In: Doll E, Charvat J, editors. Handbook of prevention science. New York, NY: Routledge; 2010. pp. 327–348.
  • Christenson SL, Reschly AL, Appleton JJ, Berman-Young S, Spaniers DM, Varro P. Best practices in fostering student engagement. In: Thomas A, Grimes J, editors. Best practices in school psychology. 5th ed. Bethesda, MD: National Association of School Psychologists; 2008. pp. 1099–1119.
  • Connell JP, Wellborn JG. Competence, autonomy, and relatedness: A motivational analysis of self-system processes. In: Gunnar MR, Sroufe LA, editors. Self processes in development: Minnesota symposium on child psychology. Vol. 23. Chicago, IL: University of Chicago Press; 1991. pp. 43–77.
  • Duckworth AL, Kirby T, Tsukayama E, Berstein H, Ericsson KA. Deliberate practice spells success: Why grittier competitors triumph at the National Spelling Bee. Social Psychological and Personality Science. 2011;2:174–181.
  • Duckworth AL, Peterson C, Matthews MD, Kelly DR. Grit: Perseverance and passion for long-term goals. Journal of Personality and Social Psychology. 2007;92:1087–1101.
  • Eccles JS, Midgley C, Wigfield A, Buchanan CM, Reuman D, Flanagan C, Mac Iver D. Development during adolescence: The impact of stage-environment fit on young adolescents’ experiences in schools and in families. American Psychologist. 1993;48:90–101.
  • Eccles JS, Wang M-T. Part I Commentary: So what is student engagement anyway? In: Christenson SL, Reschly AL, Wylie C, editors. Handbook of research on student engagement. New York, NY: Springer; 2012. pp. 133–148.
  • Finn JD. Withdrawing from school. Review of Educational Research. 1989;59:117–142.
  • Fredricks JA, Blumenfeld PC, Paris AH. School engagement: Potential of the concept, state of the evidence. Review of Educational Research. 2004;74:59–109.
  • Henry KL, Knight KE, Thornberry TP. School disengagement as a predictor of dropout, delinquency, and problem substance use during adolescence and early adulthood. Journal of Youth and Adolescence. 2012;41:156–166.
  • Horner RH, Sugai G, Smolkowski K, Eber L, Nakasato J, Todd AW, Esperanza J. A randomized, wait-list controlled effectiveness trial assessing school-wide positive behavior support in elementary schools. Journal of Positive Behavior Interventions. 2009;11:133–144.
  • Hughes JN, Luo W, Kwok OM, Loyd LK. Teacher-student support, effortful engagement, and achievement: A 3-year longitudinal study. Journal of Educational Psychology. 2008;100:1–14.
  • Jang H, Reeve J, Deci EL. Engaging students in learning activities: It is not autonomy support or structure but autonomy support and structure. Journal of Educational Psychology. 2010;102:588–600.
  • Janosz M, Archambault I, Morizot J, Pagani LS. School engagement trajectories and their differential predictive relations to dropout. Journal of Social Issues. 2008;64:21–40.
  • Jimerson SR, Campos E, Greif JL. Toward an understanding of definitions and measures of school engagement and related terms. The California School Psychologist. 2003;8:7–27.
  • Li Y, Lerner RM. Trajectories of school engagement during adolescence: Implications for grades, depression, delinquency, and substance use. Developmental Psychology. 2011;47:233–247.
  • Linnenbrink-Garcia L, Pekrun R. Students’ emotions and academic engagement: Introduction to the special issue. Contemporary Educational Psychology. 2011;36:1–3.
  • Marks HM. Student engagement in instructional activity: Patterns in the elementary, middle, and high school years. American Educational Research Journal. 2000;37:153–184.
  • Pekrun R. The control-value theory of achievement emotions: Assumptions, corollaries, and implications for educational research and practice. Educational Psychology Review. 2006;18:315–341.
  • Pekrun R, Goetz T, Titz W, Perry RP. Academic emotions in students’ self-regulated learning and achievement: A program of qualitative and quantitative research. Educational Psychologist. 2002;37:91–105.
  • Poropat AE. A meta-analysis of the five-factor model of personality and academic performance. Psychological Bulletin. 2009;135:322–338.
  • Reeve J, Tseng CM. Agency as a fourth aspect of students’ engagement during learning activities. Contemporary Educational Psychology. 2011;36:257–267.
  • Reschly AL, Christenson SL. Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In: Christenson SL, Reschly AL, Wylie C, editors. Handbook of research on student engagement. New York, NY: Springer; 2012. pp. 3–20.
  • Reschly AL, Huebner ES, Appleton JJ, Antaramian S. Engagement as flourishing: The contribution of positive emotions and coping to adolescents’ engagement at school and with learning. Psychology in the Schools. 2008;45:419–431.
  • Schwabe L, Wolf OT. Stress modulates the engagement of multiple memory systems in classification learning. Journal of Neuroscience. 2012;32:11042–11049.
  • Skinner EA, Furrer C, Marchand G, Kindermann T. Engagement and disaffection in the classroom: Part of a larger motivational dynamic? Journal of Educational Psychology. 2008;100:765–781.
  • Skinner EA, Kindermann TA, Connell JP, Wellborn JG. Engagement and disaffection as organizational constructs in the dynamics of motivational development. In: Wentzel K, Wigfield A, editors. Handbook of motivation at school. Mahwah, NJ: Erlbaum; 2009. pp. 223–245.
  • Skinner EA, Pitzer JR. Developmental dynamics of student engagement, coping, and everyday resilience. In: Christenson SL, Reschly AL, Wylie C, editors. Handbook of research on student engagement. New York, NY: Springer; 2012. pp. 21–44.
  • Spera C. A review of the relationship among parenting practices, parenting styles, and adolescent school achievement. Educational Psychology Review. 2005;17:125–146.
  • Stewart EB. School structural characteristics, student effort, peer associations, and parental involvement: The influence of school- and individual-level factors on academic achievement. Education and Urban Society. 2008;40:179–204.
  • Wang M-T, Eccles JS. Adolescent behavioral, emotional, and cognitive engagement trajectories in school and their differential relations to educational success. Journal of Research on Adolescence. 2012a;22:31–39.
  • Wang M-T, Eccles JS. Social support matters: Longitudinal effects of social support on three dimensions of school engagement from middle to high school. Child Development. 2012b;83:877–895.
  • Wang M-T, Eccles JS. School context, achievement motivation, and academic engagement: A longitudinal study of school engagement using a multidimensional perspective. Learning and Instruction. 2013;28:12–23.
  • Wang M-T, Fredricks JA. The reciprocal links between school engagement, youth problem behaviors, and school dropout during adolescence. Child Development. 2014;85:722–737.
  • Wang M-T, Holcombe R. Adolescents’ perceptions of school environment, engagement, and academic achievement in middle school. American Educational Research Journal. 2010;47:633–662.
  • Wang M-T, Peck SC. Adolescent educational success and mental health vary across school engagement profiles. Developmental Psychology. 2013;49:1266–1276.
  • Wang M-T, Sheikh-Khalil S. Does parental involvement matter for student achievement and mental health in high school? Child Development. 2014;85:610–625.
  • Wang M-T, Willett JB, Eccles JS. The assessment of school engagement: Examining dimensionality and measurement invariance by gender and race/ethnicity. Journal of School Psychology. 2011;49:465–480.
  • Ward B, Gersten R. A randomized evaluation of the Safe and Civil Schools model for positive behavioral interventions and supports at elementary schools in a large urban school district. School Psychology Review. 2013;42:317–333.

To Increase Student Engagement, Focus on Motivation

Teachers can motivate middle and high school students by providing structure while also allowing them some control over their learning.


A 2018 Gallup study found that as students get older, they become less engaged, or “ involved, enthusiastic, and committed .” The study contained some alarming findings: In fifth grade, most students (74 percent) report high levels of engagement with school. However, by middle school, only half of students are engaged, and by high school the number of engaged students shrinks to about one-third. 

Student engagement continued to be a pressing concern for parents and educators during and after the pandemic. Approximately half of parents (45 percent), 77 percent of administrators, and 81 percent of teachers said that keeping students engaged was difficult during remote learning. In addition, 94 percent of educators considered student engagement to be the most important metric to look at when determining student success. Gallup found that students who are engaged with school not only report achieving higher grades but also feel more hopeful about their future. 

Student motivation

Engagement and motivation are related but distinct concepts, and the two are often confused. Motivation is the driving force that causes a student to take action; engagement is the observable behavior, or evidence, of that motivation. Motivation is necessary for engagement, but successful engagement can also help students to feel motivated in the future.

In my book The Independent Learner , I discuss how self-regulated learning strategies help students to increase their motivation and willingness to engage in learning because they create feelings of autonomy, competence, and relatedness. According to research by Ryan and Deci, these are the three components that facilitate motivation :

  • Autonomy is a “sense of initiative and ownership in one’s actions.”
  • Competence is a “feeling of mastery” and a sense that with effort a student can “succeed and grow.”
  • Relatedness is when the school setting “conveys respect and caring” that results in the student feeling “a sense of belonging and connection.”

Motivating students in the classroom

It is important not to confuse engagement with entertainment. In an EdWeek survey , researchers found that the entertaining activities that teachers expect to engage students are not necessarily working. While the majority of teachers who had increased their use of digital games assumed that games would engage students, only 27 percent of students reported feeling more engaged when digital games were involved. In addition, 30 percent of students said learning was actually less engaging. 

So what creates engagement? Gallup found that students who strongly agreed with the following two statements were 30 times more likely to report high levels of engagement with school:

  • My school is “committed to building the strengths of each student.”
  • I have “at least one teacher who makes me excited about the future.”

In other words, engaged students recognize that they have the support of caring adults who are willing to partner with them in their learning. 

Not all classrooms create these conditions. Controlling classrooms lower autonomy and motivation and increase student frustration. In controlling classrooms, students avoid challenges because they are afraid of failure. They work toward external rewards or to avoid possible anxiety or shame caused by mistakes. The teacher controls the answers and learning materials and uses language like “should” or “have to,” and students feel pressured to behave and achieve.

In contrast, creating classroom environments where students feel autonomy, competence, and relatedness helps students to maintain motivation and increase their engagement in school activities. Classrooms that foster motivation and increase engagement are high in structure but low in top-down control . These classrooms have the following qualities.

Supportive: Teachers support autonomy by listening and attempting to understand and respond to students’ perspectives. They look at what a student can currently do and where they need to go to reach the standard or objective, and they help the student by building scaffolds or supports to bridge the gap. This makes achievement toward grade-level content possible, even for learners who are not quite there yet.

Personal and individualized: Students feel like they are able to customize their assignments in order to explore their own interests. Students can also be taught to make their own connections to what they are learning through creating their own hooks for a lesson . Recognizing students’ unique qualities and special talents, getting to know what interests students and incorporating these interests into lessons or assignments, and reaching out to parents with a note or email when a student does something well are all strategies that I use to make learning personal and increase relatedness in the classroom. 

Structured and goal-oriented: When teachers give students strategies, provide frequent feedback, and show them how to use those strategies effectively, students are motivated by observing their own progress. Teachers can provide a rationale or standard and guide students in setting short-term mastery goals for each required task. They can also help students to align their daily actions and effort with the results they are hoping to achieve by making a process plan . I have found that when students graph their own progress or use their process plan as a checklist, this makes growth visual and allows students to see the steps they are accomplishing each day toward their goal. In addition, clear expectations, consistency in classroom structure, clear rules, and set routines are all important. 

Collaborative: Teachers provide students with choices and opportunities to partner with the teacher in their learning experiences and show ownership in the tasks that are assigned to them. When teachers encourage students to begin to make choices and take responsibility for their own learning, students see a purpose in school activities. One way to do this is through using self-assessment to prompt reflection on strategy use. I have students analyze their graded assignments to decide what strategies to keep and what to do differently next time. When students see errors as a signal that they need to reflect on the process and learning strategies they used, they realize there are no real mistakes, just opportunities for learning.

Although the pandemic has been difficult, the majority of students (69 percent) report feeling hopeful about the future. Students who are hopeful and engaged are less likely to get suspended or expelled, have chronic absenteeism, skip school, or drop out of school. When educators put effort into the goal of creating a school environment guided by student engagement, motivation, and autonomy, students can see their own growth. This creates an excitement for learning that helps students to maintain hope for the future even through difficult circumstances.


Focus on Student Engagement for Better Academic Outcomes


Story Highlights

  • Social-emotional learning is rapidly growing in popularity in U.S. schools
  • Student engagement and hope are significantly related to student academic outcomes
  • The Gallup Student Poll helps you discover elements that drive student success

With the current shift in U.S. education policies putting a priority on social-emotional learning (SEL), the importance of teachers and schools having SEL resources -- proven to create positive student outcomes -- is growing rapidly. As the new goals and guidelines are implemented in accordance with the Every Student Succeeds Act (ESSA), schools have, unfortunately, been flooded with instructional guides, curriculum advice, and other implementation options that are founded on minimal research or unproven outcomes.

Measure What Matters Most for Student Success

For the past 12 years, Gallup has been surveying students through the Gallup Student Poll to research and fine-tune an SEL-based solution with proven impacts and outcomes. Statistically linked to a majority of the mandated ESSA accountability measures, the outcome of Gallup's research is two indexes that measure students' engagement and hope. These indexes consist of nine and seven items, respectively, and indicate students' "involvement in and enthusiasm for school" and "ideas and energy for the future." The outcomes of these indexes go beyond showing the level of engagement and hope of a school's students -- they are linked to the overall success of districts and schools, based on the measures to which they are being held accountable.

Discover the Elements That Drive Success

One recent Gallup study including 128 schools and more than 110,000 students found that student engagement and hope were significantly positively related to student academic achievement progress (growth) in math, reading, and all subjects combined, along with postsecondary readiness in math and writing. Focusing specifically on student engagement, the study found that schools in the top quartile of student engagement had significantly more students exceeding and meeting proficiency requirements than schools in the bottom quartile of engagement.

Bar graph showing student engagement quartiles and student academic outcomes of top quartile versus bottom quartile.

Combined with another Gallup study that found links between student engagement and hope and student discipline and behaviors, these results make implementing strategies for measuring and growing both a win for our education system as a whole. The impact these foundational notions of student engagement and hope have on students and corresponding school outcomes is precisely the focus of the newly minted ESSA state, district, and school report cards. The report cards' measures include a combination of student achievement, student growth, English language proficiency, high school graduation rates, and a school quality or student success measure, which were selected by each state autonomously.

ESSA and the SEL movements are steering the education system in a great direction, focusing education leaders' and schools' efforts where they should always be. As Gallup's founder Don Clifton put it, "Our greatest contribution is to be sure there is a teacher in every classroom who cares that every student, every day, learns and grows and feels like a real human being."

Create the Right Classroom Environment

The means to impact student engagement and hope can be found in multiple avenues. One is the straightforward path of measuring student engagement and hope to identify areas of strength within a school and areas of opportunity in which to implement different solutions. Leveraging the different items within the engagement and hope indexes provides building blocks for this step, while still maintaining the autonomy of teachers and schools to decide the best way to do so.

Another point of impact is the effect teacher engagement has on student engagement . By providing teachers with the opportunity to voice their opinions on different areas that affect their engagement, principals can better partner with teachers to enhance a culture of engagement. Together, they can create an environment that is more conducive to teacher productivity, which fosters the classroom environment best suited for student engagement, hope and learning.


A third and final intervention that can support student engagement and hope is the identification and development of student talents and strengths . While great teachers innately practice this without any external resources, the simple identification and acknowledgment of the things a student naturally does well can be an immediate boost to their self-esteem, wellbeing, engagement and hope.

No matter the policy or level of standards in place, by focusing on interventions that provide positive outcomes for students and teachers alike, our education system will develop as it was always intended.

Gallup can help you improve teacher and student outcomes:

  • Read Positive Relationships Between Student Engagement and Hope and Student Behavior to uncover more key findings from a recent Gallup Student Poll analysis.
  • Discover how to help kids have great lives by focusing on what they do best.
  • Get in touch with a Gallup expert for more information on how we can help you improve your school's success.

Mark Reckmeyer is a K-12 Education Researcher and Consultant at Gallup.



October 30, 2019 | Gallup | https://www.gallup.com/education/267521/focus-student-engagement-better-academic-outcomes.aspx

Created by the Great Schools Partnership, the GLOSSARY OF EDUCATION REFORM is a comprehensive online resource that describes widely used school-improvement terms, concepts, and strategies for journalists, parents, and community members.

Student Engagement

In education, student engagement refers to the degree of attention, curiosity, interest, optimism, and passion that students show when they are learning or being taught, which extends to the level of motivation they have to learn and progress in their education. Generally speaking, the concept of “student engagement” is predicated on the belief that learning improves when students are inquisitive, interested, or inspired, and that learning tends to suffer when students are bored, dispassionate, disaffected, or otherwise “disengaged.” Stronger student engagement or improved student engagement are common instructional objectives expressed by educators.

In many contexts, however, student engagement may also refer to the ways in which school leaders, educators, and other adults might “engage” students more fully in the governance and decision-making processes in school, in the design of programs and learning opportunities, or in the civic life of their community. For example, many schools survey students to determine their views on any number of issues, and then use the survey findings to modify policies or programs in ways that honor or respond to student perspectives and concerns. Students may also create their own questions, survey their peers, and then present the results to school leaders or the school board to advocate for changes in programs or policies. Some schools have created alternative forms of student governance, “student advisory committees,” student appointments to the school board, and other formal and informal ways for students to contribute to the governance of a school or advise superintendents, principals, and local policy makers. These broader forms of “student engagement” can take a wide variety of forms—far too many to extensively catalog here. Yet a few illustrative examples include school-supported volunteer programs and community-service requirements (engaging students in public service and learning through public service), student organizing (engaging students in advocacy, community organizing, and constructive protest), and any number of potential student-led groups, forums, presentations, and events (engaging students in community leadership, public speaking, and other activities that contribute to “positive youth development”). For a related discussion, see student voice.

In education, the term student engagement has grown in popularity in recent decades, most likely resulting from an increased understanding of the role that certain intellectual, emotional, behavioral, physical, and social factors play in the learning process and social development. For example, a wide variety of research studies on learning have revealed connections between so-called “non-cognitive factors” or “non-cognitive skills” (e.g., motivation, interest, curiosity, responsibility, determination, perseverance, attitude, work habits, self-regulation, social skills, etc.) and “cognitive” learning results (e.g., improved academic performance, test scores, information recall, skill acquisition, etc.). The concept of student engagement typically arises when educators discuss or prioritize educational strategies and teaching techniques that address the developmental, intellectual, emotional, behavioral, physical, and social factors that either enhance or undermine learning for students.

It should be noted that educators may hold different views on student engagement, and it may be defined or interpreted differently from place to place. For example, in one school observable behaviors such as attending class, listening attentively, participating in discussions, turning in work on time, and following rules and directions may be perceived as forms of “engagement,” while in another school the concept of “engagement” may be largely understood in terms of internal states such as enthusiasm, curiosity, optimism, motivation, or interest.

While the concept of student engagement seems straightforward, it can take fairly complex forms in practice. The following examples illustrate a few ways in which student engagement may be discussed or addressed in schools:

  • Intellectual engagement: To increase student engagement in a course or subject, teachers may create lessons, assignments, or projects that appeal to student interests or that stimulate their curiosity. For example, teachers may give students more choice over the topics they are asked to write about (so students can choose a topic that specifically interests them) or they may let students choose the way they will investigate a topic or demonstrate what they have learned (some students may choose to write a paper, others may produce a short video or audio documentary, and still others may create a multimedia presentation). Teachers may also introduce a unit of study with a problem or question that students need to solve. For example, students might be asked to investigate the causes of a local environmental problem, determine the species of an unknown animal from a few short descriptions of its physical characteristics and behaviors, or build a robot that can accomplish a specific task. In these cases, sparking student curiosity can increase “engagement” in the learning process. For related discussions, see authentic learning, community-based learning, differentiation, personalized learning, project-based learning, and relevance.
  • Emotional engagement: Educators may use a wide variety of strategies to promote positive emotions in students that will facilitate the learning process, minimize negative behaviors, or keep students from dropping out. For example, classrooms and other learning environments may be redesigned to make them more conducive to learning, teachers may make a point of monitoring student moods and asking them how they are feeling, or school programs may provide counseling, peer mentoring, or other services that generally seek to give students the support they need to succeed academically and feel positive, optimistic, or excited about school and learning. Strategies such as advisories , for example, are intended to build stronger relationships between students and adults in a school. The basic theory is that students will be more likely to succeed if at least one adult in the school is meeting with a student regularly, inquiring about academic and non-academic issues, giving her advice, and taking an interest in her out-of-school life, personal passions, future aspirations, and distinct learning challenges and needs.
  • Behavioral engagement: Teachers may establish classroom routines, use consistent cues, or assign students roles that foster behaviors more conducive to learning. For example, elementary school teachers may use cues or gestures that help young students refocus on a lesson if they get distracted or boisterous. The teacher may clap three times or raise a hand, for example, which signals to students that it’s time to stop talking, return to their seats, or begin a new activity. Teachers may also establish consistent routines that help students stay on task or remain engaged during a class. For example, the class may regularly break up into small groups or move their seats into a circle for a group discussion, or the teacher may ask students on a rotating basis to lead certain activities. By introducing variation into a classroom routine, teachers can reduce the monotony and potential disengagement that may occur when students sit in the same seat, doing similar tasks, for extended periods of time. Research on brain-based learning has also provided evidence that variation, novelty, and physical activity can stimulate and improve learning. For a related discussion, see classroom management .
  • Physical engagement: Teachers may use physical activities or routines to stimulate learning or interest. For example, “kinesthetic learning” refers to the use of physical motions and activities during the learning process. Instead of asking students to answer questions aloud, a teacher might ask students to walk up to the chalkboard and answer the question verbally while also writing the answer on the board (in this case, the theory is that students are more likely to remember information when they are using multiple parts of the brain at the same time—i.e., the various parts dedicated to speaking, writing, physical activity, etc.). Teachers may also introduce short periods of physical activity or quick exercises, particularly during the elementary years, to reduce antsy, fidgety, or distracted behaviors. In addition, more schools throughout the United States are addressing the physical needs of students by, for example, offering all students free breakfasts (because disengagement in learning and poor academic performance have been linked to hunger and malnutrition) or starting school at a later time (because adolescent sleep patterns and needs differ from those of adults, and adolescents may be better able to learn later in the morning).
  • Social engagement: Teachers may use a variety of strategies to stimulate engagement through social interactions. For example, students may be paired or grouped to work collaboratively on projects, or teachers may create academic contests that students compete in—e.g., a friendly competition in which teams of students build robots to complete a specific task in the shortest amount of time. Academic and co-curricular activities such as debate teams, robotics clubs, and science fairs also bring together learning experiences and social interactions. In addition, strategies such as demonstrations of learning or capstone projects may require students to give public presentations of their work, often to panels of experts from the local community, while strategies such as community-based learning or service learning (learning through volunteerism) can introduce civic and social issues into the learning process. In these cases, learning about societal problems, or participating actively in social causes, can improve engagement.
  • Cultural engagement: Schools may take active steps to make students from diverse cultural backgrounds—particularly recently arrived immigrant or refugee students and their families—feel welcomed, accepted, safe, and valued. For example, administrators, teachers, and school staff may provide special orientation sessions for their new-American populations or offer translation services and informational materials translated into multiple languages. Students, families, and local cultural leaders from diverse backgrounds may be asked to speak about their experiences to students and school staff, and teachers may intentionally modify lessons to incorporate the history, literature, arts, and perspectives of the student ethnicities and nationalities represented in their classes. School activities may also incorporate multicultural songs, dances, and performances, while posters, flags, and other educational materials featured throughout the school may reflect the cultural diversity of the students and school community. The general goal of such strategies would be to reduce the feelings of confusion, alienation, disconnection, or exclusion that some students and families may experience, and thereby increase their engagement in academics and school activities. For related discussions, see dual-language education, English-language learner, multicultural education, and voice.

Student Engagement in Higher Education: Conceptualizations, Measurement, and Research

  • Living reference work entry
  • First Online: 06 October 2023

  • Teniell L. Trolian   ORCID: orcid.org/0000-0001-5268-6081 3  

Part of the book series: Higher Education: Handbook of Theory and Research ((HATR,volume 39))

203 Accesses

1 Citation

Researchers have suggested that student engagement in the college and university learning environment is an important contributor to student learning, achievement, and outcomes. Higher education scholars have developed measures to assess students’ engagement in higher education, and they have examined experiences that contribute to student engagement, as well as whether student engagement leads to important college outcomes such as college persistence, academic achievement, cognitive development, and other affective outcomes. This chapter considers how student engagement has been conceptualized and measured by researchers and discusses research on student engagement in higher education, focusing on four sets of factors – precursors to student engagement, facilitators of student engagement, indicators of student engagement, and outcomes of student engagement. This chapter also offers recommendations for institutional policy and practice related to student engagement, as well as directions for future research focused on student engagement.


Ahlfeldt, S., Mehta, S., & Sellnow, T. (2005). Measurement and analysis of student engagement in university classes where varying levels of PBL methods of instruction are in use. Higher Education Research and Development, 24 (1), 5–20. https://doi.org/10.1080/0729436052000318541

Article   Google Scholar  

Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological engagement: Validation of the student engagement instrument. Journal of School Psychology, 44 (5), 427–445. https://doi.org/10.1016/j.jsp.2006.04.002

Arum, R., & Roksa, J. (2011). Academically adrift: Limited learning on college campuses . University of Chicago Press.

Google Scholar  

Association of American Colleges and Universities [AAC&U]. (2007). College learning for the new global century: A report from the National Leadership Council for Liberal Education & America’s Promise . https://eric.ed.gov/?id=ED495004

Assunção, H., Lin, S., Sit, P., Cheung, K., Harju-Luukkainen, H., Smith, T., Maloa, B., Campos, J. A. D. B., Ilic, I. S., Esposito, G., Francesca, F. M., & Marôco, J. (2020). University Student Engagement Inventory (USEI): Transcultural validity evidence across four continents. Frontiers in Psychology, 10 , 1–12. https://doi.org/10.3389/fpsyg.2019.02796

Astin, A. W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Personnel, 25 , 297–307.

Astin, A. W. (1993). What matters in college? Four critical years revisited . Jossey-Bass.

Australian Council for Educational Research. (2023). Australasian Survey of Student Engagement (AUSSE): Background . https://www.acer.org/au/ausse/background

Axelson, R. D., & Flick, A. (2010). Defining student engagement. Change: The Magazine of Higher Learning, 43 (1), 38–43. https://doi.org/10.1080/00091383.2011.533096

Bandura, A. (1977). Social learning theory . Prentice Hall.

Bandura, A. (1997). Self-efficacy: The exercise of control . Freeman.

Barkley, E. F. (2018). Terms of engagement: Understanding and promoting student engagement in today’s college classroom. In K. Matsushita (Ed.), Deep active learning: Toward greater depth in university education (pp. 35–58). Springer. https://doi.org/10.1007/978-981-10-5660-4_3

Chapter   Google Scholar  

Barkley, E. F., Cross, K. P., & Major, C. H. (2014). Collaborative learning techniques: A handbook for college faculty . Jossey-Bass.

Baron, P., & Corbin, L. (2012). Student engagement: Rhetoric and reality. Higher Education Research and Development, 31 (6), 759–772. https://doi.org/10.1080/07294360.2012.655711

Beachboard, M. R., Beachboard, J. C., Li, W., & Adkison, S. R. (2011). Cohorts and relatedness: Self-determination theory as an explanation of how learning communities affect educational outcomes. Research in Higher Education, 52 , 853–874. https://doi.org/10.1007/s11162-011-9221-8

Beasley, S. (2021). Student–faculty interactions and psychosociocultural influences as predictors of engagement among Black college students. Journal of Diversity in Higher Education, 14 (2), 240–251. https://doi.org/10.1037/dhe0000169

Bowman, N. A. (2010a). Can 1st-year college students accurately report their learning and development? American Educational Research Journal, 47 (2), 466–496. https://doi.org/10.3102/0002831209353595

Bowman, N. A. (2010b). College diversity experiences and cognitive development: A meta-analysis. Review of Educational Research, 80 (1), 4–33. https://doi.org/10.3102/0034654309352495

Bowman, N. A. (2011). Validity of college self-reported gains at diverse institutions. Educational Researcher, 40 (1), 22–24. https://doi.org/10.3102/0013189X10397630

Bowman, N. A., & Hill, P. L. (2011). Measuring how college affects students: Social desirability and other potential biases in college student self-reported gains. New Directions for Institutional Research, 150 , 73–85. https://doi.org/10.1002/ir.390

Bowman, N. A., Rockenbach, A. N., & Mayhew, M. J. (2015). Campus religious/worldview climate, institutional religious affiliation, and student engagement. Journal of Student Affairs Research and Practice, 52 (1), 24–37. https://doi.org/10.1080/19496591.2015.996045

Bowman, N. A., Wolniak, G. C., Seifert, T. A., Wise, K., & Blaich, C. (2023). The long-term role of undergraduate experiences: Predicting intellectual and civic outcomes. Research in Higher Education, 64 , 379–401. https://doi.org/10.1007/s11162-022-09708-5

Boyer Commission on Educating Undergraduates in the Research University. (1998). Reinventing undergraduate education: A blueprint for America’s research universities . State University of New York. https://eric.ed.gov/?id=ED424840

Brint, S., & Cantwell, A. M. (2010). Undergraduate time use and academic outcomes: Results from the University of California undergraduate experience survey 2006. Teachers College Record, 112 (9), 2441–2470. https://doi.org/10.1177/01614681101120090

Brint, S., & Cantwell, A. M. (2014). Conceptualizing, measuring, and analyzing the characteristics of academically disengaged students: Results from UCUES 2010. Journal of College Student Development, 55 (8), 808–823. https://doi.org/10.1353/csd.2014.0080

Buckley, J. A., Korkmaz, A., & Kuh, G. D. (2008, November). The disciplinary effects of undergraduate research experiences with faculty on selected student self-reported gains . Paper presented at the annual meeting of the Association for the Study of Higher Education.

Campbell, C. M., & Cabrera, A. F. (2011). How sound is NSSE? Investigating the psychometric properties of NSSE at a public, research-extensive institution. Review of Higher Education, 35 (1), 77–103. https://doi.org/10.1353/rhe.2011.0035

Carini, R. M., Kuh, G. D., & Klein, S. P. (2006). Student engagement and student learning: Testing the linkages. Research in Higher Education, 47 , 1–32. https://doi.org/10.1007/s11162-005-8150-9

Chen, W., & Chan, Y. (2020). Can higher education increase students’ moral reasoning? The role of student engagement in the U.S. Journal of Moral Education, 51 (2), 169–185. https://doi.org/10.1080/03057240.2020.1806045

Cherney, I. D. (2008). The effects of active learning on students’ memories for course content. Active Learning in Higher Education, 9 (2), 152–171. https://doi.org/10.1177/1469787408090841

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39 (7), 3–7. https://eric.ed.gov/?id=ED282491

Chong, Y., & Sin Soo, H. (2021). Evaluation of first-year university students’ engagement to enhance student development. Asian Journal of University Education, 17 (2), 113–121. https://doi.org/10.24191/ajue.v17i2.13388

Cleary, T. J., & Zimmerman, B. J. (2012). A cyclical self-regulatory account of student engagement: Theoretical foundations and applications. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 237–257). Springer. https://doi.org/10.1007/978-1-4614-2018-7_11

Coates, H. (2007). A model of online and general campus-based student engagement. Assessment and Evaluation in Higher Education, 32 (2), 121–141. https://doi.org/10.1080/02602930600801878

Coates, H. (2010). Development of the Australasian Survey of Student Engagement (AUSSE). Higher Education, 60 , 1–17. https://doi.org/10.1007/s10734-009-9281-2

Coates, H., & McCormick, A. C. (2014). Emerging trends and perspectives. In H. Coates & A. C. McCormick (Eds.), Engaging university students (pp. 151–158). Springer. https://doi.org/10.1007/978-981-4585-63-7_11

Coker, J. S., Heiser, E., & Taylor, L. (2018). Student outcomes associated with shortterm and semester study abroad programs. Frontiers: The Interdisciplinary Journal of Study Abroad, 30 (2), 92–105. https://doi.org/10.36366/frontiers.v30i2.414

Cole, J. S., Kennedy, M., & Ben-Avie, M. (2009). The role of precollege data in assessing and understanding student engagement in college. New Directions for Institutional Research, 141 , 55–69. https://doi.org/10.1002/ir.286

Community College Survey of Student Engagement. (2021a). About the Community College Survey of Student Engagement (CCSSE) . https://www.ccsse.org/aboutccsse/aboutccsse.cfm

Community College Survey of Student Engagement. (2021b). Community College Faculty Survey of Student Engagement (CCFSSE) . https://www.ccsse.org/CCFSSE/CCFSSE.cfm

Cruce, T., Wolniak, G., Seifert, T., & Pascarella, E. (2006). Impacts of good practices on cognitive development, learning orientations, and graduate degree plans during the first year of college. Journal of College Student Development, 47 (4), 365–383. https://doi.org/10.1353/csd.2006.0042

Culver, K. C., & Bowman, N. A. (2020). Is what glitters really gold? A quasi-experimental study of first-year seminars and college student success. Research in Higher Education, 61 , 167–196. https://doi.org/10.1007/s11162-019-09558-8

Deci, E. L., & Ryan, R. M. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well being. American Psychologist, 55 (1), 68–78. https://doi.org/10.1037//0003-066X.55.1.68

Deil-Amen, R. (2011). Socio-academic integrative moments: Rethinking academic and social integration among two-year college students in career-related programs. The Journal of Higher Education, 82 (1), 54–91. https://doi.org/10.1080/00221546.2011.11779085

DeMarinis, M., Beaulieu, J., Cull, I., & Alaa, A. (2017). A mixed-methods approach to understanding the impact of a first-year peer mentor program. Journal of the First-Year Experience and Students in Transition, 29 (2), 93–107.

Denson, N., & Chang, M. J. (2009). Racial diversity matters: The impact of diversity-related student engagement and institutional context. American Educational Research Journal, 46 (2), 322–353. https://doi.org/10.3102/0002831208323278

Dixson, M. D. (2010). Creating effective student engagement in online courses: What do students find engaging? Journal of Scholarship of Teaching and Learning, 10 (2), 1–13. https://eric.ed.gov/?id=EJ890707

Dixson, M. D. (2015). Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learning Journal, 19 (4), 1–15. https://eric.ed.gov/?id=EJ1079585

Dong, S. (2019). The effects of first-generation status on student engagement and outcomes at liberal arts colleges. Journal of College Student Development, 60 (1), 17–34. https://doi.org/10.1353/csd.2019.0001

Douglass, J. A., & Zhao, C. (2013). Undergraduate research engagement at major U.S. research universities (Occasional paper series CSHE.14.13). Center for Studies in Higher Education. https://eric.ed.gov/?id=ED545187

Drennan, J., O’Reilly, S., O’Connor, M., O’Driscoll, C., Patterson, V., Purser, L., & Murray, J. (2014). The Irish survey of student engagement. In H. Coates & A. C. McCormick (Eds.), Engaging university students (pp. 109–125). Springer. https://doi.org/10.1007/978-981-4585-63-7_8

Eagan, M. K., Jr., Hurtado, S., Chang, M. J., Garcia, G. A., Herrera, F. A., & Garibay, J. C. (2013). Making a difference in science education: The impact of undergraduate research programs. American Educational Research Journal, 50 (4), 683–713. https://doi.org/10.3102/0002831213482038

Eccles, J., & Wang, M. (2012). Part I commentary: So what is student engagement anyway? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 133–148). Springer. https://doi.org/10.1007/978-1-4614-2018-7_6

Elphinstone, B., & Tinker, S. (2017). Use of the motivation and engagement scale–university/college as a means of identifying student typologies. Journal of College Student Development, 58 (3), 457–462. https://doi.org/10.1353/csd.2017.0034

Esposito, G., Marôcob, J., Passeggia, R., Pepicella, G., & Freda, M. F. (2022). The Italian validation of the university student engagement inventory. European Journal of Higher Education, 12 (1), 35–55. https://doi.org/10.1080/21568235.2021.1875018

Finley, A., & McNair, T. (2013). Assessing underserved students’ engagement in high-impact practices with an assessing equity in high-impact practices toolkit . Association of American Colleges and Universities. https://eric.ed.gov/?id=ED582014

Flynn, D. (2014). Baccalaureate attainment of college students at 4-year institutions as a function of student engagement behaviors: Social and academic student engagement behaviors matter. Research in Higher Education, 55 , 467–493. https://doi.org/10.1007/s11162-013-9321-8

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74 (1), 59–109. https://doi.org/10.3102/00346543074001059

Fuller, M. B., Wilson, M. A., & Tobin, R. M. (2011). The National Survey of Student Engagement as a predictor of undergraduate GPA: A cross-sectional and longitudinal examination. Assessment and Evaluation in Higher Education, 36 (6), 735–748. https://doi.org/10.1080/02602938.2010.488791

Garvey, J., BrckaLorenz, A., Latopolski, K., & Hurtado, S. (2018). High-impact practices and student–faculty interactions for students across sexual orientations. Journal of College Student Development, 59 (2), 210–226. https://doi.org/10.1353/csd.2018.0018

Gasiewski, J. A., Eagan, M. K., Garcia, G. A., Hurtado, S., & Chang, M. (2012). From gatekeeping to engagement: A multicontextual, mixed method study of student academic engagement in introductory STEM courses. Research in Higher Education, 53 , 229–261. https://doi.org/10.1007/s11162-011-9247-y

Gonyea, R. M. (2006, May). The relationship between student engagement and selected desirable outcomes in the first year of college . Paper presented at the annual meeting of the Association for Institutional Research.

Gonyea, R. M. (2008, November). The impact of study abroad on senior year engagement . Paper presented at the annual meeting of the Association for the Study of Higher Education.

Gordon, J., Ludlum, J., & Hoey, J. J. (2008). Validating NSSE against student outcomes: Are they related? Research in Higher Education, 49 , 19–39. https://doi.org/10.1007/s11162-007-9061-8

Greene, T. G., Marti, C., & McClenney, K. (2008). The effort – outcome gap: Differences for African American and Hispanic community college students in student engagement and academic achievement. Journal of Higher Education, 79 (5), 513–539. https://doi.org/10.1080/00221546.2008.11772115

Grier-Reed, T., Appleton, J., Rodriguez, M., Ganuza, Z., & Reschly, A. (2012). Exploring the student engagement instrument and career perceptions with college students. Journal of Educational and Developmental Psychology, 2 (2), 1–12. https://doi.org/10.5539/jedp.v2n2p85

Griffin, C. P., & Howard, S. (2017). Restructuring the college classroom: A critical reflection on the use of collaborative strategies to target student engagement in higher education. Psychology Learning and Teaching, 16 (3), 375–392. https://doi.org/10.1177/1475725717692681

Groccia, J. E. (2018). What is student engagement? New Directions for Teaching and Learning, 154 , 11–20. https://doi.org/10.1002/tl.20287

Guerrero, M., & Rod, A. B. (2013). Engaging in office hours: A study of student-faculty interaction and academic performance. Journal of Political Science Education, 9 (4), 403–416. https://doi.org/10.1080/15512169.2013.835554

Hagel, P., Carr, R., & Devlin, M. (2012). Conceptualising and measuring student engagement through the Australasian Survey of Student Engagement (AUSSE): A critique. Assessment and Evaluation in Higher Education, 37 (4), 475–486. https://doi.org/10.1080/02602938.2010.545870

Handelsman, M. M., Briggs, W. L., Sullivan, N., & Towler, A. (2005). A measure of college student course engagement. The Journal of Educational Research, 98 (3), 184–192. https://doi.org/10.3200/JOER.98.3.184-192

Hanson, J., Paulsen, M., & Pascarella, E. (2016a). Understanding graduate school aspirations: The effect of good teaching practices. Higher Education, 71 , 735–752. https://doi.org/10.1007/s10734-015-9934-2

Hanson, J. M., Trolian, T. L., Paulsen, M. B., & Pascarella, E. T. (2016b). Evaluating the influence of peer learning on psychological well-being. Teaching in Higher Education, 21 (2), 191–206. https://doi.org/10.1080/13562517.2015.1136274

Harper, S. R., & Quaye, S. J. (2008). Beyond sameness, with engagement and outcomes for all: An introduction. In S. R. Harper & S. J. Quaye (Eds.), Student engagement in higher education: Theoretical perspectives and practical approaches for diverse populations (1st ed., pp. 1–16). Routledge.

Harper, S. R., Carini, R. M., Bridges, B. K., & Hayek, J. C. (2004). Gender differences in student engagement among African American undergraduates at historically Black colleges and universities. Journal of College Student Development, 45 (3), 271–284. https://doi.org/10.1353/csd.2004.0035

Harris, J. C., & BrckaLorenz, A. (2017). Black, White, and Biracial students’ engagement at differing institutional types. Journal of College Student Development, 58 (5), 783–789. https://doi.org/10.1353/csd.2017.0061

Hart, S. R., Stewart, K., & Jimerson, S. R. (2011). The Student Engagement in Schools Questionnaire (SESQ) and the Teacher Engagement Report Form-New (TERF-N): Examining the preliminary evidence. Contemporary School Psychology, 15 , 67–79. https://doi.org/10.1007/BF03340964

Hatch, D. K. (2017). The structure of student engagement in community college student success programs: A quantitative activity systems analysis. AERA Open, 3 (4), 1–14. https://doi.org/10.1177/2332858417732744

Hauck, A. A., Ward, C., Persutte-Manning, S. L., & Vaughan, A. L. (2020). Assessing first-year seminar performance with college engagement, academic self-efficacy, and student achievement. Journal of Higher Education Theory and Practice, 20 (4), 88–101.

Hedrick, B., Dizen, M., Collins, K., Evans, J., & Grayson, T. (2010). Perceptions of college students with and without disabilities and effects of STEM and non-STEM enrollment on student engagement and institutional involvement. Journal of Postsecondary Education and Disability, 23 (2), 129–136. https://eric.ed.gov/?id=EJ906698

Hendrickson, J. M., Therrien, W. J., Weeden, D. D., Pascarella, E., & Hosp, J. L. (2015). Engagement among students with intellectual disabilities and first year students: A comparison. Journal of Student Affairs Research and Practice, 52 (2), 204–219. https://doi.org/10.1080/19496591.2015.1041872

Herrmann, K. J. (2013). The impact of cooperative learning on student engagement: Results from an intervention. Active Learning in Higher Education, 14 (3), 175–187. https://doi.org/10.1177/1469787413498035

Hu, S. (2011). Reconsidering the relationship between student engagement and persistence in college. Innovative Higher Education, 36 , 97–106. https://doi.org/10.1007/s10755-010-9158-4

Hu, S., & Li, S. (Eds.). (2011). Using typological approaches to understand college student experiences and outcomes (New Directions for Institutional Research, Assessment Supplement 2011). Jossey-Bass.

Hu, S., & McCormick, A. C. (2012). An engagement-based student typology and its relationship to college outcomes. Research in Higher Education, 53 , 738–754. https://doi.org/10.1007/s11162-012-9254-7

Hu, S., & Wolniak, G. C. (2010). Initial evidence on the influence of college student engagement on early career earnings. Research in Higher Education, 51 , 750–766. https://doi.org/10.1007/s11162-010-9176-1

Hu, S., & Wolniak, G. C. (2013). College student engagement and early career earnings: Differences by gender, race/ethnicity, and academic preparation. Review of Higher Education, 36 (2), 211–233. https://doi.org/10.1353/rhe.2013.0002

Hu, S., Kuh, G. D., & Li, S. (2008). The effects of engagement in inquiry-oriented activities on student learning and personal development. Innovative Higher Education, 33 , 71–81. https://doi.org/10.1007/s10755-008-9066-z

Hurtado, S. S., Gonyea, R. M., Graham, P. A., & Fosnacht, K. (2020). The relationship between residential learning communities and student engagement. Learning Communities: Research and Practice, 8 (1), 5. https://eric.ed.gov/?id=EJ1251590

Indiana University Center for Postsecondary Research. (2013). NSSE’s conceptual framework . https://nsse.indiana.edu/nsse/about-nsse/conceptual-framework/index.html

Indiana University Center for Postsecondary Research. (2021a). About BCSSE . https://nsse.indiana.edu/bcsse/about-bcsse/index.html

Indiana University Center for Postsecondary Research. (2021b). About FSSE. https://nsse.indiana.edu/fsse/about-fsse/index.html

Indiana University Center for Postsecondary Research. (2021c). About NSSE. https://nsse.indiana.edu/nsse/about-nsse/index.html

Indiana University Center for Postsecondary Research. (2021d). What is CLASSE? https://nsse.indiana.edu/research/classe.html

Inkelas, K. K., Soldner, M., Longerbeam, S. D., & Leonard, J. B. (2008). Differences in student outcomes by types of living–learning programs: The development of an empirical typology. Research in Higher Education, 49 , 495–512. https://doi.org/10.1007/s11162-008-9087-6

Jach, E. A., & Trolian, T. L. (2022). Applied learning experiences in higher education and students’ attitudes toward professional success. Journal of Student Affairs Research and Practice, 59 (4), 401–418. https://doi.org/10.1080/19496591.2021.1967758

Johnson, S. R., & Stage, F. K. (2018). Academic engagement and student success: Do high-impact practices mean higher graduation rates? The Journal of Higher Education, 89 (5), 753–781. https://doi.org/10.1080/00221546.2018.1441107

Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education, 38 (5), 758–773. https://doi.org/10.1080/03075079.2011.598505

Karim, M. I., & Hamid, H. S. A. (2016). Factor structure of the student engagement instrument among Malaysian undergraduates. The Malaysian Journal of Psychology, 30 (2), 1–12.

Keup, J. R., & Barefoot, B. O. (2005). Learning how to be a successful student: Exploring the impact of first-year seminars on student outcomes. Journal of the First-Year Experience and Students in Transition, 17 (1), 11–47.

Kezar, A. J. (2006). The impact of institutional size on student engagement. NASPA Journal, 43 (1), 87–114. https://doi.org/10.2202/1949-6605.1573

Kezar, A., & Kinzie, J. (2006). Examining the ways institutions create student engagement: The role of mission. Journal of College Student Development, 47 (2), 149–172. https://doi.org/10.1353/csd.2006.0018

Kilgo, C. A., Ezell Sheets, J. K., & Pascarella, E. T. (2015). The link between high-impact practices and student learning: Some longitudinal evidence. Higher Education, 69 , 509–525. https://doi.org/10.1007/s10734-014-9788-z

Kim, Y., & Lundberg, C. (2016). A structural model of the relationship between student–faculty interaction and cognitive skills development among college students. Research in Higher Education, 57 , 288–309. https://doi.org/10.1007/s11162-015-9387-6

Kim, Y., & Sax, L. (2009). Student–faculty interaction in research universities: Differences by student gender, race, social class, and first-generation status. Research in Higher Education, 50 , 437–459. https://doi.org/10.1007/s11162-009-9127-x

Kim, S. Y., Westine, C., Wu, T., & Maher, D. (2022). Validation of the higher education student engagement scale in use for program evaluation. Journal of College Student Retention: Research, Theory, and Practice . https://doi.org/10.1177/15210251221120908

Kimbark, K., Peters, M. L., & Richardson, T. (2016). Effectiveness of the student success course on persistence, retention, academic achievement, and student engagement. Community College Journal of Research and Practice, 41 (2), 124–138. https://doi.org/10.1080/10668926.2016.1166352

Kinkead, J. (2003). Learning through inquiry: An overview of undergraduate research. New Directions for Teaching and Learning, 93 , 5–17. https://doi.org/10.1002/tl.85

Kinzie, J. (2005). Promoting student success: What faculty members can do (Occasional paper no. 6). Indiana University Center for Postsecondary Research. https://scholarworks.iu.edu/dspace/handle/2022/23546

Kinzie, J., Gonyea, R., Kuh, G. D., Umbach, P., Blaich, C., & Korkmaz, A. (2007, November). The relationship between gender and student engagement in college . Paper presented at the annual meeting of the Association for the Study of Higher Education.

Komarraju, M., Musulkin, S., & Bhattacharya, G. (2010). Role of student–faculty interactions in developing college students’ academic self-concept, motivation, and achievement. Journal of College Student Development, 51 (3), 332–342. https://doi.org/10.1353/csd.0.0137

Krause, K., & Coates, H. (2008). Students’ engagement in first-year university. Assessment and Evaluation in Higher Education, 33 (5), 493–505. https://doi.org/10.1080/02602930701698892

Krause, K. L., Hartley, R., James, R., & McInnis, C. (2005). The first year experience in Australian universities: Findings from a decade of national studies . Centre for the Study of Higher Education, University of Melbourne.

Kuh, G. D. (2001). Assessing what really matters to student learning: Inside the National Survey of Student Engagement. Change: The Magazine of Higher Learning, 33 (3), 10–17. https://doi.org/10.1080/00091380109601795

Kuh, G. D. (2003a). What we’re learning about student engagement from NSSE: Benchmarks for effective educational practices. Change: The Magazine of Higher Learning, 35 (2), 24–32. https://doi.org/10.1080/00091380309604090

Kuh, G. D. (2003b). The National Survey of Student Engagement: Conceptual framework and overview of psychometric properties . Indiana University Center for Postsecondary Research. https://scholarworks.iu.edu/dspace/handle/2022/24268

Kuh, G. D. (2007, Winter). What student engagement data tell us about college readiness. Peer Review, 4–8. Association of American Colleges and Universities.

Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Association of American Colleges and Universities. https://www.aacu.org/publication/high-impact-educational-practices-what-they-are-who-has-access-to-them-and-why-they-matter

Kuh, G. D. (2009). The National Survey of Student Engagement: Conceptual and empirical foundations. New Directions for Institutional Research, 141 , 5–20. https://doi.org/10.1002/ir.283

Kuh, G. D., & Umbach, P. D. (2004). College and character: Insights from the National Survey of Student Engagement. New Directions for Institutional Research, 122 , 37–54. https://doi.org/10.1002/ir.108

Kuh, G. D., Hu, S., & Vesper, N. (2000). “They shall be known by what they do”: An activities-based typology of college students. Journal of College Student Development, 41 (2), 228–244.

Kuh, G. D., Kinzie, J., Buckley, J. A., Bridges, B. K., & Hayek, J. C. (2007a). Piecing together the student success puzzle. ASHE Higher Education Report, 34 (1), 1–87. https://doi.org/10.1002/aehe.3205

Kuh, G. D., Kinzie, J., Cruce, T., Shoup, R., & Gonyea, R. M. (2007b). Connecting the dots: Multi-faceted analyses of the relationships between student engagement results from the NSSE, and the institutional practices and conditions that foster student success . Indiana University Center for Postsecondary Research. https://scholarworks.iu.edu/dspace/handle/2022/23684

Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., & Gonyea, R. M. (2008). Unmasking the effects of student engagement on first-year college grades and persistence. The Journal of Higher Education, 79 (5), 540–563. https://doi.org/10.1080/00221546.2008.11772116

Lam, S. F., & Jimerson, S. R. (2008). Exploring student engagement in schools internationally: Consultation paper . International School Psychologist Association.

Lam, S.-f., Jimerson, S., Wong, B. P. H., Kikas, E., Shin, H., Veiga, F. H., Hatzichristou, C., Polychroni, F., Cefai, C., Negovan, V., Stanculescu, E., Yang, H., Liu, Y., Basnett, J., Duck, R., Farrell, P., Nelson, B., & Zollneritsch, J. (2014). Understanding and measuring student engagement in school: The results of an international study from 12 countries. School Psychology Quarterly, 29 (2), 213–232. https://doi.org/10.1037/spq0000057

LaNasa, S., Cabrera, A. F., & Tangsrud, H. (2009). The construct validity of student engagement: A confirmatory factor analysis. Research in Higher Education, 50 , 313–352. https://doi.org/10.1007/s11162-009-9123-1

Leach, L., & Zepke, N. (2011). Engaging students in learning: A review of a conceptual organiser. Higher Education Research and Development, 30 (2), 193–204. https://doi.org/10.1080/07294360.2010.509761

Lester, D. (2013). A review of the student engagement literature. Focus on Colleges, Universities, and Schools, 7 (1), 1–8.

Liem, G. A. D., & Martin, A. J. (2012). The motivation and engagement scale: Theoretical framework, psychometric properties, and applied yields. Australian Psychologist, 47 , 3–13. https://doi.org/10.1111/j.1742-9544.2011.00049.x

Lifelong Achievement Group. (2022). The Motivation and Engagement Scale (MES). https://lifelongachievement.com/pages/the-motivation-and-engagement-scale-mes

Loes, C. N. (2019). Applied learning through collaborative educational experiences. In T. L. Trolian & E. A. Jach (Eds.), Applied learning in higher education: Curricular and co-curricular experiences that improve student learning. New directions for higher education, 188 (pp. 13–21). Jossey-Bass. https://doi.org/10.1002/he.20341

Loes, C. N., & An, B. P. (2021). Collaborative learning and need for cognition: Considering the mediating role of deep approaches to learning. Review of Higher Education, 45 (2), 149–179. https://doi.org/10.1353/rhe.2021.0019

Lopatto, D. (2004). Survey of Undergraduate Research Experiences (SURE): First findings. Cell Biology Education, 3 (4), 270–277. https://doi.org/10.1187/cbe.04-07-0045

Lopatto, D. (2007). Undergraduate research experiences support science career decisions and active learning. CBE-Life Sciences Education, 6 (4), 297–306. https://doi.org/10.1187/cbe.07-06-0039

Lovelace, M. D., Reschly, A. L., Appleton, J. J., & Lutz, M. E. (2014). Concurrent and predictive validity of the student engagement instrument. Journal of Psychoeducational Assessment, 32 (6), 509–520. https://doi.org/10.1177/0734282914527548

Lundberg, C. A., Schreiner, L. A., Hovaguimian, K. D., & Miller, S. S. (2007). First-generation status and student race/ethnicity as distinct predictors of student involvement and learning. Journal of Student Affairs Research and Practice, 44 (1), 57–83. https://doi.org/10.2202/1949-6605.1755

Luo, Y., Xie, M., & Lian, Z. (2019). Emotional engagement and student satisfaction: A study of Chinese college students based on a nationally representative sample. The Asia-Pacific Education Researcher, 28 , 283–292. https://doi.org/10.1007/s40299-019-00437-5

Macfarlane, B., & Tomlinson, M. (2017). Critical and alternative perspectives on student engagement. Higher Education Policy, 30 , 1–4. https://doi.org/10.1057/s41307-016-0026-4

Machemer, P. L., & Crawford, P. (2007). Student perceptions of active learning in a large cross-disciplinary classroom. Active Learning in Higher Education, 8 (1), 11–31. https://doi.org/10.1177/1469787407074008

Maroco, J., Maroco, A. L., Campos, J. A. D. B., & Fredricks, J. A. (2016). University student’s engagement: Development of the University Student Engagement Inventory (USEI). Psychology: Research and Review, 29 , 21. https://doi.org/10.1186/s41155-016-0042-8

Marti, C. N. (2004). Overview of the CCSSE instrument and psychometric properties . Community College Survey of Student Engagement.

Martin, A. J. (2007). Examining a multidimensional model of student motivation and engagement using a construct validation approach. British Journal of Educational Psychology, 77 , 413–440. https://doi.org/10.1348/000709906X118036

Martin, A. J. (2009). Motivation and engagement across the academic life span: A developmental construct validity study of elementary school, high school, and university/college students. Educational and Psychological Measurement, 69 , 794–824. https://doi.org/10.1177/0013164409332214

Martin, J., & Torres, A. (2016). User’s guide and toolkit for the surveys of student engagement: The High School Survey of Student Engagement (HSSSE) and the Middle Grades Survey of Student Engagement (MGSSE) . https://www.nais.org/analyze/student-engagement-surveys/

Mayhew, M. J., Seifert, T. A., Pascarella, E. T., Nelson Laird, T. D., & Blaich, C. (2012). Going deep into mechanisms for moral reasoning growth: How deep learning approaches affect moral reasoning development for first-year students. Research in Higher Education, 53 , 26–46. https://doi.org/10.1007/s11162-011-9226-3

Mayhew, M. J., Rockenbach, A. N., Bowman, N. A., Seifert, T. A., & Wolniak, G. C. (2016). How college affects students: 21st century evidence that higher education works . Jossey-Bass.

McCarrell, K., & Selznick, B. (2020). (Re)measuring community college student engagement: Testing a seven-factor CCSSE model. Community College Review, 48 (4), 400–422. https://doi.org/10.1177/0091552120936345

McCarthy, M., & Kuh, G. D. (2006). Are students ready for college? What student engagement data say. The Phi Delta Kappan, 87 (9), 664–669. https://doi.org/10.1177/003172170608700909

McClenney, K. M., & Marti, C. N. (2006). Exploring relationships between student engagement and student outcomes in community colleges: Report on validation research . Community College Survey of Student Engagement. https://eric.ed.gov/?id=ED529076

McClenney, K. M., Marti, C. N., & Adkins, C. (2007). Student engagement and student outcomes: Key findings from CCSSE validation research . Community College Survey of Student Engagement.

McClenney, K., Marti, C. N., & Adkins, C. (2012). Student engagement and student outcomes: Key findings from “CCSSE” validation research . Community College Survey of Student Engagement. https://eric.ed.gov/?id=ED529076

McCormick, A. C., Kinzie, J., & Gonyea, R. M. (2013). Student engagement: Bridging research and practice to improve the quality of undergraduate education. In M. Paulsen (Ed.), Higher education: Handbook of theory and research (Vol. 28, pp. 47–92). Springer. https://doi.org/10.1007/978-94-007-5836-0_2

McDaniel, A., & Van Jura, M. (2022). High-impact practices: Evaluating their effect on college completion. Journal of College Student Retention: Research, Theory & Practice, 24 (3), 740–757. https://doi.org/10.1177/1521025120947357

Mills, M. (2010). Tools of engagement: Success course influence on student engagement. Journal of the First-Year Experience and Students in Transition, 22 (2), 9–31.

Moreira, P. A. S., Machado Vaz, F., Dias, P. C., & Petracchi, P. (2009). Psychometric properties of the Portuguese version of the Student Engagement Instrument. Canadian Journal of School Psychology, 24 (4), 303–317. https://doi.org/10.1177/0829573509346680

Nasir, M. A. M., Janikowski, T., Guyker, W., & Wang, C. C. (2020). Modifying the student course engagement questionnaire for use with online courses. Journal of Educators Online, 17 (1), 1–11. https://eric.ed.gov/?id=EJ1241583

National Survey of Student Engagement (NSSE). (2000). The NSSE 2000 report: National benchmarks of effective educational practice . Indiana University Center for Postsecondary Research. https://scholarworks.iu.edu/dspace/handle/2022/23382

Nelson Laird, T. F. (2005). College students’ experiences with diversity and their effects on academic self-confidence, social agency, and disposition toward critical thinking. Research in Higher Education, 46 , 365–387. https://doi.org/10.1007/s11162-005-2966-1

Nelson Laird, T. F., Shoup, R., & Kuh, G. D. (2005, May). Measuring deep approaches to learning using the National Survey of Student Engagement . Paper presented at the annual meeting of the Association for Institutional Research.

Nelson Laird, T. F., Chen, D., & Kuh, G. D. (2008a). Classroom practices at institutions with higher-than-expected persistence rates: What student engagement data tell us. New Directions for Teaching and Learning, 115 , 85–99. https://doi.org/10.1002/tl.327

Nelson Laird, T. F., Garver, A. K., Niskodé-Dossett, A. S., & Banks, J. V. (2008b, November). The predictive validity of a measure of deep approaches to learning . Paper presented at the annual meeting of the Association for the Study of Higher Education.

Nelson Laird, T. F., Smallwood, R., Niskodé-Dossett, A. S., & Garver, A. K. (2009). Effectively involving faculty in the assessment of student engagement. New Directions for Institutional Research, 141 , 71–81. https://doi.org/10.1002/ir.287

Nelson Laird, T. F., Seifert, T. A., Pascarella, E. T., Mayhew, M. J., & Blaich, C. F. (2014). Deeply affecting first-year students’ thinking: Deep approaches to learning and three dimensions of cognitive development. The Journal of Higher Education, 85 (3), 402–432. https://doi.org/10.1080/00221546.2014.11777333

Nora, A., Crisp, G., & Matthews, C. (2011). A reconceptualization of CCSSE’s benchmarks of student engagement. Review of Higher Education, 35 (1), 105–130. https://doi.org/10.1353/rhe.2011.0036

Ouimet, J. A., & Smallwood, R. A. (2005). Assessment measures: CLASSE – the class-level survey of student engagement. Assessment Update, 17 (6), 13–15.

Pace, C. R. (1984). Measuring the quality of college student experiences: An account of the development and use of the college student experiences questionnaire . Higher Education Research Institute. https://eric.ed.gov/?id=ED255099

Pascarella, E. T. (1985). College environmental influences on learning and cognitive development. In J. Smart (Ed.), Higher education: Handbook of theory and research (Vol. 1, pp. 1–61). Agathon.

Pascarella, E. T., & Terenzini, P. T. (1991). How college affects students: Findings and insights from twenty years of research . Jossey-Bass.

Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research . Jossey-Bass.

Pascarella, E. T., Seifert, T. A., & Blaich, C. (2008). Validation of the NSSE benchmarks and deep approaches to learning against liberal arts outcomes . Paper presented at the annual meeting of the Association for the Study of Higher Education.

Pascarella, E. T., Blaich, C., Martin, G. L., & Hanson, J. M. (2011). How robust are the findings of Academically Adrift ? Change: The Magazine of Higher Learning, 43 (3), 20–24. https://doi.org/10.1080/00091383.2011.568898

Perna, L. W., & Thomas, S. L. (2008). Theoretical perspectives on student success: Understanding the contributions of the disciplines. ASHE Higher Education Report, 32 (5), 1–182.

Pike, G. R., & Kuh, G. D. (2005). A typology of student engagement for American colleges and universities. Research in Higher Education, 46 , 185–209. https://doi.org/10.1007/s11162-004-1599-0

Pike, G. R., Kuh, G. D., & McCormick, A. C. (2011a). An investigation of the contingent relationships between learning community participation and student engagement. Research in Higher Education, 52 (3), 300–322. https://doi.org/10.1007/s11162-010-9192-1

Pike, G. R., Hansen, M. J., & Lin, C. (2011b). Using instrumental variables to account for selection effects in research on first-year programs. Research in Higher Education, 52 (2), 194–214. https://doi.org/10.1007/s11162-010-9188-x

Porter, S. R. (2006). Institutional structures and student engagement. Research in Higher Education, 47 , 521–558. https://doi.org/10.1007/s11162-005-9006-z

Porter, S. R. (2011). Do college student surveys have any validity? Review of Higher Education, 35 (1), 45–76. https://doi.org/10.1353/rhe.2011.0034

Price, D. V., & Tovar, E. (2014). Student engagement and institutional graduation rates: Identifying high-impact educational practices for community colleges. Community College Journal of Research and Practice, 38 (9), 766–782. https://doi.org/10.1080/10668926.2012.719481

Radloff, A., & Coates, H. (2014). Engaging university students in Australia. In H. Coates & A. C. McCormick (Eds.), Engaging university students (pp. 53–64). Springer. https://doi.org/10.1007/978-981-4585-63-7_4

Ramsden, P. (1991). A performance indicator of teaching quality in higher education: The course experience questionnaire. Studies in Higher Education, 16 (2), 129–150. https://doi.org/10.1080/03075079112331382944

Reschly, A. L., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: Evolution and future directions. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 3–20). Springer. https://doi.org/10.1007/978-1-4614-2018-7

Robbins, S. B., Oh, I.-S., Le, H., & Button, C. (2009). Intervention effects on college performance and retention as mediated by motivational, emotional, and social control factors: Integrated meta-analytic path analyses. Journal of Applied Psychology, 94 (5), 1163–1184. https://doi.org/10.1037/a0015738

Roblyer, M. D., & Wiencke, W. R. (2003). Design and use of a rubric to assess and encourage interactive qualities in distance courses. American Journal of Distance Education, 17 (2), 77–98. https://doi.org/10.1207/S15389286AJDE1702_2

Roblyer, M. D., & Wiencke, W. R. (2004). Exploring the interaction equation: Validating a rubric to assess and encourage interaction in distance courses. Journal of Asynchronous Learning Networks, 8 (4), 25–37. https://doi.org/10.24059/olj.v8i4.1808

Ross, H., Cen, Y., & Shi, J. (2014). Engaging students in China. In H. Coates & A. C. McCormick (Eds.), Engaging university students (pp. 93–107). Springer. https://doi.org/10.1007/978-981-4585-63-7_7

Russell, S. H., Hancock, M. P., & McCullough, J. (2007). Benefits of undergraduate research experiences. Science, 316 (5824), 548–549. https://doi.org/10.1126/science.1140384

Sax, L. J., Bryant, A. N., & Harper, C. E. (2005). The differential effects of student-faculty interaction on college outcomes for women and men. Journal of College Student Development, 46 (6), 642–657. https://doi.org/10.1353/csd.2005.0067

Schunk, D. H., & Mullen, C. A. (2012). Self-efficacy as an engaged learner. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 219–236). Springer. https://doi.org/10.1007/978-1-4614-2018-7_10

Seifert, T., Goodman, K., Lindsay, N., Jorgensen, J., Wolniak, G., Pascarella, E., & Blaich, C. (2008). The effects of liberal arts experiences on liberal arts outcomes. Research in Higher Education, 49 , 107–125. https://doi.org/10.1007/s11162-007-9070-7

Seifert, T. A., Gillig, B., Hanson, J. M., Pascarella, E. T., & Blaich, C. F. (2014). The conditional nature of high impact/good practices on student learning outcomes. The Journal of Higher Education, 85 (4), 531–564. https://doi.org/10.1080/00221546.2014.11777339

She, L., KhoshnavayFomani, F., Marôco, J., Allen, K.-A., Sharif Nia, H., & Rahmatpour, P. (2023). Psychometric properties of the university student engagement inventory among Chinese students. Asian Association of Open Universities Journal, 18 (1), 46–60. https://doi.org/10.1108/AAOUJ-08-2022-0111

Shi, J., Wen, W., Yifei, L., & Jing, C. (2014). China College Student Survey (CCSS): Breaking open the black box of the process of learning. International Journal of Chinese Education, 3 (1), 132–159. https://doi.org/10.1163/22125868-12340033

Sinval, J., Casanova, J. R., Marôco, J., & Almeida, L. S. (2021). University student engagement inventory (USEI): Psychometric properties. Current Psychology, 40 , 1608–1620. https://doi.org/10.1007/s12144-018-0082-6

Skinner, E. A., & Pitzer, J. R. (2012). Developmental dynamics of student engagement, coping, and everyday resilience. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 21–44). Springer. https://doi.org/10.1007/978-1-4614-2018-7

Skinner, E. A., Wellborn, J. G., & Connell, J. P. (1990). What it takes to do well in school and whether I’ve got it: A process model of perceived control and children’s engagement and achievement in school. Journal of Educational Psychology, 82 (1), 22–32. https://doi.org/10.1037/0022-0663.82.1.22

Stebleton, M. J., Soria, K. M., & Cherney, B. T. (2013). The high impact of education abroad: College students’ engagement in international experiences and the development of intercultural competencies. Frontiers: The Interdisciplinary Journal of Study Abroad, 22 (1), 1–24. https://doi.org/10.36366/frontiers.v22i1.316

Strayhorn, T. L. (2008). How college students’ engagement affects personal and social learning outcomes. Journal of College and Character, 10 (2), 1–16. https://doi.org/10.2202/1940-1639.1071

Stringer, G. (2022a). NAIS report on the 2022 Middle Grades Survey of Student Engagement (MGSSE). National Association of Independent Schools. https://www.nais.org/articles/pages/research/nais-report-on-the-2022-middle-grades-survey-of-student-engagement-mgsse/

Stringer, G. (2022b). NAIS report on the 2022 High School Survey of Student Engagement (HSSSE). National Association of Independent Schools. https://www.nais.org/articles/pages/research/nais-research-report-on-the-2022-high-school-survey-of-student-engagement-hssse/

Strydom, J. F., & Mentz, M. M. (2010). South African survey of student engagement: Focusing the student experience on success through student engagement . South African Council on Higher Education. https://eletsa.org.za/wp-content/uploads/2020/06/focusing-the-student-experience-on-success-through-student-engagement-92-eng.pdf

Strydom, J. F., & Mentz, M. M. (2014). Student engagement in South Africa: A key to success, quality and development. In H. Coates & A. C. McCormick (Eds.), Engaging university students (pp. 77–91). Springer. https://doi.org/10.1007/978-981-4585-63-7_6

Sweat, J., Jones, G., Han, S., & Wolfgram, S. M. (2013). How does high impact practice predict student engagement? A comparison of white and minority students. International Journal for the Scholarship of Teaching and Learning, 7 (2), 17. https://doi.org/10.20429/ijsotl.2013.070217

Tinto, V. (1975). Dropout from higher education: A theoretical synthesis of recent research. Review of Educational Research, 45 (1), 89–125. https://doi.org/10.3102/00346543045001089

Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition . University of Chicago Press.

Trolian, T. L., & Jach, E. A. (2020). Engagement in college and university applied learning experiences and students’ academic motivation. Journal of Experiential Education, 43 (4), 317–335. https://doi.org/10.1177/1053825920925100

Trolian, T., & Parker, E. (2017). Moderating influences of student-faculty interactions on students’ graduate and professional school aspirations. Journal of College Student Development, 58 (8), 1261–1267. https://doi.org/10.1353/csd.2017.0098

Trolian, T. L., Jach, E. A., Hanson, J. M., & Pascarella, E. T. (2016). Influencing academic motivation: The effects of student-faculty interaction. Journal of College Student Development, 57 (7), 810–826. https://doi.org/10.1353/csd.2016.0080

Trolian, T. L., Jach, E. A., & Archibald, G. C. (2021). Shaping students’ attitudes toward professional success: Examining the role of student-faculty interactions. Innovative Higher Education, 46 (2), 111–131. https://doi.org/10.1007/s10755-020-09529-3

Trowler, V. (2010). Student engagement literature review. The Higher Education Academy. https://pure.hud.ac.uk/en/publications/student-engagement-literature-review

Umbach, P. D., & Kuh, G. D. (2006). Student experiences with diversity at liberal arts colleges: Another claim for distinctiveness. The Journal of Higher Education, 77 (1), 169–192. https://doi.org/10.1080/00221546.2006.11778923




Engaging Undergraduate Students in Course-based Research Improved Student Learning of Course Material

  • Nicole T. Appel
  • Ammar Tanveer
  • Sara Brownell
  • Joseph N. Blattman

School of Life Sciences, Arizona State University, Tempe, AZ 85281


*Address correspondence to: Joseph N. Blattman (E-mail: [email protected]).

Course-based undergraduate research experiences (CUREs) offer students opportunities to engage in critical thinking and problem solving. However, quantitating the impact of incorporating research into undergraduate courses on student learning and performance has been difficult since most CUREs lack a comparable traditional course as a control. To determine how course-based research impacts student performance, we compared summative assessments of the traditional format for our upper division immunology course (2013–2016), in which students studied known immune development and responses, to the CURE format (2017–2019), in which students studied the effects of genetic mutations on immune development and responses. Because the overall class structure remained unaltered, we were able to quantitate the impact of incorporating research on student performance. Students in the CURE format class performed significantly better on quizzes, exams, and reports. There were no significant differences in academic levels, degree programs, or grade point averages, suggesting improved performance was due to increased engagement of students in research.

INTRODUCTION

Research experiences benefit undergraduate students by offering opportunities to engage in critical thinking and problem solving beyond the textbook and known experimental outcomes ( Kardash, 2000 ; Russell et al. , 2007 ; D’Arcy et al. , 2019 ). AAAS Vision and Change: A Call to Action suggested incorporating research into undergraduate education for students to appreciate the setbacks and unexpected outcomes of scientific research and to apply analytical skills and critical thinking to understanding their results (Bauerle et al. , 2011). Skills developed during undergraduate research experiences (UREs), such as teamwork, critical thinking, and oral and written communication, help prepare students for the workforce regardless of whether they remain in a Science, Technology, Engineering, and Mathematics (STEM) field ( McClure-Brenchley et al. , 2020 ). Participating in undergraduate research is also associated with increased retention and likelihood of pursuing scientific research as a career ( Mastronardi et al. , 2021 ). Course-based undergraduate research experiences (CUREs) provide students with opportunities to develop scientific process skills while reaching far more students than one-on-one undergraduate research experiences ( Bangera and Brownell, 2014 ; Burmeister et al. , 2021 ). CURE classes are defined as integrating scientific practices, discovery, collaboration, and iteration with broadly relevant work. All five criteria must be met for a class to be a CURE, although every CURE may cover each criterion in varying degrees ( Auchincloss et al. , 2014 ).

Various surveys for undergraduate research experiences are available for measuring psychological and knowledge-based gains of participating in a CURE class. Using these tools, research has shown several benefits to transitioning to a CURE class. Biology Intensive Orientation Summer (BIOS) is a CURE that originated in China to help undergraduate students gain research skills and graduate students gain mentoring skills. The CURE gave students the confidence and skills to pursue mentor-based UREs, which was a graduation requirement ( Fendos et al. , 2022 ). Another study found that, after participating in a CURE for a year, students perceived gains in scientific literacy, data collection, presenting results via oral and written communication, and maintaining a lab notebook ( Peteroy-Kelly et al. , 2017 ). Other CUREs, such as the biochemistry authentic scientific inquiry laboratory (BASIL) CURE, measured student-reported gains in lab skills, termed anticipated learning outcomes or course-based undergraduate research abilities. Students were asked about their knowledge, experience, and confidence in the seven anticipated learning outcomes in pretests and posttests and reported gains in all seven areas ( Irby et al. , 2020 ). Measuring gains in content knowledge is rarer, but a study by Wolkow et al. followed up with students 1 and 3 years after taking an introductory biology class to which students were randomly assigned to either a CURE or traditional lab class ( Wolkow et al. , 2014 ; Wolkow et al. , 2019 ). One year after the course, students in the CURE lab reported greater psychological gains, such as enjoying the class and considering a research career, and performed better on an assessment used to measure gains on topics covered in the CURE lab. The knowledge gains for general introductory biology were comparable between groups ( Wolkow et al. , 2014 ).
By senior year, perceived gains were no longer different between those who were in the CURE and traditional lab classes as freshmen, and the knowledge gains for general introductory biology were comparable, too. However, the targeted knowledge gains of what was covered in the lab classes remained significantly higher in the CURE group ( Wolkow et al. , 2019 ).

In many CUREs, students develop their own questions and experiments to fulfill the five criteria of integrating scientific practices, discovery, collaboration, and iteration with broadly relevant work. Although students report positive outcomes when asked to compare CUREs with previous traditional labs they have taken, empirically measuring the benefits of engaging students in undergraduate research is difficult when questions and experiments vary by semester or even by lab group ( Linn et al. , 2015 ). In collaboration with Dr. Brownell, we agreed members from her lab could interview our students to determine whether any differences in cognitive and emotional ownership existed between the two class formats and, if so, whether that impacted student perceptions of collaboration, iteration, or discovery/relevance. The interviews were performed in 2016 (traditional), 2017 (CURE), and 2018 (CURE). Based on that collaboration, we learned that changing our Experimental Immunology class to a CURE format did not significantly impact collaboration or iteration, but the CURE format students perceived their data were novel and relevant outside of class. CURE format students also expressed increased cognitive and emotional ownership compared with traditional format students ( Cooper et al. , 2019 ). To ease the transition from a traditional format to the CURE format, the only change made between the formats was that the CURE students researched how a genetic change impacted the immune response alongside the control experiments by comparing the known immune responses of wild-type (WT) mice to previously uncharacterized genetically modified mice. The experiments and assessments were unchanged between class formats. We realized retrospectively that we were in a unique position to empirically measure whether and how incorporating research impacted students and therefore fill the gap in knowledge detailed by Linn et al. (2015) .
We knew our students had increased cognitive and emotional ownership when research was incorporated into the course ( Cooper et al. , 2019 ), and ownership has been linked to improved student performance ( Martínez et al. , 2019 ). Therefore, the first question we asked was whether incorporating research into the course resulted in improved overall performance. This question was examined using three main variables: overall course performance; sets of quizzes, reports, and exams; and individual assessment items. While we could not quantitate the amount of literature read or scientific skills acquired, we were able to compare the scores of the direct summative assessments intended to measure student learning. In this lab course, we had direct summative assessments in the form of lab reports, quizzes, and two exams. During the semester, we also assessed participation and lab notebooks to encourage students to come prepared to perform experiments and to understand the material beforehand, but we did not use participation or notebook grades to measure how well students learned the course material, since those assessments were tools to ensure students came to class knowing the procedures and actively participated. Therefore, we compared the total results from direct summative assessments (lab report, quiz, and exam grades) to determine whether incorporating research into a lab class had any impact on student learning. The quizzes and exams remained identical between the two class formats, which provided another control variable when comparing class formats. Because the Teaching Assistants (TAs) grading the assessments did not know scores would be compared by the professor after the transition to a CURE format, we argue the assessments were graded without bias for CURE or traditional formats. Because students were not told beforehand whether the class was traditional or CURE format, the “volunteer effect” also did not impact our findings ( Brownell et al. , 2013 ).
In a second experimental question, we asked whether additional factors influenced course performance or were distinct between traditional versus CURE formats, including grade point average (GPA), academic major, academic year, grading trends, and racial/ethnic or gender diversity.

MATERIALS AND METHODS

This study was conducted with an approved Institutional Review Board protocol (#4249) from Arizona State University.

Traditional Class Format

The class was divided into five sections each consisting of two or three laboratory exercises. The five sections were anatomy and cells of the immune system (labs 2–3), innate immunity (labs 4–6), adaptive immune system development (labs 7–8), acute adaptive immune response (labs 10–12), and immune memory and protection (labs 13–15). All experiments and protocols followed the lab manual “The Immune System: An Experimental Approach” ( Blattman et al. , 2016 ). The purpose of each lab could be copied from the lab manual. In other words, the traditional lab class format was prescriptive or “cookbook.” While students wrote individual hypotheses, the findings were not novel and no outside literature was needed to support the hypothesis; all necessary information to form a hypothesis was in the lab manual. All lab experiments were performed using immune cells from recently killed WT Bl6 mice (IACUC 19-1684T). The class structure was to start with a quiz, review the quiz, answer students’ questions regarding immunology and the day’s lab exercises, and then let the students perform the experiments and, when applicable, gather data the same day. Notebooks were signed at the end of class. Students were encouraged to have everything written in their notebooks and review class material before class started.

CURE Class Format

To transition to a CURE format, students studied mice with a genetic mutation that had not been studied in immunology, thereby generating novel data. By collaborating with other laboratories within Arizona State University, students studied what effect knocking out Mohawk (2017, Alan Rawls), having a Raf1L613V mutation (2018, Jason Newborn), or knocking out Z-DNA-binding protein 1 (2019, Bertram Jacobs) had on the immune response. The five areas of immunology studied, the lab manual, and protocols remained unchanged between traditional and CURE formats. The purpose of each CURE lab shifted to understanding how the immune response of a genetically modified mouse differed from the WT mouse. CURE students were required to read outside literature before class started to generate a novel hypothesis. They were instructed to hypothesize how the immune response would differ (better, worse, or no change) between mice and provide their own reasoning as to why. Other than encouraging students to find publications on PubMed, instructors did not help students generate hypotheses. Before experiments started, students discussed their hypotheses in small groups before sharing their different hypotheses with the class. Instructors encouraged students to share their different hypotheses by asking, “Did anyone think the immune response would be better in the knock out mouse? Why? Who thought there’d be no change? Why?” and finally saying, “All these hypotheses are valid. We don’t know the answer yet because the experiment has never been done before.” The class structure was to start with the quiz, review the quiz, answer immunology questions, discuss hypotheses and reasoning, answer questions related to the lab exercise, let the students perform the experiments and, when applicable, gather data the same day. Notebooks were signed at the end of class.
Because there was no time to generate a hypothesis during class, students were required to read the course material and apply it to the genetic mutation before coming to class.

At the beginning of every lab, students took a quiz on the immunology on which the lab was based and on the experiment itself. The first and last quizzes were omitted from the analysis: the first quiz was used to show students how the class would flow throughout the semester and therefore did not count toward the final grade, and the last quiz was a practical assessing student ability to analyze flow cytometry data.

Before class, both teaching methods required students to read the lab material, write the purpose, question, hypothesis, and procedures for the day’s lab in their notebooks, and take a quiz at the beginning of class. Students used the same book with the same optional practice questions and took the same quizzes. Although both class formats required students to come to class prepared, the CURE teaching method enforced that requirement because incorporating real scientific research into the class required CURE students to develop a novel hypothesis on how altering the gene of interest would impact the immune response. Due to time constraints, all reading for generating their novel hypothesis needed to happen before the class started. Immediately after reviewing the quiz, CURE students discussed their hypotheses in their groups for 2 minutes prior to sharing with the class via random call. To receive a notebook signature for the day’s lab, CURE students needed to have citations for their hypotheses. Students were expected to have hypotheses with citations before the start of class beginning with the second lab quiz.

Students wrote lab reports after the first four class sections. The fifth lab report was not included in this analysis since students taking the class 2013–2015 were not told to write a fifth report, and students in 2016–2019 could write the fifth report to replace the lowest report grade. Therefore, the significant difference between report grades was calculated based on reports 1 through 4 without replacing any scores since report 5 was omitted. Regardless of class format, students followed the same report rubric. Each report was worth 50 points, and the points were Introduction-7, Methods-5, Results-12, Discussion-15, References-5, Grammar-2, Legends/captions-2, and Formatting-2. In the introduction, students were expected to provide relevant information, purpose of the experiments, questions answered, and hypotheses. The CURE students not only provided the relevant immunology background for the report but also read additional literature for relevant background information regarding the gene of interest. This background reading (which took place before class and therefore before the quiz) then needed to provide a clear link to the hypothesis. Students were told the hypothesis should answer whether they expected the immune response would be greater, the same, or less than the WT mouse and why. The methods remained unchanged other than the CURE students had one additional sample to run due to also analyzing the immune response from the genetically modified mouse. For results, students in both class formats analyzed immune organ cell counts, flow cytometry data, ELISA results, and cytotoxicity data for their reports. However, students taking the CURE format had additional samples and needed to compare WT results with the genetically altered mouse. 
While the analysis itself was similar given the rubric (figures, description/summary, how data were obtained, and identifying controls in the experiments), CURE students analyzed two sets of data, learned how to prevent bias between samples, and then compared/contrasted the data in the discussion. In the discussion, both class formats read outside literature and discussed the impact of the data. Both formats analyzed WT data and determined whether the data obtained fit within expected values. For the CURE students, the WT data served as a control that then told them whether they correctly performed the experiment. Therefore, if the WT values fit the expected norm, then the data from the genetically altered mouse, which used the same methods and reagents, could be believed. Students then read further literature to try to understand the reasoning behind the results from the genetically altered mouse and then showed how their research regarding the gene of interest had impact outside of class. Both class formats discussed the impact of the established immunology and why the immunology was important to study (Supplemental Table S1).

The class had two exams: an open-book, take-home midterm, given as a hard copy before spring break and due when classes resumed, and a closed-book, in-class final.

The students followed the same rubric for lab reports (Supplemental Table S1) and had the same quizzes and exams. Quiz and exam questions consisted of multiple choice, fill in the blank, drawing, short answer, identify cells or organs, and math. Short answer and drawing questions could have resulted in variance in TA grading. However, any variance was mitigated by reviewing all quiz answers in class and exam answers when requested. The professor, who did not change between formats, was present for classes and answered questions regarding which short answers were or were not acceptable and whether partial credit would be granted. Drawings were also reviewed in class using either a whiteboard or TV screens depending on class size.

If the increase in grades in the CURE format were due to students obtaining copies of previous quizzes, then we would expect quiz grades to have started rising during the 4 years the traditional format was taught. The midterm was always an open-book, take-home exam given before students left for spring break, and the final was a closed-book, in-class exam; both exams had the same questions and available number of points in both class formats.

For both formats, TAs encouraged study practices for the final exam. In the traditional class, students were allowed a notecard during the final. In the CURE format, students had an in-class quiz-like review session 2 days before the final. While the TAs provided different study aids, students still studied on their own. In other words, the students in the traditional class could have still quizzed themselves while students in the CURE class were observed taking notes during the review session.

Regarding lab reports, different TAs graded the lab reports depending on the year. However, the same rubric was followed for grading, and the available number of points for each report remained consistent within the class format.

Finally, the quality of education remained consistent across the different class formats. The same professor was responsible for the class even though the TA teaching the class changed. In both formats, the class TAs were recognized for quality teaching. The traditional format was taught by a TA who was student-nominated and awarded ASU’s Teacher of the Year. The CURE format was taught by a TA who was self-nominated and awarded GPSA’s Teaching Excellence Award.

Student Demographics

GPA, degree program, and academic level were all analyzed in the results section as described below. The class did not have any prerequisites to enroll and was not required by any degree program at Arizona State University. In other words, the likelihood of students enrolling in Experimental Immunology remained consistent between class formats. A lecture class (MIC 420: Basic Immunology) was offered all the years that Experimental Immunology was taught. However, the lecture class was not required, and both the traditional and CURE class formats had a mixture of students who had and had not taken the lecture. Students did not elect to enroll in a CURE or traditional class and were not told prior to enrollment that the class format had changed to a CURE. Other demographics, including prior research experience, were assessed previously and not found to be significantly different between class formats ( Cooper et al. , 2019 ).

Statistical Analysis

Scores from quizzes, reports, and exams were pulled from 2013 to 2019 and analyzed for differences via unpaired t tests in GraphPad Prism. Correction for multiple t tests was done using the false discovery rate determined by the two-stage step-up procedure (Benjamini, Krieger, and Yekutieli). GPA was also analyzed via unpaired t tests to determine significance.
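
The per-assessment comparison described above can be sketched as follows. The scores are simulated stand-ins for the study data; only the procedure, unpaired t tests followed by the two-stage step-up (Benjamini, Krieger, and Yekutieli) false discovery rate correction, mirrors the analysis in the text.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)

# Hypothetical percentage scores per assessment type in each format
# (means loosely based on the reported class averages; n as reported).
traditional = {"quizzes": rng.normal(71, 10, 119),
               "reports": rng.normal(78, 8, 119),
               "exams": rng.normal(77, 9, 119)}
cure = {"quizzes": rng.normal(79, 10, 139),
        "reports": rng.normal(84, 8, 139),
        "exams": rng.normal(82, 9, 139)}

# One unpaired t test per assessment type
pvals = [stats.ttest_ind(cure[k], traditional[k]).pvalue for k in traditional]

# 'fdr_tsbky' = two-stage Benjamini-Krieger-Yekutieli step-up FDR procedure
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_tsbky")
for name, padj, r in zip(traditional, p_adj, reject):
    print(f"{name}: adjusted p = {padj:.2g}, significant = {r}")
```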

The Shannon–Wiener diversity index, chi-squared tests, and Fisher’s exact test were used to determine the level of diversity for degree program, academic level, race, and gender. The Shannon–Wiener diversity index is a measure of diversity within a population ( Shannon, 1948 ). A t test was used to compare results from the Shannon–Wiener diversity index ( Hutcheson, 1970 ). Further analysis for degree program, academic level, and race used chi-squared tests. Significant differences in gender were determined using Fisher’s exact test.
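
The Shannon–Wiener index and the Hutcheson t test for comparing two indices can be sketched as below. The index and variance formulas follow Shannon (1948) and Hutcheson (1970); the Welch-style degrees-of-freedom approximation and the example category counts are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

def shannon(counts):
    """Shannon-Wiener index H and its approximate variance (Hutcheson, 1970)."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    h = -np.sum(p * np.log(p))
    var = (np.sum(p * np.log(p) ** 2) - h ** 2) / n
    return h, var, n

def hutcheson_t(counts1, counts2):
    """Two-sided Hutcheson t test comparing two Shannon-Wiener indices."""
    h1, v1, n1 = shannon(counts1)
    h2, v2, n2 = shannon(counts2)
    t = (h1 - h2) / np.sqrt(v1 + v2)
    # Welch-Satterthwaite-style degrees of freedom
    df = (v1 + v2) ** 2 / (v1 ** 2 / n1 + v2 ** 2 / n2)
    p = 2 * stats.t.sf(abs(t), df)
    return t, p

# Hypothetical counts of students per degree program in each format
trad = [60, 30, 20, 9]    # n = 119
cure = [70, 35, 25, 9]    # n = 139
t, p = hutcheson_t(trad, cure)
print(f"t = {t:.3f}, p = {p:.3f}")
```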

GraphPad Prism’s multiple regression analysis was used to determine the impact of the predictor variables GPA, class format (0 = traditional, 1 = CURE), and academic level on the outcome variable (overall points earned). Further analysis examined the effects of the predictor variables on points earned on quizzes, reports, and exams separately.

Scores from quizzes 2013–2016 were analyzed via one-way ANOVA in GraphPad Prism.

RESULTS

Students in CURE Class Averaged ∼5% Higher than Students Taught via Traditional Method

To determine whether students learned more course material in the CURE format, we compared overall course grades and found that students in the CURE class performed better overall with a class average of 80% compared with students in the traditional format course who averaged 75% ( p < 0.0001) ( Figure 1A ). The aggregate semester grade was calculated from student quizzes (79% vs. 71%, p < 0.0001) ( Figure 1B ), lab reports (84% vs. 78%, p < 0.0001) ( Figure 1C ), and exams (82% vs. 77%, p < 0.01) ( Figure 1D ). On all three assessments, CURE format students had significantly improved performance, which suggests incorporating research into the immunology laboratory class resulted in improved understanding and application of the course material.

FIGURE 1. Changing class format to a CURE improved student performance. (A) Students enrolled in the CURE class format demonstrated improved mastery of course material compared with those in the traditional class format based on improved scores in quizzes, lab reports, and exams. (B) Quizzes were given at the beginning of class before the instructor/TA reviewed the material and experimental setup. Based on quiz scores, CURE students demonstrated increased understanding of and preparedness for class. (C) Lab reports assessed scientific writing and ability to analyze and interpret data. Students enrolled in the CURE format performed better overall on reports. (D) Students engaged in research scored higher on exams indicating improved mastery of course material. CURE students n = 139, traditional students n = 119; **** p < 0.0001, ** p < 0.01; unpaired t test was used to test statistical significance with false discovery rate determined by two-stage step-up (Benjamini, Krieger, and Yekutieli).

Changing Class Format to a CURE Improved Majority of Quiz Scores

We used quizzes to test student preparedness for class and understanding of important background information each laboratory period. CURE students achieved significantly higher scores on seven out of 13 quizzes ( Figure 2 ). Early in the semester, the CURE teaching method resulted in students performing significantly better on the second quiz (88% vs. 80%, p < 0.01). The CURE teaching method resulted in continued improved performance when viral infection was mimicked in quiz 5 by studying the innate immune response to poly(I:C) (61% vs. 51%, p < 0.01) and when lymphocyte development was studied in quiz 7 (88% vs. 78%, p < 0.0001) and quiz 8 (88% vs. 81%, p < 0.01). Later in the semester, lymphocyte response to virus was studied, specifically how T cells respond to viral infections. The difference in quiz scores between teaching methods then often exceeded 15%, such as in quizzes 10 (76% vs. 58%, p < 0.0001), 11 (82% vs. 61%, p < 0.0001), and 14 (83% vs. 66%, p < 0.0001). Scores for quizzes 12 and 13 were not significantly different between teaching methods (70% vs. 62% and 80% vs. 75%, respectively). When the focus was on learning a new technique instead of forming a new hypothesis, such as quiz 6 (79% vs. 80%) and quiz 9 (77% vs. 75%), no significant difference in scores was noticed. Scores for quizzes 3 and 4 were also not significantly different between teaching methods (87% vs. 82% and 67% vs. 71%, respectively). Lab 3 studied cells of the immune system and reviewed fundamentals for flow cytometry, which the class used to analyze data. Lab 4 studied oxidative burst, which occurs when leukocytes encounter a pathogen. Overall, seven of the 13 quizzes were significantly improved for CURE format versus traditional format. Of the six quizzes that were not significantly improved, two involved learning a technique instead of generating a hypothesis before obtaining novel results.

FIGURE 2. Incorporating research into the course resulted in improved quiz scores. Of the quizzes analyzed, students engaged in research earned higher scores in seven of the 13 quizzes. Two of the six quizzes in which there was no significant difference did not have novel data in the lab classes for those quizzes. Unfilled bars represent the traditional format, and filled bars represent the CURE format. CURE students n = 139, traditional students n = 119; **** p < 0.0001, ** p < 0.01, * p < 0.05, ns = not significant; unpaired t test was used to test statistical significance with false discovery rate determined by two-stage step-up (Benjamini, Krieger, and Yekutieli).

Incorporating Scientific Research Resulted in Improved Performance on Reports

While quizzes demonstrated student preparedness for class, we used laboratory reports to assess student analytical skills for interpreting data, as well as critical thinking about how their work applied to current research outside the class. The rubric for grading reports was unchanged between class formats. We found significantly improved scores for all four analyzed laboratory reports from CURE format students compared with scores from traditional format students. As with quizzes, incorporating research into the class benefitted students from the beginning. The CURE format resulted in students earning 6% higher on the first report (74% vs. 68%, p < 0.01). Students appeared to incorporate feedback from the first report regardless of class format given the second report was one letter grade higher for both sets of students. However, the benefit of incorporating research early resulted in the CURE class still scoring 8% higher (87% vs. 79%, p < 0.0001). The third report (89% vs. 83%, p < 0.0001) and fourth report (87% vs. 84%, p < 0.05) also demonstrated improved scientific writing when research was incorporated ( Figure 3 ).

FIGURE 3. Students demonstrated better scientific writing when they produced and analyzed novel data. Lab reports assessed analytical skills and data interpretation and required students to look at contemporary literature to understand how their work was applicable outside class. CURE students were told from the beginning of the semester their work was novel. The same rubric was used for both class formats in which over half the grade came from the results and discussion sections. CURE students scored higher on all analyzed reports. Unfilled bars represent the traditional format, and filled bars represent the CURE format. CURE students n = 139, traditional students n = 119; **** p < 0.0001, ** p < 0.01, * p < 0.05; unpaired t test was used to test statistical significance with false discovery rate determined by two-stage step-up (Benjamini, Krieger, and Yekutieli).

Midterm, but not Final, Exam Scores Improved in CURE Format

Two exams were given to assess student mastery of the course material. A midterm exam assessed student understanding of labs 1–9, and a final exam covered material from labs 10–15. The questions and format of both exams were the same for the traditional and CURE format courses. Again, students in the CURE format class performed significantly better on the midterm exam compared with students from the traditional format course (88% vs. 83%, p < 0.0001). However, student performance on the final exam did not differ significantly between traditional and CURE format courses (75% vs. 73%) ( Figure 4 ).

FIGURE 4. CURE students scored higher on the midterm but not the final. The exams tested student understanding of the course material; no questions were modified to incorporate research material. All students were provided with the same take-home, open-book midterm to be completed in the same timeframe. Although course-based research was not incorporated into the exam itself, CURE students scored higher on the midterm. The final exam was administered in class after all experiments were completed. CURE students and traditional students performed equally on the final. Unfilled bars represent the traditional format, and filled bars represent the CURE format. CURE students n = 139, traditional students n = 119; **** p < 0.0001, ns = not significant; unpaired t test was used to test statistical significance with false discovery rate determined by two-stage step-up (Benjamini, Krieger, and Yekutieli).

Student Demographics Remained Consistent Between Formats

The improved performance on quizzes, exams, and lab reports in the CURE format course compared with the traditional format course, despite no other differences in format or assessments, suggests that incorporating research into a laboratory course increases student mastery. However, other factors, including student demographics, could also explain this change. To determine whether the student population changed, we analyzed data on students' overall academic GPAs, degree programs, and academic levels. No significant difference was observed across GPA ( p = 0.07) ( Figure 5C ), degree program ( p = 0.6) ( Figure 5A ), or academic level ( p = 0.4) ( Figure 5B ). Therefore, the students who took the traditional class format were equally capable of mastering the course material as students who took the CURE class format, and the improved performance observed in the CURE format was likely due to incorporating research into the teaching method.

FIGURE 5. Student demographics were unchanged between traditional and CURE formats. (A) Students enrolled in either the traditional or CURE format participated in similar degree programs. CURE students n = 139, traditional students n = 119; Shannon Diversity test followed by an unpaired t test was used to test statistical significance in differences between formats. (B) Both the traditional and CURE formats consisted mostly of seniors. No significant difference was observed in student academic level between the different formats. CURE students n = 139, traditional students n = 119; Shannon Diversity test followed by an unpaired t test was used to test statistical significance in differences between formats. (C) No significant difference was found in student GPAs between the class formats. CURE students n = 139, traditional students n = 119; ns = not significant; an unpaired t test was used to test statistical significance .

Multiple Linear Regression Analysis Indicates the Teaching Intervention Improved Student Scores

Multiple linear regression (MLR) controls for other variables that affect student performance and is therefore a reliable method for determining whether a teaching intervention, such as incorporating research, impacted student performance and learning, or whether the change in performance was due to student-intrinsic factors ( Theobald and Freeman, 2014 ). In setting up the MLR analysis, we chose student GPA, class format, and academic level as the predictor, or control, variables. The outcome variable was the total number of points earned in the class. GPA represented overall academic performance. Academic level helped measure preparedness from previous coursework, since more senior students would likely have taken more life science classes to prepare them for an upper division immunology course. Class format represented the teaching intervention, which was incorporating research. MLR analysis showed that incorporating research significantly impacted student performance ( p = 0.0004) in the class when the predictor variables were controlled ( Table 1 ). GPA also served as a good indicator of how well a student would do in the class ( p < 0.0001), but academic level had no impact on how well students performed in class. Individual analyses were performed for quizzes, reports, and exams with similar findings (Supplemental Tables S2–S4).

TABLE 1. GPA and class format each impacted student scores. MLR analysis showed that GPA and class format each affected student performance in the class, whereas student academic level had no impact. Degree program could not be analyzed via MLR due to the number of different degree programs represented. Regression type was least squares. The model was: Overall Points Earned = β0 + β1·GPA + β2·Class Format + β3·Academic Level[Junior] + β4·Academic Level[Post-Bac] + β5·Academic Level[Graduate] + β6·Academic Level[Freshman]. CURE students n = 139, traditional students n = 119.

Parameter estimate   Variable                     P value    P value summary
β0                   Intercept                    <0.0001    ****
β1                   GPA                          <0.0001    ****
β2                   Class format                 0.0004     ***
β3                   Academic level [Junior]      0.2679     ns
β4                   Academic level [Post-Bacc]   0.2614     ns
β5                   Academic level [Graduate]    0.1862     ns
β6                   Academic level [Freshman]    0.7654     ns
β7                   Academic level [Senior]      0.3918     ns
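A regression of the kind reported in Table 1 can be sketched with statsmodels' formula API: points regressed on GPA, class format, and dummy-coded academic level with "Senior" as the reference. The data frame is fabricated for illustration, with an outcome constructed so that GPA and format matter but academic level does not, mirroring the reported result:

```python
# Illustrative least-squares MLR: points ~ GPA + class format + academic level.
# All data are simulated; coefficients are invented for the sketch.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 258  # 139 CURE + 119 traditional
df = pd.DataFrame({
    "gpa": rng.uniform(2.0, 4.0, n),
    "cure": rng.integers(0, 2, n),  # 1 = CURE format, 0 = traditional
    "level": rng.choice(["Senior", "Junior", "PostBac", "Graduate", "Freshman"], n),
})
# Outcome depends on GPA and format but not on academic level.
df["points"] = 300 + 60 * df["gpa"] + 25 * df["cure"] + rng.normal(0, 20, n)

# Treatment('Senior') makes Senior the reference level for the dummies.
model = smf.ols("points ~ gpa + cure + C(level, Treatment('Senior'))", data=df).fit()
print(model.pvalues.round(4))
```

In this simulated setting, the GPA and format coefficients are highly significant while the academic-level dummies are not, matching the pattern in Table 1.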

Improved Scores Were Due to Changing the Teaching Format without Changing Assessments

If the increase in grades in the CURE format were due to students obtaining copies of previous assessments, especially quizzes, then we would expect to see significant increases in quiz grades prior to changing class formats. We analyzed quiz averages across the 4 years the traditional lab was taught, 2013–2016. 2013 was the first year this course was taught; while there was an improvement between 2013 and 2014 (Supplemental Figure S1), no further improvement occurred across the remaining years. Nonetheless, we reanalyzed the overall scores for quizzes, reports, and exams to determine whether 2013 falsely lowered the scores from the traditional class format. All three assessment areas remained significantly different between class formats when the scores from the first ever class were omitted (Supplemental Figure S2).

Assessing Differences in TA Grading Showed No Significant Difference between Formats

The traditional format had four graders over the course of 4 years (two TAs and two assistant TAs). The CURE format had two graders over the course of 3 years (two TAs). The class professor, who was consistent across formats, also graded occasionally. Although it is unlikely that the clear divide in grades across class formats was due to grading variances, since there were seven total graders for the class, we sought to determine whether differences in scores were due to grading. We therefore analyzed the coefficient of variation (%CV) within samples and compared the two class formats. Any %CV due to student-intrinsic factors would be similar between teaching methods because student demographics were comparable between class formats. If %CV were significantly different between formats, then other factors, including differences in TA grading, would likely be responsible. The %CV for quizzes, reports, and exams did not differ significantly between class formats ( p = 0.0671, 0.3162, and 0.8858, respectively) ( Figure 6 ), suggesting that grading rigor was comparable between class formats.
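The grading-consistency check described above can be sketched as follows: compute the %CV of scores within each quiz, then compare the per-quiz %CVs between formats with an unpaired t test. A non-significant p value is consistent with comparable grading rigor. The score arrays are simulated stand-ins, not the actual gradebooks:

```python
# %CV comparison between class formats as a check on grading consistency.
# Scores are simulated with identical distributions for both formats.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)

def pct_cv(scores):
    """%CV = 100 * sample standard deviation / mean."""
    return 100 * np.std(scores, ddof=1) / np.mean(scores)

# One simulated score array per quiz (13 quizzes) for each format.
cv_cure = [pct_cv(rng.normal(87, 7, 139)) for _ in range(13)]
cv_trad = [pct_cv(rng.normal(87, 7, 119)) for _ in range(13)]

p = ttest_ind(cv_cure, cv_trad).pvalue
print(f"quiz %CV comparison: p = {p:.4f}")
```

The same pattern applies to the four reports and two exams, each treated as its own family of %CV values.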

FIGURE 6. Grading practices were comparable between traditional and CURE formats. (A) The 13 quizzes were analyzed for %CV to determine intravariability in grading in both traditional and CURE formats. The %CVs were then analyzed via unpaired t test. No significant difference in intravariability was found between class formats. (B) The four reports were analyzed for %CV to determine intravariability in grading in both traditional and CURE formats. The %CVs were then analyzed via unpaired t test. No significant difference in intravariability was found between class formats. (C) The two exams were analyzed for %CV to determine intravariability in grading in both traditional and CURE formats. The %CVs were then analyzed via unpaired t test. No significant difference in intravariability was found between class formats.

Experimental Immunology Taught a Diverse Student Population

CUREs are known to include more students in research and to have broader reach than the traditional one-on-one mentoring model ( Bangera and Brownell, 2014 ; Burmeister et al. , 2021 ). Both the traditional and CURE class formats rated high on the Shannon–Wiener diversity index, with richness scores of 7 and 11, respectively. There was no significant difference in class diversity as calculated via the Shannon–Wiener diversity index ( p = 0.2) or a Chi-squared test ( p = 0.3255) ( Figure 7A ). Both the traditional and CURE class formats had over 50% female students, and the ratio of female to male students was not significantly different between class formats as calculated via the Shannon–Wiener diversity index ( p = 0.2) or Fisher's exact test ( p = 0.1298) ( Figure 7B ). Overall, the Experimental Immunology class serves a diverse group of students, and by changing the class format to incorporate research, we engaged this diverse student population in critical thinking and problem solving.

FIGURE 7. Traditional and CURE formats served equally diverse student populations. (A) Students enrolled in the course came from diverse racial backgrounds, several of which are underrepresented in science. CURE students n = 139, traditional students n = 119; Shannon diversity test followed by an unpaired t test was used to test statistical significance in differences between formats. Chi-squared test was also tested. No significant difference was observed between class formats. (B) More women than men enrolled in Experimental Immunology. CURE students n = 139, traditional students n = 119; Shannon diversity test followed by an unpaired t test was used to test statistical significance in differences between formats. Fisher’s exact test was also used. No significant difference was observed between class formats.
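The Shannon–Wiener index used above, H = −Σ pᵢ ln pᵢ, and its associated richness (number of categories) can be computed directly from demographic counts. The counts below are invented; only the richness values (7 and 11) and class sizes match those reported:

```python
# Shannon-Wiener diversity index and richness from category counts.
# Counts are hypothetical; richness and totals mirror the reported values.
import math

def shannon(counts):
    """H = -sum(p_i * ln(p_i)) over nonzero category proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical students per demographic category in each format.
trad_counts = [60, 25, 12, 10, 6, 4, 2]              # richness 7, n = 119
cure_counts = [55, 30, 18, 12, 8, 6, 4, 3, 1, 1, 1]  # richness 11, n = 139

print(f"traditional: richness={len(trad_counts)}, H={shannon(trad_counts):.2f}")
print(f"CURE:        richness={len(cure_counts)}, H={shannon(cure_counts):.2f}")
```

Hutcheson's (1970) t test, cited in the figure legends, then compares two such H values while accounting for their sampling variances.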

DISCUSSION

While incorporating research into existing laboratory courses benefits students by encouraging critical thinking, problem solving, and reading current literature to see how their work is novel and applicable outside class, quantifying the impact of integrating research on student mastery of the course material has been difficult ( Linn et al. , 2015 ). We changed the format of an upper division immunology lab course into a CURE class by having students study the immune response of previously uncharacterized genetically altered mice compared with the known response of WT mice. The class structure, such as the experiments performed, lab manual used, and assessments, remained unchanged between formats, allowing us to compare the effect of incorporating research on student performance in the class. We realized retroactively that we were therefore in a unique position to determine whether incorporating research improved student learning of the original course material, as evidenced by improved scores.

Overall, we found that incorporating research into the class resulted in students performing significantly better in every assessment area, so the improvement in overall performance was unsurprising. The difference in quiz scores could be due to the nature of the hypotheses required in each class. Because students studied known outcomes in the traditional class format, the lab manual often provided enough information for students to know what to expect and why. However, the lab manual did not detail any genetic mutations. While CURE students read the same immunology background from the lab manual, they had the additional responsibility of applying what they learned to whether a genetic mutation would impact the immune system and, if so, how. Encouraging students to apply their knowledge before taking any assessment likely resulted in improved learning of the course material and therefore higher quiz scores ( Freeman et al. , 2014 ).

Incorporating research improved students' scientific writing, as evidenced by improved lab report scores. Writing about real scientific results in the results and discussion sections, which account for over half the lab report grade, likely made scientific writing more approachable. For example, the discussion section required students to compare their results with outside literature and explain why their results did or did not agree with it. The CURE teaching method required students to start reading outside literature before writing the report; CURE students therefore had an advantage in knowing which literature sources to cite in the discussion, because they had already read multiple sources to formulate a hypothesis. Increased ownership of and engagement in the class ( Cooper et al. , 2019 ) may also have raised scores, as students had to explain why their novel results were important outside of class ( Conley and French, 2014 ; Cannata et al. , 2019 ; Martínez et al. , 2019 ). The traditional teaching format encouraged outside reading before writing the report but did not require it; because the results were not novel in the traditional format, discussion of impact focused less on the students' own results and more on how the experiments studied are still used in current research.

The significant difference in exam scores came from the midterm, on which the CURE teaching method improved student scores by 5%. Because the midterm was an open-book, take-home exam completed over spring break, students in both class formats had equal access to the lab manual and previous quizzes. Therefore, the difference in scores was likely due to student motivation to devote the time to do well on the exam ( Dweck, 1986 ), possibly driven by increased project ownership ( Cannata et al. , 2017 ). The midterm also coincided with the peak in quiz performance, suggesting that students in the CURE format exhibited higher levels of engagement immediately after spring break.

The difference in scores began to decline when quizzes and assignments for the class overlapped with projects necessary for students to graduate, such as capstone and honors thesis projects. If students experienced equal levels of burnout or had multiple assignments due around the time the final was taken, then student engagement in the class would be comparable between class formats and therefore explain why the scores on the final exam were not significantly different.

Active learning encourages students to take ownership of their education by actively participating in what they learn. By changing the lab format to a CURE class, students received guidance on how to look up and interpret journal articles. Once empowered to educate themselves and to look up information beyond what was provided in the course materials, students appeared to engage genuinely in the class and take ownership of their projects and their education, as the improved scores suggest. Dirks and Cunningham ( 2006 ) showed that teaching scientific processes, such as graphing, data analysis, experimental design, scientific writing, and science communication, before students enrolled in introductory science classes improved content learning when those students later took introductory biology, despite minimal differences in student GPA/SAT scores. CUREs teach scientific processes alongside class material ( Auchincloss et al. , 2014 ). Our work supports the view that CUREs are an effective way to improve undergraduate education by engaging more students in scientific research. We showed that transitioning to a CURE format produced an improvement in scores similar to that seen when scientific processes were taught separately. This further shows that the CURE format enhanced student learning of course material rather than distracting from it, a concern educators raise as a reason for not integrating more active learning into their courses ( Kim et al. , 2018 ; Shadle et al. , 2017 ; Ul-Huda et al. , 2018 ).

One limitation of this study is that different TAs graded over the years, and grading responsibilities shifted as TAs changed. To mitigate variation, answers were reviewed in class whenever possible (always for quizzes, upon request for exams). Because all quiz answers were approved by the professor, who was present for both class formats, the quiz results are robust: CURE students outperformed traditional students on seven of the 13 quizzes, or seven of the 11 quizzes that required applying knowledge to a genetic mutation. A subset of assessments could not be regraded to verify the impact of adding research because they were handed back to students. Nonetheless, most questions did not have multiple correct answers; the only questions for which points awarded could vary with TA leniency were drawings and short answers. To test for such differences, we compared %CV between formats. We showed through multiple analyses (MLR, Chi-squared, diversity index, and previous work from Cooper et al. (2019) ) that student demographics were not responsible for the difference in scores, so any %CV related to student ability should remain consistent between formats; a change in %CV would therefore reflect other variables, such as TA grading leniency. Our data showed that %CV did not differ between class formats. Therefore, we do not believe having different graders significantly impacted the scores, because no variance difference was noted within the class formats themselves.

We also analyzed whether academic dishonesty in the form of obtaining quizzes from prior years could have resulted in improved scores. While the first year the class was taught did have lower scores compared with subsequent semesters, no further improvement occurred. The improvement from 2013 to 2014 likely resulted from having an experienced professor and experienced TAs.

Another limitation of this study is that we had no information on differences in family support and/or responsibilities (parents, spouse, children), socioeconomic factors such as whether the students were working to support themselves or were supported by family, and other personal factors that could affect student performance in the class ( Mushtaq and Khan, 2012 ). Nonetheless, previous studies showed that prior test scores, course knowledge, and experience had the highest correlations with student performance compared with other factors such as learning/teaching styles, gender, and family stress ( Van Lanen et al. , 2000 ; Clark and Latshaw, 2012 ; Mushtaq and Khan, 2012 ). Because no significant difference was noted in GPA, degree program, or academic level between the two formats, the improved scores likely resulted from the teaching method and not the students. This was further supported via MLR analysis, in which class format was found to contribute to student performance when all other variables, including GPA, were controlled. GPA was accessed after most of the students had graduated from Arizona State University, which means the GPAs analyzed were likely the students' final GPAs or close to them. We recognize that our students had both visible and invisible diversities that were not disclosed in interviews or through demographic information; measuring to what extent transitioning the class format to a CURE class impacted each demographic is therefore outside the scope of this study. Nonetheless, we did note that students in the CURE class performed better overall. We hope this information encourages educators to include active learning, particularly course-based research, in their classes, and encourages universities to offer rewards and incentives for educators to update courses as needed, to improve student engagement and thereby student mastery of course material.

Overall, we showed that incorporating research into an upper division lab improved student learning and mastery of course material. Changing the class to a CURE format gave students increased project ownership, which was likely associated with increased engagement in the course and ownership of their education, and which then translated to improved scores on assessments. We were able to show this because the overall class structure and assessments remained unaltered between the traditional and CURE class formats; the only change was that students studied how a previously uncharacterized gene impacted immune system development, response, and memory. Over the course of 3 years, 139 students from diverse backgrounds, some of which are underrepresented in science, participated in scientific research through this class, supporting the conclusion that CUREs can engage large numbers of diverse students in science ( Auchincloss et al. , 2014 ).

ACKNOWLEDGMENTS

We would like to thank all the labs that collaborated with Experimental Immunology to provide genetically modified mice. While these labs were already breeding mice for their own research, we are grateful for the work they put in to breed additional mice for this class. Special thanks to Cherie Alissa Lynch (2017), Michael Holter (2018), and Karen Kibler (2019) for helping make the CURE class possible. Student fees for Arizona State University's MIC 421 Experimental Immunology class were used to fund the class experiments.

REFERENCES

  • Auchincloss, L. C., Laursen, S. L., Branchaw, J. L., Eagan, K., Graham, M., Hanauer, D. I., … & Dolan, E. L. (2014). Assessment of course-based undergraduate research experiences: A meeting report. CBE—Life Sciences Education, 13(1), 29–40. https://doi.org/10.1187/cbe.14-01-0004
  • Bangera, G., & Brownell, S. E. (2014). Course-based undergraduate research experiences can make scientific research more inclusive. CBE—Life Sciences Education, 13(4), 602–606. https://doi.org/10.1187/cbe.14-06-0099
  • Bauerle, C. M., American Association for the Advancement of Science, National Science Foundation (U.S.), Division of Undergraduate Education, & Directorate for Biological Sciences. (2011). Vision and change in undergraduate biology education: A call to action: Final report of a national conference held from July 15–17, Washington, DC.
  • Blattman, J. N., McAfee, M. S., & Schoettle, L. (2016). The Immune System: An Experimental Approach (Preliminary). Cognella, Inc.
  • Brownell, S. E., Kloser, M. J., Fukami, T., & Shavelson, R. J. (2013). Context matters: Volunteer bias, small sample size, and the value of comparison groups in the assessment of research-based undergraduate introductory biology lab courses. Journal of Microbiology & Biology Education, 14(2), 176–182. https://doi.org/10.1128/jmbe.v14i2.609
  • Burmeister, A. R., Dickinson, K., & Graham, M. J. (2021). Bridging trade-offs between traditional and course-based undergraduate research experiences by building student communication skills, identity, and interest. Journal of Microbiology & Biology Education, 22(2). https://doi.org/10.1128/jmbe.00156-21
  • Cannata, M., Redding, C., & Nguyen, T. (2019). Building student ownership and responsibility: Examining student outcomes from a research-practice partnership. Journal of Research on Educational Effectiveness, 12(3), 333–362.
  • Cannata, M. A., Smith, T. M., & Taylor Haynes, K. (2017). Integrating academic press and support by increasing student ownership and responsibility. AERA Open, 3(3). https://doi.org/10.1177/2332858417713181
  • Clark, S., & Latshaw, C. (2012). "Peeling the onion" called student performance: An investigation into the factors affecting student performance in an introductory accounting class. Review of Business, 33(1), 19–27.
  • Conley, D. T., & French, E. M. (2014). Student ownership of learning as a key component of college readiness. American Behavioral Scientist, 58(8), 1018–1034. https://doi.org/10.1177/0002764213515232
  • Cooper, K. M., Blattman, J. N., Hendrix, T., & Brownell, S. E. (2019). The impact of broadly relevant novel discoveries on student project ownership in a traditional lab course turned CURE. CBE—Life Sciences Education, 18(4), ar57. https://doi.org/10.1187/cbe.19-06-0113
  • D'Arcy, C. E., Martinez, A., Khan, A. M., & Olimpo, J. T. (2019). Cognitive and non-cognitive outcomes associated with student engagement in a novel brain chemoarchitecture mapping course-based undergraduate research experience. Journal of Undergraduate Neuroscience Education, 18(1), A15–A43.
  • Dirks, C., & Cunningham, M. (2006). Enhancing diversity in science: Is teaching science process skills the answer? CBE—Life Sciences Education, 5(3), 218–226. https://doi.org/10.1187/cbe.05-10-0121
  • Dweck, C. S. (1986). Motivational processes affecting learning. American Psychologist, 41(10), 1040–1048. https://doi.org/10.1037/0003-066X.41.10.1040
  • Fendos, J., Cai, L., Yang, X., Ren, G., Li, L., Yan, Z., Lu, B., Pi, Y., Ma, J., Guo, B., Wu, X., Lu, P., Zhang, R., & Yang, J. (2022). A course-based undergraduate research experience improves outcomes in mentored research. CBE—Life Sciences Education, 21(3), ar49. https://doi.org/10.1187/cbe.21-03-0065
  • Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415.
  • Hutcheson, K. (1970). A test for comparing diversities based on the Shannon formula. Journal of Theoretical Biology, 29(1), 151–154. https://doi.org/10.1016/0022-5193(70)90124-4
  • Irby, S. M., Pelaez, N. J., & Anderson, T. R. (2020). Student perceptions of their gains in course-based undergraduate research abilities identified as the anticipated learning outcomes for a biochemistry CURE. Journal of Chemical Education, 97(1), 56–65. https://doi.org/10.1021/acs.jchemed.9b00440
  • Kardash, C. M. (2000). Evaluation of undergraduate research experience: Perceptions of undergraduate interns and their faculty mentors. Journal of Educational Psychology, 92(1), 191–201. https://doi.org/10.1037/0022-0663.92.1.191
  • Kim, A. M., Speed, C. J., & Macaulay, J. O. (2018). Barriers and strategies: Implementing active learning in biomedical science lectures. Biochemistry and Molecular Biology Education, 47(1), 29–40. https://doi.org/10.1002/bmb.21190
  • Linn, M. C., Palmer, E., Baranger, A., Gerard, E., & Stone, E. (2015). Undergraduate research experiences: Impacts and opportunities. Science, 347(6222), 1261757. https://doi.org/10.1126/science.1261757
  • Martínez, I. M., Youssef-Morgan, C. M., Chambel, M. J., & Marques-Pinto, A. (2019). Antecedents of academic performance of university students: Academic engagement and psychological capital resources. Educational Psychology, 39(8), 1047–1067. https://doi.org/10.1080/01443410.2019.1623382
  • Mastronardi, Borrego, M., Choe, N., & Hartman, R. (2021). The impact of undergraduate research experiences on participants' career decisions. Journal of STEM Education, 22(2), 75–82.
  • McClure-Brenchley, K. J., Picardo, K., & Overton-Healy, J. (2020). Beyond learning: Leveraging undergraduate research into marketable workforce skills. Scholarship and Practice of Undergraduate Research, 3(3), 28–35. https://doi.org/10.18833/spur/3/3/10
  • Mushtaq, I., & Khan, S. (2012). Factors affecting students' academic performance. Global Journal of Management and Business Research, 12(9), 17–22.
  • Peteroy-Kelly, M. A., Marcello, M. R., Crispo, E., Buraei, Z., Strahs, D., Isaacson, M., Jaworski, L., Lopatto, D., & Zuzga, D. (2017). Participation in a year-long CURE embedded into major core genetics and cellular and molecular biology laboratory courses results in gains in foundational biological concepts and experimental design skills by novice undergraduate researchers. Journal of Microbiology & Biology Education, 18(1), 18.1.10. https://doi.org/10.1128/jmbe.v18i1.1226
  • Russell, S. H., Hancock, M. P., & McCullough, J. (2007). The pipeline: Benefits of undergraduate research experiences. Science, 316(5824), 548–549. https://doi.org/10.1126/science.1140384
  • Shadle, S. E., Marker, A., & Earl, B. (2017). Faculty drivers and barriers: Laying the groundwork for undergraduate STEM education reform in academic departments. International Journal of STEM Education, 4(1), 8. https://doi.org/10.1186/s40594-017-0062-7
  • Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27(3), 379–423. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
  • Theobald, R., & Freeman, S. (2014). Is it the intervention or the students? Using linear regression to control for student characteristics in undergraduate STEM education research. CBE—Life Sciences Education, 13(1), 41–48. https://doi.org/10.1187/cbe-13-07-0136
  • Ul-Huda, S., Ali, T., Cassum, S., Nanji, K., & Yousafzai, J. (2018). Faculty perception about active learning strategies: A cross sectional survey. Journal of Liaquat University of Medical & Health Sciences, 17(2), 96–100. https://doi.org/10.22442/jlumhs.18172055817
  • Van Lanen, R. J., McGannon, T., & Lockie, N. M. (2000). Predictors of nursing students' performance in a one-semester organic and biochemistry course. Journal of Chemical Education, 77(6), 767. https://doi.org/10.1021/ed077p767
  • Wolkow, T. D., Durrenberger, L. T., Maynard, M. A., Harrall, K. K., & Hines, L. M. (2014). A comprehensive faculty, staff, and student training program enhances student perceptions of a course-based research experience at a two-year institution. CBE—Life Sciences Education, 13(4), 724–737. https://doi.org/10.1187/cbe.14-03-0056
  • Wolkow, T. D., Jenkins, J., Durrenberger, L., Swanson-Hoyle, K., & Hines, L. M. (2019). One early course-based undergraduate research experience produces sustainable knowledge gains, but only transient perception gains. Journal of Microbiology & Biology Education, 20(2), 10. https://doi.org/10.1128/jmbe.v20i2.1679


Submitted: 23 May 2022 Revised: 9 January 2024 Accepted: 22 July 2024

© 2024 N. T. Appel et al. CBE—Life Sciences Education © 2024 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

Monash University Logo

  • Help & FAQ

Digital literacies in the classroom: Authentic opportunities for student engagement

Research output: Chapter in Book/Report/Conference proceeding › Chapter (Book) › Other › peer-review

In efforts to improve students' digital literacies on a STEM-focused campus, one university created a digital literacies initiative to support both faculty and students. Faculty development programming supported the development of assignment parameters, detailed assessment rubrics, and scaffolding activities. A campus tutoring center was piloted to support students' acquisition of digital literacies. This chapter offers examples from three faculty members who participated in the digital literacies initiative and implemented digital literacy assignments in their courses. The researchers offer best practices for campuses interested in developing digital literacy initiatives.

Original language: English
Title of host publication: Handbook of Research on Fostering Student Engagement With Instructional Technology in Higher Education
Editors: Emtinan Alqurashi
Publisher: IGI Global
Chapter: 7
Pages: 116–138
Number of pages: 23
ISBN (electronic): 9781799801214, 9781799801191, 1799801195
ISBN (print): 9781799809401
Publication status: Published (2020)
Externally published: Yes

Access to document: https://doi.org/10.4018/978-1-7998-0119-1.ch007

Authors: Lori Ann Mumpower, Cassandra Branham, Aaron D. Clevenger, Emily Faulconer, Alex Watkins
Publisher copyright: © 2020, IGI Global.
Scopus record: http://www.scopus.com/inward/record.url?scp=85077938854&partnerID=8YFLogxK


Study finds program boosts cognitive engagement of students with language and attention difficulties

A QUT-led study has found that high school students with disabilities impacting language and information processing were better able to comprehend content when teachers adopted evidence-based strategies to increase the accessibility of classroom teaching.


The study, funded by the Australian Research Council Linkage Projects scheme and published in Learning Environments Research, is led by QUT Centre for Inclusive Education researchers Ms Haley Tancredi, Professor Linda J. Graham and Dr Callula Killingly, and Professor Naomi Sweller of Macquarie University.

The study involved 56 Year 10 students and found a statistically significant increase in cognitive engagement when their teachers participated in the Accessible Pedagogies program.

Doctoral student Haley Tancredi, who was first author of the study, said the two most common disabilities impacting language and information processing were Developmental Language Disorder (DLD) and Attention Deficit Hyperactivity Disorder (ADHD), which together account for about 14 per cent of school students – or four students in every classroom of 30.

Ms Tancredi said students with DLD and ADHD often “hide in plain sight” in the classroom and underachieve relative to their potential. However, many of the barriers these students face could be proactively removed using accessible teaching practices.

Ms Tancredi and Professor Graham, Director of the QUT Centre for Inclusive Education, have spent the last seven years developing a program of learning to help teachers reach these students, whether they have been identified or not.

“In prior research, we investigated the impact of the Accessible Pedagogies program on the accessibility of teachers’ practice,” Professor Graham said.

Professor Graham said the findings were extremely positive, with most teachers demonstrating improvements across a range of accessible teaching practices targeted in the program.

Ms Tancredi said this latest research “aimed to determine whether improved accessibility was noticed by students and whether it made any difference to their engagement and classroom learning experience”.

“Students participating in this study were interviewed before and after their teachers underwent the program,” Ms Tancredi said.

Professor Graham said that when asked what their teacher did to help them pay attention and to understand, students whose teachers participated in Accessible Pedagogies explicitly described increased use of practices emphasised in the program.

“Importantly, students did not know whether their teachers were participating in the program, nor did they know its content,” Professor Graham said.

Professor Sweller said students also completed questionnaires assessing their classroom engagement before and after their teachers participated in the Accessible Pedagogies program.

“Students whose teachers participated in the program reported a significant increase in cognitive engagement, which is a measure of the level of investment that students put into comprehending complicated ideas and mastering content. There was no such increase for students whose teachers did not participate,” Professor Sweller said.
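The pre/post design described here can be illustrated with a paired comparison: each student's questionnaire score before the program is matched with that same student's score afterwards, and a paired t-test asks whether the mean within-student change differs from zero. The sketch below is purely illustrative; the scores, the 1–5 scale, and the group size are invented and are not data from the QUT study.

```python
# Illustrative paired pre/post comparison of engagement questionnaire scores.
# All numbers are invented for illustration; they are NOT study data.
from math import sqrt
from statistics import mean, stdev

# Hypothetical cognitive-engagement scores (1-5 scale) for ten students,
# measured before and after their teacher completed the program.
pre  = [3.1, 2.8, 3.4, 2.9, 3.0, 3.3, 2.7, 3.2, 3.0, 2.9]
post = [3.6, 3.1, 3.8, 3.4, 3.2, 3.7, 3.0, 3.5, 3.4, 3.3]

# Paired design: analyse the within-student change, not the two group means.
diffs = [b - a for a, b in zip(pre, post)]
t_stat = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

print(f"mean change = {mean(diffs):.2f}, paired t = {t_stat:.2f}")
```

The control comparison reported in the study corresponds to running the same test on students whose teachers did not participate, where no significant change was found.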

Ms Tancredi said that the team’s findings “suggest that teachers’ use of Accessible Pedagogies may help students redirect their mental effort towards learning, rather than expending that effort to overcome unnecessary instructional barriers”.

“The practices in Accessible Pedagogies are essential for students with DLD and ADHD, but help all students to pay attention, understand, and learn. Future research will investigate the use of Accessible Pedagogies in primary school classrooms and in schools serving disadvantaged communities,” Ms Tancredi said.

Read the full article, “Investigating the impact of Accessible Pedagogies on the experiences and engagement of students with language and/or attentional difficulties”, published online in Learning Environments Research.

Main image (left to right): Haley Tancredi, Professor Linda Graham and Dr Callula Killingly.

Media contact:

Rod Chester, 07 3138 9449, [email protected]

After hours: 0407 585 901, [email protected]



A systematic review of the impact of emerging technologies on student learning, engagement, and employability in built environment education (published in Buildings, MDPI)


1. Introduction
2. Technology in Education
  2.1. Enhancing Teaching Methods and Learning Experience
  2.2. Addressing Industry Demands and Real-World Applications
  2.3. Improving Employability and Soft Skills Development
  2.4. Research Gaps
3. Research Method
  3.1. The Review Process
  3.2. Database and Keywords
  3.3. Study Selection Process with Inclusion and Exclusion Criteria
  3.4. Data Analysis
4. Review Results
  4.1. Descriptive Analysis
  4.2. Thematic Analysis
    4.2.1. Commonly Used Technologies in BE Education
    4.2.2. Enhancing Student Engagement through Technology in BE Education
    4.2.3. Improving Learning Outcomes with Technology in BE Education
    4.2.4. Enhancing Employability Skills through Technology in BE Education
    4.2.5. Challenges in Implementing Technologies in BE Education
5. Conclusions
6. Future Research
7. Theoretical and Practical Implications
Author Contributions · Data Availability Statement · Conflicts of Interest · Nomenclature

AEC: architecture, engineering, and construction
AI: artificial intelligence
AR: Augmented Reality
BIM: Building Information Modelling
BE: Built Environment
CATs: computer-aided technologies
DT: Digital Twin
EV: Enhanced Virtuality
GBL: gamification-based learning
ICTs: information and communication technologies
IoT: Internet of Things
IVR: Interactive Voice Response
MR: Mixed Reality
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-analyses
SLR: systematic literature review
SV: Smart Vision
VR: Virtual Reality
XR: Extended Reality
  • Pugacheva, N.; Kirillova, T.; Kirillova, O.; Luchinina, A.; Korolyuk, I.; Lunev, A. Digital Paradigm in Educational Management: The Case of Construction Education Based on Emerging Technologies. Int. J. Emerg. Technol. Learn. 2020 , 15 , 96–115. [ Google Scholar ] [ CrossRef ]
  • Landorf, C.; Ward, S. The Learning Impact of a 4-dimensional Digital Construction Learning Environment. Int. J. Educ. Pedagog. Sci. 2017 , 11 , 1266–1271. [ Google Scholar ]
  • Pedro, A.; Le, Q.T.; Park, C.S. Framework for integrating safety into construction methods education through interactive virtual reality. J. Prof. Issues Eng. Educ. Pract. 2016 , 142 , 04015011. [ Google Scholar ] [ CrossRef ]
  • Tan, Y.; Xu, W.; Li, S.; Chen, K. Augmented and Virtual Reality (AR/VR) for Education and Training in the AEC Industry: A Systematic Review of Research and Applications. Buildings 2022 , 12 , 1529. [ Google Scholar ] [ CrossRef ]
  • Fauzi, A.F.A.A.; Ali, K.N.; Amirudin, R. Evaluating Students Readiness, Expectancy, Acceptance and Effectiveness of Augmented Reality Based Construction Technology Education. Int. J. Built Environ. Sustain. 2019 , 6 , 7–13. [ Google Scholar ] [ CrossRef ]
  • Kassem, M.; Benomran, L.; Teizer, J. Virtual Environments for Safety Learning in Construction and Engineering: Seeking Evidence and Identifying Gaps for Future Research. Vis. Eng. 2017 , 5 , 16. [ Google Scholar ] [ CrossRef ]
  • Park, C.S.; Le, Q.T.; Pedro, A.; Lim, C.R. Interactive Building Anatomy Modeling for Experiential Building Construction Education. J. Prof. Issues Eng. Educ. Pract. 2016 , 142 , 04015019. [ Google Scholar ] [ CrossRef ]
  • Lasheen, R.; Khodeir, L.; Nessim, A. Identifying the Gap Between Practical and Educational Fields in the Egyptian AEC Industry in the Age of Digitalization. Eng. Res. J. 2022 , 176 , 281–303. [ Google Scholar ] [ CrossRef ]
  • Mohamed, N.A.G.; Sadek, M.R. Artificial Intelligence as a Pedagogical Tool for Architectural Education: What Does the Empirical Evidence Tell Us? MSA Eng. J. 2023 , 2 , 133–148. [ Google Scholar ]
  • Shanbari, H.A.; Blinn, N.M.; Issa, R.R. Laser Scanning Technology and BIM in Construction Management Education. J. Inf. Technol. Constr. 2016 , 21 , 204–217. [ Google Scholar ]
  • Urban, H.; Pelikan, G.; Schranz, C. Augmented Reality in AEC Education: A Case Study. Buildings 2022 , 12 , 391. [ Google Scholar ] [ CrossRef ]
  • Jacobsson, M.; Linderoth, H.C. Newly Graduated Students’ Role as Ambassadors for Digitalisation in Construction Firms. Constr. Manag. Econ. 2021 , 39 , 759–772. [ Google Scholar ] [ CrossRef ]
  • Kandi, V.R.; Castronovo, F.; Brittle, P.; Mastrolembo Ventura, S.; Nikolic, D. Assessing the Impact of a Construction Virtual Reality Game on Design Review Skills of Construction Students. J. Archit. Eng. 2020 , 26 , 04020035. [ Google Scholar ] [ CrossRef ]
  • Vassigh, S.; Davis, D.; Behzadan, A.H.; Mostafavi, A.; Rashid, K.; Alhaffar, H.; Elias, A.; Gallardo, G. Teaching Building Sciences in Immersive Environments: A Prototype Design, Implementation, and Assessment. Int. J. Constr. Educ. Res. 2020 , 16 , 180–196. [ Google Scholar ] [ CrossRef ]
  • Shojaei, A.; Rokooei, S.; Mahdavian, A.; Carson, L.; Ford, G. Using immersive video technology for construction management content delivery: Pilot study. J. Inf. Technol. Constr. 2021 , 26 , 886–901. [ Google Scholar ] [ CrossRef ]
  • Baduge, S.K.; Thilakarathna, S.; Perera, J.S.; Arashpour, M.; Sharafi, P.; Teodosio, B.; Shringi, A.; Mendis, P. Artificial intelligence and smart vision for building and construction 4.0: Machine and deep learning methods and applications. Autom. Constr. 2022 , 141 , 104440. [ Google Scholar ] [ CrossRef ]
  • Zamora-Polo, F.; Luque Sendra, A.; Aguayo-Gonzalez, F.; Sanchez-Martin, J. Conceptual Framework for the Use of Building Information Modeling in Engineering Education. Int. J. Eng. Educ. 2019 , 35 , 744–755. [ Google Scholar ]
  • Kraus, M.; Rust, R.; Rietschel, M.; Hall, D. Improved Perception of AEC Construction Details via Immersive Teaching in Virtual Reality. arXiv 2022 , arXiv:2209.10617. [ Google Scholar ]


| Theme | Code | Articles | Frequency |
| --- | --- | --- | --- |
| Technology and Student Engagement in BE Education | Improved students’ understanding, engagement, interests, and comprehension | [ , , , , , , , ] | 9 |
| | Increased students’ motivation | [ , , , , , ] | 6 |
| | Better engagement in the design process | [ , , , , ] | 5 |
| | Providing real-time experiences in safe settings | [ , , , ] | 4 |
| | Interaction with virtual architectural details and understanding of spatial linkages | [ , , , ] | 4 |
| | Facilitation of active learning | [ , , ] | 3 |
| | Improved critical thinking | [ , , ] | 3 |
| | Improved collaborative learning and teamwork | [ , , ] | 3 |
| | Improved engagement with equipment | [ , , ] | 3 |
| | Providing interesting and realistic learning settings | [ , ] | 2 |
| | Improved comprehension and practical abilities | [ ] | 1 |
| | Dynamic interaction with information | [ ] | 1 |
| Technology and Learning Outcomes in BE Education | Improved immersive and interactive learning experiences | [ , , , , , , , , ] | 9 |
| | Increased knowledge and skills | [ , , , , , , , ] | 8 |
| | Improved learning experiences and environment | [ , , , , , , ] | 7 |
| | Enhanced learning outcomes | [ , , , , ] | 5 |
| | Improved visualization and understanding of construction processes and complex concepts | [ , , , ] | 5 |
| | Increased safety training and education | [ , , , , ] | 5 |
| | Enhanced students’ comprehension of structural elements | [ , , , , ] | 5 |
| | Facilitation of construction methodologies | [ , , , , ] | 5 |
| | Improved hazard identification | [ , , ] | 3 |
| | Improved students’ academic performance and decision-making | [ , , ] | 3 |
| | Self-directed learning resources and problem-based learning | [ , ] | 2 |
| | Improved understanding of subjects, grades, and educational experiences | [ , ] | 2 |
| | Improved both hard and soft skills | [ , ] | 2 |
| | Ability to carry out a virtual exploration of construction sites | [ , ] | 2 |
| | Improved spatial and graphical skills | [ , ] | 2 |
| | Comprehension of challenging assembly processes | [ , ] | 2 |
| | Integrating in-class demonstration | [ , ] | 2 |
| | Ability to test ideas and receive immediate feedback | [ ] | 1 |
| Technology and Employability in BE Education | Equipping students with necessary knowledge and competencies, making them more competitive in the job market by expanding their knowledge of cutting-edge technologies | [ , , , , , ] | 6 |
| Challenges in Implementing Technologies in BE Education | Restricted access to resources, high costs, need for training, and requirement for a foundational understanding of usage | [ , , , , , , , , , , ] | 11 |
| | Complexity of implementation | [ , , ] | 3 |
| | Poor integration with other design methodologies | [ , , ] | 3 |
| | Faculty reluctance | [ , ] | 2 |
| | Motion sickness | [ ] | 1 |
| Emerging Technology in BE Education | Articles | Frequency |
| --- | --- | --- |
| Virtual Reality (VR) | [ , , , , , , , , , , , , , , , , , , , , , , ] | 23 |
| Augmented Reality (AR) | [ , , , , , , , , , , , , ] | 13 |
| Building Information Modeling (BIM) | [ , , , , , , , ] | 8 |
| Gamification | [ , , , , , , ] | 7 |
| Extended Reality (XR) | [ , , , , ] | 6 |
| Mixed Reality (MR) | [ , , ] | 4 |
| 3D scanning | [ , , ] | 3 |
| Drones | [ , , ] | 3 |
| Interactive Voice Response (IVR) | [ , ] | 2 |
| Computer-aided technologies (CATs) | [ ] | 1 |
| Enhanced virtuality (EV) | [ ] | 1 |
| Laser scanning | [ ] | 1 |

Share and Cite

Ghanbaripour, A.N.; Talebian, N.; Miller, D.; Tumpa, R.J.; Zhang, W.; Golmoradi, M.; Skitmore, M. A Systematic Review of the Impact of Emerging Technologies on Student Learning, Engagement, and Employability in Built Environment Education. Buildings 2024 , 14 , 2769. https://doi.org/10.3390/buildings14092769



Schools struggling to find solutions to chronic absenteeism problem

by CORY SMITH | The National Desk

FILE - A 6th-grade student walks down a stairwell. (Photo by Sean Gallup/Getty Images)

(TND) — Schools are having limited success in reducing chronic absenteeism, according to a new RAND report.

And more students are missing school than they were before the pandemic.

School leaders seem to believe that the pandemic shifted perspectives to the detriment of students.

“Yeah, it seems like parents are viewing school as more optional,” Heather Schwartz, an education policy expert and researcher at RAND, said Tuesday of the feedback she’s heard from school district leaders.

RAND’s report, released last week and based on a survey and interviews conducted late last school year, determined that 19% of students were chronically absent.

That’s down from the peak of 28% during the 2021-22 school year but remains elevated compared with the 13% to 15% rates recorded from 2016 to 2019.

And about 10% of districts had 30% or more of their students chronically absent last school year.

Chronic absenteeism is defined as a student missing at least 10% of their school days. A typical school year is 180 days, so a chronically absent student would be out of class for at least 18 days.
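That 10% threshold reduces to a simple arithmetic check. A minimal sketch (the function name and default values are illustrative assumptions, not drawn from the RAND report):

```python
def is_chronically_absent(days_missed, school_year_days=180, threshold=0.10):
    """Flag a student as chronically absent once missed days reach
    10% of the school year (illustrative defaults: 180-day year)."""
    return days_missed >= threshold * school_year_days

# In a typical 180-day year, the cutoff is 18 missed days.
print(is_chronically_absent(18))  # True
print(is_chronically_absent(17))  # False
```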

The American Federation of Teachers says chronic absenteeism predicts both low academic success and potentially which students may eventually drop out of school.

RAND found a quarter of districts didn’t find their approaches to reducing chronic absenteeism particularly effective.

Common approaches have been the adoption of an early warning system to flag students who are at risk of being chronically absent, home visits, having teachers call students’ homes when they miss school, and hiring staff focused on reducing absenteeism.

“It was disheartening to see the lack of any clear approach, any clear winner among the approaches,” Schwartz said.

Why are more students missing school?

There were common themes RAND heard from district leaders.

Some families, it seems, might be more apathetic towards school since the pandemic.

Some parents seem to have more zealous concerns about sending their children to school when they have minor illnesses, like colds.

There seems to be a growing sense among parents that doing class assignments online is a “good enough” replacement for attending school in person.

And then there’s an increased desire to ease anxiety or mental health concerns children might have about going to school.

“I would say those are the four buckets we hear,” Schwartz said.

Those sentiments seem to be more prevalent now than they were before the pandemic.

“To be fair, we didn't ask them (to) look in the crystal ball and be like, ‘Do you think this is forever?’ So, I don't know if they see it as like a receding tide that's still hanging around, or if it's ... the new normal,” Schwartz said of their survey of school leaders.

How can we fix this?

Schwartz said it’s clear there’s no one-size-fits-all solution to chronic absenteeism.

The same approach in two different places is going to yield different results, she said.

“The culture of a community, the size of the district, the types of students they serve, the needs of the family, the availability of transportation, on and on, are all highly dependent on the local context,” Schwartz said.

Beyond that, she said schools need to tailor their messages to the needs and challenges of different families.

A student who misses a lot of class time in one concentrated chunk is different from a student who misses a lot of days sprinkled throughout the school year.

A student who is missing time and struggling to keep up is different from a student who is missing time but managing to stay on track.

Schwartz said schools need to invest in testing different messages for different student populations.

“The problem is important enough that it's worth investing to find a solution,” she said. “Solutions, plural, because there isn't going to be just any one solution.”

And any approach needs to include “carrots, not just sticks,” she said.

Schools can’t just send truancy officers to homes and solve chronic absenteeism, she said.

“To fundamentally fix this issue, I don't think you can merely hang a bunch of threats over families. ... Make school compelling enough that kids actually feel that they in some sense want to come,” Schwartz said.

And relationship-building is key to making students want to go to school, she said.

“There is an adult who knows you, who notices when you're not there, cares about you, and who wants to see you there at school,” Schwartz said.

Case Western Reserve University

Become a Civic Engagement Scholar

Now in its 15th year, the Civic Engagement Scholars program, hosted by the Center for Civic Engagement and Learning (CCEL), promotes and recognizes meaningful student involvement in the community.

Scholars choose to complete a track of 25 or 45 civic engagement activity hours over the course of the academic year and participate in community-focused educational events. Upon successful completion, Scholars are recognized for their commitment to the community. 

The program is open to both CWRU undergraduate and graduate students of all years and majors. The Scholars enrollment form is due by 11:59 p.m. ET on Monday, Sept. 16. 

Interested students can visit the Civic Engagement Scholars website for more information or contact Erin Corwin with any questions.

