Artificial Intelligence in Education (AIEd): a high-level academic and industry note 2021

  • Original Research
  • Open access
  • Published: 07 July 2021
  • Volume 2, pages 157–165 (2022)

  • Muhammad Ali Chaudhry   ORCID: orcid.org/0000-0003-0154-2613 1 &
  • Emre Kazim 2  

In the past few decades, technology has completely transformed the world around us. Indeed, experts believe that the next big digital transformation in how we live, communicate, work, trade and learn will be driven by Artificial Intelligence (AI) [ 83 ]. This paper presents a high-level industrial and academic overview of AI in Education (AIEd). It presents the focus of the latest research in AIEd: reducing teachers’ workload, contextualized learning for students, revolutionizing assessments, and developments in intelligent tutoring systems. It also discusses the ethical dimension of AIEd and the potential impact of the Covid-19 pandemic on the future of AIEd’s research and practice. The intended readership of this article is policy makers and institutional leaders who are looking for an introductory state of play in AIEd.

1 Introduction

Artificial Intelligence (AI) is changing the world around us [ 42 ]. As a term it is difficult to define, even for experts, because of its interdisciplinary nature and evolving capabilities. In the context of this paper, we define AI as a computer system that can achieve a particular task through certain capabilities (like speech or vision) and intelligent behaviour that was once considered unique to humans [ 54 ]. In lay terms, we use AI to refer to intelligent systems that can automate tasks traditionally carried out by humans. Indeed, we read AI as a continuation of the digital age, with increasing digital transformation changing the ways in which we live in the world. With such change, the skills and know-how of people must reflect the new reality; in this context, the World Economic Forum identified sixteen skills, referred to as twenty-first century skills, necessary for the future workforce [ 79 ]. These include skills such as technology literacy, communication, leadership, curiosity and adaptability. Such skills have always been important for a successful career; however, with the accelerated digital transformation of the past two years and the focus on continuous learning in most professional careers, they are becoming necessary for all learners.

AI will play a very important role in how we teach and learn these new skills. In one dimension, ‘AIEd’ has the potential to automate much of the tracking of a learner’s progress in these skills and to identify where a human teacher’s assistance is most needed. For teachers, AIEd can potentially help identify the most effective teaching methods based on students’ contexts and learning backgrounds. It can automate monotonous operational tasks, generate assessments, and automate grading and feedback. AI does not only influence what students learn through recommendations, but also how they learn, what their learning gaps are, which pedagogies are more effective and how to retain a learner’s attention. In these cases, teachers are the ‘human-in-the-loop’: the role of AI is only to enable more informed decision making by teachers, for example by providing predictions about students’ performance or by recommending relevant content to students subject to teachers’ approval. The final decision makers remain the teachers.
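To make this ‘human-in-the-loop’ pattern concrete, the sketch below (ours, not taken from any cited system; every name in it is illustrative) shows an AI model proposing content recommendations that only take effect once a teacher approves them.

```python
# A minimal sketch of the 'human-in-the-loop' pattern described above:
# the model proposes, the teacher disposes. All names are illustrative.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Recommendation:
    student_id: str
    resource: str             # e.g. a worksheet or video the model suggests
    predicted_benefit: float  # model's confidence that the resource will help


def recommend(student_id: str,
              model: Callable[[str], List[Recommendation]],
              teacher_review: Callable[[Recommendation], bool]) -> List[Recommendation]:
    """Return only the recommendations the teacher has explicitly approved."""
    proposals = model(student_id)                        # AI proposes content
    return [r for r in proposals if teacher_review(r)]   # teacher has the final say


if __name__ == "__main__":
    # Stub functions standing in for a real model and a real teacher-facing UI.
    def toy_model(student_id: str) -> List[Recommendation]:
        return [Recommendation(student_id, "fractions_worksheet_3", 0.82),
                Recommendation(student_id, "decimals_video_1", 0.41)]

    def teacher_ui(rec: Recommendation) -> bool:
        # In practice this would be a dashboard; here we auto-approve confident proposals.
        return rec.predicted_benefit > 0.5

    print(recommend("student_42", toy_model, teacher_ui))
```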

Segal et al. [ 58 ] developed a system named SAGLET that utilized a ‘human-in-the-loop’ approach to visualize and model students’ activities for teachers in real time, enabling them to intervene more effectively as and when needed. Here the role of AI is to empower teachers so that they can enhance students’ learning outcomes. Similarly, Rodriguez et al. [ 52 ] have shown how teachers as the ‘human-in-the-loop’ can customize multimodal learning analytics and make them more effective in blended learning environments.

Critically, all these achievements depend entirely on the quality of available learner data, which has been a long-standing challenge for ed-tech companies, at least until the pandemic. Use of technology in educational institutions around the globe is increasing [ 77 ]; however, educational technology (ed-tech) companies building AI-powered products have long complained about the lack of relevant data for training algorithms. The advent and spread of Covid-19 around the world pushed educational institutions online and left them reliant on ed-tech products to organize content, manage operations and communicate with students. This shift has started generating huge amounts of data on which ed-tech companies can build AI systems. According to ‘Shock to the System’, a joint report published by Educate Ventures and Cambridge University, ed-tech companies’ optimism about their own future increased during the pandemic, and their most pressing concern became recruiting more customers than they could serve effectively [ 15 ].

Additionally, most of the products and solutions provided by ed-tech start-ups lack the quality and resilience to cope with intensive use by several thousand users. Product maturity has not caught up with this huge and intense demand, as discussed in Sect. “Latest research” below. We also discuss some of these products in detail in Sect. “Industry’s focus” below. How do we mitigate the risks of these AI-powered products, and who monitors those risks? (We return to this theme in our discussion of ethics, Sect. “Ethical AIEd”.)

This paper is a non-exhaustive overview of AI in Education, presenting a brief survey of the latest developments in the field. It begins by discussing the different aspects of education and learning where AI is being utilized, then turns to the industry’s current focus, and closes with a note on ethical concerns regarding AI in Education. The paper also briefly evaluates the potential impact of the pandemic on AI’s application in education. The intended readership is the policy community and institutional executives seeking an instructive, rapid introduction to the state of play in AIEd.

2 Latest research

Most work within AIEd can be divided into four main subdomains. In this section, we survey some of the latest work in each of these domains as case studies:

Reducing teachers’ workload: the purpose of AI in Education is to reduce teachers’ workload without impacting learning outcomes.

Contextualized learning for students: as every learner has unique learning needs, the purpose of AI in Education is to provide customized and/or personalised learning experiences to students based on their contexts and learning backgrounds.

Revolutionizing assessments: the purpose of AI in Education is to enhance our understanding of learners. This not only includes what they know, but also how they learn and which pedagogies work for them.

Intelligent tutoring systems (ITS): the purpose of AI in Education is to provide intelligent learning environments that can interact with students, provide customized feedback and enhance their understanding of certain topics.

2.1 Reducing teachers’ workload

Recent research in AIEd focuses more on teachers than on other stakeholders of educational institutions, and for good reason. Teachers are at the epicenter of every learning environment, face to face or virtual. Participatory design methodologies ensure that teachers are an integral part of the design of new AIEd tools, along with parents and learners [ 45 ]. Reducing teachers’ workload has been a long-standing challenge for educationists, who hope to achieve more effective teaching in classrooms by empowering teachers and letting them focus more on teaching than on the surrounding activities.

With the focus on online education during the pandemic and the emergence of new tools to facilitate online learning, there is a growing need for teachers to adapt to these changes. Importantly, teachers themselves are having to re-skill and up-skill, i.e. to develop the new skills needed to fully utilize the benefits of AIEd [ 39 ]. First, they need to become tech savvy enough to understand, evaluate and adopt new ed-tech tools as they become available. They may not necessarily use these tools, but it is important to understand what they offer and whether they ease teachers’ workload. For example, Zoom video calling has been widely used during the pandemic to deliver lessons remotely. Teachers need to know not only how to schedule lessons on Zoom, but also how to use functionalities like breakout rooms for group work and the whiteboard for free-style writing. Second, teachers will need to develop analytical skills to interpret the data visualized by these ed-tech tools and to identify what kind of data and analytics they need to build a better understanding of learners. This will enable teachers to ask ed-tech companies for exactly what they need and so ease their workload. Third, teachers will need to develop new teamwork, group and management skills to accommodate new tools in their daily routines, as they will be responsible for managing these new resources efficiently.

Selwood and Pilkington [ 61 ] showed that the use of Information and Communication Technologies (ICT) leads to a reduction in teachers’ workload if teachers use it frequently, receive proper training in how to use ICT and have access to ICT at home and at school. During the pandemic, teachers were left with no option other than online teaching. Van der Spoel et al. [ 76 ] have shown that previous experience with ICT did not play a significant role in how teachers dealt with the online transition during the pandemic, suggesting that the new technologies are not a burden for them. It is too early to draw conclusions on the long-term effects of the pandemic on education, online learning and teachers’ workload; the use of ICT during the pandemic may not so much reduce teacher workload as change its dynamics.

2.2 Contextualized learning for students

Every learner has a unique learning context based on their prior knowledge of the topic, social background, economic well-being and emotional state [ 41 ]. Teaching is most effective when tailored to these changing contexts. AIEd can help identify the learning gaps of each learner, offer content recommendations based on them and provide step-by-step solutions to complex problems. For example, iTalk2Learn is an open-source platform that was developed by researchers to support math learning among students between 5 and 11 years of age [ 22 ]. This tutor interacted with students through speech, identified when students were struggling with fractions and intervened accordingly. Similarly, Pearson has launched a calculus learning tool called Aida that provides step-by-step guidance to students and helps them complete calculus tasks. The use of such tools by young students also raises interesting questions about the illusion of empathy that learners may develop towards such educational bots [ 73 ].

Open Learner Models [ 12 , 18 ] have been widely used in AIEd to facilitate learners, teachers and parents in understanding what learners know, how they learn and how AI is being used to enhance learning. Another important construct in understanding learners is self-regulated learning [ 10 , 68 ]. Zimmerman and Schunk [ 85 ] define self-regulated learning as learner’s thoughts, feelings and actions towards achieving a certain goal. Better understanding of learners through open learner models and self-regulated learning is the first step towards contextualized learning in AIEd. Currently, we do not have completely autonomous digital tutors like Amazon’s Alexa or Apple’s Siri for education but domain specific Intelligent Tutoring Systems (ITS) are also very helpful in identifying how much students know, where they need help and what type of pedagogies would work for them.
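As an illustration of what an open learner model can look like in practice, the following sketch keeps a per-skill mastery estimate that can be shown to learners, teachers and parents. The exponential-moving-average update is an assumption chosen for simplicity, not the method used by the systems cited above.

```python
# Illustrative sketch only: one very simple way to represent an open learner model,
# i.e. a per-skill mastery estimate that can be inspected by learner, teacher and parent.
# The update rule (exponential moving average) is chosen for clarity, not fidelity
# to any particular cited system.

from dataclasses import dataclass, field
from typing import Dict


@dataclass
class OpenLearnerModel:
    learner_id: str
    mastery: Dict[str, float] = field(default_factory=dict)  # skill -> estimate in [0, 1]
    smoothing: float = 0.3  # weight given to the newest observation

    def record_attempt(self, skill: str, correct: bool) -> None:
        """Update the mastery estimate for a skill from one observed attempt."""
        prior = self.mastery.get(skill, 0.5)
        observation = 1.0 if correct else 0.0
        self.mastery[skill] = (1 - self.smoothing) * prior + self.smoothing * observation

    def report(self) -> Dict[str, float]:
        """The 'open' part: expose the model in a form humans can read and question."""
        return {skill: round(estimate, 2) for skill, estimate in self.mastery.items()}


olm = OpenLearnerModel("learner_7")
olm.record_attempt("fractions", correct=True)
olm.record_attempt("fractions", correct=False)
print(olm.report())   # {'fractions': 0.46}
```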

There are a number of ed-tech tools available to develop basic literacy and numeracy skills in learners, such as double-digit division or English grammar. In future, AIEd-powered tools will move beyond these basics to develop twenty-first century skills like curiosity [ 49 ], initiative and creativity [ 51 ], collaboration and adaptability [ 36 ].

2.3 Revolutionizing assessments

Assessment in an educational context refers to ‘any appraisal (or judgement or evaluation) of a student’s work or performance’ [ 56 ]. Hill and Barber [ 27 ] identify assessment as one of the three pillars of schooling, along with curriculum and learning and teaching. The purpose of modern assessment is to evaluate what students know, understand and can do. Ideally, assessments should take account of the full range of student abilities and provide useful information about learning outcomes. However, every learner is unique, and so are their learning paths; how a standardized assessment can fairly evaluate every student, each with distinct capabilities, passions and expertise, is a question that challenges broader notions of educational assessment. According to Luckin [ 37 ] from University College London, ‘AI would provide a fairer, richer assessment system that would evaluate students across a longer period of time and from an evidence-based, value-added perspective’.

AIAssess is an example of an intelligent assessment tool, developed by researchers at the UCL Knowledge Lab [ 38 , 43 ]. It assessed students learning math and science on the basis of three models: a knowledge model, an analytics model and a student model. The knowledge component stored the knowledge about each topic, the analytics component analyzed students’ interactions and the student model tracked students’ progress on a particular topic. Similarly, Samarakou et al. [ 57 ] have developed an AI assessment tool that also performs qualitative evaluation of students, reducing the workload of instructors who would otherwise spend hours evaluating every exercise. Such tools can be further empowered by machine learning techniques such as semantic analysis, voice recognition, natural language processing and reinforcement learning to improve the quality of assessments.
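The sketch below illustrates, in simplified form, how the three components described above (knowledge, analytics and student models) might fit together. It is not AIAssess itself; every class, field and rule in it is an illustrative stand-in.

```python
# Hedged sketch of a three-component assessment structure (knowledge, analytics, student).
# This is not AIAssess; the classes and logic are illustrative stand-ins.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class KnowledgeModel:
    """Stores what there is to know: topics and the exercises that probe them."""
    exercises_by_topic: Dict[str, List[str]]


@dataclass
class StudentModel:
    """Tracks each student's progress per topic as a fraction of correct attempts."""
    attempts: Dict[str, List[bool]] = field(default_factory=dict)

    def progress(self, topic: str) -> float:
        history = self.attempts.get(topic, [])
        return sum(history) / len(history) if history else 0.0


class AnalyticsModel:
    """Turns raw interactions into updates of the student model and summary reports."""
    def __init__(self, knowledge: KnowledgeModel, student: StudentModel):
        self.knowledge = knowledge
        self.student = student

    def log_interaction(self, topic: str, correct: bool) -> None:
        self.student.attempts.setdefault(topic, []).append(correct)

    def summary(self) -> Dict[str, float]:
        return {t: self.student.progress(t) for t in self.knowledge.exercises_by_topic}


knowledge = KnowledgeModel({"fractions": ["ex1", "ex2"], "algebra": ["ex3"]})
analytics = AnalyticsModel(knowledge, StudentModel())
analytics.log_interaction("fractions", True)
analytics.log_interaction("fractions", False)
print(analytics.summary())   # {'fractions': 0.5, 'algebra': 0.0}
```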

2.4 Intelligent tutoring systems (ITS)

An intelligent tutoring system is a computer program that tries to mimic a human teacher in order to provide personalized learning to students [ 46 , 55 ]. The concept of ITS in AIEd is decades old [ 9 ], and there have always been high expectations of what ITS could do to support learning. Over the years, however, a significant contrast has been observed between what ITS were envisioned to deliver and what they have actually been capable of doing [ 4 ].

A unique combination of domain models [ 78 ], pedagogical models [ 44 ] and learner models [ 20 ] was expected to provide contextualized learning experiences to students with customized content, like expert human teachers [ 26 , 59 , 65 ]. Later, further models were introduced to enhance students’ learning experience, such as the strategy model, knowledge-base model and communication model [ 7 ]. It was expected that an intelligent tutoring system would not just teach, but also ensure that students had learned; it would care for students [ 17 ]. Like human teachers, ITS would improve with time: they would learn from their experiences, ‘understand’ what works in which contexts and then help students accordingly [ 8 , 60 ].
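As a rough illustration of how domain, pedagogical and learner models can combine into a single tutoring decision, the following sketch picks the next topic for a learner. The prerequisite graph, mastery threshold and selection rule are assumptions made for illustration only, not drawn from the cited systems.

```python
# A minimal sketch, under strong assumptions, of how domain, pedagogical and learner
# models could be combined into one tutoring step. No names or thresholds below come
# from the cited systems.

from typing import Dict, List, Optional

# Domain model: topics and their prerequisite relationships.
DOMAIN: Dict[str, List[str]] = {
    "counting": [],
    "addition": ["counting"],
    "fractions": ["addition"],
}

# Learner model: estimated mastery per topic (0..1).
learner: Dict[str, float] = {"counting": 0.9, "addition": 0.55, "fractions": 0.1}

MASTERY_THRESHOLD = 0.8  # pedagogical choice: when a topic counts as 'learned'


def next_topic(domain: Dict[str, List[str]], mastery: Dict[str, float]) -> Optional[str]:
    """Pedagogical model (simplified): pick the weakest unmastered topic whose
    prerequisites are already mastered."""
    candidates = [
        t for t, prereqs in domain.items()
        if mastery.get(t, 0.0) < MASTERY_THRESHOLD
        and all(mastery.get(p, 0.0) >= MASTERY_THRESHOLD for p in prereqs)
    ]
    return min(candidates, key=lambda t: mastery.get(t, 0.0)) if candidates else None


print(next_topic(DOMAIN, learner))  # 'addition' (fractions is blocked by its prerequisite)
```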

In recent years, ITS have mostly been subject and topic specific, like ASSISTments [ 25 ], iTalk2Learn [ 23 ] and Aida Calculus. Despite being limited to the domain that a particular intelligent tutoring system addresses, they have proven effective in providing relevant content to students, interacting with students [ 6 ] and improving students’ academic performance [ 18 , 41 ]. However, ITS will not necessarily work in every context or facilitate every teacher [ 7 , 13 , 46 , 48 ]. Utterberg Modén et al. [ 75 ] showed why teachers have abandoned the technology in some instances because it was counterproductive: they conducted a formative intervention with sixteen secondary school mathematics teachers and found systemic contradictions between teachers’ opinions and ITS recommendations, eventually leading to the abandonment of the tool. This highlights the importance of giving teachers the right to refuse AI-powered ed-tech if they are not comfortable with it.

Considering the direct correlation between emotions and learning [ 40 ], ITS have recently also started focusing on the emotional state of students while learning, in order to offer a more contextualized learning experience [ 24 ].

2.5 Popular conferences

To reflect the increasing interest and activity in the space of AIEd, some of the most popular conferences in the field are listed in Table 1 below. Due to the pandemic, all of these conferences will be held virtually in 2021 as well. The first international workshop on multimodal artificial intelligence in education is being organized at the AIEd conference [74] to promote the importance of multimodal data in AIEd.

3 Industry’s focus

In this section, we introduce the industry’s focus in the area of AIEd by case-studying companies at three levels: start-ups, established/large companies and mega-players (e.g. Amazon, Google). These companies represent different levels of the ecosystem in terms of size.

3.1 Start-ups

A number of ed-tech companies are leading the AIEd revolution, and new funds are emerging to invest in ed-tech companies and to help ed-tech start-ups scale their products. There has been an increase in investor interest [ 21 ]: in 2020 the amount of investment raised by ed-tech companies more than doubled compared to 2019 (according to TechCrunch). This shows another dimension of the pandemic’s effect on ed-tech. With more data coming in during the pandemic, it is expected that the industry’s focus on AI-powered products will increase.

EDUCATE, a leading accelerator focused on ed-tech companies, supported by the UCL Institute of Education and the European Regional Development Fund, was formed to put research and evidence at the centre of product development for ed-tech. This accelerator has supported more than 250 ed-tech companies and 400 entrepreneurs and helped them focus on evidence-informed product development for education.

A number of ed-tech companies are emerging in this space with interesting business models. Third Space Learning offers maths intervention programs for primary and secondary school students, aiming to provide low-cost, quality tuition to support pupils from disadvantaged backgrounds in UK state schools. They have already delivered 800,000 hours of teaching to around 70,000 students, 50% of whom were eligible for free school meals. Mobile apps like Kaizen Languages, Duolingo and Babbel have also emerged to help individuals learn other languages.

3.2 Established players

Pearson is one of the leading educational companies in the world, with operations in more than 70 countries and more than 22,000 employees worldwide. It has been making a transition to digital learning and currently generates 66% of its annual revenue from it. According to Pearson, it has built the world’s first AI-powered calculus tutor, called Aida, which is publicly available on the App Store; however, its effectiveness in improving students’ calculus skills without any human intervention remains to be seen.

Byju’s, an India-based ed-tech company known for creating engaging educational content for students, raised investment at a ten-billion-dollar valuation last year [ 70 ]. Century Tech is another ed-tech company empowering learning through AI. They claim to use neuroscience, learning science and AI to personalize learning and identify unique learning pathways for students in 25 countries, making more than sixty thousand AI-powered smart recommendations to learners every day.

Companies like Pearson and Century Tech are building impressive technology that reaches learners across the globe, but the usefulness of their acclaimed AI in helping learners from diverse backgrounds, with unique learning needs and completely different contexts, remains to be proven. As discussed above, teachers play a very important role in how such AI is used by learners. For this, teacher training is vital, so that teachers fully understand the strengths and weaknesses of these products. It is very important to be aware of where these AI products cannot help or can go wrong, so that teachers and learners know when not to rely on them.

In the past few years, the popularity of Massive Open Online Courses (MOOCs) has grown exponentially with the emergence of platforms like Coursera, Udemy, Udacity, LinkedIn Learning and edX [ 5 , 16 , 28 ]. AI can be utilized to develop a better understanding of learner behaviour on MOOCs, produce better content and enhance learning outcomes at scale. Given that these platforms are collecting huge amounts of data, it will be interesting to see the future applications of AI in offering personalized and lifelong learning solutions to their users [ 81 ].

3.3 Mega-players

Seeing the business potential of AIEd and the kind of impact it can have on the future of humanity, some of the biggest tech companies around the globe are moving into this space. The shift to online education during the pandemic boosted the demand for cloud services. Amazon Web Services (AWS), as a leading cloud services provider, helped institutions like the Instituto Colombiano para la Evaluación de la Educación (ICFES) scale their online examination service to 70,000 students. Similarly, LSE utilized AWS to scale its online assessments for 2,000 students [ 1 , 3 ].

Google’s CEO Sundar Pichai stated that the pandemic offered an incredible opportunity to re-imagine education. Google launched more than 50 new software tools during the pandemic to facilitate remote learning. Google Classroom, part of Google Apps for Education (GAFE), is being widely used by schools around the globe to deliver education, and research shows that it improves class dynamics and helps with learner participation [ 2 , 29 , 62 , 63 , 69 ].

Before moving on to the ethical dimensions of AIEd, it is important to close this section by noting an area of critical importance to industry and service providers alike: beyond the three levels of operation discussed above (start-up, established and mega companies), there is the question of developing the AIEd infrastructure itself. As Luckin [41] points out, “True progress will require the development of an AIEd infrastructure. This will not, however, be a single monolithic AIEd system. Instead, it will resemble the marketplace that has been developed for smartphone apps: hundreds and then thousands of individual AIEd components, developed in collaboration with educators, conformed to uniform international data standards, and shared with researchers and developers worldwide. These standards will enable system-level data collation and analysis that help us learn much more about learning itself and how to improve it”.

4 Ethical AIEd

Following a number of mishaps in the real world [ 31 , 80 ], ethics in AI has become a real concern for AI researchers and practitioners alike. Within computer science, there is a growing overlap with the broader field of digital ethics [ 19 ] and with the ethics and engineering work focused on developing trustworthy AI [ 11 ], with particular attention to fairness, accountability, transparency and explainability [ 33 , 82 , 83 , 84 ]. Ethics needs to be embedded in the entire development pipeline, from the decision to start collecting data to the point when the machine learning model is deployed in production. From an engineering perspective, Koshiyama et al. [ 35 ] have identified four verticals of algorithmic auditing: performance and robustness, bias and discrimination, interpretability and explainability, and algorithmic privacy.
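By way of illustration, the snippet below implements one tiny check that could sit under the ‘bias and discrimination’ vertical: a demographic-parity gap on a model’s positive predictions. This is our example, not part of the cited auditing framework; a real algorithmic audit would cover all four verticals and far more metrics than this.

```python
# Illustration only: one small check under the 'bias and discrimination' vertical,
# a demographic-parity gap on a model's positive predictions.

from collections import defaultdict
from typing import Dict, List, Tuple


def selection_rates(predictions: List[Tuple[str, int]]) -> Dict[str, float]:
    """predictions: (group_label, predicted_positive 0/1) per student."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, pred in predictions:
        totals[group] += 1
        positives[group] += pred
    return {g: positives[g] / totals[g] for g in totals}


def demographic_parity_gap(predictions: List[Tuple[str, int]]) -> float:
    """Difference between the highest and lowest positive-prediction rate across groups."""
    rates = selection_rates(predictions)
    return max(rates.values()) - min(rates.values())


# Example: does a 'will pass the exam' model flag both groups at similar rates?
preds = [("group_a", 1), ("group_a", 1), ("group_a", 0),
         ("group_b", 1), ("group_b", 0), ("group_b", 0)]
print(selection_rates(preds))          # group_a ~0.67, group_b ~0.33
print(demographic_parity_gap(preds))   # ~0.33
```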

In education, ethical AI is crucial to ensure the wellbeing of learners, teachers and the other stakeholders involved. There is a great deal of work going on in AIEd and in AI-powered ed-tech tools, and with the influx of large amounts of data from online learning during the pandemic we will most likely see an increasing number of AI-powered ed-tech products. Yet ethics in AIEd is not a priority for most ed-tech companies and schools. One reason for this is the lack of awareness among relevant stakeholders of where AI can go wrong in the context of education. This means that the drawbacks of using AI, such as discrimination against certain groups due to data deficiencies, stigmatization due to reliance on flawed machine learning modelling, and exploitation of personal data due to lack of awareness, can go unnoticed without any accountability.

An AI wrongly predicting that a particular student will not perform well in end-of-year exams, or might drop out next year, can play a very important role in shaping that student’s reputation with teachers and parents. That reputation will determine how those teachers and parents treat the learner, potentially causing a serious psychological impact based on a wrong prediction by an AI tool. One high-profile case of harm was the use of an algorithm to predict university entry results for students unable to take exams due to the pandemic; the system was shown to be biased against students from poorer backgrounds. As in other sectors where AI is making a huge impact, this raises an important ethical question for AIEd regarding students’ freedom to opt out of AI-powered predictions and automated evaluations.

The ethical implications of AI in education depend on the kind of disruption AI is causing in the ed-tech sector. This can occur at an individual level, for example by recommending the wrong learning materials to students, or it can collectively affect relationships between different stakeholders, such as how teachers perceive learners’ progress. It can also lead to automation bias and issues of accountability [ 67 ], where teachers begin to rely blindly on AI tools and prefer the tool’s outcomes over their own better judgement whenever there is a conflict.

Initiatives have been observed in this space. For example, Professor Rose Luckin, professor of learner-centred design at University College London, along with Sir Anthony Seldon, vice-chancellor of the University of Buckingham, and Priya Lakhani, founder and CEO of Century Tech, founded the Institute for Ethical AI in Education (IEAIEd) [ 72 ] to create awareness and promote the ethical aspects of AI in education. In its interim report, the institute identified seven requirements for ethical AI to mitigate risks for learners: human agency and oversight, to double-check AI’s performance; technical robustness and safety, to prevent AI going wrong with new data or being hacked; diversity, to ensure a similar distribution of different demographics in data and avoid bias; non-discrimination and fairness, to prevent anyone from being unfairly treated by AI; privacy and data governance, to ensure everyone has the right to control their own data; transparency, to enhance understanding of AI products; societal and environmental well-being, to ensure that AI is not causing harm; and accountability, to ensure that someone takes responsibility for any wrongdoing by AI. Recently, the institute has also published a framework [ 71 ] for educators, schools and ed-tech companies to help them select ed-tech products with various ethical considerations in mind, such as ethical design, transparency and privacy.

With the focus on online learning during the pandemic and greater use of AI-powered ed-tech tools, the risks of AI going wrong have increased significantly for all stakeholders, including ed-tech companies, schools, teachers and learners. Much more work needs to be done on ethical AI in learning contexts to mitigate these risks, including assessments that balance risks against opportunities.

UNESCO published the ‘Beijing Consensus’ on AI and Education, which recommended that member states take a number of actions for the smooth and positively impactful integration of AI with education [ 74 ]. International bodies such as the EU have also recently published draft rules, under the heading of the EU AI Act, to ban certain uses of AI and categorize others as ‘high risk’ [ 47 ].

5 Future work

With the focus on online education due to Covid-19 in the past year, it will be consequential to see what AI has to offer for education, given the vast amounts of data being collected online through Learning Management Systems (LMS) and Massive Open Online Courses (MOOCs).

With this influx of educational data, AI techniques such as reinforcement learning can also be utilized to empower ed-tech. Such algorithms perform best with large amounts of data, which until recently only a few ed-tech companies possessed. These algorithms have achieved breakthrough performance in multiple domains, including games [ 66 ], healthcare [ 14 ] and robotics [ 34 ]. This presents a great opportunity for AI’s applications in education: further enhancing students’ learning outcomes, reducing teachers’ workloads [ 30 ] and making learning personalized [ 64 ], interactive and fun [ 50 , 53 ] for teachers and students.
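To give a flavour of the kind of reinforcement-learning technique this could involve, the sketch below uses an epsilon-greedy multi-armed bandit to learn which learning resource tends to produce the largest learning gain. It is deliberately simplified and is our illustration, not a cited system; a real deployment would model the learner’s state and context, not just the average reward per resource.

```python
# Hedged sketch: an epsilon-greedy multi-armed bandit that learns which resource
# yields the best outcome (e.g. quiz improvement). Deliberately simplified.

import random
from collections import defaultdict


class EpsilonGreedyRecommender:
    def __init__(self, resources, epsilon: float = 0.1):
        self.resources = list(resources)
        self.epsilon = epsilon
        self.counts = defaultdict(int)    # times each resource was recommended
        self.values = defaultdict(float)  # running mean reward per resource

    def recommend(self) -> str:
        if random.random() < self.epsilon:             # explore occasionally
            return random.choice(self.resources)
        return max(self.resources, key=lambda r: self.values[r])  # otherwise exploit

    def update(self, resource: str, reward: float) -> None:
        """Reward could be, e.g., normalised score gain on a follow-up quiz."""
        self.counts[resource] += 1
        n = self.counts[resource]
        self.values[resource] += (reward - self.values[resource]) / n  # incremental mean


bandit = EpsilonGreedyRecommender(["video", "worksheet", "game"])
choice = bandit.recommend()
bandit.update(choice, reward=0.7)   # observed learning gain after using the resource
```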

With a growing number of AI-powered ed-tech products in future, there will also be a great deal of research on ethical AIEd. The risks of AI going wrong in education, and the psychological impact this can have on learners and teachers, are huge; hence, more work needs to be done to ensure robust and safe AI products for all stakeholders.

This can begin with ed-tech companies sharing detailed guidelines for using AI-powered ed-tech products, in particular specifying when not to rely on them. This includes detailed documentation of the entire machine learning development pipeline: the assumptions made, the data-processing approaches used and the processes followed for selecting machine learning models. Regulators can play a very important role in ensuring that certain ethical principles are followed in developing these AI products, or that the products achieve certain minimum performance thresholds [ 32 ].
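As one possible shape for such documentation, the sketch below records pipeline assumptions, data processing and model-selection rationale in a model-card-style structure. The field names and the example product are our assumptions, not an established schema.

```python
# Illustrative only: a minimal, model-card-style record of the documentation
# described above. Field names are assumptions, not a standard schema.

from dataclasses import dataclass, field
from typing import List


@dataclass
class EdTechModelCard:
    product: str
    intended_use: str
    not_intended_for: List[str]          # when *not* to rely on the model
    training_data_description: str
    preprocessing_steps: List[str]
    model_selection_rationale: str
    known_limitations: List[str] = field(default_factory=list)


card = EdTechModelCard(
    product="dropout-risk-predictor (hypothetical)",
    intended_use="Flag students for a conversation with a teacher, never for automatic action.",
    not_intended_for=["grading", "admissions decisions"],
    training_data_description="Anonymised LMS activity logs from one academic year.",
    preprocessing_steps=["remove identifiers", "impute missing attendance", "normalise scores"],
    model_selection_rationale="Chose an interpretable model over a marginally more accurate one.",
    known_limitations=["trained on a single institution; may not transfer"],
)
print(card.product)
```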

6 Conclusion

AIEd promised a great deal in its infancy, around three decades ago, yet a number of AI breakthroughs are still required to see that kind of disruption in education at scale (including basic infrastructure). In the end, the goal of AIEd is not to promote AI, but to support education. In essence, there is only one way to evaluate the impact of AI in Education: through learning outcomes. AIEd for reducing teachers’ workload is far more impactful if the reduced workload enables teachers to focus on students’ learning, leading to better learning outcomes.

Cutting-edge AI produced by researchers and companies around the world is not of much use if it does not help a primary-grade student learn. This problem is extremely challenging because every learner is unique, with different learning pathways. With the recent developments in AI, particularly in reinforcement learning techniques, the future holds exciting possibilities for where AI will take education. For impactful AI in education, learners and teachers always need to be at the epicenter of AI development.

References

About Amazon: Helping 700,000 students transition to remote learning. https://www.aboutamazon.com/news/community/helping-700-000-students-transition-to-remote-learning (2020)

Al-Maroof, R.A.S., Al-Emran, M.: Students acceptance of google classroom: an exploratory study using PLS–SEM approach. Int. J. Emerg. Technol Learn. (2018). https://doi.org/10.3991/ijet.v13i06.8275

Amazon Web Services, Inc. (n.d.).: Amazon Web Services, Inc. https://pages.awscloud.com/whitepaper-emerging-trends-in-education.html (2020)

Baker, R.S.: Stupid tutoring systems, intelligent humans. Int. J. Artif. Intell. Educ. 26 (2), 600–614 (2016)

Baturay, M.H.: An overview of the world of MOOCs. Procedia. Soc. Behav. Sci. 174 , 427–433 (2015)

Baylari, A., Montazer, G.A.: Design a personalized e-learning system based on item response theory and artificial neural network approach. Expert. Syst. Appl. 36 (4), 8013–8021 (2009)

Beck, J., Stern, M., Haugsjaa, E.: Applications of AI in education. Crossroads 3 (1), 11–15 (1996). https://doi.org/10.1016/j.eswa.2008.10.080

Beck, J.E.: Modeling the Student with Reinforcement Learning. Proceedings of the Machine learning for User Modeling Workshop at the Sixth International Conference on User Modeling (1997)

Beck, J.E., Woolf, B.P., Beal, C.R.: ADVISOR: A machine learning architecture for intelligent tutor construction. Proceedings of the 7th National Conference on Artificial Intelligence, New York, ACM, 552–557 (2000)

Boekaerts, M.: Self-regulated learning: where we are today. Int. J. Educ. Res. 31 (6), 445–457 (1999)

Brundage, M., Avin, S., Wang, J., Belfield, H., Krueger, G., Hadfield, G., Maharaj, T.: Toward trustworthy AI development: mechanisms for supporting verifiable claims. arXiv preprint arXiv:2004.07213 (2020)

Bull, S., Kay, J.: Open learner models. In: Nkambou, R., Bourdeau, J., Mizoguchi, R. (eds.) Studies in computational intelligence, pp. 301–322. Springer, Berlin (2010)

Cunha-Perez, C., Arevalillo-Herraez, M., Marco-Gimenez, L., Arnau, D.: On incorporating affective support to an intelligent tutoring system: an empirical study. IEEE. R. Iberoamericana. De. Tecnologias. Del. Aprendizaje. 13 (2), 63–69 (2018)

Callaway, E.: “It will change everything”: DeepMind’s AI makes gigantic leap in solving protein structures. Nature. https://www.nature.com/articles/d41586-020-03348-4 . (2020)

Cambridge University Press and Educate Ventures. Shock to the system: lessons from Covid-19 Volume 1: Implications and recommendations. https://www.cambridge.org/pk/files/1616/1349/4545/Shock_to_the_System_Lessons_from_Covid19_Volume_1.pdf (2021). Accessed 12 Apr 2021

Deng, R., Benckendorff, P., Gannaway, D.: Progress and new directions for teaching and learning in MOOCs. Comput. Educ. 129 , 48–60 (2019)

Erümit, A.K., Çetin, İ: Design framework of adaptive intelligent tutoring systems. Educ. Inf. Technol. 25 (5), 4477–4500 (2020)

Fang, Y., Ren, Z., Hu, X., Graesser, A.C.: A meta-analysis of the effectiveness of ALEKS on learning. Educ. Psychol. 39 (10), 1278–1292 (2019)

Floridi, L.: Soft ethics, the governance of the digital and the general data protection regulation. Philos. Trans. R. Soc. A. Math. Phys. Eng. Sci. 376 (2133), 20180081 (2018)

Goldstein, I.J.: The genetic graph: a representation for the evolution of procedural knowledge. Int. J. Man. Mach. Stud. 11 (1), 51–77 (1979)

Goryachikh, S.P., Sozinova, A.A., Grishina, E.N., Nagovitsyna, E.V.: Optimisation of the mechanisms of managing venture investments in the sphere of digital education on the basis of new information and communication technologies: audit and reorganisation. IJEPEE. 13 (6), 587–594 (2020)

Grawemeyer, B., Gutierrez-Santos, S., Holmes, W., Mavrikis, M., Rummel, N., Mazziotti, C., Janning, R.: Talk, tutor, explore, learn: intelligent tutoring and exploration for robust learning, p. 2015. AIED, Madrid (2015)

Hansen, A., Mavrikis, M.: Learning mathematics from multiple representations: two design principles. ICTMT-12, Faro (2015)

Hasan, M.A., Noor, N.F.M., Rahman, S.S.A., Rahman, M.M.: The transition from intelligent to affective tutoring system: a review and open issues. IEEE Access (2020). https://doi.org/10.1109/ACCESS.2020.3036990

Heffernan, N.T., Heffernan, C.L.: The ASSISTments ecosystem: building a platform that brings scientists and teachers together for minimally invasive research on human learning and teaching. Int. J. Artif. Intell. Educ. (2014). https://doi.org/10.1007/s40593-014-0024-x

Heffernan, N.T., Koedinger, K.R.: An intelligent tutoring system incorporating a model of an experienced human tutor. Proceedings of the 6th International Conference on Intelligent Tutoring Systems, 2363, p 596–608, (2002)

Hill, P., Barber, M.: Preparing for a Renaissance in Assessment. Pearson, London (2014)

Hollands, F.M., Tirthali, D.: Why do institutions offer MOOCs? Online Learning 18 (3), 3 (2014)

Iftakhar, S.: Google classroom: what works and how. J. Educ. Soc. Sci. 3 (1), 12–18 (2016)

Iglesias, A., Martínez, P., Aler, R., Fernández, F.: Reinforcement learning of pedagogical policies in adaptive and intelligent educational systems. Knowl. Based. Syst. 22 (4), 266–270 (2009)

Johnson, D.G., Verdicchio, M.: AI, agency and responsibility: the VW fraud case and beyond. Ai. Soc. 34 (3), 639–647 (2019)

Kazim, E., Denny, D.M.T., Koshiyama, A.: AI auditing and impact assessment: according to the UK information commissioner’s office. AI. Ethics. 1 , 1–10 (2021)

Kazim, E., Koshiyama, A.: A High-Level Overview of AI Ethics. SSRN J (2020). https://doi.org/10.2139/ssrn.3609292

Kober, J., Bagnell, J.A., Peters, J.: Reinforcement learning in robotics: a survey. Int. J. Robot. Res. 32 (11), 1238–1274 (2013)

Koshiyama, A., Kazim, E., Treleaven, P., Rai, P., Szpruch, L., Pavey, G., Ahamat, G., Leutner, F., Goebel, R., Knight, A., Adams, J., Hitrova, C., Barnett, J., Nachev, P., Barber, D., Chamorro-Premuzic, T., Klemmer, K., Gregorovic, M., Khan, S., Lomas, E.: Towards algorithm auditing a survey on managing legal ethical and technological risks of AI, ML and associated algorithms. SSRN J (2021). https://doi.org/10.2139/ssrn.3778998

LaPierre, J.: How AI Enhances Collaborative Learning. Filament Games (2018). https://www.filamentgames.com/blog/how-ai-enhances-collaborative-learning/ . Accessed 12 Apr 2021

Luckin, R.: Towards artificial intelligence-based assessment systems. Nat. Hum. Behav. (2017). https://doi.org/10.1038/s41562-016-0028

Luckin, R., du Boulay, B.: Int. J. Artif. Intell. Educ. 26 , 416–430 (2016)

Luckin, R., Holmes, W., Griffiths, M., Pearson, L.: Intelligence Unleashed An argument for AI in Education. https://static.googleusercontent.com/media/edu.google.com/en//pdfs/Intelligence-Unleashed-Publication.pdf (2016)

Barron-Estrada M.L., Zatarain-Cabada, R., Oramas-Bustillos, R., Gonzalez-Hernandez, F.: Sentiment analysis in an affective intelligent tutoring system. Proc. IEEE 17th Int. Conf. Adv. Learn. Technol. (ICALT), Timisoara pp. 394–397 2017.

Ma, W., Adesope, O., Nesbit, J.C., Liu, Q.: Intelligent tutoring systems and learning outcomes: a meta-analysis. J. Educ. Psychol. 106 (4), 901–918 (2014)

Makridakis, S.: The forthcoming Artificial Intelligence (AI) revolution: Its impact on society and firms. Futures 90 , 46–60 (2017)

Mavrikis, M.: Int. J. Artif. Intell. Tools. 19 , 733–753 (2010)

Merrill, D.C., Reiser, B.J., Ranney, M., Trafton, J.G.: Effective tutoring techniques: a comparison of human tutors and intelligent tutoring systems. J. Learn. Sci. 2 (3), 277–305 (1992)

Moeini, A.: Theorising Evidence-Informed Learning Technology Enterprises: A Participatory Design-Based Research Approach. Doctoral dissertation, UCL University College London, London, (2020)

Mohamed, H., Lamia, M.: Implementing flipped classroom that used an intelligent tutoring system into learning process. Comput. Educ. 124 , 62–76 (2018). https://doi.org/10.1016/j.compedu.2018.05.011

Mueller, B.: The Artificial Intelligence Act: A Quick Explainer. [online] Center for Data Innovation (2021). https://datainnovation.org/2021/05/the-artificial-intelligence-act-a-quick-explainer/ . Accessed 12 Apr 2021

Murray, M.C., Pérez, J.: Informing and performing: A study comparing adaptive learning to traditional learning. Inform. Sci. J. 18 , 111–125 (2015)

Oudeyer, P-Y.: Computational Theories of Curiosity-Driven Learning. https://arxiv.org/pdf/1802.10546.pdf (2018)

Park, H.W., Grover, I., Spaulding, S., Gomez, L., Breazeal, C.: A model-free affective reinforcement learning approach to personalization of an autonomous social robot companion for early literacy education. AAAI. 33 (1), 687–694 (2019)

Resnick, M., Robinson, K.: Lifelong kindergarten: cultivating creativity through projects, passion, peers, and play. MIT press, Cambridge (2017)

Rodríguez-Triana, M.J., Prieto, L.P., Martínez-Monés, A., Asensio-Pérez, J.I. and Dimitriadis, Y.: The teacher in the loop: Customizing multimodal learning analytics for blended learning. In Proceedings of the 8th international conference on learning analytics and knowledge. pp 417–426 (2018)

Rowe, J.P., Lester, J.C.: Improving student problem solving in narrative-centered learning environments: a modular reinforcement learning framework. In International Conference on Artificial Intelligence in Education. pp. 419–428. Springer, Cham (2015)

Russell, S.J., Norvig, P., Davis, E.: Artificial intelligence: a modern approach. Prentice Hall, Upper Saddle River (2010)

Jiménez, S., Juárez-Ramírez, R., Castillo, V.H., Licea, G., Ramírez-Noriega, A., Inzunza, S.: A feedback system to provide affective support to students. Comput. Appl. Eng. Educ. 26 (3), 473–483 (2018)

Sadler, D.R.: Formative assessment in the design of instructional systems. Instr. Sci. 18 , 119–144 (1989)

Samarakou, M., Fylladitakis, E., Prentakis, P., Athineos, S.: Implementation of artificial intelligence assessment in engineering laboratory education. https://files.eric.ed.gov/fulltext/ED557263.pdf (2014). Accessed 24 Feb 2021

Segal, A., Hindi, S., Prusak, N., Swidan, O., Livni, A., Palatnic, A., Schwarz, B.: Keeping the teacher in the loop: Technologies for monitoring group learning in real-time. In International Conference on Artificial Intelligence in Education. pp. 64–76. Springer, Cham (2017)

Self, J. A. (1990). Theoretical foundations of intelligent tutoring systems. J. Artif. Intell

Self, J.A.: The defining characteristics of intelligent tutoring systems research: ITSs care, precisely. IJAIEd. 10 , 350–364 (1998)

Selwood, I., Pilkington, R.: Teacher workload: using ICT to release time to teach. Educ. Rev. 57 (2), 163–174 (2005)

Shaharanee, I.N.M., Jamil, J.M., Rodzi, S.S.M.: Google classroom as a tool for active learning. AIP Conference Proceedings 1761 (1), 020069. AIP Publishing LLC, College Park (2016)

Shaharanee, I.N.M., Jamil, J.M., Rodzi, S.S.M.: The application of Google Classroom as a tool for teaching and learning. J. Telecommun. Electron. Comp. Eng. 8 (10), 5–8 (2016)

Shawky, D., Badawi, A.: Towards a personalized learning experience using reinforcement learning. In: Hassanien, A.E. (ed.) Machine learning paradigms Theory and application, pp. 169–187. Springer (2019)

Shute, V.J. (1991). Rose garden promises of intelligent tutoring systems: blossom or thorn.  NASA, Lyndon B. Johnson Space Center, Fourth Annual Workshop on Space Operations Applications and Research (SOAR 90) . Available at: https://ntrs.nasa.gov/citations/19910011382 . Accessed 4 July 2021

Silver, D., Huang, A., Maddison, C.J., Guez, A., Sifre, L., Van Den Driessche, G., Schrittwieser, J., Antonoglou, I., Panneershelvam, V., Lanctot, M., Dieleman, S.: Mastering the game of Go with deep neural networks and tree search. Nature 529 (7587), 484–489 (2016)

Skitka, L.J., Mosier, K., Burdick, M.D.: Accountability and automation bias. Int. J. Hum. Comput. Stud. 52 (4), 701–717 (2000)

Steenbergen-Hu, S., Cooper, H.: A meta-analysis of the effectiveness of intelligent tutoring systems on K–12 students’ mathematical learning. J. Educ. Psychol. 105 (4), 970–987 (2013)

Sudarsana, I.K., Putra, I.B., Astawa, I.N.T., Yogantara, I.W.L.: The use of google classroom in the learning process. J. Phys. Conf. Ser 1175 (1), 012165 (2019)

TechCrunch. Indian education startup Byju’s is fundraising at a $10B valuation. https://techcrunch.com/2020/05/01/indian-education-startup-byjus-is-fundraising-at-a-10b-valuation/ (2020). Accessed 12 Apr 2021

The Institute for Ethical AI in Education The Ethical Framework for AI in Education (IEAIED). https://fb77c667c4d6e21c1e06.b-cdn.net/wp-content/uploads/2021/03/The-Ethical-Framework-for-AI-in-Education-Institute-for-Ethical-AI-in-Education-Final-Report.pdf (2021). Accessed 12 Apr 2021

The Institute for Ethical AI in Education The Ethical Framework for AI in Education (n.d.). Available at: https://www.buckingham.ac.uk/wp-content/uploads/2021/03/The-Institute-for-Ethical-AI-in-Education-The-Ethical-Framework-for-AI-in-Education.pdf . Accessed 4 July 2021

Tisseron, S., Tordo, F., Baddoura, R.: Testing Empathy with Robots: a model in four dimensions and sixteen ítems. Int. J. Soc. Robot. 7 (1), 97–102 (2015)

UNESCO. Artificial intelligence in education. UNESCO. https://en.unesco.org/artificial-intelligence/education . (2019). Accessed 12 Apr 2021

Utterberg Modén, M., Tallvid, M., Lundin, J., Lindström, B.: Intelligent Tutoring Systems: Why Teachers Abandoned a Technology Aimed at Automating Teaching Processes. In: Proceedings of the 54th Hawaii International Conference on System Sciences, Maui, p. 1538 (2021)

van der Spoel, I., Noroozi, O., Schuurink, E., van Ginkel, S.: Teachers’ online teaching expectations and experiences during the Covid19-pandemic in the Netherlands. Eur. J. Teach. Educ. 43 (4), 623–638 (2020)

Weller, M.: Twenty years of EdTech. Educa. Rev. Online. 53 (4), 34–48 (2018)

Wenger, E.: Artificial intelligence and tutoring systems. Morgan Kauffman, Los Altos (1987)

World Economic Forum and The Boston Consulting Group. New vision for education unlocking the potential of technology industry agenda prepared in collaboration with the Boston consulting group. http://www3.weforum.org/docs/WEFUSA_NewVisionforEducation_Report2015.pdf (2015). Accessed 12 Apr 2021

Yampolskiy, R.V., Spellchecker, M.S.: Artificial intelligence safety and cybersecurity: a timeline of AI failures. arXiv:1610.07997 (2016)

Yu, H., Miao, C., Leung, C., White, T.J.: Towards AI-powered personalization in MOOC learning. Npj. Sci. Learn. 2 (1), 1–5 (2017)

Yu, H., Shen, Z., Miao, C., Leung, C., Lesser, V.R., Yang, Q.: Building ethics into artificial intelligence. arXiv:1812.02953 (2018)

Zemel, R., Wu Y., Swersky, K., Pitassi, T., Dwork, C.: Learning fair representations. In: International Conference on Machine Learning, pp. 325–333 (2013)

Zhang, Y., Liao, Q.V., Bellamy, R.K.E.: Effect of confidence and explanation on accuracy and trust calibration in AI-assisted decision making. In: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. arXiv:2001.02114 (2020)

Zimmerman, B.J., Schunk, D.H.: Handbook of Self-Regulation of Learning and Performance. Routledge, Oxfordshire (2011)

Author information

Authors and affiliations.

Artificial Intelligence at University College, London, UK

Muhammad Ali Chaudhry

Department of Computer Science, University College London, London, UK

Emre Kazim

Corresponding author

Correspondence to Muhammad Ali Chaudhry .

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Chaudhry, M.A., Kazim, E. Artificial Intelligence in Education (AIEd): a high-level academic and industry note 2021. AI Ethics 2 , 157–165 (2022). https://doi.org/10.1007/s43681-021-00074-z

Download citation

Received : 25 April 2021

Accepted : 17 June 2021

Published : 07 July 2021

Issue Date : February 2022

DOI : https://doi.org/10.1007/s43681-021-00074-z

  • Machine learning
  • Learning science
  • Artificial Intelligence in Education (AIEd)
  • Intelligent Tutoring Systems (ITS)

Artificial Intelligence in Education

To prepare students to thrive as learners and leaders of the future, educators must become comfortable teaching with and about Artificial Intelligence. Generative AI tools such as ChatGPT, Claude and Midjourney, for example, further the opportunity to rethink and redesign learning. Educators can use these tools to strengthen learning experiences while addressing the ethical considerations of using AI. ISTE is the global leader in supporting schools in thoughtfully, safely and responsibly introducing AI in ways that enhance learning and empower students and teachers.

Interested in learning how to teach AI?

Sign up to learn about ISTE’s AI resources and PD opportunities. 

ASCD + ISTE StretchAI

StretchAI: An AI Coach Just for Educators

ISTE and ASCD are developing the first AI coach specifically for educators. With Stretch AI, educators can get tailored guidance to improve their teaching, from tips on ways to use technology to support learning, to strategies to create more inclusive learning experiences. Answers are based on a carefully validated set of resources and include the citations from source documents used to generate answers. If you are interested in becoming a beta tester for StretchAI, please sign up below.

Evolving Teacher Education in an AI World

Download this free report—with a framework and recommendations—on how educator prep programs can better ready their program and teacher candidates for incorporating AI.

Leaders' Guide to Artificial Intelligence

School leaders must ensure the use of AI is thoughtful and appropriate, and supports the district’s vision. Download this free guide  (or the UK version ) to get the background you need to guide your district in an AI-infused world.

UPDATED! Free Guides for Engaging Students in AI Creation

ISTE and GM have partnered to create Hands-On AI Projects for the Classroom guides to provide educators with a variety of activities to teach students about AI across various grade levels and subject areas. Each guide includes background information for teachers and student-driven project ideas that relate to subject-area standards. 

The hands-on activities in the guides range from “unplugged” projects to explore the basic concepts of how AI works to creating chatbots and simple video games with AI, allowing students to work directly with innovative AI technologies and demonstrate their learning. 

These updated hands-on guides are available in downloadable PDF format in English, Spanish and Arabic from the list below.

Artificial Intelligence Explorations for Educators unpacks everything educators need to know about bringing AI to the classroom. Sign up for the next course and find out how to earn graduate-level credit for completing the course.

As a co-founder of  TeachAI , ISTE provides guidance to support school leaders and policy makers around leveraging AI for learning.

Dive deeper into AI and learn how to navigate ChatGPT in schools with curated resources and tools  from ASCD and ISTE.

Join our Educator AI Community on Connect

ISTE+ASCD’s free online community brings together educators from around the world to share ideas and best practices for using artificial intelligence to support learning.

Learn More From These Podcasts, Blog Posts, Case Studies and Websites

Partners Code.org, ETS, ISTE and Khan Academy offer engaging sessions with renowned experts to demystify AI, explore responsible implementation, address bias, and showcase how AI-powered learning can revolutionize student outcomes.

One of the challenges with bias in AI comes down to who has access to these careers in the first place, and that's the area that Tess Posner, CEO of the nonprofit AI4All, is trying to address.

Featuring in-depth interviews with practitioners, guidelines for classroom teachers and a webinar about the importance of AI in education, this site provides K-12 educators with practical tools for integrating AI and computational thinking across their curricula.

This 15-hour, self-paced introduction to artificial intelligence is designed for students in grades 9-12. Educators and students should create a free account at P-TECH before viewing the course.

Explore More in the Learning Library

Explore more books, articles, and tools about artificial intelligence in the Learning Library.

Unleashing the power of AI for education

Provided by Microsoft Education

Artificial intelligence (AI) is a major influence on the state of education today, and the implications are huge. AI has the potential to transform how our education system operates, heighten the competitiveness of institutions, and empower teachers and learners of all abilities.

Dan Ayoub is general manager of education at Microsoft.

The opportunities for AI to support education are so broad that recently Microsoft commissioned research on this topic from IDC  to understand where the company can help. The findings illustrate the strategic nature of AI in education and highlight the need for technologies and skills to make the promise of AI a reality.

The results showed almost universal acceptance among educators that AI is important for their future—99.4% said AI would be instrumental to their institution’s competitiveness within the next three years, with 15% calling it a “game-changer.” Nearly all are trying to work with it too—92% said they have started to experiment with the technology.

Yet on the other hand, most institutions still lack a formal data strategy or practical measures in place to advance AI capabilities, which remains a key inhibitor. The finding indicates that although the vast majority of leaders understand the need for an AI strategy, they may lack clarity on how to implement one. And it could be that they just don’t know where to start.

David Kellermann has become a pioneer in how to use AI in the classroom. At the University of New South Wales in Sydney, Australia, Kellermann has built a question bot capable of answering questions on its own or delivering video of past lectures. The bot can also flag student questions for teaching assistants (TAs) to follow up. What’s more, it keeps getting better at its job as it’s exposed to more and different questions over time.

Kellermann began his classroom’s transformation with a single Surface laptop. He’s also employed out-of-the-box systems like Microsoft Teams to foster collaboration among his students. Kellermann used the Microsoft Power Platform to build the question bot, and he’s also built a dashboard using Power BI that plots the class’s exam scores and builds personalized study packs based on students’ past performance.
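
To make the pattern concrete, here is a minimal, hypothetical sketch of the "answer or escalate" loop such a question bot follows: match an incoming question against previously answered ones, return the stored answer when the match is strong, and otherwise flag the question for a TA. This is not Kellermann's implementation (his bot was built on the Microsoft Power Platform); the knowledge base and similarity threshold below are invented for illustration.

```python
# Illustrative sketch only: the real bot was built with the Microsoft Power
# Platform; this Python version just shows the general "answer or escalate" pattern.
from difflib import SequenceMatcher

# Hypothetical knowledge base of previously answered questions.
PAST_QA = {
    "when is assignment 2 due": "Assignment 2 is due Friday at 5 pm.",
    "what topics does the final exam cover": "Lectures 1-10; see the week 11 video recap.",
}

SIMILARITY_THRESHOLD = 0.75  # assumed cut-off, tuned in practice


def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity score between two questions."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def answer_question(question: str) -> str:
    """Answer from the knowledge base, or flag the question for a TA."""
    best_answer, best_score = None, 0.0
    for past_q, answer in PAST_QA.items():
        score = similarity(question, past_q)
        if score > best_score:
            best_answer, best_score = answer, score
    if best_score >= SIMILARITY_THRESHOLD:
        return best_answer
    # Escalate: in a real deployment this would notify a TA channel.
    return "I've flagged this question for a teaching assistant to follow up."


if __name__ == "__main__":
    print(answer_question("When is Assignment 2 due?"))
```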

Educators see AI as instrumental to their institution’s competitiveness, yet most institutions still lack a formal data strategy to advance AI.

Kellermann’s project illustrates a key principle for organizations in nearly every industry when it comes to working with AI and machine learning—knowing where to start, starting small, and adding to your capabilities over time. The potential applications of AI are so vast, even the most sophisticated organizations can become bogged down trying to do too much, too soon. Often, it comes down to simply having a small goal and building from there.  

As an AI initiative gradually grows and becomes more sophisticated, it’s also important to have access to experts who can navigate technology and put the right systems in place. To gain a foothold with AI, institutions need tools, technologies, and skills.

This is a big focus for our work at Microsoft—to support educational institutions and classrooms. We’ve seen the strides some institutions have already taken to bring the potential of AI technologies into the classroom. But we also know there is much more work to do. Over the next few years, AI’s impact will be felt in several ways—managing operations and processes, data-driven programs to increase effectiveness, saving energy with smart buildings, creating a modern campus with a secure and safe learning environment.  

But its most important and far-reaching impact may lie in AI’s potential to change the way teachers teach and students learn, helping maximize student success and prepare them for the future.  ​

Collective intelligence tools will be available to save teachers time with tasks like grading papers so teachers and TAs can spend more time with students. AI can help identify struggling students through behavioral cues and give them a nudge in the right direction.
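
As a rough sketch of what such a "nudge" might look like under the hood, the example below flags students from a few simple behavioural signals. The field names and thresholds are assumptions for illustration; real systems draw on far richer data and trained models.

```python
# Minimal sketch of flagging students from simple behavioural signals.
# Field names and thresholds are illustrative assumptions, not a description
# of any particular product.
from dataclasses import dataclass


@dataclass
class ActivitySnapshot:
    student: str
    days_since_login: int
    assignments_missed: int
    avg_quiz_score: float  # 0-100


def needs_nudge(s: ActivitySnapshot) -> bool:
    """Very simple rule-based check; real systems would use trained models."""
    return (
        s.days_since_login > 7
        or s.assignments_missed >= 2
        or s.avg_quiz_score < 60
    )


students = [
    ActivitySnapshot("A. Lee", days_since_login=2, assignments_missed=0, avg_quiz_score=82),
    ActivitySnapshot("B. Khan", days_since_login=9, assignments_missed=3, avg_quiz_score=55),
]

for s in students:
    if needs_nudge(s):
        print(f"Nudge {s.student}: check in with your instructor or revisit recent material.")
```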

AI can also help educators foster greater inclusivity—AI-based language translation, for example, can enable more students with diverse backgrounds to participate in a class or listen to a lecture. Syracuse University’s School of Information Studies is working to drive experiential learning for students while also helping solve real-world problems, such as Our Ability , a website that helps people with disabilities get jobs. 

Schools can even use AI to offer a truly personalized learning experience—overcoming one of the biggest limitations of our modern, one-to-many education model. Kellermann’s personalized learning system in Sydney shows that the technology is here today.

AI has the power to become a great equalizer in education and a key differentiator for institutions that embrace it. Schools that adopt AI in clever ways are going to show better student success and empower their learners to enter the work force of tomorrow.

Given its importance, institutions among that 92% should start thinking now about the impact they can achieve with AI technologies. Do you want to grade papers more quickly? Empower teachers to spend more time with students? Whatever the goal, it’s important to keep it in mind, and then maybe dream a little.


Stanford University


New advances in technology are upending education, from the recent debut of new artificial intelligence (AI) chatbots like ChatGPT to the growing accessibility of virtual-reality tools that expand the boundaries of the classroom. For educators, at the heart of it all is the hope that every learner gets an equal chance to develop the skills they need to succeed. But that promise is not without its pitfalls.

“Technology is a game-changer for education – it offers the prospect of universal access to high-quality learning experiences, and it creates fundamentally new ways of teaching,” said Dan Schwartz, dean of Stanford Graduate School of Education (GSE), who is also a professor of educational technology at the GSE and faculty director of the Stanford Accelerator for Learning . “But there are a lot of ways we teach that aren’t great, and a big fear with AI in particular is that we just get more efficient at teaching badly. This is a moment to pay attention, to do things differently.”

For K-12 schools, this year also marks the end of the Elementary and Secondary School Emergency Relief (ESSER) funding program, which has provided pandemic recovery funds that many districts used to invest in educational software and systems. With these funds running out in September 2024, schools are trying to determine their best use of technology as they face the prospect of diminishing resources.

Here, Schwartz and other Stanford education scholars weigh in on some of the technology trends taking center stage in the classroom this year.

AI in the classroom

In 2023, the big story in technology and education was generative AI, following the introduction of ChatGPT and other chatbots that produce text seemingly written by a human in response to a question or prompt. Educators immediately worried that students would use the chatbot to cheat by trying to pass its writing off as their own. As schools move to adopt policies around students’ use of the tool, many are also beginning to explore potential opportunities – for example, to generate reading assignments or coach students during the writing process.

AI can also help automate tasks like grading and lesson planning, freeing teachers to do the human work that drew them into the profession in the first place, said Victor Lee, an associate professor at the GSE and faculty lead for the AI + Education initiative at the Stanford Accelerator for Learning. “I’m heartened to see some movement toward creating AI tools that make teachers’ lives better – not to replace them, but to give them the time to do the work that only teachers are able to do,” he said. “I hope to see more on that front.”

He also emphasized the need to teach students now to begin questioning and critiquing the development and use of AI. “AI is not going away,” said Lee, who is also director of CRAFT (Classroom-Ready Resources about AI for Teaching), which provides free resources to help teach AI literacy to high school students across subject areas. “We need to teach students how to understand and think critically about this technology.”

Immersive environments

The use of immersive technologies like augmented reality, virtual reality, and mixed reality is also expected to surge in the classroom, especially as new high-profile devices integrating these realities hit the marketplace in 2024.

The educational possibilities now go beyond putting on a headset and experiencing life in a distant location. With new technologies, students can create their own local interactive 360-degree scenarios, using just a cell phone or inexpensive camera and simple online tools.

“This is an area that’s really going to explode over the next couple of years,” said Kristen Pilner Blair, director of research for the Digital Learning initiative at the Stanford Accelerator for Learning, which runs a program exploring the use of virtual field trips to promote learning. “Students can learn about the effects of climate change, say, by virtually experiencing the impact on a particular environment. But they can also become creators, documenting and sharing immersive media that shows the effects where they live.”

Integrating AI into virtual simulations could also soon take the experience to another level, Schwartz said. “If your VR experience brings me to a redwood tree, you could have a window pop up that allows me to ask questions about the tree, and AI can deliver the answers.”

Gamification

Another trend expected to intensify this year is the gamification of learning activities, often featuring dynamic videos with interactive elements to engage and hold students’ attention.

“Gamification is a good motivator, because one key aspect is reward, which is very powerful,” said Schwartz. The downside? Rewards are specific to the activity at hand, which may not extend to learning more generally. “If I get rewarded for doing math in a space-age video game, it doesn’t mean I’m going to be motivated to do math anywhere else.”

Gamification sometimes tries to make “chocolate-covered broccoli,” Schwartz said, by adding art and rewards to make speeded response tasks involving single-answer, factual questions more fun. He hopes to see more creative play patterns that give students points for rethinking an approach or adapting their strategy, rather than only rewarding them for quickly producing a correct response.

Data-gathering and analysis

The growing use of technology in schools is producing massive amounts of data on students’ activities in the classroom and online. “We’re now able to capture moment-to-moment data, every keystroke a kid makes,” said Schwartz – data that can reveal areas of struggle and different learning opportunities, from solving a math problem to approaching a writing assignment.

But outside of research settings, he said, that type of granular data – now owned by tech companies – is more likely used to refine the design of the software than to provide teachers with actionable information.

The promise of personalized learning is being able to generate content aligned with students’ interests and skill levels, and making lessons more accessible for multilingual learners and students with disabilities. Realizing that promise requires that educators can make sense of the data that’s being collected, said Schwartz – and while advances in AI are making it easier to identify patterns and findings, the data also needs to be in a system and form educators can access and analyze for decision-making. Developing a usable infrastructure for that data, Schwartz said, is an important next step.
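
A minimal sketch of that step, under assumed event and metric formats, might condense raw activity events into a per-student summary a teacher could read at a glance:

```python
# Hypothetical sketch: condensing fine-grained activity events into a
# per-student summary. The event format and metrics are assumptions
# for illustration only.
from collections import defaultdict

# Each event: (student, task, seconds_spent, attempts, correct)
events = [
    ("maria", "fractions-3", 240, 4, False),
    ("maria", "fractions-3", 180, 2, True),
    ("jae",   "fractions-3",  60, 1, True),
]

summary = defaultdict(lambda: {"time": 0, "attempts": 0, "solved": 0, "tasks": set()})

for student, task, seconds, attempts, correct in events:
    s = summary[student]
    s["time"] += seconds
    s["attempts"] += attempts
    s["solved"] += int(correct)
    s["tasks"].add(task)

for student, s in summary.items():
    avg_attempts = s["attempts"] / len(s["tasks"])
    print(f"{student}: {s['time'] // 60} min on task, "
          f"{avg_attempts:.1f} attempts per task, {s['solved']} solved")
```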

With the accumulation of student data comes privacy concerns: How is the data being collected? Are there regulations or guidelines around its use in decision-making? What steps are being taken to prevent unauthorized access? In 2023 K-12 schools experienced a rise in cyberattacks, underscoring the need to implement strong systems to safeguard student data.

Technology is “requiring people to check their assumptions about education,” said Schwartz, noting that AI in particular is very efficient at replicating biases and automating the way things have been done in the past, including poor models of instruction. “But it’s also opening up new possibilities for students producing material, and for being able to identify children who are not average so we can customize toward them. It’s an opportunity to think of entirely new ways of teaching – this is the path I hope to see.”

Artificial Intelligence In Education: Teachers’ Opinions On AI In The Classroom

Ilana Hamilton

Updated: Jun 6, 2024, 4:04am

In recent years, the meteoric rise of artificial intelligence (AI) has sent shockwaves through society on both economic and cultural levels. Seemingly poised to become as ubiquitous as email, this rapidly evolving technology is transforming many aspects of daily life—including how we teach and learn.

In October 2023, Forbes Advisor surveyed 500 practicing educators from around the U.S. about their experiences with AI in the classroom. With respondents representing teachers at all career stages, the results reveal a snapshot of how artificial intelligence is impacting education.


What Is AI?

Before we dive into AI’s function in the education space, let’s define this technology in general terms. Artificial intelligence allows machines to execute tasks that have traditionally required human cognition. AI-powered programs and devices can make decisions, solve problems, understand and mimic natural language and learn from unstructured data.

OpenAI’s release of ChatGPT—a natural language processing chatbot—in the fall of 2022 brought AI to many people’s attention for the first time. However, AI tools have been part of the tech landscape for years. If you’ve ever played chess against a bot, consulted a virtual assistant like Siri or Alexa or even scrolled through your social media feed, you’ve already interacted with artificial intelligence.

What Are the Functions of AI in Education?

We’re still learning how AI technologies will integrate into the education sector as they develop, and we don’t yet have a full picture of how AI will affect critical issues of ethics, equity and data safety. However, we’ve already pinpointed several key uses for artificial intelligence in education, including the following.

AI-Powered Educational Games

Teachers have long recognized the value of play-based learning, and schools have used educational computer games—such as The Oregon Trail, first released in 1974—since the early days of computer gaming. Today’s AI-powered games can deliver targeted learning thanks to user-responsive programming.

Adaptive Learning Platforms

Educational technology leaders such as Carnegie Learning and Knewton offer adaptive platforms that customize learning activities and content in real time. Continuous assessment allows for immediate feedback and helps the system adjust its approach. Adaptive learning methodologies vary from simple rules-based systems to multifaceted machine learning algorithms.
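
At the simple, rules-based end of that spectrum, an adaptive loop can be as small as the hypothetical sketch below, which raises the difficulty after a streak of correct answers and lowers it after repeated misses (the streak lengths and level bounds are assumptions):

```python
# A minimal rules-based adaptive-difficulty sketch, at the simple end of the
# spectrum described above. Streak lengths and level bounds are assumptions.
def next_difficulty(level: int, recent_results: list[bool],
                    min_level: int = 1, max_level: int = 5) -> int:
    """Move up after three straight correct answers, down after two straight misses."""
    if len(recent_results) >= 3 and all(recent_results[-3:]):
        return min(level + 1, max_level)
    if len(recent_results) >= 2 and not any(recent_results[-2:]):
        return max(level - 1, min_level)
    return level


level = 2
history = []
for correct in [True, True, True, False, False]:
    history.append(correct)
    level = next_difficulty(level, history)
    print(f"answer correct={correct} -> difficulty level {level}")
```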

Automated Grading and Feedback Systems

By automating grading, planning and administrative work, artificial intelligence systems can free up educators’ time and energy for increased student contact. This is a common argument in support of using AI in the classroom.
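
As a toy illustration of pairing a score with targeted feedback, the sketch below grades a short answer against an invented keyword rubric; production graders rely on trained models rather than keyword matching.

```python
# Toy sketch of automated short-answer grading with feedback. The rubric is an
# invented example; real systems use trained models rather than keyword checks.
RUBRIC = {
    "photosynthesis": "Name the process that converts light energy to chemical energy.",
    "chlorophyll": "Mention the pigment that absorbs light.",
    "glucose": "Identify the sugar produced.",
}


def grade(answer: str) -> tuple[float, list[str]]:
    """Return a 0-1 score and feedback for any missing rubric points."""
    answer_lower = answer.lower()
    hits = [term for term in RUBRIC if term in answer_lower]
    feedback = [RUBRIC[term] for term in RUBRIC if term not in hits]
    return len(hits) / len(RUBRIC), feedback


score, feedback = grade("Plants use photosynthesis to turn sunlight into glucose.")
print(f"Score: {score:.0%}")
for tip in feedback:
    print("To improve:", tip)
```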

Chatbots for Student Support

At many higher education institutions, university chatbots support learners by responding to admissions queries, connecting students to course information and student services and delivering reminders. Other chatbots can help students brainstorm ideas, improve their writing skills and optimize their study time.

Intelligent Tutoring Systems

Often dedicated to a single subject such as math or language, intelligent tutoring systems simulate the one-on-one experience of working with a human tutor. Examples include the Duolingo app and Khan Academy’s Khanmigo tutoring system.
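
Stripped to its core, the loop these systems follow is: present a problem, check the response, offer progressively stronger hints, and finally show a worked solution. The sketch below illustrates that loop with an invented problem and hint tiers; real tutors model the learner far more richly.

```python
# Sketch of the core tutoring loop (present, check, hint, reveal). The problem
# and hint tiers are invented for illustration.
PROBLEM = {
    "prompt": "Solve for x: 2x + 6 = 10",
    "answer": "2",
    "hints": [
        "Try isolating the term with x first.",
        "Subtract 6 from both sides, then divide by 2.",
    ],
}


def check(response: str) -> bool:
    return response.strip() == PROBLEM["answer"]


def tutor(responses: list[str]) -> None:
    """Walk through scripted learner responses, giving one hint per miss."""
    for attempt, response in enumerate(responses):
        if check(response):
            print("Correct! Moving to the next problem.")
            return
        hint = PROBLEM["hints"][min(attempt, len(PROBLEM["hints"]) - 1)]
        print(f"Not quite. Hint: {hint}")
    print(f"Worked solution: x = {PROBLEM['answer']}.")


print(PROBLEM["prompt"])
tutor(["4", "3"])  # simulated learner answers
```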

AI’s Influence on Education

More than half of the teachers who responded to Forbes Advisor’s survey said they believe AI has had a positive effect on the teaching and learning process. Fewer than 1 in 5 cited a negative effect.

60% of Educators Use AI in Their Classrooms

AI tools for teacher and student support are growing in popularity. Our survey found that younger teachers are more likely to adopt these tools, with respondents under 26 reporting the highest usage rates.

The Most Common AI Tools Used in the Classroom

Teachers use AI-powered educational games more often than any other AI tools, but adaptive learning platforms and automated grading and feedback systems are also popular.

Concerns Surrounding AI in Education

As the adoption of AI in the classroom proliferates, students, teachers and schools must grapple with how to use these technologies responsibly. Chatbots such as ChatGPT have sparked controversy among educators about their potential to facilitate cheating and generate misinformation. Moreover, professionals and observers have raised critical questions about data privacy, algorithmic bias and access disparities as they relate to AI.

Teachers Worry About Cheating, Lack of Human Interaction in Classrooms

Academic dishonesty tops the list of educators' concerns about AI in education. Teachers also worry that increased use of AI may mean learners receive less human contact.

The Most Common AI Cheating Methods

Most of the teachers we surveyed have observed students using AI—particularly generative AI, which can compose essays and supply answers on demand—to cheat.

The Future of AI in Education

In response to the growing presence of AI in education, organizations like the U.S. Department of Education (ED) and UNESCO have called for a transparent, human-centered approach to the use of these technologies.

ED recommends prioritizing educators' perspectives in developing AI solutions that enhance and support teachers' traditional roles rather than attempt to replace them. Along with state entities such as the California Department of Education, UNESCO advocates for equity-focused AI in education policies aimed at narrowing technological gaps within communities and worldwide.

Leading AI companies have taken note of the education space’s unique needs and concerns regarding the responsible use of artificial intelligence and have begun to adapt their products to address these factors.

For example, in May 2024, OpenAI introduced ChatGPT Edu, a version of ChatGPT designed for higher education institutions. This iteration of the popular platform features enhanced security and privacy, does not use conversations and data to train OpenAI models, and offers education-relevant capabilities such as document summarization and the ability for students and instructors to build and share customized GPT models.

Although artificial intelligence presents novel concerns for the education sector, most teachers we surveyed reported a positive outlook on the future.

Teachers Want More Education To Understand AI and Use It Ethically

Ninety-eight percent of our survey respondents identified a need for at least some education on ethical AI usage. More than 60% recommended comprehensive education.

Educators Don’t Expect AI To Take Center Stage in Education

Nearly all of the teachers we surveyed predict that artificial intelligence will continue to impact classrooms of the future. However, most don't envision it playing a central role.

The Bottom Line

Today's education professionals are watching a technological revolution unfold in real time as AI-enabled learning platforms, educational games, chatbots, virtual tutors and organizational tools become more widespread every day. More than half of the teachers we surveyed had encountered at least some of these technologies.

As observers on the front lines, teachers are well-positioned to identify major concerns regarding the education sector’s adoption of AI-powered tools. Our survey found that respondents were the most worried about issues like cheating, loss of human interaction, job security, equity and safety.

Despite these concerns, U.S. educators seem optimistic about the potential of AI in the classroom. Acknowledging that artificial intelligence will likely play an expanding role in education, most teachers have already begun to integrate AI tools into their daily work routines.

Frequently Asked Questions (FAQs) About Artificial Intelligence in Education

How is artificial intelligence used in education?

AI-powered educational technology encompasses tools for teachers, students and administrators. Educational games, adaptive learning platforms, chatbots and intelligent tutoring systems provide individualized support for learners. Automated grading, feedback and planning programs cater to education professionals.

What are the pros and cons of AI in education?

Artificial intelligence can offer personally tailored instruction to students and help reduce administrative work for teachers. AI-powered accommodations can increase accessibility for students with disabilities and English language learners. However, students can also use AI technology to cheat, and some AI tools may provide misinformation. AI could also lead to reduced human contact in the classroom, access disparities and threats to data privacy.

How is AI shaping the future of education?

Forbes Advisor’s survey found that most U.S. teachers expect AI to continue expanding its influence in the education sphere. Nearly all respondents believe schools should teach students how to use AI ethically. As AI becomes more prevalent in education technology, teachers, institutions and government agencies should develop new strategies to ensure academic integrity, promote equitable access to AI tools and address other concerns.


With five years of experience as a writer and editor in the higher education and career development space, Ilana has a passion for creating accessible, relevant content that demystifies the higher-ed landscape for traditional and nontraditional learners alike. Prior to joining Forbes Advisor's education team, Ilana wrote and edited for websites such as BestColleges.com and AffordableCollegesOnline.org.

Artificial Intelligence and Education: A Reading List

A bibliography to help educators prepare students and themselves for a future shaped by AI—with all its opportunities and drawbacks.

How should education change to address, incorporate, or challenge today’s AI systems, especially powerful large language models? What role should educators and scholars play in shaping the future of generative AI? The release of ChatGPT in November 2022 triggered an explosion of news, opinion pieces, and social media posts addressing these questions. Yet many are not aware of the current and historical body of academic work that offers clarity, substance, and nuance to enrich the discourse.

Linking the terms “AI” and “education” invites a constellation of discussions. This selection of articles is hardly comprehensive, but it includes explanations of AI concepts and provides historical context for today’s systems. It describes a range of possible educational applications as well as adverse impacts, such as learning loss and increased inequity. Some articles touch on philosophical questions about AI in relation to learning, thinking, and human communication. Others will help educators prepare students for civic participation around concerns including information integrity, impacts on jobs, and energy consumption. Yet others outline educator and student rights in relation to AI and exhort educators to share their expertise in societal and industry discussions on the future of AI.

Nabeel Gillani, Rebecca Eynon, Catherine Chiabaut, and Kelsey Finkel, “ Unpacking the ‘Black Box’ of AI in Education ,” Educational Technology & Society 26, no. 1 (2023): 99–111.

Whether we’re aware of it or not, AI was already widespread in education before ChatGPT. Nabeel Gillani et al. describe AI applications such as learning analytics and adaptive learning systems, automated communications with students, early warning systems, and automated writing assessment. They seek to help educators develop literacy around the capacities and risks of these systems by providing an accessible introduction to machine learning and deep learning as well as rule-based AI. They present a cautious view, calling for scrutiny of bias in such systems and inequitable distribution of risks and benefits. They hope that engineers will collaborate deeply with educators on the development of such systems.

Jürgen Rudolph, Samson Tan, and Shannon Tan, “ ChatGPT: Bullshit Spewer or the End of Traditional Assessments in Higher Education? ” The Journal of Applied Learning and Teaching 6, no. 1 (January 24, 2023).

Jürgen Rudolph et al. give a practically oriented overview of ChatGPT’s implications for higher education. They explain the statistical nature of large language models as they tell the history of OpenAI and its attempts to mitigate bias and risk in the development of ChatGPT. They illustrate ways ChatGPT can be used with examples and screenshots. Their literature review shows the state of artificial intelligence in education (AIEd) as of January 2023. An extensive list of challenges and opportunities culminates in a set of recommendations that emphasizes explicit policy as well as expanding digital literacy education to include AI.

Emily M. Bender, Timnit Gebru, Angela McMillan-Major, and Shmargaret Shmitchell, “ On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜 ,” FAccT ’21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (March 2021): 610–623.

Student and faculty understanding of the risks and impacts of large language models is central to AI literacy and civic participation around AI policy. This hugely influential paper details documented and likely adverse impacts of the current data-and-resource-intensive, non-transparent mode of development of these models. Bender et al. emphasize the ways in which these costs will likely be borne disproportionately by marginalized groups. They call for transparency around the energy use and cost of these models as well as transparency around the data used to train them. They warn that models perpetuate and even amplify human biases and that the seeming coherence of these systems’ outputs can be used for malicious purposes even though it doesn’t reflect real understanding.

The authors argue that inclusive participation in development can encourage alternate development paths that are less resource intensive. They further argue that beneficial applications for marginalized groups, such as improved automatic speech recognition systems, must be accompanied by plans to mitigate harm.

Erik Brynjolfsson, “ The Turing Trap: The Promise & Peril of Human-Like Artificial Intelligence ,” Daedalus 151, no. 2 (2022): 272–87.

Erik Brynjolfsson argues that when we think of artificial intelligence as aiming to substitute for human intelligence, we miss the opportunity to focus on how it can complement and extend human capabilities. Brynjolfsson calls for policy that shifts AI development incentives away from automation toward augmentation. Automation is more likely to result in the elimination of lower-level jobs and in growing inequality. He points educators toward augmentation as a framework for thinking about AI applications that assist learning and teaching. How can we create incentives for AI to support and extend what teachers do rather than substituting for teachers? And how can we encourage students to use AI to extend their thinking and learning rather than using AI to skip learning?

Kevin Scott, “ I Do Not Think It Means What You Think It Means: Artificial Intelligence, Cognitive Work & Scale ,” Daedalus 151, no. 2 (2022): 75–84.

Brynjolfsson’s focus on AI as “augmentation” converges with Microsoft computer scientist Kevin Scott’s focus on “cognitive assistance.” Steering discussion of AI away from visions of autonomous systems with their own goals, Scott argues that near-term AI will serve to help humans with cognitive work. Scott situates this assistance in relation to evolving historical definitions of work and the way in which tools for work embody generalized knowledge about specific domains. He’s intrigued by the way deep neural networks can represent domain knowledge in new ways, as seen in the unexpected coding capabilities offered by OpenAI’s GPT-3 language model, which have enabled people with less technical knowledge to code. His article can help educators frame discussions of how students should build knowledge and what knowledge is still relevant in contexts where AI assistance is nearly ubiquitous.

Laura D. Tyson and John Zysman, “ Automation, AI & Work ,” Daedalus 151, no. 2 (2022): 256–71.

How can educators prepare students for future work environments integrated with AI and advise students on how majors and career paths may be affected by AI automation? And how can educators prepare students to participate in discussions of government policy around AI and work? Laura Tyson and John Zysman emphasize the importance of policy in determining how economic gains due to AI are distributed and how well workers weather disruptions due to AI. They observe that recent trends in automation and gig work have exacerbated inequality and reduced the supply of “good” jobs for low- and middle-income workers. They predict that AI will intensify these effects, but they point to the way collective bargaining, social insurance, and protections for gig workers have mitigated such impacts in countries like Germany. They argue that such interventions can serve as models to help frame discussions of intelligent labor policies for “an inclusive AI era.”

Todd C. Helmus, Artificial Intelligence, Deepfakes, and Disinformation: A Primer (RAND Corporation, 2022).

Educators’ considerations of academic integrity and AI text can draw on parallel discussions of authenticity and labeling of AI content in other societal contexts. Artificial intelligence has made deepfake audio, video, and images as well as generated text much more difficult to detect as such. Here, Todd Helmus considers the consequences to political systems and individuals as he offers a review of the ways in which these can and have been used to promote disinformation. He considers ways to identify deepfakes and ways to authenticate provenance of videos and images. Helmus advocates for regulatory action, tools for journalistic scrutiny, and widespread efforts to promote media literacy. As well as informing discussions of authenticity in educational contexts, this report might help us shape curricula to teach students about the risks of deepfakes and unlabeled AI.

William Hasselberger, “ Can Machines Have Common Sense? ” The New Atlantis 65 (2021): 94–109.

Students, by definition, are engaged in developing their cognitive capacities; their understanding of their own intelligence is in flux and may be influenced by their interactions with AI systems and by AI hype. In his review of The Myth of Artificial Intelligence: Why Computers Can’t Think the Way We Do by Erik J. Larson, William Hasselberger warns that in overestimating AI’s ability to mimic human intelligence we devalue the human and overlook human capacities that are integral to everyday life decision making, understanding, and reasoning. Hasselberger provides examples of both academic and everyday common-sense reasoning that continue to be out of reach for AI. He provides a historical overview of debates around the limits of artificial intelligence and its implications for our understanding of human intelligence, citing the likes of Alan Turing and Marvin Minsky as well as contemporary discussions of data-driven language models.

Gwo-Jen Hwang and Nian-Shing Chen, “ Exploring the Potential of Generative Artificial Intelligence in Education: Applications, Challenges, and Future Research Directions ,” Educational Technology & Society 26, no. 2 (2023).

Gwo-Jen Hwang and Nian-Shing Chen are enthusiastic about the potential benefits of incorporating generative AI into education. They outline a variety of roles a large language model like ChatGPT might play, from student to tutor to peer to domain expert to administrator. For example, educators might assign students to “teach” ChatGPT on a subject. Hwang and Chen provide sample ChatGPT session transcripts to illustrate their suggestions. They share prompting techniques to help educators better design AI-based teaching strategies. At the same time, they are concerned about student overreliance on generative AI. They urge educators to guide students to use it critically and to reflect on their interactions with AI. Hwang and Chen don’t touch on concerns about bias, inaccuracy, or fabrication, but they call for further research into the impact of integrating generative AI on learning outcomes.


Lauren Goodlad and Samuel Baker, “ Now the Humanities Can Disrupt ‘AI’ ,” Public Books (February 20, 2023).

Lauren Goodlad and Samuel Baker situate both academic integrity concerns and the pressures on educators to “embrace” AI in the context of market forces. They ground their discussion of AI risks in a deep technical understanding of the limits of predictive models at mimicking human intelligence. Goodlad and Baker urge educators to communicate the purpose and value of teaching with writing to help students engage with the plurality of the world and communicate with others. Beyond the classroom, they argue, educators should question tech industry narratives and participate in public discussion on regulation and the future of AI. They see higher education as resilient: academic skepticism about former waves of hype around MOOCs, for example, suggests that educators will not likely be dazzled or terrified into submission to AI. Goodlad and Baker hope we will instead take up our place as experts who should help shape the future of the role of machines in human thought and communication.

Kathryn Conrad, “ Sneak Preview: A Blueprint for an AI Bill of Rights for Education ,” Critical AI 2.1 (July 17, 2023).

How can the field of education put the needs of students and scholars first as we shape our response to AI, the way we teach about it, and the way we might incorporate it into pedagogy? Kathryn Conrad’s manifesto builds on and extends the Biden administration’s Office of Science and Technology Policy 2022 “Blueprint for an AI Bill of Rights.” Conrad argues that educators should have input into institutional policies on AI and access to professional development around AI. Instructors should be able to decide whether and how to incorporate AI into pedagogy, basing their decisions on expert recommendations and peer-reviewed research. Conrad outlines student rights around AI systems, including the right to know when AI is being used to evaluate them and the right to request alternate human evaluation. They deserve detailed instructor guidance on policies around AI use without fear of reprisals. Conrad maintains that students should be able to appeal any charges of academic misconduct involving AI, and they should be offered alternatives to any AI-based assignments that might put their creative work at risk of exposure or use without compensation. Both students’ and educators’ legal rights must be respected in any educational application of automated generative systems.



Artificial Intelligence in Education and Mental Health for a Sustainable Future: Proceedings of a Workshop—in Brief


The pandemic and overlapping global crises, including climate change, have increased attention to the importance of mental health and well-being as foundational for humans. Similarly, COVID-19 significantly exacerbated gaps in education, leaving children one to three years behind. Artificial intelligence (AI) has demonstrated potential to be transformative in addressing challenges in mental health and education and in supporting broader sustainability issues. However, there are well-founded concerns about AI regarding its potential to exacerbate inequity, further marginalizing underserved communities.

To further explore these issues, the National Academies of Sciences, Engineering, and Medicine's Roundtable on Science and Technology for Sustainability, in collaboration with the Board on Health Care Services and Board on Science Education, convened a hybrid workshop, Artificial Intelligence in Education and Mental Health for a Sustainable Future on May 30, 2024. The workshop consisted of two parts: AI in mental health and well-being and AI in education. Participants reviewed AI tools, applications, and strategies in education and mental health and the implications for sustainable development. The workshop also discussed AI's potential to accelerate progress on the Sustainable Development Goals.

  • Engineering and Technology — Applications of Technology
  • Health and Medicine — Healthcare and Quality
  • Education — Educational Technology

Suggested Citation

National Academies of Sciences, Engineering, and Medicine. 2024. Artificial Intelligence in Education and Mental Health for a Sustainable Future: Proceedings of a Workshop—in Brief. Washington, DC: The National Academies Press. https://doi.org/10.17226/27995.


  • Open access
  • Published: 16 September 2024

Artificial intelligence (AI) -integrated educational applications and college students’ creativity and academic emotions: students and teachers’ perceptions and attitudes

  • Haozhuo Lin 1 &
  • Qiu Chen 2  

BMC Psychology volume 12, Article number: 487 (2024)

Integrating Artificial Intelligence (AI) in educational applications is becoming increasingly prevalent, bringing opportunities and challenges to the learning environment. While AI applications have the potential to enhance structured learning, they may also significantly impact students’ creativity and academic emotions.

This study aims to explore the effects of AI-integrated educational applications on college students’ creativity and academic emotions from the perspectives of both students and teachers. It also assesses undergraduate students’ and faculty members’ attitudes toward AI-integrated applications.

Methodology

A mixed-methods research design was used. In the first phase, a qualitative approach was employed, with theoretical sampling used to select informants. Data were collected through in-depth interviews with undergraduate students and university lecturers to gain comprehensive insights into their experiences and perceptions. In the quantitative phase, a scale was developed, validated, and administered to 120 students and faculty members. Descriptive statistics were used to analyze the data.

Results

The study revealed that AI applications often impose rigid frameworks that constrain creative thinking and innovation, leading to emotional disengagement due to the repetitive and impersonal nature of AI interactions. Additionally, constant AI assessments heightened performance anxiety, and technical frustrations disrupted the learning process. Conversely, AI applications stimulated creativity by introducing new ideas and problem-solving techniques, enhanced engagement through interactive elements, provided personalized feedback, and supported emotional well-being with gamified elements and constant availability. Quantitative data also confirmed that teachers and students hold positive attitudes toward these applications, recognizing both their benefits and challenges.

Conclusions

AI integration in educational applications has a dual-edged impact on college students’ creativity and academic emotions. While there are notable benefits in stimulating creativity and enhancing engagement, significant challenges such as creativity constraints, emotional disengagement, and performance anxiety must be addressed. Balancing these factors requires thoughtful implementation and continuous evaluation to optimize the role of AI in education.

Peer Review reports

Introduction

Artificial Intelligence (AI) represents a subdivision of computer science that employs algorithms and machine learning techniques to emulate or mimic human intelligence [ 1 ]. AI is categorized into three types: narrow AI, general AI, and artificial superintelligence. Narrow AI, the most prevalent and developed form of AI to date, is highly goal-oriented and utilizes machine learning techniques to accomplish specific objectives or tasks, such as image and facial recognition or virtual assistants like Siri and Alexa. General AI, also known as deep AI, possesses capabilities comparable to human intelligence, including understanding the needs and emotions of other intelligent beings. In contrast, artificial superintelligence surpasses human capabilities in all respects, resembling portrayals of AI in science fiction that exceed human intelligence [ 2 ].

In the educational context, the development of AI is likely to remain within the scope of narrow AI. Current educational technologies encompass speech semantic recognition, image recognition, augmented reality/virtual reality, machine learning, brain neuroscience, quantum computing, and blockchain. These technologies are increasingly being integrated into classrooms. Many AI-based educational products are being implemented in K-12 education [ 3 ]. Research indicates that AI technology in education has been applied in at least ten areas: automatic grading systems, interval reminders, teacher feedback, virtual teachers, personalized learning, adaptive learning, augmented reality/virtual reality, precise reading, intelligent campuses, and distance learning [ 3 ]. The Artificial Intelligence in Education (AIED) community focuses on developing systems as effective as one-on-one human tutoring [ 4 ]. Significant advancements toward this goal have been made over the past 25 years. However, prioritizing the human tutor or teacher as the benchmark, AIED practices typically involve students working with computers to solve step-based problems centred on domain-specific knowledge in subjects such as science and mathematics [ 5 ]. This approach does not fully account for recent developments in educational practice and theory, including the emphasis on 21st-century competencies. The 21st-century competency approach to education highlights the importance of general skills and competencies such as creativity. Modern classrooms aim to incorporate authentic practices using real-world problems in collaborative learning environments. To remain relevant and enhance its impact, the field of AIED must adapt to these evolving educational paradigms.

The impact of AI applications on various aspects of education has garnered significant attention in recent years. While research has delved into its effects on different variables, one area deserving deeper exploration is its influence on students’ creativity [ 6 , 7 , 8 , 9 , 10 , 11 , 12 , 13 , 14 , 15 ]. Creativity is a multifaceted construct crucial for problem-solving, innovation, and adaptability in an ever-evolving society. Traditional educational paradigms often struggle to fully nurture and assess creativity due to their structured nature and emphasis on standardized assessments. However, AI-integrated educational applications possess the potential to revolutionize this landscape [ 6 , 7 , 8 , 9 , 10 , 11 , 12 , 13 , 14 , 15 ].

AI applications can provide personalized learning experiences tailored to students’ unique cognitive profiles, preferences, and learning styles. By offering adaptive feedback, generating diverse learning materials, and facilitating interactive learning environments, AI can foster a conducive atmosphere for creativity to flourish. Through algorithms that analyze students’ performance, identify patterns, and suggest novel approaches, AI empowers learners to explore unconventional solutions, think critically, and engage in creative problem-solving processes [ 16 , 17 , 18 , 19 , 20 , 21 , 22 ].

Moreover, AI technologies can facilitate collaborative and interdisciplinary learning experiences, exposing students to diverse perspectives, ideas, and methodologies. Virtual reality simulations, augmented reality tools, and intelligent tutoring systems can immerse students in interactive learning environments where they can experiment, innovate, and co-create content. By transcending the constraints of physical classrooms and textbooks, AI-enabled platforms offer limitless possibilities for creative expression and exploration [ 23 , 24 , 25 , 26 , 27 , 28 , 29 , 30 ].

Furthermore, AI’s ability to curate and recommend relevant resources from vast repositories of educational content enhances students’ exposure to diverse sources of inspiration and knowledge. By leveraging natural language processing algorithms, sentiment analysis, and recommendation systems, AI can identify content aligned with students’ interests, passions, and learning objectives, nurturing intrinsic motivation and curiosity-driven exploration [ 31 , 32 , 33 ]. In addition to creativity, another crucial aspect of the educational experience that AI-integrated applications may influence is academic emotions. These are the emotions experienced by students and educators in educational settings. These emotions are directly linked to academic activities like learning, teaching, studying, and taking exams. They can be positive (e.g., enjoyment, pride, and hope) or negative (e.g., anxiety, frustration, and boredom) and significantly impact motivation, learning strategies, cognitive resources, and academic performance [ 34 ]. Academic emotions encompass a spectrum of affective states, including motivation, engagement, anxiety, boredom, and satisfaction, significantly impacting students’ learning outcomes, perseverance, and overall well-being. Traditional educational approaches often overlook the complex interplay between cognitive processes and emotional experiences, resulting in suboptimal learning environments and outcomes [ 1 , 2 , 3 , 4 , 5 , 35 ].

However, AI technologies offer unprecedented opportunities to monitor, analyze, and respond to students’ academic emotions in real time [ 4 ]. By employing affective computing techniques, sentiment analysis algorithms, and facial recognition technology, AI can detect subtle cues indicative of students’ emotional states and adjust learning experiences accordingly [ 1 ]. For instance, adaptive tutoring systems can dynamically adjust the difficulty level of tasks, provide scaffolding support, or offer motivational prompts based on students’ emotional responses and performance metrics [ 5 ]. Moreover, AI-integrated learning platforms can incorporate gamification elements, immersive storytelling, and personalized avatars to enhance students’ emotional engagement and investment in learning activities [ 4 ]. By fostering a supportive and inclusive learning environment that acknowledges and addresses students’ diverse emotional needs, AI can promote positive academic emotions, such as curiosity, excitement, and self-efficacy, while mitigating negative ones, such as frustration, anxiety, and disengagement.
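
As a toy illustration of that control loop, the sketch below infers a coarse affective state from a student’s message and adjusts the next task accordingly. Real affective-computing systems use trained classifiers over much richer signals; the keyword lists and responses here are invented.

```python
# Toy illustration of affect-aware adaptation. Real systems use trained
# classifiers over richer signals, but the control loop looks similar.
# Keyword lists and responses are invented for this sketch.
FRUSTRATION_MARKERS = {"stuck", "confused", "hate", "give up", "impossible"}
ENGAGEMENT_MARKERS = {"cool", "interesting", "fun", "got it"}


def detect_affect(message: str) -> str:
    words = message.lower()
    if any(marker in words for marker in FRUSTRATION_MARKERS):
        return "frustrated"
    if any(marker in words for marker in ENGAGEMENT_MARKERS):
        return "engaged"
    return "neutral"


def next_step(message: str, difficulty: int) -> tuple[int, str]:
    """Adjust task difficulty and choose a prompt based on detected affect."""
    affect = detect_affect(message)
    if affect == "frustrated":
        return max(1, difficulty - 1), "Let's try a simpler example together."
    if affect == "engaged":
        return difficulty + 1, "Great! Here's a harder challenge."
    return difficulty, "Keep going, you're doing fine."


print(next_step("I'm stuck and confused by this proof", difficulty=3))
```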

Furthermore, AI-driven analytics and data visualization tools empower educators to gain deeper insights into students’ emotional trajectories, identify at-risk individuals, and implement timely interventions. By harnessing predictive analytics and machine learning algorithms, educators can anticipate students’ emotional responses to various instructional strategies, foresee potential challenges, and proactively implement personalized interventions to foster resilience, motivation, and emotional well-being. In line with this gap in the literature, the following research questions were raised:

How do teachers and students perceive the challenges of using AI applications in relation to students’ creativity and academic emotions?

How do teachers and students perceive the merits of using AI applications in relation to students’ creativity and academic emotions?

What are teachers’ and students’ attitudes towards AI-integrated educational applications?

Artificial intelligence and higher education

21st-century higher education is rapidly changing due to globalization, technological advancements, and shifting student demographics [ 16 ]. Online learning platforms have become widely accessible, enabling universities to offer fully online courses and degree programs, expanding access to education and providing flexibility in learning [ 17 ]. The growing diversity of the educational field, with students from various backgrounds, highlights the significance of global citizenship and intercultural understanding. Universities are playing a significant role in promoting innovation and research as technological advancement picks up speed [ 18 ], encouraging industry-academia cooperation and placing a focus on commercialization and entrepreneurship. The emphasis is shifting toward skills-based learning that builds practical, career-focused competencies, as evidenced by recent recruitment trends favouring graduates with specific skill sets [ 19 ].

To enhance the quality of higher education, the sector is exploring various strategies to meet stakeholders’ requirements [ 20 ]. The integration of artificial intelligence (AI) is one particularly promising solution [ 21 ]. As technology advances, AI in education has enormous potential to change the teaching and learning environment [ 22 ], and it is already improving the quality of higher education in several ways [ 23 ]. AI-powered learning strategies evaluate students’ performance, pinpoint their strengths and weaknesses, and offer individualized learning experiences. This approach helps students acquire knowledge and produce more valuable outcomes in real-world settings [ 24 ].

Chatbots, virtual assistants, and adaptive learning systems are examples of AI-based technologies that provide immersive and engaging learning environments while actively enabling students to investigate complicated ideas [ 25 ]. AI supports assessment and feedback by assisting with assignment grading, tracking student participation, giving quicker and more accurate feedback, and freeing up teachers’ time for other teaching responsibilities [ 26 ]. AI chatbots provide quick, individualized support by evaluating student data to identify individuals who may be at risk, enabling early interventions for academic success. Various AI applications and platforms, including Bit.AI, Mendeley, Turnitin, Elinik.io, and Coursera tools, support higher education research by analyzing large datasets, generating insights, and identifying patterns that are challenging for human researchers to detect [ 27 ]. As technology continues to advance, we can expect even more cutting-edge AI applications in education, giving students individualized, engaging, and productive learning experiences [ 28 ].

The development of AI markedly improves both the effectiveness and engagement of instructors in postsecondary education. Adopting AI helps educators free up time for more meaningful activities by automating administrative duties like tracking attendance and grading assignments [ 29 ]. Additionally, AI helps educators pinpoint areas in which they can grow by offering individualized opportunities for professional development [ 30 ]. Solutions are also needed for enduring problems in modern higher education, such as limited inclusivity and unequal access [ 31 ]. Traditional teaching methods often fail to engage students with diverse learning preferences, hindering active participation and critical thinking skills [ 32 ]. The inability of conventional assessment techniques to capture deep understanding further motivates the use of AI. AI algorithms analyze individual learning patterns, tailor coursework, and predict at-risk students, enabling timely interventions [ 33 ]. Content delivery is revolutionized by AI-driven systems that adjust to students’ learning styles, pace, and knowledge gaps.

In conclusion, adopting AI in higher education empowers the system by addressing challenges and enhancing the quality of education. Ongoing research aims to understand faculty members’ awareness of AI’s applicability and its impact on learning experiences, work engagement, and productivity in higher education. Such research provides insights for institutional policymakers to facilitate the adoption of new technologies and overcome specific challenges. Despite the increasing integration of technology and artificial intelligence (AI) in education, there is a notable gap in understanding how AI-empowered educational apps specifically influence undergraduate students’ academic emotions and test anxiety. While various studies have explored the general impact of technology on education and student emotions, there is a need for focused research on the unique effects of AI-powered educational apps. Understanding the dynamics between these technologies and students’ emotional experiences can provide valuable insights into the efficacy of AI applications in promoting positive emotions and reducing test anxiety.

AI applications and students’ creativity

Students should be aware of AI’s potential to bolster their creativity and learning processes. Modern educational methodologies prioritize problem-solving approaches, underscoring the significance of nurturing children’s creative thinking abilities. However, extensive research corroborates a decline in creativity among younger individuals across various disciplines [ 6 , 7 ]. One explanation attributes this decline to the overly structured nature of school curricula and a shortage of play-based learning activities within educational frameworks [ 8 ].

Emerging research indicates how AI can enhance skills commonly associated with creativity, such as curiosity [ 9 ], perseverance, and attentiveness [ 10 ]. The potential of AI to support creativity is also under investigation. Kafai and Burke assert that AI in education aims to foster problem-solving and creativity skills through collaborative interactions with AI systems rather than solely focusing on knowledge acquisition within specific domains [ 11 ]. They suggest that AI can facilitate the unfolding of creativity, thus being intertwined with the creative process. Additionally, Ryu and Han examine Korean educators’ perceptions of AI in education, noting that experienced teachers acknowledge AI’s potential to enhance creativity [ 12 ]. Hence, AI in education could address concerns related to the decline of creativity, particularly by emphasizing the creative process. This may aid in enhancing students’ creative thinking abilities and comfort level with utilizing AI, thereby adequately preparing them for the contemporary workforce [ 13 , 14 , 15 ].

To effectively merge AI and creativity, it is imperative to gain a deeper understanding of how students perceive the relationship between these concepts. Situating AI within prevailing creativity theories, such as the 4 C model of creativity, can further enrich this understanding.

Creativity and AI in an educational setting can be analyzed through the lens of the 4 C model [ 8 ]. Mini-c, or ‘personal creativity,’ encapsulates creativity’s subjective and developmental facets; it pertains to individualized creative discoveries that may not be recognized as such by others. For instance, a slight variation on a well-known recipe could exemplify mini-c creativity. Little-c, also known as ‘everyday creativity,’ refers to creative outputs acknowledged by others, like inventing a new recipe. Pro-c, or ‘professional creativity,’ involves becoming an expert in a particular field or discipline, akin to the chef Gordon Ramsay. Big-C, or ‘legendary creativity,’ epitomizes eminent creativity that leaves a lasting legacy, as seen in figures like Auguste Escoffier, who revolutionized the culinary landscape [ 15 ].

AI can support creativity at the pro-c and potentially Big-C levels by extending expertise in specific domains. However, its role in fostering mini-c and little-c contributions is less apparent, as the focus in these levels lies more on the process of self-discovery than on the creative output itself. Therefore, it is crucial to understand when and where AI is most beneficial, particularly in delineating the narrow domains where AI is most apt for educational purposes and how it can encourage mini-c and little-c contributions. This study aims to explore students’ perceptions of AI and creativity and the interplay between the two.

Studies on academic emotions

Lei and Cui [ 36 ] defined academic emotions as students’ emotional experiences related to the academic processes of teaching and learning, including enjoyment, hopelessness, boredom, anxiety, anger, and pride. Based on arousal and enjoyment, academic emotions have been divided into four categories: positive high-arousal, positive low-arousal, negative high-arousal, and negative low-arousal [ 37 ]. It is also argued that achievement emotions include prospective emotions, such as fear of failure, and retrospective emotions, e.g., shame, which learners experience after they receive feedback on their achievements.

Academic accomplishment serves as a commonly employed criterion for evaluating the effectiveness of educational systems, teachers, and schools, and the success or failure of students. Consequently, scholars in this field have conducted empirical investigations to explore the causal link between students’ academic emotions and academic achievement, as evidenced by a body of practical studies [ 38 ]. However, the findings from these studies have not been entirely consistent. In general, positive emotions are expected to forecast favorable outcomes in academic settings, including high grades and commendable performance in both local and large-scale educational assessments [ 39 , 40 ]. Conversely, negative emotions are hypothesized to correlate with adverse consequences, such as lower grades and compromised performance in classroom activities and standardized examinations [ 41 ].

The meta-analysis undertaken by Lei and Cui [ 36 ] drew on studies using the Chinese version of the Academic Emotions Questionnaire, which was employed to evaluate the academic emotions of adolescents. Academic emotions have been linked to various variables, including cognitive activity, learning motivation, and learning strategies. The meta-analysis revealed positive correlations between positive high-arousal and positive low-arousal emotions and academic achievement, and negative correlations between negative high-arousal and negative low-arousal emotions and academic achievement. The study suggested that factors such as students’ age, regional location, and gender could moderate these effects on academic achievement.

Similar patterns of positive correlations for positive emotions, negative correlations for negative emotions, and moderation by students’ age, regional location, and gender have been reported elsewhere [ 42 ].

Currently, scholars both domestically and internationally are directing their attention towards analyzing the academic emotions of distance learners, resulting in noteworthy research outcomes [ 43 ]. Research conducted by Thelwall et al. [ 44 ] delved into the impact of screen time on emotion regulation and student performance. The study followed over 400 children for four years, examining their usage of smartphones and tablets, and analyzed the correlation between these behaviours, emotions, and academic performance while concurrently evaluating students’ abilities and educational achievements. Similarly, [ 45 ] investigated the influence of early childhood emotions on academic preparation and social-emotional issues. Emotion regulation, identified as the process of managing emotional arousal and expression, plays a crucial role in determining children’s adaptation to the school environment.

Building on the perspectives of the previously mentioned scholars, Sakulwichitsintu [ 46 ] integrated connectionist learning theory to devise an innovative distance education model. This model introduced educational content aligned with emotional education objectives and implemented the Mu class teaching mode, establishing a distance learning community and humanized network courses to address emotional shortcomings in the distance education process. To ensure effectiveness, Pekrun et al. [ 35 ] developed a hybrid-reality virtual intelligent classroom system incorporating television broadcasting and interactive space technology to create a networked teaching environment. Teachers utilized diverse techniques, including video, audio, and text, to foster engagement and enhance communication between educators and students during the network teaching phase.

In addition to the earlier scholars, Fang et al. [ 47 ] introduced an emotion recognition algorithm based on scale-invariant feature transform (SIFT) of facial expressions. This algorithm captures the facial expressions of distance learners, employing SIFT feature extraction and expression recognition to address emotional gaps in the learning phase of distance education. Simultaneously, Méndez López [ 48 ] developed a learner emotion prediction model for an intelligent learning environment utilizing a fuzzy cognitive map. This model facilitated the extraction and prediction of distance learners’ emotional states, allowing real-time adjustments to the teaching approach based on predicted emotions. Huang and Bo [ 49 ] contributed to the field by introducing a distance learner emotion self-assessment scale, defining essential emotion variables, and establishing an early warning model.

Drawing inspiration from the valuable contributions of the scholars mentioned earlier, Zembylas [ 50 ] examined the online academic emotions experienced by adults. This investigation involved the analysis of diverse influencing factors and the exploration of an environmental factor model within the online learning community, specifically focusing on academic emotional tendencies. Building upon the insights derived from these scholars, our objective is to delve into the academic emotions of distance learners. We plan to achieve this through the analysis of online learning behaviour data, with the anticipation of uncovering meaningful findings in this domain.

This study used a mixed-methods (qualitative and quantitative) research design. The following sections describe each phase.

Qualitative method

Sampling and design

This study employs a qualitative research design to explore the impact of AI-integrated educational applications on undergraduate students’ creativity and academic emotions from the perspectives of both students and university faculty. The research was conducted at Wenzhou University. The informants were selected using theoretical sampling, a technique where participants are chosen based on their potential to contribute to the development of emerging theories, ensuring that the sample is rich in information pertinent to the research questions. A total of 23 participants were included in the study, comprising 15 students and eight teachers. The decision to interview these specific numbers was driven by the principle of data saturation, which refers to the point at which no new information or themes are observed in the data. Data saturation was achieved after interviewing the 15th student and the 8th teacher, indicating that the sample size was sufficient to capture the full range of perspectives necessary for the research. The criterion for including participants in the study was their familiarity with the components of AI-integrated educational applications. These components include Adaptive Learning Systems, Intelligent Tutoring Systems (ITS), Natural Language Processing (NLP) applications, AI-enhanced collaborative Learning Platforms, and Predictive Analytics.

To evaluate the impact of AI-integrated educational applications on students’ creativity and academic emotions, we focused on several key components of AI applied to educational processes. These components include Adaptive Learning Systems, which personalize learning experiences by adjusting content and pace based on individual student performance and preferences, enhancing creativity through personalized challenges and immediate feedback. Intelligent Tutoring Systems (ITS) offer personalized tutoring and feedback, fostering creative problem-solving skills and reducing negative emotions such as anxiety and frustration. Natural Language Processing (NLP) applications facilitate interaction between computers and humans using natural language, enhancing creativity through brainstorming sessions and interactive writing assistance while monitoring changes in academic emotions. AI-enhanced collaborative Learning Platforms support and enhance collaborative learning experiences with features like intelligent grouping, real-time feedback, and automated moderation, impacting group creativity and collective emotional states. Predictive Analytics analyze data to predict student performance, engagement, and emotional states, informing instructional decisions and personalized interventions to enhance creativity and mitigate negative academic emotions.
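As a purely illustrative sketch of the Predictive Analytics component listed above, the snippet below trains a small logistic-regression classifier to flag potentially at-risk students from engagement features. The feature set, toy data, and decision threshold are invented for the example and are not drawn from the applications or data examined in this study.

```python
# At-risk prediction sketch with fabricated engagement data (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: weekly logins, average quiz score, forum posts per week.
X = np.array([
    [9, 0.85, 4],
    [2, 0.40, 0],
    [7, 0.75, 2],
    [1, 0.35, 1],
    [8, 0.90, 5],
    [3, 0.50, 0],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = later identified as at risk

model = LogisticRegression().fit(X, y)

new_student = np.array([[4, 0.55, 1]])
risk = model.predict_proba(new_student)[0, 1]
threshold = 0.5  # chosen for illustration; in practice tuned and reviewed by teachers
print("flag for early intervention" if risk > threshold else "no flag", f"(risk = {risk:.2f})")
```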

Data collection

Data collection was carried out through semi-structured interviews, a method well-suited to qualitative research. This method allows for in-depth exploration of participants’ experiences and perceptions while providing some level of structure to ensure that all relevant topics are covered. The semi-structured format includes predefined questions but also allows for flexibility in probing deeper into interesting or unexpected responses.

Interviews were conducted in a quiet and comfortable setting within the university premises to ensure participants felt at ease, thereby facilitating open and honest communication. Each interview lasted approximately 45 to 60 min. For the student participants, the interview questions focused on their experiences using AI-integrated educational applications, perceived impacts on their creativity, and any changes in their academic emotions (e.g., motivation, anxiety, enjoyment). Teacher participants were asked about their observations of students’ engagement and creativity, as well as their own experiences and attitudes towards integrating AI applications in their teaching practices.

Before the interviews, informed consent was obtained from all participants, ensuring they were aware of the study’s purpose, their rights to confidentiality, and their freedom to withdraw from the study at any point without any repercussions. The interviews were audio-recorded with participants’ permission to ensure accurate data capture and were later transcribed verbatim for analysis.

Data analysis

The data analysis process began with the transcription of the audio-recorded interviews, followed by a thorough reading of the transcripts to gain an initial understanding of the data. Thematic analysis was employed to identify, analyze, and report patterns (themes) within the data. This method is particularly effective in qualitative research as it provides a systematic approach to handling large volumes of text and can reveal complex patterns in participants’ narratives.

The thematic analysis was conducted in several steps. First, open coding was performed, where the transcripts were examined line-by-line, and initial codes were generated to capture significant statements and ideas. These codes were then grouped into broader categories based on similarities and relationships. For instance, codes related to students’ enhanced engagement and creativity when using AI applications were grouped under a category labelled “positive impacts on creativity.” Next, the categories were reviewed and refined into overarching themes. This involved constant comparison within and between the data to ensure the themes accurately represented the participants’ perspectives. Themes were then defined and named, providing a clear and concise description of each theme’s essence. Open themes were then classified into two main categories: Challenges and Merits of AI-integrated applications.
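The grouping step described above was carried out manually by the researchers; the short sketch below merely illustrates its structure, showing how open codes, once mapped to the two selective categories, can be tallied. The code labels are hypothetical stand-ins for the study’s actual codes.

```python
# Tallying open codes by selective category (illustrative stand-in labels).
from collections import Counter

code_to_category = {
    "rigid response format": "Challenges",
    "missing human touch": "Challenges",
    "assessment pressure": "Challenges",
    "new ideas suggested by AI": "Merits",
    "tailored feedback": "Merits",
    "always-available help": "Merits",
}

# Codes assigned to interview segments during open coding (invented).
coded_segments = [
    "rigid response format", "tailored feedback", "missing human touch",
    "new ideas suggested by AI", "tailored feedback", "assessment pressure",
]

category_counts = Counter(code_to_category[c] for c in coded_segments)
print(category_counts)  # e.g. Counter({'Challenges': 3, 'Merits': 3})
```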

Research quality

To ensure research quality, several rigorous steps were undertaken. The transcription of audio-recorded interviews was done verbatim to preserve the original meaning and nuances, maintaining data integrity. Researchers immersed themselves in the data by reading the transcripts multiple times, allowing for a deep understanding. Thematic analysis was systematically employed to identify, analyze, and report patterns, facilitating the uncovering of complex themes. Open coding involved line-by-line examination and initial coding to capture significant statements and ideas, ensuring comprehensive data consideration. Codes were then grouped into broader categories, organizing data meaningfully and aiding in the identification of overarching themes.

Peer debriefing sessions with colleagues provided external validation, enhancing credibility by identifying potential biases and ensuring balanced interpretations. Triangulation was used to confirm consistency and validity by comparing data from multiple sources, reinforcing the reliability of the themes. Detailed documentation of the analytical process ensured transparency and created an audit trail, allowing verification of the research steps and findings. Finally, researchers engaged in reflexivity, continuously reflecting on potential biases to ensure objectivity and accurate representation of participants’ voices, further contributing to the study’s reliability.

Quantitative method

The quantitative phase explored teachers’ and students’ attitudes towards AI applications in education. The sample consisted of 120 undergraduate students and 30 teachers. Participants were selected using a convenience sampling method, ensuring a diverse representation of experiences and perspectives within the educational environment.

Participants were asked to complete a survey that included statements related to the perceived challenges and benefits of AI applications in education. The survey featured a series of Likert-scale items on which respondents indicated their level of agreement with each statement on a scale of 1 to 5, where 1 represented “Strongly Disagree,” 2 “Disagree,” 3 “Neutral,” 4 “Agree,” and 5 “Strongly Agree.” Construct validity was estimated using exploratory factor analysis, which reduced the items to two factors: challenges and merits. All items had factor loadings above 0.70, indicating acceptable construct validity. The reliability of the scale was estimated using Cronbach’s alpha. The internal consistency coefficients of the two factors were 0.85 and 0.89, respectively, and the reliability of the total scale was 0.90, which verifies the reliability of the scale (see Appendix).
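For readers who wish to reproduce this kind of reliability check, the snippet below shows a minimal computation of Cronbach’s alpha from a respondents-by-items matrix of 1-to-5 ratings. The small response matrix is fabricated purely for illustration and does not reproduce the study’s survey data.

```python
# Cronbach's alpha from a respondents-by-items matrix (fabricated data).
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = scale items."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```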

The survey was divided into two sections: Constraints of AI Applications and Merits of AI Applications. The Constraints section included statements about creativity constraints, emotional disengagement, performance anxiety, technical frustration, over-reliance on AI, the digital divide, and ethical concerns. The Merits section included statements about stimulated creativity, increased engagement, personalized feedback, emotional support, collaborative creativity, accessible learning resources, and enhanced academic emotions.

Data were collected through an online survey platform, ensuring anonymity and confidentiality for all respondents. Descriptive statistics, specifically percentages, were used to summarize the responses. The percentage of respondents in each agreement category (Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree) was calculated for each statement. The results were then tabulated separately for teachers and students to identify any significant differences or similarities in their perceptions. This approach provided a clear overview of the collective attitudes of both groups towards AI applications in education, facilitating a detailed comparative analysis. Finally, the findings were interpreted to understand the broader implications of these attitudes on the integration of AI in educational settings. This comprehensive methodology ensured that the study captured a wide range of perspectives, providing valuable insights into how AI is perceived in the context of teaching and learning.
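A minimal sketch of this tabulation step is shown below: numeric responses are converted to agreement labels and the percentage of respondents in each category is computed separately for teachers and students. The handful of responses is invented for illustration; the study’s actual figures are summarized in Tables 1 and 2.

```python
# Percentage of respondents per agreement category, by group (fabricated data).
import pandas as pd

labels = {1: "Strongly Disagree", 2: "Disagree", 3: "Neutral", 4: "Agree", 5: "Strongly Agree"}

data = pd.DataFrame({
    "group":    ["teacher"] * 5 + ["student"] * 7,
    "response": [4, 5, 3, 4, 2,   4, 4, 5, 3, 2, 4, 1],
})
data["label"] = data["response"].map(labels)

# Row-wise percentages: each group's responses sum to 100%.
table = (pd.crosstab(data["group"], data["label"], normalize="index") * 100).round(1)
print(table)
```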

Qualitative findings

The interviews with participants were analyzed, resulting in two selective codes: Challenges and Merits. Each code consists of seven main themes related to students’ creativity and academic emotions. Below, each theme is explained in detail and followed by quotations from both students and teachers to exemplify these findings.

Challenges of AI-applications

Interviews with the informants were thematically analyzed, and different themes were extracted. The interviews highlighted challenges of AI applications in education, including creativity constraints, emotional disengagement, performance anxiety, technical frustration, over-reliance on AI, the digital divide, and ethical concerns. These issues affect students’ creativity, engagement, stress levels, and equitable access to technology. Each sub-theme is explained as follows.

Creativity constraints

The first challenge identified was creativity constraints. Participants noted that some AI applications impose rigid frameworks and lack the flexibility needed to foster creative thinking. These constraints can hinder students’ ability to think outside the box and explore innovative solutions. The following quotations exemplify this finding:

Student 1: “Sometimes the AI applications don’t allow much room for creativity because they follow a strict format.” Teacher 2: “I’ve noticed that some students feel boxed in by the structure imposed by the AI, hindering their creative expression.”

Emotional disengagement

Another challenge was emotional disengagement. The repetitive nature of AI interactions and the absence of a human touch were found to diminish emotional connection and motivation among students. This lack of engagement can detract from the overall learning experience. The following quotations exemplify this finding:

Student 10: “Interacting with AI can get monotonous, and I miss the personal interaction with my teachers.” Teacher 8: “There’s a certain emotional warmth in human interactions that AI can’t replicate, which some students really miss.”

Performance anxiety

Performance anxiety was a significant challenge, with students experiencing heightened stress due to constant monitoring and frequent AI assessments. This pressure can make students more fearful of making mistakes, impacting their academic emotions negatively. The following quotations exemplify this finding:

Student: “The AI assessments are so frequent that I constantly feel pressured to perform well, which makes me anxious.” Teacher: “I’ve observed that some students become overly anxious about their performance because they know the AI is always evaluating them.”

Technical frustration

Technical frustration was a common issue, with frequent glitches and difficult-to-navigate interfaces disrupting the learning process and causing frustration among students. This negatively impacted their creativity and emotional state. The following quotations exemplify this finding:

Student 8: “When the app glitches, it disrupts my workflow and frustrates me, killing my creative vibe.” Teacher 6: “Technical problems often leave students frustrated, which can stifle their creativity and motivation.”

Over-reliance on AI

Over-reliance on AI applications was another challenge, leading to reduced critical thinking and self-initiative among students. This dependency can hinder the development of essential problem-solving skills. The following quotations exemplify this finding:

Student 11: “I sometimes rely too much on the AI for answers instead of trying to figure things out myself.” Teacher 9: “There’s a danger that students may become too dependent on AI, which can hinder their ability to think critically and independently.”

Digital divide

The digital divide posed a significant challenge, with inequitable access to technology and varying levels of technological literacy affecting students’ ability to engage fully and creatively. This disparity can exacerbate existing educational inequalities. The following quotations exemplify this finding:

Student 12: “Not everyone has the same access to the necessary technology, which can be limiting for those who don’t.” Teacher 4: “Students with limited tech skills or access are at a disadvantage, impacting their ability to participate fully and creatively.”

Ethical concerns

Participants raised ethical concerns about biases in AI algorithms and the ethical use of AI in education. These concerns are related to fairness and equity in academic evaluations and the potential for AI to perpetuate existing biases. The following quotations exemplify this finding:

Student: “I’m concerned that the AI might have biases that affect how it evaluates my work.” Teacher: “There are significant ethical questions about how AI is used and whether it treats all students fairly, which can impact their academic emotions and creativity.”

Teachers’ and students’ perceptions of the merits of AI-applications

Teachers and students believe that AI-integrated educational applications stimulate creativity, increase engagement, provide personalized feedback, offer emotional support, facilitate collaborative creativity, and make learning resources more accessible. These benefits enhance students’ academic emotions and foster innovative approaches to learning, as illustrated by student and teacher testimonials. Each of these themes is explained and exemplified in detail as follows.

Stimulated creativity

On the positive side, AI applications were found to stimulate creativity by presenting new ideas and enhancing problem-solving skills. This allowed students to explore innovative approaches to learning. The following quotations exemplify this finding:

Student 6: “The AI applications introduce me to new ideas that I wouldn’t have thought of on my own, boosting my creativity.” Teacher 8: “I’ve seen students come up with innovative solutions and creative projects thanks to the AI applications.”

Increased engagement

Increased engagement was another significant benefit, with the interactive nature of AI applications making learning more enjoyable and keeping students motivated. This positive engagement enhanced both creativity and academic emotions. The following quotations exemplify this finding:

Student 9: “The interactive features make learning more enjoyable and keep me engaged.” Teacher 5: “Students are more engaged and seem to enjoy the learning process more when using AI applications.”

Personalized feedback

Personalized feedback provided by AI applications offered tailored guidance and immediate responses, helping students improve their work and boosting their confidence. This customised approach supported their creative and emotional development. The following quotations exemplify this finding:

Student 5: “The AI gives me personalized feedback that really helps me understand where I can improve.” Teacher 3: “The immediate, tailored feedback from AI applications helps students feel more confident and supported in their learning.”

Emotional support

AI applications also provided emotional support by reducing anxiety through their constant availability and increasing motivation with gamified elements and positive reinforcement. This support helped maintain a positive emotional state conducive to learning. The following quotations exemplify this finding:

Student 9: “The AI apps reduce my anxiety by being available whenever I need help, and the gamified elements keep me motivated.” Teacher 6: “Students seem less anxious and more motivated when they use AI applications that provide continuous support and positive feedback.”

Collaborative creativity

Collaborative creativity was facilitated by AI, which supported group projects and peer interactions, fostering a sense of community and collective problem-solving. This collaborative environment enhanced creative outcomes. The following quotations exemplify this finding:

Student 13: “AI applications make group projects easier and more creative by allowing us to collaborate effectively.” Teacher 9: “The AI applications encourage peer interaction and collaboration, leading to more creative and well-rounded projects.”

Accessible learning resources

The accessibility of a wide range of learning resources through AI applications supported continuous learning and inspired creativity. Students could explore diverse materials anytime, enhancing their educational experience. The following quotations exemplify this finding:

Student 8: “Having access to a wide range of resources anytime I need them inspires me to be more creative in my studies.” Teacher 7: “The vast array of resources available through AI applications encourages students to explore topics more deeply and creatively.”

Enhanced academic emotions

Finally, AI applications enhanced academic emotions by creating positive learning experiences and building emotional resilience through adaptive learning paths and supportive environments. This improvement in emotional well-being positively influenced students’ academic performance. The following quotations exemplify this finding:

Student 4: “The AI apps make learning a more positive experience, which helps me stay emotionally resilient.” Teacher 5: “I’ve seen students develop greater emotional resilience and have more positive learning experiences with the support of AI applications.”

These findings illustrate a nuanced view of AI-integrated educational applications, highlighting both the challenges and benefits in terms of students’ creativity and academic emotions. While there are significant obstacles to overcome, the potential for enhancing creativity and emotional well-being is substantial.

Quantitative findings

To present teachers’ and students’ attitudes towards AI applications in education, we used descriptive statistics to summarize their responses to the statements provided. Tables  1 and 2 include the percentage of respondents in each category of agreement (Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree) for teachers and students, respectively.

Both groups were concerned about AI applications imposing rigid frameworks that could hinder creative thinking, with 25% of both teachers and students agreeing and 15% strongly agreeing. A similar percentage disagreed, with 20% of teachers and 25% of students, while 10% of teachers and 15% of students strongly disagreed. Teachers were more neutral, with 30% compared to 20% of students.

Emotional disengagement due to AI was also a concern, with 35% of both teachers and students agreeing that AI interactions lack a personal touch. An additional 20% of teachers and 15% of students strongly agreed. Neutral responses were common, with 25% of teachers and 20% of students, while fewer disagreed (15% of teachers and 20% of students) or strongly disagreed (5% of teachers and 10% of students).

Performance anxiety caused by frequent AI assessments was another shared concern, with 25% of teachers and 20% of students agreeing and 15% of teachers and 20% of students strongly agreeing. Neutral responses were common, with 20% of teachers and 15% of students, while 25% of both teachers and students disagreed and 15% of teachers and 20% of students strongly disagreed.

Both teachers and students expressed concern over technical issues in AI applications that could disrupt the learning process. A quarter (25%) of each group agreed with this sentiment, while 15% strongly agreed. Neutral responses were quite common, with 30% of teachers and 25% of students expressing no strong opinion. A smaller proportion of participants disagreed (20% of both groups) or strongly disagreed (10% of teachers and 15% of students). There was also a shared recognition among both groups about the potential drawbacks of excessive reliance on AI, as 35% of teachers and 30% of students agreed that AI could diminish critical thinking and self-initiative, with 20% of teachers and 15% of students strongly agreeing. Neutral responses were frequent (25% for both groups), while a minority disagreed (15% of teachers and 20% of students) or strongly disagreed (5% of teachers and 10% of students).

Both groups similarly acknowledged the impact of the digital divide, with 30% of teachers and 25% of students agreeing and 20% of both groups strongly agreeing. Neutral responses were common (20% for both groups), while a smaller number disagreed (20% of teachers and 15% of students) or strongly disagreed (10% of teachers and 15% of students). Ethical concerns regarding biases in AI algorithms were also similarly perceived. Agreement was noted among 30% of teachers and students, with 15% strongly agreeing. Neutral responses were fairly common (25% of teachers and 30% of students), and fewer respondents disagreed (20% of teachers and 15% of students) or strongly disagreed (10% from each group).

Both teachers and students had a favourable view of AI’s capacity to enhance problem-solving skills and creativity. 40% of both groups agreed with this perspective, and a notable number strongly agreed (25% of teachers and 20% of students). Neutral responses were less frequent (20% of teachers and 25% of students), while disagreement was relatively uncommon (10% from each group), as was strong disagreement (5% from each group). Furthermore, both groups acknowledged that AI could increase the enjoyment of learning, with 30% of teachers and 35% of students agreeing and 20% from each group strongly agreeing. Neutral responses were moderate (25% of teachers and 20% of students), while fewer participants disagreed (15% from both groups) or strongly disagreed (10% from each group).

The benefits of AI in providing personalized feedback were highly recognized, with 35% of teachers and students agreeing and a substantial proportion strongly agreeing (30% of teachers and 35% of students). Neutral responses were moderate (20% of teachers and 15% of students), while fewer respondents disagreed (10% from each group) or strongly disagreed (5% from each group). AI’s role in reducing anxiety through constant availability was similarly viewed, with 25% of teachers and 30% of students agreeing and 15% from each group strongly agreeing. Neutral responses were moderate (25% from both groups), with some disagreement (20% of teachers and 15% of students) and strong disagreement (15% of teachers and 10% of students).

Both groups positively perceived AI’s facilitation of group projects, with 35% of teachers and students agreeing and 25% from each group strongly agreeing. Neutral responses were common (25% of teachers and 20% of students), with fewer participants disagreeing (10% of teachers and 15% of students) or strongly disagreeing (5% from each group). The accessibility of a wide range of learning resources through AI was highly valued, with 35% of teachers and students agreeing and a notable portion strongly agreeing (30% of teachers and 25% of students). Neutral responses were moderate (20% of teachers and 25% of students), while fewer disagreed (10% from each group) or strongly disagreed (5% from each group). Lastly, both groups acknowledged AI’s role in fostering positive learning experiences, with 30% of teachers and students agreeing and 20% strongly agreeing. Neutral responses were moderate (25% from each group), while fewer participants disagreed (15% from both groups) or strongly disagreed (10% from each group).

The integration of AI in educational applications presents several significant challenges that impact students’ creativity and academic emotions. One major issue is the creativity constraints imposed by AI applications. Specifically, the rigid frameworks and lack of flexibility in some applications limit students’ ability to think creatively and explore innovative solutions. This finding aligns with previous research indicating that while AI can facilitate structured learning, it can also stifle creative thinking by enforcing rigid paths [ 51 , 52 ]. Moreover, another significant challenge is emotional disengagement. The repetitive nature of AI interactions and the lack of a human touch can lead to emotional detachment, reducing students’ motivation and engagement. This phenomenon is supported by studies showing that human interaction plays a crucial role in maintaining student engagement and emotional connection [ 53 , 54 ].

Additionally, technical frustration due to frequent glitches and complicated interfaces further hampers the learning experience. This frustration can disrupt creative processes and negatively affect academic emotions [ 55 ]. This issue is highlighted by research showing that technical difficulties are a common barrier to effective AI implementation in education [ 56 ].

Another concern is the over-reliance on AI applications, which can reduce critical thinking and self-initiative among students. This dependency can hinder the development of essential problem-solving skills. Zhai et al. [ 56 ] emphasized the importance of balancing AI use with opportunities for independent thought and critical reasoning.

The digital divide remains a significant challenge, with inequitable access to technology and varying levels of technological literacy among students creating disparities. This issue is well-documented, with recent studies highlighting how unequal access to digital applications can exacerbate existing educational inequalities [ 57 ].

Lastly, ethical concerns regarding biases in AI algorithms and the ethical use of AI in education were prominent. Participants worried about the fairness and equity of AI evaluations, consistent with findings from Bogina et al. [ 58 ], who discussed the potential for AI to perpetuate existing biases and inequalities in educational settings.

Despite these challenges, the integration of AI in educational applications also presents numerous merits that positively impact students’ creativity and academic emotions. One significant benefit is the stimulation of creativity. AI applications can introduce new ideas and enhance problem-solving skills, fostering innovative approaches to learning. This finding is supported by studies showing that AI can provide diverse perspectives and problem-solving techniques that stimulate creative thinking [ 59 , 60 ]. Additionally, increased engagement is another notable merit, with AI’s interactive nature making learning more enjoyable and motivating for students. This enhanced engagement is consistent with research indicating that interactive AI applications can significantly boost student motivation and participation [ 61 ]. Moreover, personalized feedback provided by AI applications offers tailored guidance and immediate responses, helping students improve their work and boosting their confidence. This personalized approach is crucial for supporting students’ creative and emotional development, as noted by Chang et al. [ 62 ], who found that personalized AI feedback enhances learning outcomes and student confidence.

Furthermore, emotional support is another significant benefit, with AI applications reducing anxiety through their constant availability and increasing motivation with gamified elements and positive reinforcement. Studies have shown that such support mechanisms are effective in maintaining a positive emotional state conducive to learning [ 63 ]. In addition, collaborative creativity facilitated by AI applications supports group projects and peer interactions, fostering a sense of community and collective problem-solving. This collaborative environment aligns with findings from Graesser et al. [ 64 ], who emphasized the role of technology in enhancing collaborative learning and creativity.

The provision of accessible learning resources by AI applications supports continuous learning and inspires creativity by allowing students to explore diverse materials anytime. This accessibility is crucial for fostering an inclusive learning environment, as highlighted by Yenduri et al. [ 65 ], who noted that diverse and readily available resources enhance educational equity and creativity. Finally, enhanced academic emotions resulting from AI integration create positive learning experiences and build emotional resilience. Adaptive learning paths and supportive environments provided by AI applications contribute to improved emotional well-being and academic performance. This is supported by research indicating that adaptive learning technologies positively impact student emotions and resilience [ 5 , 6 , 7 , 8 , 9 ].

The integration of AI in education has elicited varied responses from both teachers and students, reflecting a complex interplay of benefits and challenges. One prominent concern is the potential for AI applications to impose rigid frameworks that may stifle creativity. This apprehension aligns with the notion that while AI can provide structured guidance, it may also limit the spontaneous and divergent thinking essential for creative processes. This balance between structure and freedom is critical, as noted in the literature on educational methodologies and creativity development [ 1 , 2 , 3 ].

Emotional disengagement emerges as another significant issue, with both groups expressing that AI interactions often lack the personal touch necessary for effective learning experiences. The importance of human elements in education is well-documented, with studies emphasizing the role of personal connection in fostering student engagement and motivation [ 4 , 5 ]. This emotional component is vital, as AI systems, despite their capabilities, may only partially replicate the nuanced and empathetic interactions provided by human educators [ 6 , 7 ].

Performance anxiety due to frequent AI assessments is another shared concern. AI’s ability to provide continuous and immediate feedback can be a double-edged sword, potentially leading to increased stress and anxiety among students. This is consistent with findings that highlight the psychological impact of constant monitoring and assessment, which can detract from the learning experience and affect student well-being [ 8 , 9 ].

Technical issues associated with AI applications also pose significant challenges. Both teachers and students have reported frustrations with technical glitches disrupting the learning process. These disruptions can hinder the seamless integration of AI into educational environments, underscoring the need for robust and reliable technology infrastructure [ 10 , 11 ].

Despite these concerns, both groups recognize the benefits of AI, particularly in enhancing creativity and engagement. AI’s ability to stimulate problem-solving skills and foster creativity is acknowledged as a significant advantage. This aligns with research suggesting that AI can catalyze creative thinking by providing novel applications and approaches to problem-solving [ 12 , 13 , 14 ]. Additionally, the literature supports AI’s potential to increase student engagement through interactive and personalized learning experiences [ 15 , 16 ].

The role of AI in providing personalized feedback is highly valued, with both teachers and students appreciating its capacity to tailor educational experiences to individual needs. Customised learning, facilitated by AI, can address diverse learning styles and paces, thereby enhancing educational outcomes [ 17 , 18 ]. This personalization is crucial in meeting the unique needs of each student, fostering a more inclusive and effective learning environment [ 19 , 20 ].

AI’s contribution to collaborative creativity and accessible learning resources is also positively viewed. AI’s ability to facilitate group projects and provide a wide range of learning materials supports collaborative learning and resource accessibility, which are essential components of a modern educational framework [ 21 , 22 , 23 ]. Moreover, the enhancement of academic emotions through AI-driven learning experiences highlights AI’s potential to create positive and engaging educational environments [ 24 , 25 ].

In conclusion, the attitudes of teachers and students towards AI in education reflect a balanced perspective that acknowledges both its limitations and advantages. While there are valid concerns about emotional disengagement, ethical issues, and performance anxiety, the benefits of enhanced creativity, engagement, and personalized feedback cannot be overlooked. This underscores the need for thoughtful and strategic integration of AI in educational settings, ensuring that its deployment maximizes benefits while mitigating potential drawbacks. As AI continues to evolve, ongoing research and dialogue will be essential in navigating its role in education and optimizing its impact on teaching and learning [ 26 , 27 , 28 ].

Conclusions and implications

The integration of AI in educational applications presents a complex landscape characterized by significant challenges and notable benefits impacting students’ creativity and academic emotions. On the downside, AI applications often impose rigid frameworks that constrain creative thinking and innovation, echoing previous research on the stifling effects of structured learning paths. Emotional disengagement is another critical issue, as the repetitive and impersonal nature of AI interactions can diminish student motivation and engagement. This phenomenon underscores the importance of human interaction for maintaining emotional connections in learning. Additionally, the constant monitoring and assessments by AI applications heighten performance anxiety, negatively affecting student well-being. Technical frustrations due to frequent glitches and complex interfaces further disrupt the learning process. At the same time, an over-reliance on AI can reduce critical thinking and self-initiative, hindering essential problem-solving skills. The digital divide exacerbates educational disparities, highlighting the need for equitable access to technology. Ethical concerns about biases in AI algorithms also raise questions about fairness and equity in educational evaluations.

Conversely, AI integration offers substantial benefits, including the stimulation of creativity and enhanced engagement. AI applications can introduce new ideas and improve problem-solving skills, fostering innovative learning approaches. Their interactive nature makes learning more enjoyable and motivating, significantly boosting student participation. Personalized feedback from AI applications offers tailored guidance and immediate responses, helping students improve their work and build confidence. AI applications also provide emotional support, reducing anxiety through constant availability and enhancing motivation with gamified elements and positive reinforcement. They facilitate collaborative creativity, fostering a sense of community and collective problem-solving. Additionally, AI applications offer accessible learning resources, supporting continuous learning and inspiring creativity, which is crucial for educational equity. Adaptive learning paths and supportive environments provided by AI applications improve emotional well-being and academic performance, fostering positive learning experiences and building emotional resilience. Balancing these benefits with the challenges requires thoughtful implementation and continuous evaluation to optimize AI’s role in education.

Limitations and suggestions for further studies

Despite the merits and rich data, this study has some limitations which need to be mentioned. Firstly, the exclusive use of interviews for qualitative data collection limits the breadth of perspectives gathered. Interviews may reflect individual viewpoints rather than broader trends or consensus among participants. Additionally, the absence of focus groups further restricts the depth of insights obtained, as group dynamics and interactions that could reveal shared experiences or divergent opinions are not explored. Moreover, the study lacks detailed demographic information about participants, such as their majors, teaching experience (for teachers), or other relevant characteristics. This omission precludes a more nuanced understanding of how these factors might influence perceptions of AI-integrated educational applications.

Furthermore, the study’s small sample size raises concerns about the generalizability of the findings. With a limited number of participants, the variability in perceptions and attitudes towards AI in education may not be adequately captured. Additionally, a systematic comparative analysis between teachers’ and students’ perceptions and attitudes was not conducted; such an analysis could uncover potential differences or similarities and provide richer insights into the impact of AI on educational experiences from both perspectives.

Suggestions for future research include employing mixed-methods approaches that combine interviews with other qualitative methods, such as focus groups. This would allow for a more comprehensive exploration of diverse perspectives and enable researchers to triangulate findings for greater validity. Moreover, expanding the sample size and ensuring diversity among participants in terms of academic disciplines, teaching experience, and student backgrounds could provide a more robust basis for generalizing findings. Additionally, conducting comparative analyses between different stakeholder groups (e.g., teachers vs. students) would deepen understanding of how AI-integrated educational applications affect various participants differently. Finally, longitudinal studies could track changes in perceptions and attitudes over time as AI technologies in education continue to evolve, offering insights into the long-term impacts and adaptations within educational settings. These methodological enhancements and research directions would contribute to a more comprehensive understanding of the complex interactions between AI technology and educational practices.

Data availability

Data is provided within the manuscript or supplementary information files.

Helm JM, Swiergosz AM, Haeberle HS, Karnuta JM, Schaffer JL, Krebs VE, Spitzer AI, Ramkumar PN. Machine learning and artificial intelligence: definitions, applications, and future directions. Curr Rev Musculoskelet Med. 2020;13:69–76. https://doi.org/10.1007/s12178-020-09620-7 .


Hassani H, Silva ES, Unger S, TajMazinani M, Mac Feely S. Artificial Intelligence (AI) or Intelligence Augmentation (IA): What Is the Future? AI. 2020;1:143–155. https://doi.org/10.3390/ai1020011

Yufei L, Saleh S, Jiahui H, Syed Mohamad Syed S. Review of the application of artificial intelligence in education. Int J Innov Creat Change. 2020;12:548–62.


VanLehn K. The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems. Educ Psychol. 2011;46:197–221.


Trilling B, Fadel C. 21st Century skills: learning for life in our Times. San Francisco: Wiley; 2009.

Torrance EP. A longitudinal examination of the fourth-grade slump in creativity. Gifted Child Q. 1968;12:195–9.

Tubb AL, Cropley DH, Marrone RL, Patston T, Kaufman JC. The development of mathematical creativity across high school: increasing, decreasing, or both? Think Skills Creativity. 2020;35:100634.

Alves-Oliveira P, Arriaga P, Paiva A, Hoffman G, Design. Yolo, a robot for creativity: A co-design study with children. In: Proceedings of the and Children; June 27–30, 2017; Stanford, CA, USA. ACM; 2017:423–429.

Gordon G, Breazeal C, Engel S. Can children catch curiosity from a social robot? In: Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction; March 2–5, 2015; Portland, OR, USA. ACM; 2015:91–98.

Belpaeme T, Kennedy J, Ramachandran A, Scassellati B, Tanaka F. Social robots for education: a review. Sci Robot. 2018;3.

Kafai YB, Burke Q. Connected code: why children need to learn programming. Cambridge: MIT Press; 2014.

Book   Google Scholar  

Ryu M, Han S. The educational perception on artificial intelligence by elementary school teachers. J Korean Assoc Inf Educ. 2018;22:317–24.

Kaufman JC, Beghetto RA. Beyond big and little: the four C model of creativity. Rev Gen Psychol. 2009;13:1–12.

Beghetto RA, Kaufman JC, Hatcher R. Applying creativity research to cooking. J Creat Behav. 2016;50:171–7.

Marrone R, Taddeo V, Hill G. Creativity and Artificial Intelligence—A Student Perspective. J Intell. 2022;10(3):65. https://doi.org/10.3390/jintelligence10030065 .

Johnson L, Becker A, Estrada S, V., Freeman A. NMC Horizon Report: 2015 higher Education Edition. Austin, Texas: The New Media Consortium; 2015.

Allen IE, Seaman J. Digital Learning Compass: Distance Education Enrollment Report 2017. Babson Survey Research Group, e-Literate, and WCET; 2017.

Banks JA. Cultural Diversity and Education: foundations, curriculum, and teaching. Routledge; 2015.

Etzkowitz H, Leydesdorff L. The dynamics of innovation: from National systems and Mode 2 to a Triple Helix of university–industry–government relations. Res Policy. 2000;29(2):109–23.

Carnevale AP, Smith N, Strohl J. Recovery: Job Growth and Education requirements through 2020. Georgetown University Center on Education and the Workforce; 2013.

Altbach PG, Reisberg L, Rumbley LE. (2009). Trends in Global Higher Education: Tracking an Academic Revolution . A report prepared for the UNESCO 2009 World Conference on Higher Education.

Luckin R, Holmes W, Griffiths M, Forcier LB. Intelligence unleashed: an argument for AI in Education. Pearson; 2016.

Cope B, Kalantzis M. Interpreting evidence-of-learning: Educational research in the era of big data. Open Rev Educational Res. 2015;2(1):218–39.

Roll I, Wylie R. Evolution and revolution in artificial intelligence in education. Int J Artif Intell Educ. 2016;26(2):582–99.

Siemens G, Long P. Penetrating the fog: Analytics in learning and education. EDUCAUSE Rev. 2011;46(5):30–2.

Warschauer M. Technology and social inclusion: rethinking the digital divide. MIT Press; 2004.

Barocas S, Hardt M. Fairness and Abstraction in Sociotechnical Systems. IEEE Data Eng Bull. 2019;42(3):56–68.

Amornkitpinyo T, Yoosomboon S, Sopapradit S, Amornkitpinyo P. The structural equation model of actual use of Cloud Learning for Higher Education students in the 21st Century. J e-Learn Knowl Soc. 2021;17(1):72–80. https://doi.org/10.20368/1971-8829/1135300 .

Koçak O, Koçak ÖE, Younis MZ. The psychological consequences of COVID-19 fear and the moderator effects of individuals’ underlying illness and witnessing infected friends and family. Int J Environ Res Public Health. 2021;18(4):1836. https://doi.org/10.3390/ijerph18041836 .

Khan S, Zaman SI, Rais M. Measuring student satisfaction through overall quality at business schools: a structural equation modelling: student satisfaction and quality of education. South Asian J Soc Rev. 2022;1(2):34–55.

Chedrawi C, Howayeck P, Tarhini A. CSR and legitimacy in higher education accreditation programs, an isomorphic approach of Lebanese business schools. Qual Assur Educ. 2019;27(1):70–81. https://doi.org/10.1108/QAE-04-2018-0053 .

Hu J. The challenge of Traditional Teaching Approach: a study on the path to Improve Classroom Teaching Effectiveness based on secondary School Students’ psychology. Lecture Notes Educ Psychol Public Media. 2024;50(1):213–9. https://doi.org/10.54254/2753-7048/50/20240945 .

Ali SS, Choi BJ. State-of-the-art artificial intelligence techniques for distributed smart grids: a review. Electronics. 2020;9(6):1030. https://doi.org/10.3390/electronics9061030 .

Derry SJ. Learning strategies for acquiring useful knowledge. Dimensions of thinking and cognitive instruction. Routledge; 2013. pp. 347–79.

Pekrun R, Goetz T, Titz W, Perry RP. Academic emotions in students’ self-regulated learning and achievement: a program of qualitative and quantitative research. Educ Psychol. 2002;37(2):91–105.

Lei H, Cui Y. Effects of academic emotions on achievement among mainland Chinese students: a meta-analysis. Soc Behav Pers. 2016;44(9):1541–54. https://doi.org/10.2224/sbp.2016.44.9.1541 .

Artino AR, Jones KD. Exploring the complex relations between achievement emotions and self-regulated learning behaviours in online learning. Internet High Educ. 2012;15(3):170–5. https://doi.org/10.1016/j.iheduc.2012.01.006 .

Cocoradă E. Achievement emotions and performance among university students. Bull Transilv Univ Braşov. 2016;9(2–Suppl):119–28.

Lam UF, Chen WW, Zhang J, Liang T. It feels good to learn where I belong: School belonging, academic emotions, and academic achievement in adolescents. School Psychol Int. 2015;36:393–409. https://doi.org/10.1177/0143034316680410 .

Villavicencio FT. Critical thinking, negative academic emotions, and achievement: a meditational analysis. Asia-Pac Educ Res. 2011;20:118–26.

Shen B, Wang Y, Yang Y, Yu X. Relationships between Chinese university EFL learners’ academic emotions and self-regulated learning strategies: a structural equation modelling analysis. Front Psychol. 2021;12:629892. https://doi.org/10.3389/fpsyg.2021.629892 .

Alam MM, Rayhan MI, Rahman MA, Imran S. The impact of teachers’ emotion regulation, students’ learning strategies, and academic emotions on students’ learning outcomes in Bangladesh. Int J Educ Pract. 2021;9(2):306–21.

Kahu ER, Stephens C, Zepke N, Leach L. Space and time to engage: mature-aged distance students’ learning experiences. High Educ Res Dev. 2013;32(5):791–804.

Thelwall M, Buckley K, Paltoglou G. Sentiment in Twitter events. J Am Soc Inf Sci Technol. 2011;62(2):406–18.

Graziano PA, Reavis RD, Keane SP, Calkins SD. The role of emotion regulation and children’s early academic success. J Sch Psychol. 2007;45(1):3–19. https://doi.org/10.1016/j.jsp.2006.09.002 . PMID: 21179384; PMCID: PMC3004175.

Sakulwichitsintu S. Mobile technology–An innovative instructional design model in distance education. Int J Interact Mob Technol. 2023;17(7).

Fang B, Li X, Han G, He J. Facial expression recognition in educational research from the perspective of machine learning: a systematic review. IEEE Access. 2023.

Méndez López MG. Emotion and language learning: An exploration of experience and motivation in a Mexican university context [doctoral dissertation]. University of Nottingham; 2011.

Huang Y, Bo D. Emotion classification and achievement of students in distance learning based on the knowledge state model. Sustainability. 2023;15(3):2367.

Zembylas M. Adult learners’ emotions in online learning. Distance Educ. 2008;29(1):71–87.

Saputra I, Astuti M, Sayuti M, Kusumastuti D. Integration of artificial intelligence in education: opportunities, challenges, threats and obstacles. A literature review. Indones J Comput Sci. 2023;12(4).

Marrone R, Taddeo V, Hill G. Creativity and artificial intelligence—A student perspective. J Intell. 2022;10(3):65.

Lavinder KW. Staff experiences with the organizational departure of chief student affairs officers: An interpretative phenomenological analysis [doctoral dissertation]. Northeastern University; 2021.

Pekrun R, Linnenbrink-Garcia L. Academic emotions and student engagement. Handbook of Research on Student Engagement. Boston, MA: Springer US; 2012. pp. 259–82.

Chapter   Google Scholar  

Welter VDE, Becker LB, Großschedl J. Helping learners become their own teachers: the beneficial impact of trained concept-mapping-strategy use on metacognitive regulation in learning. Educ Sci. 2022;12(5):325.

Zhai C, Wibowo S, Li LD. The effects of over-reliance on AI dialogue systems on students’ cognitive abilities: a systematic review. Smart Learn Environ. 2024;11(1):28.

Kelly MA. Bridging digital and cultural divides: TPCK for equity of access to technology. Handbook of Technological Pedagogical Content Knowledge (TPCK) for educators. Routledge; 2014. pp. 41–68.

Bogina V, Hartman A, Kuflik T, Shulner-Tal A. Educating software and AI stakeholders about algorithmic fairness, accountability, transparency and ethics. Int J Artif Intell Educ. 2022;1–26.

Urmeneta A, Romero M. Creative applications of artificial intelligence in education. Springer Nature; 2024.

Li G, Zarei MA, Alibakhshi G, Labbafi A. Teachers and educators’ experiences and perceptions of artificial-powered interventions for autism groups. BMC Psychol. 2024;12(1):199.

Huang AY, Lu OH, Yang SJ. Effects of artificial intelligence–enabled personalized recommendations on learners’ learning engagement, motivation, and outcomes in a flipped classroom. Comput Educ. 2023;194:104684.

Chang DH, Lin MPC, Hajian S, Wang QQ. Educational design principles of using AI chatbot that supports self-regulated learning in education: goal setting, feedback, and personalization. Sustainability. 2023;15(17):12921.

Li L, Gow ADI, Zhou J. The role of positive emotions in education: a neuroscience perspective. Mind Brain Educ. 2020;14(3):220–34.

Graesser AC, Fiore SM, Greiff S, Andrews-Todd J, Foltz PW, Hesse FW. Advancing the science of collaborative problem-solving. Psychol Sci Public Interest. 2018;19(2):59–92.

Article   PubMed   Google Scholar  

Yenduri G, Kaluri R, Rajput DS, Lakshmanna K, Gadekallu TR, Mahmud M, Brown DJ. From assistive technologies to metaverse—technologies in inclusive higher education for students with specific learning difficulties: a review. IEEE Access. 2023;11:64907–27.


Acknowledgements

The author would like to thank all the participants.

This work was supported by the 2024 Higher Education Research Project (KT2024176) of ZAHE, the Zhejiang Provincial Association of Higher Education.

Author information

Authors and affiliations

School of Marxism, Wenzhou University of Technology, Wenzhou, Zhejiang, 32500, China

Haozhuo Lin

School of Humanities, Wenzhou University, Wenzhou, Zhejiang, 32500, China

Qiu Chen

Contributions

HL and QC designed the study. HL and QC collected the data. HL and QC analyzed and interpreted the data. HL and QC drafted the manuscript. HL and QC proofread the paper. HL and QC agreed to be accountable and verified the submitted version.

Corresponding author

Correspondence to Qiu Chen .

Ethics declarations

Ethics approval and consent to participate

The Institutional Review Board of Wenzhou approved this study and issued a letter confirming that it posed no harm to participants. All experiments and methods were carried out in accordance with relevant guidelines and regulations. Informed consent was obtained from all subjects.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Lin, H., Chen, Q. Artificial intelligence (AI) -integrated educational applications and college students’ creativity and academic emotions: students and teachers’ perceptions and attitudes. BMC Psychol 12 , 487 (2024). https://doi.org/10.1186/s40359-024-01979-0


Received : 20 May 2024

Accepted : 03 September 2024

Published : 16 September 2024

DOI : https://doi.org/10.1186/s40359-024-01979-0


Keywords

  • Artificial intelligence
  • Academic emotions
  • AI-empowered applications
  • Chinese students

BMC Psychology

ISSN: 2050-7283


Artificial Intelligence (AI) Research Guide


Introduction

Not all artificial intelligence is the same. AI has been used in healthcare for over 50 years, though not at the scale we have seen over the last decade with advances in neural networks and transformer architectures. It began with expert knowledge encoded into electronic clinical pathways and has progressed to systems that can identify skin conditions and make predictions from electronic health record data. Currently, most AI in medical settings is used for:

  • image acquisition and processing
  • early disease detection
  • more accurate diagnosis, prognosis, and risk assessment
  • identification of new patterns in human physiology and disease progression
  • development of personalized diagnostics
  • therapeutic treatment and response monitoring

This new wave of AI algorithms and other applications powered by AI will likely usher in new ways to support clinicians and other health professionals.
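To make “predictions from electronic health record data” concrete, the toy sketch below trains a simple classifier on synthetic tabular features standing in for EHR fields. It is a minimal illustration under stated assumptions: the feature names, the synthetic outcome, and the choice of scikit-learn logistic regression are invented for demonstration and do not describe any specific clinical system mentioned on this page.

```python
# Toy sketch: risk prediction from tabular, EHR-style features.
# All data here is synthetic; feature names and model choice are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000

# Hypothetical features: age (years), systolic BP (mmHg), HbA1c (%), prior admissions (count).
X = np.column_stack([
    rng.normal(60, 12, n),
    rng.normal(130, 18, n),
    rng.normal(6.0, 1.2, n),
    rng.poisson(1.0, n),
])

# Synthetic outcome: risk rises (by construction) with age, HbA1c, and prior admissions.
logit = 0.03 * (X[:, 0] - 60) + 0.5 * (X[:, 2] - 6.0) + 0.4 * X[:, 3] - 0.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print(f"Toy model AUC: {roc_auc_score(y_test, probs):.2f}")
```

Real clinical systems differ enormously in scale, validation, and regulatory oversight, but the underlying pattern is the same: structured inputs are mapped to a risk estimate that clinicians can weigh alongside their own judgment.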

Howell MD, Corrado GS, DeSalvo KB. Three Epochs of Artificial Intelligence in Health Care. JAMA. 2024 Jan 16;331(3):242-244. doi: 10.1001/jama.2023.25057. PMID: 38227029. Source: https://www.fda.gov/medical-devices/medical-device-regulatory-science-research-programs-conducted-osel/artificial-intelligence-program-research-aiml-based-medical-devices

🔎 Literature Search

When conducting your literature searches, ensure reliability and credibility by using accurate sources. Keep these in mind when using AI-based platforms for your literature searching: 

  • Choose a tool that is specialized for literature searching such as Scite.ai and Perplexity for accurate and credible information retrieval. 
  • Minimize the chance of encountering hallucinations or misleading content by steering clear of general-purpose AI chatbots such as ChatGPT for literature searches (see the citation-verification sketch after this list).
  • Enhance credibility by using platforms that source information from credible scholarly literature to ensure the reliability of search results and maintain academic integrity. 
  • Article.Audio : https://article.audio/ 
  • ⭐️  Consensus : https://consensus.app/ 
  • ⭐️  Elicit: https://elicit.org/ 
  • ELIF : https://explainlikeimfive.io/ 
  • ⭐️  Perplexity : https://www.perplexity.ai/ 
  • Scholarcy : https://www.scholarcy.com/. Article summarizer.
  • Typeset : https://typeset.io/ 
  • TutorAI : https://www.tutorai.me/
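One practical way to act on the advice above about credibility is to confirm that any citation an AI tool suggests actually resolves to a real record before relying on it. The sketch below queries the public Crossref REST API for a candidate title; the endpoint is real, but the example title (taken from the Ayers et al. article cited later on this page) and the simple print-and-inspect workflow are illustrative assumptions rather than part of the guide’s recommendations.

```python
# Sketch: sanity-check an AI-suggested citation against Crossref before trusting it.
# Crossref's "works" endpoint is public; no API key is required for light use.
import requests

candidate = (
    "Comparing Physician and Artificial Intelligence Chatbot Responses "
    "to Patient Questions Posted to a Public Social Media Forum"
)

resp = requests.get(
    "https://api.crossref.org/works",
    params={"query.bibliographic": candidate, "rows": 3},
    timeout=30,
)
resp.raise_for_status()

items = resp.json()["message"]["items"]
if not items:
    print("No Crossref match found -- treat the citation as unverified.")
else:
    for item in items:
        title = (item.get("title") or ["(no title)"])[0]
        doi = item.get("DOI", "n/a")
        print(f"{title}  (DOI: {doi})")
```

If the top matches bear no resemblance to the suggested reference, that is a strong signal the citation was hallucinated and should be discarded or checked manually in a database such as PubMed.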

AI Initiatives in Healthcare

Across NIH’s 27 institutes and centers, AI/ML technologies are being developed. Follow the link for some specific examples: Institute and Center Funded Initiatives  

None of the following resources are meant to provide medical advice, diagnosis, or treatment. Users must assess the information they obtain from any source:

OpenEvidence is an artificial intelligence system to aggregate, synthesize, and visualize clinically relevant evidence in understandable, accessible formats that can be used to make more evidenced-based decisions and improve patient outcomes. 

Coalition for Health AI (CHAI) is an initiative to develop and promote responsible AI standards in healthcare to ensure high-quality, trustworthy, and equitable AI applications. They aim to do this by creating a federated network of labs, tools, and a robust framework that can be used by all healthcare providers. 

BioMedical Engineering and Imaging Institute focuses on the use of multimodality imaging for brain, heart, and cancer research, along with research in nanomedicine for precision imaging and drug delivery. 

Listen to  The Role of Artificial Intelligence in Clinical Trial Design and Research with Dr. ElZarrad  where the FDA's Division of Drug Information discusses the role of AI in clinical trial design. Listening is worth 0.5 CE credit.

Selected Health Science Literature in AI/ML

  • Intelligence-Based Medicine: the primary focus of this open-access journal is on the clinical perspective and translation of emerging technologies into care practices and patient benefits.
  • Artificial Intelligence in Medicine: original articles from a wide variety of interdisciplinary perspectives concerning the theory and practice of artificial intelligence (AI) in medicine, human biology, and health care.
  • Artificial Intelligence and Academic Medicine Collection: the Journal of the Association of American Medical Colleges' featured collection on the evolving topic of AI tools, including natural language processing and machine learning models.

Can an artificial intelligence chatbot assistant provide responses to patient questions that are comparable in quality and empathy to those written by physicians?

Ayers JW, Poliak A, Dredze M, et al.  Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum .  JAMA Intern Med.  2023;183(6):589–596. doi:10.1001/jamainternmed.2023.1838

Jin, Q., Chen, F., Zhou, Y.  et al.   Hidden flaws behind expert-level accuracy of multimodal GPT-4 vision in medicine .  npj Digit. Med.   7 , 190 (2024). https://doi.org/10.1038/s41746-024-01185-7


Publisher Statements on AI

To keep pace with emerging generative AI technologies, publishers are establishing guidelines and policies on the use of AI in scholarly publishing. These guidelines may change, so check the publisher's site or contact the library to confirm that your work complies with the publisher's policies. The following links reflect the latest guidelines as of 09/03/2024.

  • Elsevier Books: "Authors should not list generative AI and AI-assisted technologies as an author or co-author, nor cite AI as an author. Authorship implies responsibilities and tasks that can only be attributed to and performed by humans." Images/Figures: "Elsevier does not permit the use of generative AI or AI-assisted tools to create or alter images in submitted manuscripts." Visit link for exceptions.
  • Springer Nature "An attribution of authorship carries with it accountability for the work, which cannot be effectively applied to LLMs."
  • Reporting Use of AI in Research and Scholarly Publication—JAMA Network Guidance JAMA has sought to define the broad scope of discovery and innovation in medical applications of AI and to address potential challenges in its implementation.
  • Taylor & Francis AI Policy "Authors are responsible for ensuring that the content of their submissions meets the required standards of rigorous scientific and scholarly assessment, research and validation, and is created by the author. Note that some journals may not allow use of Generative AI tools beyond language improvement, therefore authors are advised to consult with the editor of the journal prior to submission."
  • Science "AI-assisted technologies [such as large language models (LLMs), chatbots, and image creators] do not meet the Science journals’ criteria for authorship and therefore may not be listed as authors or coauthors, nor may sources cited in Science journal content be authored or coauthored by AI tools."
  • MDPI "While AI can contribute intellectually to the writing process, it is now widely accepted that it cannot take responsibility of the content it produces."

Clinical Trials Using AI/ML

The FDA's Medical Product Centers are collaborating to advance the responsible use of AI for medical products. They are collaborating to build regulations that can be applied across various medical products and used within the health care delivery system. To learn more, read the FDA's paper titled  Artificial Intelligence and Medical Products: How CBER, CDER, CDRH, and OCP are Working Together .

The FDA has seen a rapid growth in the number of submissions that reference AI. Clinical trials are an integral part of new product discovery and development and are required by the Food and Drug Administration before a new product can be brought to the market.

The following RSS feed includes the new and existing studies added or modified (last update posted) in the last 14 days on ClinicalTrials.gov that use AI in some capacity. If you want to complete a similar search, these were the search terms used under “Other Terms”: 

algorithm OR “artificial intelligence” OR “convolutional network” OR “computer reasoning” OR “vision system” OR “deep learning” OR “Hierarchical Learning” OR “machine intelligence” OR “machine learning” OR “neural network” OR radiomics OR “in-context learning” 

If you want to look at AI within a field, just go to ClinicalTrials.gov, type in the provided terms, along with whichever conditions and/or interventions you want to research further. 
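For readers who would rather script this search than use the website form, the sketch below sends the same term list to ClinicalTrials.gov programmatically. It assumes the public ClinicalTrials.gov API v2 `studies` endpoint with its `query.term`, `query.cond`, and `pageSize` parameters, and the response field names shown in the comments; these details are assumptions to verify against the current API documentation, not part of the guide above.

```python
# Sketch: query ClinicalTrials.gov for AI/ML-related studies using the term list above.
# Endpoint and field names assume the public API v2; check the current docs before relying on them.
import requests

AI_TERMS = (
    'algorithm OR "artificial intelligence" OR "convolutional network" OR '
    '"computer reasoning" OR "vision system" OR "deep learning" OR '
    '"Hierarchical Learning" OR "machine intelligence" OR "machine learning" OR '
    '"neural network" OR radiomics OR "in-context learning"'
)

resp = requests.get(
    "https://clinicaltrials.gov/api/v2/studies",
    params={
        "query.term": AI_TERMS,       # corresponds to the "Other Terms" search box
        "query.cond": "diabetes",     # hypothetical condition filter; change or drop as needed
        "pageSize": 10,
    },
    timeout=30,
)
resp.raise_for_status()

# Each study record nests its ID and title under protocolSection -> identificationModule.
for study in resp.json().get("studies", []):
    ident = study.get("protocolSection", {}).get("identificationModule", {})
    print(ident.get("nctId", "?"), "-", ident.get("briefTitle", ""))
```

The 14-day "last update posted" window behind the RSS feed can be reproduced with the API's date filters, but the exact filter syntax should be taken from the API documentation rather than guessed here.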


Attribution

In addition to credit given for various images, parts of this Medical Education & Healthcare page were adapted from work/guides by: Emory Libraries, New York University, Westport Library, University of North Dakota, Mount Sinai Levy Library Used with permission or in accordance with Creative Commons Licensing.


How can artificial intelligence enhance education?


The transformative power of Artificial Intelligence (AI) cuts across all economic and social sectors, including education .

“Education will be profoundly transformed by AI,” says UNESCO Director-General Audrey Azoulay. “Teaching tools, ways of learning, access to knowledge, and teacher training will be revolutionized.”

AI has the potential to accelerate the process of achieving the global education goals through reducing barriers to access learning, automating management processes, and optimizing methods in order to improve learning outcomes.  

This is why UNESCO’s upcoming Mobile Learning Week (4-8 March 2019) will focus on AI and its implications for sustainable development. Held annually at UNESCO Headquarters in Paris, the 5-day event offers an exciting mix of high-level plenaries, workshops and hands-on demonstrations. Some 1200 participants have already registered for the event that provides the educational community, governments and other stakeholders a unique opportunity to discuss the opportunities and threats of AI in the area of education.

The discussions will evolve around four key issues:

  • Ensure inclusive and equitable use of AI in education – including actions on how to address inequalities related to socio-economic status, gender, ethnicity and geographic location; identify successful projects or AI solutions proven effective at breaking down the barriers that keep vulnerable groups from accessing quality education.
  • Leverage AI to enhance education and learning – improve education management systems, AI-boosted learning management systems or other AI in education applications, and identify new forms of personalized learning that can support teachers and tackle education challenges.
  • Promote skills development for jobs and life in the AI era – support the design of local, regional and international strategies and policies, consider the readiness of policymakers and other education leaders and stakeholders; and explore how AI-powered mobile technology tools can support skills development and innovation.
  • Safeguard transparent and auditable use of education data – analyze how to mitigate the risks and perils of AI in education; identify and promote sound evidence for policy formulation guaranteeing accountability, and adopt algorithms that are transparent and explainable to education stakeholders.

During Mobile Learning Week , UNESCO is organizing a Global Conference on AI (Monday 4 March) to raise awareness and promote reflection on the opportunities and challenges that AI and its correlated technologies pose, notably in the area of transparency and accountability. The conference, entitled “AI with Human Values for Sustainable Development” will also explore the potential of AI in relation to the SDGs.


H.R. 9607: To promote a 21st century artificial intelligence workforce and to authorize the Secretary of Education to carry out a program to increase access to prekindergarten through grade 12 emerging and advanced technology education and upskill workers in the technology of the future.



Sponsor and status

Introduced on Sep 16, 2024

This bill is in the first stage of the legislative process. It was introduced into Congress on September 16, 2024. It will typically be considered by committee next before it is possibly sent on to the House or Senate as a whole.

Other activity may have occurred on another bill with identical or similar provisions.


Barbara Lee

Representative for California's 12th congressional district


1 Cosponsor (1 Democrat)

Bills and resolutions are referred to committees which debate the bill before possibly sending it on to the whole chamber.


H.R. 9607 is a bill in the United States Congress.

A bill must be passed by both the House and Senate in identical form and then be signed by the President to become law.

Bills numbers restart every two years. That means there are other bills with the number H.R. 9607. This is the one from the 118 th Congress.

How to cite this information.

We recommend the following MLA -formatted citation when using the information you see here in academic work:

GovTrack.us. (2024). H.R. 9607 — 118th Congress: To promote a 21st century artificial intelligence workforce and to authorize the Secretary of Education …. Retrieved from https://www.govtrack.us/congress/bills/118/hr9607

“H.R. 9607 — 118th Congress: To promote a 21st century artificial intelligence workforce and to authorize the Secretary of Education ….” www.GovTrack.us. 2024. September 17, 2024 <https://www.govtrack.us/congress/bills/118/hr9607>

To promote a 21st century artificial intelligence workforce and to authorize the Secretary of Education to carry out a program to increase access to prekindergarten through grade 12 emerging and advanced technology education and upskill workers in the technology of the future, H.R. 9607, 118th Cong. (2024).

{{cite web |url=https://www.govtrack.us/congress/bills/118/hr9607 |title=H.R. 9607 (118th) |accessdate=September 17, 2024 |author=118th Congress (2024) |date=September 16, 2024 |work=Legislation |publisher=GovTrack.us |quote=To promote a 21st century artificial intelligence workforce and to authorize the Secretary of Education … }}


Where is this information from?

GovTrack automatically collects legislative information from a variety of governmental and non-governmental sources. This page is sourced primarily from Congress.gov , the official portal of the United States Congress. Congress.gov is generally updated one day after events occur, and so legislative activity shown here may be one day behind. Data via the congress project .

Prognosis Details

This bill has a 2% chance of getting past committee and a 1% chance of being enacted.

Only 11% of bills made it past committee and only about 2% were enacted in 2021–2023.

Factors considered:

The bill was referred to House Education and the Workforce.

These factors are correlated with either an increased or decreased chance of being enacted.

Please read our full methodology for further details.


Introduction to Artificial Intelligence

This course provides a foundational understanding of artificial intelligence (AI), its components, and its societal impact. Students evaluate AI’s problem areas and its influence on current and future job markets. Students will:

  • Learn about algorithms, machine learning, and neural networks
  • Explore current applications of AI
  • Evaluate issues like bias and job impacts

Introduction to Artificial Intelligence Course for the Secondary CTE Classroom

Prepare students to learn AI’s building blocks, its impact on existing jobs, and its potential to create new career fields in the future.

Student-Centered Approach

Student-Centered Design

Based on decades of educational research, each course is designed to maximize student learning, motivation, and achievement. Utilize pedagogical concepts such as Understanding by Design, Growth Mindset, and Video and Project-based Learning.


Interactive Learning

Engage students every step of the way with relevant content, interactives, videos, discussion boards, text-to-speech, language translations, projects, and more. It’s a learning experience students love.


Ready for the Real World

Students become career ready when they go beyond understanding concepts to applying them in real life. Through practical activities, students gain the skills they need for in-demand careers.

Learn artificial intelligence and understand how AI works

  • Course Outline

Module Introduction

Interactive instructional design, formative and summative assessments, accessibility, course outline by module.

  • Module 1 : Introduction to Artificial Intelligence
  • Module 2 : Perception and Intelligence
  • Module 3 : Algorithms in AI
  • Module 4 : Machine Learning
  • Module 5 : Deep Learning & Neural Networks
  • Module 6 : Humans and AI
  • Module 7 : Ethical AI and Biases
  • Module 8 : AI and Jobs


  • Module Introduction Video and Description Each module begins with an introductory video where the instructor sets the stage for topics and learning objectives for the upcoming module. The video is followed by a short description of the module content.
  • Module Learning Objectives Each module introduction includes the set of learning objectives for students to review prior to beginning instruction.
  • Polling Question Students engage with an interactive poll related to the upcoming module content. After completing the poll, students can see how their peers responded with a percentage breakdown of the results.
  • Introduce New Vocabulary The module introduction concludes with an interactive vocabulary matching activity, designed to familiarize students with words and concepts they will learn in the upcoming module.


  • Engaging Lesson Videos Each lesson begins with an instructional video from an expert educator, designed to grab students’ attention while addressing learning objectives.
  • Interactive Reading Each lesson video is followed by an interactive reading where students dive into new material with embedded interactives like hot spots, flip cards, slides, videos, and more!
  • Integrated Activities Activities are embedded purposefully between lessons and incorporate a variety of interactive tools for students to practice what they’ve learned.
  • Project-Based Learning End-of-module projects provide the opportunity for students to apply what they’ve learned to a real-world situation that they would encounter in the workforce.


  • Concept Check Embedded concept checks include a variety of low-stakes activities, including short answer responses, matching, flashcards, and sorting.
  • Discussion-based Reflections Thought-provoking discussion prompts invite students to process and share their learning with their peers. Utilize your LMS discussion board or have students work in an individual course journal.
  • Short Answer Assignments Short answer assignments are included at the end of each module, providing an opportunity for students to analyze and apply their learning to a real-world situation. Students are supplied with a detailed assignment rubric, with clear expectations.
  • Module Quizzes Each module concludes with a quiz, assessing student understanding. Quizzes are auto-scored and the results report back to the teacher’s gradebook in their LMS.
  • Final Exam A comprehensive final exam assesses student skills and knowledge at the end of the course. The final exam is auto-scored and the results report back to the teacher’s gradebook in their LMS.


  • Text-to-Speech Audio features allow greater accessibility for students. Highlight any text within the course to have it read out loud, including image alt text.
  • Language Translations Translate any text within the course, including video transcripts, into 60+ languages. Additionally, many language translations can be read out loud using the text-to-speech feature.
  • Closed Captioning All videos within the course include closed captioning with the ability to access video transcripts and translate into 60+ languages.
  • Alternative Activities Alternative activities are embedded throughout lessons to meet accessibility standards and provide alternatives to the interactive activities in multiple choice format.


Explore Introduction to Artificial Intelligence content to promote learning


Course Preview

Get a sneak peek into the Introduction to Artificial Intelligence course, featuring key learning highlights.

Digital Tour

Experience the engaging instructional design and learn about system integrations.

Flexible System Integrations for Your Introduction to Artificial Intelligence CTE Program

Digital Courseware on Your LMS

Seamless Integration With Your LMS

CTE has never been more accessible with digital courses that integrate smoothly with your Learning Management System. Course materials are available anytime, anywhere, all in your familiar and convenient LMS.

Additional Introduction to Artificial Intelligence Resources

Downloadable instructors’ guide, course syllabus, on-demand training.

Downloadable Instructors’ Guide

The syllabus includes a high-level course overview, module overviews, and module learning objectives.


Frequently asked questions about Introduction to Artificial Intelligence

  • What grade levels is this text appropriate for? Grades 6–12
  • What types of programs is this course designed for? This program is designed for CTE STEM pathways or elective offerings.
  • Is this program available in print or digitally? Introduction to Artificial Intelligence is a robust, digital-only course ideal for virtual or blended learning.
  • What teacher resources are available? Teacher resources include the course syllabus, instructors’ guide, and digital courseware access. Additionally, an on-demand, self-paced teacher training course covers the fundamentals of implementing the curriculum.
  • Which Learning Management System (LMS) does this course integrate with? Digital courseware is delivered by LTI integration with the following Learning Management Systems: Canvas®, Schoology®, Blackboard®, Moodle®, AGILIX® Buzz®, D2L or Focalpoint.
  • What are the digital license options? Student digital access to Introduction to Artificial Intelligence can be purchased for 1 year.
  • How long does it take to complete the course? Introduction to Artificial Intelligence is designed to support a full-credit course and therefore may be used to support a semester- or year-long option. On average, this course requires 80 instructional seat-time hours, equating to roughly 3–4 lessons per week. Teachers can make course customizations if desired to meet specific needs.


Transform Learning Outcomes

The Savvas experts will guide you through our blended solutions, digital textbooks and printed materials. We'll also assist you throughout the entirety of the process.


California governor signs legislation to protect entertainers from AI



COMMENTS

  1. Artificial Intelligence

    Artificial Intelligence and the Future of Teaching and Learning. The U.S. Department of Education Office of Educational Technology's new policy report, Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations, addresses the clear need for sharing knowledge, engaging educators, and refining technology plans and policies for artificial intelligence (AI) use ...

  2. Artificial intelligence in education

    Artificial Intelligence (AI) has the potential to address some of the biggest challenges in education today, innovate teaching and learning practices, and accelerate progress towards SDG 4. However, rapid technological developments inevitably bring multiple risks and challenges, which have so far outpaced policy debates and regulatory frameworks.

  3. Artificial Intelligence and the Future of Teaching and Learning

    The U.S. Department of Education Office of Educational Technology's new policy report, Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations, addresses the clear need for sharing knowledge, engaging educators, and refining technology plans and policies for artificial intelligence (AI) use in education.The report describes AI as a rapidly-advancing set ...

  4. How AI can transform education for students and teachers

    Advances in artificial intelligence (AI) could transform education systems and make them more equitable. It can accelerate the long overdue transformation of education systems towards inclusive learning that will prepare young people to thrive and shape a better future.; At the same time, teachers can use these technologies to enhance their teaching practice and professional experience.

  5. AI technologies for education: Recent research & future directions

    2.1 Prolific countries. Artificial intelligence in education (AIEd) research has been conducted in many countries around the world. The 40 articles reported AIEd research studies in 16 countries (See Table 1).USA was so far the most prolific, with nine articles meeting all criteria applied in this study, and noticeably seven of them were conducted in K-12.

  6. PDF Artificial Intelligence and the Future of Teaching and Learning

    leaders, policy makers, researchers, and educational technology innovators and providers as they work together on pressing policy issues that arise as Artificial Intelligence (AI) is used in education. AI can be defined as "automation based on associations." When computers automate reasoning

  7. Use of AI in education: Deciding on the future we want

    Artificial intelligence tools are being deployed rapidly in education systems across the globe. As much as they offer immense opportunities to enhance and expand learning, their rapid roll out also presents risks: They are being used in the absence of regulatory frameworks needed to protect learners and teachers, and ensure a human-centered approach to using these technologies in education.

  8. Artificial intelligence in education: A systematic literature review

    1. Introduction. Information technologies, particularly artificial intelligence (AI), are revolutionizing modern education. AI algorithms and educational robots are now integral to learning management and training systems, providing support for a wide array of teaching and learning activities (Costa et al., 2017, García et al., 2007).Numerous applications of AI in education (AIED) have emerged.

  9. Artificial Intelligence in Education (AIEd): a high-level academic and

    In the past few decades, technology has completely transformed the world around us. Indeed, experts believe that the next big digital transformation in how we live, communicate, work, trade and learn will be driven by Artificial Intelligence (AI) [83]. This paper presents a high-level industrial and academic overview of AI in Education (AIEd). It presents the focus of latest research in AIEd ...

  10. Artificial intelligence and the Futures of Learning

    The Artificial Intelligence and the Futures of Learning project builds on the Recommendation on the Ethics of Artificial Intelligence adopted at the 41st session of the UNESCO General Conference in 2019 and follows up on the recommendations of the UNESCO global report Reimagining our futures together: a new social contract for education, launched in November 2021.

  11. U.S. Department of Education Shares Insights and Recommendations for

    Today, the U.S. Department of Education's Office of Educational Technology (OET) released a new report, "Artificial Intelligence (AI) and the Future of Teaching and Learning: Insights and Recommendations" that summarizes the opportunities and risks for AI in teaching, learning, research, and assessment based on public input. This report is part of the Biden-Harris Administration's ongoing ...

  12. Artificial Intelligence in Education

    Artificial intelligence is poised to change education in ways we've hardly begun to anticipate, and ChatGPT's emergence serves as a reminder that teachers need to be ready to adapt quickly to sudden and exponential advancements in technology.

  13. The future of learning: AI is revolutionizing education 4.0

    With increasing interest in AI and education, the Education 4.0 Alliance sought to understand the current state and future promises of the technology for education. The latest report - Shaping the Future of Learning: The Role of AI in Education 4.0 - shows four key promises that have emerged for AI to enable Education 4.0: 1.

  14. Unleashing the power of AI for education

    Dan Ayoub. March 4, 2020. Provided by Microsoft Education. Artificial intelligence (AI) is a major influence on the state of education today, and the implications are huge. AI has the potential to ...

  15. A Review of Artificial Intelligence (AI) in Education from 2010 to 2020

    According to the reviewed papers, a majority of AI technology in education focused on online information technology or system (107 out of 109), such as intelligent tutoring system, intelligent virtual laboratory ...

  16. How technology is reinventing K-12 education

    New advances in technology are upending education, from the recent debut of new artificial intelligence (AI) chatbots like ChatGPT to the growing accessibility of virtual-reality tools that expand ...

  17. AI in Education| Harvard Graduate School of Education

    Chris Dede thinks we need to get smarter about using artificial intelligence and education. He has spent decades exploring emerging learning technologies as a Harvard researcher. The recent explosion of generative AI, like ChatGPT, has been met with mixed reactions in education. Some public school districts have banned it.

  18. Artificial Intelligence In Education: Teachers' Opinions On AI In The

    Before we dive into AI's function in the education space, let's define this technology in general terms. Artificial intelligence allows machines to execute tasks that have traditionally ...

  19. Artificial Intelligence and Education: A Reading List

    Gwo-Jen Hwang and Nian-Shing Chen, "Exploring the Potential of Generative Artificial Intelligence in Education: Applications, Challenges, and Future Research Directions," Educational Technology & Society 26, no. 2 (2023). Gwo-Jen Hwang and Nian-Shing Chen are enthusiastic about the potential benefits of incorporating generative AI into ...

  20. Artificial intelligence in education: The three paradigms

    1. Introduction. With the development of computing and information processing techniques, artificial intelligence (AI) has been widely applied in educational practices (Artificial Intelligence in Education; AIEd), such as intelligent tutoring systems, teaching robots, learning analytics dashboards, adaptive learning systems, human-computer interactions, etc. (Chen, Xie, & Hwang, 2020).

  21. Artificial Intelligence in Education and Mental Health for a

    To further explore these issues, the National Academies of Sciences, Engineering, and Medicine's Roundtable on Science and Technology for Sustainability, in collaboration with the Board on Health Care Services and Board on Science Education, convened a hybrid workshop, Artificial Intelligence in Education and Mental Health for a Sustainable ...

  22. Accelerating Research on Generative Artificial Intelligence: IES

    National Center on Generative AI for Uplifting STEM+C Education (GENIUS Center) (R305C240010) This Center aims to transform education through the development of a GenAI learning agent, GenAgent, to address the need for robust interdisciplinary science, technology, engineering, mathematics, and computing (STEM+C) education for middle school ...

  23. Artificial intelligence (AI) -integrated educational applications and

    Artificial intelligence (AI) integration is one particularly hopeful solution . As technology advances, artificial intelligence (AI) in education has enormous potential to change the teaching and learning environment . AI is significantly improving the quality of higher education in several ways . Artificial intelligence (AI)--powered learning ...

  24. Does the human professor or artificial intelligence (AI) offer better

    Although artificial intelligence (AI) technology has been incorporated into higher education for some time (Guan et al., 2020), more recently, generative AI-chatbots have entered higher education w...

  25. AI in Medical Education & Healthcare

    Can an artificial intelligence chatbot assistant, provide responses to patient questions that are of comparable quality and empathy to those written by physicians? Ayers JW, Poliak A, Dredze M, et al. Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum.

  26. How can artificial intelligence enhance education?

    Last update:20 April 2023. The transformative power of Artificial Intelligence (AI) cuts across all economic and social sectors, including education. "Education will be profoundly transformed by AI," says UNESCO Director-General Audrey Azoulay. "Teaching tools, ways of learning, access to knowledge, and teacher training will be ...

  27. To promote a 21st century artificial intelligence workforce and to

    H.R. 9607: To promote a 21st century artificial intelligence workforce and to authorize the Secretary of Education to carry out a program to increase access to prekindergarten through grade 12 emerging and advanced technology education and upskill workers in the technology of the future.

  28. Introduction to Artificial Intelligence

    Introduction to Artificial Intelligence is designed to support a full-credit course and therefore may be used to support a semester- or year-long option. On average, this course requires 80 instructional seat-time hours, equating to roughly 3-4 lessons per week.

  29. Microsoft and BlackRock plan $30 bln fund to invest in AI

    BlackRock is preparing to launch a more than $30 billion artificial intelligence investment fund with Microsoft to build data centers and energy projects to meet growing demands stemming from AI ...

  30. California governor signs legislation to protect entertainers from AI

    California Governor Gavin Newsom signed two bills into law on Tuesday that aim to help actors and performers protect their digital replicas in audio and visual productions from artificial ...