
School Climate

45 Survey Questions to Understand Student Engagement in Online Learning

Nick Woolf

In our work with K-12 school districts during the COVID-19 pandemic, countless district leaders and school administrators have told us how challenging it's been to build student engagement outside of the traditional classroom.

Not only that, but the challenges associated with online learning may have the largest impact on students from marginalized communities. Research suggests that some groups of students experience more difficulty with academic performance and engagement when course content is delivered online vs. face-to-face.

As you look to improve the online learning experience for students, take a moment to understand how students, caregivers, and staff are currently experiencing virtual learning. Where are the areas for improvement? How supported do students feel in their online coursework? Do teachers feel equipped to support students through synchronous and asynchronous facilitation? How confident do families feel in supporting their children at home?

Below, we've compiled a bank of 45 questions to understand student engagement in online learning. Interested in running a student, family, or staff engagement survey? Click here to learn about Panorama's survey analytics platform for K-12 school districts.

Download Toolkit: 9 Virtual Learning Resources to Engage Students, Families, and Staff

45 Questions to Understand Student Engagement in Online Learning

For Students (Grades 3-5 and 6-12):

1. How excited are you about going to your classes?

2. How often do you get so focused on activities in your classes that you lose track of time?

3. In your classes, how eager are you to participate?

4. When you are not in school, how often do you talk about ideas from your classes?

5. Overall, how interested are you in your classes?

6. What are the most engaging activities that happen in this class?

7. Which aspects of class have you found least engaging?

8. If you were teaching class, what is the one thing you would do to make it more engaging for all students?

9. How do you know when you are feeling engaged in class?

10. What projects/assignments/activities do you find most engaging in this class?

11. What does this teacher do to make this class engaging?

12. How much effort are you putting into your classes right now?

13. How difficult or easy is it for you to try hard on your schoolwork right now?

14. How difficult or easy is it for you to stay focused on your schoolwork right now?

15. If you have missed in-person school recently, why did you miss school?

16. If you have missed online classes recently, why did you miss class?

17. How would you like to be learning right now?

18. How happy are you with the amount of time you spend speaking with your teacher?

19. How difficult or easy is it to use the distance learning technology (computer, tablet, video calls, learning applications, etc.)?

20. What do you like about school right now?

21. What do you not like about school right now?

22. When you have online schoolwork, how often do you have the technology (laptop, tablet, computer, etc) you need?

23. How difficult or easy is it for you to connect to the internet to access your schoolwork?

24. What has been the hardest part about completing your schoolwork?

25. How happy are you with how much time you spend in specials or enrichment (art, music, PE, etc.)?

26. Are you getting all the help you need with your schoolwork right now?

27. How sure are you that you can do well in school right now?

28. Are there adults at your school you can go to for help if you need it right now?

29. If you are participating in distance learning, how often do you hear from your teachers individually?

For Families, Parents, and Caregivers:

30. How satisfied are you with the way learning is structured at your child’s school right now?

31. Do you think your child should spend less or more time learning in person at school right now?

32. How difficult or easy is it for your child to use the distance learning tools (video calls, learning applications, etc.)?

33. How confident are you in your ability to support your child's education during distance learning?

34. How confident are you that teachers can motivate students to learn in the current model?

35. What is working well with your child’s education that you would like to see continued?

36. What is challenging with your child’s education that you would like to see improved?

37. Does your child have their own tablet, laptop, or computer available for schoolwork when they need it?

38. What best describes your child's typical internet access?

39. Is there anything else you would like us to know about your family’s needs at this time?

For Teachers and Staff:

40. In the past week, how many of your students regularly participated in your virtual classes?

41. In the past week, how engaged have students been in your virtual classes?

42. In the past week, how engaged have students been in your in-person classes?

43. Is there anything else you would like to share about student engagement at this time?

44. What is working well with the current learning model that you would like to see continued?

45. What is challenging about the current learning model that you would like to see improved?

Elevate Student, Family, and Staff Voices This Year With Panorama

Schools and districts can use Panorama’s leading survey administration and analytics platform to quickly gather and take action on information from students, families, teachers, and staff. The questions are applicable to all types of K-12 school settings and grade levels, as well as to communities serving students from a range of socioeconomic backgrounds.


In the Panorama platform, educators can view and disaggregate results by topic, question, demographic group, grade level, school, and more to inform priority areas and action plans. Districts may use the data to improve teaching and learning models, build stronger academic and social-emotional support systems, improve stakeholder communication, and inform staff professional development.
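As a rough sketch of what disaggregating results by group looks like in practice (using invented data and field names, not Panorama's actual schema or API), the same grouping logic can be expressed in a few lines of Python:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical responses: (grade_level, score on a 1-5 engagement item)
responses = [
    ("grade_3", 4), ("grade_3", 5), ("grade_4", 3),
    ("grade_4", 2), ("grade_5", 4), ("grade_5", 5),
]

# Group scores by grade level, then average each group
by_grade = defaultdict(list)
for grade, score in responses:
    by_grade[grade].append(score)

averages = {grade: mean(scores) for grade, scores in by_grade.items()}
print(averages)  # {'grade_3': 4.5, 'grade_4': 2.5, 'grade_5': 4.5}
```

The same pattern extends to any grouping key (school, demographic group, survey topic): collect per-group lists, then summarize each one.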

To learn more about Panorama's survey platform, get in touch with our team.

Related Articles

44 Questions to Ask Students, Families, and Staff During the Pandemic

Identify ways to support students, families, and staff in your school district during the pandemic with these 44 questions.

Engaging Your School Community in Survey Results (Q&A Ep. 4)

Learn how to engage principals, staff, families, and students in the survey results when running a stakeholder feedback program around school climate.

Strategies to Promote Positive Student-Teacher Relationships

Explore four strategies for building strong student-teacher relationships in your school.



80+ Remote Learning Survey Questions for Students, Teachers, and Parents


Pragadeesh Natarajan

Last Updated: 23 May 2024


Table Of Contents

  • Distance learning survey questions for Students
  • Distance learning survey questions for Teachers
  • Distance learning survey questions for Parents

Are you a school or university that’s transitioned to remote learning (or distance learning) during the COVID-19 pandemic? Looking to measure the effectiveness and experience of remote education? Remote learning surveys can help! Remote learning survey questions help you improve student engagement and understand the challenges associated with remote learning. For example, an institution may want to adjust class schedules around students’ home situations, or add case studies and simulations that learners can solve as a team. A survey is a great way to design an effective remote learning program.

In this article, we’ve put together a list of the 80 best remote learning survey questions you can ask students, parents, and teachers to optimize and design effective learning experiences.

Here’s everything we’ll cover:

  • 47 Remote Learning Survey Questions for Students
  • 27 Remote Learning Survey Questions for Teachers
  • 13 Remote Learning Survey Questions for Parents

Before we dive into the questions, what if I told you I'm here to make your job easier? If you are looking for a ready-made questionnaire, here is a full-fledged remote learning survey that asks the right questions to get the right feedback from students.

Feel free to use this template to collect critical feedback from your students. You can also customize it according to your brand identity and send it as your branded survey. Sign up and check it for free.

Try our remote learning survey to test the conversational experience!


Now, off to remote learning survey questions…

47 Remote Learning Survey Questions for Students

Learn about your students’ challenges and the effectiveness of your remote learning programs and resources with our list of the best remote learning survey questions for students:

  • On a scale of 1 to 10, rate your overall remote learning experience.
  • How stressful is remote learning for you during the Covid-19 pandemic?
  • Is this remote learning program working for you?
  • Do you enjoy learning remotely?
  • How peaceful is the environment at home while learning remotely?
  • Are you able to keep up with the number of hours you committed to each week?
  • How well could you manage your time while learning remotely?
  • How well is the online curriculum working for you?
  • Are you satisfied with the technology and software you are using for remote learning?
  • How important is face-to-face communication for you while learning remotely?
  • How often do you talk to your {school/university name} classmates?
  • Do you have access to a device for learning online?
  • How often do you have 1-1 discussions with your teachers?
  • How helpful are your teachers while learning online?
  • What type of device do you use for remote learning? (smartphone, desktop, tablet, etc.)
  • How much time do you spend each day on remote learning?
  • How effective has remote learning been for you?
  • Why are you using remote learning?
  • Are there any challenges that might prevent you from continuing with remote learning?
  • How often do you hear from your teachers when learning remotely?
  • Are there teachers you can go to for help if you need it?
  • How helpful has {school or university name} been in providing you with the resources to learn from home?
  • How sure are you that you can do well?
  • Are you getting all the help you need with your coursework?
  • What has been the hardest part about completing your coursework?
  • How difficult or easy is it for you to connect to the internet to access your coursework?
  • When you have your online classes, how often do you have the technology (laptop, tablet, etc) you need?
  • What do you not like about your remote learning classes?
  • What do you like about your remote learning classes?
  • How difficult or easy is it to use remote learning technology (computer, video conferencing tools, online learning software, etc.)?
  • How difficult (or easy) is it to stay focused on your coursework?
  • What does this teacher do to make this class engaging?
  • How much effort are you putting into your online classes?
  • How difficult (or easy) is it to try hard on your coursework?
  • What projects or activities do you find the most engaging in this class?
  • How do you know when you are engaged in your online classes?
  • If you were teaching an online class yourself, what is the one thing you would do to make it more engaging?
  • Which aspects of your online class have you found the least engaging?
  • What are the most engaging activities that happen in this class?
  • How often are you so focused in your online classes that you lose track of time?
  • How eager are you to participate in your online classes?
  • If you have missed any online classes recently, why did you miss them?
  • How excited are you about attending your online classes?
  • Overall, how interested are you in your online classes?
  • How else would you like to be learning?
  • How happy are you with the amount of time you spend speaking with your teacher?
  • Do you have any suggestions for us? Anything you would like to see offered or done differently?


27 Remote Learning Survey Questions for Teachers

To help your teachers give their best and succeed in remote learning, here are the top remote survey questions for teachers:

  • How stressful do you find teaching remotely during the pandemic?
  • How stressed are your students while learning remotely during the pandemic?
  • Are you enjoying teaching remotely?
  • How well could you maintain a work-life balance while teaching remotely?
  • How was your experience teaching your students from home as compared to teaching them at school?
  • Approximately how long has your work taken you each day?
  • How challenging has the work been for you?
  • Do you have access to a device for online teaching?
  • How many of your students regularly participated in your online classes in the past few weeks?
  • Do you have high-speed internet at home?
  • How helpful has {school or university name} been in offering you the resources to teach from home?
  • What device do you use for online teaching?
  • Are you satisfied with the technology and software you are using for online teaching?
  • How is {school or university name} delivering remote learning?
  • What kind of response have you received from your students so far?
  • How helpful have your coworkers been while teaching online?
  • What specific task have you found the most challenging?
  • How ideal is your home environment for teaching remotely?
  • Are your students learning better after switching to remote learning?
  • How often do you have 1-1 discussions with your students?
  • How helpful have parents been while supporting their children’s remote learning?
  • Is there anything you would like to share about student engagement?
  • How important is face-to-face communication for you while teaching remotely?
  • How engaged have students been in your online classes in the past few weeks?
  • What types of tasks have you found the most interesting and enjoyable?
  • How can {school or university name} support you further?
  • Do you have any suggestions to help improve the whole process of working from home?


13 Remote Learning Survey Questions for Parents

Measure parents’ and caregivers’ satisfaction with your online learning programs and more with our list of remote learning survey questions for parents:

  • Do all the members of your family work?
  • How soon would you like your child to return to in-person learning full-time?
  • How satisfied are you with the software and platforms used for remote learning?
  • What more can {school or university name} do to improve your child’s remote learning initiatives?
  • How concerned are you about your child’s social-emotional health and development?
  • How difficult or easy is it for your child to use remote learning tools and platforms?
  • Are you confident your child will make sufficient progress through remote learning?
  • How satisfied are you with the way your child’s course has been structured and delivered?
  • On a scale of 1 to 10, how do you rate the communication between students and teachers?
  • How confident are you in your ability to support your child’s remote education?
  • Does your child have the necessary tools available for coursework?
  • How confident are you that teachers can motivate students to learn effectively?
  • Is there anything you would like us to know about your family’s needs or preferences?

Final thoughts

Remote or distance learning surveys can help provide you with all the insights you need to make necessary adjustments. The above questions will help you quickly gather and take action on feedback from students, teachers, and parents.

If you’re looking to create pleasant experiences and get more responses from your surveys, take the conversational way and give SurveySparrow a whirl today!

Have you got any questions on creating remote learning surveys? Got any tips or hacks for conducting effective distance learning surveys? Let us know in the comment section below.

Looking for a survey platform that makes it easy and effective to conduct remote learning surveys? Wondering whether SurveySparrow is the right fit for conducting distance learning surveys? Reach out to us for a free, personalized demo!


I'm a developer turned marketer, working as a Product Marketer at SurveySparrow — A survey tool that lets anyone create beautiful, conversational surveys people love to answer.



How to make a questionnaire regarding the impact of eLearning?

Here’s what this guide covers:

  • Consequences of eLearning
  • Difference between eLearning, remote learning, and distance learning
  • eLearning surveys to evaluate impact
  • Things to consider before creating an eLearning questionnaire
  • A step-by-step guide to making an eLearning survey
  • Teacher eLearning survey questions
  • eLearning survey questions for students

eLearning has become increasingly prevalent in recent years, thanks to its providing access to education regardless of location or physical accessibility issues. The term refers to using an electronic device and an Internet connection to distribute learning opportunities and provide educational content.

It is essential to better understand the impact of eLearning on current generations and its role in the future. This article explores what people think about eLearning, investigates how it affects their lives, and shows how to make a questionnaire on the impact of eLearning that will stand out.

Continue reading to get answers about gauging the impact of eLearning, including inspiration from our examples of eLearning survey questions.

Consequences of eLearning

There are several reasons to discuss eLearning. First, it has the potential to expand access to education for people who cannot attend traditional in-person classes, like those living in remote or underdeveloped areas or people with disabilities.

eLearning can also be more cost-effective for students and institutions since it reduces the need for physical classrooms and materials. That also means that eLearning has a positive impact on the environment, given that it has the potential to reduce waste.

Additionally, eLearning can potentially improve the quality of education by providing access to a broader range of resources and experts. Borders and distance from a university will no longer bar individuals of talent from accessing materials.

To conclude, eLearning has the potential to increase access to education, improve the quality of education, and make teaching more convenient and cost-effective. It is a topic of ongoing research, development, and discussion as it continues evolving and shaping education.

Terms like eLearning, remote, and distance learning are often used interchangeably. However, there are nuances. eLearning refers to learning facilitated by Internet technologies.

However, the term “distance learning” is most appropriate when the student and educator are physically separated.

Remote learning is a more general term that encompasses all forms of education that are not conducted in a traditional classroom setting, including distance learning. Surveys about distance learning, remote learning, and eLearning are often used as synonyms, regardless of their subtle differences.

To sum up, eLearning typically emphasizes technology, while distance learning refers to any form of remote education. Both have become more prevalent in recent years due to the COVID-19 pandemic, which led to widespread school closures and a shift toward online learning.

If you are interested in measuring the impact of eLearning, the first step is to make a questionnaire. For example, you can ask about how often respondents use the technology to learn new information, whether they prefer traditional or digital education, and if they think eLearning is destined to become more popular. What they like and don’t like about eLearning can also be asked.

Once surveys have been collected, analyzing them will highlight the most common trends in the field.

If you need help analyzing the collected data, we’ve got you! Read our blog to find helpful tips and tricks that will ease this process.

Before diving into creating a survey, make sure to think through the following issues:

Define your target audience: Who should be surveyed? This will help narrow down the required research questions and determine whether a topic is relevant. For example, if you’re looking into how eLearning affects teenagers, survey teenagers rather than adults or children. Respondents with first-hand experience give more accurate answers than those with only impressions of it (as with parents who haven’t actually taken eLearning courses themselves).

Choose an appropriate format: There are many different research methodologies, such as face-to-face interviews, telephone interviews, written questionnaires/surveys, etc. Each has its advantages/disadvantages.

Do your research: No one likes monotonous and bland surveys. Help yourself by referring to our valuable resources that will help create surveys that will stand out from the crowd:

  • How to write good survey questions (with examples)
  • How to create an engaging survey
  • How to avoid biased questions

Here are eleven steps to follow when making an eLearning questionnaire:

  • Begin by defining objectives. What specific information are you trying to gather about the impact of eLearning?
  • Determine the target population for your questionnaire. Who will you be surveying and why?
  • Create questions that will help achieve your objectives. Be sure to include a mix of open-ended and closed-ended questions.
  • Have someone review the questionnaire to ensure the questions are straightforward and understandable.
  • Test the questionnaire with a small group to identify issues or ambiguities.
  • Consider the feedback from the test group and make any necessary changes.
  • Distribute the questionnaire to your target population.
  • Collect and analyze the data.
  • Draw conclusions about the impact of eLearning from the data.
  • Create a report summarising the research findings.
  • Provide recommendations for future eLearning initiatives.
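The collect-and-analyze steps above can be sketched in plain Python for a single closed-ended question; the question, options, and responses below are invented for illustration:

```python
from collections import Counter

# Hypothetical answers to one closed-ended question:
# "Do you prefer traditional or digital education?"
answers = [
    "digital", "traditional", "digital", "digital",
    "no preference", "traditional", "digital",
]

counts = Counter(answers)  # raw tallies per option
total = sum(counts.values())
shares = {opt: round(n / total * 100) for opt, n in counts.items()}

# Most common option first, as a simple "trend" summary for the report
for option, n in counts.most_common():
    print(f"{option}: {n} ({shares[option]}%)")
```

Open-ended answers need a different treatment (manual coding or text analysis), but tallies and percentage shares like these are usually the backbone of the written report in steps 9-10.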

The target group for eLearning surveys is usually someone from the online education system—administrators, teachers, professors, or students. So, let’s start with eLearning survey questions for teachers and educators.

  • How satisfied are you with the overall eLearning experience?
  • On a scale of 1 to 10, how successful have your classes been?
  • Do you have the opportunity to use innovative methods for improved learning outcomes using eLearning?
  • What type of learning methods do you prefer?
  • How often have you received an appraisal for your teaching?
  • How difficult was it to hold the attention of students in an online classroom?
  • On a scale of 1 to 10, what was the average attention level of students in the online classroom?
  • What are the biggest challenges for teachers when utilizing eLearning?
  • In your opinion, how could online lectures be more effective?
  • Is there anything else you’d like to share about your eLearning experience?

See more teacher survey examples here.

Student surveys are extremely important for understanding the educational “product” being offered. Here are some examples of eLearning survey questions to gather feedback from students:

  • How easily could you access the course materials and assignments?
  • How effective was the instructor at communicating and providing feedback through the eLearning platform?
  • How well did the eLearning format support your ability to learn and retain material?
  • Were there any technical issues that hindered your eLearning experience?
  • How much interaction did you have with classmates and instructors during the eLearning course?
  • How would you rate the overall quality of the course materials and resources provided in the eLearning format?
  • How would you rate the eLearning platform (e.g., Moodle, Blackboard) in terms of user-friendliness and functionality?
  • How would you rate the eLearning course compared to a traditional classroom setting?
  • Are there any suggestions you would make to improve the eLearning experience?

See more student survey examples here.

Following the steps above, while using our questions as an inspiration, will help you gather relevant data and develop proper conclusions regarding eLearning programming.

For more inspiration and tools, we encourage you to explore our free education survey questions, templates, and academic survey question examples.

The survey-making tools that SurveyPlanet offers can assist you further. We have hundreds of survey templates, pre-written questions, and multiple additional features. Sign up to create an unlimited number of questionnaires, the kinds that will help you gather valuable data and better understand the actual impacts of eLearning.

Photo by Compare Fibre on Unsplash


SurveyPoint

The Perfect Distance Learning Questionnaire for Students

  • Author Survey Point Team
  • Published January 1, 2023

Surveys on remote learning can help you increase student engagement and understand the difficulties involved. This article compiles a list of the top survey questions for students in online learning to help improve and create engaging learning experiences.

The COVID-19 pandemic altered education in previously unheard-of ways. Students had to switch entirely to online instruction, and some found it difficult to adapt to the “new normal.”


Questions on distance learning

This survey questionnaire will help gauge the degree to which students perceive their online education as helpful.

Start Distance Learning Questionnaire for Students

What standard are you in?

  • Pre-primary
  • 1st to 5th standard
  • 6th to 8th standard
  • 9th to 12th standard

How do you feel about online learning in general? 

  • Below average
  • Average 

On average, how much time do you devote to distance learning each day?

  • 1-3 hours 
  • 3-5 hours 
  • 5-7 hours 
  • 7-10 hours 
  • 10+ hours 

Do you have a device at your disposal for online education? 

  • No, I have to share with others 

What kind of technology do you use for online classes? 

  • Laptop 

Do you have access to the internet at home? 

  • Yes, but it doesn’t function very well

How challenging or straightforward is it to use the technology for distance learning (computer, tablet, video chats, learning apps, etc.)? 

  • Very challenging 
  • Challenging 
  • Somewhat challenging 
  • Not challenging

How simple is it to use the tools your school offers for remote learning (ClassDojo, Google Classroom, etc.)?

  • Not easy at all.
  • Somewhat easy.

How satisfied are you with the devices and software you’re using for online learning?

  • Very satisfied
  • Unsatisfied
  • Very Unsatisfied

What kind of home/remote learning is offered to you by your school? 

  • Online
  • Printed materials
  • Both print and online resources

Do you like online classes? 

  • Yes, but I would like to change a few things

How calm is the atmosphere at home when you’re learning? 

  • Very chaotic

Have you received detailed instructions from your school on accessing the course materials? 

How supportive has your [School or University] been in providing the tools you need to learn at home? 

  • Not at all helpful
  • Slightly helpful
  • Moderately helpful
  • Very helpful
  • Extremely helpful

Are your teachers accessible to you if you need help? 

  • Somewhat 

How supportive are your teachers when you’re learning online?

  • Not supportive
  • Moderately supportive
  • Very supportive

What kind of support did your teachers give you for these assignments? 

  • Gave me sufficient guidance to finish my assignments. 
  • Gave me some guidance, but I need more to do my assignments. 
  • Did not provide me with adequate instructions to do my assignments. 
  • I have yet to receive any homework assignments. 

How frequently do your teachers contact you one-on-one in your distance learning? 

  • All the time
  • As per need
  • Never 

Does the feedback from your teacher help you with your remote or at-home learning?

Do your teachers give you various opportunities to show what you’ve learned? 

Do you do your schoolwork as promptly as you would in a classroom? 

How much do you learn through distance learning compared to in a classroom?

  • Much less learning 
  • Somewhat less learning 
  • Learning the same thing. 
  • Learning more.

How comfortable are you with online learning? 

  • Not at all comfortable
  • Somewhat comfortable
  • Comfortable
  • Very comfortable

How crucial is face-to-face interaction to your learning while you are studying online? 

  • Very important
  • Somewhat important
  • Not at all important

Do you learn as much online as you would in a traditional classroom setting? 

Describe the online learning activities you have taken part in. Please check all that apply. 

  • Joined the class on a video call
  • Joined a video conference for teacher-to-student interaction
  • Joined a phone call about learning
  • Posted assignments on the online platform
  • Completed tasks on paper
  • Received a notification about an assignment
  • Read a book
  • Made a project or artwork
  • Watched movies and events online
  • Participated in online activities like yoga or choir

What online learning activities are the most interesting?

How frequently does an adult assist you with your homework at home? 

  • Almost all the time

What element of finishing your schoolwork has been the hardest? 

Do you believe that even while you’re at home, you’re still learning new things?

  • Strongly agree
  • Strongly disagree

Do you favor traditional, hybrid, or online learning? 

  • Traditional classes
  • Hybrid classes (mix of traditional and online)
  • Online classes

On a scale of 1 to 10, how effectively can you manage your time when taking online courses? 

How stressful is it to study remotely in light of the COVID-19 pandemic? 

  • Very stressful
  • Not stressful
  • Can’t say

How effective have you found distance learning to be? 

  • Not at all effective
  • Slightly effective
  • Moderately effective
  • Very effective
  • Extremely effective

What further assistance or support could your school provide for your distance learning?

Surveys about remote or distance learning can give you the information you need to make the right adjustments. Schools must understand how students feel about online learning and gain deeper insight into their experiences. Distance learning offers less personal contact with teachers and presents its own unique problems, and some pupils need more time and effort than others to understand a subject. Surveying students about remote learning provides the necessary data.

These survey questions will help you understand students’ educational experiences with distance learning. They gather feedback on the student’s entire experience with online education and make assessing the efficiency and quality of distance learning more manageable. Once administrators obtain the results, they can determine what students enjoy about the current structure and what they want to change.
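As a rough sketch of how a school team might summarize responses to one of the multiple-choice questions above (the response data and option labels here are hypothetical):

```python
from collections import Counter

# Hypothetical answers to "Do you favor traditional, hybrid, or online learning?"
responses = [
    "Traditional classes", "Online classes", "Hybrid classes",
    "Online classes", "Hybrid classes", "Hybrid classes",
    "Traditional classes", "Online classes",
]

def summarize(answers):
    """Return each option's share of responses as a percentage."""
    counts = Counter(answers)
    total = len(answers)
    return {option: round(100 * n / total, 1) for option, n in counts.items()}

print(summarize(responses))
# e.g. {'Traditional classes': 25.0, 'Online classes': 37.5, 'Hybrid classes': 37.5}
```

The same tally can be run per grade level or school to see where preferences differ.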

Survey Point Team


Online Learning Survey

An online learning survey is a questionnaire used by e-learning companies to find out how students feel about their experience in e-learning courses.


Whether you’re an e-learning company, a student learning online, or a parent using online courses for your children, our free online learning survey will help you get the information you need! Just customize our online learning survey template with your logo or colors; add images, videos, or links to your e-learning platform; and embed it on your website or share it with a link. Then start collecting responses from students, whether they’re right in front of their computer or using their mobile phone.

Make this survey your own by adding form fields of your choice, integrating with your storage platform, or using Jotform’s 100+ powerful integrations. You can analyze survey results with Jotform Tables or Jotform Report Builder, and collect responses on the go with our free mobile app, Jotform Mobile Forms. We’d love to help you get the information you need from students.

Event Satisfaction Survey Form Template

Event Satisfaction Survey Form

If you want to improve your upcoming event, you can gather suggestions from participants using this event satisfaction survey template. This sample feedback form measures overall satisfaction across categories of event services: location, content, price, speakers, and organization. If you’d rather create your own survey from scratch, you can get started with the survey maker now for free!

Online Interview Questionnaire Form Template

Online Interview Questionnaire Form

An online interview questionnaire form is used by organizations to help them get important information from their interviewees. Whether you’re an insurance company, a hospital, or a company that hires for various roles, use this form to gather more information from job applicants. This Online Interview Questionnaire Form is also an additional way to confirm the information you have received from your applicants. Just customize the template, embed the form on your website, and watch as applicants send you the information you need. The form lets you select the responses you need and then convert those responses into easily downloadable or printable PDFs. If you’d like to automate the questions or the responses, you can use Jotform’s integrations with 100+ other powerful apps, including Google Drive, Dropbox, and Slack. You can also personalize your online interview questionnaire form to fit your business’s branding: add your logo, change the background, or add more form fields to your liking. You can even send questions to your own CRM automatically if you use Jotform Report Builder! Make the most of your job applications with a free online interview questionnaire form.

Market Research Survey Form Template

Market Research Survey

A market research survey is a questionnaire used by companies to collect information about their customers and their market. Here’s a quick market research form that focuses on a few demographic statistics: respondents’ age, gender, household income, and educational attainment. The multiple-choice format gives respondents an easy way to complete it in a few minutes. Use this market research template and put the collected submission data to your advantage. If you want to create your survey from scratch, you can get started with the survey maker now! Our free Market Research Survey template is easy to customize, fast to set up, and perfect for collecting info from your customers and fans. Simply add your company logo, change the text and colors, and you’re on your way! If you’d like to embed this survey on your website, you can sync it to a website builder. If you’re looking to collect more information than the template provides, use our Form Builder to add more fields. Collect all of the data you need to make decisions with a free Market Research Survey.

Employee Satisfaction Survey Form Template

Employee Satisfaction Survey

An employee satisfaction survey is used by managers or HR professionals to get a better understanding of how their employees view their work environment. With Jotform’s free Employee Satisfaction Survey, you can collect survey responses online from any device! Just customize the form template, embed it in your employee website or share it with a link, and view responses in your secure Jotform account. You can then auto-generate detailed reports using Jotform Report Builder or convert each submission into a PDF automatically. This template already includes questions to help you get to know your employees, but if you’d like to add more questions, upload your logo, or change the survey design, you can do it in a few easy clicks with no coding using Jotform Form Builder. Feel free to send submissions to other accounts automatically with 100+ free integrations, including Airtable, Google Drive, Dropbox, Trello, Slack, and more. Save time by collecting employee satisfaction surveys online with a free Employee Satisfaction Survey from Jotform, or create a new survey from scratch!

Restaurant Evaluation Form Template

Restaurant Evaluation Form

Customer satisfaction is important for every business, and to measure it you need to survey your customers. This restaurant survey form is designed for that purpose. This restaurant evaluation form lets your customers rate the quality of your services, including food quality, overall service quality, cleanliness, order accuracy, speed of service, and more. To deliver the highest level of service, this restaurant review form will help you easily understand your customers and their tastes based on their feedback. So if you own a restaurant and want the quickest, most hassle-free way to collect feedback, this free restaurant review template is all you need! If you need a brand new one, you can make the perfect survey in just a few minutes!

Patient Feedback Form Template

Patient Feedback Form

A patient feedback form is a survey that allows medical doctors to gather feedback from patients regarding their overall experience with the clinic. Get patient feedback with this online feedback form and improve your service. Want to start from scratch? Get started with Jotform’s easy-to-use Survey Creator now! Whether you’re a medical professional or work with a patient’s medical practice, use this free Patient Feedback Form template to gather feedback online. Just customize the questions to match your practice, embed the form on your website or share it with a link, and get results quickly. Being the owner of your own practice is an incredible feeling, but it’s also tough to juggle the variety of tasks involved. Luckily, Jotform’s 100+ integrations can help with that! You can sync form submissions with your storage service of choice, giving you more time to spend with your clientele. You can also use our powerful Form Builder to customize the form according to how you want to collect feedback from your patients. With Jotform’s free online Patient Feedback Form, you can collect information from patients that you might not have been able to before.


Online Slam Book Form Template

Online Slam Book Form

Want an online slam book format for friends? Create your own with Jotform!

Student Survey Form Template

Student Survey

Find out what students think about topics like curriculum, materials, and facilities with Student Survey.

Exit Interview Form Template

Exit Interview Form

HR departments can use this free Exit Interview Form to conduct exit interviews online. Customize the form and share via email to quickly collect employee feedback.

Product Survey Form Template

Product Survey Form

A product feedback form is a good way to gauge how well (or how badly) you’re doing as a company. With this product survey form sample, a variety of commonly asked questions are readily available for you to use. This product survey asks respondents how long they have been using your products or services, their impression of how you compare with competitors, their satisfaction with the products or services you offer, and a few more questions related to their overall experience.

Customer Feedback Survey Form Template

Customer Feedback Survey

A great tool to capture customers’ concerns about the products and services they use.

Follow Up Survey Form Template

Follow Up Survey

A follow up survey is a customer feedback survey that allows customers to review a company or individual. Easy to use. No coding.

Support Satisfaction Survey Form Template

Support Satisfaction Survey

A support satisfaction survey is used by companies to collect feedback about their customer support services.

Quiz Form With A Calculated Number Of Correct Answers Form Template

Quiz Form With A Calculated Number Of Correct Answers

Calculate a number of correct answers with a Form Calculation Widget, and show that number on the form's Thank You page.

Past Crushes Survey Form Template

Past Crushes Survey

An online past crushes survey is a questionnaire used by students to collect information about previous relationships.

Website Survey Form Template

Website Survey

A website survey is used to collect information about websites, users, or the website itself.

Demographic Survey Form Template

Demographic Survey

Here is a simple demographic survey template that you can use to analyze your market or conduct other research. With this demographic form, you can gather the gender, age, education, household income, and interests of respondents. Use this template to start your survey now, or simply make your own online survey from scratch!

New Product Survey Form Template

New Product Survey

A new product survey is a tool used by businesses to collect customer feedback about a new product.

COVID 19 Vaccine Survey Form Template

COVID 19 Vaccine Survey

Get to know how people feel about the new COVID-19 vaccine with a custom online survey. Easy to personalize, embed, and share. Option for HIPAA friendly features.

Evaluation Survey Form Template

Evaluation Survey Form

An evaluation survey form is a form template designed to collect information from students about their experience at the school, the quality of the education, and any suggestions for improvement.

Political Poll Form Template

Political Poll

Get a full scale political poll from the visitors and determine what the country thinks of the current politics.

Instructor Evaluation Form Template

Instructor Evaluation Form

An Instructor Evaluation Survey is a feedback form used by teachers to evaluate the performance of an instructor.

My Favorite Things Questionnaire Form Template

My Favorite Things Questionnaire

A My Favorite Things Questionnaire is a form template designed to ask students about their favorite movie, favorite place to go, food, person, game, biggest fear, and greatest hope.

Online Shopping Survey Form Template

Online Shopping Survey

An online shopping survey is a questionnaire used by online stores to collect feedback from their customers. Whether you run a book, magazine, clothing, or furniture store, use this free Online Shopping Survey!

Product Surveys

Product Customer Feedback Form  Form Template

Product Customer Feedback Form 

A Product Customer Feedback Survey is a customer feedback survey that allows clients to review a company's products and services.

Cancellation Survey Form Template

Cancellation Survey

A cancellation survey is a questionnaire used to determine the reasons why customers cancel their service. Fully customizable and free.

Voice Of The Customer Survey Form Template

Voice Of The Customer Survey

Get important customer feedback online. Easy to customize and embed with no coding. Great for small businesses. Collect and view responses on any device.

Technology Surveys

Technology Survey For Remote Learning Form Template

Technology Survey For Remote Learning

Let students rate their remote learning experience with a free online Technology Survey for Remote Learning. Easy to customize and share. Fill out on any device.

Software Survey Form Template

Software Survey Form

A software survey is a questionnaire used by a software company to collect feedback from its users. If you work in software, use our free Software Survey Form to talk to your customers and find out more about how they use your product!

IT Satisfaction Survey Form Template

IT Satisfaction Survey

Let's measure how satisfied your customers are with the IT service you provide with the IT Satisfaction Survey. No code required!

Healthcare Surveys

Health Survey Form Template

Health Survey

A Health Survey is a form template designed to collect medical information from patients and log their anamnesis.

Mental Health Survey Form Template

Mental Health Survey

Conduct mental health assessments with this free survey template for businesses, schools, and more. Easy to customize and fill from any device. No coding.

Patient Health Questionnaire And Generalized Anxiety Disorder Questionnaire Form Template

Patient Health Questionnaire And Generalized Anxiety Disorder Questionnaire

A free Patient Health Questionnaire and Generalized Anxiety Disorder questionnaire is an excellent tool for getting everything you need in one convenient place! Accessible from any mobile device. Fully customizable.

Classroom Observation Survey Form Template

Classroom Observation Survey

Does your school accommodate external reviews by conducting class observations? Observation survey forms help reviewers share their feedback once they’re done. This classroom observation template asks the panel which teachers and classes they observed, the grade level, how the environment was throughout the session, and their overall impression of the class’s knowledge, skills, behavior, and management. Use this observation survey template to support your teachers and students alike.

Teacher Satisfaction Survey Form Template

Teacher Satisfaction Survey

Keep teachers happy by attending to their needs and listening to their feedback with this Teacher Satisfaction Survey. This form template contains all the questions required when building such a survey.

Student Interest Survey Form Template

Student Interest Survey

Encourage students to enjoy the school year by getting them interested in school activities and class lessons. To identify their expectations, have them fill out this Student Interest Survey form.

Business Surveys

Customer Satisfaction Survey Form Template

Customer Satisfaction Survey Form

Get to know your customers with a free online Client Satisfaction Survey. Easy to customize, share, and embed. Analyze results to improve your business.

Employee Motivation Survey Form Template

Employee Motivation Survey

Conduct motivation self-assessments on any device with an online Employee Motivation Survey. Free to customize and share. Analyze results to improve your business.

Business Demographic Survey Form Template

Business Demographic Survey

A business demographic survey is a survey that captures information about the demographics of a business and its customers. Fully customizable and free.

About Survey Templates

Surveys are the perfect way to gauge customer, employee, or even just public opinion about your brand. Get started the easy way: select a free online survey template from Jotform. We have all the survey and reporting tools to find and collect helpful data. It's perfect when you need to understand customer demographics, or when you need to conduct a market research survey. Select from one of our pre-made sample survey forms or make your own survey from scratch in just minutes. Once you have selected a survey template, use the Jotform builder to design, format and customize your survey form. Try one of our free online survey form templates today!

You can also check out our ready-to-use questionnaire templates, prepared for a variety of use cases, which allow you to customize your online questionnaire with our drag-and-drop Form Builder.

Frequently Asked Questions

1) What are survey templates?

Survey templates are ready-made sample surveys with built-in questions. Instead of creating a survey from scratch, you can use a template as a starting point, then customize it with your own specific questions, branding elements, and more.

2) Where can I find survey templates?

You can find many survey templates online for free! Jotform’s survey templates are all available in our form templates library. Not only are they free, but they’re totally customizable with our no-code, drag-and-drop form builder, so you don’t have to switch between platforms to find a survey template, customize it, and share it with your audience!

3) What types of survey templates are available?

Jotform offers a wide variety of survey templates for different industries and use cases, including surveys for HR, marketing, products, customer satisfaction, and more. Plus, we have a great selection of employee and education surveys, with everything from polls and assessment forms to motivation and feedback forms.

4) Are survey templates customizable?

Absolutely! As mentioned above, Jotform’s drag-and-drop form builder is the ultimate tool to customize your survey templates in a matter of minutes. You can adjust fonts and colors, drag and drop form fields, drop in logos and images, and so much more. Plus, feel free to sync with our 100-plus integrations — like Google Drive or Dropbox — to add extra functionality to your survey or include it in your existing workflows.

5) How can I use a survey template?

To use a survey template, just select the template you want to use and edit it according to your needs. With Jotform, simply navigate to our Survey Templates page and choose the right one for you. Click Use Template, then use our drag-and-drop Form Builder to customize it and share it with others, all from one place. Responses will automatically be sent to your Jotform Inbox, and you can analyze your data in Jotform Tables and Jotform Report Builder.

6) Can I create my own survey template?

Yes, you can build your own survey from scratch. Navigate to Jotform’s Form Builder, click Create Form, then Start from Scratch. From there, you can add questions and widgets to create the perfect survey template to use as needed.

7) Are survey templates free to use?

Depending on where you source them, some survey templates are free, and some you’ll need to pay for. Jotform’s 1,000-plus survey templates will always be 100 percent free. You’ll just need to upgrade to a paid plan to increase the limit on the number of submissions you receive.

8) How many questions are typically included in a survey template?

Make sure your respondents don’t get “survey fatigue” before they even start taking your survey! The tricky part about this question is that there’s no exact answer — it depends on a number of factors, like who your audience is, what information you need, and what your purpose is. In general, a survey should never take longer than 10 minutes, which means that 5–10 questions is usually a safe range.

9) How do I choose the right survey template for my needs?

Before you choose a survey template, you need to define your research objectives. This can be done by doing a deep dive into your product, service, or team and pinpointing specific topics you want to gather more data on. Once you’ve done this, you can search for survey templates related to your research objective and find one that most closely matches your needs.

10) Can survey templates be used for both online and offline surveys?

Sometimes, you need to be offline to conduct market research or surveys in the field — that’s why finding survey tools that work offline is a game-changer. Jotform Mobile Forms allows you to create, edit, and fill out forms from any location and on any device, and even collects data offline to be populated later once you connect back to the internet. Don’t let spotty cell reception stop you from collecting the survey data you need!

11) What are the benefits of using a survey template?

Using a survey template can cut down on the time you would otherwise spend building a survey from scratch, so that you can spend more time collecting responses and analyzing data to make better business decisions. On top of that, survey templates provide great inspiration for the kinds of questions you should be asking, as well as the format of those questions (multiple choice, yes/no, fill-in-the-blank, etc.).

12) Can survey templates be used for different industries and sectors?

Survey templates can be used across every industry and sector! Whether you’re collecting employee feedback, reaching out to customers about a potential new product, or simply just polling your friends for fun, there’s a survey template for you.

13) How do I modify a survey template to suit my specific requirements?

To modify a survey template to suit your specific needs, you can customize it with your own text, images, branding, and more. Jotform makes this easy with our drag-and-drop form builder. Simply replace our questions with your own, add new fields, switch up the survey colors and fonts, include your logo, and more. Then share it through email, a link, or a QR code, or embed it on your website.

Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19

  • Published: 21 April 2021
  • Volume 26, pages 6923–6947 (2021)


  • Ram Gopal,
  • Varsha Singh &
  • Arun Aggarwal (ORCID: orcid.org/0000-0003-3986-188X)

630k Accesses · 242 Citations · 25 Altmetric

The aim of the study is to identify the factors affecting students’ satisfaction and performance in online classes during the COVID-19 pandemic and to establish the relationships between these variables. The study is quantitative in nature, and the data were collected through an online survey of 544 respondents studying business management (B.B.A. or M.B.A.) or hotel management courses at Indian universities. Structural equation modeling was used to test the proposed hypotheses. The results show that the four independent factors used in the study, namely quality of instructor, course design, prompt feedback, and expectations of students, positively impact students’ satisfaction, and that students’ satisfaction in turn positively impacts students’ performance. For educational management, these four factors are essential for achieving a high level of satisfaction and performance in online courses. The study was conducted during the COVID-19 pandemic to examine the effect of online teaching on students’ performance.
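The mediation structure the abstract describes (four factors predicting satisfaction, which in turn predicts performance) can be sketched, not as the authors’ actual structural equation model, but as two ordinary least-squares steps on synthetic data. All data and coefficients below are illustrative assumptions, not the study’s estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 544  # sample size matching the study; the data themselves are synthetic

# Four predictors: instructor quality, course design, prompt feedback, expectations
X = rng.normal(size=(n, 4))
# Assumed "true" effects for the illustration
satisfaction = X @ np.array([0.4, 0.3, 0.2, 0.1]) + rng.normal(scale=0.5, size=n)
performance = 0.6 * satisfaction + rng.normal(scale=0.5, size=n)

def ols(X, y):
    """Least-squares coefficients with an intercept column prepended."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

b_sat = ols(X, satisfaction)                             # factors -> satisfaction
b_perf = ols(satisfaction.reshape(-1, 1), performance)   # satisfaction -> performance
print(b_sat.round(2), b_perf.round(2))
```

With a sample of this size, the recovered coefficients land close to the assumed effects, which is the intuition behind testing each path of the model; a real SEM additionally models latent constructs and measurement error.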


1 Introduction

Coronaviruses are a group of viruses that are the main root of illnesses involving cough, cold, sneezing, fever, and some respiratory symptoms (WHO, 2019). COVID-19 is a contagious disease, spreading very fast among human beings; it is caused by a new strain that originated in Wuhan, China, in December 2019. Coronaviruses circulate in animals, but some of these viruses can transmit between animals and humans (Perlman & McIntosh, 2020). As of March 28, 2020, according to the MoHFW, a total of 909 confirmed COVID-19 cases (862 Indians and 47 foreign nationals) had been reported in India (Centers for Disease Control and Prevention, 2020). At that time, no vaccine or medicine had been officially approved to curb the spread of COVID-19 (Yu et al., 2020). The influence of the COVID-19 pandemic on the education system led to widespread closures of schools and colleges worldwide. On March 24, India declared a country-wide lockdown of schools and colleges (NDTV, 2020) to prevent transmission of the coronavirus among students (Bayham & Fenichel, 2020). School closures in response to the COVID-19 pandemic have shed light on several issues affecting access to education: with COVID-19 soaring, huge numbers of children, adults, and youths could not attend schools and colleges (UNESCO, 2020). Lah and Botelho (2012) contended that the effect of school closures on students’ performance is unclear.

Similarly, school closures may also affect students through the disruption of teacher and student networks, leading to poor performance. Bridge (2020) reported that schools and colleges are moving towards educational technologies for student learning to avoid disruption during the pandemic. Hence, the present study’s objective is to develop and test a conceptual model of students’ satisfaction with online teaching during COVID-19, where both students and teachers have no option but to use the online platform for uninterrupted learning and teaching.

UNESCO recommends distance learning programs and open educational applications during school closures caused by COVID-19, so that schools and teachers can continue to teach their pupils and limit the interruption of education. Many institutes have therefore opted for online classes (Shehzadi et al., 2020).

As a versatile platform for learning and teaching processes, the e-learning framework has been increasingly used (Salloum & Shaalan, 2018). E-learning is defined as a new paradigm of online learning based on information technology (Moore et al., 2011). In contrast to traditional learning, academics, educators, and other practitioners are eager to know how e-learning can produce better outcomes and academic achievements. The answer can be sought only by analyzing student satisfaction and performance.

Many comparative studies have been carried out to explore whether face-to-face or traditional teaching methods are more productive, or whether online or hybrid learning is better (Lockman & Schirmer, 2020; Pei & Wu, 2019; González-Gómez et al., 2016). These studies show that students perform much better in online learning than in traditional learning. Henriksen et al. (2020) highlighted the problems faced by educators while shifting from offline to online modes of teaching. In the past, several research studies were carried out on online learning to explore student satisfaction, acceptance of e-learning, distance learning success factors, and learning efficiency (Sher, 2009; Lee, 2014; Yen et al., 2018). However, scant literature is available on the factors that affect students’ satisfaction and performance in online classes during the COVID-19 pandemic (Rajabalee & Santally, 2020). In the present study, the authors propose that course design, quality of the instructor, prompt feedback, and students’ expectations are the four prominent determinants of students’ learning outcomes and satisfaction during online classes (Lee, 2014).

Course design refers to curriculum knowledge, program organization, instructional goals, and course structure (Wright, 2003). A well-planned course design increases pupils' satisfaction with the system (Almaiah & Alyoussef, 2019). Mtebe and Raisamo (2014) proposed that effective course design improves performance through learners' knowledge and skills (Khan & Yildiz, 2020; Mohammed et al., 2020). If a course is not designed effectively, it can lead to low usage of e-learning platforms by teachers and students (Almaiah & Almulhem, 2018); if it is designed effectively, it leads to higher acceptance of the e-learning system by students and to better performance (Mtebe & Raisamo, 2014). Hence, to prepare courses for online learning, many instructors teaching blended courses for the first time are likely to require a complete overhaul of their courses (Bersin, 2004; Ho et al., 2006).

The second factor, instructor quality, plays an essential role in students' satisfaction with online classes. Instructor quality refers to a professional who understands students' educational needs, has unique teaching skills, and knows how to meet students' learning needs (Luekens et al., 2004). Marsh (1987) developed several instruments for measuring instructor quality, the main one being the Students' Evaluation of Educational Quality (SEEQ). SEEQ is one of the most widely used and broadly endorsed of these methods (Grammatikopoulos et al., 2014) and has proved a very useful form of student feedback for measuring instructor quality (Marsh, 1987).

The third factor that improves students' satisfaction is prompt feedback (Kinicki et al., 2004). Feedback is the information given by lecturers and tutors about the performance of students; within this context, feedback is a "consequence of performance" (Hattie & Timperley, 2007, p. 81). In education, "prompt feedback can be described as knowing what you know and what you do not related to learning" (Simsek et al., 2017, p. 334). Christensen (2014) studied the link between feedback and performance and introduced the positivity-ratio concept, a mechanism that plays an important role in deriving performance from feedback. Prompt feedback helps develop a strong linkage between faculty and students, which ultimately leads to better learning outcomes (Simsek et al., 2017; Chang, 2011).

The fourth factor is students' expectations. Appleton-Knapp and Krentler (2006) measured the impact of students' expectations on their performance and pinpointed that student expectation matters: when students' expectations are met, their satisfaction is higher (Bates & Kaye, 2014). These findings were backed by the earlier Student Satisfaction Index Model (Zhang et al., 2008). Conversely, when students' expectations are not fulfilled, learning and satisfaction with the course may suffer. Student satisfaction is defined as students' ability to compare the desired benefit with the observed effect of a particular product or service (Budur et al., 2019). Students with high grade expectations tend to show higher satisfaction than those with lower grade expectations.

A scrutiny of the literature shows that although different researchers have examined the factors affecting student satisfaction, no study has examined the joint effect of course design, quality of the instructor, prompt feedback, and students' expectations on students' satisfaction with online classes during the COVID-19 pandemic. This study therefore explores the factors that affected students' satisfaction and performance in online classes during the pandemic. The pandemic compelled educational institutions to move online, a mode with which neither teachers nor learners were acquainted, and students were not mentally prepared for such a shift. This research therefore examines which factors affected students and how students perceived these changes, as reflected in their satisfaction levels.

This paper is structured as follows: the second section describes the theoretical framework and the linkages among the research variables, from which the research hypotheses are framed. The third section presents the research methodology. The outcomes of the empirical analysis are then discussed. Lastly, the paper concludes with a discussion and proposes implications for future studies.

2 Theoretical framework

Achievement goal theory (AGT) is commonly used to understand students' performance; it was proposed by four scholars, Carole Ames, Carol Dweck, Martin Maehr, and John Nicholls, in the late 1970s (Elliot, 2005). Elliott and Dweck (1988, p. 11) define an achievement goal as involving "a program of cognitive processes that have cognitive, affective and behavioral consequence". The theory suggests that students' motivation and achievement-related behaviors can be understood through the purposes and reasons they adopt while engaged in learning activities (Dweck & Leggett, 1988; Ames, 1992; Urdan, 1997). Several studies distinguish four approaches to achieving a goal: mastery-approach, mastery-avoidance, performance-approach, and performance-avoidance (Pintrich, 1999; Elliot & McGregor, 2001; Schwinger & Stiensmeier-Pelster, 2011; Hansen & Ringdal, 2018; Mouratidis et al., 2018). The environment also affects student performance (Ames & Archer, 1988). Traditionally, classroom teaching has been an effective way to pursue these goals (Ames & Archer, 1988; Ames, 1992; Clayton et al., 2010); in the modern era, however, internet-based teaching is also an effective way to deliver lectures, and web-based applications are becoming modern classrooms (Azlan et al., 2020). The following sections discuss the relationships between the independent and dependent variables (Fig. 1).

Fig. 1 Proposed Model

3 Hypotheses development

3.1 Quality of the instructor and satisfaction of the students

An instructor with great enthusiasm for students' learning has a positive impact on their satisfaction. Instructor quality is one of the most critical measures of student satisfaction, leading to the outcome of the education process (Munteanu et al., 2010; Arambewela & Hall, 2009; Ramsden, 1991). If the teacher delivers the course effectively and motivates students to do better in their studies, this leads to student satisfaction and enhances the learning process (Ladyshewsky, 2013). Furthermore, the instructor's understanding of learners' needs also fosters student satisfaction (Kauffman, 2015). Hence, the hypothesis that the quality of the instructor significantly affects student satisfaction was included in this study.

H1: The quality of the instructor positively affects the satisfaction of the students.

3.2 Course design and satisfaction of students

A course's technological design strongly influences students' learning and satisfaction through their course expectations (Liaw, 2008; Lin et al., 2008). An active course design yields more effective student outcomes than a traditional design (Black & Kassaye, 2014). Learning style is essential to effective course design (Wooldridge, 1995): when creating an online course design, it is essential to build an experience that works for students with different learning styles. Similarly, Jenkins (2015) highlighted that course-design attributes can be developed and employed to enhance student success. Hence, the hypothesis that course design significantly affects students' satisfaction was included in this study.

H2: Course design positively affects the satisfaction of students.

3.3 Prompt feedback and satisfaction of students

The emphasis here is on understanding the influence of prompt feedback on satisfaction. Feedback gives students information about the effectiveness of their performance (Chang, 2011; Grebennikov & Shah, 2013; Simsek et al., 2017). Prompt feedback enhances the student learning experience (Brownlee et al., 2009) and boosts satisfaction (O'donovan, 2017), and it serves as a self-evaluation tool by which students can improve their performance (Rogers, 1992). Eraut (2006) highlighted the impact of feedback on future practice and student learning development, and good feedback practice benefits both student learning and teachers' efforts to improve the learning experience (Yorke, 2003). Hence, the hypothesis that prompt feedback significantly affects satisfaction was included in this study.

H3: Prompt feedback positively affects the satisfaction of the students.

3.4 Expectations and satisfaction of students

Expectation is a crucial factor that directly influences student satisfaction. Expectation Disconfirmation Theory (EDT) (Oliver, 1980) has been used to determine satisfaction levels based on expectations (Schwarz & Zhu, 2015). Understanding students' expectations is an effective way to improve their satisfaction (Brown et al., 2014), and recognizing those expectations makes it possible to raise satisfaction levels (ICSB, 2015). Finally, the positive approach used in many online learning classes has been shown to place high expectations on learners (Gold, 2011) and has led to successful outcomes. Hence, the hypothesis that students' expectations significantly affect satisfaction was included in this study.

H4: Expectations of the students positively affect their satisfaction.

3.5 Satisfaction and performance of the students

Zeithaml (1988) describes satisfaction as the outcome of the performance of an educational institute. According to Kotler and Clarke (1986), satisfaction is the desired outcome of any aim that pleases the individual. Quality interactions between instructor and students lead to student satisfaction (Malik et al., 2010; Martínez-Argüelles et al., 2016), and teaching quality and course material enhance student satisfaction through successful outcomes (Sanderson, 1995). Satisfaction relates to student performance in terms of motivation, learning, assurance, and retention (Biner et al., 1996). Mensink and King (2020) described performance as the culmination of student-teacher efforts, reflecting students' interest in their studies. Students' academic performance is the critical element in education (Rono, 2013); it is the center pole around which the entire education system revolves. Narad and Abdullah (2016) concluded that students' academic performance determines an academic institution's success or failure.

Singh et al. (2016) asserted that students' academic performance directly influences a country's socio-economic development, and Farooq et al. (2011) highlight that it is the primary concern of all faculties. Students' academic performance is also the main foundation of knowledge gain and skill improvement. According to Narad and Abdullah (2016), regular evaluation or examination over a specific period is essential for assessing students' academic performance and achieving better outcomes. Hence, the hypothesis that satisfaction significantly affects students' performance was included in this study.

H5: Students’ satisfaction positively affects the performance of the students.

3.6 Satisfaction as mediator

Sibanda et al. (2015) applied goal theory to examine the factors influencing students' academic performance, illuminating the significance students attach to their satisfaction and academic achievement. According to this theory, students perform well if they know about the factors that affect their performance. Among the variables above, institutional factors that influence student satisfaction, and through it performance, include course design and quality of the instructor (DeBourgh, 2003; Lado et al., 2003) as well as prompt feedback and expectations (Fredericksen et al., 2000). Hence, the hypothesis that quality of the instructor, course design, prompt feedback, and students' expectations significantly affect students' performance through satisfaction was included in this study.

H6: Quality of the instructor, course design, prompt feedback, and students' expectations affect the students' performance through satisfaction.

H6a: Students’ satisfaction mediates the relationship between quality of the instructor and student’s performance.

H6b: Students’ satisfaction mediates the relationship between course design and student’s performance.

H6c: Students’ satisfaction mediates the relationship between prompt feedback and student’s performance.

H6d: Students’ satisfaction mediates the relationship between students’ expectations and student’s performance.

4 Research methodology

4.1 Participants

In this cross-sectional study, data were collected from 544 respondents studying management (B.B.A. or M.B.A.) and hotel-management courses, selected through purposive sampling. Descriptive statistics show that 48.35% of the respondents were MBA or BBA students and the rest were hotel-management students. Male students made up 71% of the sample and female students 29%, so males were almost double the females. Ages ranged from 18 to 35; the dominant group, aged 18 to 22, comprised the undergraduate students (94%), while postgraduate students made up the remaining 6%.

4.2 Materials

The research instrument consists of two sections. The first covers demographic variables such as discipline, gender, age group, and education level (undergraduate or postgraduate). The second measures the six factors: instructor quality, course design, prompt feedback, student expectations, satisfaction, and performance. The items were taken from previous studies (Yin & Wang, 2015; Bangert, 2004; Chickering & Gamson, 1987; Wilson et al., 1997). "Instructor quality" was measured with the seven-item scale developed by Bangert (2004). The "course design" (six items) and "prompt feedback" (five items) scales were also adapted from Bangert (2004). The "students' expectations" scale consists of five items, four adapted from Bangert (2004) and one from Wilson et al. (1997). Students' satisfaction was measured with six items taken from Bangert (2004), Wilson et al. (1997), and Yin and Wang (2015). "Students' performance" was measured with the six-item scale developed by Wilson et al. (1997). All items were assessed on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). Only students from India took part in the survey. In total, thirty-four questions were asked to test the effect of the first four variables on students' satisfaction and performance; for full details of the questionnaire, refer to Appendix Table 6.
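Multi-item Likert scales such as these are commonly screened for internal consistency (Cronbach's alpha) before factor analysis. The paper does not report alpha in this excerpt, so the sketch below uses simulated 1-5 responses, not the study's data; the seven-item "instructor quality" scale is borrowed only as a running example:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated responses: 544 students, 7 items on a 1-5 Likert scale,
# all driven by one latent trait so the items intercorrelate
rng = np.random.default_rng(0)
latent = rng.normal(size=(544, 1))
items = np.clip(np.rint(3 + latent + rng.normal(scale=0.8, size=(544, 7))), 1, 5)

alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.3f}")
```

With strongly correlated items like these, alpha comfortably exceeds the conventional 0.7 threshold.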

4.3 Research design

The study used a descriptive research design. Instructor quality, course design, prompt feedback, and students' expectations were the independent variables; students' satisfaction was the mediator and students' performance the dependent variable.

4.4 Procedure

In this cross-sectional research, respondents were selected through judgment sampling. They were informed about the objective of the study and the information-gathering process, assured of the confidentiality of their data, and given no incentive for participating. The data were gathered through an online survey: the questionnaire was built in Google Forms and circulated by email. Students were also asked to name their college, and fifteen colleges across India took part. The data were collected during the COVID-19 pandemic, amid the total lockdown in India. This was an apt time to collect data on the topic because all colleges across India were running online classes, so students had enough time to understand the instrument and respond to the questionnaire carefully. A total of 615 questionnaires were circulated, of which 574 were returned; thirty responses were excluded as unengaged, leaving 544 questionnaires for the present investigation. The sample included male and female students of different age groups from undergraduate and postgraduate management and hotel-management courses.
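The screening step above (dropping 30 "unengaged" responses) is often implemented by flagging respondents who show no variation across items, i.e. straight-liners who give the same answer throughout. The paper does not say exactly how it screened, so this is a hypothetical sketch; the threshold and the tiny data matrix are illustrative only:

```python
import numpy as np

def drop_unengaged(responses: np.ndarray, min_sd: float = 0.0) -> np.ndarray:
    """Keep only rows whose standard deviation across items exceeds min_sd.
    A respondent who answers every item identically has SD 0 (straight-lining)."""
    sd = responses.std(axis=1)
    return responses[sd > min_sd]

survey = np.array([
    [4, 5, 4, 3, 4],   # engaged: varied answers
    [3, 3, 3, 3, 3],   # straight-liner: identical answers, SD = 0
    [2, 1, 2, 2, 3],   # engaged
])
clean = drop_unengaged(survey)
print(clean.shape)  # (2, 5): the straight-lined row is removed
```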

5 Results

5.1 Exploratory factor analysis (EFA)

To analyze the data, SPSS and AMOS were used. First, to extract the distinct factors, an exploratory factor analysis (EFA) with VARIMAX rotation was performed on the sample of 544. The analysis rendered six distinct factors. Factor one was named quality of instructor; sample items were "The instructor communicated effectively", "The instructor was enthusiastic about online teaching", and "The instructor was concerned about student learning". Factor two was labeled course design, with items such as "The course was well organized", "The course was designed to allow assignments to be completed across different learning environments", and "The instructor facilitated the course effectively". Factor three was labeled prompt feedback, with items such as "The instructor responded promptly to my questions about the use of Webinar" and "The instructor responded promptly to my questions about general course requirements". The fourth factor was students' expectations, with items such as "The instructor provided models that clearly communicated expectations for weekly group assignments" and "The instructor used good examples to explain statistical concepts". The fifth factor was students' satisfaction, with items such as "The online classes were valuable" and "Overall, I am satisfied with the quality of this course". The sixth factor was student performance, with items such as "The online classes has sharpened my analytic skills" and "Online classes really tries to get the best out of all its students". These six factors explained 67.784% of the total variance. To validate the factors extracted through EFA, confirmatory factor analysis (CFA) was performed in AMOS. Finally, structural equation modeling (SEM) was used to test the hypothesized relationships.
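The extraction-plus-VARIMAX pipeline described here can be sketched with numpy alone. The study ran it in SPSS on 34 items and retained six factors; the sketch below is a scaled-down stand-in on synthetic data with a known two-factor structure, including a hand-rolled Kaiser VARIMAX rotation and the same percent-of-variance summary:

```python
import numpy as np

def varimax(loadings, n_iter=100, tol=1e-6):
    """Orthogonal VARIMAX rotation of a (p_items, k_factors) loading matrix."""
    p, k = loadings.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(n_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - L @ np.diag((L**2).sum(axis=0)) / p))
        R = u @ vt
        if s.sum() - var_old < tol:
            break
        var_old = s.sum()
    return loadings @ R

# Synthetic data: 500 respondents, 6 items, two latent factors (3 items each)
rng = np.random.default_rng(1)
f = rng.normal(size=(500, 2))
X = np.hstack([f[:, :1] @ np.ones((1, 3)), f[:, 1:] @ np.ones((1, 3))])
X += rng.normal(scale=0.5, size=X.shape)

# Extract loadings from the eigendecomposition of the correlation matrix
corr = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2  # factors retained here (the study retained six)
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
rotated = varimax(loadings)
explained = eigvals[:k].sum() / eigvals.sum()
print(f"variance explained by {k} factors: {explained:.1%}")
```

After rotation, each item loads mainly on one factor, which is what makes the factors nameable in the way the text describes.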

5.2 Measurement model

Table 1 summarizes the findings of the EFA and CFA: the EFA rendered six distinct factors, and the CFA validated them. Table 2 shows that the proposed measurement model achieved good convergent validity (Aggarwal et al., 2018a, b). The confirmatory factor analysis showed that all standardized factor loadings were statistically significant at the 0.05 level. The measurement model also showed acceptable fit indices: CMIN = 710.709; df = 480; CMIN/df = 1.481, p < .001; Incremental Fit Index (IFI) = 0.979; Tucker-Lewis Index (TLI) = 0.976; Goodness of Fit Index (GFI) = 0.928; Adjusted Goodness of Fit Index (AGFI) = 0.916; Comparative Fit Index (CFI) = 0.978; Root Mean Square Residual (RMR) = 0.042; and Root Mean Square Error of Approximation (RMSEA) = 0.030, all of which are satisfactory.
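Two of the reported fit indices can be recomputed directly from the χ², degrees of freedom, and sample size given above, which makes for a useful sanity check (the RMSEA formula below is the standard Steiger-Lind form with the N − 1 convention used by AMOS):

```python
import math

chi_sq, df, n = 710.709, 480, 544   # values reported for the measurement model

cmin_df = chi_sq / df                                      # relative chi-square
rmsea = math.sqrt(max(chi_sq - df, 0.0) / (df * (n - 1)))  # Steiger-Lind RMSEA

print(f"CMIN/df = {cmin_df:.3f}")   # 1.481, matching the reported value
print(f"RMSEA   = {rmsea:.3f}")     # 0.030, matching the reported value
```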

According to the accepted criterion, the Average Variance Extracted (AVE) should exceed the squared correlations between the latent variable and all other variables. Discriminant validity is confirmed (Table 2) because the square root of each AVE is greater than the corresponding inter-construct correlation coefficients (Hair et al., 2006). Additionally, discriminant validity exists when each variable's measurement indicators correlate only weakly with all other variables except the one with which they are theoretically associated (Aggarwal et al., 2018a, b; Aggarwal et al., 2020). The results in Table 2 show that the measurement model achieved good discriminant validity.
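The Fornell-Larcker check described here is mechanical once standardized loadings and inter-construct correlations are in hand: AVE is the mean of the squared loadings, and √AVE must exceed the construct's correlations with every other construct. A sketch with illustrative numbers (not the study's Table 2 values):

```python
import numpy as np

def ave(loadings) -> float:
    """Average Variance Extracted from one construct's standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float((lam**2).mean())

# Hypothetical standardized loadings for two constructs
ave_a = ave([0.82, 0.79, 0.85, 0.76])   # e.g. instructor quality
ave_b = ave([0.74, 0.81, 0.77])         # e.g. course design
corr_ab = 0.52                          # hypothetical inter-construct correlation

# Fornell-Larcker criterion: sqrt(AVE) must exceed the inter-construct correlation
ok = np.sqrt(ave_a) > corr_ab and np.sqrt(ave_b) > corr_ab
print(ok)  # True for these illustrative numbers
```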

5.3 Structural model

To test the proposed hypotheses, structural equation modeling (SEM) was used. SEM is a multivariate statistical technique that combines factor analysis and multiple regression analysis to analyze the structural relationships between measured variables and latent constructs.

Table 3 presents the structural model's fit indices with all variables taken together: CMIN/DF is 2.479, and all the fit values fall within their acceptable ranges, meaning the model attained a good fit. Other fit indices, such as GFI = 0.982 and AGFI = 0.956, are likewise supportive (Schumacker & Lomax, 1996; Marsh & Grayson, 1995; Kline, 2005).

Hence, the model fitted the data successfully. All co-variances among the variables and regression weights were statistically significant ( p  < 0.001).

Table 4 presents the relationships between the exogenous, mediator, and endogenous variables: quality of instructor, prompt feedback, course design, students' expectations, students' satisfaction, and students' performance. The first four factors relate positively to satisfaction, which in turn affects students' performance positively. The instructor's quality has a positive relationship with students' satisfaction with online classes (SE = 0.706, t-value = 24.196; p < 0.05), so H1 was supported. Course design has a positive relationship with students' satisfaction (SE = 0.064, t-value = 2.395; p < 0.05), so H2 was supported. Prompt feedback has a positive relationship with students' satisfaction (SE = 0.067, t-value = 2.520; p < 0.05), so H3 was supported. Students' expectations also relate positively to students' satisfaction with online classes (SE = 0.149, t-value = 5.127; p < 0.05), so H4 was supported. Among the four factors, the most influential on students' satisfaction was instructor quality (SE = 0.706), followed by students' expectations (SE = 0.149) and prompt feedback (SE = 0.067); course design affected students' satisfaction least (SE = 0.064). Finally, Table 4 shows that students' satisfaction has a positive effect on students' performance (SE = 0.186, t-value = 2.800; p < 0.05), so H5 was supported.

Table 5 shows that students' satisfaction partially mediates the positive relationship between instructor quality and student performance, so H6(a) was supported. Satisfaction also partially mediates the positive relationship between course design and student performance, so H6(b) was supported. However, satisfaction fully mediates the positive relationship between prompt feedback and student performance, so H6(c) was supported. Finally, Table 5 shows that satisfaction partially mediates the positive relationship between students' expectations and student performance, so H6(d) was supported.
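The partial-versus-full mediation pattern reported in Table 5 is typically judged from the indirect effect a×b (path X→M times path M→Y) and whether the direct path c′ remains significant. A minimal bootstrap sketch on simulated data (not the study's; the variable roles are borrowed only for labeling, and the "full mediation" structure is built into the simulation):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 544  # same sample size as the study, but the data are simulated
x = rng.normal(size=n)              # e.g. prompt feedback
m = 0.5 * x + rng.normal(size=n)    # mediator: satisfaction
y = 0.4 * m + rng.normal(size=n)    # outcome: performance (no direct x effect)

def paths(x, m, y):
    """Return (a, b, c') from the two mediation regressions."""
    a = np.linalg.lstsq(np.column_stack([x, np.ones_like(x)]), m,
                        rcond=None)[0][0]
    c_prime, b, _ = np.linalg.lstsq(np.column_stack([x, m, np.ones_like(x)]), y,
                                    rcond=None)[0]
    return a, b, c_prime

boot = []
for _ in range(1000):
    idx = rng.integers(0, n, n)         # resample respondents with replacement
    a, b, _ = paths(x[idx], m[idx], y[idx])
    boot.append(a * b)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap 95% CI for the indirect effect: [{lo:.3f}, {hi:.3f}]")
```

A confidence interval excluding zero indicates mediation; full mediation additionally requires c′ to be non-significant once the mediator is in the model.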

6 Discussion

In the present study, the authors evaluated the factors directly linked with students' satisfaction and performance in online classes during COVID-19. Because of the global pandemic, all colleges and universities were shifted to online mode by their respective governments. No one knew how long the pandemic would last, so teaching moved online; even educators who were not tech-savvy updated themselves to handle the unexpected circumstances (Pillai et al., 2021). The present results will help educators increase students' satisfaction and performance in online classes, and the research assists educators in understanding the different factors required for online teaching.

Compared with past research, earlier studies examined the factors affecting student satisfaction in the conventional schooling framework, whereas the present study was conducted during India's lockdown to identify the prominent factors that drive students' satisfaction with online classes; it also explored the direct linkage between satisfaction and performance. The findings indicate that instructor quality is the most prominent factor affecting student satisfaction during online classes. The instructor therefore needs to be very effective during lectures and to understand students' psychology in order to deliver the course content well; doing so affects both satisfaction and performance. The teacher's perspective is critical because their enthusiasm raises the quality of the online learning process.

The study highlighted that the second most prominent factor affecting students' satisfaction in online classes is students' expectations. Students bring expectations to their classes; if the instructor understands those expectations and customizes the course design accordingly, students can be expected to perform better in examinations. The third factor affecting satisfaction is feedback: after delivering the course, instructors should gather appropriate feedback to plan future courses and shape future strategies (Tawafak et al., 2019). A proper feedback system is necessary for improvement because feedback is the real image of the course content. The last factor affecting satisfaction is course design. Course content needs to be designed so that students can easily understand it; if the instructor plans the course so that students grasp the content without problems, this leads to satisfaction, and students can perform better in exams. In some situations, course content is difficult to deliver through online teaching, such as practical components like dish recipes or laboratory demonstrations. There, the instructor needs to be more creative in designing and delivering the content so that it still has a positive impact on students' overall satisfaction with online classes.

Overall, the students agreed that online teaching was valuable, even though online classes were a first experience during the COVID-19 pandemic (Agarwal & Kaushik, 2020; Rajabalee & Santally, 2020). Some previous studies suggest that technology-supported courses have a positive relationship with students' performance (Cho & Schelzer, 2000; Harasim, 2000; Sigala, 2002), while demographic characteristics also play a vital role in online course performance. According to the APA Work Group of the Board of Educational Affairs (1997), learner-centered principles suggest that students must be willing to invest the time required to complete individual course assignments. Online instructors must be enthusiastic about developing genuine instructional resources that actively connect learners and encourage proficient performance. Teachers and students share responsibility for better performance: when learners struggle to understand a concept, they need to ask the instructor for solutions (Bangert, 2004). Thus, we can conclude that instructor quality, students' expectations, prompt feedback, and effective course design significantly shape students' online learning.

7 Implications of the study

The results of this study have numerous practical implications for educators, students, and researchers. The study also contributes to the literature by demonstrating that multiple factors jointly shape student satisfaction and performance in online classes during the COVID-19 pandemic. It differs from previous studies (Baber, 2020; Ikhsan et al., 2019; Eom & Ashill, 2016), none of which examined the effect of students' satisfaction on their perceived academic performance. Previous empirical work has highlighted the importance of examining the factors affecting student satisfaction (Maqableh & Jaradat, 2021; Yunusa & Umar, 2021), but no study has examined the combined effect of course design, quality of instructor, prompt feedback, and students' expectations on satisfaction with online classes during the pandemic. The present study fills this research gap.

The first essential contribution of this study is that the instructor's facilitating role and competence affect students' level of satisfaction (Gray & DiLoreto, 2016). Instructors who taught online courses during the pandemic carried an extra obligation: they had to adapt to a changing climate, polish their technical skills along the way, and foster students' technical knowledge in this new environment. The findings indicate that instructor quality is a significant determinant of student satisfaction in online classes amid a pandemic. In higher education, teacher quality refers to the instructor's individual characteristics before entering the class (Darling-Hammond, 2010), including content knowledge, pedagogical knowledge, inclination, and experience. More significantly, deep understanding can best be conveyed by those with substantial technical expertise in the areas they teach (Martin, 2021). Secondly, the results contribute to the profession of education by illustrating a realistic approach for effectively recognizing students' expectations. The primary expectation of most students before joining a university is employment, and instructors have agreed that they should do more to fulfill students' employment expectations (Gorgodze et al., 2020). Instructors can use this insight to balance expectations and improve student satisfaction, and the results can inform continuous course improvement and policy decisions for education programs.
Thirdly, the results suggest that online course designers and instructors should delve deeper into structuring online courses more efficiently, including design features that minimize negative and maximize positive emotion, contributing to greater student satisfaction (Martin et al., 2018). The findings show that course design has a substantial positive influence on student performance in online classes: courses need to present essential details such as course content, educational goals, course structure, and course outputs consistently, so that students find the e-learning system beneficial, use it, and perform better (Almaiah & Alyoussef, 2019). Lastly, the results indicate that instructors should respond to questions promptly and provide timely feedback on assignments; such techniques help students in online courses through greater instructor presence, interaction, understanding, and participation (Martin et al., 2018). Feedback helps students focus on the performance that enhances their learning.

Author information

Authors and Affiliations

Chitkara College of Hospitality Management, Chitkara University, Chandigarh, Punjab, India

Ram Gopal & Varsha Singh

Chitkara Business School, Chitkara University, Chandigarh, Punjab, India

Arun Aggarwal

Corresponding author

Correspondence to Arun Aggarwal .

Ethics declarations

Ethics approval

Not applicable.

Conflict of interest

The authors declare no conflict of interest, financial or otherwise.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Gopal, R., Singh, V. & Aggarwal, A. Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19. Educ Inf Technol 26 , 6923–6947 (2021). https://doi.org/10.1007/s10639-021-10523-1

Received : 07 December 2020

Accepted : 22 March 2021

Published : 21 April 2021

Issue Date : November 2021

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

Keywords

  • Quality of instructor
  • Course design
  • Instructor’s prompt feedback
  • Expectations
  • Student’s satisfaction
  • Perceived performance

Nepal J Epidemiol, 6(4), December 2016

Guide to the design and application of online questionnaire surveys

Pramod R. Regmi

1 Faculty of Health and Social Sciences, Bournemouth University, England, UK.

2 Visiting Research Fellow, Chitwan Medical College (CMC), Tribhuvan University, Nepal.

Elizabeth Waithaka

Anjana Paudyal

3 The School of Environment, Charles Darwin University, Darwin, Australia.

Padam Simkhada

4 Faculty of Education, Health and Community, Liverpool John Moores University, UK.

5 Manmohan Memorial Institute of Health Sciences, Tribhuvan University, Nepal.

6 Nobel College, Pokhara University, Nepal.

Edwin van Teijlingen

Author's Contribution: PR and EvT conceived the idea. PR, AP, EW and PS reviewed the literature and drafted the manuscript. All authors reviewed, edited and agreed on the final version of this manuscript.

Collecting research data through traditional approaches (face-to-face, postal or telephone surveys) can be costly and time-consuming. The emerging data collection approach based on internet/e-based technologies (e.g. online platforms and email) is a relatively cost-effective alternative. These novel data collection strategies can gather large amounts of data from participants in a short time frame. They also appear to be feasible and effective for collecting data on sensitive issues or from samples that are generally hard to reach, for example, men who have sex with men (MSM) or migrants. As a significant proportion of the world's population is now digitally connected, the shift from postal (paper-and-pencil) or telephone surveys towards online surveys interests researchers in academia as well as in the commercial world. However, compared with designing and executing a paper version of a questionnaire, there is limited literature to help a starting researcher with the design and use of online questionnaires. This short paper highlights issues around: a) methodological aspects of online questionnaire surveys; b) online survey planning and management; and c) ethical concerns that may arise when using this option. We believe this paper will be useful for researchers who want to gain knowledge of, or apply, this approach in their research.

Introduction

Questionnaire surveys are a popular data collection method for academic and marketing research in a variety of fields. Face-to-face interviews, telephone interviews and postal surveys are the traditional approaches to conducting questionnaire surveys. However, with growing access to the internet globally [ 1 , 2 ] (for example, internet penetration in Nepal, a low-income country, increased exponentially over the past two decades, from fewer than 50 users in 1995 to 11.9 million users, about 45% of the total population, in 2015 [ 3 ]) and the price of technology devices (e.g. tablet computers, hardware) and software continuing to fall [ 4 ], novel internet-based data collection techniques such as the online questionnaire survey have become popular in recent years [ 5 ]. Qualitative data collection through online focus groups is also emerging [ 6 , 7 ], suggesting that research participants in the digital age can now interact with each other and with the interviewer/facilitator in an online multimedia setting.

Data collection through an online survey has the potential to gather large amounts of data efficiently (i.e. with fewer errors, since written data do not have to be transferred onto a computer), economically (it requires little human effort to collect or manage the data) and within relatively short time frames. The online survey approach is also very useful for collecting data from hard-to-reach populations such as lesbian, gay, bisexual and transgender (LGB&T) people or travellers. Moreover, people with certain conditions, such as HIV, are often hard to access because they are stigmatised offline [ 4 ]. Studying these sub-populations becomes possible through an online survey, since invitations can be sent through a range of media and discussion platforms (e.g. social media, discussion fora).

The online survey approach offers convenience in several ways: a) respondents can answer at a convenient time; b) respondents can take as much time as they need to respond to questions; and c) respondents can complete the survey in multiple sessions. As with paper-based surveys, online questionnaire surveys support question diversity (e.g. dichotomous questions, multiple-choice questions, scales), can skip questions irrelevant to sub-groups in the sample (i.e. no pregnancy questions for men) and can even collect open-ended (qualitative) data through a free-text box. The online questionnaire can also be constructed to encourage a better response rate for each item, for example by requiring respondents to answer a question before advancing to the next one. This, however, might create an unfavourable situation for research participants who do not want to answer sensitive questions about, say, sexual behaviour or drug use. Unlike a postal paper survey, follow-up is easy through email, which enhances the response rate.
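The skip pattern described above (e.g. no pregnancy questions for men) can be sketched as a small routing table. This is a minimal illustration only; the question ids, wording and routing rules below are hypothetical and not taken from any particular survey platform:

```python
# Routing table for questionnaire skip logic: (current question, answer)
# decides which question the respondent sees next, so that irrelevant
# questions are bypassed automatically. All ids are hypothetical.

QUESTIONS = {
    "q_sex": "What is your sex? (male/female)",
    "q_pregnant": "Are you currently pregnant? (yes/no)",
    "q_age": "What is your age?",
}

ROUTES = {
    ("q_sex", "female"): "q_pregnant",
    ("q_sex", "male"): "q_age",        # skip the pregnancy question for men
    ("q_pregnant", "yes"): "q_age",
    ("q_pregnant", "no"): "q_age",
}

def next_question(current, answer):
    """Return the id of the next question to show."""
    return ROUTES.get((current, answer.strip().lower()), "q_age")
```

A dedicated survey platform implements the same idea through its skip-logic settings; the point is simply that each branch is data, not a duplicated questionnaire page.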

There is substantial evidence that many large cross-country studies have been completed using online questionnaire surveys run on popular dedicated platforms (e.g. https://www.surveymonkey.co.uk/, https://www.onlinesurveys.ac.uk/about/, https://www.qualtrics.com/). These platforms allow researchers to deploy and analyse surveys via the web without any advanced technical knowledge. Despite these developments, there is not much research focusing on online surveys or other technology-based survey methodologies, simply because they were introduced only a few years ago.

An online questionnaire survey shares the same characteristics as the paper version of a survey. However, the data collection strategy has specific characteristics (e.g. technological, demographic, response rate) that affect its design and implementation. Online questionnaires can only produce valid and meaningful results if: a) the layout of the questionnaire and all its questions/items are clear and precise; b) they are appropriately executed (for example, completing a survey through a mobile app or tablet might attract the younger generation but may not work well with an elderly population); and c) questions are asked consistently across all respondents. Therefore, careful consideration needs to be given to designing the online questionnaire survey. In this paper, we discuss: a) methodological aspects of online questionnaire surveys; b) online survey planning and management; and c) ethical concerns that may arise when using this option.

Methodological components

When developing and operationalising an online questionnaire survey, six methodological components are critical to success: (a) user-friendly design and layout; (b) selecting survey participants; (c) avoiding multiple responses; (d) data management; (e) ethical issues; and (f) piloting tools. These are discussed below.

a) User-friendly design and layout

Generally, an online survey link is promoted through email, websites, social media or online discussion platforms, and potential participants are invited to take part in the survey. Research participants always prefer a tool that is easy to follow and complete. The format of the questionnaire should therefore be easy to navigate and require only minimal computer skills to complete. Items should be short, clear and easy for participants to read, e.g. elderly people might need larger fonts. Since participants may be more open to sharing sensitive or personal information, such as age or sex, after completing other questions, sensitive or personal questions should be placed at the end. Dillman [ 8 ] found that visual presentation is essential and strengthens response rates; however, large files lead to longer download times (especially in settings where internet speed is slow), and this must be considered. Moreover, as online surveys are generally self-administered, answering instructions must be extremely clear.

b) Selecting survey participants

Easy access to the survey for all participants is essential in any online questionnaire survey [ 9 ]. An online questionnaire may therefore be appropriate only for certain age groups [ 10 ]. For example, an online study of an elderly population would not be appropriate if the proportion of elderly people who access or use the internet is low. Similarly, if the survey link is promoted through social media (e.g. Twitter, Facebook), it might not capture the views of people who do not use social media. In such circumstances the survey should be promoted through other channels, and other data collection strategies (e.g. telephone or paper survey) should perhaps be combined with the online survey. Although relatively little may be known about the background characteristics of people in online communities beyond basic demographic variables, and tracking the non-response rate is not easy in most online surveys [ 5 , 11 ], it is very likely that participants in online surveys are more experienced or have stronger internet skills. They may be younger, male, and from households with fairly high incomes [ 12 ]; however, with modernisation and the wide coverage of internet facilities globally (particularly through mobile phones), the gap in internet use has recently decreased in countries like Nepal.

c) Avoiding multiple responses

Another important feature of online survey design is the ability to avoid multiple responses. This is particularly challenging when incentives are provided to survey participants. To minimise this problem, the online survey should include a feature that registers interested participants (through their email) in a first stage, so that the online tool can assign a unique participant number, minimising the chance of multiple enrolments in the study. A personalised link to the online questionnaire can then be sent to each participant's email address. It is very important that the email is used for sharing the survey link only (to ensure the participant's details are protected). Restricting responses by IP address could be another strategy to avoid multiple enrolments; however, it limits the opportunity for participants who share a common IP address (e.g. family members or students living in communal dwellings). Similarly, participants should be offered the option of completing the survey across multiple sessions if they wish (as long as they use the same device), with responses saved automatically as they progress through the survey pages.
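The two-stage registration described above can be sketched as follows. This is an assumption-laden illustration: the base URL is a placeholder, and the token scheme is one common choice rather than the only way to generate unique participant links:

```python
import secrets

def issue_survey_links(registered_emails, base_url="https://example.org/survey"):
    """Assign one unguessable token per registered email address, so each
    participant receives a personalised link and multiple enrolments from
    the same address are avoided. base_url is a placeholder, not a real
    survey endpoint."""
    links = {}
    for email in registered_emails:
        if email not in links:                  # one link per participant
            token = secrets.token_urlsafe(16)   # 128 bits of randomness
            links[email] = f"{base_url}?token={token}"
    return links
```

Because tokens are random rather than derived from the email, the link itself reveals nothing about the participant, which also supports the confidentiality concerns discussed later.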

d) Data management

Generally, online survey platforms offer convenient and reliable data management. By design, the online survey format protects against the loss of data and facilitates data transfer into a database (e.g. Excel or SPSS) for analysis [ 9 , 10 ]. Because responses can be exported into a compatible database, transcription errors are eliminated and the survey cannot be modified by the participant. It can be argued that the overall ease of use of well-designed questionnaires, for both study participants and researchers, potentially improves the reliability and validity of the data collection process and the collected data [ 13 ].
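The export step described above can be sketched with Python's standard csv module; the field names below are hypothetical, standing in for whatever items a real questionnaire collects:

```python
import csv
import io

def export_responses(responses, fieldnames):
    """Serialise collected responses to CSV, ready for import into Excel
    or a statistics package such as SPSS. Fields not listed in fieldnames
    are ignored rather than raising an error."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(responses)
    return buf.getvalue()
```

This is the mechanism that removes the transcription step: the same structured record the respondent created flows straight into the analysis file.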

e) Ethical issues

Online administration of surveys raises unique ethical questions regarding key ethical components including:

i. Informed consent

In most online survey tools, it is not possible to explain the study or to take verbal consent from participants. Researchers therefore have turned to ensuring that all information regarding the study, participants' rights and researcher's contact details are provided on the first page of the survey [ 14 ]. However, this is dependent on the study design. For example, in the conduct of e-Delphi studies, researchers have the option to administer participant information sheets, consent forms and additional study information by personal contact thereby allowing for oral consent [ 15 ]. The consent practice needs to be cautiously considered and determined. One increasingly common way is presenting items as would be found on paper-based consent forms such that the items must be endorsed before the next page can be opened.
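The "endorse every item before the next page opens" pattern mentioned above can be sketched as a simple gate; the consent item texts are hypothetical:

```python
# Gate that mirrors a paper consent form: the survey may only advance
# once every consent item has been endorsed. Item wording is illustrative.

CONSENT_ITEMS = [
    "I have read the participant information sheet.",
    "I understand I may withdraw at any time.",
    "I agree to take part in this study.",
]

def may_proceed(endorsements):
    """endorsements maps each consent item to True/False."""
    return all(endorsements.get(item, False) for item in CONSENT_ITEMS)
```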

ii. Privacy and confidentiality

There have been concerns regarding the ability of online administration tools to guarantee privacy and confidentiality [ 16 ]. Most of these tools rely on the researchers' ingenuity in configuring the survey settings, for instance to limit the collection of participants' IP addresses. However, tools such as SurveyMonkey have been associated with easily accessible data from surveys shared from a common account, thereby compromising confidentiality [ 14 ].

iii. Right to withdrawal or omission of items

Study participants should have the right to withdraw from the survey, in addition to the choice to opt out of sharing data already provided in an online questionnaire. Researchers should therefore ensure that the opportunity to erase or skip questions, or to backtrack through the survey, is provided in order to maintain ethically sound research conduct. As a rule, only items relating to the consent form should require a response [ 14 ].

f) Piloting

Once the survey tools, contents and platforms are decided, it is very important to carry out a pilot [ 17 , 18 ] with potential participants. Pilot studies can help ensure the adequacy and ordering of the questions, the comprehensiveness of the contents, the clarity and adequacy of the instructions, the feasibility of the technology (e.g. download time), skip patterns, and data compatibility/transfer issues. Not all piloting has to be online: researchers can conduct cognitive interviews with those involved in a pilot study, although certain aspects, such as download time, obviously require piloting an electronic version of the survey.

Increasing internet access across the globe has resulted in greater use of online surveys. This data collection approach has the potential to collect both qualitative and quantitative data, enables access to large and geographically distributed populations, and has been argued by experts to be cost-effective and time-saving for the researcher. Although multiple data collection strategies help achieve a better response, combining email, postal and web-based surveys may prove impractical or financially unfeasible. If designed and executed rigorously, results from an online survey may be no different from paper-based survey results, yet may prove advantageous due to lower costs and speedy distribution. When designing an online survey, researchers should consider a number of principles, such as simplicity of items, feasibility, appropriateness of the online format for the target participants, cultural and ethical sensitivity, completeness and neutrality. Adhering to these principles will help ensure that your online survey is methodologically sound.

Keywords: survey design, online research

Survey Design Training Modules

Good data starts with good survey designs.

During these training sessions, our survey design experts share research-on-research findings and best practices for engaging an online audience. Through intentional design choices and question phrasing, you can create engaging surveys that will yield more honest and reliable insights.

Choose individual modules specific to your needs, or work through them all in your own time. Contact us here if you have any questions or for a complimentary survey review.

Optimising research for reliable data

  • The impact of bias in surveys  (8 minutes)
  • The effects of scale designs on answers  (21 minutes)
  • Reducing online survey dropout  (8 minutes)
  • Using visuals for engagement  (18 minutes)
  • Writing questions effectively  (11 minutes)
  • Conducting cross-cultural research (26 minutes)
  • Designing surveys with empathy (27 minutes)
  • How to create a questionnaire (14 minutes)
  • Using humour for truthful feedback  (28 minutes)
  • Using logos, product imagery and visuals (7 minutes)
  • Space optimisation & iconography techniques (11 minutes)

Question type design and use

  • Design and use of open-ended questions (9 minutes)
  • Design and use of single and multiple-choice questions (19 minutes)
  • Design and use of grid and rating scale questions (11 minutes)
  • Design and use of slider questions  (11 minutes)
  • Design and use of drag and drop questions (12 minutes)

Full length webinars

  • Getting closer to the truth: techniques for collecting honest answers
  • Deepen your survey research by exploiting open-ended questions

Be the first to know when we release new modules. Subscribe here for email notifications and monthly research tips from our experts.

Jon Puleston, Vice President of Innovation, Profiles Division

Steve Wigmore, Senior Director, Modern Surveys, Profiles Division

Anna Shevchenko, Research Director, Americas, Profiles Division

May Ling Tham, Research Director, APAC, Profiles Division

Martha Espley, Research Director, EMEA, Profiles Division

Anoushka Adams

Christiaan Meliezer, Senior Project Manager, Profiles Division


10 Essential Training Survey Questions You Must Be Asking


A training program survey is a series of follow-up questions sent to participants shortly after they complete a training program. It is an easy way to gather information on the effectiveness of the training, the course content, the instructors, and the overall learning experience. With this valuable feedback, you can improve future training programs and better understand attendees’ needs and expectations. 

We compiled a list of 10 essential training survey questions to provide you with the information you need to build an even better training program next time. Keep reading until the end to learn how to set up an effective post-training survey.

Disclaimer : The information below is accurate as of August 23, 2024. 

1. How would you rate the overall quality of the training content?

Start the training evaluation survey with a general, qualifying question that gets attendees to think about the learning experience as a whole. Survey takers often don’t reflect on their experience before they open the training feedback survey to submit answers. This rating question gives them an opportunity to reflect on the overall program before going into more specifics. If they rate the course quality low in their initial reflections, you know the program needs improvement or an overhaul. 

A high rating could mean they found the course material helpful, but you still need to gather more information. 

Let’s jump to the next training survey question to reveal more. 

2. How satisfied were you with the training instructor?

The instructor plays a vital role in the training program, and the program’s success may depend more on the delivery of the content than on the content itself. 

Consider following this question with an invitation to provide more details. This will help you determine whether the trainer is an effective instructor and identify ways to improve their delivery method. 

If you have an instructor who consistently receives rave reviews, you’ll want to know what precisely trainees liked about their methods. You can share this information with other instructors to improve the overall program, no matter who is leading the session. 

3. Was the LMS/training portal user-friendly?

Orientation or training programs in a large company are often automated and have a large technology component. 

The material may be organized, valuable, and well-explained, but if the trainee lacks technology know-how or the LMS ( Learning Management System ) is difficult to use, they will not be able to benefit from the training. 

Docebo’s employee training tools are designed to be user-friendly and intuitive, making them easy for employees to navigate and access. They also empower employees to advance in their careers by offering personalized content that aligns with their interests, skills, knowledge gaps, and goals.

4. Please describe 1-3 goals you wanted to achieve after completing this training.

An open-ended question will help you gather the most valuable information from trainees. This question reveals what attendees expected to learn when they entered the training program and how they plan to apply new learnings. 

If trainees mention topics they expected the program to cover, you will have ideas about expanding the program or better communicating and setting expectations. 

Consider adding a follow-up question, “Do you feel better equipped to meet these goals after this training?” This question reveals whether the training met participants’ expectations and identifies areas for improvement.

5. How well did this training align with your professional goals?

This open-ended question helps you learn if the material was relevant and beneficial to most attendees.

Designing a catch-all training program that meets the professional or learning goals of a diverse group of learners with different roles in your organization can be challenging. You don’t want employees to feel like you’re wasting their time with training that is not relevant to them. 

You can also understand whether employees’ professional goals align with your company’s goals. The replies could offer insight into recruiting, job duties, and more.  

6. Did you face any challenges or obstacles while following the training modules? If so, please describe.

Post-training survey questions like this will help you identify the training program’s delivery and usability issues. The problem might not be in the content itself but in its delivery.

Are your modules up to date? Are interactive components working properly? Are they running too slowly? Are they confusing or disorganized? 

By addressing these potential obstacles, you can improve the overall training effectiveness.

7. What would you change to improve this training program? Please describe at least one specific change.

By the second half of the post-training survey, participants have had time to reflect and can offer more meaningful feedback. This makes it the perfect time to ask your most thought-provoking questions.

Your best resource is valuable, open-ended feedback from respondents who recently completed the program. This question prompts respondents to point out areas for improvement and suggest creative solutions.

Here, participants can elaborate on topics from any of the previous questions or bring up entirely different topics, giving insight into what is on their minds.

8. Did your work schedule accommodate your training?

This question reveals more than whether the training is available during regular working hours. 

Here, you will learn more about the relevancy of your training. New employees are often overwhelmed with learning protocols, systems, expectations, and duties. If they find the training program a good use of their busy schedule, your training was successful. If their schedule feels too full to accommodate the training, consider the type or amount of content and its relevance. 

But, in-person training isn’t always possible, especially with busy schedules or remote employees. Video training offers a flexible solution that fits into almost any schedule and is often more effective than text-based methods. 

Creating training videos isn’t as complicated as it seems and could lead to a better overall training program. Most LMS programs will allow you to embed your video in your training program. For example, here’s how you can do that in Docebo.  

9. How likely are you to recommend this training to your colleagues? 

This is another excellent way of asking whether the training was worth attendees’ time. A word-of-mouth recommendation is a reliable way to determine whether anything, from a movie to a restaurant, is worth checking out. Employees who recommend work-related training programs are telling their colleagues that the content is worth their valuable time.

Rather than a yes-or-no question, an employee net promoter score (eNPS) will better predict how likely someone is to recommend your program. On a scale of 0-10, with 0 being the least likely and 10 being the most likely, you will have a reliable metric that is easy to understand.
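The eNPS arithmetic works like standard NPS: the share of promoters (scores 9-10) minus the share of detractors (scores 0-6), with passives (7-8) ignored. A minimal sketch, not tied to any particular survey tool:

```python
def enps(scores):
    """Employee Net Promoter Score for a list of 0-10 ratings:
    % promoters (9-10) minus % detractors (0-6); passives (7-8)
    count toward the total but toward neither group."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))
```

For example, ratings of 10, 10, 9 and 5 give (3 - 1) / 4, i.e. an eNPS of +50.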

10. Would you like to take a similar training on different topics? If so, please specify which topics.

This final question is extremely valuable for identifying gaps in your training program. Perhaps you thought you covered a topic, but participants missed it. Or, maybe you covered a topic, but employees want to learn more about it. You’ll discover if you went into enough detail on a specific topic. 

You will also learn what topics employees consider valuable and how to stay current on what they want to know. Participants actively working in the field are a great resource for understanding what information you need to add, update, or expand on.

How to create effective post-training surveys

Now that you know the essential questions to ask in a post-training evaluation, here are several best practices for designing a successful survey.

Use survey software

Survey tools can help you gather feedback and organize information from a large sample of participants. 

Many tools have pre-made templates or AI capabilities to speed up the creation process. You can easily share or embed your surveys via email, an internal social network, or a learning platform. Also, these tools typically offer analytics dashboards where you can keep track of survey responses in real time, export survey data, and analyze it. 

Finally, survey tools allow participants to share feedback anonymously, leading to more open and honest answers. Studies show that only 63% of employees feel safe sharing their opinions at work. 

Limit the number of questions

Avoid overwhelming your trainees: too many questions can lead to rushed, superficial answers that may not provide the valuable data you need. Shorter, more thoughtful post-training surveys will get you the best results. Aim for about 10 questions, allowing you to cover various topics while keeping the survey concise and manageable.

Choose the right time for your survey

Timing is crucial when sending out your survey. Send it as soon as possible after the training course while the material and experience are still fresh in participants’ minds. 

For in-person or hybrid corporate events , you can expect higher reply rates if you share the survey on the training day while all participants are still together. 

For online training, you’ll usually email the survey after the training and give participants a deadline to complete it. This allows you maximum flexibility, whether you’re hosting a live training or offering on-demand training materials.

Make your questions clear and varied

Course evaluation questions should be simple and straightforward. Respondents may have just finished the training program, and they have a lot of new information on their minds. Questions that are unclear, too long, or too difficult will lead to lower response rates and a lack of valuable insights. 

Vary the types of training survey questions you ask to keep respondents engaged and clarify potentially confusing questions. Combine a mix of rating scales, open-ended, multiple choice, and close-ended questions.

Communicate your survey goal

Communicate specific goals you wish to achieve so participants understand what kind of feedback to give you. 

To encourage participation, let trainees know how much you value their opinions. They will be more inclined to provide thorough answers if they know their time and insights are valued. 

Be transparent about results and take action

Trainees are perceptive and will notice if you’re taking their feedback seriously. Be open about the results, how you plan to use the information, and what steps you are taking to improve the program.  If participants know you’re constantly improving training sessions for their benefit, they will be motivated to attend more of them and share more honest opinions. 

Launch a successful training

With Docebo’s Learning Management System, you can provide your employees with an effective training experience that helps them achieve their goals and reach their full potential.

Schedule a demo with Docebo today to see how our solutions can benefit your team.


Class Size Distributions Interactive Report

Class Size Distribution Dashboard

Marquette University Zilber Hall, Room 203 1250 W. Wisconsin Ave. Milwaukee, WI 53233 Phone: (414) 288-8049 Fax: (414) 288-6318

  • Campus contacts
  • Search marquette.edu

A B C D E F G H I J K L M N O P Q R S T U V W X Y Z

Privacy Policy Legal Disclaimer Non-Discrimination Policy Accessible Technology

© 2024 Marquette University

  • Skip to main content
  • Skip to primary sidebar
  • Skip to footer
  • QuestionPro

survey software icon

  • Solutions Industries Gaming Automotive Sports and events Education Government Travel & Hospitality Financial Services Healthcare Cannabis Technology Use Case AskWhy Communities Audience Contactless surveys Mobile LivePolls Member Experience GDPR Positive People Science 360 Feedback Surveys
  • Resources Blog eBooks Survey Templates Case Studies Training Help center

research survey questionnaire about online classes

Home Surveys

Online questionnaire: Definition, examples & how to create them

As a customer, you have probably wanted to share your honest opinion of a product or service with its vendor. As a business, you have probably tried to collect customer feedback through various methods so you can offer the best possible experience. In either role, feedback takes center stage, and an online questionnaire makes it easier both to give and to gather.

Thanks to technology and the steady improvement of survey software like QuestionPro, gathering and providing valuable information keeps getting easier. Keep reading to learn what an online questionnaire is.

Content Index

  • What is an online questionnaire?
  • What is an online questionnaire used for?
  • What is the benefit of developing an online questionnaire?
  • Disadvantages of developing an online questionnaire
  • Examples of online questionnaires
  • How to create an online questionnaire with QuestionPro

An online questionnaire is a series of questions specifically structured to gather survey data about a target audience or group of people conducted online. 

Respondents access the form over an internet connection and can answer it quickly. Delivered through an online survey platform, a questionnaire lets you gather feedback about a product or service and collect data for research.

Online surveys are fully customizable and differ in format, length, and design. The survey software stores the responses in a database and organizes them for data analysis.

Create surveys the way you want using professional survey software, and achieve greater reach by quickly sending them to your target audience.


Online questionnaires make the process efficient: researchers can reach a wider audience, see results in real time, and collect responses that support better decision-making. And you don’t need to be an expert researcher to create one.

We can use questionnaires to:

  • Know what the population thinks about various topics by conducting opinion surveys.
  • Measure customer satisfaction.
  • Survey a given population’s socio-economic or educational level.
  • Evaluate personnel or know their level of job satisfaction through employee surveys.
  • Conduct online tests for your students.

Differences between an online survey and an offline survey:

  • It avoids going from house to house, so you get many responses at a lower cost.
  • Traditional surveys have the cost of production as a barrier to getting the proper sample for the research process.
  • You can place a tablet in kiosk mode for people to answer your questionnaire template so that you can know their points of view about your service. You can also set the survey URL on the sales ticket for them to access whenever they want.
  • You can create an anonymous online survey, so people can feel safe while answering. This translates into more honest and detailed responses.
  • An online survey allows you to analyze responses in real time, unlike a paper survey requiring more time for survey data collection and tabulation.

Paid or free online survey makers can be powerful if used correctly. They can help your company gather valuable information to improve products or services, and they can help you define and implement strategies to gain market share. You can also use single ease questions: a single ease question is a straightforward query that elicits a concise, uncomplicated response.

Some benefits of developing an online survey are easy to identify; others require more attention. Let’s look at some of them to better understand what the best online surveys can offer.


Get information in real-time

Unlike traditional offline surveys, an online survey maker lets you obtain answers from a large part of the sample population immediately.

Greater reach

This is one of the most important advantages. Online surveys allow you to reach as many people as you want quickly and easily. You can send them through email or QR codes, publish them on your social networks or web pages, or send them by SMS.

Save resources

Online surveys are significantly cheaper than paper ones. They also save time, since you don’t have to visit each respondent or process the answers by hand.

Better segmentation

An online survey allows you to segment your sample precisely according to the factors that suit you best. Thus, you can obtain more valuable and accurate survey responses. 

Greater participation

This point involves factors such as the length of the questionnaire, the ease and accuracy of the questions, and the design. You can also create mobile surveys and reach your participants wherever they are.  


Those responsible for creating surveys know how valuable these online survey tools are for obtaining information and developing future projects. But often, decision-makers prefer to use other data collection methods.

Some of the disadvantages of developing an online questionnaire are:

Internet access

Some online surveys require internet access to be conducted. However, with the online survey tool QuestionPro, you can collect responses without internet access, thanks to our offline survey app.

Having a computer or mobile device

To conduct an online survey, you must consider that your sample population has a mobile device or computer to respond.

Training to use a survey software

Before developing the online questionnaire, the researchers must know the tool’s characteristics and how to use it. This way, they will get the most out of it.

Challenging to obtain answers from older adults

Older adults are often less familiar with new technologies, so it can be harder to collect answers from this market segment. If necessary, adjust the survey to their needs.

An online questionnaire is valuable for a fast and efficient research process. With QuestionPro, you can do it easily and get the information you need to make the right decisions for your organization.


Here you will find some examples of questionnaires that can serve as a guide to designing your following survey and getting the survey results you want.

  • Work-from-home check-in
  • COVID-19 symptoms check
  • B2B customer pulse
  • Customer satisfaction survey
  • Employee satisfaction survey
  • Airline service evaluation questions
  • Transportation survey questions

With QuestionPro, you can leave paper behind and use a free online survey tool that lets you add the platform’s question types and survey logic, with solutions for companies, tech entrepreneurs, and small businesses.

In addition, it is possible to conduct user surveys without internet access. Respondents can answer the survey questions at their convenience. The survey responses are stored locally, and once they reconnect with the internet, the answers transfer to the server. QuestionPro users can generate reports, export results, or download them offline.
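
The store-and-forward flow just described can be sketched in a few lines of Python. This is an illustrative sketch of the general pattern, not QuestionPro’s actual implementation; the file name and `upload` callback are hypothetical:

```python
import json
import os

QUEUE_FILE = "pending_responses.json"  # hypothetical local store

def save_response(response: dict) -> None:
    """Append a completed response to the local queue while offline."""
    queue = []
    if os.path.exists(QUEUE_FILE):
        with open(QUEUE_FILE) as f:
            queue = json.load(f)
    queue.append(response)
    with open(QUEUE_FILE, "w") as f:
        json.dump(queue, f)

def sync_responses(upload) -> int:
    """Transfer queued responses once connectivity returns; returns count synced."""
    if not os.path.exists(QUEUE_FILE):
        return 0
    with open(QUEUE_FILE) as f:
        queue = json.load(f)
    for response in queue:
        upload(response)       # e.g. POST each answer set to the survey server
    os.remove(QUEUE_FILE)      # clear the queue after a successful sync
    return len(queue)
```

The key design point is that responses are durable on disk until the upload succeeds, so a dropped connection never loses data.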

Some of the features available in our app for questionnaires are:

  • Access or login question
  • Synchronization of answers
  • Language selection
  • Design in our interface and a fast user experience
  • Smooth animations

In addition to the many other features of our survey app, it is also possible to use a wide variety of questions that can be effective for your research process:

  • Multiple-choice questions
  • Single choice questions
  • NPS (Net Promoter Score)
  • Image selection
  • Image classification
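
As one illustration of how a question type turns into a metric, NPS is computed by subtracting the share of detractors (scores 0–6) from the share of promoters (9–10) on a 0–10 scale; a minimal sketch:

```python
def net_promoter_score(ratings: list[int]) -> float:
    """NPS = % promoters (9-10) minus % detractors (0-6), on a -100..100 scale."""
    if not ratings:
        raise ValueError("no ratings")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses -> NPS 30
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))  # 30.0
```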

How to create an online questionnaire with QuestionPro

Using QuestionPro to create an online questionnaire is a simple process. To begin, follow these steps:

Sign up for an account

Sign up for a free account at the QuestionPro website (www.questionpro.com). You may upgrade to a paid plan if you require advanced features and additional settings.

Create a new survey

After signing up and logging in, click the “Create a Survey” option to create your questionnaire.

Choose a template or start from scratch

QuestionPro provides a variety of pre-designed templates from which to get started immediately. Alternatively, you may create from scratch by using the blank template.

Add questions to your survey

Add questions to your questionnaire by selecting a question type from the QuestionPro options. 

Multiple-choice questions, open-ended questions, rating scales, matrix questions, and other question types can be included. Each question can be customized by adding answer options, creating skip logic, and specifying validation rules.

Configure survey settings

Configure the survey options to meet your needs. Options include survey title, description, privacy settings, response quotas, and survey distribution settings.

Test your survey

Before publishing your survey, make sure it works properly. Preview the survey, review each question, and test any skip logic or branching you’ve built up.

Distribute your survey

When you’re satisfied with your survey, you may begin distributing it to your target audience . QuestionPro offers a variety of distribution techniques, including sending the survey link through email, embedding it on your website, distributing it on social media platforms, and leveraging its panel services to reach a certain audience.

Collect and analyze responses

As respondents complete your survey, you can use QuestionPro’s reporting and analytics capabilities to track and evaluate responses in real time. You can generate reports, filter and segment the data, and export the results for further study if necessary.

Follow up and thank respondents

When you’ve collected enough responses, sending a thank-you note or follow-up correspondence is a good idea to acknowledge and appreciate the respondents’ time and effort.
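
The steps above can be condensed into a minimal survey model. This is an illustrative sketch only; the class names and the validation rule are hypothetical and do not reflect QuestionPro’s actual data format or API:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    qtype: str                      # e.g. "multiple_choice", "open_ended", "rating"
    options: list[str] = field(default_factory=list)
    required: bool = True

@dataclass
class Survey:
    title: str
    description: str = ""
    questions: list[Question] = field(default_factory=list)

    def add_question(self, q: Question) -> None:
        # A basic validation rule: choice questions need answer options
        if q.qtype == "multiple_choice" and not q.options:
            raise ValueError("multiple-choice questions need answer options")
        self.questions.append(q)

# Build a small survey, mirroring the create -> add questions -> configure flow
survey = Survey(title="Course feedback")
survey.add_question(Question("How satisfied are you?", "rating"))
survey.add_question(Question("Which format do you prefer?", "multiple_choice",
                             options=["Live sessions", "Recorded videos"]))
```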


The best way to use online survey templates to your advantage is to choose the perfect mix for your customers and their needs. With QuestionPro, you can get the information you need through the best free online survey maker using the features of our powerful tool. So, try out QuestionPro today!


Frequently Asked Questions (FAQ)

The margin of error is considerably decreased with online surveys because participants submit their responses directly into the system.

Online questionnaires make the procedure more efficient. Researchers can reach a larger audience, acquire real-time results, and collect enormous amounts of data for better decision-making.

Online surveys frequently have two severe methodological flaws: the population to which they are disseminated cannot be characterized, and respondents with biases may choose themselves from the sample.


Methodology: Teens and parents survey

The analysis in this report is based on a self-administered web survey conducted from Sept. 26 to Oct. 23, 2023, among a sample of 1,453 dyads, with each dyad (or pair) composed of one U.S. teen ages 13 to 17 and one parent per teen. The margin of sampling error for the full sample of 1,453 teens is plus or minus 3.2 percentage points. The margin of sampling error for the full sample of 1,453 parents is plus or minus 3.2 percentage points. The survey was conducted by Ipsos Public Affairs in English and Spanish using KnowledgePanel, its nationally representative online research panel.
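
The quoted ±3.2-point margin is wider than the simple-random-sampling formula alone would give for n = 1,453 because it folds in a design effect from weighting. A sketch; the design-effect value of about 1.55 is inferred here to match the reported figure, not stated in the report:

```python
import math

def margin_of_error(n: int, deff: float = 1.0, z: float = 1.96, p: float = 0.5) -> float:
    """Half-width of the 95% confidence interval for a proportion, in percentage points.

    deff is the design effect: the variance inflation caused by weighting/clustering.
    """
    return 100 * z * math.sqrt(deff * p * (1 - p) / n)

print(round(margin_of_error(1453), 1))             # 2.6 under simple random sampling
print(round(margin_of_error(1453, deff=1.55), 1))  # ~3.2 with a weighting design effect
```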

The research plan for this project was submitted to an external institutional review board (IRB), Advarra, which is an independent committee of experts that specializes in helping to protect the rights of research participants. The IRB thoroughly vetted this research before data collection began. Due to the risks associated with surveying minors, this research underwent a full board review and received approval (Approval ID Pro00073203).

KnowledgePanel members are recruited through probability sampling methods and include both those with internet access and those who did not have internet access at the time of their recruitment. KnowledgePanel provides internet access for those who do not have it and, if needed, a device to access the internet when they join the panel. KnowledgePanel’s recruitment process was originally based exclusively on a national random-digit dialing (RDD) sampling methodology. In 2009, Ipsos migrated to an address-based sampling (ABS) recruitment methodology via the U.S. Postal Service’s Delivery Sequence File (DSF). The DSF has been estimated to cover as much as 98% of the population, although some studies suggest that the coverage could be in the low 90% range. 1

Panelists were eligible for participation in this survey if they indicated on an earlier profile survey that they were the parent of a teen ages 13 to 17. A random sample of 3,981 eligible panel members were invited to participate in the study. Responding parents were screened and considered qualified for the study if they reconfirmed that they were the parent of at least one child ages 13 to 17 and granted permission for their teen who was chosen to participate in the study. In households with more than one eligible teen, parents were asked to think about one randomly selected teen and that teen was instructed to complete the teen portion of the survey. A survey was considered complete if both the parent and selected teen completed their portions of the questionnaire, or if the parent did not qualify during the initial screening.

Of the sampled panelists, 1,763 (excluding break-offs) responded to the invitation and 1,453 qualified, completed the parent portion of the survey, and had their selected teen complete the teen portion of the survey yielding a final stage completion rate of 44% and a qualification rate of 82%. The cumulative response rate accounting for nonresponse to the recruitment surveys and attrition is 2.2%. The break-off rate among those who logged on to the survey (regardless of whether they completed any items or qualified for the study) is 26.9%.
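
The completion and qualification rates quoted above follow directly from the reported counts (rounded to whole percentages):

```python
invited, responded, qualified = 3981, 1763, 1453

completion_rate = responded / invited        # share of invited panelists who responded
qualification_rate = qualified / responded   # share of respondents who qualified and completed

print(f"completion:    {completion_rate:.0%}")    # 44%
print(f"qualification: {qualification_rate:.0%}") # 82%
```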

Upon completion, qualified respondents received a cash-equivalent incentive worth $10 for completing the survey. To encourage response from non-Hispanic Black panelists, the incentive was increased from $10 to $20 on Oct. 5, 2023. The incentive was increased again on Oct. 10, 2023, from $20 to $40; then to $50 on Oct. 17, 2023; and to $75 on Oct. 20, 2023. Reminders and notifications of the change in incentive were sent for each increase.

All panelists received email invitations and any nonresponders received reminders, shown in the table. The field period was closed on Oct. 23, 2023.

A table showing Invitation and reminder dates

The analysis in this report was performed using separate weights for parents and teens. The parent weight was created in a multistep process that begins with a base design weight for the parent, which is computed to reflect their probability of selection for recruitment into the KnowledgePanel. These selection probabilities were then adjusted to account for the probability of selection for this survey which included oversamples of Black and Hispanic parents.

Next, an iterative technique was used to align the parent design weights to population benchmarks for parents of teens ages 13 to 17 on the dimensions identified in the accompanying table, to account for any differential nonresponse that may have occurred.

To create the teen weight, an adjustment factor was applied to the final parent weight to reflect the selection of one teen per household. Finally, the teen weights were further raked to match the demographic distribution for teens ages 13 to 17 who live with parents. The teen weights were adjusted on the same teen dimensions as parent dimensions with the exception of teen education, which was not used in the teen weighting.
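
The “iterative technique” described above is commonly implemented as raking (iterative proportional fitting). A minimal sketch on toy data; the dimensions and target margins here are illustrative, not Pew’s actual benchmarks:

```python
import numpy as np

def rake(weights, groups, targets, iters=50):
    """Iteratively scale weights so each dimension's weighted shares match its margins.

    groups:  list of arrays of category labels, one array per raking dimension
    targets: list of dicts mapping category -> target population share
    """
    w = np.asarray(weights, dtype=float).copy()
    for _ in range(iters):
        for g, t in zip(groups, targets):
            total = w.sum()
            for cat, share in t.items():
                mask = (g == cat)
                cur = w[mask].sum()
                if cur > 0:
                    w[mask] *= share * total / cur  # scale category to its target share
    return w

# Toy example: 4 respondents raked on gender and age margins (made-up numbers)
gender = np.array(["m", "m", "f", "f"])
age = np.array(["young", "old", "young", "old"])
w = rake(np.ones(4),
         [gender, age],
         [{"m": 0.5, "f": 0.5}, {"young": 0.6, "old": 0.4}])
```

After raking, the weighted share of each category matches its benchmark on every dimension simultaneously.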

Sampling errors and tests of statistical significance take into account the effect of weighting. Interviews were conducted in both English and Spanish.

In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.

The following tables show the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey:

A table showing unweighted sample sizes and the error attributable to sampling among parents of teens

Sample sizes and sampling errors for subgroups are available upon request.

Dispositions and response rates

The tables below display dispositions used in the calculation of completion, qualification and cumulative response rates. 2

A table showing dispositions

© Pew Research Center, 2024

  1. AAPOR Task Force on Address-Based Sampling. 2016. “AAPOR Report: Address-Based Sampling.”
  2. For more information on this method of calculating response rates, refer to Callegaro, Mario, and Charles DiSogra. 2008. “Computing Response Metrics for Online Panels.” Public Opinion Quarterly.


  • Science & Research Policy

Academic Publishers Threatened By Open-Access Expansion

Critics say a directive to make federally funded research immediately free to the public could violate authors’ copyrights. It could also disrupt the $19 billion academic publishing industry.

By  Kathryn Palmer



Some politicians and publishers argue that giving federal agencies a license to immediately publish scholarly research would violate authors’ copyright protections.

Photo illustration by Justin Morrison/Inside Higher Ed | GelatoPlus/Getty Images

Even as federal agencies work to implement the Nelson memo—a 2022 White House directive to make federally funded research freely available to the public immediately after publication—members of Congress are joining academic publishers in pushing back.

Under the directive, slated to go into effect by 2026, authors who use grant funding to produce research will be required to deposit their work into agency-designated public-access repositories as soon as it’s published. That change eliminates the existing option for authors or their publishers to place a 12-month embargo on public access to government-funded research publications, a rule that’s been in place since 2013.

Alondra Nelson, the former acting director of the Office of Science and Technology Policy, wrote in the 2022 memo that bears her name that the goal of lifting the embargo is to promote “equity and advance the work of restoring the public’s trust in Government science, and to advance American scientific leadership.”


Although open-access advocates and library groups support the move, opponents argue the new policy will limit researchers’ ability to maintain control of their published work—and cut into the $19 billion academic publishing industry ’s profit margins.

“Researchers should have the right to choose how and where they publish or communicate their research and should not be forced to disseminate their research in ways or under licenses that could harm its integrity or lead to its modification without their express consent,” the Senate and House Appropriations Committees both wrote in reports attached to their draft budget bills, which passed out of committee earlier this summer.

Carl Maxwell, vice president of public policy for the Association of American Publishers (AAP), said in an email to Inside Higher Ed that the organization applauds Congress’s latest efforts to scrutinize the Nelson memo and “protect the right of authors to determine how the articles, books, and reports they have written are licensed.”

Federal Purpose License

As tensions over open-access expansion mount in Washington, a growing number of academic library groups across the nation have expressed concern about the challenges of helping authors learn to comply with the new deposit policy, which many may encounter for the first time after the Nelson memo takes effect.

“In a worst-case scenario, authors who do not understand their grant requirements and the legal landscape may face negative enforcement actions from funders, disputes about copyrights or contracts, or roadblocks to publishing,” reads a recently drafted petition signed by dozens of individual librarians and library groups, including the Authors Alliance and SPARC, the Scholarly Publishing and Academic Resources Coalition.

The petition encourages federal agencies to “level the playing field for authors” by applying the federal purpose license, a decades-old regulation that gives federal agencies that funded the publication of research “the royalty-free, nonexclusive and irrevocable right” to reproduce, publish or otherwise use the work.

Although federal purpose license advocates believe it will “provide grant recipients with a clear understanding of their obligations as authors [and] facilitate better compliance with funder requirements,” the House Appropriations Committee’s recently advanced bill prohibits agencies from exerting “broad” federal purpose authority.

Not a ‘Viable’ Path

Publishers don’t like the idea, either.

Maxwell, a registered lobbyist for the AAP , said that while “broad open licenses may make sense for some researchers,” others “may be rightfully concerned about inappropriate modification, or commercialization of their publication, and those authors should have the final say in who can modify and commercialize their work.”


Although the copyright-focused argument is dominating the political opposition to the Nelson memo this year, last year the House Appropriations Committee tried—but failed—to block funding to implement it. Public comments submitted by the AAP and numerous other publishing groups, including the American Society of Civil Engineers, Springer Nature and Wiley, show the embargo lift is also creating financial anxieties.

“There is no viable way for scholarly societies and other publishers to continue to produce trustworthy, high-quality open access publications without any means to recoup the significant investments and expenses required for them to do so,” Maxwell wrote in a letter submitted on behalf of the AAP to the National Institute of Standards and Technology last August. “We are concerned about potential long-term effects of the new policy on the scholarly communication ecosystem.”

Currently, the academic publishing industry’s business model relies largely on an author’s willingness to submit work for free—or even pay to publish it—and the publisher’s ability to turn around and sell that research to academic libraries through expensive journal subscriptions. Libraries at doctoral-granting institutions spend about 80 percent of their materials budgets on such subscriptions, according to data from the Association of Research Libraries (ARL), which supports expanding open access of federally funded research and the federal purpose license.

“We don’t have any concerns about agencies limiting authors’ control over their works,” said Katherine Klosek, ARL’s director of information policy and federal relations. “These are nonexclusive licenses that authors are granting to agencies to use their work, so authors can retain those rights and choose to publish wherever they like in addition to complying with public access policies.”

Despite its purported concerns about copyright infringement, the publishing industry hasn’t always prioritized the rights of individual authors; last month the academic publishing giant Taylor & Francis angered many scholars by neglecting to mention it was selling their work to Microsoft for $10 million as part of an AI partnership.

Dave Hansen, executive director of the Authors Alliance, a California-based nonprofit that supports authors in disseminating their work, said authors already lack control because most closed-access subscription journals require them either to assign their copyright to the publisher or to grant the publisher exclusive rights.

“The idea that the federal government is exercising its right—before publishers swoop in—to reserve for itself this nonexclusive license is troubling to publishers that might worry that would limit their ability to exploit their exclusive rights for subscription revenue or other kinds of licensing deals,” he said. “But that’s not really a copyright conflict—that’s just a business model conflict.”

To address that conflict, publishers will first have to decide if they want to continue publishing research funded by the federal government, which finances nearly 55 percent of academic research and development, according to the National Science Board . “[For] most publishers, the answer would be absolutely not, because they’d have nothing to publish,” Hansen added.

He believes implementing the Nelson memo will also push publishers to make big decisions about how the industry should move forward in an age of open-access expansion.

“Do they adapt their business models and try to align more closely with what authors and funders want?” he said. “Or do they try to stick out the model they’ve developed, which is one that’s dependent on them trying to scoop up as much exclusive rights as they can for authors’ articles?”


  • Open access
  • Published: 24 August 2024

Analysis of factors influencing medical students’ learning engagement and its implications for teaching work: a network analysis perspective

  • Yijun Li 1 ,
  • Fengzhan Li 1 ,
  • Peng Fang 1 ,
  • Xufeng Liu 1 &
  • Shengjun Wu 1  

BMC Medical Education volume 24, Article number: 918 (2024)


Higher medical education has always been a major project in the fields of education and health, and therefore, the quality of education has received much attention. Learning engagement has emerged as a significant indicator of teaching quality, attracting considerable research attention. This study aims to explore the relationship between medical students’ learning engagement and their sense of school belonging, professional identity, and academic self-efficacy.

We conducted an online survey, using a convenience sampling method, with 311 medical students. We employed the revised version of the Utrecht Work Engagement Scale-Student (UWES-S), the Chinese version of the Psychological Sense of School Membership (PSSM) scale, the Academic Self-Efficacy Scale, and the questionnaire of college students’ speciality identity for evaluation. Network analysis was used to examine the relationships among these factors.

Medical students’ overall performance in school showed a positive trend, although there is still room for improvement. In the network structure of learning engagement and its influencing factors, the “emotional” aspect of professional identity (EI = 1.11) emerged as an important node with strong centrality, and the “academic competence self-efficacy” aspect of academic self-efficacy (BEI = 0.72) emerged as an important node with strong transitivity.

Deepening medical students’ emotional identification with their profession and enhancing their confidence in their academic abilities may improve their learning engagement and educational quality.

Peer Review reports

Introduction

Higher medical education is intricately linked to both education and public welfare initiatives. It is responsible for nurturing medical professionals and advancing medical education, while also playing a crucial role in the establishment of a “Healthy China” [ 1 ]. Consequently, the quality of higher medical education is under increased scrutiny as both education and healthcare, essential components of public welfare, demand higher standards. Historically, efforts to enhance educational quality have focused predominantly on external resources such as hardware facilities, faculty strength, and teaching methods [ 2 ]. The emergence of educational theories, particularly the “student-centered” approach, has elevated the significance of students’ agency in shaping learning outcomes. The publication of the “Implementation Opinions on the Construction of First-class Undergraduate Courses in 2019” represents a notable advancement in education research, notably integrating the concept of “student learning engagement” into national policy discussions [ 3 ]. Consequently, the issue of learning engagement has garnered significant attention, given its close association with academic performance. It serves as a pivotal indicator and determinant in assessing the effectiveness and quality of learning [ 4 , 5 , 6 ]. It is essential to study the learning engagement of medical students and its influencing factors.

Learning engagement

Learning engagement, a key factor tied to academic success [ 4 , 5 , 6 , 7 ], is the positive, fulfilling state shown by students during learning, marked by energy, devotion, and concentration [ 8 ]. Students who are more engaged get better grades and improve their personal and professional qualities [ 6 , 7 ]. It assists teachers in understanding and tailoring teaching to students’ abilities and helps schools foster a conducive learning environment. Influences on student engagement include external factors like school, sense of belonging, and relationships with teachers and peers, and internal factors like motivation, interest, self-efficacy, and identity [ 8 , 9 , 10 , 11 , 12 ]. Various research methods, such as questionnaires, interviews, and observations, have highlighted the impact of individual, family, and school factors on engagement. However, existing research mainly focuses on middle or regular college students, with less emphasis on the unique challenges faced by medical students, who often deal with heavy workloads and intense academic demands. Further study on factors affecting medical students’ engagement is vital for enhancing their academic outcomes.

Academic self-efficacy and learning engagement

Bandura’s theory of social learning underscores the significance of self-efficacy and highlights the interplay among behavior, individual characteristics, and environmental factors. Academic self-efficacy plays a crucial role in driving learning engagement and academic achievement [ 13 , 14 , 15 , 16 ], serving as a key determinant of students’ success in various academic contexts [ 17 , 18 ]. This concept pertains to individuals’ perceptions of their own capabilities to attain academic objectives [ 19 ]: previous accomplishments bolster positive beliefs in their academic performance potential, consequently fostering increased engagement in learning activities and higher prospects for future success. Conversely, previous academic setbacks, low expectations, and negative assessments of academic capabilities can result in considerable personal fatigue [ 9 ]. Furthermore, research indicates that individuals with high academic self-efficacy demonstrate increased levels of innovation and participation in scientific endeavors [ 17 ], as well as heightened engagement in various modes of learning [ 20 , 21 , 22 , 23 ]. Additionally, it has been found that high academic self-efficacy can enhance innovation and scientific involvement [ 24 , 25 ] and reduce academic and occupational burnout.

Psychological sense of school membership and learning engagement

According to Bandura, behavior, internal cognitive processes, and the environment mutually influence one another, shaping an individual’s behavioral outcomes [ 26 ]. The educational environment plays a significant role in shaping the learning engagement of medical students, with schools being a primary source of influence. Consequently, the assessment and acknowledgment of schools have a direct impact on students’ learning behaviors. Psychological sense of school membership pertains to the degree to which students perceive acceptance, inclusion, respect, and support within the school setting [ 27 , 28 ]. This has a positive effect on students’ mental health and learning behavior [ 29 ]. Drawing from Maslow’s hierarchy of needs theory, psychological sense of school membership aligns with students’ foundational needs within the educational context, laying the groundwork for their academic development [ 30 ]. Studies have indicated that a strong psychological sense of school membership is associated with increased levels of learning engagement [ 27 , 31 , 32 ], academic achievement [ 33 ], academic self-efficacy, and reduces academic fatigue [ 34 ]. This factor plays a significant role in shaping students’ learning behaviors. Specifically, an enhanced psychological sense of school membership in preclinical medical students has been linked to improved academic performance, while a decline in this sense is correlated with heightened levels of learning fatigue. Improving students’ psychological sense of belonging to their educational institution has been shown to be an effective strategy for mitigating school fatigue and enhancing academic achievement [ 35 ].

College students’ speciality identity and learning engagement

The primary intrinsic determinants of medical students’ engagement in their studies are their attitudes and evaluations towards their chosen field of study. Developing a strong sense of identification with their chosen field is crucial for the establishment of a professional identity [ 36 ]. Speciality identity, on the other hand, pertains to the emotional acceptance and recognition that learners experience after acquiring a comprehensive understanding of the subject matter they are studying. The development of specialty identity in medical students is characterized by positive external behaviors and a sense of appropriateness, reflecting a process of emotional, attitudinal, and cognitive assimilation [ 37 ]. The relationship between specialty identity and career identity is particularly significant for medical students, as they share similarities and intersecting influences. Consequently, the establishment of a strong specialty identity not only influences the realization of potential and quality of education, but also impacts work performance and passion in the workplace. The research indicates that speciality identity is closely related to learning engagement, academic performance, and psychological sense of school membership [ 2 ]. Therefore, professional identification can be regarded as a pre-variable for learning engagement, integrating cognition, behavior, and emotion, which deserves further investigation.

Previous studies have primarily examined the impact of psychological sense of school membership, speciality identity, and academic self-efficacy on learning engagement using scale-based assessments. However, this scoring-based approach has certain limitations. First, the reliance on overall scale scores restricts a comprehensive understanding of the influence of these factors: it may overlook the fine-grained relationships among dimensions within these psychological or educational constructs and blur the importance of different dimensions [ 38 , 39 ]. For example, learning engagement is a complex structure that integrates motivation, energy, and focus, yet research based on a holistic perspective neglects the pathways between psychological or educational constructs, which could serve as adjustable targets for enhancing learning engagement and instructional quality. Additionally, this approach assumes causal relationships among the four variables, but in fact each of them can function as an antecedent, mediator, or outcome variable [ 21 , 40 , 41 ]. Therefore, there may not be a singular causal relationship among these four variables. By applying network analysis, the variables can be placed within a network structure to observe their relationships with one another, so examining the network structure of the four variables at a granular level is crucial. In this study, the network analysis method was employed to investigate the relationships among psychological sense of school membership, speciality identity, academic self-efficacy, and learning engagement. The dimensions are unified in a network, with the aim of identifying key nodes for enhancing medical students’ learning engagement and providing evidence for effective methods to improve it.

Participants

This study recruited participants through an online survey platform ( www.wenjuan.com ). This platform is a comprehensive online tool that supports multiple publication methods, including links and QR codes. Data is automatically stored in the background, and valid data can be filtered based on IP address and response time. The questionnaire itself is available at https://www.wenjuan.com/s/UZBZJvv7SI/ . A survey was conducted among 311 medical students from a certain medical university. Among them, there were 212 male students and 99 female students. The median age of the participants was 20, with a range from 18 to 21. The majority of the students surveyed were in their second and third year (first-year students had spent a shorter period of time at the university and were less familiar with the school, teachers, and peers, as well as having a limited perception of their professional knowledge and skills; fifth-year students were already engaged in hospital internships, so their experience of university life was not considered in the survey).

Research tools

Revised version of the Utrecht Work Engagement Scale-Student (UWES-S)

The scale used in this study is the revised version of the Utrecht Work Engagement Scale-Student (UWES-S) developed by Li Xiying [ 42 ]. This version includes three aspects: motivation (referring to individuals’ liking for learning and being interested, understanding the significance of learning, and experiencing happiness), energy (referring to individuals being energetic, not easily fatigued in learning, and persisting), and focus (referring to individuals being fully concentrated on learning, reaching a state of self-forgetfulness). The questionnaire consists of 17 questions, using the Likert 7-point scoring method. In this method, a score of 1 represents “never,” and a score of 7 represents “always.” The higher the score, the higher the individual’s level of learning engagement. In this study, the Cronbach’s α coefficient for the scale was found to be 0.983, indicating high internal consistency. Additionally, the three dimensions of the scale demonstrated good internal consistency with Cronbach’s α coefficients of 0.969, 0.973 and 0.969 respectively.
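The internal-consistency figures reported throughout the Research tools section follow the standard Cronbach’s α formula, α = k/(k−1) · (1 − Σσ²ᵢ/σ²ₜ). As a hedged illustration only (the authors’ analysis code is not published), a minimal Python sketch applied to simulated 7-point Likert responses for a 17-item scale:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 7-point Likert responses (311 respondents x 17 items); a shared
# latent trait drives the inter-item consistency that alpha measures.
rng = np.random.default_rng(0)
latent = rng.normal(size=(311, 1))
noise = rng.normal(size=(311, 17))
scores = np.clip(np.round(4 + 1.2 * latent + 0.8 * noise), 1, 7)
alpha = cronbach_alpha(scores)
```

With strongly correlated items and k = 17, α lands near the high values reported here; perfectly duplicated items give α = 1 exactly.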

Chinese version of the psychological sense of school membership (PSSM) scale

The Psychological Sense of School Membership (PSSM) Scale, which is a Chinese version of the scale translated and revised by scholars from Hong Kong, was utilized in this study [ 43 ]. The questionnaire encompasses two dimensions: a sense of belonging (including responsibility and pride in the school, positive teacher-student relationships, and good peer relationships) and a sense of resistance (feeling rejected or unrecognized by the school and its members). The questionnaire consists of 18 items and follows a standardized format. A Likert 6-point scoring method was employed, where 1 point represents complete non-compliance and 6 points represent complete compliance. A higher score indicates a stronger sense of belonging to the school. The Cronbach’s α coefficient for the scale in this study was calculated as 0.848, demonstrating good internal consistency. The Cronbach’s α coefficients for the dimensions were 0.966 and 0.944, respectively.

Academic self-efficacy scale

The scale used in this study is an adapted version of the Academic Self-Efficacy Scale developed by Liang Yusong [ 44 ]. The questionnaire consists of 22 items, measuring two dimensions: self-efficacy in learning ability (individuals’ judgment of, and confidence in, their ability to successfully complete their studies, achieve good grades, and avoid academic failure) and self-efficacy in learning behavior (individuals’ judgment of, and confidence in, their ability to adopt specific learning methods to achieve learning goals). A Likert 4-point scoring method was used: a score of 1 indicates complete inconsistency, while a score of 4 indicates complete consistency. The higher the score, the higher the perceived academic self-efficacy. The Cronbach’s α value of the scale in this study was 0.961, indicating high internal consistency, and the Cronbach’s α coefficients for the two dimensions were 0.970 and 0.918, respectively.

Questionnaire of college students’ speciality identity

The scale used in this study is the College Students’ Speciality Identity Scale developed by Qin Panbo [ 37 ]. It consists of four dimensions: cognitive dimension (knowledge about the profession), affective dimension (liking for the profession), behavioral dimension (professional behaviors), and appropriateness dimension (degree of match between the profession and oneself). The scale includes 23 items, rated on the Likert 5-point scale, with 1 indicating completely disagree and 5 indicating completely agree. Higher scores indicate stronger professional identification. The Cronbach’s α coefficient for the scale in this study was 0.980, and the Cronbach’s α coefficients for the four dimensions were 0.927, 0.962, 0.949, and 0.936, respectively.

Network analysis

The Gaussian graphical model (GGM) was employed to estimate this undirected network [ 45 ]. In this model, each dimension of the four questionnaires is treated as a node, and the partial correlation between two nodes, after statistically controlling for the influence of all other nodes, is treated as an edge. The network was constructed from nonparametric Spearman correlations, with regularization performed using the Least Absolute Shrinkage and Selection Operator (LASSO) and the Extended Bayesian Information Criterion (EBIC) [ 45 , 46 ]. This study uses the R package qgraph to construct and visualize networks and to calculate expected influence (EI) as a centrality index. EI values indicate the importance of a node in the entire network: the higher the EI value, the greater the influence of the node [ 47 ]. Bridge expected influence (BEI), computed with the R package networktools, serves as a centrality index for bridges; a higher BEI value indicates a higher risk of spreading from the current community to other communities [ 48 ] (see Supplementary Material 1 for details). In the current network, nodes were divided into four communities before analysis: psychological sense of school membership (2 nodes), learning engagement (3 nodes), students’ speciality identity (4 nodes), and academic self-efficacy (2 nodes).
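The estimation step above can be sketched in Python. Note that this is a deliberately simplified, unregularized illustration: the study itself uses Spearman correlations with LASSO regularization and EBIC model selection via R’s qgraph, whereas this sketch simply inverts a Pearson correlation matrix and crudely thresholds small edges, and the data are simulated placeholders.

```python
import numpy as np

def partial_correlation_network(data: np.ndarray, threshold: float = 0.0) -> np.ndarray:
    """GGM edge weights: partial correlations obtained by inverting the
    correlation matrix (no LASSO/EBIC regularization in this sketch)."""
    corr = np.corrcoef(data, rowvar=False)
    prec = np.linalg.inv(corr)                              # precision matrix
    scale = np.sqrt(np.outer(np.diag(prec), np.diag(prec)))
    pcorr = -prec / scale                                   # partial correlations
    np.fill_diagonal(pcorr, 0.0)
    pcorr[np.abs(pcorr) < threshold] = 0.0                  # crude sparsification
    return pcorr

# 311 respondents x 11 questionnaire dimensions (simulated placeholder data)
rng = np.random.default_rng(42)
data = rng.normal(size=(311, 11))
W = partial_correlation_network(data, threshold=0.05)
```

The resulting matrix is symmetric with a zero diagonal, matching the undirected, self-loop-free network structure described above.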

Overall situation of medical students’ psychological sense of school membership, speciality identity, academic self-efficacy and learning engagement

The mean scores and standard deviations for each variable are shown in Table  1 . The average score for medical students’ psychological sense of school membership was 4.38 ± 1.00, the average score for their speciality identity was 4.12 ± 0.74, the average score for academic self-efficacy was 3.08 ± 0.61, and the average score for learning engagement was 5.16 ± 1.34. All were higher than the corresponding theoretical scale midpoints of 3.5, 3, 2.5, and 4, respectively.

Network analysis of medical students’ psychological sense of school membership, speciality identity, academic self-efficacy and learning engagement

According to Fig.  1 , this network structure has several notable characteristics. First, among the 55 potential edges, 34 are non-zero: 4 negatively correlated edges and 30 positively correlated edges, with all 4 negative edges involving the resistance dimension of school belonging. There are four distinct clusters within the network, suggesting different subgroups or patterns of relationships. Second, the strongest connections appear within the “learning engagement” and “professional identification” communities. In “learning engagement”, “vigor” and “focus” have the strongest correlation (weight = 0.495). In “professional identification”, “cognitive” and “behavioral” have a stronger correlation (weight = 0.407). There are also strong connections across communities: “emotional” in “professional identification” is strongly associated with “ability self-efficacy” in “academic self-efficacy” (weight = 0.334), and “sense of belonging” in “school belonging” is strongly related to “motivation” in “learning engagement” (weight = 0.205). All edge weights are given in Supplementary Table 1 . The bootstrapped 95% CI was narrow, suggesting that the estimation of edge weights was accurate and stable (Supplementary Fig.  1 ). The bootstrapped difference test for edge weights is shown in Supplementary Fig.  2 .

figure 1

The network structure diagram of medical students’ psychological sense of school membership, speciality identity, academic self-efficacy and learning engagement
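The bootstrapped 95% CIs mentioned above follow the standard nonparametric scheme: resample respondents with replacement and re-estimate each edge many times. A hedged single-edge sketch, where the `x` and `y` data are simulated stand-ins for two questionnaire dimensions and the edge statistic is a plain correlation (the real analysis bootstraps the full regularized network):

```python
import numpy as np

def bootstrap_edge_ci(x, y, n_boot=1000, level=0.95, seed=0):
    """Nonparametric bootstrap percentile CI for a single edge weight."""
    rng = np.random.default_rng(seed)
    n = len(x)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample respondents with replacement
        stats[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    tail = (1 - level) / 2
    return np.quantile(stats, tail), np.quantile(stats, 1 - tail)

# Simulated pair of correlated dimensions (n = 311, true r ~ 0.71)
rng = np.random.default_rng(1)
x = rng.normal(size=311)
y = x + rng.normal(size=311)
lo, hi = bootstrap_edge_ci(x, y)
```

A narrow interval around the sample estimate, as reported for this study’s edges, indicates a stable estimation.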

Importance of nodes is evaluated by their EI values in the network. As shown in Fig.  2 a; Table  1 , emotionality (EI = 1.11) and self-efficacy of learning ability (EI = 1.06) are relatively high, making them the most important central nodes. Moreover, the stability coefficient of EI (CS = 0.75) is high, indicating a stable estimation of EI (see Supplementary Fig.  3 ). The result of the bootstrapped difference test for node EI is shown in Supplementary Fig.  4 . BEI values indicate the importance of nodes in transmitting to other communities in the network. As shown in Fig.  2 b; Table  1 , self-efficacy (BEI = 0.72) and affectivity (BEI = 0.64) are relatively high, being key bridge nodes. Additionally, the stability coefficient of BEI (CS = 0.75) is high, indicating a stable estimation of BEI (see Supplementary Fig.  5 ). The result of the bootstrapped difference test for node BEI is shown in Supplementary Fig.  6 .

figure 2

The expected influence and bridge expected influence of each node in the present network (raw value)

Expected influence (EI).

Bridge expected influence (BEI).
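Concretely, one-step EI is a node’s summed signed edge weights, and BEI is its summed signed weights to nodes outside its own community. A minimal sketch on a toy signed adjacency matrix (the values and the two-community split are illustrative, not the study’s network):

```python
import numpy as np

def expected_influence(W: np.ndarray) -> np.ndarray:
    """One-step expected influence: sum of each node's signed edge weights."""
    return W.sum(axis=1)

def bridge_expected_influence(W: np.ndarray, communities) -> np.ndarray:
    """Bridge EI: sum of each node's signed edges to nodes in other communities."""
    comm = np.asarray(communities)
    cross = comm[:, None] != comm[None, :]        # True where an edge crosses communities
    return (W * cross).sum(axis=1)

# Toy 4-node network, communities {0, 1} and {2, 3}
W = np.array([[0.0, 0.5, 0.2, 0.0],
              [0.5, 0.0, 0.0, 0.1],
              [0.2, 0.0, 0.0, 0.4],
              [0.0, 0.1, 0.4, 0.0]])
ei = expected_influence(W)                        # node 0: 0.5 + 0.2 = 0.7
bei = bridge_expected_influence(W, [0, 0, 1, 1])  # node 0: only the 0.2 cross edge
```

Nodes with high EI matter most within the whole network; nodes with high BEI are the main conduits between communities, which is why “emotionality” and “self-efficacy of learning ability” stand out above.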

The research status of students’ learning engagement and its influencing factors

Medical students exhibit positive perceptions and behaviors in school, consistent with previous research findings [ 2 ]. Specifically, medical students have a strong sense of belonging to their school, indicating their acceptance of and satisfaction with the institution. They generally have a high level of identification with their chosen major, with the highest cognitive identification and the lowest affective identification. This suggests that while medical students have a good understanding of their chosen field, they may have lower emotional engagement, possibly due to academic pressure and related emotions. Medical students are confident in completing their studies, with higher self-efficacy in learning ability and lower self-efficacy in learning behavior, indicating that they have a positive and objective evaluation of their own learning ability but lack confidence in adopting effective methods to achieve learning goals. These results are similar to those of previous studies [ 49 , 50 ], which found that the performance of Chinese college students in speciality identity, psychological sense of school membership, and academic self-efficacy is above average. Lastly, medical students generally exhibit a positive and engaged attitude towards learning, although they may have slightly lower motivation, energy, and focus. The engagement results, however, differ from some previous findings: researchers have argued that insufficient learning engagement is common among Chinese college students, especially vocational nursing students [ 49 , 51 , 52 , 53 ]. This may be related to differences between majors: medical undergraduates face heavy academic tasks and many assessments, and to meet the assessment standards they must increase their learning engagement to improve their academic performance.

Important nodes in network analysis: emotion and ability self-efficacy

Network analysis reveals the relationships between dimensions of the four variables. The most important among them are the values of EI and BEI in the network. In the current network, the highest EI and BEI values are found in “Emotion” in the dimension of “Professional Identity” and “Ability Self-efficacy” in the dimension of “Academic Self-efficacy”.

Firstly, professional identity is frequently positively correlated with learning engagement [ 54 , 55 ]. Especially in majors such as medicine or education, the positive role of professional identity in learning engagement is extremely important [ 56 ]. Some studies suggest that professional identity refers to the learner’s internal acceptance and recognition of the courses studied, which is transformed into external behavioral motivation; it is a process of gradual transfer of emotions, attitudes, and even cognitions [ 54 , 57 ]. Accordingly, particular attention has been paid to the positive predictive effect of emotional identity, within professional identity, on learning engagement, which is consistent with the results of this study. Social and Emotional Learning (SEL) emphasizes the importance of emotions in education and recognizes the interconnection between cognition and emotions [ 58 ]. A large body of research has shown the impact of SEL on students’ academic achievement, and it has been proven to be a predictive factor for students’ success in school [ 58 , 59 , 60 ]. Within the dimension of professional identity, therefore, emotion plays a significant role in influencing cognition and behavior, and social and emotional learning also contributes to students’ attention regulation and the reshaping of learning behavior. In this network, emotion serves as a highly influential node that can affect the focus, behavior, and self-efficacy nodes, consequently influencing learning engagement, professional identification, and academic self-efficacy. In addition to the direct impact of professional identity (especially emotional identity) on learning engagement [ 61 , 62 ], there is also an indirect impact, mainly mediated by self-efficacy [ 49 , 50 , 63 ].

Secondly, the theory of self-efficacy suggests that self-efficacy regulates the relationship between learning goals and learning behavior, serving as a determinant of learning behavior [ 64 ]. Academic self-efficacy refers to individuals’ subjective judgment of their own learning abilities, which primarily functions to regulate and control learning behavior, thereby influencing learning outcomes [ 49 , 50 , 51 ]. Students with high academic self-efficacy are more likely to select challenging learning tasks and to persist in completing them with confidence, achieving positive results [ 65 , 66 , 67 ]. In this process, they exhibit courage in facing challenging learning tasks, demonstrating resilience and a proper understanding of their abilities; they are not easily discouraged by setbacks and remain confident in overcoming obstacles, leading to progress and success in their academic endeavors [ 68 ]. Such students attribute success to their own high abilities and failure to insufficient effort. As a result, they do not experience negative academic emotions such as inferiority; they approach each academic success and failure with a positive mindset, learning from their experiences and continuously striving for better academic performance [ 64 , 69 ]. Therefore, academic self-efficacy plays an important role in the network, with strong transmission ability to aspects such as emotions, behavior, and cognition, ultimately influencing learning engagement. However, researchers have not previously studied the two distinct dimensions of academic self-efficacy separately. This study reveals that academic ability self-efficacy has stronger transmissibility than academic behavior self-efficacy and can impact the other communities (professional identity, sense of belonging, and learning engagement) within the current network.
This could be because academic ability self-efficacy is the recognition of and self-confidence in one’s own abilities, thereby stimulating greater enthusiasm and interest in learning engagement [ 68 ]. Moreover, the positive feedback in learning and the recognition of one’s major lead to the continuous improvement of one’s abilities during the learning process, and the stronger the self-identity becomes [ 70 ].

Limitations

Although this study has obtained some results and implications, it still has some limitations. First of all, although the educational and psychological constructs measured in this paper are students’ subjective feelings, self-reported measurement methods may introduce biases such as social approval effects. Follow-up studies could incorporate objective indicators such as academic performance and other evaluation methods to determine the relationship between medical students’ learning engagement and its influencing factors. Secondly, this paper uses cross-sectional data, which may not capture the long-term change and development of students, nor can it support inferences about directional relationships in the network. Tracking students and recording data over time could be considered in the future to achieve richer results. Last but not least, the sample comprised medical students from the first to the fourth year. Medical students in different grades may face different pressures of study and life and differ in professional knowledge and clinical experience, which may affect the research results. Moreover, the non-inclusion of students in other grades or other types of medical education (such as graduate students, advanced students, etc.) limits the generalization of the findings to the broader field of medical education. Future studies can include more types of medical students and expand the sample size, examine the learning engagement of medical students at each stage and its influencing factors, and determine better and more accurate methods for promoting learning engagement, so as to improve the quality of higher medical education.

Conclusion and recommendations

In conclusion, medical students exhibit predominantly positive attitudes and behaviors within their academic environment, yet there remains substantial room for improvement. Within the network of learning engagement factors, “affective” in college students’ speciality identity is the node with the greatest expected influence in the entire network, and “self-efficacy in learning ability” in academic self-efficacy is the node with the greatest bridge expected influence (transmissibility). Both are therefore key targets for influencing medical students’ learning engagement and important elements for academic performance. Consequently, targeting these domains could enhance medical students’ learning engagement, ultimately contributing to the overall quality of education.

The first recommendation is to deepen medical students’ identification with their profession from an emotional perspective. Medical students already understand the medical profession and future career prospects from the cognitive and behavioral aspects. However, their degree of identification with the profession is influenced by academic pressure, responsibility risks, public opinion, and the potential doctor-patient relationship. Therefore, it is necessary to strengthen the cultivation of humanistic spirit in medical students in the new era, paying attention to the following cultivation methods:

Clarify the purpose of traditional medical education, strengthen the guidance of medical students’ professional thoughts, reduce anxiety and panic about future employment positions, and further enhance medical students’ sense of professional identity.

Increase the cultivation of high-level medical and health talents. The process from medical students to medical professionals requires a long-term cultivation process. Only with further development of incentive systems and improvement of teaching quality can medical students have greater learning motivation and professional identity.

Create a good school atmosphere, improving medical students' favorable impression of, and identification with, their school and profession through both internal and external conditions.

Improving academic self-efficacy is instrumental in fostering deeper engagement in learning among medical students. To enhance their academic self-efficacy, the following strategies can be employed:

Increase direct experiences of success: Educators should progressively escalate the difficulty of learning tasks to afford students more opportunities for successful outcomes.

Facilitate vicarious experiences: Teachers can model effective practices or showcase successful student role models, allowing medical learners to relate to similar success and failure scenarios, which in turn influences their self-efficacy.

Strengthen social support: Encouragement, trust, and backing from teachers, family, friends, and peers contribute to a student’s belief in their ability to overcome challenges, thereby prompting greater effort in tackling demanding academic tasks.

Cultivate positive emotions: Given the reciprocal relationship between emotions and self-efficacy, cultivating positive emotions enhances motivation, learning efficiency, and outcomes, fostering confidence. This can be achieved by promoting a supportive learning environment, providing constructive feedback, and maintaining harmonious teacher-student relationships.

These approaches, grounded in Bandura's self-efficacy theory, may be helpful for enhancing the academic performance and dedication of medical students.

Based on our analysis, we identified the affective dimension of speciality identity and academic self-efficacy as pivotal elements among the constructs of college students' speciality identity, psychological sense of school membership, academic self-efficacy, and academic engagement. Focusing on the cultivation of humanistic spirit among medical students and on enhancing their academic self-efficacy, particularly with respect to professional identity, supports a student-centered approach that promotes greater academic investment and improved academic performance. Such an approach helps raise the overall quality of medical education, thereby solidifying the foundation for the development of competent medical professionals.

Data availability

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Gan Y. Optimization strategies for the Training Program of Clinical Medical undergraduates in the context of Healthy China. Army Medical University, Department of Military Preventive Medicine; 2020.

Li R. The Impact of College Students’ Major Identification on Learning Engagement: A Chain Mediating Role of School Belonging and Academic Self-Efficacy. Department of Public Administration, South China University of Technology; 2018.

Yang L. Optimization Study of College Students’ Learning Engagement from the Perspective of Embodied Cognition Theory. Hunan University of Science and Technology; 2022.

Mizani H, Cahyadi A, Hendryadi H, Salamah S, Retno Sari S. Loneliness, student engagement, and academic achievement during emergency remote teaching during COVID-19: the role of the God locus of control. Humanit Social Sci Commun 2022, 9(1).

Xie K, Vongkulluksn VW, Lu L, Cheng S. A person-centered approach to examining high-school students’ motivation, engagement and academic performance. CONTEMP EDUC PSYCHOL. 2020;62:101877.


Zheng Z, Zeng M, Huang W, Sun N. The influence of university library environment on student interactions and college students’ learning engagement. Humanit Social Sci Commun 2024, 11(1).

Skinner EA, Belmont MJ, Levin JR. Motivation in the Classroom: reciprocal effects of Teacher Behavior and Student Engagement across the School Year. J EDUC PSYCHOL. 1993;85(4):571–81.

Schaufeli WB, Martínez IM, Pinto AM, Salanova M, Bakker AB. Burnout and Engagement in University students. J CROSS CULT PSYCHOL. 2002;33(5):464–81.

Dominguez Lara SA, Fernández Arata M, Seperak Viera RA. Análisis psicométrico De Una Medida ultra-breve para El engagement académico: UWES-3S. Revista Argentina De ciencias del Comportamiento. 2021;13(1):25–37.

Hu Z, Shan N, Jiao R. The relationships between perceived teacher autonomy support, academic self-efficacy and learning engagement among primary school students: a network analysis. EUR J PSYCHOL EDUC 2023.

Zhao H, Zhang Z, Heng S. Grit and college students' learning engagement: serial mediating effects of mastery goal orientation and cognitive flexibility. CURR PSYCHOL. 2024;43(8):7437–50.

Huang L, Li X, Meng Y, Lei M, Niu Y, Wang S, Li R. The mediating effects of self-directed learning ability and critical thinking ability on the relationship between learning engagement and problem-solving ability among nursing students in Southern China: a cross-sectional study. BMC Nurs 2023, 22(1).

Bandura A. Self-efficacy: toward a unifying theory of behavioral change. PSYCHOL REV. 1977;84(2):191–215.

Derakhshan A, Fathi J. Grit and Foreign Language Enjoyment as predictors of EFL Learners’ Online Engagement: the mediating role of online learning self-efficacy. Asia-Pacific Educ Researcher 2023.

Koh J, Farruggia SP, Back LT, Han C. Self-efficacy and academic success among diverse first-generation college students: the mediating role of self-regulation. SOC PSYCHOL EDUC. 2022;25(5):1071–92.

Ibrahim RK, Aldawsari AN. Relationship between digital capabilities and academic performance: the mediating effect of self-efficacy. BMC Nurs. 2023;22(1):434.

Zhao B, Zhang Y, Liu X, Zhang J. The Mediating Role of Academic Self-Efficacy in the relationship between emotional resilience and learning burnout among vocational nursing students. J Bengbu Med Coll. 2024;49(01):133–6.


Zhao S, Chen Z. The mediating role of self-efficacy in college students’ positive academic emotions and achievement goal orientation. Chin J Health Psychol. 2022;30(01):156–60.

DiBenedetto MK, Schunk DH. Assessing Academic Self-efficacy. In Academic Self-efficacy in Education: Nature, Assessment, and Research . Edited by Khine MS, Nielsen T. Singapore: Springer Singapore; 2022:11–37.

Cai L, Jia X. The relationship between academic self-efficacy and Online Learning Engagement: the Mediating Role of Learning Motivation and Flow Experience. Psychol Behav Res. 2020;18(06):805–11.

Li D, Wang P. The influence of medical education environment perception on medical students’ learning engagement: the mediating role of academic self-efficacy. J China Med Univ. 2020;49(04):357–61.

Liu F, Jin S, Yan T. A study on the correlation between distance learning engagement and professional identity and self-efficacy of medical students. Health Vocat Educ. 2024;42(04):30–3.

Seyum Getenet RCPR. Students' digital technology attitude, literacy and self-efficacy and their effect on online learning engagement. Int J Educational Technol High Educ 2024, 21(3).

Han X, Xu Q, Xiao J, Liu Z. Academic atmosphere and graduate students' innovation ability: the role of scientific research self-efficacy and scientific engagement. EUR J PSYCHOL EDUC 2023.

Huang J, Zhou L, Zhu D, Liu W, Lei J. Changes in academic self-efficacy and value and crossover of Burnout among adolescent students: a two-wave longitudinal study. J YOUTH ADOLESCENCE. 2023;52(7):1405–16.

Bandura A. Social foundations of thought and action: a social cognitive theory. Englewood Cliffs, NJ: Prentice-Hall, Inc; 1986.

Zeng WG, Hu R. The influence of school belonging on learning engagement of higher vocational college students: the chain mediating role of academic self-efficacy and positive academic emotion. J Wuhan Vocat Coll Transp. 2022;24(03):79–85.

Goodenow C, Grady KE. The relationship of School Belonging and friends’ values to academic motivation among Urban adolescent students. J Experimental Educ. 1993;62(1):60–71.

Murat Yıldırım HYHB. School Belongingness and Internalizing and externalizing problems in adolescents: exploring the influence of Meaningful School. CHILD INDIC RES. 2023;16(5):2125–40.

An L, Xu B, Han Z, Qi H. The influence of college students’ sense of belonging on learning engagement: the mediating role of life history strategies. J Xianyang Normal Univ. 2023;38(02):94–8.

Guo J, Liu Y, Yang L. Influencing mechanism and model of college students’ learning engagement: based on a survey of 311 undergraduate colleges and universities. Educational Res. 2021;42(08):104–15.

St-Amand J, Smith J, Goulet M. Is teacher humor an asset in classroom management? Examining its association with students' well-being, sense of school belonging, and engagement. CURR PSYCHOL. 2024;43(3):2499–514.

Qiu X, Zhu H, Gao M, Rao B. The relationship between school belonging and academic achievement: a meta-analysis of Chinese students. J Corps Educ Coll. 2024;34(01):41–50.

Yu W, Yao W, Chen M, Zhu H, Yan J. School climate and academic burnout in medical students: a moderated mediation model of collective self-esteem and psychological capital. BMC Psychol. 2023;11(1):77.

Servet Aker, Mustafa Kürşat Şahin. The relationship between school burnout, sense of school belonging and academic achievement in preclinical medical students. ADV HEALTH SCI EDUC. 2022;27(4):949–63.

van Oorschot F, Brouwers M, Muris J, Veen M, Timmerman A, Dulmen SV. How does guided group reflection work to support professional identity formation in postgraduate medical education: a scoping review. MED TEACH 2023:1–11.

Qin P. The characteristics and related research of college students’ professional identity. Southwest University; 2009.

Fried EI, Epskamp S, Nesse RM, Tuerlinckx F, Borsboom D. What are ‘good’ depression symptoms? Comparing the centrality of DSM and non-DSM symptoms of depression in a network analysis. J AFFECT DISORDERS. 2016;189:314–20.

Fried EI. Problematic assumptions have slowed down depression research: why symptoms, not syndromes are the way forward. FRONT PSYCHOL 2015, 6.

Lin H, Wang F. The impact of perceived discrimination on learning burnout of part-time graduate students: the chain mediating role of time management inclination and academic self-efficacy. Chin J Health Psychol. 2023;31(03):429–34.

Huo Y, Shen Y. The influence of learning engagement on vocational college students’ psychological capital: the mediating effect of learning outcomes. J Higher Educ. 2024;10(S2):61–7.

Li X, Huang R. A revised report on the Learning Engagement Scale for College Students (UWES-S). Psychol Res. 2010;3(01):84–8.

Hoi Yan Cheung SKFH. Mainland immigrant and Hong Kong local students' psychological sense of School Membership. ASIA PAC EDUC REV. 2003;4(1):67–74.

Liang Y. A study on college students’ achievement goals, attribution style and academic self-efficacy. Central China Normal University; 2000.

Epskamp S, Waldorp LJ, Mottus R, Borsboom D. The gaussian graphical model in cross-sectional and time-Series Data. Multivar Behav Res. 2018;53(4):453–80.

Friedman J, Hastie T, Tibshirani R. Sparse inverse covariance estimation with the graphical lasso. BIOSTATISTICS. 2008;9(3):432–41.

Foygel R, Drton M. Extended bayesian information criteria for gaussian graphical models. Ithaca: Cornell University Library, arXiv.org; 2010.

Byrne ME, Tanofsky Kraff M, Lavender JM, Parker MN, Shank LM, Swanson TN, Ramirez E, LeMay Russell S, Yang SB, Brady SM, et al. Bridging executive function and disinhibited eating among youth: a network analysis. INT J EAT DISORDER. 2021;54(5):721–32.

Liao M, Xie Z, Ou Q, Yang L, Zou L. Self-efficacy mediates the effect of professional identity on learning engagement for nursing students in higher vocational colleges: a cross-sectional study. Nurse Educ Today. 2024;139:106225.

Chen Q, Zhang Q, Yu F, Hou B. Investigating Structural relationships between Professional Identity, Learning Engagement, Academic Self-Efficacy, and University support: evidence from Tourism students in China. Behav Sci. 2023;14(1):26.

Liu K, Zhang XM, Zheng SS, Zhu L, Li W. Mediating effect of self-efficacy between personality traits and learning engagement in nursing undergraduates. Chin J Mod Nurs. 2021;27(26):3622–7. https://doi.org/10.3760/cma.j.cn115682-20210125-00395 .

Li Z-Y, Pan J-B, Jiang S-Y, Wang C-F, Jia X-J, Li Q. The status of online learning engagement and learning adaptability of associate nursing students. Chin J Nurs Educ. 2020;17(11):1014–7. https://doi.org/10.3761/j.issn.1672-9234.2020.11.011 .

Ma Y. The impact of academic self-efficacy and academic motivation on Chinese EFL students’ academic burnout. Learn Motiv. 2024;85:1–9. https://doi.org/10.1016/j.lmot.2024.101959 .

Yu F, Chen Q, Hou B. Understanding the impacts of Chinese undergraduate tourism students’ professional identity on learning engagement. Sustainability. 2021;13:13379.

Liu LJ. A study on the impact of Role Identity on Learning Engagement among College Students. Yichun Univ. 2017;7:116–20.

Wang P, Sun JH, Ji F. A study on the relationship between Learning Engagement and Professional Identity among Medical College Students. China High Med Educ. 2018;9:39–40.

Wang DM, Liu YC. Survey on Professional Identity of Master’s students. Res High Educ China. 2007;8:18–22.

Yang T, Huang X. From natural person to Social Person: the practical framework and promotion strategy of Social Emotional Learning at Harvard University. Global Educ Outlook. 2024;53(02):151–60.

Raver CC. Emotions matter: making the case for the role of young children’s emotional development for early school readiness. Soc Policy Rep. 2002;16:1–20. https://doi.org/10.1002/j.2379-3988.2002.tb00041.x .

Bierman KL, Domitrovich CE, Nix RL, Gest SD, Welsh JA, Greenberg MT, et al. Promoting academic and social-emotional school readiness: the head start REDI program. Child Dev. 2008;79:1802–17. https://doi.org/10.1111/j.1467-8624.2008.01227.x .

Engels MC, Spilt J, Denies K, Verschueren K. The role of affective teacher-student relationships in adolescents’ school engagement and achievement trajectories. Learn Instr. 2021;75:101485. https://doi.org/10.1016/j.learninstruc.2021.101485 .

Hasnine MN, Nguyen HT, Tran TTT, Bui HT, Akçapınar G, Ueda H. A real-time learning analytics dashboard for automatic detection of online learners' affect states. Sensors. 2023;23:4243. https://doi.org/10.3390/s23094243.

Luo Q, Chen L, Yu D et al. The mediating role of learning engagement between self-efficacy and academic achievement among Chinese college students. Psychol Res Behav Manage, 2023: 1533–43.

Feng Z, Wu X, Yao M, Wang J. Educational Psychology (3rd edition). Beijing: People’s Education Press ; 2015.

Martínez BMT, del Carmen Pérez-Fuentes M, Jurado MMM. The influence of perceived social support on academic engagement among adolescents: the mediating role of self-efficacy. Anales De Psicología/Annals Psychol. 2024;40(3):409–20.

Gan Y, Zhang J, Wu X, et al. Chain Mediating effects of Student Engagement and Academic Achievement on University Identification. SAGE Open. 2024;14(1):21582440241226903.

Wei L. Relationship among Perceived Teacher Support, Academic Self-Efficacy, and Online Learning Engagement in College EFL Learning. Open J Mod Linguistics. 2024;14(3):290–314.

WU J, FU H. A meta-analysis of the relationship between achievement goal orientation and academic achievement: the mediating role of self-efficacy and student engagement. Adv Psychol Sci. 2024;32(7):1104.

Karakus M. Understanding the mastery-avoidance goals construct: an investigation among middle school students in two domains (Unpublished doctoral dissertation). Temple University, Philadelphia.

Dai X. A study on mindful agency’s influence on college students’ engagement with online teaching: the mediating roles of e-learning self-efficacy and self-regulation. Acta Psychol. 2024;243:104146.

Funding

This work was supported by the National Science Foundation Project in China (72374208).

Author information

Authors and affiliations

Department of Military Medical Psychology, Air Force Medical University, No. 169 West Changle Road, Xi’an, Shaanxi Province, 710032, China

Yijun Li, Lin Wu, Fengzhan Li, Peng Fang, Xufeng Liu & Shengjun Wu


Contributions

Li, Y.: Formal analysis, Visualization, Writing - original draft. Wu, L.: Conceptualization, Investigation. Li, F.: Investigation, Writing - Review & Editing. Fang, P.: Resources, Investigation, Data curation. Liu, X.: Supervision, Writing - Review & Editing, Project administration. Wu, S.: Funding acquisition, Supervision, Writing - Review & Editing, Project administration.

Corresponding authors

Correspondence to Xufeng Liu or Shengjun Wu .

Ethics declarations

Ethics approval and consent to participate

This study was conducted in accordance with the ethical standards put forth in the Declaration of Helsinki. Written informed consent was obtained from all individual participants included in the study. The study design and procedures were reviewed and approved by the Independent Ethics Committee of the First Affiliated Hospital of the Fourth Military Medical University.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .

Reprints and permissions

About this article

Cite this article.

Li, Y., Wu, L., Li, F. et al. Analysis of factors influencing medical students’ learning engagement and its implications for teaching work—— a network analysis perspective. BMC Med Educ 24 , 918 (2024). https://doi.org/10.1186/s12909-024-05908-y


Received : 08 May 2024

Accepted : 14 August 2024

Published : 24 August 2024

DOI : https://doi.org/10.1186/s12909-024-05908-y


Keywords
  • Medical students
  • College students’ speciality identity
  • Psychological sense of school membership
  • Academic self-efficacy

BMC Medical Education

ISSN: 1472-6920



IMAGES

  1. Research Questionnaire

    research survey questionnaire about online classes

  2. Research Questionnaire

    research survey questionnaire about online classes

  3. Sample questionnaire applied to students about their digital skills

    research survey questionnaire about online classes

  4. Biochemistry BSCI2520

    research survey questionnaire about online classes

  5. Survey Questionnaire (SAMPLE)

    research survey questionnaire about online classes

  6. Survey on online classes : education

    research survey questionnaire about online classes

VIDEO

  1. Social Research, Questionnaire, प्रश्नवली

  2. Research collaborations: Self Directed Learning Questionnaire, Explanations

  3. Questionnaire || Meaning and Definition || Type and Characteristics || Research Methodology ||

  4. research methods questionnaire

  5. How to create an effective Questionnaire and Survey Design

  6. differences between the questionnaire and schedule research methodology

COMMENTS

  1. 45 Survey Questions to Understand Student Engagement in Online Learning

    Research suggests that some groups of students experience more difficulty with academic performance and engagement when course content is delivered online vs. face-to-face. As you look to improve the online learning experience for students, take a moment to understand how students, caregivers, and staff are currently experiencing virtual learning.

  2. 80+Remote Learning Survey Questions for Students

    In this article, we've put together a list of the 80 best remote learning survey questions you can ask students, parents, and teachers to optimize and design effective learning experiences. Here's everything we'll cover: 47 Remote Learning Survey Questions for Students. 27 Remote Learning Survey Questions for Parents.

  3. Distance learning survey for students

    Likert Scale Complete Likert Scale Questions, Examples and Surveys for 5, 7 and 9 point scales. Learn everything about Likert Scale with corresponding example for each question and survey demonstrations. Conjoint Analysis; Net Promoter Score (NPS) Learn everything about Net Promoter Score (NPS) and the Net Promoter Question. Get a clear view on the universal Net Promoter Score Formula, how to ...

  4. (PDF) Online/Digital Learning Questionnaire

    7) I am comfortable communicating electr onically. 8) Learning is the same in class and at home on the Internet. than a regular course. 10) I believe a complete course can be gi ven by the ...

  5. Remote Learning Survey Questionnaire For Students

    When developing any type of survey, whether in the education or non-education sectors, you need a mechanism to conveniently gather and preserve the data you get. A secure drag-and-drop remote learning survey-building tool is ideal for schools and other community groups who need to construct customized surveys. Additional Resources

  6. Questionnaire on the impact of e-learning

    Consequences of eLearning. There are several reasons to discuss eLearning. First, it has the potential to expand access to education for people who cannot attend traditional in-person classes, like those living in remote or underdeveloped areas or people with disabilities. eLearning can also be more cost-effective for students and institutions ...

  7. A Survey on the Effectiveness of Online Teaching-Learning

    Online quiz having multiple-choice questions (MCQ) preferred by students. 6. Student version software is useful for learning. 7. Online classes are more effective because they provide PPTs in front of every student, lectures are heard by all students at the sound level of their choice, and walking/travel to reach classes is eliminated. 8.

  8. The Perfect Distance Learning Questionnaire for Students

    These survey questions will aid in understanding the students' educational experiences with distance learning. It gathers feedback on the student's entire experience with online education. It will make assessing the efficiency and quality of distance learning more manageable. The management staff will be able to determine what students ...

  9. 20 Key Distance Learning Survey Questions for Teachers

    Likert Scale Complete Likert Scale Questions, Examples and Surveys for 5, 7 and 9 point scales. Learn everything about Likert Scale with corresponding example for each question and survey demonstrations. Conjoint Analysis; Net Promoter Score (NPS) Learn everything about Net Promoter Score (NPS) and the Net Promoter Question. Get a clear view on the universal Net Promoter Score Formula, how to ...

  10. A Survey on the Effectiveness of Online Teaching-Learning Methods for

    About 450 students belonging to VIT Vellore, CMRIT Bangalore, Medical College, Pudukkottai, and engineering colleges have responded to the survey. A questionnaire designed for taking is survey is presented. The chi-square statistic is 65.6025. The p value is < 0.00001. The result is significant at p < 0.05. Associations between several ...

  11. (PDF) A Survey on the Effectiveness of Online Teaching

    An attempt is made to find the effectiveness of online teaching-learning methods for university and college students by conducting an online survey. A questionnaire has been specially designed ...

  12. Class Survey Questions: Top 35 for Questionnaires

    Likert Scale Complete Likert Scale Questions, Examples and Surveys for 5, 7 and 9 point scales. Learn everything about Likert Scale with corresponding example for each question and survey demonstrations. Conjoint Analysis; Net Promoter Score (NPS) Learn everything about Net Promoter Score (NPS) and the Net Promoter Question. Get a clear view on the universal Net Promoter Score Formula, how to ...

  13. Online Learning Survey Form Template

    An online learning survey is a questionnaire used by e-learning companies to find out how students feel about their experience in e-learning courses. Whether you're an e-learning company, a student learning online, or a parent using online courses for your children, our free online learning survey will help you get the information you need!

  14. A Questionnaire Based Survey of Students Underlying Online Classes

    As shown in figure 2 and Table 2, the survey results reveal that approximately 51% of. the students were not able to learn and retain more than 50% of the contents taught via the. online platform ...

  15. PDF STUDENT EXPERIENCES IN ONLINE COURSES A Qualitative Research Synthesis

    Millions of students were taking an online course (Allen & Seaman, 2010), and nearly 30% of students were taking a course online. The same study also found that online enrollment growth was 21%, while overall growth in higher education was only 2%, a rate that online enrollments far exceeded.

  16. Impact of online classes on the satisfaction and performance of

    The aim of the study is to identify the factors affecting students' satisfaction and performance regarding online classes during the pandemic period of COVID-19 and to establish the relationship between these variables. The study is quantitative in nature, and the data were collected from 544 respondents through online survey who were studying the business management (B.B.A or M.B.A) or ...

  17. Online and face‐to‐face learning: Evidence from students' performance

    This study investigates the factors that predict students' performance after transitioning from face‐to‐face to online learning as a result of the Covid‐19 pandemic. It uses students' responses from survey questions and the difference in the average assessment grades between pre‐lockdown and post‐lockdown at a South African university.

  18. Guide to the design and application of online questionnaire surveys

    Methodological components. Whilst developing and operationalising the online questionnaire survey, six methodological components are critical to successful online surveys. These are (a) user-friendly design and layout; (b) selecting survey participants; (c) avoiding multiple responses; (d) data management; (e) ethical issues; and (f) piloting tools.

  19. Kantar's Online Survey Training Modules

    The impact of bias in surveys (8 minutes) The effects of scale designs on answers (21 minutes) Reducing online survey dropout (8 minutes) Using visuals for engagement (18 minutes) Writing questions effectively (11 minutes) Conducting cross-cultural research (26 minutes) Designing surveys with empathy (27 minutes)

  20. Best Survey Courses Online with Certificates [2024]

    Questionnaire Design for Social Surveys: University of Michigan. UX Research at Scale: Surveys, Analytics, Online Testing: University of Michigan. Survey analysis to Gain Marketing Insights: Emory University. Research Methodologies: Queen Mary University of London.

  21. Journal recommended guidelines for survey-based research

    Survey-based research is a cornerstone of empirical inquiry across education and social science disciplines, providing insights into human behaviors, attitudes, perceptions, and experiences. Compared to a decade ago, the number of studies listed on PubMed containing the terms "survey" and "medical education" has increased by 33% (i.e ...

  22. 10 Essential Training Survey Questions You Must Be Asking

    We compiled a list of 10 essential training survey questions to provide you with the information you need to build an even better training program next time. Keep reading until the end to learn how to set up an effective post-training survey. Disclaimer: The information below is accurate as of August 23, 2024. 1.

  23. (PDF) Online Courses: Student Preferences Survey

    University of Baltimore. Abstract: As online and hybrid courses are increasingly used to deliver college courses and curriculum, an online survey was developed and implemented at the University ...

  24. PDF Effective Use of Web-based Survey Research Platforms

    REDCap is a secure web application that allows users to build and manage online surveys and databases quickly and securely. It can be used to collect virtually any type of data and is specifically geared to support data capture for research studies.


  26. Online questionnaire: Definition, examples & how to create them


  27. Methodology: Teens and parents survey

    The survey was conducted by Ipsos Public Affairs in English and Spanish using KnowledgePanel, its nationally representative online research panel. The research plan for this project was submitted to an external institutional review board (IRB), Advarra, which is an independent committee of experts that specializes in helping to protect the ...


  29. Analysis of factors influencing medical students' learning engagement

    The quality of higher medical education has long received close attention in both the education and health fields, and learning engagement has emerged as a significant indicator of teaching quality. This study explores the relationship between medical students' learning engagement and their sense ...
