How to write great survey questions (with examples)

Learning how to write survey questions is both art and science. The wording you choose can make the difference between accurate, useful data and just the opposite. Fortunately, we’ve got a raft of tips to help.

Figuring out how to make a good survey that yields actionable insights is all about sweating the details. And writing effective questionnaire questions is the first step.

Essential for success is understanding the different types of survey questions and how they work. Each format needs a slightly different approach to question-writing.

In this article, we’ll share how to write survey questionnaires and list some common errors to avoid so you can improve your surveys and the data they provide.


Survey question types

Did you know that Qualtrics provides 23 question types you can use in your surveys? Some are very popular and used frequently by a wide range of people from students to market researchers, while others are more specialist and used to explore complex topics. Here’s an introduction to some basic survey question formats, and how to write them well.

Multiple choice

Familiar to many, multiple choice questions ask a respondent to pick from a range of options. You can set up the question so that only one selection is possible, or allow more than one to be ticked.

When writing a multiple choice question…

  • Be clear about whether the survey taker should choose one (“pick only one”) or several (“select all that apply”).
  • Think carefully about the options you provide, since these will shape your results data.
  • The phrase “of the following” can be helpful for setting expectations. For example, if you ask “What is your favorite meal?” and provide only the options “hamburger and fries” and “spaghetti and meatballs”, there’s a good chance your respondent’s true favorite won’t be included. If you ask “Which of the following is your favorite meal?” instead, the question makes more sense.

Rank order

Asking participants to rank things in order, whether it’s order of preference, frequency or perceived value, is done using a rank order structure. There can be a variety of interfaces, including drag-and-drop, radio buttons, text boxes and more.

When writing a rank order question…

  • Explain how the interface works and what the respondent should do to indicate their choice. For example “drag and drop the items in this list to show your order of preference.”
  • Be clear about which end of the scale is which. For example, “With the best at the top, rank these items from best to worst”
  • Be as specific as you can about how the respondent should consider the options and how to rank them. For example, “thinking about the last 3 months’ viewing, rank these TV streaming services in order of quality, starting with the best”

Slider

Slider structures ask the respondent to move a pointer or button along a scale, usually a numerical one, to indicate their answers.

When writing a slider question…

  • Consider whether the question format will be intuitive to your respondents, and whether you should add help text such as “click/tap and drag on the bar to select your answer”
  • Qualtrics includes the option for an open field where your respondent can type their answer instead of using a slider. If you offer this, make sure to reference it in the survey question so the respondent understands its purpose.

Text entry

Also known as an open field question, this format allows survey-takers to answer in their own words by typing into the comments box.

When writing a text entry question…

  • Use open-ended question structures like “How do you feel about…” “If you said x, why?” or “What makes a good x?”
  • Open-ended questions take more effort to answer, so use these types of questions sparingly.
  • Be as clear and specific as possible in how you frame the question. Give them as much context as you can to help make answering easier. For example, rather than “How is our customer service?”, write “Thinking about your experience with us today, in what areas could we do better?”

Matrix table

Matrix structures allow you to address several topics using the same rating system, for example a Likert scale (Very satisfied / satisfied / neither satisfied nor dissatisfied / dissatisfied / very dissatisfied).

When writing a matrix table question…

  • Make sure the topics are clearly differentiated from each other, so that participants don’t get confused by similar questions placed side by side and answer the wrong one.
  • Keep text brief and focused. A matrix includes a lot of information already, so make it easier for your survey-taker by using plain language and short, clear phrases in your matrix text.
  • Add detail to the introductory static text if necessary to help keep the labels short. For example, if your introductory text says “In the Philadelphia store, how satisfied were you with the…” you can make the topic labels very brief, for example “staff friendliness” “signage” “price labeling” etc.

Now that you know your rating scales from your open fields, here are the 7 most common mistakes to avoid when you write questions. We’ve also added plenty of survey question examples to help illustrate the points.

Likert Scale Questions

Likert scales are commonly used in market research when dealing with single-topic surveys. They’re simple and reliable when you’re trying to combat survey bias. For each question or statement, respondents choose from a range of possible responses, typically:

  • Strongly agree
  • Agree
  • Neither agree nor disagree
  • Disagree
  • Strongly disagree
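
For analysis, those labels are usually mapped to numbers (1 to 5) so responses can be averaged and counted. Here’s a minimal sketch in plain Python; the 1–5 coding is conventional, but the example statement and responses are made up:

```python
from collections import Counter

# Conventional numeric coding for a 5-point agreement scale.
LIKERT_CODES = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

# Hypothetical responses to a single statement.
responses = [
    "Agree", "Strongly agree", "Neither agree nor disagree",
    "Agree", "Disagree", "Strongly agree",
]

scores = [LIKERT_CODES[r] for r in responses]
print("Mean score:", round(sum(scores) / len(scores), 2))  # 3.83
print("Distribution:", Counter(responses))
```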

7 survey question examples to avoid.

There are countless examples of great survey questions, but how do you know whether your own questions will perform well? We’ve highlighted the 7 most common mistakes people make when trying to get customer feedback with online surveys.

Survey question mistake #1: Failing to avoid leading words / questions

Subtle wording differences can produce large differences in results. Non-specific words and ideas can introduce ambiguity into your survey: “could,” “should,” and “might” all sound about the same, but they may produce a 20% difference in agreement with a question.

In addition, strong words such as “force” and “prohibit” represent control or action and can bias your results.

Example: The government should force you to pay higher taxes.

No one likes to be forced, and no one likes higher taxes. This agreement scale question makes it sound doubly bad to raise taxes. When survey questions read more like normative statements than questions looking for objective feedback, any ability to measure that feedback becomes difficult.

Better wording alternatives can be developed. How about simpler statements such as “The government should increase taxes” or “The government needs to increase taxes”?

Example: How would you rate the career of legendary outfielder Joe DiMaggio?

This survey question tells you Joe DiMaggio is a legendary outfielder. This type of wording can bias respondents.

How about replacing the word “legendary” with “baseball,” as in: How would you rate the career of baseball outfielder Joe DiMaggio? A rating scale question like this gets more accurate answers from the start.

Survey question mistake #2: Failing to give mutually exclusive choices

Multiple choice response options should be mutually exclusive so that respondents can make clear choices. Don’t create ambiguity for respondents.

Review your survey and identify ways respondents could get stuck with either too many or no single, correct answers to choose from.

Example: What is your age group?

If the answer choices overlap (for example, 0–10, 10–20, 20–30), which one would you select if you were 10, 20, or 30? Survey questions like this will frustrate respondents and invalidate your results.

Example: What type of vehicle do you own?

This question has the same problem. What if the respondent owns a truck, hybrid, convertible, cross-over, motorcycle, or no vehicle at all?

Survey question mistake #3: Not asking direct questions

Questions that are vague and do not communicate your intent can limit the usefulness of your results. Make sure respondents know what you’re asking.

Example: What suggestions do you have for improving Tom’s Tomato Juice?

This question may be intended to obtain suggestions about improving taste, but respondents will offer suggestions about texture, the type of can or bottle, about mixing juices, or even suggestions relating to using tomato juice as a mixer or in recipes.

Example: What do you like to do for fun?

Finding out that respondents like to play Scrabble isn’t what the researcher is looking for, but it may be the response received. If the intent was to ask about movies versus other forms of paid entertainment, the question doesn’t say so, and a respondent could take it in many directions.

Survey question mistake #4: Forgetting to add a “prefer not to answer” option

Sometimes respondents may not want you to collect certain types of information or may not want to provide you with the types of information requested.

Questions about income, occupation, personal health, finances, family life, personal hygiene, and personal, political, or religious beliefs can be too intrusive and be rejected by the respondent.

Privacy is an important issue to most people. Incentives and assurances of confidentiality can make it easier to obtain private information.

While current research does not show that a PNA (Prefer Not to Answer) option increases data quality or response rates, many respondents appreciate this non-disclosure option.

Furthermore, different cultural groups may respond differently. One recent study found that while U.S. respondents skip sensitive questions, Asian respondents often discontinue the survey entirely.

  • What is your race?
  • What is your age?
  • Did you vote in the last election?
  • What are your religious beliefs?
  • What are your political beliefs?
  • What is your annual household income?

These types of questions should be asked only when absolutely necessary, and they should always include an option to not answer (e.g., “Prefer Not to Answer”).

Survey question mistake #5: Failing to cover all possible answer choices

Do you have all of the options covered? If you are unsure, conduct a pretest version of your survey using “Other (please specify)” as an option.

If more than 10% of respondents (in a pretest or otherwise) select “other,” you are probably missing an answer. Review the “Other” text your test respondents have provided and add the most frequently mentioned new options to the list.

Example: You indicated that you eat at Joe's fast food once every 3 months. Why don't you eat at Joe's more often?

  • There isn't a location near my house
  • I don't like the taste of the food
  • Never heard of it

This question doesn’t include other options, such as healthiness of the food, price/value or some “other” reason. Over 10% of respondents would probably have a problem answering this question.
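
As an illustration of that 10% rule, here’s a small, hypothetical Python sketch based on the Joe's example above: it computes the share of pretest respondents who picked “Other (please specify)” and tallies their write-ins so the most frequent ones can be promoted to real answer choices. The data is invented.

```python
from collections import Counter

# Hypothetical pretest responses to "Why don't you eat at Joe's more often?"
pretest = [
    {"choice": "There isn't a location near my house", "other_text": None},
    {"choice": "Other (please specify)", "other_text": "Too expensive"},
    {"choice": "I don't like the taste of the food", "other_text": None},
    {"choice": "Other (please specify)", "other_text": "Not healthy"},
    {"choice": "Other (please specify)", "other_text": "Too expensive"},
    {"choice": "Never heard of it", "other_text": None},
]

other = [r for r in pretest if r["choice"] == "Other (please specify)"]
share = len(other) / len(pretest)
print(f"'Other' share: {share:.0%}")  # 50% -- well above the 10% threshold

if share > 0.10:
    # The most common write-ins are candidates for new answer choices.
    write_ins = Counter(r["other_text"] for r in other)
    print("Candidate new options:", write_ins.most_common(3))
```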

Survey question mistake #6: Not using unbalanced scales carefully

Unbalanced scales may be appropriate for some situations and promote bias in others.

For instance, a hospital might use an Excellent - Very Good - Good - Fair scale where “Fair” is the lowest customer satisfaction point because they believe “Fair” is absolutely unacceptable and requires correction.

The key is to correctly interpret your analysis of the scale. If “Fair” is the lowest point on a scale, then a result slightly better than fair is probably not a good one.

Additionally, scale points should be equidistant; that is, the conceptual distance from one point to the next should be the same.

For example, researchers have shown the points to be nearly equidistant on the strongly disagree–disagree–neutral–agree–strongly agree scale.

Set your bottom point as the worst possible situation and top point as the best possible, then evenly spread the labels for your scale points in-between.

Example: What is your opinion of Crazy Justin's auto-repair?

  • Pretty good
  • The Best Ever

This question puts the center of the scale at “Fantastic” and the lowest possible rating at “Pretty Good.” It is not capable of collecting respondents’ true opinions.

Survey question mistake #7: Asking more than one question at a time

There is often a temptation to ask multiple questions at once. This can cause problems for respondents and influence their responses.

Review each question and make sure it asks only one clear question.

Example: What is the fastest and most economical internet service for you?

This is really asking two questions. The fastest is often not the most economical.

Example: How likely are you to go out for dinner and a movie this weekend?

Even though “dinner and a movie” is a common term, this is two questions as well. It is best to separate the activities into different questions or give respondents options that cover the combinations, for example:

  • Dinner and Movie
  • Dinner Only
  • Movie Only
  • Neither

5 more tips on how to write a survey

Here are 5 easy ways to help ensure your survey results are unbiased and actionable.

1. Use the Funnel Technique

Structure your questionnaire using the “funnel” technique. Start with broad, general interest questions that are easy for the respondent to answer. These questions serve to warm up the respondent and get them involved in the survey before giving them a challenge. The most difficult questions are placed in the middle – those that take time to think about and those that are of less general interest. At the end, we again place general questions that are easier to answer and of broad interest and application. Typically, these last questions include demographic and other classification questions.

2. Use “Ringer” questions

In social settings, are you more introverted or more extroverted?

That was a ringer question and its purpose was to recapture your attention if you happened to lose focus earlier in this article.

Questionnaires often include “ringer” or “throw away” questions to increase interest and willingness to respond to a survey. These questions are about hot topics of the day and often have little to do with the survey. While these questions will definitely spice up a boring survey, they require valuable space that could be devoted to the main topic of interest. Use this type of question sparingly.

3. Keep your questionnaire short

Questionnaires should be kept short and to the point. Most long surveys are not completed, and the ones that are completed are often answered hastily. A quick look at a survey containing page after page of boring questions produces a response of, “there is no way I’m going to complete this thing”. If a questionnaire is long, the person must either be very interested in the topic, an employee, or paid for their time. Web surveys have some advantages because the respondent often can't view all of the survey questions at once. However, if your survey's navigation sends them page after page of questions, your response rate will drop off dramatically.

How long is too long? The sweet spot is to keep the survey to less than five minutes, which translates into about 15 questions. The average respondent can complete about 3 multiple choice questions per minute, and an open-ended text response question counts for about three multiple choice questions, depending, of course, on its difficulty. While only a rule of thumb, this formula gives a reasonable estimate of your survey’s limits.
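
That rule of thumb is easy to turn into arithmetic. The sketch below (the question counts are hypothetical) estimates completion time and compares it to the five-minute target:

```python
MC_PER_MINUTE = 3        # respondents answer ~3 multiple choice questions per minute
OPEN_ENDED_WEIGHT = 3    # one open-ended question ~ three multiple choice questions

def estimated_minutes(n_multiple_choice: int, n_open_ended: int) -> float:
    """Rough completion-time estimate based on the rule of thumb above."""
    effective_questions = n_multiple_choice + OPEN_ENDED_WEIGHT * n_open_ended
    return effective_questions / MC_PER_MINUTE

length = estimated_minutes(n_multiple_choice=12, n_open_ended=2)
print(f"Estimated completion time: {length:.1f} minutes")  # 6.0 minutes
print("Within the 5-minute sweet spot?", length <= 5)      # False -> trim some questions
```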

4. Watch your writing style

The best survey questions are always easy to read and understand. As a rule of thumb, the level of sophistication in your survey writing should be at the 9th to 11th grade level. Don’t use big words. Use simple sentences and simple choices for the answers. Simplicity is always best.

5. Use randomization

We know that being listed first on an election ballot increases a candidate’s chance of being elected. Similar bias occurs in questionnaires whenever the same answer appears at the top of the list for every respondent. Randomization corrects this bias by varying the order of answer choices (and matrix rows) for each respondent.
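
Survey platforms typically handle answer-order randomization for you; the hypothetical Python sketch below just shows the idea: each respondent sees the substantive choices in a different order, while anchor options such as “Other” or “Prefer not to answer” stay at the bottom.

```python
import random

CHOICES = ["Sedan", "SUV", "Truck", "Motorcycle"]              # hypothetical answer options
ANCHORED = ["Other (please specify)", "Prefer not to answer"]  # always shown last

def choices_for(respondent_id: str) -> list[str]:
    # Seeding with the respondent id keeps the order stable for that person
    # across page reloads while still varying it across respondents.
    rng = random.Random(respondent_id)
    shuffled = CHOICES[:]
    rng.shuffle(shuffled)
    return shuffled + ANCHORED

print(choices_for("resp-001"))
print(choices_for("resp-002"))
```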

While not an exhaustive list, these seven mistakes are among the most common offenders in survey question writing. And the five tips above should steer you in the right direction.

Focus on creating clear questions and having an understandable, appropriate, and complete set of answer choices. Great questions and great answer choices lead to great research success. To learn more about survey question design, download our eBook, The Qualtrics survey template guide, or get started with a free survey account with our world-class survey software.

Sarah Fisher


Best Practices and Sample Questions for Course Evaluation Surveys

Meaningful input from students is essential for improving courses. One of the most common indirect course assessment methods is the course evaluation survey. In addition to providing useful information for improving courses, course evaluations provide an opportunity for students to reflect and provide feedback on their own learning. Review an example of a digital course evaluation survey in HelioCampus Assessment and Credentialing (formerly AEFIS) that was created by Testing and Evaluation Services.

Best Practices

The following best practices are intended to guide departments and programs in creating and revising course evaluation questions, and achieving high response rates.


Achieving High Response Rates

  • Give students time (10-15 minutes) to complete the digital evaluation during class (just as they do with printed, paper evaluations).
  • Encourage students to complete the evaluation by discussing its purpose and importance in the weeks leading up to it. If students know that you will read their feedback and seriously consider changes based on their feedback, they will be more likely to complete the evaluation.
  • Share how you have incorporated past feedback into your courses.
  • Consider offering an incentive, such as making the evaluation an assignment with points attached or giving students a bonus point. One way to do this is to set a target response rate for the class – say 90% – and give everyone a bonus point if the class reaches the target.
  • Ask students to provide feedback about their own learning relative to the course’s learning outcomes.

Creating and Revising Survey Questions - Strategies to Obtain More Effective Feedback

  • Let students know that obtaining feedback on their learning is important to you.
  • Guide students to the specific type of feedback you are looking for.
  • Students, like anyone answering questions, tend to provide better feedback to more specific questions. Asking about a specific type of activity, or asking students to share the most important point they learned during the semester, may provide more useful feedback.
  • Example: instead of asking “How useful were the instructional materials and activities for this course?”, focus on a specific material or activity.
  • Yes/no questions can often be leading questions. Instead of asking “Did you learn a great amount from this course?”, a better question would be “To what extent do you feel you mastered the content in this course?”
  • Asking open-ended questions can help you gain insight you may not otherwise receive. Research by the University of California – Merced is finding that coaching from peers or near-peers can help students provide more effective feedback to open-ended questions. The research includes short videos and a rubric you can share with your students prior to completing evaluations.
  • Students are hesitant to complete course evaluations if they feel they may be identified by their responses – for example, by responding to “level” or “year” when they are the only graduate student or undergraduate senior in a course.

Sample Questions

Instructor-Specific: Delivery - Teaching Methods, Strategies, Practices, and Clarity

  • The instructor was well prepared for class.
  • Individual class meetings were well prepared.
  • The instructor used class time effectively.
  • The instructor was organized, well prepared, and used class time efficiently.
  • The instructor communicated clearly and was easy to understand.
  • The instructor encouraged student participation in class.
  • The instructor presented course material in a clear manner that facilitated understanding.
  • The instructor effectively organized and facilitated well-run learning activities.
  • The instructor’s teaching methods were effective.
  • The instructor’s teaching methods aided my learning.
  • The instructor stimulated my interest in the subject matter.
  • The instructor provided helpful feedback.
  • The instructor provided feedback in a timely manner.
  • The instructor returned assignments and exams in a timely manner.
  • The online course platform was updated and accurate.

Instructor-Specific: Personal / Connection - Clarity and Encouragement

  • The instructor effectively explained and illustrated course concepts.
  • The instructor’s feedback to me was helpful and improved my understanding of the material.
  • I was able to access the instructor outside of scheduled class time for additional help.
  • The instructor was available to students.
  • I could get help if I needed it.
  • The instructor cared about the students, their progress, and successful course completion.
  • The instructor created a welcoming and inclusive learning environment.
  • The instructor treated students with respect.

Course Materials

  • The lectures, readings, and assignments complemented each other.
  • The instructional materials (i.e., books, readings, handouts, study guides, lab manuals, multimedia, software) increased my knowledge and skills in the subject matter.
  • The text and assigned readings were valuable.
  • The workload consisted of less than two hours outside of the classroom for each hour in class.
  • The course workload and requirements were appropriate for the course level.
  • The course was organized in a manner that helped me understand underlying concepts.
  • The course assignments (readings, assigned problems, laboratory experiments, videos, etc.) facilitated my learning.
  • The assigned readings helped me understand the course material.
  • Graded assignments helped me understand the course material.
  • The tests/assessments accurately assess what I have learned in this course.
  • Exams and assignments were reflective of the course content.
  • The course was well organized.
  • The course followed the syllabus.
  • The instructor grades consistently with the evaluation criteria.
  • The course environment felt like a welcoming place to express my ideas.

Student Engagement and Involvement

  • I attend class regularly.
  • I consistently prepared for class.
  • I have put a great deal of effort into advancing my learning in this course.
  • In this course, I have been challenged to learn more than I expected.
  • I was well-prepared for class/discussion sections.

Course Structure

  • This class has increased my interest in this field of study.
  • This course gave me confidence to do more advanced work in the subject.
  • I believe that what I am being asked to learn in this course is important.
  • The readings were appropriate to the goals of the course.
  • The written assignments contributed to my knowledge of the course material and understanding of the subject.
  • Expectations for student learning were clearly defined.
  • Student learning was fairly assessed (e.g., through quizzes, exams, projects, and other graded work).
  • Exams/assignments were a fair assessment of my knowledge of the course material.
  • The grading practices were clearly defined.
  • The grading practices were fair.
  • The examinations/projects measured my knowledge of the course material.
  • This course was challenging.
  • This course made me think.
  • What grade do you expect to earn in this course? Options for this question: A-AB, B-BC, C, Below C, Unsure

Student Learning and Course Learning Outcomes

Text in “{}” should be changed to match the specific course learning outcomes (CLO). 

  • This course helped me { develop intellectual and critical thinking skills }.
  • This course improved my ability to { evaluate arguments }.
  • This course helped me { argue effectively }.
  • My ability to { identify, formulate, and solve problems } has increased.
  • My understanding of { basic chemical transformations, reactivity, and properties } has increased.
  • My ability to { recognize the relationship between structure, bonding, and the properties of molecules and materials } has increased.
  • I am capable of { locating, evaluating, and using information in the literature }.
  • I am confident in my ability to { communicate chemical knowledge effectively }.
  • I understand { professional and ethical responsibility related to data storage }.
  • This course helped me analyze { relations among individual, civil society, political institution, and countries }.
  • The course helped me { further develop my writing ability }.
  • The course improved my { verbal communication skills }.
  • The course increased my ability to { collaborate and work in teams }.
  • The course increased my { intercultural knowledge and awareness to help me become a global citizen }.

UW Essential Learning Outcomes

  • This course enhanced my knowledge of the world (e.g., human cultures, society, sciences, etc.).
  • This course helped me develop intellectual skills (e.g., critical or creative thinking, quantitative reasoning, problem solving, etc.).
  • This course helped me develop professional skills (e.g., written or oral communication, computer literacy, teamwork, etc.).
  • This course enhanced my sense of social responsibility.

General / Overall Rating

  • I would highly recommend this instructor to other students.
  • I would recommend this instructor to others.
  • Overall, this instructor met my expectations for the quality of a UW-Madison teacher.
  • I would highly recommend this course to other students.
  • I would recommend this course to others.
  • Overall, this course met my expectations for the quality of a UW-Madison course.
  • This course had high educational impact.
  • This course was useful in progress toward my degree.

Qualitative, Open-Ended Response

  • Do you have any specific recommendations for improving this course?
  • What are one to three specific things about the course or instructor that especially helped to support student learning?
  • What are the strengths of this course?
  • What parts of the course aided your learning the most?
  • What are one to three specific things about the course that could be improved to better support student learning?
  • What parts of the class were obstacles to your learning?
  • What changes might improve your learning?

TA-Specific

  • Assignments and tests handled by the TA were returned with useful feedback.
  • The TA was willing to explain grading and evaluation of my work.
  • The TA knew and was confident in the material related to this course.
  • The TA was adequately prepared for discussion sections.
  • The TA was clear in presenting subject matter.
  • The TA presented the material in an interesting and engaging way.
  • The TA fostered intellectual communication among my peers.
  • The TA was able to adequately prepare students for assignments (examination, book reviews, research papers, etc.).
  • The TA stimulated thought and discussion.
  • I felt comfortable asking my TA questions.
  • The TA was willing to answer questions.
  • The TA was able to answer questions clearly and completely.
  • The TA effectively utilizes electronic communication (e.g., Learn@UW, Canvas, email, etc.).
  • The TA is well-prepared for each meeting.
  • The TA is flexible and adapts the learning environment when things do not go according to plan.
  • The TA was available during office hours or by appointment.
  • The TA arrives to class on time.
  • The TA was committed to teaching and aiding students.
  • The TA is an effective teacher.
  • If given the opportunity, I would enroll in a section led by my TA again.
  • Overall, the TA performed well.

Qualitative, open-ended response

  • Are there distinctive qualities about the TA that you would like to highlight?
  • What are one to three specific things that your TA does particularly well to support student learning?
  • What might your TA do to improve his/her teaching?
  • What are one to three specific things that you would like to see your TA improve to better support student learning?

Training & Resources

  • Getting Started - Resources for HelioCampus AC Administrators
  • Instructor FAQs
  • Student FAQs

Contact us at [email protected]


Student Satisfaction Survey Questions, 100 Samples, & Free Survey Template


Kate William

Last Updated: 23 July 2024


Table Of Contents

  • Student Satisfaction Survey Questions
  • Student Satisfaction Questions On Teachers
  • Student Satisfaction Questions On Courses & Evaluation Methods
  • Online Student Satisfaction Survey Questions
  • Student Satisfaction Survey Questions For Campus Environment & Bullying!
  • Student Satisfaction Survey Questions On Personal Engagement
  • Student Satisfaction Survey Questions On Stress Levels

“I don’t understand why our students aren’t responding well to the curriculum.”

“Maybe they want something else.” 

“What is that?” 

If you’re a teacher or the head of an educational institution, you must be familiar with such conversations. A curriculum always has room for improvement. After all, it’s rarely a perfect fit for everyone.

As such, student satisfaction survey questions reveal exactly what your students need and what changes they think would be for the better. In this blog, we will:

  • Discover why student survey questions – & student feedback in general – are helpful
  • Explore six types of student satisfaction survey questions that you need to ask
  • Understand the twelve factors that influence student satisfaction


How Do Student Satisfaction Survey Questions Help?

Student satisfaction survey questions help us understand the aspects of an educational institution that contribute to a student’s satisfaction and engagement. We also aim to find what resonates with students and what doesn’t.

The answers and overall feedback give enough reasons for the decision-makers to make changes, if needed, quickly. 

Additionally, every student satisfaction survey is an opportunity for students to have their voices heard. It’s a chance for them to put their points across, and that matters in 2024 and beyond.

100+ Student Satisfaction Survey Questions That You Need to Ask + Free Template

Time to let the cat out of the bag. Here are 100+ student satisfaction survey questions, in six categories, that consistently deliver results.

#1. Student Satisfaction Questionnaire On Teachers

Let’s start with student satisfaction survey questions on the quality of teachers, their experience levels, teaching style, and more. This student satisfaction questionnaire is a mix of rating scale, open-ended and closed-ended feedback. 

  • Are you satisfied with all your teacher’s knowledge base? 
  • What type of teaching style do you like the most? 
  • What are your thoughts on teachers using practical examples to teach a concept?
  • Do you think that teachers should prioritize reason over discipline?
  • How best should teachers handle differences among students, according to you?
  • Do you find your [Subject Name] teacher’s knowledge base excellent?
  • Are all your doubts attended to in the classroom by teachers?
  • Do you enjoy the approach towards topics that [Subject Name] teacher uses?
  • What would you suggest as an area of improvement for teachers here?
  • Are student suggestions received well by teachers in the classroom?
  • Would you recommend the [Subject Name] teacher for other classrooms?
  • When the [Subject Name] teacher talks about the course material, do you have complete clarity?
  • What level of experience would you appreciate in a faculty?
  • On a scale of 1 to 10, rate how prepared the [Subject Name] teacher comes for the class?
  • Out of the 4 options, how’s the environment when the [Subject Name] teacher is in the class?
  • On a scale of 1 to 10, how easy is it to contact a teacher of [Institution Name] outside the class? 
  • Has [Subject Name] teacher improved your motivation levels?
  • On a scale of 1 to 10, how would you rate [Subject Name] teacher?
  • On a scale of 1 to 10, how would you rate [Institution Name] teachers?


#2. Student Satisfaction Questionnaire On Courses & Evaluation Methods

This part of the student satisfaction survey questionnaire highlights the courses offered, their quality, and the evaluation methods teachers use to score students. 

  • Are you satisfied with the courses offered at [Institution Name]?
  • Are you happy with the topics the [Course Name] course will cover?
  • On a scale of 1 to 10, how well do you connect with the course material?
  • Can the monthly course load distribution improve according to you?
  • On a scale of 1 to 10, how consistent is the quality across all your courses? 
  • Does [Course Name] evoke positive discussions in the classroom because of its material?
  • What do you think the college should do to make [Course Name] better?
  • How many hours a day after school do you like investing in [Course Name]?
  • On a scale of 1 to 10, what’s your level of interest in the [Course Name] class?
  • Are all [Course Name] lectures, assignments, and tests perfectly synced?
  • On a scale of 1 to 10, how satisfied are you with your [Course Name] evaluation grade?
  • Are you satisfied with the overall course evaluation method?
  • Do you feel the learning was much more than the course grades you received?
  • Do you feel the tests and assignment types for [Course Name] should change?
  • What improvements would you want to make in the [Course Name] material, session structure, and evaluation methods? 

Using SurveySparrow’s interactive survey builder, you can include as many questions as you want in our pre-built student survey templates. You can also customize them to create your own mobile form or chat survey.

With an online survey tool like SurveySparrow, you can easily share surveys and collect responses from multiple channels. Moreover, it will generate dynamic reports that refresh with each new response.


#3. Online Student Satisfaction Questionnaire For Remote Learning

Just like remote work, remote learning played a crucial role during the pandemic, as it made sure a student’s learning didn’t stop. But how effective was it? You can gauge the results with this online student satisfaction survey questionnaire.

  • On a scale of 1 to 10, how would you rate your remote learning experience?
  • Was remote learning stressful for you during the Covid-19 pandemic?
  • Is the remote learning program working well for you?
  • Do you enjoy learning remotely? If yes, what are your reasons?
  • How’s the environment at home when you’re learning remotely?
  • As a remote learner, are you keeping up with the number of hours you committed to studying in school?
  • On a scale of 1 to 10, how well do you think you manage your time while learning remotely?
  • How’s the online curriculum different from the normal one?
  • On a scale of 1 to 10, how well do you think the online curriculum works for you?
  • Are you satisfied with the technology and software being used for remote learning?
  • On a scale of 1 to 10, how important is face-to-face communication while learning remotely?
  • How often do you talk to your [school/university name] classmates in a week?
  • Are you getting all the help you need with your coursework?
  • What has been the hardest part about completing your coursework?
  • How difficult or easy is it for you to connect to the internet to access your coursework?
  • When you have your online classes, how often do you have the technology (laptop, tablet, etc) you need?
  • What do you not like about your remote learning classes?
  • What do you like about your remote learning classes?
  • How difficult or easy is it to use remote learning technology (computer, video conferencing tools, online learning software, etc.)?
  • How difficult (or easy) is it to stay focused on your coursework?
  • What projects or activities do you find the most engaging in this class?
  • How do you know when you are engaged in your online classes?
  • If you were teaching an online class yourself, what is the one thing you would do to make it more engaging?


#4. Student Satisfaction Questionnaire For Campus Environment & Bullying

Here’s the fourth category – student satisfaction survey questions for campus environment, bullying, and all types of harassment.

  • On a scale of 1 to 10, how safe do you feel inside the college/school campus?
  • How would you score the campus on a scale of 1 to 10?
  • What do you find best about the campus environment?
  • On a scale of 1 to 10, is a good campus environment crucial for both physical and mental well-being?
  • What do you like the most about the campus environment?
  • Have you been bullied inside or outside the campus this academic session?
  • Can you talk about how you got bullied?
  • From the given options, select the frequency of bullying inside [Institution Name]?
  • Select the type of harassment prevalent inside [Institution Name].
  • From the given options, select the area inside the campus where bullying occurs the most?
  • Have you talked to the authorities or your faculty about bullying and harassment?
  • What do you think is/are the reason(s) behind bullying?
  • How can [Institution Name] stop bullying?
  • Is trolling on social media prevalent during online education?
  • What will be your first course of action if you encounter harassment?

55-56 percent. That’s the average satisfaction level of college students in the US. In contrast, online learners had an impressive satisfaction level of 73 percent. These stats come from the 2021 National Student Satisfaction and Priorities Report, which surveyed 397,571 students from 260 institutions.

#5. Student Satisfaction Questionnaire On Personal Engagement 

Personal engagement plays a massive role in learner satisfaction – both in online and offline education. Let’s discuss student survey questions related to that.

  • What value did you receive from [Course Name]?
  • On a scale of 1 to 10, how focused do you feel inside the classroom?
  • Do you like participating in active discussions inside the classroom?
  • What resources should the college offer for improved student satisfaction?
  • What aspects of learning do you find the least exciting?
  • Do you feel hesitant in clearing doubts with the faculty?
  • How excited are you about studying [Course Name] in the future?
  • How motivated are you on a scale of 1 to 10 about investing daily time on [Course Name]?
  • On a scale of 1 to 10, do you miss the college, classroom, and everything related to campus life when away?
  • Is online education improving your satisfaction levels as a learner?


#6. Student Satisfaction Questionnaire On Stress Levels

Lastly, we look at student satisfaction survey questions to gauge the stress levels of students in schools and colleges.

  • On a scale of 1 to 10, how stressed did you feel on a daily basis during the year?
  • What do you think are some common causes of stress in your life?
  • How exactly do you experience stress? Tell us a little bit about the sensations you get when you’re stressed.
  • What are your methods to relieve stress?
  • On a scale of 1 to 10, how well do you think you cope with study-related stress?
  • What are your biggest stressors from school/college?
  • What are the common psychological effects of stress you’ve noticed on yourself?
  • What’s something our school/university could do to help lower your stress?
  • I find it difficult to pay attention in class. (Yes/No)
  • I don’t fully understand what my teacher teaches. (Yes/No)
  • I’m not sure if I’m able to do well in school. (Yes/No)
  • My attendance is poor. (Yes/No)
  • I feel there is a great deal of homework to do. (Yes/No)
  • I’ve got too many assignments to complete. (Yes/No)
  • I am often late for class. (Yes/No)
  • I have trouble getting along with my family and friends. (Yes/No)
  • I’ve got no friends. (Yes/No)
  • I feel insecure because there’s too much competition for getting good grades and a good job. (Yes/No)
  • I’m left with hardly any time for physical activities after school/college. (Yes/No)
  • I have gained/lost weight due to the school/college curriculum. (Yes/No)
  • I’m tired and sleeping more/less than normal due to the curriculum. (Yes/No)

12 Factors Determining Student Satisfaction

Many factors determine a student’s satisfaction in schools or colleges. Some are so deeply rooted in an educational institution that they’re hard to change, and some can change for good with the right action and effort.

By now you should have a good understanding of the six categories and the 100+ questions in them. Those feed directly into this discussion, and there are a few more factors to cover. So, on we go.

The History

The history of an institution might not be the brightest when it comes to accepting diversity and giving each student an equal opportunity to shine.

That creates unrest and becomes a point of dissatisfaction for students who want their schools and colleges to be leaders of change, forward thinking, and innovation. Sticking to all of its historical ideals and ways of working might not be the best idea for an educational group.

Courses Offered

Students appreciate diversity not only in the student body but also in the courses on offer. No student wants limited course options; they want as many choices as possible.

It can be difficult, given constraints such as shortages of quality teachers, labs, and equipment for several courses. But this is where student satisfaction survey questions help, as the responses reveal patterns that make identifying in-demand courses easy.

What remains after that is taking action. Otherwise, it leads to dissatisfaction and a disconnect between students and their schools and colleges.

Quality Of Courses

We partly covered this factor above: it’s not the number but the quality of courses that defines student satisfaction. No learner wants subjects with no good teachers or infrastructure.

This factor has grown in importance in the post-pandemic world, where online study offers plenty of great courses. You can see this trend in every online student satisfaction survey questionnaire. So it’s time all educational establishments start focusing on the core quality of their courses, as student satisfaction hinges heavily on it.

Quality Of Teachers

Obviously, one of the most crucial factors that affect student satisfaction is teaching quality. That includes course curriculum, teaching techniques, practical approach, and testing methods.

Undergrads perform at their best when the teacher is excellent at their craft, and the reverse is equally true. As you saw, student satisfaction survey questions on faculty assessment made up a good chunk of the questions we gave, and rightly so. Ensuring optimum teaching quality is too important for any educational establishment to miss. Period.

Campus Infrastructure

Closely related to quality teaching is the availability of strong campus infrastructure.

It would be wrong to say a faculty member is not good enough when they do not have the right labs to conduct experiments, study material for classroom sessions, or enough space for practical work. And this dissatisfies not just students, but teachers, too.

Campus Environment

Students should enjoy roaming freely on campus. That’s how the campus environment should be.

A focus on well-built roads, greenery, and open grounds lets students relax and de-stress. It also allows faculty to conduct practical, real-life experiments to great effect. Both add to a student’s satisfaction levels.

We know that having a large campus is not always possible, but doing the best with the existing area is surely a possibility, isn’t it? 

Opportunities

One of the prime motivators for someone to join a particular educational institution is the opportunities it offers: opportunities to learn, experiment, collaborate, and build something.

As such, learners need the right opportunities to compete, intern, or get a job while studying. Reading the responses to student satisfaction survey questions makes it obvious whether they’re satisfied with the existing opportunities. If they are, great. If not, it’s time to change, and quickly.

Testing & Evaluation Methods

“On what basis has she scored my report?”

“I don’t believe the grade I got. I deserved so much better.”

“Whatever I do, I’m not gonna get a better grade with him.”

If you’ve heard such statements from students, it’s time to sit down with all faculty and review the testing and evaluation methods. Make sure they aren’t outdated or redundant, and if they are, don’t hesitate to overhaul them completely. The satisfaction of your students depends greatly on the evaluation methods in use.

Extracurriculars

The mundane college or school life can get boring very quickly.

As such, extracurricular activities like sports, art and cultural competitions, events, and trips give everyone a much-needed breather to refresh and get going again. And although not the most crucial factor, they become a major cause of dissatisfaction among students if ignored. Remember that.

Bullying & Harassment

Any educational institution needs to make sure students aren’t being bullied inside the campus. Bullying has been shown to have a direct impact on people’s mental health and is one of the biggest causes of unrest and dissatisfaction among the student community.

Specific questions, like the ones we gave, related to bullying and other harassment in your student satisfaction survey will highlight if it’s a cause of concern or not. And addressing it right at the beginning is the best way to deal with it. 

Food & Accommodation

Food and accommodation might not have an instant impact on learners’ satisfaction levels, but if left unattended, the dissatisfaction slowly adds up. So make sure to keep it under check with satisfaction surveys. 

Safety

Lastly, there’s the often ignored aspect of safety. Yes, everybody talks in detail about how crucial the safety and well-being of students and faculty are, but it often gets overlooked compared to other factors.

We feel that ensuring the safety of everyone inside the campus should be the first step towards generating good satisfaction levels, as it sets the tone for a great campus environment. One where students feel safe and at home so that they can focus completely on their work and learning. 

Wrapping Up

Student surveys bring rich data with minimal effort, and that data helps improve the education model for better results. Student satisfaction survey questions are the main catalyst for that. These surveys highlight problem areas so decision-makers can act on them with emphasis.

Start conducting these surveys then. Use the 100+ questions to carry out not just one but multiple satisfaction surveys. For a student satisfaction template or a query, talk to us here 24/7.  Ciao!



Templates for Quick Student Feedback

Resource Overview

Gather quick and useful student feedback about your course with our easy-to-use templates.

Gathering Student Feedback


Get started with survey templates created through collaboration between CIRCLE and the Center for Teaching and Learning.

We offer templates available for use in Canvas, Google Forms and Qualtrics. The templates all have the same questions and are fully customizable; use whichever platform you prefer. Please note that only Google Forms and Qualtrics offer truly anonymous surveys.

Each format has four templates available for gathering various kinds of feedback:

  • Quick Course Pulse : General feedback on how the course is going overall
  • Quick Feedback on a Course Activity : Feedback on a specific course activity
  • Quick Activity Feedback – Group Work : Feedback on a group assignment or activity
  • Quick Lecture Feedback : Feedback specific to lecture portions of a course

Using these Templates in your Course

  • Consider each template as a jumping off point. You can use them as is, or you can update them to better fit your needs by changing settings or editing, adding, or deleting questions. Instructions and tips are included in the survey guides found on the pages for each survey mode.
  • Under “Downloads” there is a pdf guide explaining how to use each kind of template in detail (Canvas / Google Forms / Qualtrics).
  • If you’d like further help or input on collecting student feedback, schedule a consultation with CTL staff or contact Rick Moore, the Assistant Director for Assessment and Evaluation, at [email protected]

Canvas

These templates were created through a collaboration between CIRCLE and the Center for Teaching and Learning. To preview the templates and import them into your courses, log into Canvas and then follow the links below. Alternatively, you can also search for the templates within Canvas Commons using the instructions further down the page.

See our pdf guide download that includes detailed instructions plus general tips on collecting student feedback using these templates.

Available Survey Templates


  • Quick Course Pulse: General feedback on how the course is going overall
  • Quick Feedback on a Course Activity: Feedback on a specific course activity
  • Quick Activity Feedback – Group Work: Feedback on a group assignment or activity
  • Quick Lecture Feedback: Feedback specific to lecture portions of a course

If you still can’t get to the surveys via the links, please follow the instructions below to access the surveys by searching Canvas Commons. Once you’ve imported a survey into your course, it can be found under “Quizzes,” where you can make any changes you need and publish it to your students.

Accessing Surveys via Canvas Commons Search

(The original page includes screenshots showing how to search Canvas Commons for these templates and how to recognize the correct results.)

Google Forms

Google Forms is an easy-to-use survey platform that can be completely anonymous. If you don’t already have one, you will need to create a Google account to design and send Google Forms, but students do not need to use or even have a Google account to respond.

The templates below were created through collaboration between CIRCLE and the Center for Teaching and Learning.

Click the links below to access the survey templates. Once a template has been opened, click the “USE TEMPLATE” button located in the upper right corner of the screen to import the survey into your Google account.

We also offer a pdf guide that includes  more instructions and general tips  on collecting student feedback using these Google Form templates.

Qualtrics

Qualtrics is a professional survey platform to which WashU has an institutional subscription. Details and support can be found on the Information Technology Qualtrics page.

Although Qualtrics is not difficult to use, because it is relatively feature-rich, it may be easier for you to use our Google Form or Canvas templates if you are not already familiar with designing surveys on this platform.

To import the templates into your Qualtrics account, download and  import the .qsf file for the survey  that you would like to use. In addition to the .qsf files, you can also download a Word version of the survey in order to preview the questions before importing them into Qualtrics.

We offer a pdf guide that includes more instructions and general tips on collecting student feedback using these templates.

  • Quick Course Pulse: General feedback on how the course is going overall [ .qsf  /  Word ]
  • Quick Feedback on a Course Activity: Feedback on a specific course activity [ .qsf  /  Word ]
  • Quick Activity Feedback – Group Work: Feedback on a group assignment or activity [ .qsf  /  Word ]
  • Quick Lecture Feedback: Feedback specific to lecture portions of a course [ .qsf  /  Word ]

Further Tips

  • Collect something: Don’t get overwhelmed by the possibilities. Even a single piece of feedback can help you improve your course. Think about what you most want to know about your students’ experience in your course.
  • No need to reinvent the wheel: Use one of our ready-made templates to get started. You can always customize the questions to meet your specific needs, but there’s no need to start from scratch.
  • Wait a bit: Refrain from collecting feedback during the first two weeks of class; this will help you avoid investing effort into fixing issues that will naturally improve as students settle into the course.
  • Explain: Tell students what types of feedback you plan to collect and why.
  • Keep it brief: Try to use surveys that students can complete in a minute or two.
  • Make it convenient: Embed surveys into Canvas modules or send them via email to streamline the process for students.
  • Be strategic: Don’t overwhelm your students or yourself by collecting feedback on every activity. Focus your efforts on activities you suspect are not working well or those you’ve had to alter significantly.
  • Provide clear response options: Aim for response options with clear practical meanings (e.g., Needs Improvement/Meets Expectations vs. a 1-5 scale). This approach may help you view, interpret, and respond to feedback more quickly.
  • Give credit: If possible, consider tying surveys to a small number of points (i.e., completion credit) to encourage participation. For example, this can be easily done in Canvas with a graded survey.
  • Acknowledge: Thank students for their feedback and emphasize its importance for improving the course.
  • Follow-Up: If necessary, make changes and assess their effect with a follow-up survey. For changes made to specific activities, the follow-up can be conducted the next time students complete the activity. For more general course-level changes, allow at least 3-4 weeks for the changes to take hold before reassessing. It’s also a good idea to tell students explicitly that you’re making changes based on their feedback. If many students ask for a change that you decide not to implement, you might want to explain why you made that choice, if appropriate. This lets students know that they were heard.
  • Start small: If your data indicate that improvement is needed, start by identifying one concrete change you can make rather than attempting a major overhaul.
  • Take care of “easy” fixes: Be sure to make “easy” fixes (broken links, pdfs that are hard to read, audio that is hard to hear during lecture, etc.); even if only one student has mentioned them, others have likely had the same problem.
  • Expect some negative feedback: A small percentage of students will always indicate that something “Needs Improvement.” It’s also normal to receive some contradictory results, especially with open-ended questions (e.g., discussions are too long / discussions are too short). Keep negative ratings and comments in perspective.
  • Use open-ended feedback: Identify potential changes by consulting open-ended feedback (if you’ve collected it), soliciting follow-up feedback from students, seeking input from colleagues, or consulting with the CTL.
  • Look for patterns: Several students writing similar responses may be evidence of wider agreement on an issue. At the same time, comments that disagree with one another can also point to areas where student experience varies; such variation may signal an area to address in the future.
  • Sort answers: Finding patterns and making sense of qualitative data is not always easy, especially if you have more than just a few responses. One method is to copy and paste the qualitative answers into another program (e.g., Excel or Word) where you can sort them into categories that are helpful to you; a programmatic version of this sorting is sketched just after this list. There is no one right sorting strategy for everyone, so choose the one that works best for you.
  • Follow up: You can always collect another round of feedback to assess changes you make. If this is a course-level change (and not just a change to a specific activity), allow several weeks for the change to take effect before assessing.
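If you would rather sort open-ended responses programmatically than in a spreadsheet, a short script can tag each response with the categories it mentions. The sketch below is only illustrative: the sample responses, category names, and keywords are hypothetical placeholders that you would replace with themes drawn from your own feedback.

```python
# A minimal, hypothetical sketch of keyword-based sorting for open-ended
# survey responses. The responses, category names, and keywords below are
# placeholders; replace them with themes from your own course feedback.

responses = [
    "The discussion sections feel rushed",
    "Please post the lecture slides before class",
    "Group work deadlines are unclear",
    "More practice problems would help",
]

categories = {
    "pacing": ["rushed", "too fast", "too slow", "pace"],
    "materials": ["slides", "readings", "practice problems", "pdf"],
    "group work": ["group", "teammates", "deadline"],
}

def categorize(response):
    """Return every category whose keywords appear in the response."""
    text = response.lower()
    matched = [name for name, keywords in categories.items()
               if any(keyword in text for keyword in keywords)]
    return matched or ["uncategorized"]

for r in responses:
    print(f"{', '.join(categorize(r))}: {r}")
```

Even rough tagging like this makes it easier to count how often each theme comes up before you read individual comments in detail.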

While designing your survey, consider the advantages and challenges associated with open-ended survey questions.

Advantages:

  • Provides a low-barrier opportunity for students to share concerns and suggestions
  • Can help you interpret and respond to results of the closed questions without additional follow-up

Challenges:

  • Reviewing open responses can be time consuming, especially for a large course.
  • Open questions require more effort from students and can reduce response rates.

Some form of open-ended feedback is crucial to pinpoint problems and identify potential improvements. If you choose not to ask open-ended questions initially, you might consider following up with students via class discussion, email, or another survey on any specific areas of concern identified by your initial survey. While leaving open-ended questions for follow-up can help you control the amount and type of feedback you receive, there are also drawbacks to this approach: it creates extra logistical steps for you and your students, and surveying people frequently can significantly lower response rates. In general, open-ended questions are easiest to implement in smaller courses and more difficult (although not impossible) to use in larger ones.



Survey questions 101: 70+ survey question examples, types of surveys, and FAQs

How well do you understand your prospects and customers—who they are, what keeps them awake at night, and what brought them to your business in search of a solution? Asking the right survey questions at the right point in their customer journey is the most effective way to put yourself in your customers’ shoes.


This comprehensive intro to survey questions contains over 70 examples of effective questions, an overview of different types of survey questions, and advice on how to word them for maximum effect. Plus, we’ll toss in our pre-built survey templates, expert survey insights, and tips to make the most of AI for Surveys in Hotjar. ✨

Surveying your users is the simplest way to understand their pain points, needs, and motivations. But first, you need to know how to set up surveys that give you the answers you—and your business—truly need. Impactful surveys start here:

❓ The main types of survey questions : most survey questions are classified as open-ended, closed-ended, nominal, Likert scale, rating scale, and yes/no. The best surveys often use a combination of questions.

💡 70+ good survey question examples : our top 70+ survey questions, categorized across ecommerce, SaaS, and publishing, will help you find answers to your business’s most burning questions

✅ What makes a survey question ‘good’ : a good survey question is anything that helps you get clear insights and business-critical information about your customers 

❌ The dos and don’ts of writing good survey questions : remember to be concise and polite, use the foot-in-the-door principle, alternate questions, and test your surveys. But don’t ask leading or loaded questions, overwhelm respondents with too many questions, or neglect other tools that can get you the answers you need.

👍 How to run your surveys the right way : use a versatile survey tool like Hotjar Surveys that allows you to create on-site surveys at specific points in the customer journey or send surveys via a link

🛠️ 10 use cases for good survey questions : use your survey insights to create user personas, understand pain points, measure product-market fit, get valuable testimonials, measure customer satisfaction, and more

Use Hotjar to build your survey and get the customer insight you need to grow your business.

6 main types of survey questions

Let’s dive into our list of survey question examples, starting with a breakdown of the six main categories your questions will fall into:

Open-ended questions

Closed-ended questions

Nominal questions

Likert scale questions

Rating scale questions

'Yes' or 'no' questions

1. Open-ended survey questions

Open-ended questions  give your respondents the freedom to  answer in their own words , instead of limiting their response to a set of pre-selected choices (such as multiple-choice answers, yes/no answers, 0–10 ratings, etc.). 

Examples of open-ended questions:

What other products would you like to see us offer?

If you could change just one thing about our product, what would it be?

When to use open-ended questions in a survey

The majority of example questions included in this post are open-ended, and there are some good reasons for that:

Open-ended questions help you learn about customer needs you didn’t know existed , and they shine a light on areas for improvement that you may not have considered before. If you limit your respondents’ answers, you risk cutting yourself off from key insights.

Open-ended questions are very useful when you first begin surveying your customers and collecting their feedback. If you don't yet have a good amount of insight, answers to open-ended questions will go a long way toward educating you about who your customers are and what they're looking for.

There are, however, a few downsides to open-ended questions:

First, people tend to be less likely to respond to open-ended questions in general because they take comparatively more effort to answer than, say, a yes/no one

Second, but connected: if you ask consecutive open-ended questions during your survey, people will get tired of answering them, and their answers might become less helpful the more you ask

Finally, the data you receive from open-ended questions will take longer to analyze compared to easy 1-5 or yes/no answers—but don’t let that stop you. There are plenty of shortcuts that make it easier than it looks (we explain it all in our post about how to analyze open-ended questions , which includes a free analysis template.)

💡 Pro tip: if you’re using Hotjar Surveys, let our AI for Surveys feature analyze your open-ended survey responses for you. Hotjar AI reviews all your survey responses and provides an automated summary report of key findings, including supporting quotes and actionable recommendations for next steps.

2. Closed-ended survey questions

Closed-ended questions limit a user’s response options to a set of pre-selected choices. This broad category of questions includes nominal questions, Likert scale questions, rating scale questions, and ‘yes’ or ‘no’ questions.

When to use closed-ended questions

Closed-ended questions work brilliantly in two scenarios:

To open a survey, because they require little time and effort and are therefore easy for people to answer. This is called the foot-in-the-door principle: once someone commits to answering the first question, they may be more likely to answer the open-ended questions that follow.

When you need to create graphs and trends based on people’s answers. Responses to closed-ended questions are easy to measure and use as benchmarks. Rating scale questions, in particular (e.g. where people rate customer service on a scale of 1-10), allow you to gather customer sentiment and compare your progress over time.

3. Nominal questions

A nominal question is a type of survey question that presents people with multiple answer choices; the answers are  non-numerical in nature and don't overlap  (unless you include an ‘all of the above’ option).

Example of nominal question:

What are you using [product name] for?

Personal use

Business use

Both business and personal use

When to use nominal questions

Nominal questions work well when there is a limited number of categories for a given question (see the example above). They’re easy to create graphs and trends from, but the downside is that you may not offer enough categories for every respondent to answer accurately.

For example, if you ask people what type of browser they’re using and only give them three options to choose from, you may inadvertently alienate everybody who uses a fourth type and now can’t tell you about it.

That said, you can add an open-ended component to a nominal question with an expandable ’other’ category, where respondents can write in an answer that isn’t on the list. This way, you essentially ask an open-ended question that doesn’t limit them to the options you’ve picked.

4. Likert scale questions

The Likert scale is typically a 5- or 7-point scale that evaluates a respondent’s level of agreement with a statement or the intensity of their reaction toward something.

The scale develops symmetrically: the median number (e.g. a 3 on a 5-point scale) indicates a point of neutrality, the lowest number (always 1) indicates an extreme view, and the highest number (e.g. a 5 on a 5-point scale) indicates the opposite extreme view.

Example of a Likert scale question:

The British Museum uses a Likert scale Hotjar survey to gauge visitors’ reactions to their website optimizations

When to use Likert scale questions

Likert-type questions are also known as ordinal questions because the answers are presented in a specific order. Like other multiple-choice questions, Likert scale questions come in handy when you already have some sense of what your customers are thinking. For example, if your open-ended questions uncover a complaint about a recent change to your ordering process, you could use a Likert scale question to determine how the average user felt about the change.

A series of Likert scale questions can also be turned into a matrix question. Because they share identical response options, they combine easily into a single matrix, which presents them more compactly and breaks up the monotony of answering one single question after another.

5. Rating scale questions

Rating scale questions are questions where the answers map onto a numeric scale (such as rating customer support on a scale of 1-5, or likelihood to recommend a product from 0-10).

Examples of rating questions:

How likely are you to recommend us to a friend or colleague on a scale of 0-10?

How would you rate our customer service on a scale of 1-5?

When to use rating questions

Whenever you want to assign a numerical value to your survey or visualize and compare trends , a rating question is the way to go.

A typical rating question is used to determine Net Promoter Score® (NPS®) : the question asks customers to rate their likelihood of recommending products or services to their friends or colleagues, and allows you to look at the results historically and see if you're improving or getting worse. Rating questions are also used for customer satisfaction (CSAT) surveys and product reviews.

When you use a rating question in a survey, be sure to explain what the scale means (e.g. 1 for ‘Poor’, 5 for ‘Amazing’). And consider adding a follow-up open-ended question to understand why the user left that score.

Example of a rating question (NPS):

Hotjar's Net Promoter Score® (NPS®) survey template lets you add open-ended follow-up questions so you can understand the reasons behind users' ratings

6. ‘Yes’ or ‘no’ questions

These dichotomous questions are super straightforward, requiring a simple ‘yes’ or ‘no’ reply.

Examples of yes/no questions:

Was this article useful? (Yes/No)

Did you find what you were looking for today? (Yes/No)

When to use ‘yes’ or ‘no’ questions

‘Yes’ and ‘no’ questions are a good way to quickly segment your respondents . For example, say you’re trying to understand what obstacles or objections prevent people from trying your product. You can place a survey on your pricing page asking people if something is stopping them, and follow up with the segment who replied ‘yes’ by asking them to elaborate further.

These questions are also effective for getting your foot in the door: a ‘yes’ or ‘no’ question requires very little effort to answer. Once a user commits to answering the first question, they tend to become more willing to answer the questions that follow, or even leave you their contact information.

Web design agency NerdCow used Hotjar Surveys to add a yes/no survey on The Transport Library’s website, and followed it up with an open-ended question for more insights

70+ more survey question examples

Below is a list of good survey questions, categorized across ecommerce, software as a service (SaaS), and publishing. You don't have to use them word-for-word, but hopefully, this list will spark some extra-good ideas for the surveys you’ll run immediately after reading this article. (Plus, you can create all of them with Hotjar Surveys—stick with us a little longer to find out how. 😉)

📊 9 basic demographic survey questions

Ask these questions when you want context about your respondents and target audience, so you can segment them later. Consider including demographic information questions in your survey when conducting user or market research as well. 

But don’t ask demographic questions just for the sake of it—if you're not going to use some of the data points from these sometimes sensitive questions (e.g. if gender is irrelevant to the result of your survey), move on to the ones that are truly useful for you, business-wise. 

Take a look at the selection of examples below, and keep in mind that you can convert most of them to multiple choice questions:

What is your name?

What is your age?

What is your gender?

What company do you work for?

What vertical/industry best describes your company?

What best describes your role?

In which department do you work?

What is the total number of employees in your company (including all locations where your employer operates)?

What is your company's annual revenue?

🚀 Get started: gather more info about your users with our product-market fit survey template .

👥 20+ effective customer questions

These questions are particularly recommended for ecommerce companies:

Before purchase

What information is missing or would make your decision to buy easier?

What is your biggest fear or concern about purchasing this item?

Were you able to complete the purpose of your visit today?

If you did not make a purchase today, what stopped you?

After purchase

Was there anything about this checkout process we could improve?

What was your biggest fear or concern about purchasing from us?

What persuaded you to complete the purchase of the item(s) in your cart today?

If you could no longer use [product name], what’s the one thing you would miss the most?

What’s the one thing that nearly stopped you from buying from us?

👉 Check out our 7-step guide to setting up an ecommerce post-purchase survey .

Other useful customer questions

Do you have any questions before you complete your purchase?

What other information would you like to see on this page?

What were the three main things that persuaded you to create an account today?

What nearly stopped you from creating an account today?

Which other options did you consider before choosing [product name]?

What would persuade you to use us more often?

What was your biggest challenge, frustration, or problem in finding the right [product type] online?

Please list the top three things that persuaded you to use us rather than a competitor.

Were you able to find the information you were looking for?

How satisfied are you with our support?

How would you rate our service/support on a scale of 0-10? (0 = terrible, 10 = stellar)

How likely are you to recommend us to a friend or colleague? ( NPS question )

Is there anything preventing you from purchasing at this point?

🚀 Get started: learn how satisfied customers are with our expert-built customer satisfaction and NPS survey templates .

Set up a survey in seconds

Use Hotjar's free survey templates to build virtually any type of survey, and start gathering valuable insights in moments.

🛍 30+ product survey questions

These questions are particularly recommended for SaaS companies:

Questions for new or trial users

What nearly stopped you from signing up today?

How likely are you to recommend us to a friend or colleague on a scale of 0-10? (NPS question)

Is our pricing clear? If not, what would you change?

Questions for paying customers

What convinced you to pay for this service?

What’s the one thing we are missing in [product type]?

What's one feature we can add that would make our product indispensable for you?

If you could no longer use [name of product], what’s the one thing you would miss the most?

🚀 Get started: find out what your buyers really think with our pricing plan feedback survey template .

Questions for former/churned customers

What is the main reason you're canceling your account? Please be blunt and direct.

If you could have changed one thing in [product name], what would it have been?

If you had a magic wand and could change anything in [product name], what would it be?

🚀 Get started: find out why customers churn with our free-to-use churn analysis survey template .

Other useful product questions

What were the three main things that persuaded you to sign up today?

Do you have any questions before starting a free trial?

What persuaded you to start a trial?

Was this help section useful?

Was this article useful?

How would you rate our service/support on a scale of 0-10? (0 = terrible, 10 = stellar)

Is there anything preventing you from upgrading at this point?

Is there anything on this page that doesn't work the way you expected it to?

What could we change to make you want to continue using us?

If you did not upgrade today, what stopped you?

What's the next thing you think we should build?

How would you feel if we discontinued this feature?

What's the next feature or functionality we should build?

🚀 Get started: gather feedback on your product with our free-to-use product feedback survey template .

🖋 20+ effective questions for publishers and bloggers

Questions to help improve content

If you could change just one thing in [publication name], what would it be?

What other content would you like to see us offer?

How would you rate this article on a scale of 1–10?

If you could change anything on this page, what would you have us do?

If you did not subscribe to [publication name] today, what was it that stopped you?

🚀 Get started: find ways to improve your website copy and messaging with our content feedback survey template .

New subscriptions

What convinced you to subscribe to [publication] today?

What almost stopped you from subscribing?

What were the three main things that persuaded you to join our list today?

Cancellations

What is the main reason you're unsubscribing? Please be specific.

Other useful content-related questions

What’s the one thing we are missing in [publication name]?

What would persuade you to visit us more often?

How likely are you to recommend us to someone with similar interests? (NPS question)

What’s missing on this page?

What topics would you like to see us write about next?

How useful was this article?

What could we do to make this page more useful?

Is there anything on this site that doesn't work the way you expected it to?

What's one thing we can add that would make [publication name] indispensable for you?

If you could no longer read [publication name], what’s the one thing you would miss the most?

💡 Pro tip: do you have a general survey goal in mind, but are struggling to pin down the right questions to ask? Give Hotjar’s AI for Surveys a go and watch as it generates a survey for you in seconds with questions tailored to the exact purpose of the survey you want to run.

What makes a good survey question?

We’ve run through more than 70 of our favorite survey questions—but what is it that makes a good survey question, well, good? An effective question is anything that helps you get clear insights and business-critical information about your customers, including:

Who your target market is

How you should price your products

What’s stopping people from buying from you

Why visitors leave your website

With this information, you can tailor your website, products, landing pages, and messaging to improve the user experience and, ultimately, maximize conversions .

How to write good survey questions: the DOs and DON’Ts

To help you understand the basics and avoid some rookie mistakes, we asked a few experts to give us their thoughts on what makes a good and effective survey question.

Survey question DOs

✅ DO focus your questions on the customer

It may be tempting to focus on your company or products, but it’s usually more effective to put the focus back on the customer. Get to know their needs, drivers, pain points, and barriers to purchase by asking about their experience. That’s what you’re after: you want to know what it’s like inside their heads and how they feel when they use your website and products.

Rather than asking, “Why did you buy our product?” ask, “What was happening in your life that led you to search for this solution?” Instead of asking, “What's the one feature you love about [product],” ask, “If our company were to close tomorrow, what would be the one thing you’d miss the most?” These types of surveys have helped me double and triple my clients.

✅ DO be polite and concise (without skimping on micro-copy)

Put time into your micro-copy—those tiny bits of written content that go into surveys. Explain why you’re asking the questions, and when people reach the end of the survey, remember to thank them for their time. After all, they’re giving you free labor!

✅ DO consider the foot-in-the-door principle

One way to increase your response rate is to ask an easy question upfront, such as a ‘yes’ or ‘no’ question, because once people commit to taking a survey—even just the first question—they’re more likely to finish it.

✅ DO consider asking your questions from the first-person perspective

Disclaimer: we don’t do this here at Hotjar. You’ll notice all our sample questions are listed in second-person (i.e. ‘you’ format), but it’s worth testing to determine which approach gives you better answers. Some experts prefer the first-person approach (i.e. ‘I’ format) because they believe it encourages users to talk about themselves—but only you can decide which approach works best for your business.

I strongly recommend that the questions be worded in the first person. This helps create a more visceral reaction from people and encourages them to tell stories from their actual experiences, rather than making up hypothetical scenarios. For example, here’s a similar question, asked two ways: “What do you think is the hardest thing about creating a UX portfolio?” versus “My biggest problem with creating my UX portfolio is…” 

The second version helps get people thinking about their experiences. The best survey responses come from respondents who provide personal accounts of past events that give us specific and real insight into their lives.

✅ DO alternate your questions often

Shake up the questions you ask on a regular basis. Asking a wide variety of questions will help you and your team get a complete view of what your customers are thinking.

✅ DO test your surveys before sending them out

A few years ago, Hotjar created a survey we sent to 2,000 CX professionals via email. Before officially sending it out, we wanted to make sure the questions really worked. 

We decided to test them out on internal staff and external people by sending out three rounds of test surveys to 100 respondents each time. Their feedback helped us perfect the questions and clear up any confusing language.

Survey question DON’Ts

❌ DON’T ask closed-ended questions if you’ve never done research before

If you’ve just begun asking questions, make them open-ended questions since you have no idea what your customers think about you at this stage. When you limit their answers, you just reinforce your own assumptions.

There are two exceptions to this rule:

Using a closed-ended question to get your foot in the door at the beginning of a survey

Using rating scale questions to gather customer sentiment (like an NPS survey)

❌ DON’T ask a lot of questions if you’re just getting started

Having to answer too many questions can overwhelm your users. Stick with the most important points and discard the rest.

Try starting off with a single question to see how your audience responds, then move on to two questions once you feel like you know what you’re doing.

How many questions should you ask? There’s really no perfect answer, but we recommend asking as few as you need to ask to get the information you want. In the beginning, focus on the big things:

Who are your users?

What do potential customers want?

How are they using your product?

What would win their loyalty?

❌ DON’T just ask a question when you can combine it with other tools

Don’t just use surveys to answer questions that other tools (such as analytics) can also answer. If you want to learn about whether people find a new website feature helpful, you can also observe how they’re using it through traditional analytics, session recordings , and other user testing tools for a more complete picture.

Don’t use surveys to ask people questions that other tools are better equipped to answer. I’m thinking of questions like “What do you think of the search feature?” with pre-set answer options like ‘Very easy to use,’ ‘Easy to use,’ etc. That’s not a good question to ask. 

Why should you care about what people ‘think’ about the search feature? You should find out whether it helps people find what they need and whether it helps drive conversions for you. Analytics, user session recordings, and user testing can tell you whether it does that or not.

❌ DON’T ask leading questions

A leading question is one that prompts a specific answer. Avoid asking leading questions because they’ll give you bad data. For example, asking, “What makes our product better than our competitors’ products?” might boost your self-esteem, but it won’t get you good information. Why? You’re effectively planting the idea that your own product is the best on the market.

❌ DON’T ask loaded questions

A loaded question is similar to a leading question, but it does more than just push a bias—it phrases the question such that it’s impossible to answer without confirming an underlying assumption.

A common (and subtle) form of loaded survey question would be, “What do you find useful about this article?” If we haven’t first asked you whether you found the article useful at all, then we’re asking a loaded question.

❌ DON’T ask about more than one topic at once

For example, “Do you believe our product can help you increase sales and improve cross-collaboration?”

This complex question, also known as a ‘double-barreled question’, requires a very complex answer as it begs the respondent to address two separate questions at once:

Do you believe our product can help you increase sales?

Do you believe our product can help you improve cross-collaboration?

Respondents may very well answer 'yes', but actually mean it for the first part of the question, and not the other. The result? Your survey data is inaccurate, and you’ve missed out on actionable insights.

Instead, ask two specific questions to gather customer feedback on each concept.

How to run your surveys

The format you pick for your survey depends on what you want to achieve and also on how much budget or resources you have. You can

Use an on-site survey tool , like Hotjar Surveys , to set up a website survey that pops up whenever people visit a specific page: this is useful when you want to investigate website- and product-specific topics quickly. This format is relatively inexpensive—with Hotjar’s free forever plan, you can even run up to 3 surveys with unlimited questions for free.


Use Hotjar Surveys to embed a survey as an element directly on a page: this is useful when you want to grab your audience’s attention and connect with customers at relevant moments, without interrupting their browsing. (Scroll to the bottom of this page to see an embedded survey in action!) This format is included on Hotjar’s Business and Scale plans—try it out for 15 days with a free Ask Business trial .

Use a survey builder and create a survey people can access in their own time: this is useful when you want to reach out to your mailing list or a wider audience with an email survey (you just need to share the URL the survey lives at). Sending in-depth questionnaires this way allows for more space for people to elaborate on their answers. This format is also relatively inexpensive, depending on the tool you use.

Place survey kiosks in a physical location where people can give their feedback by pressing a button: this is useful for quick feedback on specific aspects of a customer's experience (there’s usually plenty of these in airports and waiting rooms). This format is relatively expensive to maintain due to the material upkeep.

Run in-person surveys with your existing or prospective customers: in-person questionnaires help you dig deep into your interviewees’ answers. This format is relatively cheap if you do it online with a user interview tool or over the phone, but it’s more expensive and time-consuming if done in a physical location.

💡 Pro tip: looking for an easy, cost-efficient way to connect with your users? Run effortless, automated user interviews with Engage , Hotjar’s user interview tool. Get instant access to a pool of 200,000+ participants (or invite your own), and take notes while Engage records and transcribes your interview.

10 survey use cases: what you can do with good survey questions

Effective survey questions can help improve your business in many different ways. We’ve written in detail about most of these ideas in other blog posts, so we’ve rounded them up for you below.

1. Create user personas

A user persona is a character based on the people who currently use your website or product. A persona combines psychographics and demographics and reflects who they are, what they need, and what may stop them from getting it.

Examples of questions to ask:

Describe yourself in one sentence, e.g. “I am a 30-year-old marketer based in Dublin who enjoys writing articles about user personas.”

What is your main goal for using this website/product?

What, if anything, is preventing you from doing it?

👉 Our post about creating simple and effective user personas in four steps highlights some great survey questions to ask when creating a user persona.

🚀 Get started: use our user persona survey template or AI for Surveys to inform your user persona.

2. Understand why your product is not selling

Few things are more frightening than stagnant sales. When the pressure is mounting, you’ve got to get to the bottom of it, and good survey questions can help you do just that.

What made you buy the product? What challenges are you trying to solve?

What did you like most about the product? What did you dislike the most?

What nearly stopped you from buying?

👉 Here’s a detailed piece about the best survey questions to ask your customers when your product isn’t selling , and why they work so well.

🚀 Get started: our product feedback survey template helps you find out whether your product satisfies your users. Or build your surveys in the blink of an eye with Hotjar AI.

3. Understand why people leave your website

If you want to figure out why people are leaving your website , you’ll have to ask questions.

A good format for that is an exit-intent pop-up survey, which appears when a user clicks to leave the page, giving them the chance to leave website feedback before they go.

Another way is to focus on the people who did convert, but just barely—something Hotjar founder David Darmanin considers essential for taking conversions to the next level. By focusing on customers who bought your product (but almost didn’t), you can learn how to win over another set of users who are similar to them: those who almost bought your products, but backed out in the end.

Example of questions to ask:

Not for you? Tell us why. ( Exit-intent pop-up —ask this when a user leaves without buying.)

What almost stopped you from buying? (Ask this post-conversion .)

👉 Find out how HubSpot Academy increased its conversion rate by adding an exit-intent survey that asked one simple question when users left their website: “Not for you? Tell us why.”

🚀 Get started: place an exit-intent survey on your site. Let Hotjar AI draft the survey questions by telling it what you want to learn.

I spent the better half of my career focusing on the 95% who don’t convert, but it’s better to focus on the 5% who do. Get to know them really well, deliver value to them, and really wow them. That’s how you’re going to take that 5% to 10%.

4. Understand your customers’ fears and concerns

Buying a new product can be scary: nobody wants to make a bad purchase. Your job is to address your prospective customers’ concerns, counter their objections, and calm their fears, which should lead to more conversions.

👉 Take a look at our no-nonsense guide to increasing conversions for a comprehensive write-up about discovering the drivers, barriers, and hooks that lead people to converting on your website.

🚀 Get started: understand why your users are tempted to leave and discover potential barriers with a customer retention survey .

5. Drive your pricing strategy

Are your products overpriced and scaring away potential buyers? Or are you underpricing and leaving money on the table?

Asking the right questions will help you develop a pricing structure that maximizes profit, but you have to be delicate about how you ask. Don’t ask directly about price, or you’ll seem unsure of the value you offer. Instead, ask questions that uncover how your products serve your customers and what would inspire them to buy more.

How do you use our product/service?

What would persuade you to use our product more often?

What’s the one thing our product is missing?

👉 We wrote a series of blog posts about managing the early stage of a SaaS startup, which included a post about developing the right pricing strategy —something businesses in all sectors could benefit from.

🚀 Get started: find the sweet spot in how to price your product or service with a Van Westendorp price sensitivity survey or get feedback on your pricing plan .

6. Measure and understand product-market fit

Product-market fit (PMF) is about understanding demand and creating a product that your customers want, need, and will actually pay money for. A combination of online survey questions and one-on-one interviews can help you figure this out.

What's one thing we can add that would make [product name] indispensable for you?

If you could change just one thing in [product name], what would it be?

👉 In our series of blog posts about managing the early stage of a SaaS startup, we covered a section on product-market fit , which has relevant information for all industries.

🚀 Get started: discover if you’re delivering the best products to your market with our product-market fit survey .

7. Choose effective testimonials

Human beings are social creatures—we’re influenced by people who are similar to us. Testimonials that explain how your product solved a problem for someone are the ultimate form of social proof. The following survey questions can help you get some great testimonials.

What changed for you after you got our product?

How does our product help you get your job done?

How would you feel if you couldn’t use our product anymore?

👉 In our post about positioning and branding your products , we cover the type of questions that help you get effective testimonials.

🚀 Get started: add a question asking respondents whether you can use their answers as testimonials in your surveys, or conduct user interviews to gather quotes from your users.

8. Measure customer satisfaction

It’s important to continually track your overall customer satisfaction so you can address any issues before they start to impact your brand’s reputation. You can do this with rating scale questions.

For example, at Hotjar, we ask for feedback after each customer support interaction (which is one important measure of customer satisfaction). We begin with a simple, foot-in-the-door question to encourage a response, and use the information to improve our customer support, which is strongly tied to overall customer satisfaction.

How would you rate the support you received? (1-5 scale)

If 1-3: How could we improve?

If 4-5: What did you love about the experience?

👉 Our beginner’s guide to website feedback goes into great detail about how to measure customer service, NPS , and other important success metrics.

🚀 Get started: gauge short-term satisfaction level with a CSAT survey .
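To illustrate how those 1-5 ratings roll up into a single number: CSAT is commonly reported as the percentage of respondents who pick one of the top two options (a widespread convention, though not the only one). Here's a minimal sketch with made-up ratings:

```python
# Minimal sketch: computing a CSAT score from 1-5 ratings.
# Uses the common 'top-two-box' convention (4s and 5s count as satisfied);
# the ratings below are made-up example data.

ratings = [5, 4, 3, 5, 2, 4, 5, 1, 4, 5]

satisfied = sum(1 for r in ratings if r >= 4)
csat = 100 * satisfied / len(ratings)

print(f"CSAT: {csat:.0f}% ({satisfied} of {len(ratings)} respondents rated 4 or 5)")
```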

9. Measure word-of-mouth recommendations

Net Promoter Score is a measure of how likely your customers are to recommend your products or services to their friends or colleagues. NPS is a higher bar than customer satisfaction because customers have to be really impressed with your product to recommend you.

Example of NPS questions (to be asked in the same survey):

How likely are you to recommend this company to a friend or colleague? (0-10 scale)

What’s the main reason for your score?

What should we do to WOW you?

👉 We created an NPS guide with ecommerce companies in mind, but it has plenty of information that will help companies in other industries as well.

🚀 Get started: measure whether your users would refer you to a friend or colleague with an NPS survey . Then, use our free NPS calculator to crunch the numbers.
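If you'd rather crunch the numbers yourself, the standard NPS calculation subtracts the percentage of detractors (scores 0-6) from the percentage of promoters (scores 9-10). A minimal sketch with made-up scores:

```python
# Minimal sketch: computing a Net Promoter Score from 0-10 ratings.
# Promoters score 9-10, passives 7-8, detractors 0-6; the scores below
# are made-up example data.

scores = [10, 9, 8, 7, 6, 9, 10, 3, 8, 9]

promoters = sum(1 for s in scores if s >= 9)
detractors = sum(1 for s in scores if s <= 6)

nps = 100 * (promoters - detractors) / len(scores)
print(f"NPS: {nps:.0f} (promoters: {promoters}, detractors: {detractors})")
```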

10. Redefine your messaging

How effective is your messaging? Does it speak to your clients' needs, drives, and fears? Does it speak to your strongest selling points?

Asking the right survey questions can help you figure out what marketing messages work best, so you can double down on them.

What attracted you to [brand or product name]?

Did you have any concerns before buying [product name]?

Since you purchased [product name], what has been the biggest benefit to you?

If you could describe [brand or product name] in one sentence, what would you say?

What is your favorite thing about [brand or product name]?

How likely are you to recommend this product to a friend or colleague? (NPS question)

👉 We talk about positioning and branding your products in a post that’s part of a series written for SaaS startups, but even if you’re not in SaaS (or you’re not a startup), you’ll still find it helpful.

Have a question for your customers? Ask!

Feedback is at the heart of deeper empathy for your customers and a more holistic understanding of their behaviors and motivations. And luckily, people are more than ready to share their thoughts about your business— they're just waiting for you to ask them. Deeper customer insights start right here, with a simple tool like Hotjar Surveys.

Build surveys faster with AI🔥

Use AI in Hotjar Surveys to build your survey, place it on your website or send it via email, and get the customer insight you need to grow your business.

FAQs about survey questions

How many people should I survey? What should my sample size be?

A good rule of thumb is to aim for at least 100 replies that you can work with.

You can use our  sample size calculator  to get a more precise answer, but understand that collecting feedback is research, not experimentation. Unlike experimentation (such as A/B testing ), all is not lost if you can’t get a statistically significant sample size. In fact, as little as ten replies can give you actionable information about what your users want.
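If you're curious about the math a sample size calculator typically runs (the exact formula behind the linked calculator may differ), a common approach is Cochran's formula with a finite population correction. Here's a rough sketch assuming a 95% confidence level, a 5% margin of error, and maximum variability:

```python
# Rough sketch: estimating a survey sample size with Cochran's formula plus a
# finite population correction. Assumes a 95% confidence level (z = 1.96),
# a 5% margin of error, and maximum variability (p = 0.5); the population
# figure is a made-up example.

import math

def sample_size(population, z=1.96, p=0.5, e=0.05):
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)              # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite population correction

print(sample_size(2000))  # roughly 323 respondents for a population of 2,000
```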

How many questions should my survey have?

There’s no perfect answer to this question, but we recommend asking as few as you need to ask in order to get the information you want. Remember, you’re essentially asking someone to work for free, so be respectful of their time.

Why is it important to ask good survey questions?

A good survey question is asked in a precise way at the right stage in the customer journey to give you insight into your customers’ needs and drives. The qualitative data you get from survey responses can supplement the insight you can capture through other traditional analytics tools (think Google Analytics) and behavior analytics tools (think heatmaps and session recordings , which visualize user behavior on specific pages or across an entire website).

The format you choose for your survey—in-person, email, on-page, etc.—is important, but if the questions themselves are poorly worded you could waste hours trying to fix minimal problems while ignoring major ones a different question could have uncovered. 

How do I analyze open-ended survey questions?

A big pile of  qualitative data  can seem intimidating, but there are some shortcuts that make it much easier to analyze. We put together a guide for  analyzing open-ended questions in 5 simple steps , which should answer all your questions.

But the fastest way to analyze open questions is to use the automated summary report with Hotjar AI in Surveys . AI turns the complex survey data into:

Key findings

Actionable insights

Will sending a survey annoy my customers?

Honestly, the real danger is  not  collecting feedback. Without knowing what users think about your page and  why  they do what they do, you’ll never create a user experience that maximizes conversions. The truth is, you’re probably already doing something that bugs them more than any survey or feedback button would.

If you’re worried that adding an on-page survey might hurt your conversion rate, start small and survey just 10% of your visitors. You can stop surveying once you have enough replies.


Survey questions: Examples and tips

The complete guide to survey questions, with examples, types, and tips to write great questions.

Content Index

  • Dichotomous questions
  • Multiple choice
  • Rank order scaling
  • Rating scale
  • Semantic differential scale
  • Stapel scale
  • Constant sum survey questions
  • Open-ended
  • Demographic survey questions
  • Matrix table
  • Side-by-side matrix
  • Static content
  • Miscellaneous
  • Visual analog scale
  • Image chooser
  • Data reference
  • Upload data
  • Net Promoter Score (NPS)
  • Choice model
  • Good survey questions

Popular survey questions and examples

The types of questions you ask can prove to be one of the most critical factors determining the success of a survey. From email to SMS surveys, the common denominator that determines effectiveness is the questions. Different question and answer types can elicit different responses, even to similar questions.

This guide covers the types of survey questions available and looks at what makes good survey questions. We'll also explore examples and give you access to sample survey questions as a template for writing your own.

Using different question and answer types effectively leads to more engaging surveys. Incorporating the different types gives you more complete and accurate results.

Survey Questions

1. Dichotomous questions

A dichotomous question is generally a “yes/no” question. It’s often used as a screening question to filter out respondents who don’t fit the needs of the research. Dichotomous question example:

Survey Questions - Dichotomous

2. Multiple choice

Multiple-choice survey questions consist of three or more exhaustive, mutually exclusive categories and ask for either single or multiple answers. In the following survey question example, the user selects only one of the seven options provided. You could configure this question to allow users to select multiple answers, such as an “all of the above” response. Multiple choice survey question example:

Survey Questions - Multiple Choice

3. Rank order scaling

Rank order scaling questions allow respondents to rank brands or products. You list options and ask users to rank them on specific attributes or characteristics. Consider a fitness tracker company that wants to know which features its users like the most. List the features and ask your respondents to rank the options based on how much they like them. Rank order scaling survey example:

Survey Questions - Rank Order

4. Rating scale

Survey Questions - Rating Scale

5. Semantic differential scale

Survey Questions - Semantic Differential Scale

6. Stapel scale

The stapel scale question asks a person to rate a brand, product, or service according to a specific characteristic on a scale from -5 to +5. The rating range indicates how well the attribute describes the product or service. Stapel scale survey example:

Survey Questions - Staple Scale

7. Constant sum survey questions

A constant sum survey question permits the collection of “ratio” data, meaning the data can express the relative value or importance of the options (for example, whether option A is twice as important as option B). Constant sum survey example:

Survey Questions - Constant Sum

8. Open-ended

Survey Questions - Open Ended


9. Demographic survey questions

Survey Questions - Demographic

10. Matrix table

Matrix table questions use a tabular format. The questions reside on the left of the table with answer options across the top; they are two-dimensional variants of multiple-choice questions. Multipoint scales allow respondents to select only one option per parameter, while multi-select scales let them choose multiple options. The spreadsheet structure converts text and options into organized tables that are easy for respondents to complete. Matrix survey example:

Survey Questions - Matrix Table

11. Side-by-side matrix

Need to know multiple aspects of a single parameter? Use a side-by-side matrix for a visually appealing design. It gives you the option to define various rating options simultaneously. Consider that you need to know how important and satisfied a customer is with customer service. A side-by-side matrix allows you to ask about both at once. This layout also makes it easy to identify the problem areas to make changes and improve your business. Side by side matrix survey example:

Survey Questions - Side-By-Side Matrix

12. Static content

Static questions add value to your questionnaire by displaying additional information. Presentation text questions, a static type, usually separate different sections of a survey. You can also add headings and subheadings to the various parts of the study to make it aesthetically pleasing. Static text question example:

Survey Questions - Static Content

13. Miscellaneous

This category of survey questions captures a variety of data types. Depending on the purpose of the survey, you might want to collect a captcha code, date of birth, or point on a map. Miscellaneous survey question example:

Survey Questions - Miscellaneous Survey Question

14. Visual analog scale

The visual analog scale allows you to increase the visual appeal of questions. For example, you might ask participants to rate the services they receive. Text sliders and numeric sliders provide a convenient and engaging way to answer. Other options include social media sharing, star-rating questions, thumbs up or down, and smiley ratings. Smiley ratings, in particular, are pleasant to the eyes and deliver a positive impact. Smiley survey example:

Survey Questions - Graphical Rating Type Survey Question

15. Image chooser

The use of images improves user experience. Consider an article with a lot of text. Would you prefer to read a page with only text or one with attractive graphics? Most people will choose the one with images. Image chooser question example:

Survey Questions - Image Chooser Type Survey Question

Put this theory into practice to increase user responses. Image questions allow the respondents an opportunity to select images from a list. Take the image chooser question type to the next level with an image matrix.

16. Data reference

Data reference questions gather and validate data against standard databases. A zip code, for example, is a type of data reference. Another option is dynamic lookup tables, which you can use to present data according to rankings. Data reference survey question example:

Survey Questions - Data Reference Survey Question

17. Upload data

This type allows users to upload documents, images, and more to their survey responses. Upload data question example:

Survey Questions - Upload Data Survey Question

18. Net Promoter Score (NPS)

A Net Promoter Score (NPS) survey question measures brand shareability and customer satisfaction. It asks respondents to rate how likely they are to recommend your company to their network on a scale of 0 to 10, and it categorizes the respondents into Promoters (9-10), Passives (7-8), and Detractors (0-6).

Your NPS helps you understand whether customers are promoting or detracting from your brand, and why. Patterns in the responses of Promoters and Detractors provide insights into the strengths and weaknesses of your business. NPS question example:

Survey Questions - Net Promoter Score Survey Question

19. Choice model

Choice model survey questions include Conjoint Analysis and Maximum Difference Scaling.

Conjoint Analysis is one of the most accepted quantitative methods in market research. Use it to determine client preferences. For example, discover which product features customers prefer or how price changes influence sales.

Maximum Difference Scaling is an effective way to establish a relative ranking for up to 30 elements. They might include:

  • Features or benefits of a service
  • Areas for potential investment of resources
  • Interests and activities
  • Potential marketing messages for a new product
  • Products or Services in use

Conjoint choice model survey example:

Survey Questions - Choice Model Survey Question

What are good survey questions?

Which question type should you use to get the best response rate? Does the language of the questions make a significant impact? How do you find good survey questions examples?

It turns out that it takes a little of both: the right question type and the right wording. Here's how to use each question type to write great questions, with examples.


How to ask good questions for your survey

Keep things fair: Don't boast too much about your products or services. Limit your use of adjectives to avoid distancing your customers. You want your company to appear open to constructive criticism. Dodge questions like "How do you feel about the warm welcome our staff gave you on arrival?" Respondents would prefer, "How did you like your welcome at our hotel?"

Simple survey questions = better responses: Come up with items that are easy to understand and answer. Expecting respondents to answer essay-like questions repeatedly causes burnout and lowers the response rate. Instead, focus on easy-to-answer questions that don't take too long.

Don't ask just because you can: You may feel the need to get as much information as you can from a single survey. However, this temptation causes your study to veer off track. Many users see overly nosy surveys as suspicious and irritating.

Skip what-ifs: Avoid cooking up situations your respondents may never face. You'll lower the response rate and receive fewer authentic answers. What-if scenarios relevant to your audience, however, could increase the effectiveness of your questionnaire.

Ask "how": A single select question like "Did you like our gym?" will get you either "yes" or "no." Skip the yes/no questions and focus on asking how your business did instead. For example, you could ask, "How did you find the services at our gym?" Answers could include "extremely professional," "moderately professional," and "not at all professional." This question captures detailed data and can lead to more actionable insights.

Don't ask more than one question at once: The last thing you want is to confuse respondents. Asking two or more correlated items in one question will baffle your customers. Interlinking multiple topics also promotes the idea that neither is significant. Divide complicated topics into multiple questions for the most effective and reliable answers.

Additional sensitivities to keep in mind when creating good survey questions:

If you have to ask sensitive questions, such as religion or political affiliation, place them next to the questions contextually related to them. This will make it easier for the respondents to understand why you're asking.

Make the first questions simple, pleasant, and exciting.

End every question with a question mark.

Ensure all questions are grammatically correct and error-free.

Avoid jargon and use terms and concepts that are easy for all respondents to understand.

Remember that simplicity and a direct approach inspire respondents to complete a survey.


Survey Question: 250+ Examples, Types & Best Practices


Ever wondered why some surveys get overwhelming responses with high participation and detailed answers, while others barely receive any attention?

Well, the secret to a successful survey lies in crafting the right questions.

A successful survey balances straightforward, closed-ended questions with more expansive, open-ended ones. The former allows for quick and easy responses, while the latter invites detailed feedback, offering deeper insights.

The overall length of the survey and the timing of its distribution also play significant roles in effectively engaging your audience.

If you aim to design surveys that receive high response rates, this blog is tailor-made for you.


Types of Survey Questions

Common survey questions can be broadly divided into open-ended and closed-ended questions. Open-ended questions help you collect qualitative data, while closed-ended questions let you collect quantitative feedback.

Let’s explore these in detail:

1. Open-Ended Questions

These types of questions collect detailed information from your target audience in the form of text answers. Open-ended questions are most utilized in cases where your customers have a concern beyond what’s available in the predefined answer options.

By analyzing their word choice, language, and tone of answers, you can understand the emotions that customers go through while using your products or services. In crucial areas like customer support, you need more than just a yes/no answer from your respondents, and open-ended questions add that depth to the feedback you collect.

Example: “How would you describe your experience of using our products?”

2. Closed-Ended Questions

Closed-ended questions require the respondent to choose from a given set of responses, limiting answers to those options. This makes the data easy to quantify, allowing for straightforward analysis and comparison. Closed-ended questions are efficient for surveys with a large number of respondents, as they ensure consistency in responses and facilitate automated data processing.

Example : “Do you own a smartphone? (Yes/No)”

3. Rating Scale

Rating scale questions ask respondents to evaluate a statement or question based on a given scale, such as 1 to 5 or 1 to 10, where each point on the scale represents a different level of intensity or frequency. This format is useful in customer satisfaction surveys and for measuring attitudes, opinions, or behaviors, providing a quantifiable measure of subjective phenomena.

It allows researchers to assess degrees of agreement or satisfaction, making it easier to identify trends and patterns.

Example : “How likely are you to recommend our service to a friend or colleague on a scale of 0 (not likely) to 10 (extremely likely)?”


4. Demographic Survey Questions

Demographic survey questions seek to collect specific data about the respondent’s background, including age, gender, income, education, employment status, and more. This information is crucial for segmenting the survey population and analyzing responses based on demographic factors.

It helps in understanding how different groups perceive and interact with a product, service, or topic, enabling targeted insights and decision-making.

Example: “What is your highest level of education completed?”

  • Some high school
  • High school graduate
  • Some college
  • Bachelor’s degree
  • Graduate degree
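To illustrate why these fields matter at analysis time (a hypothetical example, not tied to any particular survey tool), segmenting a satisfaction score by an education field like the one above is usually a simple group-by:

```python
# Hypothetical example: average satisfaction (1-5) broken out by education level.
import pandas as pd

responses = pd.DataFrame({
    "education": ["High school graduate", "Bachelor's degree", "Bachelor's degree",
                  "Graduate degree", "High school graduate"],
    "satisfaction": [4, 5, 3, 5, 2],
})

# Segment the survey population by a demographic field and compare groups.
print(responses.groupby("education")["satisfaction"].agg(["mean", "count"]))
```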

5. Multiple Choice Questions

Multiple choice questions offer respondents a list of possible answers, from which they select the one that most closely aligns with their opinion or experience. This type is versatile and can be used to gather data on preferences, behaviors, or factual information.

Multiple-choice questions simplify the analysis process by standardizing responses, but they require careful consideration to ensure all potential answers are represented.

Example: “Which of the following categories best describes your employment status?”

  • Employed full-time
  • Employed part-time
  • Self-employed
  • Unemployed
  • Student

6. Drop Down

Drop-down questions are a variant of multiple-choice questions that save space and help keep surveys looking clean and uncluttered. They are particularly useful when presenting a long list of options, such as countries or states. This format can improve the respondents’ experience by making it easier for them to navigate the survey.

Example : “Select your industry from the dropdown list: [List of industries]”

7. Grid of Choice

Also known as matrix questions, a grid of choice questions allows respondents to evaluate multiple items using the same set of response options presented in a grid format. This is efficient for collecting data on a series of statements, making it easy to compare responses across different items. However, make sure to use them sparingly to avoid overwhelming respondents.

Example: “Please rate the following aspects of our product on a scale from 1 (Very Unsatisfied) to 5 (Very Satisfied): Quality, Price, Customer Service, Durability.”
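Because every row of a grid question shares the same response scale, summarizing the results usually comes down to averaging each column. A minimal sketch with made-up 1-5 ratings for the aspects in the example above:

```python
# Average each aspect of a grid-of-choice (matrix) question; ratings are made up.
from statistics import mean

grid_responses = [
    {"Quality": 5, "Price": 3, "Customer Service": 4, "Durability": 5},
    {"Quality": 4, "Price": 2, "Customer Service": 5, "Durability": 4},
    {"Quality": 5, "Price": 3, "Customer Service": 4, "Durability": 5},
]

for aspect in grid_responses[0]:
    scores = [r[aspect] for r in grid_responses]
    print(f"{aspect}: {mean(scores):.2f}")
```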


8. NPS Scale

The Net Promoter Score (NPS) is a specialized rating scale that measures customer loyalty and the likelihood of referrals. It is a powerful tool for gauging customer satisfaction and predicting business growth. The simplicity of the NPS question format facilitates quick and easy assessment of customer sentiment towards a company, product, or service.

Example : “On a scale of 0-10, how likely are you to recommend our company to friends or colleagues?”

 9. Upload Questions

Upload questions enable respondents to provide additional context to their answers through the upload of files or images. This can be particularly useful for gathering evidence in customer service inquiries, collecting creative submissions, or obtaining documentation. It adds a layer of depth to the data collected, allowing for more nuanced analysis.

Example : “Please upload a copy of your receipt or proof of purchase.”

 10. Nominal Questions

Nominal questions categorize data into non-ordinal categories, meaning there is no inherent order to the options. They are used to label variables into distinct, separate groups without implying any hierarchy or quantity. This type is key for classifying respondents and can be pivotal in analyzing behavioral patterns across different segments.

Example : “Which of the following best describes your current role? (Manager, Technician, Salesperson, Administrative, Other)”

 11. Likert Scale Questions

Likert scale questions are designed to capture the intensity of a respondent’s feeling towards a statement, typically ranging from strong agreement to strong disagreement. They are widely used in surveys to measure attitudes, opinions, and people’s perceptions.

Usually, the Likert scale uses levels such as 1-7, 1-5, or 1-3, where the lower end (‘1’) indicates low or negative sentiment and the higher end (‘7’, ‘5’, or ‘3’) indicates strong or positive sentiment. The midpoint of the scale indicates a neutral view.

Likert scales are valuable for understanding nuances in responses and are effective for measuring changes in perceptions over time.

Example : “I believe that the customer service I received was satisfactory. (Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree)”
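Because Likert responses are ordered labels, a common first step in analysis is to map them onto numbers (a coding convention, not the only valid one) and then summarize. A minimal sketch with illustrative responses:

```python
# Map Likert labels to a 1-5 coding and summarize (responses are illustrative).
LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3, "Agree": 4, "Strongly Agree": 5}

responses = ["Agree", "Strongly Agree", "Neutral", "Agree", "Disagree"]
scores = [LIKERT[r] for r in responses]

print("Mean score:", sum(scores) / len(scores))                                # central tendency
print("% Agree or higher:", 100 * sum(s >= 4 for s in scores) / len(scores))  # top-two-box share
```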

12. ‘Yes’ or ‘No’ Questions

‘Yes’ or ‘No’ questions are the simplest form of survey questions, offering respondents a binary choice. While they may not provide deep insights, they are effective for qualifying respondents or obtaining clear, decisive answers to specific questions. They can serve as gateways to more detailed questions based on the response given.

Example : “Have you used our online customer service portal? (Yes/No)”

 13. Data Reference

Data reference questions ask respondents to consider specific data, experiences, or content before answering. This type encourages respondents to reflect on particular aspects of their interaction with a product or service, providing more targeted insights.

Example : “Based on the last product update, how would you rate the improvement in user experience? (Significantly improved, Somewhat improved, No change, Somewhat worsened, Significantly worsened)”

14. Miscellaneous Questions

Miscellaneous questions encompass any question type that doesn’t fit into the other categories but is necessary for achieving the survey’s objectives. They can be creatively used to gather unique insights or to add an engaging element to the survey.

Example : “If you could suggest one new feature for our app, what would it be?”

250+ Survey Questions That You Can Use

Now that we know about the different types of survey questions, it is time to delve into survey question examples categorized based on different business strategies.

1. Customer Survey Questions

Customer surveys are used to gather insights from the customer about the product, service, and the whole sales process in general. Customer survey questions are tricky as it is often difficult to identify what exactly you want from the customers. In this section, we have covered sample survey questions for customer satisfaction and customer feedback. You can use these survey questions as examples or frame your questions in a similar way.

1.1 Customer Satisfaction Survey Questions Examples

Customer satisfaction survey questions are centered around how satisfied the customer is with the product and the company. You can frame the standard customer satisfaction survey questions for your feedback forms using these sample customer satisfaction survey questions.

  • How satisfied were you with the way our operator handled your problem?
  • Was your issue resolved promptly?
  • Were you able to locate products/services/information without assistance?
  • Are you happy with your shipping options?
  • Did the product arrive on time?
  • Did the product meet your expectations?
  • Does our product offer value for money? Please rate.
  • What made you choose us over the competition?
  • Would you recommend our products or services? Why or why not?
  • If you could change one thing about our product or service, what would it be?

1.2 Customer Feedback Survey Questions Examples

Customer feedback surveys are targeted towards the most recent transaction a customer has with the brand. Take a look at how your customer service survey questions should ideally be framed.

  • Did we meet your expectations?
  • How would you rate your interaction with our employees?
  • Did you have any difficulties finding the product you were looking for?
  • Did you find the sales process too long or tiring?
  • What can we do to make future transactions easier?
  • Did you feel comfortable shopping with us?
  • What products or services do you wish we carried?
  • How could we have exceeded your expectations?
  • Is there anything else you would like to tell us regarding your experience?
  • Have you used or visited our website?
  • Was the website easy to use?
  • What would you change about our website?
  • What were you using before you found us?
  • What did you like about the previous product/service(s)?
  • What caused you to leave?
  • What does our business do better?
  • Is there anything you miss about the previous product/service(s)?
  • What problem were you trying to solve when you initially came across our product or service?

2. Human Resource Survey Questions

Human resource survey question examples primarily deal with questions related to employee engagement, satisfaction, and training & development. Use these sample employee survey questions as a base for your employee survey questions .

2.1 Employee Engagement Survey Questions Examples

Employee engagement survey questions are aimed at identifying the level of involvement an employee has within the organization. You can frame your employee engagement survey around the following questions.

  • What would you like to change about your job/workplace?
  • What’s your favorite thing about your job/workplace?
  • What is the company’s greatest weakness?
  • What is the company’s greatest strength?
  • Do you see your career developing in this company?
  • How challenging or exciting do you find your work to be?
  • Do you feel valued by your manager?
  • Do you trust other members of your team?
  • Do you have all the resources you need to do your job well?
  • Do you feel that your contribution is recognized?
  • Would you recommend working here to your friends?
  • Do you have a clear understanding of your career or promotion path?
  • How would you rate your work-life balance?
  • Hypothetically, if you were to quit tomorrow, what would your reason be?
  • If you were given the chance, would you reapply to your current job?

2.2 Employee Evaluation Survey Questions Examples

Employee evaluation surveys are aimed at identifying an employee’s strengths and weaknesses. Given below are examples of survey questions for your employee evaluation survey.

  • What accomplishments on this job are you most proud of?
  • Which goals did you meet? Which goals fell short?
  • What motivates you to get your job done?
  • What can we do to make your job more enjoyable?
  • In what conditions are you most productive?
  • What personal strengths do you have that help you do your job effectively?
  • What makes you the best fit for your position?
  • What skills do you have that you think could be used more effectively within the company?
  • What kind of work comes easiest to you?
  • What 2-3 things will you focus on in the future to help you grow and develop?
  • How can the company help you better meet your goals?
  • Which job responsibilities/tasks do you enjoy most?
  • Which roles do you least enjoy?
  • How do you think your role helps the company succeed?
  • What do you like least about your current role? What would you change?
  • What do you like most about working for this company?
  • What are your most important goals for the next quarter?
  • What do you want your next position at this company to be? How would your responsibilities change?
  • What professional growth opportunities would you like to explore in order to get there?
  • What type of career growth is most important to you?
  • How do you prefer to receive feedback and/or recognition for your work?
  • What can we do to improve our manager-employee relationship?

2.3 Job Satisfaction Survey Questions Examples

Job satisfaction surveys, also known as employee satisfaction surveys , are used to gauge the morale of employees. Employee satisfaction survey questions need to be framed around how the employee feels regarding the job and workplace environment.

  • Is there a clear understanding of the goals set by the organization?
  • Are you clear on what your role demands in meeting the company objectives?
  • Do you believe there is an opportunity for individual growth and development within the company?
  • Do you see yourself working for the same organization in the next 2 years?
  • Are you satisfied with your job overall?
  • How would you rate the team culture?
  • Does your team provide you support at work whenever needed?
  • If something unusual comes up, do you know who to go to for a solution?
  • Do you have all the resources and tools you need to perform your duties well?
  • Do your seniors/managers encourage you to do your best?
  • Do you feel you are rewarded for your dedication and commitment towards the work?
  • Do you feel that your superiors value your opinions?
  • Do you feel there is a scope for personal growth such as skill enhancement?
  • Does the management involve you while taking leadership related decisions?
  • Do you think you are valued by your manager?
  • Do you think you go beyond your limits to fulfill a task?
  • Do you think you have had enough training to solve customer issues?
  • Do you think your personal time is respected by the management?
  • Do you think the environment here helps you maintain a work-life balance?
  • Does your job cause unreasonable stress to you?
  • Do you think the organization has fair policies for promotion for all employees?
  • Do you find your job meaningful?

2.4 Survey Question for Training Feedback

Training survey questions are focused on employee training and development activities. The ideal questions to use in training surveys are shown below.

2.5 Pre-Training Survey Questions

  • What do you expect from this training session?
  • Do you think this training session will help you?
  • Do you have any expectations regarding the outcome of this training program?
  • Are you nervous about taking up this training program?
  • Are you looking forward to this training?
  • Were you informed beforehand about the specifics of the training session?
  • Did you get an itinerary for the session?
  • Do you know what the training session will cover?
  • Are the timings of the training session convenient to you?
  • How likely are you to recommend this session to your friends/family?

2.6 Post-Training Survey Questions

  • Did the training session meet your expectations?
  • Did you enjoy the training session?
  • Did you learn something new?
  • Was the training session implemented as you had hoped?
  • How can we improve the training session further?
  • Were the objectives of the training well defined?
  • Was participation and interaction encouraged?
  • Were the topics covered relevant to you?
  • Was the training material well organized and easy to follow?
  • How useful was the training material or session to you?

3. Market Research Survey Questions

Market research is important when companies venture into new markets or are about to launch a new product or service. Market research survey questions include questions on demographics, previous buying behavior, and future expectations.

3.1 Demographic Research Questions

Demographic research questions aim to find out more about the lives of the intended audience. However, try not to include too many questions about demographics and other sensitive information, as the information revealed may get too personal.

  • What is your age?
  • What is your gender?
  • What is your education level?
  • Where do you live?
  • What is your profession?
  • What is your household size?
  • What are your biggest challenges?
  • What are your primary goals?
  • What is most important to you?
  • Where do you go for information?
  • How do you like to make purchases?
  • How likely are you to recommend our brand to a friend?
  • How long have you been a customer?
  • What problems do we solve for you?
  • How does our product/service fit into your daily workflow?
  • How well does our product/service meet your needs?
  • What do you wish the product/service had that it currently does not?
  • What do you like most/least about our product/service?
  • What made you choose us over our competitors?
  • How would you rate your last experience with us?
  • How did you find us?

3.2 Questions for Competitor Analysis

These questions focus on collecting data about competitors’ products, services, customer satisfaction levels, marketing strategies, and market positioning. The aim is to identify strengths and weaknesses in competitors’ offerings and strategies, as well as to uncover opportunities and threats in the market.

  • How is our brand doing compared to our competitors?
  • How do our competitors effectively attract customers?
  • How much website traffic do your competitors receive?
  • Which keywords are driving traffic to your competitors?
  • Which sources are driving traffic to your competitors?
  • How many inbound links do your competitors have?
  • What type of content is performing well for your competitors?

4. Education Survey Questions

Education surveys are equally important for teachers and students. Collecting feedback from students and teachers can help improve the school curriculum and environment. Check out the sample questions for student and education surveys below.

4.1 Survey Question Examples for Students

Survey questions for students are centered around the students’ workload and its effect on the mind of a student.

  • How often does schoolwork keep you from getting enough sleep?
  • How often do you worry about school assignments?
  • Do you worry about getting into the college of your choice?
  • How many hours a day do you spend on homework?
  • Does the assigned homework help you learn the material?
  • How many hours a week do you spend participating in extracurricular activities?
  • What is the main reason you participate in extracurricular activities?
  • How much sleep do you typically get on a weeknight? On weekends?
  • Where do you typically keep your phone when you go to sleep at night?
  • To what extent are you confident in your ability to cope with stress?
  • In the past month, how often have you experienced exhaustion, difficulty sleeping, or headaches?
  • How often do you try as hard as you can in school?
  • How often do you enjoy your schoolwork?
  • How often do you find your schoolwork valuable?
  • Do your teachers value and listen to students’ ideas?
  • Do your teachers treat students with respect?
  • Do you feel like you belong at this school?
  • Do you feel that other students at this school accept you for who you are?
  • How important is it to your parents/guardians that your schoolwork challenges you to think?
  • How much do your parents/guardians worry about you getting a bad grade in school?

4.2 Teacher Survey Questions Examples

Survey questions for teachers should always center around the curriculum and the relationship of the teacher with the administration. The best survey questions for teachers are shown below.

  • Does the school administration care about you as an individual?
  • How fairly does the school leadership treat the staff?
  • At your school, how valuable are the available professional development opportunities?
  • Do you get inputs for your professional development opportunities?
  • How often do your school’s facilities need repairs?
  • How important is it for your school to hire more specialists to help students?
  • How respectful are the relationships between staff and students?
  • To what extent are you trusted to work in the way the administration thinks is the best?
  • At your school, how objectively is your performance assessed?
  • How effective is your school’s evaluation system at helping you improve?
  • Does the school staff collectively brainstorm to provide better learning experiences?
  • Does the school staff put in extra effort for students with disabilities?
  • Were you consulted before finalizing the curriculum for the students?

5. Event Survey Questions

Conducting an event survey is crucial for knowing about your audience’s expectations. Also, a post-event survey lets you know if you have addressed their challenges effectively.

Event survey questions can be broadly classified into:

  • Pre-event survey questions
  • Mid-event survey questions
  • Post-event survey questions

5.1 Pre-event Survey Questions Examples

Pre-event survey questions are set with the future arrangements you intend to make for your event in mind.

  • How did you hear about the event?
  • When are you arriving?
  • Which social platform do you prefer?
  • How do you feel about the location of the event?
  • What are you hoping to get out of the event?
  • Have you attended this event before?
  • What speakers are you looking forward to most?
  • Is there any event information that you couldn’t find or access easily?
  • Do you have any special needs or disabilities that we can help accommodate?
  • Do you have any food allergies or specific dietary needs?

5.2 Post-event Survey Questions Examples

Frame your post-event survey questions around the insights you require from your attendees regarding the organization, facilities, and recommendations for future events.

  • How satisfied were you with the event?
  • What did you like most about the event?
  • What did you like least about the event?
  • How likely are you to attend one of our events in the future?
  • How likely are you to recommend our events to a friend?
  • In your opinion, did the event meet the objectives it set in the beginning?
  • Which topics/activities would you like to see covered at future conferences/events?
  • Was the event staff friendly?
  • Did you get all the information regarding the event beforehand?
  • Was the event too long, too short, or just rightly timed?

6. Healthcare Survey Questions

Healthcare surveys mainly include questions about patient satisfaction, hospital employee surveys, and other surveys regarding confidential patient information. We’ve covered the ideal questions to ask in a patient satisfaction survey below.

  • How can we improve our services?
  • Was it easy to make an appointment?
  • Were you pleased with the check-in process?
  • Were you pleased with the check-out process?
  • Were you attended to in a timely manner?
  • How do you rate the staff that worked to care for you?
  • How much concern did the care provider show for your questions or worries?
  • During your most recent visit, did this care provider listen carefully to you?
  • Rate the friendliness/courtesy of the care provider.
  • Would you recommend our hospital to a family member or friend?
  • Are you currently covered under a health insurance plan?
  • How often did you receive conflicting information from different medical care professionals at this hospital?
  • How satisfied were you with the medical and other facilities available to you?
  • Did the hospital staff follow adequate medical safety procedures during the treatment process?

7. Brand Awareness Survey Questions

Understanding how “aware” your audience is regarding your brand is a crucial step in learning whether you were able to reach out to them successfully. That’s exactly what brand awareness survey questions aim to achieve. Most commonly seen on social media, a few brand awareness survey questions include:

  • Did you see an ad for this product/service recently?
  • Have you ever used any product from <product brand> before?
  • Do you remember seeing an ad for <brand name> in the past week?
  • Which word would you use to define this brand?
  • Would you recommend this brand to your friends or family?
  • How likely are you to recommend this brand to your friends or family?
  • Have you always been satisfied with this brand’s products?
  • Will you ever return to this brand again?
  • Were the products available as per your quality standards?
  • Which brand do you remember seeing an ad for this past week?

8. Website Feedback Survey Questions

Creating a good customer experience is crucial, but so is ensuring a smooth user experience. The only way you can achieve this is by asking customers how they like your website and how you can improve upon it further. That’s what a website feedback survey does. A few questions you can ask in such surveys include:

  • Were you able to navigate through our website easily?
  • Did you find all the important information on the website?
  • Would you like to see any changes in the website design?
  • Does the website meet your expectations?
  • Did it take a lot of time for you to find what you were looking for on the website?
  • Is our website visually appealing?
  • Is our website properly optimized for your device?
  • Do you trust the information available on our website?
  • Is the information easily visible on our website?
  • How likely are you to recommend this website to a friend/colleague?

9. Sales Survey Questions

Sales surveys are probably the most common type of survey. Post-sales, customers are asked to complete a quick survey. This way, companies get insight into whether their customers are happy with the purchase process, products, and services. The two main instances in which companies ask for this feedback, and the sales survey questions typically asked in each, include:

9.1 Post-Sales Questions

  • Was the product reasonably priced?
  • Did you find it difficult to find the product you liked?
  • Was it easy to go through with the purchase?
  • Is there any way we can improve the overall buying experience?
  • If we can, please tell us how you’d like us to change.

9.2 Post-Delivery Questions

  • Did you receive the same product you placed an order for?
  • Is the quality as per your standards?
  • Does the product look exactly like the images shown online?
  • Is the product worth the price mentioned on the app?
  • Will you buy a product from us again?

10. Transport Survey Questions

Carpooling, taking a cab, booking a bus or train ticket, and taking a flight to your destination have become really common nowadays. With a rise in demand, numerous companies are offering these services. In this case, companies ask customers to answer survey questions after they reach their destination. A few examples of such questions are:

  • Was your trip comfortable?
  • Were the seats allocated to you available at the time?
  • Were the seats clean?
  • Did you have any difficulty boarding or finding the bus/cab?
  • Was the bus/cab driver polite?
  • Did the flight attendants cater to your needs?
  • Did the flight attendants give you the basic safety information?
  • Was the food served during your journey of good quality?
  • How would you rate your journey on a scale of 1 to 10?
  • Will you travel with us again?

11. Lead Generation Survey Questions

Capturing leads and converting them into customers is crucial to keep your business booming. Not only that, but lead generation also helps you generate brand awareness and build a brand image among your target audience. A few lead generation questions you can add to your survey include:

  • Are you looking to buy something online in the next quarter?
  • Have you heard of <this brand>?
  • Are you solely responsible for your company’s budget?
  • Are you planning to save money?
  • How often do you order food online?
  • Have you ever bought clothes online?
  • When was the last time you bought something?
  • What’s the most common thing you bought online?
  • How do you decide on a brand when finalizing a product?
  • What would matter more to you: price or quality?

12. Non-Profit Survey Questions

Non-profit organizations cover a large network of donors, volunteers, and other staff. Surveys for non-profit organizations can cover fundraising events, volunteer feedback, and so on. Good examples of survey questions for nonprofits are shown below.

12.1 Volunteer Feedback Survey Questions Examples

  • How much of an impact do you feel your work here has had?
  • How convenient were the volunteer training sessions?
  • Did you find the volunteer training sessions to be helpful?
  • How easy was it to get along with other volunteers?
  • Did you find the staff easy to get along with?
  • Do you feel appreciated by your supervisor?
  • How satisfied or dissatisfied are you with your association with our organization?
  • How likely are you to continue volunteering for our organization?
  • How many hours a month do you spend volunteering?
  • How likely are you to recommend this organization to a friend or colleague?

Survey Questions: Do’s and Don’ts

The best survey questions are short and precise, avoid jargon, and prompt the respondent to take up a survey with ease.

Let’s look at some tips and tricks for creating survey questions:

  • Frame the questions from the perspective of the customers by keeping their needs first. You must use empathetic language to address their pain points.
  • Use simple and jargon-free language in your survey. Otherwise, your respondents may get confused and drop out of the survey.
  • Avoid leading questions, which sound biased and nudge respondents toward a particular answer.
  • Start your survey with simple close-ended dichotomous questions. This will encourage your respondents to continue attempting your survey questions.
  • Use a balanced mix of both open-ended and close-ended questions.
  • Preferably use the first-person perspective while framing questions. The survey takers feel more connected with such personalized questions.

Use Surveys the Right Way to Enhance Your Brand Image

Survey questions play a crucial role in gathering valuable data and insights that can inform decision-making and strategy development across various fields.

Ultimately, you have to thoroughly understand the different types of survey questions and how to employ them effectively to design surveys that not only engage respondents but also yield meaningful and actionable data.

Whether you’re conducting market research, evaluating customer satisfaction, or exploring new trends, the knowledge of survey question types and their applications will increase your ability to gather reliable information and make well-informed decisions that drive success.

Make sure your surveys have versatile question types like multiple-choice questions (MCQs), rating scale and ranking scale questions, open-ended questions, etc., so that the data you collect is well-rounded, nuanced, and in-depth.

Also, use simple and jargon-free language so that your survey audience can complete your survey without confusion.

For online surveys , you can use popular survey software like ProProfs Survey Maker to get started. Use the available customizable templates and share them on multiple platforms like email, social media, website embedding, and more.

Learn More About Survey Questions

How to write survey questions.

To write survey questions, ensure they are clear, concise, and free from bias. Begin with a clear objective, use straightforward language, and make sure each question is relevant to the participants.

What is a bad survey?

A bad survey is characterized by leading questions, unnecessary complexity, lack of focus, or relevance to its goals, which results in poor-quality data.

How do you ask someone to answer your survey?

You can ask someone to answer your survey by politely requesting their participation, explaining the significance of the survey, and emphasizing how their feedback will contribute to meaningful insights or improvements.

How do you convince a customer to fill out a survey?

To convince a customer to fill out a survey, offer incentives, highlight the survey’s quick completion time, emphasize the impact of their feedback on improving their future experiences, and ensure the survey is easy to access.

How to analyze all of the questions and answers?

To analyze all of the questions and answers, systematically organize the collected data, utilize statistical tools or software for analyzing quantitative responses, and apply thematic analysis for qualitative responses to identify patterns, trends, and key themes. You can also use a survey software like ProProfs Survey Maker to make the process automated and easy.
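As a rough, tool-agnostic sketch of that workflow (the data is made up): tabulate closed-ended answers directly, and use a simple term tally on open-ended answers as a starting point for thematic coding.

```python
# Minimal sketch: summarize closed-ended answers and tally terms in open-ended ones.
from collections import Counter

closed = ["Yes", "No", "Yes", "Yes"]                  # e.g. "Would you recommend us?"
open_ended = ["Great support team", "Support was slow", "Love the support"]

print(Counter(closed))                                # quantitative: response counts

words = Counter(w.lower().strip(".,") for text in open_ended for w in text.split())
print(words.most_common(3))                           # qualitative: recurring terms to review
```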

Emma David

About the author

Emma David is a seasoned market research professional with 8+ years of experience. Having kick-started her journey in research, she has developed rich expertise in employee engagement, survey creation and administration, and data management. Emma believes in the power of data to shape business performance positively. She continues to help brands and businesses make strategic decisions and improve their market standing through her understanding of research methodologies.


50 Training Survey Questions for Post-training Feedback


Your team has just finished their assigned training. You see a completion rate of 100%. And you see they all passed their quizzes and assessments with flying colors.

That’s all good. But is it sufficient?

Metrics such as completion and pass rates are good indicators of the effectiveness of a training. However, relying solely on these quantitative metrics doesn’t give you a well-rounded view of training outcomes.

A lot of work is put into creating, managing, and delivering training to learners with the aim of equipping them with the skills and knowledge they need to be successful. Part of the process is being able to evaluate the effectiveness of training programs, and that can only be achieved when you consider quantitative as well as qualitative metrics.

But to see all the rewards, you first need to know if it actually works, and if not, you’ll need to know what to optimize. Otherwise, you might waste time, money, and effort.

Post-training survey questions are a great way to prove the effectiveness of a training program. The right post-training survey questions can tell what is working and how your employees are moving toward the learning goals set for them. They can also show you areas in which you can improve your course content, delivery, and overall training experience.

What are training survey questions?

Training survey questions are a tool for collecting specific learner feedback for a training program they’ve participated in. It’s a systematic approach to review the quality, relevance, and impact of the training on participants.

There are two key reasons for using training survey questions:

  • To help understand if the training is interesting and practical.
  • To gather relevant data points to continuously improve the training program.

Now you might be wondering… How long should the survey be?

Of course, that depends on how in-depth the training is. For shorter trainings, a quick pulse check is more acceptable than a lengthy survey, while for longer, ongoing training programs, you may need to be a bit more thorough with your survey.

And in terms of timing, post-training surveys are best conducted as close to the completion date as possible. That way, the training experience is still fresh in learners’ minds, and can therefore provide more accurate information.

50 post-training survey questions you should ask

These 50 post-training survey questions will give you valuable data points and relevant insights to take your current training program from good to great.

You can also download your very own template here and adjust it to your own needs.

Post-training evaluation questions related to pre-training material

  • How helpful was the pre-training material you had studied in readiness for the training program?
  • Has the pre-training assessment helped you determine your areas of improvement?
  • How interactive and engaging were the pre-training materials?
  • What else could we have done to make our pre-training more effective?

Post-training survey questions about course content, structure, and modules

  • How informative and useful would you rate the course materials?
  • Did the course content cover all the relevant topics and concepts you expected?
  • How well were the explanations and examples given during the training?
  • Were there any topics or areas that you felt were lacking and required more explanation?
  • Was there a logical flow of new knowledge in the training modules?
  • How appropriate were the exercises and activities created to support the concepts?
  • Was the material you were trained upon interesting?
  • Were you able to maintain your attention for the entire duration?

Post-training survey questions for the effectiveness of the instructor

  • How well did the instructors explain the subject?
  • Was the content explicit and well articulated by the instructor?
  • To what extent could the instructor involve you and keep you interested throughout?
  • Did the instructor use good examples and scenarios when presenting the concepts?
  • How easy did the instructor make it for you to follow the pace and flow of the training?
  • How responsive was the instructor to questions and concerns?
  • Did the instructor offer the team helpful feedback and suggestions for improvement during activities or exercises?

Training feedback questions on learning experience

  • How would you rate your learning experience during this training?
  • Were the interactive features engaging and helpful in driving home the concepts?
  • Did the overall training environment help enhance your learning experience positively?
  • Were there any distractions or disturbances that took away from your ability to concentrate and learn?

Post-training assessment questions on practical application

  • How confident are you in applying the skills/knowledge acquired in this training?
  • What specific tasks or scenarios from your job do you predict using your new skills for?
  • Which parts of the training do you feel will be most helpful in enhancing your job performance?
  • What barriers or difficulties do you foresee in implementing the training concepts?
  • How will you ensure the skills you gained continue to be applied and reinforced over time?
  • What additional resources, tools, or support would help you better apply what was taught?
  • Can you give an example of how you have already used something from the training in your work?

Post-training evaluation questions on satisfaction and improvement

  • How satisfied were you with the overall training program?
  • What aspects of the training were particularly valuable or engaging?
  • What aspect of the training could be improved?
  • Were there any topics or areas that you felt needed more coverage or depth?
  • Did the training meet your expectations? If not, how did it fall short?
  • How would you rate the pace of the training – too fast, too slow, or just right?

Training feedback questions on tech and tools

  • How user-friendly and intuitive were the tools (e.g., learning management system, discussion forums, etc.) used during the training program?
  • Were there any technical issues or glitches that disrupted the training sessions? If so, how were they handled?
  • Did the training program effectively use media?
  • How would you rate the quality and functionality of the tools used in terms of audio, video, screen-sharing, and other features?

Post-training survey questions for employees on logistics and administration (on-site)

  • How convenient was the training schedule for you?
  • Was the training venue easily accessible and comfortable?
  • How would you rate the quality of the training facilities?
  • Did you face any issues with registration or enrollment for the training program?
  • How responsive and helpful was the administrative support staff throughout the training process?

Post-training survey questions for employees on logistics and administration (online)

  • How convenient was the online training schedule for you?
  • How easy was it to access and navigate the online training platform?
  • Did you face any issues with registration or enrollment for the online training program?
  • Did the online training provide sufficient opportunities for networking with peers?
  • How would you rate your experience with technical support during the online training?


Types of training survey questions

There are two main types of post-training survey questions: objective and subjective. Objective questions aim to gather factual, quantifiable data, while subjective questions focus on the trainees’ opinions, attitudes, and perceptions.

Objective questions

Objective questions are typically close-ended, with predetermined answer choices. These questions are ideal for gathering measurable data, such as:

  • Knowledge assessment (multiple-choice questions)
  • Skill level evaluation (rating scales)
  • Completion rates (yes/no questions)

Examples of objective questions:

  • “On a scale of 1 to 5 (1 = Bad, 5 = Excellent), how would you rate your knowledge of the topic?”
  • “What percentage of the training content was new to you?”

Subjective questions

On the other hand, subjective questions let learners share their thoughts and experiences freely. These questions yield powerful qualitative insights, such as:

  • Overall training satisfaction
  • Areas for improvement
  • Suggestions for future training programs

Examples of subjective questions:

  • “What did you find most valuable about the training?”
  • “How can we enhance the training experience for future participants?”

What types of post-training survey response formats can you use?

As you create a survey to gather feedback after training, you have multiple options for how participants can respond, each with its own unique insights.

Some of the most popular formats are:

Numerical rating scales

These simple scales ask respondents to rate something on a numerical scale, typically from 1-5 or 1-10. They are simple and easy for respondents to use, plus they yield quantitative data that are easy to analyze and compare.

Example: On a scale of 1-5, how would you rate the instructor’s knowledge of the subject matter?

Rating scales work well for gauging overall satisfaction, quality, the relevance of the content, and the usefulness of the materials.

Likert scales

Similar to numerical rating scales, Likert scales measure attitudes and opinions with a greater degree of nuance. A common format is: Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree.

For instance, “The training content was relevant to my job duties” followed by the Likert scale options.


Likert scales show how people feel about things like objectives, instruction pace, and practice exercises.

Open text boxes

Open-ended questions gather rich data but can be harder to analyze at scale. They work well for gaining deeper insights, suggestions for improvement, and specific examples or anecdotes related to the training experience.

Example: What did you like most about the training?

While open text boxes can be more challenging to analyze at scale, they offer rich feedback that can inform meaningful improvements.

Multiple choice

Multiple-choice questions come with pre-defined answer options, making them quick for respondents to answer and easy to analyze. These questions are ideal for gathering feedback on specific aspects of the training where there are clear, distinct choices.

Example: Which training delivery method did you prefer? A. Instructor-led B. Online self-paced C. Blended

Multiple-choice questions can also be used to gather demographic information or to identify areas for further exploration.

Dropdown menus

Like multiple-choice but more compact, dropdowns work well when there are many possible answers. They keep surveys concise and help to gather quantitative data.

For example, “Which department are you in?” with a dropdown of all company departments.

  • Human resources

Dropdown menus can be particularly helpful when surveying a diverse audience with varying needs and interests.

4 benefits of employee post-training survey

You can gain a number of benefits from conducting employee post-training surveys.

Evaluate training effectiveness

Post-training survey questions can help assess the efficacy of employee training and development programs. They may help you determine if the content was relevant, engaging, and thorough enough to achieve the designed learning objectives.


Enhance future training programs

If you gather feedback and analyze it, you will have what you need to build a stronger training program.

Using feedback from the training survey questions, you can develop highly customized and relevant training sessions that directly address the needs and knowledge gaps of the workforce.

Post-training survey responses can shed light on employees’ preferences regarding various training methods and tools. Some people do better in workshops that involve hands-on activities, while others prefer learning online at their own pace. Employers can use different training methods that work well with different learning styles to boost engagement and knowledge retention.

Measure training impact

You can relate the feedback to current or new performance metrics. You could tie it to better job performance, more efficiency, happier customers, and financial growth.

Increase employee engagement

Post-training survey questions help employees feel more engaged.

Asking for learners’ feedback makes them feel valued and respected. This increases their commitment to their jobs and the company.

Survey responses can also show you how to make training more engaging. For example, employees might say the delivery is terrible or give negative feedback about the instructor or course design. Correcting these issues in your training program should help your employees engage better with the content.

6 tips to improve survey response rates

Achieving high response rates for post-training survey questions is how you gather rich data to act on later, but it’s not always straightforward. Sometimes employees get caught up in work or personal matters, or they might not understand the importance of answering these surveys.

Nevertheless, you can use these six tips to increase employee survey response rates:

  • Articulate the goal. Employees must know that their input directly impacts the quality and relevance of future training sessions. Give examples so that this point is relatable. Example: Recall an employee training session on a new software tool, during which many employees said the pace was too fast. If most indicate this as an issue on the survey, future sessions may be modified to run at a more manageable pace, and everyone wins.
  • Use convenient timing. Send a link to the survey immediately after training closes, and follow up a few days later with a gentle, friendly email message. Avoid sending the survey during peak workload hours or company-wide activities.
  • Send in a familiar format. Use mobile-friendly surveys that can be answered from virtually anywhere.
  • Reminders and deadlines. Set a reasonable deadline to create a sense of urgency. Send friendly reminders to non-respondents, but avoid being overly persistent. You can also use this email template.
  • Incentives and rewards. Sometimes, it helps to entice people with small incentives or reward mechanisms for completing the survey, like entry into a prize draw, gift cards, or company swag.
  • Follow-up and action. Communicate the survey results and the actions taken. It closes the loop, and it encourages participation in the future.

Tools for a post-training survey

Employee surveys should be part of regular communication. Beyond post-training surveys, you can collect employee feedback about almost any business-related topic you need input about.

Of course, the tools of the trade make this whole process much easier. Modern employee training software helps simplify how employers conduct employee surveys and collect feedback.

  • TalentLMS is a cloud-based learning management system (LMS) with survey and assessment features. With TalentLMS, you can create custom post-training surveys using a range of question types. The platform also comes with real-time reporting and analytics to help you track survey responses and find areas for improvement. You can also create and manage your training courses in TalentLMS.
  • SurveyMonkey is an online survey tool you can use to create post-training surveys with branching logic, skip logic, and custom branding. The only drawback is that you cannot build or run training programs from this platform, so you will likely need a stack of tools for what should be a one-tool solution.
  • Google Forms is a free, web-based survey tool that integrates with other Google applications. While it may not have as many advanced features as some paid tools, Google Forms is a great option for creating simple post-training surveys. It offers a variety of question types, including multiple-choice, checkboxes, and short answers, and you can view responses in real-time.
  • TypeForm offers a unique, conversational approach to survey design. With TypeForm, you can create engaging post-training surveys with a variety of question types, including picture-choice, rating scales, and open-ended questions.

Analyze and use training evaluation feedback

Collecting responses on post-training survey questions for employees is just the start—the real value comes from analyzing and using the feedback.

Based on what you find, you may need to update training content, change delivery methods, add more resources, or offer refresher sessions.

It is a good idea to share the survey results and the changes you made with your team members or stakeholders, so everyone is on the same page about how future training sessions will improve.

Save time, frustration and money with TalentLMS, the most-affordable and user-friendly learning management system on the market. Try it for free for as long as you want and discover why our customers consistently give us 4.5 stars (out of 5!)



How to Create a Survey Results Report (+7 Examples to Steal)


Do you need to write a survey results report?

A great report will increase the impact of your survey results and encourage more readers to engage with the content.


In This Article

1. Use Data Visualization
2. Write the Key Facts First
3. Write a Short Survey Summary
4. Explain the Motivation for Your Survey
5. Put Survey Statistics in Context
6. Tell the Reader What the Outcome Should Be
7. Export Your Survey Results in Other Formats
Bonus Tip: Export Data for Survey Analysis
FAQs on Writing Survey Summaries

How to Write a Survey Results Report

Let’s walk through some tricks and techniques with real examples.

The most important thing about a survey report is that it allows readers to make sense of data. Visualizations are a key component of any survey summary.

Examples of Survey Visualizations

Pie charts are perfect when you want to bring statistics to life. Here’s a great example from a wedding survey:

Example of a pie chart in a survey summary introduction

Pie charts can be simple and still get the message across. A well-designed chart will also add impact and reinforce the story you want to tell.
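If you want to build a chart like this yourself rather than rely on a survey tool, a general-purpose plotting library works fine. Here's a minimal sketch in Python (not a WPForms feature) that turns response counts into a pie chart; the question text and counts are made-up placeholders.

```python
# Minimal sketch: turning survey response counts into a pie chart.
# The question and counts below are made-up placeholders, not real survey data.
import matplotlib.pyplot as plt

responses = {
    "Very satisfied": 42,
    "Somewhat satisfied": 31,
    "Neutral": 15,
    "Dissatisfied": 12,
}

fig, ax = plt.subplots()
ax.pie(
    list(responses.values()),
    labels=list(responses.keys()),
    autopct="%1.0f%%",   # label each slice with a whole-number percentage
    startangle=90,
)
ax.set_title("How satisfied were you with your experience?")
plt.savefig("survey_pie_chart.png", dpi=150, bbox_inches="tight")
```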

Here’s another great example from a homebuyer survey introduction:

Summary of survey results in a pie chart

If your survey is made up of open-ended questions, it might be more challenging to produce charts. If that’s the case, you can write up your findings instead. We’ll look at that next.

When you’re thinking about how to write a summary of survey results, remember that the introduction needs to get the reader’s attention.

Focusing on key facts helps you to do that right at the start.

This is why it’s usually best to write the survey introduction at the end once the rest of the survey report has been compiled. That way, you know what the big takeaways are.

This is an easy and powerful way to write a survey introduction that encourages the reader to investigate.

Examples of Survey Summaries With Key Facts

Here’s an awesome example of a survey summary that immediately draws the eye.

The key finding is presented first, and then we see a fact about half the group immediately after:

Survey summary with key facts

Using this order lets us see the impactful survey responses right up top.

If you need help deciding which questions to ask in your survey, check out this article on the best survey questions to include.

Your survey summary should give the reader a complete overview of the content. But you don’t want to take up too much space.

Survey summaries are sometimes called executive summaries because they’re designed to be quickly digested by decision-makers.

You’ll want to filter out the less important findings and focus on what matters. A 1-page summary is enough to get this information across. You might want to leave space for a table of contents on this page too.

Examples of Short Survey Introductions

One way to keep a survey summary short is to use a teaser at the start.

Here’s an example introduction that doesn’t state all of its findings but gives us the incentive to keep reading:

Survey summary report teaser

And here’s a great survey introduction that summarizes the findings in just one sentence:

Survey introduction with summary of findings

In WPForms, you can reduce the size of your survey report by excluding questions you don’t need. We decided to remove this question from the report PDF because it has no answers. Just click the arrow at the top, and it won’t appear in the final printout:

Exclude question from survey introduction report

This is a great way to quickly build a PDF summary of your survey that only includes the most important questions. You can also briefly explain your methodology.

When you create a survey in WordPress, you probably have a good idea of your reasons for doing so.

Make your purpose clear in the intro. For example, if you're running a demographic survey, you might want to clarify that you'll use this information to target your audience more effectively.

The reader must know exactly what you want to find out. Ideally, you should also explain why you wanted to create the survey in the first place. This can help you to reach the correct target audience for your survey.

Examples of Intros that Explain Motivation

This vehicle survey was carried out to help with future planning, so the introduction makes the purpose clear to the reader:

Explaining the motivation for a survey in survey results

Having focused questions can help to give your survey a clear purpose. We have some questionnaire examples and templates that can help with that.

Explaining why you ran the survey helps to give context, which we’ll talk about more next.

Including numbers in a survey summary is important. But your survey summary should tell a story too.

Adding numbers to your introduction will help draw the eye, but you’ll also want to explain what the numbers tell you.

Otherwise, you’ll have a list of statistics that don’t mean much to the reader.

Examples of Survey Statistics in Context

Here’s a great example of a survey introduction that uses the results from the survey to tell a story.

Survey summary introduction with context

Another way to put numbers in context is to present the results visually.

Here, WPForms has automatically created a table from our Likert Scale question that makes it easy to see a positive trend in the survey data:

WPForms survey summary results in a table

If you’d like to use a Likert scale to produce a chart like this, check out this article on the best Likert scale questions for survey forms.
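If you'd rather build this kind of summary table outside WPForms, a rough sketch with pandas might look like the following; the statements and answers here are invented placeholders, not data from a real survey.

```python
# Rough sketch: tabulating Likert-scale responses into a per-statement summary table.
# The statements and answers are invented placeholders.
import pandas as pd

data = pd.DataFrame({
    "statement": [
        "The training was useful", "The training was useful",
        "The pace was right", "The pace was right",
    ],
    "answer": ["Agree", "Strongly agree", "Agree", "Disagree"],
})

scale = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

# Count how often each answer was chosen for each statement,
# keeping the columns in the natural order of the scale.
table = (
    pd.crosstab(data["statement"], data["answer"])
    .reindex(columns=scale, fill_value=0)
)
print(table)
```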

Now that your survey report is done, you’ll likely want action to be taken based on your findings.

That’s why it’s a good idea to make a recommendation.

If you already explained your reasons for creating the survey, you can naturally add a few sentences on the outcomes you want to see.

Examples of Survey Introductions with Recommendations

Here’s a nice example of a survey introduction that clearly states the outcomes that the organization would like to happen now that the survey is published:

Survey introduction with recommendations

This helps to focus the reader on the content and helps them to understand why the survey is important. Respondents are more likely to give honest answers if they believe that a positive outcome will come from the survey.

You can also cite related research here to give your reasoning more weight.

You can easily create pie charts in the WPForms Surveys and Polls addon. It allows you to change the way your charts look without being overwhelmed by design options.

This handy feature will save tons of time when you’re composing your survey results.

Once you have your charts, exporting them allows you to use them in other ways. You may want to embed them in marketing materials like:

  • Presentation slides
  • Infographics
  • Press releases

WPForms makes it easy to export any graphic from your survey results so you can use it on your website or in slides.

Just use the dropdown to export your survey pie chart as a JPG or PDF:

Export survey pie chart

And that’s it! You now know how to create an impactful summary of survey results and add these to your marketing material or reports.

WPForms is the best form builder plugin for WordPress. As well as having the best survey tools, it also has the best data export options.

Often, you’ll want to export form entries to analyze them in other tools. You can do exactly the same thing with your survey data.

For example, you can:

  • Export your form entries or survey data to Excel
  • Automatically send survey responses to a Google Sheet

We really like the Google Sheets addon in WPForms because it sends your entries to a Google Sheet as soon as they’re submitted. And you can connect any form or survey to a Sheet without writing any code.

wpforms to google sheets

The Google Sheets integration is powerful enough to send all of your metrics. You can add columns to your Sheet and map the data points right from your WordPress form.

This is an ideal solution if you want to give someone else access to your survey data so they can crunch the numbers in spreadsheet format.
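As a purely illustrative sketch of that last step (the analysis itself happens outside WPForms), here's how someone might summarize an exported spreadsheet in Python. The file name and column name are assumptions for the example, not something the plugin produces by default.

```python
# Illustrative only: summarizing exported survey entries.
# "survey_entries.csv" and the "recommend_score" column are assumed names --
# substitute whatever your own export actually contains.
import pandas as pd

entries = pd.read_csv("survey_entries.csv")
scores = entries["recommend_score"]          # a 0-10 rating question

print("Responses received:", scores.count())
print("Average score:", round(scores.mean(), 2))
print(scores.value_counts().sort_index())    # how many people gave each score
```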

We’ll finish up with a few questions we’ve been asked about survey reporting.

What Is a Survey Report and What Should It Include?

A survey report compiles all data collected during a survey and presents it objectively. The report often summarizes pages of data from all responses received and makes it easier for the audience to process and digest.

How Do You Present Survey Results in an Impactful Way?

The best way to present survey results is to use visualizations. Charts, graphs, and infographics will make your survey outcomes easier to interpret.

For online surveys, WPForms has an awesome Surveys and Polls addon that makes it easy to publish many types of surveys and collect data using special survey fields:

  • Likert Scale (sometimes called a matrix question)
  • Net Promoter Score (sometimes called an NPS Survey)
  • Star Rating
  • Single Line Text
  • Multiple Choice (sometimes called radio buttons)

You can turn on survey reporting at any time, even if the form expiry date has passed.

To present your results, create a beautiful PDF by clicking Print Survey Report right from the WordPress dashboard:

Print survey results

Next Step: Make Your Survey Form

To create a great survey summary, you’ll want to start out with a great survey form. Check out this article on how to create a survey form online to learn how to create and customize your surveys in WordPress.

You can also:

  • Learn how to create a popup WordPress survey
  • Read some rating scale question examples
  • Get started easily with a customer survey template from the WPForms template library.

Ready to build your survey? Get started today with the easiest WordPress form builder plugin. WPForms Pro includes free survey form templates and offers a 14-day money-back guarantee.

If this article helped you out, please follow us on Facebook and Twitter for more free WordPress tutorials and guides.



Writing Survey Questions

Perhaps the most important part of the survey process is the creation of questions that accurately measure the opinions, experiences and behaviors of the public. Accurate random sampling will be wasted if the information gathered is built on a shaky foundation of ambiguous or biased questions. Creating good measures involves both writing good questions and organizing them to form the questionnaire.

Questionnaire design is a multistage process that requires attention to many details at once. Designing the questionnaire is complicated because surveys can ask about topics in varying degrees of detail, questions can be asked in different ways, and questions asked earlier in a survey may influence how people respond to later questions. Researchers are also often interested in measuring change over time and therefore must be attentive to how opinions or behaviors have been measured in prior surveys.

Surveyors may conduct pilot tests or focus groups in the early stages of questionnaire development in order to better understand how people think about an issue or comprehend a question. Pretesting a survey is an essential step in the questionnaire design process to evaluate how people respond to the overall questionnaire and specific questions, especially when questions are being introduced for the first time.

For many years, surveyors approached questionnaire design as an art, but substantial research over the past forty years has demonstrated that there is a lot of science involved in crafting a good survey questionnaire. Here, we discuss the pitfalls and best practices of designing questionnaires.

Question development

There are several steps involved in developing a survey questionnaire. The first is identifying what topics will be covered in the survey. For Pew Research Center surveys, this involves thinking about what is happening in our nation and the world and what will be relevant to the public, policymakers and the media. We also track opinion on a variety of issues over time so we often ensure that we update these trends on a regular basis to better understand whether people’s opinions are changing.

At Pew Research Center, questionnaire development is a collaborative and iterative process where staff meet to discuss drafts of the questionnaire several times over the course of its development. We frequently test new survey questions ahead of time through qualitative research methods such as  focus groups , cognitive interviews, pretesting (often using an  online, opt-in sample ), or a combination of these approaches. Researchers use insights from this testing to refine questions before they are asked in a production survey, such as on the ATP.

Measuring change over time

Many surveyors want to track changes over time in people’s attitudes, opinions and behaviors. To measure change, questions are asked at two or more points in time. A cross-sectional design surveys different people in the same population at multiple points in time. A panel, such as the ATP, surveys the same people over time. However, it is common for the set of people in survey panels to change over time as new panelists are added and some prior panelists drop out. Many of the questions in Pew Research Center surveys have been asked in prior polls. Asking the same questions at different points in time allows us to report on changes in the overall views of the general public (or a subset of the public, such as registered voters, men or Black Americans), or what we call “trending the data”.

When measuring change over time, it is important to use the same question wording and to be sensitive to where the question is asked in the questionnaire to maintain a similar context as when the question was asked previously (see  question wording  and  question order  for further information). All of our survey reports include a topline questionnaire that provides the exact question wording and sequencing, along with results from the current survey and previous surveys in which we asked the question.

The Center’s transition from conducting U.S. surveys by live telephone interviewing to an online panel (around 2014 to 2020) complicated some opinion trends, but not others. Opinion trends that ask about sensitive topics (e.g., personal finances or attending religious services ) or that elicited volunteered answers (e.g., “neither” or “don’t know”) over the phone tended to show larger differences than other trends when shifting from phone polls to the online ATP. The Center adopted several strategies for coping with changes to data trends that may be related to this change in methodology. If there is evidence suggesting that a change in a trend stems from switching from phone to online measurement, Center reports flag that possibility for readers to try to head off confusion or erroneous conclusions.

Open- and closed-ended questions

One of the most significant decisions that can affect how people answer questions is whether the question is posed as an open-ended question, where respondents provide a response in their own words, or a closed-ended question, where they are asked to choose from a list of answer choices.

For example, in a poll conducted after the 2008 presidential election, people responded very differently to two versions of the question: “What one issue mattered most to you in deciding how you voted for president?” One was closed-ended and the other open-ended. In the closed-ended version, respondents were provided five options and could volunteer an option not on the list.

When explicitly offered the economy as a response, more than half of respondents (58%) chose this answer; only 35% of those who responded to the open-ended version volunteered the economy. Moreover, among those asked the closed-ended version, fewer than one-in-ten (8%) provided a response other than the five they were read. By contrast, fully 43% of those asked the open-ended version provided a response not listed in the closed-ended version of the question. All of the other issues were chosen at least slightly more often when explicitly offered in the closed-ended version than in the open-ended version. (Also see  “High Marks for the Campaign, a High Bar for Obama”  for more information.)


Researchers will sometimes conduct a pilot study using open-ended questions to discover which answers are most common. They will then develop closed-ended questions based on that pilot study, including the most common responses as answer choices. In this way, the questions may better reflect what the public is thinking and how they view a particular issue, or bring to light issues that the researchers may not have been aware of.

When asking closed-ended questions, the choice of options provided, how each option is described, the number of response options offered, and the order in which options are read can all influence how people respond. One example of the impact of how categories are defined can be found in a Pew Research Center poll conducted in January 2002. When half of the sample was asked whether it was “more important for President Bush to focus on domestic policy or foreign policy,” 52% chose domestic policy while only 34% said foreign policy. When the category “foreign policy” was narrowed to a specific aspect – “the war on terrorism” – far more people chose it; only 33% chose domestic policy while 52% chose the war on terrorism.

In most circumstances, the number of answer choices should be kept to a relatively small number – just four or perhaps five at most – especially in telephone surveys. Psychological research indicates that people have a hard time keeping more than this number of choices in mind at one time. When the question is asking about an objective fact and/or demographics, such as the religious affiliation of the respondent, more categories can be used. In fact, they are encouraged to ensure inclusivity. For example, Pew Research Center’s standard religion questions include more than 12 different categories, beginning with the most common affiliations (Protestant and Catholic). Most respondents have no trouble with this question because they can expect to see their religious group within that list in a self-administered survey.

In addition to the number and choice of response options offered, the order of answer categories can influence how people respond to closed-ended questions. Research suggests that in telephone surveys respondents more frequently choose items heard later in a list (a “recency effect”), and in self-administered surveys, they tend to choose items at the top of the list (a “primacy” effect).

Because of concerns about the effects of category order on responses to closed-ended questions, many sets of response options in Pew Research Center’s surveys are programmed to be randomized to ensure that the options are not asked in the same order for each respondent. Rotating or randomizing means that questions or items in a list are not asked in the same order to each respondent. Answers to questions are sometimes affected by questions that precede them. By presenting questions in a different order to each respondent, we ensure that each question gets asked in the same context as every other question the same number of times (e.g., first, last or any position in between). This does not eliminate the potential impact of previous questions on the current question, but it does ensure that this bias is spread randomly across all of the questions or items in the list. For instance, in the example discussed above about what issue mattered most in people’s vote, the order of the five issues in the closed-ended version of the question was randomized so that no one issue appeared early or late in the list for all respondents. Randomization of response items does not eliminate order effects, but it does ensure that this type of bias is spread randomly.
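To make the idea concrete, here is a small illustrative sketch of per-respondent randomization (this is not Pew Research Center's actual survey software, and the answer choices are placeholders): each respondent sees the same options, but in an independently shuffled order, so any order effect is spread randomly across the options.

```python
# Illustrative sketch: shuffling answer options independently for each respondent
# so that order effects are spread randomly rather than favoring any one option.
# The options are placeholders, not the wording of an actual Pew question.
import random

OPTIONS = ["The economy", "Health care", "Education", "Foreign policy", "Immigration"]

def options_for_respondent(respondent_id: int) -> list:
    """Return the answer options in a random order unique to this respondent."""
    rng = random.Random(respondent_id)   # seeded so each respondent's order is reproducible
    shuffled = OPTIONS.copy()
    rng.shuffle(shuffled)
    return shuffled

for respondent_id in range(3):
    print(respondent_id, options_for_respondent(respondent_id))
```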

Questions with ordinal response categories – those with an underlying order (e.g., excellent, good, only fair, poor OR very favorable, mostly favorable, mostly unfavorable, very unfavorable) – are generally not randomized because the order of the categories conveys important information to help respondents answer the question. Generally, these types of scales should be presented in order so respondents can easily place their responses along the continuum, but the order can be reversed for some respondents. For example, in one of Pew Research Center’s questions about abortion, half of the sample is asked whether abortion should be “legal in all cases, legal in most cases, illegal in most cases, illegal in all cases,” while the other half of the sample is asked the same question with the response categories read in reverse order, starting with “illegal in all cases.” Again, reversing the order does not eliminate the recency effect but distributes it randomly across the population.

Question wording

The choice of words and phrases in a question is critical in expressing the meaning and intent of the question to the respondent and ensuring that all respondents interpret the question the same way. Even small wording differences can substantially affect the answers people provide.


An example of a wording difference that had a significant impact on responses comes from a January 2003 Pew Research Center survey. When people were asked whether they would “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule,” 68% said they favored military action while 25% said they opposed military action. However, when asked whether they would “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule  even if it meant that U.S. forces might suffer thousands of casualties, ” responses were dramatically different; only 43% said they favored military action, while 48% said they opposed it. The introduction of U.S. casualties altered the context of the question and influenced whether people favored or opposed military action in Iraq.

There has been a substantial amount of research to gauge the impact of different ways of asking questions and how to minimize differences in the way respondents interpret what is being asked. The issues related to question wording are more numerous than can be treated adequately in this short space, but below are a few of the important things to consider:

First, it is important to ask questions that are clear and specific and that each respondent will be able to answer. If a question is open-ended, it should be evident to respondents that they can answer in their own words and what type of response they should provide (an issue or problem, a month, number of days, etc.). Closed-ended questions should include all reasonable responses (i.e., the list of options is exhaustive) and the response categories should not overlap (i.e., response options should be mutually exclusive). Further, it is important to discern when it is best to use forced-choice close-ended questions (often denoted with a radio button in online surveys) versus “select-all-that-apply” lists (or check-all boxes). A 2019 Center study found that forced-choice questions tend to yield more accurate responses, especially for sensitive questions.  Based on that research, the Center generally avoids using select-all-that-apply questions.

It is also important to ask only one question at a time. Questions that ask respondents to evaluate more than one concept (known as double-barreled questions) – such as “How much confidence do you have in President Obama to handle domestic and foreign policy?” – are difficult for respondents to answer and often lead to responses that are difficult to interpret. In this example, it would be more effective to ask two separate questions, one about domestic policy and another about foreign policy.

In general, questions that use simple and concrete language are more easily understood by respondents. It is especially important to consider the education level of the survey population when thinking about how easy it will be for respondents to interpret and answer a question. Double negatives (e.g., do you favor or oppose  not  allowing gays and lesbians to legally marry) or unfamiliar abbreviations or jargon (e.g., ANWR instead of Arctic National Wildlife Refuge) can result in respondent confusion and should be avoided.

Similarly, it is important to consider whether certain words may be viewed as biased or potentially offensive to some respondents, as well as the emotional reaction that some words may provoke. For example, in a 2005 Pew Research Center survey, 51% of respondents said they favored “making it legal for doctors to give terminally ill patients the means to end their lives,” but only 44% said they favored “making it legal for doctors to assist terminally ill patients in committing suicide.” Although both versions of the question are asking about the same thing, the reaction of respondents was different. In another example, respondents have reacted differently to questions using the word “welfare” as opposed to the more generic “assistance to the poor.” Several experiments have shown that there is much greater public support for expanding “assistance to the poor” than for expanding “welfare.”

We often write two versions of a question and ask half of the survey sample one version of the question and the other half the second version. Thus, we say we have two  forms  of the questionnaire. Respondents are assigned randomly to receive either form, so we can assume that the two groups of respondents are essentially identical. On questions where two versions are used, significant differences in the answers between the two forms tell us that the difference is a result of the way we worded the two versions.
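Purely as an illustration of the mechanics (not the Center's tooling, and with placeholder question wording), random assignment to one of the two forms might look like this:

```python
# Illustrative sketch: randomly assigning each respondent to one of two
# question wordings (form A or form B) in a split-form experiment.
# The wordings below are placeholders, not actual Pew question text.
import random

FORM_A = "Do you favor or oppose the proposal?"
FORM_B = "Do you favor or oppose the proposal, even if it increases costs?"

def assign_form(respondent_id: int) -> str:
    """Assign the respondent to form A or form B with equal probability."""
    rng = random.Random(respondent_id)   # deterministic per respondent
    return FORM_A if rng.random() < 0.5 else FORM_B

assignments = [assign_form(i) for i in range(1000)]
print("Share assigned to form A:", assignments.count(FORM_A) / len(assignments))
```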


One of the most common formats used in survey questions is the “agree-disagree” format. In this type of question, respondents are asked whether they agree or disagree with a particular statement. Research has shown that, compared with the better educated and better informed, less educated and less informed respondents have a greater tendency to agree with such statements. This is sometimes called an “acquiescence bias” (since some kinds of respondents are more likely to acquiesce to the assertion than are others). This behavior is even more pronounced when there’s an interviewer present, rather than when the survey is self-administered. A better practice is to offer respondents a choice between alternative statements. A Pew Research Center experiment with one of its routinely asked values questions illustrates the difference that question format can make. Not only does the forced choice format yield a very different result overall from the agree-disagree format, but the pattern of answers between respondents with more or less formal education also tends to be very different.

One other challenge in developing questionnaires is what is called “social desirability bias.” People have a natural tendency to want to be accepted and liked, and this may lead people to provide inaccurate answers to questions that deal with sensitive subjects. Research has shown that respondents understate alcohol and drug use, tax evasion and racial bias. They also may overstate church attendance, charitable contributions and the likelihood that they will vote in an election. Researchers attempt to account for this potential bias in crafting questions about these topics. For instance, when Pew Research Center surveys ask about past voting behavior, it is important to note that circumstances may have prevented the respondent from voting: “In the 2012 presidential election between Barack Obama and Mitt Romney, did things come up that kept you from voting, or did you happen to vote?” The choice of response options can also make it easier for people to be honest. For example, a question about church attendance might include three of six response options that indicate infrequent attendance. Research has also shown that social desirability bias can be greater when an interviewer is present (e.g., telephone and face-to-face surveys) than when respondents complete the survey themselves (e.g., paper and web surveys).

Lastly, because slight modifications in question wording can affect responses, identical question wording should be used when the intention is to compare results to those from earlier surveys. Similarly, because question wording and responses can vary based on the mode used to survey respondents, researchers should carefully evaluate the likely effects on trend measurements if a different survey mode will be used to assess change in opinion over time.

Question order

Once the survey questions are developed, particular attention should be paid to how they are ordered in the questionnaire. Surveyors must be attentive to how questions early in a questionnaire may have unintended effects on how respondents answer subsequent questions. Researchers have demonstrated that the order in which questions are asked can influence how people respond; earlier questions can unintentionally provide context for the questions that follow (these effects are called “order effects”).

One kind of order effect can be seen in responses to open-ended questions. Pew Research Center surveys generally ask open-ended questions about national problems, opinions about leaders and similar topics near the beginning of the questionnaire. If closed-ended questions that relate to the topic are placed before the open-ended question, respondents are much more likely to mention concepts or considerations raised in those earlier questions when responding to the open-ended question.

For closed-ended opinion questions, there are two main types of order effects: contrast effects (where the order results in greater differences in responses) and assimilation effects (where responses are more similar as a result of their order).


An example of a contrast effect can be seen in a Pew Research Center poll conducted in October 2003, a dozen years before same-sex marriage was legalized in the U.S. That poll found that people were more likely to favor allowing gays and lesbians to enter into legal agreements that give them the same rights as married couples when this question was asked after one about whether they favored or opposed allowing gays and lesbians to marry (45% favored legal agreements when asked after the marriage question, but 37% favored legal agreements without the immediate preceding context of a question about same-sex marriage). Responses to the question about same-sex marriage, meanwhile, were not significantly affected by its placement before or after the legal agreements question.


Another experiment embedded in a December 2008 Pew Research Center poll also resulted in a contrast effect. When people were asked “All in all, are you satisfied or dissatisfied with the way things are going in this country today?” immediately after having been asked “Do you approve or disapprove of the way George W. Bush is handling his job as president?”, 88% said they were dissatisfied, compared with only 78% without the context of the prior question.

Responses to presidential approval remained relatively unchanged whether national satisfaction was asked before or after it. A similar finding occurred in December 2004 when both satisfaction and presidential approval were much higher (57% were dissatisfied when Bush approval was asked first vs. 51% when general satisfaction was asked first).

Several studies also have shown that asking a more specific question before a more general question (e.g., asking about happiness with one’s marriage before asking about one’s overall happiness) can result in a contrast effect. Although some exceptions have been found, people tend to avoid redundancy by excluding the more specific question from the general rating.

Assimilation effects occur when responses to two questions are more consistent or closer together because of their placement in the questionnaire. We found an example of an assimilation effect in a Pew Research Center poll conducted in November 2008 when we asked whether Republican leaders should work with Obama or stand up to him on important issues and whether Democratic leaders should work with Republican leaders or stand up to them on important issues. People were more likely to say that Republican leaders should work with Obama when the question was preceded by the one asking what Democratic leaders should do in working with Republican leaders (81% vs. 66%). However, when people were first asked about Republican leaders working with Obama, fewer said that Democratic leaders should work with Republican leaders (71% vs. 82%).

The order questions are asked is of particular importance when tracking trends over time. As a result, care should be taken to ensure that the context is similar each time a question is asked. Modifying the context of the question could call into question any observed changes over time (see  measuring change over time  for more information).

A questionnaire, like a conversation, should be grouped by topic and unfold in a logical order. It is often helpful to begin the survey with simple questions that respondents will find interesting and engaging. Throughout the survey, an effort should be made to keep the survey interesting and not overburden respondents with several difficult questions right after one another. Demographic questions such as income, education or age should not be asked near the beginning of a survey unless they are needed to determine eligibility for the survey or for routing respondents through particular sections of the questionnaire. Even then, it is best to precede such items with more interesting and engaging questions. One virtue of survey panels like the ATP is that demographic questions usually only need to be asked once a year, not in each survey.


28 Questionnaire Examples, Questions, & Templates to Survey Your Clients


The adage "the customer is always right" has received some pushback in recent years, but when it comes to conducting surveys , the phrase is worth a deeper look. In the past, representatives were tasked with solving client problems as they happened. Now, they have to be proactive by solving problems before they come up.


Salesforce found that 63% of customers expect companies to anticipate their needs before they ask for help. But how can a customer service team recognize these customer needs in advance and effectively solve them on a day-to-day basis?


A customer questionnaire is a tried-and-true method for collecting survey data to inform your customer service strategy. By hearing directly from the customer, you'll capture first-hand data about how well your service team meets their needs. In this article, you'll get free questionnaire templates and best practices on how to administer them for the most honest responses.

Table of Contents:

  • Questionnaire Definition
  • Survey vs. Questionnaire
  • Questionnaire Templates
  • Questionnaire Examples
  • Questionnaire Design
  • Survey Question Examples
  • Examples of Good Survey Questions
  • How to Make a Questionnaire



A questionnaire is a research tool used to conduct surveys. It includes specific questions with the goal to understand a topic from the respondents' point of view. Questionnaires typically have closed-ended, open-ended, short-form, and long-form questions.

The questions should always stay as unbiased as possible. For instance, it's unwise to ask for feedback on a specific product or service that’s still in the ideation phase. To complete the questionnaire, the customer would have to imagine how they might experience the product or service rather than sharing their opinion about their actual experience with it.

Ask broad questions about the kinds of qualities and features your customers enjoy in your products or services and incorporate that feedback into new offerings your team is developing.

What makes a good questionnaire?

  • Define the goal
  • Make it short and simple
  • Use a mix of question types
  • Proofread carefully
  • Keep it consistent

A good questionnaire should find what you need versus what you want. It should be valuable and give you a chance to understand the respondent’s point of view.

Make the purpose of your questionnaire clear. While it's tempting to ask a range of questions simultaneously, you'll get more valuable results if you stay specific to a set topic.

According to HubSpot research, 47% of those surveyed say their top reason for abandoning a survey is the time it takes to complete.

So, questionnaires should be concise and easy to finish. If you're looking for a respondent’s experience with your business, focus on the most important questions.


Your questionnaire should include a combination of question types, like open-ended, long-form, or short-ended questions.

Open-ended questions give users a chance to share their own answers. But closed-ended questions are more efficient and easy to quantify, with specific answer choices.

If you're not sure which question types are best, read here for more survey question examples.

While it's important to check spelling and grammar, there are two other things you'll want to check for a great questionnaire.

First, edit for clarity. Jargon, technical terms, and brand-specific language can be confusing for respondents. Next, check for leading questions. These questions can produce biased results that will be less useful to your team.

Consistency makes it easier for respondents to quickly complete your questionnaire. This is because it makes the questions less confusing. It can also reduce bias.

Being consistent is also helpful for analyzing questionnaire data because it makes it easier to compare results. With this in mind, keep response scales, question types, and formatting consistent.

In-Depth Interviews vs. Questionnaire

Questionnaires can be a more feasible and efficient research method than in-depth interviews. They are a lot cheaper to conduct. That’s because in-depth interviews can require you to compensate the interviewees for their time and give accommodations and travel reimbursement.

Questionnaires also save time for both parties. Customers can quickly complete them on their own time, and employees of your company don't have to spend time conducting the interviews. They can capture a larger audience than in-depth interviews, making them much more cost-effective.

It would be impossible for a large company to interview tens of thousands of customers in person. The same company could potentially get feedback from its entire customer base using an online questionnaire.

When considering your current products and services (as well as ideas for new products and services), it's essential to get the feedback of existing and potential customers. They are the ones who have a say in purchasing decisions.

A questionnaire is a tool that’s used to conduct a survey. A survey is the process of gathering, sampling, analyzing, and interpreting data from a group of people.

The confusion between these terms most likely stems from the fact that questionnaires and data analysis were treated as very separate processes before the Internet became popular. Questionnaires used to be completed on paper, and data analysis occurred later as a separate process. Nowadays, these processes are typically combined since online survey tools allow questionnaire responses to be analyzed and aggregated all in one step.

But questionnaires can still be used for reasons other than data analysis. Job applications and medical history forms are examples of questionnaires that have no intention of being statistically analyzed. The key difference between questionnaires and surveys is that they can exist together or separately.

Below are some of the best free questionnaire templates you can download to gather data that informs your next product or service offering.

What makes a good survey question?

  • Have a goal in mind
  • Draft clear and distinct answers and questions
  • Ask one question at a time
  • Check for bias and sensitivity
  • Include follow-up questions

To make a good survey question, you have to choose the right type of questions to use. Include concise, clear, and appropriate questions with answer choices that won’t confuse the respondent and will clearly offer data on their experience.

Good survey questions can give a business good data to examine. Here are some more tips to follow as you draft your survey questions.

To make a good survey, consider what you are trying to learn from it. Understanding why you need to do a survey will help you create clear and concise questions that you need to ask to meet your goal. The more your questions focus on one or two objectives, the better your data will be.

You have a goal in mind for your survey. Now you have to write the questions and answers depending on the form you’re using.

For instance, if you’re using ranks or multiple-choice in your survey, be clear. Here are examples of good and poor multiple-choice answers:

Poor Survey Question and Answer Example

California:

  • Contains the tallest mountain in the United States.
  • Has an eagle on its state flag.
  • Is the second-largest state in terms of area.
  • Was the location of the Gold Rush of 1849.

Good Survey Question and Answer Example

What is the main reason so many people moved to California in 1849?

  • California's land was fertile, plentiful, and inexpensive.
  • The discovery of gold in central California.
  • The East was preparing for a civil war.
  • They wanted to establish religious settlements.

In the poor example, the question may confuse the respondent because it's not clear what is being asked or how the answers relate to the question. The survey didn’t fully explain the question, and the options are also confusing.

In the good example above, the question and answer choices are clear and easy to understand.

Always make sure answers and questions are clear and distinct to create a good experience for the respondent. This will offer your team the best outcomes from your survey.

It's surprisingly easy to combine multiple questions into one. They even have a name — they’re called "double-barreled" questions. But a good survey asks one question at a time.

For example, a survey question could read, "What is your favorite sneaker and clothing apparel brand?" This is bad because you’re asking two questions at once.

By asking two questions simultaneously, you may confuse your respondents and get unclear answers. Instead, each question should focus on getting specific pieces of information.

For example, ask, "What is your favorite sneaker brand?" then, "What is your favorite clothing apparel brand?" By separating the questions, you allow your respondents to give separate and precise answers.

Biased questions can lead a respondent toward a specific response. They can also be vague or unclear. Sensitive questions such as age, religion, or marital status can be helpful for demographics. These questions can also be uncomfortable for people to answer.

There are a few ways to create a positive experience with your survey questions.

First, think about question placement. Sensitive questions that appear in context with other survey questions can help people understand why you are asking. This can make them feel more comfortable responding.

Next, check your survey for leading questions, assumptions, and double-barreled questions. You want to make sure that your survey is neutral and free of bias.

Asking more than one survey question about an area of interest can make a survey easier to understand and complete. It also helps you collect more in-depth insights from your respondents.

1. Free HubSpot Questionnaire Template

HubSpot offers a variety of free customer surveys and questionnaire templates to analyze and measure customer experience. Choose from five templates: net promoter score, customer satisfaction, customer effort, open-ended questions, and long-form customer surveys.

2. Client Questionnaire Template

It's a good idea to gauge your clients' experiences with your business to uncover opportunities to improve your offerings. That will, in turn, better suit their lifestyles. You don't have to wait for an entire year to pass before polling your customer base about their experience either. A simple client questionnaire, like the one below, can be administered as a micro survey several times throughout the year. These types of quick survey questions work well to retarget your existing customers through social media polls and paid interactive ads.

1. How much time do you spend using [product or service]?

  • Less than a minute
  • About 1 - 2 minutes
  • Between 2 and 5 minutes
  • More than 5 minutes

2. In the last month, what has been your biggest pain point?

  • Finding enough time for important tasks
  • Delegating work
  • Having enough to do

3. What's your biggest priority right now?

  • Finding a faster way to work
  • Problem-solving
  • Staff development


3. Website Questionnaire Template

Whether you just launched a brand new website or you're gathering data points to inform a redesign, you'll find customer feedback to be essential in both processes. A website questionnaire template will come in handy to collect this information using an unbiased method.

1. How many times have you visited [website] in the past month?

  • More than once

2. What is the primary reason for your visit to [website]?

  • To make a purchase
  • To find more information before making a purchase in-store
  • To contact customer service

3. Are you able to find what you're looking for on the website homepage?

4. Customer Satisfaction Questionnaire Template

If you've never surveyed your customers and are looking for a template to get started, this one includes some basic customer satisfaction questions. These will apply to just about any customer your business serves.

1. How likely are you to recommend us to family, friends, or colleagues?

  • Extremely unlikely
  • Somewhat unlikely
  • Somewhat likely
  • Extremely likely

2. How satisfied were you with your experience?

1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10

3. Rank the following items in terms of their priority to your purchasing process.

  • Helpful staff
  • Quality of product
  • Price of product
  • Ease of purchase
  • Proximity of store
  • Online accessibility
  • Current need
  • Appearance of product

4. Who did you purchase these products for?

  • Family member
  • On behalf of a business

5. Please rate our staff on the following terms:

  • Friendly __ __ __ __ __ Hostile
  • Helpful __ __ __ __ __ Useless
  • Knowledgeable __ __ __ __ __ Inexperienced
  • Professional __ __ __ __ __ Inappropriate

6. Would you purchase from our company again?

7. How can we improve your experience for the future?

________________________________.

5. Customer Effort Score Questionnaire Template

The following template gives an example of a brief customer effort score (CES) questionnaire. This free template works well for new customers to measure their initial reaction to your business.

1. What was the ease of your experience with our company?

  • Extremely difficult
  • Somewhat difficult
  • Somewhat easy
  • Extremely easy

2. The company did everything it could to make my process as easy as possible.

  • Strongly disagree
  • Somewhat disagree
  • Somewhat agree
  • Strongly agree

3. On a scale of 1 to 10 (1 being "extremely quickly" and 10 being "extremely slowly"), how fast were you able to solve your problem?

4. How much effort did you have to put forth while working with our company?

  • Much more than expected
  • Somewhat more than expected
  • As much as expected
  • Somewhat less than expected
  • Much less than expected

6. Demographic Questionnaire Template

Here's a template for surveying customers to learn more about their demographic background. You could substantiate the analysis of this questionnaire by corroborating the data with other information from your web analytics, internal customer data, and industry data.

1. How would you describe your employment status?

  • Employed full-time
  • Employed part-time
  • Freelance/contract employee
  • Self-employed

2. How many employees work at your company?

3. How would you classify your role?

  • Individual Contributor

4. How would you classify your industry?

  • Technology/software
  • Hospitality/dining
  • Entertainment

Below, we've curated examples of common survey question formats that do a great job of gathering valuable qualitative and quantitative data.


Multiple-Choice

Multiple-choice questions offer respondents several answers to choose from. This is a popular choice of questionnaire format since it's simple for people to fill out and for companies to analyze.

Multiple-choice questions can be in single-answer form (respondents can only choose one response) or multiple-answer form (respondents can choose as many responses as necessary).

Multiple-choice survey question examples : "Which of the following social media platforms do you use most often?"


Rating Scale

Rating scale questions offer a scale of numbers and ask respondents to rate topics based on the sentiments assigned to that scale. This is effective when assessing customer satisfaction.

Rating scale survey question examples : "Rate your level of satisfaction with the customer service you received today on a scale of 1-10."


Yes or No Questions

Yes or no survey questions are a type of dichotomous question. These are questions that only offer two possible responses. They’re useful because they’re quick to answer and can help with customer segmentation.

Yes or no survey questions example : "Have you ever used HubSpot before?"

Likert Scale

Likert scale questions assess whether a respondent agrees with the statement, as well as the extent to which they agree or disagree.

These questions typically offer five or seven responses, with sentiments ranging from items such as "strongly disagree" to "strongly agree." Check out this post to learn more about the Likert scale .

Likert scale survey question examples : “How satisfied are you with the service from [brand]?”


Open-Ended Questions

Open-ended questions ask a broader question or offer a chance to elaborate on a response to a closed-ended question. They're accompanied by a text box that leaves room for respondents to write freely. This is particularly important when asking customers to expand on an experience or recommendation.

Open-ended survey question examples : "What are your personal goals for using HubSpot? Please describe."


Matrix Table

A matrix table groups several multiple-choice questions into a single table. The answer choices are usually organized along a common scale, which makes it easier to understand the relationships between different survey responses.

Matrix table survey question examples : "Rate your level of agreement with the following statements about HubSpot on a scale of 1-5."


Rank Order Scaling

These questions ask respondents to rank a set of terms by order of preference or importance. This is useful for understanding customer priorities.

Rank order scaling examples : "Rank the following factors in order of importance when choosing a new job."


Semantic Differential Scale

This scale features pairs of opposite adjectives that respondents use for rating, usually for a feature or experience. This type of question makes it easier to understand customer attitudes and beliefs.

Semantic differential scale question examples : "Rate your overall impression of this brand as friendly vs. unfriendly, innovative vs. traditional, and boring vs. exciting."


Side-By-Side Matrix

This matrix table format includes two sets of questions horizontally for easy comparison. This format can help with customer gap analysis.

Side-by-side matrix question examples : "Rate your level of satisfaction with HubSpot's customer support compared to its ease of use."


Stapel Scale

The Stapel rating scale offers a single adjective or idea for rating. It uses a numerical scale with a zero point in the middle. This survey question type helps with in-depth analysis.

Stapel scale survey question examples : "Rate your overall experience with this product as +5 (excellent) to -5 (terrible)."


Constant Sum Survey Questions

In this question format, people distribute points to different choices based on the perceived importance of each point. This kind of question is often used in market research and can help your team better understand customer choices .

Constant sum survey question examples : "What is your budget for the following marketing expenses: Paid campaigns, Events, Freelancers, Agencies, Research."


Image Choice

This survey question type shows several images. Then, it asks the respondent to choose the image that best matches their response to the question. These questions are useful for understanding your customers’ design preferences.

Image choice survey questions example : "Which of these three images best represents your brand voice?"


Choice Model

This survey question offers a hypothetical scenario, then the respondent must choose from the presented options. It's a useful type of question when you are refining a product or strategy.

Choice model survey questions example : "Which of these three deals would be most appealing to you?"

Click Map Questions

Click map questions show an image and ask respondents to click on specific areas of it in response to a question. This question type uses data visualization to learn about customer preferences for design and user experience.

Click map question examples : "Click on the section of the website where you would expect to find pricing information."


Data Upload

This survey question example asks the respondent to upload a file or document in response to a question. This type of survey question can help your team collect data and context that might be tough to collect otherwise.

Data upload question examples : "Please upload a screenshot of the error you encountered during your purchase."


Benchmarkable Questions

This question type asks a respondent to compare their answers to a group or benchmark. These questions can be useful if you're trying to compare buyer personas or other customer groups.

Benchmarkable survey questions example : "Compare your company's marketing budget to other companies in your industry."

Good Survey Questions

  • What is your favorite product?
  • Why did you purchase this product?
  • How satisfied are you with [product]?
  • Would you recommend [product] to a friend?
  • Would you recommend [company name] to a friend?
  • If you could change one thing about [product], what would it be?
  • Which other options were you considering before [product or company name]?
  • Did [product] help you accomplish your goal?
  • How would you feel if we did not offer this product, feature, or service?
  • What would you miss the most if you couldn't use your favorite product from us?
  • What is one word that best describes your experience using our product?
  • What's the primary reason for canceling your account?
  • How satisfied are you with our customer support?
  • Did we answer all of your questions and concerns?
  • How can we be more helpful?
  • What additional features would you like to see in this product?
  • Are we meeting your expectations?
  • How satisfied are you with your experience?

1. "What is your favorite product?"

This question is a great starter for your survey. Most companies want to know what their most popular products are, and this question cuts right to the point.

It's important to note that this question gives you the customer's perspective, not empirical evidence. You should compare the results to your inventory to see if your customers' answers match your actual sales. You may be surprised to find your customers' "favorite" product isn't the highest-selling one.

2. "Why did you purchase this product?"

Once you know their favorite product, you need to understand why they like it so much. The qualitative data will help your marketing and sales teams attract and engage customers. They'll know which features to advertise most and can seek out new leads similar to your existing customers.

3. "How satisfied are you with [product]?"

When you have a product that isn't selling, you can ask this question to see why customers are unhappy with it. If the reviews are poor, you'll know that the product needs reworking, and you can send it back to product management for improvement. Or, if these results are positive, they may have something to do with your marketing or sales techniques. You can then gather more info during the questionnaire and restrategize your campaigns based on your findings.

4. "Would you recommend [product] to a friend?"

This is a classic survey question used with most NPS® surveys. It asks the customer if they would recommend your product to one of their peers. This is extremely important because most people trust customer referrals more than traditional advertising. So, if your customers are willing to recommend your products, you'll have an easier time acquiring new leads.
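The standard NPS calculation subtracts the percentage of detractors (scores 0-6 on the 0-10 recommendation scale) from the percentage of promoters (scores 9-10). As a rough illustration, here is a minimal Python sketch of that calculation; the response scores are made up for the example.

```python
# Minimal NPS sketch: promoters score 9-10, detractors score 0-6 on the 0-10 scale.
def net_promoter_score(scores):
    """Return NPS as % promoters minus % detractors."""
    if not scores:
        raise ValueError("no responses to score")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores), 1)

responses = [10, 9, 8, 7, 6, 9, 10, 3, 8, 9]  # hypothetical answers
print(net_promoter_score(responses))  # 30.0
```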

5. "Would you recommend [company name] to a friend?"

Similar to the question above, this one asks the customer to consider your business as a whole and not just your product. This gives you insight into your brand's reputation and shows how customers feel about your company's actions. Even if you have an excellent product, your brand's reputation may be the cause of customer churn . Your marketing team should pay close attention to this question to see how they can improve the customer experience .

6. "If you could change one thing about [product], what would it be?"

This is a good question to ask your most loyal customers or ones that have recently churned. For loyal customers, you want to keep adding value to their experience. Asking how your product can improve helps your development team find flaws and increases your chances of retaining a valuable customer segment.

For customers that have recently churned, this question gives insight into how you can retain future users that are unhappy with your product or service. By giving these customers a space to voice their criticisms, you can either reach out and offer solutions or relay feedback for consideration.

7. "Which other options were you considering before [product or company name]?"

If you're operating in a competitive industry, customers will have more than one choice when considering your brand. And if you sell variations of your product or produce new models periodically, customers may prefer one version over another.

For this question, you should offer answers to choose from in a multiple-selection format. This will limit the types of responses you'll receive and help you get the exact information you need.

8. "Did [product] help you accomplish your goal?"

The purpose of any product or service is to help customers reach a goal. So, you should be direct and ask them if your company steered them toward success. After all, customer success is an excellent retention tool. If customers are succeeding with your product, they're more likely to stay loyal to your brand.

9. "How would you feel if we did not offer this product, feature, or service?"

Thinking about discontinuing a product? This question can help you decide whether or not a specific product, service, or feature will be missed if you were to remove it.

Even if you know that a product or service isn't worth offering, it's important to ask this question anyway because there may be a certain aspect of the product that your customers like. They'll be delighted if you can integrate that feature into a new product or service.

10. "If you couldn't use your favorite product from us, what would you miss the most about it?"

This question pairs well with the one above because it frames the customer's favorite product from a different point of view. Instead of describing why they love a particular product, the customer can explain what they'd be missing if they didn't have it at all. This type of question uncovers "fear of loss," which can be a very different motivating factor than "hope for gain."

11. "What word best describes your experience using our product?"

Your marketing team will love this question. A single word or a short phrase can easily sum up your customers’ emotions when they experience your company, product, or brand. Those emotions can be translated into relatable marketing campaigns that use your customers’ exact language.

If the responses reveal negative emotions, it's likely that your entire customer service team can relate to that pain point. Rather than calling it "a bug in the system," you can describe the problem as a "frustrating roadblock" to keep their experience at the forefront of the solution.

12. "What's the primary reason for canceling your account?"

Finding out why customers are unhappy with your product or service is key to decreasing your churn rate . If you don't understand why people leave your brand, it's hard to make effective changes to prevent future turnover. Or worse, you might alter your product or service in a way that increases your churn rate, causing you to lose customers who were once loyal supporters.

13. "How satisfied are you with our customer support?"

It's worth asking customers how happy they are with your support or service team. After all, an excellent product doesn't always guarantee that customers will stay loyal to your brand. Research shows that one in six customers will leave a brand they love after just one poor service experience.

14. "Did we answer all of your questions and concerns?"

This is a good question to ask after a service experience. It shows how thorough your support team is and whether they're prioritizing speed too much over quality. If customers still have questions and concerns after a service interaction, your support team is focusing too much on closing tickets and not enough on meeting customer needs .

15. "How can we be more helpful?"

Sometimes it's easier to be direct and simply ask customers what else you can do to help them. This shows a genuine interest in your buyers' goals which helps your brand foster meaningful relationships with its customer base. The more you can show that you sincerely care about your customers' problems, the more they'll open up to you and be honest about how you can help them.

16. "What additional features would you like to see in this product?"

With this question, your team can get inspiration for the company's next product launch. Think of the responses as a wish list from your customers. You can discover what features are most valuable to them and whether they already exist within a competitor's product.

Incorporating every feature suggestion is nearly impossible, but it's a convenient way to build a backlog of ideas that can inspire future product releases.

17. "Are we meeting your expectations?"

This is a really important question to ask because customers won't always tell you when they're unhappy with your service. Not every customer will ask to speak with a manager when they're unhappy with your business. In fact, most will quietly move on to a competitor rather than broadcast their unhappiness to your company. To prevent this type of customer churn, you need to be proactive and ask customers if your brand is meeting their expectations.

18. "How satisfied are you with your experience?"

This question asks the customer to summarize their experience with your business. It gives you a snapshot of how the customer is feeling in that moment and their perception of your brand. Asking this question at the right stage in the customer's journey can tell you a lot about what your company is doing well and where you can stand to improve.

Next, let's dig into some tips for creating your own questionnaire.

  • Start with templates as a foundation.
  • Know your question types.
  • Keep it brief when possible.
  • Choose a simple visual design.
  • Use a clear research process.
  • Create questions with straightforward, unbiased language.
  • Make sure every question is important.
  • Ask one question at a time.
  • Order your questions logically.
  • Consider your target audience.
  • Test your questionnaire.

1. Use questionnaire templates.

Rather than build a questionnaire from scratch, consider using questionnaire templates to get started. HubSpot's collection of customer-facing questionnaire templates can help you quickly build and send a questionnaire to your clients and analyze the results right on Google Drive.


2. Know your question types.

A simple "yes" or "no" doesn't cut it. To get feedback that actually matters, you need to give customers options that go in-depth. There's a method to getting accurate feedback from your questionnaire, and it starts by choosing the appropriate types of questions for the information you want to know.

Vrnda LeValley , customer training manager at HubSpot, recommends starting with an alignment question like, "Does this class meet your expectations?" because it gives more context to any positive or negative scores that follow. She continues, "If it didn't meet expectations, then there will potentially be negative responses across the board (as well as the reverse)."

3. Keep it brief, when possible.

Most questionnaires don't need to be longer than a page. For routine customer satisfaction surveys, it's unnecessary to ask 50 slightly varied questions about a customer's experience when those questions could be combined into 10 solid questions.

The shorter your questionnaire is, the more likely a customer will complete it. Plus a shorter questionnaire means less data for your team to collect and analyze. Based on the feedback, it will be a lot easier for you to get the information you need to make the necessary changes in your organization and products.

4. Choose a simple visual design.

There's no need to make your questionnaire a stunning work of art. As long as it's clear and concise, it will be attractive to customers. When asking questions that are important to furthering your company, it's best to keep things simple. Select a font that’s common and easy to read, like Helvetica or Arial. Use a text size that customers of all abilities can navigate.

A questionnaire is most effective when all the questions are visible on a single screen. The layout is important. If a questionnaire is even remotely difficult to navigate, your response rate could suffer. Make sure that buttons and checkboxes are easy to click and that questions are visible on both computer and mobile screens.

5. Use a clear research process.

Before planning questions for your questionnaire, you'll need to have a definite direction for it. A questionnaire is only effective if the results answer an overarching research question. After all, the research process is an important part of the survey, and a questionnaire is a tool that's used within the process.

In your research process, you should first come up with a research question. What are you trying to find out? What's the point of this questionnaire? Keep this in mind throughout the process.

After coming up with a research question, it's a good idea to have a hypothesis. What do you predict the results will be for your questionnaire? This can be structured in a simple "If … then …" format. A structured experiment — yes, your questionnaire is a type of experiment — will confirm that you're only collecting and analyzing data necessary to answer your research question. Then, you can move forward with your survey .

6. Create questions with straightforward, unbiased language.

When crafting your questions, it's important to structure them to get the point across. You don't want any confusion for your customers because this may influence their answers. Instead, use clear language. Don't use unnecessary jargon, and use simple terms in favor of longer-winded ones.

You may risk the reliability of your data if you try to combine two questions. Rather than asking, "How was your experience shopping with us, and would you recommend us to others?" separate it into two separate questions. Customers will be clear on your question and choose a response most appropriate for each one.

You should always keep the language in your questions unbiased. You never want to sway customers one way or another because this will cause your data to be skewed. Instead of asking, "Some might say that we create the best software products in the world. Would you agree or disagree?" it may be better to ask, "How would you rate our software products on a scale of 1 to 10?" This removes any bias and confirms that all the responses are valid.

7. Ask only the most important questions.

When creating your questionnaire, keep in mind that time is one of the most valuable commodities for customers. Most aren't going to sit through a 50-question survey, especially when they're being asked about products or services they didn't use. Even if they do complete it, most of these will be half-hearted responses from fatigued customers who simply want to be finished with it.

If your questionnaire has five or 55 questions, make sure each has a specific purpose. Individually, they should be aimed at collecting certain pieces of information that reveal new insights into different aspects of your business. If your questions are irrelevant or seem out of place, your customers will be easily derailed by the survey. And, once the customer has lost interest, it'll be difficult to regain their focus.

8. Ask one question at a time.

Since every question has a purpose, ask them one at a time. This lets the customer focus and encourages them to share a thoughtful response. This is particularly important for open-ended questions where customers need to describe an experience or opinion.

By grouping questions together, you risk overwhelming busy customers who don't have time for a long survey. They may think you're asking them too much, or they might see your questionnaire as a daunting task. You want your survey to appear as painless as possible. Keeping your questions separated will make it more user-friendly.

9. Order your questions logically.

A good questionnaire is like a good book. The beginning questions should lay the framework, the middle ones should cut to the core issues, and the final questions should tie up all loose ends. This flow keeps customers engaged throughout the entire survey.

When creating your questionnaire, start with the most basic questions about demographics. You can use this information to segment your customer base and create different buyer personas.

Next, add in your product and services questions. These are the ones that offer insights into common customer roadblocks and where you can improve your business's offerings. Questions like these guide your product development and marketing teams looking for new ways to enhance the customer experience.

Finally, you should conclude your questionnaire with open-ended questions to understand the customer journey. These questions let customers voice their opinions and point out specific experiences they've had with your brand.

10. Consider your target audience.

Whenever you collect customer feedback, you need to keep in mind the goals and needs of your target audience. After all, the participants in this questionnaire are your active customers. Your questions should be geared toward the interests and experiences they've already had with your company.

You can even create multiple surveys that target different buyer personas. For example, if you have a subscription-based pricing model, you can personalize your questionnaire for each type of subscription your company offers.

11. Test your questionnaire.

Once your questionnaire is complete, it's important to test it. If you don't, you may end up asking the wrong questions and collecting irrelevant or inaccurate information. Start by giving your employees the questionnaire to test, then send it to small groups of customers and analyze the results. If you're gathering the data you're looking for, then you should release the questionnaire to all of your customers.

How Questionnaires Can Benefit Your Customer Service Strategy

Whether you have one customer or 1000 customers, their opinions matter when it comes to the success of your business. Their satisfaction with your offerings can reveal how well or how poorly your customer service strategy and business are meeting their needs. A questionnaire is one of the most powerful, cost-effective tools to uncover what your customers think about your business. When analyzed properly, it can inform your product and service launches.

Use the free questionnaire templates, examples, and best practices in this guide to conduct your next customer feedback survey.

Now that you know the slight difference between a survey and a questionnaire, it's time to put it into practice with your products or services. Remember, a good survey or questionnaire always starts with a purpose. A great one goes further, producing data you can act on to improve how customers respond to your products or services.

Net Promoter, Net Promoter System, Net Promoter Score, NPS, and the NPS-related emoticons are registered trademarks of Bain & Company, Inc., Fred Reichheld, and Satmetrix Systems, Inc.

Editor's note: This post was originally published in July 2018 and has been updated for comprehensiveness.


Survey Research | Definition, Examples & Methods

Published on August 20, 2019 by Shona McCombes . Revised on June 22, 2023.

Survey research means collecting information about a group of people by asking them questions and analyzing the results. To conduct an effective survey, follow these six steps:

  • Determine who will participate in the survey
  • Decide the type of survey (mail, online, or in-person)
  • Design the survey questions and layout
  • Distribute the survey
  • Analyze the responses
  • Write up the results

Surveys are a flexible method of data collection that can be used in many different types of research .

Table of contents

  • What are surveys used for?
  • Step 1: Define the population and sample
  • Step 2: Decide on the type of survey
  • Step 3: Design the survey questions
  • Step 4: Distribute the survey and collect responses
  • Step 5: Analyze the survey results
  • Step 6: Write up the survey results
  • Other interesting articles
  • Frequently asked questions about surveys

Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.

Common uses of survey research include:

  • Social research : investigating the experiences and characteristics of different social groups
  • Market research : finding out what customers think about products, services, and companies
  • Health research : collecting data from patients about symptoms and treatments
  • Politics : measuring public opinion about parties and policies
  • Psychology : researching personality traits, preferences and behaviours

Surveys can be used in both cross-sectional studies , where you collect data just once, and in longitudinal studies , where you survey the same sample several times over an extended period.


Step 1: Define the population and sample

Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.

Populations

The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:

  • The population of Brazil
  • US college students
  • Second-generation immigrants in the Netherlands
  • Customers of a specific company aged 18-24
  • British transgender women over the age of 50

Your survey should aim to produce results that can be generalized to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.

Several common research biases can arise if your survey is not generalizable, particularly sampling bias and selection bias. The presence of these biases has serious repercussions for the validity of your results.

It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every college student in the US. Instead, you will usually survey a sample from the population.

The sample size you need depends mainly on how precise you want your results to be (your margin of error and confidence level) and, to a lesser extent, on how big the population is. You can use an online sample size calculator to work out how many responses you need.
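To illustrate what such a calculator typically does, here is a short Python sketch of the widely used Cochran formula with a finite population correction; the 95% confidence level, 5% margin of error, and population size are assumptions chosen for the example.

```python
import math

def required_sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Cochran's formula with a finite population correction.
    z is the z-score for the confidence level (1.96 for ~95%); p is the expected proportion."""
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)  # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))    # adjust for a finite population

print(required_sample_size(population=10_000))  # about 370 responses needed
```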

There are many sampling methods that allow you to generalize to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions. Again, beware of various types of sampling bias as you design your sample, particularly self-selection bias , nonresponse bias , undercoverage bias , and survivorship bias .

Step 2: Decide on the type of survey

There are two main types of survey:

  • A questionnaire , where a list of questions is distributed by mail, online or in person, and respondents fill it out themselves.
  • An interview , where the researcher asks a set of questions by phone or in person and records the responses.

Which type you choose depends on the sample size and location, as well as the focus of the research.

Questionnaires

Sending out a paper survey by mail is a common method of gathering demographic information (for example, in a government census of the population).

  • You can easily access a large sample.
  • You have some control over who is included in the sample (e.g. residents of a specific region).
  • The response rate is often low, and results are at risk of biases such as self-selection bias.

Online surveys are a popular choice for students doing dissertation research , due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms .

  • You can quickly access a large sample without constraints on time or location.
  • The data is easy to process and analyze.
  • The anonymity and accessibility of online surveys mean you have less control over who responds, which can lead to biases like self-selection bias .

If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping mall or ask all students to complete a questionnaire at the end of a class.

  • You can screen respondents to make sure only people in the target population are included in the sample.
  • You can collect time- and location-specific data (e.g. the opinions of a store’s weekday customers).
  • The sample size will be smaller, so this method is less suitable for collecting data on broad populations and is at risk for sampling bias .

Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.

  • You have personal contact with respondents, so you know exactly who will be included in the sample in advance.
  • You can clarify questions and ask for follow-up information when necessary.
  • The lack of anonymity may cause respondents to answer less honestly, and there is more risk of researcher bias.

Like questionnaires, interviews can be used to collect quantitative data: the researcher records each response as a category or rating and statistically analyzes the results. But they are more commonly used to collect qualitative data : the interviewees’ full responses are transcribed and analyzed individually to gain a richer understanding of their opinions and feelings.

Step 3: Design the survey questions

Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:

  • The type of questions
  • The content of the questions
  • The phrasing of the questions
  • The ordering and layout of the survey

Open-ended vs closed-ended questions

There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.

Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:

  • A binary answer (e.g. yes/no or agree/disagree )
  • A scale (e.g. a Likert scale with five points ranging from strongly agree to strongly disagree )
  • A list of options with a single answer possible (e.g. age categories)
  • A list of options with multiple answers possible (e.g. leisure interests)

Closed-ended questions are best for quantitative research . They provide you with numerical data that can be statistically analyzed to find patterns, trends, and correlations .
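For instance, here is a minimal Python sketch (with made-up ratings) of checking whether answers to two closed-ended rating questions are correlated.

```python
import numpy as np

# Hypothetical paired answers to two closed-ended questions on a 1-5 scale.
ease_of_use = np.array([5, 4, 4, 3, 5, 2, 4])
satisfaction = np.array([5, 5, 4, 3, 4, 2, 5])

# Pearson correlation: values close to +1 mean the two ratings tend to move together.
r = np.corrcoef(ease_of_use, satisfaction)[0, 1]
print(f"Correlation between ease of use and satisfaction: {r:.2f}")
```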

Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.

Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.

The content of the survey questions

To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.

When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an “other” field.

Phrasing the survey questions

In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic. Avoid jargon or industry-specific terminology.

Survey questions are at risk for biases like social desirability bias , the Hawthorne effect , or demand characteristics . It’s critical to use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no indication that you’d prefer a particular answer or emotion.

Ordering the survey questions

The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.

If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.

If a question refers back to or depends on the answer to a previous question, they should be placed directly next to one another.

Step 4: Distribute the survey and collect responses

Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.

When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by mail, online, or in person.

Step 5: Analyze the survey results

There are many methods of analyzing the results of your survey. First you have to process the data, usually with the help of a computer program to sort all the responses. You should also clean the data by removing incomplete or incorrectly completed responses.

If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organizing them into categories or themes. You can also use more qualitative methods, such as thematic analysis , which is especially suitable for analyzing interviews.

Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.
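If you work in Python rather than a dedicated statistics package, the same basic steps can be covered with pandas. The sketch below assumes a hypothetical CSV export of responses with "age_group" and "satisfaction" columns; it drops incomplete rows, filters out-of-range ratings, and produces simple descriptive statistics per group.

```python
import pandas as pd

# Hypothetical export of survey responses; file name and column names are assumptions.
responses = pd.read_csv("survey_responses.csv")

# Clean the data: drop incomplete rows and ratings outside the assumed 1-10 scale.
clean = responses.dropna(subset=["age_group", "satisfaction"])
clean = clean[clean["satisfaction"].between(1, 10)]

# Descriptive statistics overall and broken down by respondent group.
print(clean["satisfaction"].describe())
print(clean.groupby("age_group")["satisfaction"].agg(["count", "mean", "median"]))
```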

Step 6: Write up the survey results

Finally, when you have collected and analyzed all the necessary data, you will write it up as part of your thesis, dissertation, or research paper.

In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.

Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyze it. In the results section, you summarize the key results from your analysis.

In the discussion and conclusion , you give your explanations and interpretations of these results, answer your research question, and reflect on the implications and limitations of the research.

If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.

  • Student’s t-distribution
  • Normal distribution
  • Null and Alternative Hypotheses
  • Chi square tests
  • Confidence interval
  • Quartiles & Quantiles
  • Cluster sampling
  • Stratified sampling
  • Data cleansing
  • Reproducibility vs Replicability
  • Peer review
  • Prospective cohort study

Research bias

  • Implicit bias
  • Cognitive bias
  • Placebo effect
  • Hawthorne effect
  • Hindsight bias
  • Affect heuristic
  • Social desirability bias

Frequently asked questions about surveys

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analyzing data from people using questionnaires.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviors. It is made up of 4 or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey , you present participants with Likert-type questions or statements, and a continuum of items, usually with 5 or 7 possible responses, to capture their degree of agreement.

Individual Likert-type questions are generally considered ordinal data , because the items have clear rank order, but don’t have an even distribution.

Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.

The type of data determines what statistical tests you should use to analyze your data.
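As a concrete illustration of turning individual Likert items into an overall scale score, here is a small Python sketch; the five items, the 1 to 5 coding, and the reverse-keyed item are assumptions made for the example.

```python
# Hypothetical Likert scale: five statements scored 1 (strongly disagree) to 5 (strongly agree).
# Reverse-keyed (negatively worded) items are flipped so higher always means more of the trait.
def likert_scale_score(item_scores, reverse_keyed=(), points=5):
    adjusted = [
        (points + 1 - score) if i in reverse_keyed else score
        for i, score in enumerate(item_scores)
    ]
    return sum(adjusted) / len(adjusted)  # overall score, often treated as interval data

one_respondent = [4, 5, 2, 4, 3]  # answers to the five statements
print(likert_scale_score(one_respondent, reverse_keyed={2}))  # third item is negatively worded -> 4.0
```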

The priorities of a research design can vary depending on the field, but you usually have to specify:

  • Your research questions and/or hypotheses
  • Your overall approach (e.g., qualitative or quantitative )
  • The type of design you’re using (e.g., a survey , experiment , or case study )
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods (e.g., questionnaires , observations)
  • Your data collection procedures (e.g., operationalization , timing and data management)
  • Your data analysis methods (e.g., statistical tests  or thematic analysis )

