A Comprehensive Guide to Survey Research Methodologies

For decades, researchers and businesses have used survey research to produce statistical data and explore ideas. The survey process is simple: ask questions and analyze the responses to make decisions. Data is what makes the difference between a valid and an invalid statement, and as the American statistician W. Edwards Deming said:

“Without data, you’re just another person with an opinion.” - W. Edwards Deming

In this article, we will discuss what survey research is, its brief history, types, common uses, benefits, and the step-by-step process of designing a survey.

What is Survey Research?

A survey is a research method that is used to collect data from a group of respondents in order to gain insights and information regarding a particular subject. It’s an excellent method to gather opinions and understand how and why people feel a certain way about different situations and contexts.

Brief History of Survey Research

Survey research may have its roots in the American and English “social surveys” conducted around the turn of the 20th century. These surveys were mainly conducted by researchers and reformers to document the extent of social issues such as poverty. (1) Despite being a relatively young field compared to many scientific domains, survey research has already passed through three stages of development (2):

- First Era (1930-1960)
- Second Era (1960-1990)
- Third Era (1990 onwards)

Over the years, survey research adapted to the changing times and technologies. By exploiting the latest technologies, researchers can gain access to the right population from anywhere in the world, analyze the data like never before, and extract useful information.

Survey Research Methods & Types

Survey research can be classified into seven categories based on objective, concept testing, data source, research method, deployment method, distribution, and frequency of deployment.


Surveys based on Objective

Exploratory Survey Research

Exploratory survey research aims to dive deeper into research subjects and find out more about their context. It's important for marketing and business strategy, and the focus is on discovering ideas and insights rather than gathering statistical data.

Generally, exploratory survey research is composed of open-ended questions that allow respondents to express their thoughts and perspectives. The final responses present information from various sources that can lead to fresh initiatives.

Predictive Survey Research

Predictive survey research is also called causal survey research. It’s preplanned, structured, and quantitative in nature. It’s often referred to as conclusive research as it tries to explain the cause-and-effect relationship between different variables. The objective is to understand which variables are causes and which are effects and the nature of the relationship between both variables.

Descriptive Survey Research

Descriptive survey research is largely observational and is ideal for gathering numeric data. Due to its quantitative nature, it’s often compared to exploratory survey research. The difference between the two is that descriptive research is structured and pre-planned.

The idea behind descriptive research is to describe the mindset and opinions of a particular group of people on a given subject. The questions are typically multiple choice, and respondents must choose from predefined categories. With predefined choices you don't get unique insights, but rather statistically inferable data.

Survey Research Types based on Concept Testing

Monadic Concept Testing

Monadic testing is a survey research methodology in which the respondents are split into multiple groups and each group is asked questions about a separate concept in isolation. Generally, monadic surveys are hyper-focused on a particular concept and short in duration. The important thing in monadic surveys is to avoid getting off-topic or exhausting the respondents with too many questions.

Sequential Monadic Concept Testing

Another approach to monadic testing is sequential monadic testing. In sequential monadic surveys, groups of respondents are still surveyed in isolation. However, instead of surveying three groups on three different concepts, the researchers survey the same group of people on three distinct concepts one after another. In a sequential monadic survey, at least two concepts are included (in random order), and the same questions are asked for each concept to eliminate bias.
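To make the difference concrete, the two designs can be sketched in a few lines of code. This is a minimal illustration of the assignment logic, not any particular platform's API; the respondent IDs and concept names are made up:

```python
import random

def assign_monadic(respondents, concepts, seed=42):
    """Monadic: each respondent lands in exactly one group, one concept per group."""
    rng = random.Random(seed)
    shuffled = respondents[:]
    rng.shuffle(shuffled)
    # Deal respondents round-robin so group sizes stay balanced.
    return {c: shuffled[i::len(concepts)] for i, c in enumerate(concepts)}

def assign_sequential_monadic(respondents, concepts, seed=42):
    """Sequential monadic: every respondent sees all concepts, in random order."""
    rng = random.Random(seed)
    return {r: rng.sample(concepts, len(concepts)) for r in respondents}

respondents = [f"R{i:02d}" for i in range(1, 13)]  # hypothetical respondent IDs
concepts = ["Concept A", "Concept B", "Concept C"]

groups = assign_monadic(respondents, concepts)        # three groups of four
orders = assign_sequential_monadic(respondents, concepts)
```

In the monadic design the groups partition the sample, so each respondent rates one concept only; in the sequential design the same people rate every concept, with the order randomized to reduce order bias.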

Based on Data Source

Primary Data

Data obtained directly from the source or target population is referred to as primary survey data. When it comes to primary data collection, researchers usually devise a set of questions and invite people with knowledge of the subject to respond. The main sources of primary data are interviews, questionnaires, surveys, and observation methods.

 Compared to secondary data, primary data is gathered from first-hand sources and is more reliable. However, the process of primary data collection is both costly and time-consuming.

Secondary Data

Survey research is generally used to collect first-hand information from respondents. However, surveys can also be designed to collect and process secondary data: data collected in the past by third parties or from primary sources.

This type of data is usually generic, readily available, and cheaper than primary data collection. Some common sources of secondary data are books, data collected from older surveys, online data, and data from government archives. Beware that you might compromise the validity of your findings if you end up with irrelevant or inflated data.

Based on Research Method

Quantitative Research

Quantitative research is a popular research methodology that is used to collect numeric data in a systematic investigation. It's frequently used in research contexts where statistical data is required, such as the natural or social sciences. Quantitative research methods include polls, systematic observations, and face-to-face interviews.

Qualitative Research

Qualitative research is a research methodology where you collect non-numeric data from research participants. In this context, the participants are not restricted to a specific system and provide open-ended information. Some common qualitative research methods include focus groups, one-on-one interviews, observations, and case studies.

Based on Deployment Method

Online Surveys

With technology advancing rapidly, the most popular method of survey research is an online survey. With the internet, you can not only reach a broader audience but also design and customize a survey and deploy it from anywhere. Online surveys have outperformed offline survey methods as they are less expensive and allow researchers to easily collect and analyze data from a large sample.

Paper or Print Surveys

As the name suggests, paper or print surveys use the traditional paper and pencil approach to collect data. Before the invention of computers, paper surveys were the survey method of choice.

Though many would assume that surveys are no longer conducted on paper, it's still a reliable method of collecting information during field research and data collection. However, unlike online surveys, paper surveys are expensive and require extra human resources.

Telephonic Surveys

Telephonic surveys are conducted over the telephone, with a researcher asking a series of questions to the respondent on the other end. Contacting respondents by telephone requires less effort and fewer human resources, and is less expensive.

What makes telephonic surveys debatable is that people are often reluctant to give information over a phone call. Additionally, the success of such surveys depends largely on whether people are willing to invest their time in answering questions on a phone call.

One-on-one Surveys

One-on-one surveys, also known as face-to-face surveys, are interviews where the researcher and the respondent meet in person. Interacting directly with the respondent introduces the human factor into the survey.

Face-to-face interviews are useful when the researcher wants to discuss something personal with the respondent. The response rates in such surveys are always higher as the interview is being conducted in person. However, these surveys are quite expensive, and their success depends on the knowledge and experience of the researcher.

Based on Distribution

Email Surveys

The easiest and most common way of conducting online surveys is sending them out via email. Surveys sent via email have a higher response rate, as your target audience already knows about your brand and is likely to engage.

Buy Survey Responses

Purchasing survey responses also yields higher response rates, as the respondents have signed up to take surveys. Businesses often purchase survey samples to conduct extensive research. Here, the target audience is often pre-screened to check that they're qualified to take part in the research.

Embedding Survey on a Website

Embedding surveys on a website is another excellent way to collect information. It allows your website visitors to take part in a survey without ever leaving the website, and the survey can be triggered while a person is entering or exiting the site.

Post the Survey on Social Media

Social media is an excellent medium for reaching a broad range of audiences. You can publish your survey as a link on social media, and people who follow the brand can take part and answer questions.

Based on Frequency of Deployment

Cross-Sectional Studies

Cross-sectional studies are administered to a small sample from a large population within a short period of time. This provides researchers a peek into what the respondents are thinking at a given time. The surveys are usually short, precise, and specific to a particular situation.

Longitudinal Surveys

Longitudinal surveys are an extension of cross-sectional studies where researchers make an observation and collect data over extended periods of time. This type of survey can be further divided into three types:

- Trend surveys allow researchers to understand changes in respondents' thought processes over time.
- Panel surveys are administered to the same group of people over multiple years. They are usually expensive, and researchers must stick with their panel to gather unbiased opinions.
- In cohort surveys, researchers identify a specific category of people and survey them regularly. Unlike panel surveys, the same people do not need to take part over the years, but each individual must fall into the researcher's primary interest category.

Retrospective Survey

Retrospective surveys allow researchers to ask questions that gather data about respondents' past events and beliefs. Like longitudinal surveys, they cover events spanning years, but they are shorter and less expensive to conduct.

Why Should You Conduct Research Surveys?

“In God we trust. All others must bring data” - W. Edwards Deming

In the information age, survey research is of utmost importance and essential for understanding the opinion of your target population. Whether you're launching a new product or conducting a social survey, surveys can be used to collect specific information from a defined set of respondents. Organizations can then use the collected data to make informed decisions.

Furthermore, compared to other research methods, surveys are relatively inexpensive, even if you're giving out incentives. Compared to older methods such as telephonic or paper surveys, online surveys cost less and yield more responses.

What makes surveys useful is that they describe the characteristics of a large population. With a larger sample size, you can expect more accurate results. However, accurate results also require honest and open answers. Since surveys are anonymous and the responses remain confidential, respondents are more likely to provide candid answers.

Common Uses of a Survey

Surveys are widely used in many sectors, but the most common uses of the survey research include:

- Market research: surveying a potential market to understand customer needs, preferences, and market demand.
- Customer satisfaction: finding out your customers' opinions about your services, products, or company.
- Social research: investigating the characteristics and experiences of various social groups.
- Health research: collecting data about patients' symptoms and treatments.
- Politics: evaluating public opinion regarding policies and political parties.
- Psychology: exploring personality traits, behaviors, and preferences.

6 Steps to Conduct Survey Research

An organization, person, or company conducts a survey when they need the information to make a decision but have insufficient data on hand. Following are six simple steps that can help you design a great survey.

Step 1: Objective of the Survey

The first step in survey research is defining an objective. The objective helps you define your target population and samples. The target population is the specific group of people you want to collect data from and since it’s rarely possible to survey the entire population, we target a specific sample from it. Defining a survey objective also benefits your respondents by helping them understand the reason behind the survey.

Step 2: Number of Questions

The number of questions, or the length of the survey, depends on the survey objective. However, it's important to ensure that there are no redundant queries and that the questions are in a logical order. Rephrased and repeated questions in a survey are almost as frustrating as in real life. For a higher completion rate, keep the questionnaire short so that the respondents stay engaged to the very end. The ideal length of an interview is less than 15 minutes. (2)

Step 3: Language and Voice of Questions

While designing a survey, you may feel compelled to use fancy language. However, remember that difficult language is associated with higher survey dropout rates. You need to speak to the respondent in a clear, concise, and neutral manner, and ask simple questions. If your survey respondents are bilingual, then adding an option to translate your questions into another language can also prove beneficial.

Step 4: Type of Questions

In a survey, you can include any type of question, both closed-ended and open-ended. However, opt for the question types that are easiest for respondents to understand and that offer the most value. For example, compared to open-ended questions, people prefer to answer closed-ended questions such as MCQs (multiple-choice questions) and NPS (net promoter score) questions.
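To illustrate how a closed-ended question type is scored, take NPS: respondents rate likelihood to recommend on a 0-10 scale, and the score is the percentage of promoters (ratings 9-10) minus the percentage of detractors (ratings 0-6). A minimal sketch with made-up ratings:

```python
def nps(ratings):
    """Net promoter score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical answers to "How likely are you to recommend us?" (0-10)
ratings = [10, 9, 8, 7, 9, 3, 10, 6, 8, 9]
print(nps(ratings))  # 5 promoters, 2 detractors out of 10 -> 30.0
```

Ratings of 7-8 count as passives: they dilute the score but are neither added nor subtracted.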

Step 5: User Experience

Designing a great survey is about more than just questions. A lot of researchers underestimate the importance of user experience and how it affects their response and completion rates. An inconsistent, difficult-to-navigate survey with technical errors and poor color choice is unappealing for the respondents. Make sure that your survey is easy to navigate for everyone and if you’re using rating scales, they remain consistent throughout the research study.

Additionally, don't forget to design a good survey experience for both mobile and desktop users. According to Pew Research Center, nearly half of smartphone users access the internet mainly from their mobile phones, and 14 percent of American adults are smartphone-only internet users. (3)

Step 6: Survey Logic

Last but not least, logic is another critical aspect of survey design. If the survey logic is flawed, respondents won't be guided in the right direction. Make sure to test the logic to ensure that selecting one answer leads to the next logical question instead of a series of unrelated queries.
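Most survey tools express this as skip (branch) logic: each answer determines the next question shown. A minimal sketch of the idea, with hypothetical question IDs and wording (not any specific product's logic format):

```python
# Each question lists its text and, per answer, the next question ID
# (None ends the survey). Question IDs and wording are hypothetical.
SURVEY = {
    "q1": {"text": "Do you own a smartphone?",
           "next": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "Which OS does it run?",
           "next": {"android": "q3", "ios": "q3"}},
    "q3": {"text": "How often do you shop online?",
           "next": {"weekly": None, "monthly": None, "never": None}},
}

def run_path(answers, start="q1"):
    """Follow the branch logic for {question_id: answer}; return visited IDs."""
    path, q = [], start
    while q is not None:
        path.append(q)
        q = SURVEY[q]["next"][answers[q]]
    return path

# A respondent without a smartphone skips the OS question entirely.
print(run_path({"q1": "no", "q3": "never"}))  # ['q1', 'q3']
```

Testing the logic means walking every answer combination like this and checking that each path is sensible and terminates.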

How to Effectively Use Survey Research with Starlight Analytics

Designing and conducting a survey is almost as much science as it is art. To craft great survey research, you need technical skills, an understanding of the psychological elements at play, and a broad knowledge of marketing.

The ultimate goal of the survey is to ask the right questions in the right manner to acquire the right results.

Bringing a new product to the market is a long process and requires a lot of research and analysis. In your journey to gather information or ideas for your business, Starlight Analytics can be an excellent guide. Starlight Analytics' product concept testing helps you measure your product's market demand and refine product features and benefits so you can launch with confidence. The process starts with custom research to design the survey according to your needs, followed by executing the survey and delivering the key insights on time.

1. Survey research in the United States: roots and emergence, 1890-1960. https://searchworks.stanford.edu/view/10733873
2. How to create a survey questionnaire that gets great responses. https://luc.id/knowledgehub/how-to-create-a-survey-questionnaire-that-gets-great-responses/
3. Internet/broadband fact sheet. https://www.pewresearch.org/internet/fact-sheet/internet-broadband/


Survey Methodology: How to reach your Audience


When conducting research and collecting data, it is critical to get accurate information. However, you also need the right sample size to ensure that you are drawing reliable conclusions. There are currently many ways to reach your target audience. Let's talk about survey methodology.

The main difference between methodologies is cost, which depends on the type of data you want to collect and the number of responses you need for your specific research. Respondent behavior has also changed over time, and different factors should be considered when choosing your survey methodology.

What is a Survey Methodology?

A survey methodology is a technique that is carried out by applying a questionnaire to a sample of people. Surveys provide information on citizens’ opinions, attitudes, and behaviors, and there’s more than one method to conduct them. In fact, there’s a whole universe to learn from in the surveys world.


Survey Methodology Examples

The most common channels to contact respondents are in-person, telephone, mail, and online. Each of these methods has strengths and weaknesses that should be considered.

In-person surveys

These are conducted with an interviewer meeting with the respondent directly. This survey can be in a panel setting, where respondents meet the interviewers in a central location or where the interviewer goes to a product or service location to find customers interested in completing a survey.

The format allows the customer to give open-ended feedback, which a skilled interviewer can put into quantitative values. The interviewer will be able to explain the content of the survey and read non-verbal cues from the customer. This allows the customer to give additional information to the researcher. The feedback saves time in categorizing and interpreting responses later in the research process.

It is the most intimate survey methodology, which makes it a good option for discussing more sensitive topics. It is also the most expensive option: gathering a large number of responses means covering travel, setting up in a physical location, and training interviewers to manage and record a survey in this way.

Telephone interviews

These surveys are a way to follow the in-person interview format, with less cost for travel and the ability to reach people more easily. One interviewer will be able to contact more respondents by telephone. While the interviewer can’t read visual cues, they will still be able to explain the survey content and collect more open-ended feedback from the respondent.

However, currently, people are less likely to answer the phone for numbers they don’t recognize and are even less likely to give out information to anyone. This will restrict your sample size while still having costs and time for interviewers to attempt to contact people.

Paper surveys by mail

Surveys sent by mail can be completed at the respondents' convenience. They are a great way to reach many potential respondents at much lower cost, and without feeling as intrusive as a phone call can. Because the contact method is less personal, it is especially important that the survey content is planned out so that the questions are clear to the respondent.

This method requires good address records to ensure they end up in the right person’s hands, and unlike an interview, respondents may put off replying and forget to send back a response. If done correctly, this will result in answers that researchers will be able to use when analyzing the data.

This is an excellent method for primarily quantitative data when you want a large sample size. Since they are less personal and responses are sent back by mail, respondents may also be less willing to reply on more sensitive topics.

Online survey methodology for collecting information

Online survey methodology is prevalent now. An online survey can reach a wide audience and can be completed at the respondents' convenience. A short survey can even be conducted while they are waiting for a site to load or for another activity to complete. Similar to email surveys, an online survey has to be written so that a respondent can follow the instructions without an interviewer to help them, and it should focus on quantitative feedback.

When collecting data online, the answers can be stored directly in a database, saving the cost of tabulating paper responses. Online surveys are cost-effective compared to paper and telephonic methods, and they are global: individuals can take them over the internet as well as through offline means.

Online surveys are also scalable: you can reach larger sample sizes far more easily than with paper-based or telephonic surveys, and collect responses faster. However, depending on the links and keywords used in a survey, the invitation may be flagged as junk, which cuts into your sampling or target size.

Online surveys are ignored at times as well: respondents are bombarded with them, and important surveys can get lost in the noise. One of the major disadvantages of online surveys is their inability to reach people residing in far-off, remote locations with no access to the internet. Older people who do not have internet access, or who are not proficient with online platforms, are likewise difficult to reach with web-based surveys.

An important area where online surveys lag behind is capturing how respondents really feel about the problems mentioned in the survey. Being completely online, there is no way to notice participants' facial expressions or body language. All you have is the responses themselves to decode what customers really feel.


The future of Survey Methodology

Modern technology means that many respondents are already spending more time with devices connected to the internet, which makes it an excellent contact channel for getting survey responses. Since it is cost-effective and efficient to focus on a smaller number of contact channels, it is critical to understand the strengths and weaknesses of online surveys, including how they affect the demographics of your respondents.

Finding the best way to reach your audience online is important to get the responses needed to conduct meaningful research. Ensure that you have proper contact information, whether that is ensuring that companies are keeping accurate email databases or that web portals for online panels are up to date. Pay attention to respondent feedback to ensure that you are keeping their attention long enough to get responses and that they are motivated to give true responses in an impersonal environment.

You will be competing with many other sources that are trying to get your respondents’ attention online, so ensure you are contacting them in a way where it is clear why your survey should be important to them. With market research to know how to best target your audience, you have the opportunity to collect a large amount of data in an efficient manner.


Authors: Devendra Joag & Davin Moloney



Doing Survey Research | A Step-by-Step Guide & Examples

Published on 6 May 2022 by Shona McCombes. Revised on 10 October 2022.

Survey research means collecting information about a group of people by asking them questions and analysing the results. To conduct an effective survey, follow these six steps:

  • Determine who will participate in the survey
  • Decide the type of survey (mail, online, or in-person)
  • Design the survey questions and layout
  • Distribute the survey
  • Analyse the responses
  • Write up the results

Surveys are a flexible method of data collection that can be used in many different types of research.

Table of contents

  • What are surveys used for?
  • Step 1: Define the population and sample
  • Step 2: Decide on the type of survey
  • Step 3: Design the survey questions
  • Step 4: Distribute the survey and collect responses
  • Step 5: Analyse the survey results
  • Step 6: Write up the survey results
  • Frequently asked questions about surveys

What are surveys used for?

Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.

Common uses of survey research include:

  • Social research: Investigating the experiences and characteristics of different social groups
  • Market research: Finding out what customers think about products, services, and companies
  • Health research: Collecting data from patients about symptoms and treatments
  • Politics: Measuring public opinion about parties and policies
  • Psychology: Researching personality traits, preferences, and behaviours

Surveys can be used in both cross-sectional studies, where you collect data just once, and longitudinal studies, where you survey the same sample several times over an extended period.


Step 1: Define the population and sample

Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.

Populations

The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:

  • The population of Brazil
  • University students in the UK
  • Second-generation immigrants in the Netherlands
  • Customers of a specific company aged 18 to 24
  • British transgender women over the age of 50

Your survey should aim to produce results that can be generalised to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.

It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every university student in the UK. Instead, you will usually survey a sample from the population.

The sample size depends on how big the population is. You can use an online sample calculator to work out how many responses you need.

There are many sampling methods that allow you to generalise to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions.
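Sample-size calculators of the kind mentioned above commonly use Cochran's formula with a finite-population correction. A sketch, assuming a 95% confidence level (z ≈ 1.96), a 5% margin of error, and the conservative proportion p = 0.5:

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Cochran's formula with finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)          # correct for finite population
    return math.ceil(n)

print(sample_size(1_000))      # ~278 responses for a population of 1,000
print(sample_size(1_000_000))  # ~385 -- barely grows once population is large
```

Note how weakly the required sample grows with population size: a population of one million needs barely more responses than a population of a few thousand.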

Step 2: Decide on the type of survey

There are two main types of survey:

  • A questionnaire, where a list of questions is distributed by post, online, or in person, and respondents fill it out themselves
  • An interview, where the researcher asks a set of questions by phone or in person and records the responses

Which type you choose depends on the sample size and location, as well as the focus of the research.

Questionnaires

Sending out a paper survey by post is a common method of gathering demographic information (for example, in a government census of the population).

  • You can easily access a large sample.
  • You have some control over who is included in the sample (e.g., residents of a specific region).
  • The response rate is often low.

Online surveys are a popular choice for students doing dissertation research, due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms.

  • You can quickly access a large sample without constraints on time or location.
  • The data is easy to process and analyse.
  • The anonymity and accessibility of online surveys mean you have less control over who responds.

If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping centre or ask all students to complete a questionnaire at the end of a class.

  • You can screen respondents to make sure only people in the target population are included in the sample.
  • You can collect time- and location-specific data (e.g., the opinions of a shop’s weekday customers).
  • The sample size will be smaller, so this method is less suitable for collecting data on broad populations.

Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.

  • You have personal contact with respondents, so you know exactly who will be included in the sample in advance.
  • You can clarify questions and ask for follow-up information when necessary.
  • The lack of anonymity may cause respondents to answer less honestly, and there is more risk of researcher bias.

Like questionnaires, interviews can be used to collect quantitative data: the researcher records each response as a category or rating and statistically analyses the results. But they are more commonly used to collect qualitative data: the interviewees’ full responses are transcribed and analysed individually to gain a richer understanding of their opinions and feelings.
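Tallying interview responses that have been recorded as categories can be as simple as counting. A small sketch, with made-up response categories:

```python
from collections import Counter

# Hypothetical coded responses: each interviewee's answer was
# recorded as one category by the researcher.
responses = ["satisfied", "neutral", "satisfied", "dissatisfied",
             "satisfied", "neutral"]

counts = Counter(responses)
total = len(responses)
for category, n in counts.most_common():
    print(f"{category}: {n} ({n / total:.0%})")
```

From frequencies like these you can move on to cross-tabulations or statistical tests, depending on your research questions.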

Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:

  • The type of questions
  • The content of the questions
  • The phrasing of the questions
  • The ordering and layout of the survey

Open-ended vs closed-ended questions

There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.

Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:

  • A binary answer (e.g., yes/no or agree/disagree)
  • A scale (e.g., a Likert scale with five points ranging from strongly agree to strongly disagree)
  • A list of options with a single answer possible (e.g., age categories)
  • A list of options with multiple answers possible (e.g., leisure interests)

Closed-ended questions are best for quantitative research. They provide you with numerical data that can be statistically analysed to find patterns, trends, and correlations.

Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.

Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.

The content of the survey questions

To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.

When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an ‘other’ field.

Phrasing the survey questions

In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic.

Use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no bias towards one answer or another.

Ordering the survey questions

The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.

If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.

If a question refers back to or depends on the answer to a previous question, they should be placed directly next to one another.

Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.

When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by post, online, or in person.

There are many methods of analysing the results of your survey. First you have to process the data, usually with the help of a computer program to sort all the responses. You should also cleanse the data by removing incomplete or incorrectly completed responses.
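As a minimal sketch of this cleansing step (the question names, scale, and validity rule here are hypothetical), you might drop any response with a missing or out-of-range answer:

```python
# Hypothetical raw responses: None marks a question left unanswered.
raw_responses = [
    {"id": 1, "q1": 4, "q2": 5, "q3": 3},
    {"id": 2, "q1": 2, "q2": None, "q3": 1},  # incomplete -> drop
    {"id": 3, "q1": 5, "q2": 4, "q3": 6},     # 6 is outside the 1-5 scale -> drop
    {"id": 4, "q1": 3, "q2": 3, "q3": 4},
]

def is_valid(response, questions=("q1", "q2", "q3"), scale=range(1, 6)):
    """Keep only responses where every question was answered on the 1-5 scale."""
    return all(response[q] in scale for q in questions)

clean = [r for r in raw_responses if is_valid(r)]
print([r["id"] for r in clean])  # -> [1, 4]
```

In practice you would also record how many responses were excluded and why, since this affects your response rate.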

If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organising them into categories or themes. You can also use more qualitative methods, such as thematic analysis, which is especially suitable for analysing interviews.
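A very simple keyword-based coding pass might look like the following sketch. A real coding frame would be developed from the data itself; the keywords and labels here are purely illustrative:

```python
# Hypothetical coding frame: keyword -> theme label
coding_frame = {
    "price": "cost",
    "expensive": "cost",
    "staff": "service",
    "helpful": "service",
    "wait": "speed",
}

def code_response(text):
    """Assign every matching theme label to a free-text response."""
    text = text.lower()
    themes = {label for keyword, label in coding_frame.items() if keyword in text}
    return sorted(themes) or ["uncoded"]

print(code_response("The staff were helpful but the wait was long"))
# -> ['service', 'speed']
print(code_response("Too expensive for what you get"))
# -> ['cost']
```

Responses left "uncoded" would be reviewed by hand, and the coding frame refined iteratively.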

Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.

Finally, when you have collected and analysed all the necessary data, you will write it up as part of your thesis, dissertation, or research paper.

In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.

Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyse it. In the results section, you summarise the key results from your analysis.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviours. It is made up of four or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey, you present participants with Likert-type questions or statements and a continuum of response options, usually five or seven, to capture their degree of agreement.

Individual Likert-type questions are generally considered ordinal data, because the items have a clear rank order but the intervals between the points cannot be assumed to be equal.

Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.

The type of data determines what statistical tests you should use to analyse your data.
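As a small illustration of turning individual Likert items into an overall scale score (the item names and the reverse-coded item are hypothetical):

```python
# Hypothetical five-point Likert items (1 = strongly disagree ... 5 = strongly agree)
# measuring a single attitude; item2 is negatively worded and must be reversed.
answers = {"item1": 4, "item2": 2, "item3": 5, "item4": 4}
reverse_coded = {"item2"}
SCALE_MAX = 5

def likert_score(answers):
    """Sum item scores, reversing negatively worded items first."""
    total = 0
    for item, value in answers.items():
        if item in reverse_coded:
            value = SCALE_MAX + 1 - value  # a 2 becomes a 4, and so on
        total += value
    return total

print(likert_score(answers))  # -> 17
```

It is these summed (or averaged) scale scores, not the individual item responses, that are sometimes treated as interval data.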

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analysing data from people using questionnaires.

Cite this Scribbr article


McCombes, S. (2022, October 10). Doing Survey Research | A Step-by-Step Guide & Examples. Scribbr. Retrieved 29 August 2024, from https://www.scribbr.co.uk/research-methods/surveys/



How to create an effective survey in 15 simple tips

Updated August 22, 2024

You don’t have to be an expert to create a survey, but by following a few survey best practices you can make sure you’re collecting the best data possible.


From working out what you want to achieve to providing incentives for respondents, survey design can take time.

But when you don’t have hours to devote to becoming a survey-creation guru, a quick guide to the essentials is a great way to get started.

In this article, we’re going to reveal how to create a survey that’s easy to complete, encourages honest feedback, addresses the research questions you’re interested in, and produces data that’s easy to work with at the analysis stage.

15 Tips when creating surveys

1. Define the purpose of the survey

Before you even think about your survey questions , you need to define the survey’s purpose.

The survey’s purpose should be a clear, attainable, and relevant goal. For example, you might want to understand why customer engagement is dropping off during the middle of the sales process.

Your goal could then be something like: “I want to understand the key factors that cause engagement to dip at the middle of the sales process, including both internal and external elements.”

Or maybe you want to understand customer satisfaction post-sale. If so, the goal of your survey could be: “I want to understand how customer satisfaction is influenced by customer service and support post-sale, including through online and offline channels.”

The idea is to come up with a specific, measurable, and relevant goal for your survey. This way you ensure that your questions are tailored to what you want to achieve and that the data captured can be compared against your goal.

2. Make every question count

You’re building your survey questionnaire to obtain important insights, so every question should play a direct role in hitting that target.

Make sure each question adds value and drives survey responses that relate directly to your research goals. For example, if your participant’s precise age or home state is relevant to your results, go ahead and ask. If not, save yourself and your respondents some time and skip it.

It’s best to plan your survey by first identifying the data you need to collect and then writing your questions.

You can also incorporate multiple-choice questions to get a range of responses that provide more detail than a simple yes or no. It’s not always black and white.

For a deeper dive into the art and science of question-writing and survey best practices, check out Survey questions 101 .

3. Keep it short and simple

Although you may be deeply committed to your survey, the chances are that your respondents... aren’t.

As a survey designer, a big part of your job is keeping their attention and making sure they stay focused until the end of the survey.

Respondents are less likely to complete long surveys or surveys that bounce around haphazardly from topic to topic. Make sure your survey follows a logical order and takes a reasonable amount of time to complete.

Although they don’t need to know everything about your research project, it can help to let respondents know why you’re asking about a certain topic. Knowing the basics about who you are and what you’re researching means they’re more likely to keep their responses focused and in scope.


4. Ask direct questions

Vaguely worded survey questions confuse respondents and make your resulting data less useful. Be as specific as possible, and strive for clear and precise language that will make your survey questions easy to answer.

It can be helpful to mention a specific situation or behavior rather than a general tendency. That way you focus the respondent on the facts of their life rather than asking them to consider abstract beliefs or ideas.


Good survey design isn’t just about getting the information you need, but also encouraging respondents to think in different ways.


5. Ask one question at a time

Although it’s important to keep your survey as short and sweet as possible, that doesn’t mean doubling up on questions. Trying to pack too much into a single question can lead to confusion and inaccuracies in the responses.

Take a closer look at questions in your survey that contain the word “and” – it can be a red flag that your question has two parts. For example: “Which of these cell phone service providers has the best customer support and reliability?” This is problematic because a respondent may feel that one service is more reliable, but another has better customer support.
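A rough automated check for this red flag might simply scan question wording for "and" or "or". This is only a heuristic (it will flag harmless uses too), but it can shortlist questions to review by hand:

```python
def flag_double_barreled(questions):
    """Rough heuristic: flag questions that join topics with 'and' or 'or'."""
    flagged = []
    for q in questions:
        words = q.lower().split()
        if "and" in words or "or" in words:
            flagged.append(q)
    return flagged

questions = [
    "Which provider has the best customer support and reliability?",
    "How satisfied are you with our customer support?",
]
print(flag_double_barreled(questions))
# -> ['Which provider has the best customer support and reliability?']
```

Each flagged question can then be split into two single-topic questions.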

6. Avoid leading and biased questions

Although you don’t intend them to, certain words and phrases can introduce bias into your questions or point the respondent in the direction of a particular answer.

As a rule of thumb, when you conduct a survey it’s best to provide only as much wording as a respondent needs to give an informed answer. Keep your question wording focused on the respondent and their opinions, rather than introducing anything that could be construed as a point of view of your own.

In particular, scrutinize adjectives and adverbs in your questions. If they’re not needed, take them out.

7. Speak your respondent's language

This tip goes hand in hand with many others in this guide – it’s about making language only as complex or as detailed as it needs to be when conducting great surveys.

Create surveys that use language and terminology that your respondents will understand. Keep the language as plain as possible, avoid technical jargon and keep sentences short. However, beware of oversimplifying a question to the point that its meaning changes.

8. Use response scales whenever possible

Response scales capture the direction and intensity of attitudes, providing rich data. In contrast, categorical or binary response options, such as true/false or yes/no, generally produce less informative data.

If you’re in the position of choosing between the two, the response scale is likely to be the better option.

Avoid using scales that ask your target audience to agree or disagree with statements, however. Some people are biased toward agreeing with statements, and this can result in invalid and unreliable data.

9. Avoid using grids or matrices for responses

Grids or matrices of answers demand a lot more thinking from your respondent than a scale or multiple choice question. They need to understand and weigh up multiple items at once, and oftentimes they don’t fill in grids accurately or according to their true feelings.

Another pitfall to be aware of is that grid question types aren’t mobile-friendly. It’s better to separate questions with grid responses into multiple questions in your survey with a different structure such as a response scale.


10. Rephrase yes/no questions if possible in online surveys

As we’ve described, yes/no questions provide less detailed data than a response scale or multiple-choice, since they only yield one of two possible answers.

Many yes/no questions can be reworked by including phrases such as “How much,” “How often,” or “How likely.” Make this change whenever possible and include a response scale for richer data.

By rephrasing your questions in this way, your survey results will be far more comprehensive and representative of how your respondents feel.

Next? Find out how to write great questions.

11. Start with the straightforward stuff

Ease your respondent into the survey by asking easy questions at the start of your questionnaire, then moving on to more complex or thought-provoking elements once they’re engaged in the process.

This is especially valuable if you need to cover any potentially sensitive topics in your survey. Never put sensitive questions at the start of the questionnaire where they’re more likely to feel off-putting.

Your respondent will probably become more prone to fatigue and distraction towards the end of the survey, so keep your most complex or contentious questions in the middle of the survey flow rather than saving them until last.

12. Use unbalanced scales with care

Unbalanced response scales and poorly worded questions can mislead respondents.

For example, if you’ve asked them to rate a product or service and you provide a scale that includes “poor”, “satisfactory”, “good” and “excellent”, they could be swayed towards the “excellent” end of the scale because there are more positive options available.

Make sure your response scales have a definitive, neutral midpoint (aim for odd numbers of possible responses) and that they cover the whole range of possible reactions to the question.

13. Consider adding incentives

To increase the number of responses, incentives — discounts, offers, gift cards, or sweepstakes — can prove helpful.

Of course, while the benefits of offering incentives sound appealing (more respondents), there’s the possibility of attracting the opinions of the wrong audiences, such as those who are only in it for the incentive.

With this in mind, make sure you limit your surveys to your target population and carefully assess which incentives would be most valuable to them.

14. Take your survey for a test drive

Want to know how to make a survey a potential disaster? Send it out before you pre-test.

However short or straightforward your questionnaire is, it’s always a good idea to pre-test your survey before you roll it out fully so that you can catch any possible errors before they have a chance to mess up your survey results.

Share your survey with at least five people, so that they can test your survey to help you catch and correct problems before you distribute it.

15. Let us help you

Survey design doesn’t have to be difficult — even less so with the right expertise, digital solutions, and survey templates.

At Qualtrics, we provide survey software that’s used by more than 11,000 of the top brands and 99 of the top business schools worldwide.

Furthermore, we have a library of high-quality, ready-to-use, and easy-to-configure survey templates that can improve your surveys significantly.

You can check out our template marketplace here . As a free or existing customer, you have access to the complete collection and can filter by the core experiences you want to drive.

As for our survey software , it’s completely free to use and powers more than 1 billion surveys a year. Using it, you can get answers to your most important brand, market, customer, and product questions, build your own surveys, get insights from your audience wherever they are, and much, much more.

If you want to learn more about how to use our survey tool to create a survey, as well as what else it can do — check out our blog on how to create a free online survey using Qualtrics .


Sarah Fisher


Nepal J Epidemiol. 2016 Dec; 6(4).

Guide to the design and application of online questionnaire surveys

Pramod R. Regmi

1 Faculty of Health and Social Sciences, Bournemouth University, England, UK.

2 Visiting Research Fellow, Chitwan Medical College (CMC), Tribhuvan University, Nepal.

Elizabeth Waithaka

Anjana Paudyal

3 The School of Environment, Charles Darwin University, Darwin, Australia.

Padam Simkhada

4 Faculty of Education, Health and Community, Liverpool John Moores University, UK.

5 Manmohan Memorial Institute of Health Sciences, Tribhuvan University, Nepal.

6 Nobel College, Pokhara University, Nepal.

Edwin van Teijlingen

Author's Contribution: PR and EvT conceived the idea. PR, AP, EW and PS reviewed the literature and drafted the manuscript. All authors reviewed, edited and agreed on the final version of this manuscript.

Collecting research data through traditional approaches (face-to-face, postal or telephone surveys) can be costly and time consuming. The emerging data collection approach based on internet/e-based technologies (e.g. online platforms and email) is a relatively cost-effective alternative. These novel data collection strategies can gather large amounts of data from participants in a short time frame. They also appear feasible and effective for collecting data on sensitive issues or from samples that are generally hard to reach, for example, men who have sex with men (MSM) or migrants. As a significant proportion of the world's population is now digitally connected, the shift from postal (paper-and-pencil) or telephone surveys towards online surveys is of interest to researchers in academia as well as in the commercial world. However, compared to designing and executing a paper version of a questionnaire, there is limited literature to help a new researcher with the design and use of online questionnaires. This short paper highlights issues around: a) methodological aspects of online questionnaire surveys; b) online survey planning and management; and c) ethical concerns that may arise while using this option. We believe that this paper will be useful for researchers who want to gain knowledge of, or apply, this approach in their research.

Introduction

Questionnaire surveys are a popular data collection method for academic or marketing research in a variety of fields. Face-to-face interviews, telephone interviews and postal surveys are the traditional approaches to conducting questionnaire surveys. However, access to the internet is growing globally [ 1 , 2 ]; in Nepal, a low-income country, for example, internet penetration increased exponentially over the past two decades, from fewer than 50 users in 1995 to 11.9 million users (about 45% of the total population) in 2015 [ 3 ]. With the price of technology devices (e.g. tablet computers, hardware) and software continuing to fall [ 4 ], novel internet-based data collection techniques such as the online questionnaire survey have become popular in recent years [ 5 ]. Qualitative data collection through online focus groups is also emerging [ 6 , 7 ], suggesting that research participants in the digital age can now interact with each other and with the interviewer/facilitator in an online multimedia setting.

Data collection through an online survey appears to have the potential to collect large amounts of data efficiently (i.e. with less error, since there is no need to transfer written data onto a computer), economically (as it requires little human effort to collect or manage data) and within relatively short time frames. The online survey approach is also very useful when collecting data from hard-to-reach populations such as lesbian, gay, bisexual and transgender (LGB&T) people or travellers. Moreover, people with certain conditions, such as HIV, are often hard to access because they are stigmatised offline [ 4 ]. Studying these sub-populations becomes possible through an online survey approach, as invitations can be sent through a range of media and discussion platforms (e.g. social media, discussion fora).

The online survey approach offers convenience in several ways: a) respondents can answer at a convenient time; b) respondents can take as much time as they need to answer questions; c) respondents can complete the survey in multiple sessions. As with paper-based surveys, online questionnaires can accommodate diverse question types (e.g. dichotomous questions, multiple-choice questions, scales), skip questions that are irrelevant for sub-groups in the sample (e.g. no pregnancy questions for men) and even collect answers to open-ended questions (qualitative data) through a free-text box. The online questionnaire can also be constructed to improve the response rate for each item; for example, respondents can be required to answer a question before advancing to the next one. This, however, might create an unfavourable situation for some research participants if they do not want to answer sensitive questions, for example about sexual behaviour or drug use. Unlike a paper postal survey, follow-up is easy through email, which enhances the response rate.

There is substantial evidence that many large cross-country studies have been completed using online questionnaire surveys through popular dedicated platforms (e.g. https://www.surveymonkey.co.uk/, https://www.onlinesurveys.ac.uk/about/ , https://www.qualtrics.com/ ). These platforms allow researchers to deploy and analyse surveys via the web without any advanced technical knowledge. Despite these developments, there is not much research focusing on online surveys or other technology-based survey methodologies, simply because they were introduced only a few years ago.

An online questionnaire survey shares the same characteristics as the paper version of the survey. However, online data collection strategies have specific characteristics (e.g. technological, demographic, response rate) that affect their design and implementation. Online questionnaires can only produce valid and meaningful results if: a) the layout of the questionnaire and all its questions/items are clear and precise; b) they are appropriately executed (for example, completing a survey through a mobile app or via a tablet might attract the younger generation but may not work well with an elderly population); and c) questions are asked consistently across all respondents. Careful consideration therefore needs to be given to the design of an online questionnaire survey. In this paper, we discuss: a) methodological aspects of online questionnaire surveys; b) online survey planning and management; and c) ethical concerns that may arise while using this option.

Methodological components

In developing and operationalising an online questionnaire survey, six methodological components are critical to success: a) user-friendly design and layout; b) selecting survey participants; c) avoiding multiple responses; d) data management; e) ethical issues; and f) piloting tools. These are discussed below.

a) User-friendly design and layout

Generally, the online survey link is promoted through email, websites, social media or online discussion platforms, and potential participants are invited to take part in the survey. Research participants prefer a tool that is easy to follow and complete. The format of the questionnaire should therefore be easy to navigate and require only minimal computer skills to complete. Items should be short, clear and easy for participants to read; elderly people, for example, might need larger fonts. Because participants may be more open to sharing sensitive or personal information, such as age or sex, after completing other questions, sensitive or personal questions should be placed at the end. Dillman [ 8 ] found that visual presentation is essential and strengthens response rates, but it can lead to longer download times for large files (especially in settings where internet speed is slow), and this must be considered. Moreover, as online surveys are generally self-administered, answering instructions must be extremely clear.

b) Selecting survey participants

Easy access to the survey for all participants is essential in any online questionnaire survey [ 9 ]. An online questionnaire may therefore be appropriate only for certain age groups [ 10 ]. For example, an online study of an elderly population would not be appropriate if the proportion of elderly people who access or use the internet is low. Similarly, if the survey link is promoted through social media (e.g. Twitter, Facebook), it might not capture the views of people who do not use social media. In such circumstances the survey should be promoted through other channels, and perhaps other data collection strategies (e.g. telephone or paper survey) should be combined with the online survey. Relatively little can be identified about the background characteristics of people in online communities beyond basic demographic variables, and tracking the non-response rate is not easy in most online surveys [ 5 , 11 ]; it is very likely that participants in online surveys are more experienced internet users or have stronger internet skills. They may be younger, male and from households with fairly high incomes [ 12 ]. However, with modernisation and wide coverage of internet facilities globally (particularly through mobile phones), the gap in internet use has recently decreased in countries like Nepal.

c) Avoiding multiple responses

Another important feature of online survey design is the ability to avoid multiple responses. This is particularly challenging when incentives are provided to survey participants. To minimise this problem, the online survey should include a feature that registers interested participants (through their email) in a first stage, so that the tool can assign each a unique participant number, minimising the chance of multiple enrolments in the study. A personalised link to access the online questionnaire can then be sent to each participant’s email address. It is very important that the email is used for sharing the survey link only (to ensure participants’ details are protected). Restriction by IP address is another strategy to avoid multiple enrolments; however, it limits the opportunity for participants who share a common IP address (e.g. family members or students living in communal dwellings). Similarly, participants should be able to complete the survey across multiple sessions if they wish (as long as they use the same device), with survey responses saved automatically as participants progress through the survey pages.
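A registration stage like the one described could be sketched as follows. The class and method names are ours, and a production system would also persist tokens and mark them as used once a survey is submitted:

```python
import secrets

class TokenRegistry:
    """Sketch: issue one single-use survey token per registered email."""
    def __init__(self):
        self._tokens = {}  # email -> token

    def register(self, email):
        email = email.strip().lower()      # normalise, so re-registration is caught
        if email in self._tokens:
            return self._tokens[email]     # already enrolled: reuse, don't duplicate
        token = secrets.token_urlsafe(16)  # unguessable personalised link fragment
        self._tokens[email] = token
        return token

registry = TokenRegistry()
first = registry.register("respondent@example.com")
second = registry.register("Respondent@example.com ")  # same person, normalised
print(first == second)  # -> True
```

Each token would be embedded in the personalised survey link, so a submission can be tied to exactly one registered participant without exposing their email in the survey data.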

d) Data management

Online survey platforms generally offer convenient and reliable data management. By design, the online survey format protects against loss of data and facilitates transfer of the data into a database (e.g. Excel or SPSS) for analysis [ 9 , 10 ]. Because responses can be exported into a compatible database, transcription errors are eliminated and survey modification by participants is prevented. It can be argued that the overall ease of use of well-designed questionnaires, for both study participants and researchers, potentially improves the reliability and validity of the data collection process and the collected data [ 13 ].
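
As a minimal illustration of this export step, the Python sketch below writes a list of responses to CSV, a format both Excel and SPSS can import. The helper and field names are hypothetical, not from a specific survey platform.

```python
import csv
import io

def export_responses(responses, outfile):
    """Sketch: write survey responses (a list of dicts) as CSV so they can
    be opened in Excel or imported into SPSS. outfile is any writable text
    file object; field names are whatever keys the responses contain."""
    fieldnames = sorted({key for row in responses for key in row})
    writer = csv.DictWriter(outfile, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(responses)
    return fieldnames
```

Exporting directly from stored responses, as here, is what removes the transcription step (and its errors) from the workflow.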

e) Ethical issues

Online administration of surveys raises unique ethical questions regarding key ethical components including:

i. Informed consent

In most online survey tools it is not possible to explain the study in person or to take verbal consent from participants. Researchers have therefore turned to ensuring that all information regarding the study, participants' rights and the researcher's contact details is provided on the first page of the survey [ 14 ]. However, this depends on the study design. For example, in e-Delphi studies researchers have the option to administer participant information sheets, consent forms and additional study information by personal contact, thereby allowing for oral consent [ 15 ]. Consent practice needs to be considered carefully. One increasingly common approach is to present the items that would appear on a paper-based consent form and require each item to be endorsed before the next page can be opened.

ii. Privacy and confidentiality

There have been concerns about the ability of online administration tools to protect privacy and confidentiality [ 16 ]. Most of these tools rely on the researchers' ingenuity in configuring the survey settings, for instance to limit the collection of participants' IP addresses. However, tools such as SurveyMonkey have been associated with easily accessible data from surveys shared from a common account, thereby compromising confidentiality [ 14 ].

iii. Right to withdrawal or omission of items

Study participants should have the right to withdraw from the survey, in addition to the choice to opt out of sharing the data already provided in an online questionnaire. Researchers should therefore ensure that the opportunity to erase or skip questions, or to backtrack through the survey, is provided in order to maintain ethically sound research conduct. As a rule, only items relating to the consent form should require a response [ 14 ].

f) Piloting

Once the survey tools, contents and platform are decided, it is very important to carry out a pilot [ 17 , 18 ] with potential participants. Pilot studies can help ensure the adequacy and ordering of the questions, the comprehensiveness of the contents, that instructions are clear and adequate, the feasibility of the technology (e.g. download time), skip patterns, and data compatibility/transfer issues. Not all piloting has to be online: researchers can conduct cognitive interviews with those involved in a pilot study, although certain aspects such as download time obviously require piloting an electronic version of the survey.

Increasing internet access across the globe has resulted in an increase in the use of online surveys. This data collection approach has the potential to collect both qualitative and quantitative data, and conducting an online survey enables access to large and geographically distributed populations. Experts have argued that it is cost-effective and time-saving for the researcher. Although multiple data collection strategies help achieve a better response, combining email, postal and web-based surveys may prove impractical or financially unfeasible. If designed and executed rigorously, results from an online survey may be no different from paper-based survey results, yet may prove advantageous because of lower costs and speedy distribution. When designing an online survey, researchers should consider a number of principles, such as simplicity of the items included, feasibility, appropriateness of an online survey for the target participants, cultural and ethical sensitivity, completeness and neutrality. Adhering to these principles will help ensure that your online survey is methodologically sound.

7 Steps to Conduct a Survey- Best Practices, Tools, & More

Shivani Dubey

Author & Editor at ProProfs

Shivani Dubey specializes in crafting engaging write-ups and exploring the intricacies of customer experience management. She covers vital topics such as customer feedback, voice of the customer (VoC), NPS, emerging UX and CX trends, and sentiment analysis.

Want to conduct a survey but wondering where to start? We’ve got you covered. 

Here’s how to do it:

  • Identify your target audience
  • Decide on the survey questions
  • Add question branching
  • Set triggers to pick the right moment to conduct the survey
  • Choose the design and deploy the survey
  • Analyze the results
  • Take action

Simple, right? The challenge is to execute these steps properly.

As a leading survey tool provider, we regularly see businesses like yours that need help with their survey design and launch.

That’s why we have created this one-stop guide on how to conduct surveys. We’ll explore steps to plan the surveys, collect responses and analyze results to help you design campaigns that produce tangible results.

Let’s begin.

What is a Survey?

A survey is defined as a single or multi-page questionnaire that aims to collect information from the respondents. Surveys can be used to gather different types of data from the intended audience, like demographic, behavioral, psychographic, and more.

For businesses, a survey can be a source of first-hand customer data, making it an indispensable part of customer success strategy.

You can generate leads, pinpoint product or website issues, map customers’ journeys, and gauge users’ preferences.

For example, if you’ve just added a new feature to the mobile app, you can conduct a survey like CSAT to gauge users’ satisfaction and feature adoption.

As for how you can conduct a survey, there are several channels to do so, including:

  • In-store kiosks
  • Social media polls

Read More: 13 Ways to Collect Customer Feedback

Importance of Conducting Surveys

1. Measure the Customers’ Pulse

Surveys can provide hundreds of nuanced data points about your customers and prospects. You can see how they perceive your brand and map their sentiments.

  • How happy are they with your products and services?
  • What do they think about your brand?
  • How effective are your support services?
  • Which touchpoints offer the best and worst experiences?

You can quantify these sentiments using surveys like NPS, CSAT, and CES. Just conduct pulse surveys on your website, app, or product to gauge customer experiences.
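
Of these metrics, NPS has the simplest published formula: the percentage of promoters (scores of 9-10 on a 0-10 "how likely are you to recommend us?" question) minus the percentage of detractors (scores of 0-6). A minimal Python sketch:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    computed from a list of 0-10 'likely to recommend' answers."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))
```

Scores of 7-8 (passives) count toward the total number of responses but toward neither group, which is why NPS can range from -100 to +100.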

2. Map Issues With Products & Services

The flexibility of survey deployment makes it an ideal means to collect data about product issues. Suppose you have a website and see many people bouncing off the pricing page. You can deploy surveys on the page to find the reason for this departure.

In the same way, you can add surveys at various stages of customers’ journeys to map their issues and problems. The best part is that you can keep it active to collect feedback data continuously and optimize the experience.

3. Validate New Ideas & Product Opportunities

Let’s say you have three different ideas about new product features. How would you know which one to work on?

Simple! Conduct a survey and ask your users.

Surveys let you know what your audience wants and expects from you. You can then feed this feedback into your product development process, as Twilio does.

There are 18 teams at Twilio that depend on customer feedback to deliver optimal solutions quickly. They conduct targeted surveys to validate which ideas are worth developing, which helps them prioritize testing and experimentation for the features that offer the most value to customers.

The teams can go from ideation to development stages faster to deliver new functionalities weekly.

You can also do the same with your website or product. Just conduct a survey at critical touchpoints to find new growth opportunities.

4. Set a Baseline & Compare

You can leverage the redeployment ability of surveys to compare customer experiences over time and track the changes.

For example, a simple NPS survey on your product page can help you track customers’ loyalty and repurchase probability with time. You can rerun it every month to compare the scores and make improvements.

But it’s not only restricted to a single page; you can compare the feedback data from different channels to design a cohesive and seamless omnichannel experience.

What Do You Need to Conduct a Survey?

Before creating a survey, you need to prepare a strategy to maximize the response rate and collect accurate feedback data. You can’t survey randomly at any point. It would only result in skewed feedback and a waste of time.

Here are some pointers that will help you to develop a plan for your survey campaign:

1. Identify the Points of Deployment

The customers’ interaction points are scattered across various stages of their journey. You can’t conduct surveys at every point, as it’ll frustrate the customer. Plus, the data would require more staff and hours to analyze.

That’s why it’s crucial to pinpoint the precise interaction to show the survey to visitors, so they are more inclined to answer it.

You can use Google Analytics to find the most underperforming pages and conduct a survey there. Or you can choose those pages where conversions have steadily declined with time.

2. Set Measurable Goals

Each survey serves a different purpose, and associating a goal with each one will help you frame the right questions.

For example:

  • If the goal is to reduce the bounce rate, you can choose an exit-intent survey and find out the reasons behind page abandonment. Then, optimize the page and track changes in the bounce rate.
  • If the goal is to measure customer satisfaction, you can collect CSAT scores using CSAT surveys.
  • If you want to evaluate newly added page content, choose an experience mapping survey to check how the content is performing.

3. Determine the Required Sample Size

One of the most important aspects of your survey campaign is achieving a sufficient response size. The more responses you collect, the higher the chances that the feedback data is reliable and accurate.

Here’s a sample size calculator you can use:

For a target audience (population size) of 13,000 and a confidence level of 99%, the required sample size works out to 3,145. This means you need to collect at least 3,145 responses for your survey results to be reliable at that confidence level.
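
A calculation along these lines can be reproduced with Cochran's sample-size formula plus a finite-population correction. The sketch below assumes a 50% response distribution and a 2% margin of error (the margin is our assumption, since the calculator's exact settings aren't stated); with those values it reproduces the 3,145 figure for a population of 13,000 at 99% confidence (z = 2.576):

```python
import math

def sample_size(population, z=2.576, margin=0.02, p=0.5):
    """Cochran's formula with finite-population correction.
    z=2.576 corresponds to a 99% confidence level; p=0.5 is the most
    conservative assumed response distribution."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2          # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)               # correct for finite N
    return math.ceil(n)
```

For very large populations the correction barely matters, which is why published sample-size tables often ignore population size entirely.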

4. Choose the Right Tool

Half of your work is done when you have the right survey tool. It’ll let you conduct the survey and aid in analyzing feedback data.

Here’s What to Look for in a Survey Tool:

  • The number of deployment channels: Choose a tool that offers all the channels you want to conduct the survey on, like website, email, mobile app, product, link, etc. It’ll help to keep all the data in one place.

  • Sufficient audience targeting options: Tools like Qualaroo offer advanced triggering options to target visitors’ behavior, actions, and other attributes to show your survey to the right people.
  • Flexible survey creator: Look for a survey tool that offers features like skip logic, design options, template library, rebranding, etc., to create highly personalized and targeted surveys.
  • Data analysis techniques: If your tool offers AI-based analysis techniques like sentiment analysis, it can automatically categorize the responses based on user emotions and prioritize the negative feedback to take necessary actions.

7 Steps to Conduct a Survey That Brings Desired Results

You’ve done your research and come up with a workable strategy. You also have the right tool in hand. Now you’re ready to conduct the survey by following these simple steps:

1. Identify Your Target Audience

To make your surveys successful, you need a target audience. For example:

  • You can’t collect satisfaction scores from first-time visitors. Technically, you can, but collecting them from people who have used or purchased the product is far more meaningful.
  • In the same way, if you’ve added a new feature to the Chrome browser, you can’t show the survey to Firefox users.
  • To collect data from visitors aged 18 to 25, you must select the sample size from this visitor segment.

That’s why you need to choose your target audience carefully. It’ll ensure the reliability and accuracy of your survey responses.

It’ll also help you add screening questions to disqualify irrelevant respondents.

2. Decide on the Survey Questions

Make a list of all the possible questions you want to ask the respondents. Then, pick the ones that are important for your survey. You can choose between open-ended and closed-ended questions.

For example, if you’re surveying to create a user persona, the possible questions can be:

  • Describe yourself in one sentence.
  • What is your name?
  • What is your age?
  • Which device do you usually use to shop with us?
  • What did you come to this site to do today?
  • What were you hoping to find on this page?
  • Does this page meet your expectations?

You can add/remove the questions depending on your audience and the depth of information you want from them.

Now, select the order of the questions. It’s best to start with the rating scale question and then move toward the free-text questions to gather in-depth data.

Read More: Answer selection types

3. Add Question Branching to Your Survey

Add screening questions and skip logic to make your survey personalized and highly targeted. 

It’ll ensure two things:

  • Only people who qualify for the survey can answer it.
  • The respondents will only see relevant questions based on their previous answers.

For example, if your target audience is visitors in the 18-25 age group, you can add an age-group screening question as the first question.

Using question branching, you can then disqualify respondents whose answers fall outside that group. For respondents who qualify, you can add relevant follow-up questions to gather feedback data.

It’ll help to keep the feedback clean without worrying about other groups compromising the data. So, list the questions, assign appropriate branching and check that there’s no broken or incomplete path for any branch.
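
Conceptually, skip logic is just a mapping from answers to next steps. A hypothetical Python sketch (the question IDs and branch table are made up for illustration):

```python
def next_question(answer, branches, disqualify="end_disqualified"):
    """Skip-logic sketch: map a screening answer to the next question ID.
    Answers with no branch send the respondent to a disqualification page."""
    return branches.get(answer, disqualify)

# Only the 18-25 group continues to the follow-up questions.
flow = {"18-25": "q2_followup"}
```

Checking that every possible answer resolves to either a next question or the disqualification page is exactly the "no broken or incomplete path" check described above.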

4. Set Triggers to Pick the Right Moment to Conduct the Survey

Survey launch timing matters. It helps to collect contextual data from the customers. So set up appropriate triggers to show the survey at the right moment to the right people.

  • If you’re conducting a survey to gauge users’ perception of a newly added product feature, you can set it to appear when they interact with that feature.
  • Similarly, the ideal trigger for post-purchase surveys is when the order confirmation message appears.

Advanced survey tools provide many in-built targeting options to help you set the right triggers. You can add different conditions to deploy the survey at the precise moment.

5. Choose the Survey Design and Deploy the Survey

The last thing to do before deploying the survey is to set its theme and color. It’s best to align the survey theme with your website or app. You can also add the brand logo to imbue confidence in the respondents.

Once satisfied, activate the survey and start collecting the feedback data. Make sure you collect enough responses to meet the required sample size.

6. Analyze the Results

There are different ways to analyze the responses depending on the resources available at your disposal. Here are a few steps to do it:

  • Restructure the data in a spreadsheet and add all the relevant information to each response, such as customer ID, metadata, feedback, point of interaction, customer type, and lifetime value.
  • Categorize the feedback by its types: issue, general feedback, bug, feature request, support grievances, and more.
  • Next, start assigning the appropriate action to resolve the problems. Make sure these are small and quick points. For example, if the feedback type is an issue marked as critical, summarize the issue and the required action in a few words, like:

Payment failure >> possible issue with Stripe API >> Check on priority.
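
The categorization step above amounts to grouping responses by feedback type so each team gets its own action list. A minimal Python sketch, with illustrative field names:

```python
def triage(feedback_rows):
    """Sketch: bucket survey responses by feedback type (issue, bug,
    feature request, ...). Row fields are illustrative, matching the
    spreadsheet columns described in the text."""
    buckets = {}
    for row in feedback_rows:
        buckets.setdefault(row["type"], []).append(row)
    return buckets
```

Each bucket can then be sorted by severity or lifetime value before actions are assigned.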

7. Take Action

Share the sheet with other teams and create tasks for them to address the feedback.

  • Resolve the uncovered UI/UX issues on priority.
  • Fix the broken flow with the help of your dev team.
  • Reach out to those who shared positive feedback to collect product reviews and app store ratings.
  • Get in touch with frustrated customers to solve their problems and retain them.

6 Best Practices for Conducting an Online Survey

Whether you are new or experienced, there are a few basic rules you can add to your checklist to get the maximum return on your investment. Let’s quickly go through some of the best strategies and practices for conducting a survey properly.

1. Add Incentives to Improve the Response Rate

Incentives are one of the best ways to increase the response rate on your surveys. 

According to PeoplePulse , incentivized surveys receive at least 10% more responses than surveys without incentives.

Use different incentives to improve response rates.

  • You can add discount coupon codes to the CSAT and NPS surveys. It will also act as an encouragement for the customers to purchase again.
  • Embed free consultation offer in the surveys on your pricing page or landing page. In return, you can collect visitors’ contact information to add them to your prospect list.
  • Use other incentives, such as customized meal plans, exercise plans, personality profiles, and gift cards, to entice people into filling out the survey.

2. State the Purpose of Your Survey

It is always helpful to showcase the objective of your survey to your respondents. Make them understand that the survey feedback will help make their experience better. 

You can also mention the recent interaction related to the survey to help customers recall their experience while filling the survey. 

3. Always Follow up on Your Surveys

It is a good practice to send survey reminders to the customers or product users who haven’t submitted their responses.

A single survey reminder can increase the response rate by 14%. That is a substantial jump in the number of responses as the sample size increases.

  • If you are using mail campaigns, you can send a reminder mail to non-respondents after a few days. 
  • If you use the website or in-app surveys, set the survey to reappear to the visitor during their second visit.
  • If you use product surveys, add an unobtrusive survey reminder notification bar in the My Account section. Set the bar to auto-disappear after the user completes the survey.

4. Use the Funnel Technique

The funnel technique is a powerful way to guide respondents through the survey: start with broader questions and move toward more specific ones as the survey progresses. It lets you pose more in-depth questions to the respondents.

  • Do you shop online?
  • How often do you shop online?
  • What are your favorite shopping websites?
  • What products do you usually buy online?

In the above example, each question narrows down the line of inquiry to gauge respondents’ preferences and interests.

Using this technique, you can gradually ask more personal questions without making the respondents uncomfortable.

5. Keep the Surveys Short

Survey length is an important factor that can affect the response rate. According to the International Journal of Market Research, the ideal survey length is between 10 and 20 minutes.

What’s more, the response rate may drop by over 15% if a survey takes more than 5 minutes to complete.

The reason for this drop is simple. The respondents won’t wait for so long to complete the survey. They are more likely to abandon it with the increase in the number of questions or completion time.

Another drawback of longer surveys is that the respondents may answer the questions randomly without much thought to complete them quickly. This behavior will pollute the data samples and may produce incorrect results.

That’s why it is vital to keep your surveys short and to the point. Share your survey with the internal teams to calculate the average completion rate before sending them out.

6. Use Randomization

It is observed that respondents have a natural tendency to favor the first option in a survey question. This is called order bias. As a result, respondents are more likely to choose the responses that sit toward the top of the list.

Randomizing the order of response anchors can help mitigate this issue. Since each respondent sees a different sequence of the responses, the data results are less likely to be affected by order bias.
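
In code, per-respondent randomization is just a seeded shuffle of a copy of the canonical option list; stored answers should map back to option IDs, not display positions. A Python sketch (using the respondent ID as the seed is our illustrative choice, not a prescribed scheme):

```python
import random

def randomized_options(options, seed=None):
    """Return a per-respondent shuffled copy of the answer options to
    mitigate order bias. The canonical list is left untouched so responses
    can be recorded against stable option IDs."""
    rng = random.Random(seed)   # seed per respondent for reproducibility
    shuffled = options[:]       # copy: never mutate the canonical order
    rng.shuffle(shuffled)
    return shuffled
```

Seeding per respondent also means the same person sees the same order if the survey page is reloaded mid-session.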

50+ Survey Questions to Choose From

The questions are the crux of survey campaigns. To learn how to conduct a survey is to learn about the right questions to ask the respondents. You can get everything right, but it will all be for naught if you don’t ask the right questions.

That’s why we have compiled a list of professional questions you can use in your surveys. Choose the questions depending on the feedback and survey type you wish to conduct on your website, app, or product.

Many survey tools also offer readymade templates to help you get started if you are new to this.

1. Market Research Surveys

  • Rate the factors that affect your buying decision for [product].
  • Would you purchase the product at [price]?
  • According to you, what is the ideal price range for the product?
  • Would you purchase this product if it were available today?
  • Based on its current features and attributes, would you recommend [your brand name] to others?
  • If yes, please tell us what you like the most about [your brand name]?
  • If no, please specify the reason.
  • In which area is this product/service lacking the most? Please specify below.
  • Which product/service would you consider as an alternative to ours?
  • Rate our competitor based on the following:

2. Demographic Surveys

  • Tell us something about yourself?
  • What is your gender?
  • What is your age group?
  • What is your highest level of education?
  • Which best describes your family?
  • Do you use the [product name]?
  • How likely is it that you’d recommend our product to a friend or colleague?
  • What feature would you like to see in the website/product?
  • Which feature do you think will help improve the product experience for you?
  • Of these four options, what’s the next thing you think we should build?

4. Product Opportunity Surveys

  • What’s the one feature we can add that would make our product indispensable for you?
  • How often do you use this feature?
  • What’s the next feature we should build?
  • How disappointed would you be if you could no longer use [Product/feature name?]
  • How does the product run after the update?
  • Have you seen any website/product/app with a similar feature?
  • Would the implementation of [this feature] increase the usability of the [product name]?
  • How would you rate this new feature?

5. Experience Mapping Surveys

  • Rate our product based on the following aspects:
  • How long have you had the product?
  • How often do you use the product?
  • Have you faced any problems with the product? Specify below.
  • How satisfied are you with the product?
  • How likely are you to purchase a product from this company again?
  • Is there anything that can be improved? Please specify.
  • How well does the website meet your needs?
  • Was the information easy to find?
  • Was the information clearly presented?
  • What other information should we provide on our website?
  • How can we make the site easier to use?

6. Brand Awareness Surveys

  • Have you heard of [your brand name] before?
  • How do you feel about this brand?
  • Have you seen this brand’s advertisements?
  • If yes, where have you seen or heard about our brand recently? (Select all that apply)
  • Have you purchased from this brand before?
  • Do you currently use the product of this brand?
  • Of all the brands offering similar products, which do you feel is the best brand?
  • Please specify what makes it the best brand for you in the category.
  • If the answer is 0-6, please specify the reason for your answer.
  • If the answer is 9-10, what do you like the most about the brand/product?
  • How satisfied are you with the product/website/app?
  • If the answer is 1-5, how can we improve the product/website/app?
  • If the answer is 8-10, what 3 things do you like the most about the product/website/app?
  • How would you rate our service on a scale of 1 – 10?
  • Was this article helpful? (Yes/No)
  • How satisfied are you with our support?
  • How easy/hard was it for you to use the product/website/app?
  • Does this [website/ product/ tool/ software] have all the features and functionalities you expected?
  • How would you improve this [website/ product/ tool/ software]?
  • What is missing from the website/product/app?
  • What is the most important feature you think we should add?

6 Common Survey Challenges & How to Overcome Them

1. Keeping the Scales Uniform Across All Surveys

The interpretation of survey scales varies as per their arrangement.

For example, if a Likert scale’s response anchors range from negative to positive, a higher score is desirable. However, if the sequence is reversed, a lower score would be considered good.

The challenge is to keep the sequence uniform across all your surveys to track the scores correctly. If the scale gets reversed at any point, it can skew the results.

Always stick to one pattern for all the scales – negative to positive, or positive to negative – to avoid confusion and misinterpretation.

2. Difficulty in Analyzing Free Responses

Analyzing free responses is one of the biggest challenges when you conduct an online survey. These are unregulated and depend solely on the understanding of the respondents. If they misinterpret the question, the open response will only skew your feedback data.

It’s one of the main reasons to keep your survey questions simple.

What’s more, free responses are also affected by the respondent’s grasp of the language: sentences can be unstructured or hard to understand.

Free-text responses offer more in-depth feedback but pose a serious challenge in extracting valuable insights. Analyzing them is time-consuming and tedious.

One of the ways to mitigate this issue is using AI-based analytical tools like sentiment analysis, text analytics, and word cloud generator.

Advanced survey tools offer these techniques as an inbuilt tool feature to make data mining easier.

3. Avoiding Leading Questions in the Survey

Leading questions are framed in such a way that they allude to a specific direction. The problem with these questions is that they can influence the respondents to choose a particular answer from the options.

For example, what is your favorite fast food?

The above question implies that the respondent eats fast food. The respondent may either skip the question or answer it randomly.

You can add a screening question to improve the data quality and disqualify irrelevant respondents.

  • How often do you eat fast food?
  • What is your favorite fast food?

Another way to avoid leading questions is to get an extra pair of eyes. Share your survey with other teams or a control group to test it out.

4. Survey Fatigue

Survey fatigue is a real challenge that can affect both the response rate and feedback quality. People constantly receive surveys in their SMS, emails, website visits, and apps.

So, it is possible that the visitors may completely ignore the survey or answer it randomly without properly reading the questions.

There are a few ways to combat survey fatigue while conducting a survey.

  • Add an incentive to encourage people to take the survey. As discussed earlier, it can improve the response rate.
  • Keep your survey short.
  • Separate critical surveys like NPS or CSAT from general surveys that take longer to complete.

5. Duplicate Responses

You cannot completely avoid duplicate responses in online surveys, but there are a few ways to reduce them.

  • Use cookies to identify repeat visitors and prevent them from retaking the survey.
  • You can target IP addresses to prevent visitors from the same IP from retaking it.
  • A lot of tools also provide inbuilt duplicate response protection techniques.
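
Server-side, cookie- and IP-based deduplication boils down to checking each response's identifiers against those already seen. A hypothetical Python sketch, with made-up field names:

```python
def is_duplicate(response, seen):
    """Sketch: flag a response as a duplicate if its cookie ID or IP
    address has been seen before. 'seen' is a set of identifiers kept
    across the whole survey run. Field names are illustrative."""
    keys = {response.get("cookie_id"), response.get("ip")}
    keys.discard(None)          # tolerate responses missing either field
    if keys & seen:
        return True
    seen.update(keys)
    return False
```

As noted above, matching on IP alone can wrongly reject people sharing a connection, so real tools usually combine several signals.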

6. Avoid Double-Barreled Questions

Double-barreled questions combine two issues into a single question. The respondent may have conflicting views about the two parts, making it harder to choose one response.

For example, how satisfied are you with our services and customer support?

Here, the respondent may have positive sentiment toward the service quality but may be dissatisfied with the customer support agent.

To make the question more precise, you can split it into two questions. It will also let you add follow-up questions to each answer to find more details about customers’ issues and delights.

5 Best Tools to Conduct Online Surveys in 2021

It’s easy to get overwhelmed while looking for a correct survey tool because of their sheer numbers in the market. That’s why we have listed the top 5 tools to lighten your load and help you get started with the survey campaigns.

1. Qualaroo

Qualaroo is a complete customer feedback management solution that can help you create and manage the survey data under one dashboard. You can conduct surveys on your website, app, SaaS product, social media, and email. 

With features like skip-logic, 40+ pre-built question templates, 12+ question types, 50+ language translations, customer survey design options, rebranding, and advanced targeting options, you can start collecting feedback in a few hours. 

The tool also supports advanced AI-based data analysis techniques – sentiment analysis and text analytics to categorize the survey responses and produce valuable insights automatically. 

Price: Free forever account. Paid plans start at $80/month

2. ProProfs Survey Maker

ProProfs Survey Maker brings more than surveys to the table. It lets you create interactive scored quizzes, polls, assessments, and survey forms as well. You can also add a feedback sidebar on your website. 

It also supports multi-channel deployment, i.e., you can add the survey to your website, mobile app, email, and social media. You can analyze and track the responses using a detailed dashboard.

Price: Forever free account. Paid plans start at $0.05/response/month

3. SurveyMonkey

SurveyMonkey is one of the best survey tools in the market that offers skip logic, multiple answer types, survey language translation, progress bar, scoring mechanism, question randomization, pre-built templates, and design customization options. 

You can deploy surveys on your website, app, product, or email. It also features the sentiment analysis tool and word cloud generator to analyze the feedback and extract valuable insights. You can filter the data using custom charts and feedback summaries to study the desired data points.

Price: Forever free account. Paid plans start at $31/month

4. Typeform

If you are looking for a simple yet effective survey tool for your website, Typeform is the one to go for. The tool lets you build surveys, forms, polls, and quizzes for your website. You can show the survey on your website as a popup, popover, slider, or sidebar button. 

Like other tools in the list, Typeform also has a plethora of survey personalization features like pre-built templates, question randomizer, progress bar, skip logic, customization themes, and mobile-responsive design. The targeting options are a little simpler than other tools.

Price: Forever free account. Paid plans start at $25/month

5. Survicate

Last but not least, Survicate is another one-stop customer feedback tool with features to conduct surveys on websites, web apps, mobile apps, and emails. Use advanced features like audience targeting, question branching, pre-built templates, 15+ question types, and white labeling to create highly targeted surveys. 

With the inbuilt AI-based text analytics engine and analytical dashboard, you can extract valuable customer insights and track the survey campaign’s performance.

Price: Free forever plan. Paid plans start at $89/month

Ask. Analyze. Act.

As you can see, a survey is not just a list of questions but an entire strategic approach to establishing a line of conversation between you and your customers. With survey tools becoming less expensive and more versatile, learning how to conduct a survey effectively can help you gain new customers, retain existing ones, optimize your products, and develop ideas to increase conversions.

So, what are you waiting for?

Understand what data you want to collect, pick the right survey tool, and follow these steps to build survey campaigns that produce tangible results.

What Question Types Can I Use in Surveys?

You can choose from different survey answer types, such as:

  • Multiple answer selection (checkboxes)
  • Single answer selection (radio button)
  • Single answer selection (dropdown)
  • Text-based answer (multi-line)
  • Text-based answer (single line)
  • Star Rating selection
  • Net Promoter Score
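Since the Net Promoter Score appears in this list, here is a minimal sketch of how the score is typically computed from 0–10 ratings; the `nps` helper and the sample scores are hypothetical:

```python
def nps(scores):
    """Compute Net Promoter Score from 0-10 ratings.

    Promoters score 9-10, detractors 0-6; NPS is the percentage
    of promoters minus the percentage of detractors.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10, 9, 9, 8, 7, 6, 3]))  # 3 promoters, 2 detractors -> 14
```

The result ranges from −100 (all detractors) to +100 (all promoters).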

What Are the Different Types of Surveys?

Survey types are categorized by the kind of data they collect from respondents:

  • Market Research Surveys
  • Post-Purchase Surveys
  • Customer Satisfaction Surveys
  • Exit-Intent Surveys
  • NPS (Net Promoter Score) Surveys
  • Lead Generation Surveys
  • Website Polls

Read More: Types of Website Surveys

What Are Common Survey Challenges and Errors?

Here are some common challenges and mistakes people make while conducting surveys:

  • Asking too many questions 
  • Framing assumptive questions
  • Making selection errors
  • Not adding enough response options
  • Using negative question wording
  • Assuming prior knowledge

About the author

Shivani Dubey

Shivani Dubey is a seasoned writer and editor specializing in Customer Experience Management. She covers customer feedback management, emerging UX and CX trends, transformative strategies, and experience design dos and don'ts. Shivani is passionate about helping businesses unlock insights to improve products, services, and overall customer experience.


7 steps when conducting survey research: A beginner-friendly guide

May 17, 2022

Steps of survey method: Things to know before conducting survey research


Conducting survey research involves gaining insights from a diverse group of people by asking questions and analyzing the answers. It is the best way to collect information about people’s preferences, beliefs, characteristics, and related information.

The key to a good survey is asking relevant questions that will provide the needed information. Surveys can be conducted once or repeatedly.

Wondering how to conduct survey research correctly?

This article will lay out—even if you are a beginner—the seven steps of conducting survey research, with guidance on how to carry it out successfully.

How to conduct survey research in 7 steps

Conducting survey research typically involves several key steps. Here are the seven most common:

Step 1: Identify research goals and objectives

Step 2: Define the population and sample

Step 3: Decide on the type of survey method to use

Step 4: Design and write questions

Step 5: Distribute the survey and gather responses

Step 6: Analyze the collected data

Step 7: Create a report based on survey results

These survey method steps provide a general framework for conducting research. But keep in mind that specific details and requirements may vary based on research context and objectives.

To understand the process of conducting a survey, start at the beginning. Conducting a survey consists of several steps, each equally important to the outcome.

Before conducting survey research, here are some resources you might find helpful regarding different methods, such as focus group interviews, survey sampling, and qualitative research methods. Learn why a market research survey is important and how to utilize it for your business research goals.

Finally, it is always a good idea to understand the difference between a survey and a questionnaire.

The first of seven steps in conducting survey research is to identify the goal of the research.

This will help with subsequent steps, like finding the right audience and designing appropriate questions. In addition, it will provide insight into what data is most important.

By identifying goals, several questions will be answered: What type of information am I collecting? Is it general or specific? Is it for a particular or broad audience? Research goals will define the answers to these questions and help focus the purpose of the survey.

An objective is a specific action that helps achieve research goals. Usually, for every goal, there are several objectives.

The answers collected from a survey are only helpful if used properly. Determining goals will provide a better idea of what it is you want to learn and make it easier to design questions. However, setting goals and objectives can be confusing. Ask the following questions:

  • What is the subject or topic of the research? This will clarify feedback that is needed and subjects requiring further input.
  • What do I want to learn? The first step is knowing what precisely needs to be learned about a particular subject.
  • What am I looking to achieve with the collected data? This will help define how the survey will be used to improve, evaluate, and understand a specific subject.

Unsure how to write a good survey question? We’ve got you covered.

Step 2: Define the population and sample (who will participate in the survey)

Who is the target audience from which information is being gathered? This is the demographic group that will participate in the survey. To successfully define this group, narrow down a specific population segment that will provide accurate and unbiased information.

Depending on the kind of information required, this group can be broad—for example the population of Florida—or it can be relatively narrow, like consumers of a specific product who are between the ages of 18 and 24.

It is rarely possible to survey the entire population being researched. Instead, a sample population is surveyed. This should represent the subject population as a whole. The number required depends on various factors, mainly the size of the subject population. The larger and more representative the sample, the more valid the survey results.

Step 3: Decide on the type of survey method to use

Precisely determine what mode of collecting data will be used. The ways to conduct a survey depend on sample size, location, types of questions, and the costs of conducting the research. Not sure how many people you need to survey for statistically significant results? Use our survey sample size calculator to determine the sample size you need.
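If you want to see the arithmetic behind such a calculator, a common starting point is Cochran’s formula, optionally adjusted with the finite-population correction. This is an illustrative sketch, not the calculator’s actual implementation; the function name and defaults are assumptions:

```python
import math

def cochran_sample_size(z=1.96, p=0.5, e=0.05, population=None):
    """Estimate the required sample size using Cochran's formula.

    z: z-score for the confidence level (1.96 for ~95%)
    p: expected proportion (0.5 is the most conservative choice)
    e: margin of error
    population: if given, apply the finite-population correction
    """
    n = (z ** 2) * p * (1 - p) / (e ** 2)
    if population is not None:
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

print(cochran_sample_size())                 # 385 for 95% confidence, +/-5%
print(cochran_sample_size(population=1000))  # 278 when the population is 1,000
```

Note how a small total population sharply reduces the sample you need, while for large populations the required sample barely grows.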

Based on the purpose of the research, there are various methods of conducting a survey:

Interviews are useful for smaller sample sizes since they allow for the gathering of more detailed information on the survey’s subject. They can be conducted either by phone or in person.

The advantage of interviews is that the interviewer can clarify questions and seek additional information. The main risk with this method is researcher bias or respondent equivocation, though a skilled interviewer is usually able to eliminate these issues.

If the correct steps are followed, conducting an online survey has many advantages, such as cost efficiency and flexibility. In addition, online surveys can reach either a vast audience or a very focused one, depending on your needs.

Online tools are the most effective method of conducting a survey. They can be used by anyone and easily customized for any target group. There are many kinds of online surveys that can be sent via email, hosted on a website, or advertised through social media.

To follow the correct steps for conducting a survey, get help from SurveyPlanet. All you need to do is sign up for an account. Creating perfect surveys will be at your fingertips.

Delivered to respondents’ mailing addresses, mail surveys can reach a large sample group and provide control over who is included in the sample. Though once among the most common survey research methods, mail surveys now tend to have relatively low response rates.

To get the best response rate, read our blogs How to write eye-catching survey emails and What’s the best time to send survey emails?

Step 4: Design and write questions

Survey questions play a significant role in successful research. Therefore, when deciding what questions to ask—and how to ask them—it is crucial to consider various factors.

Choose between closed-ended and open-ended questions. Closed-ended questions have predefined answer options, while open-ended ones enable respondents to shape an answer in their own words.

Before deciding which to use, get acquainted with the options available. Some common types of research questions include:

  • Demographic questions
  • Multiple-choice questions
  • Rating scale questions
  • Likert scale questions
  • Yes or no questions
  • Ranking questions
  • Image choice questions

To make sure results are reliable, each question in a survey needs to be formulated carefully. Each should be directly relevant to the survey’s purpose and include enough information to be answered accurately.

If using closed-ended questions, make sure the available answers cover all possibilities. In addition, questions should be clear and precise without any vagueness and in the language idiom respondents will understand.

When organizing questions, make sure the order is logical. Easy, closed-ended questions encourage respondents to continue, so they should be at the beginning of the survey. More difficult and complex questions should come later. If the survey covers several topics, cluster related questions together by topic.

Step 5: Distribute the survey and gather responses

Surveys can be distributed in person, over the phone, via email, or with an online form.

When creating a survey, first determine the number of responses required and how to access the survey sample. It is essential to monitor the response rate. This is calculated by dividing the number of respondents who answered the survey by the number of people in the sample.
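The response-rate calculation described above is simple division; here is a quick sketch with hypothetical numbers:

```python
def response_rate(responses, sample_size):
    """Response rate = respondents who answered / people in the sample, as a percentage."""
    return 100 * responses / sample_size

# Hypothetical campaign: 230 completed surveys out of a 1,000-person sample.
print(f"{response_rate(230, 1000):.1f}%")  # 23.0%
```

Tracking this number as responses come in tells you early whether you need follow-up reminders or a larger sample.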

Step 6: Analyze the collected data

There are various methods of conducting a survey and also different methods of analyzing the data collected. After processing and sorting responses (usually with the help of a computer), clean the data by removing incomplete or inaccurate responses.

Different data analysis methods should be used depending on the type of questions utilized. For example, open-ended questions require a bucketing approach in which labels are added to each response and grouped into categories.
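The bucketing approach above can be sketched with simple keyword matching. The buckets and keywords below are hypothetical, and real studies usually refine the labels iteratively as they read responses:

```python
# Hypothetical keyword buckets for free-text survey answers.
BUCKETS = {
    "pricing": ["price", "expensive", "cost"],
    "support": ["support", "agent", "help"],
    "usability": ["easy", "confusing", "interface"],
}

def bucket(response):
    """Assign a response to every bucket whose keywords it mentions."""
    text = response.lower()
    labels = [name for name, words in BUCKETS.items()
              if any(w in text for w in words)]
    return labels or ["other"]

print(bucket("The interface is confusing and support was slow"))
# ['support', 'usability']
```

Once every response carries labels, the open-ended data can be counted and charted like closed-ended data.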

Closed-ended questions need statistical analysis. For interviews, use a qualitative method (like thematic analysis) and for Likert scale questions use analysis tools (mean, median, and mode).
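For Likert scale answers, Python’s standard library already provides the three measures mentioned; a quick sketch with hypothetical 1–5 ratings:

```python
from statistics import mean, median, mode

likert = [5, 4, 4, 3, 5, 2, 4, 5, 1, 4]  # hypothetical 1-5 ratings

print(mean(likert), median(likert), mode(likert))  # 3.7 4.0 4
```

Reporting all three guards against a few extreme answers skewing the picture that the mean alone would give.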

Other practical analyzing methods are cross-tabulation and filtering. Filtering can help in understanding the respondent pool better and be used to organize results so that data analysis is quicker and more accessible.
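Cross-tabulation can be sketched without any special tooling by counting combinations of two answers; the response pairs below are hypothetical:

```python
from collections import Counter

# Hypothetical responses: (age group, satisfaction) pairs.
responses = [
    ("18-24", "satisfied"), ("18-24", "unsatisfied"),
    ("25-34", "satisfied"), ("25-34", "satisfied"),
    ("25-34", "unsatisfied"), ("18-24", "satisfied"),
]

crosstab = Counter(responses)  # counts each (row, column) combination
for (age, answer), count in sorted(crosstab.items()):
    print(f"{age:6} {answer:12} {count}")
```

Each cell of the resulting table shows how one demographic group answered a given question, which is exactly the filtering view described above.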

If using an online survey tool, data will be compiled automatically, so the only thing needed is identifying patterns and trends.

Step 7: Create a report based on survey results

The last of the seven steps in conducting survey research is creating a report. Analyzed data should be translated into units of information that directly correspond to the aims and goals identified before creating the survey.

Depending on the formality of the report, include different kinds of information:

  • Initial aims and goals
  • Methods of creation and distribution
  • How the target audience or sample was selected
  • Methods of analysis
  • The results of the survey
  • Problems encountered and whether they influenced results
  • Conclusion and recommendations
Frequently asked questions

  • What’s the best way to select my survey sample size? One must carefully consider the survey sample size to ensure accurate results. Please read our complete guide to survey sample size to find all the answers.
  • How do I design an effective survey instrument? Try out SurveyPlanet PRO features, including compelling survey theme templates.
  • How do I analyze and interpret survey data? Glad you asked! Learn how to analyze survey data and what to do with survey responses by reading our blog.
  • What should I consider in terms of ethical practices in survey research? Explore ethical considerations related to obtaining informed consent, ensuring privacy, and handling sensitive data. Start by learning how to write more inclusive surveys.
  • How do I address common survey challenges and errors? Explore strategies to overcome common issues, such as response bias or question-wording problems.
  • How can I maximize survey response rates? Start by finding out what a good survey response rate is, then look into strategies to encourage higher response rates and minimize non-response bias.
  • How can I ensure the validity and reliability of my survey results? Learn about methods to enhance the trustworthiness of survey data.

Now that we’ve gone through the seven steps in survey research and understand how to conduct survey research, why not create your own survey and conduct research that will drive better choices and decisions?

Were these seven steps helpful? Then check out Seven tips for creating an exceptional survey design (with examples) and How to conduct online surveys in seven simple steps as well.

Sign up for a SurveyPlanet account to access pre-made questions and survey themes. And, if you upgrade to a SurveyPlanet Pro account, gain access to many unique tools that will enhance your survey creation and analysis experience.

Photo by Adeolu Eletu on Unsplash


How To Write The Methodology Chapter

The what, why & how explained simply (with examples).

By: Jenna Crossley (PhD) | Reviewed By: Dr. Eunice Rautenbach | September 2021 (Updated April 2023)

So, you’ve pinned down your research topic and undertaken a review of the literature – now it’s time to write up the methodology section of your dissertation, thesis or research paper. But what exactly is the methodology chapter all about – and how do you go about writing one? In this post, we’ll unpack the topic, step by step.

Overview: The Methodology Chapter

  • The purpose  of the methodology chapter
  • Why you need to craft this chapter (really) well
  • How to write and structure the chapter
  • Methodology chapter example
  • Essential takeaways

What (exactly) is the methodology chapter?

The methodology chapter is where you outline the philosophical underpinnings of your research and detail the specific methodological choices you’ve made. The point of the methodology chapter is to tell the reader exactly how you designed your study and, just as importantly, why you did it this way.

Importantly, this chapter should comprehensively describe and justify all the methodological choices you made in your study. For example, the approach you took to your research (i.e., qualitative, quantitative or mixed), who you collected data from (i.e., your sampling strategy), how you collected your data and, of course, how you analysed it. If that sounds a little intimidating, don’t worry – we’ll explain all these methodological choices in this post.


Why is the methodology chapter important?

The methodology chapter plays two important roles in your dissertation or thesis:

Firstly, it demonstrates your understanding of research theory, which is what earns you marks. A flawed research design or methodology would mean flawed results. So, this chapter is vital as it allows you to show the marker that you know what you’re doing and that your results are credible .

Secondly, the methodology chapter is what helps to make your study replicable. In other words, it allows other researchers to undertake your study using the same methodological approach, and compare their findings to yours. This is very important within academic research, as each study builds on previous studies.

The methodology chapter is also important in that it allows you to identify and discuss any methodological issues or problems you encountered (i.e., research limitations), and to explain how you mitigated the impacts of these. Every research project has its limitations, so it’s important to acknowledge these openly and highlight your study’s value despite its limitations. Doing so demonstrates your understanding of research design, which will earn you marks. We’ll discuss limitations in a bit more detail later in this post, so stay tuned!


How to write up the methodology chapter

First off, it’s worth noting that the exact structure and contents of the methodology chapter will vary depending on the field of research (e.g., humanities, chemistry or engineering) as well as the university. So, be sure to always check the guidelines provided by your institution for clarity and, if possible, review past dissertations from your university. Here we’re going to discuss a generic structure for a methodology chapter typically found in the sciences.

Before you start writing, it’s always a good idea to draw up a rough outline to guide your writing. Don’t just start writing without knowing what you’ll discuss where. If you do, you’ll likely end up with a disjointed, ill-flowing narrative. You’ll then waste a lot of time rewriting in an attempt to stitch all the pieces together. Do yourself a favour and start with the end in mind.

Section 1 – Introduction

As with all chapters in your dissertation or thesis, the methodology chapter should have a brief introduction. In this section, you should remind your readers what the focus of your study is, especially the research aims . As we’ve discussed many times on the blog, your methodology needs to align with your research aims, objectives and research questions. Therefore, it’s useful to frontload this component to remind the reader (and yourself!) what you’re trying to achieve.

In this section, you can also briefly mention how you’ll structure the chapter. This will help orient the reader and provide a bit of a roadmap so that they know what to expect. You don’t need a lot of detail here – just a brief outline will do.

The intro provides a roadmap to your methodology chapter

Section 2 – The Methodology

The next section of your chapter is where you’ll present the actual methodology. In this section, you need to detail and justify the key methodological choices you’ve made in a logical, intuitive fashion. Importantly, this is the heart of your methodology chapter, so you need to get specific – don’t hold back on the details here. This is not one of those “less is more” situations.

Let’s take a look at the most common components you’ll likely need to cover. 

Methodological Choice #1 – Research Philosophy

Research philosophy refers to the underlying beliefs (i.e., the worldview) regarding how data about a phenomenon should be gathered, analysed and used. The research philosophy will serve as the core of your study and underpin all of the other research design choices, so it’s critically important that you understand which philosophy you’ll adopt and why you made that choice. If you’re not clear on this, take the time to get clarity before you make any further methodological choices.

While several research philosophies exist, two commonly adopted ones are positivism and interpretivism. These two sit roughly on opposite sides of the research philosophy spectrum.

Positivism states that the researcher can observe reality objectively and that there is only one reality, which exists independently of the observer. As a consequence, it is quite commonly the underlying research philosophy in quantitative studies and is oftentimes the assumed philosophy in the physical sciences.

Contrasted with this, interpretivism, which is often the underlying research philosophy in qualitative studies, assumes that the researcher performs a role in observing the world around them and that reality is unique to each observer. In other words, reality is observed subjectively.

These are just two philosophies (there are many more), but they demonstrate significantly different approaches to research and have a significant impact on all the methodological choices. Therefore, it’s vital that you clearly outline and justify your research philosophy at the beginning of your methodology chapter, as it sets the scene for everything that follows.

The research philosophy is at the core of the methodology chapter

Methodological Choice #2 – Research Type

The next thing you would typically discuss in your methodology section is the research type. The starting point for this is to indicate whether the research you conducted is inductive or deductive.

Inductive research takes a bottom-up approach, where the researcher begins with specific observations or data and then draws general conclusions or theories from those observations. Therefore, these studies tend to be exploratory in approach.

Conversely, deductive research takes a top-down approach, where the researcher starts with a theory or hypothesis and then tests it using specific observations or data. Therefore, these studies tend to be confirmatory in approach.

Related to this, you’ll need to indicate whether your study adopts a qualitative, quantitative or mixed approach. As we’ve mentioned, there’s a strong link between this choice and your research philosophy, so make sure that your choices are tightly aligned. When you write this section up, remember to clearly justify your choices, as they form the foundation of your study.

Methodological Choice #3 – Research Strategy

Next, you’ll need to discuss your research strategy (also referred to as a research design ). This methodological choice refers to the broader strategy in terms of how you’ll conduct your research, based on the aims of your study.

Several research strategies exist, including experimental, case studies, ethnography, grounded theory, action research, and phenomenology. Let’s take a look at two of these, experimental and ethnographic, to see how they contrast.

Experimental research makes use of the scientific method , where one group is the control group (in which no variables are manipulated ) and another is the experimental group (in which a specific variable is manipulated). This type of research is undertaken under strict conditions in a controlled, artificial environment (e.g., a laboratory). By having firm control over the environment, experimental research typically allows the researcher to establish causation between variables. Therefore, it can be a good choice if you have research aims that involve identifying causal relationships.

Ethnographic research , on the other hand, involves observing and capturing the experiences and perceptions of participants in their natural environment (for example, at home or in the office). In other words, in an uncontrolled environment.  Naturally, this means that this research strategy would be far less suitable if your research aims involve identifying causation, but it would be very valuable if you’re looking to explore and examine a group culture, for example.

As you can see, the right research strategy will depend largely on your research aims and research questions – in other words, what you’re trying to figure out. Therefore, as with every other methodological choice, it’s essential to justify why you chose the research strategy you did.

Methodological Choice #4 – Time Horizon

The next thing you’ll need to detail in your methodology chapter is the time horizon. There are two options here: cross-sectional and longitudinal. In other words, whether the data for your study were all collected at one point in time (cross-sectional) or at multiple points in time (longitudinal).

The choice you make here depends again on your research aims, objectives and research questions. If, for example, you aim to assess how a specific group of people’s perspectives regarding a topic change over time, you’d likely adopt a longitudinal time horizon.

Another important factor to consider is simply whether you have the time necessary to adopt a longitudinal approach (which could involve collecting data over multiple months or even years). Oftentimes, the time pressures of your degree program will force your hand into adopting a cross-sectional time horizon, so keep this in mind.

Methodological Choice #5 – Sampling Strategy

Next, you’ll need to discuss your sampling strategy. There are two main categories of sampling: probability and non-probability sampling.

Probability sampling involves a random (and therefore representative) selection of participants from a population, whereas non-probability sampling entails selecting participants in a non-random (and therefore non-representative) manner. For example, selecting participants based on ease of access (this is called a convenience sample).

The right sampling approach depends largely on what you’re trying to achieve in your study. Specifically, whether you’re trying to develop findings that are generalisable to a population or not. Practicalities and resource constraints also play a large role here, as it can oftentimes be challenging to gain access to a truly random sample. In the video below, we explore some of the most common sampling strategies.
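The contrast between probability and non-probability sampling can be sketched in a few lines; the population IDs below are hypothetical:

```python
import random

population = list(range(1, 101))  # hypothetical IDs of 100 customers

random.seed(7)  # seeded only so the illustration is reproducible
probability_sample = random.sample(population, 10)  # every member equally likely
convenience_sample = population[:10]                # non-random: the first 10 to hand

print(probability_sample)
print(convenience_sample)
```

The convenience sample systematically over-represents whoever happens to be listed first, which is precisely the representativeness problem described above.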

Methodological Choice #6 – Data Collection Method

Next up, you’ll need to explain how you’ll go about collecting the necessary data for your study. Your data collection method (or methods) will depend on the type of data that you plan to collect – in other words, qualitative or quantitative data.

Typically, quantitative research relies on surveys, data generated by lab equipment, analytics software or existing datasets. Qualitative research, on the other hand, often makes use of collection methods such as interviews, focus groups, participant observations, and ethnography.

So, as you can see, there is a tight link between this section and the design choices you outlined in earlier sections. Strong alignment between these sections, as well as your research aims and questions is therefore very important.

Methodological Choice #7 – Data Analysis Methods/Techniques

The final major methodological choice that you need to address is that of analysis techniques. In other words, how you’ll go about analysing your data once you’ve collected it. Here it’s important to be very specific about your analysis methods and/or techniques – don’t leave any room for interpretation. Also, as with all choices in this chapter, you need to justify each choice you make.

What exactly you discuss here will depend largely on the type of study you’re conducting (i.e., qualitative, quantitative, or mixed methods). For qualitative studies, common analysis methods include content analysis, thematic analysis and discourse analysis. In the video below, we explain each of these in plain language.

For quantitative studies, you’ll almost always make use of descriptive statistics, and in many cases, you’ll also use inferential statistical techniques (e.g., correlation and regression analysis). In the video below, we unpack some of the core concepts involved in descriptive and inferential statistics.
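As a toy illustration of descriptive statistics plus one simple inferential measure (correlation), here is a sketch with hypothetical paired observations; the data and variable names are invented for the example:

```python
from statistics import mean, stdev

# Hypothetical paired observations, e.g. hours studied vs exam score.
x = [1, 2, 3, 4, 5, 6]
y = [52, 55, 61, 64, 70, 74]

def pearson(a, b):
    """Sample Pearson correlation coefficient."""
    ma, mb = mean(a), mean(b)
    cov = sum((i - ma) * (j - mb) for i, j in zip(a, b)) / (len(a) - 1)
    return cov / (stdev(a) * stdev(b))

print(f"mean={mean(y):.1f}, sd={stdev(y):.2f}, r={pearson(x, y):.3f}")
# mean=62.7, sd=8.48, r=0.996
```

The mean and standard deviation describe the sample itself; the correlation coefficient is the first step toward inferring a relationship between the two variables.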

In this section of your methodology chapter, it’s also important to discuss how you prepared your data for analysis, and what software you used (if any). For example, quantitative data will often require some initial preparation, such as removing duplicates or incomplete responses. Similarly, qualitative data will often require transcription and perhaps even translation. As always, remember to state both what you did and why you did it.
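The preparation step mentioned above, removing duplicates and incomplete responses, might look like this in practice. The respondent records are hypothetical, and the sketch assumes one submission per respondent ID:

```python
# Hypothetical raw survey responses; None marks a skipped answer.
raw = [
    {"id": 1, "q1": 4, "q2": "Great service"},
    {"id": 1, "q1": 4, "q2": "Great service"},   # duplicate submission
    {"id": 2, "q1": None, "q2": "Too slow"},     # incomplete: q1 skipped
    {"id": 3, "q1": 5, "q2": "Loved it"},
]

seen = set()
clean = []
for row in raw:
    complete = all(v is not None for v in row.values())
    if complete and row["id"] not in seen:
        seen.add(row["id"])
        clean.append(row)

print(len(clean))  # 2 rows survive the cleaning
```

Whatever rule you apply (here: drop duplicates and any row with a skipped answer), the methodology chapter should state it explicitly so the cleaning is reproducible.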

Section 3 – The Methodological Limitations

With the key methodological choices outlined and justified, the next step is to discuss the limitations of your design. No research methodology is perfect – there will always be trade-offs between the “ideal” methodology and what’s practical and viable, given your constraints. Therefore, this section of your methodology chapter is where you’ll discuss the trade-offs you had to make, and why these were justified given the context.

Methodological limitations can vary greatly from study to study, ranging from common issues such as time and budget constraints to issues of sample or selection bias. For example, you may find that you didn’t manage to draw in enough respondents to achieve the desired sample size (and therefore, statistically significant results), or your sample may be skewed heavily towards a certain demographic, thereby negatively impacting representativeness.

In this section, it’s important to be critical of the shortcomings of your study. There’s no use trying to hide them (your marker will be aware of them regardless). By being critical, you’ll demonstrate to your marker that you have a strong understanding of research theory, so don’t be shy here. At the same time, don’t beat your study to death. State the limitations, why these were justified, how you mitigated their impacts to the best degree possible, and how your study still provides value despite these limitations.

Section 4 – Concluding Summary

Finally, it’s time to wrap up the methodology chapter with a brief concluding summary. In this section, you’ll want to concisely summarise what you’ve presented in the chapter. Here, it can be a good idea to use a figure to summarise the key decisions, especially if your university recommends using a specific model (for example, Saunders’ Research Onion).

Importantly, this section needs to be brief – a paragraph or two maximum (it’s a summary, after all). Also, make sure that when you write up your concluding summary, you include only what you’ve already discussed in your chapter; don’t add any new information.


Methodology Chapter Example

In the video below, we walk you through an example of a high-quality research methodology chapter from a dissertation. We also unpack our free methodology chapter template so that you can see how best to structure your chapter.

Wrapping Up

And there you have it – the methodology chapter in a nutshell. As we’ve mentioned, the exact contents and structure of this chapter can vary between universities, so be sure to check in with your institution before you start writing. If possible, try to find dissertations or theses from former students of your specific degree program – this will give you a strong indication of the expectations and norms when it comes to the methodology chapter (and all the other chapters!).

Also, remember the golden rule of the methodology chapter – justify every choice! Make sure that you clearly explain the “why” for every “what”, and reference credible methodology textbooks or academic sources to back up your justifications.

If you need a helping hand with your research methodology (or any other component of your research), be sure to check out our private coaching service, where we hold your hand through every step of the research journey. Until next time, good luck!




Creating Effective Surveys: 6 Steps for Gathering Relevant Feedback


Article Summary:

  • Surveys help you quickly reach and engage with a diverse audience at a much lower cost.
  • Follow our six-step guide to design your survey to gather the actionable feedback you need to make informed business decisions.
  • With an online application like SurveyMethods you can develop and deploy your survey in minutes via email or web and watch the results come in — in real time.

Too often, business leaders operate in a vacuum, making their best guess as to what customers truly want or need. In the end, there’s no better approach to understanding your customers than simply asking them. One of the most effective ways to keep an open dialogue with customers is to use surveys, which give your customers (and employees, in some cases) a chance to provide the feedback you need to understand what’s important to them.

Conducting Effective Online Surveys

So, you’ve wisely decided to survey your customers—and the rest of your company has “bought into” the idea. As is so often the case with new-technology adoption, it pays huge dividends and accelerates ROI to partner with an online survey leader, such as SurveyMethods, to benefit from their expertise and experience to help you every step of the way. Using our online survey software, you can quickly and easily design effective surveys by following these six steps.

Six Steps for Conducting Effective Online Surveys

1. Define Your Survey’s Objectives

The first key to a successful survey is to define objectives. Exactly what is it you want to know? Is there a problem (or problems) that needs solving? What actions are you prepared to implement depending on the results of the survey? Put a survey together with less-than-focused objectives, and you almost guarantee a survey with unclear results. List the questions your survey should answer. Do you want to know what your customers’ satisfaction levels are by segment? Do you want to ask if they’d recommend your company to others? Do you want to measure in what format and how often your customers prefer to receive communications from your marketing department? Focus on the big picture, and keep your objectives narrowly scoped; more complex surveys tend to result in less meaningful results.

2. Identify Your Target Audience

Who should you survey? You may want to start with your existing customer base, but consider surveying prospects in other markets, as well. The proper sizes of survey samples depend on budget and the time available to analyze the results and act on them. Statistically, larger sample sizes deliver more accurate results. The good news: today, the web makes it easier than ever to sample large groups quickly and cost-efficiently.
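To make the sample-size trade-off concrete, one common starting point is Cochran’s formula for estimating a proportion. The sketch below assumes a 95% confidence level and a ±5% margin of error, both of which are illustrative defaults rather than recommendations:

```python
# Cochran's formula for the sample size needed to estimate a proportion.
# z: z-score for the chosen confidence level (1.96 for 95%),
# p: expected proportion (0.5 is the most conservative choice),
# e: desired margin of error.
import math

def cochran_sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> int:
    return math.ceil(z**2 * p * (1 - p) / e**2)

# ~385 completed responses for 95% confidence and a ±5% margin of error
print(cochran_sample_size())        # 385
print(cochran_sample_size(e=0.03))  # 1068 — tighter margins need many more responses
```

Note that this estimates completed responses; since not everyone invited will respond, you would invite a larger group based on your expected response rate.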

3. Prioritize Your Questions

Obviously, every survey revolves around a specific set of questions, but with so many options, where do you start? Create questions related to your goals and objectives from Step 1. What customer attitudes or perceptions do you want to measure? What answers might ultimately help you to make more informed decisions? Remember, always provide an option that allows a recipient to say, “I don’t know” or skip a question entirely, especially when you’re asking for subjective opinions vs. quantitative facts. Also, don’t ask more questions than necessary; the shorter the survey, the better your chances of success.

4. Test the Survey

It may sound obvious, but before you hit “Send” and broadcast the survey to your selected sample, test it thoroughly across as many devices, operating systems, and web browsers as possible. Try to “break” it in any way you can, because it’s an unpredictable technology world out there.

5. Communicate the Survey’s Purpose

It’s important to communicate to customers why they’re being surveyed, how you’d appreciate their support and what you intend to do with the information you gather. In other words, what’s in it for THEM? Explain why the survey is relevant to the recipient. Will it help the company create better products and services, improve customer service, seek more competitive pricing, etc.? One proven technique is to send an email announcing the survey to your existing customer base, asking for assistance and highlighting a direct link to the survey within the message. Providing an incentive can greatly increase response rates, especially from your top customer segments; it’s amazing what customers will do for a ballpoint pen, free T-shirt or other promotional items you may have sitting around in boxes anyway.

6. Analyze and Act Upon the Results

As soon as you send a survey and results begin to trickle in, you can begin analyzing the data. Once it’s in your database, it can be sliced, diced and analyzed as needed in spreadsheets, presentation programs and statistical software. Finally, it’s time to act. Compare the results of your survey to your original objectives, coming up with specific and actionable business responses as a result. After all, isn’t that the reason you surveyed customers in the first place?
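As a sketch of what that slicing and dicing might look like once responses are exported from your survey tool, here is a simple segment-level summary in pandas. The `segment` and `satisfaction` columns are hypothetical stand-ins for whatever fields your survey actually collects:

```python
# Hypothetical exported responses: a customer segment label and a
# 1-5 satisfaction rating per respondent.
import pandas as pd

responses = pd.DataFrame({
    "segment": ["enterprise", "enterprise", "smb", "smb", "smb"],
    "satisfaction": [5, 4, 3, 4, 2],
})

# Average satisfaction and response count per segment — the kind of
# breakdown you would compare against the objectives from Step 1
# (e.g. "what are satisfaction levels by segment?").
summary = responses.groupby("segment")["satisfaction"].agg(["mean", "count"])
print(summary)
```

The output answers the Step 1 question directly (here, enterprise averages 4.5 across 2 responses versus 3.0 across 3 for SMB), which is what turns raw responses into an actionable comparison.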

Advantages of Deploying Your Survey Online

Web-based surveys are both quick and cost-effective, allowing you to reach broad audiences quickly and analyze results in real-time. Compared with traditional mail surveys, using an application like SurveyMethods has many benefits and limited drawbacks.

Advantages:

  • Extremely fast: An email survey or a survey posted on a popular website can gather several thousand responses within a few hours.
  • Cost: After initial setup, there’s practically none.
  • Complex logic: Online surveys can use complex question-skipping logic, randomization, quotas, and other features that can deliver better data.
  • Customization: Questionnaires can include colors, fonts, and other formatting options, and can easily be customized for different users with custom fields and mail merge.
  • Media flexibility: With the SurveyMethods survey software, you can display pictures and video and play sound, all from within your survey.
  • Higher response rates, compared to ordinary “snail” mail surveys.
  • The honesty factor: Most people will give more honest answers to a computer than to a person or a sheet of paper.
  • More details: Respondents typically provide longer answers to open-ended questions on the web than they do on other kinds of self-administered surveys.
  • Superior analysis: SurveyMethods provides you with easy-to-use analysis tools that enable you to conduct group analysis, segmentation analysis, and more.
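The “complex logic” advantage above refers to branching: which question a respondent sees next depends on an earlier answer. A minimal sketch of that idea, with hypothetical question IDs and a made-up branching rule:

```python
# Minimal sketch of question-skipping (branching) logic: the next question
# shown depends on the respondent's previous answer. Question IDs, wording,
# and the branching rule are all hypothetical.
SURVEY = {
    "q1": {"text": "Have you used our product?",
           "next": lambda answer: "q2" if answer == "yes" else "q3"},
    "q2": {"text": "How satisfied are you (1-5)?",
           "next": lambda answer: None},   # None means the survey ends here
    "q3": {"text": "What stopped you from trying it?",
           "next": lambda answer: None},
}

def next_question(current: str, answer: str):
    """Return the ID of the next question to display, or None when done."""
    return SURVEY[current]["next"](answer)

print(next_question("q1", "yes"))  # q2 — users go to the satisfaction question
print(next_question("q1", "no"))   # q3 — non-users go to the barriers question
```

In a real survey tool this logic is configured through the product’s interface rather than written by hand; the payoff is that each respondent only sees questions relevant to them, which keeps surveys short and data clean.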

Disadvantages:

  • You must have (or purchase) a list of email addresses, or have a high-traffic website to distribute your link on (social media makes this much easier).
  • Many people dislike unsolicited email even more than unsolicited regular mail.
  • People can easily quit mid-questionnaire online.
  • If you deploy the survey via a custom URL posted on a web page or on social media, you often have no idea who’s replying. If that information is desired (and asking for it won’t result in less candid responses), make sure you ask for contact information, as well as questions about the respondent’s background, demographics, etc.

Surveys are by no means new. But today, conducting a survey with a large sample size is significantly more efficient and affordable with new technologies and the support of online survey industry leaders, such as the proven experts at SurveyMethods. It’s no longer necessary to assume or guess what your customers’ needs and expectations might be. Ask them yourself, with a quick, cost-efficient online survey. You might be surprised with what you learn!

If you need help in creating and implementing your online research strategy, contact SurveyMethods or start your free trial today.


  • Gain insights for adding or improving the functionality and usability of our website.
  • Monitor and prevent abuse.
  • To prevent any undesirable, abusive, or illegal activities, we have automated processes in place that check your data for malicious activities, spam, and fraud.
  • We may use your data if required by law, court orders, subpoenas, or to enforce our agreements.
  • We collect information using cookies. Cookies are digital files that allow websites to recognize returning users. While most browsers allow users to refuse cookies or request permission on a case-by-case basis, our site will not function properly without them. SurveyMethods uses cookies primarily to enable the smooth functioning of its Services.
  • While accessing SurveyMethods, you may be able to access links that take you to websites external to SurveyMethods. SurveyMethods is not responsible for the content, policies, or terms of these websites.

GDPR Legal Classification for registered users

Legitimate interest:  Registering and administering accounts on our website provides access to content, allows you to buy goods and services, and facilitates the running and operation of our business.

Transfer and storage of your information 

Information you submit to us via the registration form on our website may be stored outside the European Economic Area on our third-party hosting provider’s servers.

When you register as an end user:

  • SurveyMethods’ Surveys and Polls sent by Registered Users
  • Newsletters from SurveyMethods Newsletter module and sent by Registered Users
  • When responding to a survey or a poll, End Users may provide personal data such as first name, last name, phone number, email address, demographic data like age, date of birth, gender, education, income, marital status, and any other sensitive data that directly or indirectly identifies them. SurveyMethods does not use or share any data of End Users in any way. The Registered User is solely responsible for ensuring that collection and sharing of any End User data, personal or otherwise, is done with the End User’s consent and in accordance with applicable data protection laws.
  • Since the Registered User controls and manages all data of their surveys, polls, and newsletters, End Users may contact the Registered User for any concerns regarding consent, privacy and protection of their data, or if they wish to access, modify, or delete their data.

GDPR Legal Classification for End Users

Visitors to our website

When you visit our website:

  • SurveyMethods may record your personal data (such as your name, email address, phone, company, and the reason you are contacting us) when you visit the SurveyMethods website and contact us using our online form. Any consent for the collection and use of your data in this case is entirely voluntary.
  • We may use your contact information to respond to you. We do not share any personally identifiable information with a third party without your explicit consent.

GDPR Legal Classification for Visitors

When you place an order

We collect and use information from individuals who place an order on our website in accordance with this section and the section entitled 'Disclosure and additional uses of your information'.

Information collected when you place an order

Mandatory information

When you place an order for goods or services on our website, we collect your name, email address, and billing address.

If you do not provide this information, you will not be able to purchase goods or services from us on our website or enter into a contract with us.

Legal basis for processing:  Compliance with a legal obligation (Article 6(1)(c) of the General Data Protection Regulation).

Legal obligation:  We have a legal obligation to issue you with an invoice for the goods and services you purchase from us where you are VAT registered and we require the mandatory information collected by our checkout form for this purpose. We also have a legal obligation to keep accounting records, including records of transactions.

Additional information 

We may also collect additional information from you, such as your phone number, full name, address, etc.

We use this information to manage and improve your customer experience with us.

If you do not supply the additional information requested at checkout, you will not be able to complete your order as we will not have the correct level of information to adequately manage your account.

Legitimate interests: The ability to provide adequate customer service and management of your customer account.

Our content, goods and services

When signing up for content, registering on our website or making a payment, we will use the information you provide in order to contact you regarding related content, products and services.

We will continue to send you marketing communications in relation to similar goods and services if you do not opt out from receiving them.

You can opt-out from receiving marketing communications at any time by emailing [email protected] .

Legitimate interests:  Sharing relevant, timely and industry-specific information on related business services, in order to help your organisation achieve its goals.

Third party goods and services

In addition to receiving information about our products and services, you can opt in to receiving marketing communications from us in relation to third party goods and services by email by ticking a box indicating that you would like to receive such communications.

Legal basis for processing:  Consent (Article 6(1)(a) of the General Data Protection Regulation).

Consent:  You give your consent to us sending you information about third party goods and services by signing up to receive such information in accordance with the steps described above.

Information for marketing campaigns will be stored outside the European Economic Area on our third-party mailing list provider’s servers in the United States.

For further information about the safeguards used when your information is transferred outside the European Economic Area, see the section of this privacy policy below entitled 'Transfers of your information outside the European Economic Area'.

Use of tracking in emails

We use technologies such as tracking pixels (small graphic files) and tracked links in the emails we send to allow us to assess the level of engagement our emails receive by measuring information such as the delivery rates, open rates, click through rates and content engagement that our emails achieve.
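As a minimal illustration of how such tracking works (the domain, endpoints, and token parameter below are hypothetical, not our actual infrastructure), an HTML email can embed a one-pixel image whose URL carries a per-recipient token; when the mail client fetches the image, the sender's server can record an open, and links routed through a redirect endpoint allow click-through measurement:

```python
from email.mime.text import MIMEText

def build_tracked_email(recipient_token: str) -> MIMEText:
    """Sketch of an HTML email carrying a tracking pixel and a tracked link.
    The domain, paths, and token parameter are hypothetical."""
    # A 1x1 image whose URL carries an opaque per-recipient token;
    # fetching it signals an "open" to the sending server.
    pixel = (f'<img src="https://example.com/t/open?tk={recipient_token}"'
             ' width="1" height="1" alt="">')
    # Links are routed through a redirect endpoint to measure click-through.
    link = (f'<a href="https://example.com/t/click?tk={recipient_token}'
            f'&to=https%3A%2F%2Fexample.com%2Fnews">Read more</a>')
    html = f"<html><body><p>Hello!</p><p>{link}</p>{pixel}</body></html>"
    msg = MIMEText(html, "html")
    msg["Subject"] = "Newsletter"
    return msg
```

Delivery, open, and click rates are then aggregated from the server logs of those two endpoints.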

This section sets out how we obtain or collect information about you from third parties.

Information received from third parties

We can often receive information about you from third parties. The third parties from which we receive information about you can include partner events within the marketing industry and other organisations that we have a professional affiliation with.

It is also possible that third parties with whom we have had no prior contact may provide us with information about you.

Information we obtain from third parties will generally be your name and contact details but will include any additional information about you which they provide to us.

Reason why necessary to perform a contract:  Where a third party has passed on information about you to us (such as your name and email address) in order for us to provide services to you, we will process your information in order to take steps at your request to enter into a contract with you and perform a contract with you (as the case may be).

Consent:  Where you have asked a third party to share information about you with us and the purpose of sharing that information is not related to the performance of a contract or services by us to you, we will process your information on the basis of your consent, which you give by asking the third party in question to pass on your information to us.

Legitimate interests:  Where a third party has shared information about you with us and you have not consented to the sharing of that information, we will have a legitimate interest in processing that information in certain circumstances.

For example, we would have a legitimate interest in processing your information to perform our obligations under a sub-contract with the third party, where the third party has the main contract with you. Our legitimate interest is the performance of our obligations under our sub-contract.

Similarly, third parties may pass on information about you to us if you have infringed or potentially infringed any of our legal rights. In this case, we will have a legitimate interest in processing that information to investigate and pursue any such potential infringement.

Information obtained by us from third parties

In certain circumstances (for example, to verify the information we hold about you or obtain missing information we require to provide you with a service) we will obtain information about you from certain publicly accessible sources, both EU and non-EU, such as Companies House, online customer databases, business directories, media publications, social media, and websites (including your own website if you have one).

In certain circumstances we will also obtain information about you from private sources, both EU and non-EU, such as marketing data services.

Legitimate interests:  Sharing relevant, timely and industry-specific information on related business services.

Where we receive information about you in error

If we receive information about you from a third party in error and/or we do not have a legal basis for processing that information, we will delete your information.

This section sets out the circumstances in which we will disclose information about you to third parties and any additional purposes for which we use your information.

Disclosure of your information to service providers

We use a number of third parties to provide us with services which are necessary to run our business or to assist us with running our business.

These include the following: Internet services, IT service providers and web developers.

Our third-party service providers are located both inside and outside of the European Economic Area.

Your information will be shared with these service providers where necessary to provide you with the service you have requested, whether that is accessing our website or ordering goods and services from us.

We do not display the identities of our service providers publicly by name for security and competitive reasons. If you would like further information about the identities of our service providers, however, please contact us directly by email and we will provide you with such information where you have a legitimate reason for requesting it (where we have shared your information with such service providers, for example).

Legal basis for processing:  Legitimate interests (Article 6(1)(f) of the General Data Protection Regulation).

Legitimate interest relied on:  Where we share your information with these third parties in a context other than where it is necessary to perform a contract (or take steps at your request to do so), we will share your information with such third parties in order to allow us to run and manage our business efficiently.

Legal basis for processing:  Necessary to perform a contract and/or to take steps at your request prior to entering into a contract (Article 6(1)(b) of the General Data Protection Regulation).

Reason why necessary to perform a contract:  We may need to share information with our service providers to enable us to perform our obligations under that contract or to take the steps you have requested before we enter into a contract with you.

Disclosure and use of your information for legal reasons

Indicating possible criminal acts or threats to public security to a competent authority.

If we suspect that criminal or potential criminal conduct has occurred, we will in certain circumstances need to contact an appropriate authority, such as the police. This could be the case, for instance, if we suspect that fraud or a cyber-crime has been committed or if we receive threats or malicious communications towards us or third parties.

We will generally only need to process your information for this purpose if you were involved or affected by such an incident in some way.

Legitimate interests:  Preventing crime or suspected criminal activity (such as fraud).

In connection with the enforcement or potential enforcement of our legal rights

We will use your information in connection with the enforcement or potential enforcement of our legal rights including, for example, sharing information with debt collection agencies if you do not pay amounts owed to us when you are contractually obliged to do so. Our legal rights may be contractual (where we have entered into a contract with you) or non-contractual (such as legal rights that we have under copyright law or tort law).

Legitimate interest:  Enforcing our legal rights and taking steps to enforce our legal rights.

In connection with a legal or potential legal dispute or proceedings

We may need to use your information if we are involved in a dispute with you or a third party for example, either to resolve the dispute or as part of any mediation, arbitration or court resolution or similar process.

Legitimate interest(s):  Resolving disputes and potential disputes.

This section sets out how long we retain your information. We have set out specific retention periods where possible. Where that has not been possible, we have set out the criteria we use to determine the retention period.

Retention periods

Server log information: We retain information on our server logs for 3 months.

Order information: When you place an order for goods and services, we retain that information for seven years following the end of the financial year in which you placed your order, in accordance with our legal obligation to keep records for tax purposes.

Correspondence and enquiries: When you make an enquiry or correspond with us for any reason, whether by email, via our contact form, or by phone, we will retain your information for as long as it takes to respond to and resolve your enquiry, and for a further 36 months, after which we will archive your information.

Newsletter: We retain the information you used to sign up for our newsletter for as long as you remain subscribed (i.e. you do not unsubscribe).

Registration: We retain the information you used to register for as long as you remain registered (i.e. you do not deregister).
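To make the order-information rule above concrete, here is a small illustrative sketch of computing when an order record becomes eligible for deletion; it assumes a financial year ending 31 December, which is an assumption for illustration only:

```python
from datetime import date

def order_purge_date(order_date: date,
                     fy_end_month: int = 12, fy_end_day: int = 31) -> date:
    """Return the date after which an order record may be deleted:
    seven years following the end of the financial year of the order."""
    fy_end = date(order_date.year, fy_end_month, fy_end_day)
    if order_date > fy_end:  # order falls into the next financial year
        fy_end = date(order_date.year + 1, fy_end_month, fy_end_day)
    return fy_end.replace(year=fy_end.year + 7)

print(order_purge_date(date(2020, 3, 15)))  # -> 2027-12-31
```

An order placed on 15 March 2020 falls in the financial year ending 31 December 2020, so the record is kept until 31 December 2027.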

Criteria for determining retention periods

In any other circumstances, we will retain your information for no longer than necessary, taking into account the following:

  • the purpose(s) and use of your information both now and in the future (such as whether it is necessary to continue to store that information in order to continue to perform our obligations under a contract with you or to contact you in the future);
  • whether we have any legal obligation to continue to process your information (such as any record-keeping obligations imposed by relevant law or regulation);
  • whether we have any legal basis to continue to process your information (such as your consent);
  • how valuable your information is (both now and in the future);
  • any relevant agreed industry practices on how long information should be retained;
  • the levels of risk, cost and liability involved with us continuing to hold the information;
  • how hard it is to ensure that the information can be kept up to date and accurate; and
  • any relevant surrounding circumstances (such as the nature and status of our relationship with you).

We take appropriate technical and organisational measures to secure your information and to protect it against unauthorised or unlawful use and accidental loss or destruction, including:

  • only sharing and providing access to your information to the minimum extent necessary, subject to confidentiality restrictions where appropriate, and on an anonymised basis wherever possible;
  • using secure servers to store your information;
  • verifying the identity of any individual who requests access to information prior to granting them access to information;
  • using Secure Sockets Layer (SSL) software to encrypt any payment transactions you make on or via our website;
  • only transferring your information via closed-system or encrypted data transfers.

Transmission of information to us by email

Transmission of information over the internet is not entirely secure, and if you submit any information to us over the internet (whether by email, via our website or any other means), you do so entirely at your own risk.

We cannot be responsible for any costs, expenses, loss of profits, harm to reputation, damages, liabilities or any other form of loss or damage suffered by you as a result of your decision to transmit information to us by such means.

Your information may be transferred and stored outside the European Economic Area (EEA) in the circumstances set out earlier in this policy.

We will also transfer your information outside the EEA or to an international organisation in order to comply with legal obligations to which we are subject (compliance with a court order, for example). Where we are required to do so, we will ensure appropriate safeguards and protections are in place.

Subject to certain limitations on certain rights, you have the following rights in relation to your information, which you can exercise by writing to the data controller using the details provided at the top of this policy.

  • to request access to your information and information related to our use and processing of your information;
  • to request the correction or deletion of your information;
  • to request that we restrict our use of your information;
  • to receive information which you have provided to us in a structured, commonly used and machine-readable format (e.g. a CSV file) and the right to have that information transferred to another data controller (including a third-party data controller);
  • to object to the processing of your information for certain purposes (for further information, see the section below entitled 'Your right to object to the processing of your information for certain purposes'); and
  • to withdraw your consent to our use of your information at any time where we rely on your consent to use or process that information. Please note that if you withdraw your consent, this will not affect the lawfulness of our use and processing of your information on the basis of your consent before the point in time when you withdraw your consent.
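A "structured, commonly used and machine-readable format" can be as simple as a CSV file. As an illustrative sketch only (the field names are hypothetical, not our actual export schema), a portability export might be produced like this:

```python
import csv
import io

def export_user_data(records: list[dict]) -> str:
    """Serialise a user's stored records as CSV for a data-portability request."""
    if not records:
        return ""
    buf = io.StringIO()
    # Sort field names so the column order is deterministic.
    writer = csv.DictWriter(buf, fieldnames=sorted(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

csv_text = export_user_data([
    {"email": "user@example.com", "first_name": "Ada", "country": "UK"},
])
print(csv_text.splitlines()[0])  # -> country,email,first_name
```

A file in this form can be imported directly by another data controller's systems, which is the point of the portability right.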

In accordance with Article 77 of the General Data Protection Regulation, you also have the right to lodge a complaint with a supervisory authority, in particular in the Member State of your habitual residence, place of work, or place of an alleged infringement of the General Data Protection Regulation.

Further information on your rights in relation to your personal data as an individual

You can find out further information about your rights, as well as information on any limitations which apply to those rights, by reading the underlying legislation contained in Articles 12 to 22 and 34 of the General Data Protection Regulation, which is available here: http://ec.europa.eu/justice/data-protection/reform/files/regulation_oj_en.pdf

Verifying your identity where you request access to your information

Where you request access to your information, we are required by law to use all reasonable measures to verify your identity before doing so.

These measures are designed to protect your information and to reduce the risk of identity fraud, identity theft or general unauthorised access to your information.

How we verify your identity

Where we possess appropriate information about you on file, we will attempt to verify your identity using that information.

If it is not possible to identify you from such information, or if we have insufficient information about you, we may require original or certified copies of certain documentation in order to be able to verify your identity before we are able to provide you with access to your information.

We will be able to confirm the precise information we require to verify your identity in your specific circumstances if and when you make such a request.

Your right to object

You have the following rights in relation to your information, which you may exercise by writing to the data controller using the details provided at the top of this policy.

  • to object to us using or processing your information where we use or process it in order to  carry out a task in the public interest or for our legitimate interests , including ‘profiling’ (i.e. analysing or predicting your behaviour based on your information) based on any of these purposes; and
  • to object to us using or processing your information for  direct marketing purposes (including any profiling we engage in that is related to such direct marketing).

You may also exercise your right to object to us using or processing your information for direct marketing purposes by:

  • clicking the unsubscribe link contained at the bottom of any marketing email we send to you and following the instructions which appear in your browser following your clicking on that link;
  • sending an email to [email protected] asking that we stop sending you marketing communications, or including the words “OPT OUT”.

Sensitive Personal Information

‘Sensitive personal information’ is information about an individual that reveals their racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, genetic information, biometric information for the purpose of uniquely identifying an individual, information concerning health or information concerning a natural person’s sex life or sexual orientation.

Our website may allow you to register sensitive personal information; where we ask for this, you will be considered to have explicitly consented to our processing of that sensitive personal information under Article 9(2)(a) of the General Data Protection Regulation.

We update and amend our Privacy Policy from time to time.

Minor changes to our Privacy Policy 

Where we make minor changes to our Privacy Policy, we will update our Privacy Policy with a new effective date stated at the beginning of it. Our processing of your information will be governed by the practices set out in that new version of the Privacy Policy from its effective date onwards.

Major changes to our Privacy Policy or the purposes for which we process your information 

Where we make major changes to our Privacy Policy or intend to use your information for a new purpose or a different purpose than the purposes for which we originally collected it, we will notify you by email (where possible) or by posting a notice on our website.

We will provide you with the information about the change in question and the purpose and any other relevant information before we use your information for that new purpose.

Wherever required, we will obtain your prior consent before using your information for a purpose that is different from the purposes for which we originally collected it.

Because we care about the safety and privacy of children online, we comply with the Children’s Online Privacy Protection Act of 1998 (COPPA). COPPA and its accompanying regulations protect the privacy of children using the internet. We do not knowingly contact or collect information from persons under the age of 18. The website is not intended to solicit information of any kind from persons under the age of 18.

It is possible that we could receive information pertaining to persons under the age of 18 by the fraud or deception of a third party. If we are notified of this, as soon as we verify the information, we will, where required by law to do so, immediately obtain the appropriate parental consent to use that information or, if we are unable to obtain such parental consent, we will delete the information from our servers. If you would like to notify us of our receipt of information about persons under the age of 18, please do so by contacting us by using the details at the top of this policy.

The Strictly Necessary Cookie should be enabled at all times so that we can save your preferences for cookie settings.

If you disable this cookie, we will not be able to save your preferences. This means that every time you visit this website you will need to enable or disable cookies again.

OUR COOKIES AND YOU

Hello! If you are reading this, then you care about privacy – and your privacy is very important to us. Cookies are an important part of almost all online companies these days, and this page describes what they are, how we use them, what data they collect, and most importantly, how you can change your browser settings to turn them off.

WHAT IS A COOKIE?

A cookie is a file containing an identifier (a string of letters and numbers) that is sent by a web server to a web browser and is stored by the browser. The identifier is then sent back to the server each time the browser requests a page from the server.

Cookies may be either “persistent” cookies or “session” cookies: a persistent cookie will be stored by a web browser and will remain valid until its set expiry date, unless deleted by the user before the expiry date; a session cookie, on the other hand, will expire at the end of the user session, when the web browser is closed.
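The difference between the two kinds shows up in the Set-Cookie header a server emits: a persistent cookie carries an Expires or Max-Age attribute, a session cookie does not. This sketch uses Python's standard http.cookies module; the cookie names are illustrative:

```python
from http.cookies import SimpleCookie

cookies = SimpleCookie()

# A session cookie: no Expires/Max-Age attribute, so the browser
# discards it when the session ends (the browser is closed).
cookies["session_id"] = "abc123"

# A persistent cookie: Max-Age (or Expires) tells the browser to keep
# it on disk until that time, or until the user deletes it.
cookies["prefs"] = "dark-mode"
cookies["prefs"]["max-age"] = 60 * 60 * 24 * 365  # one year, in seconds

print(cookies.output())
# Set-Cookie: prefs=dark-mode; Max-Age=31536000
# Set-Cookie: session_id=abc123
```

On each subsequent request, the browser sends the stored identifiers back in a Cookie header, which is how the server recognises a returning user.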

Cookies do not typically contain any information that personally identifies a user, but personal information that we store about you may be linked to the information stored in and obtained from cookies.

HOW WE USE COOKIES

We use cookies for a number of different purposes. Some cookies are necessary for technical reasons; some enable a personalized experience for both visitors and registered users; and some allow the display of advertising from selected third party networks. Some of these cookies may be set when a page is loaded, or when a visitor takes a particular action (clicking the “like” or “follow” button on a post, for example).

WHAT COOKIES DOES SURVEYMETHODS.COM USE?

We use cookies for the following purposes:

Cookie | Duration | Category | Purpose
_ga | Persistent (2 years) | Performance | Used by Google Analytics to distinguish users
_gat | Persistent (1 minute) | Performance | Used by Google Analytics to throttle request rate
_gid | Persistent (2 days) | Performance | Used by Google Analytics to distinguish users
__fbp | Persistent (3 months) | Marketing | Used by Facebook to track our advertising campaigns
hssc, hssrc, hstc, hubspotutk | Persistent | Strictly Necessary | Used by HubSpot to help us manage our relationship with our customers
PHPSESSID | Session | Strictly Necessary | Used to store a generic value to identify your session on our website

WHAT COOKIES ARE USED BY OUR SERVICE PROVIDERS?

Our service providers use cookies and those cookies may be stored on your computer when you visit our website.

Google Analytics

We use Google Analytics to analyse the use of our website. Google Analytics gathers information about website use by means of cookies. The information gathered relating to our website is used to create reports about the use of our website. Google’s privacy policy is available at https://www.google.com/policies/privacy/

DoubleClick/Google Adwords

We use Google AdWords, which also incorporates DoubleClick, for marketing and remarketing purposes. Cookies are placed on your PC to help us track the performance of our adverts, as well as to help tailor our marketing to your needs. You can view Google's Privacy Policy here: https://policies.google.com/privacy

Facebook and Facebook Pixel

We use Facebook and Facebook Pixel to track our campaigns and to provide social media abilities on our website, such as visiting our Facebook page, liking content, and more. You can view Facebook's Privacy Policy here: https://www.facebook.com/policy.php

HubSpot

We use HubSpot to manage our relationship with our customers and to track conversions on our website. You can view HubSpot's Privacy Policy here: https://legal.hubspot.com/privacy-policy

MANAGING COOKIES

Most browsers allow you to refuse to accept cookies and to delete cookies. The methods for doing so vary from browser to browser, and from version to version. You can, however, obtain up-to-date information about blocking and deleting cookies via these links:

https://support.google.com/chrome/answer/95647?hl=en

https://support.mozilla.org/en-US/kb/enable-and-disable-cookies-website-preferences

https://www.opera.com/help/tutorials/security/cookies/

https://support.microsoft.com/en-gb/help/17442/windows-internet-explorer-delete-manage-cookies

https://support.apple.com/kb/PH21411

https://privacy.microsoft.com/en-us/windows-10-microsoft-edge

Blocking all cookies will have a negative impact upon the usability of many websites. If you block cookies, you will not be able to use all the features on our website.


Writing Survey Questions

Perhaps the most important part of the survey process is the creation of questions that accurately measure the opinions, experiences and behaviors of the public. Accurate random sampling will be wasted if the information gathered is built on a shaky foundation of ambiguous or biased questions. Creating good measures involves both writing good questions and organizing them to form the questionnaire.

Questionnaire design is a multistage process that requires attention to many details at once. Designing the questionnaire is complicated because surveys can ask about topics in varying degrees of detail, questions can be asked in different ways, and questions asked earlier in a survey may influence how people respond to later questions. Researchers are also often interested in measuring change over time and therefore must be attentive to how opinions or behaviors have been measured in prior surveys.

Surveyors may conduct pilot tests or focus groups in the early stages of questionnaire development in order to better understand how people think about an issue or comprehend a question. Pretesting a survey is an essential step in the questionnaire design process to evaluate how people respond to the overall questionnaire and specific questions, especially when questions are being introduced for the first time.

For many years, surveyors approached questionnaire design as an art, but substantial research over the past forty years has demonstrated that there is a lot of science involved in crafting a good survey questionnaire. Here, we discuss the pitfalls and best practices of designing questionnaires.

Question development

There are several steps involved in developing a survey questionnaire. The first is identifying what topics will be covered in the survey. For Pew Research Center surveys, this involves thinking about what is happening in our nation and the world and what will be relevant to the public, policymakers and the media. We also track opinion on a variety of issues over time so we often ensure that we update these trends on a regular basis to better understand whether people’s opinions are changing.

At Pew Research Center, questionnaire development is a collaborative and iterative process where staff meet to discuss drafts of the questionnaire several times over the course of its development. We frequently test new survey questions ahead of time through qualitative research methods such as focus groups, cognitive interviews, pretesting (often using an online, opt-in sample), or a combination of these approaches. Researchers use insights from this testing to refine questions before they are asked in a production survey, such as on the ATP.

Measuring change over time

Many surveyors want to track changes over time in people’s attitudes, opinions and behaviors. To measure change, questions are asked at two or more points in time. A cross-sectional design surveys different people in the same population at multiple points in time. A panel, such as the ATP, surveys the same people over time. However, it is common for the set of people in survey panels to change over time as new panelists are added and some prior panelists drop out. Many of the questions in Pew Research Center surveys have been asked in prior polls. Asking the same questions at different points in time allows us to report on changes in the overall views of the general public (or a subset of the public, such as registered voters, men or Black Americans), or what we call “trending the data”.
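The wave-over-wave "trending" idea can be sketched in a few lines of Python. The years and response counts below are hypothetical illustrations, not Pew data:

```python
from collections import Counter

def percent_favor(responses):
    """Share of respondents answering 'favor', as a rounded percentage."""
    counts = Counter(responses)
    return round(100 * counts["favor"] / sum(counts.values()))

# Hypothetical waves: the same question asked of different samples over time
# (a cross-sectional design) or of the same panelists (a panel design).
waves = {
    2018: ["favor"] * 52 + ["oppose"] * 48,
    2020: ["favor"] * 58 + ["oppose"] * 42,
    2022: ["favor"] * 61 + ["oppose"] * 39,
}

# "Trending the data": the same computation applied to every wave.
trend = {year: percent_favor(answers) for year, answers in waves.items()}
```

The key discipline is that `percent_favor` (and the question behind it) stays identical across waves, so any movement in `trend` reflects opinion change rather than measurement change.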

When measuring change over time, it is important to use the same question wording and to be sensitive to where the question is asked in the questionnaire to maintain a similar context as when the question was asked previously (see question wording and question order for further information). All of our survey reports include a topline questionnaire that provides the exact question wording and sequencing, along with results from the current survey and previous surveys in which we asked the question.

The Center’s transition from conducting U.S. surveys by live telephone interviewing to an online panel (around 2014 to 2020) complicated some opinion trends, but not others. Opinion trends that ask about sensitive topics (e.g., personal finances or attending religious services) or that elicited volunteered answers (e.g., “neither” or “don’t know”) over the phone tended to show larger differences than other trends when shifting from phone polls to the online ATP. The Center adopted several strategies for coping with changes to data trends that may be related to this change in methodology. If there is evidence suggesting that a change in a trend stems from switching from phone to online measurement, Center reports flag that possibility for readers to try to head off confusion or erroneous conclusions.

Open- and closed-ended questions

One of the most significant decisions that can affect how people answer questions is whether the question is posed as an open-ended question, where respondents provide a response in their own words, or a closed-ended question, where they are asked to choose from a list of answer choices.

For example, in a poll conducted after the 2008 presidential election, people responded very differently to two versions of the question: “What one issue mattered most to you in deciding how you voted for president?” One was closed-ended and the other open-ended. In the closed-ended version, respondents were provided five options and could volunteer an option not on the list.

When explicitly offered the economy as a response, more than half of respondents (58%) chose this answer; only 35% of those who responded to the open-ended version volunteered the economy. Moreover, among those asked the closed-ended version, fewer than one-in-ten (8%) provided a response other than the five they were read. By contrast, fully 43% of those asked the open-ended version provided a response not listed in the closed-ended version of the question. All of the other issues were chosen at least slightly more often when explicitly offered in the closed-ended version than in the open-ended version. (Also see  “High Marks for the Campaign, a High Bar for Obama”  for more information.)

Researchers will sometimes conduct a pilot study using open-ended questions to discover which answers are most common. They will then develop closed-ended questions based on that pilot study, including the most common responses as answer choices. In this way, the questions may better reflect what the public is thinking and how they view a particular issue, or bring to light issues the researchers may not have been aware of.

When asking closed-ended questions, the choice of options provided, how each option is described, the number of response options offered, and the order in which options are read can all influence how people respond. One example of the impact of how categories are defined can be found in a Pew Research Center poll conducted in January 2002. When half of the sample was asked whether it was “more important for President Bush to focus on domestic policy or foreign policy,” 52% chose domestic policy while only 34% said foreign policy. When the category “foreign policy” was narrowed to a specific aspect – “the war on terrorism” – far more people chose it; only 33% chose domestic policy while 52% chose the war on terrorism.

In most circumstances, the number of answer choices should be kept to a relatively small number – just four or perhaps five at most – especially in telephone surveys. Psychological research indicates that people have a hard time keeping more than this number of choices in mind at one time. When the question is asking about an objective fact and/or demographics, such as the religious affiliation of the respondent, more categories can be used. In fact, they are encouraged to ensure inclusivity. For example, Pew Research Center’s standard religion questions include more than 12 different categories, beginning with the most common affiliations (Protestant and Catholic). Most respondents have no trouble with this question because they can expect to see their religious group within that list in a self-administered survey.

In addition to the number and choice of response options offered, the order of answer categories can influence how people respond to closed-ended questions. Research suggests that in telephone surveys respondents more frequently choose items heard later in a list (a “recency effect”), and in self-administered surveys, they tend to choose items at the top of the list (a “primacy” effect).

Because of concerns about the effects of category order on responses to closed-ended questions, many sets of response options in Pew Research Center’s surveys are programmed to be randomized to ensure that the options are not asked in the same order for each respondent. Rotating or randomizing means that questions or items in a list are not asked in the same order to each respondent. Answers to questions are sometimes affected by questions that precede them. By presenting questions in a different order to each respondent, we ensure that each question gets asked in the same context as every other question the same number of times (e.g., first, last or any position in between). This does not eliminate the potential impact of previous questions on the current question, but it does ensure that this bias is spread randomly across all of the questions or items in the list. For instance, in the example discussed above about what issue mattered most in people’s vote, the order of the five issues in the closed-ended version of the question was randomized so that no one issue appeared early or late in the list for all respondents.
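A per-respondent randomization like the one described above can be sketched in a few lines. The option list, seed value, and respondent IDs below are illustrative assumptions, not Pew's actual implementation:

```python
import random

OPTIONS = ["The economy", "The war in Iraq", "Health care",
           "Energy policy", "Terrorism"]

def options_for(respondent_id, options, seed=2024):
    """Return this respondent's private ordering of the answer choices.

    Deriving the RNG state from the respondent ID keeps the order
    reproducible, so the questionnaire renders identically on reload.
    """
    rng = random.Random(seed * 1_000_003 + respondent_id)
    shuffled = list(options)   # never mutate the canonical list
    rng.shuffle(shuffled)
    return shuffled

# Each respondent sees the same five options in an independent random
# order, spreading primacy/recency effects evenly across the items.
first_seen = [options_for(rid, OPTIONS)[0] for rid in range(1000)]
```

Across many respondents, every option spends roughly equal time in every list position, which is exactly the "bias spread randomly" property described above.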

Questions with ordinal response categories – those with an underlying order (e.g., excellent, good, only fair, poor OR very favorable, mostly favorable, mostly unfavorable, very unfavorable) – are generally not randomized because the order of the categories conveys important information to help respondents answer the question. Generally, these types of scales should be presented in order so respondents can easily place their responses along the continuum, but the order can be reversed for some respondents. For example, in one of Pew Research Center’s questions about abortion, half of the sample is asked whether abortion should be “legal in all cases, legal in most cases, illegal in most cases, illegal in all cases,” while the other half of the sample is asked the same question with the response categories read in reverse order, starting with “illegal in all cases.” Again, reversing the order does not eliminate the recency effect but distributes it randomly across the population.
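The half-sample reversal for ordinal scales can be sketched similarly. The even/odd ID split below is an illustrative assumption; in practice the half-sample assignment would itself be random:

```python
LEGAL_SCALE = ["Legal in all cases", "Legal in most cases",
               "Illegal in most cases", "Illegal in all cases"]

def scale_for(respondent_id, scale):
    """Show the scale as written to half the sample, reversed to the other.

    The categories stay in a meaningful order either way; only the
    direction flips, spreading the recency effect across the sample.
    """
    return list(scale) if respondent_id % 2 == 0 else list(reversed(scale))
```

Unlike full shuffling, this preserves the continuum respondents need to place their answer, while still distributing any order effect between the two directions.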

Question wording

The choice of words and phrases in a question is critical in expressing the meaning and intent of the question to the respondent and ensuring that all respondents interpret the question the same way. Even small wording differences can substantially affect the answers people provide.

[View more Methods 101 Videos ]

An example of a wording difference that had a significant impact on responses comes from a January 2003 Pew Research Center survey. When people were asked whether they would “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule,” 68% said they favored military action while 25% said they opposed military action. However, when asked whether they would “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule  even if it meant that U.S. forces might suffer thousands of casualties, ” responses were dramatically different; only 43% said they favored military action, while 48% said they opposed it. The introduction of U.S. casualties altered the context of the question and influenced whether people favored or opposed military action in Iraq.

There has been a substantial amount of research to gauge the impact of different ways of asking questions and how to minimize differences in the way respondents interpret what is being asked. The issues related to question wording are more numerous than can be treated adequately in this short space, but below are a few of the important things to consider:

First, it is important to ask questions that are clear and specific and that each respondent will be able to answer. If a question is open-ended, it should be evident to respondents that they can answer in their own words and what type of response they should provide (an issue or problem, a month, number of days, etc.). Closed-ended questions should include all reasonable responses (i.e., the list of options is exhaustive) and the response categories should not overlap (i.e., response options should be mutually exclusive). Further, it is important to discern when it is best to use forced-choice close-ended questions (often denoted with a radio button in online surveys) versus “select-all-that-apply” lists (or check-all boxes). A 2019 Center study found that forced-choice questions tend to yield more accurate responses, especially for sensitive questions.  Based on that research, the Center generally avoids using select-all-that-apply questions.
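As a rough illustration of the "exhaustive and mutually exclusive" rule, a sanity check like the following can catch overlapping or missing numeric response brackets (the function and the example age brackets are hypothetical; it assumes integer-valued, inclusive brackets):

```python
def check_brackets(brackets, lo, hi):
    """Verify integer response brackets are mutually exclusive and exhaustive.

    `brackets` is a list of (low, high) pairs, inclusive on both ends.
    Returns a list of problems; an empty list means the options are clean.
    """
    problems = []
    ordered = sorted(brackets)
    if ordered[0][0] > lo:
        problems.append(f"gap before {ordered[0][0]}")
    for (a_lo, a_hi), (b_lo, b_hi) in zip(ordered, ordered[1:]):
        if b_lo <= a_hi:
            problems.append(f"overlap: ({a_lo}-{a_hi}) and ({b_lo}-{b_hi})")
        elif b_lo > a_hi + 1:
            problems.append(f"gap between {a_hi} and {b_lo}")
    if ordered[-1][1] < hi:
        problems.append(f"gap after {ordered[-1][1]}")
    return problems

# "18-25, 25-35, 45-65": 25 appears twice and ages 36-44 have no option.
bad = check_brackets([(18, 25), (25, 35), (45, 65)], lo=18, hi=65)
good = check_brackets([(18, 25), (26, 35), (36, 65)], lo=18, hi=65)
```

A respondent aged 25 could honestly pick either of the first two "bad" brackets, and one aged 40 could pick none, which is precisely what the exclusivity and exhaustiveness requirements forbid.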

It is also important to ask only one question at a time. Questions that ask respondents to evaluate more than one concept (known as double-barreled questions) – such as “How much confidence do you have in President Obama to handle domestic and foreign policy?” – are difficult for respondents to answer and often lead to responses that are difficult to interpret. In this example, it would be more effective to ask two separate questions, one about domestic policy and another about foreign policy.

In general, questions that use simple and concrete language are more easily understood by respondents. It is especially important to consider the education level of the survey population when thinking about how easy it will be for respondents to interpret and answer a question. Double negatives (e.g., do you favor or oppose  not  allowing gays and lesbians to legally marry) or unfamiliar abbreviations or jargon (e.g., ANWR instead of Arctic National Wildlife Refuge) can result in respondent confusion and should be avoided.

Similarly, it is important to consider whether certain words may be viewed as biased or potentially offensive to some respondents, as well as the emotional reaction that some words may provoke. For example, in a 2005 Pew Research Center survey, 51% of respondents said they favored “making it legal for doctors to give terminally ill patients the means to end their lives,” but only 44% said they favored “making it legal for doctors to assist terminally ill patients in committing suicide.” Although both versions of the question are asking about the same thing, the reaction of respondents was different. In another example, respondents have reacted differently to questions using the word “welfare” as opposed to the more generic “assistance to the poor.” Several experiments have shown that there is much greater public support for expanding “assistance to the poor” than for expanding “welfare.”

We often write two versions of a question and ask half of the survey sample one version of the question and the other half the second version. Thus, we say we have two  forms  of the questionnaire. Respondents are assigned randomly to receive either form, so we can assume that the two groups of respondents are essentially identical. On questions where two versions are used, significant differences in the answers between the two forms tell us that the difference is a result of the way we worded the two versions.
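The split-form logic can be sketched as follows. The assignment helper, sample sizes, and counts are hypothetical, and the z statistic shown is a standard two-proportion test rather than Pew's specific analysis procedure:

```python
import math
import random

def assign_forms(respondent_ids, seed=7):
    """Randomly split respondents into two equal-sized question forms."""
    rng = random.Random(seed)
    ids = list(respondent_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return set(ids[:half]), set(ids[half:])

def two_proportion_z(yes_a, n_a, yes_b, n_b):
    """z statistic for the difference between two sample proportions."""
    p_a, p_b = yes_a / n_a, yes_b / n_b
    pooled = (yes_a + yes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical wording experiment: 51% vs. 44% support, 750 people per form.
z = two_proportion_z(yes_a=382, n_a=750, yes_b=330, n_b=750)
# |z| > 1.96 would flag the wording difference as significant at the 5% level.
```

Because respondents are assigned to forms at random, the two groups can be treated as statistically equivalent, so a significant `z` points at the wording change itself.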

One of the most common formats used in survey questions is the “agree-disagree” format. In this type of question, respondents are asked whether they agree or disagree with a particular statement. Research has shown that, compared with the better educated and better informed, less educated and less informed respondents have a greater tendency to agree with such statements. This is sometimes called an “acquiescence bias” (since some kinds of respondents are more likely to acquiesce to the assertion than are others). This behavior is even more pronounced when there’s an interviewer present, rather than when the survey is self-administered. A better practice is to offer respondents a choice between alternative statements. A Pew Research Center experiment with one of its routinely asked values questions illustrates the difference that question format can make. Not only does the forced choice format yield a very different result overall from the agree-disagree format, but the pattern of answers between respondents with more or less formal education also tends to be very different.

One other challenge in developing questionnaires is what is called “social desirability bias.” People have a natural tendency to want to be accepted and liked, and this may lead people to provide inaccurate answers to questions that deal with sensitive subjects. Research has shown that respondents understate alcohol and drug use, tax evasion and racial bias. They also may overstate church attendance, charitable contributions and the likelihood that they will vote in an election. Researchers attempt to account for this potential bias in crafting questions about these topics. For instance, when Pew Research Center surveys ask about past voting behavior, it is important to note that circumstances may have prevented the respondent from voting: “In the 2012 presidential election between Barack Obama and Mitt Romney, did things come up that kept you from voting, or did you happen to vote?” The choice of response options can also make it easier for people to be honest. For example, a question about church attendance might include three of six response options that indicate infrequent attendance. Research has also shown that social desirability bias can be greater when an interviewer is present (e.g., telephone and face-to-face surveys) than when respondents complete the survey themselves (e.g., paper and web surveys).

Lastly, because slight modifications in question wording can affect responses, identical question wording should be used when the intention is to compare results to those from earlier surveys. Similarly, because question wording and responses can vary based on the mode used to survey respondents, researchers should carefully evaluate the likely effects on trend measurements if a different survey mode will be used to assess change in opinion over time.

Question order

Once the survey questions are developed, particular attention should be paid to how they are ordered in the questionnaire. Surveyors must be attentive to how questions early in a questionnaire may have unintended effects on how respondents answer subsequent questions. Researchers have demonstrated that the order in which questions are asked can influence how people respond; earlier questions can unintentionally provide context for the questions that follow (these effects are called “order effects”).

One kind of order effect can be seen in responses to open-ended questions. Pew Research Center surveys generally ask open-ended questions about national problems, opinions about leaders and similar topics near the beginning of the questionnaire. If closed-ended questions that relate to the topic are placed before the open-ended question, respondents are much more likely to mention concepts or considerations raised in those earlier questions when responding to the open-ended question.

For closed-ended opinion questions, there are two main types of order effects: contrast effects (where the order results in greater differences in responses) and assimilation effects (where responses are more similar as a result of their order).

An example of a contrast effect can be seen in a Pew Research Center poll conducted in October 2003, a dozen years before same-sex marriage was legalized in the U.S. That poll found that people were more likely to favor allowing gays and lesbians to enter into legal agreements that give them the same rights as married couples when this question was asked after one about whether they favored or opposed allowing gays and lesbians to marry (45% favored legal agreements when asked after the marriage question, but 37% favored legal agreements without the immediate preceding context of a question about same-sex marriage). Responses to the question about same-sex marriage, meanwhile, were not significantly affected by its placement before or after the legal agreements question.

Another experiment embedded in a December 2008 Pew Research Center poll also resulted in a contrast effect. When people were asked “All in all, are you satisfied or dissatisfied with the way things are going in this country today?” immediately after having been asked “Do you approve or disapprove of the way George W. Bush is handling his job as president?”, 88% said they were dissatisfied, compared with only 78% without the context of the prior question.

Responses to presidential approval remained relatively unchanged whether national satisfaction was asked before or after it. A similar finding occurred in December 2004 when both satisfaction and presidential approval were much higher (57% were dissatisfied when Bush approval was asked first vs. 51% when general satisfaction was asked first).

Several studies also have shown that asking a more specific question before a more general question (e.g., asking about happiness with one’s marriage before asking about one’s overall happiness) can result in a contrast effect. Although some exceptions have been found, people tend to avoid redundancy by excluding the more specific question from the general rating.

Assimilation effects occur when responses to two questions are more consistent or closer together because of their placement in the questionnaire. We found an example of an assimilation effect in a Pew Research Center poll conducted in November 2008 when we asked whether Republican leaders should work with Obama or stand up to him on important issues and whether Democratic leaders should work with Republican leaders or stand up to them on important issues. People were more likely to say that Republican leaders should work with Obama when the question was preceded by the one asking what Democratic leaders should do in working with Republican leaders (81% vs. 66%). However, when people were first asked about Republican leaders working with Obama, fewer said that Democratic leaders should work with Republican leaders (71% vs. 82%).

The order questions are asked is of particular importance when tracking trends over time. As a result, care should be taken to ensure that the context is similar each time a question is asked. Modifying the context of the question could call into question any observed changes over time (see  measuring change over time  for more information).

A questionnaire, like a conversation, should be grouped by topic and unfold in a logical order. It is often helpful to begin the survey with simple questions that respondents will find interesting and engaging. Throughout the survey, an effort should be made to keep the survey interesting and not overburden respondents with several difficult questions right after one another. Demographic questions such as income, education or age should not be asked near the beginning of a survey unless they are needed to determine eligibility for the survey or for routing respondents through particular sections of the questionnaire. Even then, it is best to precede such items with more interesting and engaging questions. One virtue of survey panels like the ATP is that demographic questions usually only need to be asked once a year, not in each survey.


ABOUT PEW RESEARCH CENTER  Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions. It is a subsidiary of  The Pew Charitable Trusts .

© 2024 Pew Research Center


Survey Methodology: Effective Ways to Test Your Online Surveys!


Online surveys are relatively easy to build, and at SurveyCrest.com they are extremely easy. So once you have finished creating a survey following all the tips and tricks we share on our blog, do you really need to test it before sending it to your respondents? The answer is: yes, you do.

To some, critical analysis is a daunting task: it requires asking questions, and they worry they are not asking the right ones. That anxiety can stretch out the time needed to craft the questionnaire, and such surveyors naturally check and double-check every question. But what they, like most surveyors, tend to overlook is the neutral feedback they need from people outside their target audience.

They also forget that mistakes can disappear into the background once you have read the text too many times, or while your attention is on other elements such as the structure and flow of the document. For situations like these, you need to bring different approaches into play to test your survey.

The following are the most effective methods for evaluating your survey. We suggest you follow them step by step.

1. Plan Out For Testing And Improvement

Your survey schedule may be packed, but as we have discussed, it is imperative to set aside time for testing and fixing errors. After making changes, you will need to test again to confirm the survey has improved enough to deliver the desired results. Since testing is itself a lengthy process, we advise you to start as soon as you finish drafting the questionnaire, even before adding the logic we discuss in the next point. Time is of the essence in business, but timely testing saves you a great deal of trouble in the long run.

2. Test Before And After Adding Logic

Initial testing is best done right after creating the basic questions, i.e. before attaching logic to them. If you test only after adding logic, you will have to make further changes later on, which wastes more time. Besides, it is easier to move questions and sections around before logic is attached. Once the primary questions have been tested thoroughly, you can add the logic and test again.

3. Test On Different Browsers

You may be able to guess which browser your respondents will use if you target a particular sample of people, but you can never be too sure. Instead of guessing, or trying the survey only on your favorite browser, test it on as many as you can. At a minimum, your survey should be compatible with all the major browsers, such as Chrome, Firefox, and Internet Explorer. Most surveys are now taken on a mobile device, so it is highly important to make your surveys responsive. You can even create free online surveys from a mobile device.

4. Find Neutral Audience To Take Your Test

As we said earlier, it is always a good idea to test your survey by asking people at random to attempt it. These people need not belong to your target audience; they can be friends, colleagues, or family members, and they will give you good feedback on the craft and convenience of taking your survey. For example, you can count on friends for concrete advice on improving the wording or syntax of your questions. They can also pinpoint issues that arise when taking the survey on various browsers or on their phones.

5. Engage The Stakeholders To Attempt The Survey

By the time you have decided on each of your questions, tested them on a neutral audience, and added logic, your stakeholders will be waiting impatiently for the survey to reach its actual audience. That is the perfect time to ask them to attempt the survey themselves, to confirm that the final product is exactly what you all expected.

Since the stakeholders will have many other things on their minds, you are better off sending the survey with a reminder of its goals and target results. If they suggest more questions, include only those that are in line with your survey objectives.

6. Check Usability Of Results

Now that you are done adding questions and gathering feedback, it is time for the ultimate test before the final showdown: checking the usability of the survey results. It is true that you and your stakeholders know what you want to ask, but most of the time that is not enough. You need to be sure your questions will produce an outcome that leads to actionable results, and you need to check how the data will look in reports. If it is unusually hard to analyze, revise your phrasing, or take whatever measures are necessary to simplify it.

By running test data through the survey, you can see what needs to be included or excluded to produce a useful survey report.
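One way to run test data, sketched in Python with made-up pilot responses (the function name, categories, and answers are illustrative):

```python
from collections import Counter

# Hypothetical pilot responses to a draft question, used to preview the report.
test_responses = [
    "Very satisfied", "Satisfied", "Satisfied", "Neutral",
    "Dissatisfied", "Satisfied", "Very satisfied", "Neutral",
]

def tabulate(responses, categories):
    """Cross-check pilot answers against the report's planned categories.

    Returns counts per category plus any answers that would not fit the
    report layout, so mismatches surface before the real launch.
    """
    counts = Counter(responses)
    table = {cat: counts.get(cat, 0) for cat in categories}
    unexpected = sorted(set(responses) - set(categories))
    return table, unexpected

categories = ["Very satisfied", "Satisfied", "Neutral", "Dissatisfied"]
table, unexpected = tabulate(test_responses, categories)
```

If `unexpected` is not empty, the report layout and the questionnaire disagree, which is exactly the kind of usability problem this step is meant to catch before launch.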

7. Use A Sample Of Real Respondents For A Final Review

This step is optional. Once you have carefully followed all of the tips above, you may decide simply to send the survey to your entire target audience. But if you want full confidence before you reap the benefits of your hard work, take one final step to ensure the survey's success: draw a sample of your real respondents and try the survey on them first. This yields actual responses and confirms that the data looks exactly as you expected. If no major changes or issues surface, you can combine this data with the rest later on.

Conclusion

Survey methodology helps you develop the most effective surveys you can. There are various strategies for ensuring a survey's success, and while the content of every survey is unique, the drill remains the same: all surveys need to be tested before they are distributed to respondents.

Do you use any of the strategies discussed in this article to test your online surveys? Which steps do you find most useful, and what would you add to this process from your own experience? Tell us your story in the comments.


About The Author Kelvin Stiles

Kelvin Stiles is a tech enthusiast and works as a marketing consultant at SurveyCrest – FREE online survey software and publishing tools for academic and business use. He is also an avid blogger and a comic book fanatic.


Copyright © 2024 SurveyCrest.com


How to Write a Survey Report

Last Updated: February 16, 2024

This article was reviewed by Anne Schmidt, a chemistry instructor in Wisconsin with over 20 years of experience teaching high school chemistry.

Once you have finished conducting a survey, all that is left to do is write the survey report. A survey report describes a survey, its results, and any patterns or trends found in the survey. Most survey reports follow a standard organization, broken up under certain headings. Each section has a specific purpose. Fill out each section correctly and proofread the paper to create a polished and professional report.

Writing the Summary and Background Info

Step 1: Break the report up into separate sections with headings.

  • Table of Contents
  • Executive Summary
  • Background and Objectives
  • Methodology
  • Results
  • Conclusion and Recommendations

Step 2: Write a 1-2 page executive summary covering the following:

  • Methodology of the survey.
  • Key results of the survey.
  • Conclusions drawn from the results of the survey.
  • Recommendations based on the results of the survey.

Step 3: State the objectives of the survey in the background section.

  • Study or target population: Who is being studied? Do they belong to a certain age group, cultural group, religion, political belief, or other common practice?
  • Variables of the study: What is the survey trying to study? Is the study looking for the association or relationship between two things?
  • Purpose of the study: How will this information be used? What new information can this survey help us realize?

Step 4: Provide background information by explaining similar research and studies.

  • Look for surveys done by researchers in peer-reviewed academic journals. In addition to these, consult reports produced by similar companies, organizations, newspapers, or think tanks.
  • Compare their results to yours. Do your results support or conflict with their claims? What new information does your report provide on the matter?
  • Provide a description of the issue backed with peer-reviewed evidence. Define what it is you're trying to learn and explain why other studies haven't found this information.

Explaining the Method and Results

Step 1: Explain how the study was conducted in the methodology section.

  • Who did you ask? How can you define the gender, age, and other characteristics of these groups?
  • Did you do the survey over email, telephone, website, or 1-on-1 interviews?
  • Were participants randomly chosen or selected for a certain reason?
  • How large was the sample size? In other words, how many people responded to the survey?
  • Were participants offered anything in exchange for filling out the survey?
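When reporting sample size, it can also help to state the survey's margin of error. The sketch below is a minimal, generic calculation for a proportion from a simple random sample; the 95% z-score of 1.96 and the example of 400 respondents are illustrative, not taken from any particular survey:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a proportion from a simple random sample.

    p=0.5 is the most conservative assumption; z=1.96 corresponds to 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

# e.g. 400 respondents give a margin of error of roughly +/- 5 percentage points
moe = margin_of_error(400)
```

Quadrupling the sample size halves the margin of error, which is why diminishing returns set in quickly as samples grow.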

Step 2: Describe what type of questions were asked in the methodology section.

  • For example, you might sum up the general theme of your questions by saying, "Participants were asked to answer questions about their daily routine and dietary practices."
  • Don't put all of the questions in this section. Instead, include your questionnaire in the first appendix (Appendix A).

Step 3: Report the results of the survey in a separate section.

  • If your survey interviewed people, choose a few relevant responses and type them up in this section. Refer the reader to the full questionnaire, which will be in the appendix.
  • If your survey was broken up into multiple sections, report the results of each section separately, with a subheading for each section.
  • Avoid making any claims about the results in this section. Just report the data, using statistics, sample answers, and quantitative data.
  • Include graphs, charts, and other visual representations of your data in this section.

Step 4: Point out any interesting trends in the results section.

  • For example, do people from a similar age group respond to a certain question in a similar way?
  • Look at questions that received the highest number of similar responses. This means that most people answer the question in similar ways. What do you think that means?
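Tallying which answers recur most often is a simple frequency count. The snippet below is a toy illustration with made-up responses to a single closed-ended question:

```python
from collections import Counter

# Hypothetical answers to one closed-ended question
responses = ["daily", "weekly", "daily", "monthly", "daily", "weekly"]

# Count each distinct answer and pull out the most frequent one
counts = Counter(responses)
top_answer, top_count = counts.most_common(1)[0]
```

The same tally feeds directly into the charts and tables you include in the results section.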

Analyzing Your Results

Step 1: State the implications of your survey at the beginning of the conclusion.

  • Here you may break away from the objective tone of the rest of the paper. You might state if readers should be alarmed, concerned, or intrigued by something.
  • For example, you might highlight how current policy is failing or state how the survey demonstrates that current practices are succeeding.

Step 2: Make recommendations about what needs to be done about this issue.

  • More research needs to be done on this topic.
  • Current guidelines or policy need to be changed.
  • The company or institution needs to take action.

Step 3: Include graphs, charts, surveys, and testimonies in the appendices.

  • Appendices are typically labeled with letters, such as Appendix A, Appendix B, Appendix C, and so on.
  • You may refer to appendices throughout your paper. For example, you can say, “Refer to Appendix A for the questionnaire” or “Participants were asked 20 questions (Appendix A)”.

Polishing Your Report

Step 1: Add a title page and table of contents to the first 2 pages.

  • The table of contents should list the page numbers for each section (or heading) of the report.

Step 2: Cite your research according to the style required for the survey report.

  • Typically, you will cite information using in-text parenthetical citations. Put the name of the author and other information, such as the page number or year of publication, in parentheses at the end of a sentence.
  • Some professional organizations may have their own separate guidelines. Consult these for more information.
  • If you don’t need a specific style, make sure that the formatting for the paper is consistent throughout. Use the same spacing, font, font size, and citations throughout the paper.

Step 3: Adopt a clear, objective voice throughout the paper.

  • Try not to editorialize the results as you report them. For example, don’t say, “The study shows an alarming trend of increasing drug use that must be stopped.” Instead, just say, “The results show an increase in drug use.”

Step 4: Write in concise, simple sentences.

  • If you have a choice between a simple word and a complex word, choose the simpler term. For example, instead of “1 out of 10 civilians testify to imbibing alcoholic drinks thrice daily,” just say “1 out of 10 people report drinking alcohol 3 times a day.”
  • Remove any unnecessary phrases or words. For example, instead of “In order to determine the frequency of the adoption of dogs,” just say “To determine the frequency of dog adoption.”

Step 5: Revise your paper thoroughly before submitting.

  • Make sure you have page numbers on the bottom of the page. Check that the table of contents contains the right page numbers.
  • Remember, spell check on word processors doesn’t always catch every mistake. Ask someone else to proofread for you to help you catch errors.

Tip: Always represent the data accurately in your report. Do not lie or misrepresent information.

Harvard University Program on Survey Research

  • How to Frame and Explain the Survey Data Used in a Thesis

Surveys are a special research tool with strengths, weaknesses, and a language all of their own. There are many different steps to designing and conducting a survey, and survey researchers have specific ways of describing what they do.

This handout, based on an annual workshop offered by the Program on Survey Research at Harvard, is geared toward undergraduate honors thesis writers using survey data.

PSR Resources

  • Managing and Manipulating Survey Data: A Beginners Guide
  • Finding and Hiring Survey Contractors
  • Overview of Cognitive Testing and Questionnaire Evaluation
  • Questionnaire Design Tip Sheet
  • Sampling, Coverage, and Nonresponse Tip Sheet
  • Introduction to Surveys for Honors Thesis Writers
  • PSR Introduction to the Survey Process
  • Related Centers/Programs at Harvard
  • General Survey Reference
  • Institutional Review Boards
  • Select Funding Opportunities
  • Survey Analysis Software
  • Professional Standards
  • Professional Organizations
  • Major Public Polls
  • Survey Data Collections
  • Major Longitudinal Surveys
  • Other Links

Dissertation Methodology – Structure, Example and Writing Guide


In any research, the methodology chapter is one of the key components of your dissertation. It provides a detailed description of the methods you used to conduct your research and helps readers understand how you obtained your data and how you plan to analyze it. This section is crucial for replicating the study and validating its results.

Here are the basic elements that are typically included in a dissertation methodology:

  • Introduction : This section should explain the importance and goals of your research.
  • Research Design : Outline your research approach and why it’s appropriate for your study. You might be conducting experimental, qualitative, quantitative, or mixed-methods research.
  • Data Collection : This section should detail the methods you used to collect your data. Did you use surveys, interviews, observations, etc.? Why did you choose these methods? You should also include who your participants were, how you recruited them, and any ethical considerations.
  • Data Analysis : Explain how you intend to analyze the data you collected. This could include statistical analysis, thematic analysis, content analysis, etc., depending on the nature of your study.
  • Reliability and Validity : Discuss how you’ve ensured the reliability and validity of your study. For instance, you could discuss measures taken to reduce bias, how you ensured that your measures accurately capture what they were intended to, or how you will handle any limitations in your study.
  • Ethical Considerations : This is where you state how you have considered ethical issues related to your research, how you have protected the participants’ rights, and how you have complied with the relevant ethical guidelines.
  • Limitations : Acknowledge any limitations of your methodology, including any biases and constraints that might have affected your study.
  • Summary : Recap the key points of your methodology chapter, highlighting the overall approach and rationalization of your research.

Types of Dissertation Methodology

The type of methodology you choose for your dissertation will depend on the nature of your research question and the field you’re working in. Here are some of the most common types of methodologies used in dissertations:

Experimental Research

This involves creating an experiment that will test your hypothesis. You’ll need to design an experiment, manipulate variables, collect data, and analyze that data to draw conclusions. This is commonly used in fields like psychology, biology, and physics.

Survey Research

This type of research involves gathering data from a large number of participants using tools like questionnaires or surveys. It can be used to collect a large amount of data and is often used in fields like sociology, marketing, and public health.

Qualitative Research

This type of research is used to explore complex phenomena that can’t be easily quantified. Methods include interviews, focus groups, and observations. This methodology is common in fields like anthropology, sociology, and education.

Quantitative Research

Quantitative research uses numerical data to answer research questions. This can include statistical, mathematical, or computational techniques. It’s common in fields like economics, psychology, and health sciences.
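The numerical summaries behind quantitative research usually start with descriptive statistics. A minimal sketch with hypothetical Likert-scale ratings (the data are invented purely for illustration):

```python
import statistics

# Hypothetical Likert-scale ratings (1 = strongly disagree ... 5 = strongly agree)
ratings = [4, 5, 3, 4, 2, 5, 4, 3]

mean_rating = statistics.mean(ratings)
sd_rating = statistics.stdev(ratings)     # sample standard deviation
median_rating = statistics.median(ratings)
```

Reporting the mean together with a spread measure such as the standard deviation is the usual starting point before any inferential tests.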

Case Study Research

This type of research involves in-depth investigation of a particular case, such as an individual, group, or event. This methodology is often used in psychology, social sciences, and business.

Mixed Methods Research

This combines qualitative and quantitative research methods in a single study. It’s used to answer more complex research questions and is becoming more popular in fields like social sciences, health sciences, and education.

Action Research

This type of research involves taking action and then reflecting upon the results. This cycle of action-reflection-action continues throughout the study. It’s often used in fields like education and organizational development.

Longitudinal Research

This type of research involves studying the same group of individuals over an extended period of time. This could involve surveys, observations, or experiments. It’s common in fields like psychology, sociology, and medicine.

Ethnographic Research

This type of research involves the in-depth study of people and cultures. Researchers immerse themselves in the culture they’re studying to collect data. This is often used in fields like anthropology and social sciences.

Structure of Dissertation Methodology

The structure of a dissertation methodology can vary depending on your field of study, the nature of your research, and the guidelines of your institution. However, a standard structure typically includes the following elements:

  • Introduction : Briefly introduce your overall approach to the research. Explain what you plan to explore and why it’s important.
  • Research Design/Approach : Describe your overall research design. This can be qualitative, quantitative, or mixed methods. Explain the rationale behind your chosen design and why it is suitable for your research questions or hypotheses.
  • Data Collection Methods : Detail the methods you used to collect your data. You should include what type of data you collected, how you collected it, and why you chose this method. If relevant, you can also include information about your sample population, such as how many people participated, how they were chosen, and any relevant demographic information.
  • Data Analysis Methods : Explain how you plan to analyze your collected data. This will depend on the nature of your data. For example, if you collected quantitative data, you might discuss statistical analysis techniques. If you collected qualitative data, you might discuss coding strategies, thematic analysis, or narrative analysis.
  • Reliability and Validity : Discuss how you’ve ensured the reliability and validity of your research. This might include steps you took to reduce bias or increase the accuracy of your measurements.
  • Ethical Considerations : If relevant, discuss any ethical issues associated with your research. This might include how you obtained informed consent from participants, how you ensured participants’ privacy and confidentiality, or any potential conflicts of interest.
  • Limitations : Acknowledge any limitations in your research methodology. This could include potential sources of bias, difficulties with data collection, or limitations in your analysis methods.
  • Summary/Conclusion : Briefly summarize the key points of your methodology, emphasizing how it helps answer your research questions or hypotheses.

How to Write Dissertation Methodology

Writing a dissertation methodology requires you to be clear and precise about the way you’ve carried out your research. It’s an opportunity to convince your readers of the appropriateness and reliability of your approach to your research question. Here is a basic guideline on how to write your methodology section:

1. Introduction

Start your methodology section by restating your research question(s) or objective(s). This ensures your methodology directly ties into the aim of your research.

2. Approach

Identify your overall approach: qualitative, quantitative, or mixed methods. Explain why you have chosen this approach.

  • Qualitative methods are typically used for exploratory research and involve collecting non-numerical data. This might involve interviews, observations, or analysis of texts.
  • Quantitative methods are used for research that relies on numerical data. This might involve surveys, experiments, or statistical analysis.
  • Mixed methods use a combination of both qualitative and quantitative research methods.

3. Research Design

Describe the overall design of your research. This could involve explaining the type of study (e.g., case study, ethnography, experimental research, etc.), how you’ve defined and measured your variables, and any control measures you’ve implemented.

4. Data Collection

Explain in detail how you collected your data.

  • If you’ve used qualitative methods, you might detail how you selected participants for interviews or focus groups, how you conducted observations, or how you analyzed existing texts.
  • If you’ve used quantitative methods, you might detail how you designed your survey or experiment, how you collected responses, and how you ensured your data is reliable and valid.

5. Data Analysis

Describe how you analyzed your data.

  • If you’re doing qualitative research, this might involve thematic analysis, discourse analysis, or grounded theory.
  • If you’re doing quantitative research, you might be conducting statistical tests, regression analysis, or factor analysis.

6. Ethical Considerations

Discuss any ethical issues related to your research. This might involve explaining how you obtained informed consent, how you’re protecting participants’ privacy, or how you’re managing any potential harms to participants.

7. Reliability and Validity

Discuss the steps you’ve taken to ensure the reliability and validity of your data.

  • Reliability refers to the consistency of your measurements, and you might discuss how you’ve piloted your instruments or used standardized measures.
  • Validity refers to the accuracy of your measurements, and you might discuss how you’ve ensured your measures reflect the concepts they’re supposed to measure.
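A common way to quantify internal-consistency reliability for a multi-item scale is Cronbach's alpha. The sketch below implements the standard formula with invented item scores; it is a generic illustration, not part of any particular study:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    `items` is a list of per-item score lists, each ordered by the same respondents.
    """
    k = len(items)
    totals = [sum(person) for person in zip(*items)]          # total score per respondent
    sum_item_var = sum(statistics.variance(item) for item in items)
    return k / (k - 1) * (1 - sum_item_var / statistics.variance(totals))

# Hypothetical scores on three questionnaire items from four respondents
items = [
    [1, 2, 3, 4],
    [2, 3, 4, 5],
    [1, 3, 3, 5],
]
alpha = cronbach_alpha(items)  # high alpha: the made-up items move together
```

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency, though the threshold depends on the field.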

8. Limitations

Every study has its limitations. Discuss the potential weaknesses of your chosen methods and explain any obstacles you faced in your research.

9. Conclusion

Summarize the key points of your methodology, emphasizing how it helps to address your research question or objective.

Example of Dissertation Methodology

An Example of Dissertation Methodology is as follows:

Chapter 3: Methodology

Introduction

This chapter details the methodology adopted in this research. The study aimed to explore the relationship between stress and productivity in the workplace. A mixed-methods research design was used to collect and analyze data.

Research Design

This study adopted a mixed-methods approach, combining quantitative surveys with qualitative interviews to provide a comprehensive understanding of the research problem. The rationale for this approach is that while quantitative data can provide a broad overview of the relationships between variables, qualitative data can provide deeper insights into the nuances of these relationships.

Data Collection Methods

Quantitative Data Collection : An online self-report questionnaire was used to collect data from participants. The questionnaire consisted of two standardized scales: the Perceived Stress Scale (PSS) to measure stress levels and the Individual Work Productivity Questionnaire (IWPQ) to measure productivity. The sample consisted of 200 office workers randomly selected from various companies in the city.

Qualitative Data Collection : Semi-structured interviews were conducted with 20 participants chosen from the initial sample. The interview guide included questions about participants’ experiences with stress and how they perceived its impact on their productivity.

Data Analysis Methods

Quantitative Data Analysis : Descriptive and inferential statistics were used to analyze the survey data. Pearson’s correlation was used to examine the relationship between stress and productivity.
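Pearson's correlation can be computed directly from its definition. The snippet below is a generic sketch; the PSS and IWPQ scores shown are invented for illustration and are not the study's actual data:

```python
import math

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (math.sqrt(sum((x - mx) ** 2 for x in xs))
           * math.sqrt(sum((y - my) ** 2 for y in ys)))
    return num / den

# Made-up PSS (stress) and IWPQ (productivity) scores for five respondents
stress = [10, 14, 18, 22, 26]
productivity = [48, 45, 40, 36, 31]
r = pearson_r(stress, productivity)  # negative: higher stress, lower productivity
```

In practice the coefficient would be accompanied by a p-value and confidence interval from a statistics package rather than computed by hand.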

Qualitative Data Analysis : Interviews were transcribed and subjected to thematic analysis using NVivo software. This process allowed for identifying and analyzing patterns and themes regarding the impact of stress on productivity.
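Thematic analysis is an interpretive process typically supported by software such as NVivo. As a rough illustration of the bookkeeping side only, the toy sketch below tags invented transcript excerpts with themes by keyword matching; real coding is done by researchers reading the data, not by keyword rules:

```python
# Toy keyword-based tagging; real thematic analysis is interpretive.
# Themes, keywords, and excerpts here are all invented for illustration.
themes = {
    "workload": ["deadline", "overtime", "tasks"],
    "wellbeing": ["tired", "anxious", "stress"],
}

excerpts = [
    "I feel anxious when deadlines pile up",
    "Overtime leaves me tired the next day",
]

# Attach every theme whose keywords appear in an excerpt
tagged = {
    excerpt: [t for t, kws in themes.items() if any(k in excerpt.lower() for k in kws)]
    for excerpt in excerpts
}
```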

Reliability and Validity

To ensure reliability and validity, standardized measures with good psychometric properties were used. In qualitative data analysis, triangulation was employed by having two researchers independently analyze the data and then compare findings.

Ethical Considerations

All participants provided informed consent prior to their involvement in the study. They were informed about the purpose of the study, their rights as participants, and the confidentiality of their responses.

Limitations

The main limitation of this study is its reliance on self-report measures, which can be subject to biases such as social desirability bias. Moreover, the sample was drawn from a single city, which may limit the generalizability of the findings.

Where to Write Dissertation Methodology

In a dissertation or thesis, the Methodology section usually follows the Literature Review. This placement allows the Methodology to build upon the theoretical framework and existing research outlined in the Literature Review, and precedes the Results or Findings section. Here’s a basic outline of how most dissertations are structured:

  • Title Page
  • Abstract
  • Acknowledgements
  • Introduction
  • Literature Review (or it may be interspersed throughout the dissertation)
  • Methodology
  • Results/Findings
  • Discussion
  • Conclusion
  • References/Bibliography
  • Appendices

In the Methodology chapter, you will discuss the research design, data collection methods, data analysis methods, and any ethical considerations pertaining to your study. This allows your readers to understand how your research was conducted and how you arrived at your results.

Advantages of Dissertation Methodology

The dissertation methodology section plays an important role in a dissertation for several reasons. Here are some of the advantages of having a well-crafted methodology section in your dissertation:

  • Clarifies Your Research Approach : The methodology section explains how you plan to tackle your research question, providing a clear plan for data collection and analysis.
  • Enables Replication : A detailed methodology allows other researchers to replicate your study. Replication is an important aspect of scientific research because it provides validation of the study’s results.
  • Demonstrates Rigor : A well-written methodology shows that you’ve thought critically about your research methods and have chosen the most appropriate ones for your research question. This adds credibility to your study.
  • Enhances Transparency : Detailing your methods allows readers to understand the steps you took in your research. This increases the transparency of your study and allows readers to evaluate potential biases or limitations.
  • Helps in Addressing Research Limitations : In your methodology section, you can acknowledge and explain the limitations of your research. This is important as it shows you understand that no research method is perfect and there are always potential weaknesses.
  • Facilitates Peer Review : A detailed methodology helps peer reviewers assess the soundness of your research design. This is an important part of the publication process if you aim to publish your dissertation in a peer-reviewed journal.
  • Establishes the Validity and Reliability : Your methodology section should also include a discussion of the steps you took to ensure the validity and reliability of your measurements, which is crucial for establishing the overall quality of your research.

About the author: Muhammad Hassan, Researcher, Academic Writer, and Web Developer


  • Study Protocol
  • Open access
  • Published: 26 August 2024

Learning effect of online versus onsite education in health and medical scholarship – protocol for a cluster randomized trial

Rie Raffing, Lars Konge & Hanne Tønnesen

BMC Medical Education, volume 24, Article number: 927 (2024)


The disruption of health and medical education by the COVID-19 pandemic made educators question the effect of the online setting on students’ learning, motivation, self-efficacy and preference. In light of the health care staff shortage, scalable online education seemed relevant. Reviews on the effect of online medical education called for high quality RCTs, which are increasingly relevant with rapid technological development and the widespread adoption of online learning in universities. The objective of this trial is to compare standardized and feasible outcomes of an online and an onsite setting of a research course regarding the efficacy for PhD students within health and medical sciences: primarily on learning of research methodology and secondly on preference, motivation and self-efficacy in the short term, and academic achievements in the long term. Based on the authors’ experience with conducting courses during the pandemic, the hypothesis is that outcomes of the student-preferred onsite setting differ from those of the online setting.

Cluster randomized trial with two parallel groups. Two PhD research training courses at the University of Copenhagen are randomized to an online (Zoom) or onsite (The Parker Institute, Denmark) setting. Enrolled students are invited to participate in the study. The primary outcome is short-term learning. Secondary outcomes are short-term preference, motivation and self-efficacy, and long-term academic achievements. Standardized, reproducible and feasible outcomes will be measured by tailor-made multiple choice questionnaires, an evaluation survey, the frequently used Intrinsic Motivation Inventory, the Single Item Self-Efficacy Question, and Google Scholar publication data. The sample size is calculated to 20 clusters, and courses are randomized by a computer random number generator. Statistical analyses will be performed blinded by an external statistical expert.
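A cluster allocation like the one described can be produced with an ordinary pseudorandom number generator. The sketch below is illustrative only; the seed and course labels are hypothetical and this is not the trial's actual randomization code:

```python
import random

# 20 hypothetical course clusters; labels and seed are illustrative
courses = [f"course_{i:02d}" for i in range(1, 21)]

rng = random.Random(2023)                # seeded for a reproducible allocation
online = set(rng.sample(courses, k=10))  # half the clusters to the online arm
allocation = {c: ("online" if c in online else "onsite") for c in courses}
```

Randomizing whole courses rather than individual students avoids contamination between arms, at the cost of needing the cluster-adjusted sample size the protocol describes.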

The primary outcome and significant secondary outcomes will be compared and contrasted with relevant literature. Limitations include the geographical setting; biases include lack of blinding; and strengths are robust assessment methods in a well-established conceptual framework. Generalizability to PhD education in other disciplines is high. The results of this study will have implications both for students and educators involved in research training courses in health and medical education and for the patients who ultimately benefit from this training.

Trial registration

Retrospectively registered at ClinicalTrials.gov: NCT05736627. SPIRIT guidelines are followed.

Peer Review reports

Medical education was utterly disrupted for two years by the COVID-19 pandemic. In the midst of rearranging courses and adapting to online platforms we, with lecturers and course managers around the globe, wondered what the conversion to an online setting did to students’ learning, motivation and self-efficacy [ 1 , 2 , 3 ]. What the long-term consequences would be [ 4 ] and whether scalable online medical education should play a greater role in the future [ 5 ] seemed relevant and appealing questions in a time when health care professionals are in demand. Our experience of performing research training during the pandemic was that although PhD students were grateful for courses being available, they found it difficult to concentrate due to the long screen hours. We sensed that most students preferred an onsite setting and perceived online courses as a temporary and inferior necessity. The question is whether this impacted their learning.

Since the common use of the internet in medical education, systematic reviews have sought to answer if there is a difference in learning effect when taught online compared to onsite. Although authors conclude that online learning may be equivalent to onsite in effect, they agree that studies are heterogeneous and small [ 6 , 7 ], with low quality of the evidence [ 8 , 9 ]. They therefore call for more robust and adequately powered high-quality RCTs to confirm their findings and suggest that students’ preferences in online learning should be investigated [ 7 , 8 , 9 ].

This uncovers two knowledge gaps: I) High-quality RCTs on online versus onsite learning in health and medical education and II) Studies on students’ preferences in online learning.

Recently, solid RCTs have been performed on the topic of web-based theoretical learning of research methods among health professionals [ 10 , 11 ]. However, these studies are on asynchronous courses among medical or master’s students with short-term outcomes.

This uncovers three additional knowledge gaps: III) Studies on synchronous online learning, IV) among PhD students in health and medical education, V) with long-term measurement of outcomes.

The rapid technological development, including artificial intelligence (AI), and the widespread adoption and application of online learning forced by the pandemic have made online learning well established. It now offers high-resolution live synchronous settings on a variety of platforms, with integrated AI, options for interaction with and among students, chat and breakout rooms, and external digital tools for teachers [12, 13, 14]. Thus, investigating online learning today may be quite different from before the pandemic. On the one hand, it seems plausible that this technological development would make a difference in favour of online learning that could not be found in previous reviews of the evidence. On the other hand, the personal face-to-face interaction of onsite learning may still be more beneficial for the learning process. Combined with our experience of students finding it difficult to concentrate online during the pandemic, we hypothesize that outcomes of the onsite setting are different from those of the online setting.

To support a robust study, we design it as a cluster randomized trial. Moreover, we use the well-established and widely used Kirkpatrick conceptual framework for evaluating learning as a lens to assess our outcomes [15]. Thus, to fill the above-mentioned knowledge gaps, the objective of this trial is to compare a synchronous online and an in-person onsite setting of a research course regarding the efficacy for PhD students within the health and medical sciences:

Primarily, on theoretical learning of research methodology, and

Secondly, on

◦ Preference, motivation, and self-efficacy in the short term

◦ Academic achievements in the long term

Trial design

This study protocol covers a synchronous online and an in-person onsite setting of research courses, testing their efficacy for PhD students. It is a two-parallel-arm cluster randomized trial (Fig. 1).

Fig. 1 CONSORT flow diagram

The study takes measurements at baseline and post intervention. Baseline variables and knowledge scores are obtained on the first day of the course; post-intervention measurements are obtained on the last day of the course (short term) and monthly for 24 months (long term).

Randomization is stratified, giving a 1:1 allocation ratio at course level. As the number of participants within each course may differ, the allocation of participants will not be fully 1:1 balanced.

Study setting

The study site is The Parker Institute at Bispebjerg and Frederiksberg Hospital, University of Copenhagen, Denmark. From here the courses are organized and run, online and onsite. The course programmes and time schedules, the learning objectives, the course management, the lecturers, and the delivery are identical in the two settings. The teachers use the same introductory presentations followed by training in breakout groups, feedback, and discussions. For the online group, the setting is organized as meetings in the online collaboration tool Zoom® [16], using the basic available technicalities such as screen sharing, the chat function for comments, breakout rooms, and other basic digital tools if preferred. The online version of the course is synchronous, with live education and interaction. For the onsite group, the setting is the physical classroom at the learning facilities of the Parker Institute. Coffee and tea as well as simple sandwiches and bottles of water, which facilitate sociality, are available in the onsite setting. Participants in the online setting must get their food and drink themselves, but online sociality is made possible by not closing the online room during breaks. The research methodology courses included in the study are "Practical Course in Systematic Review Technique in Clinical Research" (see course programme in Appendix 1) and "Getting started: Writing your first manuscript for publication" [17] (see course programme in Appendix 2). The two courses both have 12 seats and last three or three and a half days, resulting in 2.2 and 2.6 ECTS credits, respectively. They are offered by the PhD School of the Faculty of Health and Medical Sciences, University of Copenhagen. Both courses are available to, and covered by the annual tuition fee for, all PhD students enrolled at a Danish university.

Eligibility criteria

Inclusion criteria for participants: all PhD students enrolled, after informed consent, on the PhD courses "Practical Course in Systematic Review Technique in Clinical Research" and "Getting started: Writing your first manuscript for publication" at the PhD School of the Faculty of Health and Medical Sciences, University of Copenhagen, Denmark.

Exclusion criteria for participants: declining to participate or withdrawing informed consent.

Informed consent

The PhD students at the PhD School of the Faculty of Health Sciences, University of Copenhagen, participate after informed consent, taken by the daily project leader, allowing evaluation data from the course to be used in the project after pseudonymization. They are informed in a welcome letter approximately three weeks prior to the course and again in the introduction on the first course day. They register their consent on the first course day (Appendix 3). Declining to participate in the project does not influence their participation in the course.

Interventions

The online course setting will be compared to the onsite course setting. We test whether the onsite setting differs from the online setting. Online learning is increasing, but onsite learning is still the preferred educational setting in a medical context; in this case, onsite learning represents "usual care". The online course setting is meetings in Zoom, using the available technicalities such as chat and breakout rooms. The onsite setting is the learning facilities at the Parker Institute, Bispebjerg and Frederiksberg Hospital, The Capital Region, University of Copenhagen, Denmark.

The course settings are not expected to harm the participants, but should a request be made to discontinue the course or change setting, this will be met and the participant taken out of the study. Course participants are allowed to take part in relevant concomitant courses or other interventions during the trial.

Strategies to improve adherence to interventions

Course participants are motivated to complete the course irrespective of the setting because it bears ECTS credits that count toward the mandatory total for their PhD education. Thus, we expect adherence to be the same in both groups. However, we monitor their presence in the course and allocate time during class for testing the short-term outcomes (motivation, self-efficacy, preference, and learning). We encourage and, if necessary, repeatedly remind them to register with Google Scholar for our testing of the long-term outcome (academic achievement).

Outcomes are related to the Kirkpatrick model for evaluating learning (Fig. 2), which divides outcomes into four levels: Reaction, which includes for example motivation, self-efficacy, and preferences; Learning, which includes knowledge acquisition; Behaviour, for practical application of skills back on the job (not included in our outcomes); and Results, for impact on end-users, which includes for example academic achievements in the form of scientific articles [18, 19, 20].

Fig. 2 The Kirkpatrick model

Primary outcome

The primary outcome is short term learning (Kirkpatrick level 2).

Learning is assessed by a multiple-choice questionnaire (MCQ) developed prior to the RCT specifically for this setting (Appendix 4). First, the lecturers of the two courses were contacted and asked to provide five multiple-choice questions, each presented as a stem with three answer options: one correct answer and two distractors. The questions should relate to core elements of their teaching under the heading of research training. The questions were set up to test the cognition of the students at the levels of "Knows" or "Knows how" according to Miller's Pyramid of Competence, not their behaviour [21]. Six of the course lecturers responded, and from this material all questions covering the curriculum of both courses were selected. The MCQ was piloted on 10 PhD students and within the lecturer group, revised after an item analysis, and language-revised. It ended up containing 25 questions. The MCQ is filled in at baseline and repeated at the end of the course. The primary outcome based on the MCQ is the learning score, calculated as the number of correct answers out of 25 after the course. A decrease in MCQ points in the intervention groups denotes a deterioration of learning. The minimum MCQ score is 0 and the maximum is 25, where 19 indicates passing the course.

Furthermore, as a secondary outcome, this measurement will be dichotomized into passed/failed, with passing defined as at least 75% (19/25) correct answers.

The learning score will be computed on group and individual level; continuous outcomes will be compared by the Mann–Whitney test between the online and onsite groups. The binomial learning outcome (passed/failed) will be analysed by Fisher's exact test on an intention-to-treat basis between the online and onsite groups. The results will be presented as median and range and as mean and standard deviation, for possible future use in meta-analyses.
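This analysis plan can be sketched in standard-library Python. This is an illustration only: the protocol's analyses are run in R, and all function names and sample data below are hypothetical. The Mann–Whitney U statistic and the 2×2 table are computed here; the p-values would come from statistical software.

```python
def mcq_score(answers, key):
    """Learning score: number of correct answers out of the 25 MCQ items."""
    return sum(a == k for a, k in zip(answers, key))

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for group x versus group y: the number of
    (x, y) pairs in which x exceeds y, counting each tie as 0.5."""
    return sum((xi > yj) + 0.5 * (xi == yj) for xi in x for yj in y)

def pass_fail_table(online, onsite, pass_mark=19):
    """2x2 contingency table (passed, failed) per arm, the input that
    Fisher's exact test would take; 19/25 is the protocol's pass mark."""
    return [[sum(s >= pass_mark for s in g), sum(s < pass_mark for s in g)]
            for g in (online, onsite)]
```

For example, `pass_fail_table([20, 18, 22], [19, 15, 10])` yields the table `[[2, 1], [1, 2]]`, which would then be passed to Fisher's exact test in R or SciPy.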

Secondary outcomes

Motivation assessment post course: motivation level is measured by the Intrinsic Motivation Inventory (IMI) scale [22] (Appendix 5). The order of the IMI items was randomized by random.org on 4 August 2022. The scale contains 12 items to be assessed by the students on a 7-point Likert scale, where 1 is "Not at all true", 4 is "Somewhat true", and 7 is "Very true". The motivation score will be computed on group and individual level and compared between the online and onsite groups by the Mann–Whitney test.

Self-efficacy assessment post course: self-efficacy level is measured by a single-item measure developed and validated by Williams and Smith [23] (Appendix 6). It is assessed by the students on a scale from 1 to 10, where 1 is "Strongly disagree" and 10 is "Strongly agree". The self-efficacy score will be computed on group and individual level and compared between the online and onsite groups by the Mann–Whitney test.

Preference assessment post course: Preference is measured as part of the general course satisfaction evaluation with the question “If you had the option to choose, which form would you prefer this course to have?” with the options “onsite form” and “online form”.

Academic achievement assessment is based on 24 monthly measurements post course of the number of publications, number of citations, h-index, and i10-index. These data are collected through the Google Scholar profiles [24] of the students, as this database covers most scientific journals. Associations between the onsite/online setting and long-term academic achievement will be examined with Kaplan–Meier curves and the log-rank test, with a significance level of 0.05.
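A minimal sketch of the Kaplan–Meier estimator underlying this analysis, in standard-library Python. The function name and the example event definition ("first publication") are illustrative assumptions, and the log-rank test itself is not shown; in practice this would be done in R or a survival-analysis library.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.
    times:  follow-up month of the event or of censoring, per student
    events: 1 if the milestone (e.g. first publication) occurred, 0 if censored
    Returns [(time, survival probability)] at each event time."""
    data = sorted(zip(times, events))
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        at_risk = sum(1 for tt, _ in data if tt >= t)   # still under follow-up
        deaths = sum(e for tt, e in data if tt == t)    # events at this time
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
    return curve
```

With four students, events at months 1, 2, and 3 and one censoring at month 2, `kaplan_meier([1, 2, 2, 3], [1, 0, 1, 1])` steps the survival estimate from 0.75 down to 0.5 and then to 0, illustrating how censored observations leave the risk set without counting as events.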

Participant timeline

Enrolment for the course at the Faculty of Health Sciences, University of Copenhagen, Denmark, becomes available when it is published in the course catalogue. In the course description, the course location is "To be announced". Approximately 3–4 weeks before the course begins, the participant list is finalized, and students receive a welcome letter containing course details, including their allocation to either the online or onsite setting. On the first day of the course, oral information is provided, and participants provide informed consent, baseline variables, and baseline knowledge scores.

On the last day of scheduled activities, the following scores are collected: knowledge, motivation, self-efficacy, setting preference, and academic achievement. To track students' long-term academic achievements, follow-ups are conducted monthly for a period of 24 months, with assessments occurring within one week of the last course day (Table 1).

Sample size

The power calculation is based on the main outcome, theoretical learning in the short term. For the sample size determination, we considered the 12 available seats per course. To achieve statistical power, we aimed for 8 clusters in each of the online and onsite arms (16 clusters in total) to detect an increase in learning outcome of 20% (an increase of 5 points). We assumed an intraclass correlation coefficient of 0.02, a standard deviation of 10, a power of 80%, and a two-sided alpha level of 5%. The allocation ratio was set at 1, implying an equal number of subjects in the online and onsite groups.

Allowing for a dropout of up to 2 students per course, equivalent to 17%, we determined that a total of 112 participants would be needed. This calculation factored in 10 clusters of 12 participants per study arm, which we deemed sufficient to assess any changes in learning outcome.

The sample size was estimated using the function n4means from the R package CRTSize [ 25 ].
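For orientation, the standard design-effect calculation behind such cluster sample sizes can be sketched in Python. This is an assumption-laden illustration using the usual normal-approximation formula for two means; the protocol's estimate came from `n4means` in R's CRTSize, whose exact method and rounding conventions may yield slightly different cluster counts.

```python
from math import ceil
from statistics import NormalDist

def cluster_rct_n_per_arm(delta=5, sd=10, icc=0.02, m=12,
                          alpha=0.05, power=0.80):
    """Per-arm sample size for comparing two means in a cluster RCT:
    the normal-approximation formula for individual randomization,
    inflated by the design effect 1 + (m - 1) * ICC for clusters of size m.
    Defaults mirror the protocol's assumptions (5-point difference,
    SD 10, ICC 0.02, 12 seats per course, 80% power, two-sided 5% alpha)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for 80% power
    n_individual = (z_alpha + z_beta) ** 2 * 2 * sd ** 2 / delta ** 2
    design_effect = 1 + (m - 1) * icc
    n_arm = ceil(n_individual * design_effect)
    return n_arm, ceil(n_arm / m)   # participants and clusters per arm
```

Under these assumptions the formula gives 77 participants, i.e. 7 full courses, per arm before accounting for dropout; dedicated software such as CRTSize can round up further (the protocol aims for 8 clusters per arm plus dropout reserve).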

Recruitment

Participants are PhD students enrolled in 10 courses of “Practical Course in Systematic Review Technique in Clinical Research” and 10 courses of “Getting started: Writing your first manuscript for publication” at the PhD School of the Faculty of Health Sciences, University of Copenhagen, Denmark.

Assignment of interventions: allocation

Randomization will be performed at course level. The courses are randomized by a computer random number generator [26]. To achieve a balanced randomization per year, 2 sets of 2 unique random integers each, drawn from the range 1–4, are requested.
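This allocation scheme can be sketched in Python. A hedged illustration: the trial drew its integers from random.org, and the function name and seed below are hypothetical, used only to make the sketch reproducible.

```python
import random

def allocate_year(n_courses=4, n_online=2, seed=None):
    """Balanced course-level allocation for one year: draw n_online unique
    course numbers from 1..n_courses for the online arm; the remaining
    courses run onsite. The seed is for reproducibility of this
    illustration only (the trial used random.org)."""
    rng = random.Random(seed)
    online = sorted(rng.sample(range(1, n_courses + 1), n_online))
    onsite = sorted(set(range(1, n_courses + 1)) - set(online))
    return online, onsite
```

Each yearly draw partitions the four courses into two online and two onsite, so the arms stay balanced within every year of recruitment.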

The setting is not included in the course catalogue of the PhD School, and thus allocation to online or onsite is concealed until 3–4 weeks before course commencement, when a welcome letter with course information, including the allocation to the online or onsite setting, is distributed to the students. The lecturers are also informed of the course setting at this time point. If students withdraw from the course after being informed of the setting, a letter is sent to them enquiring about the reason for withdrawal, which is recorded (Appendix 7).

The allocation sequence is generated by a computer random number generator (random.org). The participants and the lecturers sign up for the course without knowing the course setting (online or onsite) until 3–4 weeks before the course.

Assignment of interventions: blinding

Due to the nature of the study, it is not possible to blind trial participants or lecturers. The outcomes are reported by the participants directly in an online form, and are thus blinded to the outcome assessor but not to the individual participant. The data collection for the long-term follow-up on academic achievements is conducted without blinding. However, the external researcher analysing the data will be blinded.

Data collection and management

Data will be collected by the project leader (Table  1 ). Baseline variables and post course knowledge, motivation, and self-efficacy are self-reported through questionnaires in SurveyXact® [ 27 ]. Academic achievements are collected through Google Scholar profiles of the participants.

Given that we are using participant assessments and evaluations for research purposes, all data collection – except for the monthly follow-up of academic achievements after the course – takes place at the immediate beginning or end of the course, and we therefore expect participant retention to be high.

Data will be downloaded from SurveyXact and stored in a locked and logged drive on a computer belonging to the Capital Region of Denmark. Only the project leader has access to the data.

Throughout the trial, the project is conducted in accordance with the Danish Data Protection Agency guidelines and the European GDPR. Following the end of the trial, data will be stored at the Danish National Data Archive, which fulfils Danish and European guidelines for data protection and management.

Statistical methods

Data are anonymized and blinded before the analyses. Analyses are performed by a researcher not otherwise involved in inclusion, randomization, data collection, or data handling. All statistical tests will test the null hypothesis that the two arms of the trial are equal on the corresponding estimates. Analysis of the primary outcome, short-term learning, will start once all data have been collected for all individuals in the last included course. Analyses of long-term academic achievement will start at the end of follow-up.

Baseline characteristics including both course- and individual level information will be presented. Table 2 presents the available data on baseline.

We will use multivariate analysis to identify the most important predictors (motivation, self-efficacy, sex, educational background, and knowledge) of effect in the short and long term. The results will be presented as risk ratios (RR) with 95% confidence intervals (CI) and will be considered significant if the CI does not include one.

All data processing and analyses are conducted using R statistical software version 4.1.0 (2021-05-18; R Foundation for Statistical Computing, Vienna, Austria).

If possible, all analysis will be performed for “Practical Course in Systematic Review Technique in Clinical Research” and for “Getting started: Writing your first manuscript for publication” separately.

Primary analyses will follow the intention-to-treat approach. The analyses will include all individuals with valid data, regardless of whether they attended the complete course. Missing data will be handled with multiple imputation [28].

Upon reasonable request, public access will be granted to the protocol, the datasets analysed during the current study, and the statistical code (Table 3).

Oversight, monitoring, and adverse events

This project is coordinated in collaboration between the WHO CC (DEN-62) at the Parker Institute, CAMES, and the PhD School at the Faculty of Health and Medical Sciences, University of Copenhagen. The project leader runs the day-to-day support of the trial. The steering committee of the trial includes principal investigators from WHO CC (DEN-62) and CAMES and the project leader and meets approximately three times a year.

Data monitoring is done on a daily basis by the project leader and controlled by an external independent researcher.

An adverse event is "a harmful and negative outcome that happens when a patient has been provided with medical care" [29]. Since this trial does not involve patients in medical care, we do not expect adverse events. If participants decline to take part in the course after receiving information about the course setting, the reason for declining is sought. If the reason is the setting, this can be considered an unintended effect. Information on unintended effects of the online setting (the intervention) will be recorded. Participants are encouraged to contact the project leader with any response to the course in general, both during and after the course.

The trial description has been sent to the Scientific Ethical Committee of the Capital Region of Denmark (VEK) (21041907), which assessed that notification was not necessary and that the trial could proceed without permission from VEK according to Danish law and regulation of scientific research. The trial is registered with the Danish Data Protection Agency (P-2022-158). Important protocol modifications will be communicated to relevant parties, as well as to VEK, the Joint Regional Information Security, and ClinicalTrials.gov, as quickly as possible.

Dissemination plans

The results (positive, negative, or inconclusive) will be disseminated in educational, scientific, and clinical fora, in international scientific peer-reviewed journals, and clinicaltrials.gov will be updated upon completion of the trial. After scientific publication, the results will be disseminated to the public by the press, social media including the website of the hospital and other organizations – as well as internationally via WHO CC (DEN-62) at the Parker Institute and WHO Europe.

All authors will fulfil the ICMJE recommendations for authorship, and RR will be first author of the articles as a part of her PhD dissertation. Contributors who do not fulfil these recommendations will be offered acknowledgement in the article.

This cluster randomized trial investigates whether an onsite setting of a research course for PhD students within the health and medical sciences differs from an online setting. The outcomes measured are learning of research methodology (primary), preference, motivation, and self-efficacy (secondary) in the short term, and academic achievements (secondary) in the long term.

The results of this study will be discussed as follows:

Discussion of primary outcome

The primary outcome will be compared and contrasted with similar studies, including recent RCTs and mixed-methods studies on online and onsite research methodology courses within health and medical education [10, 11, 30] and, for inspiration, outside the field [31, 32]: Tokalić finds similar outcomes for online and onsite; Martinic finds that a web-based educational intervention improves knowledge; Cheung concludes that the evidence is insufficient to say that the two modes have different learning outcomes; Kofoed finds an online setting to have a negative impact on learning; and Rahimi-Ardabili presents positive self-reported student knowledge. These conflicting results will be discussed in the context of this study's learning outcome. The literature may change if more relevant studies are published.

Discussion of secondary outcomes

Significant secondary outcomes will be compared and contrasted with similar studies.

Limitations, generalizability, bias and strengths

It is a limitation of this study that an onsite curriculum for a full day is delivered identically online, as this may favour the onsite course due to screen fatigue [33]. At the same time, it is a strength that the time schedules are similar in both settings. The offer of coffee, tea, water, and a plain sandwich in the onsite course may better facilitate socializing. Another limitation is that the study is performed in Denmark within a specific educational culture, with institutional policies and resources that might affect the outcome and limit generalization to other geographical settings. However, international students are welcome in the class.

In educational interventions it is generally difficult to blind participants and this inherent limitation also applies to this trial [ 11 ]. Thus, the participants are not blinded to their assigned intervention, and neither are the lecturers in the courses. However, the external statistical expert will be blinded when doing the analyses.

We chose to compare an in-person onsite setting with a synchronous online setting. Therefore, the findings for the online setting cannot be expected to generalize to asynchronous online settings. Asynchronous delivery has in some cases shown positive results, possibly because students could go back and forth through the modules in the interface without a time limit [11].

We will report on all the outcomes defined prior to conducting the study to avoid selective reporting bias.

It is a strength of the study that it seeks to report outcomes at levels 1, 2, and 4 of the Kirkpatrick conceptual framework, and not solely at level 1. It is also a strength that the study is cluster randomized, which will reduce "contamination" between the two settings, has an adequately powered, calculated sample size, and looks for a relevant educational difference of 20% between the online and onsite settings.

Perspectives with implications for practice

The results of this study may have implications for which educational setting students choose. Learning and preference results have implications for which setting lecturers, course managers, and curriculum developers should plan for in health and medical education. The study may also inspire teaching and training in other disciplines. From a societal perspective, it also has implications because we will know the effect and preferences of online learning in case of a future lockdown.

Future research could investigate academic achievements in online and onsite research training in the long run (Kirkpatrick 4); the effect of blended learning versus online or onsite (Kirkpatrick 2); lecturers' preferences for online and onsite settings within health and medical education (Kirkpatrick 1); and resource use in synchronous and asynchronous online learning (Kirkpatrick 5).

Trial status

This trial collected pilot data from August to September 2021 and opened for inclusion in January 2022. Completion of recruitment is expected in April 2024 and long-term follow-up in April 2026. Protocol version number 1, 03.06.2022, with amendments 30.11.2023.

Availability of data and materials

The project leader will have access to the final trial dataset which will be available upon reasonable request. Exception to this is the qualitative raw data that might contain information leading to personal identification.

Abbreviations

AI: Artificial intelligence

CAMES: Copenhagen Academy for Medical Education and Simulation

CI: Confidence interval

COVID: Coronavirus disease

ECTS: European Credit Transfer and Accumulation System

ICMJE: International Committee of Medical Journal Editors

IMI: Intrinsic Motivation Inventory

MCQ: Multiple-choice questionnaire

MD: Doctor of Medicine

MSc: Master of Science

RCT: Randomized controlled trial

VEK: Scientific Ethical Committee of the Capital Region of Denmark

WHO CC: WHO Collaborating Centre for Evidence-Based Clinical Health Promotion

References

1. Samara M, Algdah A, Nassar Y, Zahra SA, Halim M, Barsom RMM. How did online learning impact the academic. J Technol Sci Educ. 2023;13(3):869–85.

2. Nejadghaderi SA, Khoshgoftar Z, Fazlollahi A, Nasiri MJ. Medical education during the coronavirus disease 2019 pandemic: an umbrella review. Front Med (Lausanne). 2024;11:1358084. https://doi.org/10.3389/fmed.2024.1358084.

3. Madi M, Hamzeh H, Abujaber S, Nawasreh ZH. Have we failed them? Online learning self-efficacy of physiotherapy students during COVID-19 pandemic. Physiother Res Int. 2023;5:e1992. https://doi.org/10.1002/pri.1992.

4. Torda A. How COVID-19 has pushed us into a medical education revolution. Intern Med J. 2020;50(9):1150–3.

5. Alhat S. Virtual Classroom: A Future of Education Post-COVID-19. Shanlax Int J Educ. 2020;8(4):101–4.

6. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: A meta-analysis. JAMA. 2008;300(10):1181–96. https://doi.org/10.1001/jama.300.10.1181.

7. Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019;24(1):1666538. https://doi.org/10.1080/10872981.2019.1666538.

8. Richmond H, Copsey B, Hall AM, Davies D, Lamb SE. A systematic review and meta-analysis of online versus alternative methods for training licensed health care professionals to deliver clinical interventions. BMC Med Educ. 2017;17(1):227. https://doi.org/10.1186/s12909-017-1047-4.

9. George PP, Zhabenko O, Kyaw BM, Antoniou P, Posadzki P, Saxena N, Semwal M, Tudor Car L, Zary N, Lockwood C, Car J. Online Digital Education for Postregistration Training of Medical Doctors: Systematic Review by the Digital Health Education Collaboration. J Med Internet Res. 2019;21(2):e13269. https://doi.org/10.2196/13269.

10. Tokalić R, Poklepović Peričić T, Marušić A. Similar Outcomes of Web-Based and Face-to-Face Training of the GRADE Approach for the Certainty of Evidence: Randomized Controlled Trial. J Med Internet Res. 2023;25:e43928. https://doi.org/10.2196/43928.

11. Krnic Martinic M, Čivljak M, Marušić A, Sapunar D, Poklepović Peričić T, Buljan I, et al. Web-Based Educational Intervention to Improve Knowledge of Systematic Reviews Among Health Science Professionals: Randomized Controlled Trial. J Med Internet Res. 2022;24(8):e37000.

12. https://www.mentimeter.com/. Accessed 4 Dec 2023.

13. https://www.sendsteps.com/en/. Accessed 4 Dec 2023.

14. https://da.padlet.com/. Accessed 4 Dec 2023.

15. Zackoff MW, Real FJ, Abramson EL, Li STT, Klein MD, Gusic ME. Enhancing Educational Scholarship Through Conceptual Frameworks: A Challenge and Roadmap for Medical Educators. Acad Pediatr. 2019;19(2):135–41. https://doi.org/10.1016/j.acap.2018.08.003.

16. https://zoom.us/. Accessed 20 Aug 2024.

17. Raffing R, Larsen S, Konge L, Tønnesen H. From Targeted Needs Assessment to Course Ready for Implementation-A Model for Curriculum Development and the Course Results. Int J Environ Res Public Health. 2023;20(3):2529. https://doi.org/10.3390/ijerph20032529.

18. https://www.kirkpatrickpartners.com/the-kirkpatrick-model/. Accessed 12 Dec 2023.

19. Smidt A, Balandin S, Sigafoos J, Reed VA. The Kirkpatrick model: A useful tool for evaluating training outcomes. J Intellect Dev Disabil. 2009;34(3):266–74.

20. Campbell K, Taylor V, Douglas S. Effectiveness of online cancer education for nurses and allied health professionals; a systematic review using Kirkpatrick evaluation framework. J Cancer Educ. 2019;34(2):339–56.

21. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63–7.

22. Ryan RM, Deci EL. Self-Determination Theory and the Facilitation of Intrinsic Motivation, Social Development, and Well-Being. Am Psychol. 2000;55(1):68–78. https://doi.org/10.1037//0003-066X.55.1.68.

23. Williams GM, Smith AP. Using single-item measures to examine the relationships between work, personality, and well-being in the workplace. Psychology. 2016;07(06):753–67.

24. https://scholar.google.com/intl/en/scholar/citations.html. Accessed 4 Dec 2023.

25. Rotondi MA. CRTSize: sample size estimation functions for cluster randomized trials. R package version 1.0. 2015. Available from: https://cran.r-project.org/package=CRTSize.

26. Random.org. Available from: https://www.random.org/

27. https://rambollxact.dk/surveyxact. Accessed 4 Dec 2023.

28. Sterne JAC, White IR, Carlin JB, Spratt M, Royston P, Kenward MG, et al. Multiple imputation for missing data in epidemiological and clinical research: Potential and pitfalls. BMJ. 2009;339:157–60.

29. Skelly C, Cassagnol M, Munakomi S. Adverse Events. StatPearls. Treasure Island: StatPearls Publishing; 2023. Available from: https://www.ncbi.nlm.nih.gov/books/NBK558963/.

30. Rahimi-Ardabili H, Spooner C, Harris MF, Magin P, Tam CWM, Liaw ST, et al. Online training in evidence-based medicine and research methods for GP registrars: a mixed-methods evaluation of engagement and impact. BMC Med Educ. 2021;21(1):1–14. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8439372/pdf/12909_2021_Article_2916.pdf.

31. Cheung YYH, Lam KF, Zhang H, Kwan CW, Wat KP, Zhang Z, et al. A randomized controlled experiment for comparing face-to-face and online teaching during COVID-19 pandemic. Front Educ. 2023;8. https://doi.org/10.3389/feduc.2023.1160430.

32. Kofoed M, Gebhart L, Gilmore D, Moschitto R. Zooming to Class?: Experimental Evidence on College Students' Online Learning During Covid-19. IZA Discussion Paper No. 14356. SSRN Electron J. 2021.

33. Mutlu Aİ, Yüksel M. Listening effort, fatigue, and streamed voice quality during online university courses. Logop Phoniatr Vocol. 2024:1–8. https://doi.org/10.1080/14015439.2024.2317789.


Acknowledgements

We thank the students who made their evaluations available for this trial, and MSc (Public Health) Mie Sylow Liljendahl for statistical support.

Funding

Open access funding provided by Copenhagen University. The Parker Institute, which hosts the WHO CC (DEN-62), receives a core grant from the Oak Foundation (OCAY-18-774-OFIL). The Oak Foundation had no role in the design of the study; in the collection, analysis, and interpretation of the data; or in writing the manuscript.

Author information

Authors and Affiliations

WHO Collaborating Centre (DEN-62), Clinical Health Promotion Centre, The Parker Institute, Bispebjerg & Frederiksberg Hospital, University of Copenhagen, Copenhagen, 2400, Denmark

Rie Raffing & Hanne Tønnesen

Copenhagen Academy for Medical Education and Simulation (CAMES), Centre for HR and Education, The Capital Region of Denmark, Copenhagen, 2100, Denmark


Contributions

RR, LK, and HT made substantial contributions to the conception and design of the work; RR to the acquisition of data; and RR, LK, and HT to the interpretation of data. RR drafted the work, and RR, LK, and HT substantively revised it, approved the submitted version, and agreed to be personally accountable for their own contributions and for ensuring that any questions relating to the accuracy or integrity of the work are appropriately investigated, resolved, and documented.

Corresponding author

Correspondence to Rie Raffing.

Ethics declarations

Ethics approval and consent to participate

The Danish National Committee on Health Research Ethics has assessed the study (journal no. 21041907; date: 21-09-2021) without objections or comments. The study has been approved by the Danish Data Protection Agency (journal no. P-2022-158; date: 04-05-2022).

All PhD students participate after giving informed consent. They can withdraw from the study at any time without explanation and without consequences for their education. They will be offered information about the results at study completion. There are no risks for the course participants, as the measurements in the course follow routine procedure and participants are not affected by the follow-up in Google Scholar. However, the 15 minutes required to fill in the forms may be considered an inconvenience.

The project will follow the GDPR and the Joint Regional Information Security Policy. Names and ID numbers are stored on a secure, logged server at the Capital Region of Denmark to avoid the risk of data leaks. All outcomes are part of the routine evaluation at the courses, except the follow-up on academic achievement through publications and related indexes. However, the publications are publicly available in any case.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary materials 1–7.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Raffing, R., Konge, L. & Tønnesen, H. Learning effect of online versus onsite education in health and medical scholarship – protocol for a cluster randomized trial. BMC Med Educ 24 , 927 (2024). https://doi.org/10.1186/s12909-024-05915-z


Received: 25 March 2024

Accepted: 14 August 2024

Published: 26 August 2024

DOI: https://doi.org/10.1186/s12909-024-05915-z


Keywords

  • Self-efficacy
  • Achievements
  • Health and medical education

BMC Medical Education

ISSN: 1472-6920


How to Write a Literature Review | Guide, Examples, & Templates

Published on January 2, 2023 by Shona McCombes . Revised on September 11, 2023.

What is a literature review? A literature review is a survey of scholarly sources on a specific topic. It provides an overview of current knowledge, allowing you to identify relevant theories, methods, and gaps in the existing research that you can later apply to your paper, thesis, or dissertation topic.

There are five key steps to writing a literature review:

  1. Search for relevant literature
  2. Evaluate sources
  3. Identify themes, debates, and gaps
  4. Outline the structure
  5. Write your literature review

A good literature review doesn’t just summarize sources: it analyzes, synthesizes, and critically evaluates them to give a clear picture of the state of knowledge on the subject.


Table of contents

  • What is the purpose of a literature review?
  • Examples of literature reviews
  • Step 1 – Search for relevant literature
  • Step 2 – Evaluate and select sources
  • Step 3 – Identify themes, debates, and gaps
  • Step 4 – Outline your literature review’s structure
  • Step 5 – Write your literature review
  • Free lecture slides
  • Other interesting articles
  • Frequently asked questions

When you write a thesis, dissertation, or research paper, you will likely have to conduct a literature review to situate your research within existing knowledge. The literature review gives you a chance to:

  • Demonstrate your familiarity with the topic and its scholarly context
  • Develop a theoretical framework and methodology for your research
  • Position your work in relation to other researchers and theorists
  • Show how your research addresses a gap or contributes to a debate
  • Evaluate the current state of research and demonstrate your knowledge of the scholarly debates around your topic.

Writing literature reviews is a particularly important skill if you want to apply for graduate school or pursue a career in research. We’ve written a step-by-step guide that you can follow below.



Writing literature reviews can be quite challenging! A good starting point could be to look at some examples, depending on what kind of literature review you’d like to write.

  • Example literature review #1: “Why Do People Migrate? A Review of the Theoretical Literature” (a theoretical literature review about the development of economic migration theory from the 1950s to today)
  • Example literature review #2: “Literature review as a research methodology: An overview and guidelines” (a methodological literature review about interdisciplinary knowledge acquisition and production)
  • Example literature review #3: “The Use of Technology in English Language Learning: A Literature Review” (a thematic literature review about the effects of technology on language acquisition)
  • Example literature review #4: “Learners’ Listening Comprehension Difficulties in English Language Learning: A Literature Review” (a chronological literature review about how the concept of listening skills has changed over time)

You can also check out our templates with literature review examples and sample outlines at the links below.


Before you begin searching for literature, you need a clearly defined topic.

If you are writing the literature review section of a dissertation or research paper, you will search for literature related to your research problem and questions.

Make a list of keywords

Start by creating a list of keywords related to your research question. Include each of the key concepts or variables you’re interested in, and list any synonyms and related terms. You can add to this list as you discover new keywords in the process of your literature search.

For example, if you are researching how social media use affects body image among Generation Z, your keyword list might include:

  • Social media, Facebook, Instagram, Twitter, Snapchat, TikTok
  • Body image, self-perception, self-esteem, mental health
  • Generation Z, teenagers, adolescents, youth

Search for relevant sources

Use your keywords to begin searching for sources. Some useful databases to search for journals and articles include:

  • Your university’s library catalogue
  • Google Scholar
  • Project Muse (humanities and social sciences)
  • Medline (life sciences and biomedicine)
  • EconLit (economics)
  • Inspec (physics, engineering and computer science)

You can also use boolean operators to help narrow down your search.
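
As a concrete sketch of how boolean operators combine keywords (using the example keyword groups above; the `build_query` helper is hypothetical, not a feature of any database), synonyms within a concept are joined with OR, and the concept groups themselves are joined with AND:

```python
# Build a boolean search string: synonyms within a concept are joined
# with OR; the concept groups themselves are joined with AND.
def build_query(concept_groups):
    clauses = []
    for synonyms in concept_groups:
        # Quote multi-word terms so databases treat them as exact phrases
        terms = [f'"{t}"' if " " in t else t for t in synonyms]
        clauses.append("(" + " OR ".join(terms) + ")")
    return " AND ".join(clauses)

groups = [
    ["social media", "Instagram", "Snapchat", "TikTok"],
    ["body image", "self-perception", "self-esteem"],
    ["teenagers", "adolescents", "youth"],
]

print(build_query(groups))
# → ("social media" OR Instagram OR Snapchat OR TikTok) AND ("body image"
#   OR self-perception OR self-esteem) AND (teenagers OR adolescents OR youth)
```

Most library databases and Google Scholar accept strings of this general shape, though the exact syntax for phrase quoting and nesting varies by database.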

Make sure to read the abstract to find out whether an article is relevant to your question. When you find a useful book or article, you can check the bibliography to find other relevant sources.

You likely won’t be able to read absolutely everything that has been written on your topic, so it will be necessary to evaluate which sources are most relevant to your research question.

For each publication, ask yourself:

  • What question or problem is the author addressing?
  • What are the key concepts and how are they defined?
  • What are the key theories, models, and methods?
  • Does the research use established frameworks or take an innovative approach?
  • What are the results and conclusions of the study?
  • How does the publication relate to other literature in the field? Does it confirm, add to, or challenge established knowledge?
  • What are the strengths and weaknesses of the research?

Make sure the sources you use are credible, and make sure you read any landmark studies and major theories in your field of research.

You can use our template to summarize and evaluate sources you’re thinking about using.

Take notes and cite your sources

As you read, you should also begin the writing process. Take notes that you can later incorporate into the text of your literature review.

It is important to keep track of your sources with citations to avoid plagiarism. It can be helpful to make an annotated bibliography, where you compile full citation information and write a paragraph of summary and analysis for each source. This helps you remember what you read and saves time later in the process.
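
If you prefer to keep these records in plain files rather than in reference-manager software, a minimal sketch might look like this (the field names and the `format_entry` helper are arbitrary choices for illustration, not a standard format):

```python
# Minimal annotated-bibliography records: full citation information plus
# a short summary/analysis note for each source.
sources = [
    {
        "author": "McCombes, S.",
        "year": 2023,
        "title": "How to Write a Literature Review",
        "annotation": "Step-by-step guide covering searching, evaluating, "
                      "and structuring sources; useful for planning the review.",
    },
]

def format_entry(src):
    # An APA-like citation line, with the annotation indented below it.
    citation = f'{src["author"]} ({src["year"]}). {src["title"]}.'
    return citation + "\n    " + src["annotation"]

for entry in sources:
    print(format_entry(entry))
```

In practice, reference managers such as Zotero or EndNote store the same information and can export it in standard citation styles.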


To begin organizing your literature review’s argument and structure, be sure you understand the connections and relationships between the sources you’ve read. Based on your reading and notes, you can look for:

  • Trends and patterns (in theory, method or results): do certain approaches become more or less popular over time?
  • Themes: what questions or concepts recur across the literature?
  • Debates, conflicts and contradictions: where do sources disagree?
  • Pivotal publications: are there any influential theories or studies that changed the direction of the field?
  • Gaps: what is missing from the literature? Are there weaknesses that need to be addressed?

This step will help you work out the structure of your literature review and (if applicable) show how your own research will contribute to existing knowledge.

For example, in the literature on social media and body image, you might find that:

  • Most research has focused on young women.
  • There is an increasing interest in the visual aspects of social media.
  • But there is still a lack of robust research on highly visual platforms like Instagram and Snapchat—this is a gap that you could address in your own research.

There are various approaches to organizing the body of a literature review. Depending on the length of your literature review, you can combine several of these strategies (for example, your overall structure might be thematic, but each theme is discussed chronologically).

Chronological

The simplest approach is to trace the development of the topic over time. However, if you choose this strategy, be careful to avoid simply listing and summarizing sources in order.

Try to analyze patterns, turning points and key debates that have shaped the direction of the field. Give your interpretation of how and why certain developments occurred.

Thematic

If you have found some recurring central themes, you can organize your literature review into subsections that address different aspects of the topic.

For example, if you are reviewing literature about inequalities in migrant health outcomes, key themes might include healthcare policy, language barriers, cultural attitudes, legal status, and economic access.

Methodological

If you draw your sources from different disciplines or fields that use a variety of research methods, you might want to compare the results and conclusions that emerge from different approaches. For example:

  • Look at what results have emerged in qualitative versus quantitative research
  • Discuss how the topic has been approached by empirical versus theoretical scholarship
  • Divide the literature into sociological, historical, and cultural sources

Theoretical

A literature review is often the foundation for a theoretical framework. You can use it to discuss various theories, models, and definitions of key concepts.

You might argue for the relevance of a specific theoretical approach, or combine various theoretical concepts to create a framework for your research.

Like any other academic text, your literature review should have an introduction, a main body, and a conclusion. What you include in each depends on the objective of your literature review.

The introduction should clearly establish the focus and purpose of the literature review.

Depending on the length of your literature review, you might want to divide the body into subsections. You can use a subheading for each theme, time period, or methodological approach.

As you write, you can follow these tips:

  • Summarize and synthesize: give an overview of the main points of each source and combine them into a coherent whole
  • Analyze and interpret: don’t just paraphrase other researchers — add your own interpretations where possible, discussing the significance of findings in relation to the literature as a whole
  • Critically evaluate: mention the strengths and weaknesses of your sources
  • Write in well-structured paragraphs: use transition words and topic sentences to draw connections, comparisons and contrasts

In the conclusion, you should summarize the key findings you have taken from the literature and emphasize their significance.

When you’ve finished writing and revising your literature review, don’t forget to proofread thoroughly before submitting. Not a language expert? Check out Scribbr’s professional proofreading services!

This article has been adapted into lecture slides that you can use to teach your students about writing a literature review.

Scribbr slides are free to use, customize, and distribute for educational purposes.


If you want to know more about the research process, methodology, research bias, or statistics, make sure to check out some of our other articles with explanations and examples.

Methodology

  • Sampling methods
  • Simple random sampling
  • Stratified sampling
  • Cluster sampling
  • Likert scales
  • Reproducibility

Statistics

  • Null hypothesis
  • Statistical power
  • Probability distribution
  • Effect size
  • Poisson distribution

Research bias

  • Optimism bias
  • Cognitive bias
  • Implicit bias
  • Hawthorne effect
  • Anchoring bias
  • Explicit bias

A literature review is a survey of scholarly sources (such as books, journal articles, and theses) related to a specific topic or research question.

It is often written as part of a thesis, dissertation, or research paper, in order to situate your work in relation to existing knowledge.

There are several reasons to conduct a literature review at the beginning of a research project:

  • To familiarize yourself with the current state of knowledge on your topic
  • To ensure that you’re not just repeating what others have already done
  • To identify gaps in knowledge and unresolved problems that your research can address
  • To develop your theoretical framework and methodology
  • To provide an overview of the key findings and debates on the topic

Writing the literature review shows your reader how your work relates to existing research and what new insights it will contribute.

The literature review usually comes near the beginning of your thesis or dissertation. After the introduction, it grounds your research in a scholarly field and leads directly to your theoretical framework or methodology.

A literature review is a survey of credible sources on a topic, often used in dissertations, theses, and research papers. Literature reviews give an overview of knowledge on a subject, helping you identify relevant theories and methods, as well as gaps in existing research. Literature reviews are set up similarly to other academic texts, with an introduction, a main body, and a conclusion.

An annotated bibliography is a list of source references that has a short description (called an annotation) for each of the sources. It is often assigned as part of the research process for a paper.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

McCombes, S. (2023, September 11). How to Write a Literature Review | Guide, Examples, & Templates. Scribbr. Retrieved August 26, 2024, from https://www.scribbr.com/dissertation/literature-review/


Equine Team Launches 2024 Equine Community Survey

Photo credit: Laura Kenny


If you're an equine enthusiast, we want to hear from you!

The Penn State Extension Equine Team is launching a new survey to determine audience preferences for educational topics, program delivery methods, and more. The results will be used to fine-tune our educational offerings based on the industry's preferences. This is your chance to give feedback and tell us what you want from our team!

The survey is open to anyone over the age of 18, and all answers are completely anonymous. It should take around 10-15 minutes to complete the survey.

The link to take the survey is https://pennstate.qualtrics.com/jfe/form/SV_9QtgILGr8Lrv206. Your opinion matters!

Laura Kenny

  • Environmental stewardship
  • Pasture management
  • Manure management
  • Nutrient management


