
90 Survey Question Examples + Best Practices Checklist


An effective survey is the best way to collect customer feedback. It will serve as your basis for multiple functions, such as improving your product, supplementing market research, creating new marketing strategies, and much more. But what makes an effective survey?

The answer is simple: you have to ask the right questions. Good survey questions gather concrete information from your audience and give you a solid idea of what to do next. However, creating a survey is not that easy; you want to make every question count.

In this article, we’ll cover everything you need to know about survey questions, with 90 examples and use cases.

Understanding the anatomy of a good survey question can transform your approach to data collection, ensuring you gather information that’s both actionable and insightful. Let’s dive deeper into the elements that make a survey question effective:

  • Clarity is Key:  Questions should be straightforward and leave no room for interpretation, ensuring uniform understanding across all respondents.
  • Conciseness Matters:  Keep questions short and to the point. Avoid unnecessary wording that could confuse or disengage your audience.
  • Bias-Free Questions:  Ensure questions are neutral and do not lead respondents toward a particular answer. This maintains the integrity of your data.
  • Avoiding Ambiguity:  Specify the context clearly and ask questions in a way that allows for direct and clear answers, eliminating confusion.
  • Ensuring Relevance:  Each question should have a clear purpose and be directly related to your survey’s objectives, avoiding any irrelevant inquiries.
  • Easy to Answer:  Design questions in a format that is straightforward for respondents to understand and respond to, whether open-ended, multiple-choice, or using a rating scale.

Keep these points in mind as you prepare to write your survey questions. It also helps to refer back to these goals after drafting your survey so you can see if you hit each mark.

The primary goal of a survey is to collect information that helps meet a specific goal, whether that be gauging customer satisfaction or getting to know your target audience better. Asking the right survey questions is the best way to achieve that goal. More specifically, a good survey can help you with:

Informed Decision-Making

A solid foundation of data is essential for any business decision, and the right survey questions point you in the direction of the most valuable information.

Survey responses serve as a basis for the strategic decisions that can propel a business forward or redirect its course to avoid potential pitfalls. By understanding what your audience truly wants or needs, you can tailor your products or services to meet those demands more effectively.

Uncovering Customer Preferences

Today’s consumers have more options than ever before, and their preferences can shift with the wind. Asking the right survey questions helps you tap into the current desires of your target market, uncovering trends and preferences that may not be immediately obvious.

This insight allows you to adapt your products, services, and marketing messages to resonate more deeply with the target audience, fostering loyalty and encouraging engagement.

Identifying Areas for Improvement

No product, service, or customer experience is perfect, but the path to improvement lies in understanding where the gaps are. The right survey questions can shine a light on these areas, offering a clear view of what’s working and what’s not.

This feedback is invaluable for continuous improvement, helping you refine your products and enhance the customer experience. In turn, this can lead to increased satisfaction, loyalty, and positive word-of-mouth.

Reducing Churn Rate

Churn rate is the percentage of customers who stop using your service or product over a given period. High churn rates can be a symptom of deeper issues, such as dissatisfaction with the product or service, poor customer experience, or unmet needs. Including good survey questions can help you identify the reasons behind customer departure and take proactive steps to address them.

For example, survey questions that explore customer satisfaction levels, reasons for discontinuation, or the likelihood of recommending the service to others can pinpoint specific factors contributing to churn.
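Churn rate itself is straightforward arithmetic: customers lost during the period divided by customers at the start of it, expressed as a percentage. A minimal Python sketch with hypothetical numbers:

```python
def churn_rate(customers_at_start: int, customers_lost: int) -> float:
    """Percentage of customers lost over a given period."""
    return customers_lost / customers_at_start * 100

# Hypothetical quarter: 500 customers at the start, 40 cancelled
print(churn_rate(500, 40))  # 8.0
```

Tracking this number alongside survey responses about reasons for discontinuation lets you see whether the fixes you make actually move it.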

Minimizing Website Bounce Rate

Bounce rate  is the percentage of visitors leaving a website after viewing just one page. High bounce rates may signal issues with a site’s content, layout, or user experience not meeting visitor expectations.

Using surveys to ask about visitors’ on-site experience can provide valuable insights into website usability, content relevance, and ease of navigation. Well-crafted survey questions aimed at understanding the user experience can lead to strategic adjustments that improve overall website performance and foster a more engaged audience.
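Bounce rate follows the same pattern as churn: single-page sessions divided by total sessions, as a percentage. A quick sketch (the numbers are hypothetical):

```python
def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Percentage of sessions that viewed only one page before leaving."""
    return single_page_sessions / total_sessions * 100

# 420 of 1,000 sessions left after a single page
print(bounce_rate(420, 1000))  # 42.0
```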


A good survey consists of two or more types of survey questions. However, all questions must serve a purpose. In this section, we divide survey questions into nine categories and include the best survey question examples for each type:

1. Open Ended Questions

Open-ended questions  allow respondents to answer in their own words instead of selecting from pre-selected answers.

“What features would you like to see added to our product?”

“How did you hear about our service?”

“What was your reason for choosing our product over competitors?”

“Can you describe your experience with our customer service?”

“What improvements can we make to enhance your user experience?”

“Why did you cancel your subscription?”

“What challenges are you facing with our software?”

“How can we better support your goals?”

“What do you like most about our website?”

“Can you provide feedback on our new product launch?”

When to use open-ended questions: These survey questions are a good choice when you don’t yet have a solid grasp of customer satisfaction. Customers have the freedom to express all their thoughts and opinions, which gives you an accurate feel for how they perceive your brand.

2. Multiple Choice Questions

Multiple-choice questions offer a set of predefined answers, usually three to four. Businesses usually use multiple-choice survey questions to gather information on participants’ attitudes, behaviors, and preferences.

“Which of the following age groups do you fall into? (Under 18, 19-25, 26-35, 36-45, 46-55, 56+)”

“What is your primary use of our product? (Personal, Business, Educational)”

“How often do you use our service? (Daily, Weekly, Monthly, Rarely)”

“Which of our products do you use? (Product A, Product B, Product C, All of the above)”

“What type of content do you prefer? (Blogs, Videos, Podcasts, eBooks)”

“Where do you usually shop for our products? (Online, In-store, Both)”

“What is your preferred payment method? (Credit Card, PayPal, Bank Transfer, Cash)”

“Which social media platforms do you use regularly? (Facebook, Twitter, Instagram, LinkedIn)”

“What is your employment status? (Employed, Self-Employed, Unemployed, Student)”

“Which of the following best describes your fitness level? (Beginner, Intermediate, Advanced, Expert)”

When to use multiple-choice questions: Asking multiple-choice questions can help with market research and segmentation. You can easily divide respondents depending on what pre-determined answer they choose. However, if this is the purpose of your survey, each question must be based on behavioral types or customer personas.

3. Yes or No Questions

Yes or no questions are straightforward, offering a binary choice.

“Have you used our product before?”

“Would you recommend our service to a friend?”

“Are you satisfied with your purchase?”

“Do you understand the terms and conditions?”

“Was our website easy to navigate?”

“Did you find what you were looking for?”

“Are you interested in receiving our newsletter?”

“Have you attended one of our events?”

“Do you agree with our privacy policy?”

“Have you experienced any issues with our service?”

When to use yes/no questions: These survey questions are very helpful in market screening and filtering out certain people for targeted surveys. For example, asking “Have you used our product before?” helps you separate the people who have tried out your product, a.k.a. the people who qualify for your survey.

4. Rating Scale Questions

Rating scale questions ask respondents to rate their experience or satisfaction on a numerical scale.

“On a scale of 1-10, how would you rate our customer service?”

“How satisfied are you with the product quality? (1-5)”

“Rate your overall experience with our website. (1-5)”

“How likely are you to purchase again? (1-10)”

“On a scale of 1-10, how easy was it to find what you needed?”

“Rate the value for money of your purchase. (1-5)”

“How would you rate the speed of our service? (1-10)”

“Rate your satisfaction with our return policy. (1-5)”

“How comfortable was the product? (1-10)”

“Rate the accuracy of our product description. (1-5)”

When to use rating scale questions: As the survey question examples above show, rating scale questions give you excellent quantitative data on customer satisfaction.
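One common way to turn those ratings into a single quantitative metric is a "top-two-box" CSAT score: the share of respondents who chose the top two points on a 1-5 scale. This is a widely used convention, not a rule; a minimal Python sketch with hypothetical ratings:

```python
def csat(ratings: list[int]) -> float:
    """Top-two-box CSAT: percentage of 1-5 ratings that are 4 or 5."""
    satisfied = sum(1 for r in ratings if r >= 4)
    return satisfied / len(ratings) * 100

# Hypothetical batch of survey responses
print(csat([5, 4, 3, 5, 2, 4, 5, 1]))  # 62.5
```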

5. Checkbox Questions

Checkbox questions allow respondents to select multiple answers from a list. You can also include an “Others” option, where the respondent can answer in their own words.

“Which of the following features do you value the most? (Select all that apply)”

“What topics are you interested in? (Select all that apply)”

“Which days are you available? (Select all that apply)”

“Select the services you have used. (Select all that apply)”

“What types of notifications would you like to receive? (Select all that apply)”

“Which of the following devices do you own? (Select all that apply)”

“Select any dietary restrictions you have. (Select all that apply)”

“Which of the following brands have you heard of? (Select all that apply)”

“What languages do you speak? (Select all that apply)”

“Select the social media platforms you use regularly. (Select all that apply)”

When to use checkbox questions: Checkbox questions are an excellent tool for collecting  psychographic data , including information about customers’ lifestyles, behaviors, attitudes, beliefs, etc. Moreover, survey responses will help you correlate certain characteristics to specific market segments.
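Because checkbox answers are multi-select, each response is a list of options rather than a single value, so analysis usually starts with a frequency tally across all selections. A small sketch with made-up responses:

```python
from collections import Counter

# Hypothetical "Select the social media platforms you use" responses
responses = [
    ["Facebook", "Instagram"],
    ["Twitter"],
    ["Facebook", "LinkedIn", "Instagram"],
    ["Instagram"],
]

# Flatten every respondent's selections into one frequency count
tally = Counter(option for picks in responses for option in picks)
print(tally.most_common())
# [('Instagram', 3), ('Facebook', 2), ('Twitter', 1), ('LinkedIn', 1)]
```

Dividing each count by the number of respondents then gives the share of your audience that selected each option.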

6. Rank Order Questions

Rank order questions ask respondents to prioritize options according to their preference or importance.

“Rank the following features in order of importance to you. (Highest to Lowest)”

“Please rank these product options based on your preference. (1 being the most preferred)”

“Rank these factors by how much they influence your purchase decision. (Most to Least)”

“Order these services by how frequently you use them. (Most frequent to Least frequent)”

“Rank these issues by how urgently you think they need to be addressed. (Most urgent to Least urgent)”

“Please prioritize these company values according to what matters most to you. (Top to Bottom)”

“Rank these potential improvements by how beneficial they would be for you. (Most beneficial to Least beneficial)”

“Order these content types by your interest level. (Most interested to Least interested)”

“Rank these brands by your preference. (Favorite to Least favorite)”

“Prioritize these activities by how enjoyable you find them. (Most enjoyable to Least enjoyable)”

When to use rank order questions: Respondents must already be familiar with your brand or products to answer these questions, which is why we recommend using these for customers in the middle or bottom of your  conversion funnel .


7. Likert Scale Questions

Likert scale questions measure the intensity of feelings towards a statement on a scale of agreement or satisfaction. Usually, these survey questions use a 5 to 7-point scale, ranging from “Strongly Agree” to “Strongly Disagree” or something similar.

  • “I am satisfied with the quality of customer service. (Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree)”
  • “The product meets my needs. (Strongly Agree to Strongly Disagree)”
  • “I find the website easy to navigate. (Strongly Agree to Strongly Disagree)”
  • “I feel that the pricing is fair for the value I receive. (Strongly Agree to Strongly Disagree)”
  • “I would recommend this product/service to others. (Strongly Agree to Strongly Disagree)”
  • “I am likely to purchase from this company again. (Strongly Agree to Strongly Disagree)”
  • “The company values customer feedback. (Strongly Agree to Strongly Disagree)”
  • “I am confident in the security of my personal information. (Strongly Agree to Strongly Disagree)”
  • “The product features meet my expectations. (Strongly Agree to Strongly Disagree)”
  • “Customer service resolved my issue promptly. (Strongly Agree to Strongly Disagree)”

When to use Likert scale questions: You can use these survey question examples in different types of surveys, such as customer satisfaction (CSAT) surveys. Likert scale questions give you precise measurements of how satisfied respondents are with a specific aspect of your product or service.
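To get those precise measurements, Likert labels are typically coded to numbers (1-5) so responses can be averaged and compared over time. A minimal sketch, assuming a standard 5-point agreement scale:

```python
# Standard 5-point agreement scale coded to numbers
LIKERT = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly Agree": 5,
}

def mean_agreement(answers: list[str]) -> float:
    """Average numeric score for a batch of Likert responses."""
    scores = [LIKERT[a] for a in answers]
    return sum(scores) / len(scores)

print(mean_agreement(["Agree", "Strongly Agree", "Neutral", "Agree"]))  # 4.0
```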

8. Matrix Survey Questions

Matrix survey questions allow respondents to evaluate multiple items using the same set of response options. Many companies combine matrix survey questions with Likert scales to make the survey easier to complete.

  • “Please rate the following aspects of our service. (Customer support, Product quality, Delivery speed)”
  • “Evaluate your level of satisfaction with these website features. (Search functionality, Content relevance, User interface)”
  • “Rate the importance of the following factors in your purchasing decision. (Price, Brand, Reviews)”
  • “Assess your agreement with these statements about our company. (Innovative, Ethical, Customer-focused)”
  • “Rate your satisfaction with these aspects of our product. (Ease of use, Durability, Design)”
  • “Evaluate these aspects of our mobile app. (Performance, Security, Features)”
  • “Rate how well each of the following describes our brand. (Trustworthy, Innovative, Responsive)”
  • “Assess your satisfaction with these elements of our service. (Responsiveness, Accuracy, Friendliness)”
  • “Rate the effectiveness of these marketing channels for you. (Email, Social Media, Print Ads)”
  • “Evaluate your agreement with these workplace policies. (Flexibility, Diversity, Wellness initiatives)”

When to use matrix survey questions: Ask matrix survey questions when you want to make your survey more convenient to answer, as they allow multiple questions on various topics without repeating options. This is particularly helpful when you want to cover many points of interest in one survey.

9. Demographic Questions

Lastly, demographic questions collect basic information about respondents, aiding in data segmentation and analysis.

  • “What is your age?”
  • “What is your gender? (Male, Female, Prefer not to say, Other)”
  • “What is your highest level of education completed?”
  • “What is your employment status? (Employed, Self-employed, Unemployed, Student)”
  • “What is your household income range?”
  • “What is your marital status? (Single, Married, Divorced, Widowed)”
  • “How many people live in your household?”
  • “What is your ethnicity?”
  • “In which city and country do you currently reside?”
  • “What is your occupation?”

When to use demographic questions: From the survey question examples, you can easily tell that these questions aim to collect information on your respondents’ backgrounds, which will be helpful in creating buyer personas and improving market segmentation.


Surveys can help you accomplish many things for your business, but only if you do them right. Creating the perfect survey isn’t just about crafting the best survey questions; you also have to:

1. Define Your Objectives

Before crafting your survey, be clear about what you want to achieve. Whether it’s understanding customer satisfaction, gauging interest in a new product, or collecting feedback on services, having specific objectives will guide your survey design and ensure you ask the right questions.

2. Know Your Audience

Understanding who your respondents are will help tailor the survey to their interests and needs, increasing the likelihood of participation. Consider demographics, behaviors, and preferences to make your survey relevant and engaging to your target audience.

3. Choose the Right Type of Survey Questions

Utilize a mix of the nine types of survey questions to gather a wide range of data. Balance open-ended questions for qualitative insights with closed-ended questions for easy-to-analyze quantitative data. Ensure each question aligns with your objectives and is clear and concise.

4. Keep It Short and Simple (KISS)

Respondents are more likely to complete shorter surveys. Aim for a survey that takes 5-10 minutes to complete, focusing on essential questions only. A straightforward and intuitive survey design encourages higher response rates.

5. Use Simple Language

Avoid technical jargon, complex words, or ambiguous terms. The language should be accessible to all respondents, ensuring that questions are understood as intended.

6. Ensure Anonymity and Confidentiality

Assure respondents that their answers are anonymous and their data will be kept confidential. This assurance can increase the honesty and accuracy of the responses you receive.

7. Test Your Survey

Pilot your survey with a small group before full deployment. This testing phase can help identify confusing questions, technical issues, or any other aspects of the survey that might hinder response quality or quantity.

8. Choose the Right Distribution Channels

Select the most effective channels to reach your target audience. This could be via email, social media, your website, or in-app notifications, depending on where your audience is most active and engaged.

9. Offer Incentives

Consider offering incentives to increase participation rates. Incentives can range from discounts, entry into a prize draw, or access to exclusive content. Ensure the incentive is relevant and appealing to your target audience.

10. Analyze and Act on the Data

After collecting the responses, analyze the data to extract meaningful insights. Use these insights to make informed decisions, implement changes, or develop strategies that align with your objectives. Sharing key findings and subsequent actions with respondents can also demonstrate the value of their feedback and encourage future participation.

11. Follow Up

Consider following up with respondents after the survey, especially if you promised to share results or if you’re conducting longitudinal studies. A follow-up can reinforce their importance to your research and maintain engagement over time.

12. Iterate and Improve

Surveys are not a one-time activity. Regularly conducting surveys and iterating based on previous feedback and results can help you stay aligned with your audience’s changing needs and preferences.


These survey question examples are a great place to start in creating efficient and effective surveys. Why not take it a step further by integrating a  customer feedback tool  on your website?

FullSession  lets you collect instant visual feedback with an intuitive in-app survey. With this tool, you can:

  • Build unique surveys
  • Target feedback based on users’ devices or specific pages
  • Measure survey responses

Aside from FullSession’s customer feedback tool, you also gain access to:

  • Interactive heat maps: A  website heat map  shows you which items are gaining the most attention and which ones are not, helping you optimize UI and UX.
  • Session recordings: Watch  replays  or live sessions to see how users are navigating your website and pinpoint areas for improvement.
  • Funnels and conversions: Analyze funnel data to figure out what’s causing  funnel drops  and what contributes to successful conversions.


The FullSession platform offers a 14-day free trial. It provides two paid plans, Basic and Business, plus a custom-priced Enterprise tier. Here are more details on each plan.

  • The Basic plan costs $39/month and allows you to monitor up to 5,000 monthly sessions.
  • The Business plan costs $149/month and lets you track and analyze up to 25,000 monthly sessions.
  • The Enterprise plan starts from 100,000 monthly sessions and has custom pricing.

If you need more information, you can  get a demo.

It takes less than 5 minutes to set up your first website or app survey form with FullSession, and it’s completely free!

How many questions should I include in my survey?

Aim for 10-15 questions to keep surveys short and engaging, ideally taking 5-10 minutes to complete. Focus on questions that directly support your objectives.

How can I ensure my survey questions are not biased?

Use neutral language, avoid assumptions, balance answer choices, and pre-test your survey with a diverse group to identify and correct biases.

How do I increase my survey response rate?

To boost response rates, ensure your survey is concise and relevant to the audience. Use engaging questions, offer incentives where appropriate, and communicate the value of respondents’ feedback. Choose the right distribution channels to reach your target audience effectively.



Survey questions 101: 70+ survey question examples, types of surveys, and FAQs

How well do you understand your prospects and customers—who they are, what keeps them awake at night, and what brought them to your business in search of a solution? Asking the right survey questions at the right point in their customer journey is the most effective way to put yourself in your customers’ shoes.


This comprehensive intro to survey questions contains over 70 examples of effective questions, an overview of different types of survey questions, and advice on how to word them for maximum effect. Plus, we’ll toss in our pre-built survey templates, expert survey insights, and tips to make the most of AI for Surveys in Hotjar. ✨

Surveying your users is the simplest way to understand their pain points, needs, and motivations. But first, you need to know how to set up surveys that give you the answers you—and your business—truly need. Impactful surveys start here:

❓ The main types of survey questions : most survey questions are classified as open-ended, closed-ended, nominal, Likert scale, rating scale, and yes/no. The best surveys often use a combination of questions.

💡 70+ good survey question examples : our top 70+ survey questions, categorized across ecommerce, SaaS, and publishing, will help you find answers to your business’s most burning questions

✅ What makes a survey question ‘good’ : a good survey question is anything that helps you get clear insights and business-critical information about your customers 

❌ The dos and don’ts of writing good survey questions : remember to be concise and polite, use the foot-in-door principle, alternate questions, and test your surveys. But don’t ask leading or loaded questions, overwhelm respondents with too many questions, or neglect other tools that can get you the answers you need.

👍 How to run your surveys the right way : use a versatile survey tool like Hotjar Surveys that allows you to create on-site surveys at specific points in the customer journey or send surveys via a link

🛠️ 10 use cases for good survey questions : use your survey insights to create user personas, understand pain points, measure product-market fit, get valuable testimonials, measure customer satisfaction, and more

Use Hotjar to build your survey and get the customer insight you need to grow your business.

6 main types of survey questions

Let’s dive into our list of survey question examples, starting with a breakdown of the six main categories your questions will fall into:

Open-ended questions

Closed-ended questions

Nominal questions

Likert scale questions

Rating scale questions

'Yes' or 'no' questions

1. Open-ended survey questions

Open-ended questions  give your respondents the freedom to  answer in their own words , instead of limiting their response to a set of pre-selected choices (such as multiple-choice answers, yes/no answers, 0–10 ratings, etc.). 

Examples of open-ended questions:

What other products would you like to see us offer?

If you could change just one thing about our product, what would it be?

When to use open-ended questions in a survey

The majority of example questions included in this post are open-ended, and there are some good reasons for that:

Open-ended questions help you learn about customer needs you didn’t know existed , and they shine a light on areas for improvement that you may not have considered before. If you limit your respondents’ answers, you risk cutting yourself off from key insights.

Open-ended questions are very useful when you first begin surveying your customers and collecting their feedback. If you don't yet have a good amount of insight, answers to open-ended questions will go a long way toward educating you about who your customers are and what they're looking for.

There are, however, a few downsides to open-ended questions:

First, people tend to be less likely to respond to open-ended questions in general because they take comparatively more effort to answer than, say, a yes/no one

Second, but connected: if you ask consecutive open-ended questions during your survey, people will get tired of answering them, and their answers might become less helpful the more you ask

Finally, the data you receive from open-ended questions will take longer to analyze compared to easy 1-5 or yes/no answers—but don’t let that stop you. There are plenty of shortcuts that make it easier than it looks (we explain it all in our post about how to analyze open-ended questions , which includes a free analysis template.)

💡 Pro tip: if you’re using Hotjar Surveys, let our AI for Surveys feature analyze your open-ended survey responses for you. Hotjar AI reviews all your survey responses and provides an automated summary report of key findings, including supporting quotes and actionable recommendations for next steps.

2. Closed-ended survey questions

Closed-ended questions limit a user’s response options to a set of pre-selected choices. This broad category includes nominal questions, Likert scale questions, rating scale questions, and ‘yes’ or ‘no’ questions.

When to use closed-ended questions

Closed-ended questions work brilliantly in two scenarios:

To open a survey, because they require little time and effort and are therefore easy for people to answer. This is called the foot-in-the-door principle: once someone commits to answering the first question, they may be more likely to answer the open-ended questions that follow.

When you need to create graphs and trends based on people’s answers. Responses to closed-ended questions are easy to measure and use as benchmarks. Rating scale questions, in particular (e.g. where people rate customer service on a scale of 1-10), allow you to gather customer sentiment and compare your progress over time.

3. Nominal questions

A nominal question is a type of survey question that presents people with multiple answer choices; the answers are  non-numerical in nature and don't overlap  (unless you include an ‘all of the above’ option).

Example of nominal question:

What are you using [product name] for?

Personal use

Business use

Both business and personal use

When to use nominal questions

Nominal questions work well when there is a limited number of categories for a given question (see the example above). They’re easy to create graphs and trends from, but the downside is that you may not be offering enough categories for people to reply.

For example, if you ask people what type of browser they’re using and only give them three options to choose from, you may inadvertently alienate everybody who uses a fourth type and now can’t tell you about it.

That said, you can add an open-ended component to a nominal question with an expandable ’other’ category, where respondents can write in an answer that isn’t on the list. This way, you essentially ask an open-ended question that doesn’t limit them to the options you’ve picked.

4. Likert scale questions

The Likert scale is typically a 5- or 7-point scale that evaluates a respondent’s level of agreement with a statement or the intensity of their reaction toward something.

The scale develops symmetrically: the median number (e.g. a 3 on a 5-point scale) indicates a point of neutrality, the lowest number (always 1) indicates an extreme view, and the highest number (e.g. a 5 on a 5-point scale) indicates the opposite extreme view.

Example of a Likert scale question:

The British Museum uses a Likert scale Hotjar survey to gauge visitors’ reactions to their website optimizations

When to use Likert scale questions

Likert-type questions are also known as ordinal questions because the answers are presented in a specific order. Like other multiple-choice questions, Likert scale questions come in handy when you already have some sense of what your customers are thinking. For example, if your open-ended questions uncover a complaint about a recent change to your ordering process, you could use a Likert scale question to determine how the average user felt about the change.

A series of Likert scale questions can also be turned into a matrix question. Since they share identical response options, they are easily combined into a single matrix, which breaks up the monotony of answering one standalone question at a time.

5. Rating scale questions

Rating scale questions are questions where the answers map onto a numeric scale (such as rating customer support on a scale of 1-5, or likelihood to recommend a product from 0-10).

Examples of rating questions:

How likely are you to recommend us to a friend or colleague on a scale of 0-10?

How would you rate our customer service on a scale of 1-5?

When to use rating questions

Whenever you want to assign a numerical value to your survey or visualize and compare trends , a rating question is the way to go.

A typical rating question is used to determine Net Promoter Score® (NPS®): the question asks customers to rate their likelihood of recommending products or services to their friends or colleagues, and allows you to look at the results historically and see if you're improving or getting worse. Rating questions are also used for customer satisfaction (CSAT) surveys and product reviews.
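The NPS arithmetic behind that historical comparison is standard: subtract the percentage of detractors (scores 0-6) from the percentage of promoters (scores 9-10), ignoring passives (7-8). A minimal Python sketch with made-up scores:

```python
def nps(scores: list[int]) -> int:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round((promoters - detractors) / len(scores) * 100)

# Hypothetical batch of 0-10 responses
print(nps([10, 9, 8, 7, 6, 10, 3, 9]))  # 25
```

The result ranges from -100 (all detractors) to +100 (all promoters), which is what makes it easy to benchmark over time.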

When you use a rating question in a survey, be sure to explain what the scale means (e.g. 1 for ‘Poor’, 5 for ‘Amazing’). And consider adding a follow-up open-ended question to understand why the user left that score.

Example of a rating question (NPS):

Hotjar's Net Promoter Score® (NPS®) survey template lets you add open-ended follow-up questions so you can understand the reasons behind users' ratings

6. ‘Yes’ or ‘no’ questions

These dichotomous questions are super straightforward, requiring a simple ‘yes’ or ‘no’ reply.

Examples of yes/no questions:

Was this article useful? (Yes/No)

Did you find what you were looking for today? (Yes/No)

When to use ‘yes’ or ‘no’ questions

‘Yes’ and ‘no’ questions are a good way to quickly segment your respondents . For example, say you’re trying to understand what obstacles or objections prevent people from trying your product. You can place a survey on your pricing page asking people if something is stopping them, and follow up with the segment who replied ‘yes’ by asking them to elaborate further.

These questions are also effective for getting your foot in the door: a ‘yes’ or ‘no’ question requires very little effort to answer. Once a user commits to answering the first question, they tend to become more willing to answer the questions that follow, or even leave you their contact information.

#Web design agency NerdCow used Hotjar Surveys to add a yes/no survey on The Transport Library’s website, and followed it up with an open-ended question for more insights

70+ more survey question examples

Below is a list of good survey questions, categorized across ecommerce, software as a service (SaaS), and publishing. You don't have to use them word-for-word, but hopefully, this list will spark some extra-good ideas for the surveys you’ll run immediately after reading this article. (Plus, you can create all of them with Hotjar Surveys—stick with us a little longer to find out how. 😉)

📊 9 basic demographic survey questions

Ask these questions when you want context about your respondents and target audience, so you can segment them later. Consider including demographic information questions in your survey when conducting user or market research as well. 

But don’t ask demographic questions just for the sake of it—if you're not going to use some of the data points from these sometimes sensitive questions (e.g. if gender is irrelevant to the result of your survey), move on to the ones that are truly useful for you, business-wise. 

Take a look at the selection of examples below, and keep in mind that you can convert most of them to multiple choice questions:

What is your name?

What is your age?

What is your gender?

What company do you work for?

What vertical/industry best describes your company?

What best describes your role?

In which department do you work?

What is the total number of employees in your company (including all locations where your employer operates)?

What is your company's annual revenue?

🚀 Get started: gather more info about your users with our product-market fit survey template .

👥 20+ effective customer questions

These questions are particularly recommended for ecommerce companies:

Before purchase

What information is missing or would make your decision to buy easier?

What is your biggest fear or concern about purchasing this item?

Were you able to complete the purpose of your visit today?

If you did not make a purchase today, what stopped you?

After purchase

Was there anything about this checkout process we could improve?

What was your biggest fear or concern about purchasing from us?

What persuaded you to complete the purchase of the item(s) in your cart today?

If you could no longer use [product name], what’s the one thing you would miss the most?

What’s the one thing that nearly stopped you from buying from us?

👉 Check out our 7-step guide to setting up an ecommerce post-purchase survey .

Other useful customer questions

Do you have any questions before you complete your purchase?

What other information would you like to see on this page?

What were the three main things that persuaded you to create an account today?

What nearly stopped you from creating an account today?

Which other options did you consider before choosing [product name]?

What would persuade you to use us more often?

What was your biggest challenge, frustration, or problem in finding the right [product type] online?

Please list the top three things that persuaded you to use us rather than a competitor.

Were you able to find the information you were looking for?

How satisfied are you with our support?

How would you rate our service/support on a scale of 0-10? (0 = terrible, 10 = stellar)

How likely are you to recommend us to a friend or colleague? (NPS question)

Is there anything preventing you from purchasing at this point?

🚀 Get started: learn how satisfied customers are with our expert-built customer satisfaction and NPS survey templates .

Set up a survey in seconds

Use Hotjar's free survey templates to build virtually any type of survey, and start gathering valuable insights in moments.

🛍 30+ product survey questions

These questions are particularly recommended for SaaS companies:

Questions for new or trial users

What nearly stopped you from signing up today?

How likely are you to recommend us to a friend or colleague on a scale of 0-10? (NPS question)

Is our pricing clear? If not, what would you change?

Questions for paying customers

What convinced you to pay for this service?

What’s the one thing we are missing in [product type]?

What's one feature we can add that would make our product indispensable for you?

If you could no longer use [name of product], what’s the one thing you would miss the most?

🚀 Get started: find out what your buyers really think with our pricing plan feedback survey template .

Questions for former/churned customers

What is the main reason you're canceling your account? Please be blunt and direct.

If you could have changed one thing in [product name], what would it have been?

If you had a magic wand and could change anything in [product name], what would it be?

🚀 Get started: find out why customers churn with our free-to-use churn analysis survey template .

Other useful product questions

What were the three main things that persuaded you to sign up today?

Do you have any questions before starting a free trial?

What persuaded you to start a trial?

Was this help section useful?

Was this article useful?

How would you rate our service/support on a scale of 0-10? (0 = terrible, 10 = stellar)

Is there anything preventing you from upgrading at this point?

Is there anything on this page that doesn't work the way you expected it to?

What could we change to make you want to continue using us?

If you did not upgrade today, what stopped you?

What's the next thing you think we should build?

How would you feel if we discontinued this feature?

What's the next feature or functionality we should build?

🚀 Get started: gather feedback on your product with our free-to-use product feedback survey template .

🖋 20+ effective questions for publishers and bloggers

Questions to help improve content

If you could change just one thing in [publication name], what would it be?

What other content would you like to see us offer?

How would you rate this article on a scale of 1–10?

If you could change anything on this page, what would you have us do?

If you did not subscribe to [publication name] today, what was it that stopped you?

🚀 Get started: find ways to improve your website copy and messaging with our content feedback survey template .

New subscriptions

What convinced you to subscribe to [publication] today?

What almost stopped you from subscribing?

What were the three main things that persuaded you to join our list today?

Cancellations

What is the main reason you're unsubscribing? Please be specific.

Other useful content-related questions

What’s the one thing we are missing in [publication name]?

What would persuade you to visit us more often?

How likely are you to recommend us to someone with similar interests? (NPS question)

What’s missing on this page?

What topics would you like to see us write about next?

How useful was this article?

What could we do to make this page more useful?

Is there anything on this site that doesn't work the way you expected it to?

What's one thing we can add that would make [publication name] indispensable for you?

If you could no longer read [publication name], what’s the one thing you would miss the most?

💡 Pro tip: do you have a general survey goal in mind, but are struggling to pin down the right questions to ask? Give Hotjar’s AI for Surveys a go and watch as it generates a survey for you in seconds with questions tailored to the exact purpose of the survey you want to run.

What makes a good survey question?

We’ve run through more than 70 of our favorite survey questions—but what is it that makes a good survey question, well, good? An effective question is anything that helps you get clear insights and business-critical information about your customers, including:

Who your target market is

How you should price your products

What’s stopping people from buying from you

Why visitors leave your website

With this information, you can tailor your website, products, landing pages, and messaging to improve the user experience and, ultimately, maximize conversions .

How to write good survey questions: the DOs and DON’Ts

To help you understand the basics and avoid some rookie mistakes, we asked a few experts to give us their thoughts on what makes a good and effective survey question.

Survey question DOs

✅ DO focus your questions on the customer

It may be tempting to focus on your company or products, but it’s usually more effective to put the focus back on the customer. Get to know their needs, drivers, pain points, and barriers to purchase by asking about their experience. That’s what you’re after: you want to know what it’s like inside their heads and how they feel when they use your website and products.

Rather than asking, “Why did you buy our product?” ask, “What was happening in your life that led you to search for this solution?” Instead of asking, “What's the one feature you love about [product],” ask, “If our company were to close tomorrow, what would be the one thing you’d miss the most?” These types of surveys have helped me double and triple my clients.

✅ DO be polite and concise (without skimping on micro-copy)

Put time into your micro-copy—those tiny bits of written content that go into surveys. Explain why you’re asking the questions, and when people reach the end of the survey, remember to thank them for their time. After all, they’re giving you free labor!

✅ DO consider the foot-in-the-door principle

One way to increase your response rate is to ask an easy question upfront, such as a ‘yes’ or ‘no’ question, because once people commit to taking a survey—even just the first question—they’re more likely to finish it.

✅ DO consider asking your questions from the first-person perspective

Disclaimer: we don’t do this here at Hotjar. You’ll notice all our sample questions are listed in second-person (i.e. ‘you’ format), but it’s worth testing to determine which approach gives you better answers. Some experts prefer the first-person approach (i.e. ‘I’ format) because they believe it encourages users to talk about themselves—but only you can decide which approach works best for your business.

I strongly recommend that the questions be worded in the first person. This helps create a more visceral reaction from people and encourages them to tell stories from their actual experiences, rather than making up hypothetical scenarios. For example, here’s a similar question, asked two ways: “What do you think is the hardest thing about creating a UX portfolio?” versus “My biggest problem with creating my UX portfolio is…” 

The second version helps get people thinking about their experiences. The best survey responses come from respondents who provide personal accounts of past events that give us specific and real insight into their lives.

✅ DO alternate your questions often

Shake up the questions you ask on a regular basis. Asking a wide variety of questions will help you and your team get a complete view of what your customers are thinking.

✅ DO test your surveys before sending them out

A few years ago, Hotjar created a survey we sent to 2,000 CX professionals via email. Before officially sending it out, we wanted to make sure the questions really worked. 

We decided to test them out on internal staff and external people by sending out three rounds of test surveys to 100 respondents each time. Their feedback helped us perfect the questions and clear up any confusing language.

Survey question DON’Ts

❌ DON’T ask closed-ended questions if you’ve never done research before

If you’ve just begun asking questions, make them open-ended questions since you have no idea what your customers think about you at this stage. When you limit their answers, you just reinforce your own assumptions.

There are two exceptions to this rule:

Using a closed-ended question to get your foot in the door at the beginning of a survey

Using rating scale questions to gather customer sentiment (like an NPS survey)

❌ DON’T ask a lot of questions if you’re just getting started

Having to answer too many questions can overwhelm your users. Stick with the most important points and discard the rest.

Try starting off with a single question to see how your audience responds, then move on to two questions once you feel like you know what you’re doing.

How many questions should you ask? There’s really no perfect answer, but we recommend asking as few as you need to ask to get the information you want. In the beginning, focus on the big things:

Who are your users?

What do potential customers want?

How are they using your product?

What would win their loyalty?

❌ DON’T just ask a question when you can combine it with other tools

Don’t just use surveys to answer questions that other tools (such as analytics) can also answer. If you want to learn about whether people find a new website feature helpful, you can also observe how they’re using it through traditional analytics, session recordings , and other user testing tools for a more complete picture.

Don’t use surveys to ask people questions that other tools are better equipped to answer. I’m thinking of questions like “What do you think of the search feature?” with pre-set answer options like ‘Very easy to use,’ ‘Easy to use,’ etc. That’s not a good question to ask. 

Why should you care about what people ‘think’ about the search feature? You should find out whether it helps people find what they need and whether it helps drive conversions for you. Analytics, user session recordings, and user testing can tell you whether it does that or not.

❌ DON’T ask leading questions

A leading question is one that prompts a specific answer. Avoid asking leading questions because they’ll give you bad data. For example, asking, “What makes our product better than our competitors’ products?” might boost your self-esteem, but it won’t get you good information. Why? You’re effectively planting the idea that your own product is the best on the market.

❌ DON’T ask loaded questions

A loaded question is similar to a leading question, but it does more than just push a bias—it phrases the question such that it’s impossible to answer without confirming an underlying assumption.

A common (and subtle) form of loaded survey question would be, “What do you find useful about this article?” If we haven’t first asked you whether you found the article useful at all, then we’re asking a loaded question.

❌ DON’T ask about more than one topic at once

For example, “Do you believe our product can help you increase sales and improve cross-collaboration?”

This complex question, also known as a ‘double-barreled question’, requires a very complex answer as it begs the respondent to address two separate questions at once:

Do you believe our product can help you increase sales?

Do you believe our product can help you improve cross-collaboration?

Respondents may very well answer 'yes', but actually mean it for the first part of the question, and not the other. The result? Your survey data is inaccurate, and you’ve missed out on actionable insights.

Instead, ask two specific questions to gather customer feedback on each concept.

How to run your surveys

The format you pick for your survey depends on what you want to achieve and on how much budget or resources you have. You can:

Use an on-site survey tool, like Hotjar Surveys, to set up a website survey that pops up whenever people visit a specific page: this is useful when you want to investigate website- and product-specific topics quickly. This format is relatively inexpensive—with Hotjar’s free forever plan, you can even run up to 3 surveys with unlimited questions for free.

Use Hotjar Surveys to embed a survey as an element directly on a page: this is useful when you want to grab your audience’s attention and connect with customers at relevant moments, without interrupting their browsing. (Scroll to the bottom of this page to see an embedded survey in action!) This format is included on Hotjar’s Business and Scale plans—try it out for 15 days with a free Ask Business trial .

Use a survey builder and create a survey people can access in their own time: this is useful when you want to reach out to your mailing list or a wider audience with an email survey (you just need to share the URL the survey lives at). Sending in-depth questionnaires this way allows for more space for people to elaborate on their answers. This format is also relatively inexpensive, depending on the tool you use.

Place survey kiosks in a physical location where people can give their feedback by pressing a button: this is useful for quick feedback on specific aspects of a customer's experience (there are usually plenty of these in airports and waiting rooms). This format is relatively expensive to maintain due to the material upkeep.

Run in-person surveys with your existing or prospective customers: in-person questionnaires help you dig deep into your interviewees’ answers. This format is relatively cheap if you do it online with a user interview tool or over the phone, but it’s more expensive and time-consuming if done in a physical location.

💡 Pro tip: looking for an easy, cost-efficient way to connect with your users? Run effortless, automated user interviews with Engage , Hotjar’s user interview tool. Get instant access to a pool of 200,000+ participants (or invite your own), and take notes while Engage records and transcribes your interview.

10 survey use cases: what you can do with good survey questions

Effective survey questions can help improve your business in many different ways. We’ve written in detail about most of these ideas in other blog posts, so we’ve rounded them up for you below.

1. Create user personas

A user persona is a character based on the people who currently use your website or product. A persona combines psychographics and demographics and reflects who they are, what they need, and what may stop them from getting it.

Examples of questions to ask:

Describe yourself in one sentence, e.g. “I am a 30-year-old marketer based in Dublin who enjoys writing articles about user personas.”

What is your main goal for using this website/product?

What, if anything, is preventing you from doing it?

👉 Our post about creating simple and effective user personas in four steps highlights some great survey questions to ask when creating a user persona.

🚀 Get started: use our user persona survey template or AI for Surveys to inform your user persona.

2. Understand why your product is not selling

Few things are more frightening than stagnant sales. When the pressure is mounting, you’ve got to get to the bottom of it, and good survey questions can help you do just that.

What made you buy the product? What challenges are you trying to solve?

What did you like most about the product? What did you dislike the most?

What nearly stopped you from buying?

👉 Here’s a detailed piece about the best survey questions to ask your customers when your product isn’t selling , and why they work so well.

🚀 Get started: our product feedback survey template helps you find out whether your product satisfies your users. Or build your surveys in the blink of an eye with Hotjar AI.

3. Understand why people leave your website

If you want to figure out why people are leaving your website , you’ll have to ask questions.

A good format for that is an exit-intent pop-up survey, which appears when a user clicks to leave the page, giving them the chance to leave website feedback before they go.

Another way is to focus on the people who did convert, but just barely—something Hotjar founder David Darmanin considers essential for taking conversions to the next level. By focusing on customers who bought your product (but almost didn’t), you can learn how to win over another set of users who are similar to them: those who almost bought your products, but backed out in the end.

Example of questions to ask:

Not for you? Tell us why. ( Exit-intent pop-up —ask this when a user leaves without buying.)

What almost stopped you from buying? (Ask this post-conversion .)

👉 Find out how HubSpot Academy increased its conversion rate by adding an exit-intent survey that asked one simple question when users left their website: “Not for you? Tell us why.”

🚀 Get started: place an exit-intent survey on your site. Let Hotjar AI draft the survey questions by telling it what you want to learn.

I spent the better half of my career focusing on the 95% who don’t convert, but it’s better to focus on the 5% who do. Get to know them really well, deliver value to them, and really wow them. That’s how you’re going to take that 5% to 10%.

4. Understand your customers’ fears and concerns

Buying a new product can be scary: nobody wants to make a bad purchase. Your job is to address your prospective customers’ concerns, counter their objections, and calm their fears, which should lead to more conversions.

👉 Take a look at our no-nonsense guide to increasing conversions for a comprehensive write-up about discovering the drivers, barriers, and hooks that lead people to converting on your website.

🚀 Get started: understand why your users are tempted to leave and discover potential barriers with a customer retention survey .

5. Drive your pricing strategy

Are your products overpriced and scaring away potential buyers? Or are you underpricing and leaving money on the table?

Asking the right questions will help you develop a pricing structure that maximizes profit, but you have to be delicate about how you ask. Don’t ask directly about price, or you’ll seem unsure of the value you offer. Instead, ask questions that uncover how your products serve your customers and what would inspire them to buy more.

How do you use our product/service?

What would persuade you to use our product more often?

What’s the one thing our product is missing?

👉 We wrote a series of blog posts about managing the early stage of a SaaS startup, which included a post about developing the right pricing strategy —something businesses in all sectors could benefit from.

🚀 Get started: find the sweet spot in how to price your product or service with a Van Westendorp price sensitivity survey or get feedback on your pricing plan .

6. Measure and understand product-market fit

Product-market fit (PMF) is about understanding demand and creating a product that your customers want, need, and will actually pay money for. A combination of online survey questions and one-on-one interviews can help you figure this out.

What's one thing we can add that would make [product name] indispensable for you?

If you could change just one thing in [product name], what would it be?

👉 In our series of blog posts about managing the early stage of a SaaS startup, we covered a section on product-market fit , which has relevant information for all industries.

🚀 Get started: discover if you’re delivering the best products to your market with our product-market fit survey .

7. Choose effective testimonials

Human beings are social creatures—we’re influenced by people who are similar to us. Testimonials that explain how your product solved a problem for someone are the ultimate form of social proof. The following survey questions can help you get some great testimonials.

What changed for you after you got our product?

How does our product help you get your job done?

How would you feel if you couldn’t use our product anymore?

👉 In our post about positioning and branding your products , we cover the type of questions that help you get effective testimonials.

🚀 Get started: add a question asking respondents whether you can use their answers as testimonials in your surveys, or conduct user interviews to gather quotes from your users.

8. Measure customer satisfaction

It’s important to continually track your overall customer satisfaction so you can address any issues before they start to impact your brand’s reputation. You can do this with rating scale questions.

For example, at Hotjar, we ask for feedback after each customer support interaction (which is one important measure of customer satisfaction). We begin with a simple, foot-in-the-door question to encourage a response, and use the information to improve our customer support, which is strongly tied to overall customer satisfaction.

How would you rate the support you received? (1-5 scale)

If 1-3: How could we improve?

If 4-5: What did you love about the experience?

👉 Our beginner’s guide to website feedback goes into great detail about how to measure customer service, NPS , and other important success metrics.

🚀 Get started: gauge short-term satisfaction level with a CSAT survey .
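If you want to crunch the CSAT numbers yourself, the common convention (an assumption here, not a Hotjar-specific formula) is to report the percentage of respondents who picked the top two ratings on a 1-5 scale:

```python
def csat_score(ratings):
    """Percentage of respondents rating 4 or 5 on a 1-5 scale."""
    if not ratings:
        raise ValueError("no ratings collected yet")
    satisfied = sum(1 for r in ratings if r >= 4)
    return round(100 * satisfied / len(ratings), 1)

print(csat_score([5, 4, 3, 5, 2, 4, 5, 1]))  # prints 62.5 (5 of 8 rated 4 or 5)
```

Tracking this number after every support interaction, as described above, lets you spot dips before they hurt your reputation.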

9. Measure word-of-mouth recommendations

Net Promoter Score is a measure of how likely your customers are to recommend your products or services to their friends or colleagues. NPS is a higher bar than customer satisfaction because customers have to be really impressed with your product to recommend you.

Example of NPS questions (to be asked in the same survey):

How likely are you to recommend this company to a friend or colleague? (0-10 scale)

What’s the main reason for your score?

What should we do to WOW you?

👉 We created an NPS guide with ecommerce companies in mind, but it has plenty of information that will help companies in other industries as well.

🚀 Get started: measure whether your users would refer you to a friend or colleague with an NPS survey . Then, use our free NPS calculator to crunch the numbers.
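Under the hood, the NPS math is simple: respondents scoring 9-10 are promoters, 0-6 are detractors, and 7-8 are passives who only count toward the total. A minimal sketch of the calculation:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10, 9, 8, 7, 6, 10, 3, 9]))  # prints 25 (4 promoters, 2 detractors of 8)
```

The score ranges from -100 (all detractors) to +100 (all promoters), which is why tracking it over time matters more than any single reading.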

10. Redefine your messaging

How effective is your messaging? Does it speak to your clients' needs, drives, and fears? Does it speak to your strongest selling points?

Asking the right survey questions can help you figure out what marketing messages work best, so you can double down on them.

What attracted you to [brand or product name]?

Did you have any concerns before buying [product name]?

Since you purchased [product name], what has been the biggest benefit to you?

If you could describe [brand or product name] in one sentence, what would you say?

What is your favorite thing about [brand or product name]?

How likely are you to recommend this product to a friend or colleague? (NPS question)

👉 We talk about positioning and branding your products in a post that’s part of a series written for SaaS startups, but even if you’re not in SaaS (or you’re not a startup), you’ll still find it helpful.

Have a question for your customers? Ask!

Feedback is at the heart of deeper empathy for your customers and a more holistic understanding of their behaviors and motivations. And luckily, people are more than ready to share their thoughts about your business— they're just waiting for you to ask them. Deeper customer insights start right here, with a simple tool like Hotjar Surveys.

Build surveys faster with AI🔥

Use AI in Hotjar Surveys to build your survey, place it on your website or send it via email, and get the customer insight you need to grow your business.

FAQs about survey questions

How many people should I survey/what should my sample size be?

A good rule of thumb is to aim for at least 100 replies that you can work with.

You can use our sample size calculator to get a more precise answer, but understand that collecting feedback is research, not experimentation. Unlike experimentation (such as A/B testing), all is not lost if you can’t get a statistically significant sample size. In fact, as few as ten replies can give you actionable information about what your users want.
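If you do want a statistically grounded target, the classic sample-size formula for estimating a proportion is n = z^2 * p(1-p) / e^2. This is a standard statistics result shown as an illustration, not necessarily the exact method our calculator uses:

```python
import math

def sample_size(confidence_z=1.96, proportion=0.5, margin_of_error=0.05):
    """Minimum responses needed: n = z^2 * p(1-p) / e^2.

    Defaults assume 95% confidence (z=1.96), maximum variability (p=0.5),
    and a +/-5% margin of error.
    """
    n = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    return math.ceil(n)

print(sample_size())                       # prints 385
print(sample_size(margin_of_error=0.10))   # prints 97
```

Notice how relaxing the margin of error shrinks the required sample dramatically, which is why a loose, directional read of feedback needs far fewer replies.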

How many questions should my survey have?

There’s no perfect answer to this question, but we recommend asking as few as you need to ask in order to get the information you want. Remember, you’re essentially asking someone to work for free, so be respectful of their time.

Why is it important to ask good survey questions?

A good survey question is asked in a precise way at the right stage in the customer journey to give you insight into your customers’ needs and drives. The qualitative data you get from survey responses can supplement the insight you can capture through other traditional analytics tools (think Google Analytics) and behavior analytics tools (think heatmaps and session recordings , which visualize user behavior on specific pages or across an entire website).

The format you choose for your survey—in-person, email, on-page, etc.—is important, but if the questions themselves are poorly worded you could waste hours trying to fix minimal problems while ignoring major ones a different question could have uncovered. 

How do I analyze open-ended survey questions?

A big pile of  qualitative data  can seem intimidating, but there are some shortcuts that make it much easier to analyze. We put together a guide for  analyzing open-ended questions in 5 simple steps , which should answer all your questions.

But the fastest way to analyze open questions is to use the automated summary report with Hotjar AI in Surveys. AI turns the complex survey data into:

Key findings

Actionable insights
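If you'd rather do a quick manual pass first, a simple word-frequency count over responses often surfaces recurring themes. Here's a rough sketch; the stop-word list and sample answers are made up for illustration:

```python
from collections import Counter

# Minimal stop-word list for illustration; a real analysis would use a fuller one
STOP_WORDS = {"the", "a", "an", "to", "and", "of", "is", "it", "i", "my", "for"}

def top_themes(responses, n=5):
    """Count non-trivial words across open-ended answers to surface themes."""
    words = (
        word.strip(".,!?").lower()
        for response in responses
        for word in response.split()
    )
    counts = Counter(w for w in words if w and w not in STOP_WORDS)
    return counts.most_common(n)

answers = [
    "Checkout was slow and confusing.",
    "The checkout page kept reloading.",
    "Slow checkout, almost gave up.",
]
print(top_themes(answers))  # 'checkout' (3) and 'slow' (2) top the list
```

Even this crude tally points you toward the checkout flow; an AI summary does the same kind of aggregation with far more nuance.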

Will sending a survey annoy my customers?

Honestly, the real danger is  not  collecting feedback. Without knowing what users think about your page and  why  they do what they do, you’ll never create a user experience that maximizes conversions. The truth is, you’re probably already doing something that bugs them more than any survey or feedback button would.

If you’re worried that adding an on-page survey might hurt your conversion rate, start small and survey just 10% of your visitors. You can stop surveying once you have enough replies.
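The 10% gate itself is trivial to implement if your tool doesn't offer sampling out of the box. A hypothetical sketch (the function name and thresholds are ours, not from any specific survey tool):

```python
import random

def should_show_survey(sample_rate=0.10, target_responses=100, collected=0):
    """Show the survey to ~10% of visitors until enough replies come in."""
    if collected >= target_responses:
        return False  # stop surveying once you have enough replies
    return random.random() < sample_rate

# Simulate 10,000 visits to see roughly how many would get the survey
random.seed(42)
shown = sum(should_show_survey() for _ in range(10_000))
print(shown)  # roughly 1,000 of 10,000 visitors
```

Most on-site survey tools expose this as a built-in sampling or targeting setting, so you'd normally configure it rather than code it.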


16 Types of Survey Questions, with 100 Examples

Good survey questions will help your business acquire the right information to drive growth. Surveys can be made up of different types of questions. Each type has a unique approach to gathering data. The questions you choose and the way you use them in your survey will affect its results.

These are the types of survey questions we will cover:

  • Open-Ended Questions
  • Closed-Ended Questions
  • Multiple Choice Questions
  • Dichotomous Questions
  • Rating Scale Questions
  • Likert Scale Questions
  • Nominal Questions
  • Demographic Questions
  • Matrix Table Questions
  • Side-by-Side Matrix Questions
  • Data Reference Questions
  • Choice Model Questions
  • Net Promoter Score Questions
  • Picture Choice Questions
  • Image Rating Questions
  • Visual Analog Scale Questions

But before we go into the actual question types, let’s talk a little about how you should use them.


How to Use Survey Questions in Market Research

First, you need to make sure it’s a survey you’re after. In some cases, you may find that it’s actually a questionnaire that you need (read more here to learn the difference:  Survey Vs. Questionnaire ), or a research quiz. In any case, though, you will need to use the right type of questions.

To determine the right type of questions for your survey, consider these factors:

  • The kind of data you want to gather
  • The depth of the information you require
  • How long it takes to answer the survey

Regardless of the size of your business, you can use surveys to learn about potential customers, research your product market fit, collect customer feedback or employee feedback, get new registrations, and improve retention.

Surveys can help you gather valuable insights into critical aspects of your business. From brand awareness to customer satisfaction, effective surveys give you the data you need to stay ahead of the competition.

So, how should you use surveys for your market research?


Identify Customer Needs and Expectations

Perhaps the idea of using customer surveys in this advanced era of data analytics seems quaint. But one of the best ways to find out what consumers need and expect is to go directly to the source and ask. That’s why surveys still matter. All companies and online businesses can benefit from using market research surveys to determine the needs of their clients.

Determine Brand Attributes

A market research survey can also help your company identify the attributes that consumers associate with your brand. These could be tangible or intangible features that they think of when they see your brand. By determining your brand attributes, you can identify other brands in the same niche. Additionally, you can gain a clear understanding of what your audience values.

Understand Your Market’s Supply and Demand Chain

Surveying existing and potential customers enables you to understand the language of supply and demand. You can understand the measure of customer satisfaction and identify opportunities for the market to absorb new products. At the same time, you can use the data you collect to build customer-centric products or services. By understanding your target market, you can minimize the risks involved in important business ventures and develop an amazing customer experience.

Acquire Customer Demographic Information

Before any campaign or product launch, every company needs to determine its key demographic. Online surveys make it so much easier for marketers to get to know their audience and build effective user personas. With a market research survey, you can ask demographic survey questions to collect details such as family income, education, professional background, and ethnicity. It’s important to be careful and considerate in this area since questions that seem matter-of-fact to you may be experienced as loaded questions or sensitive questions by your audience.

Strategize for New Product Launches

Businesses of all sizes can use customer surveys to fine-tune products and improve services. Let’s say there’s a product you want to launch. But you’re hesitant to do so without ensuring that it will be well-received by your target audience. Why not send out a survey? With the data you gather from the survey responses, you can identify issues that may have been overlooked in the development process and make the necessary changes to improve your product’s success.

Develop a Strategic Marketing Plan

Surveys can be used in the initial phases of a campaign to help shape your marketing plan. Thanks to in-depth analytics, a quick and easy survey that respondents can finish within minutes can give you a clear idea of what potential consumers need and expect.


Types of Survey Questions

No matter the purpose of your survey, the questions you ask will be crucial to its success. For this reason, it’s best to set the goal of your survey and define the information you want to gather before writing the questions.

Ask yourself: What do I want to know? Why do I want to know this? Can direct questions help me get the information I need? How am I going to use the data I gather?

Once you have a clear goal in mind, you can choose the best questions to elicit the right kind of information. We’ve made a list of the most common types of survey questions to help you get started.

1. Open-Ended Questions

If you prefer to gather qualitative insights from your respondents, the best way to do so is through an open-ended question. That’s because this survey question type gives respondents more opportunity to say what’s on their minds. After all, an open question doesn’t come with pre-set answer choices that respondents can select. Instead, it uses a text box where respondents can leave more detailed responses.

Ideally, you should ask such questions when you’re doing expert interviews or preliminary research. You may also opt to end surveys with this type of question. This is to give respondents a chance to share additional concerns with you. By letting respondents give answers in their own words, even to a single question, you can identify opportunities you might have overlooked. At the same time, it shows that you appreciate their effort to answer all your questions.

Since quantifying written answers isn’t easy to do, opt to use these questions sparingly, especially if you’re dealing with a large population.

Examples of open-ended questions:

  • What can you tell us about yourself? (Your age, gender, hobbies, interests, and anything else you’re willing to share)
  • How satisfied or dissatisfied are you with our service?
  • What has kept you from signing up for our newsletter?

2. Closed-Ended Questions

Consumers want surveys they can answer in a jiffy. Closed-ended questions are ideal for market research for that reason. They come with a limited number of options, often one-word responses such as yes or no, multiple-choice, or a rating scale. Compared to open-ended questions, these drive limited insights because respondents only have to choose from pre-selected choices.

Ask closed-ended questions if you need to gather quantifiable data or to categorize your respondents. Furthermore, you can use such questions to drive higher response rates. Let’s say your audience isn’t particularly interested in the topic you intend to ask them about. You can use closed-ended questions to make it easier for them to complete the survey in minutes.

Closed-ended question examples:

  • Which of the following are you most likely to read? (a) a series of blog posts (b) a novel (c) the daily news (d) I don’t read on a regular basis
  • How would you rate our service on a 5-point scale, with 1 representing bad service, and 5 representing great service?
  • How likely are you to recommend us on a scale of 0 to 10?

3. Multiple Choice Questions

Multiple-choice questions are a basic type of closed-ended survey question that give respondents multiple answers to choose from. These questions can be broken down into two main categories:

  • Single-answer questions – respondents choose one, and only one, answer from a list of options.
  • Multiple-answer questions – respondents can select several answers to a single question.

When designed correctly they can be very effective survey questions since they’re relatively simple questions to answer, and the data is easy to analyze.

Multiple-choice sample questions:

  • How would you rate our customer service? (single answer: It’s exceptional / Could be better / It’s terrible)
  • Which of these dietary products do you buy regularly? (select all that apply: Whole-grain rice / Gluten-free noodles / Sugar-free soft drinks / Lactose-free ice cream)


4. Dichotomous Questions

Dichotomous questions are a type of closed-ended question with only two answer options that are the opposite of each other – in other words, yes/no or true/false questions. They’re often used as screening questions to identify potential customers since they’re quick and easy to answer and require no extra effort.

They’re also good for splitting your audience into two groups, enabling you to direct each group to a different series of questions. This can be done quite easily using skip logic which sends people on different survey paths based on their answers to previous questions.
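The branching described above can be sketched as a simple lookup. This is a minimal illustration, not any survey tool's actual API; the question IDs and paths are hypothetical.

```python
# Sketch of skip logic: route respondents to a different question path
# based on their answer to a dichotomous screening question.
# Question IDs below are hypothetical examples.

def next_question(answer: str) -> str:
    """Return the next question ID based on a yes/no screener."""
    routes = {
        "yes": "q2_experienced_path",  # e.g. ask advanced questions
        "no": "q2_beginner_path",      # e.g. ask introductory questions
    }
    # Default to the beginner path for unexpected input
    return routes.get(answer.strip().lower(), "q2_beginner_path")

print(next_question("Yes"))  # q2_experienced_path
print(next_question("no"))   # q2_beginner_path
```

Real survey builders implement this visually, but under the hood skip logic is exactly this kind of answer-to-path mapping.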

Examples of dichotomous questions:

  • Do you have experience working with Google Analytics? (Yes/No)
  • Google Analytics is used for tracking user behavior. (True/False)
  • Google Analytics has a steep learning curve for the average user. (Agree/Disagree)

5. Rating Scale Questions

Also called ordinal questions, these questions help researchers measure customer sentiment in a quantitative way. This type of question comes with a range of response options. It could be from 1 to 5 or 1 to 10.

In a survey, a respondent selects the number that accurately represents their response. Of course, you have to establish the value of the numbers on your scale for it to be effective.

Rating scales can be very effective survey questions. However, without proper survey scaling, you can end up with bad survey questions that respondents don’t know how to answer. And even if they think they do, the results won’t be reliable, because every respondent could interpret the scale differently. So, it’s important to be clear.

If you want to know how respondents experienced your customer service, you can establish a scale from 1 to 10 to measure customer sentiment. Then, assign the value of 1 and 10. The lowest number on the scale could, for instance, mean “very disappointed” while the highest value could represent “very satisfied”.

Examples of rating scale questions:

  • On a scale of 0 to 10, how would you rate your last customer support interaction with us? (0=terrible, 10=amazing)
  • How likely are you to recommend our company to a friend or colleague on a scale of 1 to 5? 1=very unlikely, 5=very likely
  • How would you rate your shopping experience at our online business on a scale of 1 to 7? 1=bad, 4=ok, 7=amazing

6. Likert Scale Questions

These questions can either be unipolar or bipolar. Unipolar scales center on the presence or absence of quality. Moreover, they don’t have a natural midpoint. For example, a unipolar satisfaction scale may have the following options: extremely satisfied, very satisfied, moderately satisfied, slightly satisfied, and not satisfied.

Bipolar scales, on the other hand, are based on either side of neutrality. That means they have a midpoint. A common bipolar scale, for instance, may have the following options: extremely dissatisfied, very dissatisfied, somewhat dissatisfied, neither satisfied nor dissatisfied, somewhat satisfied, very satisfied, or extremely satisfied.

Likert scale questions can be used for a wide variety of objectives. They are great for collecting initial feedback. They can also help you gauge customer sentiment, among other things.

Likert scale sample questions:

  • How important is it that you can access customer support 24/7? (Choices: Very Important, Important, Neutral, Low Importance, and Not Important At All)
  • How satisfied are you after using our products? (Choices: Very Satisfied, Moderately Satisfied, Neutral, Slightly Unsatisfied, and Very Unsatisfied)
  • How would you rate our customer care representative’s knowledge of our products? (Choices: Not at All Satisfactory, Low Satisfactory, Somewhat Satisfactory, Satisfactory, and Very Satisfactory)


7. Nominal Questions

Also a type of measurement scale, nominal questions come with tags or labels for identifying or classifying items. For these questions, you can use non-numeric variables. You can also assign numbers to each response option, but they won’t actually have value.

On a nominal scale, you assign each number to a unique label. Especially if the goal is identification, stick to a one-to-one correlation between the numeric value and the label. Much like the numbers on race cars, each number identifies the driver associated with the car; it says nothing about the characteristics of the vehicle itself.

However, when a nominal scale is used for classification, the numerical values assigned to each descriptor serve as a tag. This is for categorizing or arranging the objects in a class. For example, you want to know your respondents’ gender. You can assign the letter M for males and F for females in the survey question.

Examples of nominal questions:

  • What is your hair color? (Choices: 1 – Black, 2 – Blonde, 3 – Brown, 4 – Red, 5 – Other)
  • How old are you? (Choices: 1 – Under 25, 2 – 25-35, 3 – Over 35)
  • How do you commute to work? (Choices: 1- Car, 2 – Bus, 3 – Train, 4 – Walk, 5 – Other)

8. Demographic Questions

As its name suggests, this question type is used for gathering information about a consumer. From their background to income level, these simple questions can provide you with deeper insights into your target market. They’re also used as screening questions since they can help you to identify the population segments you’re targeting.

Demographic questions  help you understand your target market. By collecting customer data, you can identify similarities and differences between different demographics. Then, you can make buyer personas and classify them based on who they are or what they do.

Some demographic topics can lead to quite loaded survey questions. When writing your demographic survey, try to identify the loaded questions and ask yourself if someone could find the question, the answer choices, or the lack of a certain answer choice offensive. Do your best to phrase them sensitively and respectfully, and if you can’t, consider leaving them out.

With every single question that you write, it’s important to place yourself in the shoes of your respondents. If you want to ask students about their income, your response options should range below $20,000 per year, because most of them are probably not making more than that. But if your respondents are affluent, your choices should have a range higher than $100,000.

Examples of demographic questions:

  • How old are you?
  • What is your level of education?
  • What is your marital status?
  • What’s your current employment status?


9. Matrix Table Questions

If you need to ask a series of questions that require the same response options, you can group them together in a matrix table instead of breaking them into separate questions.

While these bundled questions are convenient, you have to use them carefully. Visually, large matrix tables can seem overwhelming. In addition, online survey questions of this sort aren’t always mobile-friendly. Having too many questions or choices may even trigger undesirable survey-taking behavior such as straight-lining. This is when respondents select the same options without carefully considering each one. Sometimes, they do that because the actual experience feels like a complicated matrix and they just want to finish it.
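Straight-lining is also easy to screen for after the fact. Here's a rough data-quality sketch that flags respondents who picked the identical option for every row of a matrix question; the response data structure is an assumption for illustration.

```python
# Sketch of a straight-lining check: flag respondents who select the
# same option for every row of a matrix question.
# The response format (dict of respondent ID -> list of row answers)
# is a hypothetical example, not any specific tool's export format.

def is_straight_lined(row_answers: list) -> bool:
    """True if every matrix row received the identical answer."""
    return len(row_answers) > 1 and len(set(row_answers)) == 1

responses = {
    "resp_1": ["satisfied", "satisfied", "satisfied", "satisfied"],
    "resp_2": ["satisfied", "neutral", "dissatisfied", "satisfied"],
}
flagged = [rid for rid, rows in responses.items() if is_straight_lined(rows)]
print(flagged)  # ['resp_1']
```

Flagged responses aren't necessarily invalid (someone may genuinely feel the same about every row), but a high straight-lining rate is a sign your matrix is too long.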

Example of a matrix table:

How satisfied or dissatisfied are you with the following?

  • Interaction with sales staff
  • Product selection
  • Marketing messages
  • Pricing structure

Then, you can make a brief list of response options – no more than five.

10. Side-by-Side Matrix Questions

A side-by-side matrix is similar to your regular matrix table in that it allows you to group together questions that require simple response options. However, a matrix table only lets you collect data from a single variable. A side-by-side matrix, on the other hand, enables you to gather data on two or more dimensions.

For example, let’s say you want to ask respondents about the importance of different services and their satisfaction with each. You can group them together in a side-by-side matrix. By organizing questions in tables, your respondents can easily fill out the survey in minutes.

Much like a regular matrix table, you shouldn’t overwhelm consumers. Avoid adding too many variables to your table. Moreover, you should keep the response options short.

Example of a side-by-side matrix:

How would you rate our shopping services?

First, identify the variables – for example, customer support, packaging, and punctuality. Next, add the dimensions, such as importance and satisfaction level. Use a similar scale on each table; for instance, 1 could mean Not Important on one dimension and Not Satisfied on the other.

11. Data Reference Questions

Use data reference questions to validate respondents’ answers against standardized databases. For example, direct respondents to enter their postal code or zip code in a small text box. The value entered is then cross-referenced with the database: if it’s valid, their city or state is displayed and they can proceed with the survey; if not, they’re asked to enter a valid postal code or zip code.

Examples of data reference questions:

  • What is your five-digit zip code?
  • What is your postal code?
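The cross-referencing step can be sketched as a lookup against a reference table. The tiny dictionary below is a hypothetical stand-in for a real standardized database.

```python
# Sketch of a data reference check: validate an entered zip code
# against a reference table and echo back the matching city.
# ZIP_REFERENCE is a hypothetical three-entry stand-in for a real
# standardized postal database.

ZIP_REFERENCE = {
    "10001": "New York, NY",
    "60601": "Chicago, IL",
    "94105": "San Francisco, CA",
}

def validate_zip(entry: str) -> str:
    code = entry.strip()
    if code in ZIP_REFERENCE:
        return f"Confirmed: {ZIP_REFERENCE[code]}"
    return "Please enter a valid zip code."

print(validate_zip("60601"))  # Confirmed: Chicago, IL
print(validate_zip("00000"))  # Please enter a valid zip code.
```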

12. Choice Model Questions

Choice model questions enable you to understand the essential aspects of consumers’ decision-making process. This involves a quantitative method called Conjoint Analysis. It helps you grasp your users’ preferences, the features they like, and the right price range your target market can afford. More importantly, it enables you to understand if your new products will be accepted by your target market.

These questions also involve Maximum Difference Scaling, a method that allows the ranking of up to 30 elements. This can include product features, benefits, opportunities for potential investment, and possible marketing messages for an upcoming product.

Example of a choice model question:

  • If you were to buy a sandwich, which ingredient combination would you choose?

Let’s say you want to know about consumers’ bread, filling, and sauce preferences. In your survey, you can give them three sandwich options. You can, for instance, offer three kinds of bread: grain wheat, parmesan oregano, and Italian. As for the sauces, you can make them choose between ranch, blue cheese, and mustard. Finally, you need to suggest three types of filling, for example, chicken, veggies, and meatballs.

Respondents will see unique combinations of these ingredients in your survey. Then, they will have to choose the one that they like best.

13. Net Promoter Score Questions

A net promoter score (NPS) survey question measures brand shareability, as well as customer satisfaction levels. It helps you get reliable customer insights and gauge the likelihood of respondents recommending your company to friends or colleagues (i.e. prospective customers). The scoring model involves a scale of 0 to 10, which is divided into three sections. Respondents who give a 9 to 10 score are considered Promoters. Passives give a 7 to 8 score, while the rest are considered Detractors.

Once you’ve gathered all the data, the responses in each section are tallied, and the score is calculated by subtracting the percentage of detractors from the percentage of promoters. This type of survey question offers a useful form of initial feedback. It helps you understand why promoters are leaving high ratings so you can build on those strengths, and why detractors are leaving low ratings so you can address your weaknesses.
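The standard NPS formula (percentage of promoters minus percentage of detractors) is simple enough to compute directly from a list of 0-10 ratings:

```python
# NPS calculation: promoters score 9-10, detractors 0-6, passives 7-8.
# NPS = (% promoters) - (% detractors), giving a score from -100 to 100.

def net_promoter_score(ratings: list) -> float:
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings), 1)

ratings = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]
print(net_promoter_score(ratings))  # 5 promoters, 2 detractors -> 30.0
```

Note that passives don't appear in the formula directly, but they still count in the denominator, so converting passives into promoters raises the score.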

Examples of net promoter score questions:

  • On a scale of 0 to 10, how likely are you to recommend our brand to a friend or colleague? (0 = Not at all Likely and 10 = Very Likely)
  • Would you encourage friends to work at our company?
  • How likely are you to recommend (specific name of the product) to friends?


14. Picture Choice Questions

It’s no secret that people respond to visual content more than plain text. This applies to surveys as well – visual content can boost user experience.

Think of these as alternate questions to multiple-choice questions. Users can pick one or many from a visual list of options. You can use picture choice questions to make your survey more engaging.

Keep in mind that it’s very easy to unintentionally create a leading question by using images that provoke a specific reaction. For example, if you’re asking about food preferences and one image is more attractive than the others, people may pick it even if it doesn’t represent their favorite dish. So when you’re illustrating answer options with images, make sure their quality and attractiveness are similar.

Picture choice examples:

  • What is your favorite pizza topping?
  • Which color should we choose for our logo?
  • What other products would you like to see in our online store?

Opinion Stage has an online  survey maker  tool that can help you design image-based survey questions in minutes. Choose from hundreds of professionally-designed templates, and tailor them to fit your needs, or design them from scratch.


15. Image Rating Questions

Another way to incorporate images in questions is through image ratings. Let’s say you want to know how satisfied consumers are with your products. You can display all of the items you want respondents to rate. Under each item, provide a shortlist of options (e.g. very unsatisfied, unsatisfied, neutral, satisfied, very satisfied).

You could also use a rank order question to let your respondents rank their favorite products. Simply give them multiple options, and then, ask them for their top three or five favorites. Or you could ask them to organize a series of answers by ranking.

For example, if it’s an employee engagement survey question, you could ask your employees to rank a series of office activities from their least favorite to their most favorite. There are many ways to do this visually. Some tools use dropdown menus, and others let you move the answer options around, but the simplest way is to use numbers.

Rank order questions should work well on mobile devices. After all, respondents only have to tap on their favorite items to participate.

Example of image rating questions:

  • What are your 5 favorite desserts?

16. Visual Analog Scale Questions

Another type of scale you can use in a survey is the visual analog scale, which displays your questions in a more engaging manner. For instance, you can use text sliders or numeric sliders to ask respondents to rate the service they’ve received from your company, letting them drag the slider to the point on the line that best illustrates their answer.

You can also use pictures to depict each option. Smiley ratings are commonly used in surveys nowadays because they’re simple questions, easy on the eyes, and quite fun. Star ratings are also effective survey questions that require no extra effort.

Examples of visual analog scale questions:

  • How would you rate the overall quality of our customer service?
  • What do you think of our website’s interface?
  • How satisfied are you with the way our service works in offline mode?


The Fundamentals of Good Survey Questions

There is an art to writing effective questions for your survey. Regardless of the kind of survey you plan to deploy, there are a few practices that you should adhere to.

Use Clear and Simple Language

Always choose clear and simple words when writing your online survey questions. In doing so, you can keep the questions short yet specific.

Complex phrasing, too many words, acronyms, and specialized jargon require extra effort and could cause confusion. Make it easy for your respondents to help you. Keep it simple.

Moreover, avoid  double-barreled questions , which will frustrate your respondents and skew your customer insights. Here’s an example of a double-barreled question: “Did you find our new search feature helpful and easy to use? yes/no” Such a question might be simple to understand, but it isn’t easy to answer because it covers two issues. How could someone respond if they found the search feature helpful but difficult to use? It would make more sense to separate it into two questions, i.e. “Did you find the new search feature helpful?” and “Was the new search feature easy for you to use?”

Focus on the Consumer

Make the survey engaging. Use the second-person (i.e., ‘you’ format) to address your respondents directly, and use the first-person (i.e., ‘we’ format) to refer to your company. This makes the survey more personal and helps respondents recall prior experiences with your company. In turn, it leads to quicker and more accurate answers.

Ask for Feedback

Get initial feedback from external people who fit the profile of your average user before sending your survey out. Much like with a user testing tool, you need someone who isn’t you to take a look and tell you whether your survey is clear and friendly.

Require Minimal Effort to Answer

There’s no reason to ask people questions that aren’t essential to you. Stick to the questions that really matter, and keep them to a minimum so as not to waste respondents’ time. The more succinct a survey is, the more likely a respondent is to complete it. So, show that you value their time by designing a survey they can finish within minutes.

Stay Free From Bias

Survey question mistake #1 is to ask leading or biased questions. Don’t plant opinions in your respondents’ heads before they can formulate their own. Don’t ask people questions like “How good was your in-store experience today?” Phrase it in a neutral way like “On a scale of 1 to 10, how would you rate your in-store experience?”

Keep the Purpose of the Survey Vague

Sometimes, respondents have a tendency to give you the answers they think you want to hear. One of the simplest ways to prevent that is to keep the purpose of your survey vague: give only a general description of what it’s about, rather than revealing exactly what you’re testing.


Sample Survey Questions

Below are sample questions for different market research needs. You can use many of them as close-ended questions as well as open questions, depending on your need and preference.

Brand Awareness Questions

  • When was the last time you used (a type of product)?
  • What brands come to mind as your top choice when you think of buying this product type?
  • What factors do you consider when selecting a vendor? (rank by importance)
  • Which of the following brands have you heard of? (please select all that apply)
  • Where have you seen or heard of our brand in the last three months? (please select all that apply)
  • How often have you heard people talking about our brand in the past three months?
  • How familiar are you with our company?
  • On a scale of 1 to 10, how likely are you to recommend our brand to a friend?

Customer Demographic Questions

  • What gender do you identify as?
  • Where were you born?
  • Are you married?
  • What is your annual household income?
  • Do you support children under the age of 18?
  • How many children under the age of 18 reside in your household?
  • What category best describes your employment status?
  • Which general geographic area of the state do you reside in?
  • What is your current employment status?
  • Which of the following languages can you speak fluently?

Brand & Marketing Feedback Questions

  • Have you purchased from our company before?
  • How long have you been a customer?
  • Which best describes your latest experience with our brand? (please select all that apply)
  • Which of the following attributes do you associate with our brand? (please select all that apply)
  • What kind of feelings do you associate with our brand?
  • Which of these marketing messages represents us best in your opinion?
  • How would you rate your level of emotional attachment to our brand?
  • What five words would you use to describe our brand to a friend or colleague?
  • On a scale of 1 to 10, how likely are you to recommend our brand to a friend or colleague? (1 being Not at All Likely and 10 being Extremely Likely)

Product & Package Testing Questions

  • What is your first impression of the product?
  • How important are the following features to you?
  • How would you rate the product’s quality?
  • If the product was already available, how likely are you to purchase it?
  • How likely are you to replace an old product with this one?
  • How likely would you recommend this product to a friend or colleague?
  • What did you like best about this product?
  • What are the features that you want to see improved?
  • Based on the value for money, how would you rate this product compared to the competition?
  • What is your first impression of the product packaging?
  • How satisfied or dissatisfied are you with the following features? (Visual appeal, Quality, and Price)
  • How similar or different is the packaging from the competition?
  • Does the packaging have too little or too much information?
  • How likely are you to purchase the product based on its packaging?
  • What did you like best about the packaging?
  • What did you dislike about the packaging?
  • How would you like the packaging to be improved?

Pricing Strategy Testing Questions

  • How often do you purchase this type of product?
  • What brands do you usually purchase? (Please select all that apply).
  • On a scale of 1 to 5, how satisfied are you with the pricing of this type of product? (1 being Not at All Satisfied and 5 being Extremely Satisfied)
  • What is the ideal price for this type of product?
  • What price range would make you consider that the product is too expensive?
  • At what price would the product seem so cheap that you’d question its quality?
  • How does the price of our product compare to other products on the market?
  • If the product was available, how likely would you be to purchase it?

Customer Satisfaction Questions

  • How would you rate the following products/services at (name of company)?
  • Which of the following attributes would you use to describe our product/service? Please select all that apply.
  • On a scale of 1 to 10, how likely are you to recommend our company to a friend or colleague? (1 being Very Unlikely and 10 being Very Likely)
  • How responsive has our support team been to your questions and concerns?
  • How likely are you to purchase from our company again?
  • What other comments, concerns, or questions do you have for us?

Brand Performance Questions

  • When was the last time you used this type of product?
  • When you think of our brand, what words come to mind?
  • Which of the following are important to your decision-making process?
  • How well do our products perform based on the following categories? (Price, Quality, Design, etc.)
  • How well does our product meet your needs?
  • What was missing or disappointing about your experience with our brand?
  • What did you like most about your experience with our brand?
  • How can we improve your experience?

Customer Behavior Questions

  • In your household, are you the primary decision maker when it comes to purchasing this type of product?
  • When was the last time you purchased this product type?
  • How do you find out about brands offering this product type? Please select all that apply.
  • When you think of this product type, which of the following are the top three brands that come to mind?
  • How much of your purchasing decisions are influenced by social media?


How to Improve Survey Response Rates

Every market research survey needs to be designed carefully in order to drive higher response rates. As a result, you can acquire the right data to inform the decision-making process.

Here are a few survey ideas to boost response rates:

Make It Personal

Write a survey as if it’s a conversation between you and your respondents. For example, use first-person pronouns to make your surveys feel more personal and customer-centric. In addition, stick with simple and specific language to better connect with respondents. Simply put, write your questions as you’d use them in a conversation with consumers.

Make It Engaging

Gathering data from consumers is essential to any business, but market research surveys don’t have to be dull. You can engage and connect with respondents on a human level through an interactive survey. As a result, you can obtain thorough responses and maximize the number of respondents that complete the entire survey.

Don’t Waste Their Time

No one wants to answer a survey with 50 questions because it takes too long to complete. Hence, you should narrow down your list to the most important ones. Only ask questions that will lead to actionable insights. As for the rest, you can get rid of them.

Offer Incentives

There are two types of incentives you can offer: monetary or non-monetary. Either way, you need to make sure that the incentive provides value to your target audience. In addition, you must choose between promised and prepaid incentives. In other words, decide whether respondents receive the reward only after completing the survey (promised) or upfront, before they begin (prepaid).

Providing respondents with incentives to finish the survey can increase response rates, but not always. Customer satisfaction surveys, for example, often don't need incentives, and offering them can bias the results.

Make It Responsive

Perhaps the easiest way to gain respondents is to make your surveys responsive and mobile-optimized. A responsive survey performs well and looks great on any device, and it lets you reach consumers during their daily commute or lunch break. So make sure your survey is optimized for different kinds of devices, especially mobile.

Offer Surveys in Multiple Channels

A survey that is optimized for all device types is also easy to access on social media. So take advantage of your platforms and share your survey on different social media channels to increase participation rates.

Designing surveys doesn’t have to be challenging. On the contrary, you can easily create interactive surveys with Opinion Stage. Create a survey from scratch, or choose one of our many professionally-made templates to complete it within minutes. Through Opinion Stage, you can drive higher response rates and evaluate results from a powerful analytics dashboard.

It’s important to be familiar with the different types of survey questions and when to use them. Getting to know each survey question type will help you improve your research. Not to mention, you can gain high-quality data when you design a survey with the right types of questions.

In addition, you should leverage the right tool to create engaging surveys in minutes. With an online survey maker like Opinion Stage, you can customize your surveys to fit your brand image. Or, you can choose from professionally-made templates. Either way, it can help boost response rates.

Last but not least, check your survey design before deploying it. Preview the survey to see exactly what respondents will see, identify opportunities for improvement, and apply the necessary changes.


Survey Research — Types, Methods and Example Questions



Survey research

The world of research is vast and complex, but with the right tools and understanding, it's an open field of discovery. Welcome to a journey into the heart of survey research.

What is survey research?

Survey research is the lens through which we view the opinions, behaviors, and experiences of a population. Think of it as the research world's detective, cleverly sleuthing out the truths hidden beneath layers of human complexity.

Why is survey research important?

Survey research is a Swiss Army Knife in a researcher's toolbox. It’s adaptable, reliable, and incredibly versatile, but its real power? It gives voice to the silent majority. Whether it's understanding customer preferences or assessing the impact of a social policy, survey research is the bridge between unanswered questions and insightful data.

Let's embark on this exploration, armed with the spirit of openness, a sprinkle of curiosity, and the thirst for making knowledge accessible. As we journey further into the realm of survey research, we'll delve deeper into the diverse types of surveys, innovative data collection methods, and the rewards and challenges that come with them.

Types of survey research

Survey research is like an artist's palette, offering a variety of types to suit your unique research needs. Each type paints a different picture, giving us fascinating insights into the world around us.

  • Cross-Sectional Surveys: Capture a snapshot of a population at a specific moment in time. They're your trusty Polaroid camera, freezing a moment for analysis and understanding.
  • Longitudinal Surveys: Track changes over time, much like a time-lapse video. They help to identify trends and patterns, offering a dynamic perspective of your subject.
  • Descriptive Surveys: Draw a detailed picture of the current state of affairs. They're your magnifying glass, examining the prevalence of a phenomenon or attitudes within a group.
  • Analytical Surveys: Deep dive into the reasons behind certain outcomes. They're the research world's version of Sherlock Holmes, unraveling the complex web of cause and effect.

But, what method should you choose for data collection? The plot thickens, doesn't it? Let's unravel this mystery in our next section.

Survey research and data collection methods

Data collection in survey research is an art form, and there's no one-size-fits-all method. Think of it as your paintbrush, each stroke represents a different way of capturing data.

  • Online Surveys: In the digital age, online surveys have surged in popularity. They're fast, cost-effective, and can reach a global audience. But like a mysterious online acquaintance, respondents may not always be who they say they are.
  • Mail Surveys: Like a postcard from a distant friend, mail surveys have a certain charm. They're great for reaching respondents without internet access. However, they’re slower and have lower response rates. They’re a test of patience and persistence.
  • Telephone Surveys: With the sound of a ringing phone, the human element enters the picture. Great for reaching a diverse audience, they bring a touch of personal connection. But, remember, not all are fans of unsolicited calls.
  • Face-to-Face Surveys: These are the heart-to-heart conversations of the survey world. While they require more resources, they're the gold standard for in-depth, high-quality data.

As we journey further, let’s weigh the pros and cons of survey research.

Advantages and disadvantages of survey research

Every hero has its strengths and weaknesses, and survey research is no exception. Let's unwrap the gift box of survey research to see what lies inside.

Advantages:

  • Versatility: Like a superhero with multiple powers, surveys can be adapted to different topics, audiences, and research needs.
  • Accessibility: With online surveys, geographical boundaries dissolve. We can reach out to the world from our living room.
  • Anonymity: Like a confessional booth, surveys allow respondents to share their views without fear of judgment.

Disadvantages:

  • Response Bias: Ever met someone who says what you want to hear? Survey respondents can be like that too.
  • Limited Depth: Like a puddle after a rainstorm, some surveys only skim the surface of complex issues.
  • Nonresponse: Sometimes, potential respondents play hard to get, skewing the data.

Survey research may have its challenges, but it also presents opportunities to learn and grow. As we forge ahead on our journey, we dive into the design process of survey research.

Limitations of survey research

Every research method has its limitations, like bumps on the road to discovery. But don't worry, with the right approach, these challenges become opportunities for growth.

Misinterpretation: Sometimes, respondents might misunderstand your questions, like a badly translated novel. To overcome this, keep your questions simple and clear.

Social Desirability Bias: People often want to present themselves in the best light. They might answer questions in a way that portrays them positively, even if it's not entirely accurate. Overcome this by ensuring anonymity and emphasizing honesty.

Sample Representation: If your survey sample isn't representative of the population you're studying, it can skew your results. Aiming for a diverse sample can mitigate this.

Now that we're aware of the limitations let's delve into the world of survey design.


Survey research design

Designing a survey is like crafting a roadmap to discovery. It's an intricate process that involves careful planning, innovative strategies, and a deep understanding of your research goals. Let's get started.

Approach and Strategy

Your approach and strategy are the compasses guiding your survey research. Clear objectives, defined research questions, and an understanding of your target audience lay the foundation for a successful survey.

Panel

The panel is the heartbeat of your survey, the respondents who breathe life into your research. Selecting a representative panel ensures your research is accurate and inclusive.

9 Tips on Building the Perfect Survey Research Questionnaire

  • Keep It Simple: Clear and straightforward questions lead to accurate responses.
  • Make It Relevant: Ensure every question ties back to your research objectives.
  • Order Matters: Start with easy questions to build rapport and save sensitive ones for later.
  • Avoid Double-Barreled Questions: Stick to one idea per question.
  • Offer a Balanced Scale: For rating scales, provide an equal number of positive and negative options.
  • Provide a ‘Don't Know’ Option: This prevents guessing and keeps your data accurate.
  • Pretest Your Survey: A pilot run helps you spot any issues before the final launch.
  • Keep It Short: Respect your respondents' time.
  • Make It Engaging: Keep your respondents interested with a mix of question types.

Survey research examples and questions

Examples serve as a bridge connecting theoretical concepts to real-world scenarios. Let's consider a few practical examples of survey research across various domains.

User Experience (UX)

Imagine being a UX designer at a budding tech start-up. Your app is gaining traction, but to keep your user base growing and engaged, you must ensure that your app's UX is top-notch. In this case, a well-designed survey could be a beacon, guiding you toward understanding user behavior, preferences, and pain points.

Here's an example of how such a survey could look:

  • "On a scale of 1 to 10, how would you rate the ease of navigating our app?"
  • "How often do you encounter difficulties while using our app?"
  • "What features do you use most frequently in our app?"
  • "What improvements would you suggest for our app?"
  • "What features would you like to see in future updates?"

This line of questioning, while straightforward, provides invaluable insights. It enables the UX designer to identify strengths to capitalize on and weaknesses to improve, ultimately leading to a product that resonates with users.

Psychology and Ethics in survey research

The realm of survey research is not just about data and numbers, but it's also about understanding human behavior and treating respondents ethically.

Psychology: In-depth understanding of cognitive biases and social dynamics can profoundly influence survey design. Let's take the 'Recency Effect,' a psychological principle stating that people tend to remember recent events more vividly than those in the past. While framing questions about user experiences, this insight could be invaluable.

For example, a question like "Can you recall an instance in the past week when our customer service exceeded your expectations?" is likely to fetch more accurate responses than asking about an event several months ago.

Ethics: On the other hand, maintaining privacy, confidentiality, and informed consent is more than ethical - it's fundamental to the integrity of the research process.

Imagine conducting a sensitive survey about workplace culture. Ensuring respondents that their responses will remain confidential and anonymous can encourage more honest responses. An introductory note stating these assurances, along with a clear outline of the survey's purpose, can help build trust with your respondents.

Survey research software

In the age of digital information, survey research software has become a trusted ally for researchers. It simplifies complex processes like data collection, analysis, and visualization, democratizing research and making it more accessible to a broad audience.

LimeSurvey, our innovative, user-friendly tool, brings this vision to life. It stands at the crossroads of simplicity and power, embodying the essence of accessible survey research.

Whether you're a freelancer exploring new market trends, a psychology student curious about human behavior, or an HR officer aiming to improve company culture, LimeSurvey empowers you to conduct efficient, effective research. Its suite of features and intuitive design matches your research pace, allowing your curiosity to take the front seat.

For instance, consider you're a researcher studying consumer behavior across different demographics. With LimeSurvey, you can easily design demographic-specific questions, distribute your survey across various channels, collect responses in real-time, and visualize your data through intuitive dashboards. This synergy of tools and functionalities makes LimeSurvey a perfect ally in your quest for knowledge.

Conclusion

If you've come this far, we can sense your spark of curiosity. Are you eager to take the reins and conduct your own survey research? Are you ready to embrace the simple yet powerful tool that LimeSurvey offers? If so, we can't wait to see where your journey takes you next!

In the world of survey research, there's always more to explore, more to learn and more to discover. So, keep your curiosity alive, stay open to new ideas, and remember, your exploration is just beginning!

We hope that our exploration has been as enlightening for you as it was exciting for us. Remember, the journey doesn't end here. With the power of knowledge and the right tools in your hands, there's no limit to what you can achieve. So, let your curiosity be your guide and dive into the fascinating world of survey research with LimeSurvey! Try it out for free now!

Happy surveying!


Doing Survey Research | A Step-by-Step Guide & Examples

Published on 6 May 2022 by Shona McCombes. Revised on 10 October 2022.

Survey research means collecting information about a group of people by asking them questions and analysing the results. To conduct an effective survey, follow these six steps:

  • Determine who will participate in the survey
  • Decide the type of survey (mail, online, or in-person)
  • Design the survey questions and layout
  • Distribute the survey
  • Analyse the responses
  • Write up the results

Surveys are a flexible method of data collection that can be used in many different types of research.

Table of contents

  • What are surveys used for?
  • Step 1: Define the population and sample
  • Step 2: Decide on the type of survey
  • Step 3: Design the survey questions
  • Step 4: Distribute the survey and collect responses
  • Step 5: Analyse the survey results
  • Step 6: Write up the survey results
  • Frequently asked questions about surveys

What are surveys used for?

Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.

Common uses of survey research include:

  • Social research: Investigating the experiences and characteristics of different social groups
  • Market research: Finding out what customers think about products, services, and companies
  • Health research: Collecting data from patients about symptoms and treatments
  • Politics: Measuring public opinion about parties and policies
  • Psychology: Researching personality traits, preferences, and behaviours

Surveys can be used in both cross-sectional studies, where you collect data just once, and longitudinal studies, where you survey the same sample several times over an extended period.


Step 1: Define the population and sample

Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.

Populations

The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:

  • The population of Brazil
  • University students in the UK
  • Second-generation immigrants in the Netherlands
  • Customers of a specific company aged 18 to 24
  • British transgender women over the age of 50

Your survey should aim to produce results that can be generalised to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.

It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every university student in the UK. Instead, you will usually survey a sample from the population.

The sample size depends on how big the population is. You can use an online sample calculator to work out how many responses you need.

There are many sampling methods that allow you to generalise to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions.
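The calculation behind those online sample calculators can be sketched with Cochran's formula plus a finite population correction. This is a minimal illustration, assuming simple random sampling, a 95% confidence level (z ≈ 1.96), and the worst-case proportion p = 0.5:

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Minimum sample size for estimating a proportion.

    Cochran's formula with a finite population correction:
      n0 = z^2 * p * (1 - p) / e^2
      n  = n0 / (1 + (n0 - 1) / N)
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Responses needed for a +/-5% margin of error in a population of 10,000
print(sample_size(10_000))  # → 370
```

Notice that the required sample grows only slowly with population size: surveying all of Brazil needs barely more responses than surveying a city, which is why national polls get by with a few hundred to a few thousand respondents.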

Step 2: Decide on the type of survey

There are two main types of survey:

  • A questionnaire, where a list of questions is distributed by post, online, or in person, and respondents fill it out themselves
  • An interview, where the researcher asks a set of questions by phone or in person and records the responses

Which type you choose depends on the sample size and location, as well as the focus of the research.

Questionnaires

Sending out a paper survey by post is a common method of gathering demographic information (for example, in a government census of the population).

  • You can easily access a large sample.
  • You have some control over who is included in the sample (e.g., residents of a specific region).
  • The response rate is often low.

Online surveys are a popular choice for students doing dissertation research , due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms .

  • You can quickly access a large sample without constraints on time or location.
  • The data is easy to process and analyse.
  • The anonymity and accessibility of online surveys mean you have less control over who responds.

If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping centre or ask all students to complete a questionnaire at the end of a class.

  • You can screen respondents to make sure only people in the target population are included in the sample.
  • You can collect time- and location-specific data (e.g., the opinions of a shop’s weekday customers).
  • The sample size will be smaller, so this method is less suitable for collecting data on broad populations.

Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.

  • You have personal contact with respondents, so you know exactly who will be included in the sample in advance.
  • You can clarify questions and ask for follow-up information when necessary.
  • The lack of anonymity may cause respondents to answer less honestly, and there is more risk of researcher bias.

Like questionnaires, interviews can be used to collect quantitative data: the researcher records each response as a category or rating and statistically analyses the results. But they are more commonly used to collect qualitative data: the interviewees’ full responses are transcribed and analysed individually to gain a richer understanding of their opinions and feelings.

Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:

  • The type of questions
  • The content of the questions
  • The phrasing of the questions
  • The ordering and layout of the survey

Open-ended vs closed-ended questions

There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.

Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:

  • A binary answer (e.g., yes/no or agree/disagree )
  • A scale (e.g., a Likert scale with five points ranging from strongly agree to strongly disagree )
  • A list of options with a single answer possible (e.g., age categories)
  • A list of options with multiple answers possible (e.g., leisure interests)

Closed-ended questions are best for quantitative research. They provide you with numerical data that can be statistically analysed to find patterns, trends, and correlations.

Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.

Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.

The content of the survey questions

To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.

When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an ‘other’ field.

Phrasing the survey questions

In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic.

Use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no bias towards one answer or another.

Ordering the survey questions

The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.

If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.

If a question refers back to or depends on the answer to a previous question, they should be placed directly next to one another.

Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.

When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by post, online, or in person.

There are many methods of analysing the results of your survey. First you have to process the data, usually with the help of a computer program to sort all the responses. You should also cleanse the data by removing incomplete or incorrectly completed responses.

If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organising them into categories or themes. You can also use more qualitative methods, such as thematic analysis, which is especially suitable for analysing interviews.
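As a toy illustration of coding, open-ended responses can be labelled by matching keywords against a codebook. The themes and keywords below are invented for the example; in practice, coding is usually done by hand or with qualitative analysis software:

```python
# Hypothetical codebook mapping themes to indicator keywords
CODEBOOK = {
    "price": ["expensive", "cheap", "cost", "price"],
    "service": ["staff", "support", "helpful", "rude"],
    "quality": ["broke", "durable", "quality"],
}

def code_response(text: str) -> list[str]:
    """Assign every matching theme label to a free-text response."""
    text = text.lower()
    themes = [theme for theme, keywords in CODEBOOK.items()
              if any(word in text for word in keywords)]
    return themes or ["uncoded"]

print(code_response("The staff were helpful but it felt expensive"))
# ['price', 'service']
```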

Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.

Finally, when you have collected and analysed all the necessary data, you will write it up as part of your thesis, dissertation, or research paper.

In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.

Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyse it. In the results section, you summarise the key results from your analysis.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviours. It is made up of four or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey, you present participants with Likert-type questions or statements, and a continuum of items, usually with five or seven possible responses, to capture their degree of agreement.

Individual Likert-type questions are generally considered ordinal data, because the items have clear rank order, but don’t have an even distribution.

Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.

The type of data determines what statistical tests you should use to analyse your data.
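The distinction has a practical consequence for analysis: individual items are often summarised with medians or frequencies, while an overall scale score is computed by summing the items before applying interval-level statistics. A short sketch (the four item responses are invented):

```python
from statistics import median

# One respondent's answers to four Likert items measuring a single trait
# (1 = strongly disagree ... 5 = strongly agree)
items = [4, 5, 3, 4]

item_summary = median(items)  # ordinal view: summarise items with a median
scale_score = sum(items)      # interval view: overall scale score, range 4-20

print(item_summary, scale_score)  # 4.0 16
```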

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analysing data from people using questionnaires.


McCombes, S. (2022, October 10). Doing Survey Research | A Step-by-Step Guide & Examples. Scribbr. Retrieved 26 May 2024, from https://www.scribbr.co.uk/research-methods/surveys/

Shona McCombes


Market research questions: what to ask and how

Whether you’re looking for customer feedback, product suggestions or brand perception in the market, the right market research questions can help you get the best insights. Learn how you can use them correctly and where to begin.

What is market research?

Market research (also called marketing research) is the action or activity of gathering information about market needs and preferences. This helps companies understand their target market — how the audience feels and behaves.

For example, this could be an online questionnaire, shared by email, with a set of questions that ask an audience about their views. For an audience of target customers, your questions may explore their reaction to a new product, and the answers can be fed back into the design.

Why do market research?

When you have tangible insights into your audience’s needs, you can take steps to meet those needs and solve problems. This mitigates the risk of an experience gap – the gap between what your audience expects you to deliver and what you actually deliver.

In doing this work, you can gain:

  • Improved purchase levels – Sales will improve if your product or service is ticking all the right boxes for your customers.
  • Improved decision making – You can avoid the risk of losing capital or time by acting on what your research tells you.
  • Real connection with your target market – If you’re investing in understanding your target audience, your product and service will be more likely to make an impact.
  • Understand new opportunities – Your research might indicate a new area for your product to play in, or reveal potential for a new service that wasn’t considered before.

Get started with our free survey maker

Who do you ask your questions to?

Who to target in your market research is crucial to getting the right insights and data back. If you don’t have a firm idea of who your target audiences are, here are some questions you can ask before you begin writing your market research questions:

  • Who is our customer currently and who do we want to attract in the future?
  • How do they behave with your brand?
  • What do they say, do and think?
  • What are their pain points, needs and wants?
  • Where do they live? What is the size of our market?
  • Why do they use us? Why do they use other brands?

We’ve put together some questions below (see ‘Market research questions for your demographics’) if you want to reach out to your market for this.

With the answers, you can segment your customer market, understand key consumer trends, create customer personas and discover the right way to target them.

Market research goals

Give yourself the right direction to work towards. There are different kinds of market research, but to choose the right market research questions, figure out your market research goals first.

Set a SMART goal that thinks about what you want to achieve and keeps you on track. SMART stands for Specific, Measurable, Attainable, Relevant and Timely. For example, a good SMART business goal would be to increase website sales for a top product by 10% over a period of 6 months.

You may need to review some strategic business information, like customer personas and historical sales data, which can give you the foundation of knowledge (the ‘baseline’) to grow from. This, combined with your business objectives, will help you form the right SMART targets tailored to your teams.

Types of market research questions

Now that you have your SMART target, you can look at which type of market research questions will help you reach your goal. They can be split into these types:

  • For demographics
  • For customers
  • For product
  • For brand

Market research questions for your demographics

Demographic information about your customers is data about gender, age, ethnicity, annual income, education and marital status. It also gives key information about their shopping habits.

Here are some questions you can ask in your market research survey:

  • What is your age / gender / ethnicity / marital status?
  • What is the highest level of education you have achieved?
  • What is your monthly income range?
  • What methods of shopping do you use?
  • What amount do you spend on [product/brand/shopping] each month?
  • How regularly do you shop for [product/brand]?

Learn more about the demographic survey questions that yield valuable insights .

Market research questions for your customer

These questions are aimed at your customer to understand the voice of the customer. The customer marketing landscape is not a one-way dialogue for engaging prospects; your customers’ feedback is needed for the development of your products or services.

  • How did we do / would you rate us?
  • Why did you decide to use [product or service]?
  • How does that fit your needs?
  • Would you recommend us to your friends?
  • Would you buy from us again?
  • What could we do better?
  • Why did you decide to shop elsewhere?
  • In your opinion, why should customers choose us?
  • How would you rate our customer experience?

Learn more about why the voice of the customer matters or try running a customer experience survey.

Market research questions for your product

These questions will help you understand how your customers perceive your product, their reactions to it and whether changes need to be made in the development cycle.

  • What does our [product or service] do that you like or dislike?
  • What do you think about [feature or benefit]?
  • How does the product help you solve your problems?
  • Which of these features will be the most valuable / useful for you?
  • Is our product competitive with other similar products out there? How?
  • How does the product score on [cost / service / ease of use, etc.]?
  • What changes will customers likely want in the future that technology can provide?

There are also a set of questions you can ask to find out if your product pricing is set at the right mark:

  • Does the product value justify the price it’s marketed at?
  • Is the pricing set at the right mark?
  • How much would you pay for this product?
  • Is this similar to what competitors are charging?
  • Do you believe the price is fair?
  • Do you believe the pricing is right based on the amount of usage you’d get?

Have you tried a pricing and value research survey to see how much your target customers would be willing to pay?

Market research questions for your brand

How do your products, services and experiences impact your brand’s image? You can find out using these questions:

  • What do you think about our brand?
  • Have you seen any reviews about us online? What do they say?
  • Have you heard about our brand from friends or family? What do they say?
  • How likely are you to recommend our brand to a friend?
  • Have you read the testimonials on our own channels? Did they have an impact on your decision to purchase? How?
  • When you think of our brand, what do you think/ feel / want?
  • How did you hear about us?
  • Do you feel confident you know what our brand stands for?
  • Are you aware of our [channel] account?

Learn more about brand perception surveys and how to carry them out successfully.

How to use market research questions in a survey

For the best research questionnaires, tailor your market research questions to your goal. This will help you focus the direction of the data you receive.

You can get started now on your own market research questionnaire, using one of our free survey templates, when you sign up to a free Qualtrics account.

The drag-and-drop interface requires no coding, is easy to use, and is backed by our award-winning support team.

With Qualtrics, you can distribute and analyse surveys to find customer, employee, brand, product, and marketing research insights.

More than 11,000 brands and 99 of the top 100 business schools use Qualtrics solutions because of the freedom and power the platform gives them.

Get started with our free survey maker tool


SurveyCTO

How to write survey questions for research – with examples


  • Post author: Marta Costa
  • Post published: April 5, 2023
  • Post category: Data Collection & Data Quality

A good survey can make or break your research. Learn how to write strong survey questions, learn what not to do, and see a range of practical examples.

The accuracy and relevance of the data you collect depend largely on the quality of your survey questions. In other words, good questions make for good research outcomes. It makes sense, then, that you should put considerable thought and planning into writing your survey or questionnaire.

In this article, we’ll go through what a good survey question looks like, talk about the different kinds of survey questions that exist, give you some tips for writing a good survey question, and finally, we’ll take a look at some examples. 

What is a good survey question?

A good survey question should contain simple and clear language. It should elicit responses that are accurate and that help you learn more about your target audience and their experiences. It should also fit in with the overall design of your survey project and connect with your research objective. There are many different types of survey questions. Let’s take a look at some of them now. 

New to survey data collection? Explore SurveyCTO for free with a 15-day trial.

Types of survey questions

Different types of questions are used for different purposes. Often questionnaires or surveys will combine several types of questions. The types you choose will depend on the overall design of your survey and your aims.  Here is a list of the most popular kinds of survey questions:  

Open-ended

Example of an open-ended question: ‘Please list the names and ages of members of your household in the text box below.’

These questions can’t be answered with a simple yes or no. They require the respondent to use more descriptive language to share their thoughts and answer the question. These types of questions result in qualitative data.

Closed-ended

A closed-ended question is the opposite of an open-ended question. Here the respondent’s answers are normally restricted to a yes or no, true or false, or multiple-choice answer. This results in quantitative data.


Dichotomous

This is a type of closed-ended question. The defining characteristic of these questions is that they have two opposing fields. For example, a question that can only be answered with a yes/no answer is a dichotomous question. 


Multiple choice


These are another type of closed-ended question. Here you give the respondent several possible ways, or options, in which they can respond. It’s also common to have an “other” section with a text box where the respondent can provide an unlisted answer.

Rating scale

This is again another type of closed-ended question. Here you would normally present two extremes, and the respondent has to choose between these extremes or an option placed along the scale.

Likert scale

A Likert scale is a form of a rating scale. These are generally used to measure attitudes towards something by asking the respondent to agree or disagree with a statement. They are commonly used to measure satisfaction. 


Ranking scale 

Here the respondents are given a few options and need to order them in terms of importance or relevance, or according to the instructions.

Demographic questions

These are often personal questions that allow you to better understand your respondents and their backgrounds. They normally cover questions related to age, race, marital status, education level, etc.


Ready to start creating your surveys? Sign up for a free 15-day trial.

7 Tips for writing a good survey question

The following 7 tips will help you to write a good survey question: 

1. Use clear, simple language

Your survey questions must be easy to understand. When they’re straight to the point, it’s more likely that your respondent will understand what you are asking of them and be able to respond accurately, giving you the data you need. 

2. Keep your questions (and answers) concise

When sentences or questions are convoluted or confusing, respondents might misunderstand the question. If your questions are too long, respondents may also get bored. The same applies to your answer choices: keep your multiple-choice lists concise.

If your questions are too long, or you’ve provided too many options, you may receive responses that are inaccurate or that don’t truly represent how the respondent feels. To limit the number of options a respondent sees, you can use a survey platform like SurveyCTO to filter choice lists and make it easy for respondents to answer quickly. If you have an exceptionally long list of possible responses, like countries, implement search functionality in your list of choices so your respondents can quickly find their selection.
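SurveyCTO supports choice filtering and search natively; as a plain-Python illustration of the idea, narrowing a long choice list as the respondent types can be sketched like this (the country list is truncated for the example):

```python
COUNTRIES = ["Brazil", "Burkina Faso", "Burundi", "Uganda", "United Kingdom"]

def search_choices(choices: list[str], query: str) -> list[str]:
    """Return only the options containing the typed query, case-insensitively."""
    q = query.lower()
    return [c for c in choices if q in c.lower()]

print(search_choices(COUNTRIES, "bu"))  # ['Burkina Faso', 'Burundi']
```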

3. Don’t add bias to your question

You should avoid leading your respondent in any particular direction with your questions; you want their response to reflect their own thoughts without being unduly influenced. An example of a question that could lead the respondent in a particular direction would be:

How happy are you to live in this amazing area?

By adding the adjective ‘amazing’ before ‘area’, you are putting the idea in the respondent’s head that the area is amazing. This could cloud their judgment and influence the way they answer the question. The word ‘happy’ together with ‘amazing’ may also be problematic. A better, less loaded way to ask this question might be:

How satisfied are you living in this area?

4. Ask one question at a time

Asking multiple things in one question is confusing and will lead to inaccuracies in the answers. When you write your question, you should know exactly what you want to achieve; this will help you avoid combining two questions in one. Here is an example of a double-barrelled question that would be difficult for a respondent to answer:

Please answer yes or no to the following question: Do you drive to work and do you carry any passengers?

In this question, the respondent is being asked two things, yet they only have the opportunity to respond to one, and they don’t know which one they should respond to. Avoid this kind of questioning to get clearer, more accurate data.

5. Account for all possible answer choices

You should give your respondent the ability to answer a question accurately. For instance, if you are asking a demographic question you’ll need to provide options that accurately reflect their experience. Below, you can see there is an “other” option with space where the respondent can answer how they see fit, in the case that they don’t fit into any of the other options. Which gender do you most identify with:

  • Prefer not to say
  • Other [specify]

6. Plan the question flow and choose your questions carefully

Question writing goes hand-in-hand with questionnaire design, so when writing survey questions, you should consider the survey as a whole. For example, if you write a closed-ended question like:

Were you satisfied with the customer service you received when you bought x product?

you might want to follow it up with an open-ended question such as:

Please explain the reason for your answer.

This will help you draw out more information from your respondent that can help you assess the strengths and weaknesses of your customer service team. Making sure your questions flow in a logical order is also important.

For instance, if you ask a question regarding the total cost of a person’s childcare arrangements but don’t know whether they have children, you should first ask if they have children and how many. It’s also a good idea to start your survey with short, easy-to-answer, non-sensitive questions before moving on to something more complex; this makes it more likely that you’ll engage your audience early on and that they’ll continue with the survey.

You should also consider whether you need qualitative or quantitative data for your research outcomes, or a mix of the two. This will help you decide the balance of closed-ended and open-ended questions to use. With closed-ended questions, you get quantitative data. This data is fairly conclusive and simple to analyze, and can be useful when you need to measure specific variables or metrics like population sizes, education levels, literacy levels, etc.


On the other hand, qualitative data gained from open-ended questions can be full of insights. However, these questions are more laborious for the respondent to complete, making it more likely that they’ll skip through or give a token answer. They’re also more complex to analyze.

7. Test your surveys

Before a questionnaire goes anywhere near a respondent, it needs to be checked over. Mistakes in your survey questions can give inaccurate results and waste time and resources. Having an impartial person check your questions can also help prevent bias, so not only should you check your own work, you should also share it with colleagues. After checking your survey questions, make sure to check the functionality and flow of your survey. If you’re building your form in SurveyCTO, you can use our form testing interface to catch errors, make quick fixes, and test your workflows with real data.


Examples of good survey questions

Now that we’ve gone through some dos and don’ts for writing survey questions, we can move on to more practical examples of how a good survey question should look. To keep these specific to the research world we’ll look at three categories of questions. 

  • Household survey questions 
  • Monitoring and evaluation survey questions 
  • Impact evaluation survey questions

1. Household Survey Questions

2. Monitoring and Evaluation Survey Questions

3. Impact Evaluation Questions


Strong survey questions lead to better research outcomes

Writing good survey questions is essential if you want to achieve your research aims. A good survey question should be clear, concise, and use simple language. It should be free of bias and not lead the respondent in any direction. Your survey questions need to complement each other, engage your audience, and connect back to the overall objectives of your research.

Creating survey questions and survey designs is a large part of your research, but it’s just one piece of the puzzle. When your questions are ready, you’ll need to conduct your survey and then find a way to manage your data and workflow. Take a look at this post to see more ways SurveyCTO can help you beyond writing your research survey questions.

Your next steps: Explore more resources

To keep reading about how SurveyCTO can help you design better surveys, take a look at these resources:  

  • Sign up here to get notified about our monthly webinars, where organizations like IDinsight  share best practices for effective surveys.
  • Check out previous webinars from SurveyCTO about survey forms, like this one on high-frequency checks for monitoring surveys. 
  • Sign up for a free trial of SurveyCTO for your next survey project.

To see how SurveyCTO can help you with your survey needs, start a free 15-day trial today. No credit card required. 

Post author avatar

Marta Costa

Senior Product Specialist

Marta is a member of the Customer Success team for Dobility. She helps users working at NGOs, nonprofits, survey firms, universities and research institutes achieve their objectives using SurveyCTO, and works on new ways to help users get the most out of the platform.

Marta has worked in international development consultancy and research, supporting and coordinating impact evaluations, monitoring and evaluation projects, and data collection processes at the national level in areas such as education, energy access, and financial inclusion.

You Might Also Like

SurveyCTO Webinar Series: Update your household rosters over time

SurveyCTO Desktop: Scale easily to manage multiple projects

Form testing interface: Better surveys in less time


Survey Research: Definition, Examples and Methods

Survey Research is a quantitative research method used for collecting data from a set of respondents. It has been one of the most widely used methodologies in the industry for years, thanks to the many benefits it offers when collecting and analyzing data.

In this article, you will learn everything about survey research, such as types, methods, and examples.

Survey Research Definition

Survey Research is defined as the process of conducting research using surveys that researchers send to survey respondents. The data collected from surveys is then statistically analyzed to draw meaningful research conclusions. In the 21st century, every organization is eager to understand what its customers think about its products or services so it can make better business decisions. Researchers can conduct research in multiple ways, but surveys have proven to be one of the most effective and trustworthy research methods. An online survey is a method for extracting information about a significant business matter from an individual or a group of individuals. It consists of structured survey questions that motivate the participants to respond. Credible survey research can give businesses access to a vast bank of information. Organizations in media, other companies, and even governments rely on survey research to obtain accurate data.

The traditional definition of survey research is a quantitative method for collecting information from a pool of respondents by asking multiple survey questions. This research type includes the recruitment of individuals, the collection of data, and its analysis. It's useful for researchers who aim to communicate new features or trends to their respondents.

Generally, it's the primary step towards obtaining quick information about mainstream topics; more rigorous and detailed quantitative research methods like surveys/polls, or qualitative research methods like focus groups and on-call interviews, can follow. In many situations, researchers conduct research using a blend of both qualitative and quantitative strategies.

Survey Research Methods

Survey research methods can be classified based on two critical factors: the survey research tool and the time involved in conducting the research. There are three main survey research methods, divided based on the medium used to conduct the survey:

  • Online/Email:  Online survey research is one of the most popular survey research methods today. The cost per response is minimal, and the responses gathered are highly accurate.
  • Phone:  Survey research conducted over the telephone (CATI survey) can be useful for collecting data from a more extensive section of the target population. However, both the money and the time invested in phone surveys tend to be higher than for other mediums.
  • Face-to-face:  Researchers conduct face-to-face in-depth interviews in situations where there is a complicated problem to solve. The response rate for this method is the highest, but it can be costly.

Further, based on the time taken, survey research can be classified into two methods:

  • Longitudinal survey research:  Longitudinal survey research involves conducting surveys over a continuum of time, spread across years and decades. The data collected with this method from one time period to another can be qualitative or quantitative. Respondent behavior, preferences, and attitudes are observed continuously over time to analyze the reasons for changes in behavior or preferences. For example, suppose a researcher intends to learn about the eating habits of teenagers. In that case, he/she will follow a sample of teenagers over a considerable period to ensure that the collected information is reliable. Often, cross-sectional survey research follows a longitudinal study .
  • Cross-sectional survey research:  Researchers conduct a cross-sectional survey to collect insights from a target audience at a particular time interval. This survey research method is implemented in various sectors such as retail, education, healthcare, SME businesses, etc. Cross-sectional studies can either be descriptive or analytical. It is quick and helps researchers collect information in a brief period. Researchers rely on the cross-sectional survey research method in situations where descriptive analysis of a subject is required.

Survey research is also bifurcated according to the sampling method used to form the research sample: probability and non-probability sampling. In probability sampling, every individual in the population has a known chance of being part of the sample, and the researcher chooses the elements based on probability theory. There are various probability sampling methods, such as simple random sampling , systematic sampling, cluster sampling, stratified random sampling, etc. Non-probability sampling is a sampling method where the researcher uses his/her knowledge and experience to form the sample.

The various non-probability sampling techniques are:

  • Convenience sampling
  • Snowball sampling
  • Consecutive sampling
  • Judgemental sampling
  • Quota sampling
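As a rough sketch of how some of the probability sampling methods mentioned earlier differ in practice, using only Python's standard library (the population of 100 respondent IDs and the urban/rural strata are invented for illustration):

```python
import random

population = list(range(1, 101))  # hypothetical sampling frame of 100 respondent IDs
random.seed(42)                   # fixed seed so the sketch is reproducible

# Simple random sampling: every individual has an equal chance of selection.
simple = random.sample(population, 10)

# Systematic sampling: pick every k-th element after a random start.
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k]

# Stratified random sampling: sample proportionally from each stratum.
strata = {"urban": population[:60], "rural": population[60:]}
stratified = []
for members in strata.values():
    stratified.extend(random.sample(members, len(members) // 10))

print(len(simple), len(systematic), len(stratified))  # 10 10 10
```

Note that the stratified sample keeps the 60/40 urban/rural proportion of the frame (6 urban and 4 rural units), which a simple random sample only matches on average.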

Process of implementing survey research methods:

  • Decide survey questions:  Brainstorm and put together valid survey questions that are grammatically and logically appropriate. Understanding the objective and expected outcomes of the survey helps a lot. In many surveys, the details of individual responses matter less than learning which of the provided options customers prefer; in such situations, a researcher can use multiple-choice or closed-ended questions . If researchers need details about specific issues, they can include open-ended questions in the questionnaire. Ideally, a survey should include a smart balance of open-ended and closed-ended questions. Use question types like the Likert Scale , Semantic Scale, Net Promoter Score question, etc., to avoid fence-sitting.

  • Finalize a target audience:  Send out relevant surveys to the target audience and filter out irrelevant questions as required. Survey research is most useful when the sample is drawn from a clearly defined target population; this way, results reflect the desired market and can be generalized to the entire population.

  • Send out surveys via decided mediums:  Distribute the surveys to the target audience and patiently wait for feedback and comments. This is the most crucial step of the survey research. The survey needs to be scheduled keeping in mind the nature of the target audience and its regions. Surveys can be conducted via email, embedded in a website, shared via social media, etc., to gain maximum responses.
  • Analyze survey results:  Analyze the feedback in real-time and identify patterns in the responses which might lead to a much-needed breakthrough for your organization. GAP, TURF Analysis , Conjoint analysis, Cross tabulation, and many such survey feedback analysis methods can be used to spot and shed light on respondent behavior. Researchers can use the results to implement corrective measures to improve customer/employee satisfaction.
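As a minimal illustration of this analysis step, here is a sketch of a Net Promoter Score calculation and a simple cross tabulation using only Python's standard library. The responses and the "new"/"returning" segments are invented for the example:

```python
from collections import Counter

# Hypothetical 0-10 NPS responses, each paired with a respondent segment.
responses = [("new", 9), ("new", 10), ("returning", 7), ("returning", 3),
             ("new", 8), ("returning", 10), ("new", 6), ("returning", 9)]

def bucket(score):
    """Classify a 0-10 rating into the standard NPS buckets."""
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "passive"
    return "detractor"

scores = [s for _, s in responses]
counts = Counter(bucket(s) for s in scores)

# NPS = % promoters minus % detractors.
nps = 100 * (counts["promoter"] - counts["detractor"]) / len(scores)
print(f"NPS: {nps:.0f}")  # NPS: 25

# Cross tabulation: bucket counts broken down by segment.
crosstab = Counter((seg, bucket(s)) for seg, s in responses)
for (seg, b), n in sorted(crosstab.items()):
    print(seg, b, n)
```

The same cross-tabulation idea scales to any pair of categorical variables (segment × answer choice), which is what spreadsheet pivot tables and survey-platform dashboards compute under the hood.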

Reasons to conduct survey research

The most crucial and integral reason for conducting market research using surveys is that you can collect answers to specific, essential questions. You can ask these questions in multiple survey formats, depending on the target audience and the intent of the survey. Before designing a study, every organization must figure out the objective of carrying it out so that the study can be structured, planned, and executed well.

Questions that need to be on your mind while designing a survey are:

  • What is the primary aim of conducting the survey?
  • How do you plan to utilize the collected survey data?
  • What type of decisions do you plan to make based on the points mentioned above?

There are three critical reasons why an organization must conduct survey research.

  • Understand respondent behavior to get solutions to your queries:  If you've carefully curated a survey, the respondents will provide insights about what they like about your organization as well as suggestions for improvement. To motivate them to respond, be very clear about how secure their responses will be and how you will use the answers. This will encourage them to be 100% honest in their feedback, opinions, and comments. Online and mobile surveys have proven their privacy, and as a result, more and more respondents feel free to share their feedback through these mediums.
  • Present a medium for discussion:  A survey can be the perfect platform for respondents to provide criticism or applause for an organization. Important topics, like product quality or the quality of customer service, can be put on the table for discussion. One way to do this is to include open-ended questions where respondents can write their thoughts. This will make it easy for you to correlate your survey with what you intend to do with your product or service.
  • Strategy for never-ending improvements:  An organization can establish the target audience's attributes from the pilot phase of survey research . Researchers can use the criticism and feedback received from this survey to improve the product/services. Once the company successfully makes the improvements, it can send out another survey to measure the change in feedback, keeping the pilot phase as the benchmark. Through this activity, the organization can track what was effectively improved and what still needs improvement.

Survey Research Scales

There are four main scales for the measurement of variables:

  • Nominal Scale:  A nominal scale associates numbers with variables for mere naming or labeling, and the numbers usually have no other relevance. It is the most basic of the four levels of measurement.
  • Ordinal Scale:  The ordinal scale has an innate order within the variables along with labels. It establishes the rank between the variables of a scale but not the difference value between the variables.
  • Interval Scale:  The interval scale is a step ahead in comparison to the other two scales. Along with establishing a rank and name of variables, the scale also makes known the difference between the two variables. The only drawback is that there is no fixed start point of the scale, i.e., the actual zero value is absent.
  • Ratio Scale:  The ratio scale is the most advanced measurement scale, with variables that are labeled in order and have a calculated difference between them. In addition to everything the interval scale offers, this scale has a fixed starting point, i.e., a true zero value is present.
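To illustrate why the level of measurement matters for analysis, here is a small sketch (with invented data) of which summary statistics are meaningful at each scale:

```python
import statistics

# Nominal: labels only -- the mode is the only meaningful "average".
colors = ["red", "blue", "red", "green", "red"]
print(statistics.mode(colors))  # red

# Ordinal: ranked labels -- the median is meaningful, the mean is not.
agreement = [1, 2, 2, 4, 5]  # e.g. responses to a 5-point Likert item
print(statistics.median(agreement))  # 2

# Interval: differences are meaningful, ratios are not (no true zero).
temps_c = [10, 20, 30]
print(statistics.mean(temps_c))  # 20 -- but 20 C is not "twice as warm" as 10 C

# Ratio: a true zero exists, so ratios make sense.
ages = [20, 40]
print(ages[1] / ages[0])  # 2.0 -- 40 really is twice 20
```

This is why, for example, averaging Likert responses is a common but debated shortcut: strictly, an ordinal scale only supports rank-based summaries like the median.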

Benefits of survey research

When survey research is used for the right purposes and implemented properly, marketers can benefit from useful, trustworthy data that they can use to improve the organization's ROI.

Other benefits of survey research are:

  • Minimum investment:  Mobile surveys and online surveys have minimal finance invested per respondent. Even with the gifts and other incentives provided to the people who participate in the study, online surveys are extremely economical compared to paper-based surveys.
  • Versatile sources for response collection:  You can conduct surveys via various mediums, like online and mobile surveys. You can further classify them into qualitative mediums, like focus groups and interviews, and quantitative mediums, like customer-centric surveys. Thanks to offline response collection, researchers can also conduct surveys in remote areas with limited internet connectivity, which makes data collection and analysis more convenient and extensive.
  • Reliable for respondents:  Surveys are extremely secure, as respondent details and responses are kept safeguarded. This anonymity makes respondents answer the survey questions candidly and with absolute honesty. An organization seeking explicit responses for its survey research must state that responses will be kept confidential.

Survey research design

Researchers implement a survey research design in cases where the cost involved is limited and details need to be gathered easily. This method is often used by small and large organizations to understand and analyze new trends, market demands, and opinions. Collecting information through a tactfully designed survey can be much more effective and productive than a casually conducted one.

There are five stages of survey research design:

  • Decide the aim of the research:  There can be multiple reasons for a researcher to conduct a survey, but they need to decide on a purpose for the research. This is the primary stage of survey research, as it can mold the entire path of a survey and impact its results.
  • Filter the sample from the target population:  "Who should we target?" is an essential question a researcher should answer and keep in mind while conducting research. The precision of the results is driven by who the members of the sample are and how useful their opinions are. The quality of respondents in a sample matters more for the results than their quantity. If a researcher seeks to understand whether a product feature will work well with their target market, he/she can conduct survey research with a group of market experts for that product or technology.
  • Zero-in on a survey method:  Many qualitative and quantitative research methods can be discussed and decided. Focus groups, online interviews, surveys, polls, questionnaires, etc. can be carried out with a pre-decided sample of individuals.
  • Design the questionnaire:  What will the content of the survey be? A researcher is required to answer this question to be able to design it effectively. What will the content of the cover letter be? Or what are the survey questions of this questionnaire? Understand the target market thoroughly to create a questionnaire that targets a sample to gain insights about a survey research topic.
  • Send out surveys and analyze results:  Once the researcher decides on which questions to include in a study, they can send it across to the selected sample . Answers obtained from this survey can be analyzed to make product-related or marketing-related decisions.

Survey examples: 10 tips to design the perfect research survey

Picking the right survey design can be the key to gaining the information you need to make crucial decisions for all your research. It is essential to choose the right topic, choose the right question types, and pick a corresponding design. If this is your first time creating a survey, it can seem like an intimidating task. But with QuestionPro, each step of the process is made simple and easy.

Below are 10 Tips To Design The Perfect Research Survey:

  • Set your SMART goals:  Before conducting any market research or creating a particular plan, set your SMART Goals . What is it that you want to achieve with the survey? How will you measure it, and what results are you expecting?
  • Choose the right questions:  Designing a survey can be a tricky task. Asking the right questions may help you get the answers you are looking for and ease the task of analyzing. So, always choose those specific questions – relevant to your research.
  • Begin your survey with a generalized question:  Preferably, start your survey with a general question to understand whether the respondent uses the product or not. That also provides an excellent base and intro for your survey.
  • Enhance your survey:  Choose the 15-20 best, most relevant questions. Frame each question as a type suited to the kind of answer you would like to gather. Create a survey using different types of questions such as multiple-choice, rating scale, open-ended, etc. Look at more survey examples and the four measurement scales every researcher should remember.
  • Prepare yes/no questions:  You may also want to use yes/no questions to separate people or branch them into groups of those who “have purchased” and those who “have not yet purchased” your products or services. Once you separate them, you can ask them different questions.
  • Test all electronic devices:  It becomes effortless to distribute your surveys if respondents can answer them on different electronic devices like mobiles, tablets, etc. Once you have created your survey, it’s time to TEST. You can also make any corrections if needed at this stage.
  • Distribute your survey:  Once your survey is ready, it is time to share and distribute it to the right audience. You can distribute handouts or share the survey via email, social media, and other industry-related offline/online communities.
  • Collect and analyze responses:  After distributing your survey, it is time to gather all responses. Make sure you store your results in a particular document or an Excel sheet with all the necessary categories mentioned so that you don’t lose your data. Remember, this is the most crucial stage. Segregate your responses based on demographics, psychographics, and behavior. This is because, as a researcher, you must know where your responses are coming from. It will help you to analyze, predict decisions, and help write the summary report.
  • Prepare your summary report:  Now is the time to share your analysis. At this stage, you should present all the responses gathered from the survey in a fixed format. The reader/customer must also get clarity about the goal you were trying to achieve with the study. Address questions such as: has the product or service been used and preferred? Do respondents prefer one product over another? Any recommendations?
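The "segregate your responses based on demographics" step above can be sketched in a few lines. This is a minimal illustration with invented data (an age-group field and 1-5 satisfaction scores), not a prescribed workflow:

```python
from collections import defaultdict

# Hypothetical responses: (age_group, satisfaction score on a 1-5 scale).
rows = [("18-24", 4), ("25-34", 5), ("18-24", 3), ("35-44", 2), ("25-34", 4)]

# Group scores by the demographic field.
by_group = defaultdict(list)
for group, score in rows:
    by_group[group].append(score)

# Average satisfaction per demographic segment.
for group in sorted(by_group):
    scores = by_group[group]
    print(group, sum(scores) / len(scores))
```

The same grouping pattern works for any demographic, psychographic, or behavioral field, and makes it obvious where responses come from before you write the summary report.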

Having a tool that helps you complete all the necessary steps of this type of study is a vital part of any project. At QuestionPro, we have helped more than 10,000 clients around the world carry out data collection in a simple and effective way, in addition to offering a wide range of solutions to take advantage of this data in the best possible way.

From dashboards, advanced analysis tools, automation, and dedicated functions, in QuestionPro, you will find everything you need to execute your research projects effectively. Uncover insights that matter the most!


StatAnalytica

Explore 151 Amazing Topics For Survey Research (Updated 2024)

Hey everyone! Ready to dive into the world of surveys? We’ve got the ultimate guide “Explore 151 Amazing Topics for Survey Research (Updated 2024).” It’s like having a treasure map for cool and interesting stuff to ask people about. From what snacks folks love to the latest tech trends, we’ve got it all. No need for fancy words – just easy questions and tons of fun. 

Ever wondered if more people like cats or dogs? Or maybe what games are ruling the playground? Well, get ready to reveal the answers. These topics aren’t just surveys; they’re your passport to understanding the world around you. So, grab a seat and let’s dive into the awesomeness of surveys – because every question is a key to unlocking something cool.

What Is Survey Research?

Survey research is a method of gathering data from individuals to understand opinions, behaviors, or characteristics. Researchers use questionnaires, interviews, or other structured approaches to collect information.

This systematic approach allows for the analysis of trends, preferences, and attitudes within a targeted population, providing valuable information for various fields, including social sciences, market research, and public opinion studies.

151 Amazing Topics For Survey Research

Top 10 Topics For Survey Research On Job Satisfaction

  • Job Satisfaction Factors in Remote Work Environments
  • The Impact of Leadership Styles on Employee Job Satisfaction
  • Gender Disparities in Job Satisfaction Across Industries
  • Job Satisfaction and Its Relation to Employee Productivity
  • Influence of Organizational Culture on Job Satisfaction
  • Job Satisfaction Among Healthcare Professionals
  • Job Satisfaction Trends in the Gig Economy
  • Job Satisfaction and Work-Life Balance in the Tech Industry
  • Job Satisfaction Among Millennials in the Workplace
  • The Role of Training and Development in Enhancing Job Satisfaction

Top 10 Survey Research Topics On Health and Wellness


  • Workplace Wellness Programs: Employee Perceptions and Participation
  • Impact of Stress on Physical and Mental Health
  • Effectiveness of Remote Work on Employee Wellbeing
  • Health and Wellness Behaviors Among College Students
  • Community Perspectives on Access to Healthcare Services
  • Employee Engagement in Corporate Wellness Initiatives
  • Mental Health Stigma: Public Attitudes and Awareness
  • Impact of Physical Activity on Overall Wellbeing
  • Dietary Habits and their Influence on Health
  • Perceptions of Telehealth Services in Healthcare Accessibility

Top 10 Survey Research Topics On Education System Evaluation


  • Parental Satisfaction with Remote Learning Environments
  • Student Perspectives on Online Education Effectiveness
  • Teacher Feedback on the Implementation of Technology in the Classroom
  • The Impact of Standardized Testing on Educational Quality
  • Community Perceptions of School Safety Measures
  • Teacher Training and Preparedness for Diverse Learning Needs
  • Student Engagement in STEM (Science, Technology, Engineering, and Mathematics) Education
  • Parental Involvement in School Decision-Making Processes
  • The Effectiveness of Extracurricular Programs on Student Development
  • Evaluation of Inclusivity and Diversity Initiatives in Educational Institutions

Top 10 Survey Research Topics On Consumer Preferences in Technology


  • Smartphone vs. Tablet Preferences: User Experiences
  • Attitudes Towards Wearable Technology Adoption
  • Consumer Perception of Artificial Intelligence in Everyday Devices
  • Preferences in Operating Systems: iOS vs. Android
  • Interest in Augmented Reality (AR) and Virtual Reality (VR) Experiences
  • Security Concerns in Internet of Things (IoT) Devices
  • Sustainable Technology: Consumer Awareness and Choices
  • User Satisfaction with Smart Home Automation
  • E-readers vs. Physical Books: Reading Preferences
  • Preferences in Laptop vs. Desktop Computing for Various Tasks

Top 10 Survey Research Topics On Public Transportation Usage


  • Factors Influencing Public Transportation Commute Decisions
  • User Satisfaction with Public Transit Services
  • Impact of Commuting Time on Mode of Transportation
  • Barriers to Increased Public Transportation Adoption
  • Accessibility and Inclusivity in Public Transportation Systems
  • Attitudes Towards Public Transportation Safety Measures
  • Technology Integration in Public Transit: User Perspectives
  • Public Transportation Usage Patterns During Peak vs. Off-Peak Hours
  • The Role of Environmental Awareness in Transportation Choices
  • Public Transportation’s Impact on Urban Planning and Development

Top 10 Topics For Survey Research On Environmental Awareness and Practices


  • Public Perceptions of Climate Change: Knowledge and Concerns
  • Attitudes Towards Sustainable Energy Sources
  • Consumer Preferences for Eco-friendly Products
  • The Role of Environmental Education in Shaping Attitudes
  • Waste Reduction and Recycling Behaviors
  • Views on Government Policies Supporting Environmental Conservation
  • Participation in Community Environmental Initiatives
  • Water Conservation Practices in Urban and Rural Areas
  • Urban Green Spaces and their Impact on Wellbeing
  • The Influence of Corporate Sustainability Practices on Consumer Choices

Top 10 Survey Research Topics On Political Beliefs and Affiliations


  • Political Ideology Trends Among Different Age Groups
  • Media Influence on Political Beliefs and Opinions
  • Public Trust in Political Institutions and Leaders
  • Social Media’s Role in Shaping Political Affiliations
  • Factors Influencing Political Party Switching
  • Attitudes Towards Political Activism and Protests
  • The Impact of Economic Conditions on Political Preferences
  • Civic Engagement: Voter Turnout and Political Participation
  • Perspectives on Political Polarization and Unity
  • Political Communication Channels and Effectiveness

Top 10 Survey Research Topics On Social Media Usage Patterns


  • Social Media Preferences Across Different Age Groups
  • Influencers and their Impact on Social Media Choices
  • Privacy Concerns and User Behavior on Social Platforms
  • Trends in Social Media Engagement: Likes, Shares, and Comments
  • The Role of Social Media in Shaping Personal Identity
  • User Satisfaction with Social Media Algorithms
  • Social Media’s Influence on News Consumption Habits
  • Online Harassment and Cyberbullying on Social Platforms
  • Attitudes Towards Social Media Advertising
  • Impact of Social Media Detox Challenges on User Behavior 

Top 10 Topics For Survey Research On Community Involvement


  • Factors Influencing Community Volunteerism
  • Public Awareness and Participation in Local Governance
  • Community Perceptions of Civic Engagement Opportunities
  • Attitudes Towards Community Improvement Initiatives
  • Barriers to Active Community Participation
  • Influence of Socioeconomic Factors on Community Involvement
  • Perspectives on the Importance of Neighborhood Associations
  • The Role of Social Media in Fostering Community Connections
  • Civic Education and its Impact on Community Engagement
  • Volunteer Motivations and Satisfaction in Community Service

Top 10 Survey Research Topics On Work-Life Balance


  • Employee Perspectives on Work-Life Balance Policies
  • The Impact of Flexible Work Arrangements on Wellbeing
  • Burnout and Stress Levels in Different Professions
  • Attitudes Towards Remote Work and its Effect on Work-Life Balance
  • Parental Leave and its Influence on Work-Life Integration
  • Job Satisfaction and its Relation to Work-Life Harmony
  • Perceptions of Organizational Support for Work-Life Balance
  • The Role of Technology in Balancing Work and Personal Life
  • Gender Disparities in Work-Life Balance Experiences
  • The Influence of Commute Length on Work-Life Equilibrium

Top 10 Topics For Survey Research On Financial Literacy


  • Financial Literacy Among Different Age Groups
  • Impact of Educational Background on Financial Knowledge
  • Perspectives on Credit Card Usage and Debt Management
  • Knowledge of Investment Strategies and Risk Tolerance
  • Attitudes Towards Retirement Planning and Savings
  • The Role of Financial Education Programs in Shaping Literacy
  • Awareness and Utilization of Financial Planning Services
  • Influence of Socioeconomic Factors on Financial Literacy
  • Understanding of Taxation Systems and Financial Responsibilities
  • Consumer Perspectives on Fintech and Digital Banking Literacy

Top 10 Survey Research Topics On E-commerce Shopping Habits


  • Consumer Trust in Online Payment Security
  • Preferences for Mobile vs. Desktop Shopping Experiences
  • Factors Influencing Online Shopping Cart Abandonment
  • Impact of Product Reviews on Purchasing Decisions
  • The Role of Social Media in Influencing E-commerce Choices
  • Customer Loyalty Programs: Effectiveness and Perception
  • Cross-Border E-commerce Shopping Trends
  • Sustainability and Ethical Considerations in Online Purchases
  • Influencers and their Impact on E-commerce Sales
  • Consumer Attitudes Towards Personalized Shopping Recommendations

Top 10 Topics For Survey Research On Entertainment Preferences


  • Streaming Services vs. Traditional Cable: Consumer Preferences
  • Influences on Music Genre Preferences Among Different Age Groups
  • Viewer Satisfaction and Preferences in Online Video Content
  • Impact of Social Media on Movie and TV Show Recommendations
  • Trends in Live Events vs. Digital Entertainment Participation
  • The Role of Podcasts in Consumer Entertainment Choices
  • Gaming Preferences: Console vs. PC vs. Mobile
  • User Engagement in Virtual and Augmented Reality Entertainment
  • Attitudes Towards Celebrity Endorsements in Entertainment
  • Impact of Cultural Background on Entertainment Choices

Top 10 Survey Research Topics On Housing and Living Conditions


  • Satisfaction with Affordable Housing Options
  • Preferences for Urban vs. Suburban Living Environments
  • Impact of Remote Work on Housing Choices and Locations
  • Community Safety and its Influence on Housing Decisions
  • The Role of Green Spaces in Residential Satisfaction
  • Access to Public Transportation and Housing Selection
  • Smart Home Technology Adoption and Preferences
  • Perceptions of Gentrification in Local Neighborhoods
  • Affordable Housing and its Impact on Local Communities
  • Housing Accessibility and Satisfaction for Individuals with Disabilities

Top 10 Topics For Survey Research On Attitudes Towards Remote Work

  • Employee Satisfaction and Productivity in Remote Work Environments
  • Technological Challenges in Adapting to Remote Work
  • Employer Policies and Flexibility in Remote Work Arrangements
  • Impact of Remote Work on Work-Life Balance
  • Communication Effectiveness in Virtual Teams
  • Employee Perspectives on Mental Health and Remote Work
  • Remote Work’s Influence on Job Satisfaction and Career Advancement
  • Employer and Employee Perceptions of Remote Work Sustainability
  • Remote Work’s Effect on Organizational Culture
  • Factors Influencing Employee Return-to-Office Preferences

30 Field-Specific Topics on Survey Research

Top 10 Survey Research Topics For Marketing Students

  • Consumer Preferences in Online Shopping: A Comparative Analysis
  • Effectiveness of Social Media Influencers in Product Endorsements
  • Perceptions of Brand Loyalty Programs and Customer Retention
  • Impact of E-commerce Trends on Traditional Retail Shopping Habits
  • Evaluating Customer Satisfaction with Mobile App User Experiences
  • Consumer Attitudes Towards Sustainable and Eco-Friendly Products
  • Exploring the Influence of Celebrity Endorsements on Product Purchases
  • Effectiveness of Personalized Marketing Strategies in Online Retail
  • Customer Perception of Brand Authenticity in Marketing Campaigns
  • Assessing the Impact of Online Reviews on Consumer Decision-Making

Top 10 Survey Research Topics For Medical Students

  • Medical Students’ Perceptions of Virtual Learning Environments
  • Evaluating the Impact of Clinical Rotations on Career Aspirations
  • Assessment of Stress and Burnout Among Medical Students
  • Effectiveness of Peer-Assisted Learning in Medical Education
  • Attitudes Towards Integrating Technology in Medical Curriculum
  • Exploring Medical Students’ Awareness of Mental Health Resources
  • Medical Students’ Views on Cultural Competency Training
  • Perceptions of the Clinical Clerkship Experience
  • Understanding the Impact of Sleep Deprivation on Medical Students
  • Medical Students’ Perspectives on Patient Interactions and Communication Skills Training

Top 10 Survey Research Topics For Psychology Students

  • The Impact of Social Media on Psychological Well-being
  • Perceptions of Therapy Effectiveness and Accessibility
  • Factors Influencing Stress Levels Among College Students
  • Understanding Attitudes Towards Mental Health in the Workplace
  • Exploring Personality Traits and Coping Mechanisms
  • Attitudes Towards Seeking Help for Mental Health Issues
  • Impact of Childhood Experiences on Adult Mental Health
  • Psychological Effects of Social Isolation and Loneliness
  • Perceptions of Happiness and Life Satisfaction Across Age Groups

Top 15 Methods Of Survey Research

Survey research involves collecting data from a target audience to gain insights into opinions, behaviors, or characteristics. Several methods are employed in survey research, each with its strengths and limitations. Here are common methods:

  • Questionnaires: Distributing written or online surveys with structured questions allows for standardized data collection.
  • Interviews: Conducting one-on-one or group interviews enables in-depth exploration of responses and clarification of ambiguous answers.
  • Telephone Surveys: Collecting data over the phone is a quick and cost-effective method for reaching a wide audience.
  • Online Surveys: Utilizing web-based platforms for data collection offers convenience and access to a large and diverse participant pool.
  • Face-to-Face Surveys: Interviewers administer surveys in person, fostering rapport and improving response rates.
  • Mail Surveys: Sending questionnaires by mail is a traditional method, suitable for reaching specific demographic groups.
  • Cross-Sectional Surveys: Collecting data from a diverse group at a single point in time helps capture a snapshot of opinions or behaviors.
  • Longitudinal Surveys: Gathering data from the same participants over an extended period allows for the analysis of changes over time.
  • Online Panel Surveys: Building a panel of participants who regularly respond to surveys facilitates repeated data collection.
  • Randomized Controlled Trials (RCTs): Adding surveys within experimental designs helps assess causation and control for confounding variables.
  • Mail Panels: Creating a group of individuals who regularly receive and respond to mail surveys allows for longitudinal data collection.
  • Drop-Off/Pick-Up Surveys: Leaving surveys at a designated location for participants to complete and return later.
  • Intercept Surveys: Approaching participants in public spaces, such as malls or events, for on-the-spot interviews.
  • Ethnographic Surveys: Immersing researchers in the target environment to observe and understand behaviors and attitudes.
  • Delphi Method: Conducting repeated rounds of expert surveys to reach a consensus on complex topics.

Choosing the appropriate survey method depends on factors like research goals, budget, and the nature of the data needed. Researchers often use a combination of methods for a comprehensive understanding of their study area.

And that’s a wrap on our journey through 151 topics for survey research. Each question is a little window into people’s thoughts and preferences. We hope this updated list brings you not just data, but insights and a bit of fun too. 

Surveys are like conversations with the world, and with these topics, you’re all set to start chatting. Whether it’s uncovering trends or settling debates, your survey adventure awaits. So, go ahead, dive into the questions, discover new perspectives, and enjoy the endless possibilities that survey research brings. Happy surveying, explorers.


J Adv Pract Oncol. 2015 Mar–Apr;6(2).

Understanding and Evaluating Survey Research

A variety of methodologic approaches exist for individuals interested in conducting research. Selection of a research approach depends on a number of factors, including the purpose of the research, the type of research questions to be answered, and the availability of resources. The purpose of this article is to describe survey research as one approach to the conduct of research so that the reader can critically evaluate the appropriateness of the conclusions from studies employing survey research.

SURVEY RESEARCH

Survey research is defined as "the collection of information from a sample of individuals through their responses to questions" ( Check & Schutt, 2012, p. 160 ). This type of research allows for a variety of methods to recruit participants, collect data, and utilize various methods of instrumentation. Survey research can use quantitative research strategies (e.g., using questionnaires with numerically rated items), qualitative research strategies (e.g., using open-ended questions), or both strategies (i.e., mixed methods). As it is often used to describe and explore human behavior, surveys are therefore frequently used in social and psychological research ( Singleton & Straits, 2009 ).

Information has been obtained from individuals and groups through the use of survey research for decades. It can range from asking a few targeted questions of individuals on a street corner to obtain information related to behaviors and preferences, to a more rigorous study using multiple valid and reliable instruments. Common examples of less rigorous surveys include marketing or political surveys of consumer patterns and public opinion polls.

Survey research has historically included large population-based data collection. The primary purpose of this type of survey research was to obtain information describing characteristics of a large sample of individuals of interest relatively quickly. Large census surveys obtaining information reflecting demographic and personal characteristics and consumer feedback surveys are prime examples. These surveys were often provided through the mail and were intended to describe demographic characteristics of individuals or obtain opinions on which to base programs or products for a population or group.

More recently, survey research has developed into a rigorous approach to research, with scientifically tested strategies detailing who to include (representative sample), what and how to distribute (survey method), and when to initiate the survey and follow up with nonresponders (reducing nonresponse error), in order to ensure a high-quality research process and outcome. Currently, the term "survey" can reflect a range of research aims, sampling and recruitment strategies, data collection instruments, and methods of survey administration.

Given this range of options in the conduct of survey research, it is imperative for the consumer/reader of survey research to understand the potential for bias in survey research as well as the tested techniques for reducing bias, in order to draw appropriate conclusions about the information reported in this manner. Common types of error in research, along with the sources of error and strategies for reducing error as described throughout this article, are summarized in the Table .

Table: Sources of Error in Survey Research and Strategies to Reduce Error

The goal of sampling strategies in survey research is to obtain a sufficient sample that is representative of the population of interest. It is often not feasible to collect data from an entire population of interest (e.g., all individuals with lung cancer); therefore, a subset of the population or sample is used to estimate the population responses (e.g., individuals with lung cancer currently receiving treatment). A large random sample increases the likelihood that the responses from the sample will accurately reflect the entire population. In order to accurately draw conclusions about the population, the sample must include individuals with characteristics similar to the population.
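As a minimal sketch of the simple random sampling described above (the respondent IDs and frame size are hypothetical, using only Python's standard library), drawing a sample from a sampling frame might look like this:

```python
import random

# Hypothetical sampling frame: IDs of all individuals in the population of interest
population = [f"patient_{i:04d}" for i in range(1, 1001)]  # 1,000 individuals

random.seed(42)  # fixed seed so the draw is reproducible
sample = random.sample(population, k=100)  # simple random sample of 100, no repeats

# random.sample draws without replacement, so every ID in the sample is unique
print(len(sample), len(set(sample)))
```

Because every individual in the frame has an equal chance of selection, a sufficiently large draw tends to mirror the population's characteristics; in practice, the researcher must also verify that the frame itself actually covers the intended population.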

It is therefore necessary to correctly identify the population of interest (e.g., individuals with lung cancer currently receiving treatment vs. all individuals with lung cancer). The sample will ideally include individuals who reflect the intended population in terms of all characteristics of the population (e.g., sex, socioeconomic characteristics, symptom experience) and contain a similar distribution of individuals with those characteristics. As discussed by Mady Stovall beginning on page 162, Fujimori et al. ( 2014 ), for example, were interested in the population of oncologists. The authors obtained a sample of oncologists from two hospitals in Japan. These participants may or may not have similar characteristics to all oncologists in Japan.

Participant recruitment strategies can affect the adequacy and representativeness of the sample obtained. Using diverse recruitment strategies can help improve the size of the sample and help ensure adequate coverage of the intended population. For example, if a survey researcher intends to obtain a sample of individuals with breast cancer representative of all individuals with breast cancer in the United States, the researcher would want to use recruitment strategies that would recruit both women and men, individuals from rural and urban settings, individuals receiving and not receiving active treatment, and so on. Because of the difficulty in obtaining samples representative of a large population, researchers may focus the population of interest to a subset of individuals (e.g., women with stage III or IV breast cancer). Large census surveys require extremely large samples to adequately represent the characteristics of the population because they are intended to represent the entire population.

DATA COLLECTION METHODS

Survey research may use a variety of data collection methods with the most common being questionnaires and interviews. Questionnaires may be self-administered or administered by a professional, may be administered individually or in a group, and typically include a series of items reflecting the research aims. Questionnaires may include demographic questions in addition to valid and reliable research instruments ( Costanzo, Stawski, Ryff, Coe, & Almeida, 2012 ; DuBenske et al., 2014 ; Ponto, Ellington, Mellon, & Beck, 2010 ). It is helpful to the reader when authors describe the contents of the survey questionnaire so that the reader can interpret and evaluate the potential for errors of validity (e.g., items or instruments that do not measure what they are intended to measure) and reliability (e.g., items or instruments that do not measure a construct consistently). Helpful examples of articles that describe the survey instruments exist in the literature ( Buerhaus et al., 2012 ).

Questionnaires may be in paper form and mailed to participants, delivered in an electronic format via email or an Internet-based program such as SurveyMonkey, or a combination of both, giving the participant the option to choose which method is preferred ( Ponto et al., 2010 ). Using a combination of methods of survey administration can help to ensure better sample coverage (i.e., all individuals in the population having a chance of inclusion in the sample), thereby reducing coverage error ( Dillman, Smyth, & Christian, 2014 ; Singleton & Straits, 2009 ). For example, if a researcher were to only use an Internet-delivered questionnaire, individuals without access to a computer would be excluded from participation. Self-administered mailed, group, or Internet-based questionnaires are relatively low cost and practical for a large sample ( Check & Schutt, 2012 ).

Dillman et al. ( 2014 ) have described and tested a tailored design method for survey research. Improving the visual appeal and graphics of surveys by using a font size appropriate for the respondents, ordering items logically without creating unintended response bias, and arranging items clearly on each page can increase the response rate to electronic questionnaires. Attending to these and other issues in electronic questionnaires can help reduce measurement error (i.e., lack of validity or reliability) and help ensure a better response rate.

Conducting interviews is another approach to data collection used in survey research. Interviews may be conducted by phone, computer, or in person and have the benefit of visually identifying the nonverbal response(s) of the interviewee and subsequently being able to clarify the intended question. An interviewer can use probing comments to obtain more information about a question or topic and can request clarification of an unclear response ( Singleton & Straits, 2009 ). Interviews can be costly and time intensive, and therefore are relatively impractical for large samples.

Some authors advocate for using mixed methods for survey research when no one method is adequate to address the planned research aims, to reduce the potential for measurement and non-response error, and to better tailor the study methods to the intended sample ( Dillman et al., 2014 ; Singleton & Straits, 2009 ). For example, a mixed methods survey research approach may begin with distributing a questionnaire and following up with telephone interviews to clarify unclear survey responses ( Singleton & Straits, 2009 ). Mixed methods might also be used when visual or auditory deficits preclude an individual from completing a questionnaire or participating in an interview.

FUJIMORI ET AL.: SURVEY RESEARCH

Fujimori et al. ( 2014 ) described the use of survey research in a study of the effect of communication skills training for oncologists on oncologist and patient outcomes (e.g., oncologist’s performance and confidence and patient’s distress, satisfaction, and trust). A sample of 30 oncologists from two hospitals was obtained. Although the authors provided a power analysis concluding that the number of oncologist participants was adequate to detect differences between baseline and follow-up scores, the conclusions of the study may not be generalizable to a broader population of oncologists. Oncologists were randomized to either an intervention group (i.e., communication skills training) or a control group (i.e., no training).

Fujimori et al. ( 2014 ) chose a quantitative approach to collect data from oncologist and patient participants regarding the study outcome variables. Self-report numeric ratings were used to measure oncologist confidence and patient distress, satisfaction, and trust. Oncologist confidence was measured using two instruments each using 10-point Likert rating scales. The Hospital Anxiety and Depression Scale (HADS) was used to measure patient distress and has demonstrated validity and reliability in a number of populations including individuals with cancer ( Bjelland, Dahl, Haug, & Neckelmann, 2002 ). Patient satisfaction and trust were measured using 0 to 10 numeric rating scales. Numeric observer ratings were used to measure oncologist performance of communication skills based on a videotaped interaction with a standardized patient. Participants completed the same questionnaires at baseline and follow-up.

The authors clearly describe what data were collected from all participants. Providing additional information about the manner in which questionnaires were distributed (i.e., electronic, mail), the setting in which data were collected (e.g., home, clinic), and the design of the survey instruments (e.g., visual appeal, format, content, arrangement of items) would assist the reader in drawing conclusions about the potential for measurement and nonresponse error. The authors describe conducting a follow-up phone call or mail inquiry for nonresponders; using the Dillman et al. ( 2014 ) tailored design for survey research follow-up may have reduced nonresponse error.

CONCLUSIONS

Survey research is a useful and legitimate approach to research that has clear benefits in helping to describe and explore variables and constructs of interest. Survey research, like all research, has the potential for a variety of sources of error, but several strategies exist to reduce the potential for error. Advanced practitioners aware of the potential sources of error and strategies to improve survey research can better determine how and whether the conclusions from a survey research study apply to practice.

The author has no potential conflicts of interest to disclose.

Survey Research – Types, Methods, Examples

Survey Research

Definition:

Survey Research is a quantitative research method that involves collecting standardized data from a sample of individuals or groups through the use of structured questionnaires or interviews. The data collected is then analyzed statistically to identify patterns and relationships between variables, and to draw conclusions about the population being studied.

Survey research can be used to answer a variety of questions, including:

  • What are people’s opinions about a certain topic?
  • What are people’s experiences with a certain product or service?
  • What are people’s beliefs about a certain issue?

Survey Research Methods

Survey Research Methods are as follows:

  • Telephone surveys: A survey research method where questions are administered to respondents over the phone, often used in market research or political polling.
  • Face-to-face surveys: A survey research method where questions are administered to respondents in person, often used in social or health research.
  • Mail surveys: A survey research method where questionnaires are sent to respondents through mail, often used in customer satisfaction or opinion surveys.
  • Online surveys: A survey research method where questions are administered to respondents through online platforms, often used in market research or customer feedback.
  • Email surveys: A survey research method where questionnaires are sent to respondents through email, often used in customer satisfaction or opinion surveys.
  • Mixed-mode surveys: A survey research method that combines two or more survey modes, often used to increase response rates or reach diverse populations.
  • Computer-assisted surveys: A survey research method that uses computer technology to administer or collect survey data, often used in large-scale surveys or data collection.
  • Interactive voice response surveys: A survey research method where respondents answer questions through a touch-tone telephone system, often used in automated customer satisfaction or opinion surveys.
  • Mobile surveys: A survey research method where questions are administered to respondents through mobile devices, often used in market research or customer feedback.
  • Group-administered surveys: A survey research method where questions are administered to a group of respondents simultaneously, often used in education or training evaluation.
  • Web-intercept surveys: A survey research method where questions are administered to website visitors, often used in website or user experience research.
  • In-app surveys: A survey research method where questions are administered to users of a mobile application, often used in mobile app or user experience research.
  • Social media surveys: A survey research method where questions are administered to respondents through social media platforms, often used in social media or brand awareness research.
  • SMS surveys: A survey research method where questions are administered to respondents through text messaging, often used in customer feedback or opinion surveys.
  • IVR surveys: A survey research method where questions are administered to respondents through an interactive voice response system, often used in automated customer feedback or opinion surveys.
  • Mixed-method surveys: A survey research method that combines both qualitative and quantitative data collection methods, often used in exploratory or mixed-method research.
  • Drop-off surveys: A survey research method where respondents are provided with a survey questionnaire and asked to return it at a later time or through a designated drop-off location.
  • Intercept surveys: A survey research method where respondents are approached in public places and asked to participate in a survey, often used in market research or customer feedback.
  • Hybrid surveys: A survey research method that combines two or more survey modes, data sources, or research methods, often used in complex or multi-dimensional research questions.

Types of Survey Research

There are several types of survey research that can be used to collect data from a sample of individuals or groups. The following are common types of survey research:

  • Cross-sectional survey: A type of survey research that gathers data from a sample of individuals at a specific point in time, providing a snapshot of the population being studied.
  • Longitudinal survey: A type of survey research that gathers data from the same sample of individuals over an extended period of time, allowing researchers to track changes or trends in the population being studied.
  • Panel survey: A type of longitudinal survey research that tracks the same sample of individuals over time, typically collecting data at multiple points in time.
  • Epidemiological survey: A type of survey research that studies the distribution and determinants of health and disease in a population, often used to identify risk factors and inform public health interventions.
  • Observational survey: A type of survey research that collects data through direct observation of individuals or groups, often used in behavioral or social research.
  • Correlational survey: A type of survey research that measures the degree of association or relationship between two or more variables, often used to identify patterns or trends in data.
  • Experimental survey: A type of survey research that involves manipulating one or more variables to observe the effect on an outcome, often used to test causal hypotheses.
  • Descriptive survey: A type of survey research that describes the characteristics or attributes of a population or phenomenon, often used in exploratory research or to summarize existing data.
  • Diagnostic survey: A type of survey research that assesses the current state or condition of an individual or system, often used in health or organizational research.
  • Explanatory survey: A type of survey research that seeks to explain or understand the causes or mechanisms behind a phenomenon, often used in social or psychological research.
  • Process evaluation survey: A type of survey research that measures the implementation and outcomes of a program or intervention, often used in program evaluation or quality improvement.
  • Impact evaluation survey: A type of survey research that assesses the effectiveness or impact of a program or intervention, often used to inform policy or decision-making.
  • Customer satisfaction survey: A type of survey research that measures the satisfaction or dissatisfaction of customers with a product, service, or experience, often used in marketing or customer service research.
  • Market research survey: A type of survey research that collects data on consumer preferences, behaviors, or attitudes, often used in market research or product development.
  • Public opinion survey: A type of survey research that measures the attitudes, beliefs, or opinions of a population on a specific issue or topic, often used in political or social research.
  • Behavioral survey: A type of survey research that measures actual behavior or actions of individuals, often used in health or social research.
  • Attitude survey: A type of survey research that measures the attitudes, beliefs, or opinions of individuals, often used in social or psychological research.
  • Opinion poll: A type of survey research that measures the opinions or preferences of a population on a specific issue or topic, often used in political or media research.
  • Ad hoc survey: A type of survey research that is conducted for a specific purpose or research question, often used in exploratory research or to answer a specific research question.

Types Based on Methodology

Based on methodology, surveys are divided into two types:

Quantitative Survey Research

Qualitative Survey Research

Quantitative survey research is a method of collecting numerical data from a sample of participants through the use of standardized surveys or questionnaires. The purpose of quantitative survey research is to gather empirical evidence that can be analyzed statistically to draw conclusions about a particular population or phenomenon.

In quantitative survey research, the questions are structured and pre-determined, often utilizing closed-ended questions, where participants are given a limited set of response options to choose from. This approach allows for efficient data collection and analysis, as well as the ability to generalize the findings to a larger population.

Quantitative survey research is often used in market research, social sciences, public health, and other fields where numerical data is needed to make informed decisions and recommendations.
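For illustration only (the item wording and the 5-point scale below are assumptions, not taken from any study cited here), closed-ended responses are typically coded numerically before statistical analysis:

```python
# Hypothetical closed-ended item: "How satisfied are you with the service?"
# Each fixed response option is mapped to a numeric code for analysis.
SCALE = {
    "Very dissatisfied": 1,
    "Dissatisfied": 2,
    "Neutral": 3,
    "Satisfied": 4,
    "Very satisfied": 5,
}

responses = ["Satisfied", "Neutral", "Very satisfied", "Satisfied", "Dissatisfied"]
codes = [SCALE[r] for r in responses]  # [4, 3, 5, 4, 2]

mean_score = sum(codes) / len(codes)
print(codes, mean_score)  # [4, 3, 5, 4, 2] 3.6
```

This numeric coding is what makes the standardized, generalizable analysis described above possible: once every respondent's answers are on the same scale, summary statistics and comparisons across groups are straightforward.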

Qualitative survey research is a method of collecting non-numerical data from a sample of participants through the use of open-ended questions or semi-structured interviews. The purpose of qualitative survey research is to gain a deeper understanding of the experiences, perceptions, and attitudes of participants towards a particular phenomenon or topic.

In qualitative survey research, the questions are open-ended, allowing participants to share their thoughts and experiences in their own words. This approach allows for a rich and nuanced understanding of the topic being studied, and can provide insights that are difficult to capture through quantitative methods alone.

Qualitative survey research is often used in social sciences, education, psychology, and other fields where a deeper understanding of human experiences and perceptions is needed to inform policy, practice, or theory.

Data Analysis Methods

There are several Survey Research Data Analysis Methods that researchers may use, including:

  • Descriptive statistics: This method is used to summarize and describe the basic features of the survey data, such as the mean, median, mode, and standard deviation. These statistics can help researchers understand the distribution of responses and identify any trends or patterns.
  • Inferential statistics: This method is used to make inferences about the larger population based on the data collected in the survey. Common inferential statistical methods include hypothesis testing, regression analysis, and correlation analysis.
  • Factor analysis: This method is used to identify underlying factors or dimensions in the survey data. This can help researchers simplify the data and identify patterns and relationships that may not be immediately apparent.
  • Cluster analysis: This method is used to group similar respondents together based on their survey responses. This can help researchers identify subgroups within the larger population and understand how different groups may differ in their attitudes, behaviors, or preferences.
  • Structural equation modeling: This method is used to test complex relationships between variables in the survey data. It can help researchers understand how different variables may be related to one another and how they may influence one another.
  • Content analysis: This method is used to analyze open-ended responses in the survey data. Researchers may use software to identify themes or categories in the responses, or they may manually review and code the responses.
  • Text mining: This method is used to analyze text-based survey data, such as responses to open-ended questions. Researchers may use software to identify patterns and themes in the text, or they may manually review and code the text.
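As a sketch of the first method listed above, the basic descriptive statistics can be computed with Python's standard `statistics` module (the ratings below are hypothetical survey responses on a 1–5 scale):

```python
import statistics

# Hypothetical 1-5 ratings collected for a single survey item
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

# Summarize the distribution of responses
print("mean:", statistics.mean(ratings))      # 3.9
print("median:", statistics.median(ratings))  # 4.0
print("mode:", statistics.mode(ratings))      # 4 (most frequent response)
print("stdev:", round(statistics.stdev(ratings), 2))  # sample standard deviation
```

Together these four numbers give a quick picture of central tendency and spread, which is usually the first step before applying the inferential or multivariate methods in the list.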

Applications of Survey Research

Here are some common applications of survey research:

  • Market Research: Companies use survey research to gather insights about customer needs, preferences, and behavior. These insights are used to create marketing strategies and develop new products.
  • Public Opinion Research: Governments and political parties use survey research to understand public opinion on various issues. This information is used to develop policies and make decisions.
  • Social Research: Survey research is used in social research to study social trends, attitudes, and behavior. Researchers use survey data to explore topics such as education, health, and social inequality.
  • Academic Research: Survey research is used in academic research to study various phenomena. Researchers use survey data to test theories, explore relationships between variables, and draw conclusions.
  • Customer Satisfaction Research: Companies use survey research to gather information about customer satisfaction with their products and services. This information is used to improve customer experience and retention.
  • Employee Surveys: Employers use survey research to gather feedback from employees about their job satisfaction, working conditions, and organizational culture. This information is used to improve employee retention and productivity.
  • Health Research: Survey research is used in health research to study topics such as disease prevalence, health behaviors, and healthcare access. Researchers use survey data to develop interventions and improve healthcare outcomes.

Examples of Survey Research

Here are some real-time examples of survey research:

  • COVID-19 Pandemic Surveys: Since the outbreak of the COVID-19 pandemic, surveys have been conducted to gather information about public attitudes, behaviors, and perceptions related to the pandemic. Governments and healthcare organizations have used this data to develop public health strategies and messaging.
  • Political Polls During Elections: During election seasons, surveys are used to measure public opinion on political candidates, policies, and issues in real-time. This information is used by political parties to develop campaign strategies and make decisions.
  • Customer Feedback Surveys: Companies often use real-time customer feedback surveys to gather insights about customer experience and satisfaction. This information is used to improve products and services quickly.
  • Event Surveys: Organizers of events such as conferences and trade shows often use surveys to gather feedback from attendees in real-time. This information can be used to improve future events and make adjustments during the current event.
  • Website and App Surveys: Website and app owners use surveys to gather real-time feedback from users about the functionality, user experience, and overall satisfaction with their platforms. This feedback can be used to improve the user experience and retain customers.
  • Employee Pulse Surveys: Employers use real-time pulse surveys to gather feedback from employees about their work experience and overall job satisfaction. This feedback is used to make changes in real-time to improve employee retention and productivity.


Purpose of Survey Research

The purpose of survey research is to gather data and insights from a representative sample of individuals. Survey research allows researchers to collect data quickly and efficiently from a large number of people, making it a valuable tool for understanding attitudes, behaviors, and preferences.

Here are some common purposes of survey research:

  • Descriptive Research: Survey research is often used to describe characteristics of a population or a phenomenon. For example, a survey could be used to describe the characteristics of a particular demographic group, such as age, gender, or income.
  • Exploratory Research: Survey research can be used to explore new topics or areas of research. Exploratory surveys are often used to generate hypotheses or identify potential relationships between variables.
  • Explanatory Research: Survey research can be used to explain relationships between variables. For example, a survey could be used to determine whether there is a relationship between educational attainment and income.
  • Evaluation Research: Survey research can be used to evaluate the effectiveness of a program or intervention. For example, a survey could be used to evaluate the impact of a health education program on behavior change.
  • Monitoring Research: Survey research can be used to monitor trends or changes over time. For example, a survey could be used to monitor changes in attitudes towards climate change or political candidates over time.

When to use Survey Research

There are certain circumstances where survey research is particularly appropriate. Here are some situations where survey research may be useful:

  • When the research question involves attitudes, beliefs, or opinions: Survey research is particularly useful for understanding attitudes, beliefs, and opinions on a particular topic. For example, a survey could be used to understand public opinion on a political issue.
  • When the research question involves behaviors or experiences: Survey research can also be useful for understanding behaviors and experiences. For example, a survey could be used to understand the prevalence of a particular health behavior.
  • When a large sample size is needed: Survey research allows researchers to collect data from a large number of people quickly and efficiently. This makes it a useful method when a large sample size is needed to ensure statistical validity.
  • When the research question is time-sensitive: Survey research can be conducted quickly, which makes it a useful method when the research question is time-sensitive. For example, a survey could be used to understand public opinion on a breaking news story.
  • When the research question involves a geographically dispersed population: Survey research can be conducted online, which makes it a useful method when the population of interest is geographically dispersed.
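
The sample-size question above is often answered with the standard formula for estimating a proportion, n = z²·p(1−p)/e². A minimal sketch, assuming the common defaults of 95% confidence (z = 1.96), worst-case variance (p = 0.5), and a ±5% margin of error:

```python
# Minimum sample size for estimating a proportion: n = z^2 * p(1-p) / e^2.
# Defaults assume 95% confidence (z = 1.96), worst-case variance (p = 0.5),
# and a +/-5% margin of error; these are conventions, not requirements.
import math

def sample_size(z: float = 1.96, p: float = 0.5, margin: float = 0.05) -> int:
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

print(sample_size())             # +/-5% margin
print(sample_size(margin=0.03))  # a tighter +/-3% margin needs more respondents
```

Tightening the margin of error from ±5% to ±3% roughly triples the required sample, which is why "large enough for statistical validity" depends heavily on the precision you need.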

How to Conduct Survey Research

Conducting survey research involves several steps that need to be carefully planned and executed. Here is a general overview of the process:

  • Define the research question: The first step in conducting survey research is to clearly define the research question. The research question should be specific, measurable, and relevant to the population of interest.
  • Develop a survey instrument: The next step is to develop a survey instrument. This can be done using various methods, such as online survey tools or paper surveys. The survey instrument should be designed to elicit the information needed to answer the research question, and should be pre-tested with a small sample of individuals.
  • Select a sample: The sample is the group of individuals who will be invited to participate in the survey. The sample should be representative of the population of interest, and the size of the sample should be sufficient to ensure statistical validity.
  • Administer the survey: The survey can be administered in various ways, such as online, by mail, or in person. The method of administration should be chosen based on the population of interest and the research question.
  • Analyze the data: Once the survey data is collected, it needs to be analyzed. This involves summarizing the data using statistical methods, such as frequency distributions or regression analysis.
  • Draw conclusions: The final step is to draw conclusions based on the data analysis. This involves interpreting the results and answering the research question.
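
The regression analysis mentioned in the "analyze the data" step can be sketched as a simple ordinary-least-squares fit between two numeric survey variables; the paired values below are hypothetical:

```python
# Ordinary least-squares fit of y on x with the standard library.
# x and y are hypothetical paired survey measurements, e.g. daily hours
# online vs. a 1-10 engagement score.
from statistics import mean

x = [1, 2, 3, 4, 5, 6]
y = [3, 4, 4, 6, 7, 8]

x_bar, y_bar = mean(x), mean(y)
# slope = covariance(x, y) / variance(x); intercept follows from the means
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sum(
    (xi - x_bar) ** 2 for xi in x
)
intercept = y_bar - slope * x_bar

print(round(slope, 3), round(intercept, 3))
```

The slope estimates how much the outcome changes per unit of the predictor; dedicated statistics libraries add standard errors and significance tests on top of this basic fit.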

Advantages of Survey Research

There are several advantages to using survey research, including:

  • Efficient data collection: Survey research allows researchers to collect data quickly and efficiently from a large number of people. This makes it a useful method for gathering information on a wide range of topics.
  • Standardized data collection: Surveys are typically standardized, which means that all participants receive the same questions in the same order. This ensures that the data collected is consistent and reliable.
  • Cost-effective: Surveys can be conducted online, by mail, or in person, which makes them a cost-effective method of data collection.
  • Anonymity: Participants can remain anonymous when responding to a survey. This can encourage participants to be more honest and open in their responses.
  • Easy comparison: Surveys allow for easy comparison of data between different groups or over time. This makes it possible to identify trends and patterns in the data.
  • Versatility: Surveys can be used to collect data on a wide range of topics, including attitudes, beliefs, behaviors, and preferences.

Limitations of Survey Research

Here are some of the main limitations of survey research:

  • Limited depth: Surveys are typically designed to collect quantitative data, which means that they do not provide much depth or detail about people’s experiences or opinions. This can limit the insights that can be gained from the data.
  • Potential for bias: Surveys can be affected by various biases, including selection bias, response bias, and social desirability bias. These biases can distort the results and make them less accurate.
  • Limited validity: Surveys are only as valid as the questions they ask. If the questions are poorly designed or ambiguous, the results may not accurately reflect the respondents’ attitudes or behaviors.
  • Limited generalizability: Survey results are only generalizable to the population from which the sample was drawn. If the sample is not representative of the population, the results may not be generalizable to the larger population.
  • Limited ability to capture context: Surveys typically do not capture the context in which attitudes or behaviors occur. This can make it difficult to understand the reasons behind the responses.
  • Limited ability to capture complex phenomena: Surveys are not well-suited to capture complex phenomena, such as emotions or the dynamics of interpersonal relationships.

The following is an example of a survey sample:

Welcome to our Survey Research Page! We value your opinions and appreciate your participation in this survey. Please answer the questions below as honestly and thoroughly as possible.

1. What is your age?

  • A) Under 18
  • G) 65 or older

2. What is your highest level of education completed?

  • A) Less than high school
  • B) High school or equivalent
  • C) Some college or technical school
  • D) Bachelor’s degree
  • E) Graduate or professional degree

3. What is your current employment status?

  • A) Employed full-time
  • B) Employed part-time
  • C) Self-employed
  • D) Unemployed

4. How often do you use the internet per day?

  • A) Less than 1 hour
  • B) 1-3 hours
  • C) 3-5 hours
  • D) 5-7 hours
  • E) More than 7 hours

5. How often do you engage in social media per day?

6. Have you ever participated in a survey research study before?

7. If you have participated in a survey research study before, how was your experience?

  • A) Excellent
  • E) Very poor

8. What are some of the topics that you would be interested in participating in a survey research study about?

(Open-ended response)

9. How often would you be willing to participate in survey research studies?

  • A) Once a week
  • B) Once a month
  • C) Once every 6 months
  • D) Once a year

10. Any additional comments or suggestions?

Thank you for taking the time to complete this survey. Your feedback is important to us and will help us improve our survey research efforts.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer



The best user research questions and how to ask them

User Research

Sep 1, 2022


To get the right insights, you need to ask the right questions. Here’s the best user research questions to start gathering feedback right away.

Lorelei Bowman


Content Editor at Maze

Knowing the right user research questions to ask is vital to the success of your UX research. Research is an invaluable source of input for product development, but before you can get started, you need to make sure the questions lined up will get the insights you need, without influencing the data.

Think of this article as your guide to all things user research questions: what to ask, how to ask it, and how to create your own questions. Let’s get started.

What kind of user research questions are there?

The kind of questions you ask will depend on your research goals—are you looking to gather user feedback, or find out if a particular feature is (or would be) useful? Are you trying to discover what problems bother your user, or whether they’d prefer one solution over another?

Before planning your questions and diving head-first into research, look at your overarching research plan and objectives. Consider this on a project-by-project basis, as your end questions will be drastically different depending on where you are in the product development process. For instance, if you’re in early product discovery, you may want to discover user intent and pain points. Or, if you’re working on a high-fidelity prototype, you might want to see how users interact with the prototype, and how easy it is to use. Asking questions at different stages of your process is a big part of continuous product discovery and ensuring your product remains the best it can be.

💡 If you’re looking to understand the question formats used in surveys or user questionnaires, take a look at our guide on how to write survey questions.

User research questions can be categorized in many ways—by objective, research scenario, or point in the product journey, to name a few. Since different questions may apply in multiple situations, we’re going to consider questions organized by their research focus.

Questions for user research can typically be categorized three ways:

  • Questions about the problem e.g., what are users’ pain points, what task are they trying to complete, what solution do they want
  • Questions about the people e.g., who they are, how they use products, what they want to accomplish, how likely are they to use the product
  • Questions about the product e.g., how users feel about content or design, can they navigate the product, how usable is it, what features do they like or dislike

Now we know what kinds of questions there are, let’s delve into the value of pre-made questions, and some examples of each question type.

Using pre-made user research questions

To elevate your research, you can opt to use pre-existing questions from a question bank. As with all research tools, there are many benefits to this, including saving time and effort, and having many questions to choose from. Using a question bank also ensures questions are always carefully considered, easily understandable for users, and unbiased.

Meet the Maze question bank

An open-source question repository for product teams, our question bank is aimed at helping you ask the best user research questions to gather insight that will help build truly user-centered products.


A good question bank will be multifunctional, with questions you can use when running moderated or unmoderated testing, conducting generative or evaluative research, or gathering quantitative or qualitative data, so you have one place to go for all your user research question needs.

🚀 Boost your research with Maze templates

If you’re a Maze user, you can also use the question bank as a handy companion to fuel your team’s research with Maze—check out the templates column and question block suggestions for maximum efficiency when building mazes.

Ultimately, a pre-made question bank can help save you a lot of time, and allow you to focus on conducting the research and processing analysis.

If you’d like to create your own questions, let’s get into the different user research question types, what questions they include, and how to ask them.


Questions about the problem

To support product and design decisions behind any solution, you need to be familiar with the problem you (and your users) are trying to solve. Whether you’re starting product discovery and want to understand user pain points, or you’re testing new features and want to gauge which will be most popular— you can’t begin working on a solution until you’ve honed in on what the problem is.

What’s bothering your users? How can you make their lives easier? What’s their key challenge, and what are they trying to achieve that’s being blocked by that problem?

Only once you’ve narrowed down a key problem statement can you translate solutions into the user experience, and identify opportunities for product development.

Questions focusing on the problem you’re trying to solve are key in product discovery stages and concept validation. The reason for using a particular product or feature may vary between users—consider Instagram’s Explore tab: it could be used to find friends, connect with like-minded people, or find inspiration.

Questions that can help you hone in on the problem at hand include:

  • What problems do you face when you do [task]?
  • Please complete this sentence: "The most frustrating part of trying to complete [task] is…”
  • What is your main goal when trying to complete [task]?
  • What is your personal measure of success when it comes to [goal]?
  • How are you currently solving [problem]?
  • Describe your ideal solution to [problem]

Questions about the people

Understanding the problem you’re trying to solve goes hand in hand with the people who are facing this problem—who they are and how they think, how they adopt and use products, their wants, needs and dislikes.

Put simply, there’s no point building a product if it solves the problem your user is having—but not in the way they wanted it to.

To really understand how your users think, and the way they approach a product, you need to understand their mental models. Broadly speaking, a mental model determines how someone forms their thinking process—it might impact the way they approach a problem, the kind of solution they’d like, and how they expect certain features to work.

UX research methods like card sorting are a good way to understand people’s mental models, but you can also gather this insight through thoughtful user interviews or research questions.

User-focused questions also cover understanding attitudes towards product adoption, use habits and circumstances, pricing models, and demographics.

Some example questions you could ask to learn more about your target users include:

  • Are there any products that you prefer to use to do [task]?
  • What does your job look like in your company?
  • How do you prefer to be trained on new products?
  • How much would you pay for [product]?
  • Please describe who you might collaborate with when doing [task]
  • How often do you do [task]?

Questions about the product

Once you understand the problem your product will solve, and the people who’ll be using it, it’s time to circle back to the product itself. Questions about the product may be about its usability, what features you’re including, how users feel about content or design, and whether it does what they need it to.

Just like all research, it’s a good idea to ask product-related questions multiple times during the research phase, as both questions and answers will vary depending on what development stage you’re at—from prioritizing which feature to focus on developing first, to assessing how navigable a certain product section is, or reviewing the appeal of specific design aspects.

To gain a well-rounded understanding of how users find using your product or feature, usability testing is imperative. And, if you’re trying to nail down product navigation and identify any bumps in the user journey, tree testing is the research method of choice.

Whatever your focus, questions relating to the product are useful in both evaluative and generative research, and critical for creating a user-centered, solution-focused product.

Sample questions you can use to learn more about the product and features could include:

  • How did you find the language (including but not limited to copy, phrasing, tone) used on the website?
  • What’s the single most important thing we could do to make [product] better?
  • On a scale of 1-10, how was your experience when [task]?
  • Was the product navigation as expected?
  • If you could change one thing about the design, what would it be, and why?
  • Thinking about what [product] offers, which of the following best describes your feelings about it?

🤔 To dive into the questions you should be asking during usability testing, check out how to ask effective usability testing questions.

Regardless of what questions you ask, it’s worth bearing in mind that this information should be considered a guide, not a rule—as sometimes what people think they’ll do is not what they actually do in practice. This is why it’s so important to continue research and testing in all stages of product development, so you can always be working from the most reliable and up-to-date insight.

Guidelines for crafting the right user research questions

Research questions set the standard of the data you’ll gather with them, so it’s crucial to properly craft each question to maximize insight and ensure accurate results.

Using a pre-made question bank is one way to keep questions effective, but if you’re writing your own questions, bear in mind that everything from the language you use to the structure or format of questions can influence the user’s answer.

The best questions for user interviews and research are clear, precise, and unbiased. Let’s go over some top tips for crafting questions that fulfill this.


Stay neutral: avoid leading questions

One of the most important points when it comes to any research is being a neutral party, which means removing cognitive bias from your work. Research isn’t helpful if it’s biased, so ensure your questions are as impartial as possible—after all, just because you like Concept A over Concept B, doesn’t mean everyone will.

The key to staying neutral is avoiding leading questions where you subconsciously favor one thing over another, or plant an opinion or idea in the user’s mind, such as “How would you use concept A?”—this assumes they preferred concept A, which they may not have. Instead, try asking which concept they would use, followed by how they would use it.

Take it one question at a time

The majority of us think best when our minds are clear and able to focus on one thing, so avoid bombarding research participants with multiple questions phrased together.

Rather than asking a question like “What did you think about the design, copy and layout of the page?”, ask individually about the design, copy, and layout. Otherwise, you risk users merging their thoughts into one answer, when in fact they may feel very differently about each element.

Of course, some questions lend themselves to being combined (e.g., “Which concept did you prefer and why?”), but it’s best to keep things separate when possible, and ask “Why?” in follow-up questions, to allow users space to think and form individual answers for each question.

Ask open-ended questions

Similar to ensuring questions are unbiased, it’s also a good idea to ask open-ended questions—that is, to avoid questions which result in simply a ‘yes’ or ‘no’ answer.

The benefit of open-ended questions is that they give participants an opportunity to expand on their answer, work through their experience, and share details with you that may otherwise be missed. Consider that, while asking “Did you like the product?” may answer whether a user liked it, you’ll be left wondering what it is they like about it. Instead, try framing questions in a way that provides space for additional information, e.g. “What did you think about the product?”.

Pro tip ✨ If you do ask closed-ended questions, always keep follow-up questions on hand to dig deeper and gather extra insight from your participants.

Help users find their own voice

The language we use is incredibly powerful. Used well, words can move us, sway our opinions, educate us, and more.

By helping your research participants to find their own voice, you can unlock powerful statements and user insights which will truly impact your product. Formatting questions with the user at the center—using ‘you’ and asking emotive questions—builds empathy with the user and encourages them to find and share their own opinions through honest answers.

Ask questions you think you know the answer to

Our final question-crafting tip is to use research questions to test and validate your own assumptions and opinions. Ask questions you think you know the answer to—if you believe all users will prefer one new feature over the other, see if you’re right. If you think a certain design element works better on a different page, ask research participants to determine where they prefer it.

As with any research, while you may be user-adjacent, you are not your users. You are the expert in your product; they are the expert in using your product. Trust their opinions, and use their knowledge and experience to confirm your suspicions, or disprove them. Either way, you gain valuable insights.

User research is as effective as the questions you ask

Whether you’re investigating user preferences or conducting usability testing, research is only as effective as the questions you ask—and how you ask them.

Focus on questions that fit your research objectives, phrase your questions in the best way possible, and work to build empathy with your user; you’ll be able to gather valuable insights in no time.

Frequently asked questions and user research questions

What makes a good user research question?

A good research question is open-ended, unbiased, clear, and precise. It helps research participants share their thoughts, feedback, and opinions with researchers, without influencing or limiting their responses.

What type of user research questions are there?

User research questions can broadly be broken down into three categories:

  • Questions about the problem
  • Questions about the people
  • Questions about the product

How do you create a user research question?

There are several ways to create a user research question: you can either write your own question, or select premade questions from an existing research question bank.

If you choose to write your own research questions, it’s important to keep them clear and precise above all else—focus on asking questions that encourage users to open up, share additional information, and speak honestly.



Is College Worth It?

As economic outcomes for young adults with and without degrees have improved, Americans hold mixed views on the value of college

Table of contents

  • Labor force trends and economic outcomes for young adults
  • Economic outcomes for young men
  • Economic outcomes for young women
  • Wealth trends for households headed by a young adult
  • The importance of a four-year college degree
  • Getting a high-paying job without a college degree
  • Do Americans think their education prepared them for the workplace?
  • Is college worth the cost?
  • Acknowledgments
  • The American Trends Panel survey methodology
  • Current Population Survey methodology
  • Survey of Consumer Finances methodology


Pew Research Center conducted this study to better understand public views on the importance of a four-year college degree. The study also explores key trends in the economic outcomes of young adults among those who have and have not completed a four-year college degree.

The analysis in this report is based on three data sources. The labor force, earnings, hours, household income and poverty characteristics come from the U.S. Census Bureau’s Annual Social and Economic Supplement of the Current Population Survey. The findings on net worth are based on the Federal Reserve’s Survey of Consumer Finances.

The data on public views on the value of a college degree was collected as part of a Center survey of 5,203 U.S. adults conducted Nov. 27 to Dec. 3, 2023. Everyone who took part in the survey is a member of Pew Research Center’s American Trends Panel (ATP), an online survey panel that is recruited through national, random sampling of residential addresses. Address-based sampling ensures that nearly all U.S. adults have a chance of selection. The survey is weighted to be representative of the U.S. adult population by gender, race, ethnicity, partisan affiliation, education and other categories. Read more about the ATP’s methodology.

Here are the questions used for this report, along with responses, and the survey’s methodology.

Young adults refers to Americans ages 25 to 34.

Noncollege adults include those who have some college education as well as those who graduated from high school but did not attend college. Adults who have not completed high school are not included in the analysis of noncollege adults. About 6% of young adults have not completed high school. Trends in some labor market outcomes for those who have not finished high school are impacted by changes in the foreign-born share of the U.S. population. The Census data used in this analysis did not collect information on nativity before 1994.

Some college includes those with an associate degree and those who attended college but did not obtain a degree.

The some college or less population refers to adults who have some college education, those with a high school diploma only and those who did not graduate high school.

A full-time, full-year worker works at least 50 weeks per year and usually 35 hours a week or more.

The labor force includes all who are employed and those who are unemployed but looking for work.

The labor force participation rate is the share of a population that is in the labor force.

Young adults living independently refers to those who are not living in the home of either of their parents.

Household income is the sum of incomes received by all members of the household ages 15 and older. Income is the sum of earnings from work, capital income such as interest and dividends, rental income, retirement income, and transfer income (such as government assistance) before payments for such things as personal income taxes, Social Security and Medicare taxes, union dues, etc. Non-cash transfers such as food stamps, health benefits, subsidized housing and energy assistance are not included. As household income is pretax, it does not include stimulus payments or tax credits for earned income and children/dependent care.

Net worth, or wealth, is the difference between the value of what a household owns (assets) and what it owes (debts).

All references to party affiliation include those who lean toward that party. Republicans include those who identify as Republicans and those who say they lean toward the Republican Party. Democrats include those who identify as Democrats and those who say they lean toward the Democratic Party.
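Several of the definitions above are simple shares. As a rough sketch (with invented counts, not Pew estimates), the labor force participation rate defined above can be computed as:

```python
# Hypothetical illustration of the labor force participation rate defined
# above: the labor force (employed plus unemployed-but-looking) as a share
# of the population. Counts are invented, not Pew data.

def labor_force_participation_rate(employed: int, unemployed_looking: int,
                                   population: int) -> float:
    labor_force = employed + unemployed_looking
    return labor_force / population

rate = labor_force_participation_rate(employed=700, unemployed_looking=50,
                                      population=1000)
print(f"{rate:.1%}")  # 75.0%
```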

At a time when many Americans are questioning the value of a four-year college degree, economic outcomes for young adults without a degree are improving.

Pie chart: Only 22% of U.S. adults say the cost of college is worth it even if someone has to take out loans

After decades of falling wages, young U.S. workers (ages 25 to 34) without a bachelor’s degree have seen their earnings increase over the past 10 years. Their overall wealth has gone up too, and fewer are living in poverty today.

Things have also improved for young college graduates over this period. As a result, the gap in earnings between young adults with and without a college degree has not narrowed.

The public has mixed views on the importance of having a college degree, and many have doubts about whether the cost is worth it, according to a new Pew Research Center survey.

  • Only one-in-four U.S. adults say it’s extremely or very important to have a four-year college degree in order to get a well-paying job in today’s economy. About a third (35%) say a college degree is somewhat important, while 40% say it’s not too or not at all important.
  • Roughly half (49%) say it’s less important to have a four-year college degree today in order to get a well-paying job than it was 20 years ago; 32% say it’s more important, and 17% say it’s about as important as it was 20 years ago.
  • Only 22% say the cost of getting a four-year college degree today is worth it even if someone has to take out loans. Some 47% say the cost is worth it only if someone doesn’t have to take out loans. And 29% say the cost is not worth it.

These findings come amid rising tuition costs and mounting student debt. Views on the cost of college differ by Americans’ level of education. But even among four-year college graduates, only about a third (32%) say college is worth the cost even if someone has to take out loans – though they are more likely than those without a degree to say this.

Four-year college graduates (58%) are much more likely than those without a college degree (26%) to say their education was extremely or very useful in giving them the skills and knowledge they needed to get a well-paying job. (This finding excludes the 9% of respondents who said this question did not apply to them.)

Chart: 4 in 10 Americans say a college degree is not too or not at all important in order to get a well-paying job

Views on the importance of college differ widely by partisanship. Republicans and Republican-leaning independents are more likely than Democrats and Democratic leaners to say:

  • It’s not too or not at all important to have a four-year college degree in order to get a well-paying job (50% of Republicans vs. 30% of Democrats)
  • A college degree is less important now than it was 20 years ago (57% vs. 43%)
  • It’s extremely or very likely someone without a four-year college degree can get a well-paying job (42% vs. 26%)

At the same time that the public is expressing doubts about the value of college, a new Center analysis of government data finds young adults without a college degree are doing better on some key measures than they have in recent years.

A narrow majority of workers ages 25 to 34 do not have a four-year college degree (54% in 2023). Earnings for these young workers mostly trended downward from the mid-1970s until roughly a decade ago.

Outcomes have been especially poor for young men without a college degree. Other research has shown that this group saw falling labor force participation and sagging earnings starting in the early 1970s, but the last decade has marked a turning point.

This analysis looks at young men and young women separately because of their different experiences in the labor force.

Trends for young men

  • Labor force participation: The share of young men without a college degree who were working or looking for work dropped steadily from 1970 until about 2014. Our new analysis suggests things have stabilized somewhat for this group over the past decade. Meanwhile, labor force participation among young men with a four-year degree has remained mostly flat.
  • Full-time, full-year employment: The share of employed young men without a college degree who are working full time and year-round has varied somewhat over the years – trending downward during recessions. It’s risen significantly since the Great Recession of 2007-09, with the exception of a sharp dip in 2021 due to the COVID-19 pandemic. For employed young men with a college degree, the share working full time, full year has remained more stable over the years.

Chart: Earnings of young men without a college degree have increased over the past 10 years

  • Median annual earnings: Since 2014, earnings have risen for young men with some college education and for those whose highest attainment is a high school diploma. Even so, earnings for these groups remain below where they were in the early 1970s. Earnings for young men with a bachelor’s degree have also trended up, for the most part, over the past 10 years.
  • Poverty: Among young men without a college degree who are living independently from their parents, the share in poverty has fallen significantly over the last decade. For example, 12% of young men with a high school diploma were living in poverty in 2023, down from a peak of 17% in 2011. The share of young men with a four-year college degree who are in poverty has also fallen and remains below that of noncollege young men.

Trends for young women

  • Labor force participation: The shares of young women with and without a college degree in the labor force grew steadily from 1970 to about 1990. Among those without a college degree, the share fell after 2000, and the drop-off was especially sharp for young women with a high school diploma. Since 2014, labor force participation for both groups of young women has increased.
  • Full-time, full-year employment: The shares of employed young women working full time and year-round, regardless of their educational attainment, have steadily increased over the decades. There was a decline during and after the Great Recession and again (briefly) in 2021 due to the pandemic. Today, the shares of women working full time, full year are the highest they’ve ever been across education levels.

Chart: Earnings of young women without a college degree have trended up in the past decade

  • Median annual earnings: Median earnings for young women without a college degree were relatively flat from 1970 until about a decade ago. These women did not experience the steady decline in earnings that noncollege young men did over this period. By contrast, earnings have grown over the decades for young women with a college degree. In the past 10 years, earnings for women both with and without a college degree have risen.
  • Poverty: As is the case for young men without a college degree, the share of noncollege young women living in poverty has fallen substantially over the past decade. In 2014, 31% of women with a high school diploma who lived independently from their parents were in poverty. By 2023, that share had fallen to 21%. Young women with a college degree remain much less likely to be in poverty than their counterparts with less education.


ABOUT PEW RESEARCH CENTER  Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions. It is a subsidiary of  The Pew Charitable Trusts .

Copyright 2024 Pew Research Center

Return-to-Office Orders: A Survey Analysis of Employment Impacts

How did employers expect return-to-office (RTO) orders to affect employment? Were those expectations correct? We use special questions from the Richmond Fed business surveys to shed light on these questions. Overall, RTO orders were expected to reduce employment, but there was both substantial uncertainty and heterogeneity in expectations. Some employers even expected that RTO would increase employment. Ex post, employers believe RTO orders had a muted effect on employment. We find that the service sector was more likely to both issue RTO orders and expect and experience a reduction in employment.

The COVID-19 pandemic changed the way that both employers and employees think about the location of work. 1 The advent of remote work en masse in 2020 has been followed by a gradual implementation of requiring workers to work from the office, at least for some of their workweek. These forced return-to-office (RTO) orders have come with controversy: Many employers have implemented these policies, while many employees have resisted.

In this article, we attempt to shed light on the effects of RTO by reporting on special questions we asked in the March Richmond Fed business surveys . Specifically, these questions shed light on both the anticipated and realized employment outcomes of RTO orders from the employer's perspective. We find that uncertainty in the decision-making process was prevalent, but also that realized outcomes were generally muted. RTO did have an expected and actual negative effect on employment, but only in some sectors and for some employers. For others, RTO was a means of increasing employment. Our results highlight the large uncertainty in the pandemic, the heterogeneity of firms and the large heterogeneity of workers.

Why Examine the Impacts of RTO Orders?

This survey builds on a recent literature investigating the implications of remote work for workers, businesses and local economies . Uniquely, it attempts to discern how business leaders anticipated RTO policies would impact their firms as well as the actual impact on employment within their firms. Although there is work evaluating the benefits and costs to employers in terms of productivity or labor/non-labor costs, 2 there has been little work to understand the firm-by-firm implication of articulating and enforcing an RTO order.

Research indicates that hybrid options are highly valued by employees , 3 but how many separations can be attributed to an RTO policy? There is evidence that managers value in-person work more than employees, 4 but does that result in actual separations when RTO orders are implemented? Our results suggest the effects of these policies were muted.

There is also evidence of wide variation in employee hybrid-work preferences and in their willingness to pay for the option to work from home 5 as well as evidence that the value workers place on the "amenity" of remote or hybrid work has implications for aggregate wage changes in the macroeconomy. 6 Our work indicates this heterogeneity in preferences may have dampened the effect of RTO orders on employment. Our results are consistent with a literature that is still relatively mixed about the net effect on employers and workers of remote or on-site policies.

Methodology

The Federal Reserve Bank of Richmond has surveyed CEOs and other business leaders across the Fifth Federal Reserve District 7 for almost 30 years, currently gathering around 200-250 responses per month. The survey panel underweights the smallest firms and, due to the history of the survey, manufacturing firms make up about one-third of respondents even though they make up a much smaller share of establishments in the Fifth District or the nation.

In addition to a series of questions about variables such as demand, employment and prices, respondents are commonly asked a set of ad hoc questions. Here, we focus on a set of questions asked in March 2024 regarding the extent to which respondents articulated and enforced a mandatory RTO policy and what they expected upon its implementation. Emily Corcoran reported on employers' general expectations for on-site work and how those expectations have changed. But here, we focus on business leaders' expectations of RTO policy effects, providing insight into the anticipated and unanticipated employment effects of RTO orders. We begin by assessing whether the establishment implemented RTO. These results are tabulated in Table 1.

Overall, explicit RTO orders were relatively rare, with only 20 percent of respondents articulating RTO orders in the last three years. This small percentage is partly because 37 percent of respondents — many of them manufacturing firms — were fully on-site before the end of 2020, and an additional 26 percent of respondents said RTO wasn't applicable for their companies. 8 Of the remaining companies, there is a roughly equal split between firms that have an explicit RTO policy (20 percent of the full sample) and those that do not (16 percent of the full sample).

We asked these 20 percent of employers about the expected consequences of issuing RTO orders. Did they expect workers to quit because of these policies? Were they sure about the effect on employment? We also asked employers about their assessment of realized outcomes. Did workers quit as anticipated? Did RTO help the firm recruit workers?

What Did Employers Expect, and What Actually Happened?

Perhaps surprisingly, we found two-thirds of employers expected no impact on (net) employment from RTO orders, while 16 percent were too unsure of the impact to answer (Table 2). Among the 18 percent that expected some impact, the anticipated outcome was split between those that expected a decrease in employment (11 percent) and those that expected an increase (7 percent).

Why might employment increase? One possibility derives from employees feeling more connected to their co-workers, with greater mentoring opportunities, when in the office. 9 This could reduce quitting and improve hiring, as one survey respondent reported: "...the employees that [formerly] chose to work remotely decided that they were more productive in the office. We are [now] 90+ percent in the office."

Additionally, RTO orders have often been hybrid, 10 potentially allowing the benefits of office culture to be obtained without sacrificing all of the flexibility associated with remote work.

We also asked employers about their evaluation of outcomes, and the results are given in Table 3. Here, a greater percentage reported no impact (82 percent), while 4 percent assessed that RTO had decreased employment, and 4 percent assessed that RTO had actually increased employment. (Nine percent were still unsure.)

Sectoral-level analysis reveals that employment impacts (both expected and realized) were concentrated in the service sector. In manufacturing, no firms concretely expected a change in employment (though some were unsure), and ex post they believe RTO did not cause them to lose workers. In services, however, only 59 percent expected no impact, while 16 percent expected a negative impact on employment. Ex post, the impact on employment was less than expected.

While our analysis is suggestive, there are a few limitations. Foremost, our effective sample size was small, meaning some of these results could be driven by sampling error. Second, it has been years since some employers implemented RTO policies, so their memories of their expectations could be inaccurate. Third, our survey did not control for any other firm changes — such as changes in wages or product demand — that could confound our findings. Fourth, although our findings provide insight into net employment gains and losses, they do not speak to hiring and firing separately. 11

With these caveats in mind, however, our results show that RTO — while still a common topic of conversation — is not necessarily important to employers' and workers' employment decisions. Concerns about employment effects ex ante mostly did not materialize. Employment effects that did materialize were concentrated in services and resulted in a net gain of employees in some cases, rather than a loss.

Grey Gordon is a senior economist and Sonya Ravindranath Waddell is a vice president and economist, both in the Research Department of the Federal Reserve Bank of Richmond. The authors thank Jason Kosakow for helping to develop and execute the survey and for providing the tabulations underlying this analysis and thank RC Balaban, Zach Edwards and Claudia Macaluso for providing feedback on an earlier draft.

See, for example, the 2023 paper " The Evolution of Work From Home " by Jose Maria Barrero, Nicholas Bloom and Steven Davis.

See, for example, the 2024 working paper " The Big Shift in Working Arrangements: Eight Ways Unusual " by Steven Davis.

See, for example, the 2023 working paper " How Hybrid Working From Home Works Out " by Nicholas Bloom, Ruobing Han and James Liang.

See the previously cited paper " How Hybrid Working From Home Works Out ."

See, for example, the 2021 working paper " Why Working From Home Will Stick " by Jose Maria Barrero, Nicholas Bloom and Steven Davis.

See, for example, the 2024 working paper " Job Amenity Shocks and Labor Reallocation (PDF) " by Sadhika Bagga, Lukas Mann, Aysegul Sahin and Giovanni Violante.

The Fifth District comprises the District of Columbia, Maryland, North Carolina, South Carolina, Virginia and most of West Virginia.

Those who answered "not applicable" are presumably firms where work is necessarily done in person.

See, for example, the 2023 article " About a Third of U.S. Workers Who Can Work From Home Now Do So All the Time " by Kim Parker.

The previously cited article by Emily Corcoran noted that 38 percent of firms are in the office in between one and four days a week.

See the 2022 article " Changing Recruiting Practices and Methods in the Tight Labor Market " by Claudia Macaluso and Sonya Ravindranath Waddell for an analysis of how hiring practices have changed in the tight labor market that has prevailed since 2020.

This article may be photocopied or reprinted in its entirety. Please credit the authors, source, and the Federal Reserve Bank of Richmond and include the italicized statement below.

Views expressed in this article are those of the authors and not necessarily those of the Federal Reserve Bank of Richmond or the Federal Reserve System.


  • Open access
  • Published: 16 May 2024

Experiences of UK clinical scientists (Physical Sciences modality) with their regulator, the Health and Care Professions Council: results of a 2022 survey

  • Mark McJury 1  

BMC Health Services Research, volume 24, Article number: 635 (2024)


In healthcare, regulation of professions is an important tool to protect the public. With increasing regulation, however, professions find themselves under increasing scrutiny. Recently there has also been considerable concern with regulator performance, with high-profile reports pointing to cases of inefficiency and bias. While reports have often focused on large staff groups, such as doctors, there is a dearth of data in the literature on the experiences of smaller professional groups, such as Clinical Scientists, with their regulator, the Health and Care Professions Council.

This article reports the findings of a survey from Clinical Scientists (Physical Sciences modality) about their experiences with their regulator, and their perception of the quality and safety of that regulation.

Between July–October 2022, a survey was conducted via the Medical Physics and Engineering mail-base, open to all medical physicists & engineers. Questions covered typical topics of registration, communication, audit and fitness to practice. The questionnaire consisted of open and closed questions. Likert scoring, and thematic analysis were used to assess the quantitative and qualitative data.

Of 146 responses recorded, analysis was based on 143 respondents. Overall survey sentiment was significantly more negative than positive in terms of regulator performance (negative responses 159; positive 106; significant at p < 0.001). Continuous Professional Development audit was rated a median of 4; other topics were rated as neutral (fitness to practice, policies & procedures); and some as poor (value).

Conclusions

The Clinical Scientist (Physical Sciences) professional registrants rated the performance of their regulator more negatively than other reported assessments (by the Professional Standards Authority). Survey respondents suggested a variety of performance aspects, such as communication and fitness to practice, would benefit from improvement. Indications from this small dataset, suggest a larger survey of HCPC registrants would be useful.


In healthcare, protection of patients and the public is a core principle. Part of the framework of protections is the regulation of professions [1]. This aims to mitigate risks such as that posed by bogus practitioners – insufficiently trained people acting as fully trained professional practitioners, see Fig. 1.

Figure 1: Recent UK media report on a bogus healthcare practitioner [2]

Regulation of professions ensures that titles (e.g. Doctor, Dentist, Clinical Scientist) are protected in law. The protected title means someone may only use that title, if they are on the national register, managed by the regulator – the Health and Care Professions Council (HCPC). It is a criminal offence to use a protected title if you are not entitled to do so [ 3 ]. There are a large number of regulators in healthcare – see Table  1 . Most of the regulators manage a register for one profession, except the HCPC which regulates 15 professions.

To be included on the register, a candidate must meet the regulator's criteria for knowledge and training, and a key element of remaining on it is showing evidence of continuous professional development (CPD). Being on the register ensures that a practitioner has met the appropriate level of competence and professional practice.

For many healthcare workers, being on the HCPC register is a compulsory requirement to be appointable to a post. They must pay the necessary annual fees and abide by the policies drawn up by the regulator, and professions generally have no choice of regulator – these are statutory bodies, set up by government.

Recently, there has been considerable public dissatisfaction with the activity & performance of some regulators, notably Ofwat [ 4 ], and Ofgem [ 5 ]. Healthcare workers should expect a high level of professionalism, efficiency, and integrity from a regulator, as the regulator’s performance directly affects staff and public safety.

In terms of the regulation of UK Clinical Scientists, there is a dearth of data regarding experiences with the HCPC and views on the quality of regulation provided.

Findings are reported here from a 2022 survey of Medical Physicists and Engineers (one of the 16 job roles or ‘modalities’ under the umbrella of Clinical Scientist). The research aim was to assess experiences, and the level of ‘satisfaction’ with the regulator. For the remainder of this report, the term Clinical Scientist will be taken to mean Clinical Scientist (Medical Physicist/Engineer). The survey was designed to gather & explore data about opinions and experiences regarding several key aspects of how the HCPC performs its role, and perception of the quality & safety of regulation delivered.

A short survey questionnaire was developed, with questions aimed to cover the main regulatory processes, including registration & renewal, CPD audit, and fitness-to-practice. There were also questions relating more generally to HCPC’s performance as an organisation, e.g. handling of personal data. Finally, participants were asked to rate the HCPC’s overall performance and what they felt was the ‘value’ of regulation. The survey questions are listed in the Supplementary file along with this article.

Questions were carefully worded, and there was a balance of open and closed questions. A five-point Likert score was used to rate closed questions. The survey was anonymous, and the questions were not compulsory, allowing respondents to skip irrelevant or difficult questions. The survey also aimed to be as short & concise as possible, to minimise the burden on busy clinical staff and hopefully maximise the response rate. There were a small number of questions at the start of the survey to collect basic demographics on the respondents (role, grade, UK nation, etc.).

The survey was advertised on the online JISC-hosted UK Medical Physics and Engineering (UKMPE) mail-base. This offered convenient access to the majority of Clinical Scientists. The survey was advertised twice, to allow for potential work absence, holiday/illness etc. It was active from the end of July 2022 until October 2022, when responses appeared to saturate.

The data is a combination of quantitative rating scores and qualitative text responses. This allows a mixed-methods approach to data analysis, combining quantitative assessment of the Likert scoring and (recursive) thematic analysis of the free-text answers [6]. Thematic analysis is a standard tool, and has been reported as useful & appropriate for assessing experiences, thoughts, or behaviours in a dataset [7]. The survey questions addressed the main themes, but further themes were identified using an inductive, data-driven approach. Qualitative data analysis (QDA) was performed using NVivo (QSR International).
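As a hedged sketch of the quantitative side of this mixed-methods approach, the per-question Likert summaries reported later (medians and means) could be computed along these lines. The scores here are invented for illustration, not the actual survey data:

```python
from statistics import mean, median

# Illustrative five-point Likert responses for one closed question
# (invented values; not the actual HCPC survey data).
scores = [1, 2, 2, 3, 2, 4, 2, 1, 3, 2, 2]

print("median:", median(scores))        # central tendency, as reported per question
print("mean:", round(mean(scores), 2))
```

The thematic coding of the free-text answers was done in NVivo and is not reproduced here.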

Two survey questions attempted to obtain an overall perception of HCPC's performance: the direct one (Q12), and a further question, 'Would you recommend HCPC as a regulator…?'. This latter question perhaps doesn't add anything more, and a few respondents suggested it was a slightly awkward question, given that professions do not have a choice of regulator – so it has been excluded from the analysis.

Study conduct was performed in accordance with relevant guidelines and regulations [ 8 , 9 ]. Before conducting the survey of Clinical Scientists, the survey was sent to their professional body, the Institute of Physics and Engineering in Medicine (IPEM). The IPEM Professional Standards Committee reviewed the survey questions [ 10 ]. Written informed consent was obtained from participants.

Data analysis

Data was collected via an MS form into a single Excel sheet and stored on a secure network drive. The respondents were anonymised, and the data checked for errors. The data was then imported into NVivo v12.

Qualitative data was manually coded for themes, and auto-coded for sentiment. An inductive approach was used to develop themes.

The sample size of responses allowed the use of simple parametric tests to establish the level of statistical significance.

Survey demographics

A total of 146 responses were collected. Two respondents noted that they worked as an HCPC Partner (a paid role). They were excluded from the analysis due to potential conflict of interest. One respondent’s responses were all blank aside from the demographic data, so they were also excluded from further analysis.

Analysis is based on 143 responses, which represents ~ 6% of the UK profession [11]. Whether this proportion of responses is representative of the profession is arguable – but these responses offer the only sizeable pool of data currently available. The survey was aimed at those on the statutory register, as they are most likely to have relevant interactions with and experiences of the HCPC, but a small number of responses were also received from Clinical Technologists (Medical Technical Officers – MTOs) and Engineers (CEs), and these have been included in the analysis. Figure 2 shows the breakdown of respondents, by nation.

Figure 2: Proportion of respondents, by nation

Of the respondents, 91% are registered Clinical Scientists, and would therefore have a broad range of experience with HCPC and its processes. Mean time on the register was 12 yrs. Respondents show a large range in seniority, and their roles are shown in Fig.  3 (CS-Clinical Scientist; CE-Clinical Engineer; MTO-Medical Technical Officer/Technician; CS-P are those working in private healthcare settings, so not on Agenda for Change (AfC) pay bands).

Figure 3: Breakdown of respondents, by role and pay banding

These data can be compared with the most recent HCPC snapshot of CS registrants (Registrants by profession snapshot – 1967 to 2019: https://www.hcpc-uk.org/resources/data/2019/registrant-snapshot/).

The perception of overall regulator performance can be assessed in two ways – one survey question directly asked for a rating score, and the overall survey sentiment offers additional insight.

The score for overall performance was a median of 3 (mean 2.7; response rate 90%) which suggests neutral satisfaction.

Respondents were not asked directly to explain this overall performance rating – themes were extracted from the questionnaire as a whole.

The auto-coded sentiment scores generated in the NVivo software are shown in Table 2. There is a significantly stronger negative sentiment than positive for HCPC performance – moderate, strong and total sentiment scores are all higher for negative sentiment. The normal test for a single proportion shows the difference between negative and positive sentiment is statistically significant at p < 0.001. Whilst the PSA assessment of HCPC performance in 2022–23 shows 100% performance in 4 out of 5 assessment areas, the survey data here from regulated professionals suggest considerably less satisfaction with the HCPC. This raises associated questions about the relevance and validity of the PSA assessment.
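The single-proportion test above can be sketched as follows, using the sentiment counts reported in the abstract (159 negative vs. 106 positive). The exact test configuration used in the paper (e.g. one- vs. two-sided) is not stated, so treat this as illustrative rather than a reproduction:

```python
import math

# Normal (z) test for a single proportion: is the share of negative
# sentiment responses different from 0.5? Counts are those reported
# in the survey abstract (159 negative, 106 positive).
negative, positive = 159, 106
n = negative + positive
p_hat = negative / n          # observed proportion negative
p0 = 0.5                      # null: negative and positive equally likely

z = (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)
# Two-sided p-value via the normal CDF, Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

print(f"z = {z:.2f}")
print(f"two-sided p = {p_two_sided:.4f}")
```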

A large number of respondents seem to question the value of regulation. Whilst many accepted its value in terms of protecting the safety of the public, many questioned its relevance & benefit to themselves. Many respondents also queried the payment model, whereby although the main beneficiaries of regulation are the public & the employer, it is the registrants who actually pay the fees for registration. There was very little mention in survey responses of benefit in terms of protected title. These issues were amalgamated into Theme 1 – Value of regulation, with the two sub-themes Value in monetary terms (value-for-money) and Value in professional terms (benefit and relevance to the individual professional) (see Table 3).

In the survey, several aspects of HCPC organisational performance were scored – handling of personal data, registration and renewal, engagement with the profession, audit, and the quality and usefulness of HCPC policies. These formed Theme 2 and its sub-themes.

A third theme Registrant competence and vulnerability , was developed to focus on responses to questions related to the assessment of registrant competence and Fitness To Practice (FTP) processes.

Finally, the survey also directly asked respondents if they could suggest improvements which would have resulted in higher scoring for regulation quality and performance. These were grouped into Theme 4.

Theme 1 – Value of regulation

Value in monetary terms.

The Likert score for value-for-money was a median of 2 (mean 2.3; response rate 100%) which suggests dissatisfaction. This is one of the few survey questions to elicit a 100% response rate – a clear signal of its importance for registrants.

There was a high number of responses suggesting fees are too expensive (and a significantly smaller number suggesting good value). This ties in with some respondents explaining that the ‘benefit’ from registration is mainly for the employer (an assurance of high quality, well-trained staff). Several respondents pointed to little ‘tangible’ benefit for registrants, and queried whether the payment model is fair and if the employer should pay registrant fees.

“Expensive fees for what appears to be very little support.” Resp094
“It seems that I pay about £100 per year to have my name written on a list. It is unclear to me what the HCPC actually does in order to justify such a high fee.” Resp014
“I get, quite literally, nothing from it. It’s essentially a tax on work.” Resp008

Several respondents suggested that as registration was mandated by the employer, it was in essence an additional ‘tax’ on their employment, as highlighted previously by Unison [ 12 ]. A comparator for the payment model is the ‘disclosure’ checks performed on potential staff who will be working with children and vulnerable adults. In general, these checks are paid for by the employer; they are also not a recurrent cost for each individual, but done once at recruitment.

Value in professional terms & relevance

This was not a direct question on the questionnaire, but emerged consistently in survey responses. Aside from value-for-money, the value of regulation can also refer to more general benefit and relevance for a professional, for example in protecting a professional title or emphasising the importance of a role. Many respondents commented, in relation to the ‘value’ of regulation, about the relevance of the HCPC to them and their job/role.

The largest number of responses highlighted the lack of clarity about HCPC’s role, and a lack of relevance felt by a significant proportion of respondents.

“Not sure I have seen any value in my registration except that it is a requirement for my role” Resp017
“I really fail to understand what (sic) the benefits of registration.” Resp018
“They do not promote the profession. I see no evidence of supporting the profession. I pay to have the title and I am not aware of any other benefits.” Resp038

Theme 2 – HCPC performance

Communication & handling data.

The survey questionnaire did not have a specific question relating to communication, so no specific Likert scores are available. Rather, communication was a sub-theme which emerged in survey responses. The response numbers for positive (1) and negative (50) experiences clearly suggest an overall experience of poor communication (a difference statistically significant at p < 0.001 for a normal proportion test).
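With counts as lopsided as 1 positive vs 50 negative, an exact binomial test gives the same conclusion without relying on the normal approximation. A minimal sketch (offered as a cross-check, not the test used in the study):

```python
from math import comb

def binomial_two_sided_p(k: int, n: int, p: float = 0.5) -> float:
    """Exact two-sided binomial test p-value (doubled smaller tail, capped at 1)."""
    lower = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(0, k + 1))
    upper = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
    return min(1.0, 2 * min(lower, upper))

# 1 positive comment out of 51 total (1 positive + 50 negative)
p_val = binomial_two_sided_p(1, 51)
print(f"p = {p_val:.3g}")  # far below 0.001
```

Under the null hypothesis that positive and negative comments are equally likely, observing only 1 positive in 51 is vanishingly improbable, matching the p < 0.001 reported above.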

One respondent noted they had ‘given up’ trying to communicate with HCPC electronically. Several respondents also noted issues with conventional communication—letters from HCPC going to old addresses, or being very slow to arrive.

“…I have given up on contacting by electronic means.” Resp134

When trying to renew their registration, communication with HCPC was so difficult that two respondents noted they raised a formal complaint.

A number of respondents noted that when they eventually got through to the HCPC, staff were helpful, so the main communication issue may relate to insufficiently resourced lines of communication (phones & email) or the need for a more focussed first point of contact e.g. some form of helpdesk or triaging system.

“Recently long wait to get through to speak to someone… Once through staff very helpful.” Resp126

This topic overlaps with the next (Processing Registration & renewals) in that both involve online logins, website use etc.

Security & data handling was rated as neutral (median 3, mean 3.4; response rate 91%). Although responses were balanced in terms of satisfaction, a significant number noted a lack of knowledge about HCPC processes. There are almost equal proportions of respondents reporting no issues, some problems with handling of personal data, or insufficient knowledge to express an opinion.

Registration and renewal

The score for processing registrations & renewals, was a median of 4 (mean 3.5; response rate 92%) which suggests modest satisfaction.

The overall rating also suggests that the issues may have been experienced by a comparative minority of registrants and that for most, renewal was straightforward.

“They expected people to call their phone number, which then wasn’t picked up. They didn’t reply to emails except after repeated attempts and finally having to resort to raising a complaint.” Resp023
“Difficult to get a timely response. Difficult to discuss my situation with a human being…” Resp044

Although the Likert score is positive, the themes in responses explaining the rating are more mixed. Many respondents mentioned either having, or knowing others who had, issues with registration renewal and its online processes, including payments. A few respondents mentioned that the process was unforgiving of small errors. One respondent, for example, missed ticking a box on the renewal form, was removed from the register, and experienced significant difficulties (poor communication with HCPC) in getting the issue resolved.

Some respondents noted issues related to a long absence from work (e.g. maternity/illness etc.) causing them to miss registration deadlines – for some, this seems to have resulted in additional fees to renew registration. It seems rather easy for small errors (on either side) to result in registrants being removed from the register. For registrants, this can have very serious consequences and it can then be difficult and slow to resolve this, sometimes whilst on no pay. There have also been other reported instances of renewal payment collection errors [ 13 ].

“I had been off work… and had missed their renewal emails…I was told that there would be no allowances for this situation, and I would have to pay an additional fee to re-register…” Resp139.

Some respondents raised the issue of exclusion – certain staff groups, such as Clinical Technologists and Clinical Engineers, not being included on the register. This desire for inclusion also points to a perceived value in being on the register. One respondent raised an issue of very difficult and slow processing of registration for a candidate from outside the UK.

“Staff member who qualified as medical physicist abroad…has had a dreadful, drawn out and fruitless experience.” Resp135

Overall, many respondents noted difficulties in renewing registration and issues with HCPC’s online processes. Some of these issues (e.g. website renewal problems) may have been temporary and are now resolved, but others (e.g. available routes for registration) remain to be resolved.

Audit process & policies

In the survey, 12% of respondents reported having been audited by HCPC regarding their CPD (response rate 97%). This is well above the 2.5% of each profession which HCPC aims to review at each renewal [ 14 ], and the similar values reported by some professional bodies [ 15 ]. The participants seem representative, although two respondents mentioned their perception of low audit rates. Data on CPD audits is available here: https://www.hcpc-uk.org/about-us/insights-and-data/cpd/cpd-audit-reports/

Respondents rated the process of being audited as a median of 4 (mean 3.7), which is the joint highest score on the survey, pointing to satisfaction with the process. From the responses, the overall perception could be summed up as straight-forward, but time-consuming. Without regular record-keeping, unfortunately most audits will be time-consuming – the HCPC more so, as it is not an annual audit, but covers the two preceding years.

Some respondents did find the process not only straight-forward, but also useful (related to feedback received). However, responses regarding feedback were mixed, with comments on both good, and poor feedback from HCPC.

“Not difficult but quite long-winded” Resp008
“Very stressful and time consuming” Resp081
“While it was a lot of work the process seemed very thorough and well explained.” Resp114

The HCPC’s policies & procedures were rated as a median of 3 (mean 3.2; response rate 98%). This neutral score could suggest mixed confidence in HCPC practice. It may also reflect the fact that the majority of respondents had either not read, or felt they had no need to read, the policies, and so are largely unfamiliar with them.

The reasons for this lack of familiarity were also explained by some respondents – four commented that the policies & procedures are rather too generic/vague. Three respondents noted that they felt the policies were not sufficiently relevant to their clinical roles to be useful. This may be due to the policies being written at a level applicable to registrants from all 16 modalities – perhaps a limitation of the nature of HCPC as a very large regulator. Familiarity seemed mainly to be restricted to policies around registration and CPD. There were slightly lower response levels for positive sentiment (6) than negative sentiment (9).

“I’ve never had cause to read them.” Resp115
“Detached from the real clinical interface for our professions…” Resp083

HCPC split their policies into ‘corporate’- which relate to organisational issues (e.g. equality & diversity; find them here: Our policies and procedures | ( https://www.hcpc-uk.org/about-us/corporate-governance/freedom-of-information/policies/#:~:text=Our%20main%20policies%20and%20procedures%201%20Customer%20feedback,scheme%20...%207%20Freedom%20of%20Information%20Policy%20 )) and those more relevant to professions (e.g. relating to the register; find them here: Resources | ( https://www.hcpc-uk.org/resources/?Query=&Categories=76 )).

One respondent noted not only that the policies were ‘as you might expect’, but also felt they were less demanding than those from other similar bodies such as the CQC ( https://www.cqc.org.uk/publications ).

“…Other regulatory bodies (such as the CQC for example) have policies and procedures that are a lot more challenging to comply with.” Resp022

Theme 3 – Registrant competence and vulnerability

In this survey, 3.5% (5/143) of respondents noted some involvement with the HCPC’s Fitness to Practice service. These interactions were rated at a median of 3 (mean 2.8) suggesting neutral sentiment.

Firstly, we can immediately see the level of interaction with the FTP team is very small. CS registrants represent approx. 2% of HCPC registrants, and the level of CS referrals to FTP in 2020–21 was 0.2% [ 16 ].

The data is a very small sample, but responses vary strongly, so it is worth digging a little further into the granularity of individual responses. Response scores were 1, 1, 2, 5, 5 – which are mainly at the extremes of the rating spectrum. The majority of respondents described poor experiences with the FTP team: errors, a process which was ‘extremely prolonged’, involved slow/poor communication, and processes which were ‘entirely opaque’.

“It is slow, the process was badly managed… and the system was entirely opaque,” Resp37
“They were hard to contact and I didn't feel they listened…no explanation, apology or assurance it would not happen again. It left my colleague disillusioned and me very angry on their behalf…” Resp044

Some respondents commented that the team were not only difficult to contact, but also didn’t seem to listen. At the end of a process which involved errors from HCPC, one respondent noted there were ‘no explanation, apologies or assurance that it would not happen again’, leaving the registrant ‘disillusioned’. These experiences do not fit with the HCPC’s stated goal to be a compassionate regulator (see Fig. 4). Arguably it is more difficult to change a culture of behaviour and beliefs than to publish a corporate goal or statement of vision.

figure 4

HCPC’s vision statement & purpose [ 17 ]

Some survey respondents have noted the necessity of regulation for our profession.

“Ultimately I am very grateful that I can register as a professional.” Resp024

Theme 4 – Suggestions for improved regulation

Following the question relating to overall performance, respondents were invited to suggest things which might improve their rating for HCPC’s performance and value. These suggestions were also combined with those which appeared in earlier survey responses.

Although we are in a current cost-of-living crisis, responses did not simply query the high absolute cost of fees, but also queried the value/benefit of HCPC regulation for registrants. Many responses expressed doubt as to the added value & relevance of HCPC registration for them. They seem to point to a desire for more tangible benefit from their fees. Perhaps, given the costs and levels of scrutiny, registrants want some definite benefit to balance the scales.

“Cost less and do more for the people who are on the register.” Resp089
“Vastly reduced cost. Employer paying registrant fees.” Resp074

A significant number of responses pointed out that the main benefits of registration are for the public and for employers – but that it is the registrants who pay for registration. Many queried why this should be, and whether there should be a different payment model where, for example, employers pay.

Similarly, some respondents felt that the HCPC’s unusual position of regulating a large swathe of healthcare professions was not necessarily helpful for their profession or others.

Communication and response times are obviously an issue of concern for registrants, and improvements are needed based on the low satisfaction levels reported here. This is also linked to a wish for increased engagement with the CS profession.

“Engagement with the workforce, specialism specific development, reduced fees” Resp025

Some responses suggested they would be comforted by increased accountability / governance of HCPC including improved FTP efficiency.

“More accountability to registrants” Resp130

Finally, improvements in terms of additional registration routes for Engineers & Technical staff were also suggested. It may be damaging to workplace morale if two professionals doing roles of a similar nature are not governed in the same way, and if there is not parity of gross salary due to mandatory professional fees & deductions.

Value-for-money : This will vary between individuals depending on many variables, such as upbringing & environment, salary, lifestyle priorities, political persuasion, and so on. However, many of these factors should balance in a large sample. In general, it can be suggestive of satisfaction (or lack thereof) with a service. The score here, suggesting dissatisfaction, echoes other reports on HCPC’s spending and financial irregularities [ 18 , 19 ].

In the survey findings, respondents have voiced dissatisfaction with registration value for money. In fact, HCPC’s registration fees are not high when compared to the other healthcare profession regulators. Table 1 shows data from 2021–22 for regulators’ annual registration fees. However, the HCPC has risen from having the lowest regulator fees in 2014–15 to its current position (9th of 13), slightly higher in the table. Perhaps more concerning than the absolute level of fees are the occasions when large increases are proposed [ 12 , 20 , 21 , 22 ].

However, fees have regularly increased to the current figure of £196.48 for a two-year cycle. During a consultation process in 2018, the Academy for Healthcare Clinical Scientists (AHCS) wrote an open letter to the HCPC, disputing what they felt was a disproportionate fee increase [ 23 ]. Further fee rises have also been well above the level of inflation at the time.

HCPC expenditure (which is linked to registration fees) has arguably been even more controversial than fee increases – noted by several respondents. A freedom of information (FOI) request in 2016 showed HCPC’s spending of £17,000 for their Christmas party [ 18 ] – which amounts to just over £76 per person. This cost was close to the annual registration fee (at that time) for registrants.

In 2019, regulation of social workers in England moved from HCPC to Social Work England. This resulted in a loss of over 100,000 registrants, and a loss in registration fee income. HCPC raised fees to compensate, but a further FOI request in 2020 [ 18 ] showed that even though there was an associated lowering in workload from the loss of 100,000 registrants, the HCPC made no redundancies, suggesting the loss of income was compensated mainly by the fee increase.

Inherent value & relevance

One of HCPC’s aims is to promote ‘the value of regulation’ [ 24 ]. However, not only is there dissatisfaction with value-for-money, the second highest response suggests a lack of inherent value (or benefit) from regulation for the individual registrant. In some ways there is a lack of balance – registrants are under increasing scrutiny, but feel there is little direct benefit in return.

This also suggests that HCPC’s aim or message is not getting through to the CS profession. It is not clear what the HCPC’s 2021–22 milestone – ‘Embedded our registrant experiences research into employee learning and development and inductions’ – has actually achieved.

A large number of responses pointed to the lack of clarity about HCPC’s role, and noted its lack of relevance for respondents. Some of this is understandable – until recently, many CS registrants will have had little interaction with HCPC. They would typically get one email reminder each year to renew their registration and pay their fees, and hear little else from the HCPC. That is beginning to change, and HCPC has recently begun to send more regular, direct emails/updates to registrants.

However, for many registrants, the HCPC appears not to be clearly communicating its role, or the relevance/importance of regulation. As mentioned above, this also links in to previous mentions of the lack of any tangible benefit for registrants. Some note little more relevance other than the mandatory aspects of regulation.

Finally, relevance is also queried in relation to the limited access some professional groups have to a professional register. The current gaps in registration for some groups result in two situations: firstly, of Clinical Scientists and Clinical Engineers/Technologists, one group must compulsorily pay a fee to be allowed/approved to do their job and the other does not; secondly, the public are routinely helped and assisted by both Clinical Scientists and Clinical Engineers/Technologists – but only one group is regulated to ensure public safety.

HCPC Communication

This was highlighted by respondents as often poor. Recently in the media, concerns have been raised by the College of Paramedics (CoP) about communication issues with HCPC – specifically, changes to the HCPC policy on the use of social media [ 25 ]. They raised particular concerns about the use of social media content and ‘historical content’ in the context of fitness-to-practice investigations.

There have previously been some concerns raised on the UKMPE mail-base regarding handling of personal data, and lack of efficiency in addressing the issue [ 26 ]. Several messages detailed HCPC communicating unencrypted registrant passwords in emails and sending personal data to the incorrect registrant. Some on the forum noted that they had reported this problem over a period of several years to HCPC, suggesting HCPC’s response to these serious issues was extremely slow. Several responses noted these previous issues.

Registration processes

Although responses here show some satisfaction, there have been reports in the media of significant issues with registration (such as removing registrants from the register in error) with associated impact for patients and the public [ 27 , 28 ]. Similarly, there were reports on the UKMPE mail-base of significant issues with registration renewals being problematic [ 26 ]. In Scotland, NHS.net email accounts ceased to be supported in July-Sept 2020 and the associated lack of access to email accounts and messages used for HCPC communication and registration, caused a major issue in registration renewal. This coincided with COVID lockdowns and a period of unusually difficult communication with HCPC. If NHS staff lose registration (irrespective of the reason), respondents noted that some Human Resources (HR) departments were quick to suspend staff from work, and in some cases withhold pay. That spike in difficulties is likely the cause of the most common responses suggesting issues with a complicated process.

In safe-guarding public safety, a key task for a healthcare regulator is assessing the competence of registrants. This is done via a small set of related activities. Registrants must return regular evidence of CPD, and these are audited for 2.5% of registrants. This process is simple and routine, and as seen in Theme 2, responses here suggest registrants are reasonably satisfied with it.

More formal and in-depth competence assessment happens when a complaint is raised against a registrant, either by a work colleague/management, a member of the public or occasionally by the HCPC itself. The process is complex, lengthy and can end in a registrant attending a court hearing [ 29 ].

It is usual for registrants to continue in their normal job during FTP investigations – effectively the public remains at risk from a registrant whose competence is eventually proven to be below the regulator’s standards, so investigations need to be efficient both in timeliness and outcome.

Obviously, being under investigation can be highly stressful, and carries the potential for the registrant to be ‘struck off’ the register and lose their job if registration is mandated (e.g. NHS posts). There are many reports of the process & experience either provoking or increasing underlying mental health challenges [ 30 , 31 , 32 ]. Along with efficiency, a regulator needs to behave compassionately. Investigations of highly-skilled professionals engaging in complex work activities are also necessarily complex, and require a high degree of knowledge and experience from the regulator’s investigational panel.

The Professional Standards Authority (PSA) regulates the HCPC, and publishes annual reviews of its performance ( https://www.professionalstandards.org.uk/publications/performance-reviews ) (see Table 4). HCPC performance as reported by the PSA seems to be generally higher than noted by survey respondents here. For 2022–23, aside from one area, the HCPC scored 100% for performance, which seems at odds with these survey responses [ 33 ]. The FTP team is notable in repeatedly performing very poorly compared to most other sections of the HCPC (even though the majority of the HCPC budget goes to FTP activity, see Fig. 5). The HCPC Annual Report 2018–9 [ 34 ] highlighted the completion of the first phase of the Fitness-To-Practice Improvement Plan. This delivered “A root and branch review of this regulatory function… a restructure, tightened roles and processes and the introduction of a new Threshold Policy”, but this seems to have had no impact on the performance reported by the PSA over the next few years, shown in Table 4. However, the most recent data does suggest improvement, and HCPC continues to develop FTP team practice [ 17 ].

figure 5

HCPC expenditure for the year 2020–21 [ 17 ]

There are other reports of poor experiences with this team [ 35 , 36 ], and in one report the FTP team’s processes have been noted as being rather inhumane [ 35 ].

Regulation is an important part of public protection, but how effectively it is managed & enforced is also a concern, given it involves increased scrutiny of registrants. A topical comparator is the current dissatisfaction by a large section of the public about several other government regulators allowing seemingly poor performance to go unchecked [ 4 , 5 ].

It is arguable that registrants remain on the register as long as the HCPC allows them. Several respondents in this survey noted being removed from the register through HCPC administrative error. Removal could also happen through poor judgement/decision-making – the FTP team handles large numbers of very complex investigational cases: 1603 concluded cases for the year 2021–22, and 1024 hearings [ 16 ]. Every justice system is subject to a level of error – guilty parties can be erroneously ‘cleared’, and vice-versa. It is essential, therefore, that policies & procedures relating to FTP are fit for purpose, that the FTP team works effectively and humanely, and that there is genuine & effective governance of HCPC to ensure accountability. In this survey, some respondents seem to be saying that currently this is not the case.

It might have been anticipated that the greatest concern is costs, especially in the current cost-of-living crisis. The recent HCPC consultation to increase fees [ 37 ] seems particularly tone-deaf and has caused concern across the professions [ 21 , 22 ].

The above findings show respondents are interested in lower fees, but also in increased benefit for their fees. Some respondents pointed out that whilst registrants pay for registration, the benefit is mainly for the public and employers. The HCPC is a statutory body; its funding model will have been designed/decided upon by government, and may be unlikely to change. However, there are a variety of potential regulation models [ 38 ], and so change is possible. A review of the financial model for regulation may be welcome.

Regulator size

Some aspects of HCPC performance, policies, and distribution of spending are related to its nature as the largest, and only multi-professional, regulator in the healthcare sector. Data from the HCPC suggests (see Fig. 5) that the majority of spending relates to FTP activity. Data also points to Clinical Scientists having very low levels of FTP investigation compared to others in HCPC [ 16 ]. This suggests that a significant proportion of CS registrant fees are used to investigate other professions. It is possible (perhaps simplistically) that if CSs were regulated separately – like many other healthcare professions, such as doctors & dentists, whose regulators are concerned only with a single profession – their registrant fees might be reduced. This model of single-profession regulation may also mitigate other disadvantages of the HCPC’s practice, such as the ‘generic’ policies aiming to apply to a pool of 15 professions.

Although there is a very low level of data for this topic, the concerns raised by registrants are serious in nature. There also seem to be issues in the handling of complaints related to this service, and in advocacy for registrants. Certainly, there is a clear governance path via the PSA to the Health Secretary. However, this does not offer a route for individual complaints to be raised and addressed. Unlike complaints from the public in other areas, there is no recourse to an ombudsman for registrants. The only option for individual registrants is the submission of a formal complaint to the HCPC itself, which is dealt with internally. Comments from survey respondents suggest this process does not guarantee satisfaction. Indeed, one of the respondents who mentioned submitting a complaint made it clear they remained unhappy with HCPC’s response. Overall, there seems to be a lack of clear & effective advocacy for registrants.

“…the HCPC’s stance appeared to be guilty until proven innocent… At no point did I feel the HCPC cared that their (sic) was an individual involved....” Resp044.

FTP processes affect a comparatively small number of CS registrants, compared to other professions. However, it seems clear that the majority of those who have interacted with the FTP team have had poor experiences, and respondents have suggested improvements are needed. The reason for FTP investigations is the protection of staff and the public. If processes are slow, investigations prolonged, or decisions flawed, the public may be exposed to increased levels of risk, as healthcare practitioners who may be lacking in competence continue to practice. The data in Table 4 shows concerning but improving trends in FTP performance levels.

Limitations

There are two main limitations to this work. Firstly, due to time constraints, no pilot work was done when designing the survey questionnaire. This may have helped: as noted earlier, a few responses pointed to some awkwardness with one survey question. Although no pilot work was done, the questionnaire was reviewed by the IPEM Professional Standards Committee, as noted in the Acknowledgements section.

The other obvious limitation is the low response rate (~ 6% of UK Medical Physicists). Circulation of the survey was performed via the only online forum currently available for the profession. The survey was advertised multiple times to ensure visibility to staff who may have missed it initially due to leave etc. However, the forum may not reach 100% of the profession, and some addressees may have filters set to send specific posts to junk folders etc. The professional body IPEM declined to offer support in circulating the survey (believing the issues involved would affect/be of interest to only a small minority of members).

The low response rate also has a particular impact on the pool of responses relating to FTP issues, which inherently affect low numbers of registrants.

However, the importance of some of the findings here (e.g. expressed dissatisfaction with regulation in terms of value; the poor experience of some members with the Registration, Communication and FTP teams) and the low sample surveyed, both justify the need for a larger follow-on survey, across all of Clinical Science.

In Healthcare, regulation of professions is a key aspect of protecting the public. However, to be effective, regulation must be performed professionally, impartially, and associated concerns or complaints investigated efficiently and respectfully.

This report presents findings from a survey aimed at collecting a snap-shot of the experiences of Clinical Scientists with their regulator, and their perception of the quality and safety of that regulation performance.

Overall survey sentiment scores showed significantly more negative responses than positive. Survey comments relate not only to current issues, but to previous problems and controversial issues [ 18 , 26 ]. It seems that some respondents have at some point lost confidence and trust in the HCPC, and survey responses suggest there has not been enough engagement and work done by HCPC to repair and rebuild this trust.

In the midst of a cost-of-living crisis, costs are a large concern for many. The HCPC fees are neither the highest nor the lowest amongst the healthcare regulators. Spending is transparent, and details can be found in any of the HCPC’s annual reports.

A recurring sub-theme in responses was a lack of tangible value for the registrant, and a view that the employer should pay the costs of registration where registration is mandated by the job.

Many respondents suggested that there should be more proactive engagement from the HCPC with the profession. Most respondents were not familiar with HCPC policies, nor did they feel those policies were relevant or important to them.

Survey data showed moderate satisfaction with registration processes for the majority of respondents. Some respondents also noted the lack of a registration route for engineering and technical healthcare staff. CPD processes achieved a score indicating registrant satisfaction, generating the highest ratings in the survey. Communication scored poorly, and many respondents suggested that response times and access to support need to improve.

The CS profession experiences low levels of interaction with the FTP service. However, those interactions which were recorded in the survey show some poor experiences for registrants. There also seems to be a lack of advocacy, or a route for complaints about the HCPC, for individual registrants. There may need to be more engagement between registrants and their professional body regarding HCPC performance, and more proactivity from the stakeholder, IPEM.

Some of the findings reported here relate to important issues, but the survey data are based on a low response rate. A larger survey across all of Clinical Science is being planned.

Availability of data and materials

To protect the confidentiality of survey respondents, the source data are not publicly available, but are available from the author on reasonable request.

Abbreviations

AfC: Agenda for Change

AHCS: Academy for Healthcare Clinical Scientists

CPD: Continuous professional development

CE: Clinical Engineer

CS: Clinical Scientist

CoP: College of Paramedics

CT: Clinical Technologist

FOI: Freedom of Information

FTP: Fitness-to-practice

HCPC: Health and Care Professions Council

HR: Human resources

IPEM: Institute of Physics and Engineering in Medicine

JISC: Joint Information Systems Committee

MTO: Medical Technical Officer

PSA: Professional Standards Authority

PSC: Professional Standards Committee

QDA: Qualitative data analysis

MPE: UK Medical Physics and Engineering

References

Professional Standards Authority. Professional healthcare regulation in the UK. https://www.professionalstandards.org.uk/news-and-blog/blog/detail/blog/2018/04/10/professional-healthcare-regulation-explained . Accessed 26 Jul 2023.

Evening Standard. Bogus surgeon treated hundreds. https://www.standard.co.uk/hp/front/bogus-surgeon-treated-hundreds-6326549.html . Accessed 26 Jul 2023.

HCPC. About registration: protected titles. http://www.hcpc-uk.org/aboutregistration/protectedtitles/ . Accessed 27 Jul 2023.

The Guardian. Public patience is wearing thin, Ofwat must wield the big stick. https://www.theguardian.com/business/nils-pratley-on-finance/2022/dec/08/public-patience-is-wearing-thin-ofwat-must-wield-the-big-stick . Accessed 19 Jul 2023.

TrustPilot. Reviews of Ofgem (ofgem.com). trustpilot.com. Accessed 19 Jul 2023.

Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.


Kiger ME, Varpio L. Thematic analysis of qualitative data: AMEE Guide No. 131. Med Teach. 2020;42(8):846–54.


Declaration of Helsinki. 2013. https://www.wma.net/policies-post/wma-declaration-of-helsinki-ethical-principles-for-medical-research-involving-human-subjects/ . Accessed 12 Sept 2023.

UK Data Protection Act. 2018. https://www.gov.uk/data-protection . Accessed 15 Sept 2023.

Rowbottom C. Private communication on behalf of the IPEM Professional Standards Committee; 2022.

IPEM Workforce Team. Clinical scientist & engineer workforce data. Personal communication. 2022.

Unison. HCPC fee increase is an unjustified ‘tax on practising.’ https://www.unison.org.uk/news/press-release/2019/02/hcpc-fee-increase-unjustified-tax-practising/ . Accessed 27 Jul 2023.

HCPC. Direct debit collection errors. https://www.hcpc-uk.org/news-and-events/news/2020/early-direct-debit-collections/ . Accessed 27 Jul 2023.

HCPC. CPD audit rates. https://www.hcpc-uk.org/cpd/cpd-audits/ . Accessed 21 Jul 2023.

IPEM. CPD audit rates. https://www.ipem.ac.uk/your-career/cpd-career-development/cpd-audit/ . Accessed 21 Jul 2023.

HCPC. Fitness to practice annual report 2020–21. https://www.hcpc-uk.org/about-us/insights-and-data/ftp/fitness-to-practise-annual-report-2020-21/ . Accessed 23 Jul 2023.

HCPC. Annual report and accounts, 2020–21. https://www.hcpc-uk.org/resources/reports/2022/annual-report-and-accounts-2020-21/ . Accessed 19 Jul 2023.

Wikipedia. The Health and Care Professions Council. https://en.wikipedia.org/wiki/Health_and_Care_Professions_Council . Accessed 2 Jul 2023.

HCPC. Annual report 2005–06. https://www.hcpc-uk.org/resources/reports/2006/annual-report-2005-06/ . Accessed 19 Jul 2023.

British Dental Association. BDA very disappointed by HCPC decision to raise registration fees by 18%. https://www.bda.uk.com/resource/bda-very-disappointed-by-hcpc-decision-to-raise-registration-fees-by-18.html . Accessed 27 Jul 2023.

British Psychological Society. HCPC fees consultation – share your views. https://www.bps.org.uk/news/hcpc-fee-consultation-share-your-views . Accessed 27 Jul 2023.

IBMS. IBMS response to the HCPC registration fees consultation. https://www.ibms.org/resources/news/ibms-response-to-hcpc-registration-fees-consultation/ . Accessed 17 Jul 2023.

Association of HealthCare Scientists. Open letter to HCPC. https://www.ahcs.ac.uk/wp-content/uploads/2018/11/HCPC-Open-Letter.pdf . Accessed 27 Jul 2023.

HCPC. Corporate plan 2022–23. https://www.hcpc-uk.org/resources/reports/2022/hcpc-corporate-plan-2022-23/ . Accessed 23 Jul 2023.

College of Paramedics. Our formal response to the HCPC consultation. https://collegeofparamedics.co.uk/COP/News/2023/Our%20formal%20response%20to%20the%20HCPC%20consultation.aspx . Accessed 27 Jul 2023.

JISCMail. Medical-physics-engineering mailing list (MPE mailbase). www.jiscmail.ac.uk . Accessed 19 Jul 2023.

The Guardian. Thousands miss out on treatment as physiotherapists are taken off UK register. https://www.theguardian.com/society/2022/may/14/thousands-miss-out-on-treatment-as-physiotherapists-are-struck-off-uk-register . Accessed 27 Jul 2023.

HSJJobs.com. Thousands of clinicians unable to work after registration blunder. https://www.hsjjobs.com/article/thousands-of-clinicians-unable-to-work-after-registration-blunder . Accessed 27 Jul 2023.

HCPC. How we investigate. https://www.hcpc-uk.org/concerns/how-we-investigate/ . Accessed 21 Nov 2023.

Sirriyeh R, Lawton R, Gardner P, Armitage G. Coping with medical error: a systematic review of papers to assess the effects of involvement in medical errors on healthcare professionals’ psychological well-being. Br Med J Qual Saf. 2010;19:6.


Bourne T, Wynants L, Peters M, van Audenhove C, Timmerman D, van Calster B, et al. The impact of complaints procedures on the welfare, health and clinical practise of 7926 doctors in the UK: a cross-sectional survey. BMJ Open. 2015;5:e006687.


Jones-Berry S. Suicide risk for nurses during fitness to practice process. Ment Health Pract. 2016;19:8.

Professional Standards Authority. HCPC performance review 2022–23. https://www.professionalstandards.org.uk/publications/performance-review-detail/periodic-review-hcpc-2022-23 . Accessed 25 Jul 2023.

HCPC. Annual report and accounts, 2018–19. https://www.hcpc-uk.org/resources/reports/2019/hcpc-annual-report-and-accounts-2018-19/ . Accessed 19 Jul 2023.

Maben J, Hoinville L, Querstret D, Taylor C, Zasada M, Abrams R. Living life in limbo: experiences of healthcare professionals during the HCPC fitness to practice investigation process in the UK. BMC Health Serv Res. 2021;21:839–54.

Leigh J, Worsley A, Richard C, McLaughlin K. An analysis of HCPC fitness to practise hearings: fit to practise or fit for purpose? Ethics Soc Welfare. 2017;11(4):382–96.

HCPC. Consultation on changes to fees. https://www.hcpc-uk.org/news-and-events/consultations/2022/consultation-on-changes-to-fees/ . Accessed 27 Jul 2023.

Department of Health. Review of the regulation of public health professions. London: DoH; 2010.


Acknowledgements

The author wishes to kindly acknowledge the input of Dr Carl Rowbottom (IPEM Professional Standards Committee), in reviewing the survey questions. Thanks also to Dr Nina Cockton for helpful advice on ethics and recruitment issues.

Funding

There were no sources of funding required for this work.

Author information

Authors and affiliations

University of Glasgow, Level 2, ICE Building, Queen Elizabeth University Hospital Campus, 1345 Govan Road, Glasgow, G51 4TF, UK

Mark McJury


Contributions

All work to collect, analyse and publish this survey was performed by the author, Dr Mark McJury.

Corresponding author

Correspondence to Mark McJury .

Ethics declarations

Ethics approval and consent to participate

As this study involves low-risk survey data, formal ethics committee approval was not required (exemption obtained from NHSGGC REC04 REC Officer Dr Judith Godden [email protected]). As the survey respondents were members of a professional body, the Institute of Physics and Engineering in Medicine (IPEM), it was consulted. Its Professional Standards Committee (PSC) reviewed the survey and raised no objections. The survey questions were assessed for bias and approved unchanged (acknowledged in the manuscript). Written informed consent was obtained from all participants in the study.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

The survey questionnaire has been provided as a supplementary file.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article.

McJury, M. Experiences of UK clinical scientists (Physical Sciences modality) with their regulator, the Health and Care Professions Council: results of a 2022 survey. BMC Health Serv Res 24 , 635 (2024). https://doi.org/10.1186/s12913-024-10956-7

Download citation

Received : 06 September 2023

Accepted : 05 April 2024

Published : 16 May 2024

DOI : https://doi.org/10.1186/s12913-024-10956-7


Keywords

  • Regulation of professions
  • Clinical scientists
  • Medical physicists

BMC Health Services Research

ISSN: 1472-6963
