If 75% of your current customers said they would reorder your bestselling t-shirt if you also offered it in red, wouldn’t you want to know?
Industry reports tell you what’s happening broadly—but user surveys tell you what your customers think about your products. In this guide, you’ll learn the main types of user surveys, when to use each format, and how to design questions that get honest, actionable answers.
What is a user survey?
A user survey captures insights about your business by posing questions to a target audience, such as existing customers, potential buyers, or members of a user research group.
Surveys help businesses measure customer satisfaction, gain valuable insights into preferences, and learn how people perceive their brand, products, and overall customer experience. User surveys of non-customer groups, such as viewers of advertisements, help collect data about other interactions with the business and provide a deeper understanding of target audience needs.
Continuous feedback cycles are part of running a successful business.
Consider conducting user surveys at these moments:
- As part of ongoing reporting cycles
- After a customer takes an action (such as purchasing a product)
- When testing a new product
- When entering a new market
- After a change to a product, service, or delivery model
- When experiencing underperformance
- When making a business decision
Types of data collected in a user survey
Understanding these two data types helps you choose the right survey format for your goal—whether you need to track progress over time or discover new opportunities:
- Quantitative data. Quantitative surveys ask users to choose among a range of set responses. They’re an effective way to establish benchmarks and track progress over time.
- Qualitative data. Qualitative surveys ask open-ended questions. They can help businesses understand why customers feel the way they do about products or services.
Businesses can combine qualitative and quantitative research methods in one survey or use sentiment analysis to pull quantitative insights from qualitative data. They can also design quantitative surveys to answer questions about the customer experience.
For example, a business might use a qualitative survey to identify pain points and a quantitative one to measure the frequency of occurrence.
Here’s a breakdown of how user surveys can collect quantitative or qualitative data:
| Data collected | Purpose | Survey design | Question types | Outcomes | Best for |
| --- | --- | --- | --- | --- | --- |
| Quantitative | Measuring a level, e.g., satisfaction, approval, effort | Closed questions | Multiple choice; agree or disagree; ranking scales | Numerical scores | Monitoring progress, viewing the effects of a change, informing business decisions based on the overall strength of your business |
| Qualitative | Exploring the user experience, understanding why users feel the way they do | Open-ended questions | “What did you like most …?”; “What needs improvement …?”; “If you could make one change …?”; “What would make you more likely …?” | Specific pain points, experience trends, general user sentiment | Developing plans for performance improvement; informing business decisions based on growth potential |
Types of user surveys
- Customer satisfaction
- Net Promoter Score
- Customer effort score
- System usability scale
- Qualitative user experience
User surveys vary by type, and businesses can create their own formats to collect the specific information they need.
Here are the five most common formats for business surveys:
Customer satisfaction
Customer satisfaction (CSAT) surveys focus on a particular customer experience or interaction. They measure satisfaction with a single experience, expressed as a numerical rating (also referred to as a CSAT score) on a 100-point scale.
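One widely used convention (an assumption here, since the scoring method varies by tool) is to ask respondents to rate satisfaction on a 1–5 scale and report the CSAT score as the percentage who answered 4 or 5:

```python
def csat_score(ratings):
    """CSAT score as a percentage: the share of respondents who chose
    4 or 5 ("satisfied") on a 1-5 satisfaction scale. This is one
    common scoring convention, not the only one."""
    satisfied = sum(1 for r in ratings if r >= 4)
    return round(100 * satisfied / len(ratings))

# 6 of 8 respondents rated the experience 4 or 5 -> CSAT of 75
print(csat_score([5, 4, 3, 5, 4, 2, 5, 4]))  # -> 75
```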
Net Promoter Score
Net Promoter Score (NPS) surveys measure satisfaction with your business as a whole by asking the question, “On a scale of 0 to 10, how likely are you to recommend our business to a friend?”
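The standard NPS calculation groups respondents into promoters (9–10), passives (7–8), and detractors (0–6), then subtracts the percentage of detractors from the percentage of promoters. A minimal sketch:

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count toward the total but neither add nor subtract.
    Result ranges from -100 to 100."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# 5 promoters, 3 passives, 2 detractors out of 10 -> NPS of 30
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))  # -> 30
```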
Customer effort score
Customer effort score (CES) surveys are user experience surveys that calculate a customer’s perceived difficulty of taking an action, such as completing a purchase or navigating a product catalog. They measure this by asking customers to rate the difficulty on a five-point scale.
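A common way to report CES (an assumption here, as conventions vary) is simply the average rating across all responses on that five-point scale:

```python
def ces(ratings):
    """Customer effort score: the mean response on a 1-5 ease-of-use
    scale (higher = easier). Reporting the raw average is one common
    convention; some tools use a 7-point scale instead."""
    return round(sum(ratings) / len(ratings), 1)

# Six responses averaging just above 4 -> CES of 4.2
print(ces([4, 5, 3, 4, 5, 4]))  # -> 4.2
```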
System usability scale
The system usability scale (SUS) is a popular user experience survey that uses the Likert scale to gather quantitative UX data. Likert scale models ask respondents to specify how strongly they agree or disagree with each statement in a series.
For example, SUS statements could include, “I think that I would like to use this system frequently” or “I found the system very cumbersome to use.”
This format is common in product development, where it helps businesses determine the usability of a particular product or service.
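SUS has a standard scoring formula: for each of the 10 items (rated 1–5), odd-numbered items contribute the score minus 1, even-numbered items contribute 5 minus the score, and the sum is multiplied by 2.5 to yield a 0–100 result. A sketch of that calculation:

```python
def sus_score(responses):
    """System Usability Scale score from 10 Likert responses (1-5).
    Odd items (positively worded): score - 1.
    Even items (negatively worded): 5 - score.
    The sum of contributions times 2.5 yields a 0-100 score."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Agreeing with positive items (4) and disagreeing with negative ones (2)
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```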
Qualitative user experience
Qualitative user experience (UX) surveys ask open-ended questions to gather insights about a user’s interactions with your business, such as, “What did you like about this product?” or “What is one feature you wish the product had?” or “How would you describe your customer experience to a friend?”
Best practices for user surveys
- Know your goals
- Plan for analysis
- Incentivize participation
- Keep it short
- Write good survey questions
- Use survey tools
Conducting user surveys can help you understand your audiences, identify pain points along the customer journey, uncover market gaps or missed opportunities, and monitor your progress.
Want user surveys people actually complete? These six practices make the difference:
1. Know your goals
Before launching a survey, ask yourself what you hope to learn. Setting goals helps you determine the right type of survey. Claudia Snoh, co-founder of concentrated craft coffee company Kloo, used her survey goals to fine-tune service delivery before formally launching her product.
“Toward the end of the soft launch, we did a pretty extensive survey,” she says in an episode of the Shopify Masters podcast. “We really learned a ton about target customers and their behavior, and we did actually end up making big adjustments to our branding, packaging, and pricing.”
2. Plan for analysis
Surveys without analysis plans waste customer goodwill and your time. Before launching, decide when you’ll review results and what actions you’ll take based on what you learn. This helps you resist the impulse to bombard customers with surveys that annoy or alienate them and don’t provide actionable insights to your company.
Distinguishing between routine and targeted surveys can help. For example, you might use automated customer feedback tools to routinely send post-purchase or post-service-interaction surveys, then review their performance weekly or monthly. If you’re testing a new user interface, you might design a targeted UX survey with the goal of evaluating functionality, then recruit participants to test the new product and ask for responses to the SUS survey.
3. Incentivize participation
Before you launch a survey, plan for how you’ll incentivize participants to complete it.
“Survey fatigue is really real,” Claudia says. “If I get an email asking to leave a review or do a survey, I will probably ignore it unless there is something in it for me.”
Claudia suggests using incentives to boost response rates.
“We offered our customers free products in exchange for their time,” she says.
You can also consider discount codes, customer loyalty program points, or entry into a contest.
4. Keep it short
Claudia recommends writing survey questions that will collectively take less than 10 minutes to answer.
“That’s roughly 10 to 15 questions,” she says, adding that qualitative surveys should be short and that long surveys can result in lower completion rates.
“I think that’s really the sweet spot to target,” she says.
5. Write good survey questions
Strong user survey design involves writing questions with easy-to-understand language. Use neutral language, avoid leading questions, and include a “neutral” or “n/a” among the possible answers for multiple-choice questions.
You can also include an open-ended question in quantitative surveys by inviting survey participants to provide any additional feedback in their own words.
6. Use survey tools
Customer feedback tools (or survey tools)—like Zigpoll, Hotjar, and Seguno—can help you design and publish user-friendly surveys and analyze the results. Many offer multiple user survey formats and functions like progress indicators to reduce attrition.
Skip logic is a feature that selects a respondent’s next question based on their previous answer, so respondents skip irrelevant questions—shortening the survey and improving data quality. You can also use responsive design for a clear display on mobile devices. These tools can also help you store and analyze survey data.
Consider using artificial intelligence (AI) and sentiment analysis tools like Brandwatch or Idiomatic to process a large number of open-ended responses and extract qualitative insights and quantitative data. This might help you to discern, for example, that the most commonly cited pain points with a product are battery life and aesthetics. Or that 47% of respondents expressed positive sentiments about your company.
User surveys FAQ
What is the best user survey tool?
The best user survey tool depends on what you hope to learn and the size of your representative sample. To conduct qualitative research on a large user base, select an AI-powered tool with natural language processing (NLP) and sentiment analysis capabilities.
How do you create user surveys?
You can use customer feedback tools to create and distribute qualitative and quantitative surveys. Many tools can also help you analyze data and extract insights from qualitative user research.
What are the four types of surveys?
The four major survey types are quantitative surveys, qualitative surveys, user experience surveys, and user satisfaction surveys.