Why Most Surveys Fail (And How to Avoid It)
Most surveys fail not because of bad tools, but because of bad design. They ask too many questions, use confusing language, include leading questions that bias responses, or get distributed to the wrong audience. The result: low response rates, unreliable data, and wasted effort.
A well-designed survey, on the other hand, can be completed in under three minutes, generates high-quality data, and gives you clear, actionable insights. The difference comes down to following a structured process — which is exactly what this guide covers.
We'll walk through every step: defining your goal, choosing question types, writing clear questions, designing the flow, distributing effectively, and analysing results. By the end, you'll know exactly how to make a survey that works.
Define Your Goal Before Writing a Single Question
The most common survey mistake is jumping straight to writing questions without first defining what decision the survey needs to inform. Every question in your survey should connect directly to a specific insight you need. If you can't explain why a question is there, cut it.
Start by writing a single sentence that describes the decision your survey will help you make. For example: "We need to understand why customers are churning in the first 30 days so we can improve onboarding." That sentence becomes your filter — every question either helps answer it or gets removed.
Goal-setting questions to answer before you start:
What decision will this survey help me make?
Who is my target respondent, and do I have access to them?
What is the minimum number of responses I need for reliable data?
How will I use the results — report, product change, content, internal decision?
What is the deadline for collecting responses?
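For the third question above — the minimum number of responses — the standard sample-size formula for estimating a proportion is a reasonable starting point. This is a generic statistical sketch, not anything specific to a particular tool; the default of p = 0.5 is the most conservative assumption.

```python
import math

def sample_size(margin_of_error=0.05, confidence_z=1.96, proportion=0.5):
    """Minimum responses to estimate a proportion within the given
    margin of error. confidence_z=1.96 corresponds to 95% confidence;
    proportion=0.5 maximises variance (worst case)."""
    return math.ceil(
        confidence_z ** 2 * proportion * (1 - proportion) / margin_of_error ** 2
    )

print(sample_size())       # 385 responses for ±5% at 95% confidence
print(sample_size(0.10))   # 97 responses for ±10%
```

In practice you rarely hit these numbers for niche audiences — treat the result as a target, and report wider error margins if you fall short.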
Choose the Right Question Types
Different question types serve different purposes. Using the wrong type for a question leads to data that's hard to analyse or responses that don't capture what you actually need to know. Here's a practical guide to the most useful question types:
Multiple choice
When there is a fixed set of possible answers and you want clean, quantifiable data. Best for demographic questions, preference questions, and yes/no decisions.
Rating scale (1–5 or 1–10)
When you want to measure intensity of opinion, satisfaction, or likelihood. NPS (Net Promoter Score) uses a 0–10 scale. Star ratings use 1–5. Both are easy to benchmark over time.
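The NPS arithmetic is worth seeing once: respondents scoring 9–10 are promoters, 0–6 are detractors, and 7–8 are passives who count only in the denominator. A minimal sketch:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) dilute both percentages but add nothing directly."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10, 9, 8, 7, 6, 3, 10]))  # 14
```

The result ranges from -100 (all detractors) to +100 (all promoters), which is why NPS benchmarks so cleanly over time.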
Likert scale
When you want to measure agreement with a statement (Strongly Disagree → Strongly Agree). Ideal for attitude and opinion measurement in research contexts.
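Likert responses are usually summarised by mapping the labels to 1–5 and averaging — with the caveat that the scale is strictly ordinal, so the mean is a convenient simplification rather than a rigorous statistic. A minimal sketch (the label set here is the common five-point wording; adjust to match your survey):

```python
LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
          "Agree": 4, "Strongly Agree": 5}

def likert_mean(responses):
    """Average agreement on a 1-5 scale."""
    values = [LIKERT[r] for r in responses]
    return sum(values) / len(values)

print(likert_mean(["Agree", "Strongly Agree", "Neutral"]))  # 4.0
```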
Open-ended text
When you need qualitative insight — the "why" behind a rating or choice. Use sparingly (1–2 per survey max) as they require more effort from respondents and more time to analyse.
Ranking
When you need to understand relative priority. Ask respondents to rank a list of options from most to least important. Useful for feature prioritisation and preference research.
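One common way to aggregate ranking responses into a single priority order is a Borda count: an item ranked first out of n options earns n−1 points, second earns n−2, and so on, summed across respondents. A minimal sketch with hypothetical feature names:

```python
from collections import defaultdict

def borda(rankings):
    """Aggregate ranked lists via Borda count; highest total wins."""
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for position, item in enumerate(ranking):
            scores[item] += n - 1 - position  # 1st place: n-1 points
    return sorted(scores.items(), key=lambda kv: -kv[1])

votes = [["Speed", "Price", "Design"],
         ["Price", "Speed", "Design"],
         ["Speed", "Design", "Price"]]
print(borda(votes))  # [('Speed', 5), ('Price', 3), ('Design', 1)]
```

Borda rewards broad second-choice support, which suits prioritisation better than simply counting first-place votes.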
Matrix/Grid
When you have multiple items to rate on the same scale. Efficient for comparing several products, features, or statements at once. Keep rows to 5 or fewer to avoid fatigue.
Pro tip: Untold Opinion supports all of these question types — multiple choice, rating, Likert, open text, ranking, matrix, NPS, emoji rating, and more. You can mix types freely in a single survey.
Write Clear, Unbiased Questions
Question wording is where most surveys go wrong. Poorly worded questions produce unreliable data — not because respondents are dishonest, but because they interpret the question differently than you intended. Here are the most common mistakes and how to fix them:
Leading questions
✗ Avoid
"How much did you enjoy our excellent customer service?"
✓ Better
"How would you rate your customer service experience?"
Leading questions suggest the "correct" answer, biasing responses toward positive ratings.
Double-barrelled questions
✗ Avoid
"How satisfied are you with our product quality and delivery speed?"
✓ Better
Split into two separate questions — one for quality, one for delivery.
Respondents may feel differently about each aspect. Combining them makes the data uninterpretable.
Vague language
✗ Avoid
"Do you use our product often?"
✓ Better
"How many times per week do you use our product? (0 / 1–2 / 3–5 / Daily)"
"Often" means different things to different people. Specific answer options remove ambiguity.
Assuming knowledge
✗ Avoid
"What do you think of our new NPS integration?"
✓ Better
Only ask about features respondents have actually used. Add a "Not applicable / Haven't used this" option.
Respondents who haven't used a feature will guess or skip, polluting your data.
Design the Survey Flow
The order of questions affects how people respond. A well-structured survey feels like a natural conversation — it starts easy, builds context, asks the hard questions in the middle, and ends with demographics or open-ended feedback.
Start with easy, engaging questions
Open with simple, non-threatening questions that anyone can answer quickly. This builds momentum and reduces early drop-off. Avoid starting with demographics — they feel like a form, not a conversation.
Group related questions together
Organise questions by topic. Jumping between unrelated topics is disorienting and increases cognitive load. Use section headers to signal topic changes.
Put sensitive or complex questions in the middle
Once respondents are engaged, they're more willing to answer harder questions. Demographic questions (age, income, location) are best placed at the end.
Keep it short — aim for under 5 minutes
Response rates drop sharply after 5 minutes. For most purposes, 5–10 questions is ideal. If you need more, consider splitting into multiple shorter surveys.
End with an open-ended "anything else?" question
A final open text box ("Is there anything else you'd like to share?") often surfaces the most valuable qualitative insights — things you didn't think to ask about.
Choose Your Survey Tool
The right survey tool depends on your use case, budget, and technical comfort. Here's how the main options compare:
Untold Opinion
97/100 · Free AI-powered surveys with community reach, gamification, and built-in analytics. Best for creators, marketers, researchers, and anyone who wants a genuinely free, full-featured platform.
Google Forms
72/100 · Simple, free, and integrates with Google Sheets. Good for basic internal surveys but lacks AI, analytics depth, and community features.
Typeform
68/100 · Beautiful conversational surveys. Free tier is very limited (10 responses/month). Paid plans are expensive. Good for brand-conscious teams with budget.
SurveyMonkey
65/100 · Established platform with strong analytics. Free tier limits responses and features significantly. Enterprise pricing is high.
Distribute Your Survey Effectively
A great survey with no respondents is useless. Distribution strategy is as important as survey design. The best channel depends on who your target respondents are and where they spend their time.
Email
Highest response rates for existing customers or subscribers. Keep the email short — one sentence explaining the survey, a time estimate, and a clear CTA button. Send on Tuesday–Thursday mornings for best open rates.
Social media
Best for public opinion polls and community research. Twitter/X, LinkedIn, and Facebook groups work well. Embed a preview of the first question to drive clicks.
Website embed
Embed surveys directly on relevant pages — a post-purchase survey on the order confirmation page, a feedback survey on the support page. Contextual placement dramatically increases response rates.
Community platforms
Reddit, Slack communities, Discord servers, and forums can drive high-quality responses for niche research. Be transparent about your purpose and offer to share results.
QR codes
For in-person events, physical locations, or printed materials. Generate a QR code from your survey link and place it where your target audience will see it.
Untold Opinion community
Public polls on Untold Opinion are discoverable by the platform's community, giving you organic reach without any distribution effort. Great for opinion polls and market research.
Analyse and Act on Your Results
Collecting responses is only half the job. The value of a survey comes from what you do with the data. Here's a practical framework for turning survey results into decisions:
Look at completion rate first
If fewer than 60% of people who started your survey finished it, something went wrong — too long, confusing questions, or a technical issue. Fix this before drawing conclusions from the data.
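The completion-rate check is trivial to automate against your raw numbers; a minimal sketch (the 60% threshold mirrors the rule of thumb above):

```python
def completion_check(started, finished, threshold=0.60):
    """Return the completion rate and whether it clears the threshold."""
    rate = finished / started
    return rate, rate >= threshold

rate, ok = completion_check(started=200, finished=110)
print(f"{rate:.0%} completed -> {'OK' if ok else 'investigate drop-off'}")
# 55% completed -> investigate drop-off
```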
Segment responses by key demographics
Overall averages hide important differences. Break down results by customer segment, age group, or usage level to find the insights that matter most.
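Segmentation is a simple group-by-and-average over your response records. A minimal sketch in plain Python (the field names `plan` and `satisfaction` are illustrative; a spreadsheet or pandas does the same job at scale):

```python
from collections import defaultdict

def segment_means(responses, segment_key, value_key):
    """Average a numeric answer within each demographic segment."""
    buckets = defaultdict(list)
    for r in responses:
        buckets[r[segment_key]].append(r[value_key])
    return {seg: sum(vals) / len(vals) for seg, vals in buckets.items()}

data = [
    {"plan": "free", "satisfaction": 3},
    {"plan": "free", "satisfaction": 4},
    {"plan": "paid", "satisfaction": 5},
]
print(segment_means(data, "plan", "satisfaction"))
# {'free': 3.5, 'paid': 5.0}
```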
Use AI analysis for open-ended responses
Open text responses are the richest data but the hardest to analyse manually. AI tools like Untold Opinion's built-in insights can automatically cluster themes, detect sentiment, and surface key findings.
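If you want a quick manual baseline before (or instead of) AI analysis, a crude keyword-frequency pass will surface the loudest recurring terms. This sketch only counts words — real AI analysis clusters meaning, not spelling — and the stopword list and sample comments are illustrative:

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "is", "to", "and", "it", "i", "of", "was", "very"}

def top_themes(comments, n=3):
    """Crude baseline: most frequent non-stopword terms across comments."""
    words = []
    for text in comments:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(n)

comments = ["Onboarding was confusing",
            "Love the product, onboarding slow",
            "Confusing setup and onboarding"]
print(top_themes(comments))  # 'onboarding' and 'confusing' lead
```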
Identify the top 3 actionable insights
Resist the urge to report everything. Identify the three findings that most directly inform the decision you set out to make. Everything else is context.
Share results with respondents
Closing the loop — sharing a summary of what you found and what you're doing about it — builds trust and dramatically increases response rates for future surveys.
[Chart: Survey length vs response rate benchmarks — average completion rates by survey length. Source: industry benchmarks across B2B and B2C surveys.]
Ready to make your first survey?
Create a free survey in minutes — no account required to start. Use AI to generate questions or build from scratch.