12 Free Customer Satisfaction Survey Templates
Pick a template, grab the questions, and start collecting feedback today. General templates for any business, plus industry-specific ones for restaurants, SaaS, healthcare, e-commerce, real estate, fitness, and education.
Why Bother Measuring Customer Satisfaction?
Because unhappy customers rarely tell you they're unhappy. They just leave. Research from Esteban Kolsky found that only 1 in 26 dissatisfied customers actually complains — the rest quietly churn. A satisfaction survey is your early warning system.
It's also the most direct line between what customers experience and what your team does about it. Your analytics can tell you what happened. A survey tells you how it felt. Those are very different things, and you need both.
The businesses that take this seriously — tracking their CSAT score over time, following up on bad ratings, feeding insights back into product decisions — they're the ones that retain customers and grow through referrals rather than constantly paying to replace the ones who left.
How to Create a Customer Satisfaction Survey
1. Start with the Goal, Not the Questions
Before you write a single question, decide what you're going to do with the answers. Trying to reduce support ticket volume? Focus on resolution and effort questions. Want to improve your product? Ask about expectations vs reality. The goal shapes everything else.
2. Keep It Under Five Questions
Survey fatigue is real. Each additional question drops your completion rate by roughly 5-10%. The sweet spot for transactional surveys is three questions. You can stretch to five if every question earns its place. If you can't justify why a question is there and what you'll change based on the answer, cut it.
3. Lead with a Rating, Close with Open Text
Put your satisfaction rating first — it's quick, sets expectations for the survey length, and gives you a trackable metric. Finish with an optional open-ended question. The word "optional" matters. Making open text required tanks completion rates because people feel pressured to write something meaningful.
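As a sketch, the structure above can be written down as data: rating first, optional open text last, with example questions borrowed from elsewhere in this article. The field names are illustrative, not any particular survey tool's schema.

```python
# Rating first, optional open text last, per the guidance above.
# Field names are illustrative, not a real survey platform's API.
survey = [
    {"type": "rating", "scale": 5, "required": True,
     "question": "How satisfied were you with today's service?"},
    {"type": "choice", "required": True,
     "question": "Was your issue fully resolved?",
     "options": ["Yes", "Partly", "No"]},
    # Keep open text optional: making it required tanks completion rates.
    {"type": "text", "required": False,
     "question": "What could we have done better?"},
]
```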
4. Time It Right
For support interactions, send within an hour of resolution. For purchases, wait 2-3 days — long enough to use the product, short enough to remember the experience. For SaaS onboarding, trigger after the customer completes setup or finishes their first week. Late surveys get vague answers because the details have faded.
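The timing rules above boil down to a lookup table. In this minimal sketch the event names are hypothetical; wire them to whatever events your own stack actually emits.

```python
from datetime import timedelta

# Timing rules from above, keyed by trigger event.
# Event names are hypothetical placeholders, not a real integration.
SURVEY_DELAY = {
    "support_resolved": timedelta(hours=1),    # send within an hour
    "order_delivered": timedelta(days=2),      # 2-3 days post-purchase
    "onboarding_setup_done": timedelta(0),     # trigger right after setup
}

def survey_delay(event: str) -> timedelta:
    """How long to wait after `event` before sending the survey."""
    if event not in SURVEY_DELAY:
        raise ValueError(f"no timing rule for event: {event}")
    return SURVEY_DELAY[event]
```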
5. Use the Channel They're Already On
A survey that appears inside the chat window after a support conversation will outperform an email survey sent an hour later. Why? Zero friction. The customer is already there, already engaged, and tapping a star rating takes two seconds. This is where tools like AI chatbots shine — they can collect feedback as a natural part of the conversation.
Types of Survey Questions (and When to Use Each)
Different question types serve different purposes. Mix them strategically: too many rating questions feel like a test; too many open-ended questions feel like homework.
| Type | Best For | Example |
|---|---|---|
| Star Rating (1-5) | Quick satisfaction snapshots | How satisfied were you with today's service? |
| Numeric Scale (1-10) | NPS-style loyalty and effort scoring | How easy was it to get help today? |
| Multiple Choice | Categorising issues or intent | Was your issue fully resolved? |
| Open-Ended Text | Uncovering issues you didn't think to ask about | What could we have done better? |
When to Send Each Survey
Timing is the difference between a 15% response rate and a 50% response rate. Here's what works for each scenario:
| Survey Type | Send When | Expected Response Rate |
|---|---|---|
| Post-Support | Within 1 hour of resolution | 40-60% (in-chat), 15-25% (email) |
| Post-Purchase | 2-3 days after delivery | 10-20% |
| Onboarding | After setup or first week | 20-35% |
| Website Feedback | Exit intent or 60s on page | 5-15% |
| Quarterly Check-in | Every 90 days | 15-30% |
Mistakes That Kill Your Survey Results
Asking Too Many Questions
Every question after the fifth one is working against you. Completion rates drop, the answers get lazier, and the data becomes less reliable. If you catch yourself writing question number eight, stop and ask which three you'd cut if someone made you.
Leading Questions
"How great was your experience today?" is not a neutral question. Same with "Would you say our team handled things well?" — you're nudging people toward a yes before they've even thought about it. The result is inflated scores that paper over real problems. Ask straight: "How satisfied were you?"
Surveying Everyone the Same Way
A customer who just bought something for the first time has a completely different context from a customer who's been with you for two years. Use different templates for different moments. The templates above are built with this in mind — each one is designed for a specific touchpoint.
Collecting Feedback and Doing Nothing
This is the big one. Asking for feedback creates an implicit promise that you'll act on it. If customers keep telling you the same thing and nothing changes, they stop responding. Close the loop: fix the issue, then tell the customers who flagged it. "You mentioned X last month — we've fixed it" is one of the most powerful retention messages you can send.
Collecting Feedback Through AI Chatbots
The traditional approach — emailing a survey link after the fact — works, but it's getting harder to stand out in inboxes. Response rates for email surveys have been declining for years.
Chatbot-based surveys flip this around. Instead of asking the customer to go somewhere else and fill out a form, the survey happens inside the conversation they're already having. The chatbot resolves their question, then asks one or two follow-up questions right there. No context switch, no extra clicks.
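To make the flow concrete, here is a minimal sketch of an in-chat post-resolution survey. The `conversation` object with `ask()` and `say()` methods is an assumption for illustration; a real chatbot platform's API will differ.

```python
# Minimal sketch of an in-chat survey, assuming a hypothetical
# `conversation` object with ask() and say() methods.
def run_post_resolution_survey(conversation):
    """Ask one rating and, on low scores, one optional open-text question."""
    rating = int(conversation.ask(
        "Quick question before you go: how satisfied were you today? (1-5)"
    ))
    response = {"rating": rating, "comment": None}
    if rating <= 3:
        # Only dig deeper on low scores; keep happy customers' exit fast.
        response["comment"] = conversation.ask(
            "Sorry to hear that. What could we have done better? (optional)"
        )
    else:
        conversation.say("Thanks for the feedback!")
    return response
```

Because the questions ride inside the conversation, there is no context switch: the customer answers where they already are.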
The numbers back this up. In-chat surveys consistently see 2-3x higher response rates compared to email. And because the feedback is immediate, it's more accurate — customers are rating the experience they just had, not a vaguely remembered version of it from two days ago.
- 2-3x: higher response rate vs email
- 24/7: feedback collection around the clock
- 0 min: delay between experience and survey
Collect satisfaction feedback automatically
Automate Support lets you build an AI chatbot that handles support questions and collects customer feedback — all in one conversation. Train it on your docs, deploy it on your site, and start getting real feedback from day one.
Frequently Asked Questions
What questions should I include in a customer satisfaction survey?
Start with one rating question (the classic 'How satisfied were you?'), add one about whether the issue was resolved, and finish with an open-ended 'anything we could do better?' Three questions is often enough. Go past five and your completion rate drops off a cliff.
How long should a customer satisfaction survey be?
Measure in minutes, not questions. If someone can finish it while waiting for a page to load, you're in the right range. Two to five questions for one-off interactions, up to eight for quarterly check-ins where you have more goodwill to spend. The real test: would you fill it out yourself at the end of a busy day?
When is the best time to send a satisfaction survey?
While the details are still sharp. A customer who rates a support call 20 minutes later is giving you accurate data. That same customer three days later is rating a memory — and memories are unreliable. The exception is post-purchase surveys, where you need to give people time to actually use what they bought.
What is a good response rate for customer satisfaction surveys?
Email surveys typically get 10-30%. In-app or post-chat surveys can hit 40-60% because the customer is already engaged. If you're below 10%, your survey is probably too long, poorly timed, or both. Short surveys sent at the right moment consistently outperform everything else.
Should I use a 5-point or 10-point scale?
Five-point for most situations. It's simple, people don't overthink it, and the data is easier to interpret. Ten-point scales work better for NPS-style questions where you need finer granularity. The worst choice? An unfamiliar even-numbered scale like 4 or 6 points: with no neutral midpoint, ambivalent customers are forced to pick a side, and your data gets muddier.
How can I improve my survey response rate?
Make it feel like part of the experience, not an interruption. Personalise the ask — 'How did we do with your refund?' beats 'Please take our survey.' Remove every unnecessary click between the customer and the first question. And never, ever send a survey on a Monday morning when their inbox is already full.
Can I run satisfaction surveys through a chatbot?
Yes, and it works better than you'd expect. A chatbot can ask one or two questions right after resolving an issue — the customer is already in the conversation, so there's no friction. Response rates for in-chat surveys are typically 2-3x higher than email surveys.
What is a customer satisfaction survey?
It's a structured way to ask customers how they feel about a specific experience — a support call, a purchase, onboarding, or your product in general. You collect responses using a mix of rating scales, multiple choice, and open-ended questions, then use the results to find patterns: what's working, what's broken, and where to focus next.
How do I analyse customer satisfaction survey results?
Start with the number: calculate your CSAT score (satisfied responses divided by total, times 100). Then segment it — by channel, by agent, by product, by customer type. A company-wide 78% means nothing if your chat support is at 90% and your email support is at 55%. The segments tell you where to act. Finally, read the open-ended responses. The patterns in free text are often more valuable than the scores.
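As a minimal sketch of that calculation and the segmentation step, here is the CSAT formula in Python. The sample ratings, the channel names, and the convention of counting 4s and 5s on a 5-point scale as "satisfied" are illustrative assumptions.

```python
# CSAT as described above: satisfied responses / total responses * 100.
# Convention assumed here: on a 5-point scale, 4s and 5s count as satisfied.
def csat(ratings, satisfied_threshold=4):
    """Return CSAT as a percentage, or None for an empty sample."""
    if not ratings:
        return None
    satisfied = sum(1 for r in ratings if r >= satisfied_threshold)
    return round(satisfied / len(ratings) * 100, 1)

# Made-up sample data to show why segmentation matters.
responses = [
    {"channel": "chat", "rating": 5}, {"channel": "chat", "rating": 4},
    {"channel": "chat", "rating": 5}, {"channel": "email", "rating": 2},
    {"channel": "email", "rating": 4}, {"channel": "email", "rating": 3},
]

overall = csat([r["rating"] for r in responses])              # 66.7
by_channel = {
    ch: csat([r["rating"] for r in responses if r["channel"] == ch])
    for ch in {r["channel"] for r in responses}
}
# by_channel -> {"chat": 100.0, "email": 33.3}
```

The overall 66.7% hides exactly the kind of gap the paragraph above describes: chat at 100% and email at 33.3% call for very different actions.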
How many responses do I need for reliable results?
For a rough directional signal, 30-50 responses will show you clear trends. For statistically meaningful data you can compare month over month, aim for 100+. The real question isn't sample size though — it's whether the people responding are representative. If only your happiest customers bother to reply, your score looks great but tells you nothing useful.
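One quick way to sanity-check those thresholds is the normal-approximation margin of error for a proportion: a back-of-envelope formula, not a substitute for a proper sampling analysis, and it says nothing about the representativeness problem above.

```python
import math

# Rough 95%-confidence margin of error for a proportion p measured
# from n responses, via the normal approximation: z * sqrt(p*(1-p)/n).
def margin_of_error(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

# An 80% CSAT looks very different at 40 vs 400 responses:
# n=40  -> about +/- 12.4 percentage points
# n=400 -> about +/- 3.9 percentage points
```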
What's the difference between CSAT and a general customer feedback survey?
CSAT is specific and scored — you're measuring satisfaction with one thing, at one moment, on a defined scale. A general feedback survey is broader and often exploratory: 'Tell us what you think.' CSAT gives you a number you can track. Feedback surveys give you themes you can explore. Most teams need both, but at different times.
Should I make survey responses anonymous?
It depends on what you're optimising for. Anonymous surveys get more honest negative feedback — people say things they'd never attach their name to. But you lose the ability to follow up, close the loop, or connect feedback to account data. For transactional surveys (post-support, post-purchase), tie responses to accounts so you can act on them. For broader sentiment surveys, consider anonymous options.
How do I write survey questions that avoid bias?
Use neutral phrasing. 'How satisfied were you?' instead of 'How great was your experience?' Don't stack positive options — if three out of five choices are variations of 'good,' you're skewing the results. Test your questions on a colleague who knows nothing about the project. If they can guess which answer you're hoping for, rewrite the question.