What is a Response Rate? Definition, Examples & Best Practices
Response rate measures how many people actually complete a survey compared to how many were asked. It is expressed as a percentage and used to assess the representativeness of survey data. A low response rate does not automatically invalidate results, but it raises questions about whether the respondents who did reply reflect the broader group. Response rate is tracked across all survey types, from customer feedback forms to academic studies and employee engagement polls.
Response Rate Definition
Response rate is the proportion of people who complete a survey relative to the number who were invited to take it. It is the standard measure of participation in survey research and one of the first things to check when assessing whether survey data can be trusted.
A high response rate means most people who were asked actually responded. A low response rate means the data comes from a small fraction of the intended sample, which increases the risk that respondents are not representative of the broader group you wanted to study.
Response rate is distinct from completion rate. Response rate measures who started the survey out of everyone invited. Completion rate measures who finished it out of everyone who started. Both matter, but they answer different questions.
Response Rate Formula and Example
The formula is straightforward:
Response Rate (%) = (Completed Responses / Total Invitations Sent) x 100
Example: You send a customer satisfaction survey to 800 people. 184 complete it. 184 / 800 x 100 = 23% response rate.
There are variations in how researchers count the denominator. Some exclude invitations that bounced or were undeliverable. Others include only people who opened the email, not everyone who received it. The most conservative and widely used definition is total invitations sent, including bounces, because it reflects the full picture of who you attempted to reach. The numerator can vary too: the formula above counts only completed responses, though some teams count anyone who started the survey, which is why partial completions are worth tracking separately.
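If you are tallying responses yourself rather than reading the figure off a dashboard, the calculation and the denominator choice fit in a few lines of code. The sketch below is purely illustrative: the function name and the bounce count are hypothetical, not part of any survey platform's API.

```python
def response_rate(completed: int, invitations_sent: int,
                  bounces: int = 0, exclude_bounces: bool = False) -> float:
    """Response rate as a percentage.

    By default the denominator is every invitation sent (the conservative
    definition). Set exclude_bounces=True to drop undeliverable invitations.
    """
    denominator = invitations_sent - bounces if exclude_bounces else invitations_sent
    if denominator <= 0:
        raise ValueError("denominator must be positive")
    return completed / denominator * 100

# The example above: 800 invitations, 184 completed responses.
print(round(response_rate(184, 800), 1))                                    # 23.0
# Excluding a hypothetical 40 bounced emails nudges the rate upward.
print(round(response_rate(184, 800, bounces=40, exclude_bounces=True), 1))  # 24.2
```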
MindProbe shows response rate in real time on the survey dashboard as responses come in, alongside the number of invitations sent and the number of partial completions.
What Is a Good Response Rate: Benchmarks by Survey Type
There is no single answer to what counts as a good response rate. It depends entirely on the survey type, the relationship between sender and respondent, the survey length, and the channel used.
Response rates vary significantly depending on survey context and audience relationship.
- Internal employee surveys tend to achieve the highest engagement, typically between 60 and 80%, reflecting the captive nature of the audience and organisational accountability.
- Customer satisfaction surveys sent to existing customers follow at 20 to 40%, benefiting from an established relationship with the brand.
- B2B surveys to warm lists typically land between 15 and 30%.
- Academic and research surveys see similar ranges of 10 to 30%.
- Consumer surveys using cold panels drop to 5 to 15%.
- Intercept surveys, delivered in-product or on-site, sit in a comparable range of 5 to 20%.
- At the lower end, cold email outreach surveys see the weakest returns, often achieving just 1 to 10%, reflecting the challenge of engaging audiences with no prior connection to the sender.
These ranges are drawn from published benchmarks including Mailchimp's Email Marketing Benchmarks report (2023), the American Association for Public Opinion Research (AAPOR) standards, and SurveyMonkey's industry benchmark data.
Context matters more than the number itself. A 12% response rate from a cold B2B list of 5,000 contacts is strong. A 12% response rate from your own customer base after multiple follow-ups is a warning sign. Always interpret response rate relative to the relationship, the channel, and whether the people who did respond are likely to represent those who did not.
Factors That Affect Response Rate
Response rate is influenced by factors on both the sender side and the respondent side.
Survey length. Shorter surveys get more responses. Bain and Company found completion rates drop noticeably beyond 10 to 12 minutes. Every unnecessary question costs you responses.
Relationship strength. People are more likely to complete a survey from an organisation they trust or have a relationship with. Internal surveys outperform external ones. Customer surveys outperform cold outreach.
Perceived value. If respondents believe their answers will lead to something, they are more likely to complete the survey. Surveys that explain why the data is being collected and what will happen with it tend to perform better than those that do not.
Timing and channel. Email surveys sent on Tuesday to Thursday mornings consistently outperform those sent late on Fridays. In-product surveys shown at a natural break point outperform those that interrupt a task mid-flow.
Anonymity. Respondents are more likely to complete surveys, and to answer honestly, when they know their individual responses will not be attributed to them. This is especially relevant for employee surveys.
Incentives. Modest incentives, such as prize draws or small rewards, can improve response rates in consumer research. For employee surveys, incentives are generally less effective than visible leadership commitment to acting on results.
8 Ways to Improve Response Rate
- Keep it short. Remove every question that is not directly tied to a decision. Aim for five to seven minutes maximum. Use skip logic to route respondents past questions that do not apply to them, so no one sees more questions than necessary.
- Write a clear subject line. For email surveys, the subject line determines whether the survey gets opened at all. Avoid generic lines like "We value your feedback." Specificity works better: "Two minutes on your recent support experience."
- Personalise the invitation. Emails that address the recipient by name and reference a specific interaction achieve higher open rates than generic broadcasts. Even basic personalisation makes a measurable difference.
- Explain what you will do with the data. Tell respondents how the results will be used and whether they will be shared. Survey Lab (2022) found that surveys with a stated purpose achieved 14 percentage points higher response rates than those without.
- Send at the right time. Tuesday to Thursday, mid-morning tends to outperform other slots for email surveys. Avoid Mondays (inboxes are full) and Fridays (people are winding down).
- Follow up once. A single reminder to non-responders typically accounts for 20 to 40 percent of the final response pool. More than one follow-up rarely improves things and risks irritating respondents.
- Make anonymity clear. If the survey is anonymous, say so explicitly in the invitation and on the first page of the survey. Respondents who are unsure whether their answers are traceable tend to either not respond or give socially desirable answers.
- Close the loop. Share a summary of results with respondents after the survey closes. This builds trust for future surveys and signals that participation leads to action. MindProbe's results-sharing feature lets you send a brief summary to all respondents with a single email.
Frequently Asked Questions
What counts as a good response rate?
It depends on the survey type. Internal employee surveys typically achieve 60 to 80 percent when senior leadership visibly supports them. Customer surveys average 20 to 40 percent from warm lists. Cold outreach surveys often see single digits. Rather than chasing a benchmark, ask whether the respondents who replied are likely to represent those who did not. A 15 percent rate from a genuinely random sample can be more useful than a 60 percent rate from a biased one.
How do you calculate response rate?
Divide the number of completed responses by the total number of invitations sent, then multiply by 100. If you sent 500 invitations and received 90 completed responses, your response rate is 18 percent. Some researchers exclude undeliverable emails from the denominator. The most conservative approach is to include all sent invitations, whether they were opened or not.
What is the difference between response rate and completion rate?
Response rate measures the percentage of invited people who started the survey. Completion rate measures the percentage of people who started the survey and finished it. Both matter. A high response rate with a low completion rate often points to a survey that is too long or has a confusing question partway through. MindProbe tracks both metrics separately in the survey dashboard.
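As a concrete illustration of the difference, here is a short, hypothetical calculation (the counts below are invented for the example, not benchmarks):

```python
invitations_sent = 500   # hypothetical counts, for illustration only
started = 140
completed = 90

response_rate = started / invitations_sent * 100   # participation among everyone invited
completion_rate = completed / started * 100         # finishers among those who started

print(f"Response rate: {response_rate:.0f}%")       # 28%
print(f"Completion rate: {completion_rate:.0f}%")   # 64%
```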
Does a low response rate invalidate the results?
Not automatically. A low response rate raises questions about non-response bias: the idea that people who did not respond might answer differently from those who did. Whether that matters depends on whether the non-respondents are systematically different from respondents. In some cases they are not, and a low response rate does not affect the conclusions. In others it does, and the data should be treated with caution.
Does response rate affect statistical significance?
Statistical significance is determined by sample size, not response rate. You need a large enough number of completed responses to detect a meaningful difference at your chosen confidence level, typically 95 percent. For most business surveys, 100 to 200 completed responses is sufficient to draw reliable conclusions from a population of several thousand. A higher response rate helps only insofar as it increases your completed response count.
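To make the sample-size point concrete, here is a minimal sketch of the standard worst-case margin-of-error calculation with a finite population correction, at 95 percent confidence. The counts used are illustrative assumptions, not benchmarks.

```python
import math

def margin_of_error(completed: int, population: int,
                    z: float = 1.96, p: float = 0.5) -> float:
    """Worst-case margin of error (as a fraction) for a proportion estimated
    from `completed` responses drawn from a finite `population`.
    z = 1.96 corresponds to a 95 percent confidence level."""
    fpc = math.sqrt((population - completed) / (population - 1))  # finite population correction
    return z * math.sqrt(p * (1 - p) / completed) * fpc

# Illustrative: 150 completed responses from a population of 5,000
print(f"{margin_of_error(150, 5000):.1%}")   # about 7.9%
```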
How much does survey length affect response rate?
Strongly. Bain and Company research found completion rates fall noticeably beyond 10 to 12 minutes of estimated survey length. SurveyMonkey's benchmark data shows surveys under five minutes achieve roughly twice the completion rate of surveys over ten minutes. Every question you add is a trade-off against participation. Use skip logic to keep each respondent's experience as short as possible.