Need help designing a survey? Look no further. I’ve compiled a fairly comprehensive set of survey design tips that I hope you’ll find helpful. There are other variables to consider depending on the type of survey and the data collection methodology, but these general guidelines should apply regardless.
General Survey Guidelines
1. First and foremost, define and know your objective! As the saying goes, “garbage in, garbage out.” If you don’t have an objective in mind, your survey initiative will fail. Think about how you will analyze the responses and ask the questions in an appropriate manner.
2. Open your survey with a brief introduction, and state your objective (in customer-friendly terms) there, as well. Respondents want to know why you’re conducting this survey and what you’re going to do with their responses. Don’t set expectations here about actions and follow-up that won’t be executed. Also give an honest indication of how long the survey is or how long it will take.
3. Think about survey/question flow. Start with questions that warm up the respondent to the topic. As you dive into the survey, put questions in a natural, logical flow and in sections rather than jumping around in some illogical sequence. For example, in a post-transactional survey, ask questions in the order of the experience; brand awareness surveys, meanwhile, come with their own set of requirements for how questions should be asked.
4. Know the reason for, and impact of, question placement. If you ask overall satisfaction at the beginning of the survey, you are getting a top-of-mind rating. If you place the question at the end of the survey, you have taken the respondent through the experience again via the flow of the survey and the questions asked, so the overall satisfaction rating will reflect that experience. You will get two different scores, depending on placement. Several years ago, I tested this theory on seven different surveys for seven different clients, and when the osat question was asked first, the score was always lower. I had a client who insisted on moving the question from the end of the survey to the beginning after years of having it at the end. I warned that the score would drop if we did that; the client still chose to move the question, and in the end, their osat score dropped one full point (on a 10-point scale) from the previous year! (Know that any discussion around placement of the osat question can be a “religious” one, and there could be a variety of differing views and opinions on this topic.)
5. Be mindful of survey length. Transactional surveys can be brief, e.g., 10-15 questions max, whereas relationship surveys can be a bit longer, e.g., 50 questions (where respondents only see those questions relevant to them, in essence making the survey shorter). Other methodologies may call for longer surveys. Use attribute grids to group related questions that share the same rating scale. And don’t forget progress meters to let respondents know where they are in the survey.
6. Ask a mix of closed-ended and open-ended questions. It is not necessary to ask an open-ended question after every closed-ended question, e.g., every rating question. As a matter of fact, I strongly suggest you limit the number of open-ended questions in your survey. You need to have at least one, but don’t have 20!
7. Question relevance also impacts survey length; each of the following will help keep your questions relevant.
- Don’t ask things you already know about the customer, e.g., last purchase date, product purchased, date of support call, etc.
- Only ask questions that are relevant to that customer and his/her experience. For example, if you know the customer owns Product X and Product Y and recently called about support for Product X, don’t ask questions about Product Y, too. Or don’t ask questions about marketing materials in a support post-transactional survey.
- Don’t allow other groups or departments to commandeer the survey by adding questions that are not relevant to the survey objective.
- Use smart survey techniques to skip questions not relevant based on responses to previous questions.
8. Don’t use company or industry lingo/language that your customers don’t know or understand. Just like the customer experience, think about the survey from the customer’s perspective. If you must use such jargon, be sure to define it in customer terms.
9. Speaking of language, if your survey is going out to a global audience, be sure to offer respondents the option to take the survey in their preferred language.
10. Remember that you cannot collect personal information from anyone under 13 without parental consent.
When in doubt about general survey and sampling guidelines, follow the CASRO Code of Standards.
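The skip technique from tip 7 can be sketched in code. This is a minimal illustration, not any particular survey platform's API; the question IDs, the `show_if` field, and the product names are all hypothetical.

```python
# A minimal sketch of skip logic: each question may carry a "show_if"
# condition on earlier answers; questions whose condition fails are
# skipped. All IDs and fields here are hypothetical examples.
questions = [
    {"id": "owns_product_x", "text": "Do you own Product X?"},
    {"id": "x_satisfaction",
     "text": "How satisfied are you with your Product X support call?",
     "show_if": lambda answers: answers.get("owns_product_x") == "yes"},
    {"id": "recommend", "text": "How likely are you to recommend us?"},
]

def visible_questions(answers):
    """Return only the questions relevant given the answers so far."""
    return [q for q in questions
            if q.get("show_if", lambda a: True)(answers)]

# A respondent who doesn't own Product X never sees the Product X
# satisfaction question.
relevant = visible_questions({"owns_product_x": "no"})
```

Real survey tools express this declaratively (branching or display logic), but the principle is the same: relevance is evaluated per respondent, so irrelevant questions never add to survey length.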
Question Writing Guidelines
1. Don’t ask double-barreled or compound questions. That means, keep your question to just one thought and not a couple. For example, if you ask about “quality and timeliness of issue resolution,” I’m not really sure how to answer that. You have just asked me about two concepts: quality and timeliness. What if the quality was great, but it took you forever to resolve the issue?
2. Make sure your questions are not ambiguous. Write questions clearly. If a respondent pauses and says, “What do they mean by that?”, then the question is poorly constructed.
3. Ensure that the questions are actionable. Ask yourself, “If someone rated that question poorly, what would I fix as a result of that?” If you can’t answer that question, then throw out the question.
4. Similarly, every question should have an owner. If you can’t attribute the question to a department or individual who owns its response or rating, pitch it. You’re just asking for the sake of asking. (Granted, there will be some questions, e.g., demographics, that don’t fit that requirement and will be needed to make the survey analysis more robust and the data actionable.)
5. Your question response choices and rating scales should be mutually exclusive. And do your homework; make sure you provide a complete list of response choices. I hate when the one answer that should be there is missing. Be sure to provide an “Other (please specify)” when appropriate.
6. Don’t ask leading or biased questions. “We know you loved our new soft drink. How much did you love it?”
7. Use randomization of response choices to avoid positioning bias; but use this judiciously, as it doesn’t make sense for every list of response choices.
8. Use proper grammar and make sure you spell check!
9. Offer an “out” for questions, where appropriate. For example, not everyone wants to tell you their household income or about their children, and you may ask some questions for which they genuinely don’t have an answer. Similarly, do not make every question in the survey required. This really makes for an awful respondent experience.
10. For open-ended questions, be specific. Ask exactly what you want to know, e.g., “What can we do to ensure you rate us a 10 on overall satisfaction next time?” Or, “Tell us the most important reason you recommended us to your friends.”
11. And, last but certainly not least, I’ll briefly address question scales. Like placement of the osat question, question scales are a religious discussion. Get 10 researchers in a room and get 10 different views of which scale is best and when. My point on scales will be this: be consistent on your use of scales within a survey. Clients have handed me surveys to review that have five different scales within each survey. That’s a disaster for a variety of reasons, not the least of which is the respondent experience.
12. Don’t forget to thank your respondents for their time at the end of the survey!
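The randomization advice in question-writing tip 7 can be sketched as follows. This is a hedged illustration, not any survey tool's actual behavior; the choice values are hypothetical. One common refinement shown here is pinning “Other (please specify)” to the end so respondents can always find it.

```python
import random

def randomized_choices(choices, pinned=("Other (please specify)",)):
    """Shuffle response choices to reduce position bias, while keeping
    any pinned options (e.g., "Other") at the end in their original
    order. The choice values used below are hypothetical examples."""
    movable = [c for c in choices if c not in pinned]
    fixed = [c for c in choices if c in pinned]
    random.shuffle(movable)
    return movable + fixed

# Example: the first three options appear in a random order for each
# respondent; "Other (please specify)" is always last.
options = randomized_choices(
    ["Price", "Quality", "Customer service", "Other (please specify)"])
```

Most survey platforms offer this as a per-question setting (often called answer rotation or randomization, with an option to anchor specific choices), which is the declarative equivalent of the sketch above.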
I hope these tips are helpful. The main thing to keep in mind… as CX professionals, we know we need to think about the experience with a company from the customer perspective. The survey design process is no different: think about the customer experience as you design the surveys. After all, surveys in their simplest form are just another touchpoint that you’ll want to execute flawlessly.
Come back for my next post, when I outline how to maximize response rates.
Image courtesy of Pixabay.
Great article! One question, though: I was always of the mindset that you should ask OSAT FIRST. The theory being that you get their overall reaction (the one they would likely pass along to friends, etc.) rather than dragging them through the weeds of every part of the process and then asking OSAT at the end, at which point, if any part of their experience was not perfect (and you have now reminded them of that), they will be less likely to give a top-box rating.
What are your thoughts on this?
Thank you! Placement of the osat question depends on what you're trying to achieve. There are certainly times when you want top-of-mind osat, but especially for transactional surveys, you want to take them through the weeds to get a fair assessment of how each component of the transaction impacted osat.
This is a great article! I work for a translation company that specializes in market research and survey translation. I appreciate you mentioning language in your post! Can we connect offline – I would love to discuss writing guest posts for one another. (email@example.com)
Thanks, Brenna. I'll connect with you offline.
Good article – but I'd suggest caution regarding progress meters. E.g., if pages are of unequal length the meters can be inaccurate if they're simply based on a page count. If skip logic is being used the meters can be confusing as they can show unexpected jumps (as the respondent unknowingly skips past a page of which they're unaware).
Agreed. Always a good idea to advise pros and cons when it comes to progress meters.
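One way around the page-count problem raised above is to base the meter on question counts rather than page counts. A rough sketch, with all names and numbers illustrative:

```python
def progress_percent(questions_answered, total_visible_questions):
    """Estimate progress from question counts rather than page counts,
    so pages of unequal length distort the meter less. If skip logic
    changes the visible total mid-survey, recompute against the new
    total to avoid the confusing jumps described above."""
    if total_visible_questions <= 0:
        return 100
    pct = 100 * questions_answered / total_visible_questions
    return min(100, round(pct))
```

Even this is an approximation: when a respondent's answers prune later questions, the denominator shrinks and the meter can still jump forward, which is why advising respondents that the estimate is approximate remains a good idea.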
Great article! I am interested in point number 4 regarding question placement. Is there any empirical research supporting that osat scores are lower when the question is asked at the beginning rather than at the end?
Thank you, and thank you for commenting. The only research I have is what I've done, as noted in the post.
This comment has been removed by a blog administrator.
I appreciate comments, but please no self-promotion. Thank you!
It is critical to align the data collection approach (e.g., telephone or web-based) with the questionnaire design. Implementing a web-based versus a telephone study questionnaire requires treating some design issues very differently.