Oops! 10 User Survey Design Mistakes & How To Avoid Them

As you begin a Conversion Rate Optimization project, you’re hopefully using various tools to get feedback from users about the strengths and weaknesses of your site.  One inexpensive tool for getting wonderful feedback is a user survey, sometimes presented as a customer exits your site, or sometimes emailed out to an existing list.

However, even an inexpensive tool can prove costly if you don’t implement your survey well.  A poor implementation can result in low response rates (and therefore little useful information), and can even alienate potential and existing customers.  Now that is costly at any price!

Don’t make these mistakes in designing & implementing your survey.

  1. Failing to explain the purpose & set your respondent at ease.  You’re asking for help from someone, so make it as easy as possible for them.  Tell them why you’re doing this and how they were chosen as recipients of the survey.  Let them know about how long it will take to fill out the survey.  Assure them that their individual answers will be kept private and that they will not be marketed to based on their answers.  If you don’t set them at ease up front, far fewer will take the time to go through the survey.
  2. Not offering an incentive to complete the survey.  Again, you’re asking for a favor.  Make it worth their while.  It can be as simple as entering them into a drawing for a prize, or giving them a free download of a whitepaper.  Or it could be a coupon for a future purchase.  The point is to find something (that you can afford) to offer them in return for their time.  If there is no incentive, some may help you just out of kindness, but you’ll get a much lower response rate.
  3. Not showing a progress indicator.  We know that showing a progress indicator during an ecommerce checkout process increases conversions.  The same principle applies here.  If your respondents do not know how much more work is required of them, many will drop out before finishing—often unknowingly just a couple of questions away from the end.  If they know they’re almost finished, they are much more likely to complete the survey.  (A minimal sketch of a progress indicator follows this list.)
  4. Using long questions that are difficult to understand.  Answering these questions should be a simple task for your respondents.  If they have to wade through technical jargon or puzzle out what you are trying to ask, they’ll either drop out of the survey entirely, or at best give you inaccurate information because they didn’t completely understand the question.  Keep your questions short, simple, and easy to comprehend.
  5. Asking leading questions.  Being close to the web site and its objectives, you bring some inherent biases to writing questions.  It’s critical that you check those biases at the door.  Be sure, therefore, not to write questions that are set up to lead to certain results.  For example, rather than asking something like, “Would you prefer to talk to someone live on the phone if you have problems ordering?”, go with something more open such as “How do you prefer to get help when you have problems ordering?”  You could follow that with choices or leave the question open-ended for them to fill in an answer.  Which leads me to…
  6. Failing to include some open-ended questions, or using too many of them.  If your entire survey is a series of Yes/No and multiple-choice questions or ratings, you’re losing out on the “juice” of the survey.  There should be some questions where respondents have the opportunity to say what’s on their minds.  These could be things they like about a particular function or the site itself, as well as things they had trouble with or think could be improved.  If your average rating of the site is 1 out of 5, you know you have to improve things, but you don’t know what unless you let people tell you why they rated it so low.  However, don’t load the survey up with open-ended questions.  If you have page after page of questions asking people to get creative, they’ll tire quickly and either give you less valuable answers or quit the survey entirely.
  7. Making too many questions required.  There are just too many situations where someone doesn’t want to, or can’t, answer a question.  It could be that the question is too personal.  Perhaps it asks about a section of the web site they never used.  Maybe they misunderstand it and think it doesn’t apply.  But one of the surest ways to get a respondent to quit the survey early is to arrogantly tell them you’re not interested in their responses to any other questions unless they answer this one.  It takes control away from the respondent and breeds ill will.  Look at the questions you are marking as required.  For each one, you should be able to justify that the entire survey is worthless if that question goes unanswered.  Very few questions pass that test.  Think about it this way: you get more information back if 50% of your respondents answer 90% of your questions than if only 35% answer them all (see the worked comparison after this list).  If there is a question that you do want to require, still give your respondents an “out” by including N/A or Other as one of the choices.
  8. Asking demographic information at the beginning.  If you start your survey by asking for sensitive, personal information such as race, income, or age, you raise red flags immediately with your respondents, even if you did state in the survey’s introduction that all information would be kept private.  Reorder your survey to ask the questions about the site experience first, and end with just a few optional demographic questions, if you need them at all.  Keep that set small, make the questions optional, and place them at the end, where the respondent will be more engaged.
  9. Failing to test the survey first.  I know, it seems so simple.  It’s just a survey, a set of questions and answers.  It doesn’t seem possible to screw it up.  But almost every time we’ve run an initial test prior to sending out the survey for real, needed corrections have surfaced quickly.  Send the survey to a half-dozen internal respondents—people who have used the site but who are not involved in its development or marketing.  Ask them to take the survey (not just review it), and to notify you of anything they didn’t understand, along with any typographical, grammar, or logic errors they noticed.  Try to make sure your test recipients include both mobile and desktop users.  And confirm that all of the test users’ responses were recorded correctly before wiping those answers out.
  10. Not sending out a reminder prior to the submission deadline.  At some point, you’re going to close the survey and analyze the results.  However, many people are willing, even eager, to help you by answering the survey, but received notice of it at an inconvenient time.  They tend to set the email aside with every intention of coming back to it, but never do, either because they forgot about it or because it got pushed further and further down their list.  Two or three days before you close the survey, send an email reminder to those who did not respond, politely requesting they give it another look.  Be sure to mention the incentive again to reiterate the benefit they receive from filling it out.  (A sketch of this reminder logic appears after this list.)
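
To make mistake #3 concrete, here is a minimal sketch of a progress indicator in TypeScript.  The element IDs and the one-question-per-step structure are assumptions for illustration, not a prescription for any particular survey tool.

```typescript
// Minimal sketch of a survey progress indicator.
// Assumptions: one step per survey page; the page contains a bar element
// and a label element with the (hypothetical) IDs used below.
function updateProgress(currentStep: number, totalSteps: number): void {
  const percent = Math.round((currentStep / totalSteps) * 100);
  const bar = document.getElementById("survey-progress-bar");
  const label = document.getElementById("survey-progress-label");
  if (!bar || !label) return; // elements not present on this page

  bar.style.width = `${percent}%`;
  label.textContent = `Question ${currentStep} of ${totalSteps} (${percent}% complete)`;
}

// Call whenever the respondent moves to a new page, e.g.:
// updateProgress(7, 10); // "Question 7 of 10 (70% complete)"
```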
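
And here is the worked comparison behind mistake #7.  The figures below assume a hypothetical 20-question survey sent to 100 people; the point is only that a lenient survey with a higher response rate yields more total answers than a strict, fully-required one.

```typescript
// Expected answers collected per 100 invitees, for a 20-question survey.
const questions = 20;
const invitees = 100;

// Lenient survey: 50% respond, each answering ~90% of the questions.
const lenient = invitees * 0.50 * (0.90 * questions); // 900 answers

// Strict survey: every question required, so only 35% respond at all.
const strict = invitees * 0.35 * (1.00 * questions); // 700 answers

console.log({ lenient, strict }); // { lenient: 900, strict: 700 }
```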
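
Finally, a sketch of the reminder pass from mistake #10.  The `Invitee` shape and the `sendEmail` callback are assumptions standing in for whatever mailing list and email service you actually use; the subject and body are placeholder copy.

```typescript
// Hypothetical reminder pass: a few days before the survey closes, nudge
// everyone who was invited but has not yet responded.
interface Invitee {
  email: string;
  responded: boolean;
}

const MS_PER_DAY = 1000 * 60 * 60 * 24;

function queueReminders(
  invitees: Invitee[],
  closeDate: Date,
  sendEmail: (to: string, subject: string, body: string) => void
): void {
  const daysLeft = (closeDate.getTime() - Date.now()) / MS_PER_DAY;
  if (daysLeft > 3 || daysLeft < 0) return; // only remind in the final window

  for (const invitee of invitees.filter((i) => !i.responded)) {
    sendEmail(
      invitee.email,
      "A few days left to share your feedback",
      "The survey closes soon. It takes about five minutes, and you will " +
        "receive your discount code as a thank-you when you finish."
    );
  }
}
```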

You’re never going to get a 100% response rate, let alone useful data from every respondent.  But if you set up your survey correctly, you can get far more valuable information back from your web site’s user base.  And the list doesn’t stop here.  Please leave a comment with tips or mistakes to avoid when crafting a user survey.
