As user researchers, we tend to favour more in-depth qualitative approaches to finding out about user needs and behaviours. But this type of research takes time, and can only ever cover a very small percentage of our users. Surveys can, theoretically, give us insight into the thoughts and opinions of large numbers of people in a quick and easy manner. While surveys have their place in feeding into research projects, they can be very dangerous indeed.
To quote user researcher Erika Hall:
"Surveys are the most dangerous research tool — misunderstood and misused. They frequently straddle the qualitative and quantitative, and at their worst represent the worst of both."
Erika makes several points about the dangers involved in using surveys as a user research tool. While I don’t disagree with her warnings, there are ways to at least reduce the risk of derailing your research with invalid survey data.
Set a clear objective
Before you begin thinking about what you want to ask in your survey, you need to set a clear objective: what are you aiming to learn? Without an objective you may find you’re asking questions with no real purpose.
Examples of objectives include:
- To understand what factors are most important for people when choosing a travel agent.
- To understand what motivates people to give to different types of charity.
- To understand what is important for students when considering which courses to study.
A clear objective will give you something to hang all your questions off and ensure that your responses will help you achieve your goal. You need to be realistic though. Accept that the survey will not answer the objective on its own. Additional research is likely to be required. Surveys are just one piece of the research puzzle, but they can be a good starting point.
Ask the right questions
Our own Chris How wrote about the importance of asking the right questions. When it comes to surveys, you need to consider that this will be an unmoderated activity, so your questions need to make sense straight away. You won’t have a chance to explain them. Think carefully about the wording. Avoid jargon or ambiguity.
It’s always a good idea to do a dry run and sense-check your questions. Ask a few other people to complete your survey before sending it out to your chosen audience, to see if it makes sense to them. Ironing out any ambiguity or confusion at this stage means you can release the survey with confidence.
The Clearleft question protocol can also help ensure your questions are on the right track. A particularly important point when it comes to survey questions is thinking about how the information will be used and how it might affect future decisions. Carefully consider the questions and think about what you can learn from the possible answers. This will help to sense-check whether you need to ask the question in the first place.
The other big consideration here is how you validate that the information entered is true and accurate. As Erika Hall says, “bad surveys don’t smell”. If a large proportion of your respondents are clicking random options in a multiple-choice survey, then you need to make sure you’re aware of this, and ideally prevent it from happening in the first place. One way to check for low-quality responses is to include some open questions and keep an eye out for people writing nonsense answers in these. You can then remove their responses completely if you feel that they’re not taking the survey seriously.
The number of questions you ask is also likely to impact the overall quality of your results. The longer the survey, the more chance that people will get bored or frustrated while completing it and will default to just clicking through on anything in an attempt to get it finished.
Ask the right people
As with all elements of user research, who you run your research with is crucial to the validity of your results. The people who complete your survey need to be the right audience for your research. They might be customers, potential customers or just people with specific needs or interests. You’ll want to consider the demographics of your participants too. Think carefully about who you want to complete your survey, and make sure you target them as accurately as possible.
Not only do you need to find the right type of people, but these people also need to be willing to find the time to complete your survey, and to take it seriously. Even if you have the perfect panel to undertake the survey, the results will be meaningless if they don’t think about their answers or they don’t answer honestly. Incentives may help encourage people to take part, but they may also attract people who are only interested in the reward and will rush through their answers. There’s no easy way to ensure that people take your survey seriously, but there are some tips in the following section on how to remove dubious responses.
Finally, you’ll want to make sure you have the right number of people completing the survey. There are lots of complicated stats around this, but fortunately a survey sample size calculator gives you a quick way to roughly figure out how representative your survey is.
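If you’re curious about what’s going on behind those calculators, the maths isn’t too scary. Here’s a minimal sketch in Python, assuming you’re estimating a simple proportion and using Cochran’s formula with a finite population correction (the default 5% margin of error and 95% confidence level are just common starting points, not recommendations):

```python
from math import ceil

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Rough sample size needed to estimate a proportion.

    population      -- size of the audience you're targeting
    margin_of_error -- acceptable error, e.g. 0.05 for +/- 5%
    z               -- z-score for your confidence level (1.96 ~ 95%)
    p               -- expected proportion; 0.5 is the most conservative
    """
    # Cochran's formula for a very large population
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite population correction for smaller audiences
    return ceil(n0 / (1 + (n0 - 1) / population))

# e.g. an audience of 10,000: roughly 370 responses for +/- 5% at 95% confidence
print(sample_size(10_000))
```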
Clean the data
Now that you’ve got your responses, it’s tempting to jump straight into analysing the results. Before you do though, it’s crucial that you clean your data to remove those dubious responses. You’ll want to look out for:
- People who have completed the survey suspiciously quickly.
- People who give ‘junk’ answers to open questions.
- People who sit outside your demographics.
- Any indication of people completing the survey multiple times.
- Depending on what you’re asking, you might also want to remove any extreme outliers.
While there’s no guarantee that the remaining responses are from people who have given your survey due care and thought, removing obviously misleading responses is a step in the right direction.
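If your survey tool can export responses as a CSV, a short script can do a first pass at flagging the obvious offenders. The sketch below uses Python and pandas; the file name, column names, time threshold and junk-answer patterns are all assumptions you’d adapt to your own survey and tool:

```python
import pandas as pd

# Hypothetical export: one row per response, with a completion time,
# an open-text answer and an internal respondent id.
responses = pd.read_csv("survey_export.csv")

# Suspiciously fast completions (the 60-second threshold is a judgement call)
too_fast = responses["duration_seconds"] < 60

# 'Junk' open-text answers: blank, very short or obvious keyboard-mashing
open_text = responses["open_question"].fillna("")
junk_text = (open_text.str.len() < 3) | open_text.str.contains(
    r"^(?:asdf|test|n/?a|\.+)$", case=False, regex=True
)

# Possible repeat submissions from the same person
repeats = responses.duplicated(subset="respondent_id", keep="first")

flagged = too_fast | junk_text | repeats
print(f"Flagged {flagged.sum()} of {len(responses)} responses for review")

# Keep the rest for analysis
clean = responses[~flagged]
```

It’s worth reviewing the flagged responses by hand rather than deleting them automatically, so you don’t throw out genuine answers that just happen to be terse.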
Analyse with caution
Before you start analysing your clean(er) data, you need to consider the inherent limitations up-front. A survey is not a replacement for in-depth qualitative research. You shouldn’t try to answer all of your research questions with a survey alone. Think back to the objective for your survey and stick to it.
When you’re analysing, be consistent and try to avoid bias. Bias is often talked about in terms of qualitative research, but it is equally important when it comes to quantitative research. You may be tempted to cherry pick answers, or put your own spin on the results. Be as neutral as you can when it comes to analysis, and consider running through your interpretation with a colleague. Ask them to challenge anything that they think may be skewing your results.
You’ll also want to make sure you’re not trying to read too much into the results. It may be that your survey tells you about what’s happening but doesn’t tell you why it’s happening. Don’t be tempted to draw conclusions when there isn’t data to back them up. Instead, see this as an opportunity to undertake more research to really get to the heart of the issues.
Present with honesty
When it comes to presenting your findings back to others, you’ll also want to make sure that no bias is creeping in. Report on the facts and be upfront about the limitations.
Don’t be afraid of presenting back ‘bad’ results. If your survey shows that users prefer competitor websites to yours, see this as an opportunity to make improvements!
Finally, be careful with your data visualisations. Choosing the right graphs to present back data could be a blog post of its own, but your choice of graph, the way you set up your axes and even the visual elements of what you’re presenting back may have a big impact on how it’s interpreted.
In summary
As with every user research method, preparation is key when it comes to creating and analysing user surveys. Avoid the temptation to use surveys as a quick and easy shortcut to getting feedback from users. Instead, use them only when they’re the most appropriate tool. You should always make sure you’re asking the right questions and avoiding bias to ensure your results are representative.
It’s important to remember that surveys will not give you a complete picture of your users’ experience. Use them alongside other, more detailed forms of research in order to really understand your users’ needs.
I’ve put together a checklist to help you make sure your survey is on track from the start.