12 things we learnt about creating effective surveys

At Co-op Digital we sometimes use surveys to get (mostly) quantitative feedback from users. They’re quick, cheap and a useful research technique for capturing data at scale.

But they can also be a waste of time and effort if we do not ask the right questions in a way that will give us meaningful answers.

We’ve compiled a list of things we keep in mind when we’re creating surveys.

Strive for meaningful data

1. Be clear on the purpose of the survey

We consider what we want to be able to do as a result of the survey. For example, when we’ve collated the responses, we want to be able to make a decision about doing (and sometimes *not* doing) something. To create an effective survey, we must know how we’ll act on each of the things we learn from it.

We give our survey a goal and consider what we want to know, and who we need to ask, then draft questions that help us achieve that goal.

2. Make sure each question supports the survey’s purpose

We keep in mind what we want to do with the survey responses, and make sure each question we ask is relevant and presented in a way that will return meaningful data from participants. If we can’t explain how the data we’ll get back will help us, we don’t ask that question.

We know that the more questions we ask, the more time we ask of our users, and the more likely they will be to drop out.

3. Check a survey is the most appropriate format

It can be tempting to cram in as many questions as possible because it’ll mean we get lots of data back. But quantity doesn’t translate to quality. Consider the times you’ve rushed through a long survey and justified giving inaccurate or meaningless answers just to get through it quickly. When we find ourselves wanting to ask lots of questions – especially ones with free text boxes – a survey isn’t the most appropriate format to gather feedback. An interview might be.

Consider what we’re asking and how we’re asking for it

4. Use free text boxes carefully

Free text boxes which allow people to type a response in their own words can be invaluable in helping us learn about the language that users naturally use around our subject matter.

But they can also be intimidating. The lack of structure means people can get worried about their grammar and how to compose what they’re saying, especially if they have low literacy or certain cognitive conditions. For many people they can be time-consuming, and so can make dropping out more likely.

If we use free text boxes, we make them optional where possible. It can also increase completion rates if they’re positioned at the end of the survey – participants may be more invested and so more likely to complete them if they know they’re near the end.

5. One question at a time

Be considerate when you ask questions. To reduce the cognitive load on participants, reduce distraction and ask one question at a time. Ask it once. Clearly. And collect the answer.

Questions like ‘How do you use X, what’s good about it, what’s bad?’ are overwhelming. Giving a single box to collect all 3 answers often results in incomplete responses: a participant will quite often answer only one of the 3 questions.

6. Ask questions that relate to current behaviour

People are not good judges of their future actions, so we don’t ask how they will behave in future. It’s easy for a participant to have good intentions about how they will, or would, behave or react to something, but their answer may be an unintentionally inaccurate representation. Instead, we ask about how people have behaved, because it gives us more accurate, useful and actionable insights. “The best predictor of future behaviour is past behaviour,” as the saying goes.

7. If we ask for personal information, we explain why

Participants are often asked for their gender, sex, age or location in surveys, but frequently nothing is done with that data. If there’s no reason to ask for it, we don’t.

When there is a valid reason to ask for personal information, we explain why we’re asking.

For example, in the Co-op Health app we ask for an email address so that we can send essential service updates. When we didn’t explain why we were asking for it, many people were reluctant to give their email address because they thought they were going to get spam. By explaining the reason we were asking, and how the information would be used, people were able to decide whether they wanted to proceed.

Explaining why we’re asking for personal information is essential in creating transparent and open dialogue. It gives users the context they need to make informed decisions.

8. Avoid bias. Show the full picture

Give participants context. For example, an online survey might reveal a snippet of text for a limited amount of time in order to find out how well participants retained the information. If the survey results say that 90% of people retained the information, that’s great, but it doesn’t necessarily mean it was the best way to present the information – it’s only one of the possible ways of presenting the text. In these cases it’s better to do a multivariate test and use multiple examples to really validate our choices.

Be inclusive, be considerate

9. Avoid time estimates

Many surveys give an indication of how long they will take to complete. Setting expectations seems helpful, but it’s often not for those with poor vision, dyslexia or English as a second language. It also rarely takes into account people who are stressed, distracted or emotionally affected by the subject matter. Instead, we tend to be more objective when setting expectations and say how many questions there are.

10. Don’t tell participants how to feel

A team working on a service will often unthinkingly describe their service as being ‘quick’, ‘easy’, ‘convenient’ or similar. However, these terms are subjective and may not be how our users experience the service. We should be aware of our bias when we draft survey questions. So not, ‘how easy was it to use this service?’, which suggests that the service was easy to begin with, but ‘tell us about your experience using this service’.

11. Consider what people might be going through

Often, seemingly straightforward questions can have emotional triggers.

Asking questions about family members, relationships or personal circumstances can be difficult if the user is in a complex or non-traditional situation. The questions could also be distressing if someone is recently separated, bereaved or going through hardship.

If we have to ask for personal information, we consider the circumstances and struggles that could make answering difficult for people. We try to include the context that these people need to answer the question as easily as possible.

12. Give participants a choice about following up

Sometimes survey answers will be particularly interesting, or we may not get all the information we want. At the end of the survey, we ask participants if they’d be happy to talk to us in the future.

We also give people a choice about how we follow up with them. Some people may be uncomfortable using a phone, some may struggle to meet us face to face, some may not be confident using certain technologies. We ask users how they want us to contact them – it’s respectful, inclusive and more likely to encourage a positive response.

When choosing who to follow up with, avoid participants who were either extremely positive or extremely negative – they can skew your data.

Time is precious – keep that in mind

At the end of the day, when people fill out a survey, they feel something about your brand, organisation or cause. They may like you, or they may just want their complaint heard. Sometimes, they’re filling out your survey because they’re being compensated. Whatever the reason, view it as them doing you a favour and be respectful of their circumstances and time.

If you’ve got any tips to share, leave a comment.

Joanne Schofield, Lead content designer
Tom Walker, Lead user researcher