We’ve added user research guides to the design system

We recently added 4 user research guides to our Co-op design system. The guides cover:

  • how to plan and prepare for research as a team
  • how to choose the most appropriate research method, and how to use it
  • how to analyse your findings, turn them into something actionable and share them with the rest of the team
  • a list of useful research tools

We’re committed to user-centred design. We start small, we test for user value and we work iteratively – researching and reacting to feedback are vitally important to us.

But it’s not easy to do good research, and by ‘good’ we mean using the appropriate method and ensuring the way we do it is planned, thorough and unbiased.

You need skilled researchers.

Helping teams help themselves

We have a superb small team of researchers at Co-op Digital. We have varying backgrounds, skills and strengths, which means asking for advice on how to tackle something is always interesting and useful. But we can’t cover all our projects, at all product phases, all the time. There aren’t enough of us.

So in a few cases, we set the direction and encourage teams to do their own research, with us there as support.

Sharing the knowledge

The idea came while I was writing a research strategy for a team working on a particular scope of work. I realised the strategy could be adapted into more of a ‘how to do research at the Co-op’ guide. For years, in an unofficial, internal-channels-only type way, several researchers had been writing guides on things like ‘how to recruit users / gather informed consent / write a survey’. It made sense to pull this useful work together and make it open and available in our design system.

Presenting guidance in this way means that instead of individual researchers writing a strategy for a team now and then, we can give more general advice. We want to make sure people are doing good, useful research in the right way and we can now add value to any digital team by giving them a ‘best practice’ resource.

We’re working on it

As always, the plan is to iterate and add more guidance as we go. We’ve been looking towards the GDS service manual as an excellent, detailed resource for planning research.

As we come across a method that we don’t have a guide for, we’ll write one up. For example, the next time one of our researchers needs to conduct a diary study they’ll write that up.

We know we need to improve how we help people choose the appropriate method so that people don’t just fall back on conducting usability testing in a lab or face-to-face interviews. As Vicki Riley says in her post, matching our research approach to the project is really important.

We’d like your feedback on it too, so if you have any, leave a comment.

Simon Hurst
Lead user researcher

How we’ve helped users understand Membership

At one point or another, most digital teams have been convinced that their assumption about how to fix something is right, but when they’ve tested it, they’ve found they’re still way off solving the problem.

That’s ok.

It’s fine to ‘fail’ as long as you do it quickly, learn from it and make changes that improve things. It’s part of working in an agile way. We should talk about failing more often. So, here’s an example of how we failed fast and learnt quickly in the Membership team.

Making assumptions based on user research

We’d seen from user research that most people, especially those who were new members, didn’t understand what a co-op is, how one operates and why it’s a different way of doing business.

That’s despite the fact that we include loads of info on co-ops when we send out membership cards. It looks like people either don’t read it at all, or, if they do, they don’t remember the information. Without that understanding, Co-op Membership is just another loyalty card to most people.

During user research sessions, when we talked about the idea of a co-op, people seemed interested. Not everyone, but some. The problem seemed to be not with the quality of the information being given, but with where and how in the user journey it was given.

It seemed that if we could convey the concept of a co-op more effectively, that would be enough for some users to become more engaged. Certainly, they would be better able to make an informed decision about whether they wanted to get involved. They’d become true co-operators as opposed to just loyalty card users.

Making changes based on our assumptions

We designed an interaction where the information about co-ops and Co-op Membership was introduced to people as part of the online registration. Our hypothesis was that at this point in the user journey the member is more committed and more likely to have time to read this information and be more receptive to it.

By chunking the content into sections and, importantly, making it dismissible, the user would be able to digest as much or as little as met their needs, rather than being faced with the entirety of the proposition in one hit.

We know people don’t read things online. In fact, you’re lucky if people read more than 20% of what you stick on a screen, so we kept that in mind with the design.

Here are 2 examples of pages from the prototype.

Image shows a screenshot of a member account and a box out with information about Co-op Membership. It says: 'Your say in what we do' and gives an overview of things members can do.

Image shows a screenshot of a member account and a box out with information about 'Your 5% reward'

Then we tested the new design

During 2 rounds of research we spoke to 12 people (you can read more about our views on sample sizes in James Boardwell’s blog post ‘Small is beautiful’). The group included a mixture of ages, online capabilities and length of time being a member.

Before showing them our new design we asked each participant to fill in a short questionnaire to find out what they understood about Co-op Membership. We then interviewed them, and showed them the prototype that was intended to help them understand the idea of a co-op.

At the end of the session we asked them to fill in the same questionnaire.

Results showed we hadn’t got it right

As we expected, before looking at the prototype people didn’t understand:

  • what rewards they earned as a Co-op member
  • what a co-op is
  • elements of the Co-op such as the dividend, democracy and engagement

And the post-prototype results weren’t any better – the new design had had zero effect on users’ understanding.

Picking ourselves up. Trying again

We’d seen people read the information, but they didn’t take it in. Although we were giving them more control, we were still imposing a bulk of potentially irrelevant content rather than letting the user discover it in their own time and read as much or as little as met their needs.

For some people, some of the information would have been both relevant and necessary – but for most their primary need at this point was to find out ‘what’s in it for me’ and everything else was a distraction.

So we iterated again. This time we wanted to give people a positive interaction that let them get only what they wanted, at a time when they needed it.

We added a ‘what’s this?’ drop down within members’ accounts to explain both rewards and Co-op points. Here’s how the current design looks.

Image shows a screenshot of the current design that has tested well. It shows the 'what's this' drop down box in a closed position.

Image shows a screenshot of the current design that has tested well. It shows the 'what's this' drop down box with content underneath that explains what this is.

We’d seen in research that many people didn’t know exactly what they got for being a member, so adding this was important.
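
To give a sense of the pattern, here’s a minimal sketch of that kind of ‘what’s this?’ disclosure control in TypeScript. The markup, class name and id are illustrative assumptions, not the actual Membership front end.

    // Minimal sketch of a 'what's this?' disclosure, assuming markup like:
    //   <button class="whats-this" aria-expanded="false" aria-controls="reward-help">What's this?</button>
    //   <div id="reward-help" hidden>Explanation of the 5% reward</div>
    // The class name, id and copy are made up for illustration.

    function initDisclosure(button: HTMLButtonElement): void {
      const panelId = button.getAttribute('aria-controls');
      const panel = panelId ? document.getElementById(panelId) : null;
      if (!panel) {
        return; // nothing to toggle
      }

      button.addEventListener('click', () => {
        const isOpen = button.getAttribute('aria-expanded') === 'true';
        // Keep the visible state and the accessible state in sync
        button.setAttribute('aria-expanded', String(!isOpen));
        panel.hidden = isOpen;
      });
    }

    document
      .querySelectorAll<HTMLButtonElement>('button.whats-this')
      .forEach(initDisclosure);

The point of the pattern is that the explanation stays out of the way until the member asks for it, rather than being pushed at them up front.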

Better results this time

During research we watched people interacting with the drop down, time and again, unprompted. Responses were often comments such as ‘ahhh, so that’s how it works’ or ‘I didn’t know that, I thought they’d send me vouchers’.

If there wasn’t that immediate, unprompted reaction we’d then follow it up with questions such as ‘what made you click on that’ and ‘what did it tell you’. This made us confident that what we were seeing had met the need we’d identified, and so we released it. We know people are making use of it. Google Analytics tells us those drop down controls have been clicked 250,000 times since we released it on 14 February.
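
As an aside, this sort of click counting is straightforward to wire up. Here’s a rough sketch using the analytics.js event API; the category and action names are invented, and this isn’t a description of the team’s actual tracking setup.

    // Hypothetical sketch of counting clicks on the 'what's this?' drop downs
    // as Google Analytics events (analytics.js). All names are illustrative.
    declare function ga(
      command: 'send',
      hitType: 'event',
      eventCategory: string,
      eventAction: string,
      eventLabel?: string
    ): void;

    document
      .querySelectorAll<HTMLButtonElement>('button.whats-this')
      .forEach((button) => {
        button.addEventListener('click', () => {
          ga('send', 'event', 'membership-account', 'whats-this-open', button.textContent ?? '');
        });
      });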

So after failing often and learning from, and iterating on, what users were saying to us, we’ve made good progress on helping people understand what rewards they’ve earned as a Co-op member.

We’re still researching how best to help people get a deeper understanding of what a co-op is, including elements such as the dividend, democracy and engagement. Those are things we haven’t solved yet, but we will. And it’ll probably involve a lot of failing fast.

Simon Hurst
User researcher

How user research is helping us improve the Membership site

My name’s Simon and I’m one of the user researchers on the Co-op Membership team, alongside my colleague, Vicki Riley. It’s our job to understand what members and non-members need from the service and find out what they think of it. This way we can act on their feedback and continually improve things. Whilst we’re responsible for user research, the whole team get involved in research sessions and meeting users so they can empathise with the people who use the services we’re building. This ensures they design with the user, and not themselves, in mind.

We don’t just rely on one method of user research to find out how people feel about the Membership service. We gather feedback in lots of ways and I wanted to share these with you.

Feedback through the website

The website has a ‘give feedback’ link. As of today, 7 December 2016, we’ve had 9469 comments. We’ve analysed them all and have been comparing them with what we learn from our other research approaches.

Phone call follow up

We often do phone interviews with people who have said they’re happy to be contacted about the website feedback they’ve given. This allows us to get more detailed feedback and also find out how people expect things to work.

Online surveys

We sometimes do online surveys, which allow us to reach a wide range of people quickly and easily. These surveys are around 4 or 5 questions long. We’ve found that the easier it is for someone to give us feedback, the more likely they are to leave some.

Speaking to people in labs

We also speak to people in our research labs. These sound far more ‘scientific’ than they actually are. A research lab is usually just a room with a computer, a microphone and a camera that lets the rest of the team observe the research. We invite people in, talk to them about shopping, loyalty cards, online services and Co-op Membership. We then watch people using the service as they complete tasks such as registering a temporary card or choosing which local cause to support. I ask them to talk me through what they’re thinking as they use the service so that we understand how they’re finding it.

Store visits

We already visit stores but we plan to do more of this.

Tracking website traffic

Finally, we also gather analytics from the website. This allows us to understand which pages people are visiting, how long they’re spending on pages, what they’re clicking or selecting, and which error messages are triggered most frequently.

By using a combination of these research methods, we have access to a wide range of interesting data about how people use the service.

Using research findings to improve

So here’s an example of how we’ve used what we’ve learnt from our research to make a change.

We’d seen through lab testing that people didn’t always understand that they could choose their own cause to support with their 1% for your community reward. We found people thought that we decided for them, or that they would email us later on with their choice. They didn’t notice there was something on the screen that they could click to choose a cause. Here’s how the page used to look:

The Membership page before our design changes. Many users weren't sure how to choose their cause in the 'your community' box

The comments from the feedback link told us the same thing. People had commented:

“I can’t find where to vote regarding where the 1% goes.”
“How do I select my preferred local cause please?”
“Should be able to select which charities I want to support.”

The analytics were backing this up too. We saw that a significant number of people were getting to the page with the ‘call to action’ (the bit where they could choose a cause) but they weren’t actually selecting one.
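
The sum behind that observation is simple enough. Here’s a tiny sketch, with invented figures rather than real Membership numbers:

    // Illustrative only: the kind of calculation behind 'people reach the page
    // but don't select a cause'. The numbers below are made up.
    function dropOffRate(pageViews: number, causeSelections: number): number {
      return (pageViews - causeSelections) / pageViews;
    }

    // e.g. 10,000 views of the page with the call to action, 2,000 cause selections
    const rate = dropOffRate(10_000, 2_000);
    console.log(`${(rate * 100).toFixed(0)}% of visitors didn't choose a cause`);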

The team came up with an alternative design to try and make it more obvious how the user could interact with the page. It was a simple content fix. We added ‘See your local causes’ inside the box about ‘your community’. When we tested it with people in the lab, they understood it – they knew what to do. So earlier this week we put it live. Now the page looks like this:

New design of the Membership page includes a simple content fix in the 'your community' box. It now says 'See your local causes'

It’s early days, but we’ve already seen a 10% increase in people selecting their cause and therefore benefiting their community. We’ll be keeping an eye on the feedback to make sure we’ve improved the journey. We’ll continue to research regularly and, as always, we’ll keep using what we’ve learnt to improve the service.

Members can visit membership.coop.co.uk to choose a local cause. If you’d like to become a member you can sign up for membership.

Simon Hurst
User researcher on the Membership team