The Web team wanted to find out why people were coming to coop.co.uk, the start page for many Co-op products and services including Funeralcare, Food, Legal Services and Insurance. Not knowing why people were coming to the homepage made prioritising our work difficult. To help with this, we recently did a piece of research into which tasks people want to complete when they visit the site.
At this point, I was new to both user research and the Web team so this was a brilliant introduction and overview of my new team’s scope.
Our colleagues in Insurance and Funeralcare suggested we use the ‘top tasks’ approach by Gerry McGovern which aims to help teams find out where they can add the most value to users based on which tasks are in the highest demand. The idea is to:
- Identify the tasks users want to complete most often – these are the ‘top tasks’.
- Measure how the top tasks are performing by looking at – amongst other things – how long they take to complete compared with a baseline time, whether users completed the task, and whether they followed the online journey we’d expected.
- Identify where improvements could be made. Make them.
How we identified the top tasks
Through analytics, a ‘what are you looking to do today?’ feedback form on the homepage, plus having a dig around the domain during desk research, we compiled a list of around 170 tasks that are possible to complete on coop.co.uk.
To make sure the work that followed this was free from bias – and therefore meaningful – it was important to compile a comprehensive list of every single task. A complete list meant we had a better chance at finding out what the customer wants to do rather than what the organisation wants the customer to do.
First we shared the list with the rest of the Web team to sense check. Then, because we knew that customers would skim-read the list when we put it in front of them, we asked the Content design community to check we’d written each task in a way that customers would understand quickly, using the language we’ve observed them using.
After finessing, we shared the list with product teams in Co-op Digital and stakeholders in the wider business to make sure we hadn’t missed any tasks.
Collaborating helped us whittle the list of 170 tasks down to 50 – a much more manageable number to present to customers on the homepage.
And the 6 top tasks are…
We listed the top 50 tasks in an online survey on the homepage and asked users to vote on the top 5 reasons they come to the website.
At around 3,000 responses we took the survey down. The results showed that the most common reasons people visit coop.co.uk are to:
- Select personalised offers for members.
- Look for food deals available to all customers (£5 freezer fillers, fresh 3).
- Check your Co-op Membership balance.
- Find what deals are available in a store near to you.
- Choose a local cause.
- Add points from a receipt onto a Co-op Membership account.
There were no surprises here then.
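The tally behind results like these is straightforward to reproduce. A minimal sketch, assuming each survey response is the list of up to 5 tasks a user selected (the task labels and responses here are made up for illustration):

```python
from collections import Counter

# Hypothetical survey responses: each is the set of tasks one user voted for.
responses = [
    ["check membership balance", "personalised offers", "choose a local cause"],
    ["check membership balance", "food deals"],
    ["personalised offers", "food deals", "add points from a receipt"],
]

# Count one vote per task per response, then take the most-voted tasks.
votes = Counter(task for response in responses for task in response)
top_tasks = [task for task, _ in votes.most_common(6)]
```

With real survey data, `responses` would hold around 3,000 entries and `most_common(6)` would surface the 6 top tasks listed above.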
Measuring the performance of the top tasks
In the majority of cases, we found that users succeeded in completing the tasks. This doesn’t come as a surprise: each product team knows why their group of users most frequently use their product or service, and they already have product and research priorities in place.
However, this piece of work did flag up that there could be room for improvement in the following tasks:
- Sign into a membership account.
- Change your local cause.
- Add points from a receipt onto a Co-op Membership account.
The image above shows how long it took users, on average, to see their membership balance, choose an offer, choose a local cause and add a receipt. The orange line shows how long we’d expect each task to take. The graph shows that checking a balance is quicker than we’d expected, but the remaining 3 tasks take slightly longer.
The image above shows how ‘successful’ users were at seeing their membership balance, choosing an offer, choosing a local cause and adding points to their membership from a receipt. A ‘direct success’ (shown in green, the bottom band of colour) is when the user completes the task in the way we’d expect. An ‘indirect success’ (shown in orange, the top band) is when a user completes a task in a way we didn’t expect. 20% of people failed to choose an offer (shown in red at the top of the second column).
The image above shows an ‘average overall score’ (where 10 is excellent and 1 is poor), worked out by combining the ‘success score’ (a scale of 1-3 indicating a direct success, an indirect success or a failure) with a ‘difficulty score’ (a scale of 1-7 indicating how difficult the user found the task to complete).
The idea came from Measuring and quantifying user experience, a post from UX Collective.
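The post doesn’t spell out exactly how the two scales are combined, so here is one plausible sketch: normalise each scale to 0-1, average them, and map the result onto a 1-10 scale. The weighting and the direction of the difficulty scale (7 = easiest) are assumptions:

```python
def overall_score(success: int, difficulty: int) -> float:
    """Combine a success score and a difficulty score into a 1-10 overall score.

    success: 1 = failure, 2 = indirect success, 3 = direct success.
    difficulty: 1 = very difficult, 7 = very easy (assumed direction).
    The equal weighting here is an assumption, not the post's exact method.
    """
    success_norm = (success - 1) / 2      # map 1..3 onto 0..1
    ease_norm = (difficulty - 1) / 6      # map 1..7 onto 0..1
    combined = (success_norm + ease_norm) / 2
    return round(1 + combined * 9, 1)     # map 0..1 onto 1..10

# Average the per-attempt scores across a (made-up) sample of task attempts.
attempts = [(3, 6), (3, 7), (2, 4), (1, 2)]
scores = [overall_score(s, d) for s, d in attempts]
average = round(sum(scores) / len(scores), 1)
```

A direct success on a very easy task scores 10; a failure on a very difficult task scores 1, so the scale matches the ‘10 is excellent, 1 is poor’ framing above.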
What we learnt
The big takeaways were:
- There are a couple of tasks we didn’t think were important, but users did.
- The work also helped us optimise our search. The feedback form on the homepage which asked what customers wanted to do had a significant number of responses looking for our gluten-free (GF) fishcakes. This was a result of Coeliac UK including them in a round-up of GF products. But they weren’t on our site. And when people were searching for them, the search would return GF recipes. The Web team worked with the Optimisation and search team and now the GF products appear before recipes. Since then, there’s been a 70% increase in GF searches, and more pages are being looked at. People coming for GF products are now spending 2 minutes on the site – an increase of 30 seconds.
- However, the top tasks approach may be more useful for teams with transactional services, where measuring a baseline and any improvements is easier – the Web team itself doesn’t have any transactional services.
Top tasks approach: how useful?
Overall, the top tasks approach has been useful: it gave us data that is helping the Web team prioritise, and it set out my research priorities.
The priorities list will keep us focussed and it’ll be useful to point to if there’s a request to work on something that we’ve found to have little value. Now we have data to help us push back against requests that don’t have customer and member needs at the centre.
Now and next
The Web team has created a task performance indicator for some of the top tasks identified so that as we make improvements to different areas of the website, we have something to measure against.
If you’ve used the top tasks approach, let us know why and how useful you found it in the comments.