How we tried to increase temporary card registration with flyers

Recently, in his post How we’ve helped users understand Membership, user researcher Simon Hurst said that “it’s fine to ‘fail’ as long as you do it quickly, learn from it and make changes to make things better.” It made me think about my most recent example of failing fast and how useful it was for the ‘more members’ part of the Membership team to do a quick, inexpensive trial so we could test an idea.

The problem with temporary cards

You can become a member by signing up online. You register your details, pay your £1 and your Co-op Membership card is sent to you through the post. You can also sign up in our food stores. You pay your £1 and you receive a temporary card to use there and then. The idea is that you’ll go online to register your temporary card later.

However, our user research and data show this isn’t what’s happening. 58% of temporary cards we’ve sold haven’t been registered. This is a problem because:

  • around £1 million of 5% reward is sitting in a pot and can’t be spent until the temporary cards are registered
  • we can’t get in touch with customers to let them know the balance they have because their temporary card isn’t registered
  • until they register the card, customers can’t access all the member benefits. For example, they can build up their rewards but they can’t spend them or choose a local cause to support

To try and increase the number of temporary cards being registered we ran a few trials in stores. We dubbed one of these ‘the flyer test’.

Encouraging temporary card holders to register

Here’s our hypothesis:

Photo of post-it notes stuck on a whiteboard with the hypothesis on them. It reads: We’ve seen/we’ve heard that people aren’t registering their temporary cards. We believe this is because they don’t know they have to do anything with it, and the instructions given aren’t clear. So if we give them better instructions, we’ll see more members registering. We’ll know this is true when we see an increased temporary card conversion rate.

To test this hypothesis we asked colleagues on tills in 10 stores to watch out for customers who were swiping a temporary card. When they spotted this happening, we asked them to hand those customers a flyer which had a call to action on it: ‘register your temp card’. The flyer also explained the benefits of registering the card to try and nudge people into registering.

Image shows front and back of flyer. Front says: Register your card online to claim your member rewards. Back lists things that members are missing out on if they haven't registered their cards online.

We included a vanity URL so we could track how many people registered their cards after receiving a flyer. Simple.

Learning early

We had our hypothesis and agreed our test. Our first failure was cracking the logistics of designing, printing and delivering leaflets across the country. That was hard, and so was making sure our store colleagues understood why we were doing this. This was our first learning: there are colleagues across the business who are great at this sort of thing, and working with them is better than working alone.

We hadn’t fixed anything. And that’s hard to take

We sent flyers to 10 stores across the country and asked them to hand them out for the next 4 weeks. We put Google Analytics tracking in place and we decided on our measure of success: 10 visits to the URL a week, with 50% of those going on to register their card.

The test went live and we eagerly refreshed the Google Analytics report each morning, waiting to see an improvement in temporary card registration. There wasn’t one. Nobody was visiting our URL.

We called the test stores. Maybe they hadn’t been handing the flyers out? Turns out they had. And what’s more, colleagues liked them because the flyers were an easy, concise way to tell customers why they should register their cards.

But they weren’t working for customers.

Over 4 weeks, 35 people visited the URL, and 3 of those people registered their cards. We hadn’t hit our measures. The test had failed.
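As a rough sanity check, the outcome can be compared against the success measures we’d set. This is just a back-of-the-envelope sketch using the figures quoted above:

```python
# Success measures for the flyer test: at least 10 URL visits a week,
# with at least 50% of those visitors going on to register their card.
VISITS_PER_WEEK_TARGET = 10
REGISTRATION_RATE_TARGET = 0.5

# What actually happened over the 4-week trial.
weeks = 4
visits = 35
registrations = 3

visits_per_week = visits / weeks            # 8.75 - below the 10-a-week target
registration_rate = registrations / visits  # roughly 8.6% - well below 50%

passed = (visits_per_week >= VISITS_PER_WEEK_TARGET
          and registration_rate >= REGISTRATION_RATE_TARGET)
print(f"{visits_per_week:.2f} visits/week, "
      f"{registration_rate:.1%} registration rate, passed: {passed}")
```

Either measure on its own would have failed the test; together they made the call to stop easy.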

We learnt lots, quickly

The trial taught us that:

  1. People don’t naturally move from a physical thing (a flyer in a shop) to a digital thing (our website). Even if you spell out all the really great reasons why they should. If moving from physical to digital was a natural thing for people to do, they probably would have already registered their temporary card.
  2. Involving wider team members early on is important because they may have ideas, sometimes tried and tested ones, about how to get stuff done.
  3. We should test an idea from as many angles as we can before we go ahead and roll it out further. We based our hypothesis on user research, then came up with an idea that we thought would test it. If we had looked at the data as well, we would have seen that there are only around 50 active temporary cards per store, and that these cards are only seen around twice a month. So…
  4. Targeting active temporary cards isn’t the best way to solve the wider problem.

Learning a lesson cheaply, and on a small scale

We often say it’s okay to fail, but it’s still disappointing when you’ve put time and effort into something. You start picking it apart. Maybe we picked the wrong stores? Or the wrong time of year? Or the wrong colour flyer?

No, those things don’t matter – our idea just wasn’t that great.

Failing is ok, as long as you recognise when to let your idea go and move on to tackling a problem another way. So yes, we failed, but we only failed in 10 shops, not all 3,000. We didn’t spend much money, we didn’t inconvenience our users and we were open about how the tests were performing in our weeknotes and our show and tells.

Most importantly we learnt enough to inform where we should focus our efforts next.

We’re moving away from encouraging users to do something towards giving them the tools they need to do it there and then – our next trial will test if customers would register their temporary cards on a tablet in store.

Joel Godfrey
Digital business analyst

How we’ve helped users understand Membership

At one point or another, most digital teams have been convinced that their assumption about how to fix something will work but when they’ve tested it, they’ve found they’re still way off solving the problem.

That’s ok.

It’s fine to ‘fail’ as long as you do it quickly, learn from it and make changes to make things better. It’s part of working in an agile way. We should talk about failing more often. So, here’s an example of how we failed fast and learnt quickly in the Membership team.

Making assumptions based on user research

We’d seen from user research that most people, especially those who were new members, didn’t understand what a co-op is, how one operates and why it’s a different way of doing business.

This is despite the fact that we include loads of info on co-ops when we send out membership cards. It looks like people either don’t read it at all or, if they do, they don’t remember the information. Without that understanding, Co-op Membership is just another loyalty card to most people.

During user research sessions when we talked about the idea of a co-op, people seemed interested. Not everyone, but some. The problem seemed to be not with the quality of information being given, but where and how in the user journey it was given.

It seemed that if we could more effectively convey the concept of a co-op, that would be enough for some users to become more engaged. Certainly they would be better able to make an informed decision about whether they wanted to get involved. They’d become true co-operators as opposed to just loyalty card users.

Making changes based on our assumptions

We designed an interaction where the information about co-ops and Co-op Membership was introduced to people as part of the online registration. Our hypothesis was that at this point in the user journey the member is more committed and more likely to have time to read this information and be more receptive to it.

By chunking the content into sections and importantly making it dismissable, the user would be able to digest as much or as little as met their needs, rather than being faced by the entirety of the proposition in one hit.

We know people don’t read things online. In fact you’re lucky if people read more than 20% of what you stick on a screen so we kept that in mind with the design.

Here are 2 examples of pages from the prototype.

Image shows a screenshot of a member account and a box out with information about Co-op Membership. It says: 'Your say in what we do' and gives an overview of things members can do.

Image shows a screenshot of a member account and a box out with information about 'Your 5% reward'

Then we tested the new design

During 2 rounds of research we spoke to 12 people (you can read more about our views on sample sizes in James Boardwell’s blog ‘Small is beautiful’). The group included a mixture of ages, online capabilities and lengths of time being a member.

Before showing them our new design we asked each participant to fill in a short questionnaire to find out what they understood about Co-op Membership. We then interviewed them, and showed them the prototype that was intended to help them understand the idea of a co-op.

At the end of the session we asked them to fill in the same questionnaire.
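To give a feel for how a pre/post comparison like this might be summarised, here’s a minimal sketch. The topics and scores below are invented for illustration; our real questionnaires weren’t scored this way:

```python
# Hypothetical understanding scores (1-5) for one participant,
# before and after seeing the prototype - invented for illustration.
pre =  {"what a co-op is": 2, "your 5% reward": 1, "local causes": 2}
post = {"what a co-op is": 2, "your 5% reward": 2, "local causes": 2}

# Compare each topic's score before and after the session.
changes = {topic: post[topic] - pre[topic] for topic in pre}
improved = [topic for topic, delta in changes.items() if delta > 0]
print(f"Topics with improved understanding: {improved or 'none'}")
```

In our actual research, the post-prototype answers were no better than the pre-prototype ones, which is what told us the design hadn’t worked.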

Results showed we hadn’t got it right

As we expected, before looking at the prototype people didn’t understand:

  • what rewards they earned as a Co-op member
  • what a co-op is
  • elements of the Co-op such as the dividend, democracy and engagement

And the post-prototype results weren’t any better – the new design had had zero effect on users’ understanding.

Picking ourselves up. Trying again

We’d seen people read the information, but they didn’t take it in. Although we were giving them more control, we were still imposing a bulk of potentially irrelevant content rather than letting the user discover it in their own time, and reading as much or as little as met their need.

For some people, some of the information would have been both relevant and necessary – but for most their primary need at this point was to find out ‘what’s in it for me’ and everything else was a distraction.

So we iterated again. This time we wanted to give people a positive interaction that let them get only what they wanted, at a time when they needed it.

We added a ‘what’s this?’ drop down within members’ accounts to explain both rewards and Co-op points. Here’s how the current design looks.

Image shows a screenshot of the current design that has tested well. It shows the 'what's this' drop down box in a closed position.

Image shows a screenshot of the current design that has tested well. It shows the 'what's this' drop down box with content underneath that explains what this is.

We’d seen in research that many people often didn’t know exactly what they got for being a member so adding this was important.

Better results this time

During research we watched people time and again interacting with the drop down, unprompted. Responses were often comments from the user such as ‘ahhh, so that’s how it works’ or ‘I didn’t know that, I thought they’d send me vouchers’.

If there wasn’t that immediate, unprompted reaction we’d then follow it up with questions such as ‘what made you click on that’ and ‘what did it tell you’. This made us confident that what we were seeing had met the need we’d identified, and so we released it. We know people are making use of it: Google Analytics tells us those drop down controls have been clicked 250,000 times since we released it on 14 February.

So after failing often and learning from, and iterating on, what users were saying to us, we’ve made good progress on helping people understand what rewards they’ve earned as a Co-op member.

We’re still researching how best to help people get a deeper understanding of what a Co-op is including elements of the Co-op such as the dividend, democracy and engagement. Those are things we haven’t solved yet, but we will. And it’ll probably involve a lot of failing fast.

Simon Hurst
User research

Funeralcare: taking the beta to Edinburgh

Since April 2016, the Funeralcare team at Co-op Digital has been working to make life easier for our colleagues at our funeral homes across the UK. Our aim has always been to reduce the time our colleagues spend juggling and filling in paper forms so that they can spend more time with their clients – people who are grieving for their loved ones.

It’s been a while since we wrote an update on our work. Back in August, Andy Pipes, our Head of Product Management, said that we were rethinking how we deliver our at-need funeral service (an ‘at-need’ service is the immediate assistance someone might need after reporting a bereavement).

At that point we’d built:

  • a ‘first call’ service that logs details of a death and automatically alerts an ambulance team by SMS to take the deceased into our care
  • a funeral arrangement service which captures the client’s decisions, the costs, and keeps colleagues in various locations from funeral homes and the central care centre updated
  • a hearse booking system, staff diary and staff assignment service
  • a coffin stock control system, and a way for clients to browse the existing coffin range
  • an audit system that captures certain steps in the service

Since then we’ve been busy testing with colleagues and iterating.

We’ve added new features

As we’ve learnt where the gaps are in the service, we’ve added new features. They include a digital mortuary register and a digital belongings log to record possessions.

The deceased can come into our colleagues’ care at any time of the day or night, and it’s vital the funeral director knows where that person has been taken. To help, we’ve developed a digital mortuary register so that ambulance staff can book the deceased in and the funeral director can see where they are.

Image shows a screen with the first page of the digital mortuary register. The options are ‘booking in’ and ‘booking out’.

Another new feature is a digital belongings log. Often, when someone is brought into our care they’ll have jewellery or other personal belongings with them. When a funeral director at a funeral home gets a call from the grieving family asking about jewellery, they don’t immediately know what the deceased came in with because the paper record is with the deceased at the mortuary. To make this easier and more efficient, we introduced a digital log, removing the need for multiple phone calls between different locations.

Live trial and user testing

We’ve been testing in 2 ways. From September to November we continued to visit funeral homes all over the country to observe how colleagues work but we were also doing usability testing on each of the individual features in the bulleted list above with colleagues in mock labs. We tested and improved each feature separately until we thought we’d built enough of a service to be valuable to colleagues. At that point, in December, we rolled out a beta trial in Bolton.

Interaction designer Matt researching which content is most valuable to one of our colleagues, using a paper prototype.

We asked colleagues in Bolton to use the service in parallel with their current process which involves whiteboards, post-its, paper diaries, fax machines and the old, often painful-to-use software. Letting them use it for real is the best way to learn what’s working and what’s not. It drew our attention to 3 major things we’d overlooked during usability testing.

  1. We thought we were being helpful by preloading the local churches and crematoriums but we hadn’t given colleagues the option to create new ones.
  2. We found that the calendar couldn’t cope with all day events.
  3. We discovered that colleagues help each other out so having restricted access for specific roles creates a problem if someone is off ill and cover is needed.

Testing the beta with a small number of colleagues helped us catch problems like these before we rolled the service out to more people.

Trialling the service in Edinburgh

We want our service to be useful everywhere, but we’ve been told many times by colleagues that there’s no such thing as a ‘typical’ funeral. Funerals vary from region to region for reasons including local traditions, operational set-up, affluence and traffic, as well as legislation. Because our aim is to give time back to colleagues so they can spend it with their customers, we need to create something that works for all users, not just our colleagues in Bolton. That’s why we’re launching our at-need funeral service trial in Edinburgh in March.

We’re still learning

The beta has shown us that funeral arrangements are made up of multiple interactions like choosing flowers, booking venues and signing off obituary notices. Funeral arrangements are iterative with lots of tweaks along the way, so iterating the design is the only way we can cope with all the new things we keep learning.

We know that standard software packages don’t solve every problem. By involving colleagues throughout we’re building something that meets their needs and will improve things for both colleagues and their customers.

We’re transforming the Co-op Funeralcare business, but we believe that what we’re doing here will actually help transform the entire industry. To help us do this, Co-op Digital is working towards having dedicated digital product teams within the Co-op Funeralcare business.

If that sounds like something you’d like to help with we’re looking for an agile delivery manager and a product manager.

You can read more about the agile delivery manager role and more about the product manager role.

Come to a talk about the digital transformation of our Funeralcare business on 28 March. We’re particularly interested in speaking to product managers, delivery managers, software developers and platform engineers. You can get your free ticket at Eventbrite.

Carl Burton
Product lead

How much do you know about your connected devices?

The Digital Product Research (DPR) team at Co-op Digital is exploring new products and services. We’ve been trying out Google Ventures’ Design Sprint, a framework that encourages teams to develop, prototype and test ideas in just 5 days.

Recently, we’ve looked at connected devices: everyday objects that communicate between themselves or with the internet. It’s a running joke that people don’t read terms of service documents; they just dart down the page to the ‘accept’ button. So how much do they really understand about what they’ve signed up for?

Many connected devices are doing things people might not expect, like selling your personal data, or they’re vulnerable to malevolent activities, like your baby monitor being hacked. These things don’t seem to be common knowledge yet but when they start getting more coverage we expect there to be a big reaction.

A right to know what connected devices are doing

In the DPR team, we have a stance that the Co-op shouldn’t express an opinion on whether what a device is doing is good or bad. We’re just interested in making the information around it accessible to everyone so that people can decide for themselves.

In our first sprint we looked at how people relate to the connected devices they have in their homes. We found that though the people we interviewed were reluctant to switch them off at first, or to disable the ‘smart’ functionality, they were open to learning about what their devices are doing.

Influencing the buying decision

With that in mind, we looked at an earlier point in the buying process. We mapped the buying journey.

Mapping the buying journey on a whiteboard. Shows customers want to buy a TV. They research products by reading expert reviews, user reviews, looking on retailer websites and asking friends. Then they make a decision.

What if journalists and reviewers of connected devices were encouraged to write about privacy and security issues? Maybe this could satisfy our aim to influence consumers. If manufacturers knew that their terms and conditions would be scrutinised by reviewers and read by potential customers, maybe they’d make them more transparent from the start.

Our prototype

We made a website in a day and named it Legalease. The purpose of the website was to gather research. It was a throwaway prototype that wouldn’t be launched. It wasn’t Co-op branded so we could avoid any preconceptions. The site showed product terms and conditions and made it easy for reviewers to identify privacy and security clauses that could be clearer.

Shows a screenshot of Legalease prototype. The page shows an LG smart TV and highlights some of the T&Cs. Eg, 'please be aware that if your spoken word includes personal or other sensitive info, it will be captured if you use voice-recognition features'. Page shows someone's comment below: 'and then what happens to it? is it transmitted anywhere?'

The product page showed ‘top highlighted’ parts of the privacy policy ranked by votes. Annotations called into question the highlighted passage.

Shows a screenshot of another tab on the same page as first screenshot. This tab shows the T&Cs in full and contributors can highlight and comment on parts.

Another page showed the ‘full text’ – the full privacy policy document with annotations. The idea is that anybody who’s interested in this sort of thing can create an account and contribute. We imagined a community of enthusiasts would swarm around the text and discuss what they found noteworthy. This would become a resource for product reviewers (who in this case were our user research participants) to use in their reviews.

We interviewed reviewers

We spoke to a mixture of journalists and reviewers from publications like the Guardian and BBC and lesser known review sites like rtings.com. We got to understand how they write their stories.

Objectivity versus subjectivity

We found that what they write can be anywhere on the scale of objective to subjective. For example, a reviewer at rtings.com used repeatable machine testing to describe product features while a writer for The Next Web was able to introduce their own personal and political slant in their articles.

Accuracy

We found that the accuracy of their articles was important to them. They’d use their personal and professional contacts for corroboration, and often go to the source to give them a chance to reply.

Sensationalism is winning!

We’re in danger of ‘fake news’. One of our research participants said:

“Now, with everything being on the internet, it’s pretty easy for someone who just has a couple of mates to throw stuff together on a blog and it look very persuasive.”

We found that they used a mixture of analytics and social media to measure their impact. There was no mention of being concerned with the broader impact their articles might have in terms of whether or not people bought the products based on certain aspects of what they wrote about.

Reviewers’ thoughts on our product

Some of our research participants made comparisons with websites that have similar structure and interactions like Genius and Medium. The annotations on the Legalease prototype highlighted ambiguity in the terms and conditions but our participants didn’t find that useful – they expected more objectivity. They were also concerned about the validity of the people making the annotations and said that lawyers or similar professionals would carry more weight and authority.

How ‘Co-op’ is the idea?

Our participants thought our prototype was open, fair and community-spirited, so it reflects Co-op’s values. There were question marks around whether older organisations like Co-op can reinvent themselves in this way, though.

Reviewing security as well as features

Security and privacy are starting to show up more often in reviews.

But after our research we don’t think reviewers would use something like a Legalease site to talk about security and privacy. Some of the journalists we spoke to thought their readers didn’t care about these issues, or that people are resigned to a lack of privacy. One said:

“People tend to approach tech products with blind faith, that they do what they say they do.”

Connecting the abstract with the real world

Our participants told us their readers are bothered by being bombarded by targeted ads and being ‘ripped off’. This leads us to consider exploring how to connect the more abstract issues around data protection and privacy to these real-world manifestations of those issues. Then we should explain why these annoying things keep happening — and in plain, everyday language.

James Rice
Product designer

How user research is helping us improve the Membership site

My name’s Simon and I’m one of the user researchers on the Co-op Membership team, alongside my colleague, Vicki Riley. It’s our job to understand what members and non-members need from the service and find out what they think of it. This way we can act on their feedback and continually improve things. Whilst we’re responsible for user research, the whole team get involved in research sessions and meeting users so they can empathise with the people who use the services we’re building. This ensures they design with the user, and not themselves, in mind.

We don’t just rely on one method of user research to find out how people feel about the Membership service. We gather feedback in lots of ways and I wanted to share these with you.

Feedback through the website

The website has a ‘give feedback’ link. As of today, 7 December 2016, we’ve had 9469 comments. We’ve analysed them all and have been comparing them with what we learn from our other research approaches.

Phone call follow up

We often do phone interviews with people who have said they’re happy to be contacted about the website feedback they’ve given. This allows us to get more detailed feedback and also find out how people expect things to work.

Online surveys

We sometimes run online surveys, which allow us to reach a wide range of people quickly and easily. These surveys are around 4 or 5 questions long. We’ve found that the easier it is for someone to give us feedback, the more likely they are to leave some.

Speaking to people in labs

We also speak to people in our research labs. These sound far more ‘scientific’ than they actually are. Research labs are usually a room with a computer, a microphone and a camera that allow the rest of the team to observe the research. We invite people in and talk to them about shopping, loyalty cards, online services and Co-op Membership. We then watch people using the service as they complete tasks such as registering a temporary card or choosing which local cause to support. I ask them to talk me through what they’re thinking as they use the service so that we understand how they’re finding it.

Store visits

We already visit stores but we plan to do more of this.

Tracking website traffic

Finally, we also gather analytics from the website. This allows us to understand which pages people are visiting, how long they’re spending on pages, what they’re clicking or selecting, and which error messages are triggered most frequently.
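As a rough illustration of one of those questions (which error messages fire most often), this kind of data boils down to counting events. The event names and figures here are invented, not taken from our actual analytics:

```python
from collections import Counter

# Hypothetical event log - in practice this would come from an
# analytics export; these error names are made up for illustration.
events = [
    "card_number_invalid", "postcode_not_found", "card_number_invalid",
    "email_already_registered", "card_number_invalid", "postcode_not_found",
]

# Count which error messages are triggered most frequently.
error_counts = Counter(events)
for message, count in error_counts.most_common():
    print(f"{message}: {count}")
```

The most frequent errors are usually the best place to start looking for confusing parts of a journey.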

By using a combination of these research methods, we have access to a wide range of interesting data about how people use the service.

Using research findings to improve

So here’s an example of how we’ve used what we’ve learnt from our research to make a change.

We’d seen through lab testing that people didn’t always understand that they could choose their own cause to support with their 1% for your community reward. We found people thought that we decided for them, or that they would email us later on with their choice. They didn’t notice there was something on the screen that they could click to choose a cause. Here’s how the page used to look:

The Membership page before our design changes. Many users weren't sure how to choose their cause in the 'your community' box

The comments from the feedback link told us the same thing. People had commented:

“I can’t find where to vote regarding where the 1% goes.”
“How do I select my preferred local cause please?”
“Should be able to select which charities I want to support.”

The analytics were backing this up too. We saw that a significant number of people were getting to the page with the ‘call to action’ (the bit where they could choose a cause) but they weren’t actually selecting one.
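That drop-off is a simple funnel calculation. The figures below are invented for illustration; the real numbers came from our analytics:

```python
# Hypothetical funnel figures - invented for illustration.
reached_page = 1000    # people who got to the page with the call to action
selected_cause = 120   # people who actually chose a local cause

# Share of visitors who reached the page but never selected a cause.
drop_off = 1 - selected_cause / reached_page
print(f"{drop_off:.0%} of visitors left without selecting a cause")
```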

The team came up with an alternative design to try and make it more obvious how the user could interact with the page. It was a simple content fix. We added ‘See your local causes’ inside the box about ‘your community’. When we tested it with people in the lab, they understood it – they knew what to do. So earlier this week we put it live. Now the page looks like this:

New design of the Membership page includes a simple content fix in the 'your community' box. It now says 'See your local causes'

It’s early days but we’ve already seen more people selecting their cause and therefore benefiting their community. We’ve seen a 10% increase already. We’ll be keeping an eye on the feedback to make sure we’ve improved the journey. We’ll continue to research regularly and as always we’ll keep using what we’ve learnt to improve the service.

Members can visit membership.coop.co.uk to choose a local cause. If you’d like to become a member you can sign up for membership.

Simon Hurst
User researcher on the Membership team

User research, not user testing

I’ve now been at the Co-op for a couple of months. In that time I’ve met lots of people, seen lots of work going on and talked about what I do with many, many people. I’ve even written my first CoopDigital blog post about user research at the Co-op.

One thing that I know we still need to work on is sharing more widely what user research really is and how it should be used to influence what we do and how we do it. This is fine; it’s part of our jobs as experienced agile people and experienced researchers. It’s one of the reasons we’re hiring so many good people.

In my previous place of work, if someone called what we do ‘user testing’ there would almost always be someone who’d jump up and say ‘User Research NOT User Testing!’. I was always fairly relaxed about it: I knew what they meant, the person saying it knew what they meant, and it always felt like a bit of an over-reaction to me. Personally, I’d just smile and let it go.

Moving to somewhere where it’s a less familiar concept, I’m beginning to realise why people did it.

I’m finding that, for some people, ‘user testing’ is something you do near the end: you’re fairly convinced you’ve got it right, that it’s going to work and that it’ll go down well. What you might get is some feedback, or minor tweaks to make it even better. I think the issue is the word ‘testing’, because testing is generally done just before you go live to spot bugs and defects.

That’s not what user research in agile development is, what it’s for or what it’s brilliant at.

A picture of one of the CoopDigital product teams

User research is invaluable in helping us decide whether we should build or release something at all, what that something should be and how it should work. It shows us how the thing we make will fit into users’ lives. It gives us insight into the language people use and how they view the world. It also helps us understand the problems in their lives they’re trying to solve, the tasks they’re trying to achieve and how what we build can help solve that problem or complete that task.

There is also the issue of what we’re testing when we do research: we’re testing our designs, we aren’t testing our users. The user doesn’t pass or fail, the design does.

Simon Hurst
User researcher

User Research at CoopDigital

Hello, my name’s Simon and I’ve just joined the team at CoopDigital as a user researcher. I’m really excited to be here and to help the team build some world class digital services.

Picture of Simon Hurst - user researcher
Simon Hurst – user researcher

What does a user researcher do?

User researchers fulfil several roles for a team. We’re there to help the team understand:

1) Who the users of our services are and understand what they need from the service. To build great services you need to truly empathise with your users.

2) What’s the problem the user is trying to solve, what goal are they trying to achieve? How can we support them to achieve their goal?

3) Whether the solution we’re looking to provide works well and how can it be better?

Meeting real people

We do this by getting out of the building and meeting real people, talking to them and trying to understand their lives, watching them trying to complete their goals or use things we’re building.

We work with a huge variety of people. This includes those who are just learning the ropes, people who have maybe been bought a tablet by their children, and people who use a screenreader to interact with their device because of a visual impairment.

Bringing the team along

It’s even better when you bring members of the team with you. When the people building the service and writing the code see users actually using the thing they’ve built, and see them struggling, it can have a tremendous impact on how they tackle problems. The result is a team who care about what they’re building and are absolutely committed to making it the best it can be.

User researchers are interested in whether people can use things to complete a task. User research isn’t about asking people if they ‘like’ what we’ve built, or what they think of the colours.

It can be frustrating for people to see something they’ve designed and built not working: to see people struggling to understand the words they’ve used, or to interact with the clever little interface they’ve made. However, the sooner we recognise the issues, the sooner we can fix them. It’s better to find this out before you release the thing.

It’s even more important to understand why we’re building something in the first place. Is it needed by people? Is it helping them to achieve a goal or to solve a problem? If it isn’t, we could end up building the most beautifully designed and usable product or service, but if it’s not needed, no one will ever use it.

We’ll be looking at how we get involved with users more and more in the near future and we’ll be sure to blog about how we’re doing it and what we’re learning along the way. We’ll also be working hard to try and understand how user research applies in an organisation as diverse and varied as Co-op. There’ll be plenty more blogs to come from us on that.

Simon Hurst
User Researcher