How we tried to increase temporary card registration with flyers

Recently, in his post How we’ve helped users understand Membership, user researcher Simon Hurst said that “it’s fine to ‘fail’ as long as you do it quickly, learn from it and make changes to make things better.” It made me think about my most recent example of failing fast and how useful it was for the ‘more members’ part of the Membership team to do a quick, inexpensive trial so we could test an idea.

The problem with temporary cards

You can become a member by signing up online. You register your details, pay your £1 and your Co-op Membership card is sent to you through the post. You can also sign up in our food stores. You pay your £1 and you receive a temporary card to use there and then. The idea is that you’ll go online to register your temporary card later.

However, our user research and data show this isn’t what’s happening. 58% of temporary cards we’ve sold haven’t been registered. This is a problem because:

  • around £1 million of 5% reward is sitting in a pot, and can’t be spent until the temp cards are registered
  • we can’t get in touch with customers to tell them their balance because their temp card isn’t registered
  • until they register the card, customers can’t access all the member benefits. For example, they can build up their rewards but they can’t spend them or choose a local cause to support

To try and increase the number of temporary cards being registered we ran a few trials in stores. We dubbed one of these ‘the flyer test’.

Encouraging temporary card holders to register

Here’s our hypothesis:

Photo of post-it notes stuck on a whiteboard with the hypothesis on them. The hypothesis reads: ‘We’ve seen/we’ve heard that people aren’t registering their temporary cards. We believe this is because they don’t know they have to do anything with it, and the instructions given aren’t clear. So if we give them better instructions, we’ll see more members registering. We’ll know this is true when we see an increased temporary card conversion rate.’

To test this hypothesis we asked colleagues on tills in 10 stores to watch out for customers who were swiping a temporary card. When they spotted this happening, we asked them to hand those customers a flyer which had a call to action on it: ‘register your temp card’. The flyer also explained the benefits of registering the card to try and nudge people into registering.

Image shows front and back of flyer. Front says: Register your card online to claim your member rewards. Back lists things that members are missing out on if they haven't registered their cards online.

We included a vanity URL so we could track how many people registered their cards after receiving a flyer. Simple.

Learning early

We had our hypothesis and agreed our test. Our first failure was cracking the logistics of designing, printing and delivering leaflets across the country. That was hard, and so was making sure our store colleagues understood why we were doing this. This was our first learning: there are colleagues across the business who are great at this, and working with them is better than working alone.

We hadn’t fixed anything. And that’s hard to take

We sent flyers to 10 stores across the country and asked them to hand them out for the next 4 weeks. We put Google Analytics tracking in place and we decided on our measure of success: 10 visits to the URL a week, with 50% of those going on to register their card.

The test went live and we eagerly refreshed the Google Analytics report each morning, waiting to see an improvement in temporary card registrations. There was none. Nobody was visiting our URL.

We called the test stores. Maybe they hadn’t been handing the flyers out? Turns out they had. And what’s more, colleagues liked them because the flyers were an easy, concise way to tell customers why they should register their cards.

But they weren’t working for customers.

Over 4 weeks, 35 people visited the URL, and 3 of those people registered their cards. We hadn’t hit our measures. The test had failed.
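As a rough illustration (not part of the original analysis), the gap between the figures above and the success measures can be checked with a few lines of arithmetic. The numbers and thresholds are the ones from the post; everything else is a sketch.

```python
# Illustrative check of the flyer trial against its success measures.
# Figures come from the post; variable names are our own.

WEEKS = 4
TARGET_VISITS_PER_WEEK = 10   # success measure: 10 visits to the URL a week
TARGET_CONVERSION = 0.5       # with 50% of visitors going on to register

visits = 35                   # visits to the vanity URL over the 4 weeks
registrations = 3             # cards registered after a visit

visits_per_week = visits / WEEKS
conversion = registrations / visits

hit_visits = visits_per_week >= TARGET_VISITS_PER_WEEK
hit_conversion = conversion >= TARGET_CONVERSION

print(f"{visits_per_week:.1f} visits/week against a target of {TARGET_VISITS_PER_WEEK}")
print(f"{conversion:.0%} conversion against a target of {TARGET_CONVERSION:.0%}")
print("Test passed" if hit_visits and hit_conversion else "Test failed")
```

Run as written, both measures come out well under target, which is why the test was called as a failure rather than a near miss.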

We learnt lots, quickly

The trial taught us that:

  1. People don’t naturally move from a physical thing (a flyer in a shop) to a digital thing (our website), even if you spell out all the really great reasons why they should. If moving from physical to digital was a natural thing for people to do, they probably would have already registered their temporary card.
  2. Involving wider team members early on is important because they may have ideas, sometimes tried and tested ones, about how to get stuff done.
  3. We should test an idea from as many angles as we can before we go ahead and roll it out further. We based our hypothesis on user research, then came up with an idea that we thought would test it. If we had looked at the data as well, we would have seen that there are only around 50 active temporary cards per store, and that these cards are only seen around twice a month. So…
  4. Targeting active temporary cards isn’t the best way to solve the wider problem.

Learning a lesson cheaply, and on a small scale

We often say it’s okay to fail, but it’s still disappointing when you’ve put time and effort into something. You start picking it apart. Maybe we picked the wrong stores? Or the wrong time of year? Or the wrong colour flyer?

No, those things don’t matter – our idea just wasn’t that great.

Failing is ok, as long as you recognise when to let your idea go and move on to tackling a problem another way. So yes, we failed, but we only failed in 10 shops, not all 3,000. We didn’t spend much money, we didn’t inconvenience our users and we were open about how the tests were performing in our weeknotes and our show and tells.

Most importantly we learnt enough to inform where we should focus our efforts next.

We’re moving away from encouraging users to do something towards giving them the tools they need to do it there and then – our next trial will test whether customers will register their temporary cards on a tablet in store.

Joel Godfrey
Digital business analyst

How we’ve helped users understand Membership

At one point or another, most digital teams have been convinced that their assumption about how to fix something will work, but when they’ve tested it, they’ve found they’re still way off solving the problem.

That’s ok.

It’s fine to ‘fail’ as long as you do it quickly, learn from it and make changes to make things better. It’s part of working in an agile way. We should talk about failing more often. So, here’s an example of how we failed fast and learnt quickly in the Membership team.

Making assumptions based on user research

We’d seen from user research that most people, especially those who were new members, didn’t understand what a co-op is, how one operates and why it’s a different way of doing business.

This is despite the loads of info on co-ops we include when we send out membership cards. It looks like people either don’t read it at all, or, if they do, they don’t remember the information. Without that understanding, Co-op Membership is just another loyalty card to most people.

During user research sessions, when we talked about the idea of a co-op, people seemed interested. Not everyone, but some. The problem seemed to be not with the quality of the information being given, but with where and how in the user journey it was given.

It seemed that if we could convey the concept of a co-op more effectively, that would be enough for some users to become more engaged. Certainly they would be better able to make an informed decision about whether they wanted to get involved. They’d become true co-operators rather than just loyalty card users.

Making changes based on our assumptions

We designed an interaction where the information about co-ops and Co-op Membership was introduced to people as part of the online registration. Our hypothesis was that at this point in the user journey the member is more committed and more likely to have time to read this information and be more receptive to it.

By chunking the content into sections and, importantly, making it dismissible, the user would be able to digest as much or as little as met their needs, rather than being faced with the entire proposition in one hit.

We know people don’t read things online. In fact, you’re lucky if people read more than 20% of what you put on a screen, so we kept that in mind with the design.

Here are 2 examples of pages from the prototype.

Image shows a screenshot of a member account and a box out with information about Co-op Membership. It says: 'Your say in what we do' and gives an overview of things members can do.

Image shows a screenshot of a member account and a box out with information about 'Your 5% reward'

Then we tested the new design

During 2 rounds of research we spoke to 12 people (you can read more about our views on sample sizes in James Boardwell’s blog post ‘Small is beautiful’). The group included a mixture of ages, online capabilities and lengths of membership.

Before showing them our new design we asked each participant to fill in a short questionnaire to find out what they understood about Co-op Membership. We then interviewed them, and showed them the prototype that was intended to help them understand the idea of a co-op.

At the end of the session we asked them to fill in the same questionnaire.

Results showed we hadn’t got it right

As we expected, before looking at the prototype people didn’t understand:

  • what rewards they earned as a Co-op member
  • what a co-op is
  • elements of the Co-op such as the dividend, democracy and engagement

And the post-prototype results weren’t any better – the new design had had zero effect on users’ understanding.

Picking ourselves up. Trying again

We’d seen people read the information, but they didn’t take it in. Although we were giving them more control, we were still imposing a bulk of potentially irrelevant content rather than letting the user discover it in their own time and read as much or as little as met their need.

For some people, some of the information would have been both relevant and necessary – but for most their primary need at this point was to find out ‘what’s in it for me’ and everything else was a distraction.

So we iterated again. This time we wanted to give people a positive interaction that let them get only what they wanted, at a time when they needed it.

We added a ‘what’s this?’ drop down within members’ accounts to explain both rewards and Co-op points. Here’s how the current design looks.

Image shows a screenshot of the current design that has tested well. It shows the 'what's this' drop down box in a closed position.

Image shows a screenshot of the current design that has tested well. It shows the 'what's this' drop down box with content underneath that explains what this is.

We’d seen in research that many people didn’t know exactly what they got for being a member, so adding this was important.

Better results this time

During research we watched people time and again interacting with the drop down, unprompted. Responses were often comments such as ‘ahhh, so that’s how it works’ or ‘I didn’t know that, I thought they’d send me vouchers’.

If there wasn’t that immediate, unprompted reaction, we’d follow it up with questions such as ‘what made you click on that?’ and ‘what did it tell you?’. This made us confident that what we were seeing had met the need we’d identified, and so we released it. We know people are making use of it: Google Analytics tells us those drop down controls have been clicked 250,000 times since we released them on 14 February.

So after failing often and learning from, and iterating on, what users were saying to us, we’ve made good progress on helping people understand what rewards they’ve earned as a Co-op member.

We’re still researching how best to help people get a deeper understanding of what a co-op is, including elements such as the dividend, democracy and engagement. Those are things we haven’t solved yet, but we will. And it’ll probably involve a lot of failing fast.

Simon Hurst
User researcher