Karen Lindop: the AGM, user research training plus award nominations

(Transcript) Karen Lindop: Hello, and welcome to our update on what’s happening in the Digital team.

I’ll start with some brilliant news. We’ve had not one, but 3 award nominations this week. Shifts has been shortlisted for best user experience and Guardian for transformation at the Big Chip Awards. Plus our service team have been shortlisted for the ‘special innovations’ award at the IT Service Management Foundation awards. Well done everyone, it’s testament to all of your hard work. Fingers crossed for 3 wins!

Talking of Shifts, you may remember a few weeks ago we made Shifts available to all our Food store colleagues. In just 4 weeks we’ve onboarded over half of the user base with 90% of those users returning in the past week. Chris, Paul and the team aren’t done though – they’ll continue to listen to feedback from colleagues and work on making it even better.

Our user research community have been running a training session to help get more honest feedback from our colleagues. Last week they ran further sessions with people who will be doing user research throughout our Co-op. Great work from Simon Hurst, who designed the course and is leading this work.

Saturday was our Co-op AGM. Well done to all the teams across Co-op who came together as one Co-op to deliver a brilliant day for our members. If you couldn’t make it on the day, or were watching the royal wedding or football – our live stream is available to view.

Finally, a quick reminder that next week the first event in The Federation Presents series begins. On 30 May we’re hosting an evening where we’ll talk about Modern Slavery in Tech with the incredible Mary Mazzio. There are a few tickets left, so be quick and register.

That’s it for this week. Don’t forget to subscribe for all our updates on our blog and follow us on Twitter. See you soon.

Karen Lindop
Head of Digital Operations

 

Making product decisions requires us to take risks

Designing a good product – one that meets user needs, is a viable value proposition and is technically feasible – requires us to be both gamblers and scientists. When we say ‘gamblers’, we don’t mean we’re reckless and irresponsible. We work in an agile way, which massively reduces financial risk and helps us find (or discount) solutions to problems quickly. Gambling, for agile teams like ours, is about speculating and taking risks in the hope of getting a desired result.

Reducing the risk of building something useless

User-centred design tries to reduce the risk involved in building a product by focusing on what users do now, or what underlying job they’re trying to achieve. It involves determining who your users are, analysing their needs, and determining likely demand for different possible solutions. It’s as much art as it is science but done well it can reduce the risk inherent in deciding on a thing to build. And from these findings, these informed reckons, you do some research and start to shape a product.

If you gamble on your product decisions early you learn more, and the odds on creating a good product fit for your target user base start to shorten.

When to take bigger risks and embrace the long odds

Taking bigger risks at the start of the product life cycle usually pays off. At the start, you’re unlikely to have much data on your users and their behaviour, so prototypes will have a set of assumptions about your users to test.

Because you’ll probably only have relatively few users to test with, your gambles need to be stark in their differences. Be radical in your tests: this helps you discount huge swathes of options and sets you off in roughly the right direction. As the product matures, you need to gamble less and make smaller, more educated guesses, as the graph below highlights.

Graph. Y axis label is uncertainty or probability of being wrong. X axis is tests undertaken to validate product. The line goes from top left to bottom right.

Stakeholders: if we don’t take risks, we won’t win

Researcher Sam Ladner sums up the idea of being both gamblers and scientists in her post Design researchers must think fast and slow. She says:

Design researchers should embrace less structure and more openness at the early stages of product design, and rigour and structure in the mature stages of product sales.

Sam Ladner

The graph below, taken from Ladner’s post, illustrates this. It shows the move towards more structure as a product matures. You could plot the decline in uncertainty around market fit, target users, user journey and experience for example, as the product matures too.

The graph shows the move towards more structure as a product matures.

That’s how it should work.

However, being comfortable with uncertainty and embracing the idea of taking risks can make stakeholders and our non-Digital colleagues a little uneasy. And that’s understandable – this way of working is unfamiliar to them. The Digital team know this though, and we’re keen to work inclusively and show how testing assumptions is relatively cheap compared with traditional ways of doing business.

How it works in practice

We can see from some of our projects at the Co-op that the propensity to gamble differs hugely from one project to the next. 

The Digital Product Research team moved quickly to test new propositions in the market: things like a white goods subscription service, and a service to ‘scrobble’ (automatically get and process) your utility bills to help you work out how your spending changes, and if it might be cheaper to switch. These were ideas spun out quickly and tested with real users. We learnt from doing the work that as products or services, they were unlikely to provide sufficient return for the investment needed.

Then there’s online Wills. We had more belief in this idea, ie, it exists in the market and is clearly already a thing. Here, it was a case of working out where our proposition would best fit with users and the existing business process. The gamble was on shorter odds, but in many ways felt far harder as we were working with an existing business and its staff, and embedded processes in a tightly regulated market.

Strategies for success

Navigating through product decisions and keeping our colleagues in other areas of the Co-op on board is not trivial. We’ve learnt 2 things which have helped us:

  1. Stay focused on the problem you’re trying to solve. Your experiments should test whether your solution meets the user’s needs effectively.
  2. Business stakeholders tend to prefer the language of data to qualitative research, so use both data and qualitative findings to show whether the experiment worked and whether it met the user need.

Good luck 🙂

James Boardwell, Head of User Research
Anna Goss, product manager

Switching energy company. For good

The Digital Product Research team at Co-op Digital spent a year exploring new products and services. We researched and tested ideas that we may or may not build. Our latest experiment was around people’s understanding of the energy market and what it would take to get them to switch to a renewable energy provider.

For the last few months the Digital Product Research team has been exploring how we might speed up Britain’s transition to sustainable energy controlled by its communities. By this we mean reducing our use of energy from fossil fuels bought from large multinational companies and moving to using renewable energy from a range of UK-based sources. Our most recent experiment in this area is a prototype for collective green switching.

Collective green switching happens when a group of people all move to a renewable energy supplier. It’s a really easy first step to reducing your carbon footprint. And when a group of people switch at the same time, it’s an effective way to fundraise, as energy companies make a thank-you payment for every new customer. A group of 40 people switching at the same time could raise £1000.
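The fundraising arithmetic above can be sketched in a few lines. This is a rough illustration only: the £25 thank-you payment per switch is an assumption inferred from the post’s “40 people could raise £1,000” example, and real payments will vary by energy company.

```python
# Rough fundraising sketch for collective green switching.
# PAYMENT_PER_SWITCH is an assumed figure, inferred from
# "40 people switching could raise £1,000" (£1,000 / 40 = £25).

PAYMENT_PER_SWITCH = 25.0  # assumed thank-you payment in pounds


def funds_raised(switchers: int, payment: float = PAYMENT_PER_SWITCH) -> float:
    """Total raised when a group of people switches together."""
    return switchers * payment


print(funds_raised(40))  # 1000.0 - matches the example in the post
```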

Our mission was to get people who have never switched to move to a 100% renewable energy company. But we knew that might be tricky.

Why don’t people switch?

We are often told that switching energy provider will save us money. But only around 15% of us are receptive to this message and change energy company every year [PDF]. 

Photograph in Sheffield of a billboard that says: 7/10 brits overpaying for energy. Switch today and save up to £618

Yet over half of us never switch energy company – despite ads like this and nagging from evangelical switchers. Some of us are sticky, and that’s just the way we are.

It’s not about money

A big early influence on our thinking was a paper on attitudes to switching [PDF]. Xiaoping He and David Reiner are researchers at Cambridge University. They found that most of the non-switchers in their study stayed with the same company even though they knew they were paying too much. Any potential savings don’t seem to be worth the trouble of switching.

It’s the fear of hassle

We found this in our own research too. We held one-hour interviews with 5 parents of school-age children who had either never switched energy provider or had not switched in the last 8 years. They all knew they would save money. But their perception of the process – that it would be a hassle, that it would go wrong, that it would be time consuming – put them off. One parent said, “It just gives me shudders. I just think it would be a nightmare.” Even thinking about the process was stressful. Another parent said, “A lot of mums when I speak to people, they don’t have the headspace to take on stuff.”

Renewable energy companies aren’t well known

We also found that awareness of alternatives to the ‘big 6’ energy companies was low. The people we spoke to had not heard of companies that only sell renewable energy, such as Ecotricity.

When we showed our non-switchers renewable energy companies, they liked them. One said, “I like that. It feels like buying local produce.” And others echoed this sentiment. Clean energy from renewable sources, produced by UK companies was appealing to them. Although buying energy from this type of company was a new idea for them, it went down well.

Would people switch for good?

During our interviews with parents, we showed them our prototype website where they could switch to a green energy provider. With the feedback from the interviews in mind we updated the content on our website. We removed all references to saving money. Instead we focused on how switching could help others. The message on our landing page was: Let Co-op change your energy provider for you and raise money for Hanover Primary School.

The image below shows our old homepage next to the new one.

Homepage before when the focus was on saving money and after, when the focus is on raising money for a school

We don’t know…yet

We’re pausing work on our collective green switching service for now. We contacted a lot of schools and parent teacher associations and didn’t hear back from any of them. Schools and families aren’t at their most receptive in July.

This means we can’t prove that people would switch for charitable reasons (to help a school raise money) even if they wouldn’t switch for personal gain. But we think it looks promising, and it’s certainly an effective way to help schools. We’re still excited by collective switching for good, and hope to continue exploring this area soon.

Over to you

Our goal for this work was to speed up Britain’s transition to sustainable energy controlled by its communities. This is everyone’s responsibility, not just ours. And collective green switching is a great way for any type of organisation to raise small amounts (£1-2K at a time).

We hope to return to this work in the future, but we’re also keen that others carry on from where we’ve left off. Which is why we’ve worked openly. We’ve blogged about our findings and our software is available for anyone to view and download.

We’ll be very happy if anyone uses these resources for a project of their own. If you do, please let us know.

Sophy Colbert
Content designer

What we’ve learnt in Digital Product Research: adapting research techniques

The Digital Product Research team at Co-op Digital has been exploring what the future might look like for the wider Co-op. We’re about to move onto a new phase of work, so this is a good time to write up some of the things we’ve learnt.

Our team of 6 has been learning by doing. We’ve looked at things like Life after work, Everything is connected and Financial freedom through early planning. Our most recent work was about energy and co-operation: Collective switching for good. You can read about more of our experiments at dpr.coop.co.uk.

This is the first in a series of posts covering design principles and ways of working that have emerged in the last 12 months or so.

Getting out there matters

Our research started out much as you might expect: we spoke to Group colleagues, Co-op members and spent time with people in research labs. But we quickly became aware we were spending too much time in artificial settings. Research is best done in the context of the problem you’re trying to understand, so we made sure we got out of the office.

This took us to lots of different places — sometimes with a discussion guide that outlined areas of interest, sometimes with a digital prototype people could interact with.

Photograph shows the back of Sophy's head as she rings the door bell hoping to speak to someone to do user research.

Hoping to understand how people use energy in their home, we took a trip to Lichfield and knocked on doors. We’ve looked for jobs at the Uber offices in a bid to understand a bit about what it’s like to be a driver. We also wandered down a high street to talk to shop owners about their relationship with other traders and with their customers. Doing all these things gave us greater confidence in the direction we’d take the projects.

Reflecting and adapting

We hit prototype testing fatigue after following GV’s Sprint method for a few weeks. We started to reflect on what the GV Sprint method offered us. We found it wasn’t providing enough insight into people’s problems, motivations and feelings. One of our experiments, Protecting your stuff, really highlighted that failing. The prototype was good, in lots of ways, and so was the idea of insurance based on trust within your community. But it didn’t explore people’s behaviour as a part of a real community in the context of our proposition.

We weren’t getting under the skin of the problem.

This sort of misstep led us to rethink how we thought about researching with prototypes. How might we prototype communities? How might we understand the mechanics of group behaviour to enable co-operation on a Job To Be Done?

Photograph of a group of people standing in a farm kitchen where the team thought about prototyping communities.

Our farm visit is an example of where we’ve given this approach a try. Read more on that experiment here: Cheaper, greener energy through smart behaviour.

Research is a team sport

We’ve each had ideas on how we might get closer to the user. Reading research papers helped our understanding of switching energy providers. And we used targeted Google Ads to get people to a website and used an Intercom chat widget where we could speak to them, in real time, at their point of need.

All of this was a collective effort.

Rather than have a dedicated user researcher, every member of the team has been involved in the research which is great. (Looking back, if we’d had a researcher, they might have helped make sure the time we spent with people clearly pointed back to the assumption we were trying to prove or disprove).

We’ve found that when every member of the team gets involved in the research process, they can understand people more and design our proposition better. As user researcher Simon Hurst says, getting each team member involved “ensures they design with the user, and not themselves in mind”.

We’ve poked and prodded along without a user researcher on the team and I wonder whether this has forced us to think differently. Learning from others — like GV’s Sprint method, or best practice led by an embedded researcher — is a good place to start, but there’s a lot to say for taking that baseline and adapting it to the specific problem at hand.

We’re also lucky to have a community of user researchers to guide us when needed. Research is integral to what we do and the onus is on us to question how we use it. We’ll keep doing that, as we move on to our next project.

James Rice
Interaction designer

How we tried to increase temporary card registration with flyers

Recently, in his post How we’ve helped users understand Membership, user researcher Simon Hurst said that “it’s fine to ‘fail’ as long as you do it quickly, learn from it and make changes to make things better.” It made me think about my most recent example of failing fast and how useful it was for the ‘more members’ part of the Membership team to do a quick, inexpensive trial so we could test an idea.

The problem with temporary cards

You can become a member by signing up online. You register your details, pay your £1 and your Co-op Membership card is sent to you through the post. You can also sign up in our food stores. You pay your £1 and you receive a temporary card to use there and then. The idea is that you’ll go online to register your temporary card later.

However, our user research and data show this isn’t what’s happening. 58% of temporary cards we’ve sold haven’t been registered. This is a problem because:

  • around £1 million of 5% reward is sitting in a pot, and can’t be spent until the temp cards are registered
  • we can’t get in touch with customers to let them know the balance they have because their temp card isn’t registered
  • until they register the card, customers can’t access all the member benefits. For example, they can build up their rewards but they can’t spend them or choose a local cause to support
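As a back-of-envelope sketch of the first point above: if the £1 million sitting in the pot is entirely the 5% reward, it implies roughly £20 million of member spend on unregistered cards. This is a simplification for illustration – the real pot may be made up differently.

```python
# Back-of-envelope: what member spend does an unclaimed 5% reward
# pot imply? Assumes the whole pot is the 5% reward on qualifying
# purchases - a simplification for illustration only.

REWARD_RATE = 0.05  # the 5% member reward mentioned in the post


def implied_spend(unclaimed_reward: float, rate: float = REWARD_RATE) -> float:
    """Qualifying spend that would have generated the given reward pot."""
    return unclaimed_reward / rate


print(f"£{implied_spend(1_000_000):,.0f}")  # £20,000,000
```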

To try and increase the number of temporary cards being registered we ran a few trials in stores. We dubbed one of these ‘the flyer test’.

Encouraging temporary card holders to register

Here’s our hypothesis:

Photo of post-it notes stuck on a whiteboard showing our hypothesis. It reads: We’ve seen/we’ve heard that people aren’t registering their temporary cards. We believe this is because they don’t know they have to do anything with it, and the instructions given aren’t clear. So if we give them better instructions, we’ll see more members registering. We’ll know this is true when we see an increased temporary card conversion rate.

To test this hypothesis we asked colleagues on tills in 10 stores to watch out for customers who were swiping a temporary card. When they spotted this happening, we asked them to hand those customers a flyer which had a call to action on it: ‘register your temp card’. The flyer also explained the benefits of registering the card to try and nudge people into registering.

Image shows front and back of flyer. Front says: Register your card online to claim your member rewards. Back lists things that members are missing out on if they haven't registered their cards online.

We included a vanity URL so we could track how many people registered their cards after receiving a flyer. Simple.

Learning early

We had our hypothesis and agreed our test. Our first failure was cracking the logistics of designing, printing and delivering leaflets across the country. That was hard, and so was making sure our store colleagues understood why we were doing this. This was our first lesson: there are colleagues across the business who are great at this sort of thing, and working with them is better than working alone.

We hadn’t fixed anything. And that’s hard to take

We sent flyers to 10 stores across the country and asked them to hand them out for the next 4 weeks. We put Google Analytics tracking in place and we decided on our measure of success: 10 visits to the URL a week, with 50% of those going on to register their card.

The test went live and we eagerly refreshed the Google Analytics report each morning, waiting to see an improvement in temporary card registration. There was none. Nobody was visiting our URL.

We called the test stores. Maybe they hadn’t been handing the flyers out? Turns out they had. And what’s more, colleagues liked them because the flyers were an easy, concise way to tell customers why they should register their cards.

But they weren’t working for customers.

Over 4 weeks, 35 people visited the URL, and 3 of those people registered their cards. We hadn’t hit our measures. The test had failed.

We learnt lots, quickly

The trial taught us that:

  1. People don’t naturally move from a physical thing (a flyer in a shop) to a digital thing (our website). Even if you spell out all the really great reasons why they should. If moving from physical to digital was a natural thing for people to do, they probably would have already registered their temporary card.
  2. Involving wider team members early on is important because they may have ideas, sometimes tried and tested ones, about how to get stuff done.
  3. We should test an idea from as many angles as we can before we go ahead and roll it out further. We based our hypothesis on user research, then came up with an idea that we thought would test it. If we had looked at the data as well, we would have seen that there are only around 50 active temporary cards per store, and that these cards are only seen around twice a month. So…
  4. Targeting active temporary cards isn’t the best way to solve the wider problem.

Learning a lesson cheaply, and on a small scale

We often say it’s okay to fail, but it’s still disappointing when you’ve put time and effort into something. You start picking it apart. Maybe we picked the wrong stores? Or the wrong time of year? Or the wrong colour flyer?

No, those things don’t matter – our idea just wasn’t that great.

Failing is ok, as long as you recognise when to let your idea go and move onto tackling a problem another way. So yes, we failed but we only failed in 10 shops, not all 3,000. We didn’t spend much money, we didn’t inconvenience our users and we were open about how the tests were performing in our weeknotes and our show and tells.

Most importantly we learnt enough to inform where we should focus our efforts next.

We’re moving away from encouraging users to do something towards giving them the tools they need to do it there and then – our next trial will test if customers would register their temporary cards on a tablet in store.

Joel Godfrey
Digital business analyst

How we’ve helped users understand Membership

At one point or another, most digital teams have been convinced that their assumption about how to fix something will work, only to test it and find they’re still way off solving the problem.

That’s ok.

It’s fine to ‘fail’ as long as you do it quickly, learn from it and make changes to make things better. It’s part of working in an agile way. We should talk about failing more often. So, here’s an example of how we failed fast and learnt quickly in the Membership team.

Making assumptions based on user research

We’d seen from user research that most people, especially those who were new members, didn’t understand what a co-op is, how one operates and why it’s a different way of doing business.

This is despite the loads of info on co-ops we include when we send out membership cards. It looks like people either don’t read it at all or, if they do, don’t remember the information. Without that understanding, Co-op Membership is just another loyalty card to most people.

During user research sessions when we talked about the idea of a co-op, people seemed interested. Not everyone, but some. The problem seemed to be not with the quality of information being given, but where and how in the user journey it was given.

It seemed that if we could more effectively convey the concept of a co-op, that would be enough for some users to become more engaged. They would certainly be better able to make an informed decision about whether they wanted to get involved. They’d become true co-operators as opposed to just loyalty card users.

Making changes based on our assumptions

We designed an interaction where the information about co-ops and Co-op Membership was introduced to people as part of the online registration. Our hypothesis was that at this point in the user journey the member is more committed and more likely to have time to read this information and be more receptive to it.

By chunking the content into sections and, importantly, making it dismissible, the user would be able to digest as much or as little as met their needs, rather than being faced with the entirety of the proposition in one hit.

We know people don’t read things online. In fact you’re lucky if people read more than 20% of what you stick on a screen so we kept that in mind with the design.

Here are 2 examples of pages from the prototype.

Image shows a screenshot of a member account and a box out with information about Co-op Membership. It says: 'Your say in what we do' and gives an overview of things members can do.

Image shows a screenshot of a member account and a box out with information about 'Your 5% reward'

Then we tested the new design

During 2 rounds of research we spoke to 12 people (you can read more about our views on sample sizes in James Boardwell’s blog post ‘Small is beautiful’). The group included a mixture of ages, online capabilities and lengths of membership.

Before showing them our new design we asked each participant to fill in a short questionnaire to find out what they understood about Co-op Membership. We then interviewed them, and showed them the prototype that was intended to help them understand the idea of a co-op.

At the end of the session we asked them to fill in the same questionnaire.

Results showed we hadn’t got it right

As we expected, before looking at the prototype people didn’t understand:

  • what rewards they earned as a Co-op member
  • what a co-op is
  • elements of the Co-op such as the dividend, democracy and engagement

And the post-prototype results weren’t any better – the new design had had zero effect on users’ understanding.

Picking ourselves up. Trying again

We’d seen people read the information, but they didn’t take it in. Although we were giving them more control, we were still imposing a block of potentially irrelevant content rather than letting users discover it in their own time and read as much or as little as met their need.

For some people, some of the information would have been both relevant and necessary – but for most their primary need at this point was to find out ‘what’s in it for me’ and everything else was a distraction.

So we iterated again. This time we wanted to give people a positive interaction that let them get only what they wanted, at a time when they needed it.

We added a ‘what’s this?’ drop down within members’ accounts to explain both rewards and Co-op points. Here’s how the current design looks.

Image shows a screenshot of the current design that has tested well. It shows the 'what's this' drop down box in a closed position.

Image shows a screenshot of the current design that has tested well. It shows the 'what's this' drop down box with content underneath that explains what this is.

We’d seen in research that many people often didn’t know exactly what they got for being a member so adding this was important.

Better results this time

During research we watched people time and again interacting with the drop down, unprompted. Responses were often comments from the user such as ‘ahhh, so that’s how it works’ or ‘I didn’t know that, I thought they’d send me vouchers’.

If there wasn’t that immediate, unprompted reaction we’d follow up with questions such as ‘what made you click on that’ and ‘what did it tell you’. This made us confident that what we were seeing had met the need we’d identified, and so we released it. We know people are making use of it: Google Analytics tells us those drop down controls have been clicked 250,000 times since we released it on 14 February.

So after failing often and learning from, and iterating on, what users were saying to us, we’ve made good progress on helping people understand what rewards they’ve earned as a Co-op member.

We’re still researching how best to help people get a deeper understanding of what a Co-op is including elements of the Co-op such as the dividend, democracy and engagement. Those are things we haven’t solved yet, but we will. And it’ll probably involve a lot of failing fast.

Simon Hurst
User researcher

Funeralcare: taking the beta to Edinburgh

Since April 2016, the Funeralcare team at Co-op Digital has been working to make life easier for our colleagues at our funeral homes across the UK. Our aim has always been to reduce the time our colleagues spend juggling and filling in paper forms so that they can spend more time with their clients – people who are grieving for their loved ones.

It’s been a while since we wrote an update on our work. Back in August Andy Pipes, our Head of Product Management, said that we were rethinking how we deliver our at-need funeral service (an ‘at-need’ service is the immediate assistance someone might need after reporting a bereavement).

At that point we’d built:

  • a ‘first call’ service that logs details of a death and automatically alerts an ambulance team by SMS to take the deceased into our care
  • a funeral arrangement service which captures the client’s decisions, the costs, and keeps colleagues in various locations from funeral homes and the central care centre updated
  • a hearse booking system, staff diary and staff assignment service
  • a coffin stock control system, and a way for clients to browse the existing coffin range
  • an audit system that captures certain steps in the service

Since then we’ve been busy testing with colleagues and iterating.

We’ve added new features

As we’ve learnt where the gaps are in the service, we’ve added new features. They include a digital mortuary register and a digital belongings log to record possessions.

The deceased can come into our colleagues’ care at any time of the day or night, and it’s vital the funeral director knows where that person has been taken. To help, we’ve developed a digital mortuary register so that ambulance staff can book the deceased in and the funeral director can see where they are.

image shows a screen with the first page of the digital mortuary register. the options are 'booking in' and 'booking out'

Another new feature is a digital belongings log. Often, when someone is brought into our care they’ll have jewellery or other personal belongings with them. When a funeral director at a funeral home gets a call from the grieving family asking about jewellery, they don’t immediately know what the deceased came in with, because the paper record is with the deceased at the mortuary. To make this easier and more efficient, we introduced a digital log that removes the need for multiple phone calls between locations.

Live trial and user testing

We’ve been testing in 2 ways. From September to November we continued to visit funeral homes all over the country to observe how colleagues work but we were also doing usability testing on each of the individual features in the bulleted list above with colleagues in mock labs. We tested and improved each feature separately until we thought we’d built enough of a service to be valuable to colleagues. At that point, in December, we rolled out a beta trial in Bolton.

interaction designer Matt researching which content is most valuable to one of our colleagues with a paper prototype.

We asked colleagues in Bolton to use the service in parallel with their current process which involves whiteboards, post-its, paper diaries, fax machines and the old, often painful-to-use software. Letting them use it for real is the best way to learn what’s working and what’s not. It drew our attention to 3 major things we’d overlooked during usability testing.

  1. We thought we were being helpful by preloading the local churches and crematoriums but we hadn’t given colleagues the option to create new ones.
  2. We found that the calendar couldn’t cope with all day events.
  3. We discovered that colleagues help each other out so having restricted access for specific roles creates a problem if someone is off ill and cover is needed.

Testing the beta with a small number of colleagues helped us catch problems like these before we rolled the service out to more people.

Trialling the service in Edinburgh

We want our service to be useful everywhere, but we’ve been told many times by colleagues that there’s no such thing as a ‘typical’ funeral. Funerals vary from region to region for reasons including local traditions, operational set-up, affluence, traffic and legislation. Because our aim is to give time back to colleagues so they can spend it with their customers, we need to create something that works for all colleagues, not just those in Bolton. That’s why we are launching our at-need funeral service trial in Edinburgh in March.

We’re still learning

The beta has shown us that funeral arrangements are made up of multiple interactions like choosing flowers, booking venues and signing off obituary notices. Funeral arrangements are iterative with lots of tweaks along the way, so iterating the design is the only way we can cope with all the new things we keep learning.

We know that standard software packages don’t solve every problem. By involving colleagues throughout we’re building something that meets their needs and will improve things for both colleagues and their customers.

We’re transforming the Co-op Funeralcare business but we believe that what we’re doing here will actually help transform the entire industry. To help us do this, Co-op Digital is working towards having dedicated digital product teams within the Co-op Funeralcare business.

If that sounds like something you’d like to help with we’re looking for an agile delivery manager and a product manager.

You can read more about the agile delivery manager role and more about the product manager role.

Come to a talk about the digital transformation of our Funeralcare business on 28 March. We’re particularly interested in speaking to product managers, delivery managers, software developers and platform engineers. You can get your free ticket at Eventbrite.

Carl Burton
Product lead

How much do you know about your connected devices?

The Digital Product Research (DPR) team at Co-op Digital is exploring new products and services. We’ve been trying out Google Ventures’ Design Sprint, a framework that encourages teams to develop, prototype and test ideas in just 5 days.

Recently, we’ve looked at connected devices: everyday objects that communicate between themselves or with the internet. It’s a running joke that people don’t read terms of service documents, they just dart down the page to the ‘accept’ button. So how much do they really understand about what they’ve signed up for?

Many connected devices are doing things people might not expect, like selling your personal data, or they’re vulnerable to malevolent activities, like your baby monitor being hacked. These things don’t seem to be common knowledge yet but when they start getting more coverage we expect there to be a big reaction.

A right to know what connected devices are doing

In the DPR team, we have a stance that the Co-op shouldn’t express an opinion on whether what a device is doing is good or bad. We’re just interested in making the information around it accessible to everyone so that people can decide for themselves.

In our first sprint we looked at how people relate to the connected devices they have in their homes. We found that though the people we interviewed were reluctant to switch them off at first, or to disable the ‘smart’ functionality, they were open to learning about what their devices are doing.

Influencing the buying decision

With that in mind, we looked at an earlier point in the buying process. We mapped the buying journey.

Mapping the buying journey on a whiteboard. Shows customers want to buy a TV. They research products by reading expert reviews, user reviews, looking on retailer websites and asking friends. Then they make a decision.

What if journalists and reviewers of connected devices were encouraged to write about privacy and security issues? Maybe this could satisfy our aim to influence consumers. If manufacturers knew that their terms and conditions would be scrutinised by reviewers and read by potential customers, maybe they’d make them more transparent from the start.

Our prototype

We made a website in a day and named it Legalease. The purpose of the website was to gather research. It was a throwaway prototype that wouldn’t be launched. It wasn’t Co-op branded so we could avoid any preconceptions. The site showed product terms and conditions and made it easy for reviewers to identify privacy and security clauses that could be clearer.

Shows a screenshot of Legalease prototype. The page shows an LG smart TV and highlights some of the T&Cs. Eg, 'please be aware that if your spoken word includes personal or other sensitive info, it will be captured if you use voice-recognition features'. Page shows someone's comment below: 'and then what happens to it? is it transmitted anywhere?'

The product page showed ‘top highlighted’ parts of the privacy policy ranked by votes. Annotations called into question the highlighted passage.

Shows a screenshot of another tab on the same page as first screenshot. This tab shows the T&Cs in full and contributors can highlight and comment on parts.

Another page showed the ‘full text’ – the full privacy policy document with annotations. The idea is that anybody who’s interested in this sort of thing can create an account and contribute. We imagined a community of enthusiasts would swarm around the text and discuss what they found noteworthy. This would become a resource for product reviewers (who in this case were our user research participants) to use in their reviews.
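The ‘top highlighted’ ranking described above could work something like this sketch. It’s a hypothetical illustration of the mechanic, not Legalease’s actual code: highlighted passages of a policy each carry votes and annotations, and the product page sorts them so the most-voted highlights appear first.

```python
from dataclasses import dataclass, field

@dataclass
class Highlight:
    """A passage of a privacy policy that a contributor has highlighted."""
    passage: str
    votes: int = 0
    annotations: list[str] = field(default_factory=list)

def top_highlights(highlights: list[Highlight], limit: int = 5) -> list[Highlight]:
    # Rank by votes, highest first, and keep only the top few for the product page
    return sorted(highlights, key=lambda h: h.votes, reverse=True)[:limit]
```

With that structure, the ‘full text’ page would simply render every highlight with its annotations, while the product page calls `top_highlights()` to surface the most contested clauses.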

We interviewed reviewers

We spoke to a mixture of journalists and reviewers from publications like the Guardian and BBC and lesser known review sites like rtings.com. We got to understand how they write their stories.

Objectivity versus subjectivity

We found that what they write can be anywhere on the scale of objective to subjective. For example, a reviewer at rtings.com used repeatable machine testing to describe product features while a writer for The Next Web was able to introduce their own personal and political slant in their articles.

Accuracy

We found that the accuracy of their articles was important to them. They’d use their personal and professional contacts for corroboration and often go to the source to give them a chance to reply.

Sensationalism is winning!

We’re in danger of ‘fake news’. One of our research participants said:

“Now, with everything being on the internet, it’s pretty easy for someone who just has a couple of mates to throw stuff together on a blog and it look very persuasive.”

We found that they used a mixture of analytics and social media to measure their impact. There was no mention of being concerned with the broader impact their articles might have in terms of whether or not people bought the products based on certain aspects of what they wrote about.

Reviewers’ thoughts on our product

Some of our research participants made comparisons with websites that have a similar structure and interactions, like Genius and Medium. The annotations on the Legalease prototype highlighted ambiguity in the terms and conditions but our participants didn’t find that useful – they expected more objectivity. They were also concerned about the credibility of the people making the annotations and said that lawyers or similar professionals would carry more weight and authority.

How ‘Co-op’ is the idea?

Our participants thought our prototype was open, fair and community-spirited, so it reflects Co-op’s values. There were question marks around whether an older organisation like Co-op can reinvent itself in this way, though.

Reviewing security as well as features

Security and privacy are starting to show up more often in reviews and coverage of connected devices.

But after our research we don’t think reviewers would use something like a Legalease site to talk about security and privacy. Some of the journalists we spoke to thought their readers didn’t care about these issues, or that people are resigned to a lack of privacy. One said:

“People tend to approach tech products with blind faith, that they do what they say they do.”

Connecting the abstract with the real world

Our participants told us their readers are bothered by being bombarded by targeted ads and being ‘ripped off’. This leads us to consider exploring how to connect the more abstract issues around data protection and privacy to these real-world manifestations of those issues. Then we should explain, in plain, everyday language, why these annoying things keep happening.

James Rice
Product designer

How user research is helping us improve the Membership site

My name’s Simon and I’m one of the user researchers on the Co-op Membership team, alongside my colleague, Vicki Riley. It’s our job to understand what members and non-members need from the service and find out what they think of it. This way we can act on their feedback and continually improve things. Whilst we’re responsible for user research, the whole team get involved in research sessions and meeting users so they can empathise with the people who use the services we’re building. This ensures they design with the user, and not themselves, in mind.

We don’t just rely on one method of user research to find out how people feel about the Membership service. We gather feedback in lots of ways and I wanted to share these with you.

Feedback through the website

The website has a ‘give feedback’ link. As of today, 7 December 2016, we’ve had 9469 comments. We’ve analysed them all and have been comparing them with what we learn from our other research approaches.

Phone call follow up

We often do phone interviews with people who have said they’re happy to be contacted about the website feedback they’ve given. This allows us to get more detailed feedback and also find out how people expect things to work.

Online surveys

We sometimes run online surveys, which allow us to reach a wide range of people quickly and easily. These surveys are around 4 or 5 questions long. We’ve found that the easier it is for someone to give us feedback, the more likely they are to leave some.

Speaking to people in labs

We also speak to people in our research labs. These sound far more ‘scientific’ than they actually are. Research labs are usually just a room with a computer, a microphone and a camera that allows the rest of the team to observe the research. We invite people in and talk to them about shopping, loyalty cards, online services and Co-op Membership. We then watch people using the service as they complete tasks such as registering a temporary card or choosing which local cause to support. I ask them to talk me through what they’re thinking as they use the service so that we understand how they’re finding it.

Store visits

We already visit stores but we plan to do more of this.

Tracking website traffic

Finally, we also gather analytics from the website. This allows us to understand which pages people are visiting, how long they’re spending on pages, what they’re clicking or selecting, and which error messages are triggered most frequently.
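A minimal sketch of that kind of aggregation is below. It’s illustrative only: the event shape (`type`, `page`, `message`) and the function name are hypothetical, not the team’s actual analytics tooling, but it shows how page views and triggered error messages can be counted and ranked.

```python
from collections import Counter

def summarise(events: list[dict]) -> dict:
    """Aggregate hypothetical website events into the kinds of answers
    the team describes: which pages people visit most, and which error
    messages are triggered most frequently."""
    page_views = Counter(e["page"] for e in events if e["type"] == "view")
    errors = Counter(e["message"] for e in events if e["type"] == "error")
    return {
        "most_visited": page_views.most_common(3),
        "top_errors": errors.most_common(3),
    }
```

In practice a web analytics product does this for you, but the principle is the same: raw events in, ranked counts out.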

By using a combination of these research methods, we have access to a wide range of interesting data about how people use the service.

Using research findings to improve

So here’s an example of how we’ve used what we’ve learnt from our research to make a change.

We’d seen through lab testing that people didn’t always understand that they could choose their own cause to support with their 1% for your community reward. We found people thought that we decided for them, or that they would email us later on with their choice. They didn’t notice there was something on the screen that they could click to choose a cause. Here’s how the page used to look:

The Membership page before our design changes. Many users weren't sure how to choose their cause in the 'your community' box

The comments from the feedback link told us the same thing. People had commented:

“I can’t find where to vote regarding where the 1% goes.”
“How do I select my preferred local cause please?”
“Should be able to select which charities I want to support.”

The analytics were backing this up too. We saw that a significant number of people were getting to the page with the ‘call to action’ (the bit where they could choose a cause) but they weren’t actually selecting one.

The team came up with an alternative design to try and make it more obvious how the user could interact with the page. It was a simple content fix. We added ‘See your local causes’ inside the box about ‘your community’. When we tested it with people in the lab, they understood it – they knew what to do. So earlier this week we put it live. Now the page looks like this:

New design of the Membership page includes a simple content fix in the 'your community' box. It now says 'See your local causes'

It’s early days but we’ve already seen more people selecting their cause and therefore benefiting their community. We’ve seen a 10% increase already. We’ll be keeping an eye on the feedback to make sure we’ve improved the journey. We’ll continue to research regularly and as always we’ll keep using what we’ve learnt to improve the service.

Members can visit membership.coop.co.uk to choose a local cause. If you’d like to become a member you can sign up for membership.

Simon Hurst
User researcher on the Membership team

User research, not user testing

I’ve now been at the Co-op for a couple of months. In that time I’ve met lots of people, seen lots of work going on and talked about what I do with many, many people. I’ve even written my first CoopDigital blog post about user research at the Co-op.

One thing that I know we still need to work on is sharing more widely what user research really is and how it should be used to influence what we do and how we do it. This is fine, it’s part of our jobs as experienced agile people and experienced researchers. It’s one of the reasons we’re hiring so many good people.

In my previous place of work, if someone called what we do ‘user testing’ there would almost always be someone who’d jump up and say ‘User research, NOT user testing!’. I was always fairly relaxed about it: I knew what they meant, the person saying it knew what they meant, and it always felt like a bit of an over-reaction to me. Personally, I’d just smile and let it go.

Moving somewhere where it’s a less familiar concept, I’m beginning to realise why people did it.

I’m finding that, for some people, ‘user testing’ is something you do near the end, when you’re fairly convinced you’ve got it right and that it’s going to work and go down well. What you might get is some feedback or minor tweaks to make it even better. I think the issue is the word ‘testing’: testing is generally done just before you go live, to spot bugs and defects.

That’s not what user research in agile development is, what it’s for, or what it’s brilliant at.

A picture of one of the CoopDigital product teams

User research is invaluable in helping us decide whether we should build or release something at all, what that something should be and how it should work. It shows us how the thing we make will fit into users’ lives. It gives us insight into the language people use and how they view the world. It also helps us understand the problems in their lives they’re trying to solve, the tasks they’re trying to achieve and how what we build can help solve that problem or complete that task.

There is also the issue of what we’re testing when we do research: we’re testing our designs, we aren’t testing our users. The user doesn’t pass or fail, the design does.

Simon Hurst
User researcher