

Case study from our community

How to carry out effective user research

Building a new digital service can be hard. How do you know what features your users will need the most? Or what platform they’ll be most likely to use? Fortunately you don’t need to have all the answers yourself. In fact, you shouldn’t. Because that’s what user research is for. 

User research is not the same as market research, and 1:1 interviews will give you more valuable, in-depth insights than surveys and focus groups. The point of user research is to find out about your users’ attitudes, behaviour, goals and challenges. It’s about uncovering how they see the world, and how they behave in it. If you understand this, you can design and build a service that users actually need, not one you think they need.

Watch this video to find out why you should do user research, what the common types of research are, how to approach research ethics, and some top tips for writing an interview script. We’ve also developed a worksheet (http://bit.ly/2VXMOAC) that you can fill in as you go.

Here are the tips that are covered in the video, plus a few bonus ones: 

 

1. Don’t ask leading questions

Leading questions massively bias your findings and should be avoided. Shouldn’t they? That said, it’s easy to catch yourself asking one when you’re in mid-flow during an interview. To avoid this, write out your questions in advance, including potential follow-on questions. Anything that starts ‘Do you think…’ or something similar is almost certainly a leading question. Non-verbal cues can also be leading, so be conscious of your body language when interviewing people face to face.

2. You need detail

It’s important to frame your questions in a way that prompts your interviewee to open up about a specific topic. If any of your questions might get a ‘yes’, ‘no’ or ‘maybe’, rewrite them so they’re more open-ended. If you do get a one-word answer during an interview, try to draw out more detail by asking ‘why?’ or ‘how?’.

3. Avoid asking about future behaviour

Asking someone if they will do something, or are likely to use a product or service, is unhelpful. People say they will do all sorts of things and may mean it at the time. But it’s not good evidence to build a product or service on. If you have a question in an interview that says, ‘would you be likely to do x in the future?’, scrap it. It’s more valuable to ask about their real past behaviours than their hypothetical future behaviours. But if you really want to know if people will do something, build a prototype and test it.

4. Avoid jargon

Make sure that the language you use in your interviews makes sense to your users. Avoiding jargon will make your interviewee feel more at ease and will avoid misunderstandings or confusion. 

5. Tech habits

Knowing what devices and technology your beneficiaries are using can be vital to understanding how they behave and how they might use a new service. To find out what they use, ask them to describe a particular time they used a specific device recently. What did they use it for? Why did they choose that device?

6. Before the interview, work out the one thing you want to find out from your research

If you could only find out one thing from your interviews, what would it be? Holding that in mind can be especially helpful if an interview doesn’t go to plan, as you can skip the other questions and focus just on answering that one thing.  

 

7. Add an intro and thank you

Your users might be nervous about taking part in an interview. Make sure that you explain at the beginning of the interview what they can expect from the session so that they feel at ease, and thank them at the end for giving up their time to talk to you. 

8. Don’t just ask users what they want

It’s often unrealistic to ask your users to guess what product or feature is likely to best solve their problems because they aren’t technology designers. It’s helpful to understand people’s preferences, but user research should go beyond that. For example, if a user says they want a certain feature you can ask them, ‘what would that help you to do?’ Asking this means you can understand what problem the feature solves, and when it’s needed.

A second reason not to just ask someone what they want is that people aren’t able to articulate solutions they don’t know about yet. There’s a famous story in user research circles known as ‘faster horses’. It supposedly comes from Henry Ford, who said that if he’d only listened to what people asked for, he would never have built a car. Instead he’d have created faster horses, because that was the mode of transport people were used to, and that’s what they were limited to thinking about.

9. Remember that recruiting users for interviews is always the biggest problem

Without fail, the biggest challenge is getting users to interview or test with. There are lots of ways to do this, from putting out a call to service users in your network (or the gatekeepers to those service users) through to using a professional user-recruitment agency. There are pros and cons to both approaches. Whatever you do, make sure you plan it well in advance. For charities with vulnerable or hard-to-reach users, it’s even more important to be patient and flexible.

10. Leave enough time for analysis

When you book out time for interviews or usability testing, book out time for the analysis as well. A good rule of thumb is to allow at least as much time as the interview or testing session itself took. Where possible, make analysis collaborative by involving the wider team, including in-house developers or the tech partner you’re working with. The findings will mean much more to the team if they understand where they came from.

Further information

This blog was written by CAST (Centre for Acceleration of Social Technology).

These are, of course, just 10 tips - we’d welcome your additions so that the list becomes even more useful for the sector.

 

  
