


Case study from our community

Improving our website with quick and easy user research

How NCVO did some simple user testing to improve the way it categorises content on its websites.


We have lots of content on our two main websites.

We noticed that we were being inconsistent in the words and terms we use to categorise information across these sites. For example, in some places we label content on staff and employment as ‘HR’, but elsewhere we refer to it as ‘workforce’.

We felt this might be confusing for our users, so decided to develop a consistent set of category names that we could use across everything we do. We hope this will improve the user journey on our sites and make it easier for people to find what they’re looking for.

The issues we faced

We couldn’t say for sure that the terms we were using to categorise our content were the ones that our users were most familiar with. So not only did we need a consistent set of category names, but we wanted them to come from our users.

The language we’d been using was influenced in part by the way we talk about things within NCVO. Some terms reflected old team names, for example.

But it wasn’t enough for a group of NCVO staff to sit around a table and come up with the category names ourselves – we had to ask the people who actually use our websites.

Resources were limited, though, so we had to find a way to get useful results that didn’t need much time or budget.

The actions we took

Working in a cross-team project group, we ran an open card sorting exercise: a simple type of user testing that we could do ourselves, and that would help us understand how people expect information on our websites to be organised and categorised.

Participants were given a set of 30 cards, each labelled with a common query that people use our websites to answer.

They had to sort the cards into groups that made sense to them and then give the groups names that they felt described them best.

(If you want to learn more about card sorting or run your own exercise, have a look at this how-to guide.)

Running our card sorting exercise

In person

For our first round of testing, we recruited people who had come to one of our events – a quick way of finding participants who were already in the building. They did the test during the lunch break (after eating – very important) and we gave each one a free book as a thank you.

Doing the exercise in person was useful – we learnt a lot from hearing people talk through what they were thinking and we could ask them questions about their choices.

But without much time to run the test we could only get through a handful of participants, and as it was an HR conference, we had to be aware that the delegates only represented a portion of our users.

Online testing

We decided to look into ways of running the test online, which would let us reach users in a wider range of roles, organisations and locations.

We found an online user research tool that could run a card sorting exercise. It was quick and easy to set up, and we shared it with our networks on Twitter and LinkedIn.

Analysing the results

Across both tests we had around 60 participants, who came up with hundreds of different category names – some similar, some very different.

Because of this, analysing the findings wasn’t as simple as crunching the data to find a winning set of categories.

Instead, we got together as a group to dig into the data, asking questions and making observations about:

  • how people grouped the cards
  • what cards they commonly grouped together
  • how they categorised things
  • anything interesting or unexpected.
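If you collect your results digitally, some of this digging can be helped along with a small script. The sketch below is not from the article – it's a hypothetical example, with invented card names, group labels and data, showing one simple way to count which pairs of cards participants commonly grouped together and which group names came up most often.

```python
from itertools import combinations
from collections import Counter

# Hypothetical card-sort results: each participant's sort is a dict
# mapping their chosen group name to the cards they put in that group.
# All names and data here are invented for illustration.
results = [
    {"Money": ["Funding", "Payroll"], "People": ["Volunteers", "Recruitment"]},
    {"Finance": ["Funding", "Payroll", "Recruitment"], "Helpers": ["Volunteers"]},
    {"Cash": ["Funding"], "Staff": ["Payroll", "Recruitment", "Volunteers"]},
]

# Count how often each pair of cards lands in the same group --
# a rough signal of which cards users expect to live together.
pair_counts = Counter()
for sort in results:
    for cards in sort.values():
        for a, b in combinations(sorted(cards), 2):
            pair_counts[(a, b)] += 1

# Count how often each group name was used (normalised to lower case),
# to surface the labels participants naturally reach for.
label_counts = Counter(name.lower() for sort in results for name in sort)

for pair, n in pair_counts.most_common(5):
    print(pair, n)
print(label_counts)
```

Counts like these are only a starting point – as we found, the interesting part is discussing the patterns as a group, not just picking the most frequent answer.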

Positive outcomes

We learnt enough from the testing to come up with a set of category names that we think reflects our users’ expectations, rather than our own internal structure and politics.

People also seemed to enjoy the exercise itself (‘This was fun!’) and appreciated being asked for their views (‘It’s nice to be asked’).

Negative outcomes

Even with a fairly small number of participants, there were a lot of results to wade through, which was sometimes overwhelming.

We also had to accept that there was only so much that one test could tell us. The 30 cards covered most of our work, but there were other areas that we’d like to have looked at in more detail.

Lessons learnt

A bit of user research is better than nothing

Getting insights from users doesn’t need to take much time or cost anything.

With limited resources, we did some simple user testing that we could run ourselves. With more time and money we could have done more, but it was better to do what we could than not involve our users at all.

Whether you have five participants or 50, you’ll get some useful insights that will improve the user journey on your websites.

Agree a clear aim for your testing – and write it down

We sometimes got lost in the data when analysing the results of our testing. There were some really interesting findings to delve into, so it was easy to be distracted by things that weren’t relevant to our aims.

While we’d verbally agreed what we wanted to achieve at the start, it would have been helpful to have written this down (and stuck it on the wall) to keep us focused.

There’s no perfect answer and there will always be more to do

There wasn’t a perfect answer hiding among the results – just observations and insights that helped us make an informed decision about what we should call the things we do.

We'd like to do more testing to refine and improve what we've done – but for now, we’re happy that we have a set of category names that our users will understand.

Understanding users is really (really) important

We knew this already, but it was hammered home when we realised we’d been using words to describe our work that none of the 60 participants used in the exercise.

We create our websites for our users, not ourselves – so we need to make time to understand their needs and expectations.


Page last edited Apr 05, 2019
