

Monitoring outcomes of helplines

Milla Gregor, associate consultant for NCVO Charities Evaluation Services, shares her experience of monitoring and evaluating helpline services.

Background

During 2013, Charities Evaluation Services (now part of NCVO) and the children’s charity Coram ran a year-long community of practice for helpline services. Funded by the Dulverton Trust, the project supported participating organisations to develop their outcomes monitoring.

Since then I’ve worked with other charities as an evaluation consultant. Many share similar challenges, as well as a frustration that monitoring guides rarely provide practical examples. Real examples give confidence and inspiration, so I decided to share what I’ve learned since 2013.

I’m grateful to the following organisations (and others who remain anonymous) for sharing their practice:

  • Arthritis Care
  • First4Adoption
  • MOSAC
  • MS Trust
  • Samaritans
  • Survivors Network

The issues we faced

The helplines shared similar challenges:

  • Brief contact with callers, making it hard to attribute callers’ outcomes to the helpline’s work
  • Callers being in crisis, making it feel difficult to ask for feedback
  • A lack of agreement on the helpline’s intended outcomes, which made feedback questions hard to design
  • For anonymous services, difficulty in arranging feedback calls

The actions we took

Support followed four steps:

  1. Setting out intended outcomes
  2. Developing a questionnaire
  3. Gathering information
  4. Analysing and using the information

For each step I’ll set out what approaches helplines used, with real examples, as well as any challenges and what helped.

Setting out intended outcomes

Helplines used either a theory of change or a CES planning triangle to work out their outcomes. Next, they developed monitoring frameworks setting out what information they would collect to measure whether their intended outcomes had occurred.

Real examples of outcomes included:

  • increased understanding of the adoption process for professionals and potential adopters
  • increased awareness among survivors of available support options
  • reduced physical and emotional isolation

One challenge was that developing planning materials could feel overwhelming. Helplines were able to mitigate this by:

  • setting time aside as a team to think about the purpose of the work
  • listening in to calls to understand caller outcomes
  • doing the planning early in the project

Developing a questionnaire

Non-anonymous helplines ran a feedback survey once or twice a year. Some called respondents back, some sent an online survey, and some offered both options. Organisations left 2–12 weeks between the original call and the follow-up, depending on the type of service. This was a tough decision for everyone, as different outcomes have different timescales. The first attempt was treated as a pilot, with improvements then made to the questions and the process.

Real examples of questions:

  • What difference did your contact make to your ability to cope with life with this condition? (not relevant, decreased, no difference, increased a little, big increase)
  • Please indicate how much you agreed with the following statements BEFORE calling us (question repeated for AFTER):
    • I felt very confident about being an adopter (answer options: strongly agree, agree, disagree, strongly disagree, N/A or not sure)
    • I felt very informed about adoption (answer options: strongly agree, agree, disagree, strongly disagree, N/A or not sure)
  • As a result of using the helpline, do you feel better able to manage any negative feelings and thoughts connected to what happened? (answer options: yes a lot, yes a little, no not really, no not at all)
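The before/after question above is a retrospective pre–post measure: respondents rate the same statement twice, and the shift between ratings indicates change. A minimal sketch of how such answers might be scored is below; the scale mapping and sample data are illustrative assumptions, not from any real helpline.

```python
# Hypothetical sketch: scoring 'before vs after' answers from a feedback
# survey like the adoption-confidence question above. The scale mapping
# and sample responses are illustrative only.

# Map answer options to a simple ordinal scale; 'N/A or not sure' is excluded.
SCALE = {"strongly disagree": 1, "disagree": 2, "agree": 3, "strongly agree": 4}

def confidence_shift(responses):
    """Count how many respondents reported higher, equal or lower
    agreement AFTER calling compared with BEFORE."""
    shifts = {"increased": 0, "unchanged": 0, "decreased": 0}
    for before, after in responses:
        if before not in SCALE or after not in SCALE:
            continue  # skip 'N/A or not sure' answers
        diff = SCALE[after] - SCALE[before]
        if diff > 0:
            shifts["increased"] += 1
        elif diff < 0:
            shifts["decreased"] += 1
        else:
            shifts["unchanged"] += 1
    return shifts

# Example: three hypothetical respondents
sample = [
    ("disagree", "agree"),
    ("agree", "agree"),
    ("strongly agree", "N/A or not sure"),
]
print(confidence_shift(sample))
# → {'increased': 1, 'unchanged': 1, 'decreased': 0}
```

Counting shifts per respondent, rather than comparing average before and after scores, keeps the summary easy to explain to trustees and funders.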

What helped:

  • Using a clear list of intended outcomes and a monitoring plan led to shorter and more useful questionnaires
  • Carefully scripting the beginning of the call as feedback (only), to avoid ‘opening the floodgates’ to another helpline call

Gathering information

Non-anonymous helplines used different sampling approaches:

  • Collecting names and numbers over a two-month period, twice a year
  • Contacting everyone who called in a single month, or over a three-month period
  • Collecting all names and calling people back twice a year

They used a three-stage approach:

  • Identify people to call
  • Ask permission to call back (unless the person was in severe distress)
  • Seek feedback, recording information on paper or straight into an online survey tool (SurveyMonkey, SurveyGizmo or Google Docs) to manage their data.

Anonymous helplines used other approaches:

  • Asking for feedback via their website or social media channels
  • Recording detailed feedback from call handlers on a sample of calls
  • Looking at information recorded by advisers on their call-handling system about the types of calls made at different times

Challenges:

  • Initially some call handlers were reluctant to ask permission for feedback, whether or not the person was distressed, as they felt it was not part of their role
  • Using call handlers to carry out feedback calls can be problematic, as people can be uncomfortable giving honest feedback to those who have helped them directly

What helped:

  • Using the same person to do many calls so that they built up skills and experience
  • Giving options for responses (now, later, online, by text etc.)
  • Developing a short drop-down list of common answers (one organisation started with a single question that had 69 options)

Analysing and using the information

Helplines used outcomes information to communicate internally and externally and to reflect on their service and make improvements. 

One challenge was that where a questionnaire was developed without proper planning, it could not be used to report on all outcomes. It helped to have someone else (from another team or organisation) collate the information and do the report, lending independence and saving time.
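Collating answers into simple counts and percentages per question is the core of this kind of report. A minimal sketch is below; the question and answer options are illustrative, loosely based on the coping question quoted earlier, not real data.

```python
# Hypothetical sketch of collating raw survey answers into a simple
# summary table, the kind of report someone from another team might
# produce. Sample answers are illustrative only.

from collections import Counter

def summarise(answers):
    """Turn a list of raw answers into {option: (count, percentage)}."""
    counts = Counter(answers)
    total = sum(counts.values())
    return {option: (n, round(100 * n / total)) for option, n in counts.items()}

# Six hypothetical answers to the 'ability to cope' question
coping = ["increased a little", "big increase", "no difference",
          "increased a little", "big increase", "big increase"]

for option, (n, pct) in sorted(summarise(coping).items()):
    print(f"{option}: {n} ({pct}%)")
```

Keeping the collation in one small, repeatable step makes it easier to hand the task to someone outside the team, as the helplines found helpful.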

Positive outcomes

Organisations experienced a range of benefits:

  • Shorter feedback questionnaires were more efficient and effective
  • Having reliable information on the benefits of their service for callers boosted staff morale and strengthened external communications
  • Having the right information to make service improvements (eg increasing opening hours) 

Negative outcomes

Initially some organisations felt overwhelmed by the workload, but as they began to work with colleagues and peer organisations, the project started to feel more achievable.

Lessons learnt

Find a peer group through formal or informal networks

Many found it beneficial to work with people from similar organisations. One organisation said ‘just to go through that with others that had a similar background… It’s a rare thing to be able to compare with people who are in a completely different genre but in a similar space. To get that support around the table was invaluable’.

Allow time (minimum six months from planning to a completed pilot)

Usually, those developing new monitoring processes already have a ‘day job’. A gentle process, with clear but generous milestones, helps to avoid frustration and pressure.

Just ask

Even the most reluctant advisers found that many callers were happy to help. As one person put it, ‘[While] a lot of our beliefs about our callers are true, some are based on assumptions. [In fact] people are used to being very open. If they trust you and think you are doing it properly, it’s amazing what people will share.’

Just start!

It’s possible to let the perfect be the enemy of the good. At the start, most felt that effective outcomes monitoring would not be possible for them at all, yet all made significant progress. As one person put it: ‘I think it’s easy to spend a lot of time thinking “aah, what if?” and not doing anything. Scribble then edit… Leap in and adapt!’


Page last edited Apr 07, 2017

