

How to write an evaluation report

Writing an evaluation report helps you share key findings and recommendations with internal and external stakeholders. A report can be used to suggest changes to how you work, to communicate your value to funders, or to share good practice with other organisations. It can also be the starting point for reporting in creative formats.

You'll need:

  • data that you've collected and analysed
  • an understanding of the people who'll be reading your report
  • helpful colleagues to read your drafts.

Consider your audience

Think about the people you're reporting to so you can tell them what they need to know. You should consider:

  • what kind of information they need (eg whether they need to know more about the difference you’ve made or the way in which you’ve delivered your work)
  • how they'd like the information presented (eg as tables, case studies or infographics), and when
  • why they need the information and what you want them to do as a result
  • whether there are any accessibility needs that you need to consider (eg does the report need to work on a screen reader?).

Plan your report

Having a clear structure makes your report easier to read. Before you write, plan your headings and subheadings. Most evaluation reports will include the following sections.

  • Executive summary – a synopsis of your key findings and recommendations.
  • Introduction – a brief description of what you're evaluating, the purpose of your evaluation and the methods you've used (eg surveys, interviews).
  • Findings and discussion – information on what you delivered, how you delivered it and what outcomes occurred.
  • Recommendations – actions that need to be taken to respond to the evaluation findings.

Write about your findings

What to report on

Reports will vary depending on the nature of your work, but you'll probably need to include findings on the following.

  • Outcomes – what outcomes have been achieved, for whom and under what circumstances. You should also report on intended outcomes that haven't been achieved.
  • Outputs – what has been delivered, when and to whom. You should also report on how satisfied beneficiaries were with your outputs.
  • Processes – information about how you delivered your outputs. You may need this information to explain why something worked particularly well, or why it didn’t work.

Describe and interpret your data

In your report, you should describe your data and interpret it – analysing your data before you start writing will help with this.

Description means presenting what the data tells you. You might describe, for example, what outcomes were achieved, by whom and in what circumstances.

Interpretation moves beyond description to say what the data means – make sure you word your report clearly so the reader can tell when you're describing data and when you're interpreting it. To help you interpret data, you could do the following.

  • Make connections by looking for trends, patterns and links. For example, if two groups had very different outcomes, what factors might have led to this?
  • Put data in a meaningful context. Numbers don’t speak for themselves. Is 70% good or bad? How do you know?

When you interpret your data, you could discuss the following.

  • Why outcomes were achieved, or not achieved. Understanding this may help you make decisions about future service planning. Many funders will also want to know about this.
  • What worked and what didn’t. Knowing about this will put you in a good position to improve your work. It may also be useful to share with partners or funders to improve practice in the sector.
  • Answers to your evaluation questions. When you planned your evaluation, you may have had two or three key questions you wanted it to answer. For example, you may have wanted to know whether your service works equally well for all groups.

Use subheadings to structure your ideas

Subheadings will make your report clear for your readers. Looking back at your evaluation framework or theory of change can help you generate ideas for subheadings. It often makes sense to have a subheading for each intended outcome.

Sometimes you'll have collected data about the same outcome from a range of different sources such as questionnaires, interviews, observation or secondary data. When you analysed your data, you probably looked at each source separately. In your report, it usually makes sense to write about all the data relating to each outcome together (rather than having separate sections on data from different sources).

Choose how to present your data

A common mistake is to try to present all your data, rather than focusing on what is most important. It helps to narrow down to what people reading your report need to know.

It’s also important to think about how you'll present your information. You could consider the following points.

Which key numbers do your audience need to know?

  • Decide whether to report using percentages, averages or other statistics.
  • Think about whether you need to compare numerical data for different groups. You may want to look at whether men were more likely to experience outcomes than women, for instance.
  • Read our guide on analysing quantitative data.
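Comparisons like these can be computed directly from your response data before you write them up. The sketch below is a minimal, purely illustrative example in Python – the field names, groups and figures are all made up:

```python
# A minimal sketch of comparing an outcome rate across two groups.
# All data, group labels and field names here are illustrative.
responses = [
    {"group": "men", "outcome_achieved": True},
    {"group": "men", "outcome_achieved": False},
    {"group": "women", "outcome_achieved": True},
    {"group": "women", "outcome_achieved": True},
]

def outcome_rate(responses, group):
    """Percentage of respondents in `group` who achieved the outcome."""
    members = [r for r in responses if r["group"] == group]
    achieved = sum(r["outcome_achieved"] for r in members)
    return 100 * achieved / len(members)

for group in ("men", "women"):
    print(f"{group}: {outcome_rate(responses, group):.0f}%")
```

Reporting the rate for each group side by side, rather than a single overall figure, is what lets readers see whether the service worked equally well for everyone.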

Which quotations will help you illustrate your themes?

  • Choose quotations that bring your outcomes to life. Don’t choose too many or they'll distract the reader from the point you want to make.
  • Have a mixture of typical responses and those that don’t fit easily into your categories.
  • Read our guide on analysing qualitative data.

What visual aids will you use?

  • Diagrams, graphs or charts should be used to highlight the most important information, rather than information which is less relevant.
  • It’s very easy for diagrams to mislead your audience – for example, a bar chart whose vertical axis doesn’t start at zero can exaggerate small differences. If you think a diagram might be misleading, it’s better to leave it out.

As far as possible, present data that has been analysed or summarised rather than raw data, to make it as easy as possible for the reader to follow.
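Summarising raw data like this can be done in a few lines. The sketch below is illustrative only – it uses Python's standard-library `collections.Counter` on made-up satisfaction ratings to turn a raw list of responses into the kind of frequency summary a reader can actually follow:

```python
# Summarise raw responses into counts and percentages before presenting
# them. The ratings data is illustrative.
from collections import Counter

raw_ratings = ["good", "good", "excellent", "poor", "good", "excellent"]

summary = Counter(raw_ratings)
total = len(raw_ratings)
for rating, count in summary.most_common():
    print(f"{rating}: {count} ({100 * count / total:.0f}%)")
```

The raw list would go in an appendix at most; the summary is what belongs in the body of the report.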


Write accurately and clearly

It’s important to write accurately and clearly so that your report can be easily understood and is not misleading.

Be transparent

Being transparent means being open about what you can and can’t say, and clear about how you reached your conclusions and about the limitations of your data. 

Just as it's important to minimise bias when collecting or analysing data, it's equally important to minimise bias when reporting.

  • Avoid overclaiming your role in making a difference. Your work may not be solely responsible for the outcomes that have occurred for individuals or organisations you've worked with. Remember to report on evidence of any other contributing factors (eg support received from other organisations or other sources).
  • Choose case studies carefully. Evaluation case studies are not the same as marketing case studies. They should illustrate your learning points, not just the very best of what you do. You won't have a representative group of case studies, but as far as possible, choose case studies – and quotations – that reflect the full range of responses you had.
  • Explore alternative interpretations or causal links. Sometimes, data is ambiguous and there could be more than one interpretation. All of us are prone to 'confirmation bias' – paying more attention to data that fits our existing beliefs. It's important to look for and talk about reasonable alternative interpretations or explanations of your data.
  • Be clear about the limitations of your data. If there was a group you weren't able to hear from, or your sample over- or under-represents a particular group, say so.
  • Be open about your sample size. In general, the smaller your sample, the less able you are to make generalisations about everyone in your target group.
  • Report negative findings. If the data shows something isn't working or an outcome hasn't been achieved, don’t ignore it. Reporting negative findings will help your audience to use the evaluation to learn and improve.

Check anonymity and consent

When you collected your data, respondents will have said whether they wanted to remain anonymous (most do) and whether you should check with them before using a quote or case study in your report. Make sure you do any checking with plenty of time before you need to complete the report.

Depending on the size of your sample and how easy it is to identify individuals, you may have to do more than just change the name to make someone anonymous. You might have to change their age or other identifying details, or remove references to anything that would allow people to identify them as an individual.
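If you anonymise case studies systematically, it can help to do it in one place in your data preparation rather than by hand in the report. The sketch below is a hypothetical example of this idea – the record fields, pseudonym scheme and age bands are assumptions, not a prescribed method:

```python
# A minimal sketch of anonymising a case-study record: replace the name
# with a pseudonym and coarsen the exact age into an age band.
# Field names and data are illustrative.
def anonymise(record, pseudonym):
    band_low = (record["age"] // 10) * 10  # eg age 34 falls in band 30-39
    return {
        "name": pseudonym,
        "age_band": f"{band_low}-{band_low + 9}",
        "quote": record["quote"],
    }

record = {"name": "Jane Smith", "age": 34, "quote": "The service helped me."}
print(anonymise(record, "Participant A"))
```

Even with steps like these, check the result against the rest of your sample – if only one person in a small group fits a given age band, the band itself may still identify them.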

Use clear language

Evaluation reports need to be as clear and precise as possible in their wording. Be especially careful about using the word 'proof' or 'prove'. To 'prove' something requires 100% certainty, which you are very unlikely to have. 'Indicates', 'demonstrates', 'shows', 'suggests' or 'is evidence for' are useful alternative phrases.

Keep your language simple and straightforward. Remember to explain any terminology that might be unfamiliar to your audience. 


Develop your recommendations

Your recommendations are likely to be one of the most important parts of your report. Good recommendations will make your evaluation findings more likely to be used.

Recommendations are more likely to be implemented if the following factors are considered.

  • Supported by evidence – be clear how the recommendations build on the key findings. It can help to structure the recommendations in the same order as the main findings so readers understand the evidence base for each.
  • Specific – say exactly what action needs to be taken and when, keeping within the scope of what the evaluation can support.
  • Targeted at users – make sure the individuals or groups you're addressing have the authority and capability to take forward what you're suggesting.
  • Realistic and achievable – recommendations should be feasible. You can categorise them by which ones are easy to implement and which are less so. More ‘difficult’ recommendations might need budget or staff changes. These should still be stated, but so should their implications.
  • Prioritised – it’s helpful to indicate some priorities for action. You could, for example, split your recommendations into ‘essential’ versus ‘optional’ or ‘for consideration’ versus ‘for action’. Make sure the number of recommendations you include is achievable.

Involve people in the reporting process

You can involve other staff, beneficiaries and external stakeholders at several points. For example, you could share your report drafts and ask them to help you refine the conclusions. This 'co-production' of findings can be valuable and yield interpretations you may not have thought about.

You can also co-produce recommendations by sharing the findings with stakeholders and asking them to suggest and prioritise recommendations. If you do this, take care to guide people to base their recommendations on the evidence, and not their own interests or preoccupations.


Finish the report

Allow time for a couple of report drafts and make sure that there are people available to review the report for you. It's good to have someone look at it with ‘fresh eyes’. If the report is being widely shared, you could have someone from outside your sector review the draft to make sure it's clear for external audiences.

To complete the report, leave time for proofreading and editing, checking references, and design and print if needed. You might include your data collection tools in appendices – this could help other organisations working in your field to improve their evaluation.

Further information

This how-to was contributed by NCVO and StudySaurus. It was produced as part of a UK-wide collaborative programme supporting a focus on impact in the voluntary sector.


Page last edited 25 February 2022.
