


Case study from our community

Improving efficiency with operational research

How Crimestoppers Trust used operational research methods to manage increased demand for its services without increasing costs.


The Operational Research Society runs a scheme, Pro Bono O.R., which provides short-term support to charities to help them make operational improvements.

Operational research (OR), sometimes known as management science, is the discipline of applying appropriate analytical methods to help those who run organisations make better decisions. Organisations may seek a very wide range of operational improvements - for example, greater efficiency, better customer service, higher quality or lower cost.

The aim of Pro Bono O.R. is to help voluntary sector organisations do a better job, improve their desired outcomes, and build capacity, using the skills of volunteer OR analysts and consultants.

One charity which Pro Bono O.R. has supported is Crimestoppers Trust.

The issues we faced

We were asked to help with planning staff rosters for Crimestoppers’ call centre. The call centre is open 24/7 and handles phone calls and emails from members of the public, who provide information (anonymously) that may be useful in solving crimes. The call centre was about to take on additional areas of work and anticipated a 60% increase in business, which it needed to handle without increasing staffing costs while continuing to meet its customer service target (90% of calls answered within 20 seconds).
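The case study does not say how a staffing level is matched to a target like “90% of calls answered within 20 seconds”, but the standard queueing check for such a target is the Erlang C formula. The Python sketch below uses purely illustrative figures (the call volumes and handling times are not from the case study) to show how the service level varies with the number of staff on duty:

```python
from math import exp, factorial

def erlang_c_wait_prob(arrival_rate, aht, agents):
    """Probability that an arriving call has to wait (Erlang C)."""
    a = arrival_rate * aht  # offered load in Erlangs
    if agents <= a:
        return 1.0  # unstable: the queue grows without bound
    top = (a ** agents / factorial(agents)) * (agents / (agents - a))
    bottom = sum(a ** k / factorial(k) for k in range(agents)) + top
    return top / bottom

def service_level(arrival_rate, aht, agents, target_wait):
    """Fraction of calls answered within target_wait seconds."""
    pw = erlang_c_wait_prob(arrival_rate, aht, agents)
    a = arrival_rate * aht
    return 1.0 - pw * exp(-(agents - a) * target_wait / aht)

# Illustrative figures only: 120 calls/hour, 4-minute average
# handling time, 20-second answer target.
rate = 120 / 3600.0  # calls per second
aht = 240.0          # average handling time, seconds
for n in range(9, 14):
    print(n, round(service_level(rate, aht, n, 20.0), 3))
```

With these assumed figures the offered load is 8 Erlangs, and around 12 staff are needed to reach the 90% level; adding each extra person above that buys progressively less improvement, which is why roster shape (when people work) matters as much as headcount.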

The actions we took

We collected data on customer contacts (both phone calls and online forms), mainly from the charity’s call handling system, though we also used paper-based information about current and potential rosters. We then analysed the information, summarising arrival patterns by time of day, day of week, and time of year.
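As a sketch of that kind of summary (the charity’s actual export format is not given in the article, so the timestamps below are invented), counting contacts by hour of day and day of week takes only a few lines of Python:

```python
from collections import Counter
from datetime import datetime

WEEKDAYS = ("Monday", "Tuesday", "Wednesday", "Thursday",
            "Friday", "Saturday", "Sunday")

# Hypothetical extract from a call-handling system export:
# one ISO timestamp per customer contact.
timestamps = [
    "2016-11-07T09:15:00", "2016-11-07T09:42:00", "2016-11-07T14:03:00",
    "2016-11-08T10:30:00", "2016-11-12T11:05:00", "2016-11-12T23:50:00",
]

by_hour = Counter()
by_weekday = Counter()
for ts in timestamps:
    dt = datetime.fromisoformat(ts)
    by_hour[dt.hour] += 1
    by_weekday[WEEKDAYS[dt.weekday()]] += 1

print(dict(by_hour))     # contacts per hour of day
print(dict(by_weekday))  # contacts per day of week
```

The same grouping extended over a year of data gives the time-of-year pattern; the resulting arrival profile is what feeds the simulation model described below.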

There were gaps in the data: for example, the duration of phone calls was known, but there was no information on the time staff spent on follow-up work after a call ended. As both the arrival of calls and the time spent answering them were random, we chose an off-the-shelf simulation package that was easy to learn and had appropriate facilities, including the ability to run quick ‘what-if’ scenarios. The model was able to handle current and forecast business volumes, two different process models, and different staff rosters. An example of the model structure can be seen below:

[Figure: Operational Research Society – structure of the call centre simulation model]

This model has two types of customer contact (phone calls and online forms) and three types of staff (call handlers, online staff and shift leaders). At busy periods, calls can be rerouted to shift leaders and, when they are also busy, rerouted again to online staff.
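The overflow rule above can be sketched as a small event-driven simulation. The Python below is illustrative only: the staffing numbers, arrival rate and handling times are assumptions rather than the charity’s figures, and the actual study used an off-the-shelf visual simulation package, not hand-written code.

```python
import heapq
import random

def simulate(hours, rate_per_hr, mean_handle_min, handlers, leaders, online, seed=1):
    """Calls try call handlers first, then shift leaders, then online
    staff; if everyone is busy the call is 'lost'."""
    rng = random.Random(seed)
    free = {"handler": handlers, "leader": leaders, "online": online}
    releases = []  # heap of (finish_time, pool) for calls in progress
    t, horizon = 0.0, hours * 60.0  # clock in minutes
    offered = lost = 0
    while True:
        t += rng.expovariate(rate_per_hr / 60.0)  # next arrival
        if t > horizon:
            break
        while releases and releases[0][0] <= t:   # free finished staff
            _, pool = heapq.heappop(releases)
            free[pool] += 1
        offered += 1
        for pool in ("handler", "leader", "online"):  # overflow order
            if free[pool] > 0:
                free[pool] -= 1
                finish = t + rng.expovariate(1.0 / mean_handle_min)
                heapq.heappush(releases, (finish, pool))
                break
        else:
            lost += 1  # all three staff types busy
    return offered, lost

# A week of illustrative traffic: 30 calls/hour, 4-minute average
# handling time, 2 call handlers, 1 shift leader, 1 online staff member.
offered, lost = simulate(hours=168, rate_per_hr=30, mean_handle_min=4,
                         handlers=2, leaders=1, online=1)
print(f"offered={offered} lost={lost} ({100 * lost / offered:.1f}%)")
```

Changing the staffing arguments and re-running is exactly the kind of quick ‘what-if’ comparison the packaged tool made possible, and counting lost calls per week is what the validation step below checks against reality.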

Before using the model to test different staff rosters, we needed to validate it against actual data. We did this by comparing the actual number of ‘lost’ calls (calls that went unanswered because all staff were busy) with the number produced by the model for a full week. See the diagram below.

Positive outcomes

Once we were confident that the model was a sufficiently close representation of reality, we worked with the Staff Manager and the Performance Manager to try out different staff rosters. This was an iterative process (the chart below shows the customer service KPI for the staff rosters actually in place and for the five alternatives that were considered).

It was wonderful to see the faces of the managers as they realised the power of the simulation to compare options quickly. The final agreed shift patterns indicated a 7% improvement in the percentage of calls answered within the target time, a nearly 50% decrease in lost calls, and a 40% decrease in the average time to answer a call, without any increase in staffing costs. A month after the new shift patterns were introduced we received an email from the Performance Manager: “This afternoon I received the management report figures for the Bureau, and I wanted to share the highlights with you as they reflect the first month on the new shift pattern. Service levels have increased from an average of 90% last year to 94.3% in January. The percentage of lost calls has fallen from a peak of 12% to 5.7%. I know this is just one month’s data but it indicates that the expectations we had of the new patterns are being realised, combined with the efforts of all our staff at the Bureau. We are grinning like Cheshire cats”.

Negative outcomes

There was an interesting reaction from call centre staff to our initial proposal for new rosters: although we originally modelled the type of shift patterns they said they would prefer, when they were shown the solution they said they didn’t like it and proposed an alternative themselves. The advantage of the model was that it could quickly and easily demonstrate the anticipated outcomes of their suggested shift pattern (poorer customer service and increased costs). A compromise solution was eventually agreed, with which staff were content.

Lessons learnt

Visual simulation software was excellent for helping the client’s staff to work with the consultants, quickly spot any problems, enhance the analysis and see results really quickly. The immediate feedback was invaluable: managers would suggest additional Saturday morning shifts (for example) and, when they saw the results, could immediately suggest starting them an hour later. Another lesson was that, despite the lack of a full set of data, it was possible to tackle a difficult problem using rough estimates, provided that checks were made when data did become available.


Page last edited Jul 25, 2017
