Benchmarking – what’s the question?

I attended the fourth meeting of the advisory group for the Delivering efficiencies through effective benchmarking project last week. The project grew out of recommendations from the Diamond report on efficiencies and modernisation in the sector. The report noted that information on the costs of operational activities within higher education is poor, which makes it difficult for institutions to calculate the benefits of efficiency initiatives effectively or to demonstrate more widely how they are ensuring value for money. It concluded that improved cost data is needed to facilitate effective benchmarking, describing benchmarking as a vital tool for driving efficiency.

The project has reaffirmed the demand for benchmarking. However, it is only possible to establish effective benchmarking tools if we know the questions we are trying to answer, and those questions are many and varied. There is a perception in some areas of Government that the sector is feather-bedded and inefficient – should one focus of benchmarking be to demonstrate that the sector is delivering efficiencies? This is a genuine need, but a focus on costs alone fails to tell the whole story. In a sector that is now being encouraged to be far more competitive, it is not unreasonable for institutions to look to spend more on their services in order to enhance the student experience. What is needed is for the impact of any increased spend to be assessed within individual institutions: this requires a baseline measurement at the commencement of the project and an assessment of the benefits realised at various stages after completion, as sketched below. A project at Newcastle University highlighted how a benefits-led approach could enhance the development of IT services within an institution, and there is much to learn from that approach.
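
To make that concrete, here is a minimal sketch of what a benefits-led measurement might look like in practice. The metric, figures, and review points are hypothetical illustrations, not taken from the Newcastle project; the point is simply that a baseline captured at the start makes each later review comparable.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    """A benefit measurement for a single service metric."""
    metric: str      # e.g. "tickets resolved per support FTE" (hypothetical)
    baseline: float  # value recorded at project commencement
    reviews: dict    # review point -> value observed at that point

def benefit_realised(m: Measurement) -> dict:
    """Percentage change from the baseline at each review point."""
    return {
        point: 100.0 * (value - m.baseline) / m.baseline
        for point, value in m.reviews.items()
    }

# Hypothetical figures: a productivity metric tracked after an
# IT service improvement project.
m = Measurement(
    metric="tickets resolved per support FTE",
    baseline=820.0,
    reviews={"6 months": 870.0, "12 months": 940.0},
)
for point, change in benefit_realised(m).items():
    print(f"{m.metric}: {change:+.1f}% at {point}")
```

The essential discipline is the baseline: without a measurement taken before the project starts, any later claim of benefit is unverifiable.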

Highlighting the efficiencies being delivered in the sector is never going to reduce to a simple metric – what the sector needs to do is demonstrate the various ways it is delivering efficiencies. The Efficiencies Exchange provides one way of highlighting such best practice; the professional associations such as UCISA and BUFDG provide others. What these don't provide (and what we should perhaps steer clear of) is a number that indicates the cost savings (efficiencies, to Government) of these measures. The difficulty is that the sector is so diverse that it isn't always possible to extrapolate savings from individual initiatives to an overall figure.

There is merit in comparing costs between institutions – high-level costs can determine whether or not an institution's costs are within the right boundaries, but only that. Even at this level, it is important to have some institutional context and to know what is included within a service's portfolio in order to make sure that the comparison is valid. The UCISA statistics provide both the context and an assessment of overall spend, along with the cost of some of the specialist teams within IT departments. The high-level comparison offers little to the IT Director; what is needed is a more detailed approach that assesses the costs of individual services and activities. This can have several benefits. It can help identify any areas where the cost of provision is either unexpectedly high or low, it can be used to demonstrate value for money when linked to the outputs of services, and it can assist in forming the business case for increased investment in services. It is important to recognise that costing exercises at a detailed level are not cheap themselves if they are to deliver accurate information; however, the benefits can easily justify the costs.
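
As an illustration of how "unexpectedly high or low" might be tested once comparable figures exist, the sketch below flags per-service costs that sit well away from a peer-group average. The institutions, costs, and threshold are entirely hypothetical; a flag is a prompt to investigate context, not a verdict.

```python
from statistics import mean, stdev

# Hypothetical annual cost per student (£) for one service line,
# collected from a peer group using a shared service definition.
service_desk_costs = {
    "Institution A": 41.0,
    "Institution B": 38.5,
    "Institution C": 44.2,
    "Institution D": 67.9,
    "Institution E": 39.8,
}

def flag_outliers(costs: dict, threshold: float = 1.5) -> dict:
    """Return each institution's z-score and whether it exceeds the threshold.

    The threshold is a judgment call, especially for small peer groups;
    with only a handful of institutions, a modest multiple of the
    standard deviation is more useful than a strict cut-off.
    """
    mu, sigma = mean(costs.values()), stdev(costs.values())
    return {
        name: ((cost - mu) / sigma, abs(cost - mu) > threshold * sigma)
        for name, cost in costs.items()
    }

for name, (z, flagged) in flag_outliers(service_desk_costs).items():
    print(f"{name}: z = {z:+.2f}{'  <- investigate' if flagged else ''}")
```

A flagged figure might indicate inefficiency, but it might equally reflect a richer service or a wider scope – which is exactly why comparisons need the common definitions discussed next.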

Whilst the main focus of costing exercises is internal, it is possible, if a common approach is adopted, to compare with other institutions. What is needed for comparison is a common service catalogue which defines what is included within a given service area. UCISA has worked with Janet (UK) to produce a catalogue that covers the IT operation. The catalogue has evolved as it has been trialled at a number of institutions and will need to continue to evolve as new services emerge and are incorporated into mainstream delivery.
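
The catalogue's detailed structure isn't reproduced here, but a toy fragment shows why a shared definition matters: if two institutions include different activities under the same heading, their cost figures aren't comparable. The service lines and activities below are illustrative, not taken from the UCISA/Janet catalogue.

```python
# Illustrative fragment of a shared service catalogue: each service line
# lists the activities its reported cost is expected to cover. The names
# are hypothetical, not drawn from the real catalogue.
CATALOGUE = {
    "Service desk": {
        "first-line support", "incident logging", "self-service portal",
    },
    "Email and calendaring": {
        "mailbox hosting", "spam filtering", "mailing lists",
    },
}

def comparability_gaps(service: str, included: set) -> set:
    """Return catalogue activities missing from an institution's figure.

    A non-empty result means this institution's cost for the service is
    not directly comparable with a fully scoped submission elsewhere.
    """
    return CATALOGUE[service] - included

print(comparability_gaps("Service desk",
                         {"first-line support", "incident logging"}))
# -> {'self-service portal'}: the reported cost excludes this activity
```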

It is clear that there is no 'one size fits all' solution for benchmarking services – what is needed is a framework of tools that meet different needs and focus at different levels. The bulk of the questions that benchmarks need to answer are internal to the sector. However, we need to be wary of benchmarks that provide headline figures without adequate context. It would be all too easy for raw cost information to be seized upon by some in Government to back up the misconception that the sector is fat, and to provide easy justification for further spending cuts.
