Importance of data quality

I attended a meeting of the Data Efficiency Steering Group yesterday. The Group was set up to look at recommendations in a report by KPMG on the use of collected data in the higher education sector. The meeting was an interesting mix – on the one hand the funding councils, as the main stakeholders in the data collection, and on the other the various professional associations representing the views of those tasked with collecting, processing and using the data at an institutional level.

There are conflicting demands on the funding councils – they are looking to reduce the administrative burden of collecting data but are under pressure to demonstrate that the sector is meeting its statutory obligations. They also need to work together to achieve a consensus, so that comparisons can be made across the whole of the UK and local needs can be met without ending up with a hotchpotch of collections or one unwieldy one. These conflicts were brought into focus in a discussion about monitoring external examiners to ensure equal opportunities. One funding council representative expressed the view that external examiners should not be included: responsibility lay solely with the institution to ensure that they were appropriate in terms of their academic credibility. Another view was that they should be regarded as employees and included in monitoring. Against this, the institutions highlighted that data on external examiners was not held consistently across the sector – in a number of cases there was no record of such positions on HR databases, and monitoring would impose an additional administrative burden.

In addition to ensuring that the data collected is appropriate and fit for purpose, there is a second stream of activities aimed at promoting the value of the HESA data itself. The data collection is often seen as a burden in institutions, and there is a lack of senior management ownership of the data. As a consequence, the importance and impact of the data are not always clearly understood in institutions – the consequences of poor data quality only become apparent when performance indicators or league tables are published. Some institutions, however, are focusing more on the data they collect and have to submit. As a result, they are starting to use the data more effectively to highlight problems within their institutions, and are improving the quality of their own data to reduce the likelihood of mistakes in submissions. Case studies on these institutions may help others raise their game. Guidance on how the data comes together and is translated into performance indicators will also help focus attention on the importance of data quality.

So, two strands of activity: making sure the data collected is appropriate and timely and does not increase the burden on institutions, and promoting the uses of the data and the need for data quality. Progress is being made, but it will take time to deliver on both fronts. There is some overlap with the work of the MIAP programme. It is a measure of how far that programme has to go in convincing the sector of its value that one attendee commented ‘not that that [MIAP] will deliver anything until I am long in retirement’. MIAP aims to improve data efficiency, but thus far the sector has not been convinced that it will deliver savings or reduce the administrative burden. The proposed pilot studies will be key in demonstrating the business benefits and promoting the programme to the sector.
