The National Student Survey (NSS) results were published today, showing the level of satisfaction of final year undergraduates with various elements of their degree course. Whilst the headlines focus on the overall satisfaction figure, within institutions there is closer analysis of the component questions. A poor score in a given area may highlight issues with particular courses which may not have surfaced earlier. Three of the questions are on learning resources – one each on the Library (“The Library resources are good enough for my needs”), IT (“I have been able to access general IT resources when I needed to”) and specialist equipment and resources (again, the measure is the ability to access them when required).
The NSS ratings are used as key performance indicators in many institutions, and a low rating in any one of the component questions is likely to trigger an investigation into the cause. IT services are no different in that respect, but whilst a low rating may be indicative of a particular issue, does it necessarily follow that a high score indicates good overall performance? Part of the problem is that the questions are very generic and so are open to interpretation. What do students understand by “general IT resources” and “when I needed to”? Participants in one study I have seen considered general IT resources to be computers and printers, the VLE, or hardware support for personal equipment. Only a small proportion of the respondents considered the broader aspects of service provision, such as speed of connectivity and logon, availability of wireless networking, and helpfulness of support staff.

Consequently the NSS rating is likely to be influenced by individuals’ perceptions of a subset of the service. Those who regard computers and printers as key will probably be satisfied if they can get on a PC in a computer room most of the time. Others may be satisfied if their laptop connects seamlessly to the wireless network. Those who focus on the VLE, on the other hand, may well weigh the availability of teaching material within the VLE – something beyond the control of IT services – as heavily as the availability of the system itself when considering their response. So a good NSS score, whilst pleasing, does not give the full picture. For that, IT services will have to invest in a range of activities to obtain feedback on their services and input to their planning.
Looking forward, there is no easy solution to refining the NSS questions, and it certainly isn’t a function of the NSS to replace more specific feedback mechanisms. Nonetheless, students spend a significant amount of time away from their institution or accessing resources from their own devices. This is a clear division and one that could perhaps be tested. A possibility we have discussed at UCISA would be to have two questions – one asking for a view on general IT facilities on campus, and another asking about the ease of access to resources such as e-journals and the VLE from off campus or from personal devices. Although this would be far from perfect, it does distinguish between campus and remote access and could give a clearer picture of where resources might need to be targeted. Otherwise, the responses to the Learning Resources questions within the survey remain good indicators of poor service but poor indicators of good.