Shaping the information landscape

February 5, 2015

One of UCISA’s roles is to ensure that suppliers to our sector are kept abreast of developments that may impact the software and services they deliver. The aim is to alert suppliers to potential changes in legislation or other statutory requirements so that they can plan future developments effectively. A recent example of this activity was the briefing day that UCISA and HEDIIP arranged at the end of January to bring suppliers of student records systems up to date with the work being carried out under the HEDIIP programme.

The meeting heard updates on four of the HEDIIP projects: data capability, the new subject coding system, the Unique Learner Number and the new Information Landscape. In addition we heard from HESA about the CACHED project. The aim of the HEDIIP programme is to redesign the information landscape to enhance the arrangements for the collection, sharing and dissemination of data and information about the HE system. Each of these projects will contribute to that overall goal – I won’t go into detail on these here but if you are interested in learning more, each is outlined on the HEDIIP website.

There were a number of common themes that emerged from the day. The first was the adoption of standards. One of the challenges the sector currently faces is that the same term can mean different things to different organisations (the term course being a prime example), so standard data definitions are essential to a common understanding and to data sharing. This has been a particular problem with the JACS subject coding scheme, where changes and growth in JACS’ range of functions mean it is no longer consistently applied.

The second theme was managing cultural change both within higher education institutions and a number of the organisations requesting data from the sector. In some institutions, many processes are geared around producing the HESA return and the need to get it “right”. The focus on a single return suggests that these institutions may be unaware of the volume of demands made on their data and the amount of resource across the institution spent in ensuring the various returns made are correct. It is highly unlikely that there will be one version of the truth in these institutions – indeed it was noted that one institution had over 200 separate collections of student records. It goes without saying that the data management in such institutions is poor – it will take a significant change to move away from data being an input to deliver a return to a point where it is seen as an institutional asset.

Finally, the biggest challenge is governance. At an institutional level, mature data management will only be achieved with effective information governance being driven from the top table. Getting the value of data understood at senior management level is key to improving the data and information management within an institution. There are wider governance issues that the HEDIIP programme will need to address. Moving to a set of standard data definitions is one challenge – ensuring that the governance mechanisms are in place so that the standards remain consistently applied and understood is a far greater one. Similarly with the new subject coding scheme, establishing a governance model that is supported by an appropriate selection of stakeholders, and that has sufficient authority and resources to manage its evolution, will be critical to the success of the new scheme.

The feedback from those suppliers present was positive. They could recognise the efficiencies in moving to a model where, for the most part, data is submitted to a single point at various points in the year and drawn down from a single repository. The HEDIIP programme is only part of achieving this goal – the institutions need to improve their data management and change their processes, those requesting data may also have to change their processes and suppliers will need to amend their systems to implement new standards and enable data to be extracted at key points in the academic year or cycle. It will be a long journey but one that offers much reward.

ORCID seeds

January 22, 2015

I attended a meeting today to hear the final reports from a number of pilot projects looking at implementing the ORCID researcher identifier. UCISA was one of a number of organisations that were signatories in 2012 to a recommendation that ORCID should become the standard researcher identifier in the UK. Over one million researchers worldwide now have an ORCID with the growth being driven by improved integration with internal and publisher systems. ORCID has been adopted by a number of other countries in Europe and may be emerging as a de facto standard.
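As an aside on the identifier itself: an ORCID iD is a sixteen-character string whose final character is a check digit computed with the ISO 7064 MOD 11-2 algorithm, so systems integrating ORCID can validate an identifier at the point of entry before it ever reaches a publisher or funder system. A minimal sketch in Python (the function names are illustrative, not part of any ORCID API):

```python
def orcid_check_digit(base_digits: str) -> str:
    """Compute the ISO 7064 MOD 11-2 check character for the
    first 15 digits of an ORCID iD."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    return "X" if result == 10 else str(result)


def is_valid_orcid(orcid: str) -> bool:
    """Validate a hyphenated ORCID iD such as 0000-0002-1825-0097."""
    chars = orcid.replace("-", "")
    if len(chars) != 16:
        return False
    return orcid_check_digit(chars[:15]) == chars[15]


# 0000-0002-1825-0097 is a well-known ORCID example identifier.
print(is_valid_orcid("0000-0002-1825-0097"))  # True
```

A check of this kind catches transcription errors at source, which matters when an identifier is expected to follow a researcher across institutional, funder and publisher systems.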

The difficulty in establishing any standard is that the benefits are only realised when there has been widespread adoption covering all aspects of the process. The pilot studies reflected this to a degree with a number highlighting the challenges of selling the long term benefits and managing the expectations of the early adopters within their institutions. Quick(ish) wins include improved internal systems integration but these are perhaps more likely to deliver benefits to professional services teams rather than the researchers themselves.

Overseas, implementation was being driven by mandating use or consortia funding. There was support amongst those present for employing both approaches in the UK. Jisc is to consult shortly on a possible national subscription for ORCID. This was identified as a quick win at a workshop on research data management last year and would encourage adoption across the UK. It would also put the sector in a strong position to lobby funders, publishers and other systems providers to include the ORCID and so facilitate better discovery and integration. However, this would still result in a slow and piecemeal adoption – a degree of mandation would hasten adoption, strengthen the business case and ensure that some of the benefits were realised earlier than might otherwise be the case. Although funders could support adoption by insisting researchers had an ORCID as part of their applications for grants, the key driver was seen as the 2020 REF. ORCID offers an opportunity to ease the burden of reporting on research outputs and impact and this may be sufficient to encourage adoption. Mandating that all researchers submitted to the REF have an ORCID would hasten the process and should deliver wins all round.

The changing landscape of technology in higher education

January 21, 2015

I took part in a panel session at the BETT Show today on the changing landscape of technology in higher education. The panellists were invited to speak for a few minutes at the start of the session in order to prompt further discussion. I took the view that it wasn’t all about the technology…

Firstly there are the students themselves. A while back I spoke to a number of school leavers who were heading to university to try to find out how they were going to use the technology they were taking with them and what their expectations were of using technology at university. Their expectations were probably aligned to what they had seen during open days. They were expecting to make use of computer pool rooms and “Learning commons” facilities but there was little expectation of how technology was going to be used in their own education. Some had thought about the technology they were going to take to university – a smart phone for making quick notes, for reminders and for finding information on the move, a tablet for taking notes in lectures and for searching for information, and a laptop for producing their coursework. But although they regularly exchanged information with friends and were informally learning through their contacts, there wasn’t an understanding of how they were going to translate those skills into their university environment. Consequently universities need to help their students improve their digital capabilities, to help them make good use of the technology they have, to provide facilities for collaboration, to help them stay safe, and to distinguish between good online sources and bad.

On the other side of the equation, do universities have the ability to optimise the use of technology in teaching and learning? Research suggests that a blended model of teaching (utilising both face to face and online components) results in increased learning and understanding. However, expertise in using technology and in employing different pedagogic methods varies enormously amongst academic staff, as does the desire to move to a new teaching model. Teaching online and making use of technology to change how students learn require different skill sets. Facilitating a discussion is different from delivering a lecture. Delivering a short micro lecture that gets a key point across in fifteen minutes is different to delivering a 45-minute lecture. Further, there are many credible resources available online that can be used in teaching. Do academic staff understand how to make best use of the resources available or appreciate how technology could be used to teach in a different way? Universities have to work to develop the digital capabilities of their academic staff. They need to invest in training and supporting academic staff and invest in the estate to provide flexible learning spaces and social spaces that their student body can use for informal learning, collaboration and group work.

The need to invest highlights the need for those making the decisions on funding to understand the possibilities and benefits in investing in technology for teaching and learning and investing in the workforce, and to understand the impact on the estate. Much has been made of the ability of online learning to be easily scaled up and it would be easy to conclude that using technology to deliver learning, whilst not free, is a cheap alternative to traditional models. However, one benefit of the advent of MOOCs has been a recognition that, if you are going to deliver material online, you have to do it well. It is not a cheap option. Universities’ senior management teams, whilst perhaps not needing the same level of digital capabilities as academic staff and students, do need enough knowledge to understand the potential.

A little on technology – MOOCs have been a disruptor, but not in the way that was anticipated. One impact has been the recognition that externally facing resources have to be of a high quality – they are your public face and an advertisement for your institution. As a consequence, the quality of all online delivery has been raised. MOOCs also allow self-paced learning, accommodating different learning styles as students find their way through the presented material. This is one area where learning analytics can make a difference. Using all the information available to the institution – entry data, cumulative data from previous cohorts and data from the students themselves – will allow universities to be more supportive and help guide their students to a successful conclusion. In the future this may be supplemented by personal profiling ahead of entry, helping to guide students earlier to identify the right course for them at the application stage. The question is, how far should this assistance go? Whilst it is desirable to get as high a pass rate as possible, does too much guidance mean that our graduates are less prepared for the world at large?

PS Readers may be interested in the results of UCISA’s Digital Capabilities survey. The Executive summary is available on the UCISA website.

Benchmark to improve

December 9, 2014

UCISA has run the HEITS exercise to collect benchmark statistics for seventeen years. During that time, members have used the data to assist in making business cases for funding, for quality assurance purposes and for comparing themselves with their peers. I attended a workshop run by EUNIS’s BENCHEIT working group last week partly to hear what others were doing in the way of benchmarking and partly to see if there were any lessons that we could learn from our peers (and thirdly to promote the results of the UCISA Digital Capabilities survey).

The Finns compiled their statistics by carrying out an in depth analysis of the costs of services. This is similar to the approach adopted by the Jisc Financial X-ray – although it takes time to produce the data, particularly when considering the apportionment of procurement items and staff costs, it does lead to detailed costs. It also permits quite detailed comparison between institutions. Individual institutions can pick out areas where their costs are very different (higher or lower) and they can then ask questions of the other participants to establish the reasons for the variation.

The Dutch approach was similar but they also used the statistics strategically within the individual institutions. Whilst they also identified the exceptional costs and sought to identify the reasons behind variations, they used the statistics to demonstrate value internally (“the IT infrastructure is only costing x% of the student fee”) and to baseline costs in order to highlight the impact of projects. In both the Finnish and Dutch cases, the statistics prompted an open discussion on the costs of contracts and where there were significant variations they were cited in talks with suppliers in order to bring costs down. There seemed to be far more openness with regard to commercial contracts than appears to be the case in the UK – perhaps this is something we need to address?

Whilst the Dutch and Finns largely concentrated on the costs of services, the Spanish adopted a more holistic approach. They too were carrying out cost comparisons, but this was being done within an overall framework that assessed the maturity of the IT governance and management in the institution. A catalogue of principles, broken down into objectives, each with quantifiable indicators and variables, was used as the basis for the study. Each indicator and variable is fully defined to avoid any ambiguity. The results were then passed back to the institutions, showing their position for each indicator relative to their peers.

The one message that emerged from the workshop is that it is important not to take raw cost figures as the basis for comparison. There are many reasons for differences in costs – the size of the institution and its mission will be contributing factors and the CHEITA group have been looking at using these to facilitate international comparisons (more in a later post). Other factors include the quality of the service being provided and institutional drivers – higher costs may be as a result of investment in any given year. It is important to have a dialogue in order to understand the context and the underlying reasons for any variation. It is a message that I continue to promote in the UUK benchmarking initiatives: the figures alone do not give the full picture – you need to understand the institutional drivers and the value of that spend in order to make a genuine comparison.
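To make that point concrete, a comparison pipeline might normalise raw IT spend by institution size before flagging outliers for follow-up discussion. A hypothetical sketch (all institutions, figures and thresholds are invented for illustration):

```python
from statistics import median

# Invented figures: total IT spend and student numbers per institution.
institutions = [
    {"name": "A", "it_spend": 9_000_000, "students": 18_000},
    {"name": "B", "it_spend": 5_500_000, "students": 11_500},
    {"name": "C", "it_spend": 14_000_000, "students": 16_000},
]

# Normalise raw spend by institution size before comparing.
for inst in institutions:
    inst["spend_per_student"] = inst["it_spend"] / inst["students"]

benchmark = median(i["spend_per_student"] for i in institutions)

# Flag large deviations as prompts for a conversation, not verdicts:
# investment years, service quality and mission all affect the figures.
flagged = [
    inst["name"]
    for inst in institutions
    if abs(inst["spend_per_student"] - benchmark) / benchmark > 0.25
]
print(flagged)  # ['C']
```

The flag is deliberately the start of the dialogue, not the end of it: institution C may simply have invested heavily that year.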

(also published on the UCISA blog)

Lessons from a large scale outsourcing contract

November 17, 2014

In 2007 Somerset County Council created a joint venture company (JVC) with IBM to provide services such as IT, Finance and HR/Payroll to the Council and other similar organisations. The Council have recently published a report on the lessons learnt and some of these may be applicable to the higher education sector as we seek to establish cost sharing groups to provide shared services, or indeed make use of managed contracts to deliver aspects of our back office functions. I’ve identified two below.

The JVC had three initial clients – Somerset County Council themselves, Taunton Deane Borough Council and Somerset Police. The first issue relates to governance. The report notes that the supplier had difficulty in managing the sometimes conflicting expectations and services for the three initial clients and that the partnership “depends upon having similar incentives and an understanding of each partner’s requirements”. This is challenging for shared services where, in order for services to be shared, processes and systems have to be standardised across the partners. Any changes have to be agreed and implemented by all. However there were times when the three clients had differing requirements. Managing the delivery of services that are different (even slightly) brings an additional overhead. Either way, the mechanism for resolving resourcing and operational conflicts needs to be established and agreed at the contract stage and embedded in service level agreements.

The report also notes that the “Client function monitoring a major contract needs to be adequately resourced”. This is an area the higher education sector has struggled with in the past, and it isn’t just a matter of resourcing – those monitoring need to understand what actions are open to them in the event of a service failing to meet the desired quality. Part of this is down to the relationship between the client and supplier. If the focus is too heavily on service metrics and on tying them into the contract, then you are likely to end up with a very mechanistic way of determining service quality, with neither party fully understanding the needs and goals of the other. If, on the other hand, the relationship is built on trust and a true partnership established, then there is a better chance of a shared understanding, greater flexibility (by both parties) and consequently a better service. Metrics-based performance indicators have their place but they need to be supplemented by softer measures and partnership.

Vendor management is rapidly becoming a necessary skill in university and college IT service departments. This does not just apply to managed services; the University of Nottingham won a UCISA Best Practice Award in 2010 for their supplier relationship management programme, in which all of the University’s IT suppliers were allocated a Supplier Relationship Manager. This required a heavy investment in staff development but the return was an improved working relationship with suppliers, strong buy-in from the staff and an improvement in service levels and reliability for end users, in addition to a significant financial return on investment.

It is fair to say that there is not yet a contract in our sector of the complexity of the South West One contract between Somerset County Council and IBM. That’s not to say that there won’t be and, when the time comes, the sector should look to similar contracts to understand what worked and what didn’t. However, some of the lessons here are applicable to smaller contracts and shared services. We would do well to heed them.

Current challenges for Higher Ed Corporate Information Services

November 10, 2014

I circulate a briefing to exhibitors ahead of each of UCISA’s main conferences. As we are approaching CISG14, here’s my take on the current state of the nation…

CIS Services – current issues

It is three years since both the UK and Scottish Governments published White Papers encouraging the sector to put students at the heart of the system. Since then, institutions have focussed on the student experience, streamlining processes and improving facilities. That focus continues. Although some have invested in bespoke applicant environments, there is recognition that institutions need to do more to foster a long term relationship with applicants, to ensure that they progress from applicant to student and through to graduation and alumnus. IT systems are at the hub of this transition. Institutions are looking to their student information systems to support a greater focus on the student journey. They are also investigating how CRM systems can help engagement with applicants, students and alumni, moving from administering customers to putting them at the centre of what we do.

There is a growth in the use of analytics to assist in the support of student services. The focus has primarily been on retention, tracking the interactions students make with a variety of systems (e-assessment, online learning, the Library, IT systems login, security systems) to identify those students who are not engaging with the institution and so are at risk of dropping out. However, there is evidence that institutions are making wider use of the data they hold to recommend materials to students to assist in their studies or to recommend modules based on previous performance. The student now gets a much more personalised experience, with tailored information delivered through portals or apps from a variety of systems and benefits from increased automation of processes.

There continues to be pressure on recurrent spending and so it has become increasingly important to know your numbers and for senior management to have accurate and timely information at their fingertips in order to inform business decisions. Knowing the costs of services is vital to demonstrating the effectiveness of efficiency and modernisation initiatives; services are being benchmarked both internally and against the operations of other institutions. Business intelligence systems are being used for much more than management reporting. They are at the heart of scenario planning exercises and are used to monitor critical business processes to identify whether performance is on target or whether remedial measures need to be taken.

IT is embedded in every aspect of an institution’s operations and continues to enhance processes and deliver business benefits. There remain challenges in ensuring that decisions to develop new services and implement new systems are linked to the institution’s strategies and plans, and are founded on strong business cases, backed by sound financial data. Effective business intelligence can assist with the latter but there remains a need to ensure that practices such as Enterprise Architecture are deployed to ensure greater understanding of the impact of change.

The way in which services are delivered is changing with a blend of outsourced and internal provision now prevalent across the sector. The range of services continues to grow and new technologies emerge and are deployed in our institutions. The IT service needs to be agile to respond to both the changes in delivery model and technological advances; both require the workforce to acquire new skill sets.

Background

The student experience has been the focus since fees were increased to £9000. Whilst this has had a positive impact in that institutions have sought to improve facilities, there has also been an increased emphasis on making services more efficient and more appropriate for the modern age. This has been driven by a number of factors. The increase in fees has placed greater emphasis on value for money, and the level of spend on administration has been highlighted as evidence that institutions are not investing in their core purpose. There has been a belief that the sector is awash with cash, and there continue to be initiatives to demonstrate that the sector punches above its weight whilst making improvements to aspects of its operation. There is, though, a declining unit of resource. Undergraduate fees are fixed and are not index linked; their value in real terms continues to decline. There is also pressure on the contributions made to the sector from Government funds. This is a particular challenge in Scotland, Wales and Northern Ireland where a greater proportion of funding is drawn directly from the national governments; a recent article on the situation in Northern Ireland in the Times Higher illustrates the issue.

Whilst the increase in fees has brought attention on the services for students, there has also been attention on the research institutions deliver. There is a growing emphasis on widening access to the outputs from publicly funded research and institutions are considering how open access to both publications and data can be delivered in a cost effective, efficient way. It is important that institutions are able to link publications to the researchers that produced them so that they can be referenced in applications for funding and so that evidence of their citation can be utilised in exercises like the Research Excellence Framework. The requirements of research focussed systems are complex as there are differences in the policies of the individual research councils and other funders as well as significant variations in disciplinary practices. It is an area that will continue to evolve and it is hoped that some standardisation and use of common identifiers can be achieved in order to facilitate the development of integrated systems.

Cyber security – top table interest

October 28, 2014

The risk cyber crime presents to the higher education sector was highlighted to Vice-Chancellors at the Universities UK Conference in 2012. Since then, there have been a series of round table discussions which have looked at the ability of the UK higher education sector to respond to cyber crime attacks. I attended the most recent of these which focused on the outcomes of a self-assessment exercise UUK promoted earlier in the year.

Those institutions that had completed the exercise will receive individual reports in the near future and a briefing will be circulated to Vice-Chancellors reflecting on the exercise. The briefing will include an additional report giving details of a number of UCISA resources that support institutions in their cyber security initiatives. The detailed results of the exercise are embargoed until the institutions have received their individual reports but, although it is clear that there is work to be done, there are some encouraging signs that cyber security is being taken seriously at a senior level within many institutions.

There are a number of factors that support this assessment. Firstly, over sixty institutions took part in the exercise. In addition to these institutions, I am aware of a number of others that did not take part as they had already carried out similar work, either utilising already published controls (such as the CPNI’s twenty controls for cyber defence) or by engaging external consultants.

Secondly, there was a good level of interest shown in security and risk related topics by delegates at the Universities UK Conference this year. UCISA exhibits at the Conference to promote our resources and activities. Two publications that drew particular interest were the revised Model Regulations for the use of institutional IT systems and the Information Security Toolkit. Effective information security is underpinned by effective regulations, and the Model Regulations give institutions a template to utilise locally. The current version of the Information Security Toolkit provides specimen policies for institutions to revise. The delegates were also interested in the Major Projects Governance Assessment Toolkit – effective governance reduces the risk of projects failing to deliver their anticipated benefits, or having major cost or time overruns.

So there are positive signs that risk and cyber security are being taken seriously. Care is needed though that cyber security is not just seen as an IT problem – people and processes are also important components in implementing effective information security measures. This is something that will be highlighted in the revised Information Security Toolkit – there is a need for senior management ownership and good governance in order for information security to be successfully managed. We also need to guard against IT only featuring at the top table for ‘problem’ issues – we need to work to ensure that the role IT can play in enhancing the student experience and delivering efficiencies is also understood by senior institutional managers.

Postscript – work is currently in progress on a revision of the Information Security Toolkit. It is anticipated that the new version will be launched at the UCISA15 Conference in March 2015.

Meeting the accessibility challenge

October 1, 2014

I attended a session at the Educause conference today on accessibility. This has become more of an issue in the US as a number of universities have faced litigation because of their lack of compliance with disability discrimination legislation. The number of cases is, in the overall context of the US education industry, relatively small but the amount of the awards made against institutions has made some university executives nervous and has driven moves towards greater compliance.

Temple University was one such institution. The University Board set a project in motion to review the current level of provision and take the steps necessary to comply with disability discrimination law. The initial analysis showed that Temple were not compliant with many aspects of that legislation – essentially in the same boat as many other institutions. I suspect that this is much the case in the UK too – there is some awareness of the disability legislation but not of what is required in order to comply.

However, Temple’s Board sought to address this, recognising that they needed to tackle the problem on a number of fronts. It was necessary to define the policy for the institution but then follow it through so that considering accessibility started to become business as usual. A broad based committee was established to oversee the project. Led by the CIO, it included representatives from the service departments but also Estates and the institutional counsel. The policy the group established was clear – we will be accessible. Responsibility for accessibility was devolved to the person providing the technology or information – so faculty were responsible for ensuring their materials were accessible and heads of service were responsible for ensuring compliance in their areas. Will became the watchword – where there were items that could not be made accessible, those responsible were challenged to think of another mode of delivery or whether the items were necessary at all.

After the initial audit, Temple appointed departmental liaison officers who were responsible for promoting the accessibility message within the department, ensuring departmental accessibility initiatives were funded and evaluating accessibility during the procurement process. The group established standards for the web services, learning spaces and IT labs, each bearing in mind the principle that accessibility should be standard provision, not the exception. Checklists were prepared to assist faculty in assessing their materials. Once the preparation was complete, the CIO promoted the policy and available support to a wide range of institutional groups through a series of roadshows.
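Checklists of this kind can be backed by simple automated checks. For example, a first-pass scan for images with no alt attribute – one of the most common web accessibility failures (WCAG success criterion 1.1.1) – can be written with nothing beyond Python’s standard library. A minimal sketch, not any tool Temple actually used:

```python
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Count <img> tags that lack an alt attribute entirely.

    Note: an explicitly empty alt="" is legitimate for purely
    decorative images, so only a missing attribute is flagged here.
    """

    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1


# One compliant image, one with no alt attribute at all.
page = '<p><img src="map.png" alt="Campus map"><img src="logo.png"></p>'
checker = AltTextChecker()
checker.feed(page)
print(checker.missing)  # 1
```

A scan like this only catches the mechanical failures; whether the alt text actually describes the image still needs the human review the checklists were designed for.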

There were some quick wins once the policy began to be implemented. The largest and most used IT labs were upgraded first bringing an instant return. Web accessibility standards were introduced and processes established to ensure compliance. Control panels in smart classrooms were upgraded. However, not everything gave so rapid a return. Although the processes were in place to ensure the web sites were compliant, adoption was slow. The guidelines for instructional materials took over 12 months to complete and a larger group was established to review and amend them as required. The initiative wasn’t cheap – Temple spent over $600k in their move towards compliance.

Not all institutions in the US had followed the same road – some opted to steer clear of even establishing an accessibility policy, as they felt that doing so would put them at greater risk of litigation. I suspect the reverse is true – if you have a policy in place and plans to implement it, then I believe you are less exposed to litigation, as you have recognised that you have a problem (in not being compliant) and are taking steps to address it. I wonder how compliant UK institutions are with the Disability Discrimination Act. My gut feeling is that not many are. Will it take litigation in the UK to change that?

Sharing across borders

September 30, 2014

UCISA is a member of the Coalition of Higher Education IT Associations (CHEITA). Many of the issues we face in the UK are the same in other countries – it is hoped that the existence of CHEITA will encourage international collaboration to address those issues. The following is a report on the Spring meeting of CHEITA.

The Spring CHEITA meeting took place ahead of the UCISA14 Conference in Brighton, UK in March. The meeting looked at the four main issues that were identified at the CHEITA meeting at EDUCAUSE in Anaheim in October 2013 and sought to identify resources that member associations were willing to share to assist others in addressing those issues. In addition, there was a brief update on the benchmarking activities since Anaheim. The afternoon session was dedicated to the support of research and included a number of presentations. The meeting was attended by representatives from France, Italy, Sweden, EUNIS (the Europe wide association), Hong Kong, South Africa, the USA and the UK.

Benchmarking

Susan Grajek gave a brief update on the work of the CHEITA Benchmarking Group and the work EDUCAUSE have carried out. Susan highlighted the Top Ten IT Issues and the Top Ten Strategic Technologies for 2014. The discussion in Anaheim had focused on the need to develop maturity indices for technologies in higher education institutions (HEIs). Susan noted three areas where EDUCAUSE had developed indices:

• Research computing (see http://www.surveygizmo.com/s3/1125699/Research-Computing-Maturity-Index);
• Analytics (see http://www.educause.edu/ecar/research-publications/ecar-analytics-maturity-index-higher-education);
• E-learning (see https://www.surveygizmo.com/s3/1298256/E-Learning-Maturity-Index).

In addition SURF have developed a maturity index for Green IT (see http://www.surf.nl/en/knowledge-and-innovation/knowledge-base/2014/surf-green-ict-maturity-model.html).

A meeting of the EUNIS Benchmarking Group was held in December with Leah Lang attending from EDUCAUSE. The group had identified five elements that could be established as international IT benchmarks. It was noted that there were particular challenges in measuring spend and the quality of service delivered. It was also noted that it was difficult to compare institutions internationally because of the different educational systems in each country and the different institutional missions within them. The CAUDIT Complexity Index may provide a mechanism for facilitating international comparison. It was noted that the index worked well in South Africa and that initial results from applying it to US institutions were encouraging.

Jisc have developed a Financial X-Ray to establish the cost of IT in institutions. This work has identified a taxonomy for IT services in institutions and uses financial and staffing information to identify the full cost of each element of IT service provision. This is available as a service from Jisc. It was noted that a substantial amount of effort was required to establish full and accurate costs. A group looking at benchmarking of all university services is considering using the X-Ray method across all service departments to facilitate nationwide benchmarking.
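The core of the X-Ray approach – mapping cost line items onto a service taxonomy and totalling the full cost per service – can be sketched as below. The taxonomy elements and figures are invented for illustration; they are not Jisc’s actual categories or data.

```python
# Illustrative sketch of the Financial X-Ray idea: map financial and
# staffing line items onto an IT service taxonomy and total the full
# cost of each service element. All names and numbers are hypothetical.

def full_costs(line_items):
    """Aggregate staff and non-staff costs per taxonomy element."""
    totals = {}
    for item in line_items:
        service = item["service"]
        totals[service] = totals.get(service, 0) + item["staff"] + item["non_staff"]
    return totals

items = [
    {"service": "Email & collaboration", "staff": 120_000, "non_staff": 45_000},
    {"service": "Email & collaboration", "staff": 30_000, "non_staff": 10_000},
    {"service": "Research computing", "staff": 200_000, "non_staff": 150_000},
]

print(full_costs(items))
```

The hard part in practice, as noted above, is not the arithmetic but establishing accurate inputs – apportioning staff time and overheads to the right taxonomy element.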

There was a brief discussion on the role of benchmarking in driving improvements and efficiencies in institutions. There is a need to link cost with the quality of the service provided, both in terms of the service itself and customer satisfaction. Without an understanding of the quality of service and its relationship with cost, there is a risk that institutional management may jeopardise quality services by comparing on cost alone. The UCISA approach has been to encourage benchsharing – institutions looking at the outputs from statistics exercises should compare all aspects of a service with their peers.

In addition to core data surveys, UCISA has carried out a benchmarking study on university service desks in the UK (report launched at the UCISA14 Conference) and Technology Enhanced Learning (report published in September 2014). UCISA is also planning to carry out a survey on Digital Capabilities and will look to see how these surveys can be shared effectively across the CHEITA members.

Information Security

A number of associations were carrying out work to improve information security in institutions in their countries. Cineca have developed systems to provide services on demand. These include virtual machines, disaster recovery and a remote systems management service. Cineca are storing and maintaining scientific data sets, backing them up and managing access to them through various clients. It was noted that institutions in Italy are mandated to have business continuity plans in place; the Cineca system assists in those plans.

The regulations institutions have in place underpin good information security. UCISA was launching the latest edition of its Model Regulations at the Conference; these are designed for institutions to take and adapt as they require. In addition, UCISA was revising the Information Security Toolkit. This is a substantial piece of work and the expected publication date is March 2015. The current version is available online.

In the UK, the operations of universities are overseen by governing boards that include members with no higher education background or involvement. UCISA has worked with the Leadership Foundation for Higher Education in the UK to produce a guide for institutional governors to help them understand the application of IT in universities and the related issues that institutions might face.

Efficiencies and modernisation/cloud and shared services

It was noted that all associations seek to drive efficiencies and modernisation in their membership by promoting best practice and sharing knowledge. Those that are consortia will help their members achieve efficiencies by developing new (and potentially shared) services for their members. The challenge within individual institutions is demonstrating that initiatives are delivering the efficiencies expected.

There were a number of developments taking place in Italy. Cineca were looking at providing an on demand service for MOOCs. The prospect of developing a system based on the complete student lifecycle was being investigated. Cineca were developing a number of cloud solutions. EUDAT is a collaborative data infrastructure which will allow research data to be shared between communities, and Fortissimo provides services running on a cloud based high performance computing (HPC) infrastructure. In the UK, the possibility of a data centre being shared between a number of research focussed institutions to facilitate the sharing of research data was also being investigated. The University of Aberdeen and Robert Gordon University have refurbished a data centre and are now sharing it with another institution in the North East of Scotland. The initiative won the UCISA Award for Excellence this year.

EDUCAUSE have established a working group looking at the total cost of ownership of cloud computing and are starting work to establish if Cloud is cost effective. The Financial X-ray work from Jisc started as an initiative to ensure that institutions understood their full internal costs and so were able to compare their internal provision with cloud offerings. UCISA has produced a briefing paper on Cloud Computing targeted at senior management within institutions.

There is an initiative in Europe that is looking at the area of learning analytics. The LACE project is funded by the European Union and is considering the ethical aspects of learning analytics as well as looking to share best practice.

Support of research

The afternoon part of the meeting focused on the support of research and CHEITA delegates were joined by representatives from universities and a number of other organisations. It was recognised that, with increasing international collaboration, standards played a key part in sharing information. The session opened with two presentations looking at standards.

EuroCRIS is the European organisation for international research information. Although primarily Europe based, it has members worldwide, including Australia, Canada and the USA. EuroCRIS promotes sharing through CERIF, the Common European Research Information Format. CERIF supports a range of data objects, including publication, person and funding. The intention is that Research Information Systems (RIS) will hold information in, or be able to import and export information in, the CERIF format. In the UK, the framework has been used to track publications and harmonise reporting. Germany is following the UK model. EuroCRIS are linking with CASRAI and ORCID.

A working party formed by Jisc, and including representation from UK universities, research organisations and UCISA, recommended that ORCID be adopted as a standard identifier for researchers. Following on from that recommendation, Jisc have established a number of pilot projects to streamline the ORCID implementation process at universities and to develop the best value approach for a potential UK wide adoption of ORCID in higher education. The pilots were due to begin in April 2014 and mirror similar projects taking place in the US.
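One practical detail for anyone building ORCID support into an institutional system: an ORCID iD is a 16-character identifier whose final character is a check digit, computed with the ISO 7064 MOD 11-2 algorithm that ORCID documents. A minimal validation sketch:

```python
# Validate an ORCID iD's check digit using ISO 7064 MOD 11-2,
# the algorithm described in ORCID's own documentation.

def orcid_check_digit(base_digits):
    """Compute the check character for the first 15 digits of an ORCID iD."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid):
    """Validate an ORCID iD such as '0000-0002-1825-0097'."""
    digits = orcid.replace("-", "")
    if len(digits) != 16 or not digits[:15].isdigit():
        return False
    return orcid_check_digit(digits[:15]) == digits[15]

print(is_valid_orcid("0000-0002-1825-0097"))  # ORCID's documented example: True
```

Checking the digit on ingest catches most transcription errors before a mistyped iD is attached to the wrong researcher’s record.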

Open access to research outputs, including data, is proving a challenge for CHEITA members. The pressure to develop an infrastructure to facilitate open access comes both from governments, who seek to maximise the return on their investment in research by making the outputs more publicly available, and from researchers themselves (particularly younger researchers), who seek to build their reputations through publications. For publications, the difficulty is balancing the timeliness of public access against the desire for research outputs to be peer reviewed, and the commercial aspects of publishing against the open access movement. Open access to research data presents a further set of problems – the data need to be made available in such a way that they are discoverable and reusable, and their curation and preservation need to be well managed. Both data and publications need to be discoverable. In Italy, Cineca have produced a number of resources to assist Italian universities. These include a directory of open access repositories, a registry of archiving policies where open access has been mandated, a directory of open access journals and a portal providing a central point of access to publications archived in Italian open access repositories and journals. In addition, they have been participating in cross-Europe initiatives such as OpenAIRE to support the discovery, sharing and reuse of research outputs.

The meeting moved on to discuss institutional responses to the challenge of supporting research, with two case studies presented. In both it was clear that institutions need to invest heavily in supporting research if they are to maintain and/or enhance their research standing. The University of Cape Town (UCT) have established an eResearch Centre. The University recognised that leading research universities have a strategy that ensures their researchers are equipped with the latest tools and techniques to raise their profile and improve collaboration. Consequently UCT planned to build an eResearch Centre to support their strategic mission to raise the quality of research within the institution and its profile globally. The first phase is to establish the core IT infrastructure to support research – HPC, storage and on demand (cloud) services were key to the initial phase, but they must also be supported by dedicated IT and Library staff with a strong understanding of research. Thereafter the infrastructure can be built on by identifying discipline focussed pilot projects to develop institutional capabilities. Interdisciplinary projects can then follow, before finally moving to international collaboration.

The University of Bristol was also contemplating setting up an eResearch centre, which would bring two strands of activity together. The first was to develop a research data service which would help researchers develop data management plans, provide training and assist with archiving data. The other was to develop an effective IT infrastructure to support the diverse requirements of researchers at the University. Bristol already provide 5TB of storage to their researchers but need to build the support and tools to further assist their research faculty.

There were a number of common themes from the two presentations. The first was that IT departments are poor at communicating with researchers – this has led to frustration and a trend for researchers to do their own thing and build their own research infrastructure and support. A possible solution was to create a new role of research analyst within IT or the eResearch Centre. This would be someone with a research background who would be better placed to understand researchers’ needs, both helping them to use the tools available and communicating their requirements to IT. This would go some way to making things as easy as possible for researchers – the institution needs to provide tools and support to its research community.

The meeting concluded with a discussion on research infrastructure models. These varied between highly centralised and government sponsored services (such as in Finland), services developed by consortia (such as in Italy) and services developed collaboratively between institutions (a growing model in the UK). In South Africa, one institution (UCT) appears to be taking the lead. The conclusion was that there is no one size fits all solution. There are perhaps efficiencies to be gained from a coordinated national approach, which may require direction from government to be achieved.

Mission impossible

June 17, 2014

I attended a session at the EUNIS conference on internet security given by Leif Nixon from the Swedish National Supercomputer Centre. The first two words of the presentation title were “Mission impossible” so it seemed unlikely that there would be many answers to the challenge of securing the internet. And so it proved. Although, as Leif pointed out, people will “hook up all sorts of crap to the internet” (his words), the problem is with how that crap is configured, allied with individuals or groups that take advantage of vulnerabilities to attack systems. A great deal of what is connected to the internet is insecure by default.

Leif highlighted a range of devices that were insecure when purchased. These included routers for which the default password for the admin username was ‘admin’. He had found out how to reset the password, but the chances of an ordinary member of the public doing so were practically nil. It was much the same story with patching firmware – even if end users received a patch to update firmware to correct a vulnerability, few would have the expertise to apply it. It wasn’t just routers that were the problem – webcams, access control systems and printers all had vulnerabilities of various forms. These included open root access, poorly configured firmware (support for which was often deprecated) and automated responses to polling on given communications ports. Such devices are not just home owned – the chances are that there will be unsecured devices in practically every organisation (Leif illustrated the point by identifying devices in a number of organisations).
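For an IT department, one modest defence against this class of problem is to audit the device inventory against known factory defaults. A minimal sketch of the idea follows; the device models, credentials and addresses are all invented for illustration (a real audit would query the devices themselves rather than a static record):

```python
# Hypothetical sketch: flag inventoried devices still using a known
# vendor default username/password. All models, credentials and hosts
# below are invented examples, not real vendor defaults.

KNOWN_DEFAULTS = {
    ("acme-router", "admin"): "admin",
    ("acme-webcam", "root"): "",  # shipped with open root access
}

def flag_default_credentials(inventory):
    """Return the hosts of devices still using a factory default credential."""
    flagged = []
    for device in inventory:
        key = (device["model"], device["username"])
        if KNOWN_DEFAULTS.get(key) == device["password"]:
            flagged.append(device["host"])
    return flagged

inventory = [
    {"host": "10.0.0.1", "model": "acme-router", "username": "admin", "password": "admin"},
    {"host": "10.0.0.2", "model": "acme-router", "username": "admin", "password": "s3cret"},
]

print(flag_default_credentials(inventory))  # only the unchanged device is flagged
```

This only catches what the inventory knows about, of course – which is precisely Leif’s point about the unsecured devices nobody has recorded.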

Why is this a problem? In short, it is these devices that are used as the source of a range of internet attacks such as denial of service. Those that are so minded can gain control of the device or can utilise its configuration to generate targeted traffic. The consumerisation of IT means that there is pressure on low end manufacturers to produce devices at as low a cost as possible and as a consequence there is little or no effort put into ensuring that the devices are secure at sale or providing ongoing support for the firmware. There is no financial incentive for them to do so. As a result there will continue to be insecure devices connected to the internet for those who want to do harm to exploit, making securing the internet truly “mission impossible”.

