Monthly Archives: December 2014

Learning from others

It is all too easy to think that our sector is unique, and that our needs and applications are so different from those of other sectors and other countries that we have little to learn from them. That may have been the case in the past, but universities and colleges are now large-scale businesses with significant turnovers and are being operated as such, so the perception that we cannot learn from others is increasingly wide of the mark. We can learn from the commercial sector, particularly in areas such as customer service and marketing; we can learn about delivering complex projects and leadership from those working in other fields; and we can learn from specialists in the application of standards and approaches. It is for all these reasons that we at UCISA look to bring in speakers from outside the sector to many of our events.

We can also learn from our overseas colleagues. UCISA is a member of CHEITA, the Coalition of Higher Education IT Associations, and there are many problems that are common to different countries in spite of the variation in higher education systems. Consequently we look to share our expertise internationally and identify those issues that other countries have tackled successfully. Learning from others was part of the reason that we ran the bursary scheme for UCISA members to attend EUNIS, EDUCAUSE and other overseas conferences. The winners of the bursaries have written a number of blog postings highlighting the lessons they learned from the conferences.

In a similar vein, the outputs from UCISA’s work have also been promoted internationally. The results of the TEL survey were presented at the ascilite2014 conference in New Zealand and at EDUCAUSE; resources from both are available on the UCISA website. Similarly, the initial findings from the Digital Capabilities survey were presented to the EUNIS Benchmarking Group. In each instance, sharing best practice was matched by learning from others: I’ve blogged previously about the Benchmarking Group workshop, and Richard Walker has reported on both the ascilite conference and the Echo 360 Community Conference he subsequently attended.

Last year delegates from Sweden, Italy, South Africa, Hong Kong and the US attended the UCISA Conference in Brighton. Both they and the UK-based delegates they spoke to appreciated the opportunity to learn from each other. We are looking to encourage more of our overseas partners to attend the UCISA15 Conference and, to give an additional opportunity for interaction, to arrange a small event on the Tuesday afternoon focused on one of the issues common across the world. Further details will be published in the New Year.

Review of the year

2014 brought a range of challenges to the sector as a whole and to our institutions’ IT service departments. Although undergraduate student numbers continued to recover, the unit of resource continued to fall; this, combined with unpredictability in part-time, postgraduate and international student numbers, has meant that institutions have continued to look for efficiencies. Some have had to combine managing a declining recurrent income with managing significant programmes to improve infrastructure and the student experience. UCISA, through the work of its Executive, its Groups and the central office, has sought to address the needs of our community in these difficult times. Brief highlights of this work are given below.

Following the creation of a trading company in December last year and endorsement of the proposed approach at the AGM in March, work has continued to move the charity from being a charitable trust to a charitable company limited by guarantee. This work is nearing completion and I am confident that this will allow us to strengthen the UCISA offering and allow the Association to provide more for our members in the coming years.

In 2014 we have:

  • Run fifteen events, over a third of which were fully booked, covering a range of topics and including three multi-day conferences with exhibitions;
  • Published the results of the seventh survey on Technology Enhanced Learning and supporting case studies, and promoted the outputs internationally;
  • Engaged with Universities UK on cyber security issues with a focus on UCISA material being included in a Universities UK briefing;
  • Published the third edition of the Model Regulations for the use of institutional IT facilities and systems (the new edition takes into account the increased use of personal devices to access institutional facilities and growth in the use of social networks);
  • Represented the community in discussions with the leading plagiarism service provider on improving performance;
  • Published revised guidelines to assist institutions with responses to standard copyright infringement notices;
  • Introduced new event formats including webinars and facilitated workshops;
  • Taken part in the review of the HESA Financial return at both a strategic and operational level;
  • Represented the community on the HEDIIP Advisory Panel;
  • Engaged with Jisc to provide input to the co-design process, represent the community on steering groups and advisory boards, and act as a customer representative;
  • Provided bursaries to allow individuals to attend overseas and specialist conferences and highlight and communicate the best practice identified and lessons learned to the UCISA community;
  • Engaged with Scottish and Welsh IT Directors through their forums and instigated work to explore regional engagement in England;
  • Carried out the inaugural Digital Capabilities survey and published the initial findings;
  • Published two sets of case studies, on the challenges of digital skills training and on the role of mobile in technology enhanced learning;
  • Continued to represent our members on the UCAS Council and HESA User Group;
  • Published the Major Projects Governance Assessment Toolkit to encourage a more rigorous approach to governance and project management and the Effective Risk Management best practice guide;
  • Provided input to the Efficiency Exchange;
  • Carried out a benchmarking survey with commercial partners on service desks in the sector and published the results;
  • Continued to work with the Leadership Foundation for Higher Education on leadership development and improving management skills;
  • Worked with Universities UK and overseas organisations on benchmarking initiatives;
  • Strengthened our relationships with overseas organisations through our membership of CHEITA (the Coalition of HE IT Associations), hosting a seminar focused on the support of research before the UCISA14 Conference;
  • Continued to foster strong relationships with suppliers to the sector, briefing them on trends in the sector to aid their understanding and assist them to develop their marketing, and growing the corporate membership from 105 in 2013 to 130 in 2014.

The list above highlights just some of the work that UCISA has carried out on behalf of our members. A more formal annual report will be published in the New Year and presented at the Association’s AGM at the UCISA15 Conference in Edinburgh on 27 March.

We are now investing more in external resources to help deliver our more substantive projects, and the outputs from a number of these will be published early in the New Year, including the Information Security Management Toolkit, the Social Media Toolkit and the fourth edition of the Exploiting and protecting the network guide.

I should like to take this opportunity to remind you that bookings are open for the UCISA15 Conference in March. We are planning a second international seminar for the Tuesday immediately before the conference – details will be available early in the New Year. Bookings are also open for three other events taking place in January and February. I would also encourage you to showcase the excellent work in our institutions by submitting an entry for the UCISA Award for Excellence.

Finally, thank you for your support in 2014. I wish you, on behalf of the UCISA staff, all the best for Christmas and the New Year.

Peter Tinson
19 December 2014

Benchmarking sans frontieres

UCISA is a member of CHEITA, the Coalition of Higher Education IT Associations. CHEITA exists to share best practice globally and, although the education systems vary greatly from country to country, we are all tackling much the same issues. As I noted in an earlier blog post, the group have been looking at whether there is scope for benchmarking internationally.

One of the challenges of benchmarking has always been ensuring that comparisons are made between similar institutions. The difficulty has been determining which institutions are similar enough for relevant comparisons to be made – there can be significant variation even within mission groups. CAUDIT, our Australasian sister organisation, has sought to address this by developing a complexity index. Their initial model used student FTE, staff FTE, research income and the number of campuses as the factors in determining how complex an institution was. A complexity score was calculated from these data and plotted against institutional IT spend. This revealed a strong, near-linear correlation between the complexity score and IT spend – it was possible to derive a best-fit straight line through the data points. There were some outliers but, on further investigation, most of these had errors in their data; once these had been resolved they moved closer to the IT spend predicted by the best-fit line.
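To illustrate the shape of such an index, here is a minimal sketch in Python. The weights and data below are entirely hypothetical – CAUDIT’s actual weightings are not reproduced here – but the calculation follows the same pattern: a weighted score fitted against IT spend, with large residuals flagging outliers worth re-checking.

    import numpy as np

    # Hypothetical weights -- CAUDIT's actual weightings are not published here.
    WEIGHTS = {"student_fte": 0.4, "staff_fte": 0.3, "research_income": 0.2, "campuses": 0.1}

    def complexity_score(inst):
        """Weighted sum of normalised institutional factors."""
        return sum(WEIGHTS[k] * inst[k] for k in WEIGHTS)

    # Illustrative, made-up data: (normalised factors, IT spend in GBP millions).
    institutions = [
        ({"student_fte": 0.8, "staff_fte": 0.7, "research_income": 0.9, "campuses": 0.5}, 22.0),
        ({"student_fte": 0.4, "staff_fte": 0.3, "research_income": 0.2, "campuses": 0.2}, 8.5),
        ({"student_fte": 0.6, "staff_fte": 0.5, "research_income": 0.4, "campuses": 0.3}, 13.0),
    ]

    scores = np.array([complexity_score(inst) for inst, _ in institutions])
    spend = np.array([s for _, s in institutions])

    # Least-squares best-fit line: IT spend ~ m * complexity + c.
    m, c = np.polyfit(scores, spend, 1)
    residuals = spend - (m * scores + c)  # large residuals flag possible data errors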

The CHEITA benchmarking group adopted a similar model, using a weighted calculation based on staff and student FTE numbers and research income. The results for each country were the same: a strong correlation between the complexity score and institutional IT spend. The graph for the UK data from 2012 is shown here alongside data for the other countries taking part in the exercise (the Y-axis, institutional IT spend, ranges from nil to $250 million – acknowledgements to Leah Lang from EDUCAUSE for producing the graph). Apart from two or three outliers, it is possible to achieve quite a close fit, and we will be looking to see whether we can use the index with our HEITS figures to aid comparison.

Graph showing UK HEIs against international comparators


In order to compare across borders, the financial components need adjusting. Exchange rates are not suitable, particularly when applied to the euro, where the purchasing power of the currency varies across the Eurozone. Purchasing power parity allows for such variation. By applying purchasing power parity to the research income and IT spend, it is possible to make international comparisons. There is still some work to be carried out, but it is hoped that the initial findings will be published and that this will instigate discussions between institutions in different countries.
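In practice the adjustment is simply a division by a conversion factor. The sketch below uses made-up PPP factors (real ones come from sources such as the OECD or World Bank), and the function name is mine, not part of any published methodology.

    # Illustrative PPP conversion factors (local currency units per international
    # dollar); real factors come from sources such as the OECD or World Bank.
    PPP_FACTOR = {"GBP": 0.70, "EUR": 0.80, "AUD": 1.45}  # hypothetical values

    def to_international_dollars(amount, currency):
        """Convert a local-currency figure into PPP-adjusted international dollars."""
        return amount / PPP_FACTOR[currency]

    # A UK institution's IT spend of 14 million pounds becomes roughly $20m
    # PPP-adjusted, and can then sit on the same axis as its overseas peers.
    print(to_international_dollars(14_000_000, "GBP"))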

Benchmark to improve

UCISA has run the HEITS exercise to collect benchmarking statistics for seventeen years. During that time, members have used the data to assist in making business cases for funding, for quality assurance purposes and for comparing themselves with their peers. I attended a workshop run by EUNIS’s BENCHEIT working group last week, partly to hear what others were doing in the way of benchmarking, partly to see whether there were any lessons we could learn from our peers, and partly to promote the results of the UCISA Digital Capabilities survey.

The Finns compiled their statistics by carrying out an in-depth analysis of the costs of services. This is similar to the approach adopted by the Jisc Financial X-ray: although it takes time to produce the data, particularly when considering the apportionment of procurement items and staff costs, it does lead to detailed costs. It also permits quite detailed comparison between institutions. Individual institutions can pick out areas where their costs are very different (higher or lower) and can then ask questions of the other participants to establish the reasons for the variation.
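The apportionment step is the labour-intensive part. A minimal sketch of the idea, with entirely invented cost pools, services and fractions, might look like this:

    # Hypothetical apportionment: shared cost pools are split across services
    # according to estimated effort or usage fractions.
    shared_costs = {"staff": 1_200_000, "procurement": 400_000}

    # Fraction of each pool attributed to each service (sums to 1 per pool).
    apportionment = {
        "email":        {"staff": 0.15, "procurement": 0.10},
        "service_desk": {"staff": 0.35, "procurement": 0.05},
        "network":      {"staff": 0.50, "procurement": 0.85},
    }

    service_costs = {
        service: sum(shared_costs[pool] * share for pool, share in shares.items())
        for service, shares in apportionment.items()
    }
    # e.g. service_costs["network"] = 1_200_000*0.50 + 400_000*0.85 = 940_000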

The Dutch approach was similar, but they also used the statistics strategically within individual institutions. Whilst they too identified exceptional costs and sought to establish the reasons behind variations, they used the statistics to demonstrate value internally (“the IT infrastructure is only costing x% of the student fee”) and to baseline costs in order to highlight the impact of projects. In both the Finnish and Dutch cases, the statistics prompted an open discussion on the costs of contracts, and where there were significant variations these were cited in talks with suppliers in order to bring costs down. There seemed to be far more openness with regard to commercial contracts than appears to be the case in the UK – perhaps this is something we need to address?

Whilst the Dutch and Finns largely concentrated on the costs of services, the Spanish adopted a more holistic approach. They too were carrying out cost comparisons, but this was being done within an overall framework that assessed the maturity of IT governance and management in the institution. A catalogue of principles, broken down into objectives, each with quantifiable indicators and variables, was used as the basis for the study; each indicator and variable is fully defined to avoid any ambiguity. The results were then passed back to the institutions, showing their position for each indicator relative to their peers.
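Reporting a position relative to peers can be as simple as a percentile rank per indicator. The following sketch is purely illustrative – the indicator names, scales and values are invented, not drawn from the Spanish catalogue:

    from statistics import mean

    # Hypothetical indicator values collected from participating institutions.
    peer_values = {
        "gov.1.1_board_it_reporting": [2, 3, 3, 4, 5],   # maturity scale 1-5
        "mgmt.2.3_cost_per_student":  [310, 275, 420, 390, 350],
    }

    def peer_position(indicator, value):
        """Return the fraction of peers at or below this institution's value."""
        values = peer_values[indicator]
        return sum(v <= value for v in values) / len(values)

    # An institution scoring 4 on the governance indicator sits at the 80th
    # percentile of this (made-up) peer group; the mean is a second reference.
    print(peer_position("gov.1.1_board_it_reporting", 4))
    print(mean(peer_values["gov.1.1_board_it_reporting"]))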

The one message that emerged from the workshop is that it is important not to take raw cost figures as the basis for comparison. There are many reasons for differences in costs: the size of the institution and its mission will be contributing factors, and the CHEITA group have been looking at using these to facilitate international comparisons (more in a later post). Other factors include the quality of the service being provided and institutional drivers – higher costs may be the result of investment in any given year. It is important to have a dialogue in order to understand the context and the underlying reasons for any variation. It is a message that I continue to promote in the UUK benchmarking initiatives: the figures alone do not give the full picture – you need to understand the institutional drivers and the value of that spend in order to make a genuine comparison.