Yearly Archives: 2014

Learning from others

It is all too easy to think that our sector is unique and that our needs and applications are so different from other sectors and other countries that we have little to learn from them. That may have been the case in the past, but universities and colleges are now large-scale businesses with significant turnovers and are being operated as such, so the perception that we cannot learn from others is increasingly wide of the mark. We can learn from the commercial sector, particularly in areas such as customer service and marketing; we can learn about delivering complex projects and leadership from those working in other fields; and we can learn from specialists in the application of standards and approaches. It is for all these reasons that we at UCISA look to bring in speakers from outside the sector to many of our events.

We can also learn from our overseas colleagues. UCISA is a member of CHEITA, the Coalition of Higher Education IT Associations, and there are many problems that are common to different countries in spite of the variation in higher education systems. Consequently we look to share our expertise internationally and identify those issues that other countries have tackled successfully. Learning from others was part of the reason that we ran the bursary scheme for UCISA members to attend EUNIS, EDUCAUSE and other overseas conferences. The winners of the bursaries have written a number of blog postings highlighting the lessons they learned from the conferences.

In a similar vein, the outputs from UCISA’s work have also been promoted internationally. The outputs from the TEL survey were presented at the ascilite2014 conference in New Zealand and at EDUCAUSE; resources from both are available on the UCISA website. Similarly, the initial findings from the Digital Capabilities survey were presented to the EUNIS Benchmarking Group. In each instance, the sharing of best practice was matched by learning from others; I’ve blogged previously about the Benchmarking Group workshop and Richard Walker has reported on both the ascilite conference and the Echo 360 Community Conference he subsequently attended.

Last year delegates from Sweden, Italy, South Africa, Hong Kong and the US attended the UCISA Conference in Brighton. Both they and the UK-based delegates they spoke to appreciated the opportunity to learn from each other. We’re looking to encourage more of our overseas partners to attend the UCISA15 Conference and, to provide an additional opportunity for interaction, we are arranging a small event on the Tuesday afternoon focused on one of the issues common across the world. Further details will be published in the New Year.

Review of the year

2014 brought a range of challenges to the sector as a whole and to our institutions’ IT service departments. Although undergraduate student numbers continued to recover, the unit of resource continued to fall and this, combined with unpredictability in part-time, postgraduate and international student numbers, meant that institutions continued to look for efficiencies. Some have had to combine managing a declining recurrent income with managing significant programmes to improve infrastructure and the student experience. UCISA, through the work of its Executive, its Groups and the central office, has sought to address the needs of our community in these difficult times. Brief highlights of this work are given below.

Following the creation of a trading company in December last year and endorsement of the proposed approach at the AGM in March, work has continued to move the charity from being a charitable trust to a charitable company limited by guarantee. This work is nearing completion and I am confident that this will allow us to strengthen the UCISA offering and allow the Association to provide more for our members in the coming years.

In 2014 we have:

  • Run fifteen events, over a third of which were fully booked, covering a range of topics and including three multi-day conferences with exhibitions;
  • Published the results of the seventh survey on Technology Enhanced Learning and supporting case studies, and promoted the outputs internationally;
  • Engaged with Universities UK on cyber security issues, resulting in UCISA material being included in a Universities UK briefing;
  • Published the third edition of the Model Regulations for the use of institutional IT facilities and systems (the new edition takes into account the increased use of personal devices to access institutional facilities and growth in the use of social networks);
  • Represented the community in discussions with the leading plagiarism service provider on improving performance;
  • Published revised guidelines to assist institutions with responses to standard copyright infringement notices;
  • Introduced new event formats including webinars and facilitated workshops;
  • Taken part in the review of the HESA Financial return at both a strategic and operational level;
  • Represented the community on the HEDIIP Advisory Panel;
  • Engaged with Jisc to provide input to the co-design process, represent the community on steering groups and advisory boards, and act as a customer representative;
  • Provided bursaries to allow individuals to attend overseas and specialist conferences and highlight and communicate the best practice identified and lessons learned to the UCISA community;
  • Engaged with Scottish and Welsh IT Directors through their forums and instigated work to explore regional engagement in England;
  • Carried out the inaugural Digital Capabilities survey and published the initial findings;
  • Published two sets of case studies, on the challenges of digital skills training and on the role of mobile in technology enhanced learning;
  • Continued to represent our members on the UCAS Council and HESA User Group;
  • Published the Major Projects Governance Assessment Toolkit, to encourage a more rigorous approach to governance and project management, and the Effective Risk Management best practice guide;
  • Provided input to the Efficiency Exchange;
  • Carried out a benchmarking survey with commercial partners on service desks in the sector and published the results;
  • Continued to work with the Leadership Foundation for Higher Education on leadership development and improving management skills;
  • Worked with Universities UK and overseas organisations on benchmarking initiatives;
  • Strengthened our relationships with overseas organisations through our membership of CHEITA (the Coalition of HE IT Associations), hosting a seminar focused on the support of research before the UCISA14 Conference;
  • Continued to foster strong relationships with suppliers to the sector, briefing them on trends in the sector to aid their understanding and assist them to develop their marketing, and growing the corporate membership from 105 in 2013 to 130 in 2014.

The list above highlights just some of the work that UCISA has carried out on behalf of our members. A more formal annual report will be published in the New Year and presented at the Association’s AGM at the UCISA15 Conference in Edinburgh on 27 March.

We are now investing more in external resources to help deliver our more substantive projects and the outputs from a number of these will be published early in the New Year, including the Information Security Management Toolkit, the Social Media Toolkit and the fourth edition of the Exploiting and Protecting the Network guide.

I should like to take this opportunity to remind you that bookings are open for the UCISA15 Conference in March. We are planning a second international seminar for the Tuesday immediately before the conference – details will be available early in the New Year. Bookings are also open for three other events taking place in January and February. I would also encourage you to showcase the excellent work in our institutions by submitting an entry for the UCISA Award for Excellence.

Finally, thank you for your support in 2014. I wish you, on behalf of the UCISA staff, all the best for Christmas and the New Year.

Peter Tinson
19 December 2014

Benchmarking sans frontieres

UCISA is a member of CHEITA, the Coalition of Higher Education IT Associations. CHEITA exists to share best practice globally and, although the education systems vary greatly from country to country, we are all tackling much the same issues. As I noted in an earlier blog post, the group have been looking at whether there is scope for benchmarking internationally.

One of the challenges of benchmarking has always been ensuring that comparisons are being made between similar institutions. The difficulty has been to determine which institutions are similar enough for relevant comparisons to be made – there can be significant variation even within mission groups. CAUDIT, our Australasian sister organisation, has sought to address this by developing a complexity index. Their initial model used student FTE, staff FTE, research income and the number of campuses as the factors in determining how complex an institution was. A complexity score was calculated from these data and plotted against institutional IT spend. This revealed a strong, near-linear correlation between the complexity score and IT spend – it was possible to derive a best-fit straight line through the data points. There were some outliers but, on further investigation, most of these had errors in their data – once these had been resolved they moved closer to the IT spend predicted by the best-fit line.

The CHEITA benchmarking group adopted a similar model, using a weighted calculation based on staff and student FTE numbers and research income. The results for each country were the same – there was a strong correlation between the complexity score and institutional IT spend. The graph for the UK data from 2012 is shown here alongside data for the other countries taking part in the exercise (the Y axis, institutional IT spend, ranges from nil to $250 million – acknowledgements to Leah Lang from EDUCAUSE for producing the graph). Apart from two or three outliers, it is possible to achieve quite a close fit and we will be looking to see whether we can use the index with our HEITS figures to aid comparison.
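To make the mechanics concrete, here is a minimal sketch in Python of how such an index works: a weighted score computed from institutional factors, a least-squares best-fit line against IT spend, and residuals used to flag outliers for a data-quality check. The weights and institutional figures are invented placeholders – the actual CAUDIT/CHEITA weightings are not reproduced here.

```python
import numpy as np

# Hypothetical institutional data: student FTE, staff FTE, research
# income (£m) and annual IT spend (£m). All values invented.
data = np.array([
    [12000, 2500,  40,  8.1],
    [25000, 5200, 150, 18.9],
    [ 8000, 1500,  10,  4.2],
    [30000, 6500, 220, 25.3],
    [18000, 3800,  90, 12.6],
])
factors, it_spend = data[:, :3], data[:, 3]

# Normalise each factor to [0, 1] so units don't dominate, then apply
# weights. These weights are placeholders, not the real CAUDIT values.
weights = np.array([0.4, 0.3, 0.3])
complexity = (factors / factors.max(axis=0)) @ weights

# Least-squares best-fit line: complexity score vs institutional IT spend.
slope, intercept = np.polyfit(complexity, it_spend, 1)
residuals = it_spend - (slope * complexity + intercept)

# Large residuals are candidates for a data-quality check, which is how
# the CAUDIT exercise resolved most of its outliers.
for score, spend, resid in zip(complexity, it_spend, residuals):
    flag = "  <- check data" if abs(resid) > 1.5 * residuals.std() else ""
    print(f"complexity {score:.2f}: IT spend £{spend:.1f}m ({resid:+.1f}){flag}")
```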

Graph showing UK HEIs against international comparators

In order to compare across borders, the financial components need adjusting. Exchange rates are not suitable, particularly when applied to the euro, where the value of the currency varies across the Eurozone. Purchasing power parity allows for such variation. By applying purchasing power parity to the research income and IT spend, it is possible to make international comparisons. There is still some work to be carried out but it is hoped that the initial findings will be published and that this will instigate discussions between institutions in different countries.
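As an illustration of the PPP step, the sketch below converts local-currency figures into PPP-adjusted ‘international dollars’ before comparison. The conversion factors are invented placeholders; real factors would come from published tables such as those of the OECD or the World Bank.

```python
# Invented PPP conversion factors: local currency units per
# international dollar. Real factors come from OECD/World Bank tables.
PPP_PER_INTL_DOLLAR = {"GBP": 0.70, "EUR": 0.80, "AUD": 1.45}

def to_international_dollars(amount, currency):
    """Convert a local-currency amount into PPP-adjusted international dollars."""
    return amount / PPP_PER_INTL_DOLLAR[currency]

# A UK institution's IT spend and research income, expressed in
# international dollars for cross-border comparison.
print(to_international_dollars(15_000_000, "GBP"))   # IT spend
print(to_international_dollars(120_000_000, "GBP"))  # research income
```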

Benchmark to improve

UCISA has run the HEITS exercise to collect benchmark statistics for seventeen years. During that time, members have used the data to assist in making business cases for funding, for quality assurance purposes and for comparing themselves with their peers. I attended a workshop run by EUNIS’s BENCHEIT working group last week partly to hear what others were doing in the way of benchmarking and partly to see if there were any lessons that we could learn from our peers (and thirdly to promote the results of the UCISA Digital Capabilities survey).

The Finns compiled their statistics by carrying out an in-depth analysis of the costs of services. This is similar to the approach adopted by the Jisc Financial X-ray – although it takes time to produce the data, particularly when considering the apportionment of procurement items and staff costs, it does lead to detailed costs. It also permits quite detailed comparison between institutions: individual institutions can pick out areas where their costs are very different (higher or lower) and can then ask questions of the other participants to establish the reasons for the variation.
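To make the apportionment mechanics concrete, here is a minimal sketch of service costing by apportionment. The services, salaries and time fractions are all invented; the point is only how staff and procurement costs are spread across a service catalogue.

```python
# A minimal sketch of service costing by apportionment, in the spirit of
# the Finnish / Financial X-ray approach. All figures are invented.
staff_costs = {"network engineer": 45_000, "service desk analyst": 32_000}

# Fraction of each person's time attributed to each service (sums to 1).
time_allocation = {
    "network engineer":     {"email": 0.3, "vle": 0.5, "service desk": 0.2},
    "service desk analyst": {"email": 0.4, "service desk": 0.6},
}

# Procurement items attributed directly to a service.
procurement = {"email": 20_000, "vle": 55_000}

service_costs = {}
for person, salary in staff_costs.items():
    for service, fraction in time_allocation[person].items():
        service_costs[service] = service_costs.get(service, 0) + salary * fraction
for service, cost in procurement.items():
    service_costs[service] = service_costs.get(service, 0) + cost

for service, cost in sorted(service_costs.items()):
    print(f"{service}: £{cost:,.0f}")
```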

The Dutch approach was similar but they also used the statistics strategically within the individual institutions. Whilst they also identified the exceptional costs and sought to identify the reasons behind variations, they used the statistics to demonstrate value internally (“the IT infrastructure is only costing x% of the student fee”) and to baseline costs in order to highlight the impact of projects. In both the Finnish and Dutch cases, the statistics prompted an open discussion on the costs of contracts and where there were significant variations they were cited in talks with suppliers in order to bring costs down. There seemed to be far more openness with regard to commercial contracts than appears to be the case in the UK – perhaps this is something we need to address?

Whilst the Dutch and Finns largely concentrated on the costs of services, the Spanish adopted a more holistic approach. They too were carrying out cost comparisons, but this was being done within an overall framework that assessed the maturity of IT governance and management in the institution. A catalogue of principles, broken down into objectives, each with quantifiable indicators and variables, was used as the basis for the study. Each indicator and variable is fully defined to avoid any ambiguity. The results were then passed back to the institutions showing their position for each indicator relative to their peers.

The one message that emerged from the workshop is that it is important not to take raw cost figures as the basis for comparison. There are many reasons for differences in costs – the size of the institution and its mission will be contributing factors, and the CHEITA group have been looking at using these to facilitate international comparisons (more in a later post). Other factors include the quality of the service being provided and institutional drivers – higher costs may be a result of investment in any given year. It is important to have a dialogue in order to understand the context and the underlying reasons for any variation. It is a message that I continue to promote in the UUK benchmarking initiatives: the figures alone do not give the full picture – you need to understand the institutional drivers and the value of that spend in order to make a genuine comparison.

Cloud for US HEIs

Caleb Racey
Systems Architecture Manager
Newcastle University
Member of UCISA EACP

In this blog post I’ll share some of the take home messages from the cloud sessions I attended at EDUCAUSE.

There was a wealth of cloud presentations at EDUCAUSE, with several time slots involving a choice between competing cloud sessions. The American universities seem to have largely got beyond the question of “should we use the cloud?” and are now concentrating on practical questions like “how do we effectively use the cloud?”. The services used as examples are no longer the low-hanging fruit of student email and have moved up the chain to a wide breadth of paid-for services (e.g. CRM, developer services, travel). The impression I got was that the States has a 2-3 year lead over the UK HE community in deploying cloud-based services. The message of many of the presentations was that the advent of cloud is transformational: it is driving levels of change in university IT on a par with two historical seismic shifts, the transition from mainframes to PCs and the advent of the internet.

Leveraging collective buying power

Part of the reason for the US HE lead in the cloud appears to be the availability of the Internet2 Net+ cloud procurement model. This enables the community to purchase cloud services as a group and use that leverage to get the services and their contracts shaped to HE requirements. Buying as a community allows them to apply best buying practice to the cloud, such as assessing the security of a cloud service via the Cloud Security Alliance’s 136-point assessment matrix, or ensuring contracts have reasonable terms (e.g. liability cannot exceed purchased value). They also have a mature attitude to the cloud service lifecycle, with well-defined onboarding and sunsetting processes.

While Net+ looks to be a valuable resource, many seemed to be buying services on their own. Salesforce in particular seems to have real traction; Cornell reported that when they mined their payment card data, Salesforce was being used across the organisation without IT services’ involvement. This was a common theme – cloud sells direct to the end user without reference to IT departments. Indeed, cloud sales activity is often targeted at academics and administrators rather than IT. The need for IT departments to adapt to this change is clear. With IT now an intrinsic part of the majority of business activities, it is inevitable that control of IT is increasingly going to reside in those business areas. As a provider of enterprise IT, the following Oliver Marks quote may be uncomfortable, but that makes it no less true: “cloud companies are cost-effectively emancipating enterprises from the tyranny of IT, solving lots of problems with tools that are a pleasure to use”.

Several of the presentations touched on the impact of the cloud on the IT department. All presenters said that it had not reduced head count. Cloud approaches increase the velocity of change and the depth of IT penetration into business processes; the increased efficiency of cloud is counteracted by much greater demand and a resulting greater volume of activity. This is a common story when an area converts from bespoke to utility provision: the conversion of electricity from self-generation to utility provision saw massive growth in the demand for electricity and electricians. Penn State (Pennsylvania State University) said that cloud for them was “about delivering more capability and capacity to their customers”. For Cornell it was driven by a need to shrink a massive $27 million application budget to something more manageable. The skill sets required in the IT department change with the cloud: service management, facilitation, and contract and relationship management all require development.

Addressing security concerns

The real value of EDUCAUSE for me is in hearing the experience of peers who have led the community in a particular area. This was particularly true for cloud: it is an area awash with myth, marketing, fear, uncertainty, doubt and undoubted risk, and hearing real experience helps to focus concerns on the real issues. When I speak to colleagues in my university about cloud, security is one of the first topics raised. While this concern was mentioned in the presentations, it seemed to be largely a resolved issue. There was a great quote in the Notre Dame session: “90% of HE already uses a SaaS solution for credit card payments and has for a long time. Once you have put your highest risk assets (your customer’s financial details) into the cloud then most other data security concerns are minor”. Notre Dame’s session was a particular eye opener: they already have 130+ SaaS services deployed on campus and aim to be 80% cloud based within a couple of years. The speed with which they are moving to cloud is astonishing.

Lessons for UK HE  

Reflecting on the conference, I’m increasingly convinced that UK HE needs to embrace the full breadth of cloud services. Cloud is here to stay, it will continue to grow, and those harnessing it will lead the sector in IT delivery. A service similar to Net+ would be a major bonus to the community. Having worked with the Janet Amazon AWS pilot I can see the beginnings of a Net+ style service. The Janet cloud service is good; however, I can’t see it having the breadth or pace of the Net+ initiative.

The government’s G-Cloud framework has breadth and pace in spades; it easily outstrips Net+ in terms of size (13,000 services) and velocity. G-Cloud is usable from a university context, with individual universities named in the list of eligible organisations. However, its terms and services aren’t tailored to university requirements. Having investigated G-Cloud as a route to procuring two potential services, this has proved to be a problem. In one instance the length of contract was an issue: the up-front capital funding of some research projects means that the two-year cut-off of a G-Cloud service makes it unsuitable. In another instance the flat, non-negotiable fixed-price nature of G-Cloud agreements meant it was more expensive than we could have negotiated directly with academic discount.

The way forward to me seems to be a multi-tiered approach: call off Janet cloud services when available, use G-Cloud services where suitable, work with the Government Digital Service to influence G-Cloud to include HE-specific considerations, and finally procure cloud services directly when the previous approaches are exhausted. However, cloud is not an area that easily lends itself to tender agreements. Procuring cloud on an individual basis will see each organisation invest considerable effort in answering the same concerns that many of their peers are also having to address. While the US is leading the HE community in harnessing the transformational potential of cloud services, I can see real potential for the UK to pick up the pace and take the lead. A sensible combination of tailored Janet cloud services and G-Cloud’s velocity and drive could act as a transformation enabler in the sector.

Cal Racey

SDN, Open Daylight and KUMO cloud storage

Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG

2014 Technology Exchange – Day 4: Software Defined Networking (SDN)

I started the conference this morning with the goal of understanding more about Software Defined Networking (SDN); it has been a bit of a buzzword in the industry, but I had personally not seen the deployments or use cases, which were to be covered in the first session this morning.

It was great to see a number of Research and Education Networks talking about their use or proposed use of SDN: GRNET, Poznan (PSNC), RNP, Internet2, CERN and NICT. In brief, SDN is an abstraction of the control plane from the data plane of networking devices, splitting the system that makes the routing and security decisions about traffic from the elements that simply forward packets. The primary goals are to provide agility, reduce complexity and prevent vendor lock-in.

Milosz from PSNC discussed a number of the EU use cases they were working on. Interesting to see a lot of demand for agility to support remote (follow the moon) datacentres, cloud bursting and collaboration. Whilst OpenFlow is a part of SDN, it isn’t one and the same thing; they are focusing development on using version 1.3+. With the launch of the new Jisc national datacentre, it will be interesting to see Janet’s strategy for providing the network capabilities to exploit this resource.

Open Daylight

Continuing with the SDN theme, Cisco and Brocade hosted a session on the Open Daylight open source consortium.

There were some good dashboard examples of dynamic WAN link optimisation. One example was dynamic configuration of the network, driven by the university calendar, to provide bandwidth and cloud datacentre connectivity matched to the peaks of the academic year.

Another use for this technology would be supporting the requirements of university and conference events based on their booking data, without permanently over-provisioning all year. Otherwise, is there a key driver for SDN on the LAN?

It was interesting to see that the majority of the room had already implemented SDN (circa 50) or were planning to in the next 12 months. About half of the room were already using Open Daylight on their network.

Colleagues in the room were really interested in the use of RESTful APIs to control SDN and it was promising to see the integration possible with the software.
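To give a flavour of what driving a controller over REST looks like, here is a minimal sketch that queries a controller’s inventory. The endpoint path, port and credentials are assumptions based on a typical OpenDaylight test installation (RESTCONF on port 8181 with the default admin account) and should be checked against your own deployment’s documentation.

```python
import requests

# Assumed controller location and default credentials - adjust for your
# own deployment.
CONTROLLER = "http://opendaylight.example.ac.uk:8181"
AUTH = ("admin", "admin")

# Query the operational inventory: the switches the controller can see.
resp = requests.get(
    f"{CONTROLLER}/restconf/operational/opendaylight-inventory:nodes",
    auth=AUTH,
    headers={"Accept": "application/json"},
    timeout=10,
)
resp.raise_for_status()

for node in resp.json().get("nodes", {}).get("node", []):
    print(node["id"])
```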

Cisco explained that they are absolutely committed to Open Daylight; they believe it is the future and they are providing the code to back that up. Both Cisco and Brocade believe Open Daylight is the controller of the future. There was an interesting discussion about the changing supplier dynamics with the adoption of this technology, including a validation programme for the use of Open Daylight on other vendors’ products. Hear more about SDN and Open Daylight at our UCISA Network Futures event in January 2015.

KUMO Cloud Storage

I was interested in how Indiana University had deployed access to Google Drive and OneDrive, amongst other storage services, in their computer lab environment. It was something that we had been trialling at Loughborough University.

IU’s primary driver was providing storage in all of the VDI environments they had. A client presents all of the associated storage as mapped drives without any local synchronisation requirements. On the roadmap are a Mac OS X client, multiple accounts per vendor and a new broadcast feature.

John explained how a lot of academics struggled to get the correct files to students and to have the data delivered into their Box accounts. Broadcast integrates with the student module lists and automatically populates their drives with the correct files.

See https://cloudstorage.iu.edu/partner or contact jhoerr@iu.edu.

All students at IU get 100GB of Box storage through the Internet2 agreement!

http://www.internet2.edu/products-services/cloud-services-applications/box

It looks like Janet has an agreement with Box, along with Microsoft, Q Associates and Capito on their File Sync and Share DPS. It will be interesting to hear what these services are and more details about the agreement: https://community.ja.net/system/files/6989/List%20of%20Suppliers_0.pdf

What we were investigating at Loughborough University: http://blog.lboro.ac.uk/middleware/blog/apps-for-education/apps-for-education

We are certainly going to get in touch; this looks like exactly what we need here at Loughborough. At the moment they are planning to sell it at $0.33 per student user, with staff free. Does this mean the end of providing local student filestore?

Conclusions

It was a very interesting conference and generated some probing questions, which I’ve been able to share with you through the UCISA blog. As it was the first time Internet2 and ESnet have run this conference, there were a few organisational teething troubles, which you don’t get with well-established events like TNC and Janet Networkshop.

In my personal opinion there is quite a bit of development at US-based academic institutions of which I was previously unaware. It may be that I wasn’t looking in the right places, that colleagues in the community were not sharing this information in the UK, or that there is a genuine communications gap – one which, hopefully, the UCISA bursary scheme has started to fill.

Learning Points

  • The use cases for SDN are growing; however, is there a compelling driver at the moment?
  • I personally see the future of SDN being integrated into cloud orchestration for the technical brokerage to cloud data centres.
  • Ensure you keep a watching brief on SDN and Open Daylight.
  • What is the best way to exploit free cloud services like Google Drive etc?
  • How do we proactively find out about neat solutions like Kumo?

I hope that there is something in the blog posts over the last four days that is useful to you. Please feel free to get in touch if you have any questions and I will be writing a management-briefing summary from the event learning points. I’m heading back home via Chicago, so time to hit the road and I’ll see you back in the UK.

ESnet, the Energy Sciences Network

Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG

2014 Technology Exchange – Day 3

One feature of conferences outside the UK, and especially prevalent in the USA, is early-morning sessions. On day three it was time to hit the 07:15 working group/camp/BoF sessions.

Unfortunately the ‘Campus Cloud Architects BoF’ was cancelled, which was really disappointing and not a good start, as I had hoped to explore in person some of the latest concerns, trends and experiences in this area.

Industry groups have been reporting more and more interest in Cloud brokerage solutions and some companies are now recruiting for cloud broker and cloud architect roles. As cloud technology gets more mature, there is an opportunity to start brokering for the best possible service and cost for your organisation. In the sector we have seen an excellent start in this area by Janet with their agreements with Microsoft and Google for their email/applications suite.

There is a lot of development scope in this area with Microsoft Azure, AWS etc and I’m interested to explore the strategy required to position infrastructure, automation and standards to take best advantage of the emerging competition.

Perhaps this area is something that colleagues in the UCISA IG may be interested in picking up in the future?

I took advantage of the programme changes to share more details about the current UCISA activity in the ad-hoc groups using a short five-slide presentation covering these pieces of work:

• A guide to the implementation of an Information Security Management System (ISMS), launching in early 2015

• An update to the popular  ‘Exploiting and Protecting the Network’ document, launching in early 2015

• The Major Project Governance Assessment Toolkit

• UCISA Report 2013 – Strategic Challenges for IT Services.

There was a lot of interest in these areas and I had a couple of questions about integrating the planning, effort and joint working of UCISA and EDUCAUSE where there are clear overlaps and topics of interest.

The Energy Sciences Network (ESnet) are also interested in contributing to the Network Performance and QoS ‘community of practice workshop’ which the UCISA Networking Group are planning in January 2015 (more details coming to the UCISA-NG mailing list soon).

Data Intensive Science

As this is an area where I have little experience, I was interested in listening to what William Johnston from ESnet had to say about large-scale data intensive science. He started by explaining his view that high energy physics serves as a prototype platform for distributed collaboration in other science fields.

He explained that as instruments get bigger, they get more expensive (in a relationship not quite as dramatic as Moore’s Law); therefore there are fewer of them, which results in an increase in collaboration globally. This shows the potential future growth of research networking bandwidth requirements.

One of the things I didn’t realise was that ESnet have extended their 100Gb full network backbone across the Atlantic into Europe, including connections in London. Their first circuit is being tested today. What does this mean for science and research in the UK?

Further details are available at:
http://es.net/news-and-publications/esnet-news/2014/esnet-extends-100g-connectivity-across-atlantic
http://www.geant.net/MediaCentreEvents/news/Pages/three-high-speed-links.aspx

William went on to talk about monitoring the network, explaining the criticality of this area. With many Infrastructure as a Service (IaaS) offerings, researchers are requesting Network as a Service, and with it the same levels of assurance and guarantees that have previously only been available with point-to-point links; is this going to change?

As one would expect, ESnet use perfSONAR for their assurance measurements. As I mentioned earlier, we will hopefully have representatives from ESnet and eduPERT at our Network Performance and QoS ‘community of practice workshop’ in January 2015.

Would something like perfSONAR deployed across Janet be of benefit to the community? Let us know your thoughts in the blog feedback section below. I would assume it requires volunteer sites; however, Janet are already looking at the possibility of network-based probes for eduroam, so perhaps there is scope for a next generation of Netsight with added assurance?

ESnet also use the weathermap tool, which is loved by colleagues at Loughborough University; it was one of the best takeaway messages from a Janet Networkshop lightning talk several years ago.

The remainder of the talk was about data transfer speeds and integrity. I was surprised to hear the comment “SCP is your enemy”. Surely not? However, I was approaching the problem from the wrong angle, thinking about security rather than data transfer speeds and parallelisation. Look at some of the figures in the photograph below.

[Photo: data transfer speed figures from the session]

William discussed a number of tools, including GridFTP and a development from Caltech which stripes data across discs as part of the FTP process, as well as performing CRC checking up to three times.

Interestingly, the last point was about data integrity, which is critical for the field of data intensive science. William referenced the paper by Stone and Partridge (2000), “When the CRC and TCP Checksum Disagree”.
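The practical consequence is that serious transfer tools verify an end-to-end checksum of the data itself rather than trusting the transport. A minimal sketch of that idea:

```python
import zlib

def crc32_of_file(path, chunk_size=1 << 20):
    """Stream a file through CRC-32 without loading it all into memory."""
    crc = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

# After a transfer, the sender's and receiver's values must match:
# if crc32_of_file("received.dat") != expected_crc:
#     raise IOError("integrity check failed - retransfer required")
```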

During the break, I had a bit of a Google to find any UK user or interest groups for research computing and HPC. I found the HPC SIG; if you know of any others, please pop them in the blog comments to share.

Connecting 40Gb Hosts

Whilst in the ‘big data’ mindset, there was an interesting session where colleagues from Fermilab, ESnet and Caltech shared best practice infrastructure configuration to support high-speed data transfer.

There was some very interesting visual modelling, which demonstrated the affinity the network card has with a particular processor socket and core. The difference made by optimising for this affinity is significant: 37Gbps vs 26Gbps maximum on a 40Gbps link.
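For anyone wanting to experiment, here is a minimal, Linux-only sketch of one plausible element of that affinity tuning: looking up which NUMA node a NIC sits on via sysfs and pinning the transfer process to the CPUs local to that node. The interface name is illustrative, and this is a simplification of what the session demonstrated.

```python
import os

def parse_cpulist(text):
    """Parse a sysfs cpulist such as '0-7,16-23' into a set of CPU ids."""
    cpus = set()
    for part in text.strip().split(","):
        lo, _, hi = part.partition("-")
        cpus.update(range(int(lo), int(hi or lo) + 1))
    return cpus

NIC = "eth0"  # illustrative interface name

# Which NUMA node is the NIC attached to? (-1 means no NUMA information,
# so fall back to node 0.)
with open(f"/sys/class/net/{NIC}/device/numa_node") as f:
    numa_node = max(int(f.read()), 0)

# CPUs local to that node, e.g. /sys/devices/system/node/node0/cpulist
with open(f"/sys/devices/system/node/node{numa_node}/cpulist") as f:
    local_cpus = parse_cpulist(f.read())

# Pin this process (pid 0 = self) to the NIC-local cores before transferring.
os.sched_setaffinity(0, local_cpus)
print(f"{NIC} is on NUMA node {numa_node}; pinned to CPUs {sorted(local_cpus)}")
```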

It was a packed session with many colleagues standing at the back; there is certainly an art to tweaking infrastructure to perform in the best possible manner. It was also interesting to hear that there are three 100Gb network cards in development and testing.

Pushing the Boundaries of the Traditional Classroom

There was a bit of a clash in the programme, so I didn’t get to spend a lot of time in this session, but it was interesting to see what Indiana University had done with their ‘Collaboration Café’.

It led me to wonder what the key limitation on adopting more of these learner-centric classroom designs is. Is it financial, or is it resistance from academic colleagues, in the same way as there was (and is) resistance to lecture capture and VLE environments?

UCISA are working with SCHOMS and AUDE on an update to learning space design principles. This document should be really useful, especially as the final point from the presentation was all about the removal of wires.

At Loughborough we are trialling the Epson projector series that uses the Epson EasyMP software and iProjection app. What wireless projectors and/or screens are you using? Let us know in the blog feedback section below.

Other Thoughts

The other talks I attended through the day continued the research and big data theme, including hearing about the petabytes (PB) of data required by some of the medical research being undertaken as part of the ICTBioMed platform. One of the speakers commented that biology is becoming more like computer science by the day, confirming again that multidisciplinary research is a firm requirement for a lot of modern applied research.

Some examples of digital biology given were DNA sequencing, gene expression analysis, protein profiling and protein–protein interactions.

A number of the speakers came in via videoconference; it was interesting to see the mix of success and failure of this bold move. It seems strange that, with the technology we have at our disposal in 2014, we still struggle to co-ordinate a remote video connection.

Another speaker also made reference to the worldwide nature of large research groups and collaborations and said this collaboration technology was essential.

Video Collaboration

For the final session of the day, I was interested to see what the future held for video-based collaboration in a session with speakers from Internet2, Pexip, Evogh, Blue Jeans and Vidyo. I didn’t manage to ask Robb from Blue Jeans more about the removal of the Skype interface API, which was so disappointing; however, during the panel he mentioned that they have a Google Hangouts bridge to standards-based systems available.

There were some interesting remarks from Hakon Dahle, CTO at Pexip in Oslo (and previously CTO at Tandberg and Cisco).

Hakon described their distributed architecture, where it is possible to start small and grow appropriately with options to add capacity on demand in an agile manner.

Latency is still an issue with global video conferencing, and there was a panel debate about the trade-off of transcoding: increased latency versus accessibility and interoperability.

“Transcoding is a necessary evil” – especially with new protocols like WebRTC.

There were very positive comments about WebRTC and how it will make video more accessible and face-to-face communications easier; however, there is already a divide, with Google’s VP9 codec being favoured by some players in the market, especially when delivering very high resolution 4K streams.

Hakon explained that WebRTC seems the most promising technology for direct person-to-person video calls and will bring about a lot of new use cases; that new use case element is, for him, the most exciting in terms of innovation.

Learning Points

• How do we best position our infrastructure to take advantage of emerging Cloud competition?
• How do we collaborate more with colleagues from Internet2, ESnet and EDUCAUSE? Is this something UCISA and Janet/Jisc can facilitate?
• Future growth potential of research data transfer requirements
• Are we best serving our research communities, what more can we do?
• Global nature of research and therefore the communication requirements.

Matt Cook

Cyber security – top table interest

The risk cyber crime presents to the higher education sector was highlighted to Vice-Chancellors at the Universities UK Conference in 2012. Since then, there have been a series of round table discussions which have looked at the ability of the UK higher education sector to respond to cyber crime attacks. I attended the most recent of these which focused on the outcomes of a self-assessment exercise UUK promoted earlier in the year.

Those institutions that had completed the exercise will receive individual reports in the near future and a briefing will be circulated to Vice-Chancellors reflecting on the exercise. The briefing will include an additional report giving details of a number of UCISA resources that support institutions in their cyber security initiatives. The detailed results of the exercise are embargoed until the institutions have received their individual reports but, although it is clear that there is work to be done, there are some encouraging signs that cyber security is being taken seriously at a senior level within many institutions.

There are a number of factors that support this assessment. Firstly, over sixty institutions took part in the exercise. In addition, I am aware of a number of others that did not take part because they had already carried out similar work, either utilising already published controls (such as the CPNI’s twenty controls for cyber defence) or by engaging external consultants.

Secondly, there was a good level of interest shown in security and risk related topics by delegates at the Universities UK Conference this year. UCISA exhibits at the Conference to promote our resources and activities. Two publications that drew particular interest were the revised Model Regulations for the use of institutional IT systems and the Information Security Toolkit. Effective information security is underpinned by effective regulations, and the Model Regulations give institutions a template to utilise locally. The current version of the Information Security Toolkit provides specimen policies for institutions to adapt. The delegates were also interested in the Major Projects Governance Assessment Toolkit – effective governance reduces the risk of projects failing to deliver their anticipated benefits, or having major cost or time overruns.

So there are positive signs that risk and cyber security are being taken seriously. Care is needed, though, that cyber security is not seen as just an IT problem – people and processes are also important components in implementing effective information security measures. This is something that will be highlighted in the revised Information Security Toolkit – there is a need for senior management ownership and good governance in order for information security to be successfully managed. We also need to guard against IT only featuring at the top table for ‘problem’ issues – we need to work to ensure that the role IT can play in enhancing the student experience and delivering efficiencies is also understood by senior institutional managers.

Postscript – work is currently in progress on a revision of the Information Security Toolkit. It is anticipated that the new version will be launched at the UCISA15 Conference in March 2015.

2014 Technology Exchange – Day 2 by Matt Cook

Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG

FIRST Robotics Competition

The Monday evening welcome reception included a contest based on robots developed by high school students. The students were given six weeks to raise funds, design, develop, brand and program a robot – not an easy task! It was great to see such innovation from our students and colleagues of the future. I wish we had had these opportunities back when I was at school; the best we experienced was BigTrak and writing Logo programs. At least, however, we were taught programming in BBC BASIC, and not simply how to use the Microsoft Office suite.

The USA is promoting Science, Technology, Engineering and Mathematics (STEM) subjects in a similar manner to the UK. It will be interesting to see how successful this initiative is in providing the education required for the colleagues of the future and in plugging the current skills gap. Talking to the students, they are extremely enthused about the creator, maker and hacker opportunities these programmes give them.

This is another one of those opportunities that demonstrates the value of the jobs we perform in our respective organisations to support education. I recently undertook a job shadow of a technician in one of our academic schools at Loughborough, and it was one of the most eye-opening experiences I had all year. It was extremely valuable to see the challenges they face within the school, how central IT policy affects their work, and the innovation and creative ideas being developed by their students. I would certainly encourage everyone to get out into the wider university more to put everything into perspective.

Central IT vs Research Perspective on Information Security

There was a very interesting panel discussion mid-way through the Tuesday schedule investigating the challenges faced by both the central IT function and research centres in managing information security. Rob Stanfield from Purdue University provided an overview of the provision at his organisation, and one thing that stood out was the scale of some of the US-based education organisations, which dwarf most of the largest UK universities. The scale of operation also brought increased scale of staffing and, as a coffee break discussion revealed, of budget too. Purdue are currently recruiting a Security Risk Analyst and see an important element of their future service as being better placed to advise on information security impact across their business.

There is a growing move to work with researchers to define strategy that allows information security to be an enabler and an active component in winning research grants. The panel all agreed that there was a need to form better relationships between research and central IT; something that I’ll personally be working on at Loughborough University over the coming years. There was agreement that the era of siloed departmental research email servers and wireless networks was not effective and that the future is centralisation and collaboration. Closing comments focused on “…there is nothing like a good data breach to bring about change!” and “…some people are more concerned with IDS appliances than the balance of risk.”

Over coffee a number of people who attended the session were interested in the current UCISA activities to develop an Information Security Management System (ISMS) implementation guide and the update to the popular ‘Exploiting and Protecting the Network’ document, both set to launch in early 2015. Keep an eye on the UCISA website for more information!

As suggested, I will be posting details about these activities to the EDUCAUSE Security Constituent Group mailing list as well. This list may also be of interest to UK colleagues who are looking to get a wider perspective on Information Security concerns within global education organisations. Whilst the remit for security falls between both the Network (NG) and Infrastructure (IG) groups within UCISA, some readers of the blog may not be aware of the UCISA-IS Information Security mailing list. Although currently low traffic, it is a growing area of discussion.

For those with larger security teams, it may also be of interest to explore the TERENA TF-CSIRT group.

Privacy in Access and Identity Management

Dr Rhys Smith (Janet) delivered the final session I attended on Tuesday. I’ve not personally been involved in the Access and Identity Management (AIM) side of IT at Loughborough; however, I was eager to see what was on the horizon for Moonshot, especially what it can offer the research community. It was nice to see some friendly faces when I arrived at the conference: Rhys Smith, John Chapman and Rob Evans from Janet, and Nicole Harris from TERENA. I’ve also since met quite a few people I’ve spoken to by email before or have seen posting on mailing lists.

Rhys gave a gentle introduction to AIM before describing how we should be adopting privacy by design, as it is so difficult to retrofit. As part of a privacy vs utility discussion, Rhys gave the example that routing IP network packets outside of the EU breaks EU data protection guidelines, as an IP address is deemed to contain personally identifiable information. Whilst such a rule is simply unworkable, the categorisation of IP addresses in this way has had some interesting consequences for our Computer Science researchers.

Following a narrative on the difference between web-based federation (SAML) and network-based federations (like eduroam), Rhys outlined the timescales for the Moonshot trial and official service. Being able to unify many technologies, from simple SSH through to Windows desktop authentication, opens many possibilities for secure research collaboration in the future.

Other Thoughts

There were lots of interesting conversations through the conference today about the development of common shared tools or building blocks to solve future challenges, from the infrastructure that supports eduroam through to the Kuali HE software suite. Many felt that through collaboration a better solution can be developed with less resource; however, there were concerns that high workloads in recent years had removed a lot of these opportunities for some.

Another common theme was the adoption of standards, rather than closed proprietary technology, avoiding vendor lock-in where possible and using the infrastructure as a live research aid for students within our organisations.

Learning Points

• Get out into the wider university to put your role into perspective;
• Turn Information Security policy and strategy into an enabler that wins research grants;
• Seek collaboration and closer relationships with our research community;
• Explore opportunities for privacy by design;
• Keep a watching brief on Janet Moonshot developments;
• Support the development of common shared tools and building blocks where appropriate.

Matt Cook

APIs, architecture and the Narwhal

Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG

2014 Technology Exchange – Day 1

Courtesy of the UCISA 21st anniversary bursary scheme, I am in Indianapolis, USA this week for the inaugural Technology Exchange conference hosted jointly by Internet2 and ESnet. Internet2 is the USA equivalent of the Janet National Research and Education Network (NREN) in the UK. ESnet provides specific high bandwidth connections to Energy Science research organisations across the USA and beyond.

If you have never been to a conference in the USA before, I’d certainly recommend taking the opportunity to experience a different scale of event. I’ve spoken at VMworld in the USA, where over 7,000 delegates attended a conference orchestrated more like a music concert or sporting event; I was pleasantly surprised to experience a more personal 750 delegates at the first Technology Exchange. The same networking opportunities are provided, with mini sessions starting at breakfast, multiple mini working-group ‘camps’, Birds of a Feather (BoF) sessions, and both leadership and technical streams.

There are four main topics covered within the conference:

  • Security;
  • Trust, Identity and Middleware Applications;
  • Cloud Services; and
  • Advanced Networking/Joint Technologies.

As this is an inaugural event, I’m interested to see how it positions itself alongside the Internet2 Global Summit, TNC and Janet Networkshop. I really value colleagues in the community who dedicate time to blogging thoughts from the events they attend. Collectively this provides a rich resource, and I’m pleased to be contributing to it through the UCISA blog over the next four days.

Opening Thoughts

The opening keynote was delivered by Harper Reed, who would not look out of place in one of the hipster cafes of the Wicker Park area of Chicago. This is by no means a coincidence, as one of his roles is CTO of Threadless, the crowdsourced printing company in an adjoining neighbourhood. Harper delivered an excellent opening keynote in a TED Talk style, highlighting many learning points from his technology career, including his time as CTO of the Obama for America campaign – remember the Narwhal?

Harper spoke about how we grow the talent pipeline and further develop the bright people in our teams. We often concentrate on the development of future leaders; do we pay enough attention to our technical talent pipeline? A stream of the conference is focusing on the diversity of our workforce and providing the opportunity to tell the story of our careers to date; would it not be interesting to hear how colleagues got to where they are today? The point was made that we should always hire people who are smarter than ourselves and different from ourselves – a sure-fire way to build a great team. A lot of the work Harper’s team developed on the Obama for America campaign was related to business analytics: turning the data obtained from the doorstep campaign into information, then knowledge and ultimately wisdom for the micro-targeted marketing campaign.

Harper’s insights into the development of the architecture required to support this initiative describe a challenge similar to one raised in the UCISA Strategic Challenge Report 2013: “Supporting the use of analytics/business intelligence to inform decision-making and planning.” Architecture is key to success in this area, and Harper outlined the simplicity of making the same data available through straightforward API calls. Although, on the one morning when the daily campaign bulletin failed to arrive, it was not a failed ‘cronjob’ as the team expected: an intern had simply not turned up for the shift to input the data.

At Loughborough, I have the pleasure of working with some extremely clever people who can code and build things that are beyond the reach of my BBC BASIC skills of the 1980s. In terms of visualisation, Harper mentioned StatsD/Graphite, which looked extremely interesting to me, so a quick Google search found an introduction that those of you who can code may find useful. Some of the technology we promote within the community has a very long gestation period from inception through to fruition. Some technology doesn’t make it, but some becomes part of everyday life. Take eduroam, for example: 11 years in the making, it took a big push in the late 2000s for organisations to take it seriously, and now it is in commonplace use, including at the conference venue.
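For those who haven’t met it, StatsD’s wire protocol is pleasingly simple: plain-text metrics fired over UDP at a StatsD daemon, which aggregates them and forwards them to Graphite for graphing. A minimal sketch, with illustrative host and metric names:

```python
import socket
import time

STATSD = ("statsd.example.ac.uk", 8125)  # illustrative host; 8125 is the default port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def incr(metric):
    """Increment a counter: 'name:1|c'."""
    sock.sendto(f"{metric}:1|c".encode(), STATSD)

def timing(metric, ms):
    """Record a timer in milliseconds: 'name:<ms>|ms'."""
    sock.sendto(f"{metric}:{ms}|ms".encode(), STATSD)

# Instrument a request handler.
start = time.monotonic()
incr("webapp.requests")
# ... handle the request ...
timing("webapp.response_time", int((time.monotonic() - start) * 1000))
```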

I was at an EMUIT (East Midlands Universities IT) Operations meeting a week ago and was pleased to hear a colleague explain that they ‘required’ IPv6 to be operational on their site to win a research contract; in a similar vein, Harper explained that he ‘required’ the cloud in order to develop the architecture to support his work. Sometimes we are blinkered by the architecture we have always had, supported by the resources we have always had, doing things the way we have always done them. There are opportunities to think differently – there were a couple of Apple references in the talk – and I do genuinely believe there are opportunities to approach infrastructure in a different way.

When the UCISA bursary call for interest was released, I was originally going to submit a request to attend the AFRICOMM 6th International Conference on e-Infrastructure and e-Services for Developing Countries in Uganda. I can see there is a lot of potential learning in how to do things differently in challenging situations. As I was still recovering from a rather physically challenging broken ankle sustained in last year’s snowboarding season, I thought I’d play it safe and travel to the USA instead. I’ll certainly watch the African NRENs with interest after hearing about some of the innovative work they are undertaking at a previous TNC conference.

The final point I wanted to make about Harper’s presentation concerns failure: he was proud to announce “We practiced failure for over four months”. The learning points from understanding and embracing failure are great, yet they are all too often swept under the table rather than embraced and celebrated.

Learning Points

• How do we grow our technical talent pipeline?
• Designing the architecture to support analytics/business intelligence;
• Sometimes technology innovation has a long gestation period, be patient;
• Find opportunities to think differently about architecture;
• We should all train for failure to understand success.

Matt Cook