Category Archives: UCISA-NG

SDN, Open Daylight and KUMO cloud storage



Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG


2014 Technology Exchange – Day 4: Software Defined Networking (SDN)

I started the conference this morning with the goal of understanding more about Software Defined Networking (SDN). It has been a bit of a buzzword in the industry, but I had personally not seen the deployment or use cases that were due to be covered in the first session this morning.

It was great to see a number of Research and Education Networks talking about their use or proposed use of SDN: GRNET, Poznań (PSNC), RNP, Internet2, CERN and NICT. In brief, SDN is an abstraction of the control and data planes of networking devices: it splits the system that makes the routing and security decisions about traffic from the elements that simply forward packets. The primary goals are to provide agility, reduce complexity and prevent vendor lock-in.
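The control/data-plane split can be illustrated with a toy sketch (a teaching example, not a real controller or switch): the control plane installs match→action rules, and the data plane only performs lookups against them.

```python
# Toy illustration of the SDN split: the controller decides policy,
# the "switch" only matches packets against the installed flow table.
flow_table = []

def controller_install(match, action):
    """Control plane: decide policy and push a rule down to the data plane."""
    flow_table.append((match, action))

def switch_forward(packet):
    """Data plane: no decisions, just look the packet up in the flow table."""
    for match, action in flow_table:
        if all(packet.get(k) == v for k, v in match.items()):
            return action
    return "punt-to-controller"  # table miss: ask the control plane what to do

controller_install({"dst": "10.0.0.2"}, "output:port2")
print(switch_forward({"src": "10.0.0.1", "dst": "10.0.0.2"}))  # → output:port2
print(switch_forward({"dst": "192.0.2.1"}))                    # → punt-to-controller
```

The table miss behaviour is the essence of OpenFlow-style forwarding: unknown traffic is punted upstream so the controller can decide and install a new rule.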

Milosz from PSNC discussed a number of the EU use cases they were working on. It was interesting to see so much demand for agility to support remote (follow-the-moon) datacentres, cloud bursting and collaboration. Whilst OpenFlow is a part of SDN, it isn’t one and the same thing; they are focusing development on using version 1.3+. With the launch of the new Jisc national datacentre, it will be interesting to see Janet’s strategy for providing the network capabilities to exploit this resource.

Open Daylight

Continuing with the SDN theme, Cisco and Brocade hosted a session on the Open Daylight open source consortium.

There were some good dashboard examples of dynamic WAN link optimisation. One example was the dynamic configuration of the network based on the University calendar to provide bandwidth and cloud datacentre connectivity based on the peaks in the academic calendar.

Another use for this technology would be supporting the requirements of University and conference events based on their booking data, without permanently over-provisioning all year. Otherwise, is there a key driver for SDN on the LAN?

It was interesting to see that the majority of the room had already implemented SDN (circa 50) or were planning to in the next 12 months. About half of the room were already using Open Daylight on their network.

Colleagues in the room were really interested in the use of RESTful APIs to control SDN, and it was promising to see the integration possible with the software.
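As a flavour of what that integration looks like, the sketch below builds (but deliberately does not send) a northbound REST request for a controller’s node inventory. The hostname, credentials and URL path are illustrative assumptions only; Open Daylight releases differ in the exact northbound paths they expose.

```python
import base64
import urllib.request

# Hypothetical controller location and inventory path: treat both as
# assumptions, not as the definitive Open Daylight northbound API.
controller = "http://odl.example.ac.uk:8181"
path = "/restconf/operational/opendaylight-inventory:nodes"

# HTTP Basic auth header (default admin/admin credentials are illustrative)
token = base64.b64encode(b"admin:admin").decode()
req = urllib.request.Request(
    controller + path,
    headers={"Authorization": "Basic " + token, "Accept": "application/json"},
)

# urllib.request.urlopen(req) would return the JSON inventory; we only
# construct the request here so the sketch runs without a live controller.
print(req.full_url)
```

From a response like this, a script could enumerate switches and install flows, which is exactly the sort of integration colleagues were excited about.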

Cisco explained that they are absolutely committed to Open Daylight: they believe it is the future, and they are providing the code to back that up. Both Cisco and Brocade believe Open Daylight is the controller of the future. There was an interesting discussion about the changing supplier dynamics with the adoption of this technology, including a validation programme for the use of Open Daylight on other vendors’ products. Hear more about SDN and Open Daylight at our UCISA Network Futures event in January 2015.

KUMO Cloud Storage

I was interested in how Indiana University had deployed access to Google Drive and OneDrive, amongst other storage services, in their computer lab environment. It was something that we had been trialling at Loughborough University.

IU’s primary driver was providing storage in all of their VDI environments. A client provides all of the associated storage as mapped drives, without any local synchronisation requirements. On the roadmap are a Mac OS X client, multiple accounts per vendor and a new broadcast feature.

John explained how a lot of academics struggled to get the correct files to students and to have all the data extracted into their Box accounts. Broadcast integrates with the student module lists and automatically populates their drives with the correct files.


All students at IU get 100GB of Box storage through the Internet2 agreement!

It looks like Janet has an agreement with Box, along with Microsoft, Q Associates and Capito, on their File Sync and Share DPS. It will be interesting to hear what these services are and more details about the agreement.


We are certainly going to get in touch; this looks like exactly what we need here at Loughborough. At the moment they are planning to sell it at $0.33 per student user, with staff coming free. Does this mean the end of providing local student filestore?


It was a very interesting conference and generated some probing questions, which I’ve been able to share with you through the UCISA blog. As it was the first time Internet2 and ESnet have run this conference, there were a few organisational teething troubles, which you don’t get with well-established events like TNC and Janet Networkshop.

In my personal opinion there is quite a bit of development from US-based academic institutions of which I was previously unaware. It may be that I wasn’t looking in the right places, that colleagues in the community were not sharing this information in the UK, or that there is a genuine communications gap, which hopefully the UCISA bursary scheme has started to fill.

Learning Points

  • The use cases for SDN are growing; however, is there a compelling driver at the moment?
  • I personally see the future for SDN being integrated into cloud orchestration for the technical brokerage to cloud datacentres.
  • Ensure you keep a watching brief on SDN and Open Daylight.
  • What is the best way to exploit free cloud services like Google Drive etc?
  • How do we get to proactively find out about neat solutions like Kumo?

I hope that there is something in the blog posts over the last four days that is useful to you. Please feel free to get in touch if you have any questions and I will be writing a management-briefing summary from the event learning points. I’m heading back home via Chicago, so time to hit the road and I’ll see you back in the UK.


ESNet, The Energy Sciences Network



Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG


2014 Technology Exchange – Day 3

One of the features of conferences outside of the UK and especially prevalent in the USA is early morning sessions. It was time on day three to hit the 07:15 morning working group/camp/BoF sessions.

Unfortunately the ‘Campus Cloud Architects BoF’ was cancelled, which was really disappointing and not a good start, as I was hopeful to explore in person some of the latest concerns, trends and experiences in this area.

Industry groups have been reporting more and more interest in Cloud brokerage solutions and some companies are now recruiting for cloud broker and cloud architect roles. As cloud technology gets more mature, there is an opportunity to start brokering for the best possible service and cost for your organisation. In the sector we have seen an excellent start in this area by Janet with their agreements with Microsoft and Google for their email/applications suite.

There is a lot of development scope in this area with Microsoft Azure, AWS etc and I’m interested to explore the strategy required to position infrastructure, automation and standards to take best advantage of the emerging competition.

Perhaps this area is something that colleagues in the UCISA IG may be interested in picking up in the future?

I took advantage of the programme changes to share more details about the current UCISA activity in the ad-hoc groups using a short five-slide presentation covering these pieces of work:

• A guide to the implementation of an Information Security Management System (ISMS), launching in early 2015

• An update to the popular  ‘Exploiting and Protecting the Network’ document, launching in early 2015

• The Major Project Governance Assessment Toolkit

• UCISA Report 2013 – Strategic Challenges for IT Services.

There was a lot of interest in these areas and I had a couple of questions about integrating the planning, effort and joint working of UCISA and EDUCAUSE where there are clear overlaps and topics of interest.

The Energy Sciences Network ESnet are also interested in contributing to the Network Performance and QoS ‘community of practice workshop’  which the UCISA Networking Group are planning in January 2015 (more details coming to the UCISA NG mailing list soon).

Data Intensive Science

As this is an area in which I have little experience, I was interested in listening to what William Johnston from ESnet had to say about large-scale data-intensive science. He started by explaining his view that high energy physics is seen as a prototype platform for distributed collaboration in other science fields.

He explained that as instruments get bigger, they get more expensive (in a not-quite-as-dramatic Moore’s Law relationship); therefore there are fewer of them, which results in an increase in global collaboration. This shows the potential future growth of research networking bandwidth requirements.

One of the things I didn’t realise was that ESnet have extended their 100Gb full network backbone across the Atlantic into Europe, including connections in London. Their first circuit is being tested today. What does this mean for science and research in the UK?


William went on to talk about monitoring the network, explaining the criticality of this area. With many Infrastructure as a Service (IaaS) offerings, researchers are requesting Network as a Service, and with it the same levels of assurance and guarantees that have previously only been available with point-to-point links; is this going to change?

As one would expect, ESnet use perfSONAR for their assurance measurements. As I mentioned earlier, we will hopefully have representatives from ESnet and eduPERT at our Network Performance and QoS ‘community of practice workshop’ in January 2015.

Would something like perfSONAR deployed across Janet be of benefit for the community? Let us know your thoughts in the blog feedback section below. I would assume it requires volunteer sites; however, Janet are already looking at the possibility of network-based probes for eduroam, so perhaps there is scope for a next generation of Netsight with added assurance?

ESnet also use the weathermap tool, which is also loved by colleagues at Loughborough University. It was one of the best take away messages from a Janet Networkshop Lightning talk several years ago.

The remainder of the talk was about data transfer speeds and integrity. I was surprised to hear the comment “SCP is your enemy”. Surely not? However I was approaching the problem from the wrong angle, thinking about security and not data transfer speeds and parallelisation. Look at some of the figures in the photograph below.



William discussed a number of tools, including GridFTP and a development from CALTECH which stripes data across discs as part of the FTP process and performs CRC checking up to three times.

Interestingly, the last point was about data integrity, which is critical for the field of data-intensive science. William referenced the paper by Stone and Partridge (2000), “When the CRC and TCP Checksum Disagree”.
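The core of the Stone and Partridge observation is easy to demonstrate: the TCP/IP checksum is a ones’ complement sum of 16-bit words, so it is completely insensitive to the order of those words, whereas a CRC is position-sensitive and is guaranteed to catch any error burst no longer than its width. A minimal sketch:

```python
import binascii
import struct

def internet_checksum(data: bytes) -> int:
    # RFC 1071 ones' complement sum over 16-bit words (the TCP/IP checksum)
    if len(data) % 2:
        data += b"\x00"
    total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
    while total >> 16:                       # fold carries back into 16 bits
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

original  = b"\xDE\xAD\xBE\xEF"
corrupted = b"\xBE\xEF\xDE\xAD"  # adjacent 16-bit words swapped in transit

# The checksum is just a sum, so word order does not matter...
assert internet_checksum(original) == internet_checksum(corrupted)
# ...whereas CRC-32 detects any burst error of 32 bits or fewer.
assert binascii.crc32(original) != binascii.crc32(corrupted)
print("checksum agrees, CRC disagrees")
```

This is why high-volume transfer tools layer their own end-to-end integrity checks on top of TCP rather than trusting its checksum alone.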

During the break, I had a bit of a Google to find any UK user or interest groups for Research Computing and HPC. I found the HPC SIG, if you know of any others, please pop them in the blog comments to share.

Connecting 40Gb Hosts

Whilst in the ‘big data’ mindset, there was an interesting session where colleagues from Fermi Labs, ESnet and CALTECH shared best practice infrastructure configuration to support high-speed data transfer.

There was some very interesting visual modelling, which demonstrated the affinity the network card has with a particular processor socket and core. The difference made by optimising for data transfer is significant: 37Gbps vs 26Gbps maximum on a 40Gbps link.
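The basic technique behind those numbers is keeping the transfer process on the CPUs local to the NIC’s socket. The Linux-only sketch below shows the idea; the CPU set is an illustrative assumption, as on a real host you would read it from `/sys/class/net/<interface>/device/local_cpulist`.

```python
import os

# Assumption for illustration: the NIC is attached to the socket that owns
# CPU 0. On a real host, read the set from the sysfs local_cpulist file.
nic_local_cpus = {0}

available = os.sched_getaffinity(0)               # CPUs we may run on (Linux)
target = (nic_local_cpus & available) or available  # fall back if CPU 0 is unavailable
os.sched_setaffinity(0, target)                   # pin this process to NIC-local CPUs

print("pinned to CPUs:", sorted(os.sched_getaffinity(0)))
```

Pinning this way avoids the packets’ data crossing the inter-socket interconnect between the NIC and the core doing the copy, which is where much of the 37Gbps-vs-26Gbps gap comes from.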

It was a packed session with many colleagues standing at the back; there is certainly an art to tweaking infrastructure to perform in the best possible manner. It was also interesting to hear there are three 100Gb network cards in development and testing.

Pushing the Boundaries of the Traditional Classroom

There was a bit of a clash in the programme, so I didn’t get to spend a lot of time in this session, but it was interesting to see what Indiana University had done with their ‘Collaboration Café’.

It led me to wonder what the key limitation on adopting more of these learner-centric classroom designs is. Is it financial, or is it resistance from academic colleagues, in the same way as there was/is resistance to lecture capture and VLE environments?

UCISA are working with SCHOMS and AUDE on an update to Learning Space design principles. This document should be really useful, especially as the final point from the presentation was all about the removal of wires.

At Loughborough we are trialling the Epson projector series that uses the Epson EasyMP software and iProjection app. What wireless projectors and/or screens are you using? Let us know in the blog feedback section below.

Other Thoughts

The other talks I attended through the day continued the research and big data theme, including hearing about the petabytes (PB) of data required by some of the medical research being undertaken as part of the ICTBioMed platform. One of the speakers commented that biology is becoming more like computer science by the day, confirming again that multidisciplinary research is a firm requirement for a lot of modern applied research.

Some examples of digital biology given were: DNA Sequencing, Gene Expression Analysis, Protein Profiling and Protein to Protein interactions.

A number of the speakers came in via videoconference; it was interesting to see the mix of success and failure of this bold move. It seems strange that we still struggle to co-ordinate a remote video connection with the technology we have at our disposal in 2014.

Another speaker also made reference to the worldwide nature of large research groups and collaborations and said this collaboration technology was essential.

Video Collaboration

For the final session of the day, I was interested to see what the future held for video-based collaboration in a session with speakers from Internet2, Pexip, Evogh, Blue Jeans and Vidyo. I didn’t manage to ask Robb from Blue Jeans more about the disappointing removal of the Skype interface API; however, during the panel he mentioned that they had a Google Hangouts bridge to standards-based systems available.

There were some interesting remarks from Hakon Dahle who is CTO at Pexip based in Oslo (but was previously CTO at Tandberg and Cisco).

Hakon described their distributed architecture, where it is possible to start small and grow appropriately with options to add capacity on demand in an agile manner.

Latency was still an issue with global video conferencing and there was a panel debate about the pros/cons of transcoding increasing latency vs accessibility and interoperability.

“Transcoding is a necessary evil”, especially with new protocols like WebRTC.

There were very positive comments about WebRTC and how it will make video more accessible and face-to-face communications easier; however, there is already a divide, with the Google VP9 codec being favoured by some players in the market, especially when delivering very high resolution 4K streams.

Hakon explained that WebRTC seemed the most promising technology for direct person-to-person video calls and will bring about a lot of new use cases; that new-use-case element is the most exciting in terms of innovation.

Learning Points

• How do we best position our infrastructure to take advantage of emerging Cloud competition?
• How do we collaborate more with colleagues from Internet2, ESnet and EDUCAUSE? Is this something UCISA and Janet/Jisc can facilitate?
• Future growth potential of research data transfer requirements
• Are we best serving our research communities, what more can we do?
• Global nature of research and therefore the communication requirements.

Matt Cook

2014 Technology Exchange – Day 2 by Matt Cook



Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG


FIRST Robotics Competition

The Monday evening welcome reception included a contest based on robots developed by high school students. The students were given six weeks to raise funds, design, develop, brand and program a robot – not an easy task! It was great to see such innovation from our students and colleagues of the future. I wish we had these opportunities back when I was at school; the best we experienced was BigTrak and writing Logo programs. However, at least we were taught programming in BBC BASIC, and not simply how to use the Microsoft Office suite.


The USA is promoting Science, Technology, Engineering and Mathematics (STEM) subjects in a similar manner to the UK. It will be interesting to see how successful this initiative is in providing the education required for our fellow colleagues of the future and plugging the current skills gap. Talking to the students, they are extremely enthused about the creator, maker, hacker opportunities being given through these programmes.

This is another one of those opportunities which demonstrates the value in the jobs we perform in our respective organisations to support education. I recently undertook a job shadow of a technician in one of our academic schools at Loughborough, and it was one of the most eye opening experiences I had all year.  It was extremely valuable to see the challenges they face within the school, how central IT policy affects their work and the innovation and creative ideas being developed by their students. I would certainly encourage everyone to get out into the wider university more to put everything into perspective.

Central IT vs Research Perspective on Information Security

There was a very interesting panel discussion mid-way through the Tuesday schedule investigating the challenges faced by both the central IT function and research centres in managing Information Security. Rob Stanfield from Purdue University provided an overview of the provision at his organisation, and one thing that stood out was the scale of some of the US-based education organisations, which dwarfs most of the largest UK universities. The scale of operation also brought increased scale of staffing and, as a coffee-break discussion revealed, of budget too. Purdue are currently recruiting a Security Risk Analyst and see an important element of their future service as being better placed to advise on Information Security impact across their business.

There is a growing move to work with researchers to define strategy that allows Information Security to be an enabler and an active component in winning research grants. The panel all agreed that there was a need to form better relationships between research and central IT; something that I’ll personally be working on at Loughborough University over the coming years. There was agreement that the era of siloed departmental research email servers and wireless networks was not effective and that the future is centralisation and collaboration. Closing comments focused on “…there is nothing like a good data breach to bring about change!” and “…some people are more concerned with IDS appliances than the balance of risk.”

Over coffee a number of people who attended the session were interested in the current UCISA activities to develop an Information Security Management System (ISMS) implementation guide and the update to the popular ‘Exploiting and Protecting the Network’ document; both set to launch in early 2015. Keep an eye on the UCISA website for more information!

As suggested, I will be posting details about these activities to the EDUCAUSE Security Constituent Group mailing list as well. This list may also be of interest to UK colleagues who are looking to get a wider perspective on Information Security concerns within global education organisations. Whilst the remit for security falls between both the Network (NG) and Infrastructure (IG) groups within UCISA, some readers of the blog may not be aware of the UCISA-IS Information Security mailing list. Although currently low traffic, it is a growing area of discussion.

For those with larger security teams, it may also be of interest to explore the TERENA TF-CSIRT group.

Privacy in Access and Identity Management

Dr Rhys Smith (Janet) delivered the final session I attended on Tuesday. I’ve not personally been involved in the Access and Identity Management (AIM) side of IT at Loughborough; however, I was eager to see what was on the horizon for Moonshot, especially what it can offer the research community. It was nice to see some friendly faces when I arrived at the conference: Rhys Smith, John Chapman and Rob Evans from Janet, and Nicole Harris from TERENA. I’ve also since met quite a few people I’ve previously spoken to by email or have seen posting on mailing lists.

Rhys gave a gentle introduction to AIM before describing how we should be adopting privacy by design, as it is so difficult to retrofit. As part of a privacy vs utility discussion; Rhys provided the example that the routing of IP network packets outside of the EU is breaking EU data protection guidelines as an IP address is deemed to contain personally identifiable information. Whilst this example is simply unworkable, the categorisation of IP addresses has caused some interesting consequences for our Computer Science researchers.

Following a narrative of the difference between web based federation (SAML) and network based federations (like eduroam); Rhys outlined the timescales for the Moonshot trial and official service. Being able to unify many technologies from simple SSH through to Windows desktop authentication opens many possibilities for secure research collaboration in the future.

Other Thoughts

There were lots of interesting conversations through the conference today about the development of common shared tools or building blocks to solve future challenges. From the infrastructure that supports eduroam through to the Kuali HE software suite. Many felt that through collaboration, a better solution can be developed with less resource; however there were concerns that high workloads in recent years had removed a lot of these opportunities for some.

Another common theme was the adoption of standards, rather than closed proprietary technology, avoiding vendor lock-in where possible and using the infrastructure as a live research aid for students within our organisations.

Learning Points

• Get out into the wider university to put your role into perspective;
• Turn Information Security policy and strategy into an enabler that wins research grants;
• Seek collaboration and closer relationships with our research community;
• Explore opportunities for privacy by design;
• Keep a watching brief on Janet Moonshot developments;
• Support the development of common shared tools and building blocks where appropriate.

Matt Cook


APIs, architecture and the Narwhal



Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG


2014 Technology Exchange – Day 1

Courtesy of the UCISA 21st anniversary bursary scheme, I am in Indianapolis, USA this week for the inaugural Technology Exchange conference hosted jointly by Internet2 and ESnet. Internet2 is the USA equivalent of the Janet National Research and Education Network (NREN) in the UK. ESnet provides specific high bandwidth connections to Energy Science research organisations across the USA and beyond.

If you have never been to a conference within the USA before, I’d certainly recommend taking the opportunity to experience a different scale of event. I’ve spoken at VMworld in the USA before where over 7,000 delegates attended the conference, which was orchestrated more like a music concert or sporting event; I was pleasantly surprised to experience a more personal 750 delegates for the first Technology Exchange conference. The same networking opportunities are provided with mini sessions starting at breakfast, multiple mini working groups ‘camps’, Birds of a Feather (BoF) sessions and both leadership and technical streams.

There are four main topics covered within the conference:

  • Security;
  • Trust, Identity and Middleware Applications;
  • Cloud Services; and
  • Advanced Networking/Joint Technologies.

As an inaugural event, I’m interested to see how it positions itself along with the Internet2 Global Summit, TNC and Janet Networkshop. I really value colleagues in the community who dedicate time to blogging thoughts from the events they are attending. Collectively it provides a rich resource and I’m pleased to be contributing to this through the UCISA blog over the next four days.

Opening Thoughts

The opening keynote was delivered by Harper Reed who would not look out of place in one of the hipster cafes in the Wicker Park area of Chicago. This is by no means a coincidence as one of his roles is CTO of Threadless, the crowdsourced printing company in an adjoining neighbourhood. Harper delivered an excellent opening keynote in a TED Talk style highlighting many learning points from his technology career including that as CTO of the Obama for America campaign – remember the Narwhal?

Harper spoke about how we grow the talent pipeline and further develop the bright people in our teams. We often concentrate on the development of future leaders; do we pay enough attention to our technical talent pipeline? A stream of the conference is focusing on the diversity of our workforce and providing the opportunity to tell the story of our careers to date; would it not be interesting to hear how colleagues got to where they are today? The point was made that we should always hire people who are smarter than us and who are different to us – a sure-fire way to build a great team. A lot of the work Harper’s team developed on the Obama for America campaign was related to business analytics: turning the data obtained from the doorstep campaign through information, into knowledge and ultimately wisdom for the micro-targeting marketing campaign.

The architecture Harper’s team developed to support this initiative addresses a similar challenge to that raised in the UCISA Strategic Challenge Report 2013: “Supporting the use of analytics/business intelligence to inform decision-making and planning.” Architecture is key to success in this area, and Harper outlined the simplicity of making the same data available through straightforward API calls. On the one morning when the daily campaign bulletin failed to arrive, however, it was not a failed ‘cronjob’ as the team expected; an intern had simply not turned up for the shift to input the data.

At Loughborough, I have the pleasure of working with some extremely clever people who can code and build things which are beyond the reach of my BBC BASIC skills of the 1980s. In terms of visualisation, Harper mentioned StatsD/Graphite, which looked extremely interesting to me, so a quick Google search found an introduction that those of you who can code may find useful. Some of the technology we promote within the community has a very long gestation period from inception through to fruition. Some technology doesn’t make it, but other technology becomes part of everyday life. Take eduroam for example: 11 years in the making, it took a big push in the late 2000s for organisations to take it seriously, and now it is in commonplace use, including at the conference venue.
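Part of StatsD’s appeal is how small its wire protocol is: a metric is just a plain-text datagram sent over UDP. The sketch below sends a counter to a hypothetical local StatsD daemon on the conventional default port; the metric name is made up for illustration.

```python
import socket

def send_counter(name: str, value: int,
                 host: str = "127.0.0.1", port: int = 8125) -> bytes:
    # StatsD's plain-text format: "<metric>:<value>|c", where "|c" marks a counter
    payload = f"{name}:{value}|c".encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))  # fire-and-forget: UDP never blocks
    return payload

print(send_counter("page.views", 1))  # → b'page.views:1|c'
```

Because the send is fire-and-forget UDP, instrumenting application code this way adds almost no overhead, which is much of why the StatsD/Graphite pairing became popular for dashboards.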

I was at an EMUIT (East Midlands Universities IT) Operations meeting a week ago and was pleased to hear a colleague explain that they ‘required’ IPv6 to be operational on their site to win a research contract; in a similar vein I was pleased to see Harper explain that he ‘required’ the cloud in order to develop the architecture to support his work. Sometimes we are blinkered by the architecture we have always had, supported by the resources we have always had and have done things in the way we always have done. There are opportunities to think differently, there were a couple of Apple references in the talk, but I do genuinely believe there are opportunities to approach infrastructure in a different way.

When the UCISA bursary call for interest was released, I was originally going to submit a request to attend the “AFRICOMM 6th International Conference on e-Infrastructure and e-Services for Developing Countries” in Uganda. I can see there is a lot of potential learning in how to do things differently in challenging situations. As I was still recovering from a rather physically challenging broken ankle sustained in last year’s snowboarding season, I thought I’d play it safe and travel to the USA instead. I’ll certainly watch the African NRENs with interest after hearing at a previous TNC conference about some of the innovative work they are undertaking.

The final point I wanted to make about Harper’s presentation concerns failure; he was proud to announce, “We practiced failure for over four months”. The lessons from understanding and embracing failure are great, yet they are often swept under the table rather than embraced and celebrated.

Learning Points

• How do we grow our technical talent pipeline?
• Designing the architecture to support analytics/business intelligence;
• Sometimes technology innovation has a long gestation period, be patient;
• Find opportunities to think differently about architecture;
• We should all train for failure to understand success.

Matt Cook

Comments welcome on new structure for the UCISA Information Security Toolkit

We would like to invite comment from the community on the revised structure and content of the UCISA Information Security Toolkit which was agreed by the project group at a meeting last month.


The UCISA Information Security Toolkit has been very successful, providing much-needed assistance to information security professionals across the sector. Since the original funding application for the project in 2004, there have been a number of iterations of the document, based upon changing standards and sector-wide activity. The last Toolkit was published in 2007 (third edition).

A number of factors have prompted a rewrite and expansion of the document: cloud technologies, PCI DSS, data classification and supportive practical advice in the form of appropriate feedback cycles (for example Plan/Do/Check/Act). The largest factor was the release of the BS ISO/IEC 27001:2013 standard in the autumn of last year.

The group, comprising colleagues from University College London, the University of Oxford, Loughborough University, Cardiff University, the University of York and Janet, has met regularly in person and via Skype in order to generate new content. The revised Toolkit will include a number of practical case studies demonstrating what works and what does not work in practice. Topics include: policy development; raising user awareness; investigations; and research security.

The new Toolkit will be launched in March 2015 to coincide with UCISA 2015 in Edinburgh and Janet Networkshop 43 in Exeter.

Matt Cook, Chair, UCISA Networking Group
Head of Infrastructure and Middleware,
Loughborough University, IT Services


The Networking Group’s purpose is to raise awareness of networking, telephony and IT security developments; to share examples of good practice; and to act as a voice for the networking field within the UCISA community.