Monthly Archives: October 2014

SDN, OpenDaylight and KUMO cloud storage


 

Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG

 

2014 Technology Exchange – Day 4: Software Defined Networking (SDN)

I started the conference this morning with the goal of understanding more about Software Defined Networking (SDN). It has been a bit of a buzzword in the industry, but I had personally not seen the deployments or use cases that were going to be part of the first session this morning.

It was great to see a number of Research and Education Networks talking about their use or proposed use of SDN: GRNET, PSNC (Poznań), RNP, Internet2, CERN and NICT. In brief, SDN is an abstraction of the control plane from the data plane of networking devices, splitting the system that makes the routing and security decisions about traffic from the elements that simply forward packets. The primary goals are to provide agility, reduce complexity and prevent vendor lock-in.

Milosz from PSNC discussed a number of the EU use cases they were working on. It was interesting to see a lot of demand for agility to support remote (follow-the-moon) datacentres, cloud bursting and collaboration. Whilst OpenFlow is a part of SDN, it isn’t one and the same thing; they are focusing development on version 1.3+. With the launch of the new Jisc national datacentre, it will be interesting to see Janet’s strategy for providing the network capabilities to exploit this resource.

OpenDaylight

Continuing with the SDN theme, Cisco and Brocade hosted a session on the OpenDaylight open source consortium.

There were some good dashboard examples of dynamic WAN link optimisation. One example was the dynamic configuration of the network, based on the University calendar, to provide bandwidth and cloud datacentre connectivity matched to the peaks in the academic year.

Another use for this technology would be supporting the requirements of university and conference events based on their booking data, without permanently over-provisioning all year. Otherwise, is there a key driver for SDN on the LAN?

It was interesting to see that the majority of the room had already implemented SDN (circa 50) or were planning to in the next 12 months. About half of the room were already using OpenDaylight on their network.

Colleagues in the room were really interested in the utilisation of RESTful APIs to control SDN, and it was promising to see the integration possible with the software.
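
For readers who have not used a controller’s northbound interface before, the sketch below gives a rough feel for what that REST integration looks like. It is only an illustration: the address, port, credentials and JSON layout follow a default Helium-era OpenDaylight install with the RESTCONF and OpenFlow plugin features enabled, so treat them as assumptions to check against your own controller’s documentation.

```python
# Illustrative sketch: list the OpenFlow switches an OpenDaylight controller
# knows about via its northbound RESTCONF API. Controller address, port and
# credentials are assumptions based on a default (Helium-era) install.
import requests

CONTROLLER = "http://localhost:8181"   # assumed controller address and port
AUTH = ("admin", "admin")              # default credentials; change in production

resp = requests.get(
    CONTROLLER + "/restconf/operational/opendaylight-inventory:nodes",
    auth=AUTH,
    headers={"Accept": "application/json"},
    timeout=10,
)
resp.raise_for_status()

# The operational inventory returns a list of nodes (switches); print their IDs.
for node in resp.json().get("nodes", {}).get("node", []):
    print(node.get("id"))
```

The same style of call (a PUT rather than a GET) is used to push flow entries, which is where calendar-driven automation like the example above would hook in.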

Cisco explained that they are absolutely committed to OpenDaylight; they believe it is the future and are providing the code to back that up. Both Cisco and Brocade believe OpenDaylight is the controller of the future. There was an interesting discussion about the changing supplier dynamics that come with the adoption of this technology, including a validation programme for the use of OpenDaylight on other vendors’ products. Hear more about SDN and OpenDaylight at our UCISA Network Futures event in January 2015.

KUMO Cloud Storage

I was interested in how Indiana University had deployed access to Google Drive and OneDrive, amongst other storage services, in their computer lab environment. It was something that we had been trialling at Loughborough University.

IU’s primary driver was providing storage in all of the VDI environments they had. A client provides all of the associated storage as mapped drives without any local synchronisation requirements. On the roadmap are a Mac OS X client, multiple accounts per vendor and a new broadcast feature.

John explained how a lot of academics struggled to get the correct files to students and to have all the data delivered into their Box accounts. Broadcast integrates with the student module lists and automatically populates their drives with the correct files.

See https://cloudstorage.iu.edu/partner or contact jhoerr@iu.edu.

All students at IU get 100GB of Box storage through the Internet2 agreement!

http://www.internet2.edu/products-services/cloud-services-applications/box

It looks like Janet has an agreement with Box, along with Microsoft, Q Associates and Capito on their File Sync and Share DPS. It will be interesting to hear what these services are and more details about the agreement: https://community.ja.net/system/files/6989/List%20of%20Suppliers_0.pdf

What we were investigating at Loughborough University: http://blog.lboro.ac.uk/middleware/blog/apps-for-education/apps-for-education

We are certainly going to get in touch; this looks like exactly what we need here at Loughborough. At the moment they are planning to sell it at $0.33 per student user, with staff accounts free. Does this mean the end of providing local student filestore?

Conclusions

It was a very interesting conference and generated some probing questions, which I’ve been able to share with you through the UCISA blog. As it was the first time Internet2 and ESnet have run this conference, there were a few organisational teething troubles, which you don’t get with well-established events like TNC and Janet Networkshop.

In my personal opinion there is quite a bit of development from US-based academic institutions which I was previously unaware of. It may be that I wasn’t looking in the right places, that colleagues in the community were not sharing this information in the UK, or that there is a genuine communications gap, which hopefully the UCISA bursary scheme has started to fill.

Learning Points

  • The use cases for SDN are growing; however, is there a compelling driver at the moment?
  • I personally see the future for SDN being integrated into cloud orchestration for the technical brokerage to cloud datacentres.
  • Ensure you keep a watching brief on SDN and OpenDaylight.
  • What is the best way to exploit free cloud services like Google Drive etc?
  • How do we get to proactively find out about neat solutions like Kumo?

I hope that there is something in the blog posts over the last four days that is useful to you. Please feel free to get in touch if you have any questions and I will be writing a management-briefing summary from the event learning points. I’m heading back home via Chicago, so time to hit the road and I’ll see you back in the UK.

2014TechExDay4

ESNet, The Energy Sciences Network


 

Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG

 


2014 Technology Exchange – Day 3

One of the features of conferences outside of the UK and especially prevalent in the USA is early morning sessions. It was time on day three to hit the 07:15 morning working group/camp/BoF sessions.

Unfortunately the ‘Campus Cloud Architects BoF’ was cancelled, which was really disappointing and not a good start, as I was hopeful to explore in person some of the latest concerns, trends and experiences in this area.

Industry groups have been reporting more and more interest in Cloud brokerage solutions and some companies are now recruiting for cloud broker and cloud architect roles. As cloud technology gets more mature, there is an opportunity to start brokering for the best possible service and cost for your organisation. In the sector we have seen an excellent start in this area by Janet with their agreements with Microsoft and Google for their email/applications suite.

There is a lot of development scope in this area with Microsoft Azure, AWS etc and I’m interested to explore the strategy required to position infrastructure, automation and standards to take best advantage of the emerging competition.

Perhaps this area is something that colleagues in the UCISA IG may be interested in picking up in the future?

I took advantage of the programme changes to share more details about the current UCISA activity in the ad-hoc groups using a short five-slide presentation covering these pieces of work:

• A guide to the implementation of an Information Security Management System (ISMS), launching in early 2015

• An update to the popular  ‘Exploiting and Protecting the Network’ document, launching in early 2015

• The Major Project Governance Assessment Toolkit

• UCISA Report 2013 – Strategic Challenges for IT Services.

There was a lot of interest in these areas and I had a couple of questions about integrating the planning, effort and joint working of UCISA and EDUCAUSE where there are clear overlaps and topics of interest.

The Energy Sciences Network (ESnet) are also interested in contributing to the Network Performance and QoS ‘community of practice workshop’ which the UCISA Networking Group are planning for January 2015 (more details coming to the UCISA NG mailing list soon).

Data Intensive Science

As this is an area where I have little experience, I was interested in listening to what William Johnston from ESnet had to say about large-scale data intensive science. He started by explaining his view that high energy physics is seen as a prototype platform for distributed collaboration in other science fields.

He explained that as instruments get bigger, they get more expensive (in a not-quite-as-dramatic Moore’s Law relationship); therefore there are fewer of them, which results in an increase in global collaboration. This shows the potential future growth of research networking bandwidth requirements.

One of the things I didn’t realise was that ESnet have extended their 100Gb full network backbone across the Atlantic into Europe, including connections in London. Their first circuit is being tested today. What does this mean for science and research in the UK?

Further details are available at:
http://es.net/news-and-publications/esnet-news/2014/esnet-extends-100g-connectivity-across-atlantic
http://www.geant.net/MediaCentreEvents/news/Pages/three-high-speed-links.aspx

William went on to talk about monitoring the network, explaining the criticality of this area. With many Infrastructure as a Service (IaaS) offerings, researchers are requesting Network as a Service, and with it the same levels of assurance and guarantees that have previously only been available with point-to-point links; is this going to change?

As one would expect, ESnet use perfSONAR for their assurance measurements. As I mentioned earlier, we will hopefully have representatives from ESnet and eduPERT at our Network Performance and QoS ‘community of practice workshop’ in January 2015.

Would something like perfSONAR deployed across Janet be of benefit to the community? Let us know your thoughts in the blog feedback section below. I would assume it requires volunteer sites; however, Janet are already looking at the possibility of network-based probes for eduroam, so perhaps there is scope for a next generation of Netsight with added assurance.
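
perfSONAR itself is a complete measurement framework, so I won’t try to reproduce it here, but the flavour of what a lightweight active probe does can be sketched in a few lines: repeatedly time TCP connection setup to a remote service and report the spread. This is my own illustration rather than anything from the perfSONAR toolkit, and the hostname, port and sample count are made up.

```python
# Illustrative sketch of a simple active network probe: time TCP connection
# setup to a remote service and summarise the results. Hostname, port and
# sample count are placeholders, not part of any perfSONAR tooling.
import socket
import statistics
import time

def connect_times(host="probe-target.example.ac.uk", port=443, samples=10):
    """Measure TCP connect latency to host:port, returning times in milliseconds."""
    results = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass  # connection established; we only care how long it took
        results.append((time.perf_counter() - start) * 1000.0)
        time.sleep(1)  # pace the probes
    return results

if __name__ == "__main__":
    times = connect_times()
    print("min {:.1f} ms, median {:.1f} ms, max {:.1f} ms".format(
        min(times), statistics.median(times), max(times)))
```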

ESnet also use the weathermap tool, which is well loved by colleagues at Loughborough University; it was one of the best takeaway messages from a Janet Networkshop lightning talk several years ago.

The remainder of the talk was about data transfer speeds and integrity. I was surprised to hear the comment “SCP is your enemy”. Surely not? However I was approaching the problem from the wrong angle, thinking about security and not data transfer speeds and parallelisation. Look at some of the figures in the photograph below.

2014TechExDay3

 

William discussed a number of tools, including GridFTP and a development from Caltech which stripes data across discs as part of the FTP process, as well as performing CRC checking up to three times.
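
To make the parallelisation point concrete, here is a minimal sketch of fetching a file in concurrent slices rather than as one serial stream. It is not GridFTP (which uses parallel TCP streams, striping and much more); it simply issues HTTP range requests against a made-up URL to show the general idea, and assumes the server honours Range headers.

```python
# Illustrative sketch of parallelised bulk transfer using HTTP range requests.
# The URL and chunk size are placeholders; GridFTP achieves the same effect
# (and much more) with parallel TCP streams.
import concurrent.futures
import requests

URL = "https://data.example.org/run-2014-10.dat"   # placeholder source
CHUNK = 64 * 1024 * 1024                           # 64 MB slices

def fetch_range(start, end):
    """Download bytes [start, end] of the file with an HTTP Range request."""
    headers = {"Range": "bytes={}-{}".format(start, end)}
    return requests.get(URL, headers=headers, timeout=120).content

def parallel_download(total_size, workers=8):
    """Fetch the whole file as CHUNK-sized slices across several threads."""
    ranges = [(s, min(s + CHUNK, total_size) - 1) for s in range(0, total_size, CHUNK)]
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda r: fetch_range(*r), ranges)
    return b"".join(parts)
```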

Interestingly, the last point was about data integrity, which is critical for the field of data intensive science. William referenced the paper by Stone and Partridge (2000), “When the CRC and TCP Checksum Disagree”.
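
The practical upshot of that paper is that the TCP checksum alone cannot be relied upon for very large transfers, so transfer tools verify a stronger application-level digest at both ends. A minimal sketch of that end-to-end check, with placeholder file paths:

```python
# Illustrative sketch of end-to-end integrity checking after a bulk transfer:
# compute a strong digest of the received file and compare it with the digest
# reported by the sending site. File path and expected digest are placeholders.
import hashlib

def file_digest(path, chunk_size=4 * 1024 * 1024):
    """Return the SHA-256 hex digest of a file, streamed in 4 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(path, expected_digest):
    """Raise if the received file does not match the sender's digest."""
    actual = file_digest(path)
    if actual != expected_digest:
        raise ValueError("integrity check failed for {}: {} != {}".format(
            path, actual, expected_digest))
```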

During the break, I had a bit of a Google to find any UK user or interest groups for Research Computing and HPC. I found the HPC SIG, if you know of any others, please pop them in the blog comments to share.

Connecting 40Gb Hosts

Whilst in the ‘big data’ mindset, there was an interesting session where colleagues from Fermi Labs, ESnet and CALTECH shared best practice infrastructure configuration to support high-speed data transfer.

There was some very interesting visual modelling, which demonstrated the affinity the network card has with a particular processor socket and core. The difference made by optimising for data transfer is significant: 37Gbps versus 26Gbps maximum on a 40Gbps link.
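
On Linux, the starting point for that kind of tuning is simply discovering which NUMA node the network card is attached to and which CPUs are local to it, so the transfer processes (and interrupt handling) can be pinned nearby. A minimal sketch; the interface name is a placeholder, while the sysfs paths are standard on Linux:

```python
# Illustrative sketch: find the NUMA node a NIC sits on and the CPUs local to
# that node, so transfer tools can be pinned with numactl/taskset.
# The interface name is a placeholder.
from pathlib import Path

def nic_numa_node(interface="ens1f0"):
    """Return the NUMA node of a network interface (-1 if not reported)."""
    return int(Path("/sys/class/net/{}/device/numa_node".format(interface)).read_text().strip())

def local_cpus(numa_node):
    """Return the CPU list local to a NUMA node, e.g. '0-9,20-29'."""
    return Path("/sys/devices/system/node/node{}/cpulist".format(numa_node)).read_text().strip()

if __name__ == "__main__":
    node = nic_numa_node()
    if node < 0:
        print("Platform does not report a NUMA node for this NIC")
    else:
        print("NIC is on NUMA node {}; pin transfer tools to CPUs {}".format(node, local_cpus(node)))
        # e.g. run the transfer under: numactl --cpunodebind=<node> --membind=<node> <tool>
```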

It was a packed session with many colleagues standing at the back; there is certainly an art to tweaking infrastructure to perform in the best possible manner. It was also interesting to hear there are three 100Gb network cards in development and testing.

Pushing the Boundaries of the Traditional Classroom

There was a bit of a clash in the programme, so I didn’t get to spend a lot of time in this session, but it was interesting to see what Indiana University had done with their ‘Collaboration Café’.

It led me to wonder what the key limitation to adopting more of these learner-centric classroom designs is. Is it financial, or is it resistance from academic colleagues, in the same way as there was (and is) resistance to lecture capture and VLE environments?

UCISA are working along with SCHOMS and AUDE on an update to Learning Space design principles. This document should be really useful, especially as the final point from the presentation was all about the removal of wires.

At Loughborough we are trialling the Epson projector series that uses the Epson EasyMP software and iProjection app. What wireless projectors and/or screens are you using? Let us know in the blog feedback section below.

Other Thoughts

The other talks I attended through the day continued on the research and big data theme. They included hearing about the petabytes (PB) of data required by some of the medical research being undertaken as part of the ICTBioMed platform. One of the speakers commented that biology is becoming more like computer science by the day, confirming again that multidisciplinary research is a firm requirement for a lot of modern applied research.

Some examples of digital biology given were: DNA Sequencing, Gene Expression Analysis, Protein Profiling and Protein to Protein interactions.

A number of the speakers came in via videoconference; it was interesting to see the mix of success and failure of this bold move. It seems strange that we still struggle to co-ordinate a remote video connection with the technology we have at our disposal in 2014.

Another speaker also made reference to the worldwide nature of large research groups and collaborations and said this collaboration technology was essential.

Video Collaboration

For the final session of the day, I was interested to see what the future held for video-based collaboration in a session with speakers from Internet2, Pexip, Evogh, Blue Jeans and Vidyo. I didn’t manage to ask Robb from Blue Jeans more about the removal of the Skype interface API, which was so disappointing; however, during the panel he mentioned that they have a Google Hangouts bridge to standards-based systems available.

There were some interesting remarks from Hakon Dahle, CTO at Pexip in Oslo (and previously CTO at Tandberg and Cisco).

Hakon described their distributed architecture, where it is possible to start small and grow appropriately with options to add capacity on demand in an agile manner.

Latency was still an issue with global video conferencing and there was a panel debate about the pros/cons of transcoding increasing latency vs accessibility and interoperability.

“Transcoding is a necessary evil”, especially with new technologies like WebRTC!

There were very positive comments about WebRTC and how it will make video more accessible and face-to-face communications easier; however, there is already a divide, with Google’s VP9 codec being favoured by some players in the market, especially when delivering very high resolution 4K streams.

Hakon explained that WebRTC seems the most promising technology for direct person-to-person video calls and will bring about a lot of new use cases; this new use case element is the most exciting in terms of innovation.

Learning Points

• How do we best position our infrastructure to take advantage of emerging Cloud competition?
• How do we collaborate more with colleagues from Internet2, ESnet and EDUCAUSE? Is this something UCISA and Janet/Jisc can facilitate?
• Future growth potential of research data transfer requirements
• Are we best serving our research communities, what more can we do?
• Global nature of research and therefore the communication requirements.

Matt Cook

Cyber security – top table interest

The risk cyber crime presents to the higher education sector was highlighted to Vice-Chancellors at the Universities UK Conference in 2012. Since then, there have been a series of round table discussions which have looked at the ability of the UK higher education sector to respond to cyber crime attacks. I attended the most recent of these which focused on the outcomes of a self-assessment exercise UUK promoted earlier in the year.

Those institutions that had completed the exercise will receive individual reports in the near future and a briefing will be circulated to Vice-Chancellors reflecting on the exercise. The briefing will include an additional report giving details of a number of UCISA resources that support institutions in their cyber security initiatives. The detailed results of the exercise are embargoed until the institutions have received their individual reports but, although it is clear that there is work to be done, there are some encouraging signs that cyber security is being taken seriously at a senior level within many institutions.

There are a number of factors that support this assessment. Firstly, over sixty institutions took part in the exercise. In addition to these institutions, I am aware of a number of others that did not take part as they had already carried out similar work, either utilising already published controls (such as the CPNI’s twenty controls for cyber defence) or by engaging external consultants.

Secondly there was a good level of interest shown in security and risk related topics by delegates at the Universities UK Conference this year. UCISA exhibits at the Conference to promote our resources and activities. Two publications that drew particular interest were the revised Model Regulations for the use of institutional IT systems and the Information Security Toolkit. Effective information security is underpinned by effective regulations and the Model Regulations give institutions a template to utilise locally. The current version of the Information Security Toolkit provides specimen policies for institutions to revise. The delegates were also interested in the Major Projects Governance Assessment Toolkit – effective governance reduces the risk of projects failing to deliver their anticipated benefits, or having major cost or time overruns.

So there are positive signs that risk and cyber security are being taken seriously. Care is needed, though, that cyber security is not just seen as an IT problem – people and processes are also important components in implementing effective information security measures. This is something that will be highlighted in the revised Information Security Toolkit – there is a need for senior management ownership and good governance in order for information security to be successfully managed. We also need to guard against IT only featuring at the top table for ‘problem’ issues – we need to work to ensure that the role IT can play in enhancing the student experience and delivering efficiencies is also understood by senior institutional managers.

Postscript – work is currently in progress on a revision of the Information Security Toolkit. It is anticipated that the new version will be launched at the UCISA15 Conference in March 2015.

2014 Technology Exchange – Day 2 by Matt Cook


 

Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG

 

FIRST Robotics Competition

The Monday evening welcome reception included a contest based on robots developed by high school students. The students were given six weeks to raise funds, design, develop, brand and program a robot – not an easy task! It was great to see such innovation from our students and colleagues of the future. I wish we had these opportunities back when I was at school; the best we experienced was BigTrak and writing Logo programs. However at least we were taught programming in BBC Basic, and not simply how to use the Microsoft Office suite.

2014TechExDay2

The USA is promoting Science, Technology, Engineering and Mathematics (STEM) subjects in a similar manner to the UK. It will be interesting to see how successful this initiative is in providing the education required for our fellow colleagues of the future and plugging the current skills gap. Talking to the students, they are extremely enthused about the creator, maker, hacker opportunities being given through these programmes.

This is another one of those opportunities which demonstrates the value in the jobs we perform in our respective organisations to support education. I recently undertook a job shadow of a technician in one of our academic schools at Loughborough, and it was one of the most eye opening experiences I had all year.  It was extremely valuable to see the challenges they face within the school, how central IT policy affects their work and the innovation and creative ideas being developed by their students. I would certainly encourage everyone to get out into the wider university more to put everything into perspective.

Central IT vs Research Perspective on Information Security

There was a very interesting panel discussion mid-way through the Tuesday schedule investigating the challenges faced by both the central IT function and research centres in managing Information Security. Rob Stanfield from Purdue University provided an overview of the provision at his organisation, and one thing that stood out was the scale of some of the US-based education organisations, which dwarf most of the largest UK universities. The scale of operation also brought an increased scale of staffing and, following a coffee break discussion, of budget too. Purdue are currently recruiting a Security Risk Analyst and see being better placed to advise on Information Security impact across their business as an important element of their future service.

There is a growing move to work with researchers to define strategy that allows Information Security to be an enabler and an active component in winning research grants. The panel all agreed that there was a need to form better relationships between research and central IT; something that I’ll personally be working on at Loughborough University over the coming years. There was an agreement that the era of silo’d departmental research email servers and wireless networks was not effective and the future is centralisation and collaboration. Closing comments focused on “…there is nothing like a good data breach to bring about change!” and “…some people are more concerned with IDS appliances than the balance of risk.”

Over coffee a number of people who attended the session were interested in the current UCISA activities to develop an Information Security Management System (ISMS) implementation guide and the update to the popular ‘Exploiting and Protecting the Network’ document; both set to launch in early 2015. Keep an eye on the UCISA website for more information!

As suggested, I will be posting details about these activities to the EDUCAUSE Security Constituent Group mailing list as well. This list may also be of interest to UK colleagues who are looking to get a wider perspective on Information Security concerns within global education organisations. Whilst the remit for security falls between both the Network (NG) and Infrastructure (IG) groups within UCISA, some readers of the blog may not be aware of the UCISA-IS Information Security mailing list. Although currently low traffic, it is a growing area of discussion.

For those with larger security teams, it may also be of interest to explore the TERENA TF-CSIRT group.

Privacy in Access and Identity Management

Dr Rhys Smith (Janet) delivered the final session I attended on Tuesday. I’ve not personally been involved in the Access and Identity Management (AIM) side of IT at Loughborough; however, I was eager to see what was on the horizon for Moonshot, especially what it can offer the research community. It was nice to see some friendly faces when I arrived at the conference: Rhys Smith, John Chapman and Rob Evans from Janet, and Nicole Harris from TERENA. I’ve also since met quite a few people I’ve previously spoken to by email or have seen posting on mailing lists.

Rhys gave a gentle introduction to AIM before describing how we should be adopting privacy by design, as it is so difficult to retrofit. As part of a privacy vs utility discussion, Rhys gave the example that routing IP network packets outside of the EU breaks EU data protection guidelines, as an IP address is deemed to contain personally identifiable information. Whilst this interpretation is simply unworkable in practice, the categorisation of IP addresses has caused some interesting consequences for our Computer Science researchers.

Following a narrative of the difference between web-based federation (SAML) and network-based federations (like eduroam), Rhys outlined the timescales for the Moonshot trial and official service. Being able to unify many technologies, from simple SSH through to Windows desktop authentication, opens many possibilities for secure research collaboration in the future.

Other Thoughts

There were lots of interesting conversations through the conference today about the development of common shared tools or building blocks to solve future challenges, from the infrastructure that supports eduroam through to the Kuali HE software suite. Many felt that through collaboration a better solution can be developed with less resource; however, there were concerns that high workloads in recent years had removed a lot of these opportunities for some.

Another common theme was the adoption of standards, rather than closed proprietary technology, avoiding vendor lock-in where possible and using the infrastructure as a live research aid for students within our organisations.

Learning Points

• Get out into the wider university to put your role into perspective;
• Turn Information Security policy and strategy into an enabler that wins research grants;
• Seek collaboration and closer relationships with our research community;
• Explore opportunities for privacy by design;
• Keep a watching brief on Janet Moonshot developments;
• Support the development of common shared tools and building blocks where appropriate.

Matt Cook

 

APIs, architecture and the Narwhal


 

Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG

 

2014 Technology Exchange – Day 1

Courtesy of the UCISA 21st anniversary bursary scheme, I am in Indianapolis, USA this week for the inaugural Technology Exchange conference hosted jointly by Internet2 and ESnet. Internet2 is the USA equivalent of the Janet National Research and Education Network (NREN) in the UK. ESnet provides specific high bandwidth connections to Energy Science research organisations across the USA and beyond.

If you have never been to a conference within the USA before, I’d certainly recommend taking the opportunity to experience a different scale of event. I’ve spoken at VMworld in the USA before where over 7,000 delegates attended the conference, which was orchestrated more like a music concert or sporting event; I was pleasantly surprised to experience a more personal 750 delegates for the first Technology Exchange conference. The same networking opportunities are provided with mini sessions starting at breakfast, multiple mini working groups ‘camps’, Birds of a Feather (BoF) sessions and both leadership and technical streams.

There are four main topics covered within the conference:

  • Security;
  • Trust, Identity and Middleware Applications;
  • Cloud Services; and
  • Advanced Networking/Joint Technologies.

As an inaugural event, I’m interested to see how it positions itself along with the Internet2 Global Summit, TNC and Janet Networkshop. I really value colleagues in the community who dedicate time to blogging thoughts from the events they are attending. Collectively it provides a rich resource and I’m pleased to be contributing to this through the UCISA blog over the next four days.

Opening Thoughts

The opening keynote was delivered by Harper Reed, who would not look out of place in one of the hipster cafes in the Wicker Park area of Chicago. This is by no means a coincidence, as one of his roles is CTO of Threadless, the crowdsourced printing company in an adjoining neighbourhood. Harper delivered an excellent opening keynote in a TED Talk style, highlighting many learning points from his technology career, including his time as CTO of the Obama for America campaign – remember the Narwhal?

Harper spoke about how we grow the talent pipeline and further develop the bright people in our teams. We often concentrate on the development of future leaders; do we pay enough attention to our technical talent pipeline? A stream of the conference is focusing on the diversity of our workforce and providing the opportunity to tell the story of our careers to date; would it not be interesting to hear how colleagues got to where they are today? The point was made that you should always hire people who are smarter than you and who are different to you: a sure-fire way to build a great team. A lot of the work Harper’s team developed on the Obama for America campaign was related to business analytics, turning the data obtained from the doorstep campaign into information, then knowledge, and ultimately wisdom for the micro-targeting marketing campaign.

Harper’s insights into the development of the architecture required to support this initiative describe a similar challenge to that raised in the UCISA Strategic Challenge Report 2013: “Supporting the use of analytics/business intelligence to inform decision-making and planning.” Architecture is key to success in this area, and Harper outlined the simplicity of making the same data available through straightforward API calls. On the one morning when the daily campaign bulletin failed to arrive, though, it was not a failed ‘cronjob’ as the team expected; an intern had simply not turned up for the shift to input the data.

At Loughborough, I have the pleasure of working with some extremely clever people who can code and build things that are beyond the reach of my BBC BASIC skills of the 1980s. In terms of visualisation, Harper mentioned StatsD/Graphite, which looked extremely interesting to me, so a quick Google search found an introduction that those of you who can code may find useful (a minimal sketch of the StatsD protocol follows below).

Some of the technology we promote within the community has a very long gestation period from inception through to fruition. Some technology doesn’t make it, but others become part of everyday life. Take eduroam, for example: 11 years in the making, it took a big push in the late 2000s for organisations to take it seriously, and now it is in commonplace use, including at the conference venue.
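
For those who wondered what StatsD actually involves, the protocol is simple enough to show in a few lines: metrics are fire-and-forget UDP datagrams of the form “bucket:value|type”, which Graphite then stores and graphs. The host, port and metric names below are placeholders, and a real deployment would normally use an existing client library.

```python
# Illustrative sketch of the StatsD line protocol: metrics are small UDP
# datagrams of the form "<bucket>:<value>|<type>" sent to the StatsD daemon
# (UDP port 8125 by default). Host and metric names are placeholders.
import socket

def send_metric(bucket, value, metric_type="c", host="127.0.0.1", port=8125):
    """Fire-and-forget a single metric at a StatsD daemon."""
    payload = "{}:{}|{}".format(bucket, value, metric_type).encode("ascii")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(payload, (host, port))
    finally:
        sock.close()

# Count an event, and record a timing in milliseconds:
send_metric("doorstep.responses", 1, "c")
send_metric("bulletin.build_time", 420, "ms")
```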

I was at an EMUIT (East Midlands Universities IT) Operations meeting a week ago and was pleased to hear a colleague explain that they ‘required’ IPv6 to be operational on their site to win a research contract; in a similar vein, Harper explained that he ‘required’ the cloud in order to develop the architecture to support his work. Sometimes we are blinkered by the architecture we have always had, supported by the resources we have always had, doing things the way we have always done them. There are opportunities to think differently; there were a couple of Apple references in the talk, but I genuinely believe there are opportunities to approach infrastructure in a different way.

When the UCISA bursary call for interest was released, I was originally going to submit a request to attend the “AFRICOMM 6th International Conference on e-Infrastructure and e-Services for Developing Countries” conference in Uganda. I can see there is a lot of potential learning in how to do things differently in challenging situations. As I was still recovering from a rather physically challenging broken ankle sustained in last year’s snowboarding season, I thought I’d play it safe and travel to the USA instead. I’d certainly watch the African NRENs with interest after hearing about some of the innovative work they are undertaking at a previous TNC conference.

The final point I wanted to make about Harper’s presentation is around failure; he was proud to announce, “We practiced failure for over four months”. The learning points from understanding and embracing failure are great, yet they are often swept under the table rather than celebrated.

Learning Points

• How do we grow our technical talent pipeline?
• Designing the architecture to support analytics/business intelligence;
• Sometimes technology innovation has a long gestation period, be patient;
• Find opportunities to think differently about architecture;
• We should all train for failure to understand success.

Matt Cook

Meeting the accessibility challenge

I attended a session at the Educause conference on accessibility. This has become more of an issue in the US as a number of universities have faced litigation because of their lack of compliance with disability discrimination legislation. The number of cases is, in the overall context of the US education industry, relatively small, but the size of the awards made against institutions has made some university executives nervous and has driven moves towards greater compliance.

Temple University was one such institution. The University Board set a project in motion to review the current level of provision and take the steps necessary to comply with disability discrimination law. The initial analysis showed that Temple were not compliant with many aspects of that legislation – essentially in the same boat as many other institutions. I suspect that this is much the case in the UK too – there is some awareness of the disability legislation but not of what is required in order to comply.

However, Temple’s Board sought to address this, recognising that they needed to tackle the problem on a number of fronts. It was necessary to define the policy for the institution but then follow it through so that considering accessibility started to become business as usual. A broad-based committee was established to oversee the project. Led by the CIO, it included representatives from the service departments but also Estates and the institutional counsel. The policy the group established was clear – we will be accessible. Responsibility for accessibility was devolved to the person providing the technology or information – so faculty were responsible for ensuring their materials were accessible and heads of service were responsible for ensuring compliance in their areas. ‘Will’ became the watchword – where there were items that could not be made accessible, those responsible were challenged to think of another mode of delivery or whether the items were necessary at all.

After the initial audit, Temple appointed departmental liaison officers who were responsible for promoting the accessibility message within the department, ensuring departmental accessibility initiatives were funded and evaluating accessibility during the procurement process. The group established standards for web services, learning spaces and IT labs, each bearing in mind the principle that accessibility should be standard provision, not the exception. Checklists were prepared to assist faculty in assessing their materials. Once the preparation was complete, the CIO promoted the policy and available support to a wide range of institutional groups through a series of roadshows.

There were some quick wins once the policy began to be implemented. The largest and most used IT labs were upgraded first bringing an instant return. Web accessibility standards were introduced and processes established to ensure compliance. Control panels in smart classrooms were upgraded. However, not everything gave so rapid a return. Although the processes were in place to ensure the web sites were compliant, adoption was slow. The guidelines for instructional materials took over 12 months to complete and a larger group was established to review and amend them as required. The initiative wasn’t cheap – Temple spent over $600k in their move towards compliance.

Not all institutions in the US had followed the same road – some opted to steer clear of even establishing an accessibility policy as they felt that doing so would put them at greater risk of litigation. I suspect the reverse is true – if you have a policy in place and plans to implement it then I believe you are less prone to litigation, as you have recognised that you have a problem (in not being compliant) and are taking steps to address it. I wonder how compliant UK institutions are with the Disability Discrimination Act. My gut feeling is that there probably aren’t that many that are fully compliant. Will it take litigation in the UK to change that?

Campus Computing Survey


Julie Voce
E-learning Services Manager
Imperial College London
Chair, UCISA-DSDG Academic Support Group

Wednesday at Educause

Wednesday’s highlight from Educause was the session by Casey Green on the Campus Computing Survey of US higher education institutions (HEIs), now in its 25th year. Reminiscent of the UCISA Technology Enhanced Learning (TEL) survey of UK HE, the Campus computing survey is sent to all US HE institutions and aims to find out what they are doing with IT, including learning and teaching technologies. Using the UCISA TEL survey as a comparator, I will review some of the data presented by Casey.

In the 2014 Campus Computing survey, 470 institutions responded, of which 70% had completed the 2013 survey, thus allowing for longitudinal analysis. There was also a good representation across the different types of institutions, e.g. private universities, community colleges.

Virtual Learning Environments (VLEs) are prevalent across the sector, with Blackboard the leading VLE solution in US HEIs, as is the case in the UK. It was noticeable that Moodle’s market share, at around 20%, was lower than in the UK, with other players such as Desire2Learn and Instructure having more prominence in the US. The trend for reviewing VLE provision is common to both nations, with 64% of US HEIs reporting plans to review their current VLE. It would seem that institutions are now regularly reviewing the VLE to ensure it continues to meet their needs.

Cloud computing is increasing in importance with 29% of US HEIs now having a strategic plan for cloud computing. Outsourcing of VLE provision is more prevalent in the US, with 47% of US HEIs hosting their VLE in the cloud, compared with only 33% of UK HEIs. However this is a growth area in the UK with more institutions considering outsourcing VLE hosting.

Lecture capture has gained in prominence in the UK over the past few years, with 63% of institutions currently supporting a central lecture recording solution. In the US, 80% of HEIs reported that lecture capture is an important part of their campus plan for delivering instructional content; however, only 11% reported that they have a written policy governing how students may record classroom lectures, presentations and discussions. This type of policy may not be necessary where institutions provide a centrally supported lecture recording solution.

The MOOC bubble appears to have burst as fewer US HE institutions feel that MOOCs are a viable model for online education and offer viable business models. The UCISA TEL survey also reported some indifference to MOOCs, which were ranked lowest as a driving factor for TEL in the UK, despite being cited as a future challenge for several Russell Group institutions.

From an IT perspective, there were some potentially alarming results, for example 23% of US HEIs do not have a strategic plan for network and data security and 32% do not have a plan for IT disaster recovery.

Considering student use of the internet, it was interesting to note that 24% of institutions felt that students who use excess bandwidth (greater than 20 GB per week) should be charged for usage.

When it comes to user support, 74% of US HE institutions cited this as a top priority; however, the general feeling from the survey was that IT training for students and staff is not being done well. It will be useful to see whether the results from the UCISA Digital Capabilities survey reflect this as well.

Overall this was an interesting study and a good opportunity to see how the UK compares with the US.

Julie Voce

Preparing to Wear – this is the year of wearable technologies


 

Sally Bogg
IT Help Desk Manager
University of Leeds
Member of UCISA-SSG

 

On Thursday morning I attended a session on wearable technology. There was much interest in this as a topic and there was standing room only. The aim of the session was to facilitate a general discussion on the ways in which wearable technology will reshape the teaching and learning environment and the potential impact of wearables on interactions between students and facilities.

What does wearable tech mean for universities? Whilst we continue to grapple with BYOD the new wave of wearable tech brings new meaning to BYOE –  Bring Your Own Environment, or even BYEB –  Bring Your Own Enhanced Body, as it is anticipated that within the next couple of years wearable tech will become invisible and be worn under the skin.

We are now a society that is obsessed with sharing, and we are entering a period where there is a tidal wave of data and images; in fact, we shared 1.8 billion images last year alone, and the majority of us now use our smartphones for anything but making phone calls.

Technology cycles usually last about 10 years; wearable tech will be the next cycle, and it is likely to land on our campuses whether we like it or not. Some of the devices that were discussed were familiar to me – Fitbit and Google Glass – but there were a few that I had never heard of, including Narrative Clip and Lechal Shoes (sat nav shoes that tell you where to go!).

However, wearable tech such as Google Glass does throw up some issues around privacy. A comment from someone at the conference wearing a Narrative Clip opened up a very heated debate on Twitter. The general consensus was that new social conventions will need to be adopted in order to make wearable tech socially acceptable. I guess as organisations we must now decide what to do about wearables: ban them, allow their unfettered use, or allow their use in certain circumstances by certain people with specific guidelines.

There was then a short discussion on what some of this wearable tech could mean for future learning and teaching. Some good examples were given of augmented reality and virtual reality being used to support more traditional classroom methods; and whilst only relatively small numbers were currently exploring the use of wearable tech for teaching and learning, there was lots of interest.

There is no doubt that wearable tech will change education and change it soon. In the next couple of years it is expected that learning simulations using augmented reality could start to replace direct instruction and work place learning in some subjects. I found an interesting article about tech and education  which is worth a quick look.

This session didn’t really provide me with any conclusions but it did give me a bit more of an insight into the type of devices we can expect to be supporting very soon. I don’t think it will be long before we see our first Google Glass at the IT Help Desk.

Sally

 

Agile at the IPMA conference

Simon Hogg
IT Portfolio Manager
OBIS (Oxford Brookes Information Solutions), Oxford Brookes University
Member of UCISA-PCMG

 

Another packed schedule of talks and sessions – again a difficult choice and there’s not much information to help you choose, so a bit of a lottery (although all talks are grouped into streams, which is good). I have tried to pick and choose rather than concentrate on a stream, but at the same time, try and get the best value.

Agile

My first session today was about Agile and how it can and does deliver success. What I didn’t know was that the speaker was a co-author of the Agile Manifesto. Quite a few questions were around the type of projects it can be used on and the scale. His response was that it can be applied to any project, even bringing new things to market such as margarine. As for the scale, he didn’t seem to think this was a problem at all. The room seemed to be split 50:50 between those who knew Agile and those who didn’t, so his answers did raise a few eyebrows amongst the perhaps sceptical half of the room.

The next session was a practical Agile session. The speaker gave a very quick overview of Agile. We were split into small teams and given the brief of producing a children’s book in 45 minutes. We were given guidance and the 45 minutes was in fact a week compressed, so we had 5 minute sprints and 1 minute scrums. My Agile knowledge is limited, but I have a grasp of the basic concepts. However, my other team members didn’t have any idea about Agile other than the brief overview. I also think nationalities played a part, as one person more or less refused to participate. So we did this and yes, we did produce a crude book; we had access to some paper and felt-tip pens. It did illustrate the point that you can get complete strangers to collaborate. I think for Agile to work, you need 100% commitment  from everyone on the project, along with 100% understanding of what Agile is and how it does work in practice. For some people, the communication and interaction would be a challenge.

My next two sessions were in the communications theme. The first was a bit of a shameless plug for “my new book, which is available at the conference”. I can only say that I picked up a few useful tips: pause in communication, make sure your communication is truly understood and have a repertoire of communication skills. None of these were really explained, but we were told that we could read about them in her new book.

The next session was about humour and its effect on people in the workplace. A very good presentation with lots of humour of course. Lots of images, lots of suggestions and university based research to show how humour does actually improve the workplace. Not a sales pitch at all, even though they were a consultancy firm.

A brief conclusion

It’s been very tiring, mentally and physically. Three days of talking about and listening to the many aspects of project management and its associated threads.  I only wish I could have attended more talks than I have, but as the programme had no duplicate sessions, I couldn’t.  The venue was excellent as was the organisation of the conference, which you would probably expect given the nature of it. Out of all the talks I’ve been to, the one on humour, coupled with the opening keynote address about luck, could be the most valuable. This is not what I expected at all, given the programme. However, that doesn’t mean everything else was not of use, it has all been useful as  the standard has generally been very high.

Would I attend again and would I recommend it? Yes on both counts, but with the knowledge that it is demanding.

Simon Hogg

Project Management sessions at Educause

Sally Jorjani
Project Manager
Edinburgh Napier University
Member of UCISA-PCMG

 

My first session at Educause was the full-day seminar on “Develop IT Governance, Portfolio, and Project Management Processes to Govern Execute and Measure Projects”. This was run by the University of Illinois, who told the story of their implementation of IT Governance, how it has evolved through practice, and the challenges they faced.

So why is IT Governance important? Below are some of the University of Illinois’ thoughts:

  • Provides clearly defined and repeatable process for decision making
  • Provides transparency as to how decisions are made
  • Ability to measure project/service performance to budget/schedule and success against objectives
  • Ensures that IT Projects and resources are aligned towards the Strategic Plan
  • Enhances opportunities for shared use, reuse, integration, and interoperability of technologies

Key to success with IT Governance is communication, transparency and capturing how decisions were made and their path through the process.

Further points of note from the Portfolio & Project Management section:

  • You cannot plan in a vacuum – need to involve stakeholders
  • You need a senior champion to drive the process
  • Ensure you have the right people involved at the right level
  • PMO helps manage the schedule and resources, it is pivotal in the success of IT Governance
  • PMO is the centre of excellence to help and assist staff with their projects
  • PMO should have a toolbox for staff, such as templates, ideas, advice, and so forth
  • Standardise and make consistent
  • Ensure that the flow of customer requests is controlled, prioritised and transparent
  • Periodically re-evaluate the process and adjust as necessary
  • Train your staff in Project Management
  • Staff who have a passive aggressive resistance to change need to be managed
  • Buy in from staff very important; listen to what they are saying, especially those who “grump”
  • Get “on the road” with IT Governance within your institution – communication and explanation

And a final point, which was well made:

“A Project Manager’s mission requires courage and good communication (plus a lot of work).”

Throughout the day there was ample opportunity to speak to and learn from others at my table about their experiences, as well as to listen to others through questions and answers. I found this part very interesting, realising that the issues we face back at our institution are similar to those all over the HE sector, no matter where you are from, and also that we may be slightly further ahead with our implementation of a Project Management Office.

At the end of the session, the speakers were asked whether they review their process and what they would do better. The answers were to build trust through success, to chunk down projects and to say ‘no’ more often, as projects can languish at the bottom of the pile when really they should just be canned. Then, much to my surprise, the presenter said that what she would like to do better was to follow our lead in having a Risk Register which is rolled up across all projects to the PMO!

I felt that this seminar helped validate that our Technical Services Governance is not far off the mark and with a bit of tweaking and consolidation there will be a model which can be rolled out to the rest of the University.

The excellent resources of templates and forms in my booklet will be put to good use upon my return, incorporated into our process and PM training.

Sally