
SharePoint migration from MySites to OneDrive for Business

Tristian O’Brien
SharePoint Technical Specialist
University of Brighton

Blog entry syndicated from my other blog, which runs on Ghost.

I maintain a set of PowerShell scripts and processes to migrate many MySites from SharePoint 2010 to OneDrive for Business.

As we know, PowerShell can automate many processes that you would otherwise perform through the user interfaces of SharePoint, on-premises or in the cloud.

So the general idea (a condensed PowerShell sketch follows the list) is to:

  • use a mapping file with at least two columns. Column A is the Windows on-premises username. Column B is the Office 365 (O365) login. I also have a third column, the destination OneDrive URL, but this can usually be derived from the O365 login by escaping any ‘@’ or ‘.’ characters as ‘_’
  • populate this file or database table with the users that you want to migrate
  • using PowerShell, iterate through this list and:
  • set the user’s on-premises MySite to read-only – I upload a separate master page and change the page status for this
  • in O365, assume the user is set up, licensed and provisioned. We use an account that has global admin rights in O365.
  • in O365, make sure that the global admin has access to the user’s OneDrive by adding the account as a secondary admin
  • use ShareGate PowerShell to migrate the data. I know this is a cheat, but there are many reasons to use ShareGate, such as insane mode, using Azure Storage and logging. Here are some other thoughts on Azure Storage.
  • when content has migrated successfully, timestamp the on-premises user profile with a date-migrated value – later on we deploy timer jobs that delete or recycle after a specified time period.
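To make the loop concrete, here is a minimal sketch of the process. The ShareGate cmdlets (Connect-Site, Copy-Content) and the SharePoint Online cmdlet Set-SPOUser are real, but the URLs, CSV layout, account names and the success check are placeholders rather than our production code, and the read-only step is shown with Set-SPSite as an alternative to the master page approach described above.

```powershell
# Minimal sketch only - paths, URLs and account names are illustrative.
Import-Module Sharegate
Import-Module Microsoft.Online.SharePoint.PowerShell

$adminUpn = "spadmin@contoso.onmicrosoft.com"      # O365 global admin (placeholder)
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Mapping file columns: OnPremUser,O365Login
$users = Import-Csv "C:\Migration\mapping.csv"

foreach ($u in $users) {
    # Derive the destination OneDrive: escape '@' and '.' in the login as '_'
    $escaped  = $u.O365Login -replace '[@.]', '_'
    $oneDrive = "https://contoso-my.sharepoint.com/personal/$escaped"
    $mySite   = "https://mysites.contoso.ac.uk/personal/$($u.OnPremUser)"

    # 1. Set the on-premises MySite to read-only (run on the SP2010 farm; an
    #    alternative to the master page approach described in the post)
    # Set-SPSite -Identity $mySite -LockState ReadOnly

    # 2. Add the global admin as a secondary admin on the user's OneDrive
    Set-SPOUser -Site $oneDrive -LoginName $adminUpn -IsSiteCollectionAdmin $true

    # 3. Migrate the content with ShareGate
    $src    = Connect-Site -Url $mySite
    $dst    = Connect-Site -Url $oneDrive
    $result = Copy-Content -SourceSite $src -DestinationSite $dst

    # 4. On success, record a date-migrated value against the on-premises user
    #    profile (the property name and success check here are placeholders) so
    #    the timer jobs can delete or recycle the MySite later
    if ($result.Errors -eq 0) {
        Write-Host "$($u.OnPremUser) migrated $(Get-Date -Format s)"
    }
}
```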

Take a sneak peek at https://github.com/devacuk/UoBMigration. This is some code that I prepared for the dev.ac.uk event co-hosted by UCISA and Jisc in February 2018. Slides are available here. Much of the knowledge I accrued in order to do this is a result of being awarded a UCISA bursary that paid the costs of travel, the conference entrance fee and accommodation for Microsoft Ignite 2017. For blogs on Microsoft Ignite, click here.

I strongly suggest that if you work in IT for a UK higher education institution, you apply for the bursary yourself. Where this particularly helped is that I attended sessions about the latest developments in PowerShell, the tooling and Office 365. I found it particularly valuable to meet engineers from Microsoft Azure, ShareGate, Metalogix and other vendors of migration products.

The only downside is that it is a corporate event, so one particular query – about how and when any throttling of content into and out of Office 365 may occur – didn’t really get an answer from Microsoft. I guess this kind of detail is a trade secret, and I get that.

ShareGate offered some good advice from their experience with organisations far bigger than my institution: if you use their tools to migrate manually, use different tabs for different migration tasks, and if you use them in their PowerShell guise, split the job up. Managing that particular task would be a challenge in terms of organisation, though. You could containerise the server using, say, Docker, but I suspect ShareGate licences would be needed for those individual containers.

Another aspect of Ignite is the sheer scale of it. I had planned to attend various sessions, but this wasn’t always possible due to sessions being placed far apart, occasional overcrowding and the warm weather. If I went again, I would be prepared for that.

This blog post also appears on http://blogs.brighton.ac.uk/tristianobrien/

Interested in finding out more about a UCISA bursary? Then visit the UCISA Bursary Scheme.

UCISA welcomes blog contributions and comment responses to blog posts from all members. If you would like to contribute a new perspective or opinion on a current topic of interest, simply contact UCISA’s marketing manager Manjit Ghattaura via manjit.ghattaura@it.ox.ac.uk


The views expressed on UCISA blogs are the authors’ and do not necessarily reflect those of UCISA

Interview: Deakin University’s AV solutions

Ben Sleeman
Service Development Assistant
University of Greenwich


AETM Conference 2017 and university visits, Melbourne, Australia

Ben Sleeman was funded to attend this event as a 2017 UCISA bursary winner

As part of the UCISA bursary scheme, in November I attended the Audiovisual and Education Technology Management (AETM) Conference, held at the University of the Sunshine Coast, Australia. I also visited five universities in Melbourne, including Deakin University.

While visiting Deakin University, I was able to interview Jeremy West, Senior Audio Visual Engineer and Tech Lead in the eSolutions Team. He kindly answered questions about a range of topics, including how new audiovisual technologies are coming to Deakin University and how these technologies converge with other IT solutions.

In the interview, Jeremy talks about how Deakin University is moving its traditional audio/video distribution from HDBaseT to over-IP solutions. He also talks about moving to cloud control for AV teaching spaces and using the analytics that come back from these systems to improve the academic user experience. Collaboration with their network engineering and systems teams has been key to this move to IP AV solutions.


I will be blogging about my further conversations with Jeremy on other AV developments at Deakin.


Interested in finding out more about a UCISA bursary? Then visit the UCISA Bursary Scheme.

Cloud services mini-toolkits

There is increasing use of cloud-based services in project and change management – tools such as Trello, Skype and Doodle, often used in conjunction with Google Apps. The PCMG Committee has developed a range of mini-toolkits to help people use these services more effectively.

These have been made available through Google Docs to allow downloading for local use and to let colleagues suggest improvements, keeping the documents current and relevant. Our thanks to the University of Sheffield for their initial work in developing these documents.

Posted on behalf of Simon Geller, Joint Vice-Chair UCISA Project and Change Management Group

Compliance in multiple repositories

Sara Somerville
Information Solutions Manager
University of Glasgow


Feedback on a content management session at the AIIM conference

Content Whack-a-Mole: Keeping Up Compliance across Multiple Repositories – presentation by Michyle LaPedis and Jordan Jones from Cisco Systems

Multiple tools are popping up that enable users to create, share and manage documents, and these were challenging the traditional repository set-up to the point that users didn’t know which tools to use when. The team realised they needed an overarching strategy to address this issue.

One of the other problems Cisco had was that the search tools were returning a lot of ROT (Redundant, Obsolete and Trivial data), and if users couldn’t find what they needed they tended to create the document again.

Content Management IT at Cisco focuses on an open architecture with open source and open standards. They have three major on-premises systems and are currently implementing a project to migrate documents into one system/repository as a focus for records management, and to enable the application of lifecycle management to that content. (Cisco also use Box as their organisational cloud-based document management and collaboration tool.)

Some of the steps the team took to improve the situation included:

  • Rationalising their on- and off-premises services and adding a compliance layer
  • Creating a content management program management office, to ensure that proposals for any new IT tools came through this office for approval
  • Defining an ILM (Information Lifecycle Management) strategy for their documents and identifying their repositories
  • Creating a dashboard for users to move documents from the cloud (or any other location) into the repository they had created – cloud offerings often mean less control and make it harder to manage and delete content
  • Sending out monthly emails about which records need to be deleted, with reminder emails every month after that; if users don’t take action after six months, the data is deleted (a hypothetical sketch of this cycle follows the list).
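As an aside, here is a purely hypothetical sketch of how such a reminder-then-delete cycle could be automated. The record source, owner field and mail settings are invented for illustration; this is not Cisco’s implementation.

```powershell
# Hypothetical reminder-then-delete cycle; the CSV source, owner field and
# mail settings are invented for illustration.
$graceMonths = 6
$cutoff = (Get-Date).AddMonths(-$graceMonths)

# Records flagged for deletion, e.g. columns: Path,Owner,FlaggedDate
$records = Import-Csv "C:\Compliance\flagged-records.csv"

foreach ($r in $records) {
    if ([datetime]$r.FlaggedDate -lt $cutoff) {
        # No action taken within the grace period: delete the record
        Remove-Item -Path $r.Path -Recurse -Force
    }
    else {
        # Still within the grace period: send the monthly reminder
        Send-MailMessage -To $r.Owner -From "records@example.com" `
            -Subject "Action required: records pending deletion" `
            -Body "The record at $($r.Path) will be deleted unless you act." `
            -SmtpServer "smtp.example.com"
    }
}
```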

Some of the issues they have encountered:

  • New platforms do mean new issues
  • Changing personnel means sometimes starting over – but hopefully there is some hand-over
  • There is a code of conduct which states that it is an employee’s responsibility to manage their information responsibly
  • They have started to phase out network file shares by making them read-only; they will then move the documents over to the other approved repositories.

Key takeaways:

How do you win the game (Whack-a-Mole)?

  • Remember the game will never end
  • It’s important to have a strategy and for records and information professionals to work with IT to implement it
  • There is always going to be a new technology, so it’s important to get the process and the people part working together.


Some reflections on the UCISA Bursary and Educause


Simon Geller
Senior Project Manager
University of Sheffield
Member of UCISA-PCMG


I was very pleased to win a bursary to attend Educause 15. On reflection, however, I’m not sure that this is the best conference for bursary applicants to apply to.

So, what are my reservations? Well, it’s a very large conference, and therein lies the problem. It was hard to pick out presentations that could be relevant to my role, varied as it is, and I’d say my judgement was about 50% correct.

With the plenary sessions, of course, there were no choices to be made other than to get up and be ready. These large events were very professionally presented, although the topics were highly generalised – I think the conference could have had more of them, with speakers who had a strong overview of ICT in HE.

So how was the bursary of benefit to my professional development, to my institution, and to the HE IT community? The key thing I brought home was that my colleagues in the US are facing the same problems as we do in the UK – institutional inertia, resistance to change, ever-reducing budgets and ever-higher workloads, with a failure of senior management either to defend the industry or to bring in the kind of far-reaching changes that would enable us to adapt more quickly to changing circumstances, whether that is the political landscape or technological advances. However, my US colleagues didn’t seem to have the answers to these problems any more than we do.

Coming from the University of Sheffield, I found the slow rate at which US institutions had embraced new technology quite striking. Technologies such as Google Docs, which we have been using for years, seemed like strange new innovations to many people. This is, however, not much different from UK institutions, many of which are still dependent on legacy systems for their core services.

I also learnt that interest in “sustainable IT” is on the wane. To an extent, this is because sustainability has become more embedded in the industry – personal devices and data centres have become more efficient, while cloud services, which give institutions the opportunity to off-load their carbon footprints onto the provider, tend to be more energy-efficient than locally provided ICT services.

On reflection, therefore, I think it would be better to encourage colleagues to apply for bursaries to attend conferences that focus on their specialised areas, rather than big, generalised conferences.

Interested in applying for a UCISA bursary? Then visit UCISA Bursary Scheme 2018.

Benefits of a UCISA bursary – six months later


Allister Homes
Senior Systems Architect
University of Lincoln


I have attended a number of HE-sector EA events over the past few years, and applied for the UCISA bursary hoping that the Gartner EA Summit would help me learn more from experts outside the HE sector, and perhaps help me to consider different perspectives. I didn’t see official figures, but I estimated that there were roughly 400-600 attendees. The same summit also takes place in the USA on different dates (with, I would imagine, an even larger number of delegates). As you would expect, there were a lot of sessions running in parallel, so it was impossible to get to everything, and I cherry-picked what I thought likely to be the most interesting and useful sessions.

It wasn’t surprising to find that the EA practice of universities is more modest than that of a lot of the other organisations represented by delegates at the conference. I mentioned in the reflections blog post that there was often an unvoiced assumption that delegates were part of teams with significant numbers of architects and developers, with suggestions such as “when you get back, why not assemble a small team of five people to go and investigate X, Y and Z”. It’s good to see how EA is being done outside the sector, but equally important to remember that we need to use it appropriately, learning and adapting from billion-pound organisations rather than hoping to replicate them.

I found the summit helpful to maintain my thinking as an architect on how the architecture we implement now can support the changes that we will need to implement in coming years. Nobody knows exactly what these changes will be, but nonetheless we need to make the best decisions we can now in order to be flexible for whatever change comes along later.

Cloud maturity

Gartner’s views on cloud maturity were interesting and seemed sound. Things such as breaking through vendor fog and hype to get the real information about what offerings are available, the fact that many vendors now offer new services as cloud first, the need to frequently update cloud strategies, and the fact that it’s a case of the “degree of cloudiness” rather than whether to take a cloud approach or not, all ring true.

There was useful insight into changes that Gartner analysts expect to see over the next few years. Information about strategic trends was also interesting and useful background to keep in mind when considering enterprise architectures. So too was the session on making sure the architecture is flexible enough to respond to business moments as rapidly as possible; in a setting such as HE, I think getting to the point of the institution’s architecture being flexible is itself a significant undertaking that will take a long time to achieve, and has to come about gradually, but with deliberate direction, as things are introduced, removed and changed.

Software architecture

In retrospect, I’d categorise several sessions as being about software architecture rather than enterprise architecture; for example, more than one session looked at designers splitting applications into smaller applications and using micro-services for massive web-scale SOA.  Cases in point included Netflix and Facebook, but I think the enterprise architect would be more interested in the services Netflix delivers, how it interacts with other services and how people interact with it, than the detailed software architecture of how Netflix works internally.

Networking

Unlike many of the HE events I’ve attended, I didn’t make any useful contacts at the conference with whom I could occasionally keep in touch to share information. I mentioned in the reflections blog that conversations appeared to be mainly limited to people who already worked together, and a bit of people-watching seemed to reveal that others who, like me, tried to strike up conversations with ‘strangers’ didn’t get much of a flow going. This may well be the norm for a large conference with people from diverse organisations, the vast majority of which would be profit making entities less inclined to openly share.

Attending the summit has not fundamentally changed what I (or we at the University) do or how I think, and it’s a conference that would be useful to attend every two or three years rather than annually, but overall it was beneficial and an interesting experience.

Perhaps one of the most thought-provoking things was Gartner’s estimate that by 2017, 60% of Global 1000 organisations will execute at least one revolutionary and currently unimaginable business transformation effort. Of course, there are no universities in that list, but I wonder what proportion of universities will undergo such a transformation by 2017, and what that transformation will be.

Interested in applying for a UCISA bursary? Then visit UCISA Bursary Scheme 2018.

Educause – the final day


Simon Geller
Senior Project Manager
University of Sheffield
Member of UCISA-PCMG


The final session I attended was on preparing your organisation for the Cloud. It was noted that most organisations were already in the Cloud to some extent. A question was raised – ‘what does an IT Director actually do?’ – something I’m sure we’ve all asked ourselves.

The last general session was an inspirational talk from Emily Pilloton of Project H Design, who has found some exciting new ways of teaching kids how to build things. It was a great way to remind ourselves of what the business we’re in is all about – sometimes, as we plough away in our chosen furrows, this can be forgotten.

All in all, a very interesting conference; thanks to UCISA for making it possible for me to attend.

Enterprise architecture and digital business


Allister Homes
Senior Systems Architect
University of Lincoln

Gartner EA Summit Day 1

There are attendees at the summit from all over Europe, and it’s probably fair to say that our EA practice in the HE sector is somewhat more modest than that of a lot of the other organisations represented here; I think that would be the case for almost any HE institution that came, not just Lincoln.

I haven’t seen any information on the number of delegates, but I’d estimate there are somewhere in the region of 400-600, and it’s quite a busy summit in terms of both the number of people and the number of sessions. I thought the most straightforward way of blogging this would simply be to summarise each session I attended, so here goes.

Architecting the digital business ecosystem
As Gartner’s opening keynote session, this started by concentrating on the fact that more and more businesses operate in the digital world – not just having a presence, but offering their services and interaction digitally. Gartner’s figures are that 22% of businesses operate in the digital world now, and that by 2020 the figure will be 83%. Digital business will mean business growth, and those who do not embrace it will lose out to those who do. We heard about the seismic shift happening, caused by the Internet of Things and what Gartner calls the ‘Nexus of Forces’.

It is estimated that by 2020 the number of Things connected to the internet will outnumber people by at least five to one. We heard a story of how a connected home might, in a few years’ time, react to a tree crashing into the bathroom during a storm while the owner is away: cutting off electricity and water to protect the property, contacting the insurance company to start arranging quotes, passing the information to the police and ensuring the property is safe and secure. As enterprise architects we need to be aware of new technology and innovations so that we can become ‘vanguards’ and shape their use in the enterprise, which will become continually more consumer-focused rather than enterprise-user-focused.

We analyse too much and synthesise too little

This session focused on trying to change the way EAs think: rather than always relying on analysis, the suggestion was that we rely on synthesis a little more. We were told how analysis does not work well when there is no data, when there is ambiguity and when there are dilemmas, and were given a short history of synthesis, starting with dialectical reasoning. Some of the simpler examples of where synthesis can help were how to cut costs in a particular banking scenario (don’t rely on the distinct possibilities provided by analysis) and the ‘chicken or beef’ question on aeroplane flights (either ask for both or eat before boarding!).

The state and innovation: from fixing markets to market making

Professor Mariana Mazzucato from the University of Sussex presented on innovation-led growth and the role of the public vs private sector. She described the cartoon image many have, which assumes all innovation happens in the private sector, whereas the public sector is rational, boring, full of dinosaur bureaucrats, and exists (in terms of innovation in markets) simply to level the playing field. However, she went on to explain that science is simply not funded enough without state support, which is needed to fix market problems. In fact, public sector funded innovation is often mission-oriented, such as sending mankind to the moon, and massive public sector innovation and investment has led to much of the technology that makes things like smartphones possible – think of GPS, voice recognition, touch screens, cellular communication, signal compression and more.

What can sometimes be forgotten in the public sector, though, is to apply what is taught and used elsewhere, such as the approaches taught in business schools and the lessons that can be learnt from great innovators. One particular example highlighted was that of higher-risk green investment opportunities, which are starved of private funding. In Germany in the early 2000s, when private banks in Europe were reducing loans and the economic crisis was in its early stages, the public sector there provided substantial mission-oriented funding for environmental and climate protection innovation.

Application architecture for digital business

This session, delivered by a Gartner analyst, concentrated on new approaches to application architecture for delivering business digitally. It was emphasised that the focus should first be on business outcomes, which are then converted to technical goals, which lead to technical characteristics, then principles and then patterns. Most organisations are still using three-tier (or n-tier) client/server architecture, which is typically designed with static architecture and linear flow in mind. The approach does not work so well with consumers using various devices and with business that needs to change rapidly, and so an application architecture of apps and services was suggested instead. This takes service-oriented architecture several steps further: it encourages the use of individual apps for particular personas, devices and scenarios rather than one large client application to do everything, uses services to support the apps, and encourages a many-to-many relationship between apps and services. In this scenario services are broken down much more than they typically are in most environments today, becoming microservices for web-scale SOA.

Examples were provided of how the likes of Netflix, Facebook and Amazon use microservices to scale to massive numbers of concurrent users, with the agility to make hundreds of updates per day to production systems by having very specific microservices which are independently deployable and scalable. The difficulty is that although they provide separation of concerns and are individually easy to understand, you end up with a LOT of services, and have to radically alter the rest of the architecture because there is often no centralised application or data. Third normal form is out the window with this architecture, and a lot of what was done as server code becomes client code again.

Nissan Group: using enterprise architecture to enhance efficiencies and business growth

Enterprise Architects from Nissan presented on their use of MEGA at the Sunderland plant. Nissan had a lot of legacy applications and infrastructure at Sunderland, but not necessarily a good corporate understanding of all of it. Three main drivers led to the need to change this: a security incident, the transfer of some service to third parties, and a corporate objective from Japan to understand and standardise business processes. A project was launched to document 500 applications, the central technology infrastructure and the business processes, and to link them all together into a single EA model. A team of five full-time staff were set the task of doing it, and although timescales turned out to be a little too ambitious, much of the work is now done including the development of an application portfolio outside of the MEGA model and the creation of a bespoke metamodel.

The cloud computing scenario: examining cloud maturity and adoption in the age of digital business

This session looked at the adoption of cloud by businesses, how to make assessments and what to consider in a cloud strategy. Gartner’s analyst explained that cloud delivery options are becoming more varied and complex, leading to a “spectrum of cloudiness” that EAs need to understand in order to make the right decisions for the business. It’s not just the delivery model that needs to be considered (public cloud, community cloud, virtual private cloud and private cloud) but also the execution model, for example whether hardware and software are isolated or not. Cloud offerings are still changing quickly, for example by making improvements to security models, and although maturity is growing it is too early to put a final cloud strategy in place; a strategy is needed, but it will need to keep pace with the constantly maturing cloud technologies and offerings. Vendors sometimes complicate this, and an EA needs to be able to break through the vendor ‘fog and hype’ to understand what is really being offered.

It was emphasised that, whether we like it or not, many (not all) vendors now go cloud-first with new solutions rather than traditional software first, which means the decision is shifting from ‘whether and when to go cloud’ to ‘how, and how much’. The reasons for moving to cloud solutions are not always cost-based; there is value in agility, the offloading of responsibilities, and the advantages that provider investment brings in terms of scale and security. An interesting element of the presentation was how business users tend to focus on benefits of cloud such as speed of change (sometimes bypassing IT departments), while IT and EA focus tends to be on ubiquity, scale and so on; there needs to be a balance and a meeting point between the two views.

In reality most organisations will use a hybrid of cloud services, and the cloud strategy needs to consider the “spectrum of cloudiness”. Comment was also made that not all vendor offerings are created equal, and work must go into understanding the differences beneath the surface. There are some large vendors, such as IBM, who offer cloud services and will also build cloud solutions, and who often mix both in the same conversation, which can lead to confusion and complex, bespoke solutions. The five questions the presenter suggested asking when defining a cloud strategy are:

  • Where and how should the enterprise consume cloud computing services?
  • Where and how should the enterprise implement private cloud environments?
  • How will we secure, manage and govern cloud services across internal, external and hybrid environments?
  • How does cloud computing factor into our application strategy and architecture?
  • Are there opportunities for the business to become a cloud computing service provider?

Selling EA with stories: start with the end in mind

EE, the mobile telecoms company, presented this session. EE has a mature EA practice which is engaged with all levels of the organisation. It recently refocused EA into three areas:

  • Technology that underpins enterprise
  • Business capability
  • Business architecture

It has a comprehensive EA knowledge base or model, using Sparx, and has a philosophy of ‘doing just enough, just in time’, because otherwise all the time is spent trying (and failing) to keep everything perfectly finished and fully up to date instead of innovating, working with business users and influencing change, which is where the real EA value is.

An opportunity arose for the EA team to create a vision, sell it to the business and propose a set of new initiatives. The team achieved this by first creating a compelling vision. The vision was rooted in the present day and based on information already held in the knowledgebase (architecture model), and focussed on the main business problems and desired business outcomes. The vision was communicated through personas and stories, which were designed to be both factual and appeal to emotions, by highlighting key frustrations and weaknesses and how the vision would help overcome them. The vision, including stories, was presented on one single sheet, including short biographies of personas, themes in columns and the use of icons and short stories. No formal architecture documents were given to the business at this stage, but the detail had to be ready if the outcomes were positive and the team was invited back to the senior stakeholders for further dialogue on roadmaps, sequence of capabilities, business benefits and costs. The approach was successful with three of five major initiatives being started.

Cloud for US HEIs

Caleb Racey
Systems Architecture Manager
Newcastle University
Member of UCISA EACP


In this blog post I’ll share some of the take-home messages from the cloud sessions I attended at EDUCAUSE.

There was a wealth of cloud presentations at EDUCAUSE, with several time slots involving a choice between competing cloud sessions. The American universities seem to have largely got beyond the question of “should we use the cloud?” and are now concentrating on practical questions like “how do we effectively use the cloud?”. The services used as examples are no longer the low-hanging fruit of student email and have moved up the chain to a wide breadth of paid-for services (e.g. CRM, developer services, travel). The impression I got was that the States has a 2-3 year lead over the UK HE community in deploying cloud-based services. The message of many of the presentations was that the advent of cloud is transformational: cloud is driving levels of change in university IT on a par with the two historical seismic shifts of the mainframe-to-PC transition and the advent of the internet.

Leveraging collective buying power

Part of the reason for the US HE lead in the cloud appears to be the availability of the Internet2 Net+ cloud procurement model. This enables the community to purchase cloud services as a group and use that leverage to get the services and their contracts shaped to HE requirements. Buying as a community allows them to apply best buying practice to cloud, such as assessing the security of a cloud service via the Cloud Security Alliance’s 136-point assessment matrix, or ensuring contracts have reasonable terms (e.g. liability cannot exceed purchased value). They also have a mature attitude to the cloud service lifecycle, with a well-defined onboarding and sunsetting process.

While Net+ looks to be a valuable resource, many seemed to be buying services on their own. Salesforce in particular seems to have real traction; Cornell reported that when they mined their payment card data, Salesforce was being used across the organisation without IT services’ involvement. This was a common theme – cloud sells direct to the end user without reference to IT departments. Indeed, cloud sales activity is often targeted at academics and administrators rather than IT. The need for IT departments to adapt to this change is clear. With IT now an intrinsic part of the majority of business activities, it is inevitable that control of IT is increasingly going to reside in those business areas. As a provider of enterprise IT I may find the following Oliver Marks quote uncomfortable, but that makes it no less true: “cloud companies are cost-effectively emancipating enterprises from the tyranny of IT, solving lots of problems with tools that are a pleasure to use”.

Several of the presentations touched on the impact of the cloud on the IT department. All presenters said that it had not reduced head count. Cloud approaches increase the velocity of change and the depth of IT penetration into business processes; the increased efficiency of cloud is counteracted by much greater demand and a resulting greater volume of activity. This is a common story when an area converts from bespoke to utility provision: the conversion of electricity from self-generation to utility provision saw massive growth in the demand for both electricity and electricians. Penn (Pennsylvania) State said that cloud for them was “about delivering more capability and capacity to their customers”. For Cornell it was driven by a need to shrink a massive $27 million application budget to something more manageable. The skill sets required in the IT department change with the cloud, with service management, facilitation, and contract and relationship management all requiring development.

Addressing security concerns

The real value of EDUCAUSE for me is in hearing the experience of peers who have led the community in a particular area. This was particularly true for cloud: it is an area awash with myth, marketing, fear, uncertainty, doubt and undoubted risk, and hearing real experience helps to focus concerns on the real issues. When I speak to colleagues in my university about cloud, security is one of the top topics. While this concern was mentioned in the presentations, it seemed to be largely a resolved issue. There was a great quote in the Notre Dame session: “90% of HE already uses a SaaS solution for credit card payments and has for a long time. Once you have put your highest-risk assets (your customers’ financial details) into the cloud then most other data security concerns are minor”. Notre Dame’s session was a particular eye-opener: they already have 130+ SaaS services deployed on campus and aim to be 80% cloud-based within a couple of years. The speed with which they are moving to cloud is astonishing.

Lessons for UK HE  

Reflecting on the conference, I’m increasingly convinced that UK HE needs to embrace the full breadth of cloud services. Cloud is here to stay; it will continue to grow, and those harnessing it will lead the sector in IT delivery. A service similar to Net+ would be a major bonus to the community. Having worked with the Janet Amazon AWS pilot, I can see the beginnings of a Net+ style service. The Janet cloud service is good; however, I can’t see it having the breadth or pace of the Net+ initiative.

The government’s G-Cloud framework has breadth and pace in spades; it easily outstrips Net+ in terms of size (13,000 services) and velocity. G-Cloud is usable from a university context, with individual universities named in the list of eligible organisations. However, its terms and services aren’t tailored to university requirements. Having investigated G-Cloud as a route to procuring two potential services, this has proved to be a problem. In one instance the length of contract was an issue: the up-front capital funding of some research projects means that the two-year cut-off of a G-Cloud service makes it unsuitable. In another instance the flat, non-negotiable fixed pricing of G-Cloud agreements meant it was more expensive than we could have negotiated directly with academic discount.

The way forward to me seems to be to pursue a multi-tiered approach: call off on Janet cloud services when available, use G-Cloud services where suitable, work with the Government Digital Service to influence G-Cloud to include HE-specific considerations, and finally procure cloud services directly when the previous approaches are exhausted. However, cloud is not an area that easily lends itself to tender agreements: procuring cloud on an individual basis will see each organisation invest considerable effort into answering the same concerns that many of their peers are also having to address. While the US is leading the HE community in harnessing the transformational potential of cloud services, I can see real potential for the UK to pick up the pace and take the lead. A sensible combination of tailored Janet cloud services and G-Cloud’s velocity and drive could act as a transformation enabler in the sector.

Cal Racey

 

ESnet, the Energy Sciences Network


Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG



2014 Technology Exchange – Day 3

One of the features of conferences outside of the UK and especially prevalent in the USA is early morning sessions. It was time on day three to hit the 07:15 morning working group/camp/BoF sessions.

Unfortunately the ‘Campus Cloud Architects BoF’ was cancelled, which was really disappointing and not a good start, as I had hoped to explore in person some of the latest concerns, trends and experiences in this area.

Industry groups have been reporting more and more interest in Cloud brokerage solutions and some companies are now recruiting for cloud broker and cloud architect roles. As cloud technology gets more mature, there is an opportunity to start brokering for the best possible service and cost for your organisation. In the sector we have seen an excellent start in this area by Janet with their agreements with Microsoft and Google for their email/applications suite.

There is a lot of development scope in this area with Microsoft Azure, AWS and the like, and I’m interested to explore the strategy required to position infrastructure, automation and standards to take best advantage of the emerging competition.

Perhaps this area is something that colleagues in the UCISA IG may be interested in picking up in the future?

I took advantage of the programme changes to share more details about the current UCISA activity in the ad-hoc groups using a short five-slide presentation covering these pieces of work:

• A guide to the implementation of an Information Security Management System (ISMS), launching in early 2015

• An update to the popular  ‘Exploiting and Protecting the Network’ document, launching in early 2015

• The Major Project Governance Assessment Toolkit

• UCISA Report 2013 – Strategic Challenges for IT Services.

There was a lot of interest in these areas and I had a couple of questions about integrating the planning, effort and joint working of UCISA and EDUCAUSE where there are clear overlaps and topics of interest.

The Energy Sciences Network (ESnet) are also interested in contributing to the Network Performance and QoS ‘community of practice workshop’ which the UCISA Networking Group are planning in January 2015 (more details coming to the UCISA NG mailing list soon).

Data Intensive Science

As an area where I have little experience, I was interested in listening to what William Johnston from ESnet had to say about large-scale data-intensive science. He started by explaining his view that high energy physics is seen as a prototype platform for distributed collaboration in other science fields.

He explained that as instruments get bigger, they get more expensive (in a not-quite-as-dramatic Moore’s Law relationship); there are therefore fewer of them, which results in an increase in collaboration, globally. This shows the potential future growth of research networking bandwidth requirements.

One of the things I didn’t realise was that ESnet have extended their 100Gb full network backbone across the Atlantic into Europe, including connections in London. Their first circuit is being tested today. What does this mean for science and research in the UK?

Further details are available at:
http://es.net/news-and-publications/esnet-news/2014/esnet-extends-100g-connectivity-across-atlantic
http://www.geant.net/MediaCentreEvents/news/Pages/three-high-speed-links.aspx

William went on to talk about monitoring the network, explaining the criticality of this area. With many Infrastructure as a Service (IaaS) offerings, researchers are requesting Network as a Service, and with it the same levels of assurance and guarantees that have previously only been available with point-to-point links. Is this going to change?

As one would expect, ESnet use perfSONAR for their assurance measurements. As I mentioned earlier, we will hopefully have representatives from ESnet and eduPERT at our Network Performance and QoS ‘community of practice workshop’ in January 2015.

Would something like perfSONAR deployed across Janet be of benefit to the community? Perhaps let us know your thoughts in the blog feedback section below. I would assume it requires volunteer sites; however, Janet are already looking at the possibility of network-based probes for eduroam, so perhaps there is scope for a next generation of Netsight with added assurance.

ESnet also use the weathermap tool, which is much loved by colleagues at Loughborough University. It was one of the best takeaway messages from a Janet Networkshop lightning talk several years ago.

The remainder of the talk was about data transfer speeds and integrity. I was surprised to hear the comment “SCP is your enemy”. Surely not? However, I was approaching the problem from the wrong angle, thinking about security rather than data transfer speeds and parallelisation. Look at some of the figures in the photograph below.

[Photo: data transfer rate figures from the session]

William discussed a number of tools including GridFTP and a development from CALTECH, which stripes data across discs as part of the FTP process as well as providing up to three times CRC checking.

Interestingly, the last point was about data integrity, which is critical for the field of data-intensive science. William referenced the paper by Stone and Partridge, 2000, “When The CRC and TCP Checksum Disagree”.

During the break, I had a bit of a Google to find any UK user or interest groups for research computing and HPC. I found the HPC SIG; if you know of any others, please pop them in the blog comments to share.

Connecting 40Gb Hosts

Whilst in the ‘big data’ mindset, there was an interesting session where colleagues from Fermi Labs, ESnet and CALTECH shared best practice infrastructure configuration to support high-speed data transfer.

There was some very interesting visual modelling, which demonstrated the affinity the network card has with a particular processor socket and core. The difference optimisation makes to data transfer is significant: a maximum of 37Gbps vs 26Gbps on a 40Gbps link.
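The hosts in the session were Linux machines, but the same idea applies on Windows; as a rough, hypothetical illustration (the adapter name and node/core numbers are placeholders), receive-side scaling can be pinned to the NUMA node local to the card with the built-in NetAdapter cmdlets:

```powershell
# Illustrative only: keep a 40Gb adapter's receive-side scaling (RSS) on the
# NUMA node closest to the card, avoiding cross-socket memory hops.
Get-NetAdapterRss -Name "Ethernet 40G"    # inspect the current processor layout

$rss = @{
    Name                = "Ethernet 40G"  # placeholder adapter name
    NumaNode            = 0               # NUMA node local to the NIC
    BaseProcessorNumber = 2               # first core RSS may use on that node
    MaxProcessors       = 8               # cap how many cores RSS spreads across
}
Set-NetAdapterRss @rss
```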

It was a packed session with many colleagues standing at the back; there is certainly an art to tweaking infrastructure to perform in the best possible manner. It was also interesting to hear there are three 100Gb network cards in development and testing.

Pushing the Boundaries of the Traditional Classroom

There was a bit of a clash in the programme, so I didn’t get to spend a lot of time in this session, but it was interesting to see what Indiana University had done with their ‘Collaboration Café’.

It led me to wonder what the key limitation on adopting more of these learner-centric classroom designs is. Is it financial, or is it resistance from academic colleagues, in the same way as there was/is resistance to lecture capture and VLE environments?

UCISA are working with SCHOMS and AUDE on an update to learning space design principles. This document should be really useful, especially as the final point from the presentation was all about the removal of wires.

At Loughborough we are trialling the Epson projector series that uses the Epson EasyMP software and iProjection app. What wireless projectors and/or screens are you using? Let us know in the blog feedback section below.

Other Thoughts

The other talks I attended through the day continued the research and big data theme, including hearing about the petabytes (PB) of data required by some of the medical research being undertaken as part of the ICTBioMed platform. One of the speakers commented that biology is becoming more like computer science by the day, confirming again that multidisciplinary research is a firm requirement for a lot of modern applied research.

Some examples of digital biology given were: DNA Sequencing, Gene Expression Analysis, Protein Profiling and Protein to Protein interactions.

A number of the speakers came in via videoconference; it was interesting to see the mix of success and failure of this bold move. It seems strange that we still struggle to co-ordinate a remote video connection with the technology we have at our disposal in 2014.

Another speaker also made reference to the worldwide nature of large research groups and collaborations and said this collaboration technology was essential.

Video Collaboration

For the final session of the day, I was interested to see what the future held for video-based collaboration in a session with speakers from Internet2, Pexip, Evogh, Blue Jeans and Vidyo. I didn’t manage to ask Robb from Blue Jeans more about the removal of the Skype interface API, which was so disappointing; however, during the panel he mentioned that they had a Google Hangouts bridge to standards-based systems available.

There were some interesting remarks from Hakon Dahle, CTO at Pexip in Oslo (and previously CTO at Tandberg and Cisco).

Hakon described their distributed architecture, where it is possible to start small and grow appropriately with options to add capacity on demand in an agile manner.

Latency was still an issue with global video conferencing and there was a panel debate about the pros/cons of transcoding increasing latency vs accessibility and interoperability.

“Transcoding is a necessary evil” – especially with new protocols like WebRTC!

There were very positive comments about WebRTC, how it will make video more accessible and how it will make face-to-face communications easier; however, there is already a divide, with Google’s VP9 codec being favoured by some players in the market, especially when delivering very high resolution 4K streams.

Hakon explained that WebRTC seemed the most promising technology for direct person-to-person video calls and will bring about a lot of new use cases, and that this new-use-case element is the most exciting in terms of innovation.

Learning Points

• How do we best position our infrastructure to take advantage of emerging Cloud competition?
• How do we collaborate more with colleagues from Internet2, ESnet and EDUCAUSE? Is this something UCISA and Janet/Jisc can facilitate?
• Future growth potential of research data transfer requirements
• Are we best serving our research communities, what more can we do?
• Global nature of research and therefore the communication requirements.

Matt Cook