Tag Archives: security

Coping with research data access and security challenges

Universities and colleges harbour a great deal of sensitive data which should be protected. But they are also encouraged to be open and make maximum use of the data they hold through personalisation and open access to research data. Here, UCISA’s Executive Director Peter Tinson looks at the issues for institutions in balancing the need to be open and yet secure.

BALANCING AGILITY, OPENNESS AND SECURITY

The challenges of providing effective services for the research community while supporting open access are many and varied. Researchers need access to both short-term storage and computational resources, but the requirements of research funders are moving toward long-term preservation and archiving.
There is resistance to openness – researchers see the data as ‘theirs’ and there is a reluctance to place data in institutional repositories until all the research opportunities have been realised and the results published. Open access to research data requires that data be tagged with appropriate metadata in order to be discoverable. However, few researchers possess the skills to tag their data and there are few incentives for them to do so.
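To make the metadata point concrete, the sketch below shows the kind of minimal descriptive record a dataset needs before a repository search can find it. It is purely illustrative – the field names are assumptions loosely modelled on Dublin Core, not any particular repository’s schema.

    from dataclasses import dataclass, field

    @dataclass
    class DatasetRecord:
        # Minimal descriptive metadata for a deposited research dataset (illustrative only).
        title: str
        creator: str
        subject_keywords: list = field(default_factory=list)
        description: str = ""
        identifier: str = ""   # e.g. a DOI, if one has been minted
        licence: str = ""      # e.g. "CC-BY-4.0"

        def is_discoverable(self) -> bool:
            # A record is only useful to repository search if the core
            # descriptive fields have actually been completed.
            return bool(self.title and self.creator and self.subject_keywords)

    record = DatasetRecord(
        title="Sensor readings, field trial 2016",
        creator="Example Research Group",
        subject_keywords=["environmental monitoring", "sensors"],
    )
    print(record.is_discoverable())  # True

Even a lightweight record like this is more effort than many researchers will volunteer without support or incentives.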
The demand is for easy-to-access services provided free of charge at the point of use. While a number of institutions are starting to provide high volumes of storage for their researchers, there are few, if any, effective costing models for long-term storage and preservation. The absence of a cost-effective model provides the opportunity for a shared service; it is hoped that Jisc’s embryonic Research Data Shared Service will provide an effective solution for the sector.
Where there are no centrally provided services, or where researchers find those services too difficult or too costly to use, researchers seek alternative solutions. These include free or low-cost cloud services to store and share data, cloud services for computational resource, and the use of ‘personal’ devices such as removable hard disks or memory sticks. Information security rarely features in decisions to use easily accessible cloud services – this is due in part to the ease with which such services can be purchased but is also indicative of a lack of awareness amongst researchers. This challenge has now been recognised by many institutional IT services, which are providing supported access to cloud storage solutions and computation.
Data management is relatively immature within institutions. There is growing recognition that the data and information that an institution holds are assets and that poor management of those assets represents an institutional risk. However, a one-size-fits-all approach is not appropriate – information and data need to be classified to determine the level of security that needs to be applied to them. The HESA Data Futures project, and HEDIIP before it, has surfaced the lack of maturity in this area. Although there has been some improvement, we are still some way from data management being an established discipline.
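As a rough illustration of what classification means in practice, the sketch below maps classification levels to minimum handling controls. The levels and controls are invented for the example; a real scheme would be set by institutional policy.

    from enum import Enum

    class Classification(Enum):
        PUBLIC = 1
        INTERNAL = 2
        CONFIDENTIAL = 3
        RESTRICTED = 4

    # Hypothetical handling rules per classification level.
    HANDLING = {
        Classification.PUBLIC:       {"encrypt_at_rest": False, "access": "anyone"},
        Classification.INTERNAL:     {"encrypt_at_rest": False, "access": "staff and students"},
        Classification.CONFIDENTIAL: {"encrypt_at_rest": True,  "access": "named roles"},
        Classification.RESTRICTED:   {"encrypt_at_rest": True,  "access": "named individuals, audited"},
    }

    def controls_for(level: Classification) -> dict:
        # Look up the minimum controls to apply to an information asset.
        return HANDLING[level]

    print(controls_for(Classification.CONFIDENTIAL))

The point is not the particular levels but that the security applied follows from the classification, rather than from a single blanket policy.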
Effective support of research and research data management requires a cross-institutional approach, yet this is not readily understood by senior university management. This is all the more frustrating given that a briefing paper jointly produced by UCISA, SCONUL, RLUK, RUGIT, ARMA and Jisc highlighted the need for an institutional approach over three years ago.
A lack of understanding is sometimes reflected in diktats being issued and a resultant poor take-up of services. Meeting the demands of both researchers and research funders requires resourcing, both in terms of staffing and services, and an understanding of how cloud services can be used effectively to meet the storage and computational demands securely. The planning process needs to be responsive to long-term trends but also to changes in policy, legislation and technological developments that may require a quicker response.
The threat of cyber attack is a major concern; there is growing evidence that state-sponsored attacks primarily aimed at accessing research outputs and institutions’ intellectual property are on the rise. Yet the threat often comes from within as a result of a lack of awareness and poorly maintained systems within the institutional perimeter.
It is important that all staff in the institution realise and accept that information security is their responsibility. The institution’s management needs to recognise that information security is an institutional issue and requires a coordinated and risk-based approach. Where there are policies established to mandate information security awareness training for all staff, it may be necessary for senior institutional management to oversee the enforcement of that mandate, although such enforcement may be detrimental to building understanding and acceptance of individual responsibility.
In conclusion, managing the conundrum of being open in a secure environment requires effective governance, and a central coordinated approach that supports both research and information security. There is likely to be no one solution applicable to every research discipline but shared services such as Jisc’s RDSS should have a strong role to play.

Strategic questions to consider:

  • How mature is your institution’s information management capability? Does your institution have a business classification scheme? Are records management processes embedded in normal operations?

  • How influential is your internal audit function in determining or supporting information security policy and implementation?

  • What mechanisms do you have to learn from information security incidents, whether internal to your organisation or external?

  • Do you have an institutional approach to research data management?

 

UCISA welcomes blog contributions and comment responses to blog posts from all members. If you would like to contribute a new perspective or opinion on a current topic of interest, simply contact UCISA’s marketing manager Manjit Ghattaura via manjit.ghattaura@it.ox.ac.uk

 

The views expressed on UCISA blogs are the authors’ and do not necessarily reflect those of UCISA

Social engineering and hacking humans

Sebastian Barnes
IT Support Specialist
Leeds Beckett University

Sebastian Barnes was funded to attend this event as a 2017 UCISA bursary winner

SCHOMS Day 3 – IT Security Challenges

The final day of the SCHOMS 2017 conference was a half day of presentations and speeches, including my favourite presentation of the week, from psychologist Jenny Radcliffe; what a speaker! Jenny delivered a presentation on Social Engineering, telling us about her life experiences in her field of work. It was amazing to listen to and very engaging, which resulted in me making very few notes.

Jenny explained how technology can have amazing security which makes it impossible to hack – but why hack the technology when you can hack the human? If you know the password, you can bypass it! Jenny described scenarios where she has had to read body language and pretend to be someone she wasn’t to get the information she wanted. From what I remember, she was able to gain access to an account just using Facebook; security questions are personal and unique to the person, but most of the time the answers are listed on Facebook! Mother’s maiden name? Within seconds she was able to find this out using the family feature within Facebook. With this information she was able to reset the password and enter the account.

After watching this presentation, I was seriously considering entering this field of work. That’s how good it was!

Interested in finding out more about a UCISA bursary? Then visit the UCISA Bursary Scheme.

Identity and access management – a project from the US

Michelle Griffiths
ITS Project Manager
IT Services
University of Oxford
Member of UCISA-PCMG

Looking at TIER

This Educause session, Trust and Identity in Education and Research: Identity for Everyone, was run by Ron Kraemer (Vice President and Chief Information and Digital Officer, University of Notre Dame), Kevin Morooney (Vice Provost for Information Technology-CIO, The Pennsylvania State University), Ann West (AVP, Trust and Identity, Internet2) and Steven Zoppi (Vice President, Internet2). Internet2’s Trust and Identity in Education and Research (TIER) initiative will provide a common framework for campus identity and access management (IAM) components.

An overview of the TIER project

  • TIER will provide a set of integrated components that address IAM as a whole.
  • 500 US HE institutions are involved.
  • Primary users: medical students, researchers, faculty staff and students.
  • TIER will address community requirements across components, and sustain components that were developed together.
  • During the next few years the project will focus on maturity and sustainability models for workforce and funding.

The TIER vision was outlined for the Educause audience:

  • “We believe identity will be a service.”
  • “We believe in a cloud service with campus localization.”
  • “We believe that if we don’t develop it, then we will have to accept that someone else has (social identities).”
  • “Effective collaboration with partners will be key (includes federated agencies).”
  • “We know we are at least three to five years from achieving this vision.”
  • “We will build frameworks and tools to make it simpler for ourselves.”

Components of TIER

  • Secure directories
  • Identity and metadata services
  • Single sign-on and identity components
  • Registry services
  • Workflow services
  • AuthN (who) and AuthZ (what) – see the sketch after this list
  • Federated registry (Directory Search/lookup)
  • Persistence and reputation
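The AuthN/AuthZ bullet is worth unpacking: authentication establishes who the user is, authorisation decides what that user may do. The toy sketch below separates the two steps; the credentials, groups and resource names are invented for illustration and are not part of TIER.

    # Illustrative only: a toy separation of authentication ("who are you?")
    # from authorisation ("what are you allowed to do?").
    CREDENTIALS = {"alice": "correct-horse"}            # who (real systems verify a hashed credential)
    GROUP_MEMBERSHIP = {"alice": {"researchers"}}       # what, expressed via group membership
    RESOURCE_POLICY = {"hpc-cluster": {"researchers"}}  # groups allowed per resource

    def authenticate(username: str, password: str) -> bool:
        return CREDENTIALS.get(username) == password

    def authorise(username: str, resource: str) -> bool:
        allowed_groups = RESOURCE_POLICY.get(resource, set())
        return bool(GROUP_MEMBERSHIP.get(username, set()) & allowed_groups)

    if authenticate("alice", "correct-horse") and authorise("alice", "hpc-cluster"):
        print("access granted")

In a federated setting the same separation typically holds: the home institution handles authentication, while group membership drives the authorisation decision at the service.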

The TIER project is moving from investor to sustainable models (financials and governance) via the TIER Community Investor Council (TCIC). Fifty campuses invested $75,000 each over three years ($4 million in total, which includes funding provided by TIER itself). There is also programme support for community – Ann West (AVP, Trust and Identity), technology – Steve Zoppi (AVP, Services Integration and Architecture) and sustainability (community engagement and membership).

The first integrated release is scheduled for 2016. There will be minimal installation/configuration of user interfaces, and the preliminary requirements will be set for scalable content. The objective is point-in-time consistency.

Partners involved: Shibboleth, Grouper and COmanage 

The primary focus for the first release is: container/packaging, APIs, continuous update cycles every eight months, 250 user stories driving requirements, documentation and the initial deployment.

Progress

  • MOU management community forum
  • Financial timetabling and reporting
  • Technical requirements revision
  • Working groups
  • First two corporate partners – Unicon (for Shibboleth and Grouper) and Spherical Cow Group (for COmanage)

The work is sponsored by the community, which is responsible for HE standards, and by Internet2, which is responsible for industry approaches.

Approach

  • Several key working groups have formed or are forming, including 3M (monitoring, measuring & managing)
  • Continuous meaningful feedback (how the community is utilising the components everywhere)
  • Community adoption – working group needed
  • Emerging community contributors

Performance management and assessing capacity

Giuseppe Sollazzo
Senior Systems Analyst
St George’s, University of London

Velocity day three – the final one – has been another mind-boggling combination of technical talks and masterful storytelling about performance improvement in a disparate set of systems. The general lesson of the day is: know your user, know your organization, know your workflows – only then will you be able to adequately plan your performance management and assess your capability.

This was the message from the opening keynote by Eleanor Saitta. She spoke about how to design for ‘security outcomes’, or, in other words, ‘security for humans’: there is no threat management system that works if isolated from an understanding of the human system where the threats emerge. We have some great examples of this in academia, and at St George’s one of the major challenges we face is securing systems and data in a context of academic sharing of knowledge. Being a medical school, the human aspect of security – and how this can affect performance – is something we have to face on a daily basis.

One of the best presentations, however, was by David Booker of IBM, who gave a live demo of the Watson system, an Artificial Intelligence framework which is able to understand informal (up to a point) questions and answer them in spoken form. As with every live demo, it encountered some issues. Curiously, Watson wasn’t able to understand David’s pronunciation of the simple word “yes”. “She doesn’t get when I say ‘yes’ because I’m from Brooklyn,” David said, triggering laughter in the audience.

Continuous delivery
Courtney Nash of O’Reilly spoke at length about how we should be thinking when we build IT services, with a focus on the popular strategy of continuous delivery. Continuous delivery is the idea that a system should transition from development to production very often, and this idea is gaining traction in both industry and academia. However, this requires trust: trusting your tools, your infrastructure, your code, and most importantly, the people who power the whole organization. Once again, then, we see the emergence of a human factor when planning for the delivery of IT services.

The importance of 2G
In another keynote with a lot of applicable ideas for academic websites, Bruce Lawson of Opera ASA focused on the ‘next billion’ users from developing countries who are starting to use internet services. Access to digital is spreading, especially in developing areas of Asia, where four billion people live. India had 190 million internet users in 2014, and this is poised to grow to 400 million by 2018.

The best piece of information in this talk was the realisation that if you take the US, India and Nigeria, the top 10 visited websites are the same: Facebook, Gmail, Twitter, and so on. Conversely, the top 10 devices give a very different picture: iPhones dominate in the US, cheap Androids in India, and Nokia or other regional feature phones in Nigeria. This teaches us an important lesson: regardless of hardware, people worldwide want to consume the same goods and services. This should tell us to build our services in a 2G-compatible way if we want to reach the next billion users (91.7% of people in the world live within reach of a 2G network). This is of great importance to academia in terms of international student recruitment.
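One practical way to keep a site 2G-compatible is to hold total page weight to a strict budget. The sketch below is a rough illustration; the 500 kB figure is an assumption chosen for the example, not a standard.

    # Check whether a page's assets fit within a byte budget that loads
    # acceptably over a 2G connection (budget figure is an assumption).
    TWO_G_BUDGET_BYTES = 500 * 1024

    def fits_2g_budget(asset_sizes_bytes: dict) -> bool:
        total = sum(asset_sizes_bytes.values())
        print(f"total page weight: {total / 1024:.0f} kB")
        return total <= TWO_G_BUDGET_BYTES

    page = {"index.html": 40_000, "styles.css": 60_000, "app.js": 250_000, "hero.jpg": 400_000}
    print(fits_2g_budget(page))  # False – the hero image alone nearly uses the whole budget

A budget check like this, run whenever a page is published, catches the slow creep of images and scripts that makes a site unusable on a constrained connection.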

Performance optimisation
The afternoon sessions were an intense whistle-stop tour of experiences of performance optimisation. Alex Schoof of Fugue, for example, gave an intensely technical session about secret management in large scale systems, something that definitely applies to our context: how do we distribute keys and passwords in a secure way that allows secrets to be changed whenever required? With security issues going mainstream, like the infamous Heartbleed bug, this is something of increasing importance. Adam Onishi of London-based dxw, a darling of public sector website development, gave an interesting talk on how performance, accessibility and technological progress in web design are interlinked, something academic website managers have too often failed to consider with websites that are published and then forgotten for years.
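On the secret management point, the conceptual sketch below shows one common pattern (not Alex’s specific design): applications fetch secrets at run time through a single accessor and cache them only briefly, so a rotated secret is picked up without redeploying anything. The environment variable stands in for a proper secret store.

    import os
    import time

    _CACHE = {}               # name -> (fetch_time, value)
    CACHE_TTL_SECONDS = 300   # refetch after five minutes, so rotation takes effect

    def get_secret(name: str) -> str:
        # Return a secret, refetching it from the store once the cache expires.
        now = time.monotonic()
        cached = _CACHE.get(name)
        if cached and now - cached[0] < CACHE_TTL_SECONDS:
            return cached[1]
        value = os.environ[name]   # stand-in for a call to a real secret store
        _CACHE[name] = (now, value)
        return value

Because callers always go through get_secret() and never hold the value long term, rotating a compromised key only requires updating the store, not rebuilding every service that uses it.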

As someone who has developed mobile applications, I really enjoyed AT&T’s Doug Sillars’ session about ‘bad implementation of good ideas’, showing that lack of attention to the system as a whole has often killed otherwise excellent apps, which are too focused on local aspects of design.

Velocity has been a great event. I was worried it would be too ‘corporate’ or sponsor-oriented, but it has been incredibly rich, with good practical ideas that I could apply to my work immediately. It has also offered some good reflection on ‘running your systems in house’: we often perceive a dualism between the Cloud and in-house services, yet much of this technology can be run in-house with no need to outsource. As IT professionals we should appreciate it, and make the case for adopting technologies that improve performance and compliance in a financially sound way. This often requires abandoning outsourcing and investing in internal resources: a good capital investment that will allow continuous improvement of the infrastructure.

 

A practical approach to risk management – two perspectives

Tim Banks
Faculty IT Manager
University of Leeds

 

This is a write-up of a session I attended on Wednesday at Educause 2015 which was delivered by Bill Arnold, Information Security Analyst at the University of Tampa, and Dr Lawrence Dobranski, ICT Security Access & Compliance, University of Saskatchewan (Canada).

Introduction

The University of Tampa, Florida, is a liberal arts institution and has a student population of around 8,000 students, 65% of whom live on campus. There are 1,200 staff and the annual turnover is c. $235m with an estimated annual economic impact of around $850 million. They formally launched their Information Security Program 3 years ago with the appointment of a Chief Information Security Officer, who reports directly to the President (Vice-Chancellor). Their stated aim is to build a culture of risk management, security awareness and data protection, and as part of this, they have created a cyber-security lab. They achieved ISO/IEC 27001:2013 accreditation in July 2015.

The (often misspelt) University of Saskatchewan is one of the top 15 research universities in Canada with 22,500 students from over 100 countries. They have a 16:1 student:staff ratio and an annual budget in excess of $1bn which includes $9.2m of scholarships and bursaries. They have 120 Graduate Degree Programs (taught postgraduate) and over 200 undergraduate degree programs. It snows regularly and can get very cold! They formally launched their information security program in June 2012, which is centred around the following three areas:

  • IT Security
  • IT Compliance
  • IT Access

It is a risk-based program, meaning that priorities for investment and action are based on a risk score. Bill observed that in 2014, cybersecurity criminals were making more money than drug cartels.

A number of barriers to progress were noted which included:

  • Lack of executive support
  • Inadequate investment
  • Ineffective information security leadership
  • Information security ‘unaware’ community
  • Information security gaps especially with respect to 3rd party service providers

Practical steps

  • Ask the right questions to the right people
  • Don’t adopt every aspect of a rigorous standard (like ISO27001), use common sense
  • Focus on information lifecycle
  • Insights will come quickly once you start working with your stakeholders. These will inform your future strategy.
  • Advance planning and effective communication are absolutely essential
  • Don’t use mass surveys (if you actually want people to provide useful information)
  • Decide how you will engage – either in person or through focused surveys
  • Keep the process simple
  • Focus on business processes and impacts on information (e.g. loss / unauthorised access) rather than using technical jargon

The University of Tampa developed a very simple spreadsheet that included each major business unit on campus, each major process within the units and the process owner. The process owner was asked to rank each of their processes on a scale of 1-5 in three areas:

  • Degree of sensitivity of the data
  • Impact of loss of integrity
  • Impact of loss of availability

The average of the three scores was taken for each process to arrive at its risk score. A discussion was held with the process owner about the information handling lifecycle involved with each process, which covered:

  • Accessing the data
  • Processing the data
  • Transmitting the data
  • Sharing the data
  • Storing the data (in both paper and electronic forms)

They also looked into whether there were any compliance requirements associated with the type of information that was being stored, and determined whether the University IT department or a third party provided the service.
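A minimal sketch of the scoring described above: each process owner rates their process from 1 to 5 for data sensitivity, impact of loss of integrity and impact of loss of availability, and the mean of the three becomes the process risk score. The example processes and figures are invented.

    processes = {
        "Student admissions": {"sensitivity": 5, "integrity": 4, "availability": 3},
        "Room bookings":      {"sensitivity": 2, "integrity": 2, "availability": 3},
        "Payroll":            {"sensitivity": 5, "integrity": 5, "availability": 4},
    }

    def risk_score(ratings: dict) -> float:
        # Mean of the three 1-5 ratings, as in the spreadsheet approach described above.
        return sum(ratings.values()) / len(ratings)

    # Highest-risk processes first, to prioritise the follow-up conversations.
    for name, ratings in sorted(processes.items(), key=lambda p: risk_score(p[1]), reverse=True):
        print(f"{name}: {risk_score(ratings):.1f}")

The value of keeping the arithmetic this simple is that process owners can see exactly how their answers turn into a priority, which helps with the trust-building Bill describes below.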

Summary (University of Tampa)
Bill provided the following summary of the University of Tampa’s risk based approach to managing information security.

  • Data Discovery – find out where your confidential data resides
  • Opening the Doors to positive change in University departments. You should be seen not as people who stop departments from doing things, but as the people who help them to do it securely.
  • Re-engineering information handling, which will require a change in mindset from both IT and the business
  • Getting everyone to participate
  • Security Awareness (education is key)
  • Once they trust you, they will come (bringing information about risks right to your door)
  • Rinse, wash, repeat (a continual process)
  • Collaborate to reduce risks

Always remember there are a lot of things we don’t know that we don’t know, as one of Bill’s slides demonstrated.

Summary (University of Saskatchewan)
Lawrence focussed mainly on the best way to present information security risks to University senior management. This is done most effectively when the senior officers of the University understand and accept the cyber-risk. In addition:

  • The information presented must be in a familiar format; we cannot afford to have the busy people we are trying to communicate with waste time trying to understand the presentation format.
  • We need to focus on risk information, concentrating on the high-risk areas when talking to the University executive group.
  • Don’t make the visuals too complicated or people will stop listening to you and start focussing all their attention on trying to understand the graphics.
  • Read the IEEE publication (Slide Rules)

During their audit, they discovered an internet-accessible incubator control unit with a built-in web server. On further investigation, if this had been hacked and the incubators shut down, then thousands of cute little chicks would have died (and research would have been put back two to three years). They also found a robot roaming the hall talking to patients, which the department was trying to control remotely by adding it to the wireless network. This robot was big enough to cause serious injury to somebody if an unauthorised person managed to take control of it.

The key stakeholders that Lawrence identified were cyber security professionals (never be afraid to ask for help) and the staff and students at the University. It is vital that those closest to the business processes are closely involved in the threat and risk/privacy impact assessment process. The world of cyber security is a fast changing one, so dedicated cyber security professionals, either internal or external, are vital in order to keep abreast of emerging threats and techniques to combat them. As an institution, we need to own risk and manage it.

Some particular suggestions for ways in which to present the information security risks included using a Gartner-style quadrant with likelihood on one axis and impact on the other. Then encourage your senior team to focus only on the top-right quadrant, whilst being able to see at a glance the entire risk landscape.
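To illustrate the quadrant idea, the sketch below places each risk by likelihood and impact (both scored 1 to 5 here, an assumed scale) and pulls out the top-right quadrant for the executive conversation. The example risks are invented.

    risks = [
        {"name": "Phishing of staff credentials", "likelihood": 5, "impact": 4},
        {"name": "Data centre flood",             "likelihood": 1, "impact": 5},
        {"name": "Unpatched lab device hacked",   "likelihood": 4, "impact": 4},
        {"name": "Lost unencrypted USB stick",    "likelihood": 3, "impact": 2},
    ]

    THRESHOLD = 3  # scores above this count as "high"

    # The top-right quadrant: high likelihood and high impact.
    top_right = [r for r in risks if r["likelihood"] > THRESHOLD and r["impact"] > THRESHOLD]

    for r in top_right:
        print(r["name"])   # the risks the senior team should look at first

The rest of the plot still matters – it shows the whole landscape at a glance – but the filter keeps the executive discussion on the items that genuinely need their attention.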

An alternative is to use a radar plot to display how well the University is doing with multiple aspects of a particular IT security concern.

Overall this was a very informative session with some practical takeaways on how to both manage information security risks and communicate them to senior managers.

2014 Technology Exchange – Day 2 by Matt Cook

Matt Cook
Head of Infrastructure and Middleware
Loughborough University
Chair of UCISA-NG

 

FIRST Robotics Competition

The Monday evening welcome reception included a contest based on robots developed by high school students. The students were given six weeks to raise funds, design, develop, brand and program a robot – not an easy task! It was great to see such innovation from our students and colleagues of the future. I wish we had had these opportunities back when I was at school; the best we experienced was BigTrak and writing Logo programs. However, at least we were taught programming in BBC BASIC, and not simply how to use the Microsoft Office suite.

The USA is promoting Science, Technology, Engineering and Mathematics (STEM) subjects in a similar manner to the UK. It will be interesting to see how successful this initiative is in providing the education required for our colleagues of the future and plugging the current skills gap. Talking to the students, they are extremely enthused about the creator, maker, hacker opportunities being given through these programmes.

This is another one of those opportunities which demonstrates the value in the jobs we perform in our respective organisations to support education. I recently undertook a job shadow of a technician in one of our academic schools at Loughborough, and it was one of the most eye-opening experiences I had all year. It was extremely valuable to see the challenges they face within the school, how central IT policy affects their work and the innovation and creative ideas being developed by their students. I would certainly encourage everyone to get out into the wider university more to put everything into perspective.

Central IT vs Research Perspective on Information Security

There was a very interesting panel discussion mid-way through the Tuesday schedule investigating the challenges faced by both the central IT function and research centres in managing Information Security. Rob Stanfield from Purdue University provided an overview of the provision at his organisation and one thing that stood out was the scale of some of the US based education organisations, which dwarfed most of the largest UK universities. The scale of operation also brings an increased scale of staffing and, as a coffee break discussion revealed, of budget too. Purdue are currently recruiting a Security Risk Analyst and see an important element of their future service as being better placed to advise on Information Security impact across their business.

There is a growing move to work with researchers to define strategy that allows Information Security to be an enabler and an active component in winning research grants. The panel all agreed that there was a need to form better relationships between research and central IT; something that I’ll personally be working on at Loughborough University over the coming years. There was agreement that the era of siloed departmental research email servers and wireless networks was not effective and that the future is centralisation and collaboration. Closing comments focused on “…there is nothing like a good data breach to bring about change!” and “…some people are more concerned with IDS appliances than the balance of risk.”

Over coffee a number of people who attended the session were interested in the current UCISA activities to develop an Information Security Management System (ISMS) implementation guide and the update to the popular ‘Exploiting and Protecting the Network’ document; both set to launch in early 2015. Keep an eye on the UCISA website for more information!

As suggested, I will be posting details about these activities to the EDUCAUSE Security Constituent Group mailing list as well. This list may also be of interest to UK colleagues who are looking to get a wider perspective on Information Security concerns within global education organisations. Whilst the remit for security falls between both the Network (NG) and Infrastructure (IG) groups within UCISA, some readers of the blog may not be aware of the UCISA-IS Information Security mailing list. Although currently low traffic, it is a growing area of discussion.

For those with larger security teams, it may also be of interest to explore the TERENA TF-CSIRT group.

Privacy in Access and Identity Management

Dr Rhys Smith (Janet) delivered the final session I attended on Tuesday. I’ve not personally been involved in the Access and Identity Management (AIM) side of IT at Loughborough; however, I was eager to see what was on the horizon for Moonshot, especially what it can offer the research community. It was nice to see some friendly faces when I arrived at the conference: Rhys Smith, John Chapman and Rob Evans from Janet, and Nicole Harris from TERENA. I’ve also since met quite a few people I’ve spoken to by email before or have seen posting on mailing lists.

Rhys gave a gentle introduction to AIM before describing how we should be adopting privacy by design, as it is so difficult to retrofit. As part of a privacy vs utility discussion, Rhys gave the example that routing IP network packets outside the EU breaks EU data protection guidelines, as an IP address is deemed to contain personally identifiable information. Whilst this example is simply unworkable in practice, the categorisation of IP addresses as personal data has caused some interesting consequences for our Computer Science researchers.
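As a hypothetical illustration of privacy by design applied to exactly this issue (not something Rhys described), an institution could truncate IP addresses before they are stored in logs, so that the retained value no longer identifies an individual device:

    import ipaddress

    def anonymise_ip(address: str) -> str:
        # Zero the host part of the address: the last octet for IPv4,
        # everything after the /48 prefix for IPv6.
        ip = ipaddress.ip_address(address)
        prefix = 24 if ip.version == 4 else 48
        network = ipaddress.ip_network(f"{address}/{prefix}", strict=False)
        return str(network.network_address)

    print(anonymise_ip("198.51.100.27"))   # 198.51.100.0
    print(anonymise_ip("2001:db8::1234"))  # 2001:db8::

Designing this in from the start is straightforward; retrofitting it across years of accumulated logs is exactly the kind of pain Rhys was warning against.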

Following a narrative of the difference between web-based federation (SAML) and network-based federations (like eduroam), Rhys outlined the timescales for the Moonshot trial and official service. Being able to unify many technologies, from simple SSH through to Windows desktop authentication, opens many possibilities for secure research collaboration in the future.

Other Thoughts

There were lots of interesting conversations through the conference today about the development of common shared tools or building blocks to solve future challenges, from the infrastructure that supports eduroam through to the Kuali HE software suite. Many felt that through collaboration a better solution can be developed with less resource; however, there were concerns that high workloads in recent years had removed a lot of these opportunities for some.

Another common theme was the adoption of standards, rather than closed proprietary technology, avoiding vendor lock-in where possible and using the infrastructure as a live research aid for students within our organisations.

Learning Points

  • Get out into the wider university to put your role into perspective;
  • Turn Information Security policy and strategy into an enabler that wins research grants;
  • Seek collaboration and closer relationships with our research community;
  • Explore opportunities for privacy by design;
  • Keep a watching brief on Janet Moonshot developments;
  • Support the development of common shared tools and building blocks where appropriate.

Matt Cook

 

Securing card payments

Peter Tinson, UCISA’s Executive Secretary, attended the PCI DSS SIG conference this week to find out more about the standard which is intended to protect payment card data and processes. The PCI (Payment Card Industry) Data Security Standard is a global standard; compliance with the standard reduces the risk of credit card fraud and the resultant cost (both financial and reputational) to the organisation.

The scale of the problem should not be underestimated. A Government report on the results of a survey on information security breaches revealed that over 90% of large organisations (those employing over 250 people) were affected by security breaches, with an average of 113 breaches per organisation in 2012. This is perhaps indicative of the growth in cyber crime, which can range from the sale of credit card numbers through to sophisticated schemes to steal on a large scale.

Risk was a recurring theme throughout the day. Clearly a starting point has to be that the institution needs to know where payments (or card details) are being taken and whether or not the information is stored. Once the location of the information is known, an assessment can then be made of the risk (and impact) of its loss and proportionate measures introduced to protect and secure it. There was general agreement that the potential loss of payment card data should be included in the institution’s risk register and so be clearly visible to the governing body.

Implementing technical measures to protect data is only part of the solution. The report on information security breaches notes that over a third of breaches were the result of inadvertent staff error. Training is critical to ensure that staff are aware of their responsibilities; this needs to take place at the start of employment (including when there has been a role change within the institution) and at regular points thereafter (the suggestion was at least annually).

Whilst poorly trained staff present a risk of security breaches, so too do poor processes. One of the recommendations made at the conference was to review business processes to see whether they could be re-engineered so that it was not necessary to use card data. Obviously if card data is not being used in a process, then the risk of its loss disappears and so too does the need to comply with the DSS.

This summary only gives a brief snapshot of some of the issues being faced by institutions seeking to implement the standard. It was clear that institutions are at different stages in their adoption of the standard and that the barriers to adoption are not always technical. UCISA is looking to work with the PCI DSS SIG and with our sister organisation for Finance Directors, BUFDG, to promote best practice in this area and so reduce the risk of our institutions falling victim to fraud.