Digital transformation – dare to dream!

Sara Somerville
Information Solutions Manager
University of Glasgow

Day 1 of the AIIM 2016 conference

The introductory workshop with Thornton May provided discussion points for a smaller group of attendees, getting us thinking about what transformation really means.

Thornton believes transformation has three elements:

  • Trends
  • Wild cards
  • Dreams

Many executives are ‘gee whiz nexties’ who spend a lot of time thinking about the next shiny new piece of technology; but where are the dreams and innovations? We are living in an information environment, so basically, information professionals should rule!

The opening keynote from the president of AIIM, John Mancini, reflected on the twenty years John has been with AIIM. Technology and the information landscape have changed immeasurably since 1996. Back then, the iPhone was still eleven years away; there was no Google, no Twitter, no Wikipedia. And these changes are just a shadow of the change that is to come: John advised that we need to exercise humility when we consider the future and what it might hold for technology.

There are three main disrupters accelerating at a pace that could not have been predicted.

However, what distinguishes organisations in an information age is the difference in mindset between those which function in the mainstream and those which function on the edge.

If you compare the two with regard to the following themes, you can see how this manifests:

  • Mindsets – those on the edge will do things themselves, whereas those in the mainstream will contact IT
  • Messages – organisations on the edge are using Slack whilst mainstream organisations use SharePoint; and those on the edge emphasise innovation whilst the mainstream aims for efficiency
  • Money (where it’s being spent) – on the edge, in the last five to ten years, the big IT players have created wealth equivalent to the GDP of Korea!
  • Machines (what technology is being used) – on the edge they use the cloud whilst the mainstream uses on-premise servers; on the edge it’s mobile, versus PC for the mainstream; those on the edge configure and connect, while the mainstream builds and develops.

The problems created by all this radical disruption can be broadly split into three areas: automation, security and governance, and insight. Information professionals can make a real difference in all three of these areas:

  • Automation – information professionals can help to identify and automate business processes
  • Security and governance – they can help organisations secure and manage information
  • Insight – they can obtain value from big data and analytics.

AIIM has always believed that information management is about people, process and technology – the technology might change in the future, but the people and process will remain constant.

So, who now owns the big picture? Who provides the ‘adult supervision’ for all of this? Who acts as the bridge between people, process and technology?  According to John, the information professional should be the glue – now is not the time to stick your head in the sand; now is the time to ‘own this’!

Key takeaway:  We are entering an information renaissance and each and every one of us is a Michelangelo!

Looking beyond the present

The opening keynote was followed by a panel discussion entitled Industry Insights 2020 Expert Panel: Consumerization + Simplification = Digital Transformation, chaired by futurist Thornton May with Andrea Chiappe, David Calderia, Hugo Palacios, and Stephen Ludlow.

The panellists discussed what organisations are really concerned with when they talk about digital transformation. It was agreed that it’s not just about taking the paper out of a process, but about using a whole new approach and thinking about changing your business model. Information professionals should think about what they are not seeing at the moment (e.g. people using Slack). We should also be wary of self-selecting the familiar by defaulting to the traditional technological approach.

For a lot of companies the worry is that their competitors will be the ones to smash through and use the latest technologies, leaving them behind. Where is the big spending? In Business Intelligence (BI), and in determining the best use of analytics. However, it’s no good having secure information if it isn’t available at the right time.

Key takeaway: There is a new world that requires a completely new approach and new ways of thinking.

Pan-European implementation

We then broke out for the final sessions of the day, and I attended the session entitled Implementing Automated Retention at the European Central Bank with Beatriz Garcia Garrido and Maria Luisa Di Biagio.

The bank uses the Open Text system, which it began implementing in 2007. There are now eighteen thousand users (not only ECB staff but counterparts across Europe) and eight million documents.

Why implement retention?

  • to keep only what is needed
  • for historical reasons
  • to comply with legislation

The bank ran a pilot to validate the approach and to test the processes for managing the information. The pilot highlighted how complex it was to build a retention schedule and apply it to the right information at the right level; so they took a step back and focused on the final goal of applying retention in the simplest way. They created a task force made up of records managers, archivists, and some Open Text consultants. The task force re-examined the retention schedules, which had originally been designed for paper records, and looked at how difficult they were to apply in an electronic world.

The schedules were analysed and simplified, rolling up some of the timescales (e.g. one year/five years/ten years/permanent) and adding information about each record series so that each series had only an event-based or a time-based trigger. The system was also tweaked to make it simpler.

During the implementation phase the retention policies were applied at folder level, and deletion reports were sent to the business users for approval. Documents were automatically declared as records two years after their creation (if they hadn’t already been manually declared as records).

There is a mixture of user-driven and automated retention application: time-based retention is applied at document level, and event-based retention is applied at folder level.
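
As a minimal sketch of how the two trigger types differ (my own illustration in Python, not the ECB’s actual configuration or system logic; the retention periods and event are hypothetical), a time-based trigger yields a disposal date as soon as the document is created, whereas an event-based trigger leaves the retention clock stopped until the event is recorded against the folder:

    from datetime import date
    from typing import Optional

    def time_based_disposal(created: date, years: int) -> date:
        """Time-based trigger: the disposal date is fixed from creation
        (leap-day edge cases ignored in this sketch)."""
        return date(created.year + years, created.month, created.day)

    def event_based_disposal(event: Optional[date], years: int) -> Optional[date]:
        """Event-based trigger: no disposal date until the triggering
        event (e.g. closure of the folder's case) has been recorded."""
        if event is None:
            return None  # retention clock has not started yet
        return date(event.year + years, event.month, event.day)

    print(time_based_disposal(date(2015, 3, 1), 5))      # 2020-03-01
    print(event_based_disposal(None, 10))                # None: event still pending
    print(event_based_disposal(date(2016, 6, 30), 10))   # 2026-06-30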

Roles and responsibilities: there is a project board and a project manager, and for each business area an implementation team that includes a business user as well as a records manager and an archivist. (Implementation runs from January 2015 to December 2016.)

Implementation has been planned in phases.

The communication channels for the project include the executive board, senior management, the users, and the information management forum for each area.

For the full implementation phase, the team plans to replace the deletion reports sent to the business areas with a yearly review of the retention periods within those areas. Future challenges include the preservation of records with long-term retention (being scoped as a separate project) and content not held in the Electronic Document and Records Management System (EDRMS), e.g. emails.

Key takeaway: The integration of policy, systems, and processes is essential.

PaaS, bots, alerts and using analytics to improve web performance

Giuseppe Sollazzo
Senior Systems Analyst
St George’s, University of London

Storytelling at Velocity

The second day of the O’Reilly Velocity conference was definitely about storytelling: keynotes and sessions alike were descriptions of performance-enhancement projects or accounts of particular experiences in the realm of systems management, and in all honesty, many of these stories resonate with our daily experience running IT services in an academic environment. I will give a general summary, but also mention the names of the speakers I found most useful.

Evolution in the Internet of Things age
The first session, an attention-catching keynote by Scott Jenson, Google’s Physical Web project lead, centred on a curious observation: most attention in web performance has traditionally been focused on the “body”, the page itself, while the most interesting and performance-challenged part is actually the address bar.

Starting from this point, Scott illustrated how the web is evolving and what its characteristics will be, especially in the Internet of Things age. He advocated for this to be an “open” project, rather than Google’s.

Another excellent point he made is that control should be given back to the users. This was illustrated by a comparison between a QR code and an iBeacon: the former requires the user to take action; the latter is proactive towards a passive user. Although we like the idea of proactive applications, it only takes walking into a room full of them to understand that being in control can be a good thing.

PaaS for Government as a Platform
Most of the conference talks centred on monitoring and analytics as a way to manage performance. Among the most interesting, Anna Shipman of the UK Government Digital Service (GDS) illustrated how they are choosing a Platform-as-a-Service supplier in order to implement their “Government-as-a-Platform” vision.

I’ve argued a lot in the past that UK academia will need, sooner or later, to go through a “GDS moment” to get back to innovating in a way it can control – as opposed to outsourcing in bulk – and this talk was definitely a reminder of that.

Rise of the bot
As with yesterday’s Velocity sessions, some truly mind-boggling statistics were released today. One example is that many servers are overwhelmed by web crawlers or “bots”, the automated software agents that index websites for search engines. In his presentation From RUM to robot crawl experience!, Klaus Enzenhofer of Dynatrace told the audience that he had spoken to several companies for which two thirds of all the traffic they receive comes from Google’s bots. “We need a data centre just for Google”, they say.

Analytics for web performance
There has been quite a lot of discussion around monitoring vs. analysis. In his presentation Analytics is the new monitoring: Anomaly detection applied to web performance, Bart De Vylder of CoScale argued for adopting data science techniques to build automated analysis procedures for smart, adaptive alerting on anomalies. This requires understanding the domain of the anomalies in order to plan how the monitoring should evolve, considering for example seasonal variations in web access.
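
As a minimal sketch of the idea (my own illustration, not CoScale’s actual method; the data and threshold are hypothetical), an adaptive alert can compare each new measurement against a baseline built only from observations at the same point in the seasonal cycle, say the same hour of the week, so that expected daily and weekly swings don’t trigger alerts:

    from statistics import mean, stdev

    def is_anomaly(same_hour_history, value, threshold=3.0):
        """Flag `value` if it sits more than `threshold` standard
        deviations away from the seasonal baseline."""
        if len(same_hour_history) < 2:
            return False  # not enough history to judge
        mu, sigma = mean(same_hour_history), stdev(same_hour_history)
        if sigma == 0:
            return value != mu
        return abs(value - mu) / sigma > threshold

    # Hypothetical response times (ms) observed at this hour of the week
    history = [210, 195, 205, 220, 198, 202, 215]
    print(is_anomaly(history, 480))  # True: well outside the baseline
    print(is_anomaly(history, 207))  # False: normal variation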

Using alerts
On a similar note was the most oversubscribed talk of the day, a 40-minute session by Sarah Wells of the Financial Times which saw over 200 attendees (with many trying to get a glimpse from outside the doors). Sarah told the audience how easy it is to be overwhelmed by alerts: in the FT’s case, they perform 1.5 million checks per day, generating over 400 alerts per day. She gave an account of their experience trimming down these figures. Very interestingly, the FT has adopted the cloud as a technology, but they haven’t bought it from an external supplier: they’ve built it themselves, with great attention to performance, cost, and compliance – surely a strategy I subscribe to.

Conference creation
I also attended an interesting non-technical session by another Financial Times employee, Mark Barnes, who explained how they conceived the idea of an internal tech conference and how they run it effectively.

The conference, hailed as an internal success and attended by an international crowd, grew out of an office party and has reportedly helped improve internal communications at all levels. As a conference/unconference organiser myself (OpenDataCamp, UKHealthCamp, WhereCampEU, UKGovCamp, and more), having this insight from the Financial Times will be invaluable for future events.

I’m continuing to fill in this Google doc with technical information and links from the sessions I attend, so have a look if you’re interested.

Enterprise architecture and digital business

Allister Homes
Senior Systems Architect
University of Lincoln

Gartner EA Summit Day 1

There are attendees at the summit from all over Europe, and it’s probably fair to say that, coming from the HE sector, our EA practice is somewhat more modest than many of the others represented here; I think that would be true of almost any HE institution, not just Lincoln.

I haven’t seen any information on the number of delegates, but I’d estimate there are somewhere in the region of 400-600, and it’s quite a busy summit in terms of both the number of people and the number of sessions. I thought the most straightforward way of blogging this would simply be to summarise each session I attended, so here goes.

Architecting the digital business ecosystem
As Gartner’s opening keynote session, this started by concentrating on the fact that more and more businesses operate in the digital world – not just having a presence, but offering their services and interaction digitally. Gartner’s figures are that 22% of businesses operate in the digital world now, and that by 2020 the figure will be 83%. Digital business will mean business growth, and those who do not embrace it will lose out to those who do. We heard about the seismic shift happening, caused by the Internet of Things and what Gartner calls the ‘Nexus of Forces’.

It is estimated that by 2020 the number of Things connected to the internet will outnumber people by at least five to one. We heard a story of how a connected home might, in a few years’ time, react to a tree crashing into the bathroom during a storm whilst the owner is away: cutting off electricity and water to protect the property, contacting the insurance company to start arranging quotes, passing the information to the police, and ensuring the property is safe and secure. As enterprise architects we need to be aware of new technology and innovations so that we can become ‘vanguards’ and shape their use in the enterprise, which will become continually more consumer-focused rather than enterprise-user-focused.

We analyse too much and synthesise too little

This session focused on trying to change the way EAs think: rather than always relying on analysis, the suggestion was that we rely on synthesis a little more. We were told how analysis does not work well when there is no data, when there is ambiguity, and when there are dilemmas, followed by a short history of synthesis starting with dialectical reasoning. Some of the simpler examples of where synthesis can help were how to cut costs in a particular banking scenario (don’t rely only on the distinct possibilities provided by analysis) and the ‘chicken or beef’ question on aeroplane flights (either ask for both or eat before boarding!).

The state and innovation: from fixing markets to market making

Professor Mariana Mazzucato from the University of Sussex presented on innovation-led growth and the roles of the public and private sectors. She described the cartoon image many have, which assumes all innovation happens in the private sector whereas the public sector is rational, boring, full of dinosaur bureaucrats, and exists (in terms of innovation in markets) simply to level the playing field. However, she went on to explain that science is simply not funded enough without state support, which is needed to fix market problems. In fact, public-sector-funded innovation is often mission-oriented, such as sending mankind to the moon, and massive public sector innovation and investment has led to much of the technology that makes things like smartphones possible – think of GPS, voice recognition, touch screens, cellular communication, signal compression and more.

What can sometimes be forgotten in the public sector, though, is to apply what is taught and used elsewhere, such as the approaches taught in business schools and the lessons that can be learnt from great innovators. One particular example highlighted was that of higher-risk green investment opportunities, which are starved of private funding. In Germany in the early 2000s, when private banks in Europe were reducing loans and the economic crisis was in its early stages, the German public sector provided substantial mission-oriented funding for environmental and climate protection innovation.

Application architecture for digital business

This session, delivered by a Gartner analyst, concentrated on new approaches to application architecture for delivering business digitally. It was emphasised that the focus should first be on business outcomes, which are then converted to technical goals, which lead to technical characteristics, then principles, and then patterns.

Most organisations are still using three-tier (or n-tier) client/server architecture, which is typically designed with static architecture and linear flow in mind. That approach does not work so well with consumers using various devices and with business that needs to change rapidly, so an application architecture of apps and services was suggested instead. This takes service-oriented architecture several steps further: it encourages the use of individual apps for particular personas, devices and scenarios rather than one large client application that does everything, uses services to support the apps, and encourages a many-to-many relationship between apps and services. In this scenario services are broken down much more than they typically are in most environments today, becoming microservices for web-scale SOA.

Examples were provided of how the likes of Netflix, Facebook and Amazon use microservices to scale to massive numbers of concurrent users, with the agility to make hundreds of updates per day to production systems, because these very specific microservices are independently deployable and scalable. The difficulty is that although they provide separation of concerns and are individually easy to understand, you end up with a LOT of services and have to radically alter the rest of the architecture, because there is often no centralised application or data. Third normal form is out of the window with this architecture, and a lot of what was done as server code becomes client code again.
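
To make the idea concrete, here is a minimal sketch of one such microservice (my own illustration in Python using only the standard library; the service name, port and data are hypothetical, not from any of the companies mentioned): a single, independently deployable process that does exactly one job behind a tiny HTTP API.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # In this style each microservice owns its own data rather than
    # sharing a central schema; an in-memory dict stands in for it here.
    RATINGS = {"film-42": 4.7, "film-99": 3.9}

    class RatingsHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expected path: /ratings/<item-id>
            parts = self.path.strip("/").split("/")
            if len(parts) == 2 and parts[0] == "ratings" and parts[1] in RATINGS:
                body = json.dumps({"item": parts[1], "rating": RATINGS[parts[1]]})
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body.encode())
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        # Each service listens on its own port and can be redeployed or
        # scaled without touching the rest of the estate.
        HTTPServer(("", 8001), RatingsHandler).serve_forever()

An app then composes many such services over the network, which is what gives the independent deployability described above, at the cost of having a lot of services to operate.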

Nissan Group: using enterprise architecture to enhance efficiencies and business growth

Enterprise Architects from Nissan presented on their use of MEGA at the Sunderland plant. Nissan had a lot of legacy applications and infrastructure at Sunderland, but not necessarily a good corporate understanding of all of it. Three main drivers led to the need to change this: a security incident, the transfer of some services to third parties, and a corporate objective from Japan to understand and standardise business processes. A project was launched to document 500 applications, the central technology infrastructure and the business processes, and to link them all together into a single EA model. A team of five full-time staff was set the task of doing it, and although the timescales turned out to be a little too ambitious, much of the work is now done, including the development of an application portfolio outside of the MEGA model and the creation of a bespoke metamodel.

The cloud computing scenario: examining cloud maturity and adoption in the age of digital business

This session looked at the adoption of cloud by businesses, how to make assessments, and what to consider in a cloud strategy. The Gartner analyst explained that cloud delivery options are becoming more varied and complex, leading to a “spectrum of cloudiness” that EAs need to understand in order to make the right decisions for the business. It’s not just the delivery model that needs to be considered (public cloud, community cloud, virtual private cloud and private cloud) but also the execution model, for example whether hardware and software are isolated or not. Cloud offerings are still changing quickly, for example by making improvements to security models, and although maturity is growing it is too early to put a final cloud strategy in place; a strategy is needed, but it will have to keep pace with the constantly maturing cloud technologies and offerings. Vendors sometimes complicate this, and an EA needs to be able to break through the vendor ‘fog and hype’ to understand what is really being offered.

It was emphasised that, whether we like it or not, many (not all) vendors now go cloud-first with new solutions rather than traditional software first, which means the decision is shifting from ‘whether and when’ to go cloud to ‘how and how much’. The reasons for moving to cloud solutions are not always cost-based; there is value in business agility, the offloading of responsibilities, and the advantages that provider investment brings in terms of scale and security. An interesting element of the presentation was how business users tend to focus on benefits of cloud such as speed of change (sometimes bypassing IT departments), whereas IT and EA focus tends to be on ubiquity, scale and so on; there needs to be a balance and a meeting point between the two views.

In reality most organisations will use a hybrid of cloud services, and the cloud strategy needs to consider the “spectrum of cloudiness”. Comment was also made that not all vendor offerings are created equal, and work must go into understanding the differences beneath the surface. There are some large vendors, such as IBM, who offer cloud services and will also build cloud solutions, and who often mix both in the same conversation, which can lead to confusion and complex, bespoke solutions. The five questions the presenter suggested asking when defining a cloud strategy are:

  • Where and how should the enterprise consume cloud computing services?
  • Where and how should the enterprise implement private cloud environments?
  • How will we secure, manage and govern cloud services across internal, external and hybrid environments?
  • How does cloud computing factor into our application strategy and architecture?
  • Are there opportunities for the business to become a cloud computing service provider?

Selling EA with stories: start with the end in mind

EE, the mobile telecoms company, presented this session. EE has a mature EA practice which is engaged with all levels of the organisation. It recently refocused EA into three areas:

  • Technology that underpins enterprise
  • Business capability
  • Business architecture

It has a comprehensive EA knowledge base or model, using Sparx, and has a philosophy of ‘doing just enough, just in time’ because otherwise all the time is spent trying (and failing) to keep everything perfectly finished and fully up to date instead of spending time innovating, working with business users and influencing change, which is where the real EA value is.

An opportunity arose for the EA team to create a vision, sell it to the business and propose a set of new initiatives. The team achieved this by first creating a compelling vision, rooted in the present day, based on information already held in the knowledge base (the architecture model), and focused on the main business problems and desired business outcomes. The vision was communicated through personas and stories, which were designed to be factual while also appealing to emotions, by highlighting key frustrations and weaknesses and how the vision would help overcome them. The vision, including the stories, was presented on a single sheet, with short biographies of the personas, themes in columns, and the use of icons and short stories. No formal architecture documents were given to the business at this stage, but the detail had to be ready in case the outcomes were positive and the team was invited back by the senior stakeholders for further dialogue on roadmaps, the sequence of capabilities, business benefits and costs. The approach was successful, with three of the five major initiatives being started.