Author: Jay Fearn

Digital transformation hype not matched by reality, finds new research from Claranet

Major pan-European research project highlights the complexities faced by CIOs as they navigate the changing IT landscape

In the hype surrounding digital transformation, the process is often presented as one which can be completed almost overnight. However, new research commissioned by Claranet has found that the reality is considerably different, with organisations facing a wide range of organisational, technical and operational barriers to change. According to the managed services provider, if IT leaders are to successfully generate value for their organisations they must focus on iterative change, putting their applications and increased automation at the heart of IT strategy.

Formally launched today, the report – Beyond Digital Transformation: Reality check for European IT and Digital leaders – explores the responses of 750 IT decision-makers from the UK, France, Germany, Spain, Portugal and the Benelux, on how their businesses manage and host their applications and how well-placed they are to adapt to the new digital economy.

Key findings from the research include:

  • 87 per cent report barriers to implementing organisation-wide technology changes, with skills shortages in the IT department (34 per cent), a lack of time to make changes (29 per cent), and a lack of support from senior management (28 per cent) given as key reasons;
  • Eight in ten (81 per cent) agree that they should experiment more with new processes and technology;
  • 48 per cent of organisations report that their IT department is stuck in a reactive mode;
  • Over half (55 per cent) state that their applications are time-consuming and complicated to maintain;
  • Just 10 per cent of respondents said that their organisation is agile in its approach to IT processes.

Commenting on the research findings, Michel Robert, Claranet’s UK Managing Director, said:

Business and IT leaders are facing ever-increasing pressure to transform their operations. Greater competition, heightened customer demands, and decreasing tolerance for technology faults and inflexible IT systems are all creating new imperatives for change. It should come as little surprise that so many businesses have bought into the concept of ‘digital transformation’.”

But for most, that change will take time to implement, and while the increasingly common term digital transformation conjures up images of overnight metamorphosis, this research confirms that the picture, especially for mid-market organisations, is much more complicated.”

Organisations across Europe are gradually filtering more progressive IT practices into their operations, and there is a clear recognition of the importance of applications in improving the customer (both internal and external) experience, but barriers to the adoption of these practices exist. Only about one in ten European businesses report that their applications and the infrastructure to support them are where they need to be in terms of stability, reliability and responsiveness; IT systems are still fragmented in the majority of cases; and data sets are largely disparate, making them challenging to draw valuable insights from. While the majority of European businesses have started on their journey, it will take some time before they get there.”

The term digital transformation is a misnomer, and while that might be a matter of semantics, the risk is that it widens the gap between IT and the rest of the business; the latter expecting overnight change and the former contending with increasing infrastructure complexity, skills shortages and cost-cutting pressures. This creates a real need for strong leadership to ensure IT and Digital projects have a measurable impact on improving customer experience and uniting the business to improve performance.”

Robert concluded by stating that instead of the ‘big bang’ approach, IT leaders would be well-advised to approach things in a more focused and iterative manner, developing a philosophy of continuous improvement to boost competitive performance without having to periodically enact disruptive change.

There are right and wrong ways to move to the cloud. Simply ‘lifting and shifting’ an IT estate to the cloud and assuming the job is immediately done is not sufficient, and a ‘big bang’ approach – where businesses race to incorporate every cloud tool immediately – can lead to a loss of control and increased risk. Instead, a process of incremental change is the way to go, and enables a cloud environment to bed in and grow at a sensible pace.”

Our recommendation is for businesses to do more planning before any migrations start to check whether their applications are already cloud-ready or need to be re-engineered. This is essential as changes are often required to take full advantage of automation, scalability, rapid development and other features offered in the cloud. The right strategy will vary, sometimes considerably, depending on the nature of the application, so it is crucial that this is taken into account. Successful migrations are very rarely ‘lift and shift’ and strong leadership, with a clear direction of travel, is required.”

For more information about the research, and to download the full report, visit: www.claranet.com/research

Snow Joke – Why bad weather is bad news for business

Why bad weather is bad news for business (and how you can mitigate the risk)

We had a picturesque display of snow last week, and in some places the country fell into chaos again: snow and ice caused massive disruption to planes, trains and automobiles in England and Wales. That left a lot of people unable to get to work on time, their email unattended, their phones unanswered.

That’s also a lot of unhappy customers trying to get in touch with you.

This is certainly not a one-off. A few weeks ago we were battling tube strikes and track failures at Waterloo. A few weeks before that a chemical alert closed the M3 and there was what The Telegraph called “world-wide airport chaos” as check-in systems crashed.

None of these things are the kind of disasters you’d make Bruce Willis movies about, but they can do serious damage to your business. A tube strike, some snowflakes or somebody accidentally severing essential communications cabling can be devastating. It can mean the difference between you being there for your customers or losing them to better-prepared rivals.

There’s a better way

Imagine being stuck at home because the snow’s closed your child’s school. No problem. To your customers, you’re in the office: your business number works on your landline, your mobile or your softphone app.

Or maybe you’re stuck at Tebay Services waiting for the snow ploughs to clear the slip roads. Again, no problem. You can work remotely and run conference calls while you sip your coffee.

The trick is to bring the cloud to your communications in the form of hosted voice services.

The benefits from a disaster recovery perspective are obvious, but hosted voice services are useful when things aren’t hitting the fan too. For example, they enable you to add new employees quickly, whether in small or huge numbers (handy during seasonal peaks or other times of high demand), or to rapidly reconfigure your call handling to meet a specific business objective, such as a new service launch. They also remove the need to purchase fixed-capacity on-premises voice switches, giving you a solution that can grow or shrink with your business needs.

Hosted voice services can also be helpful for data security and regulatory compliance. If you’re in an environment where call recording is mandatory, the platform can provide that as well, no matter what kind of phone or app is being used. A single administration system that’s easy for your IT team to manage reduces site visits and enables you to provide best-in-class communication services across your organisation.

Ultimately, hosted voice services offer something very valuable: peace of mind. They are there whenever you need them and won’t cost you when you don’t; they keep your data safe and your people safer, and they bring extra flexibility to your business.

Not only that, hosted voice can also help you to deliver improved customer service, even if your staff are working from home, stuck in snowdrifts, or dealing with the unexpected arrival of Godzilla in Godalming.

Takeaways:
  • Disasters needn’t mean meteors or Godzilla. Bad weather or cut cables can leave your business stranded.
  • In this fiercely competitive climate, rivals will happily take the calls you can’t answer.
  • Hosted voice services bring cloud flexibility to your communications.
  • Hosted voice facilitates professional contact no matter where you are.
  • Don’t let disasters risk your data security or employees’ privacy.

See how hosted voice could reduce risk, add flexibility, and make life easier for your business.

Click HERE


re:Invent 2017: Four Highlights

By Sam Bashton – Head of Public Cloud, Claranet

Last week was the annual Amazon Web Services re:Invent conference in Las Vegas, where over 40,000 geeks from all over the world gathered to hear the latest news and major product releases (and a few jokes targeted at Oracle’s Larry Ellison). With so many feature enhancements and product releases in an average week at AWS, there’s always plenty to talk about. This year’s conference didn’t disappoint either, with an avalanche of great new technology announced.

I’ll pick just four announcements that I think really show where AWS are heading. In my opinion, although Google and Azure are fighting hard to catch up, it’s hard to see how they can get anywhere close when AWS are playing the strategic game so well. From a technical point of view, many of the new features AWS released match those Google announced at their Next conference last March, but the breadth of what Amazon offers is unrivalled. AWS has an answer for basically every workload, including some (IoT, for example) where they have stolen a significant march on their competitors.

EKS

EKS – or ‘Amazon Elastic Container Service for Kubernetes’, to give it its full name – was a highly anticipated release which I believe really shows how differently AWS behaves compared to the rest of the IT industry.

Kubernetes is an open source container orchestrator initially created by Google, and offered by them as a managed service since 2014. At first, Amazon tried to compete with their own open source project, Blox, but they have clearly been listening to what customers are asking for (“Make Kubernetes on AWS easy!”).

The ability and willingness to adapt, instead of stubbornly doubling down on their own competing product, provides real insight into why AWS will continue to dominate the Cloud landscape for the foreseeable future.

Neptune

Graph databases aren’t at the forefront of IT media consciousness in the way machine learning is, but they can offer many of the same capabilities.

The “People that bought this also bought…” prompt is often trotted out as an example of machine learning, but a graph database can provide this data without needing to be trained on large quantities of data. Until now, running a clustered graph database such as Neo4j in production was possible but carried a sizeable management overhead. Neptune brings the ease of RDS to graph databases, built on the same underlying storage technology as Amazon Aurora. Data is replicated across multiple data centres and read replicas are provided, all in a fully managed service.
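To illustrate the principle (this is a plain in-memory sketch, not Neptune’s actual interface – Neptune itself is queried via Gremlin or SPARQL), a “customers also bought” recommendation is just a two-hop graph traversal, with no model training involved. The product names and purchase data below are invented for the example:

```python
from collections import Counter

# Toy purchase graph: customer -> set of products bought.
# In production this data would live in a graph database such as Neptune.
purchases = {
    "alice": {"camera", "tripod"},
    "bob":   {"camera", "sd_card"},
    "carol": {"camera", "sd_card", "bag"},
    "dave":  {"tripod", "bag"},
}

def also_bought(product):
    """Two-hop traversal: product -> its buyers -> their other products,
    ranked by how often each co-purchase occurs."""
    counts = Counter()
    for basket in purchases.values():
        if product in basket:
            for other in basket - {product}:
                counts[other] += 1
    return [item for item, _ in counts.most_common()]

print(also_bought("camera"))  # 'sd_card' ranks first: two camera buyers also bought it
```

The same traversal in Gremlin would read along the lines of walking from the product vertex to its buyers and out to their other purchases; the point is that the ranking falls straight out of the graph structure, no training data required.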

I expect Neptune to bring renewed interest to this under-appreciated class of databases. Combined with machine learning from SageMaker (below), Neptune gives businesses a way to easily build tools that were previously out of reach for anyone but the tech giants.

SageMaker

Machine learning is complex, and Amazon SageMaker doesn’t change this fact; it won’t be putting any data scientists out of work, but it will make them more productive. What it does is take away much of the ‘undifferentiated heavy lifting’, making it easy to spin up clusters and import, explore and visualise training data.

In a similar vein to the EKS announcement, it’s interesting to note that TensorFlow, another Google open source project, has equal billing with MXNet, the open source project into which Amazon has previously put most of its energy.

DeepLens

I’m constantly surprised by just how much of a lead Amazon’s competitors have let them steal in the IoT space, particularly given the fact that the people Amazon are up against are market leaders in mobile and desktop.

With AWS IoT and Greengrass, announcements from re:Invent in previous years, AWS have created a great environment in which to build IoT devices. DeepLens is effectively a hobbyist device, which demonstrates just how effective these can be when combined with SageMaker. Essentially a small Linux device with a built-in camera and a powerful graphics processor, DeepLens has been built to let people play with computer vision machine learning. At re:Invent, Amazon got DeepLens to detect people and hot dogs in just a couple of hours, and I think this is an amazing tool for letting people easily and cheaply prototype new features.

Find out more.

The 4 stages of your AWS Cloud journey

Depending on who you believe, hype around the Cloud is either reaching the “Peak of Inflated Expectations” (where early success stories capture lots of media and industry attention), or is already slipping quickly towards the “Trough of Disillusionment” (where early deployments fail to live up to expectations and the industry begins to re-evaluate its approach).

Claranet boosts revenues by over 40% in FY17

Organic growth and strategic acquisitions see Claranet Group’s turnover hit £216.5 million for the year ended 30 June 2017, and annualised revenue of £310 million

Claranet has released financial results for the year ended June 2017; a year which saw annual revenues increase by 42 per cent. The company’s revenues for the 2017 fiscal year were £216.5 million, up from £152.5 million the previous year, while Adjusted EBITDA grew by 32 per cent to £38.7 million.

The Group’s international expansion was led by a series of strategic acquisitions, including that of Brazilian public cloud services provider CredibiliT in December 2016, which represented Claranet’s entrance into the South American market. This was complemented by the opening of Claranet Italy in February 2017, capitalising on a growing demand for managed public cloud services in the Italian market.

Claranet also successfully consolidated its presence in its established regions, through the acquisition of high-availability application management expert Ardenta and security solutions provider Sec-1 in the UK, French DevOps and cloud specialist Oxalide, Portuguese IT business ITEN Solutions and IT provider Rely in the Netherlands.

Commenting on this year’s growth, Charles Nasser, Founder and CEO of Claranet, said:

The most exciting aspect of our growth in scale and capability is providing services that are increasingly relevant to our customers’ journey, allowing us to develop ever stronger relationships. As we continue to expand our portfolio of services, we are also attracting larger customers with a broader range of services.”

This strategy has enabled us to make significant inroads with upcoming technologies and related services in the areas of Public Cloud, DevOps, Security and Big Data.”

The past year has been one of the most significant in Claranet’s 21-year history, marked by a wide range of strategic acquisitions, our expansion into two completely new markets, and the strengthening of our service portfolio. The steps we have taken to grow the business provide the ideal platform from which we can consolidate our position in the market and pursue further growth as the IT services industry continues to evolve and consolidate.”

Our work over the past year has centred on building an organisation that caters to the increasingly complex application needs of modern businesses while remaining agile. We’re growing quickly both in terms of our size and capabilities, but as a business we’re doing what we’ve always done; staying close to our customers, making sure we’re delivering the best possible service and taking advantage of new technologies to help our customers do amazing things.”

Nigel Fairhurst, Chief Financial Officer at Claranet, commented:

Our results for the year highlight the positive impact our acquisition strategy and refinancing exercise have had on our position in the market. At the end of FY17, annualised revenue for the Group was in excess of £310 million, roughly double the revenue reported in FY16.”

Claranet’s growth this year has also been boosted by its refinancing exercise, which has provided the company with long-term funding and an incremental acquisition facility of £80 million. In addition, Tikehau Capital have acquired a minority shareholding in Claranet.

Fairhurst added:

The investments we’ve made over the past few years in our staff, technical expertise and partnerships mean that we’re now capable of competing with some of the biggest players in the industry, and we fully expect to maintain this momentum into the next financial year.”

Information security challenges holding back innovation in the financial services sector

  • New research reveals that information security is the most common challenge facing IT departments in the financial services sector, with almost 6 out of 10 seeing it as a primary point of concern
  • 58 per cent of financial services organisations have encountered issues with securing customer data when attempting to improve the customer digital user experience

Financial services organisations are feeling particularly challenged by the need to secure their applications and data, and for many this is hindering their efforts to adapt rapidly to changing market conditions.

Surveying 138 IT and digital decision makers across financial services organisations, technology market research firm Vanson Bourne found that 57 per cent of financial services sector respondents considered information security to be one of the biggest challenges facing their organisation. Worryingly, these information security difficulties are having implications for the sector’s ability to adapt to a changing business environment, with 62 per cent of financial services respondents stating that an inability to properly manage security is holding back innovation.

Jason Zimmer, a FinTech specialist at Claranet, commented on the significance of these findings:

Financial services are under pressure from both the demand side and the supply side. With regard to the former, the 21st-century consumer has become accustomed to near-seamless service from organisations. FinTechs and new start-ups are increasingly addressing these needs, resulting in increased consumer expectations when they engage with their financial services provider. In turn, this is putting more pressure on traditional financial services companies to do the same – not just in their client-facing applications, but in their internal operations too.”

Regardless of the need to adapt to a rapidly changing market, however, security has to lie at the heart of everything that financial services organisations do – these businesses deal with some of their customers’ most sensitive information. In our research, we have found that too often the challenge of guaranteeing the requisite level of security has held financial services organisations back from adapting to this new reality.”

Zimmer cited the user experience on digital channels as an example of an area where security challenges are proving a problem for effective innovation:

A key battleground in this rapidly changing market is providing customers with the best access to products, services, and assistance via digital channels. Digital-only FinTechs are taking this to a new level, with start-ups such as Atom Bank taking advantage of the modern consumer’s digital-savviness and intolerance of inconvenience to revolutionise how personal finance works. However, 58 per cent of financial services organisations have found that securing customer details has been an obstacle when trying to improve this digital user experience for their customers.”

Whilst innovation and security can sometimes seem to be opposing priorities, in financial services they have to move hand-in-hand to prevent exciting initiatives from being stymied by a lack of data protection or compliance. For many financial services organisations, the best way to do this is by partnering with specialist organisations that understand the sheer breadth and depth of security threats. This gives the organisation space to focus its own resources on continuously improving customer experience and service. In an increasingly disruptive market that means finding new ways to do business whilst ensuring that customers and internal stakeholders are confident in the security of data.”

Growth in leaked exploit attacks means penetration testing should be a front-line defensive measure, warns Sec-1

Actively rooting out vulnerabilities is the most effective way of preventing attacks of this nature

Recent research by Kaspersky Lab has found that leaked exploits have rapidly become one of the most dangerous methods of compromising vulnerable systems, with more than five million attacks blocked by the company in the second quarter of 2017 alone. This highlights the vital importance of adequate and frequent penetration testing procedures in finding software flaws, and taking appropriate action before an attack can take place. This is according to internet security experts at Sec-1, a Claranet Group company.

Attackers use phishing emails or hijacked websites to spread malware loaded with an exploit. An exploit is a piece of software that takes advantage of a vulnerability in order to gain access or, in the case of ransomware, encrypt data on the device. Recent attacks, such as WannaCry and NotPetya, have the ability to spread and hunt out machines without the latest patches and updates installed. Others, like the original CryptoLocker, which first appeared in 2013, spread through spam messages and exploit kits that rely on manipulating user behaviour. Either way, these attacks can succeed, so organisations need to redouble their efforts to patch vulnerabilities in their systems. This should go hand-in-hand with existing security efforts that focus on user behaviour.

Holly Williams, Senior Security Consultant at Sec-1 said:

Seeing malware authors bundle leaked exploits in order to improve propagation rates highlights the need for testing of the internal corporate network. This is something that is often overlooked in favour of purely testing the perimeter.”

Zero-day attacks are a concern for IT teams, and for the wider business as a whole, due to their very nature as an assault on an undisclosed vulnerability: even the most up-to-date systems can be compromised. However, real-world zero-day attacks are still extremely rare, and to date the most effective attacks have exploited known vulnerabilities – an example being the Flash vulnerability, CVE-2015-7645. The trend for leaked exploits to be added to malware shows that attackers are becoming more sophisticated in a bid to capitalise on insufficient attention to patching and good security hygiene. Now, more than ever, the justification for performing regular penetration testing is clear: find the unpatched vulnerabilities well before the hackers can get to them.”

Alongside this, it is crucial to note that many of the recent high-profile leaked exploits, such as EternalBlue, used by the WannaCry and NotPetya malware, actually had a patch already available. This malware also used previously known hacking methods. Again, comprehensive, frequent penetration testing can prevent this from becoming a problem.

Of recent malware, NotPetya in particular was described as having done something advanced for malware. However, the method of credential extraction it used is already well known to penetration testers and other security experts. For both WannaCry and NotPetya, a patch was made available months before the attack actually hit. This points to many organisations needing to get a much better handle on the pre-existing vulnerabilities in their systems.”

To help make this happen, Williams feels that entrusting the responsibility for penetration testing to a third party can be hugely beneficial.

A third party organisation brings a fresh pair of eyes to the testing process, meaning they can often spot vulnerabilities (and an absence of available patches) more effectively than IT staff who have been close to the system for a long period of time. In short though, it all boils down to being better prepared: exploits can be hugely dangerous, so implementing the right testing procedures aimed at determining where current security practices are insufficient should be a key priority.”