Building the next-gen data centre: where traditional and web-scale apps co-exist
For all the excitement around microservices and cloud-native apps, the enterprise hasn’t lost its need for traditional applications.
It’s unclear whether Mark Twain ever actually uttered: “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” But whoever said it, the sentiment perfectly sums up the challenges currently facing CIOs and data centre managers.
The next few months will be transformational for the server market and for data centres. New technologies, products and services will change the way IT is consumed.
To make sense of these new products and services, we need to step back and consider the basic market dynamics shaping the challenges CIOs and data centre managers face.
You don’t have to be a rocket scientist to see that the amount of data moving through the world is growing exponentially. The drivers of this data growth are the capability and capacity of devices, greater accessibility and increased usage. Some industry analysts believe these factors will drive data growth by a factor of six by 2020.
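A sixfold increase sounds abstract, so it can help to translate it into an annual rate. As a rough illustration only – the article does not state a baseline year, so a 2013 starting point (a seven-year horizon) is assumed here – the implied compound annual growth rate works out to roughly 29 per cent:

```python
# Illustrative arithmetic only: the article cites analysts expecting ~6x
# data growth "by 2020". Assuming a 2013 baseline (a seven-year horizon,
# an assumption not stated in the article), the implied compound annual
# growth rate (CAGR) is:
years = 7          # assumed horizon: 2013 -> 2020
growth_factor = 6  # total growth over the horizon

cagr = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 29% per year
```

In other words, data volumes compounding at nearly 30 per cent a year – which is why capacity planned for today’s workloads is quickly outgrown.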
The simple fact is that this data must be processed by servers in data centres, both public and private. This introduces new challenges to traditional data centres that have been built around the assumption that the company or organisation determines structure, value, quantity and accessibility to data.
Consider that there is a ‘sea’ of data on the internet compared to the ‘ponds’ available in companies and organisations. Unlike traditional data in company systems, this ‘sea’ is unstructured, and the majority is of little or no value. You may have to sort through a million pictures of ‘cats’ before finding data that provides valuable insight into a customer’s behaviour or buying patterns.
Big data, analytics and artificial intelligence (AI) offer a technical solution to the problem by providing the necessary tools. However, we tend to focus on the buzzword ‘big data’ and miss its intrinsic value. In the case of a business, we have to look at big data as providing a more detailed picture of existing customers and their behaviours, and information relating to potential future clients.
Hidden within the noise of this data growth are applications, which I consider the most important players in the data centre. Just as the map app on your phone holds the logic to get you from A to B, a business application holds the logic that enables business processes to occur, like booking an airline ticket or performing a bank transaction.
Changing a business process requires a change in the application, while increasing the performance of an application increases the performance of the business process.
Applications are changing from monolithic, organisation-defined systems to flexible, dynamic ones shaped by end-user usage patterns. This complicates matters for the CIO or data centre manager: the traditional applications that keep a business’s lights on must be maintained and upgraded, while an environment must also be provided for the new class of applications.
Failure to provide suitable infrastructure to support these new demands has, in many cases, resulted in dev teams using public cloud systems for both developing and hosting the applications. This causes an additional headache where IT budgets are not controlled by the IT departments responsible for data centres, and controls relating to security, backup and disaster recovery are overlooked.
Cloud computing has transformed corporate IT beyond recognition. It is important to look at its effect on data centres in the last five to eight years to get some insight into what’s in store for the future.
When cloud computing first burst onto the scene, it was hailed as the last nail in the coffin of corporate data centres. Everything would be hosted in a few massive data centres the size of small towns – with power requirements to match – and that would be that.

The reality turned out a little different. The initial hosting capability was for basic VMs, not full-blown company applications, and a number of organisations and governments became concerned about the location and security of customer data. This forced cloud providers to evolve by providing regional and local capabilities, as well as enhancing their offerings to accommodate more complex application environments.

However, this didn’t fully address the needs of organisations that want full control over their applications and data – a requirement that is growing with the threats posed by a rise in orchestrated attacks, data theft and ransomware.
Given these conditions, a hybrid cloud model is currently seen as the best way forward. You have the ease of deployment and the advantages of a public cloud, along with the security and control of a classic data centre. If required, you can also access the public cloud for additional performance or capacity. It is interesting to note that this is again app-centric and being driven by the traditional application providers like Microsoft, SAP and IBM, who have a strong tradition in corporate data centres.
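The “burst to public cloud for additional performance or capacity” idea can be sketched in a few lines. This is a toy placement function, not any vendor’s API – the workload “units” and function name are hypothetical – but it captures the hybrid principle: keep workloads in the private data centre while capacity allows, and send only the overflow to the public cloud.

```python
# Toy sketch of hybrid-cloud "burst" placement (illustrative, not a real API).
def place_workloads(demands, private_capacity):
    """Return (private, public) lists of workload demands.

    demands: per-workload capacity units (a hypothetical measure).
    private_capacity: units available in the on-premises data centre.
    """
    private, public, used = [], [], 0
    for d in demands:
        if used + d <= private_capacity:
            private.append(d)   # fits on-premises: keep it under local control
            used += d
        else:
            public.append(d)    # overflow "bursts" to the public cloud
    return private, public

# Example: with 100 units on-prem, the 60-unit job no longer fits and bursts.
print(place_workloads([40, 30, 60, 20], private_capacity=100))
```

Real schedulers weigh far more than raw capacity – data residency, licensing and security policy all factor in – but the shape of the decision is the same.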
Customers that choose Lenovo as their data centre infrastructure provider consistently value three factors.
The first is partnerships. As Lenovo does not provide hypervisors or applications, it is free to partner with leading vendors during the development stage of products. Our long-term partnership with SAP proves this – over a number of years, SAP has not only selected our servers for the development of its SAP HANA product, but also used our servers to build three SAP HANA cloud systems. Consequently, Lenovo is the market leader in SAP HANA solutions worldwide, with over 50 per cent market share.
The second factor relates to the transformation taking place in the data centre market. Lenovo does not have a legacy to protect in the areas of networking and storage, which are undergoing the most dramatic changes. As such, we can move at the speed the customer desires when adopting the latest technologies – tech that could potentially change the cost structure of the data centre.
Thirdly, many of the new technologies I have referred to in this article have come from the web-scale and high-performance computing (HPC) spaces. Lenovo is number two in the TOP500 list of supercomputers, and the majority of leading academic organisations use Lenovo systems. These are some of the most demanding computing environments on the planet, and Lenovo has the skills to plan, deploy and support them. We harness these valuable skills and the knowledge gained in such complex environments, and then combine them with our experience in manufacturing and logistics. The purpose: to build a new class of server system.
Fundamentally, customers want different outcomes from their investment in future data centre infrastructure. Einstein is believed to have said, ‘The definition of insanity is doing the same thing over and over again but expecting different results’ – Lenovo offers customers a way of doing things differently.
Join Lenovo at ISC High Performance 2017 from June 19 to 21 in Frankfurt, Germany. You will not only see some of the HPC platforms that have made us number one in Europe, but you will also witness the shape of things to come.