Using HPC to unlock big data’s potential

William Payne

Friday 17 June 2016

As big data becomes increasingly strategic to organisations, conventional technologies are failing to fulfil their needs. Only high-performance computing (HPC) has the power, scale and sheer speed to place big data at the heart of critical business processes and strategic decision-making.

‘Time is money’ is a well-proven business mantra, and it’s the reason why so many firms and government agencies are turning to HPC to manage their big data needs – something that’s fast becoming vital to many organisations’ day-to-day operations.

Conventional computing approaches to big data can be too slow to deliver answers when they are needed, and there’s no point investing money and effort in big data if results arrive too late to be useful.

Across a range of industries, HPC is providing private- and public-sector organisations with sophisticated, in-depth analysis of entire big data sets in seconds, where conventional configurations would take hours or even days. This speed means that, with HPC, analysts and executives don’t have to wait hours between queries but can explore huge data sets interactively, seeing in seconds which approaches or strategies might best serve their business aims.

Handling big data in real time

When enterprises are dealing with extremely high velocities of data, HPC systems equipped with unrivalled network, memory and processor configurations can keep pace even in the most challenging conditions. Real-time fraud detection and personalised over-the-phone insurance quotes are just two examples where HPC-based big data processing can provide market-beating solutions.

It’s not just complex real-time applications that are attracting firms to HPC. Carrying out sophisticated molecular or engineering simulations in seconds rather than days can multiply the productivity of research and development (R&D), product design and engineering teams, with ripple effects across the organisational value chain.

Likewise, computing in minutes the complex interactions and relationships across huge social networks in response to new messaging or product announcements makes firms and organisations far more agile in the marketplace, able to respond to market intelligence and adapt ongoing campaigns as they happen. By contrast, conventional big data configurations will usually only tell you how and why it all went wrong after the event.

HPC can also go where conventional big data configurations can’t. The most complex social network analysis – involving graph computing – can only be carried out effectively for real-world big data sets on HPC systems, which are optimised for these tasks and equipped with the memory, adaptive processing and GPU acceleration to crunch massive network analysis challenges that conventional architectures cannot handle.
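As a rough illustration of what graph computing over a social network involves, the sketch below builds a small ‘who mentions whom’ graph and ranks accounts by influence. It uses the open-source Python library networkx on invented example data; it is a toy sketch of the general technique, not the HPC-scale, GPU-accelerated tooling described above.

```python
# Toy illustration of graph computing for social network analysis.
# Uses the open-source networkx library on invented data; real-world
# graphs with billions of edges need the distributed, memory-rich or
# GPU-accelerated HPC frameworks the article refers to.
import networkx as nx

# A small directed "who mentions whom" graph (hypothetical accounts).
mentions = [
    ("alice", "brand"), ("bob", "brand"), ("carol", "alice"),
    ("dave", "alice"), ("erin", "bob"), ("frank", "carol"),
]
G = nx.DiGraph()
G.add_edges_from(mentions)

# PageRank approximates each account's influence in the network.
scores = nx.pagerank(G, alpha=0.85)
for account, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{account}: {score:.3f}")
```

On a handful of nodes this runs instantly on a laptop; the point of the HPC configurations discussed here is that the same class of analysis remains interactive when the graph has billions of vertices and edges.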

Innovating new levels of performance

Lenovo’s legacy in HPC goes back to the first x86-based HPC systems. Today, that tradition is represented by industry-leading HPC systems and a commitment to constant, in-house technology development and product innovation. As a company, Lenovo has deep knowledge of technology integration and manufacturing across the whole technology value chain, from semiconductor production through to system integration – all of it performed in-house.

Its installations include many of the largest systems in the TOP500 list of leading HPC sites worldwide. These include the SuperMUC and SuperMUC Phase 2 installations at the Leibniz Supercomputing Centre in Germany, an installation at the Max Planck Computing and Data Facility in Munich, and a system at the Alibaba Group in China.
