Using HPC to unlock big data's potential
As big data becomes increasingly strategic to organisations, conventional technologies are failing to fulfil their needs. Only high-performance computing...
Software-defined HPC brings data-intensive computing to some of the biggest challenges facing today’s businesses, wherever they are in the enterprise.
The benefits that high-performance computing (HPC) brings to businesses and large organisations are being transformed by a revolution in HPC infrastructure, which is extending its ability to crack the most difficult problems and reveal new insights across every part of the business without the need for major upheaval or data migration.
At the same time, these new infrastructure technologies are sharply reducing the cost of installing and running HPC systems – making the cost–benefit case for business use of HPC unrivalled.
This transformation in HPC is the result of applying software intelligence, analytics and rules to how all the different components, devices and networks within an HPC installation work together.
Bringing HPC agility to enterprise
It’s called software-defined infrastructure, and it’s bringing real agility and flexibility to HPC. At the same time, it enables HPC to reach deep into an enterprise to bring the most powerful algorithms, analytics and machine learning to bear on any department or work group’s needs. It can also integrate data enterprise-wide for a complete, top-down view of the entire business.
Until now, HPC installations have been purposed and optimised through hardware configurations, including their network and data access. This meant that an HPC installation designed and configured for, say, molecular engineering often couldn't be used cost-effectively for any other purpose: technical, time and cost constraints limited it to processing the data available on its own networks and storage systems.
Software-defined infrastructure replaces with software the hardware-defined interconnects and protocols that kept data locked down and isolated.
Instead of only recognising the data devices and networks configured at installation, software-defined HPC can reconfigure itself in near-real time to work optimally with data formats, other compute and storage systems, and networks anywhere in an organisation.
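As a rough illustration of this idea, the sketch below shows a policy layer that selects a data connector at run time based on a job's data format and location, rather than hard-wiring those choices at installation. All names here (Job, POLICY_RULES, the connector strings) are hypothetical, invented for illustration, not any real HPC product's API.

```python
# Hypothetical sketch of software-defined data access: connector choice is a
# run-time policy decision, so supporting a new data source means adding a
# rule in software rather than rewiring the installation.

from dataclasses import dataclass

@dataclass
class Job:
    data_format: str   # e.g. "parquet", "hdf5"
    location: str      # e.g. "object-store", "on-prem-nas"

# Rules map (format, location) pairs to a connector. Adding an entry here is
# the software-defined equivalent of plugging in new hardware.
POLICY_RULES = {
    ("parquet", "object-store"): "s3-gateway",
    ("hdf5", "on-prem-nas"): "nfs-mount",
}

def select_connector(job: Job, default: str = "generic-posix") -> str:
    """Pick the best-matching data connector for a job, with a fallback."""
    return POLICY_RULES.get((job.data_format, job.location), default)
```

The point of the sketch is the fallback: a job with an unrecognised format still runs through a generic path, so the system degrades gracefully instead of refusing data sources it was never configured for.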
Optimised HPC analytics on the fly
This brings the power of HPC analytics and computation to data challenges and tasks wherever they are in an enterprise, on the fly.
It means that an HPC installation implemented for R&D, engineering or financial analytics can now be made available very rapidly for market insight analytics, product prototyping, service call-out analysis, financial reporting or any number of compute- and data-intensive challenges that are increasingly the key to remaining competitive.
Software-defined infrastructure also makes it possible to apply analytics and machine learning to the HPC configuration itself. For example, applying AI and analytics to HPC installations sets up a feedback loop, with software constantly optimising the configuration to maximise performance.
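A minimal sketch of such a feedback loop, under simplifying assumptions: the tuner adjusts one configuration knob (here, a worker count) by observing measured throughput and hill-climbing toward the best-performing value. The throughput function is a synthetic stand-in for real cluster telemetry, not a model of any actual system.

```python
# Illustrative optimisation feedback loop: observe performance, nudge the
# configuration, keep whichever setting measures better.

def measure_throughput(workers: int) -> float:
    """Synthetic stand-in for telemetry: throughput rises with workers,
    then contention degrades it (peak at 10 workers for this toy model)."""
    return workers * 100 - (workers - 8) ** 2 * 25

def tune_workers(start: int = 2, steps: int = 20) -> int:
    """Hill-climb the worker count, one neighbour at a time."""
    workers = start
    best = measure_throughput(workers)
    for _ in range(steps):
        for candidate in (workers - 1, workers + 1):
            if candidate < 1:
                continue
            score = measure_throughput(candidate)
            if score > best:
                best, workers = score, candidate
    return workers
```

In a real installation the measurement would come from job telemetry and the knob might be stripe size, scheduling policy or network routing, but the loop structure (measure, perturb, keep the winner) is the same.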
It also saves money. Intelligent infrastructure can power down devices that are not being used on a particular job, saving energy costs and contributing to green computing objectives.
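The power-down policy described above can be sketched in a few lines. The node names and the idle-time threshold below are illustrative assumptions; a real system would draw idle times from its monitoring stack and act through its power-management interface.

```python
# Hedged sketch of an energy-saving policy: identify nodes that have been
# idle longer than a threshold so the infrastructure can power them down.

def nodes_to_power_down(idle_seconds: dict, threshold: int = 600) -> list:
    """Return nodes idle for more than `threshold` seconds, sorted for
    deterministic, auditable output."""
    return sorted(node for node, idle in idle_seconds.items()
                  if idle > threshold)
```

Because the policy is expressed in software, the threshold can differ per job class or time of day, which is hard to achieve with a fixed hardware configuration.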
Software-defined infrastructure is transforming HPC into a powerful tool for every part of the enterprise, bringing high-performance compute and data-intensive processing to every major task facing an organisation.