High Performance Computing by Any Other Name Smells So Sweet


Despite the many successful and revenue-generating businesses built on capabilities initiated and developed within the HPC community, there are still those in the enterprise computing sector who view HPC as the lunatic fringe, the "Brains and No Money" brigade.

From a vendor's point of view, that perspective is not unreasonable when considering the "not for profit" sector focused on fundamental research, academia, and government work. Few vendors successfully monetize the investment made in developing the necessary technologies. The commercial sector is a very different beast: it exploits much of that effort while rarely acknowledging how much it profits from those fundamental roots.

However, modern, standardized, and virtualized commercial data centers are architected in much the same way as large HPC facilities. They typically run different kinds of workloads, but the challenges of eliminating infrastructure performance bottlenecks and operating efficiently are similar. The difference is that the HPC community has been doing this for decades, while commercial enterprises are mostly still learning the ropes. Our research shows that most of our respondents cite an expertise deficit as one of their biggest pain points.

Going back to fundamentals, HPC is frequently defined as compute-intensive or data-intensive computing, or both. Welcome to today's hottest commercial computing workload: "Total Data" and business analytics. As described by 451 Research, "Total Data" involves processing any data that might be applicable to the query at hand, whether that data is structured or unstructured, and whether it resides in the data warehouse, a distributed Hadoop file system, archived systems, or any operational data source, SQL or NoSQL, on-premises or in the cloud.
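To make the "Total Data" idea concrete, here is a minimal, purely illustrative sketch in Python: one business question answered by combining a structured source (an in-memory SQLite table standing in for the data warehouse) with an unstructured one (free-text support tickets standing in for log or document data). All table names, fields, and figures are hypothetical, not drawn from 451 Research's definition.

```python
# Hypothetical "Total Data" sketch: join insight from a structured
# source (SQL) with insight mined from unstructured text.
import sqlite3
from collections import Counter

# Structured side: sales figures in a relational table (stand-in for
# the data warehouse).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
db.executemany("INSERT INTO sales VALUES (?, ?)",
               [("east", 120.0), ("west", 80.0), ("east", 40.0)])
revenue = {region: total for region, total in
           db.execute("SELECT region, SUM(revenue) FROM sales GROUP BY region")}

# Unstructured side: support tickets as free text; count tickets per
# region by parsing the leading "region:" prefix.
tickets = [
    "east: shipment arrived damaged",
    "west: invoice question",
    "east: late delivery",
]
complaints = Counter(t.split(":", 1)[0] for t in tickets)

# The combined view a "Total Data" query would aim to produce: revenue
# and complaint volume per region, regardless of where each lived.
report = {r: {"revenue": revenue.get(r, 0.0), "tickets": complaints.get(r, 0)}
          for r in set(revenue) | set(complaints)}
print(report)
```

In a real deployment the two sides would be a warehouse query and a Hadoop or NoSQL job rather than in-process stand-ins, but the shape of the problem, fusing heterogeneous sources to answer one question, is the same.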

While frequently billed as the "new big thing" in enterprise computing, this stuff has HPC written all over it when you look under the covers. IBM is focusing on "big data": it is showcasing its unstructured data analysis capabilities with Watson, and it has acquired HPC stalwart Platform Computing, in part for its integration of HPC scheduling with Hadoop data analysis technology.

Oracle is often viewed as the bad boy in HPC circles for acquiring Sun Microsystems and setting the old customer base adrift, especially the HPC customers.  Yet, when you dig into the guts of an “Exa-whatever” appliance you can see HPC technology and a certain amount of Andy Bechtolsheim’s design influence scribbled all over it.

Even Cray, in popular perception the epitome of HPC systems for iconic "propeller-head" scientists, is getting in on the game. Cray's street cred for technology is more than a match for the challenge, but commercial credibility is a different matter, which may be why the company is entering the market backwards, as Nicole Hemsoth, Tabor Communications' Datanami editor, describes in her coverage of the company's "YarcData initiative".

At the end of the day, it is time to wake up and smell the roses. Businesses are being inundated by data, commercial advantage is achieved through the insightful interpretation of information derived from the analysis of data, and knowledge is the key to good decisions and success.

So, it’s actually pretty simple.  In today’s world HPC is the path to success.  Just call it something else so it has credibility, deliver the results, and show the stakeholders the money.

Credits and references:

For the HPC market model: "For Profit / Not For Profit" – Shahin Khan, Orion Enterprises

Total Data – 451 Research

Cray Opens Doors on Big Data – Nicole Hemsoth, Datanami


