
Everything needs HPC

For the last few decades, supercomputing has been weaving computation into science and engineering, giving scientists a powerful tool to advance research more rapidly and to answer difficult questions that theory and experimentation alone cannot thoroughly examine.

With the advancement of technology, high-performance computing (HPC) is now seeing massive growth in demand across all academic fields.

“If you look at HPC centers from a decade ago, we were very focused on the physical sciences—mathematics and engineering, things that were equation-based. Now all areas are looking for HPC,” said Dan Stanzione, executive director of the Texas Advanced Computing Center (TACC) at The University of Texas at Austin.

Stanzione was one of the thought leaders invited to attend the Global IT Summit held at KAUST in February. When asked about the future of HPC, he said that classic HPC problems were rooted in science with a physical design component, such as designing airplanes, developing better materials and modeling oil reservoirs. Now, however, the real growth drivers are in the life sciences.

“If you look at how life sciences was done 20 years ago, it was a lot of wet lab and bench science. With imaging, genomics and proteomics, now there is so much digital data that comes out of the life sciences,” Stanzione said. “Everything from basic biology to engineering new crops to the way we deliver healthcare is being radically changed through computing technology.”

The increasing need for HPC doesn’t stop there. Life sciences may have been the growth driver over the past decade, but the humanities and the social sciences are now moving into the supercomputing and big data space as well.

“We are seeing a fusing of almost all areas of academic inquiry. The social sciences are becoming much more data-driven and are grabbing big pieces of data from the internet on how people behave and respond. Even the humanities and arts have become more and more digital,” he noted.

One reason for the expansion is that HPC has pushed digital visualization well beyond its earlier capabilities. Stanzione explained that when researchers render scientific data in 3-D, the benefits extend beyond science. One way academics in the humanities are working with HPC scientists is by building 3-D reconstructions and restorations of great artwork and architecture before the pieces are lost or degrade over time.

“All of this is preserved digitally, and the best way to show them off is through the same facilities we use for scientific computing,” Stanzione said.

More diversity, more challenges

As systems advance, it’s no longer just about raw processing power. Cloud and big data are also playing a role, and this has raised more than a few challenges. The first challenge is diversity: HPC, cloud and big data each call for specialized platforms, and those platforms demand specialized skills. Beyond where to put the hardware, Stanzione said, the question is how much in-house expertise is needed to take advantage of a platform’s capabilities, and the even bigger question is how to pay for it.

There is also the challenge of compatibility. While a lot of scientific computing is well suited to the cloud, a number of processes run on high-performance systems are not. The question of accessibility arises as well: organizations and research institutions are weighing what belongs in a private cloud, what belongs in a hybrid cloud and what they should take to the public cloud. Finally, there is an economic challenge.

“The cloud is a consumption model, and most universities run on a capital model. You might have a budget of a million dollars, and so you plan to do as much computation as fits into that million-dollar budget. However, when you open the cloud up to faculty and they start bringing in bills every month, you may not necessarily have the budget to cover it; it’s hard to predict how much they are going to use,” he explained.

Dan Stanzione, executive director of the Texas Advanced Computing Center (TACC) at The University of Texas at Austin, shares his insights on the future of HPC and the challenges institutions face as demand for HPC, cloud and big data analysis grows. Host: Michelle Ponto.

Be flexible and prepared to adapt

These challenges were among the issues discussed during the Global IT Summit by the 100 CEOs, CTOs and IT thought leaders participating in the event’s panels. With technology constantly changing and demands evolving, Stanzione noted that the key to the next five years is flexibility, and being flexible is an advantage that KAUST has.

“KAUST is a remarkable place with a singular unique focus to be one of the best places in the world without all the traditional baggage that comes with university structure. There are opportunities to do things in new ways and really lead in the way IT is approached,” he said.

Stanzione added that another part of the University’s IT success is “good planning for resources and bringing in the right mix of skills that can see what’s coming.”

The structure of KAUST also really impressed him. “My whole career has been inter- and multi-disciplinary, but it’s still a huge challenge to get researchers to embrace that,” he explained. “The big challenges facing society involve blending computation and a number of academic disciplines. KAUST doesn’t have the discipline walls [that other universities have].”

While Stanzione agreed that results will be hard to measure in a quarter or even a couple of quarters, he noted that, like Bell Labs in the 1960s, "KAUST is a place where smart people can come and blend these technologies and innovate, and not having ‘walls’ around IT is one of the ways to do this."