What’s next for the world’s fastest supercomputers

It can be difficult to understand the number-crunching capabilities of the world’s fastest supercomputer. But computer scientist Jack Dongarra of the University of Tennessee puts it this way: “If everyone on Earth did one calculation per second, it would take four years to match what this computer can do in one second.”

The supercomputer in question is called Frontier. It occupies the space of two tennis courts at Oak Ridge National Laboratory in the hills of East Tennessee, where it was unveiled in May 2022.

Here are some more specs: Frontier uses approximately 50,000 processors, compared with the 16 or 24 in the most powerful laptops. It consumes 20 million watts, compared with about 65 for a laptop. Its construction cost US$600 million.

When Frontier went into operation, it marked the beginning of so-called exascale computing: machines that can perform at least one exaflop, or a quintillion (10¹⁸) floating-point operations per second. Since then, scientists have been preparing to build more of these incredibly fast computers; several exascale machines are expected to come online in the United States and Europe in 2024.
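
Dongarra’s one-second comparison is easy to sanity-check against the exaflop figure. Here is a minimal back-of-envelope sketch in Python, assuming a world population of roughly 8 billion (a number the article does not give):

```python
# Back-of-envelope check of Dongarra's comparison. The world population
# (~8 billion) is an assumption; the article does not state it.
world_population = 8e9                  # people, 1 calculation per second each
seconds_per_year = 365.25 * 24 * 3600   # ~3.16e7 seconds

# Calculations all of humanity would complete in four years:
human_total = world_population * 4 * seconds_per_year
print(f"Humanity over 4 years: {human_total:.1e} calculations")  # ~1.0e18

# One exaflop is 1e18 floating-point operations per second, so Frontier
# covers that total in about one second.
print(f"Frontier in 1 second: {1e18:.0e} operations")
```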

But speed itself is not the end game. Researchers are building exascale computers to explore previously inaccessible scientific and engineering questions in biology, climate science, astronomy and other fields. In the coming years, scientists will use Frontier to run some of the most complicated computer simulations humans have ever created, hoping to answer open questions about nature and to design new technologies in areas ranging from transportation to medicine.

Evan Schneider at the University of Pittsburgh, for example, is using Frontier to run simulations of how our galaxy has evolved over time. In particular, she is interested in the flow of gas into and out of the Milky Way. In a way, a galaxy breathes: gas flows in and coalesces into stars under gravity, but gas also flows out, for example when stars explode and release matter. Schneider studies the mechanisms by which galaxies exhale. “We can compare the simulations with the real observed universe, and this gives us a sense that we’re understanding the physics correctly,” says Schneider.

Schneider is using Frontier to build a computer model of the Milky Way with high enough resolution to zoom in on individual exploding stars. This means the model must capture the large-scale properties of our galaxy, which spans 100,000 light-years, as well as the properties of supernovae, which span about 10 light-years. “That really hasn’t been done,” she says. To get an idea of what this resolution means, it would be like creating a physically accurate model of a beer can that also captures the individual yeast cells inside it and the interactions at every scale in between.
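
The analogy can be put in numbers: what matters is the ratio between the largest and smallest scales the model must resolve. In the rough sketch below, the beer-can and yeast-cell sizes are illustrative assumptions, not figures from the article:

```python
# Dynamic range of Schneider's simulation versus the beer-can analogy.
galaxy_ly = 100_000        # scale of the Milky Way, in light-years
supernova_ly = 10          # scale of a supernova, in light-years
print(f"Galaxy model: {galaxy_ly / supernova_ly:.0e}x range of scales")   # 1e4

can_height_m = 0.12        # ~12 cm beer can (assumed size)
yeast_cell_m = 5e-6        # ~5 micrometer yeast cell (assumed size)
print(f"Beer-can analogy: {can_height_m / yeast_cell_m:.0e}x range")      # ~2e4
```

Both ratios land around ten thousand to one, which is what makes the comparison apt.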

Stephan Priebe, a senior engineer at GE, is using Frontier to simulate the aerodynamics of next-generation airplane designs. To increase fuel efficiency, GE is investigating an engine design known as “open fan architecture.” Jet engines use fans to generate thrust, and bigger fans mean greater efficiency. To make the fans even larger, engineers proposed removing the external structural frame, known as the nacelle, so that the blades are exposed like a weather vane. “Simulations allow us to get a detailed view of aerodynamic performance early in the design phase,” says Priebe. They give engineers insight into how to shape fan blades to improve aerodynamics, for example, or to make them quieter.

Frontier will especially benefit Priebe’s studies of turbulence, the chaotic movement of a disturbed fluid (in this case, air) around the fan. Turbulence is a common phenomenon: we see it in crashing ocean waves and in the smoke rising from a just-extinguished candle. But scientists still have difficulty predicting exactly how a turbulent fluid will flow, because it moves in response to both macroscopic influences, such as changes in pressure and temperature, and microscopic ones, such as individual nitrogen molecules in the air rubbing against one another. The interaction of forces at so many scales makes the motion extraordinarily hard to compute.
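
One way to quantify why this multi-scale interaction overwhelms ordinary computers is the standard Kolmogorov scaling argument from fluid dynamics, sketched below. This is a textbook estimate, not a description of GE’s actual simulations, and the Reynolds numbers are illustrative assumptions:

```python
# Textbook Kolmogorov-scaling estimate of simulation cost. The span between
# the largest and smallest eddies in a turbulent flow grows with the
# Reynolds number Re as Re**(3/4), so a 3D grid that resolves every eddy
# needs roughly Re**(9/4) points.
def dns_grid_points(reynolds: float) -> float:
    """Rough grid-point count for a direct numerical simulation."""
    return reynolds ** (9 / 4)

# Illustrative Reynolds numbers; values around 1e7 and up are typical of
# full-scale aircraft components (an assumption, not from the article).
for re in (1e4, 1e6, 1e8):
    print(f"Re = {re:.0e}: ~{dns_grid_points(re):.1e} grid points")
```

Grid counts that climb from billions toward a quintillion points are exactly the regime where an exascale machine earns its keep.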

“In graduate school, [a professor] once said to me, ‘Bronson, if someone tells you they understand turbulence, you should put one hand on your wallet and leave the room, because they’re trying to sell you something,’” says astrophysicist Bronson Messer, director of science at the Oak Ridge Leadership Computing Facility, which houses Frontier. “No one understands turbulence. It really is the last great problem of classical physics.”

These scientific studies illustrate the strength of supercomputers: simulating physical objects at multiple scales simultaneously. Other applications reflect the same theme. Frontier enables more accurate climate models, which must simulate the climate at different spatial scales across the planet and over both long and short time scales. Physicists can also simulate nuclear fusion, the turbulent process by which the sun generates energy by fusing atomic nuclei into heavier elements; they want to understand the process better in order to develop fusion as a clean energy technology. While these kinds of multi-scale simulations have been a staple of supercomputing for many years, Frontier can incorporate a wider range of scales than ever before.

To use Frontier, approved scientists log into the supercomputer remotely, submitting their jobs over the internet. To make the most of the machine, Oak Ridge aims to keep about 90% of the supercomputer’s processors running calculations 24 hours a day, seven days a week. “We get into this kind of steady state where we’re constantly doing scientific simulations for a few years,” Messer says. Users keep their data at Oak Ridge in a storage facility that can hold up to 700 petabytes, the equivalent of about 700,000 portable hard drives.
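
Those storage and utilization figures imply some simple arithmetic, sketched here using only numbers stated above:

```python
# 700 petabytes spread across ~700,000 portable drives:
drive_tb = 700 * 1000 / 700_000          # petabytes -> terabytes per drive
print(f"Implied capacity per drive: {drive_tb:.0f} TB")   # 1 TB each

# Processor-hours delivered per year at the ~90% utilization target,
# using the article's figure of ~50,000 processors:
processors, utilization, hours = 50_000, 0.90, 24 * 365
print(f"~{processors * utilization * hours:.1e} processor-hours per year")  # ~3.9e8
```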

Although Frontier is the first exascale supercomputer, more are on the way. In the US, researchers are currently installing two machines expected to exceed two exaflops each: Aurora at Argonne National Laboratory in Illinois, and El Capitan at Lawrence Livermore National Laboratory in California. Starting in early 2024, scientists plan to use Aurora to create maps of neurons in the brain and to search for catalysts that could make industrial processes, such as fertilizer production, more efficient. El Capitan, also scheduled to come online in 2024, will simulate nuclear weapons to help the government maintain its stockpile without weapons testing. Meanwhile, Europe plans to deploy its first exascale supercomputer, Jupiter, in late 2024.

China reportedly also has exascale supercomputers, but it has not released standard benchmark results for them, so the machines do not appear in the TOP500, a twice-yearly list of the fastest supercomputers. “The Chinese are concerned that the US will impose more limits in terms of technology going to China, and they are reluctant to disclose how many of these high-performance machines are available,” says Dongarra, who designed the benchmark that supercomputers run for the TOP500.

The desire for more computing power doesn’t end with exascale. Oak Ridge is already considering the next generation of computers, Messer says. They would have three to five times the computing power of Frontier. But there is a big challenge: enormous energy consumption. The power Frontier draws even when idle is enough to supply thousands of homes. “It’s probably not sustainable for us to just make the machines bigger and bigger,” says Messer. As Oak Ridge has built ever-larger supercomputers, engineers have worked to improve the machines’ efficiency with innovations including a new cooling method. Summit, Frontier’s predecessor, which is still operating at Oak Ridge, uses about 10% of its total energy consumption to cool itself. By comparison, only 3% to 4% of Frontier’s power consumption goes to cooling. The improvement came from cooling the supercomputer with room-temperature water instead of chilled water.
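
For a sense of scale, here is a rough sketch of what those figures mean in watts. The average household draw of about 1.2 kilowatts is an assumption; the article does not give one:

```python
# Putting Frontier's power budget in perspective.
frontier_watts = 20e6                        # 20 million watts (from the article)
household_watts = 1200                       # assumed average household draw
print(f"Roughly {frontier_watts / household_watts:,.0f} homes' worth of power")

# Cooling overhead: ~3.5% of total power now, versus the ~10% share
# that Summit devotes to cooling.
cooling_now = 0.035 * frontier_watts         # midpoint of the 3-4% range
cooling_like_summit = 0.10 * frontier_watts  # if Frontier cooled like Summit
print(f"Cooling saving: ~{(cooling_like_summit - cooling_now) / 1e6:.1f} MW")
```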

Next-generation supercomputers would be able to simulate even more scales simultaneously. With Frontier, for example, Schneider’s galaxy simulation has a resolution down to tens of light-years. That is still not fine enough to reach the scale of individual supernovae, so researchers must simulate those explosions separately. A future supercomputer may be able to bring all these scales together.

By simulating the complexity of nature and technology more realistically, these supercomputers push the boundaries of science. A more realistic galaxy simulation brings the vastness of the universe to scientists’ fingertips. An accurate model of the air turbulence around an airplane fan reduces the need for expensive wind-tunnel testing. Better climate models allow scientists to predict the fate of our planet. In other words, they give us a new tool to prepare for an uncertain future.

(Source: MIT Technology Review)