MIT Technology Review’s What’s Next series looks across industries, trends, and technologies to give you a first look at the future. You can read the rest of our series here.
It can be difficult to wrap your brain around the number-crunching capability of the world’s fastest supercomputer. But computer scientist Jack Dongarra, of the University of Tennessee, puts it this way: “If everybody on Earth were to do one calculation per second, it would take four years to equal what that computer can do in one second.”
The supercomputer in question is called Frontier. It takes up the space of two tennis courts at Oak Ridge National Laboratory in the eastern Tennessee hills, where it was unveiled in May 2022.
Here are some more specs: Frontier uses approximately 50,000 processors, compared with the most powerful laptop’s 16 or 24. It consumes 20 million watts, compared with a laptop’s 65 or so. It cost $600 million to build.
When Frontier came online, it marked the dawn of so-called exascale computing, with machines that can execute an exaflop—or a quintillion (10¹⁸) floating-point operations per second. Since then, scientists have geared up to make more of these blazingly fast computers: several exascale machines are due to come online in the US and Europe in 2024.
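Dongarra’s everyone-on-Earth comparison follows directly from that number. Here is a quick back-of-envelope check, assuming a world population of roughly 8 billion (the exact figure doesn’t change the conclusion):

```python
# Back-of-envelope check of Dongarra's comparison.
# Assumes ~8 billion people, each doing 1 calculation per second.

EXAFLOP = 1e18                        # operations per second at exascale
population = 8e9                      # world population (assumed)
seconds_per_year = 365.25 * 24 * 3600

# Years for all of humanity to match one second of exascale computing
years = EXAFLOP / (population * seconds_per_year)
print(f"about {years:.1f} years")     # about 4.0 years
```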
But speed itself isn’t the endgame. Researchers are building exascale computers to explore previously inaccessible science and engineering questions in biology, climate, astronomy, and other fields. In the next few years, scientists will use Frontier to run the most complicated computer simulations humans have ever devised. They hope to pursue as-yet-unanswered questions about nature and to design new technologies in areas from transportation to medicine.
Evan Schneider of the University of Pittsburgh, for example, is using Frontier to run simulations of how our galaxy has evolved over time. In particular, she’s interested in the flow of gas in and out of the Milky Way. A galaxy breathes, in a way: gas flows into it, coalescing via gravity into stars, but gas also flows out—for example, when stars explode and release matter. Schneider studies the mechanisms by which galaxies exhale. “We can compare the simulations to the real observed universe, and that gives us a sense of whether we’re getting the physics right,” Schneider says.
Schneider is using Frontier to build a computer model of the Milky Way with high enough resolution to zoom in on individual exploding stars. That means the model must capture large-scale properties of the galaxy, which spans about 100,000 light-years, as well as properties of individual supernovas, which measure about 10 light-years across. “That really hasn’t been done,” she says. To get a sense of what that resolution means, imagine a physically accurate model of a can of beer that also captures the individual yeast cells within it, and the interactions at every scale in between.
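The computational cost of that dynamic range adds up quickly. As a rough illustration (not Schneider’s actual setup, which depends on her code’s grid and refinement scheme), resolving 10-light-year features everywhere across a 100,000-light-year volume on a uniform 3D grid would take on the order of a trillion cells:

```python
# Hypothetical uniform-grid estimate of the simulation's size.
# Real galaxy codes use smarter schemes, but the scales drive the cost.

galaxy_extent = 100_000    # light-years, the model's large-scale span
finest_feature = 10        # light-years, size of a single supernova

cells_per_axis = galaxy_extent // finest_feature   # 10,000
total_cells = cells_per_axis ** 3                  # full 3D grid
print(f"{total_cells:.0e} cells")                  # 1e+12
```

Each of those cells must track quantities like gas density and velocity and be updated over many thousands of time steps, which is why the problem calls for an exascale machine.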
Stephan Priebe, a senior engineer at GE, is using Frontier to simulate the aerodynamics of the next generation of airplane designs. To increase fuel efficiency, GE is investigating an engine design known as an “open fan architecture.” Jet engines use fans to generate thrust, and larger fans mean higher efficiency. To make fans even larger, engineers have proposed removing the outer structural frame, known as the nacelle, so that the blades are exposed as in a pinwheel. “The simulations allow us to obtain a detailed view of the aerodynamic performance early in the design phase,” says Priebe. They give engineers insight into how to shape the fan blades for better aerodynamics, for example, or to make them quieter.
Frontier will particularly benefit Priebe’s studies of turbulence, the chaotic motion of a disturbed fluid—in this case, air—around the fan. Turbulence is a common phenomenon. We see it in the crashing of ocean waves and in the curl of smoke rising from an extinguished candle. But scientists still struggle to predict how exactly a turbulent fluid will flow. That is because it moves in response to both macroscopic influences, such as pressure and temperature changes, and microscopic influences, such as the rubbing of individual molecules of nitrogen in the air against one another. The interplay of forces on multiple scales complicates the motion.
“In graduate school, [a professor] once told me, ‘Bronson, if anybody tells you that they understand turbulence, you should put one hand on your wallet and back out of the room, because they’re trying to sell you something,’” says astrophysicist Bronson Messer, the director of science at Oak Ridge Leadership Computing Facility, which houses Frontier. “Nobody understands turbulence. It really is the last great classical physics problem.”
These scientific studies illustrate the distinct forte of supercomputers: simulating physical objects at multiple scales simultaneously. Other applications echo this theme. Frontier enables more accurate climate models, which have to simulate weather at different spatial scales across the entire planet, and on both long and short time scales. Physicists can also simulate nuclear fusion, the turbulent process in which the sun generates energy by pushing light atoms together to form heavier elements. They want to better understand the process in order to develop fusion as a clean energy technology. While these sorts of multi-scale simulations have been a staple of supercomputing for many years, Frontier can incorporate a wider range of scales than ever before.
To use Frontier, approved scientists log in to the supercomputer remotely, submitting their jobs over the internet. To make the most of the machine, Oak Ridge aims to have around 90% of the supercomputer’s processors running computations 24 hours a day, seven days a week. “We enter this sort of steady state where we’re constantly doing scientific simulations for a handful of years,” says Messer. Users keep their data at Oak Ridge in a data storage facility that can store up to 700 petabytes, the equivalent of about 700,000 portable hard drives.
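The hard-drive comparison works out if you assume 1-terabyte portable drives, a common consumer size:

```python
# Sanity check on the storage comparison.
# Assumes 1 TB portable drives and decimal units (1 PB = 1,000 TB).

capacity_pb = 700      # facility capacity, in petabytes
drive_size_tb = 1      # assumed portable-drive size, in terabytes

drives = capacity_pb * 1_000 / drive_size_tb
print(f"{drives:,.0f} drives")   # 700,000 drives
```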
While Frontier is the first exascale supercomputer, more are coming down the line. In the US, researchers are currently installing two machines that will be capable of more than two exaflops: Aurora, at Argonne National Laboratory in Illinois, and El Capitan, at Lawrence Livermore National Laboratory in California. Beginning in early 2024, scientists plan to use Aurora to create maps of neurons in the brain and to search for catalysts that could make industrial processes such as fertilizer production more efficient. El Capitan, also slated to come online in 2024, will simulate nuclear weapons to help the government maintain its stockpile without weapons testing. Meanwhile, Europe plans to deploy its first exascale supercomputer, Jupiter, in late 2024.
China purportedly has exascale supercomputers as well, but it has not released results from standard benchmark tests of their performance, so the computers do not appear on the TOP500, a semiannual list of the fastest supercomputers. “The Chinese are concerned about the US imposing further limits in terms of technology going to China, and they’re reluctant to disclose how many of these high-performance machines are available,” says Dongarra, who designed the benchmark that supercomputers must run for TOP500.
The hunger for more computing power doesn’t stop with the exascale. Oak Ridge is already considering the next generation of computers, says Messer. These would have three to five times the computational power of Frontier. But one major challenge looms: the massive energy footprint. The power that Frontier draws, even when it is idling, is enough to run thousands of homes. “It’s probably not sustainable for us to just grow machines bigger and bigger,” says Messer.
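For a sense of scale, here is that claim sketched with the article’s 20-megawatt figure and an assumed average US household draw of about 1.2 kilowatts:

```python
# Rough household-equivalent of Frontier's power draw.
# The 1.2 kW average-home figure is an assumption for illustration;
# idle draw is lower than the 20 MW peak but still in this ballpark.

frontier_watts = 20e6    # ~20 MW at full load
home_watts = 1.2e3       # assumed average continuous draw of a US home

print(f"roughly {frontier_watts / home_watts:,.0f} homes")
# roughly 16,667 homes
```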
As Oak Ridge has built progressively larger supercomputers, engineers have worked to improve the machines’ efficiency with innovations including a new cooling method. Summit, the predecessor to Frontier that is still running at Oak Ridge, expends about 10% of its total energy usage to cool itself. By comparison, 3% to 4% of Frontier’s energy consumption is for cooling. This improvement came from using water at ambient temperature to cool the supercomputer, rather than chilled water.
Next-generation supercomputers would be able to simulate even more scales simultaneously. For example, with Frontier, Schneider’s galaxy simulation has resolution down to the tens of light-years. That’s still not quite enough to get down to the scale of individual supernovas, so researchers must simulate the individual explosions separately. A future supercomputer may be able to unite all these scales.
By simulating the complexity of nature and technology more realistically, these supercomputers push the limits of science. A more realistic galaxy simulation brings the vastness of the universe to scientists’ fingertips. A precise model of air turbulence around an airplane fan circumvents the need to build a prohibitively expensive wind tunnel. Better climate models allow scientists to predict the fate of our planet. In other words, they give us a new tool to prepare for an uncertain future.