Originally appeared in the September 9, 2002 edition of GRIDtoday
By Scott Nance
GRIDtoday

Success Of Japan's Earth Simulator To Spur Big Changes

It's been at least a decade since the United States has had to worry about Japan overtaking U.S. technological leadership. But that's just what is happening today, as a Japanese supercomputer appears to be outperforming the highest-end U.S. machines. That situation has prompted a lot of soul-searching within the high-performance computing community and will likely translate into important changes in the direction of U.S. supercomputing, according to federal officials and others.

Five years in the making, Japan's Marine Science and Technology Center on March 1 switched on its new, $400-million supercomputer. The massive machine, housed in a specially built facility in Yokohama, is called the Earth Simulator because its primary purpose is to run advanced simulations in climate, atmospheric, and other Earth sciences. Built by Japanese manufacturer NEC, the Earth Simulator achieved 35.86 trillion floating-point operations per second, or teraflops, on the common Linpack benchmark. Linpack performance is the basis on which a group of computer scientists ranks the 500 most powerful supercomputers on the planet, and that group now ranks the Earth Simulator as the new No. 1 number-cruncher on its Top 500 list.
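For context, the Linpack figure comes from timing the solution of a large dense system of linear equations. The sketch below is a single-machine NumPy analogy, not the real distributed HPL benchmark code, but it shows how such a figure is derived: divide the standard flop count for the solve by the elapsed time.

import time
import numpy as np

# Illustrative Linpack-style measurement: time a dense solve of Ax = b and
# divide the standard flop count, (2/3)*n^3 + 2*n^2, by the elapsed time.
# (The real HPL benchmark is a heavily tuned distributed C/MPI code.)
n = 2000                               # problem size; real runs use far larger n
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)              # LU factorization plus triangular solves
elapsed = time.perf_counter() - start

flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
print(f"~{flops / elapsed / 1e9:.2f} gigaflops on this machine")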

That performance on Linpack also equals about 87.5% of the Earth Simulator's theoretical peak performance, Tetsuya Sato, the director of the Earth Simulator Center, told a group of computing experts and others in an invited talk in Washington. Besides Linpack, scientists achieved 26.58 teraflops on a global atmospheric circulation simulation, an impressive 65% of peak. U.S. supercomputers have often achieved much smaller fractions of their theoretical peak levels in recent years.

The potential for the Earth Simulator to outdo U.S. supercomputers drew much attention to Sato's Aug. 21 visit, one of his first discussions of the Earth Simulator held in the United States; a representative from the White House Office of Science and Technology Policy attended his session. Still, computer scientists have to be "careful" in comparing the Earth Simulator results against the performance of the highest-end U.S. supercomputers, which are concentrated in the Department of Energy's stockpile stewardship program, warned James Decker, deputy director of the DOE Office of Science.
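Those percentages can be checked with simple arithmetic. A minimal sketch, assuming the Earth Simulator's widely reported theoretical peak of 40.96 teraflops (5,120 vector processors at 8 gigaflops each, figures not given in the article itself):

# Back-of-the-envelope check of the efficiency figures above. The peak of
# 40.96 Tflops (5,120 processors x 8 Gflops each) is an assumption drawn
# from published Earth Simulator specifications, not from this article.
peak_tflops = 5120 * 8 / 1000          # 40.96 Tflops theoretical peak

linpack_tflops = 35.86
climate_tflops = 26.58

print(f"Linpack efficiency:   {linpack_tflops / peak_tflops:.1%}")  # ~87.5%
print(f"Atmospheric model:    {climate_tflops / peak_tflops:.1%}")  # ~64.9%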

"How good a particular computer is depends on the kind of problem you're trying to solve," Decker said. On some problems the DOE stockpile stewardship machines are "quite efficient," he said, adding, "But it does depend on the problem. For a lot of the scientific problems that we're trying to solve in the Office of Science, those machines have not proven to be very efficient." A major problem with many recent U.S. supercomputers is that they have not been designed with scientific problems in mind, Decker said. "We need to work more closely with the vendors to do that," he said.

Different Approaches

Sato said Japan's government decided in 1997 to fund development of the Earth Simulator largely in response to the emergence of the Kyoto Protocol, a framework aimed at controlling greenhouse gas emissions and limiting global warming. Japanese manufacturer NEC took a very different approach to building the Earth Simulator than its U.S. counterparts have taken with their own supercomputers. NEC used what are called vector processors, the traditional building blocks of all supercomputers until about a decade ago. Vector processors and the specialized components that go with them are fairly expensive parts, however. From the early 1990s to today, U.S. vendors have concentrated on building supercomputers out of less expensive commodity processors and components to make the high-end computing market a more profitable venture.
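In miniature, the difference between the two styles looks like the sketch below: vector hardware applies a single instruction across whole arrays of operands, while a commodity scalar processor steps through elements one at a time. This NumPy snippet is a software analogy only, runnable on any machine, and is not Earth Simulator code.

import numpy as np

# Software analogy for scalar versus vector execution. Vector hardware,
# like the Earth Simulator's NEC processors, natively executes one
# operation over whole arrays rather than one element per instruction.
a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

# Scalar style: one multiply-add per loop iteration.
c_scalar = np.empty_like(a)
for i in range(len(a)):
    c_scalar[i] = 2.0 * a[i] + b[i]

# Vector style: one "instruction" applied across the entire array.
c_vector = 2.0 * a + b

assert np.allclose(c_scalar, c_vector)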

Many computing experts touted the ability to pile more and more cheap commodity systems together to "scale," or achieve ever higher levels of performance, as the future of supercomputing. The more expensive vector processors were declared a thing of the past. The success of the Earth Simulator now calls that into question. "In the future, we're going to see that probably there isn't one architecture that is the solution for all scientific problems," Decker admitted. What computer scientists have learned, he said, is that it has been very difficult to put commodity processors together in a way that lets them run scientific problems efficiently.

Perhaps just as important as the processors, or maybe more so, is the interconnect technology through which the many processors communicate with one another. As with the processors, some U.S. vendors have tried using less expensive interconnects. The Earth Simulator, however, uses a new interconnect capable of transmitting 12.3 gigabytes per second in each direction, compared with internal bandwidth in high-end U.S. machines said to be in the range of just 2.4 gigabytes per second.
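A rough sense of what that bandwidth gap means in practice, using the article's two figures and a made-up one-gigabyte message size:

# Rough illustration of why interconnect bandwidth matters: the time to
# exchange a fixed amount of data between processors scales inversely with
# bandwidth. The two rates are the article's; the 1 GB message size is a
# hypothetical example.
message_gb = 1.0

for name, gb_per_s in [("Earth Simulator interconnect", 12.3),
                       ("typical U.S. commodity link", 2.4)]:
    print(f"{name}: {message_gb / gb_per_s * 1000:.0f} ms to move {message_gb:g} GB")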

Although the hardware that U.S. vendors are manufacturing is relatively cheap, Decker said the supercomputing community should now begin to look at "total costs," which include software. Software development is expensive, he said, and high-performance computing experts have devoted much time to tinkering with software to try to squeeze higher performance levels out of the commodity machines.

Sato said he expects the computing power of the Earth Simulator to lead to dramatic advances in climate science, manufacturing, and other key fields, particularly through the ability to combine individual simulations at different scales into one "holistic" picture. Since high-end computation is an increasingly important tool for scientific discovery, Decker said, the U.S. government has no choice but to sit up and take notice of the Earth Simulator and its technology if the United States is to remain competitive in the global scientific community.

"Obviously, we're taking a very hard look at our (computing development) programs in light of the success of the Earth Simulator," he said. "We'd be crazy if we weren't doing that." That includes a recent announcement that Oak Ridge National Laboratory would evaluate Cray Inc's own new-generation vector-based system, known as the X-1, which is similar in architecture to the Earth Simulator. Testing the Cray X-1 is part of an effort to provide the U.S. scientific community with computing resources "to match or exceed" those of the Earth Simulator, which is more than 20 times that of the fastest U.S. civilian supercomputer, DOE Office of Science Director Raymond Orbach said at the time in making the announcement.



Copyright 2002, GRIDtoday. Redistribution of this article is forbidden by law without the express written consent of the publisher. For a subscription to GRIDtoday, send e-mail to gridfree@gridtoday.com. Mirrored with permission.
