
Originally appeared in the Oak Ridger, Tuesday, September 12, 2006
URL: http://www.oakridger.com/stories/091206/new_96896174.shtml

ORNL a competitor for next-generation supercomputer

From Staff Reports

URBANA, Ill. (AP) — In a rapidly changing world where megabytes and gigabytes simply aren’t enough anymore, scientists are preparing for the next generation of supercomputers — machines that can make quadrillions of computations per second.

Called petascale, these new computers will help scientists go where no one has gone before to look deeper into the building blocks of life, investigate the most perplexing mysteries of the universe or determine the effects of global warming on perhaps a single county in the Corn Belt.

And the University of Illinois — the first university to build its own computer, ILLIAC, back in 1952 — wants to have one.

“Illinois is about building unique tools and instruments so that we can advance science and engineering in a way that nobody else can,” said Wen-Mei Hwu, a professor of electrical and computer engineering.

The National Center for Supercomputing Applications on the university’s Urbana campus is competing for National Science Foundation funding to acquire and deploy a supercomputer that can sustain between one and two petaflops of computational power, said Thom Dunning, the center’s director.

“NSF is really looking for a machine that has the potential to impact a huge number of science and engineering areas as opposed to just a very select set,” he said.

Illinois submitted its initial proposal to the National Science Foundation last week and will have until February to refine it and submit a final plan. Dunning expects competition from the University of California system, Oak Ridge National Laboratory in Tennessee and perhaps a few other supercomputing centers, but the complexity of operating such a system limits potential hosts to “a relatively small set of places.”

One petaflop represents one quadrillion computations per second, a thousandfold increase over the teraflop (one trillion computations per second), the measure used for most current supercomputers.
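The unit arithmetic behind those figures is straightforward. A minimal Python sketch, ours rather than the article's, spells it out:

    # Illustrative unit arithmetic only; the figures come from the article.
    TERAFLOP = 10**12   # one trillion operations per second
    PETAFLOP = 10**15   # one quadrillion operations per second

    assert PETAFLOP // TERAFLOP == 1000   # the thousandfold jump cited above

    # A job that keeps a one-teraflop machine busy for a full day
    # would finish in under a minute and a half at one petaflop.
    seconds_per_day = 24 * 60 * 60
    print(seconds_per_day * TERAFLOP / PETAFLOP, "seconds")   # 86.4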

“We’re talking about a half a million to a million processors that will be in the machine,” Dunning said. “Your PC has one processor in it.”

The NCSA currently provides a maximum of 40 teraflops at its campus computing center, capacity used by scientists and researchers around the world. But the National Science Foundation is seeking a computer that can sustain one to two petaflops and reach occasional peaks of about 10 times that amount, Dunning said.
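Putting the article's numbers side by side, in a back-of-the-envelope sketch that uses only the figures quoted above:

    # Figures quoted in the article, expressed in operations per second.
    teraflop, petaflop = 10**12, 10**15

    ncsa_now      = 40 * teraflop        # NCSA's current 40 teraflops
    nsf_sustained = 1 * petaflop         # low end of the 1-2 petaflop target
    nsf_peak      = 10 * nsf_sustained   # "occasional peaks of about 10 times"

    print(nsf_sustained / ncsa_now)      # 25.0: at least a 25-fold jump
    print(nsf_sustained / 500_000)       # 2e9: roughly 2 gigaflops per chip
                                         # at Dunning's half-million count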

“The reason why we build these bigger and bigger machines is because we can actually discover and use more and more of the fundamental (scientific) principles” to solve problems, said Hwu, who is working with Dunning and others on the proposal.

With petaflops of computing power, scientists will be able to simulate problems and test possible solutions, not only in science and engineering but also in biology, medicine, economics and many other disciplines, with far greater detail and much more quickly. Complex computations that currently take days could be accomplished in a matter of seconds.

“We can solve problems that we simply couldn’t solve before,” said Edward Seidel, the director of the Center for Computational Technology at Louisiana State University and an astrophysicist interested in black holes and gravitational waves in outer space.

But reaching new heights in computing will not come without huge obstacles. Five hundred thousand microprocessors produce a lot of heat and will take vast amounts of electricity to cool. Software applications must be rewritten to work on the new computers. And there is the question of reliability, the ability to sustain operations over long periods of time.

“This has always been an issue with computing, but when you’re computing at this scale it becomes an overriding issue,” Dunning said. “If you don’t pay close attention to that, you can get a machine that is really powerful but will only stay up for an hour. That clearly wouldn’t lead to many of the scientific breakthroughs we’d be looking for.”
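Dunning’s one-hour worry tracks simple failure arithmetic. A sketch with assumed numbers (the per-part failure rate below is our illustration, not a figure from the article):

    # Assumed: each of 500,000 parts fails about once every five years.
    nodes = 500_000
    mtbf_per_node_hours = 5 * 365 * 24            # ~43,800 hours per part

    # If failures are independent, system MTBF shrinks with the part count.
    system_mtbf_hours = mtbf_per_node_hours / nodes
    print(system_mtbf_hours * 60, "minutes")      # about 5.3 minutes

Even generously reliable parts, multiplied by half a million, would leave the whole machine failing every few minutes unless the software can tolerate and recover from individual failures.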

NSF is expected to announce results of the competition next September, but the new supercomputer likely would not be fully operational until mid-2011, Dunning said.

———

On the Net: http://www.ncsa.uiuc.edu

 


All Contents ©Copyright The Oak Ridger
Mirrored with permission

 

 