Supercomputers like those at the University at Buffalo let researchers simulate hurricanes, space shots and molecular interactions.
Now scientists are working on ultra high-speed links between supercomputer centers, to connect researchers with the enormous troves of data they need for computer models.
"We're completely balkanized on our campuses," said Larry Smarr, director of the California Institute for Telecommunications and Information Technology at the University of California at San Diego. Without the capacity to connect with far-flung data sources, "you've got to sit in your little science cave."
For example, hurricane predictions come from models at NASA's center in Maryland, but the data to feed the models, collected from weather balloons and other sources, sit on a storage system in California.
"Having to go across the country turns out to be a huge problem," Smarr said. Faulty hurricane predictions in recent months gave storm-hit parts of Florida little time to prepare, while causing needless evacuations of areas that were spared.
He made the remarks during a distinguished speakers talk Friday at UB's Amherst campus. A guru of high-performance computing, Smarr was founding director of the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.
Now he heads a project called "Optiputer," short for optical networking, Internet Protocol computer. Funded by the National Science Foundation, the project is creating network and software technology capable of linking supercomputer sites together.
Buffalo, with its center for computational research, data-intensive bioinformatics center and a wealth of computer modeling work, is a good candidate to link with the growing optical network, Smarr said.
Russ Miller, director of UB's supercomputer center, agreed.
"We have all of the infrastructure in place," Miller said. The university is already working on high-speed links to the downtown medical campus, home of its bioinformatics center. And a consortium of universities statewide is in the process of building higher-capacity links.
But isn't there already some sort of network that allows computers to share information?
The public Internet is too slow and unpredictable to move the mountains of data generated by high-performance computers, Smarr said.
"Scientists invented this (network), and then the kids took it over," he said. "Which is fine; but scientists need networks to work on . . . and we can't do it when everybody's trying to download the latest Britney Spears video."
The optical network project is more than a pumped-up version of the Internet, said Kevin Thompson, a program director at the National Science Foundation who oversees Smarr's project. In 2002, NSF awarded $13.5 million over five years to the Optiputer effort.
Dedicated wavelengths on optical cables, working with special software, can provide connections strong enough to create a "virtual computer," or grid, spanning hundreds or thousands of miles, he said. Computing power in one location can work seamlessly on data in another.
"This vision of a next-generation grid . . . opens the path for powerful new ways to conduct data-intensive science," Thompson said in an e-mail interview.