The new Astrid and Bruce McWilliams Center for Cosmology at Carnegie Mellon will be probing the mysteries of the universe while tackling another weighty problem in computer science--namely, how to process all the information produced by scientific research. Take, for example, the Large Synoptic Survey Telescope being constructed on a mountain in northern Chile. Carnegie Mellon is one of 18 American universities collaborating on the LSST, which, once complete, will spend 10 years taking high-resolution images of the entire visible night sky using the largest digital camera ever created. The LSST is eventually expected to capture more than 200,000 images every year, generating 100 terabytes of data every week.
"In almost all of the physical sciences, we're seeing gigantic amounts of data being generated," says Peter Lee, head of the Department of Computer Science. "There's just no way humans can sift through it all."
Carnegie Mellon alumnus and trustee Bruce McWilliams (S'78, '81) recognized that cosmology and computer science complement each other in many ways. For instance, the models that SCS researchers have created to simulate air currents and water flow are similar to the fluid models that cosmologists use to simulate the formation of galaxies and black holes, says Adrien Treuille, assistant professor in the computer graphics group at the Robotics Institute. "If interactions generally unfold in predictable ways, we can use far fewer variables to represent the system and have correspondingly faster simulations," he says. And cosmologists' need to process large digital images and create complex graphics depicting physical phenomena calls for advanced methods of computer modeling, data mining, storage and retrieval.
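The variable-reduction idea Treuille describes is often realized through techniques like proper orthogonal decomposition (essentially PCA applied to simulation snapshots). The sketch below is purely illustrative, not code from the McWilliams Center: the grid sizes, mode counts and fabricated snapshot data are all assumptions, chosen only to show how a 10,000-variable state can be compressed to a handful of coordinates when the dynamics are predictable.

```python
import numpy as np

# Hypothetical example: snapshots of a simulated flow field.
# Each column is one full-resolution state (10,000 grid cells);
# predictable dynamics mean the snapshots lie near a low-dimensional subspace.
rng = np.random.default_rng(0)
n_cells, n_snapshots, n_modes = 10_000, 200, 8

# Fabricate snapshots from a few smooth "modes" to mimic such a system.
modes = rng.standard_normal((n_cells, n_modes))
weights = rng.standard_normal((n_modes, n_snapshots))
snapshots = modes @ weights

# PCA via the singular value decomposition: keep only the top modes.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
reduced_basis = U[:, :n_modes]          # 10,000 variables -> 8

# Project a full state down to 8 numbers, then reconstruct it.
state = snapshots[:, 0]
coeffs = reduced_basis.T @ state        # the "far fewer variables"
reconstruction = reduced_basis @ coeffs
error = np.linalg.norm(state - reconstruction) / np.linalg.norm(state)
print(f"relative reconstruction error: {error:.2e}")
```

Because the fabricated snapshots have exactly eight underlying modes, the eight-variable representation reproduces the full state almost perfectly; a real simulation would trade a small, controllable error for the speedup.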
The founder and CEO of Tessera Technologies Inc., a San Jose, Calif.-based company that develops miniaturized integrated circuits and optical sensors, McWilliams decided that Carnegie Mellon's heritage of interdisciplinary research made it an ideal place to work on cosmic-scale simulations. Last April, he and his wife created the McWilliams Center as a collaborative effort between the Mellon College of Science and SCS. Lee says McWilliams has been engaged first-hand in the creation of the center "at every step along the way."
Cosmic-scale simulations require an enormous amount of processing power. A new high-performance computer cluster donated by the Betty and Gordon Moore Foundation (and dubbed "the Moore Machine") will be shared by the Computer Science Department and the McWilliams Center and will be devoted solely to research on large-scale simulations. Additional computational oomph is coming from the Pittsburgh Supercomputing Center, home of a cluster of super-fast computers nicknamed "Ferrari."
Lee sees a role for computing in cosmology beyond data crunching. A growing field called "eScience" aims to use machine learning to automatically discover scientific facts. As computers process the massive amounts of data coming from telescopes, for example, they might be looking for galaxies of a certain shape. But they'll be seeing a lot of other things, too--including potentially valuable new discoveries.
"Can we make computers at this scale intelligent enough to know when there's something special?" Lee asks. "Can we automate discoveries?"
Isaac Asimov once wrote that the most exciting phrase in science wasn't "eureka!" but "that's funny." If computers can be taught how to spot "funny" anomalies in data, Lee envisions a day when they're making discoveries not only about galaxies and dark matter, but also about more immediate concerns, like asteroids heading toward Earth.
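A "that's funny" detector of the kind Lee imagines would, at its simplest, flag measurements that fall far outside the bulk of the data. The sketch below is an illustration of that idea, not anything from the LSST pipeline: the brightness values are fabricated, and the threshold is an arbitrary assumption. It uses the median absolute deviation, a robust spread estimate that, unlike the mean and standard deviation, is not skewed by the very anomalies being hunted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fabricated catalog: brightness readings for 10,000 ordinary objects...
brightness = rng.normal(loc=20.0, scale=0.5, size=10_000)
# ...plus a few genuinely unusual ones planted at the front.
brightness[:3] = [35.0, 5.0, 42.0]

# Robust z-scores from the median and the median absolute deviation (MAD).
median = np.median(brightness)
mad = np.median(np.abs(brightness - median))
robust_z = 0.6745 * (brightness - median) / mad

# Anything more than 6 robust standard deviations out is "funny".
anomalies = np.flatnonzero(np.abs(robust_z) > 6.0)
print(f"flagged {anomalies.size} of {brightness.size} objects for review")
```

The payoff is scale: a human can't eyeball 100 terabytes a week, but a filter like this reduces the stream to a short list of oddities worth a scientist's attention.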
Because the McWilliams Center combines pure scientific research with the possible discovery of "practical things" like asteroids on collision courses, Lee says it's well positioned to attract funding from programs such as the LSST and the National Science Foundation's Cyber-Enabled Discovery and Innovation program. At any rate, the pace of discovery won't be slowing down any time soon.
Jason Togyer | 412-268-8721 | email@example.com