May 6, 2013
by Joanna Rowell
Building a universe uses a lot of computational power. The movement of billions of particles must be simulated over billions of years, spanning from 100 million years after the Big Bang to the modern day. To support the work of the ambitious, international Dark Energy Survey (DES), over 100 virtual universes will be created. But for physics graduate student Matthew Becker and the Research Computing Center (RCC), making a batch of universes was simply a project for the holidays.
On a single computer running around the clock, one simulation would take five years to complete. But by using one-third of the 3,000 processors available on the RCC's Midway high performance computing (HPC) cluster, Becker and Professor of Astronomy & Astrophysics Andrey Kravtsov cut the time to evolve a virtual universe down to a mere 30 hours. So far, Becker has run more than thirty simulations on the machine, mostly over the past Thanksgiving and Christmas holidays "when people won't be affected by me taking over the large fraction of the machine," he said.
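Those figures are roughly what naive parallel scaling would predict. A back-of-the-envelope check (the numbers are illustrative only; real N-body codes never scale perfectly linearly) can be sketched in a few lines:

```python
# Back-of-the-envelope check of the figures quoted above.
# Numbers are illustrative; real N-body codes rarely scale linearly.

HOURS_PER_YEAR = 365 * 24                 # 8,760 hours

serial_hours = 5 * HOURS_PER_YEAR         # ~5 years on one processor
cores = 3000 // 3                         # one-third of Midway's 3,000 cores

ideal_hours = serial_hours / cores        # perfect linear speedup
print(f"Ideal linear scaling: {ideal_hours:.0f} hours")  # ~44 hours
```

Perfect scaling would give about 44 hours, the same ballpark as the reported 30; a gap of that size is easily explained by faster per-core hardware or by "five years" being a rough estimate.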
Those virtual universes are a critical part of helping DES researchers study the real thing: millions of images taken by a custom-designed digital camera attached to a telescope in Chile, covering one-eighth of the sky visible from Earth. With those images, the DES seeks to solve one of astronomy's greatest modern mysteries by looking for evidence of dark energy, dark matter, and the forces that shaped the universe's expansion. Since the 1998 discovery that this expansion is actually speeding up, rather than slowing down under the pull of gravity as had long been expected, scientists have sought evidence of the mysterious force that must be overcoming the inward pull of the universe's own mass.
The images taken by the room-sized, 570 megapixel Dark Energy Camera (DECam) – designed and constructed at Fermilab – will hopefully contain the elusive signatures of dark energy and dark matter. But before the petabytes of data collected by that instrument are analyzed, researchers will calibrate and fine-tune their methods on the virtual universes produced by Kravtsov, Becker, and researchers at Stanford University and the University of Michigan in what's known as a Blind Cosmology Challenge.
Each virtual universe is generated with a slightly different cosmology: a set of initial conditions and parameters for known and as-yet-unknown forces, chosen within the range astronomers believe was present in the early stages of our universe. After billions of years are simulated to bring the virtual universe up to the "present day," the data are converted into synthetic images similar to those the real-life DECam will collect. The DES analysis team then applies its methods to the synthetic data and tries to recover the original conditions used to generate the virtual universe. The successes and failures of these practice runs will help researchers determine whether their algorithms are ready for prime time.
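The blind-challenge loop described above can be sketched as a toy round trip. Every name here is invented for illustration, and the trivial "simulator" and "estimator" stand in for enormous N-body runs and image-analysis pipelines:

```python
import random

def make_virtual_universe(true_params, seed=0):
    """Toy stand-in for an N-body simulation: produce 'synthetic
    observations' encoding the input cosmology plus measurement noise.
    The real pipeline produces mock DECam images, not numbers."""
    rng = random.Random(seed)
    return {name: value + rng.gauss(0, 0.01)
            for name, value in true_params.items()}

def analysis_pipeline(observations):
    """Toy stand-in for the DES analysis, which must estimate the
    cosmology without ever seeing the inputs (hence 'blind')."""
    return observations  # a trivial 'estimator', for illustration only

def recovered_ok(true_params, recovered, tol=0.05):
    """Did the analysis recover every true parameter within tolerance?"""
    return all(abs(recovered[k] - v) < tol for k, v in true_params.items())

# One round: the simulators know true_params; analysts see only the data.
true_params = {"omega_m": 0.30, "sigma_8": 0.80}   # hidden from analysts
synthetic_data = make_virtual_universe(true_params)
recovered = analysis_pipeline(synthetic_data)
print(recovered_ok(true_params, recovered))  # True means 'ready for real data'
```

If the recovered parameters miss the hidden truth, that signals a flaw in the analysis methods, which is exactly the failure mode the challenge is designed to expose before real DECam data arrive.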
"The idea is that they'll apply their analysis techniques, and we'll see whether they actually recover the true parameters," said Kravtsov. "If they do, it means that the techniques they will use on the real data are robust and trustworthy. If they don't, we'll have to go back to the drawing board and figure out where the problem is, and repeat that process."
By the time the DECam achieved "first light" and began collecting images in September, the Blind Cosmology Challenge team was already hard at work creating this synthetic data. When the project began, Kravtsov and Becker were using XSEDE, a national network of supercomputers, to run their simulations. But a switch last year to the RCC's high performance computing resources has streamlined the process and allowed for faster generation of virtual universes. The transition has enabled "a big leap in progress," Kravtsov continued, including Becker's holiday simulation marathons on the Midway cluster.
The RCC deploys cutting-edge computing resources for the UChicago community. In addition, consultants at the RCC are applying their wide knowledge of science and computation to work directly with researchers in the use of high-performance computing and to integrate HPC into new scholarly disciplines. "Designing and managing world-class computers is just one part of supporting research computing at the University of Chicago," says H. Birali Runesha, Director of the RCC. "RCC also provides access to consultants to facilitate computational work of researchers with varying levels of HPC experience." RCC consultants leverage their experience in primary research, computing, and visualization to support researchers looking to take their projects to a new level with resources such as Midway. "He has a vision," Kravtsov said of Runesha. "It will transform the university and the way research is done here."