The holiday shopping season is just around the corner, and many consumers will turn to the convenience of online shopping to avoid the crowds while completing their gift lists. Many companies, in turn, rely on cloud computing to ensure their online services can handle the increased customer load.
Cloud computing is a flexible computing model that allows even small businesses access to large data centers with thousands of servers. This allows the company to buy the computing services needed for a given period, like the holiday shopping season.
Businesses are not the only ones that could benefit from this approach. To address important scientific questions, scientists increasingly "crunch" large databases. The cloud model would allow scientists from many areas to share a large computer resource.
"Many scientists begin their work using a personal computer, but the project usually grows more challenging and complicated," said Katherine Yelick, Director of the National Energy Research Scientific Computing (NERSC) Division at Lawrence Berkeley National Laboratory. "They may not require a massive supercomputer, but access to a mid-range computing resource could be a huge benefit."
In response, many laboratories and universities are buying computer clusters to meet their research needs.
"You can buy a small computer cluster for $50,000, but the cost of ownership often exceeds the cost of hardware when you factor in floor space, power demands, and staff support," said Yelick.
The Department of Energy is tackling this challenge with the Magellan project, a research and development effort to establish a cloud computer network designed exclusively for scientific purposes.
"Magellan differs from commercial cloud models in both size and the types of computer applications it can run. Whereas a commercial cloud application may only use a few dozen processors at a time, scientific applications can require hundreds of processors that work together," said Daniel Hitchcock, senior technical advisor for the DOE Office of Advanced Scientific Computing Research (ASCR) and the manager of the Magellan program.
The Magellan prototype will enhance efficiency by locating computer services in larger computer centers at Lawrence Berkeley National Laboratory and Argonne National Laboratory. The project is designed to develop the tools necessary to allow scientists to work on data-intensive projects.
"In essence, the Magellan cloud system will give scientific users access to a set of linked computers that may be used like their own personal cluster, but at a greatly reduced effort and cost," said Christine Chalk, program manager in ASCR.
Katherine Yelick, Director of the National Energy Research Scientific Computing (NERSC) Division at Lawrence Berkeley National Laboratory. (Photo credit: Lawrence Berkeley National Laboratory)
Magellan will build on existing efforts and provide transparent access to multi-teraflop computing to researchers at universities, DOE laboratories, and their collaborators.
"To visualize one teraflop, consider that one person can solve about one basic math problem, an addition or a subtraction, every second. A teraflop is equal to one trillion people each solving a problem every second," said Yelick.
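Yelick's analogy can be checked with quick back-of-the-envelope arithmetic, assuming, as the quote does, one problem per person per second:

```python
# Back-of-the-envelope check of the teraflop analogy.
# Assumption (from the quote): one person solves one basic math
# problem per second; a one-teraflop machine performs 1e12
# floating-point operations per second.
human_rate = 1.0        # problems per person per second
teraflop = 1.0e12       # operations per second

# People needed to keep pace with a one-teraflop machine:
people_needed = teraflop / human_rate   # one trillion people

# Equivalently, the time one person would need to work through
# a single second of the machine's output:
seconds = teraflop / human_rate
years = seconds / (60 * 60 * 24 * 365)  # roughly 31,700 years
print(f"{people_needed:.0e} people, or {years:,.0f} years for one person")
```

Either way the arithmetic is framed, the gap between human and machine rates is about twelve orders of magnitude.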
The success of the prototype depends on how quickly users can turn around scientific results compared to using their own clusters and on how easy the system is to use.
"Mid-range computing is something that science needs," said Yelick. Magellan not only offers enhanced computing, but also a better way to store critical scientific data. "While people who run their own clusters benefit from the computing power, their data storage systems usually can't keep up with the computer. Magellan will also make large datasets more widely available to the broader scientific community," said Yelick.
Magellan also promises to save energy as institutions move away from homegrown clusters to more efficient, shared cloud resources. It could also make cloud computing a competitive tool for U.S. manufacturing and design industries throughout the supply chain, including many small businesses. Finally, it could stretch research dollars by providing more computing at lower cost to institutions nationwide.
This initiative received more than $30 million from American Recovery and Reinvestment Act funds to set up and test the cloud computer-related hardware and software, as well as develop necessary tools and evaluate the scientific effectiveness of this approach.
"This basic research is critical to future energy technologies, and to inspire, train, and support future scientists who will ultimately be called upon to solve the nation's energy challenges in the 21st century," said Hitchcock.
This study is supported by the Department of Energy's (DOE) Office of Science, Office of Advanced Scientific Computing Research. DOE invests in science to address critical issues affecting people's daily lives and the nation's future. For more information, visit www.science.doe.gov.
The National Energy Research Scientific Computing (NERSC) Division at Lawrence Berkeley National Laboratory aims to accelerate scientific discovery by providing high-performance computing, data storage, and user support services to DOE researchers.
The Argonne Leadership Computing Facility aims to dramatically accelerate scientific discovery by providing leadership-class computing resources to researchers nationwide for large-scale simulations with the potential for major impact on science and engineering.
The American Recovery and Reinvestment Act of 2009 is an unprecedented effort to jumpstart our economy, create or save millions of jobs, and put a down payment on addressing long-neglected challenges so our country can thrive in the 21st century.
This article was written by Stacy W. Kish.