ASCR Monthly Computing News Report-March 2012



This monthly survey of computing news of interest to ASCR is compiled by Jon Bashor (JBashor@lbl.gov) with news provided by Argonne, Fermi, Lawrence Berkeley, Lawrence Livermore, Los Alamos, Oak Ridge, Pacific Northwest and Sandia national labs.

In this issue:
Research News

Berkeley Lab-Led SciDAC Institute to Help Solve Data-Intensive Science Challenges
PNNL Researchers Improve Precipitation Simulation with UQ
LLNL Develops New Technologies for Streaming Data
ASCR Research Highlighted at Copper Mountain Conference
Book Published Describing Sandia's Open-Source Optimization Modeling Language
Supercomputing the Difference between Matter and Antimatter
NERSC, ESnet Help Researchers Discover How Neutrinos Shift Flavors
New Computer Codes Unlock the Secrets of Cleaner Burning Coal

People
Sandia Researchers Hendrickson and Bochev Named SIAM Fellows
ANL's Rajeev Thakur Serving as Technical Program Chair of SC12 Conference
LLNL's Chandrika Kamath Chairs Data Mining Conference Steering Committee

Berkeley Lab's Leinweber Gives Opening Keynote at TradeTech Conference in NYC

Facilities/Infrastructure
OLCF Completes First Phase of Titan Transition
New Supercomputers at Brookhaven Will Advance Nanomaterial Design

Outreach and Education
Berkeley Lab Staff Offer Career Advice to Area High School Students
Research News
Berkeley Lab-Led SciDAC Institute to Help Solve Data-Intensive Science Challenges

As scientists around the world address some of society's biggest challenges, they increasingly rely on tools ranging from powerful supercomputers and one-of-a-kind experimental facilities to dedicated high-bandwidth research networks. But whether they are investigating cleaner sources of energy, studying how to treat diseases, improving energy efficiency, understanding climate change, or addressing environmental issues, these scientists all face a common problem: massive amounts of data that must be stored, shared, analyzed, and understood. And the amount of data continues to grow; scientists who are already falling behind are in danger of being engulfed by massive datasets.

On March 29, Energy Secretary Steven Chu announced the Scalable Data Management, Analysis, and Visualization (SDAV) Institute, a $25 million, five-year initiative to help scientists better extract insights from today's increasingly massive research datasets. SDAV will be funded through DOE's Scientific Discovery through Advanced Computing (SciDAC) program and led by Arie Shoshani of Lawrence Berkeley National Laboratory (Berkeley Lab). Argonne National Laboratory computer scientist Robert Ross is deputy director of the SDAV Institute.

As one of the nation's leading funders of basic scientific research, the Department of Energy Office of Science has a vested interest in helping researchers effectively manage and use these large datasets. SDAV was formally announced March 29 as part of the Administration's "Big Data Research and Development Initiative."
Read more.

PNNL Researchers Improve Precipitation Simulation with UQ
Pacific Northwest National Laboratory (PNNL) researcher Guang Lin, his PNNL colleagues Ben Yang, Yun Qian, and Ruby Leung, and Nanjing University colleague Yu Zhang are using uncertainty quantification (UQ) to improve atmospheric model predictive capability within a regional weather research and forecasting model. Their study is the first to apply a stochastic importance sampling algorithm under a Bayesian inference framework to optimize model inputs so that precipitation simulation results match observational data sets in a regional weather research and forecasting model. Their findings will help planners forecast the probability of extreme weather and climate events. Their study "Some Issues in Uncertainty Quantification and Parameter Tuning: A Case Study of Convective Parameterization Scheme in the WRF Regional Climate Model" was published in Atmospheric Chemistry and Physics in March 2012.
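
The Bayesian calibration idea behind such a study can be sketched in a few lines. This is a toy illustration, not the WRF workflow: the one-parameter "model," the observation, and the error level are all invented here. The recipe is to draw candidate parameters from a prior, weight each by how well the model output matches the observation, and summarize the weighted samples.

```python
import math
import random

random.seed(0)

# Toy "model": a precipitation statistic as a function of one parameter
# theta (a stand-in for a convective parameterization input).
def model(theta):
    return 2.0 * theta + 1.0

obs = 5.0        # hypothetical observed precipitation statistic
obs_sigma = 0.5  # assumed observation error

# Importance sampling under a Bayesian framework: draw candidates from a
# uniform prior, weight each by its Gaussian likelihood given the observation.
samples = [random.uniform(0.0, 4.0) for _ in range(20000)]
weights = [math.exp(-0.5 * ((model(t) - obs) / obs_sigma) ** 2) for t in samples]

# The posterior mean of theta is the weight-normalized average.
posterior_mean = sum(w * t for w, t in zip(weights, samples)) / sum(weights)
print(posterior_mean)  # close to the value 2.0 implied by the observation
```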
Contact: Guang Lin, guang.lin@pnnl.gov

LLNL Develops New Technologies for Streaming Data
As part of the Mathematics for Analysis of Petascale Data (MAPD) program, Peter Lindstrom of Lawrence Livermore National Laboratory (LLNL) and his collaborators have developed two new technologies for streaming data. First, together with researchers at Polytechnic Institute of New York University, they have developed HyperFlow, a scalable framework for streaming data analysis and visualization on heterogeneous computers. HyperFlow supports data- and task-parallel computing on both CPUs and GPUs, and optimizes the scheduling and execution of dataflow tasks based on available compute resources to deliver consistently high performance. Second, they have developed a new streamable data structure for unstructured triangle meshes that supports constant-time random access connectivity queries and distributed and multithreaded parallel stream processing. The data structure can be constructed on the fly and streamed out in a single pass using very little memory. In addition to random access, the data structure supports windowed and frontal stream processing for access to very large meshes. At two integer references per triangle, it encodes both incidence and adjacency more compactly than common file formats, which use three vertex references to encode incidence alone.
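
The kinds of queries such a structure answers can be illustrated with a plain indexed mesh. This is a minimal sketch, not the LLNL data structure: it uses the standard three-vertex-references-per-triangle representation and derives edge adjacency in a single pass over the triangles.

```python
# Two triangles sharing the edge (1, 2); each triangle stores three
# vertex references (incidence only).
triangles = [(0, 1, 2), (1, 3, 2)]

# One pass over all directed edges: triangle t's edge (a, b) is adjacent
# to whichever triangle owns the opposite directed edge (b, a).
edge_owner = {}
for t, (a, b, c) in enumerate(triangles):
    for u, v in ((a, b), (b, c), (c, a)):
        edge_owner[(u, v)] = t

def adjacent(t, u, v):
    """Return the triangle across edge (u, v) of triangle t, or None on the boundary."""
    return edge_owner.get((v, u))

print(adjacent(0, 1, 2))  # edge (1, 2) of triangle 0 borders triangle 1
```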

ASCR Research Highlighted at Copper Mountain Conference
Ray Tuminaro (Sandia) and Howard Elman (University of Maryland) co-organized the 12th Copper Mountain Conference on Iterative Methods, held March 25–30 in Copper Mountain, Colorado. The conference program included more than 150 presentations, many of which highlighted ASCR-funded research. A sampling of session topics includes multicore/GPU architectures, large-scale optimization and inverse problems, stochastic PDEs and uncertainty quantification, nonlinear solvers, multiphysics applications, magnetohydrodynamics, neutronics, discretization, Helmholtz equations, hybrid direct-iterative linear solvers, multigrid and domain decomposition, Krylov accelerators, large graphs/PageRank/Markov chains, and tensors. Ray Tuminaro also leads an effort to define a new section within the journal SIAM Review and assumed the corresponding Section Chief role in January 2012.
Contact: Scott Collis, sscoll@sandia.gov

Book Published Describing Sandia's Open-Source Optimization Modeling Language
Sandia's William E. Hart and Jean-Paul Watson, together with David L. Woodruff (University of California at Davis) and Carl D. Laird (Texas A&M University), have recently completed the book Pyomo: Optimization Modeling in Python. Hart and Watson are computer scientists in Sandia's Computing Research Center; Woodruff and Laird are researchers in management science and chemical engineering, respectively.

The Pyomo modeling language is an open-source, Python-based library for expressing a very broad range of optimization problems, including linear programs, non-linear programs, and problems with discrete decision variables. By embedding Pyomo in Python, the library facilitates seamless integration with widely used scientific, visualization, and utility libraries, including SciPy and NumPy. Further, the open-source nature of Pyomo facilitates deployment on large-scale parallel compute platforms. Pyomo provides a number of advanced modeling features, including block-based composition and generic problem transformations. The book consists of 10 chapters, addressing topics ranging from a basic introduction to the language syntax to advanced scripting using Pyomo.

The book is published as Volume 67 in Springer's Optimization and Its Applications series, March 2012. Read more about the book.
Read more about Coopr and Pyomo.

Supercomputing the Difference between Matter and Antimatter
An international collaboration of scientists has reported a landmark calculation of the decay process of a kaon into two pions, using breakthrough techniques on some of the world's fastest supercomputers. This is the same subatomic particle decay explored in a 1964 Nobel Prize-winning experiment performed at the U.S. Department of Energy's Brookhaven National Laboratory (BNL), which revealed the first experimental evidence of charge-parity (CP) violation, a lack of symmetry between particles and their corresponding antiparticles that may hold the answer to the question "Why are we made of matter and not antimatter?"

The new research, reported online in Physical Review Letters on March 30, 2012, helps nail down the exact process of kaon decay, and is also inspiring the development of a new generation of supercomputers that will enable the next step in this research.

"The present calculation is a major step forward in a new kind of stringent checking of the Standard Model of particle physics, the theory that describes the fundamental particles of matter and their interactions, and how it relates to the problem of matter/antimatter asymmetry, one of the most profound questions in science today," said Taku Izubuchi of the RIKEN BNL Research Center and BNL, one of the members of the research team publishing the new findings. "When the universe began, did it start with more particles than antiparticles, or did it begin in a symmetrical way, with equal numbers of particles and antiparticles that, through CP violation or a similar mechanism, ended up with more matter than antimatter?"
Read more.

NERSC, ESnet Help Researchers Discover How Neutrinos Shift Flavors

Neutrinos, the wispy particles that flooded the universe in the earliest moments after the Big Bang, are continually produced in the hearts of stars and other nuclear reactions. Untouched by electromagnetism, they respond only to the weak nuclear force and even weaker gravity, passing mostly unhindered through everything from planets to people. Years ago scientists also discovered another hidden talent of neutrinos. Although they come in three basic "flavors" (electron, muon, and tau), neutrinos and their corresponding antineutrinos can transform from one flavor to another while traveling close to the speed of light. How they do this has been a longstanding mystery.

But new, unprecedentedly precise measurements from the multinational Daya Bay Neutrino Experiment are revealing how electron antineutrinos "oscillate" into different flavors as they travel. This new finding from Daya Bay opens a gateway to a new understanding of fundamental physics and may eventually solve the riddle of why there is far more ordinary matter than antimatter in the universe today.
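
The effect the experiment measured follows the standard two-flavor survival-probability formula. The sketch below plugs in sin²(2θ13) = 0.092, the value Daya Bay reported in March 2012; the baseline, energy, and mass splitting used here are typical illustrative numbers, not the experiment's own analysis.

```python
import math

# Two-flavor approximation of the electron-antineutrino survival probability:
# P(nue -> nue) = 1 - sin^2(2*theta13) * sin^2(1.267 * dm2 * L / E),
# with dm2 in eV^2, L in km, and E in GeV.
def survival_probability(L_km, E_GeV, sin2_2theta=0.092, dm2_eV2=2.4e-3):
    return 1.0 - sin2_2theta * math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# Far-detector baseline of roughly 1.65 km, typical reactor antineutrino
# energy of about 4 MeV (illustrative values).
p = survival_probability(L_km=1.65, E_GeV=0.004)
print(f"{p:.3f}")  # a several-percent deficit relative to no oscillation
```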

The international collaboration of researchers is made possible by advanced networking and computing facilities. In the U.S., the Department of Energy's high-speed science network, ESnet, speeds data to the National Energy Research Scientific Computing Center (NERSC) where it is analyzed, stored and made available to researchers via the Web. Both facilities are located at the DOE's Lawrence Berkeley National Laboratory (Berkeley Lab).
Read more.

New Computer Codes Unlock the Secrets of Cleaner Burning Coal

About half of all electricity used in the United States comes from coal. When burned, this fossil fuel emits pollutants that contribute to smog, acid rain and increased greenhouse gases in the atmosphere. But as the demand for power increases, and affordable renewable energy sources remain years away from offsetting this need, one fact is clear: coal is here to stay, at least for the foreseeable future. Recognizing this, researchers supported by the Department of Energy are investigating cleaner methods for extracting energy from coal, like gasification. Using supercomputers at the National Energy Research Scientific Computing Center (NERSC), scientists at the University of Utah have developed tools to model the complex processes of coal gasification, as well as methods to validate these models. A detailed understanding of coal gasification will help engineers retrofit existing gasification plants and improve the design of future gasification plants for efficiently producing electricity.

"Coal is an extremely complex organic fuel. Although it has been used as an energy source for centuries, humanity is only now beginning to understand the science behind how it works," says Charles Martin Reid, who is currently a postdoctoral researcher at the Lawrence Livermore National Laboratory (LLNL).

For his doctoral thesis at the University of Utah, Reid retrofitted a massively parallel combustion code called ARCHES to simulate the complex processes occurring inside coal gasifiers. He then combined experimental data and mathematical models to validate the computer code. He used about 3 million processor hours on NERSC's Hopper and Franklin systems to build codes and validate his predictions.
Read more.

People

Sandia Researchers Hendrickson and Bochev Named SIAM Fellows
Bruce Hendrickson and Pavel Bochev have been named Fellows of the Society for Industrial and Applied Mathematics (SIAM). Hendrickson and Bochev were among 36 members selected for Fellow status this year, and are the first Sandia staff members to receive this honor. SIAM is the leading international professional society for applied mathematics, with 14,000 members from 85 countries, and is the de facto professional society for computational modeling and simulation. Hendrickson was recognized for contributions to combinatorial and parallel algorithms in scientific computing. Bochev was recognized for contributions to the numerical solution of partial differential equations. His main research interests are in numerical analysis, applied mathematics and computational science.
Contact: Scott Collis, sscoll@sandia.gov

ANL's Rajeev Thakur Serving as Technical Program Chair of SC12 Conference
Rajeev Thakur, senior scientist in Argonne National Laboratory's (ANL's) Mathematics and Computer Science Division, is the Technical Program chair of the SC12 conference. SC is the premier international conference for high-performance computing, networking, storage, and analysis. As technical program chair, Thakur will be responsible for the entire technical program for SC12, including technical papers, tutorials, posters, workshops, panels, invited speakers, and birds-of-a-feather sessions.

Thakur has been actively involved in the SC conference series. He was technical papers co-chair for SC11, system software area co-chair for SC10, and tutorials co-chair for SC09.

Berkeley Lab's Leinweber Gives Opening Keynote at TradeTech Conference in NYC

David Leinweber, co-founder of the Center for Innovative Financial Technology (CIFT) in Berkeley Lab's Computational Research Division, gave the opening keynote talk at TradeTech, the world's largest annual conference for equity trading and technology professionals. TradeTech was held March 8 at the Javits Center in New York City. Launched in 2001, TradeTech annually draws about 1,200 top-level electronic trading professionals.

In his talk during the opening session on Wednesday, March 7, Leinweber discussed "Big Data in Financial Markets." Among the topics he covered were structured and unstructured big data, federal market monitoring and buy-side lessons, collaborative intelligence and research, and alpha-hunting skills for human and machine teams.

CIFT was started at Berkeley Lab in July 2010 to help build a bridge between the technical needs of the federal financial market regulatory agencies and the substantial technological resources suited for this purpose developed over many decades at Berkeley Lab and elsewhere with support from the DOE science program.

Facilities/Infrastructure

OLCF Completes First Phase of Titan Transition
The Oak Ridge Leadership Computing Facility's (OLCF's) Jaguar supercomputer has completed the first phase of an upgrade that will eventually make it more powerful than any system currently in operation. The project, which was concluded ahead of schedule, upgraded Jaguar's AMD Opteron cores to the newest 6200 series and increased their number by a third, from 224,256 to 299,008. Two six-core Opteron processors were removed from each of Jaguar's 18,688 nodes and replaced with a single 16-core processor. At the same time, the system's interconnect was updated and its memory was doubled to 600 terabytes. In addition, 960 of Jaguar's 18,688 compute nodes now contain an NVIDIA graphical processing unit (GPU). The GPUs were added to the system in anticipation of a much larger GPU installation later in the year. The GPUs act as accelerators, giving researchers a serious boost in computing power in a far more energy-efficient system.
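
The core counts quoted above are internally consistent, as a quick check shows:

```python
# Consistency check of the Jaguar upgrade figures.
nodes = 18688
old_cores = nodes * 2 * 6   # two six-core Opterons per node before the upgrade
new_cores = nodes * 16      # one 16-core Opteron per node after the upgrade

print(old_cores, new_cores)         # 224256 299008
assert new_cores == old_cores * 4 // 3  # "increased their number by a third"
```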

Acceptance testing for the upgrade was completed in February 2012. The testing suite included leading scientific applications focused on molecular dynamics, high-temperature superconductivity, nuclear fusion, and combustion. It also included benchmarking programs designed to test specific areas of system performance. When the upgrade is completed this autumn, the system will be renamed Titan and will be capable of up to 20 petaflops. Users have had access to Jaguar throughout the upgrade process.

Outreach and Education

Berkeley Lab Staff Offer Career Advice to Area High School Students
Berkeley Lab Computing Sciences Communications staffers Jon Bashor, Margie Wylie, and Linda Vu joined Rachel Carl and Jeff Todd of Berkeley Lab Human Resources on Wednesday, March 21, to give a presentation on finding rewarding jobs to juniors and seniors at Kennedy High School in Richmond, California.

The session drew more than 80 students from the school's IT Academy and stemmed from a summer outreach program organized by Computing Sciences Communications staff two summers ago, and a similar presentation to Kennedy High School students about a year ago. Among the topics covered were where to look for jobs, who to call on for help, dressing for success, likely interview questions, and a "circle of support" exercise to identify people, organizations, and other support resources. A handful of students stayed afterward to ask questions about opportunities at the Lab.

Berkeley Lab Computing Sciences has engaged in a number of career-related outreach activities with Kennedy High, at which 76 percent of the students are economically disadvantaged and do not meet state goals for academic performance.
Contact: Jon Bashor, JBashor@lbl.gov
