Provided below is a listing of relevant articles, plans and ASCR-sponsored workshop reports.
The 2015 Neuro-Inspired Computational Elements Workshop
This workshop focused on Information Processing and Computation Systems beyond von Neumann/Turing Architecture and Moore’s Law Limits. The workshop brought together researchers from different scientific disciplines and application areas to provide a nucleation point for the development of the next generation of information processing/computation architectures that go beyond stored-program architectures and Moore’s Law limits.
Link to the report... (1.6MB)
DOE ASCR Workshop on Quantum Computing for Science
This report (3.9MB) details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies.
Link to the report... (3.9MB)
The 2015 Machine Learning and Understanding For Scientific Discovery Workshop
This workshop focused on three technological areas:
Link to the report... (4.3MB)
- Self-aware runtime and operating systems
- Deep learning for big data
- Resiliency and Trust
The 2015 Cybersecurity For Scientific Computing Integrity Workshop
This workshop focused on four technological areas:
Link to the report... (2.5MB)
- Extreme Scale Power Grid Simulation;
- Trustworthy Supercomputing;
- Trust within Open, High-End Networking and Data Centers; and
- Extreme Scale Data, Knowledge, and Analytics for Understanding and Improving Cyber Security.
Storage Systems and Input/Output to Support Extreme Scale Science
Four priority research directions emerged from this activity:
Link to the report... (4.2MB)
- In the area of SSIO architectures, additional research is needed to develop solutions to the challenge of managing upcoming deep and heterogeneous storage hierarchies, including storage in the compute system, and to explore alternative paradigms to the current file system model of access and organization.
- In the area of metadata, name spaces, and provenance, research is needed to devise new methods of capturing, organizing, presenting, and exploring rich metadata from DOE science activities, including breaking away from the current file model of data storage prevalent in DOE supercomputing facilities and science.
- In the area of supporting science data, research is needed to develop the next generation of I/O middleware and services in support of the broad collection of HPC and experimental and observational data needs and to integrate with and support new programming abstractions and workflow systems as they are adopted.
- In the area of understanding SSIO, research is needed to improve our ability to characterize the storage activities of DOE scientists and to model and predict the behavior of SSIO activities on future systems.
The 2014 Workshop on Modeling & Simulation of Systems and Applications
ModSim 2014 focused on three critical technological areas:
Link to the report... (3.8MB)
- Integrated Modeling and Simulation of Performance, Power, and Reliability;
- Standards, Integration, and Interoperability of ModSim Methodologies and Tools; and
- Advances in Core Modeling and Simulation Methods and Tools.
NERSC Large Scale Computing and Storage Requirements for Basic Energy Sciences Research: Target 2017
In addition to achieving its goal of characterizing BES computing and storage needs, the latest review revealed several key requirements. High-level findings are:
Link to the report... (6.0MB)
- Scientists will need access to significantly more computational and storage resources to achieve their goals and reach BES research objectives;
- Users will need assistance from NERSC to prepare for Cori (NERSC-8) and follow-on manycore systems;
- Research teams need to run complex jobs of many different types and scales;
- BES is a leader in innovative use of HPC and requires a diverse set of resources and services from NERSC; and
- BES facilities need computational analysis and data storage resources beyond what they can provide.
NERSC Large Scale Computing and Storage Requirements for Fusion Energy Sciences Research: Target 2017
In addition to achieving its goal of characterizing FES computing and storage needs, the latest review revealed several key requirements. High-level findings are:
Link to the report... (2.8MB)
- To meet Fusion Sciences research objectives, FES researchers need computing and data storage resources in excess of those predicted by historical trends;
- FES codes will require updated mathematical and I/O libraries that will run efficiently on next-generation architectures;
- FES scientists need support for both ensemble runs and large-scale jobs;
- Teams need effective tools for managing workflows, performing data analysis, and profiling and developing codes; and
- Code teams need help porting applications to run on next-generation architectures.
Abstract Machine Models and Proxy Architectures for Exascale Computing
In this report our goal is to provide the application development community with a set of models that can help software developers prepare for exascale. In addition, through the use of proxy architectures, we can enable a more concrete exploration of how well application codes map onto future architectures.
DOE workshop on Software Productivity for eXtreme-scale Science (SWP4XS)
This report presents results from the DOE workshop on Software Productivity for eXtreme-scale Science (SWP4XS) held January 13-14, 2014, in Rockville, MD. The workshop brought together approximately fifty experts in the development of large-scale scientific applications, numerical libraries, and computer science infrastructure to determine how to address the growing crisis in software productivity caused by disruptive changes in extreme-scale computer architectures.
2014 Workshop on Programming Abstractions for Data Locality
The purpose of the Workshop on Programming Abstractions for Data Locality (PADAL) was to identify common themes and standardize concepts for locality-preserving abstractions for exascale programming models.
Applied Mathematics Research for Exascale Computing
This report details the findings and recommendations of the DOE ASCR Exascale Mathematics Working Group that was chartered to identify mathematics and algorithms research opportunities that will enable scientific applications to harness the potential of exascale computing. The working group organized a workshop, held August 21-22, 2013 in Washington, D.C., to solicit input from over seventy members of the applied mathematics community. Research gaps, approaches, and directions across the breadth of applied mathematics were discussed, and this report synthesizes these perspectives into an integrated outlook on the applied mathematics research necessary to achieve scientific breakthroughs using exascale systems.
Accelerating Scientific Knowledge Discovery (ASKD) Working Group Report
Sustained scientific progress over the next decade and beyond will require new advanced discovery ecosystems quite different from the computational and collaborative environments in which most research is performed today. These systems will need to connect increasing numbers of scientists, enable use of data and computational services at unprecedented scales, foster scientific discoveries based on ever more complex cross-disciplinary hypotheses, facilitate the immediate sharing and exchange of existing and emerging knowledge, and provide mechanisms for timely control of and feedback to instruments and simulations. To achieve this goal requires computer science research advances in multiple areas.
Data Crosscutting Requirements Review
In April 2013, a diverse group of researchers from the U.S. Department of Energy (DOE) scientific community assembled in Germantown, Maryland to assess data requirements associated with DOE-sponsored scientific facilities and large-scale experiments.
NERSC Large Scale Computing and Storage Requirements for Biological and Environmental Research: Target 2017
In addition to achieving its goal of characterizing BER computing and storage needs, the latest review revealed several key requirements. High-level findings are:
Link to the report... (3.4MB)
- Scientists need access to significantly more computational and storage resources to achieve their goals and reach BER research objectives. BER anticipates a need for six billion computational hours (25 times 2012 usage) and 107 PB of archival data storage (10 times 2012 usage) at NERSC in 2017;
- Simulation and analysis codes will need to access, read, and write data at a rate far beyond that available today;
- Support for high-throughput job workflows is needed;
- State-of-the-art computational and storage systems are needed, but their acquisition must not interrupt ongoing research efforts; and
- NERSC needs to support data analytics and sharing, with increased emphasis on combining experimental and simulated data.
NERSC Large Scale Computing and Storage Requirements for High Energy Physics Research: Target 2017
In addition to achieving its goal of characterizing HEP computing, the workshop revealed several key requirements. The key findings are:
Link to the report... (3.3MB)
- Researchers need access to significantly more computing and storage resources to support HEP mission goals through 2017;
- Scientists need vastly improved data I/O rates and better facilities for performing data-intensive science;
- Research teams need to run both large-scale simulations and massive numbers of individual jobs; and
- NERSC must help enable and ease the transition to next-generation architectures.
2013 report of HEP/ASCR Data Summit
Representatives from the HEP and ASCR communities met at Germantown on April 2-3, 2013, to discuss issues in carrying out science with large datasets and associated data-intensive computing tasks. The authors of this report acknowledge the important contributions made by all of the ASCR and HEP participants at the data summit and thank them for their efforts.
ASCR Modeling and Simulation of Exascale Systems and Applications Workshop
A new process of “Co-Design” is being pursued in which application and computer scientists work toward the common goal of an exascale ecosystem of systems and applications. Modeling and simulation (ModSim) is a critical part of this process.
DOE ASCR Advisory Committee (ASCAC) Data Subcommittee Report: Synergistic Challenges in Data-Intensive Science and Exascale Computing
This new report discusses the natural synergies among the challenges facing data-intensive science and exascale computing, including the need for a new scientific workflow.
Summary of Data Requirements for NERSC
Department of Energy Scientists represented by the NERSC user community have growing requirements for data storage, I/O bandwidth, networking bandwidth, and data software and services. Over the next five years, these requirements are well above what would be provided by increases that follow historical trends. This report focuses primarily on the data needs of the modeling and simulation community.
Link to the report... (2.8MB)
Department of Energy (DOE) Fault Management Workshop
Held June 6, 2012 at the BWI Airport Marriott hotel in Maryland. The goals of this workshop were to:
- Describe the required HPC resilience for critical DOE mission needs
- Detail what HPC resilience research is already being done at the DOE national laboratories and is expected to be done by industry or other groups
- Determine what fault management research is a priority for DOE’s Office of Science and National Nuclear Security Administration (NNSA) over the next five years
- Develop a roadmap for getting the necessary research accomplished in the timeframe when it will be needed by the large computing facilities across DOE
Architectures II Workshop Report
Link to the report... (1.6MB)
Exascale Tools Workshop Report
Link to the report... (1.2MB)
DOE Workshop Report on Multiphysics Simulations
“Multiphysics Simulations: Challenges and Opportunities” considers multiphysics applications from algorithmic and architectural perspectives, where "algorithmic" includes both mathematical analysis and computational complexity and "architectural" includes both software and hardware environments. Many diverse multiphysics applications can be reduced, en route to their computational simulation, to a common algebraic coupling paradigm. Mathematical analysis of multiphysics coupling in this form is not always practical for realistic applications, but model problems representative of applications discussed herein can provide insight. A variety of software frameworks for multiphysics applications have been constructed and refined within disciplinary communities and executed on leading-edge computer systems. The report examines several of these, exposes some commonalities among them, and attempts to extrapolate best practices to future systems. Get report: http://www.ipd.anl.gov/anlpubs/2012/01/72183.pdf
Report of the ASCR/BES Data Workshop 2011
Report of the Extreme Scale Solvers Workshop, March 8-9, 2012
Transition to Future Architectures
Magellan Report Cuts Through Cloud Computing Hype
The Recovery Act funded Magellan project documents the pros and cons of scientific usage of cloud computing in their final report. Within a few weeks industry press and blogs from around the world took notice.
Report of the DOE Workshop on Mathematics for the Analysis, Simulation, and Optimization of Complex Systems September 13–14, 2011
A Multifaceted Mathematical Approach for Complex Systems
NERSC Large Scale Computing and Storage Requirements for Advanced Scientific Computing Research
Link to the report... (1.4MB)
- ASCR projects will need more than one billion hours of computing time at NERSC in 2014 to meet their research goals and help enable world-class scientific discovery at Office of Science HPC facilities. Approximately one-half of these hours are self-reported requirements for visual analytics.
- Applications will need to be able to read, write, and store 100s of terabytes of data for each simulation run. Many petabytes of long-term storage will be required to store and share data with the scientific community.
- Access to appropriate resources and support for workflows involving many small and medium-sized runs is required.
- ASCR projects need access to, and robust support for, a rich set of software applications, libraries, and tools.
Scientific Collaborations for Extreme-Scale Science Workshop Report
Extreme-scale science is not simply about facilities. It inevitably also requires the exponential acceleration of progress that can be achieved when many differing intellects attack challenges collaboratively. The Scientific Collaborations for Extreme-Scale Science Workshop, held December 6–7, 2011 in Gaithersburg, Maryland, focused on a strategic vision of how collaboration can be enabled, and on the research and development that will turn the vision into reality.
NERSC Large Scale Computing and Storage Requirements for Fusion Energy Sciences Research
In addition to achieving its goal of collecting and characterizing computing requirements, the workshop revealed several key points. Key requirements for scientists conducting research in FES include:
Link to the report... (1.8MB)
- Larger allocations of computational resources at NERSC;
- Continued support for a complex ecosystem of software libraries and tools;
- Data storage systems that can support high-volume/high-throughput I/O; and
- Robust, highly available computational systems with high throughput, long run times, and short queue waits.
Exascale Programming Challenges Workshop Report
The workshop on programming models, languages, compilers, and runtime systems for exascale machines was held in July 2011. The goal was to identify the challenges in each of these areas, the promising approaches, and measures to assess progress. The participants in the workshop articulated the research challenges in programming support for anticipated exascale systems, including specifying what is known and what remains uncertain.
Exascale Workshop on Data Analysis, Management, and Visualization
The workshop goal was to identify the research and production directions that the Data Management, Analysis, and Visualization (DMAV) community must take to enable scientific discovery for HPC as it approaches the exascale.
NERSC Large Scale Computing and Storage Requirements for Basic Energy Sciences Research
Workshop participants reached a consensus on several key findings, in addition to achieving the workshop’s goal of collecting and characterizing computing requirements. The key requirements for scientists conducting research in BES are:
Link to the report... (13.5MB)
- Larger allocations of computational resources;
- Continued support for standard application software packages;
- Adequate job turnaround time and throughput; and
- Guidance and support for using future computer architectures.
NERSC Large Scale Computing and Storage Requirements for High Energy Physics Research
In addition to achieving its goal of collecting and characterizing computing requirements, the workshop revealed several key points. The chief findings:
Link to the report... (8.6MB)
- Science teams need access to a significant increase in computational resources to meet their research goals;
- Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data;
- Science teams need guidance and support to implement their codes on future architectures; and
- Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints.
ASCAC Subcommittee Report: The Opportunities and Challenges of Exascale Computing
Computational modeling, simulation, prediction, and control at exascale offer the prospect of transformative progress in energy, national security, the environment, and our economy, and for fundamental scientific questions. Although the path to exascale necessarily involves numerous complex challenges, the almost-certain benefits far outweigh the costs.
Scientific Grand Challenges: Cross-Cutting Technologies for Computing at the Exascale Workshop
The Cross-cutting Technologies for Computing at the Exascale workshop report from the Grand Challenges Series of workshops. The workshop took place February 2-4, 2010 in Washington, DC.
Scientific Grand Challenges: Architectures and Technology for Extreme Scale Computing
The Architectures and Technology workshop report from the Grand Challenges Series of workshops. The workshop took place December 8-10, 2009 in San Diego, CA.
Scientific Grand Challenges: Discovery in Basic Energy Sciences: The Role of Computing at the Extreme Scale
The BES workshop report from the Grand Challenges Series of workshops. The workshop took place August 13-15, 2009 in Washington D.C.
NERSC Large Scale Computing and Storage Requirements for Biological and Environmental Research
The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. Chief among them: scientific progress in BER-funded research is limited by current allocations of computational resources. Additionally, growth in mission-critical computing – combined with new requirements for collaborative data manipulation and analysis – will demand ever increasing computing, storage, network, visualization, reliability and service richness from NERSC.
Link to the report... (1.7MB)
Workshop on Computational Science and Chemistry for Innovation
The workshop was held July 26-27, 2010 in Bethesda MD to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation.
The NNSA workshop report from the Grand Challenges Series of workshops. This workshop was held October 6-8, 2009 in Washington D.C.
Workshop report from a panel of 12 scientists and engineers with experience in government, universities, national labs and industry. The panel met January 2010 in Washington, D.C. to review reports prepared to document the need for a new generation of extreme-computing capability for the DOE's missions.
Scientific Grand Challenges: Fusion Energy Sciences and the Role of Computing at the Extreme Scale
Fusion Energy Workshop report from the Grand Challenges Series of Workshops that took place in Washington, DC, March 18-20, 2009.
Scientific Grand Challenges: Opportunities in Biology at the Extreme Scale of Computing
Workshop report from the Grand Challenges Series of Workshops that took place in Chicago, IL, August 17-19, 2009.
Scientific Grand Challenges: Challenges for Understanding the Quantum Universe and the Role of Computing at the Extreme Scale
Workshop report from the Grand Challenges Series of Workshops that took place at SLAC in Menlo Park, CA, December 9-11, 2008.
Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of Computing at the Extreme Scale
Workshop report from the Grand Challenges Series of Workshops that took place in Washington DC, January 26-28, 2009.
International Exascale Software Project - A Roadmap
Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan
This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE.
Scientific Grand Challenges: Science Based Nuclear Energy Systems Enabled by Advanced Modeling and Simulation at the Extreme Scale
Workshop report from the Grand Challenges Series of Workshops that took place in Washington DC May 11-12, 2009.
Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale
Final report from the first of the Grand Challenges Series of Workshops that took place in Washington DC November 6-7, 2008.
Workshop on Computer Science/Applied Math Institutes and High Risk / High Payoff Technologies for Applications
The workshop focused on two tracks. One was how the Applied Mathematics and Computer Science communities could help make potential high-risk/high-payoff areas in their application domains a reality. Since the concept of joint math and CS institutes is a new one, the second track focused on how the communities could create, develop, and manage joint CS/Math Institutes.
MidRange Computing in Support of Science at Office of Science Labs
A report containing the summary of the status of midrange computing efforts at the ten Office of Science Laboratories.
A Scientific Research & Development Approach to Cyber Security
A community of scientists, technical experts and executives from DOE National Labs, universities, other Federal agencies and industry collaborated to develop a science-based, systems level case for a new, transformational approach to cyber security. The report identifies and analyzes R&D opportunities from that systems-level perspective.
National Science and Technology Council - Federal Plan for Advanced Networking Research and Development
Report by the Interagency Task Force on Advanced Networking Research and Development.
Report of The Panel on Recent Significant Advancements in Computational Science
A report from a distinguished panel charged with identifying recent breakthroughs in computational science and enabling technologies, supported by ASCR through the INCITE program, the SciDAC program, and/or its base program.
Link to The SciDAC Review. The SciDAC Review is a quarterly magazine that shares SciDAC projects, news, and achievements.
Supercomputing: The New Secret Weapon
Supercomputing is doing everything from designing high-performance bathing suits for Olympic swimmers to developing plants that can cope with all types of environmental problems.
Applied Mathematics at the U.S. Department of Energy: Past, Present and a view to the Future
A report by an independent panel from the Applied Mathematics research community.
Mathematics for Analysis of Petascale Data Workshop Report
The workshop engaged mathematical scientists and applications researchers to identify the next-generation mathematical techniques needed to meet the challenges posed by petascale data.
Workshop Report on Advanced Networking for Distributed Petascale Science: R&D Challenges and Opportunities
The workshop brought together leading network researchers in optical transport, middleware, and high-performance protocols. Their charge was to develop a high-level roadmap for the network research and development (R&D) that will be required to support DOE's distributed Petascale science over the next decade.
Scientific Impacts and Opportunities in Computing Workshop Report
This workshop was conducted for the Office of Advanced Scientific Computing Research to identify high impact opportunities in computing for investment in research to maintain the nation's preeminence in scientific discovery and competitiveness.
Modeling and Simulation at the Exascale for Energy and the Environment Town Hall Meetings Report
The objective of this ten-year vision is to focus the computational science experience gained over the past ten years on the opportunities introduced by exascale computing to revolutionize our approaches to global challenges in energy, environmental sustainability, and security. For more information, see the ASCR Scientific Grand Challenges.
Computational Research Needs in Alternative and Renewable Energy
Final report from the workshop on Computational Research Needs in Alternative and Renewable Energy held in Rockville, Maryland September 19 and 20, 2007.
SciDAC Highlighted in CTWatch
The articles of this issue of CTWatch Quarterly described the connection between scalable software technology and breakthrough science. Each article offers an informative and stimulating discussion of some of the major work being carried out by one of the Centers for Enabling Technologies (CET) of the Department of Energy's wide ranging and influential SciDAC program.
Advanced Scientific Computing Research: Delivering Computing for the Frontiers of Science - Facilities Division Strategic Plan for High Performance Computing Resources
The strategic vision for High Performance Computing (HPC) resources in the Facilities Division of the Office of Advanced Scientific Computing Research (ASCR) program in the Department of Energy's Office of Science for the next 10 years.
Visualization and Knowledge Discovery: Report from the DOE/ASCR Workshop on Visual Analysis and Data Exploration at Extreme Scale
Existing visualization and data exploration tools have served admirably with gigabyte and even terabyte datasets, but at the peta- and exascale levels, those tools will no longer suffice. Scientists and researchers met under the auspices of ASCR in Salt Lake City on June 7-8, 2007 to discuss the coming "data tsunami" and issues involved in data exploration, data understanding, and data visualization at the petascale and beyond.
Software Development Tools for Petascale Computing Workshop
The findings generated at the Software Development Tools for PetaScale Computing (SDTPC) Workshop held in Washington, D.C. on August 1 and 2, 2007 are in this final report.
Final Report from the Cyber Security Research Needs for Open Science
These are the results of Priority Research Directions (PRDs) identified during the two-day workshop held on July 23 & 24, 2007. The workshop was jointly sponsored by the DOE Office of Science and Office of Electricity Delivery and Energy Reliability. Participation included representation from national labs, higher education and industry. Participants self-identified their interests into five break-out groups, each of which was charged with identifying PRDs.
Computational Subsurface Sciences Workshop Report
The workshop, held in Bethesda, Maryland, January 9-12, 2007, gathered community input on computational science research needs and opportunities in the subsurface sciences and related areas, with a focus on developing a next generation of numerical models of subsurface flow and process simulation. Collaborating DOE offices were SC, EM, FE, and RW.
Multiscale Mathematics Initiative: A Roadmap
This was the third of three DOE sponsored workshops. It was held in Portland, Oregon September 21-23, 2004, to consider the scientific needs and mathematical challenges for multiscale simulation. This report represents the important conclusions, themes and recommendations for DOE investments from all three workshops.
Final Report Second DOE Workshop on Multiscale Problems
The Second DOE Workshop on Multiscale Problems was held from July 20 to July 22, 2004 in Broomfield, Colorado. During these three days, over eighty researchers with expertise in a wide variety of engineering, mathematical and scientific fields gathered to discuss the current state of mathematical methods for multiscale problems, possible future directions for research, and ways in which the Department of Energy could best support such activity.
Report of the First Multiscale Mathematics Workshop: First Steps toward a Roadmap
Some of the nation's leading computational scientists gathered in Washington, D.C. May 3-5, 2004 to consider the scientific needs and mathematical challenges for multiscale simulation. The goals were to (1) identify the most compelling scientific applications facing major roadblocks due to multiscale modeling needs and (2) formulate a strategic plan for investment in multiscale mathematics research that will meet these needs.
Report on the Mathematical Research Challenges in Optimization of Complex Systems
The goal of the workshop was to articulate opportunities for mathematical research relevant to DOE applied science and technology programs, in mathematical areas that are not already a major part of the DOE applied mathematics research portfolio.
Report on the Workshop on Simulation and Modeling for Advanced Nuclear Energy Systems
DOE's Office of Advanced Scientific Computing Research (ASCR) and Office of Nuclear Energy (NE) co-sponsored a workshop to identify the research opportunities and priorities for advancing the use of simulation and modeling in the research and development of advanced nuclear energy.
Management Model for Delivering High Performance and Leadership Class Computing Systems for Scientific Discovery
The purpose of this document is to describe a model for managing the activities and tasks associated with the operation of High Performance Computing Facilities (HPCF) that enable scientific discovery as well as the delivery of next generation High Performance Production Computing (HPPC) and Leadership Class Computing (LCC) resources to the HPCF to meet established cost, schedule and performance objectives. December 2006
The DOE Office of Science Data-Management Workshops were held March-May 2004. A number of application scientists and computer scientists at these meetings came to the conclusion that the plan as presented was dangerously light on attention to data management, given the increasingly data-intensive nature of research supported by the Office of Science. This report documents these workshops.
Guide to the NITRD
Supplement to the President's Budget for FY05 for the Networking and Information Technology Research and Development (NITRD) Program. Prepared by the Interagency Working Group on Information Technology Research and Development of the National Science and Technology Council, the Supplement describes accomplishments and proposed activities of the Federal agencies that work collaboratively on R&D in advanced computing, networking, software, and related information technologies to support Federal missions and help maintain U.S. leadership in cutting-edge science, engineering, and technology.
Federal Plan for High-End Computing
Report of the High-End Computing Revitalization Task Force.