SciDAC Projects

The Scientific Discovery through Advanced Computing (SciDAC) program was initiated in 2001 as a partnership involving all of the Office of Science (SC) program offices. Its goal is to dramatically accelerate progress in scientific computing, delivering breakthrough scientific results through partnerships of applied mathematicians, computer scientists, and scientists from other disciplines. The SciDAC program was re-competed in 2006 and again in 2011, when the partnerships were extended to include the DOE National Nuclear Security Administration (NNSA). Through partnerships with ASCR-funded mathematicians and computer scientists, SciDAC applications have pursued computational solutions to challenging problems in climate science, fusion research, high energy physics, nuclear physics, astrophysics, materials science, chemistry, particle accelerators, biology, and the reactive subsurface flow of contaminants through groundwater. Today the SciDAC program is recognized as the leader in accelerating the use of high-performance computing to advance the state of knowledge in science applications.

Scalable Data Management Analysis and Visualization Institute (SDAV)

DOE SciDAC Institutes

ORNL team members: Scott Klasky

The institute provides comprehensive expertise in scientific data management, analysis, and visualization, aimed at transferring state-of-the-art techniques into operational use by application scientists on leadership-class computing facilities over the next five years. ORNL's team works directly with application scientists, applying the best available tools and technologies and learning from the scientists where those tools fall short. Technical solutions to any shortcomings are implemented to ensure that the tools overcome mission-critical challenges in the scientific discovery process, and the tools are then further developed and improved as the computing platforms change over the next five years. As an example, ORNL's team created the ADIOS I/O framework, has worked with over 20 other teams to continually improve the software, and released ADIOS 1.4 this past summer.
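A central idea behind I/O middleware like ADIOS is that the application writes variables through a small, fixed interface while the transport mechanism (POSIX files, aggregated MPI writes, in-memory staging, and so on) is selected by configuration rather than by code changes. The following is a toy conceptual sketch of that separation; the class and method names are illustrative inventions, not the real ADIOS API.

```python
# Conceptual sketch only -- NOT the ADIOS API. It illustrates the design
# principle: the simulation calls a fixed write interface, and the
# transport is chosen by configuration, so switching I/O strategies
# requires no changes to simulation code.

class ToyWriter:
    """Minimal stand-in for a configurable I/O layer."""

    def __init__(self, transport="posix"):
        self.transport = transport      # chosen by config, not by code
        self.variables = {}

    def write(self, name, value):
        # The simulation only ever calls write(); how and where the
        # bytes actually move is the middleware's concern.
        self.variables[name] = value

    def close(self):
        # A real transport would flush to disk or a staging area here.
        return {"transport": self.transport, "vars": dict(self.variables)}

# The same "simulation" code works unchanged with a different transport:
w = ToyWriter(transport="staging")
w.write("temperature", [300.0, 301.5])
w.write("step", 42)
output = w.close()
```

The value of this pattern at leadership scale is that application teams can adopt new storage or staging back ends as platforms evolve without touching their science code.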

CSMD Group leader Scott Klasky serves on the executive committee of SDAV, led by Arie Shoshani (LBL). Scott is the ORNL lead and the leader of the Data Management track of the institute. For more information, please visit http://www.sdav-scidac.org.

Plasma Surface Interactions (PSI)

DOE FES SciDAC Institutes

ORNL team members: David E. Bernholdt, Jay Jay Billings, John Canik, Jeremy Meredith, Philip C. Roth, Roger Stoller, Stanislav Golubov, and Brian Wirth

The Plasma Surface Interactions (PSI) project is one of two new fusion-related projects getting underway as part of the SciDAC-3 program. The project will develop and use predictive tools to describe the evolution of plasma-facing materials. This understanding is crucial to improved predictions of the performance of plasma-facing components, which are needed to ensure magnetic fusion energy development beyond ITER. The project is led by Brian Wirth of ORNL and the University of Tennessee and also includes ANL, General Atomics, LANL, PNNL, UC San Diego, UIUC, and UMass Amherst. The ORNL portion of the team spans several divisions: Wirth is part of CASL, and the Fusion Energy and Materials Science and Technology Divisions are represented in addition to CSMD.

CSMD researcher David Bernholdt is the overall technical lead for the computer science and applied math aspects of the project. Jeremy Meredith and Phil Roth contribute their expertise in visualization and data analytics and in performance, resilience, and energy efficiency, connecting the project to two of the SciDAC Institutes in which they also participate. Jay Jay Billings is the software architect and lead developer for the major new simulation code that the project will develop. For more information, please visit PSI's site at http://collab.mcs.anl.gov/display/PSIscidac.

SciDAC Edge Physics Simulation Project

DOE SciDAC Institutes

ORNL team members: Scott Klasky

Fusion simulation is moving toward building a next-generation numerical experiment based upon first-principles physics. One focus of the SciDAC Edge Physics Simulation (EPSI) project is to develop an advanced computational framework that can support tight coupling between large-scale kinetic multi-scale physics systems; support in-situ uncertainty quantification (UQ) and verification and validation (V&V); and enable pluggable analysis and visualization services that can ingest and assimilate large volumes of in-situ data. Specifically, the EPSI team is addressing key data challenges by augmenting ADIOS with DataSpaces, to enable hybrid staging and in-situ/in-transit data processing workflows, and with the eSiMon dashboard, for pervasive access to the next-generation numerical fusion experiment.
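The in-transit pattern can be sketched in miniature: the simulation hands each timestep's data to a staging area, and an analysis service consumes staged data concurrently instead of reading files back from disk. This is a hedged conceptual sketch, assuming a simple in-memory queue as a stand-in for a DataSpaces-style staging area; none of the names below are the actual ADIOS or DataSpaces API.

```python
# Conceptual sketch of in-transit processing: producer (simulation) and
# consumer (analysis) overlap in time, communicating through a staging
# area rather than the file system. A thread-safe queue stands in for
# the staging service.
import queue
import threading

staging = queue.Queue()          # stand-in for a staging area
results = []

def analysis_service():
    # In-transit analysis: consume staged timesteps as they arrive.
    while True:
        step, field = staging.get()
        if step is None:         # sentinel: simulation finished
            break
        # Example reduction performed without touching disk:
        results.append((step, sum(field) / len(field)))

t = threading.Thread(target=analysis_service)
t.start()

# "Simulation" loop: produce a field each timestep and stage it.
for step in range(3):
    field = [float(step + i) for i in range(4)]
    staging.put((step, field))
staging.put((None, None))        # signal completion
t.join()
```

The payoff of this arrangement is that analysis latency and file-system pressure are decoupled from the simulation's critical path, which is the motivation for hybrid staging in workflows at this scale.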

CSMD Group leader Scott Klasky serves on the executive committee and leads the computer science effort for EPSI.

PISCEES

DOE SciDAC Institutes, BER Partnership

ORNL team members: Kate Evans, Matt Norman, and Pat Worley

At Oak Ridge, researchers on PISCEES are focusing on the verification and validation aspects of the new dynamical cores being implemented in CISM. First, a software package will be developed to assess the current dynamical core of CISM that is being implemented within the next release of the CESM. This will provide a basic level of testing and evaluation that can be extended to the next-generation developments occurring within the project. Using recently released observational datasets of ice sheet flow (e.g., Rignot et al., 2012, Science; Rignot and Mouginot, 2012, GRL), ORNL researchers will work with collaborators within PISCEES to incorporate these datasets both as initial conditions and as validation benchmarks, giving more confidence in the recently developed high-resolution ice sheet capability. Verification and validation of the performance of the high-resolution ice sheet configuration will help identify performance issues and bottlenecks. As with validation, performance assessments will also provide a quantitative measure of the value of model development in terms of its impact on model throughput.

This project is supported by the US Department of Energy's Office of Science under the Scientific Discovery through Advanced Computing (SciDAC) program. It is a joint effort between the Biological and Environmental Research (BER) program office and the Advanced Scientific Computing Research (ASCR) program office. This project benefits from participation in three SciDAC Institutes. The FASTMath Institute will collaborate on scalable solver algorithms for ice sheet models. Researchers from the QUEST Institute will work to apply new uncertainty quantification techniques for ice sheet model projections and parameter estimation. Finally, the SUPER Institute will help ensure that the new ice sheet models perform well on current and advanced computer architectures.

Multiscale Methods for Accurate, Efficient, and Scale-Aware Models of the Earth System

DOE SciDAC Institutes, BER Partnership

ORNL team members: Jim Hack, PI, Rick Archibald, Chris Baker, Kate Evans, and Jennifer Ribbeck (Bredesen Center student)

Some of the greatest challenges in projecting the future of the Earth's climate result from the significant and complex interactions among small-scale features and large-scale structures of the ocean and atmosphere. In order to advance Earth system science, a new generation of models that capture the structure and evolution of the climate system across a broad range of spatial and temporal scales is required. Our primary goal is to produce better models for these critical processes and constituents from ocean-eddy and cloud-system to global scales through improved physical and computational implementations.

The Multiscale Methods project brings together an integrated team of climate and computational scientists to accelerate the development and integration of multiscale atmospheric and oceanic parameterizations into the Community Earth System Model (CESM). Our primary objective is to introduce accurate and computationally efficient treatments of interactive clouds, convection, and eddies into the next generation of CESM at resolutions approaching the characteristic scales of these structures. This project will deliver treatments of these processes and constituents that are scientifically useful over resolutions ranging from 2 to 1/16 degrees.

The Oak Ridge portion of this project focuses on the development of methods to accurately and efficiently incorporate these multiscale features. Time-stepping algorithms that maintain efficiency and robustness for a wide range of spatial configurations have been implemented in a shallow-water version of the Community Atmosphere Model (CAM) dynamical core and show demonstrable accuracy and robustness for time steps many times greater than the CFL limit of the smallest resolved features. Extension to the full CAM is underway, achieved by advancing the configuration of CAM to include third-party solver and parallel-strategy libraries. This approach minimizes disruption for climate model developers; opens the door to incorporating the emulation and sub-grid PDF generators needed to fully quantify the nature and sensitivity of the multiscale model; and allows related efforts to accelerate CAM on hybrid computing systems to hook into ongoing community developments occurring within the third-party libraries. We outline a strategy to merge these separate but related efforts and to enable accelerated time-stepping methods on hybrid architectures that keep pace with community algorithmic development.
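The principle behind stepping far beyond an explicit stability limit can be seen in a textbook miniature: for a stiff problem, an implicit (backward Euler) update remains stable at step sizes where an explicit (forward Euler) update diverges. This is a generic illustration of implicit time stepping, not the project's actual CAM algorithms; the model problem and parameter values below are chosen purely for demonstration.

```python
# Model problem: du/dt = -lam * u with lam = 100; the exact solution
# decays to zero. Forward Euler is stable only for dt < 2/lam = 0.02,
# so dt = 0.1 is five times past the explicit limit -- analogous to
# taking time steps beyond the CFL limit of the finest resolved scales.
lam, dt, nsteps = 100.0, 0.1, 50

u_exp = 1.0                                  # forward (explicit) Euler state
u_imp = 1.0                                  # backward (implicit) Euler state
for _ in range(nsteps):
    u_exp = u_exp + dt * (-lam * u_exp)      # explicit update: diverges here
    u_imp = u_imp / (1.0 + dt * lam)         # implicit update: decays stably
```

After 50 steps the explicit iterate has grown without bound while the implicit iterate has decayed toward zero, matching the true solution's behavior; the price of implicitness in a real dynamical core is the linear or nonlinear solve each step, which is where third-party solver libraries enter.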

Applying Computationally Efficient Schemes for BioGeochemical Cycles

DOE SciDAC Institutes, BER Partnership

ORNL team members: Forrest Hoffman (PI), Patrick Worley, Richard Mills, and Jitendra Kumar

The ACES4BGC project seeks to advance the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project will implement and optimize new computationally efficient tracer advection algorithms for large numbers of tracer species; add important biogeochemical interactions between the atmosphere, land, and ocean models; and apply uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system. The resulting improvements to the Community Earth System Model (CESM) will deliver new scientific capabilities and significantly improve the representation of biogeochemical interactions at the canopy-to-atmosphere, rivers-to-coastal-oceans, and open-oceans-to-atmosphere interfaces. ACES4BGC partners modelers with decades of cumulative research experience and a team of computer and computational scientists building scalable solvers and tools, developing advanced UQ methods, and applying technologies for performance optimization through U.S. DOE SciDAC Institutes.
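One reason multi-tracer advection can be made efficient is that every tracer species is carried by the same velocity field, so the velocity-dependent work can be computed once per step and reused across all species. The sketch below illustrates that amortization with a first-order upwind scheme on a tiny periodic 1-D grid; it is a generic teaching example, not the project's actual advection algorithm or CESM code.

```python
# Toy multi-tracer upwind advection on a periodic 1-D grid. The upwind
# weights (1 - c) and c depend only on the shared velocity and grid, so
# they are computed once and applied to every tracer; adding a tracer
# adds only array arithmetic, not new flux computations.
nx, ntracers = 16, 8
c = 0.5                                     # Courant number vel*dt/dx, shared

# One list of rows holds every tracer: ntracers rows of nx cells each.
q = [[0.0] * nx for _ in range(ntracers)]
for t in range(ntracers):
    q[t][nx // 2] = float(t + 1)            # distinct initial pulses

total0 = [sum(row) for row in q]            # initial mass per tracer
for _ in range(10):
    # First-order upwind update applied uniformly to all tracers;
    # row[i - 1] wraps to row[-1], giving periodic boundaries.
    q = [[(1.0 - c) * row[i] + c * row[i - 1] for i in range(nx)]
         for row in q]

totals = [sum(row) for row in q]            # mass after advection
```

Because the scheme is conservative, each tracer's total mass is unchanged after the update loop; in a production model the same reuse idea applies to far more elaborate flux and limiter computations, which is where the cost savings for large tracer counts come from.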