Report of Panel 1: Materials and Geophysics

Cover Page

Computational Science Initiative

Office of Energy Research
 
 
 
Report of Panel 1

Materials and Geophysics

March 9, 1998
 
 
 
 
 
Panel met January 26-27, 1998, at
Gaithersburg Hilton, MD
Chair: John W Wilkins
Lead Office: Basic Energy Sciences

Panel 1 Participants: Materials and Geophysics

Jeffrey N Brooks Argonne National Laboratory Technology Development
Daryl C Chrzan University of California Berkeley Materials Science and Mineral Engineering
Stephen Foiles Sandia National Laboratories, Livermore
Arthur J Freeman Northwestern University Department of Physics and Astronomy
Bruce N Harmon Iowa State University Ames Laboratory
Anthony C Hess Pacific Northwest National Laboratory Materials Resources
Richard G Hoagland Washington State University Department of Mechanical and Material Sciences
William Klein Boston University Center for Computational Science
Kenneth Larner Colorado School of Mines Department of Geophysics
Ki Ha Lee Lawrence Berkeley National Laboratory Earth Sciences Division
Larry R Myer Lawrence Berkeley National Laboratory Earth Sciences Division
John J Rehr University of Washington Department of Physics
William A Shelton Oak Ridge National Laboratory Mathematical Sciences Section
Roger E Stoller Oak Ridge National Laboratory Metals and Ceramics Division
Priya Vashishta Louisiana State University Concurrent Computing Laboratory for Materials Simulation
Art F Voter Los Alamos National Laboratory Theory Division
John W Wilkins Ohio State University Department of Physics
Dieter Wolf Argonne National Laboratory Materials Science Division
David Yuen University of Minnesota Department of Geology and Geophysics
DOE Staff
Steve Ebert Office of Computational and Technology Research Mathematical, Information, and Computational Sciences Division
Manfred Leiser Office of Basic Energy Sciences Materials Sciences Division
Iran L Thomas Office of Basic Energy Sciences Deputy Associate Director
Nicholas B Woodward Office of Basic Energy Sciences Geosciences Research Program

Executive Summary

Need for teraflop computing. The next major step in materials sciences and geosciences is to link the behavior of matter at the smallest sizes, the microscale, to the description of matter at practical sizes, the macroscale. The linkage between these two extremes of size and time involves complex phenomena at intermediate ranges, the mesoscale.

Typically, simulation at the mesoscale must resolve both space and time with microscale mesh points. A perfect algorithm for the time evolution of such a mesoscale system would require 10^(17) operations -- a few days on a teraflop machine. While individual problems vary in complexity, a substantial enhancement in our ability to understand, predict, or control material and geological phenomena requires extensive teraflop capacity.

Though the computations are demanding, the basic microscopic physics is largely understood, so simulating the mesoscale depends most strongly on the development of efficient algorithms. Reaching the macroscale, however, involves both conceptual and computational steps.

Materials. The mesoscale can be simplistically described as composed of single-crystal grains threaded with extended defects and seeded with impurities. The motion of the grains in the network of extended defects, restrained by the impurities, determines the macroscale properties. The fundamental problem is tailoring the mesoscale to achieve new materials, and doing so involves a wide range of materials problems.

Tailor the mesoscale:

Sample problems.

Geosciences. The nature of the mesoscale varies with the problem, necessitating more specific tailoring of methods. Even more important are the larger data bases and the wider range of length and time scales. Important problems include:

Computational issues. In addition to the demonstrated need for substantial teraflop capacity, several cross-cutting issues must be solved for efficient, massively parallel computing.

Technical Summary

Toward a Better Understanding of the Materials that Make Our Earth Interesting

Understanding the earth -- geosciences -- and the materials derived from it -- materials sciences -- is central to our civilization and its continuation. This understanding must extend to all size and time scales to develop new materials and improve existing ones, to find new energy and material resources, and to protect against earthquakes and related calamities, all while protecting the earth's environment. This is a difficult goal to achieve because the behavior of matter at some of the smallest sizes, the microscale, must be incorporated as an integral part of the description of matter at the largest levels, the macroscale. The linkage between these two extremes of size and time involves phenomena at intermediate ranges, the mesoscale. The following table shows the magnitudes of length and time associated with these three scales in the materials sciences and in the geosciences.

Typical values of the micro-, meso-, and macro- length and time scales for materials sciences and geosciences. These representative scales emphasize the similarities between the two fields; they will vary from problem to problem. For example, chemical reactions that affect geological materials must be simulated on the atomic length scale.
               LENGTH (meters)              TIME (seconds)
               materials    geosciences     materials    geosciences
microscale     10^(-10)     10^(-8)         10^(-14)     10^(-12)
mesoscale      10^(-6)      10^(-2)         10^(-10)     10^(-2)
macroscale     10^(-3)      10^(+4)         10^(-6)      10^(+7)
Linking information across the extremes in this table is a critical step, but making that step is hampered by problems in providing adequate descriptions at the mesoscale. Computer simulation of behavior in the mesoscale regime offers opportunities to create such descriptions, but these simulations make enormously greater demands on computational hardware and software.

A simple example for materials sets the scale of computational needs. Representing the three-dimensional mesoscale by atoms on the microscale requires 10^(12) atoms. Tracking the time evolution of these atoms requires computations that scale at best linearly in the number of atoms at each time step. If the minimum time step is one tenth the microscale time, then the minimum number of operations to reach the meso time scale exceeds 10^(17) -- corresponding to a few days on a teraflop computer. This most optimistic estimate already necessitates large-scale data base management and massively parallel computing, together with visualization tools to write/debug parallel code and to monitor on-line code performance and output.
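
The arithmetic behind this estimate can be laid out explicitly. The sketch below (in Python) uses the scale values from the table and the factor-of-ten time-step assumption from the text; the prefactor of a realistic force evaluation pushes the roughly one day computed here toward the few days quoted.

    # Back-of-envelope operation count for a mesoscale simulation at atomic
    # resolution (a sketch; scale values are taken from the table and text above).
    n_atoms = 1e12            # atoms needed to fill a mesoscale volume
    micro_time = 1e-14        # materials microscale time (seconds)
    meso_time = 1e-10         # materials mesoscale time (seconds)
    dt = micro_time / 10.0    # time step: one tenth of the microscale time
    n_steps = meso_time / dt  # ~1e5 time steps to reach the mesoscale
    ops = n_atoms * n_steps   # >= 1e17 operations, assuming O(N) force evaluation
    teraflop = 1e12           # operations per second
    print(f"{ops:.0e} operations ~ {ops / teraflop / 86400:.1f} days at one teraflop")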

Before discussing the important science opportunities and unique challenges that teraflop computing would offer the materials sciences and geosciences, there are four additional points that overlay the whole discussion.

Measurements. While there are excellent measurement tools for probing the microscale and macroscale, the tools for monitoring the mesoscale are limited so that simulation is essential for timely progress. Roughly speaking, current mesoscale probes take diffuse snapshots of the evolving mesoscale. Indeed simulations will stimulate and calibrate the development of tools for mesoscale measurement.

Multiscale simulation. The central concept is that a large-scale simulation at the microscale should lead to an effective model at the mesoscale; simulation at the mesoscale should lead to macroscopic behavior. That simulation at one scale develops the effective model and its parameters at the next scale is called multiscale simulation. The major conceptual barrier is the lack of an accepted procedure for using large-scale simulations at the microscale to develop accurate mesoscale models, including dissipative processes, with accurate estimates of the parameters in the model. Using a validated mesoscale model to generate macroscopic properties is less a conceptual challenge than a computational one.

Infrequent events. That infrequent events dominate important behavior (diffusion in materials, faulting in geosciences) necessitates exceptionally long-time simulation. For example, simulating diffusion with atomistic models requires evolving the system from femtoseconds to microseconds, that is, 10^(9) time steps. While this further emphasizes the need for teraflop computing, it has in one area -- molecular dynamics -- stimulated the development of new methods that speed up the dynamics and parallelize it. This initial effort needs considerable development, and analogous achievements are needed in other time-dependent simulations.
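
To make the gap concrete, the small sketch below (in Python) shows the raw step count implied above together with the idealized speedup of a replica-based acceleration scheme of the kind alluded to; the thousand-replica count and the assumption of rare, uncorrelated transitions are illustrative, not taken from the report.

    # Step-count arithmetic for the infrequent-event problem, plus the idealized
    # speedup of a replica-based acceleration scheme (a sketch; the linear
    # speedup assumes transitions are rare and uncorrelated).
    dt = 1e-15                # femtosecond molecular-dynamics time step
    t_target = 1e-6           # microsecond scale on which diffusion occurs
    n_steps = t_target / dt   # ~1e9 sequential time steps for direct MD
    print(f"direct MD needs ~{n_steps:.0e} steps")

    n_replicas = 1000         # independent replicas run in parallel (illustrative)
    print(f"idealized replica parallelism: ~{n_steps / n_replicas:.0e} steps per replica")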

Driven systems. A major intellectual step must be to move beyond simulation of static systems to finite-temperature, non-equilibrium ones. The real phenomena are dynamic: for example, plastic flow leading to fracture, or the flow of the earth's mantle. Indeed, the nature of the time-dependent "driving" -- its history -- can have dramatic effects: vibration-enhanced fracture, storms on a beach. Strong non-equilibrium effects and history-dependent perturbations must be faced in a comprehensive simulation initiative.

Materials:

The mesoscale or microstructure of materials consists of aggregates of small single-crystal grains joined together at interfaces -- grain boundaries -- threaded with extended displacements of the lattice structure -- such as dislocations -- and seeded with point-like defects -- such as vacancies and impurities. While the properties of these elements are determined by microscopic physics, the interplay between the elements determines macroscopic behavior. For example, atomic forces determine the equilibrium shapes of small grains, while the size of the grains partially determines the hardness of a material. In addition, hardness depends both on the dislocations restraining the relative motion of grains under stress and on defects impeding the dislocations.

The preeminent questions are: how can we tailor the mesoscale:

  1. to create materials with new and innovative properties;
  2. to extend the capabilities of existing materials; and
  3. to process materials cheaply and efficiently?

The conceptual advances needed to link micro and macro scales through the mesoscale are:

By tailoring the mesoscale, we mean not only understanding the elements of the mesoscale (e.g., grains, extended defects, and local defects) but also processing materials to achieve the microstructure necessary for the desired macroscopic properties.

Below, we offer examples to illustrate the enormous benefit such knowledge could achieve. The prime benefit in all examples is time: by using simulation to understand the micro-meso-macro connections, the 10-15 years currently required to take a new material from synthesis to product could be reduced to five years. In the US economy, this long development time is the principal barrier to new materials. If the US is to remain preeminent, simulations must dramatically speed up the development, improvement, and processing of materials.

Control of microstructure -- mesoscale elements such as grain size, dislocations, and defects -- is the primary goal of the engineering development of materials processing. Improved materials can have dramatic practical impact. In the US, about 80 billion beverage cans are produced each year. Improved processing focused on the empirical optimization of microstructure reduced the aluminum per can from 22 grams in the 1960's to 12 grams today, with a huge effect on both the economy and the environment. A high-tech example, where the empirical approach is very slow, is the development of superalloys -- high-strength alloys that exhibit high surface stability and resistance to high temperature and severe mechanical stress -- for use in the turbine blades of advanced turbine engines. Small amounts of cobalt or chromium dramatically affect the properties of nickel-aluminum alloys. Problems such as accurately modeling the effect of defect atoms at the microscale require a major enhancement in our simulation capability, as the following examples illustrate.

Magnets. Crafted microstructure is crucial for the ubiquitous high-quality permanent magnets found in familiar devices, for example, motors, transformers, generators, sensors, and computer hard drives. Research to improve the performance, efficiency, and affordability of magnetic materials is concentrated on new materials and new processing techniques for controlling microstructure. These efforts can be accelerated by accurate modeling of the temperature-dependent magnetic interactions at grain boundaries, secondary phases, and other defects. Accurate modeling will create the science base needed for the systematic control of magnetic domain-wall formation and motion. With teraflop-level computing, such modeling can be extended from thousand-atom spin-polarized first-principles simulations to million-atom accurate empirical tight-binding models. These calculations will in turn feed into the continuum modeling programs of the micromagnetics community. The improvement of magnetic materials is important for the economy, where devices containing magnets represent a market of over $100 billion/year.

Innovative materials. Mesoscale simulations would speed the development of the constitutive relations needed to predict macroscale behavior for materials such as fracture-resistant cast steels; toughened yet creep-resistant high-temperature ceramics and ceramic composites; and hard, corrosion-resistant coatings. A major challenge is to simulate the materials processing necessary to achieve a desired microstructure. Metallurgical phenomena such as grain growth significantly change the material microstructure during processing by altering the grain size and shape, the volume fraction and distribution of phases, and their crystallographic orientation.

Environmental effects. Long-time molecular dynamics could study dominant mesoscale effects on oxidation -- a major cause of aging, especially at high temperature or high stress. For example, oxidation embrittlement of ceramic matrix composites involves the ingress of oxygen through matrix cracks in the composite; the resulting oxidation drastically changes structural performance. A large-scale, atomistic molecular-dynamics simulation of oxidation in the vicinity of the cracks is required; the simulation needs to model the effects of high temperature and high oxygen pressure. But prior to this simulation, it is necessary to atomistically simulate the dynamic fracture of ceramic composites. The effects of highly nonlinear stress inhomogeneities and of plasticity due to dislocation emission and grain boundaries represent real challenges. For this latter example, we offer a tentative time-line which illustrates the coupled problems to be solved.

Time-line for environmental simulation above.
2001 Billion-atom molecular-dynamics simulation of mechanical behavior and fracture of silicon nitride in driven environments
2001 Above simulation connected to a finite-element elastic-theory continuum for a much larger system. This hybrid approach will be used in subsequent steps; it needs to be validated by trillion-atom simulations.
2005 Evolution of nanostructures to microstructures using atomistic/continuum hybrid simulations.
2008 Effects of oxidation and corrosion, including electronic degrees of freedom, by hybrid electronic and atomic-continuum approach on mechanical behavior and fracture. Interactive, immersive visualization essential for understanding time-evolving results.

Damaged materials. DOE interest in modeling damaged materials ranges from high-energy neutron irradiation of bulk materials to the fusion plasma interaction at the interior tokamak surface. In the case of neutron irradiation, the initial displacement cascade, occurring over 10^(-11) seconds and 10^(-8) meters, leads to substantial microstructural modification that over a period of weeks to years produces 50% volume-change void swelling and ductile-to-brittle transitions. Most current simulations are restricted to single-element metals modeled with effective-medium potentials. Teraflop computing is needed to treat the extension to alloys with more realistic potentials used to track the development of the microstructure to eventual long-term mechanical property transformations. The study of plasma-material interactions is a critical issue for magnetic fusion power development. Key concerns are boundary surface erosion by plasma particles, hydrogen and helium recycling, and plasma contamination. These require a wide range of integrated models/codes for edge plasma parameters and magnetic field geometry, sheath physics, molecular dynamics and/or binary-collision sputtering/reflection, material thermal and mechanical response, 3-D line radiation transport, and atomic and molecular processes of surface materials in the plasma. Successful modeling, at a cost that is a small fraction of the projected 5-10 billion dollars for prototype fusion power reactors, would substantially advance predictive capability and aid the choice and optimization of fusion surface materials and plasma regimes.

Methods.

Central to any modeling is the high-speed numerical integration of the equations of motion for atoms interacting with each other -- that is, molecular dynamics simulation. A full electronic structure calculation must occur at every time step to generate the forces on the atoms for the next time step. For insulators, there are empirical electronic structure methods that scale linearly in the number of atoms. For metals, the methods are not as efficient, but progress is being made with real-space methods which exploit localization. Further, there is progress in making self-consistency scale linearly with atom number, but the prefactor is very large. Accordingly, full electronic structure calculations would be used for only a few thousand atoms, those whose electronic degrees of freedom are essential to microscale understanding. In addition, many-body enhancements to the electronic structure calculations are being developed with real-space formulations that could run efficiently on parallel computers. These are essential for understanding optical processes that probe the excited states.
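
The time-integration loop at the heart of any molecular dynamics code is itself simple; the sketch below (in Python, reduced units) shows the standard velocity-Verlet scheme, with a Lennard-Jones pair force standing in for the first-principles or empirical forces discussed here. Every refinement above -- linear-scaling electronic structure, real-space methods, many-body corrections -- enters through the force routine.

    import numpy as np

    # Minimal velocity-Verlet molecular dynamics (a sketch; a Lennard-Jones pair
    # force in reduced units stands in for the forces discussed in the text).

    def lj_forces(pos):
        """Pairwise Lennard-Jones forces; O(N^2) for clarity, not the O(N) methods cited."""
        f = np.zeros_like(pos)
        n = len(pos)
        for i in range(n):
            for j in range(i + 1, n):
                r = pos[i] - pos[j]
                r2 = np.dot(r, r)
                inv6 = 1.0 / r2**3
                fij = 24.0 * (2.0 * inv6**2 - inv6) / r2 * r
                f[i] += fij
                f[j] -= fij
        return f

    def velocity_verlet(pos, vel, dt, n_steps):
        """Advance positions and velocities (unit masses) with velocity Verlet."""
        f = lj_forces(pos)
        for _ in range(n_steps):
            vel += 0.5 * dt * f
            pos += dt * vel
            f = lj_forces(pos)
            vel += 0.5 * dt * f
        return pos, vel

    # Example: eight atoms on a slightly perturbed cubic lattice.
    rng = np.random.default_rng(0)
    pos = 1.5 * np.array([[x, y, z] for x in range(2) for y in range(2) for z in range(2)],
                         dtype=float) + 0.01 * rng.standard_normal((8, 3))
    vel = np.zeros_like(pos)
    pos, vel = velocity_verlet(pos, vel, dt=0.005, n_steps=100)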

For increasing numbers of atoms, first-principles methods must give way to well-calibrated and physically motivated effective potentials. There is currently a hierarchy of approaches ranging from tight binding to classical potentials and ultimately elasticity theory. Over the last couple of decades, substantial effort has been focused on developing such models for single elements or small groups of chemically related elements. There do not currently exist generally accepted, well-tested approximate representations of the interatomic interactions that can be used for arbitrary combinations of elements. The development of such a broad-based set of interactions will require long-term theoretical work. It is essential, however, if computational modeling is to address geometrically complex processes where alloying effects and impurities are important.
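
To make the tight-binding rung of this hierarchy concrete, the minimal sketch below (in Python) builds a single-orbital nearest-neighbor chain with one assumed hopping parameter and computes a band energy by dense diagonalization -- exactly the O(N^3) step that the linear-scaling methods cited above are designed to avoid. Real empirical tight-binding schemes fit many such parameters to first-principles or experimental data.

    import numpy as np

    # Minimal empirical tight-binding model: one orbital per atom on a chain with
    # nearest-neighbor hopping t (an illustrative parameter, not a fitted value).
    n_atoms = 100
    t = -1.0                                 # hopping integral, arbitrary energy units
    H = np.zeros((n_atoms, n_atoms))
    for i in range(n_atoms - 1):
        H[i, i + 1] = H[i + 1, i] = t        # couple neighboring atoms

    eigvals = np.linalg.eigvalsh(H)          # dense O(N^3) diagonalization
    n_electrons = n_atoms                    # half filling, spin-degenerate orbitals
    band_energy = 2.0 * eigvals[: n_electrons // 2].sum()
    print(f"band energy per atom: {band_energy / n_atoms:.3f}")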

At the mesoscale, molecular dynamics must track the response of the microstructure to internal strains, interactions between defects and dislocations, and external mechanical and thermal forces. One example will illustrate the type of effort required. Dislocations -- extended defects in the perfect crystal -- have already been cited as playing a central role in the mechanical properties of real materials. A million-atom simulation on a 100 Mflop machine could give information about the interaction between pairs of dislocations. This atomic-scale data is not directly translatable into predictions of large-scale behavior but requires intermediate models involving realistic dislocation densities of order 10^(12) per m^(2), necessitating super-teraflop computations. One would then hope to develop a rule-based description that could be used to predict mechanical failure and the response to temperature, stress, and environment so critical to the safety of manufactured structures. Systematic mesoscale calculations -- only possible when such calculations are routine -- would build up the data base needed to establish validated phenomenological models with realistic parameters.

Geosciences:

Geoscience shares many of the opportunities and problems of materials. At the same time, the mesoscale of the crust and upper mantle involves many complex rock types and discontinuities at every scale, resulting from a long-term, non-equilibrium evolution. Consequently, geoscience is distinguished by two aspects:

  1. additional computational requirements: a vastly larger range of length/time scales and significantly greater data storage; and
  2. a nature of computation that depends more strongly on the particular problem.

For this second reason, the discussion is organized by problem: exploration/extraction of resources; environmental geosciences; earthquake and crustal dynamics; and volcanic processes.

Energy and mineral resources. The obvious value of "inverting" seismic and electromagnetic data to image energy and mineral resources poses several significant scientific challenges: (1) multi-terabyte data sets; (2) realistic seismic and electromagnetic models that include shear waves, fluid-filled porous media, anisotropy, and discontinuities on all length scales; and (3) effective use of joint seismic and electromagnetic data sets. To this must be added the task of geomechanical design for the essential extraction of resources. The stability of wells and mines and the efficient production of reservoir fluids depend on engineering designs based on macroscale geomechanical properties, which are seldom directly measurable but must themselves be derived from large-scale mesoscale simulations.

Seismic imaging. A typical offshore seismic survey yields ten terabytes of data, which must be "inverted" to image the overall structure, its strata, the rock formations, and porosity. Even using simplistic models for seismic propagation -- ignoring the transverse modes, attenuation, and heterogeneity of propagation -- typically requires 10^(17) operations and now takes half a year to perform. Thus the need for teraflop computing is clear. More realistic computations must face the extreme heterogeneity of earth materials, especially discontinuities at all scales. Roughly, these discontinuities at the micro-, meso-, and macroscales are: grain boundaries and cracks; fractures and joints; tectonic faults and plate boundaries. Such simulations would increase computational requirements 100-fold.
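
A minimal sketch of the "simplistic" forward model referred to above -- the constant-density acoustic wave equation on a two-dimensional grid with second-order finite differences -- is given below (in Python; the layered velocity model, grid, and impulsive source are illustrative, and absorbing boundaries are omitted, so the edges wrap around). Scaling such a kernel to three dimensions, more realistic physics, and terabyte data volumes is what drives the operation counts quoted.

    import numpy as np

    # Minimal 2-D constant-density acoustic wave propagation (a sketch; the
    # velocity model and source are illustrative).
    nx, nz = 200, 200
    dx = 10.0                      # grid spacing (m)
    c = np.full((nz, nx), 2000.0)  # P-wave velocity (m/s)
    c[100:, :] = 3000.0            # faster half-space below 1 km depth
    dt = 0.4 * dx / c.max()        # time step satisfying the CFL stability limit

    p_old = np.zeros((nz, nx))
    p = np.zeros((nz, nx))
    p[5, nx // 2] = 1.0            # impulsive source near the surface

    for _ in range(500):
        lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
               np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p) / dx**2
        p_new = 2.0 * p - p_old + (c * dt) ** 2 * lap
        p_old, p = p, p_new

    print("max |p| after 500 steps:", np.abs(p).max())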

Combined seismic and electromagnetic imaging. Electromagnetic wave propagation more directly samples the ionic conductivity of fluids in rocks, which depends on the porosity. Given a larger experimental base of rock properties, joint seismic and electromagnetic surveys could yield better estimates of subsurface hydrology and of changes in reserves. Statistical analysis of many such surveys could provide "site-specific" empirical relationships between geophysical and hydrological parameters. Dramatically increasing the accuracy of the joint inversion of seismic and electromagnetic data requires major theoretical and algorithmic advances in modeling acoustic and electromagnetic wave propagation.

Environmental geosciences. The actual and potential environmental impact of organic, inorganic, and radioactive waste necessitates predictive computer models that account for physical/chemical events over enormous length/time scales in complex subsurface environments. An unplanned release of wastes at any site raises three fundamental needs: (i) to understand the transport and chemical transformation of the contaminant species through the subsurface environment; (ii) to estimate the probability that the contaminants will degrade the biosphere; and (iii) to formulate successful remediation strategies. Predictive computer models are central to risk assessment and restoration activity. A critical component of any model is an accurate description of the physical characteristics of the subsurface environment. Understanding near-surface fluid flow -- the single most important transport mechanism for toxic chemicals -- requires combining laboratory experiments on rock and soil samples with seismic and electromagnetic surveys.

Earthquake and crustal dynamics. Providing a basis for risk assessment of earthquakes involves modeling nonlinear behavior over long time scales at four length scales: (i) rock friction and fracture, (ii) microcracks and fluid flow, (iii) single faults, and (iv) fault systems. Each presents significant challenges. The microscopic theory of friction is in its infancy; the complex rock chemistry requires molecular dynamics modeling with ab initio electronic calculations compared against experimental data. The formation and coalescence of microcracks, the dynamics of fracture, and the effect of fluids require a set of effective potentials for molecular dynamics modeling to generate the needed mesoscale models. Single faults are currently modeled with slider-block models -- two-dimensional arrays of blocks connected by springs and resting on a friction surface. The simplistic rules used to govern how these blocks respond to an external force, such as a seismic wave, yield complex failure behavior. A mesoscale justification for such models is sorely needed. The modeling of an entire fault system must also include thermal-mechanical effects and the extreme heterogeneity and anisotropy. Modeling on time scales beyond a few years must include viscous rheology with strong nonlinearities. Computers capable of 10^(15) flops are needed to model earthquake faults on the time scales needed to develop a realistic understanding of fault evolution. A by-product of this effort -- effective fault-fault interactions -- can be used to determine the role of fault-system seismicity in the earthquake distribution of individual faults.
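
A cellular-automaton caricature of the slider-block models described above is sketched below (in Python; the lattice size, threshold, and stress-redistribution fraction are illustrative choices, not values from the report). Even these simplistic rules -- uniform loading, threshold failure, stress passed to neighbors -- produce the complex cascades of failures noted in the text.

    import numpy as np

    # Cellular-automaton caricature of a 2-D slider-block fault model (a sketch;
    # all parameters are illustrative). Blocks are loaded uniformly; a block that
    # reaches threshold fails and passes a fraction of its stress to its neighbors.
    L = 64
    alpha = 0.2                    # fraction of stress passed to each of 4 neighbors
    threshold = 1.0
    rng = np.random.default_rng(1)
    stress = rng.uniform(0.0, threshold, (L, L))
    avalanche_sizes = []

    for event in range(2000):
        stress += threshold - stress.max()   # uniform tectonic loading to next failure
        failing = stress >= threshold
        size = 0
        while failing.any():
            size += int(failing.sum())
            release = np.where(failing, stress, 0.0)
            stress[failing] = 0.0
            for axis, shift in ((0, 1), (0, -1), (1, 1), (1, -1)):
                stress += alpha * np.roll(release, shift, axis)   # periodic boundaries
            failing = stress >= threshold
        avalanche_sizes.append(size)

    print("largest avalanche:", max(avalanche_sizes), "blocks")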

Volcanic processes. The distinct modeling challenges of volcanic flows arise from nonlinearity, multiple components, and extreme ranges of properties. Nonlinearity arises from the coupling of the equations involving chemistry and physics as well as from the nonlinear dependence of the physical properties. The flows, with speeds ranging from cm/year to a hundred km/sec, involve all of the phases, with many discrete components. The viscosity can range from that of silicone oil, for magmas flowing down the slopes, to that of hot granular material. The length scales cover 0.1 mm to kilometers. The time scales -- from hours to thousands of years -- range from the Mt. St. Helens eruption to the long-term build-up of the Hawaiian Islands. Studying the dynamic complexity of the thermal-mechanical evolution of volcanic dynamics over these multiple space and time scales can easily involve terabyte data sets.

Computational issues:

Scalable sparse matrix operations. Locality of interactions in many models -- for example, tight-binding potentials -- calls for advances in handling parallel sparse matrix-matrix operations. Matrix operations involving only subsets of the indices of the objects require advances in optimizer schemes to produce efficient parallel code. For example, some of the indices could involve fast-Fourier transforms, and the optimizers must be able to handle them together with the sparse operations.
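
A serial stand-in for the kind of operation at issue is easy to write with existing sparse-matrix libraries; the sketch below (in Python with scipy; the nearest-neighbor matrix is an illustrative stand-in for a tight-binding Hamiltonian) builds a sparse matrix on a three-dimensional grid and forms a sparse matrix-matrix product. Distributing such products efficiently across thousands of nodes, and coupling them with fast Fourier transforms over other indices, is the open problem.

    import numpy as np
    from scipy import sparse

    # Serial stand-in for the sparse matrix-matrix operations discussed above
    # (a sketch; the parallel distribution of the product is not shown).
    n = 20                                   # grid points per dimension
    N = n ** 3                               # total basis size
    eye = sparse.identity(n, format="csr")
    hop = sparse.diags([1.0, 1.0], [-1, 1], shape=(n, n), format="csr")
    # Kronecker construction of a nearest-neighbor coupling matrix on a 3-D grid.
    H = (sparse.kron(sparse.kron(hop, eye), eye) +
         sparse.kron(sparse.kron(eye, hop), eye) +
         sparse.kron(sparse.kron(eye, eye), hop)).tocsr()

    P = H @ H                                # sparse matrix-matrix product
    print(f"nonzeros per row: H ~ {H.nnz / N:.1f}, H@H ~ {P.nnz / N:.1f}")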

Visualization. As the codes involve increasingly large data bases run on remote machines, visualization must provide tools for code performance/debugging and data visualization that run in real time and can be shared between distributed collaborators. Both tasks will require data mining techniques, information visualization, and statistical analysis methods to reduce the amount and dimensionality of data to be rendered. (1) For performance and debugging, visualization should highlight regions of unusual performance and variations from an expected performance pattern based on a model of the code's anticipated behavior. Reduction of the performance data to only the unusual or meaningful regions in time or across nodes would tremendously simplify the debugging and code tuning process for developing massively parallel codes. (2) For real-time data visualization, the concern is to significantly reduce the computational cost and network bandwidth requirements of performing the rendering on the massively parallel architecture and then shipping the large visualization output back to a serial visualization algorithm running on the user's local resource.

Remote steering. Scientists at geographically different locations need the ability to interactively steer a simulation in time and/or space. The software must ensure both that each viewer has a time-coherent view of the parallel data and that steering-parameter coherency is maintained across all viewers. A necessary condition is that viewers can attach from any location at any point during a simulation, and that the simulation is robust with respect to such inspection. Finally, the viewer/application interfaces must support a variety of local visualization software packages which have different data and synchronization requirements.

Integrated development environment. Any given user should be able to construct an interface for a particular application without having to know anything about the particular programming paradigm or how to construct a visual environment. The interface should give the user a view of the overall flow of the program. By clicking on the lines connecting the various subprograms, the user can quickly see what input and output variables are being exchanged between two routines. Further, the user can manipulate various subalgorithms to easily construct new code, adapt existing code, and access a new subprocess within the context of the entire code.

