
2015–2016 Webinar Schedule

October 8. Sanne Cottaar, Timo Heister, Bob Myhill, Ian Rose, and Cayman Unterborn. An introduction to BurnMan - a mineral physics toolkit
November 12. Anders Petersson, Lawrence Livermore National Laboratory. Simulating seismic wave propagation with SW4
December 3. Kasey Schultz & John Wilson, UC Davis. An introduction to Virtual Quake
January 14. William Oberkampf, Oberkampf Consulting. Verification, Validation, and Predictive Capability: What’s What?
February 11. Habib Najm, Sandia National Laboratories. Uncertainty Quantification in Computational Models of Physical Systems
March 10. Anna M. Michalak, Carnegie Institution for Science. Statistical and computational challenges of constraining greenhouse gas budgets
April 14. Noemi Petra, UC Merced. hIPPYlib: An extensible software framework for large-scale Bayesian inversion - CANCELLED
May 12. Andreas Fichtner, ETH. Resolution analysis by random probing


THURSDAY, OCTOBER 8

An introduction to BurnMan - a mineral physics toolkit
Sanne Cottaar, Ph.D., University of Cambridge; Professor Timo Heister, Clemson University; Bob Myhill, Ph.D., University of Bayreuth; Ian Rose, University of California, Berkeley; and Cayman Unterborn, Ohio State University

In this webinar we will introduce the extensible, open-source mineral physics toolkit BurnMan. This software allows the user to calculate the elastic and thermodynamic properties of rocks, minerals, fluids, and melts based on the properties of end-member phases. BurnMan is bundled with several end-member and solid-solution databases, but users can also create their own by fitting their data to a wide variety of equations of state and solution models. BurnMan can then be used to calculate mineral equilibria, chemical potentials, and seismic properties of minerals and rocks as a function of pressure, temperature, and bulk composition. Seismic velocity profiles computed by BurnMan can be quantitatively or visually compared to observed seismic velocities. In the webinar we will demonstrate these features by walking through example scripts. [slides]
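As a flavor of those scripts, here is a minimal sketch of the Python interface, assuming BurnMan's bundled SLB_2011 database; the two-phase assemblage, phase fractions, and pressure-temperature path are illustrative choices, not a recommended model:

    import numpy as np
    import burnman
    from burnman import minerals

    # A simple lower-mantle assemblage: 80% bridgmanite, 20% periclase.
    rock = burnman.Composite([minerals.SLB_2011.mg_perovskite(),
                              minerals.SLB_2011.periclase()],
                             [0.8, 0.2])

    # Evaluate density and seismic velocities along an illustrative P-T path.
    pressures = np.linspace(25e9, 125e9, 50)        # Pa
    temperatures = 1900.0 + 8.0e-10 * pressures     # rough linear T(P), K
    density, v_p, v_s = rock.evaluate(['density', 'v_p', 'v_s'],
                                      pressures, temperatures)

The resulting velocity and density profiles can then be plotted against a seismic reference model for the kind of comparison described above.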


THURSDAY, NOVEMBER 12

Simulating seismic wave propagation with SW4
Anders Petersson, Lawrence Livermore National Laboratory

This webinar describes how to use the SW4 code to simulate ground motion due to earthquakes. After a brief overview of the numerical method, we describe how to set up a simulation in terms of seismic sources, the material model, visco-elastic attenuation, and topography. We also present some of the available output options, describe how to run SW4 on parallel machines, and offer some suggestions on the SW4 workflow. [slides]
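For orientation, SW4 is driven by a plain-text input file of keyword=value commands. The following hypothetical example sketches the main ingredients mentioned above; the values and file names are placeholders, not a vetted simulation setup:

    # computational domain (m) and grid spacing
    grid x=30e3 y=30e3 z=17e3 h=100
    # length of the simulated time window (s)
    time t=40.0
    # point moment-tensor source with a Gaussian source time function
    source x=15e3 y=15e3 z=5e3 m0=1e17 strike=40 dip=70 rake=10 t0=1.0 freq=2.0 type=Gaussian
    # simple layered material model
    block vp=4000 vs=2300 rho=2400
    block vp=6000 vs=3500 rho=2700 z1=3000
    # record a synthetic seismogram at the surface
    rec x=20e3 y=20e3 z=0 file=sta01

On a parallel machine the run is then launched through MPI, e.g. mpirun -np 16 sw4 example.in.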


THURSDAY, DECEMBER 3

An introduction to Virtual Quake
Kasey Schultz and John Wilson, University of California, Davis

This webinar will introduce Virtual Quake, a boundary element code that simulates fault systems based on stress interactions between fault elements in order to characterize their long-term statistical behavior. The webinar will cover:

  • Downloading, installing, and running Virtual Quake
  • Analysis tools: PyVQ and DIY analysis with h5py (see the sketch below)
  • The Virtual Quake research community
  • Future directions

[slides]
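As a taste of the DIY option, the sketch below opens a simulation output file with h5py; the file name and the dataset and field names are assumptions for illustration, not Virtual Quake's exact schema:

    import h5py
    import numpy as np

    # Open a Virtual Quake output file (hypothetical name and layout).
    with h5py.File('events.h5', 'r') as f:
        events = f['events'][...]       # assumed: a table of simulated events

    # Quick magnitude-frequency statistics from an assumed magnitude field.
    mags = events['event_magnitude']
    counts, edges = np.histogram(mags, bins=np.arange(5.0, 8.5, 0.1))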


THURSDAY, JANUARY 14

Verification, Validation, and Predictive Capability: What’s What?
William Oberkampf, Oberkampf Consulting

Engineering and geoscience organizations must increasingly rely on computational simulation for the design, predicted response, and performance of man-made and natural systems. Computational analysts, decision makers, and regulatory authorities who rely on simulation should have practical techniques and methods for assessing simulation credibility. This webinar presents an introduction to the topics of verification, validation, and predictive capability. These topics apply to essentially all engineering and science applications, including fluid dynamics, heat transfer, solid mechanics, and all fields related to the Earth sciences. The mathematical models for the systems of interest are typically given by partial differential or integral equations representing initial value and boundary value problems. The computer codes that implement these models may be developed by commercial, corporate, government, or research organizations, but all should be subjected to rigorous testing by way of verification procedures. The accuracy of the mathematical models coded into software is assessed by comparing results with experimental measurements, activities referred to as validation. The webinar will sketch a framework for incorporating the wide range of error and uncertainty sources identified during the mathematical modeling, verification, validation, and calibration processes, with the goal of estimating the total predictive uncertainty of the simulation. This is referred to as estimating predictive capability because, typically, no experimental measurements are available for the systems at the application conditions of interest. [slides]
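To make the verification idea concrete: a standard code-verification exercise compares computed solutions against a known (often manufactured) exact solution on successively refined grids and checks the observed order of accuracy. A minimal sketch, with illustrative error values:

    import numpy as np

    def observed_order(error_coarse, error_fine, refinement_ratio=2.0):
        """Observed order of accuracy: p = log(e_coarse / e_fine) / log(r)."""
        return np.log(error_coarse / error_fine) / np.log(refinement_ratio)

    # Discretization errors measured against an exact (manufactured) solution
    # on grids with spacings h and h/2; the values below are illustrative only.
    e_h, e_h2 = 4.1e-3, 1.0e-3
    p = observed_order(e_h, e_h2)   # ~2.04, consistent with a second-order scheme

If the observed order falls well short of the scheme's formal order, that signals a coding or convergence problem before any comparison with experiment is attempted.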


THURSDAY, FEBRUARY 11

Uncertainty Quantification in Computational Models of Physical Systems
Habib Najm, Sandia National Laboratories

Models of physical systems typically involve inputs/parameters that are determined from empirical measurements, and therefore exhibit a certain degree of uncertainty. Estimating the propagation of this uncertainty into computational model output predictions is crucial for purposes of model validation, design optimization, and decision support.

Recent years have seen significant developments in probabilistic methods for efficient uncertainty quantification (UQ) in computational models. These methods are grounded in the use of functional representations for random variables. In particular, Polynomial Chaos (PC) expansions have seen significant use in this context. The utility of PC methods has been demonstrated in a range of physical models, including structural mechanics, porous media, fluid dynamics, aeronautics, heat transfer, and chemically reacting flow. While high-dimensionality remains a challenge, great strides have been made in dealing with moderate dimensionality along with non-linearity and dynamics.

In this talk, I will give an overview of UQ in computational models, and present associated demonstrations in computations of physical systems. I will cover the two key classes of UQ activities, namely: estimation of uncertain input parameters from empirical data, and forward propagation of parametric uncertainty to model outputs. I will cover the basics of PC UQ methods with examples of their use in both forward and inverse UQ problems. I will also highlight the application of these methods in select physical systems, including combustion, as well as ocean and land components of climate models. [slides]
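As a concrete illustration of the forward problem (not an example from the talk), the following sketch computes a PC expansion of a toy model with a single Gaussian input by non-intrusive projection onto probabilists' Hermite polynomials:

    import numpy as np
    from math import factorial
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    # Toy model with a single uncertain input X ~ N(0, 1).
    def model(x):
        return np.exp(0.3 * x)

    order = 4
    nodes, weights = hermegauss(order + 4)     # quadrature for weight exp(-x^2/2)
    weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the N(0, 1) density

    # Non-intrusive projection: c_k = E[model(X) * He_k(X)] / k!
    coeffs = np.array([
        np.sum(weights * model(nodes) * hermeval(nodes, np.eye(order + 1)[k]))
        / factorial(k)
        for k in range(order + 1)
    ])

    # Output statistics follow directly from the PC coefficients.
    mean = coeffs[0]
    variance = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))

Once the coefficients are in hand, moments and sensitivities of the output come essentially for free, which is a key attraction of PC methods over plain Monte Carlo sampling.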


THURSDAY, MARCH 10 

Statistical and computational challenges of constraining greenhouse gas budgets
Anna M. Michalak, Carnegie Institution for Science

Predicting future changes to the global carbon cycle (and therefore climate) and quantifying anthropogenic emissions of greenhouse gases (GHGs) both require an understanding of net GHG emissions and uptake across a variety of spatial and temporal scales. This talk will explore some of the core scientific questions related to understanding GHG budgets through the lens of the statistical and computational challenges that arise. The focus will be on the use of atmospheric observations, and applications will include the natural and anthropogenic components of the methane and carbon dioxide budgets. The discussion will include issues related to the solution of spatiotemporal inverse problems, uncertainty quantification, data fusion, gap filling, and issues of “big data” arising from the use of satellite observations. [slides]
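As one generic flavor of these spatiotemporal inverse problems (a textbook linear Gaussian formulation, not Dr. Michalak's specific method), atmospheric flux inversions estimate unknown surface fluxes from concentration observations through a transport operator:

    import numpy as np

    # Toy sizes: m unknown fluxes, n atmospheric concentration observations.
    m, n = 50, 20
    rng = np.random.default_rng(0)

    H = rng.normal(size=(n, m)) / m     # transport/observation operator (toy)
    x_prior = np.zeros(m)               # prior flux estimate
    Q = 0.5 * np.eye(m)                 # prior flux-error covariance
    R = 0.1 * np.eye(n)                 # observation-error covariance
    y = rng.normal(size=n)              # observed concentrations (toy data)

    # Bayesian (linear Gaussian) update: posterior mean and covariance of fluxes.
    K = Q @ H.T @ np.linalg.inv(H @ Q @ H.T + R)
    x_post = x_prior + K @ (y - H @ x_prior)
    P_post = Q - K @ H @ Q

At realistic problem sizes the covariance matrices no longer fit in memory, which is where the statistical and computational challenges of the talk arise.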


THURSDAY, MAY 12 

Resolution analysis by random probing
Andreas Fichtner, ETH

We present a new method for resolution analysis in tomography, based on stochastic probing of the Hessian or resolution operators. Key properties of the method are (i) low algorithmic complexity and easy implementation, (ii) applicability to any tomographic technique, including full-waveform inversion and linearized ray tomography, (iii) applicability in any spatial dimension and to inversions with a large number of model parameters, (iv) computational costs that are typically a fraction of those required for synthetic recovery tests, and (v) the ability to quantify both spatial resolution and inter-parameter trade-offs.

Using synthetic full-waveform inversions as benchmarks, we demonstrate that auto-correlations of random-model applications to the Hessian yield various resolution measures, including direction- and position-dependent resolution lengths, and the strength of inter-parameter mappings. We observe that the required number of random test models is around 5 in one, two and three dimensions. This means that the proposed resolution analyses are not only more meaningful than recovery tests but also computationally less expensive. We demonstrate the applicability of our method in 3D real-data full-waveform inversions for the western Mediterranean and Japan.
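The sketch below illustrates the general idea of stochastic probing rather than the authors' exact algorithm: a handful of Hessian-vector products with random test models suffices to estimate, for example, the diagonal of the Hessian, a proxy for position-dependent resolution:

    import numpy as np

    def estimate_hessian_diagonal(apply_H, n_params, n_probes=5, seed=0):
        """Estimate diag(H) from random probes: E[v * (H v)] = diag(H) when
        the entries of v are independent with zero mean and unit variance."""
        rng = np.random.default_rng(seed)
        diag = np.zeros(n_params)
        for _ in range(n_probes):
            v = rng.choice([-1.0, 1.0], size=n_params)   # Rademacher test model
            diag += v * apply_H(v)                       # one Hessian-vector product
        return diag / n_probes

    # Toy usage: a dense matrix stands in for the Hessian-vector products that
    # a full-waveform inversion code would supply via adjoint techniques.
    A = np.diag(np.linspace(1.0, 3.0, 100)) + 0.01
    diag_estimate = estimate_hessian_diagonal(lambda v: A @ v, n_params=100)

Each probe costs one Hessian-vector product, which is why a budget of around five random test models is so much cheaper than a full synthetic recovery test.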

In addition to tomographic problems, resolution analysis by random probing may be used in other inverse methods that constrain continuously distributed properties, including electromagnetic and potential-field inversions, as well as recently emerging geodynamic data assimilation. [slides]