Predictive simulation and model-based hazard maps of geophysical mass flows
Dalbey, Keith R.
The purpose of this work is to improve the predictive capabilities of geophysical flow computer models, in particular the Titan2D depth-averaged granular flow simulator. Within this broad scope there are three interrelated themes. The first is to improve the accuracy of Titan2D's physical and numerical modeling capabilities. The second is to quantify the uncertainty in geophysical flow simulation output. The third is to develop a systematic methodology to aid volcanologists in incorporating simulator output into the production of hazard maps. The motivating challenge is often the need to analyze hazards over a jurisdiction in a short period of time (e.g., less than 24 hours) following a specified premonitory event; meeting this need requires advances in all of the above themes. We note that the new ideas introduced here may also impact areas beyond the application at hand. In pursuit of the first theme, the contributions of this work to Titan2D include (1) the capability to accept spatially varying material properties; (2) a physics-based criterion to determine whether the flow should be stopped, and techniques to bring the flow to rest; (3) a new flow-initiation mechanism modeling a flux of material effusing from the ground; and (4) a multi-faceted approach that mitigates the "thin-layer problem" common to depth-averaged flow solvers. Toward the second theme, we developed two different classes of uncertainty representation approaches. The first is Polynomial Chaos Quadrature (PCQ), which is similar to Non-Intrusive Spectral Projection (NISP) but has a few numerical advantages; it is general enough to be considered a superset of NISP and Point Estimate Methods (PEM). The second class involves Bayesian emulation, in particular the Bayes Linear Method (BLM).
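As a concrete sketch of how PCQ-style uncertainty propagation works, the example below pushes a one-dimensional Gaussian input uncertainty through a cheap stand-in for a simulator by evaluating it at Gauss-Hermite quadrature points and forming weighted moment estimates. The stand-in function, quadrature order, and input distribution are illustrative assumptions, not values from the dissertation:

```python
import numpy as np

# Toy "simulator": a cheap, closed-form stand-in for an expensive flow code.
def simulator(x):
    return np.exp(0.5 * x)

# Gauss-Hermite quadrature for a standard normal input X ~ N(0, 1).
# hermegauss gives nodes/weights for the weight function exp(-x^2 / 2);
# dividing the weights by sqrt(2*pi) normalizes to the Gaussian density.
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
weights = weights / np.sqrt(2.0 * np.pi)

# PCQ-style moment estimates: weighted sums of simulator outputs at the
# quadrature points. Nothing inside the simulator is modified.
samples = simulator(nodes)
mean = np.sum(weights * samples)
variance = np.sum(weights * samples**2) - mean**2

# For X ~ N(0,1), the exact mean is E[exp(X/2)] = exp(1/8) ~ 1.1331
print(mean, variance)
```

Because the simulator is only sampled at the quadrature points, the approach is non-intrusive: the same pattern applies to an expensive code such as Titan2D, with each quadrature point becoming one independent, and hence embarrassingly parallel, simulator run.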
Some of its advantages include: it provides an estimate of the discrepancy between the statistical model and the simulator, which can also be used to drive adaptive sampling; and it can reuse existing data, rather than requiring that data be available at very specific locations in sample space. Notable contributions in this field include (1) a positive-definite error model with a different set of roughness parameters for each of multiple outputs; (2) a new formula to compute the unadjusted variance from a set of data and roughness parameters, with favorable properties compared to what is found in the literature; (3) a set of guidelines that significantly reduces the region of roughness-parameter space that must be searched to find good values, without requiring specification of a prior by an expert user; (4) a way to construct a Bayes Linear emulator that accounts for known measurement noise, again without requiring an expert-specified prior; (5) adaptive selection of global polynomial basis functions for the least-squares fit used as the unadjusted mean of a Bayes Linear emulator; and (6) a new PieceWise linear ensemble of EMulators (PWEM) approach, which is equally applicable to fully Bayesian and Bayes Linear emulation and appears to be more accurate than adaptively selected global polynomial basis functions. Of even greater importance, these computations parallelize easily, enabling the use of modern supercomputers. For the third theme, this dissertation introduces new approaches to constructing maps of probability-of-hazard through PCQ and PWEM fast surrogates. The PWEM approach correlates through physical space as well as uncertainty space, i.e., it produces a single macro-emulator that can be evaluated at all points on an East-North map.
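To illustrate the Bayes Linear adjustment underlying such emulators, the sketch below adjusts a constant prior mean using the standard formulas E_D(Y) = E(Y) + Cov(Y, D) Var(D)^(-1) (D - E(D)) and Var_D(Y) = Var(Y) - Cov(Y, D) Var(D)^(-1) Cov(D, Y). The squared-exponential covariance, the roughness value, and the training data are hypothetical choices for illustration, not the dissertation's actual error model:

```python
import numpy as np

# Squared-exponential covariance with a roughness (inverse length-scale
# squared) parameter. Prior variance and roughness here are illustrative.
def cov(xa, xb, sigma2=1.0, roughness=0.5):
    d = xa[:, None] - xb[None, :]
    return sigma2 * np.exp(-roughness * d**2)

# Training runs of a (hypothetical) simulator at a few design points.
x_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = np.sin(x_train)          # stand-in for simulator output
prior_mean = 0.0                   # unadjusted mean (constant here)

# Bayes Linear adjustment at a new input x_new:
#   E_D(Y)   = E(Y) + Cov(Y, D) Var(D)^(-1) (D - E(D))
#   Var_D(Y) = Var(Y) - Cov(Y, D) Var(D)^(-1) Cov(D, Y)
x_new = np.array([1.5])
K_dd = cov(x_train, x_train) + 1e-10 * np.eye(len(x_train))  # jitter
K_nd = cov(x_new, x_train)
alpha = np.linalg.solve(K_dd, y_train - prior_mean)
adj_mean = prior_mean + K_nd @ alpha
adj_var = cov(x_new, x_new) - K_nd @ np.linalg.solve(K_dd, K_nd.T)

print(adj_mean.item(), adj_var.item())  # prediction near sin(1.5), small variance
```

In a full emulator the roughness parameters would be chosen by a search such as the guidelines of contribution (3), and the constant prior mean would be replaced by the least-squares polynomial fit of contribution (5).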
We produced a "probability within a specified time period" hazard map for the island of Montserrat by drawing sample volumes from a Pareto distribution provided by collaborators at SAMSI (Statistical and Applied Mathematical Sciences Institute). With the distributions in hand and using roughly 1000 processors of a supercomputer cluster, we created a probability-of-hazard map in less than 9 hours from start to finish; the original goal was to generate a model-based hazard map in under 24 hours. The ideas described under themes two and three are of general applicability to many related areas, and our current work focuses on such generalizations. Limited application of the PCQ method to the simulation of other complex systems, e.g., child restraint systems, has been successful.
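The construction of a probability-of-hazard value at a single map cell can be sketched as a Monte Carlo loop: draw sample volumes from a Pareto distribution, push them through a fast surrogate, and count the fraction of scenarios whose predicted flow depth exceeds a critical threshold. The surrogate form, Pareto parameters, and depth threshold below are hypothetical placeholders, not the actual Montserrat values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Cheap surrogate standing in for a trained Titan2D emulator: flow depth (m)
# at one fixed map location as a function of eruption volume (hypothetical).
def surrogate_depth(volume_m3):
    return 1e-6 * volume_m3**0.75

# Draw sample volumes from a Pareto distribution. numpy's rng.pareto samples
# a Lomax variable X with P(X > x) = (1 + x)^(-shape), so v_min * (1 + X)
# follows a classical Pareto with minimum v_min. Parameters are illustrative.
shape, v_min = 0.7, 1e5
volumes = v_min * (1.0 + rng.pareto(shape, size=100_000))

# Probability-of-hazard at this map cell: fraction of sampled scenarios in
# which the surrogate's flow depth exceeds a critical threshold.
threshold_m = 1.0
p_hazard = np.mean(surrogate_depth(volumes) >= threshold_m)
print(p_hazard)
```

Repeating this for every cell of an East-North grid, or evaluating a PWEM macro-emulator over the whole grid at once, yields the hazard map; the samples are independent, so the work parallelizes cleanly across processors.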