Use of High Performance Computing to Generate and Utilize 3D Tomographic Velocity Models
LocOO3D is a software tool that computes geographical locations for seismic events at regional to global scales. This software has a rich set of features, including the ability to use custom 3D velocity models, correlated observations, and master event locations. The LocOO3D software is especially useful for research related to seismic monitoring applications, since it allows users to easily explore a variety of location methods and scenarios and is compatible with the CSS3.0 data format used in monitoring applications. The LocOO3D software, User's Manual, and Examples are available on the web at https://github.com/sandialabs/LocOO3D. For additional information on GeoTess, SALSA3D, RSTT, and other related software, please see https://github.com/sandialabs/GeoTessJava, www.sandia.gov/geotess, www.sandia.gov/salsa3d, and www.sandia.gov/rstt.
PCalc is a software tool that computes travel-time predictions, ray path geometry and model queries. This software has a rich set of features, including the ability to use custom 3D velocity models to compute predictions using a variety of geometries. The PCalc software is especially useful for research related to seismic monitoring applications.
The ability to accurately locate seismic events is necessary for treaty monitoring. When using techniques that rely on the comparison of observed and predicted travel times to obtain these locations, it is important that the estimated travel times and their estimated uncertainties are also accurate. The methodology of Ballard et al. (2016a) has been used in the past to generate an accurate 3D tomographic global model of compressional wave slowness (the SAndia LoS Alamos 3D tomography model, i.e., SALSA3D). To re-establish functionality and to broaden the capabilities of the method to local distances, we have applied the methodology of Ballard et al. (2016a) to local data in Utah. This report details the results of the initial model generated, including relocations performed using analyst-picked mining events at West Ridge Mine and three ground-truth events at Bingham Mine. We successfully generated a feasible tomography model that produced reasonable relocations of the mining events.
Bulletin of the Seismological Society of America
In a traditional data-processing pipeline, waveforms are acquired, a detector makes the signal detections (i.e., arrival times, slownesses, and azimuths) and passes them to an associator. The associator then links the detections to the fitting-event hypotheses to generate an event bulletin. Most of the time, this traditional pipeline requires substantial human-analyst involvement to improve the quality of the resulting event bulletin. For the year 2017, for example, International Data Center (IDC) analysts rejected about 40% of the events in the automatic bulletin and manually built 30% of the legitimate events. We propose an iterative processing framework (IPF) that includes a new data-processing module that incorporates automatic analyst behaviors (auto analyst [AA]) into the event-building pipeline. In the proposed framework, through an iterative process, the AA takes over many of the tasks traditionally performed by human analysts. These tasks can be grouped into two major processes: (1) evaluating small events with a low number of location-defining arrival phases to improve their formation; and (2) scanning for and exploiting unassociated arrivals to form potential events missed by previous association runs. To test the proposed framework, we processed a two-week period (15–28 May 2010) of the signal-detections dataset from the IDC. Comparison with an expert analyst-reviewed bulletin for the same time period suggests that IPF performs better than the traditional pipelines (IDC and baseline pipelines). Most of the additional events built by the AA are low-magnitude events that were missed by these traditional pipelines. The AA also adds additional signal detections to existing events, which saves analyst time, even if the event locations are not significantly affected.
LocOO3D is a software tool that computes geographical locations for seismic events at regional to global scales. This software has a rich set of features, including the ability to use custom 3D velocity models, correlated observations, and master event locations. The LocOO3D software is especially useful for research related to seismic monitoring applications, since it allows users to easily explore a variety of location methods and scenarios and is compatible with the CSS3.0 data format used in monitoring applications. The LocOO3D software is available on the web at www.sandia.gov/salsa3d/Software.html. The software is packaged with this user's manual and a set of example datasets, the use of which is described in this manual.
PCalc is a software tool that computes travel-time predictions, ray path geometry and model queries. This software has a rich set of features, including the ability to use custom 3D velocity models to compute predictions using a variety of geometries. The PCalc software is especially useful for research related to seismic monitoring applications. The PCalc software is available on the web at www.sandia.gov/salsa3d/Software.html. The software is packaged with this user's manual and a set of example datasets, the use of which is described in this manual.
Bulletin of the Seismological Society of America
The matched filtering technique that uses the cross correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an approximate nearest neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation. Our method begins with a projection into a reduced dimensionality space, based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors for a query waveform is accomplished by iteratively comparing it with the neighbors of its immediate neighbors. We used the approach to search for matches to each of ∼2300 analyst-reviewed signal detections reported in May 2010 for the International Monitoring System station MKAR. The template library in this case consists of a data set of more than 200,000 analyst-reviewed signal detections for the same station from February 2002 to July 2016 (excluding May 2010). Of these signal detections, 73% are teleseismic first P and 17% regional phases (Pn, Pg, Sn, and Lg). The analyses performed on a standard desktop computer show that the proposed ANN approach performs a search of the large template libraries about 25 times faster than the standard full linear search and achieves recall rates greater than 80%, with the recall rate increasing for higher correlation thresholds.
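As an illustrative sketch only (simplified Java written for this summary, not the authors' implementation; all class and variable names are hypothetical), the two ingredients described above, correlation-based embedding against a random reference subset and a greedy walk over a precomputed nearest-neighbor graph, might look like the following.

import java.util.Arrays;
import java.util.Random;

/** Illustrative sketch of correlation-based embedding plus greedy neighbor-graph search. */
public class AnnCorrelationSketch {

    /** Zero-lag normalized correlation of two equal-length, pre-aligned windows. */
    static double correlate(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / Math.sqrt(na * nb);
    }

    /** Embed a waveform as its correlations with m randomly chosen reference templates. */
    static double[] embed(double[] w, double[][] refs) {
        double[] e = new double[refs.length];
        for (int i = 0; i < refs.length; i++) e[i] = correlate(w, refs[i]);
        return e;
    }

    static double squaredDistance(double[] x, double[] y) {
        double s = 0;
        for (int i = 0; i < x.length; i++) s += (x[i] - y[i]) * (x[i] - y[i]);
        return s;
    }

    /**
     * Greedy search of a precomputed k-nearest-neighbor graph: keep moving to whichever
     * neighbor of the current node lies closer to the query embedding until none improves.
     */
    static int annSearch(double[] queryEmb, double[][] libEmb, int[][] neighbors, int start) {
        int best = start;
        double bestD = squaredDistance(queryEmb, libEmb[best]);
        boolean improved = true;
        while (improved) {
            improved = false;
            for (int nb : neighbors[best]) {
                double d = squaredDistance(queryEmb, libEmb[nb]);
                if (d < bestD) { bestD = d; best = nb; improved = true; }
            }
        }
        return best; // candidate only; verify with a full waveform correlation
    }

    public static void main(String[] args) {
        Random rng = new Random(0);
        int n = 200, len = 64, m = 8, k = 5;

        // Synthetic stand-ins for a library of archived detections and one query waveform.
        double[][] library = new double[n][len];
        for (double[] w : library) for (int i = 0; i < len; i++) w[i] = rng.nextGaussian();
        double[] query = library[42].clone();
        for (int i = 0; i < len; i++) query[i] += 0.1 * rng.nextGaussian();

        // Random reference subset defining the reduced-dimensionality space.
        double[][] refs = new double[m][];
        for (int i = 0; i < m; i++) refs[i] = library[rng.nextInt(n)];

        // Embed the library and build a brute-force k-NN graph once, offline.
        double[][] emb = new double[n][];
        for (int i = 0; i < n; i++) emb[i] = embed(library[i], refs);
        int[][] neighbors = new int[n][k];
        for (int i = 0; i < n; i++) {
            final int ii = i;
            Integer[] order = new Integer[n];
            for (int j = 0; j < n; j++) order[j] = j;
            Arrays.sort(order, (a, b) -> Double.compare(
                    squaredDistance(emb[ii], emb[a]), squaredDistance(emb[ii], emb[b])));
            for (int j = 0; j < k; j++) neighbors[i][j] = order[j + 1]; // skip self
        }

        int match = annSearch(embed(query, refs), emb, neighbors, rng.nextInt(n));
        System.out.println("candidate = " + match + ", correlation = "
                + correlate(query, library[match]));
    }
}

A production system would correlate over a range of lags, work with far larger libraries, and verify each candidate returned by the graph search with a full waveform correlation before declaring a match.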
Bulletin of the Seismological Society of America
The task of monitoring the Earth for nuclear explosions relies heavily on seismic data to detect, locate, and characterize suspected nuclear tests. Motivated by the need to locate suspected explosions as accurately and precisely as possible, we developed a tomographic model of the compressional wave slowness in the Earth’s mantle with primary focus on the accuracy and precision of travel-time predictions for P and Pn ray paths through the model. Path-dependent travel-time prediction uncertainties are obtained by computing the full 3D model covariance matrix and then integrating slowness variance and covariance along ray paths from source to receiver. Path-dependent travel-time prediction uncertainties reflect the amount of seismic data that was used in tomography with very low values for paths represented by abundant data in the tomographic data set and very high values for paths through portions of the model that were poorly sampled by the tomography data set. The pattern of travel-time prediction uncertainty is a direct result of the off-diagonal terms of the model covariance matrix and underscores the importance of incorporating the full model covariance matrix in the determination of travel-time prediction uncertainty. The computed pattern of uncertainty differs significantly from that of 1D distance-dependent travel-time uncertainties computed using traditional methods, which are only appropriate for use with travel times computed through 1D velocity models.
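Schematically (our notation, not necessarily the paper's), the path integration described above amounts to

\[
\sigma_{TT}^{2} \;=\; \int_{\mathrm{ray}} \!\! \int_{\mathrm{ray}} \operatorname{Cov}\!\left[\, s(\mathbf{x}),\, s(\mathbf{x}') \,\right] d\ell \, d\ell'
\;\approx\; \sum_{i} \sum_{j} \Delta\ell_{i}\, \Delta\ell_{j}\, C_{ij},
\]

where $s$ is slowness, $\Delta\ell_i$ is the length of the ray segment associated with model node $i$, and $C_{ij}$ is the corresponding element of the model covariance matrix. The off-diagonal $C_{ij}$ terms are precisely what make the resulting uncertainty path dependent rather than a simple sum of nodal variances.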
Seismological Research Letters
GeoTess is a model parameterization and software support library that manages the construction, population, storage, and interrogation of data stored in 2D and 3D Earth models. The software is available in Java and C++, with a C interface to the C++ library. The software has been tested on Linux, Mac, Sun, and PC platforms. It is open source and is available online (see Data and Resources).
Bulletin of the Seismological Society of America
The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster, 2002). At the station level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.
Bulletin of the Seismological Society of America
Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. Once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified. Results are presented in comparison with analyst-reviewed bulletins for three datasets: a two-week ground-truth period, the Tohoku aftershock sequence, and the entire year of 2010. The probabilistic event detection, association, and location algorithm missed fewer events and generated fewer false events on all datasets compared to the associator used at the International Data Center (51% fewer missed and 52% fewer false events on the ground-truth dataset when using the same predictions).
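In schematic form (our notation, with origin time made explicit even though the abstract does not spell it out), the grid fitness described above is

\[
F(\mathbf{x}_k, t_0) \;=\; \sum_{i \in \mathcal{A}(T)} f_i(\mathbf{x}_k, t_0),
\qquad
f_i(\mathbf{x}_k, t_0) \;\propto\; P\!\left(\text{arrival } i \mid \text{event at } \mathbf{x}_k, t_0\right),
\]

where $\mathcal{A}(T)$ is the set of observations in the time window $T$. The node maximizing $F$ is accepted as a hypothetical event if $F$ exceeds the minimum threshold, arrivals consistent with that event are associated and removed, and the search repeats on the remaining arrivals.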
To test the hypothesis that high quality 3D Earth models will produce seismic event locations which are more accurate and more precise, we are developing a global 3D P wave velocity model of the Earth's crust and mantle using seismic tomography. In this paper, we present the most recent version of our model, SALSA3D version 1.5, and demonstrate its ability to reduce mislocations for a large set of realizations derived from a carefully chosen set of globally-distributed ground truth events. Our model is derived from the latest version of the Ground Truth (GT) catalog of P and Pn travel time picks assembled by Los Alamos National Laboratory. To prevent over-weighting due to ray path redundancy and to reduce the computational burden, we cluster rays to produce representative rays. Reduction in the total number of ray paths is approximately 50%. The model is represented using the triangular tessellation system described by Ballard et al. (2009), which incorporates variable resolution in both the geographic and radial dimensions. For our starting model, we use a simplified two layer crustal model derived from the Crust 2.0 model over a uniform AK135 mantle. Sufficient damping is used to reduce velocity adjustments so that ray path changes between iterations are small. We obtain proper model smoothness by using progressive grid refinement, refining the grid only around areas with significant velocity changes from the starting model. At each grid refinement level except the last one we limit the number of iterations to prevent convergence thereby preserving aspects of broad features resolved at coarser resolutions. Our approach produces a smooth, multi-resolution model with node density appropriate to both ray coverage and the velocity gradients required by the data. This scheme is computationally expensive, so we use a distributed computing framework based on the Java Parallel Processing Framework, providing us with approximately 400 processors. Resolution of our model is assessed using a variation of the standard checkerboard method. We compare the travel-time prediction and location capabilities of SALSA3D to standard 1D models via location tests on a global event set with GT of 5 km or better. These events generally possess hundreds of Pn and P picks from which we generate different realizations of station distributions, yielding a range of azimuthal coverage and ratios of teleseismic to regional arrivals, with which we test the robustness and quality of relocation. The SALSA3D model reduces mislocation over standard 1D ak135 regardless of Pn to P ratio, with the improvement being most pronounced at higher azimuthal gaps.
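As a hedged sketch of the kind of damped, linearized update described above (our notation; the actual SALSA3D regularization and solver details may differ), each tomographic iteration solves

\[
\Delta\mathbf{s} \;=\; \arg\min_{\Delta\mathbf{s}}\;
\left\lVert \mathbf{W}\left(\mathbf{G}\,\Delta\mathbf{s} - \mathbf{r}\right) \right\rVert^{2}
\;+\; \lambda^{2} \left\lVert \Delta\mathbf{s} \right\rVert^{2},
\]

where $\mathbf{G}$ holds the ray-path lengths through the model nodes, $\mathbf{r}$ is the vector of travel-time residuals, $\mathbf{W}$ weights the data, and the damping parameter $\lambda$ keeps each slowness update $\Delta\mathbf{s}$ small enough that ray-path changes between iterations remain small.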
To test the hypothesis that high quality 3D Earth models will produce seismic event locations which are more accurate and more precise, we are developing a global 3D P wave velocity model of the Earth's crust and mantle using seismic tomography. In this paper, we present the most recent version of our model, SALSA3D (SAndia LoS Alamos) version 1.4, and demonstrate its ability to reduce mislocations for a large set of realizations derived from a carefully chosen set of globally-distributed ground truth events. Our model is derived from the latest version of the Ground Truth (GT) catalog of P and Pn travel time picks assembled by Los Alamos National Laboratory. To prevent over-weighting due to ray path redundancy and to reduce the computational burden, we cluster rays to produce representative rays. Reduction in the total number of ray paths is > 55%. The model is represented using the triangular tessellation system described by Ballard et al. (2009), which incorporates variable resolution in both the geographic and radial dimensions. For our starting model, we use a simplified two layer crustal model derived from the Crust 2.0 model over a uniform AK135 mantle. Sufficient damping is used to reduce velocity adjustments so that ray path changes between iterations are small. We obtain proper model smoothness by using progressive grid refinement, refining the grid only around areas with significant velocity changes from the starting model. At each grid refinement level except the last one we limit the number of iterations to prevent convergence thereby preserving aspects of broad features resolved at coarser resolutions. Our approach produces a smooth, multi-resolution model with node density appropriate to both ray coverage and the velocity gradients required by the data. This scheme is computationally expensive, so we use a distributed computing framework based on the Java Parallel Processing Framework, providing us with approximately 400 processors. Resolution of our model is assessed using a variation of the standard checkerboard method, as well as by directly estimating the diagonal of the model resolution matrix based on the technique developed by Bekas et al. We compare the travel-time prediction and location capabilities of this model over standard 1D models. We perform location tests on a global, geographically-distributed event set with ground truth levels of 5 km or better. These events generally possess hundreds of Pn and P phases from which we can generate different realizations of station distributions, yielding a range of azimuthal coverage and proportions of teleseismic to regional arrivals, with which we test the robustness and quality of relocation. The SALSA3D model reduces mislocation over standard 1D ak135, especially with increasing azimuthal gap. The 3D model appears to perform better for locations based solely or dominantly on regional arrivals, which is not unexpected given that ak135 represents a global average and cannot therefore capture local and regional variations.
The geometry of ray paths through realistic Earth models can be extremely complex due to the vertical and lateral heterogeneity of the velocity distribution within the models. Calculation of high fidelity ray paths and travel times through these models generally involves sophisticated algorithms that require significant assumptions and approximations. To test such algorithms it is desirable to have available analytic solutions for the geometry and travel time of rays through simpler velocity distributions against which the more complex algorithms can be compared. Also, in situations where computational performance requirements prohibit implementation of full 3D algorithms, it may be necessary to accept the accuracy limitations of analytic solutions in order to compute solutions that satisfy those requirements. Analytic solutions are described for the geometry and travel time of infinite frequency rays through radially symmetric 1D Earth models characterized by an inner sphere where the velocity distribution is given by the function V(r) = A - Br^2, optionally surrounded by some number of spherical shells of constant velocity. The mathematical basis of the calculations is described, sample calculations are presented, and results are compared to the TauP Toolkit of Crotwell et al. (1999). These solutions are useful for evaluating the fidelity of sophisticated 3D travel time calculators and in situations where performance requirements preclude the use of more computationally intensive calculators. It should be noted that most of the solutions presented are only quasi-analytic. Exact, closed form equations are derived but computation of solutions to specific problems generally requires application of numerical integration or root finding techniques, which, while approximations, can be calculated to very high accuracy. Tolerances are set in the numerical algorithms such that computed travel time accuracies are better than 1 microsecond.
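For reference, in any radially symmetric model the turning-ray geometry and travel time for a surface source and receiver follow the standard integrals (textbook forms; the closed-form solutions described above specialize these for V(r) = A - Br^2):

\[
p = \frac{r \sin i}{v(r)}, \qquad
\Delta(p) = 2 \int_{r_t}^{R} \frac{p}{r \sqrt{u(r)^{2} - p^{2}}}\, dr, \qquad
T(p) = 2 \int_{r_t}^{R} \frac{u(r)^{2}}{r \sqrt{u(r)^{2} - p^{2}}}\, dr,
\]

with $u(r) = r / v(r)$, surface radius $R$, incidence angle $i$, and turning radius $r_t$ defined by $u(r_t) = p$.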
DBTools comprises a suite of applications for manipulating data in a database. While loading data into a database is a relatively simple operation, loading data intelligently is deceptively difficult. Loading data intelligently means: not duplicating information already in the database, associating new information with related information already in the database, and maintaining a mapping of identification numbers in the input data to existing or new identification numbers in the database to prevent conflicts between the input data and the existing data. Most DBTools applications utilize DBUtilLib, a Java library with functionality supporting database, flatfile, and XML data formats. DBUtilLib is written in a completely generic manner. No schema specific information is embedded within the code; all such information comes from external sources. This approach makes the DBTools applications immune to most schema changes such as addition/deletion of columns from a table or changes to the size of a particular data element.
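As a purely illustrative sketch (not DBUtilLib code; the class and method names are hypothetical), the identification-number mapping described above can be as simple as a table that resolves each input id either to the id of an equivalent existing row or to a freshly allocated one:

import java.util.HashMap;
import java.util.Map;

/** Hypothetical sketch of the id remapping idea: input ids are resolved consistently
 *  to either an existing database id or a newly allocated one, never colliding. */
public class IdRemapSketch {
    private final Map<Long, Long> inputToDatabase = new HashMap<>();
    private long nextFreeId;

    public IdRemapSketch(long firstFreeId) {
        this.nextFreeId = firstFreeId;
    }

    /**
     * Resolve an input id. If an equivalent row already exists in the database, reuse its id;
     * otherwise allocate a new one. Either way, remember the mapping so every later reference
     * to the same input id resolves to the same value.
     *
     * @param inputId    id as it appears in the data being loaded
     * @param existingId id of a matching row already in the database, or null if none
     */
    public long resolve(long inputId, Long existingId) {
        return inputToDatabase.computeIfAbsent(inputId,
                k -> existingId != null ? existingId : nextFreeId++);
    }

    public static void main(String[] args) {
        IdRemapSketch orids = new IdRemapSketch(5000L);
        System.out.println(orids.resolve(17L, null));   // new id: 5000
        System.out.println(orids.resolve(17L, null));   // same mapping reused: 5000
        System.out.println(orids.resolve(23L, 4242L));  // matches an existing row: 4242
    }
}

Loading then rewrites every foreign-key reference in the input through resolve(), so related rows stay linked to one another without colliding with data already in the database.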
One of the most important types of data in the National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Knowledge Base (KB) is parametric grid (PG) data. PG data can be used to improve signal detection, signal association, and event discrimination, but so far their greatest use has been for improving event location by providing ground-truth-based corrections to travel-time base models. In this presentation we discuss the latest versions of the complete suite of Knowledge Base PG tools developed by NNSA to create, access, manage, and view PG data. The primary PG population tool is the Knowledge Base calibration integration tool (KBCIT). KBCIT is an interactive computer application to produce interpolated calibration-based information that can be used to improve monitoring performance by improving precision of model predictions and by providing proper characterizations of uncertainty. It is used to analyze raw data and produce kriged correction surfaces that can be included in the Knowledge Base. KBCIT not only produces the surfaces but also records all steps in the analysis for later review and possible revision. New features in KBCIT include a new variogram autofit algorithm; the storage of database identifiers with a surface; the ability to merge surfaces; and improved surface-smoothing algorithms. The Parametric Grid Library (PGL) provides the interface to access the data and models stored in a PGL file database. The PGL represents the core software library used by all the GNEM R&E tools that read or write PGL data (e.g., KBCIT and LocOO). The library provides data representations and software models to support accurate and efficient seismic phase association and event location. Recent improvements include conversion of the flat-file database (FDB) to an Oracle database representation; automatic access of station/phase tagged models from the FDB during location; modification of the core geometric data representations; a new multimodel representation for combining separate seismic data models that partially overlap; and a port of PGL to the Microsoft Windows platform. The Data Manager (DM) tool provides access to PG data for purposes of managing the organization of the generated PGL file database, or for perusing the data for visualization and informational purposes. It is written as a graphical user interface (GUI) that can directly access objects stored in any PGL file database and display it in an easily interpreted textual or visual format. New features include enhanced station object processing; low-level conversion to a new core graphics visualization library, the visualization toolkit (VTK); additional visualization support for most of the PGL geometric objects; and support for the Environmental Systems Research Institute (ESRI) shape files (which are used to enhance the geographical context during visualization). The Location Object-Oriented (LocOO) tool computes seismic event locations and associated uncertainty based on travel time, azimuth, and slowness observations. It uses a linearized least-squares inversion algorithm (the Geiger method), enhanced with Levenberg-Marquardt damping to improve performance in highly nonlinear regions of model space. LocOO relies on PGL for all predicted quantities and is designed to fully exploit all the capabilities of PGL that are relevant to seismic event location. 
New features in LocOO include a redesigned internal architecture implemented to enhance flexibility and to support simultaneous multiple event location. Database communication has been rewritten using new object-relational features available in Oracle 9i.
Seismic event location is made challenging by the difficulty of describing event location uncertainty in multidimensions, by the non-linearity of the Earth models used as input to the location algorithm, and by the presence of local minima which can prevent a location code from finding the global minimum. Techniques to deal with these issues will be described. Since some of these techniques are computationally expensive or require more analysis by human analysts, users need a flexible location code that allows them to select from a variety of solutions that span a range of computational efficiency and simplicity of interpretation. A new location code, LocOO, has been developed to deal with these issues. A seismic event location is comprised of a point in 4-dimensional (4D) space-time, surrounded by a 4D uncertainty boundary. The point location is useless without the uncertainty that accompanies it. While it is mathematically straightforward to reduce the dimensionality of the 4D uncertainty limits, the number of dimensions that should be retained depends on the dimensionality of the location to which the calculated event location is to be compared. In nuclear explosion monitoring, when an event is to be compared to a known or suspected test site location, the three spatial components of the test site and event location are to be compared and 3 dimensional uncertainty boundaries should be considered. With LocOO, users can specify a location to which the calculated seismic event location is to be compared and the dimensionality of the uncertainty is tailored to that of the location specified by the user. The code also calculates the probability that the two locations in fact coincide. The non-linear travel time curves that constrain calculated event locations present two basic difficulties. The first is that the non-linearity can cause least squares inversion techniques to fail to converge. LocOO implements a nonlinear Levenberg-Marquardt least squares inversion technique that is guaranteed to converge in a finite number of iterations for tractable problems. The second difficulty is that a high degree of non-linearity causes the uncertainty boundaries around the event location to deviate significantly from elliptical shapes. LocOO can optionally calculate and display non-elliptical uncertainty boundaries at the cost of a minimal increase in computation time and complexity of interpretation. All location codes are plagued by the possibility of having local minima obscuring the single global minimum. No code can guarantee that it will find the global minimum in a finite number of computations. Grid search algorithms have been developed to deal with this problem, but have a high computational cost. In order to improve the likelihood of finding the global minimum in a timely manner, LocOO implements a hybrid least squares-grid search algorithm. Essentially, many least squares solutions are computed starting from a user-specified number of initial locations; and the solution with the smallest sum squared weighted residual is assumed to be the optimal location. For events of particular interest, analysts can display contour plots of gridded residuals in a selected region around the best-fit location, improving the probability that the global minimum will not be missed and also providing much greater insight into the character and quality of the calculated solution.
The most widely used algorithm for estimating seismic event hypocenters and origin times is iterative linear least squares inversion. In this paper we review the mathematical basis of the algorithm and discuss the major assumptions made during its derivation. We go on to explore the utility of using Levenberg-Marquardt damping to improve the performance of the algorithm in cases where some of these assumptions are violated. We also describe how location parameter uncertainties are calculated. A technique to estimate an initial seismic event location is described in an appendix.
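In schematic form (our notation; the report derives the details), each iteration of the damped least-squares update and the resulting location uncertainty are

\[
\Delta\mathbf{m} \;=\; \left(\mathbf{G}^{\mathsf{T}}\mathbf{W}\mathbf{G} + \lambda \mathbf{I}\right)^{-1} \mathbf{G}^{\mathsf{T}}\mathbf{W}\,\mathbf{r},
\qquad
\mathbf{C}_{m} \;\approx\; s^{2} \left(\mathbf{G}^{\mathsf{T}}\mathbf{W}\mathbf{G}\right)^{-1},
\]

where $\mathbf{m}$ = (latitude, longitude, depth, origin time), $\mathbf{G}$ contains the partial derivatives of the predicted observations with respect to $\mathbf{m}$, $\mathbf{W}$ weights the observations by their uncertainties, $\mathbf{r}$ is the residual vector, $\lambda$ is the Levenberg-Marquardt damping parameter (increased when a step fails to reduce the weighted misfit, decreased when it succeeds), and $s^{2}$ is a variance scale factor for the fit.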
The U. S. Department of Energy Strategic Petroleum Reserve currently has approximately 500 million barrels of crude oil stored in 62 caverns solution-mined in salt domes along the Gulf Coast of Louisiana and Texas. One of the challenges of operating these caverns is ensuring that none of the fluids in the caverns are leaking into the environment. The current approach is to test the mechanical integrity of all the wells entering each cavern approximately once every five years. An alternative approach to detecting cavern leaks is to monitor the cavern pressure, since leaking fluid would act to reduce cavern pressure. Leak detection by pressure monitoring is complicated by other factors that influence cavern pressure, the most important of which are thermal expansion and contraction of the fluids in the cavern as they come into thermal equilibrium with the host salt, and cavern volume reduction due to salt creep. Cavern pressure is also influenced by cavern enlargement resulting from salt dissolution following introduction of raw water or unsaturated brine into the cavern. However, this effect only lasts for a month or two following a fluid injection. In order to implement a cavern pressure monitoring program, a software program called CaveMan has been developed. It includes thermal, creep and salt dissolution models and is able to predict the cavern pressurization rate based on the operational history of the cavern. Many of the numerous thermal and mechanical parameters in the model have been optimized to produce the best match between the historical data and the model predictions. Future measurements of cavern pressure are compared to the model predictions, and significant differences in cavern pressure set program flags that notify cavern operators of a potential problem. Measured cavern pressures that are significantly less than those predicted by the model may indicate the existence of a leak.
The Single Heater Test (SHT) is a sixteen-month-long heating and cooling experiment begun in August 1996, located underground within the unsaturated zone near the potential geologic repository at Yucca Mountain, Nevada. During the 9-month heating phase of the test, roughly 15 m^3 of rock were raised to temperatures exceeding 100 °C. In this paper, temperatures measured in sealed boreholes surrounding the heater are compared to temperatures predicted by 3D thermal-hydrologic calculations performed with a finite difference code. Three separate model runs using different values of bulk rock permeability (4 microdarcy to 5.2 darcy) yielded significantly different predicted temperatures and temperature distributions. All the models differ from the data, suggesting that to accurately model the thermal-hydrologic behavior of the SHT, the Equivalent Continuum Model (ECM), the conceptual basis for dealing with the fractured porous medium in the numerical predictions, should be discarded in favor of more sophisticated approaches.
The need for a reliable, fast, wireless telemetry system in the drilling industry is great but the technical challenge to develop such a system is huge. A downhole wireless telemetry system based on Surface Area Modulation (SAM) has been developed which involves the introduction of an electrically insulated gap near the bottom of an otherwise conductive drillstring. The electrical resistance of this gap can be modulated to alter the electrical characteristics of a circuit involving a surface power supply, the sections of the drillstring above and below the gap, the earth, and a nearby return electrode. These changes alter the current in the circuit, which can be monitored at the surface with an ammeter. Downhole data are encoded and transmitted to the surface as a pattern of current oscillations. In a field test, the SAM system successfully transmitted downhole information from depths of 1,400 ft below the fluid level to the surface at a rate of 110 baud. Electrical insulation on the outside of the simulated drillstring was required to achieve this level of performance. Electrically insulated tubing improved the data transmission rate at a given depth by more than an order of magnitude, and increased the maximum depth from which successful data telemetry could be achieved by more than a factor of two.
The Yucca Mountain Project conducted a Single Heater Test (SHT) in the Exploratory Studies Facility at Yucca Mountain. During the nine-month-long heating phase, approximately 4 m^3 of in situ, fractured, 92% saturated, welded tuff was heated to temperatures above 100 °C by a 5 m long, 3.8 kW, horizontal, line heater. In this paper, the thermal data collected during the test (Sandia National Laboratories, 1997) are compared to three numerical simulations (Sobolik et al., 1996) in order to gain insight into the coupled thermal-hydrologic processes. All three numerical simulations rely on the Equivalent Continuum Model (ECM) for reasons of computational efficiency. The ECM assumes that the matrix and the fractures are in thermodynamic equilibrium, which allows the thermal and hydrologic properties of the matrix and the fractures to be combined into single, bulk values. The three numerical simulations differ only in their bulk permeabilities and are referred to as the High, Low and Matrix Permeability Models, respectively. In the Matrix Permeability Model, the system behaves as an unfractured porous medium with the properties of the rock matrix.
This report describes a project to develop a flow probe to monitor gas movement in the vadose zone due to passive venting or active remediation efforts such as soil vapor extraction. 3-D and 1-D probes were designed, fabricated, tested in known flow fields under laboratory conditions, and field tested. The 3-D probes were based on technology developed for ground water flow monitoring. The probes gave excellent agreement with measured air velocities in the laboratory tests. Data processing software developed for ground water flow probes was modified for use with air flow, and to accommodate various probe designs. Modifications were made to decrease the cost of the probes, including developing a downhole multiplexer. Modeling indicated problems with flow channeling due to the mode of deployment. Additional testing was conducted and modifications were made to the probe and to the deployment methods. The probes were deployed at three test sites: a large outdoor test tank, a brief vapor extraction test at the Chemical Waste landfill, and an active remediation site at a local gas station. The data from the field tests varied markedly from the laboratory test data. All of the major events such as vapor extraction system turn on and turn off, as well as changes in the flow rate, could be seen in the data. However, there were long term trends in the data which were much larger than the velocity signals, which made it difficult to determine accurate air velocities. These long term trends may be due to changes in soil moisture content and seasonal ground temperature variations.
The original DAMP (DAta Manipulation Program) was written by Mark Hedemann of Sandia National Laboratories and used the CA-DISSPLA™ (available from Computer Associates International, Inc., Garden City, NY) graphics package as its engine. It was used to plot, modify, and otherwise manipulate the one-dimensional data waveforms (data vs. time) from a wide variety of accelerators. With the waning of CA-DISSPLA and the increasing popularity of UNIX®-based workstations, a replacement was needed. This package uses the IDL® software, available from Research Systems Incorporated in Boulder, Colorado, as the engine, and creates a set of widgets to manipulate the data in a manner similar to the original DAMP. IDL is currently supported on a wide variety of UNIX platforms such as IBM® workstations, Hewlett Packard workstations, SUN® workstations, Microsoft® Windows™ computers, Macintosh® computers and Digital Equipment Corporation VMS® systems. Thus, xdamp is portable across many platforms. The authors have verified operation, albeit with some minor IDL bugs, on IBM PC computers using Windows, Windows 95 and Windows NT; IBM UNIX platforms; DEC Alpha and VMS systems; HP 9000/700 series workstations; and Macintosh computers, both regular and PowerPC™ versions. Version 2 updates xdamp to require IDL version 4.0.1, adds many enhancements, and fixes a number of bugs.
To properly characterize the transport of contaminants from the sediments beneath the Hanford Site into the Columbia River, a suite of In Situ Permeable Flow Sensors was deployed to accurately characterize the hydrologic regime in the banks of the river. The three dimensional flow velocity was recorded on an hourly basis from mid May to mid July, 1994 and for one week in September. The first data collection interval coincided with the seasonal high water level in the river while the second interval reflected conditions during relatively low seasonal river stage. Two flow sensors located approximately 50 feet from the river recorded flow directions which correlated very well with river stage, both on seasonal and diurnal time scales. During time intervals characterized by falling river stage, the flow sensors recorded flow toward the river while flow away from the river was recorded during times of rising river stage. The flow sensor near the river in the Hanford Formation recorded a component of flow oriented vertically downward, probably reflecting the details of the hydrostratigraphy in close proximity to the probe. The flow sensor near the river in the Ringold Formation recorded an upward component of flow which dominated the horizontal components most of the time. The upward flow in the Ringold probably reflects regional groundwater flow into the river. The magnitudes of the flow velocities recorded by the flow sensors were lower than expected, probably as a result of drilling induced disturbance of the hydraulic properties of the sediments around the probes. The probes were installed with resonant sonic drilling which may have compacted the sediments immediately surrounding the probes, thereby reducing the hydraulic conductivity adjacent to the probes and diverting the groundwater flow away from the sensors.
In 1992, a sinkhole was discovered above a Strategic Petroleum Reserve storage facility at Weeks Island, Louisiana. The oil is stored in an old salt mine located within a salt dome. In order to assess the hydrologic significance of the sinkhole, an In Situ Permeable Flow Sensor was deployed within a sand-filled conduit in the salt dome directly beneath the sinkhole. The flow sensor is a recently developed instrument which uses a thermal perturbation technique to measure the magnitude and direction of the full 3-dimensional groundwater flow velocity vector in saturated, permeable materials. The flow sensor measured substantial groundwater flow directed vertically downward into the salt dome. The data obtained with the flow sensor provided critical evidence which was instrumental in assessing the significance of the sinkhole in terms of the integrity of the oil storage facility.
A suite of In Situ Permeable Flow Sensors was deployed at the site of the Savannah River Integrated Demonstration to monitor the interaction between the groundwater flow regime and air injected into the saturated subsurface through a horizontal well. One of the goals of the experiment was to determine if a groundwater circulation system was induced by the air injection process. The data suggest that no such circulation system was established, perhaps due to the heterogeneous nature of the sediments through which the injected gas has to travel. The steady state and transient groundwater flow patterns observed suggest that the injected air followed high permeability pathways from the injection well to the water table. The preferential pathways through the essentially horizontal impermeable layers appear to have been created by drilling activities at the site.
A new technology called the In Situ Permeable Flow Sensor has been developed at Sandia National Laboratories. These sensors use a thermal perturbation technique to directly measure the direction and magnitude of the full three dimensional groundwater flow velocity vector in unconsolidated, saturated, porous media. The velocity measured is an average value characteristic of an approximately 1 cubic meter volume of the subsurface. During a test at the Savannah River Site in South Carolina, two flow sensors were deployed in a confined aquifer in close proximity to a well which was screened over the entire vertical extent of the aquifer and the well was pumped at four different pumping rates. In this situation horizontal flow which is radially directed toward the pumping well is expected. The flow sensors measured horizontal flow which was directed toward the pumping well, within the uncertainty in the measurements. The observed magnitude of the horizontal component of the flow velocity increased linearly with pumping rate, as predicted by theoretical considerations. The measured horizontal component of the flow velocity differed from the predicted flow velocity, which was calculated with the assumptions that the hydraulic properties of the aquifer were radially homogeneous and isotropic, by less than a factor of two. Drawdown data obtained from other wells near the pumping well during the pump test indicate that the hydraulic properties of the aquifer are probably not radially homogeneous but the effect of the inhomogeneity on the flow velocity field around the pumping well was not modeled because the degree and distribution of the inhomogeneity are unknown. Grain size analysis of core samples from wells in the area was used to estimate the vertical distribution of hydraulic conductivity.
The In Situ Permeable Flow Sensor, a new technology which uses a thermal perturbation technique to directly measure the 3-dimensional groundwater flow velocity vector at a point in permeable, unconsolidated geologic formations, has been used to monitor changes in the groundwater flow regime around an experimental air stripping waste remediation activity. While design flaws in the first version of the technology, which was used during the experiment being reported here, precluded measurements of the horizontal component of the flow velocity, measurements of the vertical component of the flow velocity were obtained. Results indicate that significant changes in the vertical flow velocity were induced by the air injection system. One flow sensor, MHM6, measured a vertical flow velocity of 4 m/yr or less when the air injection system was not operating and 25 m/yr when the air injection system was on. This may be caused by air bubbles moving past the probes or may be the result of the establishment of a more widespread flow regime in the groundwater induced by the air injection system. In the latter case, significantly more groundwater would be remediated by the air stripping operation since groundwater would be circulated through the zone of influence of the air injection system. Newly designed flow sensors, already in the ground at Savannah River to monitor Phase II of the project, are capable of measuring horizontal as well as vertical components of flow velocity.