IMPLEMENTATION OF A PSEUDO-BENDING SEISMIC TRAVEL TIME CALCULATOR IN A DISTRIBUTED PARALLEL COMPUTING ENVIRONMENT
Abstract not provided.
This guide is intended to enable researchers working with seismic data, but lacking backgrounds in computer science and programming, to develop seismic algorithms using the MATLAB-based MatSeis software. Specifically, it presents a series of step-by-step instructions for writing four functions of increasing complexity, while simultaneously explaining the notation, syntax, and general program design of the functions being written. The ultimate goal is that the user can employ this guide as a jumping-off point from which to write new functions that are compatible with, and expand the capabilities of, the current MatSeis software that has been developed as part of the Ground-based Nuclear Explosion Monitoring Research and Engineering (GNEMRE) program at Sandia National Laboratories.
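To give a flavor of the functions the guide develops, the following is a minimal sketch of a standalone MATLAB function of the simplest kind; the function name and interface are hypothetical illustrations, not actual MatSeis functions.

function out = demean_waveform(data)
% DEMEAN_WAVEFORM  Remove the sample mean from a waveform vector.
% Hypothetical example for illustration; not an actual MatSeis function.
%   data - vector of waveform samples
%   out  - the same samples with their mean removed
if ~isvector(data)
    error('demean_waveform:badInput', 'data must be a vector');
end
out = data - mean(data);
end

Saved as demean_waveform.m on the MATLAB path, this would be called as w = demean_waveform(w); the guide's later functions add MatSeis-specific data handling and graphical interfaces on top of patterns like this.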
To improve the nuclear event monitoring capability of the U.S., the NNSA Ground-based Nuclear Explosion Monitoring Research & Engineering (GNEM R&E) program has been developing a collection of products known as the Knowledge Base (KB). Though much of the focus for the KB has been on the development of calibration data, we have also developed numerous software tools for various purposes. The Matlab-based MatSeis package and the associated suite of regional seismic analysis tools were developed to aid in the testing and evaluation of some Knowledge Base products for which existing applications were either not available or ill-suited. This presentation will provide brief overviews of MatSeis and each of the tools, emphasizing features added in the last year.

MatSeis was begun in 1996 and is now a fairly mature product. It is a highly flexible seismic analysis package that provides interfaces to read data from either flatfiles or an Oracle database. All of the standard seismic analysis tasks are supported (e.g. filtering, 3 component rotation, phase picking, event location, magnitude calculation), as well as a variety of array processing algorithms (beaming, FK, coherency analysis, vespagrams). The simplicity of Matlab coding and the tremendous number of available functions make MatSeis/Matlab an ideal environment for developing new monitoring research tools (see the regional seismic analysis tools below). New MatSeis features include: addition of evid information to events in MatSeis, options to screen picks by author, input and output of origerr information, improved performance in reading flatfiles, improved speed in FK calculations, and significant improvements to Measure Tool (filtering, multiple phase display), Free Plot (filtering, phase display and alignment), Mag Tool (maximum likelihood options), and Infra Tool (improved calculation speed, display of an F statistic stream).

Work on the regional seismic analysis tools (CodaMag, EventID, PhaseMatch, and Dendro) began in 1999, and the tools vary in their level of maturity. All rely on MatSeis to provide necessary data (waveforms, arrivals, origins, and travel time curves). CodaMag Tool implements magnitude calculation by scaling to fit the envelope shape of the coda for a selected phase type (Mayeda, 1993; Mayeda and Walter, 1996). New tool features include: calculation of a yield estimate based on the source spectrum, display of a filtered version of the seismogram based on the selected band, and the output of codamag data records for processed events. EventID Tool implements event discrimination using phase ratios of regional arrivals (Hartse et al., 1997; Walter et al., 1999). New features include: bandpass filtering of displayed waveforms, screening of reference events based on SNR, multivariate discriminants, use of libcgi to access correction surfaces, and the output of discrim_data records for processed events. PhaseMatch Tool implements match filtering to isolate surface waves (Herrin and Goforth, 1977). New features include: display of the signal's observed dispersion and an option to use a station-based dispersion surface. Dendro Tool implements agglomerative hierarchical clustering using dendrograms to identify similar events based on waveform correlation (Everitt, 1993). New features include: modifications to include arrival information within the tool, and the capability to automatically add/re-pick arrivals based on the picked arrivals for similar events.
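As a rough illustration of the kind of processing behind a tool such as CodaMag, the following MATLAB fragment bandpass filters a trace and forms a smoothed envelope; the sample rate, band, smoothing length, and the synthetic input are all assumptions for this sketch, and the tool's actual processing may differ.

% Sketch: bandpass filter a trace and form a smoothed coda envelope.
% All numerical choices below are assumed values for illustration.
fs = 40;                                 % sample rate in Hz (assumed)
t  = (0:fs*60-1)'/fs;                    % one minute of samples
w  = randn(size(t));                     % stand-in for a real seismogram
[b, a] = butter(4, [1 2]/(fs/2));        % 4th-order Butterworth, 1-2 Hz band
wf  = filtfilt(b, a, w);                 % zero-phase bandpass filtering
env = abs(hilbert(wf));                  % instantaneous amplitude envelope
env = conv(env, ones(fs,1)/fs, 'same');  % 1 s moving-average smoothing
% A coda magnitude would then follow from fitting the decay of log10(env).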
In order to exploit the information on surface wave propagation that is stored in large seismic event datasets, Sandia and Lawrence Livermore National Laboratories have developed a MatSeis interface for performing phase-matched filtering of Rayleigh arrivals. MatSeis is a Matlab-based seismic processing toolkit which provides graphical tools for analyzing seismic data from a network of stations. Tools are available for spectral and polarization measurements, as well as beam forming and f-k analysis with array data, to name just a few. Additionally, one has full access to the Matlab environment and any functions available there. Previously the authors reported the development of new MatSeis tools for calculating regional discrimination measurements. The first of these performs Lg coda analysis as developed by Mayeda and coworkers at Lawrence Livermore National Laboratory. A second tool measures regional phase amplitude ratios for an event and compares the results to ratios from known earthquakes and explosions. Release 1.5 of MatSeis includes the new interface for the analysis of surface wave arrivals. This effort involves the use of regionalized dispersion models from a repository of surface wave data and the construction of phase-matched filters to improve surface wave identification, detection, and magnitude calculation. The tool works as follows. First, a ray is traced from source to receiver through a user-defined grid containing different group velocity versus period values to determine the composite group velocity curve for the path. This curve is shown along with the upper and lower group velocity bounds for reference. Next, the curve is used to create a phase-matched filter, apply the filter, and show the resultant waveform. The application of the filter allows obscured Rayleigh arrivals to be more easily identified. Finally, after screening information outside the range of the phase-matched filter, an inverse version of the filter is applied to obtain a cleaned raw waveform which can be used for amplitude measurements. Because all the MatSeis tools have been written as Matlab functions, they can be easily modified to experiment with different processing details. The performance of the propagation models can be evaluated using any event available in the repository of surface wave events.
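The compression step can be sketched compactly in MATLAB. Assuming the composite group velocity curve for the path and the source-receiver distance are already in hand, the predicted phase is built from the group delay and removed in the frequency domain; the dispersion law, distance, and synthetic input below are placeholders for illustration only.

% Sketch: compress a dispersed surface wave with a phase-matched filter.
fs = 1;  n = 4096;                 % sample rate (Hz) and trace length (assumed)
f  = (0:n-1)'*fs/n;                % two-sided FFT frequency vector
r  = 3000;                         % path length in km (assumed)
U  = min(3 + f, 4);                % placeholder group velocity curve, km/s
tg  = r ./ U;                      % group delay at each frequency
phi = 2*pi*cumsum(tg)*(fs/n);      % predicted phase: integral of group delay
H   = exp(1i*phi);                 % phase-matched filter
H(n/2+2:end) = conj(flipud(H(2:n/2)));  % enforce conjugate symmetry
w  = randn(n,1);                   % stand-in for the observed waveform
wc = real(ifft(fft(w) .* H));      % compressed trace: dispersed energy collapses
% Windowing the compressed pulse and applying the inverse filter conj(H)
% yields the cleaned raw waveform used for amplitude measurements.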
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database.

The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g. single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important for bridging gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site.

As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially provide a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics are identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process.

There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize, and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would provide the researcher with unprecedented power without requiring mastery of the intricacies and complexities of relational database systems.
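The core of the dendrogram tool's clustering can be sketched in a few lines of MATLAB using Statistics Toolbox functions; the preprocessing (windowing, filtering, enveloping, down-sampling) is omitted, and the random input stands in for real aligned waveforms.

% Sketch: cluster a set of waveforms by correlation and draw a dendrogram.
W = randn(8, 1000);                 % stand-in for aligned, windowed waveforms
W = W - mean(W, 2);                 % demean each trace
W = W ./ sqrt(sum(W.^2, 2));        % normalize each trace to unit energy
C = W * W';                         % pairwise correlation coefficients
D = 1 - C;                          % correlation distance
D = (D + D')/2;                     % force exact symmetry
D(1:size(D,1)+1:end) = 0;           % force a zero diagonal
Z = linkage(squareform(D), 'complete');  % complete-linkage clustering
dendrogram(Z);                      % display the cluster tree

One simple way to base the clustering on two or more stations, as the tool allows, would be to combine per-station distance matrices before the linkage step.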
The parametric grid capability of the Knowledge Base provides an efficient, robust way to store and access interpolatable information which is needed to monitor the Comprehensive Nuclear Test Ban Treaty. To meet both the accuracy and performance requirements of operational monitoring systems, we use a new approach which combines the error estimation of kriging with the speed and robustness of Natural Neighbor Interpolation (NNI). The method involves three basic steps: data preparation (DP), data storage (DS), and data access (DA). The goal of data preparation is to process a set of raw data points to produce a sufficient basis for accurate NNI of value and error estimates in the data access step. This basis includes a set of nodes and their connectedness, collectively known as a tessellation, and the corresponding values and errors that map to each node, which we call surfaces. In many cases, the raw data point distribution is not sufficiently dense to guarantee accurate error estimates from the NNI, so the original data set must be densified using a newly developed interpolation technique known as Modified Bayesian Kriging. Once appropriate kriging parameters have been determined by variogram analysis, the optimum basis for NNI is determined in a process we call mesh refinement, which involves iterative kriging, new node insertion, and Delaunay triangle smoothing. The process terminates when an NNI basis has been calculated which fits the kriged values to within a specified tolerance. In the data storage step, the tessellations and surfaces are stored in the Knowledge Base, currently in a binary flatfile format but perhaps in the future in a spatially indexed database. Finally, in the data access step, a client application makes a request for an interpolated value, which triggers a data fetch from the Knowledge Base through the libKBI interface, a walking triangle search for the containing triangle, and finally the NNI interpolation.
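The walking triangle search in the data access step can be sketched as follows in MATLAB; here the tessellation is built with MATLAB's delaunayTriangulation for self-containment, whereas in the actual system it is fetched from the Knowledge Base, and all names and sizes are illustrative.

% Sketch: walk from triangle to triangle toward the query point using
% barycentric coordinates until the containing triangle is found.
pts = rand(50, 2);                          % stand-in tessellation nodes
dt  = delaunayTriangulation(pts);           % mesh and connectivity
q   = [0.5 0.5];                            % query point
t   = 1;                                    % start at an arbitrary triangle
for step = 1:size(dt.ConnectivityList, 1)   % cap the walk length
    b = cartesianToBarycentric(dt, t, q);   % coordinates of q in triangle t
    [bmin, k] = min(b);
    if bmin >= 0, break; end                % all coordinates >= 0: q is inside
    nb = neighbors(dt, t);                  % cross the edge opposite the
    t  = nb(k);                             % most negative coordinate
    if isnan(t), break; end                 % q lies outside the convex hull
end
% NNI of value and error would then be performed at q using the
% natural neighbors surrounding the containing triangle t.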
Further improvements to the Waveform Correlation Event Detection System (WCEDS) developed by Sandia Laboratory have made it possible to test the system on the accepted Comprehensive Test Ban Treaty (CTBT) seismic monitoring network. For our test interval we selected a 24-hour period from December 1996, and chose to use the Reviewed Event Bulletin (REB) produced by the Prototype International Data Center (PIDC) as ground truth for evaluating the results. The network is heterogeneous, consisting of array and three-component sites, and as a result requires more flexible waveform processing algorithms than were available in the first version of the system. For simplicity and superior performance, we opted to use the spatial coherency algorithm of Wagner and Owens (1996) for both types of sites. Preliminary tests indicated that the existing version of WCEDS, which ignored directional information, could not achieve satisfactory detection or location performance for many of the smaller events in the REB, particularly those in the south Pacific where the network coverage is unusually sparse. To achieve an acceptable level of performance, we made modifications to include directional consistency checks for the correlations, making the regions of high correlation much less ambiguous. These checks require the production of continuous azimuth and slowness streams for each station, which is accomplished by means of FK processing for the arrays and power polarization processing for the three-component sites. In addition, we added the capability to use multiple frequency-banded data streams for each site to increase sensitivity to phases whose frequency content changes as a function of distance.
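The consistency check itself is simple in form; the sketch below discards a station's correlation contribution when the observed azimuth or slowness at the arrival time disagrees with the values predicted for the candidate grid point. The tolerances and the hard accept/reject rule are assumptions for illustration.

function c = directional_check(c, azObs, azPred, slObs, slPred)
% Zero a correlation contribution that fails the directional checks.
% Tolerances are assumed values for illustration only.
azTol = 20;                                  % azimuth tolerance, degrees
slTol = 0.02;                                % slowness tolerance, s/km
dAz = mod(azObs - azPred + 180, 360) - 180;  % azimuth residual in [-180, 180)
if abs(dAz) > azTol || abs(slObs - slPred) > slTol
    c = 0;                                   % inconsistent: discard
end
end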
Waveform Correlation Event Detection System (WCEDS) prototypes have now been developed for both global and regional networks and the authors have extensively tested them to assess the potential usefulness of this technology for CTBT (Comprehensive Test Ban Treaty) monitoring. In this paper they present the results of tests on data sets from the IDC (International Data Center) Primary Network and the New Mexico Tech Seismic Network. The data sets span a variety of event types and noise conditions. The results are encouraging at both scales but show particular promise for regional networks. The global system was developed at Sandia Labs and has been tested on data from the IDC Primary Network. The authors have found that for this network the system does not perform at acceptable levels for either detection or location unless directional information (azimuth and slowness) is used. By incorporating directional information, however, both areas can be improved substantially suggesting that WCEDS may be able to offer a global detection capability which could complement that provided by the GA (Global Association) system in use at the IDC and USNDC (United States National Data Center). The local version of WCEDS (LWCEDS) has been developed and tested at New Mexico Tech using data from the New Mexico Tech Seismic Network (NMTSN). Results indicate that the WCEDS technology works well at this scale, despite the fact that the present implementation of LWCEDS does not use directional information. The NMTSN data set is a good test bed for the development of LWCEDS because of a typically large number of observed local phases and near network-wide recording of most local and regional events. Detection levels approach those of trained analysts, and locations are within 3 km of manually determined locations for local events.
The goal of the Waveform Correlation Event Detection System (WCEDS) Project at Sandia Labs has been to develop a prototype of a full-waveform correlation-based seismic event detection system which could be used to assess the technology's potential usefulness for CTBT monitoring. The current seismic event detection system in use at the IDC is very sophisticated and provides good results, but there is still significant room for improvement, particularly in reducing the number of false events (currently nearly equal in number to the real events). Our first prototype was developed last year, and since then we have used it for extensive testing from which we have gained considerable insight. The original prototype was based on a long-period detector designed by Shearer (1994), but it has been heavily modified to address problems encountered in application to a data set from the Incorporated Research Institutions for Seismology (IRIS) broadband global network. Important modifications include capabilities for event masking and iterative event detection, continuous near-real-time execution, improved Master Image creation, and individualized station pre-processing. All have been shown to improve bulletin quality. In some cases the system has detected marginal events which may not be detectable by traditional detection systems, but definitive conclusions cannot be made without direct comparisons. For this reason, future work will focus on using the system to process GSETT3 data for comparison with current event detection systems at the IDC.
A study using long-period seismic data showed that seismic events can be detected and located based on correlations of processed waveform profiles with the profile expected for an event. In this technique both time and space are discretized, and events are found by forming profiles and calculating correlations for all time-distance points; events are declared at points with large correlations. In the first phase of the Waveform Correlation Event Detection System (WCEDS) Project at Sandia Labs we have developed a prototype automatic event detection system based on Shearer's work which shows promise for treaty monitoring applications. Many modifications have been made to meet the requirements of the monitoring environment. A new full-matrix formulation of the correlation has been developed which can reduce the number of computations needed by as much as two orders of magnitude for large grids. New methodology has also been developed to deal with the problems caused by false correlations (sidelobes) generated during the correlation process. When an event has been detected, masking matrices are set up which mask all correlation sidelobes due to the event, allowing other events with intermingled phases to be found. This process is repeated until a detection threshold is reached. The system was tested on one hour of Incorporated Research Institutions for Seismology (IRIS) broadband data and built all 4 of the events listed in the National Earthquake Information Center (NEIC) Preliminary Determination of Epicenters (PDE) which were observable by the IRIS network. A continuous execution scheme has been developed for the system but has not yet been implemented. Improvements to the efficiency of the code are in various stages of development. Many refinements would have to be made to the system before it could be used as part of an actual monitoring system, but at this stage we know of no clear barriers that would prevent an eventual implementation.
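The detect-and-mask iteration can be sketched as follows; the random detector output and the rectangular sidelobe mask below are crude stand-ins for the correlation output and for an event's predicted sidelobe pattern.

% Sketch: iteratively declare the largest detector peak as an event,
% then mask its sidelobes and search again, until the threshold is reached.
nGrid = 100;  nTimes = 600;  threshold = 0.9;   % assumed sizes and threshold
out0 = rand(nGrid, nTimes);           % stand-in for the detector output
mask = true(nGrid, nTimes);           % nothing masked initially
events = zeros(0, 3);
while true
    out = out0 .* mask;               % apply the accumulated masks
    [cmax, idx] = max(out(:));
    if cmax < threshold, break; end   % stop at the detection threshold
    [ig, it] = ind2sub(size(out), idx);
    events(end+1, :) = [ig, it, cmax];   % declare an event at (ig, it)
    % Mask the region containing the event's correlation sidelobes;
    % here crudely approximated by a rectangular grid-time block.
    mask(max(1,ig-5):min(nGrid,ig+5), max(1,it-30):min(nTimes,it+30)) = false;
end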
We have developed a working prototype of a grid-based global event detection system based on waveform correlation. The algorithm comes from a long-period detector, but we have recast it in a full matrix formulation which can reduce the number of multiplications needed by better than two orders of magnitude for realistic monitoring scenarios. The reduction is made possible by eliminating redundant multiplications in the original formulation. All unique correlations for a given origin time are stored in a correlation matrix (C), which is formed by a full matrix product of a Master Image matrix (M) and a data matrix (D). The detector value at each grid point is calculated by following a different summation path through the correlation matrix. Master Images can be derived either empirically or synthetically. Our testing has used synthetic Master Images because their influence on the detector is easier to understand. We tested the system using the matrix formulation with continuous data from the IRIS (Incorporated Research Institutions for Seismology) broadband global network to monitor a 2-degree evenly spaced surface grid with a time discretization of 1 sps; we successfully detected the largest event in a two-hour segment from October 1993. The output at the correct grid point was at least 33% larger than at adjacent grid points, and the output at the correct grid point at the correct origin time was more than 500% larger than the output at the same grid point immediately before or after. Analysis of the C matrix for the origin time of the event demonstrates that there are many significant "false" correlations of observed phases with incorrect predicted phases. These false correlations dull the sensitivity of the detector and so must be dealt with if our system is to attain detection thresholds consistent with a Comprehensive Test Ban Treaty (CTBT).
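In MATLAB the formulation reduces to a single matrix product; the sizes and random contents below are stand-ins, since a real Master Image encodes the predicted phase arrivals as a function of source-receiver distance.

% Sketch: all unique correlations for one origin time as C = M*D,
% followed by a summation path through C for one grid point.
nDist = 180;               % discretized source-receiver distances
nSamp = 1800;              % samples in the correlation window
nSta  = 30;                % stations in the network
M = rand(nDist, nSamp);    % Master Image: predicted profile per distance
D = rand(nSamp, nSta);     % processed data, one column per station
C = M * D;                 % every distance correlated with every station
% Detector value for one grid point: sum C along the path set by the
% distance from that grid point to each station (random indices here).
distIdx = randi(nDist, nSta, 1);
detVal  = sum(C(sub2ind(size(C), distIdx, (1:nSta)')));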
The Simplified Analytical Model of Penetration with Lateral Loading (SAMPLL) computer code developed at Sandia National Laboratories has been modified to allow additional penetration capabilities. The new capabilities include the ability to model penetration by other than cylindrical penetrators (flares, tapers, and boattails) and the ability to calculate penetration/perforation of multiple layers of different materials. Additionally, updated soil and rock empirical equations have been added to the model. A broader range of problems can now be modeled more accurately with the modified SAMPLL. 7 refs., 6 figs.
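The multiple-layer bookkeeping can be sketched as a simple loop; the per-layer depth law below is a generic placeholder in which depth scales with the square of velocity, not the SAMPLL empirical equations, and all numbers are assumed.

% Sketch: march a penetrator through layers, stopping within a layer or
% perforating it and carrying a reduced velocity into the next one.
layers = [0.5 2; 1.0 5; 2.0 10];         % [thickness m, hardness index] (assumed)
v = 300;                                 % impact velocity in m/s (assumed)
layer_depth = @(v, s) v^2 / (2e3 * s);   % placeholder depth law, meters
for k = 1:size(layers, 1)
    d = layer_depth(v, layers(k, 2));    % depth if this layer were infinite
    if d <= layers(k, 1)
        fprintf('Stops in layer %d at %.2f m into the layer\n', k, d);
        v = 0;  break
    end
    v = v * sqrt(1 - layers(k, 1)/d);    % residual velocity after perforation
    fprintf('Layer %d perforated; exit velocity %.1f m/s\n', k, v);
end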