SOA Service Identification
Abstract not provided.
Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats, fewer describe them in useful terms, and fewer still measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that resists easy measurement and, in some cases, appears to defy measurement altogether. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US Federal Civilian Executive Branch (FCEB) agencies and systems.
Sandia National Laboratories has tested, evaluated, and reported on the Geotech Smart24 data acquisition system with active Fortezza crypto card data signing and authentication in SAND2008-. One test, Input Terminated Noise, allows us to characterize the self-noise of the Smart24 system. By computing the power spectral density (PSD) of the input-terminated noise time series and correcting for the instrument response of different seismometers, the resulting spectrum can be compared to the USGS new low-noise model (NLNM) of Peterson (1996) to determine whether the matched system of seismometer and Smart24 is quiet enough for any general deployment location. Four seismometer models were evaluated: the Streckeisen STS2 (low and high gain), the Guralp CMG3T, and the Geotech GS13. Each has a unique pass-band, defined as the frequency band over which the instrument-corrected noise spectrum falls below the new low-noise model.
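The PSD step of the self-noise evaluation described above can be sketched as follows. This is a minimal illustration, not the report's actual procedure: the record is synthetic white noise in digitizer counts, the flat `sensitivity` value is purely hypothetical, and a real comparison would apply the seismometer's full frequency-dependent response and use Peterson's tabulated NLNM rather than any single threshold.

```python
import numpy as np

def power_spectral_density(x, fs):
    """One-sided periodogram PSD of a real time series (Hann window),
    scaled so the PSD integrates to the signal variance."""
    n = len(x)
    w = np.hanning(n)
    xw = (x - x.mean()) * w
    spec = np.fft.rfft(xw)
    psd = (np.abs(spec) ** 2) / (fs * np.sum(w ** 2))
    psd[1:-1] *= 2.0                       # fold in negative frequencies
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

# Hypothetical input-terminated record: 40 sps Gaussian noise in counts.
fs = 40.0
sensitivity = 4.0e5                        # counts per m/s^2 (illustrative)
rng = np.random.default_rng(0)
counts = rng.normal(0.0, 3.0, 16384)

freqs, psd_counts = power_spectral_density(counts, fs)
# Convert counts^2/Hz to dB relative to 1 (m/s^2)^2/Hz; in practice this
# spectrum would then be plotted against the Peterson NLNM curve.
psd_accel_db = 10.0 * np.log10(psd_counts[1:] / sensitivity ** 2)
```

The instrument correction here is a single scalar division only because the assumed response is flat; deconvolving a real pole-zero response would replace that step.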
Most test methodologies referenced in this Test Definition and Test Procedures were designed by Sandia specifically for geophysical instrumentation evaluation. When appropriate, test instrumentation calibration is traceable to the National Institute of Standards and Technology (NIST).
This Test Definition for the Evaluation of Digitizing Waveform Recorders (DWR) defines the process that can be performed as part of the evaluation and testing of geophysical sensors, digitizers, sensor subsystems, and geophysical station/array systems. The objectives are to (1) evaluate the overall technical performance of the DWR, measure the distortions introduced by the high-resolution digitizers, and provide a performance check of the internal calibrator if one is provided, and (2) evaluate the technical performance of the DWR for a specific sensor application. The results of these evaluations can be compared to the manufacturer's specifications and any relevant application requirements or specifications.
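One common way to quantify the distortion a digitizer introduces is a sine-fit test: capture a known tone, fit it out, and treat everything left over as noise plus distortion. The sketch below illustrates that idea under stated assumptions only - the stimulus, sample rate, range, and 16-bit ideal quantizer are all hypothetical, and this is not the specific method defined in the Test Definition.

```python
import numpy as np

def sinad_db(record, fs, f0):
    """Signal-to-noise-and-distortion (dB) of a captured sine-wave record.
    Tone power comes from a least-squares fit at the known stimulus
    frequency f0; the fit residual counts as noise + distortion."""
    t = np.arange(len(record)) / fs
    # Fit a*sin(2*pi*f0*t) + b*cos(2*pi*f0*t) + c (offset).
    A = np.column_stack([np.sin(2 * np.pi * f0 * t),
                         np.cos(2 * np.pi * f0 * t),
                         np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(A, record, rcond=None)
    residual = record - A @ coef
    tone_power = 0.5 * (coef[0] ** 2 + coef[1] ** 2)
    return 10.0 * np.log10(tone_power / np.mean(residual ** 2))

# Hypothetical capture: full-scale 1 Hz tone, ideally quantized to 16 bits
# over a +/-1 V range, sampled at 200 sps for 10 s.
fs, f0, bits = 200.0, 1.0, 16
t = np.arange(int(10 * fs)) / fs
step = 2.0 / (2 ** bits)                   # one LSB
captured = np.round(np.sin(2 * np.pi * f0 * t) / step) * step

# Effective number of bits from the standard SINAD relation.
effective_bits = (sinad_db(captured, fs, f0) - 1.76) / 6.02
```

For an ideal quantizer the effective bits come out close to the nominal resolution; a real digitizer's shortfall against its specification is what an evaluation like this would expose.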
Most test methodologies referenced in this Test Definition and Test Procedures were designed by Sandia specifically for geophysical instrumentation evaluation. When appropriate, test instrumentation calibration is traceable to the National Institute of Standards and Technology (NIST). The objectives are to evaluate the overall technical performance of the infrasound sensor. The results of these evaluations can be compared to the manufacturer's specifications and any relevant application requirements or specifications.
Abstract not provided.
The process of developing the National Nuclear Security Administration (NNSA) Knowledge Base (KB) must result in high-quality Information Products in order to support activities for monitoring nuclear explosions consistent with United States treaty and testing moratoria monitoring missions. The validation, verification, and management of the Information Products is critical to successful scientific integration and hence will enable high-quality deliveries to be made to the United States National Data Center (USNDC) at the Air Force Technical Applications Center (AFTAC). As an Information Product passes through the steps necessary to become part of a delivery to AFTAC, domain experts (including technical KB Working Groups comprising NNSA and DOE laboratory staff and the customer) will provide coordination and validation, where validation is the determination of relevance and scientific quality. Verification, the check for completeness and correctness, will be performed by both the Knowledge Base Integrator and the Scientific Integrator, with support from the Contributor, providing two levels of testing to assure content integrity and performance. The Information Products and their contained data sets will be systematically tracked through the integration portion of their life cycle. The integration process, refined through lessons learned during its initial implementations, is presented in this report.
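The staged life cycle described above - validation before verification, verification before delivery - can be modeled as a simple state machine. The sketch below is a hypothetical illustration of such tracking; the class, stage names, and example product are inventions for this example, not the KB's actual tracking system.

```python
from enum import Enum, auto

class Stage(Enum):
    SUBMITTED = auto()
    VALIDATED = auto()   # relevance and scientific quality confirmed
    VERIFIED = auto()    # completeness and correctness checked
    DELIVERED = auto()

# Allowed forward transitions in this (hypothetical) tracking model:
_NEXT = {Stage.SUBMITTED: Stage.VALIDATED,
         Stage.VALIDATED: Stage.VERIFIED,
         Stage.VERIFIED: Stage.DELIVERED}

class InformationProduct:
    """Tracks one Information Product through its integration life cycle,
    refusing any transition that skips a required step."""
    def __init__(self, name):
        self.name = name
        self.stage = Stage.SUBMITTED
        self.history = [Stage.SUBMITTED]

    def advance(self, to):
        if _NEXT.get(self.stage) is not to:
            raise ValueError(f"cannot go from {self.stage.name} to {to.name}")
        self.stage = to
        self.history.append(to)

# Hypothetical product moving through the first two gates:
ip = InformationProduct("regional correction surface")
ip.advance(Stage.VALIDATED)
ip.advance(Stage.VERIFIED)
```

Encoding the order as data (`_NEXT`) rather than scattered checks makes the two-level review sequence auditable: the `history` list is the systematic tracking record the abstract calls for.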
Abstract not provided.
All stations planned for the International Monitoring System (IMS) must be certified by the Provisional Technical Secretariat (PTS) prior to acceptance, to ensure that the monitoring stations initially meet the required specifications. Working Group B of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty has established requirements for the quality, availability, and surety of data received at the International Data Centre (IDC). These requirements are verified by the PTS during a three-component process that includes initial station assessment, testing and evaluation, and certification. Sandia National Laboratories has developed procedures, facilities, and tools that can be used to assist in evaluating IMS stations for compliance with certification requirements. System evaluation includes station design reviews, component testing, and operational testing of station equipment. Station design is evaluated for security and reliability considerations, and to ensure that operational procedures and documentation are adequate. Components of the station are tested for compliance with technical specifications, such as timing and noise levels of sampled data, and monitoring of tamper detection equipment. Data sent from the station in an IMS-standard format (CD-1 or IMS-1) are analyzed for compliance with the specified protocol and to ensure that the station data (sensor and state-of-health) are accurately transmitted. Data availability and authentication statistics are compiled and examined for problems.
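The availability statistic mentioned at the end can be illustrated with a small coverage calculation: given the time spans of data frames actually received for a channel, compute the fraction of a review window they cover, merging overlaps and ignoring duplicated spans. This is a generic sketch under assumed inputs, not the PTS's actual compliance tooling or the CD-1 frame format.

```python
def availability(frames, window_start, window_end):
    """Fraction of [window_start, window_end] covered by received data.
    frames: iterable of (start, end) times in seconds; overlapping or
    duplicated spans are merged so they are not double-counted."""
    clipped = sorted((max(s, window_start), min(e, window_end))
                     for s, e in frames
                     if e > window_start and s < window_end)
    covered, last_end = 0.0, window_start
    for s, e in clipped:
        if e > last_end:
            covered += e - max(s, last_end)   # count only new coverage
            last_end = e
    return covered / (window_end - window_start)

# Hypothetical one-hour window with a 6-minute outage:
frac = availability([(0, 1800), (2160, 3600)], 0, 3600)   # -> 0.9
```

A certification-style check would compare statistics like this, compiled per channel over long periods, against the availability requirement for the station.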