Technology roadmapping process: a learning tool for adaptive enterprises
Sandia National Laboratories was tasked with developing the Defense Nuclear Material Stewardship Integrated Inventory Information Management System (IIIMS) under the sponsorship of NA-125.3 and with the concurrence of DOE/NNSA field and area offices. The purpose of IIIMS was to modernize nuclear materials management information systems at the enterprise level. Projects over the course of several years attempted to spearhead this modernization. The scope of IIIMS was divided into broad enterprise-oriented materials management and materials forecasting. The IIIMS prototype was developed to allow multiple participating user groups to explore nuclear material requirements and needs in detail. The purpose of material forecasting was to determine nuclear material availability over a 10 to 15 year period in light of the dynamic nature of nuclear materials management. Formal DOE Directives (requirements) were needed to direct IIIMS efforts but were never issued, and the project has been halted. When the project is restarted, it will be unnecessary to duplicate or re-engineer the activities performed from 1999 to 2003; in fact, future initiatives can build on that previous work. IIIMS requirements should be structured to provide high confidence that discrepancies are detected and that classified information is not divulged. Enterprise-wide materials management systems maintained by the military can serve as overall models on which to base IIIMS implementation concepts.
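To make the forecasting scope concrete, the following is a minimal Python sketch of projecting material availability over a 10 to 15 year horizon; the starting inventory, annual net change, and function name are illustrative assumptions and are not drawn from IIIMS.

```python
# Illustrative sketch only: project material availability over a
# 10-15 year horizon from a starting inventory and an assumed annual
# net change (demand minus returns). All figures are hypothetical.

def forecast_availability(start_inventory: float,
                          annual_net_change: float,
                          years: int = 15) -> list[float]:
    """Return the projected end-of-year inventory for each year."""
    projection = []
    inventory = start_inventory
    for _ in range(years):
        inventory += annual_net_change
        projection.append(max(inventory, 0.0))  # availability cannot go negative
    return projection

# Example: 100 units on hand, net draw-down of 4 units per year.
print(forecast_availability(100.0, -4.0, years=15))
```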
This report describes the information model that was jointly developed as part of two FY93 LDRDs: (1) Information Integration for Data Fusion, and (2) Interactive On-Site Inspection System: An Information System to Support Arms Control Inspections. The report also describes the purpose and scope of the two LDRD projects and reviews the prototype development approach, including the use of a GIS. Section 2 describes the information modeling methodology. Section 3 provides a conceptual data dictionary for the OSIS (On-Site Information System) model, which can be used in conjunction with the detailed information model provided in the Appendix. Section 4 discusses the lessons learned from the modeling and the prototype. Section 5 identifies the next steps: two alternate paths for future development. The long-term purpose of the On-Site Inspection LDRD was to show the benefits of an information system that supports a wide range of on-site inspection activities for both offensive and defensive inspections. The database structure and the information system would support inspection activities under nuclear, chemical, biological, and conventional arms control treaties. This would allow a common database to be shared for all types of inspections, providing much greater cross-treaty synergy.
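As a rough illustration of the cross-treaty sharing that a common database enables, the sketch below (Python; all class names, field names, and example records are hypothetical and are not taken from the OSIS model in the Appendix) stores inspections conducted under different treaty types in a single common structure.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum
from typing import List

# Hypothetical sketch of a common inspection record shared across treaty
# types; the actual OSIS information model is given in the report's Appendix.

class TreatyType(Enum):
    NUCLEAR = "nuclear"
    CHEMICAL = "chemical"
    BIOLOGICAL = "biological"
    CONVENTIONAL = "conventional"

@dataclass
class Site:
    name: str
    location: str  # could be a GIS coordinate reference in practice

@dataclass
class Inspection:
    treaty: TreatyType
    site: Site
    start: date
    findings: List[str] = field(default_factory=list)

# A single list (or table) holds inspections for every treaty type,
# which is the cross-treaty synergy the abstract describes.
inspections: List[Inspection] = [
    Inspection(TreatyType.NUCLEAR, Site("Facility A", "48.2N 16.4E"), date(1994, 5, 1)),
    Inspection(TreatyType.CHEMICAL, Site("Facility B", "52.5N 13.4E"), date(1994, 6, 12)),
]
```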
Data fusion has been identified by the Department of Defense as a critical technology for the U.S. defense industry. Data fusion requires combining expertise in two areas: sensors and information integration. Although data fusion is a rapidly growing area, there is little synergy and little use of common, reusable, and/or tailorable objects and models, especially across different disciplines. The Laboratory-Directed Research and Development (LDRD) project had two purposes: to determine whether a natural language-based information modeling methodology could be used for data fusion problems, and if so, whether that methodology would help identify commonalities across areas and achieve greater synergy. The project confirmed both initial hypotheses: the natural language-based information modeling methodology could be used effectively in data fusion areas, and commonalities could be found that allow synergy across various data fusion areas. The project found five common objects that are the basis for all of the data fusion areas examined: targets, behaviors, environments, signatures, and sensors. Many of these objects and the specific facts related to them were common across several areas and could easily be reused. In some cases even the terminology remained the same; in other cases different areas had their own terminology, but the concepts were the same. This commonality is important given the growing use of multisensor data fusion: fusion is much more difficult if each type of sensor uses its own objects and models rather than building on a common set. This report introduces data fusion; discusses how the synergy generated by this LDRD would have benefited an earlier successful project and summarizes the information model from that project; describes a preliminary management information model; and explains how information integration can facilitate cross-treaty synergy for various arms control treaties.
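To illustrate the commonality finding, here is a minimal Python sketch of the five common objects (targets, behaviors, environments, signatures, and sensors) represented as a reusable object set; the class and field definitions are assumptions for illustration and are not the report's information model.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of the five common data fusion objects identified
# in the LDRD. Names and fields are assumptions, not the report's model.

@dataclass
class Environment:
    name: str                  # e.g. "open ocean", "urban terrain"
    conditions: dict = field(default_factory=dict)

@dataclass
class Behavior:
    description: str           # an observable activity of a target

@dataclass
class Signature:
    modality: str              # e.g. "acoustic", "infrared", "seismic"
    features: dict = field(default_factory=dict)

@dataclass
class Target:
    identifier: str
    behaviors: List[Behavior] = field(default_factory=list)
    signatures: List[Signature] = field(default_factory=list)

@dataclass
class Sensor:
    modality: str              # the signature modality this sensor observes

    def detects(self, target: Target) -> bool:
        # A sensor detects a target if the target emits a signature
        # in this sensor's modality.
        return any(s.modality == self.modality for s in target.signatures)
```

Because every sensor type works against the same Target and Signature definitions, adding a new sensor modality reuses the existing objects rather than introducing its own, which is the kind of cross-area reuse the abstract describes.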
Engineering with Computers
Data fusion is the integration and analysis of data from multiple sensors to develop a more accurate understanding of a situation and determine how to respond to it. Although data fusion can be applied in many situations, this paper focuses on its application to manufacturing and how it changes some of the more traditional, less adaptive information models that support the design and manufacturing functions. The paper consists of four parts. The first section explains what data fusion is and its impact on manufacturing. The second section describes what an information system architecture is and explains the natural language-based information modeling methodology used by this research project. The third section identifies the major design and manufacturing functions, reviews the information models required to support them, and then shows how these models must be extended to support data fusion. The fourth section discusses the future directions of this work. © 1995 Springer-Verlag London Limited.
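As a hedged illustration of extending a traditional design-and-manufacturing information model to support data fusion, the Python sketch below attaches sensor observations to a part record so that as-measured data sits alongside as-designed data; the classes, fields, and the naive averaging fusion step are assumptions for illustration, not the paper's model.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: a traditional part record extended with fused
# sensor observations, so as-measured data accompanies as-designed data.

@dataclass
class SensorObservation:
    sensor_id: str
    quantity: str              # e.g. "surface temperature"
    value: float
    units: str

@dataclass
class PartRecord:
    part_number: str
    nominal_dimensions: dict   # the traditional design data
    observations: List[SensorObservation] = field(default_factory=list)

    def fused_estimate(self, quantity: str) -> float:
        """Naively fuse observations of one quantity by averaging."""
        values = [o.value for o in self.observations if o.quantity == quantity]
        return sum(values) / len(values) if values else float("nan")
```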
The supercomputer industry is at a crossroads. While its traditional markets have become relatively mature, the industry is becoming more competitive, especially with the challenge from Japan. The industry can either fight over this stable market or dramatically expand it. The choice is obvious, but what are these new markets, and how should they be approached? This paper addresses these issues. First, it explains how the traditional definition of a supercomputer seriously constrains its market, and how an alternate definition opens up a much larger, emerging market. Second, it describes a market segmentation, identifies two barriers preventing customers in these new segments from using supercomputing, and describes mechanisms to reduce and/or eliminate these barriers. Third, it discusses a portfolio analysis strategy for determining which markets in these new segments to concentrate on. Obviously, parts of manufacturing are key targets. Finally, it draws some conclusions in terms of two scenarios: one describing a healthy, growing US supercomputer industry, the other showing the industry rapidly following in the footsteps of the US consumer electronics industry. 6 refs.
This paper describes the migration path a company goes through as it moves up the data management learning curve. This migration path is based on four distinct roles for data management in CIM. The first two sections review the justification for CIM and data management. The first section describes the changing competitive environment manufacturers face and how CIM addresses the problems this situation creates. The second section identifies the two key characteristics of a database management system and the benefits provided. The third section identifies and discusses the four roles for data management in CIM. These four roles and their variations provide snapshots of where a company is on the data management learning curve. The fourth section describes the migration path a company goes through as it moves up the learning curve. Although there are similarities, there are some significant differences between this learning curve and the one experienced by MIS as it adopted data management technology. 3 refs.