Publications

12 Results

What Questions Would a Systems Engineer Ask to Assess Systems Engineering Models as Credible

Carroll, Edward R.; Malins, Robert J.

Digital Systems Engineering strategies typically call for digital Systems Engineering models to be retained in repositories and certified as an authoritative source of truth, enabling model reuse, qualification, and collaboration. For digital Systems Engineering models to be certified as authoritative (credible), they need to be assessed, that is, verified and validated, with the amount of uncertainty in the model quantified (consider reusing someone else's model without knowing the author). Because of increasing model complexity, the authors assert that traditional human-based methods for verification, validation, and uncertainty quantification, such as human-based peer-review sessions, cannot sufficiently establish that a digital Systems Engineering model of a complex system is credible. Digital Systems Engineering models of complex systems can contain millions of nodes and edges; the authors assert that this level of detail is beyond the ability of any group of humans, even working for weeks at a time, to discern and catch every minor model infraction. In contrast, computers are highly effective at discerning infractions in massive amounts of information. The authors suggest that a better approach might be to focus the humans on which model patterns should be assessed and enable the computer to assess the massive detail in accordance with those patterns, running through perhaps 100,000 test loops. In anticipation of future projects to implement and automate the assessment of models at Sandia National Laboratories, a study was initiated to elicit input from a group of 25 Systems Engineering experts. The authors' positioning query was: "What questions would a Systems Engineer ask to assess Systems Engineering models for credibility?" This report documents the results of that survey.

Technology Performance Level Assessment Methodology

Roberts, Jesse D.; Bull, Diana L.; Malins, Robert J.; Costello, Ronan P.; Babarit, Aurelien; Nielsen, Kim; Ferreira, Claudio B.; Kennedy, Ben; Dykes, Kathryn; Weber, Jochem

Technology performance level (TPL) assessments can be applied at all technology development stages and associated technology readiness levels (TRLs). Even, and particularly, at low TRLs, the TPL assessment is very effective because it holistically considers the wide range of wave energy converter (WEC) attributes that determine the techno-economic performance potential of the WEC farm when fully developed for commercial operation. The TPL assessment also highlights potential showstoppers at the earliest possible stage of WEC technology development. Hence, the TPL assessment identifies the technology-independent "performance requirements." To achieve a successful solution, the entirety of the performance requirements within the TPL must be considered because, in the end, all the stakeholder needs must be met. The basis for performing a TPL assessment is the information provided in a dedicated format, the Technical Submission Form (TSF). The TSF requests the information from the WEC developer that is required to answer the questions posed in the TPL assessment document.

Systems Engineering Applied to the Development of a Wave Energy Farm

Roberts, Jesse D.; Bull, Diana L.; Costello, Ronan P.; Babarit, Aurelien; Nielsen, Kim; Ferreira, Claudio B.; Kennedy, Ben; Malins, Robert J.; Dykes, Kathryn; Weber, Jochem

A motivation for undertaking this stakeholder requirements analysis and Systems Engineering exercise is to document the requirements for successful wave energy farms in order to facilitate better design and better design assessments. A difficulty in wave energy technology development is the absence to date of a verifiable minimum viable product against which the merits of new products might be measured. A consequence of this absence is that technology development progress, technology value, and technology funding have largely been measured by, associated with, and driven by technology readiness, expressed in technology readiness levels (TRLs). Originating primarily in the space and defense industries, TRLs focus on the procedural implementation of technology development for large and complex engineering projects, where cost is neither mission critical nor a key design driver. The key deficiency of the TRL approach in the context of wave energy conversion is that wave energy converter (WEC) technology development has been too focused on commercial readiness and not enough on stakeholder requirements, and particularly on the economic viability required for market entry.

Guidance on the Technology Performance Level (TPL) Assessment Methodology

Roberts, Jesse D.; Bull, Diana L.; Weber, Jochem; Babarit, Aurelien; Costello, Ronan; Neilson, Kim; Kennedy, Ben; Malins, Robert J.; Dykes, Katherine

This document presents the revised Technology Performance Level (TPL) assessment methodology. There are three parts to this revised methodology: 1) the Stakeholder Needs and Assessment Guidance (this document), 2) the Technical Submission Form, and 3) the TPL scoring spreadsheet. The TPL assessment is designed to give a technology-neutral, or agnostic, assessment of any wave energy converter technology. The focus of the TPL is on the performance of the technology in meeting the customer's needs. The original TPL is described in [1, 2], and those references also detail the critical differences in the nature of the TPL when compared with the more widely used technology readiness level (TRL). (Wave energy TRL is described in [3].) The revised TPL is particularly intended to be useful to investors and to assist technology developers in conducting comprehensive assessments in a way that is meaningful and attractive to investors. The revised TPL assessment methodology has been derived through a structured Systems Engineering approach, a formal process that involved analyzing customer and stakeholder needs through the discipline of Systems Engineering. The results of the process confirmed the high level of completeness of the original methodology presented in [1] (as used in the Wave Energy Prize judging) and now add a significantly increased level of detail in the assessment and an improved, more investment-focused structure. The revised TPL also incorporates the feedback of the Wave Energy Prize judges.

WEC Farm Functions: Defining the Behaviors of the Farm

Bull, Diana L.; Costello, Ronan; Babarit, Aurelien; Malins, Robert J.; Kennedy, Ben; Neilson, Kim; Bittencourt, Claudio; Weber, Jochem; Roberts, Jesse D.

Capabilities and functions are hierarchical structures (i.e., taxonomies) that are used in a systems engineering framework to identify complementary requirements for the system: what the system must do to achieve what it must be. In the case of capabilities, the taxonomy embodies the list of characteristics that are desired, from the perspective of the stakeholders, for the system to be successful. In the case of functions, the hierarchy represents the solution-agnostic (i.e., independent of specific design embodiments) elements that are needed to meet the stakeholder requirements. This paper focuses on the development of the functions. The functions define the fundamental elements of the solution that must be provided in order to achieve the mission and deliver the capabilities. They identify the behaviors the farm must possess, e.g., the farm must be able to generate and deliver electricity from wave power. High-level functions are independent of the technology or design used to implement the function; however, detailed functions may begin to border on specific design choices. Hence, a strong effort has been made to maintain functions that are design agnostic.

Systematic Literature Review: How is Model-Based Systems Engineering Justified?

Carroll, Edward R.; Malins, Robert J.

The genesis for this systematic literature review was a search for industry case studies that could inform a decision on whether or not to support the change process, investment, training, and tools needed to implement an MBSE approach across the engineering enterprise. The question asked was: how is the change from a document-based systems engineering (DBSE) approach to a model-based systems engineering (MBSE) approach justified? The methodology employed for this systematic literature review was to conduct a document search of electronically published case studies by authors from the defense, space, and complex systems product engineering industries. The 67 case studies without metrics mainly attributed success to completeness, consistency, and communication of requirements. The 21 case studies with metrics on cost and schedule primarily attributed success to the ability of an MBSE approach to improve defect prevention strategies. The primary conclusion is that applying an MBSE approach provides a significant advantage to project performance. An MBSE approach made the engineering processes on a complex system development effort more efficient by improving requirements completeness, consistency, and communication. These improvements were seen in the engineering processes involved in requirements management, concept exploration, design reuse, test and qualification, verification and validation, and margins analyses. An MBSE approach was most effective at improving defect prevention strategies. The approach was found to enhance the capability to find defects early in the system development life cycle (SDLC), when they could be fixed with less impact, and to prevent rework in later phases, thus mitigating risks to cost, schedule, and mission. However, if a program only employed an MBSE approach for requirements management, the advantages of finding defects early could not be leveraged in later phases, where the savings in cost and schedule from rework prevention are realized. Significant performance success was achieved when the systems engineer (SE) held a leadership role over engineering processes. A number of the case studies cited a general lack of skilled MBSE engineers as a major hindrance to implementing an MBSE approach successfully.
