GAMVT: A Generative Algorithm for MultiVariate Timeseries Data
Subsurface energy activities such as unconventional resource recovery, enhanced geothermal energy systems, and geologic carbon storage require fast and reliable methods to account for complex, multiphysical processes in heterogeneous fractured and porous media. Although reservoir simulation is considered the industry standard for simulating these subsurface systems with injection and/or extraction operations, it requires spatio-temporal “Big Data” as input to the simulation model, which typically poses a major challenge during both model development and computation. In this work, we developed and applied various deep neural network-based approaches to (1) process multiscale image segmentation, (2) generate ensemble members of drainage networks, flow channels, and porous media using deep convolutional generative adversarial networks, (3) construct multiple hybrid neural networks, such as convolutional LSTM and convolutional neural network-LSTM, to develop fast and accurate reduced order models for shale gas extraction, and (4) apply physics-informed neural networks and deep Q-learning to flow and energy production. We hypothesized that physics-based machine learning/deep learning can overcome the shortcomings of traditional machine learning methods, whose data-driven models have faltered beyond the data and physical conditions used for training and validation. We improved and developed novel approaches to demonstrate that physics-based ML allows us to incorporate physical constraints (e.g., scientific domain knowledge) into the ML framework. Outcomes of this project will be readily applicable to many energy and national security problems that are particularly defined by multiscale features and network systems.
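As a rough illustration of the hybrid reduced-order-model idea in item (3), the sketch below pairs a small convolutional encoder with an LSTM to map a sequence of 2D field snapshots (e.g., pressure) to a scalar production forecast. The layer sizes, field dimensions, and all names are illustrative assumptions, not the architecture used in this project.

# Minimal CNN-LSTM reduced-order-model sketch (illustrative only; field
# sizes, channel counts, and hidden dimensions are assumptions, not the
# architecture used in this project).
import torch
import torch.nn as nn

class CNNLSTMSurrogate(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        # Encode each 2D snapshot (e.g., a pressure field) to a feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),  # -> 16 * 4 * 4 = 256 features per snapshot
        )
        # The LSTM evolves the latent state through time.
        self.lstm = nn.LSTM(input_size=256, hidden_size=hidden, batch_first=True)
        # Map the final hidden state to a scalar production rate.
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, time, 1, H, W) sequence of field snapshots
        b, t = x.shape[:2]
        feats = self.encoder(x.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])  # forecast from the last time step

# Smoke test on random data with assumed shapes.
model = CNNLSTMSurrogate()
x = torch.randn(2, 10, 1, 32, 32)  # 2 samples, 10 time steps, 32x32 fields
print(model(x).shape)              # torch.Size([2, 1])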
Proceedings - 2021 IEEE Space Computing Conference, SCC 2021
Concerns about cyber threats to space systems are increasing. Researchers are developing intrusion detection and protection systems to mitigate these threats, but the sparsity of cyber threat data poses a significant challenge to these efforts. Development of credible threat data sets is needed to overcome this challenge. This paper describes the extension/development of three data generation algorithms (generative adversarial networks, variational auto-encoders, and a generative algorithm for multi-variate timeseries) to generate cyber threat data for space systems. The algorithms are applied to a use case that leverages the NASA Operational Simulation for Small Satellites (NOS$^{3}$) platform. Qualitative and quantitative measures are applied to evaluate the generated data. Strengths and weaknesses of each algorithm are presented, and suggested improvements are provided. For this use case, the generative algorithm for multi-variate timeseries performed best according to both qualitative and quantitative measures.
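To make one of the three generator families named above concrete, the sketch below is a minimal variational auto-encoder over fixed-length windows of multivariate telemetry. The window length, channel count, latent size, and all names are assumptions for illustration; GAMVT's own internals are not described here, and this is not the paper's implementation.

# Minimal variational auto-encoder sketch for multi-variate timeseries
# windows (window length, channel count, and latent size are assumptions
# for illustration, not values from the paper).
import torch
import torch.nn as nn

T, C, LATENT = 50, 4, 16  # assumed window length, variables per step, latent dim

class TimeseriesVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Flatten(), nn.Linear(T * C, 128), nn.ReLU())
        self.mu = nn.Linear(128, LATENT)
        self.logvar = nn.Linear(128, LATENT)
        self.dec = nn.Sequential(
            nn.Linear(LATENT, 128), nn.ReLU(),
            nn.Linear(128, T * C), nn.Unflatten(1, (T, C)),
        )

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction error plus KL divergence to the unit Gaussian prior.
    rec = nn.functional.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl

# Fit on windows of telemetry, then decode random latents to sample
# new synthetic windows:
vae = TimeseriesVAE()
x = torch.randn(8, T, C)                   # a batch of (assumed) telemetry windows
recon, mu, logvar = vae(x)
loss = vae_loss(recon, x, mu, logvar)
samples = vae.dec(torch.randn(8, LATENT))  # generated synthetic sequences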
Frontiers in Computational Neuroscience
Historically, neuroscience principles have heavily influenced artificial intelligence (AI); consider, for example, the influence of the perceptron model, essentially a simple model of a biological neuron, on artificial neural networks. More recently, notable AI advances, for example the growing popularity of reinforcement learning, often appear more aligned with cognitive neuroscience or psychology, focusing on function at a relatively abstract level. At the same time, neuroscience stands poised to enter a new era of large-scale, high-resolution data and appears more focused on underlying neural mechanisms or architectures that can, at times, seem rather removed from functional descriptions. While this might seem to foretell a new generation of AI approaches arising from a deeper exploration of neuroscience specifically for AI, the most direct path for achieving this is unclear. Here we discuss cultural differences between the two fields, including divergent priorities that should be considered when leveraging modern-day neuroscience for AI. For example, the two fields feed two very different applications that at times require potentially conflicting perspectives. We highlight small but significant cultural shifts that we feel would greatly facilitate increased synergy between the two fields.
This research aims to develop brain-inspired solutions for reliable and adaptive autonomous navigation in systems that have limited internal and external sensors and may not have access to reliable GPS information. The algorithms in this project were investigated and developed in the context of Sandia's A4H (autonomy for hypersonics) mission campaign. These algorithms were additionally explored with respect to their suitability for implementation on emerging neuromorphic computing hardware technology. This project is premised on the hypothesis that brain-inspired SLAM (simultaneous localization and mapping) algorithms may provide an energy-efficient, context-flexible approach to robust sensor-based, real-time navigation.