Building Scalable, Composable Spiking Neural Algorithms with Fugu
Abstract not provided.
ACM International Conference Proceeding Series
It has been demonstrated that grid cells in the brain encode physical locations using hexagonally spaced, periodic phase-space representations. We explore how such a representation may be computationally advantageous for related engineering applications. Theories of how the brain decodes from a phase-space representation have been developed based on neuroscience data. However, theories of how sensory information is encoded into this phase space are less certain. Here we show how a navigation-relevant input space, such as elevation trajectories, may be mapped into a phase-space coordinate system that can be decoded using previously developed theories. We also consider how such an algorithm may then be mapped onto neuromorphic systems. Just as animals can tell where they are in a local region based on where they have been, our encoding algorithm enables localization to a position in space by integrating measurements from a trajectory over a map. In this paper, we walk through our approach with simulations using a digital elevation model.
ACM International Conference Proceeding Series
It has been demonstrated that grid cells encode physical locations using hexagonally spaced, periodic phase-space representations. Theories of how the brain decodes this phase-space representation have been developed based on neuroscience data. However, theories of how sensory information is encoded into this phase space are less certain. Here we show how a navigation-relevant input space, such as elevation trajectories, may be mapped into a phase-space coordinate system that can be decoded using previously developed theories. Just as animals can tell where they are in a local region based on where they have been, our encoding algorithm enables localization to a position in space by integrating measurements from a trajectory over a map. In this extended abstract, we walk through our approach with simulations using a digital elevation model.
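The general encoding idea can be illustrated with a minimal sketch: a scalar quantity (for example, a position along a trajectory inferred from elevation measurements) is represented by its phase within several modules of different spatial periods, and it is recovered by matching those phases against candidate positions on a known map. The module periods, candidate grid, and function names below are illustrative assumptions, not the authors' implementation.

    import numpy as np

    # Illustrative grid-cell-style encoder: a scalar position is represented by
    # its phase within several modules, each with a different spatial period.
    PERIODS = np.array([3.0, 5.0, 7.0])   # assumed module periods, in map units

    def encode(x):
        """Return the phase of x within each module, in [0, 1)."""
        return (x % PERIODS) / PERIODS

    def decode(phases, candidates):
        """Pick the candidate position whose phases best match (circular distance)."""
        diffs = np.abs(encode(candidates[:, None]) - phases)
        circ = np.minimum(diffs, 1.0 - diffs)   # wrap-around distance per module
        return candidates[np.argmin(circ.sum(axis=1))]

    candidates = np.linspace(0.0, 100.0, 10001)   # positions along a known map/trajectory
    phases = encode(42.25)
    print(decode(phases, candidates))             # recovers ~42.25

With these periods the representation is unique up to their least common multiple (105 map units), so any position on the 0-100 unit map decodes unambiguously; a practical system would choose periods to cover the region of interest.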
ACM International Conference Proceeding Series
Neuromorphic computing (NMC) is an exciting paradigm that seeks to incorporate principles from biological brains to enable advanced computing capabilities. This encompasses not only algorithms, such as neural networks, but also the question of how to structure the enabling computational architectures for executing such workloads. Assessing the merits of NMC is more nuanced than simply comparing singular, historical performance metrics of traditional approaches against those of NMC. The novel computational architectures require new algorithms to make use of their differing computational approaches, and neural algorithms themselves are emerging across an increasing range of application domains. Accordingly, we propose following the example of high-performance computing, which has employed context-capturing mini-apps and abstraction tools to explore the merits of computational architectures. Here we present Neural Mini-Apps built in a neural circuit tool called Fugu as a means of gaining insight into NMC.
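The mini-app idea can be sketched at a high level: small, self-contained "bricks" are composed into a larger circuit and handed to interchangeable backends, so the same representative workload can probe different architectures. The classes and names below are a hypothetical illustration of that composition pattern, not Fugu's actual API.

    class Brick:
        """A self-contained spiking sub-circuit with a name and a build step."""
        def __init__(self, name):
            self.name = name
        def build(self, circuit):
            # A real brick would add neurons and synapses; here we record a placeholder node.
            circuit["nodes"].append(self.name)
            return self

    class Scaffold:
        """Composes bricks into one circuit description that a backend could execute."""
        def __init__(self):
            self.circuit = {"nodes": [], "edges": []}
        def add(self, brick, after=None):
            brick.build(self.circuit)
            if after is not None:
                self.circuit["edges"].append((after.name, brick.name))
            return brick

    # A "Neural Mini-App" is then a small scaffold exercising one representative workload.
    scaffold = Scaffold()
    src = scaffold.add(Brick("spike_input"))
    scaffold.add(Brick("temporal_sort"), after=src)
    print(scaffold.circuit["edges"])   # [('spike_input', 'temporal_sort')]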
Subsurface energy activities such as unconventional resource recovery, enhanced geothermal energy systems, and geologic carbon storage require fast and reliable methods to account for complex, multiphysical processes in heterogeneous fractured and porous media. Although reservoir simulation is considered the industry standard for simulating these subsurface systems with injection and/or extraction operations, it requires incorporating spatio-temporal “Big Data” into the simulation model, which is typically a major challenge during the model development and computational phases. In this work, we developed and applied various deep neural network-based approaches to (1) perform multiscale image segmentation, (2) generate ensemble members of drainage networks, flow channels, and porous media using a deep convolutional generative adversarial network, (3) construct multiple hybrid neural networks, such as convolutional LSTM and convolutional neural network-LSTM, to develop fast and accurate reduced-order models for shale gas extraction, and (4) apply physics-informed neural networks and deep Q-learning to flow and energy production. We hypothesized that physics-based machine learning/deep learning can overcome the shortcomings of traditional machine learning methods, where data-driven models have faltered beyond the data and physical conditions used for training and validation. We improved and developed novel approaches to demonstrate that physics-based ML allows us to incorporate physical constraints (e.g., scientific domain knowledge) into the ML framework. Outcomes of this project will be readily applicable to many energy and national security problems that are particularly defined by multiscale features and network systems.
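As one hedged illustration of the physics-informed ingredient, the sketch below adds a PDE-residual penalty to a data loss so that a network's predictions respect a governing equation. The 1-D diffusion equation, network size, and diffusivity used here are assumptions for illustration, not the project's actual models.

    import torch

    # Small network mapping (x, t) to a field value, e.g. pressure.
    net = torch.nn.Sequential(torch.nn.Linear(2, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
    D = 0.1   # assumed diffusivity for the illustrative PDE dp/dt = D * d2p/dx2

    def pde_residual(x, t):
        x.requires_grad_(True)
        t.requires_grad_(True)
        p = net(torch.cat([x, t], dim=1))
        dp_dt = torch.autograd.grad(p, t, torch.ones_like(p), create_graph=True)[0]
        dp_dx = torch.autograd.grad(p, x, torch.ones_like(p), create_graph=True)[0]
        d2p_dx2 = torch.autograd.grad(dp_dx, x, torch.ones_like(dp_dx), create_graph=True)[0]
        return dp_dt - D * d2p_dx2   # approaches zero as the physics is satisfied

    x = torch.rand(256, 1)
    t = torch.rand(256, 1)
    physics_loss = pde_residual(x, t).pow(2).mean()
    # total_loss = data_loss + lambda_phys * physics_loss   (weighting chosen per problem)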
Proceedings - 2021 IEEE Space Computing Conference, SCC 2021
Concerns about cyber threats to space systems are increasing. Researchers are developing intrusion detection and protection systems to mitigate these threats, but the sparsity of cyber threat data poses a significant challenge to these efforts. Development of credible threat data sets is needed to overcome this challenge. This paper describes the extension and development of three data generation algorithms (generative adversarial networks, variational auto-encoders, and a generative algorithm for multivariate time series) to generate cyber threat data for space systems. The algorithms are applied to a use case that leverages the NASA Operational Simulation for Small Satellites (NOS$^{3}$) platform. Qualitative and quantitative measures are applied to evaluate the generated data. Strengths and weaknesses of each algorithm are presented, and suggested improvements are provided. For this use case, the generative algorithm for multivariate time series performed best according to both qualitative and quantitative measures.
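To make the approach concrete, the sketch below shows one of the three generator families (a variational auto-encoder over fixed-length windows of multivariate telemetry). The window length, channel count, and architecture are assumptions for illustration, not the paper's implementation.

    import torch
    import torch.nn as nn

    T, C, LATENT = 64, 8, 16   # assumed window length, telemetry channels, latent size

    class TelemetryVAE(nn.Module):
        def __init__(self):
            super().__init__()
            self.enc = nn.Sequential(nn.Flatten(), nn.Linear(T * C, 128), nn.ReLU())
            self.mu = nn.Linear(128, LATENT)
            self.logvar = nn.Linear(128, LATENT)
            self.dec = nn.Sequential(nn.Linear(LATENT, 128), nn.ReLU(),
                                     nn.Linear(128, T * C), nn.Unflatten(1, (T, C)))
        def forward(self, x):
            h = self.enc(x)
            mu, logvar = self.mu(h), self.logvar(h)
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization
            return self.dec(z), mu, logvar

    def loss_fn(x, recon, mu, logvar):
        recon_err = (recon - x).pow(2).mean()                          # reconstruction term
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # KL regularizer
        return recon_err + kl

    x = torch.randn(32, T, C)   # stand-in for a batch of telemetry windows
    recon, mu, logvar = TelemetryVAE()(x)
    print(loss_fn(x, recon, mu, logvar))

After training, sampling the latent space and running the decoder yields synthetic telemetry windows that can then be scored with the qualitative and quantitative measures described above.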
Frontiers in Computational Neuroscience
Historically, neuroscience principles have heavily influenced artificial intelligence (AI); consider, for example, the influence of the perceptron model, essentially a simple model of a biological neuron, on artificial neural networks. More recently, notable AI advances, such as the growing popularity of reinforcement learning, often appear more aligned with cognitive neuroscience or psychology, focusing on function at a relatively abstract level. At the same time, neuroscience stands poised to enter a new era of large-scale, high-resolution data and appears more focused on underlying neural mechanisms or architectures that can, at times, seem rather removed from functional descriptions. While this might seem to foretell a new generation of AI approaches arising from a deeper exploration of neuroscience specifically for AI, the most direct path for achieving this is unclear. Here we discuss cultural differences between the two fields, including divergent priorities that should be considered when leveraging modern-day neuroscience for AI. For example, the two fields feed two very different applications that at times require potentially conflicting perspectives. We highlight small but significant cultural shifts that we feel would greatly facilitate increased synergy between the two fields.
This research aims to develop brain-inspired solutions for reliable and adaptive autonomous navigation in systems that have limited internal and external sensors and may not have access to reliable GPS information. The work on the algorithms investigated and developed by this project was performed in the context of Sandia's A4H (autonomy for hypersonics) mission campaign. These algorithms were additionally explored with respect to their suitability for implementation on emerging neuromorphic computing hardware technology. This project is premised on the hypothesis that brain-inspired SLAM (simultaneous localization and mapping) algorithms may provide an energy-efficient, context-flexible approach to robust, sensor-based, real-time navigation.