Publications

Results 126–150 of 214

Neurogenesis Deep Learning: Extending deep networks to accommodate new classes

Draelos, Timothy J.; Miner, Nadine E.; Lamb, Christopher L.; Vineyard, Craig M.; Carlson, Kristofor D.; James, Conrad D.; Aimone, James B.

Neural machine learning methods, such as deep neural networks (DNN), have achieved remarkable success in a number of complex data processing tasks. These methods have arguably had their strongest impact on tasks such as image and audio processing – data processing domains in which humans have long held clear advantages over conventional algorithms. In contrast to biological neural systems, which are capable of learning continuously, deep artificial networks have a limited ability to incorporate new information into an already trained network. As a result, methods for continuous learning are potentially highly impactful in enabling the application of deep networks to dynamic data sets. Here, inspired by the process of adult neurogenesis in the hippocampus, we explore the potential for adding new neurons to deep layers of artificial neural networks in order to facilitate their acquisition of novel information while preserving previously trained data representations. Our results on the MNIST handwritten digit dataset and the NIST SD 19 dataset, which includes lower and upper case letters and digits, demonstrate that neurogenesis is well suited for addressing the stability-plasticity dilemma that has long challenged adaptive machine learning algorithms.
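
A minimal sketch of the growth step described above, assuming a single fully connected hidden layer stored as NumPy weight matrices: the matrices are enlarged so that existing weights, and hence previously learned representations, are left untouched, while the new units start from small random weights and are trained on the novel classes. The function name, layer layout, and initialization scale are illustrative assumptions, not the paper's implementation.

    import numpy as np

    def add_hidden_neurons(W_in, W_out, n_new, rng=None):
        # Illustrative sketch: grow a fully connected hidden layer by n_new units.
        # W_in  : (n_inputs, n_hidden)   input-to-hidden weights
        # W_out : (n_hidden, n_outputs)  hidden-to-output weights
        # Existing weights are copied unchanged, so previously learned
        # representations are preserved; the new rows/columns start as small
        # random values and are subsequently trained on the novel data.
        rng = np.random.default_rng() if rng is None else rng
        n_inputs, n_hidden = W_in.shape
        n_outputs = W_out.shape[1]

        W_in_grown = np.empty((n_inputs, n_hidden + n_new))
        W_in_grown[:, :n_hidden] = W_in                                      # keep old weights
        W_in_grown[:, n_hidden:] = 0.01 * rng.standard_normal((n_inputs, n_new))

        W_out_grown = np.empty((n_hidden + n_new, n_outputs))
        W_out_grown[:n_hidden, :] = W_out                                    # keep old weights
        W_out_grown[n_hidden:, :] = 0.01 * rng.standard_normal((n_new, n_outputs))

        return W_in_grown, W_out_grown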

Spiking network algorithms for scientific computing

2016 IEEE International Conference on Rebooting Computing, ICRC 2016 - Conference Proceedings

Severa, William M.; Parekh, Ojas D.; Carlson, Kristofor D.; James, Conrad D.; Aimone, James B.

For decades, neural networks have shown promise for next-generation computing, and recent breakthroughs in machine learning techniques, such as deep neural networks, have provided state-of-the-art solutions for inference problems. However, these networks require extensive training and are poorly suited for the precise computations required in scientific or similar arenas. The emergence of dedicated spiking neuromorphic hardware creates a powerful computational paradigm that can be leveraged towards these exact scientific or otherwise objective computing tasks. We forego any learning process and instead construct the network graph by hand. In turn, the networks produce guaranteed results, often with easily computable complexity. We demonstrate a number of algorithms exemplifying concepts central to spiking networks, including spike timing and synaptic delay. We also discuss the application of cross-correlation to particle image velocimetry and provide two spiking algorithms: one uses time-division multiplexing, and the other runs in constant time.
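
As an illustration of the hand-constructed, learning-free approach, the sketch below simulates a small network of integrate-and-fire (threshold) neurons with integer synaptic delays and hand-wires a two-input coincidence detector. The simulator, its parameters, and the example network are assumptions made for illustration rather than algorithms from the paper; they only demonstrate the spike-timing and synaptic-delay concepts mentioned in the abstract.

    def simulate(spike_inputs, synapses, thresholds, T):
        # Discrete-time simulation of hand-wired integrate-and-fire neurons.
        # spike_inputs : {neuron: set of time steps at which it is forced to spike}
        # synapses     : list of (pre, post, weight, delay) tuples, delay >= 1
        # thresholds   : {neuron: firing threshold}
        # Returns {neuron: list of spike times}.
        potentials = {n: 0.0 for n in thresholds}
        arrivals = {}                       # (time, post) -> summed incoming weight
        spikes = {n: [] for n in thresholds}
        for t in range(T):
            for n in thresholds:
                potentials[n] += arrivals.pop((t, n), 0.0)
                fired = t in spike_inputs.get(n, ()) or potentials[n] >= thresholds[n]
                if fired:
                    spikes[n].append(t)
                    potentials[n] = 0.0     # reset after the spike
                    for pre, post, w, d in synapses:
                        if pre == n:
                            arrivals[(t + d, post)] = arrivals.get((t + d, post), 0.0) + w
        return spikes

    # Hand-wired coincidence detector: "c" fires only when the delayed spikes
    # from "a" (delay 2) and "b" (delay 1) arrive at the same time step.
    synapses = [("a", "c", 1.0, 2), ("b", "c", 1.0, 1)]
    thresholds = {"a": 10.0, "b": 10.0, "c": 1.5}
    print(simulate({"a": {0}, "b": {1}}, synapses, thresholds, T=6))
    # -> {'a': [0], 'b': [1], 'c': [2]}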

High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

Neuron

Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung; Dean, T.; Denker, Michael; Diesmann, Markus; Donofrio, David D.; Frank, Loren M.; Kasthuri, Narayanan; Koch, C.; Ruebel, Oliver; Simon, Horst D.; Sommer, Friedrich T.; Prabhat

Opportunities offered by new neuro-technologies are threatened by lack of coherent plans to analyze, manage, and understand the data. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

Quantifying neural information content: A case study of the impact of hippocampal adult neurogenesis

Proceedings of the International Joint Conference on Neural Networks

Vineyard, Craig M.; Verzi, Stephen J.; James, Conrad D.; Aimone, James B.

Through various means of structural and synaptic plasticity enabling online learning, neural networks constantly reconfigure their computational functionality. Neural information content is embodied within the configurations, representations, and computations of these networks, and to explore it we have developed metrics and computational paradigms for quantifying it. We have observed that conventional compression methods may help overcome some of the limiting factors of standard information-theoretic techniques employed in neuroscience and allow us to approximate the information in neural data. Specifically, we use compressibility as a measure of complexity in order to estimate entropy and thereby quantitatively assess the information content of neural ensembles. Using Lempel-Ziv compression, we assess the rate at which new patterns appear in a neural ensemble's firing activity over time, approximating the information content encoded by a neural circuit. As a specific case study, we investigate the effect of the mixed neural coding schemes that arise from hippocampal adult neurogenesis.
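
The compressibility idea can be sketched as follows, assuming a binary spike raster (neurons by time bins) and a dictionary-based, LZ78-style phrase parse; the paper's actual pipeline and normalization may differ. The phrase count, normalized by n / log2(n), gives a rough entropy-rate estimate in bits per symbol, so sparse or repetitive firing compresses well and scores low, while dense random firing scores near 1 bit.

    import numpy as np

    def lz_phrase_count(s):
        # LZ78-style parse: scan the string, extending each phrase until it is
        # novel, and count the distinct phrases produced.
        phrases, count, i, n = set(), 0, 0, len(s)
        while i < n:
            j = i + 1
            while j < n and s[i:j] in phrases:
                j += 1
            phrases.add(s[i:j])
            count += 1
            i = j
        return count

    def entropy_rate_estimate(raster):
        # Normalized LZ complexity of a binary spike raster (neurons x time bins),
        # read out time bin by time bin, as a rough entropy rate in bits per symbol.
        s = "".join(raster.astype(int).astype(str).ravel(order="F"))
        n = len(s)
        return lz_phrase_count(s) * np.log2(n) / n

    # Illustrative comparison: sparse firing yields a lower estimate than
    # dense random firing.
    rng = np.random.default_rng(0)
    sparse = (rng.random((20, 500)) < 0.05).astype(int)
    dense = (rng.random((20, 500)) < 0.5).astype(int)
    print(entropy_rate_estimate(sparse), entropy_rate_estimate(dense))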

Dopaminergic inputs in the dentate gyrus direct the choice of memory encoding

Proceedings of the National Academy of Sciences of the United States of America

Du, Huiyun; Deng, Wei; Aimone, James B.; Ge, Minyan; Parylak, Sarah; Walch, Keenan; Cook, Jonathan; Zhang, Wei; Song, Huina; Wang, Liping; Gage, Fred H.; Mu, Yangling

Rewarding experiences are often well remembered, and such memory formation is known to be dependent on dopamine modulation of the neural substrates engaged in learning and memory; however, it is unknown how and where in the brain dopamine signals bias episodic memory toward preceding rather than subsequent events. Here we found that photostimulation of channelrhodopsin-2-expressing dopaminergic fibers in the dentate gyrus induced a long-term depression of cortical inputs, diminished theta oscillations, and impaired subsequent contextual learning. Computational modeling based on this dopamine modulation indicated an asymmetric association of events occurring before and after reward in memory tasks. In subsequent behavioral experiments, preexposure to a natural reward suppressed hippocampus-dependent memory formation, with an effective time window consistent with the duration of dopamine-induced changes of dentate activity. Overall, our results suggest a mechanism by which dopamine enables the hippocampus to encode memory with reduced interference from subsequent experience.

Low excitatory innervation balances high intrinsic excitability of immature dentate neurons

Nature Communications

Dieni, Cristina V.; Panichi, Roberto; Aimone, James B.; Kuo, Chay T.; Wadiche, Jacques I.; Overstreet-Wadiche, Linda

Persistent neurogenesis in the dentate gyrus produces immature neurons with high intrinsic excitability and low levels of inhibition that are predicted to be more broadly responsive to afferent activity than mature neurons. Mounting evidence suggests that these immature neurons are necessary for generating distinct neural representations of similar contexts, but it is unclear how broadly responsive neurons help distinguish between similar patterns of afferent activity. Here we show that stimulation of the entorhinal cortex in mouse brain slices paradoxically generates spiking of mature neurons in the absence of immature neuron spiking. Immature neurons with high intrinsic excitability fail to spike due to insufficient excitatory drive that results from low innervation rather than silent synapses or low release probability. Our results suggest that low synaptic connectivity prevents immature neurons from responding broadly to cortical activity, potentially enabling excitable immature neurons to contribute to sparse and orthogonal dentate representations.

Computational modeling of adult neurogenesis

Cold Spring Harbor Perspectives in Biology

Aimone, James B.

The restriction of adult neurogenesis to only a handful of regions of the brain is suggestive of some shared requirement for this dramatic form of structural plasticity. However, a common driver across neurogenic regions has not yet been identified. Computational studies have been invaluable in providing insight into the functional role of new neurons; however, researchers have typically focused on specific scales ranging from abstract neural networks to specific neural systems, most commonly the dentate gyrus area of the hippocampus. These studies have yielded a number of diverse potential functions for new neurons, ranging from an impact on pattern separation to the incorporation of time into episodic memories to enabling the forgetting of old information. This review will summarize these past computational efforts and discuss whether these proposed theoretical functions can be unified into a common rationale for why neurogenesis is required in these unique neural circuits.
