Spike Processing for Improved Radiation Tolerance and Low Power Space Missions
Abstract not provided.
The Spiking Processing Array (SPARR) is a novel photonic focal plane whose pixels generate electronic spikes autonomously and without a clock. These spikes feed a network of digital asynchronous processing elements (DAPEs). By building a useful assemblage of DAPEs and connecting them together in the correct way, sophisticated signal processing can be accomplished within the focal plane. Autonomous self-resetting pixels (ASPs) enable SPARR to generate an electronic response from very small signals: as little as a single photon in the case of Geiger-mode avalanche photodiodes, and as few as several hundred photons for in-CMOS photodetectors. These spiking pixels enable fast detector response but do not draw as much continuous power as synchronous clocked designs. Because the spikes emitted by the pixels all have the same magnitude, the information from the scene is effectively encoded in the rate of the spikes and the times at which they are emitted. The spiking pixels, having converted incident light into electronic spikes, supply the spikes to the network of digital asynchronous processors. These are small state machines that respond to spikes arriving at their input ports by either remaining unchanged or updating their internal state, possibly emitting a spike on one or more output ports. We show a design that accomplishes the sophisticated signal processing of a Haar spatial wavelet transform with spatial-spectral whitening, and we show how this design yields data streams that support imaging and transient optical source detection. Two simulators support this analysis: SPICE and sparrow. The Cadence SPICE simulator provides accurate CMOS design, accounting for the effects of circuit parasitics through layout, accurate timing, and accurate energy-consumption estimates.
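The Haar spatial wavelet transform that the abstract says SPARR computes with spiking arithmetic can be stated compactly in conventional dense form. The sketch below (plain NumPy, not the SPARR hardware pipeline) shows one level of the 2-D transform; the function name and the averaging normalization are illustrative choices, not taken from the paper.

```python
import numpy as np

def haar2d_level(img):
    """One level of the 2-D Haar transform on 2x2 blocks: local mean plus
    horizontal, vertical, and diagonal detail bands (a dense stand-in for
    the transform SPARR evaluates with spikes)."""
    a = img[0::2, 0::2]  # top-left of each 2x2 block
    b = img[0::2, 1::2]  # top-right
    c = img[1::2, 0::2]  # bottom-left
    d = img[1::2, 1::2]  # bottom-right
    ll = (a + b + c + d) / 4   # local mean (approximation band)
    lh = (a - b + c - d) / 4   # horizontal detail
    hl = (a + b - c - d) / 4   # vertical detail
    hh = (a - b - c + d) / 4   # diagonal detail
    return ll, lh, hl, hh
```

On a constant image the three detail bands vanish and the mean band reproduces the constant, which is a quick sanity check on the sign pattern.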
To assess larger networks with more pixels more rapidly, sparrow is a custom discrete-event simulator that supports the non-homogeneous Poisson processes that underlie photoelectric interaction. Sparrow is a photon-exact simulator that nevertheless performs SPARR system simulation for large-scale systems.
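Photon arrivals at a pixel form a non-homogeneous Poisson process, which a photon-exact discrete-event simulator must sample. A minimal sketch of one standard sampling method, Lewis-Shedler thinning, is below; the function name and the example intensity (a transient flash on a steady background) are illustrative and not sparrow's actual API.

```python
import numpy as np

def sample_nhpp(rate_fn, t_end, rate_max, rng):
    """Draw event times on [0, t_end) from a non-homogeneous Poisson
    process with intensity rate_fn, via Lewis-Shedler thinning.
    rate_max must upper-bound rate_fn on the interval."""
    times = []
    t = 0.0
    while True:
        t += rng.exponential(1.0 / rate_max)          # candidate from rate_max process
        if t >= t_end:
            break
        if rng.random() < rate_fn(t) / rate_max:      # keep with prob rate/rate_max
            times.append(t)
    return np.asarray(times)

rng = np.random.default_rng(0)
# Illustrative intensity: 50 counts/s background plus a Gaussian flash at t=0.5 s.
rate = lambda t: 50.0 + 400.0 * np.exp(-((t - 0.5) ** 2) / (2 * 0.05 ** 2))
spikes = sample_nhpp(rate, 1.0, 500.0, rng)
```

The accepted times are the photoelectron (and hence pixel spike) epochs; the expected count is the integral of the intensity over the interval, here roughly 100.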
Proceedings - 2019 IEEE Space Computing Conference, SCC 2019
Technological advances have enabled exponential growth in both sensor data collection and computational processing. However, the transmission bandwidth between a space-based sensor and a ground-station processing center has not seen the same growth and has become a limiting factor. One resolution to this bandwidth limitation is to move the processing to the sensor, but doing so faces size, weight, and power operational constraints. Changing physical constraints on processor manufacturing are spurring a resurgence in neuromorphic approaches amenable to the space-based operational environment. Here we describe historical trends in computer architecture and their implications for neuromorphic computing, and we give an overview of how remote sensing applications may be impacted by this emerging direction for computing.
Journal of the Optical Society of America A: Optics, Image Science, and Vision
This paper addresses parameter estimation for an optical transient signal when the received data has been right-censored. We develop an expectation-maximization (EM) algorithm to estimate the amplitude of a Poisson intensity with a known shape in the presence of additive background counts, where the measurements are subject to saturation effects. We compare the results of our algorithm with those of an EM algorithm that is unaware of the censoring.
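To illustrate the idea, here is a minimal sketch of censoring-aware EM for the amplitude of a Poisson intensity with a known shape. For brevity it drops the additive background term from the paper's model, so the M-step has a closed form; all names, the simulated data, and the saturation level are illustrative, not the paper's implementation.

```python
import numpy as np
from scipy.stats import poisson

def em_censored_amplitude(z, censored, shape, n_iter=100):
    """EM estimate of amplitude a in y_i ~ Poisson(a * shape_i), where
    censored observations only tell us y_i >= z_i (detector saturation).
    The paper's additive background term is omitted here for brevity."""
    a = max(z.sum() / shape.sum(), 1e-8)              # crude initial guess
    for _ in range(n_iter):
        lam = a * shape
        # E-step: replace each censored count with its conditional mean,
        # E[Y | Y >= c] = lam * P(Y >= c-1) / P(Y >= c) for Y ~ Poisson(lam).
        y_hat = z.astype(float).copy()
        c = z[censored]
        lc = lam[censored]
        y_hat[censored] = lc * poisson.sf(c - 2, lc) / poisson.sf(c - 1, lc)
        # M-step: closed form because lam is linear in a with no background.
        a = y_hat.sum() / shape.sum()
    return a

# Simulated check: known amplitude, counts saturating at 25.
rng = np.random.default_rng(1)
shape = rng.uniform(0.2, 1.5, size=500)
a_true = 20.0
y = rng.poisson(a_true * shape)
sat = 25
censored = y >= sat
z = np.where(censored, sat, y)
a_est = em_censored_amplitude(z, censored, shape)
```

An EM that ignores the censoring would instead treat the saturated values as exact counts, biasing the amplitude low; the conditional-mean E-step is what removes that bias.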
Abstract not provided.
Of the non-corrupted data collected by the Fast On-orbit Recording of Transient Events (FORTE) satellite's Photo-Diode Detector during the year 2001, I estimate that 7.9% of 914,894 signals are noise. My result differs dramatically from Guillen's estimate of 96%. To arrive at this estimate, I used Gaussian mixture model (GMM) clustering (unsupervised machine learning) to aggregate the waveforms into groups based on the absolute values of the lowest 25 positive-frequency discrete Fourier transform coefficients. Then, I marked several of the groups as noise by inspecting a random sampling of waveforms from each group. Marking groups as either noise or non-noise is a supervised binary classification operation. After removing the signals in noise groups from further consideration, I clustered the remaining signals into families. Again I used a GMM, but for the familial clustering I applied a non-negative matrix factorization (NMF) transform to the feature vectors. The result was 9 distinct families of lightning signals, as well as a second stage of noise filtering. To efficiently represent the entirety of the signal space, I broke each family into deciles based on distance from the family mean; in this case, distance means the log-likelihood under the GMM. Signals in lower deciles are more similar in shape and amplitude to their family average. I took the top 200 samples from each decile of each group, resulting in 18,000 signals. These signals approximately represent the entirety of the FORTE observations. To represent outliers, I also kept a zoo of the 1,000 signals furthest from any family's average. All told, the resulting data set represents the FORTE data with a reduction of about 51:1. To allow synthesis of an arbitrarily large number of test signals, I also captured each family's average signal and the time-sample covariance matrix over the signals in each family.
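The first clustering stage described above can be sketched as follows: the magnitudes of the low-order DFT coefficients serve as features, and a Gaussian mixture groups the waveforms. This uses scikit-learn on toy two-tone data; the cluster count, function name, and data are illustrative, not the actual FORTE pipeline.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def cluster_waveforms(waves, n_coeffs=25, n_clusters=12, seed=0):
    """Group waveforms by the magnitudes of their lowest positive-frequency
    DFT coefficients (DC excluded), then fit and apply a GMM."""
    feats = np.abs(np.fft.rfft(waves, axis=1))[:, 1:n_coeffs + 1]
    gmm = GaussianMixture(n_components=n_clusters, random_state=seed).fit(feats)
    return gmm.predict(feats), gmm

# Toy data: damped sinusoids at two frequencies, plus additive noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 128)
freqs = rng.choice([4, 9], size=200)
tones = np.array([np.sin(2 * np.pi * f * t) * np.exp(-3 * t) for f in freqs])
waves = tones + 0.1 * rng.standard_normal((200, 128))
labels, gmm = cluster_waveforms(waves, n_clusters=2)
```

With two well-separated spectral peaks the mixture recovers the two tone populations almost perfectly, which is the same mechanism that separates noise-like spectra from lightning-like spectra at full scale.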
Using these two pieces of information, I can synthesize new waveforms by drawing a Gaussian random realization from the family average and covariance matrix. I wrote a program to test the synthesis quality: it shows two signals on the screen, one synthesized and one randomly drawn from the data, and I attempt to identify the synthesized signal. Although the synthesis is imperfect, in this A/B comparison I correctly chose the synthesized signal only 36% of the time.
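The synthesis step amounts to drawing from a multivariate Gaussian whose mean is the family's average waveform and whose covariance is the family's time-sample covariance matrix. A minimal sketch, with toy stand-ins for the family statistics (the real ones come from the FORTE families):

```python
import numpy as np

def synthesize(mean, cov, n, rng):
    """Draw n surrogate waveforms from a Gaussian fit to one family:
    the family's average signal plus correlated noise shaped by the
    time-sample covariance matrix."""
    return rng.multivariate_normal(mean, cov, size=n)

rng = np.random.default_rng(0)
# Toy stand-ins: a damped-sinusoid family mean and a smooth exponential
# correlation kernel as the time-sample covariance.
t = np.linspace(0, 1, 64)
mean = np.sin(2 * np.pi * 5 * t) * np.exp(-4 * t)
cov = 0.01 * np.exp(-np.abs(t[:, None] - t[None, :]) / 0.1)
fakes = synthesize(mean, cov, 10, rng)
```

Each draw shares the family's average shape and its typical sample-to-sample correlation structure, which is why casual A/B inspection has trouble telling surrogates from real records.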