PET ROCK: Persistent Event-based sensing at Testbeds Real-world Objective Comparison of Key-hardware
Abstract not provided.
This technical report summarizes a literature search covering confidence calibration. It is meant to serve as a solid starting reference for individuals interested in learning more about the confidence calibration domain and, for those already familiar with this work, as a summarizing document for calibration metrics, which is notably lacking in the literature. This report is not meant to be a comprehensive review of everything that has been done in this field; indeed, the reader is encouraged to look further into this domain. We describe confidence and calibration and discuss properties of good calibration metrics. We detail various calibration and calibration-tangential metrics, presenting equations, algorithms, parameters, and an analysis of strengths and weaknesses. We apply a subset of these metrics to eight proxy confidence assessment datasets. We examine the various metrics in the context of model confidence. Finally, we discuss promising future directions and outstanding questions.
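As a concrete illustration of the kind of calibration metric the report surveys, the sketch below computes expected calibration error (ECE) by binning predictions by confidence and averaging the gap between accuracy and mean confidence per bin. This is a minimal, self-contained sketch for orientation only; it is not taken from the report, and the binning details (bin count, edge handling) are common conventions rather than the report's specific choices.

```python
# Illustrative sketch (not from the report): expected calibration error (ECE).
# Bin predictions by confidence, then sum |accuracy - mean confidence| per bin,
# weighted by the fraction of samples falling in that bin.

def expected_calibration_error(confidences, correct, n_bins=10):
    """confidences: predicted probabilities in [0, 1];
    correct: 1 if the prediction was right, else 0."""
    n = len(confidences)
    ece = 0.0
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        # half-open bins (lo, hi]; a confidence of exactly 0 lands in the first bin
        idx = [i for i, c in enumerate(confidences)
               if lo < c <= hi or (b == 0 and c == 0.0)]
        if not idx:
            continue
        acc = sum(correct[i] for i in idx) / len(idx)
        avg_conf = sum(confidences[i] for i in idx) / len(idx)
        ece += (len(idx) / n) * abs(acc - avg_conf)
    return ece

# Toy example: the 0.9-confidence bin is 100% accurate (gap 0.1),
# the 0.1-confidence bin is 0% accurate (gap 0.1).
print(expected_calibration_error([0.9, 0.9, 0.1], [1, 1, 0]))  # -> 0.1
```

A lower ECE indicates that reported confidences track empirical accuracy more closely; a perfectly calibrated model scores 0.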
Proceedings of SPIE - The International Society for Optical Engineering
Event-based sensors are a novel sensing technology that captures the dynamics of a scene via pixel-level change detection. The technology operates with high speed (>10 kHz), low latency (10 µs), low power consumption (<1 W), and high dynamic range (120 dB). Whereas conventional, frame-based architectures report data for every pixel at a fixed frame rate, event-based sensor pixels report data only when a change in pixel intensity occurs. This affords the possibility of dramatically reducing the data reported in bandwidth-limited environments (e.g., remote sensing), and thus the data that must be processed, while still recovering significant events. Degraded visual environments, such as those generated by fog, often hinder situational awareness by decreasing optical resolution and transmission range via random scattering of light. To respond to this challenge, we present the deployment of an event-based sensor in a controlled, experimentally generated, well-characterized degraded visual environment (a fog analogue) for detection of a modulated signal, and a comparison of the data collected by the event-based sensor against a traditional framing sensor.
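The change-detection principle described above can be sketched in a few lines: a pixel emits an event only when its log intensity changes by more than a contrast threshold, and all other pixels stay silent, which is what reduces bandwidth relative to frame-based readout. This is a hypothetical emulation from frame pairs for illustration, not the sensor's actual asynchronous circuit; the threshold value and the (x, y, t, polarity) event tuple are common conventions, not details from this work.

```python
# Illustrative sketch (assumption, not the sensor's actual pipeline):
# emulate event-based output from a pair of intensity frames.
import math

def frame_to_events(prev, curr, threshold=0.2, t=0.0):
    """prev, curr: 2-D lists of positive intensities.
    Returns a list of (x, y, t, polarity) events, polarity +1/-1."""
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (ip, ic) in enumerate(zip(row_p, row_c)):
            # event-based pixels respond to log-intensity change
            delta = math.log(ic) - math.log(ip)
            if abs(delta) >= threshold:
                events.append((x, y, t, 1 if delta > 0 else -1))
    return events

prev = [[100, 100], [100, 100]]
curr = [[100, 150], [60, 100]]
print(frame_to_events(prev, curr))  # only the two changed pixels fire
```

Unchanged pixels produce no output at all, in contrast to a framing sensor, which would report all four pixels every frame regardless of scene dynamics.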