Publications / Conference Poster

A Communication-Efficient Algorithm for Exponentially Fast Non-Bayesian Learning in Networks

Mitra, Aritra; Richards, John R.; Sundaram, Shreyas

We introduce a simple time-triggered protocol to achieve communication-efficient non-Bayesian learning over a network. Specifically, we consider a scenario where a group of agents interact over a graph with the aim of discerning the true state of the world that generates their joint observation profiles. To address this problem, we propose a novel distributed learning rule wherein agents aggregate neighboring beliefs based on a min-protocol, and the inter-communication intervals grow geometrically at a rate a ≥ 1. Despite such sparse communication, we show that each agent is still able to rule out every false hypothesis exponentially fast with probability 1, as long as a is finite. For the special case when communication occurs at every time step, i.e., when a = 1, we prove that the asymptotic learning rates resulting from our algorithm are independent of the network structure and a strict improvement over existing rates. In contrast, when a > 1, our analysis reveals that the asymptotic learning rates vary across agents and exhibit a non-trivial dependence on the network topology and the relative entropies of the agents' likelihood models. This motivates us to consider the problem of allocating signal structures to agents so as to maximize appropriate performance metrics. In certain special cases, we show that the eccentricity centrality and the decay centrality of the underlying graph help identify optimal allocations; for more general cases, we bound the deviation from the optimal allocation as a function of the parameter a and the diameter of the graph.
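The update rule sketched in the abstract can be illustrated in code. The following is a minimal, hypothetical sketch, assuming a synchronous model in which, at communication times whose gaps grow geometrically at rate a ≥ 1, each agent takes a coordinate-wise minimum over its own and its neighbors' beliefs before reweighting by the likelihood of its private signal; between communication times it performs only the local reweighting. All function names, the tie between the min step and the Bayesian-style step, and the rounding of gap lengths are illustrative assumptions, not the authors' exact specification.

```python
import numpy as np

def communication_times(a, horizon):
    """Time steps at which agents exchange beliefs.
    Inter-communication gaps grow geometrically at rate a >= 1;
    a = 1 recovers communication at every time step."""
    times, t, gap = set(), 0, 1.0
    while t < horizon:
        times.add(t)
        t += max(1, int(round(gap)))  # integer gaps (an assumption)
        gap *= a
    return times

def min_rule_learning(likelihoods, signals, adjacency, a=2.0):
    """Hypothetical sketch of the min-protocol learning rule.

    likelihoods[i][theta] : agent i's signal distribution under hypothesis theta
    signals[t][i]         : agent i's observed signal index at time t
    adjacency[i]          : list of agent i's neighbors
    Returns the final belief vectors, one row per agent.
    """
    n = len(adjacency)
    m = len(likelihoods[0])                  # number of hypotheses
    beliefs = np.full((n, m), 1.0 / m)       # uniform priors
    comm = communication_times(a, len(signals))
    for t, obs in enumerate(signals):
        new = np.empty_like(beliefs)
        for i in range(n):
            if t in comm:
                # min-protocol: coordinate-wise minimum over own and
                # neighboring beliefs at a communication time
                nbhd = [i] + list(adjacency[i])
                agg = beliefs[nbhd].min(axis=0)
            else:
                agg = beliefs[i]             # no exchange this step
            # local Bayesian-style reweighting by the observed signal
            post = agg * np.array(
                [likelihoods[i][th][obs[i]] for th in range(m)])
            new[i] = post / post.sum()
        beliefs = new
    return beliefs
```

Under informative signals, beliefs on each false hypothesis shrink at every step, consistent with the exponentially fast elimination claimed in the abstract; with a > 1, how quickly an agent benefits from its neighbors' likelihood models depends on how far it sits from them in the graph, which is where the topology dependence enters.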