Gather a bunch of sensors, attach a radio to each, and throw them into the field. Can the sensors communicate efficiently in spite of radio interference and background noise? Such problems have been well studied under the classic "radio network model". Formally, the network is represented as an undirected graph whose vertices are the sensors and whose edges connect pairs of sensors that are within range of each other. A vertex (sensor) receives a message if and only if it is listening and exactly one of its neighbors is broadcasting.
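The reception rule above can be sketched as a single synchronous round. This is a minimal illustration, not code from the talk; the function and variable names (`round_receptions`, `adjacency`, `broadcasting`) are my own.

```python
def round_receptions(adjacency, broadcasting):
    """Simulate one round of the classic radio network model.

    adjacency: dict mapping each vertex to its set of neighbors
               (an undirected graph).
    broadcasting: dict mapping each vertex to the message it sends
                  this round, or None if it is listening.
    Returns a dict mapping each listening vertex to the message it
    receives, or None on silence or collision.
    """
    received = {}
    for v, neighbors in adjacency.items():
        if broadcasting.get(v) is not None:
            continue  # a broadcasting vertex does not listen
        # Messages arriving at v from broadcasting neighbors.
        msgs = [broadcasting[u] for u in neighbors
                if broadcasting.get(u) is not None]
        # Reception succeeds iff exactly one neighbor broadcasts;
        # zero is silence, two or more is a collision.
        received[v] = msgs[0] if len(msgs) == 1 else None
    return received
```

For example, on the path a–b–c with only a broadcasting, b hears a's message and c hears nothing; if a and c both broadcast, b hears a collision.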
However, the radio network model makes the strong assumption that messages are delivered deterministically. I will therefore present the newly introduced noisy radio network model, which relaxes this assumption by introducing background noise: messages can be dropped at random. I will discuss the relative computational power of the noisy and classic radio network models. In particular, given an arbitrary protocol for a radio network of constant maximum degree, I will show how to reliably simulate the protocol in the noisy setting with at most a constant multiplicative blowup in its duration.
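One plausible reading of the noisy model (my own sketch, not necessarily the paper's exact definition) is that each otherwise-successful reception is lost independently with some probability:

```python
import random

def noisy_round(adjacency, broadcasting, drop_prob, rng=random):
    """One round of a noisy radio network: reception follows the
    classic rule (exactly one broadcasting neighbor), but each
    successful reception is then dropped independently with
    probability drop_prob, modeling background noise.

    This noise model is an illustrative assumption based on the
    abstract, not the paper's formal definition.
    """
    received = {}
    for v, neighbors in adjacency.items():
        if broadcasting.get(v) is not None:
            continue  # broadcasting vertices do not listen
        msgs = [broadcasting[u] for u in neighbors
                if broadcasting.get(u) is not None]
        # Classic success condition, then an independent coin flip.
        ok = len(msgs) == 1 and rng.random() >= drop_prob
        received[v] = msgs[0] if ok else None
    return received
```

With `drop_prob = 0` this coincides with the classic model; the simulation result says that even with constant `drop_prob`, a constant-degree protocol can be run reliably with only constant-factor slowdown.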
Joint work with Keren Censor-Hillel, Bernhard Haeupler, Ellis Hershkowitz and Jason Li.
Presented in Partial Fulfillment of the CSD Speaking Skills Requirement