We study the effect of noise on the $n$-party beeping model. In this model, in every round, each party may decide to either `beep' or stay silent; all parties hear a beep if and only if at least one party beeps. The beeping model has become increasingly popular, as it offers a simple abstraction of wireless networks and is well suited to studying biological phenomena. Still, the noise resilience of the beeping model is not yet understood.
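For concreteness, a single round can be formalized as follows (the notation $X_{i,t}$, $Y_t$, $\widetilde{Y}_{j,t}$, and $\epsilon$ below is ours, introduced only for illustration): if $X_{i,t} \in \{0,1\}$ indicates whether party $i$ beeps in round $t$, then in the noiseless model every party hears
\[
Y_t \;=\; \bigvee_{i=1}^{n} X_{i,t},
\]
i.e., the channel computes the OR of the parties' decisions. A noisy round then delivers to each party $j$ a possibly corrupted bit $\widetilde{Y}_{j,t}$; under stochastic noise, $\widetilde{Y}_{j,t}$ differs from $Y_t$ with some probability $\epsilon$, and these flips may be correlated across parties.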
Our main result is a lower bound showing that making protocols in the beeping model resilient to noise may incur a large performance overhead. Specifically, we give a protocol that works over the (noiseless) beeping model, and prove that any scheme simulating this protocol over the beeping model with correlated stochastic noise must blow up the number of rounds by a multiplicative factor of $\Omega(\log n)$.
We complement this result with a matching upper bound, constructing a noise-resilient simulation scheme with $O(\log n)$ overhead for any noiseless beeping protocol.
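To see where an $O(\log n)$ overhead could come from, consider the standard repetition baseline, sketched here only for intuition and under a simplifying assumption: each party's received bit is flipped independently with probability $\epsilon < 1/2$ (the scheme above must handle the correlated case, which this sketch does not address). Simulate each round of the noiseless protocol by $k = c\ln n$ identical repetitions, with each party decoding via the majority of the $k$ bits it heard. By Hoeffding's inequality, a fixed party misdecodes a fixed round with probability at most
\[
e^{-2(1/2-\epsilon)^2 k} \;=\; n^{-2c(1/2-\epsilon)^2},
\]
so for a sufficiently large constant $c$, a union bound over all $n$ parties and all rounds of a polynomial-length protocol keeps every party's simulated transcript correct with high probability.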