In this paper we study interactive ``one-shot'' analogues of the classical Slepian-Wolf theorem. Alice receives a value of a random variable $X$; Bob receives a value of another random variable $Y$ that is jointly distributed with $X$. Alice's goal is to transmit $X$ to Bob (with some error probability $\varepsilon$). Instead of the one-way transmission studied in classical coding theory, we allow the players to interact. They may also use shared randomness.
We show that Alice can transmit $X$ to Bob using, in expectation, $H(X|Y) + 2\sqrt{H(X|Y)} + O\left(\log_2\left(\frac{1}{\varepsilon}\right)\right)$ bits. Moreover,
we show that every one-round protocol $\pi$ with information complexity $I$
can be compressed to a (many-round) protocol
with expected communication about $I + 2\sqrt{I}$ bits.
This improves a result of Braverman and Rao \cite{braverman2011information}, where the corresponding bound was about $I + 5\sqrt{I}$.
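To make the flavor of these protocols concrete, the following Python sketch implements the shared-randomness hashing idea underlying bounds of this shape: Alice streams random hash bits of her input in blocks of growing size, and Bob, who knows the conditional distribution of $X$ given his $Y$, replies after each block whether a sufficiently likely candidate is consistent with everything received so far. This is an illustration of the general technique only, not the exact protocol of the paper; the names (\texttt{transmit}, \texttt{px\_given\_y}, \texttt{margin}) and the constants are ours and are not optimized.

\begin{verbatim}
import math
import random
from collections import defaultdict
from itertools import count

def transmit(px_given_y, x_true, eps, seed=0):
    """Sketch of one-shot Slepian-Wolf coding via shared random hashing.

    px_given_y: dict x -> P(X = x | Y = y), known to Bob; it must assign
                x_true positive probability, or the loop never terminates.
    x_true:     Alice's input.
    eps:        target error probability.
    Returns (Bob's output, total bits communicated)."""
    rng = random.Random(seed)            # shared randomness
    table = defaultdict(list)            # shared hash bits, drawn lazily
    def hbit(x, i):                      # i-th random hash bit of candidate x
        while len(table[x]) <= i:
            table[x].append(rng.randrange(2))
        return table[x][i]

    margin = math.ceil(math.log2(1.0 / eps)) + 1   # anti-collision slack
    by_prob = sorted(px_given_y, key=px_given_y.get, reverse=True)
    sent, total = [], 0
    for j in count(1):                   # block j: Alice sends j hash bits,
        for _ in range(j):               # so feedback costs O(sqrt(#hash bits))
            sent.append(hbit(x_true, len(sent)))
            total += 1
        t = len(sent)
        # Bob keeps candidates that are likely enough for the current level
        # and agree with every hash bit received so far.
        ok = [x for x in by_prob
              if px_given_y[x] >= 2.0 ** (margin - t)
              and all(hbit(x, i) == sent[i] for i in range(t))]
        total += 1                       # Bob's found / not-found reply
        if ok:
            return ok[0], total          # most likely consistent candidate
\end{verbatim}

In this sketch Alice reveals about $H(X|Y) + \log_2(1/\varepsilon)$ hash bits in blocks of growing size, so Bob's one-bit replies add only $O(\sqrt{H(X|Y)})$ communication; tuning the block sizes and probability thresholds is what brings the overhead down to the $2\sqrt{H(X|Y)}$ stated above.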
Further, we show how to solve this problem (transmitting $X$) using $3H(X|Y) + O\left(\log_2\left(\frac{1}{\varepsilon}\right)\right)$ bits and $4$ rounds on average. This improves a result of~\cite{brody2013towards}, which required $4H(X|Y) + O\left(\log_2\left(\frac{1}{\varepsilon}\right)\right)$ bits and $10$ rounds on average.
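To see why a constant number of rounds on average is plausible, consider the following back-of-the-envelope calculation (given under a simplifying assumption; it is not the actual analysis of the paper). If every additional round is reached with probability at most a constant $\delta < 1$, say because each round appends enough hash bits to make failure exponentially unlikely, then
\[
\mathbb{E}[\text{number of rounds}] \;=\; \sum_{j \ge 1} \Pr[\text{round } j \text{ is reached}] \;\le\; \sum_{j \ge 1} \delta^{j-1} \;=\; \frac{1}{1-\delta},
\]
and the expected communication likewise stays within a constant factor of the first round's cost. Balancing the per-round hash lengths against this failure probability is what produces constants such as the $3$ and $4$ above.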
At the end of the paper we discuss how many bits, beyond $H(X|Y)$, Alice and Bob may need to communicate on average. The main question is whether the upper bounds mentioned above are tight. We provide an example of $(X, Y)$ such that transmitting $X$ from Alice to Bob with error probability $\varepsilon$ requires $H(X|Y) + \Omega\left(\log_2\left(\frac{1}{\varepsilon}\right)\right)$ bits on average.
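In summary, for the hardest pairs $(X, Y)$ the optimal expected communication $L$ of this problem is bracketed by the bounds above:
\[
H(X|Y) + \Omega\!\left(\log_2\frac{1}{\varepsilon}\right) \;\le\; L \;\le\; H(X|Y) + 2\sqrt{H(X|Y)} + O\!\left(\log_2\frac{1}{\varepsilon}\right),
\]
so the remaining gap is the $2\sqrt{H(X|Y)}$ term, and whether it can be removed is left open.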