Let $f : \{0,1\}^n \times \{0,1\}^n \rightarrow \{0,1\}$ be a 2-party function. For every product distribution $\mu$ on $\{0,1\}^n \times \{0,1\}^n$, we show that $${{CC}}^\mu_{0.49}(f) = O\left(\left(\log {{rprt}}_{1/4}(f) \cdot \log \log {{rprt}}_{1/4}(f)\right)^2\right),$$ where ${{CC}}^\mu_\varepsilon(f)$ is the distributional communication complexity with error at most $\varepsilon$ under the distribution $\mu$ and ${{rprt}}_{1/4}(f)$ is the {\em relaxed partition bound} of the function $f$, as introduced by Kerenidis et al. [Proc. FOCS 2012]. A similar upper bound on communication complexity under product distributions in terms of information complexity was recently (and independently) obtained by Kol [ECCC Tech. Report TR15-168].
We show a similar result for query complexity under product distributions. Let $g : \{0,1\}^n\rightarrow \{0,1\}$ be a function. For every bit-wise product distribution $\mu$ on $\{0,1\}^n$, we show that $${{QC}}^\mu_{1/3}(g) = O\left(\left(\log {{qrprt}}_{1/4}(g) \cdot \log \log {{qrprt}}_{1/4}(g)\right)^2\right),$$
where ${{QC}}^\mu_{1/3}(g)$ is the distributional query complexity with error at most $1/3$ under the distribution $\mu$ and ${{qrprt}}_{1/4}(g)$ is the {\em relaxed query partition bound} of the function $g$.
Recall that relaxed partition bounds were introduced (in both the communication complexity and query complexity models) to provide LP-based lower bounds on randomized communication complexity and randomized query complexity. Our results demonstrate that these lower bounds are polynomially tight for {\em product} distributions.