ECCC Report TR10-160
https://eccc.weizmann.ac.il/report/2010/160
Thu, 28 Oct 2010 22:49:32 +0200
On Approximating the Entropy of Polynomial Mappings
Zeev Dvir,
Dan Gutfreund,
Guy Rothblum,
Salil Vadhan
We investigate the complexity of the following computational problem:
Polynomial Entropy Approximation (PEA):
Given a low-degree polynomial mapping
$p : F^n\rightarrow F^m$, where $F$ is a finite field, approximate the output entropy
$H(p(U_n))$, where $U_n$ is the uniform distribution on $F^n$ and $H$ may be any of several entropy measures.
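To make the quantity concrete, the following sketch brute-forces $H(p(U_n))$ for a toy quadratic map over $F_2$ by enumerating all $2^n$ inputs. The map `p` and all names here are illustrative assumptions, not taken from the paper; note that this exhaustive enumeration runs in time exponential in $n$, which is exactly the cost PEA asks to avoid.

```python
from collections import Counter
from itertools import product
from math import log2

def shannon_entropy(p, n):
    """Enumerate all 2^n inputs and compute H(p(U_n)) in bits."""
    counts = Counter(p(x) for x in product((0, 1), repeat=n))
    total = 2 ** n
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Illustrative map p : F_2^3 -> F_2^2 with one degree-2 coordinate.
def p(x):
    x1, x2, x3 = x
    return (x1 & x2, x1 ^ x3)

H = shannon_entropy(p, 3)  # approx. 1.811 bits
```

Here the four output values occur with probabilities $3/8, 3/8, 1/8, 1/8$, so $H(p(U_3)) \approx 1.811$ bits, strictly below the maximum of $2$ bits for a map into $F_2^2$.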
We show:
1. Approximating the Shannon entropy of degree-3 polynomials $p : F_2^n\rightarrow F_2^m$ to within an additive constant (or even $n^{0.9}$)
is complete for $SZKP_L$, the class of problems
having statistical zero-knowledge proofs where the honest verifier and its simulator are computable in logarithmic space. ($SZKP_L$ contains most of the natural problems known to be in the full class $SZKP$.)
2. For prime fields $F\neq F_2$ and homogeneous quadratic polynomials $p : F^n\rightarrow F^m$, there is a probabilistic polynomial-time algorithm that distinguishes the case that $p(U_n)$ has entropy smaller than $k$ from the case that $p(U_n)$ has min-entropy (or even Rényi entropy) greater than $(2+o(1))k$.
3. For degree-$d$ polynomials $p : F_2^n\rightarrow F_2^m$, there is a polynomial-time algorithm that distinguishes the case that $p(U_n)$ has max-entropy smaller than $k$ (where the max-entropy of a random variable is the logarithm of its support size) from the case that $p(U_n)$ has max-entropy at least $(1+o(1))\cdot k^d$ (for fixed $d$ and large $k$).
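The results above involve several entropy measures beyond Shannon entropy. The sketch below computes min-entropy $H_\infty$, Rényi (collision) entropy $H_2$, and max-entropy $H_0$ for a toy quadratic map over $F_2$ by brute force; the map and helper names are illustrative assumptions, not constructions from the paper.

```python
from collections import Counter
from itertools import product
from math import log2

def output_distribution(p, n):
    """Probabilities of the outputs of p(U_n), by full enumeration."""
    counts = Counter(p(x) for x in product((0, 1), repeat=n))
    total = 2 ** n
    return [c / total for c in counts.values()]

def min_entropy(probs):
    return -log2(max(probs))                 # H_inf = -log2 max_y Pr[p(U_n) = y]

def renyi2_entropy(probs):
    return -log2(sum(q * q for q in probs))  # H_2 = -log2 (collision probability)

def max_entropy(probs):
    return log2(len(probs))                  # H_0 = log2 |support(p(U_n))|

# Illustrative degree-2 map p : F_2^3 -> F_2^3.
def p(x):
    x1, x2, x3 = x
    return ((x1 & x2) ^ x3, x1 ^ x2, x2 & x3)

probs = output_distribution(p, 3)
```

For this map the measures satisfy the general ordering $H_\infty \le H_2 \le H_0$ (here $2 \le 4-\log_2 3 \le \log_2 6$), which is why a gap between min-entropy (or Rényi entropy) and max-entropy, as in results 2 and 3, is the meaningful promise.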