In 1994, Y. Mansour conjectured that for every DNF formula $f$ on $n$ variables with $t$ terms there exists a polynomial $p$ with $t^{O(\log(1/\epsilon))}$ non-zero coefficients such that $\E_{x \in \{0,1\}^n}[(p(x)-f(x))^2] \leq \epsilon$. We make the first progress on this conjecture and show that it is true for several natural subclasses of DNF formulas, including randomly chosen DNF formulas and read-$k$ DNF formulas for constant $k$.
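For intuition, here is a minimal brute-force sketch (illustrative only, not part of the paper's argument): it computes the Fourier expansion of a small, hypothetical 3-term DNF under the uniform distribution and uses Parseval's identity to count how many of the largest coefficients must be kept before the squared error of the truncated polynomial drops below $\epsilon$. The choice of $n$, the terms, and $\epsilon$ are arbitrary.

    # Illustrative sketch: Fourier-sparsify a tiny DNF by brute force and check
    # the L2 error of keeping only the largest coefficients.
    import itertools

    n = 6
    terms = [{0: 1, 1: 1}, {2: 0, 3: 1}, {4: 1, 5: 0}]   # hypothetical 3-term DNF
    eps = 0.01

    def f(x):                       # DNF value in {0,1}
        return int(any(all(x[i] == b for i, b in t.items()) for t in terms))

    def chi(S, x):                  # parity character chi_S(x) = (-1)^{sum_{i in S} x_i}
        return (-1) ** sum(x[i] for i in S)

    points = list(itertools.product([0, 1], repeat=n))

    # Fourier coefficients \hat{f}(S) = E_x[f(x) chi_S(x)] for every subset S of [n].
    coeffs = {}
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            coeffs[S] = sum(f(x) * chi(S, x) for x in points) / len(points)

    # By Parseval, the L2 error of dropping a set of coefficients is the sum of
    # their squares; keep the largest ones until the remaining tail is below eps.
    ordered = sorted(coeffs.values(), key=abs, reverse=True)
    tail = sum(c * c for c in ordered)
    kept = 0
    while kept < len(ordered) and tail > eps:
        tail -= ordered[kept] ** 2
        kept += 1
    print(f"{kept} non-zero coefficients suffice for L2 error <= {eps}")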
Our result yields the first polynomial-time query algorithm for agnostically learning these subclasses of DNF formulas with respect to the uniform distribution on $\{0,1\}^n$ (for any constant error parameter).
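To illustrate the learning statement, the toy sketch below (an assumption-laden stand-in, not the paper's algorithm) brute-forces the heavy Fourier coefficients of an adversarially mislabeled DNF, a step the actual query algorithm performs with the Kushilevitz-Mansour procedure, and rounds the resulting sparse polynomial to a hypothesis whose error can be compared against the error of the underlying DNF. The target formula, noise rate, and threshold are chosen arbitrarily for the example.

    # Toy sketch of agnostic learning via sparse Fourier approximation
    # (tiny n; brute-force coefficient search replaces the real query algorithm).
    import itertools, random

    n = 6
    random.seed(0)
    points = list(itertools.product([0, 1], repeat=n))

    # A hypothetical DNF, with 5% of its labels flipped by an adversary.
    clean = {x: int((x[0] and x[1]) or (x[2] and not x[3])) for x in points}
    labels = {x: (1 - v if random.random() < 0.05 else v) for x, v in clean.items()}

    def chi(S, x):                  # parity character on {0,1}^n
        return (-1) ** sum(x[i] for i in S)

    # Step 1: collect the heavy Fourier coefficients of the noisy labels.
    theta = 0.05
    heavy = {}
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            c = sum(labels[x] * chi(S, x) for x in points) / len(points)
            if abs(c) >= theta:
                heavy[S] = c

    # Step 2: the sparse polynomial p and the rounded hypothesis h(x) = [p(x) >= 1/2].
    def p(x):
        return sum(c * chi(S, x) for S, c in heavy.items())

    errors = sum(int(p(x) >= 0.5) != labels[x] for x in points)
    flips = sum(labels[x] != clean[x] for x in points)   # error of the underlying DNF
    print(f"hypothesis errors: {errors}/{len(points)}, underlying DNF errors: {flips}/{len(points)}")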
By applying recent work on sandwiching polynomials, we further show that a $t^{-O(\log(1/\epsilon))}$-biased distribution fools the above subclasses of DNF formulas. This gives pseudorandom generators for these subclasses with shorter seed length than in all previous work.
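As a quick empirical illustration of the fooling statement (again a sketch under assumptions, not the paper's generator): a uniformly random multiset of moderate size is small-biased with high probability, so the code below measures its bias by brute force and compares the acceptance probability of a hypothetical read-once DNF under that multiset with its acceptance probability under the uniform distribution.

    # Toy check: a random multiset is small-biased w.h.p.; verify its bias by
    # brute force and see that the DNF's acceptance probability barely changes.
    import itertools, random

    random.seed(1)
    n, m = 8, 2000
    sample = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(m)]

    def parity_bias(S):             # |E_{x ~ sample}[(-1)^{sum_{i in S} x_i}]|
        return abs(sum((-1) ** sum(x[i] for i in S) for x in sample)) / m

    bias = max(parity_bias(S)
               for r in range(1, n + 1)
               for S in itertools.combinations(range(n), r))

    def f(x):                       # hypothetical read-once DNF with three terms
        return int((x[0] and x[1]) or (x[2] and not x[3]) or (x[4] and x[5] and x[6]))

    uniform_acc = sum(f(x) for x in itertools.product([0, 1], repeat=n)) / 2 ** n
    sample_acc = sum(f(x) for x in sample) / m
    print(f"empirical bias = {bias:.3f}, "
          f"|Pr_uniform[f=1] - Pr_sample[f=1]| = {abs(uniform_acc - sample_acc):.3f}")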
The read-$k$ section has been fixed.