Adam Klivans, Alexander A. Sherstov

We give the first representation-independent hardness results for PAC learning intersections of halfspaces, a central concept class in computational learning theory. Our hardness results are derived from two public-key cryptosystems due to Regev, which are based on the worst-case hardness of well-studied lattice problems. Specifically, we prove that a polynomial-time ...

Vitaly Feldman, Parikshit Gopalan, Subhash Khot, Ashok Kumar Ponnuswami

We address well-studied problems concerning the learnability of parities and halfspaces in the presence of classification noise.

Learning parities under the uniform distribution with random classification noise, also called the noisy parity problem, is a famous open problem in computational learning. We reduce a number of basic problems regarding ...
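The setup of the noisy parity problem can be made concrete with a small sampler (a hypothetical sketch for illustration; the function name and interface are not from the paper): examples are uniform points of $\{0,1\}^n$ labeled by the parity of a hidden index set, with each label flipped independently with probability $\eta$.

```python
import random

def noisy_parity_samples(n, secret, eta, m, rng):
    """Draw m labeled examples for the noisy parity problem.

    Each example is a uniformly random x in {0,1}^n labeled by the
    parity of the bits indexed by `secret`; the label is flipped
    independently with probability eta.
    """
    samples = []
    for _ in range(m):
        x = [rng.randint(0, 1) for _ in range(n)]
        label = sum(x[i] for i in secret) % 2
        if rng.random() < eta:
            label ^= 1
        samples.append((x, label))
    return samples

rng = random.Random(0)
data = noisy_parity_samples(n=8, secret=[1, 3, 6], eta=0.1, m=5, rng=rng)
```

The learner sees only `(x, label)` pairs and must recover `secret`; at $\eta = 0$ this is Gaussian elimination, while for constant noise rates no polynomial-time algorithm is known.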

Kevin Matulef, Ryan O'Donnell, Ronitt Rubinfeld, Rocco Servedio

This paper addresses the problem of testing whether a Boolean-valued function f is a halfspace, i.e. a function of the form f(x)=sgn(w ⋅ x - θ). We consider halfspaces over the continuous domain R^n (endowed with the standard multivariate Gaussian distribution) as well as halfspaces over the Boolean cube {-1,1}^n ...
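The definition f(x) = sgn(w ⋅ x - θ) is easy to instantiate directly (a minimal sketch; the tie-breaking convention at w ⋅ x = θ is an assumption, not taken from the paper):

```python
def halfspace(w, theta):
    """Return the halfspace f(x) = sgn(w . x - theta) as a function.

    Inputs x come from the Boolean cube {-1, 1}^n (or R^n); the output
    is +1 or -1. Ties w . x = theta are broken toward +1 here,
    a convention choice.
    """
    def f(x):
        s = sum(wi * xi for wi, xi in zip(w, x)) - theta
        return 1 if s >= 0 else -1
    return f

# Majority on 3 variables is the halfspace with w = (1, 1, 1), theta = 0.
maj3 = halfspace([1, 1, 1], 0)
```

A property tester gets query access to such an f and must distinguish halfspaces from functions far from every halfspace.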

Alexander A. Sherstov

The threshold degree of a function $f\colon\{0,1\}^n\to\{-1,+1\}$ is the least degree of a real polynomial $p$ with $f(x)\equiv\mathrm{sgn}\; p(x).$ We prove that the intersection of two halfspaces on $\{0,1\}^n$ has threshold degree $\Omega(n),$ which matches the trivial upper bound and completely answers a question due to Klivans (2002). The best ...
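The definition can be illustrated with the standard fact that a single halfspace, such as $\mathrm{AND}_n$, has threshold degree 1: the degree-1 polynomial $p(x) = x_1 + \dots + x_n - (n - \tfrac{1}{2})$ sign-represents it. A brute-force check (an illustrative sketch, not from the paper):

```python
from itertools import product

def sgn(t):
    return 1 if t > 0 else -1

n = 4

# AND_n as a {0,1}^n -> {-1,+1} function: +1 iff all bits are 1.
def and_n(x):
    return 1 if all(x) else -1

# Degree-1 sign-representation: p(x) = x_1 + ... + x_n - (n - 1/2).
def p(x):
    return sum(x) - (n - 0.5)

# Verify sgn(p(x)) = AND_n(x) on every point of the cube.
ok = all(and_n(x) == sgn(p(x)) for x in product((0, 1), repeat=n))
```

The theorem above says no such low-degree witness exists for the intersection of two halfspaces: any sign-representing polynomial must have degree $\Omega(n)$.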

Adam Klivans, Raghu Meka

We give a new framework for proving the existence of low-degree, polynomial approximators for Boolean functions with respect to broad classes of non-product distributions. Our proofs use techniques related to the classical moment problem and deviate significantly from known Fourier-based methods, which require the underlying distribution to have some product ...

Mark Bun, Thomas Steinke

Polynomial approximations to Boolean functions have led to many positive results in computer science. In particular, polynomial approximations to the sign function underlie algorithms for agnostically learning halfspaces, as well as pseudorandom generators for halfspaces. In this work, we investigate the limits of these techniques by proving inapproximability results for ...

Arnab Bhattacharyya, Suprovat Ghoshal, Rishi Saket

We prove the hardness of weakly learning halfspaces in the presence of adversarial noise using polynomial threshold functions (PTFs). In particular, we prove that for any constants $d \in \mathbb{Z}^+$ and $\epsilon > 0$, it is NP-hard to decide: given a set of $\{-1,1\}$-labeled points in $\mathbb{R}^n$ whether (YES Case) ...

Alexander A. Sherstov

We study the approximation of halfspaces $h:\{0,1\}^n\to\{0,1\}$ in the infinity norm by polynomials and rational functions of any given degree. Our main result is an explicit construction of the "hardest" halfspace, for which we prove polynomial and rational approximation lower bounds that match the trivial upper bounds achievable for all ...

Suprovat Ghoshal, Rishi Saket

The problem of learning $t$-term DNF formulas (for $t = O(1)$) has been studied extensively in the PAC model since its introduction by Valiant (STOC 1984). A $t$-term DNF can be efficiently learnt using a $t$-term DNF only if $t = 1$, i.e., when it is an AND, while even ...

Ilias Diakonikolas, Christos Tzamos, Daniel Kane

The Forster transform is a method of regularizing a dataset by placing it in {\em radial isotropic position} while maintaining some of its essential properties. Forster transforms have played a key role in a diverse range of settings spanning computer science and functional analysis. Prior work had given {\em ...
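Radial isotropic position asks for unit vectors $y_1,\dots,y_m \in \mathbb{R}^d$ with $\sum_i y_i y_i^T = (m/d) I$. A simple alternating heuristic, which repeatedly rescales points to the unit sphere and then whitens them, gives a feel for the transform (this is an illustrative sketch under generic-position assumptions, not the algorithm of the paper):

```python
import numpy as np

def forster_transform(X, iters=2000):
    """Heuristic alternating iteration toward radial isotropic position.

    Repeats two steps: (1) scale every row of X to the unit sphere;
    (2) apply the whitening map S^{-1/2}, where S = X^T X is the
    second-moment matrix. At a fixed point the unit vectors satisfy
    S = (m/d) * I, i.e., they are in radial isotropic position.
    """
    X = np.array(X, dtype=float)
    m, d = X.shape
    for _ in range(iters):
        X /= np.linalg.norm(X, axis=1, keepdims=True)  # radial step
        S = X.T @ X                                    # second-moment matrix
        w, V = np.linalg.eigh(S)                       # S = V diag(w) V^T
        X = X @ (V * w ** -0.5) @ V.T                  # x_i <- S^{-1/2} x_i
    X /= np.linalg.norm(X, axis=1, keepdims=True)
    return X

rng = np.random.default_rng(0)
Y = forster_transform(rng.standard_normal((8, 3)))
S = Y.T @ Y  # close to (8/3) * I when the iteration has converged
```

For points in general position this iteration converges to the isotropic configuration, though possibly slowly; the point of the algorithmic work above is to achieve the transform with rigorous efficiency guarantees.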