In this paper, we establish a general hardness amplification scheme for optimization problems based on the technique of direct products.
We say that an optimization problem $\Pi$ is direct product feasible if it is possible to efficiently aggregate any $k$ instances of $\Pi$ and form one large instance ...
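As a rough illustration only (the precise definition is elided above), direct product feasibility can be read as asking for an efficient aggregation map whose optimum is governed by the optima of the individual instances; the map $A$ and the combining function $g$ below are hypothetical placeholders, not notation from the paper.

% Sketch under assumptions: A and g are illustrative names, not the paper's definition.
\[
  A(x_1,\dots,x_k) \text{ is an instance of } \Pi, \text{ computable in time } \mathrm{poly}\Bigl(\sum_{i=1}^{k} |x_i|\Bigr),
\]
\[
  \mathrm{OPT}\bigl(A(x_1,\dots,x_k)\bigr) \;=\; g\bigl(\mathrm{OPT}(x_1),\dots,\mathrm{OPT}(x_k)\bigr),
  \qquad \text{e.g. } g(v_1,\dots,v_k)=\sum_{i=1}^{k} v_i .
\]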
In this work, using methods from high dimensional expansion, we show that the $k$-direct-sum property is testable for odd values of $k$. Previous work of Kaufman and Lubotzky could inherently handle only the case where $k$ is even, using a reduction to linearity testing.
Interestingly, our work ...
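For orientation, one common formulation of the $k$-direct-sum property in this line of work (the exact setting used here may differ) asks that a function on $k$-element sets be the XOR of a single function on the underlying elements; the ground set $V$ and the function $f$ below are illustrative.

% Sketch under assumptions: set-based formulation; the paper's high-dimensional setting may differ.
\[
  F : \binom{V}{k} \to \mathbb{F}_2 \text{ is a } k\text{-direct-sum if } \exists\, f : V \to \mathbb{F}_2
  \text{ such that } F(\sigma) \;=\; \bigoplus_{v \in \sigma} f(v) \quad \text{for every } \sigma \in \binom{V}{k}.
\]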
It is becoming increasingly important to understand the vulnerability of machine learning models to adversarial attacks. In this paper, we study the feasibility of robust learning from the perspective of computational learning theory, considering both sample and computational complexity. In particular, our definition of robust learnability requires polynomial sample complexity. ...
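To indicate what such a definition typically measures (the paper's exact notion and threat model may differ), a common formalization of robust risk with respect to a perturbation set $U(x)$ around each input, together with robust learnability under polynomial sample complexity, is the following; $U$, $h$, and $D$ are generic placeholders.

% Sketch under assumptions: U(x) is a generic perturbation set; not necessarily the paper's definition.
\[
  \mathrm{R}_{U}(h; D) \;=\; \Pr_{(x,y)\sim D}\bigl[\,\exists\, z \in U(x) : h(z) \neq y \,\bigr],
\]
and a class is robustly learnable if a learner, given $\mathrm{poly}(1/\epsilon, 1/\delta, n)$ samples from $D$, outputs with probability at least $1-\delta$ a hypothesis $h$ satisfying $\mathrm{R}_{U}(h; D) \le \epsilon$.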