We show that all non-negative submodular functions have high noise-stability.
As a consequence, we obtain a polynomial-time learning algorithm for
this class with respect to any product distribution on $\{-1,1\}^n$
(for any constant accuracy parameter $\epsilon$). Our algorithm also
succeeds in the agnostic setting. Previous work on learning
submodular functions required either query access or strong
assumptions about the types of submodular functions to be learned (and
did not hold in the agnostic setting).
Additionally, we give simple algorithms that efficiently release
differentially private answers to all Boolean conjunctions and to all
halfspaces with constant average error, subsuming and improving the
recent work of Gupta, Hardt, Roth and Ullman (STOC~2011).