Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)

Reports tagged with learning algorithms:
TR15-191 | 26th November 2015
Ruiwen Chen, Rahul Santhanam, Srikanth Srinivasan

Average-Case Lower Bounds and Satisfiability Algorithms for Small Threshold Circuits

We show average-case lower bounds for explicit Boolean functions against bounded-depth threshold circuits with a superlinear number of wires. In particular, for each integer $d > 1$ there is an $\epsilon_d > 0$ such that Parity has correlation at most $1/n^{\Omega(1)}$ with depth-$d$ threshold circuits which have at most $n^{1+\epsilon_d}$ ...
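For reference, the correlation measure used in such average-case lower bounds is standard; a common formulation (assumed here, since the excerpt does not define it) is
\[
\mathrm{Corr}(f, C) \;=\; \Bigl|\Pr_{x}[f(x) = C(x)] - \Pr_{x}[f(x) \neq C(x)]\Bigr| \;=\; \Bigl|\mathbb{E}_{x \sim \{0,1\}^n}\bigl[(-1)^{f(x) + C(x)}\bigr]\Bigr|,
\]
so correlation at most $1/n^{\Omega(1)}$ means the circuit agrees with Parity on at most a $1/2 + n^{-\Omega(1)}$ fraction of inputs.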


TR17-023 | 15th February 2017
Russell Impagliazzo, Valentine Kabanets, Ilya Volkovich

The Power of Natural Properties as Oracles

We study the power of randomized complexity classes that are given oracle access to a natural property of Razborov and Rudich (JCSS, 1997) or its special case, the Minimal Circuit Size Problem (MCSP).
We obtain new circuit lower bounds, as well as some hardness results for the relativized version ...
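To give a self-contained feel for the flavour of MCSP (the problem of deciding, given a full truth table and a size bound, whether a small circuit computes it), here is a toy Python sketch. It is not a construction from the paper, and it makes a stated simplification: MCSP proper asks about circuit size, while this sketch computes minimum De Morgan *formula* leaf-size by breadth-first search over truth tables, which is only feasible for very small n.

def min_formula_size(target: int, n: int) -> int:
    """Minimum leaf-count of a De Morgan formula (AND/OR gates, negations
    pushed to the literals) computing the n-variable Boolean function
    whose 2^n-bit truth table is the integer `target`."""
    mask = (1 << (1 << n)) - 1                 # all 2^n input rows
    lits = []
    for i in range(n):
        t = sum(1 << x for x in range(1 << n) if (x >> i) & 1)
        lits.append(t)                         # truth table of variable x_i
        lits.append(t ^ mask)                  # truth table of ~x_i
    by_size = {1: set(lits)}                   # by_size[s]: functions of minimal leaf-size s
    seen = set(lits)
    s = 1
    while target not in seen:
        s += 1
        if s > n * (1 << n):                   # every function has a DNF of this leaf-size
            return -1
        cur = set()
        for a in range(1, s):                  # root gate splits s leaves as a + (s - a)
            for f in by_size[a]:
                for g in by_size[s - a]:
                    for h in (f & g, f | g):
                        if h not in seen:      # first appearance = minimal size
                            cur.add(h)
        by_size[s] = cur
        seen |= cur
    return s

# Example: XOR of two variables (truth table 0110) needs 4 leaves,
# via (x0 AND ~x1) OR (~x0 AND x1).
print(min_formula_size(0b0110, n=2))           # -> 4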


TR17-046 | 8th March 2017
Sebastian Berndt, Maciej Liśkiewicz, Matthias Lutter, Rüdiger Reischuk

Learning Residual Alternating Automata

Residuality plays an essential role for learning finite automata. While residual deterministic and nondeterministic automata have been understood quite well, fundamental questions concerning alternating automata (AFA) remain open. Recently, Angluin, Eisenstat, and Fisman have initiated a systematic study of residual AFAs and proposed an algorithm called AL*, an extension of ...
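As background for the notion driving this line of work (the standard definition of residuality, not the AL* algorithm itself): the residual of a language $L$ with respect to a word $u$ is $u^{-1}L = \{v : uv \in L\}$. A minimal Python sketch for finite languages, where residuals can be enumerated directly; by Myhill-Nerode, the distinct residuals of a regular language correspond to the states of its minimal DFA.

def residual(L: frozenset, u: str) -> frozenset:
    """u^{-1}L = { v : uv in L }, for a finite language L."""
    return frozenset(w[len(u):] for w in L if w.startswith(u))

def all_residuals(L: frozenset) -> set:
    """All distinct residuals of L reachable by prefixes of its words
    (the empty 'dead' residual is omitted)."""
    prefixes = {w[:i] for w in L for i in range(len(w) + 1)}
    return {residual(L, u) for u in prefixes}

L = frozenset({"ab", "aab", "b"})
for r in sorted(all_residuals(L), key=sorted):
    print(sorted(r))
# Four distinct residuals: {aab,ab,b}, {ab,b}, {b}, and {""} --
# so the minimal DFA for L has four (live) states.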


TR17-098 | 28th May 2017
Raman Arora, Amitabh Basu, Poorya Mianjy, Anirbit Mukherjee

Understanding Deep Neural Networks with Rectified Linear Units

Revisions: 2

In this paper we investigate the family of functions representable by deep neural networks (DNN) with rectified linear units (ReLU). We give the first-ever polynomial time (in the size of data) algorithm to train to global optimality a ReLU DNN with one hidden layer, assuming the input dimension and number ...
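A well-known fact underlying this line of work is that one-hidden-layer ReLU networks compute continuous piecewise-linear functions. A minimal numpy sketch of the representability side (illustrative only, not the paper's training algorithm): three hidden units implement the "hat" function $h(x) = \max(0, 1 - |x - 1|)$.

import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def hat(x):
    # h(x) = ReLU(x) - 2*ReLU(x - 1) + ReLU(x - 2): zero outside [0, 2],
    # rising on [0, 1], falling on [1, 2].
    W1 = np.array([[1.0], [1.0], [1.0]])   # hidden-layer weights (3 units, 1 input)
    b1 = np.array([0.0, -1.0, -2.0])       # hidden-layer biases
    w2 = np.array([1.0, -2.0, 1.0])        # output weights
    return relu(x @ W1.T + b1) @ w2

xs = np.array([[-0.5], [0.5], [1.0], [1.5], [2.5]])
print(hat(xs))                             # [0.  0.5 1.  0.5 0. ]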


TR19-168 | 20th November 2019
Igor Carboni Oliveira, Lijie Chen, Shuichi Hirahara, Ján Pich, Ninad Rajgopal, Rahul Santhanam

Beyond Natural Proofs: Hardness Magnification and Locality

Hardness magnification reduces major complexity separations (such as $EXP \not\subseteq NC^1$) to proving lower bounds for some natural problem $Q$ against weak circuit models. Several recent works [OS18, MMW19, CT19, OPS19, CMMW19, Oli19, CJW19a] have established results of this form. In the most intriguing cases, the required lower bound is ...


TR20-018 | 18th February 2020
Valentine Kabanets, Sajin Koroth, Zhenjian Lu, Dimitrios Myrisiotis, Igor Oliveira

Algorithms and Lower Bounds for de Morgan Formulas of Low-Communication Leaf Gates

The class $FORMULA[s] \circ \mathcal{G}$ consists of Boolean functions computable by size-$s$ de Morgan formulas whose leaves are any Boolean functions from a class $\mathcal{G}$. We give lower bounds and (SAT, Learning, and PRG) algorithms for $FORMULA[n^{1.99}]\circ \mathcal{G}$, for classes $\mathcal{G}$ of functions with low communication complexity. Let $R^{(k)}(\mathcal{G})$ be ...
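To make the class definition concrete, here is a minimal Python sketch of evaluating one function in $FORMULA[s] \circ \mathcal{G}$, taking $\mathcal{G}$ to be parities of input subsets (a standard example of a class with low communication complexity). The particular formula and leaf subsets are arbitrary illustrations, not a construction from the paper.

from itertools import product

def parity(bits, subset):
    """A leaf from G: XOR of the input bits indexed by `subset`."""
    return sum(bits[i] for i in subset) % 2

def f(bits):
    # A size-4 de Morgan formula over parity leaves:
    # (L1 AND L2) OR (L3 AND NOT L4).
    L1 = parity(bits, [0, 1])
    L2 = parity(bits, [2])
    L3 = parity(bits, [1, 2, 3])
    L4 = parity(bits, [0, 3])
    return (L1 and L2) or (L3 and not L4)

for x in product([0, 1], repeat=4):
    print(x, int(f(x)))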




ISSN 1433-8092