Michael Schmitt

A neural network is said to be nonoverlapping if there is at most one edge outgoing from each node. We investigate the number of examples that a learning algorithm needs when using nonoverlapping neural networks as hypotheses. We derive bounds for this sample complexity in terms of the Vapnik-Chervonenkis dimension. ...
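
To make the definition concrete, here is a minimal Python sketch (not from the paper; the edge-list representation and node names are illustrative) that tests whether a network, given as a list of directed edges, is nonoverlapping, i.e. whether every node has at most one outgoing edge:

    from collections import Counter

    def is_nonoverlapping(edges):
        """Return True iff every node has at most one outgoing edge,
        so the underlying graph is a forest: each unit feeds into
        at most one successor."""
        out_degree = Counter(src for src, _ in edges)
        return all(d <= 1 for d in out_degree.values())

    # A tree-structured network: inputs x1, x2 feed hidden unit h,
    # which feeds the single output unit y.
    tree_net = [("x1", "h"), ("x2", "h"), ("h", "y")]
    print(is_nonoverlapping(tree_net))    # True

    # Sharing x1 between two hidden units gives x1 out-degree 2.
    shared_net = [("x1", "h1"), ("x1", "h2"), ("h1", "y"), ("h2", "y")]
    print(is_nonoverlapping(shared_net))  # False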

Amir Shpilka, Ilya Volkovich

An \emph{arithmetic read-once formula} (ROF for short) is a formula (a circuit whose underlying graph is a tree) in which the operations are $\{+,\times\}$ and such that every input variable labels at most one leaf. A \emph{preprocessed ROF} (PROF for short) is a ROF in which we are allowed to ...
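
As a hedged illustration of the ROF definition (the tuple encoding and helper names are this sketch's own choices, not the paper's), the following Python sketch represents a formula tree over $\{+,\times\}$ and checks the read-once condition that every variable labels at most one leaf:

    from collections import Counter

    def leaves(f):
        """Yield the variable labels at the leaves of a formula tree,
        where an internal node is a tuple (op, left, right) with op
        in {'+', '*'} and a leaf is a variable name (string)."""
        if isinstance(f, str):
            yield f
        else:
            _, left, right = f
            yield from leaves(left)
            yield from leaves(right)

    def is_read_once(f):
        return all(c <= 1 for c in Counter(leaves(f)).values())

    def evaluate(f, assignment):
        if isinstance(f, str):
            return assignment[f]
        op, left, right = f
        l, r = evaluate(left, assignment), evaluate(right, assignment)
        return l + r if op == '+' else l * r

    # (x1 + x2) * x3 reads each variable once ...
    rof = ('*', ('+', 'x1', 'x2'), 'x3')
    print(is_read_once(rof))                            # True
    print(evaluate(rof, {'x1': 1, 'x2': 2, 'x3': 4}))   # 12

    # ... whereas (x1 + x2) * x1 reads x1 twice, so it is not a ROF.
    print(is_read_once(('*', ('+', 'x1', 'x2'), 'x1'))) # False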

Dean Doron, Pooya Hatami, William Hoza

There are only a few known general approaches for constructing explicit pseudorandom generators (PRGs). The ``iterated restrictions'' approach, pioneered by Ajtai and Wigderson [AW89], has provided PRGs with seed length $\mathrm{polylog} n$ or even $\tilde{O}(\log n)$ for several restricted models of computation. Can this approach ever achieve the optimal seed ...
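
As a loose illustration of the restriction mechanics behind this paradigm (illustrative only; an actual PRG must select its restrictions pseudorandomly from a short seed, which is the entire difficulty), here is a Python sketch in which each round keeps every still-alive variable alive with probability $p$ and fixes the rest to random bits:

    import random

    def random_restriction(n, p, rng=random):
        """Each of the n variables stays alive (unset) with probability p;
        otherwise it is fixed to a random bit. Entries are 0, 1, or None."""
        return [None if rng.random() < p else rng.randint(0, 1)
                for _ in range(n)]

    def iterate_restrictions(n, p, rounds, rng=random):
        """Compose `rounds` restrictions: once a variable is fixed it stays
        fixed; alive variables are re-restricted each round. The final
        output assigns a bit to every variable."""
        assignment = [None] * n
        for _ in range(rounds):
            rho = random_restriction(n, p, rng)
            assignment = [a if a is not None else b
                          for a, b in zip(assignment, rho)]
        # Fix any variable still alive after the last round.
        return [a if a is not None else rng.randint(0, 1) for a in assignment]

    print(iterate_restrictions(n=16, p=0.5, rounds=4))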