We call any consistent and sufficiently powerful formal theory that enables one to verify algorithmically, in polynomial time, whether a given text is a proof \textbf{efficiently verifiable mathematics} (ev-mathematics). We study the question of whether nondeterminism is more powerful than determinism for polynomial-time computations in the framework of ev-mathematics. Our main results ...
Mathematics was developed as a strong research instrument with fully verifiable argumentation. We call any formal theory based on syntactic rules that enables one to verify algorithmically, for any given text, whether it is a proof or not algorithmically verifiable mathematics (AV-mathematics for short). We say that a decision problem L ...
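A possible formalization of the verifier behind these two notions (a sketch; the exact definitions in the papers may differ): a formal theory $T$ over an alphabet $\Sigma$ is algorithmically verifiable if there is an algorithm $V$ such that, for every text $t \in \Sigma^*$,

\[ V(t) = 1 \iff t \text{ is a valid proof in } T , \]

and $T$ is efficiently verifiable if, moreover, $V$ runs in time polynomial in $|t|$.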
The advice complexity of an online problem describes the additional information both necessary and sufficient for online algorithms to compute solutions of a certain quality. In this model, an oracle inspects the input before it is processed by an online algorithm. Depending on the input string, the oracle prepares an ...
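The following sketch illustrates the model on a toy "string guessing" problem (the problem choice and advice encoding are ours, not the paper's): the oracle, seeing the whole input, writes the input bits themselves onto the advice tape, and the online algorithm reads one advice bit per request to predict it perfectly.

def oracle(requests):
    # The oracle inspects the entire input in advance and prepares the advice
    # string; here the advice is simply the input itself, one bit per request.
    return list(requests)

def online_with_advice(requests, advice):
    # The online algorithm must predict each bit before it is revealed; it
    # consumes one advice bit per request and therefore never errs.
    errors = 0
    for bit, hint in zip(requests, advice):
        errors += (hint != bit)
    return errors

requests = [1, 0, 0, 1, 1, 0]
print(online_with_advice(requests, oracle(requests)))  # 0 errors, n advice bits

Without advice, an adversary can make every deterministic guess wrong, so n advice bits here buy a drop from n errors to none; the general model trades advice length against solution quality.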
We study k-partition communication protocols, an extension of the standard two-party best-partition model to k input partitions. The main results are as follows. 1. A strong explicit hierarchy on the degree of non-obliviousness is established by proving that using k+1 partitions instead of k may decrease the communication complexity from ...
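As a toy illustration of why the choice of partition matters (our example, not the paper's): for EQ on 2n bits, the naive protocol under the partition (x | y) sends n bits, while an interleaved partition lets both parties compare their own positions locally and exchange just two bits.

def protocol_split_xy(x, y):
    # Partition A: Alice holds x, Bob holds y. Alice ships all of x to Bob.
    cost = len(x)                       # n bits over the channel
    return (x == y), cost

def protocol_interleaved(x, y):
    # Partition B: Alice holds the even positions of BOTH x and y, Bob the
    # odd ones. Each compares locally; two 1-bit verdicts are exchanged.
    alice_ok = all(x[i] == y[i] for i in range(0, len(x), 2))
    bob_ok = all(x[i] == y[i] for i in range(1, len(x), 2))
    return (alice_ok and bob_ok), 2     # two bits, independent of n

x, y = [0, 1, 1, 0], [0, 1, 1, 0]
print(protocol_split_xy(x, y))          # (True, 4)
print(protocol_interleaved(x, y))       # (True, 2)

A k-partition protocol may choose among k such partitions; the hierarchy result quantifies how much one extra partition can help.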
While deterministic finite automata seem to be well understood, surprisingly many important problems concerning nondeterministic finite automata (nfa's) remain open. One such problem area is the study of different measures of nondeterminism in finite automata and the estimation of the sizes of minimal nondeterministic finite automata. In this paper the ...
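One such measure, sketched below in a hypothetical form (the paper's measures may be defined differently), charges each run the number of bits needed to encode its nondeterministic choices; a dynamic program over the NFA yields the minimum over all accepting runs.

from math import ceil, log2

def min_advice(delta, start, finals, word):
    # delta[(state, symbol)] is the set of successor states. A step with c > 1
    # successors costs ceil(log2(c)) advice bits; deterministic steps are free.
    costs = {start: 0}                       # state -> fewest bits so far
    for symbol in word:
        nxt = {}
        for state, bits in costs.items():
            succs = delta.get((state, symbol), set())
            step = ceil(log2(len(succs))) if len(succs) > 1 else 0
            for s in succs:
                if bits + step < nxt.get(s, float('inf')):
                    nxt[s] = bits + step
        costs = nxt
    accepting = [b for q, b in costs.items() if q in finals]
    return min(accepting) if accepting else None   # None: word rejected

# NFA for "the second-to-last symbol is 1": reading a 1 in q0 is a binary guess.
delta = {('q0', '0'): {'q0'}, ('q0', '1'): {'q0', 'q1'},
         ('q1', '0'): {'q2'}, ('q1', '1'): {'q2'}}
print(min_advice(delta, 'q0', {'q2'}, '0010'))     # 1: a single guess suffices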
The investigation of the computational power of randomized computations is one of the central tasks of current complexity and algorithm theory. In this paper, for the first time, a "strong" separation between the power of determinism, Las Vegas randomization, and nondeterminism is proved for a computing model. The computing ...
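For intuition, a generic illustration of the middle notion (not the paper's computing model): a Las Vegas algorithm may use randomness freely but must never output a wrong answer; only its running time is a random variable.

import random

def las_vegas_find_one(bits):
    # Probe random positions until a 1 is found; the check makes every
    # returned answer correct, so only the number of probes is random
    # (expected ~2 probes when half of the positions hold a 1).
    while True:
        i = random.randrange(len(bits))
        if bits[i] == 1:
            return i

bits = [0, 1] * 8                        # half ones; a deterministic scan may
print(bits[las_vegas_find_one(bits)])    # need n/2 + 1 probes; always prints 1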
The investigation of the possibility of efficiently computing approximations of hard optimization problems is one of the central and most fruitful areas of current algorithm and complexity theory. The aim of this paper is twofold. First, we introduce the notion of stability of approximation algorithms. This notion is shown to ...
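One way to make the notion precise (a sketch; the details may differ from the paper's definition): let $A$ be a $\delta$-approximation algorithm on an input set $L$, and let $d$ measure how far an instance lies outside $L$ (e.g., how strongly it violates the triangle inequality). Writing $\mathrm{Ball}_r(L) = \{\, x \mid d(x, L) \le r \,\}$, call $A$ stable according to $d$ if for every $r > 0$ there is a constant $\delta_r$ such that

\[ A \text{ is a } \delta_r\text{-approximation algorithm on } \mathrm{Ball}_r(L) , \]

so the growth of $\delta_r$ as a function of $r$ measures how gracefully the guarantee degrades on almost-admissible inputs.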
The investigation of the computational power of randomized computations is one of the central tasks of current complexity and algorithm theory. This paper continues the comparison of the computational power of Las Vegas computations with that of deterministic and nondeterministic ones. While for one-way ...
The study of the computational power of randomized computations is one of the central tasks of complexity theory. The main goal of this paper is to compare the power of Las Vegas computation with that of deterministic and nondeterministic computation. We investigate the power of Las Vegas computation for ...