ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)


Paper:

TR24-202 | 6th December 2024 03:16

Lifting to Randomized Parity Decision Trees

Authors: Farzan Byramji, Russell Impagliazzo
Publication: 8th December 2024 06:57


Abstract:

We prove a lifting theorem from randomized decision tree depth to randomized parity decision tree (PDT) size. We use the same property of the gadget, stifling, which was introduced by Chattopadhyay, Mande, Sanyal and Sherif [ITCS'23] to prove a lifting theorem for deterministic PDTs. Moreover, even the milder condition that the gadget has minimum parity certificate complexity at least $2$ suffices for lifting to randomized PDT size.
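For orientation, the following is a schematic we supply (the notation and hidden constants are ours, not quoted from the paper): a lifting theorem of this kind relates the randomized parity decision tree size of a composed function $f \circ g^n$ to the randomized decision tree depth of the outer function, roughly in the form
$$\mathrm{Rsize}^{\oplus}\bigl(f \circ g^{n}\bigr) \;\ge\; 2^{\Omega(\mathrm{R}(f))},$$
where $g$ is a suitable gadget (for instance a stifled one), $\mathrm{R}(f)$ denotes the randomized decision tree depth of $f$, and the constants hidden in the $\Omega(\cdot)$ may depend on the gadget; the precise parameters are as stated in the paper.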
To further improve the dependence on the gadget $g$ in the lower bounds for composed functions, we consider a related problem $g_*$ whose inputs are certificates of $g$. It is implicit in the work of Chattopadhyay et al. that for any function $f$, lower bounds for the $*$-depth of $f_*$ give lower bounds for the PDT size of $f$. We make this connection explicit in the deterministic case and show that it also holds for randomized PDTs. We then combine this with composition theorems for $*$-depth, which follow by adapting known composition theorems for decision trees. As a corollary, we get tight lifting theorems when the gadget is Indexing, Inner Product or Disjointness.
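Schematically (again in our notation, with the exact definitions of $g_*$ and $*$-depth deferred to the paper), the route to these corollaries has two steps: a transfer step, roughly of the form
$$\mathrm{size}^{\oplus}(f) \;\ge\; 2^{\Omega(\mathrm{depth}_*(f_*))},$$
followed by a composition step that bounds $\mathrm{depth}_*\bigl((f \circ g^n)_*\bigr)$ from below in terms of the query complexity of $f$ and a complexity measure of the gadget $g$. Instantiating the gadget as Indexing, Inner Product or Disjointness then yields the tight lifting theorems mentioned above.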


