We prove a lifting theorem from randomized decision tree depth to randomized parity decision tree (PDT) size. Our proof relies on the same property of the gadget, stifling, that was introduced by Chattopadhyay, Mande, Sanyal and Sherif [ITCS'23] to prove a lifting theorem for deterministic PDTs. Moreover, we show that even the milder condition that the gadget has minimum parity certificate complexity at least $2$ suffices for lifting to randomized PDT size.
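For concreteness, the stifling property can be stated as follows (this paraphrase follows the terminology of Chattopadhyay et al.; the exact quantifier pattern below is our rendering, not a verbatim quotation): a gadget $g \colon \{0,1\}^m \to \{0,1\}$ is \emph{stifled} if
\[
\forall i \in [m],\ \forall b \in \{0,1\}\ \exists\, \rho \in \{0,1\}^{[m]\setminus\{i\}} \ \text{such that}\ g\restriction_\rho \equiv b,
\]
that is, for every coordinate $i$ and target value $b$ there is a partial assignment $\rho$ to the remaining coordinates that forces $g$ to output $b$ regardless of the $i$-th bit.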
To further improve the dependence on the gadget $g$ in the lower bounds for composed functions, we consider a related problem $g_*$ whose inputs are certificates of $g$. It is implicit in the work of Chattopadhyay et al. that for any function $f$, lower bounds on the $*$-depth of $f_*$ yield lower bounds on the PDT size of $f$. We make this connection explicit in the deterministic case and show that it also holds for randomized PDTs. We then combine this with composition theorems for $*$-depth, which follow by adapting known composition theorems for decision trees. As a corollary, we obtain tight lifting theorems when the gadget is Indexing, Inner Product or Disjointness.
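As a reminder of the objects involved (standard notation for composed functions, not taken verbatim from this paper): for $f \colon \{0,1\}^n \to \{0,1\}$ and a gadget $g \colon \{0,1\}^m \to \{0,1\}$, the composed function is
\[
(f \circ g^n)(x^1, \dots, x^n) \;=\; f\bigl(g(x^1), \dots, g(x^n)\bigr), \qquad x^j \in \{0,1\}^m,
\]
and a size lifting theorem of the kind discussed here shows, schematically, that the PDT size of $f \circ g^n$ is exponential in the corresponding decision tree depth measure of $f$; the precise measure (deterministic or randomized depth) and the dependence on $g$ are what the results above pin down.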