The main motivation for studying linear data structures and circuits is the intuition that non-linear advice cannot help in computing a linear operator. Jukna and Schnitger formalized this as a conjecture stating that every circuit computing a linear operator can be ``linearized'' with only a constant-factor size blow-up. We show that if one assumes a strengthening of this intuition to data structures (over some field), then Patrascu's Multiphase Conjecture holds over that field. Furthermore, we show that this conjecture is intermediate between the NOF-Multiphase Conjecture and the Multiphase Conjecture, formalizing why Patrascu's original approach to the Multiphase Conjecture is hard.
Our main technical ingredient is an unconditional space-time tradeoff for the following static data structure problem over any given field: Let $M \in \mathbb{F}^{m \times n}$ be fixed. The data structure preprocesses an input $X \in \mathbb{F}^n$ into $s$ cells (depending on $M$), each of which can store an arbitrary element of $\mathbb{F}$. When an index $i \in [m]$ is revealed, the data structure must output $M_i \cdot X$ using at most $t_q$ probes. We show that there exists $M \in \{0,1\}^{m \times n}$ such that if the functions used by the data structure are all linear and $s = \tilde{o}(m)$, then $t_q \geq \tilde{\Omega}(n)$.
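To make the static model concrete, here is a minimal toy sketch (all names and the particular matrices are illustrative, not from the paper): the $s$ cells store linear functions of $X$, given by the rows of a preprocessing matrix $P \in \mathbb{F}^{s \times n}$, and a query $i$ is answered by a linear combination of at most $t_q$ probed cells, here over $\mathbb{F} = \mathrm{GF}(2)$.

```python
import numpy as np

def preprocess(X, P):
    """Store s cells; each cell is a linear function of X (a row of P) over GF(2)."""
    return (P @ X) % 2

def query(cells, probes, coeffs):
    """Answer by probing |probes| <= t_q cells and combining them linearly over GF(2)."""
    return int(sum(c * cells[j] for c, j in zip(coeffs, probes)) % 2)

# Toy instance: n = 4, s = 2 cells. The query row M_i = P[0] + P[1] (mod 2)
# is answerable with t_q = 2 probes.
P = np.array([[1, 0, 1, 0],
              [0, 1, 1, 1]])
X = np.array([1, 1, 0, 1])
cells = preprocess(X, P)

M_i = (P[0] + P[1]) % 2          # the fixed query row, here [1, 1, 0, 1]
direct = int(M_i @ X % 2)        # the answer computed directly
probed = query(cells, probes=[0, 1], coeffs=[1, 1])
assert direct == probed          # the two probes recover M_i . X
```

The lower bound in the paper says that for some explicit $M$, no such linear scheme can make all $m$ query rows cheap simultaneously: with $s = \tilde{o}(m)$ cells, some query needs $\tilde{\Omega}(n)$ probes.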
As a corollary, we show that Patrascu's Multiphase Conjecture, when restricted to dynamic linear data structures, holds (with unlimited preprocessing) over any field. This exhibits an explicit dynamic problem for which any linear data structure requires polynomial update time $t_u \geq \tilde{\Omega}(n)$ or query time $t_q \geq \tilde{\Omega}(n)$. This also improves upon the breakthrough work of Larsen, which showed a polynomial lower bound for dynamic data structures in the weaker group model.