ECCC
Electronic Colloquium on Computational Complexity

Under the auspices of the Computational Complexity Foundation (CCF)





Paper:

TR26-035 | 4th March 2026 17:54

Learning Read-Once Determinants and the Principal Minor Assignment Problem


Abstract:

A symbolic determinant under rank-one restriction computes a polynomial of the form $\det(A_0 + A_1y_1 + \ldots + A_ny_n)$, where $A_0, A_1, \ldots, A_n$ are square matrices over a field $\mathbb{F}$ and $\mathrm{rank}(A_i) = 1$ for each $i \in [n]$. This class of polynomials has been studied extensively, since the work of Edmonds (1967), in the context of linear matroids, matching, matrix completion and polynomial identity testing. We study the following learning problem for this class: Given black-box access to an $n$-variate polynomial $f = \det(A_0 + A_1y_1 + \ldots + A_ny_n)$, where $A_0, A_1, \ldots, A_n$ are unknown square matrices over $\mathbb{F}$ and $\mathrm{rank}(A_i) = 1$ for each $i \in [n]$, find a square matrix $B_0$ and rank-one square matrices $B_1, \ldots, B_n$ over $\mathbb{F}$ such that $f = \det(B_0 + B_1y_1 + \ldots + B_ny_n)$. In this work, we give a randomized poly$(n)$ time algorithm to solve this problem; the algorithm can be derandomized in quasi-polynomial time. To our knowledge, this is the first efficient learning algorithm for this class. As the above-mentioned class is known to be equivalent to the class of read-once determinants (RODs), we will refer to the problem as learning RODs. An ROD computes the determinant of a matrix whose entries are field constants or variables and every variable appears at most once in the matrix. Thus, the class of RODs is a rare example of a well-studied class of polynomials that admits efficient proper learning.

The algorithm for learning RODs is obtained by connecting the problem to a well-known open problem in linear algebra, namely the Principal Minor Assignment Problem (PMAP), which asks to find (if possible) a matrix having prescribed principal minors. PMAP has also been studied in machine learning to learn the kernel matrix of a determinantal point process. Here, we study a natural black-box version of PMAP: Given black-box access to an $n$-variate polynomial $f = \det(A + Y)$, where $A \in \mathbb{F}^{n \times n}$ is unknown and $Y = \mathrm{diag}(y_1, \ldots, y_n)$, find a $B \in \mathbb{F}^{n\times n}$ such that $f = \det(B + Y)$. We show that black-box PMAP can be solved in randomized poly$(n)$ time, and further, it is randomized polynomial-time equivalent to learning RODs. The algorithm and the reduction between the two problems can be derandomized in quasi-polynomial time. To our knowledge, no efficient algorithm to solve this black-box version of PMAP was known before.

We resolve black-box PMAP by investigating a crucial property of dense matrices that we call the rank-one extension property. Understanding "cuts" of matrices with this property and designing a black-box cut-finding algorithm to solve PMAP for such matrices (using only principal minors of order $4$ or less) constitute the technical core of this work. The insights developed along the way also help us give the first NC algorithm for the Principal Minor Equivalence problem, which asks to check if two given matrices have equal corresponding principal minors.



ISSN 1433-8092