We study the complexity of representing polynomials by arithmetic circuits in both the commutative and the non-commutative settings. Our approach goes through a precise understanding of the more restricted setting where multiplication is not associative, meaning that we distinguish $(xy)z$ from $x(yz)$.
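To fix notation for what follows (a standard identification, not specific to this paper): a non-associative monomial of degree $d$ over $n$ variables is a binary tree with $d$ leaves labelled by variables, so the bracketing is part of the monomial.
% Counting non-associative monomials: C_{d-1} binary tree shapes, n^d leaf labellings.
\[
  \#\{\text{non-associative monomials of degree } d \text{ over } n \text{ variables}\}
  \;=\; C_{d-1}\, n^{d},
  \qquad
  C_{d-1} \;=\; \frac{1}{d}\binom{2d-2}{d-1}.
\]
For instance, in degree $3$ over a single variable $x$ there are exactly $C_2 = 2$ monomials, namely $(xx)x$ and $x(xx)$.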
Our first and main conceptual result is a characterization: we show that the size of the smallest circuit computing a given non-associative polynomial is exactly the rank of a matrix constructed from the polynomial, called its Hankel matrix. This result applies to the class of all circuits in both the commutative and non-commutative settings, and can be seen as an extension of the seminal result of Nisan, which gives a similar characterization for non-commutative algebraic branching programs.
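For orientation, we recall the shape of Nisan's characterization in the classical associative, non-commutative setting (our paraphrase of Nisan's theorem; the Hankel matrix of this paper is its non-associative analogue): for a homogeneous non-commutative polynomial $f$ of degree $d$ and each $0 \le k \le d$, let $M_k(f)$ be the matrix whose rows are indexed by words $u$ of degree $k$ and columns by words $v$ of degree $d-k$.
% Entries are coefficients of concatenated words; ABP size is the sum of ranks.
\[
  \bigl(M_k(f)\bigr)_{u,v} \;=\; \operatorname{coeff}_f(uv),
  \qquad
  \text{size of the smallest ABP computing } f \;=\; \sum_{k=0}^{d} \operatorname{rank} M_k(f).
\]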
The study of the Hankel matrix provides a unifying approach for proving lower bounds for polynomials in the (classical) associative setting. We demonstrate this by deriving, as corollaries of our characterization result, alternative proofs of recent superpolynomial and exponential lower bounds for various classes of circuits.
Our main technical contribution is a collection of generic lower bound theorems obtained by analyzing and decomposing the Hankel matrix. These yield significant improvements on lower bounds for circuits with many parse trees, in both the (associative) commutative and non-commutative settings. In particular, in the non-commutative setting we obtain a tight result: superpolynomial lower bounds hold for any class of circuits whose number of parse trees falls short, by even a small defect in the exponent, of the total number of parse trees.
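To give a sense of scale (our gloss, assuming parse trees are counted by their shapes): a circuit computing a homogeneous polynomial of degree $d$ has parse trees with $d$ leaves, so the total number of available shapes is the Catalan number $C_{d-1}$, which grows like $4^{d}$ up to polynomial factors.
% Catalan asymptotics: the exponent of the total number of parse tree shapes is linear in d.
\[
  C_{d-1} \;\sim\; \frac{4^{\,d-1}}{\sqrt{\pi}\,(d-1)^{3/2}},
\]
so a ``small defect in the exponent'' corresponds to classes of circuits allowed roughly $4^{(1-\varepsilon)d}$ parse trees.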