Proving lower bounds on the size of noncommutative arithmetic circuits is an important problem in arithmetic circuit complexity. For explicit $n$-variate polynomials of degree $\Theta(n)$, the best known general lower bound is $\Omega(n \log n)$. Recent work of Chatterjee and Hrubeš proved a stronger $\Omega(n^2)$ bound for the restricted class of homogeneous circuits.
This paper extends these results to a broader class of circuits by using syntactic degree as a complexity measure. The syntactic degree of a circuit is a well-known parameter that measures the extent to which high-degree computation is used in the circuit. A homogeneous circuit computing a polynomial of degree $d$ can be assumed, without loss of generality, to have syntactic degree exactly $d$. We generalize this setting by considering circuits that are not necessarily homogeneous but have low syntactic degree. Specifically, for an explicit $n$-variate polynomial $f$ of degree $n$, we show that any circuit of syntactic degree $O(n)$ computing $f$ must have size $\Omega(n^{1+c})$ for some constant $c > 0$. We also show that any circuit of syntactic degree $o(n\log n)$ computing the same $f$ must have size $\omega(n\log n)$. We further analyze the circuit size required to compute $f$ in terms of the number of distinct syntactic degrees appearing in the circuit; this analysis yields an $\omega(n\log n)$ size lower bound in all but a narrow parameter regime, where no improved bound is obtained. Finally, we observe that low syntactic degree circuits are more powerful than homogeneous circuits in a fine-grained sense: there is an $n$-variate polynomial of degree $\Theta(n)$ that has a circuit of size $O(n\log^2 n)$ and syntactic degree $O(n)$, yet any homogeneous circuit computing it requires size $\Omega(n^2)$.
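For concreteness, a standard inductive definition of the syntactic degree $\mathrm{sdeg}$ of a gate is sketched below; conventions vary slightly (e.g., in how constants are treated), and the paper's exact convention may differ.
\[
\mathrm{sdeg}(g) \;=\;
\begin{cases}
1 & \text{if $g$ is an input gate (a variable or a constant)},\\[2pt]
\max\{\mathrm{sdeg}(g_1), \mathrm{sdeg}(g_2)\} & \text{if $g = g_1 + g_2$},\\[2pt]
\mathrm{sdeg}(g_1) + \mathrm{sdeg}(g_2) & \text{if $g = g_1 \times g_2$},
\end{cases}
\]
and the syntactic degree of the circuit is the maximum of $\mathrm{sdeg}(g)$ over all gates $g$. Since $\mathrm{sdeg}$ only upper-bounds the degree of the polynomial actually computed at a gate (cancellations may lower the true degree), bounding it constrains how much high-degree intermediate computation the circuit may use.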