Michael Soltys:

The Complexity of Derivations of Matrix Identities

Department of Mathematics
University of Toronto
Toronto, Canada

Abstract:

In this thesis we are concerned with building logical foundations for Linear Algebra, from the perspective of proof complexity. As the cornerstone of our logical theories, we use Berkowitz's parallel algorithm for computing the coefficients of the characteristic polynomial of a matrix.
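
To fix ideas, here is a sketch of Berkowitz's algorithm in Python (our own sequential rendering, for illustration only; the function name and the list-of-lists matrix representation are ours, not notation from the thesis). It computes the coefficients of det(xI - A) as an iterated product of lower-triangular Toeplitz matrices, without any divisions; the parallel character of the algorithm comes from the fact that this iterated product can be evaluated in polylogarithmic depth.

    def berkowitz(A):
        """Coefficients of det(xI - A) for a square matrix A (list of lists),
        leading coefficient first; division-free, so exact over any commutative ring."""
        n = len(A)
        # Characteristic polynomial of the 1x1 bottom-right principal submatrix.
        poly = [1, -A[n - 1][n - 1]]
        # Enlarge the principal submatrix by one row and column at a time.
        for k in range(n - 2, -1, -1):
            a = A[k][k]                               # top-left entry of the new submatrix
            R = A[k][k + 1:]                          # row to the right of a
            S = [A[i][k] for i in range(k + 1, n)]    # column below a
            M = [row[k + 1:] for row in A[k + 1:]]    # lower-right block
            m = n - k                                 # dimension of the new submatrix
            # First column of the (m+1) x m lower-triangular Toeplitz matrix:
            # 1, -a, -R.S, -R.M.S, -R.M^2.S, ...
            col = [1, -a]
            v = S[:]                                  # v = M^j S, starting at j = 0
            for _ in range(m - 1):
                col.append(-sum(R[i] * v[i] for i in range(len(v))))
                v = [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]
            # Multiply the Toeplitz matrix by the previous coefficient vector.
            poly = [sum(col[i - j] * poly[j]
                        for j in range(len(poly)) if 0 <= i - j < len(col))
                    for i in range(m + 1)]
        return poly

    # Example: the characteristic polynomial of [[1, 2], [3, 4]] is x^2 - 5x - 2.
    print(berkowitz([[1, 2], [3, 4]]))   # prints [1, -5, -2]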

Standard Linear Algebra textbooks use Gaussian Elimination as their main algorithm, but they invariably rely on the (very infeasible) Lagrange expansion of the determinant to prove the properties of this algorithm.

The main contribution of this thesis is a (first) feasible proof of the Cayley-Hamilton Theorem and of related principles of Linear Algebra (namely, the axiomatic definition of the determinant, the cofactor expansion formula, and the multiplicativity of the determinant). Furthermore, we show that these principles are equivalent, and that the equivalence can be proven feasibly.
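
For reference, the principles in question are the standard textbook statements; in our LaTeX rendering (not the formalization used in the thesis), with A[i|j] denoting A with row i and column j deleted:

    p_A(A) = 0, \quad \text{where } p_A(x) = \det(xI - A)                         % Cayley-Hamilton
    \det \text{ is multilinear and alternating in the rows, with } \det(I) = 1    % axiomatic definition
    \det(A) = \sum_{j=1}^{n} (-1)^{i+j}\, a_{ij}\, \det(A[i|j])                   % cofactor expansion along row i
    \det(AB) = \det(A)\,\det(B)                                                    % multiplicativity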

We also show that a large class of matrix identities, such as "AB=I implies BA=I" (proposed by S.A. Cook as a candidate for separating the Frege and Extended Frege propositional proof systems), has feasible proofs, and hence polynomially-bounded Extended Frege proofs. We introduce the notion of completeness for these matrix identities.
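
For orientation, the usual textbook derivation of this identity over a field runs as follows (our LaTeX sketch; whether reasoning of this kind, which leans on determinant machinery at every step, can be carried out feasibly is precisely what is at issue):

    AB = I \;\Rightarrow\; \det(A)\det(B) = 1
          \;\Rightarrow\; \det(A) \neq 0
          \;\Rightarrow\; A^{-1} = \det(A)^{-1}\operatorname{adj}(A) \text{ exists}
          \;\Rightarrow\; B = (A^{-1}A)B = A^{-1}(AB) = A^{-1}
          \;\Rightarrow\; BA = A^{-1}A = I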

As the main tool to prove our results, we design three logical theories: LA, LAP, and ∀LAP. LA is a three-sorted, quantifier-free theory of Linear Algebra; the three sorts are indices, field elements, and matrices. This is a simple theory that allows us to formalize and prove all the basic properties of matrices (roughly, the properties stating that the set of matrices forms a ring). The theorems of LA have polynomially-bounded Frege proofs.

We extend LA to LAP by adding a new function symbol, P, intended to denote matrix powering: P(n,A) is the n-th power of the matrix A. LAP is well suited for formalizing Berkowitz's algorithm, and it is strong enough to prove the equivalence of some fundamental principles of Linear Algebra. The theorems of LAP translate into quasi-polynomially-bounded Frege proofs.
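
As an illustration of why matrix powering is a natural efficiently computable primitive to add, here is a minimal Python sketch (the helper names mat_mul and mat_pow are ours) that computes the n-th power of a matrix by repeated squaring, using O(log n) matrix products:

    def mat_mul(X, Y):
        """Product of two square matrices given as lists of lists."""
        size = len(X)
        return [[sum(X[i][k] * Y[k][j] for k in range(size)) for j in range(size)]
                for i in range(size)]

    def mat_pow(n, A):
        """n-th power of the square matrix A by repeated squaring (n >= 0)."""
        size = len(A)
        result = [[1 if i == j else 0 for j in range(size)] for i in range(size)]  # identity
        while n > 0:
            if n & 1:
                result = mat_mul(result, A)
            A = mat_mul(A, A)
            n >>= 1
        return result

Roughly, this logarithmic squaring depth is what places matrix powering in NC, and unwinding such a polylogarithmic-depth circuit into a formula is the source of the quasi-polynomial blow-up mentioned above.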

Finally, we extend LAP to ∀LAP by allowing induction on formulas with universal (∀) matrix quantifiers. This new theory is strong enough to prove the Cayley-Hamilton Theorem, and hence (by our equivalence) all the major principles of Linear Algebra. The theorems of ∀LAP translate into polynomially-bounded Extended Frege proofs.

Table of contents:

  1. Introduction
    1.1 Motivation
    1.2 Contributions
    1.3 Summary of results
  2. The Theory LA
    2.1 Language
    2.2 Terms, formulas and sequents
    2.3 Axioms
    2.4 Rules of inference and proof systems

  3. The Theorems of LA
    3.1 LA proofs of basic matrix identities
    3.2 Hard matrix identities

  4. LA with Matrix Powering
    4.1 Language
    4.2 Berkowitz's algorithm
    4.3 Berkowitz's algorithm and clow sequences

  5. The Characteristic Polynomial
    5.1 Basic properties
    5.2 Triangular matrices
    5.3 Hard matrix identities

  6. Equivalence in LAP
    6.1 The axiomatic definition of determinant
    6.2 The cofactor expansion
    6.3 The adjoint as a matrix of cofactors
    6.4 The multiplicativity of the determinant

  7. Translations
    7.1 The propositional proof system PK[a]
    7.2 Translating theorems of LA over GF(2)
    7.3 Translating theorems of LA over GF(p) and Q
    7.4 Translating theorems of LAP

  8. Proofs of the C-H Theorem
    8.1 Traditional proofs of the C-H Theorem
    8.2 Feasible proofs of the C-H Theorem
    8.3 Efficient Extended Frege proofs

  9. Eight Open Problems

Number of pages: 157

