Last time
- If $A$ is an $n\times n$ matrix, $\det(A)$ is a number
- Key property: $A$ is invertible if and only if $\det(A)\ne0$
- Laplace expansion along any row/col gives $\det(A)$
- Formula: sum of (entries $\times$ cofactors) along the row/col
- cofactor: $\pm$ minor ($\pm$ from matrix of signs)
- minor: delete a row & column, then find determinant
- $\det(A^T)=\det(A)$ and $\det(AB)=\det(A)\det(B)$
- If $A$ upper triangular: $\det(A)=$ product of diagonal entries
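- For example (a quick reminder of both facts): $\def\vm#1{\left|\begin{smallmatrix}#1\end{smallmatrix}\right|}\vm{1&2&3\\0&4&5\\0&0&6}=1\vm{4&5\\0&6}-2\vm{0&5\\0&6}+3\vm{0&4\\0&0}=24-0+0=1\cdot 4\cdot 6$.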
Today
- Effect of row/column operations on determinants
- Using this to simplify determinants
- Using determinants and cofactors to find the inverse of a matrix
Theorem: row/column operations and determinants
Let $A$ be an $n\times n$ matrix, let $c$ be a scalar and let $i\ne j$.
$A_{Ri\to x}$ means $A$ but with row $i$ replaced by $x$, and $A_{Ri\leftrightarrow Rj}$ means $A$ with rows $i$ and $j$ swapped.
- $\det(A_{Ri\leftrightarrow Rj})=-\det(A)$ (swapping two rows changes the sign of $\det(A)$).
- $\det(A_{Ri\to c Ri}) = c\det(A)$ (scaling one row scales $\det(A)$ in the same way)
- $\det(A_{Ri\to Ri + c Rj}) = \det(A)$ (adding a multiple of one row to another row doesn't change $\det(A)$)
- Also, these properties all hold if you change “row” into “column” throughout.
Corollary
If an $n\times n$ matrix $A$ has two equal rows (or columns), then $\det(A)=0$, and $A$ is not invertible.
Proof
- Suppose $A$ has two equal rows, row $i$ and row $j$.
- Then $A=A_{Ri\leftrightarrow Rj}$
- So $\det(A)=\det(A_{Ri\leftrightarrow Rj}) = -\det(A)$
- So $\det(A)=0$.
- If $A$ has two equal columns, then $A^T$ has two equal rows
- So $\det(A)=\det(A^T)=0$.
- In either case, $\det(A)=0$. So $A$ is not invertible.■
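- For example, $\def\vm#1{\left|\begin{smallmatrix}#1\end{smallmatrix}\right|}\vm{1&2\\1&2}=1\cdot 2-2\cdot 1=0$, so the matrix $\left[\begin{smallmatrix}1&2\\1&2\end{smallmatrix}\right]$ (equal rows) is not invertible.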
Examples
- $\det(A_{Ri\leftrightarrow Rj})=-\det(A)$, so $\def\vm#1{\left|\begin{smallmatrix}#1\end{smallmatrix}\right|}\vm{0&0&2\\0&3&15\\4&23&2} = -\vm{4&23&2\\0&3&15\\0&0&2}=-4\cdot 3\cdot 2 = -24$.
- $\det(A_{Ri\to c Ri})=c\det(A)$, and the same for columns. So \begin{align*}\vm{ \color{red}2&\color{red}4&\color{red}6&\color{red}{10}\\\color{blue}5&\color{blue}0&\color{blue}0&-\color{blue}{10}\\\color{orange}9&\color{orange}0&\color{orange}{81}&\color{orange}{99}\\1&2&3&4} &= \color{red}2\cdot \color{blue}5\cdot \color{orange}9 \vm{ 1&\color{green}2&\color{pink}3&5\\1&\color{green}0&\color{pink}0&-2\\1&\color{green}0&\color{pink}9&11\\1&\color{green}2&\color{pink}3&4}\\&=2\cdot 5\cdot 9\cdot \color{green}2\cdot\color{pink} 3 \vm{ 1&1&1&5\\1&0&0&-2\\1&0&3&11\\1&1&1&4}.\end{align*}
- $\det(A_{R1\to R1-R4})=\det(A)$, so \begin{align*}\vm{ 1&1&1&5\\1&0&0&-2\\1&0&3&11\\1&1&1&4} &=\vm{ 0&0&0&1\\1&0&0&-2\\1&0&3&11\\1&1&1&4}=-1\vm{1&0&0\\1&0&3\\1&1&1}+0\\&=-\vm{0&3\\1&1} = -(-3)=3.\end{align*}
- Hence \begin{align*}\vm{ 2&4&6&10\\5&0&0&-10\\9&0&81&99\\1&2&3&4} &= 2\cdot 5\cdot 9\cdot 2\cdot 3 \vm{ 1&1&1&5\\1&0&0&-2\\1&0&3&11\\1&1&1&4} \\&= 2\cdot 5\cdot 9\cdot 2\cdot 3 \cdot 3 = 1620.\end{align*}
Corollary
If $\def\row{\text{row}}\row_i(A)=c\cdot \row_j(A)$ for some $i\ne j$ and some $c\in \mathbb{R}$, then $\det(A)=0$.
Proof
- $\row_i(A)-c \cdot\row_j(A)=0$
- So $A_{Ri\to Ri-c\,Rj}$ has a zero row
- By Laplace expansion along this row: $\det(A_{Ri\to Ri-c\,Rj})=0$
- So $\det(A)=\det(A_{Ri\to Ri-c\,Rj})=0$.■
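- For example, in $\def\vm#1{\left|\begin{smallmatrix}#1\end{smallmatrix}\right|}\vm{1&2\\3&6}$ the second row is $3\times$ the first row, and indeed $\vm{1&2\\3&6}=1\cdot 6-2\cdot 3=0$.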
Effect of EROs on the determinant
We've seen that:
- swapping two rows of the matrix multiplies the determinant by $-1$;
- scaling one of the rows of the matrix by $c$ scales the determinant by $c$; and
- replacing row $i$ by “row $i$ ${}+{}$ $c\times {}$ (row $j$)”, where $c$ is a scalar and $i\ne j$, does not change the determinant.
- Since $\det(A)=\det(A^T)$, this all applies equally to columns instead of rows.
Using EROs to find the determinant
- Can use EROs to put a matrix into upper triangular form
- Then finding the determinant is easy: just multiply the diagonal entries together.
- Just have to keep track of how the determinant is changed by any row swaps and row scalings.
Example: using EROs to find the determinant
\begin{align*}\def\vm#1{\left|\begin{smallmatrix}#1\end{smallmatrix}\right|}\vm{1&3&1&3\\\color{red}4&\color{red}8&\color{red}0&\color{red}{12}\\0&1&3&6\\2&2&1&6}&= \color{red}{4}\vm{1&3&1&\color{blue}3\\1&2&0&\color{blue}3\\0&1&3&\color{blue}6\\2&2&1&\color{blue}6}=4\cdot \color{blue}3\vm{\color{green}1&3&1&1\\\color{red}1&2&0&1\\\color{red}0&1&3&2\\\color{red}2&2&1&2} \\&=12\vm{1&3&1&1\\\color{blue}0&\color{blue}{-1}&\color{blue}{-1}&\color{blue}{0}\\\color{blue}0&\color{blue}1&\color{blue}3&\color{blue}2\\0&-4&-1&0} =\color{blue}{-}12\vm{1&3&1&1\\0&\color{green}1&3&2\\0&\color{red}{-1}&{-1}&{0}\\0&\color{red}{-4}&-1&0} \\&=-12\vm{1&3&1&1\\0&1&3&2\\0&0&\color{green}2&2\\0&0&\color{red}{11}&8} =-12\vm{1&3&1&1\\0&1&3&2\\0&0&2&2\\0&0&0&-3} \\&=-12(1\times1\times2\times(-3))=72. \end{align*}
Finding the inverse of an invertible $n\times n$ matrix
The adjoint of a square matrix
If $C$ is the matrix of cofactors of a square matrix $A$, then the adjoint of $A$ is $J=C^T$, the transpose of the matrix of cofactors.
Example: $n=2$
If $A=\def\mat#1{\begin{bmatrix}#1\end{bmatrix}}\def\vm#1{\begin{vmatrix}#1\end{vmatrix}}\mat{a&b\\c&d}$, then $C=\mat{d&-c\\-b&a}$, so the adjoint of $A$ is $J=C^T=\mat{d&-b\\-c&a}$.
- Recall that $AJ=(\det A)I_2=JA$ (earlier calculation)
- Hence if $\det A\ne0$, then $A^{-1}=\frac1{\det A}J$.
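- For example, if $\def\mat#1{\begin{bmatrix}#1\end{bmatrix}}A=\mat{1&2\\3&4}$, then $J=\mat{4&-2\\-3&1}$ and $\det A=1\cdot 4-2\cdot 3=-2\ne0$, so $A^{-1}=\frac1{-2}\mat{4&-2\\-3&1}=\mat{-2&1\\3/2&-1/2}$.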
Example: $n=3$
Find $J$, the adjoint of $\def\mat#1{\left[\begin{smallmatrix}#1\end{smallmatrix}\right]}A=\mat{3&1&0\\-2&-4&3\\5&4&-2}$, and compute $A^{-1}$.
- Matrix of signs: $\mat{+&-&+\\-&+&-\\+&-&+}$
- Matrix of cofactors: $C=\def\vm#1{\left|\begin{smallmatrix}#1\end{smallmatrix}\right|}\mat{\vm{-4&3\\4&-2}&-\vm{-2&3\\5&-2}&\vm{-2&-4\\5&4}\\-\vm{1&0\\4&-2}&\vm{3&0\\5&-2}&-\vm{3&1\\5&4}\\\vm{1&0\\-4&3}&-\vm{3&0\\-2&3}&\vm{3&1\\-2&-4}}= \mat{-4&11&12\\2&-6&-7\\3&-9&-10}$
- Adjoint of $A=\mat{3&1&0\\-2&-4&3\\5&4&-2}$ is $J=C^T=\mat{-4&2&3\\11&-6&-9\\12&-7&-10}$
- $AJ=\mat{3&1&0\\-2&-4&3\\5&4&-2}\mat{-4&2&3\\11&-6&-9\\12&-7&-10}=\mat{-1&0&0\\0&-1&0\\0&0&-1}=-1\cdot I_3$
- $JA=\mat{-4&2&3\\11&-6&-9\\12&-7&-10}\mat{3&1&0\\-2&-4&3\\5&4&-2}=\mat{-1&0&0\\0&-1&0\\0&0&-1}=-1\cdot I_3$
- So $A^{-1}=-J$.
- And $\det(A)=-1$ (checked below).
- So $A^{-1}=\frac1{\det(A)}J$ again.
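- Check of the determinant, expanding along the first row: $\def\vm#1{\left|\begin{smallmatrix}#1\end{smallmatrix}\right|}\det(A)=3\vm{-4&3\\4&-2}-1\vm{-2&3\\5&-2}+0=3(8-12)-(4-15)=-12+11=-1$. (This is exactly the $(1,1)$ entry of $AJ$: the first row of $A$ dotted with the first column of $J$, i.e. with the first row of $C$.)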
Theorem: key property of the adjoint of a square matrix
If $A$ is any $n\times n$ matrix and $J$ is its adjoint, then $AJ=(\det A)I_n=JA$.
Proof
- Omitted
Corollary: a formula for the inverse of a square matrix
If $A$ is any $n\times n$ matrix with $\det(A)\ne 0$, then $A$ is invertible, and $A^{-1}=\frac1{\det A}J$ where $J$ is the adjoint of $A$.
Proof
- Since $\det A\ne 0$, we can divide the equation $AJ=(\det A)I_n=JA$ by $\det A$.
- $A(\frac1{\det A}J)=I_n=(\frac1{\det A})JA$
- So $A^{-1}=\frac1{\det A} J$. ■
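- For example, for $n=2$ this recovers the familiar formula: if $\def\mat#1{\begin{bmatrix}#1\end{bmatrix}}A=\mat{a&b\\c&d}$ with $\det A=ad-bc\ne0$, then the adjoint is $J=\mat{d&-b\\-c&a}$ (computed earlier), so $A^{-1}=\frac1{ad-bc}\mat{d&-b\\-c&a}$.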
Example: $n=4$
Let $A=\mat{1&0&0&0\\1&2&0&0\\1&2&3&0\\1&2&3&4}$.
- Reminder: repeated row or zero row gives determinant zero
- $C=\mat{+\vm{2&0&0\\2&3&0\\2&3&4}&-\vm{1&0&0\\1&3&0\\1&3&4}&+0&-0\\-0&+\vm{1&0&0\\1&3&0\\1&3&4}&-\vm{1&0&0\\1&2&0\\1&2&4}&+0\\+0&-0&+\vm{1&0&0\\1&2&0\\1&2&4}&-\vm{1&0&0\\1&2&0\\1&2&3}\\-0&+0&-0&+\vm{1&0&0\\1&2&0\\1&2&3}}=\mat{24&-12&0&0\\0&12&-8&0\\0&0&8&-6\\0&0&0&6}$
- $C=\mat{24&-12&0&0\\0&12&-8&0\\0&0&8&-6\\0&0&0&6}$ so $J=C^T=\mat{24&0&0&0\\-12&12&0&0\\0&-8&8&0\\0&0&-6&6}$.
- So $A^{-1}=\frac1{\det A}J = \frac1{24}\mat{24&0&0&0\\-12&12&0&0\\0&-8&8&0\\0&0&-6&6}=\mat{1&0&0&0\\-1/2&1/2&0&0\\0&-1/3&1/3&0\\0&0&-1/4&1/4}$.
- (Easy to check that $AA^{-1}=I_4=A^{-1}A$.)
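- For instance, the $(2,2)$ entry of $AA^{-1}$ is (row $2$ of $A$) $\cdot$ (column $2$ of $A^{-1}$) $=1\cdot 0+2\cdot\frac12+0\cdot(-\frac13)+0\cdot 0=1$, and the $(2,1)$ entry is $1\cdot 1+2\cdot(-\frac12)+0+0=0$, as expected.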

