
Invertibility

We've seen that solving matrix equations $AX=B$ is useful, since such equations generalise systems of linear equations.

How can we solve them?

Example

Take $A=\def\mat#1{\begin{bmatrix}#1\end{bmatrix}}\mat{2&4\\0&1}$ and $B=\mat{3&4\\5&6}$, so we want to find all matrices $X$ so that $AX=B$, or \[ \mat{2&4\\0&1}X=\mat{3&4\\5&6}.\] Note that $X$ must be a $2\times 2$ matrix for this to work, by the definition of matrix multiplication. So one way to solve this is to write $X=\mat{x_{11}&x_{12}\\x_{21}&x_{22}}$ and plug it in: \[\mat{2&4\\0&1}\mat{x_{11}&x_{12}\\x_{21}&x_{22}}=\mat{3&4\\5&6}\iff \mat{2x_{11}+4x_{21}&2 x_{12}+4x_{22}\\x_{21}&x_{22}}=\mat{3&4\\5&6}\] and then equate entries to get four linear equations: \begin{align*}2x_{11}+4x_{21}&=3\\2 x_{12}+4x_{22}&=4\\x_{21}&=5\\x_{22}&=6\end{align*} which we can solve in the usual way.
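If you want to check the arithmetic, the same system can be solved numerically. This is a quick sketch assuming NumPy is available; it is not part of the lecture material:

```python
import numpy as np

# Coefficient matrix and right-hand side from the example above.
A = np.array([[2.0, 4.0],
              [0.0, 1.0]])
B = np.array([[3.0, 4.0],
              [5.0, 6.0]])

# np.linalg.solve finds the X with AX = B, solving for each column of B.
X = np.linalg.solve(A, B)

# The four equations give x11 = -8.5, x12 = -10, x21 = 5, x22 = 6.
assert np.allclose(X, [[-8.5, -10.0], [5.0, 6.0]])
```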

But this is a bit tedious! We will develop a slicker method by first thinking about solving ordinary equations $ax=b$ where $a,x,b$ are all numbers, or if you like, $1\times 1$ matrices.

Solving $ax=b$ and $AX=B$

If $a\ne0$, then solving $ax=b$ where $a,b,x$ are numbers is easy. We just divide both sides by $a$, or equivalently, we multiply both sides by $\tfrac1a$, to get the solution: $x=\tfrac1a\cdot b$.

Why does this work? If $x=\tfrac1a\cdot b$, then \begin{align*} ax&=a(\tfrac1a\cdot b)\\&=(a\cdot \tfrac1a)b\\&=1b\\&=b\end{align*} so $ax$ really is equal to $b$, and we do have a solution to $ax=b$.

What is special about $\tfrac1a$ which made this all work?

  1. we have $\tfrac1a \cdot a = 1$,
  2. and $1b = b$.

Now for an $n\times k$ matrix $B$, we know that the identity matrix $I_n$ does the same sort of thing as $1$ is doing in the relation $1b=b$: we have $I_nB=B$ for any $n\times k$ matrix $B$. So instead of $\tfrac1a$, we want to find a matrix $A'$ with the property: $A'\cdot A=I_n$. In fact, because matrix multiplication is not commutative, we also require that $A\cdot A'=I_n$. It's then easy to argue that $X=A'\cdot B$ is a solution to $AX=B$, since \begin{align*} AX&=A(A'\cdot B)\\&=(A\cdot A')B\\&=I_nB\\&=B.\end{align*}

Example revisited

If $A=\mat{2&4\\0&1}$, then the matrix $A'=\mat{\tfrac12&-2\\0&1}$ does have the property that \[ A\cdot A'=I_2=A'\cdot A.\] (You should check this!). So a solution to $AX=B$ where $B=\mat{3&4\\5&6}$ is $X=A'B=\mat{\tfrac12&-2\\0&1}\mat{3&4\\5&6} = \mat{-8.5&-10\\5&6}$.

Notice that having found the matrix $A'$, then we can solve $AX=C$ easily for any $2\times 2$ matrix $C$: the answer is $X=A'C$. This is quicker than having to solve four new linear equations using our more tedious method above.
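The check suggested above can be done by hand, or with a few lines of NumPy (assumed available here; this sketch is not part of the lecture):

```python
import numpy as np

A  = np.array([[2.0, 4.0],
               [0.0, 1.0]])
Ap = np.array([[0.5, -2.0],
               [0.0,  1.0]])   # the candidate matrix A'

# Check the defining property: A A' = I_2 = A' A.
assert np.allclose(A @ Ap, np.eye(2))
assert np.allclose(Ap @ A, np.eye(2))

# Having A', we solve AX = B for any right-hand side as X = A'B.
B = np.array([[3.0, 4.0],
              [5.0, 6.0]])
assert np.allclose(Ap @ B, [[-8.5, -10.0], [5.0, 6.0]])
```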

Definition: invertible

An $n\times n$ matrix $A$ is invertible if there exists an $n\times n$ matrix $C$ so that \[ AC=I_n=C A.\] The matrix $C$ is called an inverse of $A$.

Examples

  • $A=\mat{2&4\\0&1}$ is invertible, and the matrix $A'=\mat{\tfrac12&-2\\0&1}$ is an inverse of $A$.
  • a $1\times 1$ matrix $A=[a]$ is invertible if and only if $a\ne0$, and if $a\ne0$ then an inverse of $A=[a]$ is $A'=[\tfrac1a]$.
  • $I_n$ is invertible for any $n$, since $I_n\cdot I_n=I_n=I_n\cdot I_n$, so an inverse of $I_n$ is $I_n$.
  • $0_{n\times n}$ is not invertible for any $n$, since $0_{n\times n}\cdot A'=0_{n\times n}$ for any $n\times n$ matrix $A'$, so $0_{n\times n}\cdot A'\ne I_n$.
  • $A=\mat{1&0\\0&0}$ is not invertible, since for any $2\times 2$ matrix $A'=\mat{a&b\\c&d}$ we have $AA'=\mat{a&b\\0&0}$ which is not equal to $I_2=\mat{1&0\\0&1}$ since the $(2,2)$ entries are not equal.
  • $A=\mat{1&2\\-3&-6}$ is not invertible. We'll see why later!
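These examples can be checked numerically. The sketch below assumes NumPy, and leans on a standard fact these notes have not yet established (it is the "why" promised above): a square matrix is invertible exactly when its determinant is nonzero.

```python
import numpy as np

# Assumes the (not-yet-proved) fact that a square matrix is invertible
# if and only if its determinant is nonzero.
examples = {
    "[[2,4],[0,1]]":   np.array([[2.0, 4.0], [0.0, 1.0]]),
    "[[1,0],[0,0]]":   np.array([[1.0, 0.0], [0.0, 0.0]]),
    "[[1,2],[-3,-6]]": np.array([[1.0, 2.0], [-3.0, -6.0]]),
}
for name, M in examples.items():
    d = np.linalg.det(M)
    print(name, "is invertible" if abs(d) > 1e-12 else "is not invertible")
```

This agrees with the bullet list: only the first matrix is invertible; the other two have determinant $0$.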

Proposition: uniqueness of the inverse

If $A$ is an invertible $n\times n$ matrix, then $A$ has a unique inverse.

Proof

Suppose $A'$ and $A''$ are both inverses of $A$. Then $AA'=I_n=A'A$ and $AA''=I_n=A''A$. So \begin{align*} A'&=A'I_n\\&=A'(AA'')\quad\mbox{because }AA''=I_n\\&=(A'A)A''\quad\mbox{because matrix multiplication is associative}\\&=I_nA''\quad\mbox{because }A'A=I_n\\&=A''.\end{align*} So $A'=A''$, whenever $A'$ and $A''$ are inverses of $A$. So $A$ has a unique inverse. ■

Definition/notation: $A^{-1}$

If $A$ is an invertible $n\times n$ matrix, then the unique $n\times n$ matrix $C$ with $AC=I_n=CA$ is called the inverse of $A$. If $A$ is invertible, then we write $A^{-1}$ to mean the (unique) inverse of $A$.

Warning

If $A$ is a matrix then $\frac 1A$ doesn't make sense! You should never write this down. In particular, $A^{-1}$ definitely doesn't mean $\frac 1A$.

Proposition: solving $AX=B$ when $A$ is invertible

If $A$ is an invertible $n\times n$ matrix and $B$ is an $n\times k$ matrix, then the matrix equation \[ AX=B\] has a unique solution: $X=A^{-1}B$.

Proof

First we check that $X=A^{-1}B$ really is a solution to $AX=B$. To see this, note that if $X=A^{-1}B$, then \begin{align*} AX&=A(A^{-1}B)\\&=(AA^{-1})B\\&=I_n B \\&= B. \end{align*} Now we check that the solution is unique. If $X$ and $Y$ are both solutions, then $AX=B$ and $AY=B$, so \[AX=AY.\] Multiplying both sides on the left by $A^{-1}$, we get \[ A^{-1}AX=A^{-1}AY\implies I_nX=I_nY\implies X=Y.\] So any two solutions are equal, so $AX=B$ has a unique solution. ■
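A small numerical illustration of the proposition, again assuming NumPy. Note that `np.linalg.solve` computes the solution without explicitly forming $A^{-1}$, which is preferred in numerical work; uniqueness means both routes must agree.

```python
import numpy as np

A = np.array([[2.0, 4.0],
              [0.0, 1.0]])
B = np.array([[3.0, 4.0, 1.0],
              [5.0, 6.0, 0.0]])   # a 2x3 right-hand side, so k = 3

X1 = np.linalg.inv(A) @ B    # X = A^{-1}B, as in the proposition
X2 = np.linalg.solve(A, B)   # solves AX = B without forming A^{-1}

# Uniqueness of the solution: both computations give the same X,
# and that X really satisfies AX = B.
assert np.allclose(X1, X2)
assert np.allclose(A @ X1, B)
```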

lecture_11.1424774454.txt.gz · Last modified: by rupert
