===== Matrix equations =====

We've seen that a single linear equation can be written using [[row-column multiplication]]. For example,
\[ 2x-3y+z=8\]
can be written as
\[ \def\m#1{\begin{bmatrix}#1\end{bmatrix}} \m{2&-3&1}\m{x\\y\\z}=8\]
or
\[ a\vec x=8\]
where $a=\m{2&-3&1}$ is a row vector and $\vec x=\m{x\\y\\z}$ is a column vector of variables.

We can write a whole [[system of linear equations]] in a similar way, as a matrix equation using [[matrix multiplication]]. For example, we can rewrite the linear system
\begin{align*} 2x-3y+z&=8\\ x+y+z&=2\end{align*}
as
\[ \m{2&-3&1\\1&1&1}\m{x\\y\\z}=\m{8\\2}\]
or
\[ A\vec x=\vec b\]
where $A=\m{2&-3&1\\1&1&1}$, $\vec x=\m{x\\y\\z}$ and $\vec b=\m{8\\2}$.

More generally, any linear system
\begin{align*} a_{11}x_1+a_{12}x_2+\dots+a_{1m}x_m&=b_1\\ a_{21}x_1+a_{22}x_2+\dots+a_{2m}x_m&=b_2\\ &\;\;\vdots\\ a_{n1}x_1+a_{n2}x_2+\dots+a_{nm}x_m&=b_n \end{align*}
can be written in the form
\[ A\vec x=\vec b\]
where $A$ is the $n\times m$ matrix, called the **coefficient matrix** of the linear system, whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system), $\vec x=\m{x_1\\x_2\\\vdots\\x_m}$ and $\vec b=\m{b_1\\b_2\\\vdots\\b_n}$.
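As a quick numerical sanity check, the matrix form $A\vec x=\vec b$ can be verified with numpy. This is only a sketch, and the sample system and candidate solution below are illustrative choices, not taken from the lecture:

```python
import numpy as np

# Hypothetical sample system (for illustration):
#   2x - 3y + z = 8
#    x +  y + z = 2
A = np.array([[2.0, -3.0, 1.0],
              [1.0,  1.0, 1.0]])   # coefficient matrix: (i,j) entry is a_ij
b = np.array([8.0, 2.0])

x = np.array([2.0, -1.0, 1.0])     # a candidate solution (x, y, z)
print(np.allclose(A @ x, b))       # → True: A x reproduces both equations at once
```

Each row of `A @ x` is exactly one row-column product, i.e. the left-hand side of one equation of the system.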
| + | |||
More generally still, we might want to solve a matrix equation like \[AX=B\] where $A$, $X$ and $B$ are matrices of any size, with $A$ and $B$ fixed matrices and $X$ a matrix of unknown variables. Because of the definition of [[matrix multiplication]], solving $AX=B$ amounts to solving one linear system for each column: the $j$th column of $X$ must solve the linear system whose right-hand side is the $j$th column of $B$.
Recall that the scalar equation $ax=b$ with $a\ne0$ has the solution $x=\tfrac1a b$. What is special about $\tfrac1a$ that made this all work?
  - we have $a\cdot \tfrac1a=1$;
  - and $1b = b$.
Now for an $n\times k$ matrix $B$, we know that the identity matrix $I_n$ does the same sort of thing as $1$ is doing in the relation $1b=b$: we have $I_nB=B$ for any $n\times k$ matrix $B$. So instead of $\tfrac1a$, we want to find a matrix $C$ with the property $AC=I_n$. In fact, because matrix multiplication is not commutative, we will insist on the two-sided property $AC=I_n=CA$. Given such a matrix $C$, the matrix $X=CB$ solves $AX=B$:
\begin{align*} AX&=A(CB)\\&=(AC)B\\&=I_nB\\&=B.\end{align*}

==== Example revisited ====
If $A=\mat{2&4\\0&1}$ and $C=\mat{\tfrac12&-2\\0&1}$, then
\[ AC=I_2=CA.\]
(You should check this!) So a solution to the matrix equation $AX=B$ we considered above is $X=CB$.

Notice that having found the matrix $C$, we can then solve $AX=B$ easily for any $2\times 2$ matrix $B$: the answer is $X=CB$. This is quicker than having to solve four new linear equations using our more tedious method above.
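The claims in this example are easy to confirm numerically. A sketch using numpy; the right-hand side $B$ below is an arbitrary illustrative choice, not the one from the lecture:

```python
import numpy as np

A = np.array([[2.0, 4.0],
              [0.0, 1.0]])
C = np.array([[0.5, -2.0],
              [0.0, 1.0]])   # the candidate inverse from the example

# C is a two-sided inverse: AC = I_2 = CA
I2 = np.eye(2)
print(np.allclose(A @ C, I2) and np.allclose(C @ A, I2))   # → True

# X = CB solves AX = B for any 2x2 right-hand side B
B = np.array([[3.0, 1.0],
              [2.0, 5.0]])   # arbitrary illustrative right-hand side
X = C @ B
print(np.allclose(A @ X, B))   # → True
```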
==== Definition: invertible ====

An $n\times n$ matrix $A$ is called **invertible** if there is an $n\times n$ matrix $C$ so that \[ AC=I_n=CA.\] Such a matrix $C$ is called an **inverse** of $A$.

==== Examples ====
  * $A=\mat{2&4\\0&1}$ is invertible: we checked above that $C=\mat{\tfrac12&-2\\0&1}$ satisfies $AC=I_2=CA$, so $C$ is an inverse of $A$.
  * a $1\times 1$ matrix $A=[a]$ is invertible if and only if $a\ne0$, and if $a\ne0$ then an inverse of $A=[a]$ is $C=[\tfrac1a]$.
  * $I_n$ is invertible for any $n$, since $I_n\cdot I_n=I_n$, so an inverse of $I_n$ is $I_n$.
  * $0_{n\times n}$ is not invertible for any $n$, since $0_{n\times n}\cdot C=0_{n\times n}\ne I_n$ for every $n\times n$ matrix $C$.
  * $A=\mat{1&2\\2&4}$ is not invertible, as we will see below.
  * $A=\mat{1&1\\0&1}$ is invertible (find an inverse!).
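These examples can be explored numerically. A sketch using numpy, whose `np.linalg.inv` raises `LinAlgError` when asked to invert a non-invertible matrix; the singular $2\times2$ matrix below is the illustrative one from the list above:

```python
import numpy as np

# I_n is invertible and is its own inverse
I3 = np.eye(3)
print(np.allclose(I3 @ I3, I3))   # → True

# The zero matrix is never invertible: numpy refuses to invert it
try:
    np.linalg.inv(np.zeros((2, 2)))
except np.linalg.LinAlgError:
    print("zero matrix: not invertible")

# A singular 2x2 matrix (its rows are proportional, so det = 0)
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError:
    print("S: not invertible")
```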

==== Proposition: uniqueness of inverses ====

An invertible matrix has exactly one inverse.

=== Proof ===
Suppose $C$ and $C'$ are both inverses of $A$. Then $AC=I_n=CA$ and $AC'=I_n=C'A$. So
\begin{align*} C&=CI_n\\&=C(AC')\\&=(CA)C'\\&=I_nC'\\&=C'.\end{align*}
So $C=C'$ whenever $C$ and $C'$ are inverses of $A$. So $A$ has a unique inverse. ■
==== Definition/Notation ====

If $A$ is invertible, then by the Proposition it has exactly one inverse; we call it **the inverse** of $A$, and we write it as $A^{-1}$.
| + | |||
| + | If a matrix $A$ is not invertible, then $A^{-1}$ does not exist. | ||
=== Warning ===
If $A$ is a matrix then $\frac 1A$ doesn't mean anything: we never divide by a matrix. Always write $A^{-1}$ instead.
Similarly, you should **never** write down $\frac AB$ where $A$ and $B$ are matrices. This doesn't make sense: it could mean either $AB^{-1}$ or $B^{-1}A$, and since matrix multiplication is not commutative, these are generally different.

==== Examples revisited ====
  * $A=\mat{2&4\\0&1}$ has $A^{-1}=\mat{\tfrac12&-2\\0&1}$.
  * a $1\times 1$ matrix $A=[a]$ with $a\ne0$ has $A^{-1}=[\tfrac1a]$.
  * $I_n^{-1}=I_n$.
  * $0_{n\times n}$ is not invertible, so $0_{n\times n}^{-1}$ does not exist.
  * $\mat{1&2\\2&4}$ is not invertible, so $\mat{1&2\\2&4}^{-1}$ does not exist.

==== Proposition ====

If $A$ is an invertible $n\times n$ matrix and $B$ is an $n\times k$ matrix, then the matrix equation \[ AX=B\] has a unique solution, namely $X=A^{-1}B$.

=== Proof ===

First we check that $X=A^{-1}B$ really is a solution to $AX=B$. To see this, note that if $X=A^{-1}B$, then
\begin{align*}
AX&=A(A^{-1}B)\\&=(AA^{-1})B\\&=I_n B \\&= B.
\end{align*}
Now we check that the solution is unique. If $X$ and $Y$ are both solutions, then $AX=B$ and $AY=B$, so \[AX=AY.\] Multiplying both sides on the left by $A^{-1}$, we get
\[ A^{-1}AX=A^{-1}AY\implies I_nX=I_nY\implies X=Y.\]
So any two solutions are equal, so $AX=B$ has a unique solution. ■
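In numerical practice one computes the unique solution $X=A^{-1}B$ with a linear solver rather than by forming $A^{-1}$ explicitly. A sketch; the matrices below are illustrative choices:

```python
import numpy as np

A = np.array([[2.0, 4.0],
              [0.0, 1.0]])    # an invertible matrix
B = np.array([[3.0, 1.0],
              [2.0, 5.0]])    # an arbitrary illustrative right-hand side

X_formula = np.linalg.inv(A) @ B   # the textbook formula X = A^{-1} B
X_solver = np.linalg.solve(A, B)   # preferred in practice: solves AX = B directly

print(np.allclose(X_formula, X_solver))   # → True: both give the unique solution
print(np.allclose(A @ X_solver, B))       # → True: it really does solve AX = B
```

`np.linalg.solve` solves one linear system per column of `B`, exactly as in the discussion of $AX=B$ above, and avoids the extra rounding error of computing an explicit inverse.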
| - | + | ||
==== Corollary ====

If $A$ is an $n\times n$ matrix and there exist different matrices $X\ne Y$ with $AX=AY$, then $A$ is not invertible.

=== Proof ===

Write $B=AX$. We have $AX=B$ and $AY=B$, so the matrix equation $AX=B$ has two different solutions, $X$ and $Y$. If $A$ were invertible, this would contradict the Proposition. So $A$ cannot be invertible. ■

==== Example ====
We can now see why the matrix $A=\mat{1&2\\2&4}$ from the examples above is not invertible: we have
\[ A\mat{2&0\\-1&0}=\mat{0&0\\0&0}=A\mat{0&0\\0&0},\]
so there are two different matrices $X\ne Y$ with $AX=AY$. By the Corollary, $A$ is not invertible.
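The Corollary's criterion is easy to check numerically. A sketch; the singular matrix and the two witnesses $X\ne Y$ are the illustrative ones used just above:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # the non-invertible example matrix
X = np.array([[2.0, 0.0],
              [-1.0, 0.0]])
Y = np.zeros((2, 2))

# AX = AY although X != Y, so by the Corollary A cannot be invertible
print(np.allclose(A @ X, A @ Y))   # → True
print(np.array_equal(X, Y))        # → False
```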
lecture_10 · Last modified: 2017/02/21 10:02 by rupert
