===== Matrix equations =====

We've seen that a single linear equation can be written using [[row-column multiplication]]. For example,
\[ 2x-3y+z=8\]
can be written as 
\[ \def\m#1{\begin{bmatrix}#1\end{bmatrix}}\m{2&-3&1}\m{x\\y\\z}=8\]
or
\[ a\vec x=8\]
where $a=\m{2&-3&1}$ and $\vec x=\m{x\\y\\z}$.

We can write a whole [[system of linear equations]] in a similar way, as a matrix equation using [[matrix multiplication]]. For example, we can rewrite the linear system
\begin{align*} 2x-3y+z&=8\\ y-z&=4\\x+y+z&=0\end{align*}
as 
\[ \m{2&-3&1\\0&1&-1\\1&1&1}\m{x\\y\\z}=\m{8\\4\\0},\]
or 
\[ A\vec x=\vec b\]
where $A=\m{2&-3&1\\0&1&-1\\1&1&1}$, $\vec x=\m{x\\y\\z}$ and $\vec b=\m{8\\4\\0}$. (We write the little arrow above the column vectors here because otherwise we might confuse $\vec x$, a column vector of variables, with $x$, a single variable.)

More generally, any linear system
\begin{align*} a_{11}x_1+a_{12}x_2+\dots+a_{1m}x_m&=b_1\\ a_{21}x_1+a_{22}x_2+\dots+a_{2m}x_m&=b_2\\ \hphantom{a_{11}}\vdots \hphantom{x_1+a_{22}}\vdots\hphantom{x_2+\dots+{}a_{nn}} \vdots\ & \hphantom{{}={}\!} \vdots\\ a_{n1}x_1+a_{n2}x_2+\dots+a_{nm}x_m&=b_n \end{align*}
can be written in the form
\[ A\vec x=\vec b\]
where $A$ is the $n\times m$ matrix, called the **coefficient matrix** of the linear system, whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system), $\vec x=\m{x_1\\x_2\\\vdots\\x_m}$ and $\vec b=\m{b_1\\b_2\\\vdots\\b_n}$.

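To see that this really does say the same thing as the system, note that row-column multiplying the $i$th row of $A$ into $\vec x$ gives exactly the left-hand side of the $i$th equation:
\[ \m{a_{i1}&a_{i2}&\dots&a_{im}}\m{x_1\\x_2\\\vdots\\x_m}=a_{i1}x_1+a_{i2}x_2+\dots+a_{im}x_m,\]
so the $i$th entry of $A\vec x$ is equal to $b_i$ precisely when the $i$th equation of the system holds.
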
More generally still, we might want to solve a matrix equation like \[AX=B\] where $A$, $X$ and $B$ are matrices of any size, with $A$ and $B$ fixed matrices and $X$ a matrix of unknown variables. Because of the definition of [[matrix multiplication]], if $A$ is $n\times m$, we need $B$ to be $n\times k$ for some $k$, and then $X$ must be $m\times k$, so we know the size of any solution $X$. But which $m\times k$ matrices $X$ are solutions?
  
In particular, if $a\ne0$ then the $1\times1$ matrix equation $ax=b$ is easy to solve: the solution is $x=\tfrac1a b$.
What is special about $\tfrac1a$ which made this all work?
  
  - we have $a\cdot \tfrac1a = 1$,
  - and $1b = b$.
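
Spelling out why these two facts let us solve $ax=b$ (this is exactly the pattern we are about to copy for matrices): if $a\ne 0$ and we take $x=\tfrac1a b$, then
\[ ax = a\left(\tfrac1a b\right)=\left(a\cdot \tfrac1a\right)b=1b=b.\]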
  
Now for an $n\times k$ matrix $B$, we know that the identity matrix $I_n$ does the same sort of thing as $1$ is doing in the relation $1b=b$: we have $I_nB=B$ for any $n\times k$ matrix $B$. So instead of $\tfrac1a$, we want to find a matrix $C$ with the property $AC=I_n$. In fact, because matrix multiplication is not commutative, we also require that $CA=I_n$. It's then easy to argue that $X=CB$ is a solution to $AX=B$, since
\begin{align*} AX&=A(CB)\\&=(AC)B\\&=I_nB\\&=B.\end{align*}
  
==== Example revisited ====
If $A=\mat{2&4\\0&1}$, then the matrix $C=\mat{\tfrac12&-2\\0&1}$ does have the property that
\[ AC=I_2=CA.\]
(You should check this!) So a solution to $AX=B$ where $B=\mat{3&4\\5&6}$ is $X=CB=\mat{\tfrac12&-2\\0&1}\mat{3&4\\5&6} = \mat{-8.5&-10\\5&6}$.
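
Here, for instance, is the check that $AC=I_2$ (the product $CA$ works out similarly), together with a check that the $X$ we found really does satisfy $AX=B$:
\begin{align*} AC&=\mat{2&4\\0&1}\mat{\tfrac12&-2\\0&1}=\mat{2\cdot\tfrac12+4\cdot 0& 2\cdot(-2)+4\cdot 1\\ 0\cdot\tfrac12+1\cdot 0& 0\cdot(-2)+1\cdot 1}=\mat{1&0\\0&1}=I_2,\\ AX&=\mat{2&4\\0&1}\mat{-8.5&-10\\5&6}=\mat{2\cdot(-8.5)+4\cdot 5& 2\cdot(-10)+4\cdot 6\\ 0\cdot(-8.5)+1\cdot 5& 0\cdot(-10)+1\cdot 6}=\mat{3&4\\5&6}=B.\end{align*}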
  
Notice that having found the matrix $C$, we can then solve $AX=D$ easily for any $2\times 2$ matrix $D$: the answer is $X=CD$. This is quicker than having to solve four new linear equations using our more tedious method above.
  
==== Definition: invertible ====

An $n\times n$ matrix $A$ is called **invertible** if there is an $n\times n$ matrix $C$ so that \[ AC=I_n=CA.\] Such a matrix $C$ is called an **inverse** of $A$.

==== Examples ====
  
  * $A=\mat{2&4\\0&1}$ is invertible, and the matrix $C=\mat{\tfrac12&-2\\0&1}$ is an inverse of $A$.
  * a $1\times 1$ matrix $A=[a]$ is invertible if and only if $a\ne0$, and if $a\ne0$ then an inverse of $A=[a]$ is $C=[\tfrac1a]$.
  * $I_n$ is invertible for any $n$, since $I_n\cdot I_n=I_n=I_n\cdot I_n$, so an inverse of $I_n$ is $I_n$.
  * $0_{n\times n}$ is not invertible for any $n$, since $0_{n\times n}\cdot C=0_{n\times n}$ for any $n\times n$ matrix $C$, so $0_{n\times n}\cdot C\ne I_n$.
  * $A=\mat{1&0\\0&0}$ is not invertible, since for any $2\times 2$ matrix $C=\mat{a&b\\c&d}$ we have $AC=\mat{a&b\\0&0}$ which is not equal to $I_2=\mat{1&0\\0&1}$ since the $(2,2)$ entries are not equal.
  * $A=\mat{1&2\\-3&-6}$ is not invertible. We'll see why later!
  
==== Proposition ====

An invertible matrix has exactly one inverse.
  
=== Proof ===
Suppose $C$ and $C'$ are both inverses of $A$. Then
$AC=I_n=CA$ and $AC'=I_n=C'A$. So
\begin{align*} C&=CI_n\quad\text{by the properties of $I_n$}\\&=C(AC')\quad\mbox{because }AC'=I_n\\&=(CA)C'\quad\mbox{because matrix multiplication is associative}\\&=I_nC'\quad\mbox{because }CA=I_n\\&=C'\quad\text{by the properties of $I_n$}.\end{align*}
So $C=C'$ whenever $C$ and $C'$ are inverses of $A$. So $A$ has a unique inverse. ■
  
==== Definition/notation: $A^{-1}$ ====
  
{{page>the inverse}}

If a matrix $A$ is not invertible, then $A^{-1}$ does not exist.
  
=== Warning ===
If $A$ is a matrix then $\frac 1A$ doesn't make sense! You should never write this down. In particular, $A^{-1}$ definitely doesn't mean $\frac 1A$.
  
Similarly, you should **never** write down $\frac AB$ where $A$ and $B$ are matrices. This doesn't make sense either!

==== Examples revisited ====

  * $A=\mat{2&4\\0&1}$ has $A^{-1}=\mat{\tfrac12&-2\\0&1}$. In other words, $\mat{2&4\\0&1}^{-1}=\mat{\tfrac12&-2\\0&1}$.
  * a $1\times 1$ matrix $A=[a]$ with $a\ne 0$ has $[a]^{-1}=[\tfrac1a]$.
  * $I_n^{-1}=I_n$.
  * $0_{n\times n}^{-1}$ does not exist.
  * $\mat{1&0\\0&0}^{-1}$ does not exist.
  * $\mat{1&2\\-3&-6}^{-1}$ does not exist.
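
Combining the first of these with the earlier observation that $X=CB$ solves $AX=B$ whenever $AC=I_n=CA$, we can use $A^{-1}$ to solve a linear system directly. For instance (with a right-hand side chosen just for illustration),
\[ \mat{2&4\\0&1}\m{x\\y}=\m{8\\4} \implies \m{x\\y}=\mat{2&4\\0&1}^{-1}\m{8\\4}=\mat{\tfrac12&-2\\0&1}\m{8\\4}=\m{\tfrac12\cdot 8-2\cdot 4\\0\cdot 8+1\cdot 4}=\m{-4\\4},\]
and indeed $2\cdot(-4)+4\cdot4=8$ and $0\cdot(-4)+1\cdot4=4$.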

==== Proposition: solving $AX=B$ when $A$ is invertible ====

If $A$ is an invertible $n\times n$ matrix and $B$ is an $n\times k$ matrix, then the matrix equation \[ AX=B\] has a unique solution: $X=A^{-1}B$.

=== Proof ===

First we check that $X=A^{-1}B$ really is a solution to $AX=B$. To see this, note that if $X=A^{-1}B$, then
\begin{align*}
 AX&=A(A^{-1}B)\\&=(AA^{-1})B\\&=I_n B \\&= B.
\end{align*}
Now we check that the solution is unique. If $X$ and $Y$ are both solutions, then $AX=B$ and $AY=B$, so \[AX=AY.\] Multiplying both sides on the left by $A^{-1}$, we get
\[ A^{-1}AX=A^{-1}AY\implies I_nX=I_nY\implies X=Y.\]
So any two solutions are equal, so $AX=B$ has a unique solution. ■

==== Corollary ====

If $A$ is an $n\times n$ matrix and there exist different matrices $X$ and $Y$ so that $AX=AY$, then $A$ is not invertible.

=== Proof ===

Write $B=AX$. We have $AX=B$ and $AY=B$, so the matrix equation $AX=B$ has two different solutions, $X$ and $Y$. If $A$ were invertible, this would contradict the Proposition. So $A$ cannot be invertible. ■

==== Example ====

We can now see why the matrix $A=\mat{1&2\\-3&-6}$ is not invertible. If $X=\mat{-2\\1}$ and $Y=\mat{2\\-1}$, then $X\ne Y$ but $AX=0_{2\times 1}$ and $AY=0_{2\times 1}$, so $AX=AY$. So $A$ is not invertible, by the Corollary.
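
Spelling out those two products:
\[ AX=\mat{1&2\\-3&-6}\mat{-2\\1}=\mat{1\cdot(-2)+2\cdot 1\\(-3)\cdot(-2)+(-6)\cdot 1}=\mat{0\\0},\qquad AY=\mat{1&2\\-3&-6}\mat{2\\-1}=\mat{1\cdot 2+2\cdot(-1)\\(-3)\cdot 2+(-6)\cdot(-1)}=\mat{0\\0}.\]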