====== lecture_11 ======

lecture_11 [2017/02/28 12:07] (current) rupert
==== Proposition: solving $AX=B$ when $A$ is invertible ====
  
If $A$ is an invertible $n\times n$ matrix and $B$ is an $n\times k$ matrix, then the matrix equation \[ AX=B\] has a unique solution: $X=A^{-1}B$.
  
=== Proof ===
  
First we check that $X=A^{-1}B$ really is a solution to $AX=B$. To see this, note that if $X=A^{-1}B$, then
\begin{align*}
 AX&=A(A^{-1}B)\\&=(AA^{-1})B\\&=I_n B \\&= B.
\end{align*}
Now we check that the solution is unique. If $X$ and $Y$ are both solutions, then $AX=B$ and $AY=B$, so \[AX=AY.\] Multiplying both sides on the left by $A^{-1}$, we get
\[ A^{-1}AX=A^{-1}AY\implies I_nX=I_nY\implies X=Y.\]
So any two solutions are equal, so $AX=B$ has a unique solution. ■
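This can be sanity-checked numerically. Here is a minimal sketch in plain Python; the particular matrices $A$, $A'$ and $B$ are made-up for illustration, with $A'$ an inverse of $A$ (which the code verifies directly).

```python
from fractions import Fraction as F

def matmul(P, Q):
    """Multiply two matrices, each given as a list of rows."""
    return [[sum(P[i][t] * Q[t][j] for t in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

# A made-up invertible 2x2 matrix A and an inverse A_inv of A.
A     = [[F(2), F(4)], [F(0), F(1)]]
A_inv = [[F(1, 2), F(-2)], [F(0), F(1)]]

# A made-up 2x3 right-hand side B, so any solution X must be 2x3.
B = [[F(3), F(4), F(1)], [F(5), F(6), F(0)]]

I2 = [[F(1), F(0)], [F(0), F(1)]]
assert matmul(A, A_inv) == I2 and matmul(A_inv, A) == I2  # A_inv is an inverse

X = matmul(A_inv, B)       # by the proposition, the unique solution
assert matmul(A, X) == B   # and indeed AX = B
```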
  
==== Corollary ====
  
If $A$ is an $n\times n$ matrix and there is a non-zero $n\times m$ matrix $K$ so that $AK=0_{n\times m}$, then $A$ is not invertible.
  
=== Proof ===
  
Since $A0_{n\times m}=0_{n\times m}$ and $AK=0_{n\times m}$, the equation $AX=0_{n\times m}$ has (at least) two solutions: $X=0_{n\times m}$ and $X=K$. Since $K$ is non-zero, these two solutions are different.
  
So there is not a unique solution to $AX=B$, for $B$ the zero matrix. If $A$ were invertible, this would contradict the uniqueness statement of the last Proposition. So $A$ cannot be invertible. ■
  
==== Examples ====
  
  * We can now see why the matrix $\def\mat#1{\left[\begin{smallmatrix}#1\end{smallmatrix}\right]}A=\mat{1&2\\-3&-6}$ is not invertible. If $K=\mat{2\\-1}$, then $K$ is non-zero, but $AK=0_{2\times 1}$. So $A$ is not invertible, by the Corollary.
  * $A=\mat{1&4&5\\2&5&7\\3&6&9}$ is not invertible, since $K=\mat{1\\1\\-1}$ is non-zero and $AK=0_{3\times 1}$.
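Both matrix products $AK$ in these examples are easy to check by hand, or with a few lines of plain Python (a quick sketch, no external libraries):

```python
def matmul(P, Q):
    """Multiply two matrices, each given as a list of rows."""
    return [[sum(P[i][t] * Q[t][j] for t in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

# First example: A is 2x2, K is a non-zero 2x1 matrix with AK = 0.
A = [[1, 2], [-3, -6]]
K = [[2], [-1]]
assert matmul(A, K) == [[0], [0]]

# Second example: A is 3x3, K is a non-zero 3x1 matrix with AK = 0.
A = [[1, 4, 5], [2, 5, 7], [3, 6, 9]]
K = [[1], [1], [-1]]
assert matmul(A, K) == [[0], [0], [0]]
```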
  
===== $2\times 2$ matrices: determinants and invertibility =====
  
==== Question ====
  
Which $2\times 2$ matrices are invertible? For the invertible matrices, can we find their inverses?
  
==== Lemma ====
  
If $A=\mat{a&b\\c&d}$ and $J=\mat{d&-b\\-c&a}$, then we have
\[ AJ=\delta I_2=JA\]
where $\delta=ad-bc$.
  
=== Proof ===
This is a calculation (done in the lectures; you should also check it yourself). ■
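For reference, the $AJ$ half of that calculation can be written out as follows (the product $JA$ works out the same way, with the factors in the other order):
\begin{align*}
 AJ&=\mat{a&b\\c&d}\mat{d&-b\\-c&a}\\&=\mat{ad-bc&-ab+ba\\cd-dc&-cb+da}\\&=\mat{ad-bc&0\\0&ad-bc}\\&=\delta I_2.
\end{align*}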
  
==== Definition: the determinant of a $2\times 2$ matrix ====
  
{{page>determinant of a 2x2 matrix}}
  
==== Theorem: the determinant determines the invertibility (and inverse) of a $2\times 2$ matrix ====
  
Let $A=\mat{a&b\\c&d}$ be a $2\times 2$ matrix.
  
  - $A$ is invertible if and only if $\det(A)\ne0$.
  - If $A$ is invertible, then $A^{-1}=\frac{1}{\det(A)}\mat{d&-b\\-c&a}$.
  
=== Proof ===
  
If $A=0_{2\times 2}$, then $\det(A)=0$ and $A$ is not invertible (for example, by the Corollary applied with $K=\mat{1\\0}$). So the statement is true in this special case.
  
Now assume that $A\ne0_{2\times 2}$ and let $J=\mat{d&-b\\-c&a}$.
  
By the previous lemma, we have \[AJ=(\det(A))I_2=JA.\]
  
If $\det(A)\ne0$, then multiplying this equation through by the scalar $\frac1{\det(A)}$, we get
\[ A\left(\frac1{\det(A)}J\right)=I_2=\left(\frac1{\det(A)}J\right) A,\]
so if we write $B=\frac1{\det(A)}J$ to make this look simpler, then we obtain
\[ AB=I_2=BA,\]
so in this case $A$ is invertible with inverse $B=\frac1{\det(A)}J=\frac1{\det(A)}\mat{d&-b\\-c&a}$.
  
If $\det(A)=0$, then $AJ=0_{2\times 2}$ and $J\ne 0_{2\times2}$ (since $A\ne0_{2\times2}$, and $J$ is obtained from $A$ by swapping two entries and multiplying the others by $-1$). Hence by the previous corollary, $A$ is not invertible in this case. ■
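The theorem's inverse formula is easy to test numerically. Here is a minimal Python sketch using exact rational arithmetic; the matrix $A$ below is a made-up example with $\det(A)=-2$:

```python
from fractions import Fraction as F

def matmul(P, Q):
    """Multiply two matrices, each given as a list of rows."""
    return [[sum(P[i][t] * Q[t][j] for t in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def inv2(A):
    """Inverse of a 2x2 matrix via A^{-1} = (1/det A) [[d,-b],[-c,a]]."""
    (a, b), (c, d) = A
    delta = F(a * d - b * c)            # det(A) = ad - bc
    assert delta != 0, "det(A) = 0, so A is not invertible"
    return [[ d / delta, -b / delta],
            [-c / delta,  a / delta]]

A = [[1, 2], [3, 4]]                    # det(A) = 1*4 - 2*3 = -2
I2 = [[F(1), F(0)], [F(0), F(1)]]
assert matmul(A, inv2(A)) == I2         # A A^{-1} = I_2
assert matmul(inv2(A), A) == I2         # A^{-1} A = I_2
```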
  