~~REVEAL~~

===== Matrix equations =====

  * A linear equation can be written using [[row-column multiplication]].
  * e.g. $ \newcommand{\m}[1]{\left[\begin{smallmatrix}#1\end{smallmatrix}\right]} 2x-3y+z=8$ is the same as $ \m{2&-3&1}\m{x\\y\\z}=8$
  * or $ a\vec x=8$ where $a=\m{2&-3&1}$ and $\vec x=\m{x\\y\\z}$ (see the numerical sketch below).

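A minimal numerical sketch of this row-column multiplication, assuming numpy (the library choice and the sample values for $x,y,z$ are assumptions for illustration, not part of the notes):

<code python>
import numpy as np

a = np.array([[2, -3, 1]])     # 1x3 row vector of coefficients
x = np.array([[3], [0], [2]])  # 3x1 column of sample values: x=3, y=0, z=2

# Row-column multiplication gives the 1x1 matrix [2x - 3y + z]
print(a @ x)                   # [[8]], so these values satisfy 2x - 3y + z = 8
</code>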
==== ====
  * We can write a whole [[system of linear equations]] in a similar way, as a matrix equation using [[matrix multiplication]].
  * e.g. the linear system $\begin{align*} 2x-3y+z&=8\\ y-z&=4\\x+y+z&=0\end{align*}$

  * is the same as $\m{2&-3&1\\0&1&-1\\1&1&1}\m{x\\y\\z}=\m{8\\4\\0}$
  * or $ A\vec x=\vec b$ where $A=\m{2&-3&1\\0&1&-1\\1&1&1}$, $\vec x=\m{x\\y\\z}$ and $\vec b=\m{8\\4\\0}$ (see the sketch below).

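A sketch of the same system set up and solved as a single matrix equation, assuming numpy:

<code python>
import numpy as np

A = np.array([[2, -3,  1],
              [0,  1, -1],
              [1,  1,  1]])   # coefficient matrix
b = np.array([8, 4, 0])       # right-hand side

x = np.linalg.solve(A, b)     # solve the matrix equation A x = b
print(x)                      # the solution vector (x, y, z)
print(np.allclose(A @ x, b))  # True: it really satisfies A x = b
</code>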
==== ====

In a similar way, any linear system \begin{align*} a_{11}x_1+a_{12}x_2+\dots+a_{1m}x_m&=b_1\\ a_{21}x_1+a_{22}x_2+\dots+a_{2m}x_m&=b_2\\ \hphantom{a_{11}}\vdots \hphantom{x_1+a_{22}}\vdots\hphantom{x_2+\dots+{}a_{nn}} \vdots\ & \hphantom{{}={}\!} \vdots\\ a_{n1}x_1+a_{n2}x_2+\dots+a_{nm}x_m&=b_n \end{align*}
can be written in the form
\[ A\vec x=\vec b\]
where $A$ is the $n\times m$ matrix, called the **coefficient matrix** of the linear system, whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system).

==== Solutions of matrix equations ====

  * More generally, we might want to solve a matrix equation like \[AX=B\] where $A$, $X$ and $B$ are matrices of any size, with $A$ and $B$ fixed matrices and $X$ a matrix of unknown variables.
  * If $A$ is $n\times m$, we need $B$ to be $n\times k$ for some $k$, and then $X$ must be $m\times k$
    * so we know the size of any solution $X$.
  * But which $m\times k$ matrices $X$ are solutions? (A shape check is sketched below.)

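A quick illustration of the size constraint, assuming numpy (the particular sizes $n=2$, $m=3$, $k=4$ are chosen arbitrarily):

<code python>
import numpy as np

n, m, k = 2, 3, 4
A = np.ones((n, m))    # a fixed n x m matrix
X = np.ones((m, k))    # a candidate m x k solution

print((A @ X).shape)   # (2, 4): the product AX is n x k,
                       # so B must be n x k for AX = B to make sense
</code>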
==== Example ====

If $A=\m{1&0\\0&0}$ and $B=0_{2\times 3}$, then any solution $X$ to $AX=B$ must be $2\times 3$.

  * One solution is $X=0_{2\times 3}$
    * because then we have $AX=A0_{2\times 3}=0_{2\times 3}$.
  * This is not the only solution!
  * For example, $X=\m{0&0&0\\1&2&3}$ is another solution
    * because then we have $AX=\m{1&0\\0&0}\m{0&0&0\\1&2&3}=\m{0&0&0\\0&0&0}=0_{2\times 3}.$

  * So a matrix equation can have more than one solution.

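Both solutions in this example are easy to verify numerically; a sketch assuming numpy:

<code python>
import numpy as np

A = np.array([[1, 0],
              [0, 0]])
B = np.zeros((2, 3))

X1 = np.zeros((2, 3))              # the zero solution
X2 = np.array([[0, 0, 0],
               [1, 2, 3]])         # a second, different solution

print(np.array_equal(A @ X1, B))   # True
print(np.array_equal(A @ X2, B))   # True: AX = B has more than one solution
</code>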
==== Example ====

  * Let $A=\m{2&4\\0&1}$
  * and $B=\m{3&4\\5&6}$
  * Solve $AX=B$ for $X$

  * $X$ must be $2\times 2$
  * $X=\m{x_{11}&x_{12}\\x_{21}&x_{22}}$
  * Do some algebra to solve for $X$
  * ...
  * Is there a quicker way?

==== Example ====

  * Consider $AX=B$, where
    * $A=\def\m#1{\left[\begin{smallmatrix}#1\end{smallmatrix}\right]}\m{1&0\\0&0}$
    * $B=0_{2\times 3}$,
  * i.e. $\m{1&0\\0&0}X=\m{0&0&0\\0&0&0}$
  * then any solution $X$ to $AX=B$ must be $2\times 3$.
  * One solution is $X=0_{2\times 3}$
  * Are there any more?
  * Yes! e.g. $X=\m{0&0&0\\1&2&3}$
  * So a matrix equation can have more than one solution.

===== Invertibility =====

==== Example ====

  * Take $A=\def\mat#1{\m{#1}}\mat{2&4\\0&1}$ and $B=\mat{3&4\\5&6}$, and consider $AX=B$
  * $\mat{2&4\\0&1}X=\mat{3&4\\5&6}$
  * $X$ must be a $2\times 2$ matrix
  * One way to solve: write $X=\mat{x_{11}&x_{12}\\x_{21}&x_{22}}$
  * Plug in and do the matrix multiplication: $\mat{2x_{11}+4x_{21}&2 x_{12}+4x_{22}\\x_{21}&x_{22}}=\mat{3&4\\5&6}$
  * Get four linear equations: $\begin{align*}2x_{11}+4x_{21}&=3\\2 x_{12}+4x_{22}&=4\\x_{21}&=5\\x_{22}&=6\end{align*}$
  * Can solve in the usual way (sketched below)...
  * Tedious! Can we do better?

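A sketch of the "usual way", solving the four linear equations for the entries of $X$ with sympy (using sympy here is an assumption, not part of the notes):

<code python>
import sympy as sp

x11, x12, x21, x22 = sp.symbols('x11 x12 x21 x22')
A = sp.Matrix([[2, 4], [0, 1]])
B = sp.Matrix([[3, 4], [5, 6]])
X = sp.Matrix([[x11, x12], [x21, x22]])

# The entries of A*X - B = 0 are exactly the four equations on the slide
solution = sp.solve(list(A * X - B), [x11, x12, x21, x22])
print(solution)   # {x11: -17/2, x12: -10, x21: 5, x22: 6}
</code>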
==== Simpler: $1\times 1$ matrix equations ====

  * Let $a,b$ be ordinary numbers ($1\times 1$ matrices) with $a\ne0$. How do we solve $ax=b$?
  * Answer: multiply both sides by $a^{-1}$
    * (for numbers, $a^{-1}$ is the same as $\tfrac1a$)
  * Solution: $x=a^{-1}b$.

  * Why does this work?
  * If $x=\tfrac1a\cdot b$, then $ax=a(\tfrac1a\cdot b)=(a\cdot \tfrac1a)b=1b=b$
  * so $ax$ really is equal to $b$
  * We do have a solution to $ax=b$.

==== Thinking about $a^{-1}$ for $a$ a number ====

  * What is special about $a^{-1}$ which made this all work?
  * Write $c=a^{-1}$

  * $1b=b$, and $ac=1$
  * so $x=cb$ has $ax=acb=1b=b$ :-)

==== What about matrices? ====

  * Can we do something similar for an $n\times n$ matrix $A$?
  * i.e. find a matrix $C$ with similar properties?
    * Replace $1$ with $I_n$
    * Then $I_nB=B$
    * So if we had a matrix $C$ with $AC=I_n$...
    * Then $X=CB$ would have $AX=ACB=I_nB=B$ :-)
  * The key property of $C$ is that $AC=I_n$.
  * It turns out we also want $CA=I_n$

==== Example revisited ====
  * Let $A=\mat{2&4\\0&1}$
  * The matrix $C=\mat{\tfrac12&-2\\0&1}$ **does** have the property \[ C A =I_2= AC.\]
  * Check this!
  * Multiply both sides of $AX=B$ on the left by $C$ and use $CA=I_2$:
    * $C(AX)=(CA)X=I_2X=X$
  * Get $X=CB=\mat{\tfrac12&-2\\0&1}\mat{3&4\\5&6} = \mat{-8.5&-10\\5&6}$.
  * This is quicker than solving lots of equations
    * although we don't yet know how to **find** $C$ from $A$
  * Also, if we change $B$ we can easily find solutions to the new equation (see the check below)

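A numerical check of this slide, assuming numpy:

<code python>
import numpy as np

A = np.array([[2.0, 4.0], [0.0, 1.0]])
C = np.array([[0.5, -2.0], [0.0, 1.0]])
B = np.array([[3.0, 4.0], [5.0, 6.0]])

print(np.allclose(C @ A, np.eye(2)))  # True: CA = I_2
print(np.allclose(A @ C, np.eye(2)))  # True: AC = I_2

X = C @ B                             # X = CB
print(X)                              # [[-8.5 -10. ]
                                      #  [  5.    6. ]]
print(np.allclose(A @ X, B))          # True: X really solves AX = B
</code>

Changing $B$ only changes the final product $CB$; the same $C$ works for any right-hand side.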
==== Definition: invertible ====

{{page>invertible}}

==== Examples ====

  * $A=\mat{2&4\\0&1}$ is invertible, and $C=\mat{\tfrac12&-2\\0&1}$ is an inverse
  * a $1\times 1$ matrix $A=[a]$ is invertible if and only if $a\ne0$, and if $a\ne0$ then an inverse of $A=[a]$ is $C=[\tfrac1a]$.
  * $I_n$ is invertible for any $n$, with inverse $I_n$
  * $0_{n\times n}$ is not invertible for any $n$... why?
  * $A=\mat{1&0\\0&0}$ is not invertible... why?
  * $A=\mat{1&2\\-3&-6}$ is not invertible. We'll see why later!

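One way to experiment with these examples is numpy's built-in inverse (an assumption here; the notes develop the theory by hand). It succeeds on the invertible matrix and raises an error on the non-invertible ones:

<code python>
import numpy as np

A = np.array([[2.0, 4.0], [0.0, 1.0]])
print(np.linalg.inv(A))          # [[ 0.5 -2. ]
                                 #  [ 0.   1. ]]

for M in ([[1.0, 0.0], [0.0, 0.0]], [[1.0, 2.0], [-3.0, -6.0]]):
    try:
        np.linalg.inv(np.array(M))
    except np.linalg.LinAlgError:
        print(M, "is not invertible")
</code>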
==== Proposition: uniqueness of the inverse ====
If $A$ is an invertible $n\times n$ matrix, then $A$ has a //unique// inverse.

=== Proof ===
  * $A$ invertible, so $A$ has at least one inverse.
  * Suppose it has two inverses, say $C$ and $D$.
  * Then $AC=I_n=CA$ and $AD=I_n=DA$.
  * So $C=CI_n=C(AD)=(CA)D=I_nD=D$
  * So $C=D$.
  * So any two inverses of $A$ are equal.
  * So $A$ has a unique inverse. ■

==== Definition/notation: $A^{-1}$ ====

{{page>the inverse}}

==== Examples again ====

  * $A=\mat{2&4\\0&1}$ is invertible, and $A^{-1}=\mat{\tfrac12&-2\\0&1}$.
    * i.e. $\mat{2&4\\0&1}^{-1}=\mat{\tfrac12&-2\\0&1}$
  * if $a$ is a non-zero scalar, then $[a]^{-1}=[\tfrac 1a]$

...

  * So $A$ is not invertible, by the Corollary.
  
-==== Example ==== +  Next time: a more systematic way to determine when a matrix is invertible**determinants**

  * $A=\mat{1&4&5\\2&5&7\\3&6&9}$ is not invertible
    * $X=\mat{1\\1\\-1}$ is non-zero
    * and $AX=0_{3\times 1}$.
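The claim that $AX=0_{3\times 1}$ for this non-zero $X$ is easy to check numerically, assuming numpy:

<code python>
import numpy as np

A = np.array([[1, 4, 5],
              [2, 5, 7],
              [3, 6, 9]])
X = np.array([[1], [1], [-1]])   # a non-zero 3x1 column vector

print(A @ X)                     # [[0] [0] [0]]: AX = 0 with X != 0,
                                 # which is why A cannot be invertible
</code>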
===== $2\times 2$ matrices: determinants and invertibility =====

==== Question ====

Which $2\times 2$ matrices are invertible? For the invertible matrices, can we find their inverses?

==== Lemma ====

If $A=\mat{a&b\\c&d}$ and $J=\mat{d&-b\\-c&a}$, then we have
\[ AJ=\delta I_2=JA\]
where $\delta=ad-bc$.

  * Proof is a calculation!

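The calculation can be sketched symbolically, for instance with sympy (an assumption; any computer algebra system would do):

<code python>
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])
J = sp.Matrix([[d, -b], [-c, a]])
delta = a*d - b*c

# Both products simplify to (ad - bc) I_2
print(sp.simplify(A*J - delta*sp.eye(2)))  # Matrix([[0, 0], [0, 0]])
print(sp.simplify(J*A - delta*sp.eye(2)))  # Matrix([[0, 0], [0, 0]])
</code>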
==== Definition: the determinant of a $2\times 2$ matrix ====

{{page>determinant of a 2x2 matrix}}

==== Theorem: the determinant determines the invertibility (and inverse) of a $2\times 2$ matrix ====

Let $A=\mat{a&b\\c&d}$ be a $2\times 2$ matrix.

  - $A$ is invertible if and only if $\det(A)\ne0$.
  - If $A$ is invertible, then $A^{-1}=\frac{1}{\det(A)}\mat{d&-b\\-c&a}$.

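A sketch of the formula in use on the earlier example $A=\mat{2&4\\0&1}$, assuming numpy:

<code python>
import numpy as np

a, b, c, d = 2.0, 4.0, 0.0, 1.0
A = np.array([[a, b], [c, d]])

det = a*d - b*c                           # det(A) = 2*1 - 4*0 = 2
A_inv = (1/det) * np.array([[d, -b],
                            [-c, a]])     # the formula from the theorem

print(A_inv)                              # [[ 0.5 -2. ]
                                          #  [ 0.   1. ]]
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True
</code>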
==== Proof ====

  * The theorem is true for $A=0_{2\times 2}$!
  * Now suppose $A\ne0_{2\times 2}$. Let $J=\mat{d&-b\\-c&a}$.
  * By the previous lemma, $AJ=(\det(A))I_2=JA$.
  * If $\det(A)\ne0$, multiply by $\frac1{\det(A)}$ and write $B=\tfrac1{\det(A)}J$: \[ AB= A\left(\frac1{\det(A)}J\right)=I_2=\left(\frac1{\det(A)}J\right) A=BA\]
  * So $AB=I_2=BA$, so $A$ is invertible with inverse $B=\frac1{\det(A)}J=\frac1{\det(A)}\mat{d&-b\\-c&a}$.

  * If $\det(A)=0$, then $AJ=0_{2\times 2}$ and $J\ne 0_{2\times2}$ [why?]
  * Hence by the previous corollary, $A$ is not invertible in this case. ■