===== Matrix equations =====

We've seen that a single linear equation can be written using [[row-column multiplication]]. For example,
\[ 2x-3y+z=8\]
can be written as
\[ \def\m#1{\begin{bmatrix}#1\end{bmatrix}}\m{2&-3&1}\m{x\\y\\z}=8\]
or
\[ a\vec x=8\]
where $a=\m{2&-3&1}$ and $\vec x=\m{x\\y\\z}$.

We can write a whole [[system of linear equations]] in a similar way, as a matrix equation using [[matrix multiplication]]. For example, we can rewrite the linear system
\begin{align*} 2x-3y+z&=8\\ y-z&=4\\x+y+z&=0\end{align*}
as
\[ \m{2&-3&1\\0&1&-1\\1&1&1}\m{x\\y\\z}=\m{8\\4\\0},\]
or
\[ A\vec x=\vec b\]
where $A=\m{2&-3&1\\0&1&-1\\1&1&1}$, $\vec x=\m{x\\y\\z}$ and $\vec b=\m{8\\4\\0}$. (We write the little arrow above the column vectors so that we don't confuse $\vec x$, a column vector of variables, with $x$, a single variable.)
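
As a quick check, carrying out the [[matrix multiplication]] on the left-hand side recovers the original system:
\[ \m{2&-3&1\\0&1&-1\\1&1&1}\m{x\\y\\z}=\m{2x-3y+z\\y-z\\x+y+z},\]
so the matrix equation $A\vec x=\vec b$ says exactly that $2x-3y+z=8$, $y-z=4$ and $x+y+z=0$.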

More generally, any linear system
\begin{align*} a_{11}x_1+a_{12}x_2+\dots+a_{1m}x_m&=b_1\\ a_{21}x_1+a_{22}x_2+\dots+a_{2m}x_m&=b_2\\ \hphantom{a_{11}}\vdots \hphantom{x_1+a_{22}}\vdots\hphantom{x_2+\dots+{}a_{nm}} \vdots\ & \hphantom{{}={}\!} \vdots\\ a_{n1}x_1+a_{n2}x_2+\dots+a_{nm}x_m&=b_n \end{align*}
can be written in the form
\[ A\vec x=\vec b\]
where $A$ is the $n\times m$ matrix, called the **coefficient matrix** of the linear system, whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system), $\vec x=\m{x_1\\x_2\\\vdots\\x_m}$ and $\vec b=\m{b_1\\b_2\\\vdots\\b_n}$.
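
For instance (this small system is only an illustration, not one of the examples above), the system of $n=2$ equations in $m=3$ variables
\begin{align*} x_1+2x_2-x_3&=5\\ 3x_1-x_2+4x_3&=1\end{align*}
has coefficient matrix $A=\m{1&2&-1\\3&-1&4}$, a $2\times 3$ matrix, and can be written as $A\vec x=\vec b$ with $\vec x=\m{x_1\\x_2\\x_3}$ and $\vec b=\m{5\\1}$.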

More generally still, we might want to solve a matrix equation like \[AX=B\] where $A$, $X$ and $B$ are matrices of any size, with $A$ and $B$ fixed matrices and $X$ a matrix of unknown variables. Because of the definition of [[matrix multiplication]], if $A$ is $n\times m$, we need $B$ to be $n\times k$ for some $k$, and then $X$ must be $m\times k$, so we know the size of any solution $X$. But which $m\times k$ matrices $X$ are solutions?
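
To make the size bookkeeping concrete (these sizes are chosen purely for illustration): if $A$ is $2\times 3$ and $B$ is $2\times 4$, then $n=2$, $m=3$ and $k=4$, so any solution $X$ of $AX=B$ must be a $3\times 4$ matrix.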
  
Suppose $C$ and $C'$ are both inverses of $A$. Then
$AC=I_n=CA$ and $AC'=I_n=C'A$. So
\begin{align*} C&=CI_n\quad\text{by the properties of $I_n$}\\&=C(AC')\quad\text{because }AC'=I_n\\&=(CA)C'\quad\text{because matrix multiplication is associative}\\&=I_nC'\quad\text{because }CA=I_n\\&=C'\quad\text{by the properties of $I_n$.}\end{align*}
So $C=C'$ whenever $C$ and $C'$ are inverses of $A$. So $A$ has a unique inverse. ■
  
  
{{page>the inverse}}

If a matrix $A$ is not invertible, then $A^{-1}$ does not exist.
  
=== Warning ===
If $A$ is a matrix then $\frac 1A$ doesn't make sense! You should never write this down. In particular, $A^{-1}$ definitely doesn't mean $\frac 1A$.
  
Similarly, you should **never** write down $\frac AB$ where $A$ and $B$ are matrices. This doesn't make sense either!

==== Examples revisited ====

  * $A=\m{2&4\\0&1}$ has $A^{-1}=\m{\tfrac12&-2\\0&1}$. In other words, $\m{2&4\\0&1}^{-1}=\m{\tfrac12&-2\\0&1}$ (this is checked below).
  * A $1\times 1$ matrix $A=[a]$ with $a\ne 0$ has $[a]^{-1}=[\tfrac1a]$.
  * $I_n^{-1}=I_n$.
  * $0_{n\times n}^{-1}$ does not exist.
  * $\m{1&0\\0&0}^{-1}$ does not exist.
  * $\m{1&2\\-3&-6}^{-1}$ does not exist.
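
As a quick check of the first example above, multiplying in either order gives the identity matrix:
\[ \m{2&4\\0&1}\m{\tfrac12&-2\\0&1}=\m{2\cdot\tfrac12+4\cdot 0 & 2\cdot(-2)+4\cdot 1\\ 0\cdot\tfrac12+1\cdot 0 & 0\cdot(-2)+1\cdot 1}=\m{1&0\\0&1}=I_2,\]
and similarly $\m{\tfrac12&-2\\0&1}\m{2&4\\0&1}=I_2$, so $\m{\tfrac12&-2\\0&1}$ really is the inverse of $\m{2&4\\0&1}$.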
  
