=== Proof of the proposition ===

1. We want to show that $I_nA=A$ for any $n\times m$ matrix $A$. These matrices have the [[same size]], since $I_n$ has size $n\times n$ and $A$ has size $n\times m$, so $I_n A$ has size $n\times m$ by the definition of [[matrix multiplication]], which is also the size of $A$.

Note that $\text{row}_i(I_n)=[0~0~\dots~0~1~0~\dots~0]$, with the $1$ in the $i$th position. By the definition of matrix multiplication, the $(i,j)$ entry of $I_nA$ is the [[row-column product]]
$\text{row}_i(I_n)\cdot \text{col}_j(A)$, which is
\begin{align*} [0~0~\dots~0~1~0~\dots~0]\begin{bmatrix}a_{1j}\\a_{2j}\\\vdots\\a_{nj}\end{bmatrix} &= 0a_{1j}+0a_{2j}+\dots+0a_{i-1,j}+1a_{ij}+0a_{i+1,j}+\dots+0a_{nj}\\ &= a_{ij}.\end{align*}
So the matrices $I_nA$ and $A$ have the same $(i,j)$ entries, for any $(i,j)$. So $I_nA=A$.
| - | |||
| - | 2. To show that $AI_m=A$ for any $n\times m$ matrix $A$ is similar; the details are left as an exercise. | ||
| - | |||
| - | 3. If $B$ is any $n\times n$ matrix, then $I_nB=B$ by part 1 and $BI_n=B$ by part 2, so $I_nB=B=BI_n$. In particular, $I_nB=BI_n$ so $I_n$ commutes with $B$, for every square $n\times n$ matrix $B$. ■ | ||
| - | |||
===== Algebraic properties of matrix multiplication =====
==== The associative law ====
=== Proposition ===
Matrix multiplication is //associative//: if $A$ is an $n\times m$ matrix, $B$ is an $m\times k$ matrix and $C$ is a $k\times l$ matrix, then \[ (AB)C=A(BC).\]

We omit the proof, but it is not terribly difficult; it is a calculation in which you write down two formulae for the $(i,j)$ entries of $(AB)C$ and $A(BC)$, and carefully check that they are equal, using the fact that if $a,b,c$ are real numbers, then $(ab)c=a(bc)$.
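
As a numerical spot check (not a substitute for the omitted proof), here is a short numpy sketch; the sizes and the random entries are arbitrary choices.

<code python>
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 5))

# (AB)C and A(BC) agree, up to floating-point rounding
print(np.allclose((A @ B) @ C, A @ (B @ C)))   # True
</code>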
| - | |||
| - | === Example === | ||
| - | We saw above that $\newcommand{\m}[1]{\begin{bmatrix}# | ||
| - | \[ AB=A(AA)\stackrel*=(AA)A=BA.\] | ||
| - | The same argument for any square matrix $A$ gives a proof of: | ||
| - | === Proposition === | ||
| - | If $A$ is any square matrix, then $A$ commutes with $A^2$.■ | ||
| - | |||
| - | Using [[wp> | ||
| - | ===Proposition=== | ||
| - | If $A$ is any square matrix and $k\in\mathbb{N}$, | ||
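
Here is a quick numerical illustration of this proposition (a sketch only; the matrix below is an arbitrary choice).

<code python>
import numpy as np

A = np.array([[2.0, 4.0],
              [1.0, 3.0]])   # any square matrix will do

for k in range(1, 6):
    Ak = np.linalg.matrix_power(A, k)
    print(k, np.allclose(A @ Ak, Ak @ A))   # A commutes with A^k: all True
</code>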
| - | |||
| - | |||
| - | ====The distributive laws==== | ||
| - | |||
| - | === Proposition === | ||
| - | If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, | ||
| - | - $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and | ||
| - | - $(B+C)A=BA=CA$ for any $k\times n$ matrices $B$ and $C$. | ||
| - | In other words, $A(B+C)=AB+AC$ whenever the matrix products make sense, and similarly $(B+C)A=BA+CA$ whenever this makes sense. | ||
| - | |||
| - | ===Proof=== | ||
| - | 1. First note that | ||
| - | * $B$ and $C$ are both $m\times k$, so $B+C$ is $m\times k$ by the definition of [[matrix addition]]; | ||
| - | * $A$ is $n\times m$ and $B+C$ is $m\times k$, so $A(B+C)$ is $m\times k$ by the definition of [[matrix multiplication]]; | ||
| - | * $AB$ and $AC$ are both $n\times k$ by the definition of matrix multiplication | ||
| - | * so $AB+AC$ is $n\times k$ by the definition of matrix addition. | ||
| - | So we have (rather long-windedly) checked that $A(B+C)$ and $AB+AC$ have the [[same size]]. | ||
| - | |||
| - | Recall that in tutorial 4 we saw that if $a$ is a $1\times m$ row vector and $b$ and $c$ are $m\times 1$ column vectors, then the [[row-column product]] has the property that \[a\cdot (b+c)=a\cdot b+a\cdot c.\] | ||
| - | So the $(i,j)$ entry of $A(B+C)$ is | ||
| - | \begin{align*}\def\row{\text{row}}\def\col{\text{col}} | ||
| - | \text{row}_i(A)\cdot \col_j(B+C) &= \text{row}_i(A)\cdot \big(\col_j(B)+\col_j(C)\big) | ||
| - | \\ &= \text{row}_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C).\end{align*} | ||
On the other hand,

  * the $(i,j)$ entry of $AB$ is $\text{row}_i(A)\cdot \col_j(B)$; and
  * the $(i,j)$ entry of $AC$ is $\row_i(A)\cdot\col_j(C)$;
  * so the $(i,j)$ entry of $AB+AC$ is $\row_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C)$.

So the entries of $A(B+C)$ and $AB+AC$ are all equal, so $A(B+C)=AB+AC$.

2. The proof is similar, and is left as an exercise.■
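
As with associativity, a quick numerical spot check (not a proof) of both distributive laws, with arbitrary compatible sizes and random entries:

<code python>
import numpy as np

rng = np.random.default_rng(2)
n, m, k = 2, 3, 4
A = rng.standard_normal((n, m))
B = rng.standard_normal((m, k))   # B, C as in part 1
C = rng.standard_normal((m, k))
D = rng.standard_normal((k, n))   # D, E play the roles of B, C in part 2
E = rng.standard_normal((k, n))

print(np.allclose(A @ (B + C), A @ B + A @ C))   # A(B+C) = AB + AC
print(np.allclose((D + E) @ A, D @ A + E @ A))   # (B+C)A = BA + CA
</code>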
| - | |||
| ===== Matrix equations ===== | ===== Matrix equations ===== | ||
| Line 73: | Line 10: | ||
| We can write a whole [[system of linear equations]] in a similar way, as a matrix equation using [[matrix multiplication]]. For example we can rewrite the linear system | We can write a whole [[system of linear equations]] in a similar way, as a matrix equation using [[matrix multiplication]]. For example we can rewrite the linear system | ||
\begin{align*} 2x-3y+z&=-1\\ x+y-2z&=3\end{align*}
as
\[ \m{2&-3&1\\1&1&-2}\m{x\\y\\z}=\m{-1\\3}.\]
This matrix equation has the form
\[ A\vec x=\vec b\]
where $A=\m{2&-3&1\\1&1&-2}$, $\vec x=\m{x\\y\\z}$ and $\vec b=\m{-1\\3}$.
| + | |||
| More generally, any linear system | More generally, any linear system | ||
| Line 86: | Line 24: | ||
| where $A$ is the $n\times m $ matrix, called the **coefficient matrix** of the linear system, whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system) and $\vec x=\m{x_1\\x_2\\\vdots\\x_m}$, | where $A$ is the $n\times m $ matrix, called the **coefficient matrix** of the linear system, whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system) and $\vec x=\m{x_1\\x_2\\\vdots\\x_m}$, | ||
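
Here is a short numpy sketch of the same idea, using the numbers from the small example above: we form the coefficient matrix $A$ and the vector $\vec b$, and ask numpy for one particular solution of $A\vec x=\vec b$.

<code python>
import numpy as np

A = np.array([[2.0, -3.0,  1.0],
              [1.0,  1.0, -2.0]])   # coefficient matrix of the example system
b = np.array([-1.0, 3.0])           # right-hand sides

# Two equations in three unknowns, so we ask for one particular solution
x, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x)
print(np.allclose(A @ x, b))        # True: x really does solve the system
</code>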

More generally still, we might want to solve a matrix equation like \[AX=B\] where $A$, $X$ and $B$ are matrices of any size, with $A$ and $B$ fixed matrices and $X$ a matrix of unknown variables. Because of the definition of [[matrix multiplication]], the sizes have to be compatible: if $A$ is $n\times m$ and $B$ is $n\times k$, then $X$ must be an $m\times k$ matrix.

=== Example ===
If $A=\m{1&2\\2&4}$, is there a $2\times 3$ matrix $X$ so that $AX=0_{2\times 3}$?
One solution is $X=0_{2\times 3}$, since in this case we have $AX=A0_{2\times 3}=0_{2\times 3}$.
Another solution is $X=\m{2&2&2\\-1&-1&-1}$, since \[ \m{1&2\\2&4}\m{2&2&2\\-1&-1&-1}=\m{0&0&0\\0&0&0}=0_{2\times 3}.\]
So from this example, we see that a matrix equation can have many solutions.
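
A numerical illustration of the same point, using the matrices of the example above: two visibly different matrices $X$ both satisfy $AX=0_{2\times 3}$.

<code python>
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

X1 = np.zeros((2, 3))                 # the zero solution
X2 = np.array([[ 2.0,  2.0,  2.0],
               [-1.0, -1.0, -1.0]])   # a second, nonzero solution

print(np.array_equal(A @ X1, np.zeros((2, 3))))   # True
print(np.array_equal(A @ X2, np.zeros((2, 3))))   # True
</code>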
| + | |||
| + | ===== Invertibility ===== | ||
| + | |||
| + | We've seen that solving matrix equations $AX=B$ is useful, since they generalise systems of linear equations. | ||
| + | |||
| + | How can we solve them? | ||
| + | |||
| + | ==== Example ==== | ||
| + | |||
| + | Take $A=\def\mat# | ||
| + | \[\mat{2& | ||
| + | and then equate entries to get four linear equations: | ||
| + | \begin{align*}2x_{11}+4x_{21}& | ||
| + | which we can solve in the usual way. | ||
| + | |||
| + | But this is a bit tedious! We will develop a slicker method by first thinking about solving ordinary equations $ax=b$ where $a,x,b$ are all numbers, or if you like, $1\times 1$ matrices. | ||
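
For comparison, here is roughly what the tedious route looks like computationally (a sketch, reusing the example's numbers). Equating entries column by column says exactly that $A\,\text{col}_j(X)=\text{col}_j(B)$ for each $j$, so each column of $X$ solves its own small linear system.

<code python>
import numpy as np

A = np.array([[2.0, 4.0],
              [1.0, 3.0]])
B = np.array([[3.0, 4.0],
              [2.0, 3.0]])

# Solve A col_j(X) = col_j(B) for each column j, then reassemble X
cols = [np.linalg.solve(A, B[:, j]) for j in range(B.shape[1])]
X = np.column_stack(cols)

print(X)                       # [[0.5 0. ] [0.5 1. ]]
print(np.allclose(A @ X, B))   # True: X solves AX = B
</code>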
| + | |||
| + | ==== Solving $ax=b$ and $AX=B$ ==== | ||
| + | |||
| + | If $a\ne0$, then solving $ax=b$ where $a,b,x$ are numbers is easy. We just divide both sides by $a$, or equivalently, | ||
| + | |||
| + | Why does this work? If $x=\tfrac1a\cdot b$, then | ||
| + | \begin{align*} ax& | ||
| + | so $ax$ really is equal to $b$, and we do have a solution to $ax=b$. | ||
| + | |||
| + | What is special about $\tfrac1a$ which made this all work? | ||
| + | |||
| + | - we have $a\cdot \tfrac1a = 1$, | ||
| + | - and $1b = b$. | ||
| + | |||
| + | Now for an $n\times k$ matrix $B$, we know that the identity matrix $I_n$ does the same sort of thing as $1$ is doing in the relation $1b=b$: we have $I_nB=B$ for any $n\times k$ matrix $B$. So instead of $\tfrac1a$, we want to find a matrix $C$ with the property: $AC=I_n$. In fact, because matrix multiplication is not commutative, | ||
| + | \begin{align*} AX& | ||
| + | |||
==== Example revisited ====
If $A=\mat{2&4\\1&3}$, then you can check that the matrix $C=\mat{3/2&-2\\-1/2&1}$ has the property
\[ A C=I_2=C A.\]
(You should check this!) So a solution to $AX=B$ where $B=\mat{3&4\\2&3}$ is
\[ X=CB=\mat{3/2&-2\\-1/2&1}\mat{3&4\\2&3}=\mat{1/2&0\\1/2&1}.\]
| + | |||
| + | Notice that having found the matrix $C$, then we can solve $AX=C$ easily for any $2\times 2$ matrix $C$: the answer is $X=CC$. This is quicker than having to solve four new linear equations using our more tedious method above. | ||
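
The saving is easy to see computationally. In the sketch below, $C$ is computed once (again via ''np.linalg.inv'', purely for illustration); after that, each new right-hand side $B$ costs only one matrix multiplication.

<code python>
import numpy as np

A = np.array([[2.0, 4.0],
              [1.0, 3.0]])
C = np.linalg.inv(A)     # the matrix C with AC = I_2 = CA, computed once

# Several different right-hand sides; X = CB solves AX = B each time
for B in (np.array([[3.0, 4.0], [2.0, 3.0]]),
          np.eye(2),
          np.array([[1.0, 0.0], [0.0, -1.0]])):
    X = C @ B
    print(np.allclose(A @ X, B))   # True, True, True
</code>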
| + | |||
| + | ==== Definition: invertible ==== | ||
| + | |||
| + | {{page> | ||
| + | |||
| + | ==== Examples ==== | ||
| + | |||
| + | * $A=\mat{2& | ||
| + | * a $1\times 1$ matrix $A=[a]$ is invertible if and only if $a\ne0$, and if $a\ne0$ then an inverse of $A=[a]$ is $C=[\tfrac1a]$. | ||
| + | * $I_n$ is invertible for any $n$, since $I_n\cdot I_n=I_n=I_n\cdot I_n$, so an inverse of $I_n$ is $I_n$. | ||
| + | * $0_{n\times n}$ is not invertible for any $n$, since $0_{n\times n}\cdot C=0_{n\times n}$ for any $n\times n$ matrix $C$, so $0_{n\times n}\cdot C\ne I_n$. | ||
| + | * $A=\mat{1& | ||
| + | * $A=\mat{1& | ||
| + | |||
==== Proposition: uniqueness of inverses ====
If $A$ is an invertible $n\times n$ matrix, then $A$ has a //unique// inverse.

=== Proof ===
Suppose $C$ and $C'$ are both inverses of $A$. Then
$AC=I_n=CA$ and $AC'=I_n=C'A$, so
\begin{align*} C&=CI_n=C(AC')=(CA)C'=I_nC'=C'.\end{align*}
So $C=C'$, and $A$ has only one inverse. ■
| + | |||
| + | ==== Definition/ | ||
| + | |||
| + | {{page> | ||
| + | |||
| + | If a matrix $A$ is not invertible, then $A^{-1}$ does not exist. | ||
| + | |||
| + | === Warning === | ||
| + | |||
| + | If $A$ is a matrix then $\frac 1A$ doesn' | ||
| + | |||
| + | Similarly, you should **never** write down $\frac AB$ where $A$ and $B$ are matrices. This doesn' | ||

==== Examples revisited ====

  * $A=\mat{2&4\\1&3}$ has $A^{-1}=\mat{3/2&-2\\-1/2&1}$.
  * a $1\times 1$ matrix $A=[a]$ with $a\ne 0$ has $[a]^{-1}=[\tfrac1a]$.
  * $I_n^{-1}=I_n$.
  * $0_{n\times n}^{-1}$ does not exist.
  * $\mat{1&1\\0&1}^{-1}=\mat{1&-1\\0&1}$.
  * $\mat{1&0\\0&0}^{-1}$ does not exist.