====== lecture_9 ======
(last modified 2017/02/21 10:02 by rupert)

=== Proof of the proposition, continued ===
Recall the proposition: 1. $I_nA=A$ for any $n\times m$ matrix $A$; 2. $AI_m=A$ for any $n\times m$ matrix $A$; and 3. $I_nB=B=BI_n$ for any $n\times n$ matrix $B$. Part 1 was proved in the previous lecture.

2. The proof that $AI_m=A$ for any $n\times m$ matrix $A$ is similar to the first part of the proof.

3. If $B$ is any $n\times n$ matrix, then taking $m=n$ in parts 1 and 2 gives $I_nB=B$ and $BI_n=B$, so $I_nB=B=BI_n$.■
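The identity-matrix proposition is easy to spot-check numerically. A minimal pure-Python sketch (the helper names `matmul` and `identity` are mine, not from the notes):

```python
def matmul(A, B):
    # Row-column rule: (AB)[i][j] = row_i(A) . col_j(B)
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def identity(n):
    # I_n: ones on the diagonal, zeros elsewhere
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

A = [[1, 2, 3],
     [4, 5, 6]]                      # a 2x3 matrix, so n=2 and m=3
print(matmul(identity(2), A) == A)   # I_n A = A, prints True
print(matmul(A, identity(3)) == A)   # A I_m = A, prints True
```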
===== Algebraic properties of matrix multiplication =====

==== The associative law ====

=== Proposition: the associative law ===
Matrix multiplication is //associative//: if $A$, $B$ and $C$ are matrices such that the products $(AB)C$ and $A(BC)$ are defined, then \[ (AB)C=A(BC).\]
We omit the proof, but it is not terribly difficult; it is a calculation in which you write down two formulae for the $(i,j)$ entries of $(AB)C$ and $A(BC)$, and carefully check that they are equal using the fact that if $a,b,c$ are real numbers, then $(ab)c=a(bc)$.
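Though we omit the proof, the associative law is easy to test on particular matrices. A small pure-Python check (the `matmul` helper is my own, not from the notes); the sizes are chosen so both products are defined:

```python
def matmul(A, B):
    # Row-column rule: (AB)[i][j] = row_i(A) . col_j(B)
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# A is 2x3, B is 3x4, C is 4x2, so (AB)C and A(BC) are both defined and 2x2.
A = [[1, 2, 0], [3, -1, 4]]
B = [[2, 0, 1, 1], [1, 1, 0, 2], [0, 3, 5, 1]]
C = [[1, 0], [2, 1], [0, 3], [1, 1]]
print(matmul(matmul(A, B), C) == matmul(A, matmul(B, C)))  # prints True
```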
=== Example ===
In one of the examples above, we saw a matrix $A$ that commutes with $B=A^2$. The associative law explains why: writing $B=AA$, at the step marked $*$ we have
\[ AB=A(AA)\stackrel*=(AA)A=BA.\]
The same argument for any square matrix $A$ gives a proof of:

=== Proposition ===
If $A$ is any square matrix, then $A$ commutes with $A^2$.■
The powers of a square matrix $A$ are defined by $A^1=A$ and $A^{k+1}=A^kA$ for $k\in\mathbb{N}$. More generally, we have:

=== Proposition: powers of a matrix commute ===
If $A$ is any square matrix and $j,k\in\mathbb{N}$, then \[ A^jA^k=A^{j+k}=A^kA^j.\]■
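The powers-commute proposition can be checked on a concrete matrix. A pure-Python sketch (the helpers `matmul` and `power` are mine, not from the notes):

```python
def matmul(A, B):
    # Row-column rule: (AB)[i][j] = row_i(A) . col_j(B)
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def power(A, k):
    # A^k = A A ... A (k factors), for k >= 1
    P = A
    for _ in range(k - 1):
        P = matmul(P, A)
    return P

A = [[1, 1], [0, 1]]
# A^2 A^3 and A^3 A^2 are both A^5, so they agree:
print(matmul(power(A, 2), power(A, 3)) == matmul(power(A, 3), power(A, 2)))  # prints True
```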
==== The distributive laws ====
=== Lemma: the row-column product is distributive ===
  - If $a$ is a $1\times m$ row vector and $b$ and $c$ are $m\times 1$ column vectors, then $a\cdot (b+c)=a\cdot b+a\cdot c$.
  - If $b$ and $c$ are $1\times m$ row vectors and $a$ is an $m\times 1$ column vector, then $(b+c)\cdot a=b\cdot a+c\cdot a$.

The proof is an exercise (see tutorial worksheet 5).
=== Proposition: the distributive laws ===
If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, then:
  - $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and
  - $(B+C)A=BA+CA$ for any $k\times n$ matrices $B$ and $C$.
In other words, $A(B+C)=AB+AC$ and $(B+C)A=BA+CA$ whenever the matrix sums and products appearing in these equations are defined.
=== Proof ===
1. First note that:
  * $B$ and $C$ are both $m\times k$, so $B+C$ is $m\times k$ by the definition of [[matrix addition]];
  * $A$ is $n\times m$ and $B+C$ is $m\times k$, so $A(B+C)$ is $n\times k$ by the definition of [[matrix multiplication]];
  * $AB$ and $AC$ are both $n\times k$ by the definition of matrix multiplication;
  * so $AB+AC$ is $n\times k$ by the definition of matrix addition.
So we have (rather long-windedly) checked that $A(B+C)$ and $AB+AC$ have the [[same size]].
| + | |||
| + | By the Lemma above, the [[row-column product]] has the property that \[a\cdot (b+c)=a\cdot b+a\cdot c.\] | ||
| + | So the $(i,j)$ entry of $A(B+C)$ is | ||
| + | \begin{align*}\def\row{\text{row}}\def\col{\text{col}} | ||
| + | \text{row}_i(A)\cdot \col_j(B+C) &= \text{row}_i(A)\cdot \big(\col_j(B)+\col_j(C)\big) | ||
| + | \\ &= \text{row}_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C).\end{align*} | ||
On the other hand,
  * the $(i,j)$ entry of $AB$ is $\row_i(A)\cdot \col_j(B)$; and
  * the $(i,j)$ entry of $AC$ is $\row_i(A)\cdot\col_j(C)$;
  * so the $(i,j)$ entry of $AB+AC$ is $\row_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C)$.

So the entries of $A(B+C)$ and $AB+AC$ are all equal, so $A(B+C)=AB+AC$.

2. The proof is similar, and is left as an exercise.■
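The distributive law just proved can also be verified numerically on matrices of the sizes in the proposition. A small pure-Python sketch (the helpers `matmul` and `madd` are mine, not from the notes):

```python
def matmul(A, B):
    # Row-column rule: (AB)[i][j] = row_i(A) . col_j(B)
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def madd(A, B):
    # Entrywise sum of two matrices of the same size
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [0, 3], [4, 1]]    # 3x2, so n=3 and m=2
B = [[1, 0, 2], [3, 1, 1]]      # B and C are 2x3 (k=3)
C = [[0, 2, 1], [1, 1, 4]]
# First distributive law: A(B+C) = AB + AC
print(matmul(A, madd(B, C)) == madd(matmul(A, B), matmul(A, C)))  # prints True
```

The second law $(B+C)A=BA+CA$ can be checked the same way with $k\times n$ matrices $B$ and $C$.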