lecture_9
Last modified 2017/02/21 10:02 by rupert (previous revision: 2015/02/17 13:07, rupert).
=== Definition of matrix multiplication ===

If $A$ is an $n\times m$ matrix and $B$ is an $m\times k$ matrix, then the product $AB$ is the $n\times k$ matrix whose $(i,j)$ entry is
\[ (AB)_{i,j} = \text{row}_i(A)\cdot \text{col}_j(B).\]
If we want to emphasize that we are multiplying matrices in this way, we might sometimes write $A\cdot B$ instead of $AB$.

=== The $n\times n$ identity matrix ===

The $n\times n$ identity matrix $I_n$ is the $n\times n$ matrix whose $(i,j)$ entry is $1$ if $i=j$ and $0$ otherwise. For example,
\[ I_1=\begin{bmatrix}1\end{bmatrix},\quad I_2=\begin{bmatrix}1&0\\0&1\end{bmatrix},\quad I_3=\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}.\]

=== Proposition ===
  - $I_nA=A$ for any $n\times m$ matrix $A$;
  - $AI_m=A$ for any $n\times m$ matrix $A$; and
  - $I_nB=B=BI_n$ for any $n\times n$ matrix $B$. In particular, $I_n$ commutes with every square $n\times n$ matrix $B$.

=== Proof of the proposition, continued ===

2. The proof that $AI_m=A$ for any $n\times m$ matrix $A$ is similar to the first part of the proof.

3. If $B$ is any $n\times n$ matrix, then $I_nB=B$ by part 1 and $BI_n=B$ by part 2, so $I_nB=B=BI_n$. In particular, $I_nB=BI_n$, so $I_n$ commutes with $B$, for every square $n\times n$ matrix $B$. ■
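The row-column definition of the product, and the proposition about $I_n$, can be spot-checked with a short computation. Here is a minimal sketch in Python (the helper names `matmul` and `identity` are ours, not from the notes):

```python
def matmul(A, B):
    # matrices as lists of rows; (AB)_{i,j} = row_i(A) . col_j(B)
    n, m = len(A), len(A[0])
    assert m == len(B), "product undefined unless the inner sizes agree"
    k = len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

def identity(n):
    # I_n: the (i,j) entry is 1 if i == j, and 0 otherwise
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

B = [[2, -1, 0], [4, 3, 5], [7, 0, 1]]
print(matmul(identity(3), B) == B)  # True
print(matmul(B, identity(3)) == B)  # True
```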
===== Algebraic properties of matrix multiplication =====

==== The associative law ====

=== Proposition: the associative law ===

Matrix multiplication is //associative//: whenever the products $(AB)C$ and $A(BC)$ are defined, we have $(AB)C=A(BC)$.

We omit the proof, but it is not terribly difficult; it is a calculation in which you write down two formulae for the $(i,j)$ entries of $(AB)C$ and $A(BC)$, and carefully check that they are equal using the fact that if $a,b,c$ are real numbers, then $(ab)c=a(bc)$.
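Associativity can also be illustrated numerically. A small sketch, assuming matrices stored as plain Python lists of rows (the `matmul` helper is ours):

```python
def matmul(A, B):
    # (AB)_{i,j} = row_i(A) . col_j(B)
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2, 3], [4, 5, 6]]      # 2x3
B = [[1, 0], [2, -1], [0, 3]]   # 3x2
C = [[2, 1], [1, 2]]            # 2x2

# both bracketings give the same 2x2 matrix
print(matmul(matmul(A, B), C) == matmul(A, matmul(B, C)))  # True
```

Of course, a single numerical check is not a proof; the proposition is what guarantees this for all choices of $A$, $B$, $C$.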
=== Example ===

We saw in the examples above that a matrix $A$ can commute with $B=AA$. The associative law explains why this always happens: in the computation below, the equality marked $*$ uses associativity.
\[ AB=A(AA)\stackrel{*}{=}(AA)A=BA.\]
The same argument for any square matrix $A$ gives a proof of:

=== Proposition ===

If $A$ is any square matrix, then $A$ commutes with $A^2$. ■

The powers of a square matrix $A$ are defined by $A^1=A$, and $A^{k+1}=A(A^k)$ for $k\in \mathbb{N}$. Using [[wp>Mathematical induction|induction]], the same argument proves:

=== Proposition ===

If $A$ is any square matrix and $k\in\mathbb{N}$, then $A$ commutes with $A^k$. ■
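The recursive definition of powers translates directly into code, and the proposition can then be checked for a sample matrix. A sketch (the helper names `matmul` and `power` are ours):

```python
def matmul(A, B):
    # (AB)_{i,j} = row_i(A) . col_j(B)
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def power(A, k):
    # A^1 = A, and A^(k+1) = A (A^k)
    return A if k == 1 else matmul(A, power(A, k - 1))

A = [[1, 1], [0, 2]]  # a sample non-diagonal square matrix

# A commutes with A^k for k = 1, ..., 5
print(all(matmul(A, power(A, k)) == matmul(power(A, k), A)
          for k in range(1, 6)))  # True
```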
=== Which matrices commute? ===

Suppose $A$ is an $n\times m$ matrix and $B$ is an $\ell\times k$ matrix, and $A$ and $B$ commute, i.e., $AB=BA$. Then:
  * $AB$ must be defined, so $m=\ell$;
  * $BA$ must be defined, so $k=n$; and
  * $AB$ is an $n\times k$ matrix and $BA$ is an $\ell\times m$ matrix. Since $AB$ has the same size as $BA$, we must have $n=\ell$ and $k=m$.
Putting this together: if $A$ and $B$ commute, then $A$ and $B$ must both be $n\times n$ matrices for some number $n$. In other words, they must be //square matrices of the same size//. On the other hand, square matrices of the same size need not commute. Because it's not true in general that $AB=BA$, we say that **matrix multiplication is not commutative**.

==== The distributive laws ====

=== Lemma: the distributive laws for row-column multiplication ===
  - If $a$ is a $1\times m$ row vector and $b$ and $c$ are $m\times 1$ column vectors, then $a\cdot (b+c)=a\cdot b+a\cdot c$.
  - If $b$ and $c$ are $1\times m$ row vectors and $a$ is an $m\times 1$ column vector, then $(b+c)\cdot a=b\cdot a+c\cdot a$.
The proof is an exercise (see tutorial worksheet 5).
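Before proving the lemma on the worksheet, it is easy to spot-check it numerically. A small sketch (the `dot` helper is ours, standing for the row-column product):

```python
def dot(row, col):
    # row-column product: a . b = a_1 b_1 + ... + a_m b_m
    return sum(r * c for r, c in zip(row, col))

a = [1, -2, 3]
b = [4, 0, 1]
c = [2, 5, -1]
bc = [x + y for x, y in zip(b, c)]  # the entrywise sum b + c

print(dot(a, bc) == dot(a, b) + dot(a, c))  # True (part 1)
print(dot(bc, a) == dot(b, a) + dot(c, a))  # True (part 2)
```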
=== Proposition: the distributive laws ===

If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, then:
  - $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and
  - $(B+C)A=BA+CA$ for any $k\times n$ matrices $B$ and $C$.
In other words, $A(B+C)=AB+AC$ whenever the matrix products make sense, and similarly $(B+C)A=BA+CA$ whenever this makes sense.
| + | |||
| + | ===Proof=== | ||
| + | 1. First note that | ||
| + | * $B$ and $C$ are both $m\times k$, so $B+C$ is $m\times k$ by the definition of [[matrix addition]]; | ||
| + | * $A$ is $n\times m$ and $B+C$ is $m\times k$, so $A(B+C)$ is $m\times k$ by the definition of [[matrix multiplication]]; | ||
| + | * $AB$ and $AC$ are both $n\times k$ by the definition of matrix multiplication | ||
| + | * so $AB+AC$ is $n\times k$ by the definition of matrix addition. | ||
| + | So we have (rather long-windedly) checked that $A(B+C)$ and $AB+AC$ have the [[same size]]. | ||
| + | |||
| + | By the Lemma above, the [[row-column product]] has the property that \[a\cdot (b+c)=a\cdot b+a\cdot c.\] | ||
| + | So the $(i,j)$ entry of $A(B+C)$ is | ||
| + | \begin{align*}\def\row{\text{row}}\def\col{\text{col}} | ||
| + | \text{row}_i(A)\cdot \col_j(B+C) &= \text{row}_i(A)\cdot \big(\col_j(B)+\col_j(C)\big) | ||
| + | \\ &= \text{row}_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C).\end{align*} | ||
| + | On the other hand, | ||
| + | |||
| + | * the $(i,j)$ entry of $AB$ is $\text{row}_i(A)\cdot \col_j(B)$; and | ||
| + | * the $(i,j)$ entry of $AC$ is $\row_i(A)\cdot\col_j(C)$; | ||
| + | * so the $(i, | ||
| + | |||
| + | So the entries of $A(B+C)$ and $AB+AC$ are all equal, so $A(B+C)=AB+AC$. | ||
| + | |||
| + | 2. The proof is similar, and is left as an exercise.■ | ||
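Part 1 of the proposition can likewise be illustrated numerically. A sketch assuming list-of-lists matrices (`matmul` and `matadd` are our helper names):

```python
def matmul(A, B):
    # (AB)_{i,j} = row_i(A) . col_j(B)
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matadd(B, C):
    # entrywise sum of two matrices of the same size
    return [[b + c for b, c in zip(rB, rC)] for rB, rC in zip(B, C)]

A = [[1, 2, 0], [3, -1, 4]]     # 2x3
B = [[1, 1], [0, 2], [5, -3]]   # 3x2
C = [[2, 0], [1, 1], [-1, 4]]   # 3x2

# the distributive law A(B + C) = AB + AC for this sample
print(matmul(A, matadd(B, C)) == matadd(matmul(A, B), matmul(A, C)))  # True
```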
lecture_9.1424178426.txt.gz · Last modified: by rupert
