lecture_9_slides
Recap: matrix multiplication and the identity matrix
- $AB$ defined if $A:n\times m$ and $B:m\times k$
- then $AB$ is $n\times k$ …
- with $(i,j)$ entry $\text{row}_i(A)\cdot\text{col}_j(B)$
- Sometimes $AB=BA$ (this requires $A$ and $B$ to both be $n\times n$)
- in this case we say $A$ and $B$ commute
- But often $AB\ne BA$ (even if $A,B$ both $n\times n$)
- $n\times n$ identity matrix $I_n$: $1$s on diagonal, zeros elsewhere
- $I_n$ commutes with every $n\times n$ matrix, in a nice way…
- Last time: proved that $I_nA=A$ for any $n\times m$ matrix $A$.
- Proof that $AI_m=A$ for any $n\times m$ matrix $A$ is similar (exercise!)
- If $B$ is any $n\times n$ matrix, then
- $I_nB=B$ by part 1
- and $BI_n=B$ by part 2
- so $I_nB=B=BI_n$
- In particular, $I_nB=BI_n$
- So $I_n$ commutes with $B$, for every square $n\times n$ matrix $B$. ■
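This fact is easy to sanity-check numerically. A minimal sketch in plain Python (the `matmul` and `identity` helpers below are our own illustrative names, not part of the lecture):

```python
def matmul(X, Y):
    """Product of an n x m matrix X and an m x k matrix Y (lists of rows)."""
    n, m, k = len(X), len(Y), len(Y[0])
    return [[sum(X[i][t] * Y[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

def identity(n):
    """The n x n identity matrix I_n: 1s on the diagonal, 0s elsewhere."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

B = [[7, 10], [15, 22]]
I2 = identity(2)
print(matmul(I2, B) == B)   # True: I_2 B = B
print(matmul(B, I2) == B)   # True: B I_2 = B
```

Of course a single numerical check is not a proof; the proof above works for every $n$ and every $B$.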
Algebraic properties of matrix multiplication
The associative law
- Matrix multiplication is associative.
- This means that \[(AB)C=A(BC)\] whenever $A,B,C$ are matrices which can be multiplied together in this order.
- Proof isn't too difficult but we skip it
- it uses the known fact that if $a,b,c$ are real numbers, then $(ab)c=a(bc)$
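The associative law can be checked on concrete matrices; a sketch in plain Python (`matmul` is our own illustrative helper, and the three matrices are made up for the check):

```python
def matmul(X, Y):
    """Product of an n x m matrix X and an m x k matrix Y (lists of rows)."""
    return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [0, 3]]

# (AB)C and A(BC) agree entry by entry
print(matmul(matmul(A, B), C) == matmul(A, matmul(B, C)))  # True
```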
An example using the associative law $(AB)C=A(BC)$
- $\newcommand{\m}[1]{\left[\begin{smallmatrix}#1\end{smallmatrix}\right]}A=\m{1&2\\3&4}$ commutes with $B=\m{7&10\\15&22}$
- Why?
- We can check this by direct calculation, but a calculation doesn't give a “reason”
- We can explain it using associativity…
- $B=AA$ (usually write as $B=A^2$).
- Using associativity, we get $AB=A(AA)\stackrel*=(AA)A=BA.$
The same argument for any square matrix $A$ gives a proof of:
Proposition
For any square matrix $A$,
$A$ commutes with $A^2$.■
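The proposition is easy to test numerically; a sketch in plain Python, with `matmul` an illustrative helper. (For this $A$, the matrix $A^2$ is exactly the matrix $B$ from the example above.)

```python
def matmul(X, Y):
    """Product of an n x m matrix X and an m x k matrix Y (lists of rows)."""
    return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2], [3, 4]]
A2 = matmul(A, A)                      # [[7, 10], [15, 22]]
print(matmul(A, A2) == matmul(A2, A))  # True: A commutes with A^2
```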
Powers of a square matrix $A$
- Define $A^1=A$
- and $A^2=AA$
- and $A^3=AAA=A(A^2)$
- and $A^4=AAAA=A(A^3)$
- ….
- $A^{k+1}=A(A^k)$ for $k\in \mathbb{N}$
Proposition: a square matrix commutes with its powers
For any square matrix $A$, and any $k\in\mathbb{N}$,
$A$ commutes with $A^k$.■
- Proof is by induction on $k$ (exercise).
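The recursion $A^{k+1}=A(A^k)$ translates directly into code, and the proposition can be spot-checked for small $k$; a sketch in plain Python with illustrative helpers `matmul` and `power`:

```python
def matmul(X, Y):
    """Product of an n x m matrix X and an m x k matrix Y (lists of rows)."""
    return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def power(A, k):
    """A^k for k >= 1, computed via the recursion A^{k+1} = A (A^k)."""
    result = A
    for _ in range(k - 1):
        result = matmul(A, result)
    return result

A = [[1, 2], [3, 4]]
for k in range(1, 6):
    assert matmul(A, power(A, k)) == matmul(power(A, k), A)
print("A commutes with A^k for k = 1, ..., 5")
```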
The distributive laws
Proposition: the distributive laws
If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, then:
- $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and
- $(B+C)A=BA+CA$ for any $k\times n$ matrices $B$ and $C$.
- In other words, $A(B+C)=AB+AC$ whenever the matrix products make sense, and similarly $(B+C)A=BA+CA$ whenever this makes sense.
Proof that $A(B+C)=AB+AC$
- Have $A$: $n\times m$, $B$ and $C$: $m\times k$
- So $B+C$: $m\times k$
- So $A(B+C)$: $n\times k$
- and $AB$: $n\times k$ and $AC$: $n\times k$
- So $AB+AC$: $n\times k$
- Conclusion so far: $A(B+C)$ and $AB+AC$ have the same sizes!
$A(B+C)=AB+AC$ continued
- In tutorial 4: $a\cdot (b+c)=a\cdot b+a\cdot c$ (row-col product)
- (for $a$: $1\times m$ and $b,c$: $m\times 1$)
- Write $\def\row{\text{row}}\def\col{\text{col}}a_i=\row_i(A)$, $b_j=\col_j(B)$, $c_j=\col_j(C)$.
- $(i,j)$ entry of $A(B+C)$ is:\begin{align*}\def\xx{\!\!\!\!}\def\xxx{\xx\xx\xx\xx}\xxx\xxx \row_i(A)\cdot \col_j(B+C) &= a_i\cdot \big(b_j+c_j\big)\\ &= a_i\cdot b_j+a_i\cdot c_j.\end{align*}
- $(i,j)$ entry of $AB$ is $a_i\cdot b_j$; and
- $(i,j)$ entry of $AC$ is $a_i\cdot c_j$;
- so $(i,j)$ entry of $AB+AC$ is also $a_i\cdot b_j+a_i\cdot c_j$
- Same sizes and same entries, so $A(B+C)=AB+AC$.
Proof that $(B+C)A=BA+CA$
- This is very similar, and is left as an exercise.■
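Both distributive laws can be spot-checked on rectangular matrices (so the size bookkeeping above is exercised too); a sketch in plain Python with illustrative helpers `matmul` and `matadd`:

```python
def matmul(X, Y):
    """Product of an n x m matrix X and an m x k matrix Y (lists of rows)."""
    return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matadd(X, Y):
    """Entrywise sum of two matrices of the same size."""
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

A = [[1, 0, 2], [0, 1, 3]]      # 2 x 3
B = [[1, 2], [3, 4], [5, 6]]    # 3 x 2
C = [[0, 1], [1, 0], [2, 2]]    # 3 x 2

# A(B+C) = AB + AC, with B and C being m x k
print(matmul(A, matadd(B, C)) == matadd(matmul(A, B), matmul(A, C)))  # True
# (B+C)A = BA + CA, with B and C being k x n; here both products make sense
print(matmul(matadd(B, C), A) == matadd(matmul(B, A), matmul(C, A)))  # True
```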