↓ Slide 1
Recap: matrix multiplication and the identity matrix
$AB$ defined if $A:n\times m$ and $B:m\times k$
Sometimes $AB=BA$ (need $A,B$ to both be $n\times n$)
But often $AB\ne BA$ (even if $A,B$ both $n\times n$)
$n\times n$ identity matrix $I_n$: $1$s on diagonal, zeros elsewhere
$I_n$ commutes with every $n\times n$ matrix, in a nice way…
Last time: proved that $I_nA=A$ for any $n\times m$ matrix $A$.
Proof that $AI_m=A$ for any $n\times m$ matrix $A$ is similar (exercise!)
If $B$ is any $n\times n$ matrix, then
$I_nB=B$ by part 1
and $BI_n=B$ by part 2
so $I_nB=B=BI_n$
In particular, $I_nB=BI_n$
So $I_n$ commutes with $B$, for every square $n\times n$ matrix $B$. ■
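To illustrate the non-commuting case mentioned above, here is a standard $2\times 2$ example (added for illustration, not from the original slide):\[A=\begin{pmatrix}1&1\\0&1\end{pmatrix},\quad B=\begin{pmatrix}1&0\\1&1\end{pmatrix},\quad AB=\begin{pmatrix}2&1\\1&1\end{pmatrix},\quad BA=\begin{pmatrix}1&1\\1&2\end{pmatrix},\]so $AB\ne BA$.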
→ Slide 2
Algebraic properties of matrix multiplication
↓ Slide 3
The associative law
Matrix multiplication is associative.
This means that \[(AB)C=A(BC)\] whenever $A,B,C$ are matrices which can be multiplied together in this order.
Proof isn't too difficult but we skip it
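For reference, here is a sketch of the omitted proof (added, not on the slide): if $A$ is $n\times m$, $B$ is $m\times p$ and $C$ is $p\times k$, then comparing $(i,j)$ entries,\[\big((AB)C\big)_{ij}=\sum_{l=1}^{p}\Big(\sum_{q=1}^{m}a_{iq}b_{ql}\Big)c_{lj}=\sum_{q=1}^{m}a_{iq}\Big(\sum_{l=1}^{p}b_{ql}c_{lj}\Big)=\big(A(BC)\big)_{ij},\]and both products are $n\times k$, so $(AB)C=A(BC)$.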
↓ Slide 4
An example using the associative law $(AB)C=A(BC)$
$B=AA$ (usually write as $B=A^2$).
Using associativity at the step marked $*$, we get $AB=A(AA)\stackrel{*}{=}(AA)A=BA.$
The same argument for any square matrix $A$ gives a proof of:
Proposition
For any square matrix $A$,
$A$ commutes with $A^2$.■
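A quick numerical check (added example, not from the slide): with $A=\begin{pmatrix}1&1\\0&1\end{pmatrix}$ we get $A^2=\begin{pmatrix}1&2\\0&1\end{pmatrix}$, and indeed\[AA^2=\begin{pmatrix}1&3\\0&1\end{pmatrix}=A^2A.\]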
↓ Slide 5
Powers of a square matrix $A$
↓ Slide 6
Proposition: a square matrix commutes with its powers
For any square matrix $A$, and any $k\in\mathbb{N}$,
$A$ commutes with $A^k$.■
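The slide states this without proof; a sketch by induction on $k$ (added here): the case $k=1$ is trivial and $k=2$ is the previous proposition. If $AA^k=A^kA$, then, using associativity and writing $A^{k+1}=A^kA$,\[AA^{k+1}=A(A^kA)=(AA^k)A=(A^kA)A=A^{k+1}A,\]so $A$ commutes with $A^{k+1}$ as well.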
↓ Slide 7
The distributive laws
Proposition: the distributive laws
If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, then:
$A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and
$(B+C)A=BA+CA$ for any $k\times n$ matrices $B$ and $C$.
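A quick numerical illustration of the first law (added example, not from the slide): with\[A=\begin{pmatrix}1&2\\3&4\end{pmatrix},\quad B=\begin{pmatrix}1&0\\0&1\end{pmatrix},\quad C=\begin{pmatrix}0&1\\1&0\end{pmatrix},\]we get $A(B+C)=\begin{pmatrix}3&3\\7&7\end{pmatrix}$ and $AB+AC=\begin{pmatrix}1&2\\3&4\end{pmatrix}+\begin{pmatrix}2&1\\4&3\end{pmatrix}=\begin{pmatrix}3&3\\7&7\end{pmatrix}$, as expected.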
↓ Slide 8
Proof that $A(B+C)=AB+AC$
↓ Slide 9
$A(B+C)=AB+AC$ continued
In tutorial 4: $a\cdot (b+c)=a\cdot b+a\cdot c$ (row-col product)
(for $a$: $1\times m$ and $b,c$: $m\times 1$)
Write $\def\row{\text{row}}\def\col{\text{col}}a_i=\row_i(A)$, $b_j=\col_j(B)$, $c_j=\col_j(C)$.
$(i,j)$ entry of $A(B+C)$ is:\begin{align*}\row_i(A)\cdot \col_j(B+C) &= a_i\cdot \big(b_j+c_j\big)\\ &= a_i\cdot b_j+a_i\cdot c_j.\end{align*}
$(i,j)$ entry of $AB$ is $a_i\cdot b_j$; and
$(i,j)$ entry of $AC$ is $a_i\cdot c_j$;
so $(i,j)$ entry of $AB+AC$ is also $a_i\cdot b_j+a_i\cdot c_j$
Same sizes and same entries, so $A(B+C)=AB+AC$. ■
↓ Slide 10
Proof that $(B+C)A=BA+CA$
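The argument mirrors the previous proof, now working row by row; a brief sketch (added, assuming the same setup): writing $b_i=\text{row}_i(B)$, $c_i=\text{row}_i(C)$ and $a_j=\text{col}_j(A)$, the $(i,j)$ entry of $(B+C)A$ is\[\text{row}_i(B+C)\cdot\text{col}_j(A)=(b_i+c_i)\cdot a_j=b_i\cdot a_j+c_i\cdot a_j,\]which is exactly the $(i,j)$ entry of $BA+CA$; the sizes agree too, so $(B+C)A=BA+CA$.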