lecture_9

Proof of the proposition, continued

2. The proof that $AI_m=A$ for any $n\times m$ matrix $A$ is similar to the first part; the details are left as an exercise.

3. If $B$ is any $n\times n$ matrix, then $I_nB=B$ by part 1 and $BI_n=B$ by part 2, so $I_nB=B=BI_n$. In particular, $I_nB=BI_n$, so $I_n$ commutes with every $n\times n$ matrix $B$. ■
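As a quick numerical sanity check (not part of the proof), here is a minimal Python/NumPy sketch of the proposition; the matrix $B$ below is just an arbitrary example.

<code python>
import numpy as np

# An arbitrary 3x3 matrix B and the 3x3 identity matrix I_3.
B = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
I3 = np.eye(3)

# Parts 1 and 2: I_3 B = B and B I_3 = B, so in particular I_3 commutes with B.
print(np.array_equal(I3 @ B, B))   # True
print(np.array_equal(B @ I3, B))   # True
</code>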

Algebraic properties of matrix multiplication

The associative law

Proposition: associativity of matrix multiplication

Matrix multiplication is associative. This means that $(AB)C=A(BC)$ whenever $A,B,C$ are matrices which can be multiplied together in this order.

We omit the proof, but it is not terribly difficult: it is a calculation in which you write down the two formulae for the $(i,j)$ entries of $(AB)C$ and $A(BC)$, and carefully check that they are equal using the fact that if $a,b,c$ are real numbers, then $(ab)c=a(bc)$.
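As a sketch of this calculation, writing $a_{ik}$ for the $(i,k)$ entry of $A$ (and similarly for $B$ and $C$), it amounts to checking that \[ \big((AB)C\big)_{ij}=\sum_{l}\Big(\sum_{k}a_{ik}b_{kl}\Big)c_{lj}=\sum_{k}a_{ik}\Big(\sum_{l}b_{kl}c_{lj}\Big)=\big(A(BC)\big)_{ij}, \] where the middle equality uses $(ab)c=a(bc)$ for real numbers and the fact that a finite double sum can be rearranged.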

Example

We saw above that $\newcommand{\m}[1]{\begin{bmatrix}#1\end{bmatrix}}A=\m{1&2\\3&4}$ commutes with $B=\m{7&10\\15&22}$. We can explain why this is so using associativity. You can check that $B=AA$ (which we usually write as $B=A^2$). Hence, using associativity at $\stackrel*=$, \[ AB=A(AA)\stackrel*=(AA)A=BA.\] The same argument for any square matrix $A$ gives a proof of:

Proposition

If $A$ is any square matrix, then $A$ commutes with $A^2$.■
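Here is a quick NumPy check of the example above (a sanity check only, not a proof):

<code python>
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = A @ A                             # B = A^2 = [[7, 10], [15, 22]]

print(B)
print(np.array_equal(A @ B, B @ A))   # True: A commutes with A^2
</code>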

The powers of a square matrix $A$ are defined by $A^1=A$, and $A^{k+1}=A(A^k)$ for $k\in \mathbb{N}$. Using mathematical induction, you can prove the following more general proposition.
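As an aside, the recursive definition translates directly into code. Here is a minimal Python/NumPy sketch; the function name matrix_power is my own choice for illustration (NumPy's built-in numpy.linalg.matrix_power computes the same thing).

<code python>
import numpy as np

def matrix_power(A, k):
    """Compute A^k for a square matrix A and k >= 1, following the
    recursive definition A^1 = A and A^(k+1) = A(A^k)."""
    if k == 1:
        return A
    return A @ matrix_power(A, k - 1)

A = np.array([[1, 2],
              [3, 4]])
print(matrix_power(A, 2))   # [[ 7 10]
                            #  [15 22]]  -- the matrix B from the example above
</code>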

Proposition: a square matrix commutes with its powers

If $A$ is any square matrix and $k\in\mathbb{N}$, then $A$ commutes with $A^k$.■
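For instance, the inductive step can be sketched as follows: assuming $AA^k=A^kA$ as the inductive hypothesis, and using the definition $A^{k+1}=A(A^k)$ together with associativity, \[ A^{k+1}A=(AA^k)A=A(A^kA)=A(AA^k)=AA^{k+1}. \]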

The distributive laws

Lemma: the distributive laws for row-column multiplication

  1. If $a$ is a $1\times m$ row vector and $b$ and $c$ are $m\times 1$ column vectors, then $a\cdot (b+c)=a\cdot b+a\cdot c$.
  2. If $b$ and $c$ are $1\times m$ row vectors and $a$ is an $m\times 1$ column vector, then $(b+c)\cdot a=b\cdot a+c\cdot a$.

The proof is an exercise (see tutorial worksheet 5).
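Even before doing the exercise, a quick NumPy check can illustrate the lemma (a sanity check only; the vectors below are arbitrary examples, and the row-column product $a\cdot b$ is represented here by the $1\times 1$ array a @ b):

<code python>
import numpy as np

# Part 1: a is a 1x3 row vector, b and c are 3x1 column vectors.
a = np.array([[1.0, -2.0, 3.0]])
b = np.array([[4.0], [5.0], [6.0]])
c = np.array([[7.0], [8.0], [9.0]])
print(np.allclose(a @ (b + c), a @ b + a @ c))       # True: a.(b+c) = a.b + a.c

# Part 2: b_row and c_row are 1x3 row vectors, a_col is a 3x1 column vector.
b_row = np.array([[1.0, 0.0, 2.0]])
c_row = np.array([[3.0, -1.0, 5.0]])
a_col = np.array([[2.0], [4.0], [6.0]])
print(np.allclose((b_row + c_row) @ a_col,
                  b_row @ a_col + c_row @ a_col))    # True: (b+c).a = b.a + c.a
</code>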

Proposition: the distributive laws for matrix multiplication

If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, then:

  1. $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and
  2. $(B+C)A=BA+CA$ for any $k\times n$ matrices $B$ and $C$.

In other words, $A(B+C)=AB+AC$ whenever the matrix products make sense, and similarly $(B+C)A=BA+CA$ whenever this makes sense.

Proof

1. First note that

  • $B$ and $C$ are both $m\times k$, so $B+C$ is $m\times k$ by the definition of matrix addition;
  • $A$ is $n\times m$ and $B+C$ is $m\times k$, so $A(B+C)$ is $n\times k$ by the definition of matrix multiplication;
  • $AB$ and $AC$ are both $n\times k$ by the definition of matrix multiplication;
  • so $AB+AC$ is $n\times k$ by the definition of matrix addition.

So we have (rather long-windedly) checked that $A(B+C)$ and $AB+AC$ have the same size.

By the Lemma above, the row-column product has the property that \[a\cdot (b+c)=a\cdot b+a\cdot c.\] So the $(i,j)$ entry of $A(B+C)$ is \begin{align*} \text{row}_i(A)\cdot \text{col}_j(B+C) &= \text{row}_i(A)\cdot \big(\text{col}_j(B)+\text{col}_j(C)\big) \\ &= \text{row}_i(A)\cdot \text{col}_j(B)+\text{row}_i(A)\cdot\text{col}_j(C).\end{align*} On the other hand,

  • the $(i,j)$ entry of $AB$ is $\text{row}_i(A)\cdot \text{col}_j(B)$; and
  • the $(i,j)$ entry of $AC$ is $\text{row}_i(A)\cdot\text{col}_j(C)$;
  • so the $(i,j)$ entry of $AB+AC$ is also $\text{row}_i(A)\cdot \text{col}_j(B)+\text{row}_i(A)\cdot\text{col}_j(C)$.

So the entries of $A(B+C)$ and $AB+AC$ are all equal, and hence $A(B+C)=AB+AC$.

2. The proof is similar, and is left as an exercise.■
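As with the earlier propositions, here is a quick NumPy check of both distributive laws (a sanity check only; the sizes and entries below are arbitrary, and $D$, $E$ stand in for the $k\times n$ matrices of part 2 to avoid reusing the names $B$ and $C$):

<code python>
import numpy as np

rng = np.random.default_rng(0)           # seeded so the check is reproducible
n, m, k = 2, 3, 4
A = rng.integers(-5, 5, size=(n, m))     # n x m
B = rng.integers(-5, 5, size=(m, k))     # m x k
C = rng.integers(-5, 5, size=(m, k))     # m x k
D = rng.integers(-5, 5, size=(k, n))     # k x n
E = rng.integers(-5, 5, size=(k, n))     # k x n

print(np.array_equal(A @ (B + C), A @ B + A @ C))    # True: A(B+C) = AB + AC
print(np.array_equal((D + E) @ A, D @ A + E @ A))    # True: (D+E)A = DA + EA
</code>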
