lecture_8_slides

↓ Slide 1

Matrix multiplication

  • Let $A,B$ be matrices, with sizes
    • $A$: $n\times m$
    • $B$: $m\times k$
  • The product $AB$ is: the $n\times k$ matrix whose $(i,j)$ entry is \[ (AB)_{i,j} = \text{row}_i(A)\cdot \text{col}_j(B)\]
  • So the entries of $AB$ are all possible row-column products of a row of $A$ with a column of $B$
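The definition above can be sketched directly in code. Here is a minimal pure-Python implementation using lists of lists in row-major order (the helper name `matmul` is our own choice, not anything from the notes):

```python
# Row-by-column definition of the matrix product.
# A is n x m, B is m x k; the result AB is n x k.

def matmul(A, B):
    """Return AB, where (AB)[i][j] = row_i(A) . col_j(B)."""
    n, m = len(A), len(A[0])
    assert m == len(B), "inner sizes must agree"
    k = len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```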
↓ Slide 2

"Compatible" sizes for $AB$ to be defined

  • We need the sizes of $A$ and $B$ to be “compatible” for $AB$ to be defined
  • Need $A$: $n\times m$ and $B$: $m\times k$ (same numbers “in the middle”)
  • If $A,B$ are matrices, with sizes
    • $A$: $n\times m$
    • $B$: $\ell\times k$ with $\ell\ne m$,
  • then the matrix product $AB$ is undefined.
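The compatibility rule can be checked mechanically. A small hypothetical helper (`product_size` is our name, used only for illustration) returns the size of $AB$, or `None` when the product is undefined:

```python
# Size check for AB: defined exactly when A is n x m and B is m x k.

def product_size(A, B):
    """Return (n, k), the size of AB, or None if AB is undefined."""
    n, m = len(A), len(A[0])
    l, k = len(B), len(B[0])
    return (n, k) if m == l else None

A = [[1, 0, 5], [2, -1, 3]]   # 2 x 3
B = [[1, 2], [3, 4], [5, 6]]  # 3 x 2
print(product_size(A, B))  # (2, 2): AB is defined
print(product_size(A, A))  # None: inner sizes 3 and 2 differ
```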
↓ Slide 3

Example 1

If $\newcommand{\mat}[1]{\left[\begin{smallmatrix}#1\end{smallmatrix}\right]} A=\mat{1&0&5\\2&-1&3}$ and $B=\mat{1&2\\3&4\\5&6}$,

  • $AB=\mat{26&32\\14&18}$
  • $BA=\mat{5&-2&11\\11&-4&27\\17&-6&43}$.
  • Note that $AB$ and $BA$ are both defined, but $AB\ne BA$
    • $AB$ and $BA$ don't even have the same size.
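Both products can be verified by direct computation; a pure-Python spot check using the row-by-column rule:

```python
# Checking Example 1 by direct computation.

def matmul(A, B):
    n, m, k = len(A), len(A[0]), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

A = [[1, 0, 5], [2, -1, 3]]   # 2 x 3
B = [[1, 2], [3, 4], [5, 6]]  # 3 x 2
print(matmul(A, B))  # 2 x 2: [[26, 32], [14, 18]]
print(matmul(B, A))  # 3 x 3: [[5, -2, 11], [11, -4, 27], [17, -6, 43]]
```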
↓ Slide 4

Example 2

If $A=\mat{1&2\\3&4\\5&6}$, $B=\mat{2&1&1\\1&2&0\\1&0&2\\2&2&1}$ and $C=\mat{1&3&0&7\\0&4&6&8}$,

  • $A$ is $3\times 2$, $B$ is $4\times 3$ and $C$ is $2\times 4$, so
  • $AB$, $CA$ and $BC$ don't exist (undefined);
  • $AC$ exists and is $3\times 4$;
  • $BA$ exists and is $4\times 2$; and
  • $CB$ exists and is $2\times 3$.
  • In particular, $AB\ne BA$ and $AC\ne CA$ and $BC\ne CB$ (undefined vs defined!)
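The six size claims above can be checked from the sizes alone; a short pure-Python sketch (the helper name is our own):

```python
# Which of the six products in Example 2 are defined?
# Only the sizes (rows, cols) of A, B and C are needed.

sizes = {"A": (3, 2), "B": (4, 3), "C": (2, 4)}

def product_size(p, q):
    """Size of the product pq, or None if it is undefined."""
    (n, m), (l, k) = sizes[p], sizes[q]
    return (n, k) if m == l else None

for p in "ABC":
    for q in "ABC":
        if p != q:
            print(p + q, product_size(p, q))
# AB None, AC (3, 4), BA (4, 2), BC None, CA None, CB (2, 3)
```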
↓ Slide 5

Example 3

If $A=\mat{0&1\\0&0}$ and $B=\mat{0&0\\1&0}$, then

  • $AB=\mat{1&0\\0&0}$
  • $BA=\mat{0&0\\0&1}$.
  • So $AB$ and $BA$ are both defined and have the same size, but they are not equal matrices: $AB\ne BA$.
↓ Slide 6

Example 4

If $A=0_{n\times n}$ is the $n\times n$ zero matrix and $B$ is any $n\times n$ matrix, then

  • $AB=0_{n\times n}$, and
  • $BA=0_{n\times n}$.
  • So in this case, we do have $AB=BA$.
↓ Slide 7

Example 5

If $A=\mat{1&2\\3&4}$ and $B=\mat{7&10\\15&22}$, then

  • $AB=\mat{37&54\\81&118}$
  • $BA=\mat{37&54\\81&118}$
  • So $AB=BA$ for these particular matrices $A$ and $B$.
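Why these particular matrices commute is explained on slide 17: $B=A^2$. A quick numerical check in pure Python:

```python
# Example 5's B equals A*A, so A and B commute
# (A(AA) = (AA)A by associativity). Checked numerically:

def matmul(A, B):
    n, m, k = len(A), len(A[0]), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[7, 10], [15, 22]]
assert matmul(A, A) == B             # B = A^2
assert matmul(A, B) == matmul(B, A)  # A and B commute
print(matmul(A, B))  # [[37, 54], [81, 118]]
```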
↓ Slide 8

Example 6

If $A=\mat{1&2\\3&4}$ and $B=\mat{6&10\\15&22}$, then

  • $AB=\mat{36&54\\78&118}$
  • $BA= \mat{36&52\\81&118}$
  • So $AB\ne BA$.
↓ Slide 9

Commuting matrices I

We say that matrices $A$ and $B$ commute if $AB=BA$.

  • Because it's not true in general that $AB=BA$, we say that matrix multiplication is not commutative.
↓ Slide 10

Commuting matrices II

  • What can we say about commuting matrices?
  • Suppose $AB=BA$ and think about sizes.
    • $A$: $n\times m$
    • $B$: $\ell\times k$
  • $AB$ is defined, so $m=\ell$.
  • $BA$ is defined, so $k=n$.
  • $AB$ is $n\times k$ and $BA$ is $\ell\times m$, so $n=\ell$ and $k=m$. So $n=\ell=m=k$!
  • $A$ and $B$ must both be $n\times n$: they're square matrices of the same size.
↓ Slide 11

Commuting matrices III

  • If $A$ and $B$ commute, they must be square matrices of the same size.
  • Some square matrices $A$ and $B$ of the same size commute…
  • …but not all!
  • See examples above.
↓ Slide 12

The $n\times n$ identity matrix

The $n\times n$ identity matrix is the $n\times n$ matrix $I_n$ with $1$s in every diagonal entry (that is, in the $(i,i)$ entry for every $i$ between $1$ and $n$), and $0$s in every other entry. So \[ I_n=\begin{bmatrix} 1&0&0&\dots&0\\0&1&0&\dots&0\\0&0&1&\dots&0\\\vdots & & &\ddots & \vdots\\0&0&0&\dots&1\end{bmatrix}.\]

↓ Slide 13

Examples

  1. $I_1=[1]$
  2. $I_2=\mat{1&0\\0&1}$
  3. $I_3=\mat{1&0&0\\0&1&0\\0&0&1}$
  4. $I_4=\mat{1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1}$, and so on!
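The pattern above can be sketched as a one-line construction of $I_n$ in pure Python (illustrative only; the helper name is ours):

```python
# I_n as a list of lists: 1 on the diagonal, 0 elsewhere.

def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

print(identity(3))  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```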
↓ Slide 14

Properties of $I_n$

  1. $I_nA=A$ for any $n\times m$ matrix $A$;
  2. $AI_m=A$ for any $n\times m$ matrix $A$; and
  3. $I_nB=B=BI_n$ for any $n\times n$ matrix $B$.
    • In particular, $I_n$ commutes with every $n\times n$ matrix $B$.
↓ Slide 15

Proof that $I_nA=A$ for $A$: $n\times m$

  • $I_nA$ is $n\times m$ (from definition of matrix multiplication)
  • So $I_nA$ has same size as $A$
  • $\text{row}_i(I_n)=[0~0~\dots~0~1~0~\dots~0]$, with $1$ in $i$th place
  • $\text{col}_j(A)=\mat{a_{1j}\\a_{2j}\\\vdots\\a_{nj}}$
  • So $(i,j)$ entry of $I_nA$ is \[\!\!\!\!\!\!\text{row}_i(I_n)\cdot \text{col}_j(A)= 0a_{1j}+0a_{2j}+\dots+0a_{i-1,j}+1a_{ij}+0a_{i+1,j}+\dots+0a_{nj} =a_{ij}\]
  • same as $(i,j)$ entry of $A$.
  • So $I_nA=A$

2. The proof that $AI_m=A$ for any $n\times m$ matrix $A$ is similar; the details are left as an exercise.

3. If $B$ is any $n\times n$ matrix, then $I_nB=B$ by part 1 and $BI_n=B$ by part 2, so $I_nB=B=BI_n$. In particular, $I_nB=BI_n$ so $I_n$ commutes with $B$, for every square $n\times n$ matrix $B$. ■
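Properties 1 and 2 can be spot-checked numerically for a sample $2\times 3$ matrix (a pure-Python sketch; the helper names are our own):

```python
# Check I_2 A = A and A I_3 = A for a sample 2 x 3 matrix A.

def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    n, m, k = len(A), len(A[0]), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

A = [[1, 0, 5], [2, -1, 3]]         # 2 x 3
assert matmul(identity(2), A) == A  # I_n A = A with n = 2
assert matmul(A, identity(3)) == A  # A I_m = A with m = 3
print("both identities hold")
```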

→ Slide 16

Algebraic properties of matrix multiplication

↓ Slide 17

The associative law

Proposition: associativity of matrix multiplication

Matrix multiplication is associative. This means that $(AB)C=A(BC)$ whenever $A,B,C$ are matrices which can be multiplied together in this order.

We omit the proof, but this is not terribly difficult; it is a calculation in which you write down two formulae for the $(i,j)$ entries of $(AB)C$ and $A(BC)$, and carefully check they are equal using the fact that if $a,b,c$ are real numbers, then $(ab)c=a(bc)$.

Example

We saw above that $\newcommand{\m}[1]{\begin{bmatrix}#1\end{bmatrix}}A=\m{1&2\\3&4}$ commutes with $B=\m{7&10\\15&22}$. We can explain why this is so using associativity. You can check that $B=AA$ (which we usually write as $B=A^2$). Hence, using associativity at $\stackrel*=$, \[ AB=A(AA)\stackrel*=(AA)A=BA.\] The same argument for any square matrix $A$ gives a proof of:

Proposition

If $A$ is any square matrix, then $A$ commutes with $A^2$.■

Using mathematical induction, you can prove a more general fact:

Proposition: a square matrix commutes with its powers

If $A$ is any square matrix and $k\in\mathbb{N}$, then $A$ commutes with $A^k$.■
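The proposition can be spot-checked numerically: for a sample square matrix $A$, we verify that $A$ commutes with $A^k$ for several values of $k$ (a pure-Python sketch):

```python
# A commutes with its powers: check A A^k == A^k A for k = 1..5.

def matmul(A, B):
    n, m, k = len(A), len(A[0]), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
P = A  # P runs through A^1, A^2, ...
for k in range(1, 6):
    assert matmul(A, P) == matmul(P, A)
    P = matmul(P, A)
print("A commutes with A^k for k = 1, ..., 5")
```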

↓ Slide 18

The distributive laws

Proposition: the distributive laws

If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, then:

  1. $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and
  2. $(B+C)A=BA+CA$ for any $k\times n$ matrices $B$ and $C$.

In other words, $A(B+C)=AB+AC$ whenever the matrix products make sense, and similarly $(B+C)A=BA+CA$ whenever this makes sense.

Proof

1. First note that

  • $B$ and $C$ are both $m\times k$, so $B+C$ is $m\times k$ by the definition of matrix addition;
  • $A$ is $n\times m$ and $B+C$ is $m\times k$, so $A(B+C)$ is $n\times k$ by the definition of matrix multiplication;
  • $AB$ and $AC$ are both $n\times k$ by the definition of matrix multiplication
  • so $AB+AC$ is $n\times k$ by the definition of matrix addition.

So we have (rather long-windedly) checked that $A(B+C)$ and $AB+AC$ have the same size.

Recall that in tutorial 4 we saw that if $a$ is a $1\times m$ row vector and $b$ and $c$ are $m\times 1$ column vectors, then the row-column product has the property that \[a\cdot (b+c)=a\cdot b+a\cdot c.\] So the $(i,j)$ entry of $A(B+C)$ is \begin{align*}\def\row{\text{row}}\def\col{\text{col}} \text{row}_i(A)\cdot \col_j(B+C) &= \text{row}_i(A)\cdot \big(\col_j(B)+\col_j(C)\big) \\ &= \text{row}_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C).\end{align*} On the other hand,

  • the $(i,j)$ entry of $AB$ is $\text{row}_i(A)\cdot \col_j(B)$; and
  • the $(i,j)$ entry of $AC$ is $\row_i(A)\cdot\col_j(C)$;
  • so the $(i,j)$ entry of $AB+AC$ is also $\text{row}_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C)$.

So the entries of $A(B+C)$ and $AB+AC$ are all equal, so $A(B+C)=AB+AC$.

2. The proof is similar, and is left as an exercise.■
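The first distributive law can be spot-checked for sample matrices of compatible sizes (a pure-Python sketch; `matadd` is our own entrywise-addition helper):

```python
# Check A(B+C) == AB + AC for A: 2 x 3 and B, C: 3 x 2.

def matmul(A, B):
    n, m, k = len(A), len(A[0]), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

def matadd(A, B):
    """Entrywise sum of two matrices of the same size."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 0, 5], [2, -1, 3]]
B = [[1, 2], [3, 4], [5, 6]]
C = [[0, 1], [1, 0], [2, 2]]
assert matmul(A, matadd(B, C)) == matadd(matmul(A, B), matmul(A, C))
print("A(B+C) == AB + AC")
```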

→ Slide 19

Matrix equations

We've seen that a single linear equation can be written using row-column multiplication. For example, \[ 2x-3y+z=8\] can be written as \[ \def\m#1{\begin{bmatrix}#1\end{bmatrix}}\m{2&-3&1}\m{x\\y\\z}=8\] or \[ a\vec x=8\] where $a=\m{2&-3&1}$ and $\vec x=\m{x\\y\\z}$.

We can write a whole system of linear equations in a similar way, as a matrix equation using matrix multiplication. For example we can rewrite the linear system \begin{align*} 2x-3y+z&=8\\ y-z&=4\\x+y+z&=0\end{align*} as \[ \m{2&-3&1\\0&1&-1\\1&1&1}\m{x\\y\\z}=\m{8\\4\\0},\] or \[ A\vec x=\vec b\] where $A=\m{2&-3&1\\0&1&-1\\1&1&1}$, $\vec x=\m{x\\y\\z}$ and $\vec b=\m{8\\4\\0}$. (We are writing the little arrow above the column vectors here because otherwise we might get confused between the $\vec x$: a column vector of variables, and $x$: just a single variable).

More generally, any linear system \begin{align*} a_{11}x_1+a_{12}x_2+\dots+a_{1m}x_m&=b_1\\ a_{21}x_1+a_{22}x_2+\dots+a_{2m}x_m&=b_2\\ \hphantom{a_{11}}\vdots \hphantom{x_1+a_{22}}\vdots\hphantom{x_2+\dots+{}a_{nn}} \vdots\ & \hphantom{{}={}\!} \vdots\\ a_{n1}x_1+a_{n2}x_2+\dots+a_{nm}x_m&=b_n \end{align*} can be written in the form \[ A\vec x=\vec b\] where $A$ is the $n\times m $ matrix, called the coefficient matrix of the linear system, whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system) and $\vec x=\m{x_1\\x_2\\\vdots\\x_m}$, and $\vec b=\m{b_1\\b_2\\\vdots\\b_n}$.
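The matrix form can be sanity-checked numerically: multiplying the coefficient matrix of the $3\times 3$ system above by a sample column vector reproduces the three left-hand sides (a pure-Python sketch):

```python
# A x reproduces the left-hand sides 2x - 3y + z, y - z, x + y + z
# of the sample system, here evaluated at (x, y, z) = (1, 2, 3).

def matmul(A, B):
    n, m, k = len(A), len(A[0]), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

A = [[2, -3, 1], [0, 1, -1], [1, 1, 1]]  # coefficient matrix
x, y, z = 1, 2, 3
vec_x = [[x], [y], [z]]                  # column vector of values
print(matmul(A, vec_x))  # [[-1], [-1], [6]]
assert matmul(A, vec_x) == [[2*x - 3*y + z], [y - z], [x + y + z]]
```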

More generally still, we might want to solve a matrix equation like \[AX=B\] where $A$, $X$ and $B$ are matrices of any size, with $A$ and $B$ fixed matrices and $X$ a matrix of unknown variables. Because of the definition of matrix multiplication, if $A$ is $n\times m$, we need $B$ to be $n\times k$ for some $k$, and then $X$ must be $m\times k$, so we know the size of any solution $X$. But which $m\times k$ matrices $X$ are solutions?

Example

If $A=\m{1&0\\0&0}$ and $B=0_{2\times 3}$, then any solution $X$ to $AX=B$ must be $2\times 3$.

One solution is $X=0_{2\times 3}$, since in this case we have $AX=A0_{2\times 3}=0_{2\times 3}$.

However, this is not the only solution. For example, $X=\m{0&0&0\\1&2&3}$ is another solution, since in this case \[AX=\m{1&0\\0&0}\m{0&0&0\\1&2&3}=\m{0&0&0\\0&0&0}=0_{2\times 3}.\]

So from this example, we see that a matrix equation can have many solutions.
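Both claimed solutions can be verified by direct multiplication (a pure-Python sketch):

```python
# Verify that both matrices given above solve A X = 0_{2x3}.

def matmul(A, B):
    n, m, k = len(A), len(A[0]), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

A = [[1, 0], [0, 0]]
zero_2x3 = [[0, 0, 0], [0, 0, 0]]
X1 = [[0, 0, 0], [0, 0, 0]]
X2 = [[0, 0, 0], [1, 2, 3]]
assert matmul(A, X1) == zero_2x3
assert matmul(A, X2) == zero_2x3
print("both X1 and X2 solve AX = 0")
```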

lecture_8_slides.1455644569.txt.gz · Last modified: by rupert
