~~REVEAL~~
==== Row-column & matrix multiplication ====

  * The **row-column product** of a $1\times m$ row vector $a$ and an $m\times 1$ column vector $b$ is defined by \[\!\!\!\!\!\!\!\!\!\!ab=[\begin{smallmatrix}a_1&a_2&\dots&a_m\end{smallmatrix}]\left[\begin{smallmatrix}b_1\\b_2\\\vdots\\b_m\end{smallmatrix}\right]=a_1b_1+a_2b_2+\dots+a_mb_m\]

  * $AB=$ matrix of all "row-column products" of the rows of $A$ with the columns of $B$
  * \[\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!AB=\left[\begin{smallmatrix}\text{row}_1(A)\cdot\text{col}_1(B)&\cdots&\text{row}_1(A)\cdot\text{col}_k(B)\\\vdots&&\vdots\\\text{row}_n(A)\cdot\text{col}_1(B)&\cdots&\text{row}_n(A)\cdot\text{col}_k(B)\end{smallmatrix}\right]\]

==== Matrix multiplication: the definition ====
  * Let $A,B$ be matrices, with sizes $n\times m$ and $m\times k$ respectively. Then $AB$ is defined, and it is the $n\times k$ matrix whose $(i,j)$ entry is $\text{row}_i(A)\cdot\text{col}_j(B)$.
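For concreteness, here is a small NumPy sketch of this definition (the particular matrices are illustrative choices, not taken from the slides):

<code python>
import numpy as np

# A is 2x3 and B is 3x2, so the product AB is defined and has size 2x2.
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[1, 0],
              [0, 1],
              [2, 2]])

AB = A @ B
print(AB.shape)                       # (2, 2)

# The (0, 1) entry of AB is row 0 of A "dot" column 1 of B.
print(AB[0, 1] == A[0, :] @ B[:, 1])  # True
</code>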
==== Example 2 ====
If $A$, $B$ and $C$ are matrices of sizes $3\times 2$, $4\times 3$ and $2\times 4$ respectively, which products of two of these matrices are defined?
  * $A$ is $3\times 2$, $B$ is $4\times 3$ and $C$ is $2\times 4$, so:
  * $AB$, $CA$ and $BC$ don't exist (undefined);
  * $BA$, $AC$ and $CB$ do exist, with sizes $4\times 2$, $3\times 4$ and $2\times 3$ respectively.
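A quick NumPy check of these size rules; the entries below are placeholder zeros, since only the shapes matter here:

<code python>
import numpy as np

A = np.zeros((3, 2))
B = np.zeros((4, 3))
C = np.zeros((2, 4))

products = {"AB": (A, B), "BA": (B, A), "AC": (A, C),
            "CA": (C, A), "BC": (B, C), "CB": (C, B)}
for name, (X, Y) in products.items():
    try:
        print(name, "is defined, size", (X @ Y).shape)
    except ValueError:
        print(name, "is undefined")
</code>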
{{page>
  * Because it is **not** true that $AB=BA$ for every choice of matrices $A$ and $B$, we say that **matrix multiplication is not commutative**.
==== Commuting matrices II ====
  * What can we say about a pair of commuting matrices?
  * Suppose $AB=BA$ and think about sizes.
  * $A$: $n\times m$
==== Commuting matrices III ====
  * If $A$ and $B$ commute, they must be square matrices of the same size.
  * **Some** square matrices $A$ and $B$ of the same size commute...
  * ...but not all!
  * See examples above.
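An illustrative NumPy sketch (these particular matrices are not from the slides): one pair of same-size square matrices which does not commute, and one pair which does.

<code python>
import numpy as np

# Not all square matrices of the same size commute...
A = np.array([[1, 1],
              [0, 1]])
B = np.array([[1, 0],
              [1, 1]])
print(np.array_equal(A @ B, B @ A))      # False

# ...but some do: for example, two diagonal matrices of the same size.
D1 = np.diag([2, 3])
D2 = np.diag([5, 7])
print(np.array_equal(D1 @ D2, D2 @ D1))  # True
</code>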
==== Proof that $I_nA=A$ for $A$: $n\times m$ ====
  * $\text{row}_i(I_n)=[0~0~\dots~0~1~0~\dots~0]$, with the $1$ in the $i$th entry
  * $\text{col}_j(A)=\mat{a_{1j}\\a_{2j}\\\vdots\\a_{nj}}$
  * So $(i,j)$ entry of $I_nA$ is \[\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\text{row}_i(I_n)\cdot \text{col}_j(A)= 0a_{1j}+0a_{2j}+\dots+1a_{ij}+\dots+0a_{nj} =a_{ij}\]
  * same as $(i,j)$ entry of $A$.
  * So $I_nA=A$

==== More proofs ====
  * Proof that $A=AI_m$ for $A$: $n\times m$ is very similar (exercise)
  * Now if $B$ is $n\times n$:
  * $I_nB=B$ and $BI_n=B$
  * So $I_nB=B=BI_n$
  * So $I_n$ commutes with $B$, for any $n\times n$ matrix $B$ ■
| - | ===== Algebraic properties of matrix multiplication ===== | + | |
| - | ==== The associative law ==== | + | |
| - | === Proposition: | + | |
| - | Matrix multiplication is // | + | |
| - | + | ||
| - | We omit the proof, but this is not terribly difficult; it is a calculation in which you write down two formulae for the $(i,j)$ entries of $(AB)C$ and $A(BC)$, and carefully check they are equal using the fact that if $a,b,c$ are real numbers, then $(ab)c=a(bc)$. | + | |
| - | + | ||
| - | === Example === | + | |
| - | We saw above that $\newcommand{\m}[1]{\begin{bmatrix}# | + | |
| - | \[ AB=A(AA)\stackrel*=(AA)A=BA.\] | + | |
| - | The same argument for any square matrix $A$ gives a proof of: | + | |
| - | === Proposition === | + | |
| - | If $A$ is any square matrix, then $A$ commutes with $A^2$.■ | + | |
| - | + | ||
| - | Using [[wp> | + | |
| - | ===Proposition: | + | |
| - | If $A$ is any square matrix and $k\in\mathbb{N}$, | + | |
| - | + | ||
| - | + | ||
| - | ====The distributive laws==== | + | |
| - | + | ||
| - | === Proposition: | + | |
| - | If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, | + | |
| - | - $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and | + | |
| - | - $(B+C)A=BA=CA$ for any $k\times n$ matrices $B$ and $C$. | + | |
| - | In other words, $A(B+C)=AB+AC$ whenever the matrix products make sense, and similarly $(B+C)A=BA+CA$ whenever this makes sense. | + | |
| - | + | ||
| - | ===Proof=== | + | |
| - | 1. First note that | + | |
| - | * $B$ and $C$ are both $m\times k$, so $B+C$ is $m\times k$ by the definition of [[matrix addition]]; | + | |
| - | * $A$ is $n\times m$ and $B+C$ is $m\times k$, so $A(B+C)$ is $m\times k$ by the definition of [[matrix multiplication]]; | + | |
| - | * $AB$ and $AC$ are both $n\times k$ by the definition of matrix multiplication | + | |
| - | * so $AB+AC$ is $n\times k$ by the definition of matrix addition. | + | |
| - | So we have (rather long-windedly) checked that $A(B+C)$ and $AB+AC$ have the [[same size]]. | + | |
| - | + | ||
| - | Recall that in tutorial 4 we saw that if $a$ is a $1\times m$ row vector and $b$ and $c$ are $m\times 1$ column vectors, then the [[row-column product]] has the property that \[a\cdot (b+c)=a\cdot b+a\cdot c.\] | + | |
| - | So the $(i,j)$ entry of $A(B+C)$ is | + | |
| - | \begin{align*}\def\row{\text{row}}\def\col{\text{col}} | + | |
| - | \text{row}_i(A)\cdot \col_j(B+C) &= \text{row}_i(A)\cdot \big(\col_j(B)+\col_j(C)\big) | + | |
| - | \\ &= \text{row}_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C).\end{align*} | + | |
| - | On the other hand, | + | |
| - | + | ||
| - | * the $(i,j)$ entry of $AB$ is $\text{row}_i(A)\cdot \col_j(B)$; and | + | |
| - | * the $(i,j)$ entry of $AC$ is $\row_i(A)\cdot\col_j(C)$; | + | |
| - | * so the $(i, | + | |
| - | + | ||
| - | So the entries of $A(B+C)$ and $AB+AC$ are all equal, so $A(B+C)=AB+AC$. | + | |
| - | + | ||
| - | 2. The proof is similar, and is left as an exercise.■ | + | |
| - | + | ||
| - | ===== Matrix equations ===== | + | |
| - | + | ||
| - | We've seen that a single linear equation can be written using [[row-column multiplication]]. For example, | + | |
| - | \[ 2x-3y+z=8\] | + | |
| - | can be written as | + | |
| - | \[ \def\m# | + | |
| - | or | + | |
| - | \[ a\vec x=8\] | + | |
| - | where $a=\m{2& | + | |
| - | + | ||
| - | We can write a whole [[system of linear equations]] in a similar way, as a matrix equation using [[matrix multiplication]]. For example we can rewrite the linear system | + | |
| - | \begin{align*} 2x-3y+z& | + | |
| - | as | + | |
| - | \[ \m{2& | + | |
| - | or | + | |
| - | \[ A\vec x=\vec b\] | + | |
| - | where $A=\m{2& | + | |
| - | + | ||
| - | More generally, any linear system | + | |
| - | \begin{align*} a_{11}x_1+a_{12}x_2+\dots+a_{1m}x_m& | + | |
| - | can be written in the form | + | |
| - | \[ A\vec x=\vec b\] | + | |
| - | where $A$ is the $n\times m $ matrix, called the **coefficient matrix** of the linear system, whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system) and $\vec x=\m{x_1\\x_2\\\vdots\\x_m}$, | + | |
| - | + | ||
| - | More generally still, we might want to solve a matrix equation like \[AX=B\] where $A$, $X$ and $B$ are matrices of any size, with $A$ and $B$ fixed matrices and $X$ a matrix of unknown variables. Because of the definition of [[matrix multiplication]], | + | |
| - | + | ||
| - | === Example | + | |
| - | If $A=\m{1& | + | * Proof that $A=AI_m$ for $A$: $n\times m$ is very similar (exercise) |
| + | * Now if $B$ is $n\times | ||
| + | * $I_nB=B$ and $BI_n=B$ | ||
| + | * So $I_nB=B=BI_n$ | ||
| + | * So $I_n$ commutes with $B$, for any $n\times | ||
| - | One solution is $X=0_{2\times 3}$, since in this case we have $AX=A0_{2\times 3}=0_{2\times 3}$. | ||
| - | However, this is not the only solution. For example, $X=\m{0& | ||
| - | So from this example, we see that a matrix equation can have many solutions. | ||
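A NumPy illustration with a hypothetical singular $2\times 2$ matrix $A$ (not the matrix used in the lecture), exhibiting two different solutions of $AX=0_{2\times 3}$:

<code python>
import numpy as np

A = np.array([[1, 2],
              [2, 4]])                  # singular: second row = 2 * first row

# The zero matrix is always a solution of A X = 0_{2x3}...
X0 = np.zeros((2, 3))
print(np.array_equal(A @ X0, np.zeros((2, 3))))   # True

# ...but for this A it is not the only solution.
X1 = np.array([[ 2, -4,  6],
               [-1,  2, -3]])
print(np.array_equal(A @ X1, np.zeros((2, 3))))   # True as well
</code>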