~~REVEAL~~
  
==== Row-column & matrix multiplication ====

  * The **row-column product** of a $1\times n$ row vector $a$ and an $n\times 1$ column vector $b$ is defined by \[\!\!\!\!\!\!\!\!\!\!ab=[\begin{smallmatrix}a_1&a_2&\dots&a_n\end{smallmatrix}]\left[\begin{smallmatrix}b_1\\b_2\\\vdots\\b_n\end{smallmatrix}\right]=a_1b_1+a_2b_2+\dots+a_nb_n.\]

  * $AB=$ matrix of all "row-of-$A$ times col-of-$B$" products
  * \[\!\!\!\!\!\!\!\!\!\!\!\!\!\!\! \def\r{\left[\begin{smallmatrix}1&0&5\end{smallmatrix}\right]}\def\rr{\left[\begin{smallmatrix}2&-1&3\end{smallmatrix}\right]}\left[\begin{smallmatrix}1&0&5\\2&-1&3\end{smallmatrix}\right]\left[\begin{smallmatrix} 1&2\\3&4\\5&6\end{smallmatrix}\right]\def\s{\left[\begin{smallmatrix}1\\3\\5\end{smallmatrix}\right]}\def\ss{\left[\begin{smallmatrix}2\\4\\6\end{smallmatrix}\right]}=\left[\begin{smallmatrix}{\r\s}&{\r\ss}\\{\rr\s}&{\rr\ss}\end{smallmatrix}\right]=\left[\begin{smallmatrix}26&32\\14&18\end{smallmatrix}\right].\]
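  * e.g. the $(1,1)$ entry here is the first row of the left matrix times the first column of the right matrix: \[\left[\begin{smallmatrix}1&0&5\end{smallmatrix}\right]\left[\begin{smallmatrix}1\\3\\5\end{smallmatrix}\right]=1\cdot 1+0\cdot 3+5\cdot 5=26.\]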

==== Matrix multiplication: the definition ====
  
  * Let $A,B$ be matrices, with sizes
  
==== Example 2 ====
If $A=\mat{1&2\\3&4\\5&6}$, $B=\mat{2&1&1\\1&2&0\\1&0&2\\2&2&1}$ and $C=\mat{1&3&0&7\\0&4&6&8}$,
  * $A$ is $3\times 2$, $B$ is $4\times 3$ and $C$ is $2\times 4$, so
  * $AB$, $CA$ and $BC$ don't exist (undefined);
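  * but $BA$ ($4\times 2$), $AC$ ($3\times 4$) and $CB$ ($2\times 3$) do exist; e.g. the $(1,1)$ entry of $BA$ is \[\text{row}_1(B)\cdot\text{col}_1(A)=2\cdot 1+1\cdot 3+1\cdot 5=10.\]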
{{page>commute}}
  
  * Because it's not true in general that $AB=BA$, we say that **matrix multiplication is not commutative**.
  
==== Commuting matrices II ====
  
  * What can we say about a pair of commuting matrices?
  * Suppose $AB=BA$ and think about sizes.
    * $A$: $n\times m$
  
  * If $A$ and $B$ commute, they must be square matrices of the same size.
  * **Some** pairs of square matrices $A$ and $B$ of the same size do commute...
  * ...but not all!
  * See examples above.
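  * e.g. one concrete non-commuting pair, easy to check by hand: \[\left[\begin{smallmatrix}1&1\\0&1\end{smallmatrix}\right]\left[\begin{smallmatrix}1&0\\1&1\end{smallmatrix}\right]=\left[\begin{smallmatrix}2&1\\1&1\end{smallmatrix}\right]\ne\left[\begin{smallmatrix}1&1\\1&2\end{smallmatrix}\right]=\left[\begin{smallmatrix}1&0\\1&1\end{smallmatrix}\right]\left[\begin{smallmatrix}1&1\\0&1\end{smallmatrix}\right].\]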
==== Proof that $I_nA=A$ for $A$: $n\times m$ ====
  
  * $I_nA$ is $n\times m$ (from definition of matrix multiplication)
  * So $I_nA$ has same size as $A$
  * $\text{row}_i(I_n)=[0~0~\dots~0~1~0~\dots~0]$, with $1$ in $i$th place
  * $\text{col}_j(A)=\mat{a_{1j}\\a_{2j}\\\vdots\\a_{nj}}$
  * So $(i,j)$ entry of $I_nA$ is \[\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\!\text{row}_i(I_n)\cdot \text{col}_j(A)= 0a_{1j}+0a_{2j}+\dots+1a_{ij}+\dots+0a_{nj} =a_{ij}\]
  * same as $(i,j)$ entry of $A$.
  * So $I_nA=A$
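  * e.g. with $n=2$, $m=3$: \[I_2\left[\begin{smallmatrix}1&2&3\\4&5&6\end{smallmatrix}\right]=\left[\begin{smallmatrix}1&0\\0&1\end{smallmatrix}\right]\left[\begin{smallmatrix}1&2&3\\4&5&6\end{smallmatrix}\right]=\left[\begin{smallmatrix}1&2&3\\4&5&6\end{smallmatrix}\right].\]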
  
==== More proofs ====

  * Proof that $A=AI_m$ for $A$: $n\times m$ is very similar (exercise)
  * Now if $B$ is $n\times n$, take $n=m$ and $A=B$ above:
    * $I_nB=B$ and $BI_n=B$
    * So $I_nB=B=BI_n$
    * So $I_n$ commutes with $B$, for any $n\times n$ matrix $B$.
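  * Quick concrete check: for $B=\left[\begin{smallmatrix}1&2\\3&4\end{smallmatrix}\right]$, \[I_2B=\left[\begin{smallmatrix}1&2\\3&4\end{smallmatrix}\right]=BI_2.\]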

===== Algebraic properties of matrix multiplication =====
==== The associative law ====
=== Proposition: associativity of matrix multiplication ===
Matrix multiplication is //associative//. This means that $(AB)C=A(BC)$ whenever $A,B,C$ are matrices which can be multiplied together in this order.

We omit the proof, but it is not terribly difficult: it is a calculation in which you write down formulae for the $(i,j)$ entries of $(AB)C$ and $A(BC)$, and carefully check that they are equal using the fact that if $a,b,c$ are real numbers, then $(ab)c=a(bc)$.
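
In summation notation, the heart of that calculation is the following identity (a sketch, for $A$: $n\times m$, $B$: $m\times k$ and $C$: $k\times p$):
\[ \big((AB)C\big)_{ij}=\sum_{l=1}^{k}\Big(\sum_{r=1}^{m}a_{ir}b_{rl}\Big)c_{lj}=\sum_{r=1}^{m}a_{ir}\Big(\sum_{l=1}^{k}b_{rl}c_{lj}\Big)=\big(A(BC)\big)_{ij},\]
where the two double sums agree because each is the sum of all the products $a_{ir}b_{rl}c_{lj}$.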

=== Example ===
We saw above that $\newcommand{\m}[1]{\begin{bmatrix}#1\end{bmatrix}}A=\m{1&2\\3&4}$ commutes with $B=\m{7&10\\15&22}$. We can explain why this is so using associativity. You can check that $B=AA$ (which we usually write as $B=A^2$). Hence, using associativity at $\stackrel*=$,
\[ AB=A(AA)\stackrel*=(AA)A=BA.\]
The same argument for any square matrix $A$ gives a proof of:
=== Proposition ===
If $A$ is any square matrix, then $A$ commutes with $A^2$. ■

Using [[wp>mathematical induction]], you can prove a more general fact:
=== Proposition: a square matrix commutes with its powers ===
If $A$ is any square matrix and $k\in\mathbb{N}$, then $A$ commutes with $A^k$. ■
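
A sketch of the induction step: assuming $A^kA=AA^k$ (the inductive hypothesis) and writing $A^{k+1}=AA^k$, associativity gives
\[ A^{k+1}A=(AA^k)A=A(A^kA)=A(AA^k)=AA^{k+1}.\]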

==== The distributive laws ====

=== Proposition: the distributive laws ===
If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, then:
  - $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and
  - $(B+C)A=BA+CA$ for any $k\times n$ matrices $B$ and $C$.
In other words, $A(B+C)=AB+AC$ whenever the matrix products make sense, and similarly $(B+C)A=BA+CA$ whenever this makes sense.

=== Proof ===
1. First note that
  * $B$ and $C$ are both $m\times k$, so $B+C$ is $m\times k$ by the definition of [[matrix addition]];
  * $A$ is $n\times m$ and $B+C$ is $m\times k$, so $A(B+C)$ is $n\times k$ by the definition of [[matrix multiplication]];
  * $AB$ and $AC$ are both $n\times k$ by the definition of matrix multiplication;
  * so $AB+AC$ is $n\times k$ by the definition of matrix addition.
So we have (rather long-windedly) checked that $A(B+C)$ and $AB+AC$ have the [[same size]].

Recall from tutorial 4 that if $a$ is a $1\times m$ row vector and $b$ and $c$ are $m\times 1$ column vectors, then the [[row-column product]] has the property that \[a\cdot (b+c)=a\cdot b+a\cdot c.\]
So the $(i,j)$ entry of $A(B+C)$ is
\begin{align*}\def\row{\text{row}}\def\col{\text{col}}
\row_i(A)\cdot \col_j(B+C) &= \row_i(A)\cdot \big(\col_j(B)+\col_j(C)\big)
\\ &= \row_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C).\end{align*}
On the other hand,

  * the $(i,j)$ entry of $AB$ is $\row_i(A)\cdot \col_j(B)$; and
  * the $(i,j)$ entry of $AC$ is $\row_i(A)\cdot\col_j(C)$;
  * so the $(i,j)$ entry of $AB+AC$ is also $\row_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C)$.

So the entries of $A(B+C)$ and $AB+AC$ are all equal, and hence $A(B+C)=AB+AC$.

2. The proof is similar, and is left as an exercise. ■
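
=== Example ===
A quick numerical check of law 1, with $n=m=k=2$: taking $A=\m{1&2\\3&4}$, $B=\m{1&0\\0&1}$ and $C=\m{0&1\\1&0}$,
\[ A(B+C)=\m{1&2\\3&4}\m{1&1\\1&1}=\m{3&3\\7&7}=\m{1&2\\3&4}+\m{2&1\\4&3}=AB+AC.\]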

===== Matrix equations =====

We've seen that a single linear equation can be written using [[row-column multiplication]]. For example,
\[ 2x-3y+z=8\]
can be written as
\[ \def\m#1{\begin{bmatrix}#1\end{bmatrix}}\m{2&-3&1}\m{x\\y\\z}=8\]
or
\[ a\vec x=8\]
where $a=\m{2&-3&1}$ and $\vec x=\m{x\\y\\z}$.

We can write a whole [[system of linear equations]] in a similar way, as a matrix equation using [[matrix multiplication]]. For example, we can rewrite the linear system
\begin{align*} 2x-3y+z&=8\\ y-z&=4\\x+y+z&=0\end{align*}
as
\[ \m{2&-3&1\\0&1&-1\\1&1&1}\m{x\\y\\z}=\m{8\\4\\0},\]
or
\[ A\vec x=\vec b\]
where $A=\m{2&-3&1\\0&1&-1\\1&1&1}$, $\vec x=\m{x\\y\\z}$ and $\vec b=\m{8\\4\\0}$. (We write the little arrow above column vectors so that we don't confuse $\vec x$, a column vector of variables, with $x$, a single variable.)

More generally, any linear system
\begin{align*} a_{11}x_1+a_{12}x_2+\dots+a_{1m}x_m&=b_1\\ a_{21}x_1+a_{22}x_2+\dots+a_{2m}x_m&=b_2\\ \hphantom{a_{11}}\vdots \hphantom{x_1+a_{22}}\vdots\hphantom{x_2+\dots+{}a_{nn}} \vdots\ & \hphantom{{}={}\!} \vdots\\ a_{n1}x_1+a_{n2}x_2+\dots+a_{nm}x_m&=b_n \end{align*}
can be written in the form
\[ A\vec x=\vec b\]
where $A$ is the $n\times m$ matrix whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system), called the **coefficient matrix** of the linear system, and $\vec x=\m{x_1\\x_2\\\vdots\\x_m}$ and $\vec b=\m{b_1\\b_2\\\vdots\\b_n}$.

More generally still, we might want to solve a matrix equation like \[AX=B\] where $A$, $X$ and $B$ are matrices of any size, with $A$ and $B$ fixed matrices and $X$ a matrix of unknown variables. Because of the definition of [[matrix multiplication]], if $A$ is $n\times m$, we need $B$ to be $n\times k$ for some $k$, and then $X$ must be $m\times k$, so we know the size of any solution $X$. But which $m\times k$ matrices $X$ are solutions?

=== Example ===
  
If $A=\m{1&0\\0&0}$ and $B=0_{2\times 3}$, then any solution $X$ to $AX=B$ must be $2\times 3$.
  
One solution is $X=0_{2\times 3}$, since in this case we have $AX=A0_{2\times 3}=0_{2\times 3}$.
  
However, this is not the only solution. For example, $X=\m{0&0&0\\1&2&3}$ is another solution, since in this case \[AX=\m{1&0\\0&0}\m{0&0&0\\1&2&3}=\m{0&0&0\\0&0&0}=0_{2\times 3}.\]
  
So from this example, we see that a matrix equation can have many solutions.
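
In fact, for this $A$ and $B$ we can describe all the solutions: if $X=\m{x_1&x_2&x_3\\y_1&y_2&y_3}$, then $AX=\m{x_1&x_2&x_3\\0&0&0}$, so $AX=0_{2\times 3}$ precisely when $x_1=x_2=x_3=0$, with $y_1,y_2,y_3$ arbitrary. So this matrix equation has infinitely many solutions.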