=== Proof of the proposition, continued ===
  
2. To show that $AI_m=A$ for any $n\times m$ matrix $A$ is similar to the first part of the proof; the details are left as an exercise.
  
3. If $B$ is any $n\times n$ matrix, then $I_nB=B$ by part 1 and $BI_n=B$ by part 2, so $I_nB=B=BI_n$. In particular, $I_nB=BI_n$, so $I_n$ commutes with $B$, for every square $n\times n$ matrix $B$.
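As a sanity check (not part of the notes), part 3 can be verified numerically on a small example. This Python sketch uses illustrative helpers `matmul` and `identity` implementing the row-column product and $I_n$:

```python
def matmul(A, B):
    """Row-column matrix product: entry (i, j) is row_i(A) . col_j(B)."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def identity(n):
    """The n x n identity matrix I_n."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

B = [[2, 5], [7, 1]]   # an arbitrary 2 x 2 matrix
I2 = identity(2)
print(matmul(I2, B) == B == matmul(B, I2))  # True: I_n B = B = B I_n
```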
  
===== Algebraic properties of matrix multiplication =====
==== The associative law ====
=== Proposition: associativity of matrix multiplication ===
Matrix multiplication is //associative//. This means that $(AB)C=A(BC)$ whenever $A$, $B$, $C$ are matrices which can be multiplied together in this order.
  
We omit the proof, but it is not terribly difficult: it is a calculation in which you write down formulae for the $(i,j)$ entries of $(AB)C$ and $A(BC)$, and carefully check that they are equal using the fact that $(ab)c=a(bc)$ for all real numbers $a,b,c$.
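The proposition can at least be spot-checked numerically. A minimal Python sketch (the helper `matmul` and the sample matrices are illustrative, not from the notes):

```python
def matmul(A, B):
    """Row-column matrix product: entry (i, j) is row_i(A) . col_j(B)."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Sizes chosen so the products make sense: A is 2x3, B is 3x2, C is 2x2.
A = [[1, 0, 5], [2, -1, 3]]
B = [[1, 2], [3, 4], [5, 6]]
C = [[1, 3], [0, 4]]
print(matmul(matmul(A, B), C) == matmul(A, matmul(B, C)))  # True: (AB)C = A(BC)
```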
  
=== Example ===
We saw above that $\newcommand{\m}[1]{\begin{bmatrix}#1\end{bmatrix}}A=\m{1&2\\3&4}$ commutes with $B=\m{7&10\\15&22}$. We can explain why this is so using associativity. You can check that $B=AA$ (which we usually write as $B=A^2$). Hence, using associativity at $\stackrel*=$,
\[ AB=A(AA)\stackrel*=(AA)A=BA.\]
The same argument for any square matrix $A$ gives a proof of:
=== Proposition ===
If $A$ is any square matrix, then $A$ commutes with $A^2$.
  
The powers of a square matrix $A$ are defined by $A^1=A$ and $A^{k+1}=A(A^k)$ for $k\in \mathbb{N}$. Using [[wp>mathematical induction]], you can prove the following more general proposition.
=== Proposition: a square matrix commutes with its powers ===
If $A$ is any square matrix and $k\in\mathbb{N}$, then $A$ commutes with $A^k$.
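This can be spot-checked for the matrix $A$ from the example above. The Python sketch below (with an illustrative `matmul` helper) computes the powers via $A^{k+1}=A(A^k)$ and checks that $A$ commutes with each:

```python
def matmul(A, B):
    """Row-column matrix product: entry (i, j) is row_i(A) . col_j(B)."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
Ak = A                          # A^1 = A
for k in range(1, 6):
    # A commutes with A^k: both sides equal A^(k+1)
    print(k, matmul(A, Ak) == matmul(Ak, A))
    Ak = matmul(A, Ak)          # A^(k+1) = A(A^k)
```

Note that the second power computed here is $A^2=\m{7&10\\15&22}$, the matrix $B$ from the example.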
  
  
==== The distributive laws ====
  
=== Lemma: the distributive laws for row-column multiplication ===
  
  - If $a$ is a $1\times m$ row vector and $b$ and $c$ are $m\times 1$ column vectors, then $a\cdot (b+c)=a\cdot b+a\cdot c$.
  - If $b$ and $c$ are $1\times m$ row vectors and $a$ is an $m\times 1$ column vector, then $(b+c)\cdot a=b\cdot a+c\cdot a$.

The proof is an exercise (see tutorial worksheet 5).
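A numerical check of part 1 of the Lemma (not a proof) can be written in a few lines of Python; `dot` is an illustrative helper for the row-column product, with vectors written as flat lists:

```python
def dot(a, b):
    """Row-column product of a 1 x m row with an m x 1 column."""
    return sum(x * y for x, y in zip(a, b))

a = [1, -2, 3]                       # a 1 x 3 row vector
b = [4, 0, 5]                        # 3 x 1 column vectors
c = [-1, 2, 6]
bc = [x + y for x, y in zip(b, c)]   # the column vector b + c
print(dot(a, bc) == dot(a, b) + dot(a, c))  # True: a.(b+c) = a.b + a.c
```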

=== Proposition: the distributive laws for matrix multiplication ===
If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, then:
  - $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and
  - $(B+C)A=BA+CA$ for any $k\times n$ matrices $B$ and $C$.
In other words, $A(B+C)=AB+AC$ whenever the matrix products make sense, and similarly $(B+C)A=BA+CA$ whenever this makes sense.

=== Proof ===
1. First note that
  * $B$ and $C$ are both $m\times k$, so $B+C$ is $m\times k$ by the definition of [[matrix addition]];
  * $A$ is $n\times m$ and $B+C$ is $m\times k$, so $A(B+C)$ is $n\times k$ by the definition of [[matrix multiplication]];
  * $AB$ and $AC$ are both $n\times k$ by the definition of matrix multiplication;
  * so $AB+AC$ is $n\times k$ by the definition of matrix addition.
So we have (rather long-windedly) checked that $A(B+C)$ and $AB+AC$ have the [[same size]].

By the Lemma above, the [[row-column product]] has the property that \[a\cdot (b+c)=a\cdot b+a\cdot c.\]
So the $(i,j)$ entry of $A(B+C)$ is
\begin{align*}\def\row{\text{row}}\def\col{\text{col}}
\row_i(A)\cdot \col_j(B+C) &= \row_i(A)\cdot \big(\col_j(B)+\col_j(C)\big)
\\ &= \row_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C).\end{align*}
On the other hand,

  * the $(i,j)$ entry of $AB$ is $\row_i(A)\cdot \col_j(B)$; and
  * the $(i,j)$ entry of $AC$ is $\row_i(A)\cdot\col_j(C)$;
  * so the $(i,j)$ entry of $AB+AC$ is also $\row_i(A)\cdot \col_j(B)+\row_i(A)\cdot\col_j(C)$.

So the entries of $A(B+C)$ and $AB+AC$ are all equal, so $A(B+C)=AB+AC$.

2. The proof is similar, and is left as an exercise. ■
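Both distributive laws can be spot-checked numerically. A minimal Python sketch with illustrative helpers `matmul` (row-column product) and `madd` (entrywise matrix addition):

```python
def matmul(A, B):
    """Row-column matrix product: entry (i, j) is row_i(A) . col_j(B)."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def madd(A, B):
    """Entrywise sum of two matrices of the same size."""
    return [[x + y for x, y in zip(r, s)] for r, s in zip(A, B)]

A = [[1, 0, 5], [2, -1, 3]]     # 2 x 3
B = [[1, 2], [3, 4], [5, 6]]    # 3 x 2
C = [[0, 1], [2, 0], [1, 1]]    # 3 x 2
print(matmul(A, madd(B, C)) == madd(matmul(A, B), matmul(A, C)))  # A(B+C) = AB+AC
print(matmul(madd(B, C), A) == madd(matmul(B, A), matmul(C, A)))  # (B+C)A = BA+CA
```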
  
