lecture_9

=== Proof of the proposition ===
  
1. We want to show that $I_nA=A$ for any $n\times m$ matrix $A$. These matrices have the [[same size]]: $I_n$ has size $n\times n$ and $A$ has size $n\times m$, so $I_nA$ has size $n\times m$ by the definition of [[matrix multiplication]], which is the same as the size of $A$.

Note that $\text{row}_i(I_n)=[0~0~\dots~0~1~0~\dots~0]$, where the $1$ is in the $i$th place, by the definition of the [[identity matrix]] $I_n$; and the $j$th column of $A$ is $\begin{bmatrix}a_{1j}\\a_{2j}\\\vdots\\a_{nj}\end{bmatrix}$. The $(i,j)$ entry of $I_nA$ is $\text{row}_i(I_n)\cdot \text{col}_j(A)$, by the definition of [[matrix multiplication]], which is therefore
\begin{align*} [0~0~\dots~0~1~0~\dots~0]\begin{bmatrix}a_{1j}\\a_{2j}\\\vdots\\a_{nj}\end{bmatrix} &= 0a_{1j}+0a_{2j}+\dots+0a_{i-1,j}+1a_{ij}+0a_{i+1,j}+\dots+0a_{nj} \\&= a_{ij}.\end{align*}
So the matrices $I_nA$ and $A$ have the same $(i,j)$ entry, for every $(i,j)$. So $I_nA=A$.

2. The proof that $AI_m=A$ for any $n\times m$ matrix $A$ is similar to the first part of the proof; the details are left as an exercise.
  
3. If $B$ is any $n\times n$ matrix, then $I_nB=B$ by part 1 and $BI_n=B$ by part 2, so $I_nB=B=BI_n$. In particular, $I_nB=BI_n$, so $I_n$ commutes with $B$, for every square $n\times n$ matrix $B$. ■
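The identity laws can be checked numerically. Here is a quick sketch in Python; the `matmul` and `identity` helpers are hand-rolled for illustration and are not part of the notes:

```python
# Hand-rolled helpers for illustration; the names are ours, not from the notes.

def matmul(A, B):
    """Matrix product via row-column products, as in the definition."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def identity(n):
    """The n-by-n identity matrix I_n."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

# An arbitrary 3x3 matrix B.
B = [[2, -1, 0],
     [4,  3, 5],
     [1,  0, 7]]
I3 = identity(3)

# I_3 B = B = B I_3, so I_3 commutes with B.
print(matmul(I3, B) == B)  # True
print(matmul(B, I3) == B)  # True
```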
If $A$ is any square matrix, then $A$ commutes with $A^2$.■
  
The powers of a square matrix $A$ are defined by $A^1=A$, and $A^{k+1}=A(A^k)$ for $k\in \mathbb{N}$. Using [[wp>mathematical induction]], you can prove the following more general proposition.
===Proposition: a square matrix commutes with its powers===
If $A$ is any square matrix and $k\in\mathbb{N}$, then $A$ commutes with $A^k$.■
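As a concrete sanity check, here is a small Python sketch (the helper names are ours, chosen for illustration) verifying that a matrix commutes with its cube, using the recursive definition of powers above:

```python
def matmul(A, B):
    """Matrix product via row-column products."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def power(A, k):
    """A^k defined recursively: A^1 = A and A^(k+1) = A(A^k)."""
    return A if k == 1 else matmul(A, power(A, k - 1))

# An arbitrary 2x2 matrix.
A = [[1, 2],
     [3, 4]]

# A commutes with A^3, as the proposition predicts.
print(matmul(A, power(A, 3)) == matmul(power(A, 3), A))  # True
```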
====The distributive laws====
  
=== Lemma: the distributive laws for row-column multiplication ===

  - If $a$ is a $1\times m$ row vector and $b$ and $c$ are $m\times 1$ column vectors, then $a\cdot (b+c)=a\cdot b+a\cdot c$.
  - If $b$ and $c$ are $1\times m$ row vectors and $a$ is an $m\times 1$ column vector, then $(b+c)\cdot a=b\cdot a+c\cdot a$.

The proof is an exercise (see tutorial worksheet 5).

=== Proposition: the distributive laws for matrix multiplication ===
If $A$ is an $n\times m$ matrix and $k\in\mathbb{N}$, then:
  - $A(B+C)=AB+AC$ for any $m\times k$ matrices $B$ and $C$; and
  - $(B+C)A=BA+CA$ for any $k\times n$ matrices $B$ and $C$.
In other words, $A(B+C)=AB+AC$ whenever the matrix products make sense, and similarly $(B+C)A=BA+CA$ whenever this makes sense.
  
=== Proof ===

1. First we check that $A(B+C)$ and $AB+AC$ have the same size: $B$ and $C$ are $m\times k$, so $B+C$ is $m\times k$ and hence $A(B+C)$ is $n\times k$; likewise $AB$ and $AC$ are both $n\times k$, so $AB+AC$ is $n\times k$.
So we have (rather long-windedly) checked that $A(B+C)$ and $AB+AC$ have the [[same size]].
  
By the Lemma above, the [[row-column product]] has the property that \[a\cdot (b+c)=a\cdot b+a\cdot c.\]
So the $(i,j)$ entry of $A(B+C)$ is
\begin{align*}\def\row{\text{row}}\def\col{\text{col}}
\row_i(A)\cdot \col_j(B+C) &= \row_i(A)\cdot (\col_j(B)+\col_j(C))\\
&= \row_i(A)\cdot \col_j(B)+\row_i(A)\cdot \col_j(C),
\end{align*}
using the definition of [[matrix addition]] and then part 1 of the Lemma. This is the $(i,j)$ entry of $AB$ plus the $(i,j)$ entry of $AC$, which is the $(i,j)$ entry of $AB+AC$. So $A(B+C)$ and $AB+AC$ have the same $(i,j)$ entries, and therefore $A(B+C)=AB+AC$.
2. The proof is similar, and is left as an exercise.■
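Both distributive laws can be checked on small examples. Here is a Python sketch; the helpers are hand-rolled and the matrices are arbitrary choices of ours, not from the notes:

```python
# Hand-rolled helpers; the names are ours, for illustration only.

def matmul(A, B):
    """Matrix product via row-column products."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def matadd(B, C):
    """Entrywise sum of two matrices of the same size."""
    return [[B[i][j] + C[i][j] for j in range(len(B[0]))]
            for i in range(len(B))]

A = [[1, 0, 2],
     [-1, 3, 1]]               # 2x3

# First law: A(B+C) = AB + AC with B, C of size 3x2.
B = [[3, 1], [2, 1], [1, 0]]
C = [[1, 1], [0, 2], [5, 2]]
print(matmul(A, matadd(B, C)) == matadd(matmul(A, B), matmul(A, C)))  # True

# Second law: (E+F)A = EA + FA with E, F of size 2x2.
E = [[1, 2], [3, 4]]
F = [[0, 5], [1, 1]]
print(matmul(matadd(E, F), A) == matadd(matmul(E, A), matmul(F, A)))  # True
```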
  
===== Matrix equations =====

We've seen that a single linear equation can be written using [[row-column multiplication]]. For example,
\[ 2x-3y+z=8\]
can be written as
\[ \def\m#1{\begin{bmatrix}#1\end{bmatrix}}\m{2&-3&1}\m{x\\y\\z}=8\]
or
\[ a\vec x=8\]
where $a=\m{2&-3&1}$ and $\vec x=\m{x\\y\\z}$.

We can write a whole [[system of linear equations]] in a similar way, as a matrix equation using [[matrix multiplication]]. For example, we can rewrite the linear system
\begin{align*} 2x-3y+z&=8\\ y-z&=4\\x+y+z&=0\end{align*}
as
\[ \m{2&-3&1\\0&1&-1\\1&1&1}\m{x\\y\\z}=\m{8\\4\\0},\]
or
\[ A\vec x=\vec b\]
where $A=\m{2&-3&1\\0&1&-1\\1&1&1}$, $\vec x=\m{x\\y\\z}$ and $\vec b=\m{8\\4\\0}$. (We write the little arrow above the column vectors here because otherwise we might confuse $\vec x$, a column vector of variables, with $x$, a single variable.)
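To see the rewriting $A\vec x=\vec b$ in action, here is a Python sketch; the particular solution vector below is our own (found by back-substitution) and is not from the notes:

```python
from fractions import Fraction as F

# The coefficient matrix and right-hand side of the example system.
A = [[2, -3, 1],
     [0,  1, -1],
     [1,  1, 1]]
b = [8, 4, 0]

# One solution of the system, found by back-substitution (ours, not from the notes).
x = [F(16, 3), F(-2, 3), F(-14, 3)]

# Computing A x reproduces b, so the single matrix equation A x = b
# encodes all three linear equations at once.
Ax = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
print(Ax == b)  # True
```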
More generally, any linear system
\begin{align*} a_{11}x_1+a_{12}x_2+\dots+a_{1m}x_m&=b_1\\ a_{21}x_1+a_{22}x_2+\dots+a_{2m}x_m&=b_2\\ \hphantom{a_{11}}\vdots \hphantom{x_1+a_{22}}\vdots\hphantom{x_2+\dots+{}a_{nn}} \vdots\ & \hphantom{{}={}\!} \vdots\\ a_{n1}x_1+a_{n2}x_2+\dots+a_{nm}x_m&=b_n \end{align*}
can be written in the form
\[ A\vec x=\vec b\]
where $A$ is the $n\times m$ matrix, called the **coefficient matrix** of the linear system, whose $(i,j)$ entry is $a_{ij}$ (the number in front of $x_j$ in the $i$th equation of the system), and $\vec x=\m{x_1\\x_2\\\vdots\\x_m}$ and $\vec b=\m{b_1\\b_2\\\vdots\\b_n}$.

More generally still, we might want to solve a matrix equation like \[AX=B\] where $A$, $X$ and $B$ are matrices of any size, with $A$ and $B$ fixed matrices and $X$ a matrix of unknown variables. Because of the definition of [[matrix multiplication]], if $A$ is $n\times m$, we need $B$ to be $n\times k$ for some $k$, and then $X$ must be $m\times k$, so we know the size of any solution $X$. But which $m\times k$ matrices $X$ are solutions?

=== Example ===

If $A=\m{1&0\\0&0}$ and $B=0_{2\times 3}$, then any solution $X$ to $AX=B$ must be $2\times 3$.

One solution is $X=0_{2\times 3}$, since in this case we have $AX=A0_{2\times 3}=0_{2\times 3}$.

However, this is not the only solution. For example, $X=\m{0&0&0\\1&2&3}$ is another solution, since in this case \[AX=\m{1&0\\0&0}\m{0&0&0\\1&2&3}=\m{0&0&0\\0&0&0}=0_{2\times 3}.\]

So from this example, we see that a matrix equation can have many solutions.
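A Python sketch checking both solutions from the example above (the `matmul` helper is hand-rolled; the matrices are those of the example):

```python
def matmul(A, B):
    """Matrix product via row-column products."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 0],
     [0, 0]]
Z = [[0, 0, 0],
     [0, 0, 0]]        # the zero matrix 0_{2x3}

X1 = [[0, 0, 0],
      [0, 0, 0]]       # the zero solution
X2 = [[0, 0, 0],
      [1, 2, 3]]       # a second, nonzero solution

# Both X1 and X2 satisfy AX = 0_{2x3}.
print(matmul(A, X1) == Z)  # True
print(matmul(A, X2) == Z)  # True
```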
lecture_9 · Last modified: 2017/02/21 10:02 by rupert