===== The orthogonal projection of one vector onto another =====

Let $\def\ww{\vec{w}}\def\vv{\vec{v}}\def\uu{\vec{u}}\ww$ be a non-zero vector, and let $\vv$ be any vector. We call a vector $\def\pp{\vec p}\def\nn{\vec{n}}\pp$ the **orthogonal projection of $\vv$ onto $\ww$**, and write $\pp=\def\ppp{\text{proj}_{\ww}\vv}\ppp$, if
  - $\pp$ is in the same direction as $\ww$; and
  - the vector $\nn=\vv-\pp$ joining the end of $\pp$ to the end of $\vv$ is orthogonal to $\ww$.

{{ :t3.png?nolink&300 |}}

We can use these properties of $\pp$ to find a formula for $\pp$ in terms of $\vv$ and $\ww$.
  - Since $\pp$ is in the same direction as $\ww$, we have $\pp=c\ww$ for some scalar $c\in \mathbb{R}$.
  - Since $\nn=\vv-\pp$ is orthogonal to $\ww$, we have $\nn\cdot \ww=0$. Hence \begin{align*}&&(\vv-\pp)\cdot \ww&=0\\&\implies& \vv\cdot\ww-\pp\cdot\ww&=0\\&\implies& \pp\cdot\ww&=\vv\cdot\ww\\&\implies& c\ww\cdot \ww&=\vv\cdot\ww\\&\implies& c\|\ww\|^2&=\vv\cdot\ww\\&\implies& c&=\frac{\vv\cdot\ww}{\|\ww\|^2}.\end{align*}

{{anchor:proj}}So we obtain the **orthogonal projection formula**:
\[ \pp=\ppp=\frac{\vv\cdot\ww}{\|\ww\|^2}\ww.\]
We call $\nn=\vv-\ppp$ the **component of $\vv$ orthogonal to $\ww$**.

=== Example ===

If $\def\vv{\vec v}\def\pp{\vec p}\def\ppp{\text{proj}_{\ww}\vv}\def\ww{\vec w}\def\nn{\vec n}\vv=\def\c#1#2#3{\begin{bmatrix}#1\\#2\\#3\end{bmatrix}}\c12{-1}$ and $\ww=\c2{-1}4$, then \begin{align*}\ppp&=\frac{\vv\cdot\ww}{\|\ww\|^2}\ww\\&=\frac{2-2-4}{2^2+(-1)^2+4^2}\c2{-1}4\\&=-\frac4{21}\c2{-1}4\end{align*} and the component of $\vv$ orthogonal to $\ww$ is \begin{align*}\nn&=\vv-\ppp\\&=\c12{-1}-\left(-\frac4{21}\right)\c2{-1}4\\&=\c12{-1}+\c{8/21}{-4/21}{16/21}\\&=\c{29/21}{38/21}{-5/21}.\end{align*}

===== The cross product of vectors in $\mathbb{R}^3$ =====

==== Definition: the standard basis vectors ====

We define $\def\i{\vec \imath}\i=\def\c#1#2#3{\begin{bmatrix}#1\\#2\\#3\end{bmatrix}}\c100$, $\def\j{\vec \jmath}\j=\c010$ and $\def\k{\vec k}\k=\c001$. These are the **standard basis vectors** of $\mathbb{R}^3$. Note that any vector $\vec v=\c{v_1}{v_2}{v_3}$ may be written as a **linear combination** of these vectors (that is, a sum of scalar multiples of $\i$, $\j$ and $\k$), since
\[ \def\vc#1{\c{#1_1}{#1_2}{#1_3}}\vec v=\vc v=\c{v_1}{0}{0}+\c{0}{v_2}{0}+\c{0}{0}{v_3} = v_1\i+v_2\j+v_3\k.\]

==== Definition: the cross product ====

If $\vec v=\vc v$ and $\vec w=\vc w$ are vectors in $\mathbb{R}^3$, then we define $\def\vv{\vec v}\vv\times\def\ww{\vec w}\ww$ to be the vector given by the determinant
\[ \vv\times\ww=\def\cp#1#2#3#4#5#6{\begin{vmatrix}\i&\j&\k\\#1&#2&#3\\#4&#5&#6\end{vmatrix}}\def\cpc#1#2{\cp{#1_1}{#1_2}{#1_3}{#2_1}{#2_2}{#2_3}}\cpc vw.\]
We interpret this determinant by expanding along the first row:
\[\vv\times\ww=\cpc vw=\def\vm#1{\begin{vmatrix}#1\end{vmatrix}}\vm{v_2&v_3\\w_2&w_3}\i-\vm{v_1&v_3\\w_1&w_3}\j+\vm{v_1&v_2\\w_1&w_2}\k=\c{v_2w_3-v_3w_2}{-(v_1w_3-v_3w_1)}{v_1w_2-v_2w_1}.\]

==== Example ====

Let $\vv=\c13{-1}$ and $\ww=\c21{-2}$. We have
\[ \vv\times\ww=\cp13{-1}21{-2}=\c{3(-2)-(-1)1}{-(1(-2)-(-1)2)}{1(1)-3(2)}=\c{-5}0{-5}\]
and
\[ \ww\times\vv=\cp21{-2}13{-1}=\c{1(-1)-(-2)3}{-(2(-1)-(-2)1)}{2(3)-1(1)}=\c{5}0{5}.\]
Observe that $\vv\times\ww=-\ww\times \vv$.
Moreover,
\[ \vv\times \vv=\cp13{-1}13{-1}=\c000=\vec0\]
and
\[ \ww\times\ww=\cp21{-2}21{-2}=\c000=\vec0.\]

==== Example: cross products of standard basis vectors ====

We have
\[ \i\times\j=\cp100010=\c001=\k,\]
\[ \j\times\k=\cp010001=\c100=\i\]
and
\[ \k\times\i=\cp001100=\c010=\j.\]

==== Proposition: properties of the cross product ====

For any vectors $\def\uu{\vec u}\uu$, $\vv$ and $\ww$ in $\mathbb{R}^3$ and any scalar $c\in\mathbb{R}$, we have:
  - $\uu\times(\vv+\ww)=\uu\times\vv+\uu\times\ww$
  - $\vv\times\ww=-\ww\times\vv$
  - $(c\vv)\times \ww=c(\vv\times\ww)=\vv\times(c\ww)$
  - $\vv\times\vv=\vec0$
  - $\vv\times \vec0=\vec0$
  - $\vv\times \ww$ is orthogonal to both $\vv$ and $\ww$

=== Proof ===

  - This is a tedious (but easy) bit of algebra.
  - Swapping two rows in a determinant changes the sign, so \[ \vv\times\ww=\cpc vw=-\cpc wv=-\ww\times\vv.\]
  - Scaling one row of a determinant scales the determinant in the same way, so \[ (c\vv)\times\ww=\cpc {cv}w=c\cpc vw=c(\vv\times\ww).\]
  - The determinant of a matrix with a repeated row is zero.
  - The determinant of a matrix with a zero row is zero.
  - Observe that $\uu\cdot (\vv\times \ww)=\vm{u_1&u_2&u_3\\v_1&v_2&v_3\\w_1&w_2&w_3}$. The determinant of a matrix with a repeated row is zero, so \[\vv\cdot (\vv\times \ww)=\vm{v_1&v_2&v_3\\v_1&v_2&v_3\\w_1&w_2&w_3}=0,\] so $\vv$ is orthogonal to $\vv\times\ww$; and similarly, \[\ww\cdot(\vv\times \ww)=\vm{w_1&w_2&w_3\\v_1&v_2&v_3\\w_1&w_2&w_3}=0,\] so $\ww$ is orthogonal to $\vv\times\ww$. ■

==== Theorem: the dot product/cross product length formula ====

For any vectors $\vv$ and $\ww$ in $\mathbb{R}^3$, we have
\[ \|\vv\times\ww\|^2+(\vv\cdot\ww)^2=\|\vv\|^2\,\|\ww\|^2.\]

=== Proof ===

Let $D$ be the sum of $v_i^2w_j^2$ over all $i,j\in\{1,2,3\}$ with $i=j$. (So $D=v_1^2w_1^2+v_2^2w_2^2+v_3^2w_3^2$.)

Let $F$ be the sum of $v_i^2w_j^2$ over all $i,j\in\{1,2,3\}$ with $i\ne j$. (So $F=v_1^2w_2^2+v_2^2w_1^2+\dots+v_3^2w_2^2$, with 6 terms on the right hand side.)

Let $C$ be the sum of $v_iw_iv_jw_j$ over all $i,j\in\{1,2,3\}$ with $i<j$. (So $C=v_1w_1v_2w_2+v_1w_1v_3w_3+v_2w_2v_3w_3$, with 3 terms on the right hand side.)

Expanding each side in terms of these sums, we find
\[ \|\vv\|^2\,\|\ww\|^2=(v_1^2+v_2^2+v_3^2)(w_1^2+w_2^2+w_3^2)=D+F,\]
\[ (\vv\cdot\ww)^2=(v_1w_1+v_2w_2+v_3w_3)^2=D+2C\]
and
\[ \|\vv\times\ww\|^2=(v_2w_3-v_3w_2)^2+(v_1w_3-v_3w_1)^2+(v_1w_2-v_2w_1)^2=F-2C.\]
Hence
\[ \|\vv\times\ww\|^2+(\vv\cdot\ww)^2=(F-2C)+(D+2C)=D+F=\|\vv\|^2\,\|\ww\|^2,\]
as required. ■
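=== Example ===

As a quick check of this identity, take the vectors $\vv=\c13{-1}$ and $\ww=\c21{-2}$ from the earlier cross product example. There we found $\vv\times\ww=\c{-5}0{-5}$, and $\vv\cdot\ww=1(2)+3(1)+(-1)(-2)=7$, so
\[ \|\vv\times\ww\|^2+(\vv\cdot\ww)^2=50+49=99=11\cdot 9=\|\vv\|^2\,\|\ww\|^2,\]
as the theorem predicts.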