User:J D Bowen/Math725 HW10

1) We aim to show that if $$M \ $$ is an $$m\times n \ $$ matrix with linearly independent columns, then $$M^HM \ $$ is invertible.  Suppose $$M^HMx=0 \ $$ for some $$x \ $$.  Then $$0=x^HM^HMx=\|Mx\|^2 \ $$, so $$Mx=0 \ $$, and since the columns of $$M \ $$ are linearly independent this forces $$x=0 \ $$.  Hence the $$n\times n \ $$ matrix $$M^HM \ $$ has trivial kernel, so $$\text{rank}(M^HM)=n \ $$ and $$M^HM \ $$ is invertible.
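As a quick numerical sanity check of this fact (a sketch using NumPy; the matrix $$M \ $$ below is an arbitrary example with independent columns):

```python
import numpy as np

# An arbitrary 3x2 example matrix with linearly independent columns.
M = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

G = M.conj().T @ M  # the n x n Gram matrix M^H M

# Full column rank of M should make M^H M invertible (nonzero determinant).
print(np.linalg.matrix_rank(M))  # 2
print(np.linalg.det(G))          # 3.0 (nonzero, so invertible)
```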

2) Let $$A = \begin{pmatrix} x_1 & 1 \\ \vdots & \vdots  \\ x_n & 1 \end{pmatrix}$$, $$x=(a,b)^t \ $$, and $$y=(y_1,\dots,y_n)^t \ $$.

Observe that $$Ax=y \ $$ expands to

$$ \begin{pmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{pmatrix} \begin{pmatrix} a \\ b \end{pmatrix} = \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix} \ $$

which is the system of equations $$ax_j+b=y_j, \ 1\leq j \leq n \ $$. An exact solution would mean the line $$y=ax+b \ $$ passes through every data point; in general no such solution exists, so the best-fit line is the best approximate (least-squares) solution of this matrix equation.

Observe $$x=(A^HA)^{-1} A^Hy =\left({ \begin{pmatrix} x_1 & \dots & x_n \\ 1 & \dots &  1 \end{pmatrix} \begin{pmatrix} x_1 & 1 \\ \vdots & \vdots  \\ x_n & 1 \end{pmatrix}}\right)^{-1} \begin{pmatrix} x_1 & \dots & x_n \\ 1 & \dots  &  1 \end{pmatrix} \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix} $$

$$= \begin{pmatrix} \Sigma x_j^2 & \Sigma x_j \\ \Sigma x_j & n \end{pmatrix}^{-1} \begin{pmatrix} \Sigma x_jy_j \\ \Sigma y_j \end{pmatrix}

= \frac{1}{ (n\Sigma x_j^2)-(\Sigma x_j)^2 } \begin{pmatrix} n & -\Sigma x_j \\ -\Sigma x_j & \Sigma x_j^2 \end{pmatrix} \begin{pmatrix} \Sigma x_jy_j \\ \Sigma y_j \end{pmatrix}

= \frac{1}{ (n\Sigma x_j^2)-(\Sigma x_j)^2 } \begin{pmatrix} n\Sigma x_j y_j-(\Sigma x_j)(\Sigma y_j) \\ -(\Sigma x_j)(\Sigma x_jy_j)+(\Sigma x_j^2)(\Sigma y_j) \end{pmatrix} $$

and so we have

$$a=\frac{ n\Sigma x_j y_j-(\Sigma x_j)(\Sigma y_j)}{(n\Sigma x_j^2)-(\Sigma x_j)^2} \ $$

as desired.
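The closed-form slope can be checked against NumPy's least-squares solver (a sketch; the data points below are made up for illustration):

```python
import numpy as np

# Hypothetical data points (x_j, y_j).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 4.0, 8.0])
n = len(x)

# Design matrix A with columns (x_j) and (1), as in the derivation.
A = np.column_stack([x, np.ones(n)])

# Least-squares solution of Ax = y.
(a_lstsq, b_lstsq), *_ = np.linalg.lstsq(A, y, rcond=None)

# Closed-form slope from the normal equations.
a_formula = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) \
            / (n * np.sum(x * x) - np.sum(x) ** 2)

print(np.isclose(a_lstsq, a_formula))  # True
```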

3) Given $$T:V\to V \ $$, we aim to show that $$\lambda \ $$ is an eigenvalue if and only if $$\text{rank}(T-\lambda I) < \text{dim}(V) \ $$.

Begin by observing that $$\text{rank}(T-\lambda I) < \text{dim}(V) \ $$ if and only if the columns of $$T-\lambda I \ $$ are not linearly independent. This is true if and only if $$T-\lambda I \ $$ has a non-trivial kernel, i.e., there is some vector $$v \neq 0 \ $$ such that $$(T-\lambda I)v=0 \ $$. But this is true if and only if $$Tv=\lambda I v = \lambda v \ $$, which is equivalent to $$\lambda \ $$ being an eigenvalue of $$T \ $$.
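Numerically, this correspondence is easy to observe (a sketch; the symmetric matrix below is an arbitrary example whose eigenvalues are $$1 \ $$ and $$3 \ $$):

```python
import numpy as np

# An arbitrary 2x2 example whose eigenvalues are 1 and 3.
T = np.array([[2.0, 1.0],
              [1.0, 2.0]])
n = T.shape[0]

# For an eigenvalue, T - lam*I is rank-deficient; for a non-eigenvalue it is not.
deficient = {lam: np.linalg.matrix_rank(T - lam * np.eye(n)) < n
             for lam in [1.0, 2.0, 3.0]}
print(deficient)  # {1.0: True, 2.0: False, 3.0: True}
```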

Suppose $$\lambda \ $$ is an eigenvalue of $$T \ $$. Then we can show $$\overline{\lambda} \ $$ is an eigenvalue of $$T^H \ $$ with the following equations:

$$\lambda \langle v,v \rangle =\langle \lambda v, v \rangle = \langle Tv, v \rangle = \langle v, T^H v \rangle \ $$

and

$$\lambda \langle v,v \rangle = \langle v, \overline{\lambda}v \rangle \ $$

Hence

$$\langle v, \overline{\lambda}v \rangle=\langle v, T^H v \rangle \ $$.

This shows $$\langle v, (T^H-\overline{\lambda}I)v \rangle = 0 \ $$ for the eigenvector $$v \ $$, which by itself does not force $$T^Hv=\overline{\lambda}v \ $$. To conclude, note instead that $$T-\lambda I \ $$ is not invertible, hence neither is its adjoint $$(T-\lambda I)^H = T^H-\overline{\lambda}I \ $$, and so $$\overline{\lambda} \ $$ is an eigenvalue of $$T^H \ $$. Since the conjugate of the conjugate of $$\lambda \ $$ is just $$\lambda \ $$, and $$(T^H)^H = T \ $$, the same argument gives the other direction of implication as well.
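For matrices this conjugate-pairing of spectra can be verified numerically (a sketch; the complex matrix below is an arbitrary example):

```python
import numpy as np

# An arbitrary complex 2x2 matrix (upper triangular, so its
# eigenvalues are the diagonal entries 1+2j and 4-1j).
T = np.array([[1.0 + 2.0j, 3.0],
              [0.0, 4.0 - 1.0j]])

eig_T = np.linalg.eigvals(T)
eig_TH = np.linalg.eigvals(T.conj().T)

# Eigenvalues of T^H are the conjugates of those of T (compare as sorted sets).
print(np.allclose(np.sort_complex(eig_TH),
                  np.sort_complex(np.conj(eig_T))))  # True
```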

4) Suppose $$U:V\to V \ $$ satisfies $$U^H = U^{-1} \ $$, and that $$\lambda \ $$ is an eigenvalue, so that $$Uv=\lambda v \ $$ for some vector $$v \neq 0 \ $$.  Note $$\lambda \neq 0 \ $$, since $$U \ $$ is invertible; applying $$U^{-1} \ $$ gives $$U^{-1}v=\lambda^{-1}v \ $$, so $$\lambda^{-1} \ $$ is the corresponding eigenvalue of $$U^{-1} \ $$.

Observe

$$\lambda \langle v,v \rangle =\langle \lambda v, v \rangle = \langle Uv, v \rangle = \langle v, U^H v \rangle = \langle v, U^{-1} v \rangle = \langle v, \lambda^{-1} v \rangle = \overline{\lambda^{-1}}\langle v,v \rangle \ $$

Since $$v \neq 0 \ $$, we may divide by $$\langle v,v \rangle \ $$ to get $$\lambda=\overline{\lambda^{-1}} \ $$, i.e. $$\lambda\overline{\lambda}=|\lambda|^2=1 \ $$, so $$|\lambda|=1 \ $$.
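A concrete unitary example confirms this (a sketch; the rotation matrix below is an arbitrary unitary choice):

```python
import numpy as np

# A rotation matrix: real orthogonal, hence unitary (U^H = U^{-1}).
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Check unitarity, and that every eigenvalue has modulus 1.
print(np.allclose(U.conj().T @ U, np.eye(2)))          # True
print(np.allclose(np.abs(np.linalg.eigvals(U)), 1.0))  # True
```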

5) Observe that $$\frac{T+T^H}{2}+\frac{T-T^H}{2} = \frac{T+T^H+T-T^H}{2} = \frac{2T}{2} = T \ $$. Now, for any two vectors $$v,w \ $$, we have

$$\langle \frac{T+T^H}{2} v, w \rangle = \frac{1}{2}\langle Tv + T^H v, w \rangle = \frac{1}{2} \left({ \langle Tv,w \rangle +\langle T^H v, w \rangle }\right) = \frac{1}{2} \left({ \langle v,T^H w \rangle + \langle v, (T^H)^H w \rangle }\right) = \frac{1}{2} \langle v, (T^H+T)w \rangle \ $$

$$ = \langle v, \overline{\frac{1}{2}} (T+T^H)w \rangle = \langle v, \frac{T+T^H}{2}w\rangle \ $$,

and so $$\frac{T+T^H}{2} \ $$ is Hermitian.

Further observe

$$\langle \frac{T-T^H}{2} v, w \rangle = \frac{1}{2}\langle Tv - T^H v, w \rangle = \frac{1}{2} \left({ \langle Tv,w \rangle -\langle T^H v, w \rangle }\right) = \frac{1}{2} \left({ \langle v,T^H w \rangle - \langle v, (T^H)^H w \rangle }\right) = \frac{1}{2} \langle v, (T^H-T)w \rangle \ $$

$$ = \langle v, \overline{\frac{1}{2}} (T^H-T)w \rangle = \langle v, \frac{T^H-T}{2}w\rangle = \langle v, -\frac{T-T^H}{2}w\rangle \ $$

and so $$\left({ \frac{T-T^H}{2} }\right)^H = - \frac{T-T^H}{2} \ $$, meaning this is anti-Hermitian.
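The decomposition and the symmetry of its two parts can be sketched numerically (the matrix $$T \ $$ below is an arbitrary complex example):

```python
import numpy as np

# An arbitrary complex matrix.
T = np.array([[1.0 + 1.0j, 2.0],
              [3.0j, 4.0]])

H = (T + T.conj().T) / 2  # Hermitian part
S = (T - T.conj().T) / 2  # anti-Hermitian part

print(np.allclose(H + S, T))       # True: the parts sum to T
print(np.allclose(H.conj().T, H))  # True: H is Hermitian
print(np.allclose(S.conj().T, -S)) # True: S is anti-Hermitian
```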

6) I can't get Octave running.