User:GFauxPas/Sandbox

Welcome to my sandbox, you are free to play here as long as you don't track sand onto the main wiki. --GFauxPas 09:28, 7 November 2011 (CST)

Extra Credit
I didn't get the extra credit on my linear algebra exam, can someone help me with it? If I remember correctly, it was something like:

Find the values of $a$ and $b$ such that the rank of the following matrix is $2$.


 * $\begin{bmatrix} 1 & 0 & 0 \\ 0 & a - 2 & b \\ 0 & 2 & a + 2 \\ 0 & 0 & 3 \end{bmatrix}$

--GFauxPas 15:12, 4 April 2012 (EDT)


 * If that really is the matrix, the solution is the empty set; the first, third and fourth row constitute a $3\times3$ matrix with nonzero determinant. --Lord_Farin 15:28, 4 April 2012 (EDT)
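Lord_Farin's observation can be checked numerically: rows $1$, $3$ and $4$ form a $3 \times 3$ submatrix whose determinant is $6$ regardless of $a$ and $b$, so the matrix always has rank $3$. A quick NumPy sketch (the sampled values of $a$ and $b$ are arbitrary):

```python
import numpy as np

# Rows 1, 3 and 4 of the matrix form the 3x3 submatrix
#   [1, 0,     0]
#   [0, 2, a + 2]
#   [0, 0,     3]
# with determinant 1 * (2*3 - 0*(a+2)) = 6, independent of a and b,
# so the matrix has three linearly independent rows, i.e. rank 3.
for a in (-5.0, 0.0, 2.0, 7.5):
    for b in (-1.0, 0.0, 3.0):
        M = np.array([[1, 0, 0],
                      [0, a - 2, b],
                      [0, 2, a + 2],
                      [0, 0, 3]], dtype=float)
        sub = M[[0, 2, 3], :]          # rows 1, 3, 4 (0-indexed)
        assert abs(np.linalg.det(sub) - 6) < 1e-9
        assert np.linalg.matrix_rank(M) == 3
print("rank is 3 for every sampled (a, b)")
```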


 * Can you maybe help me figure out a suitable version where there is a solution? In any event, we have not gotten to determinants in class, other than (briefly) the relationship between a zero determinant and being singular. --GFauxPas 15:37, 4 April 2012 (EDT)

Consider


 * $\begin{bmatrix} 1 & 0 & 0 \\ 0 & a - 2 & b \\ 0 & 2 & a + 2 \\ 0 & 1 & 3 \end{bmatrix}$

I strongly suspect that there are cases where this does not have maximal rank (while it is evident that it will have rank at least two, taking the first and second column).

A useful characterization of rank is 'dimension of the range' (taken as a linear space); this corresponds directly to the number of linearly independent columns. So in our present case, the rank will be $3$ unless the latter two columns are linearly dependent. I am sure you can work out $a$ and $b$ from there. --Lord_Farin 16:57, 4 April 2012 (EDT)

$a$ has to be $4$ or else all three columns will have a pivot:


 * $\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 3 \\ 0 & 2 & b \\ 0 & 0 & 0 \end{bmatrix}$

$a = 4, b=6$? --GFauxPas 17:31, 4 April 2012 (EDT)


 * Quite sure that is correct. --Lord_Farin 17:38, 4 April 2012 (EDT)


 * Hooray thank you LF --GFauxPas 17:39, 4 April 2012 (EDT)
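The answer worked out above can be verified numerically: with $a = 4$, $b = 6$ the second matrix has rank $2$, while any other choice (e.g. $b = 5$) pushes the rank back to $3$. A NumPy sketch:

```python
import numpy as np

# The second matrix with a = 4 and b = 6: its third column is
# exactly 3 times its second column, so the rank drops to 2.
A = np.array([[1, 0, 0],
              [0, 4 - 2, 6],
              [0, 2, 4 + 2],
              [0, 1, 3]], dtype=float)
assert np.linalg.matrix_rank(A) == 2

# A nearby choice (a = 4, b = 5) restores maximal rank 3.
B = np.array([[1, 0, 0],
              [0, 2, 5],
              [0, 2, 6],
              [0, 1, 3]], dtype=float)
assert np.linalg.matrix_rank(B) == 3
print("a = 4, b = 6 gives rank 2")
```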

Just thought of something else.

We have $\rho\left({\mathbf A}\right) + \nu\left({\mathbf A}\right) = \text{number of columns} = 3$, which we might not have a page for.

From Null Space Contains Only Zero Vector iff Columns are Independent, $\nu\left({\mathbf A}\right) = 1$ iff the column vectors are independent.

etc.

Actually, I should add a corollary to that page: nullity is one iff column vectors are independent... --GFauxPas 23:32, 4 April 2012 (EDT)


 * Well, we do have Rank Plus Nullity Theorem, if that's what you mean. By the way, did you mean $\nu\left({\mathbf A}\right) = 0$? --abcxyz 23:47, 4 April 2012 (EDT)


 * Rank plus nullity is for transformations; the page doesn't yet tie it into matrices. Oh, and actually I'm not sure what the dimension of $\left \{ {\mathbf 0} \right \}$ is, now that I think about it. --GFauxPas 00:17, 5 April 2012 (EDT)


 * I thought that an $m \times n$ matrix $\mathbf A$ with entries in $\R$ can be viewed as the linear transformation $\mathbf A : \R^n \to \R^m$ given by $\mathbf x \mapsto \mathbf A \mathbf x$. Isn't this already noted in Definition:Matrix?
 * I'm pretty sure that $\dim \left({\left\{{\mathbf 0}\right\}}\right) = 0$, since its basis is empty. --abcxyz 00:26, 5 April 2012 (EDT)


 * Yeah, a matrix can be looked at as a LT, but I'd need a theorem that the $n$ in $\rho + \nu = n$ is the number of columns. Anyway, the idea of an empty basis is kind of interesting. Though it fits the definition of an empty sum: the linear combination of no vectors. Maybe. --GFauxPas 00:46, 5 April 2012 (EDT)


 * Well, if $\mathbf x \in \R^n$, then $\mathbf A$ must have $n$ columns for $\mathbf A \mathbf x$ to be defined ... right? So the domain of the linear transformation $\mathbf A : \R^n \to \R^m$ is $n$-dimensional; doesn't that permit the use of the rank-nullity theorem ($\rho + \nu = n$)? --abcxyz 01:00, 5 April 2012 (EDT)
 * It sure does - but it deserves a page. Thanks for the proof, shorter than I thought. --GFauxPas 01:21, 5 April 2012 (EDT)
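The identity $\rho + \nu = n$ discussed above can be sanity-checked numerically on the rank-$2$ matrix from the exercise (the $a = 4$, $b = 6$ case), with the nullity read off from the singular values:

```python
import numpy as np

# 4x3 matrix, so it represents a linear map R^3 -> R^4; n = 3 columns.
# This is the exercise matrix with a = 4, b = 6, which has rank 2.
A = np.array([[1, 0, 0],
              [0, 2, 6],
              [0, 2, 6],
              [0, 1, 3]], dtype=float)
n = A.shape[1]
rank = np.linalg.matrix_rank(A)                    # rho(A)
# nullity = number of (near-)zero singular values of A
s = np.linalg.svd(A, compute_uv=False)
nullity = n - np.count_nonzero(s > 1e-10)
assert rank + nullity == n                         # rho + nu = n = 3
print(rank, nullity)
```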

Algorithm
Let $T: \R^n \to \R^m$ be a linear transformation.

Let $\mathbf A_T$ be the matrix representation of $T$, where $\mathbf A_T$ is $m \times n$.
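One way to make the construction above concrete: the $j$-th column of $\mathbf A_T$ is $T$ applied to the $j$-th standard basis vector of $\R^n$. A sketch with a made-up example map $T : \R^3 \to \R^2$ (the particular $T$ below is an assumption for illustration, not from the discussion):

```python
import numpy as np

# Hypothetical example: T(x, y, z) = (x + 2y, 3z), a map R^3 -> R^2.
def T(v):
    x, y, z = v
    return np.array([x + 2 * y, 3 * z], dtype=float)

n = 3
# Column j of A_T is T applied to the j-th standard basis vector e_j.
A_T = np.column_stack([T(e) for e in np.eye(n)])

# A_T x agrees with T(x) on a sample vector.
v = np.array([1.0, -1.0, 2.0])
assert np.allclose(A_T @ v, T(v))
print(A_T)
```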

Matrix Spaces
My linear algebra class gets to abstract vector spaces in around two weeks. In the meantime, it doesn't hurt to try to get some understanding on my own. What's the proper notation and definition of a matrix such that each column is a member of a vector space? What's the proper way to say "the matrix $\mathbf A$ that, when multiplied on the left of a vector, does the same thing as some linear transformation $T$ on the vector"? --GFauxPas 09:05, 6 April 2012 (EDT)

Matrix-Vector operations
Let:


 * $\mathbf A_{m \times n} = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$

be a matrix such that every column is defined as a vector:


 * $\forall i: 1 \le i \le n: \begin{bmatrix} a_{1i} \\ a_{2i} \\ \vdots \\ a_{mi} \end{bmatrix} \in V$

where $V$ is some vector space.

Let $\mathbf v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}, \mathbf w = \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{bmatrix} \in \R^n$.

Then:


 * $\mathbf A \left({\mathbf v + \mathbf w}\right) = \mathbf A \mathbf v + \mathbf A \mathbf w$

This stuff is staying in my sandbox because I don't know if it's distinct from matrix-matrix multiplication. Please share your thoughts.

I'm going to need stuff like this to prove, for example, $T\left({\mathbf x}\right) = \mathbf A \mathbf x$ is a linear transformation. --GFauxPas 15:41, 6 April 2012 (EDT)
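The linearity of $\mathbf x \mapsto \mathbf A \mathbf x$ mentioned above can be illustrated numerically (random matrix and vectors, arbitrary scalars; a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
u, v = rng.standard_normal(3), rng.standard_normal(3)
alpha, beta = 2.0, -0.5

# A(alpha*u + beta*v) == alpha*(A u) + beta*(A v): additivity and
# homogeneity of x -> A x in one check.
assert np.allclose(A @ (alpha * u + beta * v),
                   alpha * (A @ u) + beta * (A @ v))
print("x -> A x is linear on these samples")
```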


 * No, I don't think that this is different from matrix multiplication. --abcxyz 16:08, 6 April 2012 (EDT)


 * I'm also pretty sure this is the same as matrix multiplication. In fact, I usually perform matrix multiplication by doing it column by column; IMO that's a useful way to remember how to do it. --Lord_Farin 17:21, 6 April 2012 (EDT)
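The column-by-column view mentioned above can be checked directly: $\mathbf A \mathbf v$ is the linear combination of the columns of $\mathbf A$ weighted by the entries of $\mathbf v$ (example matrix chosen arbitrarily):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
v = np.array([7.0, 8.0, 9.0])

# A v as the linear combination of the columns of A,
# weighted by the entries of v.
by_columns = sum(v[j] * A[:, j] for j in range(A.shape[1]))
assert np.allclose(A @ v, by_columns)
print(A @ v)   # [ 50. 122.]
```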


 * Yes it is (or should be) the same as conventional matrix multiplication. The identity you cite is therefore already in: Matrix Multiplication Distributes over Matrix Addition. --prime mover 17:26, 6 April 2012 (EDT)