User:GFauxPas/Sandbox

Welcome to my sandbox, you are free to play here as long as you don't track sand onto the main wiki. --GFauxPas 09:28, 7 November 2011 (CST)

We have $\rho\left({\mathbf A}\right) + \nu\left({\mathbf A}\right) = \text{number of columns} = 3$, which we might not have a page for.

From Null Space Contains Only Zero Vector iff Columns are Independent, $\nu\left({\mathbf A}\right) = 1$ iff the column vectors are independent.

etc.

Actually, I should add a corollary to that page: nullity is one iff column vectors are independent... --GFauxPas 23:32, 4 April 2012 (EDT)


 * Well, we do have Rank Plus Nullity Theorem, if that's what you mean. By the way, did you mean $\nu\left({\mathbf A}\right) = 0$? --abcxyz 23:47, 4 April 2012 (EDT)


 * Rank plus nullity is for transformations; the page doesn't yet tie it into matrices. Oh, and actually I'm not sure what the dimension of $\left \{ {\mathbf 0} \right \}$ is, now that I think about it. --GFauxPas 00:17, 5 April 2012 (EDT)


 * I thought that an $m \times n$ matrix $\mathbf A$ with entries in $\R$ can be viewed as the linear transformation $\mathbf A : \R^n \to \R^m$ given by $\mathbf x \mapsto \mathbf A \mathbf x$. Isn't this already noted in Definition:Matrix?
 * I'm pretty sure that $\dim \left({\left\{{\mathbf 0}\right\}}\right) = 0$, since its basis is empty. --abcxyz 00:26, 5 April 2012 (EDT)


 * Yeah, a matrix can be looked at as a linear transformation, but I'd need a theorem that the $n$ in $\rho + \nu = n$ is the number of columns. Anyway, the idea of an empty basis is kind of interesting, though it fits the definition of an empty sum: the linear combination of no vectors. Maybe. (See the note at the end of this thread.) --GFauxPas 00:46, 5 April 2012 (EDT)


 * Well, if $\mathbf x \in \R^n$, then $\mathbf A$ must have $n$ columns for $\mathbf A \mathbf x$ to be defined ... right? So the domain of the linear transformation $\mathbf A : \R^n \to \R^m$ is $n$-dimensional; doesn't that permit the use of the rank-nullity theorem ($\rho + \nu = n$)? --abcxyz 01:00, 5 April 2012 (EDT)
 * It sure does - but it deserves a page. Thanks for the proof, shorter than I thought. --GFauxPas 01:21, 5 April 2012 (EDT)
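
A note pulling together the two points above (my own examples, worth double-checking): a linear combination of no vectors is an empty sum, so $\sum_{i \mathop \in \emptyset} \lambda_i \mathbf v_i = \mathbf 0$, hence $\operatorname{span} \left({\emptyset}\right) = \left\{{\mathbf 0}\right\}$; and $\emptyset$ is vacuously linearly independent, so $\dim \left({\left\{{\mathbf 0}\right\}}\right) = 0$. As a concrete check of $\rho\left({\mathbf A}\right) + \nu\left({\mathbf A}\right) = n$ with $n$ the number of columns: for $\mathbf A = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \end{bmatrix}$ we have $\rho\left({\mathbf A}\right) = 2$ (the first two columns are independent), the null space is spanned by $\begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix}$ so $\nu\left({\mathbf A}\right) = 1$, and indeed $2 + 1 = 3 = n$.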

Matrix Spaces
My linear algebra class gets to abstract vector spaces in around two weeks. In the meantime, it doesn't hurt to try to get some understanding on my own. What's the proper notation and definition for a matrix such that each column is a member of a vector space? And what's the proper way to say "the matrix $\mathbf A$ that, when multiplied on the left of a vector, does the same thing as some linear transformation $T$ does to the vector"? --GFauxPas 09:05, 6 April 2012 (EDT)


 * 1. Same notation as block matrices, I think: $\left[{\begin{array}{cccc}\mathbf v_1 & \mathbf v_2 & \cdots & \mathbf v_n\end{array}}\right]$.
 * 2. Probably something like "matrix of the linear transformation $T$" or "transformation matrix of $T$". --abcxyz 11:21, 9 April 2012 (EDT)


 * Surely a reference needs to be made to the bases chosen. Something like 'the matrix of $T$ with respect to the bases $e_1, \ldots, e_n$ and $f_1, \ldots, f_m$'. --Lord_Farin 11:28, 9 April 2012 (EDT)
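
For reference, the standard construction as I understand it (a sketch, not tied to any existing page here): given bases $e_1, \ldots, e_n$ of the domain and $f_1, \ldots, f_m$ of the codomain, the matrix $\mathbf A = \left[{a_{ij}}\right]$ of $T$ is determined column by column by $T \left({e_j}\right) = \sum_{i \mathop = 1}^m a_{ij} f_i$; that is, the $j$th column of $\mathbf A$ is the coordinate vector of $T \left({e_j}\right)$ relative to $f_1, \ldots, f_m$.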

Theorems
On my to do list:


 * $T\left({\mathbf x}\right) = \mathbf A \mathbf x \iff T^{-1}\left({\mathbf x}\right) = \mathbf A^{-1}\mathbf x$


 * $T\left({\mathbf x}\right) = \mathbf A \mathbf x$, $T'\left({\mathbf x}\right) = \mathbf A' \mathbf x$, $\left({T \circ T'}\right)\left({\mathbf x}\right) = \mathbf A \mathbf A' \mathbf x$. --GFauxPas 14:03, 12 April 2012 (EDT)


 * Judging by the second line, I think it should be the second alternative for the first. --Lord_Farin 14:06, 12 April 2012 (EDT)
 * Yeah it is, I looked it up, thanks. Your reasoning, I assume, was $\mathbf A \mathbf A^{-1} \mathbf x = \mathbf I \mathbf x$? --GFauxPas 14:08, 12 April 2012 (EDT)
 * Well, the completely equivalent $T \circ T^{-1} = \operatorname{Id}$. --Lord_Farin 14:09, 12 April 2012 (EDT)
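
Spelling that reasoning out as a sketch: if $T \left({\mathbf x}\right) = \mathbf A \mathbf x$ with $\mathbf A$ invertible, then $T \left({\mathbf A^{-1} \mathbf x}\right) = \mathbf A \mathbf A^{-1} \mathbf x = \mathbf I \mathbf x = \mathbf x$ for every $\mathbf x$, so $\mathbf x \mapsto \mathbf A^{-1} \mathbf x$ is precisely $T^{-1}$. The composition item is the same one-liner: $\left({T \circ T'}\right) \left({\mathbf x}\right) = T \left({\mathbf A' \mathbf x}\right) = \mathbf A \mathbf A' \mathbf x$.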

Linear Algebra Test
Can someone help me with something, please? I got a question right on my LA test, but the professor pointed out that there was an easier way to do it, and I'm not clear on how it works.

Let $T: \R^2 \to \R^3$ be a linear transformation such that:


 * $\begin{bmatrix} 1 \\ 1 \end{bmatrix} \mapsto \begin{bmatrix} 1 \\ -2 \\ 0 \end{bmatrix}$


 * $\begin{bmatrix} 3 \\ -1 \end{bmatrix} \mapsto \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix}$

Find $T\left({\begin{bmatrix} -5 \\ 3\end{bmatrix} }\right)$.

So what I did was:

Says my professor, that's fine, but there's an easier way:

Write $\begin{bmatrix} -5 \\ 3 \end{bmatrix}$ as a linear combo of $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $\begin{bmatrix} 3 \\ -1 \end{bmatrix}$ and then apply $T$.

What does he mean? Thanks for any help. --GFauxPas 10:17, 19 April 2012 (EDT)


 * Write $\begin{bmatrix}-5\\3\end{bmatrix} = -2 \begin{bmatrix}3 \\-1\end{bmatrix} + \begin{bmatrix}1\\1\end{bmatrix}$ and use linearity of $T$. --Lord_Farin 10:20, 19 April 2012 (EDT)


 * And how would you get $-2$ and $1$ other than by inspection? Row-reduce $\begin{bmatrix} -5 & 3 & 1 \\ 3 & -1 & 1 \end{bmatrix}$? --GFauxPas 10:36, 19 April 2012 (EDT)


 * Observe that the two given vectors are lin. indep., so they constitute a basis. Then write $\begin{bmatrix}-5\\3\end{bmatrix} = a \begin{bmatrix}3 \\-1\end{bmatrix} + b \begin{bmatrix}1\\1\end{bmatrix}$ and solve for $a, b$; this may essentially be the same as what you are doing by row-reducing, but it allows for more human insight (at least, this is how I did it). --Lord_Farin 10:44, 19 April 2012 (EDT)


 * I should have said rref $\begin{bmatrix} 3 & 1 & -5 \\ -1 & 1 & 3 \end{bmatrix}$, that's equivalent to your method. Thanks a lot LF. --GFauxPas 10:55, 19 April 2012 (EDT)

HTH. --Lord_Farin 10:59, 19 April 2012 (EDT)
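
For completeness, carrying that method through (my arithmetic, worth double-checking): solving $3a + b = -5$ and $-a + b = 3$ gives $a = -2$, $b = 1$, and then by linearity $T \left({\begin{bmatrix} -5 \\ 3 \end{bmatrix}}\right) = -2 \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix} + \begin{bmatrix} 1 \\ -2 \\ 0 \end{bmatrix} = \begin{bmatrix} 1 \\ -4 \\ -4 \end{bmatrix}$.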

More Linear Algebra questions, hooray! I'm googling stuff and running into the concept of $\R$ considered as a vector space over $\Q$. What exactly is this creature, and how is it different from $\R^1$ with the $+$ and scalar multiplication I've been using so far? Perhaps this is out of the scope of a basic Linear Algebra course... --GFauxPas 12:39, 19 April 2012 (EDT)
 * Specifically, the question I'm researching is: are there functions $f: V \to W$, where $V$ and $W$ are vector spaces, that satisfy $f(x + y) = f(x) + f(y)$ but do not satisfy $f(\lambda x) = \lambda f(x)$ for every real scalar $\lambda$? --GFauxPas 12:41, 19 April 2012 (EDT)


 * It isn't different, really, except that the underlying field is $\Q$ (so that, for instance, $\pi$ and $1$ are linearly independent over $\Q$). It is a bit outside the scope, but you should be able to grasp the idea. --Lord_Farin 12:43, 19 April 2012 (EDT)
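
To sketch why this answers the question above (a standard construction, assuming the axiom of choice): any additive $f$ is automatically $\Q$-linear, because $f(x + y) = f(x) + f(y)$ forces $f(q x) = q f(x)$ for all $q \in \Q$. But take a basis of $\R$ over $\Q$ (a Hamel basis) containing $1$ and $\pi$, set $f(1) = 1$ and $f(\pi) = 0$, and extend $\Q$-linearly: the resulting $f: \R \to \R$ is additive, yet $f(\pi \cdot 1) = 0 \ne \pi = \pi \, f(1)$, so it is not $\R$-linear.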