User:GFauxPas/Sandbox

Welcome to my sandbox, you are free to play here as long as you don't track sand onto the main wiki. --GFauxPas 09:28, 7 November 2011 (CST)

We have $\rho\left({\mathbf A}\right) + \nu\left({\mathbf A}\right) = \text{number of columns} = 3$, which we might not have a page for.

From Null Space Contains Only Zero Vector iff Columns are Independent, $\nu\left({\mathbf A}\right) = 1$ iff the column vectors are independent.

etc.

Actually, I should add a corollary to that page: nullity is one iff column vectors are independent... --GFauxPas 23:32, 4 April 2012 (EDT)


 * Well, we do have Rank Plus Nullity Theorem, if that's what you mean. By the way, did you mean $\nu\left({\mathbf A}\right) = 0$? --abcxyz 23:47, 4 April 2012 (EDT)


 * Rank plus nullity is for transformations, the page doesn't yet tie it into matrices. Oh and actually I'm not sure what the dimension of $\left \{ {\mathbf 0} \right \}$ is, now that I think about it. --GFauxPas 00:17, 5 April 2012 (EDT)


 * I thought that an $m \times n$ matrix $\mathbf A$ with entries in $\R$ can be viewed as the linear transformation $\mathbf A : \R^n \to \R^m$ given by $\mathbf x \mapsto \mathbf A \mathbf x$. Isn't this already noted in Definition:Matrix?
 * I'm pretty sure that $\dim \left({\left\{{\mathbf 0}\right\}}\right) = 0$, since its basis is empty. --abcxyz 00:26, 5 April 2012 (EDT)


 * Yeah a matrix can be looked at as a LT but I'd need a theorem that the $n$ in $\rho + \nu = n$ is the number of columns. Anyway, the idea of an empty basis is kind of interesting. Though it fits the definition of empty sum of the linear combination of no vectors. Maybe. --GFauxPas 00:46, 5 April 2012 (EDT)


 * Well, if $\mathbf x \in \R^n$, then $\mathbf A$ must have $n$ columns for $\mathbf A \mathbf x$ to be defined ... right? So the domain of the linear transformation $\mathbf A : \R^n \to \R^m$ is $n$-dimensional; doesn't that permit the use of the rank-nullity theorem ($\rho + \nu = n$)? --abcxyz 01:00, 5 April 2012 (EDT)
 * It sure does - but it deserves a page. Thanks for the proof, shorter than I thought. --GFauxPas 01:21, 5 April 2012 (EDT)
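A quick numerical sanity check of the point settled above (numpy is my choice here, not part of the discussion): for an $m \times n$ matrix $\mathbf A$, viewed as the linear transformation $\mathbf x \mapsto \mathbf A \mathbf x$ from $\R^n$ to $\R^m$, rank plus nullity equals $n$, the number of columns. The matrix below is an arbitrary example.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # 2 x 3 matrix; second row is twice the first, so rank 1

m, n = A.shape
rank = np.linalg.matrix_rank(A)

# Null space basis from the SVD: the right singular vectors whose singular
# value is (numerically) zero span the null space of A.
_, s, Vt = np.linalg.svd(A)
tol = max(m, n) * np.finfo(float).eps * s[0]
null_basis = Vt[np.sum(s > tol):]   # rows spanning the null space
nullity = null_basis.shape[0]

assert rank + nullity == n          # rho(A) + nu(A) = number of columns
print(rank, nullity)
```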

Matrix Spaces
My linear algebra class gets to abstract vector spaces in around two weeks. In the meantime, it doesn't hurt to try to get some understanding on my own. What's the proper notation and definition of a matrix such that each column is a member of a vector space? What's the proper way to say "the matrix $\mathbf A$ that, when multiplied on the left of a vector, does the same thing as some linear transformation $T$ on that vector"? --GFauxPas 09:05, 6 April 2012 (EDT)



 * 1. Same notation as block matrices, I think: $\left[{\begin{array}{cccc}\mathbf v_1 & \mathbf v_2 & \cdots & \mathbf v_n\end{array}}\right]$.
 * 2. Probably something like "matrix of the linear transformation $T$" or "transformation matrix of $T$". --abcxyz 11:21, 9 April 2012 (EDT)


 * Surely a reference needs to be made to the bases chosen. Something like 'the matrix of $T$ with respect to the bases $e_1\ldots e_n$ and $f_1\ldots f_m$'. --Lord_Farin 11:28, 9 April 2012 (EDT)
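A sketch of the construction being discussed (numpy; the map $T$ below is a hypothetical example): with respect to chosen bases, the $j$-th column of the matrix of $T$ is the coordinate vector of $T(e_j)$ in the basis $f_1, \ldots, f_m$. With the standard bases this reduces to applying $T$ to each standard basis vector.

```python
import numpy as np

def T(x):
    # an arbitrary linear map R^3 -> R^2, for illustration only
    return np.array([x[0] + 2.0 * x[1], 3.0 * x[2] - x[0]])

n = 3
# With the standard bases, the columns of A are T(e_1), ..., T(e_n)
A = np.column_stack([T(e) for e in np.eye(n)])

x = np.array([1.0, -2.0, 0.5])
assert np.allclose(A @ x, T(x))  # A represents T w.r.t. the standard bases
print(A)
```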

Theorems
On my to do list:


 * $T\left({\mathbf x}\right) = \mathbf A \mathbf x \iff T^{-1}\left({\mathbf x}\right) = \mathbf A^{-1}\mathbf x$


 * $T\left({\mathbf x}\right)= \mathbf A \mathbf x, T \,' \left({\mathbf x}\right) = \mathbf A' \mathbf x, \left({T \circ T\,'}\right)\left({\mathbf x}\right) = \mathbf A \mathbf A' \mathbf x$. --GFauxPas 14:03, 12 April 2012 (EDT)


 * $T: \R^n \to \R^m, \mathbf x \mapsto \mathbf {Ax}, \mathbf A \in \mathbf M_{m,n}\left({\R}\right)$, $T$ is onto iff $\operatorname C\left({\mathbf A}\right) = \R^m$ iff $\rho \left({\mathbf A}\right) = m$.


 * Let $T$ be a linear transformation.


 * $T \ \text{is 1-1} \iff \ker\left({T}\right) = \left\{ {\mathbf 0 \in \operatorname{Dom}\left({T}\right) } \right\}$
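Numerical sketches of the to-do items above (numpy; the matrices are arbitrary examples): if $T(\mathbf x) = \mathbf A \mathbf x$ with $\mathbf A$ invertible, then $T^{-1}(\mathbf x) = \mathbf A^{-1} \mathbf x$; composing transformations multiplies their matrices; and $T$ is onto iff $\rho(\mathbf A) = m$.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])          # invertible: det = 1
A_prime = np.array([[0.0, -1.0],
                    [1.0,  0.0]])   # rotation by 90 degrees

x = np.array([3.0, -1.0])

# T^{-1}(T(x)) = x, i.e. A^{-1}(Ax) = x
assert np.allclose(np.linalg.inv(A) @ (A @ x), x)

# (T o T')(x) = A(A'x) = (A A')x
assert np.allclose(A @ (A_prime @ x), (A @ A_prime) @ x)

# T is onto iff rank(A) = m (the dimension of the codomain; here m = 2)
assert np.linalg.matrix_rank(A) == A.shape[0]
print("checks passed")
```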

Fundamental Theorem of Invertible Matrices
Some of these are probably redundant; in particular, I'm not sure whether or not to split rank of $\mathbf A$ from rank of $T$, nullity of $\mathbf A$ from nullity of $T$, etc. Please share your thoughts as to which ones to cull or combine.

Let $T: \mathbf V \to \mathbf V\,'$ be a linear transformation s.t. $\dim \left({\mathbf V}\right) = \dim\left({\mathbf V\,'}\right)$

Let $\mathbf A$ be a square matrix in the space $\mathbf M_n\left({\R}\right)$, generalize


 * $\mathbf A := \left[ {T,\mathcal B, \mathcal B\,' }\right]$, as defined in Definition:Relative Matrix, where:


 * $\mathcal B, \mathcal B\,'$ are ordered bases for $\mathbf V, \mathbf V\,'$.

All of the following statements are equivalent:


 * $(1): \exists \mathbf A^{-1}$


 * $(2): \mathbf A \sim \mathbf I_n$


 * $(3): \forall \mathbf b \in \R^n, \mathbf {Ax} = \mathbf b$ has one and only one solution, generalize


 * $(4): \mathbf {Ax} = \mathbf 0$ has only the solution $\mathbf x = \mathbf 0$, clarify which $\mathbf 0$


 * $(5): \displaystyle \exists \mathbf E_i: 1 \le i \le k: \mathbf A = \prod_{1 \le i \le k}\mathbf E_i$, where $\mathbf E_i$ is an elementary matrix.


 * $(6):$ The column vectors of $\mathbf A$ are linearly independent.


 * $(7):$ The column vectors of $\mathbf A$ span $\R^n$, generalize


 * $(8):$ The set containing the column vectors of $\mathbf A$ forms a basis for $\R^n$, generalize


 * $(9):$ The column vectors of $\mathbf A^T$ are linearly independent.


 * $(10):$ The column vectors of $\mathbf A^T$ span $\R^n$, generalize


 * $(11):$ The set containing the column vectors of $\mathbf A^T$ forms a basis for $\R^n$, generalize


 * $(12): \rho\left({\mathbf A}\right) = n$


 * $(13)/(14): \rho\left({T}\right) = \dim\left({\mathbf V}\right) = \dim\left({\mathbf V\,'}\right)$


 * $(15): \nu\left({\mathbf A}\right) = 0$


 * $(16): \nu\left({T}\right) = 0$


 * $(17): \det \left({\mathbf A}\right) \ne 0$


 * $(18): 0$ is not a Definition:Eigenvalue of $\mathbf A$.


 * $(19): \exists T^{-1}$


 * $(20): T$ is one-to-one


 * $(21): T$ is onto


 * $(22): \ker\left({T}\right) = \left\{ {\mathbf 0}\right\}$, clarify which $\mathbf 0$


 * $(23): T\left({\mathbf V}\right) = \mathbf V\,'$

--GFauxPas 16:23, 9 May 2012 (EDT)


 * You surely want to stipulate that $V, V'$ have equal dimension, otherwise e.g. 21 and 23 are nonsense (consider coordinate projections). --Lord_Farin 17:08, 9 May 2012 (EDT)


 * Oh whoops, you're absolutely right, how careless of me. Which means I'm going to merge 13 and 14, at least. --GFauxPas 17:32, 9 May 2012 (EDT)
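A sketch checking that a few of the equivalent statements $(1)$–$(23)$ agree numerically, for a sample invertible matrix over $\R^n$ with the standard bases (numpy; the matrix is an arbitrary example):

```python
import numpy as np

n = 3
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 3.0]])    # det = 7, so invertible

invertible = abs(np.linalg.det(A)) > 1e-12          # (17): det(A) != 0
full_rank = np.linalg.matrix_rank(A) == n           # (12): rho(A) = n

# (15): nu(A) = 0, i.e. Ax = 0 has only the trivial solution (4)
_, s, _ = np.linalg.svd(A)
trivial_kernel = bool(np.all(s > 1e-12))

# (18): 0 is not an eigenvalue of A
zero_not_eigenvalue = bool(np.all(np.abs(np.linalg.eigvals(A)) > 1e-12))

assert invertible == full_rank == trivial_kernel == zero_not_eigenvalue

# (3): Ax = b has one and only one solution for any b
b = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
print("all checked conditions agree")
```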