User:GFauxPas/Sandbox

Welcome to my sandbox, you are free to play here as long as you don't track sand onto the main wiki. --GFauxPas 09:28, 7 November 2011 (CST)

We have $\rho\left({\mathbf A}\right) + \nu\left({\mathbf A}\right) = \text{number of columns} = 3$, which we might not have a page for.

From Null Space Contains Only Zero Vector iff Columns are Independent, $\nu\left({\mathbf A}\right) = 1$ iff the column vectors are independent.

etc.

Actually, I should add a corollary to that page: nullity is one iff column vectors are independent... --GFauxPas 23:32, 4 April 2012 (EDT)


 * Well, we do have Rank Plus Nullity Theorem, if that's what you mean. By the way, did you mean $\nu\left({\mathbf A}\right) = 0$? --abcxyz 23:47, 4 April 2012 (EDT)


 * Rank plus nullity is for transformations; the page doesn't yet tie it in to matrices. Oh, and actually I'm not sure what the dimension of $\left \{ {\mathbf 0} \right \}$ is, now that I think about it. --GFauxPas 00:17, 5 April 2012 (EDT)


 * I thought that an $m \times n$ matrix $\mathbf A$ with entries in $\R$ can be viewed as the linear transformation $\mathbf A : \R^n \to \R^m$ given by $\mathbf x \mapsto \mathbf A \mathbf x$. Isn't this already noted in Definition:Matrix?
 * I'm pretty sure that $\dim \left({\left\{{\mathbf 0}\right\}}\right) = 0$, since its basis is empty. --abcxyz 00:26, 5 April 2012 (EDT)


 * Yeah, a matrix can be looked at as an LT, but I'd need a theorem that the $n$ in $\rho + \nu = n$ is the number of columns. Anyway, the idea of an empty basis is kind of interesting, though it fits the definition of an empty sum: the linear combination of no vectors. Maybe. --GFauxPas 00:46, 5 April 2012 (EDT)


 * Well, if $\mathbf x \in \R^n$, then $\mathbf A$ must have $n$ columns for $\mathbf A \mathbf x$ to be defined ... right? So the domain of the linear transformation $\mathbf A : \R^n \to \R^m$ is $n$-dimensional; doesn't that permit the use of the rank-nullity theorem ($\rho + \nu = n$)? --abcxyz 01:00, 5 April 2012 (EDT)
 * It sure does - but it deserves a page. Thanks for the proof, shorter than I thought. --GFauxPas 01:21, 5 April 2012 (EDT)
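
A quick numerical illustration of the matrix version of $\rho \left({\mathbf A}\right) + \nu \left({\mathbf A}\right) = n$ (a minimal sketch, assuming numpy and scipy are available; the matrix is an arbitrary example with one dependent column):

```python
import numpy as np
from scipy.linalg import null_space

# Arbitrary 3x3 example whose third column is the sum of the first two,
# so the columns are dependent and the nullity is positive.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 3.0]])

rank = np.linalg.matrix_rank(A)    # rho(A)
nullity = null_space(A).shape[1]   # nu(A): dimension of the null space
n = A.shape[1]                     # number of columns

print(rank, nullity, n)            # 2 1 3
assert rank + nullity == n         # rank plus nullity = number of columns
```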

Matrix Spaces
My linear algebra class gets to abstract vector spaces in around two weeks. In the meantime, it doesn't hurt to try to get some understanding on my own. What's the proper notation and definition of a matrix such that each column is a member of a vector space? What's the proper way to say "the matrix $\mathbf A$ that, when multiplied on the left of a vector, does the same thing as some linear transformation $T$ on the vector"? --GFauxPas 09:05, 6 April 2012 (EDT)


 * 1. Same notation as block matrices, I think: $\left[{\begin{array}{cccc}\mathbf v_1 & \mathbf v_2 & \cdots & \mathbf v_n\end{array}}\right]$.
 * 2. Probably something like "matrix of the linear transformation $T$" or "transformation matrix of $T$". --abcxyz 11:21, 9 April 2012 (EDT)


 * Surely a reference needs to be made to the bases chosen. Something like 'the matrix of $T$ with respect to the bases $e_1\ldots e_n$ and $f_1\ldots f_m$'. --Lord_Farin 11:28, 9 April 2012 (EDT)
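
To make the "matrix of $T$" idea concrete: with respect to the standard bases, column $j$ of the matrix is $T \left({\mathbf e_j}\right)$. A minimal sketch assuming numpy, with a rotation as a stand-in for $T$:

```python
import numpy as np

# Stand-in linear transformation T : R^2 -> R^2 (rotation by 90 degrees).
def T(x):
    return np.array([-x[1], x[0]])

# With respect to the standard bases, column j of the matrix of T is T(e_j).
n = 2
E = np.eye(n)
A = np.column_stack([T(E[:, j]) for j in range(n)])

x = np.array([3.0, 5.0])
assert np.allclose(A @ x, T(x))   # multiplying by A does the same thing as T
print(A)                          # [[ 0. -1.]
                                  #  [ 1.  0.]]
```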

Theorems
On my to do list:


 * $T\left({\mathbf x}\right) = \mathbf A \mathbf x \iff T^{-1}\left({\mathbf x}\right) = \mathbf A^{-1}\mathbf x$


 * $T\left({\mathbf x}\right)= \mathbf A \mathbf x, T \,' \left({\mathbf x}\right) = \mathbf A' \mathbf x, \left({T \circ T\,'}\right)\left({\mathbf x}\right) = \mathbf A \mathbf A' \mathbf x$. --GFauxPas 14:03, 12 April 2012 (EDT)


 * Judging by the second line, I think it should be the second alternative for the first. --Lord_Farin 14:06, 12 April 2012 (EDT)
 * Yeah it is, I looked it up, thanks. Your reasoning, I assume, was $\mathbf A \mathbf A^{-1} \mathbf x = \mathbf I \mathbf x$. --GFauxPas 14:08, 12 April 2012 (EDT)
 * Well, the completely equivalent $T \circ T^{-1} = \operatorname{Id}$. --Lord_Farin 14:09, 12 April 2012 (EDT)
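
Both to-do items are easy to sanity-check numerically before proving them; a minimal sketch assuming numpy, with arbitrary (almost surely invertible) random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # random matrices are almost surely invertible
B = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# If T(x) = A x, then T^{-1}(x) = A^{-1} x:
assert np.allclose(np.linalg.inv(A) @ (A @ x), x)

# If T(x) = A x and T'(x) = B x, then (T o T')(x) = A B x:
assert np.allclose(A @ (B @ x), (A @ B) @ x)
```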

Polar Coordinates
A question please, for anyone who wants to answer.

Theorem: Let

$R = \{(x,y) = (r\cos \theta, r \sin \theta): 0 \le g_1(\theta) \le r \le g_2(\theta), \ \alpha \le \theta \le \beta, \ 0 \le \beta - \alpha \le 2\pi\} \subseteq \R^2$

where $g_1,g_2$ are real functions continuous for all $\theta \in [\alpha..\beta]$.

Let $f$ be a real-valued function continuous on $R$.

Then:


 * $\displaystyle \iint_R f(x,y) \ \mathrm dA = \int_{\alpha}^{\beta} \int_{g_1(\theta)}^{g_2(\theta)} f (r\cos \theta,r \sin \theta) \ r \ \mathrm dr \ \mathrm d\theta$

I get $x = r\cos \theta, y = r\sin \theta$, but where did that last $r$ come from? I expected it to be $\displaystyle \iint_R f(r\cos \theta, r\sin \theta) \ \mathrm dr \ \mathrm d\theta$. --GFauxPas 17:12, 26 April 2012 (EDT)


 * This $r$ arises from the Jacobi determinant. In fact, what is done here is a change of variables, which is closely related to Lebesgue Measure of Matrix Image.
 * I suggest a web search on Jacobi determinant and Change of Variables (this is an application of the latter). --Lord_Farin 17:21, 26 April 2012 (EDT)
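
For concreteness, the Jacobi determinant of $\left({r, \theta}\right) \mapsto \left({r \cos \theta, r \sin \theta}\right)$ can be computed symbolically; a minimal sketch, assuming sympy:

```python
from sympy import symbols, cos, sin, Matrix, simplify

r, theta = symbols('r theta')

# Jacobian matrix of (x, y) = (r cos(theta), r sin(theta)) w.r.t. (r, theta)
J = Matrix([r * cos(theta), r * sin(theta)]).jacobian([r, theta])

print(simplify(J.det()))   # r -- the extra factor in dA = r dr dtheta
```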


 * Will do, thanks a lot. Another question; why do we need $r \ge 0$? --GFauxPas 00:00, 27 April 2012 (EDT)


 * That's probably to ensure the injectivity of the mapping $\left({r, \theta}\right) \mapsto \left({r \cos \theta, r \sin \theta}\right)$ (except possibly at endpoints or when $r = 0$), so that extra area won't be counted twice in the integral on the right-hand side. So I think either of the requirements $\left({0 \le \beta - \alpha \le 2 \pi}\right) \land \left({r \ge 0}\right)$ or $\left({0 \le \beta - \alpha \le \pi}\right)$ will do. In the latter case (and also the former case) I think the formula reads:
 * $\displaystyle \iint_R f \left({x, y}\right) \ \mathrm d x \ \mathrm d y = \int_{\alpha}^{\beta} \int_{g_1 \left({\theta}\right)}^{g_2 \left({\theta}\right)} \left\vert{r}\right\vert \ f \left({r \cos \theta, r \sin \theta}\right) \ \mathrm d r \ \mathrm d \theta$
 * --abcxyz 01:07, 27 April 2012 (EDT)
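
One way to see the formula in action numerically (a sketch assuming scipy; the half-annulus is an arbitrary test region): take $f = 1$, so both sides should equal the area of the region.

```python
import numpy as np
from scipy.integrate import dblquad

# f = 1 over the half-annulus 1 <= r <= 2, 0 <= theta <= pi;
# the double integral should equal its area, (pi/2)(2^2 - 1^2) = 3 pi / 2.
f = lambda x, y: 1.0
g1 = lambda theta: 1.0
g2 = lambda theta: 2.0
alpha, beta = 0.0, np.pi

# dblquad integrates the inner variable (r) first; note the extra factor r.
val, err = dblquad(
    lambda r, theta: r * f(r * np.cos(theta), r * np.sin(theta)),
    alpha, beta, g1, g2)

print(val, 3 * np.pi / 2)   # both approximately 4.712
```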


 * It sometimes helps to draw pictures, and see what determines the size of the elements of area you are adding up. Once you have seen it properly with your eyes it's straightforward to get the definition of the integral correct, and easier to prove. --prime mover 02:19, 27 April 2012 (EDT)


 * Thanks a lot, guys. abc, did you mean "injectivity of $\left({x, y}\right) \mapsto \left({r \cos \theta, r \sin \theta}\right)$"? PM, did you mean, e.g., when I encounter a theorem such as this in its raw formulation, or when applying the theorem to a specific case for homework or a test? --GFauxPas 08:24, 27 April 2012 (EDT)

Vector Space Axioms
Trying to make this stuff a little bit simpler for those who haven't studied abstract algebra yet, but are learning about vector spaces. I'm not sure I've been exposed to enough stuff to write this rigorously, though. But presumably it's a good idea to have them all in one place? Or maybe this is just a waste of time?

Let $\left({V, +', \circ }\right)_{\mathbb F}$ be a vector space over $\mathbb F \in \left\{ {\R, \C}\right \}$.

The vector space axioms are the abelian group axioms:


 * $\forall \mathbf x, \mathbf y \in V: \left({\mathbf x +' \mathbf y}\right) \in V$


 * $\forall \mathbf x, \mathbf y, \mathbf z \in V: \left({\mathbf x +' \mathbf y}\right) +' \mathbf z = \mathbf x +' \left({\mathbf y +' \mathbf z}\right)$


 * $\exists \mathbf 0 \in V: \forall \mathbf x \in V: \mathbf x +' \mathbf 0 = \mathbf x = \mathbf 0 +' \mathbf x$


 * $\forall \mathbf x \in V: \exists \left({-\mathbf x}\right) \in V: \mathbf x +' \left({-\mathbf x}\right) = \mathbf 0 = -\mathbf x +' \mathbf x$


 * $\forall \mathbf x, \mathbf y \in V: \mathbf x +' \mathbf y = \mathbf y +' \mathbf x$

Together with the properties of a unitary module:


 * $\forall \lambda \in \mathbb F: \forall \mathbf x, \mathbf y \in V: \lambda \circ \left({\mathbf x +' \mathbf y}\right) = \lambda \circ \mathbf x +' \lambda \circ \mathbf y$


 * $\forall \lambda, \mu \in \mathbb F: \forall \mathbf x \in V: \left({\lambda + \mu}\right)\circ \mathbf x = \lambda \circ \mathbf x +' \mu \circ \mathbf x$


 * $\forall \lambda, \mu \in \mathbb F: \forall \mathbf x \in V: \lambda \circ \left({\mu \circ \mathbf x}\right) = \left({\lambda \mu}\right) \circ \mathbf x$


 * $\forall \mathbf x \in V: 1 \circ \mathbf x = \mathbf x$
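
A quick numerical spot-check of these axioms for $V = \R^3$, $\mathbb F = \R$, where $+'$ is ordinary vector addition and $\circ$ is scalar multiplication (a minimal sketch assuming numpy; the random vectors and scalars are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
x, y, z = (rng.standard_normal(3) for _ in range(3))
lam, mu = rng.standard_normal(2)

# Abelian group axioms for (R^3, +)
assert np.allclose((x + y) + z, x + (y + z))   # associativity
assert np.allclose(x + np.zeros(3), x)         # identity element
assert np.allclose(x + (-x), np.zeros(3))      # inverse elements
assert np.allclose(x + y, y + x)               # commutativity

# Unitary module properties for scalar multiplication
assert np.allclose(lam * (x + y), lam * x + lam * y)
assert np.allclose((lam + mu) * x, lam * x + mu * x)
assert np.allclose(lam * (mu * x), (lam * mu) * x)
assert np.allclose(1 * x, x)
```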

--GFauxPas 09:40, 27 April 2012 (EDT)


 * This is in about the same league as the instantiation of Combination Theorem for Sequences I proposed for series. I would say there is merit in this. Not sure why you wrote $+'$, though; another point is the very first axiom: it could be replaced by saying
 * 'Let $+: V \times V \to V$ be a binary operation on $V$, subject to:'
 * and then the other axioms. It is IMO not necessary to restrict to $\R,\C$. They could be mentioned as important instances of division rings somewhere lower on the page, or in parentheses. Maybe best would be to say something like
 * 'Let $\Bbb F$ be $\R$, $\C$ or any other division ring.'

I like your initiative, especially as the most usual track into abstract algebra treads through the vector space realm at first. I can imagine people being grateful for a clear exposition in which abstract algebra isn't used (even if they don't realise they are). --Lord_Farin 10:19, 27 April 2012 (EDT)