User:GFauxPas/Sandbox

Welcome to my sandbox, you are free to play here as long as you don't track sand onto the main wiki. --GFauxPas 09:28, 7 November 2011 (CST)


 * $u \ v \ \mathsf{u} \ \mathsf{v} \ \nu \ \upsilon$

Does anyone else have a hard time distinguishing between $u$ and $v$? I would like it to look more like this. It seems PW doesn't have the upgreek package. --GFauxPas 07:49, 27 January 2012 (EST)


 * Nope. Multiple years of extensive TeX writing and reading have trained my eye. I agree that the referenced $v$ is easier to distinguish, but I imagine it is hard to implement. --Lord_Farin 08:08, 27 January 2012 (EST)

Exponential Definitions
I am discussing the equivalence of the definitions of the exponential function here:

http://forums.xkcd.com/viewtopic.php?f=17&t=80256

For anyone who has been following my progress or lack thereof on exponent combination laws/log laws etc, feel free to look on. --GFauxPas 16:59, 6 February 2012 (EST)


 * Okay, it looks like $e^{x + y} = e^x e^y$ was the hardest one to prove! I was expecting a walk uphill the whole way. Oh, my Linear Algebra book came in the mail, so I guess I'll work on vectors next. And one of these days I'll have to tie up loose ends with Tarski. --GFauxPas 16:57, 10 February 2012 (EST)
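
For the record, one standard route to the product law (not necessarily the argument worked out in the forum thread), assuming only $\left({e^x}\right)' = e^x$, $e^0 = 1$ and $e^x \ne 0$: fix $y$ and set $f \left({x}\right) = \dfrac {e^{x + y}} {e^x}$. By the Quotient Rule:

$f' \left({x}\right) = \dfrac {e^{x + y} e^x - e^{x + y} e^x} {\left({e^x}\right)^2} = 0$

so $f$ is constant, and evaluating at $x = 0$ gives $f \left({x}\right) = e^y$, that is, $e^{x + y} = e^x e^y$.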

Matrix stuff
Question: for the same reasons that I made a page for $\left({\mathbf{AB}}\right)^{-1}$, should I make distinct pages for other matrix theorems? Fraleigh has, among others, that a left identity matrix is also a right identity matrix, and that the identity matrix is unique.

Oh, and this should be a fun one to prove: $\mathbf{A}$ is invertible if (iff?) its rref is $\mathbf{I}$. --GFauxPas 08:20, 24 February 2012 (EST)
 * I'm quite sure that most people never consider the idea that there might be two identity matrices. In fact, $e = ee' = e'$ still stands as the shortest functional proof in my collection (it is rendered for matrices at the end of this thread). I would say that only theorems which aren't intuitively obvious are viable candidates for a separate entry concerning matrices. It's just a matter of finding the fine line between what is and isn't reasonable to expect a visitor of the Matrix Algebra section to know from other fields. I might be guilty of presupposing too much myself concerning the development of Hilbert space theory; but that's for another time. --Lord_Farin 09:40, 24 February 2012 (EST)
 * My take on this: we have two separate proofs. All those matrix results are completely relevant, and then when you prove them using group-theoretical / abstract-algebraic methods as well, you inspire readers to explore material on the periphery of their understanding. --prime mover 15:40, 24 February 2012 (EST)
 * Interesting; I find it very counter-intuitive, as matrix multiplication doesn't commute. But in my experience what I find obvious is not at all what most people find obvious. --GFauxPas 09:43, 24 February 2012 (EST)
 * That matrix multiplication doesn't commute in general is no obstacle here, as the definition requires that $IA = AI = A$ for all $A$. It is generally a good thing to question the obvious, as most flaws are hidden in phrases like 'it is easy to see that...' and such; precisely why we have banished such language from PW. --Lord_Farin 09:47, 24 February 2012 (EST)
 * Oh, right. --GFauxPas 09:53, 24 February 2012 (EST)
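
For concreteness, the $e = ee' = e'$ argument rendered for matrices: suppose $\mathbf{I}$ and $\mathbf{I'}$ are both identity matrices of order $n$. Then:

$\mathbf{I} = \mathbf{I} \mathbf{I'} = \mathbf{I'}$

where the first equality uses that $\mathbf{I'}$ is a right identity and the second that $\mathbf{I}$ is a left identity; commutativity is never invoked.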

I recently fought with my Calc II/III professor because she took off two points for my doing the following (I won in the end, but she warned me against doing it again).

Question: determine whether or not $\sum n^{-5}\ln n$ converges.

Answer: (integral test, integration by parts)

$\displaystyle \int_1^b \frac {\ln x} {x^5} \, \mathrm d x = \frac 1 {16} - \frac {4 \ln b + 1} {16 b^4}$

as $b \to +\infty$

She claimed that I didn't finish the problem, when of course I had. I asked my other math prof about it, and he said facetiously that it would have been okay had I said "clearly, this is some finite number, ..." but obviously the problem is that I didn't say "it is easy to see that..."

(Side note: I lost a point for leaving radicals in the denominator of a Taylor series and not simplifying $\sqrt{27} = 3\sqrt{3}$, which IMHO is absurd.) --GFauxPas 10:08, 24 February 2012 (EST)


 * Well, there is something to say for requiring you to write down $1/16$. My personal approach would be $n^{-5} \log n \le n^{-4}$ for all $n \ge 1$ and $\sum n^{-4} = \frac{\pi^4}{90}$ (but maybe you weren't allowed to use this last fact; the comparison is spelled out after this thread). Because of my lazy nature I have a tendency to use shortcuts and appeal to the intuition of the reader, because my mind is much faster than my hands can write. That $\sqrt{27} = 3\sqrt 3$ is true, but one should reduce the number of operations as much as possible (thus $\sqrt{27}$ is simply more elegant). More intricate is the discussion on e.g. $\sqrt{\frac1{17}}$ vs. $\frac1{\sqrt{17}}$. --Lord_Farin 10:18, 24 February 2012 (EST)
 * I'd get points off for both, as I'm "supposed to" write $\dfrac {\sqrt{17}}{17}$. Anyway, on this particular problem she told us we had to use the integral test, but whatever. --GFauxPas 12:54, 24 February 2012 (EST)
 * If it's been explained that you need to express all algebraic irrationals in the particular format as specified (Surd form? Long time since I ever went near any actual *numbers* as such), then that's what you gotta do or you lose marks. Sucks dunnit. --prime mover 15:38, 24 February 2012 (EST)
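
Spelling out the comparison mentioned above, where the only extra fact needed is $\ln n \le n$ for $n \ge 1$:

$0 \le \dfrac {\ln n} {n^5} \le \dfrac n {n^5} = \dfrac 1 {n^4}$

Since $\sum n^{-4}$ converges as a $p$-series with $p = 4 > 1$ (the exact value $\frac {\pi^4} {90}$ is not needed), $\sum n^{-5} \ln n$ converges by the Comparison Test.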

Theorem
Let $\mathbf{A}$ be an $n \times n$ matrix that admits an inverse $\mathbf{A}^{-1}$.

Let $\mathbf{I}$ be the identity matrix of order $n$.

If there exists a sequence of elementary row operations that reduces $\mathbf{A}$ to $\mathbf{I}$, then the same sequence transforms $\mathbf{I}$ into $\mathbf{A}^{-1}$.

Proof
For ease of presentation, let $\breve{\mathbf{X}}$ be the inverse of $\mathbf{X}$.

We have that $\mathbf{A}$ can be transformed into $\mathbf{I}$ by a finite sequence of elementary row operations.

By repeated application of Elementary Row Operations by Matrix Multiplication, we can write this assertion as:

$\mathbf{E}_k \mathbf{E}_{k - 1} \cdots \mathbf{E}_1 \mathbf{A} = \mathbf{I}$

where each $\mathbf{E}_i$ is the elementary matrix corresponding to the $i$th row operation in the sequence.

Because each elementary matrix is invertible, we can multiply both sides of this equation on the left by:

$\breve{\mathbf{E}}_1 \breve{\mathbf{E}}_2 \cdots \breve{\mathbf{E}}_k$

which yields:

$\mathbf{A} = \breve{\mathbf{E}}_1 \breve{\mathbf{E}}_2 \cdots \breve{\mathbf{E}}_k$

and hence, taking inverses:

$\mathbf{A}^{-1} = \mathbf{E}_k \mathbf{E}_{k - 1} \cdots \mathbf{E}_1 = \mathbf{E}_k \mathbf{E}_{k - 1} \cdots \mathbf{E}_1 \mathbf{I}$

By Elementary Row Operations by Matrix Multiplication, each $\mathbf{E}_i$ on the right hand side corresponds to an elementary row operation, so applying the same sequence of operations to $\mathbf{I}$ yields $\mathbf{A}^{-1}$.

Hence the result.

This lends itself to a nice algorithm for computing the inverse of a matrix. I wonder if there's a prettier presentation, though? --GFauxPas 16:11, 24 February 2012 (EST)

Algorithm
Let $\mathbf{A}$ be an $n \times n$ matrix.

Let $\mathbf{I}$ be the identity matrix of order $n$.

Step 1: Form the augmented matrix $\left[{\mathbf{A \vert I}}\right]$.

Step 2: Apply Gauss-Jordan elimination to put the left block into reduced row echelon form.

Step 3: If the resulting matrix is $\left[{\mathbf{I \vert C}}\right]$, then $\mathbf{C} = \mathbf{A}^{-1}$. If not, i.e. if the block on the left side of the bar is not $\mathbf{I}$, then $\mathbf{A}$ is not invertible. (A worked example follows below.)

--GFauxPas 16:19, 24 February 2012 (EST)
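
For illustration, a small $2 \times 2$ example (not one of Fraleigh's): take $\mathbf{A} = \left[{\begin{array}{cc} 1 & 2 \\ 3 & 4 \end{array}}\right]$ and apply $r_2 \to r_2 - 3 r_1$, then $r_2 \to -\frac 1 2 r_2$, then $r_1 \to r_1 - 2 r_2$:

$\left[{\begin{array}{cc|cc} 1 & 2 & 1 & 0 \\ 3 & 4 & 0 & 1 \end{array}}\right] \to \left[{\begin{array}{cc|cc} 1 & 2 & 1 & 0 \\ 0 & -2 & -3 & 1 \end{array}}\right] \to \left[{\begin{array}{cc|cc} 1 & 2 & 1 & 0 \\ 0 & 1 & \frac 3 2 & -\frac 1 2 \end{array}}\right] \to \left[{\begin{array}{cc|cc} 1 & 0 & -2 & 1 \\ 0 & 1 & \frac 3 2 & -\frac 1 2 \end{array}}\right]$

The left block is $\mathbf{I}$, so $\mathbf{A}^{-1} = \left[{\begin{array}{cc} -2 & 1 \\ \frac 3 2 & -\frac 1 2 \end{array}}\right]$; multiplying out confirms $\mathbf{A} \mathbf{A}^{-1} = \mathbf{I}$.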