User:GFauxPas/Sandbox

Welcome to my sandbox, you are free to play here as long as you don't track sand onto the main wiki. --GFauxPas 09:28, 7 November 2011 (CST)


 * $u \ v \ \mathsf{u} \ \mathsf{v} \ \nu \ \upsilon$

Anyone else have a hard time distinguishing between $u$ and $v$? I would like it to look more like this; does it confuse anyone else? It seems PW doesn't have the upgreek package. --GFauxPas 07:49, 27 January 2012 (EST)


 * Nope. Multiple years of extensive TeX writing and reading have trained my eye. I agree that the referenced $v$ looks more distinguished, but I imagine it is hard to implement. --Lord_Farin 08:08, 27 January 2012 (EST)

== Convergence ==
We have that $\displaystyle \lim_{n \to +\infty}a_n = 0$, by hypothesis.

To show that $\displaystyle \sum_{k=1}^n{n \choose k}\frac {{a_n}^{k-1}} {n^k}$ converges, observe that, for $n$ large enough:

Recall that $a_n \to 0$, by hypothesis.

This means that $0 \le \left\vert{a_n}\right\vert \le 1$ for sufficiently large $n$, because $a_n \to 0$.

Then, we can say:


 * $\displaystyle \left \vert{ \frac{ 1 - {a_n }^k } {1 - a_n} }\right\vert \le \left \vert{ \frac{ 1 - a_n } {1 - a_n} }\right\vert = 1$ as $n \to +\infty$, because $k \ge 1$.

Hence $\displaystyle \left \vert{\sum_{k=1}^n{n \choose k}\frac { {a_n}^{k-1} } {n^k} }\right\vert$ converges, by the Comparison Test.

That means that $\displaystyle \sum_{k=1}^n{n \choose k}\frac { {a_n}^{k-1} } {n^k}$ converges as well, because Absolutely Convergent Series is Convergent.
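As a numerical sanity check (not part of the proof; the sample null sequence $a_n = 1/n$ is assumed here for illustration), the sum can be evaluated directly. Note that multiplying it by $a_n$ and applying the Binomial Theorem gives $a_n \displaystyle \sum_{k=1}^n{n \choose k}\frac {{a_n}^{k-1}} {n^k} = \left({1 + \frac {a_n} n}\right)^n - 1$, so the sum should stay bounded:

```python
from math import comb

# Sanity check (not a proof): evaluate
#   S_n = sum_{k=1}^n C(n, k) * a_n^(k-1) / n^k
# for the sample null sequence a_n = 1/n.  Multiplying the sum by a_n
# and applying the Binomial Theorem gives a_n * S_n = (1 + a_n/n)^n - 1,
# so S_n is expected to stay bounded (indeed, to approach 1).

def S(n, a):
    """Evaluate sum_{k=1}^n C(n, k) * a^(k-1) / n^k for a != 0."""
    # Group (a/n)^k together so intermediate values stay in float range.
    return sum(comb(n, k) * (a / n) ** k for k in range(1, n + 1)) / a

for n in (10, 100, 1000):
    print(n, S(n, 1.0 / n))  # values stay close to 1
```

This only illustrates boundedness for one choice of $a_n$; it says nothing about the general (complex) case discussed below on this page.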

Is this okay? --GFauxPas 08:54, 6 March 2012 (EST)


 * Unfortunately, no. I think I got carried away when I said that the argument generalized to arbitrary complex sequences. However, it functions in the sense that we have demonstrated that the sequence is bounded. This means that we can estimate the (modulus of the) product of this thing with $a_n$ by a fixed number times the modulus of $a_n$, which then converges to zero. This will suffice to prove that the limit of the whole expression is zero because $|a_n|\to0$ does imply $a_n\to0$ (unlike when $0$ is replaced by another complex number). These tedious considerations are necessary because, as mentioned before, the sequence is not in the form of a series, hence results for series can't be applied. I hope you grasp the sketchy adapted approach. --Lord_Farin 13:56, 6 March 2012 (EST)
 * But we don't care what $\displaystyle \sum_{k=1}^n{n \choose k}\frac {{a_n}^{k-1}} {n^k}$ converges to, just that it converges to some number... Just to try to separate out what I know from what I don't know: is my proof legitimate for $a_n$ being a real sequence? I'm trying to find a way to deal with this strange not-a-series sequence that kind of looks like a series but isn't, but maybe it is, just a little. I'll leave complex analysis for another time. --GFauxPas 14:08, 6 March 2012 (EST)


 * I'm not so sure anymore. The proof as it stands now seems to be guaranteed to work only when all $a_n$ are positive reals. However, the approach sketched for the complex case works; a page establishing $a_n\to0, |b_n|\le B\implies a_nb_n \to 0$ should be created (if it doesn't exist already). That's the theorem we want to invoke. --Lord_Farin 18:04, 6 March 2012 (EST)
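A sketch of the bounded-times-null theorem mentioned above (assuming $\left\vert{b_n}\right\vert \le B$ for all $n$ and $a_n \to 0$; standard squeeze argument):

```latex
% Bounded sequence times null sequence is null:
% if |b_n| <= B for all n and a_n -> 0, then
\[
0 \le \left\vert{a_n b_n}\right\vert
    = \left\vert{a_n}\right\vert \left\vert{b_n}\right\vert
    \le B \left\vert{a_n}\right\vert \to 0
\]
% so |a_n b_n| -> 0 by the Squeeze Theorem, hence a_n b_n -> 0.
```

This works verbatim for complex $a_n, b_n$, since $\left\vert{a_n}\right\vert \to 0$ implies $a_n \to 0$.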

== Several Variables ==

I need to find a better name for functions of the form $f: \R^n \to \R$. Larson calls them "functions of several variables", but in his book there's less need for precision than on PW, because there the context is clear.

State Larson's Def'n:
 * Let $f$ be a "real function of several variables" (?), i.e., its domain is a subset of $\R^n$ and its codomain is a subset of $\R$.


 * $f$ is said to be differentiable at some $\mathbf x_0$ iff $\exists \Delta f (\mathbf x)$:

such that $\varepsilon_1, \varepsilon_2, \cdots, \varepsilon_n \to 0$ as $\Delta x_1, \Delta x_2, \cdots, \Delta x_n \to 0$ simultaneously. Figure out the best way to present this clause so as to make sure the reader understands they all have to approach $0$ at the same time.
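For reference, the displayed formula this clause presumably attaches to is Larson's increment form (a reconstruction from context; the notation is assumed, not quoted):

```latex
% Increment form of differentiability at x_0 (reconstruction):
\[
\Delta f = \frac {\partial f} {\partial x_1} \Delta x_1 + \cdots + \frac {\partial f} {\partial x_n} \Delta x_n
    + \varepsilon_1 \Delta x_1 + \cdots + \varepsilon_n \Delta x_n
\]
% with each epsilon_i -> 0 as all Delta x_i -> 0 simultaneously.
```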

Perhaps I should stipulate first that the partials all need to exist, or else there's nothing to talk about...

Explanation, going either before the def'n, after the def'n, or maybe even on a separate page.

To understand the motivation for this, consider a differentiable real function $y = f(x)$.

That is, iff the real function is differentiable, $\varepsilon \to 0$ as $\Delta x \to 0$.
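The one-variable formula this refers to is presumably the standard increment form (a reconstruction; notation assumed):

```latex
% One-variable increment form (reconstruction):
\[
\Delta y = f' \left({x}\right) \Delta x + \varepsilon \, \Delta x
\]
% where epsilon -> 0 as Delta x -> 0.
```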

--GFauxPas 08:51, 1 April 2012 (EDT)


 * I would say that, rather than writing '$\Delta x_1, \Delta x_2, \cdots, \Delta x_n \to 0$ all at the same time', it would be a good idea to write $\Delta \mathbf x \to 0$ and use a definition of, or reference to, the limit of a vector-valued function. This allows the definition to more closely resemble the one-dimensional case. Maybe even use $\underline \varepsilon$ as a vector, and write $\underline \varepsilon \cdot \Delta \mathbf x$ instead of the sum $\sum_{i=1}^n \varepsilon_i \Delta x_i$. Defining the gradient of $f$ (the vector of partial derivatives) can help to further reduce the summations. --Lord_Farin 09:32, 1 April 2012 (EDT)
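Written out, the suggested vector form would read something like this (a sketch of the proposal, not a settled definition; $\underline \varepsilon$ and $\nabla f$ as proposed above):

```latex
% Vectorized increment form (sketch of the proposal above):
\[
\Delta f = \nabla f \left({\mathbf x_0}\right) \cdot \Delta \mathbf x
    + \underline \varepsilon \cdot \Delta \mathbf x,
    \qquad \underline \varepsilon \to \mathbf 0 \text{ as } \Delta \mathbf x \to \mathbf 0
\]
```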


 * $f: \mathbf x \mapsto f(\mathbf x)$ is differentiable iff $\exists \Delta f(\mathbf x)$ such that:

?

By the way, there seem to be any number of ways to pronounce $\partial$ and $\nabla$, what ways do you hear commonly?

Also, for the definition of gradient, would this definition be too informal?

? Rather than

would that be better?--GFauxPas 09:59, 1 April 2012 (EDT)


 * Usually, I hear 'del' or 'dau' for $\partial$, and 'nabla' for $\nabla$. I would say the 'informal' definition is not really informal, but rather can be viewed as an approach from a functional analytic perspective (which might be undesirable) and is bound to lead to confusion; therefore, I suggest going with the second def'n. --Lord_Farin 10:24, 1 April 2012 (EDT)

That looks very interesting! --GFauxPas 11:47, 1 April 2012 (EDT)


 * Usually, things like this are used to justify the abuse of notation of writing column vectors as row vectors; if this is to be avoided, consider using the transposition operator $x \mapsto x^T$ on matrices.
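For instance, treating both $\nabla f$ and $\Delta \mathbf x$ as column vectors, the dot product can be written without the row-vector abuse as (illustration only):

```latex
% Dot product via the transposition operator, both factors column vectors:
\[
\nabla f \left({\mathbf x_0}\right) \cdot \Delta \mathbf x
    = \left({\nabla f \left({\mathbf x_0}\right)}\right)^{\mathsf T} \Delta \mathbf x
\]
```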


 * In books it wastes paper, but on a computer screen it's not so bad. Just takes getting used to. --prime mover 17:10, 1 April 2012 (EDT)