# Time Taken for Body to Fall at Earth's Surface

## Theorem

Let an object $m$ be released from rest at a point above the ground near the Earth's surface and allowed to fall freely.

Let $m$ fall a distance $s$ in time $t$.

Then:

$s = \dfrac 1 2 g t^2$

or:

$t = \sqrt {\dfrac {2 s} g}$

where $g$ is the Acceleration Due to Gravity at the height through which $m$ falls.

It is supposed that the distance $s$ is small enough that $g$ can be considered constant throughout.

## Proof

By the standard result for a body moving under constant acceleration $\mathbf a$ with initial velocity $\mathbf u$, the displacement $\mathbf s$ after time $t$ is:

$\mathbf s = \mathbf u t + \dfrac {\mathbf a t^2} 2$

Here the body falls from rest, so:

$\mathbf u = \mathbf 0$

and its acceleration is the acceleration due to gravity:

$\mathbf a = \mathbf g$

Thus:

$\mathbf s = \dfrac {\mathbf g t^2} 2$

and so taking magnitudes:

$s = \dfrac {g t^2} 2$

It follows by multiplying by $\dfrac 2 g$ that:

$t^2 = \dfrac {2 s} g$

whence:

$t = \sqrt {\dfrac {2 s} g}$

$\blacksquare$
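The two forms of the theorem can be checked numerically. The following is a minimal sketch: the function names and the sample value $g = 9.81 \ \mathrm{m/s^2}$ are illustrative assumptions, not part of the theorem.

```python
import math

def fall_time(s, g=9.81):
    """Time t = sqrt(2 s / g) for a body released from rest
    to fall a distance s (metres), with g held constant."""
    return math.sqrt(2 * s / g)

def fall_distance(t, g=9.81):
    """Distance s = g t^2 / 2 fallen from rest in time t (seconds)."""
    return g * t ** 2 / 2

# A body dropped from 20 m takes about 2.02 s to reach the ground:
t = fall_time(20.0)

# The two formulas are inverses: falling for that time covers 20 m again.
s = fall_distance(t)
```

Note that the round trip `fall_distance(fall_time(s))` recovers `s`, reflecting that the second formula is obtained from the first by solving for $t$.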