Number of Bits for Decimal Integer

Theorem
Let $n \in \Z_{>0}$ be a (strictly) positive integer.

Let $n$ have $m$ digits when expressed in decimal notation.

Then $n$ may require as many as $\left\lceil{\dfrac m {\log_{10} 2} }\right\rceil$ bits to represent it.
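This bound can be checked numerically. The following sketch (in Python, not part of the original statement) compares the bound $\left\lceil{\dfrac m {\log_{10} 2} }\right\rceil$ against the actual bit length of the largest $m$-digit number; the helper name `max_bits_for_digits` is an illustrative choice, not from the source.

```python
import math

def max_bits_for_digits(m):
    # Upper bound from the theorem: ceil(m / log10(2)) bits
    # for any m-digit decimal integer.
    return math.ceil(m / math.log10(2))

# The largest m-digit number, 10^m - 1, needs the most bits,
# so it suffices to check the bound against it:
for m in range(1, 16):
    n = 10 ** m - 1
    assert n.bit_length() <= max_bits_for_digits(m)
```

For example, with $m = 1$ the bound gives $\left\lceil{1 / \log_{10} 2}\right\rceil = 4$ bits, and indeed the single-digit number $9 = 1001_2$ requires all $4$ of them, so the bound is attained.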

Proof
Let $d$ be the number of bits needed to represent $n$ in binary notation.

Let $n$ have $m$ digits.

Then:
 * $n \le 10^m - 1$

and so: