
A couple of weeks ago I was talking about logarithms and their use. They may seem a bit esoteric and they can cause a bit of head-scratching: a logarithm (base $b$) of a number $x$ is denoted by $\log_b(x)$ and can be better understood in terms of an exponential function, since the result is the power to which the base has to be raised to produce the number, i.e. $b^{\log_b(x)} = x$.
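As a quick sanity check, the inverse relationship between logarithm and exponentiation can be verified numerically (a minimal Python sketch of my own, not from the original discussion):

```python
import math

# log_b(x) is the power to which b must be raised to produce x:
# b ** log_b(x) == x (up to floating-point rounding).
b, x = 2.0, 8.0
y = math.log(x, b)              # logarithm of x in base b
print(math.isclose(b ** y, x))  # True: exponentiation undoes the log
```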

Nonetheless, using the base $e$ for logarithms is far more natural and, as a form of abbreviation, it is denoted by $\ln(x)$. In spite of this, a lot of us still learn about this function with the base 10 (written $\log_{10}(x)$, or simply $\log(x)$). Furthermore, the importance of the binary system makes it very useful to know about $\log_2(x)$. With that in mind, I ended up mentioning an approximation in which all three logarithms appear:

$$\log_2(x) \approx \ln(x) + \log_{10}(x).$$
This can be very handy, as you can approximate the value of a logarithm base 2 with a pedestrian calculator… of course, you can still do this using the functions that some calculators offer, but that is not the point of this post…
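Here is the approximation at work in a short Python sketch (my own illustration, not part of the original post):

```python
import math

def log2_approx(x):
    """Approximate log2(x) as ln(x) + log10(x)."""
    return math.log(x) + math.log10(x)

# Compare the approximation with the exact value for a few inputs.
for x in [2, 100, 1e6]:
    approx, exact = log2_approx(x), math.log2(x)
    print(f"x={x:>9}: approx={approx:.4f}  exact={exact:.4f}")
```

The two columns agree to within a fraction of a percent, which is what the rest of the post quantifies.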

Anyway, note that it is an approximation, and as such we can ask what relative error we incur when using it. Well, let us have a look: if the above expression holds, we can divide both sides by $\log_2(x)$ to obtain:

$$\frac{\ln(x) + \log_{10}(x)}{\log_2(x)} \approx 1.$$

So the relative error can be expressed as:

$$\varepsilon = \left| 1 - \frac{\ln(x) + \log_{10}(x)}{\log_2(x)} \right|.$$

That is all fine and good, but what is the value of that error? Well, let us use some of the properties of logarithms. We know that $\log_2(x) = \ln(x)/\ln(2)$. Also, it is true that $\log_{10}(x) = \ln(x)/\ln(10)$, and thus we can say that $\ln(x) + \log_{10}(x) = \ln(x)\left(1 + \frac{1}{\ln(10)}\right)$. From this it follows that $\frac{\ln(x) + \log_{10}(x)}{\log_2(x)} = \ln(2)\left(1 + \frac{1}{\ln(10)}\right)$. Nonetheless, we know that $\frac{\ln(2)}{\ln(10)} = \log_{10}(2)$. Rearranging the terms we end up with the fact that:

$$\frac{\ln(x) + \log_{10}(x)}{\log_2(x)} = \ln(2) + \log_{10}(2) \approx 0.9942.$$

Using that expression we can re-write the relative error as follows:

$$\varepsilon = \left| 1 - \bigl(\ln(2) + \log_{10}(2)\bigr) \right| \approx 0.0058,$$

i.e. about 0.58%, independently of $x$.

Not too big an error! Let us have a look at a concrete case: for $x = 10$ the approximation gives $\ln(10) + \log_{10}(10) \approx 3.3026$, whereas the actual value is closer to $\log_2(10) \approx 3.3219$.
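To double-check, the constant relative error can be computed directly (again a small Python sketch of my own):

```python
import math

# The ratio (ln x + log10 x) / log2 x collapses to a constant:
ratio = math.log(2) + math.log10(2)   # ln(2) + log10(2)
rel_error = 1 - ratio
print(f"ratio     = {ratio:.6f}")     # ~0.994177
print(f"rel error = {rel_error:.4%}") # ~0.58%

# The error is the same for any particular x, e.g. x = 10:
x = 10
print(1 - (math.log(x) + math.log10(x)) / math.log2(x))
```

Because the ratio does not depend on $x$, the approximation under- or over-shoots by the same ~0.58% everywhere, which is why it is safe to use with a pedestrian calculator.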