Why is Log the Antiderivative of 1/x?
October 30, 2024
The power rule has a weird deficiency that everyone just learns to deal with. Recall that the power rule tells us that the antiderivative of $x^n$ is

$$\frac{x^{n+1}}{n+1} + C,$$

which gives us our familiar relations: the antiderivative of $x$ is $\frac{x^2}{2}$, the antiderivative of $0$ is a constant, and the antiderivative of $\frac{1}{x}$ is... $\log x$? What?! This logarithm just seems to come out of nowhere. It's easy to prove algebraically that this is the case, but the proof doesn't make it feel any less weird, and most students end up treating it as a rule to be memorized. It doesn't have to be that way.
Log as a Limit
(To be clear, the proof that the derivative of $\log x$ is $\frac{1}{x}$ is quite simple.[1] My goal here is to bring a more intuitive understanding to the table.)
As with most edge cases and singularities in math, this one is better understood as a limit of local behavior. In this case, we can view the antiderivative of $x^{-1}$ as the limit of the antiderivatives of $x^{-0.9}$, $x^{-0.99}$, $x^{-0.999}$, and so on. That is, we should expect

$$\int x^{-1} \, dx = \lim_{\epsilon \to 0^+} \int x^{-1+\epsilon} \, dx.$$
Note that we need to pay special attention to the bounds of integration here. If we start the integral of $\frac{1}{x}$ from $0$, it diverges, so we need to start from $1$ instead. Thus, for any positive input $x$, we want

$$\int_1^x \frac{dt}{t} = \lim_{\epsilon \to 0^+} \int_1^x t^{-1+\epsilon} \, dt.$$
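To see the divergence concretely, here's a quick numerical sketch in Python (the helper name `integral_one_over_t` is just for illustration): a midpoint-rule estimate of $\int_\delta^1 \frac{dt}{t}$ grows without bound, like $-\log \delta$, as $\delta \to 0$.

```python
import math

def integral_one_over_t(delta, steps=100_000):
    """Midpoint-rule estimate of the integral of 1/t over [delta, 1]."""
    h = (1 - delta) / steps
    return sum(h / (delta + (i + 0.5) * h) for i in range(steps))

# The estimate tracks -log(delta), which blows up as delta -> 0.
for delta in [1e-1, 1e-2, 1e-3, 1e-4]:
    print(f"delta = {delta:g}: integral ~ {integral_one_over_t(delta):.4f}, "
          f"-log(delta) = {-math.log(delta):.4f}")
```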
Applying the power rule to the integral $\int_1^x t^{-1+\epsilon} \, dt$, the right-hand side becomes

$$\lim_{\epsilon \to 0^+} \int_1^x t^{-1+\epsilon} \, dt = \lim_{\epsilon \to 0^+} \frac{x^{\epsilon} - 1}{\epsilon},$$
which, on a graph, actually looks pretty close to a logarithm!
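If you don't have a graph handy, a small Python sketch (the helper name `near_log` is mine) makes the same point numerically: for a small fixed $\epsilon$, the curve $\frac{x^{\epsilon} - 1}{\epsilon}$ hugs $\log x$ across a wide range of inputs.

```python
import math

EPS = 0.01  # a small fixed exponent; smaller values hug the logarithm tighter

def near_log(x, eps=EPS):
    """Power-rule antiderivative of t**(eps - 1), integrated from 1 to x."""
    return (x ** eps - 1) / eps

# Tabulate the curve against log(x) across a range of inputs.
for x in [0.25, 1.0, 2.0, 5.0, 20.0]:
    print(f"x = {x:5}: (x^eps - 1)/eps = {near_log(x):8.4f}, log x = {math.log(x):8.4f}")
```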
So it would appear that our intuition is correct, and the local limiting behavior is as we expect. Can we do any better? Well, the next step would be to show that the above expression actually approaches a logarithm, i.e.

$$\lim_{\epsilon \to 0^+} \frac{x^{\epsilon} - 1}{\epsilon} = \log x,$$
which can be rewritten, substituting $n = 1/\epsilon$, as

$$\log x = \lim_{n \to \infty} n \left( x^{1/n} - 1 \right).$$
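You can also watch this rewritten limit converge directly; this Python sketch (the helper name `log_via_limit` is made up) compares $n \left( x^{1/n} - 1 \right)$ against `math.log` as $n$ grows.

```python
import math

def log_via_limit(x, n):
    """Finite-n approximation to log(x) from the limit n * (x**(1/n) - 1)."""
    return n * (x ** (1 / n) - 1)

# Larger n gives a better match to the true logarithm.
for x in [0.5, 2.0, 10.0]:
    for n in [10, 1_000, 100_000]:
        print(f"x = {x}, n = {n}: {log_via_limit(x, n):.6f} vs log x = {math.log(x):.6f}")
```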
Occasionally, this limit is used as a definition of $\log$ itself! But there's actually a very simple explanation. Remember that $\log$ is the inverse of $e^x$, which is defined as

$$e^x = \lim_{n \to \infty} \left( 1 + \frac{x}{n} \right)^n.$$
(Remember back to your introduction to $e$ as compound interest!) If we make the substitution $x \mapsto n \left( x^{1/n} - 1 \right)$, the above expression becomes

$$\left( 1 + \frac{n \left( x^{1/n} - 1 \right)}{n} \right)^n = \left( x^{1/n} \right)^n = x.$$

Now, just as the defining relationship between $e^x$ and $\log x$ is $e^{\log x} = x$, the computation above says the same thing about their limiting expressions: plugging the "log-like" expression $n \left( x^{1/n} - 1 \right)$ into the "exp-like" expression $\left( 1 + \frac{x}{n} \right)^n$ gives back exactly $x$, for every $n$. And as you can check for yourself, the same applies the other way around, just like $\log e^x = x$. So it's not a surprise that $\lim_{n \to \infty} n \left( x^{1/n} - 1 \right) = \log x$.
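This inverse relationship isn't just asymptotic; it holds exactly at every finite $n$, which a short Python sketch confirms (the names `E` and `L` are mine, standing in for the exp-like and log-like expressions):

```python
def E(x, n):
    """Finite-n stand-in for exp: (1 + x/n)**n."""
    return (1 + x / n) ** n

def L(x, n):
    """Finite-n stand-in for log: n * (x**(1/n) - 1)."""
    return n * (x ** (1 / n) - 1)

# E and L are exact inverses of each other at every finite n,
# not just in the limit -- mirroring e**log(x) = x and log(e**x) = x.
for n in [1, 2, 7, 50]:
    for x in [0.5, 1.0, 3.0, 10.0]:
        assert abs(E(L(x, n), n) - x) < 1e-9
        assert abs(L(E(x, n), n) - x) < 1e-9
```

In other words, the finite-$n$ expressions are already a perfect inverse pair; taking $n \to \infty$ just deforms them into $e^x$ and $\log x$.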
Another way to view the logarithm here is in terms of growth rate. $\log x$ is a subpolynomial function, which means that it grows slower than any polynomial.[2] This is what we should expect from the power rule: we want the antiderivative of $x^{-1}$ to look something like $x^0$, which would also be subpolynomial (since it's a constant). And although we don't quite get that answer, its subpolynomial nature is echoed in the logarithm.
Similarly, note that $e^x$ is superpolynomial, i.e. it grows faster than any polynomial. And just like how the inverse of $\log x$ is $e^x$, we have that the inverse of a subpolynomial function is a superpolynomial one. Or, less precisely, the inverse of $x^0$ is $x^{\infty}$, larger than any polynomial!
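To get a feel for just how slowly the logarithm grows, here's a small Python comparison against $x^{0.01}$: even that tiny power eventually overtakes $\log x$, but not until around $x = 10^{300}$.

```python
import math

# log x is subpolynomial: even x**0.01 overtakes it eventually --
# though "eventually" here means around x = 1e300!
eps = 0.01
for exponent in [10, 100, 200, 300]:
    x = 10.0 ** exponent
    print(f"x = 1e{exponent}: log x = {math.log(x):7.1f}, x^{eps} = {x ** eps:7.1f}")
```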
- Take $y = \log x$, so $e^y = x$. Differentiating both sides with respect to $x$, we get $e^y \frac{dy}{dx} = 1$, and therefore $\frac{dy}{dx} = \frac{1}{e^y} = \frac{1}{x}$. ↩
- If you're familiar with the notion of uniform convergence, perhaps from a real analysis course, then it's worth noting that the functions $\frac{x^{\epsilon} - 1}{\epsilon}$ don't converge to $\log x$ uniformly on $(0, \infty)$, precisely because of the logarithm's subpolynomial nature: for any fixed $\epsilon > 0$, the function $\frac{x^{\epsilon} - 1}{\epsilon}$ gets arbitrarily far away from the logarithm for large $x$. However, if we consider a positive bounded interval $[a, b]$ with $0 < a < b$, then we do indeed get uniform convergence. ↩