Section 20.3 Digging deeper and finding limits
So where does it go? To answer this, we will look at a very different graph!
The fundamental observation we need is that \(\tau(n)\) is precisely the number of positive lattice points \((x,y)\) such that \(xy=n\). Before going on, spend some time convincing yourself of this.
Subsection 20.3.1 Moving toward a proof
To be more in line with our previous notation, \(\tau(n)\) is exactly the number of points \(\left(d,\frac{n}{d}\right)\), so that \(d\cdot\frac{n}{d}=n\).
So \(\sum_{k=1}^n \tau(k)\) is the number of lattice points on or under the hyperbola \(y=n/x\).
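We can check this lattice-point interpretation numerically. Here is a minimal Python sketch (the helper names are my own, not from any particular library):

```python
def tau(k):
    """Number of positive divisors of k (naive trial division)."""
    return sum(1 for d in range((1), k + 1) if k % d == 0)

def lattice_points_under_hyperbola(n):
    """Count positive lattice points (x, y) with x*y <= n,
    i.e. points on or under the hyperbola y = n/x."""
    return sum(1 for x in range(1, n + 1)
                 for y in range(1, n + 1) if x * y <= n)

n = 30
assert sum(tau(k) for k in range(1, n + 1)) == lattice_points_under_hyperbola(n)
```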
This is a completely different way of thinking of the divisor function! We can see it for various sizes below.
So what we will do is try to look at the lattice points as approximating (you guessed it) an area! Just like with the sum of squares function.
For each lattice point involved in \(\sum_{k=1}^n \tau(k)\), we put a unit square to the lower right.
Subsection 20.3.2 The Error as our Guide
We are basically interpreting the lattice points as two different sums.
- We can think of it as \(\sum_{k=1}^n \tau(k)\), adding up the lattice points along each hyperbola.
- We can think of it as \(\sum_{k=1}^n \left\lfloor\frac{n}{k}\right\rfloor\), adding up the lattice points in each vertical column.
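These two ways of counting the same lattice points must give the same total, and we can verify that directly (a quick Python sketch; the variable names are my own):

```python
def tau(k):
    """Number of positive divisors of k."""
    return sum(1 for d in range(1, k + 1) if k % d == 0)

n = 50
# Count along each hyperbola x*y = k, for k = 1, ..., n.
by_hyperbolas = sum(tau(k) for k in range(1, n + 1))
# Count down each vertical column x = k: there are floor(n/k) points.
by_columns = sum(n // k for k in range(1, n + 1))
assert by_hyperbolas == by_columns
```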
It should be clear that the area is “about” \[\int_1^n \frac{n}{x}dx=n\ln(x)\biggr\vert_1^n=n\ln(n)-n\ln(1)=n\ln(n)\; .\] Why is this actually a good estimate, though? The answer is in the error!
Look at the shaded difference between the area under the curve (which is \(n\ln(n)\)) and the area of the red squares (which is the sum of all the \(\tau\) values).
- All the areas where the red squares are above the hyperbola add up to less than \(n\), because they are all 1 in width or less, and do not intersect vertically (they stack, as it were).
- Similarly, all the areas where the hyperbola is higher add up to less than \(n\), because they are all 1 in height or less, and are horizontally non-intersecting.
That means three things:
- The error \(\sum_{k=1}^n \tau(k)-n\ln(n)\) is a number less than \(n\) minus a (different) number less than \(n\).
- So the error is certainly \(O(n)\) (less than some multiple of \(n\) as \(n\) gets huge).
- So, the error in the average is less than some constant as \(n\) gets huge! I.e., \[\frac{1}{n}\sum_{k=1}^n \tau(k)-\ln(n)=O(1)\]
We can verify this graphically by plotting the average value against \(\ln(n)\).
Lookin' good! There does seem to be some predictable error. What might it be?
Keeping \(x=0\) in view, it seems to be somewhat less than \(0.2\), although the error clearly bounces around. By zooming in, we see the error bouncing around roughly between \(0.15\) and \(0.16\), more or less, as \(x\) gets large. So will this give us something more precise?
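We can reproduce this numerically as well. The Python sketch below uses the column-count formula for the running total of \(\tau\); the loose bounds in the assertion reflect the bouncing we see in the graphic:

```python
import math

def divisor_sum(n):
    # Sum of tau(k) for k <= n, computed column-by-column as sum of floor(n/k).
    return sum(n // k for k in range(1, n + 1))

for n in (1000, 5000, 20000):
    err = divisor_sum(n) / n - math.log(n)
    # The error in the average stays bounded, hovering a bit above 0.15.
    assert 0.13 < err < 0.18
```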
Subsection 20.3.3 Getting a Handle on Error
To answer this, we will try one more geometric trick.
Notice we have now divided the lattice points into three parts.
- The ones on the line \(y=x\).
- The lattice points above the line and below the hyperbola.
- The lattice points to the right of the line and below the hyperbola.
- There are exactly \(\lfloor\sqrt{n}\rfloor\leq \sqrt{n}\) on the line.
- At each integer \(y\)-value \(d\) up to \(y=\sqrt{n}\), there are \(\lfloor n/d\rfloor-d\) points above the line and below the hyperbola.
- At each integer \(x\)-value \(d\) up to \(x=\sqrt{n}\), there are \(\lfloor n/d\rfloor-d\) points to the right of the line and below the hyperbola.
So \[\sum_{k=1}^n \tau(k)=\sum_{d\leq \sqrt{n}} (\lfloor n/d\rfloor-d)+\sum_{d\leq \sqrt{n}} (\lfloor n/d\rfloor-d) +\lfloor\sqrt{n}\rfloor\leq 2\sum_{d\leq \sqrt{n}} (n/d-d) +\sqrt{n}\] where the error involved is certainly less than one for each \(d\), so the total error is at most \(2\sqrt{n}+1=O(\sqrt{n})\).
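Before we drop the floors, the count above is an exact identity, and we can confirm it two ways in a short Python sketch (the function names are my own):

```python
import math

def tau_sum_direct(n):
    # Sum of tau(k) for k <= n, counted column-by-column.
    return sum(n // k for k in range(1, n + 1))

def tau_sum_hyperbola(n):
    # The hyperbola-method count: both triangular regions plus the line y = x.
    s = math.isqrt(n)
    return 2 * sum(n // d - d for d in range(1, s + 1)) + s

# The two counts agree exactly for every n.
for n in range(1, 200):
    assert tau_sum_direct(n) == tau_sum_hyperbola(n)
```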
Thus we can rewrite this, using a well-known identity, as \[\sum_{k=1}^n \tau(k)=2n\sum_{d\leq \sqrt{n}}\frac{1}{d}-2\sum_{d\leq \sqrt{n}}d+O(\sqrt{n})= 2n\sum_{d\leq \sqrt{n}}\frac{1}{d}-2\left(\frac{\lfloor\sqrt{n}\rfloor(\lfloor\sqrt{n}\rfloor+1)}{2}\right)+O(\sqrt{n})\, .\] The difference between \(\left(\frac{\lfloor\sqrt{n}\rfloor(\lfloor\sqrt{n}\rfloor+1)}{2}\right)\) and \(\frac{n}{2}\) is itself \(O(\sqrt{n})\) (and negative to boot), so we finally get that \[\sum_{k=1}^n \tau(k)=2n\sum_{d\leq \sqrt{n}}\frac{1}{d}-n+O(\sqrt{n})\Rightarrow \frac{1}{n}\sum_{k=1}^n \tau(k)=2\sum_{d\leq \sqrt{n}}\frac{1}{d}-1+O(1/\sqrt{n})\; .\]
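We can confirm numerically that replacing \(\lfloor n/d\rfloor\) by \(n/d\) and the triangular number by \(n/2\) really does cost only \(O(\sqrt{n})\) (a Python sketch; the constant \(3\) in the bound is just a comfortable choice):

```python
import math

def approximation_gap(n):
    # Gap between the exact count of tau(1) + ... + tau(n)
    # and the smooth approximation 2n * sum_{d <= sqrt(n)} 1/d - n.
    s = math.isqrt(n)
    exact = sum(n // k for k in range(1, n + 1))
    approx = 2 * n * sum(1.0 / d for d in range(1, s + 1)) - n
    return abs(exact - approx)

for n in (100, 1000, 10000):
    # The gap grows no faster than a multiple of sqrt(n).
    assert approximation_gap(n) < 3 * math.sqrt(n)
```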
Subsection 20.3.4 The end of the story
We're almost at the end of the story! Now, we just need to relate this sum to \(\ln(n)\). (Remember, we already are pretty well convinced that \(\ln(n)\) is close to the average value of \(\tau\); the problem is what the error is.)
This graphic shows the exact difference between \(\sum_{k=1}^{m-1} \frac{1}{k}\) and \(\ln(m)\). Even as \(m\to\infty\), the total area is simply the sum of a bunch of nearly-triangles, each of width exactly one, whose heights do not overlap (the same stacking idea as before) and so add up to less than \(1\). So the difference between \(\sum_{k=1}^{m-1} \frac{1}{k}\) and \(\ln(m)\) stays finite as \(m\to\infty\).
This limiting difference is the famous Euler–Mascheroni constant, \(\gamma\approx 0.5772\). You have almost certainly never heard of this number, but it is very important. There is even an entire book about it, by Julian Havil. It's a pretty good book, in fact!
Remark 20.3.2
Among other crazy properties, it is the derivative of a generalization of the factorial function, called (wait for it) Gamma (\(\Gamma\)).
Notice that the “missing” part of the area (since we can't actually view all the way out to infinity) must be less than \(1/m\), since it lies below all the pieces we can see in the graphic for any given \(m\). So \(\gamma\) is within \(O(1/m)\) of any given finite sum \(\sum_{k=1}^{m-1} \frac{1}{k}-\ln(m)\).
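A quick Python sketch confirms both the convergence to \(\gamma\) and the \(O(1/m)\) rate (the decimal value of \(\gamma\) is hard-coded here):

```python
import math

def harmonic_minus_log(m):
    # sum_{k=1}^{m-1} 1/k  -  ln(m)
    return sum(1.0 / k for k in range(1, m)) - math.log(m)

GAMMA = 0.5772156649  # Euler-Mascheroni constant, to ten places

for m in (100, 1000, 10000):
    # The partial differences are within 1/m of gamma.
    assert abs(harmonic_minus_log(m) - GAMMA) < 1.0 / m
```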
Now we put it all together! We know from above that \[\frac{1}{n}\sum_{k=1}^n \tau(k)=2\sum_{d\leq \sqrt{n}}\frac{1}{d}-1+O(1/\sqrt{n})\, .\] Further, we can now substitute in the following for \(\sum_{d\leq \sqrt{n}}\frac{1}{d}\): \[\sum_{d\leq \sqrt{n}}\frac{1}{d}= \ln(\sqrt{n})+\gamma+O(1/\sqrt{n})\; .\] Once we do that, and take advantage of the log fact \(2\ln(z)=\ln\left(z^2\right)\), we get \[\frac{1}{n}\sum_{k=1}^n \tau(k)= \ln(n)+(2\gamma-1)+O(1/\sqrt{n})\, .\] That is exactly the asymptote and type of error that I have depicted below!
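The final asymptotic claim also checks out numerically (a Python sketch; the constant \(4\) in the error bound is just a safe choice):

```python
import math

GAMMA = 0.5772156649  # Euler-Mascheroni constant

def avg_tau(n):
    # Average of tau(1), ..., tau(n), via the column-count formula.
    return sum(n // k for k in range(1, n + 1)) / n

for n in (1000, 10000, 100000):
    err = avg_tau(n) - math.log(n) - (2 * GAMMA - 1)
    # The error in the average shrinks like O(1/sqrt(n)).
    assert abs(err) < 4 / math.sqrt(n)
```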
Note that it's not hard to prove that the average of \(\tau\) grows at least as fast as \(\ln(n)\), so this is a fairly sharp result. (It turns out that it's even possible to show that the error in the average is in fact \(O(1/\sqrt[3]{n})\), but is not \(O(1/\sqrt[4]{n})\).)