A comment on a recent post led me to a page of series on Wikipedia. The last series on that page caught my eye:

∑_{n = −∞}^{∞} exp(−πn²) = π^{1/4} / Γ(3/4)
It’s a lot more common to see exp(−πx²) inside an integral than inside a sum. If the summation symbol were replaced with an integration sign, the integral would be 1. You could derive this from scratch using the polar coordinate trick, or you could look at it and see that the integrand is the PDF of a Gaussian random variable with mean 0 and variance 1/(2π).
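A quick numerical sanity check of the integral claim, using a simple trapezoid rule (the helper function here is just for illustration, not from the post):

```python
import math

# Trapezoid rule over [a, b]; the integrand decays so fast that
# truncating at |x| = 10 loses nothing in double precision.
def integral(f, a, b, n=200000):
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# exp(-pi x^2) is the N(0, 1/(2*pi)) density: substituting sigma^2 = 1/(2*pi)
# into exp(-x^2/(2 sigma^2)) / sqrt(2 pi sigma^2) gives exactly this function.
approx = integral(lambda x: math.exp(-math.pi * x * x), -10.0, 10.0)
print(approx)  # very close to 1
```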
Last week I wrote about the Pi function Π(z) = Γ(z + 1) and how some equations are simpler or easier to remember when expressed in terms of Π rather than Γ, and the equation above is an example. Since Γ(3/4) = Π(−1/4), it can be rewritten as

∑_{n = −∞}^{∞} exp(−πn²) = π^{1/4} / Π(−1/4)
It’s curious that there’s such a similarity between the constant pi and the function Pi in the denominator. After all, what’s the connection between the two? Their names come from “perimeter” and “product”, both of which start with “p.” That explains a linguistic connection but not a mathematical connection. And yet π and Π often appear together. For example, the volume of an n-dimensional sphere of radius r is

V_n(r) = π^{n/2} r^n / Π(n/2)
We could rewrite this, changing π^{n/2} in the numerator to π^{−n/2} in the denominator, making it look more like the equation for the sum above, except the sign of the exponent on π would be the opposite of the sign of the argument to Π.
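The volume formula is easy to check numerically, since Π(z) = Γ(z + 1) is available as `math.gamma(z + 1)`:

```python
import math

# Pi function: Pi(z) = Gamma(z + 1)
def Pi(z):
    return math.gamma(z + 1)

# Volume of an n-dimensional ball of radius r: pi^(n/2) r^n / Pi(n/2)
def ball_volume(n, r=1.0):
    return math.pi ** (n / 2) / Pi(n / 2) * r ** n

print(ball_volume(2))  # area of the unit disk: pi
print(ball_volume(3))  # volume of the unit ball: 4*pi/3
```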
Incidentally, the summation result above says that we could define a probability distribution on the integers with probability mass function

f(n) = Γ(3/4) π^{−1/4} exp(−πn²)
which would be a sort of discrete analog of the normal distribution. I imagine someone has done this before and given it a name.
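Here is a small sketch of this discrete distribution. The terms decay so quickly that truncating the sum at |n| ≤ 10 is far more than enough in double precision:

```python
import math

# Normalizing constant: sum over all integers of exp(-pi n^2),
# which the series above says equals pi^(1/4) / Gamma(3/4).
Z = sum(math.exp(-math.pi * n * n) for n in range(-10, 11))
exact = math.pi ** 0.25 / math.gamma(0.75)
print(Z, exact)  # both about 1.086435

# Probability mass function of the proposed distribution on the integers
def pmf(n):
    return math.exp(-math.pi * n * n) / Z

total = sum(pmf(n) for n in range(-10, 11))
print(total)  # 1.0
```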
What would happen if we gave the distribution a noninteger mean μ, i.e. terms exp(−π(n−μ)²)? Does the normalising constant vary depending on the fractional part of μ?
Oscar, I find that when mu = 0 that sum is 1.086435 (which equals the exact value in the post), and when mu = 1/2 that sum is 0.9135791. These have average 1.000007, which is closer to 1 than we have any right to expect for a generic function – I suspect there’s some deeper reason.
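There is indeed a deeper reason, and a standard way to see it is Poisson summation (my own gloss, not from the comment): the sum S(μ) = ∑ exp(−π(n−μ)²) equals ∑_k exp(−πk²) cos(2πkμ), so the odd-k terms cancel when averaging S(0) and S(1/2), leaving 1 + 2 exp(−4π) + … ≈ 1.0000069, matching the 1.000007 observed above. A numerical check:

```python
import math

# S(mu) = sum over integers n of exp(-pi (n - mu)^2), truncated at |n| <= 20
def S(mu, N=20):
    return sum(math.exp(-math.pi * (n - mu) ** 2) for n in range(-N, N + 1))

s0, s_half = S(0.0), S(0.5)
avg = (s0 + s_half) / 2

# Poisson summation: S(mu) = sum_k exp(-pi k^2) cos(2 pi k mu).
# Averaging mu = 0 and mu = 1/2 cancels odd k, leaving 1 + 2 exp(-4 pi) + ...
print(s0, s_half, avg)               # about 1.086435, 0.913579, 1.0000069
print(1 + 2 * math.exp(-4 * math.pi))
```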
I’d be curious to see a proof of the discrete Gaussian series. I spent a couple hours searching online without any success.
But that’s “just” one of the many relations and special values coming from identities of theta functions:
https://en.wikipedia.org/wiki/Theta_function
(If you want to find even more of them, have a look at this book by Shaun Cooper: https://www.springer.com/gp/book/9783319561714)
Here’s the general result in terms of theta functions: https://dlmf.nist.gov/20.13#E4
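In theta-function notation, the series in the post is θ₃(0, q) = ∑ q^{n²} evaluated at q = exp(−π), and the classical special value is θ₃(0, exp(−π)) = π^{1/4}/Γ(3/4). A quick numerical check of that special value:

```python
import math

# theta_3(0, q) = 1 + 2 * sum_{n >= 1} q^(n^2), evaluated at q = exp(-pi).
# This is exactly the series discussed in the post.
q = math.exp(-math.pi)
theta3 = 1 + 2 * sum(q ** (n * n) for n in range(1, 20))

closed_form = math.pi ** 0.25 / math.gamma(0.75)
print(theta3, closed_form)  # both about 1.086435
```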