Differentiability of Moment Generating Functions

shakalandro

New member
Joined
Nov 29, 2008
Messages
36
So I have been taking a probability class and a class on real analysis. The moment generating function of a probability density function f is defined to be \(\displaystyle \int_{-\infty}^{\infty}e^{tx}f(x)\,dx\), and then the generation of moments is done by differentiating through the integral sign. However, in real analysis we learn that this is only justified if the integrand and the differentiated integrand are uniformly convergent when integrated. I have been unable to find a proof of the uniform convergence of a moment generating function. This also raises the question of whether there is a limit to the number of differentiations that can be applied to a given moment generating function before we lose uniform convergence. So, my request is for an answer to these questions.
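To see the mechanics the question describes, here is a small sympy sketch (assuming the standard normal density as the example distribution): it computes the MGF by integrating \(e^{tx}f(x)\) and then generates moments by differentiating at \(t = 0\).

```python
import sympy as sp

t, x = sp.symbols('t x', real=True)

# Example density: the standard normal
f = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)

# M(t) = E[e^{tX}] = integral of e^{tx} f(x) dx over the real line
M = sp.simplify(sp.integrate(sp.exp(t * x) * f, (x, -sp.oo, sp.oo)))
# For the standard normal this simplifies to exp(t**2 / 2)

# The n-th moment is the n-th derivative of M evaluated at t = 0
m1 = sp.simplify(sp.diff(M, t, 1).subs(t, 0))   # E[X]
m2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # E[X^2]
m4 = sp.simplify(sp.diff(M, t, 4).subs(t, 0))   # E[X^4]
```

For the standard normal this recovers the moments 0, 1, and 3, and the MGF exists for every real t, so the differentiation step never runs into the convergence problem the question raises; the exponential distribution below is where it does.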
 
What does "unable to find" mean? Let's see your efforts. It is a worthwhile exercise.
 
I meant I haven't been able to find an answer on the internet, but I have made some efforts myself. We know that f(x) integrates to one, so we could drop the f(x) term entirely, but then we would have numerous examples of non-uniform convergence, such as the exponential distribution. I guess I don't really know enough about the nature of these functions to be able to prove it in general from the definition of uniform convergence.
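The exponential distribution mentioned here is a good concrete case: with rate \(\lambda\), the defining integral \(\int_0^\infty e^{tx}\lambda e^{-\lambda x}\,dx\) converges only for \(t < \lambda\). A quick sympy check (assuming rate 1, with sample points t = 1/2 and t = 2):

```python
import sympy as sp

x = sp.Symbol('x', real=True)

# Exponential density with rate 1: f(x) = e^{-x} on [0, oo)
# E[e^{tX}] converges iff t < 1

# t = 1/2 < 1: the integral converges (to 2)
M_conv = sp.integrate(sp.exp(x / 2) * sp.exp(-x), (x, 0, sp.oo))

# t = 2 > 1: the integrand grows like e^{x}, so the integral diverges
M_div = sp.integrate(sp.exp(2 * x) * sp.exp(-x), (x, 0, sp.oo))
```

So the MGF of the exponential is only defined on a neighborhood \((-\infty, \lambda)\) of the origin, which is exactly why convergence questions cannot be ignored here.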
 
The moment generating function may not be the easiest way to establish uniform convergence. Its existence depends on uniform convergence. A better tool is the characteristic function, which is defined as the complex functional \(\displaystyle E(e^{itX})\), where \(\displaystyle i^2 = e^{\pi i} = -1\). Every distribution has a characteristic function (it's like a Fourier transform), and the number of derivatives it has determines the number of moments. Uniform convergence is the key here, especially if you must deal with an infinite series of characteristic functions (check out the non-central chi-square, for example).
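To illustrate how the characteristic function yields moments through its derivatives, here is a sketch assuming the standard normal, whose characteristic function \(\varphi(t) = e^{-t^2/2}\) is a known closed form: the k-th moment is \(\varphi^{(k)}(0)/i^k\).

```python
import sympy as sp

t = sp.Symbol('t', real=True)

# Known characteristic function of the standard normal: phi(t) = E[e^{itX}]
phi = sp.exp(-t**2 / 2)

def moment(k):
    # k-th moment recovered as the k-th derivative at 0, divided by i^k
    return sp.simplify(sp.diff(phi, t, k).subs(t, 0) / sp.I**k)

# moment(1), moment(2), moment(4) give the normal moments 0, 1, 3
```

Unlike the MGF, \(E(e^{itX})\) always exists because \(|e^{itx}| = 1\), so the integral is bounded by 1 for every t; what can fail is only the differentiability at 0, which is tied to which moments exist.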

The moment generating function is OK when you need two moments, like when you are trying to establish a Central Limit Theorem in certain cases. When moments of all orders exist, it doesn't really matter whether you use the moment generating function or the characteristic function (there are exceptions). In any case, the convergence of the defining integrals should be your guide. It's uniform convergence that lets you exchange the order of integration and summation in infinite series. Make sure you understand why.
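The integration/summation exchange mentioned above can be checked on a finite truncation. A sketch assuming the exponential distribution with rate 1 (where \(E[X^n] = n!\) and the MGF is \(1/(1-t)\) for \(t < 1\)): expanding \(e^{tx}\) as a power series and integrating term by term should reproduce the Taylor series of the MGF.

```python
import sympy as sp

t, x = sp.symbols('t x', real=True)

# Exponential density with rate 1; its n-th moment is E[X^n] = n!
f = sp.exp(-x)

# Integrate the series for e^{tx} term by term: sum_n t^n/n! * E[X^n]
termwise = sum(t**n / sp.factorial(n) * sp.integrate(x**n * f, (x, 0, sp.oo))
               for n in range(6))

# Known MGF: M(t) = 1/(1 - t) for t < 1; compare against its truncated Taylor series
truncated = sp.series(1 / (1 - t), t, 0, 6).removeO()
```

The two truncations agree; uniform convergence on compact subsets of \((-1, 1)\) is what licenses the exchange for the full infinite series.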
 