The Kullback–Leibler divergence between two distributions with pdfs f(x) and g(x) is defined
by
$KL(F;G) = \int_{-\infty}^{\infty} \ln \left(\frac{f(x)}{g(x)}\right)f(x)\,dx$
Compute the Kullback–Leibler divergence when F is the standard normal distribution and G
is the normal distribution with mean $\mu$ and variance 1. For what value of $\mu$ is the divergence
minimized?
I was never taught this kind of divergence, so I am a bit lost on how to evaluate this integral. I see that I can simplify the ratio of the two normal densities inside the natural log, but my guess is that I should wait until after I take the integral. Any help is appreciated.
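If it helps to sanity-check whatever closed form you derive, the defining integral can be approximated numerically. The sketch below (assuming the mean parameter of G is $\mu$, and using a plain trapezoid rule over a wide interval, since the integrand decays like the standard normal tails) computes $KL(F;G)$ directly from the definition:

```python
import math

def normal_pdf(x, mu=0.0):
    # pdf of a normal distribution with mean mu and variance 1
    return math.exp(-(x - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

def kl_divergence(mu, lo=-12.0, hi=12.0, n=200_000):
    """Trapezoid-rule approximation of KL(F; G), where F is standard
    normal and G is normal with mean mu and variance 1.  The tails
    beyond |x| = 12 contribute negligibly."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        f = normal_pdf(x)          # f(x): standard normal
        g = normal_pdf(x, mu)      # g(x): mean mu, variance 1
        w = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += w * math.log(f / g) * f * h
    return total

print(kl_divergence(0.0))  # ≈ 0: F and G coincide when mu = 0
print(kl_divergence(1.0))
print(kl_divergence(2.0))
```

Evaluating this for a few values of $\mu$ gives a concrete target for the analytic answer, and makes it easy to see numerically where the divergence is smallest.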