From: lorenzo123 on
Hello,

I need to compute the relative entropy (as defined in http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence) between two discrete distributions with different numbers of samples.
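
My very rough attempt so far is to bin both sample sets onto a common set of bin edges, normalise the counts, and apply the discrete formula. The bin count (20) and the smoothing constant are arbitrary guesses on my part, so I have no idea whether this is a sensible estimator:

# Rough sketch: estimate D_KL(P || Q) from two 1-D sample sets of
# different sizes by binning both onto the same bin edges.
# bins=20 and eps are arbitrary choices, not anything principled.
import numpy as np

def kl_from_samples(x, y, bins=20, eps=1e-10):
    """Estimate D_KL(P || Q), where P and Q are the empirical
    distributions of the sample arrays x and y."""
    # Common bin edges covering both sample sets, so the two
    # histograms share the same support.
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    edges = np.linspace(lo, hi, bins + 1)

    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)

    # Normalise counts to probabilities; the different sample
    # sizes drop out here.
    p = p / p.sum()
    q = q / q.sum()

    # Small constant to avoid log(0) and division by zero in
    # empty bins (a crude form of smoothing), then renormalise.
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()

    return np.sum(p * np.log(p / q))

# Example with two sample sets of different sizes:
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)
y = rng.normal(0.5, 1.5, size=300)
print(kl_from_samples(x, y))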

In addition, I need to compute the KL divergence between a two-dimensional multivariate normal distribution and each of two discrete distributions (see http://en.wikipedia.org/wiki/Multivariate_normal_distribution#Kullback.E2.80.93Leibler_divergence).
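
For that second part, the only thing I could think of is to fit a 2-D Gaussian to each discrete sample set (sample mean and covariance) and then plug the two Gaussians into the closed-form expression from that second link; whether that fit is appropriate here is exactly what I am unsure about:

# Closed-form D_KL( N(mu0, S0) || N(mu1, S1) ) from the second link,
# plus a rough Gaussian fit to the sampled data (my guess, not
# something I know to be the right approach).
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """D_KL between two multivariate normals; k is the dimension."""
    k = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    term_trace = np.trace(S1_inv @ S0)
    term_quad = diff @ S1_inv @ diff
    _, logdet0 = np.linalg.slogdet(S0)
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * (term_trace + term_quad - k + logdet1 - logdet0)

def fit_gaussian(samples):
    """Fit a Gaussian to a sample set (rows = observations)."""
    mu = samples.mean(axis=0)
    S = np.cov(samples, rowvar=False)
    return mu, S

# Example: a known 2-D normal vs. a Gaussian fitted to samples.
rng = np.random.default_rng(1)
mu_true = np.array([0.0, 0.0])
S_true = np.array([[1.0, 0.3], [0.3, 2.0]])
data = rng.multivariate_normal([0.2, -0.1], [[1.2, 0.0], [0.0, 1.8]], size=500)
mu_hat, S_hat = fit_gaussian(data)
print(kl_mvn(mu_true, S_true, mu_hat, S_hat))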

Can anyone please help me with this? Beyond the rough sketches above, I don't really know where to start.

Thanks a lot!
From: Ken Pledger on
In article
<961064445.44903.1259330553685.JavaMail.root(a)gallium.mathforum.org>,
lorenzo123 <suruenforp(a)gmail.com> wrote:

> ....
> I need to compute the relative entropy (as defined in
> http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence) between two
> discrete distributions with different numbers of samples....


   I've just returned from a summer holiday and found your message.
It looks like something which might get more attention in the
<sci.stat.math> news group.

      Ken Pledger.