From: lorenzo123 on 26 Nov 2009 23:02

Hello,

I need to compute the relative entropy (as defined in http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence) between two discrete distributions estimated from different numbers of samples. In addition, I need to compute the KL divergence between a two-dimensional multivariate normal distribution and two discrete distributions (see http://en.wikipedia.org/wiki/Multivariate_normal_distribution#Kullback.E2.80.93Leibler_divergence).

Can anyone please help me with this? I don't know where to start. Thanks a lot!
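A minimal sketch of where one might start, assuming NumPy is available. The function names (`kl_discrete`, `kl_from_samples`, `kl_mvn`) are illustrative, not from any library. Sample sets of different sizes can be handled by histogramming both onto a common set of bins and normalizing; the Gaussian-vs-Gaussian case has the closed form given on the Wikipedia page linked above. (The Gaussian-vs-discrete case has no closed form; one common workaround, not shown here, is to discretize the Gaussian onto the same bins and use the discrete formula.)

```python
import numpy as np

def kl_discrete(p, q, eps=1e-12):
    # KL(P || Q) for two discrete distributions on the same support.
    # p and q are arrays of probabilities summing to 1.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # 0 * log 0 = 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

def kl_from_samples(x, y, bins=20):
    # Estimate KL between two 1-D sample sets of possibly different
    # sizes by histogramming both onto shared bin edges.
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    edges = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), bins + 1)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    return kl_discrete(p / p.sum(), q / q.sum())

def kl_mvn(mu0, S0, mu1, S1):
    # Closed-form KL(N(mu0, S0) || N(mu1, S1)) for k-dimensional
    # Gaussians, per the multivariate-normal Wikipedia formula:
    # 0.5 * (tr(S1^-1 S0) + (mu1-mu0)^T S1^-1 (mu1-mu0) - k + ln(det S1 / det S0))
    k = len(mu0)
    S1inv = np.linalg.inv(S1)
    diff = np.asarray(mu1, dtype=float) - np.asarray(mu0, dtype=float)
    return float(0.5 * (np.trace(S1inv @ S0)
                        + diff @ S1inv @ diff
                        - k
                        + np.log(np.linalg.det(S1) / np.linalg.det(S0))))
```

For example, two identical standard 2-D Gaussians give a divergence of 0, and shifting one mean by a unit vector gives 0.5. Note that the histogram estimator is biased for small samples and sensitive to the bin count.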
From: Ken Pledger on 16 Dec 2009 15:38
In article <961064445.44903.1259330553685.JavaMail.root(a)gallium.mathforum.org>, lorenzo123 <suruenforp(a)gmail.com> wrote:

> ....
> I need to compute the relative entropy (as defined in
> http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence) between two
> discrete distributions with different number of samples....

I've just returned from a summer holiday and found your message. It looks like something which might get more attention in the <sci.stat.math> news group.

Ken Pledger.