 

Limiting density of discrete points (LDDP) in python

Shannon's entropy from information theory measures the uncertainty or disorder in a discrete random variable's empirical distribution, while differential entropy measures it for a continuous random variable. The classical definition of differential entropy turned out to be flawed, however, and was corrected by the limiting density of discrete points (LDDP). Does scipy or another library compute the LDDP? How can I estimate the LDDP in Python?

asked Oct 27 '25 by develarist

1 Answer

Since the LDDP is the negative of the KL divergence D(p ‖ m) between your probability density p(x) and your reference density m(x), you might be able to use one of the many implementations of KL divergence, for example scipy.stats.entropy.

An appropriate procedure (assuming you have finite support) is to approximate the continuous distribution with a discrete one by sampling over its support, and calculating the KL divergence.
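A minimal sketch of that procedure, assuming a hypothetical example where p(x) is a standard normal (which holds essentially all of its mass in [-5, 5]) and the reference density m(x) is uniform on [-5, 5]:

```python
import numpy as np
from scipy.stats import entropy, norm

# Hypothetical example: p(x) = N(0, 1), sampled over the finite
# support [-5, 5], against the reference density m(x) = 1/10.
xs = np.linspace(-5, 5, 10_001)
p = norm.pdf(xs)
m = np.full_like(xs, 0.1)

# Turn the sampled densities into discrete distributions; the common
# grid spacing cancels inside the KL log-ratio p_i / m_i.
p /= p.sum()
m /= m.sum()

# scipy.stats.entropy(p, m) returns KL(p || m); the LDDP is its negative.
lddp = -entropy(p, m)
print(lddp)
```

For this pair the analytic value is h(N(0,1)) - log(10) = 0.5·log(2πe) - log(10) ≈ -0.8837, so the grid estimate gives a quick sanity check on the discretization.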

If this is not possible, then the only option I can think of is numerical (or possibly analytic?) integration, for which there are plenty of methods available. An easy first step would be to try Monte Carlo estimation.

answered Oct 29 '25 by xzkxyz

