
On the Estimation of Entropy
Published in: Annals of the Institute of Statistical Mathematics, Vol. 45, No. 1, 1993, pp. 69-88
Posted on RAND.org on January 01, 1993
Motivated by recent work of Joe (1989), the authors introduce estimators of entropy and describe their properties. They study the effects of tail behaviour, distribution smoothness and dimensionality on convergence properties. In particular, they argue that root-n consistency of entropy estimation requires appropriate assumptions about each of these three features. Their estimators are different from Joe's and may be computed without numerical integration, but it can be shown that the same interaction of tail behaviour, smoothness and dimensionality also determines the convergence rate of Joe's estimator. They study both histogram and kernel estimators of entropy, and in each case suggest empirical methods for choosing the smoothing parameter.
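To make the histogram approach concrete, the following is a minimal sketch of a plug-in histogram estimator of differential entropy, not the authors' exact construction: the sample is binned, cell probabilities stand in for the density, and the estimate is the negative mean log of the estimated density. The function name, bin count, and test distribution are illustrative choices; the number of bins plays the role of the smoothing parameter discussed in the abstract.

```python
import numpy as np

def histogram_entropy(x, bins=20):
    """Plug-in histogram estimate of differential entropy H = -E[log f(X)]."""
    counts, edges = np.histogram(x, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()      # cell probabilities
    nz = p > 0                     # skip empty cells (0 log 0 taken as 0)
    # the estimated density in cell j is p_j / width_j, so
    # H_hat = -sum_j p_j * log(p_j / width_j)
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

# Illustrative check against a standard normal sample, whose true
# entropy is 0.5 * log(2 * pi * e), approximately 1.419.
rng = np.random.default_rng(0)
sample = rng.standard_normal(10_000)
print(histogram_entropy(sample, bins=40))
```

As the abstract notes, the accuracy of such an estimator depends jointly on the tail behaviour of the underlying density, its smoothness, and the dimension; the bin width here is a one-dimensional stand-in for the smoothing parameter whose empirical choice the paper addresses.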
This report is part of the RAND Corporation External publication series. Many RAND studies are published in peer-reviewed scholarly journals, as chapters in commercial books, or as documents published by other organizations.
The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.