Shannon entropy estimation in infinity-alphabets from convergence results: studying plug-in estimators
Author
dc.contributor.author
Silva, Jorge
Admission date
dc.date.accessioned
2018-12-26T22:55:21Z
Available date
dc.date.available
2018-12-26T22:55:21Z
Publication date
dc.date.issued
2018-06
Item citation
dc.identifier.citation
Entropy 2018, 20(6), 397
Identifier
dc.identifier.issn
1099-4300
Identifier
dc.identifier.other
10.3390/e20060397
Identifier
dc.identifier.uri
https://repositorio.uchile.cl/handle/2250/159215
Abstract
dc.description.abstract
This work addresses the problem of Shannon entropy estimation over countably infinite alphabets by studying and adopting recent convergence results for the entropy functional, which is known to be discontinuous in the space of probabilities over infinity-alphabets. Sufficient conditions for the convergence of the entropy are used in conjunction with deviation inequalities, covering scenarios in which the target distribution has either finite or infinite support. From this perspective, four plug-in histogram-based estimators are studied, showing that the convergence results are instrumental in deriving new strongly consistent estimators of the entropy. The main application of this methodology is a new data-driven partition (plug-in) estimator. This scheme uses the data to restrict the support over which the distribution is estimated, finding an optimal balance between estimation and approximation errors. The proposed scheme offers a consistent (distribution-free) estimator of the entropy over infinity-alphabets, and optimal rates of convergence under certain regularity conditions on the problem (a finite but unknown support, or tail-bounded conditions on the target distribution).
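The abstract above describes a histogram-based plug-in estimator with a data-driven restriction of the support. Below is a minimal Python sketch of that general idea, not the paper's exact construction: the function name plugin_entropy and the frequency-threshold rule are illustrative assumptions, whereas the paper selects the restricted support by explicitly balancing estimation and approximation errors.

import numpy as np

def plugin_entropy(samples, threshold=None):
    """Plug-in (empirical histogram) estimate of Shannon entropy, in nats.

    If `threshold` is given, the support is restricted in a data-driven way:
    symbols whose empirical frequency falls below the threshold are discarded
    and the remaining empirical distribution is renormalized.
    """
    samples = np.asarray(samples)
    n = samples.size
    # Empirical distribution over the symbols actually observed.
    _, counts = np.unique(samples, return_counts=True)
    p_hat = counts / n

    if threshold is not None:
        # Data-driven support restriction: dropping rare symbols reduces the
        # estimation error at the cost of some approximation error.
        p_hat = p_hat[p_hat >= threshold]
        p_hat = p_hat / p_hat.sum()

    # Observed symbols always have positive empirical mass, so log is safe.
    return -np.sum(p_hat * np.log(p_hat))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Geometric samples live on the countably infinite alphabet {1, 2, 3, ...}.
    x = rng.geometric(p=0.3, size=10_000)
    print("plain plug-in estimate:", plugin_entropy(x))
    print("restricted-support estimate:", plugin_entropy(x, threshold=5.0 / x.size))

The illustrative threshold 5.0 / n simply discards symbols seen fewer than five times; the choice of this cutoff is the quantity the paper's analysis is designed to tune.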
Sponsor
dc.description.sponsorship
FONDECYT: 1170854;
CONICYT-Chile;
Advanced Center for Electrical and Electronic Engineering (AC3E), Basal Project: FB0008