Bias reduction in the estimation of mutual information.

Abstract: This paper deals with the control of estimation bias when estimating mutual information with a nonparametric approach. We focus on continuously distributed random data, and the estimators we develop are based on a nonparametric k-nearest-neighbor approach for arbitrary metrics. Using a multidimensional Taylor series expansion, a general relationship between the estimation bias and the neighborhood size of the plug-in entropy estimator is established, without any assumption on the data, for two different norms. The theoretical analysis developed for the maximum norm coincides with the experimental results drawn from the numerical tests made by Kraskov et al. [Phys. Rev. E 69, 066138 (2004), doi:10.1103/PhysRevE.69.066138]. To further validate this novel relation, a weighted linear combination of distinct mutual information estimators is proposed and, using simulated signals, the comparison of different strategies corroborates the theoretical analysis.
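For illustration only (this sketch is not part of the HAL record or the paper itself), the code below outlines the kind of k-nearest-neighbor mutual information estimator the abstract refers to: a Kraskov-Stögbauer-Grassberger-style estimate under the maximum norm, followed by a weighted linear combination over several neighborhood sizes. The function name ksg_mutual_information, the use of NumPy/SciPy, and the placeholder weights are assumptions made for this sketch; the paper derives its combination weights from its own bias analysis.

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma

    def ksg_mutual_information(x, y, k=5):
        # Kraskov-Stoegbauer-Grassberger (algorithm 1) estimate of I(X;Y)
        # from N jointly sampled points, using the maximum norm.
        x = np.asarray(x, dtype=float).reshape(len(x), -1)
        y = np.asarray(y, dtype=float).reshape(len(y), -1)
        n = len(x)
        xy = np.hstack((x, y))

        # eps_i = distance from point i to its k-th nearest neighbor in the
        # joint (x, y) space under the maximum norm.
        eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]

        # n_x(i), n_y(i) = numbers of marginal neighbors strictly closer than
        # eps_i (the tiny decrement enforces the strict inequality).
        x_tree, y_tree = cKDTree(x), cKDTree(y)
        nx = np.array([len(x_tree.query_ball_point(xi, r - 1e-12, p=np.inf)) - 1
                       for xi, r in zip(x, eps)])
        ny = np.array([len(y_tree.query_ball_point(yi, r - 1e-12, p=np.inf)) - 1
                       for yi, r in zip(y, eps)])

        return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

    # Illustrative weighted combination over several neighborhood sizes.
    # The weights are placeholders that sum to one; the paper's weights
    # follow from its bias/neighborhood-size relationship.
    rng = np.random.default_rng(0)
    x = rng.normal(size=2000)
    y = x + 0.5 * rng.normal(size=2000)
    weights, sizes = [0.2, 0.3, 0.5], [3, 5, 10]
    mi = sum(w * ksg_mutual_information(x, y, k) for w, k in zip(weights, sizes))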
Document type:
Journal article
Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, American Physical Society, 2014, 90 (5-1), pp.052714

http://www.hal.inserm.fr/inserm-01306694
Contributor: Lotfi Senhadji
Submitted on: Monday, April 25, 2016 - 12:37:11
Last modified on: Wednesday, May 16, 2018 - 11:23:41

Identifiers

  • HAL Id: inserm-01306694, version 1
  • PubMed: 25493823

Citation

Jie Zhu, Jean-Jacques Bellanger, Huazhong Shu, Chunfeng Yang, Régine Le Bouquin Jeannès. Bias reduction in the estimation of mutual information. Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, American Physical Society, 2014, 90 (5-1), pp.052714. ⟨inserm-01306694⟩
