Information-estimation relationships over binomial and negative binomial models

Camilo G. Taborda, Dongning Guo, Fernando Perez-Cruz

Research output: Contribution to journal › Article › peer-review

7 Scopus citations


In recent years, a number of new connections between information measures and estimation have been found under various models, including, predominantly, Gaussian and Poisson models. This paper develops similar results for the binomial and negative binomial models. In particular, it is shown that the derivative of the relative entropy and the derivative of the mutual information for the binomial and negative binomial models can be expressed through the expectation of closed-form expressions that have conditional estimates as the main argument. Under mild conditions, those derivatives take the form of an expected Bregman divergence.
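For context on the abstract's final claim: a Bregman divergence is defined, for a differentiable convex function φ, as D_φ(x, y) = φ(x) − φ(y) − φ′(y)(x − y). The sketch below is not taken from the paper; it only illustrates this standard definition, using the hypothetical choice φ(x) = x log x, a convex function whose Bregman divergence is the scalar relative-entropy-type loss familiar from Poisson-style models.

```python
import math

def bregman_divergence(phi, dphi, x, y):
    """Bregman divergence D_phi(x, y) = phi(x) - phi(y) - phi'(y) * (x - y)."""
    return phi(x) - phi(y) - dphi(y) * (x - y)

# Illustrative convex function: phi(x) = x * log(x), with derivative log(x) + 1.
phi = lambda x: x * math.log(x)
dphi = lambda x: math.log(x) + 1.0

# D(2, 1) = 2*log(2) - 0 - 1*(2 - 1) = 2*log(2) - 1
d = bregman_divergence(phi, dphi, 2.0, 1.0)
```

For a convex φ the divergence is nonnegative and vanishes only when x = y, which is why such expressions behave like distortion measures in information-estimation identities.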

Original language: English (US)
Article number: 6746122
Pages (from-to): 2630-2646
Number of pages: 17
Journal: IEEE Transactions on Information Theory
Issue number: 5
State: Published - May 2014

Keywords

  • Binomial model
  • Bregman divergence
  • mutual information
  • negative binomial model
  • relative entropy

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences


