Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information

Erwin Lutwak (New York University), Deane Yang (New York University), Gaoyong Zhang (New York University)

Information Theory mathscidoc:1703.19002

IEEE Transactions on Information Theory, 51(2), 473–478, 2005
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam’s inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are extended to Renyi entropy, p-th moment, and generalized Fisher information. Generalized Gaussian random densities are introduced and shown to be the extremal densities for the new inequalities. An extension of the Cramér–Rao inequality is derived as a consequence of these moment and Fisher information inequalities.
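For orientation, the classical one-dimensional statements that the paper generalizes can be summarized as follows (standard background, not quoted from the paper). For a random variable \(X\) with density \(f\) on \(\mathbb{R}\), write
\[
h(X) = -\int f \log f \, dx, \qquad
N(X) = \frac{1}{2\pi e}\, e^{2h(X)}, \qquad
J(X) = \int \frac{f'(x)^2}{f(x)}\, dx
\]
for the Shannon entropy, entropy power, and Fisher information. The moment-entropy inequality states \(\sigma^2(X) \ge N(X)\), and Stam's inequality states \(N(X)\, J(X) \ge 1\), each with equality exactly for Gaussian densities; multiplying the two bounds gives the Cramér–Rao inequality \(\sigma^2(X)\, J(X) \ge 1\). The Rényi entropy of order \(\lambda\) extends \(h\) via
\[
h_\lambda(X) = \frac{1}{1-\lambda} \log \int f^\lambda \, dx \qquad (\lambda \neq 1),
\]
which recovers the Shannon entropy as \(\lambda \to 1\). The paper establishes analogues of the inequalities above with \(h_\lambda\), the \(p\)-th moment \(\mathbb{E}|X|^p\), and a generalized Fisher information in place of the classical quantities, with generalized Gaussian densities as the extremal cases.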
entropy, Renyi entropy, moment, Fisher information, information theory, information measure
@article{erwin2005cramér-rao,
  title={Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information},
  author={Erwin Lutwak and Deane Yang and Gaoyong Zhang},
  url={http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20170302052959150562590},
  journal={IEEE Transactions on Information Theory},
  volume={51},
  number={2},
  pages={473--478},
  year={2005},
}
Erwin Lutwak, Deane Yang, and Gaoyong Zhang. Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information. IEEE Transactions on Information Theory, 51(2):473–478, 2005. http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20170302052959150562590.