Extensions of Fisher information and Stam's inequality

Erwin Lutwak (New York University), Songjun Lv (College of Mathematics and Computer Science, Chongqing Normal University), Deane Yang (New York University), Gaoyong Zhang (New York University)

Information Theory mathscidoc:1703.19001

IEEE Transactions on Information Theory, 58, (3), 1319–1327, 2012
We explain how the classical notions of the Fisher information of a random variable and the Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the generalizations proved earlier by the authors, are all special cases of more general sharp inequalities satisfied by random vectors. The extremal random vectors, which we call generalized Gaussians, contain Gaussians as a limiting case but are noteworthy because they are heavy-tailed.
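As context for the abstract, the classical one-dimensional Stam inequality states that the entropy power N(X) = exp(2h(X)) / (2πe) and the Fisher information I(X) satisfy N(X)·I(X) ≥ 1, with equality exactly for Gaussians. The following Monte Carlo sketch is illustrative only (it is not from the paper): it estimates I(X) for a Gaussian from its score function and checks the product against the bound.

```python
import math
import random

# Illustrative check (assumption: not from the paper) of the classical
# one-dimensional Stam inequality N(X) * I(X) >= 1 for a Gaussian
# X ~ N(0, sigma^2), which is the equality case.

random.seed(0)
sigma = 2.0
n_samples = 200_000

# Fisher information I(X) = E[(d/dx log f(X))^2]; for this Gaussian the
# score function is -x / sigma^2, so the exact value is 1 / sigma^2.
samples = [random.gauss(0.0, sigma) for _ in range(n_samples)]
fisher_est = sum((x / sigma**2) ** 2 for x in samples) / n_samples

# Entropy power N(X) = exp(2 h(X)) / (2 pi e); for a Gaussian,
# h(X) = 0.5 * log(2 pi e sigma^2), so N(X) = sigma^2 exactly.
entropy_power = sigma**2

product = entropy_power * fisher_est
print(f"N(X) * I(X) ~= {product:.4f}  (Stam bound: >= 1)")
```

For the heavy-tailed generalized Gaussians studied in the paper, the analogous sharp product inequalities involve Rényi entropy and the extended Fisher information in place of the classical quantities.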
entropy, Shannon entropy, Shannon theory, Rényi entropy, Stam inequality, Fisher information, information theory, information measure
@article{lutwak2012extensions,
  title={Extensions of Fisher information and Stam's inequality},
  author={Lutwak, Erwin and Lv, Songjun and Yang, Deane and Zhang, Gaoyong},
  journal={IEEE Transactions on Information Theory},
  volume={58},
  number={3},
  pages={1319--1327},
  year={2012}
}
Erwin Lutwak, Songjun Lv, Deane Yang, and Gaoyong Zhang. Extensions of Fisher information and Stam's inequality. IEEE Transactions on Information Theory, 58(3):1319–1327, 2012. http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20170302052323904423589.