The p-th moment matrix is defined for a real random vector, generalizing the classical covariance matrix. Sharp inequalities relating the p-th moment and Rényi entropy are established, generalizing the classical inequality relating the second moment and the Shannon entropy. The extremal distributions for these inequalities are completely characterized.
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam’s inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér–Rao inequality is a direct consequence of these two inequalities.
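The three classical statements above can be checked in closed form by comparing a Gaussian and a Laplace density with the same second moment. The sketch below is purely illustrative; the variable names are assumptions, with h denoting Shannon entropy, N = e^{2h}/(2πe) the entropy power, and J the Fisher information.

```python
import math

# Closed-form check of the classical inequalities for two densities
# sharing the same second moment sigma2.

sigma2 = 2.5  # any positive second moment

# Gaussian N(0, sigma2): entropy, Fisher information, entropy power.
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)
J_gauss = 1.0 / sigma2
N_gauss = math.exp(2 * h_gauss) / (2 * math.pi * math.e)  # equals sigma2

# Laplace with the same variance: Var = 2*b^2, so b = sqrt(sigma2/2).
b = math.sqrt(sigma2 / 2)
h_lap = 1 + math.log(2 * b)
J_lap = 1.0 / b**2
N_lap = math.exp(2 * h_lap) / (2 * math.pi * math.e)

# Moment-entropy: among densities with second moment sigma2,
# the Gaussian has maximal Shannon entropy.
assert h_gauss > h_lap

# Stam: N * J >= 1, with equality exactly for the Gaussian
# (for the Laplace the product is 2e/pi, about 1.73).
assert abs(N_gauss * J_gauss - 1) < 1e-12
assert N_lap * J_lap > 1

# Cramér–Rao: sigma^2 * J >= 1 follows by combining the two inequalities.
assert sigma2 * J_gauss >= 1 - 1e-12 and sigma2 * J_lap > 1
```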
In this paper the inequalities above are extended to Rényi entropy, p-th moment, and generalized Fisher information. Generalized Gaussian random densities are introduced and shown to be the extremal densities for the new inequalities. An extension of the Cramér–Rao inequality is derived as a consequence of these moment and Fisher information inequalities.
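In the classical Shannon, second-moment case, the way the Cramér–Rao inequality follows from the two inequalities can be sketched in one line; this is the standard argument, stated here with the entropy power N(X) = e^{2h(X)}/(2πe):

```latex
% Moment-entropy:  h(X) \le \tfrac12 \log(2\pi e\,\sigma^2),
%                  equivalently  N(X) \le \sigma^2.
% Stam:            J(X)\,N(X) \ge 1.
% Combining the two:
\sigma^2 J(X) \;\ge\; N(X)\, J(X) \;\ge\; 1,
% which is the Cram\'er--Rao inequality, with equality throughout
% if and only if X is Gaussian.
```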
We explain how the classical notions of Fisher information of a random variable and Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam’s inequality for Fisher information and Shannon entropy, as well as the more generalized versions proved earlier by the authors, are all special cases of more general sharp inequalities satisfied by random vectors. The extremal random vectors, which we call generalized Gaussians, contain Gaussians as a limiting case but are noteworthy because they are heavy-tailed.
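The last sentence can be illustrated numerically. Assuming, as in the p = 2 case, that the heavy-tailed extremals are Student-t type densities (the function names below are illustrative, not from the paper): the t density tends to the normal density as the degrees of freedom grow, while at any fixed finite parameter its tails dominate the Gaussian's.

```python
import math

def student_t_pdf(x: float, nu: float) -> float:
    """Student-t density with nu degrees of freedom.

    lgamma is used instead of gamma to avoid overflow for large nu.
    """
    c = math.exp(math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)) \
        / math.sqrt(nu * math.pi)
    return c * (1 + x * x / nu) ** (-(nu + 1) / 2)

def gauss_pdf(x: float) -> float:
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Gaussian as a limiting case: for large nu the t density
# approaches the standard normal pointwise.
for x in (0.0, 1.0, 2.5):
    assert abs(student_t_pdf(x, 1e6) - gauss_pdf(x)) < 1e-4

# Heavy tails: at a fixed finite nu the t density dominates far out.
assert student_t_pdf(6.0, 3) > 100 * gauss_pdf(6.0)
```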
A unified approach is presented for establishing
a broad class of Cram\'er-Rao inequalities for the location parameter,
including, as special cases,
the original inequality of Cram\'er and Rao, as well as an $L^p$ version recently
established by the authors. The new approach allows for
generalized moments and Fisher information measures to be defined by convex
functions that are not necessarily homogeneous.
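As a numerical sketch of such a non-homogeneous generalized moment, take the hypothetical, purely illustrative convex function φ(t) = cosh t − 1, which is not homogeneous of any degree, and estimate E[φ(X)] for a standard normal X by quadrature; the closed form e^{1/2} − 1 is available as a check. The helper names here are assumptions for illustration, not notation from the paper.

```python
import math

def phi(t: float) -> float:
    # A convex function that is NOT homogeneous (phi(ct) != c^k * phi(t)),
    # chosen purely as a hypothetical illustration.
    return math.cosh(t) - 1.0

def gauss_pdf(x: float) -> float:
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def generalized_moment(f, g, lo=-12.0, hi=12.0, n=20000) -> float:
    """E[g(X)] for a density f, by composite trapezoidal quadrature."""
    h = (hi - lo) / n
    s = 0.5 * (g(lo) * f(lo) + g(hi) * f(hi))
    for i in range(1, n):
        x = lo + i * h
        s += g(x) * f(x)
    return s * h

m = generalized_moment(gauss_pdf, phi)
# Closed form for the standard normal: E[cosh X] - 1 = e^{1/2} - 1.
assert abs(m - (math.exp(0.5) - 1)) < 1e-6
```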
In particular, it is shown that associated with any log-concave random
variable whose density satisfies certain boundary conditions is a
Cram\'er-Rao inequality for which the given log-concave random
variable is the extremal. Applications to specific instances are also presented.