We define the notion of a Ricci curvature lower bound for parametrized statistical
models. Following the seminal ideas of Lott–Sturm–Villani, we define this notion
based on the geodesic convexity of the Kullback–Leibler divergence in a Wasserstein
statistical manifold, that is, a manifold of probability distributions endowed with a
Wasserstein metric tensor structure. Under these definitions, which are based on
the Fisher information matrix and Wasserstein Christoffel symbols, the Ricci curvature
is related to both information geometry and Wasserstein geometry. These definitions
allow us to formulate bounds on the convergence rate of Wasserstein gradient flows
and information functional inequalities in parameter space. We discuss examples of
Ricci curvature lower bounds and convergence rates in exponential family models.
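The Lott–Sturm–Villani characterization alluded to above can be stated as follows; this is a standard formulation on the full Wasserstein space, given here as background, with the parametrized analogue obtained by restricting to the statistical manifold (the symbols $\kappa$, $\rho_t$, and $\nu$ are notation chosen for this sketch, not taken from the abstract):

```latex
% A Ricci curvature lower bound Ric >= \kappa, in the sense of
% Lott--Sturm--Villani, means the KL divergence (relative entropy)
% with respect to a reference measure \nu is \kappa-geodesically
% convex along Wasserstein geodesics (\rho_t)_{t \in [0,1]}:
\begin{equation*}
  D_{\mathrm{KL}}(\rho_t \,\|\, \nu)
  \;\le\; (1-t)\, D_{\mathrm{KL}}(\rho_0 \,\|\, \nu)
        + t\, D_{\mathrm{KL}}(\rho_1 \,\|\, \nu)
        - \frac{\kappa}{2}\, t(1-t)\, W_2(\rho_0, \rho_1)^2 .
\end{equation*}
% For \kappa > 0, such a bound yields exponential convergence of the
% Wasserstein gradient flow of D_{KL}( . || \nu) at rate \kappa,
% which is the type of convergence statement transferred to
% parameter space in the abstract above.
```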