Long Li (Department of Mathematics and Center of Geophysics, Harbin Institute of Technology, 150001 Harbin, People’s Republic of China); Arthur Vidard (Université Grenoble Alpes, Inria, CNRS, Grenoble INP, LJK, 38000 Grenoble, France); Francois-Xavier Le Dimet (Université Grenoble Alpes, Inria, CNRS, Grenoble INP, LJK, 38000 Grenoble, France); Jianwei Ma (Department of Mathematics and Center of Geophysics, Harbin Institute of Technology, 150001 Harbin, People’s Republic of China)
This work combines a level-set approach and the optimal-transport-based Wasserstein distance in a data assimilation framework. The primary motivation is to reduce assimilation artifacts resulting from position and observation errors in the tracking and forecast of pollutants present on the surface of oceans or lakes. Both errors lead to spurious effects on the forecast that need to be corrected. In general, the geometric contour of such pollution can be retrieved from observation, while more detailed characteristics such as concentration remain unknown. Level sets are therefore tools of choice for modeling such contours and the dynamical evolution of their topological structure. They are compared with contours extracted from observations using the Wasserstein distance, which captures position mismatches between the two sources better than the more classical Euclidean distance. Finally, the viability of this approach is demonstrated on academic test cases and its numerical performance is discussed.
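A one-dimensional toy example (our own illustration, not from the paper) shows why the Wasserstein distance is better suited to position mismatches than the Euclidean distance: in 1-D, the Wasserstein-1 distance between two normalized densities equals the L1 distance between their cumulative distribution functions, so it grows with the offset of a shifted bump, whereas the pointwise L2 distance saturates as soon as the bumps no longer overlap. The grid, bump shape, and shifts below are arbitrary choices for illustration.

```python
import numpy as np

def w1(p, q, dx=1.0):
    """Wasserstein-1 distance between two 1-D nonnegative signals,
    normalized to unit mass, via the CDF formula W1 = ∫|F_p - F_q| dx."""
    p = p / p.sum()
    q = q / q.sum()
    return np.abs(np.cumsum(p) - np.cumsum(q)).sum() * dx

x = np.linspace(0, 100, 1001)                      # spatial grid, dx = 0.1
bump = lambda c: np.exp(-0.5 * ((x - c) / 2.0) ** 2)  # Gaussian bump at c

ref = bump(30)
for shift in (10, 20, 40):
    obs = bump(30 + shift)
    # W1 grows linearly with the shift; the L2 distance is essentially
    # constant once the bumps are disjoint (shifts 20 and 40).
    print(shift, w1(ref, obs, dx=x[1] - x[0]), np.linalg.norm(ref - obs))
```

For identically shaped, normalized bumps, `w1` returns (up to discretization error) exactly the shift, which is the behavior the abstract exploits when comparing level-set contours to observed contours.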
Many optimization algorithms converge to stationary points. When the underlying problem is nonconvex, they may get trapped at local minimizers and occasionally stagnate near saddle points. We propose the Run-and-Inspect Method, which adds an “inspect” phase to existing algorithms to help them escape non-global stationary points. The inspection samples a set of points within a radius R around the current point. When a sample point yields a sufficient decrease in the objective, we resume an existing algorithm from that point. If no sufficient decrease is found, the current point is called an approximate R-local minimizer. We show that an R-local minimizer is globally optimal, up to a specific error depending on R, if the objective function can be implicitly decomposed into a smooth convex function plus a restricted function that is possibly nonconvex and nonsmooth. Therefore, for such nonconvex objective functions, verifying global optimality is fundamentally easier. For high-dimensional problems, we introduce blockwise inspections to overcome the curse of dimensionality while still maintaining optimality bounds up to a factor equal to the number of blocks. Our method performs well on a set of artificial and realistic nonconvex problems when coupled with gradient descent, coordinate descent, EM, and prox-linear algorithms.
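The run/inspect alternation described above can be sketched as follows. This is a minimal illustration under our own assumptions (random sampling inside the ball of radius R, a fixed sufficient-decrease threshold `delta`, and plain gradient descent as the base algorithm), not the authors' exact sampling scheme.

```python
import numpy as np

def inspect(f, x, R, delta=1e-6, num_samples=200, rng=None):
    """Inspect phase: sample points within radius R of x. Return a point
    with a sufficient decrease f(y) <= f(x) - delta, or None, in which
    case x is declared an approximate R-local minimizer (for this sample)."""
    rng = np.random.default_rng(rng)
    fx = f(x)
    for _ in range(num_samples):
        d = rng.standard_normal(x.shape)          # random direction
        d *= rng.uniform(0, R) / np.linalg.norm(d)  # random radius in (0, R]
        y = x + d
        if f(y) <= fx - delta:
            return y                              # escape point found
    return None

def run_and_inspect(f, grad, x0, R=1.0, step=0.1, iters=500):
    """Alternate a 'run' phase (gradient descent, as one example base
    algorithm) with the 'inspect' phase until no escape point is found."""
    x = np.asarray(x0, dtype=float)
    while True:
        for _ in range(iters):                    # run phase
            x = x - step * grad(x)
        y = inspect(f, x, R)                      # inspect phase
        if y is None:
            return x                              # approximate R-local minimizer
        x = y                                     # resume from the escape point
```

On a 1-D double-well objective such as f(x) = (x² − 1)² + 0.3x, gradient descent started at x = 1 stalls at the higher local minimum, while the inspection (with R large enough to reach past the barrier) relocates the iterate to the basin of the global minimizer near x ≈ −1.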
Many iterative methods in optimization are fixed-point iterations with averaged operators. As such methods converge at an O(1/k) rate with the constant determined by the averagedness coefficient, establishing small averagedness coefficients for operators is of broad interest. In this paper, we show that the averagedness coefficients of the composition of averaged operators by Ogura and Yamada (Numer Func Anal Opt 32(1–2):113–137, 2002) and the three-operator splitting by Davis and Yin (Set Valued Var Anal 25(4):829–858, 2017) are tight. The analysis relies on the scaled relative graph, a geometric tool recently proposed by Ryu, Hannah, and Yin (arXiv:1902.09788, 2019).
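A small numerical instance of the setting (our own toy example, not from the paper): each projection onto a convex set is 1/2-averaged, and by the Ogura–Yamada result their composition is θ-averaged with θ = (θ₁ + θ₂ − 2θ₁θ₂)/(1 − θ₁θ₂), here 2/3. Iterating the composition of two line projections in R² is a fixed-point iteration with an averaged operator, and the fixed-point residual ‖xₖ − Txₖ‖ decreases to zero.

```python
import numpy as np

def proj_line(direction):
    """Orthogonal projection onto a line through the origin; projections
    onto convex sets are 1/2-averaged operators."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    return lambda x: d * (d @ x)

P1 = proj_line([1.0, 0.0])   # projection onto the x-axis
P2 = proj_line([1.0, 1.0])   # projection onto the diagonal

# Composition of two 1/2-averaged operators: 2/3-averaged by the
# Ogura–Yamada formula, so x_{k+1} = T(x_k) converges.
T = lambda x: P1(P2(x))

x = np.array([3.0, 4.0])
residuals = []
for _ in range(200):
    Tx = T(x)
    residuals.append(np.linalg.norm(x - Tx))  # fixed-point residual
    x = Tx
```

Here the unique fixed point is the origin (the intersection of the two lines), and the residuals decay monotonically; the abstract's contribution is showing that the averagedness coefficient guaranteed by such composition results cannot be improved.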