In this talk, I will survey some recent progress in the field of entropic optimal transport (a.k.a. the Schrödinger problem). More precisely, I will focus on results providing theoretical guarantees of convergence for the most popular algorithms used to compute approximate solutions in machine learning applications, namely Sinkhorn's algorithm and the iterated Markovian fitting algorithm. In particular, I will develop the connection between the smoothness of entropic potentials, the stability of optimal solutions, and exponential convergence in the number of iterations. The theoretical results will be illustrated in concrete examples, such as log-concave marginals and marginals with bounded support, where the exponential rates are shown to have an optimal dependence on the regularization parameter.
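For readers unfamiliar with the algorithm mentioned above, the following is a minimal sketch of Sinkhorn's algorithm for discrete entropic optimal transport; the dimensions, cost matrix, and marginals are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def sinkhorn(a, b, C, eps, n_iter=1000):
    """Sketch of Sinkhorn's algorithm: alternately rescale a Gibbs
    kernel so the resulting plan matches the marginals a and b."""
    K = np.exp(-C / eps)          # Gibbs kernel for regularization eps
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)         # fit the second marginal
        u = a / (K @ v)           # fit the first marginal
    return u[:, None] * K * v[None, :]  # entropic transport plan

# Illustrative setup: uniform marginals on 5 points of [0, 1],
# squared-distance cost.
n = 5
a = np.full(n, 1.0 / n)
b = np.full(n, 1.0 / n)
x = np.linspace(0.0, 1.0, n)
C = (x[:, None] - x[None, :]) ** 2
P = sinkhorn(a, b, C, eps=0.1)
```

The iterates converge exponentially fast in the number of iterations, and the rates studied in the talk quantify how this exponential speed depends on the regularization parameter `eps`.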
Based on joint work with A. Chiarini, A. Durmus, M. Gentiloni, G. Greco, and L. Tamanini.