A brief introduction to the regularity theory of optimal transport

4,898 views

Gabe Khan

Comments: 11
@cauchyschwarz9759 · 7 months ago
Gabe, you have succinctly covered the MTW condition beautifully!
@GabeKhan · 7 months ago
Thanks!
@seanziewonzie · 3 years ago
Wow, fantastic! I have never heard of this tensor, but I am immediately intrigued by it thanks to your great presentation. I am interested in that remark about it being a non-local strengthening of sectional curvature. Are there any sources that explore that idea a little more?
@GabeKhan · 3 years ago
Thanks! A good starting point for a more in-depth discussion of the regularity theory is Chapter 12 of Villani's text, Optimal Transport: Old and New. For the relationship between the MTW tensor and the curvature, I would suggest looking at Loeper's 2009 paper "On the regularity of solutions of optimal transportation problems," which showed (among many other things) that the MTW tensor on the diagonal is proportional to the sectional curvature. However, MTW(0) is genuinely a stronger condition, since there are many examples of positively curved metrics which fail to satisfy MTW(0). For some examples, see Kim's paper "Counterexamples to Continuity of Optimal Transport Maps on Positively Curved Riemannian Manifolds," Section 6 of "On the Ma-Trudinger-Wang curvature on surfaces" by Figalli-Rifford-Villani, and Appendix D of "Regularity of optimal transport in curved geometry: the nonfocal case" by Loeper-Villani.
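For readers who want to see the object being discussed, here is a rough sketch in LaTeX of one common way of writing the MTW tensor and the MTW(0) condition. Conventions for the sign, the factor of 3/2, and the roles of the two vectors vary between references, so the normalization below is an assumption rather than a universal definition.

% MTW tensor for a smooth cost c(x,y). Indices before the comma denote
% derivatives in x, indices after the comma denote derivatives in y, and
% c^{p,q} denotes the inverse of the mixed Hessian c_{p,q}.
\mathfrak{S}_{(x,y)}(\xi,\eta)
  = \frac{3}{2} \sum_{i,j,k,l,p,q}
    \bigl( c_{ij,p}\, c^{p,q}\, c_{q,kl} - c_{ij,kl} \bigr)\,
    \xi^i \xi^j \eta^k \eta^l .

% MTW(0): \mathfrak{S}_{(x,y)}(\xi,\eta) \ge 0 whenever \xi and \eta are
% "c-orthogonal," i.e. \sum_{i,j} c_{i,j}\, \xi^i \eta^j = 0.
% Loeper's result mentioned in the reply above: for c = d^2/2 on a Riemannian
% manifold, this quantity on the diagonal y = x is proportional to the
% sectional curvature of the plane spanned by \xi and \eta.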
@sarcasmo57 · 1 year ago
Really makes ya think.
@bohanzhou9815 · 3 years ago
Can you explain in a little more detail how MTW(0) is related to convex analysis, possibly in the next video? It would be great to have some illustrations like in this video!
@GabeKhan · 3 years ago
I would like to make a video about Kantorovich duality at some point, and there I'll be able to explain the relationship more precisely. However, I won't be able to do that in the near future, so let me try to answer your question in a comment.

It's easiest to start with the squared-distance cost in Euclidean space. In this case, Brenier's work showed that for reasonable measures, the solution to the Monge problem is given by the sub-differential of a convex potential function. For smooth convex functions, the sub-differential at a point is simply the gradient (so it is a single point), but for arbitrary convex functions, the sub-differential at any point is a closed convex set. In particular, the sub-differential is always connected. This fact plays a crucial role in the regularity theory because it shows that smooth convex functions are dense in the space of all convex functions in the topology of local uniform convergence.

For more general cost functions, the solution to the Monge problem is given by the c-subdifferential of a c-convex function. However, for an arbitrary cost function, it is not necessarily the case that the c-subdifferentials of a c-convex function are always connected. This presents an obstruction to the regularity theory, because it means that smooth c-convex functions (where the c-subdifferential at any point is a single point) need not be dense in the space of all c-convex functions.

One of Loeper's key insights was to show that for smooth enough cost functions, the MTW(0) assumption is equivalent to requiring that the c-subdifferentials be connected. I like to think of this with the analogy that costs which satisfy the MTW(0) assumption are similar to Riemannian manifolds with non-negative sectional curvature, whereas cost functions whose c-subdifferentials are connected are akin to Alexandrov spaces with curvature bounded below by zero. When the cost function/space is smooth enough, these conditions are equivalent, but the latter is a synthetic condition which is defined more generally.
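To make the terms in this reply concrete, here is a short sketch in LaTeX of one common sign convention for c-convexity and the c-subdifferential; other references flip the sign of the cost or of the potentials, so this particular choice is an assumption.

% A function \phi is c-convex if it is a c-transform, i.e.
\phi(x) = \sup_{y} \bigl( -c(x,y) + \psi(y) \bigr)
\quad \text{for some function } \psi .

% Its c-subdifferential at x consists of the points y where this supremum
% is attained; equivalently,
\partial^{c}\phi(x)
  = \{\, y : \phi(x) + c(x,y) \le \phi(z) + c(z,y) \ \text{for all } z \,\} .

% For the quadratic cost c(x,y) = |x - y|^2 / 2 on R^n, the function
% u(x) = \phi(x) + |x|^2 / 2 is convex in the usual sense, and \partial^c\phi(x)
% coincides with the ordinary subdifferential of u at x; this is the setting
% of Brenier's theorem described in the reply. Loeper's characterization, as
% stated above: for sufficiently smooth costs, MTW(0) holds exactly when
% \partial^c\phi(x) is connected for every c-convex \phi and every x.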
@dowellchan3890 · 3 years ago
A nice lecture!
@GabeKhan · 3 years ago
Thank you!
@anthonyymm511 · 2 years ago
"Wang" is pronounced more like "Wong"
@GabeKhan · 2 years ago
Thanks for the advice. I’m giving a talk about this on Wednesday so I’ll have to improve the pronunciation.