The best explanation of statistical distances that I have found. An easy, clear explanation of Kolmogorov-Smirnov, the Wasserstein distance, and KL divergence.
@mathman2170 · 2 years ago
Love it when a talk presents the material in a carefully developed, logical manner. Thanks!
@kimmupfumira3417 · 3 years ago
Great explanation! Easy to digest.
@jamesmckeown4743 · 5 years ago
17:13 There should be a negative sign in the definition of the KL divergence.
@Mayur7Garg · 3 years ago
I think the negative sign depends on whether you are minimizing or maximizing it. By definition, distances are always positive.
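For reference, the two conventions are reconcilable: the usual definition of the KL divergence has no leading minus sign when the log ratio is written p/q, and an algebraically equivalent form with a minus sign when the ratio is flipped. Either way the value is nonnegative (by Jensen's inequality):

```latex
D_{\mathrm{KL}}(P \parallel Q)
  = \sum_x p(x) \log \frac{p(x)}{q(x)}
  = -\sum_x p(x) \log \frac{q(x)}{p(x)} \;\ge\; 0
```

Strictly speaking, KL is a divergence rather than a distance, since it is asymmetric and violates the triangle inequality, but it is indeed never negative.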
@MauricioSalazare · 6 years ago
Well done! Nice explanation!
@nmertsch8725 · 5 years ago
This is a great presentation! Is there a reason why you did not contribute the nth Wasserstein distance to SciPy?
@canmetan670 · 5 years ago
docs.scipy.org/doc/scipy/reference/generated/scipy.stats.wasserstein_distance.html As of this writing, the latest stable version of SciPy on pip is 1.3.1; the function has reportedly been available since 1.0.0.
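A minimal usage sketch of that function, with made-up sample data (the printed value will vary slightly with the random draw):

```python
import numpy as np
from scipy.stats import wasserstein_distance  # in SciPy since 1.0.0

rng = np.random.default_rng(0)

# Two empirical samples; wasserstein_distance computes the first
# (p = 1) Wasserstein distance between their empirical distributions.
u = rng.normal(loc=0.0, scale=1.0, size=1000)
v = rng.normal(loc=0.5, scale=1.0, size=1000)

print(wasserstein_distance(u, v))  # ~0.5 for these shifted Gaussians
```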
@TheBjjninja · 5 years ago
6:15 We should either reject or fail to reject H0, I believe, instead of "accept H0".
@harry8175ritchie · 5 years ago
AKA accept. I think it depends on where you learn statistics. My professors always said accept and reject.
@mikhaeldito · 4 years ago
Semantically, "accepting H0" and "failing to reject H0" may sound the same. But they are not! The p-value measures the probability of data at least as extreme as ours under the assumption that the null hypothesis (e.g., no difference between two groups) is true. So it is a measure of evidence against the null, not in favour of the null. This is why there are dedicated statistical tests of no difference, or similarity, called equivalence tests.
@joelwillis2043 · 3 years ago
@@harry8175ritchie AKA NO. You can't conclude your assumption based on your assumption. This is logic 101. HARD FAIL, GO DRIVE A TRUCK FOR A LIVING.
@harry8175ritchie · 3 years ago
Not the way to handle it, buddy.
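To make the distinction in this thread concrete, here is a minimal sketch using SciPy's two-sample Kolmogorov-Smirnov test with made-up data and a hypothetical significance level; note that a large p-value is reported as "fail to reject H0", not "accept H0":

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
a = rng.normal(size=200)
b = rng.normal(size=200)  # drawn from the same distribution as `a`

stat, p_value = ks_2samp(a, b)
alpha = 0.05  # hypothetical significance level

if p_value < alpha:
    print(f"p = {p_value:.3f}: reject H0 (the distributions differ)")
else:
    # A large p-value means the data are merely compatible with H0;
    # it is not positive evidence that the distributions are equal.
    print(f"p = {p_value:.3f}: fail to reject H0")
```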
@minesinitiativesrussie1778 · 5 years ago
You're the best, little bro! I didn't understand any of it, but it's still classy!