0:00 - 10:00 intro
10:00 - 18:00 Johnson-Lindenstrauss lemma
18:00 - 24:30 JL running time of computing all pairwise distances: O((dn log n)/eps^2 + (n^2 log n)/eps^2), down from O(n^2 * d) with the naive approach
24:30 - 29:10 reducing JL to norm preservation
29:10 - 37:30 Gaussian distribution (recap needed for the next part of the lecture)
37:30 - 46:45 proof
46:45 - 59:45 bounding the failure probability
59:45 - 1:11:00 extensions of JL (sparse version)
1:11:00 - 1:17:53 FJLT (fast Johnson-Lindenstrauss transform)
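The segments above cover the plain Gaussian JL transform (project n points in R^d to k = O(log(n)/eps^2) dimensions while approximately preserving all pairwise Euclidean distances). A minimal NumPy sketch of that idea, not taken from the lecture itself; the constant 8 in the target dimension and the function name `jl_project` are illustrative assumptions:

```python
import numpy as np

def jl_project(X, eps=0.5, seed=0):
    """Project rows of X (n x d) to k dimensions with a random Gaussian map.

    k = O(log(n)/eps^2); the constant 8 here is an illustrative choice,
    not the lecture's exact constant.
    """
    n, d = X.shape
    k = int(np.ceil(8 * np.log(n) / eps**2))
    rng = np.random.default_rng(seed)
    # Entries are N(0, 1/k) so that E[||Ax||^2] = ||x||^2 for any fixed x.
    A = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))
    return X @ A.T

# Usage: check the distortion of one pairwise distance on random data.
n, d = 50, 1000
X = np.random.default_rng(1).normal(size=(n, d))
Y = jl_project(X, eps=0.5)
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
ratio = proj / orig  # close to 1 with high probability
```

Note this naive dense projection costs O(n * d * k) to apply; the sparse JL and FJLT variants discussed at 59:45 and 1:11:00 exist precisely to reduce that cost.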
@aSeaofTroubles 8 years ago
Unbelievable!
@gabse15 7 years ago
Really great lecture!
@thomas.moerman 8 years ago
Great lecture!
@anonymouskek4629 8 years ago
Andrew Xia u r my tru hero
@botongma3179 8 years ago
+Anonymous Kek XIA!!
@zhigall1 7 years ago
Just Awesome! Thank you!!!
@freebird6648 7 years ago
Great lecture, thanks!
@AlexanderMath 8 years ago
Awesome video. Is it possible anyone could share the lecture notes the lecturer is consulting?
@AlexanderMath 8 years ago
+Alexander Mathiasen Found them by googling "MIT 6.854 Spring 2016". It led to Ankur Moitra's lecture page on Advanced Algorithms, which contains links to PDFs for all lectures.