Preconditioned Conjugate Gradient Descent (ILU)

6,514 views

Priya Deo

1 day ago

Comments: 16
@graphicsRat 1 year ago
Simple, clear. I love it. I want more. A whole series on optimization.
@mkelly66 3 years ago
All of your videos do a great job of explaining the topics clearly. Thanks, I really appreciated them!
@QT-yt4db 1 year ago
The dog at the end is lovely...
@MrFawkez 3 years ago
A great followup to the CG video. Thank you.
@swaggcam412 2 years ago
This is a great resource, thank you for sharing!
@xplorethings 3 years ago
Really well done!
@colin_hart 2 years ago
Thank you for creating this presentation. I arrived here while looking for exposition on Arnoldi and Lanczos iteration. Any new content about sparse iterative methods would be gratefully received. Cheers!
@dmit10 1 year ago
I would suggest L-BFGS, partial Cholesky decomposition (a limited-memory version), and "when to use proximal methods over non-proximal methods" (not just how they work).
@ricardomilos9404 3 years ago
The video showed that A^-1 ≈ L^-1U^-1 = M. In the code, M^-1 = L * U was used (element-wise multiplication of L and U, not matrix multiplication). It would follow that preconditioning only requires the diagonal of the matrix U, since everything else cancels out. Should element-wise multiplication be implemented instead of matrix multiplication? And is it correct that M^-1 = LU (maybe it should be M^-1 = L^-1U^-1)?
@PriyaDeo 3 years ago
Please note that A^-1 ≈ U^-1L^-1 (not L^-1U^-1), since we are taking the LU factorization of the matrix A: A ≈ LU => A^-1 ≈ (LU)^-1 = U^-1L^-1. In the code, we did do matrix multiplication: the numpy matrix class uses '*' to denote matrix multiplication. For element-wise multiplication you need to use np.multiply(L, U).
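[Editor's note] For readers following along in code, here is a tiny illustration of the operator point made in the reply above. This is a sketch assuming NumPy; the small matrices L and U are placeholders, not the video's data:

```python
import numpy as np

# Toy lower- and upper-triangular factors (placeholders, not the video's data).
L = np.array([[1.0, 0.0],
              [0.5, 1.0]])
U = np.array([[2.0, 1.0],
              [0.0, 3.0]])

print(L @ U)              # matrix product: for np.ndarray, '@' is matmul
print(L * U)              # element-wise product for np.ndarray
print(np.multiply(L, U))  # identical to L * U for ndarrays
# For the (now-deprecated) np.matrix class, '*' instead denotes matrix
# multiplication, which is the behavior the reply describes.
```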
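[Editor's note] And since the thread is about how the preconditioner is applied: in practice M^-1 r is never formed explicitly; one solves the two triangular systems L y = r and U z = y. Below is a minimal sketch of ILU-preconditioned CG using SciPy's spilu for the incomplete factorization. It is an illustration under those assumptions, not the video's implementation, and the function name pcg_ilu is hypothetical:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def pcg_ilu(A, b, tol=1e-8, max_iter=200):
    """Solve Ax = b (A symmetric positive definite) with CG,
    preconditioned by an incomplete LU factorization of A."""
    ilu = spla.spilu(sp.csc_matrix(A))   # A ≈ LU (incomplete factors)
    x = np.zeros_like(b)
    r = b - A @ x                        # initial residual
    z = ilu.solve(r)                     # z = M^-1 r via triangular solves
    p = z.copy()                         # first search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)            # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = ilu.solve(r)                 # apply the preconditioner again
        rz_new = r @ z
        p = z + (rz_new / rz) * p        # preconditioned search direction
        rz = rz_new
    return x
```

Usage would look like x = pcg_ilu(A, b). Note that CG itself requires A to be symmetric positive definite; the ILU factors only approximate A, which is exactly what makes M cheap to apply while still clustering the eigenvalues of M^-1 A.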
@QT-yt4db 1 year ago
Great video, but the voice is a little low; it's hard to hear without turning the volume way up...
@Darkev77 3 years ago
Proximal GD next?
@nickp7526 3 years ago
Great stuff! What range of subjects can you cover?
@nickp7526 3 years ago
Oh, I just got to the end :) Do you know anything about support vector machines? I have a class on it this semester and would like a small introduction.
@PriyaDeo 3 years ago
I have approximate knowledge of many things :P But to answer this question properly: topics from most college-level undergrad courses in Math and Computer Science; some topics from grad-level courses in Machine Learning, Computer Vision, and Robotics; software engineering interview problems (e.g. HackerRank questions); and so on. I can definitely do a video on SVMs.
@darasingh8937 3 years ago
Thank you so much!!