Numerical linear algebra: Preconditioned Conjugate Gradient method

2,077 views

Franks Mathematics

Comments: 12
@miqdadhasnain6618 · 2 years ago
This video was a big help, thank you so much!!
@KebutuhanKirimFile2 · 1 month ago
Thanks for the explanation. I am a beginner at this, but I would like to know what the matrix P is here. Could you please explain it a bit more? Thanks.
@franksmathematics9779 · 26 days ago
Hi, the matrix P is our preconditioner! Around minute 12 I give an example of how to use the Jacobi method as a preconditioner, which results in a certain matrix P. The idea/derivation of the matrix P was given earlier in the video. The choice of an efficient P (in the sense of fewer iterations in the CG method) is, however, not that easy, since it heavily depends on the underlying problem. You can easily derive different matrices P for different splitting methods like Gauss-Seidel, SSOR and so on. I hope this helps in understanding the idea - if you have any more questions, please do not hesitate!
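
As a supplement to the reply above, here is a minimal sketch in Python/NumPy of a preconditioned CG iteration with the Jacobi choice, i.e. P taken as the diagonal of A. It is not taken from the video; the function name pcg_jacobi and all variable names are purely illustrative.

import numpy as np

def pcg_jacobi(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Preconditioned CG with the Jacobi preconditioner P = diag(A)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    r = b - A @ x                 # initial residual
    d = np.diag(A)                # diagonal of A, the Jacobi preconditioner
    z = r / d                     # solve P z = r (trivial for a diagonal P)
    p = z.copy()                  # first search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)     # step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = r / d                 # apply the preconditioner to the new residual
        rz_new = r @ z
        beta = rz_new / rz        # update factor for the next search direction
        p = z + beta * p
        rz = rz_new
    return x

# small symmetric positive definite example
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(pcg_jacobi(A, b))           # approx. [0.0909, 0.6364]

For a non-diagonal preconditioner (Gauss-Seidel, SSOR, incomplete factorizations), the line z = r / d would be replaced by solving the system P z = r in some cheap way.
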
@KebutuhanKirimFile2 · 1 day ago
@@franksmathematics9779 Thanks for your explanation. I am now somehow able to apply it to my computations :D Now I have another question: in your example, did you apply the preconditioner to both z^k and z^(k+1)? If yes, does that mean the number of Jacobi steps (minute 14:05) applies to both z^k and z^(k+1)? I look forward to hearing from you. Thank you very much.
@heesangyoo1536 · 1 year ago
Thank you very much for your very nice explanation. Could you share the textbook you referenced? (Just the name is OK.)
@franksmathematics9779 · 1 year ago
Unfortunately, I do not have a textbook available where this is described the way I did it. I did a quick check of my textbooks, but they also do not cover the preconditioned CG method. Most of the material I present here is taken from papers - if papers are okay, I can give you some names...
@heesangyoo1536 · 1 year ago
@@franksmathematics9779 If you can, I would appreciate it. Thank you very much.
@franksmathematics9779 · 1 year ago
@@heesangyoo1536 It took me a while, but I found the source most of the results are taken from - unfortunately it is written in German and I am afraid there is no English translation... I hope this helps at least a little bit: Kanzow - Numerik linearer Gleichungssysteme (Numerics of linear systems of equations), Chapter 5.4 - Das präkonditionierte CG-Verfahren (The preconditioned CG method). Here is the link: link.springer.com/book/10.1007/b138019 If I stumble upon something else, I will let you know...
@ncroc · 1 year ago
Using (x,y) for the inner product is not good. Pretty much everybody uses (x,y) for vectors; it's better to stick with ⟨x,y⟩ for the inner product.
@franksmathematics9779 · 1 year ago
This depends. In my area of research (functional analysis / optimal control) it is very common to use (x,y) for inner products.
@ncroc · 1 year ago
@@franksmathematics9779 How do you write a tuple of vectors x, y, if not as (x,y)? By the way, using (x,y) for vector tuples is common in many optimization books. I am quite surprised that is not the case in your optimal-control-related field.
@franksmathematics9779 · 1 year ago
@@ncroc Fair enough, good point. To be more precise, I usually add an index like (x,y)_X to indicate that I mean the inner product in the space X. However, I usually drop this index when it is clear from the context. I have to admit that this may lead to some misunderstandings with tuples if you are using tuples along with inner products.
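
As a purely illustrative summary of the notational convention discussed above (an assumption about common usage, not a statement from the video), written in LaTeX:

\[
(x, y)_X \;\text{-- inner product of } x, y \in X \text{ (index dropped when clear from context)},
\]
\[
\langle x, y \rangle \;\text{-- angle-bracket notation for the same inner product},
\]
\[
(x, y) \in X \times Y \;\text{-- a tuple (pair) of vectors}.
\]
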