Numerical linear algebra: Preconditioned Conjugate Gradient method
Franks Mathematics · 2,207 views · 1 day ago

Comments: 17
@miqdadhasnain6618 · 2 years ago
This video was a big help, thank you so much!!
@KebutuhanKirimFile2 · 3 months ago
Thanks for the explanation. I am a beginner at this, but I would like to know: what is the matrix P here? Could you please explain it a bit more? Thanks.
@franksmathematics9779 · 2 months ago
Hi, the matrix P is our preconditioner! Around minute 12 I give an example of how to use the Jacobi method as a preconditioner, which results in a certain matrix P. The idea/derivation of the matrix P is given early in the video. The choice of an efficient P (in terms of fewer iterations in the CG method), however, is not that easy, since it heavily depends on the underlying problem. You can easily derive different matrices P for different splitting methods like Gauss-Seidel, SSOR and so on. I hope this helps in understanding the idea - if you have any more questions, please do not hesitate!
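[Editor's note: the idea above - CG with a preconditioner P applied to the residual - can be sketched in a few lines of Python. This is a minimal illustration, not the video's code; the function names and the small 2x2 test matrix are assumptions. The simplest Jacobi-type choice P = diag(A) is used, so applying P^{-1} is a componentwise division.]

```python
import numpy as np

def pcg(A, b, apply_Pinv, tol=1e-8, maxiter=1000):
    """Preconditioned CG for SPD A; apply_Pinv(r) returns (an approximation of) P^{-1} r."""
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    z = apply_Pinv(r)             # preconditioned residual
    p = z.copy()                  # first search direction
    rz = r @ z
    for k in range(1, maxiter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)     # step length
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, k
        z = apply_Pinv(r)
        rz_new = r @ z
        beta = rz_new / rz        # conjugation coefficient
        rz = rz_new
        p = z + beta * p
    return x, maxiter

# Simplest Jacobi-type preconditioner: P = diag(A).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x, iters = pcg(A, b, lambda r: r / np.diag(A))
```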
@KebutuhanKirimFile2 · 1 month ago
@franksmathematics9779 Thanks for your explanation. I am now somehow able to apply it to my computations :D Now I have another question: in your example, did you apply the preconditioner to both z^k and z^(k+1)? If yes, does it mean that the number of steps of the Jacobi method (minute 14:05) applies to both z^k and z^(k+1)? Looking forward to hearing from you. Thank you very much.
@franksmathematics9779 · 1 month ago
@KebutuhanKirimFile2 Glad you made it work! The number of Jacobi steps is denoted by l, which appears in the definition of the matrix P. So the formally correct answer to your question is: "You only apply the matrix P to z_k once to obtain z_{k+1}!" In practice you do not compute the inverse of P, but rather apply l steps of the Jacobi method to z_k and call the output z_{k+1}. Best, Frank
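[Editor's note: "apply l steps of the Jacobi method instead of inverting P" might look like the following sketch, assuming the standard Jacobi iteration with zero initial guess; the function name and test values are illustrative.]

```python
import numpy as np

def jacobi_sweeps(A, r, l):
    """Approximate z = P^{-1} r by running l Jacobi iterations on A z = r,
    starting from z = 0. P is never formed or inverted explicitly."""
    D = np.diag(A)                # diagonal of A (entries must be nonzero)
    z = np.zeros_like(r)
    for _ in range(l):
        z = z + (r - A @ z) / D   # one Jacobi sweep: z <- z + D^{-1} (r - A z)
    return z

A = np.array([[4.0, 1.0], [1.0, 3.0]])
r = np.array([1.0, 2.0])
z = jacobi_sweeps(A, r, l=50)     # for large l this approaches the exact solve A^{-1} r
```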
@KebutuhanKirimFile2 · 1 month ago
@franksmathematics9779 Thanks for your reply, but maybe I asked it wrongly. Let me clarify once again. First question: in your example, for the value l = 5, it gives a total of 130 iterations of CG. Does that mean the Jacobi loops are in fact performed 5 x 130 = 650 times? Second question: what about the actual CPU time for different values of l? I understand that increasing l decreases the number of CG iterations, but doing the Jacobi loops for larger l also consumes time (memory accesses, arithmetic operations, etc.)? Looking forward to hearing from you. Thank you very much!
@franksmathematics9779 · 1 month ago
@KebutuhanKirimFile2 Yes, you are right: multiplying (number of Jacobi iterations in each step) * (number of CG steps) gives the total number of performed Jacobi iterations. Regarding your second question: I do not have the data available anymore. There should be some sort of "sweet spot" where the computation time is minimal. Maybe I will redo the calculation and measure the CPU time - this sounds like an interesting question. Have you done any research into this topic and can you provide the results?
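[Editor's note: the "sweet spot" measurement discussed here is easy to prototype. The sketch below times PCG with an l-sweep Jacobi preconditioner for several values of l; the 1D Poisson test matrix and parameter values are assumptions for illustration, not results from the video.]

```python
import numpy as np
import time

def pcg_jacobi(A, b, l, tol=1e-8, maxiter=5000):
    """PCG where P^{-1} r is approximated by l Jacobi sweeps; returns (x, CG iterations)."""
    D = np.diag(A)
    def apply_Pinv(r):
        z = np.zeros_like(r)
        for _ in range(l):
            z = z + (r - A @ z) / D
        return z
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_Pinv(r)
    p = z.copy()
    rz = r @ z
    for k in range(1, maxiter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, k
        z = apply_Pinv(r)
        rz_new = r @ z
        beta = rz_new / rz
        rz, p = rz_new, z + beta * p
    return x, maxiter

# Illustrative test problem: 1D Poisson (tridiagonal) matrix.
n = 100
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
for l in (1, 2, 5, 10):
    t0 = time.perf_counter()
    x, iters = pcg_jacobi(A, b, l)
    dt = time.perf_counter() - t0
    print(f"l={l:2d}: CG iters={iters:4d}, total Jacobi sweeps={l * iters:5d}, time={dt:.4f} s")
```

Comparing l * iters and the wall-clock time across the runs shows the trade-off the commenter describes: more sweeps per CG step cost time even as the CG iteration count drops.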
@heesangyoo1536 · 1 year ago
Thank you very much for your very nice explanation. Could you share the textbook you referenced? (Just the name is OK.)
@franksmathematics9779 · 1 year ago
Unfortunately I do not have a textbook available where this is described the way I did it. I did a short check of my textbooks, but they also do not cover the preconditioned CG method. Most of the material I present here is taken from papers - if papers are okay, I can give you some names...
@heesangyoo1536 · 1 year ago
@franksmathematics9779 If you can, I would appreciate it, thank you very much.
@franksmathematics9779 · 1 year ago
@heesangyoo1536 It took me a while, but I found the source where most of the results are taken from - unfortunately it is written in German, and I am afraid there is no English translation... I hope this helps at least a little bit: Kanzow - Numerik linearer Gleichungssysteme (Numerics of linear systems of equations), Chapter 5.4 - Das präkonditionierte CG-Verfahren (The preconditioned CG method). Here is the link: link.springer.com/book/10.1007/b138019 If I stumble upon something else, I will let you know...
@ncroc · 1 year ago
Using (x,y) for the inner product is not good. Pretty much everybody uses (x,y) for vectors. It's better to stick with ⟨x,y⟩ for the inner product.
@franksmathematics9779 · 1 year ago
That depends. In my area of research (functional analysis / optimal control) it is very common to use (x,y) for inner products.
@ncroc · 1 year ago
@franksmathematics9779 How do you write a tuple of vectors x, y instead of (x,y)? By the way, using (x,y) for vector tuples is common in many optimization books. I am quite surprised that is not the case in your optimal-control-related field.
@franksmathematics9779 · 1 year ago
@ncroc Fair enough, good point. To be more precise, I usually add an index like (x,y)_X to indicate that I use the inner product of the space X. However, I usually drop this index when it is clear from the context. I have to admit that this may lead to some misunderstandings with tuples if you are using tuples alongside inner products.
Related videos:
Numerical linear algebra: Understanding splitting methods · Franks Mathematics · 32:30 · 684 views
Applied Linear Algebra: GMRES · Nathan Kutz · 49:11 · 13K views
Preconditioned Conjugate Gradient Descent (ILU) · Priya Deo · 7:36 · 6K views
Numerical linear algebra: Conjugate Gradient method · Franks Mathematics · 24:49 · 7K views
Preconditioners · Daniel An · 7:23 · 2.6K views
[CFD] Conjugate Gradient for CFD (Part 2): Optimum Distance and Directions · 34:26
8.2.5 Preconditioning · Advanced LAFF · 7:43 · 5K views
Conjugate gradient method · Lewis Mitchell · 14:32 · 7K views
Applied Linear Algebra: QR & Householder · Nathan Kutz · 46:31 · 13K views
Conjugate Gradient Method · Priya Deo · 9:35 · 123K views
Preconditioning in iterative solution of linear systems · Aerodynamic CFD · 8:41 · 6K views
Linear Algebra 13e: The LU Decomposition · MathTheBeautiful · 16:55 · 72K views