Modified Newton method | Exact Line Search | Theory and Python Code | Optimization Algorithms #4

65,800 views

Ahmad Bazzi

A day ago

Comments: 133
@leila3268 · 2 years ago
You are watching a master at work!
@AhmadBazzi · 2 years ago
Glad you enjoyed it!
@arifahmed5262 · 2 years ago
Good job
@habiburrehman5290 · 2 years ago
Good job, keep it up ☺️
@ak_amazingkinggaming3238 · 2 years ago
I don't understand why some people are hating. Yes, Prof. Ahmad missed a couple of symbols (once in a lifetime), but he's still the best!
@yasindemirel7871 · 2 years ago
Why are you dropping three videos (each about 40 minutes) a week? Please take care of your wellbeing, Ahmad :) We love you
@mdnishad1148 · 2 years ago
Nice 🥰🥰🥰
@SadManEdition · 2 years ago
This is the best explanation of optimization I have ever come across. This guy is talented.
@FuadEkberov090 · 2 years ago
I like how you make examples to explain the concepts of each method! Brilliant and easy to understand! Thank you so much!!!
@rasxodmusic4821 · 2 years ago
This will complete the Linear Algebra / Calculus / Optimization trio I was missing to truly understand Machine Learning.
@enessweed251 · 2 years ago
You are watching a master at work!!!!!!!!
@topmusic7356 · 2 years ago
The Convex Optimization King is back
@vishalchandekarekachchhawa9627 · 2 years ago
Those animations are sick af!
@nursyazwani7757 · 2 years ago
This and the video on the second Wolfe condition helped me completely understand the concepts and the intuition behind these criteria. Thank you very much! Amazing explanations.
@AhmadBazzi · 2 years ago
Wow Nur, glad you found them really helpful!
@kral-brawlstars6710 · 2 years ago
I am studying Non-Linear Optimization at Erasmus University Rotterdam. This video helped me significantly. Thank you for your efforts.
@razanalkado1688 · 2 years ago
That's correct: the value shown in the video is a local minimum, and the function is unbounded for large negative x and y values.
@JosephBenson · 2 years ago
Amazing lectures! I'm using these to study for my Continuous Optimization course.
@ulasyldz1344 · 2 years ago
This guy is super underrated!
@Informationalcreator · 2 years ago
Brilliant! One of the best explanations of gradient descent I have ever seen. Well done, man.
@ankushsharma6456 · 2 years ago
I really like the way you explained the optimization algorithm. Love it :)
@1twoka66 · 2 years ago
A man before his time ❤
@dj.emyemy8759 · 2 years ago
This nailed down the Adam paper. Thanks a lot.
@erenhd94 · 2 years ago
My professor at Stanford recommended this channel.
@dannycashy · 2 years ago
It's a good example of why it's important to understand the shape of the function you are optimizing, and one of the potential downfalls of gradient-based solvers.
@galacticgames9824 · 2 years ago
Great introductory video. Thank you.
@thevowtv1563 · 2 years ago
Excellent explanations, thank you! Will watch more.
@huseyinsuataydn6412 · 2 years ago
Love from the USA!
@volkanmaral1944 · 2 years ago
You are a good teacher.
@oromoshow2057 · 2 years ago
Thank you so much, sir, for such simple and brilliant graphical illustrations. RESPECT!!!!!
@hdcluptr366 · 2 years ago
All your videos are great.
@yahiabenliram8691 · 2 years ago
This lecture was recommended by my professor at Harvard!!!!!
@503_alexx · 2 years ago
Damn, those animations are sick, bruh!
@arifdrgn8756 · 2 years ago
This is incredible .. crystal-clear explanations.
@tthmidov2842 · 2 years ago
THANK YOU DR AHMAD!
@alionen · 2 years ago
Super. Glad this is useful. Good luck going forward.
@tamerolur2776 · 2 years ago
Cannot thank you enough, professor Ahmad Bazzi.
@mami3323 · 2 years ago
WOW! Truly impressive teacher. You should get professors fired for not being on your level.
@DJAYUSHOFFICIALL · 2 years ago
Thank you very much - that was concise, clear and easy to understand!
@awhlvna6261 · 2 years ago
You are so sweet. Thank you, Sir, for these awesome videos!
@galipaltnbay6308 · 2 years ago
I hope listening to this brings more positive YouTube channels like yours 💜
@legendpubg6590 · 2 years ago
I was already at MIT. I must have missed you! :)
@efraim3940 · 2 years ago
Wonderful explanation!!
@doruktadunya943 · 2 years ago
Nicely done
@filmcehennemi592 · 2 years ago
Many thanks! Please upload more videos! They are very helpful, especially for students!
@samedgaming4377 · 2 years ago
Nice tutorial, good explanation, thanks.
@soc1etysports767 · 2 years ago
Thanks for the explanations.
@vibezz_gal9010 · 2 years ago
Great explanation.. thank you
@مطبخامريتاجوتسنيم · 2 years ago
Excellent video! Thank you!
@puroporo9437 · 2 years ago
Great explanation
@DanielyOliveira · 2 years ago
Thank you so much for this video, it helped me a lot.
@failoyuncu1482 · 2 years ago
Your videos are very informative. Thank you!
@ugh.victory · 2 years ago
Greetings from Canada!
@besiktasndunyas954 · 2 years ago
Make more videos!! It was fantastic!!
@jadolive-erdem2829 · 2 years ago
Underrated!!!
@AHMETCAN-hk4en · 2 years ago
It indeed helped. Thanks for the great video.
@anuragtech1637 · 2 years ago
Peace be upon you, Doctor. Could you cover the easiest or best algorithms within Meta-heuristic Optimization? Best of luck, God willing.
@elekber_pubgaze6609 · 2 years ago
Great video!
@ramibayraktar3740 · 2 years ago
Very nice and cool
@confusedtv · 2 years ago
Thanks for the great explanation, oh mighty bearded one!
@CİOzSlegzWeg · 2 years ago
Wow, that's an interesting approach. I hope the videos are helpful :-)
@hayaletoyuncu9887 · 2 years ago
Not fair, this guy is more underrated than Hugh Hefner!
@eheehe7965 · 2 years ago
Highly appreciated!!
@shiroitem8230 · 2 years ago
Great video, thanks.
@sreerag2167 · 2 years ago
To the viewers: I am a graduate student getting into optimization theory, and I must say this is the best explanation of the inexact line search conditions, which often go by the name Armijo conditions.
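The Armijo (sufficient decrease) condition this comment refers to can be sketched as a backtracking line search. A minimal illustration in Python, not taken from the video; the test function, constants, and starting point are hypothetical:

```python
import numpy as np

def backtracking_armijo(f, grad_f, x, d, alpha0=1.0, c1=1e-4, rho=0.5):
    """Shrink the step length alpha until the Armijo condition
    f(x + alpha*d) <= f(x) + c1 * alpha * grad_f(x)^T d holds."""
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ d  # directional derivative; negative for a descent direction
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho  # backtrack: try a shorter step
    return alpha

# Example on f(x, y) = x^2 + 10 y^2 with a steepest-descent direction
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
x0 = np.array([1.0, 1.0])
alpha = backtracking_armijo(f, grad, x0, -grad(x0))
```

Starting from alpha0 = 1, the loop halves the step until the sufficient-decrease test passes, which is the "inexact" part: any alpha satisfying the inequality is accepted.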
@emrebozkurt4676 · 2 years ago
Damn. This was a really good explanation.
@atakanbayar93 · 2 years ago
Thanks for adding captions, helpful!
@aybars83 · 2 years ago
Thank you very much, sir!
@edanursirin4414 · 2 years ago
Nice 👍🏽
@hakancuval1496 · 2 years ago
Hi Ahmad, in this context a solver is a computer program that finds the solution to an optimization problem.
@melekzehrakara3365 · 2 years ago
Great video, thanks
@mertkara9165 · 2 years ago
Love this
@masteytv4664 · 2 years ago
Nice video
@LIFEOFAGAMER297 · 2 years ago
Many thanks!
@garyoak7462 · 2 years ago
Followed you on LinkedIn ..
@mdnishad1148 · 2 years ago
Nice
@fxriyt2206 · 2 years ago
I just learned from a friend of a friend who knows Ahmad that he does not have a team to help him with all this 😲 Then I found the answer to my question of whether superheroes exist.
@clothyrabbitz1645 · 2 years ago
I like that tea-drinking always causes non-zero suffering
@mumbaikichorisn1534 · 2 years ago
You are my god.
@gameking7440 · 2 years ago
Why does Ahmad not have 1 million subscribers by now?
@StarMax07 · 2 years ago
Super nice! :)
@nekros3989 · 2 years ago
Thank you so much for such a helpful video! May I ask what software setup you used to create the illustrations and images on the screen and to edit the video?
@huseyinhatip7066 · 2 years ago
Nice video, Ahmad. I would like to point out that the slope equation at 2:20 is mistyped. I hope it helps. Thanks.
@MRGAMINGop1 · 2 years ago
Sir, please explain how to apply a dataset to any one of the algorithms.
@amineslimane5152 · 2 years ago
That's used in optimization methods for training a machine learning model! You need to refer to the Machine Learning course.
@wolfbeta5828 · 2 years ago
Can you give any advice on optimizing PID controllers using Adam? (Thank you in advance ❤️)
@مطبخالشطورةوبس · 2 years ago
Why do we need the bias correction of the momentum or RMS terms using t?
@eneserkurt1702 · 2 years ago
This example would technically be more correct if posed as a constrained optimization problem to limit the search to just the area shown.
@HH-nb1ou · 2 years ago
But we do initialize the parameters beta_1 and beta_2?
@takeuchikame · 2 years ago
Hey, any ideas on what a fuzzy rule-based model is, or lasso regression?
@mysterchannel5613 · 2 years ago
Wow
@acclaros356 · 2 years ago
Nice. Nice! I want to ask: what is the time complexity of any evolutionary-based optimization algorithm?
@ohiogaming_7 · 2 years ago
Can Adam optimization be used for classification problems?
@j.g.gaming7296 · 2 years ago
Dear Prof. Ahmad, we have proven that there exists an alpha with 0 < beta_1 < alpha < beta_2 < 1 such that both Wolfe conditions are fulfilled. So why do we initialize the lower bound equal to zero and the upper bound equal to infinity? Is it not easier to set the lower bound on alpha to beta_1 and the upper bound to beta_2? Best regards from Uni Heidelberg.
@umuttosun2479 · 2 years ago
How can I get an algorithm for Ant Colony Optimization?
@gptpodcast238 · 2 years ago
Because we do not know their value from the theory. We just know that they exist.
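The initialization being discussed (lower bound 0, upper bound infinity) can be sketched as a bisection-style search that brackets a step length satisfying both weak Wolfe conditions. A hypothetical minimal version in Python; the function, constants, and update rules are illustrative, not the video's exact code:

```python
import numpy as np

def wolfe_bisection(f, grad_f, x, d, c1=1e-4, c2=0.9, max_iter=100):
    """Find a step alpha satisfying both weak Wolfe conditions by
    maintaining a bracket [lo, hi] that starts at [0, +inf)."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    fx, slope = f(x), grad_f(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:
            hi = alpha                      # sufficient decrease fails: too long
            alpha = 0.5 * (lo + hi)
        elif grad_f(x + alpha * d) @ d < c2 * slope:
            lo = alpha                      # curvature fails: too short
            alpha = 2 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha                    # both Wolfe conditions hold
    return alpha

f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
x0 = np.array([1.0, 1.0])
alpha = wolfe_bisection(f, grad, x0, -grad(x0))
```

The bounds start at 0 and infinity precisely because the theory only guarantees an acceptable alpha exists somewhere; the bracket shrinks (or doubles upward while unbounded) until one is found.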
@Rita-ky4wr · 2 years ago
(How to calculate convolutional layer volume in a ConvNet)
@gamerspkistan300 · 2 years ago
I didn't realize he was left-handed until 03:34 lol
@sarabrajsmusic · 2 years ago
Thank you, God. I mean Ahmad!
@thekoray1031 · 2 years ago
No. Quasi-Newton methods do not use second derivatives, only the gradient. They build a secant approximation of the second-derivative matrix. See chapter 13 of
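The secant approximation this comment describes can be sketched with the BFGS update, which builds an approximate inverse Hessian from gradient differences alone. A minimal, illustrative version (the quadratic test problem and step are hypothetical):

```python
import numpy as np

def bfgs_update(H, s, y):
    """BFGS update of the inverse-Hessian approximation H.
    s = x_{k+1} - x_k, y = grad_{k+1} - grad_k. H stays symmetric
    positive definite as long as s^T y > 0 (the curvature condition),
    and the new H satisfies the secant equation H y = s exactly."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    return (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
           + rho * np.outer(s, s)

# On a quadratic f(x) = 0.5 x^T A x, the gradient difference is y = A s,
# so one update already encodes curvature information along s.
A = np.array([[2.0, 0.0], [0.0, 10.0]])
s = np.array([0.1, -0.2])   # hypothetical step taken
y = A @ s                   # gradient difference for this quadratic
H = bfgs_update(np.eye(2), s, y)
```

No second derivatives are ever evaluated: the Hessian information enters only through the observed change in gradients between iterates.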
@marouaammine2986 · 2 years ago
🇨🇦
@alioter3032 · 2 years ago
So what you're saying is, Buddha should have used ML to minimize suffering. Got it.
@perikz8679 · 2 years ago
If Nocedal & Wright confused you, this is the place to get it sorted once and for all.
@3HarfliTas · 2 years ago
Great explanation. Please join MIT to help us :)
@duhancevre2157 · 2 years ago
Sdb = β₂Sdb + (1 - β₂)db²
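The formula in this comment is Adam's second-moment (RMS) accumulator. A minimal sketch of one Adam step in Python, including the bias correction by (1 - β^t) asked about earlier in the thread; names, constants, and the test problem are illustrative assumptions:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum (m) and RMS (v) accumulators, each
    bias-corrected by (1 - beta^t) because both start at zero and are
    therefore biased toward zero for small t."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad**2       # second moment: the Sdb update above
    m_hat = m / (1 - beta1**t)                  # bias correction
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimizing f(w) = w^2, whose gradient is 2w, for a few steps
w, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
for t in range(1, 201):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
```

Without the division by (1 - β^t), the early steps would be scaled-down versions of the true moment estimates, which is exactly the correction the question above is about.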