Gradient Descent and its variants are very useful, but there's an entirely different class of optimization techniques that isn't as widely understood: second-order methods. We'll learn how second-order variants work, how they compare to first-order methods, and implement our own in Python.
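As a taste of what the video covers, here is a minimal sketch of Newton's method, the classic second-order technique. It is not the code from the linked repo; the test function f(x) = x^4 - 3x^3 + 2 and the function names are illustrative assumptions. The key idea: instead of stepping by a fixed learning rate times the gradient, each step divides the gradient by the second derivative (the Hessian, in higher dimensions), which adapts the step size to the local curvature.

```python
# Newton's method on a 1-D test function f(x) = x^4 - 3x^3 + 2.
# In 1-D, the "Hessian" is just the second derivative, so the
# Newton step H^-1 * g reduces to f'(x) / f''(x).

def f_prime(x):
    """First derivative of f: 4x^3 - 9x^2."""
    return 4 * x**3 - 9 * x**2

def f_double_prime(x):
    """Second derivative of f: 12x^2 - 18x."""
    return 12 * x**2 - 18 * x

def newtons_method(x0, tol=1e-10, max_iter=100):
    """Iterate x <- x - f'(x)/f''(x) until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = f_prime(x) / f_double_prime(x)  # curvature-scaled step
        x -= step
        if abs(step) < tol:
            break
    return x

minimum = newtons_method(x0=6.0)
print(minimum)  # converges to 2.25, where f'(x) = x^2(4x - 9) = 0
```

Note that unlike gradient descent there is no learning rate to tune, but each iteration needs second-derivative information, which is the central trade-off the video explores.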
Code for this video (with challenge):
github.com/llS...
Alberto's Winning Code:
github.com/alb...
Ivan's Runner up Code:
github.com/Pia...
Please Subscribe! And like. And comment. That's what keeps me going.
Course Syllabus:
github.com/llS...
More learning resources:
web.stanford.e...
www.cs.toronto...
www.quora.com/...
en.wikipedia.o...
• (ML 15.1) Newton's met...
• (ML 15.2) Newton's met...
Join us in the Wizards Slack channel:
wizards.herokua...
And please support me on Patreon:
www.patreon.co...
Follow me:
Twitter: / sirajraval
Facebook: / sirajology
Instagram: / sirajraval
Signup for my newsletter for exciting updates in the field of AI:
goo.gl/FZzJ5w
Hit the Join button above to sign up to become a member of my channel for access to exclusive content!
Join my AI community: chatgptschool.io/
Sign up for my AI sports betting bot, WagerGPT! (500 spots available):
www.wagergpt.co