193,605 views
The main ideas behind Backpropagation are super simple, but there are tons of details when it comes time to implement it. This video shows how to optimize three parameters in a Neural Network simultaneously and introduces some Fancy Notation.
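The core idea the video walks through — using the Chain Rule to get the derivative of the sum of squared residuals (SSR) with respect to each parameter, then updating all three parameters in the same Gradient Descent step — can be sketched in plain Python. Note that the network shape, the fixed hidden-layer values, and the training data below are assumptions for illustration, not the exact numbers from the video.

```python
import math

def softplus(z):
    """SoftPlus activation: log(1 + e^z)."""
    return math.log(1.0 + math.exp(z))

# Hypothetical training data (dose, observed efficacy) -- assumed for
# illustration only.
data = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]

# Hidden-layer weights and biases are held fixed (assumed values); only the
# three output-layer parameters w3, w4, and b3 are optimized simultaneously.
w1, b1 = 3.34, -1.43
w2, b2 = -3.53, 0.57
w3, w4, b3 = 0.36, 0.63, 0.0  # initial guesses

learning_rate = 0.1
for step in range(1000):
    d_w3 = d_w4 = d_b3 = 0.0
    for x, observed in data:
        y1 = softplus(w1 * x + b1)          # hidden node 1 activation
        y2 = softplus(w2 * x + b2)          # hidden node 2 activation
        predicted = w3 * y1 + w4 * y2 + b3  # network output
        residual = observed - predicted
        # Chain Rule: d(SSR)/d(param) = -2 * residual * d(predicted)/d(param)
        d_w3 += -2.0 * residual * y1
        d_w4 += -2.0 * residual * y2
        d_b3 += -2.0 * residual
    # Gradient Descent: update all three parameters in the same step
    w3 -= learning_rate * d_w3
    w4 -= learning_rate * d_w4
    b3 -= learning_rate * d_b3

ssr = sum(
    (obs - (w3 * softplus(w1 * x + b1) + w4 * softplus(w2 * x + b2) + b3)) ** 2
    for x, obs in data
)
print(f"final SSR: {ssr:.6f}")
```

Because each derivative is computed from the same forward pass before any parameter changes, the three updates happen simultaneously — which is the key point the video makes about optimizing multiple parameters at once.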
NOTE: This StatQuest assumes that you already know the main ideas behind Backpropagation: • Neural Networks Pt. 2:...
...and that also means you should be familiar with...
Neural Networks: • The Essential Main Ide...
The Chain Rule: • The Chain Rule
Gradient Descent: • Gradient Descent, Step...
LAST NOTE: When I was researching this 'Quest, I found this page by Sebastian Raschka to be helpful: sebastianraschka.com/faq/docs...
For a complete index of all the StatQuest videos, check out:
statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Buying my book, The StatQuest Illustrated Guide to Machine Learning:
PDF - statquest.gumroad.com/l/wvtmc
Paperback - www.amazon.com/dp/B09ZCKR4H6
Kindle eBook - www.amazon.com/dp/B09ZG79HXC
Patreon: / statquest
...or...
YouTube Membership: / @statquest
...a cool StatQuest t-shirt or sweatshirt:
shop.spreadshirt.com/statques...
...buying one or two of my songs (or go large and get a whole album!)
joshuastarmer.bandcamp.com/
...or just donating to StatQuest!
www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
/ joshuastarmer
0:00 Awesome song and introduction
3:01 Derivatives do not change when we optimize multiple parameters
6:28 Fancy Notation
10:51 Derivatives with respect to two different weights
15:02 Gradient Descent for three parameters
17:19 Fancy Gradient Descent Animation
#StatQuest #NeuralNetworks #Backpropagation