Backpropagation is the method we use to optimize parameters in a Neural Network. The ideas behind backpropagation are quite simple, but there are tons of details. This StatQuest focuses on explaining the main ideas in a way that is easy to understand.
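To make the main idea concrete, here is a minimal Python sketch (my own illustration, not code from the video): a toy network with two softplus hidden units whose weights are held fixed, where the Chain Rule gives the derivative of the Sum of the Squared Residuals with respect to the final bias, b3, and Gradient Descent then optimizes b3. All names and numbers here (x, observed, w1...w4, b1, b2, b3, the learning rate) are hypothetical placeholders, not values from the video.

```python
import numpy as np

# Toy data: a few inputs (e.g., dosages) and observed outputs (e.g., efficacy).
x = np.array([0.0, 0.5, 1.0])
observed = np.array([0.0, 1.0, 0.0])

def softplus(z):
    return np.log(1.0 + np.exp(z))

# Fixed (pretend already-learned) parameters: two hidden units and
# their weights on the output. Only the final bias b3 is optimized below.
w1, b1, w3 = 3.34, -1.43, -1.22
w2, b2, w4 = -3.53, 0.57, -2.30

def predict(b3):
    y1 = softplus(w1 * x + b1)          # activation of hidden unit 1
    y2 = softplus(w2 * x + b2)          # activation of hidden unit 2
    return w3 * y1 + w4 * y2 + b3       # output = weighted sum + final bias

b3 = 0.0             # initial guess for the final bias
learning_rate = 0.1

for _ in range(1000):
    residuals = observed - predict(b3)
    # Chain Rule: d(SSR)/d(b3) = d(SSR)/d(predicted) * d(predicted)/d(b3)
    #                          = sum(-2 * residuals) * 1
    gradient = np.sum(-2.0 * residuals)
    step_size = learning_rate * gradient
    b3 -= step_size                     # one Gradient Descent update
    if abs(step_size) < 1e-6:           # stop when the step size is tiny
        break

print(f"optimized b3 = {b3:.3f}")
print(f"SSR = {np.sum((observed - predict(b3)) ** 2):.4f}")
```

Because the SSR is quadratic in b3, Gradient Descent settles on the bias that makes the residuals sum to zero; the same recipe (Chain Rule derivative, then a Gradient Descent step) extends to every weight and bias in the network.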
NOTE: This StatQuest assumes that you already know the main ideas behind...
Neural Networks: [ Link ]
The Chain Rule: [ Link ]
Gradient Descent: [ Link ]
LAST NOTE: When I was researching this 'Quest, I found this page by Sebastian Raschka to be helpful: [ Link ]
For a complete index of all the StatQuest videos, check out:
[ Link ]
If you'd like to support StatQuest, please consider...
Buying my book, The StatQuest Illustrated Guide to Machine Learning:
PDF - [ Link ]
Paperback - [ Link ]
Kindle eBook - [ Link ]
Patreon: [ Link ]
...or...
YouTube Membership: [ Link ]
...a cool StatQuest t-shirt or sweatshirt:
[ Link ]
...buying one or two of my songs (or go large and get a whole album!)
[ Link ]
...or just donating to StatQuest!
[ Link ]
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
[ Link ]
0:00 Awesome song and introduction
3:55 Fitting the Neural Network to the data
6:04 The Sum of the Squared Residuals
7:23 Testing different values for a parameter
8:38 Using the Chain Rule to calculate a derivative
13:28 Using Gradient Descent
16:05 Summary
#StatQuest #NeuralNetworks #Backpropagation