Epoch, iteration, batch size... what do these terms mean, and how do they impact the training of neural networks?
I explain all of this in this video, and I also go into some details of how Gradient Descent differs from Stochastic Gradient Descent when it comes to training your neural network.
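To make the terminology concrete, here is a minimal NumPy sketch of a mini-batch SGD training loop (my own illustration, not code from the video; the toy data, learning rate, and batch size of 32 are arbitrary example values):

import numpy as np

# Toy regression data: 1000 samples, 5 features (example values).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.1 * rng.normal(size=1000)

w = np.zeros(5)     # model parameters
batch_size = 32     # samples processed per iteration
lr = 0.1            # learning rate
n_epochs = 5        # full passes over the dataset

for epoch in range(n_epochs):
    order = rng.permutation(len(X))   # reshuffle every epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Mean-squared-error gradient computed on this mini-batch only.
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad                # one iteration = one parameter update
    print(f"epoch {epoch + 1}: full-dataset MSE = {np.mean((X @ w - y) ** 2):.4f}")

With 1000 samples and a batch size of 32, one epoch takes ceil(1000 / 32) = 32 iterations (parameter updates). Full-batch Gradient Descent is the special case batch_size = len(X): one epoch, exactly one update.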
TIMESTAMPS:
0:00 Intro & Training Cycle
0:58 Iteration
2:04 Epoch
3:06 Full batch GD
4:27 Mini Batch SGD pros & cons
6:41 Conclusion
Subscribe for more Deep Learning and Machine Learning content from a Data Science Consultant, and learn along with me :))
---------
You can also find me on Instagram, where I post almost daily:
[ Link ]
And on my blog:
[ Link ]