In this video, we will understand all the major optimizers in Deep Learning. We will see what Optimization in Deep Learning is and why we need it in the first place.
Optimization in Deep Learning is a difficult concept to understand, so I have done my best to give you the clearest possible explanation after studying it from different sources, so that you can understand it with ease.
So I hope that after watching this video, you will no longer struggle with the concept and will understand it well.
Optimization in Deep Learning is a technique that speeds up the training of the model.
If you know about mini-batch gradient descent, then you will know that in mini-batch gradient descent the learning takes place in a zig-zag manner. Thus, some time gets wasted moving in a zig-zag direction instead of a straight one.
Optimizers in Deep Learning make the learning path straighter, thus reducing the time taken to train the model.
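As a rough illustration (not code from the video), here is a minimal NumPy sketch of the Momentum update covered later on: it keeps an exponentially weighted moving average of past gradients and steps along that average, which damps the zig-zag of raw mini-batch gradients. The function name, loss, and hyperparameter values are my own assumptions for illustration.

import numpy as np

def momentum_step(params, grads, velocity, lr=0.01, beta=0.9):
    # Exponentially weighted moving average of past gradients
    velocity = beta * velocity + (1 - beta) * grads
    # Step along the smoothed direction instead of the raw gradient
    params = params - lr * velocity
    return params, velocity

# Hypothetical usage on a simple quadratic loss ||params||^2
params = np.array([1.0, 1.0])
velocity = np.zeros_like(params)
for step in range(100):
    grads = 2 * params            # gradient of the quadratic loss
    params, velocity = momentum_step(params, grads, velocity)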
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
▶ Mini Batch Gradient Descent: [ Link ]
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
✔ Improving Neural Network Playlist: [ Link ]
✔ Complete Neural Network Playlist: [ Link ]
✔ Complete Logistic Regression Playlist: [ Link ]
✔ Complete Linear Regression Playlist: [ Link ]...
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Timestamps:
0:00 Agenda
1:02 Why do we need Optimization in Deep Learning
2:36 What is Optimization in Deep Learning
3:43 Exponentially Weighted Moving Average
9:20 Momentum Optimizer Explained
11:53 RMSprop Optimizer Explained
15:36 Adam Optimizer Explained
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Subscribe to my channel, because I upload a new Machine Learning video every week: [ Link ]...