Activation functions (step, sigmoid, tanh, ReLU, leaky ReLU) are essential for building a non-linear model for a given problem. In this video we will cover the different activation functions that are used while building a neural network and discuss their pros and cons:
1) Step
2) Sigmoid
3) tanh
4) ReLU (rectified linear unit)
5) Leaky ReLU
We will also write Python code to implement these functions and see how they behave for sample inputs.
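As a preview, here is a minimal NumPy sketch of the five functions evaluated on a few sample inputs (the exact implementation in the GitHub repo linked below may differ):

import numpy as np

def step(x):
    # Binary step: outputs 1 where the input crosses the 0 threshold, else 0
    return np.where(x >= 0, 1, 0)

def sigmoid(x):
    # Squashes input to (0, 1); gradients vanish for large |x|
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # Squashes input to (-1, 1); zero-centered, unlike sigmoid
    return np.tanh(x)

def relu(x):
    # max(0, x); cheap to compute, but "dies" for negative inputs
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope alpha for negative inputs
    return np.where(x > 0, x, alpha * x)

# Try each function on the same sample inputs
samples = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
for fn in (step, sigmoid, tanh, relu, leaky_relu):
    print(f"{fn.__name__:>10}: {np.round(fn(samples), 3)}")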
GitHub link for code in this tutorial: [ Link ]
Do you want to learn technology from me? Check [ Link ] for my affordable video courses.
🔖 Hashtags 🔖
#activationfunction #activationfunctionneuralnetwork #neuralnetwork #deeplearning
Next video: [ Link ]
Previous video: [ Link ]
Deep learning playlist: [ Link ]
Machine learning playlist: [ Link ]
Prerequisites for this series:
1: Python tutorials (first 16 videos): [ Link ]
2: Pandas tutorials (first 8 videos): [ Link ]
3: Machine learning playlist (first 16 videos): [ Link ]
🌎 My Website For Video Courses: [ Link ]
Need help building software or data analytics and AI solutions? My company [ Link ] can help. Click on the Contact button on that website.
#️⃣ Social Media #️⃣
🔗 Discord: [ Link ]
📸 Dhaval's Personal Instagram: [ Link ]
📸 Instagram: [ Link ]
🔊 Facebook: [ Link ]
📝 LinkedIn (Personal): [ Link ]
📝 LinkedIn (Codebasics): [ Link ]
📱 Twitter: [ Link ]
🔗 Patreon: [ Link ]