Overfitting is one of the main problems we face when building neural networks. Before trying out fixes for overfitting or underfitting, it is important to understand what each one means, why it happens, and what problems it causes for our neural networks. In this video, we will look at dropout regularization: how this technique works, how it differs from L1/L2 regularization, and when to use it.
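To make the idea concrete, here is a minimal NumPy sketch of "inverted" dropout, the variant most frameworks use: during training, each activation is zeroed out with probability `drop_rate` and the survivors are rescaled by `1 / (1 - drop_rate)` so the expected activation stays the same; at inference, activations pass through unchanged. This is an illustrative sketch, not the course's own code (the function name `dropout_forward` is made up for this example).

```python
import numpy as np

def dropout_forward(activations, drop_rate=0.5, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, randomly zero a fraction `drop_rate` of the
    activations and rescale the survivors by 1 / keep_prob, so the
    expected value of each activation is unchanged. At inference
    (training=False), return the activations untouched.
    """
    if not training or drop_rate == 0.0:
        return activations
    rng = rng if rng is not None else np.random.default_rng(0)
    keep_prob = 1.0 - drop_rate
    # Boolean mask: True = keep this unit, False = drop it.
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

# Example: a layer of all-ones activations.
a = np.ones((4, 5))
out_train = dropout_forward(a, drop_rate=0.5)                  # mix of 0.0 and 2.0
out_eval = dropout_forward(a, drop_rate=0.5, training=False)   # identical to a
```

Because the rescaling happens at training time, no extra bookkeeping is needed at inference, which is why this "inverted" form is the one built into libraries such as Keras (`tf.keras.layers.Dropout`).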
Previous lesson: • When Should You Use L1...
Next lesson: • Regularization with Da...
📙 Here is a lesson notes booklet that summarizes everything you learn in this course in diagrams and visualizations. You can get it here 👉 misraturp.gumr...
📕 NNs hyperparameters cheat sheet: www.soyouwantt...
👩💻 You can get access to all the code I develop in this course here: github.com/mis...
❓ To get the most out of the course, don't forget to answer the end-of-module questions:
fishy-dessert-...
👉 You can find the answers here:
fishy-dessert-...
RESOURCES:
🏃♀️ Data Science Kick-starter mini-course: www.misraturp....
🐼 Pandas cheat sheet: misraturp.gumr...
📥 Streamlit template (updated in 2023, now for $5): misraturp.gumr...
📝 NNs hyperparameters cheat sheet: www.misraturp....
📙 Fundamentals of Deep Learning in 25 pages: misraturp.gumr...
COURSES:
👩💻 Hands-on Data Science: Complete your first portfolio project: www.misraturp....
🌎 Website - misraturp.com/
🐥 Twitter - / misraturp
What is Dropout Regularization | How is it different?