If you found this video helpful, then hit the *_like_* button👍, and don't forget to *_subscribe_* ▶ to my channel as I upload a new Machine Learning Tutorial every week.
@lamis_18
2 years ago
Please indicate your variables, or use pictures to explain what Z, A, etc. are.
@CodingLane
2 years ago
@@lamis_18 okay
@masplacasmaschicas6155
9 months ago
You explain these concepts more completely and simply than any other video I’ve seen. Thank you
@tiyyob
2 years ago
Just started this playlist and found it very well explained. Thank you for the great work.
@CodingLane
2 years ago
Thank you!
@brindhasenthilkumar7871
3 years ago
A clear and brief explanation of the activation functions, Sir Patel. Wonderful! I am acquiring new knowledge from every one of your videos. Great going!
@CodingLane
3 years ago
Thank you so much! I am glad you found my videos helpful.
@G83X
8 months ago
damn, this is lowkey a really good and insightful way of explaining this. I'll be sharing with my students. Exceptional tutorial
@user-ul2mw6fu2e
2 years ago
First of all, thank you very much for these videos. I have a question about cross entropy. I understand how cross entropy works. I don't understand why it works. I would appreciate it if you make videos about these topics.
@CodingLane
A year ago
Thanks for the suggestion. Will try to cover this topic.
@rutvipatel6896
2 years ago
You are saving me rn from my midterm tomorrow. Thank you!!!
@CodingLane
A year ago
Really happy to hear this. Glad the videos helped you! 🙂
@aienImchen-hs6fp
6 months ago
Will this explanation be enough for a beginner in ML? I understood what you have explained. I am learning from you. Thank you.
@TechWorld-ec2ec
23 days ago
Amazing explanation
@algorithmo134
3 months ago
How does ReLU solve the vanishing gradient problem, since part of the gradient is zero for x < 0?
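A fair question. ReLU does not help units stuck at z < 0 (the "dying ReLU" problem), but for every active unit (z > 0) its derivative is exactly 1, so backpropagated gradients are not repeatedly shrunk layer after layer the way sigmoid's derivative (at most 0.25, and near 0 at the extremes) shrinks them. A quick NumPy comparison (my own sketch, not from the video):

```python
import numpy as np

def sigmoid_grad(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)          # peaks at 0.25, ~0 for large |z|

def relu_grad(z):
    return (z > 0).astype(float)  # exactly 1 for z > 0, 0 for z <= 0

z = np.array([-10.0, -1.0, 1.0, 10.0])
print(sigmoid_grad(z))  # tiny at the extremes: stacked layers multiply these
print(relu_grad(z))     # 0 for negative inputs, but a full 1 for active units
```

Multiplying many factors below 0.25 is what makes deep sigmoid networks' early-layer gradients vanish; ReLU's active path contributes a factor of exactly 1 instead.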
@chillax1629
A year ago
Thanks a lot for sharing! It really helped me understand why and when to use which activation function. Very good!
@Ivaan_reminiscence
4 months ago
Does ReLU make f(x) = 0 even if x is very small but > 0? Because with tanh/sigmoid the rate of change of the gradient becomes very small but stays > 0, whereas with ReLU f(x) seems to be 0 only when x
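For what it's worth: ReLU is just f(x) = max(0, x), so it passes any positive input through unchanged, however small, and outputs exactly 0 only for x <= 0. A tiny sketch (my own, not from the video):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# ReLU passes any positive input through unchanged, however small;
# the output is exactly 0 only for x <= 0.
x = np.array([-0.5, 0.0, 1e-6, 2.0])
print(relu(x))  # zeros only for the non-positive entries
```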
@pankajmourya4583
9 months ago
Great work bro 👍
@ArchitStark
2 months ago
Is it possible for you to add/share further reading documents?
@maheshyezarla5294
A year ago
very very very useful for me. Thank you
@CodingLane
A year ago
Glad I could help!
@HeshamAlshafie-z9m
7 months ago
Perfect explanation. Thank you. Keep going!
@Ivaan_reminiscence
4 months ago
@10:18 Wouldn't it be "both the tanh and sigmoid functions (and not 'ReLU') had this disadvantage of the vanishing gradient problem..."? ReLU is its solution, right?
@Satvikshukla0007
A month ago
Very well explained
@CodingLane
A month ago
Thank you! Glad it was helpful!
@RamaKrishna-fp7yd
2 years ago
Keep it up bro, nice explanation ✅
@CodingLane
2 years ago
Thank you!
@susw3602
2 years ago
Which one is a non-symmetric activation function?
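If it helps while waiting for a reply: of the functions discussed, sigmoid is the non-symmetric one. Its outputs lie in (0, 1), so it is not zero-centered, while tanh is an odd function, symmetric about the origin. A small NumPy check (my own sketch, not from the video):

```python
import numpy as np

x = np.linspace(-3.0, 3.0, 7)
sig = 1.0 / (1.0 + np.exp(-x))

# tanh is odd (origin-symmetric): tanh(-x) == -tanh(x)
print(np.allclose(np.tanh(-x), -np.tanh(x)))             # True
# sigmoid instead satisfies sigmoid(-x) == 1 - sigmoid(x),
# and its outputs stay in (0, 1), so it is not zero-centered
print(np.allclose(1.0 / (1.0 + np.exp(x)), 1.0 - sig))   # True
print(np.allclose(1.0 / (1.0 + np.exp(x)), -sig))        # False
```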
@PrithaMajumder
2 months ago
Thanks a lot for this amazing introductory lecture 😀 Lecture 3 completed from this neural network playlist.
@Luca_040
9 months ago
Good summary, thank you
@MeshachJones-d5s
5 months ago
soooooo grateful for you
@nemeziz_prime
A year ago
Great explanation 🔥👏🏻
@pratiknale6993
2 years ago
💐💐💐
@CodingLane
2 years ago
😇😇
@saumyaagrawal7781
26 days ago
I think I’m in love with you
@saumyaagrawal7781
26 days ago
Jk but you’re amazing
@mrunalwaghmare
A month ago
Brother, why does no one explain this in Hindi? 🤢🤮
@CodingLane
A month ago
Try the Code Basics Hindi channel. You might like it.
@sameera5388-h5m
12 days ago
PERFECT
@manikantaperumalla2197
4 months ago
Well explained, brother. Keep it up!
@priyanshupatelhawk
A year ago
Amazing explanation! Just one mistake from 10:16 to 10:24: it should be "Sigmoid and TanH", not "ReLU and TanH"...
@surajjoshi3433
3 years ago
Hey bro, I am a beginner learning deep learning. Can you suggest any materials to learn deep learning from scratch?
@CodingLane
3 years ago
Hi Suraj, I would highly recommend taking the Coursera Deep Learning course from Andrew Ng. Here's the link: www.coursera.org/specializations/deep-learning This course is for absolute beginners, and you will develop a better understanding of deep learning. Also, if you feel you can't afford it, there are ways on Coursera to take courses for free, like auditing or applying for financial aid. I hope you find deep learning interesting!
@surajjoshi3433
3 years ago
@@CodingLane Thank you brother 😄 for your suggestion
Comments: 48