This is the most complete explanation of BN I've seen on YouTube. Thank you!
@MLForNerds
A year ago
Glad it was helpful!
@ThineshKumar-gk8hb
6 months ago
Good explanation; the knowledge transfer is very efficient. Expecting more like this from you.
@ankurdas1477
A year ago
The video series on Batch Normalization is a real gem. Before watching it, I thought I knew everything about BN 😅. Another request from my side: please complete the object detection playlist.
@MLForNerds
A year ago
Thank you 😊
@anantmohan3158
A year ago
Thank you for creating such wonderful videos. The math behind BN was beautifully explained. Eagerly waiting for the next videos on the Python and PyTorch implementations of BN. Thank you..!
@MLForNerds
A year ago
Thank you 👍
@sandeepanand3834
10 months ago
Please add: 1. forward/backward prop through dropout, max pool, residual.... 2. forward/backward prop when we use Adam instead of SGD. Please!
@sridharchandrasekar7787
5 months ago
Sir, I have one doubt: what does "batch" refer to here? E.g., in a CNN, we have a matrix of 255 x 255. How do we calculate the mean and variance?
@MLForNerds
5 months ago
The batch is an additional dimension on top of 255x255 in your case. Suppose you have a grayscale image of 255x255. If there are 10 images in the batch, the dimension becomes 10x255x255. You calculate the mean and variance along the batch dimension for each pixel.
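The reply above can be sketched in NumPy. This is a minimal illustration of computing per-pixel statistics along the batch dimension, assuming a batch of 10 grayscale 255x255 images (the random values here are just stand-ins for real pixel data):

```python
import numpy as np

# Hypothetical mini-batch: 10 grayscale images of 255x255 (illustrative shapes).
batch = np.random.rand(10, 255, 255)

# As described above: mean and variance are taken along the batch dimension
# (axis 0), giving one statistic per pixel position.
mean = batch.mean(axis=0)  # shape (255, 255)
var = batch.var(axis=0)    # shape (255, 255)

# Normalize every image in the batch with these per-pixel statistics.
eps = 1e-5  # small constant for numerical stability
normalized = (batch - mean) / np.sqrt(var + eps)

print(mean.shape, var.shape, normalized.shape)
```

With a mini-batch size of 32, the same code applies with a `(32, 255, 255)` batch: only the size of axis 0 changes, and the statistics are still one value per pixel.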
@sridharchandrasekar7787
5 months ago
@@MLForNerds Does "batch" mean the mini-batch size? If I set the mini-batch size = 32, then how will it be calculated, sir?
@muthukamalan.m6316
A year ago
Please cover group norm and layer norm as well.
@MLForNerds
A year ago
I will cover them in the coming videos. Thank you.
Comments: 13