As I learn and improve my video quality, entertainment value, and teaching quality, it's taking me longer and longer to write, create, and edit videos, so my schedule is in flux, sorry! I'll try to post as much as possible, but once every two weeks might be the best I can do for now. Receptive fields + number of layers will be covered in the next video!

In convolutional neural networks, the activation function is applied after the dot product is computed. Typically the activation function is ReLU, which increases nonlinearity while being significantly faster to compute than functions like sigmoid or tanh.

A witch, a ghost, and a pirate are walking into a bar. The ghost is telling a joke. "A witch, a ghost, and a pirate are walking into a bar. The ghost is telling a..." "Whoa whoa whoa!" the witch says. "I don't want to hear this joke; this sounds pretty involved." "You could even say it's convolved!" the ghost says, roaring with laughter. "That doesn't even make any sense," the pirate says. "You're just replacing the 'in' with 'con' like that's supposed to mean anything!" "Yet it still made me laugh!" the ghost says. Happy Halloween! ▶️ More videos: www.patreon.com/intuitiveml
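To make the "activation after the dot product" point concrete, here is a minimal NumPy sketch of one convolutional step followed by ReLU. The image and kernel values are made up for illustration, and the loop is a naive "valid" convolution rather than anything a real framework would use:

```python
import numpy as np

def relu(x):
    # ReLU keeps positive values and zeroes out negatives
    return np.maximum(0, x)

# 5x5 single-channel "image" and a 3x3 filter (illustrative values)
image = np.arange(25, dtype=float).reshape(5, 5) - 12
kernel = np.array([[1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0]])

# Valid convolution: slide the kernel and take a dot product at each position
out = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        patch = image[i:i + 3, j:j + 3]
        out[i, j] = np.sum(patch * kernel)  # the dot product

activated = relu(out)  # the activation is applied only after the dot product
```

The nonlinearity matters because stacking purely linear convolutions would collapse into a single linear operation; ReLU between layers is what lets the network represent more than a linear map, and it is cheap since it is just a per-element max with zero.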
@mrme3608
3 years ago
This is awesome. So concise yet so simple to understand.
@IntuitiveML
3 years ago
Thank you very much mrMe! It's a constantly evolving process for me.
@minchevk
3 years ago
Fantastic content, keep it up!
@IntuitiveML
3 years ago
Thanks, will do!
@doctorshadow2482
A year ago
Thanks. Could you please cover how all this works under shift, rotation, and scaling of the image?
Comments: 6