Let's do a deep dive into the Transformer Neural Network Architecture for language translation.
ABOUT ME
⭕ Subscribe: kzitem.info...
📚 Medium Blog: / dataemporium
💻 Github: github.com/ajhalthor
👔 LinkedIn: / ajay-halthor-477974bb
RESOURCES
[1 🔎] Transformer Architecture Image: github.com/ajhalthor/Transfor...
[2 🔎] draw.io version of the image for clarity: github.com/ajhalthor/Transfor...
PLAYLISTS FROM MY CHANNEL
⭕ Transformers from scratch playlist: • Self Attention in Tran...
⭕ ChatGPT Playlist of all other videos: • ChatGPT
⭕ Transformer Neural Networks: • Natural Language Proce...
⭕ Convolutional Neural Networks: • Convolution Neural Net...
⭕ The Math You Should Know: • The Math You Should Know
⭕ Probability Theory for Machine Learning: • Probability Theory for...
⭕ Coding Machine Learning: • Code Machine Learning
MATH COURSES (7 day free trial)
📕 Mathematics for Machine Learning: imp.i384100.net/MathML
📕 Calculus: imp.i384100.net/Calculus
📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
📕 Linear Algebra: imp.i384100.net/LinearAlgebra
📕 Probability: imp.i384100.net/Probability
OTHER RELATED COURSES (7 day free trial)
📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
📕 Python for Everybody: imp.i384100.net/python
📕 MLOps Course: imp.i384100.net/MLOps
📕 Natural Language Processing (NLP): imp.i384100.net/NLP
📕 Machine Learning in Production: imp.i384100.net/MLProduction
📕 Data Science Specialization: imp.i384100.net/DataScience
📕 Tensorflow: imp.i384100.net/Tensorflow
TIMESTAMPS
0:00 Introduction
1:38 Transformer at a high level
4:15 Why Batch Data? Why Fixed Length Sequence?
6:13 Embeddings
7:00 Positional Encodings
7:58 Query, Key and Value vectors
9:19 Masked Multi Head Self Attention
14:46 Residual Connections
15:50 Layer Normalization
17:57 Decoder
20:12 Masked Multi Head Cross Attention
22:47
24:03 Tokenization & Generating the next translated word
26:00 Transformer Inference Example
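The chapters at 7:58 (Query, Key and Value vectors) and 9:19 (Masked Multi Head Self Attention) cover the core computation of the Transformer. As a companion to the video, here is a minimal NumPy sketch of scaled dot-product attention with a causal mask; the function name and the random toy inputs are illustrative, not from the video.

```python
import numpy as np

def masked_self_attention(Q, K, V):
    """Scaled dot-product attention with a causal mask:
    each position may only attend to itself and earlier positions."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (seq, seq) similarity scores
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -np.inf                            # hide future positions
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                # weighted sum of value vectors

# Toy example: a sequence of 4 tokens with model dimension 8
seq_len, d_model = 4, 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((seq_len, d_model))
K = rng.standard_normal((seq_len, d_model))
V = rng.standard_normal((seq_len, d_model))
out = masked_self_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Because of the mask, the first position can only attend to itself, so its output equals its own value vector; in a full multi-head layer this computation runs in parallel over several learned Q/K/V projections.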
The Complete Guide to Transformer Neural Networks!