Chapters for each section of the video (preprocessing, model build, prediction) are in the video timeline.
Transformers have been described as the fourth pillar of deep learning [1], alongside the three big neural net architectures of CNNs, RNNs, and MLPs.
However, from the perspective of natural language processing, transformers are much more than that. Since their introduction in 2017, they've come to dominate the majority of NLP benchmarks - and they continue to impress daily.
What I'm saying is, transformers are damn cool. And with libraries like HuggingFace's transformers, it has become almost too easy to build incredible solutions with them.
So, what's not to love? Incredible performance paired with the ultimate ease-of-use.
In this video, we'll work through building a multi-class classification model using transformers - from start to finish.
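To give a flavour of what that looks like, here's a minimal sketch of a BERT classifier built with HuggingFace and TensorFlow. The checkpoint name, sequence length, and number of classes are my own assumptions for illustration - the exact values used in the video may differ:

# Minimal sketch of a BERT multi-class classifier (checkpoint, SEQ_LEN and
# NUM_CLASSES are assumptions, not necessarily what the video uses).
import tensorflow as tf
from transformers import TFAutoModel

SEQ_LEN = 512      # assumed maximum sequence length
NUM_CLASSES = 5    # assumed number of output classes

bert = TFAutoModel.from_pretrained('bert-base-cased')  # pretrained BERT body

# two inputs, both produced by the tokenizer: token IDs and attention mask
input_ids = tf.keras.layers.Input(shape=(SEQ_LEN,), name='input_ids', dtype='int32')
mask = tf.keras.layers.Input(shape=(SEQ_LEN,), name='attention_mask', dtype='int32')

# pooled [CLS] representation -> dense head -> softmax over the classes
pooled = bert(input_ids, attention_mask=mask).pooler_output
x = tf.keras.layers.Dense(1024, activation='relu')(pooled)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation='softmax', name='outputs')(x)

model = tf.keras.Model(inputs=[input_ids, mask], outputs=outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss='categorical_crossentropy',
              metrics=['accuracy'])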
🤖 70% Discount on the NLP With Transformers in Python course:
bit.ly/3DFvvY5
Medium article:
towardsdatascience.com/multi-...
Free access:
towardsdatascience.com/multi-...
Link to Kaggle video:
• How-to use the Kaggle ...
[1] Fourth Pillar of AI:
ark-invest.com/articles/analy...
00:00 Intro
01:21 Pulling Data
01:47 Preprocessing
14:33 Data Input Pipeline
24:14 Defining Model
33:29 Model Training
35:36 Saving and Loading Models
37:37 Making Predictions
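And for a sense of what the Preprocessing and Data Input Pipeline chapters cover, here's a rough sketch of tokenizing a dataset and wrapping it in tf.data - the file name, column names, and batch size are assumptions for illustration, not necessarily what the video uses:

# Rough sketch of preprocessing + tf.data pipeline (file name, column names
# and batch size are hypothetical).
import numpy as np
import pandas as pd
import tensorflow as tf
from transformers import BertTokenizer

SEQ_LEN = 512
tokenizer = BertTokenizer.from_pretrained('bert-base-cased')

df = pd.read_csv('train.tsv', sep='\t')  # hypothetical training file

# tokenize into fixed-length token-ID and attention-mask arrays
tokens = tokenizer(df['text'].tolist(), max_length=SEQ_LEN,
                   truncation=True, padding='max_length',
                   return_tensors='np')

# one-hot encode the integer labels
labels = np.zeros((len(df), df['label'].max() + 1))
labels[np.arange(len(df)), df['label'].values] = 1

# build a shuffled, batched tf.data pipeline in the (inputs, labels)
# format that Keras' model.fit expects
dataset = tf.data.Dataset.from_tensor_slices((
    {'input_ids': tokens['input_ids'], 'attention_mask': tokens['attention_mask']},
    labels,
))
dataset = dataset.shuffle(10_000).batch(16)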