This video explains the major Transformer architectures and differentiates between the important Transformer models.
It explains, in a simplified manner, which Transformer architecture to use for a given problem in Natural Language Understanding (NLU) and Natural Language Generation (NLG).
Over the past six years, the Transformer neural network architecture has completely transformed state-of-the-art natural language processing and the way we approach NLU and NLG problems.
Chapters:
0:00 Introduction
1:21 Encoder Branch
1:57 BERT
2:37 DistilBERT
3:19 RoBERTa
3:59 XLM
4:50 XLM-RoBERTa
5:32 ALBERT
6:40 ELECTRA
7:19 DeBERTa
8:13 Decoder Branch
8:50 GPT
9:13 CTRL
9:54 GPT-2
10:31 GPT-3
11:30 GPT-Neo/GPT-J-6B
11:50 Encoder-Decoder Branch
12:00 T5
13:05 BART
13:46 M2M-100
14:22 BigBird
#datascience #neuralnetwork #machinelearning #naturallanguageprocessing