L19.5.2.6 BART: Combining Bidirectional and Auto-Regressive Transformers

4,719 views

Sebastian Raschka

1 day ago

Slides: sebastianraschka.com/pdf/lect...
0:00 Introduction
0:33 BART: Combining Bidirectional and Auto-Regressive Transformers
2:14 BART: BERT Encoder + GPT Decoder - Noise Transformations
4:39 Noise Transformations in BART for Pre-Training on Unlabeled Data
6:19 BART Performance Under Different Noise Transformations
7:04 Fine-Tuning on Labeled Data
8:21 BART Performance for Discriminative Tasks
9:26 BART Performance for Generative Tasks
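The noise-transformation chapters above cover how BART corrupts unlabeled text during pre-training (e.g., masking or deleting tokens) so the decoder learns to reconstruct the original sequence. As a rough illustration, two of these corruptions can be sketched in plain Python — the function names and the "<mask>" token string here are illustrative choices, not taken from the video or the original BART implementation:

```python
import random

def token_masking(tokens, p, rng):
    """Replace each token with "<mask>" with probability p."""
    return ["<mask>" if rng.random() < p else t for t in tokens]

def token_deletion(tokens, p, rng):
    """Delete each token with probability p; unlike masking, the
    model must also infer *where* tokens are missing."""
    return [t for t in tokens if rng.random() >= p]

rng = random.Random(0)  # fixed seed for reproducibility
tokens = "the quick brown fox jumps over the lazy dog".split()
print(token_masking(tokens, 0.3, rng))
print(token_deletion(tokens, 0.3, rng))
```

In the full BART setup these corrupted sequences feed the bidirectional encoder, and the autoregressive decoder is trained to emit the uncorrupted text.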
-------
This video is part of my Introduction to Deep Learning course.
Next video: • L19.5.2.7: Closing Wor...
The complete playlist: • Intro to Deep Learning...
A handy overview page with links to the materials: sebastianraschka.com/blog/202...
-------
If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka

Comments: 4
@siruili7924 (1 year ago)
The best tutorial for BART
@kartikrustagi (1 year ago)
Is there a video where you explain what exactly an encoder is and what a decoder is? Sorry, I am moving between videos and have not gone through the complete playlist.
@erenjaeger5907 (9 months ago)
You mentioned that BART is no longer the state of the art; do you have ideas about what you would now consider state of the art for text summarization?
@anshumansinha5874 (15 days ago)
Hmm, yah