1:45 A Better Aligned Pre-training Objective
2:53 PEGASUS Architecture and Gap Sentences Generation pre-training
4:45 Inspiration from T5
6:22 HugeNews pre-training dataset
7:15 Downstream Abstractive Summarization Datasets
9:15 Ablations of Gap Sentences Generation
14:23 Results
16:17 Low-Resource Summarization
17:10 Human Evaluation and Examples of Generated Summaries
19:01 PEGASUS vs. BART
20:00 WikiSum Dataset
21:25 Comparison with Ideas in GPT-3
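The Gap Sentences Generation (GSG) objective covered in the chapters above can be sketched in a few lines: "principal" sentences are scored against the rest of the document and replaced by a mask token, and the model is trained to generate them, which mimics summarization. This is a simplified illustration, not the paper's implementation — it uses unigram-overlap F1 as a stand-in for the ROUGE1-F1 scoring PEGASUS uses, and `make_gsg_example`, `gap_ratio`, and the `[MASK1]` string are illustrative choices.

```python
# Simplified sketch of PEGASUS-style Gap Sentences Generation (GSG).
# Sentences are scored by unigram-overlap F1 against the rest of the
# document (a rough proxy for the ROUGE1-F1 selection in the paper),
# the top-scoring ones are replaced with [MASK1] in the input, and the
# masked sentences concatenated become the generation target.

def unigram_f1(sentence_words, rest_words):
    """F1 of word overlap between one sentence and the remaining text."""
    overlap = len(set(sentence_words) & set(rest_words))
    if overlap == 0:
        return 0.0
    precision = overlap / len(set(sentence_words))
    recall = overlap / len(set(rest_words))
    return 2 * precision * recall / (precision + recall)

def make_gsg_example(sentences, gap_ratio=0.3):
    """Return (masked source text, target text) for one document."""
    scored = []
    for i, s in enumerate(sentences):
        words = s.lower().split()
        rest = [w for j, other in enumerate(sentences) if j != i
                for w in other.lower().split()]
        scored.append((unigram_f1(words, rest), i))
    # Mask roughly gap_ratio of the sentences (at least one).
    n_gaps = max(1, int(len(sentences) * gap_ratio))
    gap_ids = {i for _, i in sorted(scored, reverse=True)[:n_gaps]}
    source = " ".join("[MASK1]" if i in gap_ids else s
                      for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(gap_ids))
    return source, target
```

For example, on a three-sentence document the sentence sharing the most vocabulary with the others is masked out and becomes the target, while the other two remain as context.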
@lightningblade9347 4 years ago
Finally! A technical deep learning channel which doesn't have its last video 8 months ago. Keep up the good work!
@connor-shorten 4 years ago
Haha, thank you so much! I also recommend checking out Yannic Kilcher's videos!
@juanmanuelcirotorres6155 3 years ago
I need exactly this video for a presentation in my master's program, thanks a lot
@juanmanuelcirotorres6155 3 years ago
I need to present this paper in a class, and finding a video with an explanation by you is just... awesome
@connor-shorten 3 years ago
Awesome, really glad to hear it!
@PeterOtt 4 years ago
Perfect timing, I had just heard about the model and was interested in trying it out for myself. Similar to PEGASUS, your videos are pre-training for me, in a sense, before I actually go and read the paper and try out the technique. (someone said this pre-training part before, not my original meme)
@connor-shorten 4 years ago
Haha! I use Yannic's videos in the same way!
@erikacardenas2964 4 years ago
Great video!
@Schematical 4 years ago
Interesting stuff. Keep up the great work.
@connor-shorten 4 years ago
Thank you!
@aqibfayyaz1619 3 years ago
Awesome Explanation
@pepe_reeze9320 4 years ago
What's your opinion on adapting the pretrained PEGASUS model to a different domain (e.g. German)?
@manumaneesh9543 4 years ago
It was very helpful, keep up the good work 😊
@mehermanoj45 4 years ago
Yay!
@karimfayed4517 3 years ago
I'm confused about whether PEGASUS deals with noise in the text or not. Otherwise, great WORK!!
@TechVizTheDataScienceGuy 3 years ago
Is it still SOTA? Also, it looks like an extension of the T5 objective, from span masking to sentence masking, aside from the prefix context.