Transformer Model (1/2): Strip the RNN, Keep Attention

54,126 views

Shusen Wang

Next lecture: • Transformer模型(2/2): 从A...
English version: • Transformer Model (1/2...
The Transformer is currently the best approach to machine translation and other NLP problems, a substantial improvement over RNNs. This lecture and the next cover the Transformer model. This lecture strips away the RNN, keeps Attention, and designs the Attention layer and the Self-Attention layer. The next lecture uses these two layers, together with fully-connected layers, to build a deep neural network: the Transformer. Rather than dissecting every component of the Transformer as other videos and blog posts do, my approach is to design a Transformer model from scratch. I hope you will follow my line of reasoning and solve this problem with me: how do we build a deep neural network based purely on Attention that can handle every task RNNs are good at?
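The Self-Attention layer discussed in the lecture can be sketched as scaled dot-product self-attention: every position attends to every other position, with no recurrence. The NumPy code below is a minimal illustration under standard assumptions; the projection matrices `Wq`, `Wk`, `Wv` and the scaling by the square root of the key dimension follow the common formulation and are not necessarily the exact notation used in the video.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_in). Project the same inputs into queries, keys, values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) alignment scores
    weights = softmax(scores, axis=-1)   # each row is a distribution over positions
    return weights @ V                   # context vectors, one per position

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                               # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))  # illustrative random weights
C = self_attention(X, Wq, Wk, Wv)
print(C.shape)  # → (4, 8)
```

Because every output position is computed from all input positions in one matrix product, the layer sees the whole sequence at once, which is exactly the property that lets it replace the RNN.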
Slides: github.com/wan...
Transformer course videos:
1. Attention layer: • Transformer模型(1/2): 剥离...
2. Transformer network: • Transformer模型(2/2): 从A...
3. BERT: • BERT (预训练Transformer模型)
RNN course videos:
1. Data processing: • RNN模型与NLP应用(1/9):数据处理基础
2. Word embeddings: • RNN模型与NLP应用(2/9):文本处理与词嵌入
3. Simple RNN: • RNN模型与NLP应用(3/9):Simpl...
4. LSTM: • RNN模型与NLP应用(4/9):LSTM模型
5. RNN improvements: • RNN模型与NLP应用(5/9):多层RNN...
6. Text generation: • RNN模型与NLP应用(6/9):Text ...
7. Machine translation: • RNN模型与NLP应用(7/9):机器翻译与...
8. Attention: • RNN模型与NLP应用(8/9):Atten...
9. Self-Attention: • RNN模型与NLP应用(9/9):Self-...
