Lecture 10: Neural Machine Translation and Models with Attention

90,720 views

Stanford University School of Engineering

Stanford University School of Engineering

Күн бұрын

Lecture 10 introduces translation, machine translation, and neural machine translation. Google's new NMT system is highlighted, followed by sequence-to-sequence models with attention and sequence model decoders.
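To make the lecture's central idea concrete, here is a minimal sketch of dot-product attention in plain Python. This is an illustration of the general mechanism covered in the lecture, not code from the course: at each decoding step, every encoder hidden state is scored against the current decoder state, the scores are normalized with a softmax, and the resulting weights form a context vector as a weighted sum of the encoder states.

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(decoder_state, encoder_states):
    # Score each encoder hidden state by its dot product with the decoder state.
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    # Normalize scores into an attention distribution over source positions.
    weights = softmax(scores)
    # Context vector = attention-weighted sum of the encoder hidden states.
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(len(decoder_state))]
    return context, weights

# Toy example: three encoder states of dimension 2, one decoder state.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 0.0]
ctx, w = attention(dec, enc)
```

In a real NMT system the score is often a learned bilinear or MLP function rather than a raw dot product, and everything is batched over matrices, but the flow of score, normalize, and weighted sum is the same.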
-------------------------------------------------------------------------------
Natural Language Processing with Deep Learning
Instructors:
- Chris Manning
- Richard Socher
Natural language processing (NLP) deals with the key artificial intelligence technology of understanding complex human language communication. This lecture series provides a thorough introduction to the cutting-edge research in deep learning applied to NLP, an approach that has recently obtained very high performance across many different NLP tasks including question answering and machine translation. It emphasizes how to implement, train, debug, visualize, and design neural network models, covering the main technologies of word vectors, feed-forward models, recurrent neural networks, recursive neural networks, convolutional neural networks, and recent models involving a memory component.
For additional learning opportunities please visit:
stanfordonline.stanford.edu/
