BERT is an open-source machine learning framework for natural language processing (NLP) developed by the Google AI team. It has led to state-of-the-art technologies that have made significant breakthroughs on common problems such as natural language inference, question answering, sentiment analysis, and text summarization.
I go through the basic theory, architecture, and implementation, and in no time you will be conversational in this brilliant architecture!
Feel free to support me! Do know that just viewing my content is plenty of support! 😍
☕Consider supporting me! ko-fi.com/spencerpao ☕
Watch Next?
Transformer → • Transformers EXPLAINED...
LSTM → • [LSTM] Applying and Un...
Sentiment Analysis → • NLP Sentiment Analysis...
🔗 My Links 🔗
BERT Notebook: github.com/SpencerPao/Natural...
Google's Notebook: colab.research.google.com/git...
Github: github.com/SpencerPao/spencer...
My Website: spencerpao.github.io/
📓 Requirements 🧐
Intermediate or advanced Python knowledge
Google Account
Google Paper on BERT → arxiv.org/abs/1810.04805
⌛ Timeline ⌛
0:00 - BERT Importance
1:05 - BERT Architecture
1:39 - Pre-training Phase MLM and NSP
5:25 - Fine-tuning
6:58 - BERT Code Implementation (CMD or Notebook)
9:51 - Create Tokenizer and Important Features
11:45 - Transforming Text to BERT input
12:38 - Fine Tuning Model, Testing, and Predictions
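The pre-training section (1:39) covers BERT's Masked Language Model objective. As a rough illustration (not code from the video or notebook), here is a minimal pure-Python sketch of the 80/10/10 masking rule from the BERT paper: 15% of tokens are selected, and of those, 80% become [MASK], 10% become a random token, and 10% are left unchanged. The token list and mini-vocabulary are made up for demonstration.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """Apply BERT-style MLM masking to a list of tokens (illustrative sketch)."""
    rng = random.Random(seed)
    masked = list(tokens)
    labels = [None] * len(tokens)  # prediction targets at masked positions only
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok            # model must predict the original token
            r = rng.random()
            if r < 0.8:
                masked[i] = "[MASK]"           # 80%: replace with [MASK]
            elif r < 0.9:
                masked[i] = rng.choice(vocab)  # 10%: replace with a random token
            # remaining 10%: keep the original token unchanged
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
vocab = ["cat", "runs", "blue", "tree"]
masked, labels = mask_tokens(tokens, vocab)
print(masked)
print(labels)
```

During pre-training, the model only incurs loss at the positions where `labels` is not `None`, which is why the unchanged-token case still matters: it forces the model to keep a good representation of every input token.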
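The tokenizer section (9:51 and 11:45) covers turning raw text into BERT inputs: `input_ids`, `token_type_ids` (segment IDs), and an `attention_mask`. The following toy sketch, with a hypothetical hand-built vocabulary instead of the real WordPiece tokenizer used in the notebook, shows the shape of those three sequences for a sentence pair.

```python
def encode_pair(text_a, text_b, vocab, max_len=12):
    """Build BERT-style inputs for a sentence pair (toy sketch, whitespace tokens)."""
    tokens = ["[CLS]"] + text_a.split() + ["[SEP]"] + text_b.split() + ["[SEP]"]
    # Segment IDs: 0 for sentence A (incl. [CLS] and first [SEP]), 1 for sentence B
    seg = [0] * (len(text_a.split()) + 2) + [1] * (len(text_b.split()) + 1)
    ids = [vocab.get(t, vocab["[UNK]"]) for t in tokens]
    mask = [1] * len(ids)              # 1 = real token, 0 = padding
    pad = max_len - len(ids)
    ids += [vocab["[PAD]"]] * pad
    seg += [0] * pad
    mask += [0] * pad
    return ids, seg, mask

# Hypothetical mini-vocabulary for illustration only
vocab = {"[PAD]": 0, "[UNK]": 1, "[CLS]": 2, "[SEP]": 3,
         "the": 4, "movie": 5, "was": 6, "great": 7, "i": 8, "agree": 9}
ids, seg, mask = encode_pair("the movie was great", "i agree", vocab)
print(ids)
print(seg)
print(mask)
```

The attention mask lets the model ignore padding, and the segment IDs are what the Next Sentence Prediction objective relies on to tell sentence A from sentence B.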
🏷️Tags🏷️:
Machine Learning, BERT, Bidirectional Encoder Representations from Transformers, Statistics, Jupyter notebook, python, Natural, language, processing, NLP, transformer, encoder, google, AI, google AI, tutorial,how to, code, machine, GPU, google colab, github, pretraining, fine tuning, sentiment, twitter, predictions,AUC, MLM, NSP, Masked Language Model, Next Sentence Prediction,
🔔Current Subs🔔:
3,033