Next sentence prediction (NSP) is one of the two tasks used to train the BERT model (the other being masked language modeling, MLM).
Although NSP (and MLM) are used to pre-train BERT models, we can apply these same methods to further pre-train our own models so they better understand the specific style of language in our use cases.
So, in this video, we'll cover exactly how we take an unstructured body of text, and use it to pre-train a BERT model using NSP.
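The first step of that process is turning an unstructured body of text into labeled sentence pairs: roughly half where sentence B really follows sentence A ("IsNext"), and half where B is a random sentence ("NotNext"). Below is a minimal sketch of that pairing step in plain Python; the function name, the naive split on periods, and the 50/50 sampling are illustrative assumptions, not the exact code from the video, though the 0 = IsNext / 1 = NotNext label convention matches BERT's NSP head.

```python
import random

def make_nsp_pairs(text, seed=0):
    """Build (sentence_a, sentence_b, label) NSP training pairs from raw text.

    label 0 = IsNext (B actually follows A), 1 = NotNext (B is random),
    following the label convention used by BERT's NSP classification head.
    """
    rng = random.Random(seed)
    # Naive sentence split on periods -- fine for a sketch, not production.
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    pairs = []
    for i in range(len(sentences) - 1):
        if rng.random() < 0.5:
            # True consecutive pair
            pairs.append((sentences[i], sentences[i + 1], 0))
        else:
            # Random (possibly non-consecutive) second sentence
            pairs.append((sentences[i], rng.choice(sentences), 1))
    return pairs

pairs = make_nsp_pairs("The cat sat. It purred. Then it slept. The end.")
```

These pairs can then be tokenized with `[CLS] A [SEP] B [SEP]` and fed to a BERT NSP head, as covered in the video.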
Meditations data:
github.com/jam...
Jupyter Notebook:
github.com/jam...
🤖 70% Discount on the NLP With Transformers in Python course:
bit.ly/3DFvvY5
📙 Medium article:
towardsdatasci...
🎉 Sign-up For New Articles Every Week on Medium!
/ membership
📖 If membership is too expensive - here's a free link:
towardsdatasci...
🕹️ Free AI-Powered Code Refactoring with Sourcery:
sourcery.ai/?u...