With LLMs, what's the future of spaCy? It currently does NER with CNNs; is it going to switch to LLMs and prompt engineering?
@python-programming 1 year ago
This is a great question. I see spaCy remaining highly relevant for the foreseeable future. spaCy already has built-in components to work directly with LLMs via APIs for OpenAI, HuggingFace, etc. You can use them via spacy-llm (video coming soon!). They also have great recipes for their annotation software Prodigy. LLMs are really good for certain NER tasks; I have experimented with this extensively for months. They struggle in a few areas, namely consistency and cost. It is far cheaper, more consistent, and more reliable to use an LLM to assist with annotation and then train a smaller model that can run locally. In the next year or so, I imagine we will see this change, especially as open-source LLMs become smaller and more affordable.
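As a minimal sketch of that last point, here is roughly what training a small, locally runnable spaCy NER model looks like. The tiny training set below is purely illustrative; in practice the annotations would come from LLM-assisted labeling (e.g. via Prodigy recipes), and you would train on far more data:

```python
import spacy
from spacy.training import Example

# Illustrative stand-in for LLM-assisted annotations:
# (text, {"entities": [(start_char, end_char, label)]})
TRAIN_DATA = [
    ("Ada Lovelace wrote the first program.", {"entities": [(0, 12, "PERSON")]}),
    ("Alan Turing worked at Bletchley Park.", {"entities": [(0, 11, "PERSON")]}),
]

# Build a blank English pipeline with only an NER component.
nlp = spacy.blank("en")
ner = nlp.add_pipe("ner")
ner.add_label("PERSON")

# Initialize weights, then run a few update passes over the data.
optimizer = nlp.initialize()
for _ in range(30):
    for text, annotations in TRAIN_DATA:
        example = Example.from_dict(nlp.make_doc(text), annotations)
        nlp.update([example], sgd=optimizer)

# The resulting model runs entirely locally, with no API calls.
doc = nlp("Ada Lovelace wrote the first program.")
print([(ent.text, ent.label_) for ent in doc.ents])
```

Once trained, `nlp.to_disk("my_ner_model")` saves the model so it can be loaded anywhere with `spacy.load`, with no further LLM involvement.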
@halibrahim 1 year ago
What about data privacy when you share your documents with these services? Is there any solution for that?
@python-programming 1 year ago
This is a very good question. One solution is to use open-source models and host them locally. With this approach, you would not rely on any API, and your data would remain local. A good place to find models is HuggingFace. This is something I intend to cover in this series on RAG.
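As a hedged sketch of that approach, spacy-llm can be pointed at a locally hosted open-source model so that no text ever leaves your machine. A config fragment along these lines (the model and labels here are just illustrative choices) swaps an API-backed model for Databricks' Dolly pulled from HuggingFace:

```ini
[components.llm]
factory = "llm"

[components.llm.task]
@llm_tasks = "spacy.NER.v2"
labels = PERSON,ORG,LOC

[components.llm.model]
@llm_models = "spacy.Dolly.v1"
name = "dolly-v2-3b"
```

The model weights are downloaded once from HuggingFace and then run entirely on your own hardware, so inference involves no external API calls.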
@ArunKumar-bp5lo 1 year ago
Great, short, precise idea. Thanks!
@python-programming 1 year ago
Thanks so much! Glad you liked the video!
@JH-jy1ye 8 months ago
Really good explanation, thank you
@gerardorafaelquirossandova3115 11 months ago
This was a very precise and easy-to-watch explanation video. Thanks!
@alexdavis9324 1 year ago
Great video! This was very interesting. I am going to try to add this to my workflow at some point.
@python-programming 1 year ago
Glad to hear it! My video next week will show you how to get started.