Maaaaaan.... You have THE BEST content, HANDS DOWN, for Gen AI Development. Clear, concise, every step explained, context.... Context is key... Bravo! And thanks a lot for this, it's inspiring.
@daveebbelaar · 4 months ago
Wow, thanks!
@mksmurff · 16 days ago
Very clear, and a real-world example too. Thank you.
@IdPreferNot1 · 4 months ago
Such great content. I was going to gist this, and then I saw that's even how you're sharing it! I wanted a use case for the Instructor library since it looked interesting, but wasn't sure what it added beyond Pydantic... and here it is. Thanks!
@HerroEverynyan · 4 months ago
So cool that you make such great content, with clear explanations, and are so transparent.
@daveebbelaar · 4 months ago
I appreciate that!
@farhanafridi8694 · 4 months ago
Great! Would love to see more of these.
@nexuslux · 4 months ago
Excellent video. Can you go into a bit more detail on how a database for this type of information might look and operate, or what kind of automation would be involved? You mentioned sentiment and doing analytics.
@mamadou-diandjalo6723 · 4 months ago
This is exactly what I needed! Thanks!!
@jeromedupourque6067 · 4 months ago
Congratulations this is just perfect!
@volt8399 · 4 months ago
You did an amazing job, thank you so much for sharing this.
@batigol_91 · 2 days ago
I tried following your script and installed it with pip install -U instructor, but I keep getting "No module named 'instructor'". Have you faced this kind of error? Any thoughts?
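A hypothetical troubleshooting sketch, not a confirmed diagnosis of this case: that error usually means pip installed the package into a different Python environment than the one actually running the script. One quick check:

```python
# Print the interpreter running this script, then install Instructor with that
# exact interpreter so pip and the script share the same environment.
import sys

print(sys.executable)

# In a terminal, using the path printed above:
#   <that-python-path> -m pip install -U instructor
```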
@lesptitsoiseaux · 4 months ago
You have 50,000 class transcripts and need to build a recommendation engine. Best approach?
@MuhammadFaizanMumtaz3 · 4 months ago
Great, man, totally awesome.
@sumitbindra · 4 months ago
Loved the content. What are the advantages of using this instead of function calling?
@daveebbelaar · 4 months ago
@sumitbindra It streamlines prompt engineering, means less code, and adds automatic retries.
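For context on the reply above, a minimal sketch of that pattern, assuming a recent Instructor and OpenAI SDK; the model name and schema fields are made up for illustration:

```python
# Illustrative sketch: Instructor wraps the OpenAI client so the response is
# validated straight into a Pydantic model, with automatic retries on failure.
import instructor
from openai import OpenAI
from pydantic import BaseModel


class TicketReply(BaseModel):  # hypothetical output schema
    category: str
    reply: str


client = instructor.from_openai(OpenAI())

result = client.chat.completions.create(
    model="gpt-4o-mini",         # placeholder model name
    response_model=TicketReply,  # Instructor enforces this schema
    max_retries=2,               # re-asks the model if validation fails
    messages=[{"role": "user", "content": "Hi, my order never arrived."}],
)
print(result.category)
```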
@sumitbindra · 4 months ago
@daveebbelaar Makes sense. Thank you.
@guilhermeveiga9345 · 4 months ago
Good tip, thanks!
@synergyai · 4 months ago
How do you deal with the objections to sending this 'sensitive' data to OpenAI? We are doing a project now where we have to clean the data before sending it to OpenAI, which is a big challenge. Curious to hear other people's thoughts on this...
@daveebbelaar · 4 months ago
We use Azure OpenAI. Clients are generally okay with that in our experience.
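On the data-cleaning point above, a rough sketch of one possible pre-send scrubbing step; this is an assumption about how it could be done, not the project's actual pipeline, and the regexes only catch obvious emails and phone numbers:

```python
# Assumed approach (illustrative only): mask obvious identifiers before the
# text ever reaches an external API.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")


def scrub(text: str) -> str:
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text


print(scrub("Reach me at jane.doe@example.com or +1 (555) 010-2030."))
# -> Reach me at [EMAIL] or [PHONE].
```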
@erenyeager655 · 4 months ago
Combined it with FastAPI to turn it into an endpoint and call it from the frontend side... oooof, faster development for machine learning web systems.
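A rough sketch of that setup, with made-up route and schema names; the extraction call is the same Instructor pattern discussed above, and FastAPI serializes the validated Pydantic model straight to JSON for the frontend:

```python
# Illustrative sketch: expose structured extraction as an HTTP endpoint.
import instructor
from fastapi import FastAPI
from openai import OpenAI
from pydantic import BaseModel

app = FastAPI()
client = instructor.from_openai(OpenAI())


class TicketRequest(BaseModel):   # hypothetical request body
    text: str


class TicketAnalysis(BaseModel):  # hypothetical response schema
    category: str
    sentiment: str


@app.post("/analyze", response_model=TicketAnalysis)
def analyze(req: TicketRequest) -> TicketAnalysis:
    # The validated model is returned as JSON to the frontend caller.
    return client.chat.completions.create(
        model="gpt-4o-mini",           # placeholder model name
        response_model=TicketAnalysis,
        messages=[{"role": "user", "content": req.text}],
    )
```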
@jppalmab · 4 months ago
Gold
@AbdulBasit-ff6tq · 4 months ago
Why don't you just use the JSON response from OpenAI directly?
@daveebbelaar · 4 months ago
This unifies your data structures without relying on prompt engineering. You still have to provide a JSON schema when using the JSON response with OpenAI, and there is also no automated retry mechanism if it fails to load your Pydantic model afterwards. Overall, this streamlines the development experience, especially if you're working with multiple developers who might all have slightly different prompting styles for JSON. Instructor uses the JSON response and Function Calling under the hood.
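For comparison, a rough sketch of the do-it-yourself route being described, assuming OpenAI's standard json_object response format; the schema-in-prompt wording, parsing, validation, and retry loop below are the parts Instructor handles for you:

```python
# Assumed contrast example: JSON mode still needs the schema spelled out in the
# prompt, plus manual parsing, Pydantic validation, and a hand-rolled retry loop.
import json

from openai import OpenAI
from pydantic import BaseModel, ValidationError


class TicketAnalysis(BaseModel):  # hypothetical schema
    category: str
    sentiment: str


client = OpenAI()
prompt = (
    "Classify the ticket below. Respond in JSON only, using exactly these keys: "
    '{"category": string, "sentiment": string}.\n\n'
    "Ticket: Hi, my order never arrived."
)

analysis = None
for attempt in range(3):  # manual retry loop
    raw = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        response_format={"type": "json_object"},
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content
    try:
        analysis = TicketAnalysis.model_validate(json.loads(raw))
        break
    except (json.JSONDecodeError, ValidationError):
        continue  # in practice you would feed the error back into the next attempt
```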
@AbdulBasit-ff6tq · 4 months ago
@daveebbelaar How good or bad is this solution compared to other alternatives like the LangChain and LlamaIndex output parsers?