Google Colab Notebook: colab.research.google.com/drive/1JOzbVzrm8_GJAmuh2Qcjsxf5Rg0yK3AG?usp=sharing Event Slides: www.canva.com/design/DAGCrgbYdc0/Q56HqhQAp_-163pJNm_fmA/view?DAGCrgbYdc0&
@prasad_yt 5 months ago
I like the way you simplify and explain — starting with the big picture and then breaking it down into the details. ❤
@roopad8742 7 months ago
Really good session! Always looking forward to your slide decks and easy explanations!
@sitedev 7 months ago
Awesome as usual! I’m doing all of my development using Flowise and all of this information is useful and mostly transferable to Flowise. Thanks.
@techgiantt 7 months ago
Can you guys showcase the function-calling feature with smaller models? Some of us are trying to build locally. Anyway, great work 👍🏻
@AI-Makerspace 7 months ago
We'll see when we can slide this in!
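[Editor's note] For readers who want to try this locally in the meantime, here is a minimal sketch of the OpenAI-style tool schema that many local runtimes (e.g. llama.cpp or Ollama servers exposing an OpenAI-compatible endpoint) accept for function calling, plus parsing of the model's tool-call reply. The tool name `get_weather` and the reply shape are illustrative assumptions, not from the video:

```python
import json

# Hypothetical tool definition in the OpenAI-style "tools" format that
# OpenAI-compatible local servers commonly accept.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # illustrative tool name
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def parse_tool_call(message: dict) -> tuple[str, dict]:
    """Extract the function name and decoded arguments from a tool-call reply.

    Assumes the OpenAI-compatible reply shape: the model returns the
    arguments as a JSON-encoded string, so we decode them here.
    """
    call = message["tool_calls"][0]["function"]
    return call["name"], json.loads(call["arguments"])

# A reply shaped like what an OpenAI-compatible server would return:
reply = {"tool_calls": [{"function": {"name": "get_weather",
                                      "arguments": '{"city": "Berlin"}'}}]}
name, args = parse_tool_call(reply)
print(name, args)  # get_weather {'city': 'Berlin'}
```

Smaller local models often emit malformed argument JSON, so in practice the `json.loads` call is worth wrapping in error handling with a retry prompt.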
@tyessenov 7 months ago
Great video, thanks! Is it possible to include an image in an LLM response — text from RAG plus an image as base64 sent to the chat app?
@AI-Makerspace 7 months ago
It is possible, yes! You just need to handle the image pipeline yourself — and LlamaIndex has pipelines for this built in!
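[Editor's note] Independent of which framework handles retrieval, the image leg of the pipeline reduces to base64-encoding the bytes and shipping them alongside the RAG text. A minimal stdlib-only sketch; the JSON payload shape and `data:` URI convention are illustrative assumptions about what a chat frontend would render, not a fixed API:

```python
import base64
import json

def attach_image(answer_text: str, image_bytes: bytes,
                 mime: str = "image/png") -> str:
    """Bundle a RAG answer and a base64 image into one JSON chat payload.

    The "text" / "image" keys and the data-URI format are an assumed
    contract with the frontend; adapt them to your chat app's schema.
    """
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return json.dumps({
        "text": answer_text,
        "image": f"data:{mime};base64,{b64}",  # browsers render this in <img src>
    })

payload = attach_image("Here is the chart you asked about.", b"\x89PNG fake bytes")
```

On the way back, `json.loads` plus `base64.b64decode` recovers the original bytes, so the same shape works for images the model receives as input.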
@vijaybrock 7 months ago
Hello Sir, can we add query transformations to the whole pipeline above? My goal is to build a RAG pipeline that chats with multiple 10-K reports, using a domain-specific LLM for embeddings and query transformations; of course, I also want to leverage the AutoRetrieveingTool. Can you suggest an approach?