The best team right there! Absolutely love the CrewAI framework + Groq LPU speed + the open-source Llama 3.1 405B model! The main limiting factors are the stringent rate limits and the very small input and output token limits. Let's do what ChatGPT did, with 60,000 tokens in and out per call! That would be truly game changing and would put open source above OpenAI's GPT-4o once and for all! (Greater token and rate limits, in and out, are what will allow agentic workflows to truly work as intended for complex processes and architectures. Right now CrewAI is great for simple tasks, but for longer, multi-step, complex processes it will need large token limits and, ideally, far less restrictive rate limits!) As always, great work team! 😎 🚀 👏🏻👏🏻
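(A minimal sketch of one way to cope with the current free-tier limits: retry with exponential backoff whenever the API reports a rate-limit error. This assumes the `groq` Python SDK's OpenAI-style client and its `RateLimitError`; the model ID and retry numbers are just placeholders.)

```python
import time

from groq import Groq, RateLimitError

client = Groq()  # reads GROQ_API_KEY from the environment


def chat_with_backoff(messages, model="llama-3.1-70b-versatile", max_retries=5):
    """Call Groq chat completions, retrying with exponential backoff on rate limits."""
    delay = 2.0
    for attempt in range(max_retries):
        try:
            response = client.chat.completions.create(model=model, messages=messages)
            return response.choices[0].message.content
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the last attempt
            time.sleep(delay)
            delay *= 2  # double the wait before the next attempt


if __name__ == "__main__":
    print(chat_with_backoff([{"role": "user", "content": "Say hello in five words."}]))
```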
@prithviraj1080 · 5 months ago
Please link the notebooks in the description.
@kirilkirchev285 · 5 months ago
Groq is super cool! Good documentation and straightforward UX. I am using the API on a daily basis as a VS Code and IntelliJ provider for Continue. Insanely fast. I'm considering using it as the main provider in the SaaS app I am currently building. Success to both Groq and CrewAI!
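(For anyone wanting to do the same: Groq exposes an OpenAI-compatible endpoint, so any OpenAI-style client, or a tool like Continue that accepts a custom OpenAI-compatible provider, can be pointed at it. A minimal sketch, assuming the `openai` Python package and the `https://api.groq.com/openai/v1` base URL; the model ID is just an example.)

```python
import os

from openai import OpenAI

# Point a standard OpenAI-style client at Groq's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)

response = client.chat.completions.create(
    model="llama-3.1-70b-versatile",  # example Groq model ID
    messages=[{"role": "user", "content": "In one sentence, what is an LPU?"}],
)
print(response.choices[0].message.content)
```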
@hussainshaik4390 · 5 months ago
How are you dealing with the free-tier rate limit?
@SamiSabirIdrissi · 5 months ago
Awesome demo! I'm waiting for the link to the Jupyter notebook. Please share it! Thank you!
@johellis1 · 5 months ago
Yes, a link would be great!
@qzwwzt · 3 months ago
Sure!! Please share the notebook.
@ai-whisperer · 5 months ago
This is awesome! Can we please get a link to the notebook?
@mooktakim · 5 months ago
Nothing has been linked
@i2c_jason · 5 months ago
Can you convince me to use CrewAI instead of LangGraph with API calls to LLMs and expert systems? I'm concerned about being too limited with CrewAI. I'm working on an iterative RAG workflow, a sort of reasoner for an engineering application. Thoughts?
@Cheese-and-Garlic · 5 months ago
Where, oh where, might we find the link to this impressive notebook file, @GroqInc? Forever grateful to you and the team!
@sharadpatel107 · 5 months ago
The problem I have with the CrewAI and Groq combo is the timeout limits.
@HistorIAsImposibles776AC · 5 months ago
Amazing!!!
@pranjalmittal · 5 months ago
Groq has incredible inference performance, but I have run into issues with it reliably outputting JSON. Which model/LLM on Groq do you recommend for reliable JSON-formatted outputs? For multi-step reasoning/chains/agent graphs, I normally have each step output JSON that feeds into the next step. For the whole chain/graph not to break, each step's LLM call via the Groq API needs to reliably output JSON.
@pranjalmittal · 5 months ago
Actually, I figured out that using Mixtral ("mixtral-8x7b-32768") as the model with Groq produces reliable JSON outputs. I was using Llama earlier. But as a user, I want to be able to use the latest models, such as Llama 3.1, and get those to output JSON reliably as well.
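(If it helps: the Groq API also offers an OpenAI-style JSON mode via `response_format`, which constrains the completion to valid JSON. A minimal sketch, assuming the `groq` Python SDK and that JSON mode is available for the chosen model; the model ID and keys in the prompt are just illustrative.)

```python
import json

from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

response = client.chat.completions.create(
    model="llama-3.1-70b-versatile",  # illustrative model ID
    messages=[
        {
            "role": "system",
            # Spelling out the expected keys (and mentioning JSON explicitly)
            # helps the model stay on schema.
            "content": "Reply only in JSON with the keys 'answer' and 'reason'.",
        },
        {"role": "user", "content": "Is water wet?"},
    ],
    response_format={"type": "json_object"},  # ask the API for valid JSON only
)

step_output = json.loads(response.choices[0].message.content)
print(step_output["answer"], step_output["reason"])
```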
@paulmiller591 · 4 months ago
Don't forget to add the Jupyter Notebook link!
@julianomoraisbarbosa · 5 months ago
# nice
@gerhardengel844 · 5 months ago
So many problems with tool calling.
@aiplaygrounds · 5 months ago
The best multi-agent framework is Agency Swarm. Why would you make a presentation with hallucinations to promote the product? Haha
@rakeshm3063 · 4 months ago
The notebooks are not published, so it's not worth watching.
@dievas_ · 3 months ago
Why do you need to use some bloated pile of crap framework though?