Thank you so much. I was having an issue connecting CrewAI with Llama, but your template helped me.
@none-hr6zh 10 days ago
Thank you so much. Why is there a need for an API if I am using local LLMs? Suppose I want to see the tokenization and detokenization, or any other model-related information: how do I see that model file? I am using `ollama pull llama` and then using it in CrewAI, but the response is coming from the API. Please give a hint on how to find the local LLM files.
@TylerReedAI 10 days ago
Hmm, is your response still coming from OpenAI? If you set your OpenAI API key to something like sk-1111, does it still work? You don't need a real API key, just a placeholder, so really any string works.
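(Editor's note, not from the thread: a minimal sketch of the placeholder-key idea. It assumes Ollama's default port 11434 and its OpenAI-compatible `/v1` endpoint; the environment variable names are the ones an OpenAI-compatible client such as CrewAI's conventionally reads.)

```python
import os

# A local Ollama server never validates the API key, but OpenAI-compatible
# clients insist that one is set, so any placeholder string works.
os.environ["OPENAI_API_KEY"] = "sk-1111"                      # dummy value, never checked
os.environ["OPENAI_API_BASE"] = "http://localhost:11434/v1"   # Ollama's OpenAI-compatible endpoint
os.environ["OPENAI_MODEL_NAME"] = "llama3"                    # whatever you pulled with `ollama pull`
```

With these set, requests go to the local server instead of api.openai.com, which is why the key's value does not matter.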
@none-hr6zh 10 days ago
@@TylerReedAI Thank you for the reply. My question is how to get at the details of local LLMs (how they tokenize, encode, and decode). If I have to change any method inside the model, I must have access to the model file. I am using Ollama(model="llama", base_url=...); it uses a REST API to send data to the LLM and receive the response. I am not able to find out how the message is preprocessed and encoded before it is given to the LLM.
@none-hr6zh 9 days ago
I have one doubt about LangChain: `from langchain.llms import Ollama` then `llm = Ollama(model='llama3')`. The response goes through a REST API using a POST method. Since it is localhost, can we access the server-side code? I want to see internally how my input goes to the Llama model, and all the internal details of the model, in .py itself.
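(Editor's note, not from the thread: a rough sketch of what the LangChain Ollama wrapper sends over the wire, assuming Ollama's documented `/api/generate` endpoint on the default port. The key point for the question above is that tokenization and encoding happen inside the Ollama server process (compiled llama.cpp code, with the tokenizer embedded in the GGUF model file), so there is no .py file on your machine that performs them; the Python side only builds and posts JSON.)

```python
import json
from urllib import request

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body that gets POSTed to Ollama's /api/generate."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3",
             base_url: str = "http://localhost:11434") -> str:
    """Send the prompt to a locally running Ollama server and return the text."""
    body = json.dumps(build_payload(model, prompt)).encode()
    req = request.Request(f"{base_url}/api/generate", data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:           # requires `ollama serve` running
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Why is the sky blue?"))
```

Running the `generate` call needs a live Ollama server; `build_payload` alone shows everything the client side contributes before the model takes over.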
@DihelsonMendonca 3 months ago
Hello, please make a review of Open WebUI. It's the great hype currently. Open WebUI is a frontend for LLMs which lets users talk hands-free with excellent voices, using internal or external TTS APIs from ElevenLabs, Groq, etc. It has web search, RAG, and long-term memory; it's compatible with the OpenAI API; it can import GGUF models directly from Hugging Face and convert them for use with Ollama; it can fine-tune models for special needs; it can work with multimodal models, image and video; it can engage multiple models simultaneously, local and external at the same time; and it can accept new plugins, external tools, and functions. And it's also open source. 🎉❤
@jarrod752 2 months ago
_Paid Nothing..._ Electricity: _Am I a joke to you?!_
@TylerReedAI 2 months ago
😄😂
@m.c.4458 16 days ago
I only use Ollama now, and I don't use CrewAI but MoA, and build the logic for it myself. I am done with frameworks... too many APIs and biased data.
@DihelsonMendonca 3 months ago
Now we can with GPT-4o mini. The prices are 90% cheaper! ❤
@TylerReedAI 3 months ago
Amazing!
@ade7456 3 months ago
Great! Can I create multiple agents? E.g. one to code, one to test, one to correct code, etc.?
@NateGinn-u9m 3 months ago
Now do this with Groq
@mobilesales4696 2 months ago
Can you write a script where we can add an unlimited number of AIs that use an API system, with separate functions where we can store our offline LLMs (like all the Ollama versions, or any AI system), and store those offline LLMs on GitHub so we can use them wherever we want by running a simple script? 😅😊
@themax2go 2 months ago
Would you still recommend this approach (CrewAI) over AutoGen Studio (kzbin.info/www/bejne/f5vUcoCNiq5jqJo)?