GGUF files can be run directly with Ollama; how do you convert other formats for use with Ollama?
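One common route is llama.cpp's conversion script for Hugging Face checkpoints. A minimal sketch, assuming a safetensors model directory at `./my-hf-model` (the script name and supported `--outtype` values can differ between llama.cpp versions, so check your checkout):

```shell
# Convert a Hugging Face model directory to GGUF with llama.cpp's converter.
# Paths and the q8_0 quantization choice here are illustrative placeholders.
git clone https://github.com/ggerganov/llama.cpp
pip install -r llama.cpp/requirements.txt
python llama.cpp/convert_hf_to_gguf.py ./my-hf-model \
    --outfile model.gguf \
    --outtype q8_0
```

The resulting `model.gguf` can then be referenced from a Modelfile's `FROM` line and built with `ollama create`.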
@albuquerqueroger 2 months ago
Congratulations, excellent content! Thanks for sharing.
@ph3ll3r 4 months ago
It would be interesting to see if you could use Open WebUI instead of "Chat UI using ChainLit, LangChain, Ollama and Gemma".
@mixp1x 4 months ago
Is there any way we can use EXL2 with Ollama?
@amiranvarov 6 months ago
Hey mate, very good video with a clear explanation. Do you mind sharing which terminal app you are using? It seems very convenient, with all that autocompletion and the hints.
@datasciencebasics 6 months ago
I am using Warp, but you can watch this video on making your terminal better: HOW To Make Your Mac Terminal Amazing kzbin.info/www/bejne/r5TEoYmNi9Fsbq8
@BiranchiNarayanNayak 7 months ago
Excellent tutorial !!
@datasciencebasics 7 months ago
Thanks !!
@SantK1208 7 months ago
Could you please use LlamaIndex with Hugging Face?
@zamanganji1262 2 months ago
I want to fine-tune Llama 3, but I need to create special_tokens_map.json as follows:
{
  "bos_token": { "content": "", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false },
  "eos_token": { "content": "", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false },
  "pad_token": { "content": "", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false }
}
How can I do this? Moreover, I want to run the model with Ollama so I can chat with it.
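A minimal sketch of writing such a file with Python's json module. The comment above leaves the `"content"` strings empty, so the Llama 3 token strings below are assumptions; substitute whatever your tokenizer actually uses:

```python
import json

# Token strings are assumed Llama 3 values, not taken from the comment above.
token_contents = {
    "bos_token": "<|begin_of_text|>",
    "eos_token": "<|end_of_text|>",
    "pad_token": "<|end_of_text|>",
}

# Build each entry with the flags shown in the comment's JSON.
special_tokens = {
    name: {
        "content": content,
        "lstrip": False,
        "normalized": False,
        "rstrip": False,
        "single_word": False,
    }
    for name, content in token_contents.items()
}

with open("special_tokens_map.json", "w") as f:
    json.dump(special_tokens, f, indent=2)
```

If the model was loaded through Transformers, `tokenizer.save_pretrained(output_dir)` also writes this file alongside the other tokenizer assets.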
@Pedro-mq7kt 6 months ago
Nice video. Can you make a video on changing the system prompt and temperature, if possible? Thank you.
@datasciencebasics 6 months ago
It's possible; just provide those parameters in the Modelfile.
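For instance, a Modelfile can set both in one place; this sketch assumes a `llama3` base model and an illustrative system prompt:

```
FROM llama3
PARAMETER temperature 0.3
SYSTEM You are a concise assistant that answers in short bullet points.
```

Rebuilding with `ollama create my-llama3 -f Modelfile` then bakes those settings into the model.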
@hackedbyBLAGH a month ago
Thank you
@datasciencebasics a month ago
You are welcome !!
@ScrantonStrangler19 6 months ago
Good explanation! Is there a list of model architectures that are supported by Ollama?
@datasciencebasics 6 months ago
I don't think there is a fixed list of architectures for Ollama; you can download models and use them with Ollama. Give it a try!
@121Gamerscom 6 months ago
How do I change the Chainlit logo to my own, please?
@SantK1208 7 months ago
Thanks for sharing it. Could you please create a video where we do Q&A with local documents using Ollama models?
@SantK1208 7 months ago
Kindly do it, it's really needed.
@datasciencebasics 7 months ago
Please check the other videos in this channel. Here you go -> Chat With Documents Using ChainLit, LangChain, Ollama & Mistral 🧠 kzbin.info/www/bejne/aHqvYYaaaNOYjcU
@aimademerich 7 months ago
Phenomenal
@terry-bn7bh 5 months ago
Good. How do you create the Modelfile on Windows? 😁
@datasciencebasics 5 months ago
It should work in the same way, give it a try!
@terry-bn7bh 5 months ago
How do you create the Modelfile on Windows?
@AnudeepKolluri 4 months ago
Just open VS Code and create a file called Modelfile (not sure about the capitalization) and insert the content into it. It doesn't need any extension.
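It can also be done from a terminal. A sketch, where the GGUF path and model name are placeholders (on Windows, the same commands work in PowerShell):

```shell
# Create a file literally named "Modelfile" (no extension) next to the weights.
echo "FROM ./model.gguf" > Modelfile

# Build a model from the Modelfile, then start a chat with it.
ollama create mymodel -f Modelfile
ollama run mymodel
```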
@adhyanmishra0197 6 months ago
Hello sir, can we deploy this model?
@datasciencebasics 6 months ago
Yes you can !!
@adhyanmishra0197 6 months ago
@@datasciencebasics Sir, if possible, please can you create a tutorial for deploying this model?
@datasciencebasics 6 months ago
Hello, deploying a model is use-case specific. It can be deployed locally, on various cloud services, etc. Please refer to the other videos on my channel for help.
@AlanCarrOnlineHypnosis 4 months ago
Far more hassle than it's worth. Just use any other app that handles plain GGUF files, like normal people.
@datasciencebasics 3 months ago
Thanks for the comment. There are many ways to do it, and it's just a preference which one to use 🙂
@IamalwaysOK 7 months ago
Excellent tutorial !!
@datasciencebasics 7 months ago
Glad you liked it!
@nat.amato-10 4 months ago
How can I edit the Modelfile so that it includes a context, a personality, or a precise way of answering questions?
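A persona and answering style can go in the Modelfile's SYSTEM instruction, and MESSAGE lines can seed example turns. A sketch, assuming a `llama3` base model and an illustrative persona:

```
FROM llama3
SYSTEM You are a friendly pirate who answers every question in character, briefly and accurately.
MESSAGE user Who are you?
MESSAGE assistant Arr, I be yer helpful pirate assistant!
```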