Thank you. I asked about this yesterday on another YouTube channel.
@publicsectordirect982 1 year ago
The best video on the topic I have seen. Thank you! Subscribed
@d3mist0clesgee12 1 year ago
Great stuff, can't wait to run / install it.
@user-wr4yl7tx3w 1 year ago
Really good example.
@AlexanderGrap 1 year ago
Excellent! Love the code walkthrough and the UI implementation. Thank you
@datasciencebasics 1 year ago
You are welcome!
@aiPromptingcourse 1 year ago
Great job of explaining everything so thoroughly! I understand exactly what needs to be done. Thank you!
@datasciencebasics 1 year ago
You are welcome. Glad it was helpful!
@joxxen 1 year ago
Thank you. A small upgrade for the UI could be to include a simple section for ingesting data, or folder and file selection. It should be fairly easy to add.
@datasciencebasics 1 year ago
Good idea, you could contribute :)
@justinlegare 1 year ago
@@datasciencebasics Yeah, let's see it, joxxen, let's see it! I've got faith.
@hypercrack7440 1 year ago
How can we speed up the response time? Any ideas, community? Which model would you recommend?
@abunoormostofasuza9276 1 year ago
Very helpful
@abilashhp1115 1 year ago
Getting a response from the model takes a long time. Can you suggest a more robust model that responds with minimal latency?
@nickstaresinic9933 1 year ago
Thanks for this thorough introduction. It's very useful. The only problem I have is the same as in the video: the time between the prompt and the response is so long that it makes for a bad user experience. (I noted this mentioned in Issue #316 of the PrivateGPT repo.)
@datasciencebasics 1 year ago
Yep, hopefully there will be a solution soon.
@JD-hh2qb 1 year ago
Hello, what is the app you are using to make the mind maps? It looks so clean and easy compared to doing it by hand.
@datasciencebasics 1 year ago
Hi, I am using excalidraw.com/
@chenle02 1 year ago
Queries are too slow for it to be useful.
@datasciencebasics 1 year ago
Yep, it's slow, and it depends on the machine you are running it on. You might get more info from this updated video: kzbin.info/www/bejne/fF6VqJp7ncmSbpI
@user-wr4yl7tx3w 1 year ago
What can we do to check if the code is safe to download locally? I am still a novice.
@datasciencebasics 1 year ago
Good question. I don't have a direct answer for this, but if a repo has many stars and contributors, it can generally be considered safer.
@vandanasingh2249 1 year ago
When I enter a question, the process gets "Killed" at the end. Why?
@barkingchicken 1 year ago
I'm seeing the same issue.
@datasciencebasics 1 year ago
Can you provide detailed info about the error? I recommend removing the whole thing and starting from the beginning. Hope it helps.
@vandanasingh2249 1 year ago
@@datasciencebasics Yes, I removed everything and started fresh. Also, can you share the link for ggml-model-q4_0.bin? The link is not provided in the git repo.
@datasciencebasics 1 year ago
Hello, the GitHub repo has now changed and no longer takes this model; instead, it uses embeddings from sentence-transformers. You don't need to download it and place it under the models folder; it is downloaded automatically. Please also follow and read the steps in the git repo.
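For reference, the embedding model is configured via an environment variable rather than a manually downloaded file. This stdlib-only sketch mirrors how the repo reads it; the variable name and default model name match privateGPT's example.env at the time, but treat them as assumptions and check the current repo:

```python
import os

# privateGPT reads the embedding model name from .env / the environment;
# on first use, HuggingFaceEmbeddings(model_name=...) fetches the weights
# into the local cache automatically, so no manual models/ placement is needed.
model_name = os.environ.get("EMBEDDINGS_MODEL_NAME", "all-MiniLM-L6-v2")
print(model_name)
```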
@davidyadegarians6963 1 year ago
@@datasciencebasics It does not work; I tried it on Ubuntu 23.04.
@jafferaliumar 1 year ago
Thanks, bro, for your video. I did the same with Chainlit, but this UI has nice options for users to select parameters. To make responses faster, I increased the CPU thread count to a higher value while loading the model.
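The thread tweak above can be sketched against LangChain's GPT4All wrapper. A minimal sketch, assuming the wrapper's `n_threads` parameter and an illustrative model path; constructing the LLM itself (`GPT4All(**gpt4all_kwargs)`) would need the model file on disk:

```python
import os

# Leave one core free for the OS; os.cpu_count() may return None in odd environments
n_threads = max(1, (os.cpu_count() or 2) - 1)

# Keyword arguments for langchain.llms.GPT4All (model path is illustrative)
gpt4all_kwargs = {
    "model": "models/ggml-gpt4all-j-v1.3-groovy.bin",
    "n_threads": n_threads,
}
print(gpt4all_kwargs["n_threads"])
```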
@datasciencebasics 1 year ago
You're welcome. Thanks for sharing!
@SAINATHREDDYY-d4h 1 year ago
Bro, I always get server_error_msg = "**NETWORK ERROR DUE TO HIGH TRAFFIC. PLEASE REGENERATE OR REFRESH THIS PAGE.**" when trying to give a prompt at localhost. Please resolve the issue, bro... ASAP.
@LauraRodriguez-e4w5c 2 days ago
I am getting an error when I try to run the command pip3 install -r requirements.txt; it's as if there is no requirements.txt file. Do you know if something recently changed in this process?
@LynchTee 1 year ago
Great video! I have tried this; the response time for each query was about a few minutes. Is that a reasonable response time for a local privateGPT? The CPU and RAM consumption is high, yet my built-in graphics card has less than 5% utilization. Is that normal?
@G0BYE0MAN 1 year ago
Thank you. How can the performance of the GPT4All model be improved?
@datasciencebasics 1 year ago
You can use other GPT4All models if that helps. Keep an eye on the GitHub repo; there could be an update, as it's open source and people want to contribute.
@SAVONASOTTERRANEASEGRETA 1 year ago
It would be nice if you could use it with a web UI console. I don't like the terminal style.
@aketo8082 1 year ago
Thank you. Is it possible to train the model to recognize the difference between four people in a story who have the same first name? If yes, how can I train it? And how can I train the model to recognize locations/location names? Thank you for further information. Is that possible with an LLM, or is another module for "intelligence" necessary?
@datasciencebasics 1 year ago
Hello, in the LLM field and chatting with documents, there is no need to train a model. It just does retrieval over the stored embeddings, and the LLM is used to provide the answer. You can give it a try and see if it can handle those things. Also, to extract people's names and location names, you can use Named Entity Recognition models. For NER, you can use spaCy.
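As a minimal illustration of NER with spaCy: the sketch below uses a rule-based EntityRuler on a blank pipeline so it runs without downloading a trained model; in practice you would load a pretrained pipeline such as en_core_web_sm, which labels people and places statistically. The names and patterns are made up for illustration:

```python
import spacy

# Blank English pipeline with a rule-based entity ruler (no model download needed)
nlp = spacy.blank("en")
ruler = nlp.add_pipe("entity_ruler")
ruler.add_patterns([
    {"label": "PERSON", "pattern": "Alice"},
    {"label": "GPE", "pattern": "Paris"},
])

doc = nlp("Alice moved to Paris last year.")
entities = [(ent.text, ent.label_) for ent in doc.ents]
print(entities)  # [('Alice', 'PERSON'), ('Paris', 'GPE')]
```

With a pretrained pipeline, `nlp("...")` would tag unseen names too, which is closer to what distinguishing four same-named characters would require.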
@aketo8082 1 year ago
@@datasciencebasics Maybe I was unclear. I have a short story that names a man with first and last name. Sometimes the GPT model answers the last name correctly when I ask with the first name; sometimes not. Then the man marries, and the story gives only the first name of the woman. Asking for her last name is not possible, because it is not mentioned in the story. I have another story with four people with the same first name. The GPT models are not able to recognize the difference between the four people. That's why I ask where the "intelligence" to recognize that connection is, or how to train it. So at the moment those models are not usable for me, because they are not able to answer correctly.
@ピペットマン 1 year ago
In the end the system sends your data to GPT, doesn't it? If so, can we ignore the risk of data leakage in the process?
@datasciencebasics 1 year ago
We need to send data to the model to get the answer, but that model is present locally, not hosted and provided by a third party. Hence no data leakage.
@RossTang 1 year ago
May I ask why many modules are not found even though I can see them in pip list, such as langchain, sentence_transformer, etc.? I'm using Python 3.10.9 in a conda env. I used pip3.10 install --upgrade xxxx to resolve some of them, but some remain.
@datasciencebasics 1 year ago
That's strange. Can you try installing the packages inside a virtual environment other than conda, i.e. python venv or virtualenv?
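The conda-free route the reply suggests looks like this (a sketch; the directory name `.venv` is arbitrary, and the install step assumes you run it from the repo root containing requirements.txt):

```shell
# Create a plain Python virtual environment (the name ".venv" is arbitrary)
python3 -m venv .venv

# Activate it; subsequent pip/python commands now use the isolated environment
source .venv/bin/activate

# Install the project's dependencies into it (guarded so the sketch runs anywhere)
[ -f requirements.txt ] && pip install -r requirements.txt || true
```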
@eeshanchanpura3356 1 year ago
What should we do if we want to run it on other LLMs like FLAN-T5, MPT, etc., instead of downloading that model?
@datasciencebasics 1 year ago
Hello, try replacing the model with the one you want, or use the HuggingFaceHub API to use other models.
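Swapping the model in privateGPT is done through its `.env` file. The variable names below follow the repo's example.env at the time; the model path is illustrative, so check the current repo before relying on them:

```shell
# .env -- variable names as in privateGPT's example.env (check the repo for current ones)
# MODEL_TYPE may be GPT4All or LlamaCpp; MODEL_PATH points at the downloaded weights
MODEL_TYPE=GPT4All
MODEL_PATH=models/ggml-gpt4all-j-v1.3-groovy.bin
EMBEDDINGS_MODEL_NAME=all-MiniLM-L6-v2
MODEL_N_CTX=1000
```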
@sugamverma2894 1 year ago
Can we train it on our custom data in case we are not getting good accuracy?
@datasciencebasics 1 year ago
Not using this repo, as it is not intended for that purpose. And I don't recommend fine-tuning LLMs; they work out of the box by just loading your data and retrieving the answer. What you can do is try different open-source models out there and go with the one that is best for you.
@sugamverma2894 1 year ago
Okay, thanks.
@galdakaMusic 1 year ago
What GPU? Price?
@yourEverydayEngineer 1 year ago
This UI code is not working, because the embeddings for privateGPT got changed from LlamaCpp to HuggingFace. Is there any updated UI code?
@datasciencebasics 1 year ago
Hi, if you look at the pull requests in the GitHub repo, someone has provided an updated version. You could try that.
@RogueIntellectuality 1 year ago
Dear sir, I have gone through the tutorial. As I am going to launch a startup in India, this will help me out. Can you suggest how I can integrate it on my Wix website, as a separate page where my students can communicate with our data? Because I am a non-developer, please suggest it in the simplest manner.
@toannguyenngoc8209 1 year ago
Now that the OpenAI API is not free, how can I use another model?
@datasciencebasics 1 year ago
Hello, you can use Huggingface models, which are free, but they might not be the same quality as OpenAI models or as easy to use.
@zakimdh1070 1 year ago
Does it work like the online GPT? I mean, does it reply to coding questions?
@datasciencebasics 1 year ago
Hello, this is for your personal documents, so it's not like the online GPT.
@zakimdh1070 1 year ago
@@datasciencebasics So it just gives you data from your input documents; it won't give you coding solutions if you add a library or something. That all means it just reads the answers from your input.
@muratalarcin8515 1 year ago
Hello, with the guidance from Beebom, I was able to create an AI chatbot using my PDF files. However, I also want to use this chatbot with my SQL files. How can I achieve that?
@datasciencebasics 1 year ago
Hi, you could use a suitable document loader from LangChain for the SQL file, or convert the SQL files to CSV and use CSVLoader.
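The convert-to-CSV route can be sketched with the standard library alone: execute the SQL dump in an in-memory SQLite database, then write the table out as CSV. The table, columns, and file names below are made up for illustration; the resulting file could then be ingested via LangChain's CSVLoader:

```python
import csv
import sqlite3

# Illustrative SQL dump (in practice, read the contents of your .sql file)
sql_script = """
CREATE TABLE students (name TEXT, grade INTEGER);
INSERT INTO students VALUES ('Alice', 90), ('Bob', 85);
"""

# Run the dump in a throwaway in-memory database
conn = sqlite3.connect(":memory:")
conn.executescript(sql_script)

# Dump the table to CSV, header row first
cur = conn.execute("SELECT * FROM students")
with open("students.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])
    writer.writerows(cur)
```

The CSV can then be dropped into privateGPT's ingest folder (source_documents/) and picked up by the ingest script like any other supported file.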
@shaalal 1 year ago
Thanks for a great video and for putting in so much effort. I tried this, passing in a custom document, but it did not respond with any correct answer. A few times it gave me the message "I don't know the answer." Can I try some other LLM model that is trained on a larger dataset and can give better results? If you could shed some light on that...
@datasciencebasics 1 year ago
Hello, I haven't tried it myself, but you can try downloading other GPT4All-J models and check the performance, as mentioned in the GitHub repo.
@seshakiran 1 year ago
Oh... wow... my link to the GitHub repo was removed. Why?? lol :)
@ahmedalshammari9350 1 year ago
What are the changes you made to make it work? The prompt does not give me the link!
@datasciencebasics 1 year ago
The code is in the description of the video.
@ahmedalshammari9350 1 year ago
@@datasciencebasics Thank you, sir. Does the change only include the parser in the demo?
@Alkotas25 1 year ago
Thanks for the content, very useful! Can you please do a video about Claude AI's 100K-token context window capability? It looks very impressive!
@datasciencebasics 1 year ago
Thanks, will cover that this weekend.
@euyo6216 1 year ago
Top
@datasciencebasics 1 year ago
🔝 :)
@anilkutuwo8939 1 year ago
I am on Windows 11 and got the following error: "note: This error originates from a subprocess, and is likely not a problem with pip. ERROR: Failed building wheel for hnswlib. Failed to build hnswlib. ERROR: Could not build wheels for hnswlib, which is required to install pyproject.toml-based projects." How can I solve this?
@datasciencebasics 1 year ago
Hi, unfortunately I don't have a Windows machine to test on. You can install WSL on Windows to run such things in a Linux environment if needed.
@Adel2utube 1 year ago
100% slow
@datasciencebasics 1 year ago
Hi, compared to OpenAI it's definitely slow. Hopefully there will be improvements. Keep an eye on the GitHub repo.
@wvagner284 1 year ago
Thanks for sharing your knowledge! Great content! I just could not find the file ggml-model-q4_0.bin. It seems it's no longer available in the referenced git repo. Do you have any alternative instructions? Regards from Brazil!
@datasciencebasics 1 year ago
Hello, the GitHub repo has now changed and no longer takes this model; instead, it uses embeddings from sentence-transformers. You don't need to download it and place it under the models folder; it is downloaded automatically. Please also follow and read the steps in the git repo.
@gabrielkasonde367 1 year ago
I'm using GitHub Codespaces to follow the tutorial. All went well until I tried entering the prompt, when it said "gpt_tokenize: unknown token '�' Killed". I wanted to complete your tutorial and went ahead to the Gradio UI implementation, which went well until it was time to enter a prompt, when it said "Something went wrong. Connection errored out." Any help on this would be highly appreciated, thank you.
@datasciencebasics 1 year ago
I haven't tested in GitHub Codespaces, so I can't comment more on it. Please ask in the GitHub repo's issues so others might help you.