@TimCarambat I had to pause the video just to leave a comment! I'm deeply impressed by the excellence and simplicity of the content presented here. It's truly remarkable to have access to such tools, created by a team that clearly demonstrates passion and a keen ear for what we all think and wish would be great to have, and that, at every update, distills all of these wishes into a few simple clicks within this amazing piece of technology! I'm immensely grateful for the opportunity to experience the brilliance of software engineering and development behind AnythingLLM, especially within the context of open-source communities. Participating in the advancement of genuine and incredible open tools is a privilege. Thank you, Tim! I will be promoting this project to the moon and back, because it deserves to be known.
@TimCarambat · 6 months ago
This is so incredibly kind. Sharing with team!
@camdizzlethe1st · 6 months ago
haha I was just about to leave a comment when I read yours. I feel the same. What a champion Tim is. I do not know if I will ever install AnythingLLM but I think I will donate to Tim regardless.
@ts757arse · 6 months ago
Aye, I was interested in anythingLLM a while back but chose another project for my inference server. I've found getting half decent agent capabilities to be a huge time sink for someone with my skill set (I'm a physical security guy, not a programmer) and the results just weren't worth the time invested. Even basic agent capabilities with RAG, memory and so on in a package that I can just plug into ollama sounds awesome. Prepping the server now. Here's hoping.
@trsd8640 · 4 months ago
AnythingLLM NEEDS to get more attention, because it's simply great! I can't wait to see custom agents in AnythingLLM! Well done!
@jimg8296 · 5 months ago
Been using it for the past few months and it's my go-to app for local RAG. Adding agents is a huge plus. Looking forward to being able to add my own AutoGen agents to the list with their own special tools. Thanks for the great work, Tim.
@gordonwoo8127 · 5 months ago
Thank you very much. As soon as I saw that RAG was built in and it was simple to use, I immediately started finding readme PDFs on various topics to ensure I could use this tool as efficiently as possible. After my targeted PDFs are found, I plan on grabbing data from how-to pages and wikis.
@bro_truth · 5 months ago
Bro, this has to be the most comprehensive, simple, engaging, and all-around entertaining video on AI I've ever watched. Your presentation, explanations, and expert-level knowledge base are all 'S' tier! Bra-freakin'-vo! Subscriber well earned and deserved! 🏆👏🏽👏🏽
@liviuspinu11 · 6 months ago
Thank you for explaining quantisation in detail for newbies.
@Biskern · 5 months ago
You are a gift from god, Tim Carambat. Thank you for your continuous efforts to make these technologies available to the rest of us. If this were 10,000 years ago, YOU'd be the guy teaching the rest of us how to make fire with flint and tinder. May you and your loved ones be blessed for eternity!
@TimCarambat · 5 months ago
This is very high praise. I appreciate the kind comment!
@SiliconSouthShow · 6 months ago
@TimCarambat I'm excited to see the features you talked about working with Ollama like in the video for the agent. As of now, it's the same as before I updated, but it's exciting to think about the future.
@surfkid1111 · 6 months ago
You built an amazing piece of software. Thank god that I stumbled across this video.
@fxstation1329 · 6 months ago
What I love about your tutorials is that you succinctly explain all the things that come up during the tutorial. Thanks!
@SiliconSouthShow · 6 months ago
Fantastic, Tim! Mine doesn't have the agent config; guess I need to delete and update. I'll try that. Looks great! Keep up the good work, I love AnythingLLM, I really do!
@yusufaliyu9759 · 6 months ago
Great, this will make LLMs more understandable for many people.
@the_potmo · 5 months ago
FYI, if you type ollama show you can see the context window of the model.
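For reference, here is a minimal sketch of that tip in Python. It just shells out to the Ollama CLI and prints any line of the ollama show output that mentions the context window. It assumes the ollama binary is on your PATH; the model tag is only an example, and the exact output wording varies between Ollama versions.

# Minimal sketch: read a model's context window via the Ollama CLI.
# Assumes `ollama` is installed and on PATH; the model tag below is an example.
import subprocess

def show_context_window(model: str = "llama3:8b-instruct-q8_0") -> None:
    result = subprocess.run(
        ["ollama", "show", model],
        capture_output=True,
        text=True,
        check=True,
    )
    # Print only the lines that mention the context window/length.
    for line in result.stdout.splitlines():
        if "context" in line.lower():
            print(line.strip())

if __name__ == "__main__":
    show_context_window()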
@JRo250 · a month ago
Screw giving a GitHub star (I did anyway). Tell me when you're going public so I can buy shares!! lol. Seriously, you have a winner here.
@TimCarambat · a month ago
@@JRo250 I'll be sure to preallocate shares for star givers. Note to SEC: this is a joke. Maybe
@yasin6904 · 6 months ago
I'm a chronic video skipper but watched this back to back. Great explanations and I can't wait to try this out! Would love to see more videos, tutorials or even lectures from you. You really have a knack for explaining things! 😊
@yasin6904 · 6 months ago
PS: I've starred it on GitHub!
@TimCarambat · 5 months ago
I really appreciate you saying this. I have gotten a comment or two prior saying I'm the worst at it. Can't please everyone! Glad you found it useful.
@TheDrMusician · 6 months ago
This is by far the easiest and most powerful way to use LLMs locally, full support, like and sub. And many thanks for the amazing work, especially being open source.
@TimCarambat · 6 months ago
🫡
@flusyrom · 5 months ago
Funny! I heard about AnythingLLM for the first time yesterday during an AI info event... and discarded the idea of giving it more attention because it was presented as "just another local RAG support". And now I stumble across this video by chance - and the additional agent functionality changes everything! BTW, this feature is very well presented! My immediate idea & feedback: if there were ANY chance to model custom agents in Flowise and re-import the JSON exports of that Flowise flow as input for an AnythingLLM custom agent, you'd save yourself the trouble of designing your own agent editor AND would start with a comparably large installed base. OK, maybe that's just wishful thinking... but maybe I'm also not the only one with this wish to facilitate local agent building ;-)
@lostmusichits · 4 months ago
Excellent video! Thank you for explaining things plainly and quickly. Valuable.
@jimg8296 · 6 months ago
AnythingLLM is awesome. Glad to hear custom agents are on the roadmap; it's the big hole in capability. Also need a config option to change the agent prompt. I scan a lot of code, and the @ is used often to define decorators.
@jonathan58475 · 6 months ago
Tim, thank you for making the world a better place with this awesome tool! :)
@quinnlintott406 · 6 months ago
I had no idea you had a channel talking about your software. I'm a big fan of your work!
@jakeparker918 · 6 months ago
This is so dope. Great no-code solution and it's awesome that it's open source.
@nunomcarvalho · 12 days ago
Great work, congrats. I'm very impressed with your gift for communication and the impressive amount of work it took to develop this tool!!!
@coryrichter3680 · 5 months ago
Very cool to play with, look forward to seeing where the Agents go, nice work!
@mmar5896 · 3 months ago
Thanks for building AnythingLLM with these features. For many days I was searching for a better UI than a terminal and more features for Ollama models. And you have done it. Thank you very much.
@ilanlee3025 · 5 months ago
Good stuff, will try it out. Subscribed. Looking forward to seeing how this develops.
@amulbhatia-te9jl · 6 months ago
Would it be possible to see a video of setting up your Ollama models in AnythingLLM? I followed these instructions but my Ollama models never load.
@fluffsquirrel · 3 months ago
This is absolutely absurd. Thank you so much, this is an incredible project and I hope it gets more attention. I'm sold!
@vulcan4d · 6 months ago
This is awesome work. I looked at the other simple to install Windows front ends and stumbled on this. Pretty cool stuff and I love how you can add documents and external websites to feed it information. An offline LLM is soooooo much more preferred. The only item I don't understand is why you could just ask a regular question once you provided the document, but used @agent when asking to summarize a document.
@TimCarambat · 6 months ago
IMO, I find having a local LLM that is even **only** like 75% as good as an online alternative is just much more rewarding. Like, I can be on an airplane, open my laptop, and start brainstorming with an AI. Pretty neat. The next evolution would be a local AI on your phone, but I don't think we have that tech _yet_
@benjox7315 · 3 months ago
NNNooooo!!! Thank you!! Great tool! I have lots of process documents at work, and because of compliance and privacy issues we are not allowed to upload any documents onto the internet. This is a game changer!!!!
@figs3284 · 6 months ago
Incredible... gonna make building tools so much easier. Can't wait to see more agent abilities added!
@anwhelan · 2 months ago
Hey, nice to meet you. I've just come across this now and honestly, it couldn't come at a better time. I had already realised the potential of AnythingLLM and was looking at how to utilise it as I'm building some text-to-action agents, and your video just accelerated my path to my objective. I have subscribed and look forward to more of your posts. Nice work 👍🙏👍
@michaelklimpel3020 · 6 months ago
Big thanks, man. This video helps a lot for me as a beginner to understand how good a local LLM is and which use cases we have. Thumbs up for this great video.
@roshinidas9984 · 3 months ago
This is awesome! I was looking for frameworks similar to this, and now I see that this is way better than what we were looking for. Great job on this one!
@Whitehatxtsh · 3 months ago
Mind-blowingly simple, coherent, and full of information. I don't think I've come across anything similar in all respects on YouTube (I don't leave comments often, but this really, really is worth it). Good job, Tim Carambat!
@myronkoch · 4 months ago
I've been using this for months, and it's fantastic. Dude, thanks. Amazing work.
@TimCarambat · 4 months ago
@@myronkoch this is such a nice thing to hear. Thank you for your support!
@MaliciousCode-gw5tq · 6 months ago
Damn... finally found the tool that I've been looking for. MAN, you saved my day. I have been crazy stuck finding a web UI for my remote Ollama server. You're a gift from heaven, keep it up, you're helping a lot of people like us. Thank you so much! ❤❤❤😂😅😊😊
@AdarshKumar-sj5dn · a month ago
Amazing, the way you have explained a complex concept. Thank you
@tunoajohnson256 · 6 months ago
Awesome vid! Really impressed with how you presented the information. 🙏 thank you
@build.aiagents · 6 months ago
Would love to see this run Stable Diffusion and ComfyUI workflows
@RichSad45 · 4 months ago
Very strong video Tim. I'm going to give this all a try right now.
@jasonhorne6820 · 3 months ago
Amazing program, Tim. So easy to understand and use. You and the team have done a stellar job. Cheers
@Spot120 · 5 months ago
Yo, honestly it feels great when guys like you make your software completely free, and I also think you should keep a donation option. After seeing guys like you, I will make something great and make it completely free to use and open source. Again, thanks dude! ❤
@FlynnTheRedhead · 6 months ago
So training/finetuning is coming up as well? Loving the progress and process updates, keep up the great work Tim!
@TimCarambat · 6 months ago
How'd you know!? We will likely make some kind of external supplemental process for fine-tuning, but at least make the tuning process easy to integrate with AnythingLLM. RAG + fine-tune + agents = very powerful, without question
@FlynnTheRedhead · 6 months ago
@@TimCarambat That's awesome to hear!! I created an agent to get insider info, that's how I know of course!
@TimCarambat · 6 months ago
@@FlynnTheRedhead !!!!! I thought I was hearing clicks during my phone calls!!!
@OpenAITutor · 6 months ago
Amazing Tim. Keep up the good work.
@johnbramich · 5 months ago
Can't wait to use this. Thank you!
@MartinBlaha · 6 months ago
Thank you! Will test it for sure. I think you guys are on the exact right path 😎👍
@TokyoNeko8 · 6 months ago
Debug mode would be ideal. The agent to scrape the web just exits without any error, even though I do have a search engine API defined.
@rockon-wbfqlkjqhsydic72683 · 5 months ago
Great job! This is wonderful! I will be responding after using it to let you know my thoughts, if you care to see them :)
@spacetimepotato · 5 months ago
There were some concepts I didn't quite understand; for example, tunneling from the Windows PC to the Mac (if it's on your local network, why work with VPN protocols rather than client/server - due to needing a stateful connection vs. 200 response code or something?). But the interface itself is brilliant! And I think that when it becomes agent-swarm-capable it's going to be a much better option for me than Crew AI, as it feels more intuitive, I am just going to need multiple agents working together. I have never installed a local LLM, but you have inspired me to give it a try. Thanks!
@SamBeera · 5 months ago
Hi Tim, thank you very much for the great video showcasing open-source LLMs and tools like AnythingLLM to create agents. I followed your video and was successfully able to do everything in it. Have you made other agent videos for other use cases? I look forward to seeing them. Cheers
@jmherrera002 · 2 days ago
Great software, great video, a lot to learn from it, so way to go, man! Thanks for such a brilliant piece of AI.
@K600K300 · 3 months ago
I have loved AnythingLLM since the beginning.
@sashkovarha · 6 months ago
This explained the RAG and agents parts I couldn't set up. Great educational content for those who are not programmers. I appreciate your explanations being without much of the "pre-supposed" know-how that coders have - which most tutorials on YouTube assume... I still didn't get why there's a difference between @agent commands and just regular chat.
@TimCarambat · 6 months ago
In a perfect world, they are the same. AnythingLLM originally was only RAG. In the near future @agent won't be needed and agent commands will work seamlessly in the chat. So @agent is temporary for now, so you know for sure you want to possibly use some kind of tool for your prompt. Otherwise, it's just simple RAG.
@shannonbreaux8442 · 5 months ago
@Tim, do you know anything about Home Assistant, the home automation application? The reason I ask is they already have some integration with LLMs, but not with agents, and not specialized for Home Assistant automations. When you have time, check it out and see if it's possible to integrate this with Home Assistant; that would be great. Great job with the video!
@sharankumar31 · 6 months ago
This is seriously a very neat tool 👏👏👏 Please add a feature to custom-develop agents with function calls. It would be helpful for our local automations.
@TimCarambat · 6 months ago
It's shown in the UI that we will be supporting custom agents soon!
@tinkerman1790 · 3 months ago
Amazing tool, AnythingLLM!!! I love it so much! With my M1 Max MacBook Pro, it runs very smoothly locally. Starred AnythingLLM for sure!!! Keep up your great work and thanks a million for sharing.
@TimCarambat · 3 months ago
Love to hear this! Email us any feedback! team@mintplexlabs.com
@azkongs · 3 months ago
This is a very nice tool! I appreciate you doing this intro video personally.
@nangelov · 3 months ago
Thanks for developing AnythingLLM and for the tutorial! I did not know I can create agents that can go online!
@enriquebruzual1702 · 5 months ago
I love this tool; I already made several workspaces, each with its own LLM and RAG. This video was a good how-to with an explanation. I am a Python developer and I would like to create my own agents.
@marpla78 · 4 months ago
@TimCarambat this really caught my attention. Not a technical guy (product designer) here, but it was clear to understand, and I'll definitely give it a try since I'm building some related products. Mini feedback: the UI could use some "love", as my boss says, but the overall experience feels natural... not too consumer-facing, but natural enough. Would love to hear more from you and your team if there's a chance. Fabi, from Argentina 🇦🇷🤘
@madhudson1 · 6 months ago
Been struggling to get custom agents to integrate reliably with external tooling, using frameworks like CrewAI with local LLMs. Would love a video guide explaining best practices for this.
@jackiekerouac2090 · 5 months ago
@Tim: I am a professional translator (English to French), and I've just discovered AnythingLLM. Sometimes I have to translate confidential documents that cannot be shared on the cloud. They need to remain locally on my own computer. Once the translation is done, they have to be encrypted to be sent to clients. Could I use AnythingLLM to help me with the translation process? Could I use it with my current lexicon, glossaries and personal dictionaries? Most are PDF or DOCX files. How would I do that? What are the first steps? Many thanks if you can give me some hints on how to proceed. I'm now a new subscriber! 😊
@smudgepost · 5 months ago
Absolutely everything I'm looking for. Thank you!
@SiliconSouthShow · 6 months ago
@TimCarambat Hey Tim, it won't let me select anything under Workspace Agent LLM Provider even though everything is set up and working. Obviously Ollama is running, and everything else in AnythingLLM is using Ollama fine in the app, but this selection option doesn't show like yours does.
@vishalchouhan07 · 6 months ago
Hi Tim, I am absolutely impressed with the capabilities of AnythingLLM. Just a small query: how can I deploy it on a cloud machine and serve it as a chat agent on my website? I actually want to add a few learning resources as PDFs for the RAG documents of this LLM so that my users can chat with the content of those PDFs on my website. I also want to understand how many parallel instances of a similar scenario, but with different sets of PDFs, are possible. For instance, if I am selling ebooks as a digital product to my users, can I have unique instances auto-generated for each user based on their purchase?
@TimCarambat · 6 months ago
We offer a standalone Docker image that is a multi-user version of the desktop app. It has a public chat embed that is basically a publicly accessible workspace chat window. You can deploy it in a lot of places depending on what you want to accomplish: github.com/Mintplex-Labs/anything-llm?tab=readme-ov-file#-self-hosting For this, you could do one AnythingLLM instance, multiple workspaces where each has its own set of documents, and then a chat widget for each. This would give you the end result you are looking for.
@aimattant · 4 months ago
Looks great - will take a look after I have played around with LLaVA and Ollama.
@akikuro1725 · 6 months ago
Awesome! Thank you for this. Looking forward to more information/details/examples on using agents with AnythingLLM!
@incaseidie · 4 months ago
I saved this months ago. I was just starting out on a new project and realized you're from the future.
@thegristleking · 4 months ago
Super useful, thanks very much! I haven't been able to get the Google integration to work reliably yet; I'm not trusting what it pulls back when I ask for (what I think) are simple things, like the ingredients for a very specific product. Scraping that site and pulling in the page data fixes that. Again, very cool!
@star95 · 6 months ago
Great video! I also want to know how well the RAG function of AnythingLLM performs. It's important that text, images, and papers are handled properly and meaningful chunking is achieved.
@StunMuffin · 5 months ago
I really appreciate your time to explain a lot of things to us. ❤🎉
@CotisoHanganu · 6 months ago
Great things shown. Thanks for all the work and commitment. 🎉 Here is a kind of dedicated use case I am interested in getting access to: I am a mind-mapping addict. I use Mind Manager, which stores the mind maps in .mmap format. I would like to ask AnythingLLM to help me scan all folders for mind maps on different subjects and RAG & summarize them, without having to export all the .mmap files to another format. Is this doable at this stage? What else would I need to have or create?
@foxnyoki5727 · 6 months ago
Does internet search work for you? I configured the agent to use Google Custom Search Engine, but search does not return any results.
@TimCarambat · 6 months ago
With some models you _might_ have to word a prompt more directly - like even explicitly asking it to call `web-browsing` and run this search. Which I know breaks the "fluidity" of conversation, but this is just a facet of the non-deterministic, non-steerable nature of LLMs and trying to get them to listen. Mostly, it's the model that needs to be better so it can follow prompts more closely, but it's also not always that simple!
@scumdivision · 4 months ago
Awesome tool! Installed it today and I'm super hyped to have such a powerful tool running on my PC! I was wondering if it's possible to access the agent functions through the API? I couldn't find anything about it in the documentation.
@DavidEdmister · 5 months ago
Great demo but if I'm hearing this correctly from other comments - you cannot implement your own GUI interface on top of AnythingLLM to interact with on your own internal website/app? If that's correct I can just demo/tinker with the tool and not implement anything real in my company for internal use. Novice here but like the approach. I can't tell how much data you can train your own LLM on but will keep searching for info.
@LuisYax · 4 months ago
Tim, you are the man. Great work from you and your team on all of this great software. You are making the complex simple. One request, or a pointer in the right direction: are there any CLI tools to execute the agents and output the result to any of the workspaces? The GUI is great, but at some point all of that needs some automation. Anyway, keep up the great work.
@TimCarambat · 4 months ago
This is a great point. So we do have an API that you can use, but it currently does not support agents :( Guess I know what needs to be fixed now - you are not the first to voice this!
@urlanbarros · 4 months ago
@TimCarambat, I tried to upload a CSV file with a lot of financial movements (source, destination, day, price). But I think that I have to use graph neural networks. Should I leave AnythingLLM because of this particular problem, or can I use AnythingLLM with graph neural networks?
@RhythmRiftsDataDreams · 6 months ago
What is the chunking method you use to create the vectors? Is there a way for the user to control the method of chunking? Say: short, token size, semantic, long, etc.
@TimCarambat · 6 months ago
We currently use a static recursive chunk splitter, so basically just character counts. You can modify those chunking settings when you go to "Embedder Preference" in the settings, so you can define max length and overlap.
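To illustrate the idea (not AnythingLLM's actual code), here is a simplified, fixed-window version of character-count chunking with overlap in Python. The real recursive splitter also prefers to break on natural boundaries like paragraphs and sentences, and the max length and overlap numbers below are placeholders standing in for whatever you set under the embedder preference.

# Simplified sketch of character-count chunking with overlap.
# Not AnythingLLM's implementation; max_length and overlap are placeholders
# for the values configured in the embedder preference settings.
def chunk_text(text: str, max_length: int = 1000, overlap: int = 20) -> list[str]:
    if overlap >= max_length:
        raise ValueError("overlap must be smaller than max_length")
    step = max_length - overlap  # each new chunk starts this far after the last
    return [text[start:start + max_length] for start in range(0, len(text), step)]

# Example: a 2,500-character document with the defaults above produces chunks
# starting at characters 0, 980, and 1960, each sharing 20 characters of overlap
# with the previous chunk.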
@KONSTANTINOSTZIAMPAZIS · 6 months ago
Great video! However, I followed every step you described in every detail, but I could not make the agents communicate with the outside world. In any "search" or "web scrape" request, the model hallucinates and presents data that is already in its knowledge instead of real-time data (i.e. the current gold price). I used Llama 3 Q8, I inserted the Google API key and ID code, and I also tried the other search engine... nothing. The logs show that it really creates JSON commands, but nothing comes in from the internet... Any help?
@TimCarambat · 5 months ago
lol, thank you haha
@darthomnibus7390 · 5 months ago
If you aren't already, will you please look into integrating knowledge graphs into the app, along the same lines as the graphRAG project? Thanks for everything you do!
@TimCarambat · 5 months ago
I actually have a development branch from long ago with KGs in it. I didn't find the performance much more remarkable, but the reason it never made it to prod was that some OSS LLMs were performing horribly trying to make nodes/relationships with the graph DB I was using.
@zirize · 6 months ago
I think it's a very good application, easy to use, and after testing it for a day or so, I have some wishes. 1. Direct commands that bypass the agent LLM in agent mode. It takes time for the agent to understand the sentence and convert it into an internal command, and URL parsing sometimes fails depending on the agent. For example, a command that scrapes the specified URL and shows the result, or a command that lists the currently registered documents with numbering, and a command that summarizes a document by this number instead of its full name. 2. I wish there were a way to pre-test the settings in the options window to make sure they are correct, such as the specified LLM or search engine. I hope this application becomes widely known and loved by many people.
@AGI2030 · 5 months ago
Great work, Tim! If using 'AnythingLLM' in the 'LLM Provider' section, can I load other LLMs that are not listed, like the '8b-instruct-q8_0' you mention? So I don't have to run Ollama separately to load a model?
@TimCarambat · 5 months ago
The default Ollama we ship with has some "basic" models. If you always want the latest and greatest models, you would need the separate Ollama. You bring up a good point - it would be nice to have a "custom" option where you can paste in any valid Ollama tag. The built-in version is usually behind the latest, and some models don't work with older Ollama versions, which is why that option doesn't exist initially.
@tonyppe · 5 months ago
I tried AnythingLLM and RAG sort of works, but I can never pull anything factual from my uploaded text files, which are configuration files. Is this a model issue? I was using Llama 3 Q8 via Ollama and LM Studio.
@TimCarambat · 5 months ago
docs.useanything.com/llm-not-using-my-docs
@JohnRiley-r7j · 6 months ago
Hey Tim, what is the actual folder that AnythingLLM uses to store models? I have all the models already downloaded for other apps, so I would rather just put the model in the right folder than download it again. Thanks in advance!
@TimCarambat · 6 months ago
On Mac: /Library/Application Support/anythingllm-desktop/storage/models
On Windows: /Users/user/AppData/Roaming/anythingllm-desktop/storage/models
@JohnRiley-r7j · 6 months ago
@@TimCarambat thanks!
@UrbanCha0s · 6 months ago
Looks really good and simple. I tried PrivateGPT using conda/Poetry and could never get it to work, so I jumped into WSL for Windows connecting to Ubuntu running Ollama, via a WebUI. Works great, but this just looks so much easier. Will have to give it a try. What I do like with the WebUI I have is that I can select different models, and even use multiple models at the same time.
@TimCarambat · 6 months ago
Yeah, we didn't want to "rebuild" what is already built and amazing, like text-web-gen. No reason why we can't wrap around your existing efforts on those tools and just elevate that experience with additional tools like RAG, agents, etc.
@ImSlo7yHD · 6 months ago
This is perfect. It just needs more tools and agent customization like CrewAI, and it is going to be an absolute killer for the AI industry.
@TimCarambat · 6 months ago
Will be coming soon! Just carving out how agents should work within the context of AnythingLLM and should be good. Also, it would be nice to be able to just import your current CrewAI and use it in AnythingLLM - save you the work you have done so far
@kamerondewart7002 · 5 months ago
Feedback: I discovered this the same day as Fabric. If you have not looked at it, please do. If there were a way to include it inside AnythingLLM, it would be my dream tool. Well, that and a note capability for saving a quick note as text or audio that can be transcribed and turned into a note with a reliable and easy-to-read format - one-button record and stop/send to create a note for later review, right on the first page. I'm building something like that already as a standalone, but I'm a noob trying something simple to learn with. It would be great if someone who knows what they are doing could put it into a tool I already plan to use a lot in the future. I love it so far. I have only been playing with it for a day.
@wyohost · 6 months ago
Just went through this whole setup, and for some odd reason it keeps telling me it can't search the internet. I've tried local LLMs and the OpenAI API with GPT-4o. I also have both the Google Search API and the Serper API. Neither seems to be able to 'reach' the internet. What in the heck am I missing? I understand this stuff pretty well and I just can't get it to search the web.
@Seedlinux · 4 months ago
Awesome video, amazing tool! Question: if you add documents and text files and ask AnythingLLM to remember them next time, is it possible to save all the data you input into AnythingLLM in a custom LLM file, so it can be saved and run anywhere?
@TimCarambat · 4 months ago
Do you mean a fine-tune? If you mean saving output to a file, we support that with our save-file agent skill. Otherwise, you can also export all your chats.
@Seedlinux · 4 months ago
@@TimCarambat Hi Tim, thanks for the reply. Yes, I meant fine-tuning. I am interested in using AnythingLLM for work, so I would like to save my fine-tuned LLM in a file like the models you download from the Ollama website, to be able to export it and use it in another location with the same information I used to fine-tune it.
@gillopez8660 · 6 months ago
Wow this is amazing... I'm gonna go star you!
@KilianWorks · 3 months ago
If I may ask... my PC with a GPU, where I've installed LM Studio, is in the same room but connected by wire to the same network I'm on with my laptop, which is using WiFi. Both devices have NordVPN with the Meshnet option, allowing them to simulate LAN connections... However, I can't fully connect to LM Studio (CORS headers on). In AnythingLLM it doesn't list the available models.
@TimCarambat · a month ago
Is the firewall on the LMStudio machine blocking incoming connections on the server port?
@pr0d1gyvisions74 · 2 months ago
Man, I faded this project when it first came out, and now I'm like... wow...
@TimCarambat · 2 months ago
@@pr0d1gyvisions74 still free!
@elwii04 · 4 months ago
Great work. Really looking forward to the new custom agents. CrewAI is great, but it always takes sooo much time setting up your agents and all that coding. I don't mind if my query or prompt takes a bit longer to process when using Ollama, since you don't have to pay per token, but in return a good and precise answer would be great.
@TimCarambat · 4 months ago
Then this would be perfect for you. We will support cloud-based LLMs, but also OSS local LLMs as well, because - well, that is kind of what we do!
@elwii04 · 4 months ago
@@TimCarambat Great :D When will this update approximately be released?
@johnbrewer1430 · 6 months ago
@sergiofigueiredo1987, @TimCarambat, I agree with Sergio. Wow! I have Ollama installed locally on a Windows machine in WSL. (I was leery of the Windows preview, but I may switch because NATing the Docker container is a pain.) I also pondered how to build a vector DB on my machine and integrate agents. You guys have already done it!
@Bob-yj3zr · 5 months ago
This is really cool. The best I have used so far.
@DouglasThompson_Profile · 3 months ago
@TimCarambat - Legend! Can you show how AnythingLLM can interface with / coordinate / use your defined Ollama agents?
@DaveEtchells · 6 months ago
Wow, this looks *_amazing!_* I’m just starting to experiment with local LLMs and wanting to play with agents; this looks SO easy! I’m going to download and set it up right away. I’m also interested in Open Interpreter for having an AI assistant do things on my local machine. Can this interface with that, or is it really meant as a substitute/enhancement to it? (Also, how can I support your project? I gather your biz model is selling the cloud service, but my usage will be purely local. Anywhere I could send a token few bucks?)
@Naki87 · 5 months ago
@TimCarambat I have stalled in my progress when trying to run Ollama. It sits for about 5 minutes, and then PowerShell tells me that it "timed out waiting for llama runner to start - progress 1.00". Suggestions?
@dissidentx · 6 months ago
Does AnythingLLM collect ANY user data that it sends to you as the developer, or make any other external connection?
@seasons-zd6ij · 6 months ago
This is a very important point ...
@TimCarambat · 6 months ago
It's in the README. We have anonymous telemetry, and you can see exactly in the repo where and which "data points" are sent. TL;DR: never any chats, model specifics, heuristics, or any BS like that. Also, you can turn it off and that's that. No exceptions or anything. github.com/Mintplex-Labs/anything-llm?tab=readme-ov-file#telemetry--privacy
@dissidentx · 6 months ago
@@TimCarambat I appreciate you being straight and recognizing the importance of this. That gives people like me assurance in terms of our privacy so we can leave those features on without worry .... thank you👍