Learn more and download LM Studio here: nvda.ws/3YhC91P. With LM Studio accelerated by NVIDIA RTX GPUs, you get faster performance and better handling of larger models, giving you the power and speed for advanced workflows.
@engineeringmanagementacademy · 3 months ago
Many thanks Kevin. I'm an academic in Australia and always find your videos very useful 👌👍
@KevinStratvert · 3 months ago
Thanks for the note!
@dougdebug7519 · 1 month ago
Excellent walkthrough and tutorial. Thank you!!
@skyerj · 3 months ago
Good vid!!! I'm waiting for your next tutorials on image generation/vision in LM Studio!!
@himanshuvajani4477 · 3 months ago
Kevin this is an absolutely amazing amazing amazing video. Thank you for making this.
@KevinStratvert · 3 months ago
Sure thing! Thanks for the note!
@ade-ade · 3 months ago
Great clip Kevin. Now do one on agents, because that's where all the trippy stuff from MS/Google etc. comes in, with their biases and all. Thanks
@panosdaras2824 · 1 month ago
Thanks for the tutorial!
@jankvis · 3 months ago
Thanks for the clear and instructive video
@karljohnson1121 · 1 month ago
LLMs are a bit like ultra-compressed ZIM files run on the GPU, with a set of rules for user interaction that rephrase the data. Has anyone tried taking a detailed science textbook and probing what kinds of data are inside the black box, to check the extent of an LLM's knowledge? Asking for details one sentence at a time, word by word, for every piece of information found in the book; or taking another AI that slices a science textbook into chunks and asks that LLM questions, to measure the percentage of accuracy and topics covered.
@clcorne · 2 months ago
Great Video... Thank you
@CyberSonic157 · 11 days ago
I wish you had explained the different settings at 2:17 in more detail, along with some guidance on what to set each one to.
@nithinb9671 · 3 months ago
Wow. This is game changing. Do they provide an SDK for this, so that we can build apps that aren't chat-based? For example, I could create an app that scans my online storage drive for Excel files and then triggers the LM on each file to get insights. Wouldn't that be cool 😊
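For readers wondering the same thing: LM Studio can serve whatever model is loaded through a local, OpenAI-compatible HTTP API (started from its server/Developer view), so scripts outside the chat window can call it. Below is a minimal sketch of the spreadsheet idea, assuming the default port 1234; the file name and model identifier are placeholders, not anything shown in the video.

```python
# Sketch: ask a locally served LM Studio model about a spreadsheet via its
# OpenAI-compatible API. Assumes the local server is running on port 1234
# and a model is already loaded; "sales_report.xlsx" is a hypothetical file.
import pandas as pd
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

# Read the spreadsheet and pass a small slice of it as context.
df = pd.read_excel("sales_report.xlsx")
sample = df.head(50).to_csv(index=False)  # keep the prompt within the context window

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio answers with the loaded model
    messages=[
        {"role": "system", "content": "You analyze spreadsheet data."},
        {"role": "user", "content": f"Give me key insights from this data:\n{sample}"},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```

The same pattern works for any non-chat workflow: your script gathers the data, the local endpoint does the reasoning, and nothing leaves the machine.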
@sandro-nigris · 3 months ago
Kevin king of tutorials. Period. You are the best!
@Ali-qy5lk · 3 months ago
I'm getting "Failed to load model" on 3 different models each time I try to load one. Anyone else having this issue?
@kairo_data · 3 months ago
I had already installed Ollama and some models, and I use Msty to chat with them. Are there any benefits to using LM Studio instead? And can LM Studio use the models already installed in my Ollama folder?
@PranayBhagat-y9t · 2 hours ago
Hi Kevin, thanks for the video, it was helpful. I have a folder with thousands of documents. Just like you showed in the video for a single document, I want the LLM to go through all the documents in the folder and give me answers directly from them, all offline. How can I do this?
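One simple way to do this outside the chat UI (not covered in the video) is to loop over the folder and send each file to LM Studio's local server, asking the same question every time. A rough sketch, assuming plain-text files, the default port 1234, and a placeholder model name:

```python
# Sketch: ask the locally served model the same question about every
# document in a folder. Folder name, question, and model id are placeholders;
# long files would need to be split into chunks in a real setup.
from pathlib import Path
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")
question = "What are the key points of this document?"

for doc in Path("my_documents").glob("*.txt"):
    text = doc.read_text(encoding="utf-8", errors="ignore")[:8000]  # stay within context
    reply = client.chat.completions.create(
        model="local-model",
        messages=[{"role": "user", "content": f"{question}\n\n---\n{text}"}],
    )
    print(f"### {doc.name}\n{reply.choices[0].message.content}\n")
```

For thousands of documents, a proper RAG setup (index everything once, then retrieve only the relevant chunks per question), such as pairing LM Studio with a tool like AnythingLLM, is usually more practical than re-sending every file.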
@jugatsu23nichi · 3 months ago
This was a great video. Does using an LLM on your own laptop preserve the confidentiality of the data you load into it, such as corporate financial data?
@lalmediagroup7928 · 2 months ago
Did you get any answer on this? If so, please share your findings.
@jugatsu23nichi · 2 months ago
@lalmediagroup7928 no, I have not heard back
@naren1705 · 3 months ago
Hello Kevin, as always great tutorials! How do I load the data into the model? Thank you!
@Gottamakeyouunderstand · 3 months ago
Hey Kevin, love your videos. What do you think of Msty? It's similar to LM Studio, but it has more advanced features and a cleaner interface. Have you checked it out?
@bamboosingularity3330 · 1 month ago
That's so cool. Do you reckon it's more useful than downloading all of Wikipedia when I just need to search general stuff?
@danielodeliro2412 · 3 months ago
Thanks, sir. How do I download videos from any website?
@pavans4014 · 3 months ago
You made my day, bro
@ahmed-schrute · 17 days ago
I added a tutorial on running LM Studio local models in an online chatbot that can be embedded in a web page
@iezzat189 · 11 days ago
Thanks for the tutorial. I'd appreciate it if you could help me with this: I'm looking to get LM Studio running on a local machine and to expose it to selected users to interact with. I'm looking for an easy way to do that; eventually I want the selected users to see the AnythingLLM interface.
@Romdryl · 20 days ago
Hey Kevin, well done video! Thank you. When I use the Model Search in LM Studio and enter a title for something I know is on Hugging Face (because I can literally see it on the Hugging Face site), it returns zero results. What am I missing here?
@thinkingmaniac54325 · 22 days ago
I have 8 GB of RAM; can I install it?
@eitandavidadv9143 · 3 months ago
Thanks. Can I upload 1,000 docs from my computer to this AI so it can process the data?
@Laptevwalrus · 2 months ago
Yes you can; it's called RAG (retrieval-augmented generation)
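For context, RAG means embedding your documents once, then retrieving only the chunks most relevant to each question and passing those to the chat model. A tiny sketch of the retrieval half, assuming an embedding model is loaded and served by LM Studio on the default port (the model id and chunk text are placeholders; a real setup would store embeddings in a vector database rather than recompute them):

```python
# Sketch: embed document chunks and rank them against a question using the
# OpenAI-compatible embeddings endpoint of a local server.
import numpy as np
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

def embed(texts):
    out = client.embeddings.create(model="nomic-embed-text", input=texts)  # placeholder id
    return np.array([item.embedding for item in out.data])

chunks = ["Q3 revenue grew 12%...", "The warranty covers 24 months...", "Shipping takes 5 days..."]
chunk_vecs = embed(chunks)

query_vec = embed(["How long is the warranty?"])[0]
scores = chunk_vecs @ query_vec / (
    np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(query_vec)
)
best = chunks[int(np.argmax(scores))]  # pass this chunk to the chat model as context
print(best)
```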
@ivandrofly · 3 months ago
What is the difference between "User" and "Assistant" in the chat? Also, give an example for "Assistant" if you know one
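In short: "User" messages are what you type, "Assistant" messages are the model's replies, and a "System" message (if present) sets the overall behavior. Under the hood the conversation is just a list of role-tagged messages; a small illustration (the content here is made up):

```python
# Illustration of chat roles as they are sent to the model.
messages = [
    {"role": "system", "content": "You are a concise technical assistant."},
    {"role": "user", "content": "What does RAM stand for?"},    # what you typed
    {"role": "assistant", "content": "Random Access Memory."},  # the model's earlier reply
    {"role": "user", "content": "And VRAM?"},                   # your follow-up question
]
```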
@kbqvist · 3 months ago
The models chosen here seem absolutely tiny, so I am curious how large a model one can realistically run on consumer-grade hardware, e.g. 32 GB of RAM and an 8-10 GB graphics card?
@Pennytechnews · 3 months ago
Not well, from what I have tried
@kbqvist · 3 months ago
@Pennytechnews Thanks. I suppose that for me there will be little point in messing with this, given that we can run much more capable models for free, without installing anything...
@KevinStratvert · 3 months ago
I probably should have demonstrated a larger model. Check out the article linked in the video: nvda.ws/3YhC91P. With LM Studio's GPU offloading, you can run large models, like Gemma 2 27B, locally on RTX PCs. It makes larger, more complex models accessible locally, and you don't have to rely on models running in the cloud. If you have decent hardware / a dedicated GPU, you can run many of these models on your own.
@kbqvist · 3 months ago
@KevinStratvert Thanks Kevin!
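On the sizing question above, a rough rule of thumb (my own back-of-the-envelope numbers, not figures from the video): a quantized model needs roughly its parameter count times the bytes per weight, plus some overhead for the context cache. A quick sketch of that estimate:

```python
# Rough memory estimate for a quantized model; a rule of thumb, not exact.
def estimate_gb(params_billion, bits_per_weight=4.5, overhead=1.2):
    # ~4.5 bits/weight approximates a Q4_K_M-style quantization; the overhead
    # factor loosely covers the KV cache and runtime buffers and grows with context.
    return params_billion * (bits_per_weight / 8) * overhead

for size in (7, 13, 27):
    print(f"{size}B model: ~{estimate_gb(size):.1f} GB")
# 7B  -> ~4.7 GB  (fits comfortably on an 8 GB GPU)
# 13B -> ~8.8 GB  (tight on 8-10 GB; partial GPU offload helps)
# 27B -> ~18.2 GB (needs layers offloaded to system RAM on most consumer GPUs)
```

So on 32 GB of RAM with an 8-10 GB card, 7B-13B models run comfortably on the GPU, and larger models like a 27B are possible but lean on offloading and run slower.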
@austinputz · 11 days ago
I have a regular 2080; is this good enough to run these? My MacBook has a GPU, but I'm not sure if it can be used for compute. Anyone running a MacBook Pro with LM Studio? I have 64 GB of RAM but only the small GPU
@atultanna · 3 months ago
Thanks. Using LM Studio, can we replicate a web app as a desktop app?
@Pennytechnews · 3 months ago
That depends on the model you are using
@atultanna · 3 months ago
@Pennytechnews Which model should we use?
@Pennytechnews · 3 months ago
@atultanna For replicating something from an image you'll need at least a 7B model, but it depends on your PC specs; that will tell you what models you can run smoothly
@atultanna · 3 months ago
@Pennytechnews Can you help?
@Pennytechnews · 3 months ago
@atultanna Look up how to determine the right LLM to run locally based on your PC specs, or just use a cloud LLM
@OniChan-m3y · 19 days ago
Is this a safe program to run? I've seen some people mention that it leaves some unwanted things running on your PC after uninstalling. I'm new to this stuff and am not super tech savvy.
@kaerit8453 · 2 months ago
How can I install a model on a secondary disk?
@napri2450 · 7 days ago
Why is my GPU not detected in LM Studio?
@CHATHK · 3 months ago
Please, can you do image generation in LM Studio?
@TheMouseJerry-du1md · 17 days ago
What about people who don't have GPUs?
@Stick3x · 6 days ago
Too bad LM Studio doesn't have a microphone feature so you can chat without having to type.
@Simonovits1 · 3 months ago
Kevin, I teach high school mathematics. What is currently the "best" free math model?
@flyingvnman · 2 months ago
What about 8 GB of RAM?
@gmcenroe · 3 days ago
Very disappointed with this program. Various models were asked to write Python code, and the code was incomplete or contained many errors. Also, for simple questions it would often say "The message contains no content" or "Message failed to send."
@debojyotidey9529 · 1 month ago
Doesn't Ollama do the same thing?
@bretspencer · 1 month ago
Similar, but not the same. I've tried both and prefer LM Studio. Use whichever one you like.
@mitchelhighman8382 · 2 months ago
Is it free?
@cadepope4093 · 1 month ago
It sure is!
@goblin2167 · 1 month ago
0:10 0:58 wtf??
@StarChaser1879 · 10 days ago
That's a normal amount of RAM.
@Tirats63 · 3 months ago
Same with us southerners; we pronounce words beyond wrong
@witness1013 · 3 months ago
Why can't I run them on a desktop?
@SocratesWasRight · 3 months ago
You can
@witness1013 · 3 months ago
@SocratesWasRight The title says laptop, and he ends by saying it's for your laptop. Why, if it can be installed on either?
@almirantecarvalho · 3 months ago
@witness1013 Most people have laptops, which are viewed as weaker than desktop PCs. He just wants to show that you can run an LLM on most machines.
@witness1013 · 3 months ago
@almirantecarvalho "Most people have laptops"? That's the most absurd, and grossly incorrect, statement I've ever heard.
@KevinStratvert · 3 months ago
It'll work on both 👍 Laptop market share outpaces desktop, and many people search specifically for "laptop," which is why I focused on that in the title and talk track.
@OnlyTruthLove · 2 months ago
LM Studio allows people to use LLMs without hardware restrictions like RAM. To run LM Studio, you need to have 16 gigs of RAM. AI reasoning, lol
@twokool4skool129 · 3 months ago
What a worthless and misleading video. A 1B-parameter model is not "large", and those are the only LLMs LM Studio is able to run; they're worthless except for the simplest of prompts.