Dual 3090Ti Build for 70B AI Models

6,445 views

Ominous Industries

2 months ago

In this video, I take you through my journey of upgrading my computer setup with a second Nvidia RTX 3090 Ti, with the ultimate goal of running demanding 70B local LLM models and other GPU-intensive applications. Anyone pushing the boundaries of AI research and computational power knows how crucial the right hardware is, and that's exactly why I embarked on this upgrade.
After extensive research and monitoring the market for the best GPU deals, I stumbled upon a golden opportunity at my local Micro Center: refurbished Nvidia 3090 and 3090 Ti Founders Edition cards at prices that undercut even the second-hand market. It was a deal too good to pass up, especially for an enthusiast looking to bolster a system for some of the most compute-intensive tasks out there.
In this build log, I show every step of the process, from decision-making to installation and performance testing, and explain why the 3090 Ti is a strong choice for anyone interested in deep learning, AI model training, and other workloads that demand significant GPU resources.
I also share insights on spotting great deals on high-end hardware, the case for refurbished components, and tips for making sure your system is ready for next-generation computing. Whether you're a seasoned AI developer, a deep learning enthusiast, or simply fascinated by the capabilities of modern technology, this video is packed with useful information.
Join me as I push my computer's performance to new heights, making it capable of running 70B local LLM models and beyond. Don't forget to like, share, and subscribe for more content on AI, technology, and high-performance computing builds. Your support helps me bring you more of these in-depth guides and tutorials. Let's dive into the world of high-end computing together!
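For anyone wondering what running a 70B model on two 24 GB cards actually involves, here is a minimal sketch using the Hugging Face Transformers API with 4-bit quantization so the weights fit across both GPUs. The model ID and quantization settings are illustrative assumptions, not necessarily what is used in the video.

```python
# Minimal sketch: load a 70B chat model across two 24 GB GPUs with 4-bit
# quantization. Model ID and settings are assumptions, not the exact setup
# used in the video.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Meta-Llama-3-70B-Instruct"  # hypothetical example model

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                       # ~0.5 bytes per weight
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                       # shards layers across both cards
)

prompt = "Why would someone want two 3090 Tis for local LLMs?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```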

Comments: 32
@UpNorth937 25 days ago
Great video!
@OminousIndustries 25 days ago
Thanks very much!
@i6od 5 days ago
I saw a Reddit post of a guy running 4x P100 16GB for under $1,300 and getting 30 tokens a second with vLLM on 70B Llama 3, lol. I'm so happy to see other builds like dual 3090s too. So far I have managed to pick up one Titan RTX, and I'm hoping to go for a 3090 or another Titan RTX.
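For context on that vLLM setup, here is a minimal sketch of multi-GPU inference with vLLM's Python API; the model repo, quantization, and GPU count below are placeholder assumptions.

```python
# Minimal sketch: multi-GPU inference with vLLM (model and settings are assumptions).
from vllm import LLM, SamplingParams

llm = LLM(
    model="TheBloke/Llama-2-70B-Chat-AWQ",  # placeholder: any AWQ-quantized 70B repo
    quantization="awq",                     # quantized weights so they fit in VRAM
    tensor_parallel_size=2,                 # split across 2 GPUs; the P100 rig above used 4
)

outputs = llm.generate(
    ["Explain why dual 24 GB GPUs help with 70B models."],
    SamplingParams(max_tokens=64),
)
print(outputs[0].outputs[0].text)
```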
@OminousIndustries 5 days ago
It's been very cool to see the use cases of older cards for local LLM setups. I want to grab a Tesla P40 at some point and put it in a restomod LLM PC, if for nothing more than the cool factor of how it looks.
@mcdazz2011 2 months ago
One of the best things you can do in the short term, is to clean the front air filters. I can see one at 11:48, and there's a fair amount of dust between the filter and the fan. You'll get better air intake just by cleaning them, which will help with any heat generated in that case (which is a BIG heat trap). Longer term, definitely look at getting a new case with better air flow. The way it is at the moment, that case is going to act like an oven and you'll likely find that the CPU/GPUs might thermal throttle and rob you of performance. Thermaltake make some pretty big cases (on wheels if that's your thing), so you might like the Core W100 or Core W200.
@OminousIndustries 2 months ago
Excellent advice. Ironically enough, I recently purchased a Thermaltake View 71 to transfer all the components into, and I am excited to do the swap.
@cybermazinh0 2 months ago
The video is very cool; the 3090 build could look really beautiful in a nicer case.
@OminousIndustries 2 months ago
Thanks very much! I am going to be swapping everything over into a Thermaltake View 71 case very soon.
@jamesvictor2182 22 days ago
Unlike the inside of that case!
@mixmax6027 1 month ago
How'd you increase your swap file? I have the same issues with 72B models running dual 3090s
@OminousIndustries 1 month ago
These instructions should work, though I have only used them on Ubuntu 22.04: wiki.crowncloud.net/?How_to_Add_Swap_Space_on_Ubuntu_22_04#Add+Swap
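For reference, here is a minimal sketch of the same swap-file steps that guide covers, wrapped in Python; the size and path are assumptions, and it needs to run as root.

```python
# Minimal sketch of adding a swap file on Ubuntu 22.04 (run as root).
# The size and path below are assumptions; adjust for your setup.
import subprocess

def add_swap(size_gb: int = 64, path: str = "/swapfile2"):
    steps = [
        ["fallocate", "-l", f"{size_gb}G", path],  # reserve space for the swap file
        ["chmod", "600", path],                    # lock down permissions
        ["mkswap", path],                          # format it as swap
        ["swapon", path],                          # enable it immediately
    ]
    for cmd in steps:
        subprocess.run(cmd, check=True)
    # To keep it after a reboot, add "<path> none swap sw 0 0" to /etc/fstab.

if __name__ == "__main__":
    add_swap()
```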
@mbe102 2 months ago
What is the aim of using OpenDalle? Is it just... for fun, or is there some monetary gain to be had through this?
@OminousIndustries 2 months ago
Personally, I just use it for fun. Some people use these uncensored image models to generate NSFW images that they then release on Patreon, etc., to make some money, but that is not in my wheelhouse.
@atabekkasimov9702 2 months ago
Do you plan to use NVLink with the new Ryzen setup?
@OminousIndustries 2 months ago
It is something I would like to add once I swap over to a Threadripper. I have seen conflicting opinions on how much it helps, but I would like it for "completeness" if nothing more.
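On the NVLink question, a quick way to check whether two cards can already talk peer-to-peer is a small PyTorch check; this is a generic sketch, not specific to the setup in the video.

```python
# Minimal sketch: check GPU visibility and peer-to-peer access between card 0 and 1.
import torch

print("GPUs visible:", torch.cuda.device_count())
for i in range(torch.cuda.device_count()):
    print(f"  cuda:{i} -> {torch.cuda.get_device_name(i)}")

if torch.cuda.device_count() >= 2:
    # True means device 0 can access device 1's memory directly
    # (over NVLink if a bridge is installed, otherwise over PCIe where supported).
    print("Peer-to-peer 0 -> 1:", torch.cuda.can_device_access_peer(0, 1))
```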
@jamesvictor2182 22 days ago
I am awaiting my second 3090 Ti and will probably end up water cooling. How has it been for you with heat management?
@OminousIndustries 22 days ago
I have not seen crazy temps while running local Llama models. I did render something in KeyShot Pro that made the cards far too hot, but for any LLM stuff it hasn't been bad at all.
@M4XD4B0ZZ 1 month ago
OK, so I am very interested in local LLMs and have found that my system is way too weak for my liking. But I really have to ask: what are you doing with this technology? I have no "real" use case for it and wouldn't consider buying two new GPUs for it. What are the actual beneficial use cases? Maybe coding?
@OminousIndustries 1 month ago
I have a business that utilizes LLMs for some of my products, so it is a 50/50 split between business-related research and hobbyist tinkering. The requirements to run LLMs locally depend heavily on the type and size of model you want to run. You don't need a large-VRAM setup like this to fool around with them; I just went for it so that I could run larger models like 70B. Some of the smaller models run fine on an older card like a 3060, which can be had without breaking the bank. Some of the model "curators" on Hugging Face post the VRAM requirements for their models, bartowski being one who lists them.
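As a rough rule of thumb for those VRAM requirements, the weights alone need roughly (parameter count) x (bytes per weight), plus some headroom for the KV cache and activations. A small sketch of that back-of-the-envelope math (the 20% overhead factor is an assumption):

```python
# Back-of-the-envelope VRAM estimate: weights take params * bits / 8 bytes,
# plus headroom for KV cache and activations. The 20% overhead is an assumption.
def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                     overhead: float = 1.2) -> float:
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for params, bits in [(7, 16), (7, 4), (70, 16), (70, 4)]:
    print(f"{params}B @ {bits}-bit ~ {estimate_vram_gb(params, bits):.0f} GB")
# 70B at 4-bit lands around ~42 GB, which is why two 24 GB 3090 Tis can hold it,
# while 70B at fp16 (~168 GB) is far out of reach for consumer cards.
```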
@M4XD4B0ZZ 1 month ago
@OminousIndustries Thank you for the insights, I really appreciate it.
@OminousIndustries 1 month ago
@M4XD4B0ZZ Of course!
@MikeHowles 13 days ago
Bro, use nvtop. You're welcome.
@OminousIndustries 13 days ago
I'm going to install that tonight for my Intel GPU build; I previously hadn't found a monitoring tool for that GPU on Linux.
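For a scriptable complement to nvtop on the Nvidia cards, here is a minimal sketch that polls nvidia-smi for temperature, utilization, and VRAM use; it assumes a standard Nvidia driver install with nvidia-smi on the PATH.

```python
# Minimal sketch: poll nvidia-smi every few seconds for temperature, utilization,
# and VRAM use on each card (Ctrl+C to stop).
import subprocess
import time

QUERY = "index,name,temperature.gpu,utilization.gpu,memory.used,memory.total"

while True:
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())
    time.sleep(5)
```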
@codescholar7345 1 month ago
What CPU and motherboard? What is the temperature of the cards? Thanks!
@OminousIndustries 1 month ago
The CPU is an i7-12700K and the mobo is an MSI PRO Z690-A; I purchased them as a Micro Center bundle about a year ago. I have not seen the card temps get over about 75°C when using text-gen-webui. I was using KeyShot Pro for something and decided to use both cards to render the project, and they got far too hot, so cooling is the first thing to be upgraded.
@codescholar7345 1 month ago
@OminousIndustries Okay, thanks. Yeah, there's not much space in that case. I have a bigger case, and I'm looking to get another 3090 or 4090 and possibly water cool them. It would be nice to get an A6000, but that's too much right now.
@OminousIndustries 1 month ago
@codescholar7345 I have a Thermaltake View 71 to swap them into when I get the time. The A6000 would be awesome, but yeah, that price could get you a dual 4090 setup. A water cooling setup would be very cool and a good move for these situations.
@emiribrahimbegovic813 13 days ago
Where did you buy your card?
@OminousIndustries 13 days ago
I got it at Micro Center; they were selling them refurbished. Not sure if they still have any in stock. They also had 3090s.
@skrebneveugene5918 17 days ago
What about Llama 3?
@OminousIndustries 16 days ago
I tested a small version of it in one of my more recent videos!
@m0ntreeaL 12 days ago
Big price... I guess 200 bucks too high.