How Fast Is Blender With TWO RTX 3090s?

  115,366 views

DECODED

Days ago

Comments: 351
@4mb127
@4mb127 3 years ago
I didn't know a man could survive with both of his kidneys missing. 😁
@CosmicComputer
@CosmicComputer 3 years ago
he's cleaning pee on the gpu now lol
@melomaniakjm
@melomaniakjm 3 years ago
It's actually cheap for the performance.
@leucome
@leucome 3 years ago
@@melomaniakjm With current GPU prices... we could buy so many computers for the price of two 3090s that it's almost the same price-to-performance ratio.
@earthmatters2869
@earthmatters2869 3 years ago
Lol, without seeing your comment I also commented about kidneys. We are the same 😀😁😅😅
@peruvianpuffpepper904
@peruvianpuffpepper904 3 years ago
@@leucome Where can you even get one?? I can't find them anywhere.
@cry2love
@cry2love 3 years ago
We're almost there. Soon we will forget the pain of RENDER and enjoy the freedom of REAL-TIME.
@FuZZbaLLbee
@FuZZbaLLbee 3 years ago
Don't worry, we will just raise the minimum acceptable resolution to 8K.
@UnchartedWorlds
@UnchartedWorlds 3 years ago
@@FuZZbaLLbee lol :D
@deadpianist7494
@deadpianist7494 3 years ago
Nope, there will always be something new to work on and wait for, because soon 16K will get old and maybe 1080K will be invented xd idk
@cry2love
@cry2love 3 years ago
@@deadpianist7494 Don't forget about new tech like Unreal Engine 5 or higher. It pushes tons of polygons and even a PS5 can run it. Unreal Engine 5 is basically a better version of Blender's Eevee, so I guess one way or another render times will get close to real time.
@SplendidNinja
@SplendidNinja 3 years ago
@@FuZZbaLLbee Yup, but 4K will stop that, as most people will start to run out of space for appropriately sized 8K monitors (jumping higher will just make it worse). So yeah, we're probably headed for VR and real-time rendering at 8K.
@CurtisHolt
@CurtisHolt 3 years ago
Nice video Rob!
@DECODEDVFX
@DECODEDVFX 3 years ago
I didn't bother with bar graphs for this video because the scale of the highest and lowest scores made it impossible to read.
@lascaustefan4087
@lascaustefan4087 3 years ago
That just emphasizes even more what a beast the 3090 is ^^
@laurencechase5439
@laurencechase5439 3 years ago
Keep an eye on your memory temps on those 3090s though! I have two, and I had them in the same configuration you have there - the core temperatures were acceptable but the memory junction was pushing 105C during intense Redshift or Octane renders. I had to opt for watercooling in order to tame them!
@DECODEDVFX
@DECODEDVFX 3 years ago
My VRAM temps seem to top out at around 100 degrees after sustained renders. I'll be moving the second card down to the bottom PCIE slot soon, which should keep things even cooler.
@crispinotechgaming
@crispinotechgaming 3 years ago
rtx 3090 be like: oven architecture
@ricbattaglia6976
@ricbattaglia6976 2 years ago
@Claus Bohm Is the CPU important for Blender? And when does it matter (scenes with many polygons, viewport rendering)? Thanks
@johnbancroft5242
@johnbancroft5242 1 year ago
I watercooled my 3090s. I built this workstation as a water-cooled rig 8 years ago and have just changed out the motherboard, GPU and RAM a few times (24-core AMD 3960X and 64GB RAM), but the water cooling has stayed the same: two 480 rads and one 240, now running 12 Noctua industrial 3000 RPM fans (a recent upgrade). I work with DaVinci Resolve Studio (it loves loads of video RAM, especially when denoising), and the temps are very good.
@WeDepict
@WeDepict 1 year ago
The CPU is still important for preparing each frame, which adds to the render time of every frame even if you're only using the GPU to render. @clausbohm9807
@parkflyerindonesia
@parkflyerindonesia 3 years ago
Placed an RTX A4000 and A5000 inside my good old HP Z820, and rendering is blazing fast as well. Feels good to see this kind of performance in Blender 😊
@prashantsunuwar6297
@prashantsunuwar6297 3 years ago
Him: has 2x 3090s. Me: cries with my Intel Celeron.
@melomaniakjm
@melomaniakjm 3 years ago
Intel Celeron is a CPU. What's your point?
@prashantsunuwar6297
@prashantsunuwar6297 3 years ago
@@melomaniakjm My point is that I don't have a GPU, and the Intel Celeron is one of the worst CPUs ever made.
@FreshGreenApples
@FreshGreenApples 3 years ago
Wow, how did you get two? Nobody seems to have them in stock anywhere.
@stiky5972
@stiky5972 3 years ago
ayy, crappy cpu rendering gang
@keyframed6558
@keyframed6558 3 years ago
That 48GB of VRAM would really help with fluid simulations. I only have 8GB and it gets used up really fast.
@jaydevsingh3067
@jaydevsingh3067 3 years ago
I've got 1GB integrated. Works just fine *sad face
@Prajwal____
@Prajwal____ 3 years ago
You can also do all that simulation baking on any render farm.
@normiewhodrawsonpaper4580
@normiewhodrawsonpaper4580 3 years ago
You can say... it fills it up
@zefellowbud5970
@zefellowbud5970 3 years ago
I got this as an ad while browsing Blender vids, and I was surprised to find out it was an ad. Good job, you got me to watch a whole advert. Seriously, good job.
@Austin_Hammy
@Austin_Hammy 3 years ago
When it comes to denoising, I still prefer Intel Open Image AI over OptiX. The OptiX denoiser may be faster, but the Intel Open Image denoiser consistently produces more accurate results in all my tests so far.
@canyongoat2096
@canyongoat2096 3 years ago
Finally, someone who gets it. Although in this Nvidia-sponsored video it's no surprise he is using OptiX. I remember some people said to use OptiX for the viewport and the Intel denoiser for final renders, but OptiX makes such weird artifacts on hair etc. at low samples that in my experience it's just better to use the Intel denoiser for everything. When editing I can usually do 10-15 samples in the viewport and it gives me a more accurate representation of the scene with more details.
@aaditnoronha5383
@aaditnoronha5383 2 years ago
OptiX is not good for sharp edges. Personally I don't see a big difference between the two, but I see no advantage in picking OptiX right now.
@jontyhamp01
@jontyhamp01 3 years ago
I barely understood anything you were talking about, but your video held my attention throughout. Many thanks. I have Blender on my PC but my specs are so low that Blender declines to even get out of bed.
@coreym162
@coreym162 3 years ago
Wow! I'd been using the 3.0 alpha forever and I'm happy to know the final build will come in a few weeks. Just about to download the beta. Hope it's more stable.
@mrvideographe6001
@mrvideographe6001 3 years ago
My computer looked up at me while I was watching this video, and I think it internally cried.
@dysnXO
@dysnXO 2 years ago
Hey, what are your GPU temps like while rendering under 100% load? My dual 3090s run at 87C on one (I assume the top one) and 70-73C on the other. Should I be worried about 87C under full render load, or is that OK?
@Golemofstone
@Golemofstone 3 years ago
So I have to ask, @1:30, do those two 3090s power that boiler behind your build? I mean, gas prices are getting crazy right now, so is the excess heat from the 3090s worth the investment? (From a worried UK gas consumer using a medium-sized supplier) ;)
@gerasimosioardanitis5494
@gerasimosioardanitis5494 2 years ago
Brother, would you consider running a new test with the latest Blender and the much-improved Cycles to see the difference with the two 3090s? Also, would you suggest getting one 4090 or two 3090s (so I could take advantage of 24+24=48GB and load my whole Blender scene at once)? I highly appreciate all your content and the time you put into it!
@drinnerd8532
@drinnerd8532 3 years ago
You forgot to mention the most important detail about this video: HOW the bloody hell you were even able to get your hands on dual 3090s.
@Alucard-gt1zf
@Alucard-gt1zf 3 years ago
3090s aren't hard to find. They are just eye-wateringly expensive.
@drinnerd8532
@drinnerd8532 3 years ago
@@Alucard-gt1zf What are you talking about, man? Where are you finding these? I see 3080 Tis and a few 3080s and lower-tier cards popping up every once in a while, but even super-expensive 3090s seem to pop up only once in a blue moon.
@Jon_Doh
@Jon_Doh 3 years ago
Nice vid. I have one 3090, and the difference from a 1080 was night and day. So worth it.
@creedolala6918
@creedolala6918 3 years ago
Is nobody else kind of mad that a single company has got a monopoly on all the cool GPU-intensive programs, now including Blender? Right now we're getting shafted by the shortage, but when prices drop it's going to suck having zero competition for Blender users looking to upgrade.
@DECODEDVFX
@DECODEDVFX 3 years ago
It's a shame that AMD are so uncompetitive in this space. They have nothing like OptiX for creative programs. They seem to only care about gaming, which is strange considering their CPUs are primarily aimed at heavily threaded tasks like video editing. The RTX 3060 outperforms the 6800 XT in Cycles, and it's half the price.
@creedolala6918
@creedolala6918 3 years ago
@@DECODEDVFX Late reply, but is it accurate that AMD has no equivalent to OptiX? Recently I was asking a developer about their efforts to make an OpenCL implementation of software that normally relies on CUDA. They claimed OpenCL can do about 90% of what CUDA does. Do you have any idea if that's accurate? When it comes to Blender, I'm only aware of OptiX being used to denoise. For that, they're dumping support for half the video card market? I've seen excellent AI denoising with e.g. Topaz software; it's not that exciting. I'm guessing there must be more to it? Is there a good, unbiased explanation of what CUDA and OptiX offer that can't be done on OpenCL / AMD?
@DECODEDVFX
@DECODEDVFX 3 years ago
@@creedolala6918 CUDA is faster than OpenCL because it was specifically designed for Nvidia cards; OpenCL is just an open-source API designed a decade ago by Apple. Nvidia also tends to give really good support to developers who want to integrate Nvidia software into their packages, and they have devs working on Blender-Nvidia integration. OptiX is faster than OpenCL or CUDA because it was made specifically for raytracing. RTX cards have CUDA cores for processing CUDA operations, Tensor cores for AI tasks, and RT cores for raytracing, and the OptiX API can access all of that in order to render frames really quickly. If you have an RTX card, you can turn on OptiX rendering in the settings and you'll get much faster renders (up to twice as fast in some cases). The Blender Institute is dropping OpenCL support because it's old, neglected, and hasn't been updated much for years. Blender only has a small team of devs, and they were wasting precious man-hours every month fixing bugs related to maintaining OpenCL. Cycles X basically couldn't happen with OpenCL; the framework is simply too limited. The Blender devs are committed to supporting AMD and Intel again at some point, and are working with both companies to arrange support via another API, but that won't be in Blender 3.0.
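For anyone who wants to flip that OptiX switch without clicking through the preferences, here is a minimal sketch using Blender's Python API (an illustration only, assuming Blender 3.x and an OptiX-capable RTX card; run it from the Scripting workspace or via --python):

```python
import bpy

# Point Cycles at the OptiX backend ('CUDA' also works on Nvidia cards).
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = 'OPTIX'
prefs.get_devices()  # refresh the device list

# Enable every OptiX device; leave other device types (e.g. CPU) disabled.
for device in prefs.devices:
    device.use = (device.type == 'OPTIX')
    print(device.name, device.type, "on" if device.use else "off")

# Make the current scene render on the enabled GPUs.
bpy.context.scene.cycles.device = 'GPU'
```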
@creedolala6918
@creedolala6918 3 years ago
@@DECODEDVFX Ah, that actually clears things up a lot. It seems I had some misconceptions about its usefulness. I wonder if Nvidia could be coaxed into licensing these things, or if that's just giving away too much competitive advantage.
@DECODEDVFX
@DECODEDVFX 3 years ago
@@creedolala6918 I doubt it could be licensed, since it's designed specifically with Nvidia's architecture in mind.
@yonamiti9743
@yonamiti9743 3 years ago
You inspire me. I salute you.
@Mocorn
@Mocorn 3 years ago
I've been wondering: for those of us who do both gaming and 3D on the same machine, how does one choose between the Game Ready and the Studio drivers for an Nvidia card?
@DECODEDVFX
@DECODEDVFX 3 years ago
You can easily switch between both drivers in the GeForce Experience app. But if I had to choose one to use all the time, I'd go for the Studio drivers. I'd rather have stability while working than 3-4 extra FPS in games.
@Mocorn
@Mocorn 3 years ago
@@DECODEDVFX Interesting. Would you say stability is the main reason for using these drivers, then?
@DECODEDVFX
@DECODEDVFX 3 years ago
@@Mocorn yes. They reduce the chances of graphical bugs and crashes.
@Mocorn
@Mocorn 3 years ago
@@DECODEDVFX I see. Cheers
@lolmao500
@lolmao500 3 years ago
Can't wait for the RTX 4090... it will be great for rendering but also necessary for next-gen VR.
@DECODEDVFX
@DECODEDVFX 3 years ago
Yeah, if the 30 series is anything to go by, the 40 series is going to be wild.
@guristoski2625
@guristoski2625 3 years ago
@@DECODEDVFX Yeah, and you'll be able to buy one in 2024. People are now saying it won't even let up in 2023. I'm choosing to just not render my files out and wait all the way to the 4090, when I can buy one or two. I'm on a 2080, so fuck that lol.
@thumbwarriordx
@thumbwarriordx 3 years ago
The flow-through design of the new Nvidia cards greatly reduces the impact that previous non-blower cards would have suffered stacked this close together. It's negligible on the back half. The inner fan of the top card has lost a lot of efficiency, to be sure, but in the words of a great man: share the load, Mr. Frodo.
@DriftingMunki
@DriftingMunki 3 years ago
Great video! Thanks! Rotate the top and bottom cards periodically to average out the temperature differential between the two. Just a thought.
@DECODEDVFX
@DECODEDVFX 3 years ago
I'll be moving the bottom card down one slot in the next few days. My case needs to be modded slightly first because it doesn't have enough cut-outs for the IO shield. I guess the case designers didn't expect anyone to fit a massive 2.5-slot card to the bottom of their motherboard. I'll be taking a Dremel to my case once I wrap up my current video. But the temps are nothing to worry about for now. VRAM runs a little hot after sustained renders, but it's within design tolerance.
@ExpertManOfficial
@ExpertManOfficial 3 years ago
What do you think about the AMD 3990X?
@thejourney405
@thejourney405 3 years ago
After 5 months of trying, I finally managed to make the hobbit hole tutorial :)
@DECODEDVFX
@DECODEDVFX 3 years ago
Awesome. Well done!
@thejourney405
@thejourney405 3 years ago
@@DECODEDVFX Thanks! I added some of my own bits, but overall you're a legend for that. Hope you do more Lord of the Rings tutorials in the future. Also, do you have a Discord server?
@ImportRace
@ImportRace 3 years ago
Awesome video
@VeveYT
@VeveYT 2 years ago
Hm, why is my score in Blender 3.1 with 2x 3090s almost double yours? BMW: 8.18 sec, Classroom: 14.51??? Do you have any idea what I am doing wrong? Thanks.
@quuu42
@quuu42 3 years ago
There's no 'L' on the alphabet banner in the classroom picture! Also a very interesting video, thanks!
@sdados3725
@sdados3725 2 years ago
Hi! Did you try using NVLink? As far as I know that would let you double the VRAM of your 3090s, but I wonder if it's compatible with Blender?
@jonathanxdoe
@jonathanxdoe 3 years ago
How many body parts did you sell to the black market to get those 3090s?
@DECODEDVFX
@DECODEDVFX 3 years ago
Only the stuff I didn't need. Pinky toes and tonsils.
@alexdib3915
@alexdib3915 3 years ago
That Bioshock scene looks dope!
@DECODEDVFX
@DECODEDVFX 3 years ago
Thanks. I'm hoping to work on it more after next month.
@grunt22
@grunt22 3 years ago
Massively informative video... Running out to get my lottery ticket now 😀
@broadstork
@broadstork 3 years ago
How do you get 9 seconds for the classroom scene? I just saw a video of someone testing 8x 3090s, and he rendered the classroom in 13 seconds. Weird.
@Just_AStudio
@Just_AStudio 2 years ago
What about VRAM? Does Blender use the combined VRAM of both GPUs?
@ricbattaglia6976
@ricbattaglia6976 2 years ago
Thanks, very interesting! Is a new PC with a 12400F and a 4090 good for interior-design rendering in Blender?
@gargomesh9736
@gargomesh9736 2 years ago
Should I connect both DP cables to one GPU or split the screens? I have 10 FTW3 3090s sitting around. If I used risers to string them up... would six work? Also, I thought GPUs don't work together without an SLI link? Or is this just another overpowered Blender thing...? So I just make sure OptiX is selected and BAM?
@DECODEDVFX
@DECODEDVFX 2 years ago
I only have one of mine connected to a display. Blender doesn't need SLI to use multiple GPUs - they don't even have to be the same model. However, multi-GPU support is limited to the VRAM of a single card unless you bridge the two cards.
@gargomesh9736
@gargomesh9736 2 years ago
@@DECODEDVFX I tried it: 5x 3090s using risers and 2x PSUs... it didn't work. My mobo was not happy. I'll just use two then and get a link for them. Cheers!
@enigmawstudios4130
@enigmawstudios4130 3 years ago
I was this close to buying a second 3090. Glad I rewatched this. It's nice if blowing the money doesn't leave a notch in your wallet, but otherwise I don't see a massive improvement. Do you think it's worth it? And what PSU are you using?
@FaddyVFX
@FaddyVFX 3 years ago
keep up the good work
@SooksVI
@SooksVI 2 years ago
Do you need the SLI bridge for tile-rendering programs like Blender, or can the program render using both cards simultaneously?
@DECODEDVFX
@DECODEDVFX 2 years ago
It's not needed. Blender can render with two different devices at the same time.
@DanielGrovePhoto
@DanielGrovePhoto 3 years ago
Now use K-Cycles X and speed it up another 3 to 5 times! It's amazing.
@ExpertManOfficial
@ExpertManOfficial 3 years ago
Yep, I agree!
@mixxpi1674
@mixxpi1674 3 years ago
Yeah, that’s cool and all, but the real question is how the hell did you manage to get not one, but TWO, 3090’s…?
@DECODEDVFX
@DECODEDVFX 3 years ago
It helps to know people at Nvidia and Scan.
@Optimus97
@Optimus97 3 years ago
@@DECODEDVFX I guess then I'm stuck with a GTX 1070 forever.
@thesenate4536
@thesenate4536 3 years ago
@@DECODEDVFX any idea when the next drop is going to be for the 30 series cards??
@cowslaw
@cowslaw 3 years ago
Bruh I'm just trying to get a single 3080...
@DECODEDVFX
@DECODEDVFX 3 years ago
@@thesenate4536 unfortunately not.
@muscle__rr
@muscle__rr 2 years ago
I think the only reason you're not getting a linear time reduction with two 3090s is that the 5950X and an X570 motherboard can't provide full x16 bandwidth on both PCIe slots. For that you'd need a Threadripper and a compatible motherboard. I may be wrong, but this is what I'm thinking.
@ghklfghjfghjcvbnc
@ghklfghjfghjcvbnc 2 years ago
Yup, true. Mediaman Studio Services tested it.
@HerraHazar
@HerraHazar 3 years ago
I read somewhere that using 2x 3090s actually slows down the workspace preview? Any thoughts on that?
@DECODEDVFX
@DECODEDVFX 3 years ago
I've run several tests and I can't find a noticeable difference between one or two 3090s enabled in the viewport.
@dpsociety
@dpsociety 2 years ago
Does that graphics card get very hot? (FE)
@thomandy
@thomandy 3 years ago
Fantastic video. I'm considering upgrading my setup, as my Acer Predator is over 7 years old at this point. For some reason it still works great, even having been used literally every single day for 7 years straight. But rendering in Cycles is not an option with a GTX 760. I can't afford the RTX 3090, but I was thinking an RTX 3080 Ti, an Intel i9 (12th gen) and 32GB RAM. I'm guessing that would be in the ballpark of an RTX 3090. Perhaps even an RTX 3080 would be enough for now. Thoughts?? Again, great video, just what I was looking for!!
@DECODEDVFX
@DECODEDVFX 3 years ago
The 3080 and 3080 Ti are both fantastic cards. Either one will be a big upgrade, but the Ti is definitely worth it if you can find one at a decent price.
@p_square
@p_square 3 years ago
No NVLink connector, or does it run fine without it?
@DECODEDVFX
@DECODEDVFX 3 years ago
I have one but it isn't needed unless you want to pool VRAM.
@p_square
@p_square 3 years ago
@@DECODEDVFX ah ok, thanks
@GameBacardi
@GameBacardi 2 years ago
...have you tested with NVLink?
@furynotes
@furynotes 3 years ago
Me: only has one RTX 3090. I was lucky enough to even get one. Good commentary on the benchmarks.
@Ben-ju2zw
@Ben-ju2zw 2 years ago
Hi, I'd like to know if you've faced any heating issues as a result of this particular dual-GPU build. Also, if you shift the second GPU to the third (bottom) slot, won't it affect the performance of the GPU in the first PCIe x16 slot? I'd also like to know which PSU you used for this build. Your inputs will be a BIG help! Big thanks!
@DECODEDVFX
@DECODEDVFX 2 years ago
No real heating issues. I couldn't move my second GPU down to another slot due to limited space in the case, but it hasn't really been a problem. On really long renders I'll sometimes use MSI Afterburner to turn the power limit on my GPUs down to 80%. It doesn't affect performance a great deal, and it stops things from getting too warm if I'm rendering for hours. My mobo can share PCIe lanes across all ports, so moving the second GPU down wouldn't affect the top-slot GPU. I don't remember which PSU I have off the top of my head. It's a 1200W; I think it's a Be Quiet! model.
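MSI Afterburner is a Windows GUI, but the same kind of cap can be scripted with nvidia-smi, which ships with the Nvidia driver. A hedged sketch (the 300 W figure and the GPU indices are assumptions for illustration, not recommendations for any particular card; changing power limits needs admin/root rights):

```python
import subprocess

# Cap the power draw of both GPUs before a long render session.
for gpu_index in (0, 1):
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", "300"],
        check=True,
    )

# Confirm temperatures and power draw stay in a comfortable range.
subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,temperature.gpu,power.draw,power.limit",
     "--format=csv"],
    check=True,
)
```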
@Ben-ju2zw
@Ben-ju2zw 2 years ago
@@DECODEDVFX Thanks a lot for sharing this. I really appreciate it!🙏 Btw your work is amazing 👍👍
@fyrn-qe9ul
@fyrn-qe9ul 1 year ago
I am really happy I decided to click on your video after all, dear sir! It is very informative and well explained, despite not being quite so up to date anymore. By far one of the best showcases I have seen up to this point. (And I just figured out that you're the same person whose "Godzilla movie speed-blending" [as I may call it] left me in awe two weeks ago. So I took the opportunity to subscribe this time around instead of forgetting again...) But perhaps you (or anyone else, for that matter) could humor me with an answer to a somewhat related question? How beneficial would it be to combine two different models instead of two of the same? (In my case, letting my "old" RTX 2070 Super rehome into a new system alongside an RTX 4080 as a little helping hand.) My research has brought me to the conclusion that Blender can make use of two different Nvidia GPUs for rendering (even if nothing else can), but nowhere could I find an answer to the question "Is it even worth the hassle to set up such a multi-GPU configuration?" It would be great if you could give me an answer, and I would be very grateful. Otherwise I am happy enough with all the help your video has already provided. Thank you. [Please excuse any weird sentence structure or the like. My English writing skills are a little rusty...]
@DECODEDVFX
@DECODEDVFX 1 year ago
The scene's memory has to fit into the VRAM of the smallest card, so you'd be limited to 8GB of total VRAM use. However, you can easily disable the 2070 Super in Blender whenever you want to render larger files. It doesn't really take any extra set-up: as long as both GPUs are installed on your motherboard, Blender should automatically recognize and use them both.
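That enable/disable step can also be scripted. A minimal sketch via the Cycles preferences API (the "2070" name fragment is just this thread's example; the same pattern works for any card Blender detects):

```python
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.get_devices()  # make sure the device list is populated

def set_device_enabled(name_fragment: str, enabled: bool) -> None:
    """Toggle every Cycles device whose name contains name_fragment."""
    for device in prefs.devices:
        if name_fragment.lower() in device.name.lower():
            device.use = enabled
            print("enabled" if enabled else "disabled", device.name)

# Big scene: disable the smaller card and use the other card's full VRAM.
set_device_enabled("2070", False)
```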
@morizanova8219
@morizanova8219 3 years ago
Cool! Waiting for you to share one of them with your subscribers 😀
@damiangaume
@damiangaume 2 years ago
Thanks for such good content! So basically you need to connect the two via an NVLink bridge if you want to combine the VRAM for scenes that need more than 24GB?
@DECODEDVFX
@DECODEDVFX 2 years ago
Yes, that's right.
@damiangaume
@damiangaume 2 years ago
@@DECODEDVFX 👌🏻 Thanks! Do you know if the cards are stuck working together when connected via NVLink? As in, can you still select just one of them to render simpler scenes that don't require the shared memory? Reason being, I want to avoid extra heat and complications if my scene doesn't need more than 24GB. 🙏🏻
@DECODEDVFX
@DECODEDVFX 2 years ago
@@damiangaume You can quickly disable a GPU in the settings.
@Sebastian-Florin
@Sebastian-Florin 2 years ago
Hi mate. For noobs, what options are there for practicing 3D scene modeling in Blender on a cloud computer? Thank you.
@DECODEDVFX
@DECODEDVFX 2 years ago
Unfortunately, I know nothing about using Blender with cloud computing.
@swaggybanana6909
@swaggybanana6909 3 years ago
How much time will it take for me, btw?
@nicobay4512
@nicobay4512 2 years ago
I have an NVIDIA GeForce RTX 3060 Ti, and I sometimes get the "Blender is out of GPU memory" error. Would this card really help or make a difference? Because it's a big-price card, and that's one card, not two.
@mjlagrone
@mjlagrone 3 years ago
For those who cannot afford (or find) a graphics card, moving from 2.93 to Blender 3.0 in Cycles is pretty much a free card upgrade.
@deepuvinil4565
@deepuvinil4565 3 years ago
Does it support AMD graphics cards?
@thornnorton5953
@thornnorton5953 3 years ago
@@deepuvinil4565 no
@mjlagrone
@mjlagrone 3 years ago
@@deepuvinil4565 Sadly, no. :( CPU rendering is faster, but until they rewrite the drivers for AMD (and Intel!), the only GPU rendering on 3.0a is Nvidia. They are working with both Intel and AMD on new drivers, but I have no idea how long before they are released.
@ChrisAirey1
@ChrisAirey1 3 years ago
Thank you for this very informative video. I have a 3080 in a setup from Scan (just the one, unfortunately). Have you got any videos going through the settings you recommend for rendering, compositing, etc.? Also, I saw a post from someone with an RTX card saying to alter the settings in the Nvidia control panel on his PC. Have you done this, or do you keep those at default?
@gooorylin
@gooorylin 3 years ago
How powerful is your PC for rendering and 3D modeling? I ask because I am planning to upgrade my PC from an AMD FX 8370 to at least an AMD Ryzen 7 2700X.
@TheJonathanExp
@TheJonathanExp 3 years ago
How do you survive with only 32GB of RAM?? Does the VRAM compensate? For my workflow, 64GB is the minimum to keep programs from crashing on massive projects.
@DECODEDVFX
@DECODEDVFX 3 years ago
Running low on RAM is rarely an issue for me. I have a project open right now with 4 million verts, and I still have 17 GB of RAM left.
@marcrupprath5131
@marcrupprath5131 1 year ago
I am using Blender 3.6 and two RTX 4060 cards. Unfortunately, I can't see any big improvement in speed when rendering the classroom scene, for example. Rendering with both cards using OptiX is only 50% faster than using a single card. Do you have any advice about render settings for dual-GPU use? Thanks in advance.
@DECODEDVFX
@DECODEDVFX 1 year ago
You may have a bottleneck somewhere else in your system (the RAM or CPU can't handle the extra data). If so, you'll probably see a larger difference on scenes that require minimal memory and compiling before rendering. Make sure you're using the latest Studio drivers from Nvidia and disable all the power-saving stuff that might throttle your GPUs. When I had a dual-GPU system, I often liked to have the cards rendering different frames at the same time: disable "overwrite files" in the render panel, set Blender to render every other frame, save, then open a second copy of Blender with the same file and hit render animation in both copies. That only works on animations, of course, but it can save time on files that require a lot of compiling or compositing between frames. It's worth mentioning that you won't get anywhere near double the speed from two cards; there are diminishing returns every time you add an extra card to your system.
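Here is a sketch of that alternate-frame trick as a script, assuming a hypothetical project.blend with a 240-frame animation. It launches two background Blender instances covering odd and even frames (each instance could also pin itself to one GPU with a --python-expr like the device-toggling snippet earlier in the thread):

```python
import subprocess

BLEND_FILE = "project.blend"  # placeholder path
END_FRAME = 240               # placeholder frame range

# Instance A starts at frame 1, instance B at frame 2; both step by two,
# so together they render every frame exactly once.
procs = [
    subprocess.Popen([
        "blender", "-b", BLEND_FILE,
        # Equivalent of unticking "Overwrite" in the Output panel.
        "--python-expr",
        "import bpy; bpy.context.scene.render.use_overwrite = False",
        "-s", str(start), "-e", str(END_FRAME),
        "-j", "2",  # frame step: skip one frame after each render
        "-a",       # render the animation
    ])
    for start in (1, 2)
]

for proc in procs:
    proc.wait()
```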
@marcrupprath5131
@marcrupprath5131 1 year ago
@@DECODEDVFX Thank you! In the meantime I have noticed that my bottleneck is my old system: in simple scenes, preparation time is often as long as rendering time, especially at low sample counts. With high sample counts and more complex scenes, the difference in speed is ~2x ;-)
@melomaniakjm
@melomaniakjm 3 years ago
I have a 2080 Ti. Could I add an eGPU through my Thunderbolt 3 port to have two GPUs in Blender? Thinking of adding a 3080 Ti or 3090. Do you need the same GPUs for optimal performance? Or will it still reduce GPU compute time, like when I use the 2080 Ti + i9-7960X together?
@DECODEDVFX
@DECODEDVFX 3 years ago
Mixing different GPUs should be fine as long as they are both from the same company. But Cycles dual rendering limits you to the VRAM of the smaller card (which probably won't be an issue in most cases).
@sefrautiq
@sefrautiq 3 years ago
Me: * reads the title * Me: "Guess we'll never know"
@magidmetwaly959
@magidmetwaly959 2 years ago
Were you using the NVlink during these benchmarks? @decoded
@DECODEDVFX
@DECODEDVFX 2 years ago
No. I have an NVLink, but it wasn't used for this testing.
@magidmetwaly959
@magidmetwaly959 2 years ago
@@DECODEDVFX Is there a reason why not? Is performance worse with NVLink, or would it be better?
@DECODEDVFX
@DECODEDVFX 2 years ago
@@magidmetwaly959 it doesn't seem to make much of a difference unless you're rendering a scene that requires more than 24GB of VRAM.
@Scragg-
@Scragg- 3 years ago
If you replace the thermal pads and paste in both cards, you might find, like me, that each card runs 15-20C cooler.
@3d-illusions
@3d-illusions 3 years ago
For the barbershop scene, try turning off OptiX.
@anthonyherrick1161
@anthonyherrick1161 2 years ago
How many polys is your classroom scene? Thanks for the vid!
@DECODEDVFX
@DECODEDVFX 2 years ago
50,000
@DidacticEditions
@DidacticEditions 1 year ago
Imagine using a copper box with plenty of space in between the GPUs, on a cold winter's day!
@DexterHe24
@DexterHe24 2 years ago
Hello sir, will dual GPUs (3090s) speed up viewport rendering?
@DECODEDVFX
@DECODEDVFX 2 years ago
Yes!
@DexterHe24
@DexterHe24 2 years ago
@@DECODEDVFX How much faster than with only one 3090?
@rushboardtechuk
@rushboardtechuk 3 years ago
Technically, SolidWorks can support the gaming RTX cards; they just prefer Quadro RTX due to the driver optimisation. However, you can use a Studio driver with gaming cards to improve this.
@apersunthathasaridiculousl1890
@apersunthathasaridiculousl1890 3 years ago
The fact that Pixar probably has 25 of these computers...
@DECODEDVFX
@DECODEDVFX 3 years ago
Pixar uses a render farm with 24,000 CPU cores. It's one of the largest supercomputers in the world.
@apersunthathasaridiculousl1890
@apersunthathasaridiculousl1890 3 years ago
@@DECODEDVFX 👁👄👁 i want
@AdamIsailovic
@AdamIsailovic 2 years ago
Hey mate, excellent video! I have a question: I have two 3090s in my setup, but the render viewport is really slow with OptiX. One 3090 is twice as fast in real-time rendering as two. Otherwise they work normally during an F12 render. If I turn off the denoiser with two 3090s it's really fast, but, yeah, noisy. Not a big problem, but I would still like to use both GPUs with OptiX real-time rendering. Any ideas on how to use the power of both GPUs during viewport rendering with OptiX, and do you have the same problem?
@DECODEDVFX
@DECODEDVFX 2 years ago
I'd suggest you update to the latest release-candidate version of Blender (3.3.2) from www.builder.blender.org. There was an issue that caused multi-GPU denoising to be slower in the viewport, but it has been fixed recently.
@AdamIsailovic
@AdamIsailovic 2 years ago
@@DECODEDVFX Thank you, I have 3.3.0, will try 3.3.2.
@AdamIsailovic
@AdamIsailovic 2 years ago
@@DECODEDVFX Sadly, 3.3.2 still has the same issue. Damn :D Oh well, maybe in 4.0 :D
@chrisholland6593
@chrisholland6593 3 years ago
I just bought a new PC with dual RTX 3090s. I've been using an iMac, CPU rendering with an i7-7700K via Arnold, so I can't wait to see the difference in my render times. It arrives in one week; it feels like Christmas!
@DECODEDVFX
@DECODEDVFX 3 years ago
It's going to blow you away.
@chrisholland6593
@chrisholland6593 3 years ago
@@DECODEDVFX I hope so! One thing I struggle with on my Mac is dynamic simulations, so I've gone for the Threadripper 3960X with 24 cores. I don't know much about computers... is that going to be able to cache most simulations? Nothing too crazy, but my Mac usually just freezes and crashes if I do anything other than basic rigid-body sims.
@DECODEDVFX
@DECODEDVFX 3 years ago
@@chrisholland6593 Yes, that should be a total beast!
@chrisholland6593
@chrisholland6593 3 years ago
@@DECODEDVFX It is unbelievably fast at rendering, but I wish I had a different CPU. C4D uses a single core; I wasn't aware. My dynamic simulations are still slow...
@gooorylin
@gooorylin 3 years ago
Are you personally using AMD Ryzen or FX?
@DECODEDVFX
@DECODEDVFX 3 years ago
Ryzen 5950x
@yan-anliu8127
@yan-anliu8127 2 years ago
How much better would this work if the cards were connected via NVLink?
@DECODEDVFX
@DECODEDVFX 2 years ago
It doesn't seem to make much of a difference for scenes that use less than 24GB of VRAM. NVlink is essentially just a memory bridge.
@himanshumundepi
@himanshumundepi 3 years ago
I like your thumbnail.
@DECODEDVFX
@DECODEDVFX 3 years ago
Thanks!
@cg.man_aka_kevin
@cg.man_aka_kevin 3 years ago
2 RTX 3090s???!!! You are brilliant... and rich. The RTX 3090 is extremely expensive right now because of the pandemic.
@PopcornSticker
@PopcornSticker 3 years ago
Your build is craaazy. What do you think of that processor? I've just ordered my render workstation with a Ryzen 5900X and dual RTX 3080 Tis. Half the graphics power of yours... will it cope?
@PopcornSticker
@PopcornSticker 3 years ago
* I'm an industrial designer and I want to boost my performance with GPUs.
@DECODEDVFX
@DECODEDVFX 3 years ago
The 5950X is fantastic. Your build should be more than capable for most projects.
@robsbassnutt
@robsbassnutt 3 years ago
What is your power supply, and how is it performing? I have a 3090 with a plan to get a second one in the future. Love Blender, Cycles, and animating! I paired my card with an AMD 3960X. I also have a 2950X with 1080 Tis in SLI. A little home farm for animating. Love your content!
@DECODEDVFX
@DECODEDVFX 3 years ago
I have a 1200W PSU from Be Quiet! The power draw doesn't seem to go over 350W per GPU under load, so I have loads of headroom for the rest of my system.
@robsbassnutt
@robsbassnutt 3 years ago
@@DECODEDVFX Thank you! I have a 1300W PSU, so I am pretty well equipped. This is great news!
@IronLordFitness
@IronLordFitness 3 years ago
This is my current setup: AMD Ryzen 3900X, 32GB DDR4, Aorus 2080 Super, Samsung EVO 1TB SSD, Seagate 2TB hard drive, Seasonic Platinum 1000W power supply, MSI Sekira 500X case. I'm struggling in some scenes. I try to keep my polygon count as low as possible (even though that's tough when I've got a lot of grass scattered everywhere, trees and so on), but my PC becomes very slow at times. My question is: since I can't afford a new GPU, will increasing my DDR4 to 64GB do anything? Thanks!
@DECODEDVFX
@DECODEDVFX 3 years ago
It probably won't have much of an effect. Right-click on the bar at the very bottom of Blender and enable the system memory statistics. If your scenes aren't maxing out your RAM, adding more won't really make a difference.
@JealouseStatement
@JealouseStatement 3 years ago
Damn, it's... it's very quick 🥴😳
@melvinch
@melvinch 3 жыл бұрын
Does Blender support a dedicated render-server filled with RTX cards ?
@akcivan
@akcivan 2 years ago
How many watts is your PSU, bro?
@DECODEDVFX
@DECODEDVFX 2 years ago
1200w
@akcivan
@akcivan 2 years ago
@@DECODEDVFX Can I run 2x 3080 Ti on a 1000W Gold+?
@DECODEDVFX
@DECODEDVFX 2 years ago
@@akcivan I'd say it's probably enough. 1200W seems to be overkill for my setup, and that's with a load of memory, fans, etc.
@knight2255
@knight2255 3 years ago
I have a 3080 and a 3060 Ti. Can they NVLink together?
@DECODEDVFX
@DECODEDVFX 3 years ago
No. Only the 3090 supports NVlink in the 30 series, unfortunately.
@button9
@button9 3 years ago
What are your settings for the classroom? I gave my 3060 a try and it rendered in 40 seconds at default settings.
@DECODEDVFX
@DECODEDVFX 3 years ago
I enabled adaptive sampling (noise threshold) but I didn't change anything else.
@button9
@button9 3 years ago
@@DECODEDVFX I did that as well. Weird. It was a nice improvement over the 1m30s that 2.93 rendered at.
@handsomelessking
@handsomelessking 3 years ago
OptiX worked with my 1070, but after some Blender and GPU driver updates it doesn't anymore.
@ghklfghjfghjcvbnc
@ghklfghjfghjcvbnc 2 years ago
haha... =P
@adammmtee
@adammmtee 2 years ago
I'm new to Blender, and I'm making an animation that is taking upwards of 2+ hours per frame and has actually maxed out my 6900 XT at times, where it runs out of memory and won't render. Maybe I'm using too many high-poly objects/textures? Or are AMD cards just not nearly as good for Blender?
@ghklfghjfghjcvbnc
@ghklfghjfghjcvbnc 2 years ago
AMD = NO CUDA CORES or RT Cores for YOU! Sh!t out of luck for you. Should have done your due diligence and researched it. =(
@adammmtee
@adammmtee 2 years ago
@@ghklfghjfghjcvbnc I mean, I got the card for gaming first. And then a year later decided to try out Blender
@Belidos3D
@Belidos3D 3 years ago
I just recently upgraded from a 1070 Ti to a 3060, and the difference just between those two is amazing. I can't imagine using two 3090s.
@koctf3846
@koctf3846 3 years ago
I do have two 3090s. The cooling issue is really noticeable during long, intense rendering periods; one of the cards can run 10-15% slower.
@uzairbukhari99
@uzairbukhari99 3 years ago
Love it. Cycles X is so superior. I tried the sky demo with OptiX and it took 6s to finish on an RTX 3070.
@DECODEDVFX
@DECODEDVFX 3 years ago
I've been using Cycles X since the first experimental build. I had to use 2.9 recently and it seemed so slow compared to what I'm used to.
@TorIvanBoine
@TorIvanBoine 3 years ago
Does anyone know how the 3070 compares to the 1070? :)
@jannchavez9257
@jannchavez9257 3 years ago
Who needs a Bugatti when you can flex 2 3090s?
@ExpertManOfficial
@ExpertManOfficial 3 years ago
Now we need to compare Cycles X and ProRender...
@Utonian21
@Utonian21 3 years ago
Now I'm imagining what it would be like to build a gaming PC with two 3090s...
@SplendidNinja
@SplendidNinja 3 years ago
Overkill
@Mesusir
@Mesusir 3 years ago
Great video. If you rendered a Terminator movie in Eevee, would it be near real time? :D
@DECODEDVFX
@DECODEDVFX 3 years ago
EEVEE viewport playback is real time. Renders take less than one second per frame.
@dc37009
@dc37009 1 year ago
A cheaper version of this: an RTX 3060 12GB for renders and an old GTX 1060 6GB to run a monitor and a 22'' Wacom as my digital sculpting rig. Printed tiled-template "pages" or STLs go to a 3D printer. Rapid proto, baby! Add traditional molding and casting workstations and you've got a business model!
@KenzGX
@KenzGX 3 years ago
As I remember, there's a PC builder company offering 4x 3090 or 8x 3090 systems for that kind of work.
@lascaustefan4087
@lascaustefan4087 3 years ago
Awesome video!! Indeed, the 3090 is a beast when it comes to rendering. Before getting it I was using a laptop with a 1060 6GB, and scenes that would've taken 20-30 minutes on my laptop now take 1-2 minutes on the 3090. I also have the smaller brother of your case, the Lian Li O11, and there are two things I would like to point out. 1st: wouldn't adding some fans on the bottom improve the temperatures of the cards? I bought the Lian Li mainly for this reason, it being one of the few cases that can provide direct airflow to the cards, and it also works really well (in my opinion) with how the cooler of the 3090 was designed ^^. 2nd: have you checked the memory temperature on your cards? That was my main issue with my 3090: the thermal pads used on mine (and others; it's a known issue) don't make good enough contact, and the memory temperature can easily surpass 105C when rendering (and sits around 98C when playing video games). Because of that I keep my card at 68% power with an aggressive fan profile, so when rendering I get around 90-92C on the memory. The main issue here is that the memory is rated at ~95C, and rendering for a long time at 105C will shorten its lifespan. There are some guides on how to change the thermal pads and lower the temps to 80-86C, but I haven't had the chance to do so :D
@DECODEDVFX
@DECODEDVFX 3 years ago
There won't be enough room for bottom-mounted case fans once the second GPU is moved down to the bottom slot. Although I need to mod my case before I can actually move it down (there aren't enough IO cut-outs for a 3090 mounted to the bottom PCIE). As for memory, it does run a little hot when both GPUs have been rendering for a long time. I've seen the VRAM thermal pad mod, but I haven't considered it yet. Most of the time everything runs surprisingly cool, and the temps should come down even further once I get the second GPU moved down. I've never seen it throttle significantly, and the fans rarely go above about 60%.
@Oreoezi
@Oreoezi 3 years ago
Please try Octane with two 3090s :) I'd love to see that.
@DECODEDVFX
@DECODEDVFX 3 years ago
I saw a video a while back where someone was using Octane with dual 3090s, and it looked very fast.
@chrisliddiard725
@chrisliddiard725 2 years ago
I would love to see these tests with the 4090, just to see how one 4090 compares with two 3090s. In other words, if you had a 3090, would you sell it to buy a 4090, or buy another, cheaper 3090? [A 3090 on eBay is currently around £650 and falling.] Also, can Blender be 'tuned' to render animations, e.g. by taking advantage of multiple GPUs?
@DECODEDVFX
@DECODEDVFX 2 years ago
I just swapped my dual 3090s for one 4090. It's about 10-30% faster depending on the scene, but it's much quieter and more power efficient. The only thing I miss about dual 3090s is the ability to split the workload. On scenes with long build times per frame it was faster to render using two copies of blender at the same time, each using one card. But I'm very happy with the upgrade to one 4090 overall.