Thank you for your videos! I bought a 3090 for my startup; we work on computer vision (object detection, depth estimation). It runs almost 3x faster than my old 3070 on the MobileDet detector because of the larger batch size. At the same batch size, the 3090 is roughly 2x a 3070 in speed.
@blue-pi2kt 2 years ago
Do you think it's RAM-related, or something else on the chip?
@powerHungryMOSFET 1 year ago
What about the 4070 Ti? Would you recommend it for deep learning?
@adityay525125 3 years ago
Hey Dr. Heaton, thank you for this much-requested video. Keep up the awesome work; you are an inspiration to many in the deep learning community :)
@HeatonResearch 3 years ago
Thank you, glad it is helpful.
@oscarjeong9438 1 year ago
After upgrading from a 3090 Ti to a 4090, I saw no difference in training time when training InceptionResNet-V2 on an 8-class classification task. But there was a significant training-time improvement going from the 3090 to the 3090 Ti.
@travelthetropics6190 3 years ago
I am currently training StyleGAN2-ADA on an RTX 3060; your previous videos helped me a lot, and 12 GB of VRAM is good to have. Can you include the RTX 3060 in the next video?
@arnob3196 2 years ago
How long does it take to train?
@csqr 3 years ago
Great video. I'm building a 3090 machine for ML, so I'm hoping more of your content (like this one) can address hobbyist/autodidact ML folks like me.
@financelab_ai 1 year ago
Thanks, this is an excellent review. I have some notes on the 3060 for those who are considering it. I ended up getting a new 3060 12 GB in 2023. I used Kaggle's free tier a lot (P100, 16 GB); the 3060's training times are similar to the Kaggle free tier at the same batch sizes. 12 GB works really well for me. It's a good deal for a low-cost GPU.
@muontau4548 2 years ago
Thank you for providing this valuable content to the community! I'm a data scientist with a strong background in Bayesian statistics, looking to get into deep learning. I'm building a workstation with a 5950X and 128 GB of RAM to learn by doing various projects. Should I invest in a 3090 to future-proof myself, or do you think a 3080 Ti (12 GB) will be a safe choice for the next ~2 years? I want to learn as much as possible during that period to shift my career a bit.
@muontau4548 2 years ago
To be honest, I'm also considering the RTX A5000, which now costs around the same as a 3090 but might retain better resale value in the long run... and it has 24 GB as well.
@__--JY-Moe--__ 3 years ago
great vid Prof! thanks so much 4 being U!! love this stuff!! very minimal !! easy 2 get up, and running!
@inavoid9232 2 years ago
Thank you! I was looking at two 3060 Ti 8 GB cards, which cost the same as a single 3080. So grateful for this video. Buying the 3080!
@luisluiscunha 1 year ago
Hello Jeff: thank you very much for your interesting content, which I have followed since the pre-2012 days, when NNs in Java were a huge part of our common interests. Kind regards from Portugal.
@84mchel 3 years ago
Great vid, thanks!! Do you recommend the 3090 for inference? Looking for a GPU for large NLP models like GPT-J.
@uatiger1 3 years ago
I have an HP Omen laptop with an RTX 3080 16 GB, 32 GB of RAM, and an 11th-gen i7. I bought this laptop specifically for my experiments with transformers.
@HeatonResearch 3 years ago
Oh cool, that should work out well. Obviously large-scale transformers are trained on grids of GPUs; however, this should be great for many tasks. I rarely use more than 16 GB.
@uatiger1 3 years ago
@@HeatonResearch Thanks. I have found the same for most algorithms. GPT-2 and GPT-Neo work fine and can even be fine-tuned on this machine. What I found surprising was that MS Teams starts using 40% of the GPU. That is so outlandish 😂
@samuelparada9656 2 years ago
Hi Jeff! I just have one question: if I have multiple RTX 3070s, for example, can their VRAM be summed? So 4 cards of 8 GB = 32 GB of VRAM? Thanks for your amazing content!
@-nomocp-156 1 year ago
Yes
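A caveat worth adding to that "yes": with standard data parallelism, each GPU holds a full replica of the model, so four 8 GB cards do not behave like one 32 GB pool for the weights themselves; only the batch (and its activations) is split across cards. A minimal PyTorch sketch of this behavior, using a hypothetical toy model (it falls back to CPU when no GPUs are present):

```python
import torch
import torch.nn as nn

# Hypothetical toy model; any nn.Module works the same way.
model = nn.Linear(512, 10)

if torch.cuda.device_count() > 1:
    # DataParallel replicates the model on every GPU and splits each
    # batch across them: the model itself must still fit on ONE card,
    # but per-card activation memory shrinks, so only the batch
    # effectively "pools" across 4 x 8 GB, not the weights.
    model = nn.DataParallel(model)

if torch.cuda.is_available():
    model = model.cuda()

x = torch.randn(32, 512)  # batch of 32, split across GPUs if present
if torch.cuda.is_available():
    x = x.cuda()

out = model(x)
print(out.shape)  # torch.Size([32, 10])
```

Splitting a single model that is too big for one card requires model parallelism (e.g. sharding layers across devices), which is a different and more involved setup.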
@aayushgarg5437 1 year ago
Hey Jeff, can you suggest what kind of specs (storage, chipset, memory, cooling) one should go for when building a first single-GPU workstation? I am set on getting an RTX 3090 (given its 24 GB of VRAM). What kind of nominal specs should I aim for on a limited budget? Maybe you could make a video on it :)
@mahdimoosavi2109 2 years ago
Hi Dr. Heaton, I really enjoy your informative videos on YouTube, and I was wondering if you could help me. I am a medical doctor who has just begun with ML (currently doing a doctoral thesis in this field) and am planning my own build. Should I go for the 12 GB 3060 or the 12 GB 3080 and later in my career add a 24 GB 3090 (or 4090!) to my build, or just buy the 3090 now?
@PuppetMasterdaath144 3 months ago
3070 up scrotum
@alanorther 2 years ago
Thanks for the video. I bought a 3090 for gaming (because I couldn't get any other cards) and am looking into taking ML courses.
@Ultrajamz 1 year ago
Curious how the 40-series stacks up, and how they all compare to the 2080 Ti (what I have now).
@paulpvhl1930 2 years ago
So disappointed you didn't test the 12 GB 3060 on the same system (I don't care at all about the 3060 Ti, for the reason you specify: less than 12 GB of memory). Surprised that SLI, only available on the 3090 and now basically abandoned by NVIDIA, made so little difference.
@HeatonResearch 2 years ago
Yes, I wish I had a 3060 12GB to test with. They are probably easier to get ahold of now.
@paulpvhl1930 2 years ago
@@HeatonResearch About $500 in Canada right now from various brands. Best I can do for AI right now, though even that's down the road a bit. (Got to finish fixing my boat :0)
@RamirDuria 7 months ago
I have yet to run into a low CUDA memory error with my 3060 Ti, and the dataset I'm using isn't exactly small either. But yeah, the 3060 12 GB would always be the better option if you're into machine learning.
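For anyone who does hit CUDA out-of-memory errors on an 8 GB card, a common workaround is to probe for the largest batch size that fits. A rough PyTorch sketch (the `find_max_batch_size` helper is a hypothetical illustration, not from the video):

```python
import torch
import torch.nn as nn

def find_max_batch_size(model, input_shape, device, start=256):
    """Halve the batch size until a forward/backward pass fits in memory."""
    batch = start
    while batch >= 1:
        try:
            x = torch.randn(batch, *input_shape, device=device)
            model(x).sum().backward()
            return batch
        except RuntimeError:
            # CUDA out-of-memory surfaces as a RuntimeError subclass.
            if device == "cuda":
                torch.cuda.empty_cache()
            batch //= 2
    return 0

# Toy demo on CPU; on a 3060 Ti you would pass device="cuda".
model = nn.Linear(128, 10)
print(find_max_batch_size(model, (128,), "cpu", start=8))  # 8
```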
@deepfriedavocado 3 years ago
Although GDDR6X is faster 1:1, when working with large projects such as dynamic analysis or large-scale ray tracing, it can be better to have "more" GDDR6 in a card such as the RTX A6000.
@papsaebus8606 2 years ago
It would be really interesting to see how video memory limitations work on M1/M1 Pro/M1 Max machines; they can technically allocate nearly all of their 16-64 GB as VRAM. Interesting to see how far one could push batch sizes :)
@denysivanov3364 2 years ago
Unfortunately it's not optimized for NN work, so it can be tested on a Mac, and then welcome to an NVIDIA server. For example, for the chess engine Leela Chess Zero with an NN, the M1 does about 128 nodes per second, while an A100 can do 30-40 thousand nps, and even faster once the search tree becomes very deep (with a big 300 MB+ network of 40 blocks x 512 filters).
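For those who still want to experiment on Apple Silicon, recent PyTorch builds expose the unified memory through the MPS backend. A small device-selection sketch, assuming a PyTorch version that ships `torch.backends.mps` (it degrades gracefully to CPU otherwise):

```python
import torch

# Pick the best available backend: CUDA (NVIDIA), MPS (Apple Silicon
# unified memory, shared with system RAM), else CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif getattr(torch.backends, "mps", None) is not None and torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

x = torch.ones(4, 4, device=device)
print(device.type, x.sum().item())  # e.g. "mps 16.0" on an M1
```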
@ranati2000 2 years ago
So there is no way to cope with that VRAM issue? I bought an RTX 3060 Ti 8 GB, and after watching the video I'm a bit scared it will affect me in the future.
@ToddWBucy-lf8yz 6 months ago
Would love to see this updated for the RTX 4000 series.
@karthikbalu5824 1 year ago
Is the RTX 4000 SFF with 20 GB a good buy?
@talha_anwar 3 years ago
Don't make these videos; it hurts when you have a 3070.
@701eeepc 2 years ago
Hi Jeff, please confirm whether I got you right: you advise those who want to start with deep learning projects to go for the RTX 3060, even though it has fewer cores than the others but much more RAM. At the moment it has a very good price, and I was thinking of buying it. Please let me know.
@HeatonResearch 2 years ago
Yes, as a beginner, fewer cores will just slow you down a bit, but you will not likely be doing long-running complex work. Lack of RAM will often stop you in your tracks. There are tricks to save RAM, but those are not necessarily the most important things to be learning first.
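One of the RAM-saving tricks Jeff alludes to is gradient accumulation: run several small micro-batches and step the optimizer once, mimicking a large batch without the VRAM cost. A minimal PyTorch sketch with a toy model (the sizes here are illustrative assumptions):

```python
import torch
import torch.nn as nn

model = nn.Linear(32, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

accum_steps = 4   # effective batch = micro_batch * accum_steps = 32

opt.zero_grad()
for _ in range(accum_steps):
    x = torch.randn(8, 32)                     # micro-batch of 8
    loss = model(x).pow(2).mean() / accum_steps
    loss.backward()                            # grads accumulate in .grad
opt.step()                                     # one update, as if batch=32
```

Only one micro-batch of activations is ever resident at a time, which is why this fits where a full batch would not.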
@knjpollard 2 years ago
The RTX 3060 with 12 GB of VRAM would have been interesting.
@HeatonResearch 2 years ago
Unfortunately, I have not been able to get ahold of one of those! But I agree.
@BradHelm 3 years ago
Interesting! I've been able to buy a few NVIDIA 30-series GPUs, but not a 3090, which is what I wanted for its extra VRAM. Until I can get one, I am stuck using a 3080 Ti for my machine learning projects.
@FsimulatorX 2 years ago
Curious what kind of projects you are pursuing? I'm trying to figure out what kind of project would demand something like a 3080 or 3090+. Are you working in a lab, or is this individual/hobby work?
@BradHelm 2 years ago
@@FsimulatorX Image processing, detection, and classification are one area I enjoy tinkering in, and some of the models I use push past 12 GB easily. It's personal education, so I guess you could call it a hobby.
@Joel11111 2 years ago
How do those GPUs not thermal throttle like crazy so close together? Are they blower-style fans?
@HaunterButIhadNameGagWtf 2 years ago
Can you run something on AMD GPUs? They have, e.g., 16 GB of RAM even in lower-end GPUs.
@andrewh5640 3 years ago
I think you have an A5000... wondering how much of a performance difference there is between the A5000 and the 3090, especially for image work.
@HeatonResearch 3 years ago
I have an A6000. The 3090 and A6000 are similar in processing speed if you are dealing with something that fits into the 3090's memory. Beyond that is where the A6000 shines. A6000's are also easier to pack densely in a machine, as they use roughly the equivalent of a blower.
@jimishreds 2 years ago
If I have an RTX 2080 Super, would an upgrade to a Titan RTX or an RTX 3080 Ti (same price) be the better option?
@FsimulatorX 2 years ago
I would maybe wait for the 4000 series, given the current price markup
@babu3009 2 years ago
What about the RTX A4500 20 GB card?
@Patrick-oi9lj 3 years ago
Just because I'm curious and want to get my feet wet in machine learning: what about AMD Radeon cards? They are affordable, and the 6800 series has 16 GB of VRAM.
@travelthetropics6190 3 years ago
You need CUDA most of the time. AMD ROCm is not that useful in ML.
@arpanmanchanda 3 years ago
Pls also include the 3060
@HeatonResearch 3 years ago
I only had access to a 3060 Ti for this video.
@atharvahude 3 years ago
What about the Titan RTX?
@HeatonResearch 3 years ago
It is not a member of the 30 series, so out of scope of this video.
@atharvahude 3 years ago
@@HeatonResearch thanks
@carlosh9732 2 years ago
Yikes, so as a beginner it is not possible to use an RTX 3070? Even if it's a slow process?
@mason6300 2 years ago
Of course you can! I think he was pretty misleading in this video. It completely depends on what kind of machine learning you're doing, but I haven't come across any task I can't get working on my 3070. I might have to change a few settings if a project was designed for, say, a 3090, but that goes for any card you're using to get optimal performance.
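One of the "few settings" that often makes a project fit on a smaller card like a 3070 is automatic mixed precision, which roughly halves activation memory on the GPU. A hedged PyTorch sketch with a toy model (it falls back to plain FP32 on CPU):

```python
import torch
import torch.nn as nn

use_cuda = torch.cuda.is_available()
device = "cuda" if use_cuda else "cpu"

model = nn.Linear(64, 64).to(device)
opt = torch.optim.SGD(model.parameters(), lr=0.01)
# GradScaler guards FP16 gradients against underflow; a no-op on CPU.
scaler = torch.cuda.amp.GradScaler(enabled=use_cuda)

x = torch.randn(16, 64, device=device)
with torch.autocast(device_type=device, enabled=use_cuda):
    loss = model(x).mean()                 # runs in FP16 on the GPU
scaler.scale(loss).backward()
scaler.step(opt)
scaler.update()
```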
@0Zed0 3 years ago
I have time but not much cash so a 3060 sounds like my best choice.
@HeatonResearch 3 years ago
I like the 3060, especially the 12GB version. You would learn much from it. You would also learn how to optimize for more limited hardware. I learned a great deal from the days when I had more time than cash!
@denysivanov3364 2 years ago
A100 is
@wholetsthedogoutwldo5060 3 years ago
Dear Jeff, I make my living with AI, and the hardware cost is high... Can you help with how much GPU RAM I need for these scenarios with the most-used DL models?
Pictures: 8000 images, at 1920x1080, 4K, 8K
Videos: 30-second clips, at 720p, 1080p, 4K, 8K
VR content: stereo, 2x 4K per eye
Sound: 7.1 channel, 24-bit 384 kHz, 100 samples, 30 seconds
PS: I have a GPU cloud with 20 GPUs.
Thx, Jan
@Rolandas3K 1 year ago
Thanks for the sad video. I understand it's not even worth dreaming about ML on laptops... Or is it still possible, and if so, for which types of tasks?