NVIDIA GeForce RTX 3090 vs 3080 vs 3070 vs 3060Ti for Machine Learning

40,812 views

Jeff Heaton

1 day ago

Comments: 68
@valdisgerasymiak1403 3 years ago
Thank you for your videos! I bought a 3090 for my startup. We work on computer vision (object detection, depth estimation), and it runs almost 3x faster on a MobileDet detector compared to my old 3070 because of the larger batch size. With the same batch size, the 3090 is about 2x the 3070 in speed.
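The batch-size effect described here can be sketched with back-of-the-envelope arithmetic. The sizes below (model footprint, per-sample activation memory) are illustrative assumptions, not measured numbers from the video:

```python
# Rough sketch: why a 24 GB card (3090) can run much larger batches than
# an 8 GB card (3070). All sizes are assumed for illustration.

def max_batch_size(vram_mb, model_mb, per_sample_mb):
    """Largest batch whose activations fit after the model weights are loaded."""
    return (vram_mb - model_mb) // per_sample_mb

MODEL_MB = 2048      # weights + optimizer state (assumed)
PER_SAMPLE_MB = 50   # activation memory per sample (assumed)

bs_3070 = max_batch_size(8 * 1024, MODEL_MB, PER_SAMPLE_MB)
bs_3090 = max_batch_size(24 * 1024, MODEL_MB, PER_SAMPLE_MB)
print(bs_3070, bs_3090)  # 122 450 -> roughly 3.7x larger batches on the 3090
```

Larger batches keep the GPU's compute units fed, which is where much of the reported speedup comes from.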
@blue-pi2kt 2 years ago
Do you think it's RAM related, or something else on the chip?
@powerHungryMOSFET 1 year ago
What about the 4070 Ti? Would you recommend it for deep learning?
@adityay525125 3 years ago
Hey Dr. Heaton, thank you for this much-requested video. Keep up the awesome work; you are an inspiration to many in the deep learning community :)
@HeatonResearch 3 years ago
Thank you, glad it is helpful.
@oscarjeong9438 1 year ago
After upgrading from a 3090 Ti to a 4090, there was no training-time difference between them when I trained InceptionResNet-V2 on an 8-class classification task. But there was a significant training-time improvement between the 3090 Ti and the 3090.
@travelthetropics6190 3 years ago
I am currently training StyleGAN2-ADA on an RTX 3060; your previous videos helped me a lot. 12 GB of VRAM is good to have. Can you include the RTX 3060 in the next video?
@arnob3196 2 years ago
how long does it take to train?
@csqr 3 years ago
Great video. I'm building a 3090 machine for ML, so I'm hoping more of your content (like this one) can address hobbyist/autodidact ML folks like me.
@financelab_ai 1 year ago
Thanks, this is an excellent review. I have some notes on the 3060 for those who are considering it. I ended up getting a new 3060 12 GB in 2023. I used Kaggle's free tier a lot (P100, 16 GB). The 3060's training times are similar to the Kaggle free tier for the same batch sizes. 12 GB works really well for me. It's a good deal for a low-cost GPU.
@muontau4548 2 years ago
Thank you for providing this valuable content to the community! I'm a data scientist with a strong background in Bayesian statistics, looking to get into deep learning. I'm building a workstation with a 5950X and 128 GB of RAM to learn by doing various projects. Should I invest in a 3090 to future-proof myself, or do you think the 3080 Ti (12 GB) will be a safe choice for the next ~2 years? I want to learn as much as possible during that period to shift my career a bit.
@muontau4548 2 years ago
To be honest, I'm also considering the RTX A5000, which now costs around the same as a 3090 but might retain better resale value in the long run... and it has 24 GB as well.
@__--JY-Moe--__ 3 years ago
great vid Prof! thanks so much for being you!! love this stuff!! very minimal!! easy to get up and running!
@inavoid9232 2 years ago
Thank you! I was looking at two 3060 Ti 8 GB cards, which cost the same as a single 3080. So grateful for this video. Buying the 3080!
@luisluiscunha 1 year ago
Hello Jeff: thank you very much for your interesting content, which I have followed since before 2012: back then, neural networks in Java were a huge part of our common interests. Kind regards from Portugal.
@84mchel 3 years ago
Great vid, thanks!! Would you recommend the 3090 for inference? I'm looking for a GPU for large NLP models like GPT-J.
@uatiger1 3 years ago
I have an HP Omen laptop with an RTX 3080 (16 GB), 32 GB of RAM, and an 11th-gen i7. I bought this laptop specifically for my experiments with transformers.
@HeatonResearch 3 years ago
Oh cool, that should work out well. Obviously, large-scale transformers are trained on grids of GPUs; however, this should be great for many tasks. I rarely use more than 16 GB.
@uatiger1 3 years ago
@@HeatonResearch Thanks. I have found the same for most algorithms. GPT-2 and GPT-Neo work fine and can even be fine-tuned on this machine. What I found surprising was that MS Teams starts using 40% of the GPU. That is so outlandish 😂
@samuelparada9656 2 years ago
Hi Jeff! I just have one question: if I have multiple RTX 3070s, for example, can their VRAM be summed? So 4 cards of 8 GB = 32 GB of VRAM? Thanks for your amazing content!
@-nomocp-156 1 year ago
Yes
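A caveat worth adding to this answer: under plain data parallelism (the default multi-GPU mode in most frameworks), each card holds a full copy of the model, so VRAM "sums" only for splitting the batch, not for fitting a bigger model. A pure-Python sketch with assumed sizes:

```python
# What 4x 8 GB cards buy you under plain data parallelism (assumed sizes).

def fits_on_one_gpu(model_gb, batch_gb, vram_gb=8.0):
    """Each GPU must hold the FULL model plus its share of the batch."""
    return model_gb + batch_gb <= vram_gb

def per_gpu_batch(total_batch, n_gpus):
    """Data parallelism splits the batch; the model is replicated, not split."""
    return total_batch / n_gpus

# A 10 GB model still does not fit on any single 8 GB card:
assert not fits_on_one_gpu(model_gb=10.0, batch_gb=0.0)

# What multiple cards do give you is a larger total batch:
print(per_gpu_batch(256, 4))  # 64.0 samples per card
```

Model parallelism or sharding (splitting layers across cards) can spread a large model over several GPUs, but that is a different setup with its own overhead.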
@aayushgarg5437 1 year ago
Hey Jeff, can you suggest what kind of specs (storage, chipset, memory, cooling) one should go for when building a first single-GPU workstation? I am set on getting an RTX 3090 (given its 24 GB of VRAM). What nominal specs should I aim for on a limited budget? Maybe you could make a video on it :)
@mahdimoosavi2109 2 years ago
Hi Dr. Heaton, I really enjoy your informative videos on YouTube, and I was wondering if you could help me. I am a medical doctor who has just begun with ML (currently doing a doctoral thesis in this field) and am planning my own build. Should I go for the 12 GB 3060 or the 12 GB 3080 and later in my career add a 24 GB 3090 (or 4090!) to my build, or just buy a 3090 now?
@PuppetMasterdaath144 3 months ago
3070 up scrotum
@alanorther 2 years ago
Thanks for the video. I bought a 3090 for gaming (because I couldn't get any other cards) and am looking into taking ML courses.
@Ultrajamz 1 year ago
Curious how the 40 series stacks up, and how they all compare to the 2080 Ti (what I have now).
@paulpvhl1930 2 years ago
So disappointed you didn't test the 12 GB 3060 on the same system (I don't care at all about the 3060 Ti, for the reason you specify: memory less than 12 GB). Surprised that SLI, available only on the 3090 and now basically abandoned by NVIDIA, made so little difference.
@HeatonResearch 2 years ago
Yes, I wish I had a 3060 12GB to test with. They are probably easier to get ahold of now.
@paulpvhl1930 2 years ago
@@HeatonResearch About $500 in Canada right now from various brands. Best I can do for AI right now, though even that's down the road a bit. (Got to finish fixing my boat :0)
@RamirDuria 7 months ago
I have yet to run into a CUDA out-of-memory error with my 3060 Ti, and the dataset I'm using isn't exactly small either. But yeah, the 3060 12 GB would always be the better option if you're into machine learning.
@deepfriedavocado 3 years ago
Although GDDR6X is faster one-to-one, when working with large projects such as dynamic analysis or large-scale ray tracing, it can be better to have "more" GDDR6 in a card such as the RTX A6000.
@papsaebus8606 2 years ago
It would be really interesting to see how video memory limitations play out on M1/M1 Pro/M1 Max machines; they can technically allocate all 16-64 GB of unified memory as VRAM. Interesting to see how far one could push batch sizes :)
@denysivanov3364 2 years ago
Unfortunately, it's not optimized for NN work, so you can prototype on the Mac and then move to an NVIDIA server. For example, for the chess engine Leela Chess Zero, an M1 gets about 128 nodes per second, while an A100 can reach 30-40 thousand nps, and even more as the search tree gets very deep (with a big 300 MB+, 40-block x 512-filter network).
@ranati2000 2 years ago
So, is there no way to cope with that VRAM issue? I bought an RTX 3060 Ti 8 GB, and after watching the video I'm a bit scared it will hold me back in the future.
@ToddWBucy-lf8yz 6 months ago
Would love to see this updated for the RTX 4000 series.
@karthikbalu5824 1 year ago
The RTX 4000 SFF with 20 GB: is it a good buy?
@talha_anwar 3 years ago
Don't make these videos; it hurts when you have a 3070.
@701eeepc 2 years ago
Hi Jeff, please confirm whether I got you right: you advise those who want to start with deep learning projects to go for the RTX 3060, because even though it has fewer cores than the others, it has much more RAM. At the moment it has a very good price and I was thinking of buying it. Please let me know.
@HeatonResearch 2 years ago
Yes, as a beginner, fewer cores will just slow you down a bit, and you will likely not be running long, complex jobs. Lack of RAM, on the other hand, will often stop you in your tracks. There are tricks to save RAM, but those are not necessarily the most important things to learn first.
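One of the RAM-saving tricks alluded to here is gradient accumulation: train on small micro-batches and average their gradients, mimicking a larger batch. A toy sketch in plain Python with a one-parameter linear model (an illustration, not code from the video):

```python
# Gradient of mean squared error for a toy model y = w * x.
def grad_w(w, xs, ys):
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

w = 0.5
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

# Full-batch gradient: needs all activations in memory at once.
full = grad_w(w, xs, ys)

# Two micro-batches of two samples, gradients averaged: same update,
# roughly half the peak activation memory.
acc = (grad_w(w, xs[:2], ys[:2]) + grad_w(w, xs[2:], ys[2:])) / 2
print(full, acc)  # -22.5 -22.5
```

Mixed precision and smaller batch sizes are other common ways to trade speed or convenience for memory.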
@knjpollard 2 years ago
The RTX 3060 with 12 GB of VRAM would have been interesting.
@HeatonResearch 2 years ago
Unfortunately, I have not been able to get ahold of one of those! But I agree.
@BradHelm 3 years ago
Interesting! I've been able to buy a few NVIDIA 30-series GPUs, but not a 3090, which is what I wanted for its extra VRAM. Until I can get one, I am stuck using a 3080 Ti for my machine learning projects.
@FsimulatorX 2 years ago
Curious what kind of projects you are pursuing? I'm trying to figure out what kind of project would demand something like a 3080 or 3090+. Are you working in a lab, or is this individual/hobby work?
@BradHelm 2 years ago
@@FsimulatorX Image processing, detection, and classification are areas I enjoy tinkering in, and some of the models I use easily push past 12 GB. It's personal education, so I guess you could call it a hobby.
@Joel11111 2 years ago
How do those GPUs not thermal-throttle like crazy so close together? Are they blower-style fans?
@HaunterButIhadNameGagWtf 2 years ago
Can you run something like this on AMD GPUs? They have, for example, 16 GB of RAM on lower-end cards.
@andrewh5640 3 years ago
I think you have an A5000... wondering how much of a performance difference there is between the A5000 and the 3090, especially for image work.
@HeatonResearch 3 years ago
I have an A6000. The 3090 and A6000 are similar in processing speed if you are dealing with something that fits into the 3090's memory. Beyond that is where the A6000 shines. A6000's are also easier to pack densely in a machine, as they use roughly the equivalent of a blower.
@jimishreds 2 years ago
If I have an RTX 2080 Super, would an upgrade to a Titan RTX or an RTX 3080 Ti (same price) be the better option?
@FsimulatorX 2 years ago
I would maybe wait for the 4000 series, given the current price markup
@masterpig5s 3 years ago
16 minute crew Perceived 85th view, 75th like, 0 dislikes and 8th comment
@masterpig5s 3 years ago
*Perceived 75th view *1st comment
@masterpig5s 3 years ago
*8th like Really did mess up my numbers. I was concentrating on getting the bell for the right bus stop.
@babu3009 2 years ago
What about the RTX A4500 20 GB card?
@Patrick-oi9lj 3 years ago
Just because I'm curious and want to get my feet wet in machine learning: what about AMD Radeon cards? They are affordable, and the 6800 series has 16 GB of VRAM.
@travelthetropics6190 3 years ago
You need CUDA most of the time. AMD ROCm is not that useful for ML.
@arpanmanchanda 3 years ago
Please also include the 3060.
@HeatonResearch 3 years ago
I only had access to a 3060 Ti for this video.
@atharvahude 3 years ago
What about the Titan RTX?
@HeatonResearch 3 years ago
It is not a member of the 30 series, so out of scope of this video.
@atharvahude 3 years ago
@@HeatonResearch thanks
@carlosh9732 2 years ago
Yikes, so as a beginner it's not possible to use an RTX 3070? Even if it's a slower process?
@mason6300 2 years ago
Of course you can! I think he was pretty misleading in this video. It completely depends on what kind of machine learning you're doing, but I haven't come across any task I can't get working on my 3070. I might have to change a few settings if a project was designed for, say, a 3090, but that goes for any card you're using to get optimal performance.
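The "change a few settings" workaround usually means halving the batch size until training fits in memory. A simulated sketch of that loop (the 6 GB budget and per-sample cost are assumptions, and `try_train_step` is a stand-in for a real framework call that raises on out-of-memory):

```python
def try_train_step(batch_size, per_sample_gb=0.05, budget_gb=6.0):
    """Stand-in for a training step: raises if the batch exceeds the VRAM budget."""
    if batch_size * per_sample_gb > budget_gb:
        raise MemoryError("CUDA out of memory (simulated)")
    return True

def find_working_batch(batch_size):
    """Halve the batch size on OOM until a step succeeds."""
    while batch_size >= 1:
        try:
            try_train_step(batch_size)
            return batch_size
        except MemoryError:
            batch_size //= 2
    raise RuntimeError("even batch size 1 does not fit")

print(find_working_batch(256))  # 64: 256 and 128 exceed the simulated budget
```

In a real framework the same pattern works by catching the library's out-of-memory exception around the first training step.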
@0Zed0 3 years ago
I have time but not much cash so a 3060 sounds like my best choice.
@HeatonResearch 3 years ago
I like the 3060, especially the 12GB version. You would learn much from it. You would also learn how to optimize for more limited hardware. I learned a great deal from the days when I had more time than cash!
@denysivanov3364 2 years ago
A100 is
@wholetsthedogoutwldo5060 3 years ago
Dear Jeff, I make my living with AI, and hardware cost is high... Can you help with how much GPU RAM I need for the following scenarios with the most-used DL models?
Pictures: 8000 images, 1920x1080, 4K, 8K
Videos: 30-second clips, 720p, 1080p, 4K, 8K
VR content: stereo, 2x 4K per eye
Sound: 7.1 channel, 24-bit, 384 kHz, 100 samples, 30 seconds
PS: I have a GPU cloud with 20 GPUs.
Thx, Jan
@Rolandas3K 1 year ago
Thanks for the sad video. I understand that it's not even worth dreaming about ML on laptops... Or is it, and if so, for which types of tasks?