Planned obsolescence? Why 8GB GPUs are no longer worth your money!

  69,722 views

HUB Clips

1 day ago

Comments: 1,100
@唐唯剛 1 year ago
Back in 2011, I was torn between the 560 Ti 1GB and 2GB. I asked on the internet and almost everyone recommended the 1GB version, for exactly the same reason people cite today. The few who recommended 2GB were treated as idiots. So I bought the 1GB version. A mere 9 months later, Battlefield 3 released and destroyed that 1GB of VRAM. Huge regret, especially considering I was poor and on a 4-5 year upgrade cycle back then. It's sad seeing that 12 years later, people have not learned the lesson.
@唐唯剛 1 year ago
Also no Skyrim HD texture pack for me.
@grlmgor 1 year ago
It ran fine on my GTX 580 1.5GB
@ngs2683 1 year ago
I almost went through the same thing in 2016. I got a GTX 1060 6GB while many people were telling me the 3GB version was fine.
@Verpal 1 year ago
@@grlmgor Back then, 500MB was a whopping 50% increase in VRAM, like going from 8GB to 12GB today; of course it ran fine.
@boostoshi 1 year ago
I bet their argument was "you don't need that." The same morons use that argument against high refresh rates: "Oh, the human eye can't see that." Meanwhile, there is an incredible difference. Moral of the story: always get the most for your money, and don't listen to jackholes on the internet.
@MikoYotsuya292 1 year ago
Mainstream tech reviewers = "8GB cards no longer worth your money these days." *Nvidia makes a 9GB GPU*
@ee-lectures5500 1 year ago
You mean 8.5GB
@bs0m 1 year ago
@@ee-lectures5500 OGs will remember the GTX 970's 3.5GB
@magicalbanana1263 1 year ago
@@ee-lectures5500 That 0.5GB will be GDDR5
@kingcrackedhen3572 1 year ago
8.5GB marketed as 9GB ;)
@JayMaverick 1 year ago
Hey I got a better one. Nvidia launches garbage overpriced product, and *consumers buy it.*
@nutronik9 1 year ago
Nvidia knew exactly what they were doing by not including more VRAM. They don't want another 1080 Ti problem.
@carpelunam 1 year ago
It's so true
@jandevocht 1 year ago
Wow, never looked at it like this…
@MarketAndChurch 1 year ago
Yeah, it's how they get you to upgrade to the new cards, haha
@mrquicky 1 year ago
Every time I see one of these VRAM discussion videos, I become mildly more irritated. It seems the only argument reviewers can ever muster is game performance. I don't care that much about games; most of the games I play are perfectly fine at 1080p on graphics cards from 3 generations ago. Occasionally, a reviewer will go one step further with video editing and Blender benchmarks. I guess that's nice. It is very rare to find videos discussing applications that simply will not run without more VRAM. If all your complaints are confined to video stutter and texture loading, I simply do not care; those problems seem childlike in comparison to the application crashing or failing to load altogether. I hope the VRAM discussions will eventually evolve to cover applications that legitimately will not run without at least 20GB. These applications exist and are far more pervasive than the average reviewer seems to grasp. Artificial intelligence and machine learning applications are hardly ever touched upon.
@nutronik9 1 year ago
@@mrquicky Go be happy with your 1080p 60Hz gaming; games will pass that threshold. But there are also a ton of gamers who run more than 1080p and also use their GPU for work. Nvidia knows this as well, and makes the business decision to get you to buy more GPUs by minimizing the VRAM increases from generation to generation.
@JayMaverick 1 year ago
When the 3070 cards came out with 8GB of VRAM, I had no idea they'd be obsolete within a few years. But it sure felt a bit weird seeing a brand new card with the same amount of VRAM as my ancient 1070Ti.
@bliglum 1 year ago
Even the base 1070, from WAY back in 2016, had 8GB... It's been 7 years, and Nvidia is still pushing the same 8GB on their 70-series cards? All while GPUs cost more than ever??? Madness.
@tech-wombat 1 year ago
Is it obsolete though? Can't you simply lower texture quality? Sure, it looks a bit worse when you take a thorough look. But then: do you really look that closely? And wouldn't developers also need to start creating more options in games' texture-quality and RT settings? According to the Steam hardware survey of March 2023, 33% of gamers have exactly 8GB and 77% have 8GB or less.
@bliglum 1 year ago
@@tech-wombat I'd say so. Buying a $700 GPU in 2023 and being forced to lower texture quality to PS2 levels of blurriness in order to get playable framerates fits the definition of obsolete rather succinctly. That, or perhaps Dead on Arrival is more befitting.
@tech-wombat 1 year ago
@@bliglum I was thinking about all the people already having one.
@tech-wombat 1 year ago
@@bliglum Bought in 2022 or 2021 for a lot of money, I might add.
@the.wanginator 1 year ago
Let's be real, Nvidia cuts corners for a number of reasons:
- Cutting costs
- Higher margins
- Planned obsolescence
- Last but not least, THEY DO IT BECAUSE WE (the consumers) BUY THEIR STUFF ANYWAY
Remember the 3060? Rumor is that it was supposed to be a 6GB card from the start...
@fakethiscrap2083 1 year ago
One, the majority of consumers purchase Nvidia. Two, just get Intel's GPU.
@DualPerformance 1 year ago
Well, I hope the rumors of a 4060 with 8GB aren't true; otherwise it will suck a lot. It needs to be a 4060 12GB, just like the 3060.
@fakethiscrap2083 1 year ago
@@DualPerformance It's going to be 8GB
@jhowgx 1 year ago
@@fakethiscrap2083 Probably gonna be 10GB, like the 4070
@chronosschiron 1 year ago
Yup, add that games' base costs went from 60 to 90 dollars, a 50% increase, because all the hardware these nitwit devs bought has to get paid for. And guess who they pass the garbage on to: me and you. Well, F them, I'll stay on the games that exist now and have fun.
@stealthstandard 1 year ago
This is Apple-level planned obsolescence; the GTX 1070 came with 8GB of VRAM in 2016. It simply should be more by now. Technology advances.
@sihamhamda47 1 year ago
In 2023 it should be: 16 to 24GB in the mid-range, and at least 32GB minimum for flagship GPUs. It could be more expensive, but it's worth the price because you can use it much longer.
@skideric 1 year ago
Yup, still running a 1070 in one of my PCs. I am writing this on it.
@nothingisawesome 1 year ago
Apple is better at hiding it... but I suppose Nvidia is doing great, so never mind
@andersbjorkman8666 1 year ago
But the 1070 has GDDR5 (8 Gbps per pin), and a 3070 has GDDR6 (14-16 Gbps); I mean, this must be considered.
@davefroman4700 1 year ago
So did the RX 580 that I am still using. This is just greed on the part of the industry; it would add $8 to their cost to produce a 16GB card.
@venatorx 1 year ago
Your video "16GB vs. 8GB VRAM: Radeon RX 6800 vs. GeForce RTX 3070" (and other videos comparing the RX 6800 to the 3070 and even the 3080) convinced me to buy the RX 6800 instead of the 3070. I'm glad I waited it out for months and really thought hard about my decision instead of just buying the RTX 3070 on a whim. I bought and installed an RX 6800 in my PC today, an upgrade from my GTX 1660. The 1440p performance of the RX 6800 is just ridiculously good. I've tried enabling RT (Performance) in The Witcher III, and it's a playable 54 to 55 FPS average. I just selected the RT graphics preset, so tweaks here and there will likely boost performance further. I have to say, I'm glad I waited and didn't buy the RTX 3070. The RX 6800 hits triple-digit FPS at 1440p ultra settings in the few games I've tested, and I am 100% happy with my purchase. I got it for cheaper than an RTX 3080 with comparable performance. I don't really care about ray tracing; I only tested RT in The Witcher III to see whether it was playable. Even though it is, I have no problem turning it off to get more FPS. The RX 6800 is such great value, and I'm glad I didn't give NVIDIA more of my hard-earned money. Thank you Hardware Unboxed (and all the other review, comparison, and benchmarking videos) for helping me make an excellent value-based decision.
@PinHeadSupliciumwtf 1 year ago
Did you do a clean OS reinstall after, or did you just hot swap? The latter might cause issues down the line due to driver remnants. So in case anything goes to shit, you might consider doing that, especially when swapping between Nvidia and AMD.
@seylaw 1 year ago
You know that there will be RDNA3 models in that segment soon, too? Two months of waiting could have gotten you a) more information about their performance and pricing, or b) better deals on the now-old RDNA2 card.
@starvader6604 1 year ago
@@seylaw An RDNA 3 mid-range card is a myth, tbh
@liaminwales 1 year ago
@@seylaw Depends; the RX 6800 can be found in some amazing deals today, new and used. We won't know if next gen will match it at that price point, or even when it will come out. A good deal on an RX 6800 today is not a bad option.
@venatorx 1 year ago
@@seylaw Your point being? I got a good deal on a GPU that performs significantly better than an RTX 3070 and comparably to an RTX 3080, while being cheaper than the latter. I like the value of my purchase, and I'm happy with my decision. Deal with it, I guess?
@JohnThunder 1 year ago
Back in 2013, I had a choice: the GTX 760 2GB or the GTX 760 4GB. I thought, yeah, I'll pay a bit more for that extra VRAM. It worked great in Watch Dogs 1 and Far Cry 4, and I could max out the graphics. I used this GPU until the end of 2019. So don't buy low-VRAM GPUs, even if they are fast; they won't last long before you have to change them again.
@user-eq2fp6jw4g 1 year ago
Did the same thing, actually, with the RX 480; I bought the 4GB model. Later on I realized I should have bought the 8GB model even though it was a bit more expensive. But I learned from that: one should always buy the model with more memory.
@Vis117 1 year ago
A great lesson that I have learned from these recent developments. I'm paying very close attention to VRAM moving forward.
@Kage0No0Tenshi 1 year ago
I got a GTX 770 4GB, and later on one more for SLI; I could run CS:GO at 4K without issue, and many other games at 4K too. 4GB was amazing then, and later on the GTX 1000 series' 8GB was amazing too, but why the RTX 3000 series got such low 8GB and 10GB configurations, I don't know.
@JohnThunder 1 year ago
@@Kage0No0Tenshi Because Nvidia wants you to upgrade every generation. All they care about is making the highest amount of money.
@Kirainne 1 year ago
Having been in the PC building space for nearly 30 years, I went after the 6900 XT (XTXH bin) for 16GB of VRAM, expecting the card to last me at 1440p for more or less 4 years, with expectations of starting to drop texture quality by the third or fourth year. I honestly feel quite lucky that my "risk taking", switching to AMD for the first time, just so happened to work out compared to the 3070 Ti I could have ended up with. Meanwhile, my friend went for the 3070 Ti instead, for basically the same price as my 6900 XT, about 8 months ago (purchased within a month of each other). We run at the same resolution, have the same panel, and unfortunately he's already having to drop texture quality to medium on average while I can push High/Ultra. I feel bad for him, but it is what it is. We can still play games, but I'm just having an overall better experience for the same value invested.
@BastianRosenmüller 7 months ago
I don't think this is right. I play on the RTX 3070 Ti with 8GB of VRAM and it's absolutely fine at QHD and even 4K, without any problems. I think people are overexaggerating about VRAM. Running out of it doesn't have to be bad. In many games, like The Last of Us Part 1, I run out of VRAM quickly, but the game just uses less and loads the rest into system RAM, so I sometimes get some texture pop-in, and that's the only difference. The game doesn't run worse than on a comparable card with more VRAM. 8GB is fine.
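The pop-in described in this thread is what happens when the driver has to evict and restream textures that no longer fit in VRAM. As a rough illustration of why high texture settings overflow an 8GB card (the 1 byte/texel figure is the usual rate for BC7-style block compression; the resident-texture count is invented for the example):

```python
def texture_mib(width: int, height: int, bytes_per_texel: float,
                mipmaps: bool = True) -> float:
    """Approximate size of one texture in MiB; a full mip chain adds ~1/3 on top."""
    size_bytes = width * height * bytes_per_texel
    if mipmaps:
        size_bytes *= 4 / 3  # sum of the mip pyramid converges to 4/3 of the base level
    return size_bytes / 2**20

one_4k = texture_mib(4096, 4096, 1.0)   # one compressed 4K texture, ~21.3 MiB
resident_gib = 400 * one_4k / 1024      # 400 resident 4K textures, ~8.3 GiB

print(round(one_4k, 1), round(resident_gib, 1))
```

A few hundred 4K textures, plus framebuffers, geometry, and (if enabled) RT acceleration structures, is enough to blow past 8GB, which is why the overflow spills into slower system RAM as the comment describes.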
@jesperburns 1 year ago
Laughs in 1080p with 16GB VRAM
@cezarstefanseghjucan 1 year ago
Laughs in 1080p with 24GB VRAM, brother!
@sinizzl 1 year ago
laughs in RX 6800 XT
@fiece4767 1 year ago
No offence but that res is shit. Wish everyone would stop disrespecting their eyes.
@destronger5313 1 year ago
emotionless in 3070ti
@avisso5467 1 year ago
@@destronger5313 Bruh moment
@boostoshi 1 year ago
One could also argue that Nvidia is also creating a massive e-waste problem with these up-and-coming paperweights.
@MrMiddleWick 1 year ago
Don't worry, people are still buying the GTX 1080; the 30 series will be used for many, many years.
@Negiku 1 year ago
@@MrMiddleWick Ikr, people are so good at exaggerating things lol. We should remember that the most popular GPU as of now is the GTX 1650, which has 4GB of VRAM.
@anotherks7297 1 year ago
@@Negiku According to Steam it's the 3060 > 2060 > 1060
@chriss9809 1 year ago
Of the top 8 Steam GPUs in the March survey, 5 have less than 8GB of VRAM: the 2060, 1060, 1650, 1050 Ti, and 3060 laptop (the laptop version has only 6GB of VRAM compared with the 12GB desktop card). They'll still be used for years. A significant number of people game on integrated graphics and would love to have one of these "paperweights", or even a 4GB card. The thing that sucks is that, at this price point, the people who bought the 3070 and 3070 Ti want to play things at max settings, and they won't be able to play the latest games at those settings. The 3060 Ti users are more of a price-to-performance demographic, so I bet they're significantly happier.
@HenryThe12 1 year ago
@@Negiku Exactly, that's what bothers me with this stuff; it's so exaggerated. All of these cards that they claim are running into "VRAM limitations" only do so in triple-A games when using higher textures or RTX. What about the hundreds of thousands to millions of casual gamers who just like playing simple games like CSGO or Valorant? Do you think they're going to run into VRAM limitations using a freaking 3070 of all cards? Absolutely not; that's the last thing on their minds. People need to get a grip and come back to reality.
@randy206 1 year ago
Can we get a 2080 Ti vs. 3070 video? I'm wondering how the 2080 Ti, with its limited ray tracing but larger VRAM capacity, would fare.
@tejaswin5404 1 year ago
Well, in pure hardware rasterisation performance the RTX 2080 Ti smokes the 3070 into dust, unless NGreedia cripples it in a new driver update 👍
@Nightceasar 1 year ago
You either have 12GB of VRAM or you don't. Why do you think they put in exactly 11GB and not 12? This whole video is about "planned obsolescence".
@fiece4767 1 year ago
Here's a joke: ex-2080 Ti owners since the 2020 bitcoin boom.
@qwertyui90qwertyui90 1 year ago
@@tejaswin5404 Which driver would be best for the 2080 Ti?
@tejaswin5404 1 year ago
@@qwertyui90qwertyui90 The driver that was available before the RTX 3070's release; and game testing should be done on older games, not newer ones like the Resident Evil 4 Remake, etc.
@Naunet7 1 year ago
Imagine buying a GPU for more than 1000€ to get 15 fps in games 3 years later. People told me to get a 6800 XT instead of the 3070 (same price where I live) because of the VRAM. I thought they surely must be exaggerating, because Nvidia wouldn't build high-end cards with 8GB of VRAM if it would make them completely obsolete in the next three years. Well well well, I'm glad I chose AMD this generation.
@dellamaik 1 year ago
Imagine thinking you can play on the same settings after 3 years. It's normal that you need better hardware every year to play on the highest settings. It doesn't matter if it's AMD or Nvidia; it's the evolution of engines, texture quality, and ray tracing.
@randymarsh4748 1 year ago
@@dellamaik Of course you can play with the same settings 3 years later... just with less fps. Unless you don't have enough VRAM; then it becomes unplayable.
@dellamaik 1 year ago
@@randymarsh4748 Then you don't understand that I answered Naunet's question; he complained about the fps.
@randymarsh4748 1 year ago
@@dellamaik No, you didn't, because he never complained about getting LESS fps... he complained about getting 15 fps! (Unplayable, caused by the VRAM cap, not by the GPU being too slow.) He didn't complain about going from 120 fps to 90... he's complaining about going from 120 fps to 15!
@dellamaik 1 year ago
@@randymarsh4748 And that is complaining, because he states that a $1k GPU doesn't last 3 years :) And a 3-year-old GPU only gets 15 fps at native resolution, but hey, believe what you want ^-^. Don't be so angry, and take a break from the PC.
@singular9 1 year ago
Consoles have 16GB of combined RAM, so that is what is being targeted. AMD knows this and hedged their bets in the right direction, as usual. They have consistently been ahead of the curve on long-term consumer value.
@adamek9750 1 year ago
AMD finewine technology ™️
@kenchiu1987 1 year ago
AMD is the maker of the console chips; they know very well where the industry is heading.
@Jabba1625 1 year ago
@@kenchiu1987 I was thinking about this just earlier. I think AMD knows that Ngreedia will price out normal gamers on a budget, and AMD supplies the consoles with their products. I can see some of us returning to a console just to play games and using a cheap laptop or phone for PC-related stuff. Ngreedia is just in this for the money, while AMD has a longer-term plan set out.
@fafski1199 1 year ago
Like you said, that's combined. The CPU (including the OS) uses a fair proportion of that memory as system RAM; only around 8-10GB of it is actually used by the GPU in games. So a similar amount should be just as sufficient on mid-range GPUs, at least for the next few years (as long as games are optimised as well as they are on console). 16GB would be overkill for the expected usage of 4-5 years on a 1440p card. Even after that period it would be useless, because no doubt by then it would be downgraded to being capable of 1080p gaming. As for 1080p gaming, 6-8GB GPUs should still suffice, since the Series S gets away with only a 10GB pool of shared memory. Devs putting in the time and effort to give PCs the same level of optimization that the consoles get is what is really needed here.
@singular9 1 year ago
@@fafski1199 This is not true. Tests have shown the OS uses at most 1-1.5GB, and games use well over 12GB at 1080p, and all of it at 4K.
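For reference, the one current console whose memory split is publicly documented is the Xbox Series X, and Microsoft's stated figures land between the two claims above:

```python
# Microsoft's published Xbox Series X memory figures; the point of
# contention in the thread is how much of the game pool the GPU actually uses.
TOTAL_GB = 16.0
OS_RESERVE_GB = 2.5       # reserved for system software
GPU_OPTIMAL_GB = 10.0     # full-bandwidth (560 GB/s) pool, intended for GPU data

game_available = TOTAL_GB - OS_RESERVE_GB          # 13.5 GB available to the game
slower_pool = game_available - GPU_OPTIMAL_GB      # 3.5 GB at reduced bandwidth

print(game_available, slower_pool)
```

So a game can address 13.5GB total, of which 10GB sits in the fast pool aimed at GPU workloads; how it divides the rest between CPU-side and GPU-side data is up to the developer, which is why both "8-10GB" and "over 12GB" estimates get thrown around.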
@aga1nsty0u 1 year ago
My 1080 Ti with its 11GB of VRAM, bought in 2017, really helped me push 1440p 120Hz ultrawide through these years until now, and I've now bought an RX 6950 XT for a minimum of the next 4-5 years.
@mattirealm 1 year ago
Good choice! The 1080 Ti was legendary, but tests show the 6950 XT is a beast!
@Tyrim 1 year ago
My only consolation for my RTX A4000: I had to buy it for work reasons, and in gaming it is basically a 3060 Ti with 16GB of VRAM. I'm so glad I have 16GB.
@AMN-wm1nh 1 year ago
16 ??
@Tyrim 1 year ago
@@AMN-wm1nh Yes. The A series is the new name for the Quadro cards. The A4000 has 16GB of ECC VRAM. Basically, for nearly similar gaming performance, better drivers and features, more (ECC) VRAM, a bit better compute performance, and a much lower TDP, it costs 2.5x the price :/ However, some professional applications straight up refuse to run on non-Quadro cards, or you cannot access certain features without them.
@zagloba1987 1 year ago
The RX 6700 and 6700 XT with 10 and 12GB of VRAM feel like a much better buy than the 3070 if you plan to keep the card for a few years, especially if RT is not a big concern for you. They should do pretty OK even at 1440p for a while, and I feel like the 3070 Ti and 3070 will age pretty roughly. Anyway, if the rumors are true and the 4060 Ti and 4060 will still pack 8GB of VRAM, that will be quite impressive by Nvidia. Like spitting in the face of the consumer :)
@MicaelAzevedo 1 year ago
Well, I have an RTX 3070 and you cannot really use ray tracing either, lol. If you want 30-40 fps, yep, go for it.
@94leroyal 1 year ago
Cyberpunk path tracing is showing what a joke modern RT effects are as well. The technology is immature even on the 4000-series cards for good ray tracing. In another 5 years, once the performance has matured, I'll buy into RT intentionally.
@Scornfull 1 year ago
I wouldn't say that. AMD consistently has driver issues that take a very long time to be fixed, if at all, so both are terrible options; you gotta pick between performance and stability.
@zagloba1987 1 year ago
@@Scornfull That's a plain lie right there. Absolutely 0 issues.
@marcm. 1 year ago
I would argue that even if RT is a concern for you, a 3070 is not going to do it. Not because it doesn't have the horsepower, but because it doesn't have the VRAM necessary to handle RT plus the texture packs that would make sense with such an option.
@Tenmo8life 1 year ago
The 2080 Ti took that fall like a champ
@Kratochvil1989 1 year ago
Two years ago: "Here we are, giving you this budget-friendly solution, the 3060 Ti, with the same performance as the 2080 Ti..." Two years later: "Hi, hi, you have been trolled," while during the pandemic you spent more money on every 3060 Ti unit than on any 2080 Ti unit ever before :D
@Tenmo8life 1 year ago
@@Kratochvil1989 No... it literally fell in the video
@bebrown2354 1 year ago
It's just a confusing time in GPUs. When the 10 series came out, raster performance @ 1080p was king; the more the better. Now it's raster + RT + DLSS + CUDA + VRAM + TDP + 1080p vs 1440p vs 4K. There are just so many metrics now that it's hard to pick a clear winner.
@Tarukeys 1 year ago
I've been building PCs for almost 25 years. From the 90s until Pascal (the 10 series), memory doubled every generation. Since Pascal, the memory has stayed the same for 3 generations in a row, and the 40 series is barely enough. Thanks to the people defending Nvidia on every article, they think they can get away with it. When the 30 series came out, it made me angry. I knew it was going to be a problem way before the end of life of these cards, especially the 3070, which is quite strong. Imagine being one of the suckers who paid over $1,200 CAD for a 3070 when prices went crazy, and now you have to play some games at medium settings when, in fact, it's more than powerful enough.
@chinesepopsongs00 1 year ago
8GB has had a good run for a long time now; you mentioned the history of the 256MB, 512MB, 1GB, 2GB, 4GB updates in the past. I think no other size has been produced as long as 8GB cards. Many younger people have never seen anything but 8GB as the god tier of memory. But even this god tier is coming to an end now, it seems. You guys have been the most accurate in testing and predicting things in PC tech, and you even retest and correct if you find something was done wrong. I need that kind of info to figure out what is good and what is less so.
@drizz786 1 year ago
The VRAM question and the price gouging are why I went with an A770 16GB from a 1080 Ti; now it's running even better after 2 updates in the last month.
@imbro2529 1 year ago
Literally bought an 8GB RX 6650 XT a couple of months ago and now my feed is full of such videos XD
@mutawi3i 1 year ago
Turn shadows off m8.
@PinHeadSupliciumwtf 1 year ago
They're fine for 1080p. But since the 6650 isn't really meant for 1440p anyway, I wouldn't worry too much. The irony of the 8GB Nvidia cards is that they're too powerful for that low amount of VRAM. Plus the fact that RT uses even more.
@cosmincatalin2800 1 year ago
In that price segment, the only thing you could get with much more VRAM was the 3060 12GB. If you didn't have money for a 6800 16GB tier card, it's kind of a moot point to regret. If you could have jumped to the 6800 16GB and didn't for reasons other than money, then that's another discussion.
@IchiroSakamoto 1 year ago
The 6650 isn't supposed to game above 1080p, so you're OK
@solo6965 1 year ago
I have the same GPU as you. I feel the sweet spot for the 6650 XT is medium settings.
@dd22koopaesverde58 1 year ago
The RX 6600 is a good 8GB card because it costs $250, compared to other 8GB cards like the 3070.
@IchiroSakamoto 1 year ago
The 10GB 3080 isn't doing any better, because that was meant to be a true 4K card!
@heldscissors4132 1 year ago
@@IchiroSakamoto Yeah, at least I got the 12GB 3080... my friend with the 10GB version of the exact same card, though...
@SapphireThunder 1 year ago
@@heldscissors4132 The thing is, the 3080 10GB and 12GB are not even the same GPU at all. The 3080 10GB uses the lower 8704-CUDA GA102-200/202, whereas the 3080 12GB uses the 8960-CUDA GA102-220. It's not a massive difference, but it's basically the same thing as with the 1060 3GB and 1060 6GB. They almost did the same thing, but worse, with the 4070 Ti and 4080.
@heldscissors4132 1 year ago
@@SapphireThunder Yeah, I know... What I mean is we have the same cooler design, so our cards look identical, but mine should last longer in general.
@MicaelAzevedo 1 year ago
@heldscissors4132 12GB isn't that great either; you will need 16. New games are at 14GB now lol (1440p). In fact I bought an RTX 3080 (used) and sold it 3 days later xD; I just bought a 4090, as 7900 XTXs are almost the same price in my country (EU).
@V1CT1MIZED 1 year ago
I agreed with you guys back in the 3070 review, and it was clear you were going to be right. This situation is a pure surprised-Pikachu meme. My 1080 Ti at the time was already more future-proof. It has to be planned obsolescence.
@zbirdzell 1 year ago
It would be interesting to compare the 3060 12GB vs. the 3070 8GB.
@chronosschiron 1 year ago
Funny how the 1660 Super is close enough to a 3060 to say "naaaa, don't want it", since the 3060 is more than double the price. For that cash I'd go with a 20GB AMD card, or an A770 that beats the 3060 (oh, and on the latter I'd save cash).
@gunshotbill 1 year ago
kzbin.info/www/bejne/n5izp5qYYtp6r9E
@juicecan6450 1 year ago
As a 3060 owner who bagged it at MSRP in a tough time, I feel a little sad about where the GPU stands. It's designed to fall just short of completely conquering 1080p, and the card that actually does that is the 3060 Ti; but that one is so good that they capped the VRAM at 8GB to ensure you never step out of 1080p in the future with it. And I'm pretty sure that 8GB of VRAM is doable until the next console gen, so long as you're at 1080p. So while the 3060 Ti has that going for it, the 3060 is designed to fizzle out into obscurity, because its chip is an anemic version of the Ti that would tap out at just an 8GB memory's worth of max workload. So making use of all 12 is a pipe dream.
@juicecan6450 1 year ago
The only scenario where a 3060 beats a 3060 Ti or a 3070 is when you play a next-gen game at 4K maxed, and the result is that you get 10 fps on the 3060 while the 3060 Ti and 3070 score 4 and 5 fps because of the memory bottleneck. Not really a practical scenario. Tone it down to where their memory can handle it (you'll need to do that as your card ages anyway) and you'll notice that the only fair competition a 3060 can have is with, say, a 2070, and maybe a 2070S/2080 if you're doing some borderline-unsafe overclocking.
@luciferxyXX 1 year ago
The 3060 is just too crippled aside from the VRAM; that small bandwidth and ROP count are a big no. Nvidia really screwed its consumers by giving less VRAM to its faster GPUs and more VRAM to its slower GPU (the 3060) 👎
@CPowell133 1 year ago
As someone who tends to hold onto a GPU for a couple of generations (seeing as I am still on a GTX 1070), this alone makes me worried about considering a 40-series card below a 4080, and will most likely push me towards AMD this generation.
@Gosagen93 1 year ago
Same.
@molochi 1 year ago
Yeah, right now, as someone who will not be building a gaming PC with a $1000+ video card, the only thing that seems logical to buy is an RX 6800 XT 16GB. Though I don't see using it at any resolution that should REQUIRE more than 8GB of memory; it seems like these textures are just getting out of hand. I don't see any of this as planned obsolescence, just lazy (or pressed for time) developers.
@40Sec 1 year ago
The circumstances where the 6xxx series outperforms the 3xxx series, rather than vice versa, in games with current-gen features are admittedly very limited. No GPU should be expected to be amazing four years down the line. The only reason AMD has stuck as much VRAM on their cards as they have is that they've known they're behind on pretty much everything else, and they need at least one area where they can get a marketing win. Let's not fool ourselves into thinking that AMD was thinking of the future or that they genuinely cared about their customers. Let's also remember that they've consistently fought against DLSS inclusion in their partnered games, as it would greatly reduce the issues lower-VRAM Nvidia cards might face in those games. For the 3xxx series, not including more VRAM was an acceptable choice, but there's no excuse for $600+ cards with 8GB of VRAM at any point moving forward. Hopefully Nvidia has its act together for the next round.
@promc2890 1 year ago
Buying a 6800 XT over a 3070 Ti for the same price is the best decision I've made.
@datguyoverdurr2604 1 year ago
I'm so glad I bought the 3070! Jumping from a PS4 to my beast of a rig, the fps difference was jarring! And being able to demolish games at 1440p high settings, then jump into Blender and do some 3D art... I regret not moving to PC sooner!
@The23rdGamer 1 year ago
How well does Blender hold up on a 3070 with 8GB of memory? Do you frequently run out? I'm curious what size of projects you work with as a fellow Blender user.
@EugeneEugene-tw8rn 1 year ago
Dude, you're comparing a 10-year-old console to a mid-range PC of 2021. Stop using drugs and writing nonsense.
@thejollysloth5743 1 year ago
As someone who doesn't care about ray tracing and is happy playing at 1080p between 90 and 150 FPS (I have a budget AOC 24-inch 144Hz 1080p monitor and am happy with the resolution, as I haven't ever played at 1440p or 4K): is the Radeon RX 6750 XT still going to let me play the latest games at 1080p high settings for another few years, or at least until the next generation of cards is released? That would have gotten me around 4 years on the GPU. I tend to buy a £400 to £500 card every two generations and sell my old card to fund some of the upgrade; I had a GTX 1070 before the 6750 XT, which I sold for over £200 during the crypto craze. The upgrade was insane, and I got another 4GB of much faster memory (GDDR6 vs GDDR5). I also recently upgraded my CPU from a 9600K (6 cores and 6 threads at a 5.2GHz all-core overclock) to a 5700X (8 cores and 16 threads). I'm currently extremely happy with both at 1080p ultra or high settings, but found that in World of Warcraft the CPU upgrade helped me more, especially in areas with 50+ players on screen in "Epic Battlegrounds", which have up to 40 players on each team. I know the 13600K would probably have given me a better experience in CPU-bound situations, but it was a lot more expensive, and so were the motherboards. I find that the 2 extra cores and 10 extra threads have had a huge effect in CPU-bound games, but for single-core workloads (especially older games), the 9600K at 5.2GHz actually outperformed the 5700X, which really surprised me since it is a much older 14nm architecture!
@sonicthomasgamer
@sonicthomasgamer Жыл бұрын
NVIDIA: Where do you see yourself in 5 years?
Graphics card (8GB VRAM): Out of VRAM, sir.
@Kurotama_STR
@Kurotama_STR Жыл бұрын
I had a 3060 Ti that my father got me for a birthday and was quite happy with it except for a few things: drivers (for some emulation and Linux-based stuff), VRAM (mostly for VR) and noise. I got some money for my 21st birthday and ended up getting an XFX Speedster Black 7900 XT for ~900€ (would be around 750-800$ without taxes or shipping fees, I guess?) and it runs like a beast. Thought about going for a 6950 XT at first, but I don't regret my choice seeing Alyx devour 17.5GB of VRAM on average xD My only concern with it is the latency induced by the MCD, but I was aware of that being a potential issue, and since it fixed all of my other problems and still runs quite a lot faster than my old 3060 Ti, I'm very satisfied with my purchase. For the record, I sold the old GPU for 250€ to a friend.
@chuckkissel2785
@chuckkissel2785 Жыл бұрын
Good point. I upgraded from a 4GB RX 580 to an 8GB 6650 XT in Dec '22. But I know that in the next two years I will need to upgrade again quickly to keep playing AAA games at decent frame rates at 1440p.
@jeffho1727
@jeffho1727 Жыл бұрын
Same. Matched it to my 10th-gen i3. Debated going a little higher to a 6700 but considered it overkill; it will run for another couple of years and then I'll do a full upgrade to the new gen.
@chuckkissel2785
@chuckkissel2785 Жыл бұрын
@@jeffho1727 Waiting on those mid range 7000 series GPUs coming out in June
@NATIK001
@NATIK001 Жыл бұрын
For me, I was upgrading from an 8GB card. I was already seeing it get close to maxing out its VRAM, so I knew I wouldn't buy anything under 16 GB. I looked across the offerings available and, well, Nvidia just didn't offer anything that made sense, so I had to go elsewhere.
@MarshallRawR
@MarshallRawR Жыл бұрын
But it's ridiculous!! 8GB of VRAM obsolete out of nowhere because of these ports?! So what, the sub-$400 tier is already not worth it, and now that some RX 6600s are becoming affordable they're trash too? I will still blame poorly made ports before saying that 8GB is obsolete. Sorry guys, you have to buy $600+ GPUs to play at 1080p now. Absolutely ridiculous.
@rasmusolesen5307
@rasmusolesen5307 Жыл бұрын
I don't know how much work this would be, but maybe VRAM usage would be more relatable if the performance benchmarks were accompanied by the average and max usage in %. Max usage above 75% should be a clear indication that newer triple-A titles will have issues.
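The suggestion above is easy to sketch. A minimal, hypothetical example of such a report (the function name, the 75% threshold default, and the sample numbers are all made up for illustration, not real benchmark data):

```python
# Hypothetical sketch: flag a GPU when its peak measured VRAM usage
# crosses a headroom threshold, as the comment above suggests.

def vram_report(avg_mb: float, peak_mb: float, capacity_mb: float,
                warn_fraction: float = 0.75) -> str:
    """Summarise VRAM usage as percentages of the card's capacity."""
    avg_pct = 100 * avg_mb / capacity_mb
    peak_pct = 100 * peak_mb / capacity_mb
    verdict = ("headroom OK" if peak_pct <= 100 * warn_fraction
               else "likely to struggle in newer AAA titles")
    return f"avg {avg_pct:.0f}%, peak {peak_pct:.0f}% of {capacity_mb / 1024:.0f} GB ({verdict})"

# Example: an 8 GB card peaking at 7.4 GB in a recent AAA title.
print(vram_report(avg_mb=6300, peak_mb=7400, capacity_mb=8192))
```

Reviewers already log per-second VRAM allocation in many benchmark runs, so feeding those numbers through a percentage summary like this would cost little extra work.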
@davefroman4700
@davefroman4700 Жыл бұрын
The fact that 16 gigs isn't already standard at a $300 entry point today has to do with nothing but greed. It would add less than $8 to their manufacturing cost.
@agostinog
@agostinog Жыл бұрын
Lol, I just bought a 4GB card, the RX 6400, and I'm happy with it so far; it's way better than my previous RX 550. But obviously I don't play modern games.
@thefreemonk6938
@thefreemonk6938 Жыл бұрын
Same, I have a laptop RTX 3050. I wish NVIDIA had given it at least 6 GB of VRAM.
@derrickhotard9926
@derrickhotard9926 Жыл бұрын
The Moore's Law Is Dead podcast with the COD dev opened my eyes to the impact of consoles on game requirements. As last-gen consoles are completely phased out, devs won't have much reason to stay at 8 GB, since the PS5 and Xbox have 16 GB apiece. Nvidia doesn't give a shit about game devs, so they won't be asking them for anything. Nvidia releasing new GPUs with less VRAM than a multi-year-old game console that can do 4K is pretty insane.
@portman8909
@portman8909 10 ай бұрын
It's actually 12 GB of usable VRAM on consoles. The other 4 is reserved.
@mcericx
@mcericx Жыл бұрын
When game developers produce shitty PC ports, I don't think you should blame the VRAM on GPUs. You just give them a free excuse to continue doing so. It is not a VRAM problem!
@marcopazzi117
@marcopazzi117 Жыл бұрын
In 2016 I bought a new GTX 1070 with 8GB, so I don't understand why Nvidia didn't update the 30-series VRAM...
@jumpman1213
@jumpman1213 Жыл бұрын
Planned obsolescence is 100 percent the case here, but on the other hand I think it helped with supply at the time. Remember how VRAM costs skyrocketed in 2021? If the 3070/3060 Ti had been equipped with 16GB, I don't think Nvidia and the AIBs could have pushed that many graphics cards to market.
@Accuracy158
@Accuracy158 Жыл бұрын
Console generations also probably play a role in how devs allocate GDDR memory. We may not see another big jump in VRAM usage like the current one until PS6 availability stabilizes. We're just now getting to the point where last-gen consoles are becoming an afterthought instead of the main portion of the install base, so it's not too surprising that games are suddenly creeping up in memory usage. As Digital Foundry does a good job highlighting, games like The Last of Us are getting sloppy texture settings not well configured to utilize the amount of VRAM provided. As soon as you start dropping settings, some textures look terrible and others look completely untouched, leaving a mixed image somewhat below the quality we've come to expect from 8GB of VRAM.
@K4rmaRules
@K4rmaRules Жыл бұрын
I have a 3060ti. All I could afford at the time but it runs everything I want it to really fast.
@The23rdGamer
@The23rdGamer Жыл бұрын
8GB is fine depending upon your use case :)
@pjavilla
@pjavilla Жыл бұрын
Yup, as a patientgamer it suits me just fine. I'm not gonna bother leaving 1080 until I can also find a good HDR monitor, and judging from HUB's reviews those are still a ways off.
@casedistorted
@casedistorted Жыл бұрын
Man, I remember in Feb 2017 I bought the 1070 8GB SC from EVGA at a Best Buy. My previous GPU was a 560 Ti with 2 GB of VRAM, if I remember correctly. The jump up to 8GB was HUGE and I kept thinking "man, this will last me for years!" It did. It lasted me until 2021 (I think? Or was it 2022, my memory has gotten bad) when the 3070 Ti launched, and when it still had 8 GB of VRAM I was like "really? The GPU I bought over 4 years ago had the same amount, and 1440p and 4K will require more for sure." Thanks for that one, Nvidia: you gave me the GDDR6X memory, but not nearly enough of it.
@Carsons12
@Carsons12 Жыл бұрын
Really appreciate your honest review (even though you might sometimes get criticized by viewers for it).
@upfront2375
@upfront2375 Жыл бұрын
Well, they literally are the guys who would compare a Jaguar with a speedboat in water 😂 simply because the Jaguar can float. Testing Nvidia cards with FSR is very lame.
@joemarais7683
@joemarais7683 Жыл бұрын
@@upfront2375 how?
@upfront2375
@upfront2375 Жыл бұрын
@@joemarais7683 DLSS is designed for GeForce cards; FSR is open source but it's made by AMD, so it's only fair to compare the two cards with each one's preferred upscaling tech... but then the numbers wouldn't be as close as a tech channel would like them to be. They're literally shooting the Nvidia cards in the leg ONLY to make the AMD ones look competitive.
@V1CT1MIZED
@V1CT1MIZED Жыл бұрын
@@upfront2375 guess you missed the FSR2 vs DLSS video they did the other day? Performance is basically identical between both upscalers and it's image quality that differs. Obviously you don't understand science and how variables have to be the same to get an accurate result.
@upfront2375
@upfront2375 Жыл бұрын
@@V1CT1MIZED Why you "guess" when you can just "assume" as clearly shown you're good at it. Idk how they're testing but I have a 4080 and I can test it myself, and even without the FG I still get better perf. with DLSS. Fk the science, I'm talking about logic here! If the perf. is as you say "identical" then why they're keep using FSR on Nvidia cards?! "Variables have to be same?" while 1 card is designed for that "variable" when the other one simply is just able to run it. You're obviously 1 of them who can be fooled by big words used and can't think and measure by yourself, beside "obviously" being an AMD fanboi
@diysai
@diysai Жыл бұрын
I bought my Radeon 6700 10gb recently. Will it be enough for 1080p high 50fps (no RT), considering I plan to keep it at least another 5 to 7 years?
@DerDop
@DerDop Жыл бұрын
LOL YES...but for max 5 years.
@tech-wombat
@tech-wombat Жыл бұрын
How do you respond to the fact that, according to the Steam hardware survey for March 2023, 77.6% of users have 8 GB or less, and 32.16% have exactly 8 GB? I saw the Moore's Law Is Dead interview with a developer who was struggling with that limit. I would think that no mainstream game studio can create (and sell) a game that doesn't work properly for 77.6% of its potential audience.
@Vis117
@Vis117 Жыл бұрын
MLID actually gave his opinion on this question yesterday. Devs will just tell you to play in 1080p. I think it’s clear that the problem exists, regardless of what cards most people have.
@zwush
@zwush Жыл бұрын
But many game devs develop their games for consoles that have more VRAM and port them over to PC. So 12 GB should be your minimum.
@haysoos123
@haysoos123 Жыл бұрын
They will work, but you have to use the lower texture settings and/or resolutions.
@vladimirljubopytnov5193
@vladimirljubopytnov5193 Жыл бұрын
Sooo... what about 12GB? Any measurements there? I was in a situation where I had the choice between a 3080 10GB Noctua edition and the 12GB TUF. Went with the 12GB, and the thing is loud, but I'd like to think I at least did well on the VRAM side of the argument.
@maxerea4380
@maxerea4380 Жыл бұрын
Why is no one mentioning the Arc A770 with 16 gigs of VRAM as well? I have been using one for a while now and I absolutely love it! It works well, has a slick design and is overall a great product.
@nikosensei1258
@nikosensei1258 Жыл бұрын
Prices are different in every country, there have been driver issues, and today the used market is full of mining GPUs, which are cheaper.
@maxerea4380
@maxerea4380 Жыл бұрын
@@nikosensei1258 Yeah, but the drivers are fixed now and all games run very smoothly; the only downside is the Resizable BAR requirement. And when you buy a mining GPU, you buy 1) a used GPU that 2) has probably passed half of its life cycle.
@WutipongWongsakuldej
@WutipongWongsakuldej Жыл бұрын
I'm learning a bit about modern rendering techniques. I've found that a lot of techniques utilize render targets/textures. Something as common as light sources uses render targets/textures: a point light uses a cube map for its shadow map, a directional light uses multiple shadow maps, and even a simple spotlight or light panel uses a shadow map. So even if a game's static resources might be smaller, the renderer might still benefit from a large VRAM size, as it can store more/larger render targets in memory. By the way, it looks like the most common max image size supported by GPUs on the market is 16k by 16k, with more modern ones going up to 32k. I don't know if there are games that utilize textures/render targets that large or not...
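As a rough illustration of the point above, here is a hypothetical back-of-the-envelope estimate of shadow-map VRAM for one lighting setup. The 32-bit depth format, resolutions and light counts are all assumptions for illustration, not taken from any particular engine:

```python
# Even before any game textures, the render targets a lighting setup
# needs can eat noticeable VRAM. Assumes 4-byte depth texels
# (e.g. a D32_FLOAT shadow map); sizes are illustrative only.

BYTES_PER_DEPTH_TEXEL = 4

def shadow_map_bytes(resolution: int, faces: int = 1) -> int:
    """VRAM for a square depth render target (cube map => faces=6)."""
    return resolution * resolution * BYTES_PER_DEPTH_TEXEL * faces

point_light = shadow_map_bytes(1024, faces=6)    # one point light: cube shadow map
directional = 4 * shadow_map_bytes(2048)         # directional light: 4 shadow cascades
spot_lights = 8 * shadow_map_bytes(1024)         # 8 spotlights, one map each

total_mb = (point_light + directional + spot_lights) / 2**20
print(f"{total_mb:.0f} MiB just for shadow maps")  # ~120 MiB
```

And that is before the G-buffer, HDR color targets, and post-processing buffers, all of which scale with output resolution rather than texture quality.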
@AWOL_ODST
@AWOL_ODST Жыл бұрын
“Those who don’t learn from history are doomed to repeat it” Listen to the boys, they know what they are talking about. Ignore their advice at your peril.
@IchiroSakamoto
@IchiroSakamoto Жыл бұрын
Sadly people didn't learn from the 3.5GB GTX 970 saga
@hornypotato4340
@hornypotato4340 Жыл бұрын
NVIDIA sucks
@chrismorrison5301
@chrismorrison5301 Жыл бұрын
Ray tracing has improved massively on Radeon cards (there is a discussion about RT APIs and AMD getting the short straw when it comes to the generic APIs, with not many devs using the AMD-specific one for RT, but that's a whole other discussion). When I first got my 6700 XT I couldn't use RT in Shadow of the Tomb Raider; then a while later I jumped in and medium RT was possible. Then not too long ago I jumped in again and, in the areas I was playing, I could run ultra RT at 1440p@60 with max settings (except AA) with no frame rate dips at all. Now I didn't play the whole game through, so I can't say for sure the whole game would be a locked 60fps, but it was a very forested area, so a lot of RT shadows going on there. The noise wasn't great, but that's to be expected.
@Ivan-pr7ku
@Ivan-pr7ku Жыл бұрын
The truth is that, in the case of The Last of Us, the PS5 version also runs out of memory all the time, and this is where GPU-accelerated I/O management comes into play: asset data is swapped in and out of RAM so fast that there's no opportunity for stuttering or rendering artifacts to happen. On the PC version, all this functionality is thrown onto the much slower CPU, and the only course of action is simply to allocate more VRAM for preemptive buffering. At this point it's up to the game devs to utilize the available solutions for the PC, i.e. the DirectStorage API, which exposes functionality similar to what's found in the PS5. It is supported on all Shader Model 6.0 compatible GPUs.
@JayzBeerz
@JayzBeerz Жыл бұрын
Can’t take advantage of my 6800 XT 16GB VRAM with all the black screens.
@jamesgodfrey1322
@jamesgodfrey1322 Жыл бұрын
I'm part of a lost generation of GPU buyers: prices became insane, so I didn't get a new GPU, and even if I could have gotten one, the price was too much. So I carry on using my old RX 580 8GB. I do think Steve and Tim have hit the nail on the head: 8GB GPUs are fast becoming obsolete. I'm about to upgrade my GPU, and yes, a 16GB-or-above GPU is what I'll be getting. I've been playing around with AI art generators; some food for thought: when will AI running on your GPU become part of game engines? A little off subject, but as an old gamer, a bit of advice: set up a small savings account and drop £20 per month into it. Most of us upgrade a PC every 4 to 5 years, and that's over £1000 towards a new PC, or to help pay for any needed PC upgrades along the way.
@TrueBasicGamer
@TrueBasicGamer 11 ай бұрын
4:53 I had a heart attack
@rossmclaughlin7158
@rossmclaughlin7158 Жыл бұрын
Personally I bought the 6800 XT; it was cheaper than the 3070 Ti, so I'm glad I didn't stretch my budget for RTX.
@sc337
@sc337 Жыл бұрын
Interesting topic. But is it applicable to all tiers of GPUs or only mid to high tiers? 8GB is still plenty sufficient for lower-tier GPUs, right?
@SapphireThunder
@SapphireThunder Жыл бұрын
The VRAM discussion has repeated itself since before the GeForce 200 series. ATI/AMD cards have almost always lasted longer thanks to generally having larger frame buffers than Nvidia's competing cards. And the latest VRAM discussion just shows that people have not learned anything from history. In an age of huge texture packs and high resolutions, it's becoming an even bigger concern. There's a reason Nvidia's Pascal is currently as long-lived as it is, and still going: good frame buffer and bus configurations, aside from the GTX 1060 3GB and the 1050's 2GB. Most of the RTX 20, 30 and 40 series won't come anywhere near the usable lifespan of the 10 series.
@ians_big_fat_cock5913
@ians_big_fat_cock5913 Жыл бұрын
I remember when NVIDIA was ahead of the curve with Fermi, granted it was a flagship chip and ran very hot. The 5850 and 5870 were 1GB while the GTX 480 and 470 were 1.5GB and 1.2GB.
@dd22koopaesverde58
@dd22koopaesverde58 Жыл бұрын
Yes, my brother has the 1050 Ti and new games like Sons of the Forest run at 30+ fps at 1080p low with a bit of stuttering. I have an RX 470, but the R5 1600AF bottlenecks it to a 35fps average.
@arenzricodexd4409
@arenzricodexd4409 Жыл бұрын
Because mostly it's a concern for people who like to max out settings, and those people are in the minority.
@SapphireThunder
@SapphireThunder Жыл бұрын
@Rute Fernandes 2080TI is much faster than 1080TI. And while 11gb is fine enough for it, you can argue it could use a bit more. It can run games at the kind of settings where you will reach above 11Gb use. Also note that I said MOST, not *all*. Also, 2080TI is FAR from being most hated gpu ever. GTX 480 was far more hated. There's many others too, like 780TI wasn't particularly well received either.
@shutitfukface
@shutitfukface Жыл бұрын
imagine being a loser like yourself and posting your crap in every video about nvidia/AMD and you don't seem smart by putting ATI beside AMD, AMD bought them and ATI no longer exists stupid! I am on disability and don't work so I have an excuse to be a loser on here all day commenting... what's your excuse?
@GillespieGaming
@GillespieGaming Жыл бұрын
My rx 7900xt was using 18gb of vram whilst playing the last of us lol, I was playing for over 2hrs tbf and at 4k maxed with FSR on but still...
@zDToddy
@zDToddy Жыл бұрын
Ironically you can't enable Ray tracing on the RTX card anymore or else you get a fraction of the performance of the competition.
@WXSTANG
@WXSTANG Жыл бұрын
Oddly, I switched from a Vega 64 to a 3070 and started to notice hitching. Is it the memory size, or something else? My Vega 64 had more consistent frame times.
@Patrick-tw7nr
@Patrick-tw7nr Жыл бұрын
I bought my 3070 for $600 back in May 2021, and I knew when I bought it that I would be upgrading in 2-3 years, because even then I knew VRAM would be an issue, but it was good enough for me to play the games I wanted at 1440p 60 with RT. I'm not buying any new games coming out this year or next; I've got a PS5 and a Series X if I think a game will hit VRAM limits, like Hogwarts Legacy and the RE4 remake. So far so good. I'll be doing a new build when the 5000 series and RDNA 4 launch, and won't get any GPU unless it has at least 20 GB of VRAM.
@oliversmith2129
@oliversmith2129 Жыл бұрын
To be fair, these aren't the top line product anyway so it's kinda weird to expect Ultra settings 60 FPS gaming in future titles. You can play with Medium/ High preset easily, some poorly optimized titles can be a pain though.
@Vis117
@Vis117 Жыл бұрын
@@oliversmith2129 I don't think that's fair. No reason you can't enjoy high settings on a mid-range card. You just can't do it at 120fps. The 3070 is strong enough for high settings; it's purely the VRAM that's causing the issue. And a direct competitor can do it no problem.
@oliversmith2129
@oliversmith2129 Жыл бұрын
@@Vis117 In an ideal world yes, for old timers like me, we buy a value card of the last generation and use it for a few years at medium/ high settings. I can afford 4090 but I don't think it's worth it, there are better uses for my money
@Vis117
@Vis117 Жыл бұрын
@@oliversmith2129 I agree with you on value being a better goal than chasing top tier graphics. But that is a separate issue to what we are talking about here. Sure you can turn down the settings on a $500 3070 and make it work, or you can do whatever settings you prefer on a 6800 for the same price. Purely because of VRAM. Nvidia has gimped their own card, when it’s capable of more. That is the fair comparison.
@oliversmith2129
@oliversmith2129 Жыл бұрын
@@Vis117 I agree with you, I just don't have the energy for outrage, everyone is so angry these days. They have their reasons and the companies are practically robbers but what can you do? Me, I buy older value cards, the rat race is not worth it.
@ragingpapist103
@ragingpapist103 Жыл бұрын
Will the 4090 age like the 1080ti? Or will we ever see a product like that again?
@The23rdGamer
@The23rdGamer Жыл бұрын
This is what I'm wondering as well.
@johnsmith2956
@johnsmith2956 Жыл бұрын
I'm thinking about getting a 4070 because of DLSS, but I'm unsure, because it's got 12GB. Better to go with AMD 16GB cards?
@DXHatakeKakashi
@DXHatakeKakashi Жыл бұрын
yup, it's more futureproof since the thirst for more VRAM in games has started.....
@EvL--_--
@EvL--_-- Жыл бұрын
It depends a lot on the resolution you're gonna be using it at, the VRAM difference is huge between 1080p/2k/4k...
@Icureditwithmybrain
@Icureditwithmybrain Жыл бұрын
I've been PC gaming on Nvidia-only GPUs for 20 years, and I plan on getting an AMD 7900 XT when Starfield is released. I want 20GB of VRAM but I don't want to pay more than $800 for it.
@srimasis
@srimasis Жыл бұрын
I strongly recommend spending a few hundred more and going for the 7900 XT or 7900 XTX.
@lefteris19
@lefteris19 Жыл бұрын
Try to get a 3090 instead. More power-hungry, but 24GB speaks for itself.
@xchillkillx
@xchillkillx Жыл бұрын
Will this be the same with the rtx 4070 12GB for 1440p in 2025?
@Vis117
@Vis117 Жыл бұрын
No one can tell you for certain, but I think the evidence suggests that yes, it will likely be a similar scenario. Especially if you use ray tracing.
@BazinGarrey
@BazinGarrey Жыл бұрын
This is why almost every modern game needs "texture streaming" and "VRAM limit" options.
@IchiroSakamoto
@IchiroSakamoto Жыл бұрын
There is a VRAM limit; it's just that most AAA games are above 8GB. TLOU literally tells you how much VRAM the game will use. Without the VRAM limit the games would crash because of memory leaks.
@baumstamp5989
@baumstamp5989 Жыл бұрын
There's no triple-A game in the last 20 years that doesn't have a texture quality slider, but people want the crispest textures... that's why every modern GPU needs a proper amount of VRAM. Like, why even use ray tracing if your textures look like a blurry mess because you ran out over a 1-2GB difference in VRAM?
@emilklingberg1623
@emilklingberg1623 Жыл бұрын
They all have texture streaming, but when all the textures in view can't fit on your GPU, it doesn't matter how seamlessly they load. And a VRAM limit is usually there in the form of texture quality, screen resolution, AA settings and model detail. If you build a GPU with less VRAM than a console, then you know full well that it will fail very, very soon.
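A toy sketch of why streaming can't rescue an over-budget frame: a simple LRU residency cache keeps evicting and re-loading the same textures every frame once the visible working set exceeds capacity. All names and sizes here are made up for illustration:

```python
# Toy illustration: an LRU texture residency cache thrashes forever
# when one frame's visible working set (10 GB) exceeds VRAM (8 GB).
from collections import OrderedDict

class TextureCache:
    def __init__(self, capacity_mb: int):
        self.capacity = capacity_mb
        self.used = 0
        self.resident = OrderedDict()  # texture name -> size in MB
        self.evictions = 0

    def request(self, name: str, size_mb: int) -> None:
        if name in self.resident:      # already in VRAM: mark as recently used
            self.resident.move_to_end(name)
            return
        # Make room by evicting least-recently-used textures.
        while self.used + size_mb > self.capacity and self.resident:
            _, evicted_mb = self.resident.popitem(last=False)
            self.used -= evicted_mb
            self.evictions += 1        # each eviction is a potential hitch or blurry texture
        self.resident[name] = size_mb
        self.used += size_mb

cache = TextureCache(capacity_mb=8192)
frame_working_set = [(f"tex{i}", 512) for i in range(20)]  # 20 x 512 MB = 10 GB in view
for _ in range(3):                     # render 3 frames of the same view
    for name, size in frame_working_set:
        cache.request(name, size)
print(cache.evictions)                 # keeps thrashing every frame instead of settling to zero
```

With a working set that fits (say 16 textures, 8 GB), the eviction count stops growing after the first frame; over budget, every frame churns the whole cache, which is the stutter and texture pop-in people see on 8 GB cards.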
@awen4784
@awen4784 Жыл бұрын
It's already there; it's just bigger than 8 GB.
@Trisstan20
@Trisstan20 Жыл бұрын
This is why every modern game should require more than 10GB of VRAM: 8GB was already standard back in 2016 (the RX 480 with 8GB), and we need to move forward.
@BrownStain_Silver
@BrownStain_Silver Жыл бұрын
how did we progress so quickly for 1080p to need more than 8GB of VRAM?
@zodwraith5745
@zodwraith5745 Жыл бұрын
While this is definitely an issue, I have to say it's getting at least a little blown out of proportion. Tinkering in Hogwarts and TLoU with my 3080ti to see how big this problem _really_ is I was easily able to get the vram usage below 8gb without completely destroying visuals. Installing the latest AAA game and blindly maxing settings is _always_ a bad idea unless you regularly spend 4 figures on your GPU. All these games have ridiculous sliders that go FAR beyond diminishing returns and Tim's excellent RDR2 tuning guide proved that in spades. The 3070 and 3080 stand out because they're too powerful for their buffer capacity, but does that make them completely useless? Absolutely not, and that's why this wave of videos running rampant this week needs to be taken with a grain of salt. While I wouldn't suggest buying one of these _today,_ you don't need to replace your GPU if you own one. Simply curb your expectations and spend 5 minutes that you SHOULD be doing anyways to tune your settings. Blindly setting everything to Ultra and jumping in like all the reviews show is _exactly_ what Nvidia and AMD want you to do so you think you NEED the latest and greatest GPU no matter the cost. I feel what Steve should have done in the recent 3070 vs 6800 video was tune settings on the 3070 to avoid the buffer overrun to show people what level of visuals 3070 owners have to put up with to continue using their cards. Letting it continue to overrun the limitations to prove a point makes it look like more of a hit piece than an informative video of what owners _should_ be doing to mitigate the problem. Not to mention cherry picking the low volume 6800 that was more expensive at launch and only recently became more affordable. That comparison should have been with the 6700xt, not the notably more powerful 6800. While I get the comparison if you're buying today, revisits should be focused on _current_ owners as no one should be buying new 30 series at their still inflated prices. 
On the used market the 3070 is much closer to the 6700xt than the 6800. While I agree with holding Nvidia's feet to the fire on shitty decisions and tactics, that video came across as more of a "HAHA! I told you not to buy this dumbass!" Especially when that initial 3070 review was rather glowing. Repeatedly pointing out it's weakness now does nothing to serve those that bought these cards when it was the only thing they could afford in the nightmare market that was these last 2 years.
@tech-wombat
@tech-wombat Жыл бұрын
I definitely agree with everything you said, particularly on this 8GB discussion, when there was not much else available and the 6800 was a lot worse in RT, which was a criticism in itself. I also think the point made here is a bad case of fearmongering for anyone owning an 8GB card. I was missing the point of how to circumvent this issue, i.e. by lowering texture quality, disabling RT or similar things, which everyone owning that kind of card would do. In fact you can find a lot of videos showing Hogwarts running fluently with 8GB.
@zodwraith5745
@zodwraith5745 Жыл бұрын
@@tech-wombat Don't get me wrong. There's absolutely a problem with Nvidia fucking us on VRAM. I'm just trying to rein in the bandwagon making a mountain out of a molehill for clicks. My kids' 1660 super plays TLOU perfectly fine at 1080p as well as 1440 and the wife's 1080ti plays it perfectly fine at 1440 and 4k. (Thank you AMD for FSR❤) You just have to be realistic with your settings if you're not willing to spend 4 figures for a 4090. But don't think for a second that magically forgives Nvidia fucking us on VRAM. The cost to bump 8gb to 12gb and stay within their bus is negligible. They're _definitely_ screwing you and you _should_ be pissed about it. It's just not as big of a deal as others that have alternative agendas make it out to be.
@elchurcho2704
@elchurcho2704 Жыл бұрын
Bought a 6750xt with 12Gb VRAM 2 months ago. Thank god...🙌🏻
@hotshot8365
@hotshot8365 Жыл бұрын
This would be acceptable if graphics cards were still the price they used to be, when lots of people upgraded fairly frequently (every two years or so). Now that prices have gone to the moon, people understandably want and need a longer lifespan, so it's a big problem. Nvidia can't have their cake and eat it: time to put decent amounts of RAM on the cards or lose market share.
@mattk6827
@mattk6827 Жыл бұрын
If our money went as far as it did 2yrs ago people would be happier with everything.
@Isaax
@Isaax Жыл бұрын
I knew I was shafted with the 10 gigs on my RTX 3080 but I didn't think I'd be shafted THIS soon
@luminous2585
@luminous2585 Жыл бұрын
At the end of last year I was deciding between a 3070 Ti and a 6800 XT at a similar price. Even then I underestimated VRAM, but my train of thought was that with the 6800 XT I should always be able to max out texture resolution and probably even install the largest community-made texture packs. Shortly after that was when I noticed last-gen NVIDIA GPUs crapping their pants in new games (though I think the main fault lies with the optimization of those games).
@SapphireThunder
@SapphireThunder Жыл бұрын
Not only that, but the 6800 XT is just a plain faster GPU in nearly every game.
@anarki_in5680
@anarki_in5680 Жыл бұрын
I remember debating between a 2060 6GB and a 2070 Max-Q 8GB laptop. Possibly 10% more cost for roughly 10% more performance (Time Spy Graphics), and 2 GB more VRAM. Most reviewers (HUB, Jarrods Tech etc.) reviewed the 2070MQ unfavourably and suggested the 2060, even suggesting last year's model on sale (1070 MQ). Can't help but think how I'm much better off with the better DX12 performance and DLSS vs. 1070MQ now, as well as 8 gigs vs. 6 because 6GB will make a 2060 redundant a lot sooner. Plus, I'm able to OC my 2070MQ to desktop 2060 level performance. As far as laptops go, suggest people to just spend on the GPU if you can afford it and price/frame justifies the decision - it goes a long way. 4 years later I was still able to enjoy my first playthrough of Hogwarts Legacy, even on my external 1440p monitor with DLSS.
@nouryy
@nouryy Жыл бұрын
What about the 3080 10GB? I'm interested to see how it fares in 2023.
@MicaelAzevedo
@MicaelAzevedo Жыл бұрын
Piece of shit. Just sold mine.
@cezarstefanseghjucan
@cezarstefanseghjucan Жыл бұрын
RTX 3080Ti or above. RX 7900 or above.
@TrentCrimm
@TrentCrimm Жыл бұрын
@@cezarstefanseghjucan wouldn't A RX 6800 or above be sufficient too? Those come with 16 GB VRAM. I'm asking because I'm looking to upgrade my 3060 Ti (gaming at 1440p 165 Hz).
@mbrj3idi830
@mbrj3idi830 Жыл бұрын
@@TrentCrimm Don't upgrade within the same generation (you won't get the huge performance difference you expect). Better to skip one generation (from 3xxx to 5xxx). But if you are going to upgrade within the next gen, go with a higher class (xx60 to xx70). The problem is the 4070 Ti came with only 12GB on a 192-bit bus, so better to go with a 4080 or 7900 XT. Keep using your 3060 Ti and save money for it; they will be cheaper later. Unless you can sell your 3060 Ti and buy an RX 6800 (paying only a small difference), which is also an acceptable solution.
@KarlDag
@KarlDag Жыл бұрын
@@MicaelAzevedo Same. Sold mine, found a RTX3090 used for barely more money. lol.
@marcm.
@marcm. Жыл бұрын
The issue is with the laptop versions of these GPUs. The best you could get was 16 GB of VRAM with a 3080 or 3080 Ti, and even the 3080 came in an 8 GB flavor too. There were no real options for anything over 8 GB if you wanted an affordable class of laptop.
@nameinvalid69
@nameinvalid69 Жыл бұрын
People who dismiss the VRAM issue likely have wallets thick enough that they can afford to upgrade at any second. Meanwhile, if I buy a GPU today, I expect it to last until 2027 at the very least... and plenty of games were already hitting the 8GB mark years ago, which is why I always wondered why Nvidia cheaps out on their high-performing cards.
@nimrodery
@nimrodery Жыл бұрын
I don't feel compelled to play AAA cash-grab jank titles that aren't ready for release, so my VRAM needs are different. Why are you still ingesting that pablum?
@BalasielVOD
@BalasielVOD Жыл бұрын
I just built my first PC a month ago and still need a graphics card; my friend lent me his old one (GTX 1070) for now. So what about 12 GB cards? Will they run into the same issue 2-3 years down the line, and should I just go straight for 16 GB?
@-aleeke-2526
@-aleeke-2526 Жыл бұрын
Wanted a 6800 but I couldn't find one at a fair price at the time, so I got a 3060 Ti since the price was right.. 😞
@user78405
@user78405 Жыл бұрын
One thing all reviewers forget is that they aren't the computer technicians troubleshooting the problems. Most YouTubers didn't see VRAM as an issue until requirements went past 16GB of system RAM, which never happened before: 16GB was comfortable and no game surpassed it on any system, until Sony ported console games to PC. Those ports came from dev kits that hadn't been touched or tested by the game testers the industry depends on for bug smashing and fixes; that has changed entirely, and now customers are the bug testers, at their own expense rather than the company's. The irony is that the PS5 dev kits have 24GB of shared VRAM and system RAM, which is why games are like this now in Sony versions versus Microsoft versions: Microsoft doesn't use over 16GB to make games due to code differences, and Sony's console operating system isn't optimized for PC, since its libraries aren't the same as Microsoft's DirectX platform.
@nakazo9929
@nakazo9929 Жыл бұрын
Yeah, I know that 8GB isn't enough for some games, but with 8GB of VRAM you'll be fine with 99.99% of new releases for the next few years. What most people don't understand is that ray tracing isn't really necessary, and you can always decrease some settings while maintaining very good overall quality. I have a friend who is still using a 980 and is playing recent games at 1080p 60 fps with medium to high settings. If someone thinks that a 3070 is obsolete and needs to be thrown away for a newer card, that's just stupid.
@lfsracer79
@lfsracer79 Жыл бұрын
Maybe 79.99%
@johnnyvonjoe
@johnnyvonjoe Жыл бұрын
I hear you, but the counterargument would be: if someone dropped five, six, seven, eight hundred dollars on this card during the pandemic, they're not going to want to give up features like ray tracing, or high settings in certain games for that matter.
@toddhermit · a year ago
These tech channels can only exist if new products are being purchased. We saw Hardware Unboxed's true colors with their DDR5 testing. This is a business, and you always need to do your own research and come to your own conclusions.
@LukeBroadhurstfilm · a year ago
I'd love to see how the Arc A770 16GB compares to its 8GB counterpart.
@hofnaerrchen · a year ago
From what I've seen so far, the hardware requirements of contemporary (recently released) games don't really come with significantly more eye candy. If I had to guess: devs are getting lazier about optimization and simply hope people will buy their products along with new hardware that might be able to support them. It's probably best I keep doing what I've done for the last few years: buy games that were released 2-3 years before my current GPU. That also comes with another benefit: fewer bugs and a smoother overall experience.
@ShamirAbdul1 · a year ago
Uh-huh, so blame the devs instead of Nvidia... I like your thinking.
@thunderp1719 · a year ago
Got my new Colorful 3060 Ti and now it's all VRAM talk. Does that mean it's already done for?
@julianB93 · a year ago
I'm thinking about getting a 7900 XT as a replacement for my 1080 Ti since it has 20GB of VRAM, but it's probably more of an upsold 7800 XT and I'm not happy about that. Price is also a reason I'd like to skip even this generation, to see if companies come back down to earth a bit.
@Gargantura · a year ago
True. IMO 16GB will be mid-range and 12GB will be the standard in the future.
@lilpain1997 · a year ago
@@Gargantura This is 100% going to happen thanks to consoles having around 12GB of their RAM usable.
@meyatetana2973 · a year ago
I don't understand how anyone was actually sold on 8GB of VRAM when games have been using more and more over the years. Ten years ago 8GB was fine, not the best but okay; it's 2023 now, not 2013, and anything under 10GB of VRAM is woefully underperforming. No matter what number they slap onto it, it's still shit.
@PinHeadSupliciumwtf · a year ago
Wait for the actual 7800 XT; they'll probably be more refined than the Navi 31 cards. Or by then the price of the 7900 XT will have dropped.
@shutitfukface · a year ago
stick to consoles loser.
@ocha-time · a year ago
4:48 xd Re: the Diablo 4 beta VRAM issues: yes. I had a 6600 that COULD output 120fps, but 8GB of VRAM was over cap and it'd just stick permanently at 20-30fps, dipping to 14. A 6700 XT (12GB of VRAM!) instantly solved the issue. Raw performance wasn't even that much better, but when you're not paging, it turns out performance stays good. Who knew? Oddly this didn't seem to be an issue for Nvidia, from what I could glean from the forums. The game actually lists a GTX 970 or something as a requirement, so surely there should be VRAM issues there as well, but I never heard about them. Also the story cutscenes were capped to around 32fps; I wasn't sure if that was the engine or just card performance. It was smooth, just a low capped framerate.
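The paging behaviour described above can be put into rough numbers. Here is a back-of-the-envelope sketch in Python; all figures are illustrative assumptions, not Diablo 4's actual asset budget:

```python
# Why a card can compute 120fps yet crawl at 20-30fps once assets spill
# past its 8 GB of VRAM. Numbers below are illustrative, not game-specific.

def texture_mib(width, height, bytes_per_texel=4, mipmaps=True):
    """Size of one uncompressed RGBA8 texture in MiB (mip chain adds ~1/3)."""
    base = width * height * bytes_per_texel
    if mipmaps:
        base = base * 4 // 3
    return base / 2**20

per_4k_texture = texture_mib(4096, 4096)            # roughly 85 MiB each
budget_gib = 8
textures_that_fit = int(budget_gib * 1024 // per_4k_texture)
print(f"one 4K RGBA8 texture: {per_4k_texture:.1f} MiB")
print(f"textures fitting in {budget_gib} GiB: {textures_that_fit}")
# Anything beyond that gets paged over PCIe (tens of GB/s) instead of read
# from on-board GDDR6 (hundreds of GB/s), which is why frame times collapse.
```

Real games compress textures and stream aggressively, so the true ceiling is higher, but the cliff once the working set exceeds VRAM is exactly the 120fps-to-20fps drop described above.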
@GamerSg84 · a year ago
Why would Nvidia give you more VRAM when there are so many people willing to buy 8GB cards for $400-$700? Not only do they make more profit by saving maybe $20 on VRAM, but their customers will be forced to upgrade again next cycle. Not to mention the legions of fools who defend 8GB with "allocation is not usage" arguments and tell potential buyers that 8GB is enough for 4K! Repeat business + higher margins $$$
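The "allocation is not usage" argument mentioned here can be illustrated with a toy sketch. The class and numbers below are entirely made up for illustration; real engines use far more sophisticated residency management:

```python
# Toy model of the "allocation is not usage" argument: engines often grab a
# big VRAM pool up front and fill it opportunistically with cached assets,
# while monitoring tools report the whole reservation. Hypothetical numbers.

class VramPool:
    def __init__(self, reserved_mib):
        self.reserved = reserved_mib   # what overlay tools typically report
        self.resident = 0              # what the current scene actually needs

    def stream_in(self, asset_mib):
        # Streamed assets only count against the existing reservation.
        self.resident = min(self.reserved, self.resident + asset_mib)

pool = VramPool(reserved_mib=7500)     # "the 8 GB card looks nearly full"
for asset in (900, 600, 1200):         # assets the scene genuinely requires
    pool.stream_in(asset)

print(f"reported/allocated: {pool.reserved} MiB")
print(f"actively used:      {pool.resident} MiB")
# The gap is why "it allocates 7.5 GB" doesn't prove a game needs 7.5 GB.
# But the converse also holds: once real usage exceeds VRAM, as in the
# titles benchmarked in the video, performance genuinely tanks.
```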
@JayzBeerz · a year ago
What about my 1660 Ti?
@Quint1975 · a year ago
The most frustrating thing is that the average gamer is not going to check YT or look online for GPU issues; they'll trust that the latest tech from their GPU supplier of choice will be fine for 2-4 years of issue-free gaming.
@randy206 · a year ago
Well, the 3070 was fine for 2-4 years. It's just that it was closer to the 2 than the 4. I'm wondering how long my 3080 Ti will be good for.
@lefteris19 · a year ago
​@@randy206 it'll be good for another 2-3 years at 1440p even with ultra textures
@curie1420 · a year ago
​@@lefteris19 no
@Quint1975 · a year ago
@@randy206 And it seems Nvidia wants to push a yearly upgrade cycle on the GPU gaming market, judging by the VRAM they're shipping. I've already decided to go AMD on my next GPU upgrade, which, as I run a 3080 at the moment, will likely be sooner rather than later. I'm fed up with Nvidia's obscene greed.
@Vis117 · a year ago
@@randy206 I think it's reasonable to assume that another 2 years will be mostly fine. Also depends on how much you use ray tracing.
@siyzerix · a year ago
Excellent point, HUB. Wanna know what's crazier? People defended the 4GB of VRAM on the RTX 3050/Ti mobile GPUs. Mind you, the mobile 3050 Ti has the same specs as the desktop RTX 3050 but half the VRAM, and it performs similarly, so it is super VRAM-limited. Even my measly GTX 1650 G6 runs out of VRAM in several titles: HQ texture packs, mods, newer games, etc. all push my 4GB of VRAM to its limit. Even my 1050 4GB had VRAM issues in some titles like No Man's Sky and modded Skyrim SE. So you can imagine how bad this VRAM situation is on cards 3-4x faster.
@szymoncieplak21 · a year ago
Is "only" 8GB of VRAM in my newly acquired RX 6650 XT going to be an issue for me (in terms of quickly aging out and limiting my performance), considering that I play at only 1080p with absolutely no interest in ray tracing, and a constant 60 fps on ultra/high presets is the only thing I care about?
@tringuyen7519 · a year ago
Yes, it will be an issue.
@diegocm8636 · a year ago
Looks like you think just like me. I have a 6600, and I believe we'll need to drop some settings (textures) to medium in some AAA games a few years from now. But with no RT and skipping needless ultra quality, I think we'll play 99% of games with no worries. Right now I'm playing new games on ultra, no problem. Let's see if someone else thinks differently.
@Vis117 · a year ago
Well it’s clearly already an issue in some games based on their recent video. But at 1080p I think you’ll be in much better shape for the most part. Especially if you use FSR.
@27Zangle · a year ago
I think you'll be fine for a couple of years, but some newer games will require you to turn some settings down. I've thought about doing a budget build to save some cash since I don't game much, and that card was at the top of the list.
@The23rdGamer · a year ago
If you're okay with turning down some settings in certain more demanding new games, then I think you'll still have an enjoyable experience if your performance target is 1080p 60FPS with no ray tracing. FSR could prove useful too if you decide to experiment with it. There will still be many games you can play on 8GB without needing to compromise on settings (there's a lifetime's worth of games released before 2020)! Keep in mind that your use case may make your hardware needs different from those of people online :)
@Divinicus1er · a year ago
My guess as to why people dismissed the VRAM problem: 1) the MSRP was low, so they wanted to be happy; 2) they were too young to remember the last VRAM boom (at the end of the PS3's life, 2014-2016, we went from a 1GB to an 8GB requirement really quickly).
@groanhammer282 · a year ago
I'm looking to build a new PC this year to upgrade my 4770K rig, but it seems I'll have to carry my 1070 over for a while until a reasonable product comes along. I'm not paying $1k for something that's obsolete in three years. I bought the 1070 used for $200 in '18 and my rig still plays most games just fine when OC'd.
@The23rdGamer · a year ago
If VRAM is a concern, you could get an AMD product or consider a used 3090 or something.
@waway37 · a year ago
There is a 3070 16GB… the name is RTX A4000 😂
@valenrn8657 · a year ago
With ECC GDDR6.
@ronzerker390 · a year ago
I can't for the life of me remember what my first PC consisted of back in 2016, but I do remember the 1050, which I upgraded to a 1060 OC months later. Let me tell you, I ran that card like there was no tomorrow, overclocking the hell out of it, and now it's dead. I sure got my money's worth though. I upgraded to a 5700 XT and just recently bought a 6800 XT a couple weeks ago. AMD hooked me with the 5700 XT and the 5600X chip I was using before. Now I'm gaming on a 5800X3D with a 6800 XT. Solid and reliable, just how AMD does it, at least for me. Also, I don't buy current gen; I usually wait two or more years unless I really need a new part.
@damazywlodarczyk · a year ago
I've literally been playing all my games at 60 fps on a GTX 1060 until now. It's not dead.
@chrisvig123 · a year ago
I'd be nervous about anything 12GB and under… AMD just made huge gains with this info 😮
@nemesis8131 · a year ago
AMD sucks in other ways. They can't match the 4080 without drawing a massive amount of power (7900 XTX). I'll skip this gen and keep using my 1070 until the 5080/8900 comes out.
@adamek9750 · a year ago
@@nemesis8131 Let's not pretend Nvidia had good power consumption in the 3000 series. 3090 Ti????
@season161 · a year ago
12GB is fine; in their latest blog post AMD still recommends 12GB for 1440p in newer games.
@LukeBroadhurstfilm · a year ago
What's worse about the 3070 Ti is that Nvidia is still trying to sell it. When the 4070 Ti was announced, they highly recommended the 3070 Ti for lower budgets. Looking back at the 3070 Ti's launch coverage, the predictions of how it would be performing two years down the track now seem pretty fair.
@isanvicente1974 · a year ago
Taking into account the PS5, Xbox Series X and Unreal 5, you don't have to be a genius to know that you'll need more than 12GB of VRAM in no time... even the most radical Nvidia fanboy can't deny it. For 1440p, 12GB is going to be the minimum requirement...
@thetruestar6348 · a year ago
That's why I got the 3080 12GB, mostly for video editing stuff but also gaming.
@benjib2691 · a year ago
This. I was thinking about exactly that last year when I bought my current 6750 XT. Games are/will be made first and foremost for the PS5 and Xbox Series X. These consoles have 16 GB of RAM/VRAM, and the Series X memory configuration gives us an idea of how much of the 16 GB will be used as VRAM. The Series X has 10 GB wired to the memory controller at higher bandwidth (560 GB/s) and 6 GB wired at lower bandwidth (336 GB/s). Thus, within the 16 GB pool, Microsoft's engineers expected the VRAM usage of future games to be more or less 10 GB, which means developers are building their games and engines to target that value as a baseline. And of course they can go further than 10, up to 12 GB I believe, which leaves 4 GB for the OS and CPU-related data (which seems reasonable and likely). From that I concluded that 10 to 12 GB will be the new baseline VRAM requirement on PC for games made for this console generation (past the cross-generation phase). Seems like I wasn't far off a year later.
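The two bandwidth figures quoted above can be sanity-checked with the standard GDDR6 peak-bandwidth formula: at the Series X's published 14 Gbps per-pin data rate, only the effective bus width (320-bit for the 10 GB region, 192-bit for the 6 GB region) differs between the two memory pools. A quick check:

```python
# Verifying the Series X split-memory bandwidth numbers (560 and 336 GB/s)
# from the per-pin data rate and the effective bus width of each region.

def gddr6_bandwidth_gbs(bus_width_bits, data_rate_gbps=14):
    """Peak bandwidth in GB/s: pins * Gbps per pin / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

fast_10gb = gddr6_bandwidth_gbs(320)   # 10 GB region, full 320-bit bus
slow_6gb = gddr6_bandwidth_gbs(192)    # 6 GB region, 192-bit effective bus
print(f"10 GB 'GPU-optimal' region: {fast_10gb:.0f} GB/s")
print(f"6 GB 'standard' region:     {slow_6gb:.0f} GB/s")
```

Both results match Microsoft's spec sheet, which supports the comment's reading that the fast 10 GB region is the portion engineers sized for GPU-facing data.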