
RTX 3080 vs RX 6800 XT! | Same FAST Performance, Is Nvidia or AMD Better??

170,107 views

Vex

Comments: 813
@vextakes · 1 year ago
🚨🚨 NVIDIA SHILL ALERT 🚨🚨
@soupdrinker72 · 1 year ago
real
@GewelReal · 1 year ago
I love Nvidia! Daddy Jensen make me into one of your jackets
@phrog.4809 · 1 year ago
I decided to buy a 3080 instead of a 6800 XT a few days ago to upgrade from my 6600 XT, the card just shipped too, and now you post this video 💀. The only reason I'm still happy with my decision though is that it was a 12GB card and I got it for $390. Update: I got it and I'm happy with it
@JasonEllingsworth · 1 year ago
There have been videos for years showing the 6800xt as the better card. It also overclocks far better. Mine is as fast as a 3090, using air cooling and amd's stock software, which has improved a lot in the past 3 years. No driver issues either.
@lucidnonsense942 · 1 year ago
CUDA workloads aren't a thing you can utilize well with a 3080 - they need a lot of RAM, that's why the professional cards have 24GB. 10GB makes that feature irrelevant - the low memory is a way to force prosumer buyers into paying the Quadro tax. The 3080 is useless in pro apps; you get better performance with an A4000 - a 16GB Quadro card that uses the 3070 core.
@BogdanM116 · 1 year ago
I'm coming back here in 3 days with snacks to read the comments.
@thinotmandresy · 1 year ago
lmfao 😂
@marcgallant5270 · 1 year ago
"muh dLsS and Rtxsss" vs "yeah but that 16 gigawatts of Vruuuuum!!!!"
@AbbasDalal1000 · 1 year ago
Bruh please ignite some fire and go
@ZeppelinEXAhmed · 1 year ago
Coconuts are good for snacks.
@muhammadamaar7936 · 1 year ago
Ayy
@chrisswann7578 · 1 year ago
I just bought a used RX 6800 non-XT for 350 usd. I came from an RX 6600 XT and I cannot imagine using anything faster. I'll definitely keep it for as long as I can. And to add, I never see my card go above 200w, and that is with a slight tuning which makes it just that much faster. I love being able to max out settings and not run out of vram.
@hiriotapa1983 · 1 year ago
Bought a 6800 XT Sapphire Nitro+ for 355 usd....
@chrisswann7578 · 1 year ago
@@hiriotapa1983 Awesome! What a killer deal!
@Adrian-tl5ue · 1 year ago
what processor do you have and what psu? I have an RTX 2070 Super and want to upgrade to an RX 6800 or 6800 XT with my R7 5700X, and I only have a 650W 80+ Bronze psu. I think the 2070 Super draws about the same wattage as a 6800.
@kilroy5680 · 1 year ago
@@Adrian-tl5ue it's good enough
@TheNewmen10 · 9 months ago
@@Adrian-tl5ue Yes, it works. I use a 650 W 80+ Bronze too!
@speng5821 · 1 year ago
16GB vs 10GB is no contest. Both of these cards are the lowest viable 4k options imo, and at 4k 10GB is going to age like milk (even with FSR2/DLSS quality lowering the vram usage - which you'll probably be using, as 4k native requires serious horsepower). In blind tests comparing FSR2 and DLSS at 4k quality, 99% of people won't be able to spot the slightly better image quality DLSS offers. The differences become more exaggerated using more aggressive upscaling at lower resolutions. If you don't stream or do any productivity and have a 4k monitor, then the 6800 XT just seems like the better option. £400 used and lower power, which is a factor in the UK where energy is expensive.
@speng5821 · 1 year ago
As for RT- I've always been about optimising graphics settings. Even before RT, I'd be turning shadows down a few notches from ultra for more performance for very little visual difference. Even on a 3080, RT is expensive and I'd much rather target 120FPS than have some eye candy.
@Ober1kenobi · 1 year ago
Texture sliders. So long as you keep the VRAM in check, it'll be faster, no?
@speng5821 · 1 year ago
@@Ober1kenobi but that's the issue - I don't want to have to think about keeping my vram in check. 4K textures are one of the best ways to improve the image; what's the point of playing at high resolution using low resolution textures? PC gaming and the pursuit of smooth frametimes is finicky enough without having to worry about vram as well.
@stangamer1151 · 1 year ago
Well, I can easily see the difference between DLSS Quality and FSR Quality at 4K rendered resolution. Even on my old 1080p screen, let alone 4K screen! DLSS provides much better AA quality and less ghosting. The overall image looks way smoother with DLSS. Plus, RTX cards also offer DLAA - the best currently available AA solution. And also DLDSR, which is an AI driven downscaler.
@thee-sportspantheon330 · 1 year ago
@@stangamer1151 Cope. I will go nom nom on my vram.
@chrispittmanComputing · 1 year ago
Make sure your 5900X is running a negative all-core offset in Curve Optimizer (start at negative 15), and if you're not running Samsung B-die, get a set of 3600 CL14 and really tighten down the timings. This will give you a noticeable uplift on your 5900X.
@therecoverer2481 · 1 year ago
hey quick question, my ram (GSkill Sniper X 3600 CL19 @1.35v) is detected as Samsung B-die in Thaiphoon Burner, but when i tried to tighten the timings even by just 1 it refuses to boot, even at 1.45v. The weird thing is i can lower the voltage to 1.25v when running xmp. Is it possible that i got trash B-die or am i doing something wrong? my cpu is an R5 5600
@Wabos123 · 1 year ago
@@therecoverer2481 It's the B-die. I had the same issue with my 5600X using Silicon Power ram from amazon. Ended up swapping to some Team Group T-Create 3600mhz CL18 and was able to undervolt and adjust timings.
@droptoasterintub297 · 1 year ago
@@therecoverer2481 3600 MT/s with C19? That's definitely not B-Die. Guaranteed B-Die bins would be the likes of 3200 C14, 3600 C14, 4000 C15, etc. C19 is incredibly loose even at 3600. You shouldn't trust Thaiphoon Burner as it is known to have false positives, and in this case, it is glaringly obvious this is one of those false positives. I almost would say those ICs could actually be Samsung C-Die as they are known to become unstable over 1.35v, no matter the timings. It would also explain the very loose primary timing(s). I'd get ahold of a 3600 C14 kit as this is the best sweet spot for performance on Ryzen. Getting kits over 3600 MT/s isn't beneficial as they aren't necessarily better bins, but pricier; and, you almost can never go over 1900 FCLK which is 3800 MT/s when synced 1:1. Some Zen 3 samples may not even be able to do 1900 FCLK and need to step down to 1866 or 1800 (3733 MT/s and 3600 MT/s respectively). A B-Die 3600 C14 kit should easily do 3800 at the same timings on stock voltage most of the time.
@rain8478 · 1 year ago
I was in this boat and grabbed a used 6800XT over a used 3080. There are two main reasons for this: 1. 16 GB VRAM, self explanatory. Yes I bought into the fearmongering, boohoo. 2. GDDR6X is an actual risk when buying ex-miner cards, GDDR6 not so much. This is really all there is to it. I've already had a 6700XT before so that kinda made it easy to know what to expect. Might be a little harder to risk going a different brand for others especially in used market. Though I don't think 3080 is cheaper because its valued less, I think it has more to do with just how many 3080's Nvidia produced and sold to miners. I just don't think there are as many 6800XT's out there total.
@person1745 · 1 year ago
G6 might not be as big a risk as G6X, but I think it really depends on the AIB rather than the card itself. Also, fast G6 can produce just as much heat as slower G6X, like on the 7900 XT. The reference 6700 XT for example has G6 but is known to run its memory close to 100 degrees. It's all about how good the cooler is.
@rain8478 · 1 year ago
@@person1745 well if you're buying an ex mining card known to run memory at 100c you're kinda asking for problems.
@molochi · 1 year ago
I think the fear mongering about 8gb is valid. 12gb 192-bit is probably enough, but I wanted 256-bit and that's 16gb.
@forog1 · 1 year ago
@@person1745 AMD OCs those GDDR6 cards from the factory, which is why they produce far more heat than they should. 20 Gbps seems out of spec to me for GDDR6 non-X. I think they did that out of desperation for more performance when they realized their RX 7900 series was not hitting target numbers at 16 or 18 Gbps (stock GDDR6). Plus, if I let my 7900 XTX clock its vram up to 20 Gbps on the desktop, it consumes like 80w idle, versus 10w idle with the vram clocked down... yeah, they pushed and overvolted these ram chips.
@rain8478 · 1 year ago
@@forog1 for what it's worth, you can undervolt the SoC with MPT to reduce power draw a little bit. It works on RDNA2 but I can't guarantee anything for RDNA3.
@RN1441 · 1 year ago
After seeing how ridiculous the prices were getting on the new generations of cards at the end of last year and start of 2023, I decided to grab one of these 10GB 3080's used for a deal. The crop of new games which exhaust its VRAM started showing up almost immediately after this, so I probably should have waited a few months :D Oh well.
@michaelilie1629 · 1 year ago
what about medium settings and dlss/fsr?
@princekatana8792 · 1 year ago
@@michaelilie1629 Imagine thinking about using medium settings with a last generation high end card. Pathetic.
@RN1441 · 1 year ago
@@michaelilie1629 I'm mostly focused on 1440 since that's my native resolution, so I'll just have to turn off eye candy until it hits the gsync window
@mato_s · 1 year ago
Same but i got a 3070 so im so screwed 💀
@stewenw4120 · 1 year ago
@@michaelilie1629 Yeah i guess he bought a 400-600 € / $ card to play on medium settings. Nice advice^^ The question is do you need more than ~80 fps in games that are requiring so much VRAM. I think 3080 should do the trick in the next 2 years. But since i don't care for RT and the other stuff NVidia offers i got me the 6800XT anyways.
@RobBCactive · 1 year ago
That's cool that you're using the RX 6800 XT for editing; that used to be Nvidia Pro territory. The fact is AMD have improved the software a lot, and CUDA support via ROCm is coming out too. The thing is Vex has taken the plunge late in the game; there were deals on the AMD cards long ago, so you'd have had the usage out of it instead of waiting for 3080 prices to settle in the used market. Starfield is nice but not relevant to everyone. Unfortunately a lot of people ignored MSRP reductions and lower prices, scared by all the FUD put out about drivers etc. That doesn't send the market leader the signals it requires to reduce its margins. Gamers bitching about prices isn't enough; AMD have to be able to justify investment into features. That requires sales when they are close, because development costs are a real thing.
@molochi · 1 year ago
I thought it was considerate of NV to price their 4080 so high. Gives the 7900 XTX a chance to make money for AMD as well. They really are gentlemen.
@RobBCactive · 1 year ago
@@molochi well if Navi31 had met its expected performance targets the 4080/4070Ti would have looked very stupid, weak and totally over-priced. Scott Herkelman intended to "kick Nvidia's ass". So the 7900xt was crushing the 4080 12GB, the xtx the 16GB while much cheaper. Only fixes to the driver caused severe performance drop and they haven't found a general mitigation. Radical changes in architecture are inherently risky, RDNA3 has disappointed. Targets met, Nvidia added features like fake frames & DLSS would not have been enough, they'd be forced to cut prices or cede market share. Nvidia were maintaining high prices because of their vast RTX 30 GPU stockpile. Now they're cutting production to the contractual minimum rather than sell them cheaper, they figure gamers will relent eventually and cough up. The big money is in AI at the moment, so they're hoping to max out Hopper.
@ronaldhunt7617 · 1 year ago
The emulated cuda is not nearly as good as actual CUDA, I hear AMD is going to do the same with Tensor cores as well. So not as good, but way better than nothing. If you own AMD already it is a huge win, if you are buying a new GPU and will be utilizing CUDA/Tensor then you are better off with Nvidia. Video editing and other productivity software results are going to depend mainly on the software you are using, Premier Pro favors Nvidia much more, some free video editing software may or may not be more of a contender.
@RobBCactive · 1 year ago
@@ronaldhunt7617 CUDA is a way to program a GPU; it takes source code and compiles it for an application. Calling it emulation just shows you're spreading FUD and confusing people. The AI boom is for large models and so-called deep learning, with huge demand for data center products. What you need for running neural nets versus training them is very different; laptops are coming out with AI acceleration in cooperation with MS, like Apple has had, without any Nvidia hardware, as an accelerator on the CPU die.
@ronaldhunt7617 · 1 year ago
@@RobBCactive Not sure what you are trying to get at here, Nvidia has dedicated CUDA cores that process independently making certain computations a lot faster. AMD does not have CUDA cores, instead they (just like in everything else) saw what Nvidia had done and added stream processors which are not the same, not as good. Just like raytracing cores, and now it seems like Tensor cores (for AI) but only for the 6000 and newer GPUs, not to mention upscaling and frame generation (which AMD does not have as of now) Monkey see, monkey do... but not at the same level.
@ballcandy1117 · 1 year ago
I grabbed a 6800 XT for $150 less than the cheapest 3080 on the used market, and 4070s are still over $300 more. Idk why anyone would spend more instead of going AMD. Ultimately, besides the great price, the 16gb vram was really what made me buy it.
@suzie9874 · 1 year ago
A couple weeks ago i swapped my 6800 XT for a 3080 (10gb), and the only reason i swapped with my nephew was for AI. While many say the AMD gpu can run Stable Diffusion, I could never get it to work. So i swapped with my nephew, and SD works flawlessly now. Plus the games i play run about the same on both gpus, so nothing lost there.
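For anyone in the same spot: before swapping hardware, it can be worth checking whether your Python environment even sees the GPU, since Stable Diffusion failures on AMD often come down to having the default CUDA build of PyTorch instead of the ROCm build. A minimal sketch, assuming PyTorch is installed (note that ROCm builds still report through the torch.cuda namespace):

```python
# Minimal sketch: check whether PyTorch can see a GPU for Stable Diffusion.
# Assumes PyTorch is installed: the CUDA build on NVIDIA, the ROCm build on AMD.
import torch

if torch.cuda.is_available():
    print(f"GPU found: {torch.cuda.get_device_name(0)}")
    # torch.version.hip is None on CUDA builds, set on ROCm builds
    backend = "ROCm (HIP)" if torch.version.hip else "CUDA"
    print(f"Backend: {backend}")
else:
    print("No usable GPU; Stable Diffusion would fall back to the CPU.")
```

If this prints "No usable GPU" on a Radeon card, reinstalling PyTorch from the ROCm wheel index is usually the fix, not new hardware.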
@xwar_88x30 · 1 year ago
Good choice my friend. The 6800 XT will age like fine wine and outperform the 3080 as time goes on; it even matches or beats the 4070, and with more vram it'll age very well. Nvidia cards get weaker as they get older while AMD GPUs just get better, as videos have proven. Nvidia have been trash since the 2000 series. I'm hoping AMD can knock Nvidia off their high horse a bit since they're taking the complete mick out of their consumers. As I've said in another comment, look how smooth the frame graph is on the 6800 XT compared to the 3080; AMD is defo the smoother experience. Even the 5000 series had smoother frames compared to the 2000 series. Nvidia are more interested in marketing BS: RT which tanks FPS on all gpus, their dlss, AI. It's all "buy our product for these features" BS while their 3070/3080 and 4060 Ti suffer because of vram limitations. Absolute joke. If the 3070, 3080 and 4060 Ti had 12gb plus vram they would be amazing GPUs, but Nvidia are more interested in ripping customers off and forcing them to upgrade. Horrible greed of a company.
@joee7452 · 1 year ago
@@xwar_88x30 I would agree normally, but the 3080 vs 6800 XT seems to buck the trend. If you look around you can see a bunch of people and places doing the comparison again, and contrary to the norm, overall the 3080 is stronger against the 6800 XT. I came here from a video comparing them that tested 50 games, and the 3080 was on avg faster than the 6800 XT by a higher percent than a couple of years ago. Now, it wasn't much of a change (like 9% on avg at 4k vs the 7% it was a couple of years ago; 1440p was up the same), but it bucked the normal trend of AMD cards getting stronger vs Nvidia's. I thought it was funny. If I had to pick though, I would still go with the 6800 XT, because if you look and watch you can find them new for around 500 or a little under sometimes. You are not going to find a 3080 new in that range, or even in the zip code.
@jamesdoe7605 · 1 year ago
@@xwar_88x30 lol no
@xwar_88x30 · 1 year ago
@@jamesdoe7605 settle down there james, don't be silly now.
@Ratich · 1 year ago
The problem with the encoding argument is that i don't use that feature, nor do I have a use case for CUDA, and I'm probably going to stick to pure rasterisation rather than turning on RT, because upscaling or not, the performance hit is too much. So for me the 6800 XT is the better option.
@chetu6792 · 1 year ago
That's the thing. Everybody talks about the Nvidia feature advantage, but besides DLSS, most features are used rather scarcely.
@evrythingis1 · 1 year ago
@@chetu6792 DLSS is just another scarcely used feature. They have to pay companies to implement it in their game, how ridiculous is that?
@gotworc · 1 year ago
@@evrythingis1 i mean that's literally what most companies do to push their tech. You gotta pay to play. Why would a company offer to put your tech in their game if they have nothing to gain from it? AMD does it too. While I don't think Nvidia GPUs are particularly good value right now, since we're at the beginning of this era of AI being used to improve graphics computing, I don't think the technology is bad or a gimmick like most people are trying to say.
@evrythingis1 · 1 year ago
@@gotworc go ahead and tell me what proprietary features AMD has had to pay game devs to implement, I'll wait.
@enricod.7198 · 1 year ago
Could you include a software showcase comparing the NV control panel and AMD Adrenalin? Because I've used both in recent times and AMD's is way better, especially since it comes with an oc/uv suite that's far easier to use than having to resort to Afterburner. These things should be considered when doing a comparison.
@Raums · 1 year ago
See I’d love to see this too, haven’t had an AMD card in years but hated the software back then and have heard it’s improved a lot. Never really had an issue with nvidia
@enricod.7198 · 1 year ago
@@Raums Tried both in the last 2 years. As I always undervolt my gpu, I must say that AMD's is way easier and works better. Needing a 3rd party tool made by basically one outside person to undervolt a gpu from the market leader is embarrassing imo. AMD's software is great these days, and their drivers, while they sometimes have issues like Nvidia's, bring way more performance improvements over the months compared to Nvidia.
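As a side note on what Nvidia exposes without third-party tools: about the only supported command-line knob is the board power limit via nvidia-smi; actual voltage-frequency curve tuning still needs Afterburner or similar, whereas Adrenalin builds it in. A small sketch wrapping the real nvidia-smi flags in Python (the 250 W figure is just an illustrative value, and setting the limit needs admin/root rights):

```python
# Minimal sketch: the limited GPU tuning NVIDIA exposes without 3rd-party tools.
import subprocess

# Query current, default, and max power limits (read-only, no admin needed).
out = subprocess.run(["nvidia-smi", "-q", "-d", "POWER"],
                     capture_output=True, text=True)
print(out.stdout)

# Cap the board at 250 W (requires administrator/root privileges).
subprocess.run(["nvidia-smi", "-pl", "250"], check=True)
```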
@memoli801 · 1 year ago
They always leave this important fact out. But is CUDA so important for a gamer? We are not all content creators! Don't give a shit
@jjlw2378 · 1 year ago
The fact that AMD has a built-in OC/UV utility is actually a bad thing. They have already used it to artificially limit the speeds in which you can tune your GPU. You want overclocking software to be a third party because they will provide the most impartial and least limited features.
@FenrirAlter · 1 year ago
​@@jjlw2378🤡
@SuperShowDowns · 1 year ago
I got the 6700 XT Spectral White edition last week for £300, and im so impressed with its performance. 1440p ultra, no upscaling 🎉 glad i didnt get the 4060!
@ChusmaChusme · 1 year ago
0:05 I'm like 95% sure the mining began right when that generation began. Remember it cause it was impossible to buy any gpu at launch.
@Chrissy717 · 1 year ago
That was actually covid shortages. 2020 and the first lockdowns caused supply chain issues on a global scale
@MahBones · 1 year ago
For me at the moment it's between the 4070 and the 7900gre but I'll have to see where that price lands in Aus. The 40 series power consumption is pretty compelling.
@Hydra_X9K_Music · 1 year ago
Not sure if this also matters to you, but the 4070 has a nicely small form factor. Can fit into a pretty wide range of case sizes
@Nein99x · 1 year ago
The 7900GRE will be limited to pre-built pcs. You might as well get the 4070.
@Dereageerder · 1 year ago
Get the 4070, AMD is garbage beneath the 7900 xt and xtx
@chriswright8074 · 1 year ago
@@Dereageerder cap, the 6950 XT definitely gaps or beats the 4070 at times
@Mangk89 · 1 year ago
@@chriswright8074 i think he meant anything in the 7000 lineup, but i get your point
@gamerforever4837 · 9 months ago
Why do you say one thing, then contradict what you just said? "I don't think the extra 6 gigs of vram is that big of a deal," then less than a minute later, "but the 16 gigs is awesome, to allow you to enable higher settings."
@syncmonism · 1 year ago
DLSS doesn't eliminate the value of VRAM, but it can let you get away with using less, with a small reduction to image quality at a given resolution. Also, while DLSS can help compensate for a lack of VRAM, FSR can as well; it's just not as good at it at 1440p, but it gets quite hard to tell the difference between the two when upscaling to 4k, at least with the highest quality settings. I did own a 3080 for a while, and have played around a lot with DLSS and ray tracing in Cyberpunk, and also in Control. Running without any upscaling at all is still better than running with DLSS, as long as you can get a high enough frame rate, and I find it very hard to recommend a card which costs $80-100 more but has significantly less VRAM, if it has about the same amount of standard raster performance. Ray tracing in Cyberpunk required using DLSS, and still ran significantly slower than running at 1440p native without ray tracing. I just didn't think it was really worth using ray tracing. It never seemed obvious that it was worth running with it turned on, though it certainly was usable, and did look good, running at 1440p with DLSS. With Control, I found that the performance was good enough to run with ray tracing and without any upscaling, but the game did still run noticeably smoother with ray tracing off, and the ray tracing itself was still not all that compelling to me. I found that the lighting in both games, even without ray tracing, was amazing. A lot of the lighting effects, even with ray tracing turned on, are the same either way, unless you go with full path tracing, but I obviously had nowhere near enough performance for that. The 4070 isn't terrible, and it's not going to suddenly become obsolete because it doesn't have enough VRAM at any time in the next 4-5 years, but it would have been a LOT better if it had 16GB of VRAM. It's not like 16GB would have been overkill on it because it has DLSS. That would have made the card that much better, and it would have also had more memory bandwidth, which would also be nice. A 16GB 4070 at $660 would have been a significantly better value than the 12GB version is at $600.
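A practical way to ground these VRAM debates is to just watch usage while your own games run. One caveat: many engines allocate more than they strictly need, so treat the number as an upper bound. A minimal sketch, assuming the nvidia-ml-py package (imported as pynvml) on an NVIDIA card; run it in a second window while the game is loaded:

```python
# Minimal sketch: watch how close a game gets to the 10/12/16 GB VRAM ceiling.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(2)
finally:
    pynvml.nvmlShutdown()
```

On AMD, Adrenalin's performance overlay reports the same metric without any scripting.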
@evergaolbird · 1 year ago
I would argue that VRAM matters for PC gamers who are not into competitive gaming but rather into the modding scene. NVIDIA is doubling down on this with RTX Remix on the RTX 40 series, because they know the demand from the modding scene in the PC landscape remains high. Skyrim is almost 12 years old, but to this day it has not slowed down; in fact the modding scene just keeps hitting new peaks (check the view counts on YouTube or the player counts on SteamDB - it's still relevant). High VRAM capacity and memory bandwidth are important in Skyrim. I have 1750 mods installed on my 5900X + RX 6700 XT, and with my 12GB of VRAM there's a lot of planning I do on which mods to keep or not because of VRAM demand.
@siyzerix · 1 year ago
I agree. Skyrim modding also requires the fastest cpu to play smoothly. So an X3D cpu is almost a must.
@evergaolbird · 1 year ago
Very true. Skyrim is one of those games that demands not just the GPU but also the CPU. Not to mention at least a Gen3 NVMe is required to even make massive LODs work - system RAM too. Skyrim's modding scene is probably the only game out there which is like a symphony. It uses everything in the PC, especially if the person knows how to mod.
@siyzerix · 1 year ago
@@evergaolbird Indeed. My laptop has 32gb of ddr4 3200mhz ram, an i7 12650H, a 150w 3070 Ti and a 1tb gen 4 nvme ssd. Skyrim SE still sees my CPU being the limitation, despite me giving my CPU all the power it needs.
@siyzerix · 1 year ago
@@TheAscendedHuman The mobile 3070 Ti has the same chip and specs as the desktop 3070 except tdp and clocks. The 150w variant is within 10% of the desktop 3070. The i7 12650H is more akin to an i5 12600 non-K (if it exists). I'd say that's a solid setup, even by desktop standards, cause I know not that many people are buying even a 3060 Ti or 6700 XT on desktop; most have a 3060 or 2060. My point is, modded Skyrim SE just needs way too fast of a CPU. I literally cannot use my GPU to its fullest, even if I let the CPU run wild. I hit the drawcall limit real fast, and exceed it too, causing FPS drops.
@lucidnonsense942 · 1 year ago
The problem with using a 3080 for productivity apps is that 10GB is REALLY not enough for doing actual work. PLUS - nVidia's drivers limit many of the professional features you need to Quadro drivers. The 3080's Quadro equivalent is the A5000/5500 with 24GB vram, priced around 1k-2k. You will get better performance than a 3080 in most CUDA workloads with a Quadro RTX 4000, a 16GB 3070 equivalent, because 10GB for any significant work is waaayyy too low - assuming the drivers allow the full feature set in the application on a non-Quadro card. As far as AI workloads go, which is much more my wheelhouse: ROCm is feature complete, and CUDA isn't as relevant in 2023 as it was in 2021 for the AI space. 10GB - again - cripples the card for anything more than fiddling around as a hobby. Try to render a 1440p scene with a 10GB card vs a 16GB one; it's not even funny how memory crippled you will be. You will get equivalent performance to a 6700 XT with 12GB - which you can get for much cheaper. Additionally, we tend to put GPU render farms on a linux distro, where AMD has much more mature drivers and ROCm support. Specialised AI accelerators are a whole different kettle of fish; in that space, you will be writing your own custom libraries that you tune to whichever vendor allocated some boards for you. Nobody is going to be picky about which one - the lead times are insane, as everything is pre-sold before being fabbed. You take what you can get, pay what is asked and count yourselves lucky.
@NinjaWeedle · 1 year ago
Nabbed my Red Dragon 6800 XT this Prime Day for $479. Pretty happy with it, although i do sometimes find myself wishing I still had the CUDA support my 1070 Ti had. Looking forward to ROCm.
@Pand0rasAct0r_ · 1 year ago
I have to disagree with the feature set. And before people call me a shill: I have used Nvidia all my life; the 6950 XT is the first AMD gpu I have owned and used. The dlss vs fsr argument is in most cases highly blown up. Yes, dlss provides a better picture, BUT you will almost never notice this during gaming; you'll only notice it if you pause them and watch them next to each other. If you have to go to such lengths to spot something, then yeah, that's just not an upside in my opinion. And raytracing is better on nvidia, although not bad on amd either. But the cards we have right now, especially the 3080/3090/6800/6900, are just not raytracing cards. Neither is the 4070 Ti or 4080 or 7900 XT or XTX. The only card capable of raytracing at minimum is the 4090, and even that card sucks at it in many ways. So if you plan to raytrace, you really shouldn't be looking at these cards. The only upside is CUDA, but if you are just a gamer you wouldn't care about it. And the vram is just so important. There are so many games I see with my 6950 XT that these days shoot past 10gb. And I wouldn't choose the 4070 above the 6800 XT, in my opinion. 12gb is alright, but as I said, I've seen games shoot past 10gb, heck even 13gb, so the 4070 would already be having issues. And that at 600+ bucks new in Europe. Just not worth it in my opinion; at that point you may as well buy a 6950 XT if they are still available. Most people buy for raster, and in that case most AMD cards just beat Nvidia's.
@SmokinDave7373 · 1 year ago
Nice video man. I jumped ship from my 3070 to a 7900 XT because it was 200AUD cheaper than the 4070 Ti, and besides not having DLSS as an option I am super happy with my AMD card. In pure raster performance, on average I get 15-20% more fps in most games (compared to the 4070 Ti). The only issue I have had with my 7900 XT TUF is bad coil whine; I am hoping I can send mine back soon and see if I get better luck of the draw on that. Keep up the videos, you are a great new entry source into the PC DIY learning youtubes.
@xwar_88x30 · 1 year ago
You're not missing out on dlss, since AMD has built-in fsr in their drivers which you can use in any game, and it still looks amazing. Should defo try it out. It's under Super Resolution in the driver menu; I tend to have the sharpness at either 40/60, looks good imo.
@Paulie8K · 1 year ago
Nice breakdown. I've had the 3080 10GB since December 2021 and it's been amazing. Now i tend to play games that are a bit older because I have a rule where I wait at least a year after release for games to get patched and go on sale so I have nothing from 2023 but I've played dozens of games from 2015-2022 and the 3080 ran them beautifully at 1440P.
@MrBorisRoo · 1 year ago
I don't have problems at 4K with my i7-9700... 60 fps is stable. Now I can never go back to 1080p.
@unreleasedcontent9316 · 1 year ago
Right, the 16gb of vram isn't all that; 10gb is all you need, and nvidia still gives you advantages.
@Cybersawz · 6 months ago
I have had that same 10GB card for almost 2 years, and use it for 4K gaming on high settings for most couple year old games. Granted, the latest AAA games can't be maxed out but still look pretty awesome on high. I run an I9-13900K with 32GB RAM.
@Paulie8K · 6 months ago
@@Cybersawz nice. I have 32gb of ram too but have a 3700X which I may upgrade to a 5800X3D. But at 4K, I'd be GPU bound so probably would be the same results as your 13900K there.
@daruthin · 1 year ago
In the "new" market in Europe (mostly France):
- For 599€ you get a brand new 3080, a 4070, or a 6900 XT (the 6950 XT is 629€)
- The 6800 XT is 60€ less
I'm still waiting for the 7xxx from AMD. I don't get their strategy. There still isn't a 7700 or 7800, XT or not, and they released the 7600.
@alzarpomario889 · 1 year ago
I'll just put in my two cents about what is keeping me, and a few other folks, on the AMD side when buying a new GPU. Support for open standards, like FreeSync, FSR and OpenCL/ROCm: I don't like vendor lock-in, so I support agnostic technologies. I'm not the guy who cracks professional software just to tell my friends "I have Photoshop 2026, idk how to use it but I have it," so I usually go for open software, and I've never had a regret, in both my private labs and professionally. But the main plus above all is the Unix support. At home I can play Windows games on Linux flawlessly without having to tinker with monthly driver updates; it just works... and 2005-class hardware is still supported. At work I greatly extend hardware lifespan for the same reason, and this philosophy allows us to offer fast and reliable Citrix-like remote desktops with GPU passthrough of graphics cards that would now be e-waste if made by nVidia. Intel is now in the middle between the AMD and nVidia philosophies, and I hope it will land on the AMD view of the HW/SW stack.
@Astravall · 1 year ago
I do not stream to Twitch, i do not upscale, and i seldom use raytracing (not that i could complain about the raytracing performance of my RX 7900 XTX; it is plenty sufficient). But i have games that use more than 10GB VRAM, and raytracing increases the VRAM usage, so the 3080 running out of VRAM is kind of funny. So i have to disagree. I would choose the 6800 XT anytime over a 3080.
@coolumar335 · 1 year ago
Dude is capping by saying 16gb VRAM doesn't matter. Extra vram allowed cards like the RX 580 8gb and even R9 390 8gb to extend their lifespan way beyond what was initially expected out of them. The 6800 XT is a great long-term purchase and will be viable for 1440p for at least 3-4 more years.
@giucafelician · 1 year ago
I am absolutely thrilled to share that I recently snagged an incredible deal on a 6800 for only $353! Needless to say, my excitement is through the roof!🥰
@gustavopena99 · 2 months ago
I got a used 6800 non-XT for 260usd aiming for 1440p. I'm so happy, also (1 year warranty) (in Argentina that's a good deal)
@kaisersolo76 · 1 year ago
The reason Nvidia skimps on VRAM is so you have to buy the next gen; they give you DLSS to get by. AMD gives you the VRAM, offers a serviceable FSR, and gets blamed when there's not much improvement next gen.
@adikonas1978 · 1 year ago
Can you tell me what the temperatures of the card are, especially if you use a 1440p ultrawide monitor? For me, with the highest details, the hotspot (junction) temperature was 100 degrees Celsius.
@Anto-xh5vn · 3 months ago
Um what holy shite bro that's probably not normal or is it
@adikonas1978 · 3 months ago
@@Anto-xh5vn as far as i checked, it's normal - 10 degrees before thermal throttling. With an undervolt it's around 90.
@victorhogvall2065 · 1 year ago
Got a 6800 XT paired with a 43 inch 4k screen, and I can tell you that using quality FSR at 4k I cannot tell the difference, and at 1440p with these cards u don't even have to use upscaling.
@thebeautifulandthedamned572 · 1 year ago
The gpu market has been confusing lately here in my country, Indonesia. There was a moment when you couldn't find an RX 6950 XT at $630 USD; it was sold around $700. But ironically, you also couldn't find a brand new RTX 3080 at $600 USD, because it was also sold at 680-700 USD, so people who actually bought a brand new RTX 3080 were considered "stupid", since they could get a way better GPU, the RX 6950 XT. And now a brand new RX 6800 XT is being sold at $560 USD, just $60 cheaper than the RTX 4070, and you can't find the RX 6950 XT anymore, not even second hand.
@nathanielchapman7675 · 1 year ago
I disagree with the VRAM statement. The stutters and graphical artifacts are a dealbreaker for me. I had a 3070 that ran out of VRAM, and the drawbacks were unbearable. The additional 2 GB of VRAM on a 3080 would not have resolved the issue. Moreover, how good the raytracing on the 3080 is, is quite overblown... and it requires more VRAM. So what would you choose, better textures overall or better lighting?
@professorchaos5620 · 1 year ago
The main reason 3080s are cheaper is that they were probably the most produced card of all time. Although I don't know how to find stats on that, so many miners like myself bought every one they could for a couple years, and they have been re-selling them on ebay since then.
@maximus3294 · 1 year ago
it's true. GA102 in general was so abundant that they made the following GPUs with it: the 3070 Ti, 3080 10 GB, 3080 12 GB, 3080 Ti, 3090 and 3090 Ti. And those are just the GeForce ones; their server-side cards also used the die.
@DragonOfTheMortalKombat · 1 year ago
F*** you miners, f*** you miners
@CPSPD · 1 year ago
when shitcoins and cryptocurrencies collapse i will rejoice
@professorchaos5620 · 1 year ago
@@CPSPD That's good, central bank slave. Keep rejoicing in your lifelong slavery.
@ravs1988 · 1 year ago
Upgraded from a Rx570 from 2017 to a Rx6800 (non XT) for 300€ on the used market. Currently wasting its potential on Dota and Wow :)
@cartm3n871 · 1 year ago
I just recently upgraded to a 4070 a few weeks ago. I gotta say, I haven't felt the vram issues yet; every game I play plays flawlessly.
@antonnrequerme6256 · 1 year ago
what resolution do you play on?
@cartm3n871 · 1 year ago
@@antonnrequerme6256 1440, what about you?
@Altrop · 1 year ago
You'll feel them in a year or so. 16GB is the new VRAM target for game devs, hence the sudden explosion in VRAM use in 2023. We went from "8GB is enough forevahhh" to some games using up to 14GB in months. When I buy a card I intend to keep it for 4 years, and for that I realized 10 or 12GB just wasn't enough, so I bought a 6800 XT. Used to own a GTX 1080 and saw that it was already being filled up in 2021; I saw the writing on the wall.
@MahBones · 1 year ago
Should be good on 12gb for most stuff at 1440 where the 4070 is well matched. I'm on a 6700xt atm at 1440 and don't have issues with vram.
@MahBones · 1 year ago
@@Altrop Realistically the only things using that much vram are unfinished garbage piles or if you're set on raytracing but I don't think a 4070 is really the card for raytracing. Until consoles move away from their 12ish usable that will be the "benchmark"
@Paelmoon · 1 year ago
I have a 3080 10gb. What's your opinion on what to upgrade to? The 40 series is so pricey, but the AMD cards aren't that much higher performance. Edit: Cheers guys! I will stick with the card and hope it doesn't conk out soon :-P
@Mako2401 · 1 year ago
There is no need for you to upgrade unless you get a 4090
@Humanaut. · 1 year ago
Keep the card for as long as you can and pray that next gen doesn't suck as hard as this gen, and that AI doesn't become the next crypto boom. The only issue you could have is VRAM limitations, which is a good old Nvidia planned obsolescence strategy; other than that you should be good for a while.
@niggapyrat · 1 year ago
3080 is still a good card but if u want more vram for 4k etc.. go with a 7900 XT or XTX. If u want Nvidia, minimum a 4070ti
@Blaze72sH · 1 year ago
4090 or just don't. You are set for now.
@endmymiseryyy7908 · 1 year ago
wait for next gen
@johnrehak · 1 year ago
The 3080 shot itself in the leg with 10gb of VRAM. I would buy the 3080 12gb, or the 3080 Ti which also has 12gb, if it's not significantly more expensive. Was Smart Access Memory turned on when you tested both cards? In general both cards are the same until you turn on SAM; then the 6800 XT starts pulling ahead in nearly all games except RT.
@Games_and_Tech · 1 year ago
Terrible comparison: a high-end 3080 vs a lower-end 6800 XT... the power consumption and the frequencies are too low for a 6800 XT.
@keola2 · 1 year ago
Vram becomes a huge problem when you don't have enough. I'd like to see some benchmark examples with Starfield 8k or maybe even 16k texture mods; watch that 10GB absolutely tank in performance. With that said, right now 10GB is good enough, it's just not very future proof. When new games are using more and more vram every year, I'm sure that's a concern for many people. Despite all of that, I'm sure the used market is just flooded with more 3080s because of the end of the crypto craze.
@Dereageerder · 1 year ago
8k and 16k textures are more for the 90 class cards. 10gb is plenty for starfield
@Eleganttf2 · 1 year ago
wtf u need 8k texture for ? you crazy and delusional
@justjoe5373 · 1 year ago
Only a concern if you max them out. Gaming at 1080p, never had to drop more than 3 settings from max for stable 60 at 8GB. Haven't played Hogwarts Legacy, TLOU etc. but they don't interest me anyways and I bet you can play them perfectly fine with a few nonsense settings turned down. It's the same deal for higher resolutions, turn down a setting or 2 and watch the game become playable, usually with little visual difference. Ultra settings are beyond the curve for diminishing returns for performance hit vs visuals, when an amount of VRAM isn't enough to run the game at medium-high then it's too little 10GB VRAM is gonna be fine, XBox S has 10GB total that's shared between the GPU and CPU. 10GB may be laughable on a 3080 but in and of itself that amount isn't an issue
@Ober1kenobi · 1 year ago
When was the card designed for 8k? Do you have an 8k panel? You have a $3000 monitor? Lol. Same as testing a 4060 in 2023 at 4K; it doesn't make sense. It 'can', up to a certain texture level. Would I want to? No.
@keola2 · 1 year ago
I agree with you both, like I said, 10GB is good enough, especially if you're doing 1080p. It's more of a worry with future, poorly optimized games from AAA devs using more vram than necessary. It's like that saying, I'd rather have it and not need it, than need it and not have it.
@user-mx4gm6pc2o · 2 months ago
I got a brand new 6800 ASRock Challenger Pro (non-XT) for 360usd in May 2023, and I absolutely love the card.
@osmanbicer14 · 1 year ago
Upgraded from 6700K, GTX 1080 and 1080p gaming to RTX 4070, 7600x, 6000 MHz 32 GB DDR5 and Dell G3223D monitor. It was a nice upgrade.
@fanATIc666x · 1 year ago
i upgraded from 4690k, gtx 970 to 7800x3d and 4080, and monitor G2724D
@Ghostlynotme445 · 1 year ago
I upgraded from an i3-9100F and original gtx 1650 to a ryzen 9 7900x3d and rx 7900 xtx
@xwar_88x30 · 1 year ago
@@Ghostlynotme445 good choice, better upgrade than the other two noobs 😂😂
@fanATIc666x · 1 year ago
@@xwar_88x30 ATI fanboy here
@osmanbicer14 · 1 year ago
@@xwar_88x30 AMD users will have to adjust settings in the Radeon software 'to get a stable experience', whereas Nvidia users will just play the game without doing any adjustments. And when RT is turned on, AMD goes insane hahaha
@moes95 · 1 year ago
Recently upgraded to an rtx 2070 Super for 150€ (the card needed to be repasted, not that hard tbh). I'll be good for a few years, or i can snag something newer on the cheap. Edit: i bought the gpu in a combo with a mobo, i7 9700k, 16gb ram and an aio cooler for 200+150=350€, and sold my old setup for 400€ with screen and other peripherals.
@RFKG · 1 year ago
Grats on your buy, mate. The 2070 super is still a very competent 1080p card.
@AndyViant · 1 year ago
The 2070 Super was a very good card for its era in bang per buck. More a 1080p card now, but if that's your end use it has a lot of life left yet.
@prem3548 · 1 year ago
Great buy for $150. It has slightly better performance than my RX 5700 XT and I play every game at 1440p at 60+ FPS with FSR enabled no problem.
@moes95 · 1 year ago
@@RFKG you mean 1440p. The current rig pushes 100fps in almost all games at high/max @1440p; i don't use dlss or other resolution scaling. I went from an i7 4770k w. gtx 1070 that also pushed 1440p at medium settings, albeit stuttery in demanding games. Example: Cyberpunk currently averages 80fps at high to max @ 1440p.
@RFKG · 1 year ago
@@moes95 That's amazing, i honestly had no idea that the 2070super could still do 1440p gaming
@mRibbons · 1 year ago
I love AMD, but I'd never say no to a competitively priced 3080. Did the power connector get sorted out tho?
@Very_Questionable · 1 year ago
not really, but all boards that aren't the Founders Edition cards (for the 30 series) have the standard 8-pin PCI-E connectors.
@clem9808 · 1 year ago
As a 4070 user the power draw of these 2 cards scares the heck out of me. 😖
@leroyjenkins0736 · 1 year ago
12 gb vram scares me more :(
@clem9808 · 1 year ago
@@leroyjenkins0736 not me though. ✌️ I love 4k 60fps locked while pulling below 150w of power draw 😎. Edit: anyway, no need to get pissed. I like the 6800 XT and I have nothing against AMD, so relax will ya? Jeez, these casual fanboys love to pick a fight 😂.
@leroyjenkins0736 · 1 year ago
@@clem9808 not a fanboy, just pointing out a problem in the near future
@franciscojaviervazquez2635 · 11 months ago
I got the 6800 XT in November, and I love it. First, I had the good luck of getting a card without the annoying coil whine, so that made me happy (I bought the Gigabyte OC version). Second, I truly love Adrenalin, because there's no need for Afterburner to adjust voltages, fan speeds or whatever, and it's very easy to measure thermals without using third-party software. To be honest, DLSS is a major feature I would like to have on my card, but considering that Nvidia didn't support the 30 series for frame generation while AMD promised FSR 3 will be supported on the 6000 cards, that seems like a deal. If AMD's promise is delivered, the 6800 XT would destroy even the 3090 Ti at half the price. We'll see if it's true... but in the end, the 6800 XT seems like a good deal if raytracing is not what you want. I'm not a content creator or editor, not a streamer; I just use the card to play, and even without FSR 3 I love my card. No driver issues, more VRAM, more rasterization performance, and it includes Adrenalin. All for a lower price. Shut up and take my money!!
@Psychx_ · 1 year ago
3080 10GB doesn't have enough VRAM to handle Cyberpunk in 1440p with Ultra-RT. Scaling up from a lower resolution alleviates that memory pressure.
@Bleckyyyy · 1 year ago
The fact that AMD makes FSR 3.0 available to NVIDIA users is enough for me to go with AMD. Forget NGREEDIA...
@AndersHass · 1 year ago
It is possible something else in the system besides the GPU draws more power, but the selling point about power consumption certainly doesn't seem like something that would matter in this case.
@MGrey-qb5xz · 1 year ago
it's no longer about performance anymore; we honestly have enough of that, and gaming has stagnated. What matters now is how power efficient the chips and cards are. Even the Series consoles went the undervolt route to run more on less.
@iikatinggangsengii2471 · 1 year ago
think it's the other way around: game devs want gamers to update their rigs so they can make better games
@iikatinggangsengii2471 · 1 year ago
i mean, no one can do 144 in Cyberpunk at Psycho RT as of now
@MGrey-qb5xz · 1 year ago
@@iikatinggangsengii2471 there are more important things to be done than play Cyberpunk in RT at 144fps. We need to stop turning cards into freaking ovens in the summer and decrease the electricity bill. Outside of first world countries, using modern cards is a big no for the electricity bill and the AC bill to cool the effing room, and you can't even buy old ones now cause they are no longer in production.
@kon0212 · 1 year ago
tbh some of us, well, most of us grew up without ray tracing. Why need ray tracing when u can get higher fps at a higher resolution? Honestly, ray tracing is a useless invention when we can use its much easier counterpart, also known as pixels.
@user-ol3tf1qi6c · 1 year ago
Overclocked 6800 XTs on Linux run about as fast as RTX 3090s do; when overclocked it can compete with a 3090 Ti. Look up the benchmarks on Phoronix Test Suite. The benchmarks were from around 2021, so things have only improved for AMD cards since then. It wouldn't surprise me if these cards were around the speed of a 4070 Ti now, running at stock in the same environment. I know the Nvidia fangirls will throw a fit over this message or cope by saying "lINux GamiNG DOesn'T MATter!", forgetting the Steam Deck runs Linux on an AMD SoC. 😛 Anyways, I think we should move the conversation beyond just Windows, especially now since Linux is in many cases a better gaming experience for team red. One example: the driver already comes in the Linux kernel, which means there is nearly 0 driver overhead since it's running so close to the metal. Nvidia owners still need to install the proprietary driver in the traditional way.
@toppsballer · 1 year ago
Can confirm: my 6900 XT Merc on Linux dogwalked my friend's 3080 FTW3 in XDefiant so bad that he sold it and got a 6950 XT lol. This is why I always laugh whenever the "Drivers bad durr" argument gets thrown around while they conveniently or ignorantly leave the Linux Mesa drivers out of the conversation.
@sketters9400 · 1 year ago
I got a 6900 XT for 400 euros used and I couldn't be any happier with it. Crazy good GPU for its price.
@5RWill · 1 year ago
I got that one on prime day for $480 with starfield. But good points made. The only feature i really would nod the head to is dlss. I’m all for Ray tracing but it’s just not that big of a difference in most games for the performance. I wish amd would focus on fsr 3.0.
@GeneralS1mba · 1 year ago
Imagine fsr 3.0 miraculously comes with starfield lol
@desild5869 · 1 year ago
You missed an issue: the 3080 may consume about as much as a 6800 XT on average, but it, like all high end 30 series cards, has an extra issue with strong transient power spikes that can trip the emergency cut-off on older PSUs which otherwise support enough wattage (e.g. 800-1000 W). It was clear from T0 that 10 GB was 100% planned obsolescence. A 3080 with 16 GB would've been a long term nightmare as few would want to replace it before 2025. 12 GB on the 4070 is also planned obsolescence. You are right that the 4070 is preferable to the 6800 XT at current prices. Not only are they VERY similar in raster (within 5%), but in ray tracing the 4070 pulls far ahead (25-50%). There is however a caveat. The 4070 is preferable to gamers who buy a GPU primarily to be able to go through a backlog of games and don't expect to be able to hold the same visual quality in newer games for the next 5 years. There's not enough VRAM or GPU in the 4070 for that. On the other hand, the 6800 XT is less capable today (especially in ray tracing), but due to its larger VRAM it might be able to hold a constant, albeit lower, level of visual quality for more years. But then here comes the kicker: AMD proved it will drop driver support after just 6 years, which means support for the 6800 XT is fragile beyond 2026. Meanwhile, NVIDIA proved it will go damn near 10 years with their support, which means the 4070 will retain better resale value and will be supported well into the 2030s! Anyway, it's a good call with the 4070. The 6800 XT needs to dip into $3xx to be really attractive this long into its lifespan.
@gokublack8342 · 1 year ago
People think I'm an AMD shill, but a lot of times it comes down to price. If the 3090 Ti had been around the same price I would've gotten that and not the 6950 XT, but whatever slight gains the 3090 Ti has over the 6950 XT aren't worth the double to triple price I'm seeing it retail at right now.
@eddiethehead5988 · 2 months ago
This is one of the best detailed responses I've read. It's ironic how people call AMD drivers "fine wine" when in reality they are very quick to drop support for older architectures. I'll also add that upscaling tech is also something important to consider. With AMD being no competition to Nvidia on that matter.
@Nalguita · 1 year ago
I bought a 6800 XT 1 year ago. Where I'm from in Spain, for some reason AMD costs more than Nvidia. All my life I was Nvidia, except for an old ATI X800; the rest of the graphics cards I have had were all Nvidia: a 9800GX2, GTX 280, GTX 285, GTX 480, and GTX 970. Even with the 6800 XT being more expensive here than the 3080, I opted for it. It was not for the performance, nor for the price; it was because personally I am up to ...... with Nvidia. First it was with a board with the Nforce 790i Ultra chipset, an expensive board that did not stop giving problems; then the 9800GX2, the worst graphics card I've had by far; then the PhysX marketing crap, that if you add a second card you get physics and blah blah blah, it never ended up working well, pure marketing; then the 3.5 Gb of the 970. I ended up fed up, and for my part they can put their marketing up their ass. It is clear that streaming and RT on Nvidia are superior, but for me it was not so much as to opt for Nvidia again. The RT is superior, but it is clear that the 3000 series falls short, so for me it is still marketing. DLSS is above FSR, there is no doubt, but who's to say that when the 5000 series comes out they won't leave me stranded as they did with the PhysX crap, 3D Vision and other technologies? Also, Intel's XeSS is not so bad and can be used on AMD. This is my humble opinion from years of owning Nvidia cards. I'm not saying that AMD hasn't had problems with their cards, but I personally haven't suffered from it. Sorry for the length and my English; I wrote it with a translator. Good video and good channel.
@bartdierickx4630 · 1 year ago
If the extra 6GB of VRAM gives you 2 extra years of gaming (let's say an additional generation of GPUs) before forking out new money, it's worth $112 a year, without taking into account the resale value of both GPUs after 2 years (for the 3080) and 4 years (for the 6800) respectively. And that's not considering if you do more than gaming on the GPU.
@dayadam16 · 1 year ago
Gotta love at 3:54, "the nvidia card is pulling ahead pretty significantly" (+14%), then 10 seconds later, on the same game when AMD leads, "the amd is actually pulling ahead" (+14%). I feel like for the rest of this video nvidia is gonna get way better choice words. Smh🤥
@vextakes · 1 year ago
Literally said AMD is very favorable with Lumen there, and that is a huge advantage in the future
@dayadam16 · 1 year ago
@@vextakes that doesn't change the choice of words that was used though. I see what you're getting at, but I'm saying using words like "significant" just sounds way bigger than what actually took place within that 12 seconds at that timestamp.
@RannonSi · 1 year ago
Personally, DLSS is a crutch technology and should be advertised as a way to keep your e.g. x080 viable for another generation, or three. Not as something to make games playable when the card's still pretty new. Edit: fixing spelling mistakes and such.
@KEWords · 1 year ago
Like AA or anything that improves quality over rendered resolution does? Crutch. Do you play games, or freeze-frame and pixel peep? DLSS gives you a choice, and when I choose Quality I get more FPS at the same visual quality on the same card. It's like a 5th gear that gives me better mileage. If it was not valuable, AMD would not be bribing bundle partners like Starfield to not support it.
@_n8thagr8_63 · 1 year ago
it's not a 'crutch' technology. There's more to making a game run nice than just raw hp. DLSS is an excellent feature and a great example of hardware and software harmony
@arenzricodexd4409 · 1 year ago
Depends on how you see it. If DLSS is "crutch" then rasterization is also a crutch.
@GeneralS1mba · 1 year ago
@@_n8thagr8_63 it's only a crutch for devs who don't know what optimization means imo
@evrythingis1 · 1 year ago
@@arenzricodexd4409 You don't know how DLSS works do you?
@paradoxeintervention5390 · 1 year ago
I wanted to buy a 3080 2.5 years ago; ultimately it became a 6900 XT. At that time I had an FHD 144Hz monitor, then switched to WQHD 165Hz; meanwhile I have a 4k OLED for gaming. Currently 12GB would also be too little for me, if only as a mental thing: with every stutter I would ask myself whether it is perhaps the VRAM. That's why I'm quite happy not to have gotten a 3080; back then I would not have thought how quickly the memory becomes a problem.
@princekatana8792 · 1 year ago
Why doesn't anybody talk about the RTX 3080 12 gb, every review I see has the 10 gb. Not that I think it would make that much of a difference but still am curious as that is the card I have.
@MahBones · 1 year ago
Just never see them, never seen one for sale in my local shops.
@vincentvega3093 · 1 year ago
It released a year later for almost 3080 Ti money. Only miners bought them
@shawnadams1965 · 1 year ago
I think it's because most of us bought the 10gig version when it came out; at the time we didn't know that they would release a 12gig version. I sold my 10gig 3080 (that I bought as part of a full system upgrade at the same time as the rest of the components) for 450€ and bought a 4070 on sale for 580€. So a nice side-grade for 130€. It should hold me until I upgrade my entire system again in a few years.
@Legnalious · 1 year ago
Because the 10gb was the original. The 12gb was also $200 more expensive (at least). So odds are that there are significantly more people who have the 10gb than the 12 gb variant. So it just makes sense to use that version.
@princekatana8792
@princekatana8792 Жыл бұрын
I always buy a generation behind, so by the time I bought it, the mining craze had all but died down. @@JJssss
@michaelfalabella6296
@michaelfalabella6296 Жыл бұрын
Awesome video man, I love the detail you get into. That Red Dragon looks like it's performing great!
@pregorygeck6605
@pregorygeck6605 Жыл бұрын
Bought my 4070 from FB Marketplace as an open-box, unused GPU for £460. Sold my RX 6800 XT, boxed and like new, for £425. Happy with it! 🙂
@rushgameing3085
@rushgameing3085 2 ай бұрын
I have a 6800 XT and it's the best card I've ever had. Sold my 3060 Ti for it and I'm happy with the 16GB of VRAM.
@shroomy8210
@shroomy8210 9 ай бұрын
Why is my CPU usage at 100% in Cyberpunk with an i7-12700F?
@asdasdasd-gx7zs
@asdasdasd-gx7zs Жыл бұрын
Remnant 2 is kind of broken on Nvidia and consoles. It forces the SSGI graphics option on, which significantly reduces performance while making the graphics a bit worse by causing various little artifacts. On AMD it's disabled by default and can't be enabled at all, even through the console. Seriously, what's wrong with you? How can you look at a 3080 Ti performing worse than a 6800 XT in this game and think that's normal? Or a 7900 XTX performing almost on par with a 4090?
@enricofermi3471
@enricofermi3471 Жыл бұрын
(Used) 3080s are cheap because there's an extremely high probability they were mined on. As crypto drops, there's little incentive left to mine, so the smaller farms get disassembled and sold off. We won't discuss the probable condition of those GPUs (which can vary greatly depending on how the farm was maintained). Fresh 3080s are indeed priced cheaper too, because there are only so many advantages they have over the AMD competitor (6800 XT): ray tracing, DLSS (no frame gen on the 3000 series, btw) and CUDA, of which only CUDA is non-optional (if you actually need it as a content creator, that is). On the other hand, the 3080 has only 10GB of VRAM, which is enough for now and may (keyword: may; it gets filled up in some games on ultra, but those are just singular examples... for now) last you until the 5000 series; the 6800 XT doesn't have that "may". Overall, I'd say the 6800 XT is the more balanced long-term solution, "the 1080 Ti of the current era", while the 3080 is an "I want the eye candy now and then I'll switch to a 5070 as soon as it launches".
@danburke6568
@danburke6568 Жыл бұрын
When next-generation games come out, 10GB is going to be a problem. Also, so many were used for mining. Those two factors mean there are so many more about. The used 3080 is a bit pricey. Still a good card.
@100500daniel
@100500daniel Жыл бұрын
Bro, increase your parameters from the defaults through Adrenalin. PowerColor's default fan speed and power limit are kinda low. Run that thing at 2000 RPM with the power slider maxed and an undervolt & OC, and it will be even faster.
@stratuvarious8547
@stratuvarious8547 Жыл бұрын
It's just a shame that Nvidia doesn't want to give their GPUs enough VRAM to be used to the fullest. I can forgive the 3000 series (8GB was plenty at the time), but it was becoming clear early enough in development of the 4000 series that they needed to increase the VRAM at every SKU except the 90s. If AI weren't their focus right now, they probably would have, but because of AI, Nvidia just doesn't care.
@marcasswellbmd6922
@marcasswellbmd6922 Жыл бұрын
I bought mine as part of a CyberPowerPC build where I picked my parts. Didn't feel like dealing with it this time. But I have an XFX 6800 XT, and it costs more because it's fast in most games and has 6 more gigs of VRAM! It's hands down the better card, and anyone who plays Hogwarts knows it. 10 gigs is barely enough for brand-new games, and I think that's why the pricing flipped around. Also, the 6800 XT is even slightly faster than the 3080 Ti in Hardware Unboxed's 50-game benchmarks.
@Icureditwithmybrain
@Icureditwithmybrain Жыл бұрын
You would pick a 12GB card over a 16GB one? Do you play at 1080p or something?
@manusiaorang2842
@manusiaorang2842 2 ай бұрын
This VRAM talk needs to stop. Most of the tests where 12GB is barely enough are with 4K textures, and you can't even notice the difference below 4K.
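Worth quantifying that claim. A back-of-envelope sketch of why texture resolution dominates VRAM (illustrative only: real engines use block compression and stream mips, so exact numbers differ):

```python
def texture_mib(size, bytes_per_texel, with_mips=True):
    """Approximate VRAM cost of one square texture, in MiB."""
    base = size * size * bytes_per_texel
    # A full mip chain adds roughly one third on top of the base level.
    return base * (4 / 3 if with_mips else 1.0) / 2**20

for size in (1024, 2048, 4096):
    raw = texture_mib(size, 4.0)   # uncompressed RGBA8: 4 bytes per texel
    bc7 = texture_mib(size, 1.0)   # BC7 block compression: 1 byte per texel
    print(f"{size}x{size}: ~{raw:.0f} MiB raw, ~{bc7:.0f} MiB as BC7")
# Each doubling of texture size quadruples the footprint, which is why
# 4K texture packs are usually what pushes a card past 8-12 GB.
```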
@DeepteshLovesTECH
@DeepteshLovesTECH Жыл бұрын
AMD had a much more favorable time competing with the RTX 3000 series because of the huge node advantage AMD had over NVIDIA, which was stuck on Samsung's crappy 8nm node.
@yukiokasamatsu
@yukiokasamatsu 5 ай бұрын
Hey, if you see this I have a question: at present, if there's no price difference, which would you take? (Basically RT or VRAM, for all-around use.)
@alexandersarver3463
@alexandersarver3463 5 ай бұрын
VRAM all day. You can't use RT if you don't have enough VRAM, something he overlooked pretty foolishly in this video.
@bearsgarage272
@bearsgarage272 6 ай бұрын
Literally just bought a Red Dragon 6800 XT today, an open-box special at $430. Couldn't postpone any more; coming from a 5700, I hope to keep this around for a few years. Maybe next time around I'll go green, when RT is more optimized and adopted by more games.
@ioannispoulakas8730
@ioannispoulakas8730 Жыл бұрын
10GB is veeeery borderline for 1440p. I very often see over 9GB; over 10 is less common, but we're getting there. I don't know how long the 12GB on my 4070 will last. Ideally I wanted more VRAM, but there wasn't any alternative in this price range. I didn't want to go with the 6800 XT and lose DLSS 2/3 and the better RT, because I only play single-player games. Fingers crossed for the 12GB then :P
@xwar_88x30
@xwar_88x30 Жыл бұрын
That's all Nvidia is good for: borderline VRAM. It's a scam on their GPUs, making people upgrade when their 3070/3080/4060/4060 Ti or whatever other BS Nvidia GPU starts to stutter in VRAM-heavy games. 16GB is the safe spot at the minute; anything over 16GB is even more future-proof.
@PoiZo_NG
@PoiZo_NG 11 ай бұрын
The prices are WAY different in my country. A used 6800 XT costs the same as a 3060 or a 3060 Ti, which makes it a hugely better deal than the Nvidia ones. The 3080 is much more expensive than the 6800 XT here.
@RudolfSikorsky
@RudolfSikorsky Ай бұрын
That's why I chose the 3080 12GB. Great card to this day. Cyberpunk 2077, for example: DF optimized settings, 1440p, DLSS Balanced, RT Psycho and path tracing enabled, and I get 40-60 FPS in the main game. Once you experience the difference PT offers, you'll never go back; you'll sacrifice some FPS instead. I'd like to see how the 6800 XT performs under the same conditions.
@iron_talon
@iron_talon Жыл бұрын
I will say this: FSR looks markedly better at 4K than at 1440p. I have a 6800 XT and am quite happy with what I get out of it at 4K.
@vextakes
@vextakes Жыл бұрын
Tru, but same with dlss
@iron_talon
@iron_talon Жыл бұрын
@@vextakes Absolutely. I mean it more as an equalizer than any sort of advantage, where the gap narrows rather than flips.
@davo4074
@davo4074 5 ай бұрын
I assume you never turned on AMD's Smart Access Memory or any of the Radeon software features? If you didn't, you kind of didn't do the AMD GPU any justice, since AMD products are kinda designed around the fact that the user will optimize the card, whereas Nvidia is more plug-and-play.
@GewelReal
@GewelReal Жыл бұрын
10GB of VRAM has really aged badly. And it even uses more power than the 6800 XT!
@vincentvega3093
@vincentvega3093 Жыл бұрын
Thanks, Samsung!
@CersionX
@CersionX Жыл бұрын
@GewelReal If you'd actually watched the video, you'd know that it doesn't.
@Altrop
@Altrop Жыл бұрын
@@CersionX According to HUB's power graph shown in this same video, it does use more power, actually. He tested the power usage while standing still in a CPU-bound game lmao.
@Clint_the_Audio-Photo_Guy
@Clint_the_Audio-Photo_Guy 6 ай бұрын
I had a Red Devil 6800 XT and took it back when the 6750 XT refresh came out. I should have kept it. Incidentally, 6950 XT Reference cards are only $550 at Microcenter right now. Pretty comparable to the 7800 XT, or better in several games.
@blackwaldoduck159
@blackwaldoduck159 Жыл бұрын
Is Vex using Risk of Rain 2 music? 😮
@alaskacpu
@alaskacpu Жыл бұрын
I'm a 1080p 240Hz player for all games, and I never use RT with my 3080 or 6800. It's not worth trading top speed for visuals I couldn't care less about. I get awesome visuals at 1080p, and I'm not going down the rabbit hole of comparing 1440p or 4K. I'll leave that for you guys to argue about. Speed and efficiency in the games I play are absolute! Enjoy 🙂
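The math behind that trade-off is simple enough to show. Frame-time budgets shrink fast at high refresh rates (the per-effect cost below is a made-up figure, just to illustrate the point):

```python
# Per-frame time budget at common refresh rates.
for hz in (60, 120, 144, 240):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")

# Hypothetical example: if an RT pass costs 2.5 ms (real costs vary
# wildly by game and GPU), it consumes ~15% of a 60 Hz budget but ~60%
# of a 240 Hz budget -- which is why high-refresh players often skip RT.
rt_cost_ms = 2.5
for hz in (60, 240):
    budget = 1000 / hz
    print(f"{hz} Hz: RT pass eats {rt_cost_ms / budget:.0%} of the frame budget")
```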
@jcgongavoe337
@jcgongavoe337 Жыл бұрын
Just checked on eBay: the non-auction options cost around 600 to 750 CAD. Not really a good deal considering they're pre-mined.
@crazymcgee3604
@crazymcgee3604 Жыл бұрын
Newegg has the ASRock RX 6800 XT for 750 CAD, new, with a Starfield code thrown in. If the used cards cost that much, you might as well buy new.
@Wylabn
@Wylabn Жыл бұрын
I know this video is about gaming, but how about other applications like 3D rendering or video rendering? I was going to buy a new GPU, and my choice is between these two GPUs or the RX 6900 XT. I already considered the 4070, but it's a bit out of my budget. I'm really in a confusing situation right now, so what's my best bet? Should I go for Nvidia or AMD for my needs?
@k.23_02
@k.23_02 Жыл бұрын
For your needs you're best off going with Nvidia, because Nvidia GPUs are known to be better for that sort of work.
@thaibinh6585
@thaibinh6585 Жыл бұрын
AMD 3D rendering is shit compared to Nvidia. A 3060 is 2x faster than a 6800 XT, just so you know. When it comes to 3D rendering, Nvidia has no competitor. Some applications don't even support AMD GPUs because they don't have CUDA cores, lol.
@Eleganttf2
@Eleganttf2 Жыл бұрын
Why are you even still looking at AMD when you dabble in 3D and video rendering?
@loophole3526
@loophole3526 6 ай бұрын
I have a 3080 Ti and a 6800 XT and usually can't tell them apart. But I think they're both CPU-limited, one on a 9900KF at 5.0 GHz and one on an 8700K at 5.0 GHz. I will say the textures in TLoU look muddy, as it's likely loading lower texture quality. Hardware Unboxed does some really good VRAM testing you should check out.
@defenestratorX
@defenestratorX 8 ай бұрын
It also depends on the card manufacturer. Do you have any idea how much EVGA 30-series cards are going up in value? I was just barely able to buy a 3090 before they started going past $1,200, and I've seen a couple of Kingpin-series 3090s go for over $2k.
@nolive2nd180
@nolive2nd180 11 ай бұрын
For the last 3 months I've been testing used 3080 and 6800 XT cards, all gotten for under $350 in great condition. I decided to keep the NVIDIA card in the end (a Gigabyte GAMING OC) because of the better performance with my Quest 2 for PCVR. I've really fallen in love with VR since I bought the Q2 (mainly for sim racing). In the titles I play in desktop mode (4K 120Hz TV), I never really saw a VRAM bottleneck. Maybe having a 5800X3D helps as well, but that's my user experience.
@JohnSmith-mk8hz
@JohnSmith-mk8hz 11 ай бұрын
I also use a Quest 2 for PCVR and was considering getting an RX 6800 XT. What made the 3080 better, and by how much?
@marufulislam4311
@marufulislam4311 Жыл бұрын
The RX 6800 XT is just right for 1440p. You'll hardly ever need upscaling; it's just that fast. Having 16GB means you can max out any game, and it's future-proof.
@Squilliam-Fancyson
@Squilliam-Fancyson Жыл бұрын
Any GPU miners here? What coin are you mining atm? RVN seems quite attractive since the difficulty dropped massively.
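Not an endorsement either way, but the standard expected-revenue formula for proportional mining is easy to sketch. Every number below is a hypothetical placeholder, not a live network stat:

```python
def daily_coins(my_hashrate, net_hashrate, block_reward, block_time_s=60.0):
    """Expected coins per day for a miner with a given share of network hashrate."""
    blocks_per_day = 86_400 / block_time_s
    return (my_hashrate / net_hashrate) * block_reward * blocks_per_day

# Hypothetical KawPow-style inputs: 30 MH/s rig, 5 TH/s network,
# 2500-coin block reward, 60-second blocks.
coins = daily_coins(30e6, 5e12, 2500)
print(f"~{coins:.1f} coins/day, before pool fees and electricity")
```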
@RannonSi
@RannonSi Жыл бұрын
When it comes to VRAM: as it seems right now (if I'm not mistaken), there are about 4-6 games that (for whatever reason) exceed 8, 10 and even 12 GB at 1080p. To me, that is entirely different from when ray tracing first started appearing in games, where it was obvious it would take years before buying a graphics card just for ray tracing was viable (or at least wise; besides, how good was the RTX 2000 series at ray tracing, really?). *This*, on the other hand, looks like the start of a trend: VRAM usage is spilling over the 8GB limit (with 10 and 12 following shortly, would be my guess), and I'd guess at a rate at least twice that of ray-tracing adoption (at least after 2024). To me, ray tracing is an add-on, a luxury item. But being able to play games at 1080p on a card (released in the roaring '20s, mind you) that had an MSRP of $700 is (with exceptions, of course) the least one should expect.
@plasmahvh
@plasmahvh Жыл бұрын
lazy game devs
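For anyone who wants to check the "spilling over 8GB" claim on their own system rather than take a comment's word for it, here's a minimal sketch polling VRAM via NVML (NVIDIA only; assumes `pip install nvidia-ml-py`). Note it reports allocation, not strict need; many engines opportunistically fill whatever VRAM is available.

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```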
@Timmeh_The_tyrant
@Timmeh_The_tyrant Жыл бұрын
Are you using FSR in quality mode? Because it looks fantastic.
@dirtydoge756
@dirtydoge756 Жыл бұрын
That is definitely a hot take, the 16GB not being a big deal. There are a LOT of other YouTube tech channels that would vehemently disagree with that statement.
@Azureskies01
@Azureskies01 Жыл бұрын
AMD doesn't really get destroyed by Nvidia in ray tracing; they get destroyed in games with really shitty implementations of it. Word on the street is it's just like tessellation: games casting rays the player will never see, killing performance on both Nvidia and AMD but hurting AMD cards harder. Crysis 2 (Cyberpunk), anyone? Meanwhile, in UE5.2 (Fortnite) AMD and Nvidia are pretty much at RT parity, so good implementations exist as well.
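The tessellation analogy translates directly into ray-count arithmetic. A toy sketch with made-up per-pixel ray counts, just to show how fast "extra rays nobody sees" adds up:

```python
def rays_per_second(width, height, rays_per_pixel, fps):
    """Total rays traced per second for a simple full-screen RT effect."""
    return width * height * rays_per_pixel * fps

lean  = rays_per_second(2560, 1440, 1, 60)   # restrained 1 ray/pixel effect
heavy = rays_per_second(2560, 1440, 4, 60)   # wasteful 4 rays/pixel version
print(f"{lean / 1e9:.2f} vs {heavy / 1e9:.2f} billion rays/s")
# The heavy version burdens every GPU, but hardware with lower RT
# throughput falls further behind -- the commenter's point in numbers.
```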
@stevewall7044
@stevewall7044 Жыл бұрын
The 6GB VRAM difference cannot be nullified by DLSS vs FSR. 90% of people will never train AI models, so why do you guys insist on mentioning it? You keep pushing the psychological exploit of "fear of missing out". On what? On features people will either never use, or try once?
@robertdelacruz7920
@robertdelacruz7920 Жыл бұрын
Can I ask what your ambient temperature is? Your CPU stays cooler while gaming.
@patsk8872
@patsk8872 7 ай бұрын
"FSR doesn't look as good as DLSS" that's funny you say that right after showing a great comparison where both XESS and DLSS failed miserably at properly rendering a reflected light source, while FSR was the closest to the actual render. So that is anything but clear-cut. TBH all have problems, and I don't like the idea of using any of them unless I'm forced to. But if you really enjoy looking at an AI fever dream and pretending it's an actual render, you do you.
@notchipotle
@notchipotle Жыл бұрын
Because the RTX 3080 was used a lot in mining rigs (better hashrate), those cheap listings are probably in terrible condition (rusty, dirty, and in need of a repaste).
@professorchaos5620
@professorchaos5620 Жыл бұрын
This is a hugely wrong view. I've been mining with cards for years and they still perform exactly the same as when they were brand new. Most miners learn how to reduce heat and power for max efficiency, which is far easier on the cards than gaming.
@notchipotle
@notchipotle Жыл бұрын
@@professorchaos5620 Running 24/7 on an open bench, inside a humid room, never getting cleaned. Yeah, sure 🤣 Most used GPUs were in terrible condition; dust combined with humid air is the worst, turning into black gunk that's hard to clean. The fins and I/O shield get rusty, and some of the fans start to fail. I saw it myself; most miners don't give a shit.
@Very_Questionable
@Very_Questionable Жыл бұрын
@@notchipotle Honestly, quality issues are a huge concern with used cards, regardless of whether they were mined on or not. There are some mining operations out there that do keep their rooms dry, clean their cards, and try their best to take care of their hardware. The best way to grab a used GPU is to get one locally: arrange a meet-up instead of delivery. You can even negotiate the price while you're at it, and inspect the quality of the card carefully at the same time. Oh, and by the way, if you have any safety concerns, just know that you can do this right in front of your local police department.
@notchipotle
@notchipotle Жыл бұрын
@@Very_Questionable I've been selling used PC parts since 2011. Even the most careless gamer can keep their GPU in good condition for at least a year, but mining cards already look like crap after a couple of months. You can say whatever you want; I was just describing reality, not a personal opinion 🤣
@alperen1383
@alperen1383 Жыл бұрын
Nah, I got mine for 400 and it was from a gaming rig, in mint condition.
@AshtonCoolman
@AshtonCoolman Жыл бұрын
I have a 4090 and I dislike DLSS. DLSS, as you said, renders at a lower resolution. I'm not spending all this money to be forced to render at a lower resolution because of a lack of VRAM. Upscaling is a crutch, but people have given in to Nvidia's marketing. Rasterization performance is, and always will be, king.
@Dreamy_san
@Dreamy_san Жыл бұрын
Have to say that now that AMD has released ROCm for everyone, it lets you run CUDA-style code on AMD hardware, and it works pretty well.
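Concretely, the ROCm build of PyTorch exposes AMD GPUs through the familiar torch.cuda namespace (HIP sits underneath), so existing CUDA-targeted scripts often run unmodified. A minimal sketch, assuming a supported AMD GPU and the ROCm build of PyTorch installed:

```python
import torch

print(torch.cuda.is_available())             # True on a working ROCm setup
x = torch.randn(4096, 4096, device="cuda")   # allocates on the AMD GPU
y = x @ x                                    # matmul dispatched via ROCm/HIP
print(y.device, y.shape)
```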