AMD really need to fix this.

243,740 views

optimum

1 day ago

Check prices on Amazon below
NVIDIA RTX 4080: geni.us/tBDoHV
NVIDIA RTX 4070 Ti: geni.us/p3LvqFt
NVIDIA RTX 4070: geni.us/Q3Of
AMD RX 7900 XTX: geni.us/FceOS
AMD RX 7900 XT: geni.us/fNgHDPc
With AMD closing the gap against Nvidia's GPUs, there remains one massive difference between them.
need a new wallpaper? optimum.store
Video gear
Camera: geni.us/5YfMuy
Primary Lens: geni.us/pWnoPBr
/ optimum
/ optimumtechyt
/ only_optimum
As an Amazon Associate I earn from qualifying purchases.

Comments: 1,500
@gridleaf 1 year ago
As someone who games in a small room, power consumption really matters in the summer. Gaming can get pretty uncomfortable when your PC becomes a space heater.
@johnthomasjacobs 1 year ago
I actually picked up an AC window unit from my brother about a week ago cause my room was approaching the 85+ mark with a 3070 Ti and 5800X3D, absolutely nuts. Gonna be appreciating it in the winter though lol
@InfoSopher 1 year ago
@@johnthomasjacobs Undervolt the 3070ti. You can probably reduce draw by 50 watts without more than a few percent impact. Reduces noise as well.
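For anyone who wants to script this: a board power cap is a blunter tool than a true voltage-curve undervolt (which needs something like MSI Afterburner), but it can be set from the command line. A minimal sketch, assuming GPU index 0 and an illustrative 240 W target:

```python
import subprocess

# Minimal sketch: cap the board power of GPU 0 to 240 W via nvidia-smi.
# The --power-limit flag is real, but the accepted range depends on the
# card's vBIOS; check "nvidia-smi -q -d POWER" for the min/max first.
# Requires administrator/root rights.
subprocess.run(["nvidia-smi", "-i", "0", "--power-limit", "240"], check=True)
```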
@Manbearpiet 1 year ago
I've never noticed until I upgraded to a 7900XTX. It's a hot boi, nice in winter, but in the summer it's a bit much.
@ahora1026 1 year ago
@@johnthomasjacobs me, appreciate gaming in the winter. summer is unbearable. 100+ is like average here. sometimes hit 110+ indoors
@hydrohelic 1 year ago
@@ahora1026 and that's in Celsius? Can't believe it. Really??
@malceum 1 year ago
What's interesting is that the Nvidia 3000 series was extremely power hungry. It wouldn't stop scaling with more power. The 4090 maxes out at 300 watts in most games.
@Dionyzos 1 year ago
It's mostly due to the Samsung node Ampere was using. TSMC is just way more efficient even when comparing current nodes.
@zuckdaddy1596 1 year ago
only if you have exotic cooling. on air/water the chips pull so much energy per mm^2 of silicon that the internal temps get so warm that the cards pretty much keel over and barely clock. also Ampere was very voltage sensitive, so that was a thing
@eliadbu 1 year ago
Samsung's 8nm node was/is trash in almost every aspect - for that reason, Nvidia was able to get it for about as cheap as TSMC's 12N node.
@GeorgeD1 1 year ago
I have a 4090 with a water block and it actually naturally tops at around 420 watts when pushed, without any tweaking/overclocking. Still lower than the stated TDP of 450W though.
@pokealong 1 year ago
My 3080 Ti ranges anywhere from 250-350W depending on the load demanded by the game. It's quite great at lowering power draw when possible. If you've experienced different, maybe it's a Windows power management issue, as I'm on Linux.
@fabian3034 1 year ago
Power consumption is so important. Many people don't think about it. I hope it will become a higher priority for hardware- and software developers as well as product reviewers.
@Dell-ol6hb 8 months ago
No one seemed to care about it last Gen when AMD was more power efficient, now it’s so important apparently.
@nooptiui 3 months ago
@@Dell-ol6hb Yea, that's another thing I noticed: when AMD does something better than Nvidia, somehow it isn't talked about at all, but when Nvidia does the same, suddenly AMD is the one that needs to do better. Not to mention, we're comparing 2 graphics cards that have a THREE HUNDRED DOLLAR difference in price and people are acting like it's fair to make a direct comparison.
@aviatedviewssound4798 2 months ago
@@nooptiui It's double-standard fanboys.
@user-zw1oy5pm3s 1 year ago
Optimum Tech, my ASRock RX 6600 Challenger D makes a noise like a dragonfly flying near your ear (very similar to what's in the video)... I had to undervolt. Why is there a squeak/crackle even on a video card as weak as mine?
@rakeau 1 year ago
It's interesting because AMD's Ryzen CPUs have been really quite efficient. Seems like they need to share some R&D there!
@alexanderarias8519 1 year ago
I think it might be that AMD's default voltage is overdone? Because what we've seen for the last few gens is that undervolting will tank consumption and improve clocks and temps drastically.
@kobiblade 1 year ago
Not really. What you're seeing here is an inexperienced person running AMD without undervolting and claiming AMD needs to fix something that isn't broken.
@thebcwonder4850 1 year ago
@@kobiblade Navi 31 is actually starved of power, so undervolting only gives a slight power efficiency increase
@w04h 1 year ago
@@kobiblade So? You can also undervolt Nvidia and it will still run better. YOU are the one who is inexperienced. Navi 31 scales with power; it actually loses more performance as you decrease it, while a 4090 at a 60% power limit can run just as fast as at full power.
@pathway4582 1 year ago
@@kobiblade 🤡
@roebucksruin 1 year ago
I couldn't imagine buying a gpu this generation. The cons are pretty wild. Either you pay a lot up front, or you pay a lot over time. Fingers crossed that Battlemage puts pressure on both companies.
@disadadi8958 1 year ago
Yeah, they're pricey. I wanted one for Nvidia Isaac purposes and the gaming performance is great too; my 3060 Ti was just not good enough for 4K gaming. But I do agree with the fact that the prices have gotten out of control.
@inmypaants 1 year ago
Perspective matters. I couldn't get a 3080 2 years ago when they were commonly selling for over $1000 USD; I can get a 7900 XTX now for $850, which uses a similar amount of power as the 3080 and has 140% more VRAM and around 40-50% more pure raster... during times of unprecedented inflation.
@roebucksruin 1 year ago
@@inmypaants Scope matters too. Comparing current pricing to a small, volatile moment in the GPU market is a very limiting scope. Zoom out and look at market trends beyond 2020-2022.
@kyle8952 1 year ago
Meanwhile AMD's integrated graphics are getting very, very good for what they are. As a casual who plays games that are a few years old when they're cheap on Steam, I have no interest in GPUs anymore.
@alexanderarias8519 1 year ago
Intel is definitely gonna put pressure on them. Their development is being pumped out in record time. While NVIDIA is dragging their feet and milking consumers, Intel is quietly catching up.
@Superwazop 1 year ago
Might have something to do with chiplet GPUs; I think they need a constant voltage to work well, kinda like the X3D or RAM.
@pd8109 1 year ago
Exactly, this video makes no mention of the different designs in that respect. I believe Nvidia is also using a more expensive node. This is AMD's first GPU chiplet, so not just extending the monolithic chip like Nvidia. Very biased, considering.
@jonessii 1 year ago
@@pd8109 I'm glad you've already nailed the reason down
@Penfolduk001 1 year ago
@pd8109 The video isn't about different chip architectures. It's about power draw when playing games. Which, given I'm a UK user, probably has a greater influence on me with current energy prices than on a US user. So the video is not biased. If anything it goes out of its way to point out the price difference between the two cards, and that historically nVidia has been the power hog. It may well be AMD will pull things back with the next generation of cards. Or at least nVidia will have to up its power draw to similar levels. It may also be that if it was tested with different workloads, such as video rendering or encoding, the AMD card would come out the most energy efficient. But that wasn't the test. Having said all that, I do hope the tests were done using the same DX10/11/12 configuration between the cards. I also wondered whether one of nVidia's upscaling technologies was enabled on the test machine. Whilst it's easy to toggle DLSS on and off for individual games, I seem to remember some other tech they have that upscales across all programs. I'd also like to think the latest drivers and firmware were used for both cards.
@ms3862 1 year ago
Yep, the Infinity Fabric link between the chiplets can't be throttled; it has to run at full power always, which is why RDNA3 idles high. But this doesn't account for a 200W difference, more like 30W.
@Penfolduk001 1 year ago
@elcactuar3354 Firstly, I'm going to write it the way it was written when I first came across them. I'm not going to take lessons in capitalisation from a company that says "3X more performance..." in presentations rather than "3 times more performance..." Finally, NVIDIA all caps is not the same as AMD. AMD is an acronym for Advanced Micro Devices, so it is supposed to be all capitals. nVidia isn't an acronym. It's a made-up name for a company whose marketing department is so bereft of ideas they can only come up with switching between upper and lower case letters to justify their existence.
@ethanoux10 1 year ago
i hope a driver update comes out for this
@The1astGuardian 1 year ago
Most likely, just needs to reach AMD as fast as possible
@Vader294 1 year ago
@@The1astGuardian you think they don't already know this?
@xokin156 1 year ago
lmao wasn't this problem already known since release?
@caiolucasnovais3676 1 year ago
@@Vader294 they need to know that it is actually annoying the consumers or they will just keep on delaying the patch forever
@sephondranzer 1 year ago
I have a theory that for every video, there’s a perfect comment 😂
@StevieFQ 1 year ago
I had 2 servers, and after doing the math I realized that selling both and using the money to buy a more power-efficient system was cheaper over the course of 3 years. Depending on energy prices, that $300 diff can be swallowed up pretty fast. Especially if you have a price cap for lower-power consumers.
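The break-even arithmetic is easy to reproduce; here is a rough sketch where every number is an illustrative placeholder rather than the actual figures from this setup:

```python
# Break-even estimate for swapping power-hungry hardware for an
# efficient replacement. All values are illustrative placeholders.
old_draw_w = 150.0           # two old servers, combined idle draw
new_draw_w = 40.0            # one efficient replacement system
price_per_kwh = 0.35         # electricity price in EUR/kWh
hours_per_year = 24 * 365    # servers run around the clock

saved_kwh = (old_draw_w - new_draw_w) / 1000 * hours_per_year
saved_per_year = saved_kwh * price_per_kwh   # ~337 EUR/year
net_upgrade_cost = 600.0     # new system minus resale of the old ones

print(f"saves ~{saved_per_year:.0f} EUR/year")
print(f"break-even after ~{net_upgrade_cost / saved_per_year:.1f} years")  # ~1.8
```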
@AxleLotl 1 year ago
I think this is an architecture battle. Chiplets have so much extra communication happening, plus the hyper-threading of the AMD shaders will always draw more power than single-thread CUDA cores in a monolithic-die situation. A recent Adrenalin software patch has helped heaps with AMD idle power draw at least 👌👌
@nochancecw 1 year ago
The power difference in some places would make up the cost difference in the cards. That is wild.
@RydrewWTF 1 year ago
This is what I've been saying since the 4000 cards launched, but nobody is giving a damn, even big YouTubers. With a small case and the rising price of electricity, watt consumption is very important.
@alvarg 1 year ago
I mean, between a $1200 4080 and an $850 7900 XTX I highly doubt it. But I've heard AMD is addressing this issue though.
@LauraLovesHugs 1 year ago
@@alvarg where in the world are you finding $850 7900xtxs lmao. they're like $1400 here
@stiegelzeine2186 1 year ago
People who care about money shouldn't get the highest-end models. And also, no: spending extra money on a more expensive and maybe more efficient GPU won't pay for itself in at least 10 years, and those people buying a 4080 or 7900 XTX usually sell it in 2 years because by then there will be a better one. In maybe 5 years the cheapest GPU will beat the old, outdated top-end model from now.
@varmastiko2908 1 year ago
@@stiegelzeine2186 People who care about money should only buy high end this generation because everything else is obsolete already. And it's by design.
@LastSecBloomer 1 year ago
Fun fact: in light loads and idle, AMD CPUs are actually less efficient than Intel ones.
@SweatyFeetGirl 1 year ago
idle yes, light loads no
@PowellCat745 1 year ago
Yes, because it’s chiplet.
@riccardotesta987 1 year ago
The 7900 XTX pulls 5 watts at idle on high-refresh-rate monitors with the recent update...
@greebj 1 year ago
Not their monolithic 8-core laptop chips; they make Intel's 6+8 look silly below 75W. Unfortunately they just didn't scale much past that, so AL/RL could still claim overall performance parity. Dragon Range 16-core parts are a multi-CCD/IO setup, so they do suck (badoom tish) at idle just like the desktop parts.
@prajaybasu 1 year ago
@@greebj Idle power is still lower on Intel. AMD wins between 5-35W, but outside those ranges Intel wins. See AnandTech numbers.
@danieldigi2981 1 year ago
So just to add my findings into the ring: I found that my 7900 XTX's power draw drops drastically when you cap its boost clock. In lower-demand titles the GPU tries to keep its clocks as high as it possibly can and draws more power to do it. Setting the GPU to a 2300 MHz min clock and 2400 MHz max clock will drop the power usage substantially, but it still won't be anywhere near the power efficiency of the 4080. The only way to get its power to go way down is to frame cap.
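On Windows this kind of clock cap is typically set in the Adrenalin tuning page; on Linux the amdgpu driver exposes equivalent min/max limits through sysfs. A hedged sketch, assuming root, the overdrive bit enabled via amdgpu.ppfeaturemask, and that the GPU is card0 (the exact syntax varies by kernel and card):

```python
from pathlib import Path

# Sketch: pin the core clock of an AMD card to a 2300-2400 MHz window
# via the amdgpu overdrive interface. Each write is a separate command.
od = Path("/sys/class/drm/card0/device/pp_od_clk_voltage")
od.write_text("s 0 2300\n")  # set minimum shader clock (MHz)
od.write_text("s 1 2400\n")  # set maximum shader clock (MHz)
od.write_text("c\n")         # commit the new limits
```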
@07Lightless 1 year ago
Overall would you still recommend the 7900XTX or 4080?
@padnomnidprenon9672 1 year ago
@@07Lightless Do you have a particular use case, like using CUDA -> Nvidia or running on Linux -> AMD? As an AMD fan and Linux user, I prefer their cards, but RDNA3 is not that compelling to me. I got a 6600 and I am not really incentivized to switch to RDNA3. I'd rather choose an Nvidia for this generation or wait 2-3 years.
@rENEGADE666JEDI 1 year ago
@@padnomnidprenon9672 On Linux RDNA 3 works, so instead of 30 TFLOPS you have 60, as it should be. It's a pity that this architecture doesn't deliver that in games.
@Z4d0k 1 year ago
@@07Lightless I enjoy my 4080. It's cool, quiet, efficient and powerful. Can't fault it. I can't see any reason to get the XTX over the 4080, other than it has more VRAM.
@danieldigi2981 1 year ago
I think if you're willing to mess with it, the XTX can be tuned to be a bit more manageable; you may have to make profiles for each game, which is tedious. It won't ever be as power-efficient as the 4080, but it's really up to the person. I think tinkering with it and saving 300 dollars is worth it, but if you just like to run things out of the box the 4080 is probably better.
@lennard9331 1 year ago
I suspect this is due to three things. The first is the fact that chiplet designs naturally have a higher baseline power draw from what I can tell. Looking at the power stats on HWinfo64, the MCDs appear to be drawing about 25W under load no matter what. The second is the SoC, which I'm not entirely sure what it equates to, but I've seen draw upwards of 40W. The last one is the fact that without Radeon chill, AMD cards seem to try and go full tilt no matter what even when you're CPU limited. The GPU core itself is actually relatively tame and efficient, and doesn't really draw more than 200-250W, even when overclocked, but everything around it seems to be super inefficient for some reason.
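Taking the wattages in that comment at face value, the fixed overhead alone accounts for a fair share of the light-load draw; a quick sanity check using only the numbers quoted above:

```python
# Back-of-the-envelope check using the figures from the comment above.
mcd_w  = 25    # memory cache dies: roughly constant under any load
soc_w  = 40    # SoC/fabric power as observed in HWinfo64
core_w = 225   # midpoint of the quoted 200-250 W core range

overhead = mcd_w + soc_w
total = overhead + core_w
print(f"{overhead} W of ~{total} W is fixed overhead "
      f"({overhead / total:.0%}), paid even in light games")
# -> 65 W of ~290 W is fixed overhead (22%), paid even in light games
```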
@David-yx3bd 1 year ago
Yeah, this was the first generation where we see a manufacturer move away from the more-power-for-more-performance model. Even the 4090 is extremely power-efficient. I think it's a result of Nvidia realizing that realistically the power draw couldn't keep increasing forever. AMD, I think we could see this efficiency push next gen, but given the general feedback from the major reviewers? Doubtful; they all pretty much ignored AMD's higher power draw or barely mentioned it, so there's not really any incentive for them to do so. It's funny because when it's CPU power draw it's a huge topic, but with video cards? Nobody seems to care.
@julianB93 1 year ago
Yeah, I'm somewhat bothered by it too. AMD isn't even much cheaper. I bought a 7900 XTX and the power draw compared to my old card is massive.
@The_Privateer 1 year ago
People like to hate on Nvidia for whatever reason - but as benchmarks and research and published use show... they are just better. You get what you pay for.
@mrpirate2669 1 year ago
Something that needs to be pointed out here is that the PCAT is developed and provided by Nvidia. Gamers Nexus spoke to Aris from Cybenetics at Computex; he helped Nvidia develop the PCAT, and Nvidia refused to send him one to test after it was released. He also stated that it doesn't show the full picture for these tests, as PCAT doesn't measure transients or EMI; transients in particular can make a significant difference. For context, Cybenetics makes professional-grade testing equipment for measuring power draw in particular.
@shishiksa 1 year ago
Was about to make the same comment. This cannot be stated enough
@ZAKKORD 1 year ago
@@shishiksa it doesn't matter at all when he's measuring at the wall
@robertjung8929 1 year ago
This is the very reason why I bought a 4070 Ti instead of a 7900 XT. The latter is slightly better at 4K and 50 euro cheaper, but the huge power draw of the 7900 XT made me choose the 4070 Ti.
@PowellCat745 1 year ago
Bro you paid $800 for a 12GB card 😅
@robertjung8929 1 year ago
@@PowellCat745 No, I paid 860 euro for 7680 CUDA cores 😉
@PowellCat745 1 year ago
@@robertjung8929 LOL as long as you’re happy 😃
@robertjung8929 1 year ago
@@PowellCat745 If I compare the cost of living now and 2 years ago, and apply that difference/ratio to GPU prices today, then the conclusion is that GPU prices are adequate. It might be hard for someone to understand, but that's the reality of the inflation-driven world we're living in.
@busetgadapet 1 year ago
@@robertjung8929 Agree with you. I just don't understand people who expect GPUs to be priced the same or even lower when the price of eggs and everything else has risen significantly. And I don't blame Nvidia for raising their prices, because AMD has simply failed to close the gap and offer any competition. For me it's laughable when people spend $1000 on an iPhone that's only used for 1 year yet complain about an $800 card that can bring entertainment for 5+ years lmao
@mikhailvakhitov8911 1 year ago
Really good catch with this investigation. Could you please include it in future graphics card comparisons? It's pretty important for understanding what companies claim compared to reality. Thank you!
@blitzohacks3726 1 year ago
But the coil whine depends on the manufacturer of the card. For example, the Sapphire Nitro cards have almost no coil whine, while cards from other manufacturers have a lot.
@lunacastle3561 1 year ago
It definitely depends on the manufacturer. My Sapphire 5700xt Nitro+ and PowerColor 6800xt both don't have coil whine
@pill-poppingnarcissist 1 year ago
I don't know man, I'd never even heard about coil whine until I bought my first AMD card (6950 XT Red Devil). People told me, "oh it'll go away after some use, give it a few months!" and it never did after a year of daily use. I've mainly used Nvidia cards and never had this issue, and still haven't; I upgraded to a 4090 after my horrible AMD experience and mine doesn't have any coil whine... I got the PNY 4090. AMD just sucks for that reason, and their bad drivers.
@PowellCat745 1 year ago
Yep my 4090 strix has insane coil whine.
@Fztrm 1 year ago
@@pill-poppingnarcissist My ASUS TUF 4090 OC I got on release was screaming! But it does not seem to whine anymore at all, so I guess I got lucky in the end.
@Boodoo4You 1 year ago
@@pill-poppingnarcissist I've got an ASUS TUF 3080 12GB and it squeals in some games. When you hit that perfect amount of load, it sounds like you're choking a mouse! Some components just resonate at specific frequencies and it's impossible to predict, even in "good brands".
@conza1989 1 year ago
Really interesting tests here. I would enjoy seeing a follow-up with some undervolting to see if the behaviour is mimicked in those conditions; we know that performance per watt can increase significantly for Nvidia, and probably somewhat at least for AMD.
@Kankipappa 1 year ago
I already wrote one comment on the other thread here, but the fix is to cap your max GPU frequency to fix this "high idle" power consumption when running less demanding games. For some reason the GPU/driver doesn't know when to drop the frequency if running without vsync, probably related to the issues the 5700 XT had with low resolutions when RDNA was new tech - it would see such a small load that it would heavily underclock the GPU, and that would tank max framerates in older games. Now you either need to know how to use Radeon Chill, or manually cap the framerate OR the GPU core MHz to avoid this.
@scandiacusbubo2006 1 year ago
He should use the reference card from both companies.
@pangtundure 1 year ago
I think this is because Ada is tuned more for power efficiency. There is a huge performance gap between my 3060 (80W) and 3060 (115W), about ~20 fps on average. And looking at the new Ada GPUs, they are super efficient compared to the previous gen.
@elfo7918 1 year ago
Fun fact: Igor'sLAB made a video about the 4060 "115W" card, and it's higher (~132W), and if it's limited to 115W the card is even worse.
@davidhwang8065 1 year ago
@@elfo7918isn’t that just common sense?
@alvarg 1 year ago
@@davidhwang8065 What he means is that the 4060 is already worse than the 3060, and if you cap it to 115 watts it becomes even more so. So where is the architectural improvement?
@dat_21 1 year ago
Plot twist: it wasn't a power efficiency bump. It's just that every card is 1 tier higher than it should be.
@pangtundure 1 year ago
@@dat_21 😅 You're not wrong, that's such scummy behavior by big green 💚, and medium red ♥️ has been following suit for the last 2 generations.
@ZeroByte7 1 year ago
I have been using the Nvidia 4090 for 7 months now, and I must say that the coil whine was extremely frustrating, especially since I previously had the 1060, which is a weaker card. However, after approximately 7 months of continuous usage, I can confidently say that the coil whine sound no longer appears as it did in the beginning. I had read that this would be the case, and it turned out to be true. I am pleasantly surprised by the fact that the temperatures on this card are actually lower compared to other cards I've used. Considering the impressive power it provides, I am very pleased with my decision to stick with Nvidia.
@Fztrm 1 year ago
Mine was whining a lot (4090 TUF OC) but it is gone now. I have no idea when it stopped whining, though I do remember it was horrible.
@unknowntiger9108 1 year ago
@@Fztrm Sometimes you can just break the card in to reduce coil whine. It makes sense.
@Fztrm 1 year ago
@@unknowntiger9108 Yeah, I got lucky with this 4090 in the end, but it's a different story with my 3080 Ti... that thing still whines to this day!
@cotasa 1 year ago
Some people get the opposite (after some hours/days of burn-in the whine gets worse). You were lucky.
@Stratos1988 1 year ago
Lucky you. I bought a used 2080S and it's getting really noisy, even though my PC is on the floor with a mostly solid front. The core is capable of over 2GHz, but those few FPS are just not worth it.
@Sebbz 1 year ago
The power draw was the main reason why I chose the 4080 over the 7900 XTX. Absurd power usage on AMD cards.
@tomr3319 1 year ago
4080 is a better GPU - I stopped blindly believing the AMD hype and Nvidia bashing a while ago..
@kdawgmaster 1 year ago
As a 7900 XTX owner and a regular CSGO player, I haven't noticed this trend at all. Granted, I haven't looked at my fans, but when I did a test when I first got the card it was hardly drawing any power whatsoever. I'm curious, have you tested this with other brands? It could come down to the vBIOS of that specific card you used. Also, the card I have has no coil whine whatsoever, and this isn't a factor of 7900 XTX vs 4080. Coil whine is determined by the parts the AIB uses and NOT the model of card itself, so I find the blanket statement about the 7900 XTX rather disingenuous.
@giglioflex 1 year ago
It's a misleading comparison he's making. He's using an OC ASUS model vs a reference card. Chances are ASUS is just being ASUS and massively overvolting as usual, thus pushing up power consumption. Check any other review on the 7900 XTX, it doesn't line up with this video.
@kdawgmaster 1 year ago
@@giglioflex I would say yes and no in this instance. While you are right that the OC'd card should be expected to draw more power, a non-GPU-demanding game taking that much more power is still kind of odd. One thing we don't have here is the test setup specs. While it shouldn't be that damning in this situation, it should ALWAYS be included in any comparison benchmark. To give an example: my 7950X and 7900 XTX setup in CSGO, with my settings at a 4:3 stretched res, gets between 600-1000+ FPS depending on the map (yes, I have hit as high as 1000+ FPS on the netgraph in D2, not kidding). And like I said, I don't notice a massive power draw on the card. I do think the apples-to-apples comparison here would have been to use a TUF 4080 as well, as it might have the same coil whine if ASUS uses the same or similar power management. Coil whine comes from things like MOSFETs, VRMs, caps and such (sometimes fans can produce similar noises), and that is my main concern with bringing that up. With the blanket comparison he did, you would assume the CHIP itself is causing the whine, and that isn't likely the case.
@tbums8139 1 year ago
Some people have reported that the high idle draw can be fixed by slightly increasing the blanking timings for their monitor. This gives the VRAM a chance to clock down and can drastically decrease power draw. For some monitors it's even enough to create a custom resolution in the Adrenalin software and select CVT for the timing standard. Unfortunately you can't manually adjust blanking timings within the AMD software, but there are tools like Custom Resolution Utility that make it possible.
@SpudCommando 1 year ago
Yep. People with LG C1s have gone down from 50W idle to 7W just using CRU.
@GewelReal 1 year ago
or just, AMD should make a good product for once
@_TrueDesire_ 1 year ago
Idle consumption should be fixed with the latest driver now.
@markcollard9326 1 year ago
@@GewelReal Sure, Xboxes or PlayStations with AMD APUs are crap, right? The products are great; the software and firmware need help. Maybe taking the bias out of your mind would make you see more clearly.
@GraveUypo 1 year ago
@@markcollard9326 The Steam Deck also runs AMD, so does the ROG Ally. AMD is everywhere.
@denmaakujin9161 1 year ago
Didn't AMD GPUs get a driver update recently to fix some power draw? It would be awesome if he told us which drivers he was using at the time.
@rusty10785 1 year ago
They did, and it's called gaslighting. 😅
@daveleebond 1 year ago
That was idle power draw, which is still high.
@ChromaticBlade 1 year ago
That was idle power draw as far as I can tell. I went from 50ish watts at idle to around 8 with a 1440p 165Hz monitor.
@Ben-Rogue 1 year ago
In his review, the 7900 XTX system was using 80 more watts at idle... If the issue is still present, that could be a big influence on these numbers
@winniethemao3429 1 year ago
AMD only creates new bugs.
@Gates-Tech 1 year ago
Locking your clocks in the Radeon software to the advertised speeds for your specific 7000-series card will do a lot; they go way beyond their recommended clock speeds. It will also help a lot with coil whine.
@azaz20244 1 year ago
My 6800 XT is locked at 2400MHz all the time. I'm sure it could go higher but I never bothered.
@Gates-Tech 1 year ago
@@azaz20244 Yeah, a few fps of difference is not worth much more heat, power draw and coil whine. Just like the MHz lock, your card's behaviour will improve if you play with lowering the voltage a little.
@jamescpalmer 1 year ago
How can I lock my core speeds please Tom?
@davehenderson6896 9 months ago
With Nvidia you pay now, and with AMD you pay later.
@MrBlackmidi1234567890987654321 7 months ago
Kind of like buying a phone off contract vs on contract
@catbertz 1 year ago
Could the difference in process node be responsible for AMD's need to pump higher power to match fps? I'm sure AMD can further refine their drivers, like in idle state.
@Thornfrey 1 year ago
Nvidia has its own tuned node; AMD is on standard 5nm, but that is not the main difference... it's the multi-chip design that comes with high power demand. It's surprising they managed to keep power consumption quite low with their first chiplet gaming GPU design. For example, Intel planned multi-chip for Arc too, but failed because power consumption didn't scale linearly. Koduri said there is no point in releasing a GPU that is on par with a 2080 Ti but consumes 2000 watts.
@user78405 1 year ago
No, it's bad drivers... that's been well established since the ATI days, since the R9 280X.
@AstroTwist 1 year ago
@@Thornfrey No, Nvidia uses TSMC 5nm, but for marketing they call it 4N when it's actually 5nm. Whereas AMD uses TSMC 6nm; that's why the power consumption is way higher than Nvidia's.
@Thornfrey 1 year ago
@@AstroTwist That's why I said Nvidia uses a tuned node and AMD standard 5nm. And you are wrong: AMD uses 5nm for the main die (the GCD) and 6nm for the MCDs... 6nm is used only for N33.
@prosyn9352 1 year ago
I think it coincides with the monolithic vs chiplet design. Desktop Ryzen requires more wattage compared to Intel at idle or light work; on the notebook side, Ryzen's monolithic design is very efficient.
@Reza1984_ 9 months ago
I wish you would do an undervolt and underclock test with these two cards in a few different games to see how that would affect power consumption. Power costs are a huge concern for people in the EU.
@echowhiskey419 1 year ago
I think the same problem exists in the 6000 series. Undervolting my 6800XT gives me a step up to 6900XT performance and yet brings my power consumption down to about 230W in games. These things are running hot with lots of coil whine out of the box, no idea why. Perhaps enabling a wider range of silicon to pass validation?
@TheNuare 1 year ago
Your connection between coil whine and silicon is like socks to a banana. Coil whine here is copper wire vibrating in the coils; it could also be bad caps.
@enerjustics 1 year ago
The whispers are that this is related to the same hardware flaw in the chip design which ultimately prevented the 7900 XTX from competing with the 4090 as well. Apparently this cannot be fixed with drivers, but they have it addressed for RDNA4. The planned mid-gen refresh for the 7000-series cards was cancelled as a result, and they'll try to pull the RDNA4 launch forward.
@kanbekan 1 year ago
There's another whisper that because of the nature of the MCM design, the card needs more power to move data between chiplets.
@Sly_404 1 year ago
Given that we are now 8 months post-launch and the power consumption under load hasn't even been acknowledged on their side as outside of what they expect, that's the most likely scenario. RDNA4 had better deliver on the shortcomings of RDNA3. Nvidia is giving them plenty of opportunities atm, but AMD fails to take advantage of them.
@winebartender6653 1 year ago
Ah yes, the same whispers about idle power draw and VR issues, which were fixed in the newest driver.
@nirodper 1 year ago
One thing to note is that RDNA2 did throttle power better.
@enerjustics 1 year ago
@@winebartender6653 Separate issue unfortunately
@Smjh123 1 year ago
Efficiency is what's keeping me on a 1060 for so long (that and really bad value propositions ofc). I can set the power limit all the way down to 50% and still play my games locked at 60 fps with render latency under 16 ms. That's around 60 watts with fans set to 45% at most. I love this thing.
@uzer_zero 1 year ago
Thanks so much for running these tests and posting the results! I'm toying with a new build and was aiming for something at the higher end, but was leaning away from the 4090 simply due to size and power consumption (i.e., its effect on my utility bill). Looking at the two cards you tested specifically, I was leaning toward the XTX due to the price differential. The advertised TDPs of both these cards, while ~11% higher on the 7900, don't reveal THIS big of a difference under load. Cheers!!
@TheStoic84 1 year ago
The difference is a 4080 FE vs an AIB OC version of the 7900 XTX. I also expect something is wrong with that 7900 XTX, as it pulling a continuous 510W isn't even in the ballpark of what other review sites show (it's a good 50W+ over every other review site).
@bafon 1 year ago
Some power consumption difference is to be expected with Nvidia being on 4nm vs 5nm (and 6nm) for the AMD counterpart, but I guess most of this is just driver issues and will be sorted out later on, like with previous gens.
@robertr.1879 1 year ago
The 7900 XT (the tamed version of the XTX) has the same problems: big power draw, coil whine, and high fan speeds that add to the noise. Those cards are "factory overclocked" and they consume so much power for just a few more fps. I had to work hard on my specific model (Sapphire Pulse) to make it more efficient; undervolting and limiting the power/clock speed made a big difference. In the end I'm happy with it, but it was a struggle.
@pronstorestiffi 1 year ago
I initially bought a Gigabyte 7900 XTX, but due to the coil whine and loud fan performance I shipped it back, buckled and bought a Founders 4080.
@papertf 1 year ago
Hey. I have the same Sapphire Pulse 7900 XT and I'm getting more instability/frame drops than with my 2080 Ti. Any tips on how you made it more efficient?
@jondonnelly4831 1 year ago
I tried undervolting mine, but any, and I mean any, change from stock causes the card to crash the driver. Not straight away, but it does happen. I made a big mistake buying AMD. It's not temps or the PSU. So bad I will probably put my 2080 Ti back in and wait for next gen.
@JABelms 1 year ago
@@pronstorestiffi The cooler was so fat that I bought a Bykski block before even buying a 7900 XT. For $120, getting 55°C at full load without fears of coil whine or breaking my PCIe slot is kinda worth it.
@Ben-Rogue 1 year ago
You really shouldn't buy the Pulse for anything above mid tier. They really aren't a high-end cooler, and use fairly cheap components compared to the Nitro+, which are excellent btw! As are the Red Devil coolers from Powercolor
@jann5s___ 1 year ago
I really like the "AMD Chill" option to sort of remedy this. I know it's not a real fix, as you lose performance, but in these types of games I only need about 140Hz.
@Kaori42 1 year ago
Limiting the framerate doesn't make the XTX consume less; it still runs at very high clocks while consuming near max power. Manually adjusting the clocks brought mine down from 310W (at -10% power limit) to 200W.
@PineyJustice 1 year ago
@@Kaori42 That's not how chill works, it doesn't work like vsync, it underclocks the card to sit between a min and max framerate.
@GewelReal 1 year ago
Do the same to Nvidia and suddenly Nvidia also pulls less. Crazy!
@delancre5858 1 year ago
@@GewelReal Fanboyism aside, does Nvidia have a similar option? I mean, looking at the tests above, I guess it didn't need it, but still, you basically don't have the same AMD "Chill" button to "do the same with Nvidia".
@PineyJustice 1 year ago
@@delancre5858 Nvidia has nothing like Chill; it would have really come in handy with the 3090 that used 500+ watts.
@ventilate4267 1 year ago
I guess NVIDIA's gamble of giving reviewers the PCAT paid off (they were likely hoping they'd come out on top in comparisons like this)
@rollerr 1 year ago
It's also convenient for Nvidia to have the comparison done purely in GPU vs GPU power. Their scheduler offloads work onto the CPU, whereas AMD's does not. This means more driver overhead for Nvidia, as well as increased power needs, depending on the CPU of course. If you're using Intel, those power needs could be pretty hefty. Hard to say how much power is actually saved from the GPU by doing this, but I'd wager the actual efficiency lead at the wall is not as high as Nvidia wants us to believe.
@skulu 1 year ago
@@rollerr But this video shows the at-the-wall power draw at the start, which is still in Nvidia's favour.
@AlleonoriCat 1 year ago
@@rollerr my dude, the graphs at 1:00 are literally at the wall, it's written right there. And I am an AMD fan, I hate how greedy nvidia is right now. But we don't have to bend backwards to justify one corporation over the other
@rollerr 1 year ago
@@skulu He's also using a factory OC model vs a stock 4080... those use 100+W more than reference. He's also not undervolting or using any of the power saving tech available in AMD drivers, something any sane person that actually cared about efficiency would do.
@heyitsmejm4792 1 year ago
@@rollerr Bruh, you can also undervolt the 4080...
@jerryscared 1 year ago
Would really appreciate the full system spec when one of the major metrics is total system power. For example, AMD vs. Intel have very different power profiles depending on the load. Although the difference between the two setups is obviously the GPU (the only variable), providing the system spec would just make the context much clearer.
@Splatterboy519 1 year ago
Great video as always. I'm a bit curious though: how well do the new 7900 XT series respond to undervolting, and are there any other ways to reduce power draw on the user end?
@jondonnelly4831 1 year ago
Mine responds to underclocking but stability suffers. It will pass 2 hours of a 3DMark loop but could crash playing WoW one night under not much load. So I leave it stock and boosted the fan speed curve. Is it 100% stable like that? No, but it is as good as the AMD 7900 is gonna get. 99 other things affect it, including drivers, driver settings and the phase of the moon. I've spent days and days trying to tune it, got loads more performance out of it too, and it will pass many benches but let me down in a game. Life is too short. My next rig will have ZERO AMD in it. Well, I keep hoping for the magic AMD driver that fixes it, and AMD has gotten better, but Intel/Nvidia is solid.
@greebj 1 year ago
Frame cap to the refresh rate of your monitor, for one. Burning 300W more to get
@sshadows8974 1 year ago
Exactly the point. OT didn't cover undervolting at all, which is a MUST with RDNA3.
@RedsawSixtySix 1 year ago
The AMD cards generally respond really well to undervolts. Tech Yes City has done some decent tests on this in the past.
@janaebert3059 1 year ago
@@jondonnelly4831 You didn't even specify how you tuned your card.
@butterbirne1060 1 year ago
I'm glad that I upgraded to the 4080 instead of the 7900 XTX. More expensive when you buy it, but it was the better decision for the bills at the end of the month.
@Ben-Rogue 1 year ago
I'd be curious to see the difference in power consumption in games that favour either card performance-wise, but locked to a specific frame rate. To test whether game optimisation has any meaningful impact on power consumption frame for frame... Not necessarily for these two GPUs specifically, but matching up two similarly performing and power-hungry GPUs from either side. I think that'd be really fascinating, if doable.
@aditrex 1 year ago
At full load the AMD reference card is about 70W more than the Nvidia Founders 4080, but at lower loads the 4080 is definitely way more efficient. It surely has to do with the chiplet design that AMD suffers; it wasn't the case with the 6000 series. My 6800 XT will draw like 100W in light titles at a high framerate.
@dangertomarketing 1 year ago
I would prefer seeing the AMD reference card vs the NVIDIA reference card, or both chips in ASUS TUF designs. The difference between reference and custom AIB cards is usually pretty big: 3 fans vs. 2 fans, custom BIOS (firmware), etc. It's not a straightforward answer, because the 4080 FE vs. AIB difference is massive as well (different clocks too).
@gerthddyn 1 year ago
Definitely not something to be ignored.
@sironath 1 year ago
Surprised you didn't show off performance when undervolting
@foxpants 1 year ago
Always exceptionally high-quality, interesting content mate, and I agree, AMD do need to fix this, but I doubt it can be done with drivers.
@Jacketz123 1 year ago
You don't think so? I reckon some slight tuning of the power management via drivers could do the trick. Especially for the in-game menu screens; that's super strange.
@timvanranderaad7833 1 year ago
There already was some power tuning via drivers, to lower their idle power consumption
@pill-poppingnarcissist 1 year ago
Even if it could be fixed by drivers, AMD's drivers are still horrible as it stands. AMD IS OBJECTIVELY THE WORST! AMD fanboys need to be quiet.
@drakata27 1 year ago
@@pill-poppingnarcissist Their drivers are pretty good now, you don't know shit mate.
@Ben-Rogue 1 year ago
@@pill-poppingnarcissist "AMD's drivers are still horrible" So, what experience do you have with Radeon in the last 5 years? Because their gaming drivers now are miles beyond what they were then, and even make GeForce look incompetent
@krazed0451 1 year ago
I think you need to show the power consumption figures for the previous gen, not just mention they are closer.
@ferasamro9735 1 year ago
Quality very distinguished content as usual to be honest this is very important especially when you are building small form factor builds thanks always for the content
@Ohnothisisbad 1 year ago
WHERE DOES THE PERIOD GO???
@decorator21 1 year ago
SFF for 4080 - good luck :D
@liamness 1 year ago
Quite a big deal in terms of the carbon footprint of these cards... if you add up all the collective hours people are using these cards to play lighter titles like MOBAs and such, that's got to be a tremendous amount of energy wasted.
@ferasamro9735 1 year ago
@@Ohnothisisbad Man, I am not a native speaker, cut me some slack.
@orkhepaj 1 year ago
with 4080? okay buddy....
@mikelay5360 1 year ago
Thumbs up for your confidence; most, if not all, YouTube tech channels won't mention anything positive about NVIDIA. People need to understand that, other than prices, NVIDIA is actually superior to AMD.
@varshoee 1 year ago
This channel and Digital Foundry are both truly professional channels. Other channels ride the Intel/Nvidia hate bandwagon to earn the support of countless insane AMD fanboys.
@MirelRC 1 year ago
Everyone knows that Nvidia has some areas where they are just superior to AMD. And it gets mentioned in almost every review from the big channels: better RT, better efficiency (this gen), better productivity performance in many cases (some GPUs might end up weaker than some AMD GPUs because of the small VRAM amount) and so on.
@mikelay5360 1 year ago
@@MirelRC Of course, until AMD releases a GPU with 8GB VRAM and gets a free pass.
@MirelRC 1 year ago
@@mikelay5360 you seem to be a bit touched by that VRAM thing.
@JamesEPhilp 1 year ago
The move to TSMC 4N process, alongside the new architecture, does seem to have put NVIDIA a few steps ahead in terms of power efficiency. This was my suspicion when the 4090 results came in, and why I jumped on the 4070 for my ITX build. Good to see some more quantifiable results that confirm my suspicions. Given AMD are already on a good FAB process, they will either need to redesign some architectural features, or seriously work on drivers I think, if they are going to take on a power efficiency challenge.
@Zicrixdoesart 1 year ago
Well, going forward both are gonna be using 3nm TSMC as far as I know. So from there it'll depend on implementation, as always. RTX 5000 might have a *slight* inherent advantage due to the monolithic design though.
@kieran9882 1 year ago
I think it's 5nm in both the Nvidia and AMD cards.
@ojassarup258 1 year ago
Curious if you tried undervolting the 7900 XTX? AMD have a bit of a habit of running things on the ragged edge to achieve performance parity.
@Kaori42 1 year ago
On my XTX MBA, even the smallest undervolt makes it crash in heavy games :/ A simple 98% voltage with nothing else changed in Adrenalin.
@calisto2735 1 year ago
@@Kaori42 I have an MSI one. Running great at 1090mV/2500MHz. No power limit increase and fans limited to 50%.
@giglioflex 1 year ago
@@Kaori42 That card is defective then. All chips degrade over time and that card could very well just not work at stock voltages in 3 years' time.
@rENEGADE666JEDI 1 year ago
@@calisto2735 And that's the problem: my Nitro+ at 1092mV easily reaches above 3100MHz in games like Warzone 2.
@rENEGADE666JEDI 1 year ago
@@Kaori42 Check your RAM; just raise the voltage a bit in the BIOS and try to OC the GPU again.
@FreezyLemon 1 year ago
Coil whine has actually been a problem for a while. Built two PCs for friends with 5700 XTs, one for myself with a 6900 XT, and all three cards had fairly loud coil whine in specific games with uncapped FPS.
@Runeansfelt 1 year ago
I also struggle with insane coil whine from my ASUS TUF RTX3070 Ti
@nameisbad 1 year ago
Yeah, coil whine is normal; capping FPS is the fix, but eventually, like after almost a year, coil whine will naturally go away. My 1060 6GB had coil whine for almost a year.
@NakedTrashPanda 1 year ago
My 5700xt has had no coil whine. Maybe it's the specific brand?
@nameisbad 1 year ago
@@NakedTrashPanda no, coil whine isn't a guarantee, but there is a chance of it on EVERY single card that uses certain parts. It's a flip of the coin.
@grev. 1 year ago
I'm guessing this gets even more extreme when setting a power target on the Nvidia card. Does the 7900 XTX respond well to undervolting?
@caiolucasnovais3676 1 year ago
that's what i was thinking about...
@chriswhitaker9720 1 year ago
Yes my 7900xtx responds well to undervolting for sure.
@Drubonk 1 year ago
It does :) Also if you cap frame rate it consumes much less, running 260w avg in Diablo 4, and around 165w in 7 Days to Die. 7 Days is an old game but it shows that the card can chill in less demanding games
@overman330 1 year ago
Undervolting dropped my peak from 480ish to 430ish. My idle went from 110-120 down to around 100 (watt the f! That's high). Capping the frame rate has had the best improvement for me. Aorus XTX.
@carlosferjezus 1 year ago
@@overman330 You cap fps in the AMD software itself?
@subrezon 1 year ago
As a homelabber, I have found that AMD CPUs are the same. They are very efficient under load, but a home server spends 95% of the time idling, and idle power consumption is where AMD just gets crushed by Intel and especially ARM alternatives.
@xgamer25125 1 year ago
Given how recent Intel CPUs have E-cores, that makes perfect sense really. Just like ARM, they are good at slowing down and using very little power in idle/slow situations.
@subrezon 1 year ago
@@xgamer25125 It actually goes back a lot further than that. My current main server is 7th Gen, and the entire system draws 19-21W from the wall when idle. This modern idle power management goes back all the way to Intel 4th gen, although those systems are held back by relatively power hungry chipsets and DDR3 RAM. If you want low idle power with Ryzen - you're limited to APUs only, since they're monolithic. Chiplet Ryzen CPUs draw a lot of power when idle (>40W), since only the cores can go to sleep, the I/O die draws full power all the time.
@xgamer25125 1 year ago
@@subrezon Didn't know that, but it's pretty interesting. Might explain why their chiplet GPUs have consumption issues too.
@subrezon 1 year ago
@@xgamer25125 tbh low power modes on AMD GPUs have been bad forever. I remember reading forum posts on that topic all the way back during HD7800 days.
@Yoshi278 1 year ago
Coil whine is a manufacturer flaw, as it is completely dependent on who you buy the 7900 XTX from.
@boi9433 6 months ago
Can you retest this to see if it's fixed?
@TheGreatJafa 1 year ago
I undervolted my 6950 XT and have loved it so far. Granted, I have the Red Devil model, which is known to be pushed to the limit power-wise, but I've lost no noticeable performance while getting a good decrease in both power consumption and noise. I'm wondering how the 7900 XTX would respond to undervolting. If it's similar to the 6000 series, it would go quite well.
@Yoshi278 1 year ago
Can't confirm, but I've heard some negative things when it comes to underclocking this GPU.
@MacTavish9619 1 year ago
I heard you can UV this card so it will use 250-260W at max without losing any % of your performance.
@johnarigi6274 1 year ago
Tech Yes City did a video on this; it was around 50W less if I'm not wrong, which isn't impressive. I was hoping for much better results.
@GraveUypo 1 year ago
I changed the power limits using MorePowerTool and then undervolted my 6900 XT to use 198W at full load, and actually gained a bit of performance over its stock overclocked clocks. 6900 XT performance, 6700 XT power consumption.
@biansanity 1 year ago
@@GraveUypo What settings and type of 6900 XT card do you have? If you don't mind sharing.
@pathway4582 1 year ago
Power draw in games is one thing; power draw while idle is another very important factor where AMD dropped the ball. Having a card "idle" at 70-100 watts only because you have more than 1 monitor hooked up to it is insane. What's more insane is that this issue has been present since at least the AMD R9 300 series (my R9 380 used 90 watts at idle with 2 monitors) and it's still present to this day. There's no point in buying AMD GPUs at the moment. They are a bit cheaper (still overpriced imo, but what can you do), but they will cost you a lot more in the long run compared to NVIDIA.
@nicknorthcutt7680 7 months ago
The big difference is the process node. The TSMC 4nm node used by Ada Lovelace is far more capable and power-efficient, no matter which way you cut it.
@WTP_DAVE 1 year ago
I run an XTX and I have noticed higher power draw than I would have wanted, but overall it's been a killer card.
@Boodoo4You 1 year ago
Ahh that’s a shame. Your card is actually terrible now. Really unfortunate!
@vinylSummer 1 year ago
@@Boodoo4You 😂
@pentecost_ 1 year ago
What about the power draw of both cards at their most aggressive but stable undervolt settings? Interested to know (1) which of the two handles undervolting better, and (2) what their average fps per watt is. I run an older ASUS TUF RX 6800 XT and it is still a power-hungry card imo, but I managed to get it down to roughly the same power usage as a stock RTX 3060 Ti while still performing leagues better (in OW2, since that's where I spend 95% of my time gaming).
@renkirigasaki 1 year ago
40 series efficiency is kinda insane. Shame that Nvidia didn't do justice to the entry-level 40 series.
@Boodoo4You 1 year ago
Hey Ali, you really shouldn’t leave your Lightsaber on and up against the wall like that. You’ll never be a Jedi with carelessness like that.
@ostrava_ 1 year ago
So was this an outlier? No other outlets seemed to pick it up, and you never followed up.
@fire44it 1 year ago
I'm wondering this too. I can't find anything about this anywhere else. Did you find anything in the meantime?
@ostrava_ 1 year ago
Nope, almost all results are for idle consumption. @@fire44it
@deviouslaw 1 year ago
I'd be interested in testing this on a 6800 XT or some other RDNA 2 part, to see if it's an AMD issue or RDNA 3 specifically.
@maewemeetagain 1 year ago
It's an RDNA3 issue. My old 6950 XT pulled about 320-370W at 100% load in demanding games, particularly RDR2 and Dying Light 2.
@deviouslaw 1 year ago
@@maewemeetagain The problem is in light to mid loads. My 7900 XTX draws 355W fully loaded, which is fine enough for the performance. But he does mention at 6:00 that RDNA 2 is more comparable to Ampere. It'd be interesting to see more about those numbers. Edit: Another thought: what about non-chiplet RDNA 3 (Radeon 7600)? I bet it's just the chiplet ones, but who knows until it's tested.
@Boodoo4You 1 year ago
@@maewemeetagainMax load is irrelevant here. The issue is whether it’s drawing an efficient amount in lighter loads.
@zuckdaddy1596 1 year ago
it's an RDNA 3 thing, my 6800 doesn't have this issue
@PineyJustice 1 year ago
@@maewemeetagain Your old 6950xt was pulling 400-500w, I had one, RDNA2 doesn't report power the same way as RDNA3. This really should be common knowledge by now.
@weeooh1 1 year ago
This is the type of video HUB would never do.
@ThrashingBasskill 1 year ago
Well then let's do some math... and correct me if I'm wrong here. Let's assume I only play Overwatch 2 (the worst case for AMD here). Let's say that Nvidia always draws exactly 300 watts and AMD 500. Let's say I play 2 hours per day, 5 days a week, every week of the year (10h x 52 weeks = 520 hours). 520 hours x 300W or 500W equals 156 kWh for Nvidia and 260 kWh for AMD per year. In Germany (where electricity is rather expensive), the average kWh is at 35 ct. That's 54 bucks per year for Nvidia and 91 bucks for AMD - a difference of 37 bucks per year. With a difference of 300 bucks initially, it would take more than 8 years to reach parity (assuming prices stay the same). To be completely honest, I couldn't care less about 37 bucks per year more or less, and I actually do keep my GPUs for about 8 years...
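That arithmetic checks out; spelled out with the same assumptions the comment uses (520 hours/year, a flat 300 W vs 500 W draw, 35 ct/kWh, and a 300-buck price gap):

```python
hours_per_year = 2 * 5 * 52      # 2 h/day, 5 days/week -> 520 h
price_per_kwh = 0.35             # German average, per the comment

nvidia_cost = 0.300 * hours_per_year * price_per_kwh   # 156 kWh -> ~54.60
amd_cost    = 0.500 * hours_per_year * price_per_kwh   # 260 kWh -> ~91.00
diff = amd_cost - nvidia_cost                          # ~36.40 per year

print(f"parity on a 300 price gap after ~{300 / diff:.1f} years")  # ~8.2
```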
@ryanharding8550 1 year ago
Awesome video, would be great to see a power consumption comparison between Nvidia 4070 Ti and AMD RX 7900 XT
@GewelReal 1 year ago
The 7900 XT pulls only a few W less than the XTX, while the 4070 Ti cuts 50W or so from the 4080.
@jux9577 1 year ago
Around 280W for the 4070 Ti and low 300s for the 7900 XT. The 4070 Ti was able to undervolt to around 200-220W very easily, and the 7900 XT to around 275W. In my system with a 7700X, I was very disappointed with the 7900 XT and I would not recommend it.
@Kryptix0III 1 year ago
One of the major reasons why I decided to go with a 4070 instead of a 6950 XT: I didn't need to upgrade my 650W PSU with the 4070, and it will save me quite a bit of money in the long run in electricity bills.
@raresmacovei8382 1 year ago
The power bill argument is so ridiculous. Always was. NOBODY would notice up to a total $100 difference in power bills spread across 4 years.
@D1craigRob 1 year ago
About 3 dollars per month.
@raresmacovei8382 1 year ago
@@D1craigRob $2.08 per month. The horror.
@Kryptix0III 1 year ago
@@raresmacovei8382 6 hours per day over 4 years and it’s a cost saving of $350 or more in Germany currently… that’s a brand new Ryzen 7 7700x right there…
@konga382 1 year ago
@@raresmacovei8382 Whether you're likely to notice it or not isn't a good argument against caring. It's still costing you extra money in the end, and any decent cost analysis of high-power electronics should include their long-term running costs.
@tempacc9589 1 year ago
I'm not sure if you know this, but Nvidia is using the newest TSMC node and a monolithic die. This means the distance between their transistors is smaller and less electricity is required to cover the distance. AMD purposely chose to use the same TSMC node as their previous gen (6000 series) to save on production cost, and uses their interconnect to chain the dies, which also costs more electricity. It is not a mistake or a driver error; AMD is just using a cheaper manufacturing technique.
@markgutierez9922
@markgutierez9922 Жыл бұрын
This exactly. AMD is basically just relying on its architecture improvements; that's why there's only 20-30% better performance than the 6000 series. Think of it as a refined 6950 XT with a 40% price increase 😅
@markgutierez9922
@markgutierez9922 Жыл бұрын
@@Wobbothe3rd compare Nvidia's 3000 series to AMD's 6000 series
@dkarbaev
@dkarbaev Жыл бұрын
Conspiracy theory: the drivers are mining on background.
@Xerazal
@Xerazal Жыл бұрын
Undervolt the AMD card. AMD has historically been known for factory voltage settings that are unnecessarily high. You can regularly undervolt AMD cards with little to no performance loss (in some cases actually gaining a little performance due to less thermal throttling).
@tubaeseries5705
@tubaeseries5705 Жыл бұрын
I undervolted my 6700 massively; it uses 90 W max now vs 160 W before. Performance is maybe 9% worse, but it doesn't matter since I get 165 fps in every game I play.
@zuckdaddy1596
@zuckdaddy1596 Жыл бұрын
instructions unclear, my Rx 6800 now pulls 350w and is 20-30% faster
@Kaori42
@Kaori42 Жыл бұрын
On my 7900 XTX (MBA), even a 2% undervolt makes it unstable and hard-crash in heavy games :/
@jondonnelly4831
@jondonnelly4831 Жыл бұрын
Sure, it will use less power, but there's a reason the factory voltage is set high: stability. It's not worth messing with. Can you tell whether a driver crash came from the small undervolt or from something else? You can't know for sure. I know it's more stable at stock, but still not 100%.
@PatrickPissurno
@PatrickPissurno Жыл бұрын
That's usually the case for all AMD things, but I guess it may not apply to this gen of GPUs because of the so-called hardware flaw everyone is talking about.
@gucky4717
@gucky4717 Жыл бұрын
Did you use the latest patch which "fixed" the idle power consumption with different monitors? Even so, it shows that AMD does have some problems with power consumption in general.
@Fausto_moh
@Fausto_moh Жыл бұрын
High power draw, high heat and loud noise are what scared me away from getting a 7900 XTX.
@jacooosthuizen3593
@jacooosthuizen3593 Жыл бұрын
The performance-per-dollar argument doesn't work here in South Africa; the gap is not as big as in the US. AMD 7900 XTX: $1,192.14. Nvidia 4080: $1,298.18. Nvidia 4080 all the way.
@dharkbizkit
@dharkbizkit Жыл бұрын
That's the exact reason I didn't take the 7900 XT/X after ComputerBase's first test. They were also really bad at multi-monitor desktop idle draw, like 80 watts just to display two desktops. Power scaling is really bad here. Many people might not care, but when you pay 38 cents per kWh and game 20-40 hours a week, it becomes a running cost, and there's a second PC in the house getting the same mileage. I had a 4070 Ti here, which really impressed me with its power scaling, coming from a GTX 1080 that in some games takes 170 watts to display 60 fps; the 4070 Ti takes about 65 watts for the same 60 fps. What I don't get is my 13600K, which needs more power than my 8700f for the same locked fps. That shouldn't be the case, yet it is. The CPU is now undervolted, but Crysis Warhead, a game from 2008 that only uses 3 cores, makes the 13600K draw 40 watts (54 without UV). That's 10 watts below the full 12-thread load of the 8700f. I don't get it; the 13600K should use something like 30% less power for the same performance/fps compared to the 8700f.
@jackkelner9992
@jackkelner9992 Жыл бұрын
Love the content as always. Would love to see an updated video on custom cables and reading a pinout diagram. I'd love to make some for my SFF build, but while I'm confident I can solder and use wire crimps, I'm not so sure of my skills at reading pinout diagrams and understanding electrical diagrams in general. Always great stuff, keep up the great work.
@Timmy51m
@Timmy51m Жыл бұрын
More power consumption means more heat, and more heat means more power spent on cooling.
@cruizera2194
@cruizera2194 Жыл бұрын
Thank you so much for bringing attention back to this.
@dawn-moon
@dawn-moon Жыл бұрын
After owning a Red Devil 6950 XT and a Red Devil 7900 XTX, I switched to a Zotac RTX 4080, and indeed the difference in power draw and heat dissipation is massive. In a Lancool 3 case with six 140 mm fans and three 120 mm fans directly below the GPU, playing COD MW2 DMZ is night and day. On the Red Devil 7900 XTX, I had to undervolt 1% to get the temps below 95 Celsius. Just insane! With the 4080 in the same PC, the maximum temperature reported by iCUE is 59 degrees. The reason I swapped to the 4080 was instability in DMZ. Maybe I just had bad luck, but my PC was constantly freezing, or the game froze and took all other processes to hell and back. I can't remember how many crashes I had from Jan-April 2023, at least 2 per evening, reporting the crashes within the AMD Adrenalin drivers almost every time, always up-to-date drivers, 850 W Corsair RM power supply... It never got solved. I really wanted an all-AMD PC, but this time they pushed my frustration to new heights. The 4080 is a little slower in DMZ at 3840x1600 ultrawide resolution maxed out, but I haven't experienced a single crash since the 4080 went into this gaming PC.
@bob_smite
@bob_smite Жыл бұрын
Was the testing done on an ASUS TUF 7900 XTX or the reference 7900 XTX? There was B-roll of both. I think reference testing would be fairer, since the comparison was against a reference 4080, and AIB cards may ship with overclocks or tuning that trade extra power for performance compared to reference.
@andyg1021
@andyg1021 Жыл бұрын
Could you try an overclock + undervolt? It makes a huuuuge difference for the 7900 XTX: no performance loss, but 20% less power consumption! It's done in 30 minutes using only AMD Adrenalin. Less coil whine too, and when you limit the max fps, no coil whine at all!
@GraveUypo
@GraveUypo Жыл бұрын
that helps, but technically it doesn't close the gap at all since the 4080 can do it too
@KarlynGR
@KarlynGR Жыл бұрын
On release date it was very bizarre that AMD expected us to choose the 7900 XTX over the RTX 4080, since for $100 more you could get a more stable, AI-advanced and reliable card. Now with the price drops, where you can pick one up for $800-900, it's a much more interesting option to take into consideration.
@deviouslaw
@deviouslaw Жыл бұрын
Price difference at release was $200
@deviouslaw
@deviouslaw Жыл бұрын
@@Wobbothe3rd Idk about other regions but 7900XTX was $999 at launch and 4080 was and still is $1199. That's US prices, MBA vs Founders edition. $200
@ranarambo2529
@ranarambo2529 Жыл бұрын
“The more you buy, the more you save” 😂 I totally get it now. I have two PCs at home, one for the kids and one that's mine to game on. I'm more conscious now of how much electricity is being consumed while running both PCs. Thank you for this video 👍
@CarLostis
@CarLostis Жыл бұрын
I live in Mexico and electricity costs are not that high, but with an older system or an AMD system it shows on your electric bill, while with the new 40-series GPUs it doesn't so much. I can't imagine what it would be like in Europe, where prices are much higher.
@ranarambo2529
@ranarambo2529 Жыл бұрын
It's bloody expensive here in the UK. I know people talk about price-to-performance, and I agree AMD wins that category, but gaming every day for long or short hours, I'm more worried about price-to-power-consumption. So I'll be sticking with whoever can give me that. Right now it's NVIDIA, but if AMD can do something like this in the future, I'm all in.
@CarLostis
@CarLostis Жыл бұрын
@@ranarambo2529 For me it's also a working tool, but tbh the performance gains are so low, like not even 5 percent (on raster, not counting RT and frame gen) in some games, that in the long run the AMD card will cost much more than the Nvidia one because of the electricity costs where you live. Sad to hear prices are still high where you live, m8 :(
@IVIRnathanreilly
@IVIRnathanreilly Жыл бұрын
PSA: at Irish electricity prices, buying these cards as cheaply as possible new from Europe, you have to game for 2,955 hours before you even start saving money at a 200 W power difference.
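The same break-even can be solved for hours instead of years. A minimal sketch; the comment doesn't state the price gap or tariff, so the figures below are illustrative assumptions that happen to land near the quoted 2,955 hours:

```python
# Hours of gaming before the cheaper-to-run card repays its higher price.
# Only the 200 W delta comes from the comment; gap and tariff are assumed.

def break_even_hours(price_gap_eur: float, tariff_eur_per_kwh: float,
                     delta_watts: float) -> float:
    """Gaming hours needed for the energy savings to repay the price gap."""
    return price_gap_eur / (tariff_eur_per_kwh * delta_watts / 1000)

print(round(break_even_hours(250.0, 0.42, 200)))  # ~2976 hours
```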
@100500daniel
@100500daniel Жыл бұрын
I've had the 7900 XTX for a while now (upgraded from a 6800 XT). When playing games that aren't demanding, the card will clock higher than when it's fully utilized; it pretty much always aspires to hit its power limit. Just the way it's designed. The main way to manage it is by undervolting and setting a ~2599 MHz clock-speed limit. Performance is roughly the same, but the card will use much less than 300 W when it's not necessary. :)
@Krenisphia
@Krenisphia Жыл бұрын
Most GPUs, especially at the top end, are pushed past their optimal power-efficiency range. It's a good idea to undervolt in most cases. I did so the day I got my 4090 and couldn't be happier.
@Stalast.
@Stalast. Жыл бұрын
The fact that you pointed this problem out in your day-one review and AMD still hasn't addressed it is very disappointing. If they're unable to fix it at all, I'd go as far as to say this generation is a failure for AMD.
@iRyusa
@iRyusa Жыл бұрын
With high summer temps being the norm, that's the main reason I won't go AMD :(
@Ohnothisisbad
@Ohnothisisbad Жыл бұрын
I bet that AMD closed the performance gap BY using more power. If that's the case, there won't be a way to fix this.
@sheratedis
@sheratedis Жыл бұрын
The last AMD driver update reduces idle wattage for some monitor combinations.
@Kony_BE
@Kony_BE Жыл бұрын
Yes, but it doesn't reduce wattage during esports games. From other people's perspective, my RTX 4090 always pulls 600 W when I play games, even esports titles, but in reality it only pulls 390 W in triple-A games like Cyberpunk (3440x1440, ultra settings). The 7900 XTX pulls about the same. In Overwatch 2 it pulls only 210-220 W for 600 fps at 3440x1440 low.
@korhanaydemir8366
@korhanaydemir8366 Жыл бұрын
Yeah, but they removed it from the known-issues list, so they've likely stopped optimizing it for other combinations.
@denvera1g1
@denvera1g1 Жыл бұрын
Chiplets reduce electrical efficiency; what chiplets excel at is compute density and yields. Chiplets can let you use much, much more silicon at a lower cost and a lower clock speed, making the design both more compute-dense and more efficient. But AMD didn't choose to do this. The total die area is rather similar between the two, meaning that to offer the same performance from the same amount of silicon, AMD needs to push more power, because the traces are longer and more complex in a chiplet layout. If that central compute die were 400 mm² instead of 300 mm², with cache on-die, keeping the existing chiplets using their cache as an L4, power consumption could drop significantly, with a noticeable increase in performance, especially below 4K, where more on-die cache matters most.
@philscomputerlab
@philscomputerlab Жыл бұрын
Great findings 🙂 I don't think I've heard others talk about this. Awesome.
@gimmibox
@gimmibox Жыл бұрын
Good lord some gpu sizes are just ridiculous nowadays
@maozedowner5915
@maozedowner5915 Жыл бұрын
Thank god I upgraded in January this year: a second-hand RX 6600 that's in pretty rough but still usable and quiet shape, for under 100. I don't have to put up with this generation of GPUs!
@screwb1882
@screwb1882 Жыл бұрын
Bruh, if you're running an RX 6600, the 7900 XTX was never an option for you lol. This power-scaling issue doesn't exist on the 7600, so I'm going to conclude it's a quirk of gen-1 chiplet GPUs. It's probably the Infinity Fabric interconnect pulling all that power.
@bulutcagdas1071
@bulutcagdas1071 Жыл бұрын
You can comfortably limit the power draw down to 70W on that card and still get great performance.
@addictedtoRS
@addictedtoRS Жыл бұрын
Here in Canada, right now the 7900 XTX is going for $1,150 and the 4080 is going for $1,600. Pretty easy decision there.
@Kylethejobber
@Kylethejobber Ай бұрын
As an owner of the 7900 XTX: these issues were fixed a while ago now. The reference card won't pull any more than 350 watts max.
@PyromancerRift
@PyromancerRift Жыл бұрын
I got the TUF 7900 XTX and found out I can OC the VRAM and lower the power target to get the same performance for way less power. On the other hand, you can push a lot more power through the Nvidia card and only gain 5% fps. The situation is not as bleak as you make it look. The AMD cards are not that bad; they're just badly tuned out of the box.
@foggythought
@foggythought Жыл бұрын
The 7900 XTX's high power draw isn't limited to games or any particular activity. Having multiple monitors running at different resolutions/refresh rates was enough to cause 130 W+ at idle. Thankfully AMD's software allows a workaround via custom resolution settings, but it's an extremely annoying thing to happen in the first place.
@PowellCat745
@PowellCat745 Жыл бұрын
Didn’t the latest driver finally fix this? I don’t know for sure because I use nvidia.
@Kaori42
@Kaori42 Жыл бұрын
@@PowellCat745 It helped for single-monitor setups and fixed some multi-monitor configs, but the issue is still present; mine easily consumes 90 W with 2 monitors, 9 W with one.
@overman330
@overman330 Жыл бұрын
Same here: 110-120 W at idle. Undervolting dropped my peak from 480-ish down to 430-ish, but idle was still around 100 W.
@PowellCat745
@PowellCat745 Жыл бұрын
@@Kaori42 try messing with your refresh rate (like changing from 144 to 142)
@foggythought
@foggythought Жыл бұрын
@@overman330 Make a custom resolution and use CVT Reduced Blanking as the timing standard. That's what worked for me.
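For context on why this workaround can help, one plausible mechanism: reduced blanking shrinks the off-screen portion of each frame, which lowers the pixel clock the display engine must sustain and can let VRAM drop out of its highest clock state. A rough sketch; the h_total/v_total values are illustrative placeholders, not exact CVT timings:

```python
# Pixel clock for a raster: total pixels per frame times refresh rate.
# Totals below are illustrative, not exact CVT / CVT-RB values.

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in MHz for a given total raster size and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 2560x1440 @ 144 Hz with generous vs reduced blanking (illustrative totals)
print(f"{pixel_clock_mhz(3488, 1493, 144):.0f} MHz")  # ~750 MHz, standard-ish
print(f"{pixel_clock_mhz(2720, 1475, 144):.0f} MHz")  # ~578 MHz, reduced blanking
```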
@tomtomkowski7653
@tomtomkowski7653 Жыл бұрын
The top-tier AMD 7900 XTX only matches the "fake" 4080 on an AD103 die; something went terribly wrong with this RDNA 3 architecture. And these prices... an 80-class card for $1200 and a counterpart from AMD for $1000. This is absolute rubbish.
@stage6fan475
@stage6fan475 Жыл бұрын
Thanks, I was wondering about power comparisons; I got hints in some other comparison videos, but nothing like your thorough approach.
@WrexBF
@WrexBF Жыл бұрын
You pointed out this issue at launch and it's still not fixed. I don't think it can be fixed on the current RDNA 3 cards. The silicon is too power-hungry at stable clocks, and it was speculated on release day that AMD wouldn't release a 4090 competitor because of hardware bugs in this generation that kept it from reaching the clock speeds they'd hoped for. They went balls-to-the-wall on power consumption to hold stable clocks and stay competitive.
@klaesregis7487
@klaesregis7487 Жыл бұрын
By extension, if you look at reviews of the Steam Deck and its competitor the ASUS ROG Ally, RDNA 2 beats RDNA 3 when power-limited to e.g. 15 watts. Given free rein at 25+ watts, the ASUS ROG Ally wins with ease.