I'd also like to thank Gamers Nexus Steve, who wrote this for me, but we didn't end up adding it to the video. So please watch their review to support them if you haven't already: kzbin.info/www/bejne/qV7Pd5qYr7pmgrs

From Gamers Nexus: We currently run tests at 1080p and 1440p for CPU gaming benchmarks, though we mostly rely on 1080p results for comparison. Although we didn't bother for the 9800X3D review, we typically publish 1-3 1440p charts in games that are still mostly CPU-bound for perspective. There are a lot of ways to approach reviews. We view bottleneck testing as a separate content piece or follow-up, as it also starts getting into the territory of functionally producing a GPU benchmark. What matters is a consistent philosophy: ours is to isolate components as much as possible, then, as standalone or separate feature pieces, run 'combined' tests that mix variables in ways we wouldn't for a standardized review. For us, reviews are standardized, meaning all parts (more or less) follow the same test practices. Introducing more judgment calls introduces more room for inconsistency in human decision making, so we try to avoid these wherever possible to keep comparisons fair. We choose those practices to ensure we can show the biggest differences between components under reasonably likely workloads.

A few things to remember with benchmarks that are borderline GPU-bound:

- You can no longer fully isolate how much of the performance behavior is due to the CPU, which can obscure or completely hide issues. These include: poor frametime pacing, inconsistent frametime delivery, in-game simulation time error due to a low-end CPU dropping animation consistency despite good frame pacing, and the overall quality of the experience. This is not only because it becomes more difficult to isolate whether issues such as micro stutters are caused by the CPU or GPU, but also because the limitation may completely sidestep major issues with a CPU.
One example would be Total War: Warhammer 3, which has a known and specific scheduling issue on high thread count Intel CPUs in particular. This issue can be hidden or minimized by a heavy GPU bind, so 4K/Ultra testing could mean we miss a major problem that would directly impact user experience.

- Drawing upon this: We don't test for the experience in only that game; we use it as a representative of potentially dozens of games that could share that behavior. In the same example, we want that indicator of performance for these reasons: (1) If a user actually does play in a CPU bind in that game, they need to know that even high-end parts can perform poorly when CPU-bound; (2) if, in the future, a new GPU launches that shifts the bind back to the CPU, which is likely, we need to be aware of that in the original review so that consumers can plan their build 2-3 years into the future and not feel burned by a purchase; (3) if the game may represent behavior in other games, it is important to surface that behavior to begin the conversation and search for more or deeper problems. It's not possible to test every single game -- although HUB certainly tries -- so using fully CPU-bound results as an analog for a wider subset of games means we know what to investigate, whereas a GPU bind may totally hide that (or may surface GPU issues, which are then erroneously attributed to the CPU).

One thing to remember about modern 1080p testing is that it also represents some situations where DLSS, FSR, or XeSS is used at "4K" (upscaled). A great example of all of this is to look at common parts from 4-5 years ago, then see how they have diverged with time. If we had been GPU-bound, we'd never have known what that divergence might be.

Finally: One of the major challenges with GPU-bound benchmarks in a CPU review is that the more variable ceiling caused by intermittent GPU 'overload' means CPU results will rarely stack up in the hierarchy most people expect.
This requires additional explanation to ensure responsible use of the data, as it wouldn't be odd to see a "better" CPU (by hierarchy) below a "worse" one if both are externally bound. We still think high resolution testing is useful for separate deep dives, GPU bottleneck pieces, or GPU review content.
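The pacing issues GN describes are easy to quantify from raw frametime logs. A minimal sketch below uses one common definition of the "1% low" metric (average FPS over the slowest 1% of frames); reviewers' exact methods differ, so treat this as illustrative:

```python
def fps_metrics(frametimes_ms):
    """Return (average FPS, 1% low FPS) from a list of frametimes in ms.

    '1% low' here is the average FPS over the slowest 1% of frames --
    one common definition; tools like CapFrameX/PresentMon-based
    workflows may compute it slightly differently.
    """
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)            # slowest 1% of frames
    low_1pct_fps = 1000.0 * n / sum(worst[:n])
    return avg_fps, low_1pct_fps

# Hypothetical runs: a steady 10 ms cadence vs. the same total time
# with one large stutter spike mixed in.
steady  = [10.0] * 100
stutter = [8.0] * 99 + [208.0]               # same 1000 ms total
print(fps_metrics(steady))                   # (100.0, 100.0)
print(fps_metrics(stutter))                  # average still 100 FPS, 1% low ~4.8 FPS
```

This is exactly why identical average-FPS bars can hide a much worse experience: both runs average 100 FPS, but the second stutters badly.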
@krazyfrog · 2 months ago
I don't know why this even needs to be explained in 2024
@SomeoneThatDoesntCare · 2 months ago
I think most people "should" understand that 1080p benchmarking allows you to compare CPUs. But people also need a way to understand the best way to spend their money, which is why a quick average-PC benchmark at the end (say an RTX 4060 at 1440p-ish) would be useful. Because the reality is that for most people, especially if you need a new mobo/RAM etc. as well, you're almost always going to spend your cash on a GPU upgrade rather than a CPU.
@jouniosmala9921 · 2 months ago
But I complained about not using 720p. :D
@Y0Uanonymous · 2 months ago
To the contrary, you should do 1080p testing with lower graphical settings and some upscaling. That's where most people end up before buying a new PC.
@A2409S · 2 months ago
I've ordered a 9800X3D; I game at 4K with a 3090, so I would most certainly be GPU-bound in the benchmarks above. However, I don't play FPS games, and I've found the biggest benefit of the X3D on my current 5800X3D to be late-game performance. Many games with really high unit counts or lots of different things going on late game slow down significantly, sometimes to the point of becoming frustrating to play. This is where I've found the X3D to be a huge improvement. I would love to see this late-game performance tested somehow, if only to confirm my theory or prove me wrong.
@FatetalityXI · 2 months ago
Surely the 5090 is going to show how much more headroom the 9800X3D has over other CPUs, even in 4K.
@Hardwareunboxed · 2 months ago
Yep it will, and I'd imagine you'd keep the 9800X3D for another generation as well.
@Shini1171 · 2 months ago
@@Hardwareunboxed Isn't it the same case with the 7800X3D? Is it worth upgrading from it?
@darreno1450 · 2 months ago
I was lucky to pick up a 9800X3D, but it's just a steppingstone to the 9950X3D.
@aapee565 · 2 months ago
@@Shini1171 I would say there are better uses for money than a 10% increase in CPU performance. Such as wiping your ass with dollar bills etc. :D
@thewhiteknight9923 · 2 months ago
@@Shini1171 No. The 9000 series might be the last new generation on AM5. 10000 might happen, but the chances seem small imo. The 7800X3D is plenty good for the next few years.
@HoodHussler · 2 months ago
Where r my 360p and 720p benchmarks at?
@YothaIndi · 2 months ago
140p or bust
@HoodHussler · 2 months ago
@ absolutely! Some of us still use CRTs, so where's our invitation to the party?
@themarketgardener · 2 months ago
I'd honestly take a 720p benchmark for CPU testing ngl 😂
@kingplunger1 · 2 months ago
@@YothaIndi not 144p?
@YothaIndi · 2 months ago
@@kingplunger1 Nah, that extra 4p will reduce the score dramatically 🤣
@SgtRock4445 · 2 months ago
I watched this video in 480p so that I get the most Steves per Second.
@YothaIndi · 2 months ago
If you watch HU & GN at 144p simultaneously you obtain the highest SPS possible
@DrMicoco · 2 months ago
Watch in 72p and UserBenchmark will just pop out of your screen XD
@kosmosyche · 2 months ago
I find Steve to be very CPU-bottlenecked.
@nipa5961 · 2 months ago
@@YothaIndi That's cheating!
@BladeCrew · 2 months ago
I am watching Steve in 1440p 60fps on a 1080p 144Hz display.
@BigPeter93 · 2 months ago
1080p IS and SHOULD be the standard for CPU testing. HOWEVER, I really enjoy the 4K benchmarks as well. You said it yourself: none of the mainline tech reviewers are showing 4K benchmarks. This forces the part of your audience who want 4K CPU benchmarks to go to less-than-reliable sources. Wouldn't it benefit everyone involved to offer it on a slide during your reviews?
@thorbear · 2 months ago
You have been given misleading information. It is not true that none of the mainline tech reviewers are showing 4K benchmarks, and that's not what he said, although it is clearly what he wanted to say. LTT tested at 4K, and 1440p, and 1080p Ultra, probably the most comprehensive review available.
@CallMeRabbitzUSVI · 2 months ago
@@thorbear Yep, LTT is the only review of the 9800X3D that made sense
@7xPlayer · 2 months ago
"4K CPU benchmark review" This whole video is about how such a thing is either nonexistent, because of the GPU bottleneck in a GPU-bottlenecked scenario, or useless: if the game is not GPU-bottlenecked, you're on a wild goose chase. Is it the CPU? If so, the 1080p test would show that anyway. Or is it the memory? Or the game itself? Etc. The variables are mixed together, and now you've mixed the GPU variable in, which is a big variable that hides the other variables, so no meaningful conclusion can be drawn because the details are unclear.
@vitas75 · a month ago
@7xPlayer So you're saying I should just buy a Ryzen 2600X for my 4090 4K rig?
@morpheus_9 · a month ago
@@vitas75 Obviously not. You can easily run a 5600X though.
@keithduthie · 2 months ago
Thanks Steve. I have two hobbies - bitching _and_ moaning.
@worthywizard · 2 months ago
You should try whining, it's often overlooked
@__-fi6xg · 2 months ago
I feel like crying like a little girl is the hardest one to master
@thepadonthepondbythescum · 2 months ago
Don't forget being a victim of Corporate Marketing!
@marktackman2886 · 2 months ago
Steve is waiting for these people like a bald eagle perched on a branch, waiting to strike. I would know... I've happily been a victim
@n8spL8 · 2 months ago
I go for ranting and raving personally
@trackgg586 · 2 months ago
1. This shows what kind of beast the 5800X3D was, and still is. 2. It proves your point, obviously. 3. It may be anecdotal, but I moved from a 3600X to a 5800X3D on a 3080, and while my FPS in CP77 was not significantly affected at 1440p UW, the 1% lows improved by roughly 50%, greatly increasing stability and getting rid of any stutters. That's also a thing to consider, outside of raw FPS performance.
@christophermullins7163 · 2 months ago
This is especially relevant for the 40 series and ray tracing. It increases CPU load and creates MUCH worse percentile lows. Faster CPUs always help. Basically almost always. Some games will see no difference, but that is rare.
@renereiche · 2 months ago
I don't think your third point is anecdotal; spikes are just far more pronounced when CPU-limited, it's widely known. And beyond that, I think Steve disproved his own point in this video with many of the included 4K DLSS gaming tests, where there was a significant difference. Imagine upgrading from a 3080 to a 4090 to get 50% higher framerates, which costs you $1600, while not upgrading the CPU for $480 to get an additional 30% performance in quite a few titles at 4K DLSS (~1440p native). That doesn't make sense financially, and with only 1080p tests and being told that there is no uplift at 4K (native), people wouldn't know that they are leaving so much on the table by not upgrading their 4-year-old non-X3D CPUs.
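The upgrade-value comparison in the comment above can be put in rough dollars-per-percent terms. The figures are the commenter's illustrative numbers, not measured data:

```python
# Commenter's figures (illustrative assumptions, not benchmarks):
gpu_cost, gpu_uplift = 1600, 50   # 3080 -> 4090: ~$1600 for ~+50% at 4K
cpu_cost, cpu_uplift = 480, 30    # old CPU -> 9800X3D: ~$480 for ~+30% in some 4K DLSS titles

# Dollars spent per percentage point of uplift (lower is better value):
print(gpu_cost / gpu_uplift)      # 32.0
print(cpu_cost / cpu_uplift)      # 16.0
```

On those numbers the CPU upgrade delivers its uplift at half the cost per percent, which is the commenter's point; the ratio obviously shifts game by game.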
@scottbutler5 · 2 months ago
Jeff at Craft Computing tested higher resolutions for his 9800X3D review, and found the same thing - average FPS was pretty close to other CPUs at higher resolutions, but the 1% lows were significantly higher.
@RobBCactive · 2 months ago
@@trackgg586 Once again this video showed the value of the 5800X3D bought near its floor price in extending the useful life of the extremely solid AM4 Zen 3 platform.
@lczp · 2 months ago
Hello sir, I'm thinking of a similar upgrade, from the R5 3600 to the 5700X3D. Outside of CP77, would you say the upgrade was worth it? Did you gain more avg FPS overall, disregarding the more stable 1% lows?
@bool2max · 2 months ago
Why test with upscaling at 4K when we know that the internal resolution is closer to 1080p, for which we already know the results, i.e. that the newer CPU is faster?
@IK47-d2l · a month ago
Yeah, I don't understand the point of choosing 4K balanced upscaling
@kyre4189 · a month ago
@@IK47-d2l Because Hardware Unboxed once again failed as a review channel, and this video is just a cop-out to shift the blame to the viewers. If your review needs a separate video because it confuses the viewers, and the separate video once again confuses the viewers and creates more questions, it's bad.
@IK47-d2l · a month ago
@@kyre4189 Yeah, very disappointing from Steve at the beginning to speak to viewers as if they're small kids who don't get it.
@lour3548 · a month ago
With all of the useless, upfront tech-splainin', you'd think he could figure out how to test high-res scenarios. I suspect he just wanted to point at "high-res" graphs that looked similar to his 1080p graphs.
@CosmicApe · a month ago
Because if they did this test at actual native 4K it would show that CPUs matter very little for 4K gaming. There'd be next to no difference at 4K between the 5800/7800/9800X3D. But do it with 4K upscaling and you render at 1080p and keep it CPU-bound, so you can magnify false differences.
@RafitoOoO · 2 months ago
He's not standing, so this might be good.
@Hardwareunboxed · 2 months ago
It's neither good nor bad :D
@xRaptorScreamx · 2 months ago
@@Hardwareunboxed Then you should've leaned on the desk :D
@1Grainer1 · 2 months ago
@@xRaptorScreamx True Neutral Steve, floating in the middle of the screen
@GalloPhilips · 2 months ago
@@Hardwareunboxed 😂
@ReSpAwNkILeR · 2 months ago
@@Hardwareunboxed It's the in-between version
@orangejuche · 2 months ago
I'm glad that Steve is getting some rest for his poor feet.
@michahojwa8132 · 2 months ago
You know where this is going - do tests with OC/UV, med-high details, FSR Performance, XeSS Performance and DLSS Performance plus path tracing - because that's realistic 4K, and the 4090 is a 1080p card.
@DragonOfTheMortalKombat · 2 months ago
Thank 9800X3D
@lightofknowledge713 · 2 months ago
It's really sad that he got stabbed by the Arrow Lake 😢 to the knee
@jacobgames3412 · 2 months ago
@@lightofknowledge713 Very sad 😢
@Beardyabc88 · 2 months ago
Surprised he wasn't standing up haha
@RadialSeeker113 · 2 months ago
@Hardwareunboxed Sim games and city builders seem to benefit the most. Missing games like Anno 1800 and MSFS are vital to determining just how far a CPU can push. On avg a 9800X3D is about 50-60% faster than a 5800X3D without GPU restrictions, which is completely insane.
@rasteek · 2 months ago
THIS!
@themarketgardener · 2 months ago
Tarkov benefits from this too because of poor optimization 🫠
@Hardwareunboxed · 2 months ago
They are just other games that can be used to measure this. We saw plenty of examples in our 9800X3D review where the 9800X3D was 50-60% faster than the 5800X3D.
@NerveFlux · 2 months ago
I get about 60-90 fps in Starfield at 4K on a 5900X and 4070 Ti. I get a little less in Cities: Skylines 2 and a lot less in the first game.
@kognak6640 · 2 months ago
I would be very interested to see Cities: Skylines 2 tested properly. The CPU scaling is massive with it; 64 cores is the technical ceiling, the game can't use more. This is absolutely unique in the gaming world, nothing comes even close. However, framerate is not directly tied to CPU performance; it's the simulation side slowing down if the CPU can't handle it (when the city grows larger). First you lose the faster speed settings, then everything just moves slower and slower. I downloaded a 500k-population city and tested it on my 5800X3D. Framerates stayed the same as always, but the simulation ran at only 1/8th speed. It's like watching a video at 1/8th speed, simply not playable. Because a 100k city runs very well, I'd say 150k is the max city size for a 5800X3D. Basically, you could find out how big a city a particular CPU can handle, at least on the slowest speed setting (1x). No one has done it. Btw, if any Cities: Skylines 2 player is reading this and wondering what CPU to get, just buy one with the most cores you can afford. But because no one has made these tests, it's really difficult to say whether the AMD or Intel approach is better. The 9950X is probably the safest bet for the best CPU in the consumer space.
@matthewhodgson2008 · 2 months ago
Would've liked to see the 5800X3D in the charts; that's the chip a lot of people are waiting to upgrade from.
@zodwraith5745 · 2 months ago
This discussion will never end because both sides have a point. 2:40. This is a good point that I think gets missed a lot. When you focus on competitive gaming that's tuned to be lighter on GPUs, then CPU performance at 1080p is going to be FAR more important. But if you _don't_ play just competitive games, what people are asking for is knowing _where_ those diminishing CPU returns are BEFORE we buy new hardware. Of course I can test my own system to see where a bottleneck occurs, but how am I supposed to magically know that if I _DON'T_ own the hardware yet? Anyone that's asking for 1440p and 4K and thinks 1080p is useless is a f'ing idiot, so don't lump all of us in with them. What the reasonable ones of us that want more than just 1080p are asking for is _exactly_ what GN does in your pinned message: just throw in ONE "don't overspend on your CPU" bench to let people see _where_ a lot of games slam into a really low GPU bottleneck. Even better if you throw in a 4070 or 6750 XT, because if you hit that bottleneck with a _4090_ at 1440p? That's a minefield for anyone with a midrange GPU. And you _still_ only used 4090 testing, which completely ruins your claim that this is "real world" testing when the vast majority of people DON'T own a 4090. The ones that do already have a 9800X3D on order, so they aren't watching this anyway. We aren't stupid. We KNOW 1080p is the best option for CPU testing and expect it to be prevalent in CPU reviews. The issue is it's ONLY good for CPU-vs-CPU testing, and some of us WANT to know where those GPU bottlenecks kick in. I think Daniel Owen touched on this best: reviewers are laser-focused on testing CPUs only, but some of the viewers are looking for real-world buying advice when they're watching reviews.
We're not challenging the scientific method; we're asking for a convenience, instead of having to collect the data of GPUs ranked with only $500 CPUs at max settings and CPUs ranked with only $2000 GPUs at minimal settings, then trying to guess where a midrange GPU and midrange CPU will land at midrange settings. There are almost NO reviewers out there that give you this content, except of course Daniel Owen, who will sit for an hour and bounce through all kinds of settings and different CPUs and GPUs. But that's only helpful if he features the game you're interested in.
@mikenelson263 · 2 months ago
I really hope these channels understand this. The testing is important at base, but it does not do a very good job of translating to buying decisions. With the number of product review cycles and the number of channels involved, it would have been helpful to have built a predictive model by now. Rather than dump all these data points on the review screen, give the review presentation ample time to query the model for the best advice in real-world decision-making scenarios.
@timcox-rivers9351 · 2 months ago
I was nodding along with what you were trying to say until you got to saying that Daniel Owen, who does try with multiple settings, still doesn't provide enough information because it's not the specific game you want him to test. It sounds like no review is going to give everyone what they want, because what people really want the reviewer to do is pick their favorite game, their current CPU and their current GPU, and then show a direct comparison of the exact same game with the exact hardware they have on PCPartPicker. That's ridiculous.
@Mathster_live · 2 months ago
Both sides have a point? The side asking for irrelevant and redundant testing at higher resolutions where there's a GPU bottleneck? Even a 5600X is enough for most users; anything above that at higher resolutions, you're GPU-bottlenecked. I don't get why it's so difficult to understand: a better CPU stays relevant and scales well across multiple GPU generations. The 1080p results tell you that if there were no GPU bottleneck, you would get that FPS at ALL resolutions, so it clearly tells you how long your CPU will stay relevant through the years, if you remember what the FPS cap was without a GPU bottleneck. It's surprising how many people like you will find a middle ground between reviewers and people who are clearly misinformed. They don't have a point; they're just wrong and are asking for more tests that waste time.
@zodwraith5745 · 2 months ago
@@timcox-rivers9351 Well, obviously that's not what I'm saying, but even if it's _not_ your specific game, what other reviewer do you know that goes through as many settings, resolutions, and hardware configurations as him? Even half? I literally can't name anyone else that will show you a benchmark with a midrange GPU, midrange CPU, at 1440p with 5-6 different settings and upscaling. And that's still a LOT more info on where bottlenecks can pop up than anywhere else.
@zodwraith5745 · 2 months ago
@@Mathster_live Because it shows you _where_ the GPU bottlenecks occur. If you can only see CPUs tested with a $2000 GPU at low settings, and GPUs with a $500 CPU at max settings, WTF does that tell you about a midrange CPU and midrange GPU at midrange settings? You're left guessing between 2 completely different numbers. If we get just *_ONE_* "don't overspend" benchmark at 1440p with a GPU that ISN'T the price of a used car, then if you see a bottleneck you know, "OK, you need a stronger GPU for this CPU." If you don't see a bottleneck you know, "OK, this CPU I'm looking at right now is totally worth it!" SO much is hidden in the middle. Besides, this video has heavily skewed results to begin with, because he only showed heavily upscaled 4K "balanced", meaning really only 1254p, and he only used a 4090. He did everything to _STILL_ make the GPU not the bottleneck. How is that "real world"? Does everyone you know own a 4090? Because I only know a couple. Yeah, this video _was_ a waste of time, but only because he purposely didn't show what people were asking for. Not to mention, if you look at his pinned comment, Steve from GN *_DOES_* mix in a few 1440p slides to let you know where bottlenecks can kick in fast, and he _doesn't_ upscale the shit out of it either. So if GN does it, why doesn't benchmark Steve do it?
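The "guess where a midrange combo lands" problem this thread keeps circling can be approximated by combining the two kinds of review data with a simple first-order model. This is a sketch under a strong assumption (the system runs at whichever limit is lower); real systems overlap CPU and GPU work, so treat the result as an upper-bound estimate:

```python
def predicted_fps(cpu_limited_fps, gpu_limited_fps):
    """Rough first-order bottleneck model.

    cpu_limited_fps: FPS from a CPU review (low res, flagship GPU)
    gpu_limited_fps: FPS from a GPU review (target res, fast CPU)
    The system is assumed to run at whichever limit is lower.
    """
    return min(cpu_limited_fps, gpu_limited_fps)

# Hypothetical numbers: a midrange CPU that manages 140 FPS when
# unconstrained, paired with a midrange GPU that manages 90 FPS at 1440p:
print(predicted_fps(140, 90))    # 90 -> GPU-bound; a faster CPU won't add FPS here
print(predicted_fps(140, 200))   # 140 -> CPU-bound at lower settings/resolution
```

This is exactly why the 1080p CPU number stays useful: it is the cpu_limited_fps term you need whenever a future, faster GPU raises the other limit.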
@SAFFY7411 · 2 months ago
Thanks Steve.
@VGJunky · 2 months ago
Thanks Steve.
@Haro64 · 2 months ago
Thanks Steve.
@MyBestBuddiesForever · 2 months ago
Thanks Steve.
@lilvegee · 2 months ago
Thanks Steve.
@TheHighborn · 2 months ago
Thanks Steve
@mihaighita8553 · 2 months ago
I think one of the best use cases for the 9800X3D would be MS Flight Sim, with all the streaming/decompressing going on. Maybe you could add a comparison for that one too.
@andreiavasi7600 · 2 months ago
My 9800X3D arrived today for that exact game (MSFS) :). It's keeping my 4070 Super in a 40% bottleneck on airliners, with ground and buildings turned way down since those kill the CPU. Plus I can't use frame gen or DLSS because then I get stutters. So I can't wait to plug this baby in. Plus MSFS 2024 will optimize for multicore even more, which is great for the next GPU upgrade, so I can avoid splurging on a CPU again.
@Hussar-fm8iy · 2 months ago
The 9800X3D destroys the 285K in MS Flight Sim, even at 4K.
@knuttella · 2 months ago
Isn't 4K balanced just 2227 x 1253?
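Those figures follow directly from the usual per-axis scale factors. The ratios below are typical DLSS 2-style values and an assumption here; exact ratios can vary by game and upscaler version:

```python
# Typical per-axis render-scale factors (assumed; game/version dependent):
SCALE = {"quality": 0.667, "balanced": 0.58, "performance": 0.50}

def internal_res(width, height, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(3840, 2160, "balanced"))     # (2227, 1253) -- the figure above
print(internal_res(3840, 2160, "quality"))      # (2561, 1441)
print(internal_res(3840, 2160, "performance"))  # (1920, 1080) -- exactly 1080p
```

So "4K balanced" renders at roughly 1.6x the pixel count of 1080p, which is why it sits between 1080p and 1440p in GPU load.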
@aiziril · 2 months ago
Saying that it's not about raw performance, but more about having headroom when and where you need it (which depends on the game played), is a really smart way to explain this.
@imo098765 · 2 months ago
When you have enough CPU it's amazing. The moment you hit that CPU wall, it's lower FPS and horrible 1% lows. It's a stuttery mess.
@FurBurger151 · 2 months ago
@@aiziril Car salesman talk, if you ask me.
@costafilh0 · a month ago
No it's not. A really smart way is to do more testing so we can see where the bottleneck actually is. But that's too much work, I guess.
@giantnanomachine · 2 months ago
Thank you, this is a video I have been really hoping for from one of my trusted review channels. Seeing how CPUs hold up (or don't) over hardware generations is extremely helpful for me, since when I build my systems I usually upgrade the GPU once or twice while sticking with the same mainboard and CPU. Seeing what one given CPU or another can achieve with a 4090 (which I wouldn't buy due to price) at 4K with high quality settings today is a valuable indicator for me, since in a couple of years that will be what it can achieve with a 7060 Ti or 7070.
@msolace580 · 2 months ago
Not putting the 7800X3D on the chart is a miss... but we can assume there is no reason to upgrade to the 9800X3D lol
@codymonster7481 · 2 months ago
or an Intel
@andersjjensen · 2 months ago
The day-one review said 11% faster on average. The 40-game benchmark marathon said 8% (like AMD's marketing), so unless you happen to play one of the games where the gains were 20% all the time, AND that game is not performing adequately for you, then yeah... going from the 7800X3D to the 9800X3D is pretty much "e-wanking". Which is fine if your kids don't need new shoes or something.
@emiel255 · 2 months ago
He doesn't really need to, as he made a dedicated video comparing the 9800X3D with the 7800X3D, plus in his day-one review he compared the 9800X3D with many other CPUs.
@markh4750 · 2 months ago
I think you included the wrong poll results at 13:44. It's not surprising only about 40-45% use upscaling at 1440p, as it's obviously much easier to run native with good FPS relative to 4K.
@SchmakerSchmoo · 2 months ago
One minor anecdote from the "vocal minority" that I think may have been missed is how low-resolution benchmarks are used to justify "over-buying" the CPU in a midrange build. Someone will be doing a build for ~$800 and you'll see tons of comments like "the 7800X3D is 50% faster - you must get it!", but these comments ignore the budget of the build and the fact that opting for such a CPU means significantly downgrading the GPU. A 7800X3D + 4060 is going to be a lot worse than a 7600X + 4070S in most titles. It is misinterpreting the graphs on both sides, but only one side seemed to have gotten called out here.
@memitim171 · 2 months ago
If someone is using 1080p benchmarks to justify over-buying a CPU, they are doing it wrong, and that isn't Steve's fault. All they can do is provide the data; all data can be misused or misrepresented, and I think most of us would rather they just continued giving us the data than attempting to tackle blatant stupidity, which is always a race to the bottom.
@Coolmouse777 · 2 months ago
Just look at GPU benchmarks and compare 2 numbers, it's not hard to do.
@Skeames1214 · 2 months ago
@@Coolmouse777 The number of people who are behaving like children is a little disturbing. "Why aren't you a repository for every benchmark I could ever want???? Where is my favorite graph????"
@Stars-Mine · 2 months ago
GPUs you just swap out in 3 years.
@_Leouch · a month ago
You must understand that not everyone is playing games like Wukong or Call of Duty. Some play games like Cities: Skylines or Paradox strategy games, where a strong CPU is far more important.
@atariplayer3686 · 2 months ago
Thank you Steve for the awesome benchmark & all the hard work you have put into this video 😊👌
@kentaronagame7529 · 2 months ago
Ever since the 9800X3D dropped, Steve has been smiling a lot in his thumbnails 😂
@codymonster7481 · 2 months ago
These channels are selling out one by one. It's like they are all falling in line, echoing those pre-launch benchmarks from Nvidia/AMD to make things look better than they are. If you don't show 1440p/4K tests, then the data is totally useless and any claim of "best CPU evah" is totally fraudulent.
@DrMicoco · 2 months ago
@@codymonster7481 So you're telling me that a 7600 is as fast as a 14900K because they're pushing the same FPS at 4K?
@PC_Ringo · 2 months ago
@@codymonster7481 Tell me you didn't watch the video without telling me that you didn't watch the video. Kek.
@jompkins · 2 months ago
@@PC_Ringo hahahaha yes
@wolfgangvogel5407 · 2 months ago
This is a weird test, tbh. Why is upscaling on balanced mode? It should be on quality or off. The claim was that in a GPU limit, CPU differences are far less important. You test that, which is cool, but then you turn upscaling to balanced, which reduces the GPU limit again. Am I missing the point of this test?
@Hardwareunboxed · 2 months ago
Yeah, missing the point for sure. Maybe just skip to and watch this entire section of the video.
29:05 - CPU Longevity Test [3950X, 5800X, 5800X3D]
29:14 - Watch Dogs Legion [2022]
30:14 - Shadow of the Tomb Raider [2022]
30:44 - Horizon Zero Dawn [2022]
31:04 - 3 Game Average [2022 Titles]
32:04 - Starfield [2023]
32:43 - Warhammer 40,000 Space Marine 2 [2024]
33:04 - Star Wars Jedi Survivor [2024]
33:34 - 3 Game Average [2024 Titles]
@wolfgangvogel5407 · 2 months ago
@@Hardwareunboxed I still don't see it. I agree with the initial testing at lower resolution to run into a CPU limit; that really is the best way to do it. Some media outlets even still test at 720p, which would drive some people mad for sure. But I really don't see the point of balanced upscaling at 4K with an RTX 4090. It's like testing somewhere between 1080p and 1440p; it would give you the same results.
@Hardwareunboxed · 2 months ago
That is without question the most obvious way I can illustrate why GPU-limited CPU testing is stupid and misleading. So if that didn't work, I will have to tap out. You're welcome to get your GPU-limited CPU testing elsewhere though.
@wolfgangvogel5407 · 2 months ago
@@Hardwareunboxed I am aware of GPU and CPU limits, thanks. I am not questioning any statements either. I'm simply saying 4K upscaling performance is weird; in the first part of the video with the 4090 the whole test is pointless and misleading. You are not testing at 4K; it's not even 1440p. The second part of the video proves your point much better. I think there are simply better ways to show why you test at low resolution than what you did in half of the video.
@NGHutchin · 2 months ago
@@wolfgangvogel5407 I think he had already been hit by the more aggressive comments by the time he got to yours. It is definitely true that a 4K test without upscaling represents a greater GPU bind than one with it. I think "the better CPU ages better" is not the question being asked by the audience this video is attempting to address. I do still appreciate the point, though.
@evanractivand · 2 months ago
Got the 9800X3D and no regrets so far. I play at 4K, usually with DLSS quality or balanced to boost frames on my 4080, and it has significantly improved things. MMOs and games like Baldur's Gate 3 now don't have any CPU bottlenecks; it's made a pretty large difference to my average framerates and 1% lows in a number of CPU-heavy games I play. I came from a 12700K that I thought I wouldn't get much benefit upgrading from, but boy was I wrong. At the end of the day you need to look at the games you play and figure out how much money you want to throw at your PC to get a level of performance you're happy with.
@worldkat1393 · 2 months ago
I've been wondering if I should go from a 7700X to it for MMOs in particular.
@evanractivand · 2 months ago
@@worldkat1393 Yeah, I tested it in New World the other day. Previously on my 12700K it would drop to 50-60 FPS in towns with high population, with a fair bit of stutter. On the 9800X3D the stutter is all but gone and it averages 90-100 FPS. So for CPU-bound MMOs, it's a big diff.
@5ean5ean22 · 2 months ago
@@worldkat1393 Once you go 3D, you don't go back.
@laxnatullou · 2 months ago
Hi! What board do you use?
@BrawndoQC · 2 months ago
Lol, like 2 FPS.
@jerghal · 2 months ago
At 13:40, about the upscaling poll: Steve says 80% use upscaling. But then it must be the wrong poll (this is the 1440p poll instead of the 4K poll), because what I see on screen is 43% using upscaling (8% balanced and 35% quality mode). 39% (the biggest group) DO NOT use upscaling at 1440p. So why test with balanced if that is the smallest (8%) group, unless it's the wrong poll 😅
@CookieManCookies · 2 months ago
Such a tiny audience. How many YouTube viewers are scoping out these polls? I didn't even know there was one.
@jerghal · 2 months ago
@@CookieManCookies I happened to answer that poll 😁. Well, both of them.
@KimBoKastekniv472 ай бұрын
I fully agree that CPU testing should be done at 1080p, but I can't help but wonder why 4K balanced was chosen to prove the point. The input resolution is closer to 1080p than it is to 4K, why not quality mode?
@Skeames12142 ай бұрын
Quality mode would still be closer to 1080p than 4k, and it was the middle option between the two more popular options (Performance and Quality) in the poll. People should be able to work out that scaling will increase if you choose Performance, and decrease if you choose Quality.
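For context on the render resolutions being debated here: the internal (pre-upscale) resolution behind each preset can be estimated from the commonly cited per-axis scale factors, roughly 0.667 for Quality, 0.58 for Balanced, and 0.5 for Performance. Treat the exact factors as assumptions, since they can vary by game and upscaler version. A quick sketch:

```python
# Approximate per-axis scale factors commonly cited for DLSS-style presets.
# These are assumptions, not exact for every game or upscaler version.
SCALE = {"quality": 0.667, "balanced": 0.58, "performance": 0.5, "ultra_performance": 0.333}

def render_resolution(out_w, out_h, mode):
    """Return the internal (pre-upscale) resolution for a given output and preset."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output: Quality renders near 1440p, Balanced near 1253p -- both far closer
# to 1080p's pixel count than to native 4K's.
print(render_resolution(3840, 2160, "quality"))   # -> (2561, 1441)
print(render_resolution(3840, 2160, "balanced"))  # -> (2227, 1253)
```

This is why "4K Balanced" results mostly reflect a ~1250p CPU/GPU load rather than a native 4K one.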
@esportsfan21182 ай бұрын
Because they don't actually want to show real 4K or 1440p results, and only want to stick to 1080p. They pretty much made this video out of spite for the comments and are trying to justify not showing 1440p/4K results in their CPU reviews.
@toddsimone71822 ай бұрын
Should have done 4k native
@enmanuel19502 ай бұрын
@@esportsfan2118 There's a very clear example in this video showing that if you test at 4K (yes, even 4K balanced) with three CPUs (5800X3D, 5800X and 3950X), you get the same result for all three. Even though we know the 5800X3D is significantly faster than the other two and will achieve considerably more performance, not only at 1080p but also at 4K, if you pair it with a GPU capable of keeping up with it.
@esportsfan21182 ай бұрын
@@enmanuel1950 Yes, and it would be just like that with the 5800X3D, 7800X3D and 9800X3D if they showed NATIVE 4K results. But they don't want to show people that there would be no point upgrading if you play at 4K native...
@Thor847200Ай бұрын
I don't see why upscaling is used at all for testing. Personally, if I was looking at a product review, I would want to know how it works at native resolution, not with upscaling. I would want the raw data, not the scaled data, and the actual performance difference, not the scaled difference. This is simply because ANY upscaling is going to improve performance, unless you are, say, at 4K and scaling the image up to 6K or 8K, in which case performance would of course drop and you might have a good reason to turn scaling on.
@Dayanto2 ай бұрын
13:45 You accidentally showed the 1440p poll instead of the 4k one.
@Zemla2 ай бұрын
I haven't seen the video yet. I have an RTX 4080 playing at 4K. I switched from a 5700X to the 9800X3D and it EXTREMELY helped me. Almost no stutter where before it was unplayable, higher FPS, minimized FPS drops, etc. Worth it, money well invested!!!!! I should have done the upgrade much, much sooner, even if just to a 7800X3D.
@JohnxYiu2 ай бұрын
good to hear that! I'm using 11700k and having some stuttering issues playing RDR2 and Witcher 3, now I'm planning to upgrade to 9800x3D with the hope to have this issue fixed.
@anthcesana2 ай бұрын
Exactly why the 1080p only testing is almost useless for actual resolutions people play at. I found the same with my 5800X3D upgrade
@AndrewB232 ай бұрын
Uhh doubtful 😂
@lycanthoss2 ай бұрын
Maybe you changed something else? I'm running a 12600K + 4080 myself. I recently reinstalled Windows 11 when moving to a new NVMe so I had to take out the GPU. Somehow, and I genuinely don't know how, XMP got disabled so I was getting massive stuttering because I was running at the baseline 2133 MT/s of my DDR4 kit. I didn't understand why I was stuttering so hard until I noticed in task manager that the RAM was running at 2133 MT/s. I was going to blame Windows 11 24H2 or the new Nvidia drivers with the security fixes, because I don't think removing the GPU or replacing the first M.2 slot should disable XMP?
@Nissehult232 ай бұрын
Same. My main game World of Warcraft is doing insanely well, more than doubled my avg FPS. Not only that, but in Space Marine 2 (with the high-res DLC) FPS went from ~60 to 100-120.
@Phil-q7h2 ай бұрын
Good video, thank you. I am one of those that FULLY understands why testing of CPUs is done at low resolution, however I still want the 1440p/4K data. It's useful to me as a single-player gamer. It lets me know where we are at, where the market is at. You contradict yourself here Steve, you pointed out in more than one chart that the CPU CAN make a difference and IS relevant at the higher resolutions, especially when new GPUs are released.
@Hardwareunboxed2 ай бұрын
I did not contradict myself at all. You're somehow missing the point. The 1080p data tells you everything you need to know. You then take that data and apply it to whatever a given GPU can achieve at a given resolution/quality settings. The 1440p/4K data is misleading, and I clearly showed why here:
9:05 - CPU Longevity Test [3950X, 5800X, 5800X3D]
29:14 - Watch Dogs Legion [2022]
30:14 - Shadow of the Tomb Raider [2022]
30:44 - Horizon Zero Dawn [2022]
31:04 - 3 Game Average [2022 Titles]
32:04 - Starfield [2023]
32:43 - Warhammer 40,000 Space Marine 2 [2024]
33:04 - Star Wars Jedi Survivor [2024]
33:34 - 3 Game Average [2024 Titles]
@Phil-q7h2 ай бұрын
I know what you mean, I do honestly, but as someone who never upgrades and swaps out the PC in full every 2 years, I want to know what difference it's going to make to me today, in today's games. I appreciate the caveats that can bring; as you say, you don't know what I am playing or at what resolution. I rarely buy the high-end CPU and have always gone for the mid-range, lumping the money into the GPU. BUT, if I see a clear difference in CPU use at the higher resolutions, I want in. And I want you to tell me that, even if it's on a small sample size. I know I'm not gonna win you over and I know why, but still……
@VON_WA2 ай бұрын
@@Phil-q7h You know, you're essentially saying "I FULLY understand the reason you're doing it this way and it is the correct reason, but I still want you to do what I want to see even though it's unreasonable." Just do what Steve said: check if you're CPU bottlenecked, then upgrade. If you're still happy with the performance, then you don't need to upgrade, simple as that. You said you're a single-player gamer anyway, so most of the time it will be your GPU that's the bottleneck if you're playing at 2K/4K.
@dloc29072 ай бұрын
Great info and I completely agree. However, I think we are all wondering if it's worth an upgrade. So maybe show tests at 1440p or 4K against a bunch of older and newer CPUs.
@Crankshaft_NL2 ай бұрын
For me the underlying question with 1440p testing is what the best upgrade step is (most FPS for the money): CPU or GPU first. For sure the 1080p testing is the best, no argument there. But content like you did with the min-max 7600 + 4070 Ti vs 7800X3D + 4070 comparison is welcome from time to time, to help consumers answer their own questions about upgrade paths and where to put their X amount of money.
@frankguy68432 ай бұрын
As a 5800X owner I appreciate the comparison at the end a lot, I got the CPU right before the 5800X3D arrived and couldn't justify the purchase, been waiting for my time to switch off AM4 and the 9800X3D is the answer. Should pair fine with my 3080 until I upgrade and then see even more performance. AMD really doubling down on giving gamers what we want and I appreciate it.
@rustyshackleford41172 ай бұрын
Should be a big upgrade. I went from the 5800X to the 7800X3D last year and got decent FPS bumps even in many 4K games on ultra settings. Several CPU-heavy titles had massive uplifts, as did emulation of things like Nintendo Switch, up to a 30-40% performance increase in a few cases like Tears of the Kingdom in cities with lots of NPCs.
@Spectre11572 ай бұрын
Hey same here! If I was humming and hawing about it before, after reading this comment I am now decided. Thanks!
@cschmall942 ай бұрын
I just built a new system, coming from a 5800X, same boat as you; I bought the 5800X before the X3D variant launched. And while the 4080 Super does a lot of the heavy lifting over my 3070 before, the performance boost is incredible. Haven't done many tests, but in the one I did, Arma 3 KotH at 3440x1440, my 5800X/3070 would struggle to hold a steady 50 FPS, while the 9800X3D/4080 rarely dipped below 130 FPS. Granted, it wasn't a full server, which kills FPS most times, but still, insane boost.
@keirbourne53232 ай бұрын
Same here, gonna get a 9800x3d to replace my 5600x.
@justinthematrixАй бұрын
Same man been waiting to upgrade from My 5800x and 3080 as well
@klevzor2 ай бұрын
Been waiting for a 4k test of this processor! Thanks. Considering moving on from my 5900x now
@CookieManCookies2 ай бұрын
These 4k graphs are a work of art, good job steve! I'll be back for your 5090 reviews, hopefully in 4K with 4K benchmarks!
@Flaimbot2 ай бұрын
I think the only thing missing to drive the point home would be adding 720p (in fact, my preferred benchmark res, since even 1080p sometimes hits some GPU bottlenecks).

Regarding 13:23: by intermingling the 1080p vs 4K (DLSS) data you've partially included what I was about to request, but I think it would still be helpful as a standalone video. All the DLSS/FSR/XeSS performance requests for GPUs are just as pointless as high-res CPU benchmarks: all they show is the base-resolution data (minus a bit of overhead from those techniques). Showing this correlation between the upscaler base resolution and the corresponding native resolution could help demystify that technology for laypeople.

And while at it: frame generation, according to my understanding, only leverages the free resources of the GPU in the case of a CPU bottleneck, up to twice the native frame rate, by interweaving native and generated frames. This could also make for another nice video, showing how the FG framerate remains consistent across multiple CPUs in the same game (given my understanding is correct).

38:29 you mean "again" ;) Big updoot for that type of content 👍
@B1u35ky2 ай бұрын
Balanced upscaling is not even close to 4K though
@kjaesp2 ай бұрын
What do you mean? I love spending north of 2k on a gpu not to play 4k native, that's why I play with RT on! (jk of course)
@kennethwilson82362 ай бұрын
Yeah, thought that was an odd way to test, and which upscaling method was used? His own survey only had 8% of polled users using balanced mode
@kerkertrandov4592 ай бұрын
@@B1u35ky Nobody plays 4K without DLSS. I do agree DLSS quality would have been better to test than balanced tho
@B1u35ky2 ай бұрын
@@kerkertrandov459 I play 4k either native or dlss quality. So yeah people don’t need upscaling all the time
@crisium39452 ай бұрын
1252p with an upscaling performance penalty makes it roughly comparable to native 1440p. So he actually tested 1440p.
@Superior852 ай бұрын
If the most popular option on the poll at 13:41 was native 1440p, why not at least include that result in reviews? Could be useful for both 1440p gamers and those who use 4K Quality, since 1440p is the 4K Quality DLSS rendering resolution....
@Dark-qx8rk2 ай бұрын
1440P with upscaling would in effect give the same fps as 1080P which would make the 9800X3D a great choice.
@moritzs32072 ай бұрын
Great video, thanks a lot for the refresher. Maybe in future reviews stress the fact that 1080p performance to an extent shows the CPU limit in terms of FPS, which can be extrapolated just fine to higher resolutions IF you have the graphics card allowing this. I think a lot of people are missing that.
@leemarks11532 ай бұрын
We should upvote this to the top because so many people still don't seem to get it
@troik2 ай бұрын
Thank you for this video. Here is why I'd like to see more 4K testing and what I would actually need from it: I totally understand why low-resolution testing is done, to isolate CPU performance and keep GPU performance mostly out of it, to make CPUs comparable. My counter-argument would be that if it's not actually about the FPS but about the difference between the CPUs, then a Cinebench score or looking up the GFLOPS of the CPU would get me 90% of the way there (I admit that X3D makes this more complicated). My thinking is this: I'm gaming at 4K/120fps (at least as a target), so I'm GPU bound most of the time. With my aging 9900K I sometimes reach 50-60% CPU utilization in some games. That suggests I might still have headroom, but 100% would mean perfect utilization of all 16 threads, and no game scales that perfectly across all threads, so I might be CPU bound on a frame here and there without realizing it. Switching to a new CPU won't double my performance; I might gain maybe 10-20% depending on the hardware choices I make (board, RAM etc.). So most likely I'd just see my CPU usage go down to 20-30% in game. Now I come to my question: is it possible, due to how modern CPUs work, that at 30% utilization a CPU behaves differently than at 100%? If not everything scales perfectly, could one CPU be more efficient than another at 30% utilization, in a way that differs from the picture at 100%?
@arrken332 ай бұрын
Yeah, but adding 1440p and 4K graphs gives us a reference for how GPU-limited you can get with the RTX 4090 in this case. Maybe add a segment just saying whether the CPU results get capped by the GPU at 1440p and 4K.
@Marknewell27310 күн бұрын
So all that long talk (that I mostly skipped through) about how people want to see 1440p and 4K results, and we still don't get to see real 1440p and 4K results. Nice
@OfficialEthern1ty2 ай бұрын
As a long time 4k user, I would have liked to see 4k native. Was it not done due to the testing methodology? Length of testing? Would testing with upscaling be closer or equivalent to 1440p testing?
@zacharyspencer22852 ай бұрын
My guess would be that most people use either quality mode DLSS or balanced mode for the extra performance. If you have a smaller screen with DLSS such as 32 inches I've noticed even performance mode works well with DLSS and is hard to tell the difference in a lot of games. I notice that the bigger the screen the easier it is to notice DLSS messing with the image and isn't as good as native but I think that most people use DLSS quality or balanced with their graphics cards or FSR quality if they have AMD.
@DB-cu3xu2 ай бұрын
I'm sure this was to emphasize CPU performance at 4k rather than focus on a more GPU limited scenario.
@vampe7772 ай бұрын
They have done it with DLSS because most people voted for it.
@floriang28012 ай бұрын
@@DB-cu3xu Which kind of defeats the whole point of the video…
@anitaremenarova66622 ай бұрын
You can't play modern games at 4K native, not unless you want a console experience (30FPS)
@TheMasterEast2 ай бұрын
Hey Steve, thank you for doing this. I know why CPUs are tested the way they are, but I highly value your work here on 4K testing. Keep up the good work.
@joker9272 ай бұрын
1% lows. This is what i am interested in as a 4k gamer with a 4090. Will a new CPU help make games smoother? Will this CPU make frame times more consistent? This is valuable info for a halo product buyer like me.
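As a side note for anyone weighing these numbers: "1% lows" are usually computed from captured frametimes. A minimal sketch, assuming the common "average FPS over the slowest 1% of frames" definition (reviewers' exact methods vary; some report the 99th-percentile frametime instead):

```python
def one_percent_low(frametimes_ms):
    """Average FPS over the slowest 1% of frames (one common definition;
    exact methodology differs between reviewers)."""
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                # worst 1%, at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at 10 ms plus one 50 ms hitch: the average FPS stays high,
# but the 1% low exposes the stutter.
frames = [10.0] * 99 + [50.0]
avg_fps = 1000.0 / (sum(frames) / len(frames))
print(round(avg_fps), round(one_percent_low(frames)))  # -> 96 20
```

This is why 1% lows are the metric that captures the "smoothness" question this comment is asking about, in a way the average cannot.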
@santeenl2 ай бұрын
It's a lot about future proofing. I always bought a system for a longer time and would upgrade the GPU. You could buy a 9800X3D and just keep it for like 6-8 years and upgrade your GPU over the years.
@robertopadrinopelado2 ай бұрын
Probablemente sea posible llegar a esa cantidad de tiempo adquiriendo una placa base con la mejor conectividad que hoy esté disponible. Tanto para GPU , para los M.2 y los puertos USB 4.
@TheGreatBenjie2 ай бұрын
There is not a CPU review in existence that can accurately say how "future-proof" a cpu will be, and that metric should be largely disregarded.
@Flaimbot2 ай бұрын
Not just future proofing: if you lock your FPS, you also enjoy a much more stable framerate, due to MUCH higher 1% lows compared to weaker CPUs that would seem to fit the bill going just by their avg FPS.
@santeenl2 ай бұрын
@@robertopadrinopelado Maybe respond in English, thanks.
@santeenl2 ай бұрын
@@TheGreatBenjie It can accurately say it's WAY faster than a 9700X, for example. Even if you don't notice a lot of difference today at 1440p, you might in, let's say, 3 years.
@FastDistance12 ай бұрын
Great info, but just to give you an example of a user who wants 4K data included in the CPU benchmarks, even after watching the whole video and understanding all your points: I have a 7600 and a 4090 (for example). I am thinking about upgrading my CPU after the new part is released, and I primarily game at 4K (playing casual games). After this video, I'll go buy a new CPU because the 9800X3D is 17% faster on average than the 7700X. If that margin were under 10%, I wouldn't upgrade, and the 1080p data is not helping with my decision.
@mgk878Ай бұрын
If you're still asking this then maybe Steve has not explained it well enough. CPU performance is nearly independent of resolution. There's little or no extra work for 4K, because that work is done mainly by the GPU. So a "% uplift at 4K" is just the same number as 1080p, except that no GPU exists right now that can demonstrate that. Including those numbers in a CPU review would only help people with a 4090 and most people don't have one. Meanwhile the "CPU bound" numbers are for everyone. Most recent GPUs can achieve similar numbers just by turning down resolution and quality settings, until you get a CPU bind. The question of "will I get 17% more FPS" depends on your GPU and games, so the answer is really in the GPU benchmarks. I'd guess that most people asking this play at 4K/high settings and so are usually GPU bound, so the uplift would be about 0%. If I were you I'd save my money.
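The extrapolation described in the comment above can be written as a toy model: the delivered frame rate is roughly the minimum of the CPU's throughput cap (resolution-independent, which is what 1080p testing measures) and the GPU's cap at your chosen resolution and settings. The numbers below are invented purely for illustration:

```python
def delivered_fps(cpu_cap, gpu_cap):
    """Frame rate is bounded by whichever component saturates first."""
    return min(cpu_cap, gpu_cap)

# Hypothetical caps: a CPU upgrade lifts the CPU ceiling from 120 to 140 FPS (~17%).
old_cpu, new_cpu = 120, 140

# At 4K/ultra a mid-range GPU caps out at 80 FPS: the CPU uplift is invisible.
assert delivered_fps(new_cpu, 80) == delivered_fps(old_cpu, 80) == 80

# At 1080p (or 4K with aggressive upscaling) the GPU cap rises to 200 FPS,
# and the full CPU-bound difference shows through.
print(delivered_fps(old_cpu, 200), delivered_fps(new_cpu, 200))  # -> 120 140
```

Under this model, a GPU-bound "4K chart" mostly measures `gpu_cap`, which is why it flattens every CPU to the same number.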
@joeschmoe-w8z2 ай бұрын
The tests you did, though, show there is a benefit to 4K testing in games like Assetto Corsa Competizione and Remnant 2. As you said earlier, although 1080p is a resolution from a long time ago, when Nvidia cards excelled at ultra settings, it is mainly used to show the differences between parts and not as practical advice for how the game should be played. 4K benchmarks, on the other hand, could conceivably be how a user who owns a 4090 would actually use their card. That being said, there are a wide variety of hardware configurations; the point is that by putting a load on the processor you come closer to an actual scenario gamers may find themselves in.

The thing is, Remnant 2 and Assetto Corsa Competizione actually excel on the Intel Core Ultra 9 285K. I think the reason is that demanding settings use the memory bandwidth of the CPU cache and potentially the RAM, which exerts pressure on the processor's memory bus. Pushing out frames to decrease latency between the GPU and the CPU is a function of a CPU's ability to calculate quickly, and although the 285K's P-cores are slower (there's actually a P-core regression between 14th gen and Core Ultra 200S, or in reality, due to Intel's terrible naming scheme, Core Ultra 1.5 gen), its many cores give it better memory bandwidth than older processors. Essentially this leads to benchmark scenarios where its single-core performance tells a different story than its actual gaming performance under load. Although it's true the 9800X3D is always the best CPU here, the 7700X isn't always better than the 285K. I suspect that in games that put more of a load across all three processors, the 7700X would easily be the worst of the three, though admittedly that could simply be because newer games become designed around the performance characteristics of the 285K.

I wouldn't say any of these processors is a deal breaker for the performance it offers, although it may be for its cost per frame. Even a 3317U, if you could load up Windows 10 on it overclocked, would do an adequate job with an RTX 4090.
@TeodorMechev2 ай бұрын
I am pretty much aware of why reviews of CPUs are done at 1080p and I appreciate your hard work and rigorous testing, but in this video I would have liked to see some comparison between the 7800X3D and 9800X3D in 1440p gaming and how viable an upgrade would be. Maybe in an upcoming one?!
@Your_Paramour2 ай бұрын
If you watched the video and are asking this question, you haven't understood the video. Testing at lower resolutions tells you what the maximum throughput is of your cpu, as in the vast majority of cases, cpu performance is independent of rendering resolution. Any higher resolution that does not have the same performance means it is now the gpu side that is the performance limiter. You cannot expect a reviewer to give you a complete performance profile of every game as that is totally unreasonable, and not their job.
@discrep2 ай бұрын
He literally explained why higher resolutions provide LESS accurate information: even the best GPU on the market will hit its limit of how many FPS it can render before the CPU does. This is a CPU review, which compares the differences between CPUs, and those cannot be determined if all of the benchmark results are the same because the GPU is the bottleneck. At lower resolution, when the GPU is not close to its limit, the CPUs determine the max FPS and you can more clearly see the true difference between them. Moreover, the actual FPS numbers are irrelevant because everyone has different setups. You won't replicate their numbers unless the rest of your hardware and all of your software settings are identical to theirs. What you want to know is the *relative* performance gain.
@donnys92592 ай бұрын
Hats off to you Steve for pushing out this video pretty fast after user comments from your previous video. It was good to get a refresher. Very useful. Thanks. 🙏
@jacobb-c79462 ай бұрын
Thank you for making this. I got into an argument with a guy over this on your benchmarks for the 9800X3D. He said it's 100% worth upgrading to even if you have a 7800X3D, because it's 20% faster and you're stupid not to, but to me that simply isn't the case. It really heavily depends on what you are playing, the resolution you are at, and how many generations you can keep using the CPU, since a faster part will be less likely to bottleneck later GPUs. It's weird to me that people think anything other than this CPU is obsolete, when someone with a 7800X3D will probably only need to upgrade their CPU a GPU generation or so earlier than people using the 9800X3D, which for people who bought it when it came out is completely logical. And lastly, who really needs an extra 15-30 FPS when you are already at 144 FPS?
@memitim1712 ай бұрын
My 7800X3D replaced a 4690K😆, I upgrade the CPU when I can't play a game I want to play, and not before.
@5etz3r2 ай бұрын
Good video, I appreciate the one-off unique testing just to show examples and prove points about variables etc
@all.day.day-dreamerАй бұрын
Why didn't this guy show 14th/13th/12th gen at 2K and 4K? How are his viewers supposed to come away feeling educated, so they can make very important buying/upgrading decisions? 12th gen at 4K is very, very close to these numbers. Not my channel, not my rules, I get it. But I think if he started giving his viewers the full picture, they would place a lot more value and trust in this channel, which translates into more views and subs. Thanks for the video.
@Hjominbonrun2 ай бұрын
wait, He is not lounging on his couch. I don't know what to make of this.
@YothaIndi2 ай бұрын
It's ok, he's still sitting 👍
@christophermullins71632 ай бұрын
He has work to do so don't stress it too much.
@adamchambers13932 ай бұрын
My view is that I am gaming at 4K native now and am waiting to purchase the LG 45" 5120x2160, and I just want to see whether it's worth upgrading from the 5950X when I purchase an RTX 5090 next year, or whether I can save the £1k on CPU, mobo, RAM etc. because the average increase remains at only 5%, and so wait for the Zen 6 10800X3D or whatever it will be. I can't see that data without some basic 4K numbers. If I looked at the 1080p results without the 4K shown, am I going to have to assume I would get the same uplift, with no data telling me there isn't that uplift? (Yes, I'm aware an educated guess means it won't be exactly those numbers, but an uplift % still helps decide, because at some point a GPU will allow that overhead.)
@biscuitthebodyАй бұрын
Great explanation of 1080p testing!!! The last comparison, with some years-old CPUs running then vs now, really cinches the idea that low-res CPU testing is synthetic, showing how they will likely perform in next-gen games. FANTASTIC!
@LA-MJ2 ай бұрын
You can doublespeak all you want, but you will not change the opinion that 1440p gaming with a cheaper-than-a-car GPU is what matters to many people. GPU-bound is not always GPU-bound either: look at the 1% low frametimes to evaluate the experience.
@Tostie19872 ай бұрын
Steve, do you still not understand that people also just want to see benchmarks that mostly mimic their own system, and that (at least for me) is why I would like to see 1440p benchmarks? This way I have some sort of picture of what I can expect if I buy this CPU.
@samgragas84672 ай бұрын
You can expect to get more FPS if you are CPU limited at the framerate you would like to play. That is all you need to know.
@RoboMagician2 ай бұрын
what about the performance of 4k without upscaling?
@LakerTriangle2 ай бұрын
I've noticed in all the 9800X3D reviews that it has larger 1% low swings. For example, in the first benchmark it's 28 FPS vs 20 and 22 FPS for the other two chips.
@kr00tmanmining2 ай бұрын
To be fair, I totally understand where you are coming from, but as an avid 4K gamer (at least when I'm gaming on my personal rig), understanding what kind of performance uplift I'd get from upgrading from my 7800X3D to the 9800X3D at 4K, even if it's only a few FPS, is helpful.
@cl4ster172 ай бұрын
Doesn't seem like you understood. All you need to know is how much faster one CPU is over another. Then extrapolate that knowledge to your system by simply checking if you're GPU-limited or not. Resolution is irrelevant to the CPU. Either it's fast enough for your use case or it isn't.
@kr00tmanmining2 ай бұрын
@cl4ster17 that's a good way of looking at it but I don't think a lot of people fully understand that. My testing basically confirms that but that doesn't mean people like to see if there's a worthwhile improvement or not.
@mehedianik2 ай бұрын
@@cl4ster17 Your example works when you are comparing a CPU with your own CPU. That way, you have your own data to compare with to decide if it's a worthy upgrade or not. But when you are deciding between multiple CPUs, the resolution also matters. For example, if I want to decide between the 9700x and 9800x3d, a lower resolution will give me an idea regarding the actual difference between the performance of both CPUs. At higher resolutions, say 1440p, one might become CPU bottlenecked while the other doesn't. The performance gap will be closer than the lower resolution result. But how much closer? That’s what people want to know. Also, when comparing high-end CPUs, high-end GPUs make sense. But when you are comparing middle class CPUs, people mostly pair them with mid-end GPUs like the 4070 or 7800xt. Their GPUs become bottlenecks much earlier. If they lack understanding and only rely on the review data, they might think upgrading CPUs will give them similar performance uplift. They upgrade their CPUs and get the exact same performance they had earlier. That's why real-world data is also necessary, in my opinion, to assess different scenarios. This should be a part of the reviews. I understand the amount of work it takes, and I greatly appreciate reviewers' efforts to present us the data. It won't be possible for them to put in the day-one reviews, but eventually, they should include this data as well. Not because it's the right method of testing performance, but rather to give non-techy people a better understanding so they don't waste their hard-earned money for no to barely any performance gain.
@cl4ster172 ай бұрын
@@mehedianik People don't always run the highest settings and at this point the supposed real-world data goes out the window because GPU-limited "real-world" tests don't show how much headroom if at all the CPU has which could be turned into more FPS by reducing settings.
@joaovictorbotelhoАй бұрын
Exactly. The biggest part of the audience wants an idea of how the hardware would perform in their own situation. Beyond that, most users replace their systems on about a 3-year cycle (not me, I've got a 12-year-old one; any change would be a benefit). So it doesn't really make that much sense to prove some performance delta in a scenario that isn't even a practical one. Certainly the audience appreciates the effort of making this kind of video, but I guess there's no need to bash the voices calling for 1440p, since it's the actual standard for everyone except pros and fast-paced shooter players.
@Strix-gp9xg2 ай бұрын
Why 4K with balanced upscaling? Why not test at native 4K?
@vasheroo2 ай бұрын
The Stellaris uplift from the 7800x3d to 9800x3d has got me thinking about the upgrade. You can never have enough performance to fight the true late game crisis 😂
@gabrieli6008Ай бұрын
Another interesting thing to note, for people that use upscaling to reach higher framerates, CPU performance becomes increasingly more important as your base resolution lowers. Even with a 5950X, my 7900 XT gets barely any additional FPS going from the balanced to ultra performance scaling options in both XeSS and FSR.
@definitelyfunatparties2 ай бұрын
6:54, dude... high-end GPU 4K gamers such as myself are happy with 60 FPS, AS LONG as we are GPU limited; that's why we spend over 1k on our GPUs. All we want to know is whether there's actually a point to getting a 7800X3D over a 7700X, or if we're wasting money that could go towards our next GPU. Edit: 18:52 the argument DOESN'T fall apart if you don't use upscaling. I play Hogwarts at native resolution. Honestly, people asking for these results are gamers who want the best graphical fidelity. I never understood FPS chasing, and I never will. Some people will buy a 4090 and not turn on RT/PT because it lowers FPS... well, then just get a damn 7900XT, you'll be looking at the exact same thing. Edit: 20:13 ANOTHER failed test. You're testing ACC against AI, which probably 1% of players are going to be racing against. I get much higher FPS than even the 9800X3D while playing online, which is how that game is meant to be played.
@NativeWarrior081Ай бұрын
Why didn't you compare with the Ryzen 7800X3D at 4K?
@InternetListener24 күн бұрын
Because the old one is sometimes faster than the new one if games aren't patched everywhere... it happens with the new Intel CPUs too... there are games running faster on Zen 3 than on X3D... which is easy to explain, but not the message the Benchmarketing reviewers want to show.
@ibzman13932 ай бұрын
Why use upscaling? Just show it without upscaling.
@t0x1cde6tАй бұрын
Thank you for including a sim racing title. Hardcore sim racers run some pretty insane resolutions (triple 1440p, double 2160p or triple 2160p), and it's good to know the 9800X3D can help boost frames in titles like ACC.
@rgstroud2 ай бұрын
We understand the best-case processor comparison at 1080p, but we also want the graph you gave for 4K, only with Quality rather than Balanced upscaling. For those of us who have a 4090 and use epic settings, that would definitely tell us whether it is worth upgrading from the 7800X3D, as we never use Balanced with this configuration; we bought the 4090 to use Quality settings worst case, and can add frame generation if the latency is acceptable at 60 FPS native 4K. This will get even more useful with the 5090, when no GPU limits will exist for most games. We would also be greatly interested in a 4K Quality comparison between the 9800X3D/7950X3D and the new 9950X3D, if it has the 3D cache on both CCDs. The end of the video was very useful: I did not upgrade my overclocked 5950X to the 5800X3D for my 4K gaming rig, as the boost shows no help for many games, and even though the 5800X3D was better in some, it was not worth the upgrade since I didn't have the 4090 yet. Now, with the 4090 and soon the 5090, these 4K Quality setting comparisons between CPUs ARE VERY important to represent real-world use cases.
@anitaremenarova66622 ай бұрын
I can tell you right now: it's not. AM4 havers should be upgrading if they buy flagship but the rest will be fine with ryzen 7000 until the next platform drops easily.
@rgstroud2 ай бұрын
@@anitaremenarova6662 Sorry, but real-world use cases are what people want to see. You can deny it all you want; that doesn't make it untrue.
@ChoppyChof2 ай бұрын
I'm always interested in seeing if I can improve my setup, and if I can, what the cost vs. return is to do so. So when I see a new CPU, I at least want to see if it makes any difference at all at 4K ultra native over my current setup (5800X3D and 4090). I highly expect "no to marginal" as the answer, but I'm still interested to see the data.
@xpodx2 ай бұрын
Right. I have a i9-9900k and 4090. 4k 144hz and 5k I'm gpu bound but I'm wondering/hoping my i9 will handle a 5090 fully.
@matejnemec30222 ай бұрын
@@xpodx It will definitely hold you back in some games, unless you enjoy playing sub 60 fps. If you play competitive games with competitive settings it will be a massive bottleneck. You should definitely upgrade your cpu for a 5090.
@xpodx2 ай бұрын
@matejnemec3022 I game at 4K 144Hz on ultra and high settings. In more demanding titles I get 70. But in the games I mostly play, such as MW19 and Vanguard CoD, I get 165-225 at 105% render and fully use my i9-9900K. I think it can handle the 5090; I can always bump the render up to 150% in other games, making them look a tiny bit sharper.
@anitaremenarova66622 ай бұрын
No it won't, CPU upgrade would only improve your FPS when upscaling because 4090 is limiting your 5800X3D in 4K native.
@xpodx2 ай бұрын
@anitaremenarova6662 Yeah, I wish he could have tested 4K ultra with no upscaling in this video ha
@Thor8472002 ай бұрын
Ok, I find nothing wrong with testing @1080p, but personally I want to know what I am getting if I buy that CPU for 1440p or 4K gaming. I don't want to guess or spend multiple hours trying to figure out how to do the math properly to derive 4K benchmark data. If the CPU gives no uplift @4K, I want to know that. If it does give uplift, I want to know that too. And you say that it is "pointless" to show anything above 1080p because I would have to have the exact same setup and the same games as you in order to decide if I wanted to buy the CPU or not, but here is the thing: I don't play at 1080p. So you ONLY giving 1080p data means nothing to me because, as you said about 4K, I don't have the exact same setup you are using in this video. In fact, I would bet the majority of people don't have a 4090 to compare to. I understand why you use it in the benchmarks, to limit GPU bottlenecking, but if you are going to say that I would need the exact same setup you use @4K, then you could use the same logic for 1080p. By your own logic, I don't use 1080p, any of the other CPUs you showed, or a 4090, so all of that 1080p data is pointless except to show that the new CPU is faster than the other, older CPUs. And that is why I like to see 4K data: to see how much faster, if at all, the new CPU is against other CPUs @4K. I would expect the new CPU to be faster at 1080p; that should be fairly obvious unless it is a dud of a product in general. I also would have been interested to know if the 9800X3D gives any uplift to ray tracing, as ray tracing is at least partially CPU dependent. All that being said, to me it isn't pointless to see the 4K data at all. I do appreciate you making this video even if you don't see a good reason to in your own opinion. But there are a good number of us out there who understand that 1080p is mostly irrelevant to us other than as a way to see how much faster a new CPU is.
That doesn't mean, though, that we shouldn't get to see the new CPU's data for resolutions higher than 1080p. Most reviewers don't go back and retest most GPUs for a new CPU, especially at the end of a GPU generation, so 4K numbers for a new CPU are typically the best and latest benchmark updates we will get until a new GPU lineup comes out. So, thank you for making the video. And sorry for the wall of text.
@garbuckle30002 ай бұрын
Your previous videos on the subject definitely helped in making my decision on where to upgrade last year. I do watch cpu reviews to mainly get an idea of what's good value, and what makes sense for my situation. I play at 4k with a 6950XT, so I know a high-end gaming CPU does little. In some games, it helps the 1% lows. And it's helpful for future games. This is why I went with the 7900X3D when I found it for $280. Definitely an upgrade for multitasking work, and the benefit of some cache to help a bit on some games, for an amazing price. I think the big issue is that people are looking at the ridiculous FPS numbers on the chart, and not the % increase. These charts should always be seen as "up to" X performance. In the end... It depends.
@dotxyn2 ай бұрын
Fantastic 1080p vs 1253p comparison Steve😁
@johnhughes97662 ай бұрын
And look how much the results reduced at less than 1440p lol 😂
@coffeehouse1192 ай бұрын
Yeah, I don't see the point of the video. Why change settings and not just resolution? Why test vs the 7700X instead of the 7800X3D... strange video from Steve.
@STATICandCO2 ай бұрын
I know it's missing the point of the video, but DLSS Quality definitely would have been more 'realistic'. DLSS Balanced doesn't really represent 4K or a realistic use case for 4090 users.
@Donbino2 ай бұрын
@@STATICandCO I actually disagree. Digital Foundry, along with many, many 4090 users including myself, use DLSS Performance; I use it on my 32-inch QD-OLED. I actually can't believe how much performance I'm leaving on the table lol
@kerkertrandov4592 ай бұрын
@@Donbino DLSS Performance on a 32-inch? You must be blind if you can't see how bad that looks, considering it's much more noticeable on a 32-inch than on a 27-inch.
@Xavras_Wyzryn982 ай бұрын
About the CS2 data: sure, over 500 FPS is plenty for virtually everyone, but the difference between ~250 and ~340 1% lows must be noticeable on, for example, a 360 Hz monitor. Aren't 1% lows more important for a competitive shooter than average FPS?
@Weissrolf2 ай бұрын
Can you detect 5 out of 500 frames per second coming less than 1.1 ms later (4 vs. 2.94 ms)? Very likely not.
@Xavras_Wyzryn982 ай бұрын
@Weissrolf I don't think that's how 1% lows work. The lows are not evenly spaced across time. A 1% low of 250 vs 340 means rather that someone will have about 3 seconds at or below that frame rate per 5 minutes of play time, likely bunched together. 0.1% lows would tell how low those drops can go.
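The arithmetic behind this 1%-low exchange can be sketched in a few lines (an illustrative sketch only; the function name is made up for this example): frame rate converts to per-frame time as 1000/fps, which is where the 4 ms vs 2.94 ms figures come from.

```python
# Illustrative sketch of the 1%-low frametime math discussed above.
# (Function name is made up for this example, not from the thread.)

def frametime_ms(fps: float) -> float:
    """Per-frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

# 1% lows of 250 fps vs 340 fps, as quoted in the comments:
slow = frametime_ms(250)   # 4.00 ms per frame
fast = frametime_ms(340)   # ~2.94 ms per frame
delta = slow - fast        # ~1.06 ms difference on the slowest frames
```

Whether that roughly 1 ms gap on the slowest frames is perceptible on a high-refresh monitor is exactly what the two commenters disagree about.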
@Aleaf_Inwind2 ай бұрын
I think your message from Gamers Nexus says it all: "we typically publish 1-3 1440p charts in games that are still mostly CPU-bound for perspective." It just gives another perspective. Sure, we can cross-reference and do the work ourselves, but we can also CPU test ourselves, or cross-reference other reviews ourselves. It's nice to see the work done by a channel we trust to do it right, giving an alternative perspective on CPU performance in realistic situations. It doesn't mean we need a 45-game test with every possible config combination; just find a few games, maybe one that sits at the average for GPU-demanding games, one at the average for overall CPU performance, and one at the average for CPU-demanding games, and do a few tests like the ones you did here. You kept saying that the data was pointless, but even though I already understood your testing method and agree with it as the correct method for testing CPUs, I still found this data very fascinating to look at, and am glad you said that you might do another one in two years with the next CPU generation release. On a side note, I'm still sitting on a 5600X and want to upgrade, but I'm struggling to decide between a 5700X3D and a total platform upgrade to a 7700X. The 7700X would cost way more while not giving much benefit, but then I could upgrade again further down the line to a 9800X3D, or potentially a 10800X3D if it stays on AM5; but there's always a chance it could end up on AM6, and then it would probably be a better idea to skip AM5 altogether... Decisions...
@Coolmouse7772 ай бұрын
Hope they don't listen to you. You agreed that this data is pointless, but you still want it? Why?
@Aleaf_Inwind2 ай бұрын
@@Coolmouse777 No, I didn't agree that it was pointless; I completely disagreed. I agreed that their testing methodology is the best practice for showing raw CPU power. But it's also nice to see how that power scales in different real-world scenarios. I said that it was fascinating and gives another perspective, which Gamers Nexus also seems to agree with, and they are one of the most technically minded channels around.
@Coolmouse7772 ай бұрын
@@Aleaf_Inwind There is no other perspective, because there is no additional data. If you want real-world results, just look at a GPU review too; it's simple to compare two numbers )
@Coolmouse7772 ай бұрын
@@Aleaf_Inwind But it doesn't scale; that is the point of the video. If a CPU gets 120 fps at 1080p, it will get the same 120 fps at 1440p and 4K.
@Aleaf_Inwind2 ай бұрын
@@Coolmouse777 There were plenty of cases like Cyberpunk 2077, where the 9800X3D got 219 fps at 1080p and only 166 fps at 4K Balanced, a reduction of almost 25%, while the 7700X only gets 158, a reduction of only about 5% from the 166 it gets at 1080p. So no, you don't get the same fps; you get less, and it's nice to know how much less, because it's not just a flat percentage. Sure, you can go cross-reference GPU reviews to get an idea, but as you can see, it's not just a straight number: the 4090 still gets more frames with the 9800X3D than it does with the 7700X at 4K Balanced.
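The percentage drops quoted in this comment can be checked with one line of arithmetic (a hedged sketch; the fps figures are the commenter's quoted numbers from the video, and the function name is illustrative):

```python
# Sketch of the 1080p -> 4K (upscaled) fps-drop math from the comment above.
# The fps numbers are the commenter's quoted figures, not independently verified.

def pct_drop(fps_1080p: float, fps_4k: float) -> float:
    """Percent of fps lost moving from 1080p to 4K upscaled on the same CPU."""
    return (fps_1080p - fps_4k) / fps_1080p * 100.0

drop_9800x3d = pct_drop(219, 166)  # ~24%: the GPU has become the limit
drop_7700x = pct_drop(166, 158)    # ~5%: still mostly CPU-limited
```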
@mr.waffles25552 ай бұрын
This test was amazing. I have a few friends who haven’t quite grasped this concept and sharing this video with them finally bridged the gap. Thank you for all that you do.
@leov69622 ай бұрын
Does a poll for 1440p, then does the video at 4K... with upscaling, which just shoves the load back onto the CPU instead of the GPU, so we're basically back at the last video. And he also compares it to the flop of a CPU, the 285K, and the 7700X. Where's the 7800X3D at???
@mike-tq5es2 ай бұрын
There was a poll for 4K behavior, which is the reference for this video, but the editor posted the other poll about 1440p. Probably a mistake.
@codymonster74812 ай бұрын
Testing 1080p/1440p/4K should be the norm. Guessing these channels are just becoming, although you wonder why they keep adding more and more games to the list (almost all of which test the same) instead of including some higher resolutions. A fail either way.
@andersjjensen2 ай бұрын
Dude, how much does this guy have to cut things out in humongous cardboard letters and paint them neon green before people clue up? The 9800X3D is 11% faster than the 7800X3D if we focus on very recent titles, and 8% faster if we focus on a very broad range of games. Steve already showed that. Can't you do ONE SINGLE adjustment in your head while watching a video?
@mption2932 ай бұрын
The most important thing is the big question: "Do I need to upgrade?" Telling me how many FPS it gets at 1080p doesn't tell me that. It's very useful for overall CPU performance when it is time to upgrade, but if I'm running a 12th-gen Intel and the 9800X3D is getting broadly the same results at 4K, or at 1440p if I'm playing at that, I don't need to upgrade.
@Skeames12142 ай бұрын
"The most important thing is the big question, do I need to upgrade?" There is no way that question can be answered by a CPU review. It's completely subjective; only the individual can decide. They don't know what exactly you want out of your system, what games you play, etc. If you: 1. already have a high-end CPU, 2. play demanding games at high resolutions and quality settings, and 3. are happy with your performance, then don't upgrade. If you aren't happy with the performance and you're CPU limited, 1080p benchmarks give you the best idea of what CPU *can* reach the number you're targeting, and you can adjust quality settings or upgrade your GPU from there.
@mption2932 ай бұрын
@Skeames1214 Yeah, data sets at high resolutions are totally useless at helping someone make that decision /s This answer tells me there is no point in consuming this content until I make that decision. A product review should show whether something will benefit the consumer, not just that it's the fastest!
@Skeames12142 ай бұрын
@@mption293 But whether or not it will benefit the consumer is subjective. You can’t offer a review that answers that question for everyone. You don’t know what they want, what their other hardware is, how often they upgrade, etc. Tiering the CPUs by their raw performance is a useful data point for all consumers, and more importantly the point of a *CPU* review.
@mption2932 ай бұрын
@@Skeames1214 Whether the 1080p data is beneficial is subjective too. Reviews are supposed to guide you to what's best for your needs, not to say "here are all the possibilities; this is the fastest in a specific use case." More data might not help everyone, but this data doesn't help everyone either.
@Skeames12142 ай бұрын
@@mption293 "Whether the 1080p data is beneficial is subjective too." No, it's not. It is objectively more useful in the context of a CPU review. It shows differences in CPUs instead of masking them behind the limitations of other parts. Would you want them to test GPUs at 1080p with a 10600k? Because that's the same logic. "Show me what this can do when it's artificially handicapped by another component" "Reviews are supposed to guide you to what's best for your needs." No, they aren't. You're talking in circles. Nobody knows what *your* needs are other than you. You can make that determination. This isn't guesswork, knowing what CPU and GPU are capable of independently will give you everything you need to know about what parts are right for you. Reviews are to evaluate the quality of the product in question. Not to evaluate artificially limited versions of it. That's how literally every single review from every reputable channel works. They tell you what the product is capable of now, which in turn tells you more about how it will age. If you actually only care about how specific games will run at specific resolutions with specific graphics settings and a specific GPU on a specific CPU, look up benchmarks for it.
@DrunkenMonk-OGАй бұрын
I don't think people are arguing that the low-res stuff is useless; they just don't care what it does at low res and low settings. They watch reviews and want to see only what they are looking for: what gains they will get at the resolution they play at and the settings they use.
@samgragas8467Ай бұрын
CPU performance is only affected by raytracing and settings like FOV or NPC density. At 4k it is the same, the CPU is good for X FPS.
@martheunen2 ай бұрын
Excellent video! Thank you! Possibly the best one on this topic yet! In the last video I was wondering if you'd do one of these explanatory videos about high res vs CPU etc. My wording in the comments of said previous video was somewhat lackluster, but here now is exactly the type of video I was wondering if you'd make. I guess these kinds of videos are not your favorite to make, but personally I enjoy them and hope to learn something new about current-gen hardware, and I also think they are very good for newcomers. Unfortunately for you, I guess this won't be the last one of its kind, but again, every now and then (once every 2 years?) a refresher on this topic is useful for the community, I think.
@NoirTenshin2 ай бұрын
I might not be representative of both sides, but please hear me out. I understand that CPU testing is there to show what the CPU can do: in a way, its uncapped potential. I also understand that higher resolutions put a cap on that testing and mask what the CPU can do (GPU bottleneck). So, to (stress) TEST a CPU, you don't want it bottlenecked; I don't think anyone can argue with that. The issue is that people aren't just interested in the CPU's potential. They are also interested in how that potential transfers to their scenario (at least ballpark values). It is really hard to "imagine" how the CPU will hold up at higher resolutions, because on average (especially with DLSS and frame gen) we don't know where the bottleneck starts shifting from CPU to GPU and vice versa (it depends per game and per game settings, but nobody is doing that much testing, much less an aggregate of it). People who complain about 1080p are complaining that the potential (the 1080p test) doesn't give them enough information to make an informed decision (without a large amount of speculation). So the expectation is (and I think here lies the issue): a new CPU comes out, people want a CPU review (not just a test of its potential), and the CPU test gives them data they can't relate to (even though it's the most proper way to test a CPU). So it's more about the communication (and to a degree the expectation) of what the test represents. To make an analogy, testing a car going 0-60 mph / 0-100 km/h is great data, but for people in the city with a city car, it's more relevant to test 0-30 mph / 0-50 km/h, while the 0-60 / 0-100 is still useful data. Both data points show the two "extremes" and make it much easier to "imagine" where their scenario lies on that spectrum (as it's almost impossible to find the GPU bottleneck per game). To make the whole thing more complex, DLSS and frame gen can technically be multipliers (they don't just offset the GPU bottleneck point, especially frame gen).
That leaves the end user's needs uncovered, as they have an additional layer of complexity added to their "calculation" (whether the CPU is worth it for them, i.e. whether the delta is big enough for the amount of money). As we saw in the data shown in the video, there are some games that aren't GPU bottlenecked even at 4K (DLSS Balanced). That doesn't bring any useful data about the CPU, but it does bring new information to the customer (that the CPU upgrade is still relevant even at 4K, and how relevant). When you aggregate that data on an average, it allows the (potential) customer to see the CPU in a new light, which may not be the point of a CPU test, but should be a point of a CPU review (how it behaves in conditions that are relevant for customers). So in a way, a CPU (stress) test paints the borders and lines of a picture, but doesn't add color (and shading).
@Flaimbot2 ай бұрын
I'll boil it down.
* If a CPU gets 120 fps at 1080p, the same CPU gets 120 fps at 4K (minus a small portion, because object/geometry culling at lower resolutions doesn't happen at higher resolutions).
* DLSS reduces the rendered base resolution. Look at the GPU benchmarks at the respective BASE resolution for each DLSS setting's numbers (minus a bit for the overhead of the smoothing).
* Frame gen only leverages free resources on the GPU in the case of a CPU bottleneck, and only up to 2x the non-FG numbers, because it's interweaving a real frame with a generated frame. Look at the GPU load: e.g. 59% GPU load at 120 fps without DLSS or FG (so, a CPU bottleneck) -> a bit of math magic turns that into x = 120 fps * 100% / 59% = 203 fps, minus a bit of overhead. (You're free to insert your own numbers from your own games.)
All of the above stacks with each other. You turn on DLSS; that reduces GPU load; that translates into more free resources for FG, but output is still limited by the maximum your CPU can handle x2. Zero magic.
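The "math magic" in that comment can be written out explicitly (a rough rule-of-thumb sketch; the function names are illustrative, and real GPU-load readings fluctuate more than this):

```python
# Sketch of the GPU-headroom and frame-gen ceiling estimates described above.
# Function names are illustrative; real GPU-load readings are noisy.

def gpu_uncapped_fps(measured_fps: float, gpu_load_pct: float) -> float:
    """Estimate what the GPU could deliver at 100% load, given fps measured
    while CPU-bound at a lower GPU load (measured_fps * 100 / load)."""
    return measured_fps * 100.0 / gpu_load_pct

def frame_gen_ceiling(cpu_limited_fps: float) -> float:
    """2x frame gen interleaves one generated frame per real frame, so output
    tops out near twice the CPU-limited rate, minus some overhead."""
    return 2.0 * cpu_limited_fps

# The comment's example: 120 fps at 59% GPU load (a CPU bottleneck).
headroom = gpu_uncapped_fps(120, 59)  # ~203 fps the GPU could render alone
fg_limit = frame_gen_ceiling(120)     # 240 fps ceiling before overhead
```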
@mikenelson2632 ай бұрын
"So in a way, a cpu (stress) test, paints borders and lines of a picture, but doesn't add color (and shading)." This is the equivalent of why differentials or finite differences (and their higher orders) are critical in engineering. You don't want to know how much something is, you want to know by how much it changes relative to a number of parameters, by how much that change changes, by how much the change in change changes, etc. I find it disappointing that this point is completely missed in these discussions. Yes, the best CPU you can buy will without question give you the best future potential in keeping up with performance. But if two CPUs can hold up to the performance you need today, what is the value of the better CPU's upsell relative to performance unlocks? You need the data points to find the differences and extrapolate.
@jammaschan2 ай бұрын
@@Flaimbot I mean, I understand that, but do you really expect the average person wanting to buy a CPU for their system to understand that? They want to know how well a CPU performs in a system closest to theirs, hence the comments asking for 4K benchmarks. Everyone in the comment section most likely understands why 1080p benchmarks are a thing. Little Tommy wanting to build a gaming PC for Christmas this year, or his parents wanting to help him with it, won't get it (probably even if they watch this video), so having a small explanation saying "hey, 4K tests don't really tell us much about the CPU's performance" in every video would definitely help.
@DrMicoco2 ай бұрын
The only thing you need to know is that the CPU doesn't care about RESOLUTION; that's the GPU's job. So if a CPU hits a framerate/frametime at specific settings, that's the ballpark you need to know, because at higher resolutions the question becomes how powerful your GPU is.
@tomongchangco43452 ай бұрын
@@jammaschan If your definition of the average user is little Tommy's parents, who I assume are not technically inclined with computer hardware, then a small explanation or even a CPU review won't help them in the slightest. It is much better to tell them to consult their local computer technician and let him choose the proper hardware configuration and build the system for them.
@Quezacotlish2 ай бұрын
I am interested in how different CPUs perform at higher resolutions as well, so I do like to see that content. Say I am unhappy with my framerate (I play at 1440p) and want to move up to 4K; I'd like to know whether I can get by just upgrading my GPU, or whether I'd see a substantial increase from also upgrading my CPU. Like, if I buy a 5090, am I going to get 120 fps at 4K on my 5800X3D and 120 fps at 4K on a 9800X3D, so I can forgo spending another $1500 on a platform upgrade, for example?
@UsernameInvalid482 ай бұрын
There are instances where 4K still has a CPU bottleneck. It's very uncommon, but if you look at Dragon's Dogma 2 or Monster Hunter Wilds, those games get CPU limited at 4K with a high-end graphics card. That's all because of Capcom's shitty engine, which doesn't work well for open worlds at all optimization-wise. I got a 9800X3D just to be able to keep those games as close to 60 fps as possible.
@joes78742 ай бұрын
Sometimes you'd just like to know what fps you can expect at 4K if you upgrade. I'm not using a CPU to play games at 1080p, so as much as you want to dunk on 4K benchmarks, that's the resolution I'm actually using the CPU at. I don't understand why simply providing the information could be considered "misleading" when it's the actual use case. I get the point about stressing the CPU vs the GPU, but for me that's balanced by the fact that the numbers aren't relevant, and even the performance difference might not scale at a different resolution. Couldn't you argue the opposite, that showing 200+ fps on every benchmark screen is actually more misleading? Edit: I really appreciated the addition of the 4K upscaling numbers, and I think that's a great way to show the benchmarks.
@Александр-ж9м6я2 ай бұрын
Why upscale 4k, why not native?
@provalivay2 ай бұрын
They only had a 4090 at hand
@AlexanderBukh2 ай бұрын
They just spent 40 minutes explaining why. Tests above 1080p aren't representative; at raw 4K it would outright be a comparison of this graphics card against itself.
@pperdana84982 ай бұрын
"Real world"
@Osprey8502 ай бұрын
I suspect that it's because native and even Quality mode upscaling didn't support the point that he wanted to make, which I'm afraid just undermines it.
@provalivay2 ай бұрын
@@AlexanderBukh So basically they have created another 1080p test (as the GPU does not bottleneck), but now it's called "4K upscaled". We can notice in the "4K upscaled" results that the other CPUs are mostly close to their 1080p performance, and so is the 9800X3D. What is the point?
@LOOKINGFOREXPOSURE2 ай бұрын
Is an upscaled "4K" test truly analogous to a natively rendered 4k test?
@Argoon19812 ай бұрын
At native 4K there would be zero difference between a ~100€ CPU and a ~500€ CPU. What useful information does that give you? That CPUs are all useless? That we shouldn't bother buying faster CPUs? What if you buy a faster GPU in the future? How do you know your current CPU still has plenty of headroom to support it? And if you think it has, what info did you see to reach that conclusion, if all CPUs showed the same performance before? How would you know your CPU was being bottlenecked and not the bottleneck itself? Would you just assume a higher-priced CPU would be faster? GPU reviews are done at higher resolutions and max settings, coupled with a high-end CPU, and CPU reviews are done at lower settings and relatively lower resolutions, coupled with the fastest GPU available, for a very good reason: to remove the GPU as a bottleneck and give breathing room to all CPUs. Not directed at you, but in general, the fact that some of you still fail to understand this reason doesn't make it wrong or a conspiracy. And apparently, no matter how many videos are made explaining this, not only by Hardware Unboxed but by others, there are still plenty beating the dead horse.
@thischannel10712 ай бұрын
@@Argoon1981 The lack of difference between a top-end CPU and a less expensive CPU is precisely the useful information. It informs a purchaser of a CPU for gaming at 4K that there isn't a point to paying more for the top-end CPU when it won't give them more performance. That is what's known as useful information that should be included in a consumer CPU review. If you buy a faster GPU in the future, not only is that a hypothetical situation (people buy new hardware more for current needs/wants), but it's very unlikely that you'll ever see the 1080p benchmark FPS spreads at higher resolutions within your lifetime-use of the 9800X3D, because to see those spreads, GPUs will have to become so much more powerful that they'll be pushing the FPS at 1440p or 4K that the highest-end GPU today pushes at 1080p. That isn't going to happen in just one or two GPU generations. By the time even the top-end GPU can do that, you'll probably have already upgraded to a newer CPU, so it's pointless to buy with future value in mind here. This new video by HU is a swing and a miss. Steve made it because he wanted to snark at people who have a use for higher-res benchmarks, but he didn't address their point head-on, because doing so would grant legitimacy to the argument.
@HuCuRuS2 ай бұрын
@@Argoon1981 That we shouldn't bother to buy faster CPU's? Yes. Exactly.
@avatarion2 ай бұрын
@@HuCuRuS You didn't even read his post. Faster CPUs have more hidden potential, so if faster GPUs come along you can unlock that in games with the 9800X3D, but not with a 12100.
@MaggotCZ2 ай бұрын
There is a difference between using the best quality upscaling mode and using Balanced or below, because at that point even 1440p native is more taxing on the GPU and looks better. You practically tested 1253p with a 4090; no shit that's still CPU-tied.
@samgragas84672 ай бұрын
Balanced looks better and is a bit more taxing. In UE 5 it is less taxing btw.
@tyraelhermosa2 ай бұрын
Great job. You nailed it. That example at the end with the CPUs running on the 3090 vs the 4090 makes it so clear.
@ChristopherYeeMon2 ай бұрын
This should be seen as an opportunity, instead of a debate. Do the normal benchmark review that is CPU limited. And do a buying guide review where you put the CPU in 'normal' scenarios and see if it makes a difference so it shows users how far back of a CPU you need to be at to consider the upgrade
@mikenelson2632 ай бұрын
This is the way.
@teekanne152 ай бұрын
In theory, the benchmarks themselves should be enough info for a user to deduce that. But I guess people need handholding with purchasing decisions these days.
@mikenelson2632 ай бұрын
@@teekanne15 No, in theory it absolutely does not work that way.
@Skeames12142 ай бұрын
This is not a debate, this is a literacy issue. You're asking them to double their testing, their shoots, and their editing time for the sake of people who can't figure out that playing games at 4k Ultra is going to make the CPU less important than the graphics card. You have to see how ridiculous that is. What they're doing right now is the right answer. Make a video every couple years explaining the methodology.
@mikenelson2632 ай бұрын
@@Skeames1214 who's asking them to double their testing? Stop treating other people like they're dumb when you're not diligent enough to examine the validity of your own thinking. People who want to play at higher resolutions understand that the relevance shifts towards the GPU. Where are the thresholds? That information is actually helpful in making purchasing decisions. It would be better to structure your testing approach within the constraints that apply in order to better inform viewers.
@x4it3n2 ай бұрын
4K Performance/Balanced/Quality are far from native 4K since they use much lower render resolutions, but yeah. Imo the real selling point of a 9800X3D at 4K would be the 1% and 0.1% lows, which can probably give you a smoother experience due to fewer FPS drops. I have a 5900X and most games still run great on my 4090 at 4K. But when the 5090 releases in a few months, a 9800X3D might be needed...
@CallMeRabbitzUSVI2 ай бұрын
Yep! Was happy to see him actually doing 4k testing but then very upset that he is using upscaled "4K" Steve is a joke and I will continue to not take him seriously 😂
@RADkate2 ай бұрын
there is zero reason not to use dlss at 4k
@Monsux2 ай бұрын
Can you please do path tracing tests on these high-end CPUs? It's by far the most CPU-intensive feature, and it's often disregarded because it's also the most GPU-demanding option. In my testing (4K DLSS Ultra Performance, Cyberpunk optimized high settings + PT), the CPU limitation with the 5800X3D was insane in the city area even with medium crowd density; the CPU was massively holding things back when using high crowd density. People who play at 4K DLSS Performance + PT might think they are always fully GPU limited, but the CPU holds things back. I bet this is going to be a massive issue when the new 50xx cards launch, because users will be able to run games at higher framerates while using PT. Even basic ray tracing in open-world titles is super demanding on the CPU. I would like to know at least the generational jump from the 5800X3D to the 7800X3D to the 9800X3D. I would run these CPU tests myself, but I lack the hardware to run all of them. The limitation might be even harder for players running games with mods.
@Joelione2 ай бұрын
Path tracing is one of those features that is always left out of CPU reviews. Not even my 7800X3D is able to run Cyberpunk smoothly with PT when using a 4090. CPU bottleneck!
@l8knight8452 ай бұрын
Completely agree. RT testing is where the real 4K separation will show up.
@thephantomchannel53682 ай бұрын
Once hardware can cope, path tracing will be the gold standard for lighting. The more information the system has access to (pixels) the better the results will be as well as the higher the refresh rate the more realistic it will look. The problem with path tracing using up scaled resolution is that the math for the lighting will not be as accurate as it would if it had been calculated at a native 4k resolution. Also the amount of bounces/ passes needs to be increased to make the tech almost resemble photo realism. Currently I think the limit is set to three passes max which requires software augmentation/ interpolation to fill in the gaps as a function of de-noising. Some day I hope that tech will be fast enough to at least have VR with native 4k with 120hz per eye and still be able to output full 4k path tracing with at least 12 passes for much higher fidelity and accuracy for color and shadows without relying on interpolation.
@kerkertrandov459 · 2 months ago
@@Joelione yes getting 20 fps at 4k native with a 4090 when u turn on path tracing (rt overdrive) is def a cpu bottleneck lmao
@Monsux · 2 months ago
@@l8knight845 The thing is, path tracing bottlenecks my 5800X3D even at the lowest possible rendering resolution in Cyberpunk. The framerate drops to unplayable levels, and the 1% lows get super low. I just wish PT testing would become the norm in CPU reviews/benchmarks. New RTX 5090 and 5080 users who play, for example, Cyberpunk with path tracing at 4K DLSS Performance are massively limited by their CPUs. By how much is what I would like to know. The people who say the CPU doesn't matter at 4K haven't used PT or played open-world games with RT on (Spider-Man, Star Wars Outlaws, etc.).
@CZID · 2 months ago
This is what I'm looking for! Good job on doing these comparisons. Please make more videos like this. Thank you!
@Tealc2323 · 2 months ago
I just want to see the 40-game average at native 1440p between the 9800X3D and 7800X3D
@N0N0111 · 2 months ago
7800X3D to 9800X3D at native 1440p is around 5 to 10 FPS more in 90% of games. This is what people want to hear from them, but they don't want to show it; instead we get fake [downscaled] 4K benchmarks LOL.
@snowball808 · 2 months ago
@@N0N0111 Yes, I'm looking at buying a 5090 and I'm dying to know NATIVE benchmarks. I want 1% lows at 120 fps, native 4K, fully ray traced. I'm pretty sure I don't need a 9800X3D and a 7800X3D will work fine, but if I'm building a badass PC that is going to last 7 years, I will spend the extra $$.
@doctorWHO_ey · 2 months ago
Add the 5800X3D in there too
@volvot6rdesignawd702 · 2 months ago
I'm not quite sure what the point of this video really is. 7700X vs the 9800X3D in 4K, umm, why?? Love all these 1080p benchmarks crapping on the 7800X3D lol. We get it, it's a better CPU if you only play at 1080p!! Who actually plays at 1080p unless you have a shitty GPU, in which case why waste money on a 9800X3D instead of saving for a better GPU?! 4K is always going to be GPU-bound, so why not show 4K 7800X3D vs 9800X3D? I don't think those results will paint the 9800X3D in as good a light (obviously). And lastly, I would much rather see 1440p results, 7800X3D vs 9800X3D, as most people with a decent GPU (7900 to 4090) would play at 1440p depending on the demands of the game. I know there will be small gains or none between the 7800X3D and 9800X3D, but it seems more realistic than 1080p or 4K!!
@andersjjensen · 2 months ago
@@snowball808 And that is EXACTLY why 1080p CPU benchmarking is crucial: go back and watch Steve's 40-game 9800X3D round-up and pin your eyes on the 1% lows. But spoiler alert: even the 9800X3D has a few instances where it can't do 120 FPS 1% lows. And spoiler alert 2: the 4090 can't do 60 FPS at 4K native in Cyberpunk with RT Overdrive, so the 5090 sure as hell won't be able to do 120 FPS.
@Krarilotus · 2 months ago
So the conclusion: with 172 fps in Star Wars Jedi: Survivor for the 9800X3D vs 155 fps for the 5800X3D, the 4K testing just shows that these CPUs are very close, while at 1080p, 234 vs 163 fps paints a completely different picture and shows the actual IPC uplift of the new Zen cores and cache vs the well-aged 5800X3D that we all got to know and love. Most of us will probably not upgrade from it until the AM6 platform releases, because why would we? The 5800X3D is still king of price-to-performance, and the cheaper AM4 platform really does all we need for now!
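Putting the commenter's figures into numbers (these fps values are quoted from the comment above, not independently verified): the relative uplift looks very different depending on which resolution you measure at.

```python
# Relative uplift of the 9800X3D over the 5800X3D, using the Star Wars
# Jedi: Survivor figures quoted in the comment above (assumed accurate).

def uplift(new_fps: float, old_fps: float) -> float:
    """Percentage improvement of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

print(f"4K:    {uplift(172, 155):.1f}%")  # GPU bind compresses the gap to ~11%
print(f"1080p: {uplift(234, 163):.1f}%")  # CPU bind reveals a ~44% uplift
```

The same two CPUs, but the GPU-bound 4K test hides roughly three-quarters of the real difference between them.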
@anitaremenarova6662 · 2 months ago
@@contactluke80 Why are you copy-pasting the same comment everywhere? Yes, 80% of people at 4K use upscaling; there literally isn't a card that can play modern games at 4K native lmfao. And no, a better processor will improve your stability (1% lows, 0.1% lows), especially when RT is used, at any resolution.
@StatsGam4er · 2 months ago
I still don't get it. What is wrong with testing CPU performance at 1440p and 4K to see how performance stacks up? Why is this frowned upon?
@Eidolon2003 · 2 months ago
There's nothing wrong with it; it just isn't as useful as 1080p. Think of these CPU benchmarks as being resolution-agnostic (this isn't exactly correct, but it's close). The number you see in a CPU benchmark is roughly the highest framerate that CPU can achieve in that game. We test at a low resolution to make the GPU as irrelevant as possible. The CPU could reach that framerate at 4K too, if only the GPU were fast enough to keep up.
@StatsGam4er · 2 months ago
@@Eidolon2003 I totally get that. It makes sense for TESTING CPU performance. But what is wrong with TESTING at 1440p and 4K? My point is: nothing. There is nothing wrong with testing a different use case. Take GPU testing, for example the RTX 4090: they test at 1080p, 1440p, and 4K. Why on earth would they test an RTX 4090 at 1080p then???
@Eidolon2003 · 2 months ago
@@StatsGam4er There's nothing "wrong" with it, true. But I would argue that it also doesn't provide much new information, which makes spending the time doing the testing a waste. Especially when Intel and AMD only give reviewers a few days to put their reviews together before the embargo lifts. As for the GPU testing, it's still relevant at 1080p as long as it's still GPU limited, and there are quite a few new games coming out where the 4090 is in fact GPU limited at 1080p max settings. It's useful to know what framerate it gets in that case. The way I see it, you have CPU benchmarks at 1080p to set the ceiling, then you look at GPU benchmarks for your card at your resolution and see what framerate it gets. If the framerate your GPU gets at your resolution is less than what the CPU is maximally capable of, then you're good to go.
@CarpaBob · 2 months ago
I would say look at timestamp 29:09, the CPU longevity test. It showed that on the 30-series card the three CPUs looked almost identical, but once you move to the 40-series card there is a big performance difference. With the 50 series coming up we should see another big uplift, and the 1080p test gives a good indication of which CPU is most likely to truly benefit.
@StatsGam4er · 2 months ago
@@CarpaBob Thanks. Yeah, that's a fair point. I'm not against 1080p CPU testing; it makes total sense. As you pointed out, with the 40 and 50 series, 1440p and 4K are still somewhat relevant and the CPU can matter. Personally, I'm on a 5900X with a 4090, and I can't figure out if I should get a 9800X3D to play at 4K as close to my 160 Hz refresh rate as possible.
@MrLincoln87 · 2 months ago
CPU/GPU scaled testing again? Updated for, say, 9800X3D/7800X3D/5800X3D and maybe 3700X/5700X/9700X, paired with a few different GPUs. Are these videos stacked with thumbs-down, or do people just watch and then not thumb up? Even if you're salty about the outcome (say you're an Intel fanboy and see the 9800X3D review), surely you wouldn't thumb down the video, right? 13k thumbs up with 214k views? Crazy.
@10Sambo01 · 2 months ago
While I completely understand why CPU benchmarks are done at low resolution, I think what people are really getting at is: what PRACTICAL performance difference will this CPU make to my gaming at 1440p/4K? While you can't predict what settings people use or the framerate they prefer, you CAN test a worst-case scenario in a suite of games to give a comparative result across multiple CPUs.
@memitim171 · 2 months ago
The CPU itself doesn't run any slower at 4K than it does at 1080p; with a powerful enough GPU, these 1080p numbers will be your 4K numbers. So what you're really asking is for him to do a shedload of testing to save you the hassle of comparing two numbers. Surely you can see why that's silly?
@10Sambo01 · 2 months ago
@@memitim171 The point is that people want to know what the difference will be if they upgrade their CPU TODAY at the resolution they currently use. It's all very well saying that a 9800X3D runs a game at double the FPS of their current CPU at 1080p if there is zero performance increase at 1440p, which is what they actually use. Some games have a CPU bottleneck, some don't; if there's still a CPU bottleneck at 1440p with a 4090, then it's going to make a difference.
@memitim171 · 2 months ago
@@10Sambo01 So look at the GPU number, what is so hard about this?
@10Sambo01 · 2 months ago
@@memitim171 What do you mean?
@memitim171 · 2 months ago
@@10Sambo01 If the CPU you are considering does 200 fps in the test at 1080p, but the GPU you have does 150 fps at 1440p, then your framerate with the new CPU will still be 150 at 1440p. Conversely, if your GPU does 200 fps at 1440p but your CPU does 100 at 1080p, then your fps at 1440p will be 100. All you need to know are these two numbers and whether you are currently CPU- or GPU-limited, and Steve can't tell you the last bit because it's your computer, your game, and your setup. I don't play any of the games used to bench here or by any other serious YouTuber, but their data still tells me everything I need to know to make an informed purchase, because all I need is those two numbers (the average from a 45-game bench is more than sufficient) and to know where I'm currently at. You declaring the data useless because it isn't at 1440p is the same as me declaring it useless because my games aren't included. It doesn't matter, because the relative power of the parts is still the same, and you will always be limited by the bottleneck.
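The rule this commenter is describing reduces to a single min() over two measured ceilings. A minimal sketch, using the hypothetical numbers from the comment above:

```python
# Your actual in-game framerate is capped by whichever component is slower:
# the CPU ceiling (measured at 1080p, where the GPU is nearly irrelevant)
# or the GPU ceiling (measured at your target resolution).

def effective_fps(cpu_ceiling_fps: float, gpu_ceiling_fps: float) -> float:
    """Return the framerate you can actually expect: the lower of the two ceilings."""
    return min(cpu_ceiling_fps, gpu_ceiling_fps)

# New CPU does 200 fps at 1080p, but your GPU does 150 fps at 1440p:
print(effective_fps(200, 150))  # -> 150 (still GPU-limited; the CPU upgrade adds nothing)

# GPU does 200 fps at 1440p, but the CPU tops out at 100 fps:
print(effective_fps(100, 200))  # -> 100 (CPU-limited; a faster CPU would help)
```

This is why a 1080p CPU chart plus a GPU chart at your own resolution is enough to predict your result without a combined test.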
@mikewilliams6522 · 1 month ago
I'm replacing my i9-9900K with the Ryzen 7 9800X3D, so I feel like that's going to be a massive difference and improvement 😂
@AyazDaoud · 1 month ago
Congrats. I hope your GPU is good enough to use with a CPU that good.
@mikewilliams6522 · 1 month ago
@@AyazDaoud I'm using a 3090 FTW3 24GB
@BenBording · 2 months ago
I completely get why you test CPUs specifically at 1080p. Makes a ton of sense when you test CPU vs CPU. BUT as one of the lucky people who game at 5160x1440, ultra settings, with a 4090 on my old 5900X, this review still hit the bullseye. Almost. These benchmarks at higher resolutions are very helpful, for me personally, to judge whether or not an upgrade is even worth it in my specific use case. So thanks for the video; this was exactly what I was missing as a bonus!
@pepijngeorge9959 · 2 months ago
The point of the video is that high-res testing doesn't tell you anything about the CPU that 1080p testing doesn't already tell you.
@Dempig · 2 months ago
I wouldn't call 5160x1440 lucky; ultrawide looks really bad lol. Just get a big 65"+ OLED or something, it would look so much better. What's the point of an extra-wide display with a teeny tiny vertical view? Most games look very bad on ultrawide, especially first-person.
@BenBording · 2 months ago
@@Dempig lol you obviously never tried it then, but perhaps that's not your fault. 65" is just neck-breaking and looks trash up close. Most games look amazing on super ultrawide. ;)
@BenBording · 2 months ago
@@pepijngeorge9959 No, but it does tell you something more detailed in very specific use cases, as Steve rightly points out multiple times in the video.
@Dempig · 2 months ago
@@BenBording Nah man, ultrawide is terrible for literally every game lol. Like I said, it completely ruins first-person games, and having such a tiny vertical viewing range with a large horizontal viewing range makes no sense.
@tomlin6554 · 2 months ago
Where is the comparison with the 7800X3D? I am confused.