Step 1: This one is on the viewer: do I need a CPU upgrade? If the answer is ‘yes’, proceed to step 2. Step 2: Find a review that looks at CPU-limited gaming performance; examine those results to determine which CPU in your price range offers the most performance, as that will be the best value choice. Step 3: Well done, you did the thing.
@Hardwareunboxed · a month ago
Seriously though, GPU-limited CPU testing is dumb for so many reasons, and is misleading at best. You're probably not buying a CPU to play just the games being tested, and you're probably not playing at the exact settings being tested. If you plan on keeping your CPU for a few years, you should want to know which one offers the most performance, as that will age the best. The idea of a review isn't to convince you that you need to upgrade; that's 'step 1', work that out first.
@Satyajit-vm8nx · a month ago
Hell yeah
@Ola_John72 · a month ago
@Hardwareunboxed I think that same argument could be used for GPU reviews. Some of them focus on games from 5 years ago that are not really relevant any more. If multiple ranges of games were tested (some more GPU intensive, others more CPU intensive, since there are people who only play those kinds of games), it would give the viewer a lot of information when doing a new build. Lots of people with a 4070 Super spend extra on a 7800X3D when maybe a 7700X would be enough for the games they want to play, and if that price difference went to the GPU instead it would make a bigger difference; on the contrary, some actually need that CPU because they only play competitive games at 1080p. The problem is that it would take a lot of testing. TechPowerUp does this well in their CPU reviews, but only with the 4090.
@krazyfrog · a month ago
Step 3 should be to look at GPU reviews and find out what level of performance your current or future GPU is capable of in the games you like to play, and then, based on those results, plan your CPU purchase so that the CPU results generally outperform the GPU results in those games at their respective settings. As usual, you should generally be GPU bottlenecked and not CPU bottlenecked, so the CPU should always exceed the performance of the GPU, and that means looking at the 1% lows, not the averages. Step 4: Good job. You have now figured out how to effectively utilize the information at your disposal instead of having to be spoon-fed results for the exact configuration you are after.
@ole7736 · a month ago
Step 1 could be assisted by a good video guide on determining whether you actually are CPU bound. Then consult reviews telling you which CPU is able to achieve the amount of fps you desire in the games relevant to you.
@thelegendaryklobb2879 · a month ago
In the end it's a simple answer: the number of possible hardware and software combinations is astronomically high, and running more and more tests is a waste of time because you'll never be able to cover all possibilities. Reviewers should test the maximum possible performance, and it's up to the viewer to cross-reference the data from various reviews to approach the answer they seek for their particular case.
@Lurch-Bot · a month ago
They're still heavily focused on recent hardware, which doesn't really help people trying to upgrade from older hardware. GN is really the only source with extensive enough data to do this. All synthetic benchmarks have their biases, and GN is the only source that methodically tests actual game performance, something not well represented by synthetic benchmark scores. But the problem is that, while you may know how to cross-reference benchmarks and I know how to cross-reference benchmarks and Daniel Owen knows how to cross-reference benchmarks, 90% of gamers do not. I always excelled at statistical math. Most people do not, as evidenced by how easily people are swayed by propaganda that involves misrepresentation of statistics. And nobody has yet found a way to properly account for the X3D effect. Nobody. You can't see the most important benefits of that much cache on any benchmark. There is still no way to account for visual fidelity variation when looking at the performance of dynamic res scaling and frame generation. Nobody is testing Lossless Scaling, which they really should be. They are all a bunch of shills and aren't going to tell you to go buy a $7 app that is an absolute masterpiece of programming skill when they're taking kickbacks from Nvidia to hawk DLSS 3.
@mojojojo6292 · a month ago
@Lurch-Bot Lossless Scaling is still shite compared to DLSS. No motion vector data will always result in noisy images.
@janoskiss8040 · a month ago
Yes. Actually it is database management. The reviewer should build a database with all test results, capturing all parameters of the test: CPU, GPU, resolution, game, game settings, game scene, etc. Then a user interface would be needed for various functions, like being able to select a hardware configuration and calculate averages across the tested games, or see the individual game results, or compare the performance of selected hardware configurations, etc. Additionally, a speculative result could be calculated if the specific hardware combination was not tested, by taking the same CPU and GPU from two other tests (it would probably need an additional flag in the data to show whether the test was CPU or GPU limited). But this would make earning income more problematic, because the reviewer cannot place the database into a YouTube video; he would have to rely on paid subscriptions for database access.
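A minimal sketch of that database idea, using Python's built-in sqlite3 (the table layout, column names and the sample row are all invented here for illustration, not anything a real reviewer publishes):

    import sqlite3

    # One row per benchmark run; a flag records whether the run was
    # CPU- or GPU-limited, so speculative cross-combinations can be derived.
    con = sqlite3.connect(":memory:")  # use a file path for a persistent DB
    con.execute("""
        CREATE TABLE results (
            cpu TEXT, gpu TEXT, game TEXT, resolution TEXT,
            settings TEXT, scene TEXT,
            avg_fps REAL, low_1pct REAL,
            limited_by TEXT CHECK (limited_by IN ('CPU', 'GPU'))
        )""")
    con.execute("INSERT INTO results VALUES (?,?,?,?,?,?,?,?,?)",
                ("7800X3D", "RTX 4090", "Cyberpunk 2077", "1080p",
                 "ultra", "city drive", 160.0, 118.0, "CPU"))

    # Example query: averages across all tested games for one hardware pair.
    row = con.execute("SELECT AVG(avg_fps), AVG(low_1pct) FROM results "
                      "WHERE cpu=? AND gpu=?",
                      ("7800X3D", "RTX 4090")).fetchone()
    print(row)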
@MattyJay03 · a month ago
People don't understand that if a CPU maxes out at 100 frames at 1080p, then it maxes out at 100 frames at 4K too. If your GPU can get up to 100 frames at 4K, that's where you will become CPU limited again. That's the point of testing CPUs at 1080p.
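A toy model of that point (illustrative numbers only): treat the CPU as a roughly resolution-independent frame cap and the GPU as a resolution-dependent one; the framerate you observe is whichever cap is lower.

    # Toy bottleneck model: the CPU cap barely moves with resolution,
    # the GPU cap falls as resolution rises. Observed FPS = min of the caps.
    CPU_CAP = 100  # fps the CPU can prepare, roughly constant across resolutions
    GPU_CAPS = {"1080p": 240, "1440p": 160, "4K": 95}  # hypothetical GPU limits

    for res, gpu_cap in GPU_CAPS.items():
        fps = min(CPU_CAP, gpu_cap)
        limiter = "CPU" if CPU_CAP < gpu_cap else "GPU"
        print(f"{res}: {fps} fps ({limiter}-limited)")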
@JeremyBell · a month ago
100 frames at ultra quality 4k.... Oh no! How will we ever survive! Bro... If someone is shooting for maxed out 4k graphics, then money isn't a problem and they're slapping in the most powerful cpu without comparison shopping
@MattyJay03 · a month ago
@JeremyBell it was an example; replace 100 with any number.
@syncmonism · a month ago
That's not entirely true. Ray tracing in particular tends to come with additional CPU overhead, so running at 4K can also result in a greater computational demand on the CPU. It's also possible for other things in a game to require more CPU performance at higher resolutions, though I can't think of anything else that would cause that.
@OptimizingNetwork · a month ago
The 'CPU limited' label is nonsense; no one agrees on what it is. You say one thing, but I don't agree with it. CPU limited can be a case where you are at 99% GPU usage or at 30% GPU usage; the limit of the CPU is very ambiguous by definition. Where does it start? Does it start when your GPU is choking, or does it start when your GPU is underperforming? The real answer is both, because a strong CPU feeds data comfortably during both low-load and high-load scenarios, and not just at 4K or 1080p. A 9950X can hold nearly the same FPS while dipping to 78% GPU usage at 1080p on an RTX 4090, within about 10 fps average of a 7950X3D; the 7950X3D will post higher averages at 91% GPU usage, but struggles in super heavy CPU moments with lots of action on screen. The 7950X3D is better overall for gaming, but the 9950X is more stable for gaming with (not much) lower averages. Is the 9950X arguably better then? Look at it again and the 7950X3D is better. You know who is right? No one. Both CPUs are very strong, and neither wins in everything, as the 9950X can win in 0.1% and 1% lows but lose in averages. I own both the 7950X3D and the 9950X, and I owned a 13900K and a 7950X; all these CPUs are EXCELLENT, and they hold special spots in testing, even in gaming. The 9950X is nearly a 7950X3D in averages, will remain more stable in the heaviest scenes, but consistently loses in averages to the 7950X3D. Both are super awesome CPUs, and either one is great if you know how to Process Lasso them; you can see on my channel how well the 9950X does. People really try to say the 9950X is a 7950X now, just lol. What are they measuring???
@Antantaru_ · a month ago
I think it's better to test at 720p. Really make sure the GPU doesn't have to work at all.
@Randomgirl4629 · a month ago
For me, it's about when and how frame drops happen. CPU-bound frame drops feel horrendous: they come out of nowhere, often happen when the action is most hectic or when I'm trying to take in a large city full of characters, and no settings tweaks can make the game feel good to play. GPU-bound frame drops are usually much less offensive, more predictable, short-lived if originating from big flashy particle effects, and can be solved with tweaks to resolution (this includes upscaling) as well as graphical quality settings. This is how I decided a 7800X3D was worth buying (even on a 4K 165 Hz monitor!) vs putting the money I could save getting a 7600 towards a more powerful GPU (in my case it was 4090 vs 4080S, but I think the decision holds up even with weaker GPUs if you're keen on playing games at high refresh rates [120+] rather than in native 4K at max settings).
@MrKlaatu06 · a month ago
Indeed, those stutters and drops are hell.
@antondovydaitis2261 · a month ago
This is useful reasoning.
@Lurch-Bot · a month ago
The true power of X3D cannot be seen on any benchmark. You can have a 14900K and a 4090 and you're still gonna get gnarly frame drops driving around Night City... but not with X3D. It is a perfect example of how cache is king in gaming. Most CPUs just don't have the cache to keep up with the rapidly changing data in that scenario. I've got an RX 6600, and even with such a puny GPU, X3D made a big impact.
@Legz_inStyle · a month ago
They are usually worse because the whole simulation stutters instead of the frames. And as you said, there is also, usually, no way to mitigate this in any way. Few games have a "less simulation" slider.
@JGComments · a month ago
I did the same thing, but with a 7900XTX. I love buttery smooth gameplay.
@adriancioroianu1704 · a month ago
"It's hard to sell numbers, people want to feel the change". Your channel has covered this concept very well in the past. And yeah, it's probably hard to do it with CPUs because it would be boring, if not unfeasible from many POVs. But to have the perspective, I think, is crucial, because it makes buyers more aware and knowledgeable while putting that bit of pressure on the manufacturers to improve. And this perspective is what qualifies a journalist, not just the numbers and charts. Anyway, great video! I think you're a force of decency in this techtube space, among others. Nuancing the polarizing arguments is always welcome and smart.
@rangersmith4652 · a month ago
A lot of people want to know something very specific based on their situation: Something like "How many more FPS can I get in CP77 at 1440p ultra with no ray tracing using a 3060 12GB and 16GB of DDR4 3200 CL16 if I upgrade my 2600X to a 5600X?" The answer is simple. I don't know. Go find some reviews comparing a 2600X to a 5600X (there are loads of them) in situations as close as possible to yours, and interpolate those results as best you can.
@kravenfoxbodies2479 · a month ago
Well, the answer is not always the X3D parts, because the RX 6600 wouldn't care much about it if the GPU is already maxed out; kind of like what HUB did with the 14700K vs 5900XT using an RX 6600.
@xPhantomxify · a month ago
I hate that Cyberpunk 2077 has become the default game everyone wants to benchmark... A game from 2020 that is unoptimized, unfinished and still not living up to the hype, with many RPG features missing and chosen paths like Corpo or Street Kid not making a difference. I've never seen so much dickriding for a shady, scumbag company like CDPR that "leaves greed to others", when they purposefully tried to hide the disastrous bugs and missing features at launch... "B-but the game has K-keanu Reeves with sunglasses saying the F word and there is so much adult content!" ...
@xPhantomxify · a month ago
There are tons of new games that look good and actually are great, perfect for benchmarking and a reason to be a PC gamer. For instance, Space Marine 2, Silent Hill 2 Remake, Dead Rising Remaster, Dead Island 2, Until Dawn Remake, FF7 Rebirth, Monster Hunter Wilds, etc. But these manchildren are like, nah, let's benchmark Cyberpunk 2077 because "I'm an adult and I want to drive a cool futuristic bike or car in a dystopian neon world where the NPCs say the F word non-stop!"
@Raven3557 · a month ago
@xPhantomxify Dunno bro, I consider it the same as Crysis being used for benchmarks in the past. Just one game that was chosen to be the common denominator. Calling people manchildren while writing your rant is pretty ironic.
@Lurch-Bot · a month ago
I'm sure someone has done a multi-game test with both of those combinations.
@mattzun6779 · a month ago
I've seen exactly ONE review that really answered the question of whether a CPU or GPU upgrade will improve my gaming experience. Tom's Hardware did a test of 19 games with 4 CPUs from different generations, 4 GPUs of different performance tiers and 4 different resolutions/quality settings. The review showed both 1% lows and average FPS. The number of combinations was insane. The review was: "CPU vs GPU: We tested 16 hardware combinations to show which upgrade will boost your gaming performance the most".
@lunarath1 · a month ago
I remember reading this article before I built my PC in the summer and thought it was pretty good. The problem is that even with that many combinations, it still just doesn't reach the majority of people, I think. The question I had at the time of reading was both what CPU to go with, but also if I should get the 4080 Super or 4070 Ti Super, neither of which they tested, although they did test the regular 4080. I definitely think they should have had the 4070 and 4060 in there though. I would love to see more of these kinds of reviews, but comparing entire generations of hardware. I doubt it's anywhere near worth the effort though, unfortunately.
@Lurch-Bot · a month ago
An X3D upgrade will boost anyone's gaming performance, lol.
@mikem6466 · a month ago
Deffo going to watch this video. I've always wanted this type of content, but basically no one does it. I didn't know that video existed. Thanks op
@wedgeantilles8575 · a month ago
@Lurch-Bot Bullsh*t. It depends on the CPU you have atm, which GPU you have atm, and which kind of games at which resolution you play.
@Durayne · a month ago
Thanks, actually a test I've been looking for for quite a long time.
@tomysam150 · a month ago
I love the fact that you use your teaching skills to educate people on tech. Keep it up!
@be0wulfmarshallz · a month ago
What whiteboard program is he using?
@Lurch-Bot · a month ago
Oh, please. Daniel is a smarmy know-it-all talking down to us from his ivory tower. He can't even begin to relate to what it is like to be an average gamer. I'd like to see him try to game on a Xeon E5-1650v3 on a cheap Machinist MB with an RX 580 for a year. Then he might get a clue. Daniel Owen probably has the highest IQ of any tech YouTuber, but he has zero clue how to answer these questions because he lacks perspective.
@myname7021 · a month ago
It doesn't make much sense to review CPUs/GPUs any other way. However, I would still appreciate it if reviewers did more testing of actual real-life setups.
@joelshrem8836 · a month ago
I always enjoy seeing different CPU and GPU combinations tested to have an idea about which upgrade will help the most and when
@Dazzxp · a month ago
#1: Check a CPU review so you have a baseline of how fast a CPU can be in a specific application. #2: Check a GPU review and compare it with the CPU benchmark, and you can see if you have the right balance in the system.
@MaxIronsThird · a month ago
Daniel not calling the stoopid people, stoopid, is such a nice teacher thing to do.
@Lurch-Bot · a month ago
Talking over your audience, as he typically does, is the whole problem. He has no clue how to advise the average PC owner because he's sitting up there in his ivory tower with his 4090 and 14900K, lording it over us, as are most tech tubers. They're the main reason game devs no longer optimize. Confirmation bias from watching YouTube leads them to think everyone has high-end hardware. I'd like to see Daniel using a Ryzen 3600 and RX 580 as his only gaming PC for a few months heading into 2025. Then he might gain the perspective he needs to figure this out.
@kelvino3990 · a month ago
@Lurch-Bot The hell you on about? When a game comes out, Daniel does GPU and CPU reviews ranging from a 2060 GPU and up or a 3600X CPU and up. He's clearly trying to advise the public on why testing is done the way it is, and you with your little brain can't even comprehend it.
@sasagrcevic475 · a month ago
@@Lurch-Bot True.
@thedarkangelpt · a month ago
So I have a very good 7800X3D sample. I ONLY game, and 99% of the time I play at 4K. I'm using a 4K 144 Hz OLED TV. I'm hearing a lot about the 9800X3D. Would it make sense for me? My FPS is always capped at 140, and most games are GPU bottlenecked.
@thedawn-rt9rx · a month ago
@Lurch-Bot From one extreme to another... if you can't afford a new GPU in nearly a decade, you should buy a console for gaming... I just checked the Steam hardware survey and 50% have at least a 3060 or higher across all brands... things like a Radeon 5700 or RTX 2060 and below I didn't count... ''TaLKiNg OvER YoUr AudiEnCe''... why don't you make your own channel then? If you wanna know specific details about your card, why don't you search for them in YouTube benchmarks instead of telling Daniel how to make his videos?
@vindeiatrix · 28 days ago
Daniel I really have to commend you for actually reading comments and addressing these issues of misunderstanding that have been happening for years. No one else I’ve seen has done anything like it.
@BobBob-zu2dt · a month ago
I love this recent crusade on CPU performance and how to interpret it, keep up the great work!
@JackJohnson-br4qr · a month ago
If Daniel was in the online education commercials, ZOOM could never have failed. Good job man. 👍😊
@TheSkepticSkwerl · a month ago
People forget not every game is GPU limited, so it comes down to your play style. Simulation games, older games like Counter-Strike, games with tons of NPCs like Space Marine 2, etc... they are ALL using heavy amounts of CPU. So making either part faster can be a waste in some games, but also a benefit in others.
@ElvinJames-qv6qi · a month ago
Like StarCraft 2, Oxygen Not Included, Minecraft, simulation games. People should test these games.
@ElvinJames-qv6qi · a month ago
Not 1080p AAA games running at 100-200 frames.
@NewishJordan · a month ago
@ElvinJames-qv6qi They will absolutely never test games like this, for obvious reasons. While these games are niche, they all happen to be pretty multi-core heavy, and we all know who that would benefit on the benchmarks.
@yedrellow · a month ago
Squad, Squad 44, Hell Let Loose, ... Plenty of games that are CPU limited while simultaneously begging for more frames than a 7800X3D / 14900K can deliver.
@stefannita3439 · a month ago
They are not using heavy amounts of CPU; they are just using tiny amounts of GPU, because we have much stronger cards nowadays. Meaning the GPUs have a lot of headroom and it's typically the CPUs that are the bottleneck. There's a difference between increasing CPU load and decreasing GPU load, even though they can both lead to a CPU bottleneck.
@EthicalAllele · a month ago
It's so nice having a passionate teacher using their expertise to teach the manchildren in the PC community! And I don't mean that entirely as an insult, because I'm in that group as well. Your explanation of the 1% lows' impact on fps in a GPU-limited scenario was very helpful for me! I now have a new piece of information I can understand from CPU reviews and my own personal benchmarks.
@bracken777 · a month ago
Thanks Daniel for this video, which I highly respect as it's very fair. I left a comment on one of your previous videos (where I complained from the perspective of someone who is 99% of the time GPU limited, and for whom other factors are more important than a few % better CPU performance vs the next competitor). I think you tackled some very important aspects. It's indeed a difficult problem to answer and we still miss a lot of information. For instance, I'm wondering if there are any stats on how often a given user is CPU vs GPU limited at a given resolution. We could look at the latest Steam hardware survey to get a better feeling for what kind of systems users have (it seems that ~55% of users still play at 1080p, although we don't know what games and with what GPU); you could also do a YT survey to check if your viewers know whether they are limited by CPU, GPU, or they don't know :) Anyway, long story short, it's extremely important to point out that CPU performance is very subjective, and while reviewers try to get to a common ground, it's not clear how practical their reviews are. Suppose for a second that 80% of viewers are GPU limited: how many of those users would simply upgrade the CPU without understanding that they wouldn't gain much?... Lastly, in my case, I'm moving away from the AMD 7800X3D and I hope to pick one of the newer Intel CPUs (once the review embargo is over). I'm also looking at it from the perspective of power usage, and I'm 100% willing to trade CPU power for better efficiency. At the end of the day, I won't be sad if my game runs at 30% vs 25% today, if I get back system stability.
@AphyRus · a month ago
Hey Daniel, I bet a lot of people wanting an answer to this question won't find your video because of the title. Maybe you want to change it to something more descriptive and less click-baity? 😀 I guess a lot of your viewers are familiar with this concept already (not me, me uninformed), but people searching for it might not be? Great content as always. Thank you!
@bsmarques · a month ago
1% lows are something we can look at, but they don't work for a large audience. This is the question in most people's minds: what's the best possible performance of a given GPU? Then, what's the cheapest CPU to be paired with this GPU that will give me similar FPS? This is the BALANCED PAIR. Some games will favor better CPUs, others will favor GPUs, but overall the build is well balanced; there's no point investing in one component without investing in the other. In the end what we will have is a list of BALANCED PAIRS, meaning those are the CPU/GPU combos you should be looking at; do not overspend on one while neglecting the other.
@bsmarques · a month ago
Also, it's ok to have the same CPU tied to multiple GPUs. For example:
R5 7600 + RTX 4070
R5 7600 + RTX 4070 Ti Super
R7 7800X3D + RTX 4080 Super
R7 7800X3D + RTX 4090
Basically.
@DominatorNX-x4b · a month ago
Good questions, Daniel. People building a new system often start with the decision about the GPU; it is by far the most expensive component, and they want the GPU to be THE bottleneck in their system, because they pay the most for it and logically want to get the most out of it. And not everybody buys a 4090, right? Let's say I want to build a system with a 7800 XT for single-player gaming. What CPU should I buy to be GPU bottlenecked most of the time? That is the type of content that is missing... In other words, take a popular midrange GPU, test it with a range of CPUs in some typical gaming scenarios (1080p, 1440p, 4K, high, ultra...) and see which CPUs are powerful enough to get 100% performance out of the GPU most of the time. And this can answer the question: do I need a badass expensive X3D chip or am I good with a Ryzen 5 7600?
@Alex_whatever · a month ago
The question of "How would this CPU impact MY gaming experience?" can only be answered by the individual asking the question. It is up to the user to learn where their personal bottlenecks are and then use that to decide what upgrade would benefit them the most, be it CPU, GPU, RAM or faster storage. No one else can answer that question without intimate knowledge of the system and how it performs. Maybe a video on how to determine what is causing a bottleneck would be a good topic.
@MrKlaatu06 · a month ago
Ironically the criticized UBM is one of the few places to see 'how would this impact my experience' because they have enough of a database to see different configurations of the same setup, with only a few parts removed. I want to see another database of that scope, but it hasn't happened.
@Lurch-Bot · a month ago
That's part of the problem. Most gamers haven't the first clue on how to derive knowledge from the benchmarks. But Tech Tubers up in their ivory towers with their 4090s and 14900Ks have zero clue anymore what it is like to be a regular gamer. So how can they begin to give useful information when they lack perspective?
@ninele7 · a month ago
@Lurch-Bot The problem is that to give information that wouldn't require additional knowledge to apply to your specific setup, reviewers would basically need to test every CPU with every GPU (at least the popular ones). It would take tons of time and money. And most of these review channels, while being somewhat sustainable, aren't making insane profits.
@Zyxlian · a month ago
@Lurch-Bot You seem to be saying this a lot in these comments. I patiently await your YouTube channel posting thousands of videos daily with the millions of different combinations of hardware with every specific CPU, GPU, motherboard, game, graphics setting, resolution, upscaling method, antialiasing method, operating system... The point is that you can't. Even just taking into account the last generation of hardware, that's around 15 CPUs and 17 GPUs: 255 configurations of hardware (not counting other variations like HDD vs SSD, or motherboards). You would need to test both Windows 10 and 11, so you are at 510 now. Now how many games do you want them to test? Let's go with only a small selection of 10, so that is over 5000 now. OK, now resolutions: 1080p, 1440p and 4K, so that's over 15000 now. Let's stop here, not bother with upscaling and other graphics settings, and just do the math for this: 15000 benchmarks at, say, 10 minutes per run is 150,000 minutes. 2,500 hours. 312 work days. Good luck with that YouTube channel!
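The back-of-envelope math above, reproduced (the comment rounds 15,300 down to 15,000):

    cpus, gpus, oses, games, resolutions = 15, 17, 2, 10, 3
    runs = cpus * gpus * oses * games * resolutions   # 15300 benchmark runs
    minutes = runs * 10                               # ~10 minutes per run
    print(f"{runs} runs = {minutes / 60:.0f} hours "
          f"= {minutes / 60 / 8:.0f} eight-hour days")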
@jinxdad2809 · a month ago
Daniel, good video. The big question is how to evaluate your current system to determine where the bottleneck is and how much improvement can be achieved by upgrading the CPU or GPU, or whether it is time to upgrade the entire system.
@TerraWare · a month ago
Thanks for attempting to educate some of the people who don't understand this. It can be frustrating to compare CPUs sometimes. There's a decent chunk of people that focus more on the GPU used and the resolution than on the title of the video.
@MelodeathMassacre · a month ago
Daniel, I think a showcase of Intel PresentMon with GPU Busy could help a lot of folks learn about a great tool that can help identify GPU bottlenecking.
@residentCJ · a month ago
Yes, I wonder why PresentMon is not used more often nowadays. But even with that, I think the rule of thumb stays true: the higher the resolution, the less relevant the CPU.
@Dajova · a month ago
@residentCJ Mostly because RTSS, the benchmarking tool he and many others are using, does not have that feature (yet).
@BestJoester · a month ago
The best way to identify what kind of upgrade you need is by using metrics software like MSI Afterburner/RivaTuner and looking at your statistics while you play/do whatever it is you're trying to do. You're playing a game, and your GPU is below 98-99% usage? You will benefit from a better CPU. It obviously could be a little more nuanced than this, but generally, that's how you will know whether a better GPU will give you more frames or a better CPU will.
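A sketch of automating that check, assuming you've exported a capture to CSV with gpu_usage and framerate columns (a real Afterburner/RTSS log uses different column names, so treat these as placeholders):

    import csv

    CPU_BOUND_THRESHOLD = 95  # % GPU usage below which we call a sample CPU-bound

    def summarize(path):
        with open(path, newline="") as f:
            rows = [(float(r["gpu_usage"]), float(r["framerate"]))
                    for r in csv.DictReader(f)]
        cpu_bound = [fps for usage, fps in rows if usage < CPU_BOUND_THRESHOLD]
        if not cpu_bound:
            print("Never CPU-bound in this capture; a faster GPU should scale.")
            return
        share = len(cpu_bound) / len(rows)
        avg = sum(cpu_bound) / len(cpu_bound)
        print(f"CPU-bound in {share:.0%} of samples (avg {avg:.0f} fps there).")

    summarize("capture.csv")  # hypothetical log file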
@franks2524 · a month ago
Smart man, cut all the BS and do what this guy says !!!
@Lurch-Bot · a month ago
Not all recent games aim for full GPU utilization, especially when using features such as dynamic res scaling and frame gen. According to Steve Burke, anything above 90% is considered 'normal utilization'; it is when you see below 80% utilization that you have a serious CPU bottleneck. And most people don't realize you can have a CPU bottleneck that is based on single-threaded performance or cache. You can have a CPU bottleneck even if you only see 40% CPU utilization; it happens a lot on older CPUs. Hardware combinations which were seen as a good match 5 years ago tend to mostly end up CPU bound with recent titles. You see this all the time on, say, a Haswell i5 with a 1060, which used to be a common budget build. Even games which are considered GPU heavy, if they're from the past 5 years, aren't going to run their best with an old CPU. I recently had the opportunity to play Cyberpunk on a Quadro K2200 (basically a 4GB 750 Ti) with a 3rd gen i5 and then with an i7 upgrade, and the difference was noticeable. And Cyberpunk is considered to very much be a GPU-intensive title. Step up to something like a 1060 and the difference will be night and day. But if you're playing GTA V, a 3rd or 4th gen i5 with a 1060 is a great combo. Even a later Core 2 Quad wouldn't bottleneck a 1060 much in GTA V.
@BestJoester · a month ago
@Lurch-Bot Yeah, which is why I said there are nuances to this; not all games use hardware the same. Far Cry 3, for example, actually performs worse if you have more than 4 threads assigned to the game, and this would present itself as a CPU bottleneck if someone didn't know any better, for sure. And I think Helldivers 2 hovers around 90% GPU usage most of the time regardless of graphics settings/resolution, yet lowering them will still raise fps. But looking at GPU usage is usually the key, as modern CPUs these days will almost always have more threads than a game can feasibly utilize, and even then you still might not see 100% utilization on those threads, since multithreaded tasks need to be synced up a lot of the time anyway. I'm by no means a game engine developer, but I have worked enough with game engines at this point to know that multithreading can only get you so far with game logic, and that extends to a lot of other software in general. But that's a whole topic on its own. At the end of the day though, to identify a bottleneck in your system, knowing how to read system metrics properly is the only way to know where your system can possibly improve, which is what I think Daniel should make a video touching on, potentially.
@Tarvoskemwer · a month ago
What could be interesting would be 3-5 tiers of CPU/GPU combos which perform well together at 1080p, 1440p and 2160p (ofc ignoring 1440p and 2160p for the low-end combos). And by tiers I mean pricing, which per usual would point to 3 tiers. One could also do 3 levels of performance within each resolution: 30ish fps, around 60 fps, 144 fps+ (probably ditch 30 fps). Rephrased, the intro question could be: what combination of CPUs/GPUs would give you 60 fps or better at a given resolution? And if one plans to upgrade the GPU later, then perhaps picking the CPU from the next combo up the list will also need to be touched on. And to further try and cut it down, 5 combos:
1. 1080p CPU + 1080p GPU (60 fps)
2. 1440p CPU + 1080p GPU (60 fps, room for GPU upgrade)
3. 1440p CPU + 1440p GPU (60 fps)
4. 2160p CPU + 1440p GPU (60 fps, room for GPU upgrade)
5. 2160p CPU + 2160p GPU (60 fps)
Whereas 2 and 4 don't need to be tested much, as you just take the CPU from the upper combo in case you plan to upgrade the GPU within some timeframe. Perhaps cut out 2 and 3, and then keep 4 for those who play midrange and swap GPUs every 2 years or so. Meh... as you said, many factors.
@Just_An_Ignacio · a month ago
I mean, the first sign that you should upgrade your CPU is if, across a large sample of your heavy games, your CPU load skyrockets and the 1% and 0.1% lows are incredibly low compared to the average FPS. *Then* you can check whether the games are considered badly optimized or whether your GPU is the problem because of something like a lack of VRAM or really high usage. If those are likely the problems, your CPU should be fine.
@jai2628 · a month ago
What I recommend a person looking at a potential CPU/GPU upgrade should do (warning: not a SUPER beginner-friendly method):
1. Find the game where you want to see the improvement. Note: for variety gamers, make a priority (i.e. the game that needs the most help) or do this method multiple times to compare which games are lacking the most graphically/performance-wise.
2. Install MSI Afterburner and RivaTuner Statistics Server, enabling the GPU load percentage, current fps, avg fps and 1% low fps in the overlay.
3. Play the game for a bit at your desired settings, noting the GPU load and avg fps in various areas/situations (consider writing it down if the load varies significantly).
4. Cover the same areas, but with the lowest possible settings (MAKING SURE TO DROP RESOLUTION; consider going all the way down to 720p if you can), noting the avg fps and 1% lows (consider writing this down too).

Now that you've got your data, time to make a judgement.
1. Look at the GPU load at your desired settings: if the GPU load is high (regularly reaching 95-99%), move on to the next step. If not, skip to step 3.
2. Look at the avg fps you noted at the lowest possible settings for your game. Ask yourself: "If at my desired settings this was my maximum framerate, would I be satisfied?" If the answer is yes, then divide your DESIRED FPS (note that this can be lower than your LOW SETTINGS FPS, e.g. I could be getting 120 fps at the lowest settings but be happy with an average of 90) by the DESIRED SETTINGS FPS, which, let's say, was 45. E.g. 90 / 45 = 2. In this case, 2 is my "GPU multiplier" and I should look to purchase a GPU that is 2 times faster than my current one. I can now look at a review or a GPU database like TechPowerUp and find a GPU that is roughly 2x faster than my current GPU (probably a bit more, just to make sure you hit your target fps). If the answer is no, then you may need to upgrade your CPU as well as your GPU in order to hit your goal settings/performance. Either that, or you might need to temper your expectations lol. If you do want to consider upgrading both, read the CPU step below; I'll add a point in the "WHAT IF" section addressing this.
3. If your GPU is NOT regularly hitting high load, your framerate is likely bottlenecked by your CPU and not your GPU, and thus an upgrade to your GPU will do absolutely nothing for your avg framerate. If this is the case, identify your DESIRED FPS. Let's say I'm averaging 60 at my desired settings, but I want to get to 90. Do the following calculation: DESIRED FPS / DESIRED SETTINGS FPS, e.g. 90 / 60 = 1.5. With this, look to purchase a CPU that is 1.5 times faster IN GAMING than your current one. Look for a review and find a CPU that is roughly 1.5x faster than your current CPU (probably a bit more, just to make sure you hit your target fps).

WHAT IF: Average FPS at my desired settings isn't an issue, but I want to fix my 1% lows tanking and causing stutters?
1. Ensure that you've taken every precaution to isolate that the game-to-CPU relationship is the problem, i.e. the game is optimised in such a way that too much load is given to the CPU all at once, and there isn't some bug present in the game or some external factor such as packet loss causing these stutters. Check if your game runs on Unreal Engine 4/5, as stutters like this are really common there due to poor shader compilation and multithreaded load distribution.
2. The CPU step calculation doesn't really work here, since such stutters often affect every CPU differently, and often not proportionally to their relative average fps performance. Instead, find a reviewer who tests multiple CPUs on your given game and publishes 1% low data (Hardware Unboxed's 50+ game tests are great for this), and find a CPU that meets your 1% low desires. 1% low issues are RARELY a GPU insufficiency, so a CPU upgrade should help with this issue.

WHAT IF: My GPU is hitting high loads at my desired settings, but my CPU at the lowest settings is not hitting an FPS number I'm satisfied with?
1. Do BOTH the CPU and GPU calculations and find your two multipliers.
2. Consider also doing one final test with your CPU: note your avg FPS with your desired settings on, except with the resolution dropped to the minimum. If THIS average is notably (say, more than 10%) lower than your MINIMUM settings reading, use it as the DESIRED SETTINGS FPS in your CPU calculation and use this multiplier instead of the one before.
3. Buy a GPU that aligns with the multiplier from step 1 and a CPU that aligns with the one from either step 1 or, preferably, step 2.

Any other questions/clarifications in the replies, I will read and edit this comment accordingly!
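The arithmetic in the method above, as a small helper function (the example numbers are the ones from the comment):

    def upgrade_multiplier(desired_fps, desired_settings_fps,
                           lowest_settings_fps, gpu_bound):
        # gpu_bound: was GPU load regularly at ~95-99% at your desired settings?
        if gpu_bound:
            # The CPU ceiling (lowest-settings fps) must already meet the target.
            if lowest_settings_fps < desired_fps:
                return "need both a CPU and a GPU upgrade (or lower the target)"
            return f"look for a GPU ~{desired_fps / desired_settings_fps:.1f}x faster"
        return f"look for a CPU ~{desired_fps / desired_settings_fps:.1f}x faster in gaming"

    # GPU-bound at 45 fps, target 90, CPU ceiling 120 -> GPU ~2.0x faster.
    print(upgrade_multiplier(90, 45, 120, gpu_bound=True))
    # CPU-bound at 60 fps, target 90 -> CPU ~1.5x faster.
    print(upgrade_multiplier(90, 60, 120, gpu_bound=False))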
@brucepreston3927 · a month ago
This is a good video... I do think a lot of YouTube CPU reviewers tend to influence more casual people into upgrading their CPU when they really don't need to... I have much respect for what reviewers do, but I do wish they would mention some of the points you made in this video when they put out a review... I think a lot of people with midrange GPUs end up with extreme overkill CPUs when that money could be better spent getting the next tier of GPU instead...
@Ghost_PM11 · a month ago
does this apply to me if i dont drink?
@davidhines7592 · a month ago
There are other questions needing answering too. I upgraded from an 11400F simply because I couldn't get an LGA1200 motherboard with 2 M.2 slots for a reasonable price to upgrade my old one, so I wound up spending £100 on a 12400F and another £99 on LGA1700, and discovered that when gaming, where the 11400F was reaching 90W package power, the 12400F draws 60W: 30W less. That's something I feel really ought to be in CPU reviews when a new CPU comes out: not just "x% gain over previous gen speed/clocks" but also the difference in power draw during typical tasks like gaming. A second question: why do I get better performance, and a better Time Spy benchmark score, when I change the default Gigabyte BIOS power limits (which are 125W PL1/240W PL2) to the Intel defaults (65W/117W)? The results are consistently slightly better with the Intel default power limits. Though it's statistically within the margin of error, the numbers are ALWAYS higher with the Intel defaults.
@elu5ive · a month ago
1) Find out if you're CPU or GPU limited (preferably with your native refresh rate as the target). 2) Gather data from as many of the most detailed and high-profile reviews out there as you can, to figure out which products would make up for your imbalance. 3) Don't overpay for features you don't need, and don't save money on cooling (especially when it comes to GPU coolers; the cheapest models are usually loud, hot garbage).
@thejontao · a month ago
A couple of years ago, when I upgraded my video card, I decided to measure my FPS before and after in the small handful of games that I play. I did some research online, then I installed RTSS and Afterburner, played a while and wrote down my results. After my new GPU arrived I did the same. I had data on the difference. All the Internet can do for users is explain the concepts of CPU and GPU limitation and teach people how to determine where their bottleneck is. In all fairness to YouTube CPU reviewers, they (the ones that I watch, anyway) make an effort to talk about a CPU's single-core performance vs multi-core performance, and to do benchmarks in both gaming and workstation scenarios. They're doing all that they can.
@jouniosmala9921 · a month ago
If your FPS is 100 and you trigger a spike on average every 10 seconds, what effect does the spike have on the 1% lows? The 1% lows do not mean the same thing they used to, because big CPU spikes are rare enough with modern GPUs. All the things you mention as a source of CPU spikes are too rare to actually hit the 1% lows on modern systems. When I used a 2070 Super with an i7 920 for almost a year, I learned quite well what a real CPU bottleneck looks like and how often it happens in the games I played before the GPU upgrade. For instance, if enemies that were not part of a map jumped on me in Path of Exile, sometimes they appeared seconds after the actual fight. Getting into a settlement, it took a long time for all the models to appear. In those cases, loading assets took so much CPU power that rendering stuttered extremely. I think in this scenario having more cores would help, because loading assets is a parallel task. In StarCraft 2 co-op mode, if I was paired with Stukov I got a stuttery 4 fps. In this case it was the single-threaded pathfinding algorithm that caused the CPU bottleneck, when my partner had so many zombies compared to a normal unit count. Oh, and I found a style with which I was able to carry at 4 fps. In this case single-threaded CPU performance is the thing that matters. In CPU reviews, the best thing would be finding reproducible INCREASES in CPU demand in some games, and taking measurements from those. Edit: The benefit would be actually showing the scenarios where CPU performance really affects the experience of the player. One YouTuber has shown that the smoothness of the frame time graph in some games is way better with 16 cores than 8 cores, while not really changing the averages or 1% lows.
@DenteMM · a month ago
A good and doable question to answer is: what CPU do I need for one specific GPU? What is the minimum CPU I need for a 4060, for example?
@damage968 · a month ago
I remember my laptop came with pretty slow RAM; I upgraded it to the best RAM available for the 5800H and gained an average of 17% better CPU performance. It helped a lot in games like MH World. I used to get a lot of frame dips before, and while the max/average framerate didn't change much, stability did, a lot.
@sealsharp · 27 days ago
Calculations are fast; reading memory is slow. You made the right choice.
@philippgeyer7785 · a month ago
Hello, I stumbled across your video regarding Monster Hunter Wilds. It has been a long time since I put together a PC, so I thought I'd ask you: would you make a video about a PC build specifically for Monster Hunter Wilds? I think it would perform well and be helpful during this hype!
@Aleksey-vd9oc · a month ago
There is no need to go through ALL the combinations. It is enough to show in tests how much the processor outputs in games together with a 4090 at 1080p, 1440p and 2160p, and to specify the hardware parameters and game settings. Some tests do just that.
@bumpahhi · a month ago
I can give you a secret tip on how to know whether you would gain fps.
1. Go to YouTube.
2. Search "[old CPU name] [GPU name] game benchmark".
3. Take notes of the results.
4. Search "[new CPU name] [GPU name] game benchmark".
5. Take notes and compare.
6. Congratulations! You now have your answer and have acquired the skill of how to pass university.
@DrummerGhisi · a month ago
I think the only way to accurately answer the left question would be by testing case by case, which may or may not be possible for a reviewer. It would be cool to see reviewers talk about the CPU in a realistic setup, but that would be very complicated. What I think would be possible is to test and suggest some sort of pairing: new CPU goes well with that GPU in this [game genre]. Something I feel isn't very well touched on by reviewers is whether or not the new CPU is worth upgrading to.
@nathanielknight1838 · a month ago
What do I do if nobody is reviewing the specific GPU/CPU combo I'm trying to upgrade, in the specific game I have issues with? Most reviewers cover Fortnite and whether you get 32908 or 128937 fps with some settings, and I just don't care about their multiplayer FPS. I just want to know why Shadow of the Erdtree runs like crap and whether I can fix it by upgrading to a 7600X paired with the 6700 XT, or is that card not good enough for some reason? Edit: How would one even test whether they're CPU bottlenecked? When the GPU isn't working at 100%, or is constantly going up and down in usage? Is that saying I need a better CPU, or could it be something else?
@someguy-somehow · a month ago
Practical example for me: I had a 1070 paired with a 4770K. I put the 1070 in a newer Ryzen 5600 based system and saw a very noticeable 15-20% increase in FPS. A newer low-end CPU blew the socks off an older high-end CPU when paired with the same GPU, but given that these were at least 6-7 CPU generations apart, that also says something.
@256shadesofgrey · a month ago
If you want a quick and dirty way of checking whether you'd get something from an upgrade, check a CPU review and a GPU review for the parts you're considering, and then check the games you want to play in those reviews, making sure to pick the resolution you're going to be playing at for the GPU. If the numbers differ a lot, you'll be bottlenecked by the part with the lower FPS. But Daniel is correct: it's always better to overspec the CPU at least a bit to avoid stuttering. In GPU bottlenecked scenarios there are rarely frame time spikes, in my experience.
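That cross-referencing step in a few lines; every fps figure below is a placeholder, not real review data:

    # Per game: the lower of the two review numbers is your likely ceiling.
    cpu_review = {"Cyberpunk 2077": 150, "Starfield": 95, "CS2": 400}
    gpu_review = {"Cyberpunk 2077": 90, "Starfield": 110, "CS2": 250}  # at your res

    for game, cpu_fps in cpu_review.items():
        gpu_fps = gpu_review[game]
        part = "CPU" if cpu_fps < gpu_fps else "GPU"
        print(f"{game}: ~{min(cpu_fps, gpu_fps)} fps expected, {part}-limited")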
@JusticeGamingChannel · a month ago
There are just too many variables to answer the gaming experience question: everyone's configuration is different, and there are too many variables in terms of games, game scenes, settings, etc... So the only thing us reviewers can do is make the CPU the variable; that is all that is in our power as a control. In order to test differences, CPUs have to be put into a scenario where that is possible, and those are just the hard, simple facts of it. Perhaps one day in the future, AI could simulate 1 CPU on 20 different GPUs in 20 different games with 20 different game settings, and play the entire game through; that's what would be needed. It isn't humanly possible, that's for sure, unless you have a team of 2,000 people and all the hardware in a huge factory lol.
@Blafard666 · a month ago
Daniel, you're doing such necessary work! Reading YT comments, I realized a long time ago that PC gamers painfully NEED to be educated!
@_fatalruin · a month ago
You're totally on point, but some reviewers do answer both (for the most part). Hardware Unboxed does this with their benchmarks. They test various resolutions and show the gamut, from CPU limited to GPU limited, and explain the results. I do agree that many people don't understand the distinction.
@vidiveniviciDCLXVI · 22 days ago
I agree; reviewers should tell people what's the cheapest CPU you need per GPU output at 4K, as much as they tell you which CPU is best for workloads or low-res gaming. I own a 4090; I just want to know what CPU I need to keep that 4090 ticking over at 4K until I need to upgrade the 4090. Someone else might want to know what they need for a 4080 Ti at 1440p, etc.
@thecurto · a month ago
Daniel, would this approach work? 1. Benchmark a game at 1080p with DLSS/FSR Performance ON to lower the render resolution as much as possible. 2. Take note of the average FPS and 1% low figures. 3. Repeat 1 and 2 for the games you play. The results should technically show the MAXIMUM possible frame rates that your CPU can handle at any resolution. Then, if the GPU you are looking to upgrade to achieves higher frame rates (in benchmark videos), your CPU will be the bottleneck. E.g. if your CPU can only achieve 90 FPS in Cyberpunk at 1080p with DLSS/FSR Performance on, then this is your absolute limit regardless of the GPU in the system.
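The comparison at the end of that approach, spelled out (all numbers invented for illustration):

    # Your own measured CPU ceilings vs. the fps a candidate GPU gets in reviews.
    cpu_ceiling = {"Cyberpunk 2077": 90, "Elden Ring": 110}
    candidate_gpu = {"Cyberpunk 2077": 120, "Elden Ring": 95}

    for game, ceiling in cpu_ceiling.items():
        if candidate_gpu[game] > ceiling:
            print(f"{game}: the CPU becomes the bottleneck at ~{ceiling} fps")
        else:
            print(f"{game}: the new GPU stays the limit (~{candidate_gpu[game]} fps)")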
@bauer9101 · a month ago
I was questioning recently if it was worth moving from an 11400F to an 11700 with an RX 6750 XT. I have concluded probably not. I must be learning something as a couple of years ago I would have done it.
@zetaalpha9974 · a month ago
There's the basic concept of what's a cost-effective upgrade: what will benefit an end user the most. Many times MMO-style games are limited more by the CPU than the GPU. Not every end user knows how to best identify what's really limiting their "digital" experience, and many times the effective uplift currently is to upgrade the CPU/MB/RAM as a combination to achieve a limited upgrade in power. It's difficult to know what the "cost effective" upgrade would be that would meaningfully help with a perceived performance challenge.
@fogowar2022 · a month ago
Hey Daniel, how about trying to find the CPU/GPU combinations that are most closely matched, for different resolutions? I.e., which CPU/GPU combinations share the most evenly matched responsibility for bottlenecking, across a suite of games.
@stephenhamilton3303 · a month ago
The type of workload makes a big difference. There's a huge difference between FPS games (COD, Apex, Fortnite) and open-world games (Starfield, Cyberpunk). The latter will kick your CPU's butt while not necessarily pushing your GPU. Playing Starfield at ultra 1440p with a 12900K and RTX 4070 Super, I get spikes when the drive is accessed, and the CPU is nearly 100% all the time whereas my GPU is around 80-90%. Excellent video Daniel. Your students are lucky to have such a great teacher.
@MR-vj8dn · 16 days ago
A review can also be informative to people who are looking to purchase a complete system. It doesn't ONLY have to be aimed at those who are looking for theoretical numbers or synthetic proof.
@Soullessrun · a month ago
That interactive whiteboard really reminds me of school. I know you have this in your DNA by now, Daniel, but it's still funny :D But it's really informative for most people, so it's really great! Keep it up!
@RevAshtray · a month ago
Perfect explanation. Well done.
@helenFX · a month ago
As to how to answer the question: I don't think it is possible at a wide scope, but you are already doing it when you show games using a variety of realistic settings on a range of hardware.
@marks9233 · a month ago
It would be really cool to know which CPUs start to bottleneck which GPUs! Of course it would only be a generalization with a lot of limitations, but directionally... it could be really useful. It takes this whole line of thought to its most helpful conclusion, you know? ==> For example: at what tier of GPU do I transition from only needing a 6-core CPU (7600X, 9600X) to needing an X3D to not bottleneck it??? Generally speaking... I mean, my guesstimate would be around a 4070 Super or 7900 XT, but... I don't think I have ever really heard anyone address that question head-on.
@malinus3023 · a month ago
What I'd like to know is whether you can tell the difference in visual quality between FSR and DLSS if you were playing on the TV in your living room. I get that at ~1 m (3 ft) on a 24-32 in monitor it might be more pronounced, but what about 10-20 ft away on a 55"+ TV?
@lexiconprime7211 · a month ago
Typically I just look for YT channels that are testing games with the same or similar hardware to me, or are testing upgrades I'm looking to target. When I wanted to upgrade my 3700x to a 7800x3d, I looked for channels that were testing games I wanted to play with the GPU I was already using, alongside the CPU I wanted to upgrade to.
@rtyzxc · 27 days ago
Outside of CS:GO, CPU averages are often not very useful; they can be an interesting metric, but not something I care about. What I care about is the lows: what the performance looks like in the worst-case scenario. And this is something you can measure even when GPU bottlenecked most of the time, while using more realistic graphics settings that might still affect CPU performance in unknown ways. 1%, 10% or even something like 50% lows are all more interesting than the average for me. The 50% low basically asks "how bad is your experience going to be 50% of the time?". Actually, the 50% low is just the median. Practically, nobody cares how high the highs go.
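One common way to compute those percentile lows from a frametime capture (definitions of '1% low' vary between reviewers; this version averages the slowest X% of frames):

    import statistics

    def pct_low(frametimes_ms, pct):
        worst = sorted(frametimes_ms, reverse=True)      # slowest frames first
        n = max(1, int(len(worst) * pct / 100))
        return 1000 / statistics.mean(worst[:n])         # ms per frame -> fps

    frames = [8, 9, 8, 10, 9, 25, 8, 9, 40, 9] * 50      # synthetic capture
    for p in (1, 10, 50):                                # 50% low ~= the median
        print(f"{p}% low: {pct_low(frames, p):.0f} fps")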
@aaronduboise5277 · a month ago
You'd need multiple generations and different performance tiers of GPUs and CPUs, and to test them in different configurations and resolutions:
4090 or 4080: 7800X3D / 5800X3D / 3600X at 4K, 1440p, 1080p
3080: 7800X3D / 5800X3D / 3600X at 4K, 1440p, 1080p
2080: 7800X3D / 5800X3D / 3600X at 4K, 1440p, 1080p
Do the same with Intel CPUs, and for good measure test all those CPUs again on Radeon GPUs. I've been wondering personally, when the 50 series comes out, whether I'll need to upgrade my CPU or not (12700K). There isn't a lot of good testing to determine whether a CPU upgrade will offer any benefit.
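Generating that matrix mechanically shows why full-coverage reviews are rare (the hardware names are just the ones from the comment above):

    from itertools import product

    gpus = ["RTX 4090/4080", "RTX 3080", "RTX 2080"]
    cpus = ["7800X3D", "5800X3D", "3600X"]
    resolutions = ["4K", "1440p", "1080p"]

    matrix = list(product(gpus, cpus, resolutions))
    print(len(matrix), "configurations per game")  # 27, before Intel and Radeon
    for combo in matrix[:3]:
        print(combo)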
@Mr.Stende · a month ago
At some point you mentioned that 60 fps can have different smoothness depending on the frame timing and the CPU strength, because of extra load spikes in certain scenarios. It would be helpful if reviewers included frametime comparisons, or maybe some kind of frame time variation/fluctuation graph. Do better CPUs give a smoother frametime graph with fewer fluctuations, even at the same framerate? That is a question I would like to see explored in some depth. My own anecdotal experience says yes, although I haven't tested it properly/thoroughly. Personally, it's the dips and spikes that make me want to upgrade a CPU. I always limit my FPS to smooth out the frametime graph and reduce spikes/stutters.
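One way a review could quantify that: two synthetic captures with nearly the same average fps but very different frame-time spread (numbers invented for illustration):

    import statistics

    steady = [16.7] * 58 + [16.0, 17.4]   # flat frametime graph, ~60 fps
    spiky = [14.0] * 55 + [50.0] * 5      # similar average, big spikes

    for name, ft in (("steady", steady), ("spiky", spiky)):
        avg_fps = 1000 / statistics.mean(ft)
        print(f"{name}: {avg_fps:.0f} fps avg, "
              f"frametime stdev {statistics.stdev(ft):.1f} ms")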
@Ladioz · a month ago
I don't think I'll ever find someone on YouTube who explains stuff better than Daniel.
@cunicelu · a month ago
For me, the process is as follows: how much would I want to spend on a new CPU? Then search the reviews in that price range to see what's available at the time, then find a benchmark with that CPU and something close to my GPU to see how much I would gain. Or, if available, a CPU comparison between my CPU and the potential CPU with a GPU similar to mine.
@erictayet · a month ago
I have not bought a new custom PC in 20 years; I upgrade components, so I think my experience is relevant to Daniel's question at the end. I used to use an i7 4790 with 4x8GB of DDR3 RAM. When it came time to upgrade after 5 years of awesome usage, I wanted something with more than 4 cores/8 threads to support DX12 games, which can use 8 cores. Since I've used Intel from the 1980s, I naturally looked at reviews on YouTube and websites. LTT introduced Hardware Unboxed to me, and I liked the clean, no-nonsense format. HUB uses a different GPU from mine, but that is fine. Based on the review, the 5900X looked like a winner to me when compared to the Intel 10900K, but by this time the 4790 was no longer in the charts, so I searched for reviews of the 7700K & 4790K and found that the former is ~10% faster. So the 5900X should be ~30% faster for gaming. For productivity workloads, it is substantially faster. Now for the GPU: HUB used the 3090, which is a lot faster than my Radeon 5700. Looking at other reviews, I figured out that the 5700 gets ~40% of the 3090's fps at 1440p, so I had some expectation of the performance I'd be getting, which is only about a 20-30% improvement. My main reason to upgrade was for work, so I chose the 5900X over the 10900K. The 5900X losing by a few percent to the 10900K in gaming was of no concern to me.
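The ratio-chaining in that comment, made explicit (the ratios below are placeholders, not real review figures; each hop compounds measurement error, so treat the result as a rough estimate):

    # No single chart had both the old and new CPU, so multiply hop ratios.
    r_5900x_vs_10900k = 1.00   # roughly even in gaming charts (placeholder)
    r_10900k_vs_7700k = 1.20   # placeholder
    r_7700k_vs_4790k = 1.10    # "~10% faster", per the comment

    uplift = r_5900x_vs_10900k * r_10900k_vs_7700k * r_7700k_vs_4790k
    print(f"5900X vs 4790K, estimated: ~{uplift:.2f}x in gaming")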
@ankur313 · a month ago
If you are in a GPU-bound situation where GPU utilization is >95%, then upgrading the CPU is not going to help you. If you then change the settings of your game so that GPU utilization drops below 90%, a CPU upgrade will give your existing GPU the potential to reach >95% utilization!!
@dante19890 · a month ago
ye pretty much
@joeyr184 · a month ago
Yeah, I like the way you put it. I think this would make sense to most people.
@Happydrumstick93 · a month ago
7:49 - You could release a spreadsheet in your description where the viewer inputs their own GPU/CPU and it pops out what's best for them.
@StoyanBorisov · a month ago
GOAT explanation! Have an excellent one yourself!
@Blafard666 · a month ago
They won't get it anyway. I am already reading the same dumb comments; it's a hopeless battle, people are dense...
@StoyanBorisov · a month ago
@Blafard666 Unfortunately, there will always be people who refuse to understand, but that doesn't make the explanation less valuable. This is a visually strengthened explanation for 3rd graders with an interest in computers and gaming... I really believe that anyone who "doesn't get it" is refusing to do so... But yeah, people are dense :)
@simonfil2 · a month ago
I feel like 1% lows are an underappreciated metric to pay attention to. When going from an i7 8700K to an i9 13900K, the frametime STABILITY was the biggest difference.
@krayziembone · 24 days ago
Thanks for the vids, I've been quietly consuming them. This is a great point. I really have gotten a lot more from reading articles or pieces from sites like TechPowerUp as it relates to CPUs, which basically show, as you stated here, no significant difference in most games tested at 4K with a 4090. That is really what is relevant to someone like me, whose CPU usage is consistently less than 20%. In that instance, it's like: buy the best modern CPU within your budget for your other tasks.
@bogdanpavel5 · a month ago
Yes, but while I am gaming I open Chrome tabs to watch YT for guides or browse sites, and have Discord, Razer Synapse, etc. open in the background, which maybe makes upgrading a CPU worth it if you want faster alt-tabs.
@thomasragan2600 · a month ago
IMO, to get this message across it would take running multiple different GPUs on a game, and then running different CPUs until the CPU is the bottleneck. Almost like what developers do for recommended CPU/GPU specs for given settings, knowing anything above will work fine.
@Eternalduoae · a month ago
The 'left' example is very complicated and requires a LOT more testing. I actually put together all the data for a mock review of a GPU using this methodology, and I still haven't published it because of all the complications involved. The amount of work was also monumental... Ultimately, I don't think it's worthwhile from a business perspective, as the reviewer would never make their money back from the time invested...
@shaneeslickАй бұрын
G'day Owen, as someone with a VERY limited PC component budget I can't afford to buy a complete PC, so I buy in steps: save, buy GPU; save, buy CPU/mobo/RAM (or just CPU); save, repeat. The "How will my CPU + GPU situation change?" question is answered when, instead of being lazy and thinking "this channel exists just to answer MY question", you take the time to educate yourself with both GPU reviews and CPU reviews to get max performance from both your current and planned components. First, check the charts: what is my current maximum GPU performance + what is my current maximum CPU performance = where is my bottleneck, do I need a CPU or GPU upgrade 🤷♂ Instead of "😲I have money, gotta buy new thing", use that education to "🤔buy the right thing that will get better performance when I can afford it".
@myroslav6873Ай бұрын
I run a Ryzen 5 7600 with a 7900 XTX at 4K. I get no bottlenecking/freezing at all. Back in the day I had a 3570K; I owned it for almost 10 years, and in that time it never bottlenecked the GPUs I was using with the monitors I had: a Radeon 7870, a 290X, and then a GTX 1080 Ti at 1440p. Across all the games I played, from GTA IV through The Witcher 3 to RDR2, I never needed a faster CPU, only a faster GPU. The only thing I did CPU-side was add two more 4GB memory sticks, making it 4x4GB in total. I know many people buy 7800X3D + 7800 XT, 4070, or even 4060 combos, but I think that budget could be spent better.
@Sejbo8000GamingАй бұрын
I think stability and usage could be important for the left side, since there can be weird hardware issues at times, and also the voltage/power the CPU draws at a given % of usage (i.e., a voltage graph). Not all CPUs perform the same at 30% load because they are built differently; that is why 4K testing could be interesting, showing how the CPU performs at that load with the power it gets. I don't think the differences will be big within the same company/architecture, but the generational changes could be interesting to look at 😄 So I think what people want to know is this: how smoothly does a CPU run when it's below its bottleneck, compared to other CPUs?
@firstlast-cs6egАй бұрын
This need not be overly complicated; there is a middle ground. Say, test 1080p and 1440p with 2-4 video cards: one mid-range card, one high-end card, and cards from both competitors, AMD and Nvidia. Do all of these with the CPUs you're comparing, checking FPS in various games across these permutations. We don't need different settings, IMO. This would be many permutations for sure, but if you are comparing two CPUs it could be manageable, and then for the other CPUs the more comprehensive CPU comparison with more limited permutations could be used. The idea is to give people a general idea, some tools to make educated guesses as to what they need.
@folkereicht5746Ай бұрын
How come there are all these hardware testing channels focused on gaming and this guy repeatedly brings up very good points that no one else is talking about xD
@K31TH3RАй бұрын
The problem is that gaming is changing. RT often has a greater CPU impact at higher resolutions, and frame generation can also alleviate a CPU performance deficit as long as the CPU can maintain a reasonable baseline (at least 60 FPS) where the input latency from frame generation is not too intrusive. The way I determine if my CPU is sufficient for the game I'm playing is to simply compare my GPU usage against my current framerate. If GPU usage falls well below 90% and I am not staying above an ideal target framerate, like 60 FPS (or 120 FPS with frame generation), I know a CPU upgrade will likely be beneficial. However, if my GPU usage is always at 95-99% and I am dipping below 60 or 120 FPS, then it's likely I'd benefit from a GPU upgrade. Pick your minimum acceptable framerate: the more often your GPU usage falls below 90-100% at that framerate, the more likely you are to benefit from a CPU upgrade. While it's not always that simple, and there are other variables like a badly coded game, it's a rule of thumb that usually won't let you down.
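As a sketch, that rule of thumb could be written like this (the 60/120 FPS baselines and 90/95% cutoffs are this commenter's, and the function is hypothetical):

```python
def upgrade_hint(gpu_usage_pct: float, fps: float, frame_gen: bool = False) -> str:
    """Encodes the rule above: low GPU usage while missing the target
    points at the CPU; pegged GPU usage while missing it points at the GPU."""
    target_fps = 120 if frame_gen else 60  # the commenter's baselines
    if fps >= target_fps:
        return "Hitting target: no upgrade needed"
    if gpu_usage_pct < 90:
        return "CPU upgrade likely to help"
    if gpu_usage_pct >= 95:
        return "GPU upgrade likely to help"
    return "Unclear: could be the game itself"

print(upgrade_hint(gpu_usage_pct=70, fps=45))                  # CPU upgrade
print(upgrade_hint(gpu_usage_pct=99, fps=90, frame_gen=True))  # GPU upgrade
```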
@nickv147328Ай бұрын
So the correct way for someone approaching this problem is to take a CPU review and a GPU review that include the game and hardware they are interested in, and take the lower of the two numbers. Have a GPU that can do 200 FPS but a CPU that can only do 100 FPS? Boom, you've got a 100 FPS CPU-bottlenecked build. Take that same 100 FPS CPU and pair it with a GPU that can only do 50 FPS? Boom, now you've got a 50 FPS GPU-bottlenecked build.
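The cross-referencing step really is just a min(); a sketch with invented review numbers (all figures and names below are made up for illustration):

```python
# Hypothetical CPU-limited figures (e.g. 1080p with a flagship GPU)
# and GPU-limited figures, as read off two separate reviews.
cpu_fps = {"CPU A": 100, "CPU B": 200}
gpu_fps = {"GPU X": 50, "GPU Y": 200}

def expected_fps(cpu: str, gpu: str) -> int:
    """A build runs at roughly the lower of its two ceilings."""
    return min(cpu_fps[cpu], gpu_fps[gpu])

print(expected_fps("CPU A", "GPU Y"))  # 100 -> CPU-bottlenecked build
print(expected_fps("CPU A", "GPU X"))  # 50  -> GPU-bottlenecked build
```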
@wojtek-33Ай бұрын
It would be nice if the reviewers didn't leave this up to the viewer, since they have the data and it would make a good video.
@mikfhanАй бұрын
Maybe it's easier to just have a Google spreadsheet matrix of all the combos xD. My philosophy has often been to prioritize the CPU, because it's hard to reduce game settings for more FPS on that side, whereas poor GPU performance can be improved with lower resolution and settings. Also, it's often more tedious to replace a CPU if you have to get a new motherboard too; a GPU is more plug and play, so you might upgrade it more often.
@wojtek-33Ай бұрын
@@mikfhan Yeah, but a new CPU is typically a lot cheaper than a new GPU, especially if you sell the old one.
@andersjjensenАй бұрын
Hardware Unboxed has sometimes done these HUGE "CPU to GPU scaling" videos. The problem is that the number of combinations gets completely out of hand very fast. If you want to test 20 CPUs against 20 GPUs in 50 games at 3 resolutions with 3-run averages, you get 20x20x50x3x3 = 180,000 benchmark runs. Assuming 2-minute runs, that is 250 days of testing with zero wasted time.
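The arithmetic checks out; as a quick sanity check:

```python
cpus, gpus, games, resolutions, repeats = 20, 20, 50, 3, 3
runs = cpus * gpus * games * resolutions * repeats
minutes = runs * 2  # two minutes per benchmark run
print(runs)               # 180000
print(minutes / 60 / 24)  # 250.0 days of non-stop testing
```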
@Conumdrum27 күн бұрын
Are CPU reviews all WRONG? Yours are questionable
@LilMissMurder3409Ай бұрын
Kyle and Steve from [H]ard|OCP tackled this issue way back in the early 2000s. The review cottage industry had focused on pure apples-to-apples comparisons for CPUs and graphics cards alike, and this caused a lot of driver cheating (yes, ATi, I'm looking at you) to maintain framerates at the expense of the actual gameplay experience. The review strategy [H] settled on was to benchmark against an (arguably subjective) yardstick of what they considered good image quality and overall performance. In other words, device A met the yardstick at, say, medium settings while device B met it at high settings. This is a gross simplification, but it ignored absolute framerates and focused on the overall user experience. The industry as a whole needs a return to this type of review. A modern example would be a review that compares an NV and an AMD GPU by listing framerates with DLSS and FSR3 while, meanwhile, back at the ranch, the FSR3 upscaling is terribly noisy and shimmery and wouldn't really pass muster if you cared about image quality. I think a lot of this has to do with developments in high-refresh-rate monitors in recent years, including frame-syncing technology. This has led Joe Average back down the dark path of valuing framerates _über alles_ . Maybe it's really true that human eyesight is generationally deteriorating and it's a hopeless cause.
@Apfeljunge666Ай бұрын
CPU comparison charts with mid-range GPUs in a limited number of games could help customers a lot in figuring out whether they are GPU limited. Because as a normal gamer, it's not easy to tell whether upgrading my CPU would do anything for me without those numbers out there.
@ScottJohnson-v1rАй бұрын
The reality is that with a high-end CPU today, only upgrading the GPU will provide an actual upgrade. But 1% and 0.1% lows should be looked at more closely, along with other feature sets: for example, does pairing an AMD CPU with an AMD GPU matter? Will a new CPU with fast DDR5 be better than an older CPU with DDR4?
@DaShaneo24 күн бұрын
Help me understand where I fit in. I have a simpit and a lot of USB inputs, and I run CPU-heavy programs like MSFS and DCS World. How does reviewing CSGO and other AAA games running Unreal Engine help me decide? RTX 4090, all CPU limited, running an 11th-gen i7.
@mariusx6225Ай бұрын
What should one do? Examine limitless scenarios across limitless applications and games just to buy a CPU? If that's the answer, maybe the CPUs are too weak to clear a minimum threshold of work, gaming or otherwise.
@lilclm64Ай бұрын
Yes, I game on a 4K 120Hz display and I want to see what a setup will do at this resolution. I get tired of people saying it doesn't matter, you're GPU limited... cool, show the results.
@mojojojo6292Ай бұрын
You know what it will do at that resolution from the 1080p benchmarks; it's no different. Look at the GPU review if you want to see what you will get at 4K. How many times do you people need to have this shit explained to you?
@arron5861Ай бұрын
I'm trying to decide if I should get a 5800X3D before they are gone. I only have an AMD 6700 XT for a video card, and currently I'm using a 5700X. I originally got it for the low temps, but with the 360mm AIO water cooler I have, it can handle a lot more heat now. Would it be worth the cost to upgrade to the 5800X3D?
@BhaalTheFleischgolemАй бұрын
It is on the consumer to put a bit of work in first. Get an overlay with a frametime graph and a GPU utilization graph (MSI Afterburner) and take a look at the game scenes you want to improve. Are you GPU limited (98-100%)? Then don't upgrade the CPU. If frametime spikes correlate with GPU utilization dips, there is a chance that a faster CPU can help. But don't expect too much: those frametime peaks may be X% lower with a faster CPU, and that may still not be enough to make them unnoticeable.
@nonenothing4412Ай бұрын
Finally, a video under 12 hours. Thank you.
@konstantinlozev2272Ай бұрын
For me, answering the first question is relatively easy. Just drop to the lowest settings, the lowest possible resolution, and the most aggressive upscaling. GPU usage will probably drop to 70% or lower. The FPS you get then is the maximum you can get, even if you upgrade to an RTX 4090. If you want higher FPS, you have to buy a better CPU.
@travisjacobson682Ай бұрын
Instead of bar graphs with average FPS and 1% lows, I'd like CPU benchmarks to show histograms of frame times. Those can easily be superimposed and would give a far better idea of the relative smoothness experienced with each CPU. Also, any "hitching" caused by suboptimal code would be apparent (assuming the section of the game sampled by each benchmark is fixed/reproducible).
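A minimal sketch of such a chart, with synthetic frame times standing in for real captures (the distribution parameters are invented for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Synthetic frame times in milliseconds: CPU A is fast and consistent,
# CPU B is slower on average and occasionally hitches.
cpu_a = rng.normal(loc=8.3, scale=0.6, size=5000)
cpu_b = np.concatenate([rng.normal(9.5, 0.8, 4900),
                        rng.normal(25.0, 3.0, 100)])  # the hitches

plt.hist(cpu_a, bins=100, alpha=0.5, label="CPU A")
plt.hist(cpu_b, bins=100, alpha=0.5, label="CPU B")
plt.xlabel("Frame time (ms)")
plt.ylabel("Frame count")
plt.legend()
plt.show()  # CPU B's long right tail is the hitching that bar charts hide
```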
@curtismariani6303Ай бұрын
Personally, I think the best way to approach this is in a follow-up to GPU reviews. The reviewer could give min/max recommendations: for example, for a 4070, what is the minimum and the recommended CPU you need on AMD/Intel to get the most out of the GPU at a given resolution. This could consider total build cost, e.g., if a slower Intel CPU needs faster, more expensive RAM but the next tier up doesn't. The aim would be to help gamers go up a tier in GPU without compromising the budget or leaving performance on the table due to bottlenecks.
@jaygames198020 күн бұрын
It might be easier to summarize the advantages and disadvantages. Use some numbers to demonstrate which parts improve.
@StuffedCake21 күн бұрын
CPU utilization scenarios are not static. Operating systems are ever changing, BIOSes are updated, and programs are updated. CPU reviewers tend to focus on launch day, and then those are the only reviews you get until a new version comes out. Hardware Unboxed was the only one I noticed that effectively covered the Zen 3, Zen 4, and Zen 5 improvements with the new 24H2 build of Windows. To this point, the CPU reviews we got when the product launched are obsolete and should be labeled as such, now that we have new data showing a complete difference between today's performance and launch performance. However, we often don't get to see the new data unless it's covered.
@xDAKPRODUCTIONSxАй бұрын
It is valid both to look at the raw power of the CPUs head to head, and to ask how good a CPU would be in real-world results for your specific situation, with your level of GPU power.
@TheHangarHobbitАй бұрын
I think one way a CPU reviewer can help viewers with CPUs that aren't top tier is to simply answer the question: "Can this CPU max out, or get close to maxing out, a mainstream card?" I say this because I have seen CPU reviewers 💩💩ing on lower-end AMD and Intel chips because "it is X slower on these 4090 rigs", but then I see someone test the chip with something actually sane at 1080p, like a 6700 XT or 4060 Ti, and it has the GPU running at 99%, which is IMHO what the end viewer really wants to know. So I agree: if ALL you care about as a CPU reviewer is a chart that lists every CPU with a percentage number, then 1080p with a 4090 is perfect for that. But considering most reviewers use automated or built-in benchmarks, it really isn't too much to ask to do a second run with a more mainstream card. This is why I love Tech YES City: he will often showcase chips the others 💩 on, like the Ryzen 4500/5500 and the Intel i3s and i5s, pair them with a mainstream card like the 4060 or 7600 XT, and show what they can do at 1080p. That gives more useful results to the end user than a reviewer saying something like "this is 40% slower than a 5800X3D and 90% slower than a 7800X3D with a 4090". To use a car analogy: I know a Ferrari is faster than a Honda Civic, but that information doesn't help me figure out whether the Honda Civic is all I need for my daily commute.
@2Burgers_1PizzaАй бұрын
I go by the square root of (1% low x average), like Hardware Unboxed, choosing parts for a particular workload and budget. If those scores meet between a GPU and a CPU, I can have a reasonable expectation they'd show negligible deviation when used together. Depending on budget and pricing, it can sometimes mean I'd opt for an older-gen part in favor of a faster GPU, and vice versa. It has led to some friction in forums, but when I apply that method to alternative builds, they suck.
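Spelled out, that blended score is a geometric mean of the two figures (a sketch of this commenter's method; Hardware Unboxed's exact weighting may differ):

```python
from math import sqrt

def blended_fps(average: float, low_1pct: float) -> float:
    """Geometric mean of average FPS and 1% low: it rewards consistency,
    since a high average cannot fully mask poor lows."""
    return sqrt(average * low_1pct)

print(round(blended_fps(144, 90), 1))  # 113.8
print(round(blended_fps(144, 40), 1))  # 75.9 -- same average, worse score
```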