Anyone saying 1080p is "dead" needs a serious reality check. According to the Steam hardware survey, ~65% of its userbase still uses it. 1440p is at 11% and 4K at a tiny 2.7%. For reference, 1366x768 is at 5.9%. The high-end market, which realistically is where almost all of the DIY PC market sits, is very niche even when it comes to gaming in general.
@vsaucepog7829 A year ago
Just people looking for confirmation bias lol. 1080p is and will still be king for a long time.
@SpartanDusk A year ago
I blame the console advertisements and TVs: consoles advertise running games at 4K, which, while sometimes true, is not always the case. That said, my greedy mind would want a PC that would outperform a console, so a PC with at least a 5600X/12100 and a 3060 Ti, with either a 1080p or 1440p monitor.
@dante19890 A year ago
1080p is dead for enthusiasts unless you're heavy into competitive esports with a 360Hz monitor. It's enthusiasts, not the average consumer, who are interested in these benchmarks
@vsaucepog7829 A year ago
@@dante19890 yeah, but that's a very small minority of the community.
@lucianocortes8636 A year ago
I bet those same idiots use a 23" monitor to play at 1440p or higher, rofl.
@chrishexx3360 A year ago
This video should be classed as a mandatory watch before buying new CPUs. Absolute gold standard content.
@mircomputers A year ago
and GPUs
@MoChuang343 A year ago
19:48 I also understand why viewers want to see what kind of performance upgrade a new CPU will provide for their specific configuration, but you can quite easily work that out with just a bit of digging. All you have to do is work out how much performance your graphics card will deliver in a chosen game when not limited by CPU performance. This can be achieved by looking at GPU reviews: simply check the performance at your desired resolution, then cross-reference that with the CPU data from the same game. Can you please clip this and post it in all of your video descriptions?
@seppomuppit A year ago
mandatory before getting your youtube benchmarking commentor's licence
@TechRIP A year ago
@@MoChuang343 So you want people to do the work for the reviewers? HUB gets paid to do reviews. Hold them accountable for doing them correctly instead of being lazy.
@MoChuang343 A year ago
@@TechRIP I would argue it's HUB's job to provide the resources and data for consumers to make an informed decision, and that is exactly what they do. Reviewers cannot possibly make a video on every single possible hardware configuration. Even with just 3 generations of CPU and GPU releases, that would be an insane combinatorial number of possibilities. But by making videos that show the upper limit of a CPU, and the upper limit of a GPU (at given settings), you can easily determine whether a system with that combo is CPU or GPU limited in that game. And with that info, you can pretty quickly determine whether a CPU or GPU upgrade will help.
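The cross-referencing described in this thread boils down to taking the lower of two numbers: the CPU's GPU-unbound fps from a CPU review, and the GPU's CPU-unbound fps from a GPU review, for the same game and settings. A minimal sketch (all fps figures are made-up placeholders, not review data):

```python
# Combine a CPU review's GPU-unbound fps with a GPU review's CPU-unbound
# fps for the same game/settings: the lower of the two is roughly what
# the pairing will deliver, and it names the limiting component.
# The numbers used below are hypothetical.

def expected_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> tuple[float, str]:
    """Estimate (fps, bottleneck) for a CPU+GPU pairing."""
    if cpu_limit_fps < gpu_limit_fps:
        return cpu_limit_fps, "CPU"
    return gpu_limit_fps, "GPU"

fps, limiter = expected_fps(cpu_limit_fps=137.0, gpu_limit_fps=180.0)
print(f"~{fps:.0f} fps, {limiter}-limited")  # → ~137 fps, CPU-limited
```

The same lookup answers both upgrade questions: if the CPU number is lower, a GPU upgrade gains little, and vice versa.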
@antonyhermens9985 A year ago
Wow Steve... you have an enormous amount of patience explaining these tests. Thank you for the effort and don't get burnout, man. Thanks again
@seylaw A year ago
This saves him a tremendous amount of explanations in the comments though. :D
@justinvanhorne9958 A year ago
@@seylaw It really doesn't. The new standard of internet dweller being targeted here is simply incapable of listening. As I stated elsewhere, you could produce 100 of these videos consecutively, and the moment one benchmark gets released the comments would be flooded. This is what happens when you attempt to apply explanation and logic to situations that ask for none. People are stupid; gamers who walk into a shop and drop 6k USD on a prebuilt are by definition ignorant. It's not their fault, and it's not their entire existence, it's just a way of making you understand what you are up against here. With the ever increasing gap between knowledgeable techies and those simply brought up in it, believe me, I can look at my kids and see it. Intelligence, logic and complex thought can take many forms; when things are taken for granted, people just want them "to work". Now if you'll excuse me, that's enough PC time for today, time to go back to the real events of impending financial collapse of "first world" economies and all the other good things that go with capitalistic societies.
@seylaw A year ago
@@justinvanhorne9958 Sure, but he can always point to this video. You can't be safe from ignorant people, right. But I support him trying to explain this, to educate the people who are willing to learn and to listen.
@justinvanhorne9958 A year ago
@@seylaw That is the unfortunate thing: those educated enough to know better probably aren't passionately spouting off in the comments about something that is the equivalent of putting balls in holes.
@bestrum1188 A year ago
Videos like this are so important and I just want to thank you for all the content you produce. This has been my favorite hardware channel for many years and it is so great to see you getting close to the 1mil milestone.
@darthdadious6142 A year ago
The absolute most useful info was his recommendation to cross-reference CPU vs GPU charts to match up your CPU/GPU combo. I was really wanting to know how to find the maximum GPU my CPU will benefit from. Now I know how to do that. I just wanted a simple chart with all CPU/GPU combos to make it easy. Is that too much to ask? lol, kidding. That would be a nightmare for anyone to compile every time a new CPU/GPU came out... But at a quick look, my 3800X wouldn't need more than a 3070 Ti or a 6800 (16-game average of 137 fps at 1080p). If I upgraded from my 3800X to a 5800X, I could utilize a 6950 or a 3090 Ti (to get a 180 fps 16-game average at 1080p). While I'd have to jump to a 7900 XT or 4070 Ti to get the 200 fps 16-game average on a 5800X3D. Sadly, I couldn't find any recent benchmarks to see how my 5700 XT handles today's games. But based on the original review of the 3070 Ti, I'd see a 45% improvement at 1440p or 55% at 4K. But no 1080p numbers, which is what I game at...
@TheAllhailben7 A year ago
Honestly, you don't want to be CPU bottlenecked in a game. If you're GPU bottlenecked, the FPS might go from like 90 to 80 sometimes. If you're CPU bottlenecked, you could be bouncing back and forth between 75 and 100 while your gpu struggles to be fully used constantly which is not a good experience. If you're running anything like a 3080/6800xt or better, at least a 1440p monitor is recommended because you get all of the performance you're paying for.
@SashimiSteak A year ago
In a nutshell, we are trying to compare the "maximum performance potential" of CPUs, so we need to do our best to create a CPU bottleneck scenario. Edit, to elaborate: the test is an artificial scenario designed to compare, not to show "real world performance". Because CPU/GPU utilisation varies too much from game to game, "real world performance" swings up and down so much it never shows any meaningful patterns. The CPU you'd want to buy depends wholly on what games you play, so it's impossible for reviewers to tailor-make their reviews to your needs. "Maximum performance potential" is a far more universally meaningful metric. You can actually extrapolate real world scenarios by taking the "maximum performance potential" shown here for the games you play, finding a review of your GPU to see how many frames it can push in those games, then matching the closest-performing CPU/GPU. This is one way to optimise your purchases. But you still need to do some calculations; you can't expect the data to be spoon-fed to you.
@jason6296 A year ago
^^^^^^ ??????? lol
@chadhowell1328 A year ago
@roenie I'm sorry but what?
@GewelReal A year ago
@Roenie It will absolutely matter. Games not only get more GPU demanding but also more CPU demanding. Otherwise we'd still be using Pentium IIIs
@araqweyr A year ago
@Roenie It's not like games only use the GPU, or rather, the CPU isn't needed only to shove data into the GPU. There are also game-related computations on the CPU, and they have to be done for each frame. We might not be able to see the difference right now, but if (or rather when) developers start utilizing the CPU better, the gap between CPUs will start to widen
A year ago
@Roenie Here in the PAL region we used to play at 24fps in 576i, so we haven't just come a long way up in graphics quality and resolution, but in framerate as well (I usually target 60fps on my PC, but my screen can go up to 144). Different starting framerate and resolution in NTSC, but same history.
@4zur3 A year ago
The concept is to never limit one tested part by its counterpart. Always test a GPU with the most performant CPU; vice versa for CPUs, testing at low resolution so you are not limited by the GPU. This concept has held for decades. Every gamer should take the time to learn and deeply understand this.
@r1pperduck A year ago
Takes no time at all, it's a very simple concept and you explained it perfectly.
@meurer13daniel A year ago
Agree. If you are entering the PC gaming market, you should at least know how the hardware works. It doesn't take long to learn, and you will buy the best product for you.
@sammiller6631 A year ago
performant? shouldn't that be "performing"?
@YurandX A year ago
@@sammiller6631 An artist can be performing; a CPU is just most performant. idk if it's correct but it makes sense to me
@meurer13daniel A year ago
@@sammiller6631 Performing = verb Performant = adjective Performant in this context is the correct word
@Cinetyk A year ago
The thing to understand is that a CPU review/benchmark is one thing; building a "reasonable" configuration (CPU+GPU) and seeing what kind of performance you get is another. They're two different kinds of "studies", looking into related but different things. Like Steve said, you can combine CPU and GPU reviews to work out what kind of performance you can get with your desired pairing. I hope you continue to do the "scaling" videos where you see how a CPU scales with increasingly powerful GPUs and vice-versa; this truly makes it possible to get a clear picture of performance.
@markhackett2302 A year ago
It is far worse than that. WHAT GPU is there? "Oh, it has to be an Nvidia one, because they have like 10x the market share of AMD!!!", and since the speed of a 6600 XT and a 3060 is about the same, there won't be an AMD card on the roster. And any balanced system will spend money on a GPU rather than a CPU, and bottleneck to "the CPU is largely irrelevant", because even in "CPU tests", the GPU bottlenecks first. Indeed the GPU is rated not to the CPU but to the monitor, so the CPU+GPU combination is 100% irrelevant; it is the GPU+monitor that should be combined. So it isn't "who buys a low end $120 CPU to use with the $1600 4090", it is "who buys a 4090 but runs it on a 1080p monitor?"
@Spido68_the_spectator A year ago
@@markhackett2302 they used the 6900 XT for 1080p testing for a while
@aratosm A year ago
No one likes you.
@MarcABrown-tt1fp A year ago
@Mark Hackett You simply don't understand that in order to find out how powerful any given CPU is in single-core or multi-core performance, you can't be GPU bound. The resolution is irrelevant in CPU testing.
@GeneralFuct A year ago
@@MarcABrown-tt1fp the data he presented literally shits on everything you just said
@AM-qc3vk A year ago
The Call of Duty Modern Warfare 2 benchmark has a cool feature that shows you how many fps the cpu delivers vs the gpu. Pretty cool to see that data directly in a benchmark and to see where the bottleneck is in your system.
@invalid8774 A year ago
Well, it's usually not that hard to spot the bottleneck. If your GPU is at over 95% load, it's a GPU bottleneck. If not, it's most likely a CPU bottleneck.
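That rule of thumb can be written down directly. A rough sketch, with the 95% threshold taken from the comment above rather than from any formal standard, and the sample values invented for illustration:

```python
# Classify the likely bottleneck from sampled GPU utilisation (%).
# Sustained load near 100% suggests the GPU is the limit; well below
# that, the CPU (or something else, e.g. an fps cap) is holding it back.

def likely_bottleneck(gpu_load_samples: list[float], threshold: float = 95.0) -> str:
    avg = sum(gpu_load_samples) / len(gpu_load_samples)
    return "GPU" if avg > threshold else "CPU (or fps cap)"

print(likely_bottleneck([98.0, 99.0, 97.5]))  # → GPU
print(likely_bottleneck([60.0, 72.0, 65.0]))  # → CPU (or fps cap)
```

In practice you'd feed this utilisation readings from a monitoring overlay; averaging over several samples avoids misreading a momentary dip.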
@pendulum1997 A year ago
@@invalid8774 What?! My 3070ti is a bottleneck with your logic then?
@ungoyone A year ago
@@pendulum1997 Why wouldn't it be? It's not a high-end card.
@dnalekaw4699 A year ago
@@pendulum1997 well yeah of course it is
@MoltenPie A year ago
@@pendulum1997 depends on the resolution and settings. Your 3070ti can bottleneck even an i5 2500k if you push for 8k ultra. And vice versa if you set 720p with 25% resolution scale on low settings even an i9 13900ks would probably become a bottleneck with your 3070ti.
@qlum A year ago
While I was fully aware that there is a difference in CPU performance, this really shows it's time to upgrade my aging 8700K @ 5GHz now that I've upgraded my GPU to an RX 7900 XT. Especially since a lot of the games I play may be more CPU bound than the games regularly benchmarked. However, I do tend to spread out my upgrades a bit so as not to spend a load of money all at once, so it's probably something that's going to wait till the middle of the year. The GPU upgrade was definitely more needed (GTX 1080), not only for performance but also because Nvidia's Linux drivers are hot garbage.
@MrHeHim A year ago
Generally CPU bottlenecks can be felt/seen in game as hard stutters, while a GPU bottleneck of course means lower FPS but still relatively smooth. I personally would match it with a 7700X; if you're lucky enough to live near a Micro Center they are giving away 32GB of DDR5-6000 if you buy one. BUT the motherboards are over $250, so it still comes out to around $600. BUT (more butts) they have a $600 bundle with a 7900X, G.Skill 32GB DDR5-6000, and an ASUS B650E-F ROG STRIX GAMING WIFI 😅 I'm personally happy with my 6800 XT and 3700X, so although I almost made the purchase twice, I backed out at the last second, because then I'd want a 4080 or 7900 XTX to match it 🥲
@qlum A year ago
The Paradox games I play benefit greatly from X3D. As for Micro Center, I live in the Netherlands so there is nothing of the sort here. Checking around, I did notice the 7700X dropping quite a bit in price to €300 in the last few days, and that does look interesting. Still, with the first X3D parts around the corner, I can at least wait for reviews/benchmarks of those. With a decent AM5 motherboard and a decent kit of memory that would be around €720 including VAT, or roughly $635 excluding VAT. Not too terrible either with no bundles.
@MA-90s A year ago
This guy doesn't pull his punches and delivers outstanding content. We need this. Thank you.
@RNG-999 A year ago
Doesn't*?
@MA-90s A year ago
@@RNG-999 Does not*?
@zenstrata A year ago
He's wrong though, because eventually 1080p will be a useless test benchmark, because nobody will use it. Otherwise they should be using 144p or even 480p for 'testing'. If he were entirely right, he would go to the extreme and be 'more right'. Instead he's only 'partially' right, and that won't last, because eventually 1080p will be consigned to the dustbin of history, just like 144p and 480p are today.
@MA-90s A year ago
@@zenstrata "Eventually" doesn't cut it. This is the here and now. Right now, he is 100% correct.
@Kuriketto A year ago
@@zenstrata No, he's not wrong, and you've completely missed the point of the video. Testing at 1080p with top-end GPUs gives the audience an idea of how much a given CPU might scale with GPU upgrades years down the line, because it's by far the resolution that most gamers use today and will be using even several years from now, likely even after multiple GPU upgrades. Games will continue to become more demanding, and as an example, somebody building on a budget may need to keep upgrading their GPU just to maintain an acceptable level of performance at 1080p, while their CPU remains untouched. The idea that 1080p might one day fall out of mainstream usage is irrelevant.
@The_Noticer. A year ago
I can't believe we're still having this conversation. I'm so glad I'm no longer on computer-related web forums, because I'd have pulled my hair out trying to explain this to people. I've done it so many times, made the same tired arguments.. but people see a result they don't like and immediately revert back to this grug mentality. Same with the "bottleneck" argument. Brutal, just absolutely brutal, trying to tirelessly explain this. Bless your patience, I would've long lost that patience by now.
@umamifan A year ago
Computer and tech forums are the worst places to be on the internet. Too many armchair experts who get upset the moment a disagreement happens...
@YeaButCanUDoABackflp A year ago
For real. It's not even that complicated.
@Bareego A year ago
People are not stupid, they're just not as rich as you are to make that argument. Until you get this, there's no point arguing.
@mcbellyman3265 A year ago
@@Bareego They are stupid. It doesn't require money to understand basic logic. In fact, the less money you have, the more important it is to understand this stuff, as it requires you to be more selective in buying the appropriate equipment. Rich people can just buy the best CPU and best GPU and ignore all this.
@The_Noticer. A year ago
@@umamifan Just like with most things now, people are hardwired to see people with different opinions as an enemy beyond reproach. There are SOME academic arguments against using 1080p data, such as it not being very good for measuring the ABSOLUTE performance of a specific CPU in a specific title, since increasing the resolution also has a detrimental effect on CPU load. It has to process more geometry data, especially now with ray tracing. However, when used as a RELATIVE test against other CPUs under the same load, it's still perfectly valid, as they are all tested under the same conditions. But when something as simple as what Steve is explaining is lost on these people, the finer points will be lost on them even more so... As I said, I'm glad I no longer dwell on LTT-like forums.
@JarrodsTech A year ago
Great explanation, I will definitely be linking to this video in a number of my own comments 👍👍
@pf100andahalf A year ago
Hi Jarrod!
@martineyles A year ago
Only half helpful though. In the desktop space, it's good to test CPUs with a high end GPU. However, reviewers have acknowledged that there is increased CPU workload when using ray tracing at 4K, so it's possible a low end CPU might not provide the framerate we want at 4K, meaning CPU benchmarks at 4K are still useful.
@tyre1337 A year ago
always better to link to a reviewer that can be trusted and isn't manipulated by Nvidia
@michaelrobinson9643 A year ago
In a world where people say "I feel" as an expression of thought, I'm not surprised many do not understand scientific or engineering rigour. The concept of isolating variables to objectively examine each is too horrible to think about for many I expect.
@PK-lk5gs A year ago
Great piece of testing! "The games you play aren't the same games everyone plays" - I'm shocked to realize how this very basic point is not obvious to everyone. Thank you very much, Steve!
@flameshana9 A year ago
It is a problem when _everyone_ assumes you need 90+ fps to game.
@masudparvejsani A year ago
@@flameshana9 It is a problem when everyone assumes you need to max out everything you see in graphics settings.
@TheDude50447 A year ago
As an RTS player I know this better than most. Some people find it hard to believe that I generally favor cpu over gpu performance for gaming.
@Machinens A year ago
@@TheDude50447 same thing with MMO's, I need more CPU horsepower for VFX and loading into the areas and players as well
@Beefy-Tech A year ago
I tested a 4090 at 1440p competitive settings in Warzone 2 with a 5800X3D, to test CPU bottlenecks in Warzone 2, and me, a guy with 169 subs, got comments saying the 4090 is not meant for 1440p, but for 4K only... as if 360Hz 1440p does not exist, or is it assumed that most professionals play at 4K res vs 1080p/1440p high refresh xD? The point of a CPU test IS to bottleneck it, to see which one's faster or how far in FPS it can take the GPU.
@elirantuil5003 A year ago
Probably children. 2080ti was also a "4k only" card, let's see them gaming on 4k with it now.
@mrbobgamingmemes9558 A year ago
@@elirantuil5003 especially on forsebroken (forsespoken)
@Threewlz A year ago
360Hz is copium, isn't it? Hell, even 240Hz is kinda overkill. How many games are you getting 240fps plus in (apart from Valorant and CS:GO) without spending thousands of dollars on a PC? 144Hz is enough and should be the norm a few years from now, I believe
@TheGreenReaper A year ago
@@Threewlz They tested this with fighter pilots. 144Hz is not the limit of human perception. 240 Hz is closer. 360 Hz is likely beyond a noticeable return.
@BlahBleeBlahBlah A year ago
@@Threewlz projecting much? Sure it isn’t a concern for you (or me TBH). However I understand there are people who want and can afford the hardware to give them super high frame rates @ 1080p.
@domm6812 A year ago
Good vid, but I suspect a lot of the people complaining about 1080p testing may not understand the definition of what a bottleneck is in this instance, how the CPU becomes one, and the fact that CPU performance is masked/hidden at higher resolutions. In short... go back to the very basics so there can be no confusion.
@pf100andahalf A year ago
Exactamundo. He needs to make a 60 second short too. It probably wouldn't hurt to make several follow-up videos to this one showing the concept in different ways for those in the back... of the short bus.
@UnStop4ble A year ago
my problem with this video is that it could've been at least 50% shorter and still would've gotten the point across, maybe even better because of people's low attention spans
@moldyshishkabob A year ago
@@UnStop4ble Frankly, I think the people with such a low attention span that they don't understand this may not want to put in the effort to understand it in the first place. Because why are they watching a channel that has an average run time on videos that's well beyond 15 min? There's a reason LTT has millions more subscribers than HUB or GN, as S-tier as their testing is.
@JACS420 A year ago
@@UnStop4ble yeah, tbh I've built enough PCs. I've sold graphics cards that, once overclocked, performed like the next tier's non-Ti/Super variant. Meanwhile other GPUs/CPUs can't handle 150+MHz OCs. One customer I sent on his way with a 3060 Ti boosting up to 2050MHz. I warned him, but he popped into my Discord not too long ago claiming it's still kicking.
@huggysocks A year ago
Nope, I have literally zero respect for people commenting on how people do their work when they haven't bothered to learn the most basic things about the topic. It's like backseat driving from a toddler, only dumber, because they are older and should know better.
@dandylion1987 A year ago
Uh oh. You do realize all viewers here have 150IQ or more and know the intricacies of PC technology inside out, right ?
@Eternalduoae A year ago
Dude, I have an IQ of above 500 at 360p!
@godblessbharat708 A year ago
@@Eternalduoae sounds like frame rate
@lukeshackleton4775 A year ago
Sounds like a joke that went over a head
@stephenbull2026 A year ago
@@lukeshackleton4775 Sounds like a joke which, if you don’t get, means your IQ is the bottleneck. 😂
@petrnovak7235 A year ago
Sadly, the comments shown at the beginning of this video tell a different story 😞
@earvin4602 A year ago
I agree with everything said here about CPU benchmarks, and I'm still adamant that a CPU "scaling" benchmark is incomplete and misleading without being paired with corresponding GPU scaling benchmarks. Let's take Steve's example about wanting to retire the 8700K: while the 4090 benchmark suggests a 98% uplift potential in Watch Dogs when switching to a 13900K, only 6% will be achieved without retiring the paired 3060 alongside it. And for that information, the presumed-meaningless GPU-bound benchmark is actually very valuable.
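The shape of that argument can be checked with the same min() reasoning the rest of the thread describes: the uplift you actually see is capped by whichever component limits first. A sketch with hypothetical fps limits chosen to mirror the comment's 98%/6% example (not actual benchmark figures):

```python
# Uplift from a CPU upgrade is capped by whichever component limits
# first. Numbers are illustrative stand-ins, not benchmark data:
# old CPU limit 85 fps, new CPU limit 168 fps, 3060-class GPU limit 90 fps.

def uplift(old_cpu_fps: float, new_cpu_fps: float, gpu_fps: float) -> float:
    """Fractional fps gain from a CPU swap with a fixed GPU limit."""
    return min(new_cpu_fps, gpu_fps) / min(old_cpu_fps, gpu_fps) - 1.0

print(f"{uplift(85, 168, 10_000):.0%}")  # effectively no GPU limit → 98%
print(f"{uplift(85, 168, 90):.0%}")      # held back by the GPU → 6%
```

Which is exactly why both data sets are needed: the CPU review supplies the CPU limits, and the GPU review supplies the cap.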
@brutlern A year ago
You need 2 categories: the productivity people, who are extremely interested in CPU performance, and the gamers, who DO NOT CARE about CPU performance. The gamers only want to know the cheapest CPU that will not bottleneck a mighty 4090 with settings cranked up to 11. If all CPUs perform (mostly) the same at 4K, then that is the information that we, the gamers, need to know. We need to know if the old crappy CPU is still good enough for whatever extremely expensive GPU we may get, or if it's time to retire the old dawg and upgrade the CPU as well. The 4K gaming chart for CPUs might be the most boring chart ever (basically a rectangle), but it's the most crucial information a gamer needs when thinking about that new GPU upgrade. Case in point: in this video the 4K Watch Dogs Legion test provides exactly the info we need. For a 4090, you need a faster CPU; that i3 is not good, it doesn't matter that it's still 100 fps, you are leaving 50 fps on the table. But if you're only upgrading to a 3080, then the i3 is still good enough, and you can save a buck today and upgrade later.
@reaperofwaywardsouls A year ago
Great video. I think that including 4k benchmarks is a great way to remind some folks that they don't necessarily need a CPU upgrade (people that game at 4k). No need for the full suite of benchmarking games though, maybe 2 or 3 should get the point across. You may be surprised about the amount of people who look at 1080p graphs and transpose it to 4k in their minds, and then are disappointed when they don't see a comparable performance uplift when they upgrade (and game at 4k).
@charlestwoo A year ago
I just want cpu gaming benchmarks to have 3 games shown in 4k so I can see it clearly, that is all. I know 1080p is where u lean on the cpu the most but I still want to see 4k damn it.
@Strix-gp9xg A year ago
That is exactly why they should always include a 4K benchmark results when testing CPUs.
@chrzonszcz323 A year ago
For the future I would suggest including strategy games like Stellaris, RimWorld or Factorio. These are very CPU intensive games where you will be able to see the difference between CPUs much more easily
@rENEGADE666JEDI A year ago
But in Factorio, processors with 3D cache are in a different league. And, for example, Star Citizen would open people's eyes to how much this technology gives ;)
@TleafarafaelT A year ago
I'm tired of watching AAA games in CPU comparisons when we know there are indie games with an insanely hungry appetite for a chonky CPU
@Daeyae A year ago
My r5 1600 and my friends r5 5600x are night and day in stellaris, his is orders of magnitude faster.
@thentil A year ago
What metric exactly do you want them to show for these games? The productivity charts should give you a good idea of how they'll scale in these games.
@saricubra2867 A year ago
@@TleafarafaelT Indie games don't have good optimization unless they do.
@MrHamof A year ago
11:34 Well, it is. In that it tells us that there's no point buying anything significantly faster than a 13100 if you're going to be pairing it with a 3060. (At least in this game at very high settings) But that's what the GPU scaling videos are for.
@thisguy317 A year ago
Awesome video, guys. We need more videos like this. I'd love to see more GPU reviews include old CPUs mixed into the results, just to get a feel for the FPS cost associated with not upgrading my CPU in tandem.
@KevinM0890 A year ago
You can do it on your own: create a CPU limit on your system and you'll know whether your CPU needs to be upgraded to reach the FPS numbers you need :)
@codyl1992 A year ago
@@KevinM0890 not really, because that doesn't take everything into account. Clock speed and core count aren't everything for this stuff.
@jaymacpherson8167 A year ago
Thanks Steve for making it clear how the hardware compares using a baseline standard. That apples-to-apples comparison empowers the individual PC builder when mixing and matching their hardware.
@andrewcross5918 A year ago
I just wish there were more non-FPS metrics tested, like tick rates in Stellaris or turn times in Civ 6 and other 4X titles. I also wish there was a wider array of game types tested, like late-game ARPGs such as Path of Exile, or MMOs. The kind of load these titles put onto systems is totally different to many AAA-style games, so seeing how these perform would add a lot of breadth.
@HexerPsy A year ago
MMOs are particularly tricky to test due to the uncontrollable variable of the other people in the game. No busy city is the same, and testing in an empty field is kinda pointless. As for the rest... idk, if enough people want Civ 6 in the comments, then it'll happen? I guess the reviewer mostly tests the games that the audience wants to see / plays?
@lexiconprime7211 A year ago
I, for one, would love MMO testing in raids and dungeons (WoW, FF14, BDO, Gw2, etc...). But I think the reason they don't is because there's no way to eliminate variables that could potentially skew the data.
@rk9340 A year ago
@@lexiconprime7211 It's a bit pointless. You just lower the settings to where it can run 60/144 fps lows consistently. Things like random texture loading times are influenced by your hard disk more than anything else, and the rest of your system can actually freeze to a halt while waiting for that data.
@andrewcross5918 A year ago
@@rk9340 In plenty of MMOs there are no GPU settings that will do that in large raids or PvP, because the CPU is the bottleneck. Same for Stellaris et al. Low tick rates mean you need the patience of a saint, or you stick to smaller maps with fewer other civs for it to remain playable into the end game and beyond. Just checking Steam: Civ 6 is 22nd in the top player list, Hearts of Iron 4 (Paradox grand strategy) is 34th, Cities is 37th, Path of Exile is 42nd. All ahead of CP2077 in 54th. Then you have EU4 in 69th, Stellaris in 74th and CK3 in 88th. I don't see Hitman 3 or Rainbow Six Siege or Tomb Raider or Horizon Zero Dawn in the top 100, yet they get tested over far more played games. Edit: adding up how many people are playing those 4 grand strategy titles (I believe they use the same base engine), you are in and around the player count of Call of Duty at 80k-ish. So on the whole, very very popular.
@lexiconprime7211 A year ago
@@HexerPsy That's kind of why it would have to be primarily a cpu test in controlled environments like raids or dungeons, where the number of players and NPCs is always the same. But even then, you would need a consistent crew of dozens of people all repeating the same actions ad nauseum, which is why I think no one would do it. It'd be VERY useful for someone like me though, since I play a lot of MMOs.
@russellmm A year ago
This was a great video! Nice job. I think this also shows why doing productivity testing is so important, as it also isolates the CPUs from the GPU.
@zakelwe A year ago
Exactly. All the CPU gaming tests this site does are rather wasted, because if it is not GPU limited, who cares whether it is 200 or 150fps, apart from a very small minority. What's more important is how long it takes to do something useful such as video conversion etc. If you are playing games, then the vastly more important thing is to spend money on the GPU. The 5800X3D was better at gaming but worse at everything else, and cost more. It was just not balanced.
@albundy06 A year ago
@@zakelwe Everything you post is a waste. Don't watch it. Don't comment.
@siewkimng1085 A year ago
Great explanation, while I certainly understand the need to avoid GPU bottlenecks while testing CPU performance, it’s great to see it articulated in such a clear manner.
@5i13n7 A year ago
I find the 'mixed data' chart very interesting. I would greatly appreciate it if you guys could do a video showcasing the expected gains from historical CPU architectural upgrades at various resolutions in some of the more popular gaming titles, e.g. a 4090 with each of a 2700X, 3700X, 5700X, 7700X, 8700K, 9700K, 10700K, 12700K, 13700K. Even just a couple of games to compare would be great, to see the actual expected gains from a CPU generation upgrade at a user's target resolution. I, and likely many others, tend to stretch my PC build as long as its legs will carry it, and a lot of the charts for new hardware include only the last generation or two to find the generational uplift. Enthusiasts may be upgrading every generation, but I'd wager more people are considering upgrades from several generations back, and could use this information to choose the most suitable upgrade for their use case.
@kjgasdfduhigsdauf Жыл бұрын
Didn’t realize people questioned this practice. Once again shows how little common sense gamers have
@skinscalp222Ай бұрын
lol I'm currently arguing with a genius on Reddit who thinks that when you're CPU bottlenecked at 1080p, you can just increase your resolution to 1440p and you will get more performance, because your CPU usage goes down and the GPU usage goes up. Bro, your GPU usage goes up because it now has to process more pixels, not because it's processing more fps! And your CPU usage goes down because it doesn't have to prepare as many frames at the higher resolution as it did at the lower one! It's mind boggling how many people can't understand something as simple as this 😂
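The arithmetic is simple enough to sketch in a few lines of Python (made-up throughput numbers, purely for illustration, not measurements of any real CPU or GPU):

```python
# Toy bottleneck model: the GPU can shade a fixed budget of pixels per
# second, while the CPU can prepare a fixed number of frames per second
# regardless of resolution. All numbers below are hypothetical.
def fps(gpu_pixels_per_sec, cpu_frames_per_sec, width, height):
    gpu_fps = gpu_pixels_per_sec / (width * height)  # GPU-limited frame rate
    return min(gpu_fps, cpu_frames_per_sec)          # slowest side wins

print(fps(400e6, 144, 1920, 1080))  # 144 -> CPU-bound at 1080p, GPU has headroom
print(fps(400e6, 144, 2560, 1440))  # ~108.5 -> GPU-bound at 1440p, fps went DOWN
```

Raising the resolution never raises the frame rate; it only shifts which side is the bottleneck.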
@aaronwhelan1 Жыл бұрын
So true!! Moving from an i7 2600K to an R7 7700X with an older 980 Ti felt the same as when I moved from a GeForce 680 to a 980 Ti with the older 2600K.
@OutrunCid Жыл бұрын
Now that 2600k is an oldie of course. Will be making the switch to the same R7 soon whilst still on an R9 Fury. Not expecting wonders to happen, but I know that in those cases where I was CPU limited, improvements will be experienced. I just need to listen to my GPU right now to know which games these are :D
@alexm7777 Жыл бұрын
An i7 2600K and a 980 Ti would make for a nice budget rig.
@Daeyae Жыл бұрын
@alex m Sadly it's just cheaper to get an old Ryzen or a new i3 most of the time.
@MalHerweynen Жыл бұрын
@@Daeyae Depends what's on the used market; right now I can't find any reasonably priced Ryzen parts that aren't like 90% of the price of buying new.
@Daeyae Жыл бұрын
@Malakai Hawaiian True, in my area a 2600K would be like £15 and an 1155 mobo about £30, and DDR3 is super cheap too. A 1600 is about £30-£55 with a cooler, A320 boards are about £25+, and DDR4 is cheap but not as cheap, and Ryzen loves fast RAM so you'd want decent stuff. I guess it also depends how much £15 is to your budget: if you're trying to spend under £300, £15 is 5%, which is quite a bit, but is it worth losing out on 2 cores and a small speedup?
@tortuga7160 Жыл бұрын
1080p is still alive and healthy. 65% on steam survey, and LG is coming out with a 500hz 1080p monitor. I'm pretty sure LG did the calculus before deciding to make that panel.
@OutrunCid Жыл бұрын
The Steam survey is far from representative. Its 1080p numbers are more likely to be pushed upwards by ancient PCs and laptops than by ultra-high-framerate monitors. Unfortunately it does not measure the refresh rate used. I think the numbers shown by Steve are more representative for those looking into high-end hardware than those presented by Steam, though.
@alsamuef Жыл бұрын
Moronic product.
@Hyperus Жыл бұрын
@@alsamuef How so?
@Kuriketto Жыл бұрын
@@OutrunCid How is the Steam Survey far from representative when it literally just pulls system specs? It might be nice to know what refresh rate is used by most gamers, but the hardware survey largely focuses on computer hardware and baseline software, not in-game settings or monitor makes and models. LG also is likely very aware that extreme refresh rate monitors are a niche category and haven't churned out millions of them, but one also needs to recognize that with today's technology, producing a 500Hz monitor is only achievable at a max resolution of 1080p anyway, and while using a TN display at that. In that context, the Steam hardware survey resolution results are slightly irrelevant, but the survey as a whole can at least inform what percentage of the userbase has hardware capable of driving a 500Hz display to its full potential, though realistically only a handful of games can even achieve 500+ FPS given modern hardware. And as far as benchmarks on this channel go, it has people with ancient hardware, someone like a pro player who wants the absolute best and everybody in between, covered. As Steve pointed out, just do a little cross-referencing.
@CelestialGils Жыл бұрын
@@OutrunCid Not really. You have the misconception that Steam runs the survey automatically on every computer that has Steam installed. That's not how it works. The Steam hardware survey is random, and you have to accept it in order for your system to be part of it. You can trigger it, of course, but most people don't do that. You can also see the data in more detail, based on what hardware or resolution makes up that month, not just the change where all the surveys are taken into account.
@tkllluigi Жыл бұрын
When someone can explain something simply, it means they fully understand the topic. One of the best hardware education videos; well done!
@MrNside Жыл бұрын
I can still see the argument from the other side though. While it's nice to know the limits of a CPU, a realistic perspective is still important for someone who never plans on going near those limits, or upgrading their GPU to the highest-end. Using your examples, the 8700K blows the 2600 out of the water with a high-end GPU, but costs almost twice as much. But with a more realistic GPU, the difference was only a few FPS. While knowing what a CPU is capable of is important from a benchmarking perspective, it doesn't really tell them much when they go to build their system, particularly a system on a budget. If someone were building with the GTX 1070 or the RTX 3060, and wanted a good CPU recommendation to pair with it, they would have a hard time seeing that the 2600 or the i3 13100 were actually decent options. If they only saw the high-end GPU data, they would think both of those CPUs were trash.
@halistinejenkins5289 Жыл бұрын
16:56 is why I started watching your channel. Not really complaining, but this kind of scaling content seems to have been lost over the past two years. I love the CPU/GPU scaling content, because it shows you the cutoff point for certain platforms at 1440p high-refresh gaming (144-165) with different GPUs. I guess it's hard to get it all in, so everyone can't get what they want. Dunno, maybe it's just me, but the inner nerd gets excited when I see how far the 8700K can go with today's GPUs before you start leaving noticeable performance on the table. Regardless, cheers! Been watching since the beginning and will continue to do so.🙂
@mlqvist Жыл бұрын
Thanks for videos like these Steve, they are the one opportunity I have to feel smart. How do people not get this stuff?
@ComputerGeeks-R-Us Жыл бұрын
Love the work you folks are doing! I've been in IT for decades, including benchmarking and standardizing client hardware platforms for a Fortune 250 for over 20 years. The understanding of the interdependencies and limitations between components isn't exactly straight-forward. I've also run across some really weird inconsistencies that the OEMs couldn't even explain. It's nice to see Hardware Unboxed doing the hard work and sharing their knowledge. Keep up the good work!
@jimatperfromix2759 Жыл бұрын
I agree with @gregorymchenry1464 and many others that this is both a well-needed and extremely well done video. Most people with IT careers like Gregory and myself understand the general principle espoused in this video very well. But the average Joe or Jane buying gaming hardware typically doesn't have exposure to this general principle of performance analysis. Just to reinforce that Hardware Unboxed is right on target in this video, let me more formally state the general principle. (a) In any computerized system there are typically multiple system components, each of which *might* contribute to slowing down the response time of the entire system to a low enough level such as to disappoint the user(s) of the system. (b) Given any specific configuration of components into such a computer system, it is almost always the case that just one of the several system components contributes the most to slowing down system response time. (c) Within the discipline of computer performance analysis and system tuning, point (b) is generally very useful and is called the "bottleneck principle." (d) The rare exception to having a single bottleneck is actually literally the goal of perfect system configuration, namely to have all potentially bottlenecking components hit bottleneck state at the same system load - that is, you're not overbuying on any given component and all components sort-of run out of gas at the same time. (e) It's fairly hard to perfectly tune a system (as per (d)), yet we can use the bottleneck principle to try to at least ensure that one system component is not "horribly bottlenecked" relative to the other system components. (f) So although configuring computer systems is somewhat of an art (for which you can perhaps hire Gregory to help you out), you can't mess up too horribly if you at least keep the bottleneck principle well in mind. 
(g) In a computer system designed for gaming, as long as you also take care of important extra issues such as making sure you have enough system memory and making sure your primary disk (plus as many other disks as possible) are SSD disks, then your main decision has to be focused on the two major potential bottlenecking points, namely the speed of the CPU and the speed (and to a lesser extent video memory) of the GPU. (h) In looking at reviews of potential CPU/GPU purchases to configure in your gaming system, look at a wide range of reviews, but take with a grain of salt any reviews that do not properly respect the bottleneck principle as apropos. (i) For CPU and GPU (respectively) reviews in the context of gaming PCs, that means in practice that CPUs should always be tested with "best available" GPUs, and GPUs should always be tested with "best-available" CPUs. (j) Other types of sub-reviews within a broader review article are sometimes useful to provide additional scope, but if you smell that the reviewer doesn't understand the usefulness of the bottleneck principle in doing hardware reviews, run away. (k) A recent example of a review that totally botched the bottleneck principle (and from a normally highly respected reviewer) was when LTT did a gaming oriented review of new 7000X series AMD CPUs, but they didn't have the patience to wait 2 more weeks to get their hands on a new Nvidia 4090 card to do those CPU reviews with, such that the 3090 they used wasn't fast enough for a proper review of those very-fast CPUs, and they came to erroneous conclusions about the lack of speedup given by the new AMD CPUs. Bear in mind what this Hardware Unboxed video teaches, and you're much better equipped to understand the implications of all CPU and GPU reviews. I would, however, toss out one somewhat more obscure point that requires a bit more nuance in understanding better. 
By nuance I mean that it's a bit more complicated and might be worthy of additional discussion (whereas the bottleneck principle is a slam dunk). Specifically, what I've got in mind here is the additional recommendation to do primary testing at 1080p - and I would like to modify that slightly. Substitute for 1080p the most-common low-end resolution "of your era," or perhaps do double testing on both the most-common low-end resolution of your era, plus the most-common middle-end resolution of your era. This is a sliding scale across macro time, since in the next era the low-end resolution will be at 1440p, and way down in the future we might see such great CPUs and GPUs that 4K gaming would then become the "minimal" resolution for testing. I mean, if nearly everyone is actually gaming at 2K or 4K, there is no point in differentiating 1080p performance of 1000 frames per second from 1080p at 500 frames per second. That having been said, for this "current era," it looks like testing at 1080p will be fine for at least a while longer. I want to quote in its entirety the comment in this thread by @puokki6225, who says "Anyone saying 1080p is 'dead' needs a serious reality check. According to Steam hardware stats, ~65% of their userbase still uses it. 1440p is at 11% and 4K at a tiny 2.7%. For reference, 1366x768 is at 5.9%. The high end market, which realistically almost all of the DIY PC market currently is, is very niche even when it comes to gaming in general." So in our era, we DIYers can safely ignore the testing needs of the 5.9% still running at 1366x768 (since presumably they want to upgrade to at least 1080p), cover 65% of the current userbase by focusing on 1080p testing, and for those 13.7% of DIYers aiming at either 1440p or 4K gaming, they should be able to apply some reduction factor to guess approximately what 1440p and 4K performance might look like.
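For what it's worth, the bottleneck principle in (b) boils down to a single line of code. A sketch with hypothetical frame-rate caps (not real benchmark data):

```python
def effective_fps(cpu_fps_cap, gpu_fps_cap):
    """The system delivers the rate of its slowest component (the bottleneck)."""
    return min(cpu_fps_cap, gpu_fps_cap)

# A CPU that can prepare 150 frames/s, paired with a GPU that renders
# 220 fps at 1080p but only 90 fps at 4K (all numbers made up):
print(effective_fps(150, 220))  # 150 -> CPU-bound: CPU differences are visible
print(effective_fps(150, 90))   # 90  -> GPU-bound: CPU differences are hidden
```

Which is exactly why CPU reviews test at 1080p with the fastest GPU: it keeps the GPU cap above the CPU cap, so the min() actually measures the CPU.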
@matrix3509 Жыл бұрын
Seems like most people just refuse to understand what the purpose of benchmarking even is. SPOILERS: Benchmarking is about finding out the true performance characteristics of whatever part you're testing, thus it makes sense to eliminate other variables. Benchmarking is the computer equivalent of taking a car to a race track to find its lap time. Then some dipshit comes into the comments and talks about how nobody cares about lap times because the real world has speed limits.
@kmputertechsupplies2374 Жыл бұрын
@ 11:30 you said "is that useful data? In my opinion, NO." While you are correct that the 13900K is much better than the 3% uplift over the 13100 suggests, consider the following. One question this graph does answer, and you are ignoring, is: what CPU do I need to get the most out of a 3060? If you ignore that question and only show CPU performance relative to the best GPU, most people will assume that pairing an i3 with the 3060 means a major loss of performance, as much as 30%, as shown when tested with better GPUs. In conclusion, people who purchase i3s cannot afford the highest-end GPUs, and showing them their CPU's performance versus better CPUs scaled against a better GPU will influence their buying choice. They will think an i3 is not enough when, in their use case, it completely is. And WORST OF ALL, they will save up for the i5/i7 and get the exact same performance as if they had gotten the i3.
@InternetListener Жыл бұрын
You can build a full system for the price of a single component used in a YouTube GPU or CPU benchmark. They should add test hardware that scales sensibly from what the gaming consoles offer (if they are focused on gaming). But a PC is useful for many other things, and as you say, people don't get told that with a Celeron G6900 and an RX 6400 they can game, do 3D art, RT, render, edit photos, code, and encode video (thanks to GPU-accelerated software), all with low power consumption, because content creators forget to test the basic components and use cases. Who cares about Cinebench if you can run Blender in real time with an RX 6400 or 3050 Ti faster than with a Mac, or encode an MP4 video in minutes using the GPU (RX or Arc)? No need for an RTX 4090 or a $1000 CPU.
@grizzleebair Жыл бұрын
Great video! I did know this, but bless you for doing this video for your newer followers. That is a heck of a lot of repeated testing. For you new followers, stick around, these guys know their stuff and do quite extensive testing. They also have a good working relationship with some other of the top Tech YTers, and aren't afraid to do collabs or refer us to someone they trust if it helps us out or helps them reinforce a point they have made. On another note, I know Australia is big, and this may not be feasible, but it would be nice to see a collab with several of the Aussie Tech YTers. Maybe with just the ones in your corner of Aussie Land. Just a thought.
@philipreininger2549 Жыл бұрын
Really great video, I 100% get where people are coming from with the "unrealistic pairing argument" but for testing it's just a different story than for actually purchasing parts and recommending certain combos. Keep up the great work :)
@kodysmith8897 Жыл бұрын
I was stuck with a 1080p monitor for some time so I appreciated the information. Now I still think of it as academically interesting, shows historical scaling, and allows me to help inform my non-pc literate friends and family.
@Jansn Жыл бұрын
I can feel these comments you guys get, I'm getting the same stuff, those people will always be there, don't worry, just keep doing what guys are doing.
@odytrice Жыл бұрын
This is also why you shouldn't just run out and upgrade CPUs if you play the latest single-player games at high resolutions: at 4K most games are GPU-limited, and the difference between a 5900X and a 7900X won't be visible in most games.
@SaltyMaud Жыл бұрын
Oh my god thank you for mentioning performance targets. That's exactly how I view it. If I can get the game to run at _at least_ 120 FPS, I'll make the game run at _at least_ 120 FPS, no matter if that means high, medium, or below minimum quality. You can always lower the graphical settings, but you can't make the CPU run games faster than it can. I wish outlets would take this even further, the standard seems to be to test every resolution at ultra everything - I wish 1080p low was a standard test as well just to see what kind of performance target is achievable in the best case scenario. For competitive titles I'd like as high FPS as my monitor can display. 240, in my current setup. Not just average 240, I'd like lows to stay above 240 too for perfectly consistent gameplay. GPU limited data in CPU testing is completely useless unless that happens to be your exact use case and build.
@SaltyMaud Жыл бұрын
On a somewhat related note; 20:48 Funny anecdote, I had my GTX1080 through 3 CPUs until a year ago before upgrading my GPU because, like I alluded to earlier, you can always lower your graphical settings to make game go faster, but if you want more frames from your CPU, you need a more powerful CPU.
@jouniosmala9921 Жыл бұрын
Some graphics quality settings affect the CPU limit significantly in some games. I learned that during the year when I was using 2070 super with i7 920. (My GPU upgrade wasn't for gaming but for a project that needed it, 2060 super was ideal choice but 2070 super had a quieter partner model.) It would be interesting to see if that's actually case for your normal set of games tested. Pick fastest GPU, slowest CPU and check if quality settings affect performance.
@cellanjones28 Жыл бұрын
Graphic quality settings would impact the CPU by how much the CPU needs to work ( I think)
@calisto2735 Жыл бұрын
No shit, Sherlock? Quick! We need to call NASA!
@AerynGaming Жыл бұрын
They certainly do. Resolution is a good knob to tweak because it almost never has any kind of meaningful effect on the CPU, but usually massively changes the GPU throughput.
@jouniosmala9921 Жыл бұрын
@@cellanjones28 Yes, some settings do. Others I was able to keep at maximum, since those only affected the GPU load. But what was interesting is that some graphics settings turned games that had been playable on my old GPU completely unplayable, and I had to tune them down to find playable settings, though all my games at the time were older games. This is a topic worth further exploration when they are not busy working on hardware that's about to release or just released.
@cellanjones28 Жыл бұрын
@@jouniosmala9921 just depends on the resolution. i run 1440P so it's all gpu based.
@lawliot Жыл бұрын
I buy CPUs based on the number of Xs in its name, no benchmarks needed. But seriously, I see at least one comment on every CPU video (not just on HUB) about 1080p and it felt pointless to bother replying to them.
@Eidolon2003 Жыл бұрын
XFX RX 7900 XTX
@assetaden6662 Жыл бұрын
@@Eidolon2003 thats a gpu tho. The most x's CPU can have is one.
@spladam3845 Жыл бұрын
This was a great idea for a video, and a good resource for new viewers, well done Steve and team.
@opinali Жыл бұрын
Avoiding bottlenecks from components that aren't the focus of a study is obviously correct. The problem is that "real games" are highly opaque and imprecise as benchmarks. Games are not designed to be good benchmarks; source code is typically proprietary so even experts with a lot of time can only make educated guesses about their behavior. And behavior varies a ton. See your 50-game bench for 5800X3D vs. 7600X. In CoD:MW2, the game is GPU-limited since all scores are identical for the two CPUs, a tie at 4K and still a tie at 1080p including 1%-lows; but scores scale well with resolution. This game engine does the same CPU work per-frame no matter the output resolution. Same for CSGO, Dying Light 2, Battlegrounds, Battlefield 2042, probably many others as well (I only checked the games with charts highlighted in the Techspot article) where the distance between the two CPUs is not zero, but the same for all resolutions. "Well many other games are highly resolution-sensitive for CPU load" - but if you average with others that aren't, your NN-games benchmark scores will badly underestimate the difference between competing CPUs. And if you cherrypick the games that are good CPU benchmarks at low res, that's a big problem in any benchmark methodology: selection bias, which can overestimate the difference because of a few games that are excessively resolution-sensitive in ways that aren't uniquely caused by CPU speed. There is a solution to that. CPUs (and also GPUs) should be tested mostly by synthetic benchmarks. But no major game reviewer does that, despite evidence that those correlate extremely well with real-world perf of games/apps that are known to not have undesired bottlenecks or performance anomalies. I didn't want to plug my own, amateur content, but look me up on Medium, I show that the 3DMark scores are spectacularly good for GPUs, matching the best real-world game benchmarks. 
Some titles depend more on integer code, others FP, memory, number of cores, cache latency, etc. Synthetic benchmarks can handle that with a good mix of tests that include all important scenarios. There's the problem of representing the mix of operations found in games, but you have the same problem with real-game benchmarks as well, e.g. some games are highly sensitive to a big cache while others don't care at all (see every Zen 3D review). So I don't know any reason not to use synthetic benchmarks. Not saying that CPU/GPU reviews should drop real-game tests; those are useful for multiple reasons. But reviewers could add synthetic benchmarks to their methodology, perhaps using a weighted average with game tests. Much better than relying too much on unrealistic CPU/GPU/resolution configs, which is problematic no matter how much you justify it. For one thing, there's no objective way to decide how much weight to give to 1080p/Low-settings scores in the final conclusions/scoring. It boils down to each reviewer's empirical choices.
@InternetListener Жыл бұрын
As a matter of fact, these are game benchmarks that serve as CPU or GPU showrooms. The only thing tested is how much room for improvement the game has on a given combination of hardware. Isn't it amazing when "reviewers" or game "benchmarkers" show power and usage percentages, and you clearly see your CPU at 94% (motherboard + RAM + CPU can't go north of the $1000 mark) while your GPU sometimes hits 100% but often sits at 60 or 70% at 1080p (and, mysteriously, they're not using RT or AI upscaling, so 40% of the die, tens of millions of unused transistors, sits idle, and you're supposed to pay $2000 or more for it)?
@austinfowler2707 Жыл бұрын
Testing a CPU with the fastest GPU is about removing as many variables as possible. You can only go slower. So say with this CPU, the most fps you can achieve with the fastest GPU is 100 fps. Well, if the next step down in GPU is 70% as fast as the top, then you can reasonably expect that GPU to do 70 fps. Testing is all about removing the most variables. That gives you the best-case scenario.
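Written out, that estimate looks like this (illustrative numbers only; real-world scaling isn't perfectly linear, so treat the result as a ceiling):

```python
def predicted_fps(cpu_cap, flagship_gpu_fps, relative_gpu_speed):
    """Estimate a slower GPU's result: scale the flagship's GPU-limited
    rate, then clamp by the CPU cap from an isolated CPU benchmark."""
    gpu_cap = flagship_gpu_fps * relative_gpu_speed
    return min(cpu_cap, gpu_cap)

# CPU review (fastest GPU) says this CPU tops out at 100 fps in a game;
# the GPU you're eyeing is ~70% as fast as the flagship (made-up figures):
print(predicted_fps(100, 160, 0.70))  # 100 -> still CPU-bound, slower GPU costs nothing
print(predicted_fps(100, 120, 0.70))  # 84.0 -> now GPU-bound
```

This is the cross-referencing method from the video: take the CPU cap from a CPU review and the GPU-limited rate from a GPU review, and the slower of the two is your prediction.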
@ferasamro9735 Жыл бұрын
To be honest, as enthusiasts we assume all of your viewers understand what you do, but these comments are a good thing, because they mean that non-experienced people are watching your videos. This is very good news for the hardware community and for you as a whole. Thanks as always for your hard work.
@myfakeaccount4523 Жыл бұрын
Enthusiast = spends too much money.
@ChaosPootato Жыл бұрын
It boggles the mind that this is still a thing. The CPU testing methodology is very well proven at this point, every single (serious) outlet uses low resolution because that's what makes sense. These people don't want a CPU benchmark, they want a benchmark of their specific targeted complete system and targeted resolution. I'll attribute to ignorance what could very much be attributed to selfishness
@CriticalPotato Жыл бұрын
Excellent name, Potato brethren.
@Bula-Boy Жыл бұрын
Thanks for adding 1080p in your benchmarks. Most fortnite pros are still using 1080p and likewise for other competitive titles.
@fiendhappy6964 Жыл бұрын
Yea, they support 1080p for competitive gameplay.
@conkcreet Жыл бұрын
@@terraincognitagaming facts
@Bula-Boy Жыл бұрын
@Terra Incognita Gaming brah think you're still lagging 2 years behind. In November, 2022 we had fncs invitational LAN and this year they have FNCS global championship with prize pool of $10 million.
@fiendhappy6964 Жыл бұрын
@@terraincognitagaming no bruh , pubg , dota and warzone
@hillardbishop8755 Жыл бұрын
@Hardware Unboxed My rule of thumb, if I have to make a compromise, is to go for a higher tier on the CPU, because it is easier to swap out the graphics card later. It doesn't matter how powerful your graphics card is if it is being limited by your CPU. Ideally, I want to match the CPU and GPU for the build, but sometimes, due to budget constraints, one must compromise while minimizing the tradeoffs and keeping future upgrade paths in mind. I agree: 1080p testing shows which CPU performs better, GPU testing shows which graphics card performs better. With those results in mind, one is better equipped to pair them appropriately in a build.
@korneldenes Жыл бұрын
Köszönjük! (Thank you!)
@CanIHasThisName Жыл бұрын
I would like to say that I really appreciate that you also include midrange GPUs in your CPU benchmarks even though it's a lot of extra work for you. It helps the viewer to quickly get an idea where the difference still matters and where it doesn't. Theoretically you could use the presented graph and compare it to another graph that focuses on testing the GPU, but then you also need those benchmarks to be done under the same conditions in the same games and sometimes you'll need more than one outlet for that which really complicates things. And then you have what's seen at 7:53 - based on the 3080/4090 data you'd assume that the 2600X would get you the same result as 8700K with an RTX 3060, except it doesn't.
@ZWortek Жыл бұрын
Really appreciate you guys making videos like this because since COVID began there are a lot of new people in the PC gaming space that are almost entirely clueless and borderline delusional with their logic regarding topics such as this. For example, quite a few people that reply to your tweets that I've seen both Tim and Steve face palm over LOL.
@mauree1618 Жыл бұрын
I guess it’s easy to forget how much of what we know is built on prior knowledge. We need more videos like these for the beginners.
@marka5968 Жыл бұрын
Great video and an insightful look into CPU testing. That is why I tell people to take CPU testing differences with a lot of thought. You won't notice the difference between a high-end CPU and a lower-end one until years into the future. At that point, the upgrade would be much, much cheaper to do than overspending today to keep the part for 6 years.
@AirborneHedgehog Жыл бұрын
Great video, it was amazing to see how wide the gap in performance is when you get the GPU out of the way. One minor thought: at 11:24, you question how valuable the data from the 13900K vs 13100 with a 3060 is. I would argue that it is valuable when paired with the charts above it: If I'm gaming at 1080p, and all I can afford is a 3060 for the foreseeable future, then I know I'm throwing my money away if I pay for anything more than a 13100. I won't see any benefit unless and until I upgrade the GPU. But to your point: if I wanted to future-proof my build against an eventual upgrade? Yeah, that chart doesn't help much. (I'm in a bit of a weird case in that my preferred genre - flight simulation - is EXTREMELY limited by single-threaded CPU performance, so I did notice a bump when I updated from a Ryzen 3600 to a 5800X when paired with my RTX 2060. Prior to FS2020? I don't think I would have bothered much with the CPU.) Edit: Looks like you address that in the final thoughts. Well done. I'm convinced. 👍
@GamingLovesJohn Жыл бұрын
I am one of those Steam users that plays at 1080p 165Hz with a 3900X and a 3080. I haven't seen the need to upgrade my monitor, as most games I play run at high refresh rates anyway. I find that switching to 1440p, you still need a higher-tier card. At 1080p, it's easier for me to hit far higher than 144 FPS.
@Outmind01 Жыл бұрын
Serious question because I have yet to experience refresh rates so high - can you really tell the difference between 144 and 165Hz?
@assetaden6662 Жыл бұрын
@@Outmind01 Going farther than 144 has the effect of diminishing return. It will give you an advantage, up until the point of you being a bottleneck. No reason to buy 360hz monitor, when your reaction time is 400ms.
@KontrolStyle Жыл бұрын
@@assetaden6662 this is a great point, if you're not shroud or a pro streamer/gamer no reason to get 360hz. @outmind01 no you can't tell a difference.
@alinuofal2581 Жыл бұрын
It really blows my mind how absurd it is to test CPUs with a 4090 chasing silly FPS counts, while ignoring the option to benchmark with CPU-intensive 4X and strategy games like Stellaris, HOI4, and Anno, which bring the most powerful CPUs to their knees in the late game. All YouTube reviewers seem to copy each other at this point.
@skinscalp222Ай бұрын
Use that brain, bro. Using the top GPU to ensure the only bottleneck is the CPU is basically the same scenario as testing on games that are CPU-intensive. In both cases the CPUs are the ones working hard. Like, how do you get this point through all those thick skulls? 😂
@MarkHyde Жыл бұрын
Controversial ;) - love you taking the trouble to do this. More power to HU!!!
@naswinger Жыл бұрын
glorious hungary!
@sniper0073088 Жыл бұрын
This data is extremely useful. It makes it very easy to choose the best GPU for a given CPU. This kind of benchmark would be very nice to have in every CPU review; it makes getting the most out of a budget CPU without overspending much easier.
@VPoje Жыл бұрын
Can you show good old Intel CPUs performance compared to now? (From like i7-3770K to 9900K?)
@Hardwareunboxed Жыл бұрын
At some point we will revisit older hardware but we do that during quiet periods.
@VPoje Жыл бұрын
@@Hardwareunboxed Thx!
@jdlpeke1 Жыл бұрын
A perfect explanation, between the difference in CPU performance and CPU+GPU performance, because although they go together they are worlds apart. You can see perfectly how things change when there is no GPU bottleneck.
@darkywarky Жыл бұрын
Thank you for also testing the 8700k. I was wondering if I should upgrade it playing at 1440p with a 3070.
@InternetListener Жыл бұрын
No. And you don't need a 3070, because new AMD cards with the same or better performance (and higher efficiency, which they claim will save you hundreds or thousands on your electric bill) are cheaper even than used ones. Just check prices on the 6700 series (the 6800 or 6900 series, or even a 7900 with a proper discount, could let you play at 1440p or 4K). As long as you can get more than 90 fps consistently, you can always use the upscaling or downscaling options in new GPUs to make sure your GPU works at 100%, even if you're in a CPU-bottlenecked scenario in a particular game.
@GewelReal Жыл бұрын
@@InternetListener AMD cards do not have RTX features tho 🤷
@meurer13daniel Жыл бұрын
I had a 9700F (non-K) and had to upgrade to a 13600K. I don't know how your 8700K is performing now, but I was having a lot of bottleneck scenarios in modern games with a 3070. Spider-Man, Cyberpunk, and A Plague Tale: Requiem are clear examples. For 2023 games I suppose it would be even worse, such as Hogwarts Legacy.
@darkywarky Жыл бұрын
@@InternetListener I've already had the 3070 for a long time; I was just wondering if I was leaving performance on the table by using an 8700K in combo with a 3070.
@khalidzouzal8417 Жыл бұрын
I've got the non-K 8700 paired with an RTX 3080 Ti, and I can basically play any game at Ultra 2160p above 60 fps. Like they already mentioned in the video, the higher your resolution is, the less your CPU matters. I don't play competitively, so I'm okay with 60+ fps at 4K, and it works flawlessly for me.
@blasterrr Жыл бұрын
Great video for educating some viewers. For multiplayer titles with non-GPU-bound graphics settings, the performance differences between CPUs can be even higher. Unfortunately, I don't know of any reviewers who test the most demanding scenes of a game. As a result it looks like games run super fluid these days and the GPU is way more important than the CPU, while in reality a demanding multiplayer scenario can have 1/4 or 1/3 of the framerates you see in the review, especially in the 1% lows. So for multiplayer, the CPU is actually way more important if you're going for high framerates rather than visuals. It would be great if reviewers could test the demanding multiplayer scenarios more. I know this is very difficult and not very reproducible, but you could do a more subjective gameplay analysis for multiplayer scenarios. Great CPU-bound games are Warhammer: Vermintide 2, PlanetSide 2 and Star Citizen, of course only in the right gameplay scenarios. But those are actually the scenarios that matter, not the ones where there isn't much load on the CPU anyway. It would be super great if you could do a video about this topic, to further fight the misconception that CPUs don't really matter for most games and to show us multiplayer-focused people how much we would really benefit from a CPU upgrade.
@YeaButCanUDoABackflp Жыл бұрын
True. Hard to measure but very important. Battlefield is the worst example as far as I know.
@rjy87 Жыл бұрын
This also applies to CPU-demanding sim racing games, which are often neglected by reviewers. Especially at wide resolutions, the CPU becomes the bottleneck most often.
@Kuriketto Жыл бұрын
As someone who's still rocking an i7-8700K, just upgraded from a Vega 64 to a 7900 XTX, and plans to upgrade my CPU platform to hit a performance target of at least 144 fps in most if not all modern shooters at 1440p, this video has been fairly vindicating.
@moliii8809 Ай бұрын
Thanks for this really helpful explanation - came here from watching your recent 9800x3d benchmarks!
@wilfredroettger5906 Жыл бұрын
I agree that, in order to test CPUs, you must remove GPU and other bottlenecks from the chain. But in the end, it is kind of misleading. If you don't have a top GPU, you will most likely get only a small improvement from a CPU upgrade at 1440p / 4K (I really don't care about 1080p), if any. Best to go for the best GPU you can afford and a middle-tier CPU.
@WrexBF Жыл бұрын
It's not misleading. After seeing the performance difference between an i3-13100 and an i9-13900K when paired with a 4090, go and look up benchmark videos of the slower GPU you have, like a 3060 for example. If the fps is below what that 13100 was getting with a 4090, that means the i3 isn't holding back that 3060. There's no need to ask questions like ''can you test with a 3060, can you test with a 1660, can you test with a 3070, etc.''
@wilfredroettger5906 Жыл бұрын
@@WrexBF Yes. But look at the chart at 17:00. A 13900K is vastly superior to an 8700K with a 4090, but if you have a 3060, it really doesn't matter. Better to upgrade the GPU, not the CPU.
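The cross-referencing method described in this thread boils down to taking the minimum of two numbers. A minimal sketch in Python, using made-up fps figures (not real benchmark data; plug in numbers from actual CPU reviews and GPU reviews):

```python
def expected_fps(cpu_limited_fps, gpu_limited_fps):
    """In-game fps is capped by whichever component is the slower limit:
    the CPU's max frame rate (from a CPU review paired with a top GPU) or
    the GPU's max frame rate (from a GPU review at your resolution)."""
    return min(cpu_limited_fps, gpu_limited_fps)

# Hypothetical numbers: say a budget CPU drives a game at 120 fps with a
# 4090, while your midrange card manages 75 fps at 1440p in the same game.
print(expected_fps(cpu_limited_fps=120, gpu_limited_fps=75))   # 75: GPU-limited
print(expected_fps(cpu_limited_fps=120, gpu_limited_fps=160))  # 120: CPU-limited
```

If the GPU number is the smaller one, a faster CPU won't add frames; if the CPU number is smaller, a faster GPU won't.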
@nhkbeast3884 Жыл бұрын
At 1080p my CPU is being used at like 100 percent and my GPU at like 30 percent, but when I switch to 4K both are at 100 percent. Most people don't realise that you guys are actually looking for CPU bottlenecks, and that's why y'all test CPUs at 1080p.
@dante19890 Жыл бұрын
the cpu bottlenecks will show up in higher resolutions too but not as pronounced
@assetaden6662 Жыл бұрын
@@dante19890 yeah, when the GPU is bottlenecked harder, the CPU bottleneck can only be seen in the 1% and 0.1% lows
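The utilization pattern described in this thread can be turned into a rough rule of thumb. A naive sketch (utilization counters can mislead, e.g. overall CPU% can look low while one thread is maxed out, so treat this as a first guess, not a diagnosis):

```python
def likely_bottleneck(gpu_util_percent):
    """Rough heuristic: a GPU pegged near 100% suggests a GPU limit;
    a GPU with lots of headroom while fps won't go any higher suggests
    a CPU (or engine/frame-cap) limit."""
    if gpu_util_percent >= 95:
        return "GPU-limited"
    if gpu_util_percent <= 80:
        return "likely CPU-limited (or frame-capped)"
    return "mixed / unclear"

print(likely_bottleneck(30))   # the 1080p case from the comment above
print(likely_bottleneck(100))  # the 4K case
```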
@DavidMiller-dt8mx Жыл бұрын
Fascinating. While I did understand why you tested with vastly overpowered gpus, I'm a more casual gamer, and this video showed me that I'm good with a lot less power - I also have multiple windows open while playing most games, and the range of cpus I'm looking at should be just fine. Excellent video.
@flameshana9 Жыл бұрын
Having windows open while gaming is irrelevant, just in case you're wondering. They've done a video on that specific concern because so many people think you need 12+ threads as a minimum.
@DavidMiller-dt8mx Жыл бұрын
@@flameshana9 I do it all the time, on four. Trust me, if it's taking significant memory and/or thread cycles, it matters.
@ZeroUm_ Жыл бұрын
I'd say we're having an XY problem: what the people asking for a matched CPU-and-GPU benchmark really want to know is which combination of the two gives the best performance for their budget. Most reviewers do proper CPU and GPU benchmarks, but they usually don't give direct answers about which GPU matches well with a given CPU.
@stanislav_5312 Жыл бұрын
I am from Ukraine but I have studied English for a long time. I can understand 98% of what you are saying, and my "tech channels I regularly watch" ranking is: 1. Linus Tech Tips 2. Hardware Unboxed 3. Gamers Nexus 4. Actually Hardcore Overclocking
@sceptikalmedia Жыл бұрын
I had the chance to test the 5600 (non-X) vs the 8700K. I sold the 8700K just for the 1% lows; the AMD part ran smoother every time, with 1% lows more than 20% higher. I don't miss Quick Sync.
@antonschaf4088 Жыл бұрын
Nice, we haven't seen the i7-8700K in a while. Impressive performance from the i3-13100 compared to the old i7. You really should compare the i7-8700K, i9-9900K and maybe the i9-10900K to 13th gen in the near future! Edit: At the end of CPU benchmark videos you could mention a realistic pairing for the tested CPUs, so the "realistic pairing" dudes have the answer they need.
@jonsummers3453 Жыл бұрын
The 8700K has really held up well over time. 6 cores is a sweet spot, and that CPU could be overclocked massively, helping keep it relevant for a long time.
@lukeearthcrawler896 Жыл бұрын
I wish you also did 4X game performance reviews, like the built-in benchmark in Civilization 6. They are perfect for CPU benchmarking, as these games are CPU-heavy and less GPU-heavy.
@saricubra2867 Жыл бұрын
CPU heavy? Laughs in PS3 and Switch emulators.
@lukeearthcrawler896 Жыл бұрын
@@saricubra2867 Yes, CPU heavy. Run the GS AI benchmark in Civilization 6 and see how many seconds you wait between turns.
@guilhermetomazini5179 Жыл бұрын
Users do understand, they just don't agree. The reason you need to run a 4090 at 1080p is to create the illusion that there is some difference in performance between CPUs where there isn't, thus leading people to buy a more expensive CPU that wasn't needed.
@jkvintageanalog8489 Жыл бұрын
This is the thing: if you want resolution testing, look at GPU testing. They're just not smart enough to figure out that it's showing the maximum possible frames the CPU can produce, not how few frames the graphics card can render at 4K. It's a CPU test; the CPU doesn't care about the resolution.
@deltazero7012 Жыл бұрын
I completely understand why you guys test the way you do. I think these people are looking for something a little different from what you are doing, and they don't realize it yet. They are most likely trying to see how realistic combinations of CPU and GPU perform as a whole, like say a 13600K and a 4080. Figuring out what CPU you need to not bottleneck your GPU is a confusing process with most benchmarks. A new set of benchmarks might be in order, or maybe just an episode that answers the bottleneck question for the relevant GPUs on the market: just some solid answers for people new to PC building, like "Going above CPU X gives no extra performance with GPU X."
@DarkReturns1 Жыл бұрын
Can you please do an 8700K revisit? That CPU was game-changing at the time and hugely popular. Five years is a pretty normal time to go before upgrading a CPU, and I know a lot of 8700K owners looking to upgrade this year.
@RongGuy Жыл бұрын
First of all, I would like to say I agree with your testing methodology. That being said, I think it would be cool to see "upgrade guide" type videos with new GPU generations: e.g. take 5-6 popular CPUs that are a couple of generations old and benchmark them with the 2-3 GPUs people are most likely to buy (like the xx60 (Ti), xx70 (Ti) and xx80).
@PainterVierax Жыл бұрын
I remember such videos in the past, testing different GPUs on a single popular midrange CPU to show the bottleneck. But those videos get done when there are fewer launches… and when Steve doesn't have to do such a basic explanation for angry newbs.
@allxtend4005 Жыл бұрын
Believe me, the i5 was way more popular than the i7, because Intel's CPU prices were that high, and Intel motherboards as well. Back when the Ryzen 2600X was selling well, most people had an Intel i7-3770K / 4790K and so on, not an 8700 (not the most popular CPU among gamers at all).
@rocketsurgeon1349 Жыл бұрын
Good explanation. And I'm sure there are plenty of videos talking about reasonable pairings of CPUs with GPUs, now or if you're upgrading.
@michaelrichter2528 Жыл бұрын
19:30 I would be interested in an explanation of why you don't test CPUs at even lower resolutions like 720p or even 480p. In theory this should completely eliminate any remaining GPU bottleneck without changing the tested workload for the CPU. Are there any unexpected downsides because these resolutions aren't commonly used, or is 1080p simply low enough for current high-end GPUs?
@assetaden6662 Жыл бұрын
That's a good metric, but it doesn't show any real-world scenario. The people playing below 720p are just the competitive shooter players, who are a minority.
@michaelrichter2528 Жыл бұрын
@@assetaden6662 But to me that was the entire point of the video. Of course nobody plays games at 720p or 480p on modern hardware just like nobody buys an i3 for his 4090. CPU tests are synthetic scenarios that only show the maximum possible performance (fps) of a CPU without any GPU constraints.
@SiberdineCorp Жыл бұрын
@@michaelrichter2528 There are some legitimate reasons: many games adjust pop-in and level of detail along with resolution (the logic being that you won't see the difference at lower res anyway). Lower LOD = fewer draw calls = less CPU load. 1080p is a good middle ground: not so low that LOD gets reduced by much, not so high that the GPU becomes the limit.
@michaelrichter2528 Жыл бұрын
Thanks, that makes sense
@qubes8728 Жыл бұрын
Funny this popped up. I was commenting on another video about my gains in WZ with the 5800X3D. One guy asked how a CPU can increase frames and didn't even know what L3 cache was! Another guy with a 3090 Ti / 5800X3D asked if he should enable XMP. He also wondered if he should reduce VRAM usage in WZ because that's what he saw on YouTube; I told him he's got 24 GB of VRAM so it won't be an issue. I think some guys get confused when AMD calls L3 cache "3D V-Cache". They think it's something completely different, or didn't even know all CPUs have L3 cache.
@absolutelysobeast Жыл бұрын
Just curious, what were your gains in WZ with the 5800X3D? I upgraded to mine a few months back, before I was playing Warzone, so I don't have an idea of what I could have expected before. When I run the MW2 benchmark I get a 1 percent CPU-bottleneck result. I'm running it with a 4080 at 1440p max settings. I seem to have a lot higher 1% lows than the friends I play with, but they are all on 30-series cards ranging from a 3060 up to a 3090, so it's hard to get a reference point on what's CPU and what's GPU, you know?
@GewelReal Жыл бұрын
Doesn't matter what AMD calls it; the average Joe has no in-depth idea about PCs. You could explain to him that the CPU is like a gearbox in a car: a better CPU = longer gears, aka a higher top speed (if the engine, aka the GPU, can drive at such speeds).
@qubes8728 Жыл бұрын
@@absolutelysobeast I was averaging 120 fps with a 5600X in demanding areas; now I'm averaging at least 150 fps, so around a 30 fps gain with the 5800X3D and my 3070 Ti. A 1 percent CPU bottleneck is good. Jufes from Framechasers noticed higher 1% lows in WZ with the 5800X3D as well; check out his 5800X3D video. It seems to be a WZ thing with the 5800X3D, but if they have X3Ds too, then it could be the 4080 combo. Don't stress about lows unless it doesn't look smooth; your lows are probably higher than most guys' highest fps anyway. Enable the GeForce fps overlay and see how the lows go in game.
@riba2233 Жыл бұрын
Awesome video, Steve... I am shocked at how many people don't get this. And some of us want to know how CPUs perform in esports titles, and whether they can drive 360 or 500 Hz monitors.
@BattleToads Жыл бұрын
One of my old "buddies" should watch this video. He would swear that if your GPU was running at 99%, the solution was to upgrade the CPU to "open up" the GPU. He would insist that if you had, for instance an RX 6800, that you would need a MINIMUM of a Ryzen 5800X or else you would be severely bottlenecking the card. He based this data off of a promotional poster that Lisa Su was standing in front of during a press conference, insisting that the CPUs and GPUs were best "matched" if their numbers lined up. (Ryzen 5600 with Radeon 6600, Ryzen 5900X with 6900XT, etc.) I would explain to him that in the vast majority of games, a 5600X is going to perform nearly identically to a 5800X, regardless of how much GPU you throw at it. He simply would not listen. I would show him video after video illustrating this point and his response was that "all these big KZbin guys get paid by [insert company here] and go into their BIOS and tweak settings to get the results they want." I'd pull up a video of Gamer's Nexus who streams the entire process, BIOS included. Still "doesn't trust him." OK, buddy. Keep wasting your money on a GPU bottleneck.
@hassosigbjoernson5738 Жыл бұрын
Okay Steve + Hardware Unboxed: kudos for your explanation! Very calm and thorough, which is very professional! I would like to emphasise some points and suggest how to give both types of consumers (FPS and high details / high res) the best information to make a decision.

First: I fully understand your point in testing a CPU to its theoretical potential! But you made a mistake, or a false conclusion, at 11:17 by saying that this data is not useful: wrong! For many, it is! See, when I watch a CPU test, I am thinking about upgrading my CPU/system. That would be obvious, right?! So the question I want answered is: does a CPU upgrade (now) bring me any benefit while staying with my current GPU, or with the one I would upgrade to in the foreseeable future (12 months)? Any other (theoretical) potential wouldn't be practical for a consumer, because I am buying hardware for now or at most the next 2 years; what happens in 5 years is hard to tell, and down the road I can switch parts again. And this data right here (12:50, 15:13 and 16:44) shows me: when targeting a 5700 XT / 3060-tier GPU, or even a 6800 XT / 3080-tier GPU, in the next 12 months, I don't need any upgrade to my CPU (Ryzen 2600 / Intel 12100) to run story games at high res / high details! (To be fair, 1440p data would be fine here.) That is the data I am looking for in a CPU test / CPU upgrade video, and I am glad you provide it! Any other potential I don't care about, because in 5 years I can revisit this decision.

So my suggestions would be:
1. When testing CPUs, keep your current system but maybe use fewer different products to optimize the time needed for testing. When looking at a mid-tier CPU (i.e. a Ryzen 7600), it is of course good consumer advice to list the direct competitors and the better/nearby options as well (Ryzen 5800, Intel 13600, Ryzen 7700). Top-product data is also good for seeing where the product stands, and here my "but" begins: I don't need many top CPUs for comparison, especially when they don't bring any useful data to the table for the reason we're testing: gaming! So when a Ryzen 7950 doesn't bring much useful to the table for reviewing a Ryzen 7600, you don't need to show the 7800, 7900 and 7950 as well. Save this time!
2. With a bit of time saved from step 1, maybe add a small upgrade matrix to the test for the non-FPS players who go for resolution and quality. This would mean adding an RX 5700 XT and a 6700 XT / 6800-tier card, plus two additional resolutions (1440p and 2160p), to your existing data from step 1: 3 tiers of graphics card, 3 resolutions and 3 CPUs (the one you're testing plus its competitor and a useful alternative). That would be great and also a useful service for the consumer! Relative to the kind of CPU you're testing, you could adjust the type of GPUs as well (someone interested in buying an Intel 13100 would maybe be more interested in RX 570 / GTX 1060 data than RTX 3090 Ti / 4090 data).
3. If steps 1 + 2 would make too long a video, please think about those CPU scaling videos and maybe a bigger upgrade matrix for comparison: 4 graphics cards, 4 CPUs and 3 resolutions sounds okay, whereas 10 GPUs with 7 tiers of CPU and 5 resolutions / detail settings sounds "a bit" too much, even for you, Steve.

TL;DR: Your valid points about testing for the potential of a CPU are understood! Good work, and good advice on cross-referencing taken from this video (19:54 - excellent idea!). Think about minimizing the data in areas that are not relevant to buyers because of a totally different price point or really bad price/performance (Ryzen 7950X compared to 7700X for gaming relevance). And maybe you could make those CPU scaling videos a bit more often and include products that are typically *not* included in other testing: we don't need another chart with an RTX 4090 or Intel 13900K in it. A 4x4x3 matrix (4x GPU, 4x CPU, 3x resolution) up to a 7x7x3 matrix would be more than enough data, and could also include lower-end or broadly available used products. Interesting GPUs would be: RX 570 4GB, RX 5700 XT, GTX 1080, RTX 3060, up to RX 6800 XT / RTX 3080 Ti (data for higher GPUs is already available in CPU tests). Interesting CPUs could be: Intel 7700K, Ryzen 1600/2600, Ryzen 5600, Intel 9400 (best seller at the time?) and 12100, Ryzen 7600, 13600 (data for higher-end CPUs is already available in older GPU tests!). Cheers, mate! And greetings from Germany!
@johannesbohm6458 Жыл бұрын
I honestly found the comparisons of the 13100 and 13900K with the 3060 quite interesting. Not as a CPU benchmark, but for determining useful combinations and GPU bottlenecks... As a CPU benchmark, that data is completely useless.
@VirtualAirTour Жыл бұрын
I don't need you to explain why you test at 1080p - I understand it. What I need to know is what gains can I expect at the resolutions I play at. Some of us are actively shopping now. If I upgrade my 7600x/4090 system to a 7800x3d will I see any performance gain in MSFS at 4K or in VR? I have no way to determine that if the benchmarks only include low resolution. If the game is GPU limited, I need to know that so I don't waste the money on a CPU upgrade.
@passerby6168 Жыл бұрын
Good point.
@itsJPhere Жыл бұрын
Sometimes you just have to make an educated guess based on what has been tested. The more data sources you can find, the more accurate your guess will be. With that said, 4K is the great equalizer of CPUs.
@manoftherainshorts9075 Жыл бұрын
Always some new blood coming in to ask the same questions. Also, I honestly don't think 1080p tests are entirely useless. So many people still play at 1080p and are interested in performance at this res, including me. I personally don't *upgrade* monitors per se; I treat them the same way I treat a pair of shoes: if it's working, I don't throw it away or change it for a better one. I upgraded to 1080p once my old 1280x1024 monitor died, and I will use my new 1080p 144 Hz monitor until it fails. Also, considering that many games struggle to hit 144 frames even at 1080p, 1080p testing is still relevant for a couple more years in my opinion. I'd rather run a game locked at 144 Hz at 1080p than downgrade to 90 fps at 1440p.
@loowick4074 Жыл бұрын
MOST people still play at 1080p. It's still the most relevant resolution.
@nivkaha Жыл бұрын
I agree with almost everything said about reducing bottlenecks, but it's not entirely true that resolution has no effect on CPU performance. Many games have logic running on the CPU that factors in things influenced by resolution (like deciding when to switch certain textures for higher-resolution ones). Currently I don't think any of them have enough of an effect on CPU performance to skew results and trendlines, but it is worth noting that with virtual geometry systems like Nanite, rendering resolution could start having a significant effect on CPU load.
@richardwales9674 Жыл бұрын
One of the best, if not the best, videos I've seen in a long time. It was very interesting to see where a CPU hits its limit in being able to push more frames out of a graphics card. I disagree slightly when you said "is that useful? No", because in the context of this video it is useful. I know there's a lot of kit coming out and you want/have to show us those things, but videos like this are gold. If I knew from watching this that my CPU hit its limit at a 3080, and I was in the market for a GPU now and wanted to make the most of my CPU, then (assuming I couldn't get a 3080) I could look for the modern equivalent when that comes out. The 3070 Ti, for instance, is around or slightly better than a 3090, so I could still wait.
@Oceanborn712 Жыл бұрын
Excellent showcase. Thank you for this. I can't count the number of arguments I've had so far with people going "Lol, who's gonna play like this? That's not realistic AT ALL!!!!". This video should be linked under each of your future testing videos.
@cmja09 Жыл бұрын
Yeah, we have a PC shop and 95-98% of our customers play at 1080p; 2K and 4K benchmarks cater to a very small percentage of PC gamers. This is also why the RX 580 8GB is still our best-selling GPU: we sell it for only $100 and it is the best value in price-to-performance. And I'm here wondering why GPU makers aren't making a sub $100-150 GPU with RX 580 performance that draws less power.
@cmja09 Жыл бұрын
@El Cactuar 1440p is 2k
@dante19890 Жыл бұрын
Yeah, but it's exactly that small percentage who are interested in buying high-end parts. The average 1080p60 gamer doesn't care about benchmarks or the latest CPUs and GPUs as long as their shit still works.
@damasterpiece08 Жыл бұрын
@El Cactuar nobody uses that resolution, get real
@cmja09 Жыл бұрын
Sure, well, lots of technicalities. Generally, 1080p is Full HD, 1440p is 2K, and 2160p is 4K, because that's how most monitor manufacturers market their products.
@ecokecok1376 Жыл бұрын
@El Cactuar You can even check that you are wrong from this video's quality settings
@Revoku Жыл бұрын
Got 3 monitors, all of them 1080p, because I run competitive games at high refresh rates, and all of my gaming friends are running 1080p monitors as well for various reasons, usually cost. 1080p is 64% of primary monitors in gaming (source: Steam hardware survey, Jan 2023); it's so far from dead it's not funny.
@KontrolStyle Жыл бұрын
Ooh, I'm sure more people play at 1080p than at higher res, that's for damn sure. Some of us just had "must haves" or "nice to haves" for 1440p; it's damn crisp.
@CZ_Aesthetic Жыл бұрын
Would love to see 3440x1440 ultrawide benchmarks :)
@KevinM0890 Жыл бұрын
It's between 4K and 1440p; you can do the math on your own.
@CZ_Aesthetic Жыл бұрын
@@KevinM0890 Doesn't change anything. Still would love to see a 3440x1440 ultrawide benchmark :)
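The "do the math" point in this thread can be made concrete with a quick pixel-count comparison (a rough sketch; GPU load doesn't scale perfectly linearly with pixel count, but it's a decent first approximation):

```python
# Pixel counts for the three resolutions discussed in the thread.
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,
    "ultrawide (3440x1440)": 3440 * 1440,
    "4K (3840x2160)": 3840 * 2160,
}

base = resolutions["1440p (2560x1440)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} px, {pixels / base:.2f}x the pixels of 1440p")

# Ultrawide pushes ~1.34x the pixels of 16:9 1440p, while 4K pushes 2.25x,
# so 3440x1440 results usually land between the two, closer to 1440p.
```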
@clickbait-g4o 10 ай бұрын
Well... choosing a handful of modern games within your preferred genre, comparing the performance of various CPUs and GPUs, and then selecting components with fitting numbers is one of those ideas that seem so obvious once you've heard them, you wonder why you didn't think of it yourself.
@MRFISHLUKE Жыл бұрын
Definitely understand the point of CPU benchmarking being to assess CPU performance. However, the people posting things like "4K RTX 4090" want to know what CPU to pair with their 4090 (or other high-end/future GPU), which 1080p benchmarking is unlikely to help with. Your poll shows that 77% of viewers want to see results at 1440p or above. I myself am targeting 4K/120, which is clearly not the most demanded, but it does test the limits of the GPU and may therefore allow a cheaper CPU to be used in many cases; it's hard to find this information given the focus of the benchmarking.
@Hardwareunboxed Жыл бұрын
This point really should have been made clear in the video. Okay, you game at 4K. What games are you playing? How are you playing them? What is your fps target? If I include 4K results at around 60 fps but you need the CPU to handle 120 fps, that's useless to you. It also tells you nothing about the actual CPU performance, so you could be one year away from needing another CPU upgrade. All of this was covered in the video.
@MRFISHLUKE Жыл бұрын
@@Hardwareunboxed Yeah, I understand. Benchmarking provides the broad picture, whereas the other way of doing it is very context-dependent.