Your GPU is Useless Now

484,854 views

Vex

A day ago

Comments: 2,600
@CaptToilet
@CaptToilet Жыл бұрын
So it comes down to 2 things. Are CPUs just not good enough? Or are developers just not good enough? The answer should be obvious. Hint: it isn't the CPU.
@bigben9056
@bigben9056 Жыл бұрын
It's the CPU. People just don't understand how heavy RT is on the CPU.
@upfront2375
@upfront2375 Жыл бұрын
Exactly! I've been pc gaming for around 25 yrs now and I've never seen a time when CPU was nearly this important for gaming, except for old network EA games of course
@2528drevas
@2528drevas Жыл бұрын
@@upfront2375 Same here. I built my first gaming PC in 1998 to play "Half-Life", and it did fine with an AMD K6 300, because the 16MB Voodoo Banshee handled the load.
@bl4d3runn3rX
@bl4d3runn3rX Жыл бұрын
Would be interesting to see how a game performs on 5900x on release day and 2 years later fully patched on the same system, has it improved or still the same?
@acelandrei
@acelandrei Жыл бұрын
It is a mix of both, cpus aren’t exactly advancing as much as gpus but it also feels like some developers are not even trying to optimize games
@user-nq5hy7vn9k
@user-nq5hy7vn9k Жыл бұрын
Based on Steam Hardware survey, 1650 is still the most used GPU. If a developer wants a wider audience to be able to play their games, it's much better they focus on better optimization really soon
@redshiftit8303
@redshiftit8303 Жыл бұрын
Unfortunately, their target audience is now console. Where they can get away with 30 fps and the peeps still pay up....
@user-nq5hy7vn9k
@user-nq5hy7vn9k Жыл бұрын
@@redshiftit8303 I doubt with such crappy optimization even the consoles will be able to give 30fps for too long
@danavidal8774
@danavidal8774 Жыл бұрын
@@user-nq5hy7vn9k If I remember correctly, Remnant II runs at 720p upscaled to 2K on consoles. It's wild.
@InnuendoXP
@InnuendoXP Жыл бұрын
@@user-nq5hy7vn9k nah this is where the optimisation starts. It takes time, and games are taking longer than ever to develop. When they hit the wall on performance, that's when they start pulling out tricks. Though if a Zen 3 3700 equivalent is limiting to 30FPS, you'll need 2x performance per clock to maintain 60, and we still don't have CPUs doing that. At the very least, Series S might keep a lid on VRAM requirements though.
@iraklimgeladze5223
@iraklimgeladze5223 Жыл бұрын
If the most used GPU is the 1650, why are $70 unoptimized games at the top of Steam's sales charts?
@bobsteven2363
@bobsteven2363 Жыл бұрын
As a game dev (not programmer) I will lay out some fun facts. Making a high poly mesh with loads of detail can take a day. You can also auto UV unwrap it and start texturing the next day. Super easy. But the poly count for a single character can easily reach one million with all the parts. Games can't handle that, so you need to spend a week manually retopologizing, baking, UV unwrapping and texturing that model so that it has a low poly count and still looks amazing. Unreal Engine releases auto LODs. Yay, now I can skip the manual retopologizing phase and just make LODs with the click of one button. Game sizes are now way bigger. Unreal Engine releases Nanite. Oh wow, I can just import models directly from ZBrush and Fusion 360? Cool, now I don't need to worry about optimizing at all since the game engine can handle it. Every year, making games becomes easier and less time consuming at the cost of performance. You can still optimize, but why would you if you already finished what you were tasked with?
@curvingfyre6810
@curvingfyre6810 Жыл бұрын
More importantly, they're under pressure from their bosses to churn them out faster. The crunch is insane, and the quality suffers across the board.
@ralphwarom2514
@ralphwarom2514 Жыл бұрын
Yup. Pretty much.
@MrThebigcheese123
@MrThebigcheese123 Жыл бұрын
So, it is ok to release a half baked product and spend over 70% of the development time on pre planning while leaving real development until 2 years before release? Mkay then...
@curvingfyre6810
@curvingfyre6810 Жыл бұрын
@@MrThebigcheese123 I think the point is to blame the direction and money side of things, not the average programmer. They have to get through the day and survive the frankly insane crunch time. It's up to the directors and producers to approve of the level of work that they have forced out of their engineers in the allotted time, and to choose whether more time is needed.
@Born_Stellar
@Born_Stellar Жыл бұрын
good to know, interesting to hear from inside the industry.
@radioleta
@radioleta Жыл бұрын
As a graphics programmer: Vulkan and DirectX 12 shouldn't lead to more CPU utilization by themselves. In fact, the whole point of the modern APIs is to reduce CPU overhead. The new APIs do allow you to use multithreading, so multiple cores can record rendering commands, though. But that actually helps saturate the GPU! Yes, I think it might have something to do with the lack of optimization. I agree that the improvements in realism are not worth the performance impact most of the time :)
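A minimal sketch of the multithreaded-recording pattern this comment describes, in plain Python rather than a real graphics API (the "command" tuples and function names are invented for illustration): worker threads each record their own command list, and a single thread then submits them in a fixed order.

```python
# Conceptual sketch only, not a real graphics API: worker threads each record
# their own command list, then one thread "submits" them in a fixed order.
# The command tuples and function names are invented for illustration.
from concurrent.futures import ThreadPoolExecutor

draw_calls = [f"mesh_{i}" for i in range(10_000)]  # pretend scene

def record_chunk(chunk):
    # Each worker builds its own list; no shared state is touched while recording.
    return [("BIND_AND_DRAW", mesh) for mesh in chunk]

def split(items, n):
    size = (len(items) + n - 1) // n
    return [items[i:i + size] for i in range(0, len(items), size)]

with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_chunk, split(draw_calls, 4)))

submitted = [cmd for cl in command_lists for cmd in cl]  # single submission point
print(f"{len(submitted)} commands submitted from {len(command_lists)} recorded lists")
```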
@Mefistofy
@Mefistofy Жыл бұрын
Just a thought: could it also be memory bandwidth, now that everything is getting bigger and open-world?
@ayliniemi
@ayliniemi Жыл бұрын
As the cpus, gpus and game engines become more complex with each generation I would think it gets harder and harder to master optimization while at the same time creating a game and bringing it to market in a profitable timeframe.
@Mefistofy
@Mefistofy Жыл бұрын
@@ayliniemi Did not think about it, but complexity is definitely something that has exploded in the past decades. I work with ML, and getting the GPU to high utilization can be quite hard sometimes, depending on the architecture. All the new shiny libraries offer a lot of comfort but are sometimes badly documented. If you want to do something specific, finding a way around a framework can be cumbersome, and sometimes you waste a little processing time just to get the damn thing to work. I guess there might be similarities in games. Hardware has been developing fast for decades; software is barely keeping up.
@ayliniemi
@ayliniemi Жыл бұрын
@Mefistofy So you're saying that because of time constraints the code isn't as perfect/efficient as it could be? Like you could go back and recode Mario Bros on the Nintendo and keep seeing how you could make the game run more efficiently on the NES processor, am I right? You'll probably run into a dead end at some point.
@zuffin1864
@zuffin1864 Жыл бұрын
You know what isn't realistic? Low frames dangit!
@HoD999x
@HoD999x Жыл бұрын
(Former game) developer here. You can usually get things calculated a lot faster (2x-10x in my experience) by thinking long and hard, compared to a first prototype. The thing is that this process is usually very expensive and sometimes would mean you run out of funds while rewriting half your engine. On top of that, you become less flexible, because only the special cases you optimized your code for are fast. Your game would run better, but you would not have nearly as much content.
@richr161
@richr161 Жыл бұрын
As far as Spider-Man goes, it's such a CPU hog because it streams a ton of data off the drive. It wasn't an issue on PS5 because the PS5 has hardware to assist with decompressing the data. When it was ported to PC, it functions the same, but relies on the CPU to decompress the data, leading to high CPU usage. If you monitor the SSD usage you'll see it loading huge amounts of data off the drive. This is a game that could really use GPU decompression, like Ratchet & Clank on PC.
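A rough, hedged illustration of why that streaming decompression shows up on CPU graphs: compress a synthetic ~32 MB "asset", then time how long the CPU takes to decompress a batch of such chunks. zlib just stands in for whatever codec a real game actually ships, and the throughput will vary by machine.

```python
# zlib stands in for whatever codec a real game uses; numbers vary by machine.
import os
import time
import zlib

asset = os.urandom(4 * 1024 * 1024) + b"\x00" * (28 * 1024 * 1024)  # ~32 MB, partly compressible
blob = zlib.compress(asset, level=6)

start = time.perf_counter()
for _ in range(10):          # pretend we stream ten such chunks during traversal
    zlib.decompress(blob)
elapsed = time.perf_counter() - start

mb = 10 * len(asset) / (1024 * 1024)
print(f"decompressed {mb:.0f} MB in {elapsed:.2f} s ({mb / elapsed:.0f} MB/s on the CPU)")
```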
@TraktorTarzan
@TraktorTarzan Жыл бұрын
Is it an engine issue? Because with modded Skyrim I'm playing a game that's essentially 500GB, and it runs decently on my 1080. But modern games, with way less content and similar graphics (I usually play them on medium/high), end up running at similar fps, even though they're less than 100GB.
@richr161
@richr161 Жыл бұрын
@@TraktorTarzan I assure you the graphics detail isn't similar. Skyrim's graphics weren't great back then and they're definitely outdated now. The size of the game doesn't have much to do with graphical fidelity. Cyberpunk is an open world RPG and only comes in at 55GB. It's definitely the best-looking game out when you turn on path tracing and all the modern effects.
@TraktorTarzan
@TraktorTarzan Жыл бұрын
@@richr161 I said modded Skyrim, not the base game; that isn't even close. Look up "Skyrim in 2023 | Ray Tracing Shader" or "Skyrim Ultima". Also, I said compared to modern games, not THE best looking modern game.
@QuandariusHuzz-bq1jn
@QuandariusHuzz-bq1jn Жыл бұрын
A lot of PC games these days have problems with asset streaming and resource management.
@nemesisrtx
@nemesisrtx Жыл бұрын
2023 has been the worst year for PC gaming, with games releasing unfinished and terribly optimized, etc... Nowadays people have very good systems, but not even current hardware is able to keep up with how demanding games are. Hopefully game devs understand that it is better to have a playable game on day 1 and not rush their projects because of hype, AND, most importantly, that most PC gamers don't have a high end PC lol
@brucerain2106
@brucerain2106 Жыл бұрын
Truuue
@Not_interestEd-
@Not_interestEd- Жыл бұрын
Actually, there's a surprising answer as to the issues we've seen recently. DX12 has been out for a while, and it's great. Has a lot of drivers to do a lot of the heavy lifting so your games are usually optimized, even with minimal effort put into actual optimization. The problem comes with DX12 "Ultimate", which theoretically is a better branch of DX12, but most of the "useless" drivers had been ripped out, and I guess in the lot removed, something was doing WAY more work than expected, thus, poor optimizations. This in conjunction with bad management and impossible deadlines (think about the Mick Gordon vs Bethesda incident) makes game development a harsh environment for AAA titles. I really want people to stop blaming the devs, blame the management that screws around with the millions of dollars they sit on, paying the actually working devs a fraction of what they get.
@Ghostlynotme445
@Ghostlynotme445 Жыл бұрын
The RTX 4090 getting 40fps at 4K is real. What a $1,600 experience.
@Patrick-bn5rp
@Patrick-bn5rp Жыл бұрын
Pretty good year for emulation on PC, though.
@mimimimeow
@mimimimeow Жыл бұрын
Welcome to the transition period. It happens in every console generation except the 8th gen, because the PS4/XB1 were objectively very weak for their era.
@ErockOnTech
@ErockOnTech Жыл бұрын
The take away for me here is that modern games aren’t optimized. I’ve said this in videos. So has HUB. But good job going in depth on CPU usage. You’re right about reviewers using higher end CPUs. I’m guilty of that myself.
@vextakes
@vextakes Жыл бұрын
I don’t think anything’s wrong with using fast CPUs, because the goal is to show the overall GPU performance. However, a lot of ppl might not be able to get that performance just because of the CPU they own. So it’s a mixed bag. It requires a lot of testing, but should prolly be pointed out depending on what games are reviewed. Mb in CPU reviews we could show if they give reasonable performance as well, compared to relative GPU power if we’re talking about gaming
@imbro2529
@imbro2529 Жыл бұрын
Tbh I think it's an optimization issue of the games themselves, not the hardware. Because really we have the 30 and 6000 series GPUs, Intel's 12th-13th gen and Ryzen 5000 CPUs, probably the best hardware ever, only for it to barely run shitty ports like The Last of Us, CyberBug, Hogwarts Legacy (it was quite buggy on release), Darktide (a shitstorm of poor optimization), Forspoken, etc. All of these came out poorly built because the devs probably have difficulties with all this new software and are being pushed from the top to release a product. So I don't think it's a hardware issue, more like a software issue that doesn't correctly utilize the true potential of our components.
@QQ-xx7mo
@QQ-xx7mo Жыл бұрын
This is just what happens when the masses get access to a medium (games, cinema, TV, internet): it becomes shit.
@dugnice
@dugnice Жыл бұрын
I think it's a concerted effort to destroy the PC gaming market so that only the wealthiest of people can indulge in PC gaming and everyone else gets a console, but I'm probably totally wrong. 🤷🏻‍♂️
@knasiotis1
@knasiotis1 Жыл бұрын
​@@QQ-xx7mowhat.
@ZTriggerGaming
@ZTriggerGaming Жыл бұрын
I’m so glad I watched this. I have a Ryzen 5 5600X and I’ve been CPU bottlenecked lately. I was considering upgrading to a Ryzen 9, but seeing that it didn’t solve the problem for you saved me a lot of money. Are hardware manufacturers paying game developers to make their games as demanding as possible? I think about this often.
@MartinMaat
@MartinMaat Жыл бұрын
I don't think they struck a deal among each other but rather developers will utilize hardware to the max to make their game look better than the competitor's game. So "whatever the hardware guys will come up with, the software guys will piss it away". Their interests align nicely though.
@a7mdftw
@a7mdftw Жыл бұрын
Can I ask what your GPU is?
@ZTriggerGaming
@ZTriggerGaming 10 ай бұрын
@@a7mdftw Sorry, I just got the notification for this. No clue why. I’m running a 4070.
@xTheN8IVx
@xTheN8IVx 10 ай бұрын
This is mostly a game development issue, the 5900x is still a great CPU for plenty of games. Paired with a 4070, that’s a great setup. There just needs to be more optimization on the game devs side.
@dangelocake2635
@dangelocake2635 9 ай бұрын
I'm not a dev, but it seems more like an industry issue. What I mean is, in order to make a game run as smooth as possible, you need time to optimize. The more time you need, the longer you're keeping all those people on the payroll, so it costs money. But a game doesn't generate most of its money until it's released, so you must balance your timeframe against funds, time and potential profit. Companies usually rather push out a poorly optimized game than a highly expensive one, and fix it later. You could improve engines over time, but every game needs to step up in terms of graphics, so there's only so much work you can reuse. So I don't think it's a conspiracy, because you have games like RE4 Remake that ran smooth from day one, because it's a Capcom engine they'd developed for years, or even Rockstar games that are sometimes demanding, but you can see why they are, in terms of tech.
@JodyBruchon
@JodyBruchon Жыл бұрын
I talked about this in "everything is a f***ing prototype." Software is a gas that expands to fill its container. I own a bunch of old XP-era embedded hardware and netbooks. Most of them have one core. Look up the PassMark for the Atom N435 or VIA C7-M 1200MHz, then look up the most basic CPU for a Windows 10 laptop like the Celeron N3050. The difference in computing power is huge. The old wimpy chips could do all the basic tasks anyone needed in a laptop at a reasonable speed, but instead of the software getting faster as hardware has exploded in speed, everything is written in Python or JavaScript and everything uses huge frameworks and pulls in huge dependencies for small things. I recall an old article that disassembled and analyzed bloat in the Facebook app for iOS which was notoriously fat at the time and the most ridiculous piece of bloat was an entire library of functions pulled into the app just to use a single function that does something like getting a date. Any competent programmer could have spent a day at most writing it themselves but they opted to pull in a library to do that single thing. It's disgusting.
@XieRH1988
@XieRH1988 Жыл бұрын
Things aren't being optimised properly sums it up fairly well. The current period is one of transition, where developers are fumbling and stumbling their way to master DX12 and other things. It'll probably get worse before it gets better.
@deality
@deality Жыл бұрын
I believe DX12 is not optimized yet, but it's the future and it should get better.
@he8535
@he8535 Жыл бұрын
Poor optimization, more complex compression, and still missing all the features a good game would have.
@americasace
@americasace Жыл бұрын
Sadly, I vehemently agree with your assessment..
@borone1998
@borone1998 Жыл бұрын
Todd Howard would beg to differ.
@Gramini
@Gramini Жыл бұрын
I wonder how long the transition period will be, given that D3D12 is over 8 years and Vulkan over 7 years old now.
@surnimistwalker8388
@surnimistwalker8388 Жыл бұрын
The way I am seeing it is that it's not going to get better. Publishers are pushing game dev studios to pump out games as fast as they can. UE5 enables this as you described being able to pump out a game that looks pretty but relies on hardware to brute force programming "issues." This is a problem with the quick cash grab mentality in the gaming industry and it's not going to go away until the whole bubble that is getting created bursts.
@abeidiot
@abeidiot Жыл бұрын
Also, game dev pays peanuts and has insane working hours. The brightest computer science talent is no longer interested in working in game dev unlike the 90s
@onyachamp
@onyachamp Жыл бұрын
This is true. It's like an artist pumping out work for money rather than for the love of their art. On a sinister note, I personally think that studios are running silent companies to sell cheats for the games they make. A person buying hacks for $10-20 a month is spending $120-240 a year on cheats, and they would be so much easier to develop than the game itself. From a business perspective it is easy money, and that's exactly and almost entirely what the industry has become.
@surnimistwalker8388
@surnimistwalker8388 Жыл бұрын
@@onyachamp You know Rockstar has been caught how many times now distributing pirated versions of their own games. I wouldn't put it past them to do that.
@BygoneT
@BygoneT Жыл бұрын
@@surnimistwalker8388 I've literally never heard of this? When?
@Jakob178
@Jakob178 Жыл бұрын
The bubble is called normies who literally buy shit every year; the quality doesn't matter.
@minnidot
@minnidot Жыл бұрын
You really nailed some good points. CPUs used to be one of the last things we had to worry about upgrading. I posted a vid on how to lower CPU usage in Battlefield 2042, and even users with 13900K systems have commented that it helped. That's one of the top pieces of hardware available and it's struggling with a game from 2021.
@ericimi
@ericimi Жыл бұрын
That video was yours? I literally just used it the other day for Battlefield. My 7700X was using about 50 percent, then I made the user.cfg file and it uses about 33 percent. Pretty big difference.
@ralkros681
@ralkros681 Жыл бұрын
“But older games are not optimizing for new hardware” Most bs excuse I have ever heard. Had to say it before someone else did
@minnidot
@minnidot Жыл бұрын
@@ericimi that was mine. I'm so glad it helped you!
@mttrashcan-bg1ro
@mttrashcan-bg1ro Жыл бұрын
It's BS that every new GPU generation you need a new CPU to be able to utilize it. The 5900X pushed a 3080 and 3090 really well in 2020, but it's virtually a potato in new games in 2023. In some cases with my 4090 at 4K, it wasn't even an upgrade over the 3080, but now I can turn DLSS off, which barely makes a difference, because a 5900X is just a trash CPU against new games. I wanna see what Ryzen 8000 and Intel 14th Gen are like. I would like to stay with AMD, but I really want to upgrade from a 12 core to a 16 core when a 12 core is being nearly fully used in some games. I want at least 20 or 24 cores, but I doubt AMD will increase core counts, whereas Intel probably will.
@RickyPayne
@RickyPayne Жыл бұрын
@@mttrashcan-bg1ro Unless you're a heavy, heavy multitasker like a game streamer, core count for gaming isn't very important past 6-8 cores, because almost all games are designed with 8-core consoles in mind. Cores 9+ are normally only helpful for doing extra non-gaming tasks while gaming. For strictly gaming you're better off with an 8 core AMD X3D model over anything else since, at 6+ cores, processor speed and cache are more important. Once you hit 8 cores/16 threads, unless you're a developer, game streamer, content creator, Gentoo user, etc, you're better off with more cache and single core processing power over more cores and threads.
@Robwantsacurry
@Robwantsacurry Жыл бұрын
You've missed something important in the PC space: copy protection. Many PC titles are encrypted, sometimes with nested encryption. With Denuvo, for example, code is decrypted on the fly, because unencrypted code could be ripped from memory. This pushes CPU usage up massively.
@ramdom_assortment
@ramdom_assortment Жыл бұрын
More reason not to buy new AA games.
@Soraphis91
@Soraphis91 Жыл бұрын
One issue easily neglected: UE and Unity are kinda general purpose engines. Yeah, UE has a history in FPS games and is really good at that, but you can basically do any game with the engine. This means those engines have a lot of abstractions, and many game developers - when they choose one of those engines - usually don't have the manpower to go so far inside the engine code to optimize it to the last bit for the concrete game they are working on. Just check how many engine developers are hired for projects that are developed on in-house engines of AAA studios. So, with the comfort of taking a ready-to-go engine you also lose some control.
@NeovanGoth
@NeovanGoth Жыл бұрын
Totally agree. UE and Unity are awesome as they allow even smaller teams to use state-of-the-art graphical effects they could never have written themselves (and usually perform really well), but they also seem to encourage using them in improper ways.
@pliat
@pliat Жыл бұрын
DX12 can be optimised far better than DX11, but that requires time and, most importantly, skilled devs that understand low level coding. The devs just are not good enough.
@_GLXC
@_GLXC Жыл бұрын
maybe those devs are actually around, they're just still working on a game
@thechezik
@thechezik Жыл бұрын
This has everything to do with how America is degrading: those developers are not paid like they should be while the cost of living is skyrocketing, and this is happening all over different industries. Basically competition has been killed, there is no more loyalty to the consumer base, the focus is now on AI and, most importantly, the wealthy vs the middle class vs the working class. It's gone!!!
@AwankO
@AwankO Жыл бұрын
Perhaps those developers are not given enough time to optimize properly.
@henryyang802
@henryyang802 Жыл бұрын
I will not say that the devs aren't good enough; maybe the larger game-developer community is still figuring out how to use DX12 the best way possible? Fine-grained control means more fragmentation and more to learn about what's going to be best. Probably there isn't only one optimal way of using it, there are many?
@s1p0
@s1p0 Жыл бұрын
make game first
@beansnrice321
@beansnrice321 Жыл бұрын
I think the main issue in most of these games is that their animation is all being handled by one thread on the CPU. Many 3D content creation programs also have the same issue, with Maya being one of the few multi-threaded animation engines in the industry.
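A toy sketch, with invented names, of the work split this comment is describing: fanning per-character skeleton updates out to a worker pool instead of a single main-thread loop. In CPython the GIL means this exact snippet won't actually run faster; it only illustrates the shape of the split an engine would do with native worker threads or a job system.

```python
# Toy sketch (invented names): per-character skeleton updates handed to a
# worker pool instead of one main-thread loop. Illustrative only; a real
# engine would do this with native threads or a job system.
from concurrent.futures import ThreadPoolExecutor

characters = [{"id": i, "bones": [0.0] * 128} for i in range(200)]

def animate(character, dt):
    # Stand-in for evaluating bone transforms: advance every bone by dt.
    character["bones"] = [b + dt for b in character["bones"]]
    return character["id"]

dt = 1.0 / 60.0
with ThreadPoolExecutor(max_workers=8) as pool:
    done = list(pool.map(lambda c: animate(c, dt), characters))

print(f"animated {len(done)} characters this frame")
```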
@Gramini
@Gramini Жыл бұрын
I don't think that animations are that heavy. My guess is on the actual game logic (including AI) and maybe physics simulation.
@blindfire3167
@blindfire3167 Жыл бұрын
@@Gramini AI and Shadows/Lighting are the heaviest pieces on the CPU (mostly AI), while Physics depends on what type of simulation you're doing (some can be handled by mostly GPU like rain/water or destruction). Ray Tracing (although very heavy on GPU) requires a very beefy CPU to handle it since you still need the CPU to calculate light, which also is needed to calculate shadows and reflections (though reflections are usually just handled by the GPU, it can still be kept waiting for CPU to finish the other processes) and I'm not 100% on this, but it always feels like games that use RT always have it running on multiple cores but not multiple threads. I know I'm just a lowly peasant with my 8700k (I just barely graduated and still waiting for that dream job to kill all my enthusiasm for any industry lol) but every game I've tried RT on my 3080 has run horribly from core 0 taking most of the load on thread 0 and then on cores 2 and 4 only the first thread being maxed out.
@Gramini
@Gramini Жыл бұрын
@@blindfire3167 Shadows/lighting are also done on the GPU. Or do you have some specific case in mind that doesn't? Good point with the physics one. Some pieces of it can be delegated to the GPU, but not all, and not all games do that. It also taxes the GPU, which might be more important for the rendering. Also, multiple cores = multiple threads. To be more specific: to do things in parallel, a program creates a new (software) thread. It's then up to the OS to schedule that thread onto a physical core/thread of the CPU. The program can also give a hint that it should be on another core. The situation you described, with only every second physical thread/virtual core being used, is quite an interesting one. That _might_ make sense, because those two virtual cores are still only one physical core, which cannot do the same thing twice in parallel. So some programs hint that only every second CPU thread should be used. From what I know / was told by a consultant, it's usually best not to do that and just leave it to the OS to schedule/balance.
@toddwasson3355
@toddwasson3355 Жыл бұрын
@@blindfire3167 Multiple threads means running on multiple cores. There's no difference.
@blindfire3167
@blindfire3167 Жыл бұрын
@toddwasson3355 Nope, it *can* mean the same thing, but you can have something running on multiple cores without using both hardware threads of each core; it would just run on the first thread of each core alone.
@agenerichuman
@agenerichuman Жыл бұрын
I've been really impressed with how you've been covering this. You're one of the few people who is sounding the alarm without being all doom and gloom. You're right that there is hope, but you're also right that this is a problem. And it's one not being talked about enough. I feel bad for all the people building PCs now who are going to be hit with a rude awakening. Upgrading a CPU sucks for someone who isn't familiar with the process (or even someone with shaky hands). It's not hard, but things can very easily go wrong. Part of the push for GPUs was to take some of the load off the CPU. Seems like we're moving backwards. Also, I think AI upscaling is great, but I don't like the trend of using it to optimize games. Upscaling causes visual distortions. In some games it's not bad, but in others it's awful. And some people will always notice, no matter how good it is. It's clear we're at a crossroads in PC gaming. I hope you're right that good developers will rise to the top and things will get better.
@kenhew4641
@kenhew4641 Жыл бұрын
"Part of the push for GPUs was to take some of the load of the CPU. Seems like we're moving backwards." The GPU and CPU have different utilization and even the way they process data is different. The CPU is generic while the GPU is task-specific. If the GPU is tasked with rasterization calculating raytracing paths. or data simulation, they do it in many orders of magnitude than the CPU. like if rendering a single frame takes an hour using CPU, GPU can render that scene in just one second. The problem is that now games are getting more complex, more data intensive with much bigger file size hence needing even more bigger memory that can transfer data at much more higher speeds, it's not just about rendering the visual output anymore, the CPU needs to process AI behaviour and logics, simulate physics, simulate soft body collisions like clothing and hair, simulate particles systems like smoke, clouds or fluids, ALL at the same time in a typical gameplay of your typical AAA games. One single fabricated chip the size of a condom pack, is not going to be able to carry all that load, especially when gaming now is going beyond 1080p. It might be enough if you scale down to 720p, but new games coming out now don't even have the option for 720p anymore. We're not moving backwards, we're moving too fast too much ahead that the data have nowhere to go to get processed, and ended up landing on poor old already overworked CPU's doorsteps, waiting in a long queue to be processed.
@taxa1569
@taxa1569 Жыл бұрын
DLAA is the best thing to come out of the release of DLSS. Upscaling is hot garbage otherwise
@abeidiot
@abeidiot Жыл бұрын
@@kenhew4641 it has nothing to do with resolution
@Thanatos2996
@Thanatos2996 Жыл бұрын
Remnant 2 is actually less CPU bound than the first game, if you can believe it. The first was less visually impressive, but the shaders were so badly optimized on the CPU side that some areas struggled to stay above 70 on my 5800/3080ti rig at 1440 with shadows turned on. They’re both stellar games, but Gunfire could really stand to work on their optimization.
@hermit1255
@hermit1255 11 ай бұрын
I actually feel like the march AAA game devs are on towards more and more demanding games will eventually be forced to a stop, as I think most people on mid to lower-end systems will just stop buying their games. A bright future for underdog indie stuff.
@italoviturino6386
@italoviturino6386 9 ай бұрын
The "look at how real everything inside this game looks but pls ignore what is in it" approach will be replaced by games like Hi-Fi Rush, where the art style makes it age better while demanding less from the hardware.
@metalface_villain
@metalface_villain 8 ай бұрын
This has already begun tbh. Everyone seems to be playing more indie stuff and ignoring big AAA titles unless they are something as great as Elden Ring, for example. This ofc is not only because of how demanding the games are, but also because the triple-A games are becoming just a money grab full of microtransactions, while the more indie stuff focuses on a great gaming experience.
@justarandomgamer6058
@justarandomgamer6058 Жыл бұрын
If I recall correctly data centers had the same issue and they innovated by producing hardware for specifically handling the transfer of large volumes of data instead of the CPU which is more designed for all-purpose tasks.
@_GLXC
@_GLXC Жыл бұрын
you would think that with all this hubbub about "Tensor cores" or "raytracing cores" that the operation would be GPU bound, but no. v_V
@mttrashcan-bg1ro
@mttrashcan-bg1ro Жыл бұрын
You know it's sad when the latest GPU is bottlenecked by a CPU that is newer than it, every CPU bottlenecks a 4090 with RT at some point in these newer games
@BlindBison
@BlindBison Жыл бұрын
Yeah, consoles took that route too. PS5 has dedicated hardware for asset streaming and decompression. PC is getting direct storage so maybe that’ll help but basically no games even use it yet.
@abeidiot
@abeidiot Жыл бұрын
Fun fact: this has been available on PCs for a long time. It just wasn't catered to spoon-feed game developers; Linux even has a system call to handle it. Now Microsoft is adding DirectStorage to DirectX to make it easier for game devs to implement.
@anarchicnerd666
@anarchicnerd666 Жыл бұрын
Nice vid Vex :) I'm not a dev so I can't comment really, but I'm with you - things can only get better. The big thing to remember with DX11 and DX12 is the whole reason that transition happened in the first place - namely devs complaining about the anemic performance of the X1 back in the day and the API having too much overhead. That's a big reason WHY DX12 is so focused on stripping out guardrails and handing control back to developers. It's also worth noting we're in the middle of a weird transition period again, the move to AM5, Intel's paradigm of e-cores and p-cores, Windows 11 rollout, the move to DDR5 etc etc. The sheer breadth and scope of hardware available on the open market and used market for people to build systems for is awesome for consumer choice and value, but a nightmare for devs who need to optimize for an almost limitless combination of hardware. ...man I sure picked a hell of a time to build a PC and join the enthusiast community XD my poor little R7 5700X is gonna get completely outpaced by upcoming games How ya liking the 5800X3D? Very curious what your thoughts are going to be for editing with it versus the 5900X, but it's clearly a monster at gaming...
@mathewbriggs6667
@mathewbriggs6667 Жыл бұрын
I got the Ryzen 5950X over the 5800X3D. I needed the extra cores and clock speed, but the X3D poops on it lol
@mv223
@mv223 Жыл бұрын
@@mathewbriggs6667 It truly is a badass chip.
@handlemonium
@handlemonium Жыл бұрын
Lol my 10700 is gonna be bottlenecking so hard when I upgrade to the 8700XT or 5070 in 18 months.
@yonghominale8884
@yonghominale8884 Жыл бұрын
As an ancient dinosaur who played the original DOOM on a 3dfx and the OG Pentium, I can attest that things come in cycles. The issue is consoles, and when transitioning from one generation to another, it's always rough. I still have nightmares about Crysis.
@mathewbriggs6667
@mathewbriggs6667 Жыл бұрын
@@yonghominale8884 It's gonna be a while longer than 18 months.
@danos5048
@danos5048 Жыл бұрын
As I understand it, the CPU usage stat is actually based on the number of hardware threads. Each busy thread contributes to the percentage based on how many there are. 80% on a 12-core/24-thread chip means that at that moment the equivalent of about 19 threads are busy (0.8 × 24 = 19.2). So all 12 cores are actually being used, and some of those cores are also running work on their second hyper-threading/SMT thread.
@tadeuferreira5705
@tadeuferreira5705 Жыл бұрын
Wrong bro, CPU utilization is about CPU time and not about threads. Any modern OS with programs and services running in the background has hundreds if not thousands of active threads at any given time.
@cagefury3789
@cagefury3789 Жыл бұрын
@@tadeuferreira5705 You're talking about OS threads or process threads. Those are just concurrent instructions running within the same process that share memory. He's talking about hardware threads, which are also sometimes called logical cores. You're right in that it's about time, but utilization takes threads into account as well. You can have 1 thread constantly doing work 100% of the time, but your overall CPU utilization will be very low if you have a lot of threads/cores.
@maverikshooter
@maverikshooter Жыл бұрын
​@@tadeuferreira5705 50% = 12 cores are used. And 100% = 24 threads.
@margaritolimon3683
@margaritolimon3683 Жыл бұрын
@@tadeuferreira5705 No, it's about usage over time; that is why it goes up and down depending on the scene. For a normal CPU (no e-cores, the math gets harder with those), 50% being used means all the cores are busy; anything higher and it's now using hyper-threading. It's also trying to explain everything with one number. If a 12-core CPU has only 6 cores (no hyper-threading) at 100% and 6 at 0%, it will show up as 25%; all cores at 50% (no hyper-threading) will also show as 25%.
@VDavid003
@VDavid003 Жыл бұрын
50% on a 24 thread cpu could mean 100% on 12 threads with 0% on the rest or 50% on all 24 threads or anything in between
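A quick check of that point: two very different per-thread load patterns on a 24-thread CPU collapse to the same overall percentage.

```python
# Two very different per-thread loads that report the same overall number on a
# 24-thread CPU: 12 hardware threads pegged vs. all 24 at half load.
pegged_half = [100.0] * 12 + [0.0] * 12
spread_even = [50.0] * 24

for label, per_thread in (("12 threads at 100%", pegged_half),
                          ("24 threads at 50%", spread_even)):
    overall = sum(per_thread) / len(per_thread)
    print(f"{label}: overall {overall:.0f}%")
```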
@hitbm4755
@hitbm4755 Жыл бұрын
CPUs cannot become obsolete this quickly; optimization definitely needs to be improved. The Ryzen 5000 series is ridiculously much stronger than the CPUs that came before it, and it took ~10 years for an FX-8350 / i7-870 to be properly utilized. In some ways I am starting to feel DirectX 11 was a better graphics API, because graphics card vendors had to make optimizations on their side for context-switching/deferring, etc., and game developers focused on communicating with DirectX rather than optimizing it. That is how I understand it at least, but I got really frustrated with the bad CPU scaling of DirectX 11 for single-threaded bottlenecks. Still, like you said, in some scenarios developers actually pushed higher FPS or better response times with DirectX 11. Ray tracing is over-exaggerated and more of a sales gimmick compared to the benefits it brings. Yes, it is much more mathematically correct than rasterization, but the end results are not always better, or not by much.
@999JWrld24
@999JWrld24 9 ай бұрын
Fr. I have a 4080 and NEVER use Ray tracing. It’s so stupid and changed almost nothing in terms of looks and DESTROYS performance. Such a waste of your GPU.
@hitbm4755
@hitbm4755 9 ай бұрын
@@999JWrld24 that is sad to hear, I feel it must be a much more affordable technology, because it is not proprietary technology.
@vairaul
@vairaul Жыл бұрын
The problem is most games are heavily dependent on one or two CPU cores, using the others as support; a situation where, for example, cpu0 - 100% / cpu1 - 80% / cpu(2-X) - 30% is happening. I believe today's problem is a software bottleneck in utilizing most of a modern GPU.
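One way to see that lopsided pattern for yourself while a game is running is to log per-core load rather than the single aggregate number; a small sketch using the third-party psutil package (an assumption, it is not bundled with Python):

```python
# Log per-core load for ~10 seconds to spot the "one core pegged, the rest
# coasting" pattern described above. Requires the third-party psutil package.
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    overall = sum(per_core) / len(per_core)
    print(f"overall {overall:5.1f}% | hottest core {max(per_core):5.1f}% | "
          + " ".join(f"{p:3.0f}" for p in per_core))
```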
@DankyMankey
@DankyMankey Жыл бұрын
Another issue is that most popular games are not well multi-threaded, so a single thread/core of your CPU ends up bottlenecking your GPU.
@mikfhan
@mikfhan Жыл бұрын
Yep, this is the issue we've had with games since the turn of the millennium. A 5GHz boost is needed because games rely so much on a main thread, but the CPU can't boost forever. We have plateaued around 4.5 GHz now, and it will take miracles to go beyond that in a stable manner. Parallel programming is difficult. The best way to improve your gaming experience from now on is not hardware, but rejecting games that only use 4 cores effectively. Wait for performance reviews; make publishers care about optimization instead of microtransactions.
@jokerxd9900
@jokerxd9900 Жыл бұрын
I think we are in a weird state: we just jumped to next gen, and there will be mistakes and bad optimization, but after a while they will master it and things will get better. If they are open to learning and not lazy.
@deality
@deality Жыл бұрын
I have a theory that Unreal Engine and Unity are just trying to kill the gaming industry because of how unoptimized those engines are. Also, DirectX 12 is poorly optimized, which is why you get shitty fps using it.
@Juanguar
@Juanguar Жыл бұрын
@@deality DirectX 12 does not optimize anything because it offloads the optimization to game developers. Some devs even said it themselves and explained why some games ran better on 11 than 12: devs were lazy af and over-relied on the automatic optimizations that 11 offered.
@raskolnikov6443
@raskolnikov6443 Жыл бұрын
Next gen has been out for 3 years….
@jakubgiesler6150
@jakubgiesler6150 Жыл бұрын
Game engine dev here: it's hard to utilize the full potential of a computer simply because every PC is very different, and you need to deliver somewhat comparable performance on all architectures.
@prashantmishra9985
@prashantmishra9985 10 ай бұрын
How to become like you?
@TragicGFuel
@TragicGFuel 10 ай бұрын
I always wondered: why can't devs try to detect what the CPU, GPU and RAM are, and let the game choose the settings that will be best for that amount of processing power?
@jakubgiesler6150
@jakubgiesler6150 9 ай бұрын
@@TragicGFuel Absolutely, your curiosity touches on a fundamental challenge in game development. While it's true that developers can detect the hardware specifications of a user's system, ensuring optimal performance across a vast range of configurations is more complex than it may seem. Firstly, there's the issue of variability within a single type of hardware. For instance, two computers with the same CPU, GPU, and RAM might still have differences in other components such as the motherboard, storage speed, and cooling systems. These variations can affect performance. Secondly, user preferences also come into play. Some gamers may prioritize graphics quality over frame rate, while others may prefer the opposite. Developing a one-size-fits-all solution that satisfies everyone's preferences is challenging. Moreover, constantly adapting game settings based on detected hardware can introduce a level of complexity that may impact the overall gaming experience. Quick adjustments during gameplay could lead to interruptions or fluctuations in performance. Despite these challenges, many developers are actively working on solutions. Some games do employ automatic detection of hardware specifications to recommend optimal settings. However, striking the right balance between customization and simplicity remains an ongoing challenge in the dynamic world of game development. In essence, while the idea of dynamically adjusting game settings based on hardware is intriguing and has been explored to some extent, achieving a perfect and seamless solution for the diverse landscape of PC configurations is a complex and ongoing endeavor.
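A hedged sketch of the "pick a default preset from the hardware" idea discussed in the two comments above. Core count and RAM are easy to read; the GPU is the genuinely hard part and is skipped here, and the threshold values are invented purely for illustration (psutil is a third-party package and optional in this sketch).

```python
# Hedged sketch: choose a default quality preset from easily readable specs.
# GPU detection is deliberately omitted; thresholds are made up for illustration.
import os

try:
    import psutil  # third-party; only used for the RAM check
    ram_gb = psutil.virtual_memory().total / 2**30
except ImportError:
    ram_gb = None

cores = os.cpu_count() or 1

if ram_gb is None or ram_gb < 12 or cores < 6:
    preset = "low"
elif ram_gb < 24 or cores < 12:
    preset = "medium"
else:
    preset = "high"

ram_txt = f"{ram_gb:.0f} GB RAM" if ram_gb else "unknown RAM"
print(f"{cores} hardware threads, {ram_txt} -> default preset: {preset}")
```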
@mikelee9173
@mikelee9173 11 ай бұрын
Nobody is really talking about the capitalistic aspect of this. They want you to upgrade to the latest hardware, people. If a 3-5 generation old GPU/CPU combo can give you high fps on ultra settings, companies like AMD/Intel/Nvidia are not making money off of you. Once optimizations are at their peak, they will start charging subscriptions on your CPU/GPU instead of allowing you to buy it outright.
@NeoHorizonLabs
@NeoHorizonLabs Жыл бұрын
Just one line.... Big greedy companies with tight deadlines and mentally destroyed developers... It's just that they are too lazy to optimize cuz these companies want games done quick
@burnzybhoy9110
@burnzybhoy9110 Жыл бұрын
Personally I feel like PC game optimisation has become an afterthought as of late. I don't feel like DX12 is stable enough in its current form, and we should have the option to run DX11 if we choose. Also, the strange memory leaks we are suffering these days make me ask a lot of questions: is DX12 the issue, or are game devs targeting RAM and CPU? All I know is I want more accessibility with DX options.
@KainniaK
@KainniaK Жыл бұрын
Because of FSR and DLSS
@Catinthehackmatrix
@Catinthehackmatrix Жыл бұрын
How do you monitor memory leaks?
@burnzybhoy9110
@burnzybhoy9110 Жыл бұрын
@@Catinthehackmatrix To monitor memory leaks, you not only have to do your research, you also have to watch your RAM usage over time, for example via Task Manager under the Performance tab; from there you can see RAM usage climb.
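A small sketch of the same idea in script form, using the third-party psutil package (an assumption) and a placeholder process id: log the game's resident memory once a second, and a leak shows up as a figure that keeps climbing over a long session and never settles back down.

```python
# Log a process's resident memory once a second; a leak shows up as a figure
# that keeps climbing over a long session and never settles back down.
# Requires the third-party psutil package; the PID below is a placeholder.
import time
import psutil

PID = 12345  # replace with the game's process id from Task Manager
proc = psutil.Process(PID)

baseline = proc.memory_info().rss
while proc.is_running():
    rss = proc.memory_info().rss
    print(f"{time.strftime('%H:%M:%S')}  RSS {rss / 2**20:8.1f} MiB "
          f"(+{(rss - baseline) / 2**20:.1f} MiB since start)")
    time.sleep(1)
```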
@mihailmojsoski4202
@mihailmojsoski4202 9 ай бұрын
@@Catinthehackmatrix valgrind helps, tho it makes your program/game run like shit while testing because it's technically an x86_64 emulator
@MarikoRawralton
@MarikoRawralton Жыл бұрын
Tim Sweeney once said that the main failing of Unreal Engine was that everything runs on one thread, at least back when he was discussing it. He actually praised EA's Frostbite for being able to multithread logic (but admitted it was a harder engine to use).
@kevinerbs2778
@kevinerbs2778 Жыл бұрын
can the Frostbite engine use mGPU?
@progste
@progste Жыл бұрын
I believe that was UE3 though
@mttrashcan-bg1ro
@mttrashcan-bg1ro Жыл бұрын
Frostbite was good up until Battlefield 1, where it just rooted everyone's CPU, so the i7 6700K at the time was bottlenecking 1070s and 1080s. By Battlefield V there were some 6 and 8 core options around, but clock speed and IPC still mattered more, and that issue was amplified with 2042, which is able to push a 24 thread CPU to 100% at times despite the visuals not being improved at all over BF1. Frostbite is a gorgeous engine, but it's the second most CPU intensive engine I can think of, after the one the newer AC games use.
@MarikoRawralton
@MarikoRawralton Жыл бұрын
@@kevinerbs2778 If you mean multi GPU, I have no idea. I've seen conflicting reports.
@MarikoRawralton
@MarikoRawralton Жыл бұрын
@@mttrashcan-bg1ro BF1 ran great on my old PC and I was running an 8320E back then. That was a terrible CPU. Targeted 60fps on the old PS4/Xbox One and those both had notoriously weak CPUs. I think it benefited more from core count.
@dragons_advocate
@dragons_advocate Жыл бұрын
Here's one downside of these higher and higher detailed meshes and textures I have seen nobody talk about yet: the high graphical fidelity only really comes to shine when not in motion, or in very slow motion. Fast movements quickly turn all those details into noise, particularly with video compression (like, say, a YouTube video), but also 60 Hz can be low enough for our eyes to see a smearing effect. Meaning, paradoxically, with the insane level of detail games are capable of nowadays, you would need MORE fps (and a high refresh rate monitor, natch) to really enjoy it (again, mainly talking about fast moving scenes and games here). And may god have mercy on your soul if you play with motion blur on. There is a specific circle in hell reserved for those people.
@DashMatin
@DashMatin Жыл бұрын
fr
@XxWarmancxX
@XxWarmancxX Жыл бұрын
In defense of motion blur: racing games. For them specifically it's hard on my eyes with motion blur off, making my eyes ache at worst on my desktop monitor.
@dragons_advocate
@dragons_advocate Жыл бұрын
@@XxWarmancxX out of curiosity, at which framerate?
@NeovanGoth
@NeovanGoth Жыл бұрын
Having good per-object motion blur should be a basic requirement for each game. When motion blur becomes problematic it's mostly camera motion blur. Many games for example don't scale the simulated shutter speed with the actual frame rate, causing it to look perfectly ok in 30 fps, but like a blurry mess in everything above.
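A small illustration of the shutter-speed point in the comment above: a blur amount tuned once for 30 fps (a fixed 1/60 s "shutter") over-blurs at higher frame rates, while a shutter derived from the actual frame time (a 180-degree shutter) scales down with it. The pan speed and numbers are made up for illustration.

```python
# Blur length for a camera pan under a fixed shutter vs. one scaled to the
# real frame time. Values are illustrative only.
def blur_px(speed_px_per_s, shutter_s):
    return speed_px_per_s * shutter_s

speed = 3000.0               # fast camera pan, in pixels per second
fixed_shutter = 1.0 / 60.0   # tuned for a 30 fps target, never rescaled

for fps in (30, 60, 144):
    scaled_shutter = 0.5 * (1.0 / fps)   # "180 degree" shutter: half the frame time
    print(f"{fps:3d} fps: fixed {blur_px(speed, fixed_shutter):5.1f} px vs "
          f"scaled {blur_px(speed, scaled_shutter):5.1f} px of blur")
```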
@bulletwhalegames1575
@bulletwhalegames1575 Жыл бұрын
Just wanted to drop this here: modern RTX solutions are not poorly optimized. Ray tracing is extremely demanding on both CPU and GPU; the only way we can do it right now is to cleverly discard a lot of rays and bounces so that we get a somewhat decent framerate. How we do this is by calculating a structure that contains information about where geometry is located (too much info to really get into, but there are plenty of good sources). Building this structure is what is costing your CPU performance: the more geometry you need to update in this structure per frame, the more your CPU will be doing. For example, skinned meshes are extremely expensive and most of the time will be ignored (you can see this in games; a lot of the characters will not receive bounce lighting, for example). Modern games do perform well on normal settings once you disable ray tracing, most of the time. Ray tracing is not really far enough along at this point in time, since there is still no hardware to accelerate the construction of these structures (and this will likely stay like this for quite some time).
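A toy sketch of why the CPU-side cost of that structure scales with how much geometry moves: every animated (e.g. skinned) mesh needs its bounding boxes refit each frame before rays can be traced against it. Real engines do this in native code on worker threads; only the shape of the cost is the point here, and the mesh sizes are invented.

```python
# Toy AABB refit: the per-frame CPU work grows with the amount of animated geometry.
import random
import time

def make_mesh(tri_count):
    return [[(random.random(), random.random(), random.random())
             for _ in range(3)] for _ in range(tri_count)]

def refit_aabbs(mesh):
    boxes = []
    for tri in mesh:
        xs, ys, zs = zip(*tri)
        boxes.append(((min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))))
    return boxes

small_scene = [make_mesh(2_000) for _ in range(5)]    # 5 animated props
crowd_scene = [make_mesh(2_000) for _ in range(50)]   # 50 animated characters

for label, meshes in (("5 animated meshes", small_scene),
                      ("50 animated meshes", crowd_scene)):
    t0 = time.perf_counter()
    for mesh in meshes:
        refit_aabbs(mesh)
    print(f"{label}: refit took {(time.perf_counter() - t0) * 1000:.1f} ms")
```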
@FreelancerND
@FreelancerND 10 ай бұрын
So basically prebaked lighting still runs better and looks better in most cases? =)
@colbyboucher6391
@colbyboucher6391 8 ай бұрын
Yeah... this channel is just a dude who has no idea what he's talking about talking as if he's authoritative, which people love because it validates what people already feel.
@BattleToads
@BattleToads Жыл бұрын
Stop buying unoptimized games and it will change. If you keep purchasing them, it will not. You get two hours of gameplay on Steam risk-free. If it runs like booty, refund it.
@ama9385-w2g
@ama9385-w2g Жыл бұрын
I'm glad you made a video on this. I was feeling a bit sad after learning the new 4080 I bought would be held back by my CPU in ray tracing in games like Spider-Man, The Witcher 3, etc., because I bought a high power GPU so I could experience these new features, and yet I can't do that now because I need a £500 brand new CPU and platform :(
@avanx7699
@avanx7699 Жыл бұрын
I got the same problem. For me a new GPU basically means building a brand new platform as well, and the worst part is that I bought some new matching parts only a few months ago for my current GPU, which has now decided to give up on me. No matter how I spin the wheel, it sucks from every point of view right now.
@kevinerbs2778
@kevinerbs2778 Жыл бұрын
The Witcher 3 is ported from DX11 to DX12, which doesn't work well. Ground-up DX12 engine builds work better.
@deality
@deality Жыл бұрын
You need it anyway
@CyberJedi1
@CyberJedi1 Жыл бұрын
One of the biggest problems I think is that CPU advancement is nowhere near GPU advancement... especially in single-threaded performance. From the 3090 to the 4090 we got almost double the performance; from the 5800X3D to the 7800X3D it's not much more than 10%, and games don't utilize much more than 8 cores, so it doesn't matter how many you have in the CPU. Also, CPUs are kind of stuck at 6GHz; I heard somewhere that it will be really hard pushing past that.
@MitremTheMighty
@MitremTheMighty Жыл бұрын
3090 to 4090 double the performance? 😂It's closer to 60%, which is good, but nowhere near double
@korinogaro
@korinogaro Жыл бұрын
Yes and no. The main problem is that engines suck ass "out of the box" at thread utilization. Devs need to actively write code to utilize more cores and threads, but they don't give a fuck and go with the engine's default settings in as many places as possible. And the 2nd problem is that companies switched from the race for GHz to a masturbation contest over the number of cores (because after they figured out how to glue CPUs together, it is easier and gives good results in synthetic benchmarks). So it is a combination of these two problems. CPU manufacturers give us more cores, but improvements in the *umpf* of every core are not impressive, and devs don't give a fuck about optimization for more cores.
@user-eq2fp6jw4g
@user-eq2fp6jw4g Жыл бұрын
The R7 5800X3D is still insanely good value for money for budget / somewhat more future-proof 1440p gaming, taking into account how expensive the AM5 platform still is. Especially good motherboards.
@nugget6635
@nugget6635 Жыл бұрын
Actually CPU has a much better scalar performance than GPU. GPU is vector processor (highly parallel). So single thread CPU is better actually. Parallel GPU is thousands of times better
@andreabriganti5113
@andreabriganti5113 Жыл бұрын
At a certain point, just increasing the clocks will not lead to great improvements. It did happen already. Having tons of fast cache is a better solution for gaming.
@andreabriganti5113
@andreabriganti5113 Жыл бұрын
Lowering some options such as "crowd density" can help a lot in games like Spider-Man and Cyberpunk. It isn't the "ultimate" solution but it helps. EDIT: It's also worth keeping an eye on the GPU control panel, in order to see what kind of workloads are assigned to the GPU. I had such issues in a few games with my 5800X3D alongside my 4070 Ti, and a few adjustments solved those minor issues I encountered. Hopefully this will help. Have a good day.
@CollynPlayz
@CollynPlayz Жыл бұрын
What settings do I change in the Nvidia Control Panel?
@andreabriganti5113
@andreabriganti5113 Жыл бұрын
@@CollynPlayz Be sure PhysX is off or, at worst, is handled by the GPU instead of the CPU, then look into "Manage 3D settings" and crank up the video settings. This will NOT benefit the CPU directly but will make the GPU do more work, helping to balance the usage between GPU and CPU. After that, lower the amount of CPU workloads in Windows and be sure power management is disabled. This did help me, but again, it works fine when the performance difference between CPU and GPU is minor AND the CPU is almost always at 97/98%+. If, as an example, you have a Phenom and a 3090, the gap can't be helped much.
@richardsmith9615
@richardsmith9615 Жыл бұрын
@@andreabriganti5113 Would you recommend the 5800x3d as an upgrade path from a 5600g for 1440p gaming? Or do you think it's better to hold off for a future socket instead? Currently I'm using an Arc a770
@Daakor
@Daakor Жыл бұрын
Yes, use the 5800X3D; it will give you a massive improvement in your 1% lows and make games stable. Also, the 5600G is a PCIe Gen 3-only CPU, and with the 5800X3D it will be Gen 4. @@richardsmith9615
@mv223
@mv223 Жыл бұрын
@@richardsmith9615 It's worth every bit if you don't want to revamp your whole system, especially with all the issues the new processors are showing. I have the 5800x3d and the 4090, and a lot of the times it will max out the 4090. No need for anymore for a WHILE. Also, I game at 3440x1440 @ 240hz
@D1EGURK3
@D1EGURK3 Жыл бұрын
I can relate a lot to this topic... I have a 4070 Ti and a Ryzen 7 3800X... I'm CPU limited in basically every newer game... but not so much in older games...
@xprowl
@xprowl Жыл бұрын
I don't trust anyone talking about CPU usage using MSI Afterburner without displaying each individual core's usage. It's been an extremely common problem that modern games max out a single core, thus bottlenecking the rig; it's a dev issue. It should be the norm for tech channels to talk about this kind of thing so average people start to understand CPU loads.
@imnotusingmyrealname4566
@imnotusingmyrealname4566 Жыл бұрын
This is one of your best videos. Upscaling can't make the CPU process games faster.
@saricubra2867
@saricubra2867 Жыл бұрын
CPUs can't brute force bad game optimization. Just now with my i7-12700K, I can get over 117fps in Crysis 1 (also higher frames than the 5800X3D), and that is DX10.
@arioch2004
@arioch2004 Жыл бұрын
Regarding the CPU decompressing and handling textures, that is what DirectStorage with GPU decompression is meant to address: the GPU can retrieve and decompress assets without the CPU doing the heavy lifting. Resizable BAR is a separate feature that lets the CPU address all of the GPU's VRAM at once, so if you have a recent graphics card and a motherboard that exposes Resizable BAR in the EFI/BIOS, it's still worth enabling.
@imcrimson8618
@imcrimson8618 Жыл бұрын
instructions unclear, my pc turned into a nuclear reactor and now its fallout 4
@georgeindestructible
@georgeindestructible Жыл бұрын
2:47 Very well said. A lot of people think that just because they get the best CPU or GPU, especially the CPU, it's not gonna give them any issues. How inexperienced they are.
@americasace
@americasace Жыл бұрын
I am fearful that companies have inadvertently found a way to literally force gamers to upgrade their PCs far more often, including the CPU, to current gen (and hence potentially having to build a whole new PC) as opposed to just the GPU previously. So there is potential for them to partner with gaming companies (such as AMD did with Starfield) to try and keep things going the way they currently are, in order to rake in the cash from hardware sales on the back end of any new game launch.
@toddwasson3355
@toddwasson3355 Жыл бұрын
I've been around or in gaming (dev here for 20+ years) since about the beginning. It's nothing new. New, faster hardware comes out so developers naturally try to use it. That's the whole point. As a dev you have to pick a minimum spec. That goes up every year because nobody wants to support Commodore 64s anymore.
@americasace
@americasace Жыл бұрын
@@toddwasson3355 But there are ways to optimize a graphics engine to make it less demanding so more systems can run it. Such as a new breakthrough in ray tracing in UE5 that can be more broadly applied using single-point focused RT, so as not to put too much strain on the CPU and GPU. The way things are looking now, only the very top tier CPU and GPU combos can play a game on ultra settings, sometimes only at 1080p (Cyberpunk 2077's latest patch). Basically, games like Starfield and Cyberpunk 2077 with its latest patch are unplayable unless you turn RT off, and with Starfield most gamers with decent rigs can only play the game at 1080p on low to medium settings. I believe game devs are being lazy and just plugging in UE5 or another graphics engine without bothering to optimize it so it can run on older systems. And when I say "older" systems, I literally mean only 1 or 2 generations older, where a 2-generations-older PC is VERY limited, especially with all of the very latest triple-A titles.
@NeovanGoth
@NeovanGoth Жыл бұрын
Oh boy, I have news for you... The second game I ever bought was Wing Commander 3 in 1994, and it required me to buy more RAM to even be able to start it. A couple of years later you had to put a 3D accelerator card into your PC to get halfway decent frame rates and resolutions in new games like Quake 2. In 2007 when Crysis was released, _no_ PC in the entire world could run its "high" (let alone "ultra") settings at somewhat high resolutions, not even ones with a ridiculously expensive Triple-SLI setup. I do agree that some games like Starfield have a bit high requirements for the graphical fidelity they provide (srsly, in cities it performs worse than Cyberpunk 2077 with actual path tracing, while not even using _any_ ray traced effects), but I strongly doubt that this is the result of some conspiracy to sell more PCs. Those games usually run just as badly on the consoles, where there isn't even an upgrade option.
@forgerfalscher2125
@forgerfalscher2125 11 ай бұрын
There was never a time when the GPU was all that mattered for gaming... the GPU and CPU have always been important... just like you have games that rely on the GPU, you have games that mostly rely on the CPU, or games that need both.
@Varil92
@Varil92 Жыл бұрын
Man, if only I had known modern games in 2023 would start to become worse and worse optimized, I would have bought an i5-13600K instead of a 5800X3D. If this is gonna be the trend for the next few years, then it would have been the better choice 😫
@hatchetman3662
@hatchetman3662 Жыл бұрын
I know, I've been struggling with a 3700x. It often doesn't even max out my 2070 Super anymore. I can only imagine how bad it would be with a 3080. You pretty much hit on everything I've been preaching for the past few years.
@alaa341g
@alaa341g Жыл бұрын
try to max out features and graphics settings that are way heavier on the GPU, at least that way you're gonna be sure it's working 99% of the time XD; fuck the modern gaming market
@CurseRazer
@CurseRazer Жыл бұрын
Don't even think about it xd. My 4070 Ti with a 3800x is only working at a maximum of 60-70% most of the time. There are instances where it is maxed, but very few sadly. Don't even know what to buy next... a 5800x3d or a 5900x... looks like it doesn't matter
@hatchetman3662
@hatchetman3662 Жыл бұрын
@@CurseRazer Well, the 3700x and 3800x don't perform much differently in games. If I can afford it, my next upgrade is gonna be a 5600x3d or 5800x3d. It should help in games regardless. But it is disheartening to see that nothing has gotten better and there are no solutions currently.
@jordanlazarus7345
@jordanlazarus7345 Жыл бұрын
I've got a 1070 and a 3900x and even I'm not topping out my GPU in some games lol, will probably go for the 5800x3d at some point.
@hatchetman3662
@hatchetman3662 Жыл бұрын
@@aqcys6165 My PC is as "optimized" as it can be without a faster processor and GPU.
@XScythexXx
@XScythexXx Жыл бұрын
Great video and points. Growing up in a country with a poor economy, I always played on low-end PCs years behind current games, but after years of work I could finally save up to upgrade my own PC to a decent level several years ago. Nowadays, whenever a major title releases, I feel like if you don't have the latest hardware, everything I've put into my PC has just become obsolete; it's insanity. I feel like I'm back in 2010 playing random games at 800x600 just to get 30 fps, given the state of all recent titles.
@NeovanGoth
@NeovanGoth Жыл бұрын
I think that's connected to the current-gen consoles having much beefier CPUs than their predecessors. The PS4 and Xbox One were much more limited on the CPU side, so it soon became easy even for lower-end PCs to reach and surpass that baseline. A lack of GPU performance can easily be mitigated by just lowering the resolution or reducing the quality of graphical effects, but if the CPU can't keep up, there often isn't much that can be done.
@brucerain2106
@brucerain2106 Жыл бұрын
Remember the PS3 with its 256MB of RAM, and somehow it ran TLoU, GTA 5 and Uncharted? And now we have components that cost as much as a whole console and they can't even be utilised to their full potential. Wow, very cool
@slaydog5102
@slaydog5102 Жыл бұрын
I thought we were "advanced" though?...
@bearr9835
@bearr9835 Жыл бұрын
Welcome to the game industry of 2023. Spaghetti coders
@TheIndulgers
@TheIndulgers Жыл бұрын
Part of the issue is Nvidia's driver overhead with their software scheduler. Hardware Unboxed did some excellent videos about the severity of this with older CPUs. Now that many newer games are eating up additional CPU resources (doubly so for RT), Nvidia's lack of foresight is compounding the problem. This can be seen in games like Hogwarts, Jedi Survivor, and Spider-Man.
@giantninja9173
@giantninja9173 Жыл бұрын
Yeah, but Nvidia dropped the charade this gen, showing they couldn't give 2 s***s about the audience that made them, and went all in on enterprise AI cards.
@PedroAraujo-vg7ml
@PedroAraujo-vg7ml Жыл бұрын
Yeah, Starfield is only getting up to 90 FPS max in the demanding scenes with THE BEST CPUs, like the 13900k and the 7950X3D. That's actually crazy. You can't even get a constant FPS with the best hardware you can buy rn.
@deadhouse3889
@deadhouse3889 Жыл бұрын
Are you running it on an SSD? It's a requirement.
@bl4d3runn3rX
@bl4d3runn3rX Жыл бұрын
I think AM5 is a good investment. Hoping for 3 CPU gens on it. So you buy a motherboard and DDR5, which is very cheap already, and you can upgrade twice... not bad if you ask me. An interesting comparison would also be the 5900x vs 7800x3d with a 3080... can you do that?
@onomatopoeia162003
@onomatopoeia162003 Жыл бұрын
At least for AM5. You would just have to update the UEFI, etc.
@Nicc93
@Nicc93 Жыл бұрын
Going from a 5900x to a 5800x3d I've seen almost 50% higher average fps in some games, but it really depends on the game here. Higher clock speeds will benefit some games; some will benefit from the cache.
@ozanozkirmizi47
@ozanozkirmizi47 Жыл бұрын
Hard pass on AM5! I say that as an extremely happy AM4 user... I'll check out the Ryzen 8000 and Ryzen 9000 series in the future. I may consider building an additional system if I see components worthy of my hard-earned money. Until then, "Thanks! But, no thanks!" I am good...
@SrApathy33
@SrApathy33 Жыл бұрын
I got the 5800x3D on launch day. It was a massive upgrade over my 3600, boosting my GTX1080 in Cyberpunk with 50% more fps. The jump to a 3070Ti, which should be over twice the GPU, didn't improve Cyberpunk's fps by 50%. That worries me for the longevity of my 5800x3D which I planned on keeping several years. The massive depreciation on that CPU doesn't help it. Same depreciation as my 3070Ti btw. I could buy a Ryzen 7600/7700 platform with 32gb ddr5 for the value I lost on my 5800x3D and 3070Ti in one year.
@UKKNGaming
@UKKNGaming Жыл бұрын
@@SrApathy33 Buy a 6800XT, it'll work a lot better with the 5800X3D. The 3070Ti is a dying GPU. The 6800XT is competing against a 3080Ti right now for way less money.
@Revnge7Fold
@Revnge7Fold 9 ай бұрын
DUDE, this is so true! I experienced this first hand. I was on an AM4 platform with a 2700 CPU and a GTX 1070, and upgraded my 1070 to a 3070. I always skip a gen. But my gaming experience was so bad in the newer games that I played, especially Cyberpunk 2077 and Stalker Gamma. The games were just choppy with lots of frame dips, and I got a max of 40-50 fps no matter my settings. So I read a Cyberpunk reddit thread about people's performance on 3070s, and some of them with older CPUs were experiencing the same thing! There were some examples of guys with weaker CPUs who said upgrading their CPUs made a massive difference. So I decided to get myself a 5800X and OMG... it made a MASSIVE difference, basically DOUBLING my FPS... a CPU upgrade has never made such a big difference in my life!
@Twitch_Moderator
@Twitch_Moderator Жыл бұрын
*The truth is, the difference between ULTRA and HIGH/MEDIUM mixed settings is nearly indistinguishable. ULTRA is a setting for people who don't know what each setting does. I have a 3080 10GB and the 7800X3D and I have zero issues with AAA titles on HIGH with ULTRA textures.*
@Aryzon13
@Aryzon13 Жыл бұрын
It will not get better with time. Games were more optimized when DX12 and Vulkan were just rolling in, and since then it went downhill, as you mentioned. And it will only keep getting worse until people stop buying. And since people will never stop buying, it will keep getting worse indefinitely. And you will keep buying better hardware to compensate for the devs' incompetence or straight-up malicious intent.
@Boris-Vasiliev
@Boris-Vasiliev Жыл бұрын
Graphics became the most important feature for marketing new games. Need better graphics - buy new hardware. It's inevitable. DX12 and Vulkan are just tools for using new GPU features; they are not forcing devs to make ultra hi-res textures and a ton of post-processing effects. It's still possible to make low-poly models in a closed room and get 300+ fps. But marketing needs an open world, filled with people and ready for a nice-looking screenshot at any moment.
@kada0420
@kada0420 Жыл бұрын
Vulkan was great. It gave more life to my old PC at the time.
@JN-qj9gf
@JN-qj9gf Жыл бұрын
@@Boris-Vasiliev Graphics became the most important marketing feature for games 40-something years ago.
@NippyNep
@NippyNep Жыл бұрын
fake ai@Leaten
@slaydog5102
@slaydog5102 Жыл бұрын
@@Boris-Vasiliev Yeah, dimwits who care more about graphics than actual gameplay.
@astreakaito5625
@astreakaito5625 Жыл бұрын
I'm building my new 7800X3D system tomorrow, and this is why. Although you gotta remember other bottlenecks can exist: it could be a cache issue, a memory issue, a GPU memory bandwidth issue on the RTX 4000s, and sometimes the engine itself simply can't cope with poor code and will fail to use HW resources for seemingly no reason. Also, if a thread is maxed, it is maxed, and no amount of moar cores will help; it's impossible for the same task to jump to another thread that's free. That's why even CP2077, which is very well multi-threaded for a video game, still won't use your 5800 at a full 100%.
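A rough way to see why extra cores stop helping once one thread is pinned is Amdahl's law. Here is a minimal C++ sketch; the 60/40 parallel/serial split is just an assumed illustration, not a measurement from any real game:

#include <cstdio>

// Amdahl's law: speedup = 1 / ((1 - p) + p / n)
// p = fraction of the frame's work that can run in parallel, n = core count.
double amdahl(double p, int cores) {
    return 1.0 / ((1.0 - p) + p / cores);
}

int main() {
    const double p = 0.6;  // assume 60% of the frame parallelizes, 40% lives on one maxed thread
    for (int cores : {2, 4, 8, 16, 32}) {
        std::printf("%2d cores -> %.2fx speedup\n", cores, amdahl(p, cores));
    }
    // Even with infinite cores the ceiling is 1 / (1 - p) = 2.5x,
    // which is why a single maxed main/render thread caps your fps.
    return 0;
}

With those assumed numbers, 32 cores only buys about 2.4x, so the maxed thread, not the core count, sets the limit.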
@2528drevas
@2528drevas Жыл бұрын
I'm skipping this generation and riding my 5800X3D and 6900XT for at least another year. I'm curious to see what AMD has up their sleeve by then.
@NippyNep
@NippyNep Жыл бұрын
bro that can last u years @@2528drevas
@jmporkbob
@jmporkbob Жыл бұрын
One of the biggest issues is consoles, I think. The PS4/XB1 released with a laughably pathetic CPU, even at the time, much less several years later. So with that being not only the lowest common denominator but also kind of the central hardware of the industry, the CPU requirements of games became a non-issue for essentially the past decade. The PS5/XSX released with a solid CPU at the time (basically an underclocked 3700X), and it's still respectable a few years later. Given that they are targeting 60 (and sometimes even 30) fps on that CPU, it's going to be pretty difficult to hit very high fps on CPUs from around this time period. Can there be more CPU optimizations done? I strongly think so. But the generational leap as we move out of the cross-gen period is driving a lot of it.
@JaseHDX
@JaseHDX Жыл бұрын
DF made a video on the Series X CPU; it performs similarly to a 1700x, not even close to an underclocked 3700x
@jmporkbob
@jmporkbob Жыл бұрын
@@JaseHDX it's an 8 core zen 2 design, just underclocked compared to the 3700x. dunno what to tell you, it's literally that architecture lol
@ppsarrakis
@ppsarrakis Жыл бұрын
A 1700x is still like 5 times faster than the Xbox One CPU... @@JaseHDX
@mimimimeow
@mimimimeow Жыл бұрын
@@JaseHDX You can't really use the DF data; Windows and PC games wouldn't be tailored for that specific hardware config, so the Xbox CPU may be bottlenecked by other areas that Windows didn't utilize. It's like how Android runs like ass on an overclocked hacked Switch compared to a Shield, despite both having the same chip.
@ImplyDoods
@ImplyDoods Жыл бұрын
@@mimimimeow Xboxes literally run Windows already, just with a modified GUI
@eliasalcazar6554
@eliasalcazar6554 Жыл бұрын
I think after Xbox One and PS4 reach EOL we'll see some great strides in optimization. I agree also that the cost/performance of graphics isn't scaling nicely. It seems that the move toward "realism" isn't worth the hardware tax when it comes to final presentation. But, I do see GPU manufacturers serious about gaming including more specialized hardware on their chips, like Nvidia and AMD with RT Cores. Unfortunately I see this raising the price of GPUs even further in the foreseeable future. It will definitely be interesting to see how a mid-gen "Pro" refresh of the consoles will shake up the PC landscape as well. I'm guessing the consoles will be aiming at 7800xt levels of performance.
@jeanbobbb
@jeanbobbb Жыл бұрын
As a game developer I have to say that I am really lazy and love money. So yeah..whatevah.
@joeyazzata29
@joeyazzata29 Жыл бұрын
What’s up Vex I have a 5800X3D / RTX 4070 Ti I run Fortnite on UE 5 High preset with Nanite, Lumen, DLSS Balanced, at 1440p 165hz. I’m targeting a stable 158fps (165hz + nvidia reflex cap) Over the weekend, I had the chance to test a 4080 in my system Using Intels PresentMon: 4070 Ti rendering at 6.3ms = 158fps 4080 was rendering the game at 5.5ms indicating the 4080 can run the game at around 180fps However, due to the 5800X3D only being able to run the game at 7ms… I only saw 140-150fps with both GPUs The 5800X3D bottlenecked the 4080 to the point of being completely indistinguishable from my 4070 Ti Nanite and especially Lumen GI are extremely CPU demanding! You also have to consider there’s not a ton of detail going on in Fortnite compared to other games. Fortnite might be the best CPU optimized game we see on UE5 Check it out using intel’s presentmon
@joseyparsons7270
@joseyparsons7270 Жыл бұрын
what's the fps difference when you use hardware rt vs software lumen? when using hardware rt, is the cpu render time the same?
@joeyazzata29
@joeyazzata29 Жыл бұрын
@@joseyparsons7270 it’s a small difference, 10-15fps for me Either lumen is already doing most of the legwork or the rt effects themselves aren’t that demanding, not sure but it’s very light. I think AMD cards even run the RT better in this game if I recall correctly
@macronomicus
@macronomicus Жыл бұрын
It's good to get a sense of the required hardware up front before making a game purchase: see what others are saying and vocally avoid badly optimized games. That could give devs some crowd support to push back on management and budget for some proper optimizing; otherwise they're throwing away millions of potential sales.
@PoweredByFlow
@PoweredByFlow Жыл бұрын
The CPU bottleneck is the most annoying recent trend in PC gaming. It's the reason why I basically need to buy an entirely new PC just so I can upgrade my graphics card and actually utilize it properly.
@wildeyshere_paulkersey853
@wildeyshere_paulkersey853 Жыл бұрын
Yep. Same with me dude.
@portman8909
@portman8909 11 ай бұрын
Just do new builds and sell off the old. Simplest way to do it.
@celeronsuxbigtime
@celeronsuxbigtime Жыл бұрын
There is no way a 5900x is throttling a 3080 bro, try again.
@TheCSteve
@TheCSteve Жыл бұрын
Depends on the motherboard too... If you have a motherboard with a bad PCIe slot running at x2, for example... A lot of people just buy and don't read... But you are right, if the build is alright then you could even use an RTX 4090 if your PSU is a gold-rated 1000 watt... (I know you could do it with 800 watts, but some headroom is better)
@HalIOfFamer
@HalIOfFamer Жыл бұрын
The CPU being the bottleneck is a welcome change. They are cheaper, after all.
@uss_liberty_incident
@uss_liberty_incident Жыл бұрын
Good video, I'm glad you're talking about this. I run a 3090/5950x for 4k/144hz, and am very confused when I don't see my GPU fully utilized in some modern games.
@mathewbriggs6667
@mathewbriggs6667 Жыл бұрын
Same, I've got a Ryzen 5950x and an RX 6900xt. If you're having that problem you might want to get an RX card; SAM (Smart Access Memory) can't be overstated, and the drivers are way more stable on AMD. It's been shown many times that in CPU-bound games the Nvidia drivers take up a lot of extra CPU headroom. Other than that, you could try disabling SMT and giving your chip an all-core OC of 4.7/5.0 at 1.275v vcore. Try 4.6 first, as that is a 200MHz OC to start, and check temps. I'm on a 240mm AIO and I get 90c at 300 watts.
@mathewbriggs6667
@mathewbriggs6667 Жыл бұрын
Once you've played with it at a 4.6 OC, 1.275V all-core, for a few days, then if all is good after a few days or many hours of testing, you can go up by 100MHz at a time. Give it a few days of gaming or hours of testing, and if all is well with no crashing, sweet, keep going till it does crash. When it crashes in any game, always go back down 100MHz, and bam, there is your new 20% faster CPU.
@louieschneider8937
@louieschneider8937 Жыл бұрын
Disabling SMT and all-core overclocking a Ryzen is bad advice. @@mathewbriggs6667 And this isn't how you stability test an overclock. You are dumping unnecessary power and heat into the CPU when PBO2's boost algorithms will perform better in anything short of a heavy multicore workload (e.g. Blender). You can push 5.1GHz on a 5950x with proper overclocking.
@Bsc8
@Bsc8 Жыл бұрын
It's called bad optimization by devs due to marketing pressure, so they have too little time to provide good game experiences! UE5 is a CPU eater because it's not being optimized at all, due to the fact that it's not well known yet (just like early UE4 games). It's like running a demo game in the engine editor, not a final release.
@4x13x17
@4x13x17 Жыл бұрын
Man, Unreal Tournament 2004 looks gorgeous, and it's a technical miracle that it looked that good while running on literally anything at the time. That's optimization.
@mimimimeow
@mimimimeow Жыл бұрын
Most new engines are CPU eaters because lots of the logic is done in abstraction layers, so devs don't have to program everything manually - precisely the purpose of a game engine. Games are getting way more complex than they were 10 years ago too. It's a tradeoff, because if more things were optimized manually, the cost and time would go through the roof, which would be better spent on game content and QC that will bring more revenue. Alternatively you make a simpler game. Take Elden Ring vs Armored Core 6 for example. They both run the same engine, but Armored Core's linear design and non-organic graphics are easier to optimize and QC, so it runs way better than Elden Ring for a given dev cycle. The reality is game companies are corporations and they all have financial targets to meet.
@Bsc8
@Bsc8 Жыл бұрын
@@4x13x17 yes very good example!
@Bsc8
@Bsc8 Жыл бұрын
@@mimimimeow I know it works like that, but making content for a game that runs like ass is not a good thing. Let's say I'm interested in a new game and I'm thinking about buying it because I made a beast of a PC two years ago, but that game can't be played smoothly on my hardware!? I'll pass and probably never play it. What's the point of making good ideas/content for new games that people can't enjoy at all? That's the main reason why I can't get hyped for anything anymore, and when I'm wishlisting something I always have the fear of not being able to run it. _(edit) The only games I play now are the older ones I still have to play in my libraries, something that comes free from the stores, or heavily discounted._
@mimimimeow
@mimimimeow Жыл бұрын
@@Bsc8 heh, ask the genius executives that make those decisions. We should have this, we should have that, because the market research says so, here's the suboptimal budget, deliver it before the next fiscal year. It's ok if it's buggy at release, as long as we hit the revenue target first. If market analysis was spot on people will put up with it anyway. Job done.
@user-eu5ol7mx8y
@user-eu5ol7mx8y Жыл бұрын
Does this mean future graphics cards should have special accelerators for tasks currently done by the CPU?
@roklaca3138
@roklaca3138 Жыл бұрын
DirectStorage comes to mind
@Marco_My_Words
@Marco_My_Words 9 ай бұрын
I purchased a CPU with 24 cores, the Intel i9-13900KF, for this specific purpose. My objective was to ensure the CPU wouldn't become a performance bottleneck, and it's achieving that goal, but it's also overheating all the time. However, investing 600 euros in a CPU seems excessive, particularly when high-end GPUs are already priced above 1500 euros. Modern games have become increasingly resource-intensive. While I appreciate the advancements and realism in games, the escalating costs are becoming really challenging to handle. The amount spent on a top-notch gaming PC could be enough to buy a decent car.
@SkyAnthro
@SkyAnthro Жыл бұрын
Have you tried turning on hardware-accelerated GPU scheduling? It might help relieve some of the workload on the CPU ^-^
@tonkatoytruck
@tonkatoytruck Жыл бұрын
That is what I have been wrestling with for months, but for me, I play at 4K, so the CPU doesn't matter too much once you reach a certain CPU power level.
@finalsin7825
@finalsin7825 Жыл бұрын
Nvidia has a CPU bottleneck compared to AMD GPUs. I remember Hardware Unboxed did a video about this; I think it was something about driver overhead for Nvidia.
@ktvx.94
@ktvx.94 Жыл бұрын
I've found that single-core performance is important; if you disable SMT/Hyperthreading you can get more mileage out of your CPU. I did lose the ability for Windows to sleep, and load times are worse, but at least I hit 60 fps. If you're considering CPU upgrades, focus on single-core performance.
@aegis4142
@aegis4142 Жыл бұрын
yea why would devs optimize multicore cpus xD
@kenamreemas3295
@kenamreemas3295 8 ай бұрын
I remember the days when there were onboard co-processors to render graphics, when CPUs were just single-core units. The CPU had to decompress the game and furnish the memory access information, which was later passed on to the co-processor to render. There was no concept of a second pass until Nvidia introduced the trade-off, riding the success of fast PCIe Gen2 expansion slots. I think it's time to switch back to the same model, and Apple's unified memory is the proof. It's a very capable APU for the amount of power it draws from the wall.
@kenamreemas3295
@kenamreemas3295 8 ай бұрын
I'm not against PCIe; it's still a capable architecture for pulling data from an external source. Thunderbolt 4 and fast SSDs are living proof, but it should not be used for real-time data processing or rendering.
@Noobert
@Noobert Жыл бұрын
It comes down to devs not optimizing games because your hardware can handle poor code. I personally feel the standard of trying to release a new game every other year isn't going to work for devs in the near future, as the scale of games is getting grander and grander while the timeframe to develop isn't. Devs will have to make some choices, and I feel it's better to develop one title and release game-sized DLC for it, expanding on it at new-game price, if they want something that keeps improving. The other option is to get ever-expanding teams and raise the cost of new titles. But that comes with its own issues. With all this said, you can see where my opinion on poor optimization comes in. Release the title and fix it after. It costs a ton to bugtest. My specs: i9 13900k, RTX 3080Ti, 128GB RAM, 2TB NVMe. I lag on a ton of games. It's not my rig.
@nimushbimush2103
@nimushbimush2103 Жыл бұрын
I think things will get better because I don't see people going out of their way to buy new hardware for games, and I'm pretty sure that game devs know that. I feel like the problem that COVID caused hasn't been fully resolved, because we can see the devs being pushed hard and not being able to deliver well. I know there is the publisher factor in that too, but I feel like we in the game industry haven't fully recovered from COVID.
@96dragonhunter
@96dragonhunter Жыл бұрын
The key issue is that the GPU market had barely recovered from the crypto boom when COVID struck, and generally speaking a lot of people suddenly, and until now, couldn't afford a GPU upgrade.
@nimushbimush2103
@nimushbimush2103 Жыл бұрын
@@96dragonhunter agreed
@peterfischer2039
@peterfischer2039 Жыл бұрын
@@96dragonhunter also with how the gpu market is looking right now, you should not buy a gpu right now. Especially if you play on a resolution higher than 1080p, because that will need 12-16gb vram soon and in some cases already does need that. But the affordable midrange gpus don't really have that amount of vram, so you are better off waiting for the next gpu generation and hope that they ship with more vram across the board.
@96dragonhunter
@96dragonhunter Жыл бұрын
@@peterfischer2039 not a problem for me. just gonna buy rtx 4090 and i9-13900kf
@Gemini-Iceland
@Gemini-Iceland Жыл бұрын
One important point is that games are most often designed around the current consoles. The PS5 has an 8-core CPU now, so developers are taking advantage of that and usually plan around 60 fps. It used to be a lot easier to just have a 2-3x faster GPU than the current console, but having a 2-3x faster CPU than an 8-core CPU is really hard.
@deality
@deality Жыл бұрын
Yep
@skorpers
@skorpers Жыл бұрын
The core count isn't the main reason why it's hard to be faster. It's the fact that CPU advancements have been consistently slowing down for quite some time now. Clock speeds aren't increasing as quickly and IPC isn't increasing very quickly. Our die shrinks are slowing down and mattering less and less. All we can really do now is keep working on efficiency and throwing more cache at the problem. And maybe we'll see something hit 5.5-6GHz before traditional CPUs are completely obsolete.
@supercellex4D
@supercellex4D Жыл бұрын
@@skorpers time for an architecture change to more workstation-y chips like Arm, RISC-V, POWER10 and perhaps a new CISC design beyond x86?
@DravenCanter
@DravenCanter Жыл бұрын
Something 2-3x faster than a 3700x.
@skorpers
@skorpers Жыл бұрын
We're going to be at least 2x the CPU performance of the Xbox Series X CPU at the next CPU launch. The 13900k is 70% more powerful single-core-wise than a 3700x, which has better single-core performance than the Series X due to a higher boost, so we can't directly compare. But in an all-thread performance test that same 13900k is more than 2x as performant. It just doesn't translate well to gaming because of the e-cores being a part of it. The 7950x at least paints a promising picture for gaming, if devs are willing to parallelize across 16-32 threads.
@Oni_XT
@Oni_XT Жыл бұрын
I'm not a game dev but this popped into my head since I'm constantly seeing texture streaming options. Is it possible more games are integrating this and it's effectively pooping on CPUs?
@F1Lurker
@F1Lurker Жыл бұрын
Very insightful and concise video, thank you for making it
@megapunkkk
@megapunkkk Жыл бұрын
The fun part is that CD Projekt just claimed that after the update, Cyberpunk 2077 will be way more CPU heavy.
@paul_wiggin
@paul_wiggin Жыл бұрын
This is why the Ryzen 5800X3D was released. Also... the problem is not the CPU, it's the game engines that are not optimized to properly utilize the power of modern CPUs.
@Havanu81
@Havanu81 Жыл бұрын
CPU power (but also game utilisation) has exploded since the 5090. Jedi Survivor went from unplayable to quite alright when I upgraded to a 13700k. The 5800x3D is really the only AM4 CPU that is still able to play most games without any hitching/stuttering these days.
@vidmantaskvidmantask7134
@vidmantaskvidmantask7134 11 ай бұрын
Good voice and delivery. :) You are skilled. It's interesting to listen to.
@John-du2mq
@John-du2mq Жыл бұрын
This could be a good thing. The most expensive CPUs are way cheaper than the best gpus. So you can have an older 30 series gpu and a monster CPU and still have great performance for cheaper.
@dimlockfork5170
@dimlockfork5170 10 ай бұрын
true tho
@alexanderstreng4265
@alexanderstreng4265 Жыл бұрын
Vulkan, DX12 and other graphics APIs aren't for the CPU. If they were, then we wouldn't need these APIs, since we already access the CPU and RAM directly when programming. The APIs are for accessing the GPU (how to draw stuff, how to store resources in VRAM like textures and different buffers, shaders etc.). There are multiple reasons why we have DX12 and Vulkan, but mainly it comes down to better control over resources. More control means more responsibility on engine developers, but it can also mean better performance if done right. Obviously most game developers do it wrong, but there are also games that do it right, like Doom 2016 and Doom Eternal. There are also other applications that benefit from these APIs, like emulators or translation layers for Linux/Proton (dxvk and vkd3d, look them up), where we have seen performance benefits.
@loki76
@loki76 Жыл бұрын
There's definitely some BS going on lately. Like you showed, "Control" was one of the games using ray tracing the most, yet it utilized the CPU perfectly fine and hence the GPU sat at basically max utilization as well. Now, all of a sudden in the last two years, mostly this year, CPUs are tanking, especially when it comes to enabling ray tracing. Something smells funny. We can also see the CPU utilization often sit below 40-50% and the GPU at 60-70% in some games. Clearly a disconnect, and it's not that the hardware isn't capable. It's that they are not optimizing the graphics and just porting it in without fine-tuning it so it doesn't bog down.
@bling391
@bling391 Жыл бұрын
Bring back the "nerds" that did miracles with only 4MBs to work with. Now it's a bunch of diversity hires
@dedanieldd
@dedanieldd 11 ай бұрын
Just wanna appreciate your mic setup. Despite having the mic that close to your mouth, your voice sounds clear; it doesn't sound choked, compressed, or over-filtered like most live streamers I've watched.
@Dyils
@Dyils 9 ай бұрын
1:16 Um... no? They use that because they are measuring your GPU, doesn't make sense to allow your CPU to interfere with a GPU benchmark. And your framerate is the lowest of the two that the CPU/GPU can push. The two being the fps your CPU can push and the fps your GPU can push. What about the normal person with a normal budget? You just go and look at benchmarks for your GPU... then you go look at benchmarks for your CPU... and you take the lower number. That's your performance. Is that so hard? (Granted it's not a perfect method but you can't test every single possible configuration) Otherwise yeah, I'd agree that CPUs are being used more and more. And I can't quite tell for what. There's a lot of features that increase usage of both CPU and GPU a lot without improving much visually. It's almost as if you can get 95% of the visuals with only half the performance. But developers push the visuals to 100% anyway. And it eats up double the performance for only 5% extra visuals. Personally, I hate that. Keep it at 95% and give me frames instead. If something already looks really good and realistic with a million polygons, it won't look any better with 10 million, but it will eat up my frames...
@marcodoe4690
@marcodoe4690 9 ай бұрын
Loading big textures and decompressing them on the CPU is probably a big chunk of the computing already. Only a handful of games support DirectStorage at the moment. This is an issue that needs to be addressed for modern games with 2K/4K textures. There's a nice video by Acerola called "Do Video Games Fake Buoyancy?" which explains how physics is done in games. I'm no game dev so I can't really tell how sane all this is, but to me it was quite nice to see how simulations are done in games.
@vigo4ever
@vigo4ever 9 ай бұрын
Vulkan APIs just make the GPU work more, not the CPU. But having no information about a thing and speaking anyway is classic here on KZbin.
@Amariachiband
@Amariachiband Жыл бұрын
10:27 THANK YOUUUU, I've been telling people it's a waste of money to get any high-end GPU before upgrading their CPU
@Wrackey
@Wrackey Жыл бұрын
What you're looking at IMHO is resource scheduling issues: devices talking to each other, and waiting on each other for something to get done or moved around. Accessing RAM from the CPU, for instance, takes two orders of magnitude longer than performing simple arithmetic. So if you don't manage your data requests optimally, the CPU ends up waiting a lot, and the GPU will wait for the CPU. Getting a faster CPU helps a bit in that case, because all the latencies go down... and maybe it gets fast enough to saturate your GPU... but that doesn't mean that that CPU is optimally used! Many UE5 games seem to have this issue, but Starfield, for instance, seems to have it too. You don't need a very fast system to run it... but even a very fast system doesn't run it much faster. At some point, no matter how fast your GPU and CPU are, if the data is not being used efficiently, they will both be waiting on each other. Poor multithreading can have the same effect, where an action needs to finish before the next crucial thing can happen, but resources are left on the table, as many cores are waiting for one of them to finish: scheduling issues! It's not just about computing results... it's also about managing memory, cache, disk, and workloads in an efficient way, so that every compute resource in your system can be at work at all times, to minimize waiting. Cyberpunk, I think, is just not a good example for this case. It may still have scheduling issues... but it's also just REALLY heavy to run, with the number of NPCs, vehicles, and vertical real estate. This is something CDPR devs have commented on: doing open world in a wide and open space (like The Witcher) is a LOT more manageable than having the vertical component added to it that Cyberpunk has. It also usually doesn't have an issue maxing out yer GPU, with all the animated characters and vehicles around you... and above and below you... So don't go buy a new CPU as soon as your GPU doesn't hit 100%... it's not that simple. I really think it's a coding issue.
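To make the stall idea concrete, here's a minimal C++ sketch (the workloads are simulated with sleeps, not a real engine): if the render work has to wait for game logic to finish before the frame can be submitted, frame time is roughly the sum of the two, while overlapping the work brings it closer to the slower of the two.

#include <chrono>
#include <cstdio>
#include <future>
#include <thread>

using namespace std::chrono;

// Stand-ins for real work: game/simulation logic and draw-call submission.
void game_logic()   { std::this_thread::sleep_for(milliseconds(8)); }
void submit_frame() { std::this_thread::sleep_for(milliseconds(6)); }

int main() {
    // Serialized: the "render" part idles until game logic is done.
    auto t0 = steady_clock::now();
    game_logic();
    submit_frame();
    auto serial_ms = duration_cast<milliseconds>(steady_clock::now() - t0).count();

    // Overlapped: run the next frame's game logic while submitting the current frame.
    auto t1 = steady_clock::now();
    auto logic = std::async(std::launch::async, game_logic);
    submit_frame();
    logic.wait();
    auto overlapped_ms = duration_cast<milliseconds>(steady_clock::now() - t1).count();

    std::printf("serialized: ~%lld ms/frame, overlapped: ~%lld ms/frame\n",
                (long long)serial_ms, (long long)overlapped_ms);
    return 0;
}

Same total work, very different frame time; that's the scheduling point: a faster CPU shrinks both numbers, but only better overlap fixes the waiting.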
@remus907
@remus907 Жыл бұрын
What surprises me the most is that as more and more cores become common, new and updated game engines aren't utilising them and are mostly using 4 or fewer cores. I've got a 3900x, and it's still a powerhouse to this day, except for games. 12 cores ain't doing shit when games aren't using them.
@barbagrossa
@barbagrossa 9 ай бұрын
This reminded me that GPU hardware video codecs are not really meant for heavy encoding. They are meant to be dedicated hardware for video codecs so your CPU is free to keep the framerates high. Hardware codecs prioritize low latency and high throughput first, image quality and file size later. They are designed for decoding and live encoding. They are also good if your CPU is very weak (few cores, low power). Once you get a powerful CPU, there are cases where software H264 and H265 encoding matches or beats them, usually with better image quality.
@Nukeaon
@Nukeaon 10 ай бұрын
thank you for this video! new sub :)
@g0odnite
@g0odnite 10 ай бұрын
Remember the times when we used to play games at 25-30 fps and were pretty happy with it? Yeah... those were the good times. I used to play GTA San Andreas, Need For Speed Underground 2, Hitman Blood Money, Call Of Duty Modern Warfare 2, Splinter Cell Chaos Theory, etc. a lot back then.
@metalface_villain
@metalface_villain 8 ай бұрын
I think since gaming became so big, much of it became about money. The devs are pushed to make stuff as cheap and fast as possible; that's why we got lazy features like upscaling and why 90% of the games that come out are released unfinished. So I personally feel that you are spot on with all of this being because of bad optimization.
@barneycalhoun9064
@barneycalhoun9064 Жыл бұрын
"The problem is modern (read: currently made and used, but otherwise outdated) consumer software not taking advantage of newer instructions. Most software, including games, is written using SISD instructions. Single Instruction, Single Data. Intel has made instructions which contain small algorithms to provide more breathing space in cache. It has also provided SIMD instructions such as MMX, SSE (2, 3, 4.1, 4.2), AVX, AVX2, and --with Skylake-EP-- AVX 3.2. MMX allowed 2 32-bit values to be manipulated and was limited to integers. SSE allowed floats to be included and also provided string processing extensions, as well as more utility that used to take instruction loops 30+ deep. AVX increased these limited to 4. AVX2 increased that to 8, and AVX 3.2 will increase that limit to 16. Those much older SISD instructions already take only 1-2 or an otherwise minimal number of cycles (integer divide takes 20 something iirc, but that is because bitwise integer division requires time linear in the number of bits. The cycle count increases for longs and long longs which are 64 and 128-bit ints). Basically, those instructions already go as fast as they can. Thus, for that software (often called legacy software if you're in a circle of performance-oriented programmers), all Intel can do now is increase clock speeds, tighten cache access times, and tweak the CPU pipeline. There's no significant performance improvement coming, period. Denard Scaling died at 90nm, so frequencies will not be making significant jumps when we're already in the low to mid 4GHz range, especially since heat density is becoming ever more of an issue. Software has to be rebuilt with performance in mind. Sometimes this can be achieved with a fresh recompiling, but for real gains you have to insert intrinsic functions and data alignment directives." - Patrick Proctor
@WyFoster
@WyFoster Жыл бұрын
CPU bottlenecks have been a huge thing for a long time. I remember trying to play Skyrim on an AMD 1100T and suffering.
@pablolocles9382
@pablolocles9382 Жыл бұрын
It's called: bad optimization.