We owe you an explanation... Our Ryzen 7950X3D Review is Late but Worth the Wait

1,997,263 views

Linus Tech Tips

1 day ago

Comments: 3,600
@LinusTechTips
@LinusTechTips Жыл бұрын
What do you guys think of the 7950X3D? Let us know below! Buy an AMD Ryzen 9 7950X3D: geni.us/aeJ3Ava Buy an AMD Ryzen 9 7950X: geni.us/sKLw Buy an Intel Core i9-13900K: geni.us/2nfAP Buy a Gigabyte X670E AORUS Master Motherboard: geni.us/hnnP9hA Purchases made through some store links may provide some compensation to Linus Media Group.
@Filnaei
@Filnaei Жыл бұрын
Typo alert!
@vsaucelover9660
@vsaucelover9660 Жыл бұрын
The 7950X3D, especially in Europe, is not worth it right now: the most powerful Intel CPU variant can cost less than 700 dollars, while the 7950X3D ranges from €880 at the lowest to €1,200.
@afif381
@afif381 Жыл бұрын
​@@Filnaei Lol
@GregoryCunningham
@GregoryCunningham Жыл бұрын
Your excuse for delaying your review is pretty pathetic. It should not be your job to display AMD in the best light possible (or any company). I would hope your goal is to be a consumer advocate above all.
@kaustubh_haha
@kaustubh_haha Жыл бұрын
team amd cuz i have a ryzen 3
@notapplicable7292
@notapplicable7292 Жыл бұрын
This level of technical detail really shows how the lab is already elevating the reviews
@logunalberding8870
@logunalberding8870 Жыл бұрын
@@p-__ Gratz
@Beano__
@Beano__ Жыл бұрын
​@@p-__ I can confirm this is true
@masterloquendo0
@masterloquendo0 Жыл бұрын
@@p-__ based
@alexholiday441
@alexholiday441 Жыл бұрын
@@p-__ Hope your diet improves.
@Valkyrien04
@Valkyrien04 Жыл бұрын
Exactly, this review validates everything they're trying to do with LTL
@SixFox
@SixFox Жыл бұрын
Linus, can you please include games like DCS, Microsoft Flight Simulator, and other sim games in your testing? The type of work these games do is a bit different from most other games and could give us a better representation.
@lithium5568
@lithium5568 Жыл бұрын
I second this, especially for 3D vcache CPUs 👍
@lotan7681
@lotan7681 Жыл бұрын
3d vcache is huge for dwarf fortress and crusader kings as well.
@LizardDoggo
@LizardDoggo Жыл бұрын
Yes, and other games *without* built-in-benchmarks?
@Guffz
@Guffz Жыл бұрын
DCS's biggest problem is that it only has single-core utilization. They currently have a multi-core engine out in beta, so it should become a much smoother experience on maps with a lot of ground/air activity once they get the kinks worked out.
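The single-core ceiling Guffz describes has a well-known limit: even a multi-core engine only speeds up the fraction of frame work that actually parallelizes (Amdahl's law). A toy Python estimate, with an entirely hypothetical parallel fraction chosen for illustration:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup when only part of the work parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical: if 60% of a sim's frame work could move off the main thread,
# 8 cores would give roughly a 2.1x speedup -- nowhere near 8x.
print(round(amdahl_speedup(0.60, 8), 2))
```

This is why a beta multi-core engine can feel underwhelming at first: the serial remainder, not the core count, dominates the result.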
@chrisc1140
@chrisc1140 Жыл бұрын
I keep thinking a massive rocket in KSP could do well as a stress test :p With MechJeb, a standard rocket, and a fixed game version, it should be very easily reproducible too.
@JB2X-Z
@JB2X-Z Жыл бұрын
The power draw is most definitely the most impressive takeaway from all of this. It really is a shame that AMD isn't using that as its podium to advertise this processor.
@ZealothPL
@ZealothPL Жыл бұрын
They probably will do some crazy thing for steam deck type chip
@voided_sun
@voided_sun Жыл бұрын
It would be an actual selling point, considering intel dropped the ball on efficiency so very hard.
@TheEightfoldPath_
@TheEightfoldPath_ Жыл бұрын
Yep. My regular 7950x was nice to have this winter, but now that summer is knocking I'm sweating a bit. (Both figuratively and literally)
@Arelias95
@Arelias95 Жыл бұрын
@@voided_sun Nvidia too, with their GPUs, but that is another story I guess
@uncrunch398
@uncrunch398 Жыл бұрын
@@TheEightfoldPath_ You can set a power draw limit in BIOS. Mine is a 3950x. No cap but heat limit set to 95c. Cooler is Arctic Freezer II 240mm. Should be the 360, but I thought the 240 would fit inside the case. It doesn't. It doesn't seem to have a noticeable effect on room temp. I just want whatever performance I can get without sudden seizing. I'd set power draw limit if I notice it causes the room to be uncomfortably warm. With the temp cap, it's still going to put out a lot of heat with a good cooler constantly taking it away.
@harryw9268
@harryw9268 Жыл бұрын
Linus going to elevate reviews to the point where his audience holds off on purchasing until the LTT review is out rather than just the release date
@diegoaccord
@diegoaccord Жыл бұрын
The problem with that is that if you don't buy on day one, you can't get it until it's old news.
@Neomeniaaa
@Neomeniaaa Жыл бұрын
@@diegoaccord "old news"? This chip will be relevant for at least another year, and it's only been a month since release. Calm your horses lol
@MeTaLlProZ
@MeTaLlProZ Жыл бұрын
​@@diegoaccord 99.9% of all people who buy pc components dont upgrade on each single new release
@IamKingSleezy
@IamKingSleezy Жыл бұрын
if Linus and his crew, Steve at Gamers Nexus, and Jay say it's shit, it's probably shit.
@Anankin12
@Anankin12 Жыл бұрын
@@diegoaccord what? 5800X3D is still a strong competitor NOW, 1 year later, on an older platform. You save 100€ on CPU, 200€ on Mobo, 100€ on memory and get comparable performance to the newer stuff. It loses? Yes, but not by much. You get about a 300€ saving, which is enough to buy a 3060 12G or a RX 6650XT, or to bump your GPU budget 1 or 2 tiers above where it was. And even then, you could have bought it in October and you'd have got the best CPU possible, even after 6 months from launch. This can be "old newsed" only by the 7800X3D at this point, or next gen Intel or AMD that's coming in 7 months at the earliest, and even then considering there won't be a ram spec overhaul nor a pcie platform bump it will probably still be a massive competitor.
@williamheurtley5806
@williamheurtley5806 Жыл бұрын
This video is incredibly thoughtful and nuanced in the way that it delivers information. Anthony is credited as writer, and he and anyone else who worked on the script deserve huge props. Well done.
@GregoryCunningham
@GregoryCunningham Жыл бұрын
You think hiding a review of a poor product for a month is nuanced huh? 🤡
@Stinggyray
@Stinggyray Жыл бұрын
​@@GregoryCunningham spoken like someone who doesn't understand nuance
@GregoryCunningham
@GregoryCunningham Жыл бұрын
@@Stinggyray Nope, just someone who isn’t dumb enough to buy a bad product. By all means though, please buy this. A CPU that relies on Windows Game Bar for its scheduling. 😂
@tvthecat
@tvthecat Жыл бұрын
​@@GregoryCunningham clown
@kjaesp
@kjaesp Жыл бұрын
@@GregoryCunningham I dislike the "touch grass" crowd, but you seem so invested in something that doesn't even exist. Even this video's conclusion is not that positive, and it's not "hiding" a review; it's validating the results. Who said anything about buying the chip? Not anyone in this comment thread.
@OiZoProduct
@OiZoProduct Жыл бұрын
I'm really impressed by the efficient operation of the AMD 7950X3D; it's a BIG contrast to what Intel is doing, ramping up the TDP and pushing the CPU to its thermal limits to get every last drop of performance. Personally, I like the AMD approach. In my mind, more performance for less energy draw is more impressive than more for more.
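The "more for less" argument is just a performance-per-watt ratio. A minimal Python sketch; the FPS and package-power numbers below are made up purely for illustration, not taken from the video:

```python
# Hypothetical figures for illustration only -- not measured results.
results = {
    "7950X3D": {"fps": 240.0, "watts": 88.0},
    "i9-13900K": {"fps": 245.0, "watts": 200.0},
}

def fps_per_watt(fps: float, watts: float) -> float:
    """Gaming performance delivered per watt of package power."""
    return fps / watts

for name, r in results.items():
    print(f"{name}: {fps_per_watt(r['fps'], r['watts']):.2f} FPS/W")
```

With numbers like these, a chip can "lose" a raw FPS comparison by 2% while delivering more than double the frames per watt.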
@Silentguy_
@Silentguy_ Жыл бұрын
That's my mindset. When getting a new CPU I'd ideally like to keep the same cooler that I've been using forever. Getting something like a 13900K all but requires an AIO cooling solution now. Meanwhile AMD is putting out this absolute beast of a CPU that operates in the same range as a higher-end i5
@Koreryn
@Koreryn Жыл бұрын
Sounds like cope. Better to just go with the 7950X. It is still solid and doesn't have all the weird problems the X3D has for barely any improvement on gaming performance. Especially if you are running 4k resolution. Which my new high end comp will be doing. I also hear from other reviewers that it runs bad with indie games as it relies heavily on Xbox game bar to perform well. Which I absolutely hate. This isn't an ad for Intel as much as it is a warning to just stick to the tried and true normal chip.
@Koreryn
@Koreryn Жыл бұрын
@@Silentguy_ No need to get an Intel chip. Refer to my last comment. Also if you are getting a top tier CPU you might as well get a liquid cooling system.
@jamesparker7939
@jamesparker7939 Жыл бұрын
This level of care in a review of a product is exactly the kind of thing that gave you 5,000 additional Floatplane subs and a community that cares, not some bad actor's attempt at bringing you down.
@EnergeticSpark63
@EnergeticSpark63 Жыл бұрын
hey
@mikeb3172
@mikeb3172 Жыл бұрын
LTT here is the bad actor. All it would have taken is testing MS Flight Sim with half the cores disabled, to show the good in the product.
@Jessev741
@Jessev741 Жыл бұрын
@@mikeb3172 lmfao
@Setrany
@Setrany Жыл бұрын
@@mikeb3172 Because the 15.3 million people are all waiting on the edge of their seat to play *checks notes* MS Flight Sim. LoL.
@Mobay18
@Mobay18 Жыл бұрын
lmao bot account detected. Top comments on this video are all from somebody named FirstnameLastname1234
@Surdai
@Surdai Жыл бұрын
This is what good judgement looks like. LTT is a great example of a responsible and thorough tech reviewer. Instead of jumping the gun and giving a review based on a faulty product, he takes the time to properly test and evaluate the product to ensure that his audience receives accurate information. This level of attention to detail is crucial for those who rely on tech reviews to make informed purchasing decisions. By prioritizing the accuracy of his reviews, LTT has earned the trust and respect of his audience, setting him apart as a reliable source of tech knowledge and insights.
@Savirezz
@Savirezz Жыл бұрын
Damn straight
@lilkidsuave
@lilkidsuave Жыл бұрын
@@p-__ nah bro u are on something for thinking of this
@zodwraith5745
@zodwraith5745 Жыл бұрын
That can go both ways. Make sure it's right but that can also mean hiding faults from the viewer to protect a sponsor, _especially_ when AMD is a new major sponsor. Kinda sus if you ask me when that chip could have just as easily ended up in a consumer's hands instead of LTT.
@OwenCuff
@OwenCuff Жыл бұрын
This really looks like something ChatGPT wrote
@Surdai
@Surdai Жыл бұрын
@@zodwraith5745 The possibility is there, but at least he doesn't go straight to praising how good the CPU is and licking AMD's ass like a lot of people are. And he's still completely honest in this review, although suspiciously late.
@sachinkendre5314
@sachinkendre5314 Жыл бұрын
Man, what a detailed analysis. They waited 4 weeks to post the review, but dude, what an amazing job. This shows the dedication these guys have, having to maintain so many variables while testing: Windows install, RAM, SSD, motherboard, coolers, GPU, drivers... Hats off, man. Keep it up.
@That_one_cowboy
@That_one_cowboy Жыл бұрын
Gotta love the diligence and care with everyone at LTT, not releasing videos until their knowledge of the product is both accurate and precise.
@highvis_supply
@highvis_supply Жыл бұрын
If you constantly retest your hardware for subsequent reviews, it would be nice to see how hardware evolves over time in terms of performance
@Tupsuu
@Tupsuu Жыл бұрын
​@@p-__ok
@xertza
@xertza Жыл бұрын
@@p-__ proof
@malikbelhabri5598
@malikbelhabri5598 Жыл бұрын
It doesn't; they made a video in the past on an old GPU, and the performance was all within the margin of error.
@brantwedel
@brantwedel Жыл бұрын
Since the hardware is on the shelf between tests, it won't... It would be interesting to do a longevity test, though, with a constant (x) hours-a-day load.
@Schweinenackensteak
@Schweinenackensteak Жыл бұрын
@@brantwedel der8auer did that one already, I think
@TheRealMefi
@TheRealMefi Жыл бұрын
What would we do without LTT's testing Team.... I'm so glad that they had the idea to make this testing lab a reality.
@jotselmao
@jotselmao Жыл бұрын
womp womp
@pupsaderpupin5627
@pupsaderpupin5627 Жыл бұрын
Right because no one else is doing what LTT does.. right
@moradelshorbagy4173
@moradelshorbagy4173 Жыл бұрын
its alright, I forgive you.
@mine_crafting
@mine_crafting Жыл бұрын
Fr we understand
@sohu86x
@sohu86x Жыл бұрын
I don't
@BirdThatEatsPrometheussLiver
@BirdThatEatsPrometheussLiver Жыл бұрын
I don’t either, this is preposterous! >:(
@Josh_2992
@Josh_2992 Жыл бұрын
I don't. I can't believe linus would do this. He's obviously an Intel shill. I can only imagine how deep this rabbit hole goes
@JapanSpawns
@JapanSpawns Жыл бұрын
Linus is my father
@timothyreeves615
@timothyreeves615 Жыл бұрын
This video coming out just weeks after AMD picked up the upgrade series is why LTT continues to be one of my most trusted tech sources across all mediums.
@gingaming_gg
@gingaming_gg Жыл бұрын
lol they are very clear that sponsorship will never affect their reviews. And if it does, they'll ditch the sponsorship before they ditch the review.
@PacMonster0
@PacMonster0 Жыл бұрын
I mean...if you watch the entire video, Linus is actually praising the CPU. What he's criticizing is AMD's misplaced marketing. AMD positioned this CPU as the best gaming CPU when it's actually the best performance per watt CPU.
@jmctavishiii
@jmctavishiii Жыл бұрын
​@@PacMonster0 makes sense. You're getting 7950x performance at much lower power consumption (and therefore, likely less heat). So it deserves some praise for performance. Whether to purchase it or not, he said "probably not" up to "maybe." That's pretty unbiased.
@arnabbiswasalsodeep
@arnabbiswasalsodeep Жыл бұрын
@@PacMonster0 Ah yes, the classic praise of "if u want benefits of these 3 specific things together" while "giving up 3 things" is praise.
@ouki4925
@ouki4925 Жыл бұрын
You mean his channel is now AMD biased. That's not a good thing unless you are a fanboy.
@Wignut
@Wignut Жыл бұрын
The efficiency shown in the later part of the review is crazy. If this new architecture is expanded on, we are going to be eating good on the CPU world
@MattisProbably
@MattisProbably Жыл бұрын
It's almost like having a lab that tests everything really thoroughly instead of just repackaging what the manufacturer tells you makes reviews way more detailed and complex! :)
@slartibartfast2649
@slartibartfast2649 Жыл бұрын
@@p-__ ikr I always thought Linus' farts were second rate.
@kel1590
@kel1590 Жыл бұрын
@@p-__ i need to get the sniffer on this case to fact check
@lukes401k
@lukes401k Жыл бұрын
@@kel1590 sorry, sniffer lab results are only allowed to be checked by certified sniffologists
@chancepaladin
@chancepaladin Жыл бұрын
bingo, 100%
@timothymatthews1653
@timothymatthews1653 Жыл бұрын
Appreciation for good work is fine and all, but LTT is not the only place where you can find detailed and accurate reviews, and you should be checking more than one so you can get a bigger picture and/or spot inconsistencies.
@Hoogoh
@Hoogoh Жыл бұрын
Linus just has a good heart and as a consumer, I’m grateful for his team that took the time to test over and over again. They packed hours of testing into a technically dense, articulate, comprehensive review. I recently spent 100 bucks on LTT merch, even the buying experience of the desk pads is amazing.
@ZondaRbg
@ZondaRbg Жыл бұрын
What did they actually do? Other people run tests on a huge number of games; he just did 7 random ones. And who actually cares about 4K? Everyone knows the GPU is the limit there, not the CPU.
@toytulog576
@toytulog576 Жыл бұрын
@@ZondaRbg In this era, some people claim a 2-3 year old CPU = bottleneck with their new GPU. I have a 5600X and then upgraded to a 5800X3D (double the price), but in the game I play the FPS gain is like 1-5 FPS at 3440x1440 resolution. So pretty much it's like $50 = 1 FPS. It's not worth it at all :(
@3polygons
@3polygons Жыл бұрын
@@toytulog576 Where the advances are seen is in productivity, especially doing 3D graphics (and even 2D, especially video), as every notch they improve means less time needed for 3D rendering or video rendering/export. These also let you handle larger 3D scenes while editing, etc. But for gaming... yeah... a 5600 is still a heck of a CPU for most games.
@ZondaRbg
@ZondaRbg Жыл бұрын
@@toytulog576 Why would you buy a 5800X3D for games where the 3D cache doesn't help? I got a 5800X3D because I get twice the FPS compared to my old 3600X in WoW and other MMORPG games.
@MrDoebs
@MrDoebs Жыл бұрын
have they fixed delivery times of those desk pads tho lol
@haywoodjay385
@haywoodjay385 Жыл бұрын
This review was astounding! It's very apparent how far the new lab has come and its ability to get to the bottom of things. What an incredible resource you guys have become over the years. Thanks!
@jimatperfromix2759
@jimatperfromix2759 Жыл бұрын
Yes, very good job by LTT of both performance testing, as well as explaining the actual phenomenon going on, which indeed is rather complicated due to the complicated asymmetric architecture that AMD used to try to compensate for the fact that putting the extra L3 cache on both CCXes just wasn't going to work (too-hot temps, lowering clocks on both CCXes). Let me add some extra observations on what AMD actually did here and its consequences. First, the so-called AMD "solution" (to the problem of really needing two CCXes with extra L3 cache, but that being impossible) is a giant kludge of a hack. Part of the problem may stem from the fact that although Microsoft worked hand in hand with Intel to make sure that they patched the Windows Scheduler to work seamlessly with hybrid Intel CPUs containing both Performance and Efficiency cores, it appears to me that AMD got less coordination and less help (than Intel did) from Microsoft in terms of adapting the Scheduler to handle the asymmetric 3D V-Cache in AMD 7950X3D and 7900X3D chips. The notes from AMD say, "When a game is detected ..." and I ask, how??? Apparently through some giant kludge, since they later note that it's "Down to a simple integration with Windows Game Mode and Xbox Game Bar." Apparently some funky parameters get set there, and then the Scheduler decides how to place threads onto cores (in either CCX) based on those settings plus certain statistics. I suspect they don't do a good enough job of collecting such per-thread statistics, and that might be part of the problem. Continuing the AMD notes: "... the game is restricted to a single CCX (by parking the other CCX) to reduce latency and increase the frame rate. When this happens, the game is restricted to the CCX with the higher game performance - almost always the CCX with the larger cache - increasing performance even further." I question whether that kludge is even working properly in the exhaustive tests that LTT conducted.
For instance, how does it determine which CCX is giving (i.e., has been giving) the so-called higher game performance? Sounds like they're doing this dynamically based on statistics, but what statistics? Who knows, they might even be parking the wrong CCX!!! What if their statistics are messed up, and the statistics wrongly "say" that the non-extra-L3-cache CCX "has been giving higher game performance"??? That might well be true, since it's slowing down the extra-L3-cache CCX cores to keep them from overheating, and if the extra cache is not a huge benefit to some given game, then the non-extra-L3-cache cores (the "frequency" cores, so to speak) might be running faster in some sense. So who knows, they might be parking all the cores in the CCX with the extra L3 cache and running exclusively on the no-extra-L3-cache cores. That would essentially demote a 7950X3D to a 7700X. We can't be sure, but this might be happening. Just one extra reason why the 7800X3D is a much better gaming CPU than the 7950X3D (since obviously a 7800X3D is faster than the 7700X that the 7950X3D might get demoted to). Even if the Scheduler does (by what magic I don't know) make the correct decision and run it all on the extra-L3-cache CCX cores, it might be slowing those cores way down to prevent overheating, such that the advantage of the extra cache is completely offset by the slowing down of the core clocks. This is probably why we see the wattage go so low in the test results. The AMD kludge basically throws away the 8 cores without the extra L3 cache, and then slows way down the 8 cores with the extra L3 cache, such that the slowdown completely offsets the gain that the extra L3 cache otherwise might have provided. So perhaps in the end it makes no difference which of the CCXes gets parked - it's maybe equally bad for the game no matter which one you park and which one you use. AMD engineering needs to determine for sure (in games, testing a lot of them) which CCX runs the game(s) faster.
Once having done this, the existing code is revealed as a bug: you don't decide which CCX to use based on some dynamic statistics (that are probably complete BS in any event), but rather you always use the CCX that your lab tests have proven to be fastest. Probably that's going to be the CCX with the extra L3 cache, but AMD needs to do extensive lab tests to prove that. Let's say that's the case. Then you always use the CCX with the extra L3 cache, and always park the other CCX's cores that don't have the extra L3 cache. In other words, the current algorithm is most probably a bug.
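The parking behavior described in those AMD notes can be sketched as a tiny decision function. This is a toy Python model of the policy as the comment describes it; the core ranges, threshold, and `is_game` flag are illustrative assumptions, not AMD's actual scheduler code:

```python
# Toy model of the CCD-parking policy described in AMD's notes.
# Core ranges and the threshold are hypothetical illustrations.
VCACHE_CCD = set(range(0, 8))    # cores 0-7: extra L3 (the "cache" CCD)
FREQ_CCD = set(range(8, 16))     # cores 8-15: higher clocks (the "frequency" CCD)

def allowed_cores(is_game: bool, utilization: float,
                  unpark_threshold: float = 0.85) -> set:
    """Which cores the scheduler may place threads on.

    Games are restricted to the V-Cache CCD (the other CCD is 'parked'),
    unless thread utilization is high enough to unpark it again.
    """
    if not is_game:
        return VCACHE_CCD | FREQ_CCD   # normal workloads: all 16 cores
    if utilization >= unpark_threshold:
        return VCACHE_CCD | FREQ_CCD   # busy game: unpark the second CCD
    return VCACHE_CCD                  # typical game: cache CCD only

print(sorted(allowed_cores(is_game=True, utilization=0.4)))  # cache CCD only
print(len(allowed_cores(is_game=True, utilization=0.95)))    # all 16 unparked
```

The comment's complaint maps directly onto this sketch: everything hinges on how `is_game` is detected (Game Bar integration) and on what statistics feed the choice of which CCD survives.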
@jimatperfromix2759
@jimatperfromix2759 Жыл бұрын
Another potential problem is lack of coordination between a game program per se and the above-described (probably faulty) mechanism for parking a full CCX worth of cores. What if the game program was written to be very efficient and use all the cores available to it? That's how I would write my game program. I ask Windows, how many cores do you have in total? Oh, you say you're a Ryzen 9 7950X3D and you have 16 cores? OK, great, then I will create a total of 16 threads (sharing the same memory space) to be as efficient as possible in using all resources available to my game. [Note: At most I might refrain from using one core, leaving that for system programs to use. But let's say I use all 16 cores, launching a total of 16 threads.] But now AMD, in conjunction with the Windows Scheduler and some funky kludgey settings (in Xbox Game Bar and Windows Game Mode), tries to outsmart me (the writer of my game program) by parking 8 out of my 16 cores that I paid for. Wait a minute!!! That's right, I paid for all 16 cores and now I'm trying to use them. Furthermore, I paid about $100 extra for some of those cores to have extra L3 cache. And now you (Windows and AMD) cavalierly decide to throw away about half of what I paid for??? I think I would demand my money back, and spend it on a 7800X3D instead!!! So I at least partly question whether the "park half of your cores" strategy makes any sense at all. In the case I'm arguing here (where I cast myself as the author of some hypothetical game in which I go to explicit steps to use all of my cores, but then AMD + the Windows Scheduler steals 8 of those 16 cores from me), I claim that it would have been better if AMD + the Windows Scheduler had let me keep using all of my 16 cores.
Yeah sure, some cores would be accessing some (regular or extra) L3 cache off the opposite CCX, but that's just a minor delay subtracted from an almost-doubling of the cores I'm attempting to use to do the work my game code must do (to hand off the right stuff to the GPU). I say almost doubling, since running all 16 cores does slow the clocks a bit. But isn't it better to run my code on 16 cores all running at 4 GHz than on only 8 cores running at 4.5 GHz??? Of course it is. The problem is, it is impossible (barring use of perhaps some even bigger kludge) for the game program to optimally communicate with the Scheduler and the kludge settings. Well, actually, maybe something like that is exactly what's needed, but the devil is in the details. But this brings up another test that LTT should have added to their already great testing efforts (perhaps adding another week to the 4 weeks they actually used up for testing). That is, try the same tests once more, but this time doing whatever it takes to totally disable the core parking mechanism. Those results might be interesting. BTW, it does state in AMD's notes that "If thread utilization is high enough, in the case of multitasking, the cores on the parked CCX will be automatically enabled as needed." So maybe if that feature works properly one wouldn't need to disable core parking? The above-mentioned extra tests might have answered that question. Another possibility might be cycling back and forth between core parking or not. What if AMD parks cores because it notices a game is running, but then, since it's my game that uses all 16 cores in 16 threads, thread utilization is high, so it unparks the 8 cores in the slow CCX, and for a bit I get all 16 cores on the job; but then maybe there's a less challenging spot in the game such that CPU utilization goes down, and it reparks 8 cores, and then keeps vacillating back and forth between parked and unparked cores.
Basically, in the end it might turn out that the core parking business was just a bad idea in the first place. Finally, I'll note that the next comparison that LTT should do (in a new video, since this could be considered a separate subject that just expands on the subject of this video) is a comparison (in the gaming context) of a Ryzen 7950X3D to a Ryzen 7900X3D. It might well be that (with proper tweaking and/or not-using of the various parameters) the 12-core 7900X3D might exceed the gaming speed of the 16-core 7950X3D. For one thing, you have the same amount of total L3 cache spread over 12 cores instead of 16 cores, so on average there's more cache per core on the 7900X3D part. Furthermore, there's less total heat with 12 cores running than with 16 cores running, so the 12 cores can hit a higher clock speed before being throttled. We already know that the speed bump by going from a 7900X to a 7950X is less than the optimal +33% that you might hope for, due to clock-slowing. You might only get an 18% boost instead of 33% boost. Just as an example, let's assume that's approximately the case for the 3D V-Cache parts as well. That would imply that the 7900X3D's 12 cores are only about 15% slower in total horsepower than the 7950X3D's 16 cores. Furthermore, after throwing away half (6) of the 7900X3D's cores, the result is still only about 15% less total horsepower than after throwing away half (8) of the 7950X3D's cores. And it might not even be that bad, since the clock rate would go up after getting rid of half the heat source. I conjecture that the 7900X3D might be approximately equally as fast in gaming as the 7950X3D. And the 7800X3D might beat them both by a slight margin (since no core parking needed and it gets even more total L3 cache per core than either of its big brothers). 
Bottom line is probably that the 7800X3D is best for gaming, because the 7950X3D starts with 16 cores but only has 8 after parking half, and the 7900X3D is equally bad since although it only loses 6 cores due to core parking, it only starts with 12 cores so it ends up using 6 cores (not 8) albeit at a slightly faster clock rate.
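The "create one thread per core" pattern the comment walks through looks like this in practice. A minimal Python sketch (the per-thread work function is a hypothetical stand-in for game work); if the OS then parks half the cores, these threads simply get time-sliced onto the remaining ones, which is exactly the mismatch being complained about:

```python
import os
import threading

def worker(idx: int, out: list) -> None:
    # Stand-in for per-thread game work (physics, AI, audio mixing, ...).
    out[idx] = sum(i * i for i in range(10_000))

n = os.cpu_count() or 1  # ask the OS how many logical cores exist
outputs = [0] * n
threads = [threading.Thread(target=worker, args=(i, outputs)) for i in range(n)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"ran {n} worker threads")
```

Nothing in this pattern tells the program that half the cores it sized itself for may be parked out from under it, which is the coordination gap described above.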
@twistedbunny
@twistedbunny Жыл бұрын
Personally having an Intel Core i5 5330U, watching reviews of processors with more than 64 cores feels like I'm watching a sci-fi tech review show
@sharjeelrazzaqphotography
@sharjeelrazzaqphotography Жыл бұрын
Well sucks to be you then. I have i5 3210M. It can run GTA Vice city you know.
@ikram2512
@ikram2512 Жыл бұрын
​@@p-__ no Linus is better
@sedatalizevit51
@sedatalizevit51 Жыл бұрын
My daily driver is an HP laptop with a 5500U (2 cores, 4 threads). Even my phone has 4 times more cores than this PC.
@antiwokehuman
@antiwokehuman Жыл бұрын
That's still better than the dual-core Sandy Bridge 2328M I'm running 😂
@Jaggith
@Jaggith Жыл бұрын
i5-3570K here.
@neillgeldenhuys7305
@neillgeldenhuys7305 Жыл бұрын
Oh yeah I forgot I was mad at you for not reviewing this earlier, thanks for reminding me Linus!
@jamesstrieb1
@jamesstrieb1 Жыл бұрын
LOL
@Synclon
@Synclon Жыл бұрын
Lmao
@WohaoG
@WohaoG Жыл бұрын
@@p-__ I've tasted both and I can confidently disagree
@XcDrifty
@XcDrifty Жыл бұрын
lol
@UntoldTechy
@UntoldTechy Жыл бұрын
​@@Synclon and i
@jnjnqy
@jnjnqy Жыл бұрын
Extremely valuable review. I'd like to think it's a unit-to-unit difference. My 7950X has had all of these iGPU-related problems since yesterday (and RAM problems since its first day), and I'm going to RMA it tomorrow. Tested with Gigabyte B650 (Aorus Pro AX v1.1) and B650M (Aorus Elite AX v1.0) boards; both behave the same. All BIOS/drivers are on the newest versions to date. WORKAROUND: increasing the PBO advanced GFX load curve optimizer can work around my iGPU problem (crashing and crashing and crashing). NOTE: enabling SVM causes the iGPU to stop functioning (driver reports code 43 and won't output video) with a dGPU plugged in.
@CIeverAI
@CIeverAI Жыл бұрын
You guys should check the ReBAR feature of your motherboards. By default Resizable BAR is enabled, and the 7950X3D in combination with the RTX 4090 has some problems when ReBAR is enabled: performance in a lot of games is worse. Therefore you should do another test run with ReBAR enabled and disabled. By the way, Hardware Unboxed already did a video about this.
@Benji10109
@Benji10109 Жыл бұрын
This might be it, its a pretty easy thing to overlook
@error98123
@error98123 Жыл бұрын
Is this just AM5? Is AM4 safe from this or also susceptible? I've got it enabled right now with a Ryzen 5600 and RTX 2060; should I also disable it?
@jim4556
@jim4556 Жыл бұрын
​@Aye 20 series cards don't support it. So even if you have it enabled in bios it will never actually use it.
@error98123
@error98123 Жыл бұрын
@@jim4556 oh....
@flanovskiydtauskiy5870
@flanovskiydtauskiy5870 Жыл бұрын
This might be a bug. On Nvidia GPUs, which games are affected by ReBAR is supposed to be controlled by the drivers/Nvidia panel software, so Nvidia determines which games actually use ReBAR, while on AMD it is always on even if the game has problems with ReBAR. Last I heard, anyway; or maybe it really isn't active while the games are running, but it still affects everything.
@xchoo
@xchoo Жыл бұрын
It's odd that the numbers don't reflect what other reviewers have found (e.g., Gamers Nexus showed a clear improvement in most games). I'm wondering what differences in setup and testing methodology produced the different conclusions. Take the F1 22 results, for example: LTT showed little improvement, while GN showed massive gains. One difference I noticed was in the game's graphics presets (LTT: Ultra High, GN: High), but considering both test benches use 4090s, it shouldn't make a difference (there will be an absolute difference, but the relative difference between the 7950X and 7950X3D should still be there). All this is very strange...
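One way to reconcile the two sets of numbers: reported FPS is roughly capped by the slower of the CPU and GPU, so a heavier preset can mask a genuine CPU uplift. A toy Python model with entirely hypothetical frame rates (not figures from either review):

```python
def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Reported FPS is capped by the slower of the two components."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU-bound ceilings for two chips in the same game.
cpu_ceiling = {"baseline": 260.0, "x3d": 330.0}

# Lighter preset: GPU ceiling above both CPUs, so the uplift is visible.
light = 400.0
uplift_visible = (observed_fps(cpu_ceiling["x3d"], light)
                  / observed_fps(cpu_ceiling["baseline"], light))

# Ultra preset: GPU ceiling below both CPUs, so the uplift vanishes.
ultra = 240.0
uplift_hidden = (observed_fps(cpu_ceiling["x3d"], ultra)
                 / observed_fps(cpu_ceiling["baseline"], ultra))

print(f"light preset: +{(uplift_visible - 1):.0%}, "
      f"ultra preset: +{(uplift_hidden - 1):.0%}")
```

Under this model, both reviewers can be measuring correctly: the relative CPU difference only survives when the GPU ceiling sits above both CPU ceilings.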
@Chewsstudio
@Chewsstudio Жыл бұрын
It is a clear performance benefit here too, until you go GPU-limited and the CPU has less utilization. Probably a result of aggressive power management to protect the 3D V-Cache.
@saricubra2867
@saricubra2867 Жыл бұрын
Because it's a clusterf*ck of a chip: only one CCD with 3D cache creates a huge scheduling mess (there is no "Thread Director"). AMD should have added 3D cache to both CCDs.
@DJCryonic
@DJCryonic Жыл бұрын
@@saricubra2867 Adding 3D cache to both chips would have limited the performance in multithreading even further, gimping the productivity performance and specific games like DOOM Eternal (which was shown to not really care about the cache and really loading all cores) to be even worse VS the basic 7950X non-3D. AMD said this was possible but they decided against it. It would also be more expensive and probably much less yield for the 2x CCD chips, making them again more expensive and scarce. The chip itself is fine, but as with every special tech, it needs a LOT of optimization to perform as expected. This thing still gives you insane gaming performance while being super efficient, but at higher resolutions you get the same effect with the normal 7950X which is cheaper, easier to find, faster in specific loads and more stable across the board. Maybe in a year things will be more optimized and other reviewers showed a benefit but then again you are so hard GPU limited - nobody cares outside of people who run 1080p 360Hz monitors and want max FPS above all.
@OreosCookiesNCream
@OreosCookiesNCream Жыл бұрын
It's 100% the different testing methodology Linus and the lab have vs. GN. Ultra is unnecessarily taxing on the GPU and turns the CPU benchmark into more of a GPU one. This really isn't a CPU benchmark; it's more of a realistic-scenario benchmark. He's right that people spending this kind of money on a CPU and GPU aren't going to be playing at lower quality settings and monitor resolutions, but there's also an argument that the Ultra presets are just outright stupid and don't give much perceived visual benefit vs. the step down, in exchange for a big performance hit.
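The GPU-limit argument in this thread can be sketched with a toy model: the delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain, so a preset heavy enough to make the GPU the bottleneck hides any CPU-side difference. A sketch with illustrative numbers, not measurements:

```python
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """A frame ships only once BOTH the CPU (simulation, draw calls)
    and the GPU (rendering) finish, so the slower stage sets the rate."""
    return min(cpu_fps, gpu_fps)

# At "high" settings the GPU has headroom, so a CPU gap shows up on the chart:
print(delivered_fps(cpu_fps=230, gpu_fps=300))  # 230 (faster CPU)
print(delivered_fps(cpu_fps=200, gpu_fps=300))  # 200 (slower CPU)

# At "ultra" the GPU becomes the bottleneck and both CPUs look identical:
print(delivered_fps(cpu_fps=230, gpu_fps=180))  # 180
print(delivered_fps(cpu_fps=200, gpu_fps=180))  # 180
```

That is why the same pair of CPUs can look very different across two reviews that only disagree on the graphics preset.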
@AMDRyzenEnthusiastGroup
@AMDRyzenEnthusiastGroup Жыл бұрын
Yeah, they have other issues somewhere.
@Jason-ir5ig
@Jason-ir5ig Жыл бұрын
I'm really excited for the 7800X3D, not because of it likely performing near the top of all AMD and Intel CPUs in games, but because it should be able to do so VERY efficiently and while incurring an upper-mid-range cost
@sashrill
@sashrill Жыл бұрын
new builder from the future here. can confirm paired it with a RX 7800XDX card for 950 dollars
@etherealhawk
@etherealhawk Жыл бұрын
Dang. I can't find that card for that price
@Triro
@Triro Жыл бұрын
Honestly impressive that it even gets that much performance at that low wattage
@prateekpanwar646
@prateekpanwar646 Жыл бұрын
"The office handshake meme"
@kieranlee9610
@kieranlee9610 Жыл бұрын
It's really not. It loses to the i9 most of the time, and that i9 only has 8 P-cores and 36MB of slow cache, while this Ryzen 9 has 16 P-cores and 128MB of 3D cache. In that light it's genuinely awful: a good CPU, but nowhere near as good as it should be.
@mas7rreaper126
@mas7rreaper126 Жыл бұрын
@@kieranlee9610 did you even watch the video
@simbatech65
@simbatech65 Жыл бұрын
@@kieranlee9610 The point is it's drawing less than half the power and matching the i9 performance. the i9 is horribly inefficient by comparison despite having "Efficiency" cores.
@kieranlee9610
@kieranlee9610 Жыл бұрын
@@mas7rreaper126 I did. Its best pro is power consumption, but even here in the UK that's still only a couple pence extra every year for an i9 13900K, and most of the time the i9 was on top in those benchmarks, so we'll have to see what 15th gen Intel is like.
@kevinwiley5325
@kevinwiley5325 Жыл бұрын
I’m so glad you invested in your lab team. Being able to rule out the motherboard by having multiple up and tested in quick timing is huge!!
@benolsson1698
@benolsson1698 Жыл бұрын
Happy to have a fully detailed and very complete review instead of a perfectly timely review.
@DragonBane299
@DragonBane299 Жыл бұрын
AMD really should've marketed this chip on its power efficiency rather than its actual 'power leadership' Its like Linus said, the chip is matching or beating the competition at nearly half the TDP while using less core as well
@pvtj0cker
@pvtj0cker Жыл бұрын
Who knows what those marketing people were thinking. Running a 170W Ryzen 7*** or Raptor Lake, plus the GPU accompanying such a beastly CPU, won't be a pleasant experience during summertime.
@griffin1366
@griffin1366 Жыл бұрын
Only because it's turning off the 2nd CCD. Once you get a game that utilizes more than 8 cores, or run OBS etc. then you'll be above the competition again. Also AMD has awful idle power draw, something that is always swept under the rug.
@sandman11
@sandman11 Жыл бұрын
Looking forward to the 7800X3D review and yes, the addition of flight simulators would be greatly appreciated. I fly exclusively DCS which just recently enabled multi threading. Thank you for your work!
@afeathereddinosaur
@afeathereddinosaur Жыл бұрын
This "losing the lottery" thing is common across all factory-made products. My parents once bought a car that was very well reviewed both by word of mouth and in specialist reviews, but it constantly needed troubleshooting and its performance was below average. All the other owners we knew didn't have those problems. Tolerances may be small, but they can and will still cause a problem or two if you accumulate enough of them. Don't worry too much about it though; there's not much you can do if you're just a customer, and it's a rare enough occurrence that it isn't worth it.
@Dripzy_Y
@Dripzy_Y Жыл бұрын
If Linus ever delays a product review it’s because he’s just making sure it’s right and is helpful to us. Thank you Linus for your time with this!
@existentialselkath1264
@existentialselkath1264 Жыл бұрын
@@p-__ I've tasted both and I can confidently disagree
@WohaoG
@WohaoG Жыл бұрын
@@existentialselkath1264 I've tasted both and I can confidently agree
@MujaChip
@MujaChip Жыл бұрын
or waiting for sponsors
@theweekendgamer6294
@theweekendgamer6294 Жыл бұрын
@@Carl_Gunderson you mean a guy who doesn't know "whose" from "who's" can't be trusted as a highly intellectual source?
@griffin1366
@griffin1366 Жыл бұрын
Damage control with AMD first*****
@Acepilot8Gaming
@Acepilot8Gaming Жыл бұрын
I love how committed you and your team are to giving a super accurate review to everything no matter what it takes. I not only like that but I highly respect it too.
@mads6103
@mads6103 Жыл бұрын
Well, here we are
@DrLifeGamer
@DrLifeGamer Жыл бұрын
​@@mads6103gamer nexus faked everything there is no evidence to his claims he's just looking for drama after his channel died lol
@Kosphy
@Kosphy Жыл бұрын
It would have been really interesting to see how the scheduler on Linux behaved.
@ChopperPBM
@ChopperPBM Жыл бұрын
I recently built a new PC and waited for the 7950X3D, however a few days before I heard about this whole driver/core-sleeping stuff and just decided against it and got a 7950X instead. Very happy with this choice, far simpler to deal with.
@biggranny000
@biggranny000 Жыл бұрын
This is a bit strange, because the 5800X3D was a decent jump in certain games over the 5800X and other CPUs at the time, even though some games and productivity tasks were marginally worse. You would think that with a next-gen product the 3D V-Cache would open an even larger gap over the 7000-series counterparts, but it seems like they went backwards, which points to the conclusion that some games can't take advantage of the extra cache. At $100 more than the non-3D-cache parts it's not bad, but it's useless unless you're at 1080p. The power usage was very impressive though; a heavy user might save a decent amount on their power bill over time.
@nossy232323
@nossy232323 Жыл бұрын
In part I was expecting this, because the non-3D 7000 CPUs already have more cache than the previous gen, so fewer games run out of cache on the CPU.
@3polygons
@3polygons Жыл бұрын
So, now I'm interested... For 3D rendering, in cost in watts per frame rendered, this is a champion! I mean, much, much better in efficiency than the 7950X. Then Linus is right! This should have been marketed to us, people working with 3D and 2D graphics, not as a beast in gaming... Those numbers (super low consumption, yet 13900K-ish performance in apps) are what I missed in the much less efficient 7950X (while the older 5950X is a wonder of efficiency). Glad to know about these findings. I had indeed noticed in reviews the difference in power consumption between the 7950X and 7950X3D and begun to get interested, but I was surprised, as I also thought this CPU was mostly for games. In Europe the electricity cost is through the roof, and daily rendering for many hours consumes a lot more energy than gaming (gaming isn't frugal either, but it's way less CPU-intensive than 3D rendering; playing games 8 hours a day would be worrisome for many reasons anyway).
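The electricity-cost angle raised here is easy to put rough numbers on. A minimal sketch; the wattages, hours, and €/kWh price below are illustrative assumptions, not figures from the video:

```python
def rendering_cost_eur(package_watts: float, hours_per_day: float,
                       days: int, eur_per_kwh: float) -> float:
    """Electricity cost of holding a CPU at a given package power."""
    kwh = package_watts / 1000 * hours_per_day * days
    return kwh * eur_per_kwh

# Hypothetical full-time rendering duty: 8 h/day for a year at 0.40 EUR/kWh.
low_power = rendering_cost_eur(140, 8, 365, 0.40)   # ~163.5 EUR
high_power = rendering_cost_eur(250, 8, 365, 0.40)  # 292.0 EUR
print(f"yearly savings: ~{high_power - low_power:.0f} EUR")
```

With these assumed numbers the gap between a ~140 W and a ~250 W full-load CPU works out to roughly 130 EUR per year, which is why efficiency matters more to render-box owners than to gamers.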
@jerostephan
@jerostephan Жыл бұрын
7950X is super efficient as well, just enable the 105W ECO mode in the PBO section of the BIOS; Normal consumption: 38k points in Cinebench Multi-Core @ 245Watts Eco mode: 35k points in Cinebench Multi-Core @ 145 Watts
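Taking the scores and wattages quoted above at face value, the efficiency gain from the 105 W eco mode works out like this (a quick back-of-the-envelope check, nothing more):

```python
def points_per_watt(cinebench_score: float, package_watts: float) -> float:
    """Simple efficiency metric: benchmark score per watt of package power."""
    return cinebench_score / package_watts

stock = points_per_watt(38_000, 245)  # ~155 pts/W
eco = points_per_watt(35_000, 145)    # ~241 pts/W
gain = eco / stock                    # ~1.56x

# Eco mode keeps ~92% of the score (35k/38k) for ~59% of the power (145/245).
print(f"{gain:.2f}x points per watt in eco mode")
```

In other words, if those quoted figures hold, eco mode trades under a tenth of the performance for more than half the power draw.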
@3polygons
@3polygons Жыл бұрын
@@jerostephan That's nice... I've had a 3900X for a while. I didn't use an eco mode per se, but long ago I undervolted it and tuned some other settings in the BIOS, and was able to reduce the power consumption quite a bit and drastically lower the CPU temps while 3D rendering in Blender (from about 95°C to roughly 75°C), while apparently the render speed didn't change much. Glad the new gen can be set to work efficiently too.
@matasa7463
@matasa7463 Жыл бұрын
Yeah, that efficiency chart sold me on the CPU. That's amazingly great. This sort of efficiency vs. performance is basically already good enough to run in gaming laptops as-is! Combined with efficient RDNA3 GPUs, Team Red's laptops could be game changing.
@saladgreens912
@saladgreens912 Жыл бұрын
The thing is, isn't that efficiency from the boost algorithm rather than the architecture? It's not the 3D cache giving better efficiency; it's the lower boost voltage (to avoid damaging the 3D cache) that is. Meaning, if I understand correctly, this could be a BIOS option, or you could manually undervolt/underclock to get the same results. There was a hit to performance, remember; that last bit of performance being squeezed out could account for a disproportionate increase in power draw.
@rei_lyy
@rei_lyy Жыл бұрын
this post was not written by an AMD bot btw
@josiahjones7654
@josiahjones7654 Жыл бұрын
I agree with the efficiency on the CPU side, but RDNA3 is not an Energy Star, especially compared to the 4080 or 4090.
@greebj
@greebj Жыл бұрын
If AMD ever has the courage to release a high-end laptop GPU again. I've given up waiting; Dragon Range 16-core is a far more efficient and genuine alternative to Raptor Lake HX, but there's zero news of anything beyond these tiny 200mm² midrange RDNA3 GPUs, which makes me think they're never happening, let alone late this year. Laptop dGPU pricing is at Nvidia's absolute mercy and they know it; desktop buyers' complaints sound like #firstworldproblems tbph.
@gustofing
@gustofing Жыл бұрын
@@rei_lyy I'm thinking the same way and I ain't an AMD bot, gotta admit tho it's a bit risky going for a CPU like this and not the 13th gen Intel CPUs that are dominating the market rn.
@rndmavis
@rndmavis Жыл бұрын
Editing, aesthetics, and review on this is A+ impressive. Also Returnal is fun! I'm glad you're still around and dominating the tech space on yt. Keep treating your staff well!
@chrisd4228
@chrisd4228 Жыл бұрын
Thanks for the breakdown. Hoping you guys also get a chance to dig into the 7800X3D as this is the one I'm currently most interested in.
@abidbmtt
@abidbmtt Жыл бұрын
Hi LTT, it would be really cool if you also consider simulator games for benchmarks as they greatly benefit from the increased cache and it would also bring a other perspective into the benchmarks
@giantpickle
@giantpickle Жыл бұрын
Microsoft Flight Simulator would be an excellent test, for me it's the entire target for upgrading my PC.
@rubiconnn
@rubiconnn Жыл бұрын
Arma 3 or DCS would be great too. They are both VERY CPU intensive games.
@MuffinTastic
@MuffinTastic Жыл бұрын
Unity games in general benefit from extra cache and in particular VRChat is a big one. If you test VRChat, make sure it's always the same worlds, and always the same avatars being displayed - it's a *very* user generated content / UGC-centric game, and that has a big effect on performance.
@rogerbronchal9628
@rogerbronchal9628 4 ай бұрын
It would be great to see an update on this with the new changes in core parking.
@carloscampo9119
@carloscampo9119 Жыл бұрын
Excellent work LTT. Clearly made the effort to make a thorough review of this product and it’s limitations. This is how one should review high end products from companies claiming to be the “absolute best”.
@bradmorri
@bradmorri Жыл бұрын
Double check that CPPC is enabled in the bios. That allows the bios/CPU to communicate core type/performance ranking to the operating system
@andywillis9701
@andywillis9701 Жыл бұрын
i am in love with all this scientific testing the Labs is doing. truly a game changer for us consumers. Thanks LTT :3
@giganooz
@giganooz Жыл бұрын
Would like to see factorio and satisfactory with massive bases tested for these chips. I think these kind of games usually benefit the most, so I'd like to see if that's still the case.
@DLicht0
@DLicht0 Жыл бұрын
Or Oxygen Not Included; known for having CPU bottleneck frame rate/simulation slowing issues with large bases.
@slartibartfast2649
@slartibartfast2649 Жыл бұрын
That blue screen of death caught me off guard. I thought my PC had crashed for a moment.
@kennethd4958
@kennethd4958 Жыл бұрын
These are the kind of in depth reviews I started watching LTT for... and I am glad youre starting to get back to them.
@StaySic4Ever
@StaySic4Ever Жыл бұрын
I'm waiting for 7800X3D I'd expect it to be the best-buy really in every way. Obviously if you don't need so many cores for various workloads. Some games, especially some online games really benefit with extra cache. Looking forward to this one.
@PanduPoluan
@PanduPoluan Жыл бұрын
Ahaha and some older games will fit completely in the Vcache 😁
@yashgupta1724
@yashgupta1724 Жыл бұрын
" You can pause if you want to look closelier " I love LTT for keeping this meme still alive after almost 3 years
@Eaton-10
@Eaton-10 Жыл бұрын
I knew I wasn't the only one! LOL
@oldbot64
@oldbot64 Жыл бұрын
What meme? You mean dead joke
@rohitkrishnan2269
@rohitkrishnan2269 Жыл бұрын
This is one of the best reviews this channel has ever put forward. Kudos!
@foxfm1987ult
@foxfm1987ult Жыл бұрын
I feel like the problem here is that they tried to put two types of die (one with 3D V-Cache, one with normal cores) in a single package and relied on software to manage it. This caused a lot of power-management issues, like in the graphs. The reason the 5800X3D was impressive was that it was just 8 cores with 3D V-Cache, which allowed AMD to build the firmware to focus on only that one type of core. Here, we have the Intel 12th-gen launch issues all over again: two different types of cores, and trying to make Windows work with them is a disaster. If they had stuck with the same model and shipped all 16 cores with 3D V-Cache, and optimized the drivers and firmware in Windows to use it properly, this would've been a big win for AMD, just like the 5800X3D. That's one of the main reasons that CPU, even though it's a bit old, still manages to keep up with these new CPUs very easily despite being on a completely old platform.
@nimbulan2020
@nimbulan2020 Жыл бұрын
This is the most impressively thorough CPU review I've ever seen. Kudos to you Linus.
@lordadamfirst
@lordadamfirst Жыл бұрын
This level of careful and thorough reveiw, checking you are correct etc, is exactly why people trust your channel over many others.
@jmetro1
@jmetro1 Жыл бұрын
I noticed that until the most recent version of the chipset driver, released last week, my 7950X3D would run World of Warcraft on the wrong CCD. After the chipset driver update, WoW now runs on the correct CCD and gets 20-30% better FPS. It will take some time, but eventually the Windows scheduler/chipset drivers should get it right for all or most games.
@jarjarpfeil
@jarjarpfeil Жыл бұрын
There are definitely going to be teething issues with this, especially since some games struggle with chiplets period, let alone with 3D V-Cache. I am excited to see how this tech evolves and how the 7800X3D does.
@Moloch_Baal
@Moloch_Baal Жыл бұрын
I'm waiting for the 7800X3D, really hope it's good as AMD claims (for the one game I play)
@diegochen5350
@diegochen5350 Жыл бұрын
X3D saves more power. The price of electricity in France is very high; the electricity cost of gaming on my 7950X3D is half that of my brother's 13900K... I don't even need to use AIO water cooling.
@meekmeads
@meekmeads Жыл бұрын
@@diegochen5350 For a country that uses nuclear energy, such a rip-off.
@bronsondixon4747
@bronsondixon4747 Жыл бұрын
It's not going to be any different from the 7950X3D with the non-V-Cache CCD disabled. If anything it'll be slightly worse, since the advertised boost clock is only 5.0GHz.
@nostrum6410
@nostrum6410 Жыл бұрын
i really wouldn't expect it to run any better than the simulated tests on the 7950x3d
@nostrum6410
@nostrum6410 Жыл бұрын
@@diegochen5350 for gaming the cost in power usage is negligible.
@Monoke00
@Monoke00 Жыл бұрын
and now imagine a laptop version of the X3D..just brutal
@Leap623
@Leap623 Жыл бұрын
I could be wrong, but I remember seeing some stuff about how idle power draw is actually really high for some reason on X3D chips. If that's just an inherent issue, then that's a pretty big shot in the leg for their chances in the laptop space, barring the sorts of laptops that are meant to be plugged in all the time and are basically just desktops you can throw in a backpack.
@lilcheaty
@lilcheaty Жыл бұрын
@@Leap623 if you're using the ccd without the 3d v-cache then you won't really have a power draw issue. only when you use it for workloads that require it
@griffin1366
@griffin1366 Жыл бұрын
AMD laptops are nice but they need to work on their idle power draw so much.
@Dimondminer11
@Dimondminer11 Жыл бұрын
@@Leap623 I mean, you aren't wrong. My 5800X3D draws 35W at idle with the high-performance power plan under Windows. Although, for an X3D chip in a laptop, I wonder if it would work in one of those Clevo desktop-replacement machines with a desktop AM4 socket.
@saricubra2867
@saricubra2867 Жыл бұрын
They are monolithic; they would destroy the desktop chips, which are saddled with Infinity Fabric garbage.
@Bretton549
@Bretton549 Жыл бұрын
I would only recommend the 7950X3D for power users who are inclined to tinker. Telling the BIOS to prefer frequency over cache, and then telling my game to use CCD0 (the cache cluster) with Process Lasso, I've had amazing results compared to stock (game mode on, etc.). I think the 7800X3D will be comparable and require less config. For most people I would suggest waiting for its reviews.
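For anyone wanting to replicate the Process Lasso trick by hand: it boils down to restricting the game to the logical CPUs of the V-Cache CCD. A minimal sketch of building that affinity mask, assuming (as on the 7950X3D) CCD0 is the cache CCD, CCDs are enumerated contiguously, and SMT is on:

```python
def ccd_affinity_mask(ccd: int, cores_per_ccd: int = 8, smt: bool = True) -> int:
    """Bitmask of the logical CPUs belonging to one CCD.

    Assumes CCDs are enumerated contiguously with CCD0 first,
    which matches how the 7950X3D exposes its V-cache CCD."""
    logical = cores_per_ccd * (2 if smt else 1)
    return ((1 << logical) - 1) << (ccd * logical)

print(hex(ccd_affinity_mask(0)))  # 0xffff     -> logical CPUs 0-15 (V-cache CCD)
print(hex(ccd_affinity_mask(1)))  # 0xffff0000 -> logical CPUs 16-31 (frequency CCD)
```

On Windows the mask can be applied at launch with `start /affinity FFFF game.exe`, or from Python via the third-party psutil package, e.g. `psutil.Process(pid).cpu_affinity(list(range(16)))`.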
@midgettiger6389
@midgettiger6389 4 ай бұрын
I wonder if this video is plagued with the core parking issue that JayzTwoCents finally found a fix for last month
@BrentNewland
@BrentNewland Жыл бұрын
Thanks!
@MustafaGT
@MustafaGT Жыл бұрын
I don't know about all this, but my 7950X3D definitely gave me about a 15-20% performance increase in most of the games I play. I have benchmarks in MSFS, which is very CPU-dependent, and you can see those improvements. I came from a 13900K to a 7950X, and now I'm on the 7950X3D. So far it's a BEAST and way better than my previous 13900K and 7950X.
@ShanleH
@ShanleH Жыл бұрын
That’s the point though. The games where there is no performance increase are the games which are more balanced between CPU and GPU load. The thing is, if you are buying a chip for gaming only, the 7950X3D is not a good chip to buy, because in gaming scenarios the 7950X3D is actually just a 7800X3D, but worse
@crypto1300
@crypto1300 Жыл бұрын
My 7950X3D is definitely faster than my 7950X with an RTX 4090. Fortnite averaged 100 fps higher at 1440p with a mix of very high and low settings.
@MustafaGT
@MustafaGT Жыл бұрын
@@ShanleH COD: Warzone is a game I would think is very balanced between CPU and GPU, and yet I gained a very significant improvement with the 7950X3D. My previous 7950X would do about 200-218 fps; now with the 7950X3D I'm getting 240-268 fps at the exact same High settings at 1440p. That's a pretty good boost just from going from a 7950X to a 7950X3D. In MSFS, I've gone from 60 fps to nearly 100 fps.
@realfakey9
@realfakey9 Жыл бұрын
Why did you swap out the 13900K though, if you don't mind me asking?
@calpin_dk
@calpin_dk Жыл бұрын
Hats off to you guys for all this testing. Great job and very informative.
@frankgrajeda3566
@frankgrajeda3566 Жыл бұрын
The nuance, transparency and thoroughness in this review is what separates Linus and his team from other tech channels. Videos like this make us smarter.
@IrocZIV
@IrocZIV Жыл бұрын
Oddly the efficiency was the reason I ended up choosing the 7950x3d vs the 13900k. I don't need anything generating more heat in my room.
@peedsii1991
@peedsii1991 Жыл бұрын
I will buy the 7950X3D. The primary reason is its low power usage, since I live in a country where electricity is expensive and I use the computer a lot, so over a period of 2 years the savings will be quite big vs. other CPUs. The other reason is that I like flight sims, and the X3D is really good in them, such as Microsoft Flight Simulator + VR at the same time. If I only played shooters or building games I would go for the ordinary 7950X or the 7800X3D. But honestly, maybe we should all keep some perspective regarding FPS in games: if a game runs at more than 60 fps it runs perfectly smoothly, and that is enough for most people. TIP: What I cannot recommend is the 7900 XTX graphics card; the drivers have too many bugs. For example, VR doesn't work or lacks performance for a lot of people, and it's not even compatible with all of Pimax's 8KX headsets (it's not compatible with the newest 8KX version, named DMAS, only with the KDMAS that I own). Personally I bought my card in February and VR is still not working; either the image flips around like crazy or there's a lack of performance where the image lags behind when you turn your head. I also see my 1440p monitor flickering in some games. I'm sorry to say it, but the driver quality for the 7900 XTX is not good enough, and the card should not have been released; it should have been delayed by 6 months. I have asked for a refund due to the product being "defective" on the day I bought it, and it's not the hardware but the software driver that's the problem. So if you are into VR, ONLY buy Nvidia cards and avoid AMD (I previously had a 6900 XT and it ran fine in VR with a Pimax 8KX, but honestly, get an Nvidia card for VR). P.S. Regarding the issues Linus had with a defective CPU for testing: when I bought the 5900X years ago I had the same issue with a badly produced CPU, but they swapped it for a new one without any problems at all.
I got one of the very first batches of 5900X CPUs, and that production run had something like 20%+ failures if I remember correctly. That is simply shocking! P.P.S. I have been waiting for my new 7950X3D for about a month now since I ordered it, and the delivery time is unknown (maybe end of May, or July)... Honestly, why didn't they just delay the launch of the CPU when it's not available to anyone? ;-(
@zonelore
@zonelore Жыл бұрын
You don't need such powerful processors. Get the i5-13600K; it's the coolest-running processor for all tasks, and it's also cheaper.
@Desenrad
@Desenrad Ай бұрын
I had these exact same issues discussed in this video, but in my case, it was with the Ryzen 7950X and the ProArt B650 Creator motherboard. I initially paired the CPU with both T-Create DDR5 RAM (6400MHz) and G.Skill DDR5 RAM (5800MHz), but no matter what I tried, I was hit with constant blue screens. After 10 days of frustration, I decided to replace both the CPU and the motherboard. I swapped the 7950X for a 7800X3D, and suddenly everything worked flawlessly-even when running the RAM at its rated 6400MHz. Reflecting on the issues described in this video, it’s likely the CPU itself was defective or simply couldn't handle the configuration reliably. This whole experience highlighted how cutting-edge platforms can come with early instability, as Linus mentioned, but also how critical it is to ensure quality control, especially when new features like EXPO profiles or die schedulers come into play. If anyone else is dealing with similar headaches, don’t overlook the possibility that the CPU might be the culprit, not just the motherboard or RAM.
@Namtrooper81
@Namtrooper81 Жыл бұрын
This kind of reminds me of first gen Ryzen processors for some reason. I purchased the last gen 5600X and I am profoundly happy. I can probably wait another 5 years or whatever and look at the last gen X3d processors of this socket's lifetime and AMD should pretty much have nailed the manufacturing process and I should have a massive upgrade over the 5600X.
@mskiptr
@mskiptr Жыл бұрын
You should also do some benchmarks on Linux. It uses different drivers, scheduler, etc. so it would definitely give some interesting data points.
@CaptDust
@CaptDust Жыл бұрын
That was my thought too! Linus kept mentioning the drivers and windows controlling the work balancing, and if it's that specific it sounds like there would be a performance loss while running linux. Very curious on this
@1IGG
@1IGG Жыл бұрын
You mean for both nerds using it?
@siegfriedkettlitz6529
@siegfriedkettlitz6529 Жыл бұрын
Three!
@elosant2061
@elosant2061 Жыл бұрын
Four
@bocahdongo7769
@bocahdongo7769 Жыл бұрын
I do wonder if the choice of Linux distro would really make a difference in gaming performance. If it did, the entire comment section would just turn into a war between distro shills comparing sizes, since Linux customizability really is that limitless.
@neil1629
@neil1629 Жыл бұрын
I'd love to see more focus on power consumption as chip capability start to level off due to physics constraints. It's an essential requirement for datacenters, and modern chips pull enough power that heavy usage can noticeably impact consumer energy expenditure as well.
@wouldntyaliktono
@wouldntyaliktono Жыл бұрын
With this kind of comparison, it would be _reeeeaaalllly_ helpful for us to think about means and standard deviations, and not just bar charts with no uncertainty intervals...
@consecratedtech
@consecratedtech Жыл бұрын
That was a great segue! and this is the video I have been waiting for. I asked for this in a different comment. Glad you are double checking!!!!
@andysPARK
@andysPARK Жыл бұрын
That's so weird given other reviews. Much appreciate the extra miles you went to validate your results. I'm actually wondering if they've deliberately nerfed the X3D. But I'll go look at other reviews again to check my recollection.
@ProVishGaming
@ProVishGaming Жыл бұрын
Other reviews are just buying into the hype. These X3D chips are a big scam, especially compared to Intel and THEIR OWN PRODUCT LINE!
@foxfm1987ult
@foxfm1987ult Жыл бұрын
@@ProVishGaming But the technology is impressive, to say the least; the 5800X3D is still very good for gaming and still manages to keep up with flagships. It's the way they designed these chips that sucks and dragged down the entire technology.
@andysPARK
@andysPARK Жыл бұрын
Gamers Nexus has some different results, and I'm not sure why. It's likely setup configuration. For example, LTT apparently uses the highest possible game settings, while GN looks like it uses second-from-top settings. Otherwise, I'm assuming the drivers or even Windows updates have changed since GN's tests, or it's the complicated AMD driver configs. Who knows...
@andysPARK
@andysPARK Жыл бұрын
But the results are noticeably different.
@andysPARK
@andysPARK Жыл бұрын
@@ProVishGaming Really? Only LTT must be right? Lmao. Linus has integrity in spades and I'll always take his test results and opinions into account. But notice: he acknowledged that his test results were at variance with other reviews, and he never suggested that the others lacked integrity, as you assert.
@Akrymir
@Akrymir Жыл бұрын
The 7800X3D caps its boost clock at 5.0GHz, whereas the 7950X3D caps at 5.2GHz (though you need Curve Optimizer to hit those numbers).
@SaltyMaud
@SaltyMaud Жыл бұрын
11:43 That's not a good assumption at all. The only reason you would want a very high end CPU for gaming specifically is if you're targeting very high framerates, regardless of resolution. 4k60 ultra eyecandy mode doesn't ask much from your CPU. Granted 7950X3D still makes very little sense when 7800X3D is about to be available, but you could make the argument that it's still a niche option to high refreshrate gaming rigs that also do something thread heavy on the side.
@muditgauniyal
@muditgauniyal Жыл бұрын
Thank God you got your acc back ❤
@mikebroom1866
@mikebroom1866 Жыл бұрын
Still perfectly happy with my 5800x3d for quite a while, especially for the cost.
@martinzhang5533
@martinzhang5533 Жыл бұрын
@@rustler08 What are you talking about, the 5800X3D blows the 7900X out of the water in terms of gaming performance. And it's $300 at microcenter
@murphygaven1
@murphygaven1 Жыл бұрын
@@martinzhang5533 Yeah, I was about to say, lol. If you're doing stuff other than gaming, sure, but for just gaming the 5800X3D wins vs. the 7900X.
@doctorgears9358
@doctorgears9358 Жыл бұрын
@@rustler08 People can ride AM4 for years. You act like it’s falling apart and is a decade out of date when AM5 just came out last year. That’s nice that you upgraded to AM5, but AM4 is plenty viable for a very long time. Considering the growing pains AM5 has gone through, some may prefer AM4.
@TheMx5Channel
@TheMx5Channel Жыл бұрын
0:29 You got me, jumped out of my chair and the video resumed.
@MyklCarlton
@MyklCarlton Жыл бұрын
Excellent review and coverage! Honesty and a commitment to truth is refreshing and valued.
@KG_BM
@KG_BM Жыл бұрын
Yeah, someone said on a previous review that the graphs need to be changed to be more intuitive... I agree now; those %-difference graphs threw me way off, even with commentary.
@MrDjuhasz
@MrDjuhasz Жыл бұрын
Also having an insane amount of crashes with my 7900x and asus b650e-f mobo. Thanks for putting these companies on blast!!
@SteveExH
@SteveExH Жыл бұрын
Wow, the efficiency is sexy af. I wanted an update on the new CPU, but now I've got a new item on my shopping list. Amazing testing and review as always.
@griffin1366
@griffin1366 Жыл бұрын
Because it's only using 8 cores lol
@wveerallday20
@wveerallday20 Жыл бұрын
Good on you guys for taking the time to do things right. Still feels like there's some weirdness going on though, think some new drivers might help out? (When they're available)
@hippieduck
@hippieduck Жыл бұрын
Wait, is that *squints* a gorgeous black labrador? ❤
@Justin-ro8uj
@Justin-ro8uj Жыл бұрын
Something to note: some performance may be left on the table by just slapping a -30 CO undervolt on each core. CoreCycler can be used to test individual core stability and tune the undervolt accordingly; otherwise, the Event Log will throw WHEA errors in most scenarios. Most people just set everything to -30 and assume they have a golden sample, when in reality there is instability and/or WHEA errors being thrown in the background that end up reducing performance and sometimes even causing crashes, depending on the workload. Great video as always.
@klbmason2042
@klbmason2042 Жыл бұрын
So what I’m seeing from this review is that I made the right choice to upgrade my 3700x to a 5800x3d LOL. And thanks for being so thorough and getting the truth out instead of rushing to be the first to post a review Linus!
@redx566
@redx566 Жыл бұрын
Yeah that’s the reason. Definitely has nothing to do with getting a new motherboard and new ram
@klbmason2042
@klbmason2042 Жыл бұрын
@@redx566 I mean… yeah? That's for sure part of it. But seeing the numbers from the 7000X3D chips makes the 5800X3D look like even more of a performance unicorn. Depending on the game, the 5800X3D can trade blows with even the 13900K. So, $300 for that kind of performance, or easily $1k+? It's a no-brainer.
@redx566
@redx566 Жыл бұрын
Just checking 👍 5800x3d is a beast
@Freestyle80
@Freestyle80 Жыл бұрын
one gen upgrader 😂 waste of money
@leexgx
@leexgx Жыл бұрын
​@@Freestyle80 I guess I also wasted my money when I got the 5800X3D (skipping the 7000 series, mainly due to heat, and because it's new and not as refined as the AM4 platform). The 5800X3D will do me for another 3 years, I'm guessing (I only upgraded to the 3800X because the 1800X was so bad at single-threaded performance).
@christopherjunkins
@christopherjunkins Жыл бұрын
This was an AWESOME format for a CPU review, nice editing, scripting, color choices (set and grading), etc. VERY well done! I'd love to see more like this moving forward!
@R0ckyoS0cks
@R0ckyoS0cks Жыл бұрын
Shout out to the editors Seongchan and Oliver for that sick reverse ding
@Neoxon619
@Neoxon619 Жыл бұрын
I guess the gap between this & the 3D chip is smaller than the original review suggested, which is nice.
@gandeldalf (1 year ago)
This is the best review of a CPU I have seen in my entire life. Amazing job, LTT team, and thank you for taking your time with this instead of rushing!
(1 year ago)
The level of sophistication the segues are reaching is just... *chef's kiss*. Kudos, writers.
@wantedyou2346 (1 year ago)
As long as you come out with an explanation and stay transparent, your viewers will always be with you. Keep it up 😁
@tacticalcenter8658 (1 year ago)
I don't like 99% of LTT videos because they are utter failures, and other reviewers do a better job at hardware reviews. You sound like a bot.
@UneedAname45 (1 year ago)
Great review and we should not expect anything less. It was well worth the wait.
@patrickwinham (1 year ago)
I am so glad that you guys put the extra effort into confirming your suspicions and had the integrity to put out correct information.
@kylescoolclips (1 year ago)
Would’ve been interesting to see the temperature comparisons considering those impressively low power draws
@HexerPsy (1 year ago)
The 7950X3D can't cool its hotspot effectively due to the stacked die. At stock it pushes to 89C all the time under full load, drawing around 142W. That's from personal experience using a 360mm rad to cool the chip. If I take Rise of the Tomb Raider, since it's GPU-limited at 4K with RT, the chip peaks at 77C and 70W, although it mostly hangs around 72C and 60-65W. But then consider this: at idle on the desktop, it runs on the non-V-Cache CCD at 44C and 50W....
@kylescoolclips (1 year ago)
@HexerPsy Well, that's disappointing.
@Nekomancer69 (1 year ago)
Did you enable Resizable BAR? And how do you explain the difference from other reviewers' results?
@DanDoesGame (1 year ago)
It's funny because I said this exact same thing and got a ton of hate for it 😂 Thank you for being honest!
@UnbidWindow3047 (1 year ago)
Love the content, Linus and crew, keep it up ❤❤❤
@dogbog99 (1 year ago)
@p-__ they smell so sweet
@A_Secondhand_Vanity (1 year ago)
@dogbog99 😨
@RealOSU_JC (1 year ago)
@dogbog99 AYO❓
@ZethStrike (1 year ago)
Thank you for taking the time and the very honest review. This is very much appreciated as I waited before buying the CPU. You guys are awesome and I appreciate the complete honesty and amazing attention to detail for your reviews.
@chrisw6076 (1 year ago)
Crazy story: on a widescreen 1440p HDR monitor, my new 4080 with a Ryzen 7 5800X only got 90fps in Tarkov on high settings and 120fps in Hunt, with lots of frame drops, the same as my 2080 Ti; no change at all. I went to the Ryzen 7 5800X3D, and it jumped my fps with the 4080 to 140 in Tarkov and 160 in Hunt, with Hunt's frame rate being more stable. Other than 3D V-Cache, I don't think there is much difference between those chips, and we did a lot of troubleshooting on the CPU and GPU before I upgraded CPUs; everything was working as it should. I was blown away by those results.
@novantha1 (1 year ago)
Weirdly enough, I kind of want to buy it more, now. I'm a premium buyer, typically prefer AMD, and regularly do extensive code compiles and ML workloads which can tax a CPU decently, but I'm also quite power conscious and have often underclocked and undervolted products to claw back some power efficiency. Very interesting product.
@mrsuperselenio5694 (1 year ago)
You should see the 7000-series non-X CPU reviews. I don't think anything this generation can beat them at performance per watt, and that's out of the box; tuned, they probably become even more efficient.
@griffin1366 (1 year ago)
Undervolted Intel never gets the limelight. It has 3-4x better idle power draw and similar results under load.
@costafilh0 (1 year ago)
It amazes me that AMD would send only 1 unit for a channel this size. They should send at least 2 chips, shipped separately, so that in case of a problem in shipping or with the chip itself, they don't miss the date. Imagine AMD losing millions of eyeballs from LTT alone because of 1 defective chip? It doesn't make ANY SENSE!