Steve Burke: "Bought an AMD FX system so I could gain some experience with AMD". But when he produces questionable test results, he proceeds to dismiss the feedback of actual, experienced, AMD enthusiasts...as if his methods and experience are superior? Yeah, that's known as contradicting yourself.
@HazewinDog4 жыл бұрын
Hey, good video man! I have to stress that it is very disappointing to see HU & GN completely dismiss your video while seemingly not having watched it through. I just watched your entire video and you haven't made any rude comments towards HU or GN. You also showed very well why you thought it was important to make this video, and I agree. I hope HU and GN can give this a second thought and refrain from childish and dismissive comments just because they have more experience or haven't heard of you. Those are not reasons to dismiss your findings, and I honestly thought the tech community on YouTube was better than that.
@hitbm47553 жыл бұрын
I have also seen those other tech reviewers get very childish with their replies as soon as they feel intellectually challenged. I feel RA Tech does smarter reviews on these underrated CPUs.
@Squall4Rinoa2 жыл бұрын
Nobody at HU is an intellectual.
@jotabe1984 Жыл бұрын
@@Squall4Rinoa nor at GN...
@suiken3149 Жыл бұрын
And suddenly, GN has the nerve to call out LTT for their misleading test methods, lol
@GamersNexus4 жыл бұрын
Hi, I’ve never heard of or seen your channel before, but became aware of it from another user. Unfortunately, your video is - ironically - incredibly misleading and incorrect. The “something” that “isn’t right” is actually you, your testing methodology, and your understanding of how cores work. You start your video by misrepresenting our content to create your own narrative out of our charts. You then incorrectly state that more cores equals more fluid gameplay. It has nothing to do with purely cores and that’s easy for me to prove if I want to waste thousands of dollars of my time on it, but we’ve already shown it in most of our videos; for instance, take a look at our AMD CPU scaling benchmarks where we disable cores or disable threads - both *can* actually yield uplift, partially due to cache or resource contention depending on the architecture compared. More cores doesn’t equal “more fluid.” That isn’t how it works.

An FX CPU’s core architecture is entirely different from Zen and it is massively misleading to title and frame your video in this fashion by attempting to simulate a 10-year-old CPU with a modern part. A Vishera CPU from ~2014 runs two integer units (INT) for each floating point unit (FPU) present in one Bulldozer module (and they’re all Bulldozer modules, regardless of whether it’s Piledriver, Excavator, or whatever else it may be). These 32nm process CPUs fed their decode into two INT schedulers and one single FP scheduler per module, with each INT scheduler running four pipelines to their own L1D, but sharing L2 between them. In fact, because of the way AMD doubled-up on its decode hardware multiple times for this architecture, it actually ended up with lower capabilities than the preceding Phenom II, in some instances. We showed this in our Phenom II revisit previously. For instance: AMD FX can only run 8 instructions at peak quad-core decode, while Phenom II could do 12. Similarly, the simple math would have Phenom II at 18 instructions via 6C for 6C/8C peak decode, or 16 on the FX. The FX CPU had some improvements and was notably improved in some INT applications, but had known issues in FP to the point that it lost a class action lawsuit over it.

So, to get to the point, you’re trying to simulate a bad architecture by taking a new, good architecture from almost 10 years later and then … randomly turning off cores? How does that even make sense? It’s not even monolithic anymore. It’s not even close to comparable. The branch prediction alone has been completely overhauled in Zen. I actually just watched a bit more of the video (edit here) and am even more confused as to what you're trying to show. Why wouldn't you just use the parts instead of doing weird cross-architecture simulation stuff?

Sorry to say, but this video is just really poorly thought-out. You should have contacted us before trying to throw us under the bus that you’re driving, because we could have helped you understand why we test the way we do and when the results might or might not apply. This has nothing to do with GPU choice in a test bench and everything to do with not attempting to simulate decade-old hardware on new components. The memory subsystem isn’t even the same.

Also, you’re for some reason showing instantaneous FPS in side-by-sides which doesn’t make sense for a number of other reasons. What does that even mean? FPS for *this* frame that’s being shown *now?* But it’s frames per *second,* right? So why are we showing frames per second for one frame? What does that mean?
Different issue altogether, but one that experience would probably teach. Sloppy work. I don’t mean to be rude, but with a title like yours and with the extremely strong language you used, I feel it is a fair defense of our work. This methodology would not fly in my lab. I couldn't make it through the whole video because I had seen enough. Best of luck.
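For anyone unsure what "instantaneous FPS" refers to here: a per-frame FPS readout is just the reciprocal of that single frame's render time, while a conventional FPS figure averages over a window of frames. A minimal sketch, using made-up frame times rather than data from either channel:

```python
# Hypothetical frame times (ms) with one spike among otherwise smooth frames.
frame_times_ms = [16.7, 16.9, 16.5, 45.0, 16.6]

# "Instantaneous FPS": the reciprocal of a single frame's time, i.e. the FPS an
# overlay would show for *this* frame right now.
instantaneous_fps = [1000.0 / ft for ft in frame_times_ms]   # ~60, ~59, ~61, ~22, ~60

# Conventional average FPS over the same window: total frames / total time.
average_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)  # ~45

print(instantaneous_fps, average_fps)
```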
@rexskipper54974 жыл бұрын
😂🍿
@silverwerewolf9754 жыл бұрын
Write something
@AnotherLotte4 жыл бұрын
They weren't comparing Zen or Intel to Vishera. It was to point out that there were some inconsistencies with the results of your colleagues and yourself, as well as to drive home the point that frame times were more consistent than those of a 2-core, 4-thread CPU. You don't have to go full WikiChip and explain what the Piledriver uarch was to know that a 4-CU FX could provide a somewhat more consistent experience than a 2c/4t CPU. If you were going to defend your namesake, at least have the guts to do your due diligence and watch the whole video, rather than stop a few minutes in and then tear someone apart.
@singlsrvngfrnd4 жыл бұрын
Rekt
@LeopioRe4 жыл бұрын
@@plebbit lmao
@c1apifi3dgames62 жыл бұрын
Yeah - respect for both Steves here from me, plus respect and a sub for a small creator just trying to show an issue he found, but being met with two large creators feeling attacked.

RA Tech isn't trying to:
- Disrespect either HW Unboxed or GN
- Directly call either Steve's benchmarks flawed or invalid
- Use a cut-down 1600 AF as a direct replacement for an Intel dual core and try to say it's identical
- Throw hate on either creator to try and get some attention, like they very unprofessionally called RA out for, despite that fairly obviously not being the case

He is trying to:
- Show that the 8-core FX CPUs are underperforming when benchmarked by either HW Unboxed or Gamers Nexus
- Use a non-exact comparison to help viewers get an idea of the point he is putting across, not use a simulated CPU as a direct comparison (he clearly mentions that too if you actually pay attention). He is not trying to cripple the dual-core setup to shape a narrative; if anything, the dual-core setup he has is greatly advantaged and, as he says, would most definitely outperform a 3000G, let alone a low-tier dual-core Pentium.

Honestly, it's a real shame to see HW Unboxed (who has clearly shown bias against FX processors in the past) shame a small creator on social media for trying to raise a simple point: that in HW Unboxed's benchmarks the FX seems slow compared to not just RA's chip, but many others as well if you do some looking around, or actually own an FX chip. Then to have GN jump on the bandwagon and also shame RA after being mentioned on Twitter is a real show of professionalism, ain't it? But in fairness, GN has had over 10 years of experience dismissing minority testing before this, so I dunno what I expected he would do here.

All in all, it's honestly a rather poor showing from HW Unboxed here mainly. He just immediately went on the defence and made sure GN and his legion of followers heard about it in a bad light first on Twitter, instead of perhaps just taking on the information shown here and coming to a more civil conclusion. The fact he deleted his tweets after this too just goes to show that he may have realised the plan to silence this little FX minority, and to prove his skewed benchmarks and bias were actually the correct ones, didn't exactly work out.

Honestly though, all said and done, I still respect them as reviewers and tech enthusiasts. What I can't respect is the behaviour they showed here, leveraging their following to send people to blindly attack RA Tech, even though the people they sent (and themselves too) probably didn't even watch this video properly, or perhaps even at all. Props to you RA Tech, for handling it so well and sticking to your guns amid heavy (and in some cases biased) fire against you.
@johnk7134 Жыл бұрын
Who is 1000% sure that big creators like both Steves are always unbiased? Sorry, but I myself have found inconsistencies among the results. That is why I trust no one. Everything is indicative.
@interrobangings4 жыл бұрын
Damn, a whole lot of people kinda missed the point, he's only even doing the whole 2c4t thing to give a baseline -- the point of the video is to show that the FX chips perform better than the graphs are showing, not to compare specific parts
@Stermy57HW4 жыл бұрын
One of the biggest mistakes is to use the FX 8xxx series with a low-end budget 4+1 VRM phase mainboard. Maybe GN and HU did the same...
@abijithng63024 жыл бұрын
I overcame that throttling by sticking a heatsink over the VRMs of my Asus M5A88M and blowing some air over them; voila, I got the full performance of the FX 8350! :))
@randgrithr73873 жыл бұрын
This is why top-down coolers are superior. For my tower cooler, I cut and bent a folder into a duct that redirects exhaust air onto the VRMs. For a friend's crappy stock AM3+ cooler, I 3D-printed a 120mm fan adapter to maximize the air throughput.
@evenblackercrow44764 жыл бұрын
Great, and you're right: a lot of your work is a great record of the past several years of early AMD vs Intel. I run an FX 8370 (@4.9), 32 GB DDR3 (@1866) and a Vega 64 at 1440p (75 Hz refresh) for AAA and old or indie games. Most games haven't had any problems and don't support or align with the GN performance review.
@Vyp3Rau4 жыл бұрын
I upgraded to Ryzen 3rd gen last year so my son inherited my FX-8350 and Gigabyte 990FX board and his machine games perfectly with an RX 5700. I actually think it performs better now that Windows 10 has become more core oriented over time.
@CharcharoExplorer4 жыл бұрын
I suspect HU and GN just test in areas of those games with very little happening. Areas of games with few AI or physics objects usually mean the core utilization is low, since only 1-4 threads (render threads) really work hard. AI and physics, if done well, are at least 2-8 more threads to be used by the game, which makes saturation much higher. I saw this when testing my old 1500X vs my old i5 4460. In Wolfenstein 2, just looking at a static or somewhat easy scene made the i5 beat the 1500X. However, in combat encounters, the 1500X literally obliterated the i5 since core utilization and thread usage were much higher.
@wil81154 жыл бұрын
I loved my old 8350 rig. Used it for many years. It was a workhorse, not a thoroughbred.
@jay11854 жыл бұрын
This. I'd love to see the end result of someone gaming, streaming to twitch, in a discord VC and playing music (Spotify) simultaneously with one of the FX-8350's direct competitors.
@glenwaldrop81664 жыл бұрын
@@jay1185 I used to do essentially that. It was my DVR and gaming machine. It was only a 6300 but it was always doing at least two things. Swapped it for an Ivy Bridge Xeon and now it doesn't multi-task as well.
@virtualtools_30214 жыл бұрын
@@glenwaldrop8166 Then upgrade to the cheap 8+ core Ivy Bridge Xeons, unless it's an E3 Xeon, in which case you are SOL.
@abijithng63024 жыл бұрын
@@jay1185 Ahhh... interesting. Adam/RA Tech, you have the topic for your next FX video! :))
@GiantsGraveGaming4 жыл бұрын
FX CPUs were and still are brutally misunderstood. Devs never managed to leverage their power when they came out, and now they're actually showing what they're capable of... unfortunately, a little too late...
@abijithng63024 жыл бұрын
True that!
@The_Seal774 жыл бұрын
Agreed
@bertbertoUk4 жыл бұрын
I've had many FX chips and they still hold up today; I've always loved the price-to-performance of these brilliant chips.
@GiantsGraveGaming4 жыл бұрын
@@bertbertoUk I first had an FX 6300, then an FX 8350, and they did what I made them do. I ultimately swapped to a Ryzen 5 2600 just because I'm studying IT and those FX chips wouldn't let me install some special software I need to practice on... but yeah, even today, if paired with a decent graphics card, they'll play just as well as some newer CPUs. If they supported DDR4, they would be a better choice than most Athlons and i3 chips.
@bertbertoUk4 жыл бұрын
@@GiantsGraveGaming I still have an 8370e build @ 4.7, and a Ryzen 2600X at stock.
@AtomicScrotus7 ай бұрын
I upgraded a few years ago now to a Ryzen 5950 and gave my kids my old FX-8370 with 16 GB of 1600 MHz RAM and an RX 580, and it is still going strong playing games like New World, Modern Warfare 2, Borderlands and Tiny Tina at 1440p. I've never understood GN's or HU's reviews, because my stock 8370 far outperformed any test they did. I think it speaks volumes that the first PC I built almost a decade ago can still be passed down to my kids so they can enjoy and use it. Thank you for the video and for showing that the old FX series can still have some use.
@benjaminbain17694 жыл бұрын
I had always thought the FX was being given a bad rap with those dual-core tests by other tech tubers. As someone who built budget rigs using famed chips like the Pentium G3258 only to be let down after doing my own tests, I must admit this definitely checks out. I figure the only situation where a dual core would outperform something with six or more cores would be in the case of missing instruction sets (Phenom chips might have this issue). It's good to see some tech giants being called out by an underdog. Keep kicking ass! :)
@abijithng63024 жыл бұрын
Well said dude! :))
@benjaminbain17694 жыл бұрын
@@abijithng6302 thanks man :)
@abijithng63024 жыл бұрын
@@benjaminbain1769 Happy to, Benjamin.
@philcooper92254 жыл бұрын
I've gotten the same results as you; even against a 2500K and a 2700K the FX chips can hold their own. Narcissists hate being wrong. Fuck 'em.
@RATechYT4 жыл бұрын
Thanks for your feedback!
@cssorkinman4 жыл бұрын
I've mentioned this subject to both Tech Jesus and HWUnboxed Steve in the past. To their credit, they both responded. GN was dismissive even though it was obvious his tests were flawed (they actually showed a 9590 performing worse than a lower-clocked FX). HWUnboxed Steve recognized he made a mistake and retested, which I appreciated. Currently, there is something about the new video cards such as the RTX 2080 that seems to handicap the FX for some reason (at least at 1080p). It's so pervasive that I've managed to have my Fury outperform it in lower-detail benchmarks at 1080p. Just ran my Mud and Blood BF1 benchmark on my 2600K/290X rig comparing HT enabled vs HT disabled at 2500K turbo speeds (3.7 GHz). With 8 threads it averaged 90 fps; with 4 cores it was 42 fps. I'll plop that 290X in one of my FX rigs and post the results, but I'm betting it will be close to 100 fps average. You do a good job with these vids; thanks for your efforts.
@stargazer1623 жыл бұрын
This "something that handicaps FX" you speak of with Nvidia cards, could it be the drivers overhead Hardware Unboxed talked about recently? I mean, much slower AMD cards vastly outperforming faster Nvidia cards in low end CPUs is exactly the same situation they described.
@cssorkinman3 жыл бұрын
@@stargazer162 It's probably the biggest part of it, but it doesn't seem to be all that is going on. I sure wish GPUs weren't impossible to buy now, I'd dig into it a little more. Benching Crysis 3 demonstrates something else that is going on with Nvidia vs Radeon on an FX platform. I need to get another RTX card and test it on my other machines to try to learn anything from it.
@r3n846 Жыл бұрын
I know it's been ages, but I wonder if this is that DX12 overhead problem, and I'm curious if it's related to the different architecture of the 2000+ series cards compared to the older Maxwell/Pascal designs.
@cssorkinman Жыл бұрын
@@r3n846 Driver overhead certainly played a role vs Radeons and older Nvidia cards, but it was apparent in both DX11 and DX12.
@r3n846 Жыл бұрын
@@cssorkinman Not sure if it's the NT 6.x drivers being too different or the 470 branch being from before this happened, but I didn't encounter any real difference in CPU-limited games comparing Nvidia and AMD in DX11.
@Gabu_4 жыл бұрын
The critique is valid and I get that not everyone can get their hands on hardware, especially during a pandemic, but ultimately you should've worked out the video's concept a bit better. FX CPUs are capable beasts which require just as beastly an electric bill, whereas a modern AM4 2C/4T uses less than 50 watts and leaves a lot of room for upgrading. On top of that, you proved it yourself: an FX 8350 isn't *that* much better than a modern dual core. This leads me to the conclusion: if you already have an FX CPU and only enough money to upgrade to a 2C/4T, keep the money and save some more. If you don't have an FX CPU, buy the more recent low-end processor.
@RATechYT4 жыл бұрын
I'm only comparing my results with theirs. Obviously the FX system consumes much more power and runs hotter, but that's not why I'm comparing these two CPUs. I never said that the FX-8350 isn't that much better than a modern dual core, you can even see that it's the exact opposite. I agree with your conclusion though.
@m.yunusvaru94324 жыл бұрын
Still using an AMD FX 8350 with a 1050 Ti and I can still play many new-gen games on low settings.
@dizzler82uk Жыл бұрын
how about now?
@m.yunusvaru9432 Жыл бұрын
@@dizzler82uk Yes, I am still able to play them; the last one I played was Cyberpunk. But now I've got a new PC, as I accidentally broke one of the capacitors on the motherboard while cleaning.
@RATechYT4 жыл бұрын
Edit: I have a feeling that a lot of people, including GN and HU, completely missed the point of this comparison. Since writing a comment here explaining everything would basically be a waste of time, I'll be making a follow-up video for those of you who are interested. Thanks.

*UPDATE:* The follow-up - kzbin.info/www/bejne/hajMpoeVgpZ1d7M

For those of you wondering, the reason this video has so many dislikes is because of HUB's misleading tweet (which he deleted after my follow-up video) that led people to believe things that weren't true. Please keep in mind that the intention of this video is not to send hate to either Gamers Nexus or Hardware Unboxed. I want to keep things as civil as possible.
@interrobangings4 жыл бұрын
Gonna be honest, this whole thing would've been avoided if you had simply benched the FX chips and compared numbers, not tried to simulate the 2C/4T. People got really confused and thought you were trying to genuinely use the simulation as a benchmark, when it's really just there as a baseline, as it's the next CPU on the graph.
@todogaming7794 жыл бұрын
You are doing a good job man, I got the point and other people did too. It's not your fault that some people didn't understand it.
@marcosvinicios51394 жыл бұрын
I noticed that too; he also does it with the old Intel chips.
@whitemoses79134 жыл бұрын
@@todogaming779 I second this, and you're cool, RA Tech. I'm really interested in what you have coming next.
@qpwoeiruty6684 жыл бұрын
Too late for that now, I'm afraid. Just know there are people who are on your side on this and are against the blind lynch mobbing you're suffering right now. You did some good work no matter what they tell you, keep it up.
@Eggwhites944 жыл бұрын
Love these videos. I'm a huge fan of the FX series and never understood why they got so much hate. I rock an FX 9370 and an R9 Fury.🤘🏻
@adi62934 жыл бұрын
They got a lot of hate because of double standards in this industry....
@idewmeth42034 жыл бұрын
They got a lot of hate because they were very power-inefficient chips and performed worse than their competitors' products.
@adi62934 жыл бұрын
@@idewmeth4203 Do you see Intel getting that much hate now? Plus, the FX family had its drawbacks, but that was reflected in the price; the 8320 was cheaper than 2-core i3s.
@idewmeth42034 жыл бұрын
@@adi6293 yeah I do, people roast the shit out of intel nowadays because they've made 14nm+++++++ for years now and got complacent while AMD pushed the envelope and engineered a great series of products with Ryzen
@adi62934 жыл бұрын
@@idewmeth4203 They do get heat but not as much as AMD used to get 😅😅 Ryzen is great, I've had the 2700X and 5800X and my wife is using the 3700X.
Жыл бұрын
I upgraded from an FX 8350 to an i5 2500K in the past, when I knew less than I do now. I thought I "upgraded". I believed single-core performance was all that mattered in gaming. Oh boy, was I wrong. I mean, in "some" games performance was 10% better than on the FX, but in online multiplayer shooters or UE games... reducing cores meant losing a lot of performance, and single-core performance meant nothing in those titles. Also, the FX 8350 is the reason I stepped into the PC tweaking and master race rabbit hole... also the game called Squad. Fk Squad. I wanted to know why my FPS was low even though my GPU and CPU weren't in full use. Little did I know some games cannot utilize all of the CPU and bottleneck to hell.
@jahjahlive10004 жыл бұрын
My ASRock 990FX Fatal1ty only started giving trouble last week, and it was built in 2014. My 8320E and 1070 still play everything, so there was never any urgency to upgrade.
@Nianfur4 жыл бұрын
I killed two of those boards. Now I run my third at stock clocks and keep the memory at a sensible level. The VRM was a turd, and ASRock got caught out lying about the phases.
@dallesamllhals91614 жыл бұрын
Pffth! "What a baby" said the 1100T from 2010.
@php6004 жыл бұрын
I still have a Fatal1ty 990FX Professional; by the end of the year it will celebrate its 10th birthday. With a 5700 XT I play all recent games on max settings on a UWHD monitor, with no CPU bottleneck, never going beyond 80% CPU use on my 8370e. Why change it? I guess a few more years to go...
@AshTechCorner4 жыл бұрын
Great video, subscribed, enjoying your content. Please keep your head up and ignore the haters. I had an FX 8300 @ 4.4 GHz before I upgraded to Ryzen, and it played all the games I wanted to play. Not a lot of people get the true potential out of these CPUs; there's a lot to mess around with in the BIOS to increase performance, not just CPU clocks. FX wasn't a bad CPU, but it wasn't a great CPU either; it was priced well, though. There is still a lot of demand for the 8-core FX, as I recently sold my 8300 for £70. People still want them because they can keep up and play today's titles, maybe not at the highest framerates, but good enough for cards such as a 1060/1070 or 580/5500 XT etc.
@chilldudie2423 жыл бұрын
I use an FX-6300 @ 4.4 GHz for a file server and it serves files much faster than the Q6600 it replaced =) I like your methodology and editing. Jealous, u big cute.
@codeHusky4 жыл бұрын
Definitely appreciate the thought behind this, but the data you're showing here doesn't really matter unless you're showing the actual silicon you're referencing. A "modern 2c/4t processor" isn't a 1600 - as you've even alluded to, modern processors lean toward multicore performance as that's where most of the gains can be found. Zen+ isn't exactly focused on single-core performance (and an Intel CPU with that config would be far more reasonable as a comparison if that was the goal), but hey. Your test. I just don't find this particularly valuable if you can't find an on-the-market part to invalidate old data.
@AnotherLotte4 жыл бұрын
What is a modern 2c/4t CPU, pray tell? He handicapped the CPU to use just 2 cores and 4 threads. It loses the multicore advantage while maintaining a single-core score that's 55% faster than Vishera. Just because it isn't a 100% parity comparison with the sampled Intel CPU doesn't mean it invalidates the results, which are that something is wrong or sketchy about the results the other two reviewers got when comparing an 8-core FX to a 2c/4t CPU.
@idewmeth42034 жыл бұрын
@@AnotherLotte No dude... just... no... You can't just disable some parts of a CPU and say it is comparable to another CPU with the "same" cores/threads. The architectures are completely different. The way the cache works is different. Ryzen is a multi-chip design, Intel is a single chip. Completely different parts; his entire testing methodology and the narrative he pushed are flawed.
@AnotherLotte4 жыл бұрын
@@idewmeth4203 I _know_ what Zen is, I know that AMD is using CCX designs, and I _know_ that Intel is still using monolithic designs. The only way this would be a flawed comparison is if he were drawing direct comparisons with Intel CPUs specifically. He's simulating an Athlon 3000G and comparing to other 2c/4t CPUs.
@RATechYT4 жыл бұрын
I'm simulating performance of the Athlon 3000G. The stock Ryzen 5 1600 AF with 2 cores and 4 threads enabled (on a single CCX) is faster than the Athlon 3000G, since it is a Zen+ CPU and has higher clock speeds. The Athlon 3000G performs the same as the G4560, therefore it is safe to say that the 1600 AF with 2C/4T enabled is faster than the Pentium G4560.
@codeHusky4 жыл бұрын
You can get a desktop with a Pentium G4560 for practically nothing. There wouldn't be an issue here if you had just used the silicon instead of insisting on emulating the parts - microarchitecture and CPU design are far more nuanced than "oh well, they perform the same in these benches" and will affect test results in subtle ways under certain conditions. If you're looking to bring attention to some data set as flawed, you should at least use the same hardware to prove your point. If you don't, you're changing variables for absolutely no reason.
@cssorkinman3 жыл бұрын
Just ran some SOTR benchmarks on my FX rig with the 5700 XT in it. 1080p, DX12, highest settings: it averaged 84 fps at 4.3 GHz and 87 fps at 4.8 GHz. Stock 5700 XT, 16 GB of 2133 MHz RAM, stock HT and NB speeds. If GN and HUB run the same bench (not sure they do), my rig is stomping their scores pretty hard.
@BlazedGarage_Inc4 жыл бұрын
I have an FX-6300 that runs all these games fine too.
@ares23dc Жыл бұрын
A lot of big tech channels have been pushing Ryzen really hard, it almost seems that they got paid to do so
@AvroBellow Жыл бұрын
This is no surprise to me, as I gamed happily from 2012-2017 with my FX-8350. I never had any gaming problems with it and I always ran it at stock speeds.
@thatbritishgamer Жыл бұрын
Seeing this 2 years later, I had no idea this all went down, and I watch both those channels a lot. Did they ever apologise?
@ickyconcrete5370 Жыл бұрын
No and GN throwing shade at Linus today reminded me of that fact!
@blaze909 Жыл бұрын
@@ickyconcrete5370 This. GN is the same.
@trueheart56664 жыл бұрын
I think Gamers Nexus and Hardware Unboxed used really old benchmark data for the FX series, didn't really care whether it had improved at all, and that's what they put in the presentation.
@kenzohkw3 жыл бұрын
I wanted to bring up this point too. HU and GN's data is probably 3-4 years old. In that time the Windows ecosystem and games will have changed and become more optimized for multicore processing. So RA Tech isn't really doing an apples-to-apples comparison. They'd need a DeLorean to go back in time and run the tests again.
@zerocore84724 жыл бұрын
AMD FX still got the blues... posted from my rig, an AMD FX 6350 OC'd to 4.5 GHz.
@bertbertoUk4 жыл бұрын
You're not alone, buddy. I've got a few FX6 builds out there; everyone loves them.
@AliAyam764 жыл бұрын
My current setup is an FX 8350 @ 4.5 GHz with a multiplier OC, an old 4+1 VRM Asus M5A97 EVO, stock Vengeance DDR3 1600 2x8 GB (all 3 items from 2011), and a replacement RX 570 @ 1400/8000 core/memory that took over from an HD 5850. Still getting fairly decent performance in today's games at high settings.
@randgrithr73873 жыл бұрын
I hope you're cooling those VRMs! I could never get the 8350 stable on that board without disabling cores.
@h1tzzYT4 жыл бұрын
Lol, why am I so interested in older CPUs even though I have a top-end 5950X system :D Loving your vids. I also like that you started to test with high graphics settings, as they put some load on the CPUs too and can change the outcome for those CPUs quite heavily, compared to when you did tests a year ago and were massively GPU limited.
@m8x4252 жыл бұрын
The comparisons being made on all sides are debatable. Where the variables come in is with the tweaking and tuning of these processors, and these settings are not being reported in the benchmarks by anyone. I too was running a 4790K and getting different results than GN. I had been using the X79 platform for a really long time; I buy used systems that need repair, which has given me access to different platforms, and I've had other Sandy Bridge through Skylake systems. Simple things like tweaking the cache/uncore, or running with 2133 MT/s or 2400 MT/s RAM, sometimes make a considerable difference in games. Yesterday the Good Old Gamer and Hardware Unboxed were trading barbs over running DDR4-4000 memory with an i3-12100. This could have been avoided if settings were listed.
@RATechYT2 жыл бұрын
I wouldn't say that it's about tuning. An FX-8350 simply shouldn't be performing as badly as it performs in their videos, stock or overclocked. It should be easily beating those dual-core Pentium and Skylake/Kaby Lake i3 processors, even without making any changes in the BIOS, but it doesn't for some reason. Something's definitely limiting the performance of FX processors in their videos, I don't have a single doubt about that. Yup, saw the memory video from Good Old Gamer. It looks like Steve Walton didn't mention that he was running the DDR4 memory in gear 2, which resulted in worse performance. Most likely just like he decided not to mention how he tends to underclock his FX-8370 to 2.65 GHz when he's testing it. Haven't seen either video fully, so I can't really comment on that topic.
@m8x4252 жыл бұрын
@@RATechYT I've only worked with the AMD FX platform for a brief time. I lucked out and got an FX-8350 and the 990FX-UD3 board for a combined $45, along with an FX-8120. In the end I paired the FX-8350 up with 16 GB of DDR3-2133 RAM and a 4 GB RX 580... which worked well and ran everything I threw at it. No lagging issues or anything like that. The behavior I observed with the FX processors seemed to be the same as on the older Intel platforms, with them being more stable in newer games, or sometimes faster with better RAM. I didn't see the HW Unboxed video, but I wonder if they were running the FX-8350 with the RAM at 1600 MT/s, or if maybe the CAS latency was really high. From what I understand, that is supposed to be like kryptonite for Bulldozer and Piledriver.
@abijithng63024 жыл бұрын
**** *@RATech/Adam, we have freedom of thought and of expressing that thought, right? You spoke with peace in mind in this video. You're just expressing your thoughts, so everything is cool with that.* GN and HU are good souls, and it's important for them to defend their work, but they could have done better at being humble, diplomatic, and a bit more supportive of a fellow YouTuber in the same tech space; it would have raised everyone's reputation. **** You're doing well, RA Tech. My respects.
@RATechYT4 жыл бұрын
Appreciate your comment.
@agdgdgwngo Жыл бұрын
I wanna see the FX 8350 vs the i3 7350K. I think if you clocked the i3 to 4.8 GHz it'd be a really close fight. I think GN are missing something in their tests: BIOS revision, Windows build, some driver issue or something like that.
@Micecheese Жыл бұрын
Maybe they had an American Megatrends BIOS and a bad partner motherboard, or some very slow DDR3 RAM could have smashed those frametimes and FPS.
@hugotakayama69632 жыл бұрын
Gamers Nexus has a hatred for FX. Surely at one time someone told him that over time its cores would be better utilized, but it was meant to be judged as a contemporary processor at the time (2012), so he got worked up, and to prove his theory he continues to mislead people. I also have a much more modern Pentium G4560 and Core i5 4570 alongside my old FX 8350, and in today's games the FX beats those newer 2014 and 2017 processors. Remember that the FX 8350 came out at a price similar to the Core i5 3340.
@jessicabruno28204 жыл бұрын
Really good video. In my opinion there are a few things going on here, one of which you mentioned. First of all, regarding the benchmarks, operations like Gamers Nexus and Hardware Unboxed will use one graphics card and one settings level for everything. I don't doubt that when you pair these CPUs with an RTX 2080 Ti or RTX 3080 and run at ultra settings, as they often tend to do, there will be very little difference in the averages and 1% lows of the CPUs, but that situation doesn't reflect the reality of many people who will own a CPU such as the 8350 or an Ivy Bridge Core i5. In more realistic situations, with a more realistic graphics card and more realistic graphics settings, your work has proven an 8350 will probably be the better performer. Secondly, there seems to be a fashion in the custom PC world to hate on AMD FX, as you mentioned. While I generally wouldn't recommend anyone buy such an out-of-date platform from any manufacturer, for those people who already have the platform, both you and channels like Random Gaming in HD have proven that the FX 4000, 6000 and 8000 series are still capable of a lot more than most people give them credit for.
@NightOwlGames2 жыл бұрын
I used to run an 8350, and everyone that knew I had it told me it was crap and I should upgrade. Personally I feel like the 8350 has been misunderstood. It's a 10-year-old piece of hardware that can still play today's games at a playable framerate; it's still slightly better than a standard PS4. Paired with a 980 or 1060 it should do well; with a 2060 the CPU is a clear bottleneck, I've tried it.
@attilasagi81573 жыл бұрын
The fact that this video only got 6,582 views is just sad.
@ickyconcrete53704 жыл бұрын
5:15 Oh, I suspect I know what they did. Performed the tests on a 4+1 VRM motherboard and got the VRMs nice and toasty before they ran the benchmark. The real question is, why did they do it? Incompetence (maybe automated chained benchmarks via batch file?) or something more nefarious? Great video anyway!
@devl5474 жыл бұрын
Actually, even low-end mobos like the 78LMT-USB3 show better results.
@blakedmc1989RaveHD4 жыл бұрын
@@devl547 Funny thing is I had an FX 8350 on a 78LMT-USB3 board and it performed very badly.
@devl5474 жыл бұрын
@@blakedmc1989RaveHD I'm writing these words on a 78LMT-USB3 rev 6 + OC'd FX 8300 (4300 MHz core, 2200 RAM, 2600 NB, 2600 HT). Works pretty fine for a bargain PC. The only thing: you NEED to have a downdraft CPU cooler (I use an ID-Cooling IS-60) to have nice airflow over the VRM zone.
@ickyconcrete53704 жыл бұрын
@@devl547 Completely agree. If going for a tower cooler or AIO with an 8 core FX, you have to have additional active cooling directly over any 4+1 VRM (I used a little 4cm fan and it worked silently and perfectly). So to your previous point, yes it can work on cheaper motherboards fine, but if you don't take measures, it will throttle, and if you aren't too good with PC hardware... it happens silently and unnoticed, surely Gamers Nexus wouldn't fall into that trap though?
@blakedmc1989RaveHD4 жыл бұрын
@@devl547 I tried it before and it didn't help! Also, I had to downclock my RAM to 1066 MHz because Windows was unstable, and my USB3 didn't work on either the back or front headers. Then I had an ASRock AM3+ board and it still didn't help much! So I haven't used it for gaming since I retired it for an i7 4790K. I pulled the FX 8350 to put in an ASRock 990FX Extreme6 I got for free, just to temporarily use it as an external livestreaming PC, and it lagged even when I was not trying to stream at 60fps, so I replaced it with a Ryzen platform containing a Ryzen 5 3600 for an external livestreaming rig.
@luca68194 жыл бұрын
Long live the FX! I'm abusing it in every possible way to make it die so I have an excuse to buy a Ryzen, but I really don't need to, and it keeps going and going. Best 130 euro-bucks ever spent on a PC component.
@GiantsGraveGaming4 жыл бұрын
I paid 75 pounds for mine 4 years ago!
@Zorro333134 жыл бұрын
I can 100% say GN had no idea how to OC FX in the first place. Only core/RAM OC, zero attention to the L3 (CPU-NB), which is the main severe bottleneck. Also, most AM3+ mobos can't hold multipliers (it looks like the multiplier is applied, but it's not), so the only reliable OC is via BCLK. If a person doesn't know about the L3 bottleneck, he most likely doesn't know about the multiplier issue either.
@idewmeth42034 жыл бұрын
Man I'm just appalled at the narcissism and bandwagoning of people on the internet. How can you say with 100% certainty a guy that has reviewed and tested hardware for over 10 years doesn't know how to OC the FX chips. Were you in his testing studio? No. Get off your high horse.
@MJ-uk6lu3 жыл бұрын
They can hold multis just fine and there was not much to be gained from NB overclock on FXs anyway.
@daviddebroux47083 жыл бұрын
@@idewmeth4203 I hate to say it, but you'd be surprised to find out that people who have done something for a long time may have been doing it completely wrong (or using outdated methods that either don't hold up to today's standards or, in this case, don't make that particular thing work as it should). It can happen to the best of us. It even occurs in hobbies and jobs outside the IT industry and its umbrella: culinary arts, medical science, and even music of all things are a few fields where stuff like this *can* occur.
@plapbandit2 жыл бұрын
@@idewmeth4203 I think we can pretty safely say they don't know how to properly overclock the FX chips. Get off YOUR high horse.
@idewmeth42032 жыл бұрын
@@plapbandit bro you're replying to me literally a year later lmao
@Luukste4 жыл бұрын
In the last year I built 2 gaming PCs for my buddies; both have an FX 8350 and a decent 970 motherboard (from 2015). Paired with a modern, strong GPU this is still a very capable rig, especially in modern titles where core count actually does matter. Despite the low single-core performance they can play whatever they want. Of course the max fps at 1080p is way down compared to a modern 8-core counterpart, but at least there is no stuttering and the frametimes are consistent. Both of my buddies are happy with their rigs and have nothing to complain about.
@patrickizumi4 жыл бұрын
I have 3 custom-built computers, each with an FX 6300. Pleased with the overall gaming performance. Each has 16 GB of DDR3 1866 (2x8 GB). One has a 1050 Ti, one has a 1060 6 GB, and one has an RX 570 8 GB. I play SWBF 2 (2017) on high settings with a FreeSync/G-Sync compatible monitor on the 1060 and 570 machines; works great.
@Pillokun4 жыл бұрын
One thing RA Tech needs to consider is that when a slow CPU is paired with a very fast GPU, it can actually result in regressive performance. In BF1 and BFV, for instance, with an i7 3770K and a Vega 56 tuned through the roof, I experience stuttering as the CPU can't feed the GPU and it keeps clocking itself down and up. With my 970 and 1070 the gaming experience is smooth, though. So here we are seeing results affected by the relatively slow GPU. And look at the GTA 5 results: no way he is running at max settings at 1080p, since even faster GPUs at 1080p run at 75-90 fps average when every setting is maxed out.
@glenwaldrop81664 жыл бұрын
That kind of proves his point though, who, besides those two, is going to pair an FX with a 2080 Ti? That also raises an issue, if it is bogging due to the disparity in hardware then it isn't an accurate test, there's something wrong. My FX is on par with my Ivy Bridge at 1080P, slower than my Ivy Bridge at 720P, GTX 970 here as well. With the settings maxed out both are locked to VSYNC constantly and GPU limited.
@MisterTonyG4 жыл бұрын
@@glenwaldrop8166 Too many big tech tubers have unrealistic test conditions. This just goes to show that you can't just slap a $1500 GPU in there and call it a day. Especially if there can be significant differences when the hardware is somewhat balanced.
@glenwaldrop81664 жыл бұрын
@@MisterTonyG They're getting too niche in what is already a niche, in my opinion. They talk about how 1080p is their low-end spec, it's tolerable, etc... The price difference between 1080p and 4K is insane, and visually the difference is minimal. 1440p is nice, not gonna lie, but constantly increasing the resolution is why we don't have proper ray tracing. They can run ray tracing decently at 1080p. Proper AA or even using DSR helps 1080p; hell, even my laptop at 720p with proper AA looks almost indistinguishable from my desktop/TV at 1080p. We need better lighting and proper AA (we've lost ground on AA in the last 5 years), not more pixels.
@virtualtools_30214 жыл бұрын
@@glenwaldrop8166 Hell, my 120 Hz+ 1024x768 CRT looks damn nice and doesn't even need AA! And I can get absurdly low input lag with it and other "obsolete" hardware like PS/2 keyboards and mice!
@MisterTonyG4 жыл бұрын
@@glenwaldrop8166 Yeah I hear you there. I think they have gotten used to having bleeding edge hardware all the time and they have forgotten what it's like to be an average user with mid range hardware.
@abubakartaliq11284 жыл бұрын
Thanks for your videos! Quick one: I have the FX-8350 and a Sabertooth 990FX mainboard. What would be the best GPU for this mainboard/CPU combo? I want to max out my settings without upgrading the board.
4 жыл бұрын
Hi there. I have the same setup with a 1070 and wouldn't advise anything faster. Maybe a 1070ti, 1080 or 1660ti/super.... Cheers
@livingthedream9154 жыл бұрын
Yeah, nothing past a 1660 Ti imo. I have an R9 Fury and it's bottlenecked in many titles.
@HD-dp1pr4 жыл бұрын
@@livingthedream915 To be honest though, that 4 GB of VRAM on the Fury could be the bottleneck.
@RATechYT4 жыл бұрын
A GTX 1060 6 GB is where I'd stop, unless you play at higher resolutions using ultra settings.
@livingthedream9154 жыл бұрын
@@HD-dp1pr vram isn't a bottleneck at 1080p for me.
@picchioknossus80964 жыл бұрын
Is Secure Virtual Machine enabled on the FX processor? On my Phenom II X6 1090T, enabling it drastically reduces performance in single-threaded games (in extreme cases even by 80%). There is a performance hit in multi-threaded games, but it's much less noticeable. Can you please test whether the same thing happens on some Phenoms? Thanks.
@glenwaldrop81664 жыл бұрын
I've got VMWare and an FX, Ivy Bridge and Phenom. I'll see if I can test it and get back to you. Edit: wow. I just disabled it on my FX and it is dramatically faster, even without the VM running.
@picchioknossus80964 жыл бұрын
@@glenwaldrop8166 Thanks for testing. I thought it was just a problem with my setup, because those instructions are not used unless you have a virtual machine running, so they shouldn't be able to slow down a computer just by being enabled... but they do.
@glenwaldrop81664 жыл бұрын
@@picchioknossus8096 Yeah, they definitely do. I can't disable it on my Phenom II remotely, but it made a hell of a difference on my FX. Half of the reason for the FX system now is to run a VM, so that's good to know. I never expected a performance drop like that with it enabled. I think that is probably due to a recent change in Windows 10, as I didn't have these issues in the past. It was my main box until a couple of years ago. They did require/request updating VMware, so they probably made some changes in how virtualization is handled in 10.
@walterg42004 жыл бұрын
Thank you for a great video!!! This should help those that have older tech that is still functional!!
@CommandoTM4 жыл бұрын
The amount of dedication you've put into this series is astounding. And to think I just rewatched a different FX vid of yours yesterday. Digital Foundry's lead on frametime plots is something the review industry has to follow. Which is strange, as I remember GN Steve agreeing with that sentiment but not having fully done it yet, only in select limited-scope cases, such as their recent Cyberpunk benchmarks. As he often says, "lower is better, but more consistent is best". That said, I stand to gain nothing, as I have never had an FX system, and amazingly enough, believe it or not, I've never even seen one in person. Only seen them rot on store shelves.
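Since frame-time plots and "1% lows" come up repeatedly in this thread, here is a rough sketch of one common way a 1% low figure is derived from raw frame times (definitions vary between reviewers, and the frame times below are made up for illustration):

```python
# Mostly smooth frames with a few slow ones thrown in (hypothetical data).
frame_times_ms = [16.7] * 95 + [40.0] * 5

# Average FPS over the whole run.
average_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# One common "1% low" definition: the average FPS of the slowest 1% of frames.
count = max(1, len(frame_times_ms) // 100)
slowest = sorted(frame_times_ms, reverse=True)[:count]
one_percent_low_fps = 1000.0 / (sum(slowest) / len(slowest))

# A large gap between the two numbers indicates inconsistent frame pacing.
print(round(average_fps, 1), round(one_percent_low_fps, 1))  # ~56.0 vs 25.0
```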
@0mbps7954 жыл бұрын
While I don't think using a Ryzen 5 1600 AF to simulate a 2c/4t CPU is a valid comparison, I had an FX-8320E overclocked to 4.4 GHz until the end of 2020 and my experience was not as bad as people tend to say. For some reason the i3 7100 is a very popular CPU where I live, and (I know this is not a very scientific way to compare CPUs) I remember back when I played CoD Warzone with my friends, those with the i3 7100 used to have weird stutters and some texture issues, probably CPU related, while I didn't (both my friend and I had an RX 580 8GB). This also happened with other games, but Warzone was the most noticeable. This may have been due to my friends not taking good care of their computers and having a lot of processes running in the background, who knows, but those issues seem very similar to the ones found in your video. When my friend upgraded from his i3 7100 to an i5 10400F, the stutters he suffered were gone and textures loaded quickly. I don't think comparing the FPS of an FX 8350 to an "emulated" dual-core/four-thread CPU is valid, but the issues that appear on the 2C/4T CPU (the stutters, textures not loading fast enough) match those found on the i3-7100. So if you were trying to compare FPS between a 2c/4t CPU and an FX 8350, I don't think using a 1600 AF and disabling cores is a valid comparison, but if you were trying to show the issues that dual-core CPUs may suffer nowadays that FX CPUs do not, then I see your point, since I've seen those issues first hand when trying to play with some of my friends. I'm now using a Ryzen 3 3100 while I wait for a local store to have Ryzen 5000 stock, but my brother has been using my old FX CPU and hasn't run into any problems while playing Monster Hunter and some VR titles (I've tried Half-Life Alyx on that FX CPU using Oculus Link and it works surprisingly well, not the smoothest experience, but fully playable). These are just things I've seen, I haven't measured them or researched them properly, so feel free to question my experiences.
@rossmclaughlin71584 жыл бұрын
Given that these CPUs are no longer sold new by AMD, the hate they get drives their used price way down, which means better deals for budget gamers on the used market. So tbh I hope everyone keeps hating on them so more people can afford to game; even the 6300 holds up surprisingly well 👍
@glenwaldrop81664 жыл бұрын
Price isn't driven down enough. Last I checked the 8 core was still like $130 and rising. Don't go below the 6300 though. The quad core doesn't hold up as well.
@rossmclaughlin71584 жыл бұрын
@@glenwaldrop8166 Really? Last I checked the 8-core was 60 quid, but that was a few years back, before this hardware shortage, mind you.
@francescomarzi86344 жыл бұрын
I use an FX-8300 for retro gaming with an RX 560; it's perfect for all emulation.
@glenwaldrop81664 жыл бұрын
PS2 emulation works better on Ryzen or Intel Haswell and above. AVX2 support greatly improves the speed, same with PS3 emulator. Other than that, my FX does emulate quite well. It even runs the PS2 quite well except for 2 or 3 games.
@bitelaserkhalif3 жыл бұрын
Gran Turismo 4 is the most demanding PS2 game to emulate IMO; it lags badly on a Core 2 Quad Q8300.
@francescomarzi86343 жыл бұрын
@@bitelaserkhalif 😁😁😁 Not a Q8300 but an FX 8300 clocked to 4.2; it's a really cheap choice.
@bitelaserkhalif3 жыл бұрын
@@francescomarzi8634 I knew that
@francescomarzi86343 жыл бұрын
@@bitelaserkhalif I can't post you a link, but I found it for $58.
@domenicpolsoni83704 жыл бұрын
Still rocking an OC'd 8300 paired with a 7950. Suits my needs just fine. Paid $99 for the 8300 6 years ago.
@dan.vitale4 жыл бұрын
"Something's not right" - yeah, it's using an AMD CPU with some cores turned off to simulate an Intel CPU. Really top quality testing methodology you're using there :D
@NKG4164 жыл бұрын
I think a 2C/4T 1600 AF is still better than an Intel 2C/4T; I mean, it's a Ryzen.
@AnotherLotte4 жыл бұрын
The intention was to simulate a modern 2c/4t CPU, he's using Intel as an example because those are the only 2c/4t CPU's that were used in those benchmarks.
@rossbrown87824 жыл бұрын
@@NKG416 If that were the case, the first-gen Ryzen CPUs would have owned Intel for gaming. Obviously, even with more cores, Ryzen CPUs were slower; even today the Ryzen 7 1800X is mostly slower than the Core i7-7700K in games. Intel's monolithic design using the ring bus offers much lower latency for small core counts. A 2c/4t Zen+ CPU is almost certainly going to be much slower than an i5 2500K across the board for games and applications. This is a very flawed benchmark.
@dan.vitale4 жыл бұрын
@@AnotherLotte If you are critiquing someone's methodology and their numbers, you first need to ensure your own methodology and numbers are accurate and beyond such criticism. Otherwise you not only undermine your own argument, you also look very amateurish.
@huri99464 жыл бұрын
@@dan.vitale So, he needs to show you his methodology for you to approve that he has the right to critique someone? Or who is the judge?
@doghunter40974 жыл бұрын
Great video... The problem with the FX processor is the exotic architecture. FX is not 8C/8T, and it is not 4C/8T. It works very well with the latest games that try to load all the real physical cores. It can be thought of as an 8C/4T processor due to its specific architecture. I know this is almost impossible, but in this case we have 2 cores trying to use one thread. It's still good for playing video games, but for some more serious jobs it's too slow.
@vyor88374 жыл бұрын
that is not how FX works.
@interrobangings4 жыл бұрын
Also, how do you feel about the SMT on vs SMT off meme for FX?
@AnotherLotte4 жыл бұрын
The problem with disabling the secondary cores per module is that it also halves the cache sizes per cluster. You'd be giving up too much integer performance and gaining very little in floating point. It was only slightly helpful in really early games that used maybe 3 threads at most, but it makes no sense in more multithread-aware titles.
@glenwaldrop81664 жыл бұрын
@@AnotherLotte I agree with you on most of that but aside from losing the L1 data cache that is directly related to the execution units of that core I don't see how you'd lose any cache by disabling one core per module on the FX. You can't use that L1 on the other core anyway.
@AnotherLotte4 жыл бұрын
@@glenwaldrop8166 Maybe I'm misremembering, but disabling secondary cores would disable half the cache down the three levels. But I honestly can't remember the specifics.
@AnotherLotte4 жыл бұрын
@@Shane-Phillips Each -core- *module* is called a *Compute Unit*; each *Compute Unit* has two _INT_ cores and a shared _FPU_. In most BIOSes you can disable the secondary core, or as it's listed, enable *"One core per Compute Unit (CU)"*. Doing so disables the second _INT_ core/thread, which also reduces the amount of available cache; I just can't remember if it's reduced across the entire cache hierarchy or just specific cache pools.
@glenwaldrop81663 жыл бұрын
Hardware Unboxed just had a video about CPU load vs the GPU: Nvidia has 10% to 20% higher CPU load than AMD cards. I wonder if this is limited to a particular generation (i.e. Turing and above) and is related to the differences here? I bet your video and the ongoing discussion had something to do with them making that video. HUB Steve deleted a lot of his Twitter comments about the whole thing. One major difference here seems to be that we're all using Pascal and older, while the coalition of Steves are using Turing and above.
@RATechYT3 жыл бұрын
Yeah, I saw it. That is an interesting topic, though I don't think he made it because of my video. It is unfortunate that he didn't include a Pascal GPU, but at the same time there are a lot of benchmarks already, so it's understandable. I wish I had a Radeon GPU such as the Vega 56 or 5700XT to compare to the GTX 1070. GPU prices are through the roof as well, so it doesn't look like I'm going to be able to get one anytime soon.
@glenwaldrop81663 жыл бұрын
@@RATechYT I'm not saying he made it because of your video, I'm saying that whole thing planted the seed.
@WSS_the_OG2 жыл бұрын
Revisit this topic in 2022. What I wasn't clear about watching this video is whether it is a video about FX processors or about core counts; the two issues get a bit muddied. If it's about FX processors being unfairly presented, then fair enough. If it's about more cores leading to more consistent frame times, then this point gets lost in the video. A suggestion would be to revisit this topic, but use just one higher-core-count CPU, and then disable cores on that one CPU and compare results. This would help to prove or disprove whether more cores = more consistent frame times, and would even reveal where the core-count sweet spot is. This would also help to eliminate the background noise of people complaining about you comparing different architectures to try to draw conclusions about the effect of core count specifically.
@hugotakayama69632 жыл бұрын
The assumption is that the FX 8350 would run better than a contemporary (2012) processor with 4 concurrent threads; that is, an 8-thread processor would age better, though not necessarily better than a processor with fewer concurrent threads. Still, the FX 8350 beats more modern 4-thread processors (2013 to 2017, including Skylake and Haswell Core i3s and Pentiums, and Haswell Core i5s), and although the FX 8350 was developed to fight the Ivy Bridge and Sandy Bridge Core i5s at similar prices, it aged better than the Sandy Bridge and Ivy Bridge Core i5s.
@The_Seal774 жыл бұрын
I still use my FX 8370 with a 1070 Ti and have really no problems; it runs at 4 GHz on all cores and never gets hotter than 45°C on my Phanteks cooler.
@The_Seal774 жыл бұрын
Also, I'm using the MSI 970A-G46; is there a better mobo to get more performance out of the 8370 Black paired with the 1070 Ti?
@RATechYT4 жыл бұрын
@@The_Seal77 I'd look for an 8+2 power phase motherboard, such as the 970A-UD3P, 990FXA-UD3, MSI 970 Gaming etc.
@The_Seal774 жыл бұрын
@@RATechYT thanks for the tip
@idewmeth42034 жыл бұрын
@@The_Seal77 I think that's pretty bad advice, to upgrade just the motherboard of a 5+ year old system.
@RATechYT4 жыл бұрын
@@idewmeth4203 I mean, I answered the question. Obviously upgrading to a new Ryzen or Intel processor is a much better idea.
@bass-dc9175 Жыл бұрын
Pretty much the only edge case where a modern dual core could outperform the FX is in games that are single-threaded (Supreme Commander FAF, for example). And that is a VERY niche edge case. The only modern advantage a Zen or Intel dual core has over the FX chips is the upgrade path. And that is the equivalent of saying "if you replace your CPU with a better one, it will be better". If I had to choose between an FX 8000 chip and a dual core of any generation, I would pick the FX chip. There certainly seems to be something wrong with the two Steves' results. Tangent: I have been a fan of both of their content for years. And their behaviour towards you was despicable.
@salaciouscreations43234 жыл бұрын
I use mine with an RX 580 for emulation. Works fine. Switch games run fine on Vulkan, and Dreamcast and PS3 too. It makes me glad I buy discs from car boot sales.
@Stelio_Contos4 жыл бұрын
If you're looking for console-like performance but you want PC games, FX is the way to go for cheap, especially right now because, like you said, many people consider them worthless. I have an FX-6300 @ 3.5 GHz paired with an R9 380 and it works great.
@reki3533 жыл бұрын
Damn, the heat that combo will make is HUGE.
@Stelio_Contos3 жыл бұрын
@@reki353 The FX-6300 maxes out around 37°C under full load in Cinebench with an ambient room temperature of 20°C. The R9 380 maxes out at 64°C in the same case. 😐 I have no idea why people keep saying these parts run hot when they run cooler than any other PC I've ever owned. I think people in 2012 didn't realize the speck of recycled aluminum that was included was just a placeholder until you got a real air cooler, but those also apparently didn't exist according to everybody. Apparently "air cooling" to most people means sitting next to their computer and blowing air over the bare processor; then I could definitely see this thing overheating, maybe.
@whitemoses79134 жыл бұрын
"Both Steves" 😂😂😂
@Atari8man20114 жыл бұрын
I have two 8350s and I'm using one now to watch your video. There's nothing wrong with the 8350. Great video.
@idewmeth42034 жыл бұрын
Well, there is in fact something wrong with the 8350; there's a reason AMD lost a class action lawsuit over this chip's architecture
@Tainted794 жыл бұрын
The only game my FX-8320 @ 4.4GHz / GTX 1060 6GB can't run is Red Dead Redemption 2. When I build a new rig this year I am going to try the new GPU in the old rig. Curious to see the results.
@abijithng63024 жыл бұрын
Why can't it run RDR2 dude?
@Tainted794 жыл бұрын
@@abijithng6302 It just gets a horrible frame rate, 30-45 FPS with high textures and low settings at 1080p. It also heats up my motherboard's VRMs like a toaster oven, 85°C.
@abijithng63024 жыл бұрын
@@Tainted79 Ahhhh..... A valid point. Then it's happy gaming with the new rig, which is coming soon! :))
@vyor88374 жыл бұрын
@@Tainted79 lower the texture settings.
@happycat041129 күн бұрын
In single-threaded applications Intel will always come out ahead of the 8350, but when all 8 cores of the FX 8350 are being used, that is when the FX 8350 will unleash its full 8-core potential! The FX 8350 has 8 REAL cores, not fake threads! Having the latest and fastest CPU is no big deal anymore since most CPUs can game fairly well, provided one has a decent video card!
@cycofusion1994 жыл бұрын
Incredible video man. I had an FX 8320, upgraded to a 1300X, and now I'm sitting on a 2700. Still miss my old FX 8320 though.
@shrillduck4199 Жыл бұрын
I just saw this video. If anyone is still running an FX CPU, there is one way to boost the frame rate on the FX CPUs, and it's turning off idling in the power options in Windows, because the FX CPU likes to idle when a game is not CPU-heavy. I was getting around 60-70 FPS in Fallout 5 with the FX CPU; it's one of the games it loves to idle on.
@miso1995srb Жыл бұрын
Shrill Duck41 is probably talking about Power Options > Processor power management > Minimum processor state > set to 100%
@Micecheese Жыл бұрын
@@miso1995srb That also helps, or cpupower on Linux if you're over there.
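For anyone who would rather script that tweak than click through the menus, here's a minimal sketch (assuming the stock Windows powercfg aliases and, on Linux, the optional cpupower utility; both paths need admin/root rights, and the function name is just illustrative):

    import platform
    import subprocess

    def pin_cpu_to_full_clocks():
        """Keep the CPU from downclocking at light load, per the tweak described above."""
        if platform.system() == "Windows":
            # Equivalent of Power Options > Processor power management > Minimum processor state = 100%
            subprocess.run(["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
                            "SUB_PROCESSOR", "PROCTHROTTLEMIN", "100"], check=True)
            subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)
        else:
            # Rough Linux equivalent: pin the cpufreq governor to "performance" (needs the cpupower package)
            subprocess.run(["cpupower", "frequency-set", "-g", "performance"], check=True)

    if __name__ == "__main__":
        pin_cpu_to_full_clocks()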
@KILLERSPREE9563 жыл бұрын
My FX-8300 is running well, even better than Gamers Nexus' 8370 @ 4.9GHz for sure
@MisterTonyG4 жыл бұрын
Keep up the good work. I've really been enjoying your content on the FX series over the past couple of years.
@abijithng63024 жыл бұрын
Same here dude! :))
@interrobangings4 жыл бұрын
FX 8120 and 7970 here, surprisingly decent budget rig
@gamingedition51654 жыл бұрын
In 2021?
@interrobangings4 жыл бұрын
@@gamingedition5165 yes I'm only doing 1080p anyway and I don't care if I have to crank everything down
@interrobangings4 жыл бұрын
@@gamingedition5165 also my main rig is a 3900x and a 1080ti lmao don't worry
@abijithng63024 жыл бұрын
@RATech, I'd love to know your CPU & GPU upgrade path over the years (for your primary PC).
@RATechYT4 жыл бұрын
I built an FX-8350 & GTX 760 PC way back in 2013, and over the years I made small upgrades such as increasing the RAM capacity, adding an SSD, buying an aftermarket cooler etc. Only in 2018 did I upgrade to a GTX 970. Around a year ago I moved on to the Ryzen 5 1600 AF, which I paired with the same GTX 970, and which I recently replaced by getting a GTX 1070 a few months ago. Up until 2013 I always had pre-builts; I'm not sure, but I think I had a Pentium in the mid-2000s (don't remember the graphics card), and an Athlon X2 paired with a crappy Radeon HD 4000 series graphics card.
@mehdidev49854 жыл бұрын
GPU-wise, he got a 760, followed by a 970, then the 1070 came. CPU-wise, he had an overclocked FX-8350; he moved to Ryzen after.
@jmugurr9944 жыл бұрын
I can't help but think that when I see CPU usage at around 40-60%, the game is only using 4 threads instead of 8. That would make performance similar to the other 2c/4t CPUs. 50% could mean 4 cores at full bore; 60% could mean 4 cores at full bore plus some other things in the background, like monitoring software.
@vyor88374 жыл бұрын
real threads do more work than virtual ones.
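A quick worked example of the arithmetic behind that guess (purely hypothetical readings, not measured data):

    def overall_cpu_percent(per_core_loads):
        # Most monitoring tools report overall usage as the average across all logical cores.
        return sum(per_core_loads) / len(per_core_loads)

    # A game saturating 4 of the FX-8350's 8 cores reads as ~50% overall...
    print(overall_cpu_percent([100, 100, 100, 100, 0, 0, 0, 0]))      # 50.0
    # ...and light background work (monitoring software etc.) on the idle cores pushes it toward ~60%.
    print(overall_cpu_percent([100, 100, 100, 100, 20, 20, 20, 20]))  # 60.0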
@filenotfound__38714 жыл бұрын
What is the FX 6300 the equivalent of?
@jonathanmain90794 жыл бұрын
Ivy bridge i3 ...
@opinadorrj43374 жыл бұрын
@@jonathanmain9079 Wrong. 6300 = i5 3330
@RATechYT4 жыл бұрын
Depends on what you do. In a lot of modern games though, it's not any worse than a quad core Ivy Bridge. I believe it's even better in some cases.
@filenotfound__38714 жыл бұрын
@@RATechYT It's good enough for me; everything I throw at it, it just takes, no problem: editing, even emulators like the one in Android Studio and some older console emulators. But in GTA V I have a problem: I get barely 30 FPS on the lowest settings, and nothing is really utilised, while on ultra I'm getting about 50 FPS. A Windows reinstallation didn't help, nor did new drivers and a game reinstall. Temperatures usually stay between 30 and 45°C on the CPU and 30 to 40°C on the GPU. Why is that? Specs: FX 6300, 8GB DDR3 1333, RX 580 4GB
@stevesimon62152 жыл бұрын
@@filenotfound__3871 wth. same specs
@saikirankhanna50724 жыл бұрын
RA Tech is like "reality acknowledged". FX is a decent chip for its price, though today buying a new one is expensive (post AMD core-count lawsuit). I would request the channel to revisit the Phenom X6. I have an X6 and it performs smoothly in games that don't require the latest instruction sets. It isn't a speed demon, but it is smoother than what some KZbinrs made it look. I think having more cores is better. Thanks for the myth-busting video, RA Tech.
@RATechYT4 жыл бұрын
I have reviewed the Phenom II X6 1090T. You can watch it here: kzbin.info/www/bejne/e3Lcpmyjl5h7eMk&ab_channel=RATech
@shawnsereal4 жыл бұрын
I am also curious why they always bash the FX 8350. I currently have one installed on an ASRock 990FX Fatal1ty paired with 16GB of Corsair Vengeance 1866 (9-10-9-27), and my benchmarks are always higher than Gamers Nexus' or Jayz' benchmarks. My Cinebench single core score is 101, and that's only with a 4.522GHz overclock. Mine is even beating your 8370 at 4.9GHz, which is weird. I still use this computer for basic stuff and some gaming. I have no intention of getting rid of it even though I've since built another computer using a Ryzen 7 3700X. I think the major tech youtubers like Steve, Kyle, Paul, and Steve again are paid to bash this CPU.
@codeHusky4 жыл бұрын
They're not
@vyor88374 жыл бұрын
@@codeHusky they objectively do.
@Russell9704 жыл бұрын
So an i7 2600 is better than an i5 6500?
@kenzohkw4 жыл бұрын
Hell yeah, it's got 8 threads which is needed in modern games.
@Russell9704 жыл бұрын
@@kenzohkw But the i5 6500 has 30% better IPC
@kenzohkw4 жыл бұрын
@@Russell970 It might have, but that's not as important as threads nowadays. For example, if you play Warzone it maxes out 4 threads and you can't even use Discord. On the i7, no issues; you might lose a few FPS but nothing that'll affect your game.
@Russell9704 жыл бұрын
@@kenzohkw Omg you're right, it's about headroom and stability. Got it, thanks wise man!
@vyor88374 жыл бұрын
@@Russell970 nooo it doesn't. It has about 15% better IPC at best.
@iancurrie8844 Жыл бұрын
Great content. I think it's important to keep the big names honest. I wish they would have responded more maturely.
@teapouter Жыл бұрын
I would just like to say that I greatly appreciate your video, and I'm happy to have come across your channel. I'm sorry for the unreasonable hate your channel suffered, and this video is very good. Thank you for putting this out there and helping clear the narrative. I have never owned an FX CPU, and I only got into computer hardware around Ryzen 3000, but I had always heard about how bad Bulldozer was. However, this video was able to clear up those misconceptions, and for that I am grateful. I wish you all the best in your future endeavours. You've certainly earned a sub from me.
@RATechYT Жыл бұрын
Thank you!
@todogaming7794 жыл бұрын
Greetings to all those big channels. I'll say it in Spanish, but I also own an FX 8320 with an OC to 4.4 GHz, and I've shown many times that a lot of the tests and benchmarks from bigger channels are wrong, and they often give worse results than what the FX chips actually deliver. I don't know if it's because Intel sponsors them or because they want to make old AMD CPUs look bad. Anyway, excellent analysis RA Tech, I support everything you say.
@kjcolewelle4 жыл бұрын
I don't want to undermine your method, but since you aren't comparing with Intel architecture, it's difficult to draw very definite conclusions, IMHO. On the other hand, my own experience of the FX 6300 in titles like The Witcher 3, which are "CPU intensive", tells me that the FX chip can almost be judged to be "coasting", even when paired with an oldie-but-goodie GPU like the GTX 660 (medium settings, at 1080p), and thus forced to do more of the heavy lifting. Prices of FX chips are also holding up now on the secondhand market. Yes, we have general supply problems, but I am guessing that there are other reasons, like relatively good performance and the ability to use higher-specced DDR3 motherboards compared to the Intel competition from the same era at equivalent or lower price levels. In any case, good work on going in-depth here with a positive attitude.
@andreewert65764 жыл бұрын
True, the titles where it got closer are also titles that perform better on Intel CPUs. Where can I donate so RA can buy an i3-7350K plus a board?
@RATechYT4 жыл бұрын
Believe it or not, the Ryzen 5 1600 AF with only 4 of its cores enabled (no SMT) and a downclock performs nearly the same as the i5-3470, so I doubt that getting an actual G4560 would make much of a difference. You just have to match the RAM speeds on both and make sure they have similar Cinebench scores.
@kjcolewelle4 жыл бұрын
@@RATechYT OK, thanks for the clarification. 👍
@andreewert65764 жыл бұрын
@@RATechYT The thing is, similar Cinebench scores do NOT equate to similar game performance, especially in Intel-loving games like GTA 5. That game was one where 1st gen Ryzen seriously underperformed compared to other games where it was close behind Intel's offerings, and Zen 2 with its better latencies diminished that gap to an extent. All I'm saying is you need to re-trace their steps if you want to prove their findings wrong. And you will need real Intel hardware to do that.
@monzitj12 жыл бұрын
As someone who has a G5400 and an FX 8350 and a 1600 AF + 5600X (I'm a guy who sells PC components):
1. The G5400 is going to be on par with the i5 4460 in most games, meaning it'll be faster than the 3470 by about 15 to 20%.
2. The Athlon 3000GE is actually slower than the G5400, by about 30%, especially in games in which core count doesn't really matter.
3. It's just not okay to compare the 1600 AF cut down to 2/4 and simulate the performance of other CPUs, in my opinion, but as you wish.
4. I do agree that a Core i3 3220, for example, is going to be slower in most games, especially in 2022, compared to an OCed 8350 or even a stock one.
5. The G5400, on the other hand, has 1 MB more cache and a 3.7GHz all-core frequency (the G5500 is even faster, the i3 7350K even more), and is going to win in a lot of games by pure IPC.
This doesn't mean that frametimes on both CPUs will be good, but if it makes any difference, yeah, I can see a scenario in which the FX 8350 can pull off better frametimes.
@yottaXT4 жыл бұрын
*Let me throw my theory in here:* First, I truly believe in both Steves' integrity as tech press, yet there's something that no YT channel is capable of doing, and that's globally testing every configuration possible. My theory is the following: they got the results they claim they got with the hardware they claim they used, and as the results were within their expectations, they never had the need to retest using different configurations; who would do that, and why? The only people who would bother are either somebody who is currently using an FX with said configuration, because they know that something doesn't fit, or somebody who is interested in digging further into the matter for unknown reasons. *My question in this regard is:* Did you test the games using their configurations? Otherwise you have no right to call them out. Your findings are not by any means related to their results; to do that you would need to prove that their results are NOT correct under their test conditions. If you want their attention as bigger tech channels, you should've shared your findings with them so they could prove you're right and then share it with the viewers. The way you did it feels more like an attack than anything else, and believe me, you're aiming at the wrong people.
@abijithng63024 жыл бұрын
I can understand your standpoint dude, but Adam/RATech has good respect for Gamers Nexus & Hardware Unboxed, as he said in his previous videos. He is on his own path, and I have experienced similar results to Adam/RATech! :)) So respects, RATech!
@RATechYT4 жыл бұрын
I want to make it clear that my intention is definitely not to attack or send hate towards them. I just want people to see that AMD's FX processors aren't as bad as they claim, which is what both channels have been telling their massive audiences for years now. I haven't tested the FX-8350 using different components either, and the parts that I used for my tests aren't any better than what they have. Their results are just not normal, and I believe that people have to know that.
@majorpayne0195 Жыл бұрын
As usual, GamersNexus insists on their findings. If you watched NorthridgeFix's video about the 4090 connector melting down, it showed evidence that the card itself is the issue, not the user or any CableMod adapter. Even JayzTwoCents agrees with NorthridgeFix. And what do you expect from GamersNexus? They contradict it with more excuses to stand by their previous findings, like saying the cards sent to NorthridgeFix came from CableMod and that "they will investigate" before changing their conclusion.
@jimmymcgill69420 Жыл бұрын
only clowns listen to GamersNexus lmao, narcissist egomaniac and he looks sm*lly af
@geekjit2154 жыл бұрын
Some of my friends are still rocking the 8350, one of them even an 8120, and they get very similar performance to what you tested. They have yet to find a reason to do a major upgrade because of this.
@RATechYT4 жыл бұрын
Thank you for your input!
@IcebergTech4 жыл бұрын
I appreciate the effort put into the video, and wondered if you'd come to any conclusion as to why there's such a difference? HUB and GN aren't known for fabricating figures or being partisan about brands or specific products, so there must presumably be a specific flaw in their methodology that isn't extracting the maximum performance from the FX CPU?
@RATechYT4 жыл бұрын
Obviously I'm not claiming that they're fabricating the results. The issue could be as simple as VRMs overheating, or perhaps they decided not to bother reinstalling graphics drivers using DDU. Either way, something's definitely wrong with their results.
@azmc49404 жыл бұрын
When I still had the FX-8350 it was paired with a GTX 970, and in modern games I was heavily GPU bound. One could probably go up to a 1070/1660S before the FX starts to badly bottleneck. That being said, after upgrading the CPU to a 3900X, still with the GTX 970, some games DID run much smoother (e.g. Wreckfest) while others did not (e.g. The Witcher 3).
@walterg42004 жыл бұрын
Do a comparison for an 8120 processor
@gradystephenson3346 Жыл бұрын
IMO a lot of what made the FX run slower than it should have was the Windows kernel
@Micecheese Жыл бұрын
The background work of Windows; everything after 7 is a bloated mess. Two good options have been 7 or Linux if you want little bloat clogging up system performance.
@csut89293 жыл бұрын
I personally suspect the CPU cache may have something to do with this, and the way it's accessed by the chips. An old i5-2500K has 1.5MB per core, while the FX has 1MB per core. Then again, I may be talking out my exhaust.
@laurelsporter3 жыл бұрын
The FXes had 2MB of L2 per module, then a shared 8MB L3. The Core i5s had 4MB or 6MB total. Intel's caches have all been inclusive since the Core i series began, so you can call the size of the L3 the total cache. AMD's never have been, but on FX, like Ryzen, they aren't strictly exclusive either, which would make it fair to add up the caches. Depending on how much data is shared between the cores/modules, it could be like having no more than about 8MB total, up to like having 16MB total.
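Putting rough numbers to that (a back-of-the-envelope sketch using the commonly listed cache sizes for an FX-8350 and an i5-2500K):

    # Cache sizes in MB, as commonly listed for these parts.
    FX_MODULES, FX_L2_PER_MODULE, FX_L3 = 4, 2, 8
    I5_2500K_L3 = 6  # inclusive hierarchy, so L3 roughly bounds the useful total

    fx_best_case = FX_MODULES * FX_L2_PER_MODULE + FX_L3  # 16 MB if L2 and L3 hold different data
    fx_worst_case = FX_L3                                 # ~8 MB if everything in L2 is duplicated in L3
    print(f"FX-8350: roughly {fx_worst_case}-{fx_best_case} MB of effective cache")
    print(f"i5-2500K: roughly {I5_2500K_L3} MB of effective cache")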
@georgefoley97934 жыл бұрын
Thank you very much for the work you have put in to make this video! Now I realize I should have purchased a Ryzen 7 1700 instead of Ryzen 5 2600 when I was building my gaming rig; two years ago, they were the same price at Micro Center.
@abijithng63024 жыл бұрын
George, now you've got the wisdom. You're safe in the future dude! :)
@bluethunder83833 жыл бұрын
Love my 8350. Don't get all the hate for the FX chips. Great video 👍
@grumpiebrown4 жыл бұрын
Watching this on my old AMD FX-8350 with RTX 2060 on which I do play modern games too, albeit not at ultra high graphics settings 😎
@jack16.294 жыл бұрын
I feel sorry for your 2060; consider Ryzen or something
@grumpiebrown4 жыл бұрын
@@jack16.29 Yes, I agree with you, but I am considering my wallet; the card was a huge (and much needed) upgrade from my ancient Radeon 6870 though! 😁
@jack16.294 жыл бұрын
@@grumpiebrown mhm yeah a great upgrade nonetheless
@zerotactix57394 жыл бұрын
Try upping the graphics. It shouldn't make a difference no matter the settings, because higher graphics takes a toll on your GPU, not your CPU. You'd probably get the same FPS on both Ultra and Low because you'd only be stressing the GPU at that point. Your CPU is already working at 100%. Your FPS will be the same, but at least you can play with better graphics.
@grumpiebrown4 жыл бұрын
@@zerotactix5739 Thanks, will do so. The CPU upgrade will be done later, but that will obviously require purchasing a motherboard plus memory too, which is a pricey step considering the exchange rate here in South Africa. Hence, the quickest single-component upgrade, which already made a major difference, was the graphics card 😎
@dfz678 ай бұрын
I bought an AMD FX system for 41€ and all the people I talk to say it's trash. Now that I've used it for a month, here's my perspective: it runs most games OK, browsing is great, and it's a good experience.
@herbertvonsauerkrautunterh25134 жыл бұрын
I recently upgraded from the 8350 to Ryzen because my software didn't work. It didn't work with my new PC either. It's the software. Spent big $$$ for nothing. Could have kept running the old system.
@bestopinion92574 жыл бұрын
Wow. That's quite a story. :)
@RATechYT4 жыл бұрын
lol
@popcoingaming50864 жыл бұрын
So are the Steves lying to us?
@abijithng63024 жыл бұрын
Haha...not really, but their results are misleading!
@RATechYT4 жыл бұрын
I don't know that.
@interrobangings4 жыл бұрын
He isn't saying they're lying, he's saying there is something wrong with the way they're benching when real-world performance is better
@totosxpompa4 жыл бұрын
I have an 8350 with an RX 570 Armor, it's perfect
@rrekki93204 жыл бұрын
One of the reasons I went with the FX 4350/635/8350 is that when I bought the 8350 (2014 or so) it ran until this December, when I upgraded to an R7 3700X. I could run some games on Ultra without breaking 60 FPS. Also, my friends who bought the highly praised i5/i7 have already switched through two or three of the newer generations because their games stuttered massively.
@Fender178 Жыл бұрын
I have a theory on why Hardware Unboxed and GN got the results they did: I think it has to do with the graphics card they paired with the CPU, especially at the high end of the graph with the 2080 Ti, because the 2080 Ti is too much GPU for the 8370 to handle, while you are using a GTX 1070, which is less of a bottleneck by comparison since the 1070 is a much weaker GPU than the 2080 Ti, in the same titles that you and both parties tested. I can't explain the situation involving the 1650 Super, and the same applies to the 1070 tested in The Witcher 3 by HU. Maybe the same thing applies to the 2070 Super: the 8370 is too weak to handle the 2070 Super, which keeps it from being fully utilized in the games tested by you and HU.