I'm old enough to have heard about the imminent death of X86 several times...
@coolvinayАй бұрын
It won't kill x86 anytime soon, or ever, but it will take a huge chunk of x86 sales. Intel is already struggling.
@jay-j6lАй бұрын
It won't die, it will just become irrelevant. Low prices and performance will cause the shift, if ARM plays the short game well.
@bogganalseryd2324Ай бұрын
Me too. Too much software is built on it for it to go away. What they should do is create a parallel x86 2.0 that's stripped of all the old, obsolete instructions.
@myne00Ай бұрын
@bogganalseryd2324 They plan to.
@VicariousAdventurerАй бұрын
@@bogganalseryd2324 RISC is an ISA design philosophy that goes beyond "stripping obsolete instructions"
@tblicherАй бұрын
I would love to move to ARM, or RISC-V for that matter. However, to me ARM is in some ways even more closed than x86, because of the lack of a BIOS and hardware descriptors. The operating system needs to know beforehand what hardware is available, and at which addresses, on a given computer/SBC/system. This means that many computers that would be perfectly fine for years stop getting updates once the manufacturer stops pushing them out, leaving you incompatible with newer software and exposed to security risks. Often you are stuck with the operating system that came preinstalled, because it has been tailored specifically to that system.

On x86 the hardware descriptors are stored in the firmware or in hardware, which means the operating system can look them up at boot and run on never-before-seen hardware. As someone who only uses Linux this is extremely important, because it allows me to buy literally any x86 device and know that it will boot into an actually usable system. Sure, some hardware may not work correctly due to missing drivers, but at least the operating system knows the hardware exists and can always boot.

TL;DR: In Linux you can boot just about any x86 machine in the world, even if Linux has never seen that specific system before and the system was never designed with Linux in mind, because the hardware descriptors come with the system. On ARM you instead need to provide a device tree in software to tell Linux where all the hardware is located. So if there is no public device tree for a given ARM system, you are more or less out of luck getting it to boot at all.
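To make the device-tree point concrete, here is a minimal sketch (assuming a device-tree based ARM board running Linux; /proc/device-tree is the kernel's standard read-only view of the tree it was booted with) that lists the "compatible" strings drivers bind against. Anything not described there simply doesn't exist as far as the OS is concerned:

```python
# Minimal sketch: walk the device tree the kernel was booted with.
# /proc/device-tree is a symlink to /sys/firmware/devicetree/base and only
# exists on device-tree based systems (most ARM boards); on a typical x86
# machine the equivalent information comes from ACPI tables instead.
from pathlib import Path

DT_ROOT = Path("/proc/device-tree")

def compatible_strings(root: Path):
    """Yield (node, [compatible, ...]) pairs; drivers bind against these strings."""
    for prop in root.rglob("compatible"):
        raw = prop.read_bytes()                 # NUL-separated string list
        values = [v.decode() for v in raw.split(b"\x00") if v]
        yield prop.parent.relative_to(root), values

if __name__ == "__main__":
    if not DT_ROOT.exists():
        print("No device tree exposed - likely an ACPI (x86/server) system.")
    else:
        for node, compat in compatible_strings(DT_ROOT):
            print(f"{node}: {compat}")
```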
@VicariousAdventurerАй бұрын
How is this different from potentially unsupported minor hardware within PCs? Any Linux veteran (me) can tell you about unsupported printer drivers, etc. - I'm sure Microsoft will have some standards for ARM on Windows. As someone at LTT pointed out, how AVX-512 runs - once AMD adopted it - is very different on AMD (fragmentation), and there is potential for programs to run differently on AMD and Intel. So getting together is a way to provide a middle ground so that x86 *remains* generic in programmers' eyes as it ages.
@detaartАй бұрын
The word you are looking for is ACPI
@tblicherАй бұрын
@@VicariousAdventurer It's different because, sure, you may run into unsupported hardware, but at least you can boot the system on x86. On ARM, *you* need to provide the hardware descriptors rather than the device providing them itself. So sure, Windows may require a standard, but the device descriptors are then stored in the Windows image instead. This is a problem because you are potentially locked into using that particular Windows image, and who knows how long that image will keep getting updates? Look at Android. It has the exact same issue: you cannot boot a generic Linux or Android on a phone, because each phone comes with a custom device tree as part of the custom kernel built for that phone. This is the primary reason phones stop getting updates: someone (Google and the vendor) has to maintain a Linux kernel specifically for that phone, and that is too expensive.
@tblicherАй бұрын
@@detaart Yes, and the ACPI tables are usually stored in the firmware. And yes, ACPI is also possible in the ARM world, but almost no one uses it apart from some high-end servers.
@MrSamPhoenixАй бұрын
Good thing the Linux guys are always figuring things like this out on their own. There’s a team that’s been working on getting a Linux distro working on Apple silicon.
@nempk1817Ай бұрын
How many times has x86 died now? Are people forgetting that ARM is almost the same age as x86? The reason AMD and Intel are "joining forces" is that Intel owns x86 and AMD owns x86-64 (aka AMD64). If they want to update the ISA instead of just adding things, they need to agree with each other. ARM is not new and won't kill anything. People have been hearing that x86 will die since the '90s, and plenty of its supposed killers died before it did.
@MrHav1kАй бұрын
Spot on. You get it. People have been saying x86 is "dead" for over 30 years now.
@alexis1156Ай бұрын
Finally, someone actually smart. All of this RISC-vs-CISC doomposting is complete and utter BS. Complex instruction sets are not going away, and RISC is not the solution to everything; otherwise the entire industry would have gone full RISC decades ago. Even the supposedly RISC chips use CISC-style instructions and vice versa, and even the talk about power efficiency is mostly BS.
@GamerForLife514Ай бұрын
So true.
@CarewolfАй бұрын
@@alexis1156 ARM with SVE is more CISC than x86 ever was. It has thousands of instructions because it needs to deal with every combination of data width, data type, and counter width, just to support vector-width independence for chips that are all 128 bits wide anyway.
@pcallycat9043Ай бұрын
Yeah, but that isn't the kind of clickbait that gets views, lol. People also keep losing sight of the fact that ARM has been about to 'outperform' x86 for a couple of decades now. Every time the architecture gets close, Intel and AMD make a generational leap. The only reason ARM outperforms anything is that the architecture makes it somewhat simple to add as many cores as you like. Of course it can outrun another platform when the silicon is the size of your hand, lol. I do appreciate the competition though, it helps keep companies innovating.
@JBits-m9pАй бұрын
Horses vs. cars is a really bad analogy for comparing two different chip architectures.
@parlor3115Ай бұрын
Not in qualitative terms, but definitely in quantitative terms (how much value the competitor is bringing).
@hansmatos2504Ай бұрын
It's more like, horse vs donkey/pony
@doggSMKАй бұрын
Yeah, if the horse is x86.
@hansmatos2504Ай бұрын
@@doggSMK indeed. Neither of them is appropriate enough to be a car in the comparison
@dycedargselderbrother5353Ай бұрын
Especially when we're pretending that ARM is some sudden upstart and not a decades-old company. It's more like the electric car and gasoline car were invented at almost the same time and we're still arguing over the ultimate winner. Instead of one of them "winning" they've both grown into separate markets.
@ragingmonk6080Ай бұрын
If I had $1 USD for every time someone said "ARM is a threat" over the years, I would buy a Bugatti just to drive when I go grocery shopping. Just like about 8 years ago, when those same people said "APUs will replace discrete GPUs." Well, it has been 8 years and we all still own discrete GPUs.
Germany is going into recession, and any stock investor or most news agencies can tell you that. Germany is the economic powerhouse of the entire EU; if it goes down, then... So it is not that Intel is broke with no money left. The market is and will remain in decline.
"Chinese stocks suffer worst fall in 27 years over growth concerns" - 10/12/2024
"Japan unexpectedly slips into recession" - February 15, 2024
But quoting China reports is a fool's errand.
"China Urges Local Companies to Stay Away From Nvidia's Chips" - Yahoo Finance, September 27, 2024
"Nvidia shares fall and its Chinese rivals soar after Beijing urges AI companies to look elsewhere for chips" - September 30, 2024
Nvidia is not doing as well as it claims, and this is a poorly researched video.
@zam1007Ай бұрын
Ok China. Hail xi, glory to the party.
@ragingmonk6080Ай бұрын
@@zam1007 I am American, born in Cleveland, Ohio. Funny how people who do not agree immaturely accuse you of being from some other country. Well, look at society. The educational systems have failed us for over 20 years.
@AnonyMous-gt8vqАй бұрын
It's typical for "APUs will replace GPUs" Coreteks to make poorly researched videos. Even back in the day, when he made a video claiming that APUs are the future, it was clear that GPUs were getting bigger and more power-hungry and there would be no way to fit any decent GPU into the CPU. Yet he keeps making non sequitur arguments claiming stupid stuff.
@EmilePesky-n1vАй бұрын
Lies. If you owned a Bugatti you would pay your manservants to do the shopping. That carry-on is for poor people 😂. Guess who is trimming your toenails now? Manservants. 😂
@TechaktienАй бұрын
China says every year to stop using western chips. Fake news every year. Nothing new
@tvgerbil1984Ай бұрын
In the 64-bit world nowadays, Intel can't go it alone in a major way after it abandoned IA-64 and licensed AMD64 from AMD to use as the core 64-bit instructions for all its processors. When Intel recently put in the x86S proposal to clean up the x86 architecture, the bits that get chopped, like 16-bit real mode, 32-bit protected mode, etc., are all Intel's. The bits from AMD stay.
@peterwstaceyАй бұрын
Also note that the cross-license agreements for ISA extensions like x64 and SSSE are declared null and void if either party is sold, which is why, despite the current noise about Intel maybe being sold to Qualcomm, it will never happen.
@Slane583Ай бұрын
@@peterwstacey I don't foresee Intel being sold to Qualcomm, because some regulatory commission would step in and beat it down as a potential monopoly. The same goes for those who worry that AMD will stop making GPUs because Nvidia is so far ahead. The regulators won't allow Nvidia to be the one and only GPU manufacturer on the market. If such a thing happened, they would jack the prices of even the most basic crap hardware so high that no average buyer could afford it. Their current offerings are a fine example of what they're happy to charge you. They don't care about you, only your dollar, and if you can't give it to them they'll gladly find someone else who can and will without complaint.
@Ivan-pr7kuАй бұрын
X86 is not fragmented but bloated. You can't really get fragmentation with only two hardware vendors on the market. Anyway, Intel is planning to solve the bloat problem with X86S and APX.
@lharsayАй бұрын
ARM also got bloated over the years; if you buy a new phone, it still has to support 15-year-old mobile games.
@MyrKnofАй бұрын
Of course you can. Generation after generation of two vendors making and killing their own instructions. It does not give developers confidence that this new amazing instruction will stay, and only one vendor has it, so why use it? Then AMD/Intel sees it's not used and drops it.
@chainingsolidАй бұрын
Most software only uses the small, useful subset of the x86 ISA. Once the stuff that's only in there for backwards compatibility is out, the market can vote with its wallet on what needs to go back in and what can get tossed out for good.
@kirby21-xz4rxАй бұрын
@@lharsay Nah, ARM is absolutely nowhere near as bloated as x86.
@tvgerbil1984Ай бұрын
Out in the industrial world, x86 bloat is not a problem for embedded systems. In fact, the longevity of the x86 family is a blessing. A lot of embedded processors are still running 16-bit or 32-bit code.
@AzazelezazAАй бұрын
You start with RISC. Then, over the years, the market needs some complex operations to be optimized, so you have to include some complex instructions, and you have CISC. Then some people start to complain about the lack of simplicity and efficiency and reinvent RISC. Circle of life. It's like complaining that the JS ecosystem and workflow are too bloated and saying you can do better. You start from zero with vanilla JS, and over time you need features and end up reinventing the exact same bloat you complained about. Just recognize that people were in your shoes years ago and already fixed the problems you'll face along your path.
@GreyDeathVaccineАй бұрын
HTMX FTW!
@VicariousAdventurerАй бұрын
RISC is "Reduced Instruction Set *Complexity*" - No one would say PowerPC or Alpha is not Risc. No one would say New York City is small because of straight streets. PowerPC is complicated, but follows the rules of RISC: No mixing register transfer and load/store instructions, special instructions for unaligned data access (cross word boundaries) [an architecture class will show why this is tougher than appears], same-size instructions, rather than variable sizes [so that you know you have one instruction/word that you grabbed out of memory], consistency on indexing, so most instructions can use the same features - simplifies learning and compilation - these are characteristics of the instruction set, to make decoding easier and minimize stalls due to ISA misfeatures; they have nothing to do with the number of instructions.
@erictayetАй бұрын
@@VicariousAdventurer I was taught RISC stands for Reduced Instruction Set Computer. Anyway, I studied and did assembly back in the day and have not touched it since. It was too painful to write. I only did it for one project that required bitmasking to start the controller, and it was embedded in C code. I can't remember which controller, but we didn't buy an RTOS license, so bootstrapping was done manually to signify a start/reset versus a restart. I think it might have been an early ARM controller.

RISC (back then) required multiple instructions to add two memory locations. I think 8 is the minimum for the Motorola 68HC11, ignoring stack management. So even though it can execute one instruction per cycle, for meaningful work it's not very fast. CISC, on the other hand, like the Intel 286, can handle the same job with one instruction and complete the whole thing in 5-6 cycles. This is from the top of my head, so don't quote me. Writing that in x86 assembly was WAY faster and easier to debug than on the 68HC11. And since x86 had a performance advantage, most users went with x86 and the 68000 died.

Later, when compilers became REALLY good for both CISC and RISC, Intel introduced superpipelining and branch prediction to maintain the performance advantage over most RISC CPUs. It's ironic that Intel switched to a RISC-like core internally to let their CISC instructions hit a peak rate of 2 instructions per clock for certain instructions. After that, Intel stopped publishing how many cycles each instruction takes once they went fully speculative with register renaming. Eventually most top supercomputers ran x86, until the GPU came along. You can check out the Asianometry channel for more history on this topic.

I lived through this period and did most of my dev work in the Pentium 4 days. We worked with other microcontrollers to put in our equipment, and we preferred CISC-based controllers since a lot of the code was written in assembly by our engineer.
@CarewolfАй бұрын
@@VicariousAdventurer so you are saying ARM isn't RISC because it has Thumb instructions?
@zephyrprime7 күн бұрын
"You start from zero with vanilla js" - the mistake was starting with JS, lol
@mentalplaygroundАй бұрын
This channel has evolved into econo-fi. Most predictions presented here never come true. Still good entertainment.
@esra_erimezАй бұрын
I have a feeling that this video is going to age like milk. The US government just gave Intel billions of dollars to build fabs in the US. I suspect this was not only to bring jobs to the US but also due to Intel's strategic importance to the government and military should "something" happen to TSMC in Taiwan.
@wikwayerАй бұрын
If the Americans started producing the CPUs, I'm not sure the rest of the world would be able to afford them.
@parlor3115Ай бұрын
Yes, build fabs, not R&D for x86 chips. This is why Intel is rumoured to split into design and manufacturing: the US government wants an alternative to TSMC on US soil. So unless this partnership yields a solution that can compete with CUDA, Intel could be relegated to manufacturing only, while the design branch and AMD could be driven out of business by more ARM CPUs/APUs coming out.
@cj09beiraАй бұрын
@@wikwayer Bro, those Taiwanese working at TSMC aren't being paid badly; you don't become the number-one foundry by exploiting cheap labor.
@vaeloreonari7516Ай бұрын
The US gov't just told intel to provide them with their roadmap before they pay one dime!!
@rightwingsafetysquad9872Ай бұрын
@@vaeloreonari7516 So what? Everyone lies to the government about their roadmap. How long did the F-35 project take? I-71 construction in Columbus, OH is only about 30 years behind schedule.
@blackmennewstyleАй бұрын
There is no real competition in the chipmaking industry. Just pay attention to the executives and engineers: they have literally worked for most of these companies. It's even worse if you look at most new chipmakers - same result, most of the talent comes from the major chipmakers, and as soon as the startups become known, they end up being bought by one of the big companies. So how can you believe these companies are being competitive when they blatantly just create and divide up new markets? It's flagrantly arranged competition and monopoly, protected by an aggressive imperialist power. Don't be fooled.
@catmeow11111Ай бұрын
Yup! Pretty much sums it up!
@User9681eАй бұрын
Even Nvidia and AMD price-match.
@HanSolo__Ай бұрын
@@User9681e I'm not saying AMD and Nvidia have no connections between them. But the "price matching" you're talking about is a thing of the past.
@User9681eАй бұрын
@@HanSolo__ Well, yeah, there's less price matching now that Nvidia has better software and can price higher. But there still isn't aggressive competition.
@polla2256Ай бұрын
It's not imperialism, it's a corporatocracy lobbying technocrats (Europe) and the imperialists (US). Welcome to globalism, there will not be 3 main powers (as in 1984) but 3 ideologies.
@FrankHouston-v5eАй бұрын
ZLUDA is a New Hope against the Nvidia galactic empire 🧐.
@mahiralrafid7767Ай бұрын
I want Nvidia to lose so hard because of their new business practices. I use a 1060 6GB, a byproduct of a better Nvidia.
@asandax6Ай бұрын
Didn't AMD shut it down because Nvidia started threatening lawsuits?
@marshabrightly1307Ай бұрын
yes but they can't use it....or Nvidia will sue them
@FrankHouston-v5eАй бұрын
@@marshabrightly1307 ZLUDA was relaunched as a clean-room project on 10-04-24. Nvidia can't touch them. I'm planning to contribute to the project 👁
@arenzricodexd4409Ай бұрын
In the long run, ZLUDA would give Nvidia even more of an advantage. That's why Intel and AMD dropped the project: they know how such a project can undermine their oneAPI/ROCm efforts.
@rocker10039Ай бұрын
ROCm is open source; I hope it keeps getting developed to compete with CUDA. I think x86 will survive but will decline over time. I also hope ARM matures more; we have seen how ARM can't run games like Valorant, or even certain versions of Adobe Premiere and other software. The future is exciting, tbh. Either way, the winner is TSMC.
@adnanabdullah7919Ай бұрын
That's a temporary thing.
@kirby21-xz4rxАй бұрын
@@adnanabdullah7919 How temporary? We've wanted better dev support for ARM computers for years and it's still not there. App compatibility is the only thing Intel has over ARM right now, and they know that if it's no longer a problem they're in deep trouble, which is why they see ARM as a threat 😭
@seylawАй бұрын
Intel's solution also seems viable in the long run, with oneAPI, Level Zero and SYCL support.
@Alitar1212Ай бұрын
Why does this dude sound enthusiastic about a monopoly forming? If Nvidia ends up dominating fully in the long term, it's GG for everyone. Yet he sorta sounds excited about this hypothetical future. Just odd...
@Jamies-gz7ixАй бұрын
some of us invested millions into Nvidia. 😅😅
@Alitar1212Ай бұрын
@@Jamies-gz7ix starting to make some sense
@meekmeadsАй бұрын
I got a single Nvidia stock 😂
@jensv874Ай бұрын
Yes, and it's been noticeable for some time now that Nvidia has no competition in the higher-end GPUs. Prices are insane. Yet the bulk of Nvidia's profit is not from GPUs but from data centers.
@opsxp23Ай бұрын
NVIDIA is my largest position. If you're a tech enthusiast, why not make money with that knowledge?
@skndr3067Ай бұрын
To truly kill x86 they'd need to make everything for ARM: the OS, the drivers, and programs and apps specifically developed for ARM.
@TheExileFoxАй бұрын
Not only that, but also a near-perfect translation layer that doesn't break shitty DRM and other garbage embedded in software, and not just in games.
@edwinpickett13Ай бұрын
Translating software has become a non-issue.
@arenzricodexd4409Ай бұрын
ARM is a very well-known architecture at this point. Developing apps specifically for it will be less of an issue.
@ElMarcohАй бұрын
For personal use it's OK and very doable; for professional and server use it's difficult. There's a huge amount of software that is either stuck on x86, where emulation quirks are a risk, or depends on old versions of closed software that cannot be recompiled, and that software is often the stuff running critical systems for banking and infrastructure.
@mrxcsАй бұрын
@@skndr3067 It's pointless, because even if they succeeded in all of this, they would be stuck on one architecture, or over time it would become like x86, with a lot of legacy instructions and lost efficiency. The other alternative is depending on software updates, which is not good; that could also require more translation, which would hurt performance and be a nightmare to implement. The "flaws" of x86 are the result of long-term reliability; if you focus only on efficiency, you lose that long-term reliability. That's why they are different solutions for different problems; they are not interchangeable.
@striker241186Ай бұрын
FYI: Photoshop AI features run fine on our current machines because the generation is done on Adobe's servers. If you turn off the internet, you can't use generative AI in Photoshop anymore. I would test the same with your audio software to make sure your point in the video holds up.
@mrxcsАй бұрын
To me, saying x86 is dying is like saying PC gaming is dying. ARM is an illusion; it can't do everything x86 can, and even if it could, that magical efficiency would disappear. As such, it is not and never will be a substitute. They are different solutions for different problems.
@SMGJohnАй бұрын
It's no secret that x86 is bloated, and that hurts efficiency and cost. x86 is also not very efficient at sub-10W power, despite what netizens would like to claim. Intel and AMD do not want to lose out on a substantial laptop market in the future, and potentially also want to revive the idea of x86 on phones.
@VicariousAdventurerАй бұрын
x86 is getting elderly in my eyes, but it's not really in trouble unless Windows on ARM catches on (and Nvidia makes graphics GPUs with compatible drivers), in which case they will probably supply both. Right now, they have that sweet duopoly.
@neofitosmihail4272Ай бұрын
How else is he going to get the clicks and likes?? Fake news, simple.
@neofitosmihail4272Ай бұрын
@@VicariousAdventurer Every Nvidia CPU for data centers etc. is based on ARM. Nvidia tried to buy ARM and failed big time, so I don't see any progress on Nvidia CPUs, do you??
@ThinkingnamesishardАй бұрын
Sure, but x86 can't do what ARM can either. We still have IBM mainframes somewhere in the world doing the same thing as ever; they're just not as popular and are therefore pretty much gone.
@damyjukАй бұрын
Intel painted themselves into this corner by refusing x86 licenses. AMD being fabless has a bit more room to manoeuvre.
@EpicSOB_Ай бұрын
I have yet to hear how ARM is better than x86. It's not that I don't believe it, I just don't know; no one has explained it to me yet. No one ever explains how it's better; it's always "ARM is better, don't worry about how or why, it's better because we say so." And didn't they say the same thing about RISC?
@raylopez99Ай бұрын
The EMPIRE strikes back... speaking as a long-suffering INTC shareholder. I recently tried a chess program written in Java on an ARM chip that was rated on Geekbench to be equivalent to a modern x86 chip. I found that not to be the case. Apparently the chess program was optimized for x86 and played significantly weaker on an Android device despite the "Geekbench 6" equivalence. It's one of two things: either the CISC µP of the Intel x86 chip does things the RISC-type ARM chip cannot do, or Geekbench 6 "overrates" ARM chips as being equivalent to x86 chips (or both). Facts. I actually admire ARM and wish them well, as I like low-power computers.
@mrxcsАй бұрын
I remember an engineer saying that ARM is only efficient because it can't do much; as soon as you add specialized capabilities like x86's, the efficiency falls off a cliff.
@VicariousAdventurerАй бұрын
@@mrxcs It can be efficient, but the design of phone chips and the design of performance chips in general cannot be directly compared (though shrinkage has meant that more transistors need to be "dark"; energy density is actually a factor). The ISA should help some. Also, Apple benchmarks could be more interesting than comments about one program that may not have been pure Java (JNI - look it up - allows transitioning to native code in areas where there might be a performance issue; if performance was one of the goals of this program, it might be Java in some areas for speed of development [garbage collection, etc.] and native in others for execution speed). No single benchmark, synthetic or using real programs, can draw a complete picture of performance, since scaling up the number of CPUs or the effect of cache can drag in nontrivial behaviors.
@benderozoАй бұрын
Well, the Geekbench sets "weight" to everything so it's not always comparable.
@EmilePesky-n1vАй бұрын
All I know is this: doing a web search on my phone is slower than doing the same search on my PC while using my phone as a hotspot (wirelessly, obviously).
@raylopez99Ай бұрын
@@VicariousAdventurer True but against that, the designer of the GeekBench 6 test has said it's cross-platform, possibly for marketing reasons.
@MnemonicCarrierАй бұрын
Migrating the entire planet over to ARM will be a huge undertaking. Having said that, Apple did it quite successfully (which is the real reason why Intel and AMD need to be very careful right now).
@sergrojGrayFaceАй бұрын
A foundry in Germany can't happen with the energy costs seen since 2022. This one is not an indication of an Intel problem; it had to be cancelled or put on hold no matter what.
@kynikostashasch2218Ай бұрын
This isn't a horses-vs-cars scenario, it's a horses-for-courses one: RISC designs like ARM are fantastic in some situations and crap in others. There are markets for both products, and ARM is not going to replace x86 for desktop and home compute in our lifetime, just like x86 is not going to find its way into phones or tablets.
@JavoCoverАй бұрын
Back in the day I had a cheap Android tablet that came with an Intel CPU (not an Atom). That thing was at flagship level in CPU performance (Qualcomm S800 class), but the GPU was complete crap. It seems they could never get the energy consumption low enough for mobile.
@CarewolfАй бұрын
ARM isn't any more RISC than x64 is. RISC has absolutely nothing to do with anything going on.
@XetrillАй бұрын
Riiiiiight, Intel was so reluctant to get the latest ASML tech that they totally weren't the very first to get the new machines. No way they would make a video about it and publish it on their channel or anything like that. That machine also isn't being tested and/or readied for production right now, this very moment. Bruh...
@AlexGoldringАй бұрын
A note on the sponsor: OEM keys are not resellable; you're going to have the key revoked within 1-2 years at most. What happens is companies get OEM status with Microsoft, then start reselling the keys; they get busted eventually, and Microsoft systematically disables all keys sold by the company. Also, as a game developer and a software developer, I can tell you that sites like these are parasites. If you want my software but can't afford it, please pirate it instead of paying these leeches.
@LostinSpacetimeАй бұрын
You might be right in some aspects, but I bought 20+ OEM keys in the past 10 years, and not a single one got revoked so far.
@joepearson6644Ай бұрын
You're right, my Windows 10 key that I bought 5+ years ago is revoked. It won't activate; it says invalid 😮
@cleny217Ай бұрын
I use these sites when I don't want to support the company but still want its stuff on Steam or similar. So any Ubisoft or Bungie game or related service, I buy from key sites.
@AlexGoldringАй бұрын
@@cleny217 Honestly, better to just boot up BitTorrent. These websites are really scummy and they enrich scammers. I released a game on Steam a while ago, and I never sold any keys, only ever provided keys to reviewers. Guess what? My review keys showed up on those websites. There's a whole industry of scammers out there, and the ones that get damaged the most are not Ubisoft and EA; for them this is typically a fraction of a percent of all keys. I contacted one of the websites to take down my keys and they refused, so I had no choice but to revoke the keys. They don't care. 70% of the money goes to scammers, and 30% goes to these parasites that enable and protect the scammers. I do get the argument that it feels "legal" and is often less hassle than good old piracy. But it's really not, not to the developer. In fact most developers prefer piracy to this, myself included. Imagine you want to review games and you send me an email: I'd be happy to give you a key, but now I have to investigate whether you're a scammer, and the barrier for me to provide that key suddenly got higher.
@ClaymirekoАй бұрын
I was told by SCDKeys that when you buy a key you have to activate it soon or it becomes unusable. They told me this when I bought 4 keys and 1 key did not arrive within 30 minutes. I told them I was installing Windows on multiple PCs and declined their offer to have the key delivered later; they said OK, and the key arrived in about 5 minutes. I have many PCs that were activated years ago and remain so.
@Gunni1972Ай бұрын
Su and Huang: "One family to rule them all." I expect more layoffs and a merger, and because SMIC exists, it would all get through the monopoly regulations. (Imagine if we all used the same factory in the USA, so much cheaper. Not.)
@solasslym3Ай бұрын
13 minutes in and you've made the case for the FTC to break up nVidia because it's a monopoly.
@valentinbusuioc4054Ай бұрын
The tech money-printing machine is coming to a stall. The tech industry has reached a plateau... everything is almost good enough, and the only direction from here is AI, but the market and the industry are exhausted. Let's hope it's not 2000 reloaded.
@prashanthb6521Ай бұрын
True, tech funding might dry up in the coming months.
@TechaktienАй бұрын
Four years ago Coreteks said the same. ARM laptops aren't selling in 2024. Too many disadvantages.
@seylawАй бұрын
That's not the CPU's or even the ISA's fault though, but the GPU part and the bad GPU drivers from Qualcomm. Microsoft also needs to do more work if they really want WoA to be a success story (so far, it isn't).
@piotrd.485023 күн бұрын
Software. If Apple - APPLE - failed to get native implementations of, e.g., Java, .NET, popular libraries and runtimes, and stable support for TensorFlow and the rest, what chance do others have? Has any serious vendor ported software natively to Apple Silicon? Say... CATIA?
@cpuukАй бұрын
Intel may have the last laugh: if Taiwan is invaded, TSMC will go dark, which will leave Intel as the only manufacturer outside the Asian sphere of influence making chips in quantity. Of course, this assumes Foundry doesn't go bust first. Intel's mistake was building too many fab plants too quickly (they are eye-wateringly expensive to build).
@jtd8719Ай бұрын
TSMC is building fab capacity in NAMER, so they see the risks and are starting to do something about it.
@nimroderyАй бұрын
America won't let Intel go down without a fight, it's tech subsidy time.
@pamus6242Ай бұрын
I would really like Intel to sell itself fully and dissolve everything into AMD. Let AMD manage and look after x86 and quietly transition to ARM/RISC-V when no one notices, building an emulation layer for x86 in the process.
@ehenningsenАй бұрын
Why would I be worried? I'm a consumer, not an investor.
@uguraktas8018Ай бұрын
Arrow Lake shows that the efficiency of Apple/Qualcomm PC chips is not inherent to the underlying ISA but comes from how the actual chip is designed and with what priorities. As long as AMD and Intel are competent enough to adjust to the market's demand for efficiency, they will do just fine with x86.
@esra_erimezАй бұрын
I suspect that the real threat is to ARM, from RISC-V. It appears to me that with all the restrictions on technology exports to China, it makes sense for them to develop a superior RISC-V CPU. Also, let's not forget that Tenstorrent is using RISC-V in their AI products, and Jim Keller is the CEO there.
@jessietomich8043Ай бұрын
I wonder what Jensen is thinking that he isn't saying, specifically after Intel shut them out of the x86 market. At one point Nvidia was making Intel chipsets and had plans to use Project Denver with a translation layer to get into the x86 game. Intel muscled them out. I wonder if Jensen thinks this is Intel getting what they deserve.
@jaynorwood2Ай бұрын
PG has made comments that imply Intel is open to letting Jensen build x86 or ARM products that integrate his own GPUs. Maybe we'll see a Jensen partnership if 18a is demonstrated to be successful in Panther Lake.
@abhishekmaurya4665Ай бұрын
There is so much software, and so many games, that run on x86 processors that it's not possible for ARM to replace it. ARM is not a drop-in replacement for x86.
@TheExileFoxАй бұрын
This is often overlooked. Apple has been doing translation with Rosetta, but even their solution is not perfect; old macOS software may not work properly on ARM.
@VintageCRАй бұрын
While I agree with this statement, this is not the reason our beloved chipmakers (AMD/Intel) are joining forces...
@abhishekmaurya4665Ай бұрын
@@VintageCR This is more like a group that decides new standards and specifications for x86, similar to the IEEE. If we truly want to democratise chip design, then we must support the RISC-V instruction set, as that one is open source.
@iLegionaire3755Ай бұрын
x86 dying would be a complete negative for society; it's throwing decades of software and business-software compatibility straight out the window. It won't happen without massive problems. ARM will always be more power-efficient than x86, but for raw performance, x86 is the way to go.
@Lock2002fulАй бұрын
Lol, comparing x86 and ARM to horses and cars. Stopped watching right there.
@taccosspicy3014Ай бұрын
Nope. According to financial records, not a single corporation has been able to leverage AI to make a profit, or even break even. Once corporate backing goes away, AI development fever will fade. As for x86, to stay relevant it will have to mutate into something between a CPU and a GPU with a baked-in programmable AI module. As for ARM replacing the desktop, LOL. By its nature ARM will be beaten by more power-hungry, more powerful chips with a wider set of capabilities. Market share for these parts will undoubtedly bounce up and down, but one will never replace the other completely.
@esra_erimezАй бұрын
Intel has had many non-x86 CPUs, for example the iAPX 432.
@AngelicStreakАй бұрын
Comparing horses-automobiles to x86-ARM is... asinine, for the lack of a better word.
@__-tz6xxАй бұрын
I was thinking the same thing 😃
@4ujaseАй бұрын
Horses vs. cars! That's a terrible analogy. ARM has been around since the Cambridge outfit (Acorn) created the blueprint back in the '80s. It's only now gaining traction because people want energy-efficient laptops. In the near future, I see the x86 architecture being strapped onto RISC-V or something better for backward compatibility. I think it will gradually fade away, and ARM will find a home in Apple's niche products.
@jkteddy77Ай бұрын
Checking the market-cap gains over the last 6 months between these three, this move almost seems late. We as enthusiasts have to stop considering these three equals. It's Nvidia vs. the world.
Nvidia: +60%
AMD: +5%
Intel: -35%
@rattlehead999Ай бұрын
6 months is nothing. But yeah it's Nvidia vs the world.
@jkteddy77Ай бұрын
@badass6300 context Nvidia is "winning more" long after the early year spike. picture is even grander last 2 years...
@cin2110Ай бұрын
Nvidia has a risk of crashing hard, though. AI chip competitors are coming, and their revenue is not keeping up with the stock value; they are overvalued.
@xlr555usaАй бұрын
Only a handful of the largest companies can afford Nvidia AI GPUs, and Nvidia has locked up AI inference and dominates compute with the proprietary CUDA framework. We can't continue like this; eventually Nvidia's grip will be broken. It's a single point of failure that is a security risk and also stifles growth. Nvidia is also being investigated by the DOJ; long term, it's not good.
@Osaka2407Ай бұрын
At this point, despite all of the technology, NV is nothing more than a stock bubble - a bubble which the leather-jacket man has to manage so it bursts as late as possible, so he personally doesn't get in trouble.
@googIesuxАй бұрын
CUDA was always vendor lock-in, but short-term profit is all that matters in this environment. Slow trainwreck
@jaynorwood2Ай бұрын
The biggest demand over the next few years should be for the conversion of AI and HPC chips to optical I/O and the increased use of 3D advanced packaging - nothing to do with ARM vs. x86 instruction sets. The Intel and AMD announcement is probably being driven by customers. Intel's AMX tiled matrix processing and the numerous operation additions in AVX10 put AMD in a four-year catch-up mode. AMD needs to emulate those operations until then, so customers would like some compatibility layer. I believe Triton is already becoming the open AI-processing compatibility layer for CUDA; it is used by PyTorch. There's also the open UXL project that provides an open SYCL alternative to CUDA, with the advantage that it supports heterogeneous architectures within a single source.
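To make the Triton point concrete, here is a minimal sketch of a Triton kernel (the standard vector-add example, assuming the triton and torch packages are installed and a supported GPU is present). The same Python source is JIT-compiled by Triton's backends, which is why it gets described as a vendor-neutral layer sitting where hand-written CUDA used to be:

```python
# Minimal sketch: a Triton vector-add kernel launched from PyTorch.
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)                       # which block this instance handles
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements                       # guard the tail of the array
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = out.numel()
    grid = (triton.cdiv(n, 1024),)                    # one program instance per 1024 elements
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out

x = torch.rand(4096, device="cuda")                   # "cuda" also maps to ROCm builds of PyTorch
y = torch.rand(4096, device="cuda")
assert torch.allclose(add(x, y), x + y)
```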
@AMDRyzenEnthusiastGroupАй бұрын
x86 isn't going anywhere anytime soon. ARM will continue to take some market share, but the two will coexist for the foreseeable future. The sky is not falling...
@AxeiaaАй бұрын
ARM (the architecture) is strong-armed too much by Arm (the company). RISC-V is the real risk. That somehow ended up rather punny. The way I see it, Arm (the company) is too greedy for the architecture to ever be all that widespread. RISC-V, however, has no such drawbacks once a few big companies jump on it, and I could see AMD/Nvidia being some of those companies.
@DaystromDataConceptsАй бұрын
I keep hearing about ARM coming to dominate the desktop, and yet I keep hearing how the latest port of Windows to the newest ARM CPU is lacklustre in terms of performance. ARM has its niche, but I can't see it seriously threatening the powerhouse x86/x64 CPUs.
@dezmirean1Ай бұрын
x86's future is highly correlated with Microsoft's path. I'm putting my faith in Microsoft's inability to create a broad, stable, coherent ARM Windows OS :) Don't get me wrong, I would love to have a Windows alternative, but it seems far away.
@seylawАй бұрын
I wouldn't mind a more standardised ARM Linux environment taking over the consumer space. But that would probably need cooperation between several hardware vendors on a common desktop platform, plus help from Valve on the software side to bring gaming into good shape on ARM.
@piotrd.485023 күн бұрын
That's why people are looking for alternatives (read: Apple) as Windows enshittification continues.
@nortieroАй бұрын
Remember that Su was in charge of AMD's not-so-successful ARM semi-custom efforts. Zen was cooking well before she rose to the top. Zen is the breadwinner, so it must be taken care of, but I would not exclude a turnaround if the necessity arises... Such a joint initiative will be more lawyer-oriented than silicon-based.
@user-zh9kc7tw4n24 күн бұрын
8:40 The problem for manufacturing in Germany is that they have closed down their nuclear power plants, and now their coal plants, driving up the cost of energy and making the power supply unreliable: output is high on windy days and low when there is no wind.
@MrHav1kАй бұрын
I'll believe it when I see it. x86 has been "dead" for the last 35 years, lmao.
@daLiraXАй бұрын
The horse vs car comparison is rather a car vs a cargo ship. Yes, it's big, and a bit useless for small devices, but try emulating 400000 tons of instructions on RISC.
@NatrajChaturvediАй бұрын
Jensen Huang haters will have heartburn throughout the video. As a customer, I don't like the man either, but you gotta admit he's sharp and he's been at the top of his game for so long now.
@yuan.pingchen3056Ай бұрын
It's better at exploiting politics than technology, and is extremely sensitive to whether it becomes the target of public criticism.
@p_1945Ай бұрын
Even Nvidia has its own ARM CPU project, same as AMD, but it's too soon for x86 to collapse within a few years; Jensen still needs the money from graphics cards, which stay on x86, to buy time after Nvidia's takeover of ARM was blocked by regulators, and the Chinese and South Korean governments will likely throw tons of money at ARM to beat x86. In the past that wouldn't have been possible, but one thing the big tech companies miscalculated is that a government like China's would join the race with tons of money; a Chinese-EV-style deja vu could put them in full panic mode.
@yuan.pingchen3056Ай бұрын
@@p_1945 You thought it was the government but it was actually bandits
@piotrd.485023 күн бұрын
More than that: he has set up and started the game before anyone knew that it's on.
@omichalekАй бұрын
With this development, it is even more mind-boggling that AMD would prevent anyone from accessing the CUDA compatibility layer for their GPUs - the ZLUDA project they themselves sponsored earlier...
@TopHatProductions115Ай бұрын
I don't think I want an ARM-based future atm. Just seems like all of client-based compute will become throw-away at that point. I'd rather be forced to move to IBM Power or RISC-V than ARM.
@mrhassellАй бұрын
The Digital Vector Generator was the first vector generator Atari developed, and was used in Lunar Lander, Asteroids, and Asteroids Deluxe. The first vector supercomputers are the Control Data Corporation STAR-100 and Texas Instruments Advanced Scientific Computer (ASC), introduced in 1974 and 1972, respectively.
@mrhassellАй бұрын
The Cray-1 was the first supercomputer to successfully implement the vector processor design.
@DaveGamesVTАй бұрын
I think nvidia genuinely DOES need to be broken up.
@Alice_FumoАй бұрын
Yeah. Just take the CUDA and make it an open source project. Problem fixed.
@chipperjoneszzАй бұрын
Not happening. Nvidia shills won't let that happen to their beloved leader Jensen. 😂😂😂
@jamescarter8311Ай бұрын
Breaking up NVIDIA would ultimately make them wealthier than ever. The companies would still have the same owners and executives, and they could then grow and acquire competitors without the FTC getting in the way.
@nempk1817Ай бұрын
According to this channel AMD and intel will die in 5 years
@ls_1101Ай бұрын
Bring back 3dfx
@sidlives2672Ай бұрын
Intel and Samsung not implementing EUV lines is so short-sighted. By doing this, they set themselves up to become the next GlobalFoundries. You will hear of Intel and Samsung ceding high-end lithography more and more to TSMC as they continue to fall farther behind in the race to smaller and smaller lithography sizes.
@jaynorwood2Ай бұрын
ASML recently announced that the second highNA EUV machine installation was completed at Intel.
@philosoaperАй бұрын
There are already many more ARM devices out there than x86... however, it's been shown time and time again that ARM is not really an effective replacement, and the most recent "Windows on ARM" push is just yet another failure. How about people start looking at merging the two into something better, instead of whining about how one is better than the other in every way, when it's abundantly clear that each has strengths the other doesn't?
@zoeherriotАй бұрын
No, it’s perfectly effective as a replacement. It’s not a hardware problem. And what do you mean “merge” - there’s not really anything to merge here… The strengths are because of their respective architectures and there is no way to combine them because they are fundamentally different.
@RaletiaАй бұрын
You can just combine ARM and x86-64 cores. Chiplets and big.LITTLE designs already combine different things. But beyond that, for about a decade x86 CPUs have already had ARM built into them: the security processor, an entirely different SoC built into the CPU. (AFAIK.)
@zoeherriotАй бұрын
@@Raletia that’s not really buying you anything though…
@philosoaperАй бұрын
@@zoeherriot It's not... it depends entirely on what you're doing.
@zoeherriotАй бұрын
@@philosoaper Then why do Apple devices running ARM tend to outperform x86 chips in almost all use cases when they drop a new revision? There's no inherent benefit to x86 over ARM. In fact, it's quite the opposite: x86 usually requires significantly higher power to achieve similar performance.
@1sannnАй бұрын
It will be a sad day if x86 goes down... the death of natively running programs from 20 or more years back... x86 is what so many programs are built on. And emulating x86 won't be good for 5 or more years after that death.
@doggSMKАй бұрын
For the last 50 videos it's like: ARM blah, blah, blah - RISC V, bla, bla, bla - ARM blah, bla, blah and RISC V bla, blah, bla". Oh and also: "nVidia this, nVidia that, nVdidia good, nVidia great".
@bakpaokacang9651Ай бұрын
If I got $1 for every rumor that ARM will replace x86, I would already have an RTX 4090 by now... heck, I'll have to enjoy my 1050 Ti for a while.
@4NT0N10MZC25 күн бұрын
You can run CUDA on AMD chips; the runtime is called ZLUDA. It functions similarly to WINE, by taking calls to the CUDA libraries and substituting ZLUDA's own open-source implementations.
@kamrankazemifarАй бұрын
Would be insane if the next Xbox with the largest technical leap ever went with an Nvidia CPU and Nvidia GPU.
@garystinten9339Ай бұрын
If it can blow a PC out of the water and photorealistic graphics are your thing, yeah, sure, go for it... Once they're jailbroken and a PC environment is loaded onto them... could be good, could be bad. The only way to know is for it to happen.
@everopeАй бұрын
It's more like x86 is the petrol car and ARM is the electric car...
@thoongchinglee4905Ай бұрын
RISC is the PHEV
@jamescarter8311Ай бұрын
Not even close.
@nempk1817Ай бұрын
Keep forgetting that ARM is just 5 years newer than x86.
@cj09beiraАй бұрын
Without an EPA forcing people away from petrol. And just like electric, its "efficiency" claims aren't really true in the real world once you need single-thread performance competitive with x86. Just look at Amazon's CPU: at best it's 10%-ish more efficient than Zen 4 CPUs, with areas where it's much slower, never mind competing with Zen 5. Jim Keller said it best: ISAs don't really matter. And if they don't, why abandon the one that gives you great backwards compatibility?
@jemborgАй бұрын
Right. As I understand it, nowadays operations are "cheap". It's the migration of data around the chip that is the most "costly".
@hex1934Ай бұрын
Always is\was migration.
@naalsocomment9449Ай бұрын
I clicked the thumbnail because of the title, but got a video whose content doesn't even start until after 2:30... with how cars changed the world when they replaced horses? Seriously?
@IRDazzaАй бұрын
ARM just doesn't have the processing power. It's okay for low-end systems and mobiles, but look at the latest Windows laptops with ARM: they are useless beyond basic office work. Look at Apple: nowhere near the power of their Intel rivals, and they run a fully custom OS written specifically for their ARM chips. Okay for some apps and the internet, but near useless for gaming, heavy development, or data management and processing.
@Dulkh4nАй бұрын
But that's because of Apple, not the old ARM guard.
@MDXZFRАй бұрын
As long as the gaming industry stands strong, x86 is going nowhere.
@gw762410 күн бұрын
Those of you claiming that because ARM has been around as long as x86, x86 isn't under threat, are completely wrong. Never before have ARM-derived SoCs been the _viable_ desktop alternative they are rapidly becoming. We already have ARM-based laptops on the market, despite their current shortcomings, and it's only a matter of time before ARM-based desktops start hitting the shelves. More importantly, ARM is growing rapidly in the server space at the cost of Intel's and AMD's margins, which is exactly why AMD and Nvidia are developing ARM processors.
@C0smicThund3rАй бұрын
Is this relevant today? No. Is this going to be relevant next year? No. Is there software I use today that relies on instruction sets only available on x86? Yes. Case in point.
@KeyYUVАй бұрын
AMD does have HIP, which is similar to CUDA, but the automatic translation from CUDA to HIP is pretty bad, especially if you use Nvidia libraries that have no HIP equivalent. I really hope someone pumps some engineering hours into HIP so it can flawlessly compile CUDA projects.
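For what it's worth, at the Python framework level the masquerading already mostly works: a ROCm build of PyTorch exposes HIP through the same torch.cuda namespace, so code written against the "cuda" device runs unchanged on AMD hardware. A minimal sketch (assuming a ROCm build of PyTorch is installed):

```python
# Minimal sketch: on a ROCm build of PyTorch, HIP is exposed through the same
# torch.cuda API names, so CUDA-targeted Python code needs no source changes.
import torch

print(torch.cuda.is_available())              # True on both CUDA and ROCm builds
print(torch.version.cuda)                     # set on CUDA builds, None on ROCm
print(getattr(torch.version, "hip", None))    # set on ROCm builds, None on CUDA

device = torch.device("cuda")                 # maps onto the HIP runtime under ROCm
a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
c = a @ b                                     # dispatched to rocBLAS instead of cuBLAS
print(c.device, c.shape)
```

The pain the comment describes is at the C++ source and library level (hipify plus missing library equivalents), which frameworks like this simply hide from the end user.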
@Austin1990Ай бұрын
A great amount of industry is leaving Germany because the country doesn’t have enough energy, resulting in high prices. There are even talks of scheduled, rolling blackouts. Rough stuff.
@erictayetАй бұрын
Just restart those reactors! Make Sabine Hossenfelder happy again. :D
@Austin1990Ай бұрын
@@erictayet That would definitely help! Cutting off Russian natural gas is another big factor, though.
@erictayetАй бұрын
@@Austin1990 Yes. Germany is an advanced economy with high precision engineering as the backbone. Without a consistent and powerful power grid, how can Germany do day-to-day manufacturing or R&D that involves high electrical usage? Solar & wind are not going to cut it. Even nuclear-shy Asian countries are considering and putting it on public agenda. Tech companies are spinning up nuclear reactors just to power their data centres.
@CarewolfАй бұрын
No, there isn't. Why are you making up dumb shit? Germany is on the continental energy grid; for it to run out of energy, all of Europe would have to, and it isn't running out of energy, not even on its own. They made a few stupid choices, but those just cost a bit extra; they aren't stopping energy production.
@Austin1990Ай бұрын
@@Carewolf They literally recently announced rolling blackouts, and several companies have announced why they are leaving Germany. It is all public, although I doubt the media is parading it around.
@stevensmith6445Ай бұрын
There is no competition. This is a rarely oiled Kaiju of a machine. Squeaking at every joint, swollen with corporate greed and grinding from the rust of intentionally nerfed technological progress. Stomping around making a big noise, but not doing much at all.
@phoenixfireclusterbombАй бұрын
I couldn’t follow, the personality is A.I. and my brain is ignoring it.
@__-tz6xxАй бұрын
In a year or two Nvidia and MediaTek will release an x86 CPU and GPU handheld PC gaming device, and the x86 CPU will be much like Intel's Lunar Lake, mimicking phone chips so it doesn't use as much battery, and you will be wrong.
@MarcusRanfordАй бұрын
You only need to look at the lacklustre performance of Qualcomm's X Elite to see the challenges involved in transitioning to ARM/RISC/RISC-V. Intel is the only one of the big three not solely reliant on TSMC, and as always, past performance is not a guarantee of future results.
@justinbarnes8834Ай бұрын
The question is just how much further it is possible to miniaturise x86 or ARM. The only reason ARM is currently behind is lack of software support. The situation is similar to the Linux/Windows problem: one is bloated and slow, the other doesn't have the same software compatibility or support.
@AndroidPCMMORAWАй бұрын
This video is just Ramblings .......
@moomah5929Ай бұрын
I'm waiting for the AI bubble to burst. There are already plenty of people warning that companies might not be able to recoup the money spent jumping on the AI bandwagon. Putting "AI" into your product names because it's the "cool thing to do" will most likely also make for good jokes in 20 years.
@lmotakuАй бұрын
It's not as big of a deal as people are making it out to be. AMD is Intel's baby, and it's like everyone forgot. Intel stockholders were understandably worried back when Intel was the only major CPU manufacturer, around the time the i386 was a thing, then the i486. It was before my time, even. It later became "x86" because both AMD and Intel were involved. One of the major heads, Jim Keller, helped put together x86_64. Keller helped with Athlon, Zen, Nvidia AI, Intel, and so on.

ARM has two major pitfalls. One: instruction sets are small. Two: memory usage is restrictive. Desktop PCs require both large amounts of memory and larger instructions. Now, if AMD and Intel make a sort of handshake on certain instructions to ease compatibility for programmers (not having to learn one way to code over another for each CPU), then x86 as a whole becomes much more capable of ensuring high performance across architectures. Previously, when AMD wanted to use, say, SSE2, they had to license it from Intel. This partnership might just mean they're throwing the old licensing agreements out the window to get developers to stay on x86.

Yes, if Nvidia lets ARM vendors implement Nvidia as an iGPU in their CPUs, they could likely contend on the desktop just by redirecting anything needing big instructions to the GPU and keeping the small stuff as usual, with some SoCs to handle more specific logic. However, this might be some ways out. I don't see them releasing a perfect product in the next 2-3 years. Maybe in 4, perhaps 6, we'll finally start seeing marketing for their first attempt at it, and who knows where x86 will be by then.
@loganwolv3393Ай бұрын
Even in the best-case scenario for ARM, it won't enter the gaming CPU market anytime soon because of its fundamental disadvantage in single-core performance. It seems that throwing more and more cache at each core really helps gaming performance, rather than just adding more cores.
@Matt2010Ай бұрын
What they should be more worried about is RISC-V, though personally I have no dislike of either it or ARM.
@wazzamolloyАй бұрын
The x86-versus-ARM debate reminds me of the CISC-vs-RISC debate, where x86 was the CISC hero. Clearly the fragmentation of x86 instructions is real, because Intel treats vector instructions as optional and AMD treats them as mandatory.
@hypothalamusjellolap8177Ай бұрын
Intel and AMD could just debloat x86 to a RISC-like form and make efficient cores alongside legacy cores. Next, they could provide a binary translation framework or preprocessor for recompiling and running legacy executables that don't use the cleaned-up x86 instruction set. Ironically, they could use AI to optimize x86 instructions.
@GreyDeathVaccineАй бұрын
For every four legacy-free cores, just add one core with the legacy instruction set and handle this in the operating system scheduler.
@piotrd.485023 күн бұрын
Theoretical: true. And they could unf.......k their overly fragmented lineup. Practical: they can't, for life of them, write decent microcode / drivers.
@JepharyАй бұрын
I'm old enough to remember when Intel and AMD were essentially partners.
@TroyDanielStover16 күн бұрын
The AMD Socket A and Intel Socket 370 were compatible in some ways, but Intel made sure that an AMD processor would not fit into their socket. Swapping the two would cause issues, primarily with the FSB (front-side bus), which could damage the motherboard. As FSB technology became obsolete, the memory controller was moved to the CPU for a variety of reasons, including better performance and efficiency. Intel's Core architecture marked the end of the FSB era, with the Core i-series becoming the new standard. Meanwhile, AMD's FX series failed to post BIOS correctly, further highlighting the shift away from FSB-based systems. The Athlon 64 showed Intel that AMD was ahead in certain areas, particularly with its integrated memory controller and performance. However, with the rise of new memory technologies like CUDIMM, AMD is likely to fall behind again in DIMM technology as the industry moves towards larger, faster memory modules. As we approach 10GB/s RAM chips, the traditional DIMM form factor may soon phase out, shifting the focus to more advanced memory solutions.
@deeg_with_robotsАй бұрын
Great video as always. One correction: TSMC is not charging ahead with High-NA EUV, reportedly opting for multi-patterning with low-NA until around 2030.
@adi6293Ай бұрын
The market is creating its own future disaster. What will AMD have to do to sell their AI chips, PAY PEOPLE to take them? Seriously, what a mess. You would think that having to pay $40k per Nvidia chip would be a wake-up call to these companies, because, like GeForce cards, the price will only go up.
@papahugeАй бұрын
Comparing ARM to the industrial revolution is very reachy, it's far from it... Quantum Computing on the other hand will have a similar impact as stated.
@oysterhead5150Ай бұрын
If ARM becomes dominant in the PC industry, there won't be a DIY segment anymore.
@sacamentobobАй бұрын
And I will lose even more interest in PC hardware, as someone who has been using PCs since the late '80s. The integrated-solution crap is going to be the final straw for me.
@C-M-EАй бұрын
I still find it humorous that ARM shed its stigma as the processor in a basic calculator, versus one running something like a serious server network, REAL quick over the last few years as AI development has exploded. Heck, I still look at them like an Arduino vs. a modern Threadripper.
@MyrKnofАй бұрын
I hope they make an x64-mini, or whatever they would call a reduced ISA. I mean, I'd rather see them just go RISC-V, but that's a pipe dream as long as their niche is healthy enough to make money.
@TheHangarHobbitАй бұрын
Uh huh... how well are Qualcomm laptops selling again? This is simply so they don't have another AVX-512 situation, where one side refuses to support an instruction set for ages and thus it rarely gets used. Since this group has all the major hardware companies like HP and software players like Adobe and Oracle, they can say "this is what we are working on, and you guys need to support it," and both AMD and Intel will have the same instruction sets, so companies don't have to guess who will have what.
@dokgo7822Ай бұрын
AVX512 works great on my 12900k , what’s the problem?
@lordkekz4Ай бұрын
I think AMD and Intel will have trouble catching up to NVIDIA in terms of AI software support. But there is a lot of demand for more affordable AI options due to NVIDIA's monopoly pricing. For AI inference, the AMD/Intel NPUs are good enough as far as I can tell; they just need to all get behind ROCm and fund it like their lives depend on it. Applications that support AMD run well on AMD. CUDA support is literally the only major justification to buy NVIDIA at this point, but the right answer for potential enterprise GPU customers should be to rewrite everything on ROCm. Even if it took 10x the initial investment, ROCm is the better platform to buy into, and it will pay off in the long run. Right now the entire CPU/APU/GPU market is not so strong because of high prices and a slow economy. But CPU/APU demand will remain stable, so AMD/Intel have that to fall back on. They may not benefit as much from the AI training craze, but it's not like either company is going bankrupt because of some missed data-center GPU sales.
@hamesparde9888Ай бұрын
The reason the NPUs aren't being used much is that the big LLMs from Alphabet and OpenAI aren't being run locally. Part of the reason is obviously that the current generation of hardware either isn't powerful enough or uses too much power, but the main reason is that they want as much control as possible, so they don't want you to run them locally. Just because NPUs aren't used very much now doesn't mean that will continue to be the case. Just look at the SIMD instructions for x86: they weren't used that much at first and now they are used a lot. You can use Meta's Llama 3.1 models locally and they are really good, but at the moment most people don't have a machine powerful enough to run the biggest ones (and they are probably going to keep getting bigger at a high rate for quite a while).
@piotrd.485023 күн бұрын
Nope. Shitty software support. Try running the default Python/TensorFlow distribution: even on laptop CPUs it will print a warning that the CPU supports extensions that could offer several times the acceleration. In some applications, Intel MKL and a fully optimized TensorFlow offer 30x and MORE.
@dazextralargeАй бұрын
ARM is great for low power, but we already see its limitations with Apple Silicon having nowhere to go unless they increase the power envelope, which basically defeats the purpose of ARM. If x86 can get its low-power operation closer to ARM's, it can stay relevant in mobile/low-power spaces.
@tibbydudezaАй бұрын
Intel killed off the RISC workstation market because nobody else could invest the money to make modern desktop or server CPUs while x86 performance kept getting better. Today PowerPC survives only in IBM's mainframes and servers. Snapdragon X bombed, so x86 will be ruling the general-purpose computing market for decades to come.
@alexcastas8405Ай бұрын
Love this channel... Random thought: what are IBM and some of the other formerly prominent players doing these days?