AMD Takes AI-M at Nvidia with MI300X, MI300A and MI300C

39,008 views

AdoredTV

1 year ago

♥ Check out adoredtv.com for more tech!
♥ Subscribe To AdoredTV - bit.ly/1J7020P
► Support AdoredTV through Patreon / adoredtv ◄
Buy Games on the Humble Store! -
►www.humblebundle.com/store?pa... ◄
Bitcoin Address - 1HuL9vN6Sgk4LqAS1AS6GexJoKNgoXFLEX
Ethereum Address - 0xB3535135b69EeE166fEc5021De725502911D9fd2

Comments: 282
@benjaminoechsli1941 (1 year ago)
You remember AMD's "The Future is Fusion" tagline from waaaay back in 2008? Looks like this is as close as we'll get to that. And I love it.
@scroopynooperz9051 (1 year ago)
When the AIs take over and become gods, we can ask them to create new GPUs for us with enough VRAM, and make them affordable to boot xD
@jtjones4727 (1 year ago)
Yes. Maybe if we are good and behave, the machines will reward us. They'll need some of us humans around. To keep the machines clean and cool, to do maintenance and hardware repairs, and other menial tasks below the machines.
@zaadworks (1 year ago)
We will become children of the Omnissiah
@WhiteSkyMage (1 year ago)
🫣 Yeah... I wouldn't hold my breath for it. They will just find that people are resource hogs and Skynet will become the norm... A dystopia nobody would wanna live in...
@duladrop4252 (1 year ago)
@@jtjones4727 They can make their own machines to do those tasks as well; practically, they will be able to do it on their own... and they can thrive as well: when one is broken they'd just replace it with parts and it will continue to run. While we humans, when our organs stop functioning, die.
@thegreathadoken6808 (1 year ago)
I wouldn't mind one of those MI300Xes. Any chance I can just have that, but for around £200?
@sandwich2473 (1 year ago)
Really nice pun in the title :3
@greatjob7113 (1 year ago)
shut up please
@seasesh4073 (1 year ago)
Yooo wth Jim, same here, I developed hay fever last year at 22 years old for the first time. Apparently this is quite common and I'm told it can disappear with time and go back to normal. Just don't forget your medicine lol.
@HighYield (1 year ago)
Always love to watch your analysis, you never fail to focus on areas I only glance over. Super interesting stuff! I'm wondering if we will see a MI300 version with dedicated Xilinx AI accelerator thrown into the mix.
@willgart1 (1 year ago)
I have the same question... for now Xilinx is still not there. I'm curious to know how AMD will use/implement it.
@justacomment1657 (1 year ago)
@@willgart1 They have DPUs and corresponding datacenter switches branded AMD already...
@jimatperfromix2759 (1 year ago)
@@willgart1 AMD seemingly doesn't have a clue how to market their Xilinx XDNA AI chiplets. They seem to think they need to have some killer laptop application just waiting in the wings such that they can tell customers "you can do that (killer) thing (whatever the heck it turns out to be) with this new laptop" before they even put CPUs with those XDNAs out there to the public. Stop being stupid, AMD! Just put it out there, and your inventive customers will figure out what the heck to do with it! Example: AMD says the Xilinx XDNA (chiplet, maybe, or maybe it's in an integrated APU) will be in "some" of its 7040 series Phoenix line of APUs. However, mum's the word on whether it will be in its two top-end chips in that series, namely the 7940HS and the 7840HS. Meanwhile, some vague rumor seems to have been let slip by AMD that the Xilinx XDNA technology *will* be in their 7840U series of cheaper, lower-powered chips. That's essentially also the same/similar chip as one of the Z1 or Z1 Extreme chips that will go into the Ally (and it's also unclear whether the Z1 series will have Xilinx XDNA). Well, if that's the case, then WTF, you mean I have to downgrade my top-of-the-line 7940HS APU to a cheaper 7840U APU in order to get the Xilinx XDNA features, which is the main reason I want a Phoenix series processor in the first place? Get your S together AMD. You're sitting on a Xilinx goldmine and you're embarrassed to market it?
@AwesomeBlackDude (1 year ago)
@@jimatperfromix2759 Whew, bro! You could transform that into a captivating daytime tech drama soap opera show. 🤣😳
@MsDuketown (11 months ago)
@@jimatperfromix2759 Which software do you wanna run? ROCm, OpenCL and Vulkan probably, since you can't use CUDA. Then you'll need some driver compatibility.
@fumo1000 (1 year ago)
Some of the figures you mentioned rang bells for me. They were from early leaks for RDNA3 which never turned out to be consumer products. Now it all makes sense... The leaks were actually for products destined for AI.
@shaunlunney7551 (1 year ago)
AMD has a plan and will scale to meet the mark. Very bright future ahead! Thanks for the video!
@qlum (1 year ago)
On the matter of the software ecosystem: this is less important for hyperscalers, who often have the resources to write their own targeted implementations. This is also why you see AMD's cards more in these systems than in the rest of the professional segment.
@adela5561 (1 year ago)
Completely agree. These guys have huge resources and they will help AMD to lower their dependency on Nvidia. I would love to see some benchmarks on June 13th. I bet Microsoft will be on stage.
@Pushing_Pixels (11 months ago)
I doubt any hyperscalers are using anything but bespoke software, in which case CUDA and Nvidia's software stack are irrelevant. Smaller start-ups, on the other hand, will be drawn to the ease of implementation NV offers.
@Pixel_FX (1 year ago)
I think the experience they got from designing APUs for consoles also helped to make this.
@doggSMK (1 year ago)
Yes indeed. This is the result of "The Future is Fusion", started when the FX CPUs came out ;)
@Pixel_FX (1 year ago)
@@doggSMK Damn, didn't know this goes back that long.
@AmurTiger (1 year ago)
@@Pixel_FX It was part of the basis for the purchase of ATI in the first place. I don't think they necessarily anticipated the dominant use case, but the issues around the power spent shifting bits around between CPU, GPU and memory (which is a big source of inefficiency) have been known for ages and are a fairly obvious problem to tackle. You can also see the memory roots for this in HBM as well, which hasn't taken GPUs or PCs by storm but seems to be standard for these AI cards. Of course the fact that it's taken so long suggests pretty strongly that while it's an obvious problem to try and solve, actually solving it is substantially harder, especially when you have to keep the company running and have competitive GPU and CPU designs to slot in. One of these AI cards with a Bulldozer core and GCN would be a far less appealing product. It's actually a lot of balls to juggle when you think about it, especially given how young AMD is at being anything like a 'large' company; AMD still has fewer employees than Nvidia and far, far fewer than Intel (though Intel also works on the manufacturing end, so it's a bit apples/oranges). Goes a ways to explaining some of their missteps in the consumer GPU market - it's just not a ball they're all that focused on.
@DeltaSierra426 (1 year ago)
Correct. It's been a game-changer for AMD ever since they acquired ATI. We're already seeing AMD bear fruit from the Xilinx acquisition for AI, as the upcoming AMD Phoenix APUs have dedicated AI silicon ("XDNA"). This just gets their foot in the door -- not anything close to what's to come next. Guessing MI300 has some benefits as well, even if we haven't seen it extensively... AMD is one to go heavy on marketing, so we'll know come circa June 14th!
@Pushing_Pixels (11 months ago)
The next consoles will probably be chiplets based too. Should be interesting.
@kaisersolo76 (1 year ago)
Great stuff Jim.
@scroopynooperz9051 (1 year ago)
Yo Jim! We missed you buddy. Was waiting for you to weigh in on the current tech malady xD What are your thoughts on the Nvidia, AMD "midrange" RTX 4060ti and 7600 releases?
@adoredtv (1 year ago)
"What a piece of shit" was my comment on Twitter for the 4060Ti. I couldn't even muster enough care about the 7600 to say anything about it.
@scroopynooperz9051 (1 year ago)
@@adoredtv 😂 those sentiments are shared by everyone it seems
@singular9 (1 year ago)
@@adoredtv If the 7600 could have done what Polaris did, and launched first, before Nvidia, and smashed price targets and expectations, maybe someone would care. Let's say... 60 FPS average at 1440p without any upscaling, 10GB of VRAM, and a price tag of $249.99 (even at $269 it would be fine). I haven't used a 1080p monitor since like 2017.
@nathangamble125 (1 year ago)
@@singular9 "10GB of vram" Not possible on a 128-bit bus.
@singular9 (1 year ago)
@@nathangamble125 can be increased
@dudao4163 (1 year ago)
What's mind-boggling is that besides the MI300, AMD has the Xilinx AI accelerators, which are fundamentally better at running inference than a GPU. AI is so vast that the public needs more time to figure out what is best.
@benjaminoechsli1941 (1 year ago)
Yeah, their acquisition of Xilinx is going to give major tailwinds in this area. 👌
@OrjonZ (1 year ago)
MI400 will probably be more integrated with Xilinx's AI.
@rezaramadea7574 (1 year ago)
@@OrjonZ The rumor says they're going to integrate their network interconnect, the XSwitch.
@Tech2C (4 months ago)
Hello Jim, we (the internet) miss your analysis of all things CPU and GPU. Your content is highly entertaining, with past videos of legendary status.
@Xearin (1 year ago)
Great vid once again Jim. Hope the hay fever doesn't wreak too much havoc on you.
@cracklingice (1 year ago)
SoCs need not be APUs, but APUs are SoCs. System on Chip is a very broad definition that was created to differentiate from the old days of having a separate northbridge, as the SoC integrates the northbridge. That being said - I would say that if the MI300A cannot output graphics, then yes, SoC might fit more than APU, but technically the MI300A traces straight back to the original AMD APU vision. It was always about compute and not about gaming.
@ArtisChronicles (1 year ago)
Yeah, gaming could just end up being a tertiary thing with similar products. This is when things are getting really interesting and this is the type of future I was seeing as well.
@defeqel6537 (1 year ago)
Chiplets really come into play here, both for increasing supply and for AMD's ability to match customer requirements with different configs.
@PretentiousStuff (1 year ago)
Whenever I'm baked out of my mind I come back to this channel to listen to his godsend accent. I always picture Tommy from the movie Snitch, I once almost popped a rib from laughing
@adela5561 (1 year ago)
Great video as usual. I still remember your first video on chiplets and Zen. It got me interested in AMD at the time. Nvidia will remain the leader, but AMD has a high probability of getting some of the huge AI pie. June 13th should be an amazing show.
@leflavius_nl5370 (1 year ago)
AMD must be learning a lot about packaging by having moved to chiplets early, looking at this monstrosity of a product. It's enormous AND heterogeneous!
@cubertmiso (1 year ago)
AMD heterogeneous!
@gustavb3673 (1 year ago)
@AdoredTV Thanks for the video on the exciting MI300. At 2:16 you say Zen 5 when it should have been Zen 4.
@cosmic_gate476 (5 months ago)
We miss you Jim
@Maxxilopez92 (1 year ago)
Dear AdoredTV, Thanks again for this. I'm still invested in AMD from the zen days. I feel like I missed nvidia with AI, but I think AMD is going to be a good second supplier for AI. Intel however lost all battles....
@adoredtv (1 year ago)
Pretty much how I feel as well. ;)
@Maxxilopez92 (1 year ago)
@@adoredtv Would appreciate a video on the faltering of Intel, because you have done great jobs on those!
@YudaHnK (1 year ago)
After seeing this, just sold my btc holdings and bought AMD stock. I know I won’t be the only one.
@crunchynetto6979 (1 year ago)
@@YudaHnK like in the last 5 hours? you done that?
@TheTardis157 (1 year ago)
I own a good few AMD shares (sold off most though) and they have treated me well over the years. I couldn't buy NVidia just because of how much I despise the company. Intel is definitely the big loser currently. They're going to have to really turn things around in their fabs and with their upcoming tech to stand a chance at being near the forefront.
@gwhitesa (1 year ago)
Thanks Jim. You're the one channel that when I see a video I drop everything to watch.
@PsychRian (1 year ago)
Great video! - I always enjoy these.
@Antagon666 (1 year ago)
Will you do your take on the awesome, cheap and powerful mid-range GPUs that are coming out nowadays?
@solidreactor (1 year ago)
Always look forward to your interesting videos, not only for the information but the analysis :) Hope you get well soon if you aren't already. As you said, what Nvidia has going for them is their software stack and the CUDA framework. AMD is now getting traction with their equivalent, ROCm, which is getting better support, for example with PyTorch. However, ROCm only supports Linux, but rumors (and hints in the GitHub repository code and conversations) say that the upcoming ROCm 5.6 (which is in alpha now) might get support for Windows and also for the consumer 6000 and 7000 series cards. ROCm has its HIP interface to translate CUDA applications, which is MUCH needed if one wants to transfer over to AMD while keeping old CUDA applications running. My guess about the cooperation with Microsoft is that it's about having ROCm and other software support in Windows - a more open compute framework compared to CUDA that could break Nvidia's 90% grip on the AI market. Hardware-wise AMD is already there even with the MI200 series, and the MI300 looks to be even more interesting indeed! However, the most important factor one might want to keep an eye on (especially if you are an investor) is the software maturity of ROCm and its adoption, driven by that maturity and availability.
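To make the ROCm point concrete, here is a minimal sketch of what running PyTorch on a ROCm build looks like, assuming such a build is installed (exact package versions vary by release, so treat this as illustrative):

# Minimal sketch: check for a ROCm-backed GPU in PyTorch and run a matmul on it.
import torch

print("HIP/ROCm version:", getattr(torch.version, "hip", None))  # None on CUDA/CPU builds
# On ROCm builds, AMD GPUs are still exposed through the "cuda" device namespace.
device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(4096, 4096, dtype=torch.float16, device=device)
b = torch.randn(4096, 4096, dtype=torch.float16, device=device)
c = a @ b  # dispatched to rocBLAS-backed kernels on ROCm, cuBLAS on CUDA
print(device, c.shape, c.dtype)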
@paul1979uk2000 (1 year ago)
A.I. is clearly becoming a big thing now and is likely going to become a bigger thing going forward as it gets more useful. I've already heard of a few people being pushed into buying Nvidia GPUs, simply because they work with A.I. at a local level. AMD and Nvidia need to sort that out and fast, because at the moment, if you want to run A.I. on your own PC, you only have two choices: an Nvidia GPU or the CPU. The CPU tends to be a lot slower, but I've noticed over the last few months that it's been getting a lot faster, so much so that even a 30-billion-parameter model is usable on it. 60 billion kind of is, but it's a bit on the slow side - it used around 48GB of system memory and took around 1 minute to respond on my brother's 6-core CPU, but he upgraded recently so it's probably faster now. I would prefer an open standard when it comes to A.I. being run on our PCs, one that works on any GPU and has the CPU as a fallback option. I'm surprised both AMD and Intel let Nvidia get so far ahead with CUDA, and now with A.I. becoming a big thing, they need to respond, and fast, especially with how quickly open-source A.I. is developing.
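Those memory numbers line up with simple back-of-the-envelope math; a rough sketch (the parameter counts and the 20% runtime overhead are illustrative assumptions, not measurements):

# Rough sketch: LLM memory footprint at different weight precisions.
def footprint_gib(params_billions, bits_per_weight, overhead=1.2):
    return params_billions * 1e9 * (bits_per_weight / 8) * overhead / 2**30

for params in (7, 13, 30, 65):
    for bits in (16, 8, 4):
        print(f"{params:>3}B @ {bits:>2}-bit ~ {footprint_gib(params, bits):6.1f} GiB")
# A 30B model needs ~67 GiB at 16-bit but only ~17 GiB at 4-bit, which is why
# quantized weights are what make CPU and single-GPU inference practical.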
@shinkiro69420 (1 year ago)
ROCm documentation is garbage. They haven't even documented which GPUs are supported by it.
@solidreactor (11 months ago)
Sharing an update for whoever is interested: 5.6 brought neither Windows support nor support for the 7000 series RDNA3 cards. The latest conversations (and somewhat of a confirmation?) seem to hint at 7900 XTX support for fall this year (ROCm 5.7?), however nothing has been mentioned about Windows support yet, unfortunately.
@Chaka421 (1 year ago)
The biggest thing since electricity is inverting the properties of digital information and truly securing data in the cyber domain using power from the physical domain.
@alpha007org (1 year ago)
There is a big demand for AI accelerators. And even though Nvidia has a big software stack advantage, if you can't get enough Nvidia accelerators, you'll have to start using different products, like Tenstorrent (Jim Keller's project) and others. Maybe AMD can get a foot in the door, and with the help of the new booming AI industry, they (the AI industry and startups) will have to start using **whatever** is on the market. So Nvidia's software advantage isn't such a big selling point here. AMD just has to execute properly, bring decent gen-on-gen improvements, and collaborate with clients to bring the specialized products they need.
@HWMonster (1 year ago)
That all sounds fantastic, but unfortunately it's no use at all if the performance doesn't make it onto the road in practice. Software support and drivers are often more important than raw hardware performance in the professional field. And Jensen has made very good connections in this area for years.
@austzombie (1 year ago)
Hmm, if only AMD could get the world's fastest supercomputer this year (El Capitan) to use these chips... I'm sure then that people may put some effort into working on the software.
@annieshedden1245 (1 year ago)
Except that most of the market is already ported. Do TensorFlow and you have most of AI...
@rapamune (1 year ago)
God damn it feels good to have you back doing analysis for us once more
@taith2 (1 year ago)
AMD might get ahead in hardware, but as mentioned, Nvidia's software stack might be what results in more purchases. Just hoping AMD jumps out of nowhere with a system-on-a-chip desktop offering, with a powerful integrated GPU, CPU and memory. Already considering getting a ROG Ally or Valve's next console as my desktop replacement for casual gaming and daily internet browsing.
@frozenflame8319 (6 months ago)
Soooo, any new vids? What happened to this channel? We miss you guys.
@InfinitePCGaming (1 year ago)
Glad you're feeling better Jim
@shieldtablet942 (1 year ago)
MI250X is more energy efficient than A100 by a mile, at least when it comes to HPC loads. H100 does about 5% better than MI250X but doesn't seem to be shipping in volume yet. And not just on FP64, mind you: the mixed-precision Top500 results for Frontier/MI250X are excellent. Unfortunately, without access to hardware, it is hard to know if the Nvidia demand is because of GPU capabilities or deficiencies that still exist on the AMD software side. I hear ROCm has much improved since I last tried it.
@TheHighborn (1 year ago)
"Al right guys how is it going" - very nice to hear these words. Also, I hope you're better.
@pvalpha (1 year ago)
They really do want to kick consumer stuff to the curb, don't they? Mining first. Now AI. They really want to sell stuff by the pallet to a single buyer.
@kutark (1 year ago)
I mean, do you blame them? It sucks, but it's just reality. If you're a business you're there to make money, and if the new hotness is X and you're still trying to sell Y, you lose. The issue is you still need to do enough to keep Y happy in case X slows down or bursts a bubble. Nvidia, I think, is screwing this part up. They're pissing in the faces of their gaming consumers (who sadly will still drink the piss for the most part) and it may come back to bite them in the ass in 2-4 years if the AI surge doesn't hold.
@kainhall (1 year ago)
Jesus... I remember when 1 billion transistors in a GPU was HUGE marketing material... now we're at 120+ billion... just insane.
@YouArentHereYet (1 year ago)
Another video well worth the wait.
@davidmawston4830 (6 months ago)
Would be great to hear more comments on the MI300, maybe after the official launch next week.
@ultraderek (1 year ago)
All that FP16, INT4 and INT8 stuff is fine and dandy, but it's a pain in the butt to program efficiently in CUDA - unless they've updated the compilers since I last programmed with them.
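For what it's worth, the higher-level frameworks hide a lot of that pain these days; a minimal sketch of post-training dynamic INT8 quantization in PyTorch (the toy model and layer sizes are arbitrary placeholders):

# Minimal sketch: post-training dynamic INT8 quantization in PyTorch.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).eval()
# Linear weights are stored as int8; activations are quantized on the fly at runtime.
qmodel = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
x = torch.randn(1, 512)
print(model(x).shape, qmodel(x).shape)  # same interface, smaller/faster int8 matmuls on CPU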
@DavidYordan777 (1 year ago)
thanks for your insight.
@ukaszChorazy (1 year ago)
Happy birthday Jim!
@senri- (1 year ago)
Hey Jim, great video. It's great AMD can stay somewhat toe to toe in hardware, but I don't know how much it'll mean if they can't keep up in software. Something I've noticed on Twitter and from taking a look at papers submitted to different AI journals and presented at AI conferences is that there are always researchers from Nvidia, much less so from AMD. I think having people that not only work on hardware/drivers but also researchers that come up with algorithms allows Nvidia to be much better prepared for what kind of algorithms they need to tailor future products to, vs what may be more of a reactionary approach from AMD. I hope as AMD grows they can allocate more resources to this.
@PaulSpades (1 year ago)
Software research never uses optimized hardware or software, unless it's research into optimization. AMD has heterogeneous compute (fast shared memory on package with GPU and CPU) and is finally taking advantage of it at the API level (well, other than the consoles). Nvidia just can't do it - not until they get their ARM or RISC-V cores into an HPC product, and there will be software pains there. I think it's Nvidia that is behind on software. CUDA is 2010s technology based on shaders; we're in a whole new world of SIMD compute, and AMD's better competitors will be Tenstorrent (unsurprisingly, Jim Keller and Raja Koduri's new gig) and other hardware startups. Until somebody brings compute-in-memory to market, this is the hardware battle. And the software platforms are either ROCm or whatever grows out of the RISC-V instruction set.
@dennismoves6576 (1 year ago)
Hey man, thanks for another great vid. Do you have an estimate on how large the MI300A would be if it was monolithic?
@BlackJesus8463 (1 year ago)
Nvidia wants to sell AI services as a subscription, and I don't believe any corporation willing to spend billions would think that a wise investment long term.
@adi6293 (1 year ago)
Finally a new video!! 😁😁
@conza1989 (1 year ago)
Jim, would love a video looking at the 40 series and 7000 series versus historical precedent from the company. I don't think anyone could do a better job of it, honestly.
@elon6131 (1 year ago)
I'm sorry to hear about your health, Jim. I don't really have much to say this time, datacenter isn't my forte, but I couldn't help noticing you didn't mention Nvidia's own complete SoC solution in the form of the "Grace Hopper Superchip". I don't really know how well it compares as an overall package, but it does exist.
@wizpig64 (1 year ago)
It's so exciting just to see an AMD slide showing a single APU chip with a much bigger GPU die area than CPU die area.
@andersjjensen (1 year ago)
With aperture sizes going down for the post-FinFET era AMD will have a leg up with their hands-on experience with chiplets for sure.
@AgtX999 (1 year ago)
AMDs top secret new sku codenamed - Purple burglar alarm.
@aitakaitov7312 (1 year ago)
The largest problem for me is that AMD has nothing like CUDA. They have ROCm with no Windows support. I hoped the last release would add Windows support, but no dice. And since I need to run models locally on my Windows machine, I have no option but Nvidia with their overpriced cards with less VRAM. An RTX 4070 with 12 gigs? That sucks.
@ariffinsetya (1 year ago)
It's corporate/research first. The money is there, they will buy thousands easily, and they also have the knowledge to build their own tooling and code. It will take some time before ROCm trickles down to consumers. And CUDA was first released in 2007 and only considered an industry standard around 2014-2016; it takes time to develop a whole new framework for AMD. With how crazy AI development is going, I don't think consumer hardware will go far. We now have 30B LLMs being released constantly, and they will keep getting bigger - who knows whether even a 3090/4090 is sufficient to run inference on those, with how they keep growing, unless Nvidia/AMD release cards specifically for home users, which is unlikely for now. And you use Windows, which most researchers/corporations who will build their tools/apps on Radeon Instinct don't use...
@defeqel6537 (1 year ago)
@@ariffinsetya This. ROCm gets more and more developed for local use, but AMD is honing it on supercomputers and slowly moving to smaller systems.
@ariffinsetya (1 year ago)
@@defeqel6537 Yep, using a Linux-based system is the way to go for now if you want ROCm... Really can't wait for RDNA3 support, since a 20GB VRAM 7900 XT will be so much better for my wallet than a 4090 🤣
@TheBackyardChemist (1 year ago)
The MI300C CPU will be an absolute beast for HPC
@PanduPoluan (1 year ago)
The system architecture of MI300 reminds me of 1st generation EPYC, where 4 dies are integrated using Infinity Fabric. That might be AMD's secret weapon here, as AMD certainly had a LOT of experience with that!
@Pushing_Pixels (11 months ago)
While the CDNA 3 chiplets will be missing many of the media and IO related IP blocks from RDNA, they will no doubt have AI hardware accelerators beyond just the compute units. The die size per CU will probably be about the same.
@andrasrudnai9386 (1 year ago)
I do wonder if AMD went a strange(ish) route and did something like 12-bit floating point, or even 8-bit floating point, to help with the 8x performance claim. Also, how many MI300 packages are we gonna see on a single PCIe card? Or just integrated into a motherboard? P.S.: 8-core chips for the MI300A, and not the 16-core "dense" chiplets?
@sdwvit (1 year ago)
if only you posted it a day earlier :D +20% on nvidia stock today
@Repsikka (3 months ago)
Where have you been Jim? We miss you and your awesome accent!
@geekinasuit8333 (1 year ago)
The MI300 series may fundamentally change the definition of what a server CPU is like. Up until now, server CPUs have been effectively identical to desktop CPUs, with only sometimes more cores and improved IO along with a few other minor adjustments (e.g. usually lower clock speeds). With the MI300C, it may be all CPUs inside, but it's nothing like EPYC or Xeon. Additionally, the all-GPU version and the variants with a combination of CPUs and GPUs really change the landscape. Some versions may have specialized accelerators inside, made for large enough customers. Intel is left out of the picture, and Nvidia's Grace looks like an ARM version of EPYC or Xeon, which is nothing new. I do not expect the MI300C to replace EPYC or Xeon, but it certainly changes the picture: some applications that used to be all EPYC or Xeon will instead be all MI300C (or variants with GPUs).
@thoreberlin (1 year ago)
I'm wondering if we will see 3 bit or even 2 bit capable AI processors.
@jimatperfromix2759 (1 year ago)
Open-source researchers working on large language models (mostly descending from the leaked Meta LLaMA model - I guess Meta can't fire 10,000 workers without at least one seeking revenge) are already doing novel research using new LLMs that employ 2-bit precision for the parameters. In that case, each of the four values represents some segment of a normal probability distribution rather than a specific numeric weight (I have no more details than that at this point). And these models are currently run on the CPU, since no GPU has that capability yet. This is the bleeding edge of LLM research.
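A toy illustration of that idea - picking four codebook levels from the weight distribution itself rather than from a uniform grid (a simplified sketch, not any specific paper's scheme):

# Toy sketch of 2-bit quantile quantization: 4 levels taken from the weights' own distribution.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.02, size=10_000).astype(np.float32)   # stand-in layer weights
levels = np.quantile(w, [0.125, 0.375, 0.625, 0.875])        # centers of the four quartiles
idx = np.abs(w[:, None] - levels[None, :]).argmin(axis=1)    # 2-bit code per weight (0..3)
w_hat = levels[idx]                                          # dequantized weights
print("levels:", levels, "mean abs error:", np.abs(w - w_hat).mean())
# Four of these 2-bit codes can be packed into a single byte for true 2-bit storage.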
@VoldoronGaming (1 year ago)
Hey Jim!
@HiggsField1337 (1 year ago)
IMO, this won't be enough to touch the palace that Nvidia has made in the datacentre market with their GPUs. Nvidia has full vertical control of their hardware and, most importantly, their software stack. From researchers at universities to government facilities, they'll stick to a stack that is more mature and more widely used than AMD's ROCm platform.
@ranchuhead4547 (1 year ago)
We need another video with AMD CDNA insight after the AMD Data Center and AI event.
@truefan1367 (1 year ago)
The future looks bright until the light shines on the wall in front.
@brendanmeyler1641 (11 months ago)
If you know about HPC, it's clear that the Fujitsu A64FX walked so that MI300 could run. Go look at the supercomputer Fugaku and you'll see many parallels between it and MI300. Don't be surprised if the MI300 system of the future looks very similar to Fugaku, only better.
@Matlockization (1 year ago)
One thing about AMD is that they have been keeping their APUs going since the stone age; in other words, they are very well developed.
@paxdriver (1 year ago)
Why not hard-wire a path of traces to draw 2 8-bit ints from a single 16-bit int to run 2 parallel processes from the same cache? How hard would it be to trace a little parser into a series of int8 operations to accelerate 16-bit or 32-bit floats or whatever? It's gotta be more real-estate-efficient to have an accelerator parse a double into 2 longs than to have dedicated packages for different varieties of precision compute. That kind of optimisation could be handled in the compiler if the parsing were accelerated within the compute die.
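Packing several narrow lanes into one wider word is an old trick on the software side too (SWAR, "SIMD within a register"); here is a small sketch of two independent 8-bit additions carried in a single 16-bit word, purely as an illustration - real GPUs do this with dedicated packed-math datapaths:

# SWAR sketch: two independent 8-bit adds carried in one 16-bit word.
def pack(hi, lo):
    return ((hi & 0xFF) << 8) | (lo & 0xFF)

def add_packed_u8x2(x, y):
    # Add the low 7 bits of each lane, then patch in each lane's top bit with XOR,
    # so a carry out of the low byte never leaks into the high byte's lane.
    low = (x & 0x7F7F) + (y & 0x7F7F)
    return (low ^ ((x ^ y) & 0x8080)) & 0xFFFF

a, b = pack(200, 17), pack(40, 250)
s = add_packed_u8x2(a, b)
print((s >> 8) & 0xFF, s & 0xFF)  # 240 and 11, i.e. 200+40 and (17+250) mod 256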
@DeltaSierra426 (1 year ago)
"AI is the next big thing." True, but I don't believe it will be "better than the internet." That's a wildly bold statement, showing how much everyone has grown accustomed to the internet and can't even fathom where we'd be today if it had never come about. AI is overhyped right now from tech and social media running story after story on all the weird things that have happened with ChatGPT and other LLMs. The orgs behind these took a bigger step forward than they released, having to pause for a breather for the smoke to clear and correct, putting more emphasis on training that results in more responsible behavior from these AI instances. The hi-tech world is always wanting to pump up its adrenaline on the next big thing, and this one is sensational to so many; I'd argue that there's more emotional investment in AI than there are scientific and technical facts.
@adoredtv (1 year ago)
You just described the internet hype in the late 90's.
@cubertmiso (1 year ago)
Now all this AI hype is just stupid algos parsing things that are scraped from the open internet. Real AGI will be the change that's bigger than the internet. It's a matter of time now, because no superpower wants to be second.
@kingkrrrraaaaaaaaaaaaaaaaa4527 (1 year ago)
I wonder if Microsoft and the others are "investing" in AMD to preemptively bolster competition in the A.I. and data centre space, so that way they don't get shafted by Nvidia in the future.
@TheTardis157 (1 year ago)
Probably a case of the enemy of your enemy is your friend. More competition will drive prices down; at least that was true until the crypto boom recently... Plus AMD has some interesting resources they got from their Xilinx acquisition in regard to AI and compute capabilities with decent efficiency. I just hope that the tech they pioneer in the enterprise marketplace can get to us normal users sooner rather than later.
@Wh0am131 (1 year ago)
I hope at some point these really powerful GPUs will make it to our consumer cards. But I know realistically it probably won't happen, since we are not where the money is.
@D1craigRob (8 months ago)
I'd advise cleaning your sinuses out if you haven't had hay fever before. NeilMed Sinus Rinse is the way to go. I started using it earlier in the year and it has reduced my sinus problems, including hay fever, a lot.
@Reyfox1 (1 year ago)
I didn't get allergies (spring/summer/fall) and pet allergies (cat) until I was your age. Never had them before then... I feel for you....
@johnhughes9766 (1 year ago)
Did you see the leaks on 8000 series AMD GPUs being dual-die with V-Cache?
@stevedavis3828 (1 year ago)
Would the yields for the 100mm chiplets not be significantly greater than for the 'Behemoth' 40mm chip? Maybe north of 95%.
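For a rough sense of why small dies yield so much better, the classic Poisson defect model puts yield at roughly e^(-area x defect density); a quick sketch with an assumed defect density (the 0.1 defects/cm^2 figure is purely illustrative, not a TSMC number):

# Rough sketch: Poisson die-yield model, yield = exp(-area * defect_density).
from math import exp

def poisson_yield(area_mm2, d0_per_cm2=0.1):   # d0 is an assumed defect density
    return exp(-(area_mm2 / 100.0) * d0_per_cm2)

for area in (100, 150, 400, 750):
    print(f"{area:>4} mm^2 die -> ~{poisson_yield(area) * 100:.0f}% yield")
# ~90% for a 100 mm^2 chiplet vs ~47% for a 750 mm^2 monolithic die at the same
# defect density, before even counting how many more small dies fit per wafer.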
@MrHav1k (1 year ago)
I'd love to see these GPUs, APUs, accelerators, etc. all benchmarked against one another. Intel is also in the space now with their datacenter GPUs. Would be great to see all of these compared, but then again, if you're buying these you want them to train models, not run benchmarks for geeks on YouTube lol.
@ezg8448 (1 year ago)
Let the Skynet countdown begin!
@deathstar7042 (1 year ago)
never clicked on a video faster. thanks jim :)
@unclerubo (1 year ago)
The thing with AI, and I must be alone in this, but I don't want to rely on sending my information to someone's cloud. It will be for me when I can afford something and run it locally, even if it takes me way longer than using an online service. This has the potential to create the biggest monopolies in history.
@jimatperfromix2759 (1 year ago)
The current state-of-the-art research in large language models is actually being done now both on souped-up personal laptops with the best (laptop) GPU available and on personal desktops sporting an Nvidia 4090. See the leaked Google memo ("we have no moat") about how open source is eating the lunch of Google, OpenAI and Microsoft alike.
@Germanas1985 (1 year ago)
Jim 🎉🎉🎉 I see an AdoredTV video, I know I will love it.
@Isaax (5 months ago)
I miss you, Jim
@wawaweewa9159 (1 year ago)
Chiplets also allow for better mix and match, unlike hopper
@apefu (1 year ago)
I need a bigger budget.
@HellsPerfectSpawn (1 year ago)
A word of caution though. AI model training does need serious compute but once trained it doesn't need much to run at all. What you call an AI revolution could very well just be the next cryptocurrency boom hype for these GPU manufacturers.
@Lee-pf6od (1 year ago)
This is not true; inference is still expensive, which is why Midjourney only offers a limited amount of fast GPU time. GPT-4 also continues to have query limits last I heard; it's resource-intensive.
@Freddie1980 (1 year ago)
But not bigger than sliced bread though? That would be quite a feat!
@Johan-rm6ec (1 year ago)
Don’t care about the video, i listen for his accent.
@doggSMK (1 year ago)
Holy 💩 😮 That CPU with 128GB HBM3 sounds crazy... And also the other stuff too. And they will charge a lot for the MI300A, because Jensen from nGreedia said that prices will go up (again) 😂
@RyleyStorm (1 year ago)
Im here fammo cant get me gone
@lolmao500 (1 year ago)
About fucking time
@MealsBeast (1 year ago)
HE LIVES
@hyenadae2504 (1 year ago)
Been watching since 2016 and the Polaris hype saga, just rewatched some of the older speculation videos for nostalgia at how far things have come, and the "AMD Master Plan" has finally come true except for in gaming (so far). "AMD (could be) in complete control of gaming- VR with multiple graphics card is great. Separate dies, much lower latency and higher performance. Once we get to 10nm, AMD could have tiny multiple 100mm^2 GPUs, running as fast as 1 600mm^2, a lot more manufacturable, this is the real endgame. Not much money in Graphics, lots in CPU. Intel's got billions, and this is why Nvidia is trying to get out of gaming as fast as they can (Automotive, AI). Not even sure Intel could afford to buy Nvidia any longer"
@minus3dbintheteens60 (1 year ago)
Intel's share price is so low compared to team red and team green, and I for one am very interested in how well ARC will perform in the AI race, especially as it seems to me that it has been designed with a far greater focus on AI than AMD's recent designs from Xe's alpha Alchemist release.
@TheGuruStud (1 year ago)
Arc is dead (you mean Xe, but I don't think it's close to Arc). Intel is trying to produce that power-hungry POS monstrosity known as Ponte Crappio, I mean Vecchio.
@OrjonZ (1 year ago)
Jim, just to give you some point of reference, your prices for these chips are way off. The H100 is at least $1000.
@adoredtv (1 year ago)
What lol, H100 cheaper than an RTX 4080? I don't think so. ;)
@OrjonZ (1 year ago)
@@adoredtv I am talking about BOM. The silicon alone, without HBM, is $1000.
@adoredtv (1 year ago)
@@OrjonZ Can't see it, $500 sure but $1000?
@MyrKnof (1 year ago)
most replayed 0:05 XD
@whitehavencpu6813 (1 year ago)
Oritte guyz howz it guin?
@j.w.s.743 (1 year ago)
I hope they add AI capabilities to graphics cards!
@OrjonZ (1 year ago)
For sure they will. AI starts with students, and they mostly use CUDA and Nvidia. AMD will improve AI perf next gen for sure.
@nahimgudfam (1 year ago)
You should try bee stings for hay fever.
@trick0502 (1 year ago)
am i the only one who gives a thumbs up before watching a jim video?
@stephensinclair8127 (1 year ago)
Gold.
@leflavius_nl5370 (1 year ago)
Hey, fever!
@ShaneMcGrath. (1 year ago)
Haven't had hay fever yet at 48, but had a kidney stone. Also not pleasant. Middle age is a bitch!
@TheVanillatech (1 year ago)
First time I got hay fever I was 10 years old, and it actually made me pass out. I was under an oak tree looking up, eyes and nose went, then dizziness, and I woke up under my BMX unable to see! When I finally staggered home my Dad thought I'd been smoking. Luckily, a tiny antihistamine pill is enough to take away all symptoms for 24 hours. And as a rule, it's only a worry for 2 months of the year. Come to think of it, I haven't had any hay fever for the last 3 years...
@ShaneMcGrath. (1 year ago)
@@TheVanillatech Damn, is it really that bad? My old man gets it and I used to think ahh, it's just him sneezing a lot during certain times of the year.
@TheVanillatech (1 year ago)
@@ShaneMcGrath. Haha, it usually is just that - a bit of sneezing, itchy eyes and throat, runny nose. But every once in a while, it can go mental. Another time I was riding a Gas Gas (trials bike) in a forest when it happened. Hadn't had an episode in years, so had no medication on me, and went from normal to struck blind in less than a minute. Eyes were on fire, swelling up, streaming with water, couldn't see anything. Stood there shouting for almost half an hour before someone found me. Awkward! From that time on I always carried a pill or some drops.
@MarekNowakowski (1 year ago)
those wafer costs for the GPUs are irrelevant, but the wafer allocation might matter.
@Mopantsu (1 year ago)
Apple is rumored to have purchased 90% of TSMC's upcoming 3nm. Apple are probably already on the AI bandwagon.