In 6 months, when data centers start cycling these out, I will invest in upgrading my solar farm to a nuclear power plant so I can put these in my home lab.
@MatthewSmithx • 18 hours ago
Bruh, P100 HGX systems are still printing money. There is no secondary market for this stuff; it gets run until it cooks itself
@45mw24 • 9 hours ago
Most DCs have a limit of 16kW per cabinet, so there will be one of these per cabinet. HAH
@prometheus4130 • 8 hours ago
Totally spot on
@BrownieX001 • 1 day ago
This type of sponsored content is great. Glad some channels take the opportunity to showcase cool tech
@ServeTheHomeVideo • 1 day ago
Yea, the challenge is that shipping these things is expensive and time-consuming, and sometimes anomalies happen. We will be doing more of this, flying out to big AI servers, in the future. I need to get companies to cover travel; otherwise it would cost too much to do.
@yepyepyepyep4602 • 1 day ago
@@ServeTheHomeVideo or buy a t-shirt ;)
@Dionj615 • 1 day ago
Seems a bit outside my budget
@ServeTheHomeVideo • 1 day ago
A lot of companies are working on AI clusters right now.
@tomaszkarwik6357 • 1 day ago
Does anyone have a discount code?
@mikeparkie • 1 day ago
@@ServeTheHomeVideo check your channel name 😜
@michelangelodrawcars5778 • 1 day ago
Give it 6 years and it will sell on AliExpress for $2k
@conraadvandenberg • 1 day ago
I literally burst out laughing
@gmsipe • 1 day ago
What, no discount code?
@nulltrope • 17 hours ago
When I was a DC tech ~12 years ago, Supermicro servers were always my favorite to service. They never looked the flashiest, like the Dells or HPs, but they were no-nonsense and built like tanks. Glad to see they're still keeping with that mentality.
@ronjon7942 • 1 day ago
Man o man, I used to work on POWER3 AIX servers when those were so high end… eight single-core 750MHz CPUs, PCI-X. We had a customer that was dipping their toe into 1Gb fibre NICs, which was a BIG deal for servers. Later on they moved to 1Gb copper Cat 5 NICs. We had a bunch of techs and business partners over to see the NICs in operation; no one could believe any NIC could push 1Gb Ethernet over copper. I'm not THAT old, but wow, hardware changes so freakin' fast. Now this thing has eight 400Gb Ethernet links for GPU communication. Unreal.
@TheFullTimer • 1 day ago
Will Microcenter have them on the 30th? I could pick one up with the 5090 for the bundle discount. The more you buy, the more you save.
@sloth0jr • 1 day ago
18kW for a server is crazy (let alone 24kW for the fully populated chassis); for comparison, many data centers running more conventional servers budget about 15kW for an entire rack, including top-of-rack switches. Really changing the equation on data center engineering.
@HutsonKyle • 1 day ago
We used to measure datacenters in square feet. Now we measure in megawatts.
@mika2666 • 1 day ago
18kW is for redundancy :) ~10kW in actual use
@sloth0jr • 1 day ago
@@mika2666 Yeah, I was thinking for a unit like this I might just run non-redundant; it's not going to be connected to any live customer traffic, I wouldn't think. If you can afford one of these, you can afford two of these, and that's your redundancy.
@MaXwellFalstein • 1 day ago
A decade ago, 6kW was the largest power budget in data centre racks. Most data centres will not support this class of AI server. Four 24kW chassis is the most I have seen in one rack, resulting in a rack over 100kW with switches, etc.
@ServeTheHomeVideo • 1 day ago
That is from adding up the PSU ratings. The actual power consumption is usually 2-10kW.
@FlyingCIRCU175 • 1 day ago
DROP TEST DROP TEST DROP TEST DROP TEST
@ServeTheHomeVideo • 1 day ago
No thank you! These things are HEAVY
@zsmith8632 • 1 day ago
To see if it breaks the floor?
@danilatarasov8287 • 1 day ago
You need to call Linus for that
@theevilmuppet • 1 day ago
This video does not begin with, "THIS is Patrick from STH!" SAD PANDA!!! 😛
@ServeTheHomeVideo • 1 day ago
Ha!
@MyDysonSphereisBroken • 1 day ago
I know it's a pain for content creators to acknowledge sponsored content, but I'm sure glad you do.
@CyrilleGedoria • 1 day ago
How many Google Chrome tabs can I open with this?
@UCs6ktlulE5BEeb3vBBOu6DQ • 1 day ago
These are bound to end up in our mitts at some point. Considering we got P40s for $200 in 2020, these should be obsolete for data centers in 3-5 years.
@ServeTheHomeVideo • 1 day ago
I think it might be sooner than that. If you think 3-4 years out, the AI rack power density will be about 10x what these systems offer.
@gearboxworks • 1 day ago
Yeah, but how will you pay for the 1,800kWh?!?
@peterpain6625 • 1 day ago
@@ServeTheHomeVideo Pity they're almost obsolete once you get them into production ;)
@UCs6ktlulE5BEeb3vBBOu6DQ • 22 hours ago
@ I worked in a data center in 2024 and we installed 150k units of H100 lol
@UCs6ktlulE5BEeb3vBBOu6DQ • 22 hours ago
@ I used to pay $7,000 per month for electricity when I was an ETH miner. Today I have a bunch of servers and already pay for more kWh than that. You just gotta do stuff that pays for itself.
@sugarmaker67 • 1 day ago
but can it play Crysis?
@justincase9471 • 1 day ago
Can it run Windows 11? 🤔
@ServeTheHomeVideo • 1 day ago
It does not have an HDMI/DP output
@rpungello • 1 day ago
More importantly, can it run 8 copies of Crysis simultaneously?
@jfbeam • 1 day ago
Yes. It. Can. (about 1000 copies at once.)
@Chocobollz • 1 day ago
And it will probably cost (at least) 1000 copies of your wallet 😁
@spewp • 7 hours ago
That bright yellow shirt was definitely a choice for STH. You look like Charlie Brown.
@AI-xi4jk • 1 day ago
I think I missed the CPU info on this machine. We need a consumer-oriented version with 1-2 GPUs to fill the market niche between RTX and data center cards. What do you think, Patrick?
@ServeTheHomeVideo • 1 day ago
I actually think the DIGITS and other platforms are the way to go in that space. This one is 4th and 5th Gen Xeon but there is an AMD option.
@AI-xi4jk • 1 day ago
@ Thanks for the reply. I'm curious how DIGITS will perform, but since it's a small package with limited power I think it will be something like a Jetson. The advantage is that it has lots of memory, but that memory is unified. To be seen. I think there is still a niche, but maybe not big enough for NVIDIA to consider.
@torr136 • 1 day ago
Impressive. Thanks for the video.
@OrientalStories • 1 day ago
Here you go, that will be a million dollars
@ServeTheHomeVideo • 1 day ago
Less than half that!
@peterpain6625 • 1 day ago
@@ServeTheHomeVideo Basically a steal ;)
@stupiduser6646 • 1 day ago
I wish price ranges were included in some of your videos. How much is the boss going to spend on this?
@ServeTheHomeVideo • 1 day ago
When you install these, the operating costs are a big part.
@mo3k • 4 hours ago
It's one of those "if you have to ask, you probably can't afford it" kind of things, but if you really want to know: currently they only sell these pre-configured (but still customizable), priced a little north of $400k. The majority of that (around $280-300k) is the cost of the 8x H200s.
@lvutodeath • 1 day ago
Around half a million dollars for high specs. Perfect for a home lab AI setup
@ServeTheHomeVideo • 1 day ago
Of course, that is why we review smaller systems too.
@peterpain6625 • 1 day ago
@@ServeTheHomeVideo The "workgroup edition" of that behemoth would be interesting. Something in the $50-60k range.
@kennypcolin • 22 hours ago
Do you take bags of coins in mixed change? And some toys from the 80s in trade?
@PatrickHSB • 1 day ago
Great-looking server, love it. Would like to see EPYC instead of Xeon. Anyhow, thanks for the presentation.
@DaNiePred • 1 day ago
Na, AMD has rubbish cross-NUMA throughput.
@ServeTheHomeVideo • 1 day ago
They have an option for that too as mentioned. We only got to pull one server though.
@louwrentius • 1 day ago
Mixed feelings because this stuff is fueling a hype train for shareholder value by burning through ungodly amounts of energy 😢
@Gjarllarhorn1 • 1 day ago
but how does it sound?
@ServeTheHomeVideo • 1 day ago
Very loud
@IAMSolaara • 1 day ago
I really want to see the software that makes use of this hardware actually running, and to understand how it uses all of this…
@sw1257 • 5 hours ago
Can these machines be run in the EU, or are they too noisy to satisfy workplace regulations?
@ServeTheHomeVideo • 4 hours ago
I know what regulations you are referring to but I do not know the answer on this one.
@KK10155 • 1 day ago
What a powerhouse, and an awesome video. I do feel like challenging the number of power supplies provided here: H200 consumption is stated at 700W max, I'll generously add 1800W for the CPUs and the other cards, plus say 1000W for the fans, for a total of about 8.5kW. It seems 3 modules are enough to power this thing at full capacity, and adding one more for redundancy seems enough. I don't operate and manage servers though, so I wonder why so many would be used instead of just 4.
@ServeTheHomeVideo • 1 day ago
Your 1800W is quite a bit low. But you are correct, this is 4+2 redundancy standard, and 4+4 redundancy with the two additional PSUs. That kind of redundancy is used if an entire power side of the data center goes down.
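For anyone redoing the PSU math in this thread, a minimal sketch in Python; every per-component draw below is an illustrative assumption, not a vendor spec:

```python
import math

# Rough power budget for an 8x H200 system -- all per-component
# draws are assumptions for illustration, not published figures.
GPU_W, GPUS = 700, 8         # H200 SXM TDP
CPU_W, CPUS = 350, 2         # assumed high-bin Xeon TDP
NICS_SWITCHES_W = 600        # assumed: 8x 400GbE NICs, NVSwitch/PCIe switches
FANS_DRIVES_MISC_W = 1500    # assumed fans, NVMe drives, conversion losses

total_w = GPU_W * GPUS + CPU_W * CPUS + NICS_SWITCHES_W + FANS_DRIVES_MISC_W
print(f"estimated full load: {total_w / 1000:.1f} kW")  # ~8.4 kW

# With assumed 3 kW supplies, three could carry this estimate (which, per the
# reply above, runs low in practice); 4+2 or 4+4 buys margin for a failed
# PSU or a dead power feed on one side of the data center.
PSU_W = 3000
print(f"PSUs needed at this load: {math.ceil(total_w / PSU_W)} of 6 (or 8) installed")
```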
@chaosfenix • 1 day ago
It is servers like this that make me argue cloud services just aren't as necessary as they used to be; there is a ton of computing that can happen on equipment like this. In the 2000s, SANs were frequently specced not on capacity but on performance: a standard HDD wasn't going to give you more than 200MBps of transfer and 2-300 IOPS per drive, so if you needed more performance for more users, the only way to get it was to add more drives, and you ended up with a ton of storage servers filled with hundreds or thousands of drives, plus a ton of network switches just to connect it all together and make it work. It was the same for CPUs, since many CPUs of that era had 2 or 4 cores; to scale your application you needed a ton of servers networked to each other with load balancers and networked to storage, all just for your application. That was a lot of work to manage, and there is a reason companies could get stuck unable to scale their application: managing all of that complexity was hard. Fast forward to now, and you can get a top-end server with almost 400 CPU cores and storage fast enough that each SSD reads at 14,000MBps with 3M IOPS, all while being smaller and fitting more drives per chassis. To sum it up, a modern server has 50x as many cores, with drives that have 70x the bandwidth and 10,000x the IOPS, so a lot of the complexity in hosting and scaling your own application is just gone. If you needed 100 servers for your application before, you could do it now with 2; where you once needed several layers of gigabit switches, you can now just buy a pretty cheap 100Gbps switch for the rack, or 200Gbps if you want to splurge. I think people should really look into moving off the cloud for their applications; unless yours genuinely needs to serve 100,000 simultaneous users, you could probably run it on a couple of servers in your main office.
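The consolidation ratios in the comment above are easy to verify; a minimal sketch using the comment's own representative figures:

```python
# Old (circa-2000s) vs. modern per-server numbers, as given in the
# comment above -- representative assumptions, not measurements.
old = {"cpu cores": 8, "MB/s per drive": 200, "IOPS per drive": 300}
new = {"cpu cores": 384, "MB/s per drive": 14_000, "IOPS per drive": 3_000_000}

for metric in old:
    print(f"{metric}: {new[metric] / old[metric]:,.0f}x")
# cpu cores: 48x; MB/s: 70x; IOPS: 10,000x -- close to the claimed 50x/70x/10,000x
```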
@prometheus4130 • 8 hours ago
After you… Cloud done well offers HA failover
@45mw24 • 8 hours ago
I agree, cloud computing is a lot more expensive than it used to be; a few years of decent-performance cloud compute would cover the cost of modern physical servers in an on-site DC, not to mention cloud bandwidth costs. However, the upfront cost of an 8x H200 GPU server from any of the big reliable vendors would make a Fortune 500 company cry, and the running costs are many multiples of a regular server's. A data center that is >5 years old will have cabinet power limits, often 16 or, rarely, 32kW per cabinet, so 1 or 2 of these 6U or 8U servers per 42U cabinet. Running these is expensive, but a lot of companies think their LLM will be the best thing in the world, will let them fire loads of people and have the AI do the work, and so on balance will make the business more money. Will it fuck. AI is just fancy statistical normalizing at the moment; these things don't think, they just tell you what the average answer is.
@chaosfenix • 5 hours ago
@ If you think on-prem can't offer HA failover, you are mistaken. AWS, Azure, and Google Cloud are just someone else's computers; if they can do it, so can you. Kubernetes and even Docker Swarm were literally built to implement that.
@chaosfenix • 5 hours ago
@ Yeah, I don't mean everyone should run out and buy these servers specifically; these are clearly built for AI workloads, which are questionable at best for many of the places companies are trying to use them. I won't say there isn't a place for AI in the workplace, but companies are a lot further away than they think they are. "AI" is not nearly as new as it is advertised to be: it is simply an evolution of the machine learning of 10 years ago and the "Big Data" of 20 years ago. It is the same product under different names, and it will probably be another 20 years before it has anywhere near the impact companies are hoping to get in the next 2-3 years. My point, though, was that there are servers out there so powerful that you don't really need to scale in the cloud. You can easily fit 350+ cores into a 2U server and have it run your website; put two of those servers in a chassis, balance the workload between them, then mirror the configuration at a different data center for failover. Those 4 servers would keep your website available 99.9% of the year.
@dieselphiend • 1 day ago
Data centers and the like must save a fortune on utility bills in the winter. Too bad this stuff isn't more distributed so we could heat our homes with it.
@tormaid42 • 1 day ago
Not really because they’re using wasteful evaporative cooling
@dieselphiend • 1 day ago
@@tormaid42 Not all of them.
@jfbeam • 1 day ago
I doubt it. But the overall efficiency of any cooling solution should go up. I've only seen a handful of places use the "waste heat" for anything useful. (PG plant heated their office.)
@zyeborm • 19 hours ago
It's low-grade heat, unfortunately, so you need to move a lot of mass to move the heat. Not impossible, but not easy.
@dieselphiend • 15 hours ago
@ "Low grade" as in forced air? I'm not sure what you mean. Whatever is drawn from the wall is almost entirely converted into heat, and consider how many watts these centers use. They could easily be engineered to recover most of the energy.
@mo3k • 4 hours ago
Hmm... I understand NVIDIA's concerns with regard to yield control, but it makes me curious what the yield actually is. As the manufacturing process improves, and with a large enough cluster of these, I wonder if it might make financial sense for an owner-operator to figure out how to test and "unlock" any memory marked as inactive (of course, that would require patched/custom firmware). A $100-million (~2,500 H200 GPUs) server farm, which isn't big as these AI data centers go, could probably push out an additional 2.2-2.8GB per card, around 5,500-7,000GB in total. That's roughly equivalent to 40-50 extra H200s' worth of memory, which would cost around $1.3-1.7 million.
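A sketch of that arithmetic; apart from the H200's published 141GB capacity, every input is the commenter's assumption plus an assumed street price:

```python
# All inputs except the 141 GB HBM3e capacity are assumptions.
gpus = 2500               # ~ a $100M farm, per the comment
h200_gb = 141             # published H200 memory capacity
h200_price = 33_000       # assumed street price per GPU

for extra_gb in (2.2, 2.8):   # hypothetical recoverable HBM per card
    total_gb = gpus * extra_gb
    equiv = total_gb / h200_gb
    print(f"{extra_gb} GB/card -> {total_gb:,.0f} GB "
          f"= ~{equiv:.0f} H200s = ~${equiv * h200_price / 1e6:.2f}M")
# 2.2 -> 5,500 GB = ~39 H200s = ~$1.29M; 2.8 -> 7,000 GB = ~50 H200s = ~$1.64M
```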
@yepyepyepyep4602 • 1 day ago
Nice one!
@stradenker • 1 day ago
Let's say I would like one for my home lab. How much does something like this cost? Just a ballpark would be nice to know. I already know... "a lot", but how much is "a lot"?
@112Haribo • 1 day ago
I'm gonna guess north of $100k
@coder543 • 1 day ago
At single-unit volume? Prices I'm seeing online put it in the $300k to $400k range. $250k if you get a really good deal.
@stradenker • 1 day ago
@@coder543 Thanks. Then I don't want to know what a Blackwell variant like this sells for
@MicheleAlbrigo • 1 day ago
Power draw scares me almost as much as the upfront cost 😂
@coder543 • 1 day ago
@@MicheleAlbrigo And you could probably hear this thing's fans screaming like a jet taking off, even from the other side of the house with all the doors closed.
@iszotope • 1 day ago
IMO, MCIO in place of actual PCIe slots sounds nice, but excessive cable bends have been the single biggest contributor to degraded link speeds and FoM errors.
@sintheticgaming • 1 day ago
Can't wait to be able to afford one of these in 12 years 😀😀😀
@romeozor • 1 day ago
I really like that you guys get to show these, but the stuff featured lately is so out of my league it's not even funny. Keep it up tho.
@ServeTheHomeVideo • 1 day ago
The last video was a small used Dell workstation and the one before that was a $199 switch. Trying to strike a balance.
@SyndiOnline • 1 day ago
When will the drawing take place =)?
@TheJensss • 1 day ago
Haha, the ultimate Plex server
@jfbeam • 1 day ago
The last computer lab I built had "only" 5.7kW per rack (3x 120V@20A circuits, derated to 80% for continuous loads), and the UPS was only sized for 20kVA (it'll go to 40, 'tho). At these power levels you'd stop with the silly 120/208/240V AC plugs and hardwire 480-600V. Odd to support 240V DC; the few DC power systems I've seen were in the 350V+ range. ("kill you instantly" super dangerous shit)
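That 5.7kW figure checks out, assuming the 80% continuous-load derating the comment describes:

```python
# Three 120 V / 20 A branch circuits, derated to 80% for continuous loads.
circuits, volts, amps, derate = 3, 120, 20, 0.80
print(f"{circuits * volts * amps * derate / 1000:.2f} kW per rack")  # 5.76 kW
```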
@jurepecar9092 • 15 hours ago
Replacing memory in these still requires removing 8 (IIRC) tiny screws that hold the plastic air shroud in place, which is super annoying and time-consuming. Also, if a GPU decides to fry itself, waiting for spares is measured in months, so any time saved by high serviceability is basically pointless. It only pays off if you have spares on site.
@majstealth • 1 day ago
Here my colleagues complain about our 16-core EPYC with 512GB RAM and all-SSD storage being "slow", but in fact it's the notebook dock that slows them down...
@ServeTheHomeVideo • 1 day ago
Ha!
@dieselphiend • 1 day ago
That baseboard is so chonky. A rundown of the connectors would have been cool. I don't even recognize them. It looks very expensive.
@ServeTheHomeVideo • 1 day ago
I think NVIDIA uses basically the UBB spec
@bits2646 • 1 day ago
It's gonna be a great home lab server in a few years 😂😂
@sativagirl1885 • 1 day ago
Sensible housewives always over-provision their home servers and have redundant network & power sources with UPSs and under-floor fire suppression systems.
@The-Weekend-Warrior • 1 day ago
Am I the only one who feels ServeTheHOME is starting to lose the HOME part? :)))
@ServeTheHomeVideo • 1 day ago
I am confused? We started reviewing 8 GPU servers in 2015. Home is the /home/ directory in Linux
@Davolicious • 14 hours ago
Looks like I'm going to have to sell a kidney...
@osa-bh4kv • 1 day ago
If a GPU like this sits idle, an impressive fine-tune could be running on it instead. Since I started working in AI, I've come to feel it's such a waste when a GPU stays cold.
@dmoneyballa • 1 day ago
The PCIe switches make sense; I was scratching my head wondering how they could hang all those U.2 drives, NVSwitches, and GPUs off just two CPUs. Well planned.
@fakkel321 • 10 hours ago
But they can't give us 16GB+ on consumer GPUs.
@magick93 • 1 day ago
Would make a good laptop
@Sommyie • 9 hours ago
Some kid in 20 years is going to play GTA 7 on this for the lols on YouTube.
@ABUNDANCEandBEYONDATHLETE • 1 day ago
Guys: six 128GB-shared-RAM NVIDIA DIGITS per RU in a 42U rack is 252 units at ~25.2kW, with ~32TB of RAM/shared VRAM. So... just add some switches and three 208V 3-phase AC circuits for the whole rack! Versus just this one 8U GPU server... thoughts?
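Checking that rack math, assuming ~100W per DIGITS box (the figure implied by the ~25.2kW total; NVIDIA had not published a TDP at the time), and noting the memory total lands in terabytes:

```python
per_u, rack_u = 6, 42     # six DIGITS boxes per rack unit, 42U rack
units = per_u * rack_u    # 252
watts_each = 100          # assumed, implied by the ~25.2 kW total
mem_gb_each = 128         # unified memory per box

print(f"{units} units, {units * watts_each / 1000:.1f} kW, "
      f"{units * mem_gb_each / 1000:.1f} TB unified memory")
# 252 units, 25.2 kW, 32.3 TB -- terabytes, not petabytes
```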
@ServeTheHomeVideo • 1 day ago
I am totally in for DIGITS! I actually told AMD they should do the same with the 40 CU APU.
@youtubecommenter4069 • 1 day ago
Commenting that I am first, then watching. Content from Patrick is certain to be good.
@ServeTheHomeVideo • 1 day ago
I watched this the other day from Alex and really liked how it came out.
@charlesturner897 • 1 day ago
Ask your rep to price up a fully stacked one of these, then reply with "that's a bit too expensive for me" and order a 4060
@danieltorio2354 • 20 hours ago
If SMCI files its 10-K form on time, I will be able to afford a few of these systems to play Crysis at full spec 😂
@prometheus4130 • 8 hours ago
Quite the home server… (so Elon tells me, playing his grinded account on it… allegedly 😁)
@gbkqsz • 1 day ago
Too bad I didn't know about this earlier; I already bought a 1050 Ti. What a pity
@ionstorm66 • 1 day ago
Perfect server to be out of date in a month XD
@gösta-e3t • 20 hours ago
I wanna try gaming on these. Probably impossible
@bmilewski • 10 hours ago
Perfect for a home media and Plex server
@jamierogers294 • 1 day ago
I'm not sure it will serve my home for a few years, lol, but Supermicro do know how to build a server.
@ask_carbon • 1 day ago
Patrick, man, sorry, but you really need to take more care of yourself now that you have a little one to take care of.
@ServeTheHomeVideo • 1 day ago
Not sure if you noticed, but between the on-site filming and the studio part (about 40 days apart) I was down over 20lbs. Working on it.
@Webtroter • 1 day ago
@9:00 we're going back to ribbons?!
@ServeTheHomeVideo • 1 day ago
It seems that way.
@Herbit-k4j • 13 hours ago
This thing is worth more than me
@balex96 • 1 day ago
I'm gonna need a bigger house.
@clij5202 • 19 hours ago
But can it play Doom?
@ServeTheHomeVideo • 17 hours ago
It even has a VGA port for the original
@prashanthb6521 • 14 hours ago
2kW at idle? OMG!
@frederiquerijsdijk • 1 day ago
Fan efficiency, seriously? You have 2 power-hungry CPUs in there and 8 (right?) H200s, each with a TDP of 700W (power efficiency isn't NVIDIA's strongest point), and you take FAN EFFICIENCY as a selling point?
@ServeTheHomeVideo • 1 day ago
These use a lot of power. 2% lower power on cooling means 2% more GPU servers in the same power budget.
@TillmannHuebner • 1 day ago
I want to see the home that this thing serves…
@ServeTheHomeVideo • 1 day ago
Usually these run Linux, so it is just a normal /home/ directory
@peterpain6625 • 1 day ago
@@ServeTheHomeVideo Also makes for a decent space heater i reckon ;)
@1HDBIZ • 16 hours ago
And the price is just $250k... more or less
@KanielD • 4 hours ago
Sponsored video, no discount code… lame. Jk awesome video. Thank you!
@lyth1um • 1 day ago
North-south? East-west? What's this?
@ServeTheHomeVideo • 1 day ago
At a high-level, East-West is GPU to GPU and North-South is to the rest of the data center.
@lyth1um • 17 hours ago
@@ServeTheHomeVideo Ahhh, never heard those terms. Makes sense to put this compute on separate networks given its bandwidth needs.
@vijinho • 1 day ago
Is it really to "Serve The Home" though?
@MacGyver0 • 1 day ago
ServeThePatrickHome
@grahamjkeddie • 1 day ago
It might be part of a cloud service that serves many homes
@whyjay9959 • 1 day ago
That's a reference to the home directory on Linux.
@OrientalStories • 1 day ago
More like serve the block
@ServeTheHomeVideo • 1 day ago
Yes.
@velodyneman • 1 day ago
Will this server work for my Google Photos?🤣
@seansingh4421 • 1 day ago
Great for Grok or Llama 405B inference
@tormaid42 • 1 day ago
Listen to yourself, honestly…
@seansingh4421 • 1 day ago
@ wym ?
@TAK-YON_ • 1 day ago
I don't think it can run Crysis, guys
@middle_pickup • 1 day ago
Dude, this isn't a server for the home. It's a server for thousands of homes. Why is this here?
@ServeTheHomeVideo • 1 day ago
We have been reviewing 8 GPU servers since 2015. That is like asking why the Wall Street Journal does not just cover one road in NYC.
@vaikjsf34a • 1 day ago
The real question is: can you mine crypto on it, and what are the hash rates?
@Nemesis1ism • 9 hours ago
Defective 3D chip; TSMC screwed up
@kwisin1337 • 16 hours ago
I feel like you're gaslighting... it's called hot-swapping; we know what that is! And repeating the GPU names in full, back to back, really... The levels of explanation are all over the place: basic explanations, then slightly more complex detail, then back to basic. Good lord, man, am I 5 or 15? Did I go to college, or do I already work on the racks? Pick an audience to explain to, and make separate videos for the basic and hardcore detail levels. Do better by us than this.
@peterboil4064 • 20 hours ago
I only know Supermicro for their stupid OOB management, which leaves a lot to be desired.
@DanFrederiksen • 1 day ago
Hmm, they are not aesthetically pleasing; it's icky 1990s engineering. But I guess if it works, it works.
@judclark7376 • 1 day ago
Who can afford this? Let alone the monthly electric bill.
@tejiriamrasa3258 • 1 day ago
AI business owners, I guess
@zekeriya84 • 9 hours ago
You may need written consent from the Trump administration or the NSA for this server. Before investing, be sure you have a permit to use this powerhouse. 😁