For that first shot I had some weird perspective thing going on; I thought the servers were on the floor behind you and were giant!
@ServeTheHomeVideo · 2 years ago
ha!
@OTechnology · 2 years ago
Did you see the wild raspberry pi at 16:42?
@CrazyLogic · 2 years ago
@@OTechnology I did :) I'm not surprised either - just a shame it wasn't a CM rather than cabled in.
@Nobe_Oddy · 2 years ago
@@OTechnology I THOUGHT that's what it was, but I didn't go back and check until I saw your comment HAHAHA!!!! To think a $10,000 server heat exchanger is actually running a Pi 4!! LMAO!!!
@aninditabasak7694 · 2 years ago
@@ServeTheHomeVideo Yeah and I thought someone was hiding in the servers with a gun trying to put a bullet in your head.
@benjamintrathen6119 · 2 years ago
Your enthusiasm is infectious. Thanks for this series.
@JeffGeerling · 2 years ago
2:00 Raspberry Pi spotted!
@ServeTheHomeVideo · 2 years ago
Boom. It conjured the Earl of Pi himself.
@bryanv.2365 · 2 years ago
Yes! This is what I was talking about! Please do more of this kind of build content whenever you get a chance!
@wernerheil6697 · 2 years ago
EXCELLENT VIDEO, Patrick. Basic thermodynamics I/we applied 20 years ago during my Ph.D. thesis, for instance. "High-chem" and QC on pumps are key here: longevity of components INCLUDING the coolant mix, and mastery of its phase diagram behavior, because x (over 10+ years of 24/7/365 operation) shall not change - EVER!
@ServeTheHomeVideo · 2 years ago
Very much so. That is why we did another video while we were up there on how these are tested. Look for that soon.
@zachariah380 · 2 years ago
@@ServeTheHomeVideo I'd love to see a video on creative solutions data centers have used for a cool source of water - like loops into a large body of water such as an onsite pond or lake, or something like the new "space air conditioning," which basically converts the heat into infrared via special polymer tubing and radiates it up through and out of the atmosphere - almost like a reverse solar panel.
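For a sense of scale on that "reverse solar panel" idea, a Stefan-Boltzmann back-of-envelope shows why radiative panels need a lot of surface area compared to a water loop; the emissivity and effective sky temperature below are assumed illustration values, not figures from the video.

```python
# Back-of-envelope net radiative cooling flux: P = eps * sigma * (Ts^4 - Tsky^4)
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)
EPS = 0.95          # assumed emissivity of the polymer surface
T_SURFACE = 300.0   # K, ~27 C panel temperature (assumed)
T_SKY = 270.0       # K, assumed effective clear-sky temperature

flux = EPS * SIGMA * (T_SURFACE**4 - T_SKY**4)            # ~150 W/m^2
print(f"net flux ~ {flux:.0f} W/m^2")
print(f"80 kW needs ~ {80_000 / flux:.0f} m^2 of panel")  # ~530 m^2
```

At roughly 150 W/m^2, dumping one 80 kW rack's worth of heat radiatively would take hundreds of square meters of panel, which is why it complements rather than replaces chillers.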
@BillLambert · 2 years ago
I remember CoolIT from back when they made AIO liquid coolers, 15 years ago! I had a slightly modded Freezone Elite, an absolute beast of a unit, which could bring my CPU well below ambient if I let it. I later picked up a used 2U server that had a CoolIT system mounted where the front bays would have been, whisper quiet and perfect for my homelab.
@ilovehotdogs125790 · 2 years ago
I thought you can't go below ambient with water cooling. Best case you are equal to ambient.
@ledoynier3694 · 2 years ago
@@ilovehotdogs125790 It's a Peltier unit actually, so it makes sense. With a normal AIO you can't go below ambient though :)
@jeremybarber2837 · 2 years ago
This video is SO timely for my job, unreal. Looking into the AHx10 CDU to see if we can add 1 rack of liquid cooling. Thank you!!
@tad2021 · 2 years ago
You know Cool IT is serious when they have STH doing this and not LTT. /s Joking aside, Linus would have been trying to twirl the latest and most expensive system in the building.
@ServeTheHomeVideo · 2 years ago
Yea we tend to do these boxes before Linus does. E.g. the A100 box - we did three of them, including liquid-cooled ones, a year ago. There is an MI250X node that makes a cameo in this video. It is faster at FP64 than the next-gen NVIDIA H100 GPUs. We just do more informative content rather than entertainment aimed at gamers.
@nathanielmoore87 · 2 years ago
Well Linus kinda failed epically with his DIY whole-room water cooling back in 2015. It's probably a good idea he didn't get anywhere near this equipment.
@YouTubeGlobalAdminstrator · 2 years ago
And Linus' humour is for kids...
@n0madfernan257 · 2 years ago
Pains me when I see Linus building awesome systems just to run games. Data centers throwing around and crunching gigs/teras of data is, to me, what these bad boys are for - minus the Linus... just ignore my ranting
@beauslim · 2 years ago
Cool IT knows that Patrick is less likely to drop stuff, especially after months of having to lift hardware to show it on the new set.
@JortKoopmans · 2 years ago
I just love how amazed Patrick is about the thermal capacity of water, indeed very effective! 😃 But yes, taking the numbers given, the water should be about 38°C warmer than the input if loaded with 80 kW. That's easily handled by tubing/pumps etc. A lot of energy still - think about heating 1800 liters of water by 38°C every hour! 😛
Water flow at 30 L/min = 0.5 L/s
Water thermal capacity is 4186 J/(L·°C)
4186 × 0.5 = 2093 J/(°C·s)
80 kW = 80000 J/s
80000 / 2093 = 38.2°C temperature increase
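The same energy balance is easy to re-run for other flow rates and heat loads; here is a minimal Python sketch using the comment's 30 L/min and 80 kW figures (the only other input is the standard heat capacity of water).

```python
# Coolant temperature rise under a heat load: delta_T = P / (c_p * flow)
def loop_delta_t(power_w: float, flow_l_per_min: float, c_p: float = 4186.0) -> float:
    """Temperature rise in degrees C; c_p is water's heat capacity in J/(L*C)."""
    flow_l_per_s = flow_l_per_min / 60.0     # 30 L/min -> 0.5 L/s
    return power_w / (c_p * flow_l_per_s)    # 80000 / 2093 ~= 38.2

print(f"{loop_delta_t(80_000, 30):.1f} C rise")  # matches the comment's 38.2 C
```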
@ankittayal8291 · 2 years ago
Ahh, now I should use the hot water, convert it into steam, and generate electricity 🔥
@typeer · 2 years ago
What's up, welcome to YYC 😎
@balrajvishnu · a year ago
Loved it, amazing job explaining it
@ServeTheHomeVideo · a year ago
Glad you liked it!
@jolness1 · 2 years ago
Excited to see water cooling targeted at professionals. I love the benefits of water cooling, but when the options are building a custom loop using consumer-grade parts or air... I choose air every time. To have something well tested as an option is exciting.
@baligalahmade739 · a year ago
4
@sugipoohtube · 2 years ago
Thank you!
@LordSesshomaru584 · a year ago
Very well put
@apokalypz08 · 3 months ago
15:38 Essentially it's just 25% PG with distilled water. We also call this the TCS loop, as referred to in ASHRAE TC9.9.
@ewenchan1239 · 2 years ago
From a mechanical engineering perspective, the chiller will be the next target for development (and I'm sure that there are already a lot of folks working on that) in order to make the chilling process thermally, electrically, and mechanically more efficient. After all, it's thermal/heat management. The heat all has to go somewhere. And the more efficient you can make that process, the more efficient the entire cooling loop will be, even if you were running a data center in Phoenix, AZ, for example.
@aeonikus1 · 2 years ago
Heat from datacenters should be used to heat homes. How cool would home colocation of quiet, compact mining rigs be during the winter? I know it would be challenging technically and perhaps legally (who'd be responsible for what in case something goes wrong, etc.), but just on ecological and cost-of-electricity terms it would be a win-win. Realistically, though, it's unfortunately hard for me to imagine. Unless doing some centralised heating, like locating many rigs in big boiler rooms that serve buildings or small communities, etc.
@ewenchan1239 · 2 years ago
@@aeonikus1 "Heat from datacenters should be used to heat homes." I agree, if there is an efficient way to store and transfer the heat. Case in point: older cities in the US used to have municipal steam generation to provide heat to buildings. There's a reason why they don't do that anymore, and a LOT of that has to do with the fact that even with really heavily insulated pipes, a LOT of that heat is lost to the surrounding environment during transmission, before you even get to your intended building. So you would have to "overdesign" the heat generation capacity to make up for those transmission losses so that you would actually get the heat that you need in the building. "How cool would home colocation of quiet, compact mining rigs be during the winter?" I can tell you that the mining rig I have at home is literally more than enough to heat our ~1000 ft^2 house, with only about a 1.5 kW "space heater" running (read: mining) 24/7. It's not technically a cost-efficient way of heating the house (turning electrical energy into thermal energy) - our natural gas furnace is a LOT more efficient for that purpose - but so long as proof-of-work crypto mining is still relatively profitable (i.e. it covers the cost of electricity), then it is still a net profit PLUS heating the house. In fact, we didn't turn off our A/C until WELL into December, and we had to turn it back on as early as, I think, February this year. (And the electrical costs that I am talking about are my whole electrical bill and NOT just the mining rig itself - so it's both mining and the cost of running the A/C starting in February as well.) "Unless doing some centralised heating, like locating many rigs in big boiler rooms that serve buildings or small communities, etc." So... yes and no. There are some data centers that actually don't bother with chilling the incoming air (or water if they're using liquid-cooled servers), but the problem is that you then end up with hot spots in the data center, which lowers the computational efficiency of the systems themselves. So the entire thermal management loop needs to be optimised to run the individual systems as close to the "hot" temperature threshold as possible without exceeding it, while also not spending a lot of energy cooling the incoming air/liquid supply (which just means something else has to dump that heat somewhere else). As a computational infrastructure problem, it's actually quite challenging to hit that balance over a wide range of environmental and operating conditions.
@sembutininverse · 2 years ago
thank you Patrick for the video 🙏🏻
@Noi5eB0mb · 2 years ago
I love videos like this - regular PC liquid cooling is interesting and flashy, but this, this is proper high-power stuff. My question is: in the OCP standard, have they also standardised liquid cooling? Or is it still open?
@ServeTheHomeVideo · 2 years ago
It is not standard yet, but the hyperscalers are looking at it or deploying it already.
@jurepecar9092 · 2 years ago
Most exciting thing here: a miniDP port on the back instead of VGA! Yes! Servers are finally entering the 21st century. Jokes aside, the next step must be a single USB-C port to carry display and keyboard/mouse signals, getting rid of the USB-A ports too. Let's wait and see if we get this within the decade...
@YouTubeGlobalAdminstrator · 2 years ago
USB Type-C yes!!!
@jfkastner · 2 years ago
Super cool video, thank you!
@ServeTheHomeVideo · 2 years ago
Glad you enjoyed it
@Nobe_Oddy · 2 years ago
THAT IS MIND BLOWING!!! SOOOO QUIET!! You can't even be in the SAME ROOM as an air-cooled server, but with this you could build a crib out of liquid-cooled servers and your baby would still turn out to be normal (well... as normal as it can be with parents that would make a crib out of running computers lol)
@nindaturtles613 · 2 years ago
What's the temp difference between both CPUs under full load?
@jacj2490 · 2 years ago
Great job, very informative. To be honest my only concern regarding water cooling is safety. Imagine a water leak in the "top of rack" server - it would ruin the entire cabinet. I know there must be some safety mechanism, like measuring water flow or the difference between return and outlet (like the earth-leakage-current concept), but still the concern is there. I have experience with servers failing constantly due to humidity; how about a water leak? But you are right, it is efficient, hence it is the future. Thanks again
@ServeTheHomeVideo · 2 years ago
There are leak detectors people use. We also have another video coming with a tour of the lab to help show what is done to test and build reliable liquid cooling.
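One of the mechanisms jacj2490 guessed at - comparing supply and return flow - is a natural fit for a closed loop, where the two should match and a sustained mismatch suggests coolant is escaping. A minimal sketch of that check (the two sensor-reading callbacks are hypothetical stubs, not a real CoolIT API):

```python
# Leak heuristic for a closed coolant loop: supply flow should equal return flow.
import time

TOLERANCE_LPM = 0.5   # allowed supply/return mismatch in liters per minute
SAMPLES = 5           # require several consecutive bad readings, not one blip

def leak_suspected(read_supply_lpm, read_return_lpm) -> bool:
    """read_*_lpm are caller-supplied sensor callbacks (hypothetical stubs)."""
    for _ in range(SAMPLES):
        if abs(read_supply_lpm() - read_return_lpm()) <= TOLERANCE_LPM:
            return False          # flows agree, no sustained mismatch
        time.sleep(1.0)
    return True                   # mismatch persisted across every sample
```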
@sirdeimos8968 · 2 years ago
You can also just use a non-water cooling fluid like Galden HT200 in your primary loop and move your water-cooled secondary loop away from your electrical equipment.
@nullify. · 2 years ago
I like how there's a Raspberry Pi running the whole CDU. That's crazy.
@Veptis · a year ago
Could you evaporate the water to dump more energy?
@ServeTheHomeVideo · a year ago
If you see our PhoenixNAP data center tour you can see those.
@Veptis · a year ago
@@ServeTheHomeVideo Went and watched that video, and it reminds me of a lot of the specialty installations that were shown on der8auer's tour through a Hetzner data center, like the diesel generators and raised floor.
@ColdSphinX · 2 years ago
That Raspberry Pi in the cooling unit, so lonely.
@ServeTheHomeVideo · 2 years ago
Ha
@DrivingWithJake · 2 years ago
It's amazing. The only downside is that so many data centers are not built to accept liquid cooling. Raised floors with power underneath are not so friendly should anything leak. :)
@ServeTheHomeVideo · 2 years ago
That is why the next video is going to be on the Liquid Lab we are in and how they test these systems. Starting later this year, not having liquid cooling is going to mean that a data center can only host lower-performance/lower-density servers. We have been doing more on server liquid cooling over the past year to start pushing this concept so our readers/viewers are ready for the transition.
@balrajvishnu · 11 months ago
This is very helpful. Can you make a video on how the RDHX works?
@careytschritter1108 · 2 years ago
Had I known you were in Calgary I would have taken you to dinner! Maybe even gotten you a Calgary hat 🤠
@ServeTheHomeVideo · 2 years ago
I selfishly snuck out to Lake Louise the Saturday after we filmed the two videos in this series. I am a big fan
@carmonben · 2 years ago
I spy a raspberry pi in that CDU :)
@maxheadrom3088 · 3 months ago
I like water cooling. I use AIOs on my humble machines because the CPU is the greatest heat generator, and removing the heat it produces lets quieter fans cool all the rest. Also, on home machines we don't have those awesome airflow guides that servers and workstations have. I'll be building my first server with a server board, and it will have two tiny AIOs plus air cooling for the disks and motherboard.
@ServeTheHomeVideo · 3 months ago
You might enjoy the video we are going to publish this weekend
@AlexandreAlonso · 2 years ago
I'm waiting to see how to set up a central water cooling solution on the rack
@-MaXuS- · 2 years ago
Would also have been super cool to see how the fully water-cooled system looks. Just saying. 🙏
@ServeTheHomeVideo · 2 years ago
Are you thinking of the -ZL1 with the RAM and NIC? In the next video (and there was a cameo in this video) there is an MI250X node that is fully liquid-cooled.
@-MaXuS- · 2 years ago
@@ServeTheHomeVideo Correct! Oh, that's awesome! Looking forward to the next video! To be fair, I always look forward to your content. That's because what you guys share with us is quite unique here on YouTube, so it's immensely appreciated! 👊👌🤓
@Jsteeezz · 2 years ago
So it's pronounced Cool IT (eye-tee)? Makes sense at least. I always hear their consumer products pronounced CoolIT ("cool it") - is that intentional, since their consumer products wouldn't emphasize the IT part? Currently have one of their pumps in my AIO right now.
@ServeTheHomeVideo · 2 years ago
I used to say it that way as well, not "I T".
@Jsteeezz · 2 years ago
I'd imagine so many people have mispronounced it that both are acceptable to the company now.
@zachariah380 · 2 years ago
Tech question on the actual Gigabyte server - how do these 4 nodes access all of the storage drives up front? Are they equally distributed across the nodes? Or somehow pooled so that all of the nodes can access them?
@ServeTheHomeVideo · 2 years ago
Each node has access to one quarter of the front drive bays
@zachariah380 · 2 years ago
@@ServeTheHomeVideo thanks!
@frakman · 2 years ago
Where does oil cooling (immersion) sit going forward? Is it being sidelined?
@ServeTheHomeVideo · 2 years ago
Immersion is still going forward. They are just two different technologies. Using direct-to-chip liquid integrates easily into existing racks.
@Traumatree · 2 years ago
The hottest components in this server you showed are the 20x SFF HDDs running at 10k+ RPM in the front, not the CPUs! But nice video anyway! Another thing: this is a big waste of fresh water if it is not reused...
@-MaXuS- · 2 years ago
Thanks for the really cool 😉 video! Really awesome content as usual, with and by Patrick! It would have been interesting to see the actual heat exchange element in the CDU. If I understood it correctly, the external water source cools the closed loop of the server system connected to the CDU. Seeing as the CDU doesn't have fans pulling the heat away, I'm really curious how that heat exchange works. Also, how is the then-heated water cooled?
@someguy4915 · 2 years ago
The exchange is basically a normal radiator, but instead of flowing air through it they flow water through it. That outside water (now warm) goes back to the data center cooling system, through the chillers (large AC equipment designed to cool the water), and back into a storage/buffer tank to then be pumped right around again and again. This sort of equipment is usually already mostly in place at most datacenters, as it provides the cooling used by the air conditioning.
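In energy-balance terms, that liquid-to-liquid exchanger just moves heat from the hot server loop into the facility water; here is a tiny sketch using a simple effectiveness model (the 0.7 effectiveness, flow rates, and temperatures are assumed illustration values, not specs from the video).

```python
# Simple effectiveness model of a liquid-to-liquid heat exchanger:
# Q = eff * C_min * (T_hot_in - T_cold_in), where C = c_p * flow per side.
C_P = 4186.0  # J/(L*C), water-like coolant

def exchanger_watts(t_hot_in, t_cold_in, hot_lpm, cold_lpm, eff=0.7):
    c_hot = C_P * hot_lpm / 60.0     # heat capacity rate, hot (server) side
    c_cold = C_P * cold_lpm / 60.0   # heat capacity rate, cold (facility) side
    return eff * min(c_hot, c_cold) * (t_hot_in - t_cold_in)

# Server loop returning at 45 C, facility water supplied at 20 C, 30 L/min each:
print(f"{exchanger_watts(45.0, 20.0, 30.0, 30.0) / 1000:.1f} kW moved")  # ~36.6 kW
```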
@nexxusty · 2 years ago
CoXuS.
@zactron1997 · 2 years ago
So this might be a stupid question, but since you'll have a datacenter producing many kilowatts (or maybe even megawatts) worth of hot water, why not hook that up to a steam generator and try to recover some of the power consumed? It wouldn't be close to break-even, maybe less than 20% energy recovery, but still better than just flushing that perfectly good hot water away.
@gloop6589 · 2 years ago
To make that work would require the water to be heated to well over 100°C, which means that whatever is heating the water would also have to be well over 100°C.
@zactron1997 · 2 years ago
@@gloop6589 I think you're right, but surely you could use a heat pump to concentrate the heat? Things like Stirling engines can sap energy out of much lower-grade heat, so I'm sure you could do some energy recovery...
@ServeTheHomeVideo · 2 years ago
There are several projects looking at re-using the heat from servers for heating water and building heat applications. Steam, less so.
@gloop6589 · 2 years ago
I'd think that generating an appreciable amount of electricity would thermodynamically require a sufficient temperature differential to ambient, which is the exact thing we're trying to minimize when cooling a datacenter. Could be wrong though.
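That intuition can be put in numbers with the Carnot limit: even at the thermodynamic maximum, converting ~45°C coolant against ~25°C ambient yields only a few percent, and real engines recover a fraction of that. A quick check (the two temperatures are assumed, typical direct-liquid-cooling values):

```python
# Carnot limit on converting low-grade data center heat into electricity
def carnot_limit(t_hot_c: float, t_cold_c: float) -> float:
    t_hot_k, t_cold_k = t_hot_c + 273.15, t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

print(f"Carnot limit: {carnot_limit(45.0, 25.0):.1%}")  # ~6.3%, before real-world losses
```

This is why reuse projects target heating (where the low-grade heat is used directly) rather than electricity generation.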
@RaspyYeti · 2 years ago
How much of the daily heat load and hot water load for a surrounding campus could be recovered from a sweet-spot-sized data centre using water cooling?
@SamGib · 2 years ago
How high is the risk of water leaks?
@ServeTheHomeVideo · 2 years ago
We do not use the "L-word" in the lab. :-) But the next video that we recorded there is how they test in the lab to make sure no L-word events happen.
@JasonsLabVideos · 2 years ago
OHH you are in Canada!! Come to Vancouver Island :) Bring me that red zip-up jacket! :)
@ServeTheHomeVideo · 2 years ago
Heck yes! Back in the US now but hopefully Canada again soon
@peterexner5979 · 2 months ago
So with all the work on the chips and the server, why no anti-static cable... any reason?
@jeanmarcchartier6438 · 2 years ago
Is that a Raspberry Pi controlling the CDU? How does that work, and does it have any effect on redundancy in the unit?
@docjuanmd7397 · 7 months ago
I can't imagine the magnitude of the disaster if a failure happened with those garden hoses!
@ServeTheHomeVideo · 7 months ago
That is why we did it in the lab
@mikeoxlong4043 · 2 years ago
You should check out Linus' IBM data centre video. Your take would be interesting.
@Phil-D83 · 2 years ago
Cooled with 3M Fluorinert-type fluid?
@jlinwinter · a year ago
super cool!
@allansh828 · 2 years ago
How do you cool the water?
@ServeTheHomeVideo · 2 years ago
Usually there are chillers outside the data center floor at the facility. You can see them in our PhoenixNAP data center tour video.
@TheFullTimer · 2 years ago
Normally I'd look up or call Performance PCs and put together a solution.
@someguy4915 · 2 years ago
Kind of surprising that the CDU seems to be controlled by a Raspberry Pi; haven't seen those in enterprise equipment before (besides a cluster-in-a-4U-box type of thing). You'd expect some ARM-based microcontroller or a simple SoC to be the preferred choice - it usually is, right? Did Gigabyte say why they chose the Raspberry Pi? Is it due to concerns about chip shortages with ARM SoCs, easier development, or anything else? Does other enterprise equipment use Raspberry Pi computers? Perhaps another collab/competition with Jeff Geerling? :)
@ServeTheHomeVideo · 2 years ago
I think CoolIT was using RPis because they were easy to source. I also think they are being phased out on newer products. This was also an older version of the CHx80 because we were running it on its side.
@billymania11 · 2 years ago
A good video, Patrick, but I wonder about something. Shouldn't Intel and AMD be working on their horrendous power consumption? I mean, at some point the power draw of their processors becomes scandalous, doesn't it?
@Bexamous · 2 years ago
Energy efficiency is the thing that matters, and it is going up.
@ledoynier3694 · 2 years ago
Transistor count will always go up exponentially, but efficiency limits the increase in power draw. They are as efficient as they get right now. You have to consider both aspects
@ledoynier3694 · 2 years ago
Even on the consumer market liquid cooling is now becoming a thing. Intel 13th gen and AMD 7000 will be the first to require liquid cooling for the higher-tier units
@Jibs-HappyDesigns-990 · 2 years ago
pretty cool! nice cooling strategy! I thought it was cool in Canada, anyway! ha..ha..!!!
@ServeTheHomeVideo · 2 years ago
It was -16°C the night before filming this!
@florianfaber9799 · 2 years ago
Who spotted the Raspberry Pi? 😉
@g1981c · 2 years ago
You don't need to explain what you're going to explain and how you're going to explain it - just show it
@zactron1997 · 2 years ago
13:40 so that's why I can't buy a Raspberry Pi right now 😉
@francisphillipeck4272 · 2 years ago
Cool....
@mamdouh-Tawadros · 2 years ago
Forgive me, but no matter how robust liquid cooling becomes, it doesn't match the server environment, i.e. 24/7 operation.
@marcogenovesi8570 · 2 years ago
Liquid cooling is used in industrial applications (and vehicles) 24/7 with no problems. The point here is what grade the equipment is. Consumer liquid cooling? Heck no. They are using industrial-grade stuff here, so it is fine
@clausskovcaspersen5982 · 2 years ago
You are my hero :-) hehe, nah, but you are cool. Good videos, thanks
@zippytechnologies · 2 years ago
So now all you need is a 120 ft deep water tank to dump the water into, with valves and tubes at the bottom (up a little so as not to catch sediment) tied to another tank... Dump the heated water in at the top, pull from the bottom as it cools into the other cold-water tanks (small, with exchangers to transfer as much heat out as possible), and add a pump that pushes the cold recycled water back up to the massive server farm... waste not, want not. Swap in some better conditioners and corrosion-prevention chemicals, or a better coolant, and you have a pretty slick eco-friendly solution. I'm so tired of all the videos that talk about all the heat but never explain how they get rid of it, other than massive water coolers on a building or giant server-center HVAC systems... I want to see giant cooling underground instead of putting all that heat right back into the air around us... maybe I am nuts.
@brandonedwards7166 · 2 years ago
Anyone want to pay me to host a rack? I need to heat my pool.
@KantilalMaheta-yo4ro · a year ago
O
@Kingvoakahustla · a year ago
The Asus server has better liquid-submersion cooling; Gigabyte's sorry components do not last.