*These videos take a long time to make.* If you would like to buy Paul a coffee to say thanks, links below: ☕ PayPal: www.paypal.me/TheEngineerinMindset | Channel membership: kzbin.info/door/k0fGHsCEzGig-rSzkfCjMwjoin | Patreon: www.patreon.com/theengineeringmindset
@frankh.38493 жыл бұрын
All I see is wasted energy. Where there is heat exchange there is the potential to generate electricity.
@buntyshukla26253 жыл бұрын
Please make a video on how HVAC system is designed and installed at hospitals.
@DanielBerzinskas3 жыл бұрын
2 DAYS AGO?
@DanielBerzinskas3 жыл бұрын
THIS WAS UPLOADED TODAY, HOW COULD THIS COMMENT BE 2 DAYS AGO?
@fbi-federalblyatofinvestig38533 жыл бұрын
They should try to use Gallium-nitride technology for the power supplies and things to reduce heat.
@ELuciferC3 жыл бұрын
Very cool. I work for a large data center operator that builds slab-on-grade, hot-aisle-contained data halls. We use multi-mode air handlers that sit outside the data hall rather than in it, with cold air ducted to above the server cabinets. Our air handlers have direct expansion, indirect evaporative cooling via a cooling tower system, direct evaporative cooling in the unit, AND access to economizer/free cooling when conditions allow. In some areas we even use the outside evaporative coolers to feed liquid cooling piped into the server cabinets. We went away from CRACs because of the risks they posed to the servers when failures happened inside the data halls; too risky. This setup gives us plenty of redundancy, both for individual unit component failures and for total capacity, as well as efficiency for cooling and power. Thanks for the video!
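The mode sequencing described in this comment can be sketched as a simple selection function. This is only an illustrative sketch: the thresholds, setpoint, and function name are assumptions, not taken from any real controller's sequence of operations.

```python
def select_cooling_mode(outdoor_db_c, outdoor_wb_c, supply_setpoint_c=24.0):
    """Pick the most economical cooling mode for a multi-mode air handler.

    outdoor_db_c: outdoor dry-bulb temperature (C)
    outdoor_wb_c: outdoor wet-bulb temperature (C)
    Thresholds are hypothetical; a real controller is tuned to the site.
    """
    if outdoor_db_c <= supply_setpoint_c - 2.0:
        return "economizer"    # free cooling with outside air
    if outdoor_wb_c <= supply_setpoint_c - 4.0:
        return "evaporative"   # direct/indirect evaporative cooling
    return "dx"                # fall back to mechanical (DX) cooling

# Example: a mild day favors the economizer.
mode = select_cooling_mode(outdoor_db_c=18.0, outdoor_wb_c=14.0)
```

The design intent is that mechanical (DX) cooling is the last resort, only engaged when neither outside air nor evaporation can hit the supply setpoint.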
@seaventura16 ай бұрын
very cool hahahaha
@CjMooseChuckle_13 жыл бұрын
A data center video on how the critical load is maintained during power outage by generators, ATS’s, UPS’s, PDU’s, and static switch PDU’s would be cool. There’s so many configurations though.
@justinhour38795 ай бұрын
This is an amazing video. I'm a journeyman electrician and have built a few of these data centers in Wyoming. I'm currently trying to become a critical facility engineer for one; it's like a whole other apprenticeship. This video is a great refresher for me. Thank you for your hard work.
@brawlerbible22 жыл бұрын
Thank you so much! Because of this video my college presentation went very, very well, and my teacher liked the information too. Thank you so much!! 🙏❤️
@Rando_Suave3 жыл бұрын
I work at a data center. I'll say this is a good video.
@maheshmurali26973 жыл бұрын
Great video. As a DC engineer I enjoyed it.
@ptsmknbatgirl3 жыл бұрын
I used to work at the 9/11 Memorial as an engineer. The data centers were on top alert at all times, and we had more than a few emergencies where the temp climbed from 60 to near 82 in minutes. The Port Authority server room had two constantly running Data Aire units, and you literally had to wear a jacket if you were working inside for any length of time.
@sfperalta3 жыл бұрын
I've worked in both mid-sized data centers and computer labs, back in the 1970s and 80s. Back then, single minicomputer installations were similar to today's data centers in that they were installed on raised floors with significant quantities of cooled air, as those O.G. computers produced a large amount of heat that needed to be constantly removed. Sharing the space near a cooled computer meant wearing a heavy jacket or parka(!), unless you love arctic conditions LOL! I believe the air was being pumped from the floor at about 40°F, not much warmer than the interior of a refrigerator, at hundreds or thousands of cubic feet per minute. Nowadays, people complain if their laptop gets a bit warm or they can hear those whisper-quiet cooling fans. In that data center, you'd be lucky if you could hear your own thoughts; it's like a constant-speed hurricane! I'm sure there are many more clever cooling solutions nowadays.
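The heat that airflow carries away can be estimated with the standard sensible-cooling rule of thumb for air at typical conditions, Q (BTU/hr) ≈ 1.08 × CFM × ΔT(°F). Using the commenter's 40°F supply air as an assumed example:

```python
def sensible_cooling_btuh(cfm, delta_t_f):
    """Sensible cooling rule of thumb for standard air: 1.08 * CFM * dT(F)."""
    return 1.08 * cfm * delta_t_f

# 1,000 CFM of 40 F underfloor supply air warming to 75 F at the exhaust
# removes roughly 37,800 BTU/hr -- around 3 tons of refrigeration.
q = sensible_cooling_btuh(1000, 75 - 40)
tons = q / 12000  # 12,000 BTU/hr per ton of refrigeration
```

The 1,000 CFM figure is illustrative; the point is that moving thousands of cubic feet per minute is what makes these rooms both cold and loud.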
@Z901Z3 жыл бұрын
Another Engineering Mindset banger!!!! You've taken the previous video to the next level!
@EngineeringMindset3 жыл бұрын
I appreciate that!
@glennnickey31603 жыл бұрын
I've been to a couple about 10 years ago working with the chillers. It amazed me that the emergency generator can start up, go to full speed and powering the building in less than 60 seconds. The chilled water system usually has about a 10 min. reserve of chilled water so if one chiller goes down, the spare can come up to speed before that is all used up.
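The size of that chilled-water buffer follows directly from water's heat capacity. A rough sketch, with the load and loop delta-T chosen as illustrative assumptions:

```python
def chw_reserve_volume_m3(load_kw, ride_through_s, delta_t_k=6.0):
    """Volume of chilled water needed to absorb `load_kw` for
    `ride_through_s` seconds while warming by `delta_t_k` kelvin.

    Uses water at ~1000 kg/m^3 and 4.186 kJ/(kg*K).
    """
    energy_kj = load_kw * ride_through_s  # kW * s = kJ
    return energy_kj / (1000.0 * 4.186 * delta_t_k)

# A 1 MW cooling load ridden through for 10 minutes on a 6 K loop delta-T
# needs on the order of 24 m^3 of stored chilled water.
vol = chw_reserve_volume_m3(1000.0, 600.0)
```

That is why the 10-minute reserve the commenter mentions is practical to store in a buffer tank while the standby chiller comes up to speed.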
@glennnickey31603 жыл бұрын
@@snax_4820 That's right, and it would cost them millions.
@apm83963 жыл бұрын
Great that you made vido about Datacenter, I was waiting for one from you. Good work 👏
@rolands.78703 жыл бұрын
Very interesting video! A good efficiency tip is to explain to the customers/rack owners that blanking panels and correctly installed equipment are mandatory. No matter how smart you build your mechanical cooling system and cold aisles, when the equipment you want to cool is not installed properly, you will always have an issue.
@randykitchleburger2780 Жыл бұрын
It's so, so incredibly loud inside a DC. Lots of fun too.
@tinytonymaloney78323 жыл бұрын
I loved being a data centre engineer; it was the best job I ever had, spoilt only by clueless managers without data centre experience who would dismiss your improvement suggestions, only to mention them months later in front of the client so it made them look clever. DCs run mainly on bullshit nowadays.
@hvacdesignsolutions3 жыл бұрын
I was told by a DC manager that liquid immersion cooling will replace all of the above on new DCs over the next 10 years. It's the next-gen server cooling system, apparently: no CRACs, CRAHs, AHUs, chillers, raised floors, hot/cold aisle containment, etc. Would be nice to see a vid on that.
@rockysubu83842 жыл бұрын
BEAUTIFULLY EXPLAINED
@garyburke3013 жыл бұрын
As a server tech I was once sent to do a SAN upgrade at a customer's in-house datacenter. Expecting to be in there for hours, I brought a nice warm jacket. When I walked into the DC it was like stepping into a sauna. The air-con system had failed, there were buckets of water catching the leaking AC, and they had house fans plugged in trying to cool all the equipment. There were hundreds of red flashing LEDs on all the server and storage equipment in the racks. I have also encountered a datacenter AC failure with water leaking from the roof, soaking the racks below, with staff frantically calling their server and storage hardware vendors to log warranty calls and of course not mentioning the flood the gear was exposed to. Cooling failure in a DC is catastrophic; you'd better have a redundant solution in place.
@topotw22 жыл бұрын
This is a very well explained video. The performance of the chiller matters, but in the end the most important thing is to effectively convect away and dissipate the generated heat; it seems the actual cooling energy consumption can be reduced through this. Optimizing the airflow so heat is dissipated effectively is a good approach.
@tomg7213 жыл бұрын
Good explanation. The data center I worked in evolved from overcooling the room to keep the servers happy to adding containment, with an automation system using three temperature sensors on the face of each cabinet door to control that CRAC unit's fan speed and supply temperature. The automation system would learn the cooling requirements of the room and worked quite well. The only problem was that whenever additional cabinets were added, it required additional sensors and more automation-system programming.
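The control loop described (rack-face sensors trimming CRAC fan speed) can be sketched as a simple proportional controller driven by the hottest sensor. The setpoint, gain, and speed limits below are hypothetical; a real system would also learn and adapt as the commenter describes.

```python
def crac_fan_speed_pct(sensor_temps_c, setpoint_c=24.0,
                       gain_pct_per_k=15.0, min_pct=30.0, max_pct=100.0):
    """Proportional fan-speed command from the hottest rack-face sensor.

    Controls to the worst case so no cabinet door runs hot; idles at
    `min_pct` when everything is at or below setpoint.
    """
    error_k = max(sensor_temps_c) - setpoint_c
    speed = min_pct + gain_pct_per_k * max(error_k, 0.0)
    return min(max(speed, min_pct), max_pct)

# Three door sensors: the 27 C hot spot drives the fan to 75%.
speed = crac_fan_speed_pct([23.5, 25.0, 27.0])
```

Controlling to the hottest sensor rather than the average is the usual conservative choice, at the cost of some fan energy.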
@julianparraramirez9500 Жыл бұрын
Did that reduce energy consumption? By how much, percentage-wise?
@Jokreher3 жыл бұрын
I had a job building systems to cool data centers. That was my favorite job.
@benlappin3 жыл бұрын
I’m surprised this video talks about the raised floor design so much. Any new data center I have worked on in the last 7 years doesn’t use any type of raised floor for cooling.
@EngineeringMindset3 жыл бұрын
There are a couple of instances in the video where newer non-raised-floor designs are shown.
@maheshmurali26973 жыл бұрын
All Tier III facilities use a raised-floor design
@benlappin3 жыл бұрын
@@maheshmurali2697 My only experience is North America, but I know that here not all Tier 3s use raised floors. I'm currently standing in a Tier 3 with slab floors.
@SoloRenegade3 жыл бұрын
Both solid floors with overhead cooling, and raised floors are still very common. Raised floors are possibly more likely used in high performance supercomputing centers though. Raised floors are more common for liquid cooled systems too.
@Thispersonsaysso3 жыл бұрын
The company I work for is a large tech company with multiple modern data centres, they are building more as we speak and they are all raised floor
@vittoriopiaser92333 жыл бұрын
Hi Paul! I've been following your channel for quite some time now! When I was writing my Bachelor's thesis I gave an overview of absorption HVAC systems (such as heat pumps and chillers), and I remember there were quite a few articles around on the use of absorption chillers in data centers. One article analyzed a solution implemented in a data center in Arizona (a pretty hot climate) in which small finned tubes were routed around the physical server casings, taking away much of the heat; this fluid then accumulated in a hot tank, kept at the desired temperature with the help of some solar panels. The stored hot fluid was fed to a LiBr-water absorption chiller in order to cool the server room. Hence the server room would be cooled by the same heat the servers were producing! The absorption chiller was cooled with an external water source, water that, if I remember correctly, was then cooled in a cooling tower. Do you guys think this could be a viable solution? What problems would it encounter?
@EngineeringMindset3 жыл бұрын
Yes, it does work. It can't produce enough cooling to do the whole job and it isn't very efficient, but it is a way to offset other mechanical cooling. We covered how the absorption chiller works in an old video; check it out.
@Mr.Rohbot3 жыл бұрын
Thanks for another cool video!
@Jeff-fr7ls3 жыл бұрын
HA !
@Chitose_4 ай бұрын
i took way too long to find this again lol. i should probably save this to watch later
@knottyinks15 ай бұрын
The best tip for saving energy in a data centre is: don't buy an iPhone, look for alternatives that don't track and share all your data, and use cash; say no to CBDCs. Help these guys save a fortune on cooling 😉
@miamisasquatch3 жыл бұрын
As a design engineer for a company focused on data center cooling - can confirm
@miamisasquatch3 жыл бұрын
Though technically we call chilled-water units CRAHs, for computer room air handler
@alans98063 ай бұрын
There's discussion in the media about fresh water usage by DC cooling systems. Given the closed cooling circuits and fluid-to-air heat exchangers involved, where is this water lost? Some thermal power stations lose water to the atmosphere when condensing LP steam in their cooling towers, but I can't see why they must use potable-quality water for this if they don't have access to river water or seawater.
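The water is lost wherever the circuit is not fully closed: evaporative cooling towers and direct/adiabatic spray systems reject heat by evaporating part of the circulating water to the atmosphere. The rate follows from water's latent heat of vaporisation; the sketch below takes full latent rejection as an assumed upper bound, and the 10 MW load is illustrative.

```python
def tower_evaporation_lps(heat_rejected_kw):
    """Litres/second evaporated to reject `heat_rejected_kw` in a cooling
    tower, assuming all heat rejection is latent (an upper bound).

    Latent heat of vaporisation of water is ~2,450 kJ/kg near tower
    temperatures; 1 kg of water is ~1 litre.
    """
    return heat_rejected_kw / 2450.0

# A 10 MW facility rejecting all its heat evaporatively loses ~4 L/s,
# on the order of 350 m^3 of water per day (before blowdown and drift).
lps = tower_evaporation_lps(10_000)
per_day_m3 = lps * 86400 / 1000
```

In practice the makeup water is often treated or potable-quality to limit scaling, corrosion, and biological growth (e.g. Legionella) in the tower, though sites with suitable river, sea, or reclaimed water can and do use it.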
@michaelgarito41763 жыл бұрын
@2:13 Just an FYI, the correct term is "raised floor". The phrase "suspended floor" implies the floor is hanging from a tension system, much like the deck of a suspension bridge. 😉
@zenja423 жыл бұрын
In the 50-120 MW DCs I'm running, we use indirect air cooling (plus spray water). The chiller just kicks in to add chilled water and mix it into the flow if temperatures are higher. CAC (cold aisle containment) is normal; HAC (hot aisle containment) is newer and not common yet. Efficiency gains could also be made if customers agreed to run their intake not at 22-23°C ±2°C (some old-school folks even want 19°C), but more like 25°C ±3°C. From our calculations that's 10-15% less cooling power needed.
@the.bearded.gunner56182 жыл бұрын
One I've built is a hot aisle/cold aisle design: air is mist/evaporation cooled on the second floor and forced down through the roof of the data hall, then the hot air is removed and either remixed or expelled.
@tristanwegner2 жыл бұрын
Great video, but the animation at 3:50 wrongly shows the coolant flowing into both sides of the evaporator. Overall a great overview though, with enough specifics.
@CjMooseChuckle_13 жыл бұрын
UPS and battery monitoring technician here; I've worked in data centers for the past 8 years. I've been at data centers that use that evaporative (they called it adiabatic) cooling where they should not have. I was in a battery room that was 88°F (31°C) at probably 100 percent humidity. Not a good environment for batteries. This was a major company, but I can't say who because of an NDA.
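Why that 31°C room is bad for batteries: a common rule of thumb for VRLA batteries is that expected service life roughly halves for every ~10 K of sustained operation above the 25°C reference. The function below sketches that rule; the 10 K halving interval and the worked example are assumptions for illustration, not from any specific manufacturer's datasheet.

```python
def vrla_life_fraction(room_temp_c, reference_c=25.0, halving_k=10.0):
    """Fraction of rated life expected at `room_temp_c`, using the rule
    of thumb that VRLA battery life halves per ~10 K above 25 C."""
    excess_k = max(room_temp_c - reference_c, 0.0)
    return 0.5 ** (excess_k / halving_k)

# At 31 C (88 F), a battery rated for 10 years at 25 C would be expected
# to deliver only about two-thirds of that, before counting the humidity.
frac = vrla_life_fraction(31.0)
```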
@Rodrigo5403 жыл бұрын
I swear to God, this channel is absolutely underated and you deserve all the likes from the engineering community! Thanks for sharing this brilliant knowledge!
@LascuLars9 ай бұрын
These should be installed near buildings in the city, to heat the water for showers and provide heating in the winter, in parallel with the central heating, when the server does not need to be cooled
@benjangotong92653 жыл бұрын
Yes sir... I'm a technician for PACU units, especially Vertiv 😁😁👍👍
@DeStoreholmskeBaner3 жыл бұрын
Cool you used the Google DC in Fredericia, Denmark as your zoom-in DC in the beginning of the video 👍
@crazyredneck72443 жыл бұрын
Even with optional airflow equipment, data center operations folk seem to still have a knack for installing intake on the hot aisle and exhaust on the cold aisle...
@RedmilesShark3 жыл бұрын
A datacenter I work at from time to time has hot aisle design. It's great, until you have to do maintenance inside that area...
@EngineeringMindset3 жыл бұрын
😂
@Thispersonsaysso3 жыл бұрын
I've just been given a project recently to do inside the hot aisles 🙃😂
@RedmilesShark3 жыл бұрын
@@Thispersonsaysso I feel your pain.
@ELuciferC3 жыл бұрын
I work in one of those! No one likes hot-aisle work lol
@bitebonumbere14263 жыл бұрын
I'm enjoying your channel. Please any updates on manually uploading video subtitles?
@andrewzhu8753 Жыл бұрын
You can use this for personal CPU knowledge too. Pretty good :)
@outworksonline29 күн бұрын
Super interesting!
@mohammadalshaikhhasan50913 жыл бұрын
Perfect, practical... thanks. Some manufacturers also put heaters inside the CRAC unit; they operate after deep cooling during the dehumidification process. I didn't see that in the video: how is dehumidification handled in the video?
@captainkeyes9913 Жыл бұрын
never been to a data center, however my future may involve going to one someday, and not for a tour
@sakinhossain92263 жыл бұрын
Very good video. I love it.
@SorokinAU3 жыл бұрын
good work! thank you!)
@roshanramesh6273 жыл бұрын
Servers are being designed every day to withstand higher temperatures so that the cooling load reduces drastically; DCs have been designed with hot-aisle temperatures up to 38°C to save significant load on the chilled water system. The difference between CHW supply and return temperatures should be as large as possible to decrease the pumping GPM and thus the load. All inverter-driven (partial-load) motors provide higher efficiency at lower speeds, so designing Tier 4 DCs with N+N redundancy and running both systems (2N chillers with inverter compressors and 2N CRACs with EC fans) at partial load provides higher efficiencies. Of course, very little of the above applies to colder American climates with free-cooling possibilities, but servers with higher temperature tolerance always help.
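The pumping point can be made concrete with the standard US chilled-water relation GPM = BTU/hr ÷ (500 × ΔT°F): widening the loop delta-T directly cuts the flow the pumps must move. The 1,000-ton plant below is an illustrative assumption.

```python
def chw_flow_gpm(load_btuh, delta_t_f):
    """Required chilled-water flow for plain water:
    GPM = BTU/hr / (500 * dT in F)."""
    return load_btuh / (500.0 * delta_t_f)

# 1,000 tons of cooling (12,000,000 BTU/hr) at a 10 F delta-T needs
# 2,400 GPM; widening the delta-T to 16 F cuts that to 1,500 GPM.
gpm_10f = chw_flow_gpm(1000 * 12000, 10)
gpm_16f = chw_flow_gpm(1000 * 12000, 16)
```

Since pump power scales roughly with the cube of speed for a fixed system curve, that flow reduction compounds into much larger pumping-energy savings.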
@MrXjoeharperx3 жыл бұрын
While I'll agree with you that servers are being built to withstand higher temperatures (I have walked into rooms that were 95 to 100 degrees and everything was still running), the problem is the optical servers, which are delicate and often start to suffer physical damage above 90 degrees.
@miteshrembo45943 жыл бұрын
Good job👍
@psvyme48paulh453 жыл бұрын
Wow so cool bro 👍🙏🇬🇧
@jimvalim15673 жыл бұрын
How about a video on oscillators? How an inductor and capacitor in parallel circuit can make an oscillator. How they are used to make frequencies for radio applications. And finally, talk about quartz crystal oscillators.
@Thorsted672 жыл бұрын
I live in Denmark close to an Apple data center, and the plan is that part of my central heating will come from the data center from 2024.
@Rick-d6t3 ай бұрын
Please do a video on HCPV
@tsanger1213 жыл бұрын
A bit behind the times here. Built cold aisle data centers 20 years ago. Technologies have moved on.
@ShanQueefus3 жыл бұрын
I work on data center cooling units, both DX and CHW. I am cool.
@michaellinner77723 жыл бұрын
And here I thought it was the neat clothes and hairstyles that made them cool.
@DanielBerzinskas3 жыл бұрын
I am subscribed!
@mohamedfergany56112 жыл бұрын
Thanks for sharing. A quick question if I may: for DX CRAC units, why is the compressor always installed in the indoor unit?
@ejonesss2 жыл бұрын
They could liquid-cool the servers with water blocks similar to what you would use on your PC's CPU and GPU. They also make northbridge and southbridge as well as hard drive and SSD water blocks.
@hquanngd3 жыл бұрын
1:22 The image no server owner wants to see, and you show it to everyone ;) I hope Google's server manager doesn't want to kill you ;-)
@zodiacfml3 жыл бұрын
Accurate and correct until the last part: at 8:20, racks don't exhaust air to the rear but to the top. Data center cooling still lacks efficiency/innovation though. For example, humans working in data centers don't need to be cooled, nor does the whole building/room containing the racks; cool air could simply be sucked in underneath the racks, but no.
@prototypo83593 жыл бұрын
The refrigerant flow at 3:30 is incorrect as it indicates coolant flowing from both ends of the piping towards the evaporator, thus having no coolant exiting the evaporator.
@EngineeringMindset3 жыл бұрын
Well spotted
@DanielBerzinskas3 жыл бұрын
10/10 nice!
@buntyshukla26253 жыл бұрын
Btw thanks for the video was looking for it since heard of raised floor cooling
@octaviovinoly7 ай бұрын
Do you know what are the average outlet water temperatures from the chillers for these applications? Would water need go below 0°C?
@zadrik13373 жыл бұрын
I have been working in data centers for my entire career, and I have seen every one of these layouts and cooling systems. One huge challenge is getting the computers and networking gear to have the proper airflow direction, not to mention the (mostly older) systems that move air in on one side and out on the other. Hot-air containment is the easiest to work in, in my experience, provided the hot area has enough air movement to keep the temperature down to a reasonable level. One item that always seems to be left out is the noise. Granted, that is outside the scope of your video, but it is something you don't hear much about. You wouldn't believe how loud it gets inside a data center, especially inside a hot-aisle containment area; all those 20,000 RPM fans pumping air into a small enclosed space are deafening. Ear protection is a must. You showed some B-roll of people in a data center wearing hard hats. That is bullshit; nobody ever does that. You need one where everyone has ear protection. Also, the stock video only ever shows neat, clean rooms, all organized and with super clean wiring. That does happen, but only in a room that is managed properly. I have been in many colos (colocation data centers where you can rent one or more racks) where many of the racks are just a spider web of tangled cables. It is a major problem that good data center managers spend a lot of time policing.
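The deafening noise this commenter describes follows from how incoherent sound sources combine: N identical sources raise the sound level by 10·log10(N) dB over one source. A quick sketch, with the 70 dB per-fan figure as an assumed example:

```python
import math

def combined_spl_db(single_source_db, n_sources):
    """Sound level of n identical incoherent sources:
    L_total = L_single + 10 * log10(n)."""
    return single_source_db + 10.0 * math.log10(n_sources)

# One 70 dB server fan is tolerable; 1,000 of them exhausting into a
# contained hot aisle is ~100 dB, well past the levels at which hearing
# protection is typically required.
spl = combined_spl_db(70.0, 1000)
```

The logarithmic scale is why a hall sounds only moderately louder as racks are added, yet a contained aisle packed with fans is immediately painful.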
@bayou__ Жыл бұрын
Good video
@jishnumohanpillai68203 жыл бұрын
Can you do a video about hospital and operation theatre air-conditioning systems?
@MrXjoeharperx3 жыл бұрын
Hospital systems are normal VAV systems with hot-water reheat boxes, and the majority of theatres now are giant package units.
@justlisten823 жыл бұрын
Could we use the waste cold energy released from the Liquefied Natural Gas (LNG) regasification process to help with cooling costs?
@TheVonMatrices3 жыл бұрын
I assume that would work but I would think that there are way more data centers than LNG terminals, although both LNG terminals and data centers are located close to cities. But there are other ways to reduce cooling costs. For example, the server room in the office where I work recycles the server heat throughout the building 8 months of the year and heats the building for free except on the very coldest days.
@farnoodshabafroozan49683 жыл бұрын
That was useful
@brlinf0639811 ай бұрын
Just chilling out
@shankarnathmajumder3 жыл бұрын
For underfloor cooling, why do we always route the cold #AirCirculation only from the bottom of the #DataCenter? Sometimes I wonder why we can't put the entire cooling system under the false floor of the #DC, including the #WaterTank itself; the #DC room would then hold part of its own #ChillingUnit under the false floor. P.S. While maintaining all types of #Precautions and #Safety factors.
@Theaverageyoutuber-c8v3 жыл бұрын
To increase efficiency, we could reduce oxygen levels to reduce the formation of rust. This would require infrastructure, but would it be worth the significant cost?
@EngineeringMindset3 жыл бұрын
It would reduce the risk of fire. But it's very energy intensive to maintain a low oxygen environment.
@RahulKumar-ve4jm3 жыл бұрын
@@EngineeringMindset maybe the underwater server does it well without any extra energy
@joecool46563 жыл бұрын
It could also be dangerous for humans
@XDnikiDX3 жыл бұрын
@@joecool4656 It's not very dangerous; I work in a few centres with lowered oxygen levels. You can easily work in there after doing a check-up, or just raising the oxygen level also does it. We use it to prevent fire.
@TheVonMatrices3 жыл бұрын
Can someone explain to me why you would want to do this? Is rust actually a problem in a climate controlled environment? I've owned many dozens of servers and have never considered rust and have never seen rust in a server. Maybe rust would be a problem at edge deployments like cell phone towers, but that's not what this video is about. And why would lower oxygen be helpful except for the reduced fire risk?
@mettcbsd47903 жыл бұрын
I would like to ask: is there any difference between placing the CRAC in line with the cold aisle versus in line with the hot aisle? Which is more efficient? Can I calculate it?
@fernandoyairvalenciagomez1836Ай бұрын
Excelente.
@timothycampbell80533 жыл бұрын
Humidification is a lot more important than you’re letting on. Also I don’t know how other places do it but our towers are vented from directly beneath so there’s no chance of recirculating.
@mazharali99003 жыл бұрын
We are using DAHU Fans for Cooling.
@AsstromechR4-M1AsstromechR4-M111 ай бұрын
Yes evean those. For my R4H18 R4X2 working probbly up runnning verey safe good basstromechs sisco to for now goood server rooms good to.
@AsstromechR4-M1AsstromechR4-M111 ай бұрын
So it's just a name safe privit to good 👍.
@mp26693 жыл бұрын
What temperature is maintained in a data center?
@EngineeringMindset3 жыл бұрын
It depends which industry guide you choose to follow. Some suggest supply air around 23°C, but you need to consider your own data center and equipment to understand whether that is suitable.
@paulmcclung93833 жыл бұрын
It also depends on the server technology, new equipment can handle higher temperatures. Google is running warmer temperatures than a lot of others. But 68 F still seems to be the sweet spot.
@---------______3 жыл бұрын
Question: Is it possible to harness the heat from the hot air flowing in the ceiling into energy? Because I have a dumb tought of placing a stirling engine (which I discovered by yt recommend) on the top of the ceiling where the hot air flows
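The catch with harvesting that ceiling exhaust (with a Stirling engine or anything else) is that it is low-grade heat: the Carnot limit caps the fraction of heat convertible to work, and it is tiny when the hot side is only hot-aisle-warm. A quick check, with the two temperatures chosen as illustrative assumptions:

```python
def carnot_efficiency(t_hot_c, t_cold_c):
    """Maximum fraction of heat convertible to work between a hot and a
    cold reservoir: 1 - T_cold/T_hot (absolute temperatures)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Hot-aisle exhaust at 38 C rejected to 25 C ambient: under 5% even for
# a perfect engine, before any real-world losses.
eta = carnot_efficiency(38.0, 25.0)
```

This is why data center waste heat is more often reused directly, e.g. for district or building heating, than converted back into electricity.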
@yolomc224 күн бұрын
Blimey ! 3:28 glitch in the animation, you can't have refrigerant at both ends of the evaporator flowing inwards 🤷♂🤞
@markefulton22 күн бұрын
Great observation!
@johnheggie80643 жыл бұрын
I installed many cooling units in data centers. They also had Halon fire suppression systems in them. I always worried about setting off the Halon system while working in them; Halon gas eats the oxygen in the room quickly.
@scorpio40413 жыл бұрын
Hey man I'm "just chilling out"
@jucom7563 жыл бұрын
Isn't water cooling inside the computer more effective than air cooling? In an underwater data center it seems like the most doable option too.
@Igneusflama3 жыл бұрын
Something about the animation at 3:47 was confusing me... Then I realized the pipes going into the evaporator are both flowing in and neither are flowing out.
@chrisl62632 жыл бұрын
Some data centers use cold aisle and hot aisle containment, some use cooling towers, and others use a different form entirely. They are crazy. They have massive generators, and they are normally powered directly from the power source, i.e. hydroelectric dams. One building can generate 1 trillion a year, and one section of the building can generate 500 million to 500 billion. I currently work at one such site, on units that do not use refrigerant due to their size; they are designed to be replaced after 10 years.
@philselkin47763 жыл бұрын
What is the priority for cooling a data center, processors or storage? If it's processors then the best way would be in oil that is then cooled and recirculated. Usually a swamp cooler outside.
@paulmcclung93833 жыл бұрын
That's an interesting idea. Can you provide a link to an information site?
@MrXjoeharperx3 жыл бұрын
Using that method the oil will never get below the outdoor temperature.
@siddeshwarapm561310 ай бұрын
Sir, which one is more efficient and easier: a water cooling or an air cooling system?
@Andrew90046zero3 жыл бұрын
could the removed hot air then be used to turn a turbine, and convert some of the waste heat back into electricity?
@Spozinbro Жыл бұрын
Funny thing: I got into servers a few months ago and might start hosting a local VPN.
@thomastexwilson73233 жыл бұрын
I have designed and built over 30 data centers worldwide.
@bilalagha2 жыл бұрын
Just chilling out lol
@highwood18 Жыл бұрын
Who usually works in these data centers? As in, what job titles? I worked on one before but was too scared to talk to the guys inside the data center.
@paulmcclung93833 жыл бұрын
The US DOE sets a specific performance requirement in addition to local mechanical codes.
@vigneshwaranms3 жыл бұрын
just chilling out
@EngineeringMindset3 жыл бұрын
Nicely spotted
@shivask59823 жыл бұрын
Precision Air Conditioner
@joecool46563 жыл бұрын
Do you know if the raised floors are insulated to slow heat transfer? Thanks
@EngineeringMindset3 жыл бұрын
They should/could be, but many aren't
@joecool46563 жыл бұрын
@@EngineeringMindset Thank you!
@paulmcclung93833 жыл бұрын
I have not seen that in data centers or semiconductor fabs. They are moving a lot of air relatively fast, so it may not add value. Also, they install a lot of utilities under the floor, including power, and those systems come up through the floor.
@GothGuy885 Жыл бұрын
Not sure about the evaporative cooling method. Doesn't all the moisture cause eventual corrosion of components and interconnects, shorts in the equipment, and possible data corruption and/or loss? 🤔
@aley46443 жыл бұрын
Is there any cooling system in smartphones? 🤔 If there's it's too small!
@carlosbetancourt92283 жыл бұрын
I don't get how the position of the evaporator coil is completely horizontal; from my understanding it should be installed with at least 60 degrees of inclination.
@EngineeringMindset3 жыл бұрын
In reality it is, but this is a simplified 3D model. It's missing 90% of the components inside; it's just shown as an illustrative representation.
@mmenjic3 жыл бұрын
1:17 Your channel's name should tell you that if ALL the energy consumed by a data center is converted into heat, as you say, then you could also compute on your electric heater!!!! And it would even be more efficient, because there are no moving parts or lights or anything else to waste energy; a heater really does convert almost all of it into heat. We do need to take into account that we cannot have perfect conversion from electricity to heat, so even a heater cannot convert everything into heat, but it surely does much better than servers do!!!