Why Did SLI And CrossFire Multi-GPU Support Die Out?

16,151 views

DF Clips • 1 day ago

► Watch the FULL Video: • DF Direct Weekly #170:...
► Support us on Patreon! bit.ly/3jEGjvx
► Digital Foundry YouTube: / digitalfoundry
► Digital Foundry Merch: store.digitalf...
► Digital Foundry at Eurogamer: eurogamer.net/...
► Follow on X/Twitter: / digitalfoundry

Comments: 148
@sapphyrus • 1 month ago
Now instead we pay double the price for one GPU so it's all good for the companies!
@TheKims82 • 1 month ago
Still remember the microstutter when I had 2x 670s. Don't really miss those days.
@mattfm101 • 1 month ago
I finally managed to afford 680 SLI, and boy, that stutter was such a letdown.
@WindyWalk • 1 month ago
Same here with two 480s. Never tried SLI again after that experience.
@DarkPhychic • 1 month ago
3x SLI got rid of stutters
@PenguinsAreColdish • 1 month ago
I had 10 years of stutters on my 980s in SLI. Finally upgraded a month ago and it's a whole new experience.
@LifeStartsAtrpm-ru1xo • 1 month ago
I still have an MSI Titan laptop with two GTX 980s in SLI (the desktop versions, btw). Keeping it for nostalgic reasons.
@gavinbarrett3444 • 1 month ago
Make sure to heat cycle it now and again.
@LifeStartsAtrpm-ru1xo • 1 month ago
@@gavinbarrett3444 I use it for approx. 2-3 hours every month to keep the battery alive, update Windows 10, Steam, drivers etc. to the latest versions, and run a benchmark which uses SLI to heat cycle it. The battery is my biggest concern; it has degraded about 15% according to a battery health monitoring program. I've been using the same program to 'refresh' the battery, but that didn't do much. If you (or someone else) has a good suggestion for that, I'd love to hear it!
@vanman266 • 1 month ago
I had dual 1080s, but usually had to disable one to stop the stutter 😂.
@TheOldGodFX • 1 month ago
I got into PC gaming back in 2001. In legacy PC builds over the years I have run stuff like two Voodoo 2 cards, and also dual-GPU cards like the Voodoo 5 and Rage Fury Maxx. TBH, at least on the 3dfx side of things, they really had nailed the performance increase. With the Rage Fury Maxx, not so much: it already had compatibility issues with a lot of motherboards, and then you would randomly run into titles that required you to disable one of the Rage 128 chips to get the game to run OK. During the modern gaming era, my dip into dual-GPU configs involved a Phenom X3 with two HD 3850 cards, then an Athlon II X4 with two HD 5770 cards so I could run Metro 2033 pretty well. I eventually upped the processor on that build to a Phenom II X2 unlocked to 4 cores and swapped the Radeons out for a GTX 470 in 2013, then a GTX 950 in 2015. Also during this time I kept an AM2 Athlon FX-62 SLI config on hand. At first I used two GeForce 8800 GTS 320MB cards, but eventually swapped them out for two GTX 280 cards, then retired the build around winter 2012. It was a niche build that came in very handy for games using PhysX or UE3. Eventually I did my first AMD FX-8350 build in spring 2017, and at first I went with two R9 380X cards that I picked up for about $110 each on eBay. But seeing how unsupported CrossFire performance had become over the years, I took advantage of that Ethereum boom, flipped the cards literally less than a month after I got them for double what I paid, bought a GTX 1080, and never looked back.
@kingkilburn • 1 month ago
The issue I see is manufacturers not sharing VRAM between GPUs in gaming applications when they do in enterprise applications. If you aren't sharing the VRAM, there's no reason to break up the workload.
@SeanUCF • 1 month ago
980 Tis in SLI were my jam back in the day. Aside from the technical limitations, with the increase in GPU prices SLI and CrossFire became unaffordable.
@mydogbuddy07 • 1 month ago
I also ran a 980 Ti SLI setup back in the day. 😅 Finding games that actually worked with it was like playing a performance lottery. So many times it didn't work at all, but when it DID work, it was usually a real treat. I think the last game I used SLI in was Monster Hunter World. It wasn't officially supported, but I found SLI bits for it on a forum (😅 which I think was sadly the case for A LOT of the games I got working; they were usually unofficial). The performance increase was pretty good (40 to 60% improvement or something), but unfortunately it had an annoying issue where the sky would flicker at sunrise and sunset, and there wasn't really a way of setting the time in that game besides waiting.
@SeanUCF • 1 month ago
@@mydogbuddy07 Nice! I honestly don't recall what games I played that actually took advantage of it. I think maybe Starcraft 2 and XCOM, but it was so long ago.
@mydogbuddy07 • 1 month ago
@@SeanUCF 😅 Yeah, I'm struggling to remember the games I got working with it too. I remember Monster Hunter World because I think it was the last game I tried and also one of the most broken ones lol 😅 (with the sky). I also remember Metal Gear Solid V: The Phantom Pain worked quite well with it. I don't remember what the scaling was exactly, but I remember it being good, and there were no graphical issues either (or at least none that I noticed). That was probably my most played game using SLI.
@LanceThumping • 1 month ago
I feel like with VR if each GPU was dedicated to a single eye and ignored the other GPU except for some frame sync, then maybe it could work and make VR perform better. However, like they said, so many techniques are available now that use previous frames or adjacent frames to speed up rendering. I've heard, correct me if I'm wrong, that it's to the point where rendering the two very close together cameras for VR is actually not that much harder (in good implementations) than rendering a single camera. So the efficiency is incredibly bad and only the most insanely hardcore or businesses would use it. I'd be far more interested in seeing headsets with high speed eye tracking and universal foveated rendering support. The performance increases for that should be incredible. EDIT: Actually I'd like to see more high speed eye tracking in general with foveated rendering. There isn't much reason that you couldn't use foveated rendering outside of VR assuming the tracking is good enough.
@ProjectPhysX • 1 month ago
4:45 I've actually done that, 32-way multi-GPU. Not SLI but communication over PCIe, with 32 AMD Instinct MI210 64GB GPUs in the GigaIO SuperNODE. That was for one gigantic aerodynamics simulation of the Concorde in 2TB combined VRAM.
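The usual pattern behind a setup like that is domain decomposition: each GPU owns a slab of the simulation grid, and only thin "halo" layers cross the bus each step, which is why plain PCIe can be enough. Below is a minimal toy sketch of the idea in C++, with ordinary arrays standing in for each GPU's VRAM and plain copies standing in for the PCIe transfers; it is an illustration of the technique, not ProjectPhysX's actual code.

```cpp
// Toy model of multi-GPU domain decomposition: split one big 1D diffusion
// domain across N "GPUs" and exchange one-cell halos each step. The nested
// vectors stand in for per-GPU VRAM; the halo copies stand in for PCIe.
#include <cstdio>
#include <vector>

int main() {
    const int gpus = 4, cells = 16;  // interior cells per GPU
    // each device stores its chunk plus one halo cell at each end
    std::vector<std::vector<float>> dev(gpus, std::vector<float>(cells + 2, 0.0f));
    dev[0][1] = 100.0f;  // a hot cell near the left boundary

    for (int step = 0; step < 100; ++step) {
        // "PCIe" halo exchange: copy edge cells into the neighbours' halos
        for (int g = 0; g + 1 < gpus; ++g) {
            dev[g + 1][0] = dev[g][cells];      // right edge -> neighbour's left halo
            dev[g][cells + 1] = dev[g + 1][1];  // neighbour's left edge -> right halo
        }
        // each device updates its own interior independently (the parallel part)
        for (int g = 0; g < gpus; ++g) {
            std::vector<float> next = dev[g];
            for (int i = 1; i <= cells; ++i)
                next[i] = dev[g][i] + 0.25f * (dev[g][i - 1] - 2.0f * dev[g][i] + dev[g][i + 1]);
            dev[g] = next;
        }
    }
    for (int g = 0; g < gpus; ++g)
        printf("GPU %d centre cell: %.3f\n", g, dev[g][cells / 2]);
}
```

The point of the structure: the halo exchange moves a handful of cells per step while each device updates its whole interior independently, so communication stays tiny relative to compute.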
@AnthonyTeasdale • 1 month ago
In 2016 I ran 2x GTX 1080 in SLI. Often it did not do much, though I fondly remember playing through Doom (2016) in 4K at 60fps, which was nice. I think a single GTX 1080 was a 4K30 experience. Ever since my ATI X1900XT, I'd always wanted a dual-GPU setup, so I'm glad I ticked that off my list.
@andyasbestos • 1 month ago
SLI was really cool in theory, but in practice it just wasn't worth it. At one point I had two GTX 770s in SLI, and half my games didn't use the second card at all. But even among the half that did support SLI, it was rare for any game to do so flawlessly. Usually there was a minor issue with some effect not rendering properly on half the frames, resulting in annoying flickering. Project CARS, as an example, had this weird thing where shadows would flicker spectacularly for the first 30 seconds or so when you first started the game before fixing itself. And that was one of the better SLI games; at least the flickers didn't persist in that title. And then there's the fact that you got more frames but somewhat worse latency. I don't remember the latency hit being very noticeable, but what really matters is that you'd never get any better latency than you would with a single card. I remember one time I played Witcher 3, and I could sense something was a little off, but I couldn't put my finger on it. It took me hours to notice that SLI had been disabled for some reason, and that I was playing at around 45FPS rather than the usual 80. The game felt exactly the same! It just didn't look as smooth.
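The "more frames but no better latency" observation falls out of how alternate-frame rendering schedules work. Here is a toy C++ model with made-up timings (25 ms per frame on each GPU, 3 ms of CPU work per frame), not how any real driver paces frames; it reproduces the classic microstutter signature of a near-doubled average framerate with alternating short/long present intervals.

```cpp
// Toy model of alternate-frame rendering (AFR) pacing: two GPUs, each
// needing 25 ms per frame, fed by a CPU that can prepare a frame in 3 ms.
// Average FPS roughly doubles, but present-to-present intervals are uneven.
#include <algorithm>
#include <cstdio>

int main() {
    const double gpu_ms = 25.0, cpu_ms = 3.0;
    double cpu_ready = 0.0, gpu_free[2] = {0.0, 0.0}, last_present = 0.0;
    for (int frame = 0; frame < 10; ++frame) {
        int g = frame % 2;                               // AFR: alternate GPUs
        double start = std::max(cpu_ready, gpu_free[g]); // wait for CPU and that GPU
        double done = start + gpu_ms;                    // frame finishes rendering
        cpu_ready = start + cpu_ms;                      // CPU moves on to the next frame
        gpu_free[g] = done;
        double present = std::max(done, last_present);   // frames present in order
        printf("frame %d on GPU%d: interval %.1f ms\n", frame, g, present - last_present);
        last_present = present;
    }
}
```

Running it, the intervals settle into roughly 3 ms / 22 ms pairs: about 80 FPS on average, but far from evenly spaced, which is exactly the microstutter several commenters describe.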
@Noah-Lach • 1 month ago
SLI is an amazingly cool technology that just sucked, unfortunately. Though I'm sure it was better suited to some non-gaming applications.
@pixelvahl • 1 month ago
That part about latency sounds similar to frame generation.
@matsta177 • 1 month ago
Obviously you didn't listen; this was a DX12 issue. Everyone knew DX12 did this. So this was entirely your fault.
@luckyhomestead • 1 month ago
You guys missed one pretty important use of SLI: buying two old, cheap cards instead of one expensive one...
@edge60able • 1 month ago
I don't think this was ever true or ever made sense as a value proposition. The advice at the time was always to buy the best single GPU you could budget for.
@highlycaffeinated6864 • 1 month ago
@edge60able Totally agree; older cards in SLI barely competed with newer mid-range cards, let alone top-end ones. Plus SLI always had issues with a lot of games. They always had terrible frametimes and never felt smooth, even at high framerates.
@d.ryan96 • 1 month ago
@@edge60able There was. Back in the day I bought a used 7990 and paired it with my 7970. It was ridiculously fast in benchmarks and supported titles, but at the cost of all the multi-GPU setup headaches.
@LanceThumping • 1 month ago
@@d.ryan96 I remember back then. That was probably one of the only times multi-GPU seemed to make any sense. IIRC two 7970s in CrossFire had better performance (in theory) for a lower price than many single higher-end cards. I think there was even a card around that time that was actually just two of those GPUs bridged on one board. However, that was probably a short-lived window and wouldn't make sense in the current landscape.
@PuppetMasterdaath144 • 1 month ago
Lol the comments online never fail to make me lose my mind because of the blatant idiocy
@EastyyBlogspot • 1 month ago
I did try CrossFire years ago and, my god, the stuttering. I do wonder, with APUs, whether a hybrid CrossFire could be done so the APU's integrated graphics and the discrete GPU are all used.
@kamilciura7953 • 1 month ago
This was the idea behind the original Mantle API, which eventually forked into Vulkan with some of the more experimental features dropped (at least that's what I recall). The idea was to offload some rendering tasks onto the iGPU, but in practice this cool concept was a tangled web of problems to implement at the system level.
@Thelango99 • 1 month ago
That was a thing actually. You could pair up an A10 7850K with a Radeon R7 250 and have it use both the iGPU and dGPU.
@Thelango99 • 1 month ago
It was called Dual Graphics.
@mikek92 • 1 month ago
I still miss my triple-SLI GTX 470s. Kept me nice and warm and played any game I owned at max settings. Triple was also good for my surround setup. 5760×1080 was bliss! 😊
@moosemaimer • 1 month ago
I upgraded my rig from a GTX 770 to 2x GTX 980 in 2015, and it was more of a burden than an advantage. Some games ran better, but the number of times I had to disable SLI because of flickering, stuttering, loading time issues, etc. really made me wish I had just waited and paid once for a 1080.
@Gamevet • 1 month ago
I ran two 1GB GTX 460s in SLI. When it worked, I'd get GTX 580-like performance. It was a lot less expensive as well.
@johnellis3383 • 1 month ago
I miss my old Voodoo 2 SLI setup. That thing was a BEAST!
@joshuatyler4657 • 1 month ago
It seems like the glory days of having some monstrosity of a 4x 970 SLI setup on an EVGA SR-2 are over. Now it's just monolithic dies and everything is simple plug-and-play. I still use dual-GPU setups for computational science, but I miss the days when I could play a game and leverage my second GPU.
@Ivan-pr7ku • 1 month ago
Frame generation tech pretty much nullified the few leftover reasons for multi-GPU setups. Vulkan and DX12 also support multiple GPUs, but it's up to the application to utilize the resources explicitly; it's no longer implicitly managed by the driver like before.
@fortunefiderikumo • 1 month ago
Far more trouble than it's worth. A shame, it really was a good idea on paper, if only it had actually worked as well in practice.
@MauroSanna • 1 month ago
It didn't die. It simply moved to other purposes, mainly GPU rendering for VFX (Octane, Redshift, FStorm, V-Ray GPU, VRED GPU, recently even RenderMan and Arnold, and so on and so forth), plus hybrid rendering (GPU + CPU). In gaming it doesn't make any sense nowadays; that's why you no longer see it in that regard.
@vitordelima • 1 month ago
Also, I'm almost sure one GPU per eye doesn't need any coordination (or link between GPUs), and maybe it's already possible with Vulkan.
@jcm2606 • 1 month ago
@@vitordelima It is. Vulkan and DX12 both give you the ability to directly address each GPU on the system and divvy up work between the GPUs as you see fit, which is also part of the reason why SLI and CrossFire died out: they were no longer needed as the newer APIs natively supported multi-GPU.
@vitordelima • 1 month ago
@@jcm2606 ARM GPUs use some sort of SLI-like scheme to scale up, but it only works inside the same chip or package. Maybe desktop GPUs will migrate to this.
@CappaBeta • 1 month ago
Dual 8800 GTs to try and complete Crysis on 'very high' settings over 20fps, on a Q6600 overclocked to 3.4GHz (3.6GHz if I was feeling unstable), in 2007. This setup did last a couple of generations, but we weren't used to high fps at that time. Then when you upgraded, you had two GPUs to hand down to donor systems for multimedia or light gaming.
@Lucromis • 1 month ago
I have a GTX 295, and 2x GTX 560 Ti in SLI. Honestly, it wasn't too bad if the drivers were updated together with the game updates. Mostly an e-peen thing though.
@EmblemParade • 1 month ago
Vulkan (and I think DX12, too) has explicit APIs for handling multi-GPU workloads, and the GPUs don't even have to be symmetrical. For example, it can use a dGPU and an iGPU together, even if they are different brands and architectures. I don't know of any game engine or game that uses it, though. :)
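For reference, the unlinked flavour of this in Vulkan really is just "create a separate logical device per physical GPU and do everything yourself". A minimal compilable sketch follows, assuming the Vulkan 1.1 headers and loader are installed; queue family 0 is hard-coded for brevity, which real code should query properly.

```cpp
// Minimal sketch: enumerate every GPU Vulkan can see and create an
// independent logical device on each. Separate devices don't need to match
// (dGPU + iGPU, different vendors); sharing results between them is then
// entirely the application's job. Build with the Vulkan SDK (-lvulkan).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        float priority = 1.0f;
        VkDeviceQueueCreateInfo qci{VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO};
        qci.queueFamilyIndex = 0;  // simplification: real code should query queue families
        qci.queueCount = 1;
        qci.pQueuePriorities = &priority;

        VkDeviceCreateInfo dci{VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO};
        dci.queueCreateInfoCount = 1;
        dci.pQueueCreateInfos = &qci;

        VkDevice device;
        if (vkCreateDevice(gpu, &dci, nullptr, &device) == VK_SUCCESS) {
            printf("created a logical device on: %s\n", props.deviceName);
            vkDestroyDevice(device, nullptr);
        }
    }
    vkDestroyInstance(instance, nullptr);
}
```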
@arenzricodexd4409 • 1 month ago
Game developers simply did not want to deal with multi-GPU issues. They're most often already busy fixing their game's issues post-launch and working on DLC. Supporting multi-GPU only makes things worse for them, and they don't even get any benefit from it. It's not as if they get a cut from Nvidia or AMD every time a gamer buys a second GPU for their system.
@ebridgewater • 1 month ago
I had 2x Radeon 7870 XT in 2013/2014, and 2x GTX 970 in 2015. I do not remember too many issues with either.
@felipenachmanowicz9393 • 1 month ago
I had a GTX 690 (680 SLI on a single PCB) and a GTX 970 SLI setup. It was awful. Most games had zero SLI support at launch, and very few had decent support even further down the road.
@mitchjames9350 • 1 month ago
It would be good if SLI and CrossFire made a comeback, particularly given how expensive cards are today.
@Dullahan3470 • 1 month ago
mGPU died because Nvidia wanted to sell $2000-3000 cards to gamers, and no one bought the Titans. SLI was what kept prices grounded in reality.
@b1lleman • 1 month ago
The only thing that worked well on my CrossFire 3670 (I think) setup was Crysis at 60fps vsync, but OK, that was about all I played around that time. All the rest was crap due to the frame-to-frame differences in frame rate.
@Zenzuu • 1 month ago
Dual cards like the 4870 X2, GTX 295, GeForce 7950 GX2, and HD 5870 X2 were quite the beasts back then. I bought two 6600 GTs and ran them in an SLI config. They ended up matching the high-end GeForce 6800 Ultra at a fraction of the price. Those were the good old days.
@mikeyp4690 • 1 month ago
I had a 7970 with a 7950 in CrossFire; it was pretty cool. Games like Battlefield 4 were fantastic, but I still remember having to disable CrossFire for others. Skyrim was pretty bad at launch with it enabled and gave crazy bad stutter!
@cyxceven • 1 month ago
3DFX SLI was godlike in the 90s. I still have a dual Voodoo2/Pentium II setup in my garage. Sure, you needed developers to provide an executable with 3D acceleration patched in, but once you got that it just worked. Didn't matter if you had one card or two. No flaky drivers, no stutter. Just crispy 1024 by 768 16bpp graphics. XD
@richard-davies • 1 month ago
Never really had a problem personally with microstutter, thankfully, at least in the games I played. I ran 2x 780 Tis, 2x 980s, 2x 980 Tis, 2x 1080 Tis, and lastly 2x 2080 Tis. By the time the 1080 Tis were out, new SLI-supported games were already getting pretty rare, and by the time the 2080 Tis were out, SLI was pretty much dead. I only kept running it because the games I played supported it, and back then it was a must to drive a 4K screen at a minimum of 60Hz while running games at max details, especially when I went 4K in 2017. Between single GPUs getting really expensive and SLI dying, I'm glad I ditched it when I went with the 3090; not long after, Nvidia announced they were killing it, and by the time the 3090 was out, 4K was easy enough for one card to drive anyway. Thankfully, on high-end cards like the 4090, 4K is easy to drive these days, so I have no issues running one card, though I don't have much choice since the 4090 removed the connector. I do miss the look of SLI, but with the size of modern cards, a single card almost takes up the same amount of space 😄
@thoreberlin • 1 month ago
Would still be good for VR. One card per eye. Easy doubling of rendering performance versus the flat game variants. TAA/DLSS could still be applied per eye. Modern wide processors would also be well suited for this.
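Vulkan 1.1's device groups map onto the one-card-per-eye idea almost directly: the eye draws are recorded once, and vkCmdSetDeviceMask routes each eye's work to one physical device in the group. A sketch under stated assumptions: a two-GPU device group and command buffer already exist, and recordEye() is a hypothetical placeholder, not a real API. Note that compositing both eyes to the headset still needs a cross-GPU copy at the end, which is where the coordination cost creeps back in.

```cpp
// Sketch: one GPU per eye with a Vulkan 1.1 device group. The command
// buffer is recorded once; vkCmdSetDeviceMask restricts the commands that
// follow it to one physical device in the group.
#include <vulkan/vulkan.h>

void recordEye(VkCommandBuffer cmd, int eyeIndex) {
    // stub for the sketch: bind the eye's framebuffer and record its draws
}

void recordStereoFrame(VkCommandBuffer cmd) {
    VkDeviceGroupCommandBufferBeginInfo group{
        VK_STRUCTURE_TYPE_DEVICE_GROUP_COMMAND_BUFFER_BEGIN_INFO};
    group.deviceMask = 0b11;  // command buffer starts valid on both GPUs
    VkCommandBufferBeginInfo begin{VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO};
    begin.pNext = &group;
    vkBeginCommandBuffer(cmd, &begin);

    vkCmdSetDeviceMask(cmd, 0b01);  // GPU 0 only...
    recordEye(cmd, 0);              // ...renders the left eye
    vkCmdSetDeviceMask(cmd, 0b10);  // GPU 1 only...
    recordEye(cmd, 1);              // ...renders the right eye

    vkEndCommandBuffer(cmd);        // one submission drives both GPUs
}
```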
@arenzricodexd4409 • 1 month ago
Multi-GPU support has been baked into DirectX and Vulkan for almost a decade now. Maybe we should ask the VR developers why they are not doing it, rather than the GPU makers.
@brkbtjunkie • 1 month ago
I had 2x 580s; Modern Warfare 2 was lit. But then the microstutter. For the next generation I had 2x 670s, but by then the microstuttering in most games was real bad, and that's when I decided to just buy one big piece of silicon instead of two small ones.
@Lauren_C • 1 month ago
Considering how scalable ray tracing is, I'd imagine it would be easy to split that step among multiple GPUs, especially as we eventually get above 1 sample/pixel.
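This is one of the few splits that really is embarrassingly parallel: Monte Carlo estimates combine linearly, so in principle each GPU could render the same frame with half the samples and a different seed, and the merge is just a per-pixel weighted average. A trivial sketch of only that merge step, with made-up pixel values standing in for the two GPUs' accumulation buffers:

```cpp
// Sketch: merging two half-sample-count path-traced buffers. With equal
// sample counts per GPU, the combined estimate is a plain average.
#include <cstdio>
#include <vector>

int main() {
    const int pixels = 4;
    // stand-ins for the two GPUs' accumulation buffers (N/2 samples each)
    std::vector<float> gpu0 = {0.10f, 0.52f, 0.98f, 0.33f};
    std::vector<float> gpu1 = {0.14f, 0.48f, 1.02f, 0.29f};
    std::vector<float> merged(pixels);
    for (int i = 0; i < pixels; ++i)
        merged[i] = 0.5f * (gpu0[i] + gpu1[i]);  // equal sample counts -> equal weights
    for (float v : merged) printf("%.3f\n", v);
}
```

The catch, as the reply below points out, is everything around that average: both GPUs still need the full scene in memory, and the merged frame has to cross the bus every frame.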
@fcukugimmeausername • 1 month ago
I never understood people who used CrossFire. Having one AMD GPU was bad enough; why add a second one? That's literally like trying to dig yourself out of a hole.
@ChristopherYeeMon • 1 month ago
I loved my SLI setup back in the day
@rvalent9366 • 1 month ago
Idk, I wish we could use some kind of dual-GPU setup for frame generation and FSR.
@ckat609 • 1 month ago
I had two GPUs for 3D rendering on the 10, 20, and 30 series, and it was worth it. It scaled almost linearly. Never tried SLI, though; I never played any games on them. With the 40 series, prices got too crazy, so I think I'll be skipping a couple of generations until the next purchase.
@salvanation • 1 month ago
I think a purely ray-traced game could use multiple GPUs without an SLI connection. I hope we will see this in the future for games with movie quality 😊
@jsncrso • 1 month ago
Uhhh no, not possible. It would be unplayable due to memory bandwidth and especially latency limitations. Ray tracing makes that even 10x worse. It would be a stuttering mess.
@stanleysmith7551 • 1 month ago
My question would be: what the hell happened to DX12's "killer feature", the utilization of multiple GPUs, even from different vendors??? Is this a Mandela effect? Is my mind playing tricks on me? Because I distinctly remember this RTS demo called "Ashes of the Singularity" which was supposed to highlight DX12's new features, the most exciting being pairing GPUs which could in theory share the workload. I distinctly remember game journalists going crazy over this, even pairing Nvidia and AMD GPUs. The reason behind the enthusiasm was quite understandable: you didn't have to throw away your last-gen card, you could just slot it in next to your current card and it would share some of the workload. For example, I have a 3070 Ti in my rig, which is a decent card, especially at 1080p. Then there's my older card, a 1070, which was also a pretty decent card back in the day, but right now it's just sitting there doing nothing. Both of these cards, although massively different in terms of performance, have 8 GB of VRAM. What if... I don't know... I'm running out of VRAM? 😏 I literally have +8 GB lying around doing nothing. If I had them paired, I wouldn't be running out of VRAM any time soon, and I wouldn't be forced to buy a $/€1000 GPU every 3 years or so. What happened to DX12's "killer feature"? Was it just false advertising, shady marketing... or was it abandoned on purpose?
@jcm2606 • 1 month ago
It was abandoned because it drove complexity through the roof. DX12 and Vulkan are already complex APIs with a lot of mental overhead to work with, and multi-GPU made the mental overhead exponentially worse as you now had to worry about synchronising two GPUs, load balancing across two possibly differently performing GPUs, copying resources between the two GPUs at the right moment, etc, all for a minority of a minority of the player base.
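Even the "easy" first step shows where that complexity starts: with explicit multi-adapter there is no single device spanning the GPUs. The application enumerates the adapters itself and creates an independent ID3D12Device per GPU, and from then on every copy and every fence between them is the application's job. A minimal Windows-only sketch of just that enumeration step (link against d3d12.lib and dxgi.lib):

```cpp
// Minimal sketch of DX12 explicit multi-adapter's starting point: list the
// adapters and create one independent ID3D12Device per hardware GPU.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the WARP rasterizer

        ComPtr<ID3D12Device> device;  // one independent device per GPU
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"created a device on: %s\n", desc.Description);
        }
    }
}
```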
@mako9673 • 1 month ago
I would have thought there was more interest in SLI for content creation, rendering workloads, etc. With AI becoming more popular, I would have thought it would come back for that too, unless SLI is just not necessary for AI workloads.
@jcm2606 • 1 month ago
It's generally not necessary because you don't need to move much data between the GPUs. As far as I know, multi-GPU deep learning setups typically split the network layers between each of the GPUs, ie the first GPU is responsible for layers 1-10, the second GPU is responsible for layers 11-20, the third GPU for layers 21-30, etc. As such the layers can be uploaded as needed ahead of time, and the only thing moving between the GPUs is the values between the groups of layers (ie the output of layer 20 has to be moved from GPU 2 to GPU 3 so that it can be used as the input of layer 21), which should totally be fine to send across PCIe.
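A conceptual sketch of that layer split in plain C++: real frameworks do this with CUDA streams and NCCL or PCIe copies, but the shape of the data flow is just this. Each "device" owns a contiguous group of layers, and only the small activation vector moves between them; the per-layer transform below is a dummy stand-in, not a real network.

```cpp
// Conceptual sketch of pipeline-style model splitting: GPU 0 runs layers
// 0-9, GPU 1 runs 10-19, GPU 2 runs 20-29, and only the activation vector
// crosses the bus between them (a plain copy stands in for the transfer).
#include <cstdio>
#include <vector>

using Vec = std::vector<float>;

// stand-in for running one group of layers on one device
Vec runLayers(const Vec& input, int firstLayer, int lastLayer) {
    Vec x = input;
    for (int l = firstLayer; l <= lastLayer; ++l)
        for (float& v : x) v = v * 0.9f + 0.1f;  // dummy per-layer transform
    return x;
}

int main() {
    Vec activations(1024, 1.0f);       // the network input
    const int layersPerGpu = 10;
    for (int gpu = 0; gpu < 3; ++gpu) {
        int first = gpu * layersPerGpu, last = first + layersPerGpu - 1;
        Vec out = runLayers(activations, first, last);  // compute on "GPU g"
        activations = out;             // the "PCIe copy" to the next GPU
    }
    printf("output[0] = %f\n", activations[0]);
}
```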
@mako9673 • 1 month ago
@@jcm2606 Thanks. I figured that may be the case.
@arenzricodexd4409 • 1 month ago
Professional apps never needed the SLI hack; only games did. Pro apps have been able to utilize multiple GPUs for a very long time. In some cases they can even mix GPUs from different generations, because all they see are the CUDA cores.
@HandsOC • 1 month ago
680 SLI back in the day was awesome!
@B1u35ky • 1 month ago
Latency
@atariplayer3686 • 1 month ago
IMHO SLI was a great tech, but unfortunately the titles that supported it were few, and some titles even used the mGPU tech of DirectX 12, which meant you had to remove the SLI bridge completely from your GTX 1080 card! IMO the final nail in the coffin was when, with the 10-series GPUs, Nvidia decided to cut mid-range budget gamers out of the equation, so you could no longer put two GTX 1060s together and get a 1.5x or 1.7x performance boost 😞 Not to mention the amount of tweaking you had to do to your GPU, as well as the game config files, to get your setup working for any given game on PC. SLI AFR, or Alternate Frame Rendering, was probably the best implementation; however, the game development companies decided not to support it across the various models of GPUs.
@L0rd_Xa0s • 1 month ago
3dfx for ever!
@spehrson • 1 month ago
I had SLI setups with the 8800 GT, the GTX 460, the GTX 670 and the GTX 970. My upgrade cycle always consisted of: buy one new, wait 12 months, buy a second used card, then SLI... then upgrade with a new card... repeat. It let me feel like I was staying current while skipping a generation. Since the 10 series (GTX 1070), I have upgraded with every new generation. No SLI; Nvidia gets more money from a buyer like me. I miss SLI.
@spehrson • 1 month ago
Not to say it wasn't annoying. Does anyone remember the SLI bug in the first Witcher game where all the lights shone through the walls?
@NaokiWatanabe • 1 month ago
No. You want the story of SLI? Here is the story. With DX12 and Vulkan 1.1, circa 2017-2018, the industry had moved away from the much-disliked driver-side hacks of "SLI" and into the modern age, where developers could directly control multiple GPUs via the common graphics APIs. With that change, the stutters of old "SLI" were gone, the per-game driver profiles were gone, and the restrictions on equally matched GPUs were gone. Consumers and game developers had more flexibility and options. We had brilliant examples of games from Rebellion (Strange Brigade, Zombie Army, Sniper Elite), Gears of War 4, Rise of the Tomb Raider, and Civ 6. People looking for a quick performance upgrade could add another mid-range GPU, perhaps from the second-hand market, and very nearly double their framerate (1.9x measured in Strange Brigade, for example). Things were looking good for a time. There was demo code on how to use it from Microsoft and Khronos. Even DLSS supports DX12 multi-GPU (AFR). The 32GB/s bandwidth of then-commonplace PCI Express 3.0 made shuffling buffer data around a breeze. GPUs now supported async compute, so copies happened in the background, eliminating stutters. Games were coming out. And more people than ever were building multi-GPU systems, because more and more content creation programs used for things such as 3D rendering and video editing could take advantage of them. But NVIDIA didn't like this at all, since it would eat into high-end sales. So they just straight up decided to kill it off. NVIDIA changed their driver so that DX12 and Vulkan multi-GPU did not work on mainstream GPUs. Yes, they driver-locked a _standard feature_ of these APIs to higher-SKU parts, even then forcing people into buying a pointless hardware dongle to enable it. All of this functionality still exists in the APIs, but when the major seller of GPUs goes and functionally kills off a feature, of course developers won't spend time implementing it. It's one of the big crimes in PC gaming.
@eekzMadeThis • 1 month ago
I last did SLI on 2x GTX 260. The gains were minimal, and the RAM on the second card didn't count. It would have been better if they'd offered a cut-down card with just the GPU to make SLI more affordable. 2x the price for a 20-25% gain wasn't worth it.
@frznfox4401 • 1 month ago
Well, 4 GPUs in a rig looked badass, tbh, but yeah, the scaling issues were not worth it.
@DavidCowie2022 • 1 month ago
Did anyone ever pronounce SLI as "silly"?
@Tintin_desi_billa • 1 month ago
I pronounced it as "sly"
@TheJakeSweede • 1 month ago
It is too much hassle.
@SeanUCF • 1 month ago
Not really. You just popped in another graphics card and popped on the bridge connector between them. Then in the Nvidia settings you just selected SLI. That was it. It just wasn't worth it past the GTX 1080 Tis, since so few developers bothered to take advantage of SLI or CrossFire.
@40Sec • 1 month ago
With Nvidia releasing cards like the 4090, there's really not much need for SLI anymore (for those who can justify the cost). At this point software is still trying to catch up with the high-end.
@DfOR86 • 1 month ago
SLI 8800 GTS, SLI 8800 Ultra, SLI GTX 260, SLI GTX 560 Ti 448: best time of my life in gaming ❤
@dawnday2666 • 1 month ago
I had an ATI CrossFire setup, and given my experience I would have been better off spending the same amount on a single card.
@alaskanhybrid1845 • 1 month ago
It just makes more sense to buy the single most powerful GPU to play games, because Nvidia's support, and the games that supported it, went from bad out of the factory to worse on the software development side.
@a36538 • 1 month ago
I had multiple SLI rigs.
@jrherita • 1 month ago
I bought GTX 970s in SLI because Nvidia promised VR SLI and... it never happened.
@Akkbar21 • 1 month ago
It never worked well except for data work or rendering.
@dangir1783 • 1 month ago
Multi-GPU needs complete frames to work. Once games on Unreal Engine 4 started using temporal algorithms, in the sense that shadow and lighting effects are not calculated from scratch for every single frame, the system no longer worked, since one of the two GPUs has to wait for the other to render. And it's all the fault of the consoles that these stratagems were adopted to save computing power.
@Noah-Lach • 1 month ago
Saying it's the fault of consoles implies that SLI was ever good for gaming in the first place.
@iMichael123S • 1 month ago
I think they died because they just didn't have much benefit from adding more; similar to adding more RAM, it just doesn't have much benefit past a point.
@slapnut892 • 1 month ago
Here's a thought: SLI and CrossFire might be dead, so why not develop games that also utilize the iGPU to supplement the dedicated one? They are certainly powerful enough these days, and it would be an excellent bang-for-buck scenario for budget gamers.
@jcm2606 • 1 month ago
iGPUs suffer from the same problems that dGPUs suffered from with SLI and CrossFire, since they _are_ regular GPUs like any other, just located on the processor die/package and sharing RAM with the CPU. iGPUs also introduce two additional problems. First, you pay an additional latency penalty for copying data between your iGPU and dGPU, as you need to move the data through the CPU first. That requires that the CPU synchronises with the source GPU, effectively stopping dead in its tracks until the source GPU has finished its work (+latency), then copying the data from the source GPU's VRAM to RAM (+latency), then copying the data from RAM to the destination GPU's VRAM (+latency), then finally telling the destination GPU to start (+latency, possibly). Second, load balancing between the GPUs is significantly harder since the iGPU will generally be slower than the dGPU, meaning you'll either have uneven frame pacing if you use alternate-frame rendering (AFR), extra latency if you manually pace the frames yourself with AFR, or extra latency if you use split-frame rendering (SFR) or offload certain tasks to the iGPU.
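The copy chain in that first point, written out as a plain C++ outline: the buffers are ordinary vectors standing in for VRAM and the waits are stubs, but each stage maps to one of the "+latency" steps described above.

```cpp
// Outline of sharing a result between two GPUs through system RAM. Every
// stage below is a wait-then-copy, which is where the latency accumulates.
#include <vector>

using Buffer = std::vector<unsigned char>;

void waitForGpu(int /*gpu*/) { /* real code: block on a fence/semaphore */ }

void shareResult(Buffer& srcVram, Buffer& dstVram) {
    waitForGpu(0);             // 1. CPU waits until the source GPU is done (+latency)
    Buffer staging = srcVram;  // 2. source VRAM -> system RAM              (+latency)
    dstVram = staging;         // 3. system RAM -> destination VRAM         (+latency)
    /* 4. tell the destination GPU to start                                 (+latency) */
}

int main() {
    Buffer dgpu(1024), igpu(1024);
    shareResult(dgpu, igpu);   // e.g. handing a dGPU frame to the iGPU
}
```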
@ErdrickHero • 1 month ago
I'd have bought multiple GPUs if they were affordable. As it is, I could barely afford one at a discount.
@lobstercoco • 1 month ago
Long live the AMD R9 295X2
@tehf00n • 1 month ago
At their roots, the engines often don't support the best use of SLI in their core functionality. Unreal Engine, for example, doesn't really get any benefit from the extra GPU.
@DanielHebell • 1 month ago
Just imagine the #StutterStruggle with Unreal Engine 5 + SLI 😂
@barrylovatt9978 • 1 month ago
My M17x R2 would get better fps in every game I was playing apart from Skyrim, but I actually got that to work after a lot of fucking about.
@GamingRobioto • 1 month ago
I did this with 670s and 980 Tis. It wasn't worth it.
@federicocatelli8785 • 1 month ago
Very bad scaling with more than 2 cards, plenty of driver issues.
@noodles9345 • 1 month ago
Wasn't there an AMD GPU at some point that was pretty much just two GPUs sandwiched into one card? Or something along those lines. Lol, maybe I'm dreaming, but I could've sworn that was a thing.
@arenzricodexd4409 • 1 month ago
AMD and Nvidia both did that.
@hassosigbjoernson5738 • 1 month ago
For me it's much simpler: GPU performance increased drastically, so you simply didn't need two GPUs anymore to achieve high-end performance. And things like heat, costs (in electricity) and usability did the rest for SLI.
@Thisisjohn2184 • 1 month ago
Do people not have Google? I appreciate the videos, but why ask it here?
@RadioactiveBlueberry • 1 month ago
I put this exact question to Bing Copilot chat. The answer started with a somewhat unnecessary Wikipedia-like introduction, as if the person asking had never heard of SLI or CrossFire before, followed by a long list of bullet points. I like this kind of discussion format more.
@BFArch0n • 1 month ago
$1500, 500-watt GPUs don't help.
@Orpheusftw • 1 month ago
I'm glad it did. 👎 It seemed to create at least as many problems as it solved.
@dmer-zy3rb • 1 month ago
Because it stuttered and only gave like 40% more fps anyway!
@ErdrickHero • 1 month ago
I hate rendering techniques that make use of the previous frame. Information from the previous frame is outdated and causes smearing and other artifacts.
@Pemalite • 1 month ago
Used to run 4x CrossFire Radeon 6950s unlocked into 6970s on a triple-1080p Eyefinity setup. Those were the days.
@electrikoptik • 1 month ago
PC noob here: is it still possible to build a PC with say 2x RTX 4090?
@johnconnor5124 • 1 month ago
No, the last cards to support SLI were the 2080s.
@thoreberlin • 1 month ago
Yes. Server boards have enough lanes. You would only find one game which supports multi-GPU via DX12, which is Ashes of the Singularity. Otherwise it's only useful for engineering or AI tasks that can be loaded onto multiple consumer GPUs. The A6000 still supported memory sharing via NVLink, but the RTX 6000 doesn't anymore. For more than 48 GB of video memory you now have to buy into the server-grade stuff at 5-6 figures.
@shadesn • 1 month ago
Because it sucked! Moving on...
@thetruthwillout810 • 1 month ago
SLI was fantastic. In almost every YouTube tech 'influencer' video on this matter, they never understood the main use case: you could source cheaper, older-model cards and pretty much double performance. I ran 2x 8800 GTX to play Crysis 1+2, then 2x 760s, and managed to stay on par with the top GPUs at the time; the only issue was the tons of power required and the heat. Otherwise it was excellent for those with less cash to spare who wanted to play the latest top-tier games. Now I've a 4090, and almost all the games are overbloated walking sims being feted by these tech channels, and so spiritless that if it wasn't for Ghost of Tsushima (an old PlayStation port) there wouldn't be anything worth playing right now, and certainly nothing worth forking out 1,000+ for to play some poorly optimised lecture on feminism in game form.
@ccgear4367 • 1 month ago
All these poor people killed the performance feature. 🤣
@tomsimmons7673 • 1 month ago
You're not first.
@mikeuk666 • 1 month ago
Bot
@gametime4316 • 1 month ago
SLI/CF is not fully dead... it's having a midlife crisis on its way to (real) multi-die GPUs (not what AMD did with RDNA 3).
@HiN00bs12 • 1 month ago
Ps5 is destroying the gaybox series x !!!!!!! 135 million units sold while the gaybox series x only sold 1 million 😂😂😂😂😂😂😂😂😂
@twosnakse • 1 month ago
Delete this and go to school lad
@Erksah02 • 1 month ago
ex box and gey station. politically correct master race is king skibidi toilet
@Erksah02 • 1 month ago
@@twosnakse Man, read his previous comments; this guy is the biggest troll.
@DavidCowie2022 • 1 month ago
@@twosnakse You replied to the troll. He wins.
@Tanzu15 • 1 month ago
Had 780s, 780 Tis and 980s in SLI. Went from 980s in SLI to a 1080 Ti, and to me that's when SLI died. I got more than double the VRAM, less heat, less money spent, more space saved in the case, and the 1080 Ti was faster with zero issues in games. SLI could have been better, but the devs didn't support shit. Imagine how crazy Cyberpunk and other heavy games could run at 4K high refresh rate with 4080s or 4090s paired up.
@emilgustavsson7310 • 1 month ago
SLI/Crossfire was never ever a good choice in practice. Good idea, terrible implementation.
@arenzricodexd4409 • 1 month ago
More like the games were terrible rather than the tech. Many professional apps can benefit from multiple GPUs; supercomputers have thousands of GPUs inside them. And just look at other tech like Intel HT or AMD SMT: professional apps can benefit quite nicely from those features, but games? Not so much. Sometimes it even lowers gaming performance, like on those 16-core AMD CPUs. GPU MCM? Already used in the professional space, like the AMD MI250X. Nvidia's upcoming B100 is also going to use MCM, but not on gaming GPUs, because gaming workloads do not really like sophisticated new tech like that.
@samunderkhan1078 • 1 month ago
Don't listen to 'em; NVLink is good for rendering. Redshift and UE5 path tracing is where it's at. It's a shame lazy devs don't wanna implement anything. Like, really, Epic: bring multi-GPU support to Movie Render Queue.
@MrWatisthisidonteven • 1 month ago
It was also incredibly unstable, and the performance was awful compared to the cost.