The Slow Death Of The GTX 690

186,767 views

Iceberg Tech

1 day ago

Comments: 593
@IcebergTech • 1 year ago
Okay, some footnotes: I lost my voice while recording the VO, but there were a few things I wanted to add or change that I couldn't record in time.

*4 & 6-way SLI:* Craft Computing, the example I included in the footage, used the 3x GTX 690s as part of a cloud gaming server. Other valid use cases for 4-way and 6-way SLI include triple monitor 3D Vision Surround setups. When I said that it was "just a flex", I speak from the perspective of someone who doesn't need to run large numbers of VMs, and had also never considered triple monitor surround gaming setups as anything but an ostentatious display of wealth.

*Single GPU vs. Dual GPU:* When comparing the single GK104 vs dual, I applied a small OC to the single chip setup. This was to more accurately reflect the experience of someone who had "only" bought a single GTX 680. I'd have gone higher, but my card couldn't handle more than about 75MHz!

*Drivers:* Although I'm talking about the history of a GPU from 2012 and the games it would have been played on at the time, I am using it in a modern system running Windows 10, and with 474.04 drivers (Nov 2022 security update, the latest available for Kepler). I could have used period-appropriate drivers and hardware to more accurately reflect the conditions of the time the games were released, but my schedule didn't allow it. Sorry!

*Skyrim:* I said in the script that I didn't see any of the physics glitches I'd heard about from running the game above 60FPS. I *should* have said that the only obvious glitches I could see were the weirdly fast hand movements and the flickering, *and that* this might have been caused by running the game above its intended frame rate, and was probably *not* related to SLI. I kinda said it with the on-screen text, but I wanted to be a bit more clear about it.

*Doom:* I included some footage of Doom 2016 but ended up deleting the associated VO as it didn't fit the flow of the narrative, and I feel the footage might need some further explanation. Doom 2016 was remarked on at the time as running worse on high end Kepler than on entry level GCN cards, and that still seems to be true 6 years later. I couldn't get the game to run in OpenGL at all, but in Vulkan it varied between 20 and 40 FPS at 1080p Low. It also didn't use the second GPU, even with adjustments made in nVidia Control Panel and nVidia Profile Inspector.

*Fortnite:* At the last minute, I went back and retested Fortnite in SLI, because I'd seen someone else's results that were very different from mine. Despite multiple attempts with different AFR modes and profiles, I still saw this massive performance regression in DX11, and DX12 didn't acknowledge the second GPU at all. Performance Beta, meanwhile, performed very well, but soon found itself running above 200 FPS. At this point, CPU performance would be the limiting factor.

*Asynchronous compute:* There was something of an uproar some years ago when AMD accused nVidia of "not even having async compute" in Maxwell, a feature which nVidia apparently *did* include a variant of as far back as the GTX 780 and Titan. I don't intend to drag up an old Red vs. Blue argument, but in the case of the HD 7970 it is a fact that async compute was something the AMD card had that the GTX 680/690 didn't.
@theburger_king • 1 year ago
I can’t read that much lol
@thcriticalthinker4025 • 1 year ago
So regarding 6-way SLI: SLI maxes out at 4 GPUs, as there is no bridge configuration for any higher configs. This doesn't stop you from slapping 30 cards in a rig and using them for other purposes.
@walrusman151 • 1 year ago
Anybody who knows enough on a topic to leave a whole book of information as a footnote is pretty impressive
@nepnep6894 • 1 year ago
Async compute on any Nvidia card pre-Turing was achieved through rapid context switching, so it didn't really net performance improvements; it only really helped compatibility.
@CertifiedAsher • 1 year ago
Hello, cool relaxing videos that explain GPUs and CPUs so well. Anyway, can you do a video about the Nvidia Quadro K4200?
@terzaputra3203 • 1 year ago
I have a friend who was struggling so hard to sell his old 690 a few months ago. Besides unsupported drivers and broken SLI support, the main problem was its power consumption. It competes with the likes of the 1050 Ti in the same $80-100 price range, and that card is like 3-4 times more power efficient. At that price, people would choose a more efficient card rather than a glorified heater. He ended up making a nice glass case for it and decorating his wall with it.
@misiekt.1859 • 1 year ago
For $100 you can get an RX 5600 XT, which is around 1070 Ti level. Even the 5500 XT is 2x faster than a 1050 Ti and often below $100.
@HolographicSkux • 1 year ago
The 690 will produce a higher frame rate compared to the 1050 Ti, but it is essentially a 680 at this stage. Glad I got rid of my 690 years ago, but it was cool while I had it.
@jimtekkit • 1 year ago
Yeah, the power efficiency of today's cards is a big reason to buy new. Right now I'm on an RX 570, and it consumes twice the power of my RX 6600 while only giving about 60% of the performance. And you can definitely notice the additional heat output while it's loaded up. It's a similar story on the Nvidia side as well.
@crazeguy26 • 1 year ago
I love my GTS450 zoom zoom!
@VarietyGamerChannel • 1 year ago
@@misiekt.1859 The Radeon 5000 series are also jet-noise heaters.
@ALmaN11223344 • 1 year ago
This was back in the era when I had GTX 460 SLI and helped a friend build a 9800 GTX SLI setup because they were $40 each on Amazon... oh, how the times have changed
@raginginferno868 • 1 year ago
You can't even get a GT 1030 for $40 nowadays, and that card sucks ass
@ALmaN11223344 • 1 year ago
@@raginginferno868 Right? It's crazy.
@prateekpanwar646 • 1 year ago
@@raginginferno868 There's no sensible GPU a kid could throw in his school gaming PC to play some of the latest games at low settings. The market has a huge gap; companies could just sell older models like the GTX 1060, which is made on older process nodes and thus cheaper.
@raginginferno868 • 1 year ago
@@prateekpanwar646 What? Your comment is completely irrelevant to what I said. I just made a comparison that GPU prices are going up every new generation. And, there are a few GPUs that are not too expensive, can fit a tight budget, and still offer 60 FPS on medium settings.
@BoleDaPole • 1 year ago
Nvidia found out what other gaming companies learned long ago: Gamers are easy to fleece and they're willing to pay whatever it takes.
@scurbdubdub2555 • 1 year ago
I've owned a 690 before; it wasn't that bad of a card for what I played. I eventually put it in another system, though, and sold it. I liked the card, it was interesting, but it was aging. I'm now on an even more interesting card, the Radeon VII. I really enjoy using it. It has aged surprisingly well considering how it was hammered at the time. At 1440p high-to-max settings it can play just about everything.
@EbonySaints • 1 year ago
Hold on to that Radeon VII for dear life. They and the Vega series supposedly got massacred in the mining boom (according to Actually Hardcore Overclocking) thanks to the HBM being incredibly valuable for mining. You might be one of the few lucky holders of a working model. Should have got it before the price boom when they were at $500. And about interesting GPUs. I can understand, though Arc seems way more frustrating than interesting to me right now. 😅
@Saved_Sinner0085 • 1 year ago
Radeon VII is still a good 1440p card, with the bonus of lower heating bills in the winter. Lol, it's not nearly as bad as the Vega 64 due to the die shrink.
@theboostedbubba6432 • 1 year ago
Still rocking a 1080 Ti. Both great cards that still shred 1080p and are great for 1440p. Enjoy it.
@Saved_Sinner0085 • 1 year ago
@@theboostedbubba6432 yup, I got a 1070. My display is 1080p 144Hz and the 1070 still runs everything on high with 60+ fps.
@theboostedbubba6432 • 1 year ago
@@Saved_Sinner0085 GTX 1070 still provides a great 1080p experience and you can find them for $115-120 which is great performance for the price. The 10 series was fantastic.
@mrfahrenheit8138 • 1 year ago
Officially addicted to this channel
@mrmcguru163 • 1 year ago
Same it’s great!
@cerealeat • 1 year ago
Same
@wobb_ • 8 months ago
Meth is better
@nicknorthcutt7680 • 8 months ago
Same here
@watercannonscollaboration2281 • 1 year ago
I've always found this card to be really cool despite the practicality, or lack thereof. Dual GPU cards just seem novel from the perspective of someone who didn't understand computers back then.
@Djbz170 • 1 year ago
I used to have a Titan Z, watercooled and BIOS-hacked to overclock to 1370MHz. It did very well in 1440p games. I only recently upgraded to a 3090 when they launched in 2020. This video brought back a lot of memories of my Titan Z's constant issues with SLI. Most of the time I just ran one GPU to eliminate the crashing, stuttering and low frame times.
@ThinkAboutVic • 1 year ago
I love the sudden transition from "foreshadowing is a narrative device-" to the gunshot intro lmao
@RodimusPrime29 • 1 year ago
Seeing these older GPUs "kinda" running current games hopefully gives some hope to those who can't upgrade and are running budget builds now. Keeping up with the Joneses isn't viable for most people. Keep the great vids coming!
@markm0000 • 1 year ago
I am completely out of all of this and just play emulators and old games. If friends want to play a game together we just visit and play split screen.
@TheVanillatech • 1 year ago
The smarter move was buying a GTX670 immediately, a fairly great mid range card lasting 2 years at high settings, and adding a 2nd for pennies later on. My friend built a dual GTX670 machine just weeks after the cards came out, and he didn't have to upgrade for pretty close to 4 years. As he usually builds an entirely new machine from scratch every 2.5 - 3 years, he was very happy. His brother, who always inherited the old machine come upgrade time, was sad.
@gg2324 • 1 year ago
But you would need an SLI-compatible Z-series motherboard and a fairly powerful PSU too
@TheVanillatech • 1 year ago
@@gg2324 Sure you would need a fairly decent mid range motherboard, in the region of £80-£100, but for someone dropping £350 on two GTX670's or a single GTX690, I'd expect that they would be spending AT LEAST £100 on a motherboard, possibly even going for an overspecced £150 board. People who spend £350-£400 on a GPU don't spend £50 on a board, £50 on a GPU and £50 on RAM to complete the system, after all! (Unless they're crazy....).
@TheVanillatech • 1 year ago
@@randomguydoes2901 Mature GCN was a good buy. Most ATI cards are a good buy because of the longevity and prolonged driver support. The 7970, 7970 GHz and 280X were amazing buys. At the end of its shelf life, the 280X cost £160. At that time, Nvidia launched the GTX 960 at £220. Almost double the price. But the R9 280X beat the GTX 960 by 10-30%, depending on the game. Obviously Nvidia had to drop the price of the GTX 960 here in the UK in the first week. The 980 Ti and 1080 Ti were both great cards. Easily 2-3 year cards. Questionable with the 1080 Ti, given it almost doubled in price shortly after launch during the crypto rush. But if you got one early enough, despite the high price at launch, the card definitely had legs! The HD 4770, GTX 750 Ti and GeForce 6600 GT are the best cards ever released, however.
@TheVanillatech • 1 year ago
@Crenshaw Pete No shit Sherlock! Technology has IMPROVED over TIME! What a fucking bombshell! Any more pearls of wisdom to impart?
@oceanbytez847 • 1 year ago
You'd be surprised. I've saved at least one awful build like that this year. The client was trying to get next-gen performance on a budget and made cuts across the board except on the GPU. Fortunately, it still ran pretty well, but he cut costs so severely he screwed himself out of an easy upgrade and instead would have to do a total rebuild, when a slightly better board might have supported the following gen and allowed that upgrade. Foolish mistake, but it happens nowadays. I think part of it is the fact that computers have massively increased in price compared to 10 years ago.
@PyromancerRift • 1 year ago
In 2012, only competitive esports players had high refresh rate monitors. But they were a thing. I remember BenQ being in all the esports competitions.
@IcebergTech • 1 year ago
I've been toying with the idea of an "original 144Hz PC" video. For the time being it's on the back burner until I can work out a bigger budget, but the first consumer 144Hz display I could find referenced was the Asus VG278HE from 2012. Seems like that was a banner year for new display tech!
@Saved_Sinner0085 • 1 year ago
Oh wow, only a 300 watt TDP? That's on the low end of the high end now. There are also CPUs pulling close to that now.
@kinkykane0607 • 1 year ago
300 watts was considered ludicrous back in the day, as was the fact that it cost £1000. I remember it being quite controversial at the time to spend that much on a graphics card, and how many watts it consumed.
@Saved_Sinner0085 • 1 year ago
@@kinkykane0607 I remember full well, I've been a PC gamer since before GPUs needed a 4-pin Molex connector for extra power, way before PCIe was even a thing. I remember people getting worried about the extra power consumption on the GeForce 6800 Ultra back in the day.
@FatheredPuma81 • 1 year ago
Tbh most of the people that bought one of these new fall into 2 camps: people that upgraded at either the 980 Ti or 1080 Ti, and people whose gaming interest and standards fell significantly and will keep using it until the one game they really want to play doesn't launch.
@candle86 • 1 year ago
As someone who bought Kepler back in 2012, I bought 4x GTX 670 cards and did triple SLI + PhysX on them. At the time 2GB seemed fine, and it was. They were also faster than the AMD competitor. The AMD FineWine thing wasn't a known thing in 2012; the first generation it really applies to was the 7000 series, but in 2012 the 680 beat the 7970 and the 670 beat the 7950. You've also got to remember that during this time period the 7970 and 7950 were plagued by a red screen bug that would randomly occur. No one made a mistake buying Kepler in 2012.
@drift-wn2dj • 1 year ago
good things sometimes end
@trulaila8560 • 1 year ago
More like "always end"
@v5k456jh3 • 1 year ago
So deep, so insightful
@maizomeno • 1 year ago
people are mad in this comment section
@Prod.Sweezy • 1 year ago
all good things must come to an end
@jeremyocampo1529 • 1 year ago
I'm surprised how well put together this video is for such a small tech YouTube channel. Well done!
@valentinardemon • 1 year ago
I had an SLI configuration of GTX 980s; it was really great with my 1440p monitor. In some games like The Witcher 3 and Battlefield (until 5) I had a bit more performance than a GTX 1080. V-sync is mandatory with SLI, without it frame times are just bad, and you had to love tweaking games and drivers with Nvidia Profile Inspector. BTW I still use my iPad Air 1 and it works great for what I do with it, like YouTube, Twitch, some Google research, etc. I use it like a second screen, no need to go back to Windows or alt-tab.
@VarietyGamerChannel • 1 year ago
Considering V-sync usually kills 15-20 FPS, SLI was a bad bet. I used to use 2x Radeon 280s in Crossfire. Same shit. In 90% of games I would have frametime stuttering and issues.
@valentinardemon • 1 year ago
@@VarietyGamerChannel And with our cards we did not have FreeSync; I think it could have solved some multi-GPU problems. The only thing I regret now is that I don't have a heater beside me this winter
@gamewizard1760 • 1 year ago
I remember 3 years back (before the pandemic drove GPU prices through the roof) a liquidator had a bunch of these for something like $75 each, and it was a long while before I finally decided against it and ended up buying a GTX 970 for $100 for one of my older machines. SLI was already dead, and the fact that it was only DX11 meant that it would be limited to older games. The power consumption is also too high for the performance you get. If you're buying one for testing purposes, or to put on a shelf as part of a collection, then it's fine, but not as a card that you will use every day. 600 and 700 series cards are also out of driver support now, so there won't be any more optimizations coming. You're better off getting a Maxwell or Pascal based card if you're upgrading from something even older than a 690.
@zephyr4494 • 9 months ago
"Foreshadowing is a narrative device-" had me dying
@TastyGuava • 1 year ago
That building that fell still has a lane closed off, 1-2 years later. Still crazy that happened in my area. It's just an empty pit now and no one knows whether they're gonna give a new building permit or turn it into a memorial.
@Iwetbeds • 1 year ago
Was just thinking that was an odd choice of B-roll footage.
@chincemagnet • 1 year ago
Back in the day the reason many of us ran multiple high end cards was because we were running 3D or Eyefinity/Nvidia Surround, and when DSR came out that was an easy way to upscale to 4K. Early on for me it was an obsession with Crysis.
@daLiraX • 1 year ago
With SLI you will need heavy user-to-user support, trying custom SLI bits. There are still some places out there which are into that stuff, but well, there are fewer of them of course, and they often tend to get some results after some time. If none work, you can always try SFR mode as well, but it might introduce stuttering and/or heavy tearing.
@Obie327 • 1 year ago
Interesting review, Iceberg Tech. I actually still have my original day-one purchase of a Zotac GTX 680 and the PC I installed it in (i7 2600K). Everything still works great for my older games. Thanks for the video!
@TheVanillatech • 1 year ago
We know that Nvidia started ignoring their previous generations in terms of driver updates and performance during Kepler and forever after. My friend bought a GTX 980 Ti for £580 after much deliberation. SIX MONTHS LATER, Nvidia released Pascal and announced that only "critical driver support" would continue for Maxwell. Nvidia abandoned customers who had just spent over half a grand on a top end GPU. Meanwhile, even the HD 7970 and its reincarnation the R9 280/X were getting huge performance increases and added features via drivers YEARS afterward.
@liquidhydration • 1 year ago
Wait, the 980 Ti gets less support? Ever since I got mine this year, all it has been is constant random Nvidia stable driver updates. Now that I think of it, some happened today while I was asleep.
@TheVanillatech • 1 year ago
@@liquidhydration Nvidia released updated statements on their website, on the actual driver pages, saying that all but critical support was dropped from Maxwell just 6 months after the release of the 980 Ti. Basically, the final card of Maxwell and the arrival of Pascal brought about the end of Maxwell. Dropping support for their PREVIOUS generation of GPU, which had sold tens of millions of units, just like that. Meanwhile AMD were still providing performance and feature updates for Barts and Cypress even in the Vega and RDNA 1 drivers. You might not have understood what I said. Nvidia simply make sure the cards still *work* as they should, and don't crash with newer engines / applications. But that's just a fraction of what drivers are supposed to do. They are also supposed to be refined to offer BETTER performance, on both older and newer applications. That's been the case for decades, and is still the case today. Nvidia said "Fuck that! We want our customers to buy Pascal, not cling on to Maxwell for 2-3 years!". So they officially dropped all but CRITICAL support, which is basically the bare minimum. A fact that you can clearly see by looking at graphs of performance on Maxwell cards from the drivers released after their announcement. It's like buying a car and, six months later, the manufacturer / garage says "We will still do your MOT, but we ONLY ensure that the car starts and gets you to your destination! We will no longer change the oil, check the brakes, repair the engine beyond starting and stopping, and we won't adjust the seats or replace the windows! THANKS FOR YOUR £40,000 THOUGH! Why not consider our LATEST car? Only another £40,000!".
@S41t4r4 • 1 year ago
@@TheVanillatech Before trying to make AMD look great, look at the R9 cards at the same age as the 900 generation: yeah, not supported anymore.
@ffwast • 1 year ago
It's a real bummer that multi-GPU doesn't really get supported anymore. It could have been amazing on modern cards.
@spankeyfish • 1 year ago
I had a dual 670 setup and it was incredibly inconsistent. It'd stutter at random moments.
@MDxGano • 1 year ago
Had 780s in SLI; it was the worst thing ever in terms of consistency and jitter vs a single card. There are many reasons SLI died. If we were talking about non-real-time viewing type workloads, it would be great. For gaming, however, it was simply poorly paced in all iterations.
@NeovanGoth • 1 year ago
I'm much happier with the manufacturers having switched to extremely large dies (Nvidia) or chiplet designs (AMD), delivering scaled-up single-GPU solutions for the high end instead of relying on multi-GPU. It's just as expensive and power-hungry, but works _much_ better in practice.
@simsdas4 • 1 year ago
I used a 980 for 6 years; good to know it's still capable of holding its own at the end there.
@TheTardis157 • 1 year ago
I went from a GTX 275 to a GTX 690 and it was amazing for the games I played in 2015 or so. It only cost me $160 at the time, so I was fine with it as a card; at that price it made sense. I moved on to an RX 580 when GPU RAM was becoming an issue and haven't looked back. I still have the card in a backup PC that I use mostly for home theater and streaming sites.
@JohnSmith-nj9qo • 1 year ago
I still remember just starting to get into PC building in 2012 and drooling over how ludicrously overpowered the 690 sounded. I eventually opted for a much more sensible 660 Ti, and I'm glad I did, because the 690 was very much a last gasp of the Crossfire/SLI trend. Now multi-GPU setups are just a weird, largely forgotten footnote in the annals of PC gaming history.
@AFnord • 1 year ago
Part of me is surprised that multi-GPU support has pretty much gone the way of the dodo, considering how long it was around in some way or another. Even in the late 90's, with the Voodoo cards you could do a multi-card setup, if you had more money than sense (technically speaking a card like the Voodoo 2 required a multi-card setup, but not really in the way people tend to mean it when they talk about multi-card setups).
@d0ubleg78 • 1 year ago
Is the Proton compatibility layer on Linux / Steam OS a viable workaround to play otherwise incompatible titles? If the Vulkan API on Kepler is recent enough, then translating DX12 into Vulkan and running that on the GTX 690 could become a very interesting video.
@actuallyn • 1 year ago
It would have a similar outcome to MoltenVK on Apple's side (but worse, due to outdated... everything). You need to sacrifice performance to a compatibility layer. Unless you use one GPU to do all the translation magic and the other one for rendering?
@nepnep6894 • 1 year ago
DXVK is no longer supported on Kepler and will not work, and it was pretty slow to begin with. VKD3D never worked on Kepler in the first place.
@blahblahblahbloohblah • 1 year ago
I'm drunk rn. But aware enough that your editing and voiceover work are amazing 6 seconds (literally stopped 6 seconds in to comment this) into the video and I can hear your excitement. You love doing this, and I just subscribed because of that. It's so immediately apparent that you do this out of passion.
@SaberSlayer88 • 1 year ago
God, what an excellent video, well done. I was shocked to see this was done by such a small channel.
@DFX4509B • 1 year ago
The latest dual-chip cards I'm aware of were custom Radeon Pro MPX cards made for Apple for the last-gen Mac Pro.
@NOM4D20 • 1 year ago
Very good video, really detailed, and enjoyable. Keep up the good work
@brentsnocomgaming7813 • 1 year ago
Honestly I'd've gotten a 980 Ti instead of 2 980s. My brother had one that ran 4K60+ High/Ultra like a champ, and it overclocked to 2.1 GHz on air; it was a beast. He had it until 2018, when it burnt out from the extreme OC.
@jean-micheldupont1150 • 1 year ago
The sad thing is that "flagship" GPUs, doomed to become paperweights after a few years, used to cost 4 times less than now so you would feel a little less robbed back in the day...
@raychat2816 • 1 year ago
I still have my own 690, however it's now a collector's item. I replaced it when I got 2 980 cards in SLI… but I'm keeping it.
@alastairpei • 1 year ago
I've still got a 690, it's effectively my backup GPU. Wish it still had good driver and SLI support.
@bryndal36 • 1 year ago
I had the GTX 680, the 4GB version from Leadtek. It was a pretty decent card when I bought it in 2013, but by 2016 it was starting to show its age. I replaced it with the GTX 1060 and wow, what a jump in performance that was.
@PokeBurnPlease • 1 year ago
I used a 680 for about a year, in like 2019. It could run anything I threw at it. One day it sadly broke and I had to buy another GPU. I went with the 770, also 4GB. Why was Nvidia thinking 2GB of VRAM was enough in 2012? They knew the PS4 and Xbox One were going to release soon, obviously increasing the demand for VRAM. I can't understand that to this day. AMD thinks ahead; even their mid range cards have like 16GB of VRAM. Currently rocking an RX 6600 because of the power draw. 130W is really low nowadays.
@randomguydoes2901 • 1 year ago
@@PokeBurnPlease Nvidia and gimped VRAM is a tale as old as time.
@TheKazragore • 1 year ago
@@randomguydoes2901 True as it can be.
@betaomega04 • 1 year ago
I had dual 690s. It was the first NVIDIA card that actually looked like a high-end card, and was the first card to have that metal aesthetic that seems to have carried over the years. For power, I used (and still use) a Corsair AX1200i. Practically speaking, it was one of the only cards that had three DVI ports (in a time when Display Port and HDMI were still new on cards), but did have mini-DP. These cards carried me through the beginning of the pandemic, and by that point I had replaced my 4-monitor setup with a single 55" 4K panel, so I needed an upgrade. Eventually settled on dual Strix GTX 1080 Ti's.
@gazzamildog6732 • 1 year ago
I remember having a GTX 770 as my first ever GPU (the 770 is just a rebranded 680). I remember wanting to get a second one at the time because the 770 couldn't quite get 60 FPS ultra on BF4 at 1080p. The thing that put me off at the time was the constant complaints of "micro-stuttering" when using SLI; apparently the frame pacing would just be terrible in many of the games people tried. Seems like a gimmick looking back, kind of like rollerblades or something, like a dated PC gaming concept. Cool video :) brought me back
@darthwiizius • 1 year ago
I used a 770 from 2013-2017 and my gawd was the deterioration painful. From about 2015 I used it as a 720p card to maintain a balance of performance and quality, but its lack of modern features just started making new games look old gen, and when I say old gen I mean older than the card. The lack of VRAM combined with the dated feature set made that card not so much fall off the cliff as jump off it clinging to a boulder. Replaced it with a 1060 6GB, and even though I replaced that after 3 years, it was because it was starting to fall over the cliff edge; the 1060 fell off more like it was wearing a parachute, because they're still OK 2.5 years after I replaced it. The 1060 even pulled only half the power of the 770, so much electricity for so little benefit.
@gazzamildog6732 • 1 year ago
@@darthwiizius yeah it wasn’t a great card looking back, I think they just didn’t realise how much VRAM would matter in the future, these days even low end cards are rocking like 8GB minimum. I remember playing the Witcher 3 with the 770, it forced me to upgrade to a 1070. 20 fps no matter the settings literally.
@darthwiizius • 1 year ago
@@gazzamildog6732 I never even bothered trying to run The Witcher 3 on it, I knew me place. You did well getting a 1070 though, they were cracking cards for the time, the 8 chip memory set-up means they're still totally viable now, the 1060 6 chip config was why it tailed off before the RX570s & 580s it was competing with.
@gazzamildog6732 • 1 year ago
@@darthwiizius haha yeah it’s still a solid card today. My friend still has it, swears by it.
@shinvelcro • 1 year ago
I grabbed a 690 on release for my 2700K; I spent a while waiting for them to announce it or I was going to pull the trigger on two 680s. It was a lovely card for running at 1440p at the time. Its downfall, though, was the 4GB of VRAM. It only took about a year for that to become the real issue, as the stutter from swapping out assets started to get pretty bad. Had it been 4GB for each core it would have been able to pull its weight for a good while longer for me. Still, it always looked great, even if it was a bit more hefty than I would have liked.
@TheKazragore • 1 year ago
Like how the Titan Z had 12GB total.
1 year ago
TBH, dropping SLI support sort of makes sense, especially from an economy POV, but it also limits the usability of older cards. If you look at it this way, cards like the RTX 4090 are sort of a replacement for things like 4080 SLI, which would have been a thing in the past. People often fail to notice that just a few years ago the top of the line was not a single top tier card but having 2-3 of those. That was the biggest beast you could get. Now? You can get a 4090, slam water on it and push it as far as it goes, but that's the limit.

Anyway, the catch here is that older cards could still be usable in SLI. Even if a single card did not perform that well anymore, using two of those could often be cheaper than the new card, make availability a little higher and, most of all, it would actually be the best reusability scenario, better than recycling. Basic 1070 SLI is on par with a 3070, and a Strix 1070 OC in SLI is around RTX 3080 performance in most applications. Well, unless you play something that does not support a multi-GPU setup. Just imagine that: if we were still doing SLI with good support, in the GPU shortage after 2020 we would still have had the option to run 10 and 20 series cards in SLI even though the 30 series was hardly available. Getting dual 3080s would be a good alternative to a 4080/4090, especially considering the prices and availability. This would let us use older generations at least a generation or two more before they become scrap metal, and when they do, they are a real paperweight. If games can't even be played on 2/3-way SLI, then the GPU is worth scrap.

So why does it make sense? First and foremost, competition. If SLI was a viable option, then the new generation would have to compete with the used market of the old generation. Getting two slower, used cards might be cheaper and much more available than getting a new one. Secondly, optimizations, more corner cases, the overall complexity of development for SLI. I think that if anything like that should be done, it should be handled by Nvidia/AMD/Intel only, with no special code on the developer's side. Multi-GPU should really work as a single GPU from the game's perspective, and everything else should be handled by the drivers and firmware. Otherwise it does not make sense if one game can get an almost 100% boost and another will get nothing.
@kyronne1 • 1 year ago
My thoughts tbh, sucks that corporations only care about the bottom line
@laurencem2327 • 1 year ago
12k views in 11hrs, congrats on getting some traction with YT IcebergTech! Looking forward to more videos as always, keep up the effort!
@alextremayne4362 • 1 year ago
I love finding small channels the second before they explode, awesome video!
@genethebean7597 • 1 year ago
Probably the most overlooked downside of this GPU was the triple DVI output with only a single mini DisplayPort. Heck, one of the three DVI ports was DVI-D so you were even more down bad.
@DrBreezeAir • 1 year ago
Beautifully done, I can only imagine how much work this took to accomplish. I've stuck with a 660Ti until 2015 and bought a 980. I was able to cover a third of the price by selling the trusty 660Ti. Then I moved to a 1080 during the mining craze for a ridiculous price and now I'm on a 3070 to see how the prices will behave in the future. 4080 seems like a total rip-off and a 4090 is an unjustifiable purchase for something to play games on. Plus my 3070 is a Noctua edition, I'd love my next card to be of the same type. Maybe I'll give AMD a shot, I'm still a little sour about the HD2900XT.
@rck-lp7389 • 1 year ago
Nice man, you're madly underrated and I'm looking forward to seeing you grow like I saw randomgamingHD expanding! Keep up the nice work
@SterkeYerke5555 • 1 year ago
Will you be using a 780 Ti or a Titan for the final Kepler review? I'd assume they've got more of a fighting chance considering they've got more VRAM and a faster GPU than the 680 and 690 have.
@IcebergTech • 1 year ago
I had planned on going the other direction and using something cheaper, but we'll see how the testing goes.
@TheVanillatech • 1 year ago
But also totally abandoned by Nvidia in driver support and performance updates. All Nvidia cards die a death once their new generation is released.
@bismarck6 • 7 months ago
I thought this was a 1M/500K subscriber YouTube channel until I read the comments, only to realize it's not, but I'm pretty sure it will quickly get to a big number if you keep up the good work
@m8x425 • 1 year ago
I had a couple GTX 680's in SLI, but I returned them for a GTX 670. Then I added a second 670 when I found one for $100. After all the upgrades I made to my PC in 2012, I barely paid any attention to computer hardware until Pascal and Polaris came out. I bought a couple 8GB RX 480's on launch day thinking Crossfire was still a thing, but I was wrong. At least I bought a 1080ti not long after it was released and that card has treated me well. A year before that I treated myself to a GTX 590 because I always wanted a dual-GPU card, but after seeing the limitations of a dual GPU cards in some games I wish I had bought a GTX 580 instead. At least EVGA replaced the GTX 590 with a GTX 970 after it stopped working.
@kintustis • 1 year ago
Having owned a 690, what killed it for me was the pitiful 2GB of VRAM combined with so much raw GPU power. So many games were stuck at wonderful FPS on the absolute lowest clay-looking textures, and complete unplayability on medium as the VRAM cap was reached. I was desperately closing other programs to free up precious extra MB of VRAM. I remember Siege wouldn't run playably if I didn't close my web browser and Discord first. This was circa 2017-2018 during the mining boom, and anything over 3GB of VRAM was used for mining Ethereum, as the mining apparently took up 3 or 4 GB. It had the horsepower for VR, but not the VRAM. Or rather it would have, if Oculus and Valve didn't enforce having wasteful 500+MB "home" worlds that must be loaded in when pressing the corresponding controller button, and those were just sitting in memory, never being used. I got great FPS when staring ahead, but several seconds of stutter when turning as the assets had to be paged in from system memory or the hard drive. They later added stripped-down home areas, but that was years later, and it still didn't solve the core of the problem. The card also ran extremely hot compared to what else I had been using. Temps in the 80s and 90s were common, even with repasting and a more aggressive fan curve.
@fiece4767 • 1 year ago
So you can't use the 4GB of VRAM that the 690 has, only 2GB? What is the other 2GB doing on the PCB?
@S41t4r4 • 1 year ago
@@fiece4767 Holding a copy of the data for the second chip.
@SilentdragonDe • 1 year ago
Unfortunately this isn't the only time this happened with GPUs. Does anybody remember AMD's TeraScale architecture? Because I had (still have, technically) not one, but two of those cards. Not only was support for TeraScale dropped way too early, the drivers were literally left in a broken state where for some features (like CrossFire) to work properly you basically *had to* rely on community fixes. Even years after driver support had ended, you were still able to get decent performance from these cards, provided that the various hacked-together community drivers worked for the particular game you were trying to play, but with stock drivers you were just completely out of luck.
@camofelix • 1 year ago
Correction: 6-way SLI in a single game was never a thing, but what you could do was 4-way SLI + a PhysX accelerator. The PhysX accelerator could also be any NV card; it didn't have to be the same as the other 4 GPUs. An example would be 4x 980 Ti + a Titan Black. I ran triple GTX 670s + a single 560 Ti as a PhysX accelerator back in the day to allow for 4K60 in Borderlands 2 maxed out; it was pretty much the only way to do it. (I'd acquired the 670s piecemeal over time)
@StormsparkPegasus • 1 year ago
The first big game I remember not supporting SLI (at all) is Arkham Knight from 2015. That was the beginning of the end.
@telavus920 • 1 year ago
I had one of these. I remember being so excited when I got it. It was SUCH a beast. This was in 2013ish. It was noisy, but beasty! Then when Dark Souls 3 came out in March 2016, there were a looooot of problems, mostly due to launch bugs. But as the patches came, I still had a very bad experience playing the game, with constant stuttering and crashes. I considered myself a huge fan of the series, but all these problems made me ragequit after dying 80+ times against the Abyss Watchers, as a result of the long walk to the boss, as well as a 40% crash rate, combined with critical stuttering all the time. Then during Black Friday in 2016, I had the opportunity to buy a GTX 1070 for 450 dollars. This is hands down the best graphics card I've ever had the joy of owning. It was the most quiet, overclockable and amazing graphics card I've ever had (I bought a 3090 last year and I'm not half as excited nor proud to own it). I put the GTX 690 up for sale shortly after. I got a bunch of lowball bids, ranging between 40 dollars and 80 dollars; I was so shocked and disappointed by it (my asking price was 250 dollars). After two months, someone contacted me and we finally managed to agree on 175 dollars, which all things considered I think was a good deal that made both of us happy. This card had SO much potential, if only the memory were shared across both GPUs... It was an early teenager mistake (and ironically enough, my father pushing me to purchase "the best of the best"). Regardless of the sad ending I had with the card, it will still hold a special place in my heart, as it was the first graphics card in the first non-laptop computer I ever purchased. :)
@XaviarCraig • 1 year ago
Slight correction for @6:35: Asus had a 120Hz 1080p 23 inch monitor called the VG236H available as early as 2010, and by 2011 I believe there were multiple options available. Obviously future-proofing is always a bit hand-wavy, but generally you can scrape a couple of extra years of good performance by looking at what works well in the current year and going one step beyond it. Like right now the consensus is that 6C/12T CPUs are generally enough for gaming, along with 32GB of RAM and a card like the 3070. However, if you step the aforementioned setup up to an 8C/16T CPU with 64GB of RAM and a 3080, you will be able to run games on it for another year or two before future games become unplayable.
@Linda- • 1 year ago
5:20 Yup, those are the physics bugs you get when trying to run a Bethesda game over 60 FPS. The flickering is actually just your character going into swimming mode for 1 or 2 frames and then coming back out of it right after, back and forth.
@imjust_a • 1 year ago
I had purchased a second 660 TI for SLI purposes back in the day. It never really worked the way I had hoped it would, and in fact I felt like my performance suffered in more games than it improved. That being said, I was very hopeful for the future of SLI... a bit too hopeful. To this day I still think that SLI could have been extremely useful for lowering the barrier to entry for VR (being able to use two older GPUs for cheap instead of buying one high-end GPU.) It seems NVidia was doing work to support per-eye rendering with SLI, but it happened at the tail end of SLI's lifespan and was thus canned before it really had a chance.
@gandyhehe • 1 year ago
I wonder how the 690 aged compared to the HD 7990. I had a friend who had one of those; it has long since been consigned to the bin, which is a shame, as I'd have loved to benchmark it.
@JohnDoe-ip3oq • 1 year ago
AMD aged better, just not with Crossfire. There is no DX12/Vulkan support. It doesn't matter what the API can do; it moved multi-GPU from driver support to game-dev-specific support, and none of them supported it.
@Vfl666 • 1 year ago
Strange Brigade has really good SLI and Crossfire support.
@Waldherz • 1 year ago
Does anyone actively play that game, or are all of the "players" just PC YT channels benchmarking it?
@jimcachero • 1 year ago
Want to let you know, when I open the app (YT) to get info on... well, anything... I'll see a new video from you and stop everything till I've watched it. Good stuff man, really good stuff.
@KapiteinKrentebol • 1 year ago
From what I understood of the SLI and Crossfire tech, it was a gimped technology anyway. It would chop the screen in half and divide the parts between both GPUs, unlike 3dfx SLI, which would interlace the resolution so you got a much better workload split across both GPUs. But apparently that isn't possible anymore.
@herrsalz278 • 1 year ago
My best friend had a PC with 2 or maybe even 3 of these cards. He bought them in a used PC with triple monitors. The monitors were some Acer 3D monitors with horrible backlight bleeding, if I remember correctly. Still... this setup was stunning to say the least. Because the cards got pretty toasty stacked like that, the previous owner had already put water blocks on everything, and a separate radiator was hanging on the side of this huge tower. It looked more like a car radiator than a PC component. The power supply was an absolute monster and the lights in his room flickered for a moment when he turned on the PC. He had to keep his windows open all the time and stopped heating the room apart from turning on the PC. Absolute madness, but these were great and crazy times. He sold them one by one quite fast though. I don't remember what the next setup was, but he certainly used Titan Blacks at one point, and he wasn't done with SLI yet. But nothing since has been engraved in my mind like the first triple monitor setup and the absurd GTX 690 SLI config from back then.
@matthewhanson498 • 10 months ago
Two 670s in SLI kept me gaming for a long time; this video is a trip down memory lane for me. Played ultra quality on Crysis 2 and The Witcher 2 and loved it, but by Crysis 3 and The Witcher 3 they were showing their age, and Hellblade was basically unplayable on high settings. Still, they kept me gaming for a long time, and I got a 1060ti after.
@urygen • 1 year ago
I miss my old GTX 650, the first card I ever had, back in 2013; I remember getting it for free from my neighbour. Paired with a whopping 2GB of RAM and an AMD Athlon II that came in a prebuilt I got from a Circuit City in 2007. How time flies
@urygen • 1 year ago
Not to forget Windows Vista as well
@hafentoffen2 • 1 year ago
I used to run a Sapphire 7970 GHz edition vapor-x crossfire rig. Was a beautiful rig and it will be missed ❤️
@iangoldense4730 • 1 year ago
The worst casualty of the death of SLI is the lack of a solid upgrade path. With my old 8800 GTS 512 I was able to run that for a few years and then pick up a used one for a fair price and get a fantastic performance boost. I did the same with my 980, to much less effect. GPU manufacturers realized this was bad for GPU sales and slowly killed off the feature, especially since the benefit was only seen as an ultra-high-end feature for the latest gen GPUs.
@hrod9393 • 1 year ago
I had SLI 2x 670 EVGA FTW cards back in the day (2012). Those cards had extra VRAM; it sucked that you couldn't really use the extra VRAM on the 2nd card. My Mitsubishi 2070SB CRT could do 160-127Hz and I still have it for nostalgia. Most LCDs couldn't match its clarity, response time, colors and refresh for a very long time. I only just got a 240Hz LCD in 2020, and now 4K 240Hz in 2022.
@vmystikilv • 1 year ago
My 690 is still sitting here on my desk. I sold its mate when I retired my quad setup. Man, I thought I was a god of computers at the time. Even buying dual Titan Xs and owning a 6950 XT today from release date, I have never felt as powerful as I did then.
@Splomf • 1 year ago
Got myself a Gigabyte 3080 10GB about 6 months ago to replace my RX 580, since it was starting to struggle in newer games. Here's hoping it lasts 4 years like my 580 did.
@DevouringKing • 1 year ago
I got my Asus HD 4770 (the first 40nm card) in summer 2009 for 120€. It was so silent and cool and fast back in the day. It lasted until Polaris, so 7 years. Then I got a Polaris card (14nm) in summer 2016 for 199€ and it lasted until last month, close to 7 years again. So I played for over 13 years for 320€. And both cards had very low power consumption. In comparison, your 3080 needs tons more energy. It should last more than 4 years.
@jarekstorm6331 • 1 year ago
I ran 2 GTX 285s in SLI with an i7 965 and had high frame rates, but always experienced micro-stuttering. When I upgraded to a single GTX 780 years later the micro-stuttering disappeared.
@glown2533 • 1 year ago
Bought a GTX Titan X Maxwell back in 2015 and I'm still running it today, even though my entire PC has changed since. I kept it and it's still a great GPU. I've even run it overclocked its entire life, and in Apex Legends at 1440p max everything it still hits around 100; with a lot of fighting it can go down to around 80.
@soylentgreenb • 1 year ago
Skyrim had problems (and probably still does) around the 100 FPS mark or above. The cart may up-end and prevent you from getting through the intro and tutorial. Stuff on shelves in stores may randomly fly at high enough speed to damage you, typically when you first enter the room. The day and night cycle may become out of sync with the ingame clock; which is just bizarre since you'd think those were the same thing; by that I mean the sun may set at 9 AM ingame time. Slightly higher than 60 isn't obviously broken. Also you may have random water-noises whenever you are close to the water elevation and standing on solid ground.
@frosthoe • 1 year ago
I used to run 2 GTX 295s for quad SLI (they were dual GPU), then added a GTX 275 for PhysX. 5 GPUs. That was 2009-ish. Then in 2010 I got an EVGA quad-SLI motherboard with ten PCIe slots and ran 4 GTX 480s, and a Xeon server CPU that was 70% overclocked, all chilled liquid cooled. I think the PC drew like 1400 watts when loaded, and the chiller drew about 760 watts (1 HP chiller). Nowadays, one midrange video card, a midrange CPU, a low budget Foxconn motherboard, some RAM and an M.2 SSD. Boom, solid game rig.
@stefensmith9522 • 1 year ago
SLI was never about doubling FPS back in the day; you always expected around a 30% boost, and it was a way to get top-of-the-line card performance for less than top-of-the-line price. Picture getting a 4060 Ti, and then a year and a half from now you could get another one for half price, run it with the one you have now, and get better FPS than a 4090 that still costs more than both cards. That's what it was about back then. Then developers slowly stopped optimizing for SLI, to the point that having two or more cards would hurt performance compared to a single card. SLI was awesome; it was the developers saving a penny in development time that ultimately killed it.
@qasderfful • 1 year ago
I remember beating all those pre-2016 games on 650 Ti. Those were good times, man.
@terrabyteonetb1628 • 1 year ago
Games have to be written to use SLI, and they started dropping that in the 700 series. By my 980 Ti: my old 670 x2 SLI got 120 FPS in BF3, and the 980 Ti got the same (at 1080p). That's why I did not buy a 2nd one for my X99 SLI mATX board... (seeing dying support in games).
@Smasher-Devourer • 1 year ago
The people who bought the GTX 1080/1080 Ti are the smart ones. Those cards are still going strong to this day.
@ianlandry72 • 1 year ago
I was fortunate enough to buy a 1080 Ti when they were first released. It's coming up on 6 years of use at this point and still going strong at 1440p high settings. I don't think the performance per dollar will ever be that good again!
@RazielXSR • 1 year ago
I know you said you would have affiliate links to all products used in the video, I seem to be missing the one you put up for the 690. I was hoping to triple SLI them.
@bananabro980 • 1 year ago
Blew my mind that there were no HFR displays back in 2012, especially LCDs; I remember only VGA or DVI LCDs and CRTs could do refresh rates that high. Times change. Now a budget 3440x1440 140Hz screen is barely 300 USD.
@bradley163 • 1 year ago
I fondly remember using a pair of 6950 GTX GPUs in SLI back in the day. Being the only person in my high school friend group who could run Oblivion at ultra settings was such a good feeling. Oh, those were the days. And Crysis 2 inherently runs poorly on any platform. They added that strange cinematic feel to the game, which made the entire experience frustrating.
@imskyrcheez • 1 year ago
I built my first computer the year the 690 came out and paired it with a 4770K, then proceeded to mod Skyrim with over 250 mods that would only crash every 6 to 12 hours. Elder Scrolls Online was my most played, at 4 or so years of game time. Not knowing that SLI support was dwindling, when I felt I needed to upgrade after playing Borderlands 3 I purchased a second 690 on eBay for $`00 and was so disappointed. I then upgraded to an EVGA 2070 Super Black Gaming with a free Modern Warfare 2019 bundle.
@lasarousi • 1 year ago
This reminds me of the 660M laptop I got for university back in 2013. It was already aging and didn't run most things okay. It ended up being the best university laptop for graphic design; it ran Adobe but couldn't run Path of Exile at 60 FPS lol
@aleksandarudovicic4232 • 1 year ago
Still using this card and it's almost 2023 now. I am mostly playing older games though.
@micahottaway8455 • 1 year ago
It was software that killed SLI. I had two Nvidia GTX 670s which would have on paper been close to what a GTX 690 was. Games really did just stop supporting multi GPUs around the 2014-2016 time period. The GTX 690 was an old-school ham-fisted approach that we used to love. The ATI Radeon 5970 is another example...
@MDxGano • 1 year ago
"ham fisted approach we used to love" This is definitely an unpopular opinion with all the snowflakes complaining about 350w gpu's.
@grantmackinnon1307 • 1 year ago
I remember getting decent performance in a lot of games when I had a 780 Ti, which was still Kepler. I got a fairly solid 60-75 in The Witcher 3, 45ish in Hellblade. If I still had that card it would be interesting to see if the performance is still the same after 3 years of software updates. I did have an Asus ROG card with a modded BIOS that probably added 15% over stock.
@АрсенийСтучинский-в1ъ • 1 year ago
Same, the 780 Ti was really good
@grantmackinnon1307 • 1 year ago
@@АрсенийСтучинский-в1ъ Really power hungry, and it ran extremely hot. Back then everybody said the R9 290s ran hot, but to get 1200MHz from a 780 Ti, expect it to get to the boiling point of water even with the Asus DirectCU cooler.
@АрсенийСтучинский-в1ъ • 1 year ago
@@grantmackinnon1307 That's true as well; mine died due to overheating back in 2020. But until then it ran all the games well for, what, 6 years? Although I may be misremembering since I never cared for ultra settings
@grantmackinnon1307 • 1 year ago
@@АрсенийСтучинский-в1ъ I gave mine away about the same time. As far as I know it still works fine.
@cinerir8203 • 1 year ago
Did you use special SLI bits in Nvidia Inspector for Skyrim, or did it just work out of the box? I had a hard time getting both GPUs to work without setting other SLI bits in the driver. Apparently "custom" SLI bits deliver better performance with less stuttering or so... most people advise using the Fallout 4 SLI bits, since it is based on the Skyrim engine. It's really a shame that multi-GPU had to die; even aside from the high end area it was an interesting way of boosting your performance in games, even with lower-end graphics cards. Just SLI them and get better performance while paying less than buying a better GPU. They could do so much with mGPU in DX12 and Vulkan, but no developer wants to go the extra mile, since so few people have mGPU setups nowadays.
@thumbwarriordx • 1 year ago
Nah, there were always a few high refresh monitors kicking around since 2008, but they were all for Nvidia 3D Vision. People hadn't quite realized "oh, these things can just push more frames and that's good".
@jensgerntholtz4041 • 1 year ago
I never knew that the chips on the 690 were interconnected via SLI. I am wondering if the AMD Radeon R9 295X2 followed a similar fate as a card with two graphics processors. I didn't see any mention of Crossfire, so I wonder what the interconnect is, how it held up compared to the SLI on the 690, and if there were also driver consequences along the line. It would be interesting to see whether the multi-GPU cards of that time were a cautionary tale and if manufacturers will have a better go at it in modern times, as we're approaching something similar with AMD's chiplet design on GPUs, which has learnt from its CPU counterparts making use of the "Infinity Fabric". Exciting times!
@conman3609theoriginal • 1 year ago
One thing I've noticed with Spider-Man is that if you don't have it on an SSD or something faster than a spinning mechanical drive, even good cards can be handicapped by loading into areas. I swapped it from my hard drive to my SSD and the issues went away.
@T.K.Wellington1996 • 1 year ago
9:04 In March 2015 I had two GTX 980 Tis in SLI to run The Witcher 3 in 4K Ultra, Hairworks Low and anti-aliasing off. I got over 60 FPS. This was so insane at the time. Now I have an RTX 4090, and in the next-gen RTX version I get 100+ FPS in 4K with absolutely everything at Ultra, but with DLSS Quality. And when the RTX 5090 is available in 2024, I will sell my 4090 for half and get the new one.
@micb3rd • 1 year ago
It is interesting that you saw 100% scaling in performance in The Witcher 3 when using SLI. Did you have TAA turned off? The TAA option in this game stopped SLI scaling properly for me on my Nvidia Titan X Pascal cards, and so neither card would do more than 80% GPU load. This got me to 3520 x 2036 (15% below 4K) at around 85-90 FPS. When I swapped for a single Nvidia 2080 Ti Kingpin, performance was noticeably faster than the SLI set-up. The issue with SLI at the end is all those micro-stutters and poor frame pacing. A single card was the way to go.
@aChairLeg • 1 year ago
Such a shame old dual GPUs like this don't have much use left in modern games. I've always thought they were some of the coolest pieces of tech, but could never justify the price.
@RuruFIN • 1 year ago
The last time I had a multi-GPU setup was an R9 290 Crossfire in 2019, and already back then the multi-GPU support for newer titles was pretty meh.
@hypogogix9125 • 11 months ago
I ran a 690 in my first ever gaming computer back in 2012. I wouldn't have bought it if I had properly researched what I was doing. I just got excited and had some money on hand so just bought the best available. Like an idiot. Either way I replaced it for a 1070 in 2016 and I still run this computer today. lol DDR3 and a 3930k. It's time for a new build!
@natetete1379 • 1 year ago
Quad SLI was so OP for microATX cases. Dual 590s was epic
@RonGrethel • 1 year ago
I've never heard of these "built-in SLI" style cards. The scaling is crazy when it's supported. I played around with Crossfiring used R7 260Xs, R9 290s, and Vega 64s. It never really was worth it.
@hartsickdisciple • 1 year ago
I had 4 different SLI configs, and can say that it was rarely worth it. I had 2x 6600GT, 2x 7800GT, 2X 7900GT, and 2X 8800GT. There were a few situations where the SLI 6600GT could match a 6800GT, but it still had less VRAM. The 2X 8800GT was the best, because it delivered more performance than 8800 Ultra, while costing less.
@relaxingtopology256 • 1 year ago
I still have this card sitting in a display box on my desk. I still love the style of it.