Nvidia RTX 3080 Ti + A6000 Review: Did GDDR6X ruin GA102? Who should buy Big Ampere?

40,419 views

Moore's Law Is Dead

A day ago

Comments: 860
@e2rqey 3 years ago
My 3080 paid for my 3080Ti lol. Also, I can confirm that the 3080Ti idles at ~110 watts, although I do have mine OC'd.
@theigpugamer 3 years ago
You basically got a 3080, sold it on eBay, and got a Tie, right?
@Ascendor81 3 years ago
This is the way. Used my 3080 for 7 months, sold it, got a 3080Ti + $200 cash :)
@Miskatonik 3 years ago
I personally don't think it's worth it. I recently bought the 3090 FE (idles at 20W, lol) and gladly paid the $300 premium for an unlocked hash rate and 24GB of VRAM. I think 12GB of VRAM is ridiculous for a $1,200 card with capped mining performance on top. Not to mention the much smaller cooler; they should have used the 3-slot one from the 3090. I really feel the 3080 Ti is an Nvidia money grab, like Gamers Nexus stated. If I wanted to save some money I'd rather buy the 3080, much better value overall. Also sold my 2080Ti and 3080 at a profit, btw; got lucky it was just before the Bitcoin crash. Prices are much lower now.
@LiveForDaKill 3 years ago
What energy management setting are you using in Nvidia Control Panel? I have read that when you switch it from "max performance" to "normal", it will drop to 10-20 watts.
@lupintheiii3055 3 years ago
@@LiveForDaKill Funny how people find it OK to use all kinds of workarounds with Ampere, while the response to any workaround with RX 5000 cards was: "I'm paying $400, it needs to just work, it's not my job to fix AMD's mess!!!1!!!1!"
@excitedbox5705 3 years ago
That is because Nvidia manipulated the launch reviews by sending people that stupid PCAT thing. It uses a painfully slow ADC, so its power measurements miss short spikes. Igor's Lab used a real oscilloscope and measured spikes of almost 500W on the 3080. But most reviewers won't spend the $400-600 on an entry-level scope, so by controlling the gear reviewers used to test the cards, Nvidia could claim lower power usage. Believe me, Nvidia would NEVER use a $40 meter to test their own GPUs when even a base-model multimeter costs double that. PCAT would be useless for any real measurements, and any maker should have known that immediately.
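For what it's worth, the sampling-rate argument here is easy to illustrate. A toy Python/numpy sketch with synthetic numbers (not measurements from any card): a slow, averaging meter will usually miss millisecond-scale transients that an oscilloscope catches.

```python
import numpy as np

# Synthetic 1-second power trace at 10 us resolution (the "oscilloscope" view).
rng = np.random.default_rng(0)
n_samples = 100_000
power = np.full(n_samples, 320.0)         # nominal ~320 W load
for i in rng.choice(n_samples, 50):       # sprinkle short ~200 us transients
    power[i:i + 20] = 490.0               # spikes near 500 W

slow_adc = power[::10_000]                # ~10 Hz sampling, like a cheap meter
print(f"scope peak : {power.max():.0f} W")     # ~490 W
print(f"slow 'peak': {slow_adc.max():.0f} W")  # usually never lands on a spike
print(f"average    : {power.mean():.0f} W")    # the mean barely moves at all
```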
@TheBackyardChemist 3 years ago
Yeah, but any properly designed circuit with a slow ADC should have a low-pass filter on its input, so unless they omitted that, the spikes shouldn't matter much for average power consumption.
@powerdude_dk 3 years ago
@@TheBackyardChemist But your PSU still has to handle power spikes like that, so it could potentially be a problem if your PSU's wattage is a bit too low.
@lupintheiii3055 3 years ago
@@TheBackyardChemist That doesn't solve the issue; the spike will cause heat one way or another somewhere inside your system, which will dissipate into your room (or somewhere else, if by chance you have some kind of external radiator :-)
@bgtubber 3 years ago
Yep. The RTX 30 series is a big failure in terms of power consumption. That's why I have my 3090s undervolted, and I suggest everybody else do that too if you know how (it's not that hard really if you are tech-savvy). Now they run at ~280-290W under full load with minimal to no performance loss compared to the stock 350-380W.
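The undervolt itself is set in the voltage/frequency curve editor of tools like MSI Afterburner; the closest scriptable knob is a board power cap via NVML. A minimal sketch, assuming the nvidia-ml-py package and administrator rights; note this caps total board power rather than touching the V/F curve, so it's a blunter lever than the undervolt described above.

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)              # first GPU
pynvml.nvmlDeviceSetPowerManagementLimit(handle, 290_000)  # NVML takes milliwatts
cap_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000
print(f"board power cap now {cap_w:.0f} W")
pynvml.nvmlShutdown()
```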
@lupintheiii3055 3 years ago
@@bgtubber The fact is, you can do that on RX 6000 cards too.
@joshuac837 3 years ago
Not sure if red eyes in the thumbnail were accidental or intentional, but I’m here for it 😂
@mtunayucer 3 years ago
The performance difference between GDDR6 and GDDR6X is smaller than I expected; the power usage of G6X compared to G6 is insane!
@VoldoronGaming 3 years ago
Marketing. G6X is better than G6 right...or it sounds like it should be...
@tetrasnak7556 3 years ago
Infinity Cache FTW!
@WayStedYou 3 years ago
@@VoldoronGaming I think people would've preferred 16GB of GDDR6 over the 3080's 10GB of GDDR6X.
@lupintheiii3055 3 years ago
@@WayStedYou Of course they would, but then how could Nvidia upsell you to a 3090?
@MrWarface1 3 years ago
Insane... till you compare hash rates and realize GDDR6X makes a huge difference.
@gpunk2338 3 years ago
LOOOLLL, I was just listening to your podcast with Colin Moriarty where you said "I'm not gonna give this a good review" loll...
@endlesslogins156 3 years ago
I definitely enjoy the qualitative analysis in these reviews. This is something more reviews need: was it actually noticeable in use, or is it just "better" based on number fetish?
@davidgunther8428 3 years ago
So the race between the RX 6800XT and the RTX 3080Ti is a TIE! 😆
@davidgunther8428 3 years ago
@Gareth Tucker It was a pun. But the performance was about equal between the two.
@Bang4BuckPCGamer 3 years ago
I laughed so hard when you pointed out the 3060 has the same amount of VRAM as a 3080 Ti.
@mtunayucer 3 years ago
Maybe the 3060 should have launched with just 6GB of VRAM, or the rest of the stack (3070-3080) should have had 16-20GB.
@bgtubber 3 years ago
@@mtunayucer No, the 3060 is fine with 12GB. Please don't give Nvidia any ideas.
@mtunayucer 3 years ago
@@bgtubber lmao
@gsrcrxsi 3 years ago
3060 = 12GB of slower RAM, with less bandwidth. They are not the same.
@bgtubber 3 years ago
@@gsrcrxsi GDDR6 is still plenty fast. It's not like it's 5-year-old technology or something. The first cards to come out with GDDR6 were the previous-gen RTX 20 series. Besides, you can always OC the VRAM if you think you need more raw bandwidth. I would personally take 12GB of GDDR6 over 6-8GB of GDDR6X if I had to choose.
@cottonginvideo 3 years ago
Went from a 400W 3080 to a 6900 XT. I much prefer the raw rasterization perf, lower power draw, reduced CPU driver overhead, and 6GB extra VRAM. RDNA2 definitely wins this generation. Nvidia mindshare is still insane though. I'd like Ampere cards more if they were priced competitively or at least had more VRAM for longevity.
@gamtax 3 years ago
When I saw the RTX 3080 only had 10GB, I laughed; it immediately reminded me of Kepler cards. Some games already consume almost 10GB when playing at 4K. I don't think this card will last 3 years playing at 4K.
@buckaroobonzai9847 3 years ago
Did you notice less heat being generated in the room with the 6900xt?
@TheWiiplay 3 years ago
I've been saying this since I picked up my 6900xt at launch and sold my 3080 a week later. Finally folks are starting to notice.
@cottonginvideo 3 years ago
@@buckaroobonzai9847 yes I have a small room that my pc is in
@theigpugamer 3 years ago
This proves AMD was actually right to skip G6X and go with Infinity Cache instead.
@MrWarface1 3 years ago
Tom's opinion proves that? How? I have 2 3090s running a 100% mining load 24/7 in my bedroom. Do u guys not have air conditioning? AMD peasants
@wile123456 3 years ago
GDDR6X is currently an exclusive deal with Nvidia, from... was it Micron?
@wile123456 3 years ago
@@MrWarface1 lmao, using an 800-watt PC for mining, then having 600 watts of air conditioning running inside the room to counterbalance it. You're part of the problem with the current climate crisis we are in. Disgusting.
@waleedkhan1122 3 years ago
@@wile123456 Well said.
@MrWarface1 3 years ago
@@wile123456 First off... I have central air. Second, my power cost went up a little more than a dollar a day for the $15 a day they are making mining. Imagine being such an idiot, you try to make a point about stuff you know nothing about cause you are such a fanboi. Lmao
@eddyr4984 3 years ago
Great work Tom! Looking forward to seeing more content.
@wile123456 3 years ago
I feel like the purpose of the 3080ti is so Nvidia can discontinue the normal 3080 and sell the same die for almost twice as much
@elheber 3 years ago
No doubt. Yields improved as manufacturing matured and they saw all the money they were leaving on the table.
@darcrequiem 3 years ago
I think you are dead on, Wei.
@tacticalcenter8658 3 years ago
Don't believe everything you hear. We're living in a world of propaganda of huge proportions. Nvidia is evil, sure, that's a fact; so is AMD. And the FBI and the Obamas and Bidens and Clintons etc.
@wile123456 3 years ago
@@tacticalcenter8658 But conveniently the corrupt Trump family never does anything wrong? Lmao
@tacticalcenter8658 3 years ago
@@wile123456 Show me evidence. So far it's just been lies from a criminal organization. Again... show me truth and facts that aren't made up by the communists. You can't do it. Cause you've been so brainwashed you believe things people tell you without proof, or with manufactured proof.
@Zorro33313 3 years ago
Don't forget about Nvidia's software scheduler, which creates a pretty noticeable additional load on the CPU, making it eat more power and produce more heat. So the 3080 TIE! is 400W itself + some amount of parasitic power load; based on HUB's driver-overhead investigation, around 10-30 watts.
@tetrasnak7556 3 years ago
Exactly! Not only do Nvidia's GDDR6X cards consume an insane amount of power for the performance, they also offload part of their work to the CPU!
@powerdude_dk 3 years ago
@@tetrasnak7556 What??? This is insane! How can Nvidia be so careless...
@hasnihossainsami8375 3 years ago
@@powerdude_dk they weren't careless, they had to do it to stay ahead of AMD.
@noergelstein 3 years ago
@@powerdude_dk In the pre-DX12/Vulkan days, the graphics pipeline with DirectX and OpenGL was much more CPU-limited and also very difficult to parallelize across multiple threads. Nvidia restructured their driver to offload some workload to an extra thread to increase performance if you had a multicore CPU, and at the time that made sense, as many games didn't use the extra core(s) much. This did give Nvidia a performance advantage at the time. But come DX12/Vulkan, AMD has a better starting position, because all the optimizations Nvidia did in the past no longer do anything, and their software + hardware architecture causes increased CPU load in the driver.
@chapstickbomber 3 years ago
If you think about it, the 3090 is sort of a dual-rank-memory GA102 while the 3080ti is single-rank. If they interleave between the chips, yeah, you have more chips, but each chip will be running fewer cycles. TechPowerUp found the 3090 used less power than the 3080ti in multi-monitor.
@gsrcrxsi 3 years ago
The 3080ti only idles at 100W if you have the power management setting in NVCP set to max performance. This keeps the card at the boost clock, so of course it uses more power. Change it to default/balanced and the card drops to the P8 power state and like 10W. That's a bingo.
@sonicwingnut 3 years ago
Yeah, I did wonder about this, considering my 3080ti FE doesn't even spin the fans up when doing most basic non-gaming tasks. It's almost like he cherry-picked a series of situations which would make the 6800XT look better.
@Gindi4711 3 years ago
This is like all the guys setting the power limit to 300W in the BIOS and then complaining that their 11900K is actually drawing 300W :D
@gsrcrxsi 3 years ago
And in true narcissistic tech-tuber fashion, he won’t reply when he’s shown to be wrong or coming to the wrong conclusion based on wrong/incomplete/biased information without verifying anything. Haters gonna hate. Shills gonna shill.
@pilotstiles 3 years ago
I have a 6900xt and paid retail for it. I can mine ETH at the full hash rate and didn't get butt-hurt by Nvidia's inflated prices. I upgraded from a 1080ti, so I am not a fanboy. You guys enjoy your overpriced product, limited in what you're allowed to do with it. I'm not going back to a company that only cares how much money it can make and limits you in its uses.
@gsrcrxsi 3 years ago
@@pilotstiles The 6900XT is already hash-limited by AMD; they just don't advertise it like Nvidia does. You think an unrestrained 6900XT would only do 65MH/s, barely more than a 5700XT? lol. It should be a LOT faster.
@DaveGamesVT 3 years ago
It seems like AMD really nailed it by increasing effective bandwidth by using cache.
@bgtubber 3 years ago
Optimizations vs brute force. The smarter way to do things.
@tacticalcenter8658 3 years ago
But AMD charges the same for less. AMD doesn't do CUDA, doesn't have tensor cores for the apps that utilize them, and they don't have mining capabilities (yes, it's been removed from newer Nvidia cards too), etc. AMD is charging way too much for their gaming-only product. That's a fact. Either bake in more function or reduce prices. The two brands are completely different products this generation. But AMD decided to offer less and keep the same price, and fanboys don't see this because of the faults of the Nvidia cards and the evil brand practices.
@3polygons 3 years ago
Not having CUDA (quite a lot fewer apps depend on OpenCL, or they don't get great performance, probably as Nvidia has a longer history in apps), nor the NVENC encoder (streaming and certain video editing), is a problem for a certain number of graphics apps (work). But I'm all for cards at a good price for graphics work, whatever the brand. As in... if you get me a card not as efficient in apps A and B due to lacking features, but at a price so good that it costs like a lower Nvidia tier, the AMD one might win by brute force (so it happened with the Radeon VII)... But what I am seeing is that both brands are still way too overpriced.
@dawienel1142 3 years ago
@@3polygons That's because both brands cannot saturate the market and are using this to get more revenue; nothing wrong with that, it's just business. Nvidia sets the price and AMD follows (for the most part). Knowing this, we the consumers need to stop buying new GPUs if we want prices to come down. GPUs have received some sort of luxury status faaaar beyond their value, and consumerism has fucked us all in the end. I'm to blame as well, I guess, since I did buy an Asus G17 laptop with a 5900HX+RTX3070 combo, but that was after my GTX1070 died on me and RTX3070s were going for half the price of my G17. (Made quite a bit back by mining on the side though.) But yeah, this consumerism and hype, to literally sleep outside retailers, has gotten out of hand. Don't get me wrong, it's a fun experience with friends in a way, I guess, but we are sending the likes of Nvidia and AMD the wrong signals here. Just my 2 cents.
@tacticalcenter8658 3 years ago
@@dawienel1142 Agreed. I bought a 1080ti for $300... what it was truly worth in 2020.
@EpicFacePalmz0r 3 years ago
Glad to see your community providing you with hardware so you can keep up with the bigger channels.
@ElusivityEtAl 3 years ago
IMO the big selling point of GDDR6X was that it was supposed to give the 3090 "1TB/s bandwidth": at the originally intended 21Gbps, a 384-bit bus works out to 1008GB/s (scale the shipping 936.2GB/s at 19.5Gbps up by 21/19.5). Obviously, due to the horrible power draw and heat, they underclocked it to 19.5Gbps, so it ended up at 936.2GB/s while still paying the cost in heat and power.
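The arithmetic there checks out: peak bandwidth is per-pin data rate times bus width divided by 8 bits per byte. A quick sketch with the 3090's 384-bit bus (figures as quoted in the comment above):

```python
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth: per-pin rate (Gbps) x bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(21.0, 384))   # 1008.0 GB/s -> the "1 TB/s" marketing target
print(bandwidth_gb_s(19.5, 384))   # 936.0 GB/s  -> what the 3090 actually ships at
```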
@gamtax 3 years ago
It should have been fitted with HBM2E if they wanted all that bandwidth, though.
@nathangamble125 3 years ago
@@gamtax Yeah, but Nvidia aren't going to use VRAM as expensive as HBM2E
@gamingmarcus 3 years ago
@@nathangamble125 Not that it would make any difference for a halo product like the 3090. You don't buy a Titan-class card for price/performance but for performance.
@concinnus 3 years ago
Remember when we hoped the 3080 Tie was gonna be on TSMC 7nm? Pepperidge Farm remembers.
@MrWarface1 3 years ago
I remember when Nvidia went with Samsung 8nm and beat TSMC's 7nm. I also remember the many generations Intel was destroying TSMC 7nm with 14nm.
@bgtubber 3 years ago
@@MrWarface1 Yeah, they beat them at how power-hungry, hot, and massive (due to needing giant coolers) a product can be. 😂 I would take the product that's a few % slower but much more power-efficient, cooler and compact any day. BTW, what do you mean by "many generations Intel was destroying TSMC 7nm with 14nm"? As far as I know, there are just 2 generations of AMD CPUs on 7nm: Ryzen 3000 and Ryzen 5000. Intel didn't "destroy" either of those. Like WUT?? In the case of Ryzen 3000, Intel was 10% faster on average in games and much slower in productivity software, so I wouldn't use the term "destroy" here. Not to mention, the HEDT Ryzen 3000 chips (Threadripper) obliterated any Intel HEDT CPU at that time (and still do). And in the case of Ryzen 5000, AMD is a tad faster in games and, again, much faster in productivity workloads. So that makes only 1 generation of 7nm AMD CPUs (Ryzen 3000) in which Intel had any dominance, and it was only in gaming. So, exaggerating much?
@MrWarface1 3 years ago
@@bgtubber That's your argument? They are bigger and hotter... lmao, who cares what you would rather have. You aren't an enthusiast. Also, keep lying to yourself about Ryzen. UserBenchmark says it all.
@bgtubber 3 years ago
@@MrWarface1 I'm not an enthusiast? Even though I have two RTX 3090s and a 32-core Threadripper 3970X with 64GB of B-die RAM OC'ed to 3800MHz CL14 with tightened secondary and tertiary timings. Yep, I guess I'm not an enthusiast. 😂 And UserBenchmark? Really? Thanks for the laugh. Imagine using UserBenchmark to prove your point. UseLESSbenchmark is the laughingstock of the PC community. All reputable hardware reviewers have accused them of intentionally tailoring their benchmarking software in a way that artificially boosts Intel's numbers. Literally ANY other benchmarking software paints a vastly different picture of Intel vs AMD gaming and productivity performance.
@MrWarface1 3 years ago
@@bgtubber lmao, you're flexing to the wrong guy, scrub. I'll assume you're lying about your imaginary setup. If not, I'll post another vid of the real goods. kzbin.info/www/bejne/j3PHn3Vpd6aFiqs Imagine claiming a benchmarking site is a shill because the product you fanboi for gets exposed. They are real-world benchmarks, scrub. Boilerplate benchmarking metrics. Unfortunately "influencers" don't control the environment, so people can actually record their systems' performance with proper cooling and overclocks.
@patsk8872 3 years ago
"This ten thousand, seven hundred and fifty two..." DOLLAR? "...CUDA core..." Oh. Well, wait until it hits eBay.
@user-pj6oc5gy2q 3 years ago
this is truly the worst timeline
@Dreadknight_Gaming 3 years ago
Norwegian ❄️ Winters
Dad - Chop some wood, son. For the hearth.
Son - Nah Da, I put a 3080 Tie in there. We're set.
Dad - It's good to see my son spend my money for the good of the family.
@haukionkannel 3 years ago
What do they use in summer? A 1030? ;)
@MacA60230 3 years ago
Well, Hardware Unboxed, which you referenced, says the 6800XT *does not* perform like a 3080Ti at 4K: on average ~12% worse. That's around 10FPS in a lot of games at that resolution, and it's definitely a noticeable difference. You can take any sample of 4-5 games to paint the narrative that you want; it's when you look at larger sample sizes, like HU does, that the truth comes out. Add to that:
- better ray tracing performance (I know RDNA2 is not as bad as it looked at release in that department, but it's still worse, sometimes *a lot* worse)
- DLSS (even if FSR is great, right now DLSS is simply the better option because of adoption, and while Nvidia cards will get access to both, AMD cards will be FSR-only)
- better for productivity
And suddenly the case for the 3080Ti doesn't look so bad. You said undervolting meant lowering its performance, but that's not even true; you can absolutely undervolt while keeping the same (or sometimes a bit better!) performance than stock. I don't think it's a good buy, don't get me wrong, the 3080 is such better price/perf, but imo you're not being fair. You're ignoring or downplaying what makes the Nvidia cards strong and laser-focusing on their weak points (like power consumption).
@darkwolf1739 3 years ago
There's also an argument to be made that HW Unboxed swapped out some games that ran better on Nvidia hardware for games like Godfall and Dirt, which run like 30% slower on Nvidia, like wtf?
@darkwolf1739 3 years ago
I've noticed this channel really hates chewing on Nvidia. Even HW is a bit Nvidia-hating, since when you go back over the benchmarks, Nvidia cards even ran better at 1440p and 1080p! Looking at multiple sites that are genuinely considered good (so sites like that one which loves Intel aren't included) and piling them into an average (like a meta-analysis would do) comes to the conclusion that Ampere runs better even at 1440p and 1080p (6800XT versus 3080 and 6700XT versus 3070). There is ofc the argument of driver overhead, but I've seen plenty of people with, say, a 5600X or 10th-gen-and-above Intel CPU, which isn't all that uncommon to pair with Ampere cards and should be what you'd pair with them. Nvidia have done some terrible things and will bend games over where they can (tbh any company in this position would, imo; even the "best" giant companies turn around if they can), but they do have the superior cards if you have a decent CPU and power supply.
@marka5968 3 years ago
The 3090's GDDR6X also has lots of cooling problems. Any workload that heavily tasks the RAM will get it to the throttling temp of 110°C. So the 3090 with all that RAM is pointless when it is constantly throttling whenever you try to use it. Almost every 3090 owner I've seen has had to do hacky cooling and underclocking. Otherwise, the performance is on par with a 3080, and the memory chips will probably fail in a year or so running at 110°C. AMD has none of those problems!
@TheAmmoniacal 3 years ago
This is also causing problems in some professional workloads, where I often get computation errors and crashes that I believe are due to memory errors (it happens a lot less with a powerful fan pointed at the back of the card). I assumed the AIB model I had was just horribly built, so I got the SUPRIM one, but no: same temps, same issue. And the A-series cards are not an option at all; the A6000 starts at $7,000 USD here.
@hennermais 3 years ago
@@TheAmmoniacal MSI cards, including the Suprim, all have terrible thermal pads. I replaced mine on a 3080 Ventus; VRAM temps went down by 20°C.
@Miskatonik 3 years ago
Memory temp problems are a disgrace; it's pretty annoying to pay top-dollar prices and have to fix such extreme temperatures. My 3090 FE reached a 110°C memory junction while the core stayed in the 60s under demanding games (4K 120Hz). It's a stupid and severe design flaw (it happens with almost all versions out there). I was going to watercool it anyway, so I bought a block and problem solved; the memory stays in the 70s now. But many people will not be able to do this, or even be aware of the memory temp problem. It's absolutely unacceptable that GPUs are being sold with a ticking time bomb in their memory. They will degrade and there will be problems. There should be a collective claim against Nvidia for selling cards like this.
@simondale3980 3 years ago
Water cooling is the answer; it allows me to max out the memory overclock (+1500). Memory-intensive applications still have it running in the mid-90s, but only because the backside is still undercooled. Corsair makes a block that cools the rear as well if that's still too hot for you. It's just a shame you need to go to this expense.
@Tom--Ace 3 years ago
Nonsense, my 3090's VRAM never goes above 80°C under load. Even mining, it stays at 102°C or below. It's all about how it's cooled.
@tetrasnak7556 3 years ago
Finally, someone in my feed is telling the truth about GDDR6X!
@bgtubber 3 years ago
What do you mean? Pretty much everybody knows that GDDR6X is blazing hot and cr@p at power consumption.
@nathangamble125 3 years ago
@@bgtubber Pretty much everyone who's really into PC hardware knows it, but that's definitely not the majority of people.
@Krogoth512 3 years ago
GDDR6X is the swan song of the GDDRx standard. The 32-bit bus is simply too limiting; you have to clock it to the moon to match bandwidth that HBMx can easily obtain. HBM is the future for ultra-high bandwidth and memory capacities; however, neither is really needed for gaming GPUs, which is why HBM is only found in professional-tier SKUs that demand both. Fiji was the proof of concept, while Vega 56/64 was stuck in the unenviable position of trying to serve the professional and gaming markets at once, while AMD didn't have enough capital to make separate designs at the time.
@olifyne6761 3 years ago
More like gddrFailure
@tacticalcenter8658 3 years ago
@@Krogoth512 I bet engineers could make GDDR6X work efficiently, just saying. Otherwise, indeed, the way it's currently implemented is a failure. For Nvidia, it's marketing: X is always better and more expensive.
@LaDiables 3 years ago
The 8-pin CPU connector is likely for rack server chassis (as is the blower-style cooler). Much easier to manage one cable instead of two in a server.
@aadavanelangovan1630 3 years ago
That F bomb was ripe lmaooooo. This was goood after a long day, Mr. Tom!
@fVNzO 3 years ago
There it is, the actual Ampere Titan + pro drivers. Ohh, and a 2-4x price increase of course.
@TadgNokomis 3 years ago
"Wrong Polygons" spoken like a true tech enthusiast who is very knowledgable about "architectures" and not someone from the car industry feigning a higher level of tech literacy to grow a following.
@ThylineTheGay 3 years ago
hm?
@theemulationportal 3 years ago
Really enjoyed this review; it was a unique angle, and it's near impossible to disagree with.
@user-pj6oc5gy2q 3 years ago
my exact thoughts as well!
@StyleAndSpeedx 3 years ago
I play on a 48-inch LG OLED. That's where my Sapphire Nitro really shines at 4K.
@hammerheadcorvette4 3 years ago
Sapphire is a damn good company.
@crimsonhoplite3445 3 years ago
Same here, brother. Nitro+ 6900XT OC. I'm usually pegging that 120Hz at 4K on just about any game.
@owenyang9216 3 years ago
There might be an issue with that 3080 Ti... My 3090 consumes ~35W at idle according to HWiNFO64. My entire system idles at around 90W, so a 3080 Ti single-handedly using 110W is very odd.
@emanuelmartins5268 3 years ago
I don't believe it either; he didn't show any proof of it.
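Idle draw is easy to log rather than argue about. A minimal sketch using NVIDIA's NVML bindings (pip install nvidia-ml-py); a fixed-clock OC profile, which other commenters below suspect, would show up here as a high P-state at idle:

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000    # NVML reports milliwatts
pstate = pynvml.nvmlDeviceGetPerformanceState(handle)    # P8 is the deep idle state
print(f"board power: {watts:.1f} W, performance state: P{pstate}")
pynvml.nvmlShutdown()
```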
@georgiospappas 3 years ago
Your camera is incredible. The video quality is so great.
@MooresLawIsDead 3 years ago
Thank you so much!
@twinsim 3 years ago
As an A6000 owner: we also use it to split the GPU into VMs for work. It actually was easier to get, and because we can split it between VMs it was a no-brainer: 16GB for each of 3 users.
@MooresLawIsDead 3 years ago
I liked the A6000 a lot. But I think an A4000 is the perfect one for me...
@RamkrishanYT 3 years ago
What software is required for it, and which hypervisor do you use? Does this require an Nvidia GRID license or something?
@twinsim 3 years ago
@@RamkrishanYT VMware and a license from Nvidia. All of which I wouldn't normally do, but prices were so high and VRAM on the cards is so low. We use Omniverse for work, and 12GB is standard for us.
@RamkrishanYT 3 years ago
@@twinsim Do you have a link to a YouTube tutorial for it?
@twinsim 3 years ago
@@RamkrishanYT Umm, no; there are what, 100 A6000s in the world? I would get like five views.
@InternetSupervillain 3 years ago
I like your review data delivery. I like how you conclude with the general user experience between the card and its contemporaries. Values on a bar chart only go so far after a while.
@MooresLawIsDead 3 years ago
Agreed. I think real-world differences need to be explored more, unless you are a mega channel that can afford to run through tons of data.
@andersjjensen 3 years ago
I think Ruthless Looking Corporate Henchman bellowing "THE THIRTY EIGHTY TIE!" will haunt The Nvious One for a long time! :P
@skyemperor2357 3 years ago
"Nvious" nice
@digitaltactics9234 3 years ago
I love the rasterization perf AMD has & the lower wattage. But for the many people streaming through OBS with NVENC encoding, the Nvidia cards will also be the win.
@KillerXtreme 3 years ago
Got my hands on the EVGA 3080ti FTW3 Ultra, and yeah, this thing is smoking HOT. If I let it run at 100% power it easily hits 80+°C even on the lightest of tasks, without overclocking.
@tinopimentel5734 3 years ago
6:00 was the most legendary part of the review xD Always great work, keep it up!
@brandonbajc2084 3 years ago
I've been waiting all week for this!!!!
@3polygons 3 years ago
That A6000 is a wet dream for someone working with Blender... (I do so, and render on the GPU (which is a ton faster than on the CPU), so that crazy amount of VRAM... yay). Not that I'd hit the memory limit with it (it would be an interesting experiment... how to fill up 48GB for a render :D :D). And I guess, given how heavily DaVinci Resolve Studio (not the free one, the 300-bucks one) uses the GPU... that'd be AMAZING even for 8K projects. The 24GB of the 3090 was already crazy...
@recordatron 3 years ago
I managed to get a 3070 back in November when prices were significantly better. I was still on the lookout for a higher-end card, but as you've pointed out in this video, to be quite honest I haven't felt like I've been getting a 'compromised' gaming experience, so what's the point? I like the fact it's using less energy than the top cards, and with all the upscaling technologies emerging it feels a bit pointless having a screaming top-end card. I do wish it had more VRAM, but I tend to upgrade my hardware semi-regularly, so it's less of a concern in that regard. Great video as always!
@garytrawinski1843 3 years ago
Tom, I liked your take on reviews. Great job!
@silvertree88 3 years ago
When a 580 uses less power while mining than an idle 3080ti, and only has a 25% lower hash rate.
@bgtubber 3 years ago
That's not true. I have two 3090s, which are even more power-hungry than a 3080 Ti, and they both idle at 20-30W. I think Tom used an OC profile with fixed frequency and voltage. My 3090s idle at ~100W only if I overclock them with a fixed frequency, which doesn't allow them to downclock and go into lower power states.
@suppar8066 3 years ago
@@bgtubber That actually sounds more realistic. Still, it is weird that a 6800xt is basically the same as a 3080ti :/ Guess it's the RTX 3060 or AMD if you want more performance now. But I would be totally fine with a 3060 if I could get one for MSRP :D
@mattiasnilsson3980 3 years ago
Something is wrong with your 3080 ti; it should not draw more than 20 watts.
@RyugaHidekiOrRyuzaki 3 years ago
That reflective shroud is gorgeous.
@bigcazza5260 3 years ago
Underclock the G6X to 16Gbps and re-run. Would be a cool idea.
@yusuffirdaus3900 3 years ago
The memory signalling already works differently from standard GDDR6. Anyway, nice to see.
@lupintheiii3055 3 years ago
To demonstrate how GDDR6X is useless even at the same transfer speed?
@bigcazza5260 3 years ago
@@lupintheiii3055 Yeah, I basically wanted to see a timing difference.
@lupintheiii3055 3 years ago
@@bigcazza5260 Considering how bitchy GDDR6X is, it will probably crash or trigger ECC and make the experiment null.
@bigcazza5260 3 years ago
@@lupintheiii3055 Stupid invention when Samsung, i.e. the dudes who make GA102, have 18Gbps GDDR6. Micron must give good head; either that or Huang picks favorites.
@bryantallen703 3 years ago
Awesome vid. I love those cuts... The "TY" edit was epic...Shit was cool😎.
@ItsJustVV 3 years ago
GJ Tom, this is how a small(-ish) independent review should look. You point out the good and the bad, and you show real-world scenarios and a lot of useful info on the side too, not just some graphs, while also testing vs the competition. A lot of YT channels, big and small, should do this. Even though HU and GN are the benchmarks for reviews, you school a lot of others in how it should be done (looking at Linus, Jayz2c, Joker, even Redgamingtech and others). I'm also glad you don't shill for any company.
@tacticalcenter8658 3 years ago
And yet Linus schools even them in other aspects. His insight into the market and the companies is something a lot of other channels don't have. (I'm far from a Linus fanboy; I watch more technical channels.)
@Ang3lUki 3 years ago
After this generation, I wonder if the big halo products will use the current flavor of HBM
@tacticalcenter8658 3 years ago
Pro cards only and only if large customers demand it.
@HuntaKiller91 3 years ago
I'm buying the 6600xt sometime later, but I really wish the 7700xt performs as well as the current 6800xt/6900xt. Upgrading again next year then.
@benjaminoechsli1941 3 years ago
Tom is expecting anywhere from a 20-60% improvement over RDNA 2, so I think it's absolutely possible.
@bills6093 3 years ago
Power draw figures from TechPowerUp are interesting. Idle: 3080Ti 16W, 6800XT 29W. Idle with V-sync 60Hz: 3080Ti 110W, 6800XT 146W.
@WeencieRants 3 years ago
Another great review! You and Dan are doing great work! Keep it up.
@WesternHypernormalization 3 years ago
Bannerlord benchmarks! Good man! =] 10:19 Sweet Warband mod pogchamp
@Ang3lUki 3 years ago
They should come out with a GA102 with 16 gigs of GDDR6, maybe slightly cut-down cores from the 3080, and a far lower power limit. It might be efficient, question mark?
@pretentiousarrogance3614 3 years ago
You need a 256-bit bus or a 512-bit one for 16GB, something which GA102 doesn't have.
@andersjjensen 3 years ago
Let me explain his explanation: one memory module takes 32 bits of bus width. GA102 has a 384-bit bus (if not cut down), and 384/32 = 12, so 12 memory modules. GDDR6 and GDDR6X are available in 1GB, 2GB and 4GB modules, but all modules have to be the same size. So on a 384-bit bus you can have 12GB, 24GB or 48GB total. On a 320-bit bus you can have 10GB, 20GB or 40GB total. You can do the math for the rest :P
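That layout rule is easy to tabulate. A small sketch taking the comment's module sizes at face value (and ignoring the clamshell/dual-rank configurations mentioned elsewhere in the thread, which double these totals):

```python
def capacity_options(bus_width_bits: int, module_sizes_gb=(1, 2, 4)) -> dict:
    """Total VRAM options: one module per 32-bit channel, all the same density."""
    modules = bus_width_bits // 32
    return {size: modules * size for size in module_sizes_gb}

print(capacity_options(384))  # {1: 12, 2: 24, 4: 48} -> 12/24/48 GB totals
print(capacity_options(320))  # {1: 10, 2: 20, 4: 40} -> 10/20/40 GB totals
```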
@mattecrystal6403 3 years ago
5:40 This is so good. IDK why reviewers don't do an all-in-one average of FPS for all games tested. It helps a lot with the bigger picture.
@MLWJ1993 3 years ago
Ever watched Hardware Unboxed? Cause they do: without RT games, the 6900XT is on top (the unlocked water-cooled behemoth). With RT (& DLSS), Nvidia is at the top of the charts until you get down to like the RTX 3080. Of course, that was before FSR was out, but only Godfall was in the games they tested, so I doubt it would make a difference yet.
@javiej 3 years ago
And who is the A6000 for? ... Me. And no, I'm not a gamer, and yes, we also exist. The reasons are probably irrelevant to you, but for me they are everything: being able to get support from Nvidia engineers directly, which I need often. The stability and peace of mind when working with clients over my shoulder. Being able to sleep well while my A6000 is rendering all through the night without a glitch. And the fact that even though she lives in my super-overpacked workstation, she is still friendly with my other high-TDP cards (video, 2nd GPU, Fibre Channel, quad NVMes, etc.).
@miyagiryota9238 3 years ago
Not for everyone
@SuperLotus 3 years ago
It's for Big Chungus to brag on Reddit
@Outrider42 3 years ago
I know, right? I can't believe he just assumed all Quadro owners play games on them. There are probably more Quadro owners who never game than do. And the ones that do... likely have a dedicated gaming PC for that.
@exe16 3 years ago
1:50 the most premium cooler I've ever felt in my life is the reference Radeon 6800XT. Beats the crap outta the Red Devil 6900XT cooler that I have in terms of quality (though not in performance, obviously). Seriously: it's all metal, heavy AF, hides the PCB completely, looks absolutely gorgeous and DOESN'T SAG. It's almost witchcraft. I actually felt bad about sending that card to the mines after getting the 6900XT, it's just too beautiful.
@iComment87 3 years ago
Great vid, bro.
@BobTheChainsawMan 3 years ago
Hey MLID, you seem to have forgotten about the vanilla 3080 in this review. If you can get it at MSRP or near it (I know, I know, but the shortages will end eventually), I don't think it's a bad product. Mine has already paid for itself in mining, and it doesn't draw QUITE as much power as the 3080 Ti and the 3090. Mine definitely doesn't heat up my room while idle. Not to mention DLSS is still superior to FSR. Still enjoyed the review though!
@mrinalbharadwaj6013 3 years ago
I just loved your review; it is so simple and easy to understand compared to others. Please keep doing it!!!
@MooresLawIsDead 3 years ago
Thank you! Will do!
@bretth6393 3 years ago
The thumbnail makes me feel like I've just walked down a dark alley and met a black market GPU dealer.
@bgtubber 3 years ago
What do you mean? Isn't this how people have been getting their GPUs for the past few months?
@rcavicchijr 3 years ago
As the owner of a 6800xt, I can say I am extremely satisfied with this card. I have the Aorus card running at a 2650MHz OC with max memory speed and timings, and it barely hits 60°C at load.
@gsrcrxsi 3 years ago
The 8-pin CPU is a “standard” connector for GPUs in the enterprise space, which is exactly what the A6000 is aimed at and why it has that connector instead of the PCIe connectors. No adapter needed when used in a server as intended.
@mardon85 3 years ago
I enjoyed this review, but something is off... My 3090 is sitting here idling at between 12 & 14W. Where do you get your numbers? Mine are from "GPU Power" in HWiNFO64. Have you locked the voltage with your overclock or something? Also, while I agree with your assessment on DLSS and the 6800/6900XT cards being fantastic in pure rasterization, you did not compare RT performance between Ampere and RDNA2. You were quite happy to show RDNA2 pulling away in a known Radeon-favoring game (Deus Ex), but then for RT you only compared the 3070 in Metro Exodus. The 3080Ti would have absolutely killed the 6800XT in that game, and in any 4K game with RT for that matter. I agree with many of the points in your review, but I do not think it gives a fair picture of Ampere as a whole. The 3080ti is a trash card; the 3080 and 3090 are not, for various reasons.
@MLWJ1993 3 years ago
He's obviously heavily biased towards AMD, doesn't take a genius to figure that out 😂.
@mardon85 3 years ago
@@MLWJ1993 There's bias, and then there's flat-out incorrect information. The RT comparison I can forgive; incorrect information about idle power consumption, not so much.
@luclucchesi7326 3 years ago
In South Africa, I got a 3080ti for less than what I could get a 3080 for. Well, I couldn't get a 3080, so I kinda took what I could get. I went with whatever GPU would perform best in the games I play, namely racing sims with triple 1440p monitors, and in that use case the Nvidia cards destroy the AMD cards. In mining I have managed to reduce power draw without lowering the hash rate: I'm getting 63MH/s at a 55% power limit, so it ends up pulling 215W to achieve that hash rate, and the card runs super cool, under 50°C. Fully overclocked, the highest it gets to is 68°C. The card is a Palit GameRock 3080ti OC; very happy with its performance, first time using the brand.
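For reference, the mining efficiency implied by those figures (taking the quoted numbers at face value):

```python
hashrate_mh_s = 63.0    # quoted above, at the 55% power limit
board_power_w = 215.0
print(f"{hashrate_mh_s / board_power_w:.2f} MH/s per watt")  # ~0.29
```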
@DouglettDD 3 years ago
Ever since that one Broken Silicon I keep noticing him saying GDR6(X) instead of GDDR6(X).
@alvydasjokubauskas2587 3 years ago
shorter name, and people still understand it...
@grooners5333 3 years ago
The 3080ti absolutely does not idle at 100W - mine is at 17W, with 3W shown for the GPU chip. This is consistent with other reviews and other comments from owners below - except for the one comment you pinned that repeats what you've said. Please, as a responsible tech YouTuber, investigate why GPU-Z is showing 100W at 0% load on your system - surely a strange result like this would make you want to understand rather than just post? Fair enough to ignore undervolting as this is not out-of-the-box, but properly set up, the 3080ti is a much better card than you've suggested - it's just the RRP that was too high.
@bgtubber 3 years ago
I agree. Many people here have already confirmed that this is not normal. My 3090s all idle at ~20W, so there's no way a 3080 Ti would idle that high unless it has a fixed overclock applied through MSI Afterburner or similar software.
@Kakaroti 3 years ago
Even the baseline 6800 with 16GB is a steal at MSRP and will age nicely.
@TheDaNuker 3 years ago
I'm kinda annoyed at how bizarre the memory configurations are for Nvidia GPUs this generation. I'm looking to move up from 8GB of VRAM for Blender workloads (it's optimized for OptiX/CUDA, so an AMD GPU loses too much performance) and it's just bad choices all around. A 12/16GB GDDR6 3070 would be really ideal over a 2080Ti, but instead we got an 8GB 3070Ti. Sigh.
@Mopantsu 3 years ago
The 3070 should have been at least 10GB out of the gate. Nvidia penny-pinching again.
@TheDaNuker 3 years ago
@@Mopantsu That would make it the ideal budget 2080Ti alternative and would be a great deal but Nvidia just had to gank everyone's wallets...
@irridiastarfire 3 years ago
Yeah, I'm in the same boat. No good options for (high-VRAM) Blender this generation except the 3090. I decided to get the 3090 (MSI Suprim) -- at a 60% power cap it uses 15% less power (accounting for the longer total render time + including CPU usage) for a 4% longer render time. Haven't played with voltages yet. Idles at 40W, which is fairly high IMO.
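Reading those figures as raw power draw versus wall-clock render time (an assumption; the comment may already net them out), the energy math still favors the cap:

```python
capped_power = 0.85   # "15% less power", normalized to stock = 1.0
capped_time = 1.04    # "4% longer render time"
energy_ratio = capped_power * capped_time
print(f"energy per render vs. stock: {energy_ratio:.2f}x")  # ~0.88x, ~12% saved
```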
@VoldoronGaming 3 years ago
@@Mopantsu Not really. Just clever marketing in the market segmentation.
@kyraiki80 3 years ago
Very interesting. Missed the info about idle temps before. That's insane.
@flogjam 3 years ago
RTX 3090 Suprim X - idles at less than 27 watts while watching this video. The cooler is very good too. I have undervolted a little and it boosts higher - 2000MHz at times.
@slateeboi884 3 years ago
Good video, but I think you miss the point of the undervolting argument. I was able to drop the total package power of my 3090 down to 320 watts while overclocking the core another 85MHz (and yes, the core was actually boosting higher after the undervolt). Nvidia just pushed the total package power way higher than they should have, so the cards would boost stupidly and perform at the highest clocks for reviews. Also, I have been mining on my 3090 so long that it paid itself off, something that would take way longer on the 6900xt because of the poor hash rate. Although I do agree the 3080ti is a joke while the 3090 exists on the market.
@morpheus_9 3 years ago
Can't believe you're justifying the 3090.
@slateeboi884 3 years ago
@@morpheus_9 Well, I snagged one at MSRP, and I have already broken even on the card with mining. I don't know how you can't justify a free 3090.
@EldaLuna 3 years ago
Reminds me of a vid kinda, where I made a comment about just the VRAM size in general; can't remember where. But I once said how they've got no problem doubling sizes on the datacenter side but struggle to even give us anything worth the price. Honestly, if they keep this up I'm retiring from gaming altogether. One frosty day I'll get locked into console world once more.
@s77lom49 3 years ago
The 3080ti is meant to be used in Siberia.
@zaidlacksalastname4905 3 years ago
Free heating is magic
@alvydasjokubauskas2587 3 years ago
Siberians now understand what heat means...
@bigdeagle1331 3 years ago
Great job! You always benchmark right!
@novajett 3 years ago
I have an EVGA 3070 Ti FTW3 Ultra and I'm in love with it. Super quiet and runs very cool. Dope shit.
@aragurn9019 3 years ago
Yesterday my 80ti FE came. At first I kinda felt bad while watching this video, but then I realized that I've been hunting for a GPU for approximately 8 months. All 3070s (what I was originally going to buy) are still more expensive than the 80ti FE. So that's my answer to your question "who is this card for": it's for people who perhaps wanted to buy a 3070, 3080 or 6800XT but couldn't get an FE or AMD's reference model, and just so happened to get an 80ti. I mean, most 3070s in my country used to be around 2000 USD, so a much better card is a no-brainer for 1440 USD (3080ti FE w/ EU taxes).
@LurkingLarper 3 years ago
Seeing as I had my friend's PC crash just yesterday due to overheating during this horrible summer, I find it funny how some people still excuse high-power-usage components. As far as I'm concerned, power usage is the second most important metric after performance, by a country mile. It directly affects heat output and noise levels, and those are quite crucial unless you have your PC hidden in another room, lol.
@juliusfucik4011 3 years ago
I have a 3090 and 5950X in a case, running full load 24/7, and it is not loud or hot. The secret is to leave one panel open on your case 👍
@LurkingLarper 3 years ago
@@juliusfucik4011 Yes indeed, and if you have the panels open, what's the point of the case? Also, you will heat up your room that way, which is fun during winter and hell during summer. I had to remove the panels from my Fractal Define R5 case since it was suffocating an older mid-range Vega 56 and i7-4790K, lol. You couldn't get good airflow cases five years back, but luckily today you can. I would strongly recommend getting something with proper airflow.
@TerraWare 3 years ago
@@juliusfucik4011 It doesn't matter if you leave one panel off or water cool the entire build with top-of-the-line radiators and fans. The components will still output the same amount of heat, and the same amount of heat will be exhausted into your surroundings. Some people care about the heat/power consumption and some don't; that's fair. My game room is rather small, so I can feel the temperature change when I play something demanding on my 5900X/6800XT, which are both water cooled in an open-air case. If I were running them air cooled in an enclosed case, I'd still be dumping the same amount of heat into the room, because the components generate the same heat regardless of how efficient my cooling solutions are.
@MLWJ1993 3 years ago
In that case, don't buy any current-gen desktop part, since power consumption shot up to >200W for both vendors. Maybe Intel will provide you with low-power-consumption GPUs?
@LurkingLarper 3 years ago
@@MLWJ1993 Both Ampere and RDNA 2 are quite efficient architectures, and even if NVIDIA is busy raping their customers' wallets, their achievements on Samsung's inferior node compared to what AMD is using at TSMC are nothing short of impressive. The current gen just cannot be overclocked much, if at all, and if you do, you lose any semblance of the inherent efficiency the architecture had to start with. I would gladly get a 3060Ti, 3070 or 6700XT at MSRP, but that's just not going to happen anytime soon, and in a few months' time I won't want to buy old hardware and will be looking forward to next gen. The current market conditions have made this gen dead to me.
@brucethen 3 years ago
The 3080Ti is for Eskimos, Alaskans and Siberians
@blueeyednick 3 years ago
Damn son, you did it this time.
@RealDaveTheFreak 3 years ago
Even though the 3080 mobile 16GB is but a 3070 Ti with GDDR6, it has enough VRAM and, compared to the current scalper prices, is actually quite affordable, and you get a whole PC as well. May be an okay option nowadays (until desktop prices come back down to normal; then it'll be a slightly cut-down 3070 Ti for $1k).
@commandertoothpick8284 3 years ago
The real question is: when will sub-$200-300 cards come out?
@haukionkannel 3 years ago
It already did… the GT 1030! :)
@zuhairshahid2225 3 years ago
Thank you, brother, for the honest review. I have an Nvidia GPU; so much power draw... Now that AMD has ray tracing, I would love to buy AMD and sell my Nvidia card.
@mako0815 3 years ago
Part of the higher efficiency of the A6000 compared to the TIE might also be binning 🤔
@MooresLawIsDead 3 years ago
Not this much of a difference - and remember the A6000 is the full die.
@timbermg 3 years ago
@@MooresLawIsDead Undervolting a 3090 can save as much as 30% in compute applications. Pity that's Windows-only, and with the chapstick thermal pads many AIBs seem to use, the operating spec is exceeded at 60-70% of TDP. Some shocking engineering shortcuts.
@davidgunther8428 3 years ago
Yeah, but could nVidia control the distribution on regular GDDR6? 😛
@Whizzer 3 years ago
The 3080 Ti is for nVidia of course. They're making bank selling it.
@SoftnappGAMEPLAY 3 years ago
Not really; it was a bad, hollow product that isn't useful for creators, gamers or miners. They just planned the cash grab out, since they didn't want to waste dies on cheaper models. Nvidia makes more money on mobile GPUs than on desktop GPUs.
@Yoshimatsu414 3 years ago
Lol, yeah, these resolution scales really do help you with performance. When the ray tracing update came to Doom Eternal, I lost some FPS on my 6700 XT with RT enabled (obviously, any card would lol). So to solve that issue I just turned the resolution scale down to 80% and cranked up the sharpening a little bit, and I couldn't really tell much of a difference when ripping and tearing lol.
@bohomazdesign725 3 years ago
Well, I got a 2x RTX A4000 setup, and in my use case - Blender + DaVinci Resolve + UE4/5 - it's plenty of power and VRAM (16GB each, btw). Heck, tbf a 2x A4000 setup is actually somewhat on par with, or in some cases even outperforms, a 3090 in pure rendering performance in Blender, and the total cost of two A4000 cards was smaller than one 3090 AIB model. That said (if I weren't forced to go Nvidia for OptiX / work), if I needed a GPU just for gaming I would go for an RX 6800XT; no fcks given what anyone says: cheaper, cooler, less noisy than a 3080Ti.
@anasevi9456 3 years ago
Ugh, well, zero chance of an Ampere in my future now, even as a used buy in a few years. Even the A6000 is burning a lot of power just idling, even if it's better than the 3080ti. I'm still stuck with a Vega 64 because the 5700XT I tried to replace it with cooked the room while idling [3 monitors, a 1440p drawing tablet and a 144Hz 1440p main]. Vega idles so cool there is no detectable heat from its blower; advantages of HBM, I guess. I really don't game as much these days, so having a card that can actually idle without cooking its VRAM and heating up the room is a big deal.
@Alecxace 3 years ago
Did you undervolt your Vega?
@anasevi9456 3 years ago
@@Alecxace No; due to its HBM it can idle even at high 2D loads while sipping energy next to GDDR6 and GDDR6X cards, as I've learned over the day. Never got it to undervolt with much stability for gaming; however, these days 300W [it's actually around 250-280W in practice] is hardly as bad as it used to be, lol.
@Alecxace 3 years ago
@@anasevi9456 Everything high-end now STARTS at 300W lol.
@StefanHolmes 3 years ago
I believe Radeon have now fixed the high idle power draw for the RDNA2 cards when using multi-monitor. Could be worth a revisit, if you can still get your hands on one.
@anasevi9456 3 years ago
@@StefanHolmes Good to know. RDNA 1 was just horrid; I live in the tropics, so even in our 'winter' I noticed it immediately. My spouse was due for an update and only uses one monitor, so they got the 5700xt, and I am still here with old faithful [nabbed my Vega from PLE at launch for just under MSRP, just before the 2017 crypto boom really kicked off]. @Alecxace Yeah, it is pure insanity; at least performance/watt has still improved. I will likely go with the 6700xt, or whatever comes after of its ilk, to dodge these 300W+ cards.
@sfesfawfgasfhga 3 years ago
Back when I had a GV100, I could play Flight Simulator on a Win10 box over RDP, from a Windows 8.1 client, and the performance was absolutely flawless. Blew me away. And on idle, an idling GV100 didn't really add anything extra to power consumption at the wall socket. It wasn't really any different from any other newish consumer card -
@thomasb1521 3 years ago
That A6000 is the most beautiful card I have ever seen.
@denvera1g1 3 years ago
What I'd love to do is get my hands on one of the Nvidia A100s, write some drivers for it, and use the iGPU as the display out. For reference, if Nvidia counted the CUDA cores the same way it counts them for the GeForce line, this $10k card would have nearly 14k CUDA cores.
@omarlinp 3 years ago
I was really surprised when I saw the thumbnail; I was looking forward to this video.
@Syntheticks 3 years ago
Good vid, keep up the good work!
@curtismariani6303 3 years ago
This video is on point. I had a 3070, but gaming on my 1440p ultrawide it really struggled with some heavy titles like Cyberpunk. So I bought a 3090 FE, not because it was the card I wanted, but because it was the only card I could get in December without paying scalper prices. The card gets 60 to 70 FPS in Cyberpunk with DLSS Quality and mostly top settings. Where I have a problem with the 3090 is the noise. Playing Control with an undervolt, the fans hit 2400 RPM and HWiNFO is reporting a 106°C peak junction temp while the GPU is at 61°C. This is simply not good enough for a £1350 card. I know there is an issue with the thermal pads on the early models, so I'm going to have to sort it. But you would think they could have just spent a few dollars extra on better pads in the first place, considering the massive margin they must have on this card. Anyway, rant over; keep the content coming 🙂.
@airforcex9412 3 years ago
About NVIDIA and memory: they slow-drip capacity. If the 3080ti were 24GB, they'd tie their hands for the next round, and anything below that would be a downgrade. They give you just enough RAM (which in a sense is technology they can't control) and push their proprietary stuff.
@KingGiac 3 years ago
Awesome video. Just FYI, your 3070 was artefacting at 13:17… think you may have pushed the OC too high. EDIT: no it wasn't!
@WhimsicalPacifist 3 years ago
That made me panic. It had me thinking "It's not my 970's time! It's too early to die!"
@jmporkbob 3 years ago
As someone who has actually played Metro, that is the effect of being in a radioactive area. Unless I'm just missing something.
@KingGiac 3 years ago
@@jmporkbob Ah, then disregard... it really looked like it!
@overcoercion590 3 years ago
Got a 3090 Hybrid. Nice for keeping some heat out of my case, but 4K gaming is now a winter sport.
@tjr3308 3 years ago
My 3080Ti used to idle at full boost until I uninstalled MSI Afterburner. I overclocked it using the Nvidia overlay's scanner; now it idles around 250MHz @ 25 watts and boosts to 2040MHz @ 425 watts in 3D mode.
@bgtubber 3 years ago
I have MSI Afterburner and my 3090s still idle at 20W (meaning they downclock properly). You must have set something in Afterburner wrong. Maybe you've had a fixed frequency overclock applied?
@tjr3308 3 years ago
@@bgtubber I tried it with a fresh install of Afterburner (settings not saved from the previous install), and before I'd touched any settings this weird boost behaviour at idle started again. Hold on... you've got two 3090s? What a legend!
@bgtubber 3 years ago
​@@tjr3308 Latest version of Afterburner? Did you try clicking the Reset to defaults button? Pretty weird indeed. 🤔 Yea, I got my 3090s at near MSRP just a few weeks before the scalpocalypse kicked in. I really dodged a bullet there. 😅
@meeko5985 3 years ago
My 3080 Ti FTW3 Hybrid is only 250 watts with an undervolt. I had a 6900 XT Red Devil; it was a very disappointing card. It was my first AMD card too. Another issue was no CUDA core acceleration in workloads. Ampere does it all for me, not just in gaming but in workstation tasks as well. I'm quite pleased. No hate towards AMD, but my 6900 XT was highly disappointing.
@MLWJ1993 3 years ago
I'd say both are disappointing for the price. General consensus is that a 6800XT or 3080 makes more sense for the majority of consumers.
@d00dEEE 3 years ago
Radial fans are not inherently louder or quieter than axial fans. The noise level is just a matter of effort, designing proper blades for a given flow and pressure ratio. AMD's cheap radial fans ("blowers") on their reference cards typically have square leading and trailing edges, which are about as bad as you can get for noise.
@yusuffirdaus3900 3 years ago
I would rather say "impeller" than "radial fan".
@d00dEEE 3 years ago
@@yusuffirdaus3900 The impeller is the part of a fan that rotates and moves the air; it can be radial, axial or a hybrid (for examples of hybrids, see turbocharger or centrifugal water pump impellers).
@H1mS0L0st 3 years ago
I actually want one of those 12GB 3060s. I'm hoping that someday I can find one close to MSRP.
@MLWJ1993 3 years ago
Just get a 3060Ti; the 3060 cripples under the load required to even use 10GB of that VRAM (4K maximum settings, all textures put into VRAM in Doom Eternal, which nets you a visual difference of 0%)...