How Good...or Bad...is Ampere?

57,241 views

AdoredTV

3 years ago

History doesn't lie.
♥ Check out adoredtv.com for more tech!
♥ Subscribe To AdoredTV - bit.ly/1J7020P
► Support AdoredTV through Patreon / adoredtv ◄
Buy Games on the Humble Store! -
►www.humblebundle.com/store?pa... ◄
Bitcoin Address - 1HuL9vN6Sgk4LqAS1AS6GexJoKNgoXFLEX
Ethereum Address - 0xB3535135b69EeE166fEc5021De725502911D9fd2
♥ Buy PC Parts from Amazon below.
♥ NEW USA Store! - www.amazon.com/shop/adoredtv
♥ Canada - amzn.to/2ppgYsX
♥ UK - amzn.to/2fUdvU7
♥ Germany - amzn.to/2p1lX6r
♥ France - amzn.to/2oUAK2Z
♥ Italy - amzn.to/2p37Uui
♥ Spain - amzn.to/2p3oIBm
♥ Australia - amzn.to/2uRTYb7
♥ India - amzn.to/2RgoWmj
♥ Want to help with Video Titles and Subtitles?
kzbin.info_cs_p...

1,700 comments
@DragoonKain3 3 years ago
"That's a real 2x, not a Jensen 2x...." OMG my sides!
@geofrancis2001 3 years ago
This should be a meme, the Jensen multiplier. Put it in the box with Elon time, Apple corners and "it just works".
@KibitoAkuya 3 years ago
@@geofrancis2001 How does Elon time compare to Valve time?
@BARCH-wp5vl 3 years ago
Lol
@nathangamble125 3 years ago
@@KibitoAkuya Elon time goes beyond 2.
@railshot888 3 years ago
But there are a lot of Nvidia fans that simp for Jensen, so they absolutely believe it.
@Chuckiele 3 years ago
And the worst part about this is that Fermi was an accident; Turing and Ampere are completely deliberate.
@MrAdhiSuryana 3 years ago
Chuckiele they know AMD is focusing its R&D on Ryzen, so they can have a good nap for a few generations 🙈
@perschistence2651 3 years ago
Not necessarily. I think Nvidia planned 7nm for Ampere, so it would have been at least 20% faster and more power efficient, but they had to redesign it completely.
@rayaneh5230 3 years ago
@@perschistence2651 They cannot design a GPU in 1 year. No one can. They never redesigned Ampere at all; they designed both a 7nm TSMC and a 10nm Samsung Ampere GPU in case they couldn't manufacture their GPUs at TSMC. That's what really happened.
@Zorro33313 3 years ago
Fermi wasn't an accident. It was typical Nvidia at the time. Did you own a Tesla, for example? Nvidia is only good when it has no competitors. That's why it always tries to eliminate them. Nvidia's glory is a myth.
@perschistence2651 3 years ago
@@rayaneh5230 I think they did. Ampere is not the true Ampere; they just took Turing, increased the number of INT32 units and made some small tweaks. We'll see the true Ampere at 7nm under another name. Ampere is a very cheap architecture.
@aman113 3 years ago
"One of the saddest lessons of history is this: If we've been bamboozled long enough, we tend to reject any evidence of the bamboozle. We're no longer interested in finding out the truth. The bamboozle has captured us. It's simply too painful to acknowledge, even to ourselves, that we've been taken. Once you give a charlatan power over you, you almost never get it back." - Carl Sagan
@damasterpiece08 3 years ago
Basically cognitive dissonance
@Fearlessphil100 3 years ago
Humbleness is the only way to overcome something that has dented your pride.
@kennyj4366 3 years ago
@Amandeep Singh No truer words can be spoken. It is what it is; we fight to break free of the bamboozle, or relax and make ignorance as blissful as possible.
@kama2106 3 years ago
It is easier to deceive people than to convince them they were deceived. - M. Twain
@The__Mask 3 years ago
And what is even sadder is that you can say that about AMD and Nvidia and pretty much any company imaginable that sells products. They have all bamboozled us and will continue to bamboozle us; we are all sheep in this game of life.
@andoniliev3857 3 years ago
When I saw the 3080 day-one reviews I was very surprised how everyone just ignored the fact that it draws ~350 watts of power. Then came the 3090 with ~400W and still no one even mentioned it. But I knew someone would question it, and I knew it was going to be Jim @ Adored Technology and Vision. Thanks for your videos. Sadly common sense is not a disease spread by a virus worldwide...
@someone-wi4xl 3 years ago
To be honest, many of us don't really care about performance per watt. I'm one of those people; I care more about how much FPS I get, and at what cost. Couldn't care less about power consumption as long as the cooling is sufficient, which it is on the 3080/90. Rumors say the flagship AMD card is gonna be 300W, and I'm 100% OK with that if it means near-3080 performance at a cheaper price (price per FPS matters to me, nothing else).
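[Editor's note] The "price per FPS" yardstick this commenter describes is just a ratio, and it can be sketched in a few lines. The card names, prices and FPS figures below are illustrative placeholders, not measured benchmarks:

```python
# Rank hypothetical GPUs by price per average frame (lower is better).
# All prices and FPS figures are made-up placeholders, not benchmark data.
cards = {
    "card_a": {"price_usd": 699, "avg_fps": 100},
    "card_b": {"price_usd": 499, "avg_fps": 80},
    "card_c": {"price_usd": 1499, "avg_fps": 115},
}

def dollars_per_frame(card: dict) -> float:
    """Launch price divided by average FPS across a benchmark suite."""
    return card["price_usd"] / card["avg_fps"]

# Cheapest frames first: a modest card can beat a halo product on this metric.
for name in sorted(cards, key=lambda n: dollars_per_frame(cards[n])):
    print(f"{name}: ${dollars_per_frame(cards[name]):.2f} per FPS")
```

Dividing price by average FPS rewards whichever card delivers frames most cheaply, which is exactly why halo cards like the 3090 score so poorly on this metric.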
@The_Noticer. 3 years ago
Your comment must've been buried beneath the comments screaming in praise, though, just like mine...
@memes4195 3 years ago
I don't think they ignored it so much as presumed that everyone and anyone who was going to be looking at those reviews knew the TDP was 350 and 400, because it was in the press video.
@dycedargselderbrother5353 3 years ago
@@someone-wi4xl I see a lot of people saying things along these lines, but I think a lot of them aren't understanding what it means to have hundreds of watts of heat dumped into what is probably a bedroom. We're getting into literal space heater territory, as many/most of those have a 500 watt setting.
@flyfaen1 3 years ago
@@dycedargselderbrother5353 Well, some of us live above the Arctic Circle, too :p
@bgtubber 3 years ago
Nvidia: The more we lie, the more you buy!
❌ 90% performance/watt increase compared to Turing
❌ RTX 3080 2x faster than RTX 2080
❌ Cards will be available on September 17th
@dracer35 3 years ago
That last one literally made me laugh out loud 😂
@DoomsdayR3sistance 3 years ago
@@dracer35 What do you mean? It is true. Cards available from scalpers for many times the RRP! Just look to the nearest dodgy auction website operated primarily by scammers and con artists!
@VovixMayhem 3 years ago
@@DoomsdayR3sistance I really hope that AMD can give some competition and make Nvidia lower the price. I hope they lose a lot of money for their greed.
@DoomsdayR3sistance 3 years ago
@@VovixMayhem I doubt AMD will make much impact on Nvidia here, and with what Nvidia has planned for Hopper, AMD needs to surpass its A game to compete. I do think AMD will make some gains in the mid-tier consumer-grade GPU market, but that's a small slice of a small piece of the pie.
@VovixMayhem 3 years ago
@@DoomsdayR3sistance You need to find the new XTX leaks; that can be much better than you think.
@AvroBellow 3 years ago
Jim, this has been the best video about Ampere that I've ever seen. I myself IMMEDIATELY tried to get people to come back to Earth by posting (on TechSpot) how nVidia had fooled them, with my post showing them nVidia's marketing and price progression. They pulled a switcheroo by changing the top Ti card from $649 to $800 to $1200 (now to $1500) while quietly moving the top GTX/RTX x80 card into the $800 slot that was occupied by the top Ti card for a time. It seems that nobody saw it except you and me.
@Olivia-W 3 years ago
My wallet sure sees it. The sub-$400 card market is sad. 2060 land here.
@nairjithus 3 years ago
Now this... This is an objective look at Ampere cards. A lot of reviewers are going gaga over Ampere while somewhat forgetting that Ampere seems "so good" because the previous gen was so bad. As always, great video Jim.
@MrAdhiSuryana 3 years ago
Look at the Gamers Nexus video memeing about the money that all reviewers get
@nrosko 3 years ago
I'm not sure it's that objective.
@nrosko 3 years ago
@@loranmorash8532 The way the video is presented is not objective. He compares a 2080 Ti to a 3090; they are not the same class. He should compare the 2080 vs the 3080. Also it looks like he averaged the resolutions to get performance; these days your resolution makes a difference, and the 3080 is for 4K. And then he ignores things like DLSS & RT when looking at value.
@ag687 3 years ago
@@loranmorash8532 Traditional metrics are useful for comparing across all the generations. But Nvidia is going in heavy on boosting the performance of their cards using AI. Considering a chunk of the performance loss is due to efforts to boost performance in new ways, AI deserves at least a mention. Without it, this is an incomplete picture of what's going on.
@eazen 3 years ago
@@nrosko Wrong. You sound awfully like a butthurt Nvidia fanboy who doesn't really understand the video. The 3090 is the successor to the 2080 Ti, hence the higher VRAM amount and even higher pricing, but comparable. The 3080 is the 2080 successor and nothing else, hence the 2 GB extra VRAM and same pricing. Why are people so blind to Nvidia's marketing lies? Because they are fanboys.
@MadJackChurchill1312 3 years ago
Steps of hype: 1) hear the rumours. 2) see the cooked Nvidia figures. 3) watch the glowing reviews. 4) listen to Jim pull you back down to earth. Thanks for this, mate.
@summushieremiasclarkson4700 3 years ago
Finally, someone said it. Comparing it to the 2080 Ti in terms of value is absurd; it was the first x80 Ti card that launched at 2x the MSRP of its predecessors. Had the 2080 Ti launched at the same price as the 1080 Ti and 980 Ti, it would have been much better value, and the 3080 a much less impressive margin over it at that price.
@dralord1307 3 years ago
If you take a 2080 Ti, unlock it, then run it at the same wattage as the 3080, it's only 5-7% behind in performance. The 3080 is a terrible, terrible card. It has been pushed to the very edge of the voltage curve just to get everything they could out of it. Nvidia never does this; makes you wonder what they are afraid of.
@KML3rd 3 years ago
@@dralord1307 Do you think the 3070 will be able to overclock a lot? I wonder what a 320W 3070 is capable of, considering the difference in TDP between the cards is 100W. The 3080 can barely overclock, and some models can't overclock at all and actually need to underclock or undervolt to run stable. If the 3070 can overclock (and by a lot), then the rumored 16GB version might have just killed the 3080.
@Heatranoveryou 3 years ago
@@dralord1307 What do you mean, afraid of? You said it yourself that the 3080 has basically no performance increase over the 2080 Ti. That's a problem regardless of competition. Nvidia still has to compete with itself.
@dralord1307 3 years ago
@@Heatranoveryou I don't think Nvidia would have cut their profit margin so much, or pushed these cards to the ragged edge the way they have, if they weren't afraid of something. They moved up the launch date of the 30 series even though they knew they were understocked. They didn't give AIBs enough time to test or validate their cards. They pushed back the 3070 launch so it launches the day after the RDNA2 announcement. What does all of that make you think they are worried about?
@jondonnelly4831 3 years ago
@@dralord1307 The 3080 was designed around the high wattage, with cooling and power delivery to match. The 2080 Ti was not. Sure, shunt mod and overclock the 2080 Ti to the hilt, put a waterblock on it. Heck, why not add a chiller too and try to match a 3080. Good luck.
@VovixMayhem 3 years ago
$1.4k for the top GPU? A few years ago $1.4k was the cost of a top PC xD Hoping AMD can mount a fair challenge.
@e2rqey 3 years ago
Yeah, I built my whole top-of-the-line PC for Crysis for like $1800, and that included the monitor as well. Now I'm building a new gaming PC and it's gonna cost me around $3500 (including the monitor), and that's if I only get an RTX 3080 10GB. I'm not going to spend double that for the 3090; it's just not worth it.
@Weissman111 3 years ago
Some people will pay anything for bragging rights.
@MrAdhiSuryana 3 years ago
Andy Parker And some YouTubers will slap it in your face even harder just for the money 💰
@basilthomas790 3 years ago
@@e2rqey No one is forcing you to buy a 3090 GPU!!
@JoaoVictor-hk7iv 3 years ago
@@basilthomas790 The problem is that there are people crazy enough to pay that amount, which drives the whole GPU market's prices up. While a few years ago I could get a 970 that performed the same as the last-gen top-of-the-line card for $330, now I'd have to pay $500 for that and get less VRAM to boot with the 3070.
@St0RM33 3 years ago
But Jensen knows how to sell a fridge to an Eskimo lol. Intel: "Write that down!!!"
@abaj006 3 years ago
At Intel, we know how we can sell a fridge to an Eskimo: cause global warming, and one day they will buy our fridge!
@CaveyMoth 3 years ago
Jensen: "Hire hitmen."
@Worgen4ik 3 years ago
When a single hitman is not enough?
@tomhsia4354 3 years ago
@@Worgen4ik Hire two, then plan a sequel.
@wargamingrefugee9065 3 years ago
@@tomhsia4354 But it's Jensen. A doubling doesn't equal two. :-P
@CaveyMoth 3 years ago
The more hitmen you buy, the more they die.
@tomhsia4354 3 years ago
5nm hitmen might just do it, though. Then again, they might get ripped to threads.
@osirisgolad 3 years ago
Statistics, remembering recent events and holding organisations responsible for bullshit just don't seem to be things our current society is very good at, Jim. Being tricked by propaganda, moving from target to target without actually holding anyone accountable, and wanting unnecessary luxuries, on the other hand: we've got those down to a tee.
@amehu 3 years ago
Word.
@TheySeeMeTrollen 3 years ago
On that perf/price graph you can CLEARLY see the cards that people sat on for years.
@Kynareth6 3 years ago
Why upgrade a 970 before next-gen games hit? Or why upgrade a 1070?
3 years ago
Somehow my gut feeling made me stop at the 1070 I currently use, and I'm glad to see it was the right call.
@LiveType 3 years ago
Pascal is the best gaming architecture the world has ever seen, and I'm pretty sure the industry will never see something like it again. RDNA2 does look pretty good, but I don't think it'll be quite as good as Pascal was.
@williambarr7408 3 years ago
A like from a fellow 1070 owner👍🏻🏴󠁧󠁢󠁳󠁣󠁴󠁿
@tragicevans4157 3 years ago
Better upgrade my 1060 to a used 1080 Ti.
@christianh4723 3 years ago
If the 1070 still meets your expectations/needs (1080p high refresh rate, or 1440p if you're gaming with it, I'm guessing), stick with it. If the 3060 comes in at around $400, it will probably be worth it from a price/performance standpoint, but at the end of the day there's really no reason to buy a new card (yet) if it's doing the job you want it to. Games are always becoming more demanding, though... we see hardly any games with the 1070 as the "minimum" requirement now, but I suspect that will change soon as the bar is raised by the new consoles' more capable hardware.
@mcneil7intercept 3 years ago
Competition is important.
@tyren818 3 years ago
Someone post this on r/nvidia and see how many seconds till it is removed.
@raulsantos6394 3 years ago
That's the evilest thing I can imagine
@SonGoku-97 3 years ago
r/nvidia ban any% speedrun
@tortugatech 3 years ago
@@SonGoku-97 HAHAHAHAHAHHAHAHHAHHAHA
@HeirofCarthage 3 years ago
This comment may seem weird to some. I have a good job and run a decent-sized gaming YouTube channel. I have plenty of money for any GPU I want to buy, and eagerly snatched up a 1080 Ti back at launch. It performed so well vs my old 970. Was very happy. After a couple of years I was bored with no change. So despite my better judgment screaming at me not to, I bought a 2080 Ti. Waste of time and money. At the same time I bought a 5700 XT for a PC build on my channel. It wasn't a perfect card, but it was cool to see how well my games ran for such a significant fraction of the cost. I use my 2080 Ti for recording videos for my YouTube channel because I already paid for it and it has superior performance. When I game for fun by myself, I game on my 5700 XT build. I guess my head gets more enjoyment out of solid performance that was far more affordable than the best performance at a price that still makes me feel stupid. All this said... I have been very suspicious of Nvidia since I screwed up and bought a 2080 Ti. Not keen to make the same mistake twice. I won't buy a 3080 or 3090. I may buy an RDNA2 card if it can beat my 2080 Ti at 1440p gaming. 350 watts is not desirable either. Even with central AC my room gets hot with a 2080 Ti. No way I am upping it to a 350W 3080.
@TheAmazingLPs 3 years ago
Oh hey, it's the other person that still plays Attila Total War. Hi.
@frequentfrenzied 3 years ago
I'm in the same boat. I upgraded from a 970 to a 1080 Ti and it was fantastic. I took one look at Turing and went *skip*. Absolutely no point in buying any of the RTX 2000 series if you already owned a 1080 Ti. The 3080 looked compelling at first, but the 350 watt TDP had me worried that I wouldn't even be able to turn my PC on during the summer for fear of being cooked alive. Then the real performance numbers and all of the driver issues came in, and now I can safely say that Ampere is another *skip*. Long live the 1080 Ti.
@TheySeeMeTrollen 3 years ago
@@D-Terminat0r GamersNexus measured 321W playing Total War at 4K, 314W at 1440p. Is 350W that far off? Edit: sorry, on my phone. That was for the 2080 Ti. The 3080 FE was 366W at 4K. 350 seems like a good estimate lol
@MikeDoesRandomThings 3 years ago
@@D-Terminat0r GamersNexus would like to have a word with you on that. Might want to check their benchmarks regarding the 3080 FE and the 3090, particularly at 4K gaming.
@Aereto 3 years ago
@@D-Terminat0r You may have to (re)inform yourself considering the Gamers Nexus findings. There are places where power hurts in the long run.
@f0rl 3 years ago
Oh Jim, every time you release a video I make myself a cup of coffee and just chill after work, listening to you.
@3800S1 3 years ago
This very much sums up my thoughts on the whole Ampere launch; I'm glad I wasn't the only one noticing this.
@defeqel6537 3 years ago
@Neil Leisenheimer Yeah, and while the 3070 looks like "great value" against the 2080 Ti, all the points mentioned in the video still apply, and the VRAM bottleneck will be even more severe. The only saving grace might be games moving to use DirectStorage and thus requiring fewer cached assets in VRAM.
@rayjk1431 3 years ago
The 3070/3080 really needed to be offered with 12/16GB. If you want to see what will happen, go look at the Fury X.
@slysithejuicegy 3 years ago
@@rayjk1431 But with GDDR6X that would really blow out the power budget.
@TheReferrer72 3 years ago
Did you guys not watch the Unreal 5 demo? Motherboard with PCIe 4/5 and a fast SSD.
@WayStedYou 3 years ago
Pascal and Maxwell vs Turing and Ampere. Imagine what we would have if AMD had had enough money to compete over the past 5 years.
@shayeladshayelad2416 3 years ago
It's good that two companies trade blows with each other. This gen and the next by AMD will be talked about for a long time. And I don't want Nvidia or Intel to fall, just to get back down to earth.
@snozzmcberry2366 3 years ago
AMD doesn't exist to get Nvidia to release better products.
@shayeladshayelad2416 3 years ago
Snozz McBerry Do you know another GPU company?
@jondonnelly4831 3 years ago
In some segments AMD has competed well; only drivers held them back. If two cards are similar in price and performance but one is stable and the other is not, well, the choice is easy. AMD is trying to get the drivers solid this time around with RDNA2 and should be competitive again in all segments. AMD still powers millions of consoles, which make the PC market look like a joke.
@fVNzO 3 years ago
Well, TSMC controls the market and therefore the performance. 8nm is, what, basically a half node from 16 to 7, so you could probably expect a pretty good jump in performance with an architecture that was specifically designed for rasterized gaming. The extra die space taken up by all the ML stuff is gonna take a hit independent of the node, of course.
@guyfawkes8873 3 years ago
Nah m8, you're way off. The new cards have a great performance uplift over last gen in that one game, with those settings that choke the performance of the card we made Digital Foundry compare it to due to VRAM limitations.
@s1mph0ny 3 years ago
nVidia: enacts the second-worst price/performance improvement in history. Everyone: OMG, nVidia does care!
@Sal3600 3 years ago
The RTX 3080 is actually the best price/perf card, bruh. This FUD needs to stop.
@ToTheGAMES 3 years ago
@@Sal3600 Have you not seen the video?
@hugobalbino2041 3 years ago
Yeah, right. That's why Jensen tries to lure Pascal owners into upgrading. Jensen: "My Pascalian friends, it is now safe to upgrade," when we all know that Ampere is worse...
@DaybreakPT 3 years ago
3090: a Titan-class GPU with none of the professional software support and at 25% more cost. Buy it today. (Wait, you can't?)
@AlValentyn 3 years ago
I still have a running GTX 580. Somehow Thermi still lives!
@brego129 3 years ago
People forget their history. Thank you for bringing FACTS back into the discussion, Jim.
@eazen 3 years ago
Forget? Most simply don't know it. Wannabe nerds with pitiful knowledge who pretend to be smart; way too many of those.
@brego129 3 years ago
@@eazen True, a lot of people are new to the space and are just ignorant of history and easily buy into marketing. I remember watching the announcement videos and all the "RIP AMD" comments after every marketing slide that has since been proven to be just that... marketing. Not truth. Lol.
@a1do255 3 years ago
It feels good when you learn a lot of interesting historical facts and the video is not even halfway through. Thank you for teaching a little bit of GPU history to the people who weren't able to experience it back then, Jim.
@Logical 3 years ago
The facts that people with actual brains already know, but most just don't want to know.
@Sal3600 3 years ago
Now tell me what the best price/perf card is.
@METALADIX 3 years ago
I'm still trying to figure out how Ampere manages to suck 100 more watts and only give about 8-15 more 4K FPS over the 2080 Ti.
@bb-sw6ur 3 years ago
@@Sal3600 Probably a Radeon RX 580
@LionelMessi-fu6wn 3 years ago
@What Yes. The 1080 Ti is just so good that it's almost undermining Nvidia's new cards in terms of value.
@Veloce87 3 years ago
Excellent video Jim; please do this for Radeon when RDNA2 comes out.
@The-Cat 3 years ago
Of course he will... I mean, the man practically has liquid silicon for blood running through his veins.
@kintustis 3 years ago
He'll just conveniently "forget" to lambast RTG over its pitiful attempts at graphics processing. But I suppose that's really it. It's just "pitiful"; not malicious, or greedy (not by comparison at least, and not until Navi 1).
@Veloce87 3 years ago
@@kintustis mommy mommy
@kintustis 3 years ago
@@Veloce87 what?
@felreizmeshinca7459 3 years ago
@@kintustis You must be new here. Welcome, though.
@F7GOS 3 years ago
I still can't believe how little I managed to pick up a 2080 Ti for after the Ampere announcement. Receipts in the box, and the seller spent over £1200 in Q1, just to let it go for just over a third of the purchase price 7 months on. That was the lightbulb moment for me. Targeting people who hold the things they buy in such little regard is a brutal stroke of genius from a business standpoint. Two ye olde sayings spring to mind: "The person who writes for fools is always sure of a large audience" and "A fool and his money are easily parted". The Nvidia business strategy brilliantly bakes these two sayings into a nice palatable slice of cake which the tech industry just gobbles up. The big question should really be when AMD will follow suit, rather than if Nvidia will change... there is a reason the last few Nvidia press conferences have been in a kitchen, after all...
@bb-sw6ur 3 years ago
I wonder what Adored's chart would look like if you actually aggregated the used market. My favorite part of the Ampere announcement was seeing the Turing prices drop like bricks.
@ToTheGAMES 3 years ago
The sceptic in me says AMD would be crazy not to do the same.
@dracer35 3 years ago
I did the same thing, but with Pascal. Waited until the 20 series released and bought a 1080 Ti with an EK waterblock included for $400. I'll probably do the same in the future, but for now the 1080 Ti is more than good enough for all the games I play. I won't be paying that Nvidia tax for every shiny new thing.
@MJ-uk6lu 3 years ago
AMD already does that with Ryzen CPUs.
@Doomx12345 3 years ago
If only the mainstream picked up on these facts... Thanks for posting this, Jim ✊
@Pixel_FX 3 years ago
Don't forget that Jensen said Ampere has "1.9x performance per watt". Pythagoras was rolling in his grave during that presentation 😂😂 Great work Jim, as always. Time to post this in some Nvidia fangirl groups.
@stevenbradford6138 3 years ago
But... but... but, it does have 90% better perf/watt... at a 120W load. Just don't look at that part.
@Time_Traveling_Lesbian 3 years ago
That'll be an insane boon for mobile GPUs, so it's not all that bad. A 90% perf/watt improvement in a 120W laptop GPU will lead to some good laptop designs with Ryzen 5000.
@Pixel_FX 3 years ago
@@stevenbradford6138 So you want 3080 buyers to undervolt and underclock into the 120W range? XD Ain't nobody gonna do that. It might do well on mobile. Then again, AMD GPUs will have ~30% higher transistor density, meaning smaller dies and less power. It's not gonna be easy for them this time.
@stevenbradford6138 3 years ago
@@Pixel_FX I would actually prefer if people didn't buy these cards, as they are junk. I was making a joke based on the fact that Nvidia's launch presentation showed a graph saying the new cards are 90% more efficient when consuming 120W. Nvidia was using statistics to make a massively misleading statement.
@Pixel_FX 3 years ago
@@stevenbradford6138 Yeah, they are junk. That presentation was so misleading. The sad thing is people are still worshiping Jensen. I can't wait to see them become an even bigger joke next month. At least it's slightly better than Turing when you compare CUDA core count vs performance scaling; Turing had horrible scaling. The best scaling I've seen yet is on Pascal and RDNA1, almost linear performance gain with CUDA/SM count. So I hope this 80 CU part will really scale well. We'll see soon enough. Btw, the only good thing I saw about Ampere cards is their compute/rendering performance: almost twice as fast as the 2080 Ti.
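[Editor's note] The joke in this thread hinges on where on the power curve you measure perf/watt. A toy model (made-up curves and constants, not real GPU data) shows how comparing two cards capped at 120W can produce a much rosier ratio than comparing stock card against stock card:

```python
# Toy model of why a perf/watt claim depends on the comparison point.
# The curve shape and constants are illustrative placeholders, not
# measurements of any real GPU.

def perf(power_w: float, k: float, exponent: float = 0.5) -> float:
    """Performance rises with diminishing returns as power grows."""
    return k * power_w ** exponent

OLD_K, NEW_K = 10.0, 13.0  # hypothetical efficiency constants, old vs new gen

comparisons = [
    ("iso-power @ 120W", 120, 120),  # both cards capped at the same wattage
    ("stock vs stock", 250, 320),    # each card at its own stock power limit
]

for label, old_w, new_w in comparisons:
    # Ratio of (new perf per watt) to (old perf per watt).
    ratio = (perf(new_w, NEW_K) / new_w) / (perf(old_w, OLD_K) / old_w)
    print(f"{label}: new gen perf/W = {ratio:.2f}x the old gen")
```

With diminishing returns, the capped 120W comparison always flatters the new card relative to the stock-vs-stock comparison, because the higher-power card pays the efficiency penalty of the top of its curve.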
@Nico712 3 years ago
To be fair, one of the reasons Turing and Ampere compare badly with previous generations is the presence of RT cores and tensor cores. If that die space had been used for normal rasterization hardware, the difference from previous generations wouldn't have been that bad; on the other hand, one could say that Turing and Ampere are the biggest generational leaps in ray-tracing performance. In order to innovate, someone has to be first, and Nvidia decided to sacrifice rasterization performance for ray-tracing acceleration. I'm not saying that Nvidia doesn't deserve criticism for their marketing and high price points :)
@markjacobs1086 3 years ago
I'm not so certain that they actually sacrificed anything when it comes to rasterisation relative to CUDA core count. There's a little something called diminishing returns, where things don't really scale all that well at the top end. Looking at 1080p performance, I'm suspicious enough to say that we might've already hit a cap there and that smarter approaches are necessary instead of just increasing quantities of certain parts. I'm also fairly sure that companies like Nvidia & AMD are unlikely to both be shooting themselves in the foot...
@3lDuK371 3 years ago
Makes me sad that Jim gets so much flack when he is one of the few people shining the light of truth on hardware manufacturers. Love your work, Jim.
@mokahless 3 years ago
You are confusing "flack" with "slack." If they were giving him "slack" it would be a good thing, although using the word in that manner would still be dubious.
@skunch 3 years ago
I think you mean "flack". The world could use a lot more slack right now.
@bungieborris9111 3 years ago
He gets it due to his Navi predictions, which I don't understand; it's not like he had a crystal ball. No one really knew how Navi was gonna perform.
@r3dleh107 3 years ago
indeed
@r3dleh107 3 years ago
and he puts his neck out there...
@ashw6015 3 years ago
This is precisely why we must keep calm and await the Queen's speech on the 28th.
@hihna2011 3 years ago
You are such a simp...
@imadecoy. 3 years ago
The 3070 was delayed until the 29th anyway.
@thelegendaryklobb2879 3 years ago
She gives off more of a cool-aunt vibe to me.
@nonrumor 3 years ago
cringe
@ashw6015 3 years ago
@@hihna2011 Yes, either that, or it was wordplay and I don't actually think she's a queen...
@nissejacke4211
@nissejacke4211 3 жыл бұрын
Lots of people saying in forums that they hope AMD delivers a good new product, and then still buy nVidia, Somehow it doesn't matter kind of products AMD has to compete with. Only way to make prices go down is to not buy the over the top expensive cards. Vote with your wallets! But, people jump all over new nVidia gpus, and new iPhones in a way that it seems like they cannot continue to live if they don't have the newest, fastest and "best" thing, no matter the cost. Great video by the way!
@clblanchard08
@clblanchard08 3 жыл бұрын
Those people just want to buy Nvidia at a reasonable cost. The only way they can do that is if AMD competes. I personally don't care about Nvidia anymore. AMD can compete just fine in the price bracket I'm willing to spend in and they have awesome open source drivers. As long as they keep doing that they'll have a customer for life.
@ironoverheat465
@ironoverheat465 3 жыл бұрын
"Vote with your wallets" is the most stupid nonsensical argument ever. Don't you realize that if the competition follows suit with the bad choices the more successful company made then you're left with no choice? Do you know why in 2020 you are stuck with hideous non-standard sized slabs of glass with either literal holes or notches devoid of the essential universal audio port and with glued-in batteries and 1000$+ price tags?
@rayjk1431
@rayjk1431 3 жыл бұрын
Got a 5600xt for my brother's rig. Did not see a reason to go for 2060 since RTX performance would be ass at that level.
@theexmann
@theexmann 3 жыл бұрын
That is true so far, but we'll see after the newest AMD GPUs are announced. Don't forget that the same situation you're describing happened with Intel CPUs vs AMD CPUs, but it's very different now. AMD has continued to take marketshare from Intel since the introduction of Ryzen. RDNA and CDNA will ultimately have the same effect on Nvidia and how consumers see both including AMD being a true alternative for the best GPU technology at the best price.
@adilakif0061
@adilakif0061 3 жыл бұрын
I use AMD now. I can afford to spend $1000 on a new GPU. AMD has driver issues. I won't pay more than $300 for a GPU that doesn't even work properly. That is why I am done with AMD. Not because I like 2xJensen. I will probably buy a used 20 series card even tho I can afford a 3080, I don't want to reward scummy behavior.
@donutsteel8878
@donutsteel8878 3 жыл бұрын
very nice video, the bar graphs with the percentages came right as I was thinking to myself that hearing all these numbers became a little confusing. perfectly timed.
@Brodda-Syd
@Brodda-Syd 3 жыл бұрын
Great Video! I love the in-depth history lessons on GPU's. Keep em coming...
@buffdogg
@buffdogg 3 жыл бұрын
Every year there is another video that makes me feel great about getting and still running the 1080Ti
@rayjk1431
@rayjk1431 3 жыл бұрын
Still rocking my 1070 here. Definitely got my money's worth at 1440p60. Will continue to for at least another generation even if I have to reduce settings.
@jpgarcia7892
@jpgarcia7892 3 жыл бұрын
Keep it up, Jim. I always learn new things from your content
@davidhall2151
@davidhall2151 3 жыл бұрын
Another great video, clear, precise and unbiased. Thank you Jim.
@WattPerformance
@WattPerformance 3 жыл бұрын
Ahh as always; even though you have to wait a bit for your postings, you know it’s due to a good reason and again you prove that the wait is worth it. Incredibly concise and clear video with, as always, great back view mirror examples. Big 👍 😀
@theotherchannel2279
@theotherchannel2279 3 жыл бұрын
I am just letting you know it has been months since any of your videos came up on my KZbin home screen... I do click on them myself, but I have to find you in my long list of subs!
@DanLMH
@DanLMH 3 жыл бұрын
Interesting, this video was position one for me on YT home!
@ColtaineCrows
@ColtaineCrows 3 жыл бұрын
kzbin.info/subscriptions
@JustNel
@JustNel 3 жыл бұрын
I loved your video on the architecture of Ampere using the whitepaper. Great job explaining it in a way that everyone can understand.
@eYeZiQ
@eYeZiQ 3 жыл бұрын
That was an absolutely fantastic breakdown, thank you so much for all your in-depth analytics! It's very nice to be able to show something like this to my skeptical friends, or at least those w/ an attention span...
@deekdouglas3055
@deekdouglas3055 3 жыл бұрын
Brilliant! As you know I was doing my own research but really struggling to get my head around this generation. It never occurred to me to focus on the 3090, but after watching it makes perfect sense now. Thank you for the top quality analysis; this video feels like closure to me as it answers so many questions I've had and draws a conclusion I can understand in my own way. Thanks again for all the hard work. It was a surprise it turned out as bad as it did, but that's why I subbed: you never run from the facts.
@Safetytrousers
@Safetytrousers 3 жыл бұрын
It was clear to me the 3090 was the dangler for the people who want to buy the best gaming GPU, never minding the cost, and in that respect it has replaced the 2080 Ti. I have a 2080 Ti and I don't in any way regret it, but I won't be getting a 3090. But Nvidia have not at any point suggested buying the 3090 for 4K gaming or below.
@deekdouglas3055
@deekdouglas3055 3 жыл бұрын
@@Safetytrousers As long as you're happy with your purchase, that's the main thing. Hate to see people waste their hard-earned money on things they end up disappointed with, so it's nice to hear from someone who knew what they were buying and is happy with that. I don't agree with the 2080 Ti, but who am I to judge, especially when you clearly know what you're doing for your own circumstances.
@woody12008
@woody12008 3 жыл бұрын
Jim consistently dropping bombs over here. Your videos are always worth the wait, thanks again.
@theexmann
@theexmann 3 жыл бұрын
Great detective work and a very useful comparative compilation of data. Another great video. Thanks.
@604RPM
@604RPM 3 жыл бұрын
Jim! Thanks for releasing this after I was asleep. I really needed the rest.
@Erowens98
@Erowens98 3 жыл бұрын
The "largest performance bump in history" claim becomes true if you account for compounding. Let's say the GTX 580 has 100 performance units. Then the 780 bumped up to 200 performance units. +100 Then the 980 Ti bumped up to 286 performance units. +86 Then the 1080 Ti bumped up to 529 performance units. +243 Then the 2080 Ti bumped up to 735 performance units. +206 Then the 3090 bumped up to 1074 performance units. +339 So the 30 series delivered the largest increase in performance units, if you calculate it this way. Which is probably how Nvidia arrived at their claim.
@rednammoc
@rednammoc 3 жыл бұрын
It is also an absurd way to measure things, and not consistent with what the user sees in performance. Let's say they all bumped the same +100 in perf, except the 3090 gives +101. The +100 would give the 780 double (i.e. 200%) the FPS compared to the previous gen. The +101 of the 3090 would give 601/500 = 120.2% of the FPS compared to previous gen, but no user is going to accept this 20% FPS bump as the biggest ever increase in performance!
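The two readings of the same numbers in this exchange can be sketched in a few lines of Python. The "performance unit" figures are the ones from the comment above (normalised to GTX 580 = 100), not real benchmark data:

```python
# Figures from the comment above: "performance units" normalised so the
# GTX 580 = 100. Illustrative only, not measured benchmarks.
cards = [("GTX 580", 100), ("GTX 780", 200), ("GTX 980 Ti", 286),
         ("GTX 1080 Ti", 529), ("RTX 2080 Ti", 735), ("RTX 3090", 1074)]

for (prev_name, prev), (name, perf) in zip(cards, cards[1:]):
    absolute = perf - prev              # compounding view: Nvidia's framing
    relative = (perf / prev - 1) * 100  # what the user actually perceives
    print(f"{name}: +{absolute} units, +{relative:.0f}% over {prev_name}")
```

By the absolute measure the 3090's +339 units is indeed the biggest jump; by the relative measure its +46% trails Pascal's +85%, which is the whole disagreement in this thread.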
@sergiomadureira9985
@sergiomadureira9985 3 жыл бұрын
‘Ampere is the biggest generational leap in performance ever’ *Jensen 1st of September*
@Recon801
@Recon801 3 жыл бұрын
I mean, it is. You can't just ignore DLSS 2.0. Nvidia releases a new technology that gives you massive increases in performance, and you turn it off and say "look, it's only 48% better". How is that a fair comparison at all?
@rdmz135
@rdmz135 3 жыл бұрын
@@Recon801 Oh, but we can ignore it, since even after 2 damn years there are only 14 games supporting it. And even of those, most use the terrible DLSS 1.0. It may as well be irrelevant.
@imadecoy.
@imadecoy. 3 жыл бұрын
@@Recon801 Look who sucked up all that marketing speak as gospel.
@Recon801
@Recon801 3 жыл бұрын
​@@rdmz135 Firstly, there are way more than 14 games that support DLSS. Secondly, you only need DLSS for the few AAA games that are graphically intense enough to need its help, and all the AAA games releasing do support it. If I can enable DLSS 2 and get 30 or 50% more fps at 2K res and visually the game looks the same, that is a BIG DEAL. My point is Adored shouldn't make a video essentially saying Nvidia is lying and then exclude a core piece of Nvidia's claim in the first place. Imagine Nvidia saying you're getting the greatest increase to memory bandwidth because of new compression and faster memory, and then a youtuber disables the new compression and goes "look, Nvidia is lying again".
@rdmz135
@rdmz135 3 жыл бұрын
@@Recon801 There are only 14 I could find. Even if it was 20, that's far too little to be relevant. If it was a globally available feature I would agree with you, but it is not, and it doesn't look like it ever will be.
@tech6294
@tech6294 3 жыл бұрын
Amazing video as always! Thoroughly enjoyed it. ;)
@jekae61
@jekae61 3 жыл бұрын
great video Jim! thank you!
@DMFPERFORMANCE
@DMFPERFORMANCE 3 жыл бұрын
Jim's contempt for Nvidia and Intel is legendary. Try calculating the ray tracing performance per frame or watt versus the previous gen; you'll probably come to the reverse conclusion. The RT/Tensor cores and GDDR6X use plenty of power.
@dbjungle
@dbjungle 3 жыл бұрын
Gamers are cringing at buying based on value rather than e-clout.
@DaybreakPT
@DaybreakPT 3 жыл бұрын
Everyone else is cringing at buying stuff for the "e-clout", whatever that translates to in real life.
@killwize
@killwize 3 жыл бұрын
​@@DaybreakPT Our whole society is influenced by electronics, so why hold this particular case to an arbitrarily "higher" standard? What moral imperative exists behooving us to paint this example as negative, especially given the somewhat esoteric nature and questioned legitimacy of AdoredTV's arguments? Bananas.
@poejjafers
@poejjafers 3 жыл бұрын
Great analysis as always!
@flyrcnetwork
@flyrcnetwork 3 жыл бұрын
Thanks for the great video mate!
@davida8119
@davida8119 3 жыл бұрын
What it also shows is that when Nvidia feels there is no competition, they charge a huge premium for their products. Hopefully AMD has finally got something that can compete at the high end.
@memnarch129
@memnarch129 3 жыл бұрын
Rumors are it MAY be close to or compete with the 3090. Even if it were 20% slower, the rumored price of $500 or less would absolutely smash Nvidia's 30 line.
@Skylancer727
@Skylancer727 3 жыл бұрын
@@memnarch129 God I hope so. Knowing AMD, they always find some way to bungle it up. Hopefully Lisa has put in place a policy that if this fails there will be a reconstruction of leadership at Radeon Technologies, discouraging them from half-assing it.
@BodybuildingSteve
@BodybuildingSteve 3 жыл бұрын
Card performance aside, the one thing that swings me towards Nvidia is DLSS. I honestly couldn't believe the difference in Control with it on and off. That said, I've got a 2070 Super and it plays everything I play perfectly, so I'm happy to wait for a little while to upgrade.
@Skylancer727
@Skylancer727 3 жыл бұрын
@@BodybuildingSteve You can just add ReShade to a game and add sharpening, or even just use Nvidia's Freestyle and lower the resolution to 1400-1800p, and you will get about the same image quality with the same performance but without the extra temporal artifacts of DLSS. I think DLSS is way overblown. In Death Stranding it makes moving objects smear and adds dithering behind things that move quickly. Plus only 3 games support DLSS 2.0: Control, Wolfenstein Youngblood (a game nobody should play), and Death Stranding, which also has FidelityFX; switching back and forth they look basically identical, except DLSS makes the image look terrible in the rain, which is a signature feature of Death Stranding. DLSS will die when Microsoft releases DirectML packed straight into DirectX 12. They don't stand a chance. Proprietary technology always goes away. DLSS will only be used in future games with Nvidia GameWorks, like Final Fantasy 15, Assassin's Creed 4, and Cyberpunk. Is the technology worth it for 1 in every 30 games?
@reflector36
@reflector36 3 жыл бұрын
That's what I'm waiting for: competition. Maybe not ultra over-the-top (3090) competition, but a 3080 competitor at a hopefully lower price.
@F1L3GIT
@F1L3GIT 3 жыл бұрын
Let’s hope AMD deliver on the 28th !
@dralord1307
@dralord1307 3 жыл бұрын
Did you see their new 3070 launch date :D "seems someone is scared of what RDNA2 will be"
@isaacx593
@isaacx593 3 жыл бұрын
Jim is over AMD. Good or bad, he doesn't care.
@Skylancer727
@Skylancer727 3 жыл бұрын
@@dralord1307 It looks more like Nvidia wants to outshine AMD and be talked about more on AMD's announcement day. Let's be honest here: Nvidia has the market share, so they will outshine them in the news even if Navi is superior.
@F1L3GIT
@F1L3GIT 3 жыл бұрын
Just having some decent competition at the high end will benefit all. AMD or Nvidia, not fussed; power draw is in 2nd place after price vs FPS, so here's hoping AMD deliver this time.
@dralord1307
@dralord1307 3 жыл бұрын
@@Skylancer727 Exactly. Nvidia seems to be more "afraid" of what AMD will launch this time around. So cutting AMD as hard and fast as they can is Nvidia's goal. I mean its super smart marketing. Nvidia is the best when it comes to marketing shenanigans.
@MLee-vcrr
@MLee-vcrr 3 жыл бұрын
Finally a video that lays out what I've known for years...EXCELLENT presentation!!!
@EliteArab89
@EliteArab89 3 жыл бұрын
eye opening! great work! thank you for your honesty!
@blakon621
@blakon621 3 жыл бұрын
What a time to be alive. If you want hot, power-hungry machines you go with Intel/Nvidia, but if you want cooler, more power-efficient ones you go AMD. Why isn't this the narrative?
@reneviera4884
@reneviera4884 3 жыл бұрын
AMAZING video. I'm still using my 1080 Ti; all of Nvidia's launches thus far have been a disappointment. Thank you for such detailed information! 👍
@MantaProx2
@MantaProx2 3 жыл бұрын
Well put together. That puts these cards into perspective.
@captainfeathersword1131
@captainfeathersword1131 3 жыл бұрын
Awesome video, subbed!
@someone-wi4xl
@someone-wi4xl 3 жыл бұрын
I wish you had added the 3080 as well, the same way you added both the 780 and 780 Ti, so we could see the value of the 3080 compared to the last 2 generations.
@Jognt
@Jognt 3 жыл бұрын
He added the 780 because that was the top card in the line for quite a while. Top cards are never the best value in their line, so comparing a top-of-the-line card with a non-top-of-the-line card is apples/oranges. If the 3090 hadn't been released, you could be sure the 3080 would be even more ridiculously priced.
@aberkae
@aberkae 3 жыл бұрын
Who remembers what happened after the Radeon 4770, when AMD had the node advantage? Fermi! Ampere is definitely Fermi 2.0. Big Navi will have the advantage of a 22-month-mature node!
@M00_be-r
@M00_be-r 3 жыл бұрын
Jim you hit the nail on the head! thanks for another interesting vid.
@jbbresers
@jbbresers 3 жыл бұрын
Great video Jim
@CharcharoExplorer
@CharcharoExplorer 3 жыл бұрын
The problem here is: 4K gaming, 4K modding and VR need VRAM. 10GB is an issue for modders; that is a fact too. The 2080 Ti lacks HDMI 2.1 and cannot drive a 4K OLED HDR TV well. It's also not fast enough. The 3090 is not fast enough either, but it's still an improvement and has enough VRAM. I will wait for RDNA2 for sure, but if not... a 3080 20GB it will be.
@Skylancer727
@Skylancer727 3 жыл бұрын
You don't need much VRAM for VR; I don't know why people keep claiming this. Half-Life: Alyx is the most demanding VR game, and on the Valve Index it only uses 7GB. That means you'll only pass 10GB if you use an HP Reverb with its 2160×2160-per-eye displays. But good luck finding any GPU that can run that.
@The-Cat
@The-Cat 3 жыл бұрын
@@GeorgePerakis Yoda him i hear speak when you❤️️ said that
@alistermunro7090
@alistermunro7090 3 жыл бұрын
You will want a new GPU by the time you need more than 10GB. Sure you can run silly hi res texture packs to prove a point. I have tried them myself, but they are not worth the download time and space taken up.
@Skylancer727
@Skylancer727 3 жыл бұрын
@@GeorgePerakis It's a VR game; they aren't made for the consoles. The only one they can be on is the PS5. Most VR games have been exclusively on PC, like Half-Life: Alyx. That makes it basically next gen, just lacking ray tracing, which is too slow for the 90fps needed for VR. What we got is what we will have for the next 2-4 years. Most VR games barely use 4GB of VRAM. The consoles have nothing to do with it, and Nvidia releasing the main 3080 with only 10GB of VRAM and the 3070 with only 8GB means VR games have no headroom to push up. It's based on PC hardware, not the consoles. Why would the consoles affect anything with VR?
@CharcharoExplorer
@CharcharoExplorer 3 жыл бұрын
@@alistermunro7090 That is for me to decide, not someone else. I like my 4K mods and texture packs. I want future VR titles on future VR headsets to max textures. This "10GB is fine" is getting ridiculous. 8GB isn't fine even at 1440p if you manually set settings to Ultra in games like Wolfenstein 2. I KNOW that, since I have a 2080 Super, had a 1440p monitor, and it DID stutter.
@cIappo896
@cIappo896 3 жыл бұрын
Is anyone surprised? Every time Nvidia feels like they should brag about increases of ridiculous proportions, it's usually the other way around. Ampere is awful. And people fail to learn from the past. Fairly fuckin consistently too
@jondonnelly4831
@jondonnelly4831 3 жыл бұрын
2080 Ti-smashing performance for half the cost. I know someone who has one and he is delighted with his 3080. I wish I had one as well, as it obliterates my 1080 Ti, which also cost me more. Just impossible to find, and I refuse to pre-order.
@Senzorei
@Senzorei 3 жыл бұрын
@@jondonnelly4831 Yeah, because Turing was literally one of the worst generations for Nvidia ever. Against that backdrop, it's not hard to make it seem like you've made a huge leap when in reality it's just a sham to squeeze the piggy bank even harder.
@craigdavies8866
@craigdavies8866 3 жыл бұрын
@@Senzorei So true. Nvidia can release whatever they like and people will eat it up. Remember AMD being known for power draw and heat? Intel fans still believe it about Ryzen. Now with the Radeon group, IF AMD matches the same performance at around 50 to 100 watts less, Nvidia fans will still believe "hot and loud".
@Tapion3388
@Tapion3388 3 жыл бұрын
I have to say it is really refreshing to see such a well researched fact based video when most other youtube videos about this topic are more or less clickbait or tailored to people who don't have a clue... Keep up the good work Jim!
@TheStin777
@TheStin777 3 жыл бұрын
Thanks for another great video!!
@ryanrazer1
@ryanrazer1 3 жыл бұрын
2 of the best videos you've put out lately, back-to-back? Jim, you're on fire. It's great to get perspective like this on a silver platter; I'd never do so much research about GPU metrics relative to previous generations. #PerspectiveMatters Though I'm wondering how Jensen comes to conclusions like "biggest performance leaps ever..." when they can be fact-checked. Sounds like he has something in common with the orange-head from the US.
@eazen
@eazen 3 жыл бұрын
Jensen thinks he has a lot of stupid fanboys, and he's kinda right about that, so he abuses his power. But Nvidia was always like this: first they destroyed 3dfx with marketing lies, then they lied about Radeon and nearly broke them as well. Then they lied to their trusty customers and fanboys to keep selling more GPUs.
@stopfdenpc
@stopfdenpc 3 жыл бұрын
I usually only comment when I have something to add. But this video was so great, I just wanted to say that :P
@amorgan20111
@amorgan20111 3 жыл бұрын
This is why I give you my money every month! You make the videos I need to make the proper arguments and recommendations. Keep doing an excellent job!
@perschistence2651
@perschistence2651 3 жыл бұрын
Very nice analysis!
@brego129
@brego129 3 жыл бұрын
Hey Jim, I don't do Twitter so I'll respond here. I was OK with the $50 price hikes, but the lack of a 5700X does hike the price of entry into 8c/16t by more like $120 vs the 3700X.
@xwing8675
@xwing8675 3 жыл бұрын
This is a cool idea, I'd like to see the same thing for AMD.
@kylekermgard
@kylekermgard 3 жыл бұрын
Finally been waiting for your follow-up
@jlirving
@jlirving 3 жыл бұрын
Real gold. I love these graphics card review videos!! It's really refreshing to see where we came from, especially for those who got into top-end gaming a generation or two into the modern era.
@vincentbrandon7236
@vincentbrandon7236 3 жыл бұрын
Great content. I would suggest inflation-adjusting dollar amounts compared across time, though. Comparing prices in the same year is fine, but across a decade there is a HUGE difference. Companies must adjust prices for inflation, as the majority of their paying customers also experience wage inflation at or greater than CPI. Only Costco can sell the $1.50 hot dog forever.
@N0N0111
@N0N0111 3 жыл бұрын
This breakdown of the historical facts shows once again what Nvidia's timeline has been and where we are heading. I hope more people get this recommended right now, because there is huge drama going on in the gamer community, and this video resolves it! @AdoredTV well done!
@felreizmeshinca7459
@felreizmeshinca7459 3 жыл бұрын
Thank you Jim. There are not many people like us who care about an objective view of products. Most think as consumers do, while we struggle to convey what a product really is, let alone what the company is. I've followed you for years, and I hope for many more. The world has been pretty crazy lately; take care, Jim. Best wishes from Malaysia.
@dlh242
@dlh242 3 жыл бұрын
I miss these videos... great work
@The-Cat
@The-Cat 3 жыл бұрын
I missed these comments... great post!
@Dysphoricsmile
@Dysphoricsmile 3 жыл бұрын
I totally get what you are saying here Jim, but in price/performance Ampere looks WAY better using the 3080 instead of the insanely overpriced RTX 3090. At best the 3090 gets an extra 10% FPS over the 3080, but it costs over twice as much. (Of course this would have made the gen-on-gen improvement look absolutely tiny compared to what we were used to as recently as Pascal.) As for perf per watt and such, the 3080 is mostly as bad as the 3090. Samsung 8N is essentially Samsung 10nm, which is only a half-node shrink; even TSMC 16nm FinFET has a frequency advantage over Samsung 8nm!
@Jimster481
@Jimster481 3 жыл бұрын
Well, yes and no, because the 3080 isn't available for MSRP; more like $800+. And yes, while it's still "more attractive" at that price, it's not as good as you think.
@adityaippili6734
@adityaippili6734 3 жыл бұрын
kzbin.info/www/bejne/qWOWeKmYhsusgqM
@Dysphoricsmile
@Dysphoricsmile 3 жыл бұрын
@@Jimster481 The 3080 is definitely not a massive upgrade over the 2080 Ti, and the lack of MSRP availability is from scalping. My whole point was simply that the 3080 is a FAR better perf/$ buy than the 3090. Given that the 3090 is NOT a real Titan, it is even more stupidly priced, though cheaper than a "true" Titan.
@Jimster481
@Jimster481 3 жыл бұрын
@@Dysphoricsmile I agree, but 40-50% over last gen is a pretty big gain these days. However, it's really this way because Turing sucked so much.
@Dysphoricsmile
@Dysphoricsmile 3 жыл бұрын
@@Jimster481 Agree 100%. Comparing the 3080 to the 2080 is a HUGE upgrade, but not the 80%+ Nvidia showed off (at least in games). Anymore, the best conversations I have about tech are all here. Sure, dumb people watch AdoredTV as well, as his most recent video highlights, and Jim probably lets them get under his skin a bit too much. I get into heated arguments with morons as well, but I usually just let it be after they start repeating stupid shit.
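A rough perf-per-dollar check of this thread's claim, taking the launch MSRPs ($699 / $1499) and the roughly 10% FPS gap mentioned above at face value (street prices were far higher at launch, so treat this as illustrative):

```python
# Launch MSRPs plus a normalised FPS figure; the ~10% gap is the thread's
# estimate, not a measured benchmark.
msrp = {"RTX 3080": 699, "RTX 3090": 1499}
fps = {"RTX 3080": 100, "RTX 3090": 110}  # 3090 ~10% faster than 3080

for card in msrp:
    print(f"{card}: {fps[card] / msrp[card] * 100:.1f} fps per $100")
```

On those assumptions the 3080 delivers roughly twice the frames per dollar (about 14.3 vs 7.3 fps per $100), which is why the 3080, not the 3090, looks like the value pick within Ampere.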
@ottley32
@ottley32 3 жыл бұрын
“Wait for reviews they said”.....they were right! Great work Jim!
@Recon801
@Recon801 3 жыл бұрын
You do realize he uses non-DLSS 2.0 performance for Ampere? It's essentially lying. Imagine if Ford released a car with a new technology that doubled your mpg, and the reviewer then turned it off and said "look, the mpg isn't that much better"...
@briceni6136
@briceni6136 3 жыл бұрын
@@Recon801 That reviewer might have a point if he could only use that new tech on a tiny percentage of the available roads and didn't much like limiting himself to them.
@TrueThanny
@TrueThanny 3 жыл бұрын
@@Recon801 DLSS is what's lying. If I sold you a 10" television and a magnifying glass, and said the magnifying glass made it a 60" television, what would you think of my claim? DLSS is rendering at a lower resolution and upscaling the result with interpolated (i.e. fictional) information. It's a temporal algorithm that causes flickering artifacts. Rendering at 4K is _not_ the same as using DLSS to produce a 4K output image. And quite beyond that, it's only supported in a handful of games.
@Recon801
@Recon801 3 жыл бұрын
@@TrueThanny I disagree. DLSS is effectively no different than compression. Yes, you are using a "shortcut", but if the end result is "better" to the human eye then it's worth it. DLSS 2.0 doesn't seem to have the issues DLSS 1 had with artifacts, and will absolutely be available on new games moving forward.
@doomergirl6836
@doomergirl6836 3 жыл бұрын
@@Recon801 Your comparison is off. A more accurate analogy would be a car manufacturer releasing a car that increases its performance metrics while maintaining the same fuel consumption, but only on a handful of roads that have been pre-approved and mapped by the manufacturer. On every other road the car still performs in line with what you would expect from its motorisation.
@toxicityD
@toxicityD 3 жыл бұрын
Great perspective and context on Ampere!
@FuriosoBC
@FuriosoBC 3 жыл бұрын
Prove value. That's the one thing I'm looking for in my next upgrade. I appreciate your in-depth, impartial, and critical analysis as the PC world goes mad over the (literally) hot new thing. Thank you!
@mariodrv
@mariodrv 3 жыл бұрын
Ohohoho! I've been waiting for this one...
@sc-wp5gz
@sc-wp5gz 3 жыл бұрын
Hooray
@briceni6136
@briceni6136 3 жыл бұрын
Ohohoho.. Green giant Got a reality slap.
@nelsonbutcher1
@nelsonbutcher1 3 жыл бұрын
Good summary. Been saying ampere only looks good cuz of how bad Turing was.
@hugobalbino2041
@hugobalbino2041 3 жыл бұрын
Even Pascal was better in performance per watt...
@myvpn97
@myvpn97 3 жыл бұрын
Yup, which is why I always skip at least one generation. The 3080 looks phenomenal to me! And probably to lots of people, since so few bought into the 2000 series.
@mmhs04
@mmhs04 3 жыл бұрын
Love the graphs at 10:40! They really put it in context.
@flyingfajitas
@flyingfajitas 3 жыл бұрын
Jim's breakdowns are so interesting.
@SireDragonChester
@SireDragonChester 3 жыл бұрын
Once again, Jim, you've shown us the truth, with passion and heart: just how the 3xxx series isn't the GPU gamers wanted. Almost everybody else in the tech community on KZbin is so in love with the 3080 (Linus included) that they ignore the real truth of what Ampere really is. The only ones that seem to be honest (of the ones I watch): you, Paul (RedGamingTech), Steve from Gamers Nexus, the Hardware Unboxed guys, maybe JayzTwoCents. Almost everybody else is so blinded, it's like they're in love for the first time with the 3080/3090. You go the extra mile with all the homework. Thank you, and keep up the good work.
@gordosimmins2727
@gordosimmins2727 3 жыл бұрын
Jim, the OverVolted podcast on iTunes needs attention. There's one episode (July 17) tagged as "Season 1" and all the rest "Untitled Season." It needs to be fixed for reasons. TIA
@debexe6514
@debexe6514 3 жыл бұрын
Nice presentation with the charts
@Lastguytom
@Lastguytom 3 жыл бұрын
nice video jim good going
@moldoveanu8
@moldoveanu8 3 жыл бұрын
Finally, someone saying what I've been saying since the benchmarks came in... how in the world are tech reviewers not stating these facts? This 3090 is a Fermi twin card. Wtf #Nvidia?
@ApocDevTeam
@ApocDevTeam 3 жыл бұрын
I don't think Nvidia sees the 3090 as the flagship, Jensen specifically called the 3080 their flagship in the launch video.
@colesym84
@colesym84 3 жыл бұрын
That is for the same reason Jim pointed out, that people think it is good value over the 2080 ti.
@Safetytrousers
@Safetytrousers 3 жыл бұрын
@@colesym84 It is though, isn't it?
@doomergirl6836
@doomergirl6836 3 жыл бұрын
@@Safetytrousers did you even watch the video?
@Safetytrousers
@Safetytrousers 3 жыл бұрын
@@doomergirl6836 Yes. Over (meaning "in comparison to") the 2080 Ti, something significantly cheaper and notably better performing is better value. Whether you think that new thing is generally good value is your own judgement.
@aurora7207
@aurora7207 3 жыл бұрын
Awesome job. Puts into numbers what I was already feeling about this gen but, great to have the facts where everyone can see them. Hope you do something similar for AMD when it launches.
@STDRACO777
@STDRACO777 2 жыл бұрын
The 9800 GTX was the last high-end card I bought. I remember comparing that PC to the then-current consoles, complaining about how overpriced the consoles were.