😭😭😂😂😂 May we be able to go buy them and more one day🤌🏽
@yp538710 сағат бұрын
@@FO0TMinecraftPVPsameway why there's so many homeless people, why don't they just buy a house 🤣
@MrX850310 сағат бұрын
Next GPU, you need to upgrade your outlet.
@dazealex9 сағат бұрын
That literally made me laugh out loud at work. Install your own nuclear reactor.
@ThatsPety8 сағат бұрын
gotta play in your laundry room because you need 240v
@TheSakzzz8 сағат бұрын
And wallet
@dazealex8 сағат бұрын
@@TheSakzzz And power supply in the case.
@NeoTechOps8 сағат бұрын
NEMA 14-50 adapters for GPUs coming soon
@davialves597010 сағат бұрын
Dave literally dropped a bomb at the end and said bye hahah
@conandoobrien860510 сағат бұрын
that's a very important review.
@Ocelot19379 сағат бұрын
Wish I saw that before I returned my 4080 Super. GGs.
@TfamSoprano9 сағат бұрын
what?
@otakugisa9 сағат бұрын
@@TfamSoprano He said don't do an upgrade (don't move to the 5000 series) unless your current laptop is doodoo / can't play your games.
@TfamSoprano9 сағат бұрын
@@otakugisa HOW IS THAT A BOMBSHELL THAT SHOULD BE COMMON SENSE LOL
@randooooooooom10 сағат бұрын
Can’t wait to buy this in 7 years!!
@Kaijosuoh10 сағат бұрын
Already added to my wishlist
@Elatenl8 сағат бұрын
Nvidia never keeps stock of old generation cards so doubt it
@ClickerFest8 сағат бұрын
@@Elatenl buy it secondhand I think they mean
@Kaijosuoh8 сағат бұрын
@Elatenl tell that to the 970, 1070, 1050, and 1060 currently in stock
@Elatenl8 сағат бұрын
@@Kaijosuoh none of those are in stock brand new, and if they are, not in my country. Also, nowadays Nvidia goes out of their way to not have stock of old-generation cards; this isn't some conspiracy, it's a well-known fact.
@HunterKiotori7 сағат бұрын
Even today, the 10 series is such a special chip. They really fucked up on their end by giving consumers good tech for once
@irisun198210 сағат бұрын
Just wanted to add that the 4090 mobile uses the 4080 desktop die and the 5090 mobile will likely use the 5080 die as well, so it's 320W vs 360W. That's why the bump in CUDA cores isn't as big; it isn't for the desktop *080 parts either. Nonetheless, your point still stands. The gains of Ada over Ampere were largely due to the process node. One should remember that comparing nodes between manufacturers is very hard, so 8nm Samsung to 7nm TSMC would likely still be a bigger gain than 5nm TSMC to 4nm TSMC. There is a reason Samsung is struggling with its semiconductor manufacturing right now. That's why Hynix is the king in town when it comes to really fast memory currently, and Samsung got kicked out as HBM manufacturer by NVIDIA for their AI accelerators. From actual real-world tests using the exact same laptop: 32GB DDR5-5600 CL46 SO-DIMMs from Samsung will go to >100°C in a memory stress test, while 32GB DDR5-5600 CL40 SO-DIMMs using Hynix ICs are chilling at below 80°C. Guess the most valuable information to gain from these keynotes is the stuff they do not talk about^^
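For anyone wanting to repeat that kind of SO-DIMM comparison, here is a minimal sketch of the logging side only, assuming a Linux machine that exposes DIMM temperature sensors through hwmon; the sensor-label keywords are placeholders, and the stress test itself runs separately:

```python
# Minimal sketch: poll temperature sensors while a memory stress test runs
# elsewhere. Assumes Linux with hwmon sensors visible to psutil; the
# "dimm"/"spd" label keywords are assumptions, adjust for your platform.
import time
import psutil

def poll_dimm_temps(duration_s=300, interval_s=5, keywords=("dimm", "spd", "sodimm")):
    peaks = {}
    end = time.time() + duration_s
    while time.time() < end:
        for chip, readings in psutil.sensors_temperatures().items():
            for r in readings:
                label = f"{chip}/{r.label or 'temp'}"
                if any(k in label.lower() for k in keywords):
                    peaks[label] = max(peaks.get(label, 0.0), r.current)
        time.sleep(interval_s)
    return peaks

if __name__ == "__main__":
    for label, peak in sorted(poll_dimm_temps(duration_s=60).items()):
        print(f"{label}: peak {peak:.1f} °C")
```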
@m.heyatzadeh7 сағат бұрын
I'm not sure if this is related, but yesterday I was working on my older laptop, which uses a Hynix M.2 SSD, and I felt it was a lot faster than my newer laptop, which is twice as expensive and uses a Samsung M.2 Gen4 SSD. It had me confused because I always thought Samsung was the best when it came to memory and storage devices.
@berengerchristy62566 сағат бұрын
@@m.heyatzadeh It really depends on the work you were doing, especially if you're reading and writing large files to the drive frequently. I have samsung ssds in my computer and they work just fine
@HNedel3 сағат бұрын
Nerds raging and fuming that a laptop flagship model does not use the same chip as a desktop flagship model is so adorable 😊
@orxanr595510 сағат бұрын
Capping the power input at 175 W while calling it a 4090 always sounded shady to me; it's false marketing at best.
@DanKaschel10 сағат бұрын
...why
@-Burb10 сағат бұрын
@@DanKaschel Because the laptop 4090 isn't even remotely close to the performance of a desktop 4090. Having them both share the same name implies similar performance. The laptop 4090 should be called the m4070/m4070Ti because that's closer to the actual performance it has.
@bankaimaster99910 сағат бұрын
@@-Burb You had me in the first half ngl
@4evahodlingdoge22610 сағат бұрын
It's not just the 175 W cap that's the issue; they literally didn't use the 4090 desktop GPU, it's a cut-down 4080 GPU. At the high end you get about 75-80% of the desktop GPU's performance at half the power, so the real scam is charging laptop buyers 4090/5090 desktop prices for a cut-down 4080 desktop GPU and its performance.
@SocratesWasRight9 сағат бұрын
THAT is not the false marketing part in laptop 4090s. The false part is that they basically used desktop 4080 silicon and THEN power capped it and called the result laptop 4090.
@PhilChills9 сағат бұрын
Dave, I absolutely love your intro and outro, straight to the point, and your explanations are so good and precise… Those are some reasons I like your version of YouTube videos.
@AdventSeph10 сағат бұрын
Recently upgraded from a 970 to a 3060, got it for basically pennies thanks to these new cards
@jeremyzeimet36319 сағат бұрын
3060 is a really good card. Keeping up with the Joneses is foolish
@sauce.unlocked9 сағат бұрын
3060 is meh
@nenume007 сағат бұрын
good for you but 6gb of vram in 2025 is really shameful. But for the right price anything can be valuable
@AdventSeph7 сағат бұрын
@@nenume00 3060's 12gb brother, only 192-bit unfortunately but yeah...
@jondavid47477 сағат бұрын
@@sauce.unlocked Just like your comment.
@shles10 сағат бұрын
Gaming does not require hardware to progress. Good games are not about pixels, polygons, or map size. Good game design and a distinct art style do not require a better GPU. All the GPU fuss now is about AI and for AI companies; don't buy it unless you're an AI developer or a movie producer.
@edugator224210 сағат бұрын
How long have you been gaming? This is not true in any way in my experience.
@Moburu102169 сағат бұрын
@@edugator2242 good games don't need fancy graphics... the game's storyline is far more important
@edugator22429 сағат бұрын
@ I could not disagree more. There are many different types of games, but if they look like garbage and movement is slow then no one plays them.
@shles8 сағат бұрын
@@edugator2242 Good games can and do look nice on any hardware. We had really nice-looking games on any generation of consoles or pcs. Yes, they have limitations, but good games use them to their own advantage. I am not saying we should go back, but I don't see a more detailed skin on a character and a bigger fan under my desk every year as a future of gaming. Optimized and stylized smaller games look and play better than most of the open-world raytraced AAA. This highlights that better or more powerful technology is not a prerequisite for better games and does not make games better by itself. The game industry can and should utilize them but not require them as a standard.
@AppDim35828 сағат бұрын
@@Moburu10216 Why not both? Why is it a choice? I want both
@nicoferrari810 сағат бұрын
I think the problem is the games themselves. Like, has anyone noticed what happened between 2015 and now (10 years already lol)? Games look the same or even worse and perform terribly. I was just playing EA Battlefront 2 the other day, and that game looks amazing and performs even better. Like, seriously, what's going on.
@louis2410 сағат бұрын
This is a very fair observation
@fran291110 сағат бұрын
I've been saying this
@niklassilen431310 сағат бұрын
Bad code. Simple as that. People are lazy and not that skilled. Nobody is actually programming anything anymore; they are just scripting and "hacking" stuff together. It's what always happens when tools become easy to use with graphical drag-and-drop interfaces (for instance Unreal Engine). It's a win-win situation for everybody except the consumer.
@patryce99910 сағат бұрын
I think Starcraft 2 looks amazing and it’s 15 years old !!!
@DanKaschel10 сағат бұрын
The honest answer is that you prefer worse graphics. A lot of people do. You got used to the fake look and you like it.
@aurelien_nat10 сағат бұрын
It would also be nice if devs made games that are optimized. Hogwarts Legacy can still be buggy, and the FPS fluctuation is kinda insane
@diabolo180910 сағат бұрын
It's too hard to optimise on PC, I think. Too many random configurations to account for
@blondegirl72408 сағат бұрын
I have a 4070 Ti Super, and Hogwarts runs perfectly fine at 100 fps at 4K with DLSS 3 Quality and RTX on high. It's laptop CPUs, especially Intel CPUs, that suck at games, as the 13th and 14th gen are way too power-hungry and a laptop cannot deliver that much power to them
@geroni2118 сағат бұрын
@@diabolo1809 Games used to be far better optimized than they are now. Graphics haven't advanced nearly enough to excuse the performance requirements they have nowadays.
@matheusc9723 сағат бұрын
@@diabolo1809 even on PS5 the games aren't running well. They use a lot of upscaling there, like the PS5 Pro with PSSR, to reach 60 FPS.
@tim31723 сағат бұрын
@@geroni211 Games also used to be expected to run on dual-core CPUs that were homogeneous, rather than 16-24 core CPUs that may or may not have P/E cores alongside Zen 4/4c cores, maybe some X3D V-Cache, and anywhere from one CCD to four CCDs.
@King-Julz10 сағат бұрын
I think it's difficult to say. We can't compare 'nm' measurements since they are not even true representations of transistor size these days, and haven't been for years for the most part. Also, whilst CUDA cores may not increase linearly, it's really important to emphasise that CUDA cores are not the same between generations. Similar to how CPUs have IPC gains across different architectures, even at similar clock speed, transistor size, die size, etc., new generations will still be faster due to architectural improvements. So when you consider all of this, the best way to know for sure... is to test them against previous gens and other comparable GPUs on the market. A really good test is to cap all GPUs in games to a frame rate such as 120 fps, or to limit the overall laptop power to something like 65 W, and see which has the most performance.
@ElijahLPlus8 сағат бұрын
Would love to see someone compare the efficiency of different GPU generations, like capping the cards at the same fps and then checking which consumes the lowest power. Do you know of any channel that does that?
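For reference, the test these two comments describe boils down to an iso-performance power comparison; a minimal sketch of the arithmetic, using made-up placeholder numbers rather than real measurements:

```python
# Rough sketch of the comparison described above: cap every GPU to the same
# frame rate in the same scene, log board power, and compare average draw.
# The wattage samples below are placeholders, not measurements.
samples_watts = {
    "RTX 3080 Laptop (capped 120 fps)": [118, 122, 125, 119, 121],
    "RTX 4080 Laptop (capped 120 fps)": [84, 86, 88, 85, 83],
}

def average(xs):
    return sum(xs) / len(xs)

baseline_name = "RTX 3080 Laptop (capped 120 fps)"
baseline = average(samples_watts[baseline_name])
for name, watts in samples_watts.items():
    avg = average(watts)
    print(f"{name}: {avg:.0f} W average, {baseline / avg:.2f}x efficiency vs baseline")
```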
@paul_wiggin10 сағат бұрын
RTX 5090 laptop = 0.3 of RTX 5090 desktop.
@adrianoawz26769 сағат бұрын
more like 0.3 of RTX 5080 Desktop
@stefannicolae31088 сағат бұрын
@@adrianoawz2676 Great, now we'll have people in China taking 24 mobile chips to make one GPU for a gaming PC.
@henrikg13888 сағат бұрын
Looking into the tea leaves or tarot cards? laptop 4090 = 0.7 desktop 4090. Those are the cards on the market right now.
@nightthemoon84817 сағат бұрын
That's not how power scaling works; it plateaus the higher you push. 10 W to 50 W makes a huge difference, while 450 W to 500 W makes a tiny difference
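To illustrate the plateau being described, here is a toy model where performance scales sub-linearly with board power; the exponent is an arbitrary assumption for illustration, not a measured GPU characteristic:

```python
# Toy illustration of diminishing returns: model relative performance as a
# sub-linear function of board power. Exponent 0.4 is an assumption.
def relative_perf(power_w, exponent=0.4):
    return power_w ** exponent

for low, high in [(10, 50), (175, 360), (450, 500)]:
    gain = relative_perf(high) / relative_perf(low) - 1
    print(f"{low:>3} W -> {high:>3} W: ~{gain * 100:.0f}% more performance (toy model)")
```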
@nullptr_yt5 сағат бұрын
@@henrikg1388 What? Laptop 4090 uses the die from the desktop 4080 (AD103), and it's not even close to 0.7x of a desktop 4090. Categorically false. Dumb, even.
@jed23ify9 сағат бұрын
Bro the hair is looking great! 💁♂️🌟
@Maximum_77710 сағат бұрын
5:06, I'm so happy you said specifically this here. This is so important for knowing when it's actually time to upgrade. I effectively outgrew my old laptop, and the proof is that I still occasionally bring my 4090 laptop, with that crazy 16-core AMD CPU, to a crawl when playing my favorite games at their extremes. BeamNG's unoptimized, community-made modded multiplayer, and doing stupid shit in Space Engineers, are so much more fun when you can get more than 15 fps; these days I'm typically at 100 to 180 fps, with occasional big drops below 60 fps when stuff gets crazy. I made the right decision for the wildly niche physics sandbox games I play, without question.
@johnsalamii9 сағат бұрын
I tried to play BeamMP once, and I remember it running at 10 fps on my i5 laptop with an RTX 3050
@META313.10 сағат бұрын
Staying with my 4070 laptop for the next 2 years.
@zerocal7610 сағат бұрын
Sticking with my 3070 laptop for the next 2 yrs 🙂
@META313.9 сағат бұрын
@zerocal76 optimal choice.
@sautiksamui72609 сағат бұрын
Sticking with Intel igpu for 12 years
@TabalugaDragon9 сағат бұрын
The laptop 1070 lasted way longer. It was relevant for like 5-7 years in terms of performance, especially on VRAM capacity (meaning high to ultra textures)
@kyrios11209 сағат бұрын
Hey, if it's not an issue, can you tell me if the 4070 is a good choice in 2025 for at least 3-4 years? I was planning to buy a 50 series, but it seems they just wanna loot money. How has your laptop been performing in heavy games at high graphics settings?
@TheDuzx9 сағат бұрын
N5 and N4 are the same node in the sense that you can't get TSMC to produce N5 chips for you anymore, because they're producing N4 with those machines. Also, N4 is design-rule compatible with N5 (that also means a lot of the 40 series cards are N4). N4 is improved in some ways over N5, but not in a way that's going to improve power consumption. N3 is where we're going to potentially see a 20-30% power reduction again. Note: N5, N4, and N3 are what TSMC calls these nodes instead of nm; they stopped using nm a few years ago because nm was misleading. Also, I oversimplified everything a bit, but a nitpick is that Nvidia is technically using 4NP for the 50 series, which is different but in the same family as N4. They were also probably using N4P for 40 series cards over the last part of the 40 series generation. N4P should not be confused with 4NP.
@NinjaKirikoJedi9 сағат бұрын
N4 not confused with N4P?
@MadMathMike6 сағат бұрын
Really appreciated that final guidance. Don't get caught up in new exciting tech when what you're currently using is actually just fine!
@AlexRubio10 сағат бұрын
4:15 Nah, my dude. What we need is for corps to let devs optimize the hell out of these games. Ngreedia can keep their fake frames.
@ratzlord312510 сағат бұрын
Man, Nvidia isn't even trying to make huge profits from consumer GPUs. There is literally no competition for DLSS, hence they are able to charge whatever they want in the name of innovation. Unless AMD or Intel compete well enough, this will keep happening every year
@d_lydian10 сағат бұрын
Wrong. Devs should focus on gameplay again. For the last 15 years the gameplay of games has been declining.
@AlexRubio10 сағат бұрын
@@d_lydian Both are sides of the same coin.
@just697910 сағат бұрын
DLSS doesn't necessarily mean frame gen. Upscaling is the real help right now. Dave's correct in that you can't rely on node shrinks forever, and brute-forcing it with just "mOaR TrAnZiStOrS!" always runs into the wall of power limits, but what _can_ really help is using those extra transistors for specialized functionality. This has been happening since the beginning of GPUs: hardware T&L, programmable pipelines, compute shaders, RT cores, and of course upscaling. Node shrinks and speed bumps alone weren't keeping up with what people wanted to see, so the additional transistors that came with each node shrink started getting used for more efficient ways to create similar or better results with less power.
@SmileyNyan10 сағат бұрын
Fr, Dave dropped the ball by saying that what the gaming industry needs to move forward is more reliance on DLSS and frame gen, instead of actual optimization and maybe holding back on the graphics a bit. It was a shit take, sorry Dave, but not agreeing with you one bit
@m90nray10 сағат бұрын
This is a very exciting time for these new GPUs; can't believe how much power is being pushed into a laptop these days, regardless of efficiency.
@NmsOnetk010 сағат бұрын
Laptop GPUs have been pretty well cooled recently. It's the CPUs that have been very inefficient, specifically since so many were Intel. I hope to see CPU temp improvements this year.
@irisun198210 сағат бұрын
Indeed, but from what I have heard about Arrow Lake so far, it should be much more interesting for laptops than desktops, because of the gained efficiency and the fact that it's easier to cool due to the die being thinner
@irisun198210 сағат бұрын
Also, the AMD HX CPUs are really efficient under load, but sadly their idle power consumption sucks due to the I/O die sucking a lot of juice
@NmsOnetk010 сағат бұрын
@irisun1982 I'm very excited for the new wave of CPUs. I know AMD has already been well received in the laptop space. I had a Blade 16 4090 and it was amazing. The only reason I got rid of it was because of the Intel 13th Gen CPU. I also think this new laptop 5090 GPU is going to be an interesting proposition with the new extra vram.
@kirby21-xz4rx8 сағат бұрын
True, that's why I hope ARM takes off. Won't hold a candle to that though
@YeahRightMCD10 сағат бұрын
I bought a 3070 a few years back, and nothing has come close to stressing it out because running on ultra graphics is actually not useful in-game. I just bought a Switch for Christmas, and EVERY game has worse graphics and is also far more fun than EVERY game I have on Steam. It's wild.
@jondavid47477 сағат бұрын
I'm still rocking my 3070 from EVGA (RIP). X-Plane is the only thing that really pushes it using a 34" ultra wide at 1440p.
@YeahRightMCD7 сағат бұрын
@jondavid4747 right? And 4k isn't going to help anyone until like 40"+
@Hop_Off_N77 сағат бұрын
The problem with the Switch is not graphics, it's frame rate.
@m.heyatzadeh7 сағат бұрын
Because switch is known for its Nintendo exclusives. I think you should only buy the Switch if you want to play indie games and Nintendo exclusives.
@YeahRightMCD6 сағат бұрын
@Hop_Off_N7 I grew up on SNES and N64. Never once did I say, "I'm not having fun because of the framerate." Even on PS1 and 2, it was the load times that were a bummer, never the graphics.
@3monsterbeast8 сағат бұрын
Thank you for bringing up something I noticed but many haven't talked about. Nvidia presented how much they focused on power output of the 40 series cards yet said nothing about the 50 series
@evans95238 сағат бұрын
I didn't realize how big the efficiency jump from 30 to 40 series laptops was. I just remember the 30 series being terrible. I was holding out for the 50 series for better thermals and efficiency, but I think I'll save money and get a 40 series. Something like the Alienware X16 4080.
@Hop_Off_N77 сағат бұрын
Big thumbs up on getting a 40 series laptop on sale, big thumbs DOWN on Alienware. Pretty much everything Alienware sucks besides some monitors and peripherals.
@fabioa.54909 сағат бұрын
So grateful for you and your videos, Dave. It's like you said, even if you stopped putting out videos we would still be fans. Have a great life
@stephenwalton808710 сағат бұрын
I also noticed something... the price
@ctrl_x17709 сағат бұрын
Nvidia thought they were sneaky, keeping the COVID-era price tags. And they _were_ sneaky, because everyone still keeps buying their things.
@NightKnight20487 сағат бұрын
I’m good with my 3070. With that said… I notice improvements with upgrades…. It’s always nice to see some stutters go away and some fidelity improve with an upgrade.
@badak01042 сағат бұрын
I am waiting for the benchmarks. Keep going, Dave
@Lead_Foot5 сағат бұрын
It's not the transistor density that improves power efficiency. It's the smaller transistor size reducing the amount of power needed to activate them. Now that Dennard scaling is dead, higher density means you need more dark silicon that is sitting there doing nothing, or else you'll overheat.
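The classic back-of-the-envelope version of that argument is the dynamic power relation P ≈ activity · C · V² · f; a small sketch with arbitrary illustrative values, showing why supply voltage, not density by itself, dominates:

```python
# Back-of-the-envelope dynamic power: P ≈ activity * C * V^2 * f.
# Values are illustrative placeholders, not real transistor parameters;
# the point is the quadratic dependence on supply voltage.
def dynamic_power(activity, capacitance_f, voltage_v, freq_hz):
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

base = dynamic_power(0.2, 1e-9, 1.00, 2.5e9)
lower_v = dynamic_power(0.2, 1e-9, 0.90, 2.5e9)
print(f"10% lower voltage -> {(1 - lower_v / base) * 100:.0f}% less dynamic power")
```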
@Kraaketaer6 сағат бұрын
Thanks for bringing this subject up, it's really an important one - don't let upgrade FOMO push you into spending money on useless upgrades. It just isn't worth it. Buy more games instead! Or literally anything else that money might get you. It's so easy to be tempted by new, flashy hardware and all the promises they make, but the happiness from that purchase is inevitably going to be short-lived, and if what you had before already did the job well, you're going to be back to essentially the same experience within maybe a week.
@findukat8 сағат бұрын
Awesome work, great to see Linux included! Here’s hoping the Linux support might still some day come.
@EtaCarinaeSC9 сағат бұрын
You missed the point where 10 and 20 series mobile gpus had the same core counts as their desktop counterparts.
@kirby21-xz4rx8 сағат бұрын
And your point? That's unfeasible given how powerful desktop GPUs are now
@Gamer181828 сағат бұрын
Should they actively cap desktop GPU core counts so they can match the mobile counterparts? Because, y'know, unless there's a great technical breakthrough, it's unlikely they're going to make the chip even more energy efficient in laptops
@fidelisitor89538 сағат бұрын
@@kirby21-xz4rx You know there's a thing called capping TGP right? If NVIDIA used the 4090 desktop die with the same core count and all in the 4090 laptop and capped the TGP to 175W, there would have been only a 20-25% difference in performance instead of a 50-60% difference.
@fidelisitor89538 сағат бұрын
@@Gamer18182 It's not about capping desktop core counts but rather about increasing laptop core count to match the desktop counterpart.
@jesusbarrera69168 сағат бұрын
@@kirby21-xz4rx It is not unfeasible... That's dumb
@Strykenine6 сағат бұрын
175 watts is just a crazy amount of power to put through something that small. It's really nuts to think about: imagine a 60-watt light bulb sitting on your lap, now very nearly triple that. All that heat has to go somewhere.
@AkaruIez9 сағат бұрын
I will not be changing my 4090 Legion for at least 10 years, came from 1060 HP Omen after using it for 7 years.
@Vvopat963 сағат бұрын
You are absolutely right; most people that I have seen review these things aren't. Making the transistors smaller gets more difficult all the time. It wasn't that they didn't focus on trying to improve the performance of the chip; it's that it's starting to get really hard. I think Nvidia is smart about trying to find other ways, like AI, to improve performance without needing to simply make it smaller. We are approaching the size of electrons (not really yet), or rather the distance where electricity will just jump through matter randomly (the random quantum world). Quoting my smart and trusted CoPilot: "One notable size is the nanometer scale (1 nanometer = 10^-9 meters). At this level, electrons exhibit wave-particle duality and quantum tunneling becomes significant. Quantum tunneling is a phenomenon where electrons can "tunnel" through a barrier that would be insurmountable in classical physics." Electricity stops working at small scales and it will jump randomly to places it shouldn't be. The AI stuff is not really great yet, but it can and will get better; it's the more feasible option. Always remember that these people are smarter than you.
@GrandPoivron8 сағат бұрын
USB-C ports, which are a requirement in the EU and have thus become the industry standard for charging devices, are limited to 240 W.
@eldardb9 сағат бұрын
Awesome video, but I just want to mention the efficiency improvements we got on the same node when we went from Kepler (GTX 700) to Maxwell (GTX 900).
@andyony29 сағат бұрын
Yeah, on point. Until last week I was seriously thinking about buying one of the new 5070 laptops. But now I'm going back to the idea of getting a 4070 laptop for much cheaper, also because DLSS 4 is coming to the RTX 4000 series as well. So if you don't really want to rely on Multi Frame Gen, then there won't be such a performance jump on the new laptops.
@johnathansmith100310 сағат бұрын
I can't think of a current game that doesn't have DLSS; it's kind of a standard feature when developing games.
@DanKaschel10 сағат бұрын
RE4 remake comes to mind. But even those that launch without it seem to add it within 6-12 months.
@harrygeoffrion45209 сағат бұрын
Elden ring
@johnathansmith10039 сағат бұрын
@ Doesn't apply, it's locked to 60fps lol!
@rasta77-x7o2 сағат бұрын
I am very happy with an AMD card, so DLSS is completely useless to me and i do not care about it.
@qq_029 сағат бұрын
The recommendation holds true for another reason as well, one the 40 series demonstrated. A lot of the laptop designs were still developed with a 30 series in mind, so when they suddenly had a 40 series in them, they had to severely power-limit them, even below Nvidia's recommended TDPs. The laptops that could use the 40 series properly came out a year later. With the 50 series, then, a lot of the laptop designs that are launching were initially designed with 40 series specs in mind with some headroom, but it might not be enough for a 50 series. So at this point, I'd bet on a laptop with a top-end AMD CPU (with integrated graphics; those benchmarks are going to be awesome!) instead, and hope the end of the year or next year will have mature designs for the 50 series.
@MalcolmREBORNСағат бұрын
Mobility has a big cost: not only do you settle for less power, you can't even upgrade the specs later down the road.
@zackerymcpherson94092 сағат бұрын
Voltage/performance is not always linear; overclocking captures this best. It's possible that Nvidia had to really push the cooling boundaries of the desktop version to make it a significant upgrade. Laptop chips "could be" more efficient than the 4000 series, enough to get a 10-20% improvement.
@kuyache210 сағат бұрын
During the GTX 10 series you got almost the same performance from a laptop GPU as from a desktop GPU; after that they went back to duping the customer with one-tier-lower performance. That 5090 mobile GPU is just a 5080 desktop GPU that is gimped all around except for VRAM, where it gained some. There is no competition in high-end mobile graphics, it is monopolized by Nvidia, so either you pay what they ask or get the ultra slow/weak competition. Pretty much the same is happening on the desktop side too; AMD and Intel cannot compete with the 4090, much less a 5090.
@DanKaschel10 сағат бұрын
The 1080 desktop only uses 180W, so the power loss hurt it much less.
@ManniGaming4 сағат бұрын
Great points. I'm playing Star Citizen on my 4090 laptop. It runs between good and awful depending on many factors, including server performance. But I think the CPU is the bottleneck here, with the 55 W it mostly runs at, although my Scar 18 supports much more crossload.
@remingtonrojas8 сағат бұрын
Counterpoint: they aren't making enough 17-inch, 19-inch, and more monstrous laptops. They also aren't implementing the laptop water cooling we have seen from some boutique manufacturers. And lastly, they don't have desktop-sized cooling docks. I have my laptop sitting on a fan and it cools it down 20 degrees Celsius.
@RevenantShark5 сағат бұрын
I hope reviewers do benchmarks without turning on any of the AI crap. If nVidia wants real money, they better give us real frames
@jrotsen244 сағат бұрын
He continues to bring unprecedented facts. Sadly, I'm running an old Alienware with an i7-7820HK and a 1070... So it's time for an upgrade for me, but this is very helpful 🙂
@dualcrocadile9 сағат бұрын
Thanks for your ethical approach Dave
@Rosameliazhere9 сағат бұрын
2:36 The 40 series was manufactured on 4nm as well
@theantsaretakingover8 сағат бұрын
Double correction: neither are 4nm. Both are made on the 4N process, which is actually just a better 5nm
@cameronbosch12136 сағат бұрын
@@Rosameliazhere That being said, the RTX 30 series was manufactured by Samsung, so the jump back to TSMC was an improvement.
@WickedSanta0710 сағат бұрын
Thanks Dave for making these beautiful Nvidia GPU videos! Loved every second of this video! (As you have done for years) 🙏
@cheweh8429 сағат бұрын
Dave, you haven't fallen for the marketing nonsense of "5 nanometers" have you? These things are NOT that small. It's just a name. And it misleads EVERYONE.
@bongkem27232 сағат бұрын
5:14 is the only advice you need to buy any tech ever ;)
@Unpluggedx895 сағат бұрын
575 W in a 2-slot form factor is asking for trouble. These things are gonna get H O T!
@iham13135 сағат бұрын
A PSU double the wattage of the GPU sounds extreme. A decent CPU, RAM and M.2 are about 300 W; add the GPU maxed out at 600 W and you are at 900 W (with some room to wiggle). Why bother going above and beyond with a 1000 or 1200 W PSU?
@zachb17063 сағат бұрын
You want a more powerful PSU because sometimes the GPU will draw way more power than its rated TDP.
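A rough sketch of the headroom math behind that advice, with assumed component draws and an assumed transient multiplier (not vendor figures):

```python
# Quick PSU-sizing sketch for the scenario in this thread; the draw figures
# and the 1.6x transient multiplier are assumptions for illustration only.
components_w = {"CPU": 250, "RAM/SSD/fans/board": 50, "GPU (rated)": 575}
steady_state = sum(components_w.values())
gpu_transient = components_w["GPU (rated)"] * 1.6   # short spikes above rated TDP
worst_case = steady_state - components_w["GPU (rated)"] + gpu_transient
print(f"Steady state: ~{steady_state} W, transient worst case: ~{worst_case:.0f} W")
# Steady state lands near 875 W and transients near 1220 W, which is why a
# 1000-1200 W unit is less silly than the steady-state math alone suggests.
```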
@Xxx-hk6mk5 сағат бұрын
I hate expensive laptops and having to purchase them; perks of being a computer science student
@jotacekm8 сағат бұрын
The 5070 Ti will be the sweet spot for this gen
@nenume007 сағат бұрын
Basically a slightly better 4070 mobile, probably worse than a 4080 mobile, and most probably at the price of a 4080. So either way we got shafted
@jotacekm4 сағат бұрын
@@nenume00 There are models on pre-sale for $1899. If it's at least like the 4080, it should be worth it
@SeviersKain10 сағат бұрын
you need to change the 90 degree into 120 to massively reduce the energy waste...
@pumpuppthevolume6 сағат бұрын
got it ....next video 5090 briefcase build
@yoyoprofessorxavier10 сағат бұрын
Don't devs need to optimize their games so that there is less reliance on these "fake frames", as people are calling them? What can we expect to improve from the GPU side of things, as well as maybe dev work? I'm just curious, as I know nothing.
@DanKaschel10 сағат бұрын
As a software dev, you can safely ignore all the "optimization" comments. They are made by people who have no idea what they're talking about. Optimization is like AI. People think of it as one thing, but it's actually an entire field that takes many different forms. In fact, DLSS is a form of optimization.
@justsomeguyonyoutube77179 сағат бұрын
@@DanKaschel While I do agree that people often spit nonsense about the whole thing, I think it's important to point out that optimization in AAA games has become worse over the past years. In 90% of cases I don't blame the devs but the companies giving strict deadlines or just caring about money (there are a bunch of cases of just straight-up lazy devs though lol)
@DanKaschel9 сағат бұрын
@justsomeguyonyoutube7717 why do you believe that (that optimization has gotten worse)?
@justsomeguyonyoutube77179 сағат бұрын
@@DanKaschel There are just a lot of cases of rendering things that aren't in view, objects that are way too complicated for no reason, things that could have been made into a single object but are their own separate things, useless code, etc... It's just a bunch of little things that end up affecting performance quite a bit. But again, I blame the companies way more than the devs for this kind of stuff
@jesusbarrera69168 сағат бұрын
@@DanKaschel "DLSS is a form of optimization" Spoken like a true numbnut
@MichaelFormoso8 сағат бұрын
Software-based performance support will only balloon installation files, and they're already pretty big. That's going to be a tough pill to swallow for a lot of gamers if the installation size of games gets even bigger. We don't need one game taking up nearly half a TB of storage just to up the frames in a dying game like Overwatch 2.
@Merrlin10 сағат бұрын
Oooooh, I haven't been here in a minute, nice new hairstyle Dave!
@mclovin65374 сағат бұрын
Basically wait until next year for 3nm
@XsynthZ5 сағат бұрын
Great video but I was so distracted by the fact that your camera has a dead pixel on the bottom edge, right smack dab in the middle of the frame. Thought it was my screen for a second and panicked lol
@Paul1110 сағат бұрын
Finally a good video. Very insightful about node shrinks, Nvidia's bad comparisons, and only buying hardware if needed. GOOD SHIT
@noname08584 сағат бұрын
Great analysis, thanks!
@visceralcinema9 сағат бұрын
This DLSS feature, or "software-based enhancements" as you mentioned, will also be a huge leap for VR, since AI-driven software can push up frame rates and possibly deliver more realistic parallax, etc., without crazy hardware.
@riccardo393e75 сағат бұрын
This GPU power limit of 175 W is so low for very expensive machines
@thorium91907 сағат бұрын
I'm looking forward to seeing how well they can undervolt.
@seanx93 сағат бұрын
Great, great advice! I had a laptop where I was just not able to play games without some form of lag or stuttering (I think it was an RTX 1050), but now with a 4060 in my laptop I can play pretty much any game I throw at it, with a good detail level on my 1080p external monitor, and that's enough for me. Of course it would be nice to play at 4K 150 fps on a 120 Hz screen, but it's just not worth it to me right now. If I could throw money around and the money for it didn't matter to me, I would, but not at the moment. I'm happy with what I have right now
@davitdavid71656 сағат бұрын
Good catch. It could be that below 200 W the architecture does get more efficient; otherwise you could just get a 40 series instead of a newer GPU and save money
@Raja995mh339 сағат бұрын
I have to STRONGLY disagree with your claim that we need DLSS and frame gen to move forward in the gaming industry. This stuff is the main reason why games are pure garbage now. Publishers don't let their devs optimize anything anymore because DLSS and such "fix" it. That's why Cyberpunk with everything cranked runs at a laughable 23 fps on a 5090. This is just ridiculous. This stuff throws us back, if not worse, because back then devs optimized games to make them run as well as possible
@TheNazreensyah3 сағат бұрын
The power efficiency one they're saving for the RTX 6090 next year... they just need a reason to show something next year
@impssngr6 сағат бұрын
Thumbs if you liked it, subs if you loved it. Here you go, Dave. Don't miss it again
@calebrasak69418 сағат бұрын
I'm curious to see, based on these laptop power limits, what the performance difference would look like between using an eGPU over TB5 and an onboard laptop GPU. And even the performance differences between desktop and laptop variants. We don't see this type of testing too often, but it sounds like a good idea with these new chipsets, given all the ML and AI upscaling that happens.
@ChristopherZarate-g1s6 сағат бұрын
All this power, while Nintendo sold 140 million Switches because of GOOD games.
@sarkangthimtimung3 сағат бұрын
dave really said, "imma keep it simple"
@BarelyAverageDude8 сағат бұрын
Dave: “Don’t upgrade, Bye” 👋
@DoctorDuckload7 сағат бұрын
I have a 1070 Ti. It's been 7 years. I can't find a decently priced GPU that outperforms it by a reasonable margin. I'm tired, boss.
@SBMAN1235 сағат бұрын
I'll buy once they introduce 4K OLED 120 Hz minimum...
@waakow5 сағат бұрын
Telling me NOT to buy something? You get a sub.
@criznittle9686 сағат бұрын
If you set your power outlets into a RAID 0 you can OC the 5090
@kcfivetwelve10 сағат бұрын
yesss a dave video
@richardjohnson8009Сағат бұрын
They should start making bigger laptops, as in thicker, so they have more space to add things like better cooling, a bigger battery, etc.
@eltamarindo9 сағат бұрын
I'm sure that there will be future software improvements and new features for the 50 series that won't be possible on the 40 series cards. So the value proposition may look better with time as the software develops. 50 series laptop cards may not seem that special at launch.
@myth-870010 сағат бұрын
2:48 What is your source that RTX 5000 is on 4nm? I have been reading it is on the same 5nm as the 40 series.
@irisun198210 сағат бұрын
that was my latest information as well, but I just checked and it seems a lot of sources now say 4nm
@lieathie9 сағат бұрын
AFAIC, it's 5nm, but TSMC calls this node 'N4' (which often coincides with supposed die size, like 3nm = N3, 2nm = N2 and so on)
@desireco9 сағат бұрын
It is becoming more sensible to start making external GPUs with their own power supply, like mini PCs are doing
@twood20324 сағат бұрын
Rather than process node shrinks, I wonder whether it's possible to increase the power and functionality within a process node, such as through architectural improvements.
@dothetontim6 сағат бұрын
You'll notice something in a few more years when you realize going much beyond 1500 W in a PSU is impossible in North America, unless you start plugging into multiple circuits in your home or wire it up for 240 V.
@DynamicPhil843 сағат бұрын
Wow... this is kinda nuts. So basically, until laptop manufacturers can figure out a way to cool more than 175-watt GPUs, we've reached peak performance for laptop GPUs. Actually, we hit it last year. Yikes...
@HYBECTIVE3 сағат бұрын
I can't see these new laptop GPUs being more than 10% faster than last year's laptop GPUs
@tendieman694 сағат бұрын
Frame generation creates A LOT of artifacts. The majority of these demos are done in controlled game scenes or with limited scope. In FPS games like Cyberpunk / A Plague Tale, you will see a HUGE smudge of artifacts with very odd motion blurs / glitches as soon as you move the camera a little faster than what they demo. I agree there needs to be a non-hardware improvement to get better tech/performance, but AI frames are just NOT one of them.
@zachb17063 сағат бұрын
The new FG has significantly fewer artifacts, and I believe that trend will continue just as it did with DLSS
@supersymmetry48524 сағат бұрын
I bought a Legion Go 1 after watching the preview of the device on this channel. Since then I rarely play games on my main PC. I expect the gaming revenue from handheld PCs and mobile will eventually eclipse that from traditional PCs and consoles, and then the publishers will stop putting money into those high-fidelity AAA games. But it all depends on whether Nintendo can pull it off again with the Switch 2.
@Vhicken6 сағат бұрын
Can't wait for these to release. Hoping to upgrade my 1070 ti to a 3080 for under 250!
@MiniMug9 сағат бұрын
me and my 1080 from 2017 still rocking
@Rapunzel87910 сағат бұрын
So, assuming that laptops will still be limited to 175 watts, we're looking at 10% uplift at most?
@jeffenad54124 сағат бұрын
The RTX 5070 Mobile and 4070 Mobile have the same CUDA core count and VRAM. The only differences are GDDR7 and AI TOPS, which may not justify an upgrade.
@Film_Fog7 сағат бұрын
I noticed something also. I'm still using my 1060 card. It plays all the games on Steam.
@madduckuk10 сағат бұрын
"I needed to upgrade to a 1200W power supply" Doubt.
@Dionyzos9 сағат бұрын
People overspec their PSUs all the time. Even a 750 W PSU should be able to run the 5090 just fine, unless you're running an OC'd Intel i9, in which case 1000 W should be more than enough. No way you need 1200 W for any gaming machine atm.
@RCmies7 сағат бұрын
The next manufacturing process for TSMC is 2nm, and they're developing it currently. Chances are the 60 series cards will use that. That's a 50% decrease, whereas the last one was only a 20% decrease. I think I'll just wait for the 60 series.
@dex63165 сағат бұрын
N4 from TSMC is just a refined N5 being 6% denser and 10-15% more power efficient. N2 is a substantial uplift over N4 bringing good improvements to both density and power efficiency. TSMC has gone from 16/12 -> 10 -> 7/6 -> 5/4 -> 3 -> 2 with the nodes to the right of the slash being refined versions of the nodes to the left of the slash.
@oliviermialon85754 сағат бұрын
"2nm" is just a name for a future node , it has been a long time it isn't related anymore to any kind of measurement of any part of a transistor. Before the N2 node , NVIDIA and AMD will certainly switch to the N3 node for the next generation of GPU/CPU/APU in 2026-2027 , the switch to N2 will come after for them : they aren't chasing after the last cutting edge process of TSMC , it's only Apple who really goes for the most recent node as soon as it's available.
@RCmies4 сағат бұрын
@@oliviermialon8575 Thanks for the info, honestly I didn't know that. I just googled TSMC 2nm. Didn't know it's not tied to actual transistor size.