Anti-lag and its counterpart work by adjusting frame timings to line up with the CPU (along with a sort of inverse vsync) and by reducing the buffer size so you have fewer pre-rendered frames.
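To make the "fewer pre-rendered frames" point concrete, here is a rough back-of-the-envelope sketch in Python (the helper name and numbers are illustrative only, not how the driver actually measures anything):

```python
def queue_latency_ms(fps: float, queued_frames: int) -> float:
    """Toy model: every frame waiting in the render queue delays your input
    by roughly one frame-time before it reaches the screen."""
    return (1000.0 / fps) * queued_frames

# Shrinking the queue from 3 pre-rendered frames to 1 at 60 fps:
print(queue_latency_ms(60, 3))  # ~50 ms of queueing delay
print(queue_latency_ms(60, 1))  # ~16.7 ms
```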
@Mostly_Positive_Reviews Жыл бұрын
Thanks, appreciate the explanation!
@RaVeeSS-b7z10 ай бұрын
And it also gives you a VAC ban in CS2 as a bonus
@Son37Lumiere10 ай бұрын
@@RaVeeSS-b7z That was only anti-lag+, because it injected directly into games at the driver level for better optimization. Doesn't affect regular anti-lag.
@Brianybug Жыл бұрын
I have a Lenovo Legion 5i Pro incoming. Thank you for the detailed, real world info and examples. This totally makes sense and looks like a game changer, especially for single player games.
@Mostly_Positive_Reviews Жыл бұрын
A nice comment on a controversial topic? Unheard of! Really appreciate the comment, and glad you could find some use for the video. Enjoy your new laptop!
@contemplatingrain2380 Жыл бұрын
Hey man. Just got mine as well. Having a blast!
@Brianybug Жыл бұрын
@@contemplatingrain2380 awesome! I won the July Lenovo giveaway (much to my surprise) and I’m just waiting impatiently for the shipping notification.
@Kman31ca Жыл бұрын
I just got my Lenovo T7 with the 4080 and i7 13600K and the thing's a beast. Love it.
@edwardecl Жыл бұрын
The problem is not that they are using frame generation to sell new cards, they are using frame generation as an excuse to bump up the price.
@Mostly_Positive_Reviews Жыл бұрын
I can fully appreciate the issues surrounding frame gen, like pricing, and no performance improvements over last gen on the lower end cards except for AV1 and Frame Gen etc. But I find the tech itself pretty good.
@thecooljohn100 Жыл бұрын
Maybe because it takes time and money to develop frame generation technology? Lmao. You think God is just delivering them technical advancements every 2 years? They invest hundreds of millions in developing this tech. If you want a cheaper GPU, you can still shop the 20, 30 series GPUs and all of the AMD products. To cry about the 40 series not being cheap enough is weird when there are already so many great GPUs on the market, more than there ever have been at any other time in the history of gaming.
@Mostly_Positive_Reviews Жыл бұрын
@@thecooljohn100 Yeah, I find it baffling as well. A lot of time went into R&D for this tech, and that's a lot of money. Gotta recoup it somewhere. Sure, AMD is doing it for *free, but I reckon it's to try and win some marketshare. People are constantly complaining about Nvidia's business practices, but the truth is, no corporation is your friend. They are all in it to make money.
@BrainInJar9 ай бұрын
My problem is that now games think that having Frame gen is an excuse to have low-performing games in the first place. It is not being used as a tool to make an average-performing game perform at high frame rates, it's now being used to make badly performing games perform at 60fps. I've tried a 4090 at my friend's house, and for a lot of games without frame generation on, it can't do much above 60 at ultra settings.
@BrainInJar9 ай бұрын
@@Mostly_Positive_Reviews That is one hundred percent true, but that does not mean that we can't criticize them. Knowing that they are only out to make money doesn't make their price gouging any less contemptible.
@bryanwashere501010 ай бұрын
Make sure that you've turned off ALL forms of mouse acceleration. One checkbox exists in the Windows Control Panel under Mouse Settings, and there may also be a mouse acceleration setting in your gaming mouse driver software; sometimes LG software has its own, for example. Most people aren't even aware that they've been using mouse accel until they actually turn it off and feel how their own mouse pointer accuracy seems off... until they become re-accustomed to not having it on.
@Mostly_Positive_Reviews10 ай бұрын
Yeah, good tip. This makes the game feel "floaty", which impacts the experience quite a bit.
@technologicalelite80769 ай бұрын
I'm late to the party here, but this is a great explanation of the whole DLSS 3 fiasco. You have a calm explanation to counter some arguments, were unbiased about it, and even presented some of your evidence. I will however say, I shared a system with a 7900 XTX with my brother, but decided to build my own as well, with a 4080 Super that finally arrived yesterday. The last NVIDIA GPUs I used were in all-in-one PCs, my family's with a GTX 1050 mobile and my own with a 930MX. Neither was playable on anything, even less so the 930MX. Looking at both of the driver settings software, it reminds me of the differences I watched within these technologies. I would say, and many others more informed on this would say, that Anti-Lag is more of a competitor to NVIDIA's Low Latency Mode within the control panel. They both work by dynamically limiting the number of frames the CPU can produce to help knock out the render queue, thus creating less workload on the GPU. NVIDIA Reflex, and Anti-Lag+ (whenever that comes back), took this a step further by timing this better with added game data and also timing things within the game's engine.
@Mostly_Positive_Reviews9 ай бұрын
Appreciate you taking the time to share your experience. Yeah indeed, at the time I wasn't fully clued up on exactly what Reflex and Low Latency Mode did, but you explained it very nicely in layman's terms. I really hope Anti-Lag+ makes a comeback as it definitely helps with FSR 3 frame generation. At the time of making this video only Nvidia had frame generation technologies available, but now that FSR 3 is here I am just happy that everybody can try these for themselves. FSR 3 so far has had its issues, but in Avatar at least it works very well now. I had quite a few frame pacing issues initially but after a few patches and driver updates I have to say, it's pretty good. I personally like frame generation, if that wasn't obvious at this point 😀 But even I can admit when it shouldn't be used, like competitive shooters for example. But if you have a high refresh rate monitor then frame gen is really great to enhance motion fluidity at the expense of some additional input latency. Hope you enjoy that 4080 Super! I still love my normal 4080, and don't see myself upgrading any time soon!
@3jake5mee Жыл бұрын
I could feel it right away, something felt off with mouse movement. Thanks for this breakdown; it's great to know the base framerate affects the post-generation latency, that gives me more to work with to get it down. Excellent video
@Mostly_Positive_Reviews Жыл бұрын
Yeah, some people are definitely more sensitive to input latency than others. It sounds like you are one of those who are more sensitive, so it's best to be very selective with this tech and use it in games where input latency doesn't matter, like Diablo 4, or MSFS, or something similar.
@AndTecks8 ай бұрын
Agreed. It feels like walking around in cement shoes
@BassCannon420 Жыл бұрын
Very informative video, nice work.
@Mostly_Positive_Reviews Жыл бұрын
Appreciate the watch and the comment! This video almost didn't happen, but glad I published it :)
@Kuraiser Жыл бұрын
In Cyberpunk it feels awful unless you already have a high base framerate without frame gen, but in Rift Apart it feels crazy good, no input lag whatsoever, super high frames even with RT on and everything maxed out. I play with DLAA and frame gen and the game looks beautiful.
@Mostly_Positive_Reviews Жыл бұрын
Rift Apart is just such a good-looking game, and it performs very well with FG. The game isn't that hard to run on higher-end GPUs, and add FG to the mix and you get a very good experience.
@ruilechomeur Жыл бұрын
I found Rift Apart better without FG; I was getting strange frame stuttering when turning it on, it didn't feel smooth
@keflas3842 Жыл бұрын
Framegen requires 60fps to fully work correctly
@jondoe935111 ай бұрын
Because 40fps 120hz is 🔥 lol
@rahulmagadi654 Жыл бұрын
Fine work right here brother! keep spreading the Truth!
@Mostly_Positive_Reviews Жыл бұрын
Appreciate the encouraging words!
@chincemagnet10 ай бұрын
Frame gen is awesome. You probably need a 4090 and a 120hz or higher display to really take full advantage of it. If you can’t make it work correctly, you’re probably lacking common sense or decent hardware. It’s a massive game changer in Jedi Survivor, Plague Tale Requiem, and Witcher 3. What I have found is that it’s best used in conjunction with DLSS and if you have the overhead, DSR. Dead Space Remake, doesn’t have frame gen, but the ultimate way to play that game is with a 4090 using DSR for 7K resolution with DLSS quality. Looks way better than native. With frame Gen, you could push it even harder or just make it smoother.
@Mostly_Positive_Reviews10 ай бұрын
Yeah, it is great, and works better when CPU bound, but even if not CPU bound, use DLSS upscaling and then frame gen as it can make a huge difference. Recently finished the DS remake, and the game can benefit greatly from frame gen, even if just to smooth over some of the stuttering. Won't help for shader compilation or traversal stutters, but stutters caused by your system generally see a nice reduction when frame gen is used 👍
@chincemagnet10 ай бұрын
@@Mostly_Positive_Reviews true, Jedi Survivor was virtually unplayable due to such bad stuttering. Frame gen, I think, took enough load off the GPU that it allowed for much smoother frame times and pretty much fixed the stutter.
@theboy27779 ай бұрын
I use frame generation... I like it. When used with G-Sync/V-Sync I feel barely any latency impact and it feels 2x smoother
@Mostly_Positive_Reviews9 ай бұрын
Yeah, that's how I use it too, with G-sync and v-sync on, and the motion fluidity is really awesome then.
@tunnandaaung2 Жыл бұрын
Underrated channel. I like how honestly and clearly you explain DLSS 3
@Mostly_Positive_Reviews Жыл бұрын
Appreciate the kind words, thanks!
@omaralsalihi50335 ай бұрын
I am gaming on an RTX 4080 at 1440p, and when I turned on DLSS Quality and FG in Remnant 2, my FPS increased from 90 to 180, and it is insane.
@704pat Жыл бұрын
As someone who severely despises input latency, and at all times would prefer less than 20ms, if you can't feel a 10ms difference in input latency then it's definitely for you. If you can, like me however, it's snake oil. Adding input latency is absurd. I don't care how the game "looks" I care how it feels.
@Mostly_Positive_Reviews Жыл бұрын
Yeah, that's totally fair. Also, snake oil is such a funny term hahaha.
@704pat Жыл бұрын
@@Mostly_Positive_Reviews Well, the term snake oil is heavily derived from salesmen selling fake or misleading products. Nvidia selling frame generation as "real" frames (quantities that reduce input latency, the most notable benefit of high frame rates) can be directly attributed to snake oil in that sense. At least I personally consider high frame rates to only be for feel, not looks; I can play 60fps Elden Ring all day with a controller, but the second you hand me a mouse I'll FEEL the 60.
@Mostly_Positive_Reviews Жыл бұрын
@@704pat Yeah, agreed, their marketing around this can be very misleading. I understand the term snake oil, just find it a funny term haha!
@AaronWOfficial2 ай бұрын
The problem with frame generation is that in nearly every game I have tested, there is some form of ghosting introduced when using it. It could be something minor some people don't notice, like a bird flying in the air, but it does happen. There are very few games that actually use it well and don't have any ghosting, and most of those games aren't that demanding in the first place, so it is unnecessary to use FG unless you are playing on a potato
@Mostly_Positive_Reviews2 ай бұрын
Yeah, that's fair criticism. In my experience, most of the ghosting actually came from the upscaling itself, and not necessarily from frame gen, and that has mostly been solved with the new "E" preset with DLSS. But people's experience with frame gen varies quite a bit, and it all comes down to how sensitive you are to noticing slight artifacting or input latency.
@GregoryShtevensh Жыл бұрын
Why hate a new feature? It IS more frames, fake or not. And the fact is, you really notice the extra FPS but hardly notice the latency at all. Obviously it's just not something you'd use on esports titles, for multiple reasons, including the fact that FPS is high in those games anyway
@Mostly_Positive_Reviews Жыл бұрын
Exactly what I tried to convey. But this is the internet, no place for logic 😂
@deyandimitrov7287 Жыл бұрын
I don't play competitive eSports titles and was never good in 1st person shooters so I don't think I would feel the latency at all. Tech like this is perfect for me as I play mostly single player games.
@GregoryShtevensh Жыл бұрын
@@deyandimitrov7287 yeah a lot of people just play single player and a lot just play online. I play a mix personally
@Mostly_Positive_Reviews Жыл бұрын
@@deyandimitrov7287 I used to play Counter-Strike (Source and 1.6) competitively at a local level, and I find that I can barely notice the additional input latency if my base FPS is above 60. It is there, but even in games like Cyberpunk it doesn't hamper my experience.
@theorphanobliterator9 ай бұрын
A 10ms increase in latency from just DLSS to frame gen is minuscule. And with DLSS completely off, you get MORE latency than DLSS + frame gen? Why are people complaining about the smallest things
@Mostly_Positive_Reviews9 ай бұрын
Beats me. Unfortunately people like to complain nowadays just for the sake of complaining. Some people have provided good reasons why they don't like it, not that they need to justify themselves. But jumping on a hate train just for the sake of it is beyond me.
@theorphanobliterator9 ай бұрын
@@Mostly_Positive_Reviews Yeah, fr. Every single technological progress update has huge haters for no reason. Even RTX STILL has haters like 5 years later, which is really sad, because it's haters like that that prevent tech from progressing past what they're comfortable with
@Mostly_Positive_Reviews9 ай бұрын
@@theorphanobliterator And the best is, these are options. Very few games force ray tracing on, so you can choose to enable it or not. Same with upscaling and frame generation, they are all options that you can choose to use or not.
@slc9800gtx Жыл бұрын
I have an Nvidia card that has frame generation. I tried it in a few games such as Cyberpunk 2077 and I like it a lot. It smooths the image on the screen, so it looks very good, as well as increasing FPS by about 50 percent. As everyone should know, it should not be used in competitive shooters, because you do not want anything affecting latency. But in 1st or 3rd person games, I feel it is great. Get a demo or a game and try it yourself. I thought frame generation was maybe a gimmick until I tried it myself, but I found that I like it.
@Mostly_Positive_Reviews Жыл бұрын
Same, also thought it a gimmick until I used it. Use it in most games now.
@joshuam4993 Жыл бұрын
Interestingly enough, games like BF2042 have their own inferior version of future frame rendering, but I don't feel any input latency. Probably because I'm at over 100 fps
@cooltwittertag11 ай бұрын
Tried it in Forspoken with AMD frame gen and honestly it feels great; just gonna wait on FSR 3 in Cyberpunk to try it out since I don't have a 4000 series
@Mostly_Positive_Reviews11 ай бұрын
@@cooltwittertag Yeah, I have a few videos up with AMD's FSR 3 as well, and it's pretty good for their first attempt. Just hope they can bring it to more games soon!
@jking79 Жыл бұрын
It's really about what you are able to perceive. I can absolutely feel the input latency in Cyberpunk with frame gen enabled; some may not. Also, tuning graphics settings will help reduce ghosting and artifacting. Cyberpunk is a complete blurfest with DLSS 3 when higher RT settings are enabled, especially path tracing, but looks and feels great with DLSS 2 enabled at the same settings. A frame limiter and Reflex are a must with frame gen. (Cyberpunk player and OLED + 4090 user here)
@Mostly_Positive_Reviews Жыл бұрын
Yeah, well put. Some people can feel the additional input latency, some can't, depending on the difference of course. Cyberpunk is a bit of a strange one to me, as with higher RT settings it does look blurry, but even without FG on my side. I thought it was my monitor at first, so I procured a 4K IPS panel to test, and while it does look a lot sharper at 4K, that blurriness is still present when high RT is used. I have since resorted to playing without RT, except for lighting, as it looks great then, and performance is also great. I don't have an OLED to test with; maybe VA and IPS panels just have a higher amount of blur in Cyberpunk, really not sure at this point. Either way, as you said, it depends on what you are able to perceive. If you notice the higher input latency then just don't use FG in that title. I use it in almost every title that supports it for a high refresh rate experience at 1440p. The only title I have disabled it in is MSFS 2020, as I really don't mind 50-60 fps there.
@cashmoney2159 Жыл бұрын
Bro, does playing on an OLED screen look way better?
@Mostly_Positive_Reviews Жыл бұрын
@@cashmoney2159 Yeah, an OLED screen does look good. It has near-perfect blacks so the contrast is exceptional. Couple that with HDR and ray tracing in something like Cyberpunk and you are in for a treat.
@PixelShade Жыл бұрын
Slight correction... We actually don't know if DLSS is AI-based. This is what Nvidia has said it is. However, since it's closed source and since Nvidia's marketing works a lot with "half-truths", I wouldn't take it at face value. First of all, we don't do AI, nobody does AI... what we do is machine learning, and in that case the algorithm needs to learn over time, which it doesn't. There was a huge paradigm shift from DLSS 1 (which was AI-generated), but the results were REALLY bad and the tech quickly became a laughing stock. Nvidia went back to the drawing board and kept the name "Deep Learning Super Sampling", but ultimately DLSS 2 is a temporal upscaler. It does use tensor cores for acceleration, which, sure, Nvidia markets as "AI cores", but in reality these cores are specifically designed for matrix calculations... that's not the same thing as "AI". This allows RTX GPUs to accelerate parallelized workloads, which also accelerates machine learning, but in order to achieve blistering-fast performance it calculates several layers of temporal data to reconstruct each pixel on screen. There is no image recognition, there's no learning, there's no AI. It's just an advanced, but accelerated, temporal upscaler.
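For anyone wondering what "several layers of temporal data to reconstruct each pixel" looks like in practice, below is a minimal toy accumulator: reproject last frame's output using motion vectors, then blend in the new low-resolution samples. This is a conceptual sketch only, not Nvidia's actual algorithm; the function name, array shapes, and blend factor are made up for illustration.

```python
import numpy as np

def temporal_upscale(history, low_res_frame, motion_vectors, alpha=0.1):
    """One step of a toy temporal accumulator.

    history:        previous full-resolution output, shape (H, W, 3)
    low_res_frame:  current frame rendered at half resolution, shape (H/2, W/2, 3)
    motion_vectors: per-pixel screen-space motion at full resolution, shape (H, W, 2)
    """
    h, w, _ = history.shape
    # Reproject last frame's output to where those pixels are now.
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip((ys - motion_vectors[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip((xs - motion_vectors[..., 0]).astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]

    # Naive upsample of the current low-res frame (real upscalers use jittered samples).
    current = np.repeat(np.repeat(low_res_frame, 2, axis=0), 2, axis=1)

    # Blend the new information into the accumulated history.
    return (1 - alpha) * reprojected + alpha * current
```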
@Mostly_Positive_Reviews Жыл бұрын
Interesting points. And yeah, I know AI is just a buzzword, but I used it for simplicity reasons, as a lot of my small audience are new to these types of tech. Would be interesting to know for sure regarding DLSS 2 though 🤔
@thecooljohn100 Жыл бұрын
Yeah, except you're wrong. There IS image recognition, there IS learning, there IS AI compute at work. AI is a spectrum of things, not just Skynet levels of intelligence. Nvidia is the world's leading company in AI computing, so I'm going to take their stance on it before I do some armchair wizard's. They TRAIN their DLSS algorithm in-house using their own machine learning supercomputer. They then package that into a static generalized model that can be integrated into any game, slap a version number on it, and that lets us know how much the DLSS algorithm was trained. There is no machine learning done on the hardware itself; I think that's where you were thinking Nvidia was claiming that, but they have never said that. DLSS 2 was trained for longer and under more complexity than DLSS 1.0. DLSS 3.5 was announced to have been trained on five times more data than 2.0, and now also has AI tensor-core-fueled denoisers, therefore producing the best image quality yet. DLSS has temporal anti-aliasing aspects, but it is not as simple as "oh it's just TAA", because traditional TAA isn't trained by supercomputers and traditional TAA is usually used at NATIVE resolution. DLSS is inherently upscaling to a higher resolution, all while looking better than TAA at NATIVE res more often than not. DLAA is their newer native resolution AA tech and it is also better than normal TAA. Don't try to reduce the company's accomplishments just because you don't like their business practices. At the end of the day they are producing state-of-the-art real-time graphics technology. How they choose to price it can be discussed while continuing to respect the work they do.
@mechanicalmonk2020 Жыл бұрын
You're basing your whole rant on the tired question of 'what is AI'. They use convnets. That's it. That's all they mean when they say "AI based". Any and all "AI" these days is matrix multiplications deep down, and Nvidia engineers have repeatedly stated that modern DLSS is using convnets that have to be trained. The big difference between DLSS 1 and DLSS 2 was that 2 has a temporal component.
@sanderbrouwer91 Жыл бұрын
so much artifacting with dlss enabled @3:30
@Chasm9 Жыл бұрын
Hey there, great video 🙂 I have a question - do you know how to fix screen tearing while using frame gen? I tend to enable Vsync in NVCP, but that adds input lag which we don't want. It's just surprising I see tearing when I'm on a GSYNC compatible panel. 🤔
@Mostly_Positive_Reviews Жыл бұрын
Thanks, appreciate the comment! What is your monitor's refresh rate? I use NVCP V-Sync on my 165Hz panel and the added input latency is very minimal. What I have seen others do is cap it via RTSS, but at half your monitor's refresh rate minus 2. So if you have a 144Hz panel, cap it at 70 fps. It should then prevent it from going over 140 fps, meaning you shouldn't get any screen tearing anymore with a G-Sync panel. Well, in theory, but we all know PCs like to give us the run-around every now and again.
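As a quick sanity check of that half-refresh-minus-2 rule, a tiny sketch (hypothetical helper, assuming frame generation roughly doubles the base framerate):

```python
def fg_frame_cap(refresh_hz: int) -> int:
    """Base-framerate cap so the doubled FG output stays inside the G-Sync range."""
    return refresh_hz // 2 - 2

print(fg_frame_cap(144))  # 70 -> ~140 fps after frame generation
print(fg_frame_cap(165))  # 80 -> ~160 fps after frame generation
```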
@brendanlee5302 Жыл бұрын
I play solo games with an Xbox controller 🎮. You'll find the latency and FPS matter less than with a mouse. When using a mouse you'll notice it much more, since movements are much sharper
@Mostly_Positive_Reviews Жыл бұрын
Yeah, exactly. I play Witcher 3 and Ratchet and Clank with a controller and the difference in input latency is imperceptible. With a mouse it is slightly more noticeable, correct. Well, that's my experience anyway.
@Dotain5minutes11 ай бұрын
Thanks man. nice video!
@Mostly_Positive_Reviews11 ай бұрын
Appreciate it!
@sagar9703 Жыл бұрын
Hate it or love it... DLSS 3 is the future, just like DLSS 1 was when it was introduced with the 20 series: people hated it at the beginning but now it has become mainstream. Nice video and explanation 👍
@Mostly_Positive_Reviews Жыл бұрын
Yeah, definitely not going away any time soon. Thanks for watching and leaving a comment, much appreciated!
@ADunleavy Жыл бұрын
Except people hated them for different reasons. DLSS 3 is hated because it's only available on RTX 40-series cards, making the RTX 20 and 30 series buyers lose out, which is shtty. DLSS 1 was hated because it was sht. The image quality was just bad, with heinous ghosting issues.
@piotrolina2670 Жыл бұрын
@@ADunleavy It was like that because on the 20 series, if you activated DLSS 3 your image would be blurry and even buggy; on the 30 series it would look better, but only slightly. That's why they didn't release DLSS 3 for other generations. DLSS was bad on the 20 series because those RTX cards had less efficient RT cores and AI cores, so ray tracing performance was bad and the quality of DLSS was bad too. They also had to teach their AI how textures should look in the most popular games so their new DLSS 2.0 would look better and perform well, and they did that AI learning thing on the 30 series GPUs to create frame gen, so it was time-consuming for them to polish the features they wanted to work.
@Wobbothe3rd Жыл бұрын
@@piotrolina2670 No, that's not quite right. DLSS looks the same on RTX 20 as on RTX 30 or 40. Frame generation relies on optical flow processing, which is too weak on the RTX 30 series.
@piotrolina2670 Жыл бұрын
@@Wobbothe3rd yo I didn't know thanks
@flat_stickproductions209Ай бұрын
That's exactly how I use FG, FPS cap as low as 60 and we are talking 30w GPU usage. Crazy efficiency and ultra quiet and cool running system.
@Mostly_Positive_ReviewsАй бұрын
It's a great way to reduce heat and power consumption when used in conjunction with a framerate cap for sure!
@aboutthat413 Жыл бұрын
Good follow up video!
@MudShadow7 ай бұрын
it's basically the same as v-sync and triple buffering. Only difference is that the framerate shows a higher fps. Which is meaningless if the latency is the same (or worse) as triple buffering. I think the best option is still G-sync/freesync for around 50-60fps and not frame gen. If you have over 80fps you don't need framegen. The increased framerate is doing practically nothing but adding latency. very difficult to see any difference from 80fps to 160fps in terms of smoothness.
@maratpirate634310 ай бұрын
frame generation is going to revolutionise console gaming in the next gen
@Mostly_Positive_Reviews10 ай бұрын
I actually wonder if they'll push it to current consoles via a software update as it really is only a software solution that'll work great on the custom RDNA2 GPUs.
@chincemagnet10 ай бұрын
The people with a problem with frame gen probably either don't understand how it works, are trying to run it on underpowered hardware, or can't afford a graphics card that is capable of running it and are hating on it because some other YouTubers told them it was bad.
@Mostly_Positive_Reviews7 ай бұрын
I know you left this comment a while ago but I wanted to see how well it aged when FSR 3 became more widely available. It's been eye-opening to see the same people on Twitter that complained about DLSS 3 now praise FSR 3. Literally the same people 🤣 I fully agree with your last reason - they got told it was bad and jumped on the hate train. It happens with so many things.
@chincemagnet7 ай бұрын
@@Mostly_Positive_Reviews My only problem with FSR, besides the fact it's not as good as DLSS, is that game devs use it on console games when I believe there are more effective options for achieving target resolution and frame rate. Maybe FSR is part of that equation, but you have to wonder if they're turning it on because they don't know any better? They don't realize the image is muddy now and there's ghosting. In that case, and I'm talking specifically about Final Fantasy 7 Rebirth, some other settings would provide a superior experience.
@allxtend40057 күн бұрын
@@Mostly_Positive_Reviews You mean the same Nvidia fanboys who complain about FSR and say that DLSS is better, but one generation later praise FSR because they are locked out of new DLSS features? Yeah, true, a miracle. FSR looks bad, but so does DLSS. Native is the best way to play, but it needs very powerful hardware, and people nowadays think low- and mid-end hardware will give them high-end or ultra performance at a very high refresh rate. No, you just bought the wrong product, and 4K is still a scam for gaming and always will be.
@AngelTorres96 Жыл бұрын
Most importantly, DLSS 3 is now open source, so you can have DLSS 3 in games regardless of whether the developer includes support or not. Starfield with DLSS 3 is going to be amazing.
@Mostly_Positive_Reviews Жыл бұрын
Yeah, if ever a game is going to need it I suspect it will be Starfield. Will be awesome with DLSS 3, agreed!
@jungleboy1 Жыл бұрын
That's good news, I was worried about the rumors that AMD would stop this tech from being used.
@AngelTorres96 Жыл бұрын
@@jungleboy1 I don't think so; the PC market is important, and developers know that most gamers have an Nvidia GPU, so of course these technologies will end up in AMD-sponsored games
@massivechonk920 Жыл бұрын
Where did you hear about DLSS3 being 'open sourced'?
@angelportilla6584 Жыл бұрын
@@massivechonk920 It's in Nvidia's development kit, for free.
@redblue235820 күн бұрын
Frame generation makes cyberpunk at max settings feel “sticky”
@linmesar Жыл бұрын
Great latency explanation. Thanks!
@Mostly_Positive_Reviews Жыл бұрын
Thanks, really appreciate the comment
@ImperialDiecast Жыл бұрын
Sometimes I can't tell the difference between native and DLSS Performance. Sometimes DLSS Quality makes things look sharper than native. DLSS Ultra Performance makes things blurrier, but Performance, with the help of a sharpening filter, can really give you the most bang for your buck in terms of FPS increase. People choose Quality because they think it is sharpest, but the FPS increase from native to Quality is minimal. If you choose Performance you could even increase the overall resolution or enable ray tracing on midrange cards
@Mostly_Positive_Reviews Жыл бұрын
If the implementation is good I can definitely use more aggressive upscaling settings. Cyberpunk 2077 does have issues with flickering lights though when using Performance at 4K, and Balanced at 1440p. But it does depend on the implementation. Going to be testing DLSS 3.5 (without ray reconstruction) this week to see how it improves over previous versions.
@Sol4rFl4r3 Жыл бұрын
@@Mostly_Positive_Reviews You gonna make a video?
@Mostly_Positive_Reviews Жыл бұрын
@@Sol4rFl4r3 I definitely will
@mryellow691811 ай бұрын
i want your eyes
@lifeinvader697910 ай бұрын
@@Mostly_Positive_Reviews Bro, I use DLSS Quality in Cyberpunk 2.1 and it's trash, really blurry. Can you explain why?
@ClumsyMercenary Жыл бұрын
I just wish entry-level cards like the 4060 could leverage FG without resorting to, say, 720p internal resolution. The lowest-end cards should have benefited the most.
@Mostly_Positive_Reviews Жыл бұрын
So true. I just bought a 4060, video coming soon, but FG is not nearly as helpful on lower-end cards.
@Andytlp Жыл бұрын
Yeah, the starting FPS has to be high: 60 fps or more, preferably 80-100. So that means you need at least a 4070 with DLSS upscaling to get around 80-120 fps, and then frame generation on top of that isn't too terrible
@fightnight142 ай бұрын
I don't understand the fuss. I use my 4060 with DLSS and FG and RT and it's fine with games like Hogwarts Legacy and Star Wars Outlaws.
@TDIFLY Жыл бұрын
Nice Vid , Duke
@williamtopping9 ай бұрын
In Spiderman you changed the DLSS scaling, but did not actually enable DLSS in the option above. Go back and see. It was set to off...
@Mostly_Positive_Reviews9 ай бұрын
Yes, the first test was just with DLSS super resolution, and not frame generation to show that when you are CPU bound, DLSS Super Resolution does not increase your frame rate. I then enabled Frame Generation (the option above) to show how much it helps when CPU bound.
@aperson8693 Жыл бұрын
It's only people who never used it hating on it. Even if they're fake frames, they're produced in between two real frames, letting the transition between those two frames look smoother = a better experience. I've also found 80 native fps is somehow not as smooth as 80 fps with FG, at least for me, and I don't notice any latency at all (even if GeForce Experience says I got 100 ms)
@Mostly_Positive_Reviews Жыл бұрын
The majority definitely haven't used it. Some concerns are valid, and I had some very decent explanations in the comments from people with good points, but they are by far the minority. You not noticing 100 ms latency also proves that some people are more sensitive to latency than others. I started testing the 4060 this week and it struggled with FG at respectable settings, and the latency was very noticeable. But I also had others test the latency and they couldn't tell a difference. It really is good if used properly.
@abdalazeemal-mashaqba6600 Жыл бұрын
It's not the tech we hate, it's Nvidia rebranding the 30 series and selling DLSS 3 and FG as an upgrade
@aperson8693 Жыл бұрын
@@Mostly_Positive_Reviews also in games with really bad optimisation, it would just simply double your fps and make the spiky frame time graph smoother
@aperson8693 Жыл бұрын
@@abdalazeemal-mashaqba6600 Good to know there are still rational haters out there; mostly what I'm seeing are those saying FG is BS, etc.
@Mostly_Positive_Reviews Жыл бұрын
@@aperson8693 Yeah, in today's time with games being as poorly optimized as they are, frame generation really helps a lot, agreed!
@petertorda5487 Жыл бұрын
It simply adds quite significant latency by its very principle: first it needs to render a frame at low resolution, then resample it with DLSS, then do the same for one extra frame ahead, and then apply DLSS Frame Generation between these two frames; only after that is it displayed. Also, many LCD monitors add some latency, as they are not as fast as old analogue CRTs. With a bad combination of settings and a low-spec card it can happen that the display shows 60 fps, but you get an on-rails shooter experience 🙂
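A rough way to see why this hurts more at low framerates: the interpolated frame can only be built once the next real frame exists, so roughly one base frame-time gets held back, plus some processing overhead. The sketch below uses a made-up overhead figure rather than measured data, and in practice Reflex claws some of this back:

```python
def fg_extra_latency_ms(base_fps: float, overhead_ms: float = 3.0) -> float:
    """Toy estimate: one held-back base frame-time plus a fixed processing overhead."""
    return 1000.0 / base_fps + overhead_ms

print(round(fg_extra_latency_ms(30)))  # ~36 ms extra -> why a low base framerate feels awful
print(round(fg_extra_latency_ms(80)))  # ~16 ms extra -> much harder to notice
```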
@Mostly_Positive_Reviews Жыл бұрын
Yeah, very good explanation of what happens. I didn't test it in this specific video, but I think it was done in my 4070 Ti video on Cyberpunk, and there I showed that if your final output FPS is 60, the latency is terrible. But if your base frame rate is 50+ it can do good things for people on high refresh rate monitors.
@MLWJ1993 Жыл бұрын
Except that the DLSS (2) upscale DECREASES latency because it's faster than rendering a higher resolution frame. So you're partially wrong.
@Mostly_Positive_Reviews Жыл бұрын
@@MLWJ1993 Seeing that the subject at hand was DLSS 3, I assumed he was talking about frame generation and answered based on that. But yes, DLSS 2 Super Resolution or upscaling does decrease latency
@dante4111 Жыл бұрын
People play at 60-80 ms in League of Legends; of course you can play a single-player game at 50 ms
@Danie0l0ntertainmet Жыл бұрын
@@dante4111 South Africans also play on Europe servers with 240 ping, that is all dog shit
@Snake594711 ай бұрын
At 2:04 and at 4:23, the native res result has the same latency as DLSS + FG.
@Johan-jh9bl Жыл бұрын
Definitely, most people find FG a poor experience because they tried to use this tech like magic: "OK, I've got 20 fps in Cyberpunk, let's use FG! Hmm, I've got 60 fps but it feels poor!" I say to people like this: it's not magic, it's tech. It needs to be used right. You need to tune the in-game settings so you see 40-50 fps on your monitor before you turn on FG, to feel fine with the input latency. I find this tech fantastic! I can play Alan Wake 2 with path tracing at 70-80 fps. Paradoxically, if you use your brain (and understand how input latency works) you really do see magic. And if we go back to 2019, people hated the first version of DLSS too, but here we are: DLSS works perfectly in 2023 and doesn't have many haters. P.S. Sorry for my broken language (I am really far from speaking and writing it natively), I hope you understand something of what I wrote.
@Mostly_Positive_Reviews Жыл бұрын
Hahaha, don't worry about the language, I am not a native English speaker myself. I agree with you here, yeah; a lot of complaints have been from people using it at 25 fps or on 60Hz monitors etc. If you use it correctly the input latency is still there, but a lot less noticeable.
@Unobserved0711 ай бұрын
Frame gen has been a great piece of tech, allowing smooth gameplay in Cyberpunk and Witcher 3 with every ray tracing option cranked to the max, including path tracing for Cyberpunk. I'm using a 4090/13900KS so I'm able to reap the benefits of the tech, but it's definitely interesting hearing others' experiences with it on varying hardware configs. It's not perfect, but it's damn good I think. Impressive how well it works when it does work
@Mostly_Positive_Reviews11 ай бұрын
Yeah, agree 100%. I really like it a lot, and hearing others' experiences with it differ so wildly from my own is interesting to say the least. Beast of a system you have there though!
@kaushalsuvarna5156 Жыл бұрын
Thanks, quite informative
@Mostly_Positive_Reviews Жыл бұрын
Thanks, appreciate it
@zazoreal5536 Жыл бұрын
Everything has its drawbacks along with its advantages. The more tools we have the better.
@Mostly_Positive_Reviews Жыл бұрын
Couldn't agree more. As an option, these things aren't forced on anyone, but those that want to use it, can.
@Sn4p_Fps8 ай бұрын
If you already have 200 fps at native resolution and turn on frame generation + Reflex + DLSS, you will get 300+ fps and the latency will still be good in a scenario where you are running a 240/360Hz monitor. I tested it in The Finals (which I think is the best game to turn it on in); you will gain smoothness and max out the refresh rate
@bronis_lav Жыл бұрын
Let's say frame generation is like motion blur but a way better version, one which brings a lot more positives than just making the image on screen smoother. People who don't like it can just not use it.
@Mostly_Positive_Reviews Жыл бұрын
Yeah, I think it’s great. Some people have valid concerns and opinions after using it, and I can respect that. What gets to me is the people who never tried it talking it down.
@LinhTinh-m8f5 ай бұрын
Hi, I installed the Nvidia app and activated average PC latency, but it's not working. When I play games, it only shows 0ms. However, when I install GeForce Experience, it only displays render latency. How can I enable it?
@Mostly_Positive_Reviews5 ай бұрын
Hey. Average PC latency in GeForce Experience only shows if you have Reflex enabled, as far as I am aware. If Reflex is disabled it only shows render latency. Try enabling Reflex and see if both report correctly?
@massivechonk920 Жыл бұрын
Agree with much of this, but the rebuttal to the argument that DLSS3 is largely a bonus but performance gains are still being realized without it falls down a bit. A 4080 is not the next gen 'counterpart' to the 3080, it's in a completely different product class. $699 vs $1199 - you would only mention the 3080 and 4080 in the same breath because they have '80' in the name, you can't really compare a product that's almost *twice* as costly as the product it's meant to replace. You do recognize the folly of the 4060 series of course, but my point is that Nvidia's naming scheme really doesn't matter - we determine what product 'replaces' another by the new version at least being in roughly the same price category. The 4070 is meant to be the new 3080, and you can see there it's possible to actually get performance regressions without DLSS3. Nvidia's charts for these products focus on frame gen performance for a reason, because it is being relied upon to deliver next-gen gains against its comparable replacements.
@Mostly_Positive_Reviews Жыл бұрын
What are you doing coming here with good feedback, a fair argument, conveyed in a civil manner? This is YouTube, there is no place for any of that here! I agree with some of your points too. This gen is just a mess when it comes to naming and pricing. Nothing is as it seems. But I get where you are coming from, and don't disagree. Appreciate the constructive criticism and feedback 👍
@massivechonk920 Жыл бұрын
@@Mostly_Positive_Reviews Hah, the rarity of this type of exchange is why I comment on youtube threads maybe twice a year. Thanks for the kind words.
@Mostly_Positive_Reviews Жыл бұрын
@@massivechonk920 Yeah, it is definitely rare. I've had some nice exchanges on here, but almost never with someone who disagrees with certain points. Usually when there's a disagreement it is name calling and being called an idiot etc. Glad to see people can still have different opinions and not have it forced down your throat in a condescending way. Have a good day bud!
@Wobbothe3rd Жыл бұрын
The 4080 is dramatically more performant than the 3080. It costs more because it has much higher value. You can dress it up in nice words and reasonable-sounding language, but you're still just whining and coping. $1200 is a perfectly fair price for the 4080. Btw, most people paid much more than MSRP for the 3080 for most of its existence, from 2021-2022
@massivechonk920 Жыл бұрын
@@Wobbothe3rd Interesting how the author and I just concluded a nice back and forth about how rare it is to see even-tempered, reasonable disagreement on YouTube, then you come in with this to essentially say 'Yeah, this is why'. As for people 'paying much more for the 3080', yes, that was price gouging due to crypto. It is not a sustainable market for PC gaming, otherwise it would have been priced like that from the outset. Citing some of the most insanely priced years for PC gaming, due to an artificial market, as support for your argument with regard to 'value' today completely defeats your own point.
@doopie7037 Жыл бұрын
Perfectly explained. Now that we have FSR 3.0 I am looking forward to trying frame generation on my RTX 2060 Super, especially in games like Forspoken
@Mostly_Positive_Reviews Жыл бұрын
I did a video on Forspoken with FSR 3 as well, albeit a short one. You can download the demo for free on Steam to try it if you don't have the game yet.
@Reaper1008 Жыл бұрын
How do you get your overlay from Geforce Experience to show. I use it as well, but it never shows up in my recordings.
@Mostly_Positive_Reviews Жыл бұрын
I used to have the same issue, but I didn't do anything to fix it really, it just started working a few driver updates ago. Maybe try DDU in safe mode and then reinstall the drivers from scratch, see if that helps?
@Reaper1008 Жыл бұрын
@@Mostly_Positive_Reviews I think in the past it did work for me.. it's not a big deal .. but I was just curious. Thanks
@96thelycan11 ай бұрын
Me, happy with a newly bought 4060, wanting to better optimize my PC and about to watch this video :)
@96thelycan11 ай бұрын
me 18 minutes later :(
@Mostly_Positive_Reviews11 ай бұрын
I was a bit harsh on the 4060, I know :( The only reason is that it offers very little over the 3060, but since the video was made, the 4060 has come down in price and the 3060 and 6600 XT / 6700 XT GPUs have become a lot more scarce. So depending on pricing, the 4060 is really not a bad GPU, especially if you can't find alternatives or older-gen cards for a good price. Its performance increase when it comes to RT is quite significant over the 3060, and you get AV1 and frame generation. If it was $50 cheaper at launch and had 12GB VRAM it would've been an awesome card. The only thing that brought it down is that at the time of release it cost the same as a 3060, and there weren't many performance gains except for RT.
@96thelycan11 ай бұрын
@@Mostly_Positive_Reviews I actually got it for $250, but I paid a US vendor, live in the UK, and had a $50 customs charge. But overall I'm not super sad about it. It's giving me 100+ FPS in Cyberpunk with low RT at 1440p. I barely play FPS-heavy games though, so I'm not needing an upgrade yet.
@Mostly_Positive_Reviews11 ай бұрын
The one I got for my channel I am using in my wife's PC now as she records gameplay and the NVENC encoder is very good. It's also pretty fast at 1080p and very decent at 1440p gaming.
@genetalavera42563 ай бұрын
I wish you covered that it may introduce artifacting or ghosting at times. Good video nonetheless :)
@Mostly_Positive_Reviews3 ай бұрын
Thank you. I am learning as I go along, and hopefully my videos get better with time.
@ericsilva7430 Жыл бұрын
I play on a 60Hz 43-inch TV with V-Sync off and a frame limit of 60. I wonder if frame generation helps in those situations where the game can't reach/keep 60 fps, or is this only good if you are already over 60 fps on a 144Hz+ monitor? If the game is running at like 30-40 fps and FG pushes it to 60, does it feel better, or does it make things even worse?
@Mostly_Positive_Reviews Жыл бұрын
If you go from 40 fps to 60 fps it feels pretty bad. If you go past 60 fps and cap it to 60 via v-sync or framerate cap it's not worth it in my opinion. I tried that on my 4k60 monitor and it felt terrible. It really is only to provide a high refresh rate output if you are already getting around 60 fps. AMD also advises to have at least 55 fps at 1080p before enabling FG, and 70 fps at 1440p or higher. In my Cyberpunk 4070 Ti video I showed this, and enabling frame gen when running at around 30 fps the input latency was close to 200 ms.
@mario_luis_dev11 ай бұрын
Informative video for noobs, but the title is super misleading. For anyone who knows what they're doing there's no new secret info whatsoever here
@Mostly_Positive_Reviews11 ай бұрын
Key phrase being "who knows what they're doing". So many people try to use it to go from 15 fps to 30 fps, or use a 60hz monitor with vsync and then complain about input lag. That's who this video is for.
@caramelsensation694310 ай бұрын
@@Mostly_Positive_Reviews Nobody is trying to use it to go from 15 to 30fps because in what game would a DLSS frame gen enabled card get 15fps? Also, DLSS disables vsync and requires forcing the option on in the control panel which only someone 'who knows what they're doing' could do.
@greenbow78888 ай бұрын
I keep trying to find information about whether frame generation quality has improved with time. I remember a video showing Microsoft Flight Simulator, where the crosshair would distort when panning. I think they fixed that. I am wondering, however, if FG in general has improved over time. I'm thinking about it because I would want to use FG in Cyberpunk 2077 if I bought it. Would be a shame to buy C2077, turn path tracing on, add Ray Reconstruction, only to have FG ruin it.
@Mostly_Positive_Reviews8 ай бұрын
FG has definitely come a long way, but there are still some noticeable artifacts sometimes. For instance, in Cyberpunk, the first time you draw a gun with "digital ironsights" it is slightly garbled for a second or so, but then it stabilizes. Stuff like that. It is much better than initially, especially in Spider-Man and MSFS. Spider-Man's HUD was garbled in motion, but that doesn't happen anymore. Crosshair artifacts are not an issue anymore either, at least in the games I tested so far. A game might release with a subpar integration of FG where some of these things might be present, but it doesn't take long for it to get fixed in most cases. For Cyberpunk, the implementation is probably 98% flawless. It looks great, runs great, and you'll have to pixel peep to really see a difference with FG on vs off. There are some artifacts here and there, as explained regarding the ironsights, but they are very few and far between. Please note that this is only my perception / experience with it. Others might notice the artifacts more, but I don't, and I use it in any game that has FG, except competitive shooters like The Finals and MW3.
@WASD-MVME Жыл бұрын
I was always on the fence with DLSS, it's very much game dependent, but the few games I have used frame gen in I have been very impressed. As you mentioned, if I was already getting decent frames but just needed a little bump, it's the feature of the generation if you ask me
@WASD-MVME Жыл бұрын
Playing Ratchet & Clank on PC now. Besides being a drop-dead gorgeous game at 4K with ray tracing cranked up, the frame gen seems a good implementation
@Mostly_Positive_Reviews Жыл бұрын
R&C performs very well. There are a few issues at the moment, like crashing and Nvidia Reflex causing performance loss etc, but a decent frame generation implementation, agreed. And the game looks SO good!
@Kahunaseb Жыл бұрын
AWESOME video
@Mostly_Positive_Reviews Жыл бұрын
Thank you, appreciate it!
@kahaneck10 ай бұрын
I can feel the input latency in D4 at native res 60fps... its super common for me to press a skill, like flame shield and die because it did not register my input in time.
@Mostly_Positive_Reviews10 ай бұрын
I don't think it's input latency, as this still happens at 300 fps without frame generation as well. This feels more like game or online latency, as button presses aren't instant no matter what. If you measure your input latency using the Nvidia overlay you'll see it's very low, even at 60 fps native.
@GaminiKolla-c7t4 ай бұрын
Your explanation is great. Please make a video about "Lossless Scaling" (AI upscaler)
@Mostly_Positive_Reviews4 ай бұрын
Only if you promise to watch it 🤣 Will make one for sure 👍
@drgitgud Жыл бұрын
Very good video. I played Cyberpunk with path tracing on my 4070 Ti and the latency was just too much with frame gen enabled. The 70 fps was playable, no doubt, but if you stop looking at things and just want a snappy gameplay experience, high latency is a no-go. So yeah, using frame gen to increase from 60 fps upwards to avoid latency issues is actually a really good takeaway from this video. Thanks man
@Mostly_Positive_Reviews Жыл бұрын
Appreciate the comment! Yeah, AMD also recommend at least 55 fps before enabling their frame gen technology, and I found anything below 50 to be just too low as the input latency penalty becomes a lot more noticeable. So what I do on the daily is configure the game so that I get about 80 fps and then enable frame gen, and it works great. Have a good one and happy gaming!
@prestonarsenault99759 ай бұрын
I'm using frame gen in Starfield right now with an RTX 2060 Super, and holy input lag / floaty feeling. The "frames" are amazing, it looks like a buttery smooth 144 fps, but man, the input sure as hell doesn't feel like 144 fps. It's an old-ass GPU though, so it makes sense.
@Mostly_Positive_Reviews9 ай бұрын
Starfield actually feels very floaty indeed with FSR3 Frame Gen. Tested it on the 3080 earlier this week and I went back to DLSS without FG. Try enabling reflex in the game, and then also Nvidia Low Latency in Nvidia's control panel and see if it helps?
@vulcan4d11 ай бұрын
It does a great job increasing the FPS counter for great marketing material. As for gameplay, frame gen feels meh when playing.
@Mostly_Positive_Reviews11 ай бұрын
Fair enough! Though I enable it in every game I can, except MW3. Tech like this doesn't belong in a multiplayer competitive shooter.
@Son37Lumiere10 ай бұрын
@@Mostly_Positive_Reviews The fact that it requires a pretty high frame rate to begin with in order not to feel excessively laggy makes its usefulness limited. It is decent for certain situations though.
@Mostly_Positive_Reviews10 ай бұрын
@@Son37Lumiere It does limit its uses, agreed, and a lot of people aren't aware of the fact that it's not meant to turn 30 fps into 60 fps, but rather 90 fps into 140 fps, for example. I have a 1440p 165Hz monitor and I adjust my visuals so that I get around a 90-100 fps base frame rate, and then I enable frame gen to get to 140+. I mostly play single-player games, and the added input latency at those frame rates doesn't bother me in the games I play, but I do get a more visually pleasing presentation in the form of increased motion fluidity. I get why people don't like it though, it's not perfect. There is some artifacting, additional input latency, and you require a high enough frame rate for it to work properly. Generally for me though it works quite well, as does FSR 3 frame generation, mostly. The increased motion fluidity is worth the negatives to me, but I understand that for others it might not be.
@Son37Lumiere10 ай бұрын
@@Mostly_Positive_Reviews I do use AFMF in Cyberpunk on my 7900 XT to achieve 120 fps with RT reflections enabled, which causes a 50% decrease in performance alone on AMD hardware. Being that nearly every surface in CP is reflective it does have a pretty large visual impact, more so than any of the other RT settings save path tracing. In most games though, I've found it to be unnecessary, except to overcome a hard 60 fps limit or to make a game I'm getting less than 100 fps in feel better with my 144 hz monitor. I will say, it generally performs better than I was expecting though.
@BaIlincat43 Жыл бұрын
For my first PC I'll be getting a 2080 Ti because here they are selling for $250 :D Can't wait to use DLSS 3.5 and FSR 3's FG in Cyberpunk. (I think Cyberpunk supports FSR 3, right?)
@Mostly_Positive_Reviews Жыл бұрын
The 2080 Ti at that price is a crazy good deal. Yeah, FSR 3 will be supported on Cyberpunk. FSR 3 launches Q1 2024 and I am very excited as well!
@krspy1337 Жыл бұрын
@@Mostly_Positive_Reviews No, FSR 3 launches next month in the first two games, and normal HYPR-RX (without AFMF / AMD Fluid Motion Frames) launches around then too. HYPR-RX with AFMF (in the driver), which is RDNA 3 only, will come in Q1 2024
@Mostly_Positive_Reviews Жыл бұрын
@@krspy1337 You are indeed correct, thanks for the correction 👍
@iamspencerx Жыл бұрын
The problem with frame generation is that the players who need it can't use it, because it requires a high framerate in the first place. Also, I'm not buying a 600€+ GPU just to have to use upscaling and frame generation; those tricks should be for 200€ GPUs
@Ghost-_-Gav10 ай бұрын
I'm coming from console to a 4070 Ti. I'd like to run max settings at 1440p while getting around 40 fps; coming from console, a lot of games in the quality preset are locked at 30 fps, and some even drop into the 20s sometimes. So 40 fps seems like it would give me room to drop some without going below 30. The majority of games I wanna play are older as well. I definitely wanna play Cyberpunk with max settings, and DLSS with input latency wouldn't really bother me, I don't think. The 4080 was my first option, but 50 series cards are rumored to launch within the next year or so, so I'll get a 4080 or 4090 if the price drops a little on them
@RealBritishPatriot7 ай бұрын
I use an RTX 4050 and play a lot of competitive games; I never used frame gen until I downloaded Ark Survival. I was getting 30 fps with medium settings. I turned the AI frame gen on and got 80 frames, and I can't even notice the latency, it seems instant to me.
@Mostly_Positive_Reviews7 ай бұрын
Frame Gen in Ark Survival Ascended is absolutely necessary, game runs so poorly without it 😔
@yuan5154 Жыл бұрын
If I know anything, it's that normal people can't tell a 50 ms latency difference when they're used to playing Dota at those pings. And smooth gameplay is always a + in my book. Sure, there might be artifacts now, but we were shown that fixing the artifacts is possible, and it's promising future tech
@Mostly_Positive_Reviews Жыл бұрын
I also believe it will get better over time. Look at what happened with DLSS 1. Hopefully FSR3 is good as well, will be testing as soon as it comes out.
@JohnnyBg290511 ай бұрын
The math is simple: 25% worse latency for 80-100% more frames. Works phenomenally with a 50+ base framerate.
@Mostly_Positive_Reviews11 ай бұрын
And 25% is on the high end; some games see a very minimal increase in input latency. But I realized a lot of people use it on 60Hz monitors, or start with too low a framerate and want to achieve a final output of 60 fps, and then it's going to be terrible.
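A quick check of that rule of thumb using the ~10 ms average increase measured in the video (the ~1.9x scaling factor below is an assumption, since frame generation rarely delivers an exact 2x):

```python
def fg_tradeoff(base_fps: float, base_latency_ms: float, added_ms: float = 10.0):
    """Estimate displayed FPS and total PC latency with frame generation enabled."""
    displayed_fps = base_fps * 1.9            # assumed ~90% frame-rate scaling
    total_latency = base_latency_ms + added_ms
    return displayed_fps, total_latency

fps, lat = fg_tradeoff(base_fps=60, base_latency_ms=40)
print(f"{fps:.0f} fps displayed, {lat:.0f} ms latency")  # ~114 fps, ~50 ms (+25%)
```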
@born2serve92 Жыл бұрын
TBH I needed high VRAM for work and got a 16 GB 4060 Ti, and since gaming comes second to making money, it works for me, but I play at 1080p 165Hz. Pricing is insane but everyone has different use cases for different cards :)
@Mostly_Positive_Reviews Жыл бұрын
Yeah, that's true and fair. It's not a bad card at all, just that the pricing sucks. But that's true for pretty much the whole 40-series unfortunately.
@levijosephcreates Жыл бұрын
I'm in the camp that can notice latency, possibly conditioned as a result of playing more competitive online FPS titles from c2000 onwards. It's very noticeable to me in Cyberpunk, to the degree that I've turned it off in that title. I would prefer 60-70 natural frames over 100+ with frame generation. As a result I've decided frame generation is only useful to those that don't notice the additional latency, or in games where latency isn't an issue, if there is such a thing. I don't hate frame generation, I just think it's useless tech, for me, as it stands, especially given you need a high frame count in order for it to work well; kind of a catch-22 on that point.
@Mostly_Positive_Reviews Жыл бұрын
Yeah, you make very valid points, and FG is situational. If you notice the input latency and it bothers you then it's best to leave it off. It actually works better when playing with a controller, like Witcher 3, and Ratchet and Clank etc., as the input latency with a controller isn't as noticeable as with a mouse. But if you don't like it, that's also great; at least you tried it, found it wanting, and that's fine. I mean, you don't need me to tell you it's fine hahaha, but I appreciate your comment sharing your experience with it and why you don't like it.
@levijosephcreates Жыл бұрын
@@Mostly_Positive_Reviews Yeah exactly, I can only talk from my point of view. I don't care if others buy into the FG marketing; it's impressive tech regardless of it being useless for me.
@chocho6766 Жыл бұрын
Games like Telltale's will benefit greatly from this
@krspy1337 Жыл бұрын
FG is only usable on high fps games (80+)
@juneaoalfred3704 Жыл бұрын
An RTX 4060 or 4070 is my next GPU upgrade... VSR for watching movies, and it's 2x the performance of my current GPU too...
@Mostly_Positive_Reviews Жыл бұрын
If money allows I would highly recommend the 4070 over the 4060. I have both and the 4070 is just a much better GPU.
@juneaoalfred3704 Жыл бұрын
Thanks for the recommendation. Btw, 110W vs 200W is a massive difference too. Because I pay my own electricity bills, I think I'll go for the 4060 then, because right now my current GPU has a 150W TDP. I really do care about my bills; that extra 50W is equal to 4 x 12W light bulbs 😂. Idk, but I will reconsider.
@Mostly_Positive_Reviews Жыл бұрын
@@juneaoalfred3704 Yeah, makes sense in that regard. The 4060 is extremely efficient. But the 4070 with a bit of an undervolt, or even just running at a lower power target, can actually be slightly more efficient per frame - but I get it.
@guitaristkuro8898 Жыл бұрын
Just call it "DLSS X.X" plus the feature name. It's not "DLSS is: DLSS 2 Super Res, DLSS 3 Frame Generation, and Reflex". That's just wrong and, as I've learned, a common misconception. It's simply "DLSS X.X is: Super Resolution, Frame Generation, Reflex, and Ray Reconstruction".
@Mostly_Positive_Reviews Жыл бұрын
Until 3.5 released there was only 2 and 3, so it wasn't wrong to call it that. DLSS 3 is just a term though: DLSS 3 means the combination of DLSS 2, Reflex and Frame Generation. That is not wrong. And DLSS 1 and 2 are Super Resolution, or upscaling. DLSS 3.5 does not always include frame generation either, as ray reconstruction works on the 20 series as well. In the end, DLSS naming is confusing and I tried to explain it as best I could in layman's terms.
@tunahan6719 Жыл бұрын
Thanks a lot, it definitely is a great feature.
@Mostly_Positive_Reviews Жыл бұрын
The premium for it sucks, but the feature is very good.
@tunahan6719 Жыл бұрын
@@Mostly_Positive_Reviews Yeah, but it is what it is. I actually thought the latency would be extremely bad due to how people talk about this topic, but I'm relieved after watching your video.
@Mostly_Positive_Reviews Жыл бұрын
@@tunahan6719 Yeah, I use it on a daily basis. The latency is only bad if your base frame rate is low. Other than that it really is awesome.
@joshdingle930 Жыл бұрын
The latency also depends on the game. I get about 40-50 fps and 35ms latency in a slightly demanding room in Portal with RTX on my 4070 Ti with DLSS Quality and no frame gen, and when I turn on frame gen it goes to about 80 fps and 40-60ms latency, which is honestly totally playable for a single-player puzzle game like Portal.
@Mostly_Positive_Reviews Жыл бұрын
Oh yeah, definitely. Some games, when measured with an LDAT, actually show slightly lower latency as well. I didn't show all the games I tested in this video, but the average increase in latency over the 7 (or 8) games tested was 10ms. Some were lower and some higher, and it changes from game to game.
@poodledoodle2416 Жыл бұрын
How much extra VRAM does frame generation take up generally? I'm currently near the VRAM limit in some games, so I was wondering if there'll be a performance impact if it goes over.
@WASD-MVME Жыл бұрын
How much VRAM do you have?
@Mostly_Positive_Reviews Жыл бұрын
It does use some VRAM, but it depends on the game. I saw anything from 500MB to 1.5GB, but that is usually offset by using DLSS Super Resolution. Running out of VRAM will definitely cause performance issues, but also, a lot of games can get by with say 8GB yet will use 14GB if it's available. So the VRAM usage you see might not be what is actually required. Diablo 4 uses up to 21GB on a 4090, but still runs perfectly fine on a 12GB 4070 Ti, for example.
@Mostly_Positive_Reviews Жыл бұрын
Just tested this again last night and I saw an average of 1GB extra usage from only enabling frame generation. When I enabled DLSS Super Resolution on Quality mode it reduced VRAM usage again by about 400MB, so in my experience you can expect around 500MB to 1GB of additional VRAM usage, depending on the game and settings.
@Viralvids128 Жыл бұрын
People don't hate FG. They hate the fact that it was the only selling factor. If the 4060 was better than the 3070 then we all would have loved it.
@doltBmB Жыл бұрын
In theory it is for taking already-decent performance and pushing it further so that you can max out a high-refresh-rate monitor.
@Mostly_Positive_Reviews Жыл бұрын
Yeah, exactly this! Another great benefit: if you are CPU bound you won't get any performance benefit from enabling upscaling, but frame generation provides a higher frame rate even when CPU bound.
@ericsilva7430 Жыл бұрын
@@Mostly_Positive_Reviews If the game is running at like 30-40 fps and FG pushes it to 60, does it feel better or worse?
@Dserebrakov Жыл бұрын
11:58 It is not like 33% of the frames are AI generated - 50% of the frames are ALWAYS AI generated, so your real raw fps is the frame-gen FPS divided by 2.
@Mostly_Positive_Reviews Жыл бұрын
Yeah correct, this was addressed in previous comments 👍
@turbet5 Жыл бұрын
Speaking of Doing It Wrong: Nvidia could actually do frame generation much better if they used the motion vector buffer (already used by DLSS 2 for another purpose) to predict a future intermediate frame. I think this was the initial idea of deep-learning frame generation, but they didn't pull it off and are now using pretty simple morphing, which the FSR 3 implementation confirms can run on RTX 20-series cards.
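For anyone curious what the difference between the two approaches looks like, here is a minimal toy sketch in Python/NumPy. It is nothing like Nvidia's or AMD's real pipelines; the array shapes and the `warp` helper are assumptions made up purely for illustration. It just contrasts interpolating a frame between two finished frames (which adds roughly a frame of delay) with extrapolating forward from the latest frame using its motion vectors (no added delay, but disocclusions have to be guessed).

```python
# Toy sketch only. Frames are float arrays of shape (H, W, 3); `motion` is an
# (H, W, 2) array of per-pixel (dy, dx) offsets per rendered frame. None of
# this mirrors the real DLSS 3 / FSR 3 implementations.
import numpy as np

def warp(frame: np.ndarray, motion: np.ndarray, t: float) -> np.ndarray:
    """Backward-warp: sample each output pixel from t steps back along its motion vector."""
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.rint(ys - t * motion[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.rint(xs - t * motion[..., 1]).astype(int), 0, w - 1)
    return frame[src_y, src_x]

def interpolate(prev_f, next_f, motion, t=0.5):
    # Needs *both* surrounding frames, so the newest real frame is delayed
    # while the in-between frame is displayed first (this is the latency cost).
    return (1 - t) * warp(prev_f, motion, t) + t * warp(next_f, motion, t - 1.0)

def extrapolate(latest_f, motion, t=0.5):
    # Predicts *ahead* of the newest frame: no extra delay, but newly revealed
    # areas (disocclusions) have no source pixels and would have to be guessed.
    return warp(latest_f, motion, t)
```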
@venusprinzj8094 Жыл бұрын
AMD fanboys hate FG because it's from Nvidia. AMD is also working on a feature like FG, but that's okay because it's from AMD. I'm calling it now… it will be way worse compared to DLSS 3 😂
@Mostly_Positive_Reviews Жыл бұрын
I really hope AMD can pull it off, and even though it’ll probably not be as good as DLSS 3, it will be great for people with 30 series and below GPUs as well if it follows the same path as FSR. Nvidia also needs some competition as AMD really failed to show up this gen.
@venusprinzj8094 Жыл бұрын
@@Mostly_Positive_Reviews We don't know yet if FSR 3 will be supported by older GPUs, as far as I know. I hope Intel can do some magic and provide some competition. Three GPU brands would stabilise the market and bring balance.
@Azureskies01 Жыл бұрын
@venusprinzj8094 I will call out input lag (frame gen) from AMD just as hard as I do from Nvidia, because the tech is just not good and not ready for the people that need it most (lower-end cards at lower frame rates). It is DLSS 1.0 all over again and I'm not afraid to say it. If Intel or AMD want to come out with something just as shitty, I'll tell it how it is, simple.
@venusprinzj8094 Жыл бұрын
@@Azureskies01 It's great, and it's not bad just because you can't use it. The input lag is not a big deal, and you would know that if you watched the vid. It's a great feature, and most people who are crying about it have never used it at all.
@Azureskies01 Жыл бұрын
@@venusprinzj8094 I don't need to use something myself - watching enough videos testing it out and looking at Nvidia's own monitoring software is enough to know it is not worth turning on. I would say the same thing if AMD or Intel made some dumb shit like input lag (frame gen).
@bredsfx Жыл бұрын
The point of the "haters" is that frame generation is fake FPS, almost the same as what smart TVs have been using for a decade to convert movies to 60 FPS. It won't help you be a better player in competitive games, but it helps smooth out the movement. Yes, Cyberpunk is not competitive, but in fast action you're still better off without frame generation, even at lower quality. If I were a dev I would add an "auto" option where the game turns off FG when you are in combat. Maybe I'm a little too comfortable, but I can't aim with FG.
@Mostly_Positive_Reviews Жыл бұрын
Yeah, some games work better than others, and I understand if people don't like it, I really do. But most arguments are just silly. Yours isn't, as you give a good example of why it isn't so good for you, and that's perfectly fine. An auto option might be something to explore 🤔
@antarus6338 Жыл бұрын
My GTX 650 Ti is crying.
@Mostly_Positive_Reviews Жыл бұрын
If you still use a GTX 650 Ti then that’s really awesome! It’s probably crying tears of joy at being kept alive for so long. But if you plan on playing any recent AAA game then yeah, probably time for it to be put to rest. Still, really awesome to hear you are still using a 650 Ti! Never sell it.
@antarus6338 Жыл бұрын
@@Mostly_Positive_Reviews Well, I'm just a poor person, plus there is a difficult situation in the country and in the economy; not everyone can buy a computer. I'm from Ukraine.
@Mostly_Positive_Reviews Жыл бұрын
@@antarus6338 Yeah, I can only imagine. Didn't mean to sound elitist at all; sometimes it's all people can afford, and that is also fine. As long as people enjoy what they have and what they are playing, all is good! Really hope the situation in Ukraine improves quickly :(
@Leo_Hidalgo Жыл бұрын
Doesn't activating Reflex On + Boost have a better effect on decreasing latency?
@Mostly_Positive_Reviews Жыл бұрын
The Reflex toggle becomes completely unavailable when enabling FG, so it's not possible to enable + Boost unfortunately. I am busy with a video showing the differences between On and On + Boost, obviously without FG as you can't change it there 👍
@Leo_Hidalgo Жыл бұрын
@@Mostly_Positive_Reviews Thanks for the answer. I tested it here in Miles Morales and Ratchet and it is possible to switch between the two options even with FG activated, but I did not notice a difference in latency, only in the frame rate, which drops a little when activating Boost.
@Mostly_Positive_Reviews Жыл бұрын
@@Leo_Hidalgo Interesting, I'll have another look at Ratchet, but I know in Cyberpunk it locks it completely, and in other games as well. I might be wrong about Ratchet and will check Spider-Man again, as I don't have Miles Morales. Thanks for letting me know. As for the lower framerate, that is typically what happens with + Boost enabled. Thanks again for the heads-up about being able to set it to On + Boost, I will check it out. I had just assumed from the other games I have tested that you can't change it. You learn something new every day.
@cptnsx Жыл бұрын
I would not say people hate Frame Generation, but they do hate when people mischaracterize it. For example, you proved, even with this sub-par tool, that ~85 "FPS" with frame gen has ~55ms of latency while a real ~85 FPS has ~30ms. At a true, real ~130 FPS that latency would have been ~15-20ms, so yes, ~40+ms is bad. Now, I'm not saying it's not usable, especially in single-player games - I don't notice it - but for people who are all about reducing latency with higher framerates: frame generation is not real frames, and it doesn't reduce latency. It's a frame-smoothing technology. It's also kind of infuriating when people rant against monitors and especially TVs over 5-10ms of latency, but when Nvidia has tech that adds 10-20+ms of latency at a given displayed "FPS", suddenly it doesn't matter.
@jonelbondyingnuezca742 Жыл бұрын
Most of the haters do not have the money to buy the 40 series.
@Mostly_Positive_Reviews Жыл бұрын
Not sure it's money related, but a lot of detractors on these two videos of mine have never tried it, nor have a GPU capable of it. Some criticisms are fair, and you can clearly tell some people have actually tried it but didn't like it, and that's fair. But you have to have at least some experience with it if you want to talk it down, otherwise your opinion doesn't count for much. I get that not everybody can just buy a new GPU every 2 years, I don't either, but nobody should say anything bad about it if they haven't tried it.
@lowestpoly647 ай бұрын
Just a question: let's say I'm hovering around 55-60 fps with some fluctuation, and I use frame gen to lock it to 60. Would that cause input lag problems?
@Mostly_Positive_Reviews7 ай бұрын
Yeah, very much so, as half the frames would be generated, so you'd get worse latency than at 30 fps without FG. I only know this because I tried something similar 🤣 I got around 110 fps but wanted to lock it to 120 using FG and a framerate limiter, and my latency more than doubled. Unfortunately, when enabling frame gen, every second frame is generated, regardless of whether you only need 5 more fps to lock it to 60. What you can do is enable FG at 55 fps and then use Fast vsync in the Nvidia Control Panel. That way it won't lock the framerate to 60 fps, and it will try to prevent the tearing caused by going over your monitor's refresh rate. It doesn't always work, but I have gotten Fast vsync to work on my 4K60 monitor, and then it's a good experience.
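To put rough numbers on why capping the frame-gen output at 60 backfires, here is some illustrative arithmetic only (it assumes half the displayed frames are generated and the newest real frame is briefly held back for interpolation; real per-game latency varies):

```python
# Illustrative arithmetic only; assumes half of the displayed frames are
# generated and the newest real frame is briefly held back for interpolation.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Case 1: FG on, output capped at 60 fps -> only ~30 real frames are rendered.
print(frame_time_ms(60 / 2))   # ~33.3 ms between real frames (30-fps feel, plus FG delay)

# Case 2: FG on, uncapped, from a ~55 fps base -> real frames stay at ~55 fps.
print(frame_time_ms(55))       # ~18.2 ms between real frames (much smaller FG delay)
```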
@DerdOn0ner Жыл бұрын
All the AI shenanigans in game rendering are amazing. Given how fast AI is evolving, these technologies will improve drastically in a short timespan. People talking about "fake frames" have absolutely no idea what they are talking about 😂
@Mostly_Positive_Reviews Жыл бұрын
I agree with you. And now that AMD has joined the fray it's even better for everyone.
@marsovac Жыл бұрын
This is partially true. DLSS 3 Super Resolution is its own thing - it is an upgrade to DLSS 2. Then DLSS Frame Generation is its own thing, and Reflex is its own thing. Reflex works with DLSS 2, but Frame Generation doesn't; Frame Generation also doesn't work without an RTX 4000-series card, even though the DLSS 3 upscaler itself can run on older generations. It is a mess.
@Mostly_Positive_Reviews Жыл бұрын
Yeah, and the introduction of ray reconstruction with DLSS 3.5 has since caused even more confusion 🤦
@loutrepolemique5951 Жыл бұрын
11:57 You describe 33% of the frames as being AI generated; that is not quite correct. It will always be 50% AI-generated frames, as frame generation always produces an image between two rendered frames, whatever the base and final framerates are. If you only get 33% more frames at the end, it is because frame generation has a cost which varies with the situation (i.e. in-game scenario, resolution, etc.). So the number of real frames actually decreases, and that is why power usage might be lower.
@Mostly_Positive_Reviews Жыл бұрын
Yes, you are correct, I misspoke. The uplift I saw was 33% in some cases, but 50% are generated frames.
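A quick sanity check of that 50%-vs-33% distinction, using made-up round numbers rather than figures from the video:

```python
# Illustrative numbers only (not measurements from the video).
native_fps = 100              # rendered frames per second with FG off
fg_output_fps = 133           # displayed frames per second with FG on

rendered_with_fg = fg_output_fps / 2           # every 2nd displayed frame is real
print(rendered_with_fg)                        # 66.5 -> fewer real frames than native
print((fg_output_fps / 2) / fg_output_fps)     # 0.5  -> 50% of displayed frames are AI
print(fg_output_fps / native_fps - 1)          # 0.33 -> the "+33%" uplift seen on screen
```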
@NateT3 Жыл бұрын
Shouldn't you combine Nvidia frame gen with Nvidia Reflex? It seems like the techs are built on top of each other and complement each other.
@Mostly_Positive_Reviews Жыл бұрын
Nvidia Reflex gets enabled automatically when frame generation is enabled, and you can't disable it, so it's always on by default 👍
@faisalrazajarral Жыл бұрын
Nice video
@Mostly_Positive_Reviews Жыл бұрын
Thanks, appreciate it!
@Tyuwhebsg Жыл бұрын
Amazing tech
@vmafarah9473 Жыл бұрын
What does it feel like to use frame gen when your frame-gen FPS is 60 and your non-frame-gen FPS is way less? Is it more input lag, or artifacts?
@Mostly_Positive_Reviews Жыл бұрын
Yeah, if your end result is 60 the input latency is quite bad. I showed this in my 4070 Ti Cyberpunk video a little while ago. Using path tracing I got 20 fps or so; I enabled FG and the latency shot up to just below 200ms. It stabilized at around 100+ ms, but it wasn't a good experience 😂
@krspy1337 Жыл бұрын
I would like you to do the same video with the first two FSR 3 games! Your videos are great.
@Mostly_Positive_Reviews Жыл бұрын
Appreciate it, really. Will definitely do videos as soon as FSR3 drops, really excited to see it in action!
@asap1113 Жыл бұрын
How will DLSS and FG on the 40-series RTX cards work with older CPUs?
@Mostly_Positive_Reviews Жыл бұрын
DLSS Super Resolution doesn't help when CPU limited unfortunately, but frame gen helps a ton. Also, it depends what you'd call an older CPU. But I reckon the 50-series GPUs will be very fast indeed, so you'd need a decent CPU to keep up, as even today some games can be CPU limited with a 13900K and a 4090 at 4K.
@asap1113 Жыл бұрын
Thanks for the reply. I was thinking about the RTX 4070 with a Xeon E5-1650 v4 CPU and 32GB of DDR4 RAM for 1440p gaming. As far as I understand, it's best to maximize the graphics quality so the CPU is less of a bottleneck, right? @@Mostly_Positive_Reviews
@Mostly_Positive_Reviews Жыл бұрын
Yeah, I think that would be fine at 1440p Ultra or High. You might run into some issues with games that use Ray Tracing that can be intensive on the CPU as well. Spider-Man Remastered is a prime example of this.
@asap1113 Жыл бұрын
@@Mostly_Positive_Reviews Thanks for the reply. Do you think downgrading the GPU to an RTX 3070 and upgrading the CPU to a Ryzen 9 3900X would be better?
@Mostly_Positive_Reviews Жыл бұрын
@@asap1113 Frame generation will definitely help you a lot when CPU bound, so you'll get a higher framerate out of the 4070 in games that support frame generation. It also has 12GB of VRAM. You can always get the 4070 now and upgrade your platform later, that's what I would do personally.
@bluegizmo1983 Жыл бұрын
I'm still using a GTX 1080 Ti, so I have no opinion either way yet on DLSS and/or Frame Generation, but from what I've seen in videos of it, if you honestly cannot tell any difference in quality DURING ACTUAL GAMEPLAY (still frame pixel peeping doesn't count) with Frame Generation being on vs off, then why hate it? If it works just shut up and use it!
@Mostly_Positive_Reviews Жыл бұрын
The 1080 Ti is still such an awesome GPU. It was my personal favourite out of all the GPUs I have owned. That said, this is exactly my point: if you need to pixel peep, or zoom in 5x while playing back at 0.25x speed to tell the difference, you are not going to notice it during actual gameplay. That's actually the reason why I disable RT reflections in most games too - they look nice when you stop to look at them, but I barely notice them when actually playing the game. I understand people complaining that the 40 series is too expensive, because it is. But frame generation is pretty good nonetheless.
@robw7381 Жыл бұрын
Frame gen is the MP3 of gaming; it's not going away. It could be great for budget gamers. I will be going from a 1660 Super to a 4060.
@Azureskies01 Жыл бұрын
Input lag (frame gen) will make your gaming on a 4060 a lot worse, as you won't be at a high enough frame rate to take advantage of it without crippling your input latency. It will be like playing on Stadia but on your own local machine lul. Get the 4080 or 4090 if you want to turn input lag (frame gen) on.
@robw7381 Жыл бұрын
@@Azureskies01 I guess you missed the part where I said budget gamer. As a budget gamer you make compromises. I'm not spending $1000 or $1600 on a GPU, especially for below-par AAA games like Redfall and The Last of Us, which I have already played twice.
@Azureskies01 Жыл бұрын
@@robw7381 You aren't a budget gamer if you buy a $500 GPU; sorry, that is the way it is. You can buy a 6800 XT for around $550 right now. There is no reason to get a 4060. Even the 8GB version isn't worth $400, so have fun with all that "logic" you have going on.
@robw7381 Жыл бұрын
@@Azureskies01
@alicini68 Жыл бұрын
@@Azureskies01 The most ridiculous comment I've ever seen. I'm using a 4060 and I don't feel any difference with it on or off. It's not about getting more fps.
@LK-md1idАй бұрын
Idk why, but every game that I play with frame gen on either gets super input lag or causes weird visual effects like stutters.
@Mostly_Positive_ReviewsАй бұрын
What is your framerate before you enable frame gen? And what is the framerate after?
@LK-md1idАй бұрын
@@Mostly_Positive_Reviews I capped it at 60 fps using RivaTuner, so I don't really know the max fps.
@Mostly_Positive_ReviewsАй бұрын
@@LK-md1id Ah okay, makes sense then. Yeah, if your final framerate is 60 fps it will feel terrible. Best use case for frame gen is to get a base framerate of at least 60 fps, and then take that to 100+ fps with frame gen. If you cap it to 60 the input latency and visual artifacts will be terrible unfortunately. The tech was mostly developed with high refresh rates in mind, not to go from 30 fps to 60 fps.
@LK-md1idАй бұрын
@@Mostly_Positive_Reviews I see. Ok, I'm going to lower some graphics settings and run some tests. TQ
@nonotres Жыл бұрын
If you come from a 3060 Ti it makes no sense to buy the 4060. But if you come from a computer that is several years old, in my case one from 2009, buying the 4060 makes sense for several reasons:
- The VRAM is the same as the 3060 Ti, so the frames per second are very, very similar without turning on DLSS or RT, being a little higher on the 3060 Ti.
- Power consumption is lower on the 4060.
- And, at least here in Spain, a 4060 is cheaper than a 3060 Ti.
So if they are very similar in numbers, even though the 4060 is not an innovation beyond DLSS, I think a 4060 is always worth it when you don't come from a 3060 Ti.
@Mostly_Positive_Reviews Жыл бұрын
Yeah, price plays a huge role. If they are the same price it's still better to get the 4060. The reason why it gets hated on so much is that, compared to last gen, there is barely any difference except for power draw, and the fact that it is still a bit overpriced. But all in all, if you come from a 1060 the 4060 is a decent upgrade 👍
@PKPK-476 ай бұрын
So… you are telling me you went from 50 fps to 130 at the SAME or even LESS input delay, and people are complaining that you got 80 flipping fps for free???