Latency Split Test: Find Your Lag Threshold

  Views: 11,078

Aperture Grille

A day ago

Comments: 102
@RocketScience 2 years ago
Awesome video as always and great tool! Haven't actually tried it myself yet but I wanted to chime in on something you've said. I happen to have written my master's thesis on the performance impact and perception of latency on video games so I know quite a few (> 100) papers on the topic. So you said that you recommend a 1:1 correlation between your mouse movement and cursor movement. Knowing the literature, it is unsurprising that this will give you the best latency guesses. The best latency detection is always found in 1 to 1 touchscreen movement tasks. With 1 ms end-to-end latency and a small square projected onto the stylus, regular participants were able to notice 2 ms of added latency (Ng et al., In the Blink of an Eye). The size of the square actually matters, because what they're perceiving is the stylus offset from the center of the square, and being off center is easier to notice in a small square than a bigger one. I had plenty of discussions with my supervisor who was of the opinion that that is not even "perceiving latency", but I think that it doesn't really matter, because you're always just perceiving some kind of difference that latency creates, even in a clicking test. Latency itself is not something that you could ever perceive after all. However, it is very important to realize that the different effects of latency aren't perceived identically, and even though gamers are more trained to perceive latency, I don't think anyone can notice 1 ms in a click test. So don't directly apply the results from one type to another. In a similar setup as "In the Blink of an Eye", they found direct touch dragging latency sensitivity of 9ms. For clicks it was 70 ms (Deber et al., How much faster is fast enough). When they projected the cursor against the wall instead of the touch surface, the sensitivities were 55 ms and 95 ms respectively. 
Of course, with your 1:1 setup your monitor is kind of like a projection on the wall, but if you deliberately stay far away from the monitor and eye your mouse relative to the line, you are able to see the distance offset from the line. There was actually a study deliberately studying offset stylus vs. no offset and vs. offset with the hand hidden (Annett et al., How low should we go? Understanding the Perception of latency while inking). I'm not too big on the study because they have a super high spread with some participants always at above 100 ms. I feel like those are not representative of people knowing what to look for. Regardless, the average is around 60 ms for no offset/small offset/large offset but then 100 ms when the hand is hidden behind a cover. At the end of the day, it depends on what you want to measure. If it is supposed to be like in a real game, then sit down as close to the monitor as usual, don't necessarily deliberately set it 1:1 because I don't think a 3D game is really like a 1:1 thing unless you play inverted camera, and focus on the screen and not your arm. Then you're testing a hand-eye latency sensitivity instead of an eye-eye difference or whatever you wanna call it. Either way, the person doing 1-3 ms is certainly impressive but not out of this world if they did it via visual offset. I have personally done 8 ms AB testing in Rocket League where there is no direct reference between car and input, but that was super difficult. I did 9/10 in one hour and tried to repeat and did another 8/10 in an hour so 17/20 in 2 hours. At first I couldn't do it at all but I needed to shift my strategy to a specific shot that required a precise angle and perfect timing and I could just tell I felt like it wasn't reacting right a bit more often with the added delay. But on a single attempt it was actually impossible to tell, which makes perfect sense anyways as there is always some variation in latency. That's why I failed at first. 
You may consider adding an automatic threshold finder by starting at 1 ms, adding 2 ms any time the person gets it wrong, and reducing it by 1 ms when they get it right. I've seen that in some papers and it stabilizes to the value where the person can detect it 70.7% of the time apparently (sqrt(2)/2). Not exactly sure how the math works out there. Might need the average of a couple of repeats to get an accurate estimate for an individual though. The value is pretty close to 75% which is considered the Just Noticeable Difference usually used in Psychology.
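The adaptive staircase suggested above (step the latency up after a miss, down after a hit) is easy to sketch. This is a hypothetical toy simulation, not LST code; the observer model, step sizes, and trial count are all assumptions. One hedged note: if I have the psychophysics right, a +2/−1 weighted up/down rule settles where the answer is correct about 2/3 of the time (Kaernbach's rule), while the 70.7% figure belongs to the closely related 2-down/1-up transformed staircase.

```python
import random

def staircase_estimate(true_threshold_ms, trials=400, start=1.0,
                       step_up=2.0, step_down=1.0, seed=0):
    """Weighted up/down staircase: raise the tested latency by step_up
    after a wrong answer, lower it by step_down after a right one.
    At equilibrium p * step_down = (1 - p) * step_up, so with these
    steps the level settles where the observer is right p = 2/3 of
    the time."""
    rng = random.Random(seed)
    level = start
    history = []
    for _ in range(trials):
        # Toy observer: detection probability grows with the tested latency.
        p_correct = min(1.0, level / (2.0 * true_threshold_ms))
        if rng.random() < p_correct:
            level = max(0.0, level - step_down)
        else:
            level += step_up
        history.append(level)
    # Average the second half of the run as the threshold estimate.
    tail = history[len(history) // 2:]
    return sum(tail) / len(tail)
```

For a 10 ms toy observer the run hovers around the level this observer gets right two-thirds of the time; averaging a few repeated runs, as the comment suggests, tightens the per-person estimate.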
@ApertureGrille 2 years ago
This is an awesome comment! Great to hear from someone who studies this. You're right on about why I suggested using a sensitivity to give a 1:1 match for the movement of the bar. The offset you see in your peripheral vision is really what's doing the magic there :). I hadn't heard of the paper you mentioned, In the Blink of an Eye, but I can see how with a touchscreen, it's much easier to detect small increases in lag, especially as your movement speed goes up and the physical separation between stylus and square increases. Can't quite get that exact match with a mouse and monitor, but it definitely helped me. Your discussion on whether this matters made me curious. I just ran another test where I purposely set the mouse sensitivity much higher than I normally would (1 cm mouse : ~4 cm monitor) and also used my left hand to block my vision of the mouse/mousepad entirely. At 10 ms, I was able to get 16/16 correct, but that wasn't an easy 16/16. It took me about 3 minutes to complete the test, and I had to use the feeling in my shoulder and arm to gauge it. But that's still 16/16 at 10 ms. So I'm surprised to hear that the participants in the "How much faster is fast enough" study were getting such high numbers on the projected image. Same with "How low should we go?" I don't see how someone could not feel 50 ms even if they couldn't see their hand at all. Maybe moving extremely slowly? For FPS games, I think you're right on about not using these crazy low visual offset results we're getting as gospel for what you can actually feel. I think I mentioned this in the video, but for most people, I'd suggest dropping the lag only until the test is no longer easy. Again, I was curious, so I just did a super quick test with the same no-look conditions as before. Rushing each trial (less than 3 seconds to decide), I can get 16/16 at 20 ms, but I start missing a few as I drop below that. How were you testing in Rocket League?
Is there a way to adjust the latency other than using framerate limits? Again, awesome comment! And thank you for mentioning the titles of those papers. I'm going to take a look. Do you have any results published?
@RocketScience 2 years ago
​@@ApertureGrille Interesting! 3 minutes for 16 trials is really not a lot. I don't know what the human limit is on this. It may very well be as low as 1 ms even when you can't see the mouse. It is a different kind of perception though, as you noticed yourself. > So I'm surprised to hear that the participants in the "How much faster is fast enough" study were getting such high numbers on the projected image. First of all, inking vs. box does seem to make a difference (which is what the study authors attributed the 50 ms to), but it seems like it makes a big difference for some people and not others (look at the data of "In the Blink of an Eye"; some people are still sub 10 ms and others 70 ms). I'm assuming people just look at ink differently, track it differently or something, because it doesn't remove itself. There is a big difference between the average person and someone trained. There are even more extreme examples in the past. There are some NASA VR studies from 1999-2004 that were the first to try and find the latency threshold needed to make VR happen. Ellis et al., "Generalizability of latency detection in a variety of virtual environments" talks about how participants had to be carefully instructed on what to look for. They had a small group of participants, but each participant completed hours and hours of trials (over multiple days and sometimes more than just one study) and got reasonably consistent results. But the paper also talks about another study from a researcher at the time that did a short experiment (relatively speaking) without the same introduction, provided no low lag vs. high lag reference, and didn't do AB testing. In that study, the participants reported lag 50% of the time at 200 ms, which is absolutely ridiculous. So that's definitely a case of bad study design, but it should be obvious that even with a study design like that, an experienced gamer would never even get close to 200 ms. As you said, moving speed also makes a huge difference.
Jerald's "Scene-Motion- and Latency-Perception Thresholds for Head-Mounted Displays" is a great PhD thesis that formalizes the factor of movement speed in latency detection in VR. In my own work I specifically also pitted players of the low skill group against players of the highest skill group. The participants were my viewers, so their skill is highly skewed towards the top. They're also willfully participating in a latency experiment with no reward, so they can be assumed on average to know what it is (better than just any average person). I split them into 5 skill groups, and the lowest of them was equivalent to an average player (based on the devs' published rank distribution). Those participants still had an average reported play time of 600 hours, so most of them are not absolute beginners by any stretch of the imagination. Compared to players of the 99.5th percentile and better (4000 hours), they detected 35 ms as often as the better players detected 15 ms, and 50 ms as often as the other group did 25 ms. My experiment was not AB testing, because it was testing multiple things at the same time. It's not a perfect design, but even if the exact numbers are slightly off, there is a clear trend. > How were you testing in Rocket League? The game is moddable through third-party tools which work based on something similar to the UnrealEngineSDKGenerator. I can basically hook functions and mod almost anything in theory for offline gameplay. The game is physics based, and the physics engine runs at a fixed 120 ticks a second. New ticks get generated on a frame whenever necessary to keep the tickrate up. So as long as you have at least 120 fps, it generates at most 1 tick per frame. Then I can just store the inputs when a new tick happens and play them back on the next tick. So I can increase latency in 8.3 ms steps. It doesn't affect the framerate or lag distribution. > Do you have any results published?
My supervisor kinda asked me to, but I didn't want to spend so much time just on an abbreviated version of the thesis. Some of the things I've read that were published in conferences are truly bad, so I feel like it's not really a sign of quality at all (not that my study is perfect or anything). I have self-published my thesis at experiment.rocketscience.fyi. I hope yt doesn't flag this comment for the link or something.
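The tick-buffering trick described above (record inputs when a 120 Hz physics tick fires, replay them a fixed number of ticks later) amounts to a short FIFO queue. A minimal sketch with hypothetical names, not the actual Rocket League mod:

```python
from collections import deque

TICK_MS = 1000.0 / 120.0  # one physics tick is about 8.33 ms

def delay_inputs(input_stream, delay_ticks):
    """Replay each input delay_ticks later; None pads the warm-up
    period before the first delayed input becomes available."""
    queue = deque([None] * delay_ticks)
    delayed = []
    for tick_input in input_stream:
        queue.append(tick_input)          # store this tick's input
        delayed.append(queue.popleft())   # play back the old one
    return delayed
```

With `delay_ticks=1` each input comes out one tick (~8.3 ms) late, which matches the comment's point that latency can be added in 8.3 ms steps without touching the framerate or the lag distribution.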
@matrixqw A year ago
"Either way, the person doing 1-3 ms is certainly impressive but not out of this world if they did it via visual offset." Hi, I wasn't looking at the mouse. The monitor was too close and I was only paying attention to the screen. In fact, I tried looking at the mouse and screen and I got worse. Maybe with a longer distance that works. But I don't need to see the mouse because my brain knows the moment I turn direction with my hand and how fast I'm moving it. With 3 ms it's easy for me. With 1 ms I do take a lot longer to catch the feeling, but also because the image blurs a lot and that doesn't help. Was an interesting test.
@phr3ui559 A year ago
ty
@RAG981 10 months ago
Hmmm, might be irrelevant to say, but when I played Quake 2 online when I was 17/18 years old, I remember very clearly that latency made a difference all the way down to 20 ms (the best I ever had). At 50 ms you'd still have to aim ahead of where the player is going. 70 ms was an easy (and very irritating) delay to notice. So I am not sure where that fits in, but this was about being able to tell how long it takes for something to actually happen after you've clicked the mouse.
@tomatus270389 2 years ago
How this channel doesn't have a million subscribers is beyond me. Incredible content.
@jyrolys6 2 years ago
I assume you don't actually want the answer to that question but the TLDR is basically please The Algorithm and you'll see sudden activity.
@iamfrog6747 2 years ago
2 videos in about a week apart from each other?? I thought I'd never see the day. Awesome content as always
@ApertureGrille 2 years ago
Great name and thank you! 116 cuts in this one, editing took forever. I'm going for a B-roll record!
@iamfrog6747 2 years ago
@@ApertureGrille I might be putting my hopes too high but I'm really expecting an hour long review of the XG2431 lol. Also, after playing around with LST for a few runs I found out that switching my arms in between tests helped me with arm strain and let me have a much better score. 11/16, 0ms vs 15ms, single arm. 15/16, 0ms vs 10ms, switching in between arms
@ApertureGrille 2 years ago
@@iamfrog6747 I requested a review sample from ViewSonic... they did not respond. =) I don't think I could do the arm switch. I'm right handed, so the thought of using my left arm to test latency seems crazy. I can't do the fast flicks, and quick acceleration is one thing that really helps show off lag; the faster you're moving the mouse, the more physical separation you'll see between the line and the mouse.
@awemowe2830 2 years ago
@@ApertureGrille Just flick the mouse, like, punch it with a finger, like when you flick off a fly or something. It's way way way easier than moving the entire mouse, just flick it.
@AerynGaming 2 years ago
You're missing the point on the input device polling rate stuff: it's not for a tiny reduction in overall latency but in fact to reduce aliasing between different update rates. Quick example: you have a 1000 Hz mouse on a 360 Hz display, moving in a straight line at a constant speed. Most of your screen refreshes will contain 3 polls of mouse information, but some will only contain 2. Those refreshes will only move the mouse cursor (or more problematically, the game camera) by two-thirds as much, creating human-visible jerk and blur. Blur Busters has excellent coverage of these issues, but I can't cite it here without having the comment auto-filtered.
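The 1000 Hz-into-360 Hz beat described above can be counted directly: with both clocks started together and perfectly regular (an idealizing assumption, real polls have jitter), each refresh window contains either 2 or 3 polls.

```python
def polls_per_refresh(poll_hz, refresh_hz, n_frames):
    """Number of mouse polls whose timestamps land inside each display
    refresh window, assuming both clocks tick perfectly regularly.
    Uses integer floor division, so the counts telescope exactly."""
    return [(n + 1) * poll_hz // refresh_hz - n * poll_hz // refresh_hz
            for n in range(n_frames)]
```

For one second of 1000 Hz polling on a 360 Hz panel, the per-frame counts are a mix of 2s and 3s that sum to 1000, so the cursor steps alternate between two sizes even at constant hand speed, which is exactly the visible jerk the comment describes.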
@justnvc 2 years ago
I would suggest adding a second vertical line that's controlled by AI at various speeds, which gives you the option of trying to track the line on both A and B. It's in tracking a directional change that most input latency at lower numbers is perceivable, imo. You should be able to adjust the thickness and color of the second AI-controlled line, so the goal is to keep your own white line within it, i.e. on top.
@movezie 2 years ago
thanks crt daddy for doing gods work :)
@kevinc1200 2 years ago
Amazing work! Maybe in time it will drive some industry trend like the UFO Test. I would suggest making the "don't show the score until the test is over" option the default, as it would likely improve validity.
@WaffleCake 2 years ago
Nah, blind noticeability and educated discernment are completely different imo. You might not know the difference between cilantro and soap if you didn't know any better, nor what just about any sound was without first being familiar with it. Everyone loves the taste of Coca-Cola and McDonald's, but once they know what's in them, they'll find those same flavors as repulsive as the carcinogens they are. It was thought for generations that the Universe revolved around the Earth, but now that we've scientifically proven the truth, it's as plain to see as anything else to us.
@Kazyek 2 years ago
Reaction time has absolutely nothing to do with lag perception, and I'm glad you stepped up and illustrated it in a testable and understandable fashion!
@Aurora12488 A year ago
Wow, this is such a fantastic channel! I just binged all your content. Hope you continue making more; your technical knowledge and explanation skills are extremely impressive, and you bring some much-needed objectivity into the gaming YouTube reviewer/educator space.
@ApertureGrille A year ago
Thank you! I have something fairly interesting, and a little bit different, coming next week.
@astreakaito5625 2 years ago
Fascinating video once again
@Diamondragan 7 months ago
This was really cool to watch.
@PumatSol 2 years ago
You’re a legend! This is some great work you’re doing.
@SB-mr2nk A year ago
Riot has a blog post where they discuss how, early in the development of Valorant, a delta of only 10 ms was required for one of two professional FPS players to start beating the other. In other words, when evenly matched, the KDR for each player was 1/1. Giving either of the two players a 10 ms advantage in latency was enough to swing the result strongly in the advantaged player's direction.
@AndreasZetterlund 2 years ago
Instead of forcing the user to stop moving the mouse when switching, why not just black out the scene or the bar during the switch?
@ApertureGrille 2 years ago
That's something I've been thinking about because having to stop the mouse irritates just about everybody :), but there will always be a tracking difference during periods of acceleration and deceleration. If you switch from, for instance, 10 ms of lag to 0 lag, the program has to instantly drop the last 10 mouse reports, so the white line will travel less physical distance than it "should." So if you had your mouse set up to move the same distance as the line, you could check if the distance changed and cheat.
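The cheat being guarded against here can be seen in a toy model: the line's on-screen position is the sum of all mouse deltas except the ones still sitting in the artificial-lag queue, so an instant switch from 10 queued reports to 0 changes how far the line has traveled. A hypothetical helper, not LST's actual implementation:

```python
def line_position(deltas, queued_reports):
    """On-screen line position when the last `queued_reports` mouse
    deltas are still waiting in the artificial-lag queue."""
    visible = deltas[:-queued_reports] if queued_reports else deltas
    return sum(visible)
```

After 20 uniform 1-count deltas, the lagged scene shows the line at 10 while the unlagged one shows 20. If the scenes were swapped mid-motion, that distance mismatch would give the answer away, which is why the test makes you stop the mouse before switching.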
@taiiat0 2 years ago
Suggestion: holding Shift to adjust the latency offsets in steps of 10, so that less scrolling time is needed to make large changes.
@nightmare4eVerr1 2 years ago
There is a trick to score higher. 1) I would use a sensitivity that suits wrist/fingertip aiming, as arm sensitivity is clunkier in general. 2) Use a whipping motion: swipe your mouse left and then jerk it back as if it's a whip. This will change the cursor direction in the fastest way possible and in theory give you a 2x window to notice lag, as direction-change latency is far more noticeable than simply judging how many frames you are behind. I play a lot of Half-Life/Quake, where rocket jumping requires 180° whip flicks all the time; this whipping motion is the most noticeable and makes me extremely sensitive to any system latency.
@RathOX 2 years ago
Great work
@guyfawkes8873 2 years ago
Yey, more videos :D
@DarkLOLable 2 years ago
I believe 10 ms is the limit for me and I'm getting around 12 out of 16. Using a wireless mouse (Logitech G305) at 1600 DPI and a Lenovo Y540 laptop (the panel is "NV156FHM-N4G", 144 Hz IPS). I use 800 DPI in games, but I think using 1600 DPI in your software is closer to the sensitivity I use in most games. I'm actually impressed to know that I'm able to notice 10 ms of lag. Thank you for your videos and also for your software, experts like you are a big help for many users.
@ktg_2castle960 2 years ago
I love your videos so much! One of my favorite YouTubers. Wish you were able to get your hands on all the time, gear, and money to test to your heart's content!
@TyIer7 2 years ago
Imperceptible delays are still delays. We never "noticed" our 8 ms minimum keyboard response time since the dawn of USB, let alone whatever delay comes with a membrane design. Yet here we are with optical switches. When we were all on 60 Hz or lower, we didn't "notice" 16.67 ms of delay. But again, here we are with 500 Hz monitors soon becoming the standard for 1080p: another 14.67 ms shaved off with the hardware to support it. Would we want that 21.67 ms of delay back?
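The figures in the comment above check out as simple refresh-interval arithmetic (1000 ms divided by the rate). A sketch verifying them; the ~1 ms figure for an optical switch is my assumption, not something stated in the comment:

```python
def interval_ms(hz):
    """Worst-case added delay of a device updating at `hz`: one full period."""
    return 1000.0 / hz

# 60 Hz vs. a 500 Hz display
display_saving = interval_ms(60) - interval_ms(500)   # about 14.67 ms

# 8 ms legacy USB keyboard polling (125 Hz) vs. ~1 ms optical (assumed)
keyboard_saving = interval_ms(125) - 1.0              # about 7 ms

total = display_saving + keyboard_saving              # about 21.67 ms
```

So the 21.67 ms in the comment is consistent with shaving ~7 ms off the keyboard and ~14.67 ms off the display.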
@MrXenon003 2 years ago
Great stuff as always, really like this sort of program to actually test how "pro-gamer" you actually are :P
@benjaminchen4367 10 months ago
I haven't watched the video fully yet, so this may be addressed, but I think the overall latency of the system may matter more once you get lower. Even a well optimized system with high refresh rate monitor and a high polling rate low latency mouse is probably going to have around 20ms of end to end latency. On that system, it's probably pretty easy to feel 10ms (20 vs 30 overall). But, if the same person does the test on a worse setup that already has say 60ms of end to end latency, then the same 10ms test is actually testing 60 vs 70, which is a much smaller percentage difference than 20 vs 30.
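The point above is the classic Weber-fraction view: the same 10 ms added is a much bigger relative change on a fast setup. A quick sketch using the comment's own numbers:

```python
def relative_increase(baseline_ms, added_ms):
    """Added latency expressed as a fraction of the end-to-end baseline."""
    return added_ms / baseline_ms

fast_rig = relative_increase(20, 10)   # total lag grows by 50%
slow_rig = relative_increase(60, 10)   # total lag grows by only ~17%
```

If latency discrimination scales even roughly with relative change, the same 10 ms test should be noticeably harder on the 60 ms setup.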
@lizardpeter 2 years ago
Very cool! I love to see stuff like this that easily disproves all of the nonsense out there that casual players throw around like the idea that there’s no difference between something like 144 Hz and 360 Hz or that ulTrA SEttIngS arE ThE BeSt without understanding the GPU latency penalties. As a rule of thumb to achieve the lowest latency on my 390 Hz monitor, I play all games at 1080p with very low settings on a 2080 Ti. I’ll still probably get an RTX 4090 when those are out, even to play at 1080p. People call me crazy, but they don’t understand this stuff. Lower GPU usage and higher FPS seems to almost always lead to lower latency.
@ApertureGrille 2 years ago
I think I've made fun of the "the eye can only see 24 FPS" notion before, which was just ridiculous, but as you mention, people will still claim that 120 FPS is all you'll ever need. That ASUS ROG Swift 500 Hz is pretty exciting; I can't afford to buy one, but I still enjoy seeing manufacturers keep pushing refresh rates higher and higher. LCDs, even fast TN panels, can't quite fully complete most of their transitions in 2 ms, but the additional responsiveness is a huge benefit. Even something slow-ish like my VG27AQ would see benefits if it were a 240 Hz monitor rather than 165 Hz.
@lizardpeter 2 years ago
@@ApertureGrille Absolutely. I was disappointed to see TN panels kind of dying out. I'm glad to see they're coming back for that 500 Hz monitor, as the ghosting and pixel transitions on my 390 Hz IPS monitor are noticeable (I upgraded from a 240 Hz TN). Input lag is absolutely lower, and that's why I stick with the 390 Hz, but pixel transitions might be worse overall. What I would love to see is an OLED monitor in the normal 24" to 27" sizes with those kinds of great refresh rates (360 Hz+). With those "true" 1 ms response times, I can't even imagine how good that would be to use. There are the QD-OLED panels out now, but those max out at 175 Hz. In the tests I have seen, they actually also have very bad latency issues.
@lizardpeter 2 years ago
@@ApertureGrille By the way, I will run your tests on my 390 Hz monitor later and report back with the results. I already tried it on an old laggy 4K TV of mine and was able to get 16/16 at 10 ms.
@ApertureGrille 2 years ago
@@lizardpeter I'd love to hear about your results at 390 Hz. If I can almost pass at 4 ms, I think you can probably do better.
@neppe4047 A year ago
Yeah, Reddit is especially bad with this. It's annoying af to see people keep parroting that "high-end GPUs are a waste for 1080p," "1080p is outdated," "x Hz is not noticeable," or similar kinds of bullshit. Even in cases where average fps doesn't increase that much, improved 0.1% and 1% lows are extremely noticeable.
@nekr1d 2 years ago
Nice tool. Quick test @ 4 ms with 12/16 correct and only 3-5 seconds between each click-to-answer. [240 Hz / Xtrfy MZ1 / 1600 DPI] I think I can do better with 60 seconds between each answer, but it's good enough for me to know I am faster and more responsive to the impact of "mouse lag" during gameplay than my friends. Being sensitive to mouse input/monitor "lag" is a curse.
@ApertureGrille 2 years ago
When I was filming the video, I was spending 20 or more seconds for each trial (and resting my arm!) at 4 ms. It's super cool seeing all the people who are way better at it than I am. About the curse part, I feel the same way about backlight strobing on monitors. No one else understands. :)
@superbro6413 2 years ago
This is a tangent, I know, but I've recently been experimenting with Moonlight game streaming through various devices, so the relationship between input latency (controller / device) and visual output response (the display / game stream) is a topic of interest to me. Specifically, I have a CFW PS Vita that runs Moonlight, and while it can change the resolution / bitrate settings, it is _unable_ to tell me what the input latency is currently at (newer Android versions of Moonlight can). So while I can tell that my inputs are laggy, I am unsure specifically _how_ laggy, and also unsure how much _less_ laggy it is if I, for example, lower my resolution or bitrate settings. So I've been running into a mild level of frustration looking for latency tests on the interwebs, as searching usually just comes up with "x list of y monitors with z amount of latency", or reaction test sites like what you mention at the onset of your video here. This all is just a very long way to say that this Latency Split Test is _exactly_ the kind of thing I'm looking for, and if there could ever be a future version that is compatible with controller thumbstick or d-pad inputs instead of a mouse, I would be _eternally_ grateful. What I am trying to find is ultimately a niche upon a niche, but the interests here overlap, or at the very least I think so. Needless to say, this is another banger video. Always enjoy your uploads. Cheers Grille!
@ApertureGrille 2 years ago
Hmm, I think I can actually do that. I wasn't expecting anyone to use it for controller testing, so I didn't bother with controller key binds. With your setup, would LST have to be controllable exclusively with the controller, or could you use a mouse to navigate the menus?
@superbro6413 2 years ago
@@ApertureGrille Moonlight takes the computer gpu feed and streams that to the streaming device (phone / vita / etc). My mouse and keyboard are still plugged into the computer, so yes, I can navigate the menus with a mouse. On a side note: moonlight sends controller commands back to the computer through xinput--the pc reads my vita controls like it's a xbox gamepad. The "connected controllers" is even listed as such. Not sure if this information is of any additional help for you. Thanks a ton! I hope you don't run into too many issues with this
@ApertureGrille 2 years ago
@@superbro6413 I just opened up the Unreal Engine editor getting ready to add controller support, but then I realized LST actually already has it. I should have read my own readme.txt, but I literally forgot that I added that. Controller key-binds (Xbox):
- Menu Button: Main menu
- View Button: Start/Stop test
- Y: Swap scene
- A: Select correct choice
- L/R Analog: Move vertical line
The "mouse" sensitivity slider also adjusts the analog stick sensitivity. I've never done anything with Moonlight, so I'm not sure how exactly it works, but if it's sending XInput commands, UE4, and hence LST, should be picking up the analog stick movements. Can you get the controller to work in Windows first?
@gamegogakuen3819 2 years ago
@@ApertureGrille Thank you so much for the informative video, as always. I was thinking of asking something along the lines of @Superbro64's question, but specifically for testing controller button to pixel lag rather than analog stick to pixel lag. For console games like side-scrollers and the like that don't have an aimable camera, it would be interesting to know how much latency is noticeable for button presses used to jump, attack, shoot, etc. in an A/B test format. I have a feeling that it will be a harder test to pass than the mouse test at lower latencies, but it would still be good information to know when adjusting a console gaming setup for non-noticeable levels of lag.
@JohnSmith-qn3ob 2 years ago
Dear Mr. Grille, I watched your Dell P1130 video and was wondering why you don't set your resolution to 512x448 @ 120 Hz instead of 60 Hz. This will push the horizontal scan past 30 kHz, plus it will also make the image flicker-free. It is also possible to use 320x240 @ 120 Hz to clear the 30 kHz lower limit. Someone responded saying 120 Hz causes motion blur because it duplicates frames, but that doesn't really make sense on a CRT since the phosphor fades out after only a few lines. Would you be willing to make a video testing these 120 Hz video modes? I would, but I don't have a CRT.
@brett20000000009 2 years ago
Since you never got an answer: showing duplicate frames (60 fps @ 120 Hz) on a CRT, BFI OLED, or strobed LCD will, without a shadow of a doubt, create duplicate images in motion. You cannot trick persistence of vision; if you want low-persistence 60 fps to work, you must show each frame only once. The only way to counter this effect is by combining it with BFI, which will drop half your brightness, so it ends up not being worth it. The reason you see duplicate images in motion: a CRT drawing the same frame twice with darkness in between will be perceived by the viewer as two different frames, and they will appear slightly apart from each other because of the millisecond time difference between when they are shown.
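The duplicate-image effect described above has a simple geometry: while the eye smoothly tracks motion at some speed, two flashes of the same frame land at different spots on the retina, separated by eye speed times the flash interval. A sketch, assuming idealized smooth pursuit:

```python
def duplicate_offset_px(track_speed_px_per_s, strobe_hz):
    """Retinal separation between the two copies of a repeated frame
    when the eye tracks motion at the given speed and the display
    flashes the same frame twice, 1/strobe_hz seconds apart."""
    return track_speed_px_per_s / strobe_hz
```

At a 960 px/s tracking speed on a 120 Hz CRT showing 60 fps content, the two copies land 8 px apart, which is well within what a tracking eye can resolve; slower motion shrinks the offset, which is one reason the effect is easy to miss.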
@JohnSmith-qn3ob 2 years ago
@@brett20000000009 I'd still like to see someone test it. In the early 90s I used to play Doom on a VGA CRT. Doom ran at 35 fps, but VGA monitors can't go that low, so it was set to 70 Hz. Doom would repeat every frame, and I never saw any motion blur. Also, movies are shot at 24 fps, but that makes too much flicker, so movie projectors would double (48 Hz) or triple (72 Hz) every frame to eliminate flicker, and I never saw any motion blur.
@kyrond 2 years ago
I have an idea for a test for more "real-world" applicability - only show one scene and make us say whether there is artificial delay or not in that one scene. People are quite amazing at finding differences between two things when they see them back to back, this would show what you can really perceive in a game.
@niter43 2 years ago
Well, just don't switch scenes in the current implementation: only switch when you think the current scene is the one without latency, with no looking or testing, just selecting the other one as the slow one.
@LextraGaming 2 years ago
Could you do a video on latencies for exclusive fullscreen vs borderless windowed?
@taiiat0 2 years ago
It gets a little bit complicated, as it depends on whether a game is compliant with the DXGI Flip Model or not. There are a lot of legacy ways of presenting frames to the display; most of them are mediocre or just downright awful, and only a couple of them are good!
@chengbaal 2 years ago
Ran through the test a few times. Here are some hardware specs:
CPU: AMD 5900X
RAM: not particularly well tuned 3200 MHz DDR4 TeamGroup kit
GPU: AMD RX 580
Mouse: Roccat Kone @ 800 DPI
Monitor: Asus VG259 in basically pure Cinema mode
Result: 5 ms @ 12/16 typical after 3 trials, 2 or 3 seconds each trial. Am I correct to assume that you're using a simple random number generator to pick which scene has the added latency? I've noticed that there are randomly long sequences of A or B scenes in my test trials. I wonder if there's a better way to generate the random number, such as using a hypothesis testing technique, not that it likely matters.
@ApertureGrille 2 years ago
Not bad! UE4, I believe, uses the standard C++ random number generator, and that's what I'm using to select which scene is "correct." It may often seem like you're getting a bunch in a row, but that's a 50/50 random selection for you. At 6:25 in the video, scene B was the lagged one 6 times in a row.
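Long same-scene streaks really are what a fair 50/50 draw produces. A quick simulation (a hypothetical illustration, not the actual UE4 code) of how often the longest run in 16 picks reaches 6 or more:

```python
import random

def longest_run(n_flips: int, rng: random.Random) -> int:
    """Length of the longest run of identical outcomes in n_flips fair coin flips."""
    best = run = 1
    prev = rng.randrange(2)
    for _ in range(n_flips - 1):
        cur = rng.randrange(2)
        run = run + 1 if cur == prev else 1
        best = max(best, run)
        prev = cur
    return best

rng = random.Random(0)          # fixed seed for reproducibility
trials = 100_000
hits = sum(longest_run(16, rng) >= 6 for _ in range(trials))
print(f"P(streak >= 6 in 16 flips) ~ {hits / trials:.3f}")  # roughly 0.18
```

So a streak of six lagged-B scenes in a row shows up in nearly one of every five 16-trial sessions; nothing suspicious about the generator.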
@chengbaal 2 years ago
@@ApertureGrille Right, it's exactly what I expected too. I'm just thinking out loud about whether it would even be worth implementing more... sophisticated trialing techniques.
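One lightweight "sophistication" that needs no change to the trial procedure is attaching a significance level to a score: under pure guessing each trial is a fair coin, so the chance of k or more correct out of 16 is a binomial tail. A small sketch (my own illustration, not part of LST):

```python
from math import comb

def p_at_least(k: int, n: int) -> float:
    """P(X >= k) for X ~ Binomial(n, 0.5), i.e. scoring this well by pure guessing."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

for k in (12, 13, 16):
    print(f"{k}/16 or better by chance: {p_at_least(k, 16):.4f}")
# 12/16 or better by chance: 0.0384
# 13/16 or better by chance: 0.0106
# 16/16 or better by chance: 0.0000
```

This is also why the video treats 13/16 as the pass mark: it is the smallest score with roughly a 1% chance of happening by guessing alone.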
@rushpan93 1 year ago
Pretty late here, but this was a great test to show how different I am from seemingly everyone around here, including you :) . Like the other chap here, I came to test how adversely input latency would affect me when I have to put up with streaming my home setup from afar over the internet. I'm predicting a max of 50 ms of delay (hopefully), so I ran the test at 60, and I couldn't really tell the difference. I went higher to 100, then 72, and those were so much easier to tell. I guess it comes down to how people actually use their mouse in real-world gaming or surfing. I normally set a high DPI (3600 on the 59 g SS Rival 110, and 4000/7200 on the 105 g Razer Basilisk Ultimate) because I hate having to move my arm too much to aim. Using it that way, it was really difficult to tell apart numbers below 60. So, anything you say after the 10:50 mark about how to get the lowest 13/16s is not really the point of the test unless you use your mouse on a giant pad and drag it twice to do a 180-degree rotation of your on-screen field of view, wouldn't you agree?
@jonatamarqes2988 1 year ago
16 ms is easy to tell. At 5 ms, sometimes I can do 10/16 and sometimes nothing, so for me it's hard to notice the difference between 5 and 0 ms.
@ApertureGrille 1 year ago
That's still pretty good! Very rarely will games have near 5 ms of lag, so this is more academic than practical. :)
@bp6927 1 year ago
@@ApertureGrille Do you know any FPS game with netcode near 5 ms of lag? Valorant maybe?
@OCPyrit 2 years ago
Cool program
@whismerhillgaming 2 years ago
Nice, but one thing: in your introduction you're talking about PS/2 and also 8000 Hz mice... USB is polling-based, so 8000 Hz is a feature, but PS/2 is interrupt-based, so there is no lag from the interface (in theory at least; I'm not an engineer).
@anomyymi0108 2 years ago
The PS/2 protocol is so slow it takes something like 0.7 ms to send that interrupt over the cable. 1000 Hz polling already comes vanishingly close to that in the worst case; on average it'll be faster.
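The ballpark numbers behind this are easy to reproduce. Assuming an 11-bit PS/2 frame per byte (start, 8 data, parity, stop) at the upper end of the usual 10-16.7 kHz clock range, and assuming a USB mouse's data becomes ready at a uniformly random point in the polling interval (both assumptions mine, real hardware varies):

```python
# Back-of-envelope PS/2 vs. USB polling delay; illustrative numbers only.
PS2_CLOCK_HZ = 16_700          # upper end of the PS/2 clock spec
BITS_PER_BYTE = 11             # start + 8 data bits + parity + stop

byte_time_ms = BITS_PER_BYTE / PS2_CLOCK_HZ * 1000
print(f"PS/2 time per byte:  {byte_time_ms:.2f} ms")      # 0.66 ms
print(f"PS/2 3-byte packet:  {3 * byte_time_ms:.2f} ms")  # 1.98 ms

for rate_hz in (125, 500, 1000, 8000):
    avg_wait_ms = 0.5 * 1000 / rate_hz   # mean wait until the next poll
    print(f"USB {rate_hz:>4} Hz polling: avg {avg_wait_ms:.4f} ms wait")
```

On these assumptions the ~0.7 ms figure matches a single PS/2 byte at the fastest clock; a full 3-byte movement packet takes closer to 2 ms, which 1000 Hz USB polling (0.5 ms average wait) comfortably beats.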
@whismerhillgaming 2 years ago
@@anomyymi0108 Alright, that wasn't my point, but okay. I'm curious where that data comes from, though.
@anomyymi0108 2 years ago
@@whismerhillgaming kzbin.info/www/bejne/rZXKhn94pbeijs0
@deama15 1 year ago
My limit is roughly 15-20 ms; lower than that gets too hard to tell. I'm on 120 Hz though, on an LG CX OLED, with a 4-8 kHz polling mouse (Mk2).
@mytommy 2 years ago
breh when u gonna start reviewing monitors?
@sudd3660 1 year ago
I only got to 10 ms reliably. Corsair Xeneon OLED at 240 Hz, Cooler Master MasterMouse MM710, 1600 DPI. So I'm guessing that around 10 ms of input lag is the optimum for me in games. Games rarely measure that low, which is why I enjoy the games that come close and find others unplayable when they get above 30 ms.
@mrrw0lf 2 months ago
Can you create a program that intercepts mouse inputs and adds latency on one side, so you can benchmark yourself in-game, like in the firing range of Apex? I feel like it's so hard to guess latency when you don't have a reference.
@Michael-su7ip 2 years ago
Performance feedback: I own a 2021 Zephyrus G15 with a 3050 Ti and a 165 Hz 1440p panel. When using the iGPU I can get about 970 FPS at 25% render resolution, but when using the NVIDIA 3050 Ti (substantially faster) I strangely only get about 400-450 FPS. This may be due to NVIDIA Optimus (neither a MUX switch nor Advanced Optimus is present in this laptop), or it could be a bug in your software. BTW, the 3050 Ti reports 21% 3D usage in Task Manager but 100% Copy usage.
@ApertureGrille 2 years ago
Ooh, interesting. I hadn't actually considered laptops. I'm not totally sure how the power sharing stuff works there, but the biggest limiter will probably be the CPU. When using your 3050 Ti, I think the completed frame has to be copied over the PCIe bus back to the iGPU for display output, which is why you're seeing all the copy usage. I've tested on the few desktop CPUs I have (all at 25% render res):
i5 4670K: ~970 FPS
i7 7700K: ~1360 FPS
Ryzen 5 1600: ~930 FPS
Ryzen 5 5600: ~2000 FPS
The old Intel i5, and especially the Ryzen 1600, can't maintain 1000 FPS no matter what GPU I'm using, but the Ryzen 5 5600 is bonkers fast. For the test, you don't have to maintain exactly 1000 FPS. If you look at the "actual" lag number, as long as it's consistent and close to what you're testing, you can still use it.
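A rough estimate (my own numbers, assuming uncompressed 32-bit 1440p frames) shows why that copy path is a plausible bottleneck:

```python
# Hypothetical estimate of PCIe traffic from copying finished frames
# to the iGPU for display; real drivers may use other formats or compression.
width, height, bytes_per_pixel = 2560, 1440, 4
frame_mb = width * height * bytes_per_pixel / 1e6   # ~14.7 MB per frame
for fps in (450, 970):
    print(f"{fps} FPS -> {frame_mb * fps / 1e3:.1f} GB/s of frame copies")
# 450 FPS -> 6.6 GB/s of frame copies
# 970 FPS -> 14.3 GB/s of frame copies
```

Sustaining the iGPU's ~970 FPS would mean moving over 14 GB/s of frame data, so a few hundred FPS over the copy path, with the copy engine pegged at 100%, is consistent with the report above.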
@DevocubPLAY 2 years ago
You didn't describe an important thing: with 0 ms of lag added you already have some lag from the mouse, polling, the game engine, display processing, and display pixel response. I assume I have 5+ ms of lag:
Logitech G102: 3.25 ms
+ polling: 0-1 ms, 0.5 ms avg (let's say the mouse is not synchronized to polling)
+ game engine: 0-1 ms, 0.5 ms avg (definitely not synchronized, because your frame limiter is not good, and it also depends on the PC)
+ possibly GPU frame queuing or other things I don't know about; let's say 0 ms here
+ pixel response: it only matters for an eye-detectable brightness change, so overdrive can drastically decrease lag here; let's say 1 ms (0-50% brightness change)
= 3.25 + 0.5 + 0.5 + 0 + 1 = 5.25 ms of ground lag. And that's a very good case; it can easily be 7 ms.
So adding +3 ms is not a 0 ms vs 3 ms test but a 5.25 ms vs 8.25 ms test (or even worse, like 7 ms vs 9 ms), so the perceived lag difference matters less and less. For these reasons I don't see much point in testing 0 ms vs 2 ms. It's also very tiring for me, which makes my judgements less precise, so you can improve my results slightly to account for that. I believe that if we had a 1000 Hz screen, an 8 kHz mouse with no latency, and a good game engine easily hitting 10k FPS (we have such engines), then detecting 0 ms vs 2 ms would be very certain.
Another thing I want to say about your program: it can't maintain more than 1000 FPS, which means something bizarre happens under the hood, and that raises concerns about whether the entire test is fair, especially at added latencies as low as a few ms.

wf = windowed fullscreen, wf1 = windowed fullscreen 1000 FPS lock, wfu = windowed fullscreen unlimited, fu = fullscreen unlimited, f1 = fullscreen 1000 FPS lock

Morning, post-tea:
wf1 - 15 ms: 14/16
wf1 - 10 ms: 12/16
wf1 - 6 ms: 9/16
fu - 6 ms: 13/16
f1 - 4 ms: 13/16
f1 - 3 ms: 12/16
f1 - 2 ms: 12/16
f1 - 2 ms: 12/16
Evening, post-nap + tea:
f1 - 3 ms: 11/16
Pre-night, walk + food:
f1 - 4 ms: 12/16
f1 - 4 ms: 11/16
Morning, just woke up:
f1 - 3 ms: 13/16

Hardest part for me: the mouse is heavy and I wiggle a lot, so my hand gets tired very fast, which makes me want to judge faster while being uncertain about my feelings. The thought of how little such low added latency means on my hardware (the 5.25 ms vs 8.25 ms thing) also makes me not really want to do the tests or try harder. The results are not surprising to me; I've done various blind tests before. I also think having your hand in view helps, but that's cheating. Sometimes I can instantly tell which is better in the 3 ms test, but sometimes I struggle and need to redo the trial for a lot longer; it's tiring.
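The "ground lag" arithmetic above can be written out as a simple budget. Every figure below is the commenter's assumption, not a measurement:

```python
# All figures are assumptions from the comment above, not measured values.
ground_lag_ms = {
    "mouse (Logitech G102)": 3.25,
    "USB polling (avg, 1000 Hz, unsynced)": 0.5,
    "game engine (avg, unsynced frame)": 0.5,
    "GPU queueing (assumed none)": 0.0,
    "pixel response (0-50% step)": 1.0,
}
base = sum(ground_lag_ms.values())
for added in (0, 3):
    print(f"+{added} ms test actually compares against {base + added:.2f} ms total")
# +0 ms test actually compares against 5.25 ms total
# +3 ms test actually compares against 8.25 ms total
```

Under those assumptions, a "0 vs 3 ms" setting is really a comparison of 5.25 ms vs 8.25 ms of total motion-to-photon latency.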
@DarkLOLable 2 years ago
Even if it's, for example, a 10 ms vs 13 ms test (instead of a 0 ms vs 3 ms test), if someone is able to do 16/16 it means that person can really feel a difference of 3 ms, so that person should also be able to distinguish 0 ms vs 3 ms, because it would be exactly the same difference. It also happens to me that sometimes I feel I can perceive the correct one easily, and right after that I'm unable to do so. I wonder if there are actual changes in latency, or, more likely, it's just my brain.
@DevocubPLAY 2 years ago
@@DarkLOLable Easy to prove you wrong: compare results for 10 vs 13 ms latency, 20 vs 23, 50 vs 53, 90 vs 93 ms. I predict, and believe, that results will get worse the more ground latency is present. UPD: his app limits the second latency to 49 ms, so I don't know; from a few tries at 49 vs 55 I could tell the difference, so I might actually be wrong.
@ApertureGrille 2 years ago
@@DevocubPLAY Excellent results! You're right that this test really does combine all sorts of latencies, but these are things that are almost impossible to get around. LST will always be a test that includes some amount of system latency, even at "0 ms." But I think you're over-estimating how much lag there really is. A viewer made a really simple DX9 black-to-white program that ran on my 7700k at about 9000 FPS, and I used that to test key-press to first visible response on my CRT. Here's the result: www.aperturegrille.com/reviews/DellP1130/Images/Full-Screen-CRT-Response.png There are a few slow outliers that take somewhere around 1 ms, but most happen between 0.4 and 0.7 ms. That's pretty fast. UE4 can't run at 9000 FPS, but that only hurts the lag a bit. Here's a similar test on the Lenovo Y27q-20, which has 0.6 ms of lag over the CRT, at 1000 FPS: kzbin.info/www/bejne/moGweXivr5hriq8 A couple more questions: 1) What's wrong with UE4's internal frame rate limiter? 2) Are you having trouble maintaining 1000 FPS? What CPU do you have? My Ryzen 5600 can actually hit a consistent 2000 FPS. 3) The high and low latencies can be set 1 ms apart. Are you not able to adjust them in increments of 1 ms?
@DevocubPLAY 2 years ago
1, 2) I don't know, but my FPS drops from ~1200-1300 to 1000-1100 when I move the mouse, even at 10% render resolution (not GPU limited). Such a simple program should be able to hit way more FPS, and my hardware is capable of it. Maybe it's UE4 platform-specific things, or the video driver. 3) I am able to; I thought you couldn't set "lag low" higher than 49 ms. My mis-notice.
@DevocubPLAY 1 year ago
@@ApertureGrille Also found out that you have r.OneFrameThreadLag=1 by default, which means +1 frame of lag on everything, about +1 ms at 1000 FPS.
@LanoshiTweaks 1 year ago
Grille, is the monitor (Asus TUF Gaming VG279QM) still running after 2 years? Because I heard that this monitor begins to flicker mid-game and has black screens.
@darkl3ad3r 2 years ago
Using an old Dell M992 CRT at 1920x1440 60hz through a DP to VGA adapter, and I'm pretty much capped at or slightly above 10ms. It gets really questionable trying to go below. I think if I did the test on my 144hz IPS screen I could probably nudge it down to 8ms but that's about it.
@santieur52 2 years ago
I did some tests; I didn't let myself stay more than 20 seconds on each trial. Overall I can't go lower than 16 ms without failing.
Monitor: LaCie ElectronBlue IV 22" (set to 160 Hz when doing the tests)
Mouse: Logitech Superlight, 1600 DPI (I normally game at 800)
GPU: 970 (at 10% render I get about 950 FPS)
Suggestion: let us set the resolution in the UI. It's very annoying on the CRT not having that option, since I (at least) use multiple resolutions and refresh rates. In borderless it's no issue, but in fullscreen the game decides whatever it wants; even if I change the resolution in the config files, the active resolution still changes. All this is simply solved by being able to set the resolution and refresh rate in the UI.
@ApertureGrille 2 years ago
Do you think not hitting 1000 FPS is due to the GTX 970 or is it a CPU thing? I hear you on the resolution setting. What I generally do with my Trinitron is to set the desktop resolution first (920x690 at 170 Hz!), and then load the program. It's a bit of a hassle, but otherwise, I'd have to fill a drop-down menu with every available Windows exposed resolution and refresh rate, and I'm not sure custom NVIDIA resolutions (or CRU resolutions) would populate in that list. I always hate games where I have to menu through a huge list of resolutions that I never want to use to get to the one I actually want.
@ocerco93 4 months ago
I want the same... BUT FOR CONTROLLERS!! But nobody cares about controllers :(
@teenam 2 years ago
Off topic: what's the game at 0:30? I want to try it out badly.
@ApertureGrille 2 years ago
It's actually part of my Smooth Frog program: www.aperturegrille.com/software/#Software-4 Inside Smooth Frog, you can press 4 on the keyboard or click the Smooth 3D FPS button at the top. It's meant to be an old-school style movement FPS. If you try it out, let me know what you think. My play in this video was pretty bad because I was filming, but the movement should be incredibly fast and smooth, and you ought to be able to glide around the arena at crazy speeds. It also has Quake-like strafe jumping. If you move the mouse left or right at 60 degrees per second while strafing, you'll continually add speed. I, surprisingly, haven't received much feedback on this part of Smooth Frog, so if you think it's fun or it sucks, let me know! I'm happy to talk about it!
@KeinZantezuken 2 years ago
I don't see anything special in someone being able to tell the difference at 4 ms or less. Your genetic legacy is different; some people are born better in some respects. That's what we call the "genetic lottery."
@ApertureGrille 2 years ago
I do wonder, though, if there's some technique involved. MatrixQW kind of described how he was doing the tests in his post, so I've been trying to do what he did.
@Kazyek 2 years ago
ViewSonic XG2431, 240 Hz (1080p, 24"); DeathAdder, 1000 Hz, 1500 DPI. I get 16/16 at 30 ms but can barely pass 20 ms. That's a bit weird, but at the same time, if I put my monitor in 120 Hz mode it becomes easier?
@ApertureGrille 2 years ago
It does take a bit of practice at lower latencies, but I would think higher refresh rates would be easier to spot since they have a faster top-to-bottom scanout. But, kind of like I mentioned in the video, if you have to "practice" to discern the lag, it may not really matter.
@disabledrecruit 1 year ago
I'm too lazy to read the comments (maybe it was already pointed out), but I have one doubt about this methodology: your software compares lag magnitudes of x+0 and x+(specified number) rather than 0 and (specified number), where x is the total system latency. When the specified number is comparable to x, the method becomes progressively unreliable. Also, reducing system lag is not about perception but about resulting accuracy: in direct competition between similar people with the same reaction times, the owner of the less laggy system will prevail. Perception-wise, peripheral lag should be compared with overall system lag in general. I might have misused some definitions, but you get my point, right?
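The x+0 vs x+d point can be made concrete: if what you notice is roughly the relative change in total latency (a Weber-law-style assumption, not an established fact about this test), the same added d shrinks as a fraction of the total as the base x grows:

```python
def relative_increase(base_ms: float, added_ms: float) -> float:
    """Fractional increase in total latency when added_ms is stacked on base_ms."""
    return added_ms / base_ms

added = 3.0
for base in (5.0, 10.0, 25.0, 50.0):
    print(f"base {base:>4.0f} ms: +{added} ms is a {relative_increase(base, added):.0%} increase")
# base    5 ms: +3.0 ms is a 60% increase
# base   10 ms: +3.0 ms is a 30% increase
# base   25 ms: +3.0 ms is a 12% increase
# base   50 ms: +3.0 ms is a 6% increase
```

On that assumption, a low-latency rig would make the same 3 ms setting easier to detect, which is one testable prediction for the 10-vs-13, 20-vs-23 comparisons suggested earlier in the thread.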
@disabledrecruit 1 year ago
Also, doesn't the display's "time resolution" affect the outcome? I'm not educated enough on display tech, but isn't there a gamble over whether the "delayed" event gets onto the correct frame or not? Correct me if I'm wrong.
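There is indeed a gamble of sorts: on a display refreshing every T ms, an extra delay d < T only changes what you see when it pushes the update past a refresh boundary. Assuming the event lands at a uniformly random phase within the refresh interval (my simplification), that happens with probability d/T:

```python
def slip_probability(added_ms: float, refresh_hz: float) -> float:
    """P that an added delay pushes an update to the next refresh (uniform phase)."""
    frame_ms = 1000 / refresh_hz
    return min(added_ms / frame_ms, 1.0)

for hz in (60, 144, 240):
    print(f"{hz} Hz: a 3 ms delay slips a frame {slip_probability(3, hz):.0%} of the time")
# 60 Hz: a 3 ms delay slips a frame 18% of the time
# 144 Hz: a 3 ms delay slips a frame 43% of the time
# 240 Hz: a 3 ms delay slips a frame 72% of the time
```

So on this model, small added delays show up only probabilistically per frame, and higher refresh rates convert them into visible frame slips more often, one reason fast monitors make the low settings easier.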
@godanihc 2 years ago
First time doing the test, with just 5 practice attempts before it began, I got 12/16 at 1 ms; it took me about 2 minutes. Monitor: AW2521H, 360 Hz, G-Sync, Esports Mode, Overdrive: Extreme. Mouse: G Pro X Superlight, 1600 DPI, wireless mode. The difference at 1 ms isn't huge, but it's definitely noticeable; you do get the drag feeling when the latency is higher.
@godanihc 2 years ago
I wasn't taking the test very seriously because I didn't sleep well last night. I might try it again someday and see if I can get 16 out of 16.
@godanihc 2 years ago
Also, I get an error every time I start the program. System error: the program can't start because api-ms-win-downlevel-kernel32-l2-1-0.dll is missing; please try reinstalling. It doesn't affect the program itself, but just to let you know (probably because I am using Windows 7).
@ApertureGrille 2 years ago
@@godanihc Thank you for posting those results! Good stuff. I think if you took some more time, you could get it. That's impressive at 1ms. Also, honestly, I'm surprised LST even runs on Windows 7. :)
@Battler624 2 years ago
wouldn't the monitor make a difference?
@ApertureGrille 2 years ago
Definitely! The monitor I was using for the video, the VG27AQ, has about 0.6 ms more lag than a CRT. It's not much, but it all adds to the total system lag: mouse, USB polling, monitor, LCD response times... That's why I'm curious to see what other people can do and how they're doing it. Maybe there's some technique that I'm not using.
@ItIsMik 2 years ago
Great video
@user-A637 1 year ago
Hey, I only get an FPS of around 500-700, and the latency readout on the right-hand side says around 1.5 to 1.8 ms. Also, when I move my mouse the latency is about 1.9 ms. So if I want to test whether I can see the difference when using a mouse with 7.5 ms of latency, what do I set "lag high" to?